Working a Crowd

If crowdsourcing has not yet become a mainstream innovation tool, it is definitely not for lack of attention. Crowdsourcing remains a topic of intense academic study, and a recent paper by researchers from Simon Fraser University in Canada is a case in point. Written by John Prpic and co-authors and titled “How to Work a Crowd: Developing Crowd Capital through Crowdsourcing,” the article provides theoretical background and practical guidance to managers wanting to use crowdsourcing to advance their business objectives. Presented below are a few key points I’ve extracted from the paper, along with my comments.

  1. The authors begin with a crowdsourcing typology. They define crowdsourcing as “an on-line, distributed problem solving model…[that allows]…approaching crowds and asking for contributions [that] can help organization develop solutions to a variety of business challenges.” Using a number of criteria, Prpic et al. divide crowdsourcing into four categories: crowd-voting, idea crowdsourcing, micro-task crowdsourcing and solution crowdsourcing. Naturally, the four categories differ in the ways crowdsourced information is collected and processed; consequently, Prpic et al. suggest that managers clearly understand their business needs in order to match them to a specific crowdsourcing category.

My comment: Defining and categorizing things is what academic researchers are especially good at. I like their definition of crowdsourcing, and their typology looks solid. My only concern is that for managers who have no prior experience with crowdsourcing, choosing between different crowdsourcing types might be too complicated, if not outright intimidating. My advice to them would be this: remember that, above all other things, crowdsourcing is a question. And it doesn’t really matter what this question is about, as long as it is well thought out, properly defined and clearly articulated. So forget for now about definitions and typology; focus instead on problem definition and try to understand what kind of responses would represent a solution to your problem. If this understanding is expressed in a simple and coherent problem statement followed by a set of clear-cut requirements for a successful solution, your crowdsourcing campaign will be just fine.

  2. The other indispensable part of any crowdsourcing campaign, in addition to a problem, is a crowd, and Prpic et al. correctly point out the need to “construct” a crowd. Two aspects are especially important in this context: the size of the crowd and its composition. The authors seem to favor the idea that larger crowds are generally more advantageous than smaller ones; however, they also see benefits in working with smaller (“closed”) crowds. At the same time, they state that “[d]ifferent crowds possess different knowledge, skills and other resources, and accordingly, can bring different types of value to an organization.” Taken to its logical conclusion, that would mean that an organization should try to construct a customized crowd for each crowdsourcing campaign.

My comment: Constructing crowds of meaningful size and diversity is a long and expensive process. Sure, once “constructed,” crowds can be custom-modified, and available technology makes this task feasible. Yet I’d strongly advise against playing too much with crowd size and composition. First, selecting the “correct” participants for your crowdsourcing campaign only makes sense if you know them all. This is possible if you work with crowds composed of your own employees and/or a pool of trusted collaborators (academic partners, suppliers, selected customers, etc.). However, if you go outside your company, when using external innovation portals or innovation intermediaries, such a selection becomes cost-ineffective at best and outright impossible at worst. Second, and even more importantly, the popular belief that only people with “relevant knowledge and expertise” can solve your problem is plain wrong. The experience of crowdsourcing experts, such as InnoCentive, clearly shows that innovation can come from completely unexpected, and therefore unpredictable, sources; moreover, it has been shown time and again that a solver’s likelihood of solving a problem actually increases with the distance between the solver’s own field of technical expertise and the problem’s domain. Instead of trying to select the correct “solvers” for their problem, managers would be better off spending time describing what a correct “solution” to this problem should be. As pointed out above, composing a powerful problem statement followed by a set of clear-cut requirements for the requested solution is a better way to make your crowdsourcing campaign successful.

  3. The authors emphasize that simply engaging a crowd and successfully acquiring the desired contributions is not enough for the ultimate success of a crowdsourcing campaign. Equally important is the process of internally assimilating the acquired crowdsourced information. To achieve this goal, organizations “need to institute internal organizational processes to organize and purpose the incoming knowledge and information.”

My comment: Here I completely agree with Prpic et al., for I’ve witnessed first-hand multiple examples of nicely designed and skillfully implemented crowdsourcing campaigns that eventually failed simply because the campaign organizers had no established internal structure to “marry” outside knowledge with the knowledge produced inside. First, more often than not, outside knowledge, especially that collected in the course of solution crowdsourcing, comes only “half-baked,” i.e., in need of further processing using internal resources. Second, there is a cultural aspect: the notorious “not invented here” syndrome is alive and well and is capable of preventing external knowledge from taking hold in any organization. Managers who dream of using crowdsourcing should therefore start with “internal crowdsourcing” that brings together their own employees first. Such internal crowdsourcing could be run through internal innovation networks (INNs). INNs not only foster the very culture of collaboration, bringing together corporate units (R&D, business development, marketing, etc.) that in many organizations have no institutional platform to communicate on strategic issues; they also provide intellectual and operational support for the company’s external innovation programs. Once an organization has mastered the process of internal crowdsourcing, going outside often means just expanding its technological capabilities (the existence of other important issues, such as IP protection, notwithstanding).

In conclusion, the academic literature keeps producing useful examples of “best practices” for using crowdsourcing to solve various technological and business problems. Yet, as we all know, the best practices are those that work specifically for our own organization. So managers aspiring to become masters of crowdsourcing should not feel paralyzed by the growing amount of (often conflicting) crowdsourcing literature; they should start running their own crowdsourcing campaigns. After all, the best, and the only, way to learn to swim is to get into the water.

About Eugene Ivanov

Eugene Ivanov is the Founder of (WoC)2, an innovation consultancy that helps organizations extract maximum value from the wisdom of crowds by coordinated use of internal and external crowdsourcing.
