I’m often asked questions about crowdsourcing. Usually, they revolve around one central theme: what can crowdsourcing do? Can crowdsourcing solve this problem? Can crowdsourcing solve that problem? On occasion, a more perceptive question is posed: can crowdsourcing define a problem?
My answer to all these questions is the same: yes, it can. What I want my interlocutors to understand is that crowdsourcing is first and foremost a question, a question that you ask a carefully selected crowd of people. And it doesn’t really matter what this question is about, as long as it is well thought out, properly defined, and clearly articulated. Yes, it can be a question about a solution to a problem. Yes, it can be a question about a problem itself.
Two examples of using crowdsourcing in both incarnations came from the same organization, Harvard Medical School. The first example shows how HMS scientists used crowdsourcing to solve a problem. This particular problem was how to improve the capacity of a DNA sequencing algorithm employed in one of the HMS projects. (Let me skip the technical details here because I wrote about this case only a few weeks ago.) HMS first decided to solve the problem in-house and indeed achieved a significant (5.5-fold) improvement in the algorithm’s capacity. But this wasn’t enough, and they launched a two-week-long crowdsourcing campaign. A total of 122 algorithms were submitted from outside HMS, and the winning solution provided a 1,000-fold improvement over the initial algorithm, a 180-fold improvement over the internal solution.
But a few years before, HMS had put the crowdsourcing approach in reverse: they used it to define a problem. Specifically, they asked a question: what do we not know to cure Type 1 diabetes? The idea behind the question was that, as with every prominent scientific topic, Type 1 diabetes research was following a limited number of popular directions, chasing essentially the same set of problems. HMS decided to ask members of the Harvard community, as well as the general public, to identify “neglected” problems, the problems that for whatever reasons were off the radar of existing labs involved in Type 1 diabetes research. Essentially, HMS wanted the crowd to come up with different, better problems, regardless of whether the crowd had the expertise or resources to solve these problems.
The results were quite impressive. Of a total of about 190 entries to the contest, 12 were chosen as the most “out-of-the-box.” (Interestingly enough, among the people submitting winning proposals were a diabetes patient, an undergraduate student, an HR representative, and a researcher with no immediate expertise in the diabetes field.) Some of the most promising problems were later converted into bona fide research projects.
So when asked what one needs to run a successful crowdsourcing campaign, my answer is: only two things, a question and a crowd.