(This post originally appeared on the Danish Crowdsourcing Association website.)
I strongly believe that as an open innovation tool, crowdsourcing has a bright future, but only if it proves its economic worth. In other words, when properly designed and executed, a crowdsourcing campaign should be able to solve a problem in a more cost-effective way than other tools.
This proof, however, doesn’t come easy, as economic analyses of crowdsourcing campaigns, whether successful or not, are rarely made public. Fortunately, such data do appear in the open from time to time, and they’re nothing short of spectacular.
In 2010, Forrester Consulting published a case study describing an open innovation program at Syngenta, a large multinational agricultural company. The study analyzed the total economic impact and return on investment (ROI) Syngenta had realized by using a crowdsourcing platform provided by InnoCentive, an open innovation intermediary. Forrester’s analysis identified a number of benefits Syngenta gained from this cooperation, including cost savings from finding solutions to difficult R&D problems, productivity savings for Syngenta’s researchers, and a reduction in intellectual property transfer time. The total value of these benefits was estimated at $11,861,688 over three years. Given that the total cost of using InnoCentive’s services over the same period amounted to $4,200,567, Forrester calculated a three-year, risk-adjusted ROI of 182% for Syngenta, with a payback period of fewer than two months. Isn’t it cool?
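For the curious, the 182% figure can be reproduced from the reported totals with the standard ROI formula. (Forrester’s risk-adjustment methodology is their own; the simple calculation below is just a sanity check, and it happens to land on the same number.)

```python
# Sanity check of Forrester's ROI figure for Syngenta (three-year totals).
benefits = 11_861_688  # total estimated benefits over three years, USD
costs = 4_200_567      # total cost of InnoCentive services over the same period, USD

# Standard ROI: net benefit divided by cost.
roi = (benefits - costs) / costs
print(f"ROI: {roi:.0%}")  # → ROI: 182%
```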
More recently, a piece of data that would make crowdsourcing fans really happy came out of Harvard Medical School. In one of their research projects, HMS scientists employed MegaBLAST, a DNA sequence alignment algorithm, which processed 100,000 sequences in 260 minutes. This was way too slow, so in order to improve the algorithm’s efficiency, HMS hired a full-time developer at an annual salary of $120,000. The developer did lower the processing time to 47 minutes, a 5.5-fold improvement, but this was still not fast enough. HMS then launched an open innovation contest that ran for two weeks and offered $6,000 in prize money. 733 participants took part in the competition, and 122 of them (representing 69 countries) submitted algorithms. The winning solution was capable of doing the desired job in 16(!) seconds, a 1,000-fold improvement over the original MegaBLAST algorithm and a 180-fold improvement over the internal solution. Given the 20-fold difference in costs ($120,000 vs. $6,000), the HMS crowdsourcing campaign was overall 3,600-fold more cost-effective than the internal solution. Let me repeat: 3,600-fold more cost-effective!
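The fold improvements above follow directly from the run times, and a quick check shows how the 3,600-fold headline number comes about (the quoted figures are rounded: the raw speedup over the internal solution is closer to 176-fold, giving roughly 3,500-fold when multiplied by the 20-fold cost difference):

```python
# Verify the speedup and cost-effectiveness figures from the HMS contest.
megablast_s = 260 * 60  # original MegaBLAST run time: 260 minutes, in seconds
internal_s = 47 * 60    # internal developer's solution: 47 minutes, in seconds
winner_s = 16           # winning contest entry: 16 seconds

print(megablast_s / internal_s)  # ~5.5-fold: the internal developer's improvement
print(megablast_s / winner_s)    # 975-fold, quoted as ~1,000-fold
print(internal_s / winner_s)     # 176.25-fold, quoted as ~180-fold

# Cost-effectiveness: speedup over the internal solution times the cost ratio.
cost_ratio = 120_000 / 6_000     # 20-fold cost difference
print((internal_s / winner_s) * cost_ratio)  # ~3,525-fold, quoted as ~3,600-fold
```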
However, as I said at the beginning, good things happen only when a crowdsourcing campaign is well designed and skillfully executed. And here is where the problem with crowdsourcing seems to reside: many organizations simply lack the appropriate expertise. But that is a topic for another conversation.