There are several reasons for the slow adoption of crowdsourcing as a practical problem-solving tool.
One of them is the lack of trust in the intellectual power of the crowd, its ability to tackle complex problems. Almost everyone would agree that the proverbial wisdom of crowds can be applied to a “simple” task, such as creating a corporate logo or naming a city landmark. However, when it comes to answering a question that requires specialized knowledge, organizations prefer to turn to experts.
This preference obviously sits well with the experts themselves. They’re often scornful of the idea that someone with no direct experience in the field can solve a problem that they could not. This sentiment was eloquently summarized in a 2010 article: “Our trust in the expert appears to be increasingly supplanted by a willingness to rely on the knowledge derived from crowds of amateurs.”
“Crowds of amateurs.” Harsh words, huh?
Pitting experts against crowds is plain silly. Experts represent an essential part of any crowdsourcing campaign; in fact, crowdsourcing is impossible without experts. Only experts can identify and properly formulate problems facing organizations. Only experts can properly evaluate incoming external submissions to select those that make sense. Only experts can successfully integrate external information with the knowledge available in-house. It’s only at this midpoint of the problem-solving process – at the stage of generating potential solutions to the problem – that crowds are usually superior to experts.
Why? A recent study in the field of neurobiology provides useful insight. A team of scientists from Cold Spring Harbor Laboratory led by Dr. Anne Churchland analyzed neuronal activity in the brains of mice forced to learn new decision-making skills.
As the mice progressed through learning new tricks, more and more neurons in their brains got involved. However, neuronal activity rapidly became highly selective: the neurons responded only when the mice made one choice and not another. This pattern grew even stronger as the mice learned to perform a task better (i.e., became “experts” in this task). Moreover, once expertise was fully achieved, the mouse’s brain was primed for that expert decision even before the mouse began executing the task.
In other words, the “expert” mice knew how to solve the problem even before starting to solve it!
In contrast, the neuronal activity in the brains of “non-expert” mice remained non-selective – meaning that these mice approached the task with an “open mind.”
If these findings hold for humans, the implication would be that experts approach a problem with patterns already pre-formed in their brains by prior experience. In contrast, amateurs may approach the same problem from a completely different angle – and the more amateurs are involved in solving the problem, the greater the chance that a completely novel, unorthodox solution will be found.
That means that when solving a problem requires prior experience (e.g., when the problem resembles one solved in the past), organizations should engage experts. However, if the problem is novel and requires a fresh look, crowds would be a better choice.
There is no sense in debating which tool, experts or crowds, is better. They are different, complementary tools in the innovation management toolbox. Each should be used at its proper time and place.
Image provided by Tatiana Ivanov