Imagine you need to drive a nail into the wall to hang a picture. You select a nail of the correct size and then look around for an appropriate tool. You pick up a new, shiny shovel that you recently bought at the nearest Home Depot, aim it at the nail and – bang! – take a swing. The nail bends. You look disapprovingly at the shovel and say: “Such a fancy tool – and not a cheap one, either! – but completely useless.”
Sounds ridiculous, right? But that’s what often happens when people and organizations decide to use a new “shiny” tool without figuring out first how to properly handle it. Facing disappointing results, frustrated operators conclude: “This tool doesn’t work for me/us.”
Crowdsourcing is one such shiny new tool. In recent years, its popularity has skyrocketed. Unfortunately, along the way, the term “crowdsourcing” has lost its original meaning. It has become synonymous with just about any event happening online, especially one that engages a substantial number of people.
Conflating crowdsourcing with online networking is a frequent mistake – that’s why we have oxymorons like “crowdsourcing on Facebook” (or Twitter, or Yelp).
Another popular mistake is confusing crowdsourcing with brainstorming. In brainstorming, a question is presented to several people who are asked to come up with answers. As the brainstorming session progresses, participants propose their own ideas, build on the ideas of others, or, perhaps, redefine the question itself. Many folks mistakenly believe that if you replace a group of eight to ten people (reportedly the optimal number for brainstorming) with a crowd of dozens or even hundreds, you’re no longer brainstorming but crowdsourcing.
But this is not crowdsourcing. Crowdsourcing differs from brainstorming in one very important respect: it requires independence of opinions, a feature highlighted in James Surowiecki’s classic book, “The Wisdom of Crowds.” In contrast to brainstorming, during a crowdsourcing campaign you must make sure that the members of your crowd, whether individuals or small teams, provide their input independently of the opinions of others. It’s this aspect of crowdsourcing that delivers highly diversified, original, and often completely unexpected solutions to the problem – as opposed to brainstorming, which almost always ends with the group reaching a consensus.
Why is it important to keep a crisp border between crowdsourcing and other problem-solving tools, such as brainstorming? Because if we want organizations to start using crowdsourcing in their innovation practices, we need to ensure that they know the basic rules of applying this technique.
Take, for example, a 2017 Harvard Business Review article titled “Rethinking Crowdsourcing.” The article described a review of 87 crowdsourcing projects aimed at generating new consumer product ideas. In the course of each project, managers allowed participants to “like” each other’s submissions – a feature that doesn’t belong in crowdsourcing.
The result? Some contributors began to “like” each other’s ideas, so the apparent value of their respective contributions became overinflated. No wonder that when the submitted proposals were assessed by independent evaluators, no correlation was found between the most “liked” ideas and those that led to successful products.
The conclusion of the article was even more troubling: “It can be unwise to rely on the crowd.” Not an encouraging statement for those who want to start exploring what crowdsourcing can do for their organizations!
I was equally puzzled by another, more recent HBR article, “Research: For Crowdsourcing to Work, Everyone Needs an Equal Voice.” Sure, the authors, two academic researchers, reached a correct conclusion: “In order for the wisdom of crowds to retain its accuracy for making predictions, every member of the group must be given an equal voice, without any one person dominating.” Yet their use of the generic term “the wisdom of crowds” – while describing a process that mixed crowdsourcing with brainstorming – made me somewhat uneasy.
It’s impossible to overestimate the role that academic research could play in making crowdsourcing a mainstream problem-solving tool. There are two things that I, a crowdsourcing practitioner, expect from my academic colleagues. First, a solid classification of the existing types of crowdsourcing. Second, a clear definition of what crowdsourcing is and what it is not. Muddying the “terminology waters” isn’t helpful.
Image provided by Tatiana Ivanov