I know you, I know you not. (How we find experts.)

In one of my previous posts I wrote that when facing a problem, the majority of organizations have a natural inclination to begin the problem-solving process by engaging experts. Such an approach makes sense when an organization has dealt with a similar problem in the past and knows people who could potentially solve the new problem too. (Indeed, many large organizations have a host of pre-selected consultants for each area of strategic interest.)

But what if you face a problem you haven’t encountered before–and in our fast-changing environment this happens more and more often to more and more organizations? Where will you go to find experts in this uncharted territory? How will you know that the experts you’re about to hire are really good? And an even more basic question: how do you know whether someone is or is not an expert in this particular field?

The answer to the last question may look deceptively simple. Well, all experts are supposed to be present on LinkedIn or another of the plethora of similar professional networks. You go there, type in the appropriate keywords and, bingo, here is a list of everyone who might be considered an expert.

As a curious example of the perils of such a “targeted” search, here is the story of the 2006 ALS Biomarker Grand Challenge sponsored by Prize4Life, a nonprofit organization dedicated to finding a cure for amyotrophic lateral sclerosis (a.k.a. Lou Gehrig’s disease). The purpose of this crowdsourcing campaign (run on the online platform provided by InnoCentive) was to find an effective biomarker that could measure disease progression (or regression, in the case of clinical trials) in ALS patients.

In 2009, Prize4Life awarded two “progress prizes” for solutions that had made the most significant progress toward meeting the final criteria of the challenge. One winner was Dr. Seward Rutkove, a neurologist at Beth Israel Deaconess Medical Center in Boston and a prominent researcher in the field of neuromuscular disorders such as ALS. (Dr. Rutkove went on to win the $1 million Grand Prize in 2011.) But the other “progress prize” was awarded to Dr. Harvey Arbesman, a doctor in private practice in a suburb of Buffalo and someone virtually unknown in the ALS community. And why is that? Because Dr. Arbesman was a…dermatologist, with no formal ties to the field of neuromuscular diseases. Although Dr. Arbesman’s biomarker did not fully meet the Challenge criteria, the sponsor of the Challenge immediately appreciated its potential to provide valuable insight into the fundamental mysteries of the disease.

Was there any chance of Dr. Arbesman being selected as an expert by any organization setting out to find an ALS cure? No. But Prize4Life didn’t start by selecting experts; it started by formulating a problem to be solved and then talking to everyone who demonstrated the ability to solve this particular problem, regardless of formal expertise, medical certification or place of employment. Prize4Life didn’t go around looking for a solution. Instead, it announced that it had a problem and then waited for the right solution to find it.

As a result, Prize4Life went from literally nothing to a fully validated ALS biomarker in a matter of three to five years, a feat that would normally take at least twice as long. And this is no small matter, given that most ALS patients die within 2-5 years of diagnosis.

That is what I mean when I say that a properly designed crowdsourcing campaign is very cost-effective.

Image credit: Viktor Vasnetsov, “Knight at the Crossroads” (1878)


The first rule of crowdsourcing: know what you want, understand what you need

A robot arm (foreground) retrieves assay plates from incubators and places them at compound transfer stations or hands them off to another arm (background) that services liquid dispensers or plate readers.


I like to argue (for example, here) that the most important factor in the ultimate success or failure of any crowdsourcing campaign is the ability to properly identify and articulate the problem–technological, business or social–that the crowd will be asked to solve. I’d call it the “80:20 rule”: 80% of the unsuccessful crowdsourcing campaigns I’m aware of failed because of an inability to properly formulate the question presented to the crowd; only 20% failed because of a poor match between the question and the crowd (the crowd usually being a component of the chosen crowdsourcing platform).

So, what does it mean to properly identify and articulate the problem to crowdsource? Let me start with a story. I worked with a client, a large pharmaceutical company (as in my previous post, I’ve changed my client’s industry affiliation; it’s completely irrelevant to the story). My client wanted to design a high-throughput screening (HTS) assay to measure one particular type of cellular transformation. For those unfamiliar with HTS: widely used in drug development, HTS assays employ robotics and specialized software to screen literally hundreds of thousands of chemical compounds for biological activity.

My counterpart at the client site, the head of the assay development group, dutifully described the most important parameters she expected from the future assay: such-and-such a throughput (the number of samples analyzed per hour or day), such-and-such a false-positive rate (the percentage of inactive compounds falsely identified as active), such-and-such a false-negative rate (the percentage of active compounds falsely identified as inactive), and the price (as cheap as possible, of course).

I was taking notes while listening to her, and the sense that I was missing something very fundamental kept growing in me. Finally, I ventured to interrupt: “All right, everything is clear. But what about your endpoint?” (For any assay, including HTS, the endpoint is the thing the assay actually measures.)

For a split second, my client lost her confidence. She paused and then said, carefully choosing her words: “Well, we don’t have an endpoint. We thought that this would be part of the whole solution.”

It was my turn to choose my words carefully. “Well,” I said, “maybe we start by looking for an endpoint first and then, once we’ve found it, run a follow-up crowdsourcing campaign to design an HTS assay based on this endpoint?”

In response, my interlocutor smiled broadly and said: “Look, if we had a good endpoint, we wouldn’t need you: my in-house assay developers would design an HTS version of the assay in a matter of weeks.”

That sealed it. Very rapidly the two of us put together a problem statement asking for a suitable molecule (protein or nucleic acid) whose change in structure or quantity would signal that the cellular transformation in question had taken place. We posted the statement online, and in about a couple of weeks I received a submission from a university professor living in a small Eastern European country. The submission described a protein (one I had never heard of before) that was overproduced by cells that had undergone the transformation in question; this overproduction could be easily detected by measuring fluorescence intensity, a slam dunk for any assay developer.

Frugally written and only half a page in length, the submission contained just a couple of paragraphs of text, a picture and a reference. But it was something I could share with my client, which I immediately did. Her response came quickly: “I love it! We’re buying it.”

And that was it. I completed the paperwork transferring all the necessary IP rights for the solution to my client. I never heard from her again: apparently, her in-house assay developers were indeed as good as she had described.

Clients always come to me knowing what they want; very often, however, they haven’t dug deep enough to understand what they really need. I remember a client who wanted to change the design of a paint-dispersing pump because it often clogged. We looked into the problem more deeply and found that the real cause was the paint itself, whose viscosity increased dramatically with slight drops in air temperature. My client fixed the clogging problem himself by changing the paint formulation, without touching the pump. I remember another client who wanted to crowdsource an additive that would prevent a food product from losing sweetness during processing. To my client’s great surprise, the eventual winning solution proposed a change in the food processing itself that achieved exactly the same result.

It’s tempting to say that what clients want is a symptom of a disease, whereas what clients need is the disease’s cause. You can’t successfully fight the disease (solve the problem) unless you first identify its real cause (define the problem). But enough scientific terminology for one post! Let me finish by formulating my first (“golden”) rule of crowdsourcing: know what you want, understand what you need.

Image credit: https://en.wikipedia.org/wiki/High-throughput_screening


Know your neighbor (The virtues of crowdsourcing)


Before I turn to the virtues of crowdsourcing, let me tell you a story that happened seven or eight years ago. I worked with a client, a statistician in the consumer products field. (To tell this story freely, I’ve changed my client’s industry affiliation; it doesn’t really matter here.) My client felt that an analytical algorithm he used to process his data was suboptimal. He also had a hunch that there might be someone, somewhere, with an idea of how to improve this algorithm. So we decided to run a crowdsourcing campaign asking for improvements to the old algorithm. The brief we posted online was unavoidably technical in nature, but in essence it said: “Here is what I do and here is my algorithm. Here is what I dislike about it. Here is what I’d consider a substantial improvement to it. If you deliver that, I’ll pay you $20,000.”

I have to admit that initially I had doubts about the potential success of this campaign: I simply wasn’t sure that the crowd of people we were approaching contained enough individuals capable of dealing with such a specialized topic. To my pleasant surprise, the problem posted by my client was met with great interest and enthusiasm. Educated questions followed, which is always a good sign, and by the submission deadline we had a respectable number of solutions.

I had no clue about the quality of these proposals. In many cases I can “feel” the good ones–I don’t know how; perhaps just by the way a submission looks and reads. Alas, not in the case of descriptions full of mathematical formulas. But my client liked what we got. He initially focused on the three most promising solutions and then quickly singled out the one he declared the winner.

I proceeded with the paperwork, the set of formalities needed to move money from one party to another. At this point, I could disclose the identity of the winning solver to my client, and when sending this information, I noticed with some amusement that my client and the solver lived in the same state.

A few days later, I received a phone call from my client. “Are you kidding me?” he yelled into my ear. “Are you kidding me with this guy?” I froze. “What? What’s wrong?” (However unlikely, it is theoretically possible for people to turn out to be ineligible to receive an award after completing their part of the paperwork.) “Are you kidding me or what? This guy is my neighbor!”

As it turned out, my client and the gentleman who solved his problem were indeed living a few houses apart on the same street in a small Midwestern town. They met regularly outside, exchanging greetings and opinions about the weather while walking their dogs, but never had a chance (or a need?) to formally introduce themselves. (Do you really know the names and occupations of all of your neighbors?) So the irony was that my client had been struggling with this problem for years, while the person who solved it in less than two months lived down the street.

What is the moral of this story? When facing a complex technical or business problem, many organizations have a natural inclination to engage experts to find a solution. In order to do that, they first need to know who the experts are and then choose the supposedly best ones–and most large organizations already have a stable of pre-selected consultants for each area of potential interest. The expert of choice then provides his or her personal opinion. Sure, organizations may ask for a second opinion, but this is rare because experts, especially the elite ones, command high consulting fees. In other words, asking for an expert opinion requires organizations to know in advance where to go for a solution.

Crowdsourcing is different. When posting your problem online, you become agnostic about the source of potential solutions. They may come from any direction, and you don’t have to do anything to “target” your search–provided, of course, that you’re approaching a large and sufficiently diverse crowd. In other words, when you crowdsource, you don’t have to know where to go; you just announce to the world that you have a problem, and then solutions come to find you. That’s why properly designed crowdsourcing campaigns are so cost-effective.

And then there is the question of the diversity of responses. If you approach a person who is an expert in Method A, don’t expect him or her to tell you that Method B might be a better way to deal with your problem. And if you approach an expert in Method B, don’t expect him or her to tell you that this method won’t work–you’ll get at least some solution using Method B. And there is always the possibility that Methods C, D or E exist, but you’ve never heard of them and therefore know no appropriate experts. In other words, by asking for an expert opinion, you narrow the scope of potential solutions to what you already know.

Crowdsourcing is different again. It is agnostic not only about the source of responses, but also about their nature. Unless you specifically indicate that you’re interested only in Method A or B (and sometimes you have to, for specific reasons), incoming solutions will focus on solving your problem, not on a particular way of solving it. That’s why the best praise I can get from my clients is hearing them say: “Wow, we never even thought about that!”

In summary, a properly designed crowdsourcing campaign will significantly simplify your search for a solution to your problem, reduce the cost of the problem-solving process and yield diverse, original and sometimes even unexpected solutions. Especially if you don’t really care whether the solution comes from your neighbor or from a person on the other side of the globe.

Image credit: Henry John Yeend King “Friendly Neighbors” (http://www.paintinghere.com/painting/henry_john_yeend_king_friendly_neighbors_26898.html)


Is crowdsourcing pitting “experts” against “amateurs”?


In my previous post, I argued that one of the reasons crowdsourcing hasn’t yet become a mainstream innovation tool is the uncertainty over what crowdsourcing can (or can’t) do, meaning that many organizations struggle to identify problems that can be successfully solved by crowdsourcing. There is another reason slowing down the acceptance of crowdsourcing: the lack of trust in the intellectual power of a crowd, in its ability to tackle complex technological or business problems. Sure, everyone would agree that the wisdom of crowds can be successfully applied to “simple” tasks, such as reporting potholes in the City of Boston, but when it comes to answering a question requiring special knowledge…well, let’s ask the experts.

The reluctance to replace experts with a crowd naturally sits well with the experts themselves, who are often scornful of the very idea that someone with no direct experience in their field can solve a problem that they couldn’t. This sentiment was nicely expressed by James Euchner who, with a tangible dose of bitterness and disdain, wrote back in 2010: “Our trust in the expert appears to be increasingly supplanted by a willingness to rely on the knowledge derived from crowds of amateurs. In this new world, the motives and competence of experts are at best suspect and presumed to be inferior to the wisdom of crowds.”

“Crowds of amateurs.” Pretty harsh words, eh?

The fundamental flaw in the notion that people participating in crowdsourcing campaigns are just a bunch of “amateurs” lies in the fact that in real life, crowds are composed of…experts. They simply are not experts working for your company, or in your industry, or in your country–or in your immediate area of expertise. But they’re experts nonetheless. Take, for example, InnoCentive, a commercially available crowdsourcing platform with a solid track record of solving difficult scientific and business problems for corporate and non-profit clients. The InnoCentive proprietary crowd is composed of 375,000+ solvers, 66% of whom hold advanced degrees. Moreover, academic research shows that a solver’s likelihood of solving a problem increases with the distance between the solver’s own field of technical expertise and the problem’s domain. So much for a crowd of “amateurs”!

Some experts, of course, are not outright dismissive of crowdsourcing; instead, they try to justify their negative attitude toward the technique. One example of this attitude was on display in a recent Boston Globe article written by Dr. Joshua Liao, a resident physician in the Department of Medicine at Brigham and Women’s Hospital and a clinical fellow in medicine at Harvard Medical School. Dr. Liao was approached by a gentleman, the son of one of his patients, who asked whether Dr. Liao would be willing to use CrowdMed to help establish a diagnosis for his mother. CrowdMed is a recently launched company that uses crowdsourcing to provide online medical diagnoses, particularly for rare conditions that have been missed by doctors.

Dr. Liao’s response was no. He explained that a solid diagnosis requires not only the patient’s medical history, which presumably can be provided to a crowd, but, more importantly, a direct medical examination of the patient, which cannot. For this reason, in his opinion, services like CrowdMed “produce more questions than answers, and more confusion than direction.”

I see Dr. Liao’s point, for I do understand the value of a close, in-person medical examination. What troubles me in his argument is that the CrowdMed website features at least a dozen “patient success stories,” testimonies from people who were apparently helped by the crowd after their own doctors had failed to do so. What, then, are these “success stories”? Are they fake? Are they a fluke? Or are they examples showing that in some cases, in spite of what Dr. Liao says, crowdsourcing really can deliver something that a single expert, however accomplished and experienced, can’t? Wouldn’t it be better if medical professionals stopped dismissing new approaches as “confusing” and instead started a serious discussion about what crowdsourcing can (or can’t) do in healthcare practice?

(By the way, I could easily find another online service, Sherpaa, that provides its subscribers with, among other things, medical diagnoses.)

The irony of pitting experts against crowds is that crowdsourcing is impossible without experts. It’s only experts who can identify and properly formulate your company’s most important problems; it’s only experts who can go through incoming external submissions to select those that make sense; it’s only experts who can integrate the external information with what is already available in-house.

So, the next time your organization has a pressing problem, ask your internal experts first, and if they can’t come up with a suitable solution right away, launch a crowdsourcing campaign. Use the wisdom of the experts who don’t work for you.

Image credit: John Collier “Two Men Engaged in an Argument” (http://www.paintinghere.com/buy/john_collier_two_men_engaged_in_an_argument_one_manifesting_anger_the_other_trying_to_calm_him_down_art_painting_27641.html)


Can crowdsourcing fix your marriage problems?



I think that one of the reasons crowdsourcing hasn’t yet become a mainstream innovation tool is the uncertainty over what crowdsourcing can (or can’t) do. I’m often asked the same questions: can crowdsourcing solve this problem? What about that problem? My answer is always the same: yes, it can. Although it sounds almost like a joke, the answer reflects my strong belief that crowdsourcing is first and foremost a question, a question that you pose to a large crowd of people. It doesn’t really matter what the question is about, as long as it is well thought out, properly defined and clearly articulated.

For example, can crowdsourcing fix marriage problems? Imagine a married couple with their relationship in disarray. If both spouses are serious about fixing it, they will most likely approach a marriage counselor. The counselor will ask the couple a lot of questions and, based on their responses, emotional state and body language–and also on his or her own professional knowledge and experience–will suggest some measures to improve the situation.

How much can the couple trust the opinion of a single individual, however experienced he or she is supposed to be? What if this particular counselor has gotten it completely wrong? Can the couple ask for a “second opinion,” as one does in the case of a life-threatening medical condition? Well, marriage counseling isn’t cheap: a 45-minute session may cost the couple around $200–and, obviously, you won’t solve your problems in a single session. Besides, whereas most health insurance plans will pay for at least part of the cost of a second medical opinion (and Medicare will pay 80% of it), no one but the couple itself will pay for their counseling, first opinion or second.

Now, let’s imagine that the very same couple brings its troubles to a crowdsourcing platform. The couple presents to the crowd pretty much all the information it would divulge to the counselor, and it stands ready to answer any additional questions the crowd may ask. And then the crowd begins delivering the opinions of its members–all based on the real-life experience of dozens, if not hundreds, of different individuals, many of whom may have gone (successfully or not) through exactly the same situation as our troubled couple. After all, isn’t this what professional marriage counselors do: give opinions based on their prior experience of listening to dozens, if not hundreds, of troubled married couples?

Of course, crowdsourcing solutions to marriage problems won’t be exactly free to the users; however, the available methodologies for running crowdsourcing platforms would keep the price at a fraction of what counseling would cost.

There is one potential problem with this scenario. Will the couple be willing to provide all the information the crowd requires it to divulge? It’s one thing to disclose very intimate–and often embarrassing–details of your personal life to a certified professional who, in addition, is strictly bound by a confidentiality agreement. It’s another to tell the same to a crowd of strangers, some of whom may turn out to be unsympathetic or even openly hostile.

Part of these concerns can be adequately addressed by, first, protecting the anonymity of the couple itself (although some potentially identifying information, such as age, location, occupation, etc., is impossible to withhold) and, second, by appropriate moderation of the online conversation. Besides, the idea of revealing your marriage problems to a stranger–rather than to a person in the flesh across the desk–may be quite appealing to some. This seems to be the rationale behind BetterHelp, an online mental health counseling service. The site boasts over 2,000 counselors on board who have worked “with over 200,000 people through more than two million sessions.” As is typical for online services, counseling with BetterHelp is based on a flat membership fee that covers both the use of the platform and unlimited counseling; membership plans start as low as $35/week.
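
To put the two price points mentioned above side by side, here is a rough, purely illustrative comparison; the one-session-per-week frequency is my own assumption, chosen only to make the arithmetic concrete (a minimal Python sketch):

```python
# Rough, illustrative monthly cost comparison of the two options discussed above.
# The one-session-per-week frequency is an assumption made only for this sketch.
session_cost = 200        # dollars per 45-minute traditional counseling session (figure quoted above)
sessions_per_month = 4    # assumed: one session per week

online_weekly_fee = 35    # dollars per week, BetterHelp's entry-level plan (figure quoted above)
weeks_per_month = 4

traditional_monthly = session_cost * sessions_per_month   # $800 per month
online_monthly = online_weekly_fee * weeks_per_month      # $140 per month

print(f"Traditional counseling: ~${traditional_monthly}/month")
print(f"Online flat-fee service: ~${online_monthly}/month")
```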

Back to corporate crowdsourcing. Companies often display the same emotion as our troubled married couple going online: the fear of revealing sensitive (in this case, proprietary) information to a crowd of “strangers.” That’s why among the most difficult crowdsourcing campaigns to run–and the least successful in their final outcome–are those dealing with “internal” processes. Companies are just intrinsically hesitant to provide the crowd with relevant details about the nature of the problem and the signs (or, better, the origin) of the trouble. Yes, we understand that this particular process is inefficient and too costly; yes, we realize that we must improve it and, for this reason, are ready to ask for help from outside; but no, we’re not going to tell you what this process is all about: it’s proprietary. So when I say that crowdsourcing is first and foremost a question, I ought to add: it is also your willingness to provide as much information as needed (no more, but no less) to ensure that the question is solvable by a crowd.

I have to say that the fear of revealing proprietary corporate information through crowdsourcing is vastly overblown. Available techniques allow you to prepare your online question in a way that makes it “solvable” while leaving out any information pointing to the source of the question. I’ll touch on this topic in future posts.

Image credit: http://quotesgram.com/couples-fighting-quotes-on-facebook/


What do you need to innovate? Freedom! Yes, freedom.

We love talking about nurturing a culture of innovation; yet our list of practical measures to promote entrepreneurial spirit is depressingly short. For this reason, I’ve set out to create a list of specific corporate policies that organizations might try in order to establish a culture of innovation.

One entry on this list may at first appear not immediately related to innovation at all: labor laws. A 2001 study showed that labor laws making it more difficult to fire employees increase their participation in corporate innovation activities. The authors of the study argued that the lower threat of termination produced by stronger anti-dismissal laws decreased the “cost of failure” for employees engaging in potentially risky innovation projects. Another study, published by MIT researchers, found that companies in the 34 U.S. states that have so-called constituency statutes produce more high-quality patents than those in the 16 states lacking such statutes. A constituency statute encourages corporate directors to consider non-shareholder interests (e.g., those of employees) when making business decisions, thereby pushing them to think of the long-term interests of their companies rather than short-term profits. Both studies strongly suggest that removing the proverbial sword of Damocles of punishment for innovation failure encourages risk-taking and experimentation. In other words, giving employees the freedom to fail is a great way to promote innovation activities.

The effect of personnel policies on innovation has again been brought into the spotlight by a recent study described in an August 17, 2016 Harvard Business Review article. The study shows that U.S. state-level employment nondiscrimination acts (ENDAs)—laws that prohibit discrimination based on sexual orientation and gender identity—spur innovation. More specifically, the study found that U.S. public companies headquartered in states that have passed ENDAs experienced an 8% increase in the number of patents and an 11% increase in the number of patent citations relative to companies headquartered in states that have not. Interestingly, the effect was more pronounced for companies that had not previously implemented nondiscrimination policies, for companies in states with a larger LGBT population and for companies in human-capital-intensive industries. The authors of the study argued that ENDAs positively affect innovation by matching more creative employees with innovative companies.

I’m not going to argue, of course, that in order to be more innovative you have to be gay, lesbian or permanently employed (as opposed to employed at will). What I do want to argue is that innovation implies a certain level of freedom, be it freedom from the fear of failure or freedom from being discriminated against for whatever reason.

Sounds too far-fetched? Hold on. Last week, the 2016 edition of the Global Innovation Index (GII) was released. The GII ranks the world’s economies based on infrastructure, market and business sophistication, and research. As in previous years, Switzerland took the title of the world’s most innovative country; Sweden was second, the United Kingdom third and the U.S. fourth.

Back in 2014, I noted that the top of the GII ranking was heavily populated by developed democracies, societies with strongly upheld political and individual freedoms. (And to make sure that this observation had statistical meaning, I compared the 2013 GII with the 2013 Freedom in the World report published by Freedom House, a U.S.-based non-governmental organization that monitors democratic developments around the world.) Nothing seems to have changed on the innovation Olympus since then. Moreover, it’s tempting to argue–in light of the findings discussed above–that it’s no coincidence that among the 10 most innovative countries in the 2016 GII are eight Western European countries with strong labor and antidiscrimination laws.

Are we watching a growing body of empirical evidence for what many of us have always intuitively known: that in order to innovate, you need freedom? Do we need any further proof of this thesis at all?

Image credit: http://crossroadswaunakee.org/event/celebrating-freedom/


Crowdsourcing: two approaches, two different outcomes

In my July 16 post, I set out to show that crowdsourcing is a very cost-effective tool, allowing problems to be solved at much lower cost than other innovation tools, and that, therefore, the low popularity of crowdsourcing, which I wrote about earlier, can’t be explained by its being prohibitively expensive. In this post, I’ll continue the comparison exercise and talk about the effectiveness of two different crowdsourcing approaches.

I like saying that one needs only two things to run a crowdsourcing campaign: a question and a crowd. The topic of selecting and formulating a proper question to crowdsource is of immense importance, and I’ll take on this topic later. Today let me deal with crowds.

There are two principal ways to acquire a crowd: build it from scratch (I call this approach “build-the-crowd”) or use the proprietary crowds already assembled by a number of commercially available crowdsourcing platforms (I call this approach “rent-a-crowd”). Companies usually build their own crowds by creating so-called External Innovation Portals (or something similarly named). A typical EIP is essentially a website that invites anyone outside the company to register on the site and then submit innovative ideas in the company’s areas of corporate interest.

Examples of EIPs are numerous. They’re especially popular among consumer-oriented companies: Starbucks runs My Starbucks Idea, General Mills has G-WIN, and Clorox came up with CloroxConnect. Tech companies, including pharmaceutical and medical device makers, don’t stay on the sidelines either: Medtronic invites you to Innovate with Medtronic and AkzoNobel suggests that you Enter our Open Space. Energy giant Shell lures you into its own portal, modestly named Shell GameChanger.

How effective are EIPs? Unfortunately, hard numbers are difficult to come by: companies are predictably reluctant to publicly discuss the efficiency of their open innovation programs, and EIPs are no exception. Yet one brave company, Dell, does provide stats on how its EIP, IdeaStorm, is performing. IdeaStorm’s front page says that, all in all, over 24,948 ideas have been submitted, of which 549+ have been implemented. Leaving aside the vagueness of the word “implemented,” the success ratio of the project barely exceeds 2%. Not a fountain of innovative ideas for Dell, to say the least, and as my own involvement with corporate EIPs suggests, other corporate portals aren’t doing any better.
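
For the record, the roughly 2% figure follows directly from the two front-page counts quoted above; here is a quick back-of-the-envelope check (a minimal Python sketch using only those published numbers):

```python
# Back-of-the-envelope check of the IdeaStorm success ratio,
# using the front-page counts quoted above: 24,948 ideas submitted, 549 implemented.
submitted = 24_948
implemented = 549

success_ratio = implemented / submitted
print(f"IdeaStorm success ratio: {success_ratio:.1%}")  # prints ~2.2%
```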

The apparent low efficiency of EIPs stems from the way they crowdsource knowledge from the outside. Many EIPs just ask for “ideas” without clearly defining what constitutes a valuable idea for that particular company. (I call this approach “bottom-up” and have criticized it on a number of occasions; see, for example, here and here.) As a result, a lot of irrelevant or low-quality ideas get submitted, dramatically decreasing the signal-to-noise ratio while significantly increasing the resources the company needs to allocate to the initial screening of submissions.

Some companies have recognized this shortcoming and begun including descriptions of the areas where innovations are especially welcome. For example, AstraZeneca lists “R&D Focus Areas” on the front page of its OpenInnovation portal; similar descriptions can be seen on Philips’ SimplyInnovate.

Unilever has gone even further. Its portal, The Unilever Foundry, features Challenges: reasonably well-defined problems in a few product/service categories. Each Challenge explains the context of the problem and describes exactly what Unilever is looking for. Challenges have a submission deadline and a budget. Unfortunately, I have no data on how successful Unilever’s Challenges are, and I’d appreciate any information on this topic.

Unilever’s Challenges adhere almost exactly to the Challenge concept introduced back in 2001 by InnoCentive, an open innovation service provider specializing in crowdsourcing. Instead of asking for “ideas,” InnoCentive’s clients post well-defined descriptions of technical or business problems (“Challenges”)–time-bound and with an “award tag” attached–to the InnoCentive website. Then a huge crowd of InnoCentive “Solvers” (375,000+ from 200 countries), which InnoCentive has assembled over the years and now “rents out,” works on finding solutions to these problems. Submitted solutions are collected by the InnoCentive staff and delivered to the client. The client has a fixed amount of time to review all the submissions and announce a “winner,” who receives the monetary award.

The solution rate of InnoCentive Challenges is very impressive. Although I failed to find the precise number on the company’s recently redesigned website, about a year ago InnoCentive claimed a value of 85%. A similarly high success rate for its crowdsourcing campaigns, up to 90%, was reported by another open innovation service provider, IdeaConnection. Just compare these numbers to the roughly 2% success rate of Dell’s IdeaStorm!

The InnoCentive (and IdeaConnection) mode of crowdsourcing is an example of what I call the “top-down” approach, a process in which a well-defined problem is offered to a pool of potential solvers, after which submitted solutions are reviewed–and successful ones identified–based on a number of criteria articulated in advance. (Again, I already wrote about the benefits of the “top-down” approach: here and here).

The available numbers therefore strongly suggest that the top-down mode of crowdsourcing–from problem to solution–is much more effective than the bottom-up mode–from “ideas” to potential implementation. At the same time, I do see value for a large company (and I emphasize: large) in spending time and money to create its own online crowd–instead of paying fees to providers of crowdsourcing platforms such as InnoCentive (IdeaConnection doesn’t charge an upfront fee). So from this point of view, the business model adopted by Unilever with its Challenges, a combination of the top-down approach and its own crowd of solvers, looks to me like an optimal way to conduct corporate crowdsourcing campaigns.

Image credit: http://www.huffingtonpost.ca/ashley-redmond/gap-year-canada_b_5948708.html
