
A robot arm (foreground) retrieves assay plates from incubators and places them at compound transfer stations or hands them off to another arm (background) that services liquid dispensers or plate readers.
I like to argue (for example, here) that the single most important factor in the ultimate success or failure of any crowdsourcing campaign is the ability to properly identify and articulate the problem, be it technological, business, or social, that the crowd will be asked to solve. I’d call it the “80:20 rule”: 80% of the unsuccessful crowdsourcing campaigns I’m aware of failed because of an inability to properly formulate the question presented to the crowd; only 20% failed because of a poor match between the question and the crowd (the latter usually being a component of the chosen crowdsourcing platform).
So, what does it mean to properly identify and articulate the problem to be crowdsourced? Let me start with a story. I worked with a client, a large pharmaceutical company (as in my previous post, I have changed my client’s industry affiliation; it’s completely irrelevant to the story). My client wanted to design a high-throughput screening (HTS) assay to measure one particular type of cellular transformation. For those unfamiliar with HTS assays: widely used in drug development, they employ robotics and specialized software to screen literally hundreds of thousands of chemical compounds for biological activity.
My counterpart at the client site, the head of the assay development group, dutifully described the most important parameters she expected from the future assay: such-and-such throughput (the number of samples analyzed per hour or day), such-and-such a ratio of false positives (the percentage of inactive compounds falsely identified as active), such-and-such a ratio of false negatives (the percentage of active compounds falsely identified as inactive), and the price (as cheap as possible, but of course).
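(For readers who want the arithmetic spelled out, here is a minimal sketch of how those two ratios are usually computed from screening counts. The function name and the numbers below are purely illustrative; they are not figures from this engagement.)

```python
def screening_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Basic HTS error rates from confirmed counts (hypothetical helper).

    tp: active compounds correctly flagged as active (true positives)
    fp: inactive compounds falsely flagged as active (false positives)
    tn: inactive compounds correctly flagged as inactive (true negatives)
    fn: active compounds falsely flagged as inactive (false negatives)
    """
    return {
        # share of inactive compounds wrongly reported as hits
        "false_positive_rate": fp / (fp + tn),
        # share of genuinely active compounds the screen missed
        "false_negative_rate": fn / (fn + tp),
    }

# Hypothetical example: 120 true hits, 300 false hits,
# 98,500 true misses, and 80 missed actives.
print(screening_metrics(tp=120, fp=300, tn=98_500, fn=80))
```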
I was taking notes while listening to her, and the sense that I was missing something very fundamental was gradually growing in me. Finally, I ventured to interrupt: “All right, everything is clear. But what about your endpoint?” (In any assay, including HTS, the endpoint is the thing the assay actually measures.)
For a split second, my client lost her confidence. She paused and then said, carefully choosing her words: “Well, we don’t have an endpoint. We thought that this would be part of the whole solution.”
It was my turn to choose my words carefully. “Well,” I said, “maybe we start by looking for an endpoint first, and then, after we’ve found it, run a follow-up crowdsourcing campaign to design an HTS assay based on that endpoint?”
In response, my interlocutor smiled broadly and said: “Look, if we had a good endpoint, we wouldn’t need you: my in-house assay developers would design an HTS version of the assay in a matter of weeks.”
That sealed it. Very rapidly, the two of us put together a problem statement asking for a suitable molecule (a protein or nucleic acid) whose change in structure or quantity would signal that the cellular transformation in question had taken place. We posted the statement online, and in about a couple of weeks I received a submission from a university professor living in one of the small Eastern European countries. The submission described a protein (I had never heard of it before) that was overproduced by cells that had undergone the transformation in question; this overproduction could easily be detected by measuring the intensity of fluorescence, a slam dunk for any assay developer.
Frugally written, only half a page in length, the submission contained just a couple of paragraphs of text, a picture, and a reference. But it was something I could share with my client, which I immediately did. Her response followed soon: “I love it! We’re buying it.”
And that was it. I completed the paperwork transferring all the necessary IP rights in the solution to my client. I never heard from her again: apparently, her in-house assay developers were indeed as good as she had described them.
Clients always come to me knowing what they want; very often, however, they haven’t done enough digging to understand what they really need. I remember a client who wanted to change the design of a paint-dispersing pump because it often clogged. We looked into the problem more deeply and found that the cause was actually the paint itself, whose viscosity increased dramatically with slight drops in air temperature. My client fixed the clogging problem himself by changing the paint formulation, without touching the pump. I remember another client who wanted to crowdsource an additive that would prevent a food product from losing sweetness during processing. To my client’s great surprise, the eventual winning solution proposed a change to the processing itself that achieved exactly the same result.
It’s tempting to say that what clients want is a symptom of a disease, whereas what clients need is the cause of the disease. You can’t successfully fight the disease (solve the problem) unless you first identify its real cause (define the problem). But enough scientific terminology for one post! Let me finish by formulating my first (“golden”) rule of crowdsourcing: know what you want, understand what you need.
Image credit: https://en.wikipedia.org/wiki/High-throughput_screening