There are some professions that inspire awe just by the sound of their name. Professional fact-checker, a person tech companies such as Facebook and Twitter hire to identify false or misleading claims, is one of them. Isn’t it cool to be someone who always knows the truth? How special do you have to be to tell other folks what to believe in?
It turns out that there is nothing special about professional fact-checkers after all. An international team of authors explored a crowdsourcing approach to fact-checking. Participants, recruited from an online labor market, were asked to rate randomly selected articles just by reading the headline and the lede (the opening sentence or paragraph) of each article. The authors found that a crowd of as few as 10 untrained laypeople was as good at fact-checking as, or even better than, three professional fact-checkers who, incidentally, had a chance to read the full article.
This finding can surprise only someone who has never heard of crowdsourcing (or still confuses it with crowdfunding). Over the past 10-15 years, numerous organizations, including corporations, government agencies, and nonprofits, have adopted crowdsourcing as a tool to help them address their most pressing technical and business challenges.
Now, a group of authors led by Kyle Bozentko from the Center for New Democratic Processes wants to make crowdsourcing an intrinsic part of our democratic decision-making process. They call their idea Citizens’ Juries.
As the name implies, Citizens’ Juries will comprise small groups of people (12-24 members) convened for 3-8 days of moderated deliberations on a specific topic. Depending on the topic, a jury could be drawn from the general population or be “custom-selected” based on characteristics relevant to the policy issue at hand, such as age, gender, or race. (For example, a youth citizens’ jury could be convened to study student loans.) The expectation is that, serving as a microcosm of the public, Citizens’ Juries will bring the diversity of perspectives, interpretations, and heuristics that is so desperately lacking in our traditional, “expert-based,” decision-making process.
Sure, to make Citizens’ Juries work, a solid foundation of organizational and legal bricks needs to be laid first, and vehement resistance from professional naysayers is all but assured. But as someone who has witnessed crowdsourcing create value in real life, and as someone who feels that our political ecosystem is rapidly becoming uninhabitable, I’d give the idea a chance.