For nearly a decade, we’ve been hearing about the power of the crowd and the ability of social networks to unleash it in everything from building an encyclopedia to helping people predict the ebbs and flows of equities markets.
Crowds are a great resource and a trove of valuable information – when they work. When they don’t, it doesn’t take much to question the wisdom of the crowd.
In Sunday’s New York Times, David Leonhardt gave a primer on crowdsourcing, complete with the failure of crowds to predict the Supreme Court’s recent health care decision. That decision gave critics of crowdsourcing fodder for attacking the model, but Leonhardt was quick to point out that the so-called experts and insiders are often just as flawed in their own predictions.
“The answer, I think, is to take the best of what both experts and markets have to offer, realizing that the combination of the two offers a better window onto the future than either alone,” he wrote. “Markets are at their best when they can synthesize large amounts of disparate information, as on an election night. Experts are most useful when a system exists to identify the most truly knowledgeable — a system that often resembles a market.”
Put another way, when everyone has access to the same information and is trying to make sense of it, crowds can be useful. But when only a few insiders have the information – an as-yet unannounced Supreme Court ruling, for example – crowds are all but useless.
“Crowdsourcing is a model, not magic – no different than outsourcing or SaaS. And similar to these other models, there are companies that do it exceedingly well, and those that do it poorly,” said Matt Johnson, who heads marketing at Boston-based uTest, a crowdsourced software testing firm. “And similar to most movements that gain traction, there’s more than a little hope-hype-hate cyclicality that goes with it.”
Not All Crowds Are Created Equal
The original model for crowdsourcing – and one still employed by many firms – was “more is more.” The idea was that the bigger the crowd you used, the better the end result, be it a prediction, a product or a logo design.
Peter H. LaMotte, president of GeniusRocket Inc., said the firm was originally founded to operate in the spec space for design, in which creative teams compete on spec to produce logos, branding campaigns or, in the case of GeniusRocket, video production.
The problem with contests is that creative talent gets paid only if its content is selected, while commissioning firms have less input into the final product and may lose control over their brand to contest losers who try to use the content elsewhere. GeniusRocket shifted its business model to what it calls a “curated crowdsourcing” approach, which starts with curating who can be a part of the crowd that pitches ideas.
“This makes sure that professionals, not amateurs or students, make up the crowd,” LaMotte said. “These professionals pitch ideas based upon a client’s brief. The best ideas move forward and are compensated; the ideas that are not selected remain the property of the team that pitched it. … This is the crowdsourcing element: The crowd delivers the best ideas. GeniusRocket turns them into storyboards, and uses crowdsourcing to validate the ideas like a virtual focus group. We then have a second crowd of production companies that will bid for the work. The best-suited team at the right price wins the production.”