Last week, computer book publisher SitePoint relayed a story about a recent experience with Digg that demonstrates the Digg system is far from perfect. We’ve written recently on ReadWriteWeb about the decline and fall of quality on Digg, but SitePoint’s anecdote shows that sometimes the wisdom of crowds approach is, well, kind of dumb. Now is probably a good time to revisit the rules for harnessing the wisdom of crowds that we published on this blog a year ago.
SitePoint Marketing Manager Shayne Tilley described the company’s efforts to promote a recent book giveaway via Digg on a SitePoint blog post. Within an hour of the promotion going live it had been dugg 30 times, but then, just as quickly, it was buried. Was it because SitePoint had submitted its own content to Digg, something Digg users generally frown upon? No. SitePoint hadn’t done that; they just put a “Digg This” button on the campaign page. According to SitePoint, which noticed the bury come down shortly after the comment was posted, the likely reason was this:
“It’s a trap. When you download it runs a validation check to see if you are running a pirated version of photoshop. Which then logs your ip back to Adobe HQ who then mark the ip address in the automated billing system. You will recieve [sic] a fine for $500 in the next 2 to 5 working days. Congratulations” — luke16
The problem, though, is that none of that is true. The book download is just a PDF file; it doesn’t run a version check on Photoshop, it doesn’t log your IP address, and SitePoint has no relationship with Adobe. Nonetheless, enough Digg users bought into luke16’s active imagination that the story was buried.
“So anyone else in the digg community who might be interested in a full, print-quality Photoshop book — sorry, you miss out,” wrote Tilley. “All because some goose decided to throw around some unsubstantiated claim about the legitimacy of our giveaway. What’s worse is that everyone believed him!”
SitePoint’s experience is an example of herd behavior or groupthink, where the Digg group acted blindly on poor information, without rationally thinking it through. This is a problem with the wisdom of crowds concept: if unchecked, rather than coming to the best conclusion based on the wisdom of the group, a crowd can come to the worst conclusion based on dumbness that spreads from a single bad node.
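How a handful of early, misinformed votes can snowball into a herd is easy to see in a toy model. The sketch below is purely illustrative (the function name, parameters, and voting rule are our own assumptions, not Digg’s actual algorithm): each voter weighs the running digg/bury tally against their own noisy private signal about whether the story is any good, and once the tally tips far enough, everyone simply follows the crowd.

```python
import random

def simulate_cascade(n_users=50, signal_accuracy=0.7, bad_first_votes=3, seed=None):
    """Toy information-cascade model. Each user sees the running
    digg/bury tally plus a noisy private signal about whether the
    story is actually good; a few early bad votes can lock in a herd."""
    rng = random.Random(seed)
    story_is_good = True
    # A luke16-style burst of early bury votes (False = bury, True = digg).
    votes = [False] * bad_first_votes
    for _ in range(n_users - bad_first_votes):
        # Private signal: correct with probability `signal_accuracy`.
        signal = story_is_good if rng.random() < signal_accuracy else not story_is_good
        diggs, buries = votes.count(True), votes.count(False)
        # Once the tally leads by 2 or more, users ignore their own signal.
        if buries - diggs >= 2:
            votes.append(False)
        elif diggs - buries >= 2:
            votes.append(True)
        else:
            votes.append(signal)
    return votes

votes = simulate_cascade(seed=1)
print(votes.count(False), "buries out of", len(votes))  # prints "50 buries out of 50"
```

In this sketch the three initial bury votes already outweigh any individual’s private signal, so every later voter follows the herd and the story is buried unanimously, even though most voters’ own information said it was good. That is the single-bad-node dynamic in miniature.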
Last year, we laid out a set of rules to get the most out of a crowd. It might be a good idea to revisit those here:
- Crowds should operate within constraints. To harness the collective intelligence of crowds, there need to be rules in place to maintain order.
- Not everything can be democratic. Sometimes a decision needs to be made, and having a core team (or single person) make the ultimate decision can provide the guidance necessary to get things done and prevent crazy ideas and groupthink from wreaking havoc on your product.
- Crowds must retain their individuality. Encourage your group to disagree, and try not to let any members of the group disproportionately influence the rest.
- Crowds are better at vetting content than creating it. It is important to note that in most of the above projects, the group merely votes on the final product; they do not actually create it.
Digg’s problem lies in the third point: a single member was able to exert disproportionate influence on the group by spreading poor information, producing an undesired result before the group could vet that information for accuracy. Eventually, more reasoned commenters on Digg shot down luke16’s paranoid conspiracy theory, but by then it was too late; the story had already been buried.
Digg probably gets it right far more often than it gets it wrong, but SitePoint’s experience is a lesson in the dangers of letting a crowd run wild. Any site that relies on a crowd to organize information should be wary of things like this happening.