The Dirty Little Secret About the “Wisdom of the Crowds” – There is No Crowd

Recent research by Carnegie Mellon University (CMU) professor Vassilis Kostakos pokes a big hole in the prevailing wisdom that the “wisdom of crowds” is a trustworthy force on today’s web. His research studied voting patterns across several sites featuring user-generated reviews, including Amazon, IMDb, and BookCrossing. The findings showed that a small group of users accounted for a large share of the ratings. In other words, as many have already begun to suspect, small but powerful groups can easily distort what the “crowd” really thinks, which is why online reviews so often end up looking either extremely positive or extremely negative.

Small Groups, Big Impact

To conduct the research, Kostakos worked with a large sample of online ratings. As MIT’s Technology Review reports, he and his team studied hundreds of thousands of items and millions of votes across all three sites. In every case, they discovered that a small number of users accounted for the largest share of ratings. On Amazon, for example, only 5% of active users ever voted on more than 10 products, yet a small handful of users voted on hundreds of items. Said Kostakos, “if you have two or three people voting 500 times, the results may not be representative of the community overall.”
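
To get a sense of what this kind of concentration looks like, here is a minimal sketch (ours, not the study’s) that measures how much of a site’s voting comes from its most active users. The vote records below are made up:

```python
from collections import Counter

# Hypothetical (user_id, item_id) vote records; the study analyzed real ratings
# gathered from sites like Amazon, IMDb, and BookCrossing.
votes = [
    ("alice", "book-1"), ("alice", "book-2"), ("alice", "book-3"),
    ("alice", "book-4"), ("alice", "book-5"), ("alice", "book-6"),
    ("bob", "book-1"),
    ("carol", "book-2"), ("carol", "book-7"),
    ("dave", "book-3"),
]

def top_voter_share(votes, top_fraction=0.05):
    """Share of all votes cast by the most active `top_fraction` of users."""
    per_user = Counter(user for user, _ in votes)
    ranked = sorted(per_user.values(), reverse=True)
    top_n = max(1, int(len(ranked) * top_fraction))
    return sum(ranked[:top_n]) / len(votes)

# With the toy data above, the single most active voter casts 60% of all votes.
print(f"Most active 5% of users cast {top_voter_share(votes):.0%} of votes")
```

Run against real rating data, a number like this makes it obvious when a “community score” is really the opinion of a few prolific voters.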

This is hardly the first time that the so-called “wisdom of the crowds” has been called into question. The term, which implies that a diverse collection of individuals makes more accurate decisions and predictions than individuals or even experts, has been used in the past to describe how everything from Wikipedia to user-generated news sites like Digg.com offers a better service than anything a smaller group could create.

Of course, we now know that simply isn’t true. For one thing, Wikipedia isn’t written and edited by the “crowd” at all. In fact, 1% of Wikipedia users are responsible for half of the site’s edits. Even Wikipedia’s founder, Jimmy Wales, has been quoted as saying that the site is really written by a community, “a dedicated group of a few hundred volunteers.”

And as for Digg.com, a site whose algorithm is constantly tweaked in attempts to democratize its users’ votes, it remains a place where a handful of power users can make or break a news item’s chances of reaching the site’s front page.

Attempts to Address the Issue

It’s not surprising, then, to discover that small groups are in control of review sites, too. Some sites, including Amazon, attempt to address this imbalance by allowing users to vote on the helpfulness of reviews – a much easier process than writing a review yourself. Local business finder and recommendations site Yelp, meanwhile, added an owner comments feature that lets business owners respond to reviews they feel are inaccurate. Unfortunately, despite these efforts, small groups still remain in control of these so-called “popular opinion” features.

According to the article, another CMU professor, Niki Kittur, suggested that sites build new tools for transparency. For example, an easy way to see a summary of a user’s contributions would quickly reveal any bias. He also suggested removing overly positive and overly negative reviews.
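
As a rough illustration of the kind of contribution summary Kittur describes (this is our sketch, not anything proposed by the researchers), a few lines of code can tally how many reviews a user has written and how their scores lean; a reviewer who has rated hundreds of items and given nearly all of them one star stands out immediately. The data and field names below are invented:

```python
from statistics import mean

# Hypothetical reviews: (user_id, item_id, star rating on a 1-5 scale)
reviews = [
    ("prolific_pat", "gadget-1", 1), ("prolific_pat", "gadget-2", 1),
    ("prolific_pat", "gadget-3", 1), ("prolific_pat", "gadget-4", 2),
    ("casual_cam", "gadget-1", 4),
    ("casual_kim", "gadget-2", 5),
]

def contribution_summary(reviews, user_id):
    """Summarize one user's reviewing history: volume and average score."""
    scores = [stars for user, _, stars in reviews if user == user_id]
    return {
        "user": user_id,
        "review_count": len(scores),
        "average_rating": round(mean(scores), 2) if scores else None,
    }

# A glance at the summary exposes a heavy, consistently negative voter.
print(contribution_summary(reviews, "prolific_pat"))
# {'user': 'prolific_pat', 'review_count': 4, 'average_rating': 1.25}
```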

Earlier this year, we looked at another user-generated review site that attacks this problem from a different angle. Lunch.com, a new Yelp competitor, uses what it calls a “Similarity Network,” which matches you to other site users who share your interests. That way, instead of wading through a list of reviews that could come from anyone with an agenda or an axe to grind, you’re focused on reviews from people like you.
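
Lunch.com hasn’t published how its Similarity Network works, but matching users by overlapping tastes is commonly done with a similarity measure such as cosine similarity over their ratings. Here’s a minimal, hypothetical sketch of that general approach (the profiles and names are invented):

```python
from math import sqrt

# Hypothetical taste profiles: user -> {place: rating on a 1-5 scale}
profiles = {
    "you":      {"thai-place": 5, "dive-bar": 4, "sushi-spot": 2},
    "kindred":  {"thai-place": 5, "dive-bar": 5, "sushi-spot": 1},
    "stranger": {"thai-place": 1, "dive-bar": 1, "coffee-shop": 5},
}

def cosine_similarity(a, b):
    """Cosine similarity between two rating profiles (unrated items count as 0)."""
    items = set(a) | set(b)
    dot = sum(a.get(i, 0) * b.get(i, 0) for i in items)
    norm_a = sqrt(sum(v ** 2 for v in a.values()))
    norm_b = sqrt(sum(v ** 2 for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Rank everyone else by how closely their tastes match yours.
matches = sorted(
    ((cosine_similarity(profiles["you"], profiles[u]), u) for u in profiles if u != "you"),
    reverse=True,
)
print(matches)  # "kindred" ranks first; their ratings overlap most with yours
```

The point of such a scheme is that a power user with an axe to grind only influences you if their tastes actually resemble yours.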

Still, there is no perfect solution to the problem yet. Perhaps it’s time we give up the idea that the “wisdom of the crowds” was ever a driving force behind any socialized, user-generated anything and accept that, just as in life, there will always be active participants as well as passive passersby.
