
How Google Can Combat Content Farms

In my recent post about the rise of content farms like Demand Media and the current incarnation of AOL, I posited that Google (and search in general) risks becoming less relevant as the Web gets drowned in lower-quality content. This is due to the scale at which these content farms are operating – Demand Media alone pumps out 4,000 new pieces of content every day. The solution, of course, is for Google and other search engines to find better ways to surface quality content, whether that be from traditional news media, blogs or even Demand Media (not all of its content is poor quality).

So how can Google evolve to identify quality content better?

Quality! Pah, Does Google Need to Bother?

Perhaps we should first answer the question: why should Google be worried about the quality issue? After all, it has a virtual monopoly on the search market. The obvious (and PR) answer is that Google wants to provide the best search results possible for its users. But there is another big reason why Google needs to do something. So-called “quality” content providers are already well advanced in routing around Google, or at least in making it less relevant.

As I wrote yesterday, Reuters is onto something with its subscription business model. According to Chris Ahearn, President of Media at Thomson Reuters, the company already makes the “vast majority of its revenues” from subscription-based business models targeted to “vertical and niche markets.” Reuters also provides services, not just content. Bloomberg is another leading media company finding success with this strategy.

The subscription model is making inroads because users themselves are flocking to it. A prime example comes from VC Paul Kedrosky, who became frustrated after doing various Google searches for “dishwasher reviews” and getting unsatisfactory results. He says that this has made him “more willing to pay for things” – in this case, a Consumer Reports review of dishwashers. As Kedrosky archly noted, “the opportunity cost of continuing to try to sort through the info-crap in Google results was simply too high.”

What Google Can Do

Google surely knows that quality (or lack thereof) in its index is a problem. As one part of the solution, Google is currently experimenting with real-time search results from social media sites like Twitter, MySpace and even Facebook. The theory is that users are more likely to get timely, relevant results by tapping into their social network.

That’s all well and good, but real-time search is unlikely to give you better results on the dishwasher search and other topic-focused search queries. So what else can Google do to identify and surface quality material?

Some readers of Sunday’s post (Tadhg, Charles Coxhead and others) argued that Google’s current algorithm accounts for quality well enough, through the link economy. But many others thought that Google must get better at ranking for quality. Here are some of our readers’ suggestions:

  • Neutralize the link dilution; A.J. Kohn, who further wrote that “the introduction of SearchWiki, their measurement of short-clicks versus long-clicks, the new domain/brand SERP listing, snippet links, and use of breadcrumbs all point to a gathering movement to help determine quality without such a reliance on an ever diluted link ecosystem.”
  • Do a better job ranking authority; for more on this read Clay Shirky’s post on “Algorithmic Authority.”
  • Introduce a user rating system; Tony Masinelli.
  • Leverage sharing networks to determine where the quality is; Alex Kessinger.
  • Special curation and algorithms on top of that; William Mougayar, whose company Eqentia does precisely that.
  • p2p recommendation (i.e. filtering through your peers); Nick Taylor.
  • Capture engagement data; Mark Littlewood.
  • Give special weightings to categories of content, e.g. content farms, social media bookmarks, blogs and Twitter; Aaron Savage.
  • Use anti-spam-style software to identify content that makes too much use of keywords; Barry (a minimal sketch of this idea follows the list).
  • Track reputation against authors rather than URLs – a ‘PageRank for People’; Marshall Clark (see the second sketch below).
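
Barry’s anti-spam idea is the easiest to prototype. Here’s a minimal sketch of a keyword-density check in Python – the tokenizer, the stopword list and the 8% threshold are all my own illustrative assumptions, not anything Google has published:

```python
import re
from collections import Counter

# Tiny stopword list for illustration; a real system would use a much fuller one.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "for", "on", "it", "our"}

def keyword_density_flags(text, top_n=3, threshold=0.08):
    """Return (term, density) pairs for frequent terms exceeding the threshold."""
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]
    if not words:
        return []
    counts = Counter(words)
    return [(term, count / len(words))
            for term, count in counts.most_common(top_n)
            if count / len(words) > threshold]

# A keyword-stuffed snippet trips the flag; naturally written text generally won't.
stuffed = "Dishwasher reviews: our dishwasher reviews cover every dishwasher. " * 10
print(keyword_density_flags(stuffed))  # e.g. [('dishwasher', 0.43), ('reviews', 0.29)]
```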

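Marshall Clark’s ‘PageRank for People’ maps neatly onto the original PageRank iteration – just run it over a graph of authors citing authors instead of pages linking to pages. A toy sketch, where the citation graph and the standard 0.85 damping factor are invented for illustration:

```python
def author_rank(links, damping=0.85, iterations=50):
    """links: dict mapping each author to the list of authors they cite."""
    authors = set(links) | {a for cited in links.values() for a in cited}
    rank = {a: 1.0 / len(authors) for a in authors}
    for _ in range(iterations):
        # Everyone gets a small baseline share, plus shares from whoever cites them.
        new_rank = {a: (1 - damping) / len(authors) for a in authors}
        for author, cited in links.items():
            if cited:  # authors who cite nobody simply leak rank in this toy version
                share = damping * rank[author] / len(cited)
                for target in cited:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical citation graph: shirky is cited by everyone, so he ranks highest.
links = {"kedrosky": ["shirky"], "battelle": ["shirky", "kedrosky"], "shirky": []}
print(sorted(author_rank(links).items(), key=lambda kv: -kv[1]))
```

The hard part, of course, isn’t the iteration but reliably attributing content to authors across sites – which is presumably why nobody has shipped this yet.
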
These are all great ideas. Google is almost certainly already doing at least some of them, as are other search companies. Will this lead to an improvement in 2010? John Battelle is even expecting a “major breakthrough” in search in 2010. I hope he’s right.

One thing is for sure: Google will need to do more in 2010 if it’s to stay ahead of the content farms and continue to surface quality content for its millions of users.
