On 31 March 2005 a Google US patent application was made public that reveals some interesting details about how Google ranks your website. Patent application number 20050071741 was actually filed on 30 September 2003, but only published at the end of March 2005. Darren Yates did some analysis on it – here are some of his observations (mixed with my own):
– How long the domain name has been around is a factor in good ranking: the longer the better, probably because domain age is an indicator of reliability.
– Links are analysed by Google with a focus on historical data. In Darren’s words: “Google records the discovery of a link, link changes over time, the speed at which a site gains links and the link life span.”
– “Fast link acquisition” is being targeted by Google as a likely spam signal. Darren said: “fewer but better quality links will benefit you more and they will be much more likely to last over the long term, which is good too.” (See my toy sketch below for how a few of these signals might combine.)
Nick Finck put it nicely over at Digital Web: high-ranking sites will be “less of a link farm, less of a re-blog, less of a link exchange, less of a faux landing page.”
– Click-through rate (CTR) counts: “sites are rewarded for good CTR with a rise in ranking. Similar to how AdWords works.”
– User behavior is monitored. No surprise there, but this includes seasonal rankings (e.g. ski websites ranking higher in winter) and “bookmarks and favorites could be monitored for changes, deletions or additions”.
– That old Web 1.0 staple, stickiness, still counts: “Clicks away from your site back to the search results are also monitored. Make your site as sticky as possible to keep visitors there longer.”
– The two extremes of website updates are discouraged – mass updates and very few updates: “Mass updates of hundreds of files will see you pop up on the radar. On the other hand, few or small updates to your site could see your rankings slide – unless your CTR is good.”
– Spam indicators according to Google include changing the focus of multiple pages at once and a spike in the number of topics.
(RM says: does this mean topic-focused blogs are rewarded with better Google PageRank as well as blogosphere attention? Sounds like it…)
– Other, more SEO-focused (search engine optimisation) spam indicators include changes in keyword density and the reputation of your host’s IP address.
Darren’s final bit of advice is to “grow your site as organically as possible”, which is definitely something bloggers can relate to. This patent application seems to target spammers, which is great news.
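Just to make that concrete, here’s a quick toy sketch in Python of how a handful of these historical signals – domain age, link acquisition speed, CTR and bounce-backs to the results page – could in principle be rolled into one score. To be clear: every function name, weight and threshold below is my own invention for illustration; the patent application doesn’t spell out any formula like this.

    from datetime import date
    from statistics import median

    def toy_historical_score(domain_registered, monthly_new_links, ctr, bounce_back_rate,
                             today=date(2005, 3, 31)):
        """A made-up scoring sketch -- not Google's actual algorithm.

        Mixes a few of the historical signals the patent application talks
        about: domain age, the speed of link acquisition, click-through rate,
        and how often visitors bounce straight back to the search results.
        Every weight and threshold here is invented for illustration only.
        """
        # Older domains score a little higher (capped so age can't dominate).
        age_years = (today - domain_registered).days / 365.0
        age_score = min(age_years, 5.0) / 5.0  # 0.0 .. 1.0

        # Steady link growth is fine; a sudden burst of new links
        # ("fast link acquisition") drags the score down.
        typical = median(monthly_new_links)
        burst_penalty = 0.3 if max(monthly_new_links) > 5 * max(typical, 1) else 0.0

        # Good CTR helps; clicking straight back to the results hurts.
        behaviour_score = ctr - bounce_back_rate  # roughly -1.0 .. 1.0

        return age_score + 0.5 * behaviour_score - burst_penalty

    # A three-year-old site with steady links and decent CTR outscores a
    # brand-new site that suddenly gained 500 links in a month.
    steady = toy_historical_score(date(2002, 3, 1), [10, 12, 9, 11], ctr=0.25, bounce_back_rate=0.10)
    bursty = toy_historical_score(date(2005, 1, 1), [0, 2, 500], ctr=0.05, bounce_back_rate=0.40)
    print(round(steady, 2), round(bursty, 2))  # 0.69 vs -0.43

The point isn’t the numbers – it’s that a steadily growing, genuinely useful site wins over one that games a single signal, which is exactly the “grow organically” advice above.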
However, I do find it odd that, according to Darren, Google wants us to make our websites “more ‘sticky’ to encourage visitors to stay a while”. [I believe he’s referring to section 0092, User Behavior, in the patent.] In the blogosphere, the only ‘sticky’ thing is a user’s RSS Aggregator! Of all the blogs I follow, I only regularly visit the actual websites of a small percentage of them. The rest I read in Bloglines or Rojo, only clicking through occasionally for the comments (and even with excerpt-only feeds, I’m more likely to pass over an item than click through). Joshua’s been writing some excellent posts on this topic lately.
Mind you… because this patent was filed way back in September 2003 (before RSS Aggregators became popular), it’s likely that Google have updated their site ranking methods since then – to accommodate RSS reading and other Web 2.0 activities such as API access to web content. Google is usually more than one step ahead of anything they release publicly(!), so I’d be surprised if they haven’t already accounted for ‘off-site browsing’ (i.e. RSS aggregation and web services) in their site ranking algorithms. I hope so. Hmmm, do you think they have?