Here’s an interesting and under-reported story: a developer by the name of Robert Norris Hills recently created a bot that automatically adds users to Circles on Google Plus. The bot runs until it reaches the friend limit of 5,000, then removes the Circle it created and starts again, keeping those who added it back.
Why would such a bot be a problem on Google’s new social network? Because, like similar “auto-follow” behavior employed by Twitter spammers in the past, many users will return follow (or in this case, return “encircle”) those who follow them. And that can be used to a spammer’s advantage.
There is some good news to report, however: Google throttled the bot within days of it going live. But according to Hills, that may not be enough.
Automatic “Circling”
Hills posted this video (see below; warning, rocking soundtrack! adjust your volume!) to YouTube recently, using the catchy title “A Certain Shade of Scoble,” referring to power user and startup tester extraordinaire, Robert Scoble, who is currently following the maximum number of friends (5,000) on Google Plus.
As the video explains, the bot runs for a few hours, friending a total of 5,000 people. The next day, it runs again, removing the Circle of 5,000, except for those who had added it back. Last weekend, before Google addressed the issue, Hills tweaked the bot to run 30% faster, and said that the bot could now add 1,000 friends in 5 minutes. But unlike with his first video, Hills was never able to upload proof of these claims to YouTube before Google stepped in to stop the bot.
We reached out to Google to ask how the system was secured. According to a Google spokesperson, this isn’t “an interesting story.” Unfortunately, our follow-up questions went unanswered.
We feel differently about this news, of course. It is interesting, we think. And here’s why.
Google Taking a Position on Automation? And, Yes, It Matters
What’s interesting about this story is not only how quickly Google made changes to throttle the bot’s behavior (within 3 days of the video upload, something it should be proud of!), but also what it suggests about the larger issue of how spammers could game Google Plus in the future.
Right now, we suspect, Google is so wrapped up in the war on fake names (Google is requiring that real names are used on the service) that it can’t fully focus on this issue. In addition, the 5,000 friend limit makes this type of automation of minimal concern for now. After all, if spammers were to use this type of system in order to amass followers today, they would have to create multiple accounts to reach a wide enough audience to make their link-baiting (or whatever nefarious activity they involve themselves in) worthwhile.
But one day, Google Plus will go live. And one day, the limit of 5,000 may be lifted. Or so we hope. As Marshall Kirkpatrick wrote here yesterday, limits like these seem like architectural shortcomings of our social networks, whether that’s a friend limit on Google Plus or a limit as to how many Twitter lists a user can have.
The problem has been addressed for now, Hills says. Google throttled his bot’s behavior so that it can only “encircle” one friend every 5 to 8 seconds. Still, any tool that allows for automation like this should be kept under watch, as it means that people can begin to use the service in non-organic ways. Hills suggests that simple CAPTCHAs could be used after set intervals to confirm it’s a real user on a Circling spree, and not a bot.
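To make the mitigation concrete, here’s a minimal, purely illustrative sketch of the kind of throttle Hills describes: enforce a minimum delay between circle adds, and demand a CAPTCHA after a set number of actions. The class name, thresholds, and return values below are our own assumptions, not Google’s actual implementation.

```python
import time

class CircleThrottle:
    """Hypothetical rate limiter for 'encircle' actions (illustrative only)."""

    def __init__(self, min_interval=5.0, captcha_every=100, clock=time.monotonic):
        self.min_interval = min_interval    # seconds required between adds
        self.captcha_every = captcha_every  # adds allowed before a human check
        self.clock = clock                  # injectable clock, eases testing
        self.last_add = None
        self.adds = 0

    def try_add(self):
        """Return 'ok', 'wait' (too fast), or 'captcha' (periodic human check)."""
        now = self.clock()
        if self.last_add is not None and now - self.last_add < self.min_interval:
            return "wait"          # bot-like burst: reject until interval passes
        self.adds += 1
        self.last_add = now
        if self.adds % self.captcha_every == 0:
            return "captcha"       # every Nth add, require a CAPTCHA solve
        return "ok"
```

Note the clock is injected so the policy can be tested without real delays; a 5-second floor alone already caps a bot at roughly 720 adds per hour, far below the 1,000-in-5-minutes pace Hills claimed.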
Should Google Allow Automation?
Google may be right when it says, essentially, “nothing to see here,” but that same sentiment won’t be appropriate forever. At some point, Google will have to address how it handles bots and automation and take a position. Twitter did. Digg did. And Facebook did, with notable fallout resulting from the ban of the aforementioned Mr. Robert Scoble.
What will Google’s policy on automation be? Throttle but don’t prevent? Prevent but don’t ban? (Hills’ account is still active, for what it’s worth.) Happily allow it through the yet-to-launch API for developers? What do you think it should be?