Google Sensors Are Data Mining I/O Attendees - And They Don't Care

If you're visiting the Google I/O developers conference this week, you're a tiny part of a giant Google experiment to sniff out everything from your body heat to your breath. Google is even listening to your footfalls as part of its Data Sensing Lab at I/O 2013.

Think that's a scary, Big-Brother invasion of privacy? The conference attendees I talked to didn't seem to mind. In fact, one wanted Google to collect even more data.

Google planted 525 powered sensors around the halls of San Francisco's Moscone Convention Center, and began collecting data from them on Wednesday, according to Michael Manoochehri, a developer programs engineer at Google. The company is measuring temperature, humidity, light, pressure (including nearby footfalls), motion, air quality and both RF and ambient noise. All of the data is sent back at intervals of 20 seconds or so, collected by Google App Engine, with analysis performed by its BigQuery Big Data analysis tool. You can see the results at the Lab's dedicated Web site.
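To make the pipeline concrete, here is a minimal sketch of what one of those sensor nodes might do: buffer readings locally and flush them upstream roughly every 20 seconds, as the article describes. The class name, payload fields and flush logic are all assumptions for illustration, not Google's actual schema or code.

```python
import json
import time

FLUSH_INTERVAL = 20  # seconds between uploads, per the article

class SensorNode:
    """Hypothetical sensor node that batches readings for periodic upload."""

    def __init__(self, sensor_id, now=time.time):
        self.sensor_id = sensor_id
        self.now = now              # injectable clock, handy for testing
        self.buffer = []
        self.last_flush = now()

    def record(self, kind, value):
        """Buffer one reading (e.g. temperature, humidity, ambient noise)."""
        self.buffer.append({"sensor": self.sensor_id,
                            "kind": kind,
                            "value": value,
                            "ts": self.now()})

    def maybe_flush(self):
        """Return a JSON batch if the interval has elapsed, else None."""
        if self.now() - self.last_flush < FLUSH_INTERVAL or not self.buffer:
            return None
        batch, self.buffer = self.buffer, []
        self.last_flush = self.now()
        # In the real system, this batch would presumably be POSTed to an
        # App Engine endpoint and loaded into BigQuery for analysis.
        return json.dumps(batch)
```

Batching like this keeps network chatter down while still giving BigQuery a near-real-time stream to query.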

Among other things, Google's I/O developer conference has focused this year on improving developer tools and better integrating the services that it already owns via a more intelligent cloud. The unnamed sensor project, part of Google's Data Sensing Lab, encompasses a bit of all of that. By itself, knowing that the air quality diminished at 4 a.m. might be intriguing, but not all that significant. But by correlating that information with a peak in another data stream - ambient noise, say - it becomes possible to guess what's going on; in this case, perhaps, the arrival of the cleaning crew.
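That kind of cross-stream inference can be as simple as a correlation coefficient. Below is a hedged sketch: given two equally sampled streams, a Pearson correlation flags when they move together (or in opposition). The sample numbers are invented for illustration; they are not Data Sensing Lab data.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical samples around 4 a.m.: air quality dips while noise peaks.
air_quality = [0.9, 0.85, 0.6, 0.4, 0.45, 0.8]   # made-up quality index
noise_db    = [38,  40,   55,  70,  65,   42]    # made-up decibel readings

r = pearson(air_quality, noise_db)
# A strongly negative r suggests the two events coincide -
# consistent with, say, a cleaning crew arriving.
```

In practice this query would run against the full sensor history in BigQuery rather than a handful of points, but the reasoning is the same.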

Manoochehri said that Google could build queries against the sensor network into its Google I/O app, to identify the quietest spots on the floor for a phone call or a brief nap.

Crossing The Creepy Line?

Eric Schmidt, then the chief executive of Google, famously described Google's policy as "to get right up to the creepy line, but not cross it." When Google unified its privacy policy in March 2012, the company suggested that its unified services could anticipate an afternoon meeting and direct you to leave at a certain time. A year ago, that notion prompted righteous outrage from members of Congress, users and privacy advocates. A year later, that feature (now called Google Now) has been lauded as the herald of anticipatory search. (Six privacy advocates from the EU are still threatening action.)

Source: Google

It's probably fair to say that attendees of Google I/O give Google a bit more leeway than the general public. That certainly proved to be the case for those sitting near the sensors. Alan Holzman, a retired venture capitalist who last worked for Intel Capital, shrugged it off. "My life is tied to Google in much more significant ways," he noted.

Ditto for Sam Napolitano, who was covering Google I/O for the Huffington Post. Napolitano said he believed that the sensors were probably picking up on the NFC tag embedded within his name tag - something that Google employees said wasn't true. In any event, Napolitano said, he didn't care, as he had no expectations of privacy in a public space. "As long as it's not under my toilet seat, I don't care," Napolitano said of the sensors.

And "Rachid," an employee of Motorola Mobility who declined to give his last name, said he wanted Google to sample more data. More data and more correlation often yield more interesting results, he said, such as insight into the various causes of depression.

The Internet Of Things

Collecting data from sensors is increasingly seen as part of the rise of the so-called Internet of Things, and Google clearly wants to be a leader in this growing domain. Google already collects some location data via Android phones to improve its knowledge of traffic conditions and provide better routes via Google Maps.

(See also How The Internet Of Things Will Revolutionize Search.)

We know that Google is very good at parsing user data - pulling keywords from emails, for example, and selling ads against them. (Selling ads against search terms is child's play.) Likewise, it can make recommendations for where to eat, where to go, the route to take and when to leave - building more comprehensive, personalized and valuable profiles along the way.

But the I/O conference project suggests that Google is prepared to take the same value proposition - collect data, analyze it, and provide and sell services against it - far beyond today's core businesses. Imagine placing sensors on Google Street View cars and selling a comprehensive snapshot of air quality to the communities it maps. Or mounting similar sensors on the light poles from which it strings its Google Fiber broadband connections.

It will be interesting to see how far Google takes this. Remember, this is the company that attempted to track the spread of influenza via search terms. Google said that it wants attendees and other users to be able to interact with its new sensor data via the project's website. How long before we can do the same for, say, all of San Francisco?