One of the key aspects of the emerging Internet of Things – where real-world objects are connected to the Internet – is the massive amount of new data on the Web that will result. As more and more “things” in the world are connected to the Internet, it follows that more data will be uploaded to and downloaded from the cloud. And this is in addition to the burgeoning amount of user-generated content – which has increased 15-fold over the past few years, according to a presentation that Google VP Marissa Mayer made last August at Xerox PARC. Mayer said during her presentation that this “data explosion is bigger than Moore’s law.”
During my visit to Hewlett Packard Labs earlier this month, I spoke to Parthasarathy Ranganathan, a Distinguished Technologist at HP Labs, about this large influx of data onto the Web.
Like Mayer, Ranganathan compared the growth of online data to Moore's Law, telling me that it's rising significantly faster. HP CEO Mark Hurd put it this way in June 2009: "more data will be created in the next four years than in the history of the planet."
281 Exabytes of Online Data in 2009
In her presentation at PARC, intriguingly entitled “The Physics of Data,” Mayer noted that there have been three big changes to Internet data in recent times:
- Speed (real-time data);
- Scale (“unprecedented processing power”);
- Sensors (“new kinds of data”).
Mayer went on to say that there were 5 exabytes of data online in 2002, which had risen to 281 exabytes by 2009. That's a 56-fold increase over seven years. Partly, she said, this has been the result of people uploading more data: the average person uploaded 15 times more data in 2009 than they had just three years earlier.
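To put those numbers next to Moore's Law, here's a quick back-of-the-envelope calculation (a sketch, assuming Moore's Law as a doubling roughly every two years; the figures are Mayer's):

```python
import math

# Mayer's figures: 5 exabytes online in 2002, 281 exabytes in 2009
start_eb, end_eb, years = 5, 281, 7

total_factor = end_eb / start_eb              # ~56x over seven years
annual_factor = total_factor ** (1 / years)   # implied compound annual growth
doubling_time = math.log(2) / math.log(annual_factor)

moore_doubling = 2.0  # Moore's Law: a doubling roughly every two years

print(f"Total growth: {total_factor:.1f}x over {years} years")
print(f"Implied annual growth: ~{(annual_factor - 1) * 100:.0f}% per year")
print(f"Data doubling time: {doubling_time:.1f} years vs. ~{moore_doubling:.0f} years for Moore's Law")
```

The data is doubling roughly every 1.2 years by this estimate, which is why both Mayer and Ranganathan can say the growth outpaces Moore's Law.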
A Sensor Revolution
Mayer talked about “a sensor revolution,” including data from mobile phones. She remarked that “today’s phones are almost like people,” in that they have senses such as eyes (a camera), ears (a microphone) and skin (a touch screen).
HP’s Ranganathan used the term “ubiquitous nanosensors” for sensors that can each capture multiple dimensions of data, such as air flow.
Ranganathan noted that there will soon be millions of sensors working in real time, with data sampled every second. He said there’ll be lots of different applications for this data, including retail, defense, traffic, seismic, oil, wildlife, weather and climate modeling.
HP sees its role as providing the computing platform required to deal with this massive influx of data and the complexity of processing it in real-time. Google clearly sees itself as a provider of exascale Web services.
We don’t know yet which computing or Internet companies will be most successful over the next 5-10 years, but one thing is for sure: they’ll have to know how to process and make sense of massive quantities of data flowing through the Web – and do it in real-time.
Photo credit: nasa1fan/MSFC