At Facebook’s developer conference F8 today, a number of trends that Web watchers predicted would be defining characteristics of the future suddenly became part of mainstream discourse. The Facebook megalith learns fast from its R&D department – what the rest of us call the rest of the Web.
Specifically: data as platform, the real-time/synchronous Web and pre-cognitive discovery. Those are things Web nerds have long said would be big, and in one fell swoop today Facebook made its move on them all. Below, some thoughts on data as a platform, and on the new Timeline feature in particular. Timeline is like a grown-up version of the much-criticized Facebook Beacon, which the company removed after a backlash several years ago. It will be interesting to see how people react this time – I suspect it may be less a question of privacy and more a question of creepiness.
Data as Platform
Facebook’s forthcoming Timeline is a half-automatic, half-manually-curated visual display of a user’s past activities and app data. Inspired by Nick Felton’s personal annual reports, the feature is something you can sign up for now; it will roll out over the next few weeks.
Data about your online activities is now displayed in a beautiful interface that “tells the story of who you are.” It looks great, in theory.
I expect that you’ll see many more projects like this from many other companies. Value added interfaces on top of aggregated data are going to be a key area of competition.
For now, the “data exhaust” that we all publish passively as a result of using the Web will be turned into a timeline automatically on Facebook.
Will the day come when people feel comfortable allowing apps like run trackers, music players and food-photo posters to automatically grab and publish their activity data? I think that day may be here now.
Facebook’s Beacon did this with off-site shopping data, and there was a huge backlash. But that was four years ago, and it wasn’t implemented nearly as well as Timeline is. Beacon didn’t add nearly as much value for the user as Timeline will. Shopping data, or data that could inform indirect advertising, may be added to Timeline later; for now it adds a lot of emotional value to the Facebook user experience, built on top of wide-ranging data. Much, but not all, of it is opt-in. It’s really smart.
Facebook founder Zuckerberg believes that this experience of value built on top of shared data will compel people to share far more data than they do today. It will be the next step in the continuing growth of social sharing. I think he might be right.
The most likely worst-case scenario, though, is that people will be creeped out by all this. I think there’s a risk that the new Facebook Timelines are going to look a lot like our real lives – almost exactly like them, but not quite. Far enough off to be frustrating, but close enough to be creepy.
We’ll see if it feels empowering, like a new way to articulate the epic meaning of all our individual social lives, or if it feels like a too-nearly-human Panopticon, full of more memories than our own brains are capable of retaining. And thus somehow wrong.
In the world of human look-alike robots, there’s a theory called the uncanny valley. Per Wikipedia: “The uncanny valley is a hypothesis in the field of robotics and 3D computer animation, which holds that when human replicas look and act almost, but not perfectly, like actual human beings, it causes a response of revulsion among human observers.”
We’ll see. I think there’s a real risk that people will find the Facebook reflection of themselves repulsive, horrifying on an existential level.
It’s not about privacy or the clear exploitation of activity data, as it was with Beacon; it’s more a concern with how accurately one company is able to stitch together a picture of our whole lives.
The Timeline feature will roll out in the coming weeks, and then we’ll find out which of those it feels like.