This spring, Tim O'Reilly was surprised to find himself defending Facebook's changes to its privacy policy. "There's enormous advantage for users in giving up some privacy online and [so] we need to be exploring the boundary conditions," the founder of O'Reilly Media and international technology thought leader wrote. "It's easy to say that this should always be the user's choice, but entrepreneurs from Steve Jobs to Mark Zuckerberg are in the business of discovering things that users don't already know that they will want, and sometimes we only find the right balance by pushing too far, and then recovering."

That's an interesting argument when it comes to consumer products and innovation, but I got to sit down with O'Reilly on the first day of his big OSCON conference yesterday and talk about privacy in a different context: health care, government, global cultural change and a crisis of crises.

Health Data as a Resource in a Changing World

O'Reilly told me he was recently at the US Department of Health and Human Services (HHS), talking about the kinds of things that could be done "if we could use Medicare data like Google uses clickstream data." Such organizations have a lot of fear concerning consumer privacy. "The old model of privacy isn't taking into account any of the trade-offs, and clearly people are willing to make those trade-offs," he says. "Google Maps on your phone sends your location to someone else's server every time you look something up, for example."

What kinds of services could be built using the kind of data that public and other large institutions hold? O'Reilly offered a related example of innovation built on top of previously unused data. Passur Aerospace is a company that wanted to do predictive analytics on air traffic data. "The airlines had the data, but they were throwing it away," O'Reilly says.

"So [Passur] set up its own network of radar stations. Now they sell predictive services to airports. Continental Airlines flights into New York that were running late went from 25 flights a day to none, because of these complex models the company was able to construct. If we had open data today, the FAA wouldn't be throwing it away and somebody would have figured that out faster and cheaper."

A lot of that data that developers will analyze and build on top of in the future will be data about us, collectively and as individuals, O'Reilly says.

"Technology is taking us in a direction where more and more is known about us. I refer a lot to Jeff Jonas on this. It's hard to be completely anonymized. I think we need a completely fresh look at what trade-offs we're making and why. A good example is health care privacy. It's true that there are some diseases that still have stigmas around them, but our need for privacy is mostly about adverse selection by insurance companies. The problem we need to solve is adverse selection due to pre-existing conditions, not to treat the info like it's toxic waste. If we look at the benefits of using the information - they are incredible.

"One thing we can do is look at places where people have given up a fair amount of privacy and feel ok about it. The financial arena is one of those places - it's ok to do data mining for fraud prevention.

"It's clear we're in the middle of an incredible revolution in what technology enables. In the private sector we're getting further and further ahead of government. We collaborate, get information on demand, our augmented reality is the military's wet dream from a few years ago - and it's fricking free. It's all about access to massive cloud data at any time, when and where we need it.

"In the face of that world, our policies are so hopelessly outdated. I don't know what the right policies are going to be, but I do know they won't be the same policies as 10 years ago, 50 years ago. They need a deep rethinking. We need to ask, what kind of outcomes do we want and how do we get there? And assume that you're not going to stop that more and more is known about us."

O'Reilly's Gov 2.0 conference in September will put an emphasis on bringing together leaders of industry, including finance, and government, in order to share lessons learned from working with data.

When Bad Things Happen

I asked Tim what this inevitable march away from anonymity means to monks in Burma or student protesters in Iran, whose safety and ability to use technology to effect social change depends on anonymity.

"I don't really have a good answer to that," he said.

"Flickr and YouTube killed people in those places. We have to acknowledge that. People have to be aware, and we could build more technology for places where you do need to be anonymous. If you're dealing with those kinds of dangerous situations, if you're risking your life, then you act differently than a normal person. Ultimately it is hard to remain anonymous. There are pro-privacy projects, like Tor, and it's worth putting in place as much of the infrastructure for anonymity as possible before it's needed."

Across technology innovation and tech-facilitated social change, O'Reilly thinks there's a big picture: the human condition is a social one, and our technologies should help us build a response to oncoming crises that's grounded in that humanity.

"In open source, the government space, and social media for good: we are building mechanisms for us to save ourselves, for us to work together, to remember what society is. Institutions are things we build to save ourselves. There is some bad shit coming down in our future: global warming, peak oil, wars, pandemics. It's not always going to be happy-happy. All the stuff we're building is going to be stuff that can help us be more adaptable, that will help us respond as a society. As Harlan Ellison wrote, 'why else did we come all this way? To be alone?'"

"We do all of this to do it together, to be together; society is a coping mechanism. Everything we do that's good is to make easy things easier and hard things possible. That was the original Perl slogan, and this was originally the Perl conference.

"We need to adopt that strategy in government. Right now we make easy things hard and hard things impossible."

Privacy is Up For Debate

O'Reilly's is certainly a compelling position on the relationship between privacy and innovation, but it's not the only one. For general counterpoints, see our article Why Facebook is Wrong: Privacy is Still Important and danah boyd's SXSW 2010 talk, Making Sense of Privacy and Publicity.

What do you think? Should technologists push against the boundaries of privacy and try to change its consequences, for the sake of innovation and the betterment of society? Or is it realistic to expect tech companies to prioritize the protection of personal control over information, even while building out innovative services, including those built on top of personal data?

Photo by Franz Patzig