Tim O'Reilly was recently at the US Department of Health and Human Services (HHS), talking about the kinds of things that could be done "if we could use Medicare data like Google uses clickstream data." The response was a very cautious one.

Big organizations have a lot of fear concerning people's privacy, but book publisher, event organizer and industry luminary Tim O'Reilly thinks it's time to reconsider our beliefs about personal information. "The old model of privacy isn't taking into account any of the trade-offs, and clearly people are willing to make those trade-offs," he says. "Google Maps on your phone sends your location to someone else's server every time you look something up, for example." At a moment when the future of privacy is being actively debated, O'Reilly's position carries real weight.

I got to sit down with O'Reilly earlier this week, just before the start of his big open source conference OSCON. We talked about a number of things, but it was the discussion of digital privacy that stood out the most. O'Reilly's may be the most intelligent argument in favor of sacrificing some privacy that I've seen yet.

O'Reilly argues that the world is changing dramatically, so our decades-old policies, preferences and beliefs about personal privacy need to change, too. By loosening our privacy requirements, changing the consequences of disclosing personal information where we can and weighing the trade-offs, we could capture an incredible bounty of innovation for social good, at a time of great peril when such social innovation will be sorely needed.

What Are the Upsides?

What kinds of services could be built using the kind of data that public and other large institutions hold? O'Reilly offered a related example of innovation built on top of previously untapped data. Passur Aerospace is a company that wanted to do predictive analytics on air traffic data. "The airlines had the data, but they were throwing it away," O'Reilly says.

"So they set up their own network of radar stations. Now they sell predictive services to airports. Continental Airlines flights into New York that were running late went from 25 flights a day to none, because of these complex models the company was able to construct. If we had open data today, the FAA wouldn't be throwing it away and somebody would have figured that out faster and cheaper."

Much of the data that developers will analyze and build on top of in the future will be data about us, collectively and as individuals, O'Reilly says.

"Technology is taking us in a direction where more and more is known about us. I refer a lot to Jeff Jonas on this. It's hard to be completely anonymized. I think we need a complete fresh look at what trade-offs we're making and why. A good example is health care privacy. It's true that there are some diseases that still have stigmas around them, but our need for privacy is mostly about adverse selection by insurance companies. The problem we need to solve is adverse selection due to pre-existing conditions, not treating the info like it's toxic waste. If we look at the benefits of using the information - they are incredible.

"One thing we can do is look at places where people have given up a fair amount of privacy and feel ok about it. The financial arena is one of those places - it's ok to do data mining for fraud prevention.

"It's clear we're in the middle of an incredible revolution in what technology enables. In the private sector we're getting further and further ahead of government. We collaborate, get information on demand, our augmented reality is the military's wet dream from a few years ago - and it's fricking free. It's all about access to massive cloud data at any time, when and where we need it.

"In the face of that world, our policies are so hopelessly outdated. I don't know what the right policies are going to be, but I do know they won't be the same policies as 10 years ago, 50 years ago. They need a deep rethinking. We need to ask, what kind of outcomes do we want and how do we get there? And assume that you're not going to stop that more and more is known about us."

"It's easy to say that this should always be the user's choice," O'Reilly wrote in a blog post about Facebook privacy this spring, "but entrepreneurs from Steve Jobs to Mark Zuckerberg are in the business of discovering things that users don't already know that they will want, and sometimes we only find the right balance by pushing too far, and then recovering."

O'Reilly's Gov 2.0 conference in September will emphasize bringing together leaders from government and industry, including finance, to share lessons learned from working with data.

Below: O'Reilly and other industry leaders yesterday introduced the Code for America fellowship program, dedicated to leveraging data for civic good.


When Bad Things Happen

I asked Tim what this inevitable march away from anonymity means for monks in Burma or student protesters in Iran, whose safety and ability to use technology to effect social change depend on anonymity.

"I don't really have a good answer to that," he said.

"Flickr and YouTube killed people in those places. We have to acknowledge that. People have to be aware, and we could build more technology for places where you do need to be anonymous. If you're dealing with those kinds of dangerous situations, if you're risking your life, then you act differently than a normal person. Ultimately it is hard to remain anonymous. There are pro-privacy projects, like Tor, and it's worth putting the infrastructure for anonymity in place, as much as possible, before it's needed."

Across technology innovation and tech-facilitated social change, O'Reilly sees a big picture: the human condition is a social one, and our technologies should help us build a response to oncoming crises that's grounded in that humanity.

"In open source, the government space, and social media for good: we are building mechanisms for us to save ourselves, for us to work together, to remember what society is. Institutions are things we build to save ourselves. There is some bad shit coming down in our future: global warming, peak oil, wars, pandemics. It's not always going to be happy-happy. All the stuff we're building is going to be stuff that can help us be more adaptable, that will help us respond as a society. As Harlan Ellison wrote, 'why else did we come all this way? To be alone?'"

"We do all of this to do it together, to be together, society is a coping mechanism. Everything we do that's good is to make easy things easier and hard things possible. That was the original Perl slogan, to make easy things easier and hard things possible, and this was originally the Perl conference.

"We need to adopt that strategy in government. Right now we make easy things hard and hard things impossible."

Privacy is Up For Debate

O'Reilly's is certainly a compelling position on the relationship between privacy and innovation, but it's not the only one. For general counterpoints, see our article Why Facebook is Wrong: Privacy is Still Important and danah boyd's SXSW 2010 talk, Making Sense of Privacy and Publicity.

Traditionally, a higher price has been paid for lost privacy by society's most marginalized people. Will that be the case here as well?

What do you think? Should technologists push against the boundaries of privacy and try to change the consequences of disclosure, for the sake of innovation and the betterment of society? Or is it realistic to expect tech companies to prioritize protecting personal control over information even while building innovative services, including those built on top of personal data?

Photo by Scott Beale/LaughingSquid