During today’s SXSW keynote, social media researcher Danah Boyd, who works for Microsoft Research New England and is a fellow at Harvard University’s Berkman Center for Internet and Society, talked about online privacy. Specifically, she focused on how users can navigate issues around online privacy and how developers can help them to do so.

Boyd, who has researched how mainstream users use social media for the last couple of years, argued that developers have to focus on questions about privacy and publicity as they build these new applications and experiences. According to Boyd, privacy is not dead: users care about it – both online and offline – and often react strongly when their expectations of privacy are violated.

Google Buzz: Privacy Fail

Looking at the example of Google Buzz, which she called a “privacy fail,” Boyd argued that Google didn’t do anything technically wrong when it released Buzz. Instead, Google made a number of non-technical mistakes that interrupted a set of social expectations its users had.

Google’s mistakes:

  • Building a public system in an environment that most people consider to be private (their email service). A lot of users actually believed that once they started using Buzz, Google would expose all of their private emails to the world.
  • Google assumed that users would simply opt out if they didn’t want to participate. A lot of Google users, however, thought that opting out of Buzz would cancel their Gmail accounts.
  • Technologists tend to assume that the technically optimal solution is the best one and forget about social rituals. Boyd noted that users expect to be able to choose their friends, for example – a social ritual that Google interrupted when it automatically populated its users’ Buzz accounts with the people they tended to email most often.

To explain these issues, Boyd distinguished between articulated networks (address books, Facebook, Twitter), behavioral networks (based on common behavior, location, etc.) and personal networks. According to Boyd, people don’t necessarily want to bring all of this info together (which Buzz did). Instead, they want to be able to separate different groups.
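Boyd’s distinction can be sketched in code. The snippet below is purely illustrative – the group names and contacts are hypothetical, not anything from Buzz itself – but it shows the difference between merging every network into one audience (what Buzz effectively did) and keeping inferred contacts as suggestions the user must confirm:

```python
# Illustrative sketch of Boyd's three network types (names are hypothetical).
articulated = {"alice", "bob"}   # explicitly chosen contacts (address book, followers)
behavioral = {"bob", "carol"}    # inferred from behavior (frequent email contacts)
personal = {"dave"}              # close ties the user keeps to themselves

# What Buzz effectively did: merge everything into a single visible list.
merged = articulated | behavioral | personal

# What users expect: inferred contacts surface as suggestions to confirm,
# not as connections made public on the user's behalf.
suggestions = behavioral - articulated
```

The design point is the last line: a behavioral signal is treated as input for a suggestion, leaving the articulated network under the user’s control.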

It’s also important to remember that private and public are not always clear binary opposites. While technology often makes it look that way, in real life things tend to get a lot messier. If you are out in a café, for example, you are in a public space, but you expect a certain community to be there – and others not to be – and you still expect a certain degree of privacy while you are talking to your friends.

Facebook’s Privacy Fail

Users generally don’t handle change well, which can have serious privacy implications. When Facebook asked its users to reevaluate their privacy settings a few months ago, the default choice was “everyone.” Many people encountered the Facebook popup notifying them of these changes, clicked through without reading it, and suddenly all of their data was public. According to Facebook, only about 33% of users made changes. As Boyd noted in her talk, most Facebook users simply didn’t understand the privacy settings.

Public by Default, Private by Effort

By default, most conversations on social media services are now public, while making them private takes a conscious effort. By and large, teenagers, according to Boyd, are more conscious about what they can gain by being public, while adults worry more about what they could lose. That, however, can lead to shortsighted decisions and have serious consequences – something developers need to think about as they create their social media applications and especially aggregators.
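The design choice Boyd criticizes can be made concrete with a small sketch. This is a hypothetical API, not any real service’s code – it simply contrasts a public-by-default setting with a private-by-default one, where widening the audience requires a deliberate, explicit act:

```python
# Hypothetical sketch: visibility defaults. "Public by default, private by
# effort" means the safe choice should be the one users get without acting.
DEFAULT_VISIBILITY = "friends"  # private-by-default assumption, not "everyone"

def post_visibility(explicit_choice=None):
    """Return the audience for a post; only an explicit choice widens it."""
    if explicit_choice is not None:
        return explicit_choice   # the user consciously opted for this audience
    return DEFAULT_VISIBILITY    # doing nothing keeps the post private-ish
```

Flipping `DEFAULT_VISIBILITY` to `"everyone"` is the one-line change that, per Boyd, shifts the burden of effort from the service onto the user.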

The Public-By-Default Environment is Not the Great Democratizer

Just because something is publicly accessible doesn’t mean that people want it to be publicized. The launch of Facebook’s News Feed, for example, caught users by surprise because it broke the social contract on Facebook. While the data in the News Feed had always been available, aggregating it violated the privacy expectations of most users. Developers, according to Boyd, have to ask themselves how the people whose content they are remixing and aggregating would feel if all of this data were suddenly available in one place.

What Can Developers Do?

  • There is no magical formula: privacy exists in social contexts and these contexts are complex and change constantly. For technologists, this is what makes it so hard to deal with these problems. Developers, said Boyd, have to learn to navigate these complexities and interact with their users. Developers also have to consider that privacy slip-ups can have real-world consequences for users.
  • Developers have to ask themselves how they would feel if the information they aggregate were suddenly disclosed. Just because you can see somebody doesn’t mean they want to be seen.
  • Wanting privacy is not about having something to hide, but about control and creating space to open up.