Facebook changed the world by helping 350 million people publish their thoughts, feelings, comments, photos, videos and shared links much more easily than ever before. It’s the King of social networking.
The network grew with a big promise of privacy at the center of what it offered: your information was by default visible only to people you approved as friends. In December that changed, in a fundamental way. We offer below a summary of the changes that were made and key highlights from the debate that’s raging around the world about privacy, public information and Facebook. Given the role that Facebook plays in so many of our lives, this is high-stakes stuff.
What changed in December: Facebook users are no longer allowed to restrict access to their profile photos or to the list of Pages they have subscribed to for updates. Every user's friends list was initially made irrevocably public; after a strongly negative reaction from users, a way was provided to hide those lists from human view, leaving them visible only to machine access.
User updates (“What’s on your mind?”), shared photos, videos and links used to be private (visible only to approved friends) by default. If you had never tweaked your privacy settings, in December the default shifted to public (visible to the entire web) unless, when prompted, you chose to switch them back to private.
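To make that default shift concrete, here is a minimal sketch in Python of how an untouched account's settings might migrate under a change like this. Everything in it (the setting names, the "friends_only"/"everyone" values, the apply_transition helper) is a hypothetical illustration, not Facebook's actual data model or code.

```python
# Hypothetical illustration of a default-visibility shift; not Facebook's real system.

OLD_DEFAULTS = {           # pre-December defaults for an account that never changed anything
    "status_updates": "friends_only",
    "photos": "friends_only",
    "videos": "friends_only",
    "shared_links": "friends_only",
}

NEW_DEFAULTS = {           # the new, public defaults suggested by the transition prompt
    "status_updates": "everyone",
    "photos": "everyone",
    "videos": "everyone",
    "shared_links": "everyone",
}

def apply_transition(user_settings: dict) -> dict:
    """Settings the user never customized inherit the new public default;
    explicit choices are kept unless the user changes them when prompted."""
    migrated = {}
    for field, old_value in user_settings.items():
        if old_value == OLD_DEFAULTS.get(field):   # user never touched this setting
            migrated[field] = NEW_DEFAULTS[field]  # so it now defaults to public
        else:
            migrated[field] = old_value            # a deliberate choice is preserved
    return migrated

if __name__ == "__main__":
    untouched_account = dict(OLD_DEFAULTS)
    print(apply_transition(untouched_account))
    # every field comes back as "everyone" -- public to the entire web
```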
Those aren’t simple changes to understand and there has been a lot of confusion about them. Many people do not like the way this is going. Here are some of the highlights of that debate.
Facebook’s Arguments in Favor of a Shift Towards Public Information
In July we asked Facebook executives point-blank on a press call about some of the initial changes in privacy settings: are you pushing people towards sharing more information publicly on the site? Two of the three executives we asked said yes, they were. Why? The answers have been inconsistent and not very compelling.
Facebook Product Manager Leah Pearlman told us that making more user data publicly visible would help users identify which people were their friends when search results showed multiple people with the same name. Facebook Director of Communications Brandee Barker told us that more public information would help users connect with new people who share common interests.
Chief Privacy Officer Chris Kelly told us the July changes weren’t about decreased privacy, but about increased control for users over their privacy.
When December’s changes went down, we had a long conversation with Barry Schnitt, Director of Corporate Communications and Public Policy at Facebook. Schnitt told us that the shift towards more public information was big, just as “it was a big change in 2006 when Facebook became more than just people from colleges.” “Facebook is changing,” he said, “and so is the world changing and we are going to innovate to meet user requests.”
Schnitt said it was clear the world was changing away from a focus on privacy and cited as evidence the rise in blogging, Twitter and MySpace, comments posted on newspaper websites and the popularity of Reality TV. Schnitt also acknowledged that page views and advertising were part of the motivation.
Then in January, Facebook founder and CEO Mark Zuckerberg said publicly that if he were creating Facebook today, the privacy settings would have been, from the start, just what they are now. He said that notions of privacy are evolving and that the company changed its policies to reflect that, citing the rise of blogging as his evidence.
Finally, the company has said for some time that more public information will lead to greater familiarity, understanding and empathy between people: that a shift towards a public Facebook is good for world peace. This might actually be the most compelling argument of all, and even it isn’t very compelling, because of the matter of user trust.
The Arguments Against Facebook’s Change
Two years ago Facebook founder Mark Zuckerberg told us that Facebook users couldn’t be permitted to take their data from Facebook to other sites they wanted to use it on because privacy control “is the vector around which Facebook operates.” The company has changed its stance regarding privacy dramatically since then.
Many people believe that Facebook is getting ready to file for an Initial Public Offering (IPO) – to start selling stock in the company to the public. It’s widely suspected that this shift toward more public information is intended to increase website traffic and advertising: the more pages you can look at, unhindered by privacy settings, the more ads Facebook will be able to show you. The more ads Facebook can show you, the more its stock will be worth in the IPO.
We’ve argued that the ways Facebook is justifying these shifts just aren’t believable. Last week we made these three arguments:
Even if society is moving away from privacy, that doesn’t justify taking away the option to keep many things private. As Microsoft researcher danah boyd wrote this weekend:
People still care about privacy because they care about control. Sure, many teens repeatedly tell me ‘public by default, private when necessary’ but this doesn’t suggest that privacy is declining; it suggests that publicity has value and, more importantly, that folks are very conscious about when something is private and want it to remain so. When the default is private, you have to think about making something public. When the default is public, you become very aware of privacy. And thus, I would suspect, people are more conscious of privacy now than ever.
As Nick O’Neill wrote on his own blog AllFacebook:
When Facebook decided that they would start making these decisions on behalf of users, they crossed the line. Facebook doesn’t need to update their system to ‘reflect what the current social norms are’. Instead, Facebook should give users complete control of their privacy and as a result, user settings in aggregate will effectively ‘reflect what the current social norms are’. Simplifying a system which gives users complete control of their privacy isn’t easy but the value of such a system is priceless and for Facebook it’s necessary.
Privacy isn’t just about keeping things secret; it’s about respecting the context of communication and not pushing people’s communication out of the context it was intended for. The fact that “nothing is secret on the internet” is therefore beside the point. As University of Massachusetts-Amherst Legal Studies student Chris Peterson writes in his research paper Saving Face: The Privacy Architecture of Facebook (PDF), people today feel their privacy has been violated when what they say to one group of people gets shared with another group in different circumstances. That is exactly what Facebook is doing by pushing personal information out of the restricted access of “friends only.”
There are many people who need to maintain control over their personal information and restrict it to trusted friends as a matter of personal safety. As online identity technical consultant Kaliya Hamlin wrote here last month, Facebook’s push away from privacy represents a violation of its contract with users. Scientists have been able to determine people’s sexual preferences by analyzing their friends lists. People with religious or political views that are unpopular where they live or work, and people escaping abusive relationships, used to be able to keep private information (like interests in the form of Fan pages) between trusted friends on Facebook, but no longer can.
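The friends-list point is worth unpacking: once a friends list is machine-readable, even a crude algorithm can guess undisclosed traits from the company a person keeps. Below is a toy sketch of that idea, a simple majority vote over friends’ publicly declared attributes. It is an illustration of the general technique, not the method used in the research mentioned above, and all names and data in it are invented.

```python
from collections import Counter

# Toy illustration: guessing an undisclosed attribute from a public friends list
# by majority vote over friends' declared values.

def infer_attribute(person, friends_of, declared, attribute):
    """Guess `attribute` for `person` from friends who declare it publicly."""
    votes = Counter(
        declared[friend][attribute]
        for friend in friends_of.get(person, [])
        if friend in declared and attribute in declared[friend]
    )
    if not votes:
        return None                                  # nothing to go on
    value, count = votes.most_common(1)[0]
    return value, count / sum(votes.values())        # guess plus crude confidence

# Invented data: 'alice' declares nothing, but her friends do.
friends_of = {"alice": ["bob", "carol", "dave", "erin"]}
declared = {
    "bob":   {"politics": "party_x"},
    "carol": {"politics": "party_x"},
    "dave":  {"politics": "party_y"},
    "erin":  {"politics": "party_x"},
}

print(infer_attribute("alice", friends_of, declared, "politics"))
# ('party_x', 0.75) -- a guess produced without alice disclosing anything herself
```

Even this naive approach shows why a friends list that is visible “only to machine access” still leaks information the user never chose to share.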
Here’s how danah boyd explained a similar argument:
Power is critical in thinking through these issues. The privileged folks don’t have to worry so much about people who hold power over them observing them online. That’s the very definition of privilege. But most everyone else does. And forcing people into the public eye doesn’t dismantle the structures of privilege, the structures of power. What pisses me off is that it reinforces them. The privileged get more privileged, gaining from being exposed. And those struggling to keep their lives together are forced to create walls that are constantly torn down around them. The teacher, the abused woman, the poor kid living in the ghetto and trying to get out. How do we take them into consideration when we build systems that expose people?…People care deeply about privacy, especially those who are most at risk of the consequences of losing it. Let us not forget about them. It kills me when the bottom line justifies social oppression. Is that really what the social media industry is about?
Finally, thinker and author Nick Carr weighed in this weekend as well with a withering article titled “Other People’s Privacy.” He discussed both Facebook’s shift away from privacy and Google CEO Eric Schmidt’s recent statement that “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.”
Carr drives home the significance of these anti-privacy moves and statements by calling them a threat to human liberty.
Reading through these wealthy, powerful people’s glib statements on privacy, one begins to suspect that what they’re really talking about is other people’s privacy, not their own. If you exist within a personal Green Zone of private jets, fenced off hideaways, and firewalls maintained by the country’s best law firms and PR agencies, it’s hardly a surprise that you’d eventually come to see privacy more as a privilege than a right. And if your company happens to make its money by mining personal data, well, that’s all the more reason to convince yourself that other people’s privacy may not be so important.
There’s a deeper danger here. The continuing denigration of privacy may begin to warp our understanding of what “privacy” really means. As Bruce Schneier has written, privacy is not just a screen we hide behind when we do something naughty or embarrassing; privacy is ‘intrinsic to the concept of liberty’:
For if we are observed in all matters, we are constantly under threat of correction, judgment, criticism, even plagiarism of our own uniqueness. We become children, fettered under watchful eyes, constantly fearful that – either now or in the uncertain future – patterns we leave behind will be brought back to implicate us, by whatever authority has now become focused upon our once-private and innocent acts. We lose our individuality, because everything we do is observable and recordable.
Privacy is not only essential to life and liberty; it’s essential to the pursuit of happiness, in the broadest and deepest sense of that phrase. It’s essential, as Schneier implies, to the development of individuality, of unique personality. We human beings are not just social creatures; we’re also private creatures. What we don’t share is as important as what we do share. The way that we choose to define the boundary between our public self and our private self will vary greatly from person to person, which is exactly why it’s so important to be ever vigilant in defending everyone’s ability and power to set that boundary as he or she sees fit. Today, online services and databases play increasingly important roles in our public and our private lives – and in the way we choose to distinguish between them. Many of those services and databases are under corporate control, operated for profit by companies like Google and Facebook. If those companies can’t be trusted to respect and defend the privacy rights of their users, they should be spurned.
Privacy is the skin of the self. Strip it away, and in no time desiccation sets in.
Desiccation is the process of drying something out by removing its water; Carr argues that stripping privacy from our lives would suck dry our liberty and our individuality.
Those are the arguments being made. We don’t expect this debate to die down anytime soon.