You have to admit, he’s getting better at this. Four years ago, in response to numerous public complaints – many of them in court – about its plans to share aggregate user data with third parties, Facebook responded in a flat, dismissive tone: users were given every opportunity to opt out of behavior sharing, so whatever they don’t opt out of is effectively their own problem.
Today’s settlement between Facebook and the U.S. Federal Trade Commission effectively ensures that the company can no longer take this stance without facing intense U.S. government scrutiny. But in the intervening four years, Facebook has become a veteran of government scrutiny, including from Canada’s Privacy Commissioner and regulators throughout Europe. And it has become far more skilled at adapting its semantics to strike the right political, and often psychological, tones.
When it’s in a software company’s interest to market a product that performs a set of functions, its marketing makes the case that innumerable workflows can be condensed and simplified into simple, manageable streams, often with circles and arrows. When it’s in that company’s interest to defend its own ability to manage the data entrusted to it, its marketing makes the case that such colossal amounts of data are beyond the ability of any one company to control on its own.
Thus the word from Facebook CEO Mark Zuckerberg today is not that privacy is the user’s problem, but rather the user’s privilege. In place of an opt-out button, Facebook has instituted a myriad of privacy controls, including the ability to define groups and regulate which classes of data may be shared among those groups. But today, Zuckerberg was able to characterize these multiple controls as pre-emptive responses to the FTC – a way to say we’ve already addressed the problem, thank you very much, but we’re willing to accept suggestions because this job is bigger than any one company.
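To picture what those controls amount to in practice, consider a minimal sketch – hypothetical names throughout, not Facebook’s actual code – of group-based sharing: the user defines groups, then assigns each class of profile data the set of groups allowed to see it.

```python
# A minimal sketch, with hypothetical names throughout, of group-based
# sharing rules: each class of profile data maps to the set of groups
# whose members may view it. This is not Facebook's actual code.
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    rules: dict = field(default_factory=dict)  # data class -> allowed groups

    def allow(self, data_class: str, group: str) -> None:
        self.rules.setdefault(data_class, set()).add(group)

    def can_view(self, data_class: str, viewer_groups: set) -> bool:
        # Visible only if the viewer belongs to at least one allowed group.
        return bool(self.rules.get(data_class, set()) & viewer_groups)

settings = PrivacySettings()
settings.allow("photos", "close_friends")
settings.allow("relationship_status", "family")

print(settings.can_view("photos", {"close_friends"}))  # True
print(settings.can_view("photos", {"coworkers"}))      # False
```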
We’re okay with you having issues
“I also understand that many people are just naturally skeptical of what it means for hundreds of millions of people to share so much personal information online, especially using any one service,” writes Zuckerberg. “Even if our record on privacy were perfect, I think many people would still rightfully question how their information was protected. It’s important for people to think about this, and not one day goes by when I don’t think about what it means for us to be the stewards of this community and their trust. Facebook has always been committed to being transparent about the information you have stored with us – and we have led the Internet in building tools to give people the ability to see and control what they share.”
The CEO goes on to list many of the controls the company has already implemented, including a dashboard for checking the extent of the data that apps written for the Facebook Platform may access. He continues, “We do privacy access checks literally tens of billions of times each day to ensure we’re enforcing that only the people you want see your content. These privacy principles are written very deeply into our code.”
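What such a check might look like is easy to imagine, though Facebook has published nothing of the sort; the gate below is purely illustrative, with every name hypothetical: a single enforcement function through which every content read must pass.

```python
# An illustrative gate, not Facebook's actual implementation, of the
# per-request privacy check Zuckerberg describes: every read of a piece
# of content passes through one enforcement function before any data is
# returned. All names here are hypothetical.
from typing import NamedTuple

class Content(NamedTuple):
    payload: str
    allowed_groups: frozenset   # groups the owner permits to view this item

class Viewer(NamedTuple):
    name: str
    groups: frozenset           # groups the owner has placed this viewer in

def check_and_fetch(content: Content, viewer: Viewer) -> str:
    # "Written very deeply into our code" would mean no read path
    # can bypass this function.
    if not (content.allowed_groups & viewer.groups):
        raise PermissionError(f"{viewer.name} may not view this content")
    return content.payload

photo = Content("beach.jpg", frozenset({"close_friends"}))
print(check_and_fetch(photo, Viewer("Alice", frozenset({"close_friends"}))))
# check_and_fetch(photo, Viewer("Bob", frozenset({"coworkers"})))  # raises
```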
But “people” aren’t always the problem. As it becomes more adept at defending its own interests, Facebook has lately been successful at redirecting attention away from the real problem – the one spelled out in the FTC’s complaint, which was settled today.
30. Facebook has disseminated or caused to be disseminated numerous statements to users stating that Platform Applications they use will access only the profile information these applications need to operate, including, but not limited to:
a. the following statement, which appeared within a dialog box that each user must click through before using a Platform Application for the first time:
Allowing [name of Application] access will let it pull your profile information, photos, your friends’ info, and other content that it requires to work. (Authorization Dialog box, Exhibit D); and
b. the following additional statements on www.facebook.com:
i. Applications you use will access your Facebook information in order for them to work. (Facebook Privacy Settings: What You Share, Exhibit E); and
ii. When you authorize an application, it will be able to access any information associated with your account that it requires to work. (Facebook Privacy Settings: How Applications Interact With Your Information, Exhibit F).
31. Contrary to the statements set forth in Paragraph 30, in many instances, a Platform Application could access profile information that was unrelated to the Application’s purpose or unnecessary to its operation. For example, a Platform Application with a narrow purpose, such as a quiz regarding a television show, in many instances could access a user’s Relationship Status, as well as the URL for every photo and video that the user had uploaded to Facebook’s Web site, despite the lack of relevance of this information to the Application.
Then in Count 4 of what would have been the indictment, “In truth and in fact, as described in Paragraph 31, from approximately May 2007 until July 2010, in many instances, Facebook has provided Platform Applications unrestricted access to user profile information that such Applications have not needed to operate. Therefore, the representation set forth in Paragraph 32 constitutes a false or misleading representation.”
Pay no attention to the app behind the curtain
People – or rather, who sees your private data as opposed to what – are not the real problem, and frankly never were. Under the terms of the settlement, Facebook must now implement some type of control that triggers a visible warning prior to “any sharing of a user’s non-public user information by [Facebook] with any third party, which materially exceeds the restrictions imposed by a user’s privacy setting(s).” That means that when an app is about to access data beyond what a user’s privacy settings allow, the user must be informed.
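In code terms, the mandated control reduces to a set comparison: take the fields an app is about to access, subtract the fields the user’s settings permit, and warn the user if anything remains. The sketch below is speculative – the settlement specifies the behavior, not an implementation – and every name in it is hypothetical.

```python
# A speculative sketch of the settlement's required control. All names
# are hypothetical; the FTC order specifies the behavior, not the code.

def fields_exceeding_settings(requested: set, permitted: set) -> set:
    """Fields the app wants that the user's settings do not allow."""
    return requested - permitted

def share_with_app(app_name: str, requested: set, permitted: set, warn) -> set:
    excess = fields_exceeding_settings(requested, permitted)
    if excess:
        # The order demands a visible warning; sharing the excess
        # proceeds only if the user affirmatively consents.
        if not warn(app_name, excess):
            return requested & permitted  # share only what settings allow
    return requested

# Example: an app asking for far more than it needs,
# with a user who declines the warning.
granted = share_with_app(
    "tv_quiz",
    requested={"name", "relationship_status", "photo_urls"},
    permitted={"name"},
    warn=lambda app, excess: False,
)
print(granted)  # {'name'}
```

The FTC’s own example – the TV-quiz app that could read Relationship Status and the URL of every photo a user had uploaded – would trip such a check immediately.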
How can Facebook get around this? By continuing to characterize its cornucopia of privacy settings in terms of people, not things. If there are no privacy settings governing the classes of data apps may have unrestricted access to, then there are no restrictions to materially exceed. This is where the new Dashboard tool to which Zuckerberg referred may play a critical role. Although Facebook characterizes it as a way to limit how apps “personalize your experience” (once again making it into a who, not a what), this tool is a per-app “opt out” mechanism that effectively acknowledges that the default state of Platform app access is “open” unless you yourself state otherwise.
The settlement terms may directly impact how this tool gets used, and may compel Facebook to reverse that default state. If that happens, the way games and other apps interact with Facebook may have to change too, and developers may not be pleased.
To deal with whatever uproar may ensue, Zuckerberg said he’s appointing another set of Chief Privacy Officers. In actuality, they were already there: Erin Egan was appointed “Director of Privacy” in September, and today was named “Chief Privacy Officer, Policy.” Michael Richter was the company’s long-standing Chief Privacy Counsel, and is now “Chief Privacy Officer, Product.”
“These two positions will further strengthen the processes that ensure that privacy control is built into our products and policies,” writes Zuckerberg. Of course, the previous roles of Privacy Director and Privacy Counsel served that same purpose.
A two-sided conclusion
On the one hand: As any IT security manager knows, the way to implement privacy control in an organization is not to make the private data available in the first place. Modern information security policies are not built on per-instance restrictions to an otherwise free flow of information; the same controls can, and perhaps should, direct the flow in the opposite direction. That is to say: share nothing by default, and opt in to the services that other users, and even apps, may request.
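Expressed as code, the difference between the two postures is simply where the default lies. Here is a minimal sketch of the default-deny, opt-in model argued for above (illustrative names only, not any real API), the inverse of the default-open, opt-out state described earlier:

```python
# A minimal sketch of a default-deny, opt-in policy: nothing is shared
# unless the user has explicitly said otherwise. Names are illustrative,
# not any real API.

class DefaultDenyPolicy:
    def __init__(self):
        self.opted_in = set()   # (app, data_class) pairs the user allowed

    def opt_in(self, app: str, data_class: str) -> None:
        self.opted_in.add((app, data_class))

    def may_access(self, app: str, data_class: str) -> bool:
        return (app, data_class) in self.opted_in

policy = DefaultDenyPolicy()
print(policy.may_access("tv_quiz", "photo_urls"))  # False: closed by default
policy.opt_in("tv_quiz", "photo_urls")
print(policy.may_access("tv_quiz", "photo_urls"))  # True only after opt-in
```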
On the other hand: Facebook’s responsibility for the protection of data that users provide of their own free will, without any binding contract other than an implied consent agreement, is somewhat limited. The FTC was careful to cite Facebook for misrepresenting its services from the outset, and that misrepresentation gave the government the leverage it needed to force Facebook to change its policies (even though Zuckerberg implies no such change is necessary now). Had that misrepresentation not existed, the FTC might not have had much ground to stand on. It’s hard to establish a standard of care for property that so many millions of individuals willingly give away for free.