If there truly is no privacy on the Web, then how can we be shocked by reports of a privacy breach?
If that's not the case, and we truly do expect privacy on the Web, then when 15 years go by before major browser makers pledge to implement Do Not Track buttons, and then only at the urging of the President of the United States, whom do we hold at fault for those buttons having been absent all this time? And if those buttons probably won't work anyway, as some experts believe, then who is fooling whom?
"People are holding out the same hopes for Do Not Track that they held out for P3P 15 years ago. It's definitely a whole déjà vu thing here," says Dr. Lorrie Faith Cranor, in the second part of her interview with ReadWriteWeb. Dr. Cranor, now an associate professor of computer science at Carnegie Mellon University, was an early contributor to P3P and former chair of the W3C working group that developed it. P3P never got very far, and she believes there are many lessons from that experience that may now be applied to the Consumer Privacy Bill of Rights, which the White House unveiled Thursday afternoon.
The White House report (PDF) stops short of calling Do Not Track (DNT) the answer to consumer privacy issues.
"Privacy enhancing technologies such as the 'Do Not Track' mechanism allow consumers to exercise some control over how third parties use personal data or whether they receive it at all," the report reads. "For example, prompted by the FTC, members of the online advertising industry developed self-regulatory principles based on the FIPPs, a common interface to alert consumers of the presence of third party ads and to direct them to more information about the relevant ad network, and a common mechanism to allow consumers to opt out of targeted advertising by individual ad networks. A variety of other actors, including browser vendors, software developers, and standards-setting organizations, are developing 'Do Not Track' mechanisms that allow consumers to exercise some control over whether third parties receive personal data. All of these mechanisms show promise. However, they require further development to ensure they are easy to use, strike a balance with innovative uses of personal data, take public safety interests into account, and present consumers with a clear picture of the potential costs and benefits of limiting personal data collection."
A White House statement Thursday indicated that some set of privacy regulations would eventually be enforced by the Federal Trade Commission. It did not say whether DNT would be a factor in those regulations.
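For readers wondering what the "mechanism" in Do Not Track actually is: at the technical level it is little more than a single HTTP request header, `DNT: 1`, which a user's browser attaches to every request and which servers are free to honor or ignore. A minimal sketch, using Python's standard library against a placeholder URL:

```python
# Sketch of the Do Not Track signal: it is just an HTTP header, "DNT: 1".
# Whether any tracker on the receiving end respects it is entirely
# voluntary -- which is the crux of the experts' skepticism.
import urllib.request

req = urllib.request.Request(
    "http://example.com/",              # placeholder URL
    headers={"DNT": "1"},               # 1 = user opts out of tracking
)

# Show the header exactly as the request would carry it.
print(req.get_header("Dnt"))            # prints: 1
```

Note that nothing in the protocol compels a server to change its behavior when it sees the header; compliance is a matter of policy, not of mechanism.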
Self-Regulation: The Sequel
"I would love to see us just turn on the technology that we have, and maybe agree to get together and upgrade it a little bit, and it's good to go. That's really wishful thinking, though," she adds. "I am very pessimistic that that is going to happen."
The problem, as Cranor describes it, is that the real definition of privacy may be much deeper than the one President Obama offered in his preamble to Thursday's report. There, he cited former Supreme Court Justice Louis Brandeis' famous definition of privacy as "the right to be let alone."
Throughout a career full of contributions to the digital privacy process, Dr. Cranor has instead cited Alan F. Westin, the Columbia University law professor and, in 1967, the author of perhaps the most prescient volume on the subject of privacy in communications ever published, Privacy and Freedom. "Each individual is continually engaged in a personal adjustment process," Prof. Westin wrote, "in which he balances the desire for privacy with the desire for disclosure and communication of himself to others."
"As we walk about in the physical world, we raise and lower our voice and we raise and lower our window shades and we turn our faces, and we are all constantly adjusting to regulate our exposure and our privacy," Dr. Cranor tells RWW. "And it comes naturally; we don't spend a lot of time thinking about it. We just sort of naturally do it. But when we go online, it's no longer natural, because we don't have these readily apparent, physical things where you can just easily close that shade, and it's obvious what you're doing. So we have to rely on software tools to help us with this privacy regulation process."
In 2001, working with AT&T Labs, Cranor led the development team of one of the first user-centered privacy preference tools, entitled Privacy Bird. In concept, it was simple. It led the user by the hand through various scenarios, including potential exceptions to privacy rules. A user might not want a first-party site to share session information with a third-party provider... but what if that provider is the thing that makes the shopping cart work?
For the 2005 book Security and Usability, which she co-edited with Simson L. Garfinkel, Cranor wrote:
Initially, most people with whom I have discussed privacy preferences tell me that their privacy preferences are pretty simple - for example, "I don't want companies to give my information to anyone else." But as our conversations continue, people usually start to articulate a variety of exceptions to their simple initial rules. "If I order something from them, then they can provide my information to fulfill the order and ship a package to me. And if I tell them about my hobby, then it would be OK if they send me catalogs related to that hobby or let me know about clubs I might be interested in." Some people, eager for a good deal, go further: "I should have the right to control my information, but junk mail doesn't really bother me so much. So if they are willing to give me something for free, I don't mind throwing away their junk mail. But if they are profiting from my information, I should get something too." And when the discussion turns to the sharing of location or presence information with other individuals, privacy preferences tend to get very complex.
Simpler, But Not Better
As the Consumer Privacy Bill of Rights puts the Do Not Track standard into the spotlight, Cranor believes critics may (rightly) call into question whether its simpler, "Brandeis-style" approach to being tracked will be embraced by the public. If users discover there's no way to assert the exceptions to the rule that matter to them, they could simply turn DNT off - with the result potentially being a more dangerous state of affairs than we have today.
"I think Do Not Track is a very much watered-down P3P," the Carnegie Mellon professor tells RWW. "It's much, much simpler. It's not nearly as powerful. So the question is, was the problem with P3P that it was too complicated, and this very simple thing is what will allow [privacy] to get adopted? I don't know; my crystal ball isn't good enough... If we had P3P, we'd have a way for users to say, 'Okay, if they ask me for this, this, and this then it's okay; for any of these other purposes, it's not okay.' Your browser knows, your browser could deal with it."
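The kind of matching Cranor describes can be made concrete with a toy sketch. This is not real P3P or APPEL syntax - the purpose names and the helper below are hypothetical - but it illustrates the idea of a browser comparing a site's declared data purposes against a user's allowed list:

```python
# Toy illustration (hypothetical purpose names, not real P3P/APPEL syntax)
# of preference matching: the browser accepts a site only if every purpose
# the site declares is one the user has allowed.
ALLOWED_PURPOSES = {"order-fulfillment", "shipping"}  # hypothetical user prefs

def site_is_acceptable(declared_purposes):
    """True only if all of the site's declared purposes are user-allowed."""
    return set(declared_purposes) <= ALLOWED_PURPOSES

print(site_is_acceptable(["shipping"]))                  # prints: True
print(site_is_acceptable(["shipping", "ad-targeting"]))  # prints: False
```

DNT, by contrast, collapses this whole comparison into a single on/off bit, which is exactly the loss of expressiveness Cranor is pointing at.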
Detail of a quilt entitled "Thesis II" by Dr. Lorrie Faith Cranor.
Cranor's CyLab team has evaluated perhaps dozens of third-party privacy tools over the years, few of which conform to anyone's idea of a standard, but some of which do try to take bold steps to enable privacy at a more granular level. One such tool that she likes and actually uses, she tells us, is called Ghostery. But in her team's report which RWW covered last November, entitled "Why Johnny Can't Opt Out," even Ghostery was among the nine tools cited as ineffectual in enabling personal privacy preferences.
The reasons may be unavoidable: Ghostery gives users an entire list of companies that may be tracking their behavior. Since few of these companies are recognizable names, users have no basis on which to make a judgment call. So they may end up blocking everyone, just to get through the checklist and be done with it. They may also do the reverse. Or they could simply accept the "default settings," which for Ghostery and other tools... is nothing at all.
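The all-or-nothing choice described above can be sketched in a few lines. The tracker names and modes here are invented for illustration; the point is that with hundreds of unfamiliar entries, the realistic user outcomes reduce to three:

```python
# Hypothetical sketch of the user outcomes the CyLab study describes:
# block everything, block nothing, or keep the default (often: nothing).
TRACKERS = ["adnet-alpha.example", "metrics-beta.example", "pixel-gamma.example"]

def build_blocklist(trackers, mode="default"):
    """Return the set of trackers a user ends up blocking under each mode."""
    if mode == "block-all":
        return set(trackers)        # went down the checklist, blocked everyone
    if mode in ("allow-all", "default"):
        return set()                # default settings block nothing at all
    raise ValueError(f"unknown mode: {mode}")

print(len(build_blocklist(TRACKERS, "block-all")))  # prints: 3
print(len(build_blocklist(TRACKERS)))               # prints: 0
```

What never happens in practice, the study suggests, is the informed per-tracker decision the interface nominally offers.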
Wrote the CyLab researchers, "The tools we investigated tended to present information at a level that is either too simplistic to inform a user's decision or too technical to be understood."
Cranor told us one of her team's researchers took the trouble of investigating every single tracker on the tools' combined lists. Although companies do tend to post privacy policies, several are not in English, and those in Japanese survive Google's machine translation only as a miserable facsimile of English.
She says she and her colleagues have been contacted by some tools' manufacturers since her report was published last October 31. They've told her there have been improvements since that time, some of which she's seen, and some of which she privately agrees are indeed improvements. But with the cloud becoming a more prominent factor in the delivery of everyday functionality, this year we should see more services that relied on browser-based access converted into apps, extending the mobile delivery model to tablets and desktops. When that happens, whether the browser is even involved in the privacy process at all turns from a certainty into an open question.
As Dr. Cranor remarks, "Fundamentally, we still have to ask ourselves... What we want users to do here is to judge hundreds, maybe even thousands of companies they've never heard of, and make decisions about whether or not to allow them to track them with a technology that most users have no idea how it works. Does this make any sense?"
Stock photo by Shutterstock.com