The liberation of data from the constraints of single volumes, and its expansion to Web-wide scale, is no longer a trend. For many companies, it is an event that has already happened. The result is a completely new landscape, one that appears to have taken Microsoft by surprise.
This week at the RSA Conference in San Francisco, the company whose 2002 Trustworthy Computing initiative was widely read as a euphemism for “Big Brother” is now fighting to keep its seat at the discussion table. In a world where big data rules the discussion, and open source technologies rule big data, is Microsoft even relevant?
“We started talking about building trust in stacks, rooting trust in hardware, making sure that software and data had provenance, so you knew where it came from,” related Scott Charney, Microsoft’s corporate VP for the Trustworthy Computing Group, during his RSA keynote on Tuesday. “We started thinking about identity in new ways. When we talk about identity, of course, we raise serious questions about authentication, but also about privacy. The ability to authenticate users means you can track them across the Internet. We started publishing works on privacy, and moving to a claims-based system where people could assert attributes around themselves, and create a world that’s both more secure and more private at the same time.”
Holes in the Timeline
Indeed, Microsoft has a long and checkered security history, and most of the attendees at RSA are intimately familiar with it. That is why, when Charney displayed a timeline of Microsoft’s security milestones in the context of Windows, the collective groan could be heard in San Jose.
I raised that issue later during the conference with Dave Forstrom, one of Charney’s directors for TWC. He referred me back to a point ten years ago, when Chairman Bill Gates penned a memo ostensibly to his employees, which set the company forth on its TWC mission. “You can make a strong argument that a lot of that originated from pain,” Forstrom told me, after I reminded him of the absence of Windows Me from Charney’s timeline.
“We make no quibble about that. Just this last month, the ten-year milestone provided us with an opportunity to look back and reflect on those pain points,” Forstrom said. “And we make no bones about the fact that we experienced pain from that. Why did Gates put out that memo to the company? He was feelin’ some pain… We’re seeing the implications of what disruption can do, of what network degradation can do. And we’re in the era of mass-mailing worms, and huge-footprinted impact. I think innovation comes from a resolve of, how do you address this and climb back up, acknowledge our responsibility, accept that, and move forward?”
During his keynote, Charney characterized Windows Vista as the most obvious pain point. From a security engineering standpoint, it was an inaccurate characterization, and many folks knew it. While User Account Control was a nuisance for Vista users, it was statistically one of the most effective malware deterrents Microsoft ever shipped. Charney poked fun at UAC by having it interrupt his slideshow with an admin approval prompt.
“The key thing is, of course, we’re going to make mistakes, but we’re going to learn from them,” the corporate VP said while clicking “Allow.” “In Windows 7, we started looking at a trusted user experience in a different way. Because User Account Control was really in part a great idea, and in part a lesson about the challenges of doing security well.”
Relating a story about his mother – a schoolteacher who asked him what Internet Explorer means when it says a download may be dangerous – Charney added, “We see this constant problem, in both security and privacy: How do we get users engaged in a way that makes sense, gives them meaningful information that they can act upon? If you’re going to give a user a message in security or in privacy, that message has to be necessary. It has to be explainable. You have to be able to tell the user why they’re getting this message. And they have to have some information unique to them that allows them to action the message. It has to be actionable, and it has to be tested.”
All of this takes Microsoft to the edge of the emerging topic of big data, but that edge is the base of a very tall cliff. Trustworthiness needs to pertain to how personal data is being used. And once it’s amalgamated into “big data,” the question of trustworthiness takes on new aspects that Bill Gates did not foresee in 2002. Gates saw trust as it pertains to software, not the people behind it. With respect to data, his sole concern was its integrity: “The data our software and services store on behalf of our customers should be protected from harm and used or modified only in appropriate ways,” Gates wrote.
“Take the old world that we know, the very bilateral, linear world that was really about the person and the document,” explained Dave Forstrom with respect to the world of 2002. “Think about how much that has evolved and changed today to this multi-lateral, hyper-connected world, where each and every element – from the machine to the application to the person to the data – could be influencing this rich user experience… in a helpful manner or a harmful manner, innocuous or dangerous, even to the point where, in this environment, you and I as individuals and the entity that’s providing us a service of collecting the data, have no direct relationship at all any more. That certainly strains what we see from the perspective of a traditional privacy model, where the rules of choice and consent existed.”
Forstrom went on to emphasize the importance of maintaining proper “hygiene,” which among the security-literate is a metaphor for basic data protection procedures. “But the big shift is the fundamental approach to security posture. Whereas in the past, it was one that was based and rooted in prevention and recovery, it’s no longer enough to say, I can strengthen and fortify the security of my castle. We have to be prepared to not only detect but contain the fact that intrusion will happen, and beyond that, be able to recover from it.”
It’s a mindset shift, he went on, from prevention and recovery to detection and containment – a shift that the era of big data not only enables, but mandates. “Big data exacerbates privacy concerns. The challenge there is, we have to get to a point where we’re able to balance the crafting of principles to not only recognize and acknowledge our rights as individuals to privacy, but to reap the benefits of big data… The burden has to be shifted from the consumer to the data collector. New privacy models have to focus on a shift from consumer choice and consent, to accountability principles and consensus.”
Put another way: There’s a very real prospect that new laws and regulations could lead to a world where consumer consent and denial eventually resemble UAC on steroids. “Facebook is about to share your profile with Boogoo.ru: Allow / Cancel?” This is a state of affairs that Microsoft may not be able to sustain, especially if it is perceived by everyone down to Scott Charney’s mother as the origin of all the consent prompts. The way to avoid this eventuality is for the data collectors of the world to make their policies accessible, comprehensible, and agreeable to users.
You Have No Privacy: Confirm or Deny?
Charney predicted a near-future state of affairs where all the everyday system events that impact Web sites whose resources are scattered all over the planet will need to be addressable in the aggregate, as a big data set. Systems, such as they are, will be global not only in reach but in physical size. So every error message indicating a fault with the system… could be world-changing. “How are we going to manage composite reliability at scale? Big data, instrumenting products to give us more intelligence about the dependencies among these things.
“But there’s going to be an interesting corollary to big data,” he added. “We’re going to get all these social benefits, and everyone’s going to be thrilled. It has a huge impact on privacy – the ability to analyze data about people in ways that are just astounding compared to where we were.”
Charney, whose background is law rather than marketing or software, reminded the audience that the Supreme Court has historically maintained that whenever personal data is given to a third party, the right to privacy in that data is surrendered. “But in this new cloud-enabled world, you give up everything to third parties. So how does that work?”
Forstrom elaborated: “It’s not sustainable any more to be bombarded with this choice and consent, in terms of [big data] use and this multi-lateral relationship. Of course, we have to set up some sort of accountability principle here. We have to set up a framework that establishes, what is ‘broadly acceptable use’?” For example, can highly aggregated big data be compiled, analyzed, and categorized without effectively compromising the privacy of each individual whose data comprises the aggregate? If pollsters anonymize survey data all the time, why can’t data collectors?
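The pollster analogy has a simple technical counterpart. One of the oldest safeguards for releasing aggregated data is small-cell suppression: publish group totals only when a group is large enough that no individual stands out. The sketch below (a toy illustration in Python, with hypothetical function and field names; real anonymization schemes such as k-anonymity or differential privacy are far more involved) shows the basic idea.

```python
from collections import Counter

def aggregate_with_suppression(records, key, k=5):
    """Roll individual records up into group counts, then suppress any
    group with fewer than k members so that no small cohort can be
    traced back to the people in it. A toy sketch of small-cell
    suppression, not a production anonymization method."""
    counts = Counter(record[key] for record in records)
    return {group: n for group, n in counts.items() if n >= k}

# Hypothetical survey data: only the ZIP code column is ever released,
# and only in bulk.
records = [{"zip": "94103"}] * 7 + [{"zip": "10001"}] * 2
released = aggregate_with_suppression(records, "zip", k=5)
print(released)  # the two-person "10001" cohort is withheld
```

The consumer never sees a consent prompt here; the burden sits with the data collector to decide, before release, which aggregates are safe, which is exactly the shift from consent to accountability that Forstrom describes.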
“A strategy is great to have. But a strategy is really just thought,” proclaimed Scott Charney. “To prove that you’re implementing a strategy comes in your products and services.” With the Windows 8 Consumer Preview having been released this week, and Microsoft’s cloud services being expanded in the coming months, it will now be up to individuals to judge whether the company’s strategy is a framework for others to build upon, or really just thought.