Big Data not only has the potential to measure and anticipate human behavior. It is also dangerously close to helping businesses and governments understand our thoughts, both what we’re thinking now and what we may think in the future. And take action accordingly.
Thus spake George Dyson, science historian and author of Turing’s Cathedral. While Dyson seems most concerned about the NSA’s use of Big Data to predict terrorist actions, his analysis has far wider implications:
The ultimate goal of signals intelligence and analysis is to learn not only what is being said, and what is being done, but what is being thought. With the proliferation of search engines that directly track the links between individual human minds and the words, images, and ideas that both characterize and increasingly constitute their thoughts, this goal appears within reach at last.
It’s not that a machine can understand exactly what we’re thinking at any given point in time. It doesn’t have to. As Dyson explains, “A reasonable guess at what you are thinking is good enough.”
Predicting The Future With Metadata
Importantly, such guesses derive from the very metadata that the NSA has tried to discount. “It’s only metadata,” says the NSA. But as Dyson highlights, it’s the metadata that matters most: “If Google has taught us anything, it is that if you simply capture enough links, over time, you can establish meaning, follow ideas, and reconstruct someone’s thoughts.”
Alex Pentland, one of the world’s foremost data scientists, explains this in more detail, arguing:
[T]he power of Big Data is that it is information about people’s behavior instead of information about their beliefs….What you put on Facebook is what you would like to tell people, edited according to the standards of the day. Who you actually are is determined by where you spend time, and which things you buy. Big data is increasingly about real behavior, and by analyzing this sort of data, scientists can tell an enormous amount about you.
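To make Pentland’s point concrete, consider how little it takes to turn raw metadata into a guess about someone’s interests. The sketch below is a toy illustration, not any real system’s method: the events, the categories, and the profile function are all invented here, and real behavioral models are vastly more sophisticated. It simply tallies where someone spends time and what they buy.

```python
from collections import Counter

# Toy illustration only: the events below are invented, and the scoring is a
# deliberately crude stand-in for the far more sophisticated models Pentland
# describes. Note what is absent: no message bodies, no stated beliefs, just
# metadata about where time was spent and what was bought.
metadata_events = [
    {"type": "location", "place": "gym", "hour": 6},
    {"type": "location", "place": "coffee_shop", "hour": 8},
    {"type": "purchase", "category": "running_shoes"},
    {"type": "location", "place": "gym", "hour": 6},
    {"type": "purchase", "category": "protein_powder"},
]

def profile(events):
    """Count recurring places and purchase categories to guess at interests."""
    interests = Counter()
    for event in events:
        key = event.get("place") or event.get("category")
        interests[key] += 1
    return interests.most_common()

print(profile(metadata_events))
# [('gym', 2), ('coffee_shop', 1), ('running_shoes', 1), ('protein_powder', 1)]
# Even this trivial tally suggests an early-rising, fitness-focused person,
# an inference made entirely from metadata, not from anything they said.
```

Scale that crude tally up to billions of events and far better models, and you get the “reasonable guess at what you are thinking” that Dyson warns is good enough.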
For marketers, this is the stuff of fantasy, now increasingly real. For governments looking to squash dangerous behavior, it’s very real, as recent events with the NSA have demonstrated.
From Behavioral Analysis To Thought Control
Unfortunately, we’re unlikely to be satisfied with a close approximation of someone’s current thoughts. We’re going to want to know what they will be thinking, so we can sell to them or, in the case of government, impede illegal activity.
However, thought doesn’t always lead to action, and governments have traditionally been in the business of policing illegal actions. If we allow government to become equally interested in monitoring, anticipating, and ultimately thwarting thoughts, as a society we will have lost. This is particularly true because, so long as we can’t be completely certain that our assumptions and predictions are accurate, we will assuredly end up punishing people for actions they never would have taken.
Scarier still, we will likely start to punish thoughts.
While this (correctly) sounds Big Brother-ish, it’s actually worse than that. As Dyson posits, “No matter how much digital horsepower you have at your disposal, there is no systematic way to determine, in advance, what every given string of code is going to do except to let the codes run, and find out.”
In other words, no matter how dangerous a thought appears, we can’t in advance know whether it will turn out to be dangerous in fact, or whether it’s simply something that appears wrong to us today but ultimately turns out to be beneficial.
Causation, Not Correlation
All of which is reason enough to be suspicious of the new fad of focusing on correlation, not causation, in Big Data. The correlation argument tells us to stop bothering to understand our data and simply look for patterns, without worrying about why those patterns exist. But it turns out that “why?” is the most critical question to answer, because it’s only when people intervene to overrule dehumanized data that fair government policy and ethical marketing happen.
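Here is the classic confounder case as a toy illustration, with entirely synthetic numbers rather than data from any real study: a pattern-only view finds a strong correlation between ice cream sales and swimming accidents, but the “why” is a third variable, temperature, that a purely correlational model never surfaces.

```python
import random

# Synthetic data: daily temperature drives both ice cream sales and swimming
# accidents, so the two correlate strongly even though neither causes the
# other. A correlation-only model happily "discovers" the pattern; only
# asking "why?" reveals the confounder.
random.seed(0)

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

temps = [random.uniform(10, 35) for _ in range(365)]        # the confounder
ice_cream = [2.0 * t + random.gauss(0, 5) for t in temps]   # caused by temperature
accidents = [0.5 * t + random.gauss(0, 2) for t in temps]   # also caused by temperature

print(round(pearson(ice_cream, accidents), 2))  # strong correlation, typically ~0.8
```

Ban ice cream and the accidents continue; the pattern was real, but acting on it without asking why would have been both useless and unfair, which is precisely the danger of policy or marketing built on correlation alone.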