Researchers recently published a paper that detailed how they manipulated the emotions of 689,003 Facebook users in 2012, in an effort to determine whether positive and negative posts had an effect on their moods.
It turns out that yes, people feel differently, and thus post differently, when they see positive or negative posts in their feeds.
The researchers conducted another unwitting experiment on themselves and their employer: Would revealing such an experiment enrage ordinary Facebook users as well as the academic and scientific community? The answer, they learned over the past weekend, was decidedly yes.
The Facebook Experiments
Researchers ran two parallel experiments for one week in January 2012. In one, they modified the Facebook news feed algorithm to show fewer positive posts in select people’s timelines. Another group saw fewer negative posts.
From the paper, “Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks”:
We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.
The study was conducted by researchers from Cornell University and the University of California San Francisco, along with Adam Kramer, a data scientist at Facebook.
It’s impossible to know if you were one of the people who “participated” in this study, not just because Facebook selected participants at random, but because users were never told they were part of the experiment. Normally, when someone participates in a scientific study, they give “informed consent,” meaning they agree to serve as research subjects and are aware that the study is being conducted.
Facebook didn’t ask participants for consent. Instead, the paper claimed that users’ agreement to Facebook’s Data Use Policy, which anyone with a Facebook account has accepted, gave the company permission to use them as test subjects.
Nowhere in the policy does it say users consent to having emotions manipulated in the name of science. Rather, it says that information may be used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.” And, as Kashmir Hill at Forbes discovered, Facebook didn’t include the “research” portion of the Data Use Policy until May 2012, months after this particular study was conducted.
James Grimmelmann, a law professor at the University of Maryland, says the idea that users consent to scientific studies by way of the company’s terms of service is “bogus.”
“When we do research we have informed consent—that extra word matters,” Grimmelmann told me in an interview. “[Facebook] doesn’t disclose that it will manipulate what it shows you, at all.”
How To Avoid Being A Test Subject
So, if Facebook is claiming that simply by being a Facebook user you are giving the company consent to use your information for scientific research, how can you opt out?
Well, you can opt out of Facebook.
Unlike the privacy settings you can change if you don’t want strangers looking at your profile or selfies, there is no setting that tells Facebook not to use your data. Simply being a Facebook user means you’re giving the company the ability to manipulate your news feed in an effort to analyze human behavior, even if that means your own feelings will change.
Consider that this paper details an experiment conducted two and a half years ago. What has Facebook been doing in the meantime? Are you a guinea pig right now for some Facebook experiment to be revealed in the future?
Thanks to the Data Use Policy, you don’t know.
On Facebook, You’re Just A Data Point
Of course, Facebook manipulates your feelings whether you realize it or not. In this particular study, researchers were trying to figure out how emotions affect what you post on Facebook, and how best to keep you posting on Facebook. But in other cases, it’s all about advertising.
Facebook is a free service, and instead of paying for the convenience in dollars, we pay for it in data. Everything you post on Facebook, all the information you give it, allows Facebook to target you with personalized advertising. And it’s not just Facebook-based posts or likes anymore, but your activity across the entire Web.
Some people, like Facebook board member Marc Andreessen, argue that what Facebook did with the “emotional contagion” experiment is no different than other types of advertising-related testing Facebook does.
But with advertising on Facebook, we know what we’re getting. The ads are actively shown to us, and tailored to our own interests. We trade our information for the ability to use an ad-supported service. And we have some limited ability to opt out of such targeting.
“If you don’t tell people what the tradeoff is, you can’t make a choice,” Grimmelmann said. “With Facebook, the tradeoff they tell you about is, we will show you ads. That’s what you seem to think the data they’re collecting is going to be used for.”
Ultimately, the study was a way for Facebook to figure out how to prevent people from leaving the service because using it made them feel bad.
Data scientist Adam Kramer wrote in a Facebook post:
The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. [emphasis added]
There are a number of ways this Facebook experiment could have been designed not to anger quite so many people. For instance, had Facebook asked users to participate in a study to research the impacts of social media on behavior, the ethical question regarding manipulation of positive and negative emotions wouldn’t even exist.
In that post, Kramer apologized “for the way the paper described the research and any anxiety it caused” and acknowledged that “the research benefits of the paper may not have justified all of this anxiety.”
This study bridges the worlds of academia and Silicon Valley’s tech culture. For years, Facebook’s mantra was “move fast and break things”—even, apparently, users’ trust.
“There’s a difference in expectations between those two worlds,” Grimmelmann said. “It exposes a real lack of ethical cognition underlying a lot of the Valley’s data practices.”
On Facebook, we’re now accustomed to giving up our data and, in exchange, seeing ads for companies we might buy stuff from. For the company to take the data we share with it and use it for purposes not listed in its privacy policy turns Facebook’s more than one billion users into potential test subjects, whether they want to be or not.
Until Facebook changes its practices, there’s only one way to assuredly remove yourself as a candidate for a scientific experiment: Delete your Facebook account.
Update 5:25 p.m.: Updated to note that Facebook didn’t include “research” in its Data Use Policy until May 2012, months after the initial research was conducted.
Lead image by Ludovic Toinel on Flickr