Facebook wants people to stop getting frustrated with the company’s privacy settings. Well, good luck with that.
Almost any change Facebook makes to its privacy controls triggers outcries and accusations that the social network is continuing to erode what little confidence people have left in sharing their data with it, and justifiably so. Yet Facebook just can’t stop trying to win over hearts and minds. If it really wants to succeed, though, it needs to become a lot more transparent, and a lot more restrained, about how it vacuums up data, what sort of data it keeps and what it does with it.
Facebook has faced lawsuits over sketchy privacy practices, and it recently shut down a controversial advertising product that used people’s likenesses in ads. In 2011, the Federal Trade Commission settled with Facebook after the social network failed to keep its privacy promises, and the FTC reminded Facebook of those promises when it cleared the Facebook-WhatsApp acquisition on Thursday.
The company is hoping to change this negative perception. At a roundtable with reporters this past Tuesday, Facebook highlighted a few changes people will start to see in their news feed.
“We haven’t communicated as well as we could have,” Mike Nowack, privacy product manager at Facebook, told reporters. “[Feedback] has led us to think about privacy not just as controls or settings, but as a set of experiences that help people feel comfortable sharing what they want with who they want.”
For instance, Facebook is testing a minor tweak to the look of the drop-down menu that lets you select whom you share with, making “Public” and “Friends” the two prominent options (Facebook says those choices are the most popular). Facebook also told us it runs 4,000 privacy surveys a day to better understand what people like or dislike about their current settings, so it can adjust them accordingly.
The biggest change is letting users control who sees their past cover photos, one of the items Facebook deems publicly available information—that is, data that’s visible to anyone in the world. Previously, anyone could view all your past cover photos.
While it’s smart of Facebook to be proactive about educating users on privacy controls and anticipating backlash, these changes don’t go nearly far enough. The company is still missing some key features that would prove it really takes privacy seriously.
You Can’t Not Be Public
It’s easy for strangers to find you on Facebook, thanks to publicly available information—the data you give to Facebook that the social network then shares with the world. This includes your name, profile photo, cover photo, gender, and networks such as your school or workplace.
According to Facebook, it’s necessary for this information to be public: “These are pieces of information that both help disambiguate you from other people in the world, but help you get the best experience to find other people,” Raylene Yung, an engineering manager on Facebook’s privacy team, told me. “They’ve been a part of the site for as long as it’s existed.”
When Facebook was still a small, growing social network, it made sense for your personal information to be public, so that new friends or family members who signed up for the service could find you. But now, with over one billion users, many people have already built out their corner of the network and don’t need to field additional friend requests, while others just don’t want to be found at all.
Public information is a serious obstacle for people who have experienced online harassment or stalking. I’ve personally been a target of Facebook stalking: in college I was harassed by a stranger who sent me numerous messages and a friend request. I eventually blocked him and the harassment stopped.
When asked on Tuesday about potential safety issues regarding public information, Facebook officials emphasized the blocking policy and said that people who feel harassed should report it to Facebook. Of course, once blocked or reported, harassers can simply create a pseudonymous account and find you once again.
To help users feel completely secure, Facebook should give them the option to opt out of search, or to choose what part, if any, of their data is publicly visible.
Last fall, Facebook killed a privacy setting that did just this. It eliminated users’ ability to block people from searching for them, effectively forcing everyone into Graph Search, the massive, practically endless, natural-language search that covers the public data of every Facebook user. I was one of the people who had the setting checked because I didn’t want to appear in unwanted searches, and I was disappointed when it disappeared.
Luckily, you can tailor your settings to allow only “Friends of Friends” to send you friend requests, or “Only Friends” to send you messages, small but significant settings that deter unwanted contact.
Restricting cover photo viewing is a step in the right direction, but restricting or eliminating all required public information would do far more to convince users that Facebook takes their concerns seriously.
Multi-App Strategy: What Does It Do With That Data?
When Facebook acquired WhatsApp earlier this year, many users bristled at the prospect of Facebook getting its hands on even more of their data.
Those fears could have legs. A report published by data analytics company SiSense examined an average WhatsApp conversation and found that the data WhatsApp could potentially collect is significantly more personal and meaningful than what Facebook gleans from its flagship application.
To illustrate the kind of data Facebook could mine from WhatsApp, SiSense analyzed a conversation from one of its own employees, Jennifer. The analysis showed that Jennifer regularly talks about food (specifically desserts) and about populism and conservative politics, and that she is most active around 8 p.m.
Having access to these intimate conversations creates a more substantial profile based on what people say, not what they like—a profile that Facebook can then monetize. Although WhatsApp claims it will remain independent of Facebook and stay free from advertising, the company’s privacy policy says it may share personal data with third-party services “to the extent that it is reasonably necessary to perform, improve or maintain the WhatsApp Service.”
Clearly the FTC is concerned about the potential privacy pitfalls, too. The agency sent letters to WhatsApp and Facebook alongside the acquisition approval, reiterating that their first responsibility is to consumers:
We want to make clear that, regardless of the acquisition, WhatsApp must continue to honor these promises to consumers. Further, if the acquisition is completed and WhatsApp fails to honor these promises, both companies could be in violation of Section 5 of the Federal Trade Commission (FTC) Act and, potentially, the FTC’s order against Facebook.
Instagram’s privacy policies have also drawn ire from users; the company was forced to change them in 2012 after controversial wording put users in an uproar.
And Facebook wants to bring even more of Instagram’s data in-house. The company is testing Facebook Places in lieu of Foursquare’s location services for the app’s photo geo-tagging feature. Instead of feeding precious data to a separate social network, Facebook wants to keep tabs not just on photos, but on location as well.
A Focus On Anonymity
Facebook has long been tied to your real identity. In fact, Mark Zuckerberg famously said, “Having two identities for yourself is an example of a lack of integrity.”
Those tides may be changing, however. In an interview with Bloomberg earlier this year, Zuckerberg said a number of applications created under the Facebook Creative Labs umbrella allow users to log in anonymously, an unprecedented move for the social network.
Recent rumors that Facebook is interested in acquiring Secret, an anonymous social network where people post photos and text updates, give more credence to the speculation that Zuckerberg and company are indeed pushing for more guarded privacy options.
It could be just the spark needed to turn public sentiment back in the social network’s favor.
While anonymity may be bad for Facebook—its business is knowing as much about you as possible and using that information to sell advertising—it might prove a useful compromise for Facebook’s longstanding critics.
Lead image by Taylor Hatmaker for ReadWrite