This is a response to Mat Honan's thunderous complaint, "The Case Against Google." The essential argument against the once-simple, once-open, once-beloved search company is that we can no longer trust it with our data. When Google became Google+, it fundamentally changed its relationship with users without asking us first. Along the way, Google has engaged in some real trickery related to these privacy changes.

Given all of that, Honan questions whether we can trust Google anymore. I contend that the argument can be boiled down further. The question of whether Google will honor our privacy is settled. We cannot assume that it will. The root question about whether to be for or against the new Google(+) lies with us: Do we want these data to be private or not?

Did Google+ Break Google?

There's a secondary argument, which is actually the one Honan makes first: that Google has broken search, its core product, by rearranging itself around the unified Google+ vision. This is debatable. I was certainly worried that this would be the case when the inevitable Google+ integration with search arrived, but it was not.

Instead, Google gave us two modes of search, personal and global. Global search works just the way we're used to, and personal search uses Google+ signals. It also promotes Google+-integrated Google properties over outside alternatives.

Honan says that "Google polluted the page with its own inferior products." For people who prefer Twitter or Yelp over Google+ or Google Places, this is indeed a shame. Personalized search could be amazingly useful for everyone if it allowed users to choose their own social graphs.

Google tried to argue that this is technically impossible, putting the onus on Twitter and Facebook, but engineers from other social networks built Focus on the User to prove this argument false. It's a business decision. Honan nails it precisely when he writes:

"Google wants to know things about you that you aren't already telling it so you will continue asking it questions and it can continue serving ads against the questions you ask it. So, it feels like it has to herd people into using Google+ whether they want to go there or not."

In order to provide the personalized service it promises, Google has to do this. It can't just keep crawling the Web and building its own graph now that the best signals about people are contained within apps. If Google doesn't gather these signals itself, it will have to concede valuable datasets to its competitors and merely index them.

In short, it would lose money. So instead, Google has decided to rebuild itself around its own sources of personal, real-time, topical data about its users.

There's no question that this decision places Google's needs over those of millions of its users. For people who don't want to use Google+-enhanced services, Search Plus Your World is just ugly noise. The only choice is to deactivate it in preferences. That's an unfortunate, though relatively painless, burden to put on users. The more compelling argument, which Honan also makes, is that we can't be sure Google won't someday take this choice away.

However, as I said many paragraphs back, this is all still a secondary argument. For people who do use and enjoy Google+, and they do exist, Search Plus Your World is an exciting change. Personalized search can be magical. It's what some people want. Particularly for Android users, it's convenient to use Google's network for everything. For people who do, personalization works. Whether you want it or not is a matter of simple preference.

That is, unless you don't trust Google with your personal data. The choice is still simple, don't use Google, but the issue becomes more complicated. Should we be "against Google"? Is Google "being evil" by violating its users' privacy? Should Google be stopped? This is the fundamental question.

Is Google Evil?

Google has been shady lately. Of this, there can be no doubt. Indeed, it seems that all free, social Web companies have the capacity to be shady when given a chance. If their business is our data, they'll do what they can to get it, including sleight of hand and deception.

Honan points to several instances of Google's wrongdoing. Some of the most egregious ones don't have to do with user data, but Honan is still right to mention them, since they call Google's corporate character into question. The one example that's definitely relevant is Google's end-run around Safari's privacy settings.

Google breached a contract between Apple and its users, even Apple users who weren't logged-in Google users. It did so in order to gather better user tracking data. While Google was hardly the only ad company using this practice, it is beyond question that it crossed an ethical line here.

Google must stop doing things like this. It's hurting its own case. It should also stop lying about being unable to index Twitter for true, real-time topical results. It should come out and tell the truth: that it is unwilling to do this. Google thinks it can do better with its own products. If you give Google all your data, it will prove it to you.

Do You Care?

Whether you want to keep using Google depends on your definition of "evil." It's really stupid that this word has entered the debate, but as Honan points out, Google brought it on itself by publicizing a mantra of "Don't be evil." It sealed its own fate in the court of public relations. Google has shown that its internal definition of "evil" will bend to the company's priorities.

But that's not the definition that matters. What matters is your definition of evil.

Evil has a threshold. Naughty can be forgiven, but evil cannot. Can you forgive Google for taking liberties with your data?

How important to you is your trail of information online? Google has made abundantly clear what its stance on privacy is. How do you feel about it? If your social network activity, your search and browsing history, your location and so forth are things you don't mind sharing, Google will give you benefits in return. Honan writes:

"Picture this scenario. You are about to leave San Francisco to drive to Lake Tahoe for a weekend of skiing, so you fire up your Android handset and ask it "what's the best restaurant between here and Lake Tahoe?"

It's an incredibly complex and subjective query. But Google wants to be able to answer it anyway. (This was an actual example given to me by Google.) To provide one, it needs to know things about you. A lot of things. A staggering number of things.

To start with, it needs to know where you are. Then there is the question of your route--are you taking 80 up to the north side of the lake, or will you take 50 and the southern route? It needs to know what you like. So it will look to the restaurants you've frequented in the past and what you've thought of them. It may want to know who is in the car with you--your vegan roommates?--and see their dining and review history as well. It would be helpful to see what kind of restaurants you've sought out before. It may look at your Web browsing habits to see what kind of sites you frequent. It wants to know which places your wider circle of friends have recommended. But of course, similar tastes may not mean similar budgets, so it could need to take a look at your spending history. It may look to the types of instructional cooking videos you've viewed or the recipes found in your browsing history.

It wants to look at every possible signal it can find, and deliver a highly relevant answer: You want to eat at Ikeda's in Auburn, California. Hey, I love that place too! Try the apple pie.

There is only one path to that answer, and it goes straight through your privacy."

Take a deep breath and think about it. Set aside questions of privacy. If you can do that, this is an amazing use case. I want this ability. I don't want dozens of apps that are each good at recommending one thing. I want one search, searching one map, putting the things I want on that map. That's an ideal.

If you share that ideal, Google will try to make it real. But you have to be perfectly okay with Google gathering lots of data about you. To be a Google user going forward is to have nothing to hide. Is that okay?

I don't think that question has one answer. The case for Google is, "I have nothing to hide. Help me find what I'm looking for."

Honan cites a study saying that 73% of search users are not willing to compromise their privacy for personalized search. That's not 100%, and it's an easy question to say "no" to. More importantly, cultural norms shift. If Google's service is as amazing as Google promises it is, that 73% could dwindle.

Eight years ago, after the launch of Gmail, Honan wrote:

"I'm not at all worried that my privacy is about to be invaded by the world's most popular search engine company. Call me brave, call me crazy, but I'm not. Nor should you be."

Today, he writes that a "far bigger" Google has now crossed the line. But I don't think the debate has changed. Where is that line? Scanning email, tracking search and Web history, mapping out our relationships, it's all the same. Do you care about losing control over that information? That answer starts and ends with you. If enough people say "no," then Google will be vindicated.

Excellent post, Mat. Let's talk about it.

See also:

What Google+ Needs To Do Now

What Google+ Should Have Been: Bing's Linked Pages