Android vs. The iPhone: It’s All About The Cloud

The Platform is a regular column by mobile editor Dan Rowinski. Ubiquitous computing, ambient intelligence and pervasive networks are changing the way humans interact with everything.

I finally traded in my old iPad 2 for a brand new iPad Air. I didn’t really want to, but part of my professional obligation is to have the most up-to-date hardware for each major platform. (I was also mildly concerned the iOS 8 beta wouldn’t run on the iPad 2, although that fear proved unwarranted.)

During the sales process, the Apple store employee casually suggested that I get a 32GB iPad Air instead of the 16GB tablet I wanted. “Most people find that 16GB is not enough for them,” he said. 

That surprised me. I used a 16GB iPad 2 for three years and never once ran out of internal storage. My music is in the cloud (through Spotify). My movies and books are in the cloud, through Netflix or HBO Go or Amazon. I don’t take many pictures on an iPad, mostly just screenshots (who takes all of their pictures with an iPad anyway?). I have a lot of apps, but most of them don’t take up much room.

Quick Thought: Swift-ian Logic

The reactions are out and the reviews are generally positive: Swift is a good programming language that shouldn’t be all that hard for developers to learn.
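To give a sense of why reviewers find the language approachable, here’s a minimal, illustrative snippet—my own toy example, not anything from Apple’s announcement—showing three of Swift’s friendlier features: type inference, trailing closures and string interpolation:

```swift
// Type inference: no explicit [String] annotation required.
let devices = ["iPhone", "iPad", "Mac"]

// A trailing closure with string interpolation; $0 is the
// shorthand name for the closure's single argument.
let greetings = devices.map { "Hello, \($0)!" }

for greeting in greetings {
    print(greeting)  // e.g. "Hello, iPhone!"
}
```

Compared with the equivalent Objective-C, there are no header files, no semicolons and no explicit type declarations—which is much of what the “easy to learn” claim rests on.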

It took less than a day for people to start proclaiming themselves Swift experts and for a variety of independent coding schools across the United States to start offering classes on Swift development—among them, the online coding school Treehouse and The New York Code + Design Academy. Meetup touts that its members are starting Swift groups across the world.

If there’s any one company that is terrific at unwittingly spawning a host of hucksters and snake oil salesmen, it’s Apple. The only real Swift experts at this point are Chris Lattner, the Apple engineer who started the project, and his team at Apple. Everything else is just hot air.

The bottom line is that I have no need for a 32GB iPad. Everything I do is served from the cloud, not stored locally on my device. Some call it ambient intelligence. I call it convenience.

So I was intrigued by the new cloud improvements Apple announced for Mac OS X and iOS 8 at its Worldwide Developer Conference keynote earlier this week. Finally, I thought, Apple gets it. Computing lives in the cloud, not on the device.

But if you take a closer look at what Apple announced and how it all works, the cloud isn’t really the key feature in iOS 8 or Mac OS X. For Apple, the cloud is a means to an end, and that end is keeping you using and buying new Macs, iPhones and iPads. What Apple did with the cloud at WWDC this week struck me as mildly distasteful in a way I couldn’t quite pinpoint.

I’ve been trying to figure out exactly why by reading what others had to say about Apple’s iCloud announcements. Nobody quite hit the nail on the head until I read what Andreessen Horowitz analyst Benedict Evans had to say on the topic. All of a sudden, it clicked.

Evans states:

This is obviously a contest with Google, which has pretty much the opposite approach. For Google, devices are dumb glass and the intelligence is in the cloud, but for Apple the cloud is just dumb storage and the device is the place for intelligence. 

iPhone vs. Android: What’s Really At Stake

When I write about the potential of mobile computing or try to explain it to a friend, I usually break it down to some very basic but powerful facts.

  • Smartphones are powerful computers—more powerful than the laptops people bought just a few years ago—that live in people’s pockets.
  • Smartphones are windows into a world of information that can help people perform tasks in their day-to-day lives.
  • All aspects of human activity (all business, all communication, etc.) will be affected by the fact that we have these ubiquitous computers connected to the entire world of information in our pockets.

These are the basic facts that define mobile computing. Everything beyond that comes down to individual necessity and subjective brand preference. Some people like Android, some people like iOS, some people like Windows Phone and so on. But when you look at this in the context of the battle between Google’s Android and Apple’s iPhone, it’s clear there’s a deeper dynamic here.

For starters, it’s a curious frame for the argument, though that’s actually a clue to its importance. Android is a platform; the iPhone is a product. Yes, there are some big names in Android that deserve attention—Samsung, for instance—but this isn’t fundamentally a battle between device makers. So what we’re talking about is a “competition” between an operating system that’s not tethered to any specific device and an iconic product from one big company.

And that competition reflects some very different views about computing and how it best serves users. Start with Apple, a computer maker where the product has always been king. Everything that Apple has ever done has served the product. The Internet? Well, that can help Apple sell computers. The cloud? That can help Apple sell computers.

Photos? Phone calls? Texting? Apps? Developers? These are all things that can help Apple sell computers. So it’s in Apple’s best interest to make these computers as efficient and attractive as possible, not just “dumb glass.”

Google, by contrast, isn’t interested in selling computers—not anymore, at least. But it is intensely interested in computing, because computing and the people who use those computers create the one thing Google craves above all else: information.

With Android, Google’s aim is to make sure that everyone on Earth has a computer with access to the Internet. More computers mean more people computing, which means more information created. Google can then take that dumb glass, use it as a window to the world of information and sell advertisements against it.

Once again, Evans gets to the core of the matter:

I’ve described this before by saying that Apple is moving innovation down the stack into hardware/software integration, where it’s hard for Google to follow, and Google is moving innovation up the stack into cloud-based AI & machine learning services, where it’s hard for Apple to follow. This isn’t a tactical ‘this’ll screw those guys’ approach—it reflects the fundamental characters of the two companies. Google thinks about improving UX by reducing page load times, Apple thinks about UX by making it easier to scroll that page.

The Worm In Apple’s iCloud

After puzzling through that for a while, I figured out what was really bothering me about iCloud in iOS 8 and Mac OS X Yosemite: Everything Apple does is designed to keep me buying Apple computers and using Apple platforms.

To Apple, the cloud is not this wonderful creation that provides ambient intelligence wherever you are. It’s just dumb storage and a fabric that provides continuity between devices. If ambient intelligence is a byproduct of that, then hey, that’s great—if it helps Apple sell more computers.

Quote Of The Day: “So there’s no such thing as work-life balance. There’s work, and there’s life, and there’s no balance.” ~ Facebook chief operating officer Sheryl Sandberg.

That’s not to say I’m enamored with Google’s obsession with information; Google is serving its own purposes as well. But the notion of “cloud-based AI & machine learning services” seems a far more efficient and altruistic push in the evolution of computing than simply making better and better iPhones.

The ubiquitous intelligence provided by the cloud also means that I don’t need to pad Apple’s margins with a completely unnecessary and expensive 32GB iPad Air. The 16GB piece of dumb glass and metal that happens to run Apple’s operating system and acts as a window to the world of information will work just fine, thanks.
