Those “Backdoors” in Apple’s iOS: What You Need To Know

Security researcher Jonathan Zdziarski started a firestorm over the weekend when he presented findings that Apple has—apparently deliberately—created undocumented “backdoors” in its iOS operating system that third parties could use, under certain circumstances, to siphon personal data from iPhones and iPads without notice to, much less the consent of, the user.

Apple, meanwhile, has taken issue with Zdziarski’s analysis, although its response—such as it is—falls short of a complete denial. (Update: Apple has confirmed the presence of these backdoors, although it describes them as “diagnostic capabilities.”)

It’s a complicated issue, so here’s a quick FAQ to help you sort through it all.

Should I panic?

No. In a blog post summarizing his work, Zdziarski includes this helpful note: “DON’T PANIC.”

The backdoors he describes aren’t the sort of thing your average cybercriminal can easily exploit. There’s no evidence that they’ve been used for identity theft or any sort of related criminal attack on iPhone or iPad data. At least so far, that is.

See also: Apple Confirms iOS Backdoors, But Calls Them “Diagnostic Capabilities”

On the other hand, if you think the NSA or regular law enforcement might be tracking you, then Zdziarski might have described some of the backdoors by which their agents could be delving into your digital life.

Beyond that, they’re an intriguing mystery—one that Apple has yet to explain.

Hold on a moment. What’s a backdoor?

As the word suggests, a backdoor is a simple or unguarded route into an otherwise secure system. Think of Matthew Broderick’s character in WarGames sussing out a way to access WOPR by guessing a backdoor password specific to the system’s creator (his dead son’s name—a classically terrible password, by the way).

How would the NSA (or whoever) make use of these backdoors?

Zdziarski, a forensics expert and one-time iOS jailbreaker who’s written several books about iPhone development, described three iOS services that appear to have an unusual degree of access to raw and potentially sensitive data gathered by or stored on the phone. These services are also apparently designed to collect that information, package it and dump it out upon request, either via USB or wirelessly over Wi-Fi.

These features are undocumented, meaning that they’re not described by Apple in the sort of detail it normally provides to third-party developers who might make use of them. According to Zdziarski, however, they are installed and active on roughly 600 million iOS devices. They provide no indication that they’re operating, and there’s no way for users to turn them off.

Perhaps most ominous, these services can send out unencrypted information even if users have chosen to encrypt the data they back up through iTunes. Zdziarski calls this behavior “bypassing backup encryption” and considers it deceptive at best.

That all sounds pretty panic-worthy. Isn’t it?

Turns out there’s a catch. These services only work when an iPhone or iPad is “paired” to a trusted device, such as the computer you run iTunes on. (Bluetooth pairing with, say, a set of headphones doesn’t count.) That greatly limits the ability of any attacker to exploit these services and rifle through your iPhone.

It is, however, possible to spoof that pairing. Every pairing generates a set of cryptographic keys and certificates designed to identify trusted devices to one another—and on the iPhone side, those keys and certificates are never deleted unless the user does a full restore or a factory reset on the device. Prior to iOS 7, the version most iPhones now run, pairing happened automatically without any user intervention. (iOS 7 now requires the user to approve pairing with a “trusted” device.)

As Zdziarski put it in a March 2014 technical journal article describing his findings: “[E]very desktop that a phone has been plugged into (especially prior to iOS 7) is given a skeleton key to the phone.” And that skeleton key is transportable, because a sufficiently motivated attacker can copy pairing keys and certificates from one computer to another. 
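
To make that “skeleton key” concrete, here’s a minimal Python sketch that lists the pairing records a desktop holds. The /var/db/lockdown location (C:\ProgramData\Apple\Lockdown on Windows) and the field names follow the standard lockdown conventions Zdziarski describes; treat this as an illustration, not a forensics tool.

```python
#!/usr/bin/env python3
"""A minimal sketch: list the pairing records a desktop holds.

Assumes the standard lockdown directory on OS X, /var/db/lockdown
(C:\\ProgramData\\Apple\\Lockdown on Windows); reading it usually
requires root. The fields are the host ID, certificates and private
key that authenticate this computer to the phone's services.
"""
import glob
import plistlib

for path in glob.glob("/var/db/lockdown/*.plist"):  # one record per paired device
    with open(path, "rb") as f:
        record = plistlib.load(f)  # handles XML and binary plists alike
    print(path)
    for key in ("HostID", "HostCertificate", "HostPrivateKey", "RootCertificate"):
        value = record.get(key)
        if isinstance(value, bytes):
            print(f"  {key}: <{len(value)} bytes>")
        elif value is not None:
            print(f"  {key}: {value}")
```

Copying one of those .plist files to another computer is all it takes to “transport” the skeleton key.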

Who would go to all the trouble of tracking down those keys and copying them?

Well, the police might, if they thought you were involved with organized crime. So might the NSA, the FBI or a number of other intelligence agencies. And of course some of these outfits could also create seemingly innocuous “paired” devices such as an alarm clock or charging station that would run malicious code once connected to your phone.

As noted above, though, it’s not the sort of thing your average Belarusian hacker is likely to use to take over your phone any time soon.

OK, tell me more about these undocumented services. What are they and what do they do?

In a presentation he made at the Hope X hacker conference in New York this past weekend, Zdziarski focused on three particular services known by the technical names com.apple.pcapd, com.apple.mobile.file_relay and com.apple.mobile.house_arrest. (You can see the slides from Zdziarski’s talk—all 58 of them—here.)

The pcapd service starts what security professionals call a “packet sniffer” on an iOS device—basically, software that records all data traffic to and from your iPhone. It’s installed by default on all iOS devices, and operates whether a phone is in “developer mode” or not, suggesting that it’s not a developer-specific feature. And it gives the user no warning when it’s activated.

“This means anyone with a pairing record can connect to a target device via USB or Wi-Fi and listen in on the target’s network traffic,” Zdziarski wrote in his March paper.
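
To give a sense of what a sniffer’s output contains, here’s a small Python reader for the classic libpcap capture format—the standard .pcap file layout that capture tools save to, not pcapd’s own wire protocol, which is assumed out of scope here. Every packet in or out of a device shows up as a timestamped record with its full raw bytes:

```python
#!/usr/bin/env python3
"""A minimal reader for the classic libpcap (.pcap) capture format.

This is the standard file layout sniffers save to, not pcapd's own
wire protocol; it shows what a capture exposes: a timestamp plus the
full raw bytes of every packet.
"""
import struct
import sys

with open(sys.argv[1], "rb") as f:
    magic, = struct.unpack("<I", f.read(4))
    assert magic == 0xA1B2C3D4, "not a little-endian classic pcap file"
    f.read(20)  # rest of the 24-byte global header (version, snaplen, linktype)
    while True:
        header = f.read(16)
        if len(header) < 16:
            break
        ts_sec, ts_usec, incl_len, orig_len = struct.unpack("<IIII", header)
        packet = f.read(incl_len)  # complete raw packet: headers and payload
        print(f"{ts_sec}.{ts_usec:06d}  {orig_len} bytes on the wire")
```

Run it as `python3 read_pcap.py capture.pcap` against any capture saved by a tool like tcpdump.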

The file_relay service, according to Zdziarski, exists to vacuum up large volumes of raw data from particular sources on an iPhone and then to dump it out in unencrypted form. Several years back, file_relay appeared fairly innocuous. In iPhoneOS 2.0 (an early predecessor to iOS), it was only able to access six data sources, including “Apple Support,” “network,” and “CrashReporter.”

By iOS 7, however, file_relay’s reach had expanded to include 44 data sources, many of which specifically address the owner’s personal information. These include the address book, accounts, GPS logs, maps of the phone’s entire file system, a collection of all words typed into the phone, photos, notes, calendar files, call history, voicemail and other records of personal activity that have been cached in temporary files.

Small wonder Zdziarski calls file_relay “the biggest forensic trove of intelligence on a device’s owner” and a “key ‘backdoor’ service” that provides a significant amount of data that “would only be relevant to law enforcement or spying agencies.”
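
For a sense of how simple the interface is, here’s a Python sketch of the kind of request the service answers. Per Zdziarski’s paper (and the open-source libimobiledevice client), a request is just a property list naming the desired sources; the source names below are drawn from his list, and actually sending one would require a lockdown connection authenticated with a valid pairing record:

```python
#!/usr/bin/env python3
"""A sketch of a file_relay request, per Zdziarski's description.

The service takes a property list naming data sources and answers
with an archive of the raw files behind them. Source names are drawn
from Zdziarski's list; sending this for real requires a lockdown
connection authenticated with a pairing record.
"""
import plistlib

# A handful of the 44 sources available by iOS 7.
request = {"Sources": ["AddressBook", "Accounts", "Photos", "CrashReporter"]}
print(plistlib.dumps(request).decode())
```

The device answers by packaging the raw files behind those sources and shipping the archive back—unencrypted, as noted above.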

The third service, house_arrest, originally allowed iTunes to copy documents to and from third-party apps. Now, however, house_arrest has access to a much broader array of app-related data, including photos, databases, screenshots and temporary “cached” information.
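
The widened scope shows up right in the service’s command set. Per the open-source libimobiledevice client, house_arrest takes a command naming an app by bundle ID: “VendDocuments” covers the original iTunes file-sharing case, while “VendContainer” exposes the app’s entire container. A hypothetical request (the bundle ID below is a placeholder) looks like this:

```python
#!/usr/bin/env python3
"""A sketch of a house_arrest command, per the libimobiledevice client.

"VendContainer" asks for an app's whole container (databases, caches,
screenshots and all); "VendDocuments" is the narrower, original
file-sharing case. A file-transfer (AFC) session then runs over the
same connection. The bundle ID below is a placeholder.
"""
import plistlib

command = {"Command": "VendContainer", "Identifier": "com.example.someapp"}
print(plistlib.dumps(command).decode())
```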

Couldn’t these services have legitimate functions?

Maybe, although it’s difficult to understand why they’d have such apparently untrammeled access to so much information. That’s a pretty major security failing under any circumstance.

Zdziarski also runs through a number of possible explanations—that they might be used in iTunes or Xcode (Apple’s iOS app-development environment), or in developer debugging, or by Apple support, or in Apple engineering debugging—and shoots each one down in turn. 

It’s very difficult to construct an explanation for legitimate, non-surveillance uses of services that aren’t documented, that bypass backup encryption, that have access to otherwise inaccessible user data and that give the user no notification that they’re accessing and dumping out information. Oh, and whose code Apple has maintained and updated across several versions of iOS.

Given Apple’s historical issues with lack of cooperation and infighting between technical teams, it’s also conceivable that these services grew without much direction at all, almost by accident, as engineers struggled to solve other technical problems without writing a whole bunch of new code. Call this the it-ain’t-pretty-but-it-works explanation.

Is it plausible? Your guess is as good as mine. And it’s still a major security fail.

What does Apple have to say about all this?

In classic fashion, not very much. Apple didn’t get back to me when I emailed it for comment, although I’ll keep trying.

Apparently, however, it did email a statement to Tim Bradshaw, a reporter for the Financial Times, who tweeted it:

https://twitter.com/tim/status/491370587554471936/photo/1

The statement, of course, is rife with ambiguity. Is Apple referring specifically to pcapd, file_relay and house_arrest here, or just issuing a general statement about its diagnostic functions? (Update: An Apple spokeswoman got back to me post-publication with a copy of the statement and news of its first documentation of these backdoor services.)

And it fails to address most of Zdziarski’s basic questions. If these services are diagnostic functions, why aren’t they documented? Why do they operate even if users haven’t agreed to send diagnostic information to Apple? Why can’t users deny their consent to having information taken off their devices this way? Why can’t users turn these services off?

It is certainly interesting that Apple feels compelled to deny that it has even “worked with any government agency from any country” to engineer backdoors into its products or services, especially since Zdziarski hadn’t accused it of doing so.

Does Zdziarski have thoughts about Apple’s statement?

Does he ever. In a new blog post Monday night, he summed up his reaction this way:

I understand that every OS has diagnostic functions, however these services break the promise that Apple makes with the consumer when they enter a backup password; that the data on their device will only come off the phone encrypted. The consumer is also not aware of these mechanisms, nor are they prompted in any way by the device. There is simply no way to justify the massive leak of data as a result of these services, and without any explicit consent by the user.

I also contacted Zdziarski for comment, but haven’t heard back. (Update: I did hear back from Zdziarski, although he didn’t have time to say much.)

Updated on Wednesday, July 23 at 10:08am with, well, updates noted in the text.

Lead image by Flickr user Mooganic; swan image by Flickr user blinking idiot, CC 2.0
