IN THROUGH THE BACKDOOR —

Apple documents previously undocumented services that can leak user data

Forensics expert disputes Apple's claim that the services serve a purely diagnostic purpose.

Four days after a forensics expert warned that undocumented functions in iOS could leak personal user data, Apple has documented three services it says serve diagnostic purposes.

"iOS offers the following diagnostic capabilities to help enterprise IT departments, developers, and AppleCare troubleshoot issues," the support article published Tuesday stated. "Each of these diagnostic capabilities requires the user to have unlocked their device and agreed to trust another computer. Any data transmitted between the iOS device and trusted computer is encrypted with keys not shared with Apple. For users who have enabled iTunes Wi-Fi Sync on a trusted computer, these services may also be accessed wirelessly by that computer." As Ars reported Monday, the three previously undocumented services are a packet sniffer dubbed com.apple.mobile.pcapd, a file downloader called com.apple.mobile.file_relay, and com.apple.mobile.house_arrest, a tool that downloads iPhone and iPad files to an iTunes folder stored on a computer.
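Packet-sniffer output of the kind pcapd produces is typically consumed as libpcap data, the same container Wireshark and tcpdump read. As a rough, Apple-agnostic illustration of that format (this sketch parses the open libpcap global header, not Apple's undocumented wire protocol), a client on the trusted computer might begin like this:

```python
import struct

PCAP_MAGIC = 0xa1b2c3d4          # native byte order of the writer
PCAP_MAGIC_SWAPPED = 0xd4c3b2a1  # seen when the writer's byte order differs

def parse_pcap_header(data: bytes) -> dict:
    """Parse the 24-byte libpcap global header and return its fields."""
    if len(data) < 24:
        raise ValueError("truncated pcap header")
    magic = struct.unpack("<I", data[:4])[0]
    if magic == PCAP_MAGIC:
        endian = "<"          # stream is little-endian
    elif magic == PCAP_MAGIC_SWAPPED:
        endian = ">"          # stream is big-endian
    else:
        raise ValueError("not a libpcap stream")
    major, minor, _tz, _sigfigs, snaplen, linktype = struct.unpack(
        endian + "HHiIII", data[4:24]
    )
    return {"version": (major, minor), "snaplen": snaplen, "linktype": linktype}

# A synthetic header: pcap 2.4, snaplen 65535, linktype 1 (Ethernet)
hdr = struct.pack("<IHHiIII", 0xa1b2c3d4, 2, 4, 0, 0, 65535, 1)
print(parse_pcap_header(hdr))  # → {'version': (2, 4), 'snaplen': 65535, 'linktype': 1}
```

The magic-number check is what lets a single reader handle captures written on either endianness, which matters when the capturing device and the analyzing computer differ.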

Jonathan Zdziarski, the forensics expert who brought the undocumented functions to light on Saturday, published a blog post in response that criticized Apple's characterization of the services. He continued to maintain that at least one of the capabilities—stemming from the file relay service—constitutes a "backdoor" as defined by many security and forensics practitioners. He also took issue with Apple's suggestion that the purpose of the services was limited to diagnostics. He reiterated his previous stance that he doesn't believe Apple added the functions at the request of the National Security Agency.

Zdziarski's post in part stated:

Let’s start with pcapd. I mentioned in my talk that pcapd has many legitimate uses, and I have no qualms with Apple using pcapd to troubleshoot issues on users’ devices. Using a packet capture has been documented for developers for a couple of years, but there was no explanation for it being on every device that wasn’t in developer mode. The problem I have is with its implementation, however. In iOS, pcapd is available on every iOS device out there and can be activated on any device without the user’s knowledge. You also don’t have to be enrolled in an enterprise policy, and you don’t have to be in developer mode. What makes this service dangerous is that it can be activated wirelessly and does not ask the user for permission to activate it… so it can be employed for snooping by third parties in a privileged position.

Now let’s talk about file relay. Apple is being completely misleading by claiming that file relay is only for copying diagnostic data. If, by diagnostic data, you mean the user’s complete photo album, their SMS, Notes, Address Book, GeoLocation data, screenshots of the last thing they were looking at, and a ton of other personal data – then sure… but this data is far too personal in nature to ever be needed for diagnostics. In fact, diagnostics is almost the complete opposite of this kind of data. And once again, the user is never prompted to give their permission to dump all of this data, or notified in any way on-screen. Apple insists AppleCare gets your consent, but this must be a verbal consent, as it is certainly not a technological consent. What’s more, if this service really were just for diagnostic use, you’d think it would respect backup encryption, so that everything coming off the phone is encrypted with the user’s backup password. When I take my laptop to Apple for repairs, I have to provide the password. But Apple has apparently admitted to the mechanics behind file relay, which skip around backup encryption to get to much the same data. In addition to this, it can be dumped wirelessly, without the user’s knowledge. So why does this need to be the case? It doesn’t. File relay is far too sloppy with personal data, and serves up a lot more than “diagnostics” data.

Lastly, house arrest. I have no qualms with this either; in fact, iTunes and Xcode do use this service to access the documents inside a user’s sandbox, as I mentioned in my talk. As I also mentioned, however, it can be used to access the stateful information on the device that should never come off the phone – Library, Caches, Preferences, etc. This is where most of the personal data from every application is stored, including OAuth tokens (which are just as good as having the password to your accounts), private conversations, friends lists, and other highly personal data. The interface is wide open to access all of this – far beyond just the “Documents” folder that iTunes needs to access new Pages files. This is not a backdoor, but rather privileged access that really doesn’t need to be there (or at least could be engineered differently).

The last thing I’ll mention is this claim that your data is respected with data-protection encryption. The pairing record that is used to access all of this data is sent with an escrow bag, which contains a backup copy of your key bag keys for unlocking data-protection encryption. So again, we’re back to the fact that with any valid pairing, you have access to all of this personal data – whether that was Apple’s intention or not.
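The escrow-bag point hinges on the pairing record being a bearer credential: whoever holds the file on the trusted computer holds the keys. A toy sketch using Python's plistlib, with wholly illustrative key names and values (not verified against lockdownd's actual record layout), shows why copying that one property list is enough to replay the trust relationship:

```python
import plistlib

# Toy pairing record modeled loosely on what a trusted host stores
# after the user taps "Trust." All keys and values here are
# illustrative stand-ins, not Apple's real format.
record = {
    "HostID": "00000000-0000-0000-0000-000000000000",
    "SystemBUID": "00000000-0000-0000-0000-000000000000",
    "EscrowBag": b"\x00" * 16,  # stand-in for escrowed keybag keys
}
blob = plistlib.dumps(record)  # the on-disk artifact an attacker would copy

# Any party holding the serialized plist recovers the same credential;
# no further user interaction or device unlock is modeled here.
loaded = plistlib.loads(blob)
assert loaded["EscrowBag"] == record["EscrowBag"]
print(sorted(loaded))  # → ['EscrowBag', 'HostID', 'SystemBUID']
```

The design consequence Zdziarski highlights follows directly: protecting such a record only with filesystem permissions on the paired computer means device-side data protection can be bypassed by anyone who can read that computer's disk.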

In a tweet, he added:

All I want from Apple:
1. pcap not work over wifi
2. file relay respect backup encryption or not exist
3. house arrest limited to Documents

The episode is a good example of the way Apple's trademark secrecy can come back to bite the company. Apple may have legitimate reasons for folding these services into iOS, even when it isn't running in special diagnostic or support modes. But the company never took the time to disclose these services or to respond to Zdziarski's private entreaties to executives until the undocumented functions became an international news story. Zdziarski's larger point seems to be that the services he brought to light represent vectors that ex-lovers, housemates, co-workers and, yes, spy agencies can exploit to bypass cryptographic protections designed to prevent sensitive data from being accessed by unauthorized parties. Until last weekend, that point was only implicit. It has now been made explicit.
