Jim Comey’s Confused Defense of Front Door Back Doors and Storage Intercepts

I said somewhere that those wailing about Apple’s new default crypto in its handsets are either lying or are confused about the difference between a phone service and a storage device.

For the moment, I’m going to put FBI Director Jim Comey in the latter category. I’m going to do so, first, because at his Brookings talk he corrected the false statement — which I had pointed out — he made on 60 Minutes (what he calls insufficiently lawyered) that the FBI cannot get content without an order. But while Comey admitted that FBI can read content it has collected incidentally, he made another misleading statement. He said FBI does so during “investigations.” It also does so during “assessments,” which don’t require anywhere near the same standard of evidence or oversight.

I’m also going to assume Comey is having service/device confusion because that kind of confusion permeated his presentation more generally.

There was the confusion exhibited when he tried to suggest a “back door” into a device wasn’t one if FBI simply called it a “front door.”

We aren’t seeking a back-door approach. We want to use the front door, with clarity and transparency, and with clear guidance provided by law. We are completely comfortable with court orders and legal process—front doors that provide the evidence and information we need to investigate crime and prevent terrorist attacks.

And more specifically, when Comey called for rewriting CALEA, he called for something that would affect only a tiny bit of what Apple had made unavailable by encrypting its phones.

Current law governing the interception of communications requires telecommunication carriers and broadband providers to build interception capabilities into their networks for court-ordered surveillance. But that law, the Communications Assistance for Law Enforcement Act, or CALEA, was enacted 20 years ago—a lifetime in the Internet age. And it doesn’t cover new means of communication. Thousands of companies provide some form of communication service, and most are not required by statute to provide lawful intercept capabilities to law enforcement. [my emphasis]

As I have noted, the main thing that will become unavailable under Apple’s new operating system is iMessage chats if the users are not using default iCloud back-ups (which would otherwise keep a copy of the chat).

But the rest of it — all the data that will be stored only on an iPhone if people opt out of Apple’s default iCloud backups — will be unaffected if what Comey is planning to do is require intercept ability for every message sent.

Now consider the 5 examples Comey uses to claim FBI needs this. I’ll return to these later, but in almost all cases, Comey seems to be overselling his case.

First, there’s the case of two phones with content on them.

In Louisiana, a known sex offender posed as a teenage girl to entice a 12-year-old boy to sneak out of his house to meet the supposed young girl. This predator, posing as a taxi driver, murdered the young boy, and tried to alter and delete evidence on both his and the victim’s cell phones to cover up his crime. Both phones were instrumental in showing that the suspect enticed this child into his taxi. He was sentenced to death in April of this year.

At first glance this sounds like a case where the phones were needed. But assuming this is the case in question, that appears wrong. The culprit, Brian Horn, was IDed by multiple witnesses as being in the neighborhood, and evidence led to his cab. There was DNA evidence. And Horn and his victim had exchanged texts. Presumably, records of those texts, and quite possibly the actual content, were available from the provider.

Then there’s another texting case.

In Los Angeles, police investigated the death of a 2-year-old girl from blunt force trauma to her head. There were no witnesses. Text messages from the parents’ cell phones to one another, and to their family members, proved the mother caused this young girl’s death, and that the father knew what was happening and failed to stop it.

Text messages also proved that the defendants failed to seek medical attention for hours while their daughter convulsed in her crib. They even went so far as to paint her tiny body with blue paint—to cover her bruises—before calling 911. Confronted with this evidence, both parents pled guilty.

This seems to be another case where the texts were probably available in other places, especially given how many people received them.

Then there’s another texting story — this is the only one where Comey mentioned warrants, and therefore the only real parallel to what he’s pitching.

In Kansas City, the DEA investigated a drug trafficking organization tied to heroin distribution, homicides, and robberies. The DEA obtained search warrants for several phones used by the group. Text messages found on the phones outlined the group’s distribution chain and tied the group to a supply of lethal heroin that had caused 12 overdoses—and five deaths—including several high school students.

Again, these texts were likely available with the providers.

Then Comey lists a case where the culprits were first found with a traffic camera.

In Sacramento, a young couple and their four dogs were walking down the street at night when a car ran a red light and struck them—killing their four dogs, severing the young man’s leg, and leaving the young woman in critical condition. The driver left the scene, and the young man died days later.

Using “red light cameras” near the scene of the accident, the California Highway Patrol identified and arrested a suspect and seized his smartphone. GPS data on his phone placed the suspect at the scene of the accident, and revealed that he had fled California shortly thereafter. He was convicted of second-degree murder and is serving a sentence of 25 years to life.

This case rests on GPS data, which would surely have been available from the provider. So: traffic camera, GPS. Seriously, FBI, do you think this makes your case?

Perhaps Comey’s only convincing example is an exoneration based on a video — though that too would have been available elsewhere under Apple’s default settings.

The evidence we find also helps exonerate innocent people. In Kansas, data from a cell phone was used to prove the innocence of several teens accused of rape. Without access to this phone, or the ability to recover a deleted video, several innocent young men could have been wrongly convicted.

Again, given Apple’s default settings, this video would be available on iCloud. But if it was only available on the phone, and it was the only thing that exonerated the men, then it would count.

Update: I’m not sure, but this sounds like the Daisy Coleman case, which was outside Kansas City, MO, but did involve a phone video that (at least as far as I know) was never recovered. The guy she accused of raping her pleaded guilty to misdemeanor child endangerment — he dumped her unconscious in freezing weather outside her house.

I will keep checking into these, but none of these are definite cases. All of this evidence would normally, given default settings, be available from providers. Much of it would be available on phones of people besides the culprit. In the one easily identifiable case, there was a ton of other evidence. In two of these cases, the evidence was important in getting a guilty plea, not in solving the crime.

But underlying it all is the key point: Phones are storage devices, but they are primarily communication devices, and even as storage devices the default is that they’re just a localized copy of data also stored elsewhere. That means it is very rare that evidence is only available on a phone. Which means it is rare that such evidence will only be available in storage and not via intercept or remote storage.

12 replies
  1. P J Evans says:

    I think Comey prefers that people not remember how much of the stuff they think is private is actually stored ‘in the cloud’ (online). If they know that it’s nearly all accessible without a warrant, they might stop using cloud-storage and the agencies would have to start actually working.

  2. What Constitution? says:

    So if FBI Director Comey is convinced that the passage of CALEA twenty years ago renders that thinking suspect because twenty years is “a lifetime in the internet age”, I presume he also recognizes that the world might also have changed a bit since the Supreme Court considered “pen registers” in Smith v Maryland THIRTY FIVE years ago. So could he ask his lawyers at DOJ to stop arguing to the courts that nobody has any expectation of privacy any more?

  3. jerryy says:

    Something does not quite sound right about that Sacramento story and the GPS data being on the phone, unless it was an old phone or the guy was running an actual gps-tracking app on his phone at the time of the accident. Cell phone operating systems stopped that generic kind of tracking a long time ago because the public got very upset (storing the data on the phone was very short-lived).
    You are right that the GPS data would have been available from the provider. All ‘location aware’ apps get very chatty with the cell towers.
    This gives a non-technical, glossed-over idea of how today’s tracking is done:

  4. earlofhuntingdon says:

    Because some bad guys use technology, Mr. Comey argues that we should submit all our data to FBI/alphabet soup agency copying, retention, storage, analysis and use for indefinite periods of time. What could be wrong with that picture?
    Among other things, govt power attracts predators, just as do big bidness, big banking and big military. The postwar abuses of the FBI and CIA, for example, are legendary. The last two decades have seen financial capitalism and Enron-style practices and business “morality” become the norm. Given the significantly greater reach and intrusiveness of today’s technology, the vastly greater sums of money and power attached to it, and the continuing absence of governmental oversight, are predators more or less likely to abuse vast data stores and the individual people to whom it belongs?

  5. Adam Colligan says:

    I think his lack of a conceptual distinction between storage and communication goes beyond just the issue of what data is likely to be available where. Yes, it is partly just frustrating to hear him talk about things like location histories as if they only (or even usually) appear on devices rather than in providers’ databases. But it also cuts through to some more fundamental principles that would be uncomfortable for the FBI to acknowledge.
    For a long time now, but especially since the Snowden saga began, we have been told that all communication metadata and much communicated content is subject to frequent querying for a reason. That reason is that it has been handed over into the possession of a third party business service provider, one who isn’t really a party to the speech per se, to work with. Remember how hard John Yoo hammered at this point last week?
    The common answer, which is also the one that got thrown at Yoo, is that that’s not fair because we need the services to meaningfully participate in today’s society, and the services don’t work if they can’t access that information. So there is a kind of standoff in which surveillance advocates can be fairly confident of their positions. After all, at least in that sense, it is the privacy advocates who have to advocate changing a legal principle to suit new times, and that’s much harder to do politically than it is to just say, “tough luck, you’re forfeiting your close hold on that data by giving it to companies, regardless of whether everyone’s doing it en masse“. This has been followed by “…and you’ll just have to trust our integrity and our procedures in terms of separating out content from metadata.” “…” “[Crickets]”.
    So what does encrypted storage change? When it comes to communications surveillance at least, one thing it does is to act as a method that *technically* separates content from metadata rather than relying on law enforcement’s honor system. So I think one reason this has gotten a strong reaction isn’t just ignorance about the fact that a lot of content will be backed up into the cloud anyway. It’s fear that even that cloud-stored content will be encrypted using client-side private keys.
    What that does is it turns around the whole debate in which people like Yoo argue that privacy has been voluntarily surrendered. Privacy advocates no longer only have the option of saying, “Legislate an expansion of our expectation of privacy into information we share in this way.” Rather, they can also begin to say, “Okay, well, here are ways we can figure out that make these services work *without* having to actually expose the content to the intermediary, even theoretically.” Remember, the FBI seems particularly alarmed by encrypted storage *of communications content* in a way that they did not freak out about, say, the inclusion of BitLocker with Windows, which would protect a lot of content that was generated on the device in the first place.
    And it isn’t just content that can be protected end-to-end *and* at both ends: depending on the type of service, more and more metadata also doesn’t need to be exposed to the intermediary in order for the service to work.
    In the end, then, this “new debate” seems to me partly like a childish reaction to what was happening in the “old debate” that centered around the third-party handover principle. When the impact of applying the principle to the new environment concerned privacy advocates, it was “hey, the principle is the principle; deal with it.” Now, the impact on privacy of applying the principle looks set to *ease* (because the plain communicated information only exists at the end storage device, and only when users actively call it up). Suddenly it’s “oh my god, we’re Going Dark, we really need to rethink this principle, right? I mean, do you want child predators to go free?”
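    [The separation this comment describes, where an intermediary can deliver a message without ever holding its plain content, can be sketched in a few lines. This is a toy editorial illustration, not any real messaging protocol: the names, the envelope fields, and the XOR “cipher” standing in for real end-to-end cryptography are all assumptions.]

    ```python
    # Toy sketch: what a provider actually stores when content is
    # encrypted client-side. The XOR "cipher" is a placeholder for
    # real end-to-end cryptography -- do not use for real data.
    import os
    import time

    def toy_encrypt(key: bytes, msg: bytes) -> bytes:
        # One-time-pad-style XOR; symmetric, so applying it twice
        # with the same key restores the original bytes.
        return bytes(a ^ b for a, b in zip(msg, key))

    shared_key = os.urandom(64)          # held only by the two endpoints

    body = b"meet at noon"
    envelope = {                          # what the intermediary sees
        "from": "alice", "to": "bob",     # metadata: still exposed today
        "sent": int(time.time()),
        "ciphertext": toy_encrypt(shared_key, body).hex(),
    }

    # The provider can route and bill from the envelope, but cannot
    # produce plain content in response to an order; only an endpoint
    # holding the key can reconstruct it:
    assert toy_encrypt(shared_key, bytes.fromhex(envelope["ciphertext"])) == body
    ```

    [The point of the sketch: the fields the FBI calls metadata remain visible to the intermediary by necessity of delivery, while the content never exists there in plain form.]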

  6. John Reilly says:

    A Bit Wrong on the Apple Technology!!
    This is Apple’s statement on iMessage and FaceTime encryption in iOS 8
    Your iMessages and FaceTime calls are your business, not ours. Your communications are protected by end-to-end encryption across all your devices when you use iMessage and FaceTime, and with iOS 8 your iMessages are also encrypted on your device in such a way that they can’t be accessed without your passcode. Apple has no way to decrypt iMessage and FaceTime data when it’s in transit between devices. So unlike other companies’ messaging services, Apple doesn’t scan your communications, and we wouldn’t be able to comply with a wiretap order even if we wanted to. While we do back up iMessage and SMS messages for your convenience using iCloud Backup, you can turn it off whenever you want. And we don’t store FaceTime calls on any servers.

    This probably changed since iOS 7, because in iOS 7, they probably still held the public/private keys, and the data probably went through their servers.
    This is what this article is getting at.
    However, as Snowden pointed out in his New Yorker interview, the user still has the key to his own phone. This key is usually accessible through their computer. If you back up your phone to your computer, or sync it, the key resides on the computer, or at least it used to. In any case, the user or users know the key (password) to their own phone. In a criminal proceeding, I suppose they could be withholding evidence if they did not reveal their key.

    The best way to think of Apple’s encryption is that it gives a criminal the power to write a letter or a note, and burn it afterwards. Comey is right that IFF a criminal is using his phone intelligently, he can pass these notes back and forth and then burn them all. But in practice, the thought of “burning them all” probably comes when the police are at the door. And even with time, the drive could take an hour to do a 3-pass erase. Probably, messages will accumulate on the phone, and they are only “encrypted” when sent to and from the sender and recipient. On the phone, or on the iCloud drive, they are in plain text (sent securely to Apple over SSL). It is NOT probable that you could take the hard drive out, attach it to another processor, and boot it up. But this is a higher-level question.

    • Adam Colligan says:

      But that is where there is a shift now: the increasing focus on the encryption of the device storage itself (and often even the synced cloud storage, too). And it’s not a simple issue when it comes to simply compelling the user to turn over the key. For one thing, government agents have gotten very used to being able to pry information out without the end user even knowing, so they’re going to be upset at having to go directly after users with real warrants rather than just hitting up service providers. But even at the level of legal principle, it is not at all clear that refusing to turn over or use a password or key is “withholding evidence”.
      Using words like “key” or “locked” is customary, but the analogy is misleading in this situation. A safe contains objects that exist in their own right, and they may be evidence. Warrants can compel people to get them out of the safe. But when data is encrypted, the original plain data *does not exist*. Instead, imagine that a person has kept a diary, or kept up a correspondence with a twin, in an original language that no one else speaks. There is little question that a warrant can compel a person to turn over the diary. But what’s being asked in these situations is really whether a warrant can compel a person to actively *translate* the diary entries into English for the benefit of the prosecutors.
      Decryption is not a process of “turning over” extant data to lawful authorities. It is a process of creating a new object that is derived from the encrypted data using rules of translation that are known to only one person. That implicates a very different set of principles.

      • John says:

        You said, “But when data is encrypted, the original plain data *does not exist*. Instead, imagine that a person has kept a diary, or kept up a correspondence with a twin, in an original language that no one else speaks.”

        But they are not speaking this language, they are using it for transit only, not for reading. The encrypted transmission, like Searle’s Chinese room, or modern telephony, is the one that is speaking a different language. On the user’s phone, texts are unencrypted. Or if FaceTime calls are stored, they are unencrypted. They are not encrypted and decrypted every time you look at them. They are encrypted for transit, and decrypted upon arrival.

        There are applications that can encrypt portions of data on your phone, and in that case what you are speaking about is true, but that’s not how the phone works. The phone has a lock on it, and once you unlock it, all your iMessages are there for you or the FBI to read (if they have it).

        Comey wants the transit to be decryptable. So, intercepts at any point in the transmission of a message will work better for the FBI. They want the real-time intercepts!! Now, keeping, say, a drug lord under the illusion that he is safely transmitting messages, and then accessing his cell phone because he thinks the messages he sent are forever secret, might be better than immediate intercept powers. Even if someone erased a message, it is probably not erased on the drive and still exists.

        However, Comey isn’t really clear about what he wants. (I don’t think he knows exactly what he is saying.) I can see that if you have a phone tap on someone, it’s kind of useless if they just text with iMessage. You might be able to get some of those iMessages from the carrier, but all would be encrypted. Comey makes this out to be a law enforcement problem, but it’s more of a surveillance problem.

        Under the current plan, through an intercept or tapped phone, Comey still gets metadata; Apple doesn’t encrypt that. Metadata should be good enough with proper algorithms, but I suspect that the algorithms, if they exist, are not good enough yet, especially for the FBI.

        • Adam Colligan says:

          On the user’s phone, texts are unencrypted. Or if FaceTime calls are stored, they are unencrypted. They are not encrypted and decrypted every time you look at them. They are encrypted for transit, and decrypted upon arrival.

          You are incorrect. If you enable the encryption features that are available now for iPhone and Android and are now going to be turned on by default — the thing this FBI freakout is about — then the messages *are* encrypted on your phone all the time and decrypted every time you look at them. That’s how transparent disk encryption works.
          This really is about the messages being secure on the device as well as in transit, without using any special app. (And not just messages — everything on your phone. All your application data, photos, everything). It’s the same thing as if you enable Bitlocker or TrueCrypt or LUKS on your laptop or desktop. The data is never written in plain text to the hard drive.
          Take the example of a message you receive on some end-to-end encrypted service. And you’re also using the password protection/encryption settings for your phone that are now going to be set as default. The sender encrypts the message to a key that only your phone’s copy of the messaging app can use to decipher it. So the message arrives in your phone. The messaging app uses its key — however the messaging service has it set up — to decrypt the message into plain English text in the phone’s RAM.
          But when the app actually writes anything to disk (e.g., on the phone’s SD card memory), it does *not* get recorded in plain text. The app tells the phone the plain data that it wants stored. And later, when the app requests the data be retrieved for the user or whatever, the storage system sends it the plain text back. So as far as the app or the user cares during a session, the stored message’s content doesn’t *seem* encrypted.
          But that’s because the disk encryption system is transparent. What’s really going on is this. When you, the user, log in to the phone, you enter your password. This password isn’t just a software trick that can be worked around. It is used to decrypt a special strong private key that is sitting in the device storage in an encrypted form while the device is off. The plain text of this private key is then put into the device’s RAM, and a low-level software program is started that acts as an intermediary whenever any other application requests to read or write from the disk/SD storage.
          Every piece of data that is written to the storage is encrypted using this private key. So apps send blocks of data for processing, and then the processor does math that combines the data with the key, and it writes the *result* of that to disk. And every piece of data read from the disk goes through the same process in reverse. The encrypted chunks are pulled from storage, operations are done on them that are dictated by what’s in the private key, and then the result is sent out to the app to show to the user.
          That reading and writing process is only possible as long as the plain private key is sitting there in the device’s RAM. The reading/writing middleman program has to constantly reference it. So if you power the device off (which clears the RAM), or if you send a lock command that involves deleting the private key from the RAM, it then becomes *impossible* to read anything stored on the device. The version of the private key that is permanently stored in the device is itself encrypted — it’s the user’s password itself that is used to transform it mathematically into the useful key that can actually translate the blocks of data.
          So now back to the message that was received. If the message is on the screen and an FBI agent looks over your shoulder, then obviously she can read it. And if you’re logged in to your phone, and the messaging app isn’t protected by any secondary password or encryption, then she can just open up the messaging app and read it. The message *was* encrypted, but it gets decrypted on request just by tapping on the app. If your lock screen is activated, but the phone is set up to keep reading/writing/running while the lock screen is up, then the plain private key is probably still sitting in RAM. In that case, it may be tricky and expensive, but the FBI can probably dump the RAM and find the key in it.
          But if you turn the phone off, or if that plain key is cleared from the RAM, then that’s it. ONLY your password can get the key back, and only with the key will it be possible to read any of the data stored on disk. As long as your password itself is strong enough to deter or prevent brute-force guessing, that message cannot be read by the FBI. It was secure when it left the sender, it was secure in transit, and it is secure on disk. Its plain English content is only ever created when a user with the right key requests to see it, and that plain message content (like the plain key used to translate it) only sits fleetingly in RAM.
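          [The walkthrough above, a password unwrapping a master key into RAM and every read or write passing through that key, can be condensed into a short sketch. This is a toy editorial model only: real systems like iOS data protection, BitLocker, or LUKS use hardware-bound AES, not an HMAC keystream, and the function names and iteration count here are illustrative assumptions.]

          ```python
          # Toy model of transparent disk encryption. The HMAC-SHA256
          # keystream stands in for a real block cipher, purely to show
          # the key-wrapping flow -- do not use for real data.
          import hashlib
          import hmac
          import os

          def kdf(password: bytes, salt: bytes) -> bytes:
              # Stretch the user's passcode into a key-encryption key.
              return hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)

          def keystream_xor(key: bytes, data: bytes) -> bytes:
              # Symmetric: applying it twice with the same key restores the data.
              out = bytearray()
              counter = 0
              while len(out) < len(data):
                  block = hmac.new(key, counter.to_bytes(8, "big"),
                                   hashlib.sha256).digest()
                  out.extend(block)
                  counter += 1
              return bytes(a ^ b for a, b in zip(data, out))

          # Device setup: a random master key is stored only in wrapped form.
          salt = os.urandom(16)
          master_key = os.urandom(32)
          wrapped_key = keystream_xor(kdf(b"user-passcode", salt), master_key)

          # Unlock: the passcode unwraps the master key into RAM...
          key_in_ram = keystream_xor(kdf(b"user-passcode", salt), wrapped_key)
          assert key_in_ram == master_key

          # ...and every write/read is transparently encrypted/decrypted with it.
          on_disk = keystream_xor(key_in_ram, b"incriminating iMessage")
          assert on_disk != b"incriminating iMessage"
          assert keystream_xor(key_in_ram, on_disk) == b"incriminating iMessage"

          # Power off / lock: clear the RAM copy. Without the passcode, the
          # wrapped key on disk is useless and the data stays ciphertext.
          key_in_ram = None
          ```

          [This is why powering the phone off matters so much in the comment above: the only readable copy of the master key lives in RAM, and the on-disk copy is itself ciphertext under the passcode.]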

    • P J Evans says:

      In the real world, phones don’t have unlimited on-board (local) storage. So I expect that users will delete messages (or if possible download them to store them offline).

  7. John says:

    Ok, that was certainly authoritative.

    My confusion was that I didn’t know that iMessage, the application, encrypted and decrypted its files. So then your argument about translating a secret language is valid: although someone would counter with the Chinese room argument, I don’t think that’s valid.

    My confusion is that Comey seems to be arguing against all encryption. However, the iPhone has been hardware-encrypted since the iPhone 3GS (third generation). This is not file-level, but done in the processor. (See https://discussions.apple.com/thread/6441100?start=0&tstart=0). With some effort, you can take the hard drive out and read text not encrypted at the application level.

    Comey seems to be interested in turning back this privacy enhancement as well. Since iPhone OS 6, you have been able to put in codes longer than 4 digits.

    Thank you very much Mr. Colligan, you have enlightened me on this issue.
    I think Comey would have a stronger argument if he pointed out that if two law-breakers communicate solely through FaceTime audio or video, there is no way to tap those conversations. If the FBI gets a warrant to tap someone’s phone, and the targets converse using FaceTime, it can’t tap those calls, at least not that I know of. So, the phone company or any 12333-type intercept will have metadata, but not content.

  8. ЩЗ says:

    There’s another case in the background: the pending prosecution of Jokar Tsarnaev. You will recall that FBI is busy silencing potential witnesses, deporting them, locking them away in supermax, threatening them with frivolous charges, or murdering them. Unfortunately for the FBI, witnesses are not their only problem. Several of their foreign counterparts are fully cognizant of the facts and have meticulously compiled actionable underlying. The gentlemanly era of horse-trading has ended and now the primary value of such adverse information is de-legitimation: dissemination in international fora not subject to state suppression. Surveillance is the US government’s best chance to brace for exposure, and now they’re going blind. Imagine their panic.

Comments are closed.