
[Photo: National Security Agency, Ft. Meade, MD via Wikimedia]

WAG: The Government Made a Significant FISA Back Door Request Just Before December 9, 2015

As I’ve noted, we can be virtually certain that the government has started demanding back doors from tech companies via FISA requests, including Section 702 requests that don’t include any court oversight of assistance provided. Wyden said as much in his statement for the SSCI 702 reauthorization bill request.

It leaves in place current statutory authority to compel companies to provide assistance, potentially opening the door to government mandated de-encryption without FISA Court oversight.

We can point to a doubling of Apple national security requests in the second half of 2016 as one possible manifestation of such requests.

The number of national security orders issued to Apple by US law enforcement doubled to about 6,000 in the second half of 2016, compared with the first half of the year, Apple disclosed in its biannual transparency report. Those requests included orders received under the Foreign Intelligence Surveillance Act, as well as national security letters, the latter of which are issued by the FBI and don’t require a judge’s sign-off.

We might even be able to point to a 2015 request that involved an amicus (likely Amy Jeffress) and got appealed.

Given those breadcrumbs, I want to return to this post on the demand for a back door into the work phone of the San Bernardino killer, Syed Rezwan Farook. In it, I presented a number of other data points to suggest such a request may have come in late 2015. First, in a court filing, Apple claimed to have objected to a bunch of requests for All Writs Act assistance to break into its phones on the same day, December 9, 2015.

As I noted the other day, a document unsealed last week revealed that DOJ has been asking for similar orders in other jurisdictions: two in Cincinnati, four in Chicago, two in Manhattan, one in Northern California (covering three phones), another one in Brooklyn (covering two phones), one in San Diego, and one in Boston.

According to Apple, it objected to at least five of these orders (covering eight phones) all on the same day: December 9 (note, FBI applied for two AWAs on October 8, the day on which Comey suggested the Administration didn’t need legislation, the other one being the Brooklyn docket in which this list was produced).

[Screenshot: Apple’s list of the pending All Writs Act orders to which it objected]

The government disputes this timeline.

In its letter, Apple stated that it had “objected” to some of the orders. That is misleading. Apple did not file objections to any of the orders, seek an opportunity to be heard from the court, or otherwise seek judicial relief. The orders therefore remain in force and are not currently subject to litigation.

Whatever objection Apple made was — according to the government, anyway — made outside of the legal process.

But Apple maintains that it objected to everything already in the system on one day, December 9.

Why December 9? Why object — in whatever form they did object — all on the same day, effectively closing off cooperation under AWAs in all circumstances?

I suggested that one explanation might have been a FISA request for the same thing. Apple would know that FISC takes notice of magistrate decisions, and would want to avoid fighting that battle on two fronts.

There are two possibilities I can think of, though they are both just guesses. The first is that Apple got an order, probably in an unrelated case or circumstance, in a surveillance context that raised the stakes of any cooperation on individual phones in a criminal context. I’ll review this at more length in a later post, but for now, recall that on a number of occasions, the FISA Court has taken notice of something magistrates or other Title III courts have done. For location data, FISC has adopted the standard of the highest common denominator, meaning it has adopted the warrant standard for location even though not all states or federal districts have done so. So the decisions that James Orenstein in Brooklyn and Sheri Pym in Riverside make may limit what FISC can do. It’s possible that Apple got a FISA request that raised the stakes on the magistrate requests we know about. By objecting across the board — and thereby objecting to requests pertaining to iOS 8 phones — Apple raised the odds that a magistrate ruling might help them out at FISA. And if there’s one lawyer in the country who probably knows that, it’s Apple lawyer Marc Zwillinger.

At the time, Tim Cook said that “other parts of government,” aside from the FBI, were asking for more, suggesting the NSA might be doing so.

Aside from the obvious reasons to wonder whether Apple got some kind of FISA request, in his interview with ABC the other day, Tim Cook described “other parts of government” asking for more and more cases (though that might refer to state and city governments asking, rather than FBI in a FISA context).

The software key — and of course, with other parts of the government asking for more and more cases and more and more cases, that software would stay living. And it would be turning the crank.

The other possibility is that by December 9, Apple had figured out that on December 6, a full day after Apple had started helping the FBI access information related to the San Bernardino investigation, the FBI took a step (changing Farook’s iCloud password) that would make it a lot harder to access the content on the phone without Apple’s help.

Obviously, there are other possible explanations for these intersecting breadcrumbs (including that the unidentified 2015 amicus appointment was for some other issue, and that it didn’t relate to appeals up to and including the Supreme Court). But if these issues were all related, it’d make sense.

Notorious “FOIA Terrorist” Jason Leopold “Saves” FBI Over $300,000

Last week, Jim Comey suggested the FBI paid more for the vulnerability that helped it break into Syed Rizwan Farook’s phone than he will be paid for the seven-plus years he’ll remain at FBI. The WSJ then did this math.

Speaking at the Aspen Security Forum in London, FBI Director James Comey didn’t cite a precise figure for how much the government paid for the solution to cracking the phone but said it was more than his salary for the seven-plus years remaining in his term at the FBI.

His annual salary is about $180,000 a year, so that comes to $1.26 million or more.

“[We] paid a lot’’ for the hacking tool, Mr. Comey said. “But it was worth it.’’

Over 600 outlets covered that story, claiming — without further confirmation — that FBI paid over $1 million for the hack, with many accounts settling on $1.3 million.

I noted at the time that 1) Jim Comey has a history of telling untruths when convenient and 2) he had an incentive to exaggerate the cost of this exploit, because it would pressure Congress to pass a bill, like the horrible Burr-Feinstein bill, that would force Apple and other providers to help law enforcement crack phones less expensively. I envisioned this kind of exchange at a Congressional hearing:

Credulous Congressperson: Wow. $1M. That’s a lot.

Comey: Yes, you’ll need to triple our budget or help me find a cheaper way.

Lonely sane Congressperson: But, uh, if we kill security won’t that be more expensive?

Comey: Let me tell you about the time I ran up some steps.

I then mused that, because Comey had officially acknowledged paying that kind of figure, it would make it a lot easier to FOIA the exact amount. By the time I tweeted that thought, of course, Jason Leopold had already submitted a FOIA for the amount.

Sure enough, the outcome I figured has already happened: without offering an explanation for the discrepancy, Mark Hosenball reported today that the figure was actually under $1 million, and FBI will be able to use it on other phones.

The FBI paid under $1 million for the technique used to unlock the iPhone used by one of the San Bernardino shooters – a figure smaller than the $1.3 million the agency’s chief initially indicated the hack cost, several U.S. government sources said on Thursday.

The Federal Bureau of Investigation will be able to use the technique to unlock other iPhone 5C models running iOS 9 – the specifications of the shooter’s phone – without additional payment to the contractor who provided it, these people added.

Just one FOIA submission later (and, probably, after calls from a bunch of outraged members of Congress wondering why FBI paid $1.3 million for a hack it claimed not to understand at all, the explanation it gave for why it would not submit the hack to the Vulnerabilities Equities Process that might have required it to share the hack with Apple, nine months after Apple had patched it), and all of a sudden this hack is at least $300,000 less expensive (and I’m betting a lot more than that).

You see how effective a little aggressive FOIAing is at reining in waste, fraud, and abuse?

A pity it can’t reverse the impact of all those credulous reports repeating Comey’s claim.

Amid an Inconclusive Answer on Encryption, Hillary Reveals She Doesn’t Understand How Metadata Works

Less than a mile from my house (at a small local tech firm called Atomic Object), Hillary Clinton got asked a question about encryption. After talking about the role of encryption in Atomic Object’s own work, one of the women asked (after 14:00; recording cuts out during her question),

What steps do you think government needs to take to make sure that the companies who build these,  create these products, keep our data secure. And also looking at the controversy between Apple and the FBI about encr–

After describing Healthcare.gov as the biggest tech failure in government because “it just didn’t really gel and there wasn’t enough testing,” Hillary admitted (in an apparent non sequitur) the government doesn’t do a good enough job protecting its own data.

We are woefully behind in the government in even protecting our own stuff. And so we’ve got to do a better job if we’re going to be a good partner with businesses to try to maintain privacy of data, whether it’s just customer data or whether it has real public consequences.

She then pivoted from what (I thought) was a project management issue, not a security one, to a long answer on the Apple v. FBI dispute that basically admitted not knowing (or not being willing to say) what the right answer was.

With respect to the current legal controversy, between Apple and the FBI, I am someone who is just feeling like I am in the middle of the worst dilemma ever. I mean, think about it. Because there’s got to be some way to protect the privacy of data information. There’s got to be some way to avoid breaking encryption, however you describe it, and opening the door to a lot of bad actors. But there also has to be some way to follow up on criminal activity and prevent both crimes and terrorism. You guys are the experts on this. I don’t know enough about it to tell you how to do it. But I think that the real mistrust between the tech companies and the government right now is a serious problem that has to be, somehow, worked through.

I keep saying, you know, we have a lot of smart people in this country. You know, we invented the Internet, we invented, you know, the Internet of Things, we’ve invented all of this. Isn’t there some way without opening the door and causing even, you know, more and worse consequences to figure out how you get information?

Because I’m also very understanding of the position that law enforcement finds itself in, and if any one of you were working at Quantico in the FBI lab, and you know, you had this phone that one of the terrorists in San Bernardino did and you wanted to find out who they communicated with and you know that could trace us back to somebody in this country, it could trace us back more clearly to somebody directing it overseas. You’d want to know that too.

So that’s what we need help on, so that we don’t make a grave error that affects our ability to maintain privacy and to protect encryption, but we also don’t open the door — because we know what happens, is these guys that are on the other side of us now, with ISIS and the like, they are really smart. A lot of them are well-educated. They’re not the image of just some poor guy coming to be a Jihadist. They are educated, they are increasingly computer literate, they are wanting to wage as much war and violence on Europe, the United States, as they can. They have learned, so they’re now using encrypted devices, why wouldn’t they? You know why would they be so stupid to continue to allow us to monitor where they are and what they’re doing? This is a problem. And it’s a problem we’ve got to come up with some way to solve. But I certainly am not expert in any way to tell you how to do it.

Right in the middle, however, Hillary reveals not understanding a key part of this controversy. To the extent Syed Rizwan Farook used the Apple software on his work phone to communicate with accomplices, we know who he communicated with, because we have that metadata (as Admiral Mike Rogers recently confirmed). We just don’t know what he said.

We wouldn’t necessarily know who he talked to if he used an app whose metadata was more transient, like Signal. But if so, that’s not an Apple problem.

Moreover, if ISIS recruits are — as Hillary said — smart, then they definitely wouldn’t (and in fact generally don’t) use Apple products, because they’d know that would make their communications easily accessible under the PRISM or USA Freedom programs.

This response is not really any different from what we’re getting from other top Obama officials. But it does come with some indication of the misunderstandings about the problem before us.

Husband of San Bernardino Victim Agrees: Farook’s Phone Unlikely to Yield Useful Information

Even before the government obtained an All Writs Act order directing Apple to help back door Syed Rezwan Farook’s phone, it had arranged with a former judge to submit a brief on behalf of the victims of the attack, supporting the government’s demand. Yet not all victims agree. The husband of a woman shot three times in the attack, Salihin Kondoker, has submitted his own letter to the court in support of Apple’s stance. In it, he provides support for a point I was among the first to make: that the phone isn’t going to provide much information about the attack, in large part because it was a work phone Farook would have known was being surveilled.

In my opinion it is unlikely there is any valuable information on this phone. This was a work phone. My wife also had an iPhone issued by the County and she did not use it for any personal communication. San Bernardino is one of the largest Counties in the country. They can track the phone on GPS in case they needed to determine where people were. Second, both the iCloud account and carrier account were controlled by the county so they could track any communications. This was common knowledge among my wife and other employees. Why then would someone store vital contacts related to an attack on a phone they knew the county had access to? They destroyed their personal phones after the attack. And I believe they did that for a reason.

It’s a question no one asked Jim Comey earlier this week when he testified before the House Judiciary Committee.

Curiously, Kondoker (who explains he has attended briefings the FBI has held for victims) alludes to information the FBI is currently ignoring.

In the weeks and months since the attack I have been to the FBI briefings that were held for victims and their families. I have joined others in asking many questions about how this happened and why we don’t have more answers. I too have been frustrated there isn’t more information. But I don’t believe that a company is the reason for this.

[snip]

In the wake of this terrible attack, I believe strongly we need stronger gun laws. It was guns that killed innocent people, not technology. I also believe the FBI had and still has access to a lot of information which they have ignored and I’m very disappointed in the way they’ve handled this investigation.

I’m really curious what that is — and why Jim Comey, who promises he would never ignore a lead, isn’t ensuring it gets chased down.

Why Did Apple “Object” to All Pending All Writs Orders on December 9?

As I noted the other day, a document unsealed last week revealed that DOJ has been asking for similar orders in other jurisdictions: two in Cincinnati, four in Chicago, two in Manhattan, one in Northern California (covering three phones), another one in Brooklyn (covering two phones), one in San Diego, and one in Boston.

According to Apple, it objected to at least five of these orders (covering eight phones) all on the same day: December 9 (note, FBI applied for two AWAs on October 8, the day on which Comey suggested the Administration didn’t need legislation, the other one being the Brooklyn docket in which this list was produced).

[Screenshot: Apple’s list of the pending All Writs Act orders to which it objected]

The government disputes this timeline.

In its letter, Apple stated that it had “objected” to some of the orders. That is misleading. Apple did not file objections to any of the orders, seek an opportunity to be heard from the court, or otherwise seek judicial relief. The orders therefore remain in force and are not currently subject to litigation.

Whatever objection Apple made was — according to the government, anyway — made outside of the legal process.

But Apple maintains that it objected to everything already in the system on one day, December 9.

Why December 9? Why object — in whatever form they did object — all on the same day, effectively closing off cooperation under AWAs in all circumstances?

There are two possibilities I can think of, though they are both just guesses. The first is that Apple got an order, probably in an unrelated case or circumstance, in a surveillance context that raised the stakes of any cooperation on individual phones in a criminal context. I’ll review this at more length in a later post, but for now, recall that on a number of occasions, the FISA Court has taken notice of something magistrates or other Title III courts have done. For location data, FISC has adopted the standard of the highest common denominator, meaning it has adopted the warrant standard for location even though not all states or federal districts have done so. So the decisions that James Orenstein in Brooklyn and Sheri Pym in Riverside make may limit what FISC can do. It’s possible that Apple got a FISA request that raised the stakes on the magistrate requests we know about. By objecting across the board — and thereby objecting to requests pertaining to iOS 8 phones — Apple raised the odds that a magistrate ruling might help them out at FISA. And if there’s one lawyer in the country who probably knows that, it’s Apple lawyer Marc Zwillinger.

Aside from the obvious reasons to wonder whether Apple got some kind of FISA request, in his interview with ABC the other day, Tim Cook described “other parts of government” asking for more and more cases (though that might refer to state and city governments asking, rather than FBI in a FISA context).

The software key — and of course, with other parts of the government asking for more and more cases and more and more cases, that software would stay living. And it would be turning the crank.

The other possibility is that by December 9, Apple had figured out that on December 6, a full day after Apple had started helping the FBI access information related to the San Bernardino investigation, the FBI took a step (changing Farook’s iCloud password) that would make it a lot harder to access the content on the phone without Apple’s help. Indeed, I’m particularly interested in what advice Apple gave the FBI in the November 16 case (involving two iOS 8 phones), given that it’s possible Apple was successfully recommending FBI pursue alternatives in that case which FBI then foreclosed in the San Bernardino case. In other words, it’s possible Apple recognized by December 9 that FBI was going to use the event of a terrorist attack to force Apple to back door its products, after which Apple started making a stronger legal stand than they might otherwise have done pursuant to secret discussions.

That action — FBI asking San Bernardino to change the password — is something Tim Cook mentioned several times in his interview with ABC the other night, at length here:

We gave significant advice to them, as a matter of fact one of the things that we suggested was “take the phone to a network that it would be familiar with, which is generally the home. Plug it in. Power it on. Leave it overnight–so that it would back-up, so that you’d have a current back-up. … You can think of it as making a picture of almost everything on the phone, not everything, but almost everything.

Did they do that?

Unfortunately, in the days, the early days of the investigation, an FBI–FBI directed the county to reset the iCloud password. When that is done, the phone will no longer back up to the Cloud. And so I wish they would have contacted us earlier so that that would not have been the case.

How crucial was that missed opportunity?

Assuming the cloud backup was still on — and there’s no reason to believe that it wasn’t — then it is very crucial.

And it’s something they harped on in their motion yesterday.

Unfortunately, the FBI, without consulting Apple or reviewing its public guidance regarding iOS, changed the iCloud password associated with one of the attacker’s accounts, foreclosing the possibility of the phone initiating an automatic iCloud back-up of its data to a known Wi-Fi network, see Hanna Decl. Ex. X [Apple Inc., iCloud: Back up your iOS device to iCloud], which could have obviated the need to unlock the phone and thus for the extraordinary order the government now seeks.21 Had the FBI consulted Apple first, this litigation may not have been necessary.

Plus, consider the oddness around this iCloud information. FBI would have gotten the most recent backup (dating to October 19) directly off Farook’s iCloud account on December 6.

But 47 days later, on January 22, they obtained a warrant for that same information. While they might get earlier backups, they would have received substantially the same information they had accessed directly back in December, all as they were preparing to go after Apple to back door its product. It’s not clear why they would do this, especially since there’s little likelihood of this information being submitted at trial (and therefore requiring a parallel-constructed, certified Apple copy for evidentiary purposes).

There’s one last detail of note. Cook also suggested in that interview that things would have worked out differently — Apple might not have made the big principled stand they are making — if FBI had never gone public.

I can’t talk about the tactics of the FBI, they’ve chosen to do what they’ve done, they’ve chosen to do this out in public, for whatever reasons that they have. What we think at this point, given it is out in the public, is that we need to stand tall and stand tall on principle. Our job is to protect our customers.

Again, that suggests they might have taken a different tack with all the other AWA orders if they only could have done it quietly (which also suggests FBI is taking this approach to make it easier for other jurisdictions to get Apple content). But why would they have decided on December 9 that this thing was going to go public?

Update: This language, from the Motion to Compel, may explain why they both accessed the iCloud and obtained a warrant.

The FBI has been able to obtain several iCloud backups for the SUBJECT DEVICE, and executed a warrant to obtain all saved iCloud data associated with the SUBJECT DEVICE. Evidence in the iCloud account indicates that Farook was in communication with victims who were later killed during the shootings perpetrated by Farook on December 2, 2015, and toll records show that Farook communicated with Malik using the SUBJECT DEVICE. (17)

This passage suggests it obtained both “iCloud backups” and “all saved iCloud data,” which are actually the same thing (but would describe the two different ways the FBI obtained this information). Then, without noting a source, it says that “evidence in the iCloud account” shows Farook was communicating with his victims and “toll records” show he communicated with Malik. Remember too that the FBI got subscriber information from a bunch of accounts using (vaguely defined) “legal process,” which could include things like USA Freedom Act.

The “evidence in the iCloud account” would presumably be iMessages or Facetime. But the “toll records” could be too, given that Apple would have those (and could have turned them over in the earlier “legal process” step). That is, FBI may have done this to obscure what it can get at each stage (and, possibly, what kinds of other “legal process” it now serves on Apple).


October 8: Comey testifies that the government is not seeking legislation; FBI submits requests for two All Writs Act orders, one in Brooklyn, one in Manhattan; in the former case, Magistrate Judge James Orenstein invites Apple response

October 30: FBI obtains another AWA in Manhattan

November 16: FBI obtains another AWA in Brooklyn pertaining to two phones, both running iOS 8

November 18: FBI obtains AWA in Chicago

December 2: Syed Rezwan Farook and his wife kill 14 of Farook’s colleagues at a holiday party

December 3: FBI seizes Farook’s iPhone from a Lexus sitting in their garage

December 4: FBI obtains AWA in Northern California covering 3 phones, one running iOS 8 or higher

December 5, 2:46 AM: FBI first asks Apple for help, beginning a period during which three Apple staffers provided 24/7 assistance to the investigation; FBI initially submits “legal process” for information regarding customer or subscriber name for three names and nine specific accounts; Apple responds same day

December 6: FBI works with San Bernardino county to reset iCloud password for Farook’s account; FBI submits warrant to Apple for account information, emails, and messages pertaining to three accounts; Apple responds same day

December 9: Apple “objects” to the pending AWA orders

December 10: Intelligence Community briefs Intelligence Committee members and does not affirmatively indicate any encryption is thwarting investigation

December 16: FBI submits “legal process” for customer or subscriber information regarding one name and seven specific accounts; Apple responds same day

January 22: FBI submits warrant for iCloud data pertaining to Farook’s work phone

January 29: FBI obtains extension on warrant for content for phone

February 14: US Attorney contacts Stephen Larson asking him to file brief representing victims in support of AWA request

February 16: After first alerting the press it will happen, FBI obtains AWA for Farook’s phone and only then informs Apple

FBI Waited 50 Days before Asking for Syed Rezwan Farook’s iCloud Data

Apple’s motion to vacate the All Writs Act order requiring it to help FBI brute force Syed Rezwan Farook’s iPhone is a stupendous document worthy of the legal superstars who wrote it. To my mind, however, the most damning piece comes not from the lawyers who wrote the brief, but in a declaration from another lawyer: Lisa Olle, Apple’s Manager of Global Privacy and Law, which makes up the last 3 pages of the filing.

Olle provides an interesting timeline of FBI’s requests to Apple, some of which I’ll return to. The most damning details, however, are these.

First, FBI contacted Apple in the middle of the night on December 5.
[Screenshot: Olle declaration entry showing FBI first contacted Apple on December 5]

That means FBI first contacted Apple the day before FBI (according to its own statement) asked San Bernardino County to reset Farook’s Apple ID password — a move that, as Apple argues in the filing, foreclosed an alternative that would have made the AWA demand on Apple unnecessary.

Unfortunately, the FBI, without consulting Apple or reviewing its public guidance regarding iOS, changed the iCloud password associated with one of the attacker’s accounts, foreclosing the possibility of the phone initiating an automatic iCloud back-up of its data to a known Wi-Fi network, see Hanna Decl. Ex. X [Apple Inc., iCloud: Back up your iOS device to iCloud], which could have obviated the need to unlock the phone and thus for the extraordinary order the government now seeks.21 Had the FBI consulted Apple first, this litigation may not have been necessary.

In other words, Apple was fully engaged in this case, and yet FBI still didn’t ask their advice before taking action that eliminated the easiest solution to get this information.

And then they waited, and waited, and waited.

[Screenshot: Olle declaration entries showing the gap between the December requests and the January 22 warrant]

FBI waited 50 days from the time they seized the phone on December 3 until they asked Apple for the iCloud information on January 22 (they had to renew the warrant on the phone itself on January 29).

50 days.

And yet the FBI wants us to believe they think this phone will have important information about the attack.

On December 10, Intelligence Committees Not Told Any Encrypted Communications Used in San Bernardino

Here’s what Senate Intelligence Chair Richard Burr and House Intelligence Ranking Member Adam Schiff had to say about a briefing on the San Bernardino attack they attended on December 10.

Lawmakers on Thursday said there was no evidence yet that the two suspected shooters used encryption to hide from authorities in the lead-up to last week’s San Bernardino, Calif., terror attack that killed 14 people.

“We don’t know whether it played a part in this attack,” Senate Intelligence Committee Chairman Richard Burr (R-N.C.) told reporters following a closed-door briefing with federal officials on the shootings.

But that hasn’t ruled out the possibility, Burr and others cautioned.

“That’s obviously one issue we’re very interested in,” House Intelligence Committee ranking member Adam Schiff (D-Calif.) said. “To what degree were either encrypted devices or communications a part of the impediment of the investigation, either while the events were taking place or to our investigation now?”

The recent terror attacks in San Bernardino and Paris have shed an intense spotlight on encryption.

While no evidence has been uncovered that either plot was hatched via secure communications platforms, lawmakers and federal officials have used the incidents to resurface an argument that law enforcement should have guaranteed access to encrypted data.

On December 10, we should assume from these comments, the Congressmen privy to the country’s most secret intelligence and law enforcement information were told nothing about a key source of evidence in the San Bernardino attack being encrypted. Schiff made it quite clear the members of Congress in the briefing were quite interested in that question, but nothing they heard alerted them to a known trove of evidence being hidden by encryption.

That’s an important benchmark because of details the FBI provided in response to questions from Ars Technica’s Cyrus Farivar. As had been made clear in the warrant, FBI seized the phone on December 3. But the statement also reveals that FBI asked the County to reset Farook’s Apple ID password on December 6. That means they were already working on that phone several days before the briefing to the Intelligence Committee members (it’s unclear whether that briefing was just for the Gang of Four or for both Intelligence Committees).

While, given what Tim Cook described last night, the FBI had not yet asked for Apple’s assistance by that point, the FBI had to have known what they were dealing with by December 6 — an iPhone 5C running iOS 9. Therefore, they would have known the phone was encrypted by default (and couldn’t be opened with a fingerprint).

Yet even four days later, they were not interested enough in a phone they had to have known was encrypted to tell Congress it held key data.

Update: Wow, this, from Apple’s motion to vacate the order, makes this all the more damning.

[Screenshot: excerpt from Apple’s motion to vacate, February 25, 2016]

What Claims Did the Intelligence Community Make about the Paris Attack to Get the White House to Change on Encryption?

I’m going to do a series of posts laying out the timeline behind the Administration’s changed approach to encryption. In this post, I’d like to make a point about when the National Security Council adopted a “decision memo” more aggressively seeking to bypass encryption. Bloomberg reported on the memo last week, in the wake of the FBI’s demand that Apple help it brute force Syed Rezwan Farook’s work phone.

But note the date: The meeting at which the memo was adopted was convened “around Thanksgiving.”

Silicon Valley celebrated last fall when the White House revealed it would not seek legislation forcing technology makers to install “backdoors” in their software — secret listening posts where investigators could pierce the veil of secrecy on users’ encrypted data, from text messages to video chats. But while the companies may have thought that was the final word, in fact the government was working on a Plan B.

In a secret meeting convened by the White House around Thanksgiving, senior national security officials ordered agencies across the U.S. government to find ways to counter encryption software and gain access to the most heavily protected user data on the most secure consumer devices, including Apple Inc.’s iPhone, the marquee product of one of America’s most valuable companies, according to two people familiar with the decision.

The approach was formalized in a confidential National Security Council “decision memo,” tasking government agencies with developing encryption workarounds, estimating additional budgets and identifying laws that may need to be changed to counter what FBI Director James Comey calls the “going dark” problem: investigators being unable to access the contents of encrypted data stored on mobile devices or traveling across the Internet. Details of the memo reveal that, in private, the government was honing a sharper edge to its relationship with Silicon Valley alongside more public signs of rapprochement. [my emphasis]

That is, the meeting was convened in the wake of the November 13 ISIS attack on Paris.

We know that last August, Bob Litt had recommended keeping options open until such time as a terrorist attack presented the opportunity to revisit the issue and demand that companies back door encryption.

Privately, law enforcement officials have acknowledged that prospects for congressional action this year are remote. Although “the legislative environment is very hostile today,” the intelligence community’s top lawyer, Robert S. Litt, said to colleagues in an August e-mail, which was obtained by The Post, “it could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement.”

There is value, he said, in “keeping our options open for such a situation.”

Litt was commenting on a draft paper prepared by National Security Council staff members in July, which also was obtained by The Post, that analyzed several options. They included explicitly rejecting a legislative mandate, deferring legislation and remaining undecided while discussions continue.

It appears that is precisely what happened — that the intelligence community, in the wake of a big attack on Paris, went to the White House and convinced them to change their approach.

So I want to know what claims the intelligence community made about the use of encryption in the attack that convinced the White House to change its approach. Because there is nothing in the public record that indicates encryption was important at all.

It is true that a lot of ISIS associates were using Telegram; shortly after the attack Telegram shut down a bunch of channels they were using. But reportedly Telegram’s encryption would be easy for the NSA to break. The difficulty with Telegram — which the IC should consider seriously before they make Apple back door its products — is that its offshore location probably made it harder for our counterterrorism analysts to get the metadata.

It is also true that an ISIS recruit whom French authorities had interrogated during the summer (and who warned them very specifically about attacks on sporting events and concerts) had been given an encryption key on a thumb drive.

But it’s also true the phone recovered after the attack — which the attackers used to communicate during the attack — was not encrypted. It’s true, too, that French and Belgian authorities knew just about every known participant in the attack, especially the ringleader. From reports, it sounds like operational security — the use of a series of burner phones — was more critical to his ability to move unnoticed through Europe. There are also reports that the authorities had a difficult time translating the dialect of (probably) Berber the attackers used.

From what we know, though, encryption is not the reason authorities failed to prevent the French attack. And a lot of other tools that are designed to identify potential attacks — like the metadata dragnet — failed.

I hate to be cynical (though comments like Litt’s — plus the way the IC used a bogus terrorist threat in 2004 to get the torture and Internet dragnet programs reauthorized — invite such cynicism). But it sure looks like the IC failed to prevent the November attack, and immediately used their own (human, unavoidable) failure to demand a new approach to encryption.

Update: In testimony before the House Judiciary Committee today, Microsoft General Counsel Brad Smith repeated a claim MSFT witnesses have made before: they provided Parisian law enforcement email from the Paris attackers within 45 minutes. That implies, of course, that the data was accessible under PRISM and not encrypted.

Reuters Asks Even Stupider Questions about Apple-FBI Fight than Pew

In my post on Pew’s polling on whether Apple should have to write a custom version of its operating system so the FBI can brute force the third phone, I gave Pew credit for several aspects of its question, but suggested the result might be different if Pew had reminded people the FBI has already solved the San Bernardino attack.

Imagine if Pew called 1000 people and asked, “would you support requiring Apple to make iPhones less secure so the FBI could get information on a crime the FBI has already solved?”

As I said, at least Pew’s question was fair.

Not so Reuters’ questions on the same topic. After asking a bunch of questions to which three-quarters of respondents said they would not be willing to give up their own privacy to ward against terrorism or hacking, Reuters then asked this question:

Apple is opposing a court order to unlock a smart phone that was used by one of the shooters in the San Bernardino attack. Apple is concerned that if it helps the FBI this time, it will be forced to help the government in future cases that may not be linked to national security, opening the door for hackers and potential future

Do you agree or disagree with Apple’s decision to oppose the court order?

While Reuters explains why Apple opposes the order (because it will be, and in fact already has been, asked to help break into more phones that have nothing to do with terrorism, creating vulnerabilities for hackers), the wording of the question could easily be understood to imply that Syed Rezwan Farook’s phone “was used [] in the San Bernardino attack.” It’s not clear Farook even used the phone after November 30, two days before his attack. And to the extent Farook and his wife used phones during the attack, as the question implies, those are believed to be the phones they tried unsuccessfully to destroy.

Yet, even with this problematically framed question, 46% of respondents (in an online poll, which likely skews toward the technologically adept) supported Apple’s position.

There’s a problem, too, with the only question for which a plurality supported the FBI’s snooping, a graph of which Reuters highlighted in its story.

The government should be able to look at data on Americans’ phones in order to protect against terror threats.

There are cases where investigators find information on a smart phone that helps prevent follow-on attacks (it happened in Paris with a phone that was not encrypted). Border searches (which I admittedly believe to be one of the real reasons the FBI objects to default encryption), too, might prevent terror attacks. But more often, we’re talking about investigating crimes deemed to be terrorism after the fact (or, far, far more often, solving drug crimes).

Nothing the FBI could do with the data on Farook’s work phone will prevent the deaths of the 14 people he already killed. There are other kinds of surveillance far better suited to doing that.

Pew Poll Finding Majority Oppose Apple Is Premised on FBI Spin

[Screenshot: Pew poll results, February 22, 2016]

Imagine if Pew called 1000 people and asked, “would you support requiring Apple to make iPhones less secure so the FBI could get information on a crime the FBI has already solved?”

Respondents might find the entire question bizarre, as requiring a private company to damage its product for information on a crime the FBI had already solved would be a tremendous waste. Based on the argument I laid out here — that the information the FBI might get from Syed Rezwan Farook’s work phone wouldn’t add all that much to what they presumably already got off two phones he tried unsuccessfully to destroy, as well as the phones or iCloud accounts of his colleagues — that’s the question I think Pew should have asked in its poll.

Here’s what Pew asked:

As you may know, RANDOMIZE: [the FBI has said that accessing the iPhone is an important part of their ongoing investigation into the San Bernardino attacks] while [Apple has said that unlocking the iPhone could compromise the security of other users’ information] do you think Apple [READ; RANDOMIZE]?

To be fair to Pew, FBI has said this phone will be “important,” and to Pew’s great credit, they described Apple’s stance to be about security, not privacy.

But the fact of the matter is the FBI is demanding access to this phone knowing full well who the perpetrators are — Farook and his wife — and knowing (per Admiral Mike Rogers and a slew of FBI statements before him) that the couple didn’t have overseas help. San Bernardino was, the FBI has known for months, a particularly brutal workplace killing inspired by radical Islam.

I sort of suspect Americans might think differently about this particular back door request (though maybe not another case where the phone really would be central to solving the case) if it were explained in those terms.