Charlie Savage has a story that confirms (he linked some of my earlier reporting) something I’ve long argued: NSA was willing to shut down the Internet dragnet in 2011 because it could do what it wanted using other authorities. In it, Savage points to an NSA IG Report on its purge of the PRTT data that he obtained via FOIA. The document includes four reasons the government shut the program down, just one of which was declassified (I’ll explain what is probably one of the still-classified reasons in a later post). It states that SPCMA and Section 702 can fulfill the requirements that the Internet dragnet was designed to meet. The government had made (and I had noted) a similar statement in a different FOIA for PRTT materials in 2014, though this passage makes it even more clear that SPCMA — DOD’s self-authorization to conduct analysis including US persons on data collected overseas — is what made the switch possible.
It’s actually clear there are several reasons why the current plan is better for the government than the previous dragnet, in ways that are instructive for the phone dragnet, both retrospectively for the USA F-ReDux debate and prospectively as hawks like Tom Cotton and Jeb Bush and Richard Burr try to resuscitate an expanded phone dragnet. Those are:
Both the domestic Internet and phone dragnets limited their use to counterterrorism. While I believe the Internet dragnet limits were not as stringent as the phone ones (at least in its pre-2009-shutdown incarnation), they both required that the information only be disseminated for a counterterrorism purpose. The phone dragnet, at least, required that someone sign off that that was why information from the dragnet was being disseminated.
Admittedly, when the FISC approved the use of the phone dragnet to target Iran, it was effectively authorizing its use for a counterproliferation purpose. But the government’s stated admissions — which are almost certainly not true — in the Shantia Hassanshahi case suggest the government would still pretend it was not using the phone dragnet for counterproliferation purposes. The government now claims it busted Iranian-American Hassanshahi for proliferating with Iran using a DEA database rather than the NSA one that technically would have permitted the search but not the dissemination, and yesterday Judge Rudolph Contreras ruled that was all kosher.
But as I noted in this SPCMA piece, the only requirement for accessing EO 12333 data to track Americans is a foreign intelligence purpose.
Additionally, in what would have been true from the start but was made clear in the roll-out, NSA could use this contact chaining for any foreign intelligence purpose. Unlike the PATRIOT-authorized dragnets, it wasn’t limited to al Qaeda and Iranian targets. NSA required only a valid foreign intelligence justification for using this data for analysis.
The primary new responsibility is the requirement:
- to enter a foreign intelligence (FI) justification for making a query or starting a chain,[emphasis original]
Now, I don’t know whether or not NSA rolled out this program because of problems with the phone and Internet dragnets. But one source of the phone dragnet problems, at least, is that NSA integrated the PATRIOT-collected data with the EO 12333 collected data and applied the protections for the latter authorities to both (particularly with regards to dissemination). NSA basically just dumped the PATRIOT-authorized data in with EO 12333 data and treated it as such. Rolling out SPCMA would allow NSA to use US person data in a dragnet that met the less-restrictive minimization procedures.
That means the government can do chaining under SPCMA for terrorism, counterproliferation, Chinese spying, cyber, or counter-narcotic purposes, among others. I would bet quite a lot of money that when the government “shut down” the DEA dragnet in 2013, they made access rules to SPCMA chaining still more liberal, which is great for the DEA because SPCMA did far more than the DEA dragnet anyway.
So one thing that happened with the Internet dragnet is that it had initial limits on purpose and who could access it. Along the way, NSA cheated those open, by arguing that people in different functional areas (like drug trafficking and hacking) might need to help out on counterterrorism. By the end, though, NSA surely realized it loved this dragnet approach and wanted to apply it to all NSA’s functional areas. A key part of the FISC’s decision that such dragnets were appropriate is the special need posed by counterterrorism; while I think they might well buy off on drug trafficking and counterproliferation and hacking and Chinese spying as other special needs, they had not done so before.
The other thing that happened is that, starting in 2008, the government started putting FBI in a more central role in this process, meaning FBI’s promiscuous sharing rules would apply to anything FBI touched first. That came with two benefits. First, the FBI can do back door searches on 702 data (NSA’s ability to do so is much more limited), and it does so even at the assessment level. This basically puts data collected under the guise of foreign intelligence at the fingertips of FBI Agents even when they’re just searching for informants or doing other pre-investigative things.
In addition, the minimization procedures permit the FBI (and CIA) to copy entire metadata databases.
FBI can “transfer some or all such metadata to other FBI electronic and data storage systems,” which seems to broaden access to it still further.
Users authorized to access FBI electronic and data storage systems that contain “metadata” may query such systems to find, extract, and analyze “metadata” pertaining to communications. The FBI may also use such metadata to analyze communications and may upload or transfer some or all such metadata to other FBI electronic and data storage systems for authorized foreign intelligence or law enforcement purposes.
In this same passage, the definition of metadata is curious.
For purposes of these procedures, “metadata” is dialing, routing, addressing, or signaling information associated with a communication, but does not include information concerning the substance, purport, or meaning of the communication.
I assume this uses the very broad definition John Bates rubber stamped in 2010, which included some kinds of content. Furthermore, the SMPs elsewhere tell us they’re pulling photographs (and, presumably, videos and the like). All those will also have metadata which, so long as it is not the meaning of a communication, presumably could be tracked as well (and I’m very curious whether FBI treats location data as metadata as well).
Whereas under the old Internet dragnet the data had to stay at NSA, this basically lets FBI copy entire swaths of metadata and integrate it into their existing databases. And, as noted, the definition of metadata may well be broader than even the broadened categories approved by John Bates in 2010 when he restarted the dragnet.
So one big improvement between the old domestic Internet dragnet and SPCMA (and, to a lesser degree, 702) — improvement, of course, from a dragnet-loving perspective — is that the government can use it for any foreign intelligence purpose.
At several times during the USA F-ReDux debate, surveillance hawks tried to use the “reform” to expand the acceptable uses of the dragnet. I believe controls on the new system will be looser (especially with regards to emergency searches), but it is, ostensibly at least, limited to counterterrorism.
One way USA F-ReDux will be far more liberal, however, is in dissemination. It’s quite clear that the data returned from queries will go (at least) to FBI, as well as NSA, which means FBI will serve as a means to disseminate it promiscuously from there.
Another thing replacing the Internet dragnet with 702 access does is provide another way to correlate multiple identities, which is critically important when you’re trying to map networks and track all the communication happening within one. Under 702, the government can obtain not just Internet “call records” and the content of that Internet communication from providers, but also the kinds of thing they would obtain with a subpoena (and probably far more). As I’ve shown, here are the kinds of things you’d almost certainly get from Google (because that’s what you get with a few subpoenas) under 702 that you’d have to correlate using algorithms under the old Internet dragnet.
Every single one of these data points provides a potentially new identity that the government can track on, whereas the old dragnet might only provide an email and IP address associated with one communication. The NSA has a great deal of ability to correlate those individual identifiers, but — as I suspect the Paris attack probably shows — that process can be thwarted somewhat by very good operational security (and by using providers, like Telegram, that won’t be as accessible to NSA collection).
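To make the correlation problem concrete, here is a minimal sketch (in Python, with invented example identifiers; this is my illustration, not anything from the documents) of how identifiers that co-occur in individual communications might be clustered into a single presumed identity using union-find, the kind of algorithmic correlation the old dragnet would have had to do itself and that richer 702 returns make largely unnecessary:

```python
# Hypothetical sketch: cluster identifiers that co-occur in the same
# communication event into one presumed identity using union-find.
# All identifiers below are invented for illustration.

class UnionFind:
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            # Path halving keeps the trees shallow.
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[ra] = rb

# Each event is the set of identifiers observed together in one communication.
events = [
    {"alice@example.com", "10.0.0.7"},
    {"10.0.0.7", "@alice_handle"},
    {"bob@example.com", "192.0.2.5"},
]

uf = UnionFind()
for event in events:
    ids = list(event)
    for other in ids[1:]:
        uf.union(ids[0], other)

# Group every identifier by its cluster root.
clusters = {}
for identifier in uf.parent:
    clusters.setdefault(uf.find(identifier), set()).add(identifier)

for members in clusters.values():
    print(sorted(members))
```

Under this logic, an email address, an IP address, and a social media handle that ever appear in the same event collapse into one identity; good operational security (never letting two identifiers co-occur) is exactly what breaks the chain.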
This is an area where the new phone dragnet will be significantly better than the existing phone dragnet, which returns IMSI, IMEI, phone number, and a few other identifiers. But under the new system, providers will be asked to identify “connected” identities, which has some limits, but will nonetheless pull some of the same kind of data that would come back in a subpoena.
While replacing the domestic Internet dragnet with SPCMA provides additional data with which to do correlations, much of that might fall under the category of additional functionality. There are two obvious things that distinguish the old Internet dragnet from what NSA can do under SPCMA, though really the possibilities are endless.
The first of those is content scraping. As the Intercept recently described in a piece on the breathtaking extent of metadata collection, the NSA (and GCHQ) will scrape content for metadata, in addition to collecting metadata directly in transit. This will get you to different kinds of connection data. And particularly in the wake of John Bates’ October 3, 2011 opinion on upstream collection, doing so as part of a domestic dragnet would be prohibitive.
In addition, it’s clear that at least some of the experimental implementations on geolocation incorporated SPCMA data.
I’m particularly interested that one of NSA’s pilot co-traveler programs, CHALKFUN, works with SPCMA.
Chalkfun’s Co-Travel analytic computes the date, time, and network location of a mobile phone over a given time period, and then looks for other mobile phones that were seen in the same network locations around a one hour time window. When a selector was seen at the same location (e.g., VLR) during the time window, the algorithm will reduce processing time by choosing a few events to match over the time period. Chalkfun is SPCMA enabled1.
1 (S//SI//REL) SPCMA enables the analytic to chain “from,” “through,” or “to” communications metadata fields without regard to the nationality or location of the communicants, and users may view those same communications metadata fields in an unmasked form. [my emphasis]
Now, aside from what this says about the dragnet database generally (because this makes it clear there is location data in the EO 12333 data available under SPCMA, though that was already clear), it makes it clear there is a way to geolocate US persons — because the entire point of SPCMA is to be able to analyze data including US persons, without even any limits on their location (meaning they could be in the US).
That means, in addition to tracking who emails and talks with whom, SPCMA has permitted (and probably still does) permit NSA to track who is traveling with whom using location data.
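Based solely on the description in the quoted document, the core co-travel computation might look something like the following hedged Python sketch (the data, field names, and scale are all invented; the real analytic obviously runs over vastly larger EO 12333 collection):

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical sketch of the co-traveler logic described above:
# find phones seen at the same network location (e.g., the same VLR)
# within a one-hour window of a target phone's sightings.
# All records below are invented for illustration.

# (phone_id, network_location_id, timestamp) sighting records
sightings = [
    ("target-phone", "VLR-12", datetime(2013, 5, 1, 9, 0)),
    ("phone-A",      "VLR-12", datetime(2013, 5, 1, 9, 20)),   # co-located
    ("phone-B",      "VLR-12", datetime(2013, 5, 1, 14, 0)),   # same VLR, too late
    ("target-phone", "VLR-40", datetime(2013, 5, 1, 11, 0)),
    ("phone-A",      "VLR-40", datetime(2013, 5, 1, 11, 30)),  # co-located again
]

WINDOW = timedelta(hours=1)

def co_travelers(target, records, window=WINDOW):
    """Count, per candidate phone, how many of the target's sightings
    it matches at the same network location within the time window."""
    target_events = [(loc, ts) for pid, loc, ts in records if pid == target]
    matches = defaultdict(int)
    for pid, loc, ts in records:
        if pid == target:
            continue
        for t_loc, t_ts in target_events:
            if loc == t_loc and abs(ts - t_ts) <= window:
                matches[pid] += 1
                break  # count each candidate sighting at most once
    return dict(matches)

print(co_travelers("target-phone", sightings))
```

A phone that repeatedly matches the target across different locations (like phone-A here) is a candidate co-traveler; a single coincidental co-location carries much less signal.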
Finally, one thing we know SPCMA allows is tracking on cookies. I’m of mixed opinion on whether the domestic Internet dragnet ever permitted this, but tracking cookies is not only nice for understanding someone’s browsing history, it’s probably critical for tracking who is hanging out in Internet forums, which is obviously key (or at least used to be) to tracking aspiring terrorists.
Most of these things shouldn’t be available via the new phone dragnet — indeed, the House explicitly prohibited not just the return of location data, but the use of it by providers to do analysis to find new identifiers (though that is something AT&T does now under Hemisphere). But I would suspect NSA either already plans or will decide to use things like Supercookies in the years ahead, and that’s clearly something Verizon, at least, does keep in the course of doing business.
All of which is to say it’s not just that the domestic Internet dragnet wasn’t all that useful in its current form (which is also true of the phone dragnet in its current form now), it’s also that the alternatives provided far more than the domestic Internet did.
Jim Comey recently said he expects to get more information under the new dragnet — and the apparent addition of another provider already suggests that the government will get more kinds of data (including all cell calls) from more kinds of providers (including VOIP). But there are also probably some functionalities that will work far better under the new system. When the hawks say they want a return of the dragnet, they actually want both things: mandates on providers to obtain richer data, but also the inclusion of all Americans.
The other day I looked at an exchange between Ron Wyden and Jim Comey that took place in January 2014, as well as the response FBI gave Wyden afterwards. I want to return to the reason I was originally interested in the exchange: because it reveals that FBI, in addition to obtaining cell location data directly from a phone company or a Stingray, will sometimes get location data from a mobile app provider.
I asked Magistrate Judge Stephen Smith from Houston whether he had seen any such requests — he’s one of a group of magistrates who have pushed for more transparency on these issues. He explained he had had several hybrid pen/trap/2703(d) requests for location and other data targeting WhatsApp accounts. And he had one fugitive probation violation case where the government asked for the location data of those in contact with the fugitive’s Snapchat account, based on the logic that he might be hiding out with one of the people who had interacted with him on Snapchat. The providers would basically be asked to turn over the cell site location information they had obtained from the users’ phones along with other metadata about those interactions. To be clear, this is not location data the app provider generates; it is the location data the phone company generates, which the app accesses in the normal course of operation.
The point of getting location data like this is not to evade standards for a particular jurisdiction on CSLI. Smith explained, “The FBI apparently considers CSLI from smart phone apps the same as CSLI from the phone companies, so the same legal authorities apply to both, the only difference being that the ‘target device’ identifier is a WhatsApp/Snapchat account number instead of a phone number.” So in jurisdictions where you can get location data with an order, that’s what it takes, in jurisdictions where you need a probable cause warrant, that’s what it will take. The map above, which ACLU makes a great effort to keep up to date here, shows how jurisdictions differ on the standards for retrospective and prospective location information, which is what (as far as we know) will dictate what it would take to get, say, CSLI data tied to WhatsApp interactions.
Rather than serving as a way to get around legal standards, the reason to get CSLI from the app provider rather than the phone company that originally produces it is to get location data from both sides of a conversation, rather than just the target phone. That is, the app provides valuable context to the location data that you wouldn’t get just from the target’s cell location data.
The fact that the government is getting location data from mobile app providers — and the fact that they comply with the same standard for CSLI obtained from phones in any given jurisdiction — may help to explain a puzzle some have been pondering for the last week or so: why Facebook’s transparency report shows a big spike in wiretap warrants last year.
[T]he latest government requests report from Facebook revealed an unexpected and dramatic rise in real-time interceptions, or wiretaps. In the first six months of 2015, US law enforcement agencies sent Facebook 201 wiretap requests (referred to as “Title III” in the report) for 279 users or accounts. In all of 2014, on the other hand, Facebook only received 9 requests for 16 users or accounts.
Based on my understanding of what is required, this access of location data via WhatsApp should appear in several different categories of Facebook’s transparency report, including 2703(d), trap and trace, emergency request, and search warrant. That may include wiretap warrants, because this is, after all, prospective interception, and not just of the target, but also of the people with whom the target communicates. That may be why Facebook told Motherboard “we are not able to speculate about the types of legal process law enforcement chooses to serve,” because it really would vary from jurisdiction to jurisdiction and possibly even judge to judge.
In any case, we can be sure such requests are happening both on the criminal and the intelligence side, and perhaps most productively under PRISM (which could capture foreign to domestic communications at a much lower standard of review). Which, again, is why any legislation covering location data should cover the act of obtaining location data, whether via the phone company, a Stingray, or a mobile app provider.
I was looking for something else on Ron Wyden’s website yesterday and noticed this exchange between Wyden and Jim Comey from January 29, 2014 (see my transcription below). At first it seemed to be another of Wyden’s persistent questions about how the government collects location data — which we generally assume to be via telephone provider or Stingray — but then I realized he was asking something somewhat different. After asking about Cell Site Location Information from phone companies, Wyden then asked whether the FBI uses the same standard (an order, presumably a Pen Register) when collecting location from a smart phone app.
Oh yeah! The government can collect location information via apps (and thereby from Google, WhatsApp, or other providers) as well.
Here’s the FBI’s response, which hasn’t been published before.
The response is interesting for several reasons, some of which may explain why the government hasn’t been getting all the information from cell phones that it wanted under the Section 215 phone dragnet.
First, when the FBI is getting prospective CSLI, it gets a full FISA order, based on a showing of probable cause (it can get historical data using just an order). The response to Wyden notes that while some jurisdictions permit obtaining location data with just an order, because others require warrants, “the FBI elects to seek prospective CSLI pursuant to a full content FISA order, thus matching the higher standard imposed in some U.S. districts.”
Some of this FISA practice was discussed in 2006, in response to some magistrates’ rulings that you needed more than an order to get location data, though there are obviously more recent precedents that are stricter about needing a warrant.
This means it is actually harder right now to get prospective CSLI under FISA than it is under Title III in some states. (The letter also notes sometimes the FBI “will use criminal legal authorities in national security investigations,” which probably means FBI will do so in those states with a lower standard).
The FBI’s answer about smart phone apps was far squirrelier. It did say that when obtaining information from the phone itself, it gets a full-content FISA order, absent any exception to the Fourth Amendment (such as the border exception, which is one of many reasons FBI loves to search phones at the border and therefore hates Apple’s encryption). Note that this March 6, 2014 response came before the June 24, 2014 Riley v. CA decision that required a warrant to search a cell phone, which suggests FISA was held to a higher standard there, too, until SCOTUS caught up.
But as to getting information from smartphone apps itself, here’s what FBI answered.
Which legal authority we would use is very much dependent upon the type of information we are seeking and how we intend to obtain that information. Questions considered include whether or not the information sought would target an individual in an area in which that person has a reasonable expectation of privacy, what type of data we intend to obtain (GPS or other similarly precise location information), and how we intend to obtain the data (via a request for records from the service provider or from the mobile device itself).
In other words, after having taken five weeks to think about how to answer Wyden rather than the one week they had promised, they didn’t entirely answer the question, which was what it would take for the FBI to get information from apps, rather than cell phone providers — though I think that may be the same standard as CSLI from a cell phone company.
But this seems to say that, in the FISA context, it may well be easier — and require a lower standard of evidence — for the FBI to get location data from a Stingray.
This explains why Wyden’s location bill — which he was pushing just the other day, after the Supreme Court refused to take Quartavious Davis’ appeal — talks about location collection generally, rather than using (for example) a Stingray.
Wyden: I’d like to ask you about the government’s authority to track individuals using things like cell site location information and smart phone applications. Last fall the NSA Director testified that “we–the NSA–identify a number we can give that to the FBI. When they get their probable cause then they can get the locational information they need.”
I’ve been asking the NSA to publicly clarify these remarks but it hasn’t happened yet. So, is the FBI required to have probable cause in order to acquire Americans’ cell site location information for intelligence purposes?
Comey: I don’t believe so Senator. We — in almost all circumstances — we have to obtain a court order but the showing is “a reasonable basis to believe it’s relevant to the investigation.”
Wyden: So, you don’t have to show probable cause. You have cited another standard. Is that standard different if the government is collecting the location information from a smart phone app rather than a cell phone tower?
Comey: I don’t think I know, I probably ought to ask someone who’s a little smarter what the standard is that governs those. I don’t know the answer sitting here.
Wyden: My time is up. Can I have an answer to that within a week?
Comey: You sure can.
Chuck Rosenberg, head of the U.S. Drug Enforcement Administration, said Wednesday that he agrees with FBI Director James Comey that police officers are reluctant to aggressively enforce laws in the post-Ferguson era of capturing police activity on smartphones and YouTube.
“I think there’s something to it,” Rosenberg said during a press briefing on drug statistics at DEA headquarters in Arlington. “I think he’s spot on. I’ve heard the same thing.”
As a reminder, Rosenberg is also Comey’s former Chief of Staff, from when Comey was Deputy Attorney General in the Bush Administration.
Which is why I find it interesting that the White House has suggested President Obama raised the issue with Comey in a meeting this week.
Asked whether Mr. Obama would call in the two men to discuss the issue privately, Mr. Earnest noted that Mr. Comey met with the president last week, and he strongly hinted that the president chided his F.B.I. director on the subject.
“The president is certainly counting on Director Comey to play a role in the ongoing debate about criminal justice reform,” Mr. Earnest said, suggesting that Mr. Obama expected Mr. Comey to uphold the president’s view on the matter.
While Rosenberg was Comey’s CoS, remember, Comey made sure he was in the loop on torture discussions he otherwise wouldn’t have been, as Comey made an effort to limit some of what got approved in the May 2005 torture memos. That was partly to make sure the torturers didn’t use his absence to push through the memo, but also partly (it seems clear now) to lay out his own record of events.
Given the timing (and the distinct possibility Rosenberg endorsed Comey’s Ferguson Effect views after Comey got chewed out by the President), this feels like a concerted bureaucratic stand. And of course, these two allies’ position atop aggressive law enforcement agencies, with Comey just two years into a ten-year term, stubbornly repeating police claims, amounts to a pretty powerful bureaucratic stand for cops who want to avoid oversight.
For at least the second time, Jim Comey has presented himself as a Ferguson Effect believer, someone who accepts data that has been cherry picked to suggest a related rise in violent crime in cities across the country (I believe that in Ferguson itself, violent crime dropped last month, but whatever).
I have spoken of 2014 in this speech because something has changed in 2015. Far more people are being killed in America’s cities this year than in many years. And let’s be clear: far more people of color are being killed in America’s cities this year.
And it’s not the cops doing the killing.
We are right to focus on violent encounters between law enforcement and civilians. Those incidents can teach all of us to be better.
But something much bigger is happening.
Most of America’s 50 largest cities have seen an increase in homicides and shootings this year, and many of them have seen a huge increase. These are cities with little in common except being American cities—places like Chicago, Tampa, Minneapolis, Sacramento, Orlando, Cleveland, and Dallas.
In Washington, D.C., we’ve seen an increase in homicides of more than 20 percent in neighborhoods across the city. Baltimore, a city of 600,000 souls, is averaging more than one homicide a day—a rate higher than that of New York City, which has 13 times the people. Milwaukee’s murder rate has nearly doubled over the past year.
Yesterday, Comey flew to Chicago and repeated something its embattled Mayor recently floated (even while Bill Bratton, who is a lot more experienced at policing than Rahm Emanuel, has publicly disputed it): that cops are not doing their job because people have started taking videos of police interactions.
I’ve also heard another explanation, in conversations all over the country. Nobody says it on the record, nobody says it in public, but police and elected officials are quietly saying it to themselves. And they’re saying it to me, and I’m going to say it to you. And it is the one explanation that does explain the calendar and the map and that makes the most sense to me.
Maybe something in policing has changed.
In today’s YouTube world, are officers reluctant to get out of their cars and do the work that controls violent crime? Are officers answering 911 calls but avoiding the informal contact that keeps bad guys from standing around, especially with guns?
I spoke to officers privately in one big city precinct who described being surrounded by young people with mobile phone cameras held high, taunting them the moment they get out of their cars. They told me, “We feel like we’re under siege and we don’t feel much like getting out of our cars.”
I’ve been told about a senior police leader who urged his force to remember that their political leadership has no tolerance for a viral video.
So the suggestion, the question that has been asked of me, is whether these kinds of things are changing police behavior all over the country.
And the answer is, I don’t know. I don’t know whether this explains it entirely, but I do have a strong sense that some part of the explanation is a chill wind blowing through American law enforcement over the last year. And that wind is surely changing behavior.
Let’s, for the moment, assume Comey’s anecdote-driven impression, both of the Ferguson Effect and of the role of cameras, is correct (to his credit, in this speech he called for more data; he would do well to heed his own call on that front). Let’s assume that all these cops (and mayors, given that Comey decided to make this claim in Rahm’s own city) are correct, and cops have stopped doing the job we’re all paying them to do because they’re under rather imperfect but nevertheless increased surveillance.
We’ll take you at your word, Director Comey.
If Comey’s right, what he’s describing is the chilling effect of surveillance, the way in which people change their behavior because they know they will be seen by a camera. That Comey is making such a claim is all the more striking given that the surveillance cops are undergoing is targeted surveillance, not the kind of dragnet surveillance (such as the use of planes to surveil the Baltimore and Ferguson protests, which he acknowledged this week) his agency and the NSA subject Americans to.
Sorry, sir! Judge after judge has ruled such claims to be speculative and therefore invalid in a court of law, most recently when T.S. Ellis threw out the ACLU’s latest challenge to the dragnet yesterday!
I actually do think there’s something to the chilling effect of surveillance (though, again, what’s happening to cops is targeted, not dragnet). But if Comey has a problem with that, he can’t have it both ways. He needs to consider the way in which the surveillance of young Muslim and African-American men leads them to do things they might not otherwise do, the way in which it makes targets of surveillance feel under siege. He needs to consider how the surveillance his Agents undertake actually makes it less likely people will engage in the things they’re supposed to do: enjoy free speech, mount a robust criminal defense unrestricted by spying on lawyers, enjoy privacy.
Comey adheres to a lot of theories, including the Ferguson Effect.
But as of yesterday, he is also on the record as claiming that surveillance has a chilling effect. Maybe he should consider the implications of what he is saying for the surveillance his own agency has us under? If the targeted surveillance of cops is a problem, isn’t the far less targeted surveillance he authorizes a bigger problem?
The FBI just charged an Albanian hacker living in Malaysia, Ardit Ferizi, aka Th3Dir3ctorY, with stealing the Personally Identifiable Information of over 1,000 service members and subsequently posting that PII online to encourage people to target them (he provided the data to, among others, Junaid Hussain, who was subsequently killed in a drone strike).
Given Jim Comey’s repeated warnings of how the FBI is going dark on ISIS organizing, I thought I’d look at how FBI found this guy.
In other words, Ferizi apparently did nothing to hide the association between his public Twitter boasting about stealing PII and association with KHS and the hack, down to his repeated email nudges to the victim company (and his attempt to get 2 Bitcoins to stop hacking them). His Twitter account, Facebook account, and email account could all be easily correlated both through IP and name, and activity on all three inculpated him in the hack.
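The kind of correlation the complaint describes — linking Twitter, Facebook, and email accounts through shared IP addresses and names — is trivially easy when a target takes no precautions. A hedged sketch of the basic join (all records below are invented; this illustrates the general technique, not the FBI’s actual tooling):

```python
# Hypothetical sketch: group accounts across services that share a
# login IP address or a registered name. All records are invented.

accounts = [
    {"service": "twitter",  "handle": "demo_hacker",        "ip": "203.0.113.9",  "name": "A. Example"},
    {"service": "facebook", "handle": "a.example.demo",     "ip": "203.0.113.9",  "name": "A. Example"},
    {"service": "email",    "handle": "aexample@demo.test", "ip": "203.0.113.9",  "name": "A. Example"},
    {"service": "twitter",  "handle": "unrelated_user",     "ip": "198.51.100.2", "name": "B. Other"},
]

def correlate(records):
    """Greedily group accounts that share either an IP address or a name
    with any account already in a group."""
    groups = []
    for rec in records:
        placed = False
        for group in groups:
            if any(rec["ip"] == g["ip"] or rec["name"] == g["name"] for g in group):
                group.append(rec)
                placed = True
                break
        if not placed:
            groups.append([rec])
    return groups

groups = correlate(accounts)
for group in groups:
    print(sorted(a["handle"] for a in group))
```

With a single shared login IP tying all three services together, no algorithmic sophistication is needed at all — which is the point: the only “security” in the complaint was the Bitcoin account.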
The only mention of any security in the complaint is that Bitcoin account.
Sure, Ferizi was not playing the role of formal recruiter here, but instead agent provocateur and hacker. Still! The FBI is billing this guy as a hacker. And he did less to protect his identity than I sometimes do.
At least in this case, FBI isn’t going dark on ISIS’ attempts to incite attacks on Americans.
At a 10 AM Senate Homeland Security hearing on October 8, Jim Comey read prepared testimony that reiterated his claim that encrypted devices are causing FBI problems, but stated that the Administration is not seeking legislation to do anything about it.
Unfortunately, changing forms of Internet communication and the use of encryption are posing real challenges to the FBI’s ability to fulfill its public safety and national security missions. This real and growing gap, to which the FBI refers as “Going Dark,” is an area of continuing focus for the FBI; we believe it must be addressed given the resulting risks are grave in both traditional criminal matters as well as in national security matters. The United States Government is actively engaged with private companies to ensure they understand the public safety and national security risks that result from malicious actors’ use of their encrypted products and services. However, the Administration is not seeking legislation at this time.
That statement got the Administration a lot of good press, with the WaPo declaring “Obama administration opts not to force firms to decrypt data — for now” and the NYT, even after this ruling had been unsealed, reporting, “Obama Won’t Seek Access to Encrypted User Data.” In the actual hearing, Comey was more clear that he did intend to keep asking providers for data and that the government was having “increasingly productive conversations with industry” to get them to do so, inspired in part by government claims about the ISIS threat. Part of that cooperation, per Comey, was “how can we get you to comply with a court order.”
Sometime that same day, on October 8, government lawyers submitted a request to a federal magistrate in Brooklyn to obligate Apple to help unlock a device law enforcement had been unable to unlock on their own.
In a sealed application filed on October 8, 2015, the government asks the court to issue an order pursuant to the All Writs Act, 28 U.S.C. § 1651, directing Apple, Inc. (“Apple”) to assist in the execution of a federal search warrant by disabling the security of an Apple device that the government has lawfully seized pursuant to a warrant issued by this court. Law enforcement agents have discovered the device to be locked, and have tried and failed to bypass that lock. As a result, they cannot gain access to any data stored on the device notwithstanding the authority to do so conferred by this court’s warrant.
The next day the judge, James Orenstein, deferred ruling on whether the All Writs Act is applicable in this case (though he did suggest it probably wasn’t) pending briefing from Apple on how burdensome it would find the request. Orenstein released his memo after giving the government opportunity to review his order.
This is not the first time the government has tried to use the All Writs Act to force providers (Apple, in at least one of the known cases) to help unlock a phone. EFF described two instances from last year in a December post. It also reviewed a 2005 ruling where Orenstein refused to allow the government to use All Writs Act to force telecoms to provide cell site location in real time.
Of course, as Lawfare seems to suggest, it has taken a decade for the decision Orenstein made in that earlier ruling — that the government needs a warrant to get cell tracking from a phone — to finally get fully developed into a debate and some Supreme Court (US v. Jones) and circuit rulings. That’s because in the interim, plenty of magistrates continued to compel providers to give such information to the government.
It’s quite possible the same is true here: that this is not just the third attempt to get a court to issue an All Writs Act to get Apple to provide data, but that instead, a number of magistrates who are more compliant with government wishes have agreed to do so as well. Indeed, as Orenstein noted, that’s a suggestion the government made in its application when it claimed “in other cases, courts have ordered Apple to assist in effectuating search warrants under the authority of the All Writs Act [and that] Apple has complied with such orders.”
What Orenstein did, then, was to make it clear this continues to go on: even as Jim Comey and others were making public claims (and getting public acclaim) for not seeking legislation that would compel production of encrypted data, the government — including, presumably, the FBI — was seeking court orders that would compel production secretly. The key rhetorical move in Orenstein’s order came when he compared Comey’s public statements claiming to support debate on this issue to the government’s claim that it had to rely on the All Writs Act because no law existed. In a long footnote, Orenstein quoted from Comey’s Lawfare post:
Democracies resolve such tensions through robust debate …. It may be that, as a people, we decide the benefits here outweigh the costs and that there is no sensible, technically feasible way to optimize privacy and safety in this particular context, or that public safety folks will be able to do their job well enough in a world of universal strong encryption. Those are decisions Americans should make, but I think part of my job is [to] make sure the debate is informed by a reasonable understanding of the costs.
Then Orenstein pointed out that relying on the All Writs Act would undercut precisely the democratic debate Comey claimed to want to have.
Director Comey’s view about how such policy matters should be resolved is in tension, if not entirely at odds, with the robust application of the All Writs Act the government now advocates. Even if CALEA and the Congressional determination not to mandate “back door” access for law enforcement to encrypted devices does not foreclose reliance on the All Writs Act to grant the instant motion, using an aggressive interpretation of that statute’s scope to short-circuit public debate on this controversy seems fundamentally inconsistent with the proposition that such important policy issues should be determined in the first instance by the legislative branch after public debate – as opposed to having them decided by the judiciary in sealed, ex parte proceedings.
To be fair, even as the government was submitting its secret request to Orenstein, Comey was disavowing his former pro-democratic stance, and instead making it clear the government would try to find some other way to get orders forcing providers to comply.
But, given Orenstein’s invitation for Apple to lay out how onerous this is on it, Comey might get the democratic debate he once embraced.
Update: When I wrote this in the middle of the night I misspelled Judge Orenstein’s name. My apologies!
The WaPo has an update on the Administration’s debate about whether to push for legislation for back doors. It reports that the Obama Administration decided to punt — and not ask for legislation right now while continuing efforts to cajole companies to back door their own products. WaPo even provided the date that decision was made: October 1.
“The administration has decided not to seek a legislative remedy now, but it makes sense to continue the conversations with industry,” FBI Director James Comey said at a Senate hearing Thursday of the Homeland Security and Governmental Affairs Committee.
The decision, which essentially maintains the status quo, underscores the bind the administration is in — between resolving competing pressures to help law enforcement and protecting consumer privacy.
The decision was made at a Cabinet meeting Oct. 1.
“As the president has said, the United States will work to ensure that malicious actors can be held to account – without weakening our commitment to strong encryption,” National Security Council spokesman Mark Stroh said. “As part of those efforts, we are actively engaged with private companies to ensure they understand the public safety and national security risks that result from malicious actors’ use of their encrypted products and services.”
Apple CEO Tim Cook said he doesn’t think we will hear the U.S. National Security Agency asking for a back door into our iPhones, at least not any more. In an interview on NPR’s All Things Considered on Thursday, Mr. Cook implied that even the FBI is coming around on the need for end-user encryption.
The intelligence community has asked for a back door. They want access into the communications that are going through Apple’s devices. No?
Tim Cook: I don’t think you will hear the [National Security Agency] asking for a back door.
Robert Siegel: The FBI?
Tim Cook: There have been different conversations with the FBI, I think, over time. And I’ve read in the newspapers myself. But my own view is everyone’s coming around to some core tenets. And those core tenets are that encryption is a must in today’s world. And I think everyone is coming around also to recognizing that any back door means a back door for bad guys as well as good guys. And so a back door is a nonstarter. It means we’re all not safe.
When I first read this interview, I was struck by Cook’s certainty about the NSA, compared to his uncertainty about the FBI. I wondered at the time whether that certainty meant that the rumored FISC request for a back door was ultimately rejected, which would close off the possibility for NSA for the moment (that would affect FBI, too, but only part of FBI’s requests).
Given the coincidence of these two events — Cook’s stated certainty and the cabinet decision not to pursue back doors right now — I’m all the more curious.
Has FISC secretly told the government it can’t force Apple to back door its products?
Periodically, Jim Comey invites a group of select journalists in for lunch and eats them alive with his charisma and unsubstantiated claims. The first I noticed came when Comey made some false claims about National Security Letters, without a single journalist correcting him. More recently, Comey claimed FBI had arrested 10 people with ties to ISIS, only two of whom have ever publicly appeared.
In this week’s edition, Comey got passionate about a claimed spike in crime.
And in unusually passionate remarks, the FBI director said he was “very concerned about what’s going on now with violent crime and murder rates across the country,” in cities as disparate as Omaha and Milwaukee.
At least in this instance, journalists are getting less credulous, because most (though not CNN) reported that in fact the crime stats released this week show a decline in crime, not a spike, even while they reported that violent crime in “many” cities has spiked.
Newly released federal data suggest a slight dip in violence across the nation in 2014. But Comey said those numbers may not be capturing what’s happening on the ground today. He’s been hearing similar concerns from police chiefs, he said.
Earlier this week, the FBI released data showing violent crime dropped slightly in 2014, but many big city police departments have reported significant jumps in shootings this year compared with last year.
In 2014, the number [of murders in NYC] had dropped to 328 — the lowest number of murders since the New York City Police Department began collecting statistics in 1963.
None I saw, however, pointed out that the claim of a spike in “many” cities stems from a persistent propaganda effort that has been debunked as cherry-picking. Yes, there are a few cities with alarming spikes in violence, but they should be examined as cities, not as a trend that the FBI’s own data shows is moving in the opposite direction.
In his comments, Comey didn’t endorse the Ferguson effect. But he did say we need to move slowly on criminal justice reform both because of this alleged spike and because crime has gone down (!?!). Still from the HuffPo:
Comey said he didn’t know whether protests against police violence have made it harder for police to do their jobs, a theory that has been dubbed the “Ferguson effect.” “I’m not discounting it, but I just don’t know,” he said, adding that he was “focused on it, trying to figure it out.”
“Some have said police officers aren’t getting out of their cars and talking to gang-bangers on street corners anymore, but I don’t know,” he said. “What I do know is that a whole lot of people are dying. They are, according to the chiefs, overwhelmingly people of color, and we’ve got to care about that.”
The spike in crime made him want to be “thoughtful” on criminal justice reform, Comey added.
“My strong sense is that a significant portion of the change in our world since I was a prosecutor in New York in 1987 is due to law enforcement, but I’m sure there are lots of other things [going on],” he said.
“I just want to make sure that as we reform — first of all, we’re grateful that we actually have the space and time to think and talk about sentencing better, rehabilitating better, and [that] is a product of hard work over the past 25 years — but as we do it, are very, very thoughtful about where we used to be and how we got from that point to here,” Comey said.
As with encryption back doors, the data is not there (on that issue, DOJ simply doesn’t collect data on how often encryption prevents it from accessing data). But that’s not going to stop him from cautioning against criminal justice reform.
Apple recently released its latest transparency report for the period ending June 30, 2015. By comparing the numbers for two categories with previous reports (2H 2013, 1H 2014, 2H 2014) we can get some sense of how badly Apple’s move to encrypt data has really thwarted law enforcement.
Thus far, the numbers show that “going dark” may be a problem, but nowhere near as big of one as, say, NY’s DA Cy Vance claims.
The easier numbers to understand are the national security orders, presented in the mandated bands.
Since the iPhone 6 was introduced in September 2014, the numbers for orders received have gone up — one band in the second half of 2014, and two more bands in the first half of this year. Curiously, the number of accounts affected hasn’t gone up that much, possibly only tens or a hundred more accounts. And Apple still gets nowhere near the magnitude of requests Yahoo does, which number over 42,000.
Equally curiously, in the last period Apple clearly received more NatSec orders than accounts affected, which is the reverse of what other companies show (before, Apple had appeared close to one-to-one). One thing that might explain this is the quarterly renewal of Pen Register orders for metadata of US persons (which might be counted as 4 requests for each account affected).
In other words, clearly NatSec requests have gone up, proportionally significantly, though Apple remains a tiny target for NatSec requests compared to the bigger PRISM participants.
The law enforcement account requests are harder to understand.
Note, Apple distinguishes between device requests, which are often users seeking help with a stolen iPhone, and account requests, which are requests for either metadata or content associated with an account (and could even include purchase records). The latter are the ones that represent law enforcement trying to get data to investigate a user, and those are the data I’ve laid out here [note, I fully expect to have made some data errors here, and apologize in advance — please let me know what you see!!].
Here, too, Apple has seen a significant increase, of 23%, over the requests it got in the second half of last year. Though, note, the iPhone 6 introduction would not be the only thing affecting this: the June 2014 Riley Supreme Court decision, which required law enforcement to get a warrant to access cell phones, would also lead law enforcement to ask Apple for data more often.
Interestingly, however, there were fewer accounts implicated per request in the most recent period, suggesting that for some reason law enforcement is no longer submitting requests with a slew of accounts listed in each. Whereas last year LE submitted an average of over 6.5 accounts per request, this year they have submitted fewer than 3 accounts per request. This may reflect LE having previously submitted more identifiers from the same account — who knows?
The percentage of requests where content was obtained has gone up too, from 16% in 2013 to 24% in the first period including the iPhone 6 to 30% in the most recent period. Indeed, over half the period-on-period increase this period may stem from an increase in content requests (that is, the 107 more requests where content was obtained in the first half of the year, a period in which Apple got 183 more requests overall). Still, that number, 107 more successful requests for content this year than in the second half of last year, seems totally disproportionate to NYC DA Cy Vance’s claim that the NYPD was unable to access the content in 74 iPhones since the iPhone 6 was introduced (though note, that might represent 1 request for content from 74 iPhones).
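The period-on-period arithmetic above can be sketched in a few lines; the figures are the ones cited in this post from Apple’s reports, but the variable names are mine:

```python
# Figures cited above, drawn from Apple's transparency reports
# (variable names are illustrative, not Apple's).
requests_increase = 183   # more LE account requests, 1H2015 vs 2H2014
content_increase = 107    # of those, more requests where content was obtained

# What share of the overall increase came from content requests?
share_from_content = content_increase / requests_increase
print(f"Content requests account for {share_from_content:.0%} of the increase")
```

That ratio comes out just under 59%, which is where the “over half the period-on-period increase” claim comes from.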
Perhaps the most interesting numbers to compare are the number of times Apple objected (because the agency didn’t have the right kind of legal process or a signed document) and the number of times Apple disclosed no data. The latter would include all those times Apple successfully objected (which appears to include all those in the first number), as well as times Apple didn’t have the account, and times Apple was unable to hand over the data because a user hadn’t used default iCloud storage for messages. [Update, to put this more simply: the way to find the possible number of requests where encryption prevented Apple from sharing information is to subtract the Apple-objected number from the no-data number.] In the second half of 2013, Apple did not disclose any data 28.5% of the time. In the first half of this year, Apple did not disclose any data in just 18.6% of requests. Again, there are a lot of reasons why Apple would not turn over any data at all. But in general, cops are getting data more of the time when they give Apple requests than they were a few years ago.
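The update’s heuristic can be made explicit with a small sketch. The counts below are hypothetical placeholders, since the raw objected and no-data figures aren’t reproduced in this post:

```python
def possibly_encryption_blocked(no_data: int, objected: int) -> int:
    """Rough upper bound on requests where encryption (or a missing
    account, or non-default iCloud storage) kept Apple from sharing
    data: total no-data responses minus successful objections."""
    return no_data - objected

# Hypothetical counts, purely for illustration:
print(possibly_encryption_blocked(100, 40))  # prints 60
```

It’s an upper bound because the remainder still lumps together encryption, nonexistent accounts, and data Apple simply never held.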
More importantly, in just 65 cases in the first half of this year and 80 cases in the second half of last year did Apple not turn over any data for a request for reasons other than some kind of legal objection — and those numbers are both lower than the two half-years preceding them. Each of those requests might represent hundreds of phones, but overall it’s a tiny number. So tiny it’s tough to understand where the NYPD’s 74 locked iPhones fit in (unless they did request data and Apple actually had it).
There’s one more place where unavailable encrypted data might show up in these numbers: in the number of specific accounts for which data was disclosed. But as a percentage, what happened this year is not that different from what happened in 2013. In the second half of 2013, Apple provided some data (which can be content or metadata) for 57.6% of the accounts specified in requests. In the first half of this year, Apple provided some data for 51.6% of the accounts specified in requests — not that huge a difference. And of course, in the second half of last year (which may be an outlier, though the iPhone 6 was out for much of it), Apple provided data for 88.5% of the accounts for which LE asked for data.
Overall, it’s very hard to see where the FBI and other law enforcement agencies are going dark — though they are having to ask Apple for content more often (which I consider a good thing).
Update: In talking to EFF’s Nate Cardozo about Apple’s most recent report, we agreed that Apple’s new category for Emergency Requests may be one other place where iPhone data is handed over (it doesn’t exist in the reports for previous half-year periods). Apple defines emergency requests this way:
Table 3 shows all the emergency and/or exigent requests that we have received globally. Pursuant to 18 U.S.C. §§ 2702(b)(8) and 2702(c)(4) Apple may voluntarily disclose information, including contents of communications and customer records, to a federal, state, or local governmental entity if Apple believes in good faith that an emergency involving imminent danger of death or serious physical injury to any person requires such disclosure without delay. The number of emergency requests that Apple deemed to be exigent and responded to is detailed in Table 3.
Given the scale of Apple’s other requests (though not, comparatively, the scale of cloud requests), these are significant numbers, especially for the US (107) and UK (98).
Of significant note, Apple may give out content under emergency requests.
This is more likely to be a post-Riley response than an encryption response, but still notable given the number.