Ron Wyden


Ron Wyden: Obtaining ECTRs without a Warrant Is Almost Like Spying on Someone’s Thoughts


As a number of outlets have reported, Ron Wyden has placed a hold on the Intelligence Authorization in an attempt to thwart FBI’s quest to be able to obtain Electronic Communication Transaction Records with just a National Security Letter.

But Wyden’s released statement on that hold differs in emphasis from what he said in his Senate address announcing the hold yesterday. The statement describes how all toll records — from emails, texts, or web browsing — can infringe on privacy.

The fact of the matter is that ‘electronic communication transaction records’ can reveal a great deal of personal information about individual Americans.  If government officials know that an individual routinely emails a mental health professional, or sends texts to a substance abuse support group, or visits a particular dating website, or the website of a particular political group, then the government knows a lot about that individual.  Our Founding Fathers rightly argued that such intrusive searches should be approved by independent judges.

But in his floor statement, Wyden went on at length about the particular threat posed by obtaining web browsing history (this starts after 4:40).

For example, the National Security Letters could be used to collect what are called Electronic Communication Transaction Records. This would be email and chat records and text message logs, and in particular, Mr. President, and I’ve had Senators come up to me to ask me about whether this could be true, folks at home this weekend, when I was out and responding to questions about this, people asked, “Does this really mean that the government can get the Internet browsing history of an individual without a warrant even when the government has the emergency authority if it’s really necessary?”

And the answer to that question, Mr. President, is yes, the government can. The government can get access to web browsing history under the Intelligence Authorization legislation, under the McCain amendment, and they can do it without getting a warrant, even when the government can go get it without a warrant when there is an emergency circumstance.

Now the reality is web browsing history can reveal an awful lot of information about Americans. I know of little information, frankly Mr. President, that could be more intimate than that web browsing history. If you know that a person is visiting the website of a mental health professional, or a substance abuse support group, or a particular political organization, or — say — a particular dating site, you know a tremendous amount of private and personal and intimate information about that individual — that’s what you get when you can get access to their web browsing history without a warrant, even when the government’s interest is protected, as I’ve said, in an emergency.

The reality is getting access to somebody’s web browsing history is almost like spying on their thoughts. This level of surveillance absolutely ought to come with court oversight, and as I’ve spelled out tonight, that is possible in two separate ways — the traditional approach with getting a warrant, and then under Section 102, which I wrote as part of USA Freedom Act, the government can get the information when there’s an emergency and come back later after the fact and settle up.

Wyden's floor statement makes a few other things clear. First, by focusing on the emergency provision of USA Freedom Act, Wyden illustrates that the FBI is trying to avoid court oversight, not so much to obtain records quickly (though there would be more paperwork for a retroactive Section 215 order than for an NSL).

That means two things. First, as I’ve noted, FBI is trying to avoid the minimization procedures the FISC spent three years imposing on FBI. Right now, we should assume that FISC would prohibit FBI from retaining all of the data it obtains from web searches, but if it moved (back) to NSL collection it would have no such restriction.

The other thing obtaining ECTRs with NSLs would do, though, is avoid a court First Amendment review, which should be of particular concern with web search history, since everything about web browsing involves First Amendment speech. Remember, a form of emergency provision (one limited to Section 215’s phone chaining application) was approved in February 2014. But in the September 2014 order, the FISC affirmatively required that such a review happen even with emergency orders. A 2015 IG Report on Section 215 (see page 176) explains why this is the case: because once FISC started approving seeds, NSA’s Office of General Counsel stopped doing First Amendment reviews, leaving that for FISC. It’s unclear whether it took FISC several cycles to figure that out, or whether they discovered an emergency approval that infringed on First Amendment issues. Under the expanded emergency provision under USAF, someone at FBI or DOJ’s National Security Division would do the review. But FBI’s interest in avoiding FISC’s First Amendment review is of particular concern given that FBI has, in the past, used an NSL to obtain data the FISC refused on First Amendment grounds, and at least one of the NSL challenges appears to have significant First Amendment concerns.

In the Senate yesterday, Senator Wyden strongly suggested the FBI wants this ECTR provision so it can “spy[] on their thoughts” without a warrant. We know from other developments that doing so using an NSL — rather than an emergency Section 215 order — would bypass rigorous minimization and First Amendment review.

In other words, the FBI wants to spy on — and then archive — your thoughts.

Senate Narrowly Avoids Voting Themselves to Become “Typos”

The McCain (Cornyn) amendment to the Judiciary Appropriations bill that would let the FBI get Electronic Communication Transaction Records with a National Security Letter just narrowly failed to get cloture, with Dan Sullivan flipping his vote to yes near the end but Mike Crapo, a likely no vote, not voting. The final vote was 59-37.

The floor debate leading up to the vote featured a few notable exchanges. Richard Burr was an absolute douchebag, saying Ron "Wyden is consistently against providing LE the tools it needs to defend the American people." He did so in a speech admitting that, "My colleague says this wouldn't stop SB or Orlando. He's 100% correct."

Burr also insisted that we can’t let the Lone Wolf provision, which allegedly has never been used, expire. It was extended just last year and doesn’t expire until 2019.

More interesting, though, was the debate between Burr and Leahy over whether the FBI's inability to obtain ECTRs stems from a typo in the law as passed in 1993. Leahy basically explained that Congress had affirmatively decided not to include ECTRs in NSLs (and, implicit in this, Congress also decided not to include them in the 2001 expansion). Burr claimed that Congress meant to include them but left them out through some kind of oversight.

Here’s how Mazie Hirono and Martin Heinrich described the debate in the report on the Intelligence Authorization, which has a version of the ECTR change.

The FBI has compared expanding these authorities to fixing a “typo” in the Electronic Communications Privacy Act (ECPA).

However, during consideration of ECPA reform legislation in 1993, the House Judiciary Committee said in its committee report that “Exempt from the judicial scrutiny normally required for compulsory process, the national security letter is an extraordinary device. New applications are disfavored.”

The House Judiciary Committee report also makes clear that the bill’s changes to Section 2709(b) of ECPA were a “modification of the language originally proposed by the FBI.”

This does not support claims that the removal of the ECTR language was a “typo.”

Burr effectively argued that because law enforcement wanted ECTRs to be included back in 1993, they were meant to be included, and Congress’ exclusion of them was just a typo.

In short, a member of the Senate just argued that if Congress affirmatively decides not to capitulate to every demand of law enforcement, it must be considered a “typo” and not legally binding law.

For the moment, the Senate voted down making itself a “typo,” but Mitch McConnell filed a motion to reconsider, meaning he can bring the vote back up as soon as he arm twists one more vote.

 

DOJ Confirms One or More Agencies Acted Consistent with John Yoo’s Crummy Opinion

There's a whiff of panic in DOJ's response, submitted last Thursday, to ACLU's latest brief in the common commercial services OLC memo suit. They really don't want to release this memo.

As you recall, this is a memo Ron Wyden has been hinting about forever, stating that it interprets the law other than most people understand it to be. After I wrote about it a bunch of times and pointed out it was apparently closely related to cybersecurity, ACLU finally showed some interest and FOIAed, then sued, for it. In March, DOJ made some silly (but typical) claims about it, including that ACLU had already tried but failed to get the memo as part of its suit for Stellar Wind documents (which got combined with EPIC's suit for electronic surveillance documents). In response, Ron Wyden wrote a letter to Attorney General Loretta Lynch noting a lie DOJ told in its filings in the case, followed by an amicus brief asking the judge in the case to read the secret appendix to the letter he wrote to Lynch. In the amicus, Wyden complained that DOJ wouldn't let him read the secret declarations submitted in the case (making it clear they're being kept secret for strategic reasons more than sources and methods), while asking that the court read his own appendix without saying what was in it.

Which brings us to last week’s response.

DOJ is relying on an opinion the 2nd circuit released last year in ACLU's Awlaki drone memo case. That opinion found that if a significant delay passed between the time an opinion was issued and the time executive branch officials spoke publicly about it (as passed between 2002, when someone wrote a memo for President Bush's "close legal advisor" about drone killings, potentially of American citizens, and 2010, when Executive branch officials stopped hiding the fact they were planning on drone-killing an American citizen), then the government can still hide the memo. (I guess we're not allowed to learn that Kamal Derwish was intentionally, not incidentally, drone-killed in 2002?)

This is, in my understanding, narrower protection for documents withheld under the b5 deliberative privilege exemption than exists in the DC Circuit, especially given that the 2nd circuit forced the government to turn over the Awlaki memos because they had been acknowledged.

In other words, they’re trying to use that 2nd circuit opinion to avoid releasing this memo.

To do that they’re making two key arguments that, in their effort to keep the memo secret, end up revealing a fair amount they’re trying to keep secret. First, they’re arguing (as they did earlier) that the ACLU has already had a shot at getting this memo (in an earlier lawsuit for memos relating to Stellar Wind) and lost.

There’s just one problem with that. As I noted earlier, the ACLU’s suit got joined with EPIC’s, but they asked for different things. ACLU asked for Stellar Wind documents, whereas EPIC asked more broadly for electronic surveillance ones. So when the ACLU argued for it, they were assuming it was Stellar Wind, not something that now appears to (also) relate to cybersecurity.

Indeed, the government suggests the ACLU shouldn’t assume this is a “Terrorist Surveillance Program” document.

7 Plaintiffs conclude that the OLC memorandum at issue here must relate to the Terrorist Surveillance Program and the reauthorization of that program because the attorney who authored the memorandum also authored memoranda on the Terrorist Surveillance Program. Pls.’ Opp. at 10. The fact that two OLC memoranda share an author of course establishes nothing about the documents’ contents, nature, purpose, or effect.

Suggesting (though not stating) the memo is not about TSP is not the same as saying it is not about Stellar Wind or the larger dragnets Bush had going on. But it should mean ACLU gets another shot at it, since they were looking only for SW documents the last time.

Which is interesting given the way DOJ argues, much more extensively, that this memo does not amount to working law. It starts by suggesting that Wyden's filing, which argues a "key assertion" in the government's briefs is inaccurate, is wrong.

3 Senator Wyden asks the Court to review a classified attachment to a letter he sent Attorney General Loretta Lynch in support of his claim that a “key assertion” in the Government’s motion papers is “inaccurate.” Amicus Br. at 4. The Government will make the classified attachment available for the Court’s review ex parte and in camera. For the reasons explained in this memorandum, however, the Senator’s claim of inaccuracy is based not on any inaccurate or incomplete facts, but rather on a fundamental misunderstanding of the “working law” doctrine.

In doing so, it reveals (something we already expected, but which Wyden, unlike DOJ apparently, was discreet enough not to say publicly) that the government did whatever this John Yoo memo said the government could do.

But, it argues (relying on both the DC and 2nd circuit opinions on this) that just because the government did the same thing a memo said would be legal (such as, say, drone-killing a US person with no due process), it doesn’t mean they relied on the memo’s advice when they took that action.

The mere fact that an agency “relies” on an OLC legal advice memorandum, by acting in a manner that is consistent with the advice, Pls.’ Opp. at 11, does not make it “working law.” OLC memoranda fundamentally lack the essential ingredient of “working law”: they do not establish agency policy. See New York Times, 806 F.3d at 687; Brennan Center, 697 F.3d at 203; EFF, 739 F.3d at 10. It is the agency, and not OLC (or any other legal adviser), that has the authority to establish agency policy. If OLC advises that a contemplated policy action is lawful, and the agency considers the opinion and elects to take the action, that does not mean that the advice becomes the policy of that agency. It remains legal advice. 5

5 Nor could the fact that any agency elects to engage in conduct consistent with what an OLC opinion has advised is lawful possibly constitute adoption of that legal advice, because taking such action does not show the requisite express adoption of both the reasoning and conclusion of OLC’s legal advice. See Brennan Center, 697 F.3d at 206; Wood, 432 F.3d at 84; La Raza, 411 F.3d at 358.

Effectively, DOJ is saying that John Yoo wrote another stupid memo just weeks before he left, the government took the action described in the stupid memo, but from that the courts should not assume that the government took Yoo’s advice, this time.

One reason they're suggesting this isn't TSP (which is not the same as saying it's not Stellar Wind) is that it would mean the government did not confirm this action in 2005, when Bush admitted to a subset of things called TSP, the way Obama officials danced around hailing that they had killed Anwar al-Awlaki, which is what led to us getting copies of the memos used to justify killing him.

In short, the government followed Yoo’s advice, just without admitting they were following his shitty logic again.

DOJ’s Pre-Ass-Handing Capitulation

In its February 16 application for an All Writs Act order to force Apple to help crack Syed Rizwan Farook's phone, DOJ asserted,

1. Apple has the exclusive technical means which would assist the government in completing its search, but has declined to provide that assistance voluntarily.

[snip]

2. The government requires Apple’s assistance to access the SUBJECT DEVICE to determine, among other things, who Farook and Malik may have communicated with to plan and carry out the IRC shootings, where Farook and Malik may have traveled to and from before and after the incident, and other pertinent information that would provide more information about their and others’ involvement in the deadly shooting.

[snip]

3. As an initial matter, the assistance sought can only be provided by Apple.

[snip]

4. Because iOS software must be cryptographically signed by Apple, only Apple is able to modify the iOS software to change the setting or prevent execution of the function.

[snip]

5. Apple’s assistance is necessary to effectuate the warrant.

[snip]

6. This indicates to the FBI that Farook may have disabled the automatic iCloud backup function to hide evidence, and demonstrates that there may be relevant, critical communications and data around the time of the shooting that has thus far not been accessed, may reside solely on the SUBJECT DEVICE, and cannot be accessed by any other means known to either the government or Apple.

FBI’s forensics guy Christopher Pluhar claimed,

7. I have explored other means of obtaining this information with employees of Apple and with technical experts at the FBI, and we have been unable to identify any other methods feasible for gaining access to the currently inaccessible data stored within the SUBJECT DEVICE.

On February 19, DOJ claimed,

8. The phone may contain critical communications and data prior to and around the time of the shooting that, thus far: (1) has not been accessed; (2) may reside solely on the phone; and (3) cannot be accessed by any other means known to either the government or Apple.

[snip]

9. Apple left the government with no option other than to apply to this Court for the Order issued on February 16, 2016.

[snip]

10. Accordingly, there may be critical communications and data prior to and around the time of the shooting that thus far has not been accessed, may reside solely on the SUBJECT DEVICE; and cannot be accessed by any other means known to either the government or Apple.

[snip]

11. Especially but not only because iPhones will only run software cryptographically signed by Apple, and because Apple restricts access to the source code of the software that creates these obstacles, no other party has the ability to assist the government in preventing these features from obstructing the search ordered by the Court pursuant to the warrant.

[snip]

12. Apple’s close relationship to the iPhone and its software, both legally and technically – which are the product of Apple’s own design – makes compelling assistance from Apple a permissible and indispensable means of executing the warrant.

[snip]

13. Apple’s assistance is also necessary to effectuate the warrant.

[snip]

14. Moreover, as discussed above, Apple’s assistance is necessary because without the access to Apple’s software code and ability to cryptographically sign code for the SUBJECT DEVICE that only Apple has, the FBI cannot attempt to determine the passcode without fear of permanent loss of access to the data or excessive time delay. Indeed, after reviewing a number of other suggestions to obtain the data from the SUBJECT DEVICE with Apple, technicians from both Apple and the FBI agreed that they were unable to identify any other methods – besides that which is now ordered by this Court – that are feasible for gaining access to the currently inaccessible data on the SUBJECT DEVICE. There can thus be no question that Apple’s assistance is necessary, and that the Order was therefore properly issued.
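A bit of back-of-the-envelope math shows why claim 14's worry about "excessive time delay" and "permanent loss of access" is really about the auto-erase and lockout features rather than the passcode space itself. The numbers below are assumptions drawn from public descriptions of iOS security for devices of this class (roughly 80 milliseconds per guess enforced by the hardware-entangled key derivation, plus an optional wipe after 10 failed attempts); they are not figures from the filings.

```python
# Rough sketch of why guessing the passcode was not an option the FBI wanted
# to risk.  All numbers are assumptions drawn from public descriptions of iOS
# security (per-guess key-derivation delay, optional auto-erase), not from the
# court filings.

GUESS_DELAY_S = 0.08      # ~80 ms per attempt, enforced by key derivation
AUTO_ERASE_AFTER = 10     # optional setting: destroy the key after 10 misses

def worst_case_hours(digits: int) -> float:
    """Hours to exhaust an all-numeric passcode space if nothing stops you."""
    keyspace = 10 ** digits
    return keyspace * GUESS_DELAY_S / 3600

for digits in (4, 6):
    print(f"{digits}-digit passcode: ~{worst_case_hours(digits):.1f} hours "
          "to try every combination (ignoring lockouts)")

# 4-digit: ~0.2 hours; 6-digit: ~22.2 hours.  The raw keyspace is tiny, which
# is exactly why the auto-erase and escalating-delay features matter: with
# auto-erase on, ten wrong guesses destroy the key and the data is gone, so
# the FBI asked Apple for software that disabled those features.
```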

Almost immediately after the government made these claims, a number of security researchers I follow not only described ways FBI might be able to get into the phone, but revealed that FBI had not returned calls with suggestions.

On February 25, Apple pointed out the government hadn't exhausted possible means of getting into the phone.

Moreover, the government has not made any showing that it sought or received technical assistance from other federal agencies with expertise in digital forensics, which assistance might obviate the need to conscript Apple to create the back door it now seeks. See Hanna Decl. Ex. DD at 34–36 [October 26, 2015 Transcript] (Judge Orenstein asking the government “to make a representation for purposes of the All Writs Act” as to whether the “entire Government,” including the “intelligence community,” did or did not have the capability to decrypt an iPhone, and the government responding that “federal prosecutors don’t have an obligation to consult the intelligence community in order to investigate crime”). As such, the government has not demonstrated that “there is no conceivable way” to extract data from the phone.

On March 1, members of Congress and House Judiciary Committee witness Susan Landau suggested there were other ways to get into the phone (indeed, Darrell Issa, who was one of those who made that point, is doing a bit of a victory lap). During the hearing, as Jim Comey insisted that if people had ways to get into the phone, they should call FBI, researchers noted they had done so and gotten no response.

Issa: Is the burden so high on you that you could not defeat this product, either through getting the source code and changing it or some other means? Are you testifying to that?

Comey: I see. We wouldn't be litigating if we could. We have engaged all parts of the U.S. Government to see does anybody that has a way, short of asking Apple to do it, with a 5C running iOS 9 to do this, and we do not.

[snip]

a) Comey: I have reasonable confidence, in fact, I have high confidence that all elements of the US government have focused on this problem and have had great conversations with Apple. Apple has never suggested to us that there’s another way to do it other than what they’ve been asked to do in the All Writs Act.

[snip]

b) Comey [in response to Chu]: We’ve talked to anybody who will talk to us about it, and I welcome additional suggestions. Again, you have to be very specific: 5C running iOS 9, what are the capabilities against that phone. There are versions of different phone manufacturers and combinations of models and operating system that it is possible to break a phone without having to ask the manufacturer to do it. We have not found a way to break the 5C running iOS 9.

[snip]

c) Comey [in response to Bass]: There are actually 16 other members of the US intelligence community. It pains me to say this, because I — in a way, we benefit from the myth that is the product of maybe too much television. The only thing that’s true on television is we remain very attractive people, but we don’t have the capabilities that people sometimes on TV imagine us to have. If we could have done this quietly and privately we would have done it.

[snip]

Cicilline: I think this is a very important question for me. If, in fact — is it in fact the case that the government doesn’t have the ability, including the Department of Homeland Security Investigations, and all of the other intelligence agencies to do what it is that you claim is necessary to access this information?

d) Comey: Yes.

While Comey’s statements were not so absolutist as to suggest that only Apple could break into this phone, Comey repeatedly said the government could not do it.

On March 10, DOJ claimed,

15. The government and the community need to know what is on the terrorist’s phone, and the government needs Apple’s assistance to find out.

[snip]

16. Apple alone can remove those barriers so that the FBI can search the phone, and it can do so without undue burden.

[snip]

17. Without Apple’s assistance, the government cannot carry out the search of Farook’s iPhone authorized by the search warrant. Apple has ensured that its assistance is necessary by requiring its electronic signature to run any program on the iPhone. Even if the Court ordered Apple to provide the government with Apple’s cryptographic keys and source code, Apple itself has implied that the government could not disable the requisite features because it “would have insufficient knowledge of Apple’s software and design protocols to be effective.”

[snip]

18. Regardless, even if absolute necessity were required, the undisputed evidence is that the FBI cannot unlock Farook’s phone without Apple’s assistance.

[snip]

19. Apple deliberately established a security paradigm that keeps Apple intimately connected to its iPhones. This same paradigm makes Apple’s assistance necessary for executing the lawful warrant to search Farook’s iPhone.
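Claims 4, 11, and 17 all rest on the same technical point: the phone will only run code carrying a valid Apple signature, so only the holder of the signing key can load modified software onto it. Here is a minimal sketch of that kind of verify-before-run gate, using the third-party `cryptography` package; the keys, the firmware blob, and the function names are hypothetical stand-ins, not Apple's actual secure boot chain.

```python
# Minimal illustration of a verify-before-run gate: the device only accepts
# firmware whose signature checks out against a public key it already trusts.
# Everything here (keys, blob, names) is hypothetical, for illustration only.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Stand-in for the vendor's signing key; the matching public key would be
# baked into the device's boot ROM.
vendor_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
trusted_public_key = vendor_private_key.public_key()

def sign_firmware(firmware: bytes) -> bytes:
    return vendor_private_key.sign(firmware, padding.PKCS1v15(), hashes.SHA256())

def boot_if_signed(firmware: bytes, signature: bytes) -> None:
    try:
        trusted_public_key.verify(signature, firmware,
                                  padding.PKCS1v15(), hashes.SHA256())
        print("signature valid: booting firmware")
    except InvalidSignature:
        print("signature invalid: refusing to boot")

official = b"official firmware image"
sig = sign_firmware(official)
boot_if_signed(official, sig)                    # boots
boot_if_signed(b"modified firmware image", sig)  # refused: only the key holder
                                                 # can produce a valid signature
```

That gate is why the filings frame Apple as uniquely positioned to assist: without the signing key, a modified image simply won't be accepted by the device.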

On March 15, SSCI Member Ron Wyden thrice suggested someone should ask NSA if they could hack into this phone.

On March 21, DOJ wrote this:

Specifically, since recovering Farook’s iPhone on December 3, 2015, the FBI has continued to research methods to gain access to the data stored on it. The FBI did not cease its efforts after this litigation began. As the FBI continued to conduct its own research, and as a result of the worldwide publicity and attention on this case, others outside the U.S. government have continued to contact the U.S. government offering avenues of possible research.

On Sunday, March 20, 2016, an outside party demonstrated to the FBI a possible method for unlocking Farook’s iPhone

You might think that FBI really did suddenly find a way to hack the phone, after insisting over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over they could only get into it with Apple’s help. Indeed, the described timing coincides remarkably well with the announcement that some Johns Hopkins researchers had found a flaw in iMessage’s encryption (which shouldn’t relate at all to breaking into such phones, though it is possible FBI is really after iMessages they think will be on the phone). Indeed, in describing the iMessage vulnerability, Johns Hopkins prof Matthew Green ties the discovery to the Apple fight.

Now before I go further, it’s worth noting that the security of a text messaging protocol may not seem like the most important problem in computer security. And under normal circumstances I might agree with you. But today the circumstances are anything but normal: encryption systems like iMessage are at the center of a critical national debate over the role of technology companies in assisting law enforcement.

A particularly unfortunate aspect of this controversy has been the repeated call for U.S. technology companies to add “backdoors” to end-to-end encryption systems such as iMessage. I’ve always felt that one of the most compelling arguments against this approach — an argument I’ve made along with other colleagues — is that we just don’t know how to construct such backdoors securely. But lately I’ve come to believe that this position doesn’t go far enough — in the sense that it is woefully optimistic. The fact of the matter is that forget backdoors: we barely know how to make encryption work at all. If anything, this work makes me much gloomier about the subject.

Plus, as Rayne noted to me earlier, Ellen Nakashima’s first report on this went up just after midnight on what would be the morning of March 21, suggesting she had an embargo (though that may be tied to Apple’s fix for the vulnerability). [Update: Correction — her story accidentally got posted then unposted earlier than that.]

But that would require ignoring the 19-plus times (not counting Jim Comey's March 1 testimony) that DOJ insisted the only way they could get into the phone was with Apple's help hacking it (though note most of those claims only considered the ways that Apple might crack the phone, not ways that, say, NSA might). You'd have to ignore the problems even within these statements. You'd have to ignore the conflicting sworn testimony from FBI's witnesses (including Jim Comey).

It turns out FBI’s public argument went to shit fast. Considering the likelihood they screwed up with the forensics on this phone and that there’s absolutely nothing of interest on the phone, I take this as an easy retreat for them.

But that doesn’t mean this is over. Remember, FBI has already moved to unlock this iPhone, of similar vintage to Farook’s, which seems more central to an actual investigation (even if FBI won’t be able to scream terrorterrorterror). There are two more encrypted phones FBI has asked Apple to break open.

But for now, I take this as FBI’s attempt to take its claims back into the shadows, where it’s not so easy to expose the giant holes in their claims.

Updated with Comey testimony.

“Noteworthy” Ron Wyden Interview on Apple vs FBI: Ask NSA, Ask NSA, Ask NSA

This interview Ron Wyden did with Oregon Public Radio includes a lot of what you might expect from him, including an argument that weakening encryption makes us less safe, possibly even exposing kids (because their location gets identified) to pedophiles.

But the most interesting part of this interview are the three times Ron Wyden made it clear, in his inimitable fashion, that someone better ask NSA whether they can decrypt this phone. To me, the interview sounds like this:

Let me tell you what I think is noteworthy here. This is a fight between FBI and Apple. I think it’s noteworthy that nobody has heard from the NSA on this. [around 2:00]

And I want to come back to the fact that the NSA has not been heard from on this and I think that that is noteworthy. [before 7:25]

[After finally being asked what he had heard from NSA] I’m on the intelligence committee, so I’m bound, I take an oath, to not get into classified matters so I’m just going to, uh, leave that there with respect to the NSA. [at 8:30]

We’ve had experts like Susan Landau and Richard Clarke insist that NSA can get into this phone. Jim Comey, in testimony before HJC, sort of dodged by claiming that NSA doesn’t have the ability to get into a phone with this particular configuration.

But Ron Wyden sure seems to think the NSA might have more to say about that.

Golly, I can’t imagine what he thinks the NSA might have to offer about this phone.

Why Isn’t Jim Comey Crusading against This Tool Used to Hide Terrorist Secrets?

Several times over the course of Jim Comey’s crusade against strong encryption, I have noted that, if Comey wants to eliminate the tools “bad guys” use to commit crimes, you might as well eliminate the corporation. After all, the corporate structure helped a bunch of banksters do trillions of dollars of damage to the US economy and effectively steal the homes from millions with near-impunity.

It’d be crazy to eliminate the corporation because it’s a tool “bad guys” sometimes use, but that’s the kind of crazy we see in the encryption debate.

Yesterday, Ron Wyden pointed to a more narrow example of the way “bad guys” abuse corporate structures to — among other things — commit terrorism: the shell corporation.

In a letter to Treasury Secretary Jack Lew, he laid out several cases where American shell companies had been used to launder money for crime — including terrorism, broadly defined.


He then asked for answers about several issues. Summarizing:

  • The White House's IRS-registration proposal for beneficial ownership information on corporations probably won't work. Does Treasury have a better plan? Would the Senate and House proposals to have states or Treasury create such a registry provide the ability to track who really owns a corporation?
  • FinCen has proposed a rule that would not only be easily evaded, but might weaken the existing FATCA standard. Has anyone reviewed this?
  • Does FinCen actually think its rule would identify the natural person behind shell companies?
  • Would requiring financial institutions to report balances held by foreigners help information sharing?

They're good questions but point, generally, to something more telling. We're not doing what we need to do to prevent our own financial system from being used as a tool for terrorism. Unlike encryption, shell companies don't have many real benefits to society. Worse, it sounds like Treasury is making the problem worse, not better.

Of course, the really powerful crooks have reasons to want to retain the status quo. And so FBI Director Jim Comey has launched no crusade about this much more obvious tool of crime.

The Unnamed Network Provider Exposing our Infrastructure

Today was Global Threat day, when James Clapper testifies before various committees in Congress and Ron Wyden asks uncomfortable questions (today, directed exclusively at John Brennan). I’ll have a few posts about the hearings (in Senate Armed Services and Senate Intelligence Committees) and Clapper’s testimony, the SASC version of which is here.

One interesting detail in Clapper’s testimony comes in the several paragraph section on Infrastructure within a larger section on “Protecting Information Resources.” Here’s how the testimony describes the Juniper hack.

A major US network equipment manufacturer acknowledged last December that someone repeatedly gained access to its network to change source code in order to make its products’ default encryption breakable. The intruders also introduced a default password to enable undetected access to some target networks worldwide.

There’s no discussion of how many Federal agencies use Juniper’s VPN, nor of how this must have exposed US businesses (unless the NSA clued them into the problem). And definitely no discussion of the assumption that NSA initially asked for the back door that someone else subsequently exploited.
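To make concrete what "introduced a default password" means in the quoted testimony: the intruders reportedly planted a hardcoded credential in the device's login path, so anyone who knew the magic string could get in while the normal password check still appeared intact. Below is a purely illustrative sketch of that class of backdoor; it is not Juniper's actual code, and the constant and function names are invented.

```python
# Illustrative sketch of a hardcoded-credential backdoor of the kind described
# in the quoted testimony.  This is NOT Juniper's code; the constant and the
# lookup are invented for illustration.
import hashlib
import hmac

USER_DB = {"admin": hashlib.sha256(b"correct horse battery staple").hexdigest()}

# The planted "default password": a constant buried in the login path that
# works for any account and leaves the legitimate check looking untouched.
_BACKDOOR = "totally-innocent-debug-string"

def check_login(username: str, password: str) -> bool:
    # Backdoor: accept the magic string for any user.
    if hmac.compare_digest(password, _BACKDOOR):
        return True
    # Otherwise, the legitimate check against the stored password hash.
    stored = USER_DB.get(username)
    if stored is None:
        return False
    return hmac.compare_digest(stored, hashlib.sha256(password.encode()).hexdigest())

print(check_login("admin", "wrong-password"))    # False
print(check_login("anyone-at-all", _BACKDOOR))   # True: the backdoor
```

Nothing about the login flow looks unusual to an administrator, which is why this sort of thing tends to surface only in a source code review.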

More importantly, there’s no discussion of the cost of this hack, which I find interesting given that it may be an own goal.

What We Know about the Section 215 Phone Dragnet and Location Data

Last month’s squabble between Marco Rubio and Ted Cruz about USA Freedom Act led a number of USAF boosters to belatedly understand what I’ve been writing for years: that USAF expanded the universe of people whose records would be collected under the program, and would therefore expose more completely innocent people, along with more potential suspects, to the full analytical tradecraft of the NSA, indefinitely.

In an attempt to explain why that might be so, Julian Sanchez wrote this post, focusing on the limits on location data collection that restricted cell phone collection. Sanchez ignores two other likely factors — the probable inclusion of Internet phone calls and the ability to do certain kinds of connection chaining — that mark key new functionalities in the program which would have posed difficulties prior to USAF. But he also misses a lot of the public facts about location collection and cell phones under the Section 215 dragnet.  This post will lay those out.

The short version is this: the FISC appears to have imposed some limits on prospective cell location collection under Section 215 even as the phone dragnet moved over to it, and it was not until August 2011 that NSA started collecting cell phone records — stripped of location — from AT&T under Section 215 collection rules. The NSA was clearly getting “domestic” records from cell phones prior to that point, though it’s possible they weren’t coming from Section 215 data. Indeed, the only known “successes” of the phone dragnet — Basaaly Moalin and Adis Medunjanin — identified cell phones. It’s not clear whether those came from EO 12333, secondary database information that didn’t include location, or something else.

Here’s the more detailed explanation, along with a timeline of key dates:

There is significant circumstantial evidence that by February 17, 2006 — two months before the FISA Court approved the use of Section 215 of the PATRIOT Act to aspire to collect all Americans' phone records — the FISA Court required briefing on the use of "hybrid" requests to get real-time location data from targets using a FISA Pen Register together with a Section 215 order. The move appears to have been a reaction to a series of magistrates' rulings against a parallel practice in criminal cases. The briefing order came in advance of the 2006 PATRIOT Act reauthorization going into effect, which newly limited Section 215 requests to things that could be obtained with a grand jury subpoena. Because some courts had required more than a subpoena to obtain location, it appears, FISC reviewed the practice — and, given the BR/PR numbers reported in IG Reports, ended it sometime before the end of 2006, though not immediately.

The FISC taking notice of criminal rulings and restricting FISC-authorized collection accordingly would be consistent with information provided in response to a January 2014 Ron Wyden query about what standards the FBI uses for obtaining location data under FISA. To get historic data (at least according to the letter), FBI used a 215 order at that point. But because some district courts (this was written in 2014, before some states and circuits had weighed in on prospective location collection, not to mention the 11th circuit ruling on historical location data under US v. Davis) require a warrant, “the FBI elects to seek prospective CSLI pursuant to a full content FISA order, thus matching the higher standard imposed in some U.S. districts.” In other words, as soon as some criminal courts started requiring a warrant, FISC apparently adopted that standard. If FISC continued to adopt criminal precedents, then at least after the first US v. Davis ruling, it would have and might still require a warrant (that is, an individualized FISA order) even for historical cell location data (though Davis did not apply to Stingrays).

FISC doesn't always adopt the criminal court standard; at least until 2009, and by all appearances still, FISC permits the collection, then minimization, of Post Cut Through Dialed Digits collected using FISA Pen Registers, whereas in the criminal context FBI does not collect PCTDD. But the FISC does take notice of, and respond to, criminal court decisions, sometimes even imposing a higher national security standard than exists at the district level. So the developments affecting location collection in magistrate, district, and circuit courts would be one limit on the government's ability to collect location under FISA.

That wouldn't necessarily prevent NSA from collecting cell records using a Section 215 order, at least until the Davis decision. After all, does that count as historic (a daily collection of existing records) or prospective (approval to collect data going forward in 90-day increments)? Plus, given the PCTDD and some other later FISA decisions, it's possible FISC would have permitted the government to collect but minimize location data. But the decisions in criminal courts likely gave FISC pause, especially considering the magnitude of the production.

Then there’s the chaos of the program up to 2009.

At least between January 2008 and March 2009, and to some degree for the entire period preceding the 2009 clean-up of the phone and Internet dragnets, the NSA was applying EO 12333 standards to FISC-authorized metadata collection. In January 2008, NSA co-mingled 215 and EO 12333 data in either a repository or interface, and when the shit started hitting the fan the next year, analysts were instructed to distinguish the two authorities by date (which would have been useless to do). Not long after this data was co-mingled in 2008, FISC first approved IMEI and IMSI as identifiers for use in Section 215 chaining. In other words, any restrictions on cell collection in this period may have been meaningless, because NSA wasn’t heeding FISC’s restrictions on PATRIOT authorized collection, nor could it distinguish between the data it got under EO 12333 and Section 215.

Few people seem to get this point, but at least during 2008, and probably during the entire period leading up to 2009, there was no appreciable analytical border between where the EO 12333 phone dragnet ended and the Section 215 one began.
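For readers who haven't followed the chaining piece of this: the analysis at issue is hop-by-hop contact chaining from a seed identifier (a phone number, IMEI, or IMSI), and the compliance problem described above comes down to whether each record carries a usable tag for the authority it was collected under. Here's a minimal sketch, with invented records and field names, of what two-hop chaining looks like and why untagged, co-mingled data defeats any authority-specific restriction.

```python
# Minimal sketch of two-hop contact chaining over call records, and of why
# co-mingled data matters: if records aren't tagged by collection authority,
# there is no way to apply authority-specific restrictions during chaining.
# Records and field names are invented for illustration.
from collections import defaultdict

records = [
    # (identifier A, identifier B, authority tag; None = co-mingled/unknown)
    ("SEED-IMSI", "A", "215"),
    ("A", "B", "12333"),
    ("SEED-IMSI", "C", None),
    ("C", "D", "215"),
]

def chain(seed, hops, allowed_authorities=None):
    """Identifiers reachable from the seed within `hops` hops, optionally
    restricted to records collected under the allowed authorities."""
    graph = defaultdict(set)
    for a, b, authority in records:
        if allowed_authorities is not None and authority not in allowed_authorities:
            continue  # can't use this record under the stated restriction
        graph[a].add(b)
        graph[b].add(a)
    frontier, seen = {seed}, {seed}
    for _ in range(hops):
        frontier = {nbr for ident in frontier for nbr in graph[ident]} - seen
        seen |= frontier
    return seen - {seed}

print(chain("SEED-IMSI", hops=2))                               # {'A', 'B', 'C', 'D'}
print(chain("SEED-IMSI", hops=2, allowed_authorities={"215"}))  # {'A'}
# The untagged record drops out of the restricted run entirely, which is the
# dilemma described above: without reliable tags, the only choices are to
# over-restrict or to ignore the restriction.
```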

There’s no unredacted evidence (aside from the IMEI/IMSI permission) the NSA was collecting cell phone records under Section 215 before the 2009 process, though in 2009, both Sprint and Verizon (even AT&T, though to a much less significant level) had to separate out their entirely foreign collection from their domestic, meaning they were turning over data subject to EO 12333 and Section 215 together for years. That’s also roughly the point when NSA moved toward XML coding of data on intake, clearly identifying where and under what authority it obtained the data. Thus, it’s only from that point forward where (at least according to what we know) the data collected under Section 215 would clearly have adhered to any restrictions imposed on location.

In 2010, the NSA first started experimenting with smaller collections of records including location data at a time when Verizon Wireless was named on primary orders. And we have two separate documents describing what NSA considered its first collection of cell data under Section 215 on August 29, 2011. But it did so only after AT&T had stripped the location data from the records.

It appears Verizon never did the same (indeed, Verizon objected to any request to do so in testimony leading up to USAF's passage). The telecoms used different methods of delivering call records under the program. In fact, on August 2, 2012, NSA's IG described the orders as requiring telecoms to produce "certain call detail records (CDRs) or telephony metadata," which may distinguish records that got processed before being turned over (which may just be AT&T's). Also in 2009, part of Verizon ended its contract with the FBI to provide special compliance with NSLs. Both things may have affected Verizon's ability or willingness to customize what it was delivering to NSA, as compared to AT&T.

All of which suggests that at least Verizon could not or chose not to do what AT&T did: strip location data from its call records. Before USAF, Section 215 could only require providers to turn over records they kept; it could not require, as USAF may, that records be provided in the form demanded by the government. Additionally, under Section 215, providers did not get compensated after the first two dragnet orders.
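For a concrete picture of what "stripping location" involves: cell call detail records typically carry tower and sector identifiers alongside the calling and called numbers, and removing those fields before production is a trivial transformation if a provider is willing, or can be required, to perform it. A minimal sketch with hypothetical field names:

```python
# Sketch of stripping cell-site location fields from a call detail record
# before turning it over.  Field names are hypothetical; real CDR formats
# vary by carrier.

LOCATION_FIELDS = {"cell_id", "sector_id", "lac", "tower_lat", "tower_lon"}

def strip_location(cdr: dict) -> dict:
    """Return a copy of the record with cell-site location fields removed."""
    return {k: v for k, v in cdr.items() if k not in LOCATION_FIELDS}

raw_cdr = {
    "calling_number": "+15035550100",
    "called_number": "+12025550123",
    "start_time": "2011-08-29T14:03:00Z",
    "duration_s": 172,
    "imei": "356938035643809",
    "cell_id": "310-410-2048-7",   # location-revealing fields
    "sector_id": "3",
    "lac": "2048",
}

print(strip_location(raw_cdr))
# The stripped record still supports call chaining (numbers, IMEI, time,
# duration) but no longer reveals which tower, and so roughly where, the
# phone was when the call was made.
```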

All that said, the dragnet has identified cell phones! In fact, the only known "successes" under Section 215 — the discovery of Basaaly Moalin's T-Mobile cell phone and the discovery of Adis Medunjanin's unknown, but believed to be Verizon, cell phone — did just that, and they are cell phones from companies that didn't turn over records. In addition, there's another case, cited in a 2009 Robert Mueller declaration preceding the Medunjanin discovery, that found a US-based cell phone.

There are several possible explanations for that. The first is that these phones were identified based off calls from landlines and/or off backbone records (so the phone number would be identified, but not the cell information). But note that, in the Moalin case, there are no known land lines involved in the presumed chain from Ayro to Moalin.

Another possibility — a very real possibility with some of these — is that the underlying records weren’t collected under Section 215 at all, but were instead collected under EO 12333 (though Moalin’s phone was identified before Michael Mukasey signed off on procedures permitting the chaining through US person records). That’s all the more likely given that all the known hits were collected before the point in 2009 when the FISC started requiring providers to separate out foreign (EO 12333) collection from domestic and international (Section 215) collection. In other words, the Section 215 phone dragnet may have been working swimmingly up until 2009 because NSA was breaking the rules, but as soon as it started abiding by the rules — and adhering to FISC’s increasingly strict limits on cell location data — it all of a sudden became virtually useless given the likelihood that potential terrorism targets would use exclusively cell and/or Internet calls just as they came to bypass telephony lines. Though as that happened, the permissions on tracking US persons via records collected under EO 12333, including doing location analysis, grew far more permissive.

In any case, at least in recent years, it’s clear that by giving notice and adjusting policy to match districts, the FISC and FBI made it very difficult to collect prospective location records under FISA, and therefore absent some means of forcing telecoms to strip their records before turning them over, to collect cell data.


Shorter Devin Nunes: There Are Privacy-Violating Covert Counter-Terrorism Programs We’re Hiding

I want to return to a detail I pointed out in the Intelligence Authorization yesterday: this language, which would affirmatively clarify that the Privacy and Civil Liberties Oversight Board does not get access to information on covert operations.

ACCESS.—Nothing in this section shall be construed to authorize the Board, or any agent thereof, to gain access to information regarding an activity covered by section 503(a) of the National Security Act of 1947 (50 U.S.C. 3093(a)).

Some or several intelligence agencies are demanding this, presumably, at a time when PCLOB is working on a review of two EO 12333 authorized counterterrorism programs conducted by CIA or NSA that affect US persons.

During the next stage of its inquiry, the Board will select two counterterrorism-related activities governed by E.O. 12333, and will then conduct focused, in-depth examinations of those activities. The Board plans to concentrate on activities of the CIA and NSA, and to select activities that involve one or more of the following: (1) bulk collection involving a significant chance of acquiring U.S. person information; (2) use of incidentally collected U.S. person information; (3) targeting of U.S. persons; and (4) collection that occurs within the United States or from U.S. companies. Both reviews will involve assessing how the need for the activity in question is balanced with the need to protect privacy and civil liberties. The reviews will result in written reports and, if appropriate, recommendations for the enhancement of civil liberties and privacy.

It may be that the IC demanded this out of some generalized fear, of the sort Rachel Brand raised when she objected to PCLOB's plan to conduct this EO 12333 review (though none of what she says addresses the covert nature of any program, only their classification). Indeed, given that PCLOB planned to finish the review in question by the end of 2015, it is unlikely that the two programs PCLOB pursued were covert operations. Furthermore, there is nothing in Ron Wyden's statement opposing this language (which I've replicated in full below) that indicates the kind of specific concern he has shown, for example, with location data or secret law or the OLC opinion affecting cybersecurity. Indeed, he specifically says, "this Board's oversight activities to date have not focused on covert action."

So there’s nothing in the public record to make me believe PCLOB has already butted up against a covert operation.

That said, I have in recent weeks become increasingly certain there are programs being run under the guise of counterterrorism, off the official books (and/or were, even after Stellar Wind was "shut down"), and probably in ways that affect the privacy of Americans, potentially a great many Americans.

I say that because there are places where the numbers in the public record don’t add up, where official sources are providing obviously bullshit explanations. I say that, too, because it is clear some places where you’d be able to manage such programs (via personnel labeled as “techs,” for example, and therefore not subject to the oversight of the publicly admitted programs) have been affirmatively preserved over the course of years. I say that because certain authorizations were pushed through with far too much urgency given their publicly described roll out over years. I also say that because it’s increasingly clear CIA, at least, views its surveillance mandate to extend to protecting itself, which in this era of inflamed counterintelligence concerns, might (and has in the past for DOD) extend to spying on its perceived enemies (indeed, one of the programs that I think might be such a covert action would be entirely about protecting the CIA).

I have a pretty good sense what at least a few of these programs are doing and where. I don’t know if they are formally covert operations or not — that’s a confusing question given how covert structure has increasingly been used to preserve deniability from US courts rather than foreign countries. But I do know that the IC’s demand that PCLOB be affirmatively disallowed access to such information suggests it knows such programs would not pass the muster of civil liberties review.

In any case, thanks to House Intelligence Chair Devin Nunes for making that so clear.


Wyden’s statement

This afternoon the House of Representatives passed a new version of the Intelligence Authorization bill for fiscal year 2016. I am concerned that section 305 of this bill would undermine independent oversight of US intelligence agencies, and if this language remains in the bill I will oppose any request to pass it by unanimous consent.

Section 305 would limit the authority of the watchdog body known as the Privacy and Civil Liberties Oversight Board. In my judgment, curtailing the authority of an independent oversight body like this Board would be a clearly unwise decision. Most Americans who I talk to want intelligence agencies to work to protect them from foreign threats, and they also want those agencies to be subject to strong, independent oversight. And this provision would undermine some of that oversight.

Section 305 states that the Privacy and Civil Liberties Board shall not have the authority to investigate any covert action program. This is problematic for two reasons. First, while this Board’s oversight activities to date have not focused on covert action, it is reasonably easy to envision a covert action program that could have a significant impact on Americans’ privacy and civil liberties – for example, if it included a significant surveillance component.

An even bigger concern is that the CIA in particular could attempt to take advantage of this language, and could refuse to cooperate with investigations of its surveillance activities by arguing that those activities were somehow connected to a covert action program. I recognize that this may not be the intent of this provision, but in my fifteen years on the Intelligence Committee I have repeatedly seen senior CIA officials go to striking lengths to resist external oversight of their activities. In my judgment Congress should be making it harder, not easier, for intelligence officials to stymie independent oversight.

For these reasons, it is my intention to object to any unanimous consent request to pass this bill in its current form. I look forward to working with my colleagues to modify or remove this provision

Interesting Tidbits from the House Intelligence Authorization

The House version of next year’s Intelligence Authorization just passed with big numbers, 364-58.

Among the interesting details included in the unclassified version of the bill are the following:

Section 303, 411: Permits the ICIG and the CIA IG to obtain information from state and local governments

The bill changes language permitting the Intelligence Community Inspector General and the CIA IG to obtain information from any federal agency so that they may obtain it from federal, state, or local governments.

Which sort of suggests the ICIG and CIA IG are reviewing — and therefore the IC is sharing information with — state and local governments.

I have no big problem with this for ICIG. But doesn’t this suggest the CIA — a foreign intelligence agency — is doing things at the state level? That I do have a problem with.

Update: Note No One Special’s plausible explanation: that the IGs would be investigating misconduct like DWIs. That makes sense, especially given the heightened focus on Insider Threat Detection.

Section 305: Tells PCLOB to stay the fuck out of covert operations

This adds language to the Privacy and Civil Liberties Oversight Board authorization stating that, “Nothing in [it] shall be construed to authorize the Board, or any agent thereof, to gain access to information regarding an activity covered by” the covert operation section of the National Security Act.

OK then! I guess Congress has put PCLOB in its place!

Remember, PCLOB currently has a mandate that extends only to counterterrorism (though it will probably expand to cyber once the CISA-type bill is passed). It is currently investigating a couple of EO 12333 authorized activities that take place in some loopholed areas of concern. I’m guessing it bumped up against something Congress doesn’t want it to know about, and they’ve gone to the trouble of making that clear in the Intelligence Authorization.

As it happens, Ron Wyden is none too impressed with this section and has threatened to object to unanimous consent of the bill in the Senate over it. Here are his concerns.

Section 305 would limit the authority of the watchdog body known as the Privacy and Civil Liberties Oversight Board.  In my judgment, curtailing the authority of an independent oversight body like this Board would be a clearly unwise decision.  Most Americans who I talk to want intelligence agencies to work to protect them from foreign threats, and they also want those agencies to be subject to strong, independent oversight.  And this provision would undermine some of that oversight.

Section 305 states that the Privacy and Civil Liberties Board shall not have the authority to investigate any covert action program.  This is problematic for two reasons.  First, while this Board’s oversight activities to date have not focused on covert action, it is reasonably easy to envision a covert action program that could have a significant impact on Americans’ privacy and civil liberties – for example, if it included a significant surveillance component.

An even bigger concern is that the CIA in particular could attempt to take advantage of this language, and could refuse to cooperate with investigations of its surveillance activities by arguing that those activities were somehow connected to a covert action program.  I recognize that this may not be the intent of this provision, but in my fifteen years on the Intelligence Committee I have repeatedly seen senior CIA officials go to striking lengths to resist external oversight of their activities.  In my judgment Congress should be making it harder, not easier, for intelligence officials to stymie independent oversight.

Section 306: Requires ODNI to check for spooks sporting EFF stickers

The committee description of this section explains it will require DNI to do more checks on spooks (actually spooks and “sensitive” positions, which isn’t full clearance).

Section 306 directs the Director of National Intelligence (DNI) to develop and implement a plan for eliminating the backlog of overdue periodic investigations, and further requires the DNI to direct each agency to implement a program to provide enhanced security review to individuals determined eligible for access to classified information or eligible to hold a sensitive position.

These enhanced personnel security programs will integrate information relevant and appropriate for determining an individual’s suitability for access to classified information; be conducted at least 2 times every 5 years; and commence not later than 5 years after the date of enactment of the Fiscal Year 2016 Intelligence Authorization Act, or the elimination of the backlog of overdue periodic investigations, whichever occurs first.

Among the things ODNI will use to investigate its spooks are social media, commercial data sources, and credit reports. Among the things it is supposed to track is “change in ideology.” I’m guessing they’ll do special checks for EFF stickers and hoodies, which Snowden is known to have worn without much notice from NSA.

Section 307: Requires DNI to report if telecoms aren’t hoarding your call records

This adds language doing what some versions of USA Freedom tried to do: requiring DNI to report on which "electronic communications service providers" aren't hoarding your call records for at least 18 months. He will have to do a report after 30 days listing all that don't (bizarrely, the bill doesn't specify what size company this covers, which, given the extent of ECSPs in this country, could be daunting), and also report to Congress within 15 days if any of them stop hoarding your records.

Section 313: Requires NIST to develop a measure of cyberdamage

For years, Keith Alexander has been permitted to run around claiming that cyber attacks have represented the greatest transfer of wealth ever (apparently he hasn’t heard of slavery or colonialism). This bill would require NIST to work with FBI and others to come up with a way to quantify the damage from cyberattacks.

Section 401: Requires congressional confirmation of the National Counterintelligence Executive

The National Counterintelligence Executive was pretty negligent in scoping out places like the OPM database that might be prime targets for China. I'm hoping that by requiring congressional confirmation, this position becomes more accountable and potentially more independent.

Section 701: Eliminates reporting that probably shouldn’t be eliminated

James Clapper hates reporting requirements, and with this bill he’d get rid of some more of them, some of which are innocuous.

But I am concerned that the bill would eliminate this report on what outside entities spooks are also working for.

(2) The Director of National Intelligence shall annually submit to the congressional intelligence committees a report describing all outside employment for officers and employees of elements of the intelligence community that was authorized by the head of an element of the intelligence community during the preceding calendar year. Such report shall be submitted each year on the date provided in section 3106 of this title.

We've just seen several conflict-of-interest situations at NSA, and eliminating this report would make it less likely that such conflicts get identified.

The bill would also eliminate these reports.

REPORTS ON NUCLEAR ASPIRATIONS OF NON-STATE ENTITIES.—Section 1055 of the National Defense Authorization Act for Fiscal Year 2010 (50 U.S.C. 2371) is repealed.

REPORTS ON ESPIONAGE BY PEOPLE’S REPUBLIC OF CHINA.—Section 3151 of the National Defense Authorization Act for Fiscal Year 2000 (42 U.S.C. 7383e) is repealed.

Given that both of these issues are of grave concern right now, I do wonder why Clapper doesn’t want to report to Congress on them.

And, then there’s the elimination of this report.

§2659. Report on security vulnerabilities of national security laboratory computers

(a) Report required

Not later than March 1 of each year, the National Counterintelligence Policy Board shall prepare a report on the security vulnerabilities of the computers of the national security laboratories.

(b) Preparation of report

In preparing the report, the National Counterintelligence Policy Board shall establish a so-called “red team” of individuals to perform an operational evaluation of the security vulnerabilities of the computers of one or more national security laboratories, including by direct experimentation. Such individuals shall be selected by the National Counterintelligence Policy Board from among employees of the Department of Defense, the National Security Agency, the Central Intelligence Agency, the Federal Bureau of Investigation, and of other agencies, and may be detailed to the National Counterintelligence Policy Board from such agencies without reimbursement and without interruption or loss of civil service status or privilege.

Clapper's been gunning to get rid of this one for at least 3 years, even as the hysteria about hacking has grown in each of those years. The Department of Energy as a whole, at least, is a weak spot in cybersecurity. Nevertheless, Congress is going to eliminate reporting on this.

Maybe the hacking threat isn’t as bad as Clapper says?
