Working Thread, Apple Response

Apple’s response to the phone back door order is here.

(1) Apple doesn’t say it, but some people at Apple — probably including people who’d have access to this key (because they’d be involved in using it, which would require clearance) — had to have been affected by the OPM hack.

[Screenshot of Apple’s brief]

(2) Remember as you read it that Ted Olson lost his wife on 9/11.

[Screenshot of Apple’s brief]

(3) Several members of Congress — including ranking HPSCI member Adam Schiff — asked questions in hearings about this today.

[Screenshot of Apple’s brief]

(4) Apple hoists Comey on the same petard that James Orenstein did.

[Screenshot of Apple’s brief]

(8) More hoisting on petarding, in this case over DOJ generally and Comey specifically choosing not to seek legislation to modify CALEA.

[Screenshot of Apple’s brief]

(11) Apple beats up FBI for fucking up.

Unfortunately, the FBI, without consulting Apple or reviewing its public guidance regarding iOS, changed the iCloud password associated with one of the attacker’s accounts, foreclosing the possibility of the phone initiating an automatic iCloud back-up of its data to a known Wi-Fi network, see Hanna Decl. Ex. X [Apple Inc., iCloud: Back up your iOS device to iCloud], which could have obviated the need to unlock the phone and thus for the extraordinary order the government now seeks.21 Had the FBI consulted Apple first, this litigation may not have been necessary.

(11) This is awesome, especially coming as it does from Ted Olson, who Comey asked to serve as witness for a key White House meeting after the Stellar Wind hospital confrontation.

[Screenshot of Apple’s brief]

(12) This is the kind of information NSA would treat as classified, for similar reasons.

Although it is difficult to estimate, because it has never been done before, the design, creation, validation, and deployment of the software likely would necessitate six to ten Apple engineers and employees dedicating a very substantial portion of their time for a minimum of two weeks, and likely as many as four weeks. Neuenschwander Decl. ¶ 22. Members of the team would include engineers from Apple’s core operating system group, a quality assurance engineer, a project manager, and either a document writer or a tool writer.

(16) I’ll have to double check, but I think some of this language quotes Orenstein directly.

Congress knows how to impose a duty on third parties to facilitate the government’s decryption of devices. Similarly, it knows exactly how to place limits on what the government can require of telecommunications carriers and also on manufacturers of telephone equipment and handsets. And in CALEA, Congress decided not to require electronic communication service providers, like Apple, to do what the government seeks here. Contrary to the government’s contention that CALEA is inapplicable to this dispute, Congress declared via CALEA that the government cannot dictate to providers of electronic communications services or manufacturers of telecommunications equipment any specific equipment design or software configuration.

(16) This discussion of what Apple is has ramifications for USA Freedom Act, which the House report said only applied to “phone companies” (though the bill says ECSPs).

[Screenshot of Apple’s brief]

(18) Loving Apple wielding Youngstown against FBI.

Nor does Congress lose “its exclusive constitutional authority to make laws necessary and proper to carry out the powers vested by the Constitution” in times of crisis (whether real or imagined). Youngstown Sheet & Tube Co. v. Sawyer, 343 U.S. 579, 588–89 (1952). Because a “decision to rearrange or rewrite [a] statute falls within the legislative, not the judicial prerogative[,]” the All Writs Act cannot possibly be deemed to grant to the courts the extraordinary power the government seeks. Xi v. INS, 298 F.3d 832, 839 (9th Cir. 2002).

(20) Reading this passage on how simple pen register rulings shouldn’t apply to far more intrusive surveillance, I’m reminded that Olson left DOJ in 2004 before (or at about the same time as) Jim Comey et al. applied PRTT to conduct a metadata dragnet of Americans.

In New York Telephone Co., the district court compelled the company to install a simple pen register device (designed to record dialed numbers) on two telephones where there was “probable cause to believe that the [c]ompany’s facilities were being employed to facilitate a criminal enterprise on a continuing basis.” 434 U.S. at 174. The Supreme Court held that the order was a proper writ under the Act, because it was consistent with Congress’s intent to compel third parties to assist the government in the use of surveillance devices, and it satisfied a three-part test imposed by the Court.

(22) This is one thing that particularly pissed me off about the application of New York Telephone to this case: there’s no ongoing use of Apple’s phone.

This case is nothing like Hall and Videotapes, where the government sought assistance effectuating an arrest warrant to halt ongoing criminal activity, since any criminal activity linked to the phone at issue here ended more than two months ago when the terrorists were killed.

(24) I think this is meant to be a polite way of calling DOJ’s claims fucking stupid (Jonathan Zdziarski has written about how any use of this back door in a criminal case would require testimony about its forensics).

Use of the software in criminal prosecutions only exacerbates the risk of disclosure, given that criminal defendants will likely challenge its reliability. See Fed. R. Evid. 702 (listing requirements of expert testimony, including that “testimony [be] the product of reliable principles and methods” and “the expert has reliably applied the principles and methods to the facts of the case,” all of which a defendant is entitled to challenge); see also United States v. Budziak, 697 F.3d 1105, 1111–13 (9th Cir. 2012) (vacating order denying discovery of FBI software); State v. Underdahl, 767 N.W.2d 677, 684–86 (Minn. 2009) (upholding order compelling discovery of breathalyzer source code). The government’s suggestion that Apple can destroy the software has clearly not been thought through, given that it would jeopardize criminal cases. See United States v. Cooper, 983 F.2d 928, 931–32 (9th Cir. 1993) (government’s bad-faith failure to preserve laboratory equipment seized from defendants violated due process, and appropriate remedy was dismissal of indictment, rather than suppression of evidence). [my emphasis]

(25) “If you outlaw encryption the only people with encryption will be outlaws.”

And in the meantime, nimble and technologically savvy criminals will continue to use other encryption technologies, while the law-abiding public endures these threats to their security and personal liberties—an especially perverse form of unilateral disarmament in the war on terror and crime.

(26) The parade of horribles that a government might be able to coerce is unsurprisingly well-chosen.

For example, under the same legal theories advocated by the government here, the government could argue that it should be permitted to force citizens to do all manner of things “necessary” to assist it in enforcing the laws, like compelling a pharmaceutical company against its will to produce drugs needed to carry out a lethal injection in furtherance of a lawfully issued death warrant,25 or requiring a journalist to plant a false story in order to help lure out a fugitive, or forcing a software company to insert malicious code in its autoupdate process that makes it easier for the government to conduct court-ordered surveillance. Indeed, under the government’s formulation, any party whose assistance is deemed “necessary” by the government falls within the ambit of the All Writs Act and can be compelled to do anything the government needs to effectuate a lawful court order. While these sweeping powers might be nice to have from the government’s perspective, they simply are not authorized by law and would violate the Constitution.

(30) “Say, why can’t NSA do this for you?”

Moreover, the government has not made any showing that it sought or received technical assistance from other federal agencies with expertise in digital forensics, which assistance might obviate the need to conscript Apple to create the back door it now seeks.

(33) Love the way Apple points out what I and others have noted: this phone doesn’t contain valuable information, and if it does, Apple probably couldn’t get at it.

Apple does not question the government’s legitimate and worthy interest in investigating and prosecuting terrorists, but here the government has produced nothing more than speculation that this iPhone might contain potentially relevant information.26 Hanna Decl. Ex. H [Comey, Follow This Lead] (“Maybe the phone holds the clue to finding more terrorists. Maybe it doesn’t.”). It is well known that terrorists and other criminals use highly sophisticated encryption techniques and readily available software applications, making it likely that any information on the phone lies behind several other layers of non-Apple encryption. See Hanna Decl. Ex. E [Coker, Tech Savvy] (noting that the Islamic State has issued to its members a ranking of the 33 most secure communications applications, and “has urged its followers to make use of [one app’s] capability to host encrypted group chats”).

26 If the government did have any leads on additional suspects, it is inconceivable that it would have filed pleadings on the public record, blogged, and issued press releases discussing the details of the situation, thereby thwarting its own efforts to apprehend the criminals. See Douglas Oil Co. of Cal. v. Petrol Stops Nw., 441 U.S. 211, 218-19 (1979) (“We consistently have recognized that the proper functioning of our grand jury system depends upon the secrecy of grand jury proceedings. . . . [I]f preindictment proceedings were made public, many prospective witnesses would be hesitant to come forward voluntarily, knowing that those against whom they testify would be aware of that testimony. . . . There also would be the risk that those about to be indicted would flee, or would try to influence individual grand jurors to vote against indictment.”).

(35) After 35 pages of thoroughgoing beating, Apple makes nice.

Apple has great respect for the professionals at the Department of Justice and FBI, and it believes their intentions are good.

(PDF 56) Really looking forward to DOJ’s response to the repeated examples of this point, which is likely to be, “no need to create logs because there will never be a trial because the guy is dead.” Which, of course, will make it clear this phone won’t really be useful.

Moreover, even if Apple were able to truly destroy the actual operating system and the underlying code (which I believe to be an unrealistic proposition), it would presumably need to maintain the records and logs of the processes it used to create, validate, and deploy GovtOS in case Apple’s methods ever need to be defended, for example in court. The government, or anyone else, could use such records and logs as a roadmap to recreate Apple’s methodology, even if the operating system and underlying code no longer exist.

(PDF 62) This is really damning. FBI had contacted Apple before they changed the iCloud password.
[Screenshot of Apple’s filing]

(PDF 62) Wow. They did not ask for the iCloud data on the phone until January 22, 50 days after seizing the phone and 7 days before the warrant expired.

[Screenshot of Apple’s filing]

What Claims Did the Intelligence Community Make about the Paris Attack to Get the White House to Change on Encryption?

I’m going to do a series of posts laying out the timeline behind the Administration’s changed approach to encryption. In this, I’d like to make a point about when the National Security Council adopted a “decision memo” more aggressively seeking to bypass encryption. Bloomberg reported on the memo last week, in the wake of the FBI’s demand that Apple help it brute force Syed Rezwan Farook’s work phone.

But note the date: The meeting at which the memo was adopted was convened “around Thanksgiving.”

Silicon Valley celebrated last fall when the White House revealed it would not seek legislation forcing technology makers to install “backdoors” in their software — secret listening posts where investigators could pierce the veil of secrecy on users’ encrypted data, from text messages to video chats. But while the companies may have thought that was the final word, in fact the government was working on a Plan B.

In a secret meeting convened by the White House around Thanksgiving, senior national security officials ordered agencies across the U.S. government to find ways to counter encryption software and gain access to the most heavily protected user data on the most secure consumer devices, including Apple Inc.’s iPhone, the marquee product of one of America’s most valuable companies, according to two people familiar with the decision.

The approach was formalized in a confidential National Security Council “decision memo,” tasking government agencies with developing encryption workarounds, estimating additional budgets and identifying laws that may need to be changed to counter what FBI Director James Comey calls the “going dark” problem: investigators being unable to access the contents of encrypted data stored on mobile devices or traveling across the Internet. Details of the memo reveal that, in private, the government was honing a sharper edge to its relationship with Silicon Valley alongside more public signs of rapprochement. [my emphasis]

That is, the meeting was convened in the wake of the November 13 ISIS attack on Paris.

We know that last August, Bob Litt had recommended keeping options open until such time as a terrorist attack presented the opportunity to revisit the issue and demand that companies back door encryption.

Privately, law enforcement officials have acknowledged that prospects for congressional action this year are remote. Although “the legislative environment is very hostile today,” the intelligence community’s top lawyer, Robert S. Litt, said to colleagues in an August e-mail, which was obtained by The Post, “it could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement.”

There is value, he said, in “keeping our options open for such a situation.”

Litt was commenting on a draft paper prepared by National Security Council staff members in July, which also was obtained by The Post, that analyzed several options. They included explicitly rejecting a legislative mandate, deferring legislation and remaining undecided while discussions continue.

It appears that is precisely what happened — that the intelligence community, in the wake of a big attack on Paris, went to the White House and convinced them to change their approach.

So I want to know what claims the intelligence community made about the use of encryption in the attack that convinced the White House to change approach. Because there is nothing in the public record that indicates encryption was important at all.

It is true that a lot of ISIS associates were using Telegram; shortly after the attack Telegram shut down a bunch of channels they were using. But reportedly Telegram’s encryption would be easy for the NSA to break. The difficulty with Telegram — which the IC should consider seriously before they make Apple back door its products — is that its offshore location probably made it harder for our counterterrorism analysts to get the metadata.

It is also true that an ISIS recruit whom French authorities had interrogated during the summer (and who warned them very specifically about attacks on sporting events and concerts) had been given an encryption key on a thumb drive.

But it’s also true the phone recovered after the attack — which the attackers used to communicate during the attack — was not encrypted. It’s true, too, that French and Belgian authorities knew just about every known participant in the attack, especially the ringleader. From reports, it sounds like operational security — the use of a series of burner phones — was more critical to his ability to move unnoticed through Europe. There are also reports that the authorities had a difficult time translating the dialect of (probably) Berber the attackers used.

From what we know, though, encryption is not the reason authorities failed to prevent the French attack. And a lot of other tools that are designed to identify potential attacks — like the metadata dragnet — failed.

I hate to be cynical (though comments like Litt’s — plus the way the IC used a bogus terrorist threat in 2004 to get the torture and Internet dragnet programs reauthorized — invite such cynicism). But it sure looks like the IC failed to prevent the November attack, and immediately used their own (human, unavoidable) failure to demand a new approach to encryption.

Update: In testimony before the House Judiciary Committee today, Microsoft General Counsel Brad Smith repeated a claim MSFT witnesses have made before: they provided Parisian law enforcement email from the Paris attackers within 45 minutes. That implies, of course, that the data was accessible under PRISM and not encrypted.

Reuters Asks Even Stupider Questions about Apple-FBI Fight than Pew

In my post on Pew’s polling on whether Apple should have to write a custom version of its operating system so FBI can brute force the third phone, I gave Pew credit for several aspects of its question, but suggested the result might be different if Pew had reminded the people the FBI has already solved the San Bernardino attack.

Imagine if Pew called 1000 people and asked, “would you support requiring Apple to make iPhones less secure so the FBI could get information on a crime the FBI has already solved?”
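To see why the custom OS matters, it helps to spell out the arithmetic of brute forcing. The order asks Apple to disable the auto-wipe and escalating retry delays; public reporting on the case put the remaining hardware-enforced cost at roughly 80 milliseconds per passcode guess. A minimal sketch, with that figure as an illustrative assumption rather than anything from the polling:

```python
# Rough worst-case brute-force times for numeric iPhone passcodes, assuming
# the software protections (auto-wipe, escalating delays) have been removed
# and only an assumed ~80 ms hardware cost per guess remains.

ATTEMPT_SECONDS = 0.08  # assumed per-guess cost; illustrative, not an Apple spec

def worst_case_seconds(digits: int) -> float:
    """Time to try every possible numeric passcode of the given length."""
    return (10 ** digits) * ATTEMPT_SECONDS

for digits in (4, 6):
    total = worst_case_seconds(digits)
    print(f"{digits}-digit passcode: {total / 3600:.1f} hours worst case")
```

At that rate a 4-digit passcode falls in well under an hour and even a 6-digit one in about a day, which is the whole reason the retry limits exist and the whole reason the FBI wants them gone.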

As I said, at least Pew’s question was fair.

Not so Reuters’ questions on the same topic. After asking a bunch of questions to which three-quarters of respondents said they would not be willing to give up their own privacy to ward against terrorism or hacking, Reuters then asked this question:

Apple is opposing a court order to unlock a smart phone that was used by one of the shooters in the San Bernardino attack. Apple is concerned that if it helps the FBI this time, it will be forced to help the government in future cases that may not be linked to national security, opening the door for hackers and potential future

Do you agree or disagree with Apple’s decision to oppose the court order?

While Reuters explains why Apple opposes the order — because it will be [in fact, already has been] asked to help break into more phones that have nothing to do with terrorism, creating vulnerabilities for hackers — the wording of the question could easily be understood to imply that Syed Rezwan Farook’s phone “was used [] in the San Bernardino attack.” It’s not clear Farook even used the phone after November, two days before his attack. And to the extent Farook and his wife used phones during the attack — as implied by the question — they are believed to be the phones they tried unsuccessfully to destroy.

Yet, even with this problematically framed question, 46% of respondents (in an online poll, which likely skews toward tech facility) supported Apple’s actions.

There’s a problem, too, with the only question for which a plurality supported the FBI’s snooping, a graph of which Reuters highlighted in its story.

The government should be able to look at data on Americans’ phones in order to protect against terror threats.

There are cases where investigators find information on a smart phone that helps prevent follow-on attacks (it happened in Paris with a phone that was not encrypted). Border searches (which I admittedly believe to be one of the real reasons the FBI objects to default encryption) might also prevent terror attacks. But more often, we’re talking about investigating crimes deemed to be terrorism after the fact (or, far, far more often, solving drug crimes).

Nothing the FBI could do with the data on Farook’s work phone will prevent the deaths of the 14 people he already killed. There are other kinds of surveillance far better suited to doing that.

This Apple Fight Is (Partly) about Solving Car Accidents

I am going to spend my day laying out what a cynical man FBI Director Jim Comey is — from setting up a victims’ brief against Apple even before the government served Apple here, to this transparently bogus garbage post at Lawfare.

But first I wanted to reemphasize a detail I’ve noted before. On February 9, at a time when the FBI already knew how it was going to go after Apple, Jim Comey said this in a hearing before the Senate Intelligence Committee:

I’d say this problem we call going dark, which as Director Clapper mentioned, is the growing use of encryption, both to lock devices when they sit there and to cover communications as they move over fiber optic cables is actually overwhelmingly affecting law enforcement. Because it affects cops and prosecutors and sheriffs and detectives trying to make murder cases, car accident cases, kidnapping cases, drug cases. It has an impact on our national security work, but overwhelmingly this is a problem that local law enforcement sees.

Even before he served Apple here, Comey made it clear this was about law enforcement, not terrorism cases, his cynical invocation of the San Bernardino victims notwithstanding.

And not just law enforcement: “car accidents.”

Since it got its All Writs Act order, the FBI has said this Apple request is a one-off, just for this terrorism case whose perpetrators it already knows. But at a time when it already knew it was going to seek an AWA order, Jim Comey was more frank. This is about car accidents. Car accidents, murder, kidnapping, and drugs (the last All Writs Act request was about drugs, in a case where the government had enough evidence to get the guy to plead guilty anyway, in case there was any doubt they would keep demanding AWA orders going forward).

Car accidents.

District Attorneys Use Spying as Cover To Demand a Law Enforcement Back Door

In response to a question Senate Intelligence Committee Chair Richard Burr posed during his committee’s Global Threat hearing yesterday, Jim Comey admitted that “going dark” is “overwhelmingly … a problem that local law enforcement sees” as they try to prosecute even things as mundane as a car accident.

Burr: Can you, for the American people, set a percentage of how much of that is terrorism and how much of that fear is law enforcement and prosecutions that take place in every town in America every day?

Comey: Yeah I’d say this problem we call going dark, which as Director Clapper mentioned, is the growing use of encryption, both to lock devices when they sit there and to cover communications as they move over fiber optic cables is actually overwhelmingly affecting law enforcement. Because it affects cops and prosecutors and sheriffs and detectives trying to make murder cases, car accident cases, kidnapping cases, drug cases. It has an impact on our national security work, but overwhelmingly this is a problem that local law enforcement sees.

Much later in the hearing Burr — whose committee oversees the intelligence but not the law enforcement function of FBI, which functions are overseen by the Senate Judiciary Committee — returned to the issue of encryption. Indeed, he seemed to back Comey’s point — that local law enforcement is facing a bigger problem with encryption than intelligence agencies — by describing District Attorneys from big cities and small towns complaining to him about encryption.

I’ve had more District Attorneys come to me that I have the individuals at this table. The District Attorneys have come to me because they’re beginning to get to a situation where they can’t prosecute cases. This is town by town, city by city, county by county, and state by state. And it ranges from Cy Vance in New York to a rural town of 2,000 in North Carolina.

Of course, the needs and concerns of these District Attorneys are the Senate Judiciary Committee’s job to oversee, not Burr’s. But he managed to make it his issue by calling those local law enforcement officials “those who complete the complement of our intelligence community” in promising to take up the issue (though he did make clear he was not speaking for the committee in his determination on the issue).

One of the responsibilities of this committee is to make sure that those of you at at the table and those that comp — complete the complement of our intelligence community have the tools through how we authorize that you need. [sic]

Burr raised ISIS wannabes and earlier in the hearing Comey revealed the FBI still hadn’t been able to crack one of a number of phones owned by the perpetrators of the San Bernardino attack. And it is important for the FBI to understand whether the San Bernardino attack was directed by people in Saudi Arabia or Pakistan that Tashfeen Malik associated with before coming to this country planning to engage in Jihad.

But only an hour before, Jim Comey had explained that the real urgency here is to investigate drug cases and car accident cases, not that terrorist attack.

The balance between security, intelligence collection, and law enforcement is going to look different if you’re weighing drug investigations against the personal privacy of millions than if you’re discussing terrorist communications, largely behind closed doors.

Yet Richard Burr is not above pretending this about terrorism when it’s really about local law enforcement.

More Evidence Secret “Tweaks” To Section 702 Coming

Way at the end of yesterday’s Senate Intelligence Committee Global Threats hearing, Tom Cotton asked his second leading question inviting an intelligence agency head to ask for surveillance authorities, this time asking Admiral Mike Rogers whether he still wanted Section 702. (The first invited Jim Comey to ask for access to Electronic Communications Transactions Records with National Security Letters, as Chuck Grassley had before; Comey was just as disingenuous in his response as the last time he asked.)

Curiously, Cotton offered Rogers the opportunity to ask for Section 702 to be passed unchanged. Cotton noted that in 2012, James Clapper had asked for a straight reauthorization of Section 702.

Do you believe that Congress should pass a straight reauthorization of Section 702?

But Rogers (as he often does) didn’t answer that question. Instead, he simply asserted that he needed it.

I do believe we need to continue 702.

At this point, SSCI Chair Richard Burr piped up and noted the committee would soon start the preparation process for passing Section 702, “from the standpoint of the education that we need to do in educating and having Admiral Rogers bring us up to speed on the usefulness and any tweaks that may have to be made.”

This seems to parallel what happened in the House Judiciary Committee, where it is clear some discussion about the certification process occurred (see this post and this post).

Note this discussion comes in the wake of a description of some of the changes made in last year’s certification in this year’s PCLOB status report. That report notes that last year’s certification process approved the following changes:

  • NSA added a requirement to explain a foreign intelligence justification in targeting decisions, without fully implementing a recommendation to adopt criteria “for determining the expected foreign intelligence value of a particular target.” NSA is also integrating review of written justifications into its auditing process.
  • FBI minimization procedures were revised to reflect how often non-national security investigators could search 702-collected data, and added new limits on how 702 data could be used.
  • NSA and CIA write justifications for conducting back door searches on US person data collected under Section 702, except for CIA’s still largely oversight-free searches of 702-collected metadata.
  • NSA and CIA twice (in January and May) provided FISC with a random sampling of its tasking and US person searches, which the court deemed satisfactory in its certification approval.
  • The government submitted a “Summary of Notable Section 702 Requirements” covering the rules governing the program, though this summary was neither comprehensive nor integrated into the FISC’s reauthorization.

As the status report implicitly notes, the government has released minimization procedures for all four agencies using Section 702 (in addition to NSA, CIA, and FBI, NCTC has minimization procedures), but it did so by releasing the now-outdated 2014 minimization procedures as the 2015 ones were being authorized. At some point, I expect we’ll see DEA minimization procedures, given that the shutdown of its own dragnet would lead it to rely more on NSA ones, but that’s just a wildarseguess.

What Secrets Are the Spooks Telling HJC about Section 702?

There’s a paper that has been making waves, claiming it has found a formula to debunk conspiracies based on the likelihood that, if they were real, they would have already been leaked. Never mind that people have already found fault with the math; the study has another glaring flaw. It treats the PRISM program — and not, say, the phone dragnet — as one of its “true” unknown conspiracies.

PRISM — one part of the surveillance program authorized by Section 702 of the FISA Amendments Act — was remarkable in that it was legislated in public. There are certainly parts of Section 702 that were not widely known, such as the details about the “upstream” collection from telecom switches, but even that got explained to us back in 2006 by Mark Klein. There are even details of how the PRISM collection worked that weren’t public — its reliance on network mapping, the full list of participants. There are details that were exposed, such as that the government was doing back door searches on content collected under it, but even those were logical guesses based on the public record of the legislative debates.

Which is why it is so remarkable that — as I noted here and here — House Judiciary Committee Chair Bob Goodlatte has scheduled a classified hearing to cover the program that has been the subject of open hearings going back to at least 2008.

The hearing is taking place as we speak with the following witnesses.

  • Mr. Robert S. Litt
    General Counsel
    Office of the Director of National Intelligence
  • Mr. Jon Darby
    Deputy Director for Analysis and Production, Signals Intelligence Directorate
    National Security Agency
  • Mr. Stuart J. Evans
    Deputy Assistant Attorney General for Intelligence, National Security Division
    U.S. Department of Justice
  • Mr. Michael B. Steinbach
    Assistant Director for Counterterrorism
    Federal Bureau of Investigation

This suggests either that there is something about the program we don’t already know, or that the government is asking for changes to the program that would extend beyond the basic concept of spying on foreigners in the US using US provider help.

I guess we’re stuck wildarseguessing what those big new secrets are, given the Intelligence Community’s newfound secrecy about this program.

Some observations about the witnesses. First, between Litt and Evans, these are the lawyers that would oversee the yearly certification applications to FISC. That suggests the government may, in fact, be asking for new authorities or new interpretations of authorities.

Darby would be in charge of the technical side of this program. Since PRISM as it currently exists is so (technologically) simple, that suggests the new secrets may involve a new application of what the government will request from providers. This might be an expansion of upstream, possibly to bring it closer to XKeyscore deployment overseas, possibly to better exploit Tor. Remember, too, that under USA Freedom Act, Congress authorized the use of data collected improperly, provided that it adheres to the new minimization procedures imposed by the FISC. This was almost certainly another upstream collection, which means there’s likely to be some exotic new upstream application that has caused the government some problems of late.

Note that the sole FBI witness oversees counterterrorism, not cybersecurity. That’s interesting because it would support my suspicion that the government is achieving its cybersecurity collection via other means now, and because it suggests any new programs may fall under the counterterrorism function. Remember, the NatSec bosses, including Jim Comey, just went to Silicon Valley to ask for help applying algorithms to identify terrorism content. Remember, too, that such applications would have been useless to prevent the San Bernardino attack had they been focused on public social media content. So it may be that NSA and FBI want to apply algorithms identifying radicalizers to private content.

Finally, and critically, remember the Apple debate. In a public court case, Apple and the FBI are fighting over whether Apple can be required to decrypt its customers’ smart device communications. The government has argued this is within the legal notion of “assistance to law enforcement.” Apple disagrees. I think it quite possible that the FBI would try to ask for decryption help to be included under the definition of “assistance” under Section 702. Significantly, these witnesses are generally those (including Bob Litt and FBI counterterrorism) who would champion such an interpretation.

How the Purpose of the Data Sharing Portal Changed Over the OmniCISA Debate

Last year, House Homeland Security Chair Michael McCaul offered up his rear-end to be handed back to him in negotiations leading to the passage of OmniCISA on last year’s omnibus. McCaul was probably the only person who could have objected to such a legislative approach because it deprived him of weighing in as a conferee. While he made noise about doing so, ultimately he capitulated and let the bill go through — and be made less privacy protective — as part of the must-pass budget bill.

Which is why I was so amused by McCaul’s op-ed last week, which included passage of OmniCISA among the things he has done to make the country safer from hacks. Here was a guy, holding his rear-end in his hands, plaintively denying it by claiming that OmniCISA reinforced his turf.

I was adamant that the recently-enacted Cybersecurity Act include key provisions of my legislation H.R. 1731, the National Cybersecurity Protection Advancement Act. With this law, we now have the ability to be more efficient while protecting both our nation’s public and private networks.

With these new cybersecurity authorities signed into law, the Department of Homeland Security (DHS) will become the sole portal for companies to voluntarily share information with the federal government, while preventing the military and NSA from taking on this role in the future.

With this strengthened information-sharing portal, it is critical that we provide incentives to private companies who voluntarily share known cyber threat indicators with DHS. This is why we included liability protections in the new law to ensure all participants are shielded from the reality of unfounded litigation.

While security is vital, privacy must always be a guiding principle. Before companies can share information with the government, the law requires them to review the information and remove any personally identifiable information (PII) unrelated to cyber threats. Furthermore, the law tasks DHS and the Department of Justice (DOJ) to jointly develop the privacy procedures, which will be informed by the robust existing DHS privacy protocols for information sharing.

[snip]

Given DHS’ clearly defined lead role for cyber information sharing in the Cybersecurity Act of 2015, my Committee and others will hold regular oversight hearings to make certain there is effective implementation of these authorities and to ensure American’s privacy and civil liberties are properly protected.

It is true that under OmniCISA, DHS is currently (that is, on February 1) the sole portal for cyber-sharing. It’s also true that OmniCISA added DHS, along with DOJ, to those in charge of developing privacy protocols. There are also other network defense measures OmniCISA tasked DHS with. But with the move of the clearances function to DOD earlier in January (along with the budget OPM had been asking for to do the job right but never getting), the government has apparently adopted a preference for moving its sensitive functions to networks DOD (that is, NSA) will guard rather than DHS. McCaul’s bold claims really make me wonder about the bureaucratic battles that may well be going on as we speak.

Here’s how I view what actually happened with the passage of OmniCISA. My view is heavily influenced by these three Susan Hennessey posts, in which she tried to convince readers that DHS’ previously existing portal ensured privacy would be protected, but by the end seemed to concede it might not work out that way.

  1. CISA in Context: Privacy Protections and the Portal
  2. CISA in Context: The Voluntary Sharing Model and that “Other” Portal
  3. CISA in Context: Government Use and What Really Matters for Civil Liberties

Underlying the entire OmniCISA passage is a question: why was it necessary? Boosters explained that corporations wouldn’t share willingly without all kinds of immunities, which is surely true, but the same boosters never explained why an info-sharing system was so important when experts were saying it was way down the list of things that could make us safer and similar info-sharing has proven not to be a silver bullet. Similarly, boosters did not explain the value of a system that not only did nothing to require that cyber information shared with corporations be used to protect their networks, but, by giving them immunity (in final passage) even if they did nothing with the information and then got pwned, made it less likely they would use the data. Finally, boosters ignored the ways in which OmniCISA not only creates privacy risks, but also opens new potential vectors of attack or counterintelligence collection for our adversaries.

So why was it necessary, especially given the many obvious ways in which it was not optimally designed to encourage monitoring, sharing, and implementation from network owners? Why was it necessary, aside from the fact that our Congress has become completely unable to demand corporations do anything in the national interest and there was urgency to pass something, anything, no matter how stinky?

Indeed, why was legislation doing anything except creating some but not all these immunities necessary if, as former NSA lawyer Hennessey claimed, the portal laid out in OmniCISA in fact got up and running on October 31, between the time CISA passed the Senate and the time it got weakened significantly and rammed through Congress on December 18?

At long last DHS has publically unveiled its new CISA-sanctioned, civil-liberties-intruding, all-your-personal-data-grabbing, information-sharing uber vacuum. Well, actually, it did so three months ago, right around the time these commentators were speculating about what the system would look like. Yet even as the cleverly-labeled OmniCISA passed into law last month, virtually none of the subsequent commentary took account of the small but important fact that the DHS information sharing portal has been up and running for months.

Hennessey appeared to think this argument very clever, suggesting that “virtually no” privacy advocates (throughout her series she ignored that opposition came from privacy and security advocates) had talked about DHS’ existing portal. She must not have Googled that claim, because if she had, it would have been clear that privacy (and security) people had discussed DHS’ portal back in August, before the Senate finalized CISA.

Back in July, Al Franken took the comedic step of sending a letter to DHS basically asking, “Say, you’re already running the portal that is being legislated in CISA. What do you think of the legislation in its current form?” And DHS wrote back and noted that the portal being laid out in CISA (and the other sharing permitted under the bill) was different in several key ways from what it was already implementing.

Its concerns included:

  • Because companies could share with other agencies, the bill permitted sharing content with law enforcement. “The authorization to share cyber threat indicators and defensive measures with ‘any other entity or the Federal Government,’ ‘notwithstanding any other provision of law’ could sweep away important privacy protections, particularly the provisions in the Stored Communications Act limiting the disclosure of the content of electronic communications to the government by certain providers.”
  • The bill permitted companies to share more information than that permitted under the existing portal. “Unlike the President’s proposal, the Senate bill includes ‘any other attribute of a cybersecurity threat’ within its definition of cyber threat indicator.”
  • Because the bill required sharing in real time rather than in near-real time, it would mean DHS could not do all the privacy scrubs it was currently doing. “If DHS distributes information that is not scrubbed for privacy concerns, DHS would fail to mitigate and in fact would contribute to the compromise of personally identifiable information by spreading it further.”
  • Sharing in real rather than near-real time also means participants might get overloaded with extraneous information (something that has made existing info-sharing regimes ineffective). “If there is no layer of screening for accuracy, DHS’ customers may receive large amounts of information with dubious value, and may not have the capability to meaningfully digest that information.”
  • The bill put the Attorney General, not DHS, in charge of setting the rules for the portal. “Since sharing cyber threat information with the private sector is primarily within DHS’s mission space, DHS should author the section 3 procedures, in coordination with other entities.”
  • The 90-day implementation timeline was too ambitious; according to DHS, the bill should provide for a 180-day implementation. “The 90-day timeline for DHS’s deployment of a process and capability to receive cyber threat indicators is too ambitious, in light of the need to fully evaluate the requirements pertaining to that capability once legislation passes and build and deploy the technology.”

As noted, that exchange took place in July (most responses to it appeared in August). While a number of amendments addressing DHS’ concerns were proposed in the Senate, I’m aware of only two that got integrated into the bill that passed: an Einstein-related request (Einstein being the federal network monitoring program), and the addition of DHS, along with the Attorney General, to the rules-making function. McCaul mentioned both of those things, along with hailing the “more efficient” sharing that may refer to real-time versus near-real-time sharing, in his op-ed.

Not only did the Senate fail to respond to most of the concerns DHS raised; as I noted in another post on the portal, it also gave other agencies veto power over DHS’ scrub (this was sort of the quid pro quo for including DHS in the rule-making process, and it was how the Ranking Member on the Senate Homeland Security Committee, Tom Carper, got co-opted on the bill), which exacerbated the real-time versus near-real-time sharing problem.

All that happened by October 27, days before the portal based on Obama’s executive order got fully rolled out. The Senate literally passed changes to the portal as DHS was running it days before it went into full operation.

Meanwhile, one more thing happened: as mandated by the Executive Order underlying the DHS portal, the Privacy and Civil Liberties Oversight Board helped DHS set up its privacy measures. This is, as I understand it, the report Hennessey cites when pointing to all the privacy protections that will make OmniCISA’s elimination of warrant requirements safe.

Helpfully, DHS has released its Privacy Impact Assessment of the AIS portal which provides important technical and structural context. To summarize, the AIS portal ingests and disseminates indicators using—acronym alert!—the Structured Threat Information eXchange (STIX) and Trusted Automated eXchange of Indicator Information (TAXII). Generally speaking, STIX is a standardized language for reporting threat information and TAXII is a standardized method of communicating that information. The technology has many interesting elements worth exploring, but the critical point for legal and privacy analysis is that by setting the STIX TAXII fields in the portal, DHS controls exactly which information can be submitted to the government. If an entity attempts to share information not within the designated portal fields, the data is automatically deleted before reaching DHS.
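The gatekeeping mechanism described in that passage, where the portal's configured fields determine what can reach the government at all, amounts to whitelist filtering on submission fields. A minimal sketch, using hypothetical field names rather than actual STIX/TAXII attributes:

```python
# Sketch of field-whitelist filtering as described above: the portal
# accepts only fields matching its configured schema and silently drops
# everything else before it reaches the government.
# Field names here are hypothetical, not real STIX/TAXII attributes.

ALLOWED_FIELDS = {"indicator_type", "ip_address", "observed_time"}

def filter_submission(submission: dict) -> dict:
    """Keep only whitelisted fields; anything outside the schema is dropped."""
    return {k: v for k, v in submission.items() if k in ALLOWED_FIELDS}

incoming = {
    "indicator_type": "malicious-ip",
    "ip_address": "198.51.100.7",
    "observed_time": "2016-01-15T00:00:00Z",
    "customer_email": "user@example.com",  # PII outside the schema
}

print(filter_submission(incoming))
```

The design point Hennessey leans on is that this filtering is structural: whoever sets `ALLOWED_FIELDS` controls what the government can receive, which is why control of the portal's field configuration carries so much weight in the privacy debate.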

In other words, the scenario is precisely the reverse of what Hennessey describes: DHS set up a portal, and then the Senate tried to change it in many ways that DHS said, before passage, would weaken the privacy protections in place.

Now, Hennessey does acknowledge some of the ways OmniCISA weakened privacy provisions that were in DHS’ portal. She notes, for example, that the Senate added a veto on DHS’ privacy scrubs, but suggests that, because DHS controls the technical parameters, it will be able to overcome this veto.

At first read, this language would appear to give other federal agencies, including DOD and ODNI, veto power over any privacy protections DHS is unable to automate in real-time. That may be true, but under the statute and in practice DHS controls AIS; specifically, it sets the STIX TAXXI fields. Therefore, DHS holds the ultimate trump card because if that agency believes additional privacy protections that delay real-time receipt are required and is unable to convince fellow federal entities, then DHS is empowered to simply refuse to take in the information in the first place. This operates as a rather elegant check and balance system. DHS cannot arbitrarily impose delays, because it must obtain the consent of other agencies, if other agencies are not reasonable DHS can cut off the information, but DHS must be judicious in exercising that option because it also loses the value of the data in question.

This seems to flip Youngstown on its head, suggesting the characteristics of the portal laid out in an executive order and changed in legislation take precedence over the legislation.

Moreover, while Hennessey does discuss the threat of the other portal — one of the features added in the OmniCISA round with no debate — she puts it in a different post from her discussion of DHS’ purported control over technical intake data (and somehow portrays it as having “emerged from conference with the new possibility of an alternative portal” even though no actual conference took place, which is why McCaul is stuck writing plaintive op-eds while holding his rear-end). This means that, after writing a post talking about how DHS would have the final say on protecting privacy by controlling intake, Hennessey wrote another post that suggested DHS would have to “get it right” or the President would order up a second portal without all the privacy protections that DHS’ portal had in the first place (and which it had already said would be weakened by CISA).

Such a portal would, of course, be subject to all statutory limitations and obligations, including codified privacy protections. But the devil is in the details here; specifically, the details coded into the sharing portal itself. CISA does not obligate that the technical specifications for a future portal be as protective as AIS. This means that it is not just the federal government and private companies who have a stake in DHS getting it right, but privacy advocates as well. The balance of CISA is indeed delicate.

Elsewhere, Hennessey admits that many in government think DHS is a basket-case agency (an opinion I’m not necessarily in disagreement with). So it’s unclear how DHS would retain any leverage over the veto given that exercising such leverage would result in DHS losing this portfolio altogether. There was a portal designed with privacy protections, CISA undermined those protections, and then OmniCISA created yet more bureaucratic leverage that would force DHS to eliminate its privacy protections to keep the overall portfolio.

Plus, OmniCISA did two more things. First, as noted, back in July DHS said it would need 180 days to fully tweak its existing portal to match the one ordered up in CISA. CISA and OmniCISA didn’t care: the bill and the law retained the 90-day turnaround. But in addition, OmniCISA required that DHS and the Attorney General develop their interim set of guidelines within 60 days (a period which, as it happened, included the Christmas holiday). That 60-day deadline falls around February 16. The President can’t declare the need for a second portal until after the DHS one gets certified, which has a 90-day deadline (so March 18). But he can give 30 days’ notice beforehand that that’s going to happen. In other words, the President can determine, after seeing what DHS and AG Lynch come up with in a few weeks, that it will be too privacy restrictive and tell Congress FBI needs to have its own portal, something that did not and would not have passed under regular legislative order.

Finally, as I noted, PCLOB had been involved in setting up the privacy parameters for DHS’ portal, including the report that Hennessey points to as the basis for comfort about OmniCISA’s privacy risk. In final passage of OmniCISA, a PCLOB review of the privacy impact of OmniCISA, which had been included in every single version of the bill, got eliminated.

Hennessey’s seeming admission that this is the likely outcome emerges over the course of her posts as well. In her first post, she claims,

From a practical standpoint, the government does not want any information—PII or otherwise—that is not necessary to describe or identify a threat. Such information is operationally useless and costly to store and properly handle.

But in explaining the reason for a second portal, she notes that there is (at least) one agency included in OmniCISA sharing that does want more information: FBI.

[T]here are those who fear that awarding liability protection exclusively to sharing through DHS might result in the FBI not getting information critical to the investigation of computer crimes. The merits of the argument are contested but the overall intention of CISA is certainly not to result in the FBI getting less cyber threat information. Hence, the fix.

[snip]

AIS is not configured to receive the full scope of cyber threat information that might be necessary to the investigation of a crime. And while CISA expressly permits sharing with law enforcement – consistent with all applicable laws – for the purposes of opening an investigation, the worry here is that companies that are the victims of hacks will share those threat indicators accepted by AIS, but not undertake additional efforts to lawfully share threat information with an FBI field office in order to actually investigate the crime.

That is, having decided that the existing portal wasn’t good enough because it didn’t offer enough immunities (and because it was too privacy protective), the handful of mostly Republican leaders negotiating OmniCISA outside of normal debate then created the possibility of extending those protections to a completely different kind of information sharing, that of content shared for law enforcement.

In her final post, Hennessey suggests some commentators (hi!!) who might be concerned about FBI’s ability to offer immunity for those who share domestically collected content willingly are “conspiracy-minded” even while she reverts to offering solace in the DHS portal protections that, her series demonstrates, are at great risk of bureaucratic bypass.

But these laws encompass a broad range of computer crimes, fraud, and economic espionage – most controversially the Computer Fraud and Abuse Act (CFAA). Here the technical constraints of the AIS system cut both ways. On one hand, the scope of cyber threat indicators shared through the portal significantly undercuts claims CISA is a mass surveillance bill. Bluntly stated, the information at issue is not of all that much use for the purposes certain privacy-minded – and conspiracy-minded, for that matter – critics allege. Still, the government presumably anticipates using this information in at least some investigations and prosecutions. And not only does CISA seek to move more information to the government – a specific and limited type of information, but more nonetheless – but it also authorizes at least some amount of new sharing.

[snip]

That question ultimately resolves to which STIX TAXII fields DHS decides to open or shut in the portal. So as CISA moves towards implementation, the portal fields – and the privacy interests at stake in the actual information being shared – are where civil liberties talk should start.

To some degree, Hennessey’s ultimate conclusion points to one area where privacy (and security) advocates might weigh in. When the government provides Congress the interim guidelines sometime this month, those advocates might have an opportunity to do so, if they get a copy of the guidelines. But only the final guidelines are required to be made public.

And by then, it would be too late. Through a series of legislative tactics, some involving actual debate but some of the most important simply slapped onto a must-pass legislation, Congress has authorized the President to let the FBI, effectively, obtain US person content pertaining to Internet-based crimes without a warrant. Even if President Obama chooses not to use that authorization (or obtains enough concessions from DHS not to have to directly), President Trump may not exercise that discretion.

Maybe I am being conspiratorial in watching the legislative changes made to a bill (and to an existing portal) and, absent any other logical explanation for them, concluding those changes are designed to do what they look like they’re designed to do. But it turns out privacy (and security) advocates weren’t conspiratorial enough to prevent this from happening before it was too late.

After Lying in a Closed Surveillance Briefing in 2011, Intelligence Community Plans Another Closed Briefing

On May 18, 2011, 48 members of the House (mostly Republicans, but also including MI’s Hansen Clarke) attended a closed briefing given by FBI Director Robert Mueller and General Counsel Valerie Caproni on the USA PATRIOT Act authorities up for reauthorization. The briefing would serve as the sole opportunity for newly elected members to learn about the phone and Internet dragnets conducted under the PATRIOT Act, given Mike Rogers’ decision not to distribute the letter DOJ provided to inform members about the secret dragnets they were about to reauthorize.

During the briefing, someone asked,

Russ Feingold said that Section 215 authorities have been abused. How does the FBI respond to that accusation?

One of the briefers — the summary released under FOIA does not say who — responded,

To the FBI’s knowledge, those authorities have not been abused.

As a reminder, hearing witness Robert Mueller had to write and sign a declaration for the FISC two years earlier to justify resuming full authorization for the phone dragnet because, as Judge Reggie Walton had discovered, the NSA had conducted “daily violations of the minimization procedures” for over two years. “The minimization procedures proposed by the government in each successive application and approved and adopted as binding by the orders of the FISC have been so frequently and systemically violated that it can fairly be said that this critical element of the overall BR regime has never functioned effectively,” Walton wrote in March 2009.

Now, I can imagine that whichever FBI witness claimed the FBI didn’t know about any “abuses” rationalized the answer to him or herself using the same claim the government has repeatedly made: that these were not willful abuses. But Walton stated then, and evidence released since has made clear he was right, that the government simply chose to subject the vast amount of US person data collected under the PATRIOT Act to EO 12333 standards, not the more stringent PATRIOT Act ones. That is, the NSA, operating under FBI authorizations, made a willful choice to ignore the minimization procedures imposed by the 2006 reauthorization of the Act.

Whoever answered that question in 2011 lied, and lied all the more egregiously given that the questioner had no way of phrasing it to get an honest answer about violations of minimization procedures.

Which is why the House Judiciary Committee should pointedly refuse to permit the Intelligence Community to conduct another such closed briefing, as it plans to do on Section 702 on February 2. Holding a hearing in secret permits the IC to lie to Congress, not to mention disinform some members in a venue where their colleagues cannot correct the record (as Feingold might have done in 2011 had he learned what the FBI witnesses said in that briefing).

I mean, maybe HJC Chair Bob Goodlatte wants to be lied to? Otherwise, there’s no sound explanation for scheduling this entire hearing in closed session.


Martin Luther King Jr., Subversives, and the PATRIOT Dragnet

In a superb column today, Alvaro Bedoya recalls the long, consistent history during which people of color and other minorities, including Martin Luther King, Jr., were targeted in the name of national security.

The FBI’s violations against King were undeniably tinged by what historian David Garrow has called “an organizational culture of like-minded white men.” But as Garrow and others have shown, the FBI’s initial wiretap requests—and then–Attorney General Robert Kennedy’s approval of them—were driven by a suspected tie between King and the Communist Party. It wasn’t just King; Cesar Chavez, the labor and civil rights leader, was tracked for years as a result of vague, confidential tips about “a communist background,” as were many others.

Many people know that during World War II, innocent Americans of Japanese descent were surveilled and detained in internment camps. Fewer people know that in the wake of World War I, President Woodrow Wilson openly feared that black servicemen returning from Europe would become “the greatest medium in conveying Bolshevism to America.” Around the same time, the Military Intelligence Division created a special “Negro Subversion” section devoted to spying on black Americans. Near the top of its list was W.E.B. DuBois, a “rank Socialist” whom they tracked in Paris for fear he would “attempt to introduce socialist tendencies at the Peace Conference.”

I think Bedoya, as many people do, gives FBI Director Jim Comey a big pass on surveillance because of the Director’s stunt of having agents-in-training study what the Bureau did to King. I have written about how Comey’s claims of caution in the face of the MLK example don’t hold up against the Bureau’s current, known activities.

Comey engages in similar obfuscation when he points to FBI’s treatment of Martin Luther King Jr., whose treatment at the hands of the FBI he holds up to FBI Agents as a warning. The FBI Director describes the unlimited surveillance the Bureau subjected King to based solely on the signature of Hoover and the Attorney General: “Open-ended. No time limit. No space restriction. No review. No oversight.” While it is true that the FBI now gets court approval to track civil rights leaders, they do track them, especially in the Muslim community. And without oversight, the FBI can and does infiltrate houses of worship with informants, as it did with African-American churches during the Civil Rights movement. The FBI can obtain phone and Internet metadata records without judicial oversight using National Security Letters — which it still can’t count accurately enough to fulfill congressionally mandated reporting. The FBI has many tools that evade the kind of oversight Comey described, and because of technology many of them are far more powerful than the tools wielded against Dr. King.

But I’m particularly interested in Bedoya’s reminder that the government targeted African Americans for surveillance as subversives in the wake of World War I.

The government’s practice of targeting specific kinds of people, often people of color, as subversives continued, after all. It’s something J. Edgar Hoover continued throughout his life, keeping a list of people to be rounded up if anything happened.

I’ve been thinking about that practice as I’ve been trying to explain, even to civil liberties supporters, why the current 2-degree targeted dragnet is still too invasive of privacy. We’ve been having this discussion for 2.5 years, and yet still most people don’t care that completely innocent people 2 degrees — 3, until 2014 — away from someone the government has a traffic-stop level of suspicion over will be subjected to the NSA’s “full analytic tradecraft.”

The discussion of a Subversives List makes me think of this article from 2007 (which I first wrote about here and here). The story explains that the thing that really freaked out the hospital “heroes” in 2004 was not the Internet dragnet itself, but instead the deployment of Stellar Wind against Main Core, which appears to be another name for this Subversives List.

While Comey, who left the Department of Justice in 2005, has steadfastly refused to comment further on the matter, a number of former government employees and intelligence sources with independent knowledge of domestic surveillance operations claim the program that caused the flap between Comey and the White House was related to a database of Americans who might be considered potential threats in the event of a national emergency. Sources familiar with the program say that the government’s data gathering has been overzealous and probably conducted in violation of federal law and the protection from unreasonable search and seizure guaranteed by the Fourth Amendment.

According to a senior government official who served with high-level security clearances in five administrations, “There exists a database of Americans, who, often for the slightest and most trivial reason, are considered unfriendly, and who, in a time of panic, might be incarcerated. The database can identify and locate perceived ‘enemies of the state’ almost instantaneously.” He and other sources tell Radar that the database is sometimes referred to by the code name Main Core. One knowledgeable source claims that 8 million Americans are now listed in Main Core as potentially suspect. In the event of a national emergency, these people could be subject to everything from heightened surveillance and tracking to direct questioning and possibly even detention.

[snip]

Another well-informed source—a former military operative regularly briefed by members of the intelligence community—says this particular program has roots going back at least to the 1980s and was set up with help from the Defense Intelligence Agency. He has been told that the program utilizes software that makes predictive judgments of targets’ behavior and tracks their circle of associations with “social network analysis” and artificial intelligence modeling tools.

“The more data you have on a particular target, the better [the software] can predict what the target will do, where the target will go, who it will turn to for help,” he says. “Main Core is the table of contents for all the illegal information that the U.S. government has [compiled] on specific targets.” An intelligence expert who has been briefed by high-level contacts in the Department of Homeland Security confirms that a database of this sort exists, but adds that “it is less a mega-database than a way to search numerous other agency databases at the same time.”

[snip]

The following information seems to be fair game for collection without a warrant: the e-mail addresses you send to and receive from, and the subject lines of those messages; the phone numbers you dial, the numbers that dial in to your line, and the durations of the calls; the Internet sites you visit and the keywords in your Web searches; the destinations of the airline tickets you buy; the amounts and locations of your ATM withdrawals; and the goods and services you purchase on credit cards. All of this information is archived on government supercomputers and, according to sources, also fed into the Main Core database.

[snip]

Main Core also allegedly draws on four smaller databases that, in turn, cull from federal, state, and local “intelligence” reports; print and broadcast media; financial records; “commercial databases”; and unidentified “private sector entities.” Additional information comes from a database known as the Terrorist Identities Datamart Environment, which generates watch lists from the Office of the Director of National Intelligence for use by airlines, law enforcement, and border posts. According to the Washington Post, the Terrorist Identities list has quadrupled in size between 2003 and 2007 to include about 435,000 names. The FBI’s Terrorist Screening Center border crossing list, which listed 755,000 persons as of fall 2007, grows by 200,000 names a year. A former NSA officer tells Radar that the Treasury Department’s Financial Crimes Enforcement Network, using an electronic-funds transfer surveillance program, also contributes data to Main Core, as does a Pentagon program that was created in 2002 to monitor anti-war protestors and environmental activists such as Greenpeace.

Given what we now know about the dragnet, this article is at once less shocking and more so. Much of the information included, such as phone records and emails, as well as the scale of the known lists, like the No Fly List, is already known. Other categories, such as credit card purchases, aren't included in what we know about the dragnet, though we have suspected them. The purported inclusion of peace protestors, in what appears to be a reference to CIFA, is something I'll return to.

Mostly, though, this article takes the generally now-known scope of the dragnet and claims that it serves the function those Subversives lists of days past did. As such (and assuming it is true in general outline, and I have significant reason to believe it is), it does two things for our understanding. First, it illustrates, as I have tried to in the past, what it means to be exposed to the full complement of NSA's analytical tradecraft. But it also reframes our understanding of what 2 degrees of suspicion from a traffic stop means.

Whether or not this Main Core description is accurate, it invites us to think of this 2-degree dragnet as a nomination process for the Subversives list. Unlike in Hoover's day, when someone had to maintain a deck of index cards, here it is one interlocking set of data, coded to serve both as a list and as a profiling system for anyone on that list.

To the extent that this dragnet still exists (or has been magnified with the rollout of XKeyscore), and it absolutely does for Muslims 2 degrees from a terrorist suspect, this is what the dragnet is all about: getting you on that list, which serves as a magnet for all the rest of your information to be sucked in and retained, so that if the government ever feels like it has to start cracking down on dissidents, it has that list, and a ton of demographic data, ready at hand.
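To make the 2-degree mechanics concrete: contact chaining is just a breadth-first walk out from a seed identifier, collecting everyone within two hops. The sketch below is purely illustrative, with invented names and toy data; it describes no actual government system, only the graph logic the phrase "2 degrees from a suspect" implies.

```python
from collections import deque

def two_hop_contacts(call_graph, seed):
    """Collect every identifier within 2 hops of a seed selector.

    call_graph maps an identifier to the set of identifiers it has
    been in contact with. Returns all identifiers reachable in at
    most 2 hops, excluding the seed itself.
    """
    seen = {seed}
    frontier = deque([(seed, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == 2:          # stop chaining past the second hop
            continue
        for contact in call_graph.get(node, ()):
            if contact not in seen:
                seen.add(contact)
                frontier.append((contact, depth + 1))
    return seen - {seed}

# Toy example: a seed, its direct contacts, and their contacts.
graph = {
    "suspect": {"a", "b"},
    "a": {"suspect", "c"},
    "b": {"suspect", "d"},
    "c": {"a", "e"},            # "e" is 3 hops out, so not collected
}
print(sorted(two_hop_contacts(graph, "suspect")))  # ['a', 'b', 'c', 'd']
```

The point the toy numbers make is the fan-out: even with a handful of contacts per person, two hops from one traffic stop can sweep in thousands of people, each of whom then becomes a magnet for further collection.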

Update: See this Global Research post on COG programs.