The Hill sees fit to quote NSA’s Compliance Officer John DeLong boasting that the NSA implemented (one of) the reforms Obama announced the same day he announced it — which (DeLong claimed) was proof that NSA’s compliance system works.
Earlier this year, Obama directed the NSA to get court approval before it searched a database of Americans’ phone records and limited those searches to people two “hops” away from a suspect.
DeLong said on Thursday that the changes were put into effect the same day that the president announced them.
“It helped to have a compliance program — a compliance workforce — that was already in place,” he said. That way, the agency was not operating “from a cold start.”
As I noted in January when commentators first started hailing what the Administration billed as a great change, it was instead presidential codification of a policy that had been in place since 2011.
I’m seeing a lot of enthusiasm about President Obama’s promise to limit the NSA to 2 hops on its phone dragnet.
Effective immediately, we will only pursue phone calls that are two steps removed from a number associated with a terrorist organization instead of three.
But it’s not that big of a limit.
As far back as 2011, the NSA had standardized on two hops, permitting a third only with special approval. (See page 13.)
While the BR Order permits contact chaining for up to three hops, NSA has decided to limit contact chaining to only two hops away from the RAS-approved identifier without prior approval from your Division management to chain the third hop.
So in effect, Obama has replaced the NSA’s internal directive limiting the hops to 2 with his own directive (which can be pixie dusted with no notice) limiting the hops to 2.
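To make concrete what a hop limit actually restricts, here is a minimal sketch of hop-limited contact chaining — a breadth-first walk out from a seed identifier over a call graph. The call records and identifiers are entirely made up for illustration; this is not NSA code, just the generic technique the orders describe.

```python
from collections import defaultdict

# Hypothetical call records: each pair is a call between two numbers.
calls = [
    ("seed", "a"), ("a", "b"), ("b", "c"),  # "c" sits 3 hops from the seed
    ("seed", "d"), ("d", "e"),
]

# Build an undirected contact graph from the records.
graph = defaultdict(set)
for x, y in calls:
    graph[x].add(y)
    graph[y].add(x)

def contact_chain(seed, max_hops=2):
    """Return every identifier within max_hops of the seed identifier."""
    frontier, seen = {seed}, {seed}
    for _ in range(max_hops):
        # Expand one hop: all contacts of the current frontier not yet seen.
        frontier = {n for cur in frontier for n in graph[cur]} - seen
        seen |= frontier
    return seen - {seed}

two_hop = contact_chain("seed", max_hops=2)    # {"a", "b", "d", "e"}
three_hop = contact_chain("seed", max_hops=3)  # adds "c"
```

The point of the sketch: moving from three hops to two only trims the outermost ring of contacts, and since each hop multiplies the candidate pool, the two-hop set is already very large for any well-connected seed.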
What NSA’s ability to implement this change immediately shows is not the great performance of its compliance program, but rather the ability to do nothing while claiming a great victory over the status quo.
But don’t look for that to appear in most reporting on the NSA.
Last Friday, NSA’s Compliance Director John DeLong assured journalists the violations NSA reported in 2012 were “minuscule.” (I noted that the report showed some of the most sensitive violations primarily get found through audits, and therefore their discovery depends in part on how many people are auditing.)
Today, as part of a story describing that NSA still doesn’t know what Edward Snowden took from NSA, MSNBC quotes a source saying NSA has stinky audit capabilities.
Another said that the NSA has a poor audit capability, which is frustrating efforts to complete a damage assessment.
For the past several months, various Intelligence officials have assured Congress and the public that the NSA keeps US person data very carefully guarded, so only authorized people can access it.
Today, MSNBC reports NSA had (has?) poor data compartmentalization.
NSA had poor data compartmentalization, said the sources, allowing Snowden, who was a system administrator, to roam freely across wide areas.
Again, there have long been signs that non-analysts had untracked access to very sensitive data. Multiple sources agree — and possibly not just non-analysts.
While I’m really sympathetic for the people who are reportedly “overwhelmed” trying to figure out what Snowden took, we’re seeing precisely the same thing we saw with Bradley Manning: that it takes a giant black eye for intelligence agencies to even admit to gaping holes in their security and oversight.
And in NSA’s case, it proves most of their reassurances to be false.
The NSA claims there have been no willful violations of the law relating to the NSA databases. For example, NSA’s Director of Compliance John DeLong just said “NSA has a zero tolerance policy for willful misconduct. None of the incidents were willful.” House Intelligence Chair Mike Rogers just said the documents show “no intentional or willful violations.”
Which is why I want to look more closely at the user error categories included in the May 3, 2012 audit.
The report doesn’t actually break down the root cause of errors across all violations. But it does for three overlapping incident types (the 195 FISA authority incidents, the 115 database query ones, and the 772 S2 Directorate violations).
It says the root cause for FISA authority incidents breaks down this way:
It says the root cause of all database query incidents breaks down this way:
And it breaks down the errors in its worst performing (in terms of violations) Deputy Directorate organization, S2, this way:
What I’m interested in are the three main types of operator error: human error, resources, and lack of due diligence.
Human error is, from the descriptions, an honest mistake. It includes broad syntax errors, typographical errors, Boolean operator errors, misapplied query technique, incorrect option, unfamiliarity with tool, selector mistypes, incorrect realm, or improper queries. Let’s assume, improbably, that none of the violations listed as human error were anything but honest mistakes. These honest mistakes account for anywhere from 9% to 74% of the violations broken out by root cause.
Then there’s resource violations. Those are described as “inaccurate or insufficient research information and/or workload issues.” So partly, resource violations stem from someone having too much analysis to do. But given that “inaccurate or insufficient research information” always appears first, it seems that resource violations arise when an analyst targets someone based on a faulty understanding of this person. Given how prominent this problem is for FISA violations, I suspect it includes, in part, target location. It may also pertain to targets erroneously believed to have a tie to terror or Chinese military or Iranian nukes. These appear to be mistakes based on the analyst not having enough or accurate information before she starts the collection. These may or may not be honest mistakes. The description of them as resource errors suggests they may in part be people taking research shortcuts. Resource problems account for anywhere from 6% to 31% of the violations broken out by root cause.
But then there’s a third category: lack of due diligence. The report defines lack of due diligence as “a failure to follow standard operating procedures.” But some failure to follow standard operating procedure is accounted for in other categories, like training, the misapplied query techniques, and the apparent inadequate research violations. This category appears to be something different than the “honest mistake” errors categorized under human error. In fact, by the very exclusion of these violations from the “human error” category, NSA seems to be admitting these violations aren’t errors. These violations of standard operating procedures, it seems, are intentional. Not errors. Willful violations.
At the very least, this category seems to count the violations on behalf of analysts who just don’t give a fuck what the rules are, they’re going to ignore the rules.
This category, what I consider the “Analyst didn’t give a fuck” category, accounts for 9% to 20% of all the violations broken out by root cause.
In aggregate, these violations may not amount to all that many given the thousands of queries run every year — they make up just 68 of the violations in S2, for example. Those 68 due diligence violations make up almost 8% of the violations in the quarter, not counting due diligence violations that may have happened in other Directorates.
John DeLong, who is in charge of compliance at NSA, says the Agency has zero tolerance for willful misconduct. But the NSA appears to have a good deal more tolerance for a lack of due diligence.