Lack of Due Diligence: The NSA’s “the Analyst Didn’t Give a Fuck” Violation

The NSA claims there have been no willful violations of the law relating to the NSA databases. For example, NSA’s Director of Compliance John DeLong just said, “NSA has a zero tolerance policy for willful misconduct. None of the incidents were willful.” House Intelligence Chair Mike Rogers just said the documents show “no intentional or willful violations.”

Which is why I want to look more closely at the user error categories included in the May 3, 2012 audit.

The report doesn’t actually break down the root cause of errors across all violations. But it does for three overlapping incident types (the 195 FISA authority incidents, the 115 database query ones, and the 772 S2 Directorate violations).

It says the root cause for FISA authority incidents breaks down this way:

  • 60 resource (31% of all FISA authority violations)
  • 39 lack of due diligence (20% of all FISA authority violations)
  • 21 human error (11% of all FISA authority violations)
  • 3 training (1.5% of all FISA authority violations)
  • 67 system limitations (34% of all FISA authority violations, mostly on the roamer problem)
  • 4 system engineering (2% of all FISA authority violations)
  • 1 system disruption (0.5% of all FISA authority violations)

It says the root cause of all database query incidents breaks down this way:

  • 85 human error (74% of all database query incidents)
  • 13 lack of due diligence (11% of all database query incidents)
  • 9 training (8% of all database query incidents)
  • 7 resources (6% of all database query incidents)
  • 1 system disruption (~1% of all database query incidents)

And it breaks down the errors in its worst-performing (in terms of violations) Deputy Directorate organization, S2, this way:

  • 71 human error (9% of all S2 violations)
  • 80 resources (10% of all S2 violations)
  • 68 lack of due diligence (9% of all S2 violations)
  • 2 resources
  • 9 training (1% of all S2 violations)
  • 541 system limitations (70% of all S2 violations)
  • 1 system engineering
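As a sanity check on the report’s rounding, the three breakdowns above can be recomputed from the raw counts — the stated totals (195, 115, 772) are just the sums of each list, and the quoted percentages fall out of simple division. A quick sketch (the category names and counts are those listed above; nothing here comes from the audit beyond those numbers):

```python
# Recompute each breakdown's total and per-cause percentage from the raw
# counts quoted in the audit, to confirm the stated totals and rounding.
breakdowns = {
    "FISA authority (195)": {
        "resources": 60, "lack of due diligence": 39, "human error": 21,
        "training": 3, "system limitations": 67,
        "system engineering": 4, "system disruption": 1,
    },
    "database query (115)": {
        "human error": 85, "lack of due diligence": 13, "training": 9,
        "resources": 7, "system disruption": 1,
    },
    "S2 Directorate (772)": {
        "human error": 71, "resources": 80, "lack of due diligence": 68,
        "resources (second listing)": 2, "training": 9,
        "system limitations": 541, "system engineering": 1,
    },
}

for name, counts in breakdowns.items():
    total = sum(counts.values())
    print(f"{name}: total = {total}")
    for cause, n in counts.items():
        print(f"  {cause}: {n} ({100 * n / total:.1f}%)")
```

Each sum matches the incident total the report states, and each percentage rounds to the figure quoted in the lists above (e.g. 68 of 772 S2 violations is 8.8%, reported as 9%).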

What I’m interested in are the three main types of operator error: human error, resources, and lack of due diligence.

Human error is, from the descriptions, an honest mistake. It includes broad syntax errors, typographical errors, Boolean operator errors, misapplied query technique, incorrect option, unfamiliarity with tool, selector mistypes, incorrect realm, or improper queries. Let’s assume, improbably, that none of the violations listed as human error were anything but honest mistakes. These honest mistakes account for anywhere from 9% to 74% of the violations broken out by root cause.

Then there are resource violations. Those are described as “inaccurate or insufficient research information and/or workload issues.” So in part, resource violations stem from someone having too much analysis to do. But given that “inaccurate or insufficient research information” always appears first, it seems that resource violations arise when an analyst targets someone based on a faulty understanding of that person. Given how prominent this problem is for FISA violations, I suspect it includes, in part, target location. It may also pertain to targets erroneously believed to have a tie to terror or the Chinese military or Iranian nukes. These appear to be mistakes based on the analyst not having enough, or accurate, information before she starts the collection. These may or may not be honest mistakes. The description of them as resource errors suggests they may in part be people taking research shortcuts. Resource problems account for anywhere from 6% to 31% of the violations broken out by root cause.

But then there’s a third category: lack of due diligence. The report defines lack of due diligence as “a failure to follow standard operating procedures.” But some failures to follow standard operating procedures are accounted for in other categories, like training, the misapplied query techniques, and the apparent inadequate-research violations. This category appears to be something different from the “honest mistake” errors categorized under human error. In fact, by the very exclusion of these violations from the “human error” category, NSA seems to be admitting these violations aren’t errors. These violations of standard operating procedures, it seems, are intentional. Not errors. Willful violations.

At the very least, this category seems to count the violations by analysts who just don’t give a fuck what the rules are; they’re going to ignore the rules.

This category, what I consider the “Analyst didn’t give a fuck” category, accounts for 9% to 20% of all the violations broken out by root cause.

In aggregate, these violations may not amount to all that many given the thousands of queries run every year — they make up just 68 of the violations in S2, for example. Those 68 due diligence violations make up almost 8% of the violations in the quarter, not counting due diligence violations that may have happened in other Directorates.

John DeLong, who is in charge of compliance at NSA, says the Agency has zero tolerance for willful misconduct. But the NSA appears to have a good deal more tolerance for a lack of due diligence.

Marcy has been blogging full time since 2007. She’s known for her live-blogging of the Scooter Libby trial, her discovery of the number of times Khalid Sheikh Mohammed was waterboarded, and generally for her weedy analysis of document dumps.

Marcy Wheeler is an independent journalist writing about national security and civil liberties. She writes as emptywheel at her eponymous blog, publishes at outlets including the Guardian, Salon, and the Progressive, and appears frequently on television and radio. She is the author of Anatomy of Deceit, a primer on the CIA leak investigation, and liveblogged the Scooter Libby trial.

Marcy has a PhD from the University of Michigan, where she researched the “feuilleton,” a short conversational newspaper form that has proven important in times of heightened censorship. Before and after her time in academics, Marcy provided documentation consulting for corporations in the auto, tech, and energy industries. She lives with her spouse and dog in Grand Rapids, MI.

12 replies
  1. der says:

    As any teen will tell you there are ways around the rules, so the lying liars are updating their schemes as the ink dries. On the one hand we have the Minister of Truth telling us the plumbers fixed the leak, and on the other there’s, well, the water carriers for the NSA diverting the flow. I’m curious to know how long an analyst’s eyes stayed on the “error” page before clicking off. Thankfully the NSA plans on axing 90% of systems administrators so that should help with the workload of those overworked techies.

  2. P J Evans says:

    That’s crappy performance and crappy training.
    Where I worked, if someone continued to make a lot of errors, they were released. (The tolerable level was 10% error, which is still a high rate.)

  3. Arbusto says:

    You’d think the NSA would have different levels of analysts with levels of permissions, but it sure doesn’t seem there are any system lockouts. Is their program really open to all vetted employees, or are more sensitive communication intercepts limited by the database? I’d hope intercepts with US locations would require more expertise, oversight and procedure, especially US to US, even though TPTB deny US/US intercepts.

  4. peasantparty says:

    So the clean up crew did come out after all. I thought it would have waited until after the Sunday Talk shows so the tv heads could aim questions just enough to the right to calm the situation.

    No matter what they said today, tomorrow, or any other day; they are using the Patriot Act and NDAA to find reasons and keep them till they need em.

    Why collect all this information on normal innocent Americans? So they can use it after the fact to squeeze together a faked up reason to throw you into a military or FEMA Gulag. No rights, no reason, secret evidence, and NO LAWYER!

    We as citizens cannot let this go and we cannot walk on by thinking it will get fixed later. It won’t!

  5. orionATL says:

    the problem behind these numbers is that nsa gets to make up the categories, set rules for what is to be placed in each category, and, most importantly, decide which category to put any particular incident in, all in secret. then it uses them to publicly tout its sensitivity to and vigilance about improper querying.

    given nsa’s evasiveness and lying, e.g., the injunction not to provide too much info to overseers, the public would be foolish to give too much credence to the totals in a “benign” category over those in a more blameworthy category.

    as for delong’s assurance that nsa does not tolerate willful misconduct, that is purely formalistic, and irrelevant to day-to-day operating reality. of course the nsa rules of professional conduct rule out willful misconduct.

    but i’ll bet there is a strong informal tolerance for something similar to hot pursuit.

    i’d also bet if a supervisor says “to hell with the rules. this is too important. do it.” the analyst will do it.

    that’s the wonderful thing about being totally protected by secrecy; you can get away with so much without even being questioned about what you’re doing, let alone being held accountable.

  6. JohnT says:

    You’ve prolly mentioned this already Marcy, but this is stunning

    Yes, the NSA claimed in a recently released white paper that it “touches” only 1.6 percent of the planet’s online data, but the agency neglected to note that this is roughly equivalent to the Library of Congress’s entire textual collection, inspected 2,990 times every day.

  7. orionATL says:

    wyden from the guardian article cited above:

    “..On July 31, Wyden, backed by Udall, vaguely warned other senators in a floor speech that the NSA and the director of national intelligence were substantively misleading legislators by describing improperly collected data as a matter of innocent and anodyne human or technical errors…”

    “innocent and anodyne human or technical errors”. this must have given gellman his text.

  8. Jessica says:

    I’m growing a bit tired of Udall and Wyden’s “cautious statements” being so cautious. Don’t get me wrong – I appreciate that they are concerned and I’m glad they’re prodding people. But at the same time, I’d kind of like them to dispense with the tippy-toes and start kicking some doors down.

  9. earlofhuntingdon says:

    Imagine if the airlines tolerated that level of “unintentional” human error among pilots and maintenance staff, or if the air force permitted that high a percentage of error in investigating allegations of sexual misconduct at its vaunted academy.

  10. Jim says:

    Marcy, you’re probably aware of this, but Binney suggested a way to eliminate a large source for “human error” in NSA inquiries, by encrypting the entire database(s) and implementing a process whereby the FISA court provides decryption keys that are unique to a given person / subject.

    Assuming appropriate implementation of the database(s) and decryption mechanism, the only query results that would be human-readable would be those pertaining to a subject for which the FISA court gives permission to search / view.

    Given enough compute resource, just about any encryption method can be defeated, but doing so would require an amount of resource beyond the capabilities of (most) individuals. It would be very difficult, if not impossible, to break a strong encryption scheme without the knowledge of NSA higher-ups.

    Someone else might have mentioned this point before, but it’s such an obvious solution to the human abuse problem that I thought I’d mention it here.

Comments are closed.