
[Photo: National Security Agency, Ft. Meade, MD via Wikimedia]

Shadow Brokers Gets Results! Congress Finally Moves to Oversee Vulnerabilities Equities Process

Since the Snowden leaks, there has been a big debate about the Vulnerabilities Equities Process — the process by which NSA reviews vulnerabilities it finds in code and decides whether to tell the maker or instead to turn them into exploits to use to spy on its targets. That debate got more heated after Shadow Brokers started leaking exploits all over the web, ultimately leading to the global WannaCry attack (the NotPetya attack also included an NSA exploit, but mostly for show).

In the wake of the WannaCry attack, Microsoft President Brad Smith wrote a post demanding that governments stop stockpiling vulnerabilities.

Finally, this attack provides yet another example of why the stockpiling of vulnerabilities by governments is such a problem. This is an emerging pattern in 2017. We have seen vulnerabilities stored by the CIA show up on WikiLeaks, and now this vulnerability stolen from the NSA has affected customers around the world. Repeatedly, exploits in the hands of governments have leaked into the public domain and caused widespread damage. An equivalent scenario with conventional weapons would be the U.S. military having some of its Tomahawk missiles stolen. And this most recent attack represents a completely unintended but disconcerting link between the two most serious forms of cybersecurity threats in the world today – nation-state action and organized criminal action.

The governments of the world should treat this attack as a wake-up call. They need to take a different approach and adhere in cyberspace to the same rules applied to weapons in the physical world. We need governments to consider the damage to civilians that comes from hoarding these vulnerabilities and the use of these exploits. This is one reason we called in February for a new “Digital Geneva Convention” to govern these issues, including a new requirement for governments to report vulnerabilities to vendors, rather than stockpile, sell, or exploit them.

But ultimately, the VEP has been a black box run by the Executive Branch, without any clear oversight.

The Intelligence Authorization would change that. Within 90 days of enactment (and within 30 days of any substantive change in policy), each intelligence agency would have to report to Congress the “process and criteria” it uses to decide whether to submit a vulnerability for review; the reports would be unclassified, though they could include a classified annex.

Each year, the Director of National Intelligence would also have to submit a classified report tracking what happened with the vulnerabilities reviewed in the previous year. Beyond showing how many were submitted for review and how many were actually disclosed, it would require the DNI to track whether, and how quickly, vendors patched the ones that were disclosed. One concern among spooks is that vendors don’t actually fix their vulnerabilities in timely fashion, so disclosing them may not make end users any safer.

There would also be an unclassified appendix reporting the aggregate numbers, both government-wide and vendor by vendor (a rough sketch of what that aggregation might look like appears after the bill text below). Arguably, this is far more transparency than the government provides right now on actual spying.

This report would, at the very least, provide real data about what actually happens with the VEP and may show (as some spooks complain) that vendors won’t actually fix vulnerabilities that get disclosed. My guess is SSCI’s mandate for unclassified reporting by vendor is meant to embarrass those (potentially including Microsoft?) that take too long to fix their vulnerabilities.

I’m curious how the IC will respond to this (especially ODNI, which under James Clapper had squawked mightily about new reports). I also find it curious that Rick Ledgett wrote his straw man post complaining that Shadow Brokers would lead people to reconsider the VEP after this bill was voted out of the SSCI; was that a preemptive strike against a reasonable requirement?


SEC. 604. REPORTS ON THE VULNERABILITIES EQUITIES POLICY AND PROCESS OF THE FEDERAL GOVERNMENT.

(a) Report Policy And Process.—

(1) IN GENERAL.—Not later than 90 days after the date of the enactment of this Act and not later than 30 days after any substantive change in policy, the head of each element of the intelligence community shall submit to the congressional intelligence committees a report detailing the process and criteria the head uses for determining whether to submit a vulnerability for review under the vulnerabilities equities policy and process of the Federal Government.

(2) FORM.—Each report submitted under paragraph (1) shall be submitted in unclassified form, but may include a classified annex.

(b) Annual Report On Vulnerabilities.—

(1) IN GENERAL.—Not less frequently than once each year, the Director of National Intelligence shall submit to the congressional intelligence committees a report on—

(A) how many vulnerabilities the intelligence community has submitted for review during the previous calendar year;

(B) how many of such vulnerabilities were ultimately disclosed to the vendor responsible for correcting the vulnerability during the previous calendar year; and

(C) vulnerabilities disclosed since the previous report that have either—

(i) been patched or mitigated by the responsible vendor; or

(ii) have not been patched or mitigated by the responsible vendor and more than 180 days have elapsed since the vulnerability was disclosed.

(2) CONTENTS.—Each report submitted under paragraph (1) shall include the following:

(A) The date the vulnerability was disclosed to the responsible vendor.

(B) The date the patch or mitigation for the vulnerability was made publicly available by the responsible vendor.

(C) An unclassified appendix that includes—

(i) a top-line summary of the aggregate number of vulnerabilities disclosed to vendors, how many have been patched, and the average time between disclosure of the vulnerability and the patching of the vulnerability; and

(ii) the aggregate number of vulnerabilities disclosed to each responsible vendor, delineated by the amount of time required to patch or mitigate the vulnerability, as defined by thirty day increments.

(3) FORM.—Each report submitted under paragraph (1) shall be in classified form.

(c) Vulnerabilities Equities Policy And Process Of The Federal Government Defined.—In this section, the term “vulnerabilities equities policy and process of the Federal Government” means the policy and process established by the National Security Council for the Federal Government, or successor set of policies and processes, establishing policy and responsibilities for disseminating information about vulnerabilities discovered by the Federal Government or its contractors, or disclosed to the Federal Government by the private sector in government off-the-shelf (GOTS), commercial off-the-shelf (COTS), or other commercial information technology or industrial control products or systems (including both hardware and software).
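
Purely as an illustration (nothing like this appears in the bill), here is a minimal sketch in Python of the kind of vendor-by-vendor aggregation the unclassified appendix in subsection (b)(2)(C) describes: disclosed vulnerabilities bucketed by how long the responsible vendor took to patch, in thirty day increments, plus a top-line summary of totals and average time to patch. The vendor names, dates, and data layout are all hypothetical; the bill specifies only the aggregate outputs, not any particular format.

from collections import defaultdict
from datetime import date

# Hypothetical records: (vendor, date disclosed to the vendor, date a patch shipped, or None).
# Vendor names and dates are invented for illustration only.
disclosures = [
    ("VendorA", date(2017, 1, 10), date(2017, 2, 14)),
    ("VendorA", date(2017, 3, 1), date(2017, 7, 20)),
    ("VendorB", date(2017, 2, 5), None),   # still unpatched at report time
]

REPORT_DATE = date(2017, 12, 31)

def bucket(days):
    # Delineate time-to-patch in thirty-day increments, e.g. "31-60 days".
    low = (days // 30) * 30
    return f"{low + 1}-{low + 30} days"

per_vendor = defaultdict(lambda: defaultdict(int))
patch_times = []

for vendor, disclosed, patched in disclosures:
    if patched is None:
        # Flag unpatched vulnerabilities, noting the 180-day threshold from (b)(1)(C)(ii).
        label = "unpatched, 180+ days" if (REPORT_DATE - disclosed).days > 180 else "unpatched"
        per_vendor[vendor][label] += 1
        continue
    days = (patched - disclosed).days
    patch_times.append(days)
    per_vendor[vendor][bucket(days)] += 1

# Top-line summary: totals and average disclosure-to-patch time across all vendors.
print("disclosed:", len(disclosures),
      "patched:", len(patch_times),
      "average days to patch:", sum(patch_times) / max(len(patch_times), 1))
for vendor, buckets in sorted(per_vendor.items()):
    print(vendor, dict(buckets))

Nothing in the section requires the DNI to produce the report this way, of course; the sketch is just a way to picture what the unclassified appendix would let the public see.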

[Photo: National Security Agency, Ft. Meade, MD via Wikimedia]

Rick Ledgett’s Straw Malware

For some reason, over a month after NotPetya and almost three months after WannaCry, former Deputy DIRNSA Rick Ledgett has decided now’s the time to respond to them by inventing a straw man argument denying the need for vulnerability disclosure. In the same (opening) paragraph where he claims the malware attacks have revived calls for the government to release all vulnerabilities, he accuses his opponents of oversimplification.

The WannaCry and Petya malware, both of which are partially based on hacking tools allegedly developed by the National Security Agency, have revived calls for the U.S. government to release all vulnerabilities that it holds.  Proponents argue this will allow for the development of patches, which will in turn ensure networks are secure.  On the face of it, this argument might seem to make sense, but it is actually a gross oversimplification of the problem, would not have the desired effect, and would in fact be dangerous.

Yet it’s Ledgett who is oversimplifying. What most people engaging in the VEP debate have asked for, even before two worms built in part on tools stolen from NSA spread around the world, is some measure of sense and transparency in the process by which NSA reviews vulnerabilities for disclosure. Ledgett instead casts his opponents as absolutists demanding that everything be disclosed.

Ledgett then spends part of his column claiming that WannaCry targeted XP.

Users agree to buy the software “as is” and most software companies will attempt to patch vulnerabilities as they are discovered, unless the software has been made obsolete by the company, as was the case with Windows XP that WannaCry exploited.

[snip]

Customers who buy software should expect to have to patch it and update it to new versions periodically.

Except multiple reports said that XP wasn’t the problem; Windows 7 was. Ledgett’s mistake is all the more curious given reports that EternalBlue kept blue screening machines at NSA back when, while he was still at the agency, it was primarily aimed at XP. That is, Ledgett is one of the people who might have expected WannaCry to crash XP rather than spread on it; that he apparently doesn’t, when even I do, doesn’t say a lot for NSA’s oversight of its exploits.

Ledgett then goes on to claim that WannaCry was a failed ransomware attack, even though that’s not entirely clear.

At least he understands NotPetya better, noting that the NSA component of that worm was largely a shiny object.

In fact, the primary damage caused by Petya resulted from credential theft, not an exploit.

The most disturbing part of Ledgett’s column, however, is that it takes him a good eight (of nine total) paragraphs to get around to addressing what really has been the specific response to WannaCry and NotPetya, a response shared by people on both sides of the VEP debate: NSA needs to secure its shit.

Some have made the analogy that the alleged U.S. government loss of control of their software tools is tantamount to losing control of Tomahawk missile systems, with the systems in the hands of criminal groups threatening to use them.  While the analogy is vivid, it incorrectly places all the fault on the government.  A more accurate rendering would be a missile in which the software industry built the warhead (vulnerabilities in their products), their customers built the rocket motor (failing to upgrade and patch), and the ransomware is the guidance system.

We are almost a full year past the day Shadow Brokers first came on the scene, threatening to leak NSA’s tools. A recent CyberScoop article suggests that, while government investigators now have a profile they believe Shadow Brokers matches, they’re not even entirely sure whether they’re looking for a disgruntled former IC insider, a current employee, or a contractor.

The U.S. government’s counterintelligence investigation into the so-called Shadow Brokers group is currently focused on identifying a disgruntled, former U.S. intelligence community insider, multiple people familiar with the matter told CyberScoop.

[snip]

While investigators believe that a former insider is involved, the expansive probe also spans other possibilities, including the threat of a current intelligence community employee being connected to the mysterious group.

[snip]

It’s not clear if the former insider was once a contractor or in-house employee of the secretive agency. Two people familiar with the matter said the investigation “goes beyond” Harold Martin, the former Booz Allen Hamilton contractor who is currently facing charges for taking troves of classified material outside a secure environment.

At least some of Shadow Brokers’ tools were stolen after Edward Snowden walked out of NSA Hawaii with the crown jewels, at a time when Rick Ledgett, personally, was leading a leak investigation into NSA’s vulnerabilities. And yet, over three years after Snowden stole his documents, the Rick Ledgett-led NSA still had servers sitting unlocked in their racks, still hadn’t addressed its privileged user issues.

Rick Ledgett, the guy inventing straw man arguments about absolutist VEP demands, would do the country far more good if he talked about what NSA can do to lock down its shit, and explained why that shit didn’t get locked down when he was working on those issues specifically.

But he barely mentions that part of the response to WannaCry and NotPetya.

The Shadow Brokers Vulnerability Equities Process: NSA Has Had at Least 96 Days to Warn Microsoft about These Files

On January 8, Shadow Brokers announced an auction of Windows Warez, with lists of the exploits he/they had for sale (these two posts from Malware Jake provide analysis of them). Four days later, SB released a different set of Windows exploits, a more dated set that (SB claimed) Kaspersky Labs had had some visibility onto.  The Windows files released today include the ones offered for sale back in January, down to the version numbers. Compare, in particular, the touch, exploit, and payloads with this screencap. SB announced Fuzzbunch and DanderSpritz in January, too.

That’s a critical detail for the debate underway on Twitter and in chats about how shitty it was for SB to release these files on Good Friday, just before (or for those with generous vacation schedules, at the beginning of) a holiday weekend. While those trying to defend against the files and those trying to exploit them are racing against the clock and each other, it is not the case that the folks at NSA got no warning. NSA has had, at a minimum, 96 days of warning, knowing that SB could drop the files at any time.

The big question, of course, is whether NSA told Microsoft what the files targeted. If it did, Microsoft certainly had not fully responded to that warning, as hackers have already gotten a number of these files to work.

With WikiLeaks’s Vault 7 files, it’s at least possible the CIA doesn’t know precisely what got leaked to WikiLeaks, even though the government immediately identified when and how the files were breached. The NSA cannot make that claim here, at least not with the Windows files. SB was kind enough to provide warning. The question is what NSA did with that warning.

The fact that SB provided that warning, though, should have very serious ramifications for the Vulnerabilities Equities Process, under which the NSA is supposed to consider whether it is better to alert companies to exploits or to sit on them and use them. It’s one thing to decide NSA’s spying takes precedence over the security of the customers of big American companies. It’s another thing to keep those exploits in a way that makes them vulnerable to theft, as both CIA and NSA have done.

But it should be beyond question that when an intelligence agency gets a very detailed list of a group of exploits a malicious entity plans to release, the agency should warn the American companies affected.

Update: Microsoft told Sam Biddle they haven’t heard from any “individual or organization.”

A Microsoft spokesperson told The Intercept “We are reviewing the report and will take the necessary actions to protect our customers.” We asked Microsoft if the NSA at any point offered to provide information that would help protect Windows users from these attacks, given that the leak has been threatened since August 2016, to which they replied “our focus at this time is reviewing the current report.” The company later clarified that “At this time, other than reporters, no individual or organization has contacted us in relation to the materials released by Shadow Brokers.”

I think there’s actually some wiggle room in there. We shall see how long it takes MSFT to patch this stuff.

Update: MSFT released a statement saying all but three of these had been addressed, including three in their March update and another earlier this year, which would suggest NSA did warn them.