Sheldon Whitehouse just attempted (after 1:44) to rebut an epic rant from John McCain (at 1:14) in which the Arizona Senator suggested anyone who wanted to amend the flawed Cybersecurity Information Sharing Act wasn’t serious about national security.
Whitehouse defended his two amendments first by pointing out that McCain likes and respects the national security credentials of both his co-sponsors (Lindsey Graham and Roy Blunt).
Then Whitehouse said, “I believe both of the bills [sic] have now been cleared by the US Chamber of Commerce, so they don’t have a business community objection.”
Perhaps John McCain would be better served turning himself purple (really! watch his rant!) attacking the very notion that the Chamber of Commerce gets pre-veto power over a bill that (according to John McCain) is utterly vital for national security.
Even better, maybe John McCain could turn himself purple suggesting that the Chamber needs to step up to the plate and accept real responsibility for making this country’s networks safer, rather than just using our cybersecurity problems as an opportunity to demand immunity for yet more business conduct.
If this thing is vital for national security — this particular bill is not, but McCain turned himself awfully purple — then the Chamber should just suck it up and meet the requirements to protect the country decided on by the elected representatives of this country.
Yet instead, the Chamber apparently gets to pre-clear a bill designed to spy on the Chamber’s customers.
Most outlets that commented on DHS’ response to Al Franken’s questions about CISA focused on its concerns about privacy.
The authorization to share cyber threat indicators and defensive measures with “any other entity or the Federal Government,” “notwithstanding any other provision of law” could sweep away important privacy protections, particularly the provisions in the Stored Communications Act limiting the disclosure of the content of electronic communications to the government by certain providers. (This concern is heightened by the expansive definitions of cyber threat indicators and defensive measures in the bill. Unlike the President’s proposal, the Senate bill includes “any other attribute of a cybersecurity threat” within its definition of cyber threat indicator and authorizes entities to employ defensive measures.)
To require sharing in “real time” and “not subject to any delay [or] modification” raises concerns relating to operational analysis and privacy.
First, it is important for the NCCIC to be able to apply a privacy scrub to incoming data, to ensure that personally identifiable information unrelated to a cyber threat has not been included. If DHS distributes information that is not scrubbed for privacy concerns, DHS would fail to mitigate and in fact would contribute to the compromise of personally identifiable information by spreading it further. While DHS aims to conduct a privacy scrub quickly so that data can be shared in close to real time, the language as currently written would complicate efforts to do so. DHS needs to apply business rules, workflows and data labeling (potentially masking data depending on the receiver) to avoid this problem.
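The kind of scrub DHS describes, applying whitelisting rules and masking to incoming records before they are re-shared, is conceptually simple. A minimal sketch (all field names, the whitelist, and the sample record are hypothetical, invented for illustration):

```python
import re

# Hypothetical whitelist of fields considered threat-relevant;
# anything else (e.g. an employee's email address) is dropped.
THREAT_FIELDS = {"malware_hash", "c2_domain", "attack_vector", "notes"}
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def scrub_indicator(record: dict) -> dict:
    """Return a copy of `record` limited to whitelisted fields,
    with email addresses masked inside free-text values."""
    scrubbed = {}
    for key, value in record.items():
        if key not in THREAT_FIELDS:
            continue  # drop fields not on the sharing whitelist
        if isinstance(value, str):
            value = EMAIL_RE.sub("[REDACTED-EMAIL]", value)
        scrubbed[key] = value
    return scrubbed

raw = {
    "malware_hash": "d41d8cd98f00b204e9800998ecf8427e",
    "c2_domain": "evil.example.com",
    "employee_email": "jane.doe@company.example",  # PII: dropped entirely
    "notes": "First reported by jane.doe@company.example on 6/1",
}
print(scrub_indicator(raw))
```

Even this toy version makes DHS’ point: the scrub is a processing step that takes time, which is exactly what a mandate to share “in real time” and “not subject to any delay [or] modification” would prohibit.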
None of those outlets noted that DOJ’s Inspector General cited privacy concerns among the reasons why private sector partners are reluctant to share data with FBI.
So the limited privacy protections in CISA are actually a real problem with it — one that the changes in a manager’s amendment (the most significant being a limit on uses of that data to cyber crimes rather than the broad range of felonies currently in the bill) don’t entirely address.
But I think this part of DHS’ response is far more important to the immediate debate.
Finally the 90-day timeline for DHS’s deployment of a process and capability to receive cyber threat indicators is too ambitious, in light of the need to fully evaluate the requirements pertaining to that capability once legislation passes and build and deploy the technology. At a minimum, the timeframe should be doubled to 180 days.
DHS says the bill is overly optimistic about how quickly a new cybersharing infrastructure can be put in place. I’m sympathetic with their complaint, too. After all, if it takes NSA 6 months to set up an info-sharing infrastructure for the new phone dragnet created by USA Freedom Act, why do we think DHS can do the reverse in half the time?
Especially when you consider DHS’ concerns about the complexity added because CISA permits private sector entities to share with any of a number of government agencies.
Equally important, if cyber threat indicators are distributed amongst multiple agencies rather than initially provided through one entity, the complexity–for both government and businesses–and inefficiency of any information sharing program will markedly increase; developing a single, comprehensive picture of the range of cyber threats faced daily will become more difficult. This will limit the ability of DHS to connect the dots and proactively recognize emerging risks and help private and public organizations implement effective mitigations to reduce the likelihood of damaging incidents.
DHS recommends limiting the provision in the Cybersecurity Information Sharing Act regarding authorization to share information, notwithstanding any other provision of law, to sharing through the DHS capability housed in the NCCIC.
Admittedly, some of this might be attributed to bureaucratic turf wars — albeit turf wars that those who’d prefer DHS do a privacy scrub before FBI or NSA get the data ought to support. But DHS is also making a point about building complexity into a new data sharing portal that recreates an existing one with less complexity (and with some anonymizing and minimization that might be lost under the new system). That complexity is going to make the whole thing less secure, just as we’re coming to grips with how insecure government networks are. It’s not at all clear why a new portal needs to be created: one that is more complex, that puts agencies like the Department of Energy (which is cybersprinting backwards on its own security) at the front end of that complexity, and that lacks some safeguards built into DHS’ current portal.
More importantly, that complexity, that recreation of something that already exists — that’s going to take six months of DHS’s time, when it should instead be focusing on shoring up government security in the wake of the OPM hack.
Until such time as Congress wants to give the agencies unlimited resources to focus on cyberdefense, they will face limited resources, and with those limited resources some real choices about what the top priority should be. And while DHS didn’t say it, it sure seems to me that CISA would require reinventing some wheels, and making them more complex along the way, at a time when DHS (and everyone in government focused on cybersecurity) has better things to be doing.
Congress is already cranky that the Administration took two months to complete a “cybersprint” that became a middle distance run in the wake of the OPM hack. Why are they demanding DHS spend 6 more months recreating wheels before fixing core vulnerabilities?
Way down in the second-to-last paragraph of this NYT piece claiming the US will retaliate against China for the OPM hack, national security reporter David Sanger makes this claim about the hack, citing experts affiliated with an agency that aspires to “Collect it all.”
Instead, the goal was espionage, on a scale that no one imagined before.
He follows it — he ends the entire article — with uncritical citation of this statement from a senior intelligence official.
“This is one of those cases where you have to ask, ‘Does the size of the operation change the nature of it?’ ” one senior intelligence official said. “Clearly, it does.”
Several paragraphs earlier, the reporter who did a lot of the most important work exposing the first-of-its-kind Stuxnet attack makes this claim. (NYLibertarian noted this earlier today.)
The United States has been cautious about using cyberweapons or even discussing it.
In other words, built into this story, written by a person who knows better, is a fiction about the US’ own aggressive spying and cyberwar. Sanger even suggests that the sensors we’ve got buried in Chinese networks exist solely to warn of attacks, and not to collect information just like that which China stole from OPM.
So if someone creating either a willful or lazy fiction also says this …
That does not mean a response will happen anytime soon — or be obvious when it does. The White House could determine that the downsides of any meaningful, yet proportionate, retaliation outweigh the benefits, or will lead to retaliation on American firms or individuals doing work in China. President Obama, clearly seeking leverage, has asked his staff to come up with a more creative set of responses.
… We’d do well to ask whether this is nothing more than propaganda, an effort to dissipate calls for a more aggressive response from Congress and others.
There is, however, one other underlying potential tension here. Yesterday, Aram Roston explained why some folks who work at NSA may be even more dissatisfied than they were when a contractor exposed their secrets for the world to see.
Employees at the National Security Agency complain that the director, Adm. Michael Rogers, is neglecting the intelligence agency in favor of his other job, running the military’s Cyber Command, three sources with deep knowledge of the NSA have told BuzzFeed News.
“He’s spending all his time at CYBERCOM,” one NSA insider said. “Morale is bad because of a lack of leadership.” A second source, who is close to the agency, agreed that employees are complaining that Rogers doesn’t seem to focus on leading the agency. A third said “there is that vibe going on. But I don’t know if it’s true.”
[O]ne of the NSA sources said Rogers appears to be focusing on CYBERCOM not just because the new organization is growing rapidly but also because it has a more direct mission and simpler military structure than the complex and scandal-ridden NSA in its post-Snowden era. That makes focusing on CYBERCOM easier, that source said, “than trying to redesign the National Security Agency.”
If true (note one of Roston’s sources suggests it may not be), it suggests one of the most important advisors on the issue of how to respond to China’s pwning of the US is institutionally limiting his focus to his offensive role, not his information collection (to say nothing of defensive) role. So if Roston’s sources are correct, we are in a very dangerous position, having a guy who is neglecting other potential options drive the discussion about how to respond to the OPM hack.
And there’s one detail in Sanger’s story that suggests Roston’s sources may be right — where Rogers describes “creating costs” for China, but those costs consist of an escalation of what is, in fact, a two-sided intelligence bonanza.
Admiral Rogers stressed the need for “creating costs” for attackers responsible for the intrusion,
Those of us without the weapons Rogers has at his disposal think of other ways of “creating costs”: raising the costs on the front end, to make spies adopt a more targeted approach to their spying. Those methods, too, might be worth considering in this situation. If we’re going to brainstorm about how to deal with the new scenario where both of the world’s major powers have adopted a bulk collection approach, maybe the entire world would be safer thinking outside the offensive weapon box?
Earlier this week, I noted that of the seven agencies that would automatically get cybersecurity data shared under the Cyber Information Sharing Act, several had similar or even worse cyberpreparedness than the Office of Personnel Management, from which China stole entire databases of information on our cleared personnel.
To make that argument, I used data from the FISMA report released in February. Since then — or rather, since the revelation of the OPM hack — the Administration has been pushing a “30 day sprint” to try to close the gaping holes in our security.
And there have been significant results (though note, the 30 day sprint turned into a 60 day middle distance run), particularly from OPM, Interior (which hosted OPM’s databases), and — two of those CISA data sharing agencies — DHS and Treasury.
Whoa! Check out that spike! Congratulations to those who worked hard to make this improvement.
But when you look at the underlying data, things aren’t so rosy.
We are apparently supposed to be thrilled that DOD now requires strong authentication for 58% of its privileged users (people like Edward Snowden), up 20% from the earlier 38%. Far more of DOD’s unprivileged users (people like Chelsea Manning?) — 83% — are required to use strong authentication, but that number declined from a previous 88%.
More remarkable, however, is that during a 30 day sprint (which became a 60 day middle distance run) to plug major holes, the Department of Energy also backslid, with strong authentication going from 34% to 11%. Admittedly, more of DoE’s privileged users must use strong authentication, but only 13% total.
DOJ (at least FBI, and probably through the FBI other parts of DOJ, will receive this CISA information) also backslid overall, though with a huge improvement for privileged users. And Commerce (another CISA recipient agency) also had a small regression for privileged users.
There may be explanations for this, such as that someone is being moved from a less effective two-factor program to a better one.
But it does trouble me that an agency as central to our national security as Department of Energy is regressing even during a period of concerted focus.
DOJ’s Inspector General just released a report on how well FBI’s cybersecurity initiative has been going. In general, it finds that the FBI has improved its ability to investigate cyberattacks.
But among the most significant challenges facing the FBI is in two-way information sharing with the private sector.
You might think that the Cyber Information Sharing Act — which after all, aims to increase information sharing between the private sector and those in government who will investigate it — would help that.
On one count it would: private sector entities interviewed by the IG were reluctant to cooperate with the FBI because of FOIA concerns.
During our interviews with private sector individuals, we found that private sector entities are reluctant to share information, such as PII or sensitive or proprietary information, with the government because of concerns about how that information could be used or the possibility that it could be publicly released under the Freedom of Information Act (FOIA).26 One private sector professional told us that he had declined to be interviewed by the OIG due to FOIA concerns.
CISA would include a blanket exemption from FOIA — which is not necessarily a good thing, but should placate those who have these concerns.
But other private sector entities expressed concerns about the multiple uses to which shared data would be put. They cited Snowden disclosures showing data might be used for other purposes.
In addition, several private sector individuals discussed with us the challenges in collaborating with the FBI in a “post-Snowden” era. One private sector individual emphasized that Snowden has redefined how the private sector shares information with the United States government. We were told by private industry representatives and the FBI that, following the Snowden disclosures, private sector entities have become more reluctant to share information with the United States government because they are uncertain as to how the information they provide will be used and are concerned about balancing national security and individual privacy interests.
The recent reports on the use of cyber signatures for upstream Section 702 collection show that the NSA and FBI might be able to use signatures to search all traffic (though I suspect FISC has put more limitations on this practice than is currently known).
Just as troubling, however, are the broad permissions under CISA to use the data turned over under the law for prosecutions on a range of crimes. Right now, ECPA has provided tech companies — at least the ones that pushed back on NSLs demanding Internet data — a way to protect their customers from fishing expeditions. CISA is voluntary (though I can imagine many ways pressure would be brought to participate), but it does undermine that system of protections for customers.
When commenting on this, Jim Comey apparently added proprietary information to the concerns of providers, along with the explicitly described “guard[ing] customer data.”
The FBI Director has acknowledged private sector concerns related to proprietary information and the need to guard customer data and stated the FBI will do what it can to protect private sector privacy.27
Given NSA’s voracious use of any information it gets its hands on, and the broad permissions for information sharing in the bill, the protections for trade secrets may not be enough for the private sector, since it’s now clear the government, not just competitors, is exploiting trade secrets.
The IG ends this section urging the FBI to provide “appropriate assurances” about its handling of Personally Identifiable Information.
More generally, efforts to detect, prevent, and mitigate threats are hampered because neither the public nor private sector can see the whole picture.
The FBI Director further explained government lacks visibility into the many private networks maintained by companies in the United States, and the FBI “has information it cannot always share [with the private sector].” Consequently, each can see distinct types of cyber threats, but the information is not always visible to the other. We believe that the FBI should strengthen its outreach efforts to provide appropriate assurances regarding its handling of PII and proprietary information received from the private sector and work to reduce classification, where appropriate, of information in its possession in order to improve sharing and collaboration in both directions consistent with appropriate privacy and other limitations.
It is just my opinion, but I suspect CISA, as written, would further exacerbate concerns.
Finally, Inspector General Michael Horowitz’ statement releasing this report includes something not developed in the report itself, perhaps because it is a more recent concern: security of data shared with the federal government.
And, the FBI continues to face challenges relating to information sharing with private sector entities, in part because of concerns in the private sector about privacy and the security of sensitive information it shares with the government.
I’d be very interested in whether this stems just from trade secret concerns or from the concern that several of the agencies that would automatically get data shared with the government have their own cybersecurity challenges.
Bloomberg reports that the same people who hacked OPM then went on to target United, which does a lot of business with the government (and, though the story doesn’t say it, a lot of flights to China).
United, the world’s second-largest airline, detected an incursion into its computer systems in May or early June, said several people familiar with the probe. According to three of these people, investigators working with the carrier have linked the attack to a group of China-backed hackers they say are behind several other large heists — including the theft of security-clearance records from the U.S. Office of Personnel Management and medical data from health insurer Anthem Inc.
The timing of the United breach also raises questions about whether it’s linked to computer faults that stranded thousands of the airline’s passengers in two incidents over the past couple of months. Two additional people close to the probe, who like the others asked not to be identified when discussing the investigation, say the carrier has found no connection between the hack and a July 8 systems failure that halted flights for two hours. They didn’t rule out a possible, tangential connection to an outage on June 2.
But what I find most interesting is that OPM developed a list of potential victims, including United, and alerted them of the signatures related to the hack.
The China-backed hackers that cybersecurity experts have linked to that attack have embedded the name of targets in web domains, phishing e-mails and other attack infrastructure, according to one of the people familiar with the investigation.
In May, the OPM investigators began drawing up a list of possible victims in the private sector and provided the companies with digital signatures that would indicate their systems had been breached. United Airlines was on that list.
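What United could do with those signatures is straightforward: scan its own logs for the attacker-registered infrastructure. A minimal sketch (the indicator domains, log format, and log lines are all invented for illustration, not taken from the actual investigation):

```python
# Hypothetical indicators of compromise of the kind OPM's investigators
# reportedly distributed: domains with target names embedded in them.
IOC_DOMAINS = {"login-united.example.net", "oprn-portal.example.org"}

def find_hits(log_lines):
    """Return outbound-connection log lines whose destination domain
    matches a known indicator of compromise."""
    hits = []
    for line in log_lines:
        # assume each line ends with "dst=<domain>"
        domain = line.rsplit("dst=", 1)[-1].strip()
        if domain in IOC_DOMAINS:
            hits.append(line)
    return hits

logs = [
    "2015-05-02T10:11:02 src=10.0.4.7 dst=cdn.example.com",
    "2015-05-02T10:11:09 src=10.0.4.7 dst=login-united.example.net",
]
print(find_hits(logs))
```

The point is that this kind of signature sharing already happens today, without any new statutory framework.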
That’s interesting for two reasons. First, OPM alerted United before it alerted even the less exposed OPM victims, those whose personnel data got stolen; OPM has yet to formally alert those whose security clearance data got taken. I get that you might want to alert additional targets before confirming publicly you know about the hack (potentially to learn more about the perpetrators).
But it also shows that data sharing — alleged to be the urgent need calling for CISA — is not a problem.
In the wake of the OPM hack, Congress is preparing to do something!!! Unfortunately, that “something” will be to pass the Cyber Information Sharing Act, which not only wouldn’t have helped prevent the OPM hack, but comes with its own problems.
To understand why it is such a bad idea to pass CISA just to appear to be doing something in response to OPM, compare this table from this year’s Federal Information Security Management report with the list of agencies that will automatically get the data turned over to the Federal government if CISA passes.
(A) The Department of Commerce.
(B) The Department of Defense.
(C) The Department of Energy.
(D) The Department of Homeland Security.
(E) The Department of Justice.
(F) The Department of the Treasury.
(G) The Office of the Director of National Intelligence.
So not only will information automatically go to DOJ, DHS, and DOD — all of which fulfill the information security measures reviewed by Office of Management and Budget — but it would also go to Department of Energy, which scores just a few points better than OPM, Department of Commerce, which was improving but lost some IT people and so couldn’t be graded last year, and Department of Treasury, which scores worse than OPM.
Which is just one of the reasons why CISA is a stupid idea.
Some folks have put together this really cool tool that will help you fax the Senate (a tool they might understand) so you can explain how dumb passing CISA would be. Try it!
Over at Lawfare, Ben Wittes does some brainstorming about what other databases the Chinese may be hacking after ingesting all their OPM winnings. For each database he identifies, after describing why it might be a juicy target, he ends with this statement:
Fortunately, the [XXX] Department is a highly competent counterintelligence agency with first-rate cybersecurity expertise, whose employees are scrupulous about cybersecurity and never do business on their own email servers. I am sure it is fully competent to protect these records.
As it happens, there’s plenty of support for most of Wittes’ speculative targets, especially if you consult this year’s FISMA report from OMB.
Several of the agencies — especially the State Department, but also Commerce — rated very poorly in OMB’s summary of the Inspectors General’s reviews from last year.
I’d add two agencies to Wittes’ list: USDA (China has allegedly been stealing seed corn, so why not Ag records?) and Treasury generally (though in some other areas Treasury is pretty good, and it has mostly been “hacked” via old style means — including PII “spillage” — of late).
This list is particularly notable, however, given that the debate over CISA is about to start again. Both Treasury and Commerce are among the agencies that get automatic updates of the data turned over under the law. But their security is, in some ways, even worse than OPM’s.
Update: Paul Rosenzweig takes a shot. He picks CFIUS, NRC, FERC, state license DBs, and university research. There is some correlation with weak agencies there, too.
I realized something the other day.
For the purposes of hacking, a theater (or at least any mall it was attached to) might count as critical infrastructure, which would make it a national security target, just as Sony Pictures was deemed critical infrastructure for sanction and retaliation purposes after it got hacked.
But if a mentally ill misogynist with a public track record of supporting right wing hate shoots up a movie showing, it would not be considered a national security target. Given his death, DOJ won’t be faced with the challenge of naming John Russell Houser’s crime, but they would have even less ability to punish Houser for his motivation and ties to other haters than they had with Dylann Roof.
DOJ had no such problem with Joseph Buddenberg and Nicole Kissane, who got charged with terrorism (under the Animal Enterprise Terrorism Act) yesterday because they freed some minks. And a bobcat.
So shooting African Americans worshipping in church is not terrorism, but freeing a bobcat is.
Meanwhile, most of the 204 mass shootings — averaging one a day — that happened this year have passed unremarked.
I laid out some of the problems with the disparity between Muslim terrorism and white supremacist terrorism (to say nothing of bobcat-freeing “terrorism”) the other day.
“This should in no way signify that this particular murder or any federal crime is of any lesser significance” [than terrorism], Loretta Lynch claimed while announcing the Hate Crime charges against Roof.
Except it is, by all appearances.
When asked, Lynch refused to comment on how DOJ is allocating resources, but reporting on the increase in terrorism analysts since 9/11 suggests the FBI has dedicated large amounts of new resources to fighting Islamic terrorism, domestically and abroad. In addition, there are a number of spying tools that are tied solely to international terrorism — but DOJ has managed to define, in secret, domestic terrorism espoused by Muslims in the U.S. as international terrorism. That means FBI has far more tools to dedicate to finding tweets posted by Muslims, and fewer to find the manifesto Roof wrote speaking of having ”the bravery to take it to the real world” against blacks and even Jews.
Perhaps most importantly, because of vastly expanded post-9/11 information sharing, local law enforcement offices have been deputized in the hunt for Muslim terrorists, receiving intelligence obtained through those additional spying tools and sharing tips back up with the FBI. By contrast, as one after another confrontation makes clear — most recently the video of a white Texas trooper escalating a traffic stop with African American woman Sandra Bland that ultimately ended in her death, purportedly by suicide — too many white local cops tend to prey on African Americans themselves rather than police those who target African Americans for their race.
Finally, the FBI has an incentive to call Roof’s attack something different, as it makes a big deal of its success in preventing “terrorist” attacks. If the Charleston attack was terrorism, it means FBI missed a terrorist plotting while tracking a bunch of Muslims who might not have acted without FBI incitement. That would be all the worse as the FBI might have stopped Roof during the background check conducted before he bought the murder weapon, if not for some confusion on a prior charge.
I’m certainly not saying we should expand the already over-broad domestic dragnet to include white supremacists espousing ugly speech (but neither should hateful speech from Muslims be sufficient for a material support for terrorism charge, as it currently is). Yet as one after another white cop kills or leads to the death of unarmed African Americans, we have to ensure that we call like crimes by like names to emphasize the importance of protecting all Americans. DOJ under Eric Holder was superb at policing civil rights violations, and there’s no reason to believe that will change under DOJ’s second African American Attorney General, Loretta Lynch.
But hate crimes brought with the assistance of DOJ’s Civil Rights division (as these were) are not the same as terrorist crimes brought by national security prosecutors, nor are they as easy to prosecute. If our nation can’t keep African Americans worshipping in church safe, then we’re not delivering national security.
But I’d add to that. If we’re discussing mass killings with guns (remember, earlier this year Richard Burr tried to include commission of a violent crime while in possession of a gun among the definitions of terrorism) then it suggests far different solutions than just calling terrorism terrorism.
What if we focused all our energy on interceding before crazy men — of all sorts — shoot up public spaces rather than just one select group?
What if our definitions of national security started with a measure of impact rather than a picture of global threat?
One of the more interesting comments at the Aspen Security Forum (one that has, as far as I’ve seen, gone unreported) came on Friday when Michael Chertoff was asked about whether the government should be able to require back doors. He provided this response (his response starts at 16:26).
I think that it’s a mistake to require companies that are making hardware and software to build a duplicate key or a back door even if you hedge it with the notion that there’s going to be a court order. And I say that for a number of reasons and I’ve given it quite a bit of thought and I’m working with some companies in this area too.
First of all, there is, when you do require a duplicate key or some other form of back door, there is an increased risk and increased vulnerability. You can manage that to some extent. But it does prevent you from certain kinds of encryption. So you’re basically making things less secure for ordinary people.
The second thing is that the really bad people are going to find apps and tools that are going to allow them to encrypt everything without a back door. These apps are multiplying all the time. The idea that you’re going to be able to stop this, particularly given the global environment, I think is a pipe dream. So what would wind up happening is people who are legitimate actors will be taking somewhat less secure communications and the bad guys will still not be able to be decrypted.
The third thing is that what are we going to tell other countries? When other countries say great, we want to have a duplicate key too, with Beijing or in Moscow or someplace else? The companies are not going to have a principled basis to refuse to do that. So that’s going to be a strategic problem for us.
Finally, I guess I have a couple of overarching comments. One is we do not historically organize our society to make it maximally easy for law enforcement, even with court orders, to get information. We often make trade-offs and we make it more difficult. If that were not the case then why wouldn’t the government simply say all of these [takes out phone] have to be configured so they’re constantly recording everything that we say and do and then when you get a court order it gets turned over and we wind up convicting ourselves. So I don’t think socially we do that.
And I also think that experience shows we’re not quite as dark, sometimes, as we fear we are. In the 90s there was a deb — when encryption first became a big deal — debate about a Clipper Chip that would be embedded in devices or whatever your communications equipment was to allow court ordered interception. Congress ultimately and the President did not agree to that. And, from talking to people in the community afterwards, you know what? We collected more than ever. We found ways to deal with that issue.
So it’s a little bit of a long-winded answer. But I think on this one, strategically, we, requiring people to build a vulnerability may be a strategic mistake.
These are, of course, all the same answers opponents to back doors always offer (and Chertoff has made some of them before). But Chertoff’s answer is notable both because it is so succinct and because of who he is: a long-time prosecutor, judge, and both Criminal Division Chief at DOJ and Secretary of Homeland Security. Through much of that career, Chertoff has been the close colleague of FBI Director Jim Comey, the guy pushing back doors now.
It’s possible he’s saying this now because as a contractor he’s being paid to voice the opinions of the tech industry; as he noted, he’s working with some companies on this issue. Nevertheless, it’s not just hippies and hackers making these arguments. It’s also someone who, for most of his career, pursued and prosecuted the same kinds of people that Jim Comey is today.
Update: Chertoff makes substantially the same argument in a WaPo op-ed also bylined by Mike McConnell and William Lynn.