DOD Uses Sequester to Excuse 5 Year Delay in Implementing Basic Network Security

More than 22 months ago, I wrote a post analyzing Congressional testimony describing the gaping holes in DOD network security 3 years after a nasty malware infection and a year after the publication of Collateral Murder by WikiLeaks.

Almost two years later, Assistant Secretary of Defense Zachary Lemnios says sequestration might hold up improving network security on classified and unclassified networks.

Zachary J. Lemnios, the assistant secretary of defense for research and engineering, was asked by Sen. Rob Portman (R-Ohio) to describe the “most significant” impacts on cybersecurity that could follow from the anticipated cuts to the Pentagon’s budget.

Mr. Lemnios replied that “cuts under sequestration could hurt efforts to fight cyber threats, including […] improving the security of our classified Federal networks and addressing WikiLeaks.”

This is news not just for the specific details offered about how bad DOD’s network security remains (click through for more details), but also for the tacit admission that 3 years after a breach DOD considers tantamount to aiding the enemy, and 5 years after a malware infection that badly affected DOD’s networks in Iraq, DOD still hasn’t completed security enhancements to its networks.

3 replies
  1. P J Evans says:

    So what have they been doing for the last several years?

    I realize it takes time to get major projects off the ground, but they should have gotten this one going years ago.

  2. Stormcrow says:

    @P J Evans: What they’ve been doing about cybersecurity is what just about everybody does unless they are under the actual hammer: they’re sitting with their thumbs up their asses.

    I speak from extensive painful personal experience: I’m a CISSP who’s been working in this field professionally for the last 15 years.

    Over those 15 years, I’ve watched a number of my colleagues simply give up and quit, leaving the field for pastures which may not be all that much greener, but which aren’t completely covered in crap either.

    Doing cybersecurity even halfway right requires that the entire enterprise exercises collective and enforced discipline about matters like patching (not only the OS, but also the APPS – far harder), permissible content, and essential policy.

    That _also_ means that decisions must be taken, proactively, about the _level_ of control that’ll be exercised, and in what areas. This is tough and requires maturity, because too much security is as destructive as too little.

    Things like change control, for instance. You _must_ have this in place, and _enforced_, and changes must be reviewed by security personnel; otherwise some eejit will hang a database server out past the perimeter firewalls …. No, I’m not kidding; that’s not a hypothetical example.

    Your worst enemy is almost always your own senior management. They’re the ones who’ll contract out the web interface to your financial services company to some bunch of clowns three countries away, where you haven’t a legal leg to stand on and no way in hell of even requiring, let alone enforcing, secure coding standards. Why? Because they’re the low bidder, that’s why!

    And it’s the same sack of horseshit in government as it is in business. Just a different brand, political horseshit rather than the “free-market” flavor. Where your program stalls out because the jackasses in Congress are playing “Chicken” with agencies like the FAA rather than with automobiles.

    Result?

    Easy: every goddamned year for the last 15 years, the good guys have been falling further and further behind. In 2000, the question was “how do we prevent incidents?” In 2005, it was “how do we respond to incidents we know we’re going to have?” Now, it’s “what statistics do we use to _enumerate_ the incidents when they roll in by the hundreds?”

    If you doubt me, just spend a bit of quality time looking at the operational track record of antivirus systems.

    A dozen years ago, if you were diligent about the way you maintained your A/V and had intelligent perimeter controls, you could drive your infection rates as close to zero as made no difference. Not that most enterprises did this, of course.

    Today? If you don’t assume a certain proportion of your client workstations are infected right from the git-go, you’re a fool. But the odds your management will let you deal with the consequences of that are close enough to zero that you’d need an electron microscope to resolve them.
