Stuxnet: A Way to Nuke Iran without Using a Bomb?

Last week, Russian Ambassador to NATO, Dmitry Rogozin, told the organization that the computer worm Israel and the US devised to ruin Iran’s nuclear program could have led to a Chernobyl-like catastrophe at the Bushehr nuclear plant.

Russia said on Wednesday that NATO should investigate last year’s computer virus attack on a Russian-built nuclear reactor in Iran, saying the incident could have triggered a nuclear disaster on the scale of Chernobyl.

[snip]

“This virus, which is very toxic, very dangerous, could have very serious implications,” he said, describing the virus’s impact as being like explosive mines.

“These ‘mines’ could lead to a new Chernobyl,” he said, referring to the 1986 nuclear accident at a plant in Ukraine, then part of the Soviet Union. “NATO should get to investigating the matter… This is not a private topic.”

At first, it seemed like the risk for such a disaster had passed. But the AP has gotten a foreign intelligence report stating that the risk of such a catastrophe remains.

… such conclusions were premature and based on the “casual assessment” of Russian and Iran scientists at Bushehr.

With control systems disabled by the virus, the reactor would have the force of a “small nuclear bomb,” it says.

Which would be rather “neat,” don’t you think? If the US and Israel were to collaborate to pioneer cyberwarfare to effectively set off an explosion equivalent to that of a nuclear bomb, all without having to drop the bomb themselves? (The Bushehr reactor is apparently just 12 km outside of the city of Bushehr, Iran’s chief seaport.)

Richard Clarke provides an explanation (assuming this was not an intentional potential side effect of the US-Israeli plot) for why Stuxnet may still be a risk, in Iran and elsewhere.

Second, the cyber agent Stuxnet was captured and successfully interrogated. That was not supposed to happen. The attack program had built in to it all sorts of collateral damage controls, including instructions to kill itself after a date certain in 2009. Those controls, most unusual in the world of hackers but common in certain countries covert action programs, failed apparently because the weapon’s designers took the collateral damage controls less seriously than they did the ingenious attack. For a hacker, attacking is always more interesting than pleasing the lawyers. Thus, after laying low the Iranian nuclear enrichment centrifuges at Natanz, the worm made its way from that plant’s supposedly isolated, internal computer network to freedom in cyberspace. Thousands of other computers in Iran were infected, as were many in countries such as Pakistan, India, Indonesia, and even a few in the United States.

[snip]

The problem lies in the fact that the worm ran freely through cyberspace and lots of people caught a copy. One can be sure that highly skilled hackers in several countries are even now taking it apart, modifying it, and getting it ready to destroy some other target. They are benefiting from free access to the most sophisticated computer attack weapon ever created. That would not be such a problem except for the fact that the thousands of computer networks that run our economy are essentially defenseless against sophisticated computer attacks.

That is, the Israeli and American hackers behind this cyberattack were no more competent than (or perhaps, just as incompetent as) the spooks that gave Iran nuclear blueprints 11 years ago.

And meanwhile, DOD won’t tell Congress about its cyberwar operations, presumably up to and including Stuxnet.

I guess maybe they’re just crossing their fingers and hoping none of the easily predicted unintended consequences come to pass?

Marcy has been blogging full time since 2007. She’s known for her live-blogging of the Scooter Libby trial, her discovery of the number of times Khalid Sheikh Mohammed was waterboarded, and generally for her weedy analysis of document dumps.

Marcy Wheeler is an independent journalist writing about national security and civil liberties. She writes as emptywheel at her eponymous blog, publishes at outlets including the Guardian, Salon, and the Progressive, and appears frequently on television and radio. She is the author of Anatomy of Deceit, a primer on the CIA leak investigation, and liveblogged the Scooter Libby trial.

Marcy has a PhD from the University of Michigan, where she researched the “feuilleton,” a short conversational newspaper form that has proven important in times of heightened censorship. Before and after her time in academics, Marcy provided documentation consulting for corporations in the auto, tech, and energy industries. She lives with her spouse and dog in Grand Rapids, MI.

  1. BoxTurtle says:

    If the Russians are saying the original Stuxnet could have caused Bushehr to explode, I call bullshit. That virus was directed specifically at the PLCs that ran Iran’s centrifuges. Only that specific brand of controller was affected. Unless the Russians are saying that the same PLC was used at the reactor, the risk is zero.

    A hacker MIGHT be able to change that, if they had access to a decent test platform. That’s a strong 5-figure investment, even if all you get is the PLCs. But realistically, you’d need government-level resources to turn that from a centrifuge virus into a reactor virus.

    However, I’m pretty sure the original authors could change it to a reactor virus if they wanted to do so.

    Remember Iran has several copies of the virus and government level resources. And they’d sure be motivated to “upgrade” Israel’s reactor if they could.

    Boxturtle (And there are probably some pissed engineers with that as their personal goal)

    • Disgusteddan says:

      Ah, but one can never forget the unintended consequences. They targeted a specific brand of PLC (Siemens) whose flaws were disclosed by Siemens engineers. So much investment goes into using PLCs (i.e. software and training) that you usually don’t change brands. This virus was designed to attack Siemens products, so any operation using a Siemens PLC could be vulnerable to attack. Any PLC in the reactor cycle could fail, creating a Chernobyl.

      Now that copies of the virus have been captured, I would think the Siemens PLC brand in any application would be susceptible to attack and therefore rendered obsolete.

    • Synoia says:

      if they had access to a decent test platform…

      I wrote one of those…It was hard…Not five figures for a customer to buy though.

      And it could simulate 1,000 other “individual machines” on one PC.

      • BoxTurtle says:

        You have to buy the hardware as well. You can simulate the PLC in software, but what will that REALLY do to the hardware?

        However, hackers typically let others test their code for them. They’ll simply release it into the wild and fix the bugs others find.

        Boxturtle (Never do work you can get others to do for you)

        • Synoia says:

          You don’t need more than one piece of hardware, then you can simulate the PLCs actions.

          One PLC and One centrifuge provide calibration for the simulator (model).

          Then the simulator is replicable, because each centrifuge is a separate, independent piece of machinery.
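          The calibrate-once, replicate-many idea can be sketched in a few lines. This is a hypothetical illustration only (the model, gain constant, and speeds are invented): one simple model stands in for the real PLC/centrifuge pair, and software copies of it stand in for the rest of the cascade.

```python
# Hypothetical sketch: one centrifuge model calibrated against real hardware,
# then replicated in software to simulate a whole cascade on a single PC.
from dataclasses import dataclass, replace

@dataclass
class CentrifugeModel:
    """First-order spin-up model; 'gain' is the one calibrated constant."""
    rpm: float = 0.0
    target_rpm: float = 0.0
    gain: float = 0.1  # measured from the single real PLC/centrifuge pair

    def step(self) -> float:
        # Each tick, close a fraction of the gap to the commanded speed.
        self.rpm += self.gain * (self.target_rpm - self.rpm)
        return self.rpm

def replicate(calibrated: CentrifugeModel, n: int) -> list[CentrifugeModel]:
    # Because each centrifuge is an independent machine, copies of one
    # calibrated model are a valid stand-in for the whole cascade.
    return [replace(calibrated) for _ in range(n)]

cascade = replicate(CentrifugeModel(gain=0.1), 1000)
for unit in cascade:
    unit.target_rpm = 63000.0  # illustrative commanded speed
for _ in range(200):
    for unit in cascade:
        unit.step()
```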

    • papau says:

      Spot on –

      it was just a speed-control screw-up worm that stopped the gas from being made into fuel, and was planned to screw up the electric turbine speed eventually.

      Why did our CIA use Russia to let the world know how smart they are – sounds like an attempt at a budget increase – a method which I doubt was approved by the White House or the intel committees. Our CIA once again does not give a damn about anything other than its budget and its ability to protect corporations.

    • kgb999 says:

      I think you are a little off base on this. The public (media) analysis just isn’t adding up. First, Stuxnet involved three distinct PLC exploits. The specific exploit to be deployed was selected based on identified hardware and specific values found in the code blocks. Two (A/B) were very similar and reasonably basic in operation; exploit “C” was a doozy. If I had to make my own unfounded guess, A/B went for the centrifuges and C was targeting the nuclear facility proper.
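      The selection mechanism kgb999 describes (pick exploit A/B or C based on identified hardware and values in the code blocks) amounts to a fingerprint dispatch. A minimal sketch, with the CPU model strings and magic values invented for illustration:

```python
# Hypothetical sketch of payload selection by target fingerprint: probe the
# connected hardware, then choose exploit A/B, C, or nothing at all.
# The model names and magic constants below are made up.

def select_payload(cpu_model: str, code_block_magic: int) -> str:
    """Pick which payload a Stuxnet-like dropper would deploy, if any."""
    if cpu_model in ("S7-315", "S7-317") and code_block_magic == 0xDEAD:
        return "A/B"  # the simpler pair, aimed at drive controllers
    if cpu_model == "S7-417" and code_block_magic == 0xBEEF:
        return "C"    # the complex payload
    return "none"     # unrecognized target: deploy nothing, stay dormant

variant = select_payload("S7-315", 0xDEAD)
```

      The "none" branch matters as much as the others: on unrecognized hardware the worm does nothing, which is what made it so precise.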

      One thing for sure, it doesn’t make any sense that damage to centrifuges at Natanz would push back the start up of Bushehr. Unless I missed something, Russia is providing the fuel for the reactor. If Iran doesn’t need the fuel, why would problems at their fuel production facility push back the reactor?

      Additionally, Iran has claimed this thing is taking control of power plants. Siemens provides a vast array of equipment for power plants (they claim 25% of the world’s power production is from the Siemens “Fleet”). It is far from given that deployment conditions for exploit “C” would be unique to one specific facility. A power plant seems more generic in every way compared to a centrifuge cascade. I imagine there are many functions in common between all power plants which may use very similar hardware and software configurations. While tight-lipped about the targets, Siemens indicates they’ve helped at least 25 major clients clean up infections, so this seems to impact facilities beyond Iran.

      There are also claims the virus has “mutated”. Computer code isn’t like genetics. If it’s mutating, there is a reason. Either mutation is built into the original code (which I haven’t seen mentioned previously) or someone is releasing updated versions. Considering the malware was designed to update itself from two command-and-control servers (which were also used to gather information from all compromised systems), I have a pretty strong opinion about what’s going on. With an actively mutating vector in the wild – apparently still seeking pay dirt – it seems a pretty lackadaisical security posture to assume the exploit finding its way to the targeted hardware would have benign results.

      As for the larger danger of Stuxnet in the wild; there are other things at risk beyond nuclear power plants. A functional proof-of-concept that targets embedded systems in the wild is huge. In the limited microcosm where a person desires to repurpose Stuxnet against a highly secure/secret system such as a nuclear facility it would be quite a challenge. But the need is for information, not millions of dollars worth of hardware. Any documented system is pretty much at risk now; the more common the system the higher the risk.

      I’m just waiting for every Lexmark in the world to start spitting out page after page of “All UR Prntrz R belonging to US!”

  2. WilliamOckham says:

    The Russians are pissed off and blowing smoke. Stuxnet was unbelievably precise in its attack on the SCADA stuff and unbelievably promiscuous in its attempt to spread (It contained 4 separate 0-day Windows exploits). Stuxnet was designed to embed itself inside of Iran’s nuclear centrifuges and cause some very specific trouble, but nothing like a nuclear explosion.

    The real problem with Stuxnet is that the techniques they used are now available to the public. Who knew you could do code injection on PLCs? (Almost nobody, really.) It would be impossible for anyone but the most sophisticated governments to replicate the kind of attack that Stuxnet pulled off (the coders had very precise intelligence about the physical layout of the Iranian installations), but now a really dedicated teenager (or old fart like me) could bring down power plants, rail systems, natural gas pipelines, etc. Solo, from the privacy of your nearest Starbucks.

    I’m amazed at the technical sophistication of this attack, but appalled by the sheer stupidity of launching it in this manner. If they had had one person on the ground inside the Iranian nuclear program, they could have avoided the Windows exploits and this thing probably would have never seen the light of day. Brilliant idiots.

    • Synoia says:

      Stuxnet V 1.0 was unbelievably precise in its attack on the SCADA stuff

      The cat is out of the bag. Incremental enhancements are simple.

  3. WilliamOckham says:

    Let me expand/correct my previous statement. Stuxnet had two attacks. The first was screwing with the centrifuges. The second, much more complex, was designed to screw with the steam turbine at Bushehr. The point was to put the reactor out of commission, not pull off a Chernobyl.

  4. WilliamOckham says:

    I’m not convinced that the thing was really supposed to off itself. The people who did this put in way too much code to ensure that it spread to have thought that it wouldn’t get out into the wild.

    The more I read about this thing, the more inexplicable it seems. These guys created a man-in-the-middle attack on a freaking PLC! Let me explain (or, to quote Inigo Montoya, “No, there is too much. Let me sum up”).

    This thing exploited 4 different completely new holes in the Windows OS. In a single virus. That’s pretty impressive, but it pales in comparison to what these folks did in the PLC code. They inserted code directly on the controller that recorded normal activity and then, when the time came, the code played back the normal activity while it instructed the machinery to do whatever they wanted it to do (just like in the movies where the bad guys put the security cameras on a loop). The stuff they made the equipment do was targeted very specifically at the Iranians, but that really doesn’t matter. Now, everybody in the world can look at this code and see how to pull off a MITM attack on a SCADA system.
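    The "security cameras on a loop" trick can be sketched in a few lines of Python. This is a hypothetical illustration of the record-then-replay pattern, not the worm's actual code: the interceptor sits between the sensor and the operator's display, learns what "normal" looks like, then feeds the recording back while the real readings go off-spec.

```python
# Hypothetical sketch of a record-then-replay man-in-the-middle: record
# normal sensor values, then loop them back to the monitoring side while
# the hardware is actually being driven somewhere else entirely.

class ReplayMITM:
    def __init__(self) -> None:
        self.recording: list[float] = []
        self.attacking = False
        self._replay_idx = 0

    def observe(self, true_reading: float) -> float:
        """Sits between the sensor and the operator's display."""
        if not self.attacking:
            self.recording.append(true_reading)  # learn what "normal" looks like
            return true_reading
        # Attack phase: loop the recording, like cameras on a loop.
        fake = self.recording[self._replay_idx % len(self.recording)]
        self._replay_idx += 1
        return fake

mitm = ReplayMITM()
normal = [1000.0, 1001.0, 999.0]
displayed_before = [mitm.observe(r) for r in normal]  # passthrough while recording
mitm.attacking = True
# Real readings are now wildly off-spec, but the display shows the recording.
displayed_during = [mitm.observe(r) for r in [1400.0, 1450.0, 1500.0]]
```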

    A few years ago, I spent some time thinking about how to take down SCADA systems (part of my job was to give advice on how to protect them) and I never imagined someone could pull off something like this. These Siemens controllers are installed all over the world in everybody’s infrastructure. Not to mention practically every factory in Germany. The Germans must be alternately livid and panicked.

  5. puppethead says:

    The one thing never mentioned when talking about Stuxnet is that it’s a Windows virus, and the control systems from Siemens run Windows.

    Fools run their critical systems on Windows, complete idiots control nuclear facilities (or US Navy warships) with it. There is no other OS that comes close to the inherent vulnerabilities Windows has, which haven’t been fixed in almost two decades and four completely new versions.

    • BoxTurtle says:

      I’m betting the Siemens brand is dead in the marketplace until they release patches.

      Hmm…it’s been speculated that the worm could not have been written without Siemens internal manuals. We could speculate that if Siemens had fixes ready quickly, they knew it was coming.

      I just checked their website. They had a crude patch ready fairly quickly, but still no real fix, and the crude patch can be easily bypassed. So I’m thinking that whoever did it snuck the manuals.

      Boxturtle (Speculation is fun!)

      • PeasantParty says:

        Speculation is fun. However, I’ve known EW on the blogs for years and I can tell you she is top notch at getting to the real story and facts.

        It won’t be long before we see more of this puzzle unravel, and thanks to you and others here who understand OSes and programming, we learn so much!

  6. Rayne says:

    Two things, EW — first, I think you’re missing a link to the source of Clarke’s comments, made at ABC News as an analyst.

    Secondly, did you see the hole in what Clarke said? Never mentions North Korea.

    It’s not like NK would use entirely different technology to develop their own program, given how much trading they’ve done with so many other entities which supplied Iran.

      • Synoia says:

        Yes, possible by changing some constants in the worm, maybe by writing a bit more code.

        The hard work is done. Now it is a platform for attacks.

      • Rayne says:

        I haven’t had the time to go and dig into Siemens PLCs; they are everywhere, and the technology in them is old and mature enough that in theory the code could impact a lot of different applications. Believe me, I’m even worried about manufacturing and health care applications here; I’ve seen Siemens content in a lot of different sites and am not at all certain how the developers would be absolutely certain they walled it off to specific models.

        Unless Siemens was in on this and built in the firewalls themselves, but then there would have been wholesale problems with upgrades in all manner of environments.

        Seems to me we are still missing at least one big piece in this puzzle.

      • Professor Foland says:

        Plutonium separation doesn’t use centrifuges.

        But every major industrial installation has gazillions of PLCs. And Stuxnet seems like a general way to compromise them. The techniques for an attack on the Siemens PLCs in a uranium enrichment plant will work for the Siemens PLCs in a plutonium separation facility. What you will have to do anew is understand which PLCs to compromise, and what you want them to be saying to the control software (which film loop to play, in WO’s analogy). In general, you’d have to go through that process anew even for another uranium facility.

        And, of course, if the plutonium installation has installed, say, Fedora Core instead of Windows, you’ll be starting pretty far behind.

        (Also, I second disgusteddan at 8 – there’s a huge amount of inertia associated with changing an existing PLC installation. I once had to change a set in a radiation-environment lab; the paperwork was endless. And knowing that it would be, we did everything we could first before finally getting new ones.)

  7. papau says:

    A programmable logic controller (PLC) is just a digital computer used for “hard real-time” control (the result must be available within a fixed maximum time).

    How a single worm now opens up the world to destruction is above my pay grade, I guess – it did seem to me to be about a single design for the set of PLCs for the gas reduction process. Perhaps we are anticipating some new post-grad course innovations and “Windows improvements”.

  8. Sinestar says:

    I read an article that said the Stuxnet worm had a lot of amateurish components, specifically in the way it polymorphed to conceal itself. That it was the same technique being used in the early ’90s.

    Since everyone is so eager to blame Israel and the U.S. for creating this worm, why doesn’t anyone consider that some very smart Iranians with a conscience knew that the nuke program was not for peaceful purposes and pulled an inside job? Maybe they didn’t want Iran’s mullahs to provoke Israel or the U.S. into bombing them into the stone age?

    Having four 0-days in a single bug is pretty neat, I guess, except that there are probably 40,000 more in an operating system (Windows) composed of hundreds of millions of lines of code.

    And code injection can be done on anything with a CPU that executes, so it is a well known technique, even if this was the first known use on a PLC. Siemens SCADA systems are simply Windows computers that are *supposed* to be on isolated networks.

    But anyway, who would know better how to sabotage the system than the people who work there everyday? I think it was the Iranians themselves due to some of the obvious flaws in the worm. The CIA or Israelis probably would have created a less clumsy effort.

    • Rayne says:

      Look, somebody had to insert this in Iran; the chances are good it was either an Israeli plant or as you suggest, a “smart Iranian with a conscience.”

      But do not think for a second that CIA or Israelis would do something less clumsy. If they could, they’re in the wrong line of business and could have done in Windows with a better replacement.

      • BoxTurtle says:

        It was spread by releasing it into the wild in areas where plant workers lived and waiting for the law of probability to take its course. Could have been planted via Facebook!

        Boxturtle (Credit where credit is due, this was a VERY skillful attack)

      • Sinestar says:

        The CIA would likely have contracted this out, so I am thinking they know who the really bad-ass programmers are. Same for the Israelis, but for all the novel things this worm did, the flaws suggest that the people who wrote it didn’t have access to the latest programming techniques – like, say, a country that has suffered a ban on encryption technology for around 30 years? See, if I recall from the article, it just used a variation of XOR encryption to conceal the code. Not a very sophisticated technique at all, really. And yes, they speculate it came in on a flash drive.
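        For readers unfamiliar with it, the XOR concealment Sinestar calls unsophisticated really is a one-liner. A minimal sketch (the payload bytes and key are invented): XOR with the same key both hides and recovers the data, and a short repeated key leaks patterns, which is why analysts strip it off quickly.

```python
# Single-byte XOR obfuscation, the classic weak concealment technique.
# The payload and key here are illustrative only.

def xor_obfuscate(data: bytes, key: int) -> bytes:
    # XOR is its own inverse: the same call encodes and decodes.
    return bytes(b ^ key for b in data)

payload = b"example payload bytes"
hidden = xor_obfuscate(payload, 0x42)     # conceal
recovered = xor_obfuscate(hidden, 0x42)   # same key reverses it
```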

        Since I am pretty damn sure the Iranians are going to crawl up every orifice of anyone working on their nuclear projects, an Israeli agent seems doubtful. And some bozo leaving a couple flash drives in the parking lot and some person with direct access to the SCADA systems just happening to pick one up and use it also seems like an equally lame explanation. Again, it all points to an inside job from where I sit. Besides, why would the CIA and Israelis be so dumb as to allow it to infect their own systems?

        Again, solid skills in some ways but very poor in others. There are just too many serious flaws in it as if someone didn’t have any way to QC it and I think a State Sponsor would have done that. I think also it would have taken someone with intimate knowledge to target key systems. Natanz wasn’t built off a blueprint you can buy online, I’d be willing to bet it is all custom work.

        • Rayne says:

          Besides, why would the CIA and Israelis be so dumb as to allow it to infect their own systems?

          It’s called Windows. It’s ubiquitous, and as has already been pointed out, there are numerous exploits which have still not been plugged after numerous versions and over a decade. That was the weak spot — they used an agent of infection that wasn’t exclusive to the site. They were stupid for ever thinking they could use Windows to deploy this.

          There are just too many serious flaws in it as if someone didn’t have any way to QC it and I think a State Sponsor would have done that.

          Spending a lot of time on QC means greater risk of exposure. And I don’t think there are enough non-state sponsors out there who could organize this AND deploy it.

          I’ll also circle back to Windows here for a moment: what if they did QC it, but only with regard to the effect of the virus on what they believed was a closed plant environment? What if they stupidly and naively failed to think out worst case scenarios in which the site and the targeted PLCs are networked? I can see that happening, especially with state agents.

          Why? Because virtually every bloody IT project the U.S. military has ordered over the last 15 years was either screwed up on scope or over budget, because the military and the IT vendors over-ask/over-promise and tack shit together to make an approximation of what was needed. (I used to work for a rather large IT vendor with numerous government contracts, so I know something about this area.) Why would this be an exception?

          • dakine01 says:

            What if they stupidly and naively failed to think out worst case scenarios in which the site and the targeted PLCs are networked? I can see that happening, especially with state agents.

            As someone who has many years in Software Quality, I can gay-roan-tee this happens all the time.

            • Rayne says:

              Heh. I was thinking of you when I wrote that, wondered if you’d reply.

              I used to rely heavily on a guy at the last plant I worked at; no degree, but his knowledge and skills were phenomenal. He could fix anything, from the plant security system to the furnace, and had a little cart, which he pushed everywhere, with all manner of bits and pieces, from PLC equipment to duct tape. This guy had more than a full-time job because of the chronic failure to model out worst-case scenarios in system design. It’s like a career niche, too.

              Probably would have been more effective and safer to find his Iranian equivalent at Bushehr, bribe him to mess with the PLCs and get him out — unless Bushehr was not the only objective.

          • Sinestar says:

            There was a kill timer programmed into the code. It was supposed to shut itself off. I agree the government seems pretty good at bungling IT projects from all reports, which is ironic on many levels. But a programmer not being able to handle a kill switch competently reeks of a very educated ‘hobbyist’.
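            The kill timer both Clarke and Sinestar refer to is, in principle, nothing more than a hard-coded date comparison checked before the code does anything. A minimal sketch (the cutoff date below is illustrative, not the worm's actual one):

```python
# A date-based kill switch: refuse to spread or act once a hard-coded
# cutoff has passed. The cutoff date here is invented for illustration.
from datetime import date

KILL_DATE = date(2009, 12, 31)  # hypothetical "date certain in 2009"

def should_run(today: date) -> bool:
    """Gate every action on the cutoff; after it, do nothing at all."""
    return today < KILL_DATE
```

            That the concept is this simple is exactly Sinestar's point: getting it wrong suggests the control was an afterthought, not that it was hard.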

            I pointed out earlier that having 4 zero-day exploits is sort of impressive until you consider there are probably 40,000 more in Windows 2000 and/or XP. And again I point out: how many people know exactly what to do to cause catastrophic damage to the enrichment process without, hopefully, killing anyone? At least you could concede it is just as possible that a group of Iranian engineers (perhaps even Russian?) created this worm as the US or Israel. I think everyone, at least the less technically inclined, probably thinks it is pretty ‘bad-ass’, as Bill Maher put it, and would like to feel good about taking credit for it.

            But it really was amateur hour on many levels, and it used many well-known exploits to function as well as the new ones. Also, it appears Siemens systems are laughably vulnerable, with poor security measures (like failing to disable USB/1394 ports and removable media on workstations through hardware and administrative policies). That’s security 101 if you want to secure a PC. Come on.

            • WilliamOckham says:

              Do you have any idea what you are talking about? Have you read the Symantec document (linked by epiphyte) or Langner’s article that Rayne linked to? This thing was no amateur production. It was light years ahead of most malware in many, many respects.

              The point about the 4 Windows exploits is that no normal malware producer would blow 4 on the same virus. Those things are incredibly valuable. Read the Symantec paper closely. They don’t come out and say it, but it is very obvious that the Windows piece of this was designed to make very, very sure that the PCs that were used to program the PLCs would be infected in spite of good security measures. By the way, where did you come up with your estimate of the number of security holes in WinXP and Win2K? Do you have a source for that?

              There is a lot of circumstantial evidence that the U.S. was responsible for this virus. In 2008, the U.S. government actually got Siemens to help them analyze the security flaws of the PLCs in question. Whoever developed the virus almost certainly had a hardware test bed that included the type of centrifuges used by Iran (originally from A.Q. Khan’s network). The idea that somebody in Iran could have pulled this off is simply nonsense.

              Here are the likely suspects:

              1. The U.S. – We had motive, opportunity, and capacity. We had centrifuges just like the Iranian ones that we got from Libya when they gave up their program.

              2. Israel – Same as us. They got their centrifuges from Khan (in all likelihood).

              3. Germany – Only if they are totally insane.

              4. Russia – Probably could have pulled this off, but seems unlikely. Although Russian contractors at Natanz were probably the infection vector.

              • Sinestar says:

                You haven’t read all my posts; like the last guy, you’re cherry-picking my points. My main point was that it is just as likely and possible that it was internal sabotage as anyone else, and yes, I do know what the hell I’m talking about. It used some well-known attacks in part and depended on the use of Admin (ya know, C$) shares, so it was common knowledge these are exploitable. I’m really flummoxed the dummies didn’t lock their workstations down, like, at all. Disabling the admin share alone would have neutered that ‘bad-ass’ worm!

                There were some really clever parts to the worm and there were some rank amateur parts. Including not being able to code it to shut off before it escaped into the wild! WOW! How hard could that have been, really? And if you need 4 exploits to make it work, you use 4 exploits, so WTF is your point on that? Keep looking in Windows a while longer and you’ll find the other 40,000 holes in it, or in any other complex piece of software.

  9. person1597 says:

    The Stuxnet wiki explains some of the subtleties involved. Reprogramming the logic devices on the variable frequency controllers is but one feature. Not that it isn’t novel, but it isn’t unusual for products from that era to allow field reprogrammability.

    Indeed, it isn’t that unusual to store logic configuration code in nonvolatile memory — especially EEPROM (old fashioned flash) that is reconfigurable from a host interface (PC) or remote host (internet pipe). The same exploit was available to those who wanted to screw with the Diebold voting machines. I blame Karl!

    What fascinates me, and lends some validity to the notion that the attack may be ongoing and viable against the reactor is that it is mutable — there are new releases in the field…

    According to Hamid Alipour, deputy head of Iran’s government Information Technology Company, “The attack is still ongoing and new versions of this virus are spreading.”

    What a mess! The reactor technology is part German, part Iranian and part Russian. Somebody is having a good laugh! Good CYA fodder…

  10. victorx says:

    > With control systems disabled by the virus, the reactor would have the force of a “small nuclear bomb,” it says.

    This seems like irresponsible reporting to me. Nuclear reactors are physically incapable of exploding like a nuclear bomb. You can have other kinds of industrial explosions at a nuclear plant and those can spread radioactive debris over a large area (as in Chernobyl) but that’s a totally different scenario.

    Nuclear weapons – even very small ones – are extraordinarily powerful. The yield of the smallest practical nuclear weapon ever made (the W54 used in the Davy Crockett) is roughly equivalent to that of the largest conventional bomb ever built (the MOAB) and would be about 5x the size of the explosion from the Oklahoma City Bombing. This is NOT what would or could happen with Iran’s nuclear facilities.

    Think “industrial explosion” + “environmental disaster” + “major hazardous materials emergency” rather than “small nuke”.

  11. kidcharles says:

    To get technical for a second, there is no way this virus could lead to a nuclear explosion. It is very difficult to initiate a nuclear explosion, it requires a sophisticated triggering mechanism and the right nuclear materials, i.e. a bomb. At worst, Stuxnet could lead to a meltdown (ala Chernobyl) which is an entirely different thing, more like a slow release of radiation. Still very bad for any nearby people, but hardly comparable to an actual nuclear detonation.

    • Rayne says:

      Langner’s piece for Control Global is a solid synopsis of the PLC threat.

      Only problem I have with it is that he glosses over the infection component, and the infection component has broad repercussions, beyond PLCs.

      Remember Google’s problem with the Chinese in Dec/Jan 2010? Yeah.

      • epiphyte says:

        This paper, the Symantec w32.stuxnet dossier, goes into quite some detail on the Windows attack/propagation side of the story. It’s a bit out of date in that it was written before the details of the PLC attack were well understood, but those aren’t necessary to appreciate the Windows-related aspects.

        A couple of points which may not be clear to everyone going by earlier comments on the thread:

        1) PLCs do not run Windows; they run machine code which has been downloaded into their non-volatile memory. The ultimate Windows-based target for Stuxnet was the DLL responsible for uploading the code (which is cross-compiled on the Windows computer running a suitable development environment) to the device. When connected to the right PLC, it would silently alter the code it was uploading.

        2) All of the clever Windows infiltration/propagation was needed because the machines used to program the PLCs were never connected to the internet, possibly never networked at all. The worm had to find its way onto a USB drive, which would be mounted on one of the programming machines, which would at some point upload a new program to a PLC, in order to reach its target.
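A minimal sketch of the interception pattern described above: a wrapper stands in for the vendor upload routine, passes everything through untouched, and silently alters the code only when the device fingerprint matches. All names, the fingerprint, and the payload bytes here are hypothetical illustrations, not Stuxnet's actual internals:

```python
# Toy model of a trojaned upload library: identical behavior for every
# device except the one matching a hard-coded target fingerprint.
# The device IDs and payload bytes are made up for illustration.
TARGET_ID = "6ES7-315-2"  # hypothetical fingerprint of the targeted PLC

def vendor_upload(device_id: str, blocks: list) -> list:
    """Stand-in for the genuine DLL routine that ships code to the PLC."""
    return blocks  # pretend the device stored these blocks verbatim

def trojaned_upload(device_id: str, blocks: list) -> list:
    """Wrapper the worm installs in place of the genuine routine."""
    if device_id == TARGET_ID:
        blocks = [b + b"\x90\x90SABOTAGE" for b in blocks]  # alter in transit
    return vendor_upload(device_id, blocks)

print(trojaned_upload("6ES7-412-1", [b"\x01"]))  # non-target: untouched
print(trojaned_upload(TARGET_ID, [b"\x01"]))     # target: silently modified
```

Because every non-matching device sees identical behavior, an engineer testing the workstation against the wrong hardware would notice nothing.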

        • Rayne says:

          Thanks, have read Symantec’s and Kaspersky’s content on Stuxnet to date, which is why I said Langner’s piece glosses over the infection component.

          Let me point out a few more things, all of which may/may not be connected:

          — Russia has been supplying nuclear energy equipment to Iran for years; there have been points along the way where Russian equipment may have been detained for reasons other than those stated (ex. Azerbaijani officials detained Bushehr-bound heat-isolating equipment supplied by OAO Atomstroiexport in Astara at the Iranian border, March 29, 2008).

          — The Conficker virus spread in 2008/2009 — and components of that exploit were used within a year in Stuxnet, even after MSFT sent out a patch for the vulnerability Conficker exploited. (Might make one wonder if Conficker was a proof of concept.)

          — Earliest theoretical “birth” of Stuxnet is summer 2009, with others theorizing January 2010; coincidentally, and almost concurrently, China launched what is best labeled an intelligence-gathering effort (nicknamed Operation Aurora), also believed to use zero-day exploits in MSFT applications but disseminated to international companies through Google mail. The intent appeared to be to go after the source code of at least 34 companies, most of which had military-industrial connections.

          — An American scientist visiting North Korea found a “sophisticated uranium enrichment facility” under construction, as reported in Sept. 2010. Where’d they get the equipment? Russia condemned the development in strong terms, which seems odd since Russia could have taken up NK’s 2006 offering of uranium deposits and simply bought out all their supplies to prevent NK from having anything to enrich.

          There’s a lot of moving parts out there, and they’re not all working in a vacuum or a silo. When Russians carp, it might behoove us to look at motives beyond the obvious.

          Edit: Nuts, hit submit before I was done. Note this article about a Czech-Russian consortium turning over the keys to a new nuclear plant in the Czech Republic at Temelin in October 2009. The Russian partner in this consortium is the firm formerly known as OAO Atomstroiexport.

          What else have the Russians installed in other nuclear facilities around the world?

    • person1597 says:

      Judging from your links, it looks like just software — no FPGA binaries were modified by the payload.

      Would have been even more impressive if they had downloaded a new hardware design via the virus. …Someday, maybe…(Hey, it is a weakness of software-defined hardware!)

  12. sues says:

    They should infiltrate Pakistan’s nuclear facilities with Stuxnet so they can’t fire off any nuclear arms. This is the most dangerous lunatic country in the world, which no doubt will destroy the world. The supplier of nuclear arms to the world.

  13. WilliamOckham says:

    Btw, based on Langner’s latest stuff, I am starting to think that Bushehr was never a target of this thing at all. It seems to have been all about the Iranian centrifuges.

    People are still working on figuring out exactly what the PLC part of the malware was up to, but it is very clear that Langner’s latest stuff maps very well to the reports of the problems the Iranians had with their program. I’m sure the people who did this made the calculation that slowing the Iranian nuclear program was worth this stuff getting out, but I think they were very wrong. This thing will cause much more damage in the long run than an Iranian bomb.

    • WilliamOckham says:

      The problem with Parker’s analysis (and most of the other people who are claiming the code is poorly written) is the assumption he makes about the intent of the authors. They are analyzing it from the perspective of the typical malware threat where hiding your tracks is important to your success. Let’s take his criticism of the command and control (the communication with the two websites used by the worm) being conducted in the clear. Was that a rookie mistake? I don’t think it was. As the Symantec people point out, sending a plain text http request and receiving a binary response is one of the best ways to avoid setting off firewalls and alarms.
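The trade-off is easy to see in miniature: a beacon that is just an ordinary HTTP GET, with the host's details packed into a query string, looks like routine web traffic to most firewalls, while the response body can carry an arbitrary binary blob. The hostname and parameter name below are placeholders, not the worm's actual C&C details:

```python
# Sketch of a "hide in plain sight" beacon URL: plain text, no crypto,
# because looking like normal web traffic is the point. Placeholder names.
from urllib.parse import urlencode

def beacon_url(host: str, os_version: str, infected: bool) -> str:
    """Pack minimal host info into an innocuous-looking GET request."""
    query = urlencode({"data": f"{os_version}:{int(infected)}"})
    return f"http://{host}/index.php?{query}"

print(beacon_url("example.com", "5.1", True))
# http://example.com/index.php?data=5.1%3A1
```

An encrypted or obviously obfuscated channel would be stealthier against analysts but more likely to trip egress filtering, which is exactly the trade-off being argued about here.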

      Most of the so-called deficiencies in the code are a clear trade-off between stealth and maximizing the chances of propagation to the intended target. Take the criticism that Stuxnet didn’t do a very good job of hiding what it was looking for (Step 7 systems). Given what else Stuxnet accomplished, and the features that everyone admits are way ahead of anything else, the appropriate question to ask is: why didn’t they hide their target? To assume the answer is incompetence may make the experts feel better, but seems rather inconsistent. A better guess is that they just didn’t care. I think they wanted people to figure out what they had done (after they had completed the mission).

      [On a side note, your comment that “Disabling the admin share alone would have neutered that ‘bad-ass’ worm!” is incorrect. Using the default admin share was simply one of several different techniques Stuxnet used to replicate. I would encourage you to spend some quality time with some of the references that have already been mentioned before you make such bold assertions.]

      • Sinestar says:

        Again you’ve chosen to cherry-pick the parts of my comments for which you have a defense ready and ignore the rest. What about Lawson’s analysis?

        So what you are saying by tacit omission is that the Iranian engineers who designed the Natanz UEF (presumably with Russian or DPRK assistance), built it, and work in it every day could not possibly also have in their ranks programmers and process analysts, people with enough fear or conscience to prefer that Iran not look like Iraq or Afghanistan because of the mullahs’ insistence on antagonizing the West, who would also be clever enough to write this code? That those who actually work there, and know whether or not the goal is weaponization, could not decide to stop it?

        Code that didn’t even polymorph with any more sophistication than the VX worms from the early 90’s? That was written so poorly that it couldn’t shut itself down and delete itself after the mission was accomplished, and didn’t hide the payload, so NOW THE ENEMY (in your view) (and the rest of the Internet) has the code, the design, and the intent, and everyone knows exactly how it was pulled off? It is comical to assume the CIA would design the equivalent of a Stealth Fighter and leave the blueprints lying around on the proverbial coffee table for anyone to browse. I’m sorry, not buying it.

        Additionally, I used the word ‘neutered’. A neutered dog can still bite; they just tend to be less aggressive. You also completely ignore the comment that administratively disabling removable media could have prevented this, which for some laughable reason wasn’t thought of by their security, but would be the default in my view. The basics. They may as well have just plugged everything into the Internet. Doesn’t require a genius at all to use that as a vector. And we stilllllll haven’t answered the fundamental question of how someone walked in with the payload and set it loose. A B-2 carpet-bombing the parking lot with flash drives bearing pictures of the Quran? A Mossad agent breaking into the secure underground facility? IAEA inspectors, with their escorts watching their every move?

        None of these arguments, and none that you have made, establish that the only people who could have pulled this off were a western state-sponsored entity, but then it is you who is making that BOLD assertion. I have to assume therefore that you are a CIA or Israeli computer programmer/analyst who can just spout out such bold predictions, but your conclusions are probably based on a propaganda rag like the NY Times.

        All I said is that it is entirely possible that the Iranians could have done it. Thinking that people in the Middle East are a bunch of uneducated fanatics who live in caves and couldn’t possibly approach western sophistication is living in the dark, my friend. Having met several Middle Easterners and Indians, and a Turk, I can say they are of a very serious mind and often extremely intelligent. Unfortunately, one day the US will realize at least if they insist on bombing the shit out of everyone they choose

        But you make such bold statements that it had to be the US, Israel, Germany, or MAYBE Russia (who, BTW, work in the country every day). No Israelis or Americans are allowed within a hundred miles of Natanz, to be sure. So, to try to get through to you, if you even care at all and are not just trying to be intransigent:

        THE IRANIANS COULD HAVE CREATED THAT WORM AS EASILY AS ANYONE ELSE, AND IT DID NOT REQUIRE THE US, ISRAEL, GERMANY, OR THE RUSSIANS. THERE IS NO EVIDENCE TO THE CONTRARY, AND CRITICAL EXAMINATION (i.e. OCKHAM’S RAZOR) OF THE PERIPHERAL FACTS BESIDES THE WORM ITSELF POINTS TO AN INSIDE JOB. THE EXPERTS WHO COULD DO THIS LIVE IN IRAN AS WELL AS ANYWHERE ELSE.

        That being said, the US and Israel could have done it too, and I never said they couldn’t. I just consider it less likely that it was a government-sponsored attack. Besides, the Americans seem to prefer bombing the shit out of everything to finesse anymore. LOL, ‘If there is 1 terrorist in a building with 30 civilians, 31 people are going to die’ seems to be the new credo.

        • WilliamOckham says:

          I’m going to explain my position to you one more time. I’m trying to help you understand why your assumptions might be invalid. We should be able to have this conversation without insults or shouting.

          I’ll respond to your comment paragraph by paragraph so that you can see that I’m responding to your entire position.

          Again you’ve chosen to cherry-pick the parts of my comments for which you have a defense ready and ignore the rest. What about Lawson’s analysis?

          I didn’t ignore Lawson’s analysis, but perhaps I wasn’t clear in my response. Here’s what Lawson is quoted as saying in the link you gave:

          Nate Lawson, an expert on the security of embedded systems, also criticised the cloaking and obfuscation techniques applied by the malware’s creators, arguing that teenage Bulgarian VXers had managed a much better job on those fronts as long ago as the 1990s.

          “Rather than being proud of its stealth and targeting, the authors should be embarrassed at their amateur approach to hiding the payload,” Lawson writes. “I really hope it wasn’t written by the USA because I’d like to think our elite cyberweapon developers at least know what Bulgarian teenagers did back in the early 90’s.”

          He continues: “First, there appears to be no special obfuscation. Sure, there are your standard routines for hiding from AV tools, XOR masking, and installing a rootkit. But Stuxnet does no better at this than any other malware discovered last year. It does not use virtual machine-based obfuscation, novel techniques for anti-debugging, or anything else to make it different from the hundreds of malware samples found every day.

          “Second, the Stuxnet developers seem to be unaware of more advanced techniques for hiding their target. They use simple “if/then” range checks to identify Step 7 systems and their peripheral controllers. If this was some high-level government operation, I would hope they would know to use things like hash-and-decrypt or homomorphic encryption to hide the controller configuration the code is targeting and its exact behavior once it did infect those systems,” he adds.

          Lawson’s comments are solely about the techniques that the Stuxnet creators used (or didn’t use) to hide the worm’s payload. His analysis assumes that stealth and obfuscation were desired qualities. If they were not, his analysis is irrelevant. Think about this in terms of an actual software development project. The code we have meets some very exacting requirements with what everyone agrees is impressive technique. Why is this particular area so weak? It’s pretty clear that multiple teams worked on Stuxnet, but that’s not an explanation. An organization that can acquire stolen Verisign-issued certs certainly could have gotten their hands on any number of development tools (or teams) that could have used more sophisticated stealth techniques. It seems only logical to me that the explanation is that the team that coded these aspects did what the customer wanted.
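Lawson's "hash-and-decrypt" point is worth unpacking, since it is the clearest example of a stealth technique the authors skipped. Instead of an inspectable "if config matches" check, the payload is encrypted under a key derived from the target configuration, so analysts who capture the code cannot tell what it is looking for without brute-forcing the configuration space. The sketch below is purely illustrative (toy XOR cipher, made-up configuration strings); by all public accounts, Stuxnet did not do this:

```python
# Toy hash-and-decrypt: the payload only decrypts correctly on the machine
# whose configuration hashes to the hidden key. Config strings are made up.
import hashlib
from itertools import cycle

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, cycle(key)))

def lock(payload: bytes, target_config: bytes) -> bytes:
    """Attacker side: encrypt under a key derived from the target's config."""
    return xor(payload, hashlib.sha256(target_config).digest())

def try_unlock(blob: bytes, local_config: bytes) -> bytes:
    """Infected-host side: yields the real payload only on the target."""
    return xor(blob, hashlib.sha256(local_config).digest())

blob = lock(b"attack code", b"step7:315-2:freq-converter")
print(try_unlock(blob, b"step7:315-2:freq-converter"))  # b'attack code'
print(try_unlock(blob, b"some-other-plant") == b"attack code")  # False
```

Note that the captured binary contains neither the target configuration nor a readable payload, only the locked blob, which is exactly the property Lawson says was missing.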

          So what you are saying by tacit omission is that the Iranian engineers who designed the Natanz UEF (presumably with Russian or DPRK assistance), built it, and work in it every day could not possibly also have in their ranks programmers and process analysts, people with enough fear or conscience to prefer that Iran not look like Iraq or Afghanistan because of the mullahs’ insistence on antagonizing the West, who would also be clever enough to write this code? That those who actually work there, and know whether or not the goal is weaponization, could not decide to stop it?

          No, that’s not my point at all. I’m sure the Iranians have many fine programmers. I’ve certainly worked with enough Iranian ex-pats to know that. The real problem for creating Stuxnet was having an adequate test bed to ensure that it would do what you want. That’s what narrows down the list of suspects.

          Code that didn’t even polymorph with any more sophistication than the VX worms from the early 90’s? That was written so poorly that it couldn’t shut itself down and delete itself after the mission was accomplished, and didn’t hide the payload, so NOW THE ENEMY (in your view) (and the rest of the Internet) has the code, the design, and the intent, and everyone knows exactly how it was pulled off? It is comical to assume the CIA would design the equivalent of a Stealth Fighter and leave the blueprints lying around on the proverbial coffee table for anyone to browse. I’m sorry, not buying it.

          Um.. go read this and re-think that. The CIA gave the Iranians nuclear blueprints. Also, I’m not particularly worried about the Iranians having this code. I’m far more concerned with the fact that our government apparently decided making pretty much the entire infrastructure of the developed world more susceptible to “cyberwarfare” was an acceptable risk for this operation. It seems idiotic to me, but I can see how the thinking (or lack thereof) went in the CIA.

          Additionally, I used the word ‘neutered’. A neutered dog can still bite; they just tend to be less aggressive. You also completely ignore the comment that administratively disabling removable media could have prevented this, which for some laughable reason wasn’t thought of by their security, but would be the default in my view. The basics. They may as well have just plugged everything into the Internet. Doesn’t require a genius at all to use that as a vector. And we stilllllll haven’t answered the fundamental question of how someone walked in with the payload and set it loose. A B-2 carpet-bombing the parking lot with flash drives bearing pictures of the Quran? A Mossad agent breaking into the secure underground facility? IAEA inspectors, with their escorts watching their every move?

          I assumed that you meant neutered in the sense of “not capable of reproducing”. Sorry if I misunderstood you, but I think it was a reasonable interpretation, given the context. I don’t understand your statement about something being prevented by removing administrative shares. Again, the use of administrative shares was only one way that Stuxnet spread on LANs. Removing them would have removed one vector of infection. That’s all. We don’t actually know whether or not the targeted systems had administrative shares enabled. You are correct in saying that we don’t know exactly how Stuxnet got into the targeted location. I think there are a lot of scenarios more realistic than the ones you suggest. If I were planning this, I would have targeted the laptop of a foreign contractor: wait for the contractor to go home (to Russia, most likely) and try to infect the laptop there. The aggressiveness of the infection mechanisms suggests that it was some sort of indirect attack (if you had physical access to one of the PCs you were targeting, you wouldn’t have needed most of the “incompetent” infection code).

          None of these arguments, and none that you have made, establish that the only people who could have pulled this off were a western state-sponsored entity, but then it is you who is making that BOLD assertion. I have to assume therefore that you are a CIA or Israeli computer programmer/analyst who can just spout out such bold predictions, but your conclusions are probably based on a propaganda rag like the NY Times.

          The primary indicator is my belief that the PLC code was tested on actual centrifuge hardware (not a certainty, but highly likely). I hope the rest of this paragraph was a joke. I do occasionally get emails from the CIA offering me the opportunity to apply for a job, but I’m pretty sure they are just mass emails, probably based on my LinkedIn profile (under my real name).

          All I said is that it is entirely possible that the Iranians could have done it. Thinking that people in the Middle East are a bunch of uneducated fanatics who live in caves and couldn’t possibly approach western sophistication is living in the dark, my friend. Having met several Middle Easterners and Indians, and a Turk, I can say they are of a very serious mind and often extremely intelligent. Unfortunately, one day the US will realize at least if they insist on bombing the shit out of everyone they choose

          For the reasons I gave above, I doubt that this particular attack originated in the Middle East. I’m not sure why you apparently want to attribute my position to the worst sort of bigotry. I think that was uncalled for.

          But you make such bold statements that it had to be the US, Israel, Germany, or MAYBE Russia (who, BTW, work in the country every day). No Israelis or Americans are allowed within a hundred miles of Natanz, to be sure. So, to try to get through to you, if you even care at all and are not just trying to be intransigent:

          I stated who I thought the most likely suspects were. I hope I’ve explained why I think that. There are other possibilities, but fundamentally you have to figure out who really wanted to cause problems for the Iranian nuclear program and had the capacity to pull this off. The North Koreans could have done it, but why would they bother? I don’t think I’m being intransigent. We’re still learning about Stuxnet, and if new evidence comes out, I’m more than willing to admit my errors. I just don’t agree with the folks who say that some of this code was amateurish. I don’t have access to the code, so I can’t be sure. I’m just trying to figure out what makes the most sense based on what we know. There are off-the-shelf components that give you polymorphic virus code. These folks used a lot of other common techniques on the Windows part of this (combined with some stuff they invented). Why would they overlook this common technique? Why take the time or spend the money to develop multiple zero-day attacks, but not implement polymorphic virus code? Your explanation doesn’t make sense to me. Until someone comes up with a better explanation, I will stick to my speculative one: whoever did this didn’t want polymorphism.
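For what it's worth, the "off-the-shelf polymorphism" in question is not exotic: the classic form simply re-encodes the body under a fresh key for every copy, so no two copies share a byte signature even though each decodes to the same thing. A toy version, using a 1-byte XOR key purely for illustration:

```python
# Toy polymorphic encoder: every copy is (key byte + body XORed with key),
# so static byte signatures differ from copy to copy while behavior is
# identical. Real engines also mutate the decoder stub itself.
import os

BODY = b"payload"

def make_copy(body: bytes) -> bytes:
    key = os.urandom(1)[0]                     # fresh key per copy
    return bytes([key]) + bytes(b ^ key for b in body)

def decode(copy: bytes) -> bytes:
    key, enc = copy[0], copy[1:]
    return bytes(b ^ key for b in enc)

copies = [make_copy(BODY) for _ in range(5)]
print(all(decode(c) == BODY for c in copies))  # True: same behavior throughout
print(len(set(copies)))                        # usually 5 distinct encodings
```

A signature scanner matching fixed bytes of one copy misses the others, which is why skipping even this cheap technique reads as a deliberate choice rather than an oversight.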

          THE IRANIANS COULD HAVE CREATED THAT WORM AS EASILY AS ANYONE ELSE, AND IT DID NOT REQUIRE THE US, ISRAEL, GERMANY, OR THE RUSSIANS. THERE IS NO EVIDENCE TO THE CONTRARY, AND CRITICAL EXAMINATION (i.e. OCKHAM’S RAZOR) OF THE PERIPHERAL FACTS BESIDES THE WORM ITSELF POINTS TO AN INSIDE JOB. THE EXPERTS WHO COULD DO THIS LIVE IN IRAN AS WELL AS ANYWHERE ELSE.

          No need to shout. I disagree, but you know that already. The nature of the worm doesn’t point to an inside job. On the contrary, it points to an outsider with good, but not perfect, intel about the Natanz facility. There’s no evidence that the creators knew much about the internal workings of the facility (hence all the different vectors of infection). On the other hand, they seem to have known a lot about the technical features of the installation. That’s exactly the kind of information that the U.S. is good at collecting because it doesn’t take a lot of human intelligence. We’ve managed to screw up the human intelligence in Iran repeatedly, but we still gather lots of sophisticated technical data.

          That being said, the US and Israel could have done it too, and I never said they couldn’t. I just consider it less likely that it was a government-sponsored attack. Besides, the Americans seem to prefer bombing the shit out of everything to finesse anymore. LOL, ‘If there is 1 terrorist in a building with 30 civilians, 31 people are going to die’ seems to be the new credo.

          I agree that it is somewhat surprising that the U.S. pulled this off. I didn’t think we were capable of thinking this creatively, to be honest.

  14. WilliamOckham says:

    Following up:

    I apologize for repeatedly misspelling Natanz.

    I found the slides from Mr. Parker’s Black Hat presentation [pdf file] (which was the basis for the article that Sinestar linked to above).

    Here are some excerpts:

    Resources Required

    • Access to hardware & software, including frequency converters and probably centrifuges
    • Propagation method: stolen certificates

    The Dichotomy of Stuxnet

    • Costly due to:
      • Maintenance for at least eighteen months and as long as four years
      • R&D invested into the PLC payload and the Step7 subversion & delivery framework
    • However…
      • Trivial C&C channel
      • Lots of prior-art re-use
      • “We’re talking about it right now”

    Parker rules out Western nations almost solely based on the C&C structure. I think he makes far too much of that.

    • Rayne says:

      Agree on the C&C — it’s bullshit that C&C structure would have been necessary to get this done, can imagine a number of independent compartmentalized teams working on specific deliverables for Stuxnet, with one “assembly” team tacking it together.

      Would actually make more sense since the skill sets for the different components are not typically found in a single group. PLC folks, for example, may not be experts on MSFT products when it comes to exploits.

      Edit: My bad, apparently I have spent far too much time recently doing research on military projects as I immediately thought C&C in military terminology and not in software. You can ignore this comment altogether, although I’ll leave it up in case it jogs somebody’s thought processes.