Apple’s Go to Fail Response

If you haven’t already heard, on Friday Apple admitted to what has turned out to be a serious security flaw.

Essentially, for connections using some of the more careful kinds of security (the cipher suites that offer Perfect Forward Secrecy), the flaw would allow an attacker on your network to conduct a Man-in-the-Middle attack while you were sending or receiving data via an Apple operating system. Apple’s announcement Friday pertained to just iOS. But security researchers quickly discovered that the bug affects recent releases of OSX as well. And even if you’re using Chrome or Firefox, which bundle their own SSL code, the bug may still affect other applications on your system.

This post, from Google engineer Adam Langley, is one of the best posts on the bug itself. Here’s Wired’s take. Here’s a really accessible take from Gizmodo.
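
For those who don’t want to click through, here is the shape of the thing. Condensed down (variable declarations and most of the hashing steps trimmed away), the flaw in Apple’s SecureTransport code comes to a single duplicated line in the routine that verifies the server’s signature during the handshake:

    if ((err = SSLHashSHA1.update(&hashCtx, &serverRandom)) != 0)
        goto fail;
    if ((err = SSLHashSHA1.update(&hashCtx, &signedParams)) != 0)
        goto fail;
        goto fail;    /* the duplicated line: not governed by the if above, so it always runs */
    if ((err = SSLHashSHA1.final(&hashCtx, &hashOut)) != 0)
        goto fail;

    /* ... the actual signature check sits below this point and is never reached ... */

    fail:
        SSLFreeBuffer(&signedHashes);
        SSLFreeBuffer(&hashCtx);
        return err;   /* err still holds 0, i.e. success, from the last hash update that ran */

Because that stray goto runs unconditionally, the function jumps straight to its cleanup and returns success without ever checking the signature, which is what lets a man in the middle hand your machine a forged key exchange and have it accepted.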

In the wake of the Snowden revelations, the discovery of the bug raises questions about how it got there. Langley thinks it was a mistake. Steve Bellovin does too, though he does note that Perfect Forward Secrecy (the very protection this code path is supposed to provide) is precisely what a determined hacker, including a nation-state’s SIGINT agency, would need to compromise. Others are raising more questions.

But whether or not this is an intentional backdoor into the security protecting users of most of Apple’s most recent devices, I’m just as interested in Apple’s response … both to the public report, almost 6 months ago, that,

US and British intelligence agencies have successfully cracked much of the online encryption relied upon by hundreds of millions of people to protect the privacy of their personal data, online transactions and emails, according to top-secret documents revealed by former contractor Edward Snowden.

And to its discovery — reportedly perhaps as long as a few weeks ago — that it had this serious bug.

Now, if I were a leading device/consumer products company with an incentive to get consumers deeper into the cloud and living further and further online, and particularly if I were a leading device/consumer products company sitting on mountains and mountains of cash, then upon reading that report last September I would have thrown bodies at my code to make sure I really was providing the security my customers needed to sustain their trust. And given that this bug sits in a key part of the security on which that trust relies, I would think the mountains-of-cash device/consumer products company might have found it.

According to rumors, at least, this bug was not found by Apple with all its mountains and mountains of cash; it was found by a researcher.

Then there’s the radio silence Apple has maintained since issuing its alert about iOS on Friday. It told Reuters over the weekend that it would have a fix for the OSX bug “soon,” so it has effectively acknowledged that the bug is there. But it has not issued an official statement.

It just seems to me there is little that can explain issuing Friday’s security alert — alerting everyone, including potential hackers, that the problem is there, which quickly led to the independent identification of the OSX problem — without at the same time rolling out an OSX alert and fix. Admitting to the iOS error effectively exposed OSX users to anyone who responded to the announcement by going looking for the same flaw. Millions of Apple customers remain exposed until Apple rolls out a fix (though you might consider doing your banking in a browser other than Safari to give yourself a tiny bit of protection until then).

The only thing I can think of that would explain Apple’s actions is if the security researcher who found this bug gave them limited warning before he or she published it.

Otherwise, though, I’m as interested in the explanation for Apple’s two-step rollout of this bug fix as I am in how it got there in the first place.

19 replies
  1. charlie says:

    1. Actually Apple hasn’t credited anyone with the discovery, which means that it was done internally.

    2. Seeing that the SSL layer crashes — the CVE for which was reserved in January — and then going line by line through the code to find the bad copy/paste job takes a long time.

    3. No idea why the delay on Mac OSX, except that iPhones/iPads are more likely to connect to guest networks than desktops and laptops. It is also a much larger universe of devices.

  2. C says:

    While I generally agree with your analysis, EW, I think there is a corporate issue you should consider here. All of the steps you suggest, especially throwing bodies at the code, make perfect sense to an engineer or anyone who cares about the quality of the product. For those who only think about the quality of the stock (i.e. the MBA-trained execs who make the actual decisions), it doesn’t make as much sense, nor does making a clear and open admission up front. In my experience, nontechnical execs tend to think, much like politicians ironically, about silence, “damage mitigation,” and “cost containment” first, rather than about actual problem-solving. While this lends itself to bigger problems of the type we see here, it makes sense in the short term.

  3. emptywheel says:

    @charlie: Does it definitely mean it was done internally?

    And if it was, why leave all your OSX users exposed by rolling out the iOS fix first? It is understandable, maybe, if they didn’t find it. Unforgivable if they did.

  4. charlie says:

    No, what it could mean is that someone from the outside (the Snowden PPT released in October, or in January) told them that their SSL sockets could be hijacked, but then Apple had to spend all this time doing the work themselves to find the fault.

    And in terms of the OSX vs iOS split, again, the attack vector is different for laptops/desktops, and the rumor mill has it that a larger update (10.9.2) is being released this week.

    Apple’s response here gets a B+ or A-. There are unanswered questions about why MacOS is taking longer, but I’m willing to throw them a bone if it gets out this week. Given the (just-as-large) flaw in Android devices discovered last week — which will never be patched, given the update problem — don’t make a mountain out of a molehill.

  5. Saul Tannenbaum says:

    @charlie: I concur with charlie in this assessment.

    But look at this from a different point of view. Imagine Apple found the bug, built the IOS patch then held it until the OS X patch was ready. They’d be criticized for leaving IOS users vulnerable even though they had a patch ready.

    The evidence does seem to indicate that Apple found this themselves, which suggests they’re undergoing the massive software audit you’d expect them to, post-Snowden. And, given their choices, patch IOS immediately vs. wait for the OS X patch, it’s really counter-intuitive to suggest they made a mistake by releasing the IOS patch sooner, especially if the vulnerability is being exploited.

    With 10.9.2 almost ready, you’ve got to patch 10.9.1 for folks who won’t upgrade, fix 10.9.2, make sure 10.9.2 doesn’t undo the 10.9.1 fix, and roll out patches for each supported version of OS X that also has the bug. You’ve got to test this all against a huge matrix of hardware and software because, the minute you don’t, you risk making things worse. And you’ve probably added a whole array of new security test cases on top of that. It’s a big software and release engineering job, one that takes as long as it takes.

    (Much as I hate the locked-down nature of IOS – one should be able to run whatever code one wants on a device one owns – it is a far more controlled environment, which makes the software/release engineering much simpler.)

    And, to put this in perspective, all the good SSL in the world isn’t stopping stupid people from clicking on bad links in phishing emails.

    Yeah, this is a bad bug and should be fixed promptly. But once you’ve introduced the bug, you’re pretty well screwed. There’s no graceful, safe exit. You release the patches as quickly as you can and take your lumps, which is what Apple seems to be doing.

  6. jerryy says:

    There are some slight drifting thoughts offered up here about various topics and guessing as to motives…

    Here are some countering thoughts:

    This error occurred in the open-source portion of Apple’s code, not the closed-source part. [Apple’s underlying code for OS-X, and somewhat for iOS, is and has been open source, based upon UNIX(tm); you can download it and compile the OS-X kernel from Apple’s site, for free, if you wish. The closed-source part is the gui — the graphical desktop part, which is what you have been paying the $30 or so for, up until the latest release.]

    At one point a while back, Apple decided the usual version of the OpenSSL suite of programs was a problem because of the method needed to constantly push out upgrades to users, so they rolled their own version, so to speak. They were able to stay on top of upgrades pretty well, but a mistake in this version was missed.

    As such, the source code which includes the error has been available for inspection by anyone, for some time.

    Here is the guessing part: This does look more like a simple mistake than deliberate malfeasance. In spite of the mounds of cash that Apple has, they do not have a large number of programmers, and they often pull the OS-X programmers off of projects and put them to work fixing iOS for the iPhones. My guess is that someone was burning the candle at both ends, with the middle catching, and slipped up. The programming style is sloppy and slips from ‘best practices’ to ‘oh crap, I have to get this done to work on both platforms.’
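
    To make the ‘slips from best practices’ point concrete, here is a minimal, self-contained reproduction of the unbraced if/goto pattern (stand-in names, not Apple’s code) that builds without complaint under typical default warning flags and happily ‘verifies’ a bad signature:

        /* gotofail_demo.c -- hypothetical stand-ins for the real hash/verify steps;
           each returns 0 on success. Build and run: cc gotofail_demo.c && ./a.out */
        #include <stdio.h>

        static int hash_update(void) { return 0; }
        static int hash_final(void)  { return 0; }
        static int verify_sig(void)  { return 1; }  /* pretend the signature is bad */

        static int verify_server_key_exchange(void)
        {
            int err;

            if ((err = hash_update()) != 0)
                goto fail;
                goto fail;                  /* the stray duplicate: always taken */
            if ((err = hash_final()) != 0)  /* never reached */
                goto fail;
            err = verify_sig();             /* never reached either */

        fail:
            return err;                     /* still 0: the bad signature is "accepted" */
        }

        int main(void)
        {
            printf("verification returned %d (0 means accepted)\n",
                   verify_server_key_exchange());
            return 0;
        }

    For what it’s worth, clang’s -Wunreachable-code (which is not part of -Wall) should flag the dead statements after the stray goto, and a brace-every-branch house style would at least have made the slip easier to spot in review.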

    I too am surprised that the OS-X patch has not been released yet; the FAST fix does not need regression testing (running on all the available equipment and software to see if a problem pops up), but only the folks at Cupertino can answer that one.

    http://www.imore.com/understanding-apples-ssl-tls-bug

  7. thatvisionthing says:

    Least tech savvy person here, and I’m Apple clueless, and I haven’t clicked and grokked all the links here (hopeless), but I did see the youtube of Jacob Appelbaum’s 30c3 presentation given apparently the same day Der Spiegel published all the NSA spy gear catalog stuff at the end of December and I did try to look at stuff then. Is this the same fail he was talking about?

    http://www.nakedcapitalism.com/2014/01/jacob-appelbaum-30c3-protect-infect-militarization-internet-transcript.html

    Jacob Appelbaum: Here’s an iPhone back door.

    So DROPOUTJEEP, so you can see right there. So, SMS, contact list retrieval, voicemail, hot microphone, camera capture, cell tower location. Cool. Do you think Apple helped them with that? I don’t know. I hope Apple will clarify that. I think it’s really important that Apple doesn’t.

    Here’s a problem. I don’t really believe that Apple didn’t help them. I can’t prove it yet, but they literally claim that any time they target an iOS device, that it will succeed for implantation. Either they have a huge collection of exploits that work against Apple products, meaning that they are hoarding information about critical systems that American companies produce and sabotaging them, or Apple sabotaged it themselves. I’m not sure which one it is. I’d like to believe that since Apple didn’t join the PRISM program until after Steve Jobs died that maybe it’s just that they write shitty software. We know that’s true.

    In which case, doesn’t the fail go back years longer than you say, at least five years like everything else in the catalog, and didn’t Der Spiegel break the story? The slide for DROPOUTJEEP for Apple iPhone is dated 10/01/08: http://leaksource.files.wordpress.com/2013/12/nsa-ant-dropoutjeep.jpg.

    And re “they literally claim that any time they target an iOS device, that it will succeed for implantation” — that’s apparently from the QUANTUMNATION slide:

    http://www.spiegel.de/fotostrecke/nsa-dokumente-so-uebernimmt-der-geheimdienst-fremde-rechner-fotostrecke-105329-24.html

    Note: QUANTUMNATION and standard QUANTUM tasking results in the same exploitation technique. The main difference is that QUANTUMNATION deploys a stage 0 implant and is able to be submitted by the TOPI. Any ios device will always get VALIDATOR deployed.

    Re Quantum, earlier in the presentation he said this:

    Jacob Appelbaum: So there are different programs. So QUANTUMTHEORY, QUANTUMNATION, QUANTUMBOT, QUANTUMCOPPER and QUANTUMINSERT. You’ve heard of a few of them. I’ll just go through them real quick.

    QUANTUMTHEORY essentially has a whole arsenal of zero-day exploits. Then the system deploys what’s called a SMOTH, or a seasoned moth. And a seasoned moth is an implant which dies after 30 days. So I think that these guys either took a lot of acid or read a lot of Philip K. Dick, potentially both.

    [applause]

    And they thought Philip K. Dick wasn’t dystopian enough. Let’s get better at this. And after reading VALIS, I guess, they went on, and they also have as part of QUANTUMNATION what’s called VALIDATOR or COMMONDEER. Now these are first-stage payloads that are done entirely in memory. These exploits essentially are where they look around to see if you have what are called PSPs, and this is to see, like, you know, if you have Tripwire, if you have Aid, if you have some sort of system tool that will detect if an attacker is tampering with files or something like this, like a host intrusion detection system. So VALIDATOR and COMMONDEER, which, I mean, clearly the point of COMMONDEER, while it’s misspelled here – it’s not actually, I mean that’s the name of the program – but the point is to make a pun on commandeering your machine.

    So, you know, when I think about the U.S. Constitution in particular, we talk about not allowing the quartering of soldiers – and, gosh, you know? Commandeering my computer sounds a lot like a digital version of that, and I find that a little bit confusing, and mostly in that I don’t understand how they get away with it, but part of it is because until right now we didn’t know about it, in public, which is why we’re releasing this in the public interest so that we can have a better debate…

    Long transcript says at bottom that VALIDATOR is mentioned in the -MONTANA router slides (works on Juniper and Junos routers) and DROPOUTJEEP is in the mobile phone slides at Der Spiegel’s interactive graphic (http://www.spiegel.de/international/world/a-941262.html), or you can see them all in a scroll-down roll here: http://leaksource.wordpress.com/2013/12/30/nsas-ant-division-catalog-of-exploits-for-nearly-every-major-software-hardware-firmware/ — All the gadget stuff is circa 2008, though I see the QUANTUMNATION slide uses a Facebook screenshot with a 2013 date in it and says the phrase above, that “any ios will always get VALIDATOR deployed.” I am so stupid, why should an iPhone trigger a Juniper/Junos router, or why is VALIDATOR a vulnerability common to both, or — what is it saying? I dunno. But I did see that stuff then and throw it in your pot now. Also, slide 1 of the Quantum slide show says it is derived from NSA/CSSM 1-52 dated 20070108, so that takes it back even farther, and slide 3 says VALIDATOR is soon to be called COMMONDEER.

  8. to_err=human; to_forgive=nsa GOTO 1 says:

    “This does look more like a simple mistake than deliberate malfeasance.”

    In other news, Shyamji Krishna Varma didn’t mean to blow up Black Tom, he was just really busy and in a moment of distraction he leaned on the plunger of the blasting machine and KA-BOOM, next thing he knows he’s got smudges on his face and his clothes are all ripped. Imagine his surprise and distress.

  9. jerryy says:

    @thatvisionthing: … “why should an iPhone trigger a Juniper/Junos router, or why is VALIDATOR a vulnerability common to both, or — what is it saying?”

    To just touch upon your question a bit: Juniper/Junos routers are used by the big Internet backbone companies; if the bad guys wanted to monitor or target the traffic, this is one kind of equipment they would infiltrate to run that operation. If they control the Juniper/Junos equipment, they would be able to see from the header information that the data was for an iPhone or an Android or a Windows phone or whatever type of cell phone is being used.
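
    Just to illustrate that last bit (this is a toy, not any real tool): deciding what kind of device is on the other end can be as dumb as matching a substring in the User-Agent header that accompanies ordinary unencrypted web requests:

        /* ua_demo.c -- toy illustration only: classify a client from an HTTP
           User-Agent header observed in plaintext on the wire. */
        #include <stdio.h>
        #include <string.h>

        static const char *classify(const char *user_agent)
        {
            if (strstr(user_agent, "iPhone") || strstr(user_agent, "iPad"))
                return "iOS device";
            if (strstr(user_agent, "Android"))
                return "Android device";
            return "something else";
        }

        int main(void)
        {
            const char *ua =
                "Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X) AppleWebKit/537.51.1";
            printf("client looks like: %s\n", classify(ua));
            return 0;
        }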

    I will venture to guess that lots of jailbreakers and whatnot are red faced over this bug in Apple’s software… they were handed the holy grail of cracking and missed it.

  10. lefty665 says:

    A note from us old bit twiddlers. After all the arrogant and insufferable crap we’ve had to listen to from Apple, they trip over a f***ing “goto” that trashes security on everything they support? WTF? If Apple software management had any sense of shame they’d commit suicide out of humiliation. But we know they have no shame, they’re Apple.

    There were rumors that, besides COBOL Y2K fixes, “goto” had been written out of language standards with the coming of the millennium. Hell, some of us pretty much ditched “goto” when structured programming emerged in the ’70s, and for good reason.
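
    For anyone curious what the structured alternative looks like, here is one goto-free sketch of the same sequence, with hypothetical helper names (the real routine would also need its buffer cleanup handled somewhere, say in a wrapper):

        /* Sketch only, not Apple's code; each helper returns 0 on success.
           Compiles as a standalone translation unit with cc -c. */
        int hash_update_params(void);
        int hash_final_digest(void);
        int verify_signature(void);

        int verify_server_key_exchange(void)
        {
            int err;

            if ((err = hash_update_params()) != 0) {
                return err;
            }
            if ((err = hash_final_digest()) != 0) {
                return err;
            }
            /* the signature check is the last statement and cannot be silently
               jumped over by a stray goto */
            return verify_signature();
        }

    Whether the goto or the unbraced single-statement if deserves more of the blame is a matter of taste; either way, a regression test that feeds the routine a deliberately bad signature would have caught this regardless of style.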

  11. bloopie2 says:

    Problem:

    1. Apple has a coding/security problem that opens its products up to hackers.
    2. Apple has $100 billion of cash on hand.
    3. The NSA employs thousands of the best mathematicians and coders around. A number of them may be looking for better (read: Honorable) work.

    Solution: Apple offers to hire any and all NSA mathematicians or coders, at a 50% increase in salary each. To work either in California or at a new office that Apple will set up near Fort Meade (or wherever). If 10,000 accept the offer, at $150,000 per year, that’s a cost of only $1.5 billion per year – peanuts. You can guarantee them each a 10-year contract – that should work.

    Result: Apple gets the cream of the crop, folks experienced at how to deal with hacking, and improves its product greatly. The NSA loses much of its internal firepower. A win for Apple, and a win for America.
