Volkswagen’s bad news, good news as Detroit’s auto show opens
Bad news first: In the Friday afternoon news dump zone, we heard Volkswagen wasn’t going to release documents pertaining to the emissions defeat device scandal to several U.S. states’ attorneys general. VW said it couldn’t due to privacy laws, which sounds dicey; why would corporations have privacy rights? You’d think only U.S. businesses would attempt such excuses.
The good news was held until VW CEO Matthias Mueller arrived in the U.S. for the soft opening of the North American International Auto Show in Detroit. VW is working on a catalytic converter it believes will resolve the emissions problem for roughly two-thirds of the affected vehicles. I’m guessing this fix is intended for the oldest vehicles, and that the newest ones will likely be swapped for new vehicles, or a sizeable discount on a replacement will be offered. Color me skeptical about the effectiveness of this fix; if it were such an obvious and easy solution, it would already appear on VW’s diesel-powered passenger vehicles. Fuel economy will likely diminish due to increased back pressure, but that’s why I think this fix is for the oldest cars: it would encourage VW loyalists to buy a new one.
Juniper Networks shuts the (a?) backdoor
The network equipment company says it’s “dropping” NSA-developed code after the revelation of a backdoor into its network device software. Does anyone believe all covert NSA access has now been eliminated, though, if Juniper’s source code isn’t open?
Apple’s devices monitoring your emotions soon?
Ridiculously cash-rich Apple snapped up artificial intelligence company Emotient, which makes an application to interpret users’ emotions based on their facial expressions — sentiment analysis, they call it. I call it creepy as hell, especially since smartphone users can’t be absolutely certain their cameras aren’t in use unless they physically cover the apertures.
And yes, I do cover apertures on my devices with low-tack adhesive tape. It’s the first thing I do after opening the box on any new camera-enabled device, even before charging the battery.
That’s enough to get your cart moving. I hope to have a post up later, on the recent power outage in Ukraine.
Last night, Marco Rubio went on Fox News to try to fear-monger over the phone dragnet again.
He repeated the claim that the AP also idiotically parroted uncritically — that the government can only get three years of records for the culprits in the San Bernardino attack.
In the case of these individuals that conducted this attack, we cannot see any phone records for the first three years in which — you can only see them up to three years. You’ll not be able to see the full five-year picture.
Again, he’s ignoring that AT&T backbone records covering virtually all of Syed Rizwan Farook’s 28-year life are available, that the 215 phone dragnet could never have covered Tashfeen Malik’s time in Pakistan and Saudi Arabia, and that EO 12333 collection would not only cover Malik’s time before she came to the US, but would also include Farook’s international calls going back well over 5 years.
So he’s either an idiot or he’s lying on that point.
I’m more interested in what he said before that, because he appears to have leaked a classified detail about the ongoing USA Freedom dragnet: that they’ve been issuing orders to a “large and significant number of companies” under the new dragnet.
There are large and significant number of companies that either said, we are not going to collect records at all, we’re not going to have any records if you come asking for them, or we’re only going to keep them on average of 18 months. When the intelligence community or law enforcement comes knocking and subpoenas those records, in many cases there won’t be any records because some of these companies already said they’re not going to hold these records. And the result is that we will not be able in many cases to put together the full puzzle, the full picture of some of these individuals.
Let me be clear: I’m certain this fact, that the IC has been asking for records from “a large number of companies,” is classified. For a guy trying to run for President as an uber-hawk, leaking such details (especially in an appearance where he calls cleared people who leak, like Edward Snowden, “traitors”) ought to be entirely disqualifying.
But that detail is not news to emptywheel readers. As I noted in my analysis of the Intelligence Authorization the House just passed, James Clapper would be required to do a report 30 days after the authorization passes telling Congress which “telecoms” aren’t holding your call records for 18 months.
Section 307: Requires DNI to report if telecoms aren’t hoarding your call records
This adds language doing what some versions of USA Freedom tried to do: requiring DNI to report on which “electronic communications service providers” aren’t hoarding your call records for at least 18 months. He will have to do a report after 30 days listing all that don’t (bizarrely, the bill doesn’t specify what size company this covers, which, given the extent of ECSPs in this country, could be daunting), and also report to Congress within 15 days if any of them stop hoarding your records.
That so many companies would be included that Clapper would need a list surprised me a bit. When I analyzed the House Report on the bill, I predicted USAF would pull in anything that might be described as a “call.”
We have every reason to believe the CDR function covers all “calls,” whether telephony or Internet, unlike the existing dragnet. Thus, for better and worse, far more people will be exposed to chaining than under the existing dragnet. It will catch more potential terrorists, but also more innocent people. As a result, far more people will be sucked into the NSA’s maw, indefinitely, for exploitation under all its analytical functions. This raises the chances that an innocent person will get targeted as a false positive.
At the same time, I thought that the report’s usage of “phone company” might limit collection to the providers that had been included — AT&T, Verizon, and Sprint — plus whatever providers serve cell companies not already using those backbones, as well as the big tech companies that, by dint of being handset manufacturers (that is, “phone” companies), could be obligated to turn over messaging records: things like iMessage and Skype metadata.
Nope. According to uber-hawk who believes leakers are traitors Marco Rubio, a “large number” of companies are getting requests.
From that I assume that the IC is sending requests to the entire universe of providers laid out by Verizon Associate General Counsel Michael Woods in his testimony to SSCI in 2014:
Woods describes Skype (as the application that carried 34% of international minutes in 2012), applications like iMessage, smaller outlets of particular interest like Signal, and conferencing apps.
So it appears the intelligence committees, because they’re morons who don’t understand technology (and ignored Woods), got themselves in a pickle: they didn’t realize that if you want full coverage of all “phone” communication, you’re going to have to go well beyond even AT&T, Verizon, Sprint, Apple, Microsoft, and Google (all of which have compliance departments and the infrastructure to keep such records). They are going to try to obtain all the call records, from every little provider, whether or not those providers actually have the means to keep such records and comply with such requests. Some — Signal might be among them — simply aren’t going to keep records, which is what Rubio is complaining about.
That’s a daunting task — and I can see why Rubio, if he believes that’s what needs to happen, is flustered by it. But, of course, it has nothing to do with the end of the old gap-filled dragnet. Indeed, that daunting problem arises because the new program aspires to be more comprehensive.
In any case, I’m grateful Rubio has done us the favor of laying out precisely what gaps the IC is currently trying to fill, but hawks like Rubio will likely call him a traitor for doing so.
I confess I don’t know the answer to this question, but I’m going to pose it anyway. Could companies report non-participation in CISA — or whatever the voluntary cyber information sharing program that will soon roll out is eventually called — in their transparency reports?
I ask in part because there’s great uncertainty about whether tech companies support or oppose the measure. The Business Software Alliance suggested they supported a data sharing bill until Fight for the Future made a stink, when at least some of them pulled out (while a number of other BSA members, like Adobe, IBM, and Siemens, will surely embrace the bill). A number of companies have opposed CISA, either directly (like Apple) or via the Computer and Communications Industry Association. But even Google, which is a CCIA member, still wants a way to share information, even as it expresses concerns about CISA’s current form. Plus, there’s some indication that some of the companies claiming to oppose CISA — most notably, Facebook — are secretly lobbying in favor of it.
In the wake of CISA passing, activists are wondering if companies would agree not to participate (because participation is, as Richard Burr reminded us over and over, voluntary, even if the key voluntary participants will also be bidding on a $50 billion contract as CISA rolls out). But I’m not sure what that would even mean.
So, first, would companies legally be permitted to claim in their transparency reports that they did not voluntarily participate in CISA? There are a lot of measures that prohibit the involuntary release of information about companies’ voluntary participation in CISA. But nothing in the bill seems to prohibit the voluntary release of information about companies’ voluntary non-participation.
But even if a company made such a claim — or claimed that it only shares cyber indicators with legal process — would it even be meaningful? Consider: Most of the companies that might make such a claim get hacked. Even Apple, the company that has taken the lead on pushing back against the government, has faced a series of attacks and/or vulnerabilities of late, both in its code and in its app store. Any disclosures it made to the Federal government and to its app vendors would be covered by CISA unless Apple deliberately disclosed that information outside the terms of CISA — for example, by deliberately leaving personally identifiable information in any code it shared, which it’s not about to do. Apple will enjoy the protections in CISA whether it asked for them or not.

I can think of just two ways to avoid triggering the protections of CISA: either to report such vulnerabilities only as a crime report to FBI (which, because it bypassed DHS, would not get full protection, and which would be inappropriate for most kinds of vulnerability disclosures), or to disclose everything to the public. And that’s assuming there aren’t more specific disclosures — such as attempts to attack specific iCloud accounts — that would legitimately be intelligence reports. Google tells users if it thinks state actors are trying to compromise their accounts; is this appropriate to share with the government without process? Moreover, most of the companies that would voluntarily not participate already have people with clearance who can and do receive classified intelligence from the government. Plus, these companies can’t choose not to let their own traffic that transits the communications backbone be scanned by the backbone owners.
In other words, I’m not sure how a company can claim not to participate in CISA once it goes into effect unless it doesn’t share any information. And most of the big tech companies are already sharing this information among themselves, they want to continue to do that sharing, and that sharing would get CISA protections.
The problem is, there are a number of kinds of information sharing that will get the protections of CISA, all of which would count as “participating” in it. Anything Apple shared with the government or other companies would get CISA protection. But that’s far different from taking a signature the government shares and scanning all backbone traffic for instances of it, which is what Verizon and AT&T will almost certainly be doing under CISA. That is, there are activities that shouldn’t require legal process, and activities that currently do but will not under CISA. And to get a meaningful sense of whether someone is “participating” in CISA by performing activities that otherwise would require legal process, you’d need a whole lot of details about what they were doing — details that not even criminal defendants will ever get. You’d even need to distinguish activities companies would do of their own accord (Apple’s own scans of its systems for known vulnerabilities) from things done pursuant to information received from the federal government (a scan for a vulnerability Apple learned about from the government).
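To make concrete what “taking a signature the government shares and scanning all backbone traffic for instances of it” means in the simplest case, here is a minimal sketch. The indicator and traffic lines are entirely hypothetical; real backbone scanning would operate on packet flows at scale, not strings, but the logic is the same: every flow gets matched against the shared indicator, with no legal process attached to any individual match.

```python
# Hypothetical government-shared indicator of compromise (IOC).
shared_indicator = "evil-c2.example.com"

# Hypothetical observed traffic, reduced to one line per flow for illustration.
observed_traffic = [
    "GET http://news.example.org/index.html",
    "POST http://evil-c2.example.com/beacon",
    "GET http://shop.example.net/cart",
]

# Flag every flow that matches the shared signature.
hits = [line for line in observed_traffic if shared_indicator in line]
print(hits)  # ['POST http://evil-c2.example.com/beacon']
```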
We’re never going to get that kind of information from a transparency report, except insofar as companies detail the kinds of things they require legal process for in spite of CISA protection for doing them without legal process. That would not be the same thing as non-participation in CISA — because, again, most of the companies that have raised objections already share information at least with industry partners. But that’s about all we’d get short of really detailed descriptions of any scrubbing that goes on during such information sharing.
Apple recently released its latest transparency report for the period ending June 30, 2015. By comparing the numbers for two categories with previous reports (2H 2013, 1H 2014, 2H 2014) we can get some sense of how badly Apple’s move to encrypt data has really thwarted law enforcement.
Thus far, the numbers show that “going dark” may be a problem, but nowhere near as big of one as, say, NY’s DA Cy Vance claims.
The easier numbers to understand are the national security orders, presented in the mandated bands.
Since the iPhone 6 was introduced in September 2014, the numbers for orders received have gone up — one band in the second half of 2014, and two more bands in the first half of this year. Curiously, the number of accounts affected hasn’t gone up that much, possibly only tens or a hundred more accounts. And Apple still gets nowhere near the magnitude of requests Yahoo does, which number over 42,000.
Equally curiously, in the last period Apple clearly received more NatSec orders than accounts affected, which is the reverse of what other companies show (before, Apple had appeared close to one-to-one). One thing that might explain this is the quarterly renewal of Pen Register orders for metadata of US persons (which might be counted as 4 requests for each account affected).
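The quarterly-renewal explanation is just arithmetic, and a quick sketch shows how it could push orders above accounts. The numbers below are hypothetical, not Apple’s actual figures: if each pen register on a US person must be renewed every quarter, a single account monitored across a half-year reporting period shows up as two orders.

```python
# Hypothetical numbers for illustration only.
accounts_monitored = 25   # accounts under ongoing pen register orders
renewals_per_year = 4     # quarterly renewals

# Orders counted in one half-year report: two renewals per account.
half_year_orders = accounts_monitored * (renewals_per_year // 2)
print(half_year_orders)   # 50 orders, but only 25 accounts affected

# Orders exceed accounts even though no new targets were added.
assert half_year_orders > accounts_monitored
```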
In other words, clearly NatSec requests have gone up, proportionally significantly, though Apple remains a tiny target for NatSec requests compared to the bigger PRISM participants.
The law enforcement account requests are harder to understand.
Note, Apple distinguishes between device requests, which are often users seeking help with a stolen iPhone, and account requests, which are requests for either metadata or content associated with an account (and could even include purchase records). The latter are the ones that represent law enforcement trying to get data to investigate a user, and that’s why I’ve laid out the latter data here [note, I fully expect to have made some data errors here, and apologize in advance — please let me know what you see!!].
Here, too, Apple has seen a significant increase, of 23%, over the requests it got in the second half of last year. Though, note, the iPhone 6 introduction would not be the only thing affecting this: the June 2014 Riley Supreme Court decision, which required law enforcement to get a warrant to access cell phones, would probably also lead law enforcement to ask Apple for data more often.
Interestingly, however, there were fewer accounts implicated in the requests in the last half of the year, suggesting that for some reason law enforcement was submitting requests with a slew of accounts listed for each request. Whereas last year, LE submitted an average of over 6.5 accounts per request, this year they have submitted fewer than 3 accounts per request. This may reflect LE was submitting more identifiers from the same account — who knows?
The percentage of requests where content was obtained has gone up too, from 16% in 2013 to 24% in the first period including the iPhone 6 to 30% last quarter. Indeed, over half the period-on-period increase this period may stem from an increase in content requests (that is, the 107 more requests where content was obtained in the first half of the year, a period in which Apple got 183 more requests overall). Still, that number, 107 more successful requests for content this year than in the second half of last year, seems totally disproportionate to NYC DA Cy Vance’s claim that the NYPD was unable to access the content in 74 iPhones since the iPhone 6 was released (though note, that might represent 1 request for content from 74 iPhones).
Perhaps the most interesting numbers to compare are the number of times Apple objected (because the agency didn’t have the right kind of legal process or a signed document) and the number of times Apple disclosed no data (which would include all those times Apple successfully objected — which appears to include all those in the first number — as well as those times Apple didn’t have the account, and those times Apple was unable to hand over the data because a user hadn’t used default iCloud storage for messages). [Update, to put this more simply: the way to find the possible number of requests where encryption prevented Apple from sharing information is to subtract the Apple-objected number from the no-data number.] In the second half of 2013, Apple did not disclose any data 28.5% of the time. In the first half of this year, Apple did not disclose any data in just 18.6% of requests. Again, there are a lot of reasons why Apple would not turn over any data at all. But in general, cops are getting data more of the time when they give Apple requests than they were a few years ago.
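The update’s subtraction method can be sketched in a couple of lines. The component figures below are hypothetical, chosen only to illustrate the estimate; the post reports the results of this subtraction for the real data (65 and 80 cases), not the raw components.

```python
# Hypothetical half-year totals from a transparency report.
no_data_disclosed = 150  # requests for which Apple turned over no data at all
apple_objected = 85      # requests Apple successfully objected to on legal grounds

# Requests where something other than a legal objection (possibly encryption,
# possibly a nonexistent account) prevented disclosure.
possibly_encryption_blocked = no_data_disclosed - apple_objected
print(possibly_encryption_blocked)  # 65
```

Note this is an upper bound on encryption-blocked requests, since the remainder also includes accounts Apple simply didn’t have.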
More importantly, in just 65 cases in the first half of this year and 80 cases in the second half of last year did Apple not turn over any data for reasons other than some kind of legal objection — and both numbers are lower than in the two half-years preceding them. Each of those requests might represent hundreds of phones, but overall it’s a tiny number. So tiny it’s tough to understand where the NYPD’s 74 locked iPhones fit in (unless they did request data and Apple actually had it).
There’s one more place where unavailable encrypted data might show up in these numbers: the number of specific accounts for which data was disclosed. But as a percentage, what happened this year is not that different from what happened in 2013. In the second half of 2013, Apple provided some data (which can be content or metadata) for 57.6% of the accounts specified in requests. In the first half of this year, Apple provided some data for 51.6% of the accounts specified in requests — not that huge a difference. And of course, in the second half of last year, which may be an outlier but during much of which the iPhone 6 was out, Apple provided data for 88.5% of the accounts for which LE asked for data.
Overall, it’s very hard to see where the FBI and other law enforcement agencies are going dark — though they are having to ask Apple for content more often (which I consider a good thing).
Update: In talking to EFF’s Nate Cardozo about Apple’s most recent report, we agreed that Apple’s new category for Emergency Requests may be one other place where iPhone data is handed over (it doesn’t exist in the reports for previous half-year periods). Apple defines emergency requests this way:
Table 3 shows all the emergency and/or exigent requests that we have received globally. Pursuant to 18 U.S.C. §§ 2702(b)(8) and 2702(c)(4) Apple may voluntarily disclose information, including contents of communications and customer records, to a federal, state, or local governmental entity if Apple believes in good faith that an emergency involving imminent danger of death or serious physical injury to any person requires such disclosure without delay. The number of emergency requests that Apple deemed to be exigent and responded to is detailed in Table 3.
Given the scale of Apple’s other requests (though not the scale of cloud requests, comparatively), these are significant numbers, especially for the US (107) and UK (98).
Of significant note, Apple may give out content under emergency requests.
This is more likely to be a post-Riley response than an encryption response, but still notable given the number.
There were a number of interesting exchanges in the Senate Armed Services Committee hearing on cybersecurity today, which I’ll return to in a bit. But for the moment I wanted to point to this bizarre exchange featuring Bill Nelson.
Nelson: Admiral, I’m concerned about all of these private telecoms that are going to encrypt. If you have encryption of everything, how, in your opinion, does that affect Section 702 and 215 collection programs?
Rogers: It certainly makes it more difficult.
Nelson: Does the Administration have a policy position on this?
Rogers: No. I think we’re still — I mean, we’re the first to acknowledge this is an incredibly complicated issue, with a lot of very valid perspectives. And we’re still, I think, collectively trying to work through what’s the right way ahead, here, recognizing that there’s a lot of very valid perspectives but from the perspective as CyberCommand and NSA as I look at this issue, there’s a huge challenge here that we have got to deal with.
Nelson: A huge challenge? And I have a policy position. And that is that the telecoms better cooperate with the United States government or else … it just magnifies the ability for the bad guys to utilize the Internet to achieve their purposes.
Bill Nelson is apparently very upset by the increasing use of encryption, but seems to believe Apple — which is at the center of these discussions — is a telecom. I’m happy to consider Apple a “phone company,” given that iMessage messages would go through the Internet and Apple rather than cell providers, and I think the IC increasingly thinks of Apple as a phone company. But it’s not a telecom, which is a different legal category.
He also believes that Apple’s encryption would hurt NSA’s Section 215 collection program. And NSA Director Mike Rogers appears to agree!
It shouldn’t. While Apple’s use of encryption will make it harder to get iMessage content, the metadata should still be available. So I’m rather curious why Rogers agreed with Nelson.
In any case, Nelson doesn’t seem very interested in why Rogers immediately noted how complicated this question is — this is, after all, a hearing on cybersecurity and we know the Administration admits that more widespread encryption actually helps cybersecurity (especially since sophisticated hackers will be able to use other available encryption methods).
But I am intrigued that Rogers didn’t correct Nelson’s assertion that encryption would hurt the Section 215 program.
Update: This, from Apple’s transparency report, is one more reason Rogers’ agreement that encryption creates problems for the Section 215 program is so curious.
To date, Apple has not received any orders for bulk data.
During the July 1 Senate Judiciary Committee hearing on back doors, Deputy Attorney General Sally Yates claimed that the government doesn’t want back doors of its own into encrypted communications. Rather, it wants corporations to retain the back doors so they can access communications when the government has legal process. (After 1:43.)
We’re not going to ask the companies for any keys to the data. Instead, what we’re going to ask is that the companies have an ability to access it and then with lawful process we be able to get the information. That’s very different from what some other countries — other repressive regimes — from the way that they’re trying to get access to the information.
The claim was bizarre enough, especially as she went on to talk about other countries not having the same lawful process we have (as if that makes a difference to software code).
More importantly, that’s not true.
Remember what happened with Lavabit, when the FBI was in search of what is presumed to be Edward Snowden’s email. Lavabit owner Ladar Levison had a discussion with FBI about whether it was technically feasible to put a pen register on the targeted account. After which the FBI got a court order to do it. Levison tried to get the government to let him write a script that would provide them access to just the targeted account or, barring that, provide for some kind of audit to ensure the government wasn’t obtaining other customer data.
The unsealed documents describe a meeting on June 28th between the F.B.I. and Levison at Levison’s home in Dallas. There, according to the documents, Levison told the F.B.I. that he would not comply with the pen-register order and wanted to speak to an attorney. As the U.S. Attorney for the Eastern District of Virginia, Neil MacBride, described it, “It was unclear whether Mr. Levison would not comply with the order because it was technically not feasible or difficult, or because it was not consistent with his business practice in providing secure, encrypted e-mail service for his customers.” The meeting must have gone poorly for the F.B.I. because McBride filed a motion to compel Lavabit to comply with the pen-register and trap-and-trace order that very same day.
Magistrate Judge Theresa Carroll Buchanan granted the motion, inserting in her own handwriting that Lavabit was subject to “the possibility of criminal contempt of Court” if it failed to comply. When Levison didn’t comply, the government issued a summons, “United States of America v. Ladar Levison,” ordering him to explain himself on July 16th. The newly unsealed documents reveal tense talks between Levison and the F.B.I. in July. Levison wanted additional assurances that any device installed in the Lavabit system would capture only narrowly targeted data, and no more. He refused to provide real-time access to Lavabit data; he refused to go to court unless the government paid for his travel; and he refused to work with the F.B.I.’s technology unless the government paid him for “developmental time and equipment.” He instead offered to write an intercept code for the account’s metadata—for thirty-five hundred dollars. He asked Judge Hilton whether there could be “some sort of external audit” to make sure that the government did not take additional data. (The government plan did not include any oversight to which Levison would have access, he said.)
Most important, he refused to turn over the S.S.L. encryption keys that scrambled the messages of Lavabit’s customers, and which prevent third parties from reading them even if they obtain the messages.
The discussions disintegrated because the FBI refused to let Levison do what Yates now says they want providers to do: ensure that providers can hand over data tailored to meet a specific request. That’s when Levison tried to give the FBI his key in what it claimed (even though it has done the same for FOIAs and/or criminal discovery) was a type too small to read.
On August 1st, Lavabit’s counsel, Jesse Binnall, reiterated Levison’s proposal that the government engage Levison to extract the information from the account himself rather than force him to turn over the S.S.L. keys.
THE COURT: You want to do it in a way that the government has to trust you—
BINNALL: Yes, Your Honor.
THE COURT: —to come up with the right data.
BINNALL: That’s correct, Your Honor.
THE COURT: And you won’t trust the government. So why would the government trust you?
Ultimately, the court ordered Levison to turn over the encryption key within twenty-four hours. Had the government taken Levison up on his offer, he may have provided it with Snowden’s data. Instead, by demanding the keys that unlocked all of Lavabit, the government provoked Levison to make a last stand. According to the U.S. Attorney MacBride’s motion for sanctions,
At approximately 1:30 p.m. CDT on August 2, 2013, Mr. Levison gave the F.B.I. a printout of what he represented to be the encryption keys needed to operate the pen register. This printout, in what appears to be four-point type, consists of eleven pages of largely illegible characters. To make use of these keys, the F.B.I. would have to manually input all two thousand five hundred and sixty characters, and one incorrect keystroke in this laborious process would render the F.B.I. collection system incapable of collecting decrypted data.
The U.S. Attorneys’ office called Lavabit’s lawyer, who responded that Levison “thinks” he could have an electronic version of the keys produced by August 5th.
Levison came away from the debacle believing that the FBI didn’t understand what it was asking for when they asked for his keys.
One result of this newfound expertise, however, is that Levison believes there is a knowledge gap between the Department of Justice and law-enforcement agencies; the former did not grasp the implications of what the F.B.I. was asking for when it demanded his S.S.L. keys.
There’s a persistent rumor going around that Apple is in the secret FISA Court, fighting a government order to make its platform more surveillance-friendly — and they’re losing. This might explain Apple CEO Tim Cook’s somewhat sudden vehemence about privacy. I have not found any confirmation of the rumor.
Nicholas Weaver’s post describes how, because of the need to allow users to access their iMessage account from multiple devices (think desktop, laptop, iPad, and phone), Apple technically could give FBI a key.
In iMessage, each device has its own key, but its important that the sent messages also show up on all of Alice’s devices. The process of Alice requesting her own keys also acts as a way for Alice’s phone to discover that there are new devices associated with Alice, effectively enabling Alice to check that her keys are correct and nobody has compromised her iCloud account to surreptitiously add another device.
But there remains a critical flaw: there is no user interface for Alice to discover (and therefore independently confirm) Bob’s keys. Without this feature, there is no way for Alice to detect that an Apple keyserver gave her a different set of keys for Bob. Without such an interface, iMessage is “backdoor enabled” by design: the keyserver itself provides the backdoor.
So to tap Alice, it is straightforward to modify the keyserver to present an additional FBI key for Alice to everyone but Alice. Now the FBI (but not Apple) can decrypt all iMessages sent to Alice in the future.
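The design flaw Weaver describes can be reduced to a short sketch. Everything here is hypothetical and simplified (key names, the `lookup` function, the single-target wiretap); the point is only that when senders must trust the keyserver’s answer and the recipient cannot independently verify which keys others were given for her, the keyserver can silently add a wiretap key that the target never sees.

```python
# Honest device-key directory: each account maps to its device keys.
honest_keys = {
    "alice": ["alice-phone-key", "alice-laptop-key"],
    "bob": ["bob-phone-key"],
}

FBI_KEY = "fbi-escrow-key"  # hypothetical wiretap key
TARGET = "alice"

def lookup(requester, account):
    """Return the device keys a sender should encrypt to for `account`.

    A backdoored keyserver appends the wiretap key to the target's key
    set for everyone EXCEPT the target herself, so her own key check
    (requesting her own keys) shows nothing amiss.
    """
    keys = list(honest_keys[account])
    if account == TARGET and requester != TARGET:
        keys.append(FBI_KEY)  # everyone but Alice also encrypts to the FBI
    return keys

# Bob unknowingly encrypts every message to Alice to the FBI key as well:
print(lookup("bob", "alice"))    # ['alice-phone-key', 'alice-laptop-key', 'fbi-escrow-key']
# Alice, auditing her own account, sees only her legitimate devices:
print(lookup("alice", "alice"))  # ['alice-phone-key', 'alice-laptop-key']
```

This is why a user interface for verifying the other party’s keys matters: it would let Alice and Bob compare what the keyserver told each of them and catch the extra key.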
Admittedly, as heroic as Levison’s decision to shut down Lavabit rather than renege on a promise he made to his customers was, Apple has a lot more to lose here, strictly because of the scale involved. And in spite of the heated rhetoric, the FBI likely still trusts Apple more than it trusted Levison.
Still, it’s worth noting that Yates’ claim that the FBI doesn’t want keys to communications isn’t true — or at least wasn’t before her tenure as DAG. Because a provider, Levison, insisted on providing his customers what he had promised, the FBI grew so distrustful of him that they did demand a key.
This morning, Wired reports that the hackers who two years ago hacked an Escape and a Prius via physical access have hacked a Jeep Cherokee via remote (mobile phone) access. They accessed the vehicle’s Electronic Control Unit and from that were able to get to ECUs controlling the transmission and brakes, as well as a number of less critical items. The hackers are releasing a report [correction: this is Markey’s report], page 86 of which explains why cars have gotten so much more vulnerable (generally, a combination of being accessible via external communication networks, having more internal networks, and having far more ECUs that might have a vulnerability). It includes a list of the most and least hackable cars among the 14 they reviewed.
Today Ed Markey and Richard Blumenthal are releasing a bill meant to address some of these security vulnerabilities in cars.
Meanwhile — in a remarkably poorly timed announcement — Apple announced yesterday that it had hired Fiat Chrysler’s former quality guy, the guy who would have overseen development of both the hackable Jeep Cherokee and the safer Dodge Viper.
Doug Betts, who led global quality at Fiat Chrysler Automobiles NV until last year, is now working for the Cupertino, Calif.-based electronics giant but declined to comment on the position when reached Monday. Mr. Betts’ LinkedIn profile says he joined Apple in July and describes his title as “Operations-Apple Inc.” with a location in the San Francisco Bay Area but no further specifics.
Along with Mr. Betts, whose expertise points to a desire to know how to build a car, Apple recently recruited one of the leading autonomous-vehicle researchers in Europe and is building a team to work on those systems.
In 2009, when Fiat SpA took over Chrysler, CEO Sergio Marchionne tapped Mr. Betts to lead the company’s quality turnaround, giving him far-reaching authority over the company’s brands and even the final say on key production launches.
Mr. Betts abruptly left Fiat Chrysler last year to pursue other interests. The move came less than a day after the car maker’s brands ranked poorly in an influential reliability study.
Note, the poor quality ratings that preceded Betts’ departure from Fiat Chrysler pertained especially to infotainment systems, which points to electronics vulnerabilities generally.
As they get into the auto business, Apple and Google will have the luxury that struggling combustion engine companies don’t have — that they’re not limited by tight margins as they try to introduce bells and whistles to compete on the marketplace. But they’d do well to get this quality and security issue right from the start, because the kind of errors tech companies can tolerate — largely because they can remotely fix bugs and because an iPhone that prioritized design over engineering can’t kill you — will produce much bigger problems in cars (though remote patching will be easier in electric cars).
So let’s hope Apple’s new employee takes this hacking report seriously.
On May 7, the very same day the Second Circuit ruled that Congress has to say specifically what a surveillance bill means for the bill to mean that thing, Richard Burr engaged in a staged colloquy on the Senate floor where he claimed that the Section 215 bulk collection program collects IP addresses. After Andrew Blake alerted me to that and I wrote it up, Burr stuffed the claim into the memory hole and claimed, dubiously, to have made a misstatement in a planned colloquy.
Then, after Mitch McConnell created a crisis by missing the first Section 215 reauthorization deadlines, Burr submitted a bill that would immediately permit the bulk collection of IP addresses, plus a whole lot more, falsely telling reporters this was a “compromise” bill that would ensure a smooth transition between the current (phone) dragnet and its replacement system.
Which strongly suggests Burr's initial "misstatement" was simply an attempt to create a legislative record approving a vast expansion of the current dragnet, and that, when he got caught, he submitted a bill that would implement that expansion in fact.
This has convinced me we're going to need to watch these authoritarians like hawks, to prevent them from creating the appearance of authorizing vast surveillance systems without the public generally knowing that's what's happening.
So I reviewed the speech Mitch made on Friday (this appears from 4:30 to 15:00; unlike Burr's speech, the Congressional Record does reflect what Mitch actually said; h/t Steve Aftergood for the Congressional Record transcript). And amid misleading claims about what the "compromise" bill Burr was working on would do, Mitch suggested something remarkable: among the data he's demanding be retained are documents, not just call data.
I’ve placed the key part of Mitch’s comments below the rule, with my interspersed comments. As I show, one thing Mitch does is accuse providers of an unwillingness to provide data when in fact what he means is far more extensive cooperation. But I’m particularly interested in what he says about data retention:
The problem, of course, is that the providers have made it abundantly clear that they will not commit to retaining the data for any period of time as contemplated by the House-passed bill unless they are legally required to do so. There is no such requirement in the bill. For example, one provider said the following: “[We are] not prepared to commit to voluntarily retain documents for any particular period of time pursuant to the proposed USA FREEDOM Act if not otherwise required by law.”
Now, one credulous journalist told me the other day that telecoms were refusing to speak to the Administration at all, which he presumably parroted from sources like Mitch. That’s funny, because not only did the telecom key to making the program work — Verizon — provide testimony to Congress (which is worth reviewing, because Verizon Associate General Counsel — and former FBI lawyer — Michael Woods pointed to precisely what the dragnet would encompass under Burr’s bill, including VOIP, peer-to-peer, and IP collection), but Senator Feinstein has repeatedly made clear the telecoms have agreed with the President to keep data for two years.
Furthermore, McConnell's quotation of this line from a (surely highly classified) letter cannot be relied on. Verizon at first refused to retain data before it made its data handshake with the President. So when did this provider send this letter, and does its stance remain the same? Mitch doesn't say, and given how many other misleading claims he made in his speech, it's unwise to trust him on this point.
Most curiously, though, look at what they’re refusing to keep. Not phone data! But documents.
Both USA F-ReDux and Burr's bill only protect messaging contents, not other kinds of content (and Burr's excludes anything that might be Dialing, Routing, Addressing and Signaling data from his definition of content, which is the definition John Bates adopted in 2010 to permit NSA to resume collecting Internet metadata in bulk). Both include remote computing services (cloud services) among the providers envisioned to be included not just under the bill, but under the "Call Detail Record" provision.
Perhaps there’s some other connotation for this use of the word “documents.” Remember, I think the major target of data retention mandates is Apple, because Jim Comey wants iMessage data that would only be available from their cloud.
But documents? What the hell kind of “Call Detail Records” is Mitch planning on here?
One more thing is remarkable about this. Mitch is suggesting it will take providers longer to comply with this system than it took them to comply with the Protect America Act. Yahoo, for example, challenged its orders and immediately refused to comply on November 8, 2007. Yet, even while challenging that order and appealing, Yahoo started complying with it on May 5, 2008, the same 180-day time frame envisioned here. And virtually all of the major providers already have some kind of compliance mechanism in place, either through PRISM (Apple, Google, and Microsoft) or upstream 702 compliance (AT&T and Verizon).
Last night, Mitch McConnell dealt himself a humiliating defeat. As I correctly predicted a month before events played out, McConnell tried to create a panic that would permit him and Richard Burr to demand changes — including iMessage retention, among other things — to USA F-ReDux. That is, in fact, what Mitch attempted to do, as is evident from the authoritarian power grab Burr released around 8:30 last night (that is, technically after the Administration had already missed the FISA Court deadline to renew the dragnet).
Contrary to a lot of absolutely horrible reporting on Burr’s bill, it does not actually resemble USA F-ReDux.
As I laid out here, it would start by gutting ECPA, such that the FBI could resume using NSLs to do the bulky Internet collection that moved to Section 215 production in 2009.
It also vastly expanded the application of the call record function (which it very explicitly applied to electronic communications providers, meaning it would include all Internet production, though that is probably what USA F-ReDux does implicitly), such that it could be used against Americans for any counterterrorism or counterintelligence (which includes leaks and cybersecurity) function, and for foreigners (which would chain onto Americans) for any foreign intelligence purpose. The chaining function includes the same vague language from USA F-ReDux which, in the absence of the limiting language in the House Judiciary Committee bill report, probably lets the government chain on session identifying information (like location and cookies, but possibly even things like address books) to do pattern analysis on providers’ data. Plus, the bill might even permit the government to do this chaining in provider data, because it doesn’t define a key “permit access” term.
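To make the chaining concern concrete, here is a toy two-hop contact-chaining query of the sort the call-record function contemplates. The records and identifiers below are invented for illustration; real queries would run over session-identifying information of the kind discussed above, at far larger scale.

```python
# Toy two-hop "contact chaining" from a seed identifier.
# All data and identifiers here are hypothetical.

from collections import defaultdict

def build_graph(records):
    """Build an undirected contact graph from (id_a, id_b) record pairs."""
    graph = defaultdict(set)
    for a, b in records:
        graph[a].add(b)
        graph[b].add(a)
    return graph

def chain(graph, seed, hops=2):
    """Return every identifier reachable within `hops` of the seed."""
    frontier, seen = {seed}, {seed}
    for _ in range(hops):
        frontier = {n for node in frontier for n in graph[node]} - seen
        seen |= frontier
    return seen - {seed}

records = [("seed", "a"), ("a", "b"), ("b", "c"), ("x", "y")]
graph = build_graph(records)

# Two hops from the seed sweep in "a" and "b", but not the
# unconnected pair ("x", "y"):
assert chain(graph, "seed", hops=2) == {"a", "b"}
```

The thing to notice is how fast the reachable set grows with each hop and with each additional identifier type (cookies, location, address books) treated as a chaining link — which is why the vagueness of the chaining language matters.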
Burr's bill applies EO 12333 minimization procedures (and notice), not the stronger Section 215 ones Congress mandated in 2006; while USA F-ReDux data will already be shared far more widely than it is now, this would ensure that no defendant ever gets to challenge this collection. It imposes a 3-year data retention mandate (which would be a significant new burden on both Verizon and Apple). It appears to flip the amicus provision on its head, such that if Verizon or Apple challenged retention or any other part of the program, the FISC could appoint a lawyer for the tech companies and tell that lawyer to argue for retention. And as the pièce de résistance, the bill creates its very own Espionage Act, imposing 10-year prison terms on anyone who reveals precisely what's happening in this expanded querying function at providers.
It is, in short, the forced-deputization of the nation’s communications providers to conduct EO 12333 spying on Americans within America.
Had Mitch had his way, after both USA F-ReDux and his 2-month straight reauthorization failed to get cloture, he would have asked for a week extension, during which the House would have been forced to come back to work and accept — under threat of “going dark” — some of the things demanded in Burr’s bill.
It didn’t work out.
But as it was, USA F-ReDux had far more support than the short-term reauthorization. Both McConnell and Rand Paul voted against both, for very different reasons. The difference in the vote results, however, was that Joe Donnelly (D), Jeff Flake (R), Ron Johnson (R), James Lankford (R), Bill Nelson (D), Tim Scott (R), and Dan Sullivan (R) voted yes to both. McConnell’s preferred option didn’t even get a majority of the vote, because he lost a chunk of his members.
Then McConnell played the hand he believed would give himself and Burr leverage. The plan — as I stated — was to get a very short term reauthorization passed and in that period force through changes with the House (never mind that permitting that to happen might have cost Boehner his Speakership, that’s what McConnell and Burr had in mind).
First, McConnell asked for unanimous consent to pass an extension to June 8. (h/t joanneleon for making the clip) But Paul, reminding the chamber that this country's founders opposed General Warrants and demanding two majority-vote amendments, objected. McConnell then asked for a June 5 extension, to which Ron Wyden objected. McConnell asked for an extension to June 3. Martin Heinrich objected. McConnell asked for an extension to June 2. Paul objected.
McConnell’s bid failed. And he ultimately scheduled the Senate to return on Sunday afternoon, May 31.
By far the most likely outcome at this point is that enough Senators — likely candidates are Mark Kirk, Angus King, John McCain, Joni Ernst, or Susan Collins — flip their vote on USA F-ReDux, which will then be rushed to President Obama just hours before Section 215 (and with it, Lone Wolf and Roving Wiretaps) expires on June 1. But even that (because of when McConnell scheduled it) probably requires Paul to agree to an immediate vote.
But if not, it won’t be the immediate end of the world.
On this issue, too, the reporting has been horrible, extending to almost universal misrepresentation of what Jim Comey said about the importance of the expiring provisions — I've laid out what he really said and what it means here. Comey cares first and foremost about the other Section 215 uses, almost surely the bulky Internet collection that moved there in 2009. But because those orders are tied to existing investigations (presumably of more focused subjects than the standing counterterrorism investigation used to justify the phone dragnet), they will be grandfathered at least until whatever expiration dates they carry, if not longer. So the FBI will be anxious to restore that authority (or move it back to NSLs, as Burr's bill would do), especially since, unlike with the phone dragnet, there aren't other ways to get the data. But there's some time left to do that.
Comey also said the Roving Wiretap is critical. I’m guessing that’s because they use it to target things like Tor relays. But if that’s the primary secretly redefined function, they likely have learned enough about the Tor relays they’re parked on to get individual warrants. And here, too, the FBI likely won’t have to detask until expiration days on these FISA orders come due.
As for the phone dragnet and the Lone Wolf? Those are less urgent, according to Comey.
Now, that might help the Republicans who want to jam through some of Burr’s demands, since most moderate reformers assume the phone dragnet is the most important function that expires. Except that McConnell and others have spent so long pretending that this is about a phone dragnet that in truth doesn’t really work, that skittish Republicans are likely to want to appear to do all they can to keep the phone dragnet afloat.
As I said, the most likely outcome is that a number of people flip their vote and help pass USA F-ReDux.
But as with last night’s “debate,” no one really knows for sure.