As I laid out when he gave his speech at Brookings, Jim Comey’s public explanation for needing back doors to Apple and Android phones doesn’t hold up. He conflated stored communication with communication in transit, ignored the risk of a back door (which he called a front door), and presented law enforcement successes that, across the board, do not support his claim to need a back door.
So yesterday Comey and others had a classified briefing, where no one would be able to shred his flawed case.
FBI and Justice Department officials met with House staffers this week for a classified briefing on how encryption is hurting police investigations, according to staffers familiar with the meeting.
The briefing included Democratic and Republican aides for the House Judiciary and Intelligence Committees, the staffers said. The meeting was held in a classified room, and aides are forbidden from revealing what was discussed.
Comey called for Congress to revise the law to create a “level playing field” so that Google, Apple, and Facebook have the same obligation as AT&T and Verizon to help police.
National Journal listed out those companies, by the way — Facebook, for example, did not appear in Comey’s Brookings speech, where he made the “level the playing field” comment.
I was puzzled by Comey’s inclusion of Facebook here until I saw this news.
To make their experience more consistent with our goals of accessibility and security, we have begun an experiment which makes Facebook available directly over Tor network at the following URL:
[ NOTE: link will only work in Tor-enabled browsers ]
Facebook Onion Address
Facebook’s onion address provides a way to access Facebook through Tor without losing the cryptographic protections provided by the Tor cloud.
The idea is that the Facebook onion address connects you to Facebook’s Core WWW Infrastructure – check the URL again, you’ll see what we did there – and it reflects one benefit of accessing Facebook this way: that it provides end-to-end communication, from your browser directly into a Facebook datacentre.
All that got me thinking about what Comey said in the classified briefing — and the real reason he wants to make us all less secure.
And I can’t help but wonder whether it’s metadata.
The government aspires to get universal potential coverage of telephony (at least) metadata under USA Freedom Act, with the ability to force cooperation. But I’m not sure that Apple, especially, would be able to provide iMessage metadata, meaning iPhone users can text without leaving metadata available to either AT&T (because it bypasses the telecom network) or Apple itself (because it no longer has guaranteed remote access).
And without metadata, FBI and NSA would be unable to demonstrate the need to do a wiretap of such content.
Ah well, once again I reflect on what a pity it is that FBI didn’t investigate the theft of data from these same companies, providing them a very good reason to lock it all up from sophisticated online criminals like GCHQ.
At his Brookings event yesterday, Jim Comey claimed that there is a misperception, in the wake of the Snowden releases, about how much data the government obtains.
In the wake of the Snowden disclosures, the prevailing view is that the government is sweeping up all of our communications. That is not true. And unfortunately, the idea that the government has access to all communications at all times has extended—unfairly—to the investigations of law enforcement agencies that obtain individual warrants, approved by judges, to intercept the communications of suspected criminals.
It frustrates me, because I want people to understand that law enforcement needs to be able to access communications and information to bring people to justice. We do so pursuant to the rule of law, with clear guidance and strict oversight.
He goes on to pretend that Apple and Google are default encrypting their phones solely as a marketing gimmick, some arbitrary thing crazy users want.
Both companies are run by good people, responding to what they perceive is a market demand. But the place they are leading us is one we shouldn’t go to without careful thought and debate as a country.
Encryption isn’t just a technical feature; it’s a marketing pitch. But it will have very serious consequences for law enforcement and national security agencies at all levels. Sophisticated criminals will come to count on these means of evading detection. It’s the equivalent of a closet that can’t be opened. A safe that can’t be cracked. And my question is, at what cost?
He ends with a plea that “our private sector partners … consider changing course.”
But we have to find a way to help these companies understand what we need, why we need it, and how they can help, while still protecting privacy rights and providing network security and innovation. We need our private sector partners to take a step back, to pause, and to consider changing course.
There’s something missing from Comey’s tale.
An explanation of why the FBI has not pursued the sophisticated criminals who stole Google’s data overseas.
At a recent event with Ron Wyden, the Senator asked Schmidt to weigh in on the phone encryption “kerfuffle.” And Schmidt was quite clear: the reason Google and Apple are doing this is because the NSA’s partners in the UK stole their data, even while they had access to it via PRISM.
The people who are criticizing this should have expected this. After Google was attacked by the British version of the NSA, we were annoyed and so we put end-to-end encryption at rest, as well as through our systems, making it essentially impossible for interlopers — of any kind — to get that information.
Schmidt describes the default encryption on the iPhone, notes that it has been available for the last 3 years on Android phones, and will soon be standard, just like it is on iPhone.
Law enforcement has many many ways of getting information that they need to provide this without having to do it without court orders and with the possible snooping conversation. The problem when they do it randomly as opposed to through a judicial process is it erodes user trust.
If everything Comey said were true, if this were only about law enforcement getting data with warrants, Apple — and Google especially — might not have offered their customers the privacy they deserved. But it turns out Comey’s fellow intelligence agency decided to just go take what they wanted.
And FBI did nothing to solve that terrific hack and theft of data.
I guess FBI isn’t as interested in rule of law as Comey says.
Today, Jim Comey will give what will surely be an aggressively moderated (by Ben Wittes!) talk at Brookings, arguing that Apple should not offer its customers basic privacy tools (congratulations to NYT’s Michael Schmidt for beating the rush of publishing credulous reports on this speech).
Mr. Comey will say that encryption technologies used on these devices, like the new iPhone, have become so sophisticated that crimes will go unsolved because law enforcement officers will not be able to get information from them, according to a senior F.B.I. official who provided a preview of the speech.
Never mind the numbers, which I laid out here. While Apple doesn’t break out its device requests from last year by type, it says the vast majority of the 3,431 device requests it responded to were in response to a lost or stolen phone, not law enforcement seeking data on the holder. Given that iPhones represent the better part of the estimated 3.1 million phones that will be stolen this year, that’s a modest claim. Moreover, given that Apple only provided content off the cloud to law enforcement 155 times last year, it’s unlikely we’re talking about a common law enforcement practice.
At least not with warrants. Warrantless fishing expeditions are another issue.
As far back as 2010, CBP was conducting 4,600 device searches at the border. Given that 20% of the country will be carrying iPhones this year, and a much higher number of the Americans who cross international borders will be carrying one, a reasonable guess would be that CBP searches 1,000 iPhones a year (and it could be several times that). Cops used to be able to do the same at traffic stops until this year’s Riley v. California decision; I’ve not seen numbers on how many searches they did, but given that most of those were (like the border searches) fishing expeditions, it’s not clear how many will be able to continue, because law enforcement won’t have probable cause to get a warrant.
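That ~1,000-a-year guess can be reconstructed as back-of-the-envelope arithmetic. A minimal sketch, where every input is the post’s own assumption rather than measured data:

```python
# Back-of-the-envelope reconstruction of the estimate in the text.
# All figures are the post's own assumptions, not measured data.
device_searches_2010 = 4600   # CBP device searches, as far back as 2010
iphone_share = 0.20           # rough share of the population with iPhones

iphone_searches = device_searches_2010 * iphone_share
print(round(iphone_searches))  # → 920, i.e. roughly 1,000 a year

# International travelers likely skew toward smartphone ownership,
# so this is a floor; the true figure "could be several times that."
```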
So the claims law enforcement is making about needing to get content stored on and only on iPhones with a warrant don’t hold up, except for very narrow exceptions (cops may lose access to iMessage conversations if all users in question know not to store those conversations on iCloud, which is otherwise the default).
But that’s not the best argument I’ve seen for why Comey should back off this campaign.
As a number of people (including the credulous Schmidt) point out, Comey repeated his attack on Apple on the 60 Minutes show Sunday.
James Comey: The notion that we would market devices that would allow someone to place themselves beyond the law, troubles me a lot. As a country, I don’t know why we would want to put people beyond the law. That is, sell cars with trunks that couldn’t ever be opened by law enforcement with a court order, or sell an apartment that could never be entered even by law enforcement. Would you want to live in that neighborhood? This is a similar concern. The notion that people have devices, again, that with court orders, based on a showing of probable cause in a case involving kidnapping or child exploitation or terrorism, we could never open that phone? My sense is that we’ve gone too far when we’ve gone there.
What no one I’ve seen points out is there was an equally charismatic FBI Director named Jim Comey on 60 Minutes a week ago Sunday (these are actually the same interview, or at least use the same clip to marvel that Comey is 6’8″, which raises interesting questions about why both these clips weren’t on the same show).
That Jim Comey made a really compelling argument about how most people don’t understand how vulnerable they are now that they live their lives online.
James Comey: I don’t think so. I think there’s something about sitting in front of your own computer working on your own banking, your own health care, your own social life that makes it hard to understand the danger. I mean, the Internet is the most dangerous parking lot imaginable. But if you were crossing a mall parking lot late at night, your entire sense of danger would be heightened. You would stand straight. You’d walk quickly. You’d know where you were going. You would look for light. Folks are wandering around that proverbial parking lot of the Internet all day long, without giving it a thought to whose attachments they’re opening, what sites they’re visiting. And that makes it easy for the bad guys.
Scott Pelley: So tell folks at home what they need to know.
James Comey: When someone sends you an email, they are knocking on your door. And when you open the attachment, without looking through the peephole to see who it is, you just opened the door and let a stranger into your life, where everything you care about is.
That Jim Comey — the guy worried about victims of computer crime — laid out the horrible things that can happen when criminals access all the data you’ve got on devices.
Scott Pelley: And what might that attachment do?
James Comey: Well, take over the computer, lock the computer, and then demand a ransom payment before it would unlock. Steal images from your system of your children or your, you know, or steal your banking information, take your entire life.
Now, victim-concerned Jim Comey seems to think we can avoid such vulnerability by educating people not to click on any attachment they might have. But of course, for the millions who have their cell phones stolen, they don’t even need to click on an attachment. The crooks will have all their victims’ data available in their hand.
Unless, of course, users have made that data inaccessible. One easy way to do that is by making easy encryption the default.
Victim-concerned Jim Comey might offer 60 Minute viewers two pieces of advice: be careful of what you click on, and encrypt those devices that you carry with you — at risk of being lost or stolen — all the time.
Of course, that would set off a pretty intense fight with fear-monger Comey, the guy showing up to Brookings today to argue Apple’s customers shouldn’t have this common sense protection.
That’s a debate I’d enjoy watching Ben Wittes try to moderate.
Former FBI Assistant Director Ronald Hosko apparently isn’t afraid to embarrass himself to fear-monger for law enforcement.
That’s the only conclusion I can reach by his penning this op-ed, which still bears its original title in the URL.
In it, Ronald T. Hosko claimed shamelessly that if Apple had been employing its new encryption plans earlier this year, a kidnap victim the FBI rescued would be dead. The two nut paragraphs originally read,
It made no sense! As Hosko correctly explained, they solved this case with lawful intercepts of phone content.
Once we identified potential conspirators, we quickly requested and secured the legal authority to intercept phone calls and text messages on multiple devices.
Even if the kidnappers had a new iPhone, FBI would still go to precisely the same source they did go to — the telecom providers — to get the intercepts. The FBI never even had the actual phones of kidnappers in hand — except for the phone the gang leader used to direct the plot from prison, which he crushed before it could be investigated, a technology that has been available to thugs far longer than encryption has.
So it is quite clear that, had this technology been used by the conspirators in this case, the FBI would still have caught them, using precisely the same process they did use to catch them.
After Hosko got called on his false claims on Twitter, he made two corrections — first to this interim fallback. (h/t @empiricalerror for catching this)
That didn’t make any more sense, as they were tracing calls made from the kidnappers. Once they got close enough to examine their actual devices, they had the kidnappers. Now he has changed it to read:
Last week, Apple and Google announced that their new operating systems will be encrypted by default. Encrypting a phone doesn’t make it any harder to tap, or “lawfully intercept” calls. But it does limit law enforcement’s access to data, contacts, photos and email stored on the phone itself.
That kind of information can help law enforcement officials solve big cases quickly. For example, criminals sometimes avoid phone interception by communicating plans via Snapchat or video. Their phones contain contacts, texts, and geo-tagged data that can help police track down accomplices. These new rules will make it impossible for us to access that information. They will create needless delays that could cost victims their lives.*
* Editors note: This story incorrectly stated that Apple and Google’s new encryption rules would have hindered law enforcement’s ability to rescue the kidnap victim in Wake Forest, N.C. This is not the case. The piece has been corrected.
Phew. Apparently all this surveillance technology is hard to keep straight, even for an experienced FBI guy. But the truly funny part of Hosko’s piece — now that he at least has some semblance of factual accuracy (though I think he’s still exaggerating about video and Snapchat) — is where he suggests that we should not avail ourselves of any technologies that make it easier on criminals.
If our cutting edge technologies are designed to keep important dots out of the hands of our government, we all might start thinking about how safe and secure we will be when the most tech-savvy, dedicated criminals exponentially increase their own success rates.
This would lead you to believe Hosko is unaware of the “cutting edge technology” that has probably kept more crime-solving information out of the hands of the government than any measly encryption: incorporation. Drug cartels, human traffickers, even dreaded banksters all use shell corporations as a favored technology to not only hide the evidence of their crime, but to dodge accountability if it ever is discovered. That snazzy technology, the corporation, has empowered criminals far more than cell phone encryption — with all the possible workarounds — will ever do.
Yet if you called for eliminating a beneficial technology like the corporation just because criminals also happen to find it useful, people would consider you batshit insane. It would be a totally disproportionate measure, trading away real benefits in the name of relative but not absolute safety.
But hey! Hosko has already embarrassed himself. So if he feels like doing so again, by all means, I implore him to call for the elimination of the corporation — or even just a few of the exotic financial tools that the most dangerous financial criminals use.
After all, it will make us safer!
As I noted the other day, Apple just rolled out — and Google plans to match with its next Android release — passcode protected encryption for its cell phone handsets.
Last night WSJ had a story quoting some fairly hysterical law enforcement types complaining mightily not just that Apple is offering its customers security, but that it is a marketing feature.
Last week’s announcements surprised senior federal law-enforcement officials, some of whom described it as the most alarming consequence to date of the frayed relationship between the federal government and the tech industry since the Snowden revelations prompted companies to address customers’ concerns that the firms were letting—or helping—the government snoop on their private information.
Senior U.S. law-enforcement officials are still weighing how forcefully to respond, according to several people involved in the discussions, and debating how directly they want to challenge Apple and Google.
One Justice Department official said that if the new systems work as advertised, they will make it harder, if not impossible, to solve some cases. Another said the companies have promised customers “the equivalent of a house that can’t be searched, or a car trunk that could never be opened.”
Andrew Weissmann, a former Federal Bureau of Investigation general counsel, called Apple’s announcement outrageous, because even a judge’s decision that there is probable cause to suspect a crime has been committed won’t get Apple to help retrieve potential evidence. Apple is “announcing to criminals, ‘use this,’ ” he said. “You could have people who are defrauded, threatened, or even at the extreme, terrorists using it.”
I think the outrage about the stated case — that law enforcement will no longer be able to have Apple unlock a phone with a warrant — is overblown. As Micah Lee points out, the same data will likely be available on Apple’s Cloud.
But despite these nods to privacy-conscious consumers, Apple still strongly encourages all its users to sign up for and use iCloud, the internet syncing and storage service where Apple has the capability to unlock key data like backups, documents, contacts, and calendar information in response to a government demand. iCloud is also used to sync photos, as a slew of celebrities learned in recent weeks when hackers reaped nude photos from the Apple service. (Celebrity iCloud accounts were compromised when hackers answered security questions correctly or tricked victims into giving up their credentials via “phishing” links, Cook has said.)
And the stuff that won’t be on Apple’s Cloud will largely be available from a user’s phone provider — AT&T and Verizon will have call records and texts, for example. So one effect of this will be to put warrant decisions into a review process more likely to be scrutinized (though not in the case of AT&T, which has consistently proven all too happy to share data with the Feds).
Which is why I think the hysteria is either overblown or is about something else.
It may be that this prevents NSA from getting into handsets via some means we don’t understand. Matthew Green lays out how this change will bring real security improvement to your phone from all manner of hackers.
But the most immediate impact of this, I suspect, will be seen at borders — or rather, the government’s expansive 100 mile “border zone,” which incorporates roughly two-thirds of the country’s population. At “borders” law enforcement works under a warrant exception that permits them to search devices — including cell phones — without a warrant, or even any articulable suspicion.
And while it is the case that really aggressive security wonks can and do encrypt their phones now, it is not the default. Which means most people who cross an international border — or get stopped by some authority in that border zone — have their phone contents readily available to those authorities to search. Authorities routinely use their expanded border authority to obtain precisely the kinds of things at issue here, without any suspicion. The terrorist watchlist guidelines (see page 68), for example, note that border encounters may provide evidence from “electronic media/devices observed or copied,” including cell phones.
In 2011, DHS whipped out similarly hysterical language about what horribles actually requiring suspicion before searching a device might bring about.
[A]dding a heightened [suspicion-based] threshold requirement could be operationally harmful without concomitant civil rights/civil liberties benefit. First, commonplace decisions to search electronic devices might be opened to litigation challenging the reasons for the search. In addition to interfering with a carefully constructed border security system, the litigation could directly undermine national security by requiring the government to produce sensitive investigative and national security information to justify some of the most critical searches. Even a policy change entirely unenforceable by courts might be problematic; we have been presented with some noteworthy CBP and ICE success stories based on hard-to-articulate intuitions or hunches based on officer experience and judgment. Under a reasonable suspicion requirement, officers might hesitate to search an individual’s device without the presence of articulable factors capable of being formally defended, despite having an intuition or hunch based on experience that justified a search.
That is, DHS thinks it should be able to continue to search your phone at the border, because if it had to provide a rationale — say, to get a warrant — it might have to disclose the dodgy watchlisting policies that it uses to pick whose devices to search without any cause.
In other words, I’m arguing that the most immediate impact of this will be to lessen the availability of data increasingly obtained without a warrant, and given that the alternate means — administrative orders and warrants — require actual legal process, may mean these things will not be available at all.
If I’m right, though, that’s not a technical impediment. It’s a legal one, one which probably should be in place.
Update: Argh! This is even worse fear-mongering. A former FBI guy says he used intercepted communications to find kidnappers.
Once we identified potential conspirators, we quickly requested and secured the legal authority to intercept phone calls and text messages on multiple devices.
Then claims losing an entirely unrelated ability to search — for data stored on, and only on, handsets — would have prevented them from finding that kidnap victim.
Last week, Apple and Android announced that their new operating systems will be encrypted by default. That means the companies won’t be able to unlock phones and iPads to reveal the photos, e-mails and recordings stored within.
It also means law enforcement officials won’t be able to look at the range of data stored on the device, even with a court-approved warrant. Had this technology been used by the conspirators in our case, our victim would be dead.
Instead of proving this guy would be dead, the story proves that this is not the most pressing information.
There were two significant pieces of Apple security news yesterday.
On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode. Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.
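The mechanism Apple describes can be sketched in miniature. This is a deliberately simplified stand-in, not Apple’s actual scheme (which also entangles a hardware UID key and rate-limits passcode guesses), but it illustrates why a key derived from the passcode is one Apple cannot reconstruct:

```python
import hashlib
import os

# Simplified stand-in for passcode-based device encryption: the key is
# derived from the user's passcode plus a per-device salt, so nobody
# without the passcode (the vendor included) can reconstruct it.
# Apple's real design also mixes in a hardware UID key and enforces
# escalating delays on wrong guesses; this only shows the derivation idea.
salt = os.urandom(16)  # stands in for a per-device value

key = hashlib.pbkdf2_hmac("sha256", b"1234", salt, 100_000)

# A wrong passcode derives a different key, so the data stays sealed.
wrong = hashlib.pbkdf2_hmac("sha256", b"0000", salt, 100_000)
assert key != wrong
assert len(key) == 32  # a 256-bit, AES-sized key
```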
I find the comment as interesting for the list of things Apple envisions potentially having to hand over as I do for the security claim (though the security claim is admirable).
Though Apple’s promise to protect this kind of data only goes so far; as the NYT makes clear, that doesn’t extend to data stored on Apple’s cloud.
The new security in iOS 8 protects information stored on the device itself, but not data stored on Apple’s cloud service. So Apple will still be able to hand over some customer information stored on iCloud in response to government requests.
Which brings us to the second piece of news. As GigaOm notes, Apple’s warrant canary indicating that it has never received a Section 215 order has disappeared.
When Apple published its first Transparency Report on government activity in late 2013, the document contained an important footnote that stated:
“Apple has never received an order under Section 215 of the USA Patriot Act. We would expect to challenge such an order if served on us.”
Warrant canaries are a tool used by companies and publishers to signify to their users that, so far, they have not been subject to a given type of law enforcement request such as a secret subpoena. If the canary disappears, then it is likely the situation has changed — and the company has been subject to such request.
Now, Apple’s warrant canary has disappeared. A review of the company’s last two Transparency Reports, covering the second half of 2013 and the first six months of 2014, shows that the “canary” language is no longer there.
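The canary mechanism GigaOm describes is simple enough to sketch: a reader just checks each successive transparency report for the canary sentence. The report texts below are hypothetical stand-ins, not the actual documents:

```python
# Hypothetical sketch of how a reader might track a warrant canary:
# search each transparency report for the canary language and notice
# the report where it disappears. Report texts here are stand-ins.
CANARY = "Apple has never received an order under Section 215"

def canary_alive(report_text: str) -> bool:
    """True if the canary language is still present in the report."""
    return CANARY in report_text

report_late_2013 = (
    "Apple has never received an order under Section 215 of the "
    "USA Patriot Act. We would expect to challenge such an order."
)
report_mid_2014 = "Report covering the first six months of 2014."

print(canary_alive(report_late_2013))  # → True: canary still alive
print(canary_alive(report_mid_2014))   # → False: the canary is dead
```

The point of the design is that the company never has to *say* it received an order — which a gag might forbid — it merely stops repeating that it hasn’t.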
Note, GigaOm goes on to mistakenly state that Section 215 is the basis for PRISM, which doesn’t detract from the importance of noting the dead warrant canary. The original PRISM slides indicate that Apple started complying with Section 702 (PRISM) in October 2012, and the ranges in Apple’s government request data probably reflect at least some of its Section 702 compliance to provide content.
So Apple receiving its first Section 215 order sometime last year would reflect either a different kind of request — one not available by targeting someone overseas, as required under Section 702 — or a request for the kind of information it has already provided via a new authority, Section 215.
Many of the things listed above — at a minimum, call history, but potentially things like contacts and the titles of iTunes content (remember, James Cole has confirmed the government could use Section 215 to get URL searches, and we know they get purchase records) — can be obtained under Section 215.
I find Apple’s dead warrant canary of particular interest given the revelation in the recent DOJ IG Report on National Security Letters that some “Internet companies” started refusing NSLs for certain kinds of content starting in 2009; that collection has moved to Section 215 authority, and it now constitutes a majority of the 200-some Section 215 orders a year.
The decision of these [redacted] Internet companies to discontinue producing electronic communication transactional records in response to NSLs followed public release of a legal opinion issued by the Department’s Office of Legal Counsel (OLC) regarding the application of ECPA Section 2709 to various types of information. The FBI General Counsel sought guidance from the OLC on, among other things, whether the four types of information listed in subsection (b) of Section 2709 — the subscriber’s name, address, length of service, and local and long distance toll billing records — are exhaustive or merely illustrative of the information that the FBI may request in an NSL. In a November 2008 opinion, the OLC concluded that the records identified in Section 2709(b) constitute the exclusive list of records that may be obtained through an ECPA NSL.
Although the OLC opinion did not focus on electronic communication transaction records specifically, according to the FBI, [redacted] took a legal position based on the opinion that if the records identified in Section 2709(b) constitute the exclusive list of records that may be obtained through an ECPA NSL, then the FBI does not have the authority to compel the production of electronic communication transactional records because that term does not appear in subsection (b).
We asked whether the disagreement and uncertainty over electronic communication transactional records has negatively affected national security investigations. An Assistant General Counsel in NSLB told us that the additional time it takes to obtain transactional records through a Section 215 application slows down national security investigations, all of which he said are time-sensitive. He said that an investigative subject can cease activities or move out of the country within the time-frame now necessary to obtain a FISA order. [my emphasis]
These Internet company refusals must pertain to somewhat exotic requests, otherwise the government would simply take the companies to court one time apiece and win that authority. So we should assume the government was making somewhat audacious requests using NSLs, some companies refused, and it now uses Section 215 to do the collection. Another signal that these requests are fairly audacious is that the FISA Court appears to have imposed minimization procedures, which for individualized content must reflect a good deal of irrelevant content that would be suppressed.
While my wildarse guess is that this production pertains to URL searches, everything cloud providers like Apple store arguably falls under the Third Party doctrine and may be obtained using Section 215.
That’s not to say Apple’s dead canary pertains to this kind of refusal. But it ought to raise new questions about how the government has been using Section 215.
If the bill passes, this production will likely be increasingly obtained using USA Freedom Act’s emergency provisions, which permit the government to retain data even if the collection turns out not to be legal. And the bill’s “transparency” provisions hide how many Americans would be affected.
In his report on an interview with the new Director of NSA, Admiral Mike Rogers, David Sanger gets some operational details wrong, starting with his claim that the new phone dragnet would require an “individual warrant.”
The new phone dragnet neither requires “warrants” (the standard for an order is reasonable suspicion, not probable cause), nor does it require its orders to be tied to “individuals,” but instead requires “specific selection terms” that may target facilities or devices, which in the past have been very very broadly interpreted.
All that said, I am interested in Rogers’ claims Sanger repeats about NSA’s changing relationship with telecoms.
He also acknowledged that the quiet working relationships between the security agency and the nation’s telecommunications and high technology firms had been sharply changed by the Snowden disclosures — and might never return to what they once were in an era when the relationships were enveloped in secrecy.
Sadly, here’s where Sanger’s unfamiliarity with the details makes the story less useful. Publicly, at least, AT&T and Verizon have had significantly different responses to the exposure of the dragnet (though that may only be because Verizon’s name has twice been made public in conjunction with NSA’s dragnet, whereas AT&T’s has not been), and it’d be nice if this passage probed some of those details.
Telecommunications businesses like AT&T and Verizon, and social media companies, now insist that “you are going to have to compel us,” Admiral Rogers said, to turn over data so that they can demonstrate to foreign customers that they do not voluntarily cooperate. And some are far more reluctant to help when asked to provide information about foreigners who are communicating on their networks abroad. It is a gray area in the law in which American courts have no jurisdiction; instead, the agency relied on the cooperation of American-based companies.
Last week, Verizon lost a longstanding contract to run many of the telecommunications services for the German government. Germany declared that the “ties revealed between foreign intelligence agencies and firms” showed that it needed to rely on domestic providers.
After all, under Hemisphere, AT&T wasn’t requiring legal process even for domestic call records. I think it possible they’ve demanded the government move Hemisphere under the new phone dragnet, though if they have, we haven’t heard about it (it would only work if they defined domestic drug dealer suspects as associated with foreign powers who have some tie to terrorism). Otherwise, though, AT&T has not made a peep to suggest they’ll alter their decades-long overenthusiastic cooperation with the government.
Whereas Verizon has been making more audible complaints about its plight, long before the Germans started ending their contracts. And Sprint — unmentioned by Sanger — even demanded to see legal support for turning over phone data, including, apparently, turning over foreign phone data under ECPA’s exception in 18 U.S.C. § 2511(2)(f), which permits telecoms to voluntarily provide foreign intelligence data.
Given that background — and the fact ODNI released the opinions revealing Sprint’s effort, if not its name — I am curious whether the telecoms are really demanding process. If courts really had no jurisdiction, then it is unclear how the government could obligate production.
Though that may be what the Microsoft’s challenge to a government request for email held in Ireland is about, and that may explain why AT&T and Verizon, along with Cisco and Apple — for the most part, companies that have been more reticent about the government obtaining records in the US — joined that suit. (In related news, EU Vice President Viviane Reding says the US request for the data may be a violation of international law.)
Well, if the Microsoft challenge and telecom participation in the request for data overseas is actually an effort to convince the Europeans these corporations are demanding legal process, Admiral Rogers just blew their cover.
Admiral Rogers said the majority of corporations that had long given the agency its technological edge and global reach were still working with it, though they had no interest in advertising the fact.
Dear Ireland and the rest of Europe: Microsoft — which has long been rather cooperative with NSA, up to and including finding a way to obtain Skype data — may be fighting this data request just for show. Love, Microsoft’s BFF, Mike Rogers.
The second- and third-to-last lines of Magistrate Judge John Facciola’s opinion responding to a warrant application for information from Apple read,
To be clear: the government must stop blindly relying on the language provided by the Department of Justice’s Searching and Seizing Computers and Obtaining Electronic Evidence in Criminal Investigations manual. By doing so, it is only submitting unconstitutional warrant applications. [link added, h/t Mike Scarcella]
Over the course of the opinion — which denies a warrant for three entire months of emails, plus account information and correspondence with Apple, for a criminal investigation into defense contractor kickbacks — Facciola lays out what, over the last six months, he has found to be a problem with DOJ’s search and seizure guidelines.
Here’s how Facciola describes what is common to all these warrant applications.
In essence, the applications ask for the entire universe of information tied to a particular account, even though the government has established probable cause only for certain information.
He goes on to describe that the government uses essentially the same argument it uses in its NSA dragnets to claim that seizing all the phone records from a company doesn’t count as seizing them.
Any search of an electronic source has the potential to unearth tens or hundreds of thousands of individual documents, pictures, movies, or other constitutionally protected content. It is thus imperative that the government “describe the items to be seized with as much specificity as the government’s knowledge and circumstances allow.” United States v. Leary, 846 F.2d 592, 600 (10th Cir. 1988).
Here, the government has adequately described the “items to be seized”—but it has done so in the wrong part of the warrant and in a manner that will cause an unconstitutional seizure. By abusing the two-step procedure under Rule 41, the government is asking Apple to disclose the entirety of three months’ worth of e-mails and other e-mail account information. See Application at 14-15. Yet, on the very next page, it explains that it will only “seize” specific items related to its criminal investigation; it goes so far as to name specific individuals and companies that, if mentioned in an e-mail, would make that e-mail eligible to be seized. Id. at 15. Thus, the government has shown that it can “describe the items to be seized with as much specificity”; it has simply chosen not to by pretending that it is not actually “seizing” the information when Apple discloses it. See Facebook Opinion [#5] at 9-10 (“By distinguishing between the two categories, the government is admitting that it does not have probable cause for all of the data that Facebook would disclose; otherwise, it would be able to ‘seize’ everything that is given to it.”).
As this Court has previously noted, any material that is turned over to the government is unquestionably “seized” within the meaning of the Fourth Amendment. See Brower v. Cnty. of Inyo, 489 U.S. 593, 596 (1989) (noting that a “seizure” occurs when an object is intentionally detained or taken). The two-step procedure of Rule 41 cannot be used in situations like the current matter to bypass this constitutional reality because the data is seized by the government as soon as it is turned over by Apple.
What the government proposes is that this Court issue a general warrant that would allow a “general, exploratory rummaging in a person’s belongings”—in this case an individual’s e-mail account. Coolidge, 403 U.S. at 467. This Court declines to do so.
This opinion will likely result only in DOJ submitting a new application. It’ll clean up its ways or submit applications in other districts to avoid Facciola. This opinion, by a Magistrate, certainly won’t establish the principle that as soon as DOJ obtains data, it has seized it under the Fourth Amendment.
Still, given how central this claim that seizures don’t equal seizures is to the government’s dragnets, perhaps the obvious logic of Facciola’s stance will encourage other judges to stop twisting the normal meaning of “seize” to be solicitous of government demands.
If you haven’t already heard, Apple admitted on Friday to what has been discovered to be a serious security flaw.
Essentially, the flaw breaks a step in SSL/TLS verification, allowing an attacker to conduct a Man-in-the-Middle attack when you send or receive data via an Apple operating system. Apple’s announcement Friday pertained just to iOS. But security researchers quickly discovered that the bug affects recent releases of OSX as well. And even if you’re using Chrome or Firefox, the bug may affect underlying applications.
In the wake of the Snowden revelations, the discovery of the bug raises questions about how it got there. Adam Langley thinks it was a mistake. Steve Bellovin does too, though he notes that Perfect Forward Secrecy is precisely what a determined hacker, including a nation-state’s SIGINT agency, would need to compromise. Others are raising more questions.
But whether or not this is an intentional backdoor into the security protecting users of most of Apple’s most recent devices, I’m just as interested in Apple’s response … both to the public report, almost 6 months ago, that,
US and British intelligence agencies have successfully cracked much of the online encryption relied upon by hundreds of millions of people to protect the privacy of their personal data, online transactions and emails, according to top-secret documents revealed by former contractor Edward Snowden.
And to its discovery — reportedly perhaps as long as a few weeks ago — that it had this serious bug.
Now, if I were a leading device/consumer products company with an incentive to get consumers deeper into the cloud and living further and further online, particularly if I were one sitting on mountains and mountains of cash, then upon reading the report last September I would throw bodies at my code to make sure I really was providing the security my customers needed to sustain trust. And given that this is a key part of the security on which that trust relies, you would think the company with mountains of cash might have found this bug itself.
According to rumors, at least, this bug was not found by Apple with all its mountains and mountains of cash; it was found by a researcher.
Then there’s the radio silence Apple has maintained since issuing its alert about iOS on Friday. It told Reuters over the weekend that it would have a fix for the OSX bug “soon,” so it has effectively acknowledged that the bug is there. But it has not issued an official statement.
It just seems to me there is little that can explain issuing Friday’s security alert — alerting everyone, including potential hackers, that the problem is there, which quickly led to the independent identification of the OSX problem — without at the same time rolling out an OSX announcement and alert. Admitting to the iOS error effectively exposed OSX users to anyone responding to the announcement. Millions of Apple customers remain exposed until Apple rolls out a fix (though you might consider doing your banking on a browser other than Safari to give yourself a tiny bit of protection until that point).
The only thing I can think of that would explain Apple’s actions is if the security researcher who found this bug gave the company limited warning before he or she published it.
Otherwise, though, I’m as interested in the explanation for Apple’s two-step rollout of this bug fix as I am in how it got there in the first place.
Microsoft sees itself as going head-to-head with Apple and Google. The 10-year chart above comparing Microsoft, Apple, and Google stock tells us this has been a delusional perception.
It also sees itself in competition with IBM. Yet IBM surpassed it in market value two years ago, even after nearly a decade of ubiquity across personal computers in the U.S. and in much of the world. (IBM is included in that chart above, too.)
One might expect a sea change to improve performance, but is the shell game shuffling of Microsoft executives really designed to deliver results to the bottom line?
Tech and business sector folks are asking as well what is going on in Redmond; even the executive assignments seemed off-kilter. One keen analysis by former Microsoft employee Ben Thompson picked apart the company’s reorganization announcement last Thursday — coincidentally the same day the Guardian published a report that Microsoft had “collaborated closely” with the National Security Agency — noting that the restructuring doesn’t make sense.
The new organization pulls everything related to Windows 8 under a single leader, from desktop to mobile devices using the same operating system, migrating from a divisional structure to a functional one. Thompson notes several flaws in this strategy, but a key problem is accountability.
To tech industry analysts, the new functional structure makes it difficult to follow a trail of failure in design and implementation for any single product under this functional umbrella.
To business analysts, the lack of accountability means the outcomes of successful products can hide failed products under the functional umbrella, diluting overall traceability of financial performance.
But something altogether different might be happening beneath the umbrella of Windows 8.
There’s only one product now, regardless of device — one ring to rule them all. It’s reasonable to expect that every desktop, netbook, tablet, and cellphone running Windows 8 will now run substantially the same software.
Which means going forward there’s only one operating system they need to allow the NSA to access for a multitude of devices.
We’ve already learned from a Microsoft spokesman that the company informs the NSA about bugs or holes in its applications BEFORE it notifies the public.
It’s been reported for years that numerous backdoors and holes, built both intentionally and unintentionally into Microsoft’s operating systems from Windows 98 forward, have been used by the NSA and other law enforcement entities.
Now Skype has likewise been compromised: after Microsoft’s acquisition of the communications application and its infrastructure, Skype was included in the PRISM program so the NSA could gather content and eavesdrop.
Given these backdoors, holes, and bugs, Microsoft’s Patch Tuesday — in addition to its product registration methodology requiring online validation of equipment — certainly looks very different when one considers each opportunity Microsoft uses to reach out and touch business and private computers for security enhancements and product key validations.
Why shouldn’t anyone believe that the true purpose of Microsoft’s reorganization is to serve the NSA’s needs?
Tech magazine The Verge noted with the promotion of Terry Myerson to lead Windows — it’s said Myerson “crumples under the spotlight and is ungenerous with the press” — Microsoft doesn’t appear eager to answer questions about Windows.
As ComputerworldUK’s Glyn Moody asked with regard to collaboration with the NSA, “How can any company ever trust Microsoft again?”
If a company can’t trust them, why should the public?
The capper, existing outside Microsoft’s Windows 8 product: Xbox One’s Kinect feature is always on, in order to sense possible commands in the area where Kinect is installed.
ACLU’s senior policy analyst Chris Soghoian tweeted last Thursday, “… who in their right mind would trust an always-on Microsoft-controlled Xbox camera in their living room?”
One might wonder how often the question of trust will be raised before serious change is made with regard to Microsoft’s relationship with the NSA. With political strategist Mark Penn handling marketing for the corporation and Steve Ballmer still at the helm as CEO, don’t hold your breath.