Apple vs the FBI

somdfunguy

not impressed
A Federal court has ordered Apple to assist the FBI in accessing encrypted data hidden on a cellphone that belonged to the terrorist couple who killed 14 people in San Bernardino last year.

Apple is refusing, and on Wednesday, the company's CEO, Tim Cook, posted a message explaining the company's stance.

http://www.apple.com/customer-letter/

Copy of court order http://www.engadget.com/2016/02/16/judge-tells-apple-to-help-fbi-access-san-bernardino-shooters-ip/

February 16, 2016

A Message to Our Customers

The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.

This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.

The Need for Encryption
Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.

All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.

Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.

The San Bernardino Case
We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.

When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.

We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

The Threat to Data Security
Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

A Dangerous Precedent
Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.

The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.

The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.

We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.

While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

Tim Cook
 

TheLibertonian

New Member
I dislike apple products; they're overpriced and showy.

But this is an upscale thing to do on the part of Tim Cook. Good for them.
 
Good on Apple.

And the FBI is asking for more than for Apple to assist it in trying to recover the data from that iPhone, Apple's already been doing that.

If Apple (and other tech companies, Apple just happens to be the one most involved in this particular instance) loses this fight on appeal - and more generally if they lose the fight for the right to build products that people want to buy, products capable of protecting those people's data from all sorts of bad actors - then we will have crossed yet another major Rubicon when it comes to even pretending that we care about our constitutional protections. This is an important fight, but unfortunately it's one that can be spun into something that, to the casual considerer, looks nothing like what it is actually about (and in this case, it isn't the tech companies that need to do the spinning). So I worry that some will be too easily convinced that the government's position is right, and we might end up losing this fight. I don't think we will, but the fact that we plausibly could is worrying enough.
 

b23hqb

Well-Known Member
PREMO Member
I dislike apple products; they're overpriced and showy.

But this is an upscale thing to do on the part of Tim Cook. Good for them.

We agree here. I can't stand the arrogance of apple or any other device with the trademark of a piece of fruit with a chunk missing, but I agree with dude's assessment of the situation.
 

b23hqb

Well-Known Member
PREMO Member
Good on Apple.

And the FBI is asking for more than for Apple to assist it in trying to recover the data from that iPhone, Apple's already been doing that.

If Apple (and other tech companies, Apple just happens to be the one most involved in this particular instance) loses this fight on appeal - and more generally if they lose the fight for the right to build products that people want to buy, products capable of protecting those people's data from all sorts of bad actors - then we will have crossed yet another major Rubicon when it comes to even pretending that we care about our constitutional protections. This is an important fight, but unfortunately it's one that can be spun into something that, to the casual considerer, looks nothing like what it is actually about (and in this case, it isn't the tech companies that need to do the spinning). So I worry that some will be too easily convinced that the government's position is right, and we might end up losing this fight. I don't think we will, but the fact that we plausibly could is worrying enough.

I think that fight has already been lost by the Obamacare decision forcing people to buy a product they do not want (health insurance). The inverse - the government dictating what products companies may or may not make, regardless of consumer appeal or private use - would be just as stupid.
 

GURPS

INGSOC
PREMO Member
So I worry that some will be too easily convinced that the government's position is right, and we might end up losing this fight.


your next iDevice will just directly report to the NSA anything you do ... :whistle:




Why Apple is battling investigators over San Bernardino terrorists' iPhone


Does this battle go beyond cellphones?

Yes, terrorism spreading through social media has also been a major issue.

Sen. Dianne Feinstein (D-Calif.) has led a push in Congress for legislation that would require social media companies to root out and report suspicious activity. Tech firms and privacy advocates beat back an effort by Feinstein earlier last year.




why is it a business's JOB to monitor and root out activity :shrug:
 

LibertyBeacon

Unto dust we shall return
Good on Apple.

And the FBI is asking for more than for Apple to assist it in trying to recover the data from that iPhone, Apple's already been doing that.

If Apple (and other tech companies, Apple just happens to be the one most involved in this particular instance) loses this fight on appeal - and more generally if they lose the fight for the right to build products that people want to buy, products capable of protecting those people's data from all sorts of bad actors - then we will have crossed yet another major Rubicon when it comes to even pretending that we care about our constitutional protections. This is an important fight, but unfortunately it's one that can be spun into something that, to the casual considerer, looks nothing like what it is actually about (and in this case, it isn't the tech companies that need to do the spinning). So I worry that some will be too easily convinced that the government's position is right, and we might end up losing this fight. I don't think we will, but the fact that we plausibly could is worrying enough.

But just a minute...

The FBI is asking for signed firmware from Apple which will disable certain features, like the auto-erase that "resets" the phone (by deleting the encryption key) after 10 failed passcode attempts, or whatever the standard is. I think 10 is right, if a PIN is used.

If the encryption is good, disabling the brute-force protections should make no difference to the "attacker" - in this case the government, under an ostensibly legitimate interest. But the fact that a signed firmware will disable them tells me the FBI knows the phone is brute-force-able.

By extension, that's confirmation that the encryption was never solid in the first place. If it can be defeated by a properly signed firmware, then the game is over. Rogue states are a very real threat at that point.
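The retry limits being argued about matter mostly because of simple arithmetic. A back-of-envelope sketch in Python (the ~80 ms per guess is the per-attempt key-derivation cost Apple has publicly cited for its hardware-entangled derivation; treat it here as an assumption, not a spec):

```python
# Rough estimate: why removing the retry limits and delays matters.
# Assumes ~80 ms of unavoidable key-derivation work per passcode try;
# with the software-imposed delays and the 10-attempt erase removed,
# this is the only thing slowing an attacker down.

def brute_force_time(digits, seconds_per_try=0.08):
    """Worst-case seconds to try every numeric passcode of a given length."""
    attempts = 10 ** digits
    return attempts * seconds_per_try

for d in (4, 6):
    secs = brute_force_time(d)
    print(f"{d}-digit PIN: {10**d:,} tries, worst case ~{secs / 3600:.1f} hours")
```

At that rate a 4-digit PIN falls in well under an hour and a 6-digit PIN in about a day - which is why the erase-after-10 and the escalating delays, not the passcode itself, are doing the real work.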
 

This_person

Well-Known Member
George Will, a journalist for whom I have great respect normally, suggested that there is a simple answer: give the phone to Apple and have Apple provide the data from the phone the government seeks.

The government apparently does not want the information from the phone, they want the technology to disable the protections on the phone. Now, why would they want that? :sarcasm:

In my humble opinion, justice through the court system for the alleged crimes of the accused terrorist who was the authorized user of the county-owned cell phone will never happen because, uh, he's dead. There is no specific crime that will be adjudicated as a result of any data on the phone because the phone's user is dead. This is a request for the technology to conduct a fishing expedition - and to make future use of that technology possible.

Now, we all know that the government is great at cyber security, so there's no way the software being requested by the government will EVER reach improper hands, right? Meanwhile, we also know that the government is great at upholding our individual securities and liberties, so there's no way that THEY will ever improperly use the technology - why, it's ONLY for this particular use and no others, right?

If you believe the previous paragraph, I have a great bridge in Brooklyn to sell you!
 

glhs837

Power with Control
Yep, have Apple give you the data. If you are concerned about chain of custody, simple enough: have an agent accompany the phone through its process. But it's what you all are saying - they don't want this particular data, but they know this case makes their opponent seem most unreasonable. "OMG, don't you want us to stop another attack like that one? What the hell's wrong with you? See, people, there's just no reasoning with this company; that's why we need the law to force them."

Same reason MD only places speed cameras in school zones, makes the sell easier.
 

GURPS

INGSOC
PREMO Member
Apple Unlocked iPhones for the Feds 70 Times Before


But in a legal brief, Apple acknowledged that the phone in the meth case was running version 7 of the iPhone operating system, which means the company can access it. “For these devices, Apple has the technical ability to extract certain categories of unencrypted data from a passcode locked iOS device,” the company said in a court brief.

Whether the extraction would be successful depended on whether the phone was “in good working order,” Apple said, noting that the company hadn’t inspected the phone yet. But as a general matter, yes, Apple could crack the iPhone for the government. And, two technical experts told The Daily Beast, the company could do so with the phone used by deceased San Bernardino shooter, Syed Rizwan Farook, a model 5C. It was running version 9 of the operating system.

Still, Apple argued in the New York case, it shouldn’t have to, because “forcing Apple to extract data…absent clear legal authority to do so, could threaten the trust between Apple and its customers and substantially tarnish the Apple brand,” the company said, putting forth an argument that didn’t explain why it was willing to comply with court orders in other cases.

“This reputational harm could have a longer term economic impact beyond the mere cost of performing the single extraction at issue,” Apple said.
 

Merlin99

Visualize whirled peas
PREMO Member
A Federal court has ordered Apple to assist the FBI in accessing encrypted data hidden on a cellphone that belonged to the terrorist couple who killed 14 people in San Bernardino last year.

Apple is refusing, and on Wednesday, the company's CEO, Tim Cook, posted a message explaining the company's stance.

http://www.apple.com/customer-letter/

Copy of court order http://www.engadget.com/2016/02/16/judge-tells-apple-to-help-fbi-access-san-bernardino-shooters-ip/

Why should Apple have to help the Feds cover up for their own incompetence? If they hadn't screwed the pooch keeping a terrorist out of the country, they wouldn't require assistance.
 
But just a minute...

The FBI is asking for signed firmware from Apple which will disable certain features, like the auto-erase that "resets" the phone (by deleting the encryption key) after 10 failed passcode attempts, or whatever the standard is. I think 10 is right, if a PIN is used.

If the encryption is good, disabling the brute-force protections should make no difference to the "attacker" - in this case the government, under an ostensibly legitimate interest. But the fact that a signed firmware will disable them tells me the FBI knows the phone is brute-force-able.

By extension, that's confirmation that the encryption was never solid in the first place. If it can be defeated by a properly signed firmware, then the game is over. Rogue states are a very real threat at that point.

What's doable depends, in part, on how old a model we're talking about and what version of iOS it's running. In this case we're talking about a 2012 model (when it comes to the relevant design aspects), but one running the latest version of iOS. Newer models function differently, e.g., when it comes to accepting software updates - the security built into them is even better.

As for the encryption issue, I don't want to get lost in the technical details of how it works, but ... yes, the user passcode can be quite brute-forceable, depending on what kind of passcode the user has chosen (assuming you can get around the auto-erase function (it isn't really an auto-erase, but for practical purposes it might as well be) and the delays between allowed attempts). But that isn't the extent of the encryption (of the data on the user partition of the memory). The passcode is just used, in conjunction with an unreadable (speaking practically), hard-coded (burned into the silicon) unique device identifier and other things, to create the real encryption key - one that likely couldn't be brute-forced in any reasonable time frame. That's why the user passcode is itself protected from brute-forcing by the attempt delays and the optional auto-erase function.
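The entanglement idea described above can be sketched in a few lines. This is illustrative only - generic PBKDF2, not Apple's actual algorithm - and `DEVICE_UID` is a stand-in for the fused, unreadable hardware identifier:

```python
import hashlib
import os

# Illustrative sketch, NOT Apple's real scheme: a short user passcode is
# tangled with a device-unique secret via a deliberately slow key
# derivation, so the real key can't be attacked off-device even though
# the passcode itself is weak.

DEVICE_UID = os.urandom(32)  # stand-in for the burned-in hardware UID

def derive_key(passcode: str, iterations: int = 100_000) -> bytes:
    # The hardware UID serves as the salt/secret: without physical access
    # to the device (and its UID) you cannot run this derivation at all,
    # so guessing must happen on the phone, at the phone's pace.
    return hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), DEVICE_UID, iterations
    )

key = derive_key("1234")
assert derive_key("1234") == key   # same passcode -> same key
assert derive_key("1235") != key   # any other guess yields garbage
```

The design point is that the iteration count and the on-silicon secret, not the passcode's entropy, set the attacker's cost - which is exactly why the attempt delays and auto-erase become the last line of defense once someone can guess on-device.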
 
Why should Apple have to help the Feds cover up for their own incompetence? If they hadn't screwed the pooch keeping a terrorist out of the country, they wouldn't require assistance.

And it isn't as though Apple has refused to help the FBI. It has done what it reasonably can. It's given, pursuant to a warrant, the FBI data from the iCloud account. (The iCloud backup function apparently stopped being used on that phone about 6 weeks before the shootings.) And, as the government acknowledged in one of its filings, Apple has routinely accessed unencrypted data for law enforcement - pursuant to a warrant and court order - from devices running older versions of iOS, devices for which it was able to.

But what the government is asking for (really, demanding) here goes beyond that kind of help; it's a bridge too far, I think. I don't want to get lost in a technical explanation (and I only understand how the security works down to a certain level myself, I'm far from an expert on this subject; though we can walk through how it works on a somewhat more detailed level if you want), but it would indeed - in my view - amount to a backdoor through the built-in security measures that many people want and rely on (even if they don't themselves understand how those measures work). The government is trying to force Apple to build such a backdoor and, more troubling, asserting the right (based on a vague, overly broad, and, btw, rather old law) to force Apple (and thus, going forward, likely any other tech company) to do just that. If that law can be used in this way, and if doing so is constitutional, then I shudder to think of what else it can be used to force people or companies to do.

Now, the FBI is playing coy - oh, what we're asking (demanding) Apple to do is only for this particular phone, they can build this software in a way that it would only work on this phone. But based on my understanding of how such things work, I don't think that's correct; and at any rate, Apple says that it is not. If Apple were to build what they want Apple to build, it would be something that - in the hands of bad actors - could be used to access at least any of that model of iPhone (which, btw, has an installed base of about 10 million in the U.S. alone as of the last estimates I saw - that model, not all iPhones). It's like someone claiming, oh I'm only asking you to design a gun that's capable of killing this particular person - John Doe 856. You don't have to make it so that it's dangerous to others, you know, such that it would be capable of killing anyone else. Well, sorry, that's not how it works. Speaking practically, if I make you a gun to kill John Doe 856, it's necessarily going to be capable of killing a whole lot of other people.

Perhaps more importantly, though, even if we assume that what they want Apple to do in this particular instance poses no threat to the security of other people's phones, the problem is that allowing this law to be used in this way - to force a company to build something like this - opens the door even further to all kinds of bad uses down the road. If the government can set the precedent in this case that it can make companies do its bidding in this kind of way (again, we aren't talking about just making a company hand over access to records it has, or turn over something else that it has, or advise the government how to get around certain security features; we're talking about forcing a company to build something new for the purpose of being able to break into people's private information, information they'd rather the government and (other?) bad actors not be able to access), it's just bad. Dangerous. Contrary to what this nation has supposedly been about. Scary. The-government-can-force-you-to-buy-health-insurance-if-it-wants-to kind of bad. Or worse. Probably worse.

If the government wants such tools, it should build them itself. And if it can't do so, then so be it. That should tell it something. In any event, it shouldn't be allowed to force someone else to build such tools for it.
 
DoJ files motion to compel Apple to comply with FBI order

We should keep in mind that the order Apple is under is an ex parte order. Apple hasn't lost in court, so to speak, yet - let alone lost on appeal (though it very well may, possibly even fairly quickly). The government went to the court and told its side of the story, and asked that court to order Apple - which wasn't a party to that action - to do something. The court (i.e. the magistrate judge) said, okay government, we'll do that. Apple's arguments weren't heard before that happened. (Just to be clear, that's not particularly unusual, that's how such things sometimes work.) But now Apple gets to make its case to the court why it shouldn't be ordered to do this. And then of course there are appeals processes should the original court decide that it still thinks the government is entitled to what it wants.

The government has decided to make this a public fight, it apparently thinks it can win in the court of public opinion (with, basically, a version of the 'it's for the children' message). I hope it doesn't. I hope the deleterious lingering effects on our collective psyche (and our subsequent willingness to prostrate ourselves in the face of any possible threats) from 9/11 are not yet that strong. But I fear that they are, and that it will.
 

TheLibertonian

New Member
DoJ files motion to compel Apple to comply with FBI order

We should keep in mind that the order Apple is under is an ex parte order. Apple hasn't lost in court, so to speak, yet - let alone lost on appeal (though it very well may, possibly even fairly quickly). The government went to the court and told its side of the story, and asked that court to order Apple - which wasn't a party to that action - to do something. The court (i.e. the magistrate judge) said, okay government, we'll do that. Apple's arguments weren't heard before that happened. (Just to be clear, that's not particularly unusual, that's how such things sometimes work.) But now Apple gets to make its case to the court why it shouldn't be ordered to do this. And then of course there are appeals processes should the original court decide that it still thinks the government is entitled to what it wants.

The government has decided to make this a public fight, it apparently thinks it can win in the court of public opinion (with, basically, a version of the 'it's for the children' message). I hope it doesn't. I hope the deleterious lingering effects on our collective psyche (and our subsequent willingness to prostrate ourselves in the face of any possible threats) from 9/11 are not yet that strong. But I fear that they are, and that it will.

They won the Patriot Act fight, didn't they? All they have to do is start talking about the San Bernardino shooters, because Muslims are scary, and convince people that if they don't give up their privacy rights, the next attack in ... I dunno, average time between major attacks ... 2029 or so will be their fault!
 

Gilligan

#*! boat!
PREMO Member
The government has decided to make this a public fight,

Barry brought up this issue (encrypted private communication and data) years ago and has pursued the goal of eliminating "government-proof" encryption from devices and the internet for quite some time. This is simply the time and case he (his DOJ) has chosen to move forward with and get it done.
 
Holy cow. The Justice Department seems to be ramping up its efforts to bully Apple; it's spreading more BS and trying to spin the situation to make it look like Apple is the bad guy here. As I've suggested, I fear the government (and like-minded advocates) will be successful to some extent on that front, as it's the kind of situation that isn't going to be easy (i.e., without putting in some effort) for most people to understand. It will be easy to mislead casual considerers about what's really at issue, about what's really going on.

At this point though, the government may be forcing Apple's hand too much and leaving Apple with little choice but to fight back by revealing things that put the government in a more suspect light. It seems that Apple had been taking the high road to some extent. If what is now reportedly being said by Apple representatives is true - that the password for the Apple ID associated with that phone was changed after the shooters were dead (and perhaps after the phone was in the custody of law enforcement) - then, that's a potentially explosive development.

I've been giving the government the benefit of the doubt on this one. What it's demanding is wrong, and it's indicative of an out of control government, but I haven't been buying into the conspiracy theories that the government was intentionally using this situation - perhaps even contriving it - so that it could force Apple to make what it wants with the actual intent of then, somehow, being able to use it going forward in other situations. I'm still not sure I buy that.

I don't think the government can be trusted here, well, just because - history and such. So I think the situation is dangerous regardless of whether there's specific nefarious intent in this particular case. It's worse if the latter is present though. And the optics of this situation - again, to the casual considerer - do set up pretty well for the government, so it would present a good opportunity for it to try to take advantage of public perception and use the situation to establish something that it could use going forward. So I'd have to say, if this latest revelation is indeed legit, color me willing to consider the more conspiratorial interpretations of what's going on.

This situation may be about to get pretty darn interesting.
 
Barry brought up this issue (encrypted private communication and data) years ago and has pursued the goal of eliminating "government-proof" encryption from devices and the internet for quite some time. This is simply the time and case he (his DOJ) has chosen to move forward with and get it done.

I think there's some merit in that line of thinking.

But calling him Barry doesn't make him sound nefarious enough.

At any rate, the long-term implications are potentially staggering. It isn't just the potential for getting around encryption (and more generally, privacy safeguards) on smartphones, which would be bad enough. It's all of the encryption that we use (usually without being conscious of it) in so many facets of our lives. Some of that would, with the right legal obstacles already being pushed aside, be even easier to compromise and take advantage of than is the case with smartphones (for governments and private bad actors alike).

The surest way to lose an important fight is to wait too long to start fighting it or, relatedly, to not even realize that you're in an important fight until it's too late.
 
Apple has reportedly indicated that it was officials with the San Bernardino County government who changed the password for the Apple ID account, not law enforcement. So that suggests incompetence (perhaps on the part of someone who didn't really understand how such things worked, and thus should have figured them out before doing something like that) more so than someone intentionally trying to manipulate the situation so that they might be able to justify trying to force Apple to build a backdoor for them.
 