Apple vs The FBI

A bunch of FBI suits standing in a circle staring blankly at an iPhone doesn't exactly inspire confidence.  Love Cook's response (and I'm paraphrasing, of course): you're the US Govt for chrissakes... you can shoot a picture of Pluto and help land a probe on a comet traveling a zillion miles an hour through space-- YOU figure it out.


BrickPig said:
terp said:
I thought I saw a poll that indicated the public was largely in favor of Apple complying with the request.  I'll see if I can dig it up.

Possibly, but I just ran several Google searches trying to find such a thing and couldn't.

If so, I am pretty sure the "public" doesn't really understand the issue.


I haven't been following this. Am I correct that the FBI wants to be able to obtain access to the data on an iPhone that has been physically seized?


Yes.  Except that backdoor access doesn't presently exist, and needs to be developed by Apple.


BG9 said:
drummerboy said:


BG9 said:

What I heard is that the FBI really doesn't give a damn about what's on this phone. Metadata collection from the phone providers already tells them who called whom.

FBI management has been whining about getting this type of access forever. They're using this case to set a precedent, to crack the strong encryption now available to consumers.

Reminds me of Philip R. Zimmermann, the guy who wrote and published PGP, the public-key encryption program. Then, too, the world was supposedly ending because of the distribution of encryption for the masses that government can't break. Our government tried to browbeat Zimmermann by "investigating" him criminally for years.

https://www.philzimmermann.com/EN/news/PRZ_case_dropped.html


https://www.philzimmermann.com/EN/background/index.html


btw- the world did not end

this is the first explanation about this story that makes any sense.

Tim Cook's explanation about opening up back doors and threatening their entire customer base makes absolutely zero sense.

On the other hand, it's equally bizarre that the FBI is trying to force Apple to write software for them. Is there any precedent at all for the FBI forcing someone to work for them?

Very weird story.

I'd like to get more technical details on what exactly is being asked for though. It seems like the FBI simply wants to turn off the "brick the phone after 10 failed pin attempts" feature. I find it unbelievable that they are not able to reverse engineer the code for something as simple as this.

Bizarro time.

Not everything is possible to reverse engineer.

For example, public key encryption. The encryption and decryption source code is widely available in books and on the internet, in Java, C, and C++.

Yet it's not possible to break the encryption in any reasonable amount of time by "reverse" engineering, even when you know the exact code used to do the decryption.
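To make that concrete, here's a toy RSA sketch in Python. To be clear, this is purely illustrative: the numbers are absurdly small so the "attack" actually finishes, and the iPhone doesn't encrypt its data this way. The point is that the entire algorithm is public, and an attacker who knows every line of it still has to factor the modulus to decrypt.

```python
# Toy illustration: the RSA algorithm is fully public, yet decrypting
# without the private key means factoring n. These tiny numbers are for
# demonstration only; real keys use 2048-bit moduli, which no known
# method can factor in practice. Requires Python 3.8+ for pow(e, -1, m).

def toy_rsa():
    p, q = 61, 53                 # secret primes (tiny, for demo)
    n = p * q                     # public modulus: 3233
    e = 17                        # public exponent
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)           # private exponent (requires p and q)

    msg = 42
    cipher = pow(msg, e, n)       # anyone can encrypt with (e, n)
    plain = pow(cipher, d, n)     # only the key holder can decrypt
    return cipher, plain

def attack(cipher, e, n):
    # An attacker who knows the full algorithm must still factor n.
    # Trial division works for n = 3233 but is hopeless for the
    # hundreds-of-digits moduli used by real RSA keys.
    for p in range(2, n):
        if n % p == 0:
            q = n // p
            d = pow(e, -1, (p - 1) * (q - 1))
            return pow(cipher, d, n)

cipher, plain = toy_rsa()
print(plain)                      # 42: decrypting with the key is easy
print(attack(cipher, 17, 3233))  # 42: without it, you must factor n
```

With a 2048-bit modulus the `attack` loop would run longer than the age of the universe, which is the whole point: security lives in the key, not in the secrecy of the code.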

yes, that's true of course. but program code is not encryption - though you raise an interesting question about encrypting program code. never thought about that. wonder if that's the case.

regardless, I would imagine that the FBI has access to some very high-falutin hacking tools that should make short order of problems like this.


ctrzaska said:

Yes.  Except that backdoor access doesn't presently exist, and needs to be developed by Apple.

i believe they're asking for an altered version of the os - which is not exactly a "back door".

have to go see if there's more technical coverage of this...


Only used the term since Cook equated the two in his letter...

"But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation."

...but I'll freely admit the nuance is a bit above my head.


I'm going to wind up on that damn MOL double-entendre thread once again, aren't I?

ETA: And yes, I did see your post.  LOL.


well, Cook's use of the word backdoor is an example of his own hyperbole in the matter.

though there's no question that Apple should fight this. Forcing Apple to work for the FBI would be a horrible, horrible precedent. I can't believe the judge even suggested it, actually.


I think I largely agree with someone I just read about this - which is that this is just a tactic (largely ineffective, if you ask me) to imply to terrorist organizations that the iPhone is beyond the reaches of American law enforcement, when in fact the FBI probably already has whatever it needs from this phone. And the whole issue will probably fade away.


It's kind of the only explanation that makes sense.


drummerboy said:

I think I largely agree with someone I just read about this - which is that this is just a tactic (largely ineffective, if you ask me) to imply to terrorist organizations that the iPhone is beyond the reaches of American law enforcement, when in fact the FBI probably already has whatever it needs from this phone. And the whole issue will probably fade away.


It's kind of the only explanation that makes sense.

Wait...what? You think this is a ploy by Apple to boost sales in the terrorist cell phone market niche? That's what you think is the only sensible explanation?


I think he means a ploy by the FBI to say "Oh dearie me we can't get any info from this iPhone! Hopefully no terrorists read this and start putting critical information on their iPhone!"

It wouldn't be the strangest thing a government agency has ever done.

I read the article by McAfee and it smacks a bit of braggadocio, but what do I know? Maybe I would just prefer to think he's full of it than accept the premise that we're that far behind in the cybersecurity arena.

Why doesn't the FBI just give the phone to TMZ? They'll have it cracked in minutes.


mrincredible said:

I think he means a ploy by the FBI to say "Oh dearie me we can't get any info from this iPhone! Hopefully no terrorists read this and start putting critical information on their iPhone!"

Ah! OK, that I could believe.


The maturation of the internet into a relatively secure platform has been what's made it such an important part of our lives - in commerce, government, etc. One of the ongoing challenges is to keep it secure and reliable - addressing its current shortcomings, defending against those who would undermine it, etc.

This request by the FBI is very short-sighted. They, as much as the rest of us, have come to depend on a reasonable expectation of security online. This isn't as bad as, say, reports that the NSA has tried to purposefully introduce weaknesses into encryption algorithms, but the software that runs iPhones is widely distributed enough that I think you can make the case it approaches the level of basic online architecture.

Purposefully writing malware to undermine this software introduces a huge security hole for millions of people worldwide (including, I'd assume, a fair amount of law enforcement systems). Sure, this would just be a "one-off," but software is just text. Once written, it's pretty hard to guarantee it won't spread.

The internet has been a very successful example of public and private sectors, and commercial and non-commercial concerns, creating a public good. The FBI here is acting in a way that would undermine that, and Apple is right to resist it.


PVW said:

The maturation of the internet into a relatively secure platform has been what's made it such an important part of our lives - in commerce, government, etc. One of the ongoing challenges is to keep it secure and reliable - addressing its current shortcomings, defending against those who would undermine it, etc.

This request by the FBI is very short-sighted. They, as much as the rest of us, have come to depend on a reasonable expectation of security online. This isn't as bad as, say, reports that the NSA has tried to purposefully introduce weaknesses into encryption algorithms, but the software that runs iPhones is widely distributed enough that I think you can make the case it approaches the level of basic online architecture.

Purposefully writing malware to undermine this software introduces a huge security hole for millions of people worldwide (including, I'd assume, a fair amount of law enforcement systems). Sure, this would just be a "one-off," but software is just text. Once written, it's pretty hard to guarantee it won't spread.

The internet has been a very successful example of public and private sectors, and commercial and non-commercial concerns, creating a public good. The FBI here is acting in a way that would undermine that, and Apple is right to resist it.

I agree with everything in this post except what I've bolded. I think we all know it will not be a "one-off." Already, William Bratton is quoted in this Times article complaining that the police cannot access the information from a phone belonging to an associate of a suspect, even though the suspect has already confessed to his crime.

http://www.nytimes.com/2016/02/19/technology/a-yearlong-road-to-a-standoff-with-the-fbi.html

In New York, William Bratton, the police commissioner, held up a phone that he said was used by an associate of a man who shot and wounded two police officers in the Bronx recently.
“Despite having a court order, we cannot access this iPhone,” Mr. Bratton said. “Just one example, a very significant example in which two of my officers were shot, that impeding that case going forward is our inability to get into this device.”
The case in Brooklyn continues, even though Mr. Feng has already pleaded guilty. While the Justice Department sees the San Bernardino incident as its ideal test case, Apple is hoping for a legal win in Brooklyn.

Generally speaking I don't easily buy into 'slippery slope' arguments, but it's nearly impossible for me to believe this would be a once-and-done deal, or even that the technology would be used only under the most extreme circumstances. (Not even to mention the possibility of it falling into the wrong hands.)


You find yourself buying into 1 slippery slope argument and then the next thing you know....


This case is a textbook slippery slope, as evidenced by the fact that the judge is relying on the All Writs Act (originally passed in 1789) to compel Apple to comply.  That law, like so many, is finding use beyond its original intent, and that benefits neither society nor law enforcement.


I love this thread. It's so reasoned and interesting. What's going on?


BrickPig said:


Generally speaking I don't easily buy into 'slippery slope' arguments, but it's nearly impossible for me to believe this would be a once-and-done deal, or even that the technology would be used only under the most extreme circumstances. (Not even to mention the possibility of it falling into the wrong hands.)

I agree, and I was trying to frame my post in terms of self-interest (we all have a vested interest in secure software) rather than as a slippery slope. IOW, even assuming some version of reality where there were no further law enforcement requests of this sort, it would still be a bad idea because of the way it undermines security for everyone.


BrickPig said:
mrincredible said:

I think he means a ploy by the FBI to say "Oh dearie me we can't get any info from this iPhone! Hopefully no terrorists read this and start putting critical information on their iPhone!"

Ah! OK, that I could believe.

yes, the Mr I clarification is correct. And funnier.



terp said:

You find yourself buying into 1 slippery slope argument and then the next thing you know....

As BrickPig stated, this is essentially what the FBI is asking from Apple:

1) disable auto-erase, so that the phone doesn't wipe itself after 10 failed attempts at the passcode,

2) make it possible to submit passcodes electronically, and

3) eliminate the 80 millisecond (?) delay between passcode attempts.

Once that is done, the FBI can utilize a brute force attack to try every password combination and unlock the phone.
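The brute-force math is worth spelling out. Here's a back-of-the-envelope sketch; the roughly 80 ms per guess is the figure from the list above, assumed here to be the only limit left once the software delays are removed:

```python
# Worst-case time for the brute-force attack described above, assuming
# ~80 ms per passcode check remains after auto-erase and the
# inter-attempt delays are disabled.

def time_to_try_all(digits, seconds_per_guess=0.08):
    """Worst-case seconds to try every numeric passcode of a given length."""
    return 10 ** digits * seconds_per_guess

for digits in (4, 6):
    hours = time_to_try_all(digits) / 3600
    print(f"{digits}-digit passcode: {hours:.1f} hours worst case")
# 4-digit passcode: 0.2 hours worst case
# 6-digit passcode: 22.2 hours worst case
```

Which is why the three requests together amount to unlocking the phone: with them, even a 6-digit passcode falls in under a day.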

If I put on my tin foil cap, this is a ploy by the FBI to (1) get Apple to produce such a tool if it doesn't already exist, and (2) at some point, go and acquire that technology so they can use it at will.

Otherwise, considering that Apple has complied and unlocked iPhones in the past, they could simply do the same with this phone.


tjohn said:

Thought-provoking comments on this issue.  Seems like this is case that will have to be appealed to higher courts.

Yup, and since SCOTUS might split 4-4, the lawyers for Apple and the lawyers for the Government will be maneuvering to bring it in the Circuit most favorable to their position.

But at least Bork, who said there was no Constitutional Right to Privacy, was not confirmed.


No, I don't think they could "simply do the same with this phone". If that was the case we wouldn't have the situation of a judge trying to force Apple employees to become slave labor for the FBI. (I mean, that's what it is, right?) They are not simply asking to unlock the phone as might have been done in the past.


drummerboy said:

No, I don't think they could "simply do the same with this phone". If that was the case we wouldn't have the situation of a judge trying to force Apple employees to become slave labor for the FBI. (I mean, that's what it is, right?) They are not simply asking to unlock the phone as might have been done in the past.

What was done in the past is no longer possible. Beginning with iOS 8, Apple has taken progressively bigger steps to fortify the iPhone's encryption. One of the main reasons being that, in the words of Apple's lawyer, "[Apple was] being forced to become an agent of law enforcement.” 

http://www.nytimes.com/2016/02/19/technology/a-yearlong-road-to-a-standoff-with-the-fbi.html?ref=technology


So imagine this scenario:

Apple builds a single Mac Pro which has no network hardware built into it. That computer is used to encode the special software needed to decrypt a suspect's iPhone. It's kept in a vault which can only be opened by the concurrent fingerprint scans of the Director of the FBI,  a Supreme Court Justice and the Senate Majority and Minority leaders. Inside the vault the CPU is kept inside another vault with only a keyboard and monitor and sync cable for the device in question accessible.

It sounds ridiculous but I'm curious if there is any method which could be employed to safeguard against the misuse of this software that would make people comfortable with it. I'm not sure there is. Who builds the vault? Who writes the code? What about the convenient human-sized ventilation shaft into the vault that lets out in an abandoned train tunnel that's only on an old engineering map in the archives at City Hall?

If nothing else, this is an awesome premise for a certain kind of film.

https://youtu.be/k55NuWQCh78


mrincredible said:

So imagine this scenario:

Apple builds a single Mac Pro which has no network hardware built into it. That computer is used to encode the special software needed to decrypt a suspect's iPhone. It's kept in a vault which can only be opened by the concurrent fingerprint scans of the Director of the FBI,  a Supreme Court Justice and the Senate Majority and Minority leaders. Inside the vault the CPU is kept inside another vault with only a keyboard and monitor and sync cable for the device in question accessible.

It sounds ridiculous but I'm curious if there is any method which could be employed to safeguard against the misuse of this software that would make people comfortable with it. I'm not sure there is. Who builds the vault? Who writes the code? What about the convenient human-sized ventilation shaft into the vault that lets out in an abandoned train tunnel that's only on an old engineering map in the archives at City Hall?

If nothing else, this is an awesome premise for a certain kind of film.



Then the FBI will just compel the guy who built the vault to bypass the security measures. Because terrorists.


RobB said:
Then the FBI will just compel the guy who built the vault to bypass the security measures. Because terrorists.

Or eventually there's a line of law enforcement officers out the door and around the block, all waiting their turn to have their suspects' iPhones opened.


Not an expert in either area, but I wonder if there could be a Takings clause or 13th Amendment argument raised here.  Seems pretty extreme that the government could compel a company to re-write its code that would result in the destruction of their business.  Who would buy an iPhone if it were so insecure?

