Thread: Security v. Freedom Board: Oblivion / Ship of Fools.


To visit this thread, use this URL:
http://forum.ship-of-fools.com/cgi-bin/ultimatebb.cgi?ubb=get_topic;f=70;t=029652

Posted by lilBuddha (# 14333) on :
 
The FBI have won an order in court to direct Apple to help decrypt the iPhone of a jihadist shooter.
Apple have refused, stating that a backdoor into the OS would likely be abused.
In societies that desire to be free, there will always be a battle between what rights we cede for protection by our governments and those we keep for protection from them.
So where is your line, re encryption of electronic devices and communications?
 
Posted by Lamb Chopped (# 5528) on :
 
I don't understand. Either there is an existing backdoor into the OS, or there isn't. If there is, I don't see how obliging the FBI would lead to further abuses (it's not like they're going to publish the backdoor, after all, and its existence would be common knowledge regardless of what choice they make). If there is no backdoor, then they could simply have told the FBI so.

Is it possible that the FBI is asking not for a backdoor but for complete access to the OS code, in the hopes of breaking in?
 
Posted by Doublethink. (# 1984) on :
 
The way it was reported on BBC radio was that Apple was saying it could not currently access the data - and that to write software to do so would create a security problem.

So effectively claiming there currently isn't a back door.
 
Posted by Curiosity killed ... (# 11770) on :
 
According to the BBC reports, the FBI want to get into the phone. To do this they have to find the PIN code by trial and error. Apple have to be involved as their coding signature is needed to allow repeated trials, rather than being locked out after a number of erroneous entries.
 
Posted by lilBuddha (# 14333) on :
 
quote:
Originally posted by Doublethink.:
The way it was reported on BBC radio was that Apple was saying it could not currently access the data - and that to write software to do so would create a security problem.

So effectively claiming there currently isn't a back door.

Currently is the key word. If Apple successfully break into the phone, there effectively would be one, even if we argue over the technical correctness of the term backdoor.

Lamb Chopped, how much do you trust your government? Not only the current one, but future ones. Google the FISA court battles.
 
Posted by Schroedinger's cat (# 64) on :
 
So Apple cannot hack their own phones. Which is good, because if they can't then others probably can't either.

The FBI want them to, because they don't understand encryption? Makes sense.

I suppose I want total security and total privacy, because I am not a security risk. In the end, I am willing to accept that my privacy is not absolute, especially when I am online. I am not happy about that, but I am prepared to accept some compromise there - and I know my bank seem to be very good at contacting me when there is a change in account behaviour.

The other side of this is that those who can access my data are not judgemental. If I am a security threat, come and talk to me. If not, so what if I like Busty Betty's Big Bang Emporium.
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by lilBuddha:
and that to write software to do so would create a security problem.

This is nonsense. If software can be written to circumvent the security, there is already a security problem. Whether or not Apple has chosen to write the software yet isn't terribly relevant.

In this case, it seems as though there's a software function that wipes the phone after some number of failed attempts, and the FBI want a version of the software without that function, so they can brute-force the passcode.
 
Posted by Schroedinger's cat (# 64) on :
 
quote:
Originally posted by Leorning Cniht:
quote:
Originally posted by lilBuddha:
and that to write software to do so would create a security problem.

This is nonsense. If software can be written to circumvent the security, there is already a security problem.
Not necessarily. Apple may have access to parts of the code that are not accessible to other developers, and knowledge and understanding of how to access key parts of the system. There is a possibility that Apple could produce something that others couldn't. Because of how restrictive Apple are, it may be a possibility.
 
Posted by Dafyd (# 5549) on :
 
There's a difference I think between asking Apple to break into a particular phone, and asking Apple to show them how to break into any phone.
 
Posted by Alan Cresswell (# 31) on :
 
But, any potential security risk can only be realised if details of what Apple do to facilitate access to the phone become known beyond the group of programmers who develop the hack. I'm sure neither Apple nor the FBI have any desire to do that.
 
Posted by lilBuddha (# 14333) on :
 
From a CNN Money article:

quote:

Cook said that to comply with the ruling, Apple will have to rewrite its iOS operating system.


 
Posted by Doc Tor (# 9748) on :
 
AIUI, Apple have designed the phone so that it is secure against intrusion, and the FBI are asking Apple to show them how a secure phone can be cracked. Which makes a mockery of the 'secure' part of the phone.

Now, while neither Apple nor the FBI want to advertise how a secure phone magically becomes insecure, part of the iPhone's appeal is that it's secure. If they agree to the FBI's demands, Apple will then be selling an insecure phone. Android encryption apps are, of course, readily available.
 
Posted by Honest Ron Bacardi (# 38) on :
 
AIUI, what Apple is being asked to do is to effectively write a specific update of iOS (the iPhone operating system) that will not contain the current default of scrambling the data after more than the set number of unsuccessful passcode attempts. This will allow the FBI to use brute-force techniques to crack the phone's security by entering 4-digit passcodes until they hit the correct one.

If that's correct, there is no backdoor, nor will there be one. But there appears to be no impediment to commissioning a third party to reverse-engineer iOS and write the required software. No doubt there will be obstacles to overcome to do that, but it looks feasible. But I assume it will take too long for the nature of the enquiries they want to make.
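To make the brute-force arithmetic above concrete, here is a toy Python model. Everything in it (the class, the names, the limit of ten) is purely illustrative and is not Apple's implementation; it only shows why a 10,000-PIN keyspace protects nothing by itself, and why the wipe-after-N-failures limit is doing all the work.

```python
# Toy model, purely illustrative -- NOT Apple's implementation.
class ToyPhone:
    def __init__(self, pin, wipe_after=10):
        self.pin = pin
        self.wipe_after = wipe_after  # None = limit disabled (what the FBI want)
        self.failures = 0
        self.wiped = False

    def try_pin(self, guess):
        if self.wiped:
            return False
        if guess == self.pin:
            return True
        self.failures += 1
        if self.wipe_after is not None and self.failures >= self.wipe_after:
            self.wiped = True  # data destroyed after too many failures
        return False

def brute_force(phone):
    # A 4-digit PIN has only 10**4 = 10,000 possibilities.
    for guess in (f"{i:04d}" for i in range(10_000)):
        if phone.try_pin(guess):
            return guess
    return None

# With the wipe limit in place, brute force destroys the data instead:
assert brute_force(ToyPhone("7319")) is None
# With the limit disabled, brute force always succeeds:
assert brute_force(ToyPhone("7319", wipe_after=None)) == "7319"
```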
 
Posted by lilBuddha (# 14333) on :
 
Adding that nothing exists in a vacuum. The government security communities have been against private encryption since its very beginnings. Any inroad will be heavily traveled.
 
Posted by lilBuddha (# 14333) on :
 
A lengthier article discussing the issue.
 
Posted by anteater (# 11435) on :
 
I'm torn on this, having worked for many years on computer and IT security. By natural inclination I'm suspicious of governments, and I cannot see how people can think that unreasonable.

We know that the government spies on us and lies about it, and the problem with loss of trust is it takes a long time to regain it.

Having said that, I'm not involved in any activism that might attract the attention of the authorities, and I can accept that there are bad people around and that we are tying the police's hands by having unbreakable security.

Our prejudices are often fed by resentment and are not always creditable. For instance, most people's only contact with the police is getting fined for driving marginally over the limit. The only time we had a robbery, the police admitted there was little point in hoping for a resolution, mainly because they were certain that the robber had an inside police collaborator.

But who knows, I may once need the police and then if they needed to decrypt a phone to protect me I may well experience a change of view!

Difficult. Isn't it a pity that there's so much casual lying and corruption in the government and law enforcement.
 
Posted by Honest Ron Bacardi (# 38) on :
 
Exactly so, anteater.

BTW - here's another, more technical article on what Apple would need to do.
 
Posted by Golden Key (# 1468) on :
 
Does this mean that the NSA couldn't hack it? Or that agencies still aren't playing nicely with each other?
 
Posted by Beeswax Altar (# 11644) on :
 
I dislike Apple, the federal government, and jihadists.

I'm torn.
 
Posted by simontoad (# 18096) on :
 
I like it when the law requires that Courts approve intrusive investigative steps before they are taken by law enforcement authorities, provided that the right set of tests are open to be applied. I think it's appropriate given the complexity of information technology and its ubiquity in rich countries that the Court can order IT companies to facilitate access to their technology.

I think the right test for courts to apply is whether, in the particular circumstances before it, issuing the order sought is in the public interest.
 
Posted by RuthW (# 13) on :
 
quote:
Originally posted by Beeswax Altar:
I dislike Apple, the federal government, and jihadists.

I'm torn.

I am too, but the extensive damage this could do to Apple's business seems like an important consideration. How many people will opt for Samsung if Apple complies with this order? Especially given that more than half of all iPhones are sold outside the US.
 
Posted by Lamb Chopped (# 5528) on :
 
quote:
Originally posted by lilBuddha:

Lamb chopped, how much do you trust your government? Not only the current, but future. Google the FISA court battles.

Why are you asking me this question? My response had nothing to do with trust in the government (as if) and everything to do with the technical aspects of the situation. A backdoor is the name for a software programmer's deliberately designed way of getting past the security on a software program without having to go through the usual hoops, and by definition it exists from before the software is released to the public. Backdoors are supposed to be closed before public release, as they are obvious security risks, but you can easily imagine all the reasons a company might NOT actually close all the backdoors, leading to users' indignation.

From what you write later it appears you are not speaking of an actual backdoor, but rather of proprietary knowledge that would make breaking in easier in general.
 
Posted by Lamb Chopped (# 5528) on :
 
Later:

Okay, having read your link from further down the thread, it seems that what the FBI is asking for is that Apple construct a backdoor to be added to the phone via a software update--and is promising, pinkies crossed, that they will only use it on this one single iPhone.

Ain't gonna happen.

If Apple does this, it'll quickly become a routine request to all sorts of software providers. Or at least, all sorts of software developers that the US has any power over--which leaves US businesses at a disadvantage.

And I don't see how it would be much of an advantage for the FBI in the ongoing arms race that is computer security.

The thing I'm having a hard time visualizing is what information on a phone could possibly be so critical that the FBI could not do their job without it. Certainly it would speed things up, provide corroborating evidence, etc. etc. but if something like 9/11 is in the works again, I would expect the FBI to be picking up signals from more places than a single confiscated iPhone. Maybe I'm not thinking creatively enough.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Schroedinger's cat:
I suppose I want total security and total privacy, because I am not a security risk.

Should your self-assessment make you immune to a court order?

This is the thing that bothers me more than anything else. Frankly, it doesn't matter two hoots whether Apple thinks it's a good idea or not. I'm sure they argued in court that it wasn't a good idea.

But a judge weighed all that up, and made a decision. And unless someone can show that that decision was outside the judge's powers, it's contempt of court to not obey.

And obedience to court decisions is not, has never been, and should never be conditional on you agreeing with the court's decision. Otherwise you get Kim Davis refusing to issue marriage licences.

[ 18. February 2016, 06:53: Message edited by: orfeo ]
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Lamb Chopped:
If Apple does this, it'll quickly become a routine request to all sorts of software providers.

To which the routine response should be: "get a court order".

Apple shouldn't do this on the basis of a request. They should do it because they've been commanded to do so. There's a massive difference, the difference between the FBI wanting it done and a judge wanting it done.

[ 18. February 2016, 06:57: Message edited by: orfeo ]
 
Posted by Macrina (# 8807) on :
 
Is there any reason why the FBI can't just give the phone to Apple under secure conditions and let them crack it then give it back? That way Apple doesn't have to let the FBI have the code to tempt them to use it again and no one will know how the code works except Apple?

Not that I like this situation one bit, it's morally grey.
 
Posted by mdijon (# 8520) on :
 
quote:
Originally posted by Macrina:
Is there any reason why the FBI can't just give the phone to Apple under secure conditions and let them crack it then give it back?

I would guess because Apple staff don't have the FBI's trust to delve into the data on jihadist networks and pass it on untampered-with to the FBI. Personally if I was an Apple member of staff I would be quite concerned for my safety in doing that and I doubt that Apple would want to put its employees in that position.
 
Posted by Macrina (# 8807) on :
 
Good point [Smile] I really don't know enough about the technology hence my question.

What about just making sure they could unlock the phone? i.e give them the password to the device?
 
Posted by Curiosity killed ... (# 11770) on :
 
The password to the phone is a PIN. If it's anything like my daughter's iPhone or my Windows phone, it's a 4-digit code. That means there are 10^4 options = 10000.

iPhones have something embedded in their coding that means the phone wipes if too many wrong PIN attempts are made. Which is one of the things that the FBI has asked Apple to write code to avoid.

Apple need to be involved as their coding signatures are required to change Apple devices. (That one is according to the BBC news.)
 
Posted by mdijon (# 8520) on :
 
It sounds like it isn't as simple as that - I think they can only get in by doing what those of us who know as much about technology as I do call a complicated thing. Something to do with backdoors and brute force and other words.

I think the Mexican stand-off is that the FBI wouldn't trust Apple to be looking at the sensitive data without messing it up and Apple wouldn't trust the FBI to know how to get in without abusing the knowledge in the future.
 
Posted by Paul. (# 37) on :
 
quote:
Originally posted by Macrina:
Is there any reason why the FBI can't just give the phone to Apple under secure conditions and let them crack it then give it back? That way Apple doesn't have to let the FBI have the code to tempt them to use it again and no one will know how the code works except Apple?

Actually that's exactly what they've asked for.

The FBI are asking for an iOS update that will a) disable the wipe-after-10-wrong-guesses code, b) allow guesses to be entered via computer rather than manually and c) remove the delay that's added after a wrong guess. They've asked that this update be coded to the hardware key of the specific iPhone so it couldn't be used elsewhere. It's also suggested that it be loaded by Apple at Apple premises and that after that either the phone is handed back or the FBI get remote access to it via a computer.

(Source: see the paragraph beginning "As a remedy...")

So I think Apple's resistance is more about the legal precedent than the technical feasibility.
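As a back-of-envelope sketch of why requests (b) and (c) above matter, here is some simple arithmetic. All the timing figures are assumptions for illustration, not Apple-documented numbers.

```python
# Back-of-envelope arithmetic with ASSUMED timings (none of these figures
# come from Apple); it only illustrates why electronic entry and the
# removal of retry delays matter for a brute-force attack.
KEYSPACE = 10 ** 4             # all 4-digit PINs

manual_seconds_per_try = 5     # assumption: typing a PIN on the screen
forced_delay_seconds = 60      # assumption: a lockout delay after failures
electronic_try_seconds = 0.08  # assumption: ~80 ms per attempt via computer

worst_case_manual = KEYSPACE * (manual_seconds_per_try + forced_delay_seconds)
worst_case_electronic = KEYSPACE * electronic_try_seconds

print(f"manual, with delays:  ~{worst_case_manual / 86400:.1f} days")
print(f"electronic, no delay: ~{worst_case_electronic / 60:.1f} minutes")
```

Under these assumed numbers the same keyspace takes about a week by hand with delays, or minutes by computer without them.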
 
Posted by Paul. (# 37) on :
 
quote:
Originally posted by orfeo:
This is the thing that bothers me more than anything else. Frankly, it doesn't matter two hoots whether Apple thinks it's a good idea or not. I'm sure they argued in court that it wasn't a good idea.

But a judge weighed all that up, and made a decision. And unless someone can show that that decision was outside the judge's powers, it's contempt of court to not obey.

But it's not contempt of court to appeal, which AIUI is what Apple plan to do. And I assume one of the first steps in the appeal process would be to seek some sort of order which allows them to delay following the original order until the appeal is ruled on.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Paul.:
quote:
Originally posted by orfeo:
This is the thing that bothers me more than anything else. Frankly, it doesn't matter two hoots whether Apple thinks it's a good idea or not. I'm sure they argued in court that it wasn't a good idea.

But a judge weighed all that up, and made a decision. And unless someone can show that that decision was outside the judge's powers, it's contempt of court to not obey.

But it's not contempt of court to appeal, which AIUI is what Apple plan to do. And I assume one of the first steps in the appeal process would be to seek some sort of order which allows them to delay following the original order until the appeal is ruled on.
Absolutely true.

Meanwhile, though, they're deciding to try the case in the court of public opinion, with an open letter to their users. THAT is most definitely not one of the steps in appealing. It's a PR stunt designed to create sympathy and/or pressure. And as a person dedicated to the rule of law, it's exactly the kind of stunt that I hate.
 
Posted by Jane R (# 331) on :
 
Why should electronic documents be more private than anything else? In a murder investigation, if you have reason to believe that (physical) documents which will help the police build a case are hidden in the suspect's house or locked in a safe, you get a court order to allow you to look for them. If the owner refuses to give you the key you have the authority to break open the safe, and the safe manufacturer doesn't get to say 'You can't do that, it will encourage criminals to break into other safes made by us'.

[ 18. February 2016, 10:43: Message edited by: Jane R ]
 
Posted by Alan Cresswell (# 31) on :
 
The question is, should the safe manufacturer be forced to assist in the safe breaking? At their own expense and thereby undermining their "no one has ever broken into our safes" advertising campaign.
 
Posted by quetzalcoatl (# 16740) on :
 
I thought that in US law, code is seen as speech, and thus, Apple are being asked to produce a form of speech (code) to unlock the phone.

I have no idea whether this is unconstitutional or not in the US, but it's possible that you can't force someone to produce speech.
 
Posted by Albertus (# 13356) on :
 
I wonder how much of Apple's position is derived from a genuine and principled concern about users' privacy- and how much about the possible commercial effect of Apple kit becoming known, or thought, not to be as secure as that of its competitors.
 
Posted by lowlands_boy (# 12497) on :
 
Apple have always been masters of marketing, and I'm sure they see this principled crusade as excellent marketing.

As to the question of whether they should be made to participate, and Alan's analogy about safes. I'm pretty sure that if you buy a safe, it has some serial number that allows the vendor to produce a replacement key. I recently had a lock replaced on a car that was more than 15 years old, and the original car vendor was able to send me (or rather, an approved repairer), a matching barrel to be installed in the door lock so the key didn't need to be changed.

As for the technicalities of what is being asked for. I'm sure that allowing more variants of the key to be entered is trivial, as is eliminating the delay between retries. Allowing the key to be entered over a USB connection might be a bit more complex, but even if they just did the first two things, it wouldn't take very long to manually try 10,000 entries by hand. If I was sure that the data wasn't going to be wiped, I'd try a bunch of common values first.
 
Posted by Sioni Sais (# 5713) on :
 
quote:
Originally posted by orfeo:
quote:
Originally posted by Paul.:
quote:
Originally posted by orfeo:
This is the thing that bothers me more than anything else. Frankly, it doesn't matter two hoots whether Apple thinks it's a good idea or not. I'm sure they argued in court that it wasn't a good idea.

But a judge weighed all that up, and made a decision. And unless someone can show that that decision was outside the judge's powers, it's contempt of court to not obey.

But it's not contempt of court to appeal, which AIUI is what Apple plan to do. And I assume one of the first steps in the appeal process would be to seek some sort of order which allows them to delay following the original order until the appeal is ruled on.
Absolutely true.

Meanwhile, though, they're deciding to try the case in the court of public opinion, with an open letter to their users. THAT is most definitely not one of the steps in appealing. It's a PR stunt designed to create sympathy and/or pressure. And as a person dedicated to the rule of law, it's exactly the kind of stunt that I hate.

Thanks for putting this in black and white.

What we have here is data. Just about every territory on earth has laws governing data protection and disclosure, whether that data is in a card index, on your Facebook wall or tucked away in a program like a proprietary OS.
 
Posted by deano (# 12063) on :
 
I'm not torn on this at all. I believe you can all guess where my sympathies lie.

I trust the government. They can be voted out. Jihadis can't.

I have no sympathy for Apple. They are big boys and in this case they need to realise they exist because of the freedoms they would not have if they existed under a Jihadist regime.

Sometimes things are black and white, and you have to pick a side. Apple is being offered that choice.

Put it this way, if Apple win their case then I won't buy any Apple products and I suspect I won't be alone. There are plenty of alternatives.
 
Posted by Alan Cresswell (# 31) on :
 
quote:
Originally posted by lowlands_boy:
As to the question of whether they should be made to participate, and Alan's analogy about safes. I'm pretty sure that if you buy a safe, it has some serial number that allows the vendor to produce a replacement key. I recently had a lock replaced on a car that was more than 15 years old, and the original car vendor was able to send me (or rather, an approved repairer), a matching barrel to be installed in the door lock so the key didn't need to be changed.

But, those are examples of a manufacturer providing a replacement key to the legitimate owner of the safe/car. The FBI is not the legitimate owner of the phone - though they are in possession of it legally (assuming all due processes for collecting evidence were followed).

The analogous service would be for Apple to provide a means for people to access their phone if they have forgotten their PIN. Do they have such a service? I assume not, otherwise the FBI wouldn't be asking Apple to provide such a convoluted route to get into the phone.
 
Posted by Crœsos (# 238) on :
 
quote:
Originally posted by Alan Cresswell:
quote:
Originally posted by Jane R:
If the owner refuses to give you the key you have the authority to break open the safe, and the safe manufacturer doesn't get to say 'You can't do that, it will encourage criminals to break into other safes made by us'.

The question is, should the safe manufacturer be forced to assist in the safe breaking? At their own expense and thereby undermining their "no one has ever broken into our safes" advertising campaign.
More to the point, the state is asking the safe manufacturer (to extend the metaphor) to make a master key to open all their safes. This not only completely obliterates the selling point AC mentioned, it seems prone to abuse.

Fun fact: the expanded surveillance powers the U.S. implemented to "fight terrorism" have been used much more frequently (by several orders of magnitude) in drug-related cases.

quote:
Originally posted by deano:
I'm not torn on this at all. I believe you can all guess where my sympathies lie.

I trust the government. They can be voted out. Jihadis can't.

Depends on the government, doesn't it? Would you be as sanguine about the Chinese or Egyptian state demanding a way to decrypt any of Apple's products? Because once such a tool exists you can be fairly certain other states will also demand its use. "Sorry, but you're not a democratic government" isn't an argument that's likely to fly in the courts of those countries.
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Originally posted by Alan Cresswell:
quote:
Originally posted by lowlands_boy:
As to the question of whether they should be made to participate, and Alan's analogy about safes. I'm pretty sure that if you buy a safe, it has some serial number that allows the vendor to produce a replacement key. I recently had a lock replaced on a car that was more than 15 years old, and the original car vendor was able to send me (or rather, an approved repairer), a matching barrel to be installed in the door lock so the key didn't need to be changed.

But, those are examples of a manufacturer providing a replacement key to the legitimate owner of the safe/car. The FBI is not the legitimate owner of the phone - though they are in possession of it legally (assuming all due processes for collecting evidence were followed).

The analogous service would be for Apple to provide a means for people to access their phone if they have forgotten their PIN. Do they have such a service? I assume not, otherwise the FBI wouldn't be asking Apple to provide such a convoluted route to get into the phone.

Well, I think the first point is moot. Syed Farook is dead, so presumably he's not very interested in what happens next to his phone. Plus which the phone was owned by the San Bernardino County Department of Public Health, where Farook worked. It wasn't actually his, and the owner has given the FBI permission to search the phone.

I agree that they presumably don't have a simple access service for PIN codes though. But I don't think that's an issue anyway - according to the court, the FBI have a legitimate interest in the phone, and the FBI are also happy for the custom software required to be tied to the phone.

It was possible for Apple to bypass security features in older versions of iOS in a simpler way. That they have "chosen" to take themselves out of that loop shouldn't mean that a court can't order them to do it.

I'm sure that lots of high profile service providers have been ordered to provide access to data that someone thought was secure. It was foolish of Apple to assume that they'd be allowed to declare their devices secure when it was obvious that such a bypass could be created.

Note that Apple haven't denied the technical feasibility at all.
 
Posted by Paul. (# 37) on :
 
quote:
Originally posted by Crœsos:
More to the point, the state is asking the safe manufacturer (to extend the metaphor) to make it a master key to open all their safes.

But are they? According to the link I posted earlier the FBI requested the "key" (SIF) "be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE", i.e. they're asking for a key specific to that safe.

However in their statement Apple say

quote:
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
Which implies that either they can't make it for a single phone (unlikely as the encryption is already based on a per-phone unique key) or that they believe that such a single-phone insecure version of the OS could be hacked and turned into a general purpose tool.
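To make the "coded with a unique identifier of the phone" idea concrete, here is a minimal Python sketch. The scheme, key and names are all hypothetical assumptions, not Apple's actual mechanism; real code signing uses asymmetric cryptography, and the HMAC here merely stands in for it.

```python
import hmac
import hashlib

# Hypothetical stand-in for a vendor's private signing key.
SIGNING_KEY = b"vendor-secret"

def sign_for_device(image: bytes, device_uid: bytes) -> bytes:
    # Sign the software image together with one device's unique ID,
    # so the signature is only valid for that device.
    return hmac.new(SIGNING_KEY, image + device_uid, hashlib.sha256).digest()

def boot_accepts(image: bytes, device_uid: bytes, signature: bytes) -> bool:
    # The boot loader recomputes the tag with its OWN UID, so an image
    # signed for one phone will not load on any other.
    expected = sign_for_device(image, device_uid)
    return hmac.compare_digest(expected, signature)

weak_ios = b"iOS build with retry limit removed"
sig = sign_for_device(weak_ios, b"UID-subject-device")

assert boot_accepts(weak_ios, b"UID-subject-device", sig)   # loads here
assert not boot_accepts(weak_ios, b"UID-other-phone", sig)  # rejected elsewhere
```

The residual worry Apple voice is the second branch of the "either/or" above: that the signed weak build, once it exists, could be tampered with or re-signed in the wrong hands.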
 
Posted by lowlands_boy (# 12497) on :
 
The Chinese probably have all the necessary code. The FBI should just ask them....
 
Posted by Jane R (# 331) on :
 
Paul.
quote:
Which implies that either they can't make it for a single phone (unlikely as the encryption is already based on a per-phone unique key) or that they believe that such a single-phone insecure version of the OS could be hacked and turned into a general purpose tool.
Does anybody actually believe their claim that their phones are impossible to hack anyway? Given enough time and computing power, nothing is completely secure. The only security lies in making it very, very difficult for anyone who wants to hack in and hoping that they'll go after easier targets. Same principle as preventing a burglary, in fact; window locks and burglar alarms are all very well, but anyone who really wants to get in will manage it somehow.
 
Posted by quetzalcoatl (# 16740) on :
 
I paused when I read, 'Apple was told to build software that would ...'. I'm not sure if that is a correct paraphrase, but assuming it is, I wonder if it is constitutional for the govt to order a private individual to do certain work, or utter certain speech (i.e. code)?

That is a genuine question, because in some situations it may be so. But I leave this for you constitutional experts.
 
Posted by Lamb Chopped (# 5528) on :
 
Just to clarify if anybody's confused--the FBI aren't asking them to break the encryption; they intend to do that themselves if it can be done. (There are forms of encryption out there that are so far unbreakable, as far as we know.) What they are doing is asking them to disable secondary security features that prevent a brute force attack on the encryption.
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Originally posted by Lamb Chopped:
Just to clarify if anybody's confused--the FBI aren't asking them to break the encryption, they intend to do that themselves if it can be done. (There are forms of encryption out there that are so far unbreakable, as far as we know.)What they are doing is asking them to disable secondary security features that prevent a brute force attack on the encryption issue.

Yes - in this case, the "encryption" is linked explicitly to the pin code, which is only 4 digits long. By any measure, this is extremely weak and would be crackable in a fraction of a second. The issue is the "self destruct" that is built in after ten attempts.
 
Posted by lilBuddha (# 14333) on :
 
quote:
Originally posted by Paul.:
or that they believe that such a single-phone insecure version of the OS could be hacked and turned into a general purpose tool.

Bing, Bing, Bing, we have a winner!
----------------------

It is ridiculous to phrase this as Apple v. the FBI or to care based upon one's opinion of Apple.*
It is equally foolish to frame this as Apple v. jihadi or jihadi v. government.
As far as the law goes, that should not be held as sacrosanct either. It should always be a right to challenge a law, not only through the courts, but through civil disobedience.
And this is a civil liberties matter.


*For the record, though I have Apple products, I am not a massive fan of the company. And even were I, I think that is a ridiculous criterion to use to form an opinion in a matter such as this.
 
Posted by lilBuddha (# 14333) on :
 
quote:
Originally posted by Lamb Chopped:
Just to clarify if anybody's confused--the FBI aren't asking them to break the encryption, they intend to do that themselves if it can be done. (There are forms of encryption out there that are so far unbreakable, as far as we know.)What they are doing is asking them to disable secondary security features that prevent a brute force attack on the encryption issue.

No, they are asking for a version of the OS in which that security feature has been disabled. Which will make hacking any other phone that much easier.
 
Posted by Alan Cresswell (# 31) on :
 
But, there is no suggestion that the revised OS (should it ever be developed) will ever exist anywhere other than on a small number of computers at Apple (or, potentially one laptop that someone from Apple takes with them to wherever the FBI have the phone to install it). Apple aren't going to make this a free download from their website for anyone who wants to bypass a security feature on their phone (or, someone else's phone). So, unless the security at Apple is so bad that someone obtains that code illegally, that isn't an issue.

There is an outside chance that by making it known that there is a way of bypassing the 10 attempt limit on the PIN that someone else might create a piece of code capable of doing the same thing, but it won't be a version of whatever Apple produce. However, the chatter about this has already let that cat out of the bag.

For Apple (and, presumably, other producers) the issue is that if they do this it won't be long before the FBI, or some other organisation, comes along with another phone they need access to, and another, and another ... either they fight each one through the courts or some of their talented software engineers spend a large amount of their time opening up phones for the law enforcement community. How long can Apple, or anyone else, afford to keep on doing that?
 
Posted by lowlands_boy (# 12497) on :
 
There is no reason to believe this version would be any less secure, in terms of the technical components of the phone being hacked and reused, than any other would be.

The extent to which the operating system of the phone (which is what would need to be changed) is secure is completely different to the security of the user content.
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Originally posted by Alan Cresswell:
But, there is no suggestion that the revised OS (should it ever be developed) will ever exist anywhere other than on a small number of computers at Apple (or, potentially one laptop that someone from Apple takes with them to wherever the FBI have the phone to install it). Apple aren't going to make this a free download from their website for anyone who wants to bypass a security feature on their phone (or, someone else's phone). So, unless the security at Apple is so bad that someone obtains that code illegally, that isn't an issue.

There is an outside chance that by making it known that there is a way of bypassing the 10 attempt limit on the PIN that someone else might create a piece of code capable of doing the same thing, but it won't be a version of whatever Apple produce. However, the chatter about this has already let that cat out of the bag.

For Apple (and, presumably, other producers) the issue is that if they do this it won't be long before the FBI, or some other organisation, comes along with another phone they need access to, and another, and another ... either they fight each one through the courts or some of their talented software engineers spend a large amount of their time opening up phones for the law enforcement community. How long can Apple, or anyone else, afford to keep on doing that?

Since all these phones are essentially the same (with minor variations), there isn't really any issue for Apple to keep doing it. Realistically, they wouldn't destroy such a "feature" version anyway. They'd just keep the details stored in a branch of code somewhere with steps to make sure that they don't accidentally install it on the next release of their handsets. That's not at all difficult.

This is a marketing issue. Apple have marketed iPhones as somehow being more secure because Apple cannot decrypt the data. This is rubbish, as they have neglected to point out that they could easily circumvent a key part of the protection, as they are now being required to do by a court. They don't like that. Life is hard.

As for how long they could keep doing it, they have a cash mountain the size of many countries, so I should imagine they could sustain the services of a few engineers for quite some time....
 
Posted by lilBuddha (# 14333) on :
 
quote:
Originally posted by Alan Cresswell:
But, there is no suggestion that the revised OS (should it ever be developed) will ever exist anywhere other than on a small number of computers at Apple

This isn't a plausible scenario. Even were it, the FBI would have a copy of the disabled iOS in their hands when Apple left.
The intelligence community don't like encryption and they don't like having to ask each time they wish to snoop.
They want the tools, that is what this is about.
quote:
Originally posted by Alan Cresswell:

There is an outside chance that by making it known that there is a way of bypassing the 10 attempt limit on the PIN that someone else might create a piece of code capable of doing the same thing, but it won't be a version of whatever Apple produce. However, the chatter about this has already let that cat out of the bag.

This was hardly a secret even before the shootings.
 
Posted by Alan Cresswell (# 31) on :
 
quote:
Originally posted by lilBuddha:
quote:
Originally posted by Alan Cresswell:
But, there is no suggestion that the revised OS (should it ever be developed) will ever exist anywhere other than on a small number of computers at Apple

This isn't a plausible scenario. Even were it, the FBI would have a copy of the disabled iOS in their hands when Apple left.
The intelligence community don't like encryption and they don't like having to ask each time they wish to snoop.
They want the tools, that is what this is about.

The iOS will still be property of Apple, and the FBI won't be able to do anything with it without either permission from Apple or by breaking the law (which I would hope would make anything gained inadmissible in court). Even if they can bypass Apple's property rights, will the FBI be able to copy the revised iOS off the phone, and if they could would they be able to install it on a different phone without some input from Apple? What I've read suggests that anything Apple can produce will include components specific to the phone they are working with (access codes and the like). If so, then even with the iOS the FBI (or anyone else) will still need to ask each time they want to snoop.

The fact that the FBI won't like it is irrelevant to the fact.
 
Posted by lilBuddha (# 14333) on :
 
You are suggesting that the FBI would respect rights?
I do not believe this would be the case. Even allowing for the best intentions, power will be abused.
 
Posted by Alan Cresswell (# 31) on :
 
I'm suggesting that if they fail to respect Apple's property rights then they will only be able to do that with the collusion of other parts of the criminal justice system. They can't conduct what would effectively be an illegal search and still use that evidence within a criminal prosecution, so they would need a warrant of some form from the courts (as they have obtained in the current case). If the FBI can ride roughshod over the law then you have very significant problems, and the ability to access the contents of some phones isn't going to make any real difference to that.
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Originally posted by lilBuddha:
You are suggesting that the FBI would respect rights?
I do not believe this would be the case. Even allowing for the best intentions, power will be abused.

The FBI court submission already specifies that the modified version of the phone operating system should only operate on the specified handset, and once Apple digitally sign the software, it wouldn't be transferable to another handset anyway.

There are several ways it could go wrong therefore.

1. Apple screw up the modified version and neglect to lock it to a specific handset. Highly unlikely.

2. FBI obtain Apple's secret code signing keys, in which case they can do whatever they want forever. They might already have them illegally and just not want to publicise that, in which case this whole case is just smoke and mirrors.

3. There's a big hole in the whole code signing idea, in which case lots of entities are in trouble.

Which of these, or others, do you think is likely to cause a problem?
 
Posted by lilBuddha (# 14333) on :
 
quote:
Originally posted by Alan Cresswell:
I'm suggesting that if they fail to respect Apple's property rights then they will only be able to do that with the collusion of other parts of the criminal justice system. They can't conduct what would effectively be an illegal search and still use that evidence within a criminal prosecution, so they would need a warrant of some form from the courts (as they have obtained in the current case). If the FBI can ride roughshod over the law then you have very significant problems, and the ability to access the contents of some phones isn't going to make any real difference to that.

There exists no intelligence agency in the world in which all its members operate completely within the rules. None.
Once again, it is about encryption in general, not just this one mobile, not just Apple.
quote:
Originally posted by lowlands_boy:
The FBI court submission already specifies that the modified version of the phone operating system should only operate on the specified handset, and once Apple digitally sign the software, it wouldn't be transferable to another handset anyway

And how would that be done? The OS on that mobile is not unique.
 
Posted by lilBuddha (# 14333) on :
 
Just to clarify, though this
quote:

There exists no intelligence agency in the world in which all its members operate completely within the rules. None.

might sound a bit paranoid, it is based on those I know within the community, the selection processes, human nature and historical examples.
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Originally posted by lilBuddha:
quote:
Originally posted by Alan Cresswell:
I'm suggesting that if they fail to respect Apple's property rights then they will only be able to do that with the collusion of other parts of the criminal justice system. They can't conduct what would effectively be an illegal search and still use that evidence within a criminal prosecution, so they would need a warrant of some form from the courts (as they have obtained in the current case). If the FBI can ride roughshod over the law then you have very significant problems, and the ability to access the contents of some phones isn't going to make any real difference to that.

There exists no intelligence agency in the world in which all its members operate completely within the rules. None.
Once again, it is about encryption in general, not just this one mobile, not just Apple.
quote:
Originally posted by lowlands_boy:
The FBI court submission already specifies that the modified version of the phone operating system should only operate on the specified handset, and once Apple digitally sign the software, it wouldn't be transferable to another handset anyway

And how would that be done? The OS on that mobile is not unique.

The entire basis of the request is that a new version of the OS be created. This version of the OS should remove the self destruct function, remove the time delay between allowed attempts, and allow the submission of passcodes via a USB connection or similar.

There are lots of ways to forensically identify the handset in a unique way - serial numbers in hardware etc etc. Once you agree on a way to do that, you create the new version of the OS that incorporates that feature, then you digitally sign it to say that it's an authentic Apple version.

If you then tamper with it to try and change the forensic identification code (or indeed, any other code) and then put the tampered-with version on another phone, the other phone would be able to detect the tampering and refuse to run the software.

This tampering detection is already commonplace. It's been used, for example, to stop people installing non Sony approved software on PlayStation games consoles. There, it was cracked, but that was because people were able to steal the secret key that allowed them to pass off modified versions of the software as being legitimate ones.

The fact that the operating system is not unique at the moment doesn't matter. The new version would be, and would be locked to the specific device. Note (again) that Apple have not denied the technical feasibility of this, and that's what I think they are really pissed about - they've marketed their phones as being secure because Apple themselves can't decrypt them. Someone has now called bullshit on that (quite rightly) and Apple don't like it.
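The "lock it to one handset" idea described above can be sketched crudely. Everything in this sketch is hypothetical: HMAC with a shared secret stands in for Apple's real public-key code signing, and the key and serial names are made up. The shape of the check - the signature covers the firmware *and* the device ID, so a signed image only verifies on its own handset - is the point.

```python
# Toy sketch of a device-locked signed build: the vendor signs
# (firmware || device_id), and a handset accepts only an image whose
# signature verifies against its *own* hardware ID. HMAC with a shared
# secret is a stand-in for real public-key code signing; all names here
# are hypothetical.
import hashlib
import hmac

VENDOR_KEY = b"hypothetical-signing-secret"

def sign_for_device(firmware: bytes, device_id: bytes) -> bytes:
    """Sign a firmware image bound to one specific device ID."""
    return hmac.new(VENDOR_KEY, firmware + device_id, hashlib.sha256).digest()

def device_accepts(firmware: bytes, sig: bytes, own_id: bytes) -> bool:
    """The handset recomputes the signature over its own ID and compares."""
    expected = hmac.new(VENDOR_KEY, firmware + own_id, hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected)

image = b"unlocked-os-build"
sig = sign_for_device(image, b"serial-A")

print(device_accepts(image, sig, b"serial-A"))  # True: the target handset
print(device_accepts(image, sig, b"serial-B"))  # False: any other handset
```

Tampering with either the image or the embedded device binding invalidates the signature, which is why copying the build to a second phone buys an attacker nothing without the signing key itself.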
 
Posted by Paul. (# 37) on :
 
As an aside it's worth pointing out that the FBI's plan is only viable because the number of combinations of a 4-digit PIN can be tried via a computer in a reasonable time. If it had been a 6-character alphanumeric passcode then it could take years to try them all. iPhone users take note!
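The comparison is easy to check. The guess rate below is an assumption for illustration (press coverage at the time cited a hardware-enforced cost on the order of 80 ms per attempt); only the ratio between the two keyspaces matters.

```python
# Rough worst-case search times for the two passcode styles, assuming a
# hardware-limited rate of 12.5 guesses/second (~80 ms each) - an
# assumption for illustration, not a measured figure.

GUESSES_PER_SEC = 12.5

pin_space = 10 ** 4        # 4 digits
alnum_space = 62 ** 6      # 6 chars from a-z, A-Z, 0-9

pin_minutes = pin_space / GUESSES_PER_SEC / 60
alnum_years = alnum_space / GUESSES_PER_SEC / (3600 * 24 * 365)

print(f"4-digit PIN:   ~{pin_minutes:.0f} minutes")   # ~13 minutes
print(f"6-char alnum:  ~{alnum_years:.0f} years")     # ~144 years
```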
 
Posted by Boogie (# 13538) on :
 
quote:
Originally posted by lowlands_boy:

If you then tamper with it to try and change the forensic identification code (or indeed, any other code) and then put the tampered with version on another phone, the other phone would be able to detect the tampering and refuse to run the software.

Even our printer knows when we have used a non-Canon cartridge and won't work [Roll Eyes]
 
Posted by lilBuddha (# 14333) on :
 
quote:
Originally posted by lowlands_boy:
Someone has now called bullshit on that (quite rightly) and Apple don't like it.

Yeah, and Google support them because it is all about Apple and nothing else.

[ 18. February 2016, 18:56: Message edited by: lilBuddha ]
 
Posted by Ricardus (# 8757) on :
 
Exposing total ignorance here, but how are they going to persuade the phone to download and install the new OS if the phone is locked behind a PIN?
 
Posted by Ricardus (# 8757) on :
 
quote:
Originally posted by orfeo:

Meanwhile, though, they're deciding to try the case in the court of public opinion, with an open letter to their users. THAT is most definitely not one of the steps in appealing. It's a PR stunt designed to create sympathy and/or pressure.

Why shouldn't they do this?

The issues raised by the court order are matters of public interest. Consequently, it is good for there to be a public debate on them.

(Note: I am under no illusions that Apple actually care about the public interest, but if Apple's concerns happen to coincide with public concerns, I don't think those concerns suddenly become invalid just because it's Apple.)
 
Posted by Honest Ron Bacardi (# 38) on :
 
The phone in question is an iPhone 5S I believe. Cracking it should be relatively trivial compared with one of the 6 series. The latter have a separate firewalled processor system dealing with security presumably due to the "Apple Pay" feature.

Just to repeat what has already been said, this isn't about backdoors. It's more like crowbarring your way in through the front door. That way, the issue of encryption is irrelevant.

It's nice to think that Apple is acting on the highest principles here. Get a grip! It's a commercial decision. Apple is not a high-principles company.
 
Posted by Dafyd (# 5549) on :
 
quote:
Originally posted by Alan Cresswell:
If the FBI can ride rough shod over the law then you have very significant problems, and the ability to access the contents of some phones isn't going to make any real difference to that.

I don't believe the only problem with the FBI breaking the law is that evidence obtained that way would be inadmissible. This is the organisation that tried to blackmail Martin Luther King into giving up.
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Originally posted by lilBuddha:
quote:
Originally posted by lowlands_boy:
Someone has now called bullshit on that (quite rightly) and Apple don't like it.

Yeah, and Google support them because it is all about Apple and nothing else.
Google didn't like being told in Europe to remove selected listings from search engines. Microsoft didn't want to produce "unbundled" versions of Windows that had to allow people to select other browsers by default. Etc Etc. And in all cases, they're all taking the piss over their tax arrangements, as we have another thread about.

In every case, they wish to preserve the status quo for their own commercial advantage. It's nothing to do with principles.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Alan Cresswell:
The question is, should the safe manufacturer be forced to assist in the safe breaking? At their own expense and thereby undermining their "no one has ever broken into our safes" advertising campaign.

Yes.

This is the entire point of judges, to make decisions like this. To weigh up competing interests. To claim that the safe manufacturer's interest is absolute and will always, no matter what, win out over the interests of investigating and dealing with a crime is every bit as disturbing as anything that can be conjured up by 'the government might breach your privacy' bogeymen.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Ricardus:
quote:
Originally posted by orfeo:

Meanwhile, though, they're deciding to try the case in the court of public opinion, with an open letter to their users. THAT is most definitely not one of the steps in appealing. It's a PR stunt designed to create sympathy and/or pressure.

Why shouldn't they do this?

The issues raised by the court order are matters of public interest. Consequently, it is good for there to be a public debate on them.

(Note: I am under no illusions that Apple actually care about the public interest, but if Apple's concerns happen to coincide with public concerns, I don't think those concerns suddenly become invalid just because it's Apple.)

They're not looking for public debate. They're looking for an emotive push-button where all the people who are paranoid about government gasp in horror at the prospect that the government might get into their phone. They're sending out a letter to every customer that the FBI has shown zero interest in, with a subliminal message of "if we don't fight this, you could be next!"

Because, you know, owners of iPhones are good people whom the government should have no right to pry into.

Except, of course, for the ones that steal millions from superannuation or take sexually exploitative photographs of children or massacre an entire roomful of people. But those people are different. No true iPhone user would do those things.

Just as no true NRA member would ever use their gun for anything nefarious, and isn't it outrageous to think the government might impose upon them.

The thing that really exasperates me here is that a judge making an individual decision about an individual circumstance is exactly how decisions about invasion of our privacy are supposed to work. It's how search warrants work.

And Apple is basically saying that the principle of your privacy ought to be completely sacrosanct and absolute. If your iPhone is believed to contain a copy of plans to fly a plane into the World Trade Center, they'll fight for your right not to have that divulged.

[ 18. February 2016, 21:01: Message edited by: orfeo ]
 
Posted by orfeo (# 13878) on :
 
Meanwhile, of course, most of those customers that Apple is reassuring are blindly sharing data with Apple and/or app developers on a regular basis.

I've actually been having a heated argument with the developer of my gym logging app because, after years of entirely private use, they've created a situation where any data I enter about when I had a workout and what exercises I did ends up with them.

I'd be far happier with the police obtaining this information by judicial warrant than I am with a company obtaining this information as a matter of standard practice.

[ 18. February 2016, 21:09: Message edited by: orfeo ]
 
Posted by Ricardus (# 8757) on :
 
quote:
Originally posted by orfeo:
They're not looking for public debate. They're looking for an emotive push-button where all the people who are paranoid about government gasp in horror at the prospect that the government might get into their phone.

There are two points here (correct me if I caricature you):

1. That Apple have impure motives. True, but I think this is technically an argumentum ad hominem. Last year I wrote to the taxman saying I should have some money back because I had been over-taxed due to someone misunderstanding information I had given them. My concern was for my own filthy lucre, not for the abstract fairness of the tax system, but that doesn't mean I wasn't right.

2. That Apple's argument is emotive and silly. This may also be true, but in order to be assessed as emotive and silly, it has to be put in the public domain for public scrutiny.
 
Posted by chris stiles (# 12641) on :
 
quote:
Originally posted by orfeo:

They're looking for an emotive push-button where all the people who are paranoid ...

If your iPhone is believed to contain a copy of plans to fly a plane into the World Trade Center, they'll fight for your right not to have that divulged.

There is a certain irony in claiming that one side is probably making an emotive argument, whilst making one of your own.

[ 18. February 2016, 22:48: Message edited by: chris stiles ]
 
Posted by chris stiles (# 12641) on :
 
quote:
Originally posted by Honest Ron Bacardi:

It's nice to think that Apple is acting on the highest principles here. Get a grip! It's a commercial decision. Apple is not a high-principles company.

Which is largely beside the point; as a consumer you have to take your allies where you find them - and in this particular case Apple's interests align most closely with those of the consumer (who not coincidentally is the end customer for their services).

Policies have to be judged according to their outcomes, and not necessarily the stated intentions of those that push them.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by chris stiles:
quote:
Originally posted by orfeo:

They're looking for an emotive push-button where all the people who are paranoid ...

If your iPhone is believed to contain a copy of plans to fly a plane into the World Trade Center, they'll fight for your right not to have that divulged.

There is a certain irony in claiming that one side is probably making an emotive argument, whilst making one of your own.
It would be ironic if I didn't know I was doing it. I'm illustrating how the emotion that is evoked changes a great deal depending on whose phone you're talking about, and that this is exactly why it's a poor basis for argument.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Ricardus:
Last year I wrote to the taxman saying I should have some money back because I had been over-taxed due to someone misunderstanding information I had given them. My concern was for my own filthy lucre, not for the abstract fairness of the tax system, but that doesn't mean I wasn't right.

Sure. But this in fact neatly illustrates the difference between individual cases and systemic issues.

Apple are trying to invoke a systemic agenda because they've lost an individual court decision. To which part of my response is to actually talk about the systemic issue: about why we allow judges to make these kinds of orders to access data, or access properties.

Because the systemic issue is not about whether an individual Apple customer is a nice person. My objection to Apple's tactics is that they are trying to say "dear nice person, you should be mortified at the prospect of this ever happening to you".

My response would be "dear nice person, if you stay nice and don't, say, murder a large number of your colleagues, the chances of this ever happening to you are quite remote". Apple deliberately strips this context of the individual case in order to declare it's interested in the principle.

The fact that you were entitled to receive tax back in your own individual circumstances is no kind of basis to mount an argument that all of your neighbours ought to be getting money back too.
 
Posted by RuthW (# 13) on :
 
quote:
Originally posted by orfeo:
My response would be "dear nice person, if you stay nice and don't, say, murder a large number of your colleagues, the chances of this ever happening to you are quite remote".

This is a terrible argument -- it's like saying if you have nothing to hide, you have nothing to worry about when they violate your fourth amendment rights and search your home.

Moreover, it's not just government intrusions that people are concerned about -- it's also the very real possibility that if Apple is made to write code that breaks into its phones that code will end up being used by criminals who will have fewer scruples than the FBI about using it.

Finally, the notion that Apple should not have made a public statement defending their stance is ridiculous. Tim Cook not only has a constitutional right to speak, he has a responsibility to protect the interests of his company. I don't at all buy the crap about Apple caring more about profit than principles; being profitable is an important principle for a publicly held company.
 
Posted by Dave W. (# 8765) on :
 
Apple says that complying with the court's directive would pose a danger to other users:
quote:
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices.
Their reasoning isn't clear to me, though. According to their security white paper, Apple routinely makes software upgrades encrypted for use on unique devices, so the copy used on this phone wouldn't run on another. It's true that the "technique" could be used again - by someone with access to the source code and Apple's signature keys. But if the modification to the iOS is relatively minor, then practically the same danger already exists, since Apple currently has the unmodified source code and its signature keys. If Apple is currently able to ensure their secure possession of the elements required to create the iOS, I don't see why they can't take the same steps to limit access to the "technique" the FBI is asking for.
 
Posted by mousethief (# 953) on :
 
quote:
Originally posted by RuthW:
quote:
Originally posted by orfeo:
My response would be "dear nice person, if you stay nice and don't, say, murder a large number of your colleagues, the chances of this ever happening to you are quite remote".

This is a terrible argument -- it's like saying if you have nothing to hide, you have nothing to worry about when they violate your fourth amendment rights and search your home.
Bingo. "If you're not a bad person you have nothing to fear from this wee little trampling of your rights" is not a good argument.
 
Posted by RuthW (# 13) on :
 
Dave W.: Yes, I see your point. Tim Cook in his statement said,

quote:
Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
This doesn't seem like a great analogy, but I think I see his point -- better that the thing doesn't exist at all.

Another thing ... the FBI was the one to decide to try this thing in the court of public opinion. They made this public, not Apple.

And another ... this is an unusual court order in that it doesn't require Apple to turn over information it already has -- it requires Apple to write code that (presumably) does not exist. If Apple can be compelled to do this, what can other companies be compelled to do to assist government investigations?

[ 19. February 2016, 04:41: Message edited by: RuthW ]
 
Posted by mousethief (# 953) on :
 
quote:
Originally posted by RuthW:
And another ... this is an unusual court order in that it doesn't require Apple to turn over information it already has -- it requires Apple to write code that (presumably) does not exist. If Apple can be compelled to do this, what can other companies be compelled to do to assist government investigations?

Look, we need a widget that does this, that, and the other, and we know your company is capable of inventing and producing one. Cough up the goods or go to jail.

[ 19. February 2016, 04:42: Message edited by: mousethief ]
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by mousethief:
quote:
Originally posted by RuthW:
quote:
Originally posted by orfeo:
My response would be "dear nice person, if you stay nice and don't, say, murder a large number of your colleagues, the chances of this ever happening to you are quite remote".

This is a terrible argument -- it's like saying if you have nothing to hide, you have nothing to worry about when they violate your fourth amendment rights and search your home.
Bingo. "If you're not a bad person you have nothing to fear from this wee little trampling of your rights" is not a good argument.
Of course it's not a good argument.

But it's not my argument, and in fact it's begging the question. First prove that there's a trampling of your rights.

Apple's argument is the data equivalent of declaring "I have a right to resist any attempt to enter my home". No, you don't. There's this thing called a search warrant.

The notion that people have an absolute right to privacy of their information is bunkum. They have as much right to privacy as the law supplies.

[ 19. February 2016, 05:03: Message edited by: orfeo ]
 
Posted by orfeo (# 13878) on :
 
The other reason I stand by my actual argument is the involvement of a judge. This is the entire reason that we involve judges in such decisions, so that law enforcement is not authorised to go poking into anyone they are interested in. They have to justify that interest to an independent person.

The reason "this is unlikely to happen to you" works is not because I think that the FBI is good and wholesome, but because I think nothing in the law permits the FBI to go from a specific case to a general one.

If you don't trust that, well then, I spend a heck of a lot of my time writing protections for you that aren't worth the paper they're written on. But ask yourself, if the FBI are that nefarious, why the fuck did they bother going to court in the first place to get a court order?

In short: how does the fact that the FBI followed legal process justify dark predictions that if they get their way, the FBI will start doing illegal things?

[ 19. February 2016, 05:17: Message edited by: orfeo ]
 
Posted by Ricardus (# 8757) on :
 
quote:
Originally posted by orfeo:

Apple are trying to invoke a systemic agenda because they've lost an individual court decision. To which part of my response is to actually talk about the systemic issue: about why we allow judges to make these kinds of orders to access data, or access properties.

Well, presumably the judge believes it's in the public interest and the law allows it.

Both of which are legitimate matters for public debate. The first for obvious reasons, and the second because the law is supposed to reflect public opinion insofar as it is created by a publicly elected legislature - if the law creates an outcome that is repellent to the public, the public ought to lobby for a change in the law. But for this to happen, the public has to know about the (potentially) repellent outcome in the first place.

quote:

My response would be "dear nice person, if you stay nice and don't, say, murder a large number of your colleagues, the chances of this ever happening to you are quite remote".

Tell that to Khaled El-Masri.
 
Posted by orfeo (# 13878) on :
 
Again: how the hell does Khaled El-Masri have any relevance to a situation where a law enforcement agency has gone to a court and obtained an entirely non-secret court order?

If we were talking about a fear that Apple software engineers are going to be abducted and waterboarded, I'd accept the relevance.
 
Posted by mousethief (# 953) on :
 
quote:
Originally posted by orfeo:
In short: how does the fact that the FBI followed legal process justify dark predictions that if they get their way, the FBI will start doing illegal things?

The fear is more that the court will start awarding warrants to every tom dick and harry law enforcement brigade who want to decrypt our private information.

This is a proxy in the long-running war between the government and the encryption industry. Without taking that into account, talking about court orders and how nice the FBI is is just so much hot air.

The government doesn't want there to be non-government-break-into-able encryption. Encryption creators, and people like me, think that allowing the government to have access to our personal data on the level they want is tantamount to there not being any security at all.

Before you start gassing about this is just one phone, you need to plug this one phone into this larger argument. Yes, it's a slippery slope argument. Once the camel's nose is under the tent flap, what is to stop the government from demanding more and more? If not the FBI then some other agency. State agencies. Local agencies. We need to catch the guy who robbed this convenience store. Break into the cell phones of everybody standing in line or go to jail.

As for laws not being worth the paper they're printed on -- yeah, let's talk about police confiscating people's cars for suspicion of drugs, and not having to return them when the person is found innocent of all charges. And the paper we're talking about here is the paper the U.S. Bill of Rights was written on. So yeah. Where law enforcement agencies are involved, you'll have to forgive us if we feel a little unsure of their intent. However beautiful the laws you craft are.
 
Posted by Ricardus (# 8757) on :
 
quote:
Originally posted by orfeo:
Again: how the hell does Khaled El-Masri have any relevance to a situation where a law enforcement agency has gone to a court and obtained an entirely non-secret court order?

Because it raises the question: "Do the American security services have sufficient checks to protect the innocent from molestation for them to be entrusted with the powers that, according to this judge, the law entitles them to?" and "If not, should the public lobby for additional legal safeguards?"
 
Posted by Leorning Cniht (# 17564) on :
 
There seems to be some magical thinking based on Apple's claims of security going on. So here's an analogy.

Suppose a suspect had some documents locked up in a safe. The police think there might be some incriminating evidence in that safe. They can go to court and get a warrant allowing them to search the safe, which might well involve the services of a locksmith to gain access.

In any reasonable society, this has to be allowed. You don't get to say "no, you can't look in my safe, because it says 'private' on it."

But what if the safe has an additional security feature? Suppose there's a bomb in the safe, and if you tamper with the safe, it will destroy its contents. Clearly this can't change anything about whether the courts should be able to order the safe opened - it's still got the same stuff in it. But it has become technically more difficult to do.

Now, the court is ordering Apple Safe Co. to help defeat the bomb in the safe. The court knows that Apple Safe Co. has the plans for the safe, believes that it is capable of building a tool that will prevent the bomb from going off, and so orders it to do so.

We're still in the same scenario. If the court orders the safe opened, it gets opened. It's not reasonable for people to claim that documents shouldn't be admitted as evidence against them because those documents were stamped "private". It's equally and identically unreasonable for someone to claim that the court shouldn't be able to order their locks unlocked.
 
Posted by lilBuddha (# 14333) on :
 
quote:
Originally posted by mousethief:
The fear is more that the court will start awarding warrants to every tom dick and harry law enforcement brigade who want to decrypt our private information.

It is not a fear, but a reality.

Here is a longer article. This link is to a preface, there is an 83 page report linked therein.
 
Posted by chris stiles (# 12641) on :
 
quote:
Originally posted by orfeo:
It would be ironic if I didn't know I was doing it. I'm illustrating how the emotion that is evoked changes a great deal depending on whose phone you're talking about, and that this is exactly why it's a poor basis for argument.

Which would have merit if they had actually made the argument you claim they made, if they had been the first to do so, and if this had been their first and only contribution to this debate.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Ricardus:
quote:
Originally posted by orfeo:
Again: how the hell does Khaled El-Masri have any relevance to a situation where a law enforcement agency has gone to a court and obtained an entirely non-secret court order?

Because it raises the question: "Do the American security services have sufficient checks to protect the innocent from molestation for them to be entrusted with the powers that, according to this judge, the law entitles them to?" and "If not, should the public lobby for additional legal safeguards?"
I'd say it raises the question "why on earth would additional legal safeguards make you feel any safer"?
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by lilBuddha:
quote:
Originally posted by mousethief:
The fear is more that the court will start awarding warrants to every tom dick and harry law enforcement brigade who want to decrypt our private information.

It is not a fear, but a reality.

Here is a longer article. This link is to a preface, there is an 83 page report linked therein.

Yeah, okay. I give you permission to be frightened.
 
Posted by Ricardus (# 8757) on :
 
quote:
Originally posted by orfeo:
I'd say it raises the question "why on earth would additional legal safeguards make you feel any safer"?

Indeed, another good question of public interest.

To be clear: I am not saying I agree or disagree with Apple. I am saying I agree with their right to lobby the public.
 
Posted by Paul. (# 37) on :
 
quote:
Originally posted by Leorning Cniht:
Now, the court is ordering Apple Safe Co. to help defeat the bomb in the safe. The court knows that Apple Safe Co. has the plans for the safe, believes that it is capable of building a tool that will prevent the bomb from going off, and so orders it to do so.

We're still in the same scenario. If the court orders the safe opened, it gets opened. It's not reasonable for people to claim that documents shouldn't be admitted as evidence against them because those documents were stamped "private". It's equally and identically unreasonable for someone to claim that the court shouldn't be able to order their locks unlocked.

I don't think anyone is arguing that the contents of the phone, should the FBI be able to recover them, are or should be inadmissible.

It's more that the tool has the potential for abuse and undermines the business of Apple Safe Co.

I'm uneasy with the idea that people or corporations should not just be required not to obstruct law enforcement, but be coerced to join its ranks - against their own interests and their own moral reservations.
 
Posted by orfeo (# 13878) on :
 
I did laugh.

http://www.thebeaverton.com/us/item/2454-apple-vows-to-stop-fbi-from-corrupting-exploitation-monopoly
 
Posted by lowlands_boy (# 12497) on :
 
So - should there be some absolute right to privacy then? And should it extend to the dead?
 
Posted by chris stiles (# 12641) on :
 
quote:
Originally posted by lowlands_boy:
So - should there be some absolute right to privacy then?

I don't think one necessarily has to believe in an absolute right of privacy in order to be cautiously supportive of Apple in this particular scenario, given the wider context.
 
Posted by lilBuddha (# 14333) on :
 
quote:
Originally posted by chris stiles:
quote:
Originally posted by lowlands_boy:
So - should there be some absolute right to privacy then?

I don't think one necessarily has to believe in an absolute right of privacy in order to be cautiously supportive of Apple in this particular scenario, given the wider context.
Absolute privacy is not a practical right. But expecting a level of privacy is. And trusting the government completely is foolish. At its best intention, government is composed of people, people who will fail, people who will abuse authority, people who will bend or suspend rules for the 'greater good'. Time and again they get it wrong. Guantanamo, the NSA and MI5 phone spying, Kincora*, the Computer Misuse Act amendments, etc.
Over time, authority will be abused. It is not a question of if, but of when and how far.


*Kincora involvement is alleged, but the fact they can legally hinder investigation is troubling.

[ 19. February 2016, 15:18: Message edited by: lilBuddha ]
 
Posted by quetzalcoatl (# 16740) on :
 
UK readers may recall how the Terrorism Act was used to expel an 82 year old man from the Labour Party Conference.

OK, this is a long way from Apple, but it's the idea that if the authorities are able to abuse powers, inevitably they will. Hence, widespread wariness.

http://news.bbc.co.uk/1/hi/4291388.stm
 
Posted by Doublethink. (# 1984) on :
 
I think the biggest risk is more basic, once the software is written and available in a large organisation - sooner or later it will be leaked. Not because either Apple or the FBI want that - but just because it is worth a huge amount of money.
 
Posted by Paul. (# 37) on :
 
It's OK John McAfee will do it instead
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by Doublethink.:
I think the biggest risk is more basic, once the software is written and available in a large organisation - sooner or later it will be leaked. Not because either Apple or the FBI want that - but just because it is worth a huge amount of money.

But couldn't you make the same argument about all of Apple's software? If they're unable to control this one copy of iOS, why should anything they do be considered secure?
 
Posted by lowlands_boy (# 12497) on :
 
I'm quite sure that the FBI do have other mechanisms by which they can do this, given the resources at their disposal. I think it's reasonable to assume that they (and other intelligence agencies) do actually go out and acquire large quantities of phones, tablets etc., and study them in forensic detail at both the software and hardware level.

I'm sure they have already cracked this mechanism before as an exercise and could do so in this case. No harm in them asking Apple to do it, from their point of view.
 
Posted by W Hyatt (# 14250) on :
 
With regard to the safe analogy, it seems to me that it's not so much like the government asking for a master key, it's more like they're asking for software that could be loaded into any safe to make it possible for anyone to open it.
 
Posted by Belle Ringer (# 13379) on :
 
What happens if Apple, uhm, attempts to crack the phone but fails?

Anyway, I doubt the FBI believes national-security-shattering data is on that one phone and nowhere else; more likely they see this situation as a great gift to exploit for getting the public to agree that the FBI should have backdoor access to everything.

It's not "about one phone," end of topic. It's about one phone used as a strategic tool: push phone by phone at first, then collect multiple incidents to prove the FBI needs access to all communications "to keep us safe."

Possessing a phone they don't have backdoor access to will become a crime. Duh - if they have a right of access, you have no right to possess an inaccessible phone; obviously you'd be a terrorist.

FBI aren't stupid, they've always wanted total access, here's a bloody incident and a locked phone to milk for public persuasion towards their goal.

The only question is do you agree with their goal or not?
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by W Hyatt:
With regard to the safe analogy, it seems to me that it's not so much like the government asking for a master key, it's more like they're asking for software that could be loaded into any safe to make it possible for anyone to open it.

...with the point being that if you can walk up to a safe, install new software on it, and it opens, then it's not a safe at all. It's a box designed to mislead you into thinking that it's a safe.

Your stuff is in your apartment, Apple is your landlord, and has the key. The FBI have a court order to look through your stuff. It's flat-out unreasonable for Apple to say "no, that's his apartment - it's private. I'm not going to let you in there."

Apple should say exactly that right up to the point that the FBI or whoever present a search warrant, and then say "certainly, officer - I'll open the door for you." That's how search warrants are supposed to work.

Can this be abused? Yes. But that's not a reason to ban the searching of someone's home.
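To make the software-lock point concrete: a retry limit enforced purely in code lasts only as long as no one who can sign replacement code wants it gone. A toy sketch in Python (all names invented for illustration; no relation to Apple's actual firmware):

```python
# Toy model of a software-enforced passcode retry limit.
# All names are invented for illustration; this is not Apple's code.

class PhoneFirmware:
    MAX_ATTEMPTS = 10  # wipe the key material after this many failures

    def __init__(self, correct_pin):
        self._pin = correct_pin
        self._failures = 0
        self.wiped = False

    def try_pin(self, guess):
        if self.wiped:
            return False  # data is gone; nothing left to unlock
        if guess == self._pin:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self.wiped = True  # the auto-erase feature
        return False


# A replacement image, signed by the vendor, can simply omit the
# counter -- which is the kind of build the order asks Apple to produce:

class ModifiedFirmware(PhoneFirmware):
    def try_pin(self, guess):
        return guess == self._pin  # no counter, no wipe: brute force away
```

In other words, the lock is only as strong as the signing key that decides what code the phone will accept.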
 
Posted by Doublethink. (# 1984) on :
 
quote:
Originally posted by Dave W.:
quote:
Originally posted by Doublethink.:
I think the biggest risk is more basic, once the software is written and available in a large organisation - sooner or later it will be leaked. Not because either Apple or the FBI want that - but just because it is worth a huge amount of money.

But couldn't you make the same argument about all of Apple's software? If they're unable to control this one copy of iOS, why should anything they do be considered secure?
Actually, I was more thinking of the FBI, given it's very large.
 
Posted by W Hyatt (# 14250) on :
 
quote:
Originally posted by Leorning Cniht:
quote:
Originally posted by W Hyatt:
With regard to the safe analogy, it seems to me that it's not so much like the government asking for a master key, it's more like they're asking for software that could be loaded into any safe to make it possible for anyone to open it.

...with the point being that if you can walk up to a safe, install new software on it, and it opens, then it's not a safe at all. It's a box designed to mislead you into thinking that it's a safe.

Your stuff is in your apartment, Apple is your landlord, and has the key. The FBI have a court order to look through your stuff. It's flat-out unreasonable for Apple to say "no, that's his apartment - it's private. I'm not going to let you in there."

Apple should say exactly that right up to the point that the FBI or whoever present a search warrant, and then say "certainly, officer - I'll open the door for you." That's how search warrants are supposed to work.

Can this be abused? Yes. But that's not a reason to ban the searching of someone's home.

Only Apple isn't the landlord because I own my home. And Apple not only doesn't have a key, they sold me the home with the lock based on the fact that there can be no such key for them to have.

Should the government be able to demand that they retroactively change the design of my lock so that such a key can be created after the fact? It's not that Apple doesn't want the FBI to have the information in the one particular phone, it's that they don't want to be forced into breaking their promise to their entire customer base.

The problem with a physical analogy is that it can't adequately represent the unique issues presented by software. I don't know what the answer should be, but it shouldn't necessarily be based on precedents that assume a physical analogy.
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by Doublethink.:
quote:
Originally posted by Dave W.:
quote:
Originally posted by Doublethink.:
I think the biggest risk is more basic, once the software is written and available in a large organisation - sooner or later it will be leaked. Not because either Apple or the FBI want that - but just because it is worth a huge amount of money.

But couldn't you make the same argument about all of Apple's software? If they're unable to control this one copy of iOS, why should anything they do be considered secure?
Actually, I was more thinking of the FBI, given it's very large.
According to the writ, the FBI only requires access to the phone to brute force the password; the software package is to be made loadable and executable on this phone only, and the phone itself can remain at an Apple facility.
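And the brute forcing itself is quick once the limits are off. A rough sketch of the arithmetic, assuming about 80 ms of key-derivation work per guess (a figure widely reported for these phones - treat it as an assumption, not a measurement):

```python
# Back-of-the-envelope brute-force times once the retry limit and
# inter-attempt delays are removed.  The 80 ms per guess is an assumed
# key-derivation cost, not a measured one.

SECONDS_PER_GUESS = 0.08

def worst_case_seconds(pin_digits):
    """Seconds needed to try every possible PIN of the given length."""
    return (10 ** pin_digits) * SECONDS_PER_GUESS

print(f"4-digit PIN: {worst_case_seconds(4) / 60:.0f} minutes")  # -> 13 minutes
print(f"6-digit PIN: {worst_case_seconds(6) / 3600:.1f} hours")  # -> 22.2 hours
```

A long alphanumeric passphrase pushes the same arithmetic out to centuries, which is why an order like this mainly helps against short numeric PINs.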
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by W Hyatt:
Only Apple isn't the landlord because I own my home. And Apple not only doesn't have a key, they sold me the home with the lock based on the fact that there can be no such key for them to have.

And they lied. It is technically possible for them to defeat the security mechanisms on your lock.

quote:
it's that they don't want to be forced into breaking their promise to their entire customer base.
Yes, that's what this is about. Apple made a promise, which was a lie. They represented their new security feature as being effective, whereas it can be defeated by loading new code on to the phone.

quote:

The problem with a physical analogy is that it can't adequately represent the unique issues presented by software.

And the problem with talking about the "unique issues presented by software" is that they're really not unique. Privacy doesn't look different just because there's a computer involved.

The big difference is that computers and electronic records make data analytics and mass searches possible that would take far too long to be practicable by hand. But this is a feature of electronic records, not a feature of phone encryption.

Why should the phone in your pocket have more legal protection than the PC locked in your house? Suppose you have a home computer full of all kinds of personal information (like most people) and it's stored in your locked house, but doesn't have password protection. Everybody agrees that, with a warrant, the police can enter your locked home, take your computer, and search through all your personal data for incriminating records. Why should the computer in your pocket be different?
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by Leorning Cniht:
Why should the computer in your pocket be different?

And in case it wasn't clear, why should a software lock be treated differently from a physical lock?
 
Posted by mousethief (# 953) on :
 
quote:
Originally posted by Belle Ringer:
Anyway, I doubt FBI believes national security shattering data is on that one phone and nowhere else; more likely they see this situation as a great gift yto exploit for getting the public to agree that FBI should have backdoor access to everything.

Yes. This. exactly this. They are using this case as a wedge to destroy our right to privacy via non-government-crackable cybersecurity. As I said above, the government has been in apoplexy since large-prime keys were invented. How dare anybody in the world not have a back door that the government -- who will of course always get a warrant and never use it for, say, political gain -- can use.

May I suggest that before people try to make more moronic analogies to physical objects that everybody take a breather and go read Crypto by Steven Levy. This isn't about this phone. This is about living in a fishbowl with the government able to get into any and all data it wants to. Because there's no middle road here. It's a binary. Can they, or can they not, get in through a back door?

Now that Apple has all but admitted that there is a way to crack the phone, namely by installing new code on it even though it's locked, will phone manufacturers try to prevent this in future phone operating systems, and will the government allow it?
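For anyone who hasn't met the "large-prime keys" in question: public-key systems like RSA rest on multiplying two primes being easy while factoring the product is hard. A textbook sketch with toy numbers (real keys use primes hundreds of digits long plus padding schemes; this is illustration, not usable crypto):

```python
# Textbook RSA with deliberately tiny primes.  Illustrative only:
# real keys use 1024-bit-plus primes and padding such as OAEP.

p, q = 61, 53               # the secret "large primes" (tiny here)
n = p * q                   # 3233: the public modulus, freely published
phi = (p - 1) * (q - 1)     # 3120: computable only if you know p and q
e = 17                      # public exponent
d = pow(e, -1, phi)         # private exponent (modular inverse, Python 3.8+)

message = 42
ciphertext = pow(message, e, n)     # anyone with (e, n) can encrypt
recovered = pow(ciphertext, d, n)   # only the holder of d can decrypt
assert recovered == message

# An eavesdropper holding only (e, n) must factor n to get d --
# trivial for 3233, believed infeasible for a 2048-bit modulus.
```

The government's long-standing objection is precisely that d never has to leave the owner's hands.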
 
Posted by lilBuddha (# 14333) on :
 
quote:
Originally posted by Leorning Cniht:
quote:
Originally posted by Leorning Cniht:
Why should the computer in your pocket be different?

And in case it wasn't clear, why should a software lock be treated differently from a physical lock?
The lock analogy is ridiculous.
The FBI are attacking civil liberty. That is what this is about.
 
Posted by W Hyatt (# 14250) on :
 
quote:
Originally posted by Leorning Cniht:
quote:

The problem with a physical analogy is that it can't adequately represent the unique issues presented by software.

And the problem with talking about the "unique issues presented by software" is that they're really not unique. Privacy doesn't look different just because there's a computer involved.
The privacy issues have lots of legal precedents and are pretty well understood, but the uniqueness of this case is not about an individual's privacy; it's about what tools our society is willing to allow law enforcement to use.

quote:
The big difference is that computers and electronic records make data analytics and mass searches possible that would take far too long to be practicable by hand. But this is a feature of electronic records, not a feature of phone encryption.
Sure, but as I said, it's not the privacy aspect with regard to a single individual that makes this case of such interest to the public, it's the ramifications of mass searches.

quote:
Why should the phone in your pocket have more legal protection than the PC locked in your house. Suppose you have a home computer full of all kinds of personal information (like most people) and it's stored in your locked house, but doesn't have password protection. Everybody agrees that, with a warrant, the police can enter your locked home, take your computer, and search through all your personal data for incriminating records. Why should the computer in your pocket be different?
For the same reason the NSA digitally collecting massive amounts of phone records is different than getting a warrant for an old-style wire-tap.

Prior to the digital age, law enforcement had to make a case that an individual was a legitimate target for investigation before they could start collecting data about that individual. But in the digital age, that is being flipped so that data collection happens before any individual is identified as a legitimate target. This allows law enforcement, at least in theory, to look for everyone who fits any profile they consider suspicious. Law enforcement tends to dislike anyone they have decided is suspicious, and we know that they can severely disrupt or even ruin someone they don't like even before anything gets to trial.

Imagine that some totally innocent person happens to purchase items from an ethnic restaurant serving Middle Eastern cuisine, travel to parts of the world where there is easier access to terrorists (such as Turkey), and visit a set of Facebook pages and web sites the FBI has linked to terrorism (perhaps all just to research a book of fiction about terrorism). If the FBI can gather that information prior to having any reason to suspect that person, they can decide that person fits the profile of a dangerous terrorist and start focusing their resources on establishing a case against them. If they convince themselves that they are investigating a terrorist, they have the power to ruin that person's life. Remember that this is the organization that, as one of the upthread links pointed out, used as a reason for suspecting someone the fact that he had grown up (in the USA) working at his father's gas station - making him an expert in blowing one up. This is not how we want our law enforcement to operate, but it's how they actually do operate when they are allowed to and are given the tools to do so.

Like I said, I don't know what the answer should be, but these are the kinds of issues that need to be carefully considered. Because as lilBuddha has pointed out:

quote:
Over time, authority will be abused. It is not a question of if, but of when and how far.
and:

quote:
It is not a fear, but a reality.
Note that in theory, if the FBI was allowed to operate that way, I could become the target of an investigation just for posting this because they decided I fit the profile of someone who poses a danger to the country.

This is not a simple case of a search warrant overriding someone's privacy.

[ 20. February 2016, 03:15: Message edited by: W Hyatt ]
 
Posted by W Hyatt (# 14250) on :
 
Sorry, it wasn't the FBI that offered that reason for suspecting someone, it was the Director of National Intelligence. And it wasn't in a link provided up thread, it was an indirect link I followed from Ricardus' link to an article about Khaled El-Masri, which referenced a different article about Majid Khan:

quote:
Khan gained asylum in the United States in 1998 and was a legal resident of Baltimore, Maryland, where he had attended high school and worked for his father. Khan has made repeated offers to submit to a polygraph test to prove his innocence, but been denied.[4] The Director of National Intelligence has asserted that Khan's experience working in his father's gas station "...made Khan highly qualified to assist Mohammad with the research and planning to blow up gas stations."
But don't worry, they're the good guys so we should trust them to be scrupulous in their use of mass searches. [Roll Eyes]
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by chris stiles:
quote:
Originally posted by lowlands_boy:
So - should there be some absolute right to privacy then?

I don't think one necessarily has to believe in an absolute right of privacy in order to be cautiously supportive of Apple in this particular scenario, given the wider context.
The "wider context" is full of bogeymen. The actual case involves access to a phone.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Dave W.:
quote:
Originally posted by Doublethink.:
I think the biggest risk is more basic, once the software is written and available in a large organisation - sooner or later it will be leaked. Not because either Apple or the FBI want that - but just because it is worth a huge amount of money.

But couldn't you make the same argument about all of Apple's software? If they're unable to control this one copy of iOS, why should anything they do be considered secure?
Ding! Ding! Ding! We have a winner. All of these terrible things will apparently only happen with software that Apple doesn't want to write. Not with all the software that Apple wants you to trust.

Such as the 'error 53' software that somehow, inexplicably, escaped from Apple's labs and ruined lots of people's phones. A factory test gone wrong, they're telling us.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by lilBuddha:
quote:
Originally posted by Leorning Cniht:
quote:
Originally posted by Leorning Cniht:
Why should the computer in your pocket be different?

And in case it wasn't clear, why should a software lock be treated differently from a physical lock?
The lock analogy is ridiculous.
The FBI are attacking civil liberty. That is what this is about.

Declarations that they are attacking civil liberty are equally ridiculous. They are investigating a crime. They are going to a judge to get a court order, which is exactly what law enforcement authorities are supposed to do.

Of course, if you'd like the government to leave your rights alone, we can also see to it that they leave you to fend for yourself on various other rights such as your right to life.

There is a balance to these things. And shrieking in horror at attempts to work out whether a mass murderer was part of a larger network of mass murderers is freaking ridiculous.

The other thing that is freaking ridiculous is talking about the FBI and the NSA and people abducted from other countries all in the same breath, as if "The Government" is just one big monolithic thing. No, as someone who works in "The Government", let me tell you all that it isn't some big fucking monolith of people conspiring in black vans.

I'm honestly trying to work out right now just what is the difference between people who develop conspiracy theories about the FBI (including the bald assertion upthread that lowlands boy is sure the FBI really already know how to crack the phone - probably using their secret mental powers derived from aliens) and a bunch of ranchers who take over federal land in Oregon. Not much difference at all, I fancy.

[ 20. February 2016, 03:58: Message edited by: orfeo ]
 
Posted by orfeo (# 13878) on :
 
Let me just end my crankiness by saying this: people spend a hell of a lot of time cowering in fear at the prospect of government invading their lives, while blithely handing over massive amounts of their personal information to corporations whose primary interest in you is how much money they can make out of you.

The government never tried to come into my house and change anything. Meanwhile Microsoft takes active steps to move me onto an entirely new operating system. Apple now pesters me whenever they can to join their music streaming service. Facebook is upset with me because I haven't updated or added any information to my profile for years.

Uber tells you they're breaking the law for your benefit, not because of the millions they make from it.

They want to know all about you, and however much they smile about it it's not because they want to be friends with you.

And that's a difference really, isn't it? The government doesn't smile. The government knows you kind of have to deal with it by force of law, so you're not a 'customer'. But companies, they smile at the boiling frog and shower it with bright colours and cat videos, and the boiling frog smiles back.

[ 20. February 2016, 04:07: Message edited by: orfeo ]
 
Posted by W Hyatt (# 14250) on :
 
I understand what you're saying, orfeo, but Apple and Microsoft generally won't be trying to put me in prison (although I did hear about one case where a bank convinced the FBI to prosecute an ex-employee who knew about high-frequency trading software).
 
Posted by W Hyatt (# 14250) on :
 
BTW, here's a link to the story about a bank convincing the FBI to prosecute an ex-employee, Sergey Aleynikov (conviction was overturned on appeal).

quote:
In the course of these events, Aleynikov has spent a year in prison for crimes he didn't commit. Aleynikov has divorced, lost his savings,[12] and, according to his lawyer, "[his] life has been all but ruined".
Overzealous law enforcement can't be completely eliminated, but it's prudent for society to take it seriously.
 
Posted by lilBuddha (# 14333) on :
 
quote:
Originally posted by orfeo:

The other thing that is freaking ridiculous is talking about the FBI and the NSA and people abducted from other countries all in the same breath, as if "The Government" is just one big monolithic thing. No, as someone who works in "The Government", let me tell you all that it isn't some big fucking monolith of people conspiring in black vans.

One doesn't need any vast government conspiracy. All that is necessary is that a power be available and someone will use it, someone will abuse it. Often with the best intention. It is inevitable.
W Hyatt's example is but one of many.
Being completely paranoid of government is mental. But so is trusting it completely.

The problem with governmental intelligence is not that they do not already have enough, but that they do not use it properly. The information needed to prevent 9/11 was there. It was the failure to coordinate between agencies and failure to properly assess the data that was the problem.
It is when politics drive policy instead of analysts that we end up with unnecessary wars.
It isn't that I believe there is a unified conspiracy to control, I don't. It is that I know that such a thing is unnecessary for abuse to happen.

ETA: BTW, any use of the word 'government' by me is short hand for 'element or agency of government', not the thing entire.

[ 20. February 2016, 06:45: Message edited by: lilBuddha ]
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by lilBuddha:
The problem with governmental intelligence is not that they do not already have enough, but that they do not use it properly.

The FBI is a police force, not a spying agency.
 
Posted by lilBuddha (# 14333) on :
 
quote:
Originally posted by orfeo:
quote:
Originally posted by lilBuddha:
The problem with governmental intelligence is not that they do not already have enough, but that they do not use it properly.

The FBI is a police force, not a spying agency.
Wow, so that caused a traffic jam of responses.
First was someone should tell them that. Followed by aren't they the equivalent of MI5? And there is nothing that mandates a wall between police force and spying.
They are a domestic intelligence agency like MI5 and ASIO.
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by W Hyatt:
For the same reason the NSA digitally collecting massive amounts of phone records is different than getting a warrant for an old-style wire-tap.

The proposal being discussed is a modification to the phone's operating system that requires physical access to the phone. This isn't the same as a mass NSA data sieve.

I will note, however, that there's no reason why Apple couldn't install back doors in a future version of iOS that would, for example, silently copy all your data to the NSA. This decision doesn't make that either more or less likely.

quote:

This is not a simple case of a search warrant overriding someone's privacy.

Yes, it is. This isn't a mass wiretap program, and doesn't do anything to enable a mass wiretap program.
 
Posted by lilBuddha (# 14333) on :
 
It creates a precedent and moves the line further down the road.

[ 20. February 2016, 14:36: Message edited by: lilBuddha ]
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by lilBuddha:
It creates a precedent and moves the line further down the road.

It's the same precedent as looking through someone's computer.

Here's a question for you, and/or anyone else who is seeing the court order in this case as an overreach of intrusive government:

I have a computer in my house which contains all kinds of personal information. It is in a locked building, and may or may not require passwords and decryption to access its data. Do you, or do you not, agree that a police force in possession of a suitable court-issued warrant should be able to look through the data on my computer? Do the lengths that I may have gone to to protect the data on that machine (passwords, encryption, deleted data, booby traps, ...) alter your opinion on whether the police should be able to attempt to unlock it?

If the phone wasn't protected by the self-destruct feature, we wouldn't be having this discussion - the FBI would have already looked through the phone. So why does the presence of the self-destruct feature change what the FBI should be allowed to do?

[ 20. February 2016, 15:04: Message edited by: Leorning Cniht ]
 
Posted by Honest Ron Bacardi (# 38) on :
 
quote:
Originally posted by lilBuddha:
It creates a precedent and moves the line further down the road.

It very likely would, but probably not in the way most people think.

Writing a variant version of iOS is not a herculean task; it could probably be done in many places, and may have been done already. What prevents it from "going live" is that it needs to be digitally signed using Apple's private key.

In effect, the risk of the modified software escaping from Apple is probably somewhat less than the risk of the key itself (which will be a much smaller piece of data) escaping.

Neither risk is zero, but the damage if the key escapes is far higher than if the modified OS escapes. It's impossible to work out what the key is from the software (signing uses a two-key public/private cipher, I believe), but with the key you can distribute functioning updates at will.

So in fact there is a much larger danger lurking out there. Any risks attaching to the production of an engineered variant of iOS, provided it is kept within Apple, are small in comparison, and probably smaller than the risk of losing control of the signing key.
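To make that asymmetry concrete, here is a toy sketch (textbook RSA with tiny demonstration numbers - nothing remotely like Apple's real parameters or padding) of why holding the signed software doesn't reveal the key:

```python
# Toy illustration only: textbook RSA with tiny numbers, standing in for
# Apple's real (much larger, padded) code-signing scheme.
import hashlib

p, q = 61, 53
n = p * q          # 3233: public modulus, known to every device
e = 17             # public verification exponent
d = 2753           # private signing exponent: the closely guarded secret

def digest(image: bytes) -> int:
    # Measurement of the firmware image, reduced into the toy key's range.
    return int.from_bytes(hashlib.sha256(image).digest(), "big") % n

def sign(image: bytes) -> int:
    return pow(digest(image), d, n)          # needs the private key

def verify(image: bytes, sig: int) -> bool:
    return pow(sig, e, n) == digest(image)   # needs only public information

firmware = b"modified iOS build"
sig = sign(firmware)
assert verify(firmware, sig)                 # genuine signature accepted
assert not verify(firmware, (sig + 1) % n)   # any altered signature fails
```

Anyone holding the firmware and the public pair (n, e) can run the check, but recovering d from them is the hard problem the scheme rests on - which is why the key leaking would be far worse than the modified software leaking.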

As I mentioned earlier, the real reason is almost certainly a commercial one: to market the superiority of the iPhone 6 and succeeding models on the basis of their internal data security. That's a commendable development, but it's obscuring the arguments. This thread as a whole risks misidentifying what and where the risks are, as well as their magnitude.

And of course that then feeds into the separate issue of the nature and inviolability of private data, though that's not really what I'm looking at here, important though it is.
 
Posted by Alisdair (# 15837) on :
 
The police have the right to do whatever the law allows them to do.

But, 'the law is an ass' - in other words it is an arbitrary human construct, subject to all the flaws that human beings always bring to the party.

That is: the appropriateness of application, the morality of motive, and the consequences of applying the law in any given situation should NEVER be taken for granted.

Apple's motives for resisting the court order, and their ability to comply with it, are effectively beside the point.

The question is: what are the likely consequences of compliance: for Apple, for other iPhone users, and for the relationship between 'the state' and 'the people'?

This is the stuff of true democracy, and a good punch-up in the public square (and on the Ship for that matter) is exactly what is required, I think.
 
Posted by Enoch (# 14322) on :
 
Although it's not overtly stated, is part of the reason why a lot of people feel uneasy about this that a search warrant usually just authorises the investigating authorities to enter your house and pull everything apart? Even if they can take your computer away, they have to do the work. Here, though, they seem to be asking for a court order compelling somebody else either to do their investigating for them, or to set up all the necessaries so they can do it.


Incidentally, though this is only a collateral point: who would pay Apple to do the reprogramming? Would Apple have to do that at their own expense? Or would the government at least be required to reimburse them at the appropriate hourly rate? At least with the compulsory purchase of an asset, the government has to pay market price for it - or at least, they do here.

And is compulsory purchase of labour or time a bit like slavery?
 
Posted by Enoch (# 14322) on :
 
quote:
Originally posted by Alisdair:
The police have the right to do whatever the law allows them to do. ...

In a very literal and limited way, obviously that is true, but I hope nobody really believes it is either ethically or existentially true.
 
Posted by RuthW (# 13) on :
 
quote:
Originally posted by Dave W.:
quote:
Originally posted by Doublethink.:
I think the biggest risk is more basic, once the software is written and available in a large organisation - sooner or later it will be leaked. Not because either Apple or the FBI want that - but just because it is worth a huge amount of money.

But couldn't you make the same argument about all of Apple's software? If they're unable to control this one copy of iOS, why should anything they do be considered secure?
And clearly they've been successful at safeguarding the Apple signature software that has to be used to get the phone to accept a new iOS, or else the FBI wouldn't need to get Apple to help -- they'd just have their own people write the code.

But having it stolen isn't what I'd be worrying about if I were an Apple bigwig. If Apple is compelled by the US government to write code to break into someone's phone, they're going to have a hell of a time trying to tell the Chinese government that they won't do the same thing when the Chinese government demands it. Do folks here with faith in the goodness of US security agencies have the same faith in the goodness of Chinese security agencies? Somehow I doubt it.

quote:
Originally posted by Belle Ringer:
I doubt FBI believes national security shattering data is on that one phone and nowhere else; more likely they see this situation as a great gift to exploit for getting the public to agree that FBI should have backdoor access to everything.

It's not about one phone, end of topic. It's about one phone used as a strategic tool to push, phone by phone at first, then collect multiple incidents to prove FBI needs access to all communications "to keep us safe."

Possessing a phone they don't have backdoor access to will become a crime. Duh, if they have right of access, you have no right to possess the inaccessible; obviously you'd be a terrorist.

FBI aren't stupid, they've always wanted total access, here's a bloody incident and a locked phone to milk for public persuasion towards their goal.

The only question is do you agree with their goal or not?

I agree with almost all of this. I do think they want the info on this one phone, and I believe them when they say they think it might tell them whether other people helped the San Bernardino terrorists carry out this attack.

But I think this is also a great case for them to try to advance the goal of access to people's data. The US security agencies have tried and failed to get Congress to pass legislation requiring companies like Apple and Google to build backdoors for government access into their systems. The FBI could have done this all quietly -- they usually do. But they sought this court order very publicly because they're hoping that the elements of the case -- terrorism, a dead suspect whose privacy we're not likely to care about, and the phone being owned by a local government agency which has given permission for it to be hacked -- will help shift the public debate of privacy vs security in the direction of security.
 
Posted by RuthW (# 13) on :
 
And another thing ...

orfeo: The FBI isn't a spying agency? WTF? Do you know anything about the history of the FBI?
 
Posted by Honest Ron Bacardi (# 38) on :
 
RuthW wrote:
quote:
But I think this is also a great case for them to try to advance the goal of access to people's data. The US security agencies have tried and failed to get Congress to pass legislation requiring companies like Apple and Google to build backdoors for government access into their systems. The FBI could have done this all quietly -- they usually do. But they sought this court order very publicly because they're hoping that the elements of the case -- terrorism, a dead suspect whose privacy we're not likely to care about, and the phone being owned by a local government agency which has given permission for it to be hacked -- will help shift the public debate of privacy vs security in the direction of security.
This (or perhaps something like it) is pretty much where I am at too. It's actually been stated publicly by the security community post the Snowden revelations, after which Apple and Google are reported to have stopped playing ball so nicely. So there are bigger issues at play here on both sides, without any doubt.

I'm less sure about the China issue though. Apple is notorious for avoiding tax in the countries in which it makes most of its profits. (So is Google, though they look like they're repenting to a degree - perhaps.) I'm not sure why they should fear China when they already seem to have decided to tell friendly governments to fuck off.

[ 20. February 2016, 22:50: Message edited by: Honest Ron Bacardi ]
 
Posted by The Rogue (# 2275) on :
 
The matter isn't concluded yet and Apple are certainly making the most of it - this is terrific advertising and anyone who wants absolute phone security (I don't particularly) will be more interested in their products now. Whatever we think about firms like Apple we should recognise their PR capabilities - money well spent. I've already used their name twice in this post.

By the way, there is an option on my iPhone to have a four-digit or a six-digit access code. It initially came set to six. And you don't have to have it wipe the phone if you guess wrong too many times, but that is an option which presumably the FBI believe has been selected on this particular phone.
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by RuthW:
If Apple is compelled by the US government to write code to break into someone's phone, they're going to have a hell of a time trying to tell the Chinese government that they won't do the same thing when the Chinese government demands it. Do folks here with faith in the goodness of US security agencies have the same faith in the goodness of Chinese security agencies? Somehow I doubt it.

Well, it's not exactly goodness that I'm relying on, but I do think the FBI for all its shortcomings is probably more subject to legal restraint than its Chinese counterpart.

But then, I also don't think the outcome of this case will affect the behavior of the Chinese government one way or the other; if they want to require Apple's assistance, I don't think they'll care whether a US court ruled for or against a similar demand from the FBI.
 
Posted by mousethief (# 953) on :
 
quote:
Originally posted by orfeo:
quote:
Originally posted by lilBuddha:
The problem with governmental intelligence is not that they do not already have enough, but that they do not use it properly.

The FBI is a police force, not a spying agency.
That's so cute.
 
Posted by Dave W. (# 8765) on :
 
The FBI describes itself as "an intelligence-driven and a threat-focused national security organization with both intelligence and law enforcement responsibilities", and its top 3 priorities are currently counterterrorism, counterintelligence, and protection against cyber-based attacks.
It looks like there's a fair amount of overlap with the mission of MI5, though I think the FBI traditionally has had a greater law enforcement role.
 
Posted by Nicolemr (# 28) on :
 
Just as an aside, Donald Trump has called for a boycott of all Apple products over this. He tweeted this from his iPhone. [Roll Eyes]
 
Posted by Golden Key (# 1468) on :
 
quote:
Originally posted by mousethief:
quote:
Originally posted by orfeo:
quote:
Originally posted by lilBuddha:
The problem with governmental intelligence is not that they do not already have enough, but that they do not use it properly.

The FBI is a police force, not a spying agency.
That's so cute.
[Overused]


orfeo: FWIW, IMHO, you frequently seem to think that the US is what you think it should be, not what it really is.

Here's a sampling of HuffPost articles on FBI spying.

Also look up "COINTELPRO", "Carnivore", and "Echelon".
And then there's Stingray.
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by The Rogue:
anyone who wants absolute phone security (I don't particularly) will be more interested in their products now.

Why? Apple have admitted that their new security, which they claimed was failsafe, is in fact "trust us, we're Apple".

Then again, anyone who wanted actual security would be taking other measures anyway.

[ 21. February 2016, 03:36: Message edited by: Leorning Cniht ]
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by lilBuddha:
quote:
Originally posted by orfeo:
quote:
Originally posted by lilBuddha:
The problem with governmental intelligence is not that they do not already have enough, but that they do not use it properly.

The FBI is a police force, not a spying agency.
Wow, so that caused a traffic jam of responses.
First was someone should tell them that. Followed by aren't they the equivalent of MI5? And there is nothing that mandates a wall between police force and spying.
They are a domestic intelligence agency like MI5 and ASIO.

No, they really aren't. If you can't tell the difference between the FBI and the CIA and the NSA, this conversation is never going to get anywhere.

And that goes for the rest of you as well. I know that the systems between different countries aren't exactly parallel, but the fact is the FBI is not the CIA or the NSA, otherwise there'd be no reason for each of them to have a separate existence.

[ 21. February 2016, 09:24: Message edited by: orfeo ]
 
Posted by orfeo (# 13878) on :
 
More to the point, this conversation is useless so long as everyone can just mention big, dark bogeymen and mutter darkly about the possibilities. How can anyone disprove these theories?

I just saw one astounding post where a person simultaneously asserted that (1) the FBI already has access to the phone, (2) hacking a phone is easy, (3) the FBI wants this for wider nefarious purposes.

Again, how can anyone disprove such nonsensical, internally inconsistent theories? This is a guy proposing that Apple's security isn't actually worth anything, and that the FBI already knows how to break it, yet the FBI is asking for the tool to break Apple's security.
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Originally posted by orfeo:
More to the point, this conversation is useless so long as everyone can just mention big, dark bogeymen and mutter darkly about the possibilities. How can anyone disprove these theories?

I just saw one astounding post where a person simultaneously asserted that (1) the FBI already has access to the phone, (2) hacking a phone is easy, (3) the FBI wants this for wider nefarious purposes.

Again, how can anyone disprove such nonsensical, internally inconsistent theories? This is a guy proposing that Apple's security isn't actually worth anything, and that the FBI already knows how to break it, yet the FBI is asking for the tool to break Apple's security.

Well, let's look at that again, shall we?

quote:
I'm quite sure that the FBI do have other mechanisms by which they can do this, given the resources at their disposal. I think it's reasonable to assume that they (and other intelligence agencies) do actually go out and acquire large quantities of phones, tablets etc etc, and study them in forensic detail at both the software and hardware level.

I'm sure they have already cracked this mechanism before as an exercise and could do so in this case. No harm in them asking Apple to do it, from their point of view.


(1) the FBI already has access to the phone

It doesn't say that.

(2) hacking a phone is easy,

It doesn't say that

(3) the FBI wants this for these wider nefarious purposes.

It doesn't say that.

Perhaps you were referring to a different one?
 
Posted by Jay-Emm (# 11411) on :
 
quote:
Originally posted by orfeo:
quote:
Originally posted by lilBuddha:
quote:
Originally posted by orfeo:
quote:
Originally posted by lilBuddha:
The problem with governmental intelligence is not that they do not already have enough, but that they do not use it properly.

The FBI is a police force, not a spying agency.
Wow, so that caused a traffic jam of responses.
First was someone should tell them that. Followed by aren't they the equivalent of MI5? And there is nothing that mandates a wall between police force and spying.
They are a domestic intelligence agency like MI5 and ASIO.

No, they really aren't. If you can't tell the difference between the FBI and the CIA and the NSA, this conversation is never going to get anywhere.
Um, the CIA is not a domestic intelligence service (and would be like MI6/ASIS, not MI5/ASIO).

The FBI is a domestic intelligence service (like MI5/ASIO and unlike MI6/ASIS)

In theory at least, obv things get a bit blurry.
 
Posted by chris stiles (# 12641) on :
 
quote:
Originally posted by Leorning Cniht:
quote:
Originally posted by The Rogue:
anyone who wants absolute phone security (I don't particularly) will be more interested in their products now.

Why? Apple have admitted that their new security, which they claimed was failsafe, is in fact "trust us, we're Apple".

The iphone model in question is not using their new security implementation.
 
Posted by Dave W. (# 8765) on :
 
I don't think Apple has given any public statement suggesting that newer iPhones are any less susceptible to the kind of thing the FBI is asking for.
 
Posted by mousethief (# 953) on :
 
quote:
Originally posted by orfeo:
I know that the systems between different countries aren't exactly parallel, but the fact is the FBI is not the CIA or the NSA, otherwise there'd be no reason for each of them to have a separate existence.

Do you seriously think that the organization of government agencies always makes perfect sense, and never has any redundancy? Many voices said, and are still saying, that the NSA was completely unnecessary because we have the FBI. Do you think there was no domestic spying in the US before Dubya created the NSA? Or that the FBI just willingly relinquished all such activities when the NSA was created?

If the NSA were really the only agency in charge of investigating terrorism (for which spying is an absolute necessity), it would be they who were asking that the phone be hacked, not the FBI.

Further the FBI and CIA are famous for bickering over turf.

You are coming across increasingly as out of your depth here.

quote:
Originally posted by orfeo:
More to the point, this conversation is useless so long as everyone can just mention big, dark bogeymen and mutter darkly about the possibilities. How can anyone disprove these theories?

I just saw one astounding post where a person simultaneously asserted that (1) the FBI already has access to the phone, (2) hacking a phone is easy, (3) the FBI wants this for wider nefarious purposes.

Again, how can anyone disprove such nonsensical, internally inconsistent theories? This is a guy proposing that Apple's security isn't actually worth anything, and that the FBI already knows how to break it, yet the FBI is asking for the tool to break Apple's security.

Not all of us are making these claims. You can't refute one person's claims and thereby dismiss the rest of us. That's a variant of the straw man fallacy.

[ 21. February 2016, 15:52: Message edited by: mousethief ]
 
Posted by Dave W. (# 8765) on :
 
I don't think it's necessary to invoke turf battles or the FBI's checkered past to justify characterizing it as a domestic intelligence agency - that view is sufficiently mainstream to make it into the first sentence of the FBI's wiki page:
quote:
The Federal Bureau of Investigation (FBI) is the domestic intelligence and security service of the United States, which simultaneously serves as the nation's prime Federal law enforcement organization.
That's also pretty much how the FBI describes itself:
quote:
What is the FBI?

The FBI is an intelligence-driven and threat-focused national security organization with both intelligence and law enforcement responsibilities—the principal investigative arm of the U.S. Department of Justice and a full member of the U.S. Intelligence Community. It has the authority and responsibility to investigate specific crimes assigned to it and to provide other law enforcement agencies with cooperative services, such as fingerprint identification, laboratory examinations, and training. The FBI also gathers, shares, and analyzes intelligence—both to support its own investigations and those of its partners and to better understand and combat the security threats facing the United States.

I think the FBI should be investigating the San Bernardino terrorist attack, and they'd be remiss if they didn't try to find out what's on that phone. If Apple has a good reason why they shouldn't be required to do what the FBI wants, they'll have a chance to explain it this week to the judge who issued the writ.

So far I don't think their public statements are too convincing. The FBI seems to be making a request narrowly targeted at this one particular phone, and in its own iOS Security Guide, Apple describes how it makes sure that update packages are linked to individual phones:
quote:
The boot-time chain-of-trust evaluation verifies that the signature comes from Apple and that the measurement of the item loaded from disk, combined with the device’s ECID, matches what was covered by the signature.

These steps ensure that the authorization is for a specific device and that an old iOS version from one device can’t be copied to another.

Perhaps this protection doesn't apply to the firmware update the FBI is asking for, but so far Apple hasn't provided enough detail publicly to explain why.
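For the curious, the check in that quoted passage can be sketched roughly like this - a deliberate simplification, with an HMAC standing in for Apple's actual asymmetric signature and an invented ECID value:

```python
# Simplified sketch of the personalized-update check quoted above.
# An HMAC stands in for Apple's real asymmetric signature; in reality the
# device holds only a public verification key, not the signing secret.
import hashlib
import hmac

SIGNING_KEY = b"stand-in for Apple's private signing key"

def authorize_update(image: bytes, ecid: int) -> bytes:
    """Server side: sign the image measurement bound to one device's ECID."""
    measurement = hashlib.sha256(image).digest()
    return hmac.new(SIGNING_KEY, measurement + ecid.to_bytes(8, "big"),
                    hashlib.sha256).digest()

def device_accepts(image: bytes, my_ecid: int, ticket: bytes) -> bool:
    """Device side: recompute the measurement with its *own* ECID and check."""
    expected = authorize_update(image, my_ecid)
    return hmac.compare_digest(expected, ticket)

image = b"iOS update package"
ticket = authorize_update(image, ecid=0x0000BEEF)     # invented ECID
assert device_accepts(image, 0x0000BEEF, ticket)      # the authorized device
assert not device_accepts(image, 0x0000CAFE, ticket)  # any other device fails
assert not device_accepts(b"old iOS build", 0x0000BEEF, ticket)  # wrong image
```

Because the ticket covers both the measurement and the ECID, a package authorized for one phone can't be replayed on another - which is why, at least technically, the FBI's request can be scoped to a single device.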
 
Posted by mousethief (# 953) on :
 
Dave W. -- good job investigating.

"The FBI also gathers, shares, and analyzes intelligence...."

IOW, spying.
 
Posted by Ricardus (# 8757) on :
 
quote:
Originally posted by orfeo:
More to the point, this conversation is useless so long as everyone can just mention big, dark bogeymen and mutter darkly about the possibilities. How can anyone disprove these theories?

I don't think one needs evidence or even a suggestion of any specific conspiracy to be concerned about the limits of power. I have no evidence at all of any crimes the Queen would have committed had she come to the throne with absolute power, but that's not a reason for thinking this democracy idea might be a bit excessive. To put it another way, if one waits to see if a power is abused before putting limits on it, one is usually too late.
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Originally posted by Dave W.:
I don't think it's necessary to invoke turf battles or the FBI's checkered past to justify characterizing it as a domestic intelligence agency - that view is sufficiently mainstream to make it into the first sentence of the FBI's wiki page:
quote:
The Federal Bureau of Investigation (FBI) is the domestic intelligence and security service of the United States, which simultaneously serves as the nation's prime Federal law enforcement organization.
That's also pretty much how the FBI describes itself:
quote:
What is the FBI?

The FBI is an intelligence-driven and threat-focused national security organization with both intelligence and law enforcement responsibilities—the principal investigative arm of the U.S. Department of Justice and a full member of the U.S. Intelligence Community. It has the authority and responsibility to investigate specific crimes assigned to it and to provide other law enforcement agencies with cooperative services, such as fingerprint identification, laboratory examinations, and training. The FBI also gathers, shares, and analyzes intelligence—both to support its own investigations and those of its partners and to better understand and combat the security threats facing the United States.

I think the FBI should be investigating the San Bernardino terrorist attack, and they'd be remiss if they didn't try to find out what's on that phone. If Apple has a good reason why they shouldn't be required to do what the FBI wants, they'll have a chance to explain it this week to the judge who issued the writ.

So far I don't think their public statements are too convincing. The FBI seems to be making a request narrowly targeted at this one particular phone, and in its own iOS Security Guide, Apple describes how it makes sure that update packages are linked to individual phones:
quote:
The boot-time chain-of-trust evaluation verifies that the signature comes from Apple and that the measurement of the item loaded from disk, combined with the device’s ECID, matches what was covered by the signature.

These steps ensure that the authorization is for a specific device and that an old iOS version from one device can’t be copied to another.

Perhaps this protection doesn't apply to the firmware update the FBI is asking for, but so far Apple hasn't provided enough detail publicly to explain why.

It would apply to this individual version. The reason Apple are opposing it is that if they manage to set the precedent that they can't be asked to do it, then they have that to cite in the future. If they lose this time, they can be asked to do it again and again.
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by mousethief:
Dave W. -- good job investigating.

I used only publicly available sources, honest! I swear on the grave of Efrem Zimbalist, Jr.
quote:

"The FBI also gathers, shares, and analyzes intelligence...."

IOW, spying.

Terrorist groups and organized criminals tend to avoid revealing their plans; some amount of secret information gathering seems necessary.

Ricardus
quote:
I don't think one needs evidence or even a suggestion of any specific conspiracy to be concerned about the limits of power. [...] To put it another way, if one waits to see if a power is abused before putting limits on it, one is usually too late.
But what's wrong with the limits proposed? They got a warrant from a judge and they're asking for Apple to do something to one specific phone. If this request on its own is reasonable, I don't think it makes sense to deny it just because they might make an unreasonable request in the future. (That view seems to provide grounds for arguing that no judge should grant any warrant or writ, ever, no matter how justified.)

lowlands_boy
quote:
It would apply to this individual version. The reason Apple are opposing it is that if they manage to set the precedent that they can't be asked to do it, then they have that to cite in the future. If they lose this time, they can be asked to do it again and again.

I would imagine so! But it's not like Apple doesn't already have policies and guidelines for responding to government entities. Unless they can show that it's an unreasonable burden, they can't refuse just because they don't want to.
 
Posted by mousethief (# 953) on :
 
quote:
Originally posted by Dave W.:
Unless they can show that it's an unreasonable burden, they can't refuse just because they don't want to.

This is the strawest of men. It's not because they "don't want to." It's because it would open the door to all manner of abuses of power. A road it's best not to take the first step on.

quote:
Terrorist groups and organized criminals tend to avoid revealing their plans; some amount of secret information gathering seems necessary.
And we have all heard tales of the FBI's use of this power to shut down groups that have nothing to do with terrorism or the mob. "Legalize marijuana" advocates being one that comes to mind immediately.

And of course it's a long-standing joke, at least among people who are awake and do not automatically acquiesce to the status quo, to wonder exactly what is in one's FBI dossier. All such people assuming they have an FBI dossier. And depending on just how out-of-the-status-quo your publicly-stated opinions are, it's not too farfetched an assumption.

[ 21. February 2016, 18:21: Message edited by: mousethief ]
 
Posted by lowlands_boy (# 12497) on :
 
This particular case will cease to matter in due course, because the specific technology is already obsolete. Later versions that allow much longer alphanumeric passwords already exist. Apple can't decrypt the current phone - only make it easier for agencies to cycle through the 10,000 possible passwords. With a longer character based password, the number of permutations is enormous.

So, they should comply in this case because it won't matter much longer.
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by mousethief:
quote:
Originally posted by Dave W.:
Unless they can show that it's an unreasonable burden, they can't refuse just because they don't want to.

This is the strawest of men. It's not because they "don't want to." It's because it would open the door to all manner of abuses of power. A road it's best not to take the first step on.
"If they lose this time, they can be asked to do it again and again." was exactly the argument I was responding to; I don't think this is obviously the same as "all manner of abuses of power," so I plead not guilty to strawman assassination on this point.

But as to opening the door to abuses of power - presumably you'd consider the government's power to execute other kinds of searches to be legitimate, and not necessarily a harbinger of abuse. Assuming you do, what exactly makes this attempt so much more dangerous?
 
Posted by Ricardus (# 8757) on :
 
quote:
Originally posted by Dave W.:
But what's wrong with the limits proposed? They got a warrant from a judge and they're asking for Apple to do something to one specific phone. If this request on its own is reasonable, I don't think it makes sense to deny it just because they might make an unreasonable request in the future. (That view seems to provide grounds for arguing that no judge should grant any warrant or writ, ever, no matter how justified.)

Enoch expressed my concerns very well a few posts ago - a warrant that explicitly requires Apple to do stuff seems to me qualitatively different from a warrant that merely requires Apple not to resist while the FBI take stuff away.

Now I assume that for a 'taking stuff away' warrant to be issued, certain legal thresholds must be met and these are established by the elected legislature. My concern is that a 'do stuff for us' warrant constitutes an entirely new kind of warrant for which no guidelines have ever been laid down. This is what I mean by a lack of limits.
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by lowlands_boy:
This particular case will cease to matter in due course, because the specific technology is already obsolete. Later versions that allow much longer alphanumeric passwords already exist. Apple can't decrypt the current phone - only make it easier for agencies to cycle through the 10,000 possible passwords. With a longer character based password, the number of permutations is enormous.

So, they should comply in this case because it won't matter much longer.

I'm pretty sure a longer passcode was already possible with the 5C; and in any case, from what I've read people are still likely to choose passcodes which are easy to remember - i.e. short, bad passcodes - so brute force attacks usually don't have to try anywhere near all the possible combinations. Even if they make 6 digits the default instead of 4, that would still take the FBI's proposed method only just under a day at most instead of just under 15 minutes.
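To put rough numbers on that, here's a quick sketch. The ~80 ms per attempt is my assumption, based on reporting about the phone's key-derivation hardware, not an official Apple figure:

```python
# Worst-case brute-force time for numeric passcodes, assuming the
# widely reported ~80 ms the key-derivation hardware takes per
# attempt (an assumption, not an Apple-published figure).
ATTEMPT_SECONDS = 0.08

def worst_case_hours(digits: int) -> float:
    """Hours needed to try every passcode of the given length."""
    keyspace = 10 ** digits
    return keyspace * ATTEMPT_SECONDS / 3600

print(f"4 digits: {worst_case_hours(4):.2f} hours")  # about 13 minutes
print(f"6 digits: {worst_case_hours(6):.1f} hours")  # just under a day
```

And those are worst cases; on average the right passcode turns up in half that time, and far sooner against popular choices like 1234 or birth years.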
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Originally posted by Dave W.:
quote:
Originally posted by lowlands_boy:
This particular case will cease to matter in due course, because the specific technology is already obsolete. Later versions that allow much longer alphanumeric passwords already exist. Apple can't decrypt the current phone - only make it easier for agencies to cycle through the 10,000 possible passwords. With a longer character based password, the number of permutations is enormous.

So, they should comply in this case because it won't matter much longer.

I'm pretty sure a longer passcode was already possible with the 5C; and in any case, from what I've read people are still likely to choose passcodes which are easy to remember - i.e. short, bad passcodes - so brute force attacks usually don't have to try anywhere near all the possible combinations. Even if they make 6 digits the default instead of 4, that would still take the FBI's proposed method only just under a day at most instead of just under 15 minutes.
Digits are different from characters. It's 0-9 for digits, but for characters you have 52 upper- and lower-case letters, plus the 0-9, plus punctuation.

That's where the scale ramps up. People may choose stupid passwords, but they may not.
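The scale-up is easy to see with a little arithmetic (character counts taken from Python's string module; the lengths are just illustrative):

```python
import string

# Keyspace sizes as the alphabet grows from digits alone to all
# printable ASCII characters (illustrative arithmetic only).
digits = len(string.digits)                        # 10
alnum = len(string.ascii_letters + string.digits)  # 62
full = alnum + len(string.punctuation)             # 94

for length in (4, 6, 8):
    print(f"length {length}: digits {digits**length:,}, "
          f"alphanumeric {alnum**length:,}, full {full**length:,}")
```

At length 8 the full character set already gives a keyspace of several quadrillion, against ten thousand for a 4-digit PIN.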
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by Ricardus:
My concern is that a 'do stuff for us' warrant constitutes an entirely new kind of warrant for which no guidelines have ever been laid down. This is what I mean by a lack of limits.

I see the distinction you're making, but I think that you're mistaken that there's no precedent for this. One example the government gives in its application for the writ is a case from 1977 in which the Supreme Court ruled that a telephone company was properly compelled under the All Writs Act to assist the FBI in setting up a pen register (a method for recording numbers dialed) in an investigation of illegal gambling.
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by lowlands_boy:
Digits are different from characters. It's 0-9 for digits, but for characters you have 52 upper- and lower-case letters, plus the 0-9, plus punctuation.

Thank you, I'm quite aware of the distinction - but it's irrelevant to my point. Since the ability to enter strong passwords is already available, I don't agree that "it won't matter much longer."
 
Posted by mousethief (# 953) on :
 
Aha! Found it. 9 Chickweed Lane takes on the Creepy National Furtiveness Agency
 
Posted by mousethief (# 953) on :
 
quote:
Originally posted by Dave W.:
But as to opening the door to abuses of power - presumably you'd consider the government's power to execute other kinds of searches to be legitimate, and not necessarily a harbinger of abuse. Assuming you do, what exactly makes this attempt so much more dangerous?

Because as I have said above and nobody seems to have paid any attention to, the government is in apoplexy because there is no back door to cybersecurity, despite the government throwing a hissy fit in court prior to this. This creates the possibility of a back door. It's unprecedented in the cyber world, and something cybersecurity people have been fighting since large-prime encryption was invented.

For God's sake, all you "what's the harm?" types, read Crypto by Steven Levy, so you'll know what you're talking about.
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by mousethief:
quote:
Originally posted by Dave W.:
But as to opening the door to abuses of power - presumably you'd consider the government's power to execute other kinds of searches to be legitimate, and not necessarily a harbinger of abuse. Assuming you do, what exactly makes this attempt so much more dangerous?

Because as I have said above and nobody seems to have paid any attention to, the government is in apoplexy because there is no back door to cybersecurity, despite the government throwing a hissy fit in court prior to this. This creates the possibility of a back door. It's unprecedented in the cyber world, and something cybersecurity people have been fighting since large-prime encryption was invented.
I'm not sure what you mean by "creating the possibility of a back door." Not quite the same as creating a back door, apparently. But doesn't Apple's tacit admission that the FBI's request is feasible imply that the possibility (at least) already exists? If Apple wants to make more secure phones, let them do so.
quote:
For God's sake, all you "what's the harm?" types, read Crypto by Steven Levy, so you'll know what you're talking about.

Yes, I'll get right on that. At the appropriate juncture. In due course. In the fullness of time. Until then, I'd be open to any attempt you'd care to make at explaining more fully at a level somewhere between "Because as I have said above and nobody seems to have paid any attention to" and "go read a 370 page book so you can be as enlightened as me."
 
Posted by Honest Ron Bacardi (# 38) on :
 
mt - I have already agreed with the point about risk of backdoors earlier. But what the FBI has asked for is not a backdoor, whatever media chatter may currently be calling it. It is known as a brute-force entry. To count as a backdoor, a device has to offer an entry that bypasses standard security checking, and there is no proposal to do that.

And it's not just me being picky. This business is going to be tested in court and you can rest assured they will be making the most of that difference. This is a carefully chosen case on their part I suspect. Because a brute-force entry renders issues of cryptography irrelevant.

[ 21. February 2016, 19:50: Message edited by: Honest Ron Bacardi ]
 
Posted by Golden Key (# 1468) on :
 
MT said:
quote:
Do you think there was no domestic spying in the US before Dubya created the NSA?
Actually, per Wikipedia,

quote:
The agency was formally established by Truman in a memorandum of October 24, 1952, that revised National Security Council Intelligence Directive (NSCID) 9.[28] Since President Truman's memo was a classified document,[28] the existence of the NSA was not known to the public at that time. Due to its ultra-secrecy the U.S. intelligence community referred to the NSA as "No Such Agency".[29]
FYI.
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by mousethief:
This creates the possibility of a back door.

This does no such thing. The possibility of a back door already exists, and you only have Apple's word for it that there isn't one.

That remains unchanged, whatever Apple does in this case.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by mousethief:
there is no back door to cybersecurity

In which case, should I assume that Microsoft and Apple are constantly offering me security updates for some nefarious purpose? They keep suggesting they're closing unforeseen back doors for me, but if there are no back doors...

Lord knows there's enough signs they would quite like to walk in the front door and collect my data. Although credit where credit is due, Apple is better than most at giving me options to switch off various kinds of information sharing. They of course tend to default to 'on' rather than 'off', and I only have their word for it that switching to 'off' actually works, but I guess it's something.

[ 21. February 2016, 23:24: Message edited by: orfeo ]
 
Posted by mousethief (# 953) on :
 
quote:
Originally posted by Golden Key:
MT said:
quote:
Do you think there was no domestic spying in the US before Dubya created the NSA?
Actually, per Wikipedia,

quote:
The agency was formally established by Truman in a memorandum of October 24, 1952, that revised National Security Council Intelligence Directive (NSCID) 9.[28] Since President Truman's memo was a classified document,[28] the existence of the NSA was not known to the public at that time. Due to its ultra-secrecy the U.S. intelligence community referred to the NSA as "No Such Agency".[29]
FYI.

Sorry I got it mixed up with TSA & Homeland Security.

Orfeo: There is no back door to a large-prime-based public/private key system.
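By way of illustration, a toy version of such a system (tiny primes so the numbers fit on screen; real keys use primes hundreds of digits long, and recovering the private exponent without knowing the factors of n is what's believed to be infeasible):

```python
# Toy RSA with tiny primes - shows the structure only, utterly insecure.
p, q = 61, 53                 # the "large" primes (tiny here)
n = p * q                     # 3233, the public modulus
phi = (p - 1) * (q - 1)       # 3120, kept secret along with p and q
e = 17                        # public exponent, coprime to phi
d = pow(e, -1, phi)           # 2753, private exponent (Python 3.8+)

msg = 42
cipher = pow(msg, e, n)       # anyone can encrypt with (e, n)
plain = pow(cipher, d, n)     # only the holder of d can decrypt
print(plain == msg)           # prints True
```

There's no third number hidden in the math that unlocks it; either you know d (equivalently, the factors of n) or you're stuck factoring.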
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by mousethief:

Orfeo: There is no back door to a large-prime-based public/private key system.

...assuming your friendly client isn't stashing a copy of your private key somewhere for you, or choosing deliberately weak keys or something. The math is sound, but how much do you trust the tool to be doing what it says it's doing?
 
Posted by mousethief (# 953) on :
 
NOW who's spinning conspiracy theories?
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by mousethief:
NOW who's spinning conspiracy theories?

If you are afraid that "someone" (whether that's the FBI, the NSA, the Chinese government or whoever) might have a back door installed in your phone's operating system, but don't think that there's any chance that they'll install a back door in your encryption software, which you probably obtained from the same source, then I think you're a bit confused.

[ 22. February 2016, 03:36: Message edited by: Leorning Cniht ]
 
Posted by mousethief (# 953) on :
 
quote:
Originally posted by Leorning Cniht:
quote:
Originally posted by mousethief:
NOW who's spinning conspiracy theories?

If you are afraid that "someone" (whether that's the FBI, the NSA, the Chinese government or whoever) might have a back door installed in your phone's operating system, but don't think that there's any chance that they'll install a back door in your encryption software, which you probably obtained from the same source, then I think you're a bit confused.
Ah, the ol' deflect-and-dodge. Nice.
 
Posted by Dave W. (# 8765) on :
 
Apple's latest "Answers to your questions about Apple and security", in which Apple suggests the government withdraw its demand and form a commission to "discuss the implications".
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by mousethief:
Ah, the ol' deflect-and-dodge. Nice.

No, it's an entirely serious thought. I'm not spinning conspiracy theories - I'm pointing out that if your conspiracy theory centers around a potential back door in your phone software, and merrily assumes that there can't be a back door in another bit of your phone's software, then your theory is a trifle lacking in internal consistency.

This particular case is nothing to do with back doors of any kind, so the whole back-door tangent is pretty much irrelevant except as the end-game to a slippery-slope argument.
 
Posted by lilBuddha (# 14333) on :
 
What is the name for a fallacious reference to a logical fallacy?
No one, as far as I recall, is saying this is a first/further step down a slippery slope. We* are saying that this step, in and of itself, is a problem.
We are also saying that there should be limits on how much freedom one should be required to relinquish.
This discussion is about those limits, not a tinfoil hat fear gathering.

*We as in the majority of the posters thus far who don't side with the FBI.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by lilBuddha:
No one, as far as I recall, is saying this is a first/further step down a slippery slope. We* are saying that this step, in and of itself, is a problem.

The first step is access to a specific phone. Everything past that - every assertion that this will lead to access to other phones - IS a slippery slope argument. Whether you recognise it or not.

There have been several references to things such as mass surveillance programs. Which would be relevant if the FBI was seeking authorisation of a mass surveillance program. They're not. They're asking for access to the data on one phone, and every single statement about what that might lead to is a slippery slope argument.

[ 23. February 2016, 01:14: Message edited by: orfeo ]
 
Posted by lilBuddha (# 14333) on :
 
quote:
Originally posted by orfeo:
quote:
Originally posted by lilBuddha:
No one, as far as I recall, is saying this is a first/further step down a slippery slope. We* are saying that this step, in and of itself, is a problem.

The first step is access to a specific phone. Everything past that - every assertion that this will lead to access to other phones - IS a slippery slope argument. Whether you recognise it or not.

There have been several references to things such as mass surveillance programs. Which would be relevant if the FBI was seeking authorisation of a mass surveillance program. They're not. They're asking for access to the data on one phone, and every single statement about what that might lead to is a slippery slope argument.

It would be if they did not already have a record of abusing the legal limits of their authority.
 
Posted by Golden Key (# 1468) on :
 
Sometimes, slopes really *are* slippery.
 
Posted by Dave W. (# 8765) on :
 
That sounds pretty slippery to me.

Per that Wikipedia article, I note that such an argument isn't necessarily a fallacy. (After all, some slopes really are dangerous because of their slipperiness.) But without more detail connecting the dots than I've seen so far, I don't find convincing a generalized suspicion of the FBI that could seemingly be used to delegitimize nearly any action they might take.
 
Posted by Dave W. (# 8765) on :
 
Hey! I was going to say that!
 
Posted by W Hyatt (# 14250) on :
 
The FBI is not asking for new software to do a mass search in this case, but they are asking for new software (as I think Enoch and Ricardus pointed out) and while they are only asking for it to brute-force security on a single phone, the new software they're asking for would be very close to a general solution to making Apple's encryption easy to bypass. I don't see how it's so clear-cut either way.

It's not a mass search, and its application in this one case is not an abuse of power, but the fact that it requires new software does make it seem like it would be an entirely new precedent, and the question is how close it would come to inviting the abuses of mass searches. So I think the question is ultimately one of whether or not it qualifies as a dangerously slippery slope, and part of the answer to that depends on how much you trust the government agencies that will inevitably push to have as many tools as they are allowed to have. We already know "they" will conduct mass searches if they decide they're allowed to, but would coercing Apple to write the new software establish a precedent that could be used in the future to demand new software that could help them do mass searches? Or would it establish only a limited precedent that would not apply to more general searches?

Again, I don't see how it's so clear-cut either way, although I'm just a lay observer with a superficial understanding of all the intricacies.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by lilBuddha:
quote:
Originally posted by orfeo:
quote:
Originally posted by lilBuddha:
No one, as far as I recall, is saying this is a first/further step down a slippery slope. We* are saying that this step, in and of itself, is a problem.

The first step is access to a specific phone. Everything past that - every assertion that this will lead to access to other phones - IS a slippery slope argument. Whether you recognise it or not.

There have been several references to things such as mass surveillance programs. Which would be relevant if the FBI was seeking authorisation of a mass surveillance program. They're not. They're asking for access to the data on one phone, and every single statement about what that might lead to is a slippery slope argument.

It would be if they did not already have a record of abusing the legal limits of their authority.
It's still a slippery slope argument. The difference is that you assert a history of lubrication.

Dave W has it right.

[ 23. February 2016, 03:22: Message edited by: orfeo ]
 
Posted by Dave W. (# 8765) on :
 
In support of its claim that the order does not place an unreasonable burden on Apple, the government says "providers of electronic communications services and remote computing services are sometimes required to write code in order to gather information in response to subpoenas or other process." (p. 15)
 
Posted by W Hyatt (# 14250) on :
 
For example, if the precedent becomes "the only criterion for demanding new software is that its development is determined not to be an undue burden," that sounds to me like a very dangerous precedent.
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by lilBuddha:
We* are saying that this step, in and of itself, is a problem.

This step, in and of itself, is the FBI wanting to examine the contents of a phone, and having a court order Apple to assist them.

This is exactly the same as if Apple was the manufacturer of a sophisticated safe, and the court ordered Apple to help the FBI break in to the safe.

It is not magically different because computer, and there isn't even the remotest shred of an argument that your telephone should have more rights to privacy than your filing cabinet.

Discussions about mass data mining are real - but that's not what this is, and this does nothing to enable that.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by W Hyatt:
For example, if the precedent becomes "the only criterion for demanding new software is that its development is determined not to be an undue burden," that sounds to me like a very dangerous precedent.

Good thing that's not the actual criterion, then.

We are dealing with powers of a law enforcement authority to obtain evidence. There are situations where a warrant or equivalent is required. You have to go to a judge to obtain a warrant or equivalent. One of the factors that a judge will take into account is whether the warrant or equivalent will place an undue burden on someone.

That is simply not the same thing as "law enforcement can ask for anything that isn't too inconvenient". Why? Because to even get as far as asking, they have to deal with questions about what power they have to seek evidence, and for what purpose, and how useful the evidence would be.

No-one is focusing on those factors because we all take for granted that the FBI has a legitimate interest in investigating a couple of mass murderers. But there is a vast conceptual leap between their power to investigate mass murderers and their power to investigate just anyone they feel like investigating, and the whole point of warrants and such like is to prevent that leap.

It's THAT which protects from random requests for demanding new software. Not some new-fangled notion that law enforcement ought not to be able to compel people to do things.
 
Posted by W Hyatt (# 14250) on :
 
Theoretically, is there any software the FBI should not be allowed to have in its arsenal for collecting information?

I understand what you're saying up until you refer to new software. When information is on a physical medium (e.g. paper), it's clear when a warrant is needed. But when information is digital, it won't necessarily be obvious when the FBI ignores the need for a warrant. If they have a software tool that's supposed to be used only in combination with a warrant, what's to stop them from using it without a warrant? Any evidence they gather won't be admissible, but once they have it, figuring out how to obtain the same information legally becomes much easier, so it will always be a temptation to skip the warrant.

It seems to me to be somewhat parallel with devices police departments use to identify everyone's cell phone in a specific area. It's a form of mass search that some judges seem to think is perfectly fine, but from an information point of view, it's the same as stopping everyone at that location and demanding to see their identification. Maybe that's actually legal, but it seems to me that software has a tendency to blur lines that used to be distinct, and I'm not at all sure that judges generally recognize that to be the case.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by W Hyatt:
Theoretically, is there any software the FBI should not be allowed to have in its arsenal for collecting information?

I understand what you're saying up until you refer to new software. When information is on a physical medium (e.g. paper), it's clear when a warrant is needed. But when information is digital, it won't necessarily be obvious when the FBI ignores the need for a warrant. If they have a software tool that's supposed to be used only in combination with a warrant, what's to stop them from using it without a warrant? Any evidence they gather won't be admissible, but once they have it, figuring out how to obtain the same information legally becomes much easier, so it will always be a temptation to skip the warrant.

It seems to me to be somewhat parallel with devices police departments use to identify everyone's cell phone in a specific area. It's a form of mass search that some judges seem to think is perfectly fine, but from an information point of view, it's the same as stopping everyone at that location and demanding to see their identification. Maybe that's actually legal, but it seems to me that software has a tendency to blur lines that used to be distinct, and I'm not at all sure that judges generally recognize that to be the case.

Your entire question is premised on the notions that whatever Apple writes to unlock this particular phone will inevitably be (1) in the possession of the FBI thereafter, and (2) will be useful to unlock any other phone.

It's far from clear that either of those notions is actually valid.

I'm aware that there are key differences between digital environments and physical ones that make copying/repeatability more likely, but I'm still sceptical that what the FBI is actually asking for is a tool that will then be in their possession.

As imperfect as physical analogies are, there's an important difference between asking a safe manufacturer to unlock one safe (a task they might carry out using a master key), and asking a safe manufacturer to provide a master key.

[ 23. February 2016, 06:13: Message edited by: orfeo ]
 
Posted by W Hyatt (# 14250) on :
 
I understand what you're saying, but I'm not assuming (1) and regarding (2), it will probably be useful to unlock any similar phone with the simplest of software changes. Yes, it will be in Apple's possession, but I'm reluctant to conclude that therefore Apple should simply cooperate. I'm not saying this case is an abuse of legal authority, but I'm not convinced it's clearly a case of a routine search warrant. I'm also not saying the FBI shouldn't make the request. I'm only wondering if our court system should grant the request in the end.

I think it was Eutychus who told us about an inmate (probably in France?) who was incarcerated at least in part because the police determined that the "suspect" was in a particular area but had his cell phone turned off, which they knew because they couldn't identify him with the kind of device I mentioned above. The court was apparently willing to accept that as evidence of guilt and I have no doubt something similar could occur in the U.S. to almost any young black male in a poor urban location. This is the kind of "what if" I have in mind when I wonder what tools we want to allow law enforcement to have.

This case wouldn't directly give the FBI a new tool, but it would create new software and I have to wonder about the possible precedent for other new software. Can we count on courts being careful about who owns the software? Should the courts consider the possibility that someone might obtain an illegal copy? Can we count on the courts to only demand new software that targets individuals? What if the only way for the FBI to get some pertinent information it knows exists is by doing a mass search and then filtering the results? Will the courts prevent them from demanding that a manufacturer supply new software to do such a search?

From what I understand about legal precedents, it's as much what's written in the court opinion in support of the decision as what the final decision is. I'd probably be fine with granting the FBI's request in this case if the opinion (or whatever it's called) is written in a way that establishes a sufficiently narrow precedent for demanding new software, but not if the precedent is too broad.
 
Posted by Eutychus (# 3081) on :
 
quote:
Originally posted by W Hyatt:
I think it was Eutychus who told us about an inmate (probably in France?) who was incarcerated at least in part because the police determined that the "suspect" was in a particular area but had his cell phone turned off, which they knew because they couldn't identify him with the kind of device I mentioned above. The court was apparently willing to accept that as evidence of guilt

For the record:

I know people who have been remanded (but not necessarily convicted)
a) because their cell phone was on and located near the scene of the crime
b) because their cell phone was on and located 60 miles away from the crime
c) because their cell phone was off at the time of the crime, irrespective of their presumed location.

The conclusion I draw from this (and many other cases) has a lot less to do with technology than it does with the way prosecutions are obtained. Once an investigating judge has you in their sights, they will use whatever material evidence there is to build a case against you - or if they don't, the prosecutor will.*

I don't have any expertise to contribute to the debate here, except to say that AIUI your phone can be located even when it's off if the battery is still in it. And if it can't be located, that might make you a person of interest, not exonerate you.

=

*If you think this could never happen to you, for a completely scary video on this topic from the perspective of the US justice system, watch Don't talk to police. 48 minutes but worth every one of them. Note: this is not posted as legal advice but as an illustration of how one defense attorney and one cop view the US system as it was in 2008.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by W Hyatt:
Can we count on the courts to only demand new software that targets individuals? What if the only way for the FBI to get some pertinent information it knows exists is by doing a mass search and then filtering the results? Will the courts prevent them from demanding that a manufacturer supply new software to do such a search?

Probably. Maybe. It's hypothetical, and I'm not going to claim I can read the mind of every judge (and every police officer for that matter) from here until the end of time.

And I think that's one of the problems I have with many of these kinds of arguments. Does everyone want 100% guarantees? Sorry, I can't give them. Any more than I can guarantee you that some hacker won't write exactly the software that the FBI wants and then release it on the black market. I can't guarantee that some person with Apple won't devise some way of mining the data off your phone that is deeply unethical.

I can't promise you that a President of the United States won't authorise hotel break-ins and the erasure of tapes, either. Or that Volkswagen won't fake the results of environmental tests.

Laws can't actually physically stop people from acting unlawfully. They can only set out the procedures for lawful action, and the legal consequences for unlawful action. We have laws, for example, about the exclusion of evidence that is unlawfully obtained.

There is a point where you have to shift away from "what are the things that could go wrong under this system" and start asking "is there any other better system". People keep raising the spectre of a system where law enforcement can access anything, anywhere, but frankly I'm not convinced that a system where there are things that law enforcement can never access in any circumstance is any kind of improvement.

News of places where law enforcement may never go because the software is impenetrable and they can't ask for help penetrating it would inevitably spread just as fast as, if not faster than, any software that enabled penetration.

For all Apple trumpets this value of absolute privacy, does it actually want to be known as the preferred tool of criminals? Because that is what would happen. But my main concern is not about their reputation, it's about the balance that has to be struck. I don't think complete inaccessibility strikes the right balance, any more than I think complete accessibility strikes the right balance. Having to justify access in a given case seems entirely appropriate to me, and while judges are not perfect - especially not in the USA where becoming a judge can involve a popularity contest - the entire point of the job is to make these kinds of tricky decisions.

[ 23. February 2016, 08:06: Message edited by: orfeo ]
 
Posted by orfeo (# 13878) on :
 
I also have to question Apple's stance on a level that has nothing to do with law enforcement, in that the actual owner of the phone is not the person who was using it. Forget the FBI: does Apple have no power or obligation to help the owner of the phone? If the county provided the phone for work purposes, doesn't that mean that the phone AND at least part of what's on it actually belong to the county?

Plus, what does Apple's attitude mean for inheritance law? If I die and you inherit my phone, and it's clear that you've legally inherited my phone, but you don't have access to my password or my fingerprint, are Apple able to refuse you help? Are they entitled to insist that you can only use the phone if you first wipe anything from it that I had on there?

Do you have to buy all the apps again, all the music again?

This latter point is actually the kind of thing that makes me deeply sceptical about the whole move to digitisation. We've been cunningly moved towards a space where individuals own less and less of what they used to. I can sell you my CDs and books, or give them to you. That's one of the signs of my ownership, that I can choose how to dispose of my property. Much of the digital environment is built around forcing each person to make a purchase from the original seller and to erase any notion of a second-hand market. If you don't want something any more, you can't sell/give it to someone, and it just becomes dead weight.

There's also a risk that you might have to buy something you 'own' a second time. Certain actions like moving countries or changing too many components of your computer can trigger this.

In circumstances like the current case Apple is able to trumpet how they're protecting what's yours, but in many other circumstances people discover that the company holds all the cards.

[ 23. February 2016, 08:15: Message edited by: orfeo ]
 
Posted by mousethief (# 953) on :
 
quote:
Originally posted by Leorning Cniht:
quote:
Originally posted by mousethief:
Ah, the ol' deflect-and-dodge. Nice.

No, it's an entirely serious thought. I'm not spinning conspiracy theories - I'm pointing out that if your conspiracy theory centers around a potential back door in your phone software, and merrily assumes that there can't be a back door in another bit of your phone's software, then your theory is a trifle lacking in internal consistency.

This particular case is nothing to do with back doors of any kind, so the whole back-door tangent is pretty much irrelevant except as the end-game to a slippery-slope argument.

It's not a conspiracy to expect spy agencies will continue spying. It's not naive to expect that companies whose very existence depends on the security of their security systems will work to make and keep them secure.

What is naive is to accept at face value the government's assurance that it's "just this one phone." What a crock of shit.
 
Posted by mousethief (# 953) on :
 
As for the "slippery slope" phrasing, if that's bad I suppose you will have to throw out forever the argument that "this sets a bad precedent." Because that's just another way of saying "this is the first step down a slippery slope."

Therefore there is no such thing as a bad precedent, and that should never be a consideration for judging the advisability of any action.
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Originally posted by mousethief:
quote:
Originally posted by Leorning Cniht:
quote:
Originally posted by mousethief:
Ah, the ol' deflect-and-dodge. Nice.

No, it's an entirely serious thought. I'm not spinning conspiracy theories - I'm pointing out that if your conspiracy theory centers around a potential back door in your phone software, and merrily assumes that there can't be a back door in another bit of your phone's software, then your theory is a trifle lacking in internal consistency.

This particular case is nothing to do with back doors of any kind, so the whole back-door tangent is pretty much irrelevant except as the end-game to a slippery-slope argument.

It's not a conspiracy to expect spy agencies will continue spying. It's not naive to expect that companies whose very existence depends on the security of their security systems will work to make and keep them secure.

What is naive is to accept at face value the government's assurance that it's "just this one phone." What a crock of shit.

Apple are not, as far as I know, the government (although they have more money than lots of governments). Apple do not dispute that they can produce the modified software for one phone.

So do you think Apple will just mess it up, or that Apple will just not bother insisting on a warrant in future, or what?
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by mousethief:
As for the "slippery slope" phrasing, if that's bad I suppose you will have to throw out forever the argument that "this sets a bad precedent." Because that's just another way of saying "this is the first step down a slippery slope."

Therefore there is no such thing as a bad precedent, and that should never be a consideration for judging the advisability of any action.

A precedent is "an example or guide to be used in similar circumstances", but you seem to be arguing that the danger is that a ruling for the FBI would embolden it in other circumstances; this is different from a bad precedent, which is something that shouldn't be repeated even under similar circumstances.
 
Posted by Marvin the Martian (# 4360) on :
 
quote:
Originally posted by orfeo:
We have laws, for example, about the exclusion of evidence that is unlawfully obtained.

I'm pretty sure it would come as cold comfort to any "terrorism suspect" locked up without trial in Guantanamo Bay that the evidence that put him there was unlawfully obtained and thus would not be admissible in court.
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by Marvin the Martian:
quote:
Originally posted by orfeo:
We have laws, for example, about the exclusion of evidence that is unlawfully obtained.

I'm pretty sure it would come as cold comfort to any "terrorism suspect" locked up without trial in Guantanamo Bay that the evidence that put him there was unlawfully obtained and thus would not be admissible in court.
If that's what you're worried about, I think you're well beyond caring whether this judge is correctly applying the All Writs Act.
 
Posted by RuthW (# 13) on :
 
quote:
Originally posted by Dave W.:
quote:
Originally posted by mousethief:
As for the "slippery slope" phrasing, if that's bad I suppose you will have to throw out forever the argument that "this sets a bad precedent." Because that's just another way of saying "this is the first step down a slippery slope."

Therefore there is no such thing as a bad precedent, and that should never be a consideration for judging the advisability of any action.

A precedent is "an example or guide to be used in similar circumstances", but you seem to be arguing that the danger is that a ruling for the FBI would embolden it in other circumstances; this is different from a bad precedent, which is something that shouldn't be repeated even under similar circumstances.
The FBI wants Apple to break into 12 other iPhones, and the Manhattan DA wants Apple to break into 175 phones - article in The Atlantic. So the slope looks pretty slippery to me.
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Originally posted by RuthW:
quote:
Originally posted by Dave W.:
quote:
Originally posted by mousethief:
As for the "slippery slope" phrasing, if that's bad I suppose you will have to throw out forever the argument that "this sets a bad precedent." Because that's just another way of saying "this is the first step down a slippery slope."

Therefore there is no such thing as a bad precedent, and that should never be a consideration for judging the advisability of any action.

A precedent is "an example or guide to be used in similar circumstances", but you seem to be arguing that the danger is that a ruling for the FBI would embolden it in other circumstances; this is different from a bad precedent, which is something that shouldn't be repeated even under similar circumstances.
The FBI wants Apple to break into 12 other iPhones, and the Manhattan DA wants Apple to break into 175 phones - article in The Atlantic. So the slope looks pretty slippery to me.
All of which appear to be in front of a judge, in the normal way, who can evaluate and approve or deny them.

Is every single "wire tap" request approved because one was at some point? Or every search warrant for a premises?

How many other devices that aren't iPhones have law enforcement agencies seized and manipulated themselves?

How does the number of "phone decryption" cases compare to all warrants issued over the same time period?
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Marvin the Martian:
quote:
Originally posted by orfeo:
We have laws, for example, about the exclusion of evidence that is unlawfully obtained.

I'm pretty sure it would come as cold comfort to any "terrorism suspect" locked up without trial in Guantanamo Bay that the evidence that put him there was unlawfully obtained and thus would not be admissible in court.
Yes.

Though I'm not sure how pertinent a point that is in the current context. There seems to be a belief that because I'm arguing in favour of a particular form of executive government action, I must be in favour of all the most extreme forms of government action.

I am in fact arguing for judicial supervision of the executive. Which is sorely lacking in Guantanamo.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by RuthW:
quote:
Originally posted by Dave W.:
quote:
Originally posted by mousethief:
As for the "slippery slope" phrasing, if that's bad I suppose you will have to throw out forever the argument that "this sets a bad precedent." Because that's just another way of saying "this is the first step down a slippery slope."

Therefore there is no such thing as a bad precedent, and that should never be a consideration for judging the advisability of any action.

A precedent is "an example or guide to be used in similar circumstances", but you seem to be arguing that the danger is that a ruling for the FBI would embolden it in other circumstances; this is different from a bad precedent, which is something that shouldn't be repeated even under similar circumstances.
The FBI wants Apple to break into 12 other iPhones, and the Manhattan DA wants Apple to break into 175 phones - article in The Atlantic. So the slope looks pretty slippery to me.
They elected one black man as President, now there are others who want the job.

Slippery slope? Yes, based on your definition of the slope.

[ 23. February 2016, 22:30: Message edited by: orfeo ]
 
Posted by RuthW (# 13) on :
 
Apples and oranges, orfeo. Electing a black man president doesn't establish a legal precedent for future elections.
 
Posted by orfeo (# 13878) on :
 
Well, I think it's you that is comparing apples and oranges, because the notion of a 'precedent' is completely different to the notion of a 'slippery slope'.

The point of a precedent is that something has been done before. The point of a slippery slope is that it might lead to something different, and bigger, in the future.

The FBI making a dozen more individual applications for unlocking an iPhone after the one application has been granted is precedent. A slippery slope would be the FBI not having to make individual applications any more, or never being knocked back by a judge.

It's the difference between how many times a principle might be used and actually changing the principle.

[ 23. February 2016, 23:54: Message edited by: orfeo ]
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by RuthW:
quote:
Originally posted by Dave W.:
quote:
Originally posted by mousethief:
As for the "slippery slope" phrasing, if that's bad I suppose you will have to throw out forever the argument that "this sets a bad precedent." Because that's just another way of saying "this is the first step down a slippery slope."

Therefore there is no such thing as a bad precedent, and that should never be a consideration for judging the advisability of any action.

A precedent is "an example or guide to be used in similar circumstances", but you seem to be arguing that the danger is that a ruling for the FBI would embolden it in other circumstances; this is different from a bad precedent, which is something that shouldn't be repeated even under similar circumstances.
The FBI wants Apple to break into 12 other iPhones, and the Manhattan DA wants Apple to break into 175 phones - article in The Atlantic. So the slope looks pretty slippery to me.
I didn't expect this to be a unique case, never to be repeated. I'm not sure how to judge whether the number of phones cited by The Atlantic is large or small - according to the US Federal Courts 3,554 wiretap orders were reported in 2014.* Smartphones are pretty much ubiquitous, so it's not surprising that other warrants will have been issued in similar circumstances - but if the circumstances really are similar, these additional warrants will leave us at the same spot on the slope as with the San Bernardino case.

*The report includes this brief note on encryption:
quote:
The number of state wiretaps in which encryption was encountered decreased from 41 in 2013 to 22 in 2014. In two of these wiretaps, officials were unable to decipher the plain text of the messages. Three federal wiretaps were reported as being encrypted in 2014, of which two could not be decrypted. Encryption was also reported for five federal wiretaps that were conducted during previous years, but reported to the AO for the first time in 2014. Officials were able to decipher the plain text of the communications in four of the five intercepts.

 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by mousethief:
It's not naive to expect that companies whose very existence depends on the security of their security systems will work to make and keep them secure.

But it's not secure. They can write software to defeat the self-destruct feature. Whether they actually have or not is completely irrelevant to the actual security of the phone.

quote:

What is naive is to accept at face value the government's assurance that it's "just this one phone." What a crock of shit.

This court order is about one phone. Of course there will be other phones, each with their own court order, and I still don't see why searching through someone's phone is magically worse than searching through their filing cabinet.

This isn't a data-mining fishing expedition - it's an order to search one phone.
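[Editorial aside: the reason Apple, and only Apple, can "write software to defeat the self-destruct feature" is that an iPhone refuses to run system software that doesn't carry Apple's cryptographic signature. The sketch below is a hedged, stdlib-only illustration of that gatekeeping idea - all names (`VENDOR_KEY`, `vendor_sign`, `device_will_boot`) are hypothetical, and an HMAC over a private key stands in for Apple's real asymmetric signature scheme.]

```python
# Illustrative sketch, NOT Apple's real mechanism: a device that only
# boots firmware signed with the vendor's private key. An HMAC stands
# in for the asymmetric signatures iOS actually uses.
import hmac
import hashlib

VENDOR_KEY = b"apple-private-signing-key"  # hypothetical; never leaves the vendor


def vendor_sign(firmware: bytes) -> bytes:
    """Only the holder of VENDOR_KEY can produce a valid signature."""
    return hmac.new(VENDOR_KEY, firmware, hashlib.sha256).digest()


def device_will_boot(firmware: bytes, signature: bytes) -> bool:
    """The device verifies the signature before running any firmware."""
    expected = hmac.new(VENDOR_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)


official = b"vendor build with retry limit disabled"
fbi_patch = b"third-party build with retry limit disabled"

print(device_will_boot(official, vendor_sign(official)))  # True
print(device_will_boot(fbi_patch, b"\x00" * 32))          # False: unsigned, won't run
```

This is why the court order targets Apple rather than some forensics contractor: anyone can write the modified software, but only Apple can sign it so the phone will accept it.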
 
Posted by mousethief (# 953) on :
 
quote:
Originally posted by Leorning Cniht:
I still don't see why searching through someone's phone is magically worse than searching through their filing cabinet.

How about this: nobody expects that filing cabinets will be secure against government intrusion. The filing cabinet industry was not created to keep filing cabinets secure from snooping, and did not create a filing cabinet security system that has the government shitting its fucking pants because there is no way through it. The ignorance on this thread about the actual history of this issue is choking. Absofuckinglutely CHOKING.

quote:
Originally posted by orfeo:
The FBI making a dozen more individual applications for unlocking an iPhone after the one application has been granted is precedent. A slippery slope would be the FBI not having to make individual applications any more, or never being knocked back by a judge.

Children, gather around. Who can tell me if this is a straw man fallacy, or a false analogy? Suzie?
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by mousethief:
quote:
Originally posted by Leorning Cniht:
I still don't see why searching through someone's phone is magically worse than searching through their filing cabinet.

How about this: nobody expects that filing cabinets will be secure against government intrusion. The filing cabinet industry was not created to keep filing cabinets secure from snooping, and did not create a filing cabinet security system that has the government shitting its fucking pants because there is no way through it.

You bought a phone expecting it would be secure against a search warrant? And you think the smartphone industry was created to keep smartphones secure from snooping?
quote:
The ignorance on this thread about the actual history of this issue is choking. Absofuckinglutely CHOKING.

Oh right, right - I seem to recall you once read a book on it, didn't you? That's really very impressive!
 
Posted by mousethief (# 953) on :
 
I was unaware that knowledge is risible. This must be the all-Republican Ship of Fools thread.
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Originally posted by mousethief:
I was unaware that knowledge is risible. This must be the all-Republican Ship of Fools thread.

Unlikely, especially for participants like me who aren't in the US. Perhaps you could try explaining your points of view, rather than just constantly expressing your sympathy/contempt for people who don't already hold it?

It's very hard for pupils to learn when the teacher just tells them they're stupid and never explains what the right answer is, let alone how to arrive at it....
 
Posted by orfeo (# 13878) on :
 
Honestly mousethief, I find the proposition that phones are purchased with security in mind and filing cabinets are not to be quite bizarre. If anything, when discussing the purposes of the two objects I'd say the exact opposite. Phones are communication tools. Cabinets are storage tools. Cabinets had locks on them long before anyone thought phones needed locks.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by mousethief:
Children, gather around. Who can tell me if this is a straw man fallacy, or a false analogy? Suzie?

This seems like an adept way of trying to cast doubt upon my statement without going to the effort of actually articulating what's wrong with it.
 
Posted by Golden Key (# 1468) on :
 
ISTM:

--Filing cabinets are generally kept in an area over which you have some control. (E.g., locked room.) Thieves generally aren't going to haul a whole filing cabinet away. So a basic, visible lock on a cabinet may be enough to deter a thief--unless they know how to pick a lock, or they're determined enough to use a pry bar or small explosives to forcibly open it.

--Cell phones are purposely portable, which makes them much easier to steal. They're not only communication devices, but storage devices. Even a basic pre-paid, non-smart phone stores contacts, logs, messages, photos, and maybe ID and credit info. So it needs at least security options. And, since it broadcasts, rather than using a wire, it needs security on that level, too.

So yes, cell phones should be very secure.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Golden Key:
ISTM:

--Filing cabinets are generally kept in an area over which you have some control. (E.g., locked room.) Thieves generally aren't going to haul a whole filing cabinet away. So a basic, visible lock on a cabinet may be enough to deter a thief--unless they know how to pick a lock, or they're determined enough to use a pry bar or small explosives to forcibly open it.

--Cell phones are purposely portable, which makes them much easier to steal. They're not only communication devices, but storage devices. Even a basic pre-paid, non-smart phone stores contacts, logs, messages, photos, and maybe ID and credit info. So it needs at least security options. And, since it broadcasts, rather than using a wire, it needs security on that level, too.

So yes, cell phones should be very secure.

I was talking about purpose. When you think about information that you need to keep secure, where do you think about putting it?

I strongly suspect the answer is not an iPhone, although apparently Apple wants you to start thinking that. You put data in your iPhone for the purpose of portability and convenience. The need for some security flows from that, in the ways you've described. But it's not the reason for the phone's existence nor the reason you decide to buy a phone.

You might well avoid buying a particular brand of phone on the grounds that it's not as secure as other brands of phone, but that's not really the same thing as going out to buy a phone with security being one of the goals you're trying to achieve.

[ 24. February 2016, 09:09: Message edited by: orfeo ]
 
Posted by chris stiles (# 12641) on :
 
quote:
Originally posted by orfeo:

I was talking about purpose. When you think about information that you need to keep secure, where do you think about putting it?

I strongly suspect the answer is not an iPhone, although apparently Apple wants you to start thinking that. You put data in your iPhone for the purpose of portability and convenience. The need for some security flows from that

Well, as the post you are responding to points out, at least some information that you are likely to want to keep secure is created on the phone itself. Secondly, regardless of where it is stored, the phone itself needs to be secure - unless the user never accesses personal information from their phone. The need for security is driven by access - not necessarily storage itself.
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by mousethief:
I was unaware that knowledge is risible. This must be the all-Republican Ship of Fools thread.

It's not knowledge I'm mocking. And I'm sure that, if asked, Donald Trump would assure us that he's read plenty of books, too.

But don't hide your lamp under a bushel, mousethief - how about digging into that deep reserve of knowledge and responding to questions about statements you made?

quote:
Originally posted by mousethief:
quote:
Originally posted by Leorning Cniht:
I still don't see why searching through someone's phone is magically worse than searching through their filing cabinet.

How about this: nobody expects that filing cabinets will be secure against government intrusion. The filing cabinet industry was not created to keep filing cabinets secure from snooping, and did not create a filing cabinet security system that has the government shitting its fucking pants because there is no way through it.

You bought a phone expecting it would be secure against a search warrant? And you think the smartphone industry was created to keep smartphones secure from snooping?
 
Posted by mousethief (# 953) on :
 
quote:
Originally posted by Dave W.:
But don't hide your lamp under a bushel, mousethief - how about digging into that deep reserve of knowledge and responding to questions about statements you made?

Why should I? When I try, what I say is fed back to me in a garbled state, such as this:

quote:
You bought a phone expecting it would be secure against a search warrant? And you think the smartphone industry was created to keep smartphones secure from snooping?
This entire thread is one colossal straw man.
 
Posted by quetzalcoatl (# 16740) on :
 
Do people actually store private stuff on their lap-tops and phones? Gulp. Actually, double gulp.
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Originally posted by mousethief:
quote:
Originally posted by Dave W.:
But don't hide your lamp under a bushel, mousethief - how about digging into that deep reserve of knowledge and responding to questions about statements you made?

Why should I? When I try, what I say is fed back to me in a garbled state, such as this:

quote:
You bought a phone expecting it would be secure against a search warrant? And you think the smartphone industry was created to keep smartphones secure from snooping?
This entire thread is one colossal straw man.

Because at least three of us have made points that we're really interested in continuing to discuss? You just don't seem to be able to stay on the same topic as the rest of us, though.
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by mousethief:
How about this: nobody expects that filing cabinets will be secure against government intrusion.

So your argument is that the law should treat your phone differently from your filing cabinet because your phone comes with a "government, don't look in here" sticker?

I understand that governments don't like strong encryption. I understand that certain government agencies in particular would be very happy to have a back door into your encryption (sidenote: to that kind of agency, encryption that you think is secure but isn't is very much more useful than encryption you don't trust).

This case isn't asking for strong encryption to be banned. It's not asking for back doors to be incorporated into anyone's encryption.

It is, quite explicitly, saying "we have legal authority to search the contents of this phone, but we don't have the technical ability. Apple has the technical ability, so we want to force them to cooperate with us".

Apple don't like it because they've been telling you that they can't do what we know they can, and now they are reduced to telling you that they won't do it.

To me, you seem to be arguing that government agencies should not be allowed to try to look at information that people have tried to hide from the government. But you can't really be trying to argue that, because that argument makes no sense, so you must mean something else.

I stand by my assertion that the presence of a computer in this case is pretty much a red herring. Computers have privacy implications that differ from filing cabinets because of the potential ability to do mass data analytics on a very large number of records - this is why there are privacy issues surrounding the records of number plate recognition cameras, for example. This case doesn't contain mass anything - it contains one phone, and once it's unlocked, it will be looked through by hand. The only role of the computer in this case is to be a special kind of lock.

As far as I'm concerned, RuthW asked the most important question a few pages ago:

quote:

what can other companies be compelled to do to assist government investigations?

Again, the computer aspect is a red herring here. Some police force needs something to pursue an investigation, and you are the only person who knows how to make that thing. Can you be forced to help?

In the US, the answer contained in the All Writs Act seems to be "yes", subject to the Supreme Court's 1977 ruling in US vs New York Telephone Co. that the thing you are asked to do must be necessary and relevant, and can't place an undue burden on you.

So they can't just make you work for the government for free.

The actual work required of Apple in this case is minimal. It wouldn't take more than a morning for a software engineer familiar with the code to disable the data destruction feature and rebuild. So that's not an undue burden.

The question becomes whether calling Apple on the misleading claims of security that they have made places an undue commercial burden on their business. I find it hard to argue that revealing the truth can ever be an "undue burden".
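[Editorial aside: a hedged sketch of why "disable the data destruction feature" is the whole ballgame. The class and constants below are illustrative, not Apple's code; the only grounded specifics are that iOS erases its keys after 10 failed attempts and that a 4-digit PIN has just 10,000 possibilities.]

```python
# Illustrative sketch: a passcode guard with an auto-erase limit, and
# what brute force looks like with that limit on versus off.
class PasscodeGuard:
    def __init__(self, pin: str, erase_after: int = 10, enforce_limit: bool = True):
        self._pin = pin
        self.erase_after = erase_after
        self.enforce_limit = enforce_limit  # the feature the order asks Apple to disable
        self.failures = 0
        self.erased = False

    def try_pin(self, guess: str) -> bool:
        if self.erased:
            return False  # keys destroyed; data unrecoverable
        if guess == self._pin:
            self.failures = 0
            return True
        self.failures += 1
        if self.enforce_limit and self.failures >= self.erase_after:
            self.erased = True
        return False


def brute_force(guard: PasscodeGuard):
    """Try every 4-digit PIN; feasible only if the erase limit is off."""
    for candidate in (f"{n:04d}" for n in range(10_000)):
        if guard.try_pin(candidate):
            return candidate
    return None


print(brute_force(PasscodeGuard("7319", enforce_limit=True)))   # None: erased after 10 misses
print(brute_force(PasscodeGuard("7319", enforce_limit=False)))  # 7319
```

With the limit enforced, the search dies after ten guesses; with it disabled (and the escalating delays removed), exhausting all 10,000 PINs is trivial - which is the asymmetry both sides of this thread are arguing over.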
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by mousethief:
quote:
Originally posted by Dave W.:
But don't hide your lamp under a bushel, mousethief - how about digging into that deep reserve of knowledge and responding to questions about statements you made?

Why should I? When I try, what I say is fed back to me in a garbled state, such as this:

quote:
You bought a phone expecting it would be secure against a search warrant? And you think the smartphone industry was created to keep smartphones secure from snooping?
This entire thread is one colossal straw man.

Asked why X is different from Y, you answer that Y doesn't have properties A and B. It is neither garbling your words nor creating a straw man to suggest that you have just implied that X does have those properties.
 
Posted by Enoch (# 14322) on :
 
Going back to what I asked earlier about the difference between negative action (allow a litigant or the state to search), relatively negative action (provide information) and positive action (do X which involves work), can somebody answer something else which is puzzling me? I realise I'm asking questions about the law in a foreign country. So this question might be stupid.

What really is the significance of the references to the All Writs Act? Looking at brief summaries on the web of what it does, all it seems to be saying is that if someone has a legal right to insist on something, the courts have the power to issue a writ to enable them to do it. That is to say, on my understanding, it's legislation that has an entirely subsidiary, collateral, effect. It does not, or should not be used to, create substantive rights.

So, if the state has the right to compel Apple to do X, then the All Writs Act will mean that if there isn't already some sort of writ or court order to compel Apple to comply, the courts can't say 'well the state has the right, but we haven't got a procedure to enable it to do anything about it'. A judge has to write a form of writ or order that will do the job.

It doesn't or should not follow, that the Act also means that because a writ could be issued to say anything, that means that anybody can go to the court and demand any writ they like. The Act can only exist to compel compliance with law that already exists somewhere else in the legal system.

Am I right, or am I wrong?

If I am right, what is the substantive US law on this subject? Does legislation exist that enables the state to insist that citizens, whether private or corporate, carry out various possibly quite strenuous and demanding activities, either to further the state's investigations or the requirements of other people's litigation? Does the US government, or for that matter the government of any of the 50 states, have the power, in effect, to compulsorily purchase a citizen's efforts other than under the military draft?
 
Posted by orfeo (# 13878) on :
 
Having read further on the technical question, can I just point out that indications are that iPhones were far more crackable up until iOS 8.

The idea that absolute security is a key principle of smartphones ignores that the current problem only arose in the last 18 months.
 
Posted by Dave W. (# 8765) on :
 
Enoch:
From what I've read, the All Writs Act is currently used to effectuate court orders by compelling assistance where there is no applicable statute or rule; it is also limited in use to third parties with a close connection to a case, when assistance is necessary, and when it doesn't impose an unreasonable burden. The Wikipedia article links to a 6.5 minute video by a Stanford Law professor on its application to surveillance cases.

There's a similar case in New York in which Apple has been asked to unlock a 5S belonging to a confessed meth dealer. Here's a transcript of the argument between the US Attorney and Apple before the judge over whether the All Writs Act can be used. The judge asks some interesting questions of both sides which give some idea of what kind of issues might come up in the San Bernardino case. Interestingly, the Apple rep doesn't say anything about a threat to privacy on other phones; he seems to rely on arguments that Apple doesn't have a close connection and that acquiescing to this order would damage consumer trust. The judge seems skeptical in light of the many instances in which Apple has unlocked phones previously, but also presses pretty hard on the government attorney, asking if her theory of close connection means a court could compel assistance in carrying out a death sentence from a company which had chosen to stop making drugs for lethal injections (which the judge agrees is a purposefully inflammatory hypothetical.)

Fun stuff! 87 pages, but double-spaced with big font and wide margins...
 
Posted by cliffdweller (# 13338) on :
 
quote:
Originally posted by Dave W.:
Interestingly, the Apple rep doesn't say anything about a threat to privacy on other phones; he seems to rely on arguments that Apple doesn't have a close connection and that acquiescing to this order would damage consumer trust. The judge seems skeptical in light of the many instances in which Apple has unlocked phones previously...

If the judge can potentially dismiss Apple's argument based on the "many instances in which Apple has unlocked phones previously" doesn't that pretty much prove that precedent is a real & valid concern and not just a "slippery slope" argument?
 
Posted by Dave W. (# 8765) on :
 
Well, it's a weakness in Apple's argument in these cases; that's a concern if you think it's important that the government not be able to search these phones with specific, court-issued warrants. But pending a more detailed response from Apple, I tend to think what the government is asking for is reasonable and appropriately limited, so from my perspective it isn't a real and valid concern.
 
Posted by Dave W. (# 8765) on :
 
By the way, I meant the judge was skeptical about Apple's claim regarding imperiling customer trust - not necessarily about Apple's entire argument.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by cliffdweller:
quote:
Originally posted by Dave W.:
Interestingly, the Apple rep doesn't say anything about a threat to privacy on other phones; he seems to rely on arguments that Apple doesn't have a close connection and that acquiescing to this order would damage consumer trust. The judge seems skeptical in light of the many instances in which Apple has unlocked phones previously...

If the judge can potentially dismiss Apple's argument based on the "many instances in which Apple has unlocked phones previously" doesn't that pretty much prove that precedent is a real & valid concern and not just a "slippery slope" argument?
A precedent for what, though? It's a precedent for Apple's capability to unlock a phone when asked.

It's not a precedent for deciding when a court should grant a request, and frankly there shouldn't be a different precedent on that for iPhones as compared to any other device or object.

The requirements for a court order should be based on the nature of the investigation, and of the evidence sought, and the value of that evidence.
 
Posted by chris stiles (# 12641) on :
 
quote:
Originally posted by orfeo:

The idea that absolute security is a key principle of smartphones ignores that the current problem only arose in the last 18 months.

Again, I'm not sure this is relevant. Most of the wider IT industry is moving in this direction - and in many ways the behaviour of government is only a small factor in driving this change - a larger factor is the escalation in technical capabilities to crack systems as well as various data breaches and the associated fallout.

Apple is just able to move slightly faster as they control their own ecosystem from end to end, and have slightly more incentive to do so as their end customers are the people buying their devices.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by chris stiles:
quote:
Originally posted by orfeo:

The idea that absolute security is a key principle of smartphones ignores that the current problem only arose in the last 18 months.

Again, I'm not sure this is relevant. Most of the wider IT industry is moving in this direction - and in many ways the behaviour of government is only a small factor in driving this change - a larger factor is the escalation in technical capabilities to crack systems as well as various data breaches and the associated fallout.

Apple is just able to move slightly faster as they control their own ecosystem from end to end, and have slightly more incentive to do so as their end customers are the people buying their devices.

It's highly relevant, though. We're not merely talking about technology, we're talking about legal principles.

I fail to see why a principle of sanctity of data should appear now just because sanctity of data might have become a technical possibility. For all of Apple's posturing, they haven't actually in the past done something like go to court to establish that the FBI breaking into a phone is wrong.

If FBI access of data is wrong and your phone ought to be inviolate, then it was always wrong even when the FBI didn't need Apple's help in the way they do now. If FBI access of data in an appropriate case of a targeted investigation was an acceptable proposition before, then it still ought to be an acceptable proposition now.

Whether Apple ought to be required to help is a distinct legal argument, and while it might be one Apple is running in court, in the land of public opinion they're running the argument that the FBI accessing a phone is a terrible idea. It didn't suddenly become a terrible idea just because access has become difficult. All the change in technology did was create a situation where Apple could combine its supposed "principles" on the issue with a practical situation.

It's no different to what I've previously said about the principles applying to all devices and objects - iPhones, safes, filing cabinets, hidden boxes that can only be found by tracing a magic rune during the full moon. If I don't think the difference between data in an iPhone and data in a filing cabinet ought to affect the legal principles for accessing data, I sure as hell don't think the difference between an iPhone 4 running iOS 7 and an iPhone 6 running iOS 9 ought to affect the legal principles.

But Apple is behaving as if it does make a difference, and that THIS FAR into the history of iPhones is the time to take a stand against access to data when there have been plenty of successful court orders for access to data in the past.

[ 25. February 2016, 09:46: Message edited by: orfeo ]
 
Posted by chris stiles (# 12641) on :
 
quote:
Originally posted by orfeo:

It's highly relevant, though. We're not merely talking about technology, we're talking about legal principles.

I fail to see why a principle of sanctity of data should appear now just because sanctity of data might have become a technical possibility.

There are several strands here:

It has always been possible to encrypt a piece of information (place a mathematical lock on it) in such a way that it could not be broken; that's not merely a recent technical capability.

At the same time, the volume of information people generate just by virtue of having smartphones etc. is far greater than in the past. There are information flows available these days that were not available at any practical level in the past. A search engine query for an uncommon disease is traceable in a way that a trip to a reference library isn't. Legal guidelines written with the past in mind give current access to far more information than was necessarily intended when the legislation itself was drawn up. So to punt it one level up, we aren't talking about legal capabilities but philosophical principles.

If people's personal data is going to be stored in anything other than offline storage to which only they have access (which in itself has got harder and harder) then it has to be protected in some way [for our purposes, transmitting this data is equivalent to storage]. There is no way of doing this in a way that would allow only the 'good guys' access to this information whilst simultaneously preventing 'bad guys' from accessing it.

Given the ability to retain large quantities of data over time, any form of protection has to be capable of defeating not just present, but also future attacks, so the protection can't just be based on cost of capability required to break it.
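To put that last point in rough numbers — a sketch with assumed figures, not a claim about any real attacker — cost-based protection erodes because a key space that is merely expensive to search today becomes cheap as hardware improves, while a sufficiently large one stays out of reach of any foreseeable brute force:

```python
# Illustrative arithmetic only. A DES-sized 56-bit key space is already
# searchable; a 128-bit key space is not, at any plausible future rate.
# The guess rate below is an assumption for illustration.
SECONDS_PER_YEAR = 31_557_600  # Julian year

def brute_force_years(key_bits: int, guesses_per_second: float) -> float:
    """Worst-case years to exhaust a key space at a fixed guess rate."""
    return (2 ** key_bits) / guesses_per_second / SECONDS_PER_YEAR

rate = 1e12  # hypothetical: a trillion guesses per second
print(f"56-bit key:  {brute_force_years(56, rate):.4f} years")
print(f"128-bit key: {brute_force_years(128, rate):.3e} years")
```

On those assumed figures the 56-bit space falls in under a day, while the 128-bit space takes on the order of 10^19 years — which is why long-retained data has to rely on key sizes in the latter class rather than on the cost of today's hardware.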
 
Posted by Paul. (# 37) on :
 
quote:
Originally posted by orfeo:
Whether Apple ought to be required to help is a distinct legal argument, and while it might be one Apple is running in court, in the land of public opinion they're running the argument that the FBI accessing a phone is a terrible idea. It didn't suddenly become a terrible idea just because access has become difficult. All the change in technology did was create a situation where Apple could combine its supposed "principles" on the issue with a practical situation.

From what I've seen Apple are not saying it's a terrible idea to unlock a single phone, they've mentioned their objection to providing a "back door" to phones in general. Now we know that's not what the FBI have asked for, but the fact that Apple framed it that way is either spin or it implies something about the technical details that mean a single-phone solution would easily become a more general purpose back door.

I'll be interested to see what arguments Apple give in court regarding this.
 
Posted by lilBuddha (# 14333) on :
 
chris stiles,

You have hit on a real problem with this case. Technology, its use and reach have moved at a far quicker pace, with greater implications, than those adjudicating this issue understand and/or present. This also includes the public debate.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by chris stiles:
quote:
Originally posted by orfeo:

It's highly relevant, though. We're not merely talking about technology, we're talking about legal principles.

I fail to see why a principle of sanctity of data should appear now just because sanctity of data might have become a technical possibility.

There are several strands here:

It has always been possible to encrypt a piece of information (place a mathematical lock on it) in such a way that it could not be broken; that's not merely a recent technical capability.

At the same time, the volume of information people generate just by virtue of having smartphones etc. is far greater than in the past. There are information flows available these days that were not available at any practical level in the past. A search engine query for an uncommon disease is traceable in a way that a trip to a reference library isn't. Legal guidelines written with the past in mind give current access to far more information than was necessarily intended when the legislation itself was drawn up. So to punt it one level up, we aren't talking about legal capabilities but philosophical principles.

If people's personal data is going to be stored in anything other than offline storage to which only they have access (which in itself has got harder and harder) then it has to be protected in some way [for our purposes, transmitting this data is equivalent to storage]. There is no way of doing this in a way that would allow only the 'good guys' access to this information whilst simultaneously preventing 'bad guys' from accessing it.

Given the ability to retain large quantities of data over time, any form of protection has to be capable of defeating not just present, but also future attacks, so the protection can't just be based on cost of capability required to break it.

No, this is still not persuading me in the slightest.

Whatever theoretical notion there might have been of it being possible to make information impregnable, the fact is that iPhones were, until about 18 months ago, easier to crack. Whether by good guys or bad guys.

I fail to see what the volume of information has to do with anything. Let me emphasise yet again, we aren't talking about general licenses to collect information, we're talking about specific permission to collect specific information relating to specific persons. Which makes something like a search engine utterly irrelevant. A search engine engages in a form of fishing expedition to find out what's out there. The FBI aren't asking for a search engine, they're asking to type a specific URL.

The principles of how the law deals with an individual person don't miraculously change just because the population increases, and "there's a lot more information now" doesn't create any kind of argument in and of itself for a change in legal principle. A principle might be more important and of more interest as the subject matter increases, but that in no way tells you what the principle ought to be.

The whole notion of "offline storage that only the person has access to" kind of begs the question and actually illustrates perfectly the problems with arguing that smartphones are some special new issue. The fact is, both good guys and bad guys had the exact same tools to break into your house or your safe or your storage locker. This supposed novel problem about being unable to devise a method only usable by good guys is not a novel problem. If the police knew how to pick locks, so did burglars. If the police knew how to crack safes, so did criminal safe crackers.

Technical capability has never been what separates law enforcement from crooks. Lawful authority is the distinguishing feature. And that's the entire point of the FBI going to court. It's the going to court that distinguishes them from the crook who wants to steal the data from your phone.

No-one would ever argue that we ought not allow the FBI to seek a court order because it might mean that crooks would have access to court orders. So they argue that the FBI mustn't be able to get into phones in case crooks are able to get into phones. But that's pretty well a basis for an argument against every piece of technology, ever. No guns for the good guys in case the bad guys misuse them. No cars for police in case crooks convert the technology into getaway cars. Heck, no mobile phones for us in case crooks find ways of using them to talk to each other about planning crimes or coordinating terror attacks.

It only makes sense to ban a technology on this basis if you can show that the bad uses actually outweigh the good. The mere existence of potential bad use is not enough.

[ 25. February 2016, 11:56: Message edited by: orfeo ]
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Paul.:
quote:
Originally posted by orfeo:
Whether Apple ought to be required to help is a distinct legal argument, and while it might be one Apple is running in court, in the land of public opinion they're running the argument that the FBI accessing a phone is a terrible idea. It didn't suddenly become a terrible idea just because access has become difficult. All the change in technology did was create a situation where Apple could combine its supposed "principles" on the issue with a practical situation.

From what I've seen Apple are not saying it's a terrible idea to unlock a single phone, they've mentioned their objection to providing a "back door" to phones in general. Now we know that's not what the FBI have asked for, but the fact that Apple framed it that way is either spin or it implies something about the technical details that mean a single-phone solution would easily become a more general purpose back door.

The main thing it implies to me is that Apple actually lacks the technical competence to devise an appropriate single-phone solution, and wants to blame the FBI for asking for a single-phone solution.

To carry on an analogy previously alluded to, it's as if Apple is saying that they can't possibly find a specific URL for you, so they're going to have to give you a search engine, and then it's your fault that this will enable you to find other websites besides the one you wanted.

[ 25. February 2016, 11:58: Message edited by: orfeo ]
 
Posted by chris stiles (# 12641) on :
 
quote:
Originally posted by orfeo:

Whatever theoretical notion there might have been of it being possible to make information impregnable

The notion that it's purely 'theoretical' is largely incorrect and irrelevant. There are systems that implement similar levels of protection - it just so happens that at this particular moment in time, a consumer electronics company is rolling out similar protection at the level of a mass market device.

quote:

I fail to see what the volume of information has to do with anything. Let me emphasise yet again, we aren't talking about general licenses to collect information, we're talking about specific permission to collect specific information relating to specific persons.

Because the volume of specific information each of us produces has expanded exponentially, including activities which would have been previously transacted in a largely untraceable manner, and that this in itself changes the implications of laws drawn decades ago. I do not think that it is illogical that the massive increase in powers thus implied should be the subject of public debate. It's not just an issue about volume of information created, but the types of information collected.

[The search engine was an example of the kinds of data traces that would previously not have been created, and not meant as an analogy of the technical capability the FBI was trying to acquire].

quote:

It only makes sense to ban a technology on this basis if you can show that the bad uses actually outweigh the good. The mere existence of potential bad use is not enough.

There's a quandary embedded in your use of this principle. The next generation of this technology could well be implemented in such a way that it was impossible for Apple - even in principle - to do what the FBI want them to do now.
 
Posted by chris stiles (# 12641) on :
 
quote:
Originally posted by orfeo:

The main thing it implies to me is that Apple actually lacks the technical competence to devise an appropriate single-phone solution, and wants to blame the FBI for asking for a single-phone solution.

You cannot safely draw this conclusion - it could also imply that it is not possible to create a single-phone solution because of the manner in which the system is or has to be constructed.
 
Posted by Paul. (# 37) on :
 
Orfeo, you'll forgive me if I doubt your knowledge and expertise to judge Apple's technical competence. I can imagine plausible scenarios in which it might be very difficult to create a single-phone upgrade that wasn't a general purpose back door in thin disguise. However I don't know enough about the internals of iOS to know whether those scenarios are likely in reality.

However Apple's statements - if they're not just lying for PR purposes - would support that. In which case the courts will need to decide whether Apple are duty bound to give away a master key to unlock one safe. The government would presumably try to prove that a single key is possible; however they, like us, lack the detailed knowledge that Apple has.

Which all brings me back to my previous statement that I think the arguments in court on this are going to be very interesting.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Paul.:
In which case the courts will need to decide whether Apple are duty bound to give away a master key to unlock one safe.

Only if Apple asserts not just that a master key is required, but also that they can't hold onto the master key themselves after unlocking one phone as requested.

I will fully accept that I don't have technical expertise in this area. It does seem quite surprising to me, though, that Apple would've painted themselves into such a conceptual corner. Not just because of law enforcement requests, but because of all the other completely legitimate reasons someone might need access to a phone.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by chris stiles:
quote:
Originally posted by orfeo:

The main thing it implies to me is that Apple actually lacks the technical competence to devise an appropriate single-phone solution, and wants to blame the FBI for asking for a single-phone solution.

You cannot safely draw this conclusion - it could also imply that it is not possible to create a single-phone solution because of the manner in which the system is or has to be constructed.
Similarly, the notion that it is "not possible" says something about Apple. It'd be one thing for a person to tell you that they can't possibly create a one-off solution for a system designed by someone else, but here it's the actual creator of the system that is telling you they designed the system in such a way as to make it impossible.

I suppose someone thought that was an excellent idea at the time - selling you a product so secure, you can't even ask for assistance from the person who devised the security. But to me it has flashes of all those sci fi stories about machines that can't be stopped by their own creators. There's a certain overconfidence in your own plans when you make them irreversible.
 
Posted by Paul. (# 37) on :
 
quote:
Originally posted by orfeo:
quote:
Originally posted by chris stiles:
quote:
Originally posted by orfeo:

The main thing it implies to me is that Apple actually lacks the technical competence to devise an appropriate single-phone solution, and wants to blame the FBI for asking for a single-phone solution.

You cannot safely draw this conclusion - it could also imply that it is not possible to create a single-phone solution because of the manner in which the system is or has to be constructed.
Similarly, the notion that it is "not possible" says something about Apple. It'd be one thing for a person to tell you that they can't possibly create a one-off solution for a system designed by someone else, but here it's the actual creator of the system that is telling you they designed the system in such a way as to make it impossible.
Indeed and they did this deliberately precisely because they didn't want to be placed in this position.
 
Posted by lowlands_boy (# 12497) on :
 
What statements have Apple actually made? Their original ones did not deny the technical feasibility, but did suggest that they thought it was a very bad thing. The most recent thing I saw suggested that the case should be suspended and a commission formed to investigate the consequences.

I'm still strongly inclined to believe that they could lock it to a handset (that's the only interesting bit - the other requirements are pretty trivial). I'd agree that the arguments against will be interesting, because I presume the burden of proof that it's not possible will rest with Apple, and proving it will involve vast technical arguments that they will undoubtedly want conducted in private.
 
Posted by Dave W. (# 8765) on :
 
quote:
Originally posted by Paul.:
quote:
Originally posted by orfeo:
quote:
Originally posted by chris stiles:
quote:
Originally posted by orfeo:

The main thing it implies to me is that Apple actually lacks the technical competence to devise an appropriate single-phone solution, and wants to blame the FBI for asking for a single-phone solution.

You cannot safely draw this conclusion - it could also imply that it is not possible to create a single-phone solution because of the manner in which the system is or has to be constructed.
Similarly, the notion that it is "not possible" says something about Apple. It'd be one thing for a person to tell you that they can't possibly create a one-off solution for a system designed by someone else, but here it's the actual creator of the system that is telling you they designed the system in such a way as to make it impossible.
Indeed and they did this deliberately precisely because they didn't want to be placed in this position.
Based on what Apple itself says in its white paper on iOS security it does have the ability to write software that is uniquely signed for a specific phone - they use this when sending out software update packages to ensure that a package meant for one phone can't be used on another (see page 6.)
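The personalization scheme described in that white paper can be sketched roughly like this — a toy model, not Apple's implementation, and using a keyed MAC as a stand-in for Apple's actual public-key signing: the signature covers both the software image and a unique hardware ID, so a package personalized for one device fails verification on any other.

```python
import hashlib
import hmac

# Hypothetical vendor key; in reality the signing key never leaves the vendor.
VENDOR_KEY = b"vendor-signing-key"

def personalize(image: bytes, device_id: str) -> bytes:
    """Sign an update image for exactly one device by binding its ID in."""
    return hmac.new(VENDOR_KEY, image + device_id.encode(), hashlib.sha256).digest()

def boot_verify(image: bytes, device_id: str, signature: bytes) -> bool:
    """What the device would check before accepting the image."""
    expected = personalize(image, device_id)
    return hmac.compare_digest(expected, signature)

update = b"os-update-image"
sig = personalize(update, "ECID-AAAA")
print(boot_verify(update, "ECID-AAAA", sig))  # True: matches the target phone
print(boot_verify(update, "ECID-BBBB", sig))  # False: rejected on any other phone
```

If the real mechanism works along these lines, a forensic build signed for one phone's ID would be inert on every other phone — which is the crux of the dispute over whether a "single-phone solution" is meaningfully different from a master key.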
 
Posted by Paul. (# 37) on :
 
quote:
Originally posted by lowlands_boy:
What statements have Apple actually made?

quote:
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
From Apple's 'letter' to its customers
 
Posted by Paul. (# 37) on :
 
quote:
Originally posted by Dave W.:
Based on what Apple itself says in its white paper on iOS security it does have the ability to write software that is uniquely signed for a specific phone - they use this when sending out software update packages to ensure that a package meant for one phone can't be used on another (see page 6.)

I'm aware of that. I'm also aware Tim Cook says "any phone" above. He's also (prior to this case) talked about a "master key" not being a good thing.

So I'm not sure why he would say that if the signing by Apple of updates is so secure that it can't be abused. Unless there's some other loophole. But it's such a blanket statement, as part of announcing why they're going to court to refuse the FBI request, that I can't help but think Apple have some answer to it which they are willing to present in court.
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by chris stiles:
The next generation of this technology could well be implemented in such a way that it was impossible for Apple - even in principle - to do what the FBI want them to do now.

Yes, it could. It would be possible to design a system that would be very difficult indeed to break into. Imagining a system that requires drilling into a chip package to break into is trivial. It's quite possible that one could build something that was protected against even that level of physical intrusion. And if the data's on a device that is actually impregnable, the only hope law enforcement have of reading it is to persuade the device owner to decrypt it for them.

That's not what we have here.

(Here's the technical capability that Apple have: It is trivial for them to build a version of iOS that doesn't implement the self-destruct feature. It is also trivial for them to build in a boot-time check that ensures that a particular iOS version will only run on a device with particular hardware IDs.

Equally trivially, they can build an iOS version that will disable the self-destruct and run on any phone.

None of this takes more than a morning for one engineer. The testing would take longer than the actual modifications.)
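Back-of-envelope arithmetic shows why removing the self-destruct and lockout delays is the whole game (the per-guess key-derivation cost below is an assumption for illustration, not Apple's published figure):

```python
# With wipe-after-10-tries and escalating delays gone, the only cost left
# per guess is the hardware key derivation. Assume ~80 ms per attempt.
KEY_DERIVATION_SECONDS = 0.08  # illustrative assumption

def worst_case_hours(pin_digits: int) -> float:
    """Hours to try every numeric passcode of the given length."""
    return (10 ** pin_digits) * KEY_DERIVATION_SECONDS / 3600

for digits in (4, 6):
    print(f"{digits}-digit passcode: {worst_case_hours(digits):.1f} hours worst case")
```

On those assumed figures a 4-digit passcode falls in well under an hour and a 6-digit one in about a day — which is why the self-destruct feature, not the passcode itself, carries the security load on this class of phone.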
 
Posted by Dave W. (# 8765) on :
 
Apple also says, in response to government statements that their objection appears to be based on concern for their business model and marketing strategy, "Absolutely not. Nothing could be further from the truth." So some amount of hyperbole in their public statements isn't unexpected.

As you say, they may reveal a technical explanation that justifies their talk of "master keys" and "cancer" in their response to the judge; but they also could have already given it in one of their public statements.
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by Dave W.:
As you say, they may reveal a technical explanation that justifies their talk of "master keys" and "cancer" in their response to the judge; but they also could have already given it in one of their public statements.

They're arguing against the precedent. Because they know, as do we all, that once they are required to do this once, they will start getting weekly boxes of iPhones shipped to them by law enforcement all across the country.

It's not a technical argument at all.
 
Posted by Honest Ron Bacardi (# 38) on :
 
Having been away for a couple of days, it's interesting to see the way this thread has moved on. Particularly in view of the fact that a couple of days ago I had lunch with someone whose day job is the design and supply of embedded security systems such as Apple are introducing from the iPhone 6 onwards.

So naturally I asked him what he thought about the whole thing. He replied that there was little to say professionally. In his view, the discussion is or should be on the morality of several interlinked issues, as it would seem that if pressed to extremes, either side of this discussion in the FBI case could be seen as having some aspect that was immoral. And probably that's a discussion we all need to have.

Whilst we discussed much else, the other comment that intrigued me was when I asked him what phone he used himself. He replied that he used an old Nokia phone, as smartphones are all much worse as phones, and all the other things smartphones do made their use too insecure for his line of business.

I think latterly the main discussion here seems to be headed the same way and I thought the convergence interesting.
 
Posted by Honest Ron Bacardi (# 38) on :
 
Oh, incidentally - don't confuse security with the inviolacy of user data.

Some while back, Apple started deleting any music loaded on your device which had not been bought through iTunes. Because security. Just think about that for a minute - that wasn't even restricted to Apple devices. Apple was using iTunes as a backdoor on your machine to delete your data.

They were required to stop, and suffered a $$$$$$$$$ anti-trust suit of course, and strangely nobody seems to have suffered any security issues since. But it illustrates the difference between the two.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Paul.:
quote:
Originally posted by Dave W.:
Based on what Apple itself says in its white paper on iOS security it does have the ability to write software that is uniquely signed for a specific phone - they use this when sending out software update packages to ensure that a package meant for one phone can't be used on another (see page 6.)

I'm aware of that. I'm also aware Tim Cook says "any phone" above. He's also (prior to this case) talked about a "master key" not being a good thing.

So I'm not sure why he would say that if the signing by Apple of updates is so secure that it can't be abused. Unless there's some other loophole. But it's such a blanket statement, as part of announcing why they're going to court to refuse the FBI request, that I can't help but think Apple have some answer to it which they are willing to present in court.

And I can't help thinking he might not be being precisely accurate or honest.

My scepticism is at least partly driven by the Error 53 debacle, where Apple switched from "this proves how excellent our security is" to "oh my God we are so sorry, we never meant that to happen" within a matter of weeks. Quite possibly because their lawyers told them they were at risk of massive lawsuits for anticompetitive behaviour.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Leorning Cniht:
quote:
Originally posted by Dave W.:
As you say, they may reveal a technical explanation that justifies their talk of "master keys" and "cancer" in their response to the judge; but they also could have already given it in one of their public statements.

They're arguing against the precedent. Because they know, as do we all, that once they are required to do this once, they will start getting weekly boxes of iPhones shipped to them by law enforcement all across the country.

It's not a technical argument at all.

Oh look, it's the fallacious form of slippery slope argument again that treats this as the first phone access that's ever been sought and ignores the involvement of a court in assessing the merits of a request based on other factors besides 'can it be done'.

Haven't seen that for up to 24 hours.
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by orfeo:
Oh look, it's the fallacious form of slippery slope argument again that treats this as the first phone access that's ever been sought and ignores the involvement of a court in assessing the merits of a request based on other factors besides 'can it be done'.

No, it's not that at all. Of course the courts will be involved, and will assess the merits.

It'll just happen quite often. Here's some criminal, who you believe to have evidence incriminating him and others on his phone. This is not uncommon.

If it's the same kind of phone as the one in this case (and there are a lot of those phones) then the data can be acquired with Apple's help, but can't be acquired without it.

Sure, a court will judge each case on its merits, but I'd expect the answer in many cases to be "yes".

And I don't have a problem with that.

Apple have built a system that they can help unlock, but nobody else can. This means that when law enforcement can convince a court that it should search someone's phone, they're going to be going to Apple. All the time.
 
Posted by orfeo (# 13878) on :
 
Yes, okay, I see what you're saying.

Basically Apple has created a rod for their own backs, forcing their own involvement in law enforcement when on previous phones their involvement would not have been necessary.
 
Posted by Dave W. (# 8765) on :
 
Christmas has come a day early - Apple's response is now out:
Apple Inc's motion to vacate order compelling Apple Inc. to assist agents in search, and opposition to government's motion to compel assistance

It has it all: no close connection, undue burden, assistance not necessary (the three All Writs Act tests), existence of applicable statutes, plus the First and Fifth Amendments thrown in for good measure!

Apple does not argue that the software can't be made specific to one phone - just that the technique could also be used on other phones (which, well, obviously.) They do provide some estimates of what it would take: 6-10 Apple engineers and employees for 2-4 weeks (p. 13 and an engineer's declaration at the end.) Apple repeats that the software would be too tempting a target for criminals and hackers to keep around (but doesn't say why its regular source code and keys aren't already in similar danger.) They also give a shoutout to the lethal injection drug hypothetical from the New York case.

I think it will be interesting to see how the judge decides whether Apple's estimate of the work required rises to the level of an undue burden.

Government reply is due March 10, Apple's further response by March 15; there will be a federal court hearing March 22.

(Note: Ars Technica has paid pretty close attention to this case, and is good about linking to the original documents.)
 
Posted by Golden Key (# 1468) on :
 
Is it usual for there to be such a tight timeline on something like this, for responses from the parties? Or is that because of the terrorism link?
 
Posted by Dave W. (# 8765) on :
 
I have no idea whether this is considered fast, slow, or average. I haven't noticed anything in the documents that suggests a rush, or any time pressure because of the investigation itself.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Dave W.:
Apple repeats that the software would be too tempting a target for criminals and hackers to keep around (but doesn't say why its regular source code and keys aren't already in similar danger.)

Rather a good point. I mean, think about all the things that Apple tells developers so that they can develop apps.

I shudder to think what would happen if anyone ever manages to develop a form of 'ransomware' that attacks iPhones. Maybe Apple believes its systems actually prevent such a possibility from arising.

[ 26. February 2016, 04:32: Message edited by: orfeo ]
 
Posted by Honest Ron Bacardi (# 38) on :
 
Ransomware attacking iPhones has been around for quite a while, orfeo.

Here's an example being discussed.

But the point is well made. Extreme security as a concept is all very well, but legitimate users (and here the endpoint user is surely uncontroversial) do have to be able to circumvent its misuse, erratic behaviour and so on.
 
Posted by Paul. (# 37) on :
 
quote:
Originally posted by Dave W.:
Christmas has come a day early - Apple's response is now out:
Apple Inc's motion to vacate order compelling Apple Inc. to assist agents in search, and opposition to government's motion to compel assistance

Thanks for the link. Well worth the read. Especially the engineer's part which explains well why it's not a trivial amount of work.
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Unfortunately, the FBI, without consulting Apple or reviewing its public
guidance regarding iOS, changed the iCloud password associated with one of the
attacker’s accounts, foreclosing the possibility of the phone initiating an automatic
iCloud back-up of its data to a known Wi-Fi network, see Hanna Decl. Ex. X [Apple
Inc., iCloud: Back up your iOS device to iCloud], which could have obviated the need
to unlock the phone and thus for the extraordinary order the government now seeks.21
Had the FBI consulted Apple first, this litigation may not have been necessary

A commendable balls up there...
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by Honest Ron Bacardi:
Extreme security as a concept is all very well, but legitimate users (and here the endpoint user is surely uncontroversial) do have to be able to circumvent its misuse, erratic behaviour and so on.

There is also a conflict between ensuring the security of your data and ensuring the continued existence of your data (in general - not specific to phones).

You can encrypt your data securely, so that it is uncrackable in practical time. That has been possible for quite a while. But if you do that, and you forget your password, you've lost your data. Do you really want to do that to a decade of family photos? Probably not.

Do you want to do it to your banking transaction request? Yes.
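A toy illustration of that trade-off, using only Python's standard library (this is not real cryptography - a hash-derived keystream stands in for a proper cipher): with a strong password-derived key, forgetting the password really does mean the data is gone.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Expand the key into a pseudo-random pad by hashing key || counter.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def crypt(password: str, data: bytes) -> bytes:
    # XOR against a password-derived keystream; applying it twice decrypts.
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), b"demo-salt", 100_000)
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

photos = b"a decade of family photos"
locked = crypt("correct horse", photos)

assert crypt("correct horse", locked) == photos  # right password: data recovered
assert crypt("wrong guess", locked) != photos    # wrong password: garbage, forever
```

There's no recovery path in a scheme like this - which is the whole point, and the whole problem.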
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by Paul.:
Especially the engineer's part which explains well why it's not a trivial amount of work.

So I hadn't understood that the request was also for an electronic password input. That's more work - that's entirely new functionality. I think Apple are padding their estimates a bit, but it's unquestionably a much bigger deal than "disable the self-destruct" (which is trivial).
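Roughly, the two requested changes look like this in a toy model (the `Phone` class and the 10-try limit are simplifications, not Apple's actual implementation): removing the wipe counter is the trivial part, and the electronic-input path is what makes an exhaustive search fast enough to matter.

```python
class Phone:
    """Toy model of a PIN lock with an auto-wipe counter (not Apple's code)."""
    def __init__(self, pin, self_destruct=True):
        self._pin = pin
        self._fails = 0
        self.self_destruct = self_destruct
        self.wiped = False

    def try_pin(self, guess):
        if self.wiped:
            return False
        if guess == self._pin:
            return True
        self._fails += 1
        if self.self_destruct and self._fails >= 10:
            self.wiped = True  # data destroyed after ten wrong guesses
        return False

def brute_force(phone):
    # Electronic input: submit every 4-digit PIN, far faster than typing.
    for n in range(10_000):
        guess = f"{n:04d}"
        if phone.try_pin(guess):
            return guess
    return None

assert brute_force(Phone("7321")) is None  # stock behaviour: wiped after 10 tries
assert brute_force(Phone("7321", self_destruct=False)) == "7321"  # with both changes
```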
 
Posted by mousethief (# 953) on :
 
quote:
Originally posted by Leorning Cniht:
quote:
Originally posted by Honest Ron Bacardi:
Extreme security as a concept is all very well, but legitimate users (and here the endpoint user is surely uncontroversial) do have to be able to circumvent its misuse, erratic behaviour and so on.

There is also a conflict between ensuring the security of your data and ensuring the continued existence of your data (in general - not specific to phones).

You can encrypt your data securely, so that it is uncrackable in practical time. That has been possible for quite a while. But if you do that, and you forget your password, you've lost your data. Do you really want to do that to a decade of family photos? Probably not.

Do you want to do it to your banking transaction request? Yes.

Any fuckwit who only has one copy of his family photos, or has them all stored with the same password, deserves to lose them.
 
Posted by orfeo (# 13878) on :
 
Yes, for God's sake keep the negatives!
 
Posted by mousethief (# 953) on :
 
quote:
Originally posted by orfeo:
Yes, for God's sake keep the negatives!

The what?
 
Posted by Honest Ron Bacardi (# 38) on :
 
mousethief wrote:
quote:
Any fuckwit who only has one copy of his family photos, or has them all stored with the same password, deserves to lose them
Isn't this exactly what Apple is wanting you to do? And not just your photos, but your music, your contacts, your financial transactions (Apple pay) and who knows what in the future? All on the one device with one password. Of course they will point you in the direction of backing up to the iCloud ($$$), but the downside of that is neatly illustrated above - see under FBI cockups.

Fuckwits may well be the technical term for people who buy into this, though I doubt they would thank you for pointing it out.

This thread is about the big beasts positioning for ownership of Big Data, where they see the future is, and they don't want any other big beasts intruding on their turf. Not that that remotely describes all that's going on here, but wariness about who and what you share with big business is always a worthwhile consideration.
 
Posted by lilBuddha (# 14333) on :
 
quote:
Originally posted by Honest Ron Bacardi:


This thread is about the big beasts positioning for ownership of Big Data, where they see the future is, and they don't want any other big beasts intruding on their turf.

It really isn't. Yeah, Apple are about that, (and Google and Microsoft) but their concerns overlap the public's rights.

quote:
but wariness about who and what you share with big business is always a worthwhile consideration.
Oh yes. It is another aspect of relinquishing your life to others without proper consideration.
 
Posted by Honest Ron Bacardi (# 38) on :
 
That observation was just an attempt to reframe the issue - as you say, Big Data is already in there. No, it doesn't cover everything (I did point that out). But there's no unique way of looking at this issue.

But both sides in this dispute have a lamentable record of care. Apple's concern does - for the moment - overlap with public interest in peoples' personal data. But the FBI will be along shortly to point out that what you get suits organized crime, terrorism, pedophiles etc. etc. down to the ground. Hell, they've already said so.

What is to be done, then? If the current positions of both the FBI and Apple are inherently immoral, then like the old adage about the road to Dublin, you wouldn't want to start from here. Some sort of reframing is called for. Though there's no suggestion that mine is necessarily the way forward. But something is.
 
Posted by Dave W. (# 8765) on :
 
Judge Orenstein rules for Apple in NY iPhone unlocking case.

This ruling is only authoritative in the judge's district, but may influence the judge in the San Bernardino case. The judge's rebuff was pretty comprehensive:
On the first point, the judge seems to think the government is trying an end run around the separation of powers:
quote:
It is also clear that the government has made the considered decision that it is better off securing such crypto-legislative authority from the courts (in proceedings that had always been, at the time it filed the instant Application, shielded from public scrutiny) rather than taking the chance that open legislative debate might produce a result less to its liking.
And on the question of burden, the judge is looking beyond the issue of financial costs, referring to the government's inability to answer his death penalty hypothetical:
quote:
If the government cannot explain why the authority it seeks here cannot be used, based on the same arguments before this court, to force private citizens to commit what they believe to be the moral equivalent of murder at the government's behest, that in itself suggests a reason to conclude that the government cannot establish a lack of unreasonable burden.
In conclusion, the judge says this kind of balancing between security and privacy interests is really a job for Congress:
quote:
In deciding this motion, I offer no opinion as to whether, in the circumstances of this case or others, the government's legitimate interest in ensuring that no door is too strong to resist lawful entry should prevail against the equally legitimate societal interests arrayed against it here. Those competing values extend beyond the individual's interest in vindicating reasonable expectations of privacy – which is not directly implicated where, as here, it must give way to the mandate of a lawful warrant. They include the commercial interest in conducting a lawful business as its owners deem most productive, free of potentially harmful government intrusion; and the far more fundamental and universal interest – important to individuals as a matter of safety, to businesses as a matter of competitive fairness, and to society as a whole as a matter of national security – in shielding sensitive electronically stored data from the myriad harms, great and small, that unauthorized access and misuse can cause.

How best to balance those interests is a matter of critical importance to our society, and the need for an answer becomes more pressing daily, as the tide of technological advance flows ever farther past the boundaries of what seemed possible even a few decades ago. But that debate must happen today, and it must take place among legislators who are equipped to consider the technological and cultural realities of a world their predecessors could not begin to conceive. It would betray our constitutional heritage and our people's claim to democratic governance for a judge to pretend that our Founders already had that debate, and ended it, in 1789.

So now all we have to do is find "legislators who are equipped to consider the technological and cultural realities"... oh never mind.
 
Posted by orfeo (# 13878) on :
 
quote:
Originally posted by Dave W.:
Anyway, there's no gap because relevant statute law already exists (1994 Communications Assistance Law Enforcement Act, CALEA) which exempts "information services" companies like Apple

Well, that strikes me as rather important.
 
Posted by Dave W. (# 8765) on :
 
CALEA was discussed at the hearing Judge Orenstein held back in October. The government argued that Apple didn't qualify as an information service under the CALEA definition; the judge now says he's persuaded by Apple's argument that it does.
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by orfeo:
quote:
Originally posted by Dave W.:
Anyway, there's no gap because relevant statute law already exists (1994 Communications Assistance Law Enforcement Act, CALEA) which exempts "information services" companies like Apple

Well, that strikes me as rather important.
CALEA is the act that requires phone companies and the like to support duly authorised wiretaps. I don't see how CALEA is remotely relevant to this case. It's not Apple's services as an "information services" company that are being sought - it's Apple's services as the manufacturer of a pocket computer.

I think any claim in this context based on Apple being any kind of "provider" is mistaken. The relevant context for this claim is Apple as electronics manufacturer.

The points on the scope of the AWA are real, though. This one's going to appeal...
 
Posted by orfeo (# 13878) on :
 
I have to admit, without having gone and looked at the legislation, the reference to "communications" did strike me as immediately odd. Because I agree, this isn't about the phone as a communications device.

It does make you rather wonder which arguments would hold up if the relevant device was an iPad.

*makes notes in case this issue ever comes up at work*

[ 01. March 2016, 09:41: Message edited by: orfeo ]
 
Posted by orfeo (# 13878) on :
 
ADDENDUM: Does Apple offer any kind of phone plans in the US?

It doesn't here. You either buy a phone from Apple and then take it to the actual phone network provider of your choice, or you get a plan with one of those providers where you pay a monthly amount for the phone as part of the overall cost.
 
Posted by Dave W. (# 8765) on :
 
According to the judge's ruling, CALEA makes a distinction between telecommunications companies and information services companies, and only imposes the assistance requirements on the former.

He agrees that Apple falls into the latter category; on p. 20 he quotes Apple quoting CALEA:
quote:
Under CALEA "information services" means the "offering of a capability for generating, acquiring, storing, transforming, processing, retrieving, utilizing, or making available information via telecommunications," and "includes a service that permits a customer to retrieve stored information from, or file information for storage in, information storage facilities; electronic publishing; and electronic messaging services."

 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by Dave W.:

He agrees that Apple falls into the latter category; on p. 20 he quotes Apple quoting CALEA:
quote:
Under CALEA "information services" means the "offering of a capability for generating, acquiring, storing, transforming, processing, retrieving, utilizing, or making available information via telecommunications," and "includes a service that permits a customer to retrieve stored information from, or file information for storage in, information storage facilities; electronic publishing; and electronic messaging services."

I think he is in error. Those are indeed services that Apple offers, and if the court were considering imposing an obligation on Apple qua operator of the iCloud facility, then I think he'd be correct.

That's not what's at stake here - what's at stake here is the storage on the local phone/computer device, and it's Apple's assistance qua computer manufacturer that is desired.
 
Posted by RuthW (# 13) on :
 
quote:
Originally posted by Leorning Cniht:
quote:
Originally posted by Dave W.:

He agrees that Apple falls into the latter category; on p. 20 he quotes Apple quoting CALEA:
quote:
Under CALEA "information services" means the "offering of a capability for generating, acquiring, storing, transforming, processing, retrieving, utilizing, or making available information via telecommunications," and "includes a service that permits a customer to retrieve stored information from, or file information for storage in, information storage facilities; electronic publishing; and electronic messaging services."

I think he is in error. Those are indeed services that Apple offers, and if the court were considering imposing an obligation on Apple qua operator of the iCloud facility, then I think he'd be correct.

That's not what's at stake here - what's at stake here is the storage on the local phone/computer device, and it's Apple's assistance qua computer manufacturer that is desired.

I think you've misread something.

Yes, those are services Apple offers, which is why the judge puts Apple in the category of information service companies that are not subject to the assistance requirements of CALEA, and not in the category of telecommunications companies, i.e. Verizon, AT&T, T-Mobile, etc., which are required to assist per CALEA.
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by RuthW:
I think you've misread something.

Yes, those are services Apple offers, which is why the judge puts Apple in the category of information service companies that are not subject to the assistance requirements of CALEA, and not in the category of telecommunications companies, i.e. Verizon, AT&T, T-Mobile, etc., which are required to assist per CALEA.

The claim was that Congress had considered the matter and chosen to not apply CALEA to "information services", and therefore Congress had actively chosen not to apply legislation to Apple.

My case is that the whole CALEA business is a complete red herring, and it's of no more relevance than saying "Congress didn't intend to impose a burden on Apple because the Aviation and Transportation Security Act doesn't apply to phone manufacturers" or perhaps "Malus Safe and Lock Company isn't a phone company, so is not subject to CALEA".
 
Posted by Dave W. (# 8765) on :
 
From what I can tell the judge's order has a number of components, any one of which would be fatal to the government's application.

The first sticking point is CALEA's explicit exemption of "information services" - does this include Apple? Apple says yes, government says no, judge says yes, effectively (though "the matter is a close call".)

But even if Apple isn't covered by an explicit exemption, the judge says the legislative history since CALEA shows Congress has clearly considered the issue of the government's requested authority, but has failed either to create or reject it. The government says that in the absence of an affirmative prohibition, the AWA fills the gap. The judge says that construes the AWA way too broadly and would imply the executive and judiciary have the power to decide things that are really in the legislative domain. For him, the history of legislative consideration means there is no gap, so this application isn't "agreeable to the usages and principles of the law" as the statute requires.

Finally, even if he were to accept the government's position on the two previous issues, the judge says the application would still fail to meet the AWA's three discretionary factors: on the third party's "closeness" to the underlying case, the level of burdensomeness (ugly word, but there it is...), and necessity (the government's statements have been mixed on whether there are alternative methods it could have tried.)
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by Dave W.:
From what I can tell the judge's order has a number of components, any one of which would be fatal to the government's application.

Yes. The CALEA thing is nonsense, AFAICS, but the question of how broad the AWA is is fundamental. If you're a particularly skilled locksmith who doesn't like the police, does the government have the power to compel you to open a lock for it? That's basically the question here.
 
Posted by lilBuddha (# 14333) on :
 
Relevant and related story.
 
Posted by Leorning Cniht (# 17564) on :
 
quote:
Originally posted by lilBuddha:
Relevant and related story.

The sense of my claim is that this story (which refers to the "StingRay" device used to intercept cellphone communications) isn't relevant to the case at hand.

It's related in the sense that it involves a government agency using technological means against suspects, and that it involves a cellphone, but that's where the similarities end.

The Apple case is to do with breaking the encryption on a phone in order to access the stored data. The fact that the phone is a communications device is irrelevant - it would be exactly the same case if it were a personal organizer, laptop computer, or whatever else.

The StingRay can be used as a mass data mining tool. In that context, it has the same kind of privacy implications as allowing the police to go on warrantless fishing trips through the logs of a city's license plate recognition cameras.
 
Posted by lilBuddha (# 14333) on :
 
The same at its core, which is the erosion of civil liberty. Even in the best case scenario, those who are monitoring the process are hopeless.
 
Posted by lowlands_boy (# 12497) on :
 
Well well well, they suddenly managed to do it without Apple after all. Why doesn't that surprise me.

Any bets on whether they'll be sharing their solution with Apple so they can patch it?
 
Posted by Dave W. (# 8765) on :
 
Apple has plenty of money - perhaps they should use some of it to buy their own clues.
 
Posted by no prophet's flag is set so... (# 15560) on :
 
quote:
Originally posted by lowlands_boy:
Well well well, they suddenly managed to do it without Apple after all. Why doesn't that surprise me.

Any bets on whether they'll be sharing their solution with Apple so they can patch it?

I expect they successfully copied the phone and did brute force on the copy, then copied again, again and again etc. Brute force 10 times each copy/clone. They could have brute forced millions of copies. Lots of resources means lots of goes at it. Nothing really to share with any company if that's the case.
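A toy model of that snapshot-and-retry idea (pure simulation - it assumes the fail counter lives in copyable storage, and ignores any hardware-entangled keys): restore a pristine copy every ten guesses and the lockout never bites.

```python
import copy

class LockedImage:
    """Toy flash image: PIN check plus a fail counter stored in the image itself."""
    def __init__(self, pin):
        self._pin = pin
        self._fails = 0

    def try_pin(self, guess):
        if self._fails >= 10:
            return False  # locked out, but the counter is part of the copyable state
        if guess == self._pin:
            return True
        self._fails += 1
        return False

pristine = LockedImage("4827")
backup = copy.deepcopy(pristine)  # the copy taken before any guessing starts

found = None
work = copy.deepcopy(backup)
for n in range(10_000):
    if work.try_pin(f"{n:04d}"):
        found = f"{n:04d}"
        break
    if (n + 1) % 10 == 0:  # counter about to bite: swap in a fresh copy
        work = copy.deepcopy(backup)

assert found == "4827"  # the lockout never triggers across the restored copies
```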
 
Posted by orfeo (# 13878) on :
 
Good news for all. Apple can continue to tell its customers that it will never, ever help break their phones, and the authorities can continue to go ahead and do it anyway when legally justified.
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Originally posted by no prophet's flag is set so...:
quote:
Originally posted by lowlands_boy:
Well well well, they suddenly managed to do it without Apple after all. Why doesn't that surprise me.

Any bets on whether they'll be sharing their solution with Apple so they can patch it?

I expect they successfully copied the phone and did brute force on the copy, then copied again, again and again etc. Brute force 10 times each copy/clone. They could have brute forced millions of copies. Lots of resources means lots of goes at it. Nothing really to share with any company if that's the case.
The FBI got pulled up on that in front of some congressional hearing, where they were repeatedly asked why they weren't doing that. It would be pretty time consuming, unless you could then resurrect the data in an emulator or virtual machine environment, in which case you could do it much more quickly.
 
Posted by Honest Ron Bacardi (# 38) on :
 
I doubt if it was done in exactly that way - what the FBI was pulled up on was why they did not back it up to the cloud, where presumably someone could do a brute-force sequence of attempts as no prophet suggests. Whilst they didn't say explicitly, I imagine they let the phone charge die, so on restarting, the option was no longer available. But that's me guessing.
 
Posted by lowlands_boy (# 12497) on :
 
quote:
Originally posted by Honest Ron Bacardi:
I doubt if it was done in exactly that way - what the FBI was pulled up on was why they did not back it up to the cloud, where presumably someone could do a brute-force sequence of attempts as no prophet suggests. Whilst they didn't say explicitly, I imagine they let the phone charge die, so on restarting, the option was no longer available. But that's me guessing.

The cloud option went very early on. Apple did say that against Apple's advice, the FBI changed the password on a cloud account associated with the phone. Apple said that had the cloud password not been changed, the phone would likely have backed up to that, and the rest of the saga wouldn't have arisen...

There are suggestions that there will be a disclosure on the technique.
 
Posted by lilBuddha (# 14333) on :
 
quote:
Originally posted by orfeo:
Good news for all. Apple can continue to tell its customers that it will never, ever help break their phones, and the authorities can continue to go ahead and do it anyway when legally justified.

Yeah. The FBI only doing this when legally justified. Right.
 
Posted by no prophet's flag is set so... (# 15560) on :
 
quote:
Originally posted by lowlands_boy:
The FBI got pulled up on that in front of some congressional hearing, where they were repeatedly asked why they weren't doing that. It would be pretty time consuming, unless you could then resurrect the data in an emulator or virtual machine environment, in which case you could do it much more quickly.

I have put Android phone operating systems in VirtualBox on a computer. Easy to do. It's also easy to copy flash memory. Would it be so different? I don't / won't own any Apple products so don't know, but it's just a flavour of Unix/BSD according to the web.
 
Posted by deano (# 12063) on :
 
quote:
Originally posted by no prophet's flag is set so...:
quote:
Originally posted by lowlands_boy:
The FBI got pulled up on that in front of some congressional hearing, where they were repeatedly asked why they weren't doing that. It would be pretty time consuming, unless you could then resurrect the data in an emulator or virtual machine environment, in which case you could do it much more quickly.

I have put Android phone operating systems in VirtualBox on to computer. Easy to do. It's also easy to copy flash memory. Would it be so different? I don't / won't own any Apple products so don't know, but it is just a flavour of unix/BSD according to web.
I'm hoping the FBI threatened one of the programmers that they would plant a kilo of cocaine on their little brother unless they gave them the algorithms.

I also hope that they post the algorithms on the Internet as a warning to other companies that if they don't cough up the goods when required they will find their secret, proprietary code slathered all over the internet for all and sundry to see.
 
Posted by lilBuddha (# 14333) on :
 
Jesus would be so proud of that sentiment.
 
Posted by Golden Key (# 1468) on :
 
...so I stumbled onto a satirical piece about this, not knowing it was satire. Fortunately, there was a labeled link to more satire, just below the article. Pity it's not for real!
[Snigger]

"Unlocked iPhone Worthless After F.B.I. Spills Glass of Water on It."
 
Posted by lowlands_boy (# 12497) on :
 
So it seems the FBI have now set themselves up as

"unlockers are us..."
 
Posted by chris stiles (# 12641) on :
 
quote:
Originally posted by lilBuddha:
Jesus would be so proud of that sentiment.

It's the banality of authoritarian viciousness - the tactic is always one of threatened sexual violence - often at several removes.
 


© Ship of Fools 2016
