I May Just Have to Buy an iPhone Now

It's interesting how badly they want into this phone, considering they certainly already have all of the calling metadata and have essentially admitted as much by talking about calls that were made. The carriers cough that stuff up without so much as a peep, especially since the fiasco that removed their liability via the FISA courts.

They want something out of an app, one that doesn't talk through a server they can get access to. Otherwise they'd already have that info too.

That could be iMessage itself or another app, but they certainly don't need the call history. They have that.
 
A private company has stated they could hack the phone in less than three weeks, on their own dime. (Reported by ABC News)
Along the same line, what would the potential cost to Apple be, and is the gov't willing to cover it? And if not, what is a valid amount Apple should spend to do this, and what length of time do they have? $1,000 extra? $10,000? $100,000? $1,000,000?
 
It's interesting how badly they want into this phone, considering they certainly already have all of the calling metadata and have essentially admitted as much by talking about calls that were made. The carriers cough that stuff up without so much as a peep, especially since the fiasco that removed their liability via the FISA courts.

They want something out of an app, one that doesn't talk through a server they can get access to. Otherwise they'd already have that info too.

That could be iMessage itself or another app, but they certainly don't need the call history. They have that.

Maybe, but I think they just want the precedent. I doubt there's anything at all on that phone of national security interest. If there were, Farook and Malik would have destroyed it. But it comes in handy to establish the precedent that the government has the right to force companies to undermine their own products' security.

What's ironic is that several times a week I get emails from US-CERT (an arm of the Department of Homeland Security) warning of security holes that have been found in all kinds of software, firmware, and devices. It's actually quite a useful service. It aggregates all the vulnerabilities into one report and includes vulnerabilities in open-source and legacy commercial software that's often used without the benefit of current vendor support.

All of this is done in the interest of protecting national security because the government recognizes that cyber-threats are as real as any other mode of terrorism. But now they want Apple to intentionally create a hole where one doesn't exist, also in the interest of national security.

Surely someone in government understands the absurdity of this request. You don't make society more secure by making software and devices less secure -- and then effectively publishing the fact that the device is less secure by publicly compelling the manufacturer to cooperate in making it less secure. That just tips off the bad guys that the devices can be cracked and to be more careful.

The more intelligent approach for the government would be to buy a couple hundred or thousand iPhones, intercept and analyze the updates, reverse-engineer them, and figure out a way to crack the system that no one else knows about, all in secret and with no fanfare; and then establish plausible deniability by throwing up their collective hands and lamenting that Apple has indeed created a rock so heavy that even Apple can't move it.

Rich
 
THERE IS A COURT ORDER TO CRACK THE PHONE. THAT MAKES IT CONSTITUTIONAL, NO DIFFERENT THAN A SEARCH WARRANT FOR A HOME PC.

I think the analogy breaks down.

With a warrant, they have a right to search your home computer.

The warrant does not compel you to show them where the suspect files are, or, if encrypted, to provide the password.

And in this case, Apple is a corporate third party being mandated to exert effort to break their own encryption. That's where the judge overstepped, I think.
 
Hmmm...

Warrant against whom???

Apple does NOT own the phone... San Bernardino County does..

That entire thing stinks to high heaven... IMHO...

Agreed. However, the iPhone iOS software and any data that comes with the phone is "licensed, not sold" by Apple. How does it change things if the person owns the data, the company owns the hardware, Apple owns the iOS, and the apps are owned by whatever company wrote them? But even then, I'm not sure who really owns the data. Maps are owned by the mapmakers, music is owned by the label (or artist or whatever), my phone activity is owned by the carrier. What part of the phone is truly "mine"?
 
If anyone really believes that the order is for Apple to unlock only this one phone, this just shows the direction they really want to go...

"In New York, the Manhattan district attorney's office said its investigators are locked out of more than 175 Apple devices that could provide crucial evidence in criminal cases."

You do it for one and then the cat's out of the bag.

And just to show how DUMB some people can be... "Apple recommended trying to back up to Farook's iCloud account over the Internet, but investigators could not. Shortly after the attack, a San Bernardino County employee apparently had reset the password remotely. That made it impossible to initiate the auto-backup feature later."
 
What I can't figure out is why Apple didn't keep the whole thing secret, crack it themselves (and not let the Feds have the cracked device at ALL), pull the data, and hand the Feds a USB stick.

Somehow the whole thing got FUBAR somewhere, and my guess is Apple knew the story that they were working with the Feds had gotten out. Then they HAD to say no, because they'd never again sell a phone to any country or person who wanted assurances their data was encrypted on the device.

Maybe that's the part they're not saying here. Maybe it's not encrypted after all and the passcode lock is the only thing standing in the Feds' way, but it'll wipe after ten attempts.

We aren't hearing the whole story here of course, but we can also tell that because the story doesn't make any sense.

This is ALL about politics.

The FBI and DOJ want to trump this up because they haven't gotten a bill passed in Congress to force the manufacturers to provide a back door. This would force it through the courts ("legislating from the bench") and set a precedent for the future. This is a convenient case because it allows use of the "T" word to scare people.

Anyone who doesn't have at least some concern about the ability of the government to break, access or read each and every piece of information about you has forgotten the lessons of history.
 
Along the same line, what would the potential cost to Apple be, and is the gov't willing to cover it? And if not, what is a valid amount Apple should spend to do this, and what length of time do they have? $1,000 extra? $10,000? $100,000? $1,000,000?


Whatever the lawyers cost to litigate all the way to SCOTUS is what they're going to charge.

Maybe, but I think they just want the precedent. I doubt there's anything at all on that phone of national security interest. If there were, Farook and Malik would have destroyed it. But it comes in handy to establish the precedent that the government has the right to force companies to undermine their own products' security.

What's ironic is that several times a week I get emails from US-CERT (an arm of the Department of Homeland Security) warning of security holes that have been found in all kinds of software, firmware, and devices. It's actually quite a useful service. It aggregates all the vulnerabilities into one report and includes vulnerabilities in open-source and legacy commercial software that's often used without the benefit of current vendor support.

All of this is done in the interest of protecting national security because the government recognizes that cyber-threats are as real as any other mode of terrorism. But now they want Apple to intentionally create a hole where one doesn't exist, also in the interest of national security.

Surely someone in government understands the absurdity of this request. You don't make society more secure by making software and devices less secure -- and then effectively publishing the fact that the device is less secure by publicly compelling the manufacturer to cooperate in making it less secure. That just tips off the bad guys that the devices can be cracked and to be more careful.

The more intelligent approach for the government would be to buy a couple hundred or thousand iPhones, intercept and analyze the updates, reverse-engineer them, and figure out a way to crack the system that no one else knows about, all in secret and with no fanfare; and then establish plausible deniability by throwing up their collective hands and lamenting that Apple has indeed created a rock so heavy that even Apple can't move it.

Rich


Excellent points, Rich. Especially the obvious logical one I totally missed: the bad guys would simply have destroyed the device.

You forgot to mention the revolving door of cybercrimes attacking DOI, DOJ, and DOC, reported nearly weekly at this point. US-CERT only publishes the ways the bad guys could get in, or already did. SANS and others publish the actual results.

There isn't a week that goes by in the IT security news rags without another government agency being broken into or otherwise exploited. And those are only the ones they'll admit to.

Someone attempted to flame-broil the poster here who said he wouldn't keep anything truly private on either an Apple or a Microsoft internet-connected system. It was clear from their response that they actually believe those things are in some fashion "secure". There's no empirical evidence that any modern OS actually is secure, at all.

Even with the most up-to-date and fully patched system, it's mostly just luck and a lack of interest that's protecting most folks from someone having full remote access to the entire machine. The holes are way worse than Swiss cheese. Only marketing makes folks think this isn't so. The hard data says you're using an OS that has a large number of exploits, a number that is *accelerating*, without any industry plan to actually write better, not just more, software.

Think journalism in the newspaper era vs journalism today. Cranking out more doesn't make the product better. It makes the product sloppier.

I suspect the folks who *already* know how to crack the phones are enjoying this circus show. And they're out there.
 
The All Writs Act is just flat dangerous overall, if you think about it.

In theory, the precedent set here could apply to all sorts of things, including R&D, weapons, drugs and the like. Think about it....

And to increase the ante, not only are local prosecutors planning to use the same logic/strategy, but some state legislators are debating bills to keep names and any info on law enforcement officers secret.
 
The court order seems to assume that Farook enabled the iOS feature to erase data if someone enters 10 wrong passcodes.

Until this case I hadn't noticed that my phone had that feature. I've never turned that feature on, and I'd bet most other users haven't, either.
 
The court order seems to assume that Farook enabled the iOS feature to erase data if someone enters 10 wrong passcodes.

Until this case I hadn't noticed that my phone had that feature. I've never turned that feature on, and I'd bet most other users haven't, either.

Pretty sure if you've locked the phone with a passcode, that's the default setting.

Never mind. Just checked in "Settings" and that option is NOT on by default - unless I changed it, which I don't think I did.
 
The court order seems to assume that Farook enabled the iOS feature to erase data if someone enters 10 wrong passcodes.

Until this case I hadn't noticed that my phone had that feature. I've never turned that feature on, and I'd bet most other users haven't, either.


On here. Does a better quality wipe than the one on the menu, too, according to the folks that check such things. It's how I wipe any iOS device I am selling.
 
On here. Does a better quality wipe than the one on the menu, too, according to the folks that check such things. It's how I wipe any iOS device I am selling.

Just an FYI....When I was involved in it, a wipe involved every data location being over-written seven times with random data. I think those were DOD standards at the time. Who knows by now.

I'm 100% behind Apple on this one, based on reasons others have already articulated much better than I could.

Jim

Edit- Retired (from that area) in 2008, a lifetime in computer stuff
 
This is ALL about politics.

The FBI and DOJ want to trump this up because they haven't gotten a bill passed in Congress to force the manufacturers to provide a back door. This would force it through the courts ("legislating from the bench") and set a precedent for the future. This is a convenient case because it allows use of the "T" word to scare people.

Anyone who doesn't have at least some concern about the ability of the government to break, access or read each and every piece of information about you has forgotten the lessons of history.

Exactly.
 
 
Just an FYI....When I was involved in it, a wipe involved every data location being over-written seven times with random data. I think those were DOD standards at the time. Who knows by now.

I'm 100% behind Apple on this one, based on reasons others have already articulated much better than I could.

Jim

Edit- Retired (from that area) in 2008, a lifetime in computer stuff


The seven-pass wipe is considered weak now on traditional drives, and doesn't work at all with certain drives (especially SSDs), since the drive's controller remaps where data is actually written and block-level instructions from the OS are ignored. The only approved erasure method for an SSD is physically shredding the disk.
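For anyone curious, the classic multi-pass overwrite can be sketched in a few lines of Python. This is just an illustration of the old approach, and for exactly the reasons above it offers no guarantee on an SSD or flash device, where the controller may remap every write:

```python
import os

def multipass_wipe(path, passes=7):
    """Overwrite a file with random data several times, then delete it.
    Sketch of the old DoD-style multi-pass idea. On SSDs and flash,
    the controller remaps writes, so the original cells may never be
    touched -- this is NOT a reliable erase on such hardware."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # force each pass out to the device
    os.remove(path)
```

Even on spinning disks this only hits the file's current blocks; journaling filesystems and bad-sector remapping can leave stale copies elsewhere, which is why shredding the media is the only sure method.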

Apple's iOS wipe is not seven-pass. It's likely data could be recovered from a "wiped" iPhone, which means they're lying to us about that, also. The whole story doesn't make sense and is full of lies.

I wouldn't "back" anything on this one yet, if I were you. The more that comes out, the more this looks political, not technological.

We're all being played for some reason.

There was no technical reason this ever had to go to court, other than precedent. It's a flat-out lie that they "can't get the data".

Even the Apple data wipe isn't good enough to protect the data, as you pointed out.

Even if the chips are tamper proof, potted, exceedingly small, whatever, the electronics themselves can be directly attacked to get the data. Especially the flash storage. The tech to do that exists and is expensive, but it's not a particularly hard tech challenge.
 
The seven-pass wipe is considered weak now on traditional drives, and doesn't work at all with certain drives (especially SSDs), since the drive's controller remaps where data is actually written and block-level instructions from the OS are ignored. The only approved erasure method for an SSD is physically shredding the disk.

Apple's iOS wipe is not seven-pass. It's likely data could be recovered from a "wiped" iPhone, which means they're lying to us about that, also. The whole story doesn't make sense and is full of lies.

I wouldn't "back" anything on this one yet, if I were you. The more that comes out, the more this looks political, not technological.

We're all being played for some reason.

There was no technical reason this ever had to go to court, other than precedent. It's a flat-out lie that they "can't get the data".

Even the Apple data wipe isn't good enough to protect the data, as you pointed out.

Even if the chips are tamper proof, potted, exceedingly small, whatever, the electronics themselves can be directly attacked to get the data. Especially the flash storage. The tech to do that exists and is expensive, but it's not a particularly hard tech challenge.

Apple doesn't have to try to overwrite the entire SSD to wipe it. It wipes so fast because it securely trashes the keys to the encrypted data.
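That crypto-erase idea is easy to demonstrate. The toy sketch below uses a SHA-256 counter-mode keystream purely for illustration (real devices use hardware AES, not this); the point is that once the key is destroyed, the ciphertext still sitting on the flash is useless:

```python
import hashlib
import os

def keystream(key: bytes, length: int) -> bytes:
    """Toy keystream: SHA-256 in counter mode. Illustration only."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

device_key = os.urandom(32)  # stand-in for a per-device key
stored = xor_crypt(device_key, b"contacts, messages, photos")

# "Wiping" the device = destroying the key, not scrubbing the flash.
# The ciphertext is still physically there, but without the key it is
# computationally indistinguishable from random noise.
device_key = None
```

Destroying a 256-bit key takes microseconds, versus hours to overwrite tens of gigabytes of flash, which is why the erase is nearly instant.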
 
Apple doesn't have to try to overwrite the entire SSD to wipe it. It wipes so fast because it securely trashes the keys to the encrypted data.


I assume you're talking OSX and not iOS. iOS has a mix of encrypted and unencrypted data AFAICT.

(It's also flash and not modeled as an SSD in the chipset which may be pedantic, but wiping them is handled differently at the low level.)

OS X full disk encryption is a different beast than iOS encryption.

But you've done more iOS Dev than I have so you probably know more about it.

We've come back around to Rich's statement that if there were really anything bad on the phone the bad guys simply would have destroyed it.

The story, as being told to the press, is a lie. How much of a lie and why, we don't get that info. If one prefers "carefully stated" or "only need to know" as a way to describe it, fine, but definitely not the full story.

We're getting what FBI PR wants us to hear and have been told to say. They aren't truly and completely locked out of this phone. Not if they apply the right tech and are willing to wait to crack the encryption.

They have the CPU/GPU horsepower. Someone doesn't want to let FBI use it. Which may be the really interesting part of the story.
 
The essence of what this boils down to is this:

1) It's not a Fourth Amendment case. The Fourth Amendment covers unreasonable searches and seizures. Since a warrant was issued, and the owner of the phone has consented, it's not unreasonable.
2) It MAY be a First or Fifth Amendment case. Can the government compel disclosure of an encryption key or security code? Maybe. If the individual were still alive, it would be arguable that he could not be forced to disclose the key. Apple? Maybe or maybe not. Software is typically covered by the First Amendment, which would be the argument against forcing someone to write specific new software. But AFAIK, there's not much case law in this area.
3) Is part of this a marketing statement by Apple? Yes.
4) Does this open a security hole for the product or put Apple in a pretty difficult position dealing with various governments? Yes. The mere fact of disclosing that the units can be cracked is enough to encourage hackers and/or foreign governments. It could encourage bans of the equipment in other countries unless the crack method is disclosed. And it certainly sets a precedent in the US of requiring disclosure for whatever reason the secret national security court decides, even if the justification is weak. Do you trust your government, hackers, and other governments?
5) Is the FBI pushing this because they can't get legislation through Congress? No doubt about it. This is truly creating new law via the justice system. It sets a troubling precedent.
6) Could the problem have been avoided? In this case, yes. If the county had installed and properly managed MDM software, this likely would not have happened. And the reset of the iCloud password was really not smart.
7) Will this prevent smart terrorists in the future? Not a chance. With packages like CyanogenMod on Android, one can install an OS that might contain encryption and wipe features that no one else can break. TrueCrypt used to have that feature for hard drives on desktops and laptops; although TrueCrypt has been withdrawn (under mysterious circumstances), it is certainly feasible.
8) Whether or not Apple can crack this is not certain at this point. The "fail counter" would probably sit outside the encrypted data, and therefore might be vulnerable, but if the user employed a strong passphrase the phone might still not unlock.
9) If Apple designs this and it doesn't work (the memory gets trashed), who will be held responsible? I wouldn't want to touch the coding on this one without clear indemnification.
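On point 8, the arithmetic is worth spelling out. Apple's iOS security documentation has cited a key-derivation delay of roughly 80 ms per passcode attempt, enforced in hardware; taking that figure as an assumption, here's a back-of-the-envelope estimate of brute-force time once the retry limit and escalating delays are out of the way:

```python
# Assumes ~80 ms per passcode attempt, the figure Apple's iOS
# security documentation has cited for its key derivation.
SECONDS_PER_TRY = 0.08

def worst_case_hours(digits):
    """Worst-case time to try every numeric passcode of this length."""
    return (10 ** digits) * SECONDS_PER_TRY / 3600

print(f"4-digit PIN: {worst_case_hours(4):.1f} hours")  # roughly 13 minutes
print(f"6-digit PIN: {worst_case_hours(6):.1f} hours")  # roughly a day
```

A long alphanumeric passphrase pushes the same math out to centuries, which is why defeating the fail counter alone might still not open the phone.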

The case is not simple, but the key to this is the Government wanting to set precedent to get around the legislative process, and Apple wanting to protect its reputation for both liability and security reasons....

this cartoon from Toles sums it up (at least as far as Google and MS are concerned, and to a lesser degree Apple): http://www.gocomics.com/tomtoles/2016/02/21
 
Suppose for a second you had a company that made safes.. and they designed a new kind of safe that could not be opened by any currently known methods for breaking into safes without destroying the contents.

Would anyone think it's reasonable of the government to demand the safe company find a way to break into their safe? Asking nicely for help sure, but demanding it, making it a court order?

It's not like Apple has a key and is not giving it up. They would need to craft a tool that doesn't currently exist to do this... I don't see where the government has any authority to force a company to create something like that.
 
Suppose for a second you had a company that made safes.. and they designed a new kind of safe that could not be opened by any currently known methods for breaking into safes without destroying the contents.

Would anyone think it's reasonable of the government to demand the safe company find a way to break into their safe? Asking nicely for help sure, but demanding it, making it a court order?

Nice job - very similar to the analogy I thought of!
 
6) Could the problem have been avoided? In this case, yes. If the county had installed and properly managed MDM software, this likely would not have happened. And the reset of the iCloud password was really not smart.

Nice summary. New news though.

The FBI urged the County to change the password. So basically they're utterly incompetent. They HAD THE PASSWORD up until that point. And it wasn't just a County wonk that mismanaged their mobile management software. The phone was already in FBI custody.

http://www.buzzfeed.com/johnpaczkowski/apple-terrorists-appleid-passcode-changed-in-government-cust
 
Nice summary. New news though.

The FBI urged the County to change the password. So basically they're utterly incompetent. They HAD THE PASSWORD up until that point. And it wasn't just a County wonk that mismanaged their mobile management software. The phone was already in FBI custody.

http://www.buzzfeed.com/johnpaczkowski/apple-terrorists-appleid-passcode-changed-in-government-cust

WOW..... Just WOW.....

Just when I thought the guvmint was partially competent.... They prove me wrong..
 
Suppose for a second you had a company that made safes.. and they designed a new kind of safe that could not be opened by any currently known methods for breaking into safes without destroying the contents.

Would anyone think it's reasonable of the government to demand the safe company find a way to break into their safe? Asking nicely for help sure, but demanding it, making it a court order?

It's not like Apple has a key and is not giving it up. They would need to craft a tool that doesn't currently exist to do this... I don't see where the government has any authority to force a company to create something like that.
If the FBI had reason to believe they could stop mass murder by accessing the contents?

Yes, totally reasonable.
 
Did you read what Nate just posted????

No.. Hit the next unread post icon and it brought me to that. I'll backtrack.

Ahh... I didn't pay it any notice because... so what? Even if true, that would mean they screwed up, but it doesn't change the premise of the case.
 
They have the CPU/GPU horsepower. Someone doesn't want to let FBI use it. Which may be the really interesting part of the story.
Exactly. The President uses a BlackBerry because the iPhone wasn't deemed secure enough. Yet the FBI, with all their resources, supposedly can't hack a consumer phone. It might take them months to somehow brute-force their way into it. If they did succeed, they would be trapped by their "just this one phone" BS. It would take them just as long to hack the next one. What they really want is a manufacturer-designed, easily repeatable process.
 