
This is incredibly naive, bordering on puerile. To suggest that POTUS' view on this is without nuance is to miss his point. POTUS went on to cite existing warrant mechanisms and their underlying principle:

"And we agree on that, because we recognize that just like all of our other rights ... that there are going to be some constraints we impose so we are safe, secure and can live in a civilized society."

I'm not suggesting that this means POTUS and the government have the right answers at the moment. Despite that, we can't ignore the important role law enforcement plays in society, the requirements in support of their role, and the complexities surrounding the right to privacy. We need people advocating for the right balance, not just getting each other frustrated.

OP may have worries other than US law enforcement being from another country. This is one of the /many/ complexities in this space.



> We need people advocating for the right balance, not just getting each other frustrated.

I don't understand this line of thinking regarding the big encryption debate. Yeah, it sounds good if you don't know anything about encryption. "Look, he's being reasonable. Why is everyone saying he's wrong instead of striking a balance?"

The problem is it's a binary problem. There is zero way to strike any type of balance here. You either encrypt communication to prevent others from accessing it or you provide a backdoor or escrow key to access the data in which case all encrypted communications are now insecure.

There isn't a middle ground there. If there is I have yet to see anyone suggest it and I can't think of a technical way to do it. Too many attack vectors when you purposefully impose multiple party keys or backdoors.
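To make the "no middle ground" point concrete, here's a toy sketch (pure Python, a one-time pad standing in for a real cipher; the names and message are made up). The moment an escrow copy of the key exists, whoever holds the escrow store can read every message, so the security of all traffic collapses to the security of that one store:

```python
import secrets

def encrypt(message: bytes) -> tuple[bytes, bytes]:
    """One-time-pad encryption: returns (ciphertext, key)."""
    key = secrets.token_bytes(len(message))
    ciphertext = bytes(m ^ k for m, k in zip(message, key))
    return ciphertext, key

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

# Without escrow: only the sender and recipient ever hold the key.
ct, key = encrypt(b"meet at noon")

# With mandated escrow: a copy of every key goes into one store.
escrow_store = {}
escrow_store["msg-001"] = key

# Anyone who compromises the store decrypts everything in it,
# warrant or no warrant -- the communication is only as secure
# as the escrow store itself.
print(decrypt(ct, escrow_store["msg-001"]))  # b'meet at noon'
```

The scheme itself still "works"; the attack surface just now includes every party (and every leak) between you and the escrow store.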


In that case, perhaps the balance we need to strike is rhetorical, not technical.

I think some people in the tech world are 4th-amendment absolutists. That's not a new thing; in Cryptonomicon there was the electromagnetic doorframe so strong that it would wipe drives passing through it. I don't know that it's a position I hold myself, but it's one I definitely respect.

But we have a multi-century history of the government being able to read people's documents once they have a warrant. A lot of crime-fighting makes use of this. Whatever the intricacies of the technology, there's a status quo that I think we should at least acknowledge.

So I think the rhetorical balance we need to strike as an industry is along the lines of, "Yes, we totally get why you would want to get into that terrorist's phone. We wish we could find a way to give you access to just that without compromising everything. But it's not like a wall we can put a special door in. It's like a balloon. The moment it loses integrity, it's worthless. We want to help you, but we can't."

When the message instead comes across as absolutist, I think we're in danger of losing public support. Without that, we'll have a very hard time resisting government demands for access.


Your message can basically be distilled into "it's absolutist, but we can't let it come across that way or it'll hurt our PR".

I'm not sure trying to increase the complexity of the issue just to make it sound like the technology community doesn't see this as binary is a good solution. I think what we need to do is educate people about how encryption works.

If we're wishy-washy about the whole thing it may come off as something we can possibly do in the end. I think education is the only way. But that's just my opinion.


Not quite.

Some people on this are absolutists, and some aren't. If we could create a safe back door, some would be in favor of doing that, and some wouldn't. That's distinct from us not having a way to create a safe back door.

One problem I see us having (and I've seen this on Facebook) is that some people think that we are saying we can't create a safe back door because we don't want to. And I think they're getting that notion because of the people in the absolutist camp.


> If we could create a safe back door, some would be in favor of doing that, and some wouldn't. That's distinct from us not having a way to create a safe back door.

I've seen a tiny handful of comments like that and other comments about how they're fine if it isn't completely safe in order to "protect the country". Is there any data to even know for sure what the majority of common people think about encryption and backdoors?

I'm not convinced a near majority simply thinks that SV just "doesn't want to". Well, except maybe in the current Apple vs. FBI case, as the FBI and some of the media are kinda shouting that. Perhaps that'll sway the tide and make more people think that way, however?

Either way I still think trying to educate the populace is the best way to undermine the entire idea of escrow keys / backdoors.


I don't think a "near majority" needs to think that for this to be a useful approach.

When people can't engage on the substance, they fall back on human heuristics. E.g., bedside manner is important in doctors because people will generally stop listening to doctors they don't find trustworthy.

You're welcome to try to educate the populace on the subtleties of crypto. My guess is that the number of people who will sit still for that is pretty small, but maybe I'm wrong. In case I'm right, though, I think a useful backup would be displaying empathy for the (reasonable) societal goals that are driving this push for back doors.


I didn't think I'd come back but I liked your comment and wanted to rebut.

There is a middle ground and where it lies is directly related to the assessed risk. To support a mechanism is a binary choice but its design is not.

In the San Bernardino example, an unexpected potential adversary (Apple) has emerged. An unexpected vector is being discussed which has nothing to do with the encryption mechanism itself. Perfect cryptography doesn't imply a perfect cryptosystem and it sure doesn't imply perfect privacy.

The balance that needs to be found is around the assistance which is rendered to warranted agencies to enable their access to data. IMO vendor supported key material attacks are not an unreasonable solution, but obviously that mechanism can go horribly wrong if the software is leaked or not adequately designed. The same can be said for signed updates in general, though. This is a hard technical problem, but it's not a binary choice where the agency either has some magic key they can use wherever they want or has nothing. If they can be enabled on a case by case basis such that it is cost effective for them to do the rest, that might be enough.

Obviously this is all based on the assumption of a robust local judicial process. However, whatever decision making process occurs needs to take into account global implications. Other commenters here and elsewhere have mentioned what this could mean for agencies in other countries where judicial process may be less robust. Many have commented about a lack of transparency in our own. This is a separate but very closely related concern. There are also considerations for the open source community and how precedent like this might be applied to them. It's a hard problem but it's not binary.


I think this is an inadequate rebuttal. You're not being specific enough about what you mean by "its design is not [a binary choice]." Yes, it really is. Either the cryptographic algorithm is broken or it's not; either the passphrase is known by 2+ parties or it's not; either there are backdoors, or there aren't. If no, no, and no, then the ciphertext is not determinable by the government if the only source for the passphrase can't or won't give it up.

It's also a binary condition whether a company should produce an unsafe fork of their own software and make it available to any government. If they should, then in effect all governments will use it against their adversaries, foreign and domestic.


> The balance that needs to be found is around the assistance which is rendered to warranted agencies to enable their access to data. IMO vendor supported key material attacks are not an unreasonable solution[...]If they can be enabled on a case by case basis such that it is cost effective for them to do the rest, that might be enough.

Why is it not unreasonable? You're compelling companies to write custom software explicitly for law enforcement use. Let's say you own a company that creates product X. Now the FBI is compelling you, through the court system, to develop a completely custom version of product X to let them access it or use it in some way. Sure they're paying you but you now have to redirect resources to handle this and you likely hired people to do things other than work on a government project. This ultimately can hurt your company and its market position simply by consuming your resources.

This is unprecedented outside of wartime. It's hard for me to imagine anyone being okay with compelling a company to become a government contractor.

> It's a hard problem but it's not binary.

It absolutely is. The "middle ground" you pointed to isn't a middle ground; it's unconstitutional. I have yet to hear a technically viable AND constitutional "middle ground" in this debate.


> vendor supported key material attacks are not an unreasonable solution

Did you know encryption technology is not limited to Apple? It's not even limited to big companies. Anyone with a book or an internet connection can implement encryption.

This isn't plutonium, which is hard to come by. Any law that puts a backdoor in phones' encryption schemes will fail to stop criminals from hiding their communications. They'll simply use another device or download a different app that does give them encrypted communications.

You and Obama are both correct to point out we should be seeking a balance to maintain our security. However, your calculation of that balance is missing some key elements, such as the one I mention above, which is that this is a gigantic game of whack-a-mole. It's the biggest one you could ever play. There's no way you can censor the internet or ban certain kinds of software. It's the same as censoring speech. Encryption is just code. The policy Obama is implicitly proposing simply won't work.

I can see you've thought about this so I'll give you one example where I think we should unlock a phone.

In the movie-like scenario where there's a nuclear weapon hidden somewhere, and an encrypted phone holds its location, then I would expect the NSA and FBI and every computer in the world to sic their processors on copies of that phone's data in an attempt to break the encryption. I believe this is already described as something the NSA has in place with a program called Bullrun. It seems likely to me that the FBI is unable to make use of this technology because decryption is so laborious, even for the mass of computers that the NSA has built up, and even for all of Google's machines. They just don't have enough to decrypt every criminal's phone. But I believe they could decrypt the movie-like-scenario nuclear-weapons-hiding phone.

So we're covered in the most extreme case. But in reality there are all kinds of ways to discover someone's password, and you don't always need a huge server farm to get at it. For example, software has weaknesses that are only known to the NSA. No software is 100% impregnable. Even Snowden admits this.

We've existed as a society for a long time without encryption. Bad actors who collaborate exclusively online using encrypted technology, and who never meet in person, are few and far between.

Unless the US government has some major secrets to reveal about the balancing factors of maintaining public security, then on balance, trying to force backdoors into phones will make us all less secure. This forced weakness will be exploitable by every hacker in the world, at no cost, and from positions which are not governable. And the phone manufacturer will be required to keep that weakness as-is because to fix it would be to break the law.


I don't believe that Obama himself has such a blunt understanding of this issue, but I can't know what is truly in his heart or head, and the view promoted in his statements here is utterly oversimplified and purposely lacking in nuance. Glossing over the impossibility of providing security only against "bad guys" while leaving access open to "good guys" is an intentionally low-resolution view of the issue, intended to make refutations based on that impossibility sound like nerds nitpicking implementation details.


I don't think whether he understands it really matters. I think even if he were the world's foremost security expert, his message would be the same as it is now.


But there is no "balance" when something is binary. You either have encryption or you don't.


So Apple either designs their phones such that they themselves can't hack them, or they have to comply with government court orders.


Sounds binary to me.


What about disabling the bricking mechanism that would have allowed the FBI to brute-force the PIN?
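For a sense of scale, here's a toy sketch of why that mechanism matters (hypothetical parameters: PBKDF2 with a deliberately low iteration count, a made-up device salt and PIN). A 4-digit PIN has only 10,000 candidates, so once the auto-erase/retry limit is out of the way, exhaustive search is trivial:

```python
import hashlib

# Hypothetical phone: unlock key derived from the PIN with PBKDF2
# and a device-unique salt. Iteration count kept low for the demo.
salt = b"device-unique-salt"

def derive_key(pin: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 1_000)

stored_key = derive_key("4831")  # the user's (secret) PIN

# With the bricking / auto-erase mechanism disabled, all 10,000
# candidate PINs can simply be tried:
recovered = next(
    f"{n:04d}" for n in range(10_000)
    if derive_key(f"{n:04d}") == stored_key
)
print(recovered)  # 4831
```

The cryptography is never "broken" here; the rate-limiting around it is. Which is the grandparent's point: the encryption algorithm is only one part of the system.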


If the San Bernardino case proves anything, it's that encryption is not the whole picture.


So far, my takeaway from the federal government's reaction to the San Bernardino attacks is the following prioritization of freedom:

Higher priority freedoms:

1. Freedom of religious choice and freedom from religious persecution. Freedom of religious assembly.

2. Freedom to immigrate to the US.

Lower Priority Freedoms:

1. Freedom to purchase guns and ammunition

2. Freedom to use encryption to keep personal communication private


That's wrong.

The government isn't saying that no one should be able to encrypt data. They just think that, in some cases, it should be done in such a way that certain data can be read by use of a centralized master key.

This, of course, presents certain problems. What if that master key leaks?

But it's not binary. Data secured this way is absolutely safer than not encrypting something at all. "Is it safe enough?" is certainly a question that people can disagree on.

It's not binary.


The fundamental issue is that any back door accessible to "the good guys" will also be accessible to sufficiently-technically-advanced "bad guys". And from the perspective of securing the United States' defense infrastructure, major corporations, etc., there are multiple sufficiently-technically-advanced "bad guys" in the world already (i.e., certain major foreign governments and their intelligence/defense complexes).

So if you're doing US national security, you cannot accept the Obama position; any technology to which our "good guy" law enforcement could get access is guaranteed to also be a technology to which a "bad guy" foreign enemy combatant could similarly gain access.

If you're a major US corporation, you cannot accept the Obama position; any technology to which our "good guy" law enforcement could get access is guaranteed to also be a technology to which a "bad guy" foreign intelligence/espionage agent could similarly gain access.

So as far as that is concerned, it is binary. You can either be secure against foreign threats, in which case you are also as a side effect impenetrable to the local "good guys". Or you can be penetrable by the local "good guys" and also as a side effect penetrable by the foreign "bad guys".

The argument advanced against Obama's position is that A) it fundamentally fails to acknowledge this reality, and B) seems to argue for deliberately compromising national security in order to... preserve national security? In other words, it is not only unrealistic, it is in fact nonsensical and self-contradictory.


We're talking about iPhones here. Not the nuclear launch codes. Maybe you have information on your phone that the KGB is interested in, but most people don't.

Again, security & encryption aren't binary. It isn't on or off. Different levels of protection are appropriate for different levels of threats. The lock on my mom's suburban house is a lot weaker than the lock on the vault holding the gold at Fort Knox.

Similarly the kind of security appropriate for consumer grade cell phones might be different than the kind appropriate for guarding US national security secrets.


> Maybe you have information on your phone that KGB is interested in, but most people don't.

You have a very narrow view of what information is sensitive. Do you know how many Americans work either directly in, or in industries associated with, defense? How many of them use Apple products as part of their work?

Or let's take a more down-to-earth example: I work for a company in the health-care space. We have health records on thousands of people, and as we expand we'll have more health records on more people. With your approach, where should we draw the line? Would it be OK to use security bad enough that, say, an Eastern European organized-crime ring could break in? Or do we need to defend against them and not against the KGB? How do we figure out what level of tools to use to do that? How do we figure out whether what we have now will still be mafia-proof in five years?

Oh, and keep in mind that legally we have to protect that data, and improper access to it can not just shut down the company but send me and my co-workers to jail. How much would you be willing to compromise with that on the line? If the potential consequences to you were jail time, how much on the side of Obama and the FBI would you be? How much risk of that would you take on in order to make law enforcement's life a little bit easier?

And in case you think this is a red herring, remember the San Bernardino shooter... worked for a public-health department.


If I can gain access to your customer's data by gaining access to a cell phone owned by one of your employees then you are doing something deeply, deeply wrong.

You should fix that.


Who says it's only about phones?

The fact that it's an iPhone in the San Bernardino case is kind of a red herring; the precedent it would set is not "iPhones and only iPhones and not any other type of computing device are subject to law enforcement unlocking". The precedent it would set is that any encrypted system must be built to be decrypted on demand by law enforcement.

And that gets back to my original comment: a system the "good guys" can decrypt is one the "bad guys" will be able to decrypt too.


Slippery slope arguments are lame when they start to go like this.

Like, I get you. By allowing X which might actually be OK, we might allow X+1 to happen and that wouldn't be OK. So we must oppose X as a practical matter.

I get the practicality of that.

But when you start opposing X as a matter of principle rather than simply as a matter of practicality that's when you've gone off the rails.


Well, for one thing it's not even a slippery slope. They want to unlock his work-issued phone, not his personal phone, and the FBI, along with other law-enforcement agencies, admits to having an almighty queue of other things they'd like to get force-unlocked if they can only get precedent establishing that companies do indeed have to circumvent their own products on demand.


And you presume that compromising the iPhones belonging to the men and women who do control the nuclear launch codes wouldn't serve to compromise everything they control, by extension? Either by using the data as a lever against the people, or directly snooping, this would be a very powerful opening, and your position seems to defy all common sense.


Obama used the word "fetishizing" for a reason. It's a pretty unusual word to use don't you think? It was carefully chosen though I'm sure.

Comments like this are exactly why he used it.


Yes, he did use it for a reason. He used it to make it seem perverse that you would want the information on your phone to be secure. He is trying, once again, to put forth the government's position that if you're not trying to hide something, you have nothing to fear.

And quite frankly, yes, I do hold the security and privacy of data as more important than the terrorism song and dance the government is playing. And I do hold that view absolutely.


wow, that's the best writing on this whole thing I've read. well done. I tweeted it https://twitter.com/andrewarrow/status/709158302127509504


Thank you! I really should have put a lot more into this, because I have a ton of examples. Really need to do a follow up


It's actually less safe. People will think their data is secure, but it won't be.

The government always leaks keys. Sometimes literally.

http://www.wired.com/2015/09/lockpickers-3-d-print-tsa-lugga...

http://nypost.com/2015/09/20/the-8-key-that-can-open-new-yor...

Plus, any key that the government can use without your permission, they can illegally abuse. And if there's one thing we've seen proven recently, it's that such abuse will occur.


It's worth noting that in this case we're only talking about keys that are useful to someone in physical possession of a phone. So as long as I retain physical possession of my phone no one can get at it, keys or not.

That's still pretty safe.


Until you lose it or it's stolen.


wow, when you put it that way it's like saying "either I have the right to own a gun or I do not." So I guess if people accept limits on gun ownership they should accept limits on encryption cuz it's just as dangerous?


Actually no. What I'm saying is that you do have the right to own a gun, but there are limits on gun ownership.

  * Minors cannot buy guns.
  * Convicted felons cannot buy guns in most circumstances.
  * There are numerous locations in which it is illegal to bring a gun.
  * There are numerous types of guns that you are not allowed to own.
It's not binary.


it's interesting that people on the LEFT want NO LIMITS on encryption but LOTS OF LIMITS on guns. Is encryption more or less dangerous than guns?


To me personally the concept of even owning a gun is utterly ridiculous, but I guess American people would be incredibly upset if every gun had to come with a built-in remotely controlled switch that could be used by the police to disable it. This is essentially what they are trying to do with encryption - you are free to use it, but we want to be able to manually switch it off if we want to.


Encryption is a measure that prevents unauthorized parties from accessing what is encrypted. If it fails at that on such a basic level that a specific unauthorized party can trivially access the data, it's not effective encryption. Whether or not it is effective, then, is just a matter of whether you agree that whoever holds the master key is authorized to access whatever it is that you encrypted.

It's still not broken in any technical sense, but if I don't want a third party to read my data and they are still able to do so, it's a stretch to call it encryption.


I don't think that's true at all. If you use gmail your data is pretty safe. Unless you tell me your password I'm going to have a pretty hard time reading your mail. It's encrypted at rest on Google's disks. It's encrypted in transmission over the internet. Google's network is protected with various forms of encryption and security that makes it hard for an unauthorized party to get into.

But Google itself can access and read your email. It does so in an automated fashion to target ads at you.

So is your gmail encrypted? Well...kind of.

It's not binary.
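The gmail distinction can be made concrete with a toy sketch (pure Python; XOR with a random key stands in for a real cipher, and all class and method names are made up). Encryption at rest with a provider-held key stops outside attackers but not the provider; end-to-end encryption with a user-held key stops both:

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for a real cipher; don't use for anything real.
    return bytes(d ^ k for d, k in zip(data, key))

class ProviderEncryptedMailbox:
    """Encrypted at rest, but the provider holds the key (gmail-style)."""
    def __init__(self):
        self._key = secrets.token_bytes(64)
        self._stored = None
    def store(self, plaintext: bytes):
        self._stored = xor(plaintext, self._key)   # opaque on disk
    def provider_scan_for_ads(self) -> bytes:
        return xor(self._stored, self._key)        # provider can read it

class EndToEndMailbox:
    """The key never leaves the user; the provider stores ciphertext only."""
    def __init__(self):
        self._stored = None
    def store(self, ciphertext: bytes):
        self._stored = ciphertext
    def provider_scan_for_ads(self):
        return None  # provider holds no key; ciphertext is all it sees

user_key = secrets.token_bytes(64)

gmailish = ProviderEncryptedMailbox()
gmailish.store(b"my private mail")
print(gmailish.provider_scan_for_ads())  # b'my private mail'

e2e = EndToEndMailbox()
e2e.store(xor(b"my private mail", user_key))
print(e2e.provider_scan_for_ads())       # None
```

Both mailboxes are "encrypted"; who holds the key determines who can read the mail. That's the degrees-of-protection point in miniature.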


The difference is that by using Gmail you agree with Google's terms of service, which state somewhere in them that they can target ads at you.

With a government-held master key, there is no such contract.


Sure there is. We have a system of laws and courts that define the contract.

And besides, the existence of a contract or not isn't really my point. My point was that encryption isn't an all or nothing situation. You don't either have completely ironclad security where you and only you can access your data or nothing at all.

There are degrees of technical protection. That was my point. The black and white thinking about this issue is wrong.


> Sure there is. We have a system of laws and courts that define the contract.

Isn't the point of discussing this that you actually don't have anything that defines that contract? In terms of the fourth amendment, it seems like there is quite the opposite contract in place already. Also, as an Apple product user outside the U.S., this supposed contract is not something that I ever agreed to.

> And besides, the existence of a contract or not isn't really my point. My point was that encryption isn't an all or nothing situation. You don't either have completely ironclad security where you and only you can access your data or nothing at all.

It is an all-or-nothing situation. That there is only one unauthorized party able to intercept my data using a back door doesn't change the fact that there is an unauthorized party able to intercept my data using a back door. Whether that is practical for you or not, again, depends on whether you consider the back-dooring third party authorized.

> There are degrees of technical protection.

I think that maybe the discussion here comes from the fact that you are indeed talking about "technical protection" in general and not encryption specifically. The point of encryption is that only eligible parties can decode the data. If others can decode the data it's not really encrypted, as a matter of definition.

For example, using Google as a mail provider may prove to be poor "technical protection", but the data is encrypted and the authorized parties are part of the terms of service. That the U.S. government can decode my encrypted data without permission may prove to be practical "technical protection", but in terms of encryption it really is a boolean situation.


no way man. either I have the right to use technology to make something encrypted (my definition of encrypted is cannot be broken) or I don't.


There's no award for effort. Apple built a cryptosystem that doesn't work against an adversary that is Apple. "Encrypting" isn't some magical property that makes your data safe. The encryption itself is just one aspect of a whole system that must be designed to protect against certain adversaries with certain capabilities. The government would be totally allowed, for example, to tamper with your wax seals if it had a warrant to read your mail.

I lied, there is an award for effort. Wherever you have a "reasonable expectation of privacy", the government must get a warrant or your permission to exercise its search and seizure authority. That's the protection you're entitled to by our laws.

Even the EFF is acknowledging the weakness of your argument when they try to frame it as an issue about Apple's free speech instead.



