Governments know that the police sometimes go rogue and don't care. It can be cleaned up when they do.

Tech firms also know that programmers can make mistakes whilst implementing complex cryptography, or even be corrupted, yet this is not itself an argument against implementing cryptography!

To repeat once again, we're not debating the ethics of E2E encryption here. Please don't waste time trying to convince me that E2E encryption is a good idea, because (if it was real and worked) I'd agree with you! But your argument is a Type 1 No by the scheme presented in the article. It is a "we'd really rather not" social argument.

The problem our industry has is typified by the article. Too many tech people argue that giving police access to WhatsApp encryption is actually a Type 3 "it can't be done for fundamental physical reasons so it doesn't matter who demands it" problem, but that isn't true. Remember that governments don't care about E2E encryption at all. They would much rather ban it as a source of unnecessary problems. If tech firms claim they can't turn it off completely, they're obviously lying, and that will just enrage governments. If tech firms claim they can't keep it whilst providing targeted access, governments don't care about that either. After all, email isn't end-to-end encrypted, nor is SMS, nor are normal phone calls, nor are letters. Why should WhatsApp be any different?

In reality it actually is possible to design a system that stops people with only server-side access to WhatsApp reading messages, whilst still breaking if the clients are compromised, and which allows police to have targeted levels of access without any risk of universal master key leaks. There are lots of ways to do that. You can use secure enclaves, zero knowledge proofs, or more exotic algorithms. But it's also not really relevant to the point I'm making, which is about the No Type being presented to governments. There was surely a better example that could have been chosen for a Type 3 No.



> Too many tech people argue that giving police access to WhatsApp encryption is actually a Type 3 "it can't be done for fundamental physical reasons so it doesn't matter who demands it" problem, but that isn't true.

This is moving the goalposts.

Giving police access to WhatsApp chats is trivially easy. But that's not the question.

The pro-surveillance people say "Give the good-guy police (whoever they are) access to everything and keep it secure from the Bad Guys Whom We Oppose (whoever they might be this week)". That one is indeed impossible due to the laws of information theory.

> without any risk of universal master key leaks

You're looking at a technical problem when this is not that, it is a humanity problem.

It's relatively easy to avoid e.g. master key leaks. That's an irrelevant implementation detail. What matters is that if some set of people have unfettered access to bypass all protections, then all the Bad Guys will also have that access soon enough because you can't keep people from getting corrupted/threatened. No matter how hard you wish, you can't. Humans are like that.


> That one is indeed impossible due to the laws of information theory ... it is a humanity problem

So, is this impossible due to human nature or the "laws of information theory"? Which is it? And if the latter what "law" are you thinking of, exactly? Can you name these laws?

Here's the problem: it's neither impossible mathematically nor practically. Remember the kerfuffle over the NSA's backdoored Dual_EC_DRBG algorithm? That was a very pure textbook example of what's possible: it would have allowed the NSA, and only the NSA, to decrypt TLS streams that used it. According to you such a design would be physically impossible because it violates some sort of law, but it isn't. Cryptographers know how to construct such systems; there are many ways.
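To see why a "works only for the key holder" backdoor is mathematically possible, here is a toy analogue of the Dual_EC_DRBG trapdoor. All the numbers are made up for illustration, and it uses exponentiation mod a prime instead of real elliptic-curve points, but the structure is the same: the designer publishes two related values P and Q, keeps the relation d secret, and can then recover the generator's internal state from a single output. Nobody without d can do this.

```python
# Toy analogue of the Dual_EC_DRBG backdoor (hypothetical parameters,
# multiplicative group mod p instead of elliptic-curve points).
# The designer picks Q and a secret trapdoor d, publishes P = Q^d mod p.
# Anyone can run the generator; only the holder of d can turn one
# observed output into the next internal state -- a "NOBUS" backdoor.

p = 2**127 - 1           # a Mersenne prime; fine for a toy demo
Q = 5                    # public base "point"
d = 123456789            # the designer's secret trapdoor
P = pow(Q, d, p)         # public base "point", secretly P = Q^d

def drbg_step(state):
    """One round of the toy generator: returns (output, next_state)."""
    r = pow(P, state, p)
    out = pow(Q, r, p)         # emitted to the world (untruncated here)
    next_state = pow(P, r, p)  # kept secret inside the generator
    return out, next_state

# An honest user generates two outputs.
s0 = 42
out1, s1 = drbg_step(s0)
out2, _ = drbg_step(s1)

# The backdoor holder sees only out1, yet recovers the hidden state:
# out1^d = Q^(r*d) = (Q^d)^r = P^r = next_state.
recovered_s1 = pow(out1, d, p)
assert recovered_s1 == s1

# With the state in hand, every future output is predictable.
predicted_out2, _ = drbg_step(recovered_s1)
assert predicted_out2 == out2
print("state recovered:", recovered_s1 == s1)
```

Breaking this without d requires solving a discrete-log-style problem, which is exactly the asymmetry the real design exploited (the real algorithm also truncates the output, so the attacker brute-forces the missing bits, but the trapdoor is the same).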

But the existence of such solutions doesn't even matter. Lawful intercept abilities have existed for ages, governments will happily accept a solution of just turning off E2E encryption entirely, and the possibility that governments or tech firms will get hacked doesn't bother them because that's a transient problem that can be made hard by throwing money at it.

They also don't care about corruption because the countries demanding this have low levels of it. Governments are the original experts in corruption, you might say, and have evolved a lot of different mechanisms to fight it. Finally, remember that all these systems are already hackable or corruptible. Pay off someone who works on the WhatsApp mobile app team and unless Meta's internal controls detect it, it's game over.


> So, is this impossible due to human nature or the "laws of information theory"? Which is it? And if the latter what "law" are you thinking of, exactly? Can you name these laws?

Yes. It's simple: if you send information in a recoverable way to another entity, they can recover it. And if that entity involves humans, then with 100% certainty some of them can and will be corrupted or threatened into obtaining the information improperly.

> Remember the kerfuffle over the NSA's backdoored Dual_EC_DRBG algorithm?

Amusing that your example is a counterexample to your thesis. Exactly. Backdoors never serve only their would-be masters. That's the impossible part.

If this is not blindingly obvious by now, I fail at being able to explain it better.


The backdoor in Dual_EC_DRBG was detected, but it was never exploited by anyone else, because actually opening that back door required a key only the NSA had, and that key never leaked. Only the NSA could decrypt streams that used this PRNG. To everyone else they remained undecryptable.

So it's not a counterexample, and your amusement is misplaced.


