
It really doesn’t have much to lose. None of the non-tech people I have spoken to think of this as a bad thing.


My family, my wife's family, and much of our extended families are all privacy conscious (we chat over Signal, just like we close our window blinds at night)... and they are all alarmed over this and ready to ditch Apple if they don't change their stance on this.

All it takes is a wiretap warrant and Apple would have to scan on-device pictures and iMessages for whatever the wiretap says. This is true even if the phone has iCloud switched off (as most of us already do), since all Apple would have to do is change a Boolean variable's value, or something similarly trivial requiring no creative effort (and hence legally coercible).


> All it takes is a wiretap warrant and Apple would have to scan on-device pictures and iMessages for whatever the wiretap says.

This is total and utter bullshit. It is a complete misunderstanding of how the system works.

If you and your family think this is true, then of course you are alarmed.


> This is total and utter bullshit. It is a complete misunderstanding of how the system works.

No it's not. It's an extrapolation based on known facts.

Governments can and will coerce companies into changing things about their systems; the main limitation, at least in the US, is that the government can't compel creative work.

But that's just the thing - flipping a bit here or there, or adding non-CSAM image hashes to the database, is all possible and accessible to wiretap warrants, since doing so is a simple change to a system that already exists.
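To make the "no creative effort" point concrete, here is a purely hypothetical sketch. None of these names or structures come from Apple's actual implementation; it just illustrates how, in any system where scanning is gated on a configuration value, widening its scope can be a one-field edit rather than new engineering:

```python
# Hypothetical illustration only -- not Apple's code. Invented names
# throughout. Shows why a scope change could be a trivial config edit.

from dataclasses import dataclass

@dataclass
class ScanPolicy:
    # In the announced design, scanning is tied to iCloud Photos upload.
    scan_only_icloud_uploads: bool = True
    # The hash database the on-device matcher consults.
    hash_database: str = "csam_hashes_v1"

def should_scan(photo_is_icloud_upload: bool, policy: ScanPolicy) -> bool:
    """Decide whether the on-device matcher runs for this photo."""
    if policy.scan_only_icloud_uploads:
        return photo_is_icloud_upload
    return True  # scan every photo on the device

# Announced behavior: local-only photos are not scanned.
default = ScanPolicy()
assert should_scan(photo_is_icloud_upload=False, policy=default) is False

# The feared change: flip one field, swap one database name.
# No creative work is involved, which is the legal crux of the argument.
coerced = ScanPolicy(scan_only_icloud_uploads=False,
                     hash_database="warrant_target_hashes")
assert should_scan(photo_is_icloud_upload=False, policy=coerced) is True
```

The point of the sketch is not that Apple's system literally looks like this, but that wherever the gating decision lives, changing it is mechanical rather than creative.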

There's a principle here: trust good security system design, but do not trust system operators, since they can be coerced (legally or otherwise). A secure system that requires trusting particular people who operate it is not designed well.


> It's an extrapolation based on known facts.

No it’s not. It’s bullshit based on a complete lack of understanding of the system.

Read any of the public documentation on how it works, and you’d know it’s completely wrong.

There are multiple good reasons to oppose this system. But this:

> All it takes is a wiretap warrant and Apple would have to scan on-device pictures and iMessages for whatever the wiretap says.

Is still complete bullshit. No part of it is true or an extrapolation of known facts. It's not true as a technical possibility, nor even as a legal one.


If you have any specifics as to why that's not true, please do share.


As I said, it’s in public documentation you could easily check.

Here:

https://www.apple.com/child-safety/pdf/Security_Threat_Model...


Quoting from Edward Snowden: "I intentionally wave away the technical and procedural details of Apple’s system here, some of which are quite clever, because they, like our man in the handsome suit, merely distract from the most pressing fact—the fact that, in just a few weeks, Apple plans to erase the boundary dividing which devices work for you, and which devices work for them."

Reference: https://edwardsnowden.substack.com/p/all-seeing-i

So the details really don't matter as much as the fact that this creates a huge precedent. And, quoting Snowden again, "There is no fundamental technological limit to how far the precedent Apple is establishing can be pushed, meaning the only restraint is Apple’s all-too-flexible company policy, something governments understand all too well." (emphasis mine)

So I might read through that PDF, I might even understand it, but what it can't dissuade me from is the conviction that yes, governments can and will take advantage of this turn of events, and no, there is no technological or legal barrier that can keep them from doing so once this is in place.



