
It’s not a back door in any sense of the word. That’s why he is surprised people see it as one.

It really only does what they say it does, and it really is hard to abuse.

But that doesn’t matter. The point is that even so, it makes everyone into a suspect, and that feels wrong.



> It’s not a back door in any sense of the word.

It is. It's a simple matter for two foreign governments to decide they don't want their people to criticize the head of state with memes, and then insert such images into the database Apple uses for scanning.
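To make the concern concrete, here is a minimal sketch (Python, hypothetical names; the real system uses Apple's NeuralHash and a blinded database) of why the client cannot tell what it is scanning for:

    import hashlib

    # Stand-in for a perceptual hash. The real system uses NeuralHash,
    # which also matches resized/recompressed copies; SHA-256 does not.
    def image_hash(image_bytes: bytes) -> str:
        return hashlib.sha256(image_bytes).hexdigest()

    # The on-device database is blinded: the device owner cannot inspect
    # what is in it. Hypothetical entries for illustration only.
    blinded_db = {
        image_hash(b"known-csam-sample"),   # what Apple says is in there
        image_hash(b"head-of-state-meme"),  # what a government could add
    }

    def flag_before_upload(image_bytes: bytes) -> bool:
        # A match produces a report regardless of *why* the hash is in
        # the database; nothing on the client distinguishes the cases.
        return image_hash(image_bytes) in blinded_db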

Apple's previous position on privacy was to make such snooping impossible because they don't have access to the data. Now they are handing over access.

What I and thousands of others online are describing ought to be understandable by anyone at Apple. The fact that an Apple exec can sit there in a prepared interview and look perplexed about how people could see this as a back door is something I don't understand at all. This "closing his ears" attitude may indicate he is full of it.


Edit: I reconsidered my previous reply.

That really doesn’t sound like anything I’d describe as a “back door”. A back door implies general-purpose access. A system which required the collusion of multiple governments and Apple and their child-safety agencies simply is not that.

One of the casualties of this debate is that people are using terms that make things sound worse than they are. If you can’t get at my filesystem, you don’t have a back door. I understand the motive for stating the case as harshly as possible, but I think it’s misguided.

Having said this, I would find it interesting to hear what Federighi would say about this potential abuse case.


> A system which required the collusion of multiple governments and Apple and their child-safety agencies simply is not that.

Agree to disagree.

> I would find it interesting to hear what Federighi would say about this potential abuse case.

Personally I would not. That's a political consideration and not something I want to hear a technologist weigh in on while defending their technology. Apple's previous stance, with which I agree, was to not give humans any chance to abuse people's personal data.

https://youtu.be/rQebmygKq7A


> Personally I would not. That's a political consideration and not something I want to hear a technologist weigh in on while defending their technology.

It’s not. The abuse case flows from their architecture. Perhaps it isn’t ‘easy’, since it requires multiple countries to collude with Apple, but if the architecture can be abused the way you think it can, that is a technical problem as well as a political one.


You can't solve the human-bias problem with technology. That's the whole reason Apple didn't want to build in a back door in the first place.


You may not be able to solve the bias problem altogether, but you can definitely change the threat model and who you have to trust.

Apple’s model has always involved trusting them. This model involves trusting other people in narrow ways. The architecture determines what those ways are.


In the past, trusting Apple not to scan my device was easy because, as an engineer, I know I would speak up if I saw that kind of thing secretly happening, and I know security researchers would speak up if they detected it.

Now Apple will scan the device and we must trust that third parties will not abuse the technology by checking for other kinds of imagery such as memes critical of heads of state.

The proposed change is so much worse than the previous state of things.


> checking for other kinds of imagery such as memes critical of heads of state.

Do you live in a country where the head of state wants to check for such memes?


Probably. You underestimate humans if you don't think any of us will try to squash things that make us look bad.


Does your state not have protections against such actions?


In theory it does. Laws put in place by previous generations do require maintenance to remain in force.


They do indeed.


How does this attack even work? So some government poisons the database with political dissident memes, and suddenly Apple starts getting a bunch of new reports which, when reviewed, are obviously not CSAM.

If the government can force Apple to also turn over these reports then they could have just made Apple add their political meme database directly and it's already game over.
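For what it's worth, here is a rough sketch of where human review sits in the pipeline (hypothetical names; Apple publicly described a threshold of roughly 30 matches, so treat the exact number as an assumption):

    from dataclasses import dataclass, field

    # Apple's public description put the match threshold at roughly 30;
    # the exact value here is an assumption for illustration.
    THRESHOLD = 30

    @dataclass
    class Account:
        vouchers: list = field(default_factory=list)

    def on_match(account: Account, voucher: bytes) -> None:
        account.vouchers.append(voucher)
        if len(account.vouchers) >= THRESHOLD:
            # Only past the threshold can Apple decrypt the vouchers, and
            # a human reviewer sees low-resolution derivatives. Poisoned
            # meme matches should be obvious here -- but the safeguard at
            # this point is policy and people, not cryptography.
            human_review(account.vouchers)

    def human_review(vouchers: list) -> None:
        print(f"escalating {len(vouchers)} vouchers for manual review")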


More like, the government says Apple can't operate there unless Apple includes whatever material that government declares illegal.

Apple is run by humans who are subject to influence and bias. Who knows what policy changes will come in Apple's future. Apple's previous stance was to not hand over data because they don't have access to it. This change completely reverses that.


I would say that it is a back door, because it would work even if iCloud Photos were E2E encrypted. It may not be a generic one, but it is one specific to a purpose Apple decided is rightful. And there is no guarantee that Apple (or authorities) won't decide that there are other rightful purposes.
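A minimal sketch of why (Python, hypothetical names; the toy XOR "encryption" just stands in for a real cipher): the match runs on the plaintext before encryption, so the voucher escapes the E2E envelope.

    from itertools import cycle
    from typing import Optional

    def matches_database(image_bytes: bytes) -> bool:
        return False  # stand-in for the on-device hash check

    def encrypt(data: bytes, key: bytes) -> bytes:
        # Toy XOR stream, illustration only; assume a real cipher in practice.
        return bytes(b ^ k for b, k in zip(data, cycle(key)))

    def send_to_icloud(ciphertext: bytes, voucher: Optional[bytes]) -> None:
        print(f"uploaded {len(ciphertext)} bytes, voucher attached: {voucher is not None}")

    def upload_photo(image_bytes: bytes, e2e_key: bytes) -> None:
        # The scan happens on the plaintext, *before* encryption...
        voucher = b"safety-voucher" if matches_database(image_bytes) else None
        ciphertext = encrypt(image_bytes, e2e_key)
        # ...so even if the photo itself is end-to-end encrypted, the
        # voucher is an out-of-band channel the server can still act on.
        send_to_icloud(ciphertext, voucher)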


> I would say that it is a back door, because it would work even if iCloud Photos were E2E encrypted.

Backdoor is defined by the Oxford Dictionary as "a feature or defect of a computer system that allows surreptitious unauthorized access to data."

The system in question requires you to upload the data to iCloud Photos for the tickets to be meaningful and actionable. Both your phone and iCloud services have EULAs which call out and allow for such scanning to take place, and Apple has publicly described how the system works in terms of its capabilities and limitations. To the extent that people see this as a change in policy (IIRC the actual license agreement language changed over a year ago), Apple has also described how to stop using the iCloud Photos service.

One less standard usage of the term refers not to unauthorized access but specifically to private surveillance (e.g. the "Clipper chip") - but I would argue that the Clipper chip was a case where the surveillance features were specifically not being talked about, hence it still counting as "unauthorized access".

But with a definition that covers broad surveillance instead of unauthorized access, it would still be difficult to classify this as a back door. Such surveillance arguments would only pertain to the person's phone and not to information the user chose to release to external services like iCloud Photos.

To your original point, it would still work if iCloud Photos did not have the key escrow, albeit with less data able to be turned over to law enforcement. However, iCloud Photos being an external system would still mean this is an intentional and desired feature (presumably) by the actual system owners (Apple).


I see. My interpretation doesn't hold up given your definitions of back door.

I bet the authorities would be happy with a surveillance mechanism disclosed in the EULA, though. Even if such a system is not technically a back door, I am opposed to it and would prefer Apple to oppose it.

Edit: I just noticed that you had already clarified your argument in other replies. I am sorry to make you repeat it.


It has proven very difficult to oppose laws meant to deter child abuse and exploitation.

Note that while the EFF's mission statement is about defending civil liberties, they posted two detailed articles about Apple's system without talking about the questionable parts of the underlying CSAM laws. There was nothing about how the laws negatively impact civil liberties and what the EFF might champion there.

The problem is that the laws themselves are somewhat uniquely abusable and overreaching, but they are meant to help reduce a really grotesque problem - and scaling back aspects like detection and reporting would not be as effective against the underlying societal issue.

Apple has basically been fighting this for the 10 years since the introduction of iCloud Photos, saying they didn't have a way to balance the need to detect CSAM material against the privacy of the rest of their users. PhotoDNA was already deployed at Microsoft and being deployed by third parties like Facebook when iCloud Photos launched.

Now it appears that Apple was working for a significant portion of that time toward building a system that _did_ attempt to strike a balance between social/regulatory responsibility and privacy.

But such a system has to prop technical systems and legal policies against one another to make up for the shortcomings of each, which makes it a very complex and nuanced system.



