A key point that needs to be mentioned: we strongly dislike being distrusted.
It might well be a genetic heritage. Being trusted in a tribe is crucial to survival, and so is likely wired deep into our social psychology.
Apple is making a mistake by ignoring that. This isn’t about people not trusting Apple. It’s about people not feeling trusted by Apple.
Because of this, it doesn’t matter how trustworthy the system is or what they do to make it less abusable. It will still represent distrust of the end user, and people will still feel that in their bones.
People argue about the App Store and not being trusted to install their own Apps etc. That isn’t the same. We all know we are fallible and a lot of people like the protection of the store, and having someone to ‘look after’ them.
This is different and deeper than that. Nobody wants to be a suspect for something they know they aren’t doing. It feels dirty.
The part of Federighi's "interview" where he can't understand how people perceive this as a back door [1] seems incredibly out of touch. The back door is what everyone is talking about. Someone at Apple should at least be able to put themselves in their critics' shoes for a moment. I guess we need to wait to hear Tim Cook explain how this is not what he described 5 years ago [2].
It is. It's a simple matter for two foreign governments to decide they don't want their people to criticize the head of state with memes, and then insert such images into the database Apple uses for scanning.
Apple's previous position on privacy was to make such snooping impossible because they don't have access to the data. Now they are handing over access.
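To make that concrete, here's a toy sketch of why the database itself is the sensitive part. The hash function, names, and thresholds below are invented stand-ins (a trivial average hash in Python, nothing like Apple's actual NeuralHash); the point is just that the device can't tell a CSAM hash from a meme hash, it only knows "match" or "no match":

```python
# Toy illustration of database-driven matching. The "average hash" below is a
# trivial stand-in, nothing like Apple's actual NeuralHash, and every name here
# is invented for the example.

def average_hash(pixels):
    """Perceptual-style hash: one bit per pixel, set if the pixel is above the mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def matches(image_hash, banned_hashes, max_distance=2):
    """On-device check: is this hash close to any entry in the banned database?"""
    return any(hamming(image_hash, h) <= max_distance for h in banned_hashes)

# The database the device matches against is opaque to the user.
banned_db = {average_hash([200, 10, 180, 15, 220, 5, 190, 12])}

meme = [30, 240, 25, 250, 20, 245, 35, 230]     # stand-in for a political meme
print(matches(average_hash(meme), banned_db))   # False: not in the database today

# An entry supplied by a government looks exactly like any other entry...
banned_db.add(average_hash(meme))
print(matches(average_hash(meme), banned_db))   # ...and now the same photo flags: True
```

Everything interesting happens in whatever gets added to that database, and the user has no visibility into it.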
What I and thousands of others online are describing ought to be understandable by anyone at Apple. The fact that an Apple exec can sit there in a prepared interview and look perplexed about how people could see this as a back door is something I don't understand at all. This "closing his ears" attitude may indicate he is full of it.
That really doesn’t sound like anything I’d describe as a “back door”. A back door implies general purpose access. A system which required the collusion of multiple governments and Apple and their child abuse agencies simply is not that.
One of the casualties of this debate is that people are using terms that make things sound worse than they are. If you can’t get at my filesystem, you don’t have a back door. I understand the motive for stating the case as harshly as possible, but I think it’s misguided.
Having said this, I would find it interesting to hear what Federighi would say about this potential abuse case.
> A system which required the collusion of multiple governments and Apple and their child abuse agencies simply is not that.
Agree to disagree.
> I would find it interesting to hear what Federighi would say about this potential abuse case.
Personally I would not. That's a political consideration and not something I want to hear a technologist weigh in on while defending their technology. Apple's previous stance, with which I agree, was to not give humans any chance to abuse people's personal data.
> Personally I would not. That's a political consideration and not something I want to hear a technologist weigh in on while defending their technology.
It’s not. The abuse case flows from their architecture. Perhaps it isn’t as ‘easy’ as getting multiple countries to collude with Apple. If the architecture can be abused the way you think it can, that is a technical problem as well as a political one.
You may not be able to solve the bias problem altogether, but you can definitely change the threat model and who you have to trust.
Apple’s model has always involved trusting them. This model involves trusting other people in narrow ways. The architecture determines what those ways are.
Trusting Apple to not scan my device in the past was easy because as an engineer I know I would speak up if I saw that kind of thing secretly happening, and I know security researchers would speak up if they detected it.
Now Apple will scan the device and we must trust that 3rd parties will not abuse the technology by checking for other kinds of imagery such as memes critical of heads of state.
The proposed change is so much worse than the previous state of things.
How does this attack even work? So some government poisons the database with political dissident memes, and suddenly Apple starts getting a bunch of new reports which, when reviewed, are obviously not CSAM.
If the government can force Apple to also turn over these reports then they could have just made Apple add their political meme database directly and it's already game over.
More like, the government says Apple can't operate there unless they include whatever that government says is illegal.
Apple is run by humans who are subject to influence and bias. Who knows what policy changes will come in Apple's future. Apple's previous stance was to not hand over data because they don't have access to it. This change completely reverses that.
I would say that it is a back door, because it would work even if iCloud Photos were E2E encrypted. It may not be a generic one, but one that is specific to a purpose Apple decided is rightful. And there is no guarantee that Apple (or authorities) won't decide that there are other rightful purposes.
> I would say that it is a back door, because it would work even if iCloud Photos were E2E encrypted.
Backdoor is defined by the Oxford Dictionary as "a feature or defect of a computer system that allows surreptitious unauthorized access to data."
The system in question requires you to upload the data to iCloud Photos for the tickets to be meaningful and actionable. Both your phone and the iCloud services have EULAs which call out and allow for such scanning to take place, and Apple has publicly described how the system works as far as its capabilities and limitations. To the extent that people see this as a change in policy (IIRC the actual license agreement language changed over a year ago), Apple has also described how to stop using the iCloud Photos service.
A less standard usage of the term refers not to unauthorized access but specifically to built-in surveillance (e.g. the "Clipper chip") - but I would argue that the Clipper chip was a case where the surveillance features were specifically not being talked about, hence it still counts as "unauthorized access".
But with a definition that covers broad surveillance instead of unauthorized access, it would still be difficult to classify this as a back door. Such surveillance arguments would only pertain to the person's phone and not to information the user chose to release to external services like iCloud Photos.
To your original point, it would still work if iCloud Photos did not have the key escrow, albeit with less data able to be turned over to law enforcement. However, iCloud Photos being an external system would still mean this is an intentional and desired feature (presumably) by the actual system owners (Apple).
I see. My interpretation doesn't hold up given your definitions of back door.
I bet the authorities would be happy with a surveillance mechanism disclosed in the EULA, though. Even if such a system is not technically a back door, I am opposed to it and would prefer Apple to oppose it.
Edit: I just noticed that you had already clarified your argument in other replies. I am sorry to make you repeat it.
It has proven very difficult to oppose laws meant to deter child abuse and exploitation.
Note that while the EFF's mission statement is about defending civil liberties, they posted two detailed articles about Apple's system without talking about the questionable parts of the underlying CSAM laws. There was nothing about how the laws negatively impact civil liberties and what the EFF might champion there.
The problem is that the laws themselves are somewhat uniquely abusable and overreaching, but they are meant to help reduce a really grotesque problem - and reducing aspects like detection and reporting is not going to be as effective against the underlying societal issue.
Apple has basically been fighting this for the 10 years since the introduction of iCloud Photos, saying they didn't have a way to balance the need to detect CSAM against the privacy of the rest of their users. PhotoDNA was already deployed at Microsoft and being deployed by third parties like Facebook when iCloud Photos launched.
Now it appears that Apple was working a significant portion of that time toward trying to build a system that _did_ attempt to strike a balance between social/regulatory responsibility and privacy.
But such a system has to prop technical measures and legal policies against one another to make up for the shortcomings of each, which makes it very complex and nuanced.
But the catch is: all the incumbents already treated your data as if you were guilty until proven innocent. Apple’s transparency about that change may have led people to internalize it, but those have been the de facto terms of most cloud relationships.
What I personally don’t understand is why Apple didn’t come out with a different message: we’ve made your iPhone so secure that we’ll let it vouch on your behalf when it sends us data to store. We don’t want to see the data, and we won’t see any of it unless we find that lots of the photos you send us are fishy. Android could never contemplate this because they can’t trust the OS to send vouchers for the same photos it uploads, so instead they snoop on everything you send them.
It seems like a much more win-win framing that emphasizes their strengths.
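And the “we won’t see any of it unless lots of the photos are fishy” part isn’t just a slogan, it can be an actual cryptographic property. Here’s a minimal sketch of the idea using textbook Shamir secret sharing; the names, the threshold of 30, and the overall shape are my own illustration, not Apple’s published construction:

```python
# Minimal sketch of "vouch, but only reveal above a threshold" using textbook
# Shamir secret sharing. This is NOT Apple's actual construction; the names,
# the threshold, and the parameters are illustrative only.
import random

PRIME = 2**127 - 1  # a Mersenne prime; big enough for a toy secret

def make_shares(secret, threshold, count):
    """Split `secret` so any `threshold` shares recover it and fewer reveal nothing."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the secret from `threshold` shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Device side: the key protecting voucher payloads is split so the server needs
# `threshold` matching photos before it can learn anything at all.
device_key = random.randrange(PRIME)
threshold = 30
shares = make_shares(device_key, threshold, count=100)

# Server side: each matched photo hands over one share.
print(recover(shares[:threshold]) == device_key)       # True: enough matches
print(recover(shares[:threshold - 1]) == device_key)   # False (overwhelmingly likely): too few
```

As I understand Apple’s published description, each safety voucher carries a share like this alongside an encrypted payload, so below the threshold their servers hold only ciphertext they can’t open. That’s the story I think they should have led with.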
> What I personally don’t understand is why Apple didn’t come out with a different message: we’ve made your iPhone so secure that we’ll let it vouch on your behalf when it sends us data to store. We don’t want to see the data, and we won’t see any of it unless we find that lots of the photos you send us are fishy.
That is PR speak that would have landed worse in tech forums. I respect them more for not doing this.
The core issue is this performs scans, without your approval, on your local device. Viruses already do that and it's something techies have always feared governments might impose. That private companies are apparently being strong-armed into doing it is concerning because it means the government is trying to circumvent the process of public discussion that is typically facilitated by proposing legislation.
I hate this argument. Pressing yes to the T&C once when you set up an Apple account doesn’t exactly constitute my approval imo (even if it does legally).
There’s no disable button or even clear indication that it’s going on.
> There’s no disable button or even clear indication that it’s going on.
iCloud Photos is an on-off switch.
In terms of clearly indicating everything that is going on within the service, that is just not possible for most non-tech users. It appears to have been pretty difficult even for those familiar with things like cryptography and E2E systems to understand the nuances and protections in place.
Instead, the expectation should be that using _any_ company's cloud hosting or cloud synchronization features is fundamentally sharing your data with them. It should then be any vendor's responsibility to give customers guarantees on which to evaluate trust.
No - but you did suggest there was no opting in. I’m pointing out that just because I’m not entirely happy with the choice doesn’t mean it isn’t a choice.
I agree this is vastly better than what anyone else is doing, and you know I understand the technology.
However, I don’t think any framing would have improved things. I think it was always going to feel wrong.
I would prefer they don’t do this because it feels bad to be a suspect even in this abstract and in-practice harmless way.
Having said that, having heard from people who have investigated how bad pedophile activity actually is, I can imagine being easily persuaded back in the other direction.
I think this is about the logic of evolutionary psychology, not the logic of cryptography.
My guess is that between now and the October iPhone release we are going to see more media about the extent of the problem they are trying to solve.
> Having said that, having heard from people who have investigated how bad pedophile activity actually is, I can imagine being easily persuaded back in the other direction.
There are terrible things out there that we should seek to solve. They should not be solved by creating 1984 in the literal sense, and certainly not by the company that became famous for an advertisement based on that book [1].
Apple, take your own advice and Think Different [2].
In case you missed it, I think this is probably a bad move.
I just don’t think these arguments about back doors and creeping totalitarianism are either accurate or likely to persuade everyday users when they weigh them up against the grotesque nature of child exploitation.
> I just don’t think these arguments about back doors and creeping totalitarianism are either accurate or likely to persuade everyday users when they weigh them up against the grotesque nature of child exploitation.
Agree to disagree. This opens the door to something much worse in my opinion.
I don’t think I agree. If you think this boils down to instinct, do you think a story about coming together to save the next generation will work well on people cynical enough to see TLAs around every corner? At the very least, I feel like Apple should probably make a concession to the conspiracy minded so that they can bleach any offensive bits from their device and use them as offline-first DEFCON map viewing devices, or some such.
> do you think a story about coming together to save the next generation will work well on people cynical enough to see TLAs around every corner?
Absolutely not. I don’t think they need to persuade people who are convinced of their iniquity.
What they need is an environment in which those people look like they are arguing over how many dictatorships need to collude to detect anti-government photos and how this constitutes literal 1984, while Apple is announcing a way to deter horrific crimes against American children that they have seen on TV.
A lot of people like strong border controls, for instance, even if it means that when they return to their country from abroad they have to go through more checks or present more documents to get in. Or consider large gated communities where you have to be checked by a guard to get in.
Many people seem fine with being distrusted as long as the distrust is part of a mechanism to weed out those who they feel really are not trustworthy, and it is not too annoying to prove that they are not one of those people when they encounter a trust check.