> If the PSI/CSAM system had been announced alongside E2E encryption for iCloud backups, it would be clear that they were attempting to act in their users' best interests.
Absolutely. I don't know whether there's a reason for this timing (that is, if they are planning E2E encryption, why they announced this first), but this is probably the biggest PR bungle Apple has had since "you're holding it wrong," if not ever.
> Apple are clearly willing to add functionality at the request of governments, on device.
Maybe? I'm not as willing to state that quite as definitively, given the pushback Apple gave in the San Bernardino shooter case. Some of what their Privacy Engineering head said in that TechCrunch article suggests that Apple has engineered this to be strategically awkward, e.g., generating the hashes by using ML trained on the CSAM data set (so the hashing system isn't as effective on other data sets) and making the on-device hashing component part of the operating system itself rather than a separately updatable data set. That in turn suggests to me Apple is still looking for an engineering way to say "no" if they're asked "hey, can you just add these other images to your data set." (Of course, my contention that this is not ultimately an engineering problem applies here, too: even if I'm right about Apple playing an engineering shell game here, I'm not convinced it's enough if a government is sufficiently insistent.)
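To make the "separately updatable data set" point concrete, here's a minimal sketch in Swift of the two designs. All names here are hypothetical, Apple's actual NeuralHash pipeline is not public in this form, and the real protocol additionally blinds the table so the device never learns match results; this is a deliberate simplification.

```swift
import Foundation

// Hypothetical perceptual-hash type; NeuralHash reportedly emits a short
// fixed-length hash derived from a neural-network embedding.
typealias PerceptualHash = Data

// Design A (what Apple describes): the hash table is baked into the
// signed OS image at build time. Changing it means shipping a new OS
// build to every device worldwide, not quietly pushing a new list.
func hashesBundledWithOS() -> Set<PerceptualHash> {
    // Stand-in for a table compiled into the system image.
    [Data([0x01, 0x02, 0x03]), Data([0x0a, 0x0b, 0x0c])]
}

// Design B (what Apple says it avoided): a separately updatable table
// fetched at runtime, which could be changed, or targeted per region or
// per account, without touching the OS at all.
func hashesFetchedAtRuntime(from url: URL) throws -> Set<PerceptualHash> {
    let blob = try Data(contentsOf: url)
    let width = 3 // arbitrary width, chosen to match the stand-ins above
    return Set(stride(from: 0, to: blob.count - width + 1, by: width)
        .map { blob.subdata(in: $0 ..< $0 + width) })
}

// Either way the lookup itself is trivial; the governance question is
// entirely about who can change the table.
func matches(_ photoHash: PerceptualHash, against table: Set<PerceptualHash>) -> Bool {
    table.contains(photoHash)
}
```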
A minor interesting tidbit: your linked Sophos story is based on a Telegraph UK story that has this disclaimer at the bottom:
> This story originally said Apple screens photos when they are uploaded to iCloud, Apple's cloud storage service. Ms Horvath and Apple's disclaimer did not mention iCloud, and the company has not specified how it screens material, saying this information could help criminals.
It's hard to say what they were actually doing, but it's reasonable to suspect it's an earlier, perhaps entirely cloud-based rather than partially cloud-based, version of NeuralHash.
Right, and in the interview linked [1] above they state:
> The voucher generation is actually exactly what enables us not to have to begin processing all users’ content on our servers, which we’ve never done for iCloud Photos.
But they do appear to do "something" server-side; it's possible, for example, that all data is scanned as it is ingested. I dislike this statement because, while it's probably technically correct, it doesn't actually clarify the situation. It makes me trust Apple less.
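For anyone who hasn't read Apple's technical summary, the voucher design roughly means the device attaches an opaque encrypted envelope to each upload, and the server can only open envelopes once enough matches accumulate. Here's a loose sketch of that flow; the names and the threshold value are assumptions, and the real protocol enforces the threshold with private set intersection plus threshold secret sharing, which this sketch does not implement.

```swift
import Foundation

// Hypothetical shape of a safety voucher: an encrypted derivative of the
// image that the server cannot read on its own, plus one cryptographic
// share of the decryption key.
struct SafetyVoucher {
    let encryptedPayload: Data
    let keyShare: Data
}

// Server-side sketch of the flow the quote describes: nothing is scanned
// on the server; it only accumulates opaque vouchers per account.
// NOTE: in the real design, below the threshold the key mathematically
// cannot be reconstructed. The count check here only illustrates the
// data flow; it is not how the guarantee is actually provided.
final class VoucherStore {
    private var vouchers: [SafetyVoucher] = []
    private let threshold = 30 // illustrative value, an assumption

    func ingest(_ voucher: SafetyVoucher) {
        vouchers.append(voucher)
    }

    var keyCouldBeReconstructed: Bool {
        vouchers.count >= threshold
    }
}
```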
What I don't understand is this: if they had announced they would do that scanning server-side, the only eyebrows raised would have been those of people who thought they were doing it already. It's not as if those pictures were E2E encrypted. I still haven't seen any convincing argument for why searching client-side provides any benefit to end users, while being a massive step in the direction of privacy invasion.
> But they do appear to do "something" server-side; it's possible, for example, that all data is scanned as it is ingested. I dislike this statement because, while it's probably technically correct, it doesn't actually clarify the situation. It makes me trust Apple less.
The qualifier is "Photos" - different services have different security properties.
Email transport is not E2E encrypted because there are no interoperable technologies for that.
Other systems are encrypted, but Apple has a separate key escrow system outside the cloud hosting for law enforcement requests and other court orders (such as an heir/estate wanting access).
Some, like iCloud Keychain, use a more E2E approach where access can't be restored if you lose all your devices and your paper recovery key.
iCloud Photo Sharing normally only works between AppleID accounts, with the album keys being encrypted to the account. However, you can choose to publicly share an album, at which point it becomes accessible via a browser on icloud.com. I have not heard Apple say whether they scan photos today once they are marked public (going forward, there would be no need).
FWIW this is all publicly documented, as well as what information Apple can and can't provide to law enforcement.
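The escrow-versus-E2E distinction above comes down to who holds a copy of the key that wraps your data key. Here's a compressed CryptoKit sketch of the two designs; the structure is hypothetical, and Apple's actual escrow service (built on HSMs) is considerably more involved.

```swift
import CryptoKit
import Foundation

// The key that actually encrypts the user's data.
let dataKey = SymmetricKey(size: .bits256)
let rawDataKey = dataKey.withUnsafeBytes { Data($0) }

// Escrowed design: the data key is additionally wrapped to a key held by
// the provider's escrow service, so the provider can recover it for a
// court order or an estate request.
let escrowServiceKey = SymmetricKey(size: .bits256) // held by the provider
let escrowedCopy = try! AES.GCM.seal(rawDataKey, using: escrowServiceKey)

// E2E design (iCloud Keychain style): the only wrapping keys derive from
// the user's devices or paper recovery code. Lose those, and nobody,
// the provider included, can recover the data key.
let recoveryCode = "user-held recovery code"          // never leaves the user
let recoveryKey = SymmetricKey(
    data: SHA256.hash(data: Data(recoveryCode.utf8))) // toy KDF; a real
                                                      // system uses a proper
                                                      // password KDF
let userOnlyCopy = try! AES.GCM.seal(rawDataKey, using: recoveryKey)
```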