The slippery slope argument is the only useful argument here.
The fundamental issue with their PSI/CSAM system is that they already were scanning iCloud content [1] and that they're seemingly not removing the ability to do that. If the PSI/CSAM system had been announced alongside E2E encryption for iCloud backups, it would be clear that they were attempting to act in their users' best interests.
But it wasn't, so as it stands there's no obvious user benefit. It then becomes a question of trust. Apple are clearly willing to add functionality at the request of governments, on device. By doing this they lose user trust (in my opinion).
> In the long run, we solve this, at least in liberal democracies, by voting people into office who understand technology, understand the value of personal encryption
But this is clearly not the way it happens in practice. At least in part, we vote with our wallets and give money to companies willing to push back on governmental over-reach. Until now, Apple was one such company [2].
I realize that Apple likely don't "care" about privacy (it's a company, not an individual human). But in a purely cynical sense, positioning themselves as caring about privacy, and pushing back against governmental over-reach on users' behalf, was useful. And while it's "just marketing," it benefits users.
By implementing this functionality, they've lost this "marketing benefit". Users can't buy devices believing they're supporting a company willing to defend their privacy.
> If the PSI/CSAM system had been announced alongside E2E encryption for iCloud backups, it would be clear that they were attempting to act in their users' best interests.
Absolutely. I don't know whether there's a reason for this timing (that is, if they are planning E2E encryption, why they announced this first), but this is probably the biggest PR bungle Apple has had since "you're holding it wrong," if not ever.
> Apple are clearly willing to add functionality at the request of governments, on device.
Maybe? I'm not as willing to state that quite as definitively, given the pushback Apple gave in the San Bernardino shooter case. Some of what their Privacy Engineering head said in that TechCrunch article suggests that Apple has engineered this to be strategically awkward, e.g., generating the hashes by using ML trained on the CSAM data set (so the hashing system isn't as effective on other data sets) and making the on-device hashing component part of the operating system itself rather than a separately updatable data set. That in turn suggests to me Apple is still looking for an engineering way to say "no" if they're asked "hey, can you just add these other images to your data set." (Of course, my contention that this is not ultimately an engineering problem applies here, too: even if I'm right about Apple playing an engineering shell game here, I'm not convinced it's enough if a government is sufficiently insistent.)
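To make the matching idea concrete: this is a toy sketch of perceptual-hash matching, not Apple's actual NeuralHash (which is an undisclosed ML model). It illustrates the general mechanism being discussed: images are reduced to compact hashes that are compared by Hamming distance to a fixed database, so matching tolerates small perturbations but is tied to the structure the hash was built around.

```python
# Toy "average hash" demo (illustrative only -- NOT NeuralHash).
# An image is reduced to one bit per pixel: 1 if the pixel is brighter
# than the image's mean brightness. Near-duplicates hash close together;
# structurally different images do not.

def average_hash(pixels):
    """Return an integer hash: one bit per pixel of a grayscale image."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches(h, database, max_distance=2):
    """A hash 'matches' if it is within max_distance bits of any entry."""
    return any(hamming(h, d) <= max_distance for d in database)

img  = [[10, 200], [220, 15]]   # tiny 2x2 "image"
near = [[12, 198], [225, 14]]   # slightly perturbed copy
far  = [[200, 10], [15, 220]]   # inverted image

db = [average_hash(img)]
print(matches(average_hash(near), db))  # True: perturbation preserved the hash
print(matches(average_hash(far), db))   # False: different structure
```

The point relevant to the comment above: a hash function tuned to one data set (here, brightness structure; in Apple's case, an ML model trained around the CSAM set) is not automatically a good matcher for arbitrary other content.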
A minor interesting tidbit: your linked Sophos story is based on a Telegraph UK story that has this disclaimer at the bottom:
> This story originally said Apple screens photos when they are uploaded to iCloud, Apple's cloud storage service. Ms Horvath and Apple's disclaimer did not mention iCloud, and the company has not specified how it screens material, saying this information could help criminals.
It's hard to say what they were actually doing, but it's reasonable to suspect it's an earlier, perhaps entirely cloud-based rather than partially cloud-based, version of NeuralHash.
Right, and in the interview linked [1] above they state:
> The voucher generation is actually exactly what enables us not to have to begin processing all users’ content on our servers, which we’ve never done for iCloud Photos.
But they do appear to do "something" server-side. It's possible that all data is scanned as it is ingested, for example. I dislike this statement: it's probably technically correct, but it doesn't actually clarify the situation. It makes me trust Apple less.
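For what it's worth, the "voucher" design they describe rests on a threshold scheme: each matching photo contributes one share of a decryption secret, and the server can only reconstruct that secret (and open the vouchers) once it holds at least t shares. This is my reading of the public description, not their actual protocol; the sketch below shows the threshold idea with Shamir secret sharing over a toy prime field.

```python
# Toy Shamir secret sharing: any t of n shares reconstruct the secret,
# fewer than t reveal (information-theoretically) nothing about it.
# Illustrative of the threshold idea only, not Apple's protocol.
import random

P = 2**31 - 1  # prime modulus for the toy field

def make_shares(secret, t, n):
    """Split `secret` into n shares via a random degree-(t-1) polynomial."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the polynomial's constant term."""
    total = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

secret = 123456789
shares = make_shares(secret, t=3, n=10)  # e.g. 10 matching photos, threshold 3
print(reconstruct(shares[:3]) == secret)  # True: threshold reached
# With only 2 shares of a degree-2 polynomial, the secret is not recoverable.
```

Under this design, a server holding fewer than t matching vouchers learns nothing, which is presumably what the quoted statement about "not processing all users' content" is trying to gesture at.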
What I don’t understand is that if they had announced they would do that scanning server-side, the only eyebrows raised would be those of people who thought they were doing it already. It’s not as if those pictures were E2E encrypted. I still haven’t seen any convincing argument for why scanning client-side provides any benefit to end users, while being a massive step in the direction of privacy invasion.
> But they do appear to do "something" server-side. It's possible that all data is scanned as it is ingested, for example. I dislike this statement: it's probably technically correct, but it doesn't actually clarify the situation. It makes me trust Apple less.
The qualifier is "Photos" - different services have different security properties.
Email transport is not E2E encrypted because there are no interoperable technologies for that.
Other systems are encrypted, but Apple has a separate key escrow system outside the cloud hosting for law enforcement requests and other court orders (such as an heir/estate wanting access).
Some, like iCloud Keychain, use a more E2E approach where access can't be restored if you lose all your devices and the paper recovery key.
iCloud Photo Sharing normally only works between AppleID accounts, with the album keys being encrypted to the account. However, you can choose to publicly share an album, at which point it becomes accessible via a browser on icloud.com. I have not heard Apple talking about whether they scan photos today once they are marked public (going forward, there would be no need).
FWIW this is all publicly documented, as well as what information Apple can and can't provide to law enforcement.
> This is an area we’ve been looking at for some time, including current state of the art techniques which mostly involves scanning through entire contents of users’ libraries on cloud services that — as you point out — isn’t something that we’ve ever done; to look through users’ iCloud Photos.
> This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.
- Tim Cook, Apple
Half a decade has passed since those words were written, yet the hacker community has contributed little to that discourse about the importance of privacy. At what point does that start implicating us in a collective failure to act?
And yet we can go on, e.g., Twitter and observe comments from relevant security researchers that appear to describe a chilling atmosphere surrounding privacy research, including resistance to even considering white papers.

That’s not my area of expertise and I don’t know how to fix it, but it should be an important consideration.
> The voucher generation is actually exactly what enables us not to have to begin processing all users’ content on our servers, which we’ve never done for iCloud Photos.
I really dislike this statement. It's likely designed to be "technically true". But it's been reported elsewhere that they do scan iCloud content:
My interpretation is Sophos got it wrong (they don't give a quote from the Apple Officer involved and manage to have a typo in the headline).
Apple does scanning of data which is not encrypted, such as received and sent email over SMTP. They presumably at that time were using PhotoDNA to scan attachments by hash. This is likely what Apple was actually talking about back at CES 2020.
They may have been also scanning public iCloud photo albums, but I haven't seen anyone discuss that one way or another.
My understanding based on piecing together the various poorly cited news stories is that Apple used to scan iCloud Mail for this material, and that’s it.
If they weren't doing any scanning, why would they find any material to report? The data is encrypted at rest, so there would be nothing for them to find. This clearly doesn't include search requests [1].
iCloud has perhaps 25% of the users of Facebook. Of that, it's not clear how many actively use the platform for backups/photos. iCloud is not a platform for sharing content like Facebook is. So how many reports should we expect to see from Apple? It's unclear to me.
So, I'm not saying the number isn't suspiciously low. But it doesn't really clarify what's going on to me...
> The fundamental issue with their PSI/CSAM system is that they already were scanning iCloud content
This scanning was of email attachments being sent through an iCloud-hosted account, not of other iCloud hosted data (which is encrypted during operation.)
> If the PSI/CSAM system had been announced alongside E2E encryption for iCloud backups, it would be clear that they were attempting to act in their users' best interests.
As I understand it, this was more "leaked" than "announced". That is, it wasn't part of Apple's planned rollout strategy.
My thought was that they'll announce that soon, but then again, I'm shocked "soon" wasn't last week. I have no idea. Maybe there is some other (legal) limitation on the iCloud E2E backups they needed to solve first?
> If the PSI/CSAM system had been announced alongside E2E encryption for iCloud backups, it would be clear that they were attempting to act in their users' best interests.
Most likely because the only way this announcement makes sense is that they were trying to respond to misleading leaks. We'll probably see some E2EE announcements in September, since the iOS 15 beta supports tokens for backup recovery. At least, let's hope so.
[1] https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-...
[2] https://epic.org/amicus/crypto/apple/