Hacker News

> Yes, you can turn off iCloud Photos to disable Apple’s scanning,

So, Apple now has the ability to reach into every law-abiding citizen's personal phone and report them to the FBI when the algorithm makes mistakes (and it will - nothing is 100%, particularly in image recognition), AND THEN THE PERPETRATORS CAN STILL GET AWAY BY JUST TURNING IT OFF??!??!?!

My goodness, I couldn't imagine a stupider way to fight CSAM.

> Facebook made 20.3 million reports of Child Sexual Abuse Material (CSAM) in 2020.

Were there 20.3 million arrests of offenders? I don't think so. I doubt they caught 20 perps from that data. While it may be a better algorithm, it should be a good warning of how often your very legitimate, non-CSAM data will be leaked to a "verifier", and worse, to the authorities.

Take a look at how often YouTube channels are taken down by "auto triggers" that are then "reviewed" and you will get a good idea of how often the government is going to screw law abiding citizens. It's not zero, and imho, it will be well above a range of "acceptable fallout".



> Were there 20.3 million arrests of offenders?

I don't know of any data from the US, but in Switzerland the Federal Police pre sorts all reports from the NCMEC, and around 90% are unusable [1] and cannot be acted upon. The remaining 10% are then forwarded to local authorities. There is no data I know of on what the final conviction rate looks like, but from what I have gathered from local newspapers these departments are usually short-staffed.

I suspect that the situation is similar in the US.

[1]: https://fedpol.report/en/fedpol-in-figures/fight-against-pae...

Edit: Fix typo: persorts to pre sorts.


That YouTube example hits home. I uploaded my daughter's videos there when she was little to share with family. Then my account got flagged for a copyright notice and the channel got suspended. I've been trying to get YouTube/Google to reinstate the channel - it's been six years, no luck. I keep trying. I hate it.


There are lots of googlers here on HN. Message all of them. Message them all the time. I am sure you'll get your issue resolved.

I know it would be rude to keep messaging them, but in the end they represent Google and they are supposed to help you, at least morally.


It's the general problem that if policy violations and negative PR hurt the platform more than false positives do, then it will err on the side of flagging wrongly. Which means more people demonetized/banned. We're all just collateral damage.

As for the 20.3 million reports, Facebook probably decided it was better to bury the authorities in a massive pile of paperwork than to see a day in court. Plus they get to claim they are doing something.


> AND THEN THE PERPETRATORS CAN STILL GET AWAY BY JUST TURNING IT OFF??!??!?!

Not trying to defend Apple here, but we should keep in mind that Apple's goal is to prevent such photos from being uploaded to iCloud and to detect those that are. If someone turns off iCloud sync, then Apple would argue that scanning photos saved on the device is not its concern (and thank god it doesn't, otherwise the repercussions would have been much more significant).


From what I've heard Facebook is one of the better places to go to groom children.

Better would be educating people: don't use your real name or upload pictures of yourself to the internet, be very careful talking to strangers. Don't let younger children use social media services.


Hasn’t the ship sailed already? What is TikTok but everyone’s little kid dancing around? We are here for a reason, I think: we let rampant exhibitionism go so far via social media that teenagers and preteens are mimicking the compulsive sharing of the adults who first seeded these platforms.

Now they literally need to check everyone’s phone for illegal content.

Man, have I become my parents or what.


I don't know about your parents, but mine signed me up for Facebook when I was a preteen (thankfully I never used it and stuck to forums/IRC/wikis that all used pseudonyms).


This only makes sense for non-CSAM data and/or preemptive surveillance/policing. Proper criminals/terrorists will evade this by either... uhm... disabling it, network-filtering it, or using Android or no phone at all for their criminal activities - at the very least once you put this pressure on the "market" for a while.

Few people have access to the databases. I would bet quite a bit that this will be used for marking possible offenders of all sorts of crimes and "crimes", e.g. having ISIS propaganda material on your phone, or leaked data. Sometimes you may not even know files are on your phone. Try exploring Telegram's nearby groups with image auto-download for groups activated and see what happens to your "share recent files" dialog... It's all porn and nazi memes now!

Overall, I am happy Apple ultimately admitted this, because I was conflicted about buying into the ecosystem for the recent hardware appeal. Not conflicted anymore. Not at all. It's Linux/AOSP or ~~die~~ get stressed out. No M1 benchmark or fancy watch health features can make up for the chronic knot in my stomach using Apple's products; the scissors in your head (self-censorship). Hope I find a low-latency pen-input tablet running FOSS Android or Linux too, as I feel uneasy about journaling or drawing on my iPad now. Thanks Apple, I truly hate you too <3


It's automated, weaponized doxxing. We used to worry about teenagers calling the FBI on us, now we have to worry about someone AirDropping you a photo album that will get you arrested.


Not sure why the downvotes - the more automation like this we have, the easier it becomes to make someone's life hell: first fighting the false positives, then living with the checkered record they leave behind.


I think people underestimate the black mark of even being investigated, much less charged by the police. Even if they find you didn't do something wrong, even if they drop charges, even if you are found innocent after a trial, the mere record that you were ever suspected of something can follow you the rest of your life.

The standard is supposed to be that criminals are those "convicted" of crimes, but in reality the nature of risk mitigation for most people and organizations means that they will avoid anyone with any type of history with the criminal justice system. No one will care that there was an overzealous prosecutor, corrupt cops, etc. The burden falls on you to prove you're innocent and even with convincing evidence it may not be enough.


I agree. Even being suspected of doing something bad can easily be life-changing - going by the example of celebrities, for one. One of the secrets of the nothing-to-hide argument is, I think, that people haven't experienced a thorough search on them by some authority - the situation where one "lives or dies" according to what they find and, especially, how they interpret it. If not exposed to this, it's much easier to think that it won't happen to them, that they're not the type to be exposed to things like this.


Given that it’s already scanned server side, you could literally do this to someone now. Both cases (client or server side checking) simply require getting a user to download it to a location that auto syncs to iCloud.

If this was as big an issue as people think, we’d have seen more of it.


> My goodness, I couldn't imagine a stupider way to fight CSAM.

That's the point. It satisfies the letter of the demands from the crazies while not actually accomplishing anything, and it creates discourse that pushes the rest of the population towards realizing that we shouldn't be bending to the whims of the crazies at all.


But that doesn’t work, because the crazies will simply take this victory and move the goalposts forward. They’ll do so emboldened by your capitulation and shame you for not acquiescing earlier.

The only winning move is not to play.


Not really. Even without refusing to play, any Big Tech corp that sticks to its guns will eventually face opposition.

What Apple should do is educate people. (Of course then we have the problem with corporate propaganda.)


It was becoming clear that the goalposts were going to move forward towards outlawing end-to-end encryption entirely. This provides malicious compliance to prevent worse.


Apple's goal is not to fight all CSAM on iPhones; their goal is to keep it out of their servers. A criminal could bypass the iCloud-related checks, but they would still have to contend with similar measures taken by video- and file-sharing services.


I think this is something that has not really been discussed much:

How useful is this feature? If the goal is to protect children, there must be good evidence that this does, in fact, protect children.


I think the goal is to protect Apple.


When I saw that - 20.3 million reports - I was like... hold on. Apple's smaller reported numbers are probably almost completely accurate, whereas that 20 million number is almost guaranteed to be mostly wrong.

Who really made the mistake? The one reporting cases that are almost definitely true? Or the one reporting so many false cases that authorities are buried in false reports?


Listen to the story of any former Facebook content filtering employee and you would not be surprised by that figure. They have horrific jobs that leave them with PTSD and drive some to suicide. Also it's worth mentioning that 20 million flagged images does not mean 20 million unique offenders.


For any population you test in which the condition is very rare, even a very low false positive rate will produce overwhelmingly many false positives among the flags.

Also, one bad guy with thousands of photos could trivially be counted as 5,000 reports instead of one report listing 5,000 images.
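The base-rate effect described above can be sketched with made-up numbers (none of these rates are Facebook's or Apple's actual figures; they are purely illustrative):

```python
# Base-rate sketch: when the condition is rare, even a tiny false
# positive rate means most flags hit innocent users. All numbers
# here are hypothetical.

def flag_counts(population, prevalence, tpr, fpr):
    """Return (true_positives, false_positives) for a screening test."""
    positives = population * prevalence      # items actually illegal
    negatives = population - positives       # everything else
    true_pos = positives * tpr               # correctly flagged
    false_pos = negatives * fpr              # innocent but flagged
    return true_pos, false_pos

# 1 billion photos, 1 in 1,000,000 actually illegal, a detector with
# 99% sensitivity and a 0.1% false positive rate:
tp, fp = flag_counts(1_000_000_000, 1e-6, 0.99, 0.001)
print(tp, fp)                      # 990.0 vs 999999.0
print(round(tp / (tp + fp), 4))    # ~0.001: >99.9% of flags are wrong
```

With these (invented) rates, roughly a thousand wrong flags land for every correct one - which is consistent with 90% of reports being unusable before anyone even checks conviction rates.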


> Apple's smaller reported numbers are probably almost completely accurate

Where are the hundreds of prosecutions resulting from apple's reports? I can't find them: https://courtlistener.com/


> JUST TURNING IT OFF

Or just apply image filters to the data; that will defeat any neural-network detection (since there is an infinite number of filters, especially the deep-fake ones).


The on-device software is not a recognition algorithm; it compares hashes of an image against a known list of hashes.


I'm unclear how this is distinct from an algorithm. Can you clarify?
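For what it's worth, hash-list matching is itself an algorithm - a lookup of an image's hash in a blocklist. A minimal sketch, using SHA-256 and made-up database entries purely to show the structure (Apple's actual system uses a perceptual NeuralHash plus threshold matching, not a plain cryptographic hash):

```python
# Hypothetical hash-blocklist lookup. The database entries are
# invented for illustration; real databases hold perceptual hashes
# of known images, supplied by clearinghouses.
import hashlib

known_bad_hashes = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_database(image_bytes: bytes) -> bool:
    """Flag an image only if its hash is already in the blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in known_bad_hashes

print(matches_database(b"example-known-image-bytes"))  # True
print(matches_database(b"an-unrelated-photo"))         # False
```

Note that a cryptographic hash changes completely if a single byte of the image changes; that is why deployed systems use perceptual hashes, which tolerate resizing and re-encoding - and which is also where the false-positive risk comes in.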



