
this is false. it's only for photos uploaded to apple's cloud.

the tech runs locally, but only on those photos.



Why do some people seem to think it will stay limited to that forever? It's not just you; multiple people in here think this is ok. These are our devices. They should not be running software that can get us arrested. It won't stop there. It never stops there when they can ask for more.


If you take their technical summary [1] at face value, they designed it to be limited.

Even if the hashing and matching happen on the local device, a match can only be revealed server-side. The hash database distributed to local devices is blinded with a server-side secret key, and the locally derived match data has to be decrypted with that key before Apple can read it. So in theory, if the local device doesn't upload content to iCloud, no match can be revealed, even if the hashing and matching have been done locally.

Of course, you also need to trust that Apple won't upload those locally derived hashes to iCloud without the user's permission when iCloud Photos is disabled.
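
To make the blinding step concrete, here is a minimal Python sketch of the general idea, assuming a toy discrete-log-style blinding. It is not Apple's actual PSI construction (the real protocol uses elliptic curves plus threshold secret sharing, and encrypts vouchers so the server learns nothing about non-matches); every name and value below is illustrative.

    import hashlib
    import secrets

    # A prime modulus, for illustration only; real systems use elliptic curves.
    P = 2**255 - 19

    def hash_to_group(item: bytes) -> int:
        # Map an item (e.g. a perceptual image hash) to a group element.
        return int.from_bytes(hashlib.sha256(item).digest(), "big") % P

    # --- Server side: blind the hash database with a secret key k ---
    k = secrets.randbelow(P - 2) + 2                  # server-side secret key
    database = [b"known-bad-hash-1", b"known-bad-hash-2"]
    blinded_db = {pow(hash_to_group(h), k, P) for h in database}
    # Only blinded_db ships to devices; without k, the device can't
    # tell which of its local hashes match entries in it.

    # --- Device side: derive the local hash ---
    photo_hash = b"known-bad-hash-1"                  # hypothetical local hash
    voucher = hash_to_group(photo_hash)               # uploaded with the photo

    # --- Server side: only the holder of k can reveal a match ---
    print(pow(voucher, k, P) in blinded_db)           # True

The point of the sketch is the one made above: the device can run the hashing and hold the blinded database, but turning a local hash into a yes/no match requires the server's key k, so nothing is revealed until something is uploaded. (In this toy the voucher also leaks the raw hash to the server; the real protocol encrypts vouchers so non-matching photos stay private.)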

[1]: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...


because we’re not discussing hypotheticals, but real life.

and in real life, governments elected by the people have been pushing for this for years. the result is that google and the other big cloud providers already implement this; apple was the last major one to hold out.

will they expand this in the future? sure, whatever. the system is so broken, and i’m so powerless, that at this point in time it doesn’t matter what i want.

at least it will only apply to the US. the rest of the world is spared, at least for now.



