
In raw, black-and-white terms, yes, I grant that an image with my name next to it is technically public. However, a profile portrait is typically uploaded so that someone who searches for your name can determine whether a particular URL represents the person they met offline. I have to walk past a security camera to go through a store checkout; my image is in that camera because I want to buy stuff, not because I want my image to be public. Likewise, I might present my ID to the clerk because I want to prove I'm of age to legally buy alcohol, not because I want the camera to link my face to my name (and my purchase list) or anything creepy like that.

When a person uploads a profile picture or appears in a security camera feed, they typically have an intent that doesn't match Clearview's use case, and an expectation that what Clearview.ai is trying to do with the image is humanly impossible. Historically, it has been impossible. True, some people are good with faces, and I'm sure some of them work in law enforcement or advertising, but no human can cross-reference 7 billion profile pictures against every security camera feed on the planet and remember who went where at what time.
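
To make the scale point concrete: once faces are reduced to embedding vectors, matching one camera frame against an enormous gallery is a single matrix multiplication. The sketch below is illustrative only; the 128-dimensional random vectors, the million-entry gallery, and the noisy probe are stand-ins I've made up, not anything from Clearview's actual pipeline.

    import numpy as np

    # Toy illustration: random unit vectors stand in for the face
    # embeddings a real recognition model would extract from photos.
    rng = np.random.default_rng(0)

    def normalize(v):
        return v / np.linalg.norm(v, axis=-1, keepdims=True)

    # Hypothetical gallery: one embedding per enrolled profile picture.
    gallery = normalize(rng.standard_normal((1_000_000, 128)).astype(np.float32))

    # One face cropped from a camera frame (here, a noisy copy of entry 42).
    probe = normalize(gallery[42] + 0.1 * rng.standard_normal(128).astype(np.float32))

    # Cosine similarity against the whole gallery is one matrix product;
    # it runs in well under a second on commodity hardware.
    scores = gallery @ probe
    print("best match:", int(scores.argmax()), "score:", float(scores.max()))

Scaling from a million entries to billions is an engineering problem handled by approximate nearest-neighbor indexes, not a fundamental barrier, which is exactly why the old "humanly impossible" expectation no longer holds.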

I'd argue that scale makes a fundamental difference in whether a right applies. A human looking at one data point needs to be treated differently, ethically and legislatively, from a machine looking at a million identical data points, because the use cases are different.

Clearview.ai is trying to make a land grab on human rights, asserting that because the things they're trying to do have not yet been prohibited (because they're complicated, and because no one realized they were feasible), they ought to continue to be allowed to do them.


