
They already censor Grok when it suits them.


Yep. "Oh, Grok is being too woke" gets Musk to comment that they'll fix it right away. But turning every woman on the platform into a sex object and a target of humiliation? That's just good fun, apparently.


And when it's CSAM, suddenly they "only provide the tool" and bear no responsibility for the output.


I even think the discussion focusing on CSAM risks missing critical stuff. If Musk manages to make this story exclusively about child porn, and gets to declare victory after taking basic steps to address that without addressing the broader problem of the revenge-porn button, then we are still in a nightmare world.

Women should be able to exist in public without constantly having porn made of their likeness and distributed right next to their activity.


Exactly this. It's an issue of patriarchy and the domination of women and children. CSAM is far too narrow.


What does that have to do with what I said?


If censoring Grok's output means legal liability (your question), then that legal liability already exists anyway.


But that’s not my question, nor my characterization of their position.

I replied to:

> They don’t seem to have taken even the most basic step of telling Grok not to do it via system prompt.

“It” being “generating CSAM”.

I was not attempting to comment on some general censorship debate, but instead to point out that CSAM is a pretty specific thing, with pretty specific legal liabilities depending on the region!



