Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

Yeah, the confusion between jailbreaking and prompt injection is definitely a big problem.

People who are frustrated at the safety measure that jailbreaking aims to defeat often assume prompt injection is equally "harmless" - they fail to understands that the consequences can be a lot more severe to anyone who is trying to build their own software on top of LLMs.



Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: