While that will always be true, LLMs do it far more often, and with confidence and poise. We have evolved ways to tell when someone is making shit up (and they usually work); LLMs subvert them. We are also being sold the idea that these LLMs are some kind of superintelligence, which isn't helping matters.

