
So what do we do with that information? If I apply a critical framework to interpreting the LLM's output, the answer is to reject it: not only is it not necessarily true, but the LLM isn't even trying to be correct; it's strictly trying to produce convincing sentences.

What value does a link to a source provide when that source isn't held to any standard of being informative? It seems a waste of everyone's time to me.
