You can, but it’ll take longer. So one way to get faster answers is to stream the response as it is generated. In GPT-based apps the response is generated token by token (a token is roughly 4 characters), hence what you’re seeing.
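A minimal sketch of the idea, using a hypothetical generator in place of a real model: tokens are printed the moment they are produced, so the user starts reading long before generation finishes (`generate_tokens` and its ~4-character chunking are illustrative assumptions, not any real API).

```python
def generate_tokens(text, chunk_chars=4):
    # Hypothetical stand-in for a model emitting ~4-character tokens.
    for i in range(0, len(text), chunk_chars):
        yield text[i:i + chunk_chars]

def stream_response(text):
    # Print each token as soon as it is produced, instead of
    # waiting for the full response before showing anything.
    pieces = []
    for token in generate_tokens(text):
        print(token, end="", flush=True)  # user sees output immediately
        pieces.append(token)
    print()
    return "".join(pieces)
```

The total generation time is unchanged; only the perceived latency drops, because the first token arrives almost immediately.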