
We're really going to need to figure out how to power all these GPUs with green power real quick, or we're going to melt the planet having AIs debate with themselves on the optimal solution to tic-tac-toe...


I've felt this way when using ChatGPT for a simple search. Stuff that Google could handle, just more slowly, mostly because I'd have to filter the results manually.

Sometimes it's the easiest way to complete a very small task, but the cost difference on the backend has to be pretty damn large. The user inevitably ends up not caring whatsoever. It's just not real to them.


I've heard infra people say that's pretty much the only bottleneck in the data center right now: power and cooling. The AI needs to run against itself continuously, and that's just a fact.


Maybe we should assign them a practical task, like making paperclips.




