
Odd, that was my Google query yesterday.

I'm curious what kind of hardware can sustain 100k concurrent connections these days.



We ran a speed test comparing Node against .NET Core, and even on a small Linux box (4 GB RAM, 1 core) we could reach nearly 10K concurrent requests for a basic HTTP response. The exact nature of the system will affect that massively, though.
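For context, a minimal sketch of the kind of endpoint such a test hits, assuming Node's built-in http module (the port and the load-test command below are illustrative choices, not the exact setup we used):

    // minimal Node HTTP server returning a basic response
    import { createServer } from "node:http";

    const server = createServer((req, res) => {
      res.writeHead(200, { "content-type": "text/plain" });
      res.end("ok");
    });

    server.listen(3000);

A load generator like wrk can then drive it with a chosen number of open connections, e.g. wrk -t4 -c10000 -d30s http://localhost:3000/ (you'll generally need to raise the open-file limit with ulimit -n first, since each connection consumes a file descriptor).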

Add large request/response sizes or CPU/RAM-bound operations and your servers can very quickly hit their limits with far fewer concurrent requests.
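To illustrate the CPU-bound case, a sketch under the same assumptions as above, with a hypothetical hashing workload: the handler does synchronous work, so it blocks Node's event loop and concurrent requests queue behind it, and throughput collapses long before anything like 10K.

    // CPU-bound handler: synchronous hashing blocks the event loop,
    // so other requests wait instead of being served concurrently
    import { createServer } from "node:http";
    import { createHash } from "node:crypto";

    const server = createServer((req, res) => {
      let digest = "seed";
      for (let i = 0; i < 100_000; i++) {
        digest = createHash("sha256").update(digest).digest("hex");
      }
      res.writeHead(200, { "content-type": "text/plain" });
      res.end(digest);
    });

    server.listen(3001);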

Architecture is a big-picture task: you have to consider the whole system before implementing part of it, otherwise you end up having to start again.


Thanks, that's already a lower-bound point of reference.



