
This exact argument has been made since people had PCs on their desks (the 1990s, at least). We're still trying to figure out how to do it well.


Yes, but as the numbers advance from "the remote desktop has like 128KB" to "the remote desktop can chew through gigabytes without much stress", the delta between O(n) and O(n^2) opens up a lot more. It is perhaps a bit counterintuitive, but as systems grow in capability, that delta grows.


Server capacity has grown along with it, if not faster.

A server in 1990 had total storage and RAM roughly comparable to a low-end mobile phone today.


Server capacities have not grown at O(n^2) with data set sizes.

Computational complexity classes are not intuitive things. You can't just go "oh, that's much bigger than that, so it must be that many times better".
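A minimal sketch of the point being made, using made-up sizes (128 KB vs. 4 GB are illustrative, not from any real benchmark): for an O(n) algorithm, hardware that is k times more capable handles k times the data, but for an O(n^2) algorithm the cost ratio between the two workloads grows with n itself.

```python
# Compare the cost of an O(n) vs an O(n^2) workload at two data sizes.
# The sizes are hypothetical, chosen to echo "128KB" vs "gigabytes".
small = 128 * 1024      # ~128 KB
large = 4 * 1024 ** 3   # ~4 GB

for n in (small, large):
    linear = n          # O(n) cost model
    quadratic = n ** 2  # O(n^2) cost model
    # The gap between the two models is a factor of n, so it widens
    # as capacity grows rather than staying constant.
    print(f"n = {n:>13,}: O(n^2) costs {quadratic // linear:,}x the O(n) cost")
```

So a machine with ~32,000x the capacity faces an O(n^2)/O(n) gap that is ~32,000x wider, which is the "counterintuitive" delta described above.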



