I was pondering why modern operating systems take so long to start up... We have a huge amount of processing power on each machine, after all, and we should be able to run all the required startup routines thousands of times faster than just a few years ago. I realize that the slowest part of a modern computer is the hard drive (around 90 MB/s of sequential throughput is fast for a non-SSD drive), and that's before you account for seek times.
Which leads me to the question: why not GZIP all the required startup files? In fact, why not TAR them too, and make them sequential? Our modern CPUs would be more than powerful enough to handle the decoding process. Is there something preventing this compression? Or are they doing this already?
The reason that modern operating systems take a fairly long time to start up is things like enumerating hardware for PnP, getting a network address from DHCP, and starting whatever system services are necessary for you to have a pleasant (if you want to call it that) user experience. It really doesn't have anything to do with the size of the executable files, since the CPU has to run the same code to make it all happen either way (you can't execute GZIP'd code and have it run faster).
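To see why compression isn't the bottleneck either way, here's a rough sketch (an illustration, not a rigorous benchmark) using Python's standard `gzip` module: on any modern CPU, inflating data runs far faster than a ~90 MB/s spinning disk can deliver it, so decompression cost is essentially free relative to I/O. But it also does nothing to shorten the waits (DHCP, device enumeration, service startup) that actually dominate boot time.

```python
import gzip
import time

# Illustrative payload (~11 MiB of highly compressible "file" data).
data = b"example startup file contents\n" * 400_000

blob = gzip.compress(data, compresslevel=6)

start = time.perf_counter()
restored = gzip.decompress(blob)
elapsed = time.perf_counter() - start

assert restored == data  # decompression is lossless
print(f"inflated {len(data) / 2**20:.1f} MiB in {elapsed * 1000:.0f} ms "
      f"({len(data) / 2**20 / elapsed:.0f} MiB/s)")
```

Typical results show decompression throughput well above what a mechanical disk can read, which is why some systems do ship compressed images, yet boot time barely changes.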