I agree with your general point, but the specific example isn't as egregious as you make it out to be. In the mid-90s, protocols for determining supported monitor resolutions were just coming out. The first widely adopted one was DDC2, which wasn't standardized until mid-1996. Had Windows 95 or 98 defaulted to 1024x768@72Hz, a significant fraction of users would have upgraded only to find their display didn't work. To get a usable system, they'd have had to figure out how to boot into safe mode and manually tweak the display settings.
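For context on how that negotiation eventually worked: over DDC2 the OS reads a 128-byte EDID block from the monitor, and part of that block is an "established timings" bitmap saying which standard modes the monitor supports. Here's a rough Python sketch of decoding that bitmap (byte offsets and bit meanings are from the EDID 1.x layout as I recall it; the sample EDID bytes are made up):

    # Established timings bitmap, EDID 1.x, bytes 0x23-0x24.
    # Keys are (byte offset within the bitmap, bit number).
    ESTABLISHED_TIMINGS = {
        (0, 7): "720x400@70Hz",
        (0, 6): "720x400@88Hz",
        (0, 5): "640x480@60Hz",
        (0, 4): "640x480@67Hz",
        (0, 3): "640x480@72Hz",
        (0, 2): "640x480@75Hz",
        (0, 1): "800x600@56Hz",
        (0, 0): "800x600@60Hz",
        (1, 7): "800x600@72Hz",
        (1, 6): "800x600@75Hz",
        (1, 5): "832x624@75Hz",
        (1, 4): "1024x768@87Hz (interlaced)",
        (1, 3): "1024x768@60Hz",
        (1, 2): "1024x768@70Hz",
        (1, 1): "1024x768@75Hz",
        (1, 0): "1280x1024@75Hz",
    }

    def supported_modes(edid):
        """Return the modes flagged in the established-timings bytes."""
        timings = edid[0x23:0x25]
        return [mode for (byte_idx, bit), mode in ESTABLISHED_TIMINGS.items()
                if timings[byte_idx] & (1 << bit)]

    # Fabricated example: a monitor advertising only VGA/SVGA modes.
    fake_edid = bytearray(128)
    fake_edid[0x23] = 0b00100011  # 640x480@60, 800x600@56, 800x600@60
    fake_edid[0x24] = 0b00000000  # no 1024x768 modes at all
    print(supported_modes(bytes(fake_edid)))

The point being: without something like this to read, defaulting to 1024x768 is a blind guess, and on the monitor sketched above it would have produced a black screen.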
I'm pretty sure this was 1999-2000 or thereabouts.
It's possible that standards were still too rough, but the statistic left a strong impression on me. I believe the general concept is validated elsewhere as well; I'm pretty sure Jakob Nielsen has a similar finding.