For example, if I run something that can take advantage of AVX-512 and my current CPU doesn't support it but a new CPU does. The same goes for TB2 vs. TB3, which is very useful when you want to connect an external GPU. It does work over Thunderbolt 2, but the extra bandwidth of Thunderbolt 3 is a nice improvement.
Say you change your workload model: there can be ~20% performance differences between bare metal, virtual machines, and containers. If you simulate part of an infrastructure using containers, you may not need more RAM, but more CPU would be nice. But when you then want to do a lot of recording/capturing and process that data, RAM becomes more important. Just upgrading the RAM wouldn't help much, though, because without a CPU fast enough to generate the data you might as well offload the whole thing.
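As an aside on the AVX-512 point: whether the current CPU exposes it is easy to check before deciding an upgrade is worthwhile. A minimal sketch, assuming a Linux system where feature flags are listed in `/proc/cpuinfo` (the path parameter is only there so the check can be pointed at a saved copy of the file):

```python
def cpu_has_avx512(cpuinfo_path="/proc/cpuinfo"):
    """Return True if any AVX-512 feature flag (avx512f, avx512dq, ...) is present."""
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                # Feature flags appear on lines like "flags : fpu vme ... avx512f"
                if line.startswith("flags"):
                    return any(flag.startswith("avx512") for flag in line.split())
    except OSError:
        pass  # e.g. non-Linux systems without /proc/cpuinfo
    return False
```

On macOS the equivalent information comes from `sysctl` (e.g. `hw.optional.avx512f`) rather than `/proc/cpuinfo`, so this sketch is Linux-only.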
> For example, if I run something that can take advantage of AVX-512 and my current CPU doesn't support it but a new CPU does. The same goes for TB2 vs. TB3, which is very useful when you want to connect an external GPU. It does work over Thunderbolt 2, but the extra bandwidth of Thunderbolt 3 is a nice improvement.
This is definitely an echo-chamber/bubble point of view. The vast majority of users out there don't even know or care what AVX is, don't use external GPUs, and don't know or care about the difference between Thunderbolt 2 and 3.
If you personally need these things and want to buy a new machine every few years, then that's great, you should do that. But there are a ton of people who would benefit from an easily repairable, easily upgradable (RAM, storage) machine, and who end up dropping $1500 every few years instead of the couple hundred dollars a reasonable upgrade would cost.
It seems you are responding to the wrong thread here. User the_af was asking me why I replace a machine, and I answered with some reasons specific to me. This was a deeper dive into the point that some people don't need to upgrade at all, because they don't do anything different between day 1 of their usage and day 1780. And they don't need to, because laptops from the last decade don't fall apart as much as they used to, and a Mac specifically tends to work well during its entire lifecycle. This is also why there aren't as many people interested in modifying their computers mid-lifecycle.
While I bet there are a lot of people who do want to modify their systems, they are such a minority that it's not very logical for a large multinational to invest in that to the detriment of other goals. It might simply mean that you are not the target audience for their product(s).
Some other manufacturers/brands do the same, while others maintain a mixed portfolio to cater to smaller groups as well. We also have large manufacturers that cater to classical enterprises, which still run on the old idea that you need a fleet of identical machines and swap out parts all day long, so machines with facilities for that exist. Most notably, Lenovo, HP, and Dell do that.