Ask HN: Is Software Getting Worse?
47 points by zgk7iqea on April 12, 2023 | 104 comments
A couple of days ago I was trying to check my messages on a real estate website (I won't say which). But scrolling only worked in landscape mode.

Then I tried to watch a TikTok that a friend sent me, but the website took forever to load. When it finished loading, the player didn't work.

What is happening? Is software quality in decline?



To me as a user, what's in decline is UI, UX and trust. Lots of software these days is just outright hostile, in both very "in your face" ways and subtle ways.

Even software I pay for seems to try to screw me at every turn. "You paid for me? Have some ads! I'll scan everything you do and sell it off to the highest bidder! No, you can't copy and send this image to a friend, but here's a link to try and lure them into the walled garden so I can data rape them too. Stay in my ecosystem, it's great. You love it here! Click click click, keep clicking. Moar data to sell! Give moar! Data breach! I'm soz (but no compensation for you)."

Oh, and when you live off-grid with little power you learn to hate Electron apps. They are such power wasters. Three things cause my inverter to spin up its cooling fans: games, Windows updates and Electron apps. (And of course if you have "AC" plugged in, most apps seem to think it's a damn free-for-all for power. So it's more power efficient to charge powered down and only run on battery if possible.)


> data rape

No. Just, No.


Honestly, I think the answer is no. But the sheer amount of software in our world is increasing. So if software quality lands on a bell curve, you'll see the lower end of that curve more often. Which actually argues that software quality is increasing - when so much of it "just works", the ones that don't stand out.


I lean towards yes, because even things that formerly worked now fail without good reason. It feels like quality is not a top priority anymore, and I think that trend started with network patches: suddenly you could sell unfinished software while you were still working on it.

For example, the most recent update to iBooks from Apple glitches if you rotate the screen, and stays that way even if you rotate it back. iBooks is an app that recently got a tiny facelift, yet no one tested what happens if you flip the phone.

I agree that we use more software now, so we should experience more buggy software, but if that were the case, we should have experienced the same proportion of buggy software all our lives. I still think easy patching facilitated bad quality practices, although those bad practices probably make commercial sense.


The most recent update fixed this issue. While the easy workaround was locking rotation inside the iBooks preferences, I much prefer for it to follow the orientation.

That was fast, which is the bright side of network patches.



Indeed, and the issue with the web is that the bad 90%, instead of being buried in the pile of broken software, is only a click away.


For software quality to have a bell curve distribution, you have to define negative software quality first and then allow quality to be unbounded in both directions. Otherwise it is either a one-sided distribution (when there is no negative quality) or a bounded distribution, and in neither case is there a guarantee that sums of independent variables converge to a bell curve. A sum of log-normals, for instance, is notoriously slow to start looking normal, however hard you try [1].

[1] https://stats.stackexchange.com/questions/238529/the-sum-of-...
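
A quick simulation sketch of my own (not from the parent comment; assumes numpy/scipy, parameters arbitrary) of how slowly such a sum starts looking normal:

    # Skewness of a sum of n i.i.d. log-normal(0, 1) draws; a true normal has skewness 0.
    import numpy as np
    from scipy.stats import skew

    rng = np.random.default_rng(0)
    for n in (1, 10, 100):
        sums = rng.lognormal(mean=0.0, sigma=1.0, size=(100_000, n)).sum(axis=1)
        print(n, round(float(skew(sums)), 2))  # stays clearly positive even as n grows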

Which brings up the question: what is negative software quality? Can we define that?


Solitaire on Windows 3.1 started almost instantly and did exactly what you wanted. Solitaire on Windows 10 is a laggy behemoth that won't even work if you aren't connected to the Internet. Nobody asked for solitaire to be connected to the Internet.

It's absurd that a program should be so much slower on machines that are literally about 1000x faster.


Investors: Dude! How about "IoS"? Internet of Solitaire's! (sound of cash machines)


> (sound of cash machines)

That's basically the end of this thread.

As in, why isn't it better? (sound of cash machines)


Microsoft in general has consistently had a problem with dropping blocking network IO "stuff" into its applications - e.g. it's always been absurd that Explorer will just lock right up if it hits network lag when doing SMB browsing.


Same with Mac OS constantly asking Apple whether it's fine to start a program. Which was fun not long ago when that Apple server went down and people couldn't start programs anymore as long as the Mac was connected to the internet because Apple of course couldn't imagine their own creation being fallible.


Are you talking about gatekeeper flags or app signatures? As far as I know, both of those can be managed locally, I don’t think Apple is informed every time you try to open a process.


I have no idea what it's called, but it is a very noticeable effect and it also happens whenever I run a freshly compiled executable for the first time. It usually causes a delay of about 0.5-5 seconds depending on the wifi connection. Which is kind of insane: you upgrade to a literally 3 times faster machine and then it sometimes runs slower than before.

And not network-related but still annoying is Mac OS' habit of always spinning up connected hard drives each time a "save file" dialog opens, which leaves that dialog completely unresponsive until all drives are ready. Save a file and the most expensive Mac can be bottlenecked by a single external HDD. It's just so unnecessary.


Discovered this the hard way while diagnosing heavy Teams latency in the UI while Pihole was blocking MS telemetry.


Almost nobody cares about the value to the user anymore. Value to the owner/seller of the product is what matters.

This results in poor UX, apps spying on you constantly, useless features and easy-to-fix bugs that aren't fixed for years.

As a software developer I'm disappointed in this approach. If I can, I try my best to produce the best experience for the end user, but in the end I can only get satisfaction in open source...

But open source is sometimes hell too!

Just recently I found a super popular open source library that earns $200k each year from donations, but its GitHub is filled with issues and PRs. The author doesn't care; the money flows anyway. It hurts to see that most major issues are already fixed by PRs, but the author doesn't even bother to merge & release.

Sad.


I'm curious which OSS project you're referring to. That doesn't sound typical at all.


Please share the link so I can fork it and snag some of these sweet donations


> Almost nobody cares about the value to the user anymore. Value to the owner/seller of the product is what matters.

I think the situation might be different if the expectation for software wasn't that it cost nothing.


I think two things are happening.

One is the “things aint built like they used to be” bias — buildings from the past that are physically well built are the ones that don’t get torn down, so we associate the past with the sturdy stone buildings instead of the clapboard houses.

The other is that you are right. We keep pushing rendering engines originally designed to render hypertext into a de facto operating system, so even basic software like a real estate site or video player has to deal with a bunch of accidental complexity that the equivalent desktop software from 1999 didn’t have to. It’s worth noting that both your examples are browser-based apps, not desktop or systems software.


The year is 2036. I sit down at my 128 core 7 GHz computer with 1 TB of DDR8 memory on my 10 gig fibre line.

Discord and MS Teams still take 30 seconds to open.


How did they manage to get a 7 GHz CPU? Or was it nothing more than a marketing way to sell good old 3.5 GHz crystals?


It's like the thing where light bulb brightness is rated in watts, but it's not the amount of power actually used, it's how much power an incandescent bulb of the same brightness would use.

Numbers all become meaningless around 2029 and CPUs are now sold as "as powerful as a 7 GHz CPU". No attempt is made to specify anything else about the hypothetical 7 GHz CPU.


In my opinion, software is getting much, much better.

Twenty years ago, applications crashed all the time and there was no thought given to data recovery. For example, whilst writing this post, I accidentally clicked a link, then pressed back, and Chrome hadn't lost my post. Back in the day, a mistimed backspace would irretrievably destroy your input. Video games frequently wouldn't work without messing with driver settings - how do I know if I want to use Glide or DX?

Playing a video involved messing around with codecs and if you were unlucky, the video was actually just a virus that would completely own your computer.

There is one major exception, which is websites. After the age of popups, but before the age of privacy warnings, there was this short window where you could just open a web page and it didn't have a whole bunch of crap you had to dismiss before you could access the content.


You gave 2 large companies as examples.

There's SO MUCH incredible software out these days... better than anything that's come before it... software is getting better - media harvesting is getting worse. So... support good companies, and make a point of not supporting the ones which do this. It comes at a cost - but - that's life.

For creators, particularly in music, we've never had it better...

PAID
- Ableton blows my mind every time I touch it.
- So does FL Studio.

FREE
- Surge XT - the most incredible free software synthesizer going
- Vital - a free version of Serum which sounds incredible
- Ardour - a free Pro Tools stand-in
- Airwindows - tonnes of audio processing plugins
- Variety of Sound - beautiful UIs, awesome sound
- Blender... holy shit...

I could go on...


DaVinci Resolve. Proper video editing, VFX, and colour grading software as used on stuff you've seen in the cinema and on telly.

Free, if you don't want the clever "neural engine" stuff or >4k timelines.

Oh, you do want that? That'll be 300 quid then, and it's yours for life. They have never even charged for the updates, either.

Of course, they make their money on cameras and control surfaces.


DaVinci and Ableton are both good examples to me. I love Ableton and use it a ton, but man does it crash almost every session for me. I finally bit the bullet and bought Resolve to use some of its non free effects and have already run into a number of bugs. I’m still very grateful I have access to that level of software, but it does feel like things being 90% complete at best is the standard these days.


> I love Ableton and use it a ton, but man does it crash almost every session for me.

Is that when using a plugin window or using the Ableton UI itself?

I can only recall plugins crashing for me... Always worth checking for updates with the vendors.


What bugs are you running into with Resolve? There are certainly a few things you need to be a little careful of.


One is my fusion page always going back to a clip I’ve never opened in fusion. The bigger one is an insane rendering time that grows endlessly on a small clip (36+ hour estimated time before I finally cancelled it) that otherwise only takes a few minutes to render and display inside the program itself.


You need to make sure the play head is over the clip you've selected. I don't know why.

The insane rendering time one is weird, but it only seems to show up in the Windows version. Apparently it's something to do with having one dodgy frame in the clip. I suspect it's something to do with sources encoded in h.264, but I'd expect that to affect the Mac port as well. Linux doesn't seem to be affected.

You can try doing "render in place" for the affected clip, which seems to help.


Yeah, my playhead is over the clip. It still keeps going back to a clip I never opened in fusion.

Re: render in place, I'll try your approach next time I run into it. Thanks.


DaVinci recently also added hardware acceleration for H.265 to the free version, which was previously only available in the paid version.


here.. more free software which rocks:

OBS Screen Recorder, Obsidian Mind Mapping Software, Mock Mechanic (free engineering sandbox), Ranstads Circuit Modelling JS site, Inkscape - getting better and better


The load speed on Sublime Text is worth the price alone.


We never reached quality on mobile UX for websites. Not because it can't be done (see Wikipedia or apple.com) but because it seems like we don't want it. Social media websites are total crap, presumably because they want you to use the app, where they can harvest a lot more data than inside your sandboxed browser. News sites and listing sites earn money from ads, and it's now some sort of standard practice to force your users to see/click on them with those gotcha popups and parallax effects. And then there are websites that appear not to care enough to do something about it, like HN, which is usable but could be done better.

So overall I don't think it's getting worse but it's always been bad on mobile for whatever reason.


Software gets worse as it gets bigger. The classic measure is 15 or so bugs per 1000 lines of code. Most software companies have career ladders that reward building broad abstractions that affect most of the product / company. So people add layers of complicated middleware to have “broader impact”.

So over time, you aren’t just subject to bugs in feature A. You’re subject to bugs in feature A and library B and framework C and injected monitoring D and so on for the same user action.
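
A toy way to see the compounding (my own made-up probabilities, purely illustrative):

    # If each layer independently has some chance of containing a bug that a given
    # user action can hit, the exposure stacks as 1 - prod(1 - p_i).
    layers = {"feature A": 0.02, "library B": 0.01, "framework C": 0.01, "monitoring D": 0.005}

    clean = 1.0
    for p in layers.values():
        clean *= 1.0 - p
    print(f"chance a user action touches at least one buggy layer: {1.0 - clean:.1%}")  # ~4.4%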


TikTok is designed to get people to download the app, so it's intentional.


Underrated comment.


Things have gone rogue. You can no longer tell whether something is a button, a link, a menu, etc.

So often have friends and family gotten lost wondering what to do. Oh, you have to click on that? The visual cues are all gone. "Flat"... right.

Too many things are rushed out the door. Too much focus has been placed on "features", "features" and more "features" and no polish. Every release there are new features, and yet the old ones still have lots of bugs :(


Reddit admitted to maintaining more than 10 different video players and most of them are broken one way or another. As a company, they apparently do not have the capabilities to fix this. Their website and app are used as examples to teach bad practices.

ChatGPT, literally the biggest thing of the year, still has an incredibly amateurish website. Their auth UX is broken. They do not offer exporting conversations. You just see an infinite scroll list of previous conversations. It's a complete mess!

Yes, indeed, almost no one cares about UI/UX anymore at all, let alone polishing it towards perfection.

How did this happen? If your product is "good enough", no one will care about the UI/UX. It's not an important differentiator. Everyone keeps complaining about Reddit, but everyone keeps using it. Same for ChatGPT.


> Is software quality in decline?

No, you just have a skewed memory of some time when software was "good". Remember Windows ME?


I somewhat disagree. As an avid Visual Studio user, it's definitely gotten worse. A lot of little editor UX things seem to stop working.

Example: in C++, writing switch(..) on an enum would automatically create all the switch cases by default. Then in later versions it only occasionally did it; now it doesn't do it at all.

Another example: When my application finishes and VS is activated, sometimes the cursor is in one window, but typing happens in another.

Another example: in Windows 11, when resizing a window, the artifacts seem to have gotten worse. Sometimes even the whole window is flickering.


There are better alternatives UX-wise for both of the examples you gave.

Visual Studio -> Jetbrains.

Windows 11 -> Linux (or Mac)

While VS and Windows have gotten worse in recent years, other software like Jetbrains IDEs and Linux UX have gotten better.


Not sure why this is downvoted - I also have the experience that Linux on the desktop now vs 10 years ago is a much better experience, both UX wise (at least in general, though it's a bit subjective), but also in terms of stability.

From what I've scrolled through so far, most examples people give of software that got worse is by Microsoft, though I wouldn't say it's limited to them. There seems to be a desperation for profit in commercial software right now like nothing I've seen in the past. There's usually alternatives worth a look though!


Well maybe. But these alternatives come with their own set of problems.

I guess it's a tradeoff: how much can software degrade until you're willing to learn and deal with another set of problems? Unfortunately, you kind of get used to UX quirks.

I have to say though, all in all Win 11 is not as bad as people make it out to be. After turning off a few things, it stops getting in the way.

I'm definitely going to have another look at Jetbrains though.


Perhaps the best counterpoint in the thread. That being said, the exception proves the rule, and I never had issues with 95 or XP. It’s crazy how many basic pieces of OS functionality stop working for me in windows 10.


OMG this!

I grew up on Windows and only around XP did it begin to get okay. Before then it was mostly garbage, and after then it was Vista! I have my issues with modern MacOS, but it's generally pretty amazing. I feel like asking the same of these Jonathan Blow acolytes -- When exactly was this golden age? Are you sure it's not nostalgia?

If your poison is Linux, do you remember what the Linux desktop looked like in the early 2000s? On the server, it's ridiculously more resilient and polished.

FWIW I have tremendous respect for Jonathan Blow as a software engineer, but I think this has always been mostly a self-aggrandizing pitch, from a craftsperson to people who like the idea of crafted things, that is -- if all the software in the world is getting worse, then there must be some priesthood of the worthy. When, in fact, the "bad" software Jonathan Blow sees is software produced simply under different constraints (the expectation is that software costs nothing, it must run everywhere/on the web, it's part of a distributed system, compatibility with earlier crappy software really matters, it's made by people who simply care less about it being a perfect experience, because it's not their game). Of course, some new software is actually bad, but I'm not certain software is actually getting worse.


I feel your pain. I think software is certainly getting much more complex (because it can, not because it has to). Complexity comes with a cost.

I hate when a mostly static website is rendered completely client side, for example. Most of the time, I just want to read the text, maybe see some (non-marketing) images, and that's it. I don't want hundreds of kilobytes of JS executed on my machine just for that.


The physical world has peaked, in every metric.

That makes software harder. Just wait until you can't buy a new computer just like that.

Stick to vanilla Java/C+/HTML(5)/JSON/OpenGL/AL and make something great and you'll be fine.

Note + is NOT a typo: it means C++ compiler with mostly C syntax. I only use string, stream and namespaces. Classes very rarely.

Don't download things that are >1GB.


I think it is also because even the more vanilla languages and APIs have become moving targets. Java 1.4 felt like a sweet spot, and similarly C99, WebGL2/HTML5, etc. It seems like we are unable to say 'job done' and feature-freeze them. And imagine the software stability if that were paired with a 5-year-stable hardware platform. Maybe that is what you mean: that we approach a point with a 'last computer' that will be forever more optimized?



>The physical world has peaked, in every metric.

How do you figure that? Seems to me that things are still rapidly changing. Starlink, fusion, JWST, computer hardware, 3D printing etc...loads still happening.


All statistics show 2016 as the peak of population growth in the Western world.

Acceleration stopped in 1968.

CPU improvements stalled around 2012.

GPU around the 1030, so in 2017.

Energy is the master resource.


CPUs have improved so much with AMD's chiplet push. I can now buy an eight-core AMD/Intel CPU for 300 $/€ that people in 2012 could only dream about.

This thing enables me to have my complete production environment on my machine and develop faster. Apart from that programs in general run very fast. I don't want to go back to 2012.


You should get together with https://news.ycombinator.com/user?id=eimrine since you both say the same thing at the same time.


You genuinely believe a CPU from 2023 to be no better than one from 2012?


Not by much, if you factor in the energy used to make new CPUs and the waste you create by not using an old one; you are better off using an Ivy Bridge than the newest AMD/Intel.

I also believe the older stuff is better built in terms of longevity if you cool them properly.

I also know the new ones have more crap in them that will allow Intel/Microsoft to block things when they need money.


I really believe that. We already had 4 cores in 2012, and that is only slightly different from 2 cores (hardware cryptography in a 2012 processor was more noticeable to me than the extra cores). Modern processors eat significantly less energy and can hold significantly more RAM, and that's all I can notice.


> We already had 4 cores in 2012, and that is only slightly different from 2 cores

It is double actually. Also there are CPUs with 64 cores now. Also in 2012 there were CPUs with more than 4 cores.

> that's all I can notice

What you can notice is not real information when speed, cores and energy use can be quantified.


We also used to have 4 cores in 2009 but that was very unstable.

BTW, for modern multitasking computers, the main jump in performance comes from going from a single processor to a dual processor. After that, Amdahl's law makes the difference negligible.

> What you can notice is not real information when speed, cores and energy use can be quantified.

Single-core performance is totally the same, the number of cores does not change anything visible, and energy use seems like the only outcome of the 10 years since 2012.


> We also used to have 4 cores in 2009 but that was very unstable.

This is nonsense

> After that, Amdahl's law makes the difference negligible.

Then you don't understand "Amdahl's law". It is very basic and only about parts of software that aren't multithreaded becoming a bottleneck. This sounds like desperate pessimism.
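
To put rough numbers on that (my own sketch; the serial fractions are illustrative, not measurements):

    # Amdahl's law: speedup(N) = 1 / (s + (1 - s) / N), where s is the serial fraction.
    def amdahl_speedup(s: float, cores: int) -> float:
        return 1.0 / (s + (1.0 - s) / cores)

    for s in (0.5, 0.1):
        print(s, [round(amdahl_speedup(s, n), 2) for n in (2, 4, 8, 16)])
    # s=0.5 -> [1.33, 1.6, 1.78, 1.88]  (mostly serial: going past 2 cores barely helps)
    # s=0.1 -> [1.82, 3.08, 4.71, 6.4]  (mostly parallel: extra cores keep paying off)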

> Single-core performance is totally the same,

This is not true either and you could look at benchmarks to see it.

> the number of cores does not change anything visible

Disable hyperthreading and all your cores in the bios, then run multithreaded software and say the same thing.


> This is nonsense

Core 2 Quad from 2007+ used to have 2 L2 caches (not 4) and no L3, and the Windows task manager refused to see more than 2 cores, but for some rare tasks the Quad was perfect, such as playing GTA4. Those are just my memories; feel free to correct me if you see me mistaken.

> Then you don't understand "Amdahl's law". It is very basic and only about parts of software that aren't multithreaded becoming a bottleneck.

Everything except the browser is not multithreaded in 2023; for example, the garbage collection of interpreted languages.

> This is not true either and you could look at benchmarks to see it.

I have a heavy single-threaded application which I need to run 24/7, and I still use a Pentium 4 because no modern computer can make it 2x faster.

> Disable hyperthreading and all your cores in the bios, then run multithreaded software and say the same thing.

I mean that starting from 2 cores, the number of cores does not add anything visible.


> Core 2 Quad from 2007+ used to have 2 L2 caches (not 4) and no L3, and the Windows task manager refused to see more than 2 cores

This is software and has nothing to do with your claims of hardware not advancing or four cores being 'unstable'. I don't know what it has to do with anything.

> Everything except the browser is not multithreaded in 2023

This is a bizarre claim because it's so easy to disprove. Games, content creation, video encoding and decoding, and of course your whole OS, which is running multiple programs.

This also has nothing to do with Amdahl's law, which is about how the non-multithreaded parts of multithreaded software scale.

> for example, the garbage collection of interpreted languages.

Who cares, an interpreted language is not meant to be fast in the first place. Even so you can still run multiple threads and multiple processes.

This is also software and has nothing to do with your claim that CPUs haven't changed since 2012.

> I mean that starting from 2 cores, the number of cores does not add anything visible.

They do to everyone buying CPUs with more than 2 cores, which is basically everyone, since even phones and $35 Raspberry Pi boards have four or more cores.


You have written a lot of responses about why I am a little wrong in every sentence, but I don't see any powerful point. What is the most important feature of a post-2012 CPU, apart from energy consumption?


> a little wrong

Pretty much everything you said was not only wrong, it didn't even have anything to do with your point.

> What is the most important feature of a post-2012 CPU, apart from energy consumption?

They are faster per clock cycle, run at higher frequencies, have more cores, more memory bandwidth, more pcie bandwidth, wider SIMD lanes, deeper out of order buffers, much more cache and more execution units.


> They are faster per clock cycle, run at higher frequencies, have more cores, more memory bandwidth, more pcie bandwidth, wider SIMD lanes, deeper out of order buffers, much more cache and more execution units.

This is called extensive progress; there has been no intensive development since 2012, as was mentioned before I joined the discussion. You are completely wrong about frequency (no increase since Haswell's 4 GHz) and somewhat boring about the other points, so I will not answer here any more.


> This is called extensive progress; there has been no intensive development since 2012

This is a word salad that means nothing. Where did you even get these ideas?

> You are completely wrong about frequency (no increase since Haswell's 4 GHz)

Why would you say something that is so obviously wrong and easy to disprove? Most current AMD CPUs have base clocks that are higher than 4 GHz, let alone their boost clocks.

https://wccftech.com/amd-ryzen-9-7950x-cpu-5-85-ghz-peak-5-1...


It's probably fair to say modern software should be seen as a disposable commodity. It doesn't have to be good or durable, but it should be cheap and abundantly available. If you're developing software, see your employer as a factory owner in China mass-producing some low-quality imitation product, and yourself as the factory worker.


Both of those are more web experiences than "software" in my book, though the two are blurring.

Web in general seems to be getting worse yes. Partly due to dark pattern-y BS but also because it's a hard problem. There are just so many combinations of configurations that it's hard to cover it all. See also those fingerprinting sites showing you're unique (your configuration)...that's effectively a measure of that diversity.

I don't mind the latter. There is certainly a movement towards intentionally user-hostile designs though, which is annoying.


Desktop UI has also gotten worse, mostly due to the mobile/web influence. Peak desktop UI was 15-20 years ago.


Definitely feels that way. Not sure if it’s No Country for Old Men phenomenon or not.

In that vein and as a counterpoint: I remember thinking software like VLC and irfanview were great back in the day. Now, I find them both to be incredibly frustrating. I wonder if the bar for quality software has actually gone up.

That being said, I never remember basic websites of the 90s being as buggy as the ones I run into on a regular basis today, so who knows.


Software complexity is increasing. With the increased complexity many failure modes are not understood by developers. Thus those failures are not handled gracefully. This feels like a decline in quality. Depending on what you measure the quality could actually be better, the same or worse.


Furthermore, with software built as a teetering Jenga tower of abstraction layers and frameworks, the actual error often isn't even propagated as far as the UI - and instead of a description of the problem we get "Oops, something went wrong."

This inscrutability of modern software is one of my biggest pet peeves with it. There was a time when the resource-limitations of computers forced software to be much simpler - and the end result was that when things went wrong, someone with sufficient time and motivation could sift through the rubble, figure out what the problem is, and often lash up some kind of workaround. That's getting harder and harder to do these days.

Likewise, in the open-source world, the breadth of dependencies is getting out of hand - I've lost count of the number of times I've tried to build something and had to go on half a dozen side-quests to track down a particular version of a particular build tool because the version in my distro is either too new or too old.


I think the issue is the incentives have shifted over the years.

A lot of software today is built to drive a specific user behavior to generate revenue instead of being designed from the start to provide customer value.

User-hostile behavior, dark patterns, etc. are all symptoms of this. Additionally, time-to-market and marketing bullet points are often more important than user experience or bug-free software, because many companies have found they can make the same revenue without having to focus on the customer experience.

Classic software was not as bug-free as people like to think. There was lots of BAD software. The difference was that higher prices and the inability to push updates (aside from mailing floppies) meant the incentive was to produce software that had fewer bugs.


The given examples are not so much about software in general getting worse as about web programming specifically. Here the answer is clearly yes: web programming is in a suboptimal phase at the moment, with web programmers acting more like carpenters who try to cobble together a house using only IKEA flat-packed products, which they jury-rig together using copious amounts of glue, packing tape and baling twine, than like the artisan of old who went into the woods with an axe and built himself a log cabin. True, the log cabin was rather spartan, but it kept out the elements and lasted a long time. The IKEA mansion has all the furnishings you want and some of them even work, sometimes. Often they don't: the lights go dim when you want them bright or end up blinding you just when you dozed off in bed, the dishwasher only works when the washing machine is first switched on and off, the doorbell rings day and night except for when someone is actually at the door, etc.

Blame the enormous amount of churn in the web tool world, the chase after the latest fads, the fact that many 'web programmers' entered this field not so much because they like the intellectual challenge of solving puzzles but because it pays well, the fact that project targets keep on moving or whatever.

Commercial software is getting 'worse' in that the distinction between customer and product is disappearing. A product is that what is made by a company in order to sell it to a customer so the company gets to make money. A customer is he who buys a product from a company because he deems it fit for some purpose.

That was then but now things have changed. Final sales are making way for rental agreements, products are turning into advertising and data mining applications to be used by companies to herd and milk their customers. Customers are thus productised and sold to other companies who target them with directed advertising based on what was harvested earlier.

Free software is still mostly freed from this plague - although even there it sometimes shows up, e.g. the Shopping lens [1] which appeared in Ubuntu 12.04 and only recently disappeared was an unwelcome reminder of this phenomenon - so there is a way out.


Software is getting worse, yes, although to be fair and mildly abrasive:

1) If you haven't actually lived through the arc you can't really spot this so you invite "okay boomer" sniping if you point it out.

2) Knowledge does not update the genome, therefore knowledge transmission is A Hard Problem, therefore because "developing software" is not a small and formal discipline the lessons learned by experience are not reliably/effectively/efficiently/at-all transmitted to the newer "generation" of "developers". This gives the appearance of "not learning from mistakes" or "things degrading" but the reality is simply that everyone is always reinventing things they don't understand as a result.

3) There are too many people engaging in the act of producing software who shouldn't actually be doing it. This is true of all fields where demand outstrips supply. LLMs are going to solve this problem in a most unfortunate manner.

4) The reality that cannot be discussed is that "stupid" is a much larger problem than anyone realizes, it extends well into areas that people normally think of as "requiring high intelligence", and it exacerbates all of the above.

5) We incentivize and, for reasons already given, actually justify behaviors that also tend to produce this. Ex: I assure you that somewhere out there is a product manager with a drug habit desperate to find a way to turn hammers into smart devices with a subscription service, NOT because this will help people who use hammers, but because this will increase profits for a company that produces hammers. This is an objectively bad and harmful practice that we encourage daily.

Soooooo, again, yep.


Yes. There's no incentive to make good software from a business perspective. Sure, most developers worth their salt would love to do it, but it's not what they're paid to do.


People don't want to pay for 'good' software, so you get more and more corners cut until you end up with something that is nearly unusable (or more likely, just not very good).


To anyone who thinks no: just watch the apps you use and the sites you visit for a week and mark down whenever something fails or takes too long to load. You will change your mind very fast.


What I interacted with twenty years ago did less, operated on fewer device types, and required a great deal more developer expertise and effort to create even simple applications. Commercial software was distributed largely on physical media with a buy-it-and-own-it business model (no freemium or ad-supported tiers), and the development process was largely based on those from the hardware world, i.e. no agile.

For most software all of that has changed and it comes with a cost.


It's UX for me. Maybe because I am getting old and I can't keep up with the latest "software trends".

I literally had to google how to restart my phone the other day.


Certain kinds are, I think. I've noticed that when I log in to [ well known ecommerce website ] these days it tells me that I have -1 pending orders. It isn't the only bug like that.

With this one as with many others I think it's because a lot are worked on by multiple teams and a lot of bugs get stuck in a kind of "no man's land" between teams where nobody is really responsible.


In some sense, in some aspects - yes, software is getting worse. For example, optimisation: with a lot of the practices utilized today, from "code standards" to "convenience over clever solutions", we lose about 10 years of hardware evolution when it comes to overall performance. Of course there are exceptions. And this is just a single example of software getting worse.


Yes, and it's not just basic quality but also a lack of thoughtful implementation. For example "smart" IoT lightbulbs that do firmware updates during the night, when having some light would be kinda nice.

I especially find it frustrating how software gets more sluggish while at the same time we're making 500Hz screens, VR etc which should actually encourage the opposite.


No. Software is getting better and developers are more productive than ever.

It is just that the goal the developers had is not what you assume (to let you view the video as easily as possible) but instead to onboard you as an active user of the app.

If you try and use Facebook without being logged in you will have a similar experience.


Yes, I believe so. There is an endless push to get your data. Prompts to subscribe.

For example, I have a BlackVue camera. The app used to work great.

Now I get a prompt to subscribe to their paid system. Why? I paid for the camera, and knew that the online service was paid, but then it was my choice.

Now they are coercing me.


Yea, ofc. There's just a lot going on all at once. And some of it is honest-to-god huge quality of life improvements, which impose guard rails around piles of garbage and create reluctance to see or acknowledge very real negatives.


No. It isn't in decline. Saying it is in decline assumes there was a period of time when it was better. There are a lot of bad programmers out there. There is, and always has been, a lot of bad software.


"Experts say yes."


The issue is rent-seeking behaviour that spans our current capitalist systems. (No, this is not a rant against capitalism or a pro-communism statement, just a problem description.)

Basically, our current capitalist system incentivises not "value creation", as a utopian version of capitalism would, but "rent seeking", aka "build a moat, capture users". Everyone is after guaranteed recurring revenue, and it is achieved by shackling you to the service being provided; check "right to repair" issues, check compatibility in software ecosystems.

Someone builds a very good product, people migrate to it, they pay you money, maybe even pay you to keep using it (subscription), but how do you guarantee that they are not switching to something else? Add as much lock-in as possible, or make it free so you can sell user data / put ads on it. None of this is connected to the quality or "value" you are providing to customers, but rather to ways of keeping your moat / stopping competition from rising.

The tragedy of the commons (and similar issues) means that we can never have the utopian capitalist/free market world. Regulation is always needed to avoid such issues, but at the same time regulation can also backfire and instead create these moats/rents. The utopian capitalist world, just like utopian communism, cannot exist. At the end of the day there can never be purity, and systems need continuous engineering/maintenance effort.

In your example, TikTok wants you to use the app; the website is not a priority, they can't gather as much info there, and it just exists so that someone using the app can send a video to someone without it. Also, it still needs a ton of bloat to get as much info from you as it can. So, with fewer resources and while still wanting to spy on you as much as they can, it's normal that the video player might not work in all browser/OS/device combinations.

Same shit across every product; now add things like speed of delivery of new features, or new ways to spy, being more important than performance/quality, and you get trash software.


It's because we don't know when to stop.


The alarm app on Windows now asks me to sign into a Microsoft account, presumably so it can track me.

I think that answers your question.


Software becomes better, but not for the user?

An alarm which can track someone seems like a more advanced kind of alarm software, from some points of view.


When was the last time you had to debug an IRQ conflict? I remember when "Plug and Play" was derisively nicknamed "Plug and Pray". Today, it all just works. When I'm assembling a computer, I don't have to set jumpers. I don't have to fiddle with making sure the boot drive is at the end of the IDE cable rather than the middle. I can just snap all the pieces together like Lego, hit the power button, and be assured that I'll get a bootable system (assuming, of course, that I haven't been a dolt and forgotten to plug the video card power cable in).

When was the last time you had an application blue-screen/bugcheck/kernel panic your machine? Yes, Windows still blue-screens from time to time, but over the past decade, I've found that 100% of my blue-screens have been caused by faulty drivers, rather than application code or bugs in the OS itself. This wasn't always the case. I remember, on Windows 98, there was one particular game that my brother had (I think it was Reader Rabbit), which would repeatedly and reliably blue-screen the machine when we got to a certain level. I haven't seen any errors like that in more than a decade. And even the driver blue-screens are getting better. I remember not too long ago, my Windows PC's monitor blinked off, then came back. When I looked in Event Viewer, I saw that the GPU driver had crashed and had been automatically restarted. This is something that still causes kernel panics on Linux and MacOS, but Windows just shrugs it off and keeps on chugging.

With regards to Linux, when was the last time you had to mess with xorg.conf? Wifi drivers? WPA supplicant? I remember when I had to download the Windows drivers for my wireless card, extract the binary blobs, compile NDISWrapper, and then pray that I'd set everything up correctly, before unplugging the Ethernet cable to test whether my wifi was working. Now? I browse Hacker News while Linux is installing, because wifi drivers have been part of the kernel for years.

As for programming tools, they're more stable, robust, and widely available than ever. When was the last time you had to pay for a compiler, interpreter or language runtime? When was the last time GCC or LLVM crashed? Today one can write code in C, C++, Java, Python, Go, Rust, and a plethora of other languages... all for free, even on Windows! This is a huge improvement from the bad old days when your choices were to either pay for Borland or pay for Visual Studio. And as for web programming, do you really pine for the days when your only option for a backend language was a collection of perl scripts in `cgi-bin`?

The one regression, in my opinion, is with communication software. We used to have open (or "open-enough" i.e. reverse engineered) protocols that enabled multi-protocol, multi-platform clients such as Pidgin. That world is gone. Our communications are now siloed into proprietary, hostile software stacks, such as Slack, Google Meet and Teams. And our personal communications are siloed between Discord, WhatsApp, and the multifarious other messenger apps that we have to install in order to communicate with that one person who refuses to use anything else.

But other than comms, has software improved? I have a hard time arguing otherwise.


> I don't have to fiddle with making sure the boot drive is at the end of the IDE cable rather than the middle.

But now we have USB-C.


> When was the last time you had to debug an IRQ conflict?

Never. But in my home server I have 4, sometimes 5 RAID controllers. Some combinations of RAID/AHCI/IDE modes won't work due to lack of resources. I presume it's about I/O addresses - the boot message is not really informative compared to Device Manager. I regret not taking a picture of the error message. The system won't even pass the POST when this happens.

> I remember when "Plug and Play" was derisively nicknamed "Plug and Pray". Today, it all just works.

No, it doesn't! I just plugged in an old webcam and there's no driver. It's the exact same "Pray"... well, not really. I don't pray. These days I default to the "It's not going to work" mindset.

> When I'm assembling a computer, I don't have to set jumpers.

How is that a good thing? Overclock something, the system won't boot, you have to reset the CMOS and lose all settings, including the boot order.

> I don't have to fiddle with making sure the boot drive is at the end of the IDE cable rather than the middle.

No, you could have used the jumpers, but I think you hated them too much to let them help you. I wish I had the jumpers now. Every time I insert or remove a SATA HDD from the rack I have to redo the boot order.

> I can just snap all the pieces together like Lego, hit the power button, and be assured that I'll get a bootable system (assuming, of course, that I haven't been a dolt and forgotten to plug the video card power cable in).

And then you notice that the system won't recognize the CPU without a BIOS/UEFI update, which you can't do because it won't boot unless it recognises the CPU. Then the OS is installed with IDE drivers and you can't switch to AHCI without major OS surgery.

> When was the last time you had an application blue-screen/bugcheck/kernel panic your machine? Yes, Windows still blue-screens from time to time, but over the past decade, I've found that 100% of my blue-screens have been caused by faulty drivers, rather than application code or bugs in the OS itself.

These days there's no blue screen or error. Apps just won't start (click the icon and it does nothing), opened apps suddenly close without any error (they just disappear from the screen), the system suddenly reboots, or won't wake up from sleep, or an update makes the system unbootable or stuck in a boot cycle.

> This wasn't always the case. I remember, on Windows 98, there was one particular game that my brother had (I think it was Reader Rabbit), which would repeatedly and reliably blue-screen the machine when we got to a certain level. I haven't seen any errors like that in more than decade. And even the driver blue-screens are getting better. I remember not too long ago, my Windows PC's monitor blinked off, then came back. When I looked in Event Viewer, I saw that the GPU driver had crashed and had been automatically restarted. This is something that still causes kernel panics on Linux and MacOS, but Windows just shrugs it off and keeps on chugging.

True. This part is better.

> With regards to Linux, when was the last time you had to mess with xorg.conf? Wifi drivers? WPA supplicant? I remember when I had to download the Windows drivers for my wireless card, extract the binary blobs, compile NDISWrapper, and then pray that I'd set everything up correctly, before unplugging the Ethernet cable to test whether my wifi was working. Now? I browse Hacker News while Linux is installing, because wifi drivers have been part of the kernel for years.

I did most of the enumerated items this week.

> But other than comms, has software improved? I have a hard time arguing otherwise.

It did improve a little in stability, at a very, very high cost in the user's money, time and privacy: waiting for stuff to open, commands to get processed, buying ever faster and more expensive hardware to do basically the exact same things as 20 years ago, only slower, and not because of the 56K modem.

It's because of people always wanting the latest newest stuff that good old software gets abandoned. See: IRC, FTP, Opera Presto, websites without JS, single-user OS, also hardware: ethernet on laptops, headphone jacks on phones.


I'm not going to respond to your points one by one. My overall response to you is that the system you're describing, with its multitude of RAID controllers, ancient webcam, overclocked CPU, etc, etc, wouldn't even have been possible to put together in the '90s. You'd have ended up spending all your time debugging random crashes and failures, and figuring out how to get stuff working. Whereas today, it's usable and functional and, while it still might have issues, it at least all works most of the time.

As far as the system not recognizing newer processors without a BIOS update, that was also true in the '90s. It's just that, back then, things changed so much, you'd just end up tossing the entire motherboard when it came time to install a new CPU, and you'd "upgrade" the BIOS that way.


I agree with the result of your research, but the examples are really poor.


Yes. Software naturally declines unless it is maintained.


Define quality software. Making money for the business? Improving the customer's life? But who is the customer? The one interacting with it or the one paying for it?


My opinion is: software is built on higher- and higher-level components, and people don't know what's happening inside, hence they are not aware of the edge cases. Testing is getting much more difficult. Also https://xkcd.com/2347/


No, it's just optimized for ads, not for you.



