This still vexes me greatly. Some young people (I'm old... everyone seems young) think it was all very silly in retrospect. When I have the time, I will explain to them just how serious it really was, media reports notwithstanding, and that the only reason things didn't go very badly was the work of a great cadre of programmers working hard to make sure bad things did not happen.
Agreed. In my case, my team and I spent a year analyzing the software and finding problem areas, fixed them, ran factory acceptance tests, and finally travelled to a major customer to test the fixes on site (this was, at that time, pretty old software). In this particular case an unfixed Y2K bug would have meant a major setback in the kind of weather forecasts we were all getting used to at the time.
So yes, nothing much bad happened, _because_ a huge effort was put in everywhere to prevent it.
I worked with a lot of systems that had Y2K bugs, but in every single case the failure would have been quickly noticed and fixed (or not: living with buggy software was pretty common).
None of the systems were all that reliable in the first place. Software was even more buggy and unreliable than it is today. “Turn it off and on again” was the standard way of fixing things. Manual workarounds were very common.
Yet, who knows how much money was spent hunting down and fixing Y2K bugs in systems that were not critical, and were not even dependable to begin with.
I’d wager that for almost all the systems on which a lot of money was spent bringing them into compliance, it would have been cheaper and simpler just to let them fail and deal with the problem afterwards, which would have been just another day at the office.
IT staved off disaster, a disaster that most of us at the time had no involvement in creating, and what did we get for our herculean efforts and SUCCESS?
We got told it was all a hoax.
I immediately lose all respect for anyone saying Y2K was a hoax. That just tells me you're another dumb ass falling for conspiracy theories.
I don't know; for several years afterwards it seemed to be a given that it was a real thing, and efforts to avoid disaster were well directed and successful and that IT folks "did the world a solid".
I choose to take it as a (rather dispiriting) lesson about humanity that only ~20 years later there's such scepticism. Certainly explains why the cranks cast doubt on moon landings and other historical events.
Many Y2K “solutions” just borrowed 20 years, and a few of those systems didn't actually invest that time in a fix before the 1st of January 2020:
One solution involved a technique called “windowing” — in which two-digit years are assigned to either one century or another based on one hard-coded “pivot year” determining where they belong. Even back in 1999, HPCWire was describing it as “highly controversial,” citing one expert who said computers using the technique were “little ticking time bombs waiting to go off.”
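A windowing scheme is only a few lines of code; this sketch uses a made-up pivot of 50 (real systems each picked their own pivot, which is exactly why the fix was temporary):

```python
def expand_year(yy, pivot=50):
    """Windowing: map a two-digit year to a four-digit year.

    Two-digit years at or above the pivot are read as 19xx, years
    below it as 20xx. The pivot value 50 is an illustrative
    assumption, not a standard.
    """
    return 1900 + yy if yy >= pivot else 2000 + yy

print(expand_year(99))  # 1999
print(expand_year(20))  # 2020 -- works only until data reaches the pivot
```

The "ticking time bomb" is visible right in the function: once stored dates cross the pivot, the same code silently flips them back a century.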
I led the team that investigated Y2K issues for some very consequential systems.
You'll recall that no cities were destroyed by accidental blinding white flashes on January 1, 2000. We made sure in advance that that wasn't going to happen.
The place where I worked at the time was already all UNIX/Linux so our Y2K was 2038. We just needed to adjust a few printed reports and client-side birth date heuristics to make everything work fine.
Until February 29, 2000. Then several of our customers' systems messed up. I seem to recall it was a problem with SCO OpenServer and that our Red Hat customers didn't have any issues, but I might be misremembering. Either way, for whatever reason the leap year was not recognized at one of the system or application layers, and that mismatch caused bigger problems than any 1900/2000 client-side confusion would have.
Unix/Linux or not didn't matter for Y2K, the problems were in software elsewhere and that was as major in Unix/Linux as anywhere else (the systems I was fixing at the time were based on both Unix and Linux).
(When I was way younger I ran into a software problem where the company providing the software did all date calculations by adding up days, months and years, and not getting it right. Between Christmas and New Year something bad always happened. I was working night shifts at the time, and the software was part of locating emergency transmitters, mainly at sea, so when those systems went down it was pretty critical. The first thing I mandated when I got any say in things back then was that all time calculations should be done by adding up seconds, not dates. Oh well. Fun times)
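The "add up seconds, not dates" rule amounts to converting both instants to absolute seconds and subtracting; a minimal sketch (the timestamps here are invented for illustration):

```python
from datetime import datetime, timezone

def duration_seconds(a, b):
    # Subtract instants as epoch seconds rather than doing
    # field-by-field arithmetic on days, months and years.
    return b.timestamp() - a.timestamp()

# An interval that straddles the century boundary.
t1 = datetime(1999, 12, 31, 23, 50, tzinfo=timezone.utc)
t2 = datetime(2000, 1, 1, 0, 15, tzinfo=timezone.utc)
print(duration_seconds(t1, t2))  # 1500.0 seconds (25 minutes)
```

Because the arithmetic happens on a single monotonic number line, year-end, month-end and leap-day edge cases simply never arise.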
The leap year rule, translated to English, doesn't just say "Every year divisible by 4 has a leap day." It also says, "Unless the year is divisible by 100, in which case it's not a leap year." But then it also says, "Unless it's divisible by 400, in which case it's a leap year again!"
So maybe some of the programmers only knew the first 2 rules...
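All three rules fit in one expression, and the buggy two-rule version differs by a single clause:

```python
def is_leap(year):
    # Full Gregorian rule: divisible by 4, except century years,
    # except century years divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def is_leap_first_two_rules(year):
    # What a programmer knowing only the first two rules would write.
    # It wrongly concludes that 2000 was not a leap year.
    return year % 4 == 0 and year % 100 != 0

print(is_leap(2000), is_leap_first_two_rules(2000))  # True False
```

Ironically, 2000 is one of the rare years where stopping after the *first* rule would have worked, and it's knowing exactly two rules that causes the February 29, 2000 failure.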
It was not a joke, and systems had to be fixed, but let us also not forget how this was whipped up into a tempest far bigger than its teacup by IT consultants smelling a feeding frenzy, and a media loving a good tech doom story.
I rented a small house in the mountains from the power company for the 2000 New Years Eve party. It happened to have an office for the local power employees.
At about 22:00 the phone in the office rang and I took the call (I am not sure why, actually - the phone was ringing so I picked it up).
It was an older lady calling and she said "there is no power at home, is this this year 2000 issue already?".
I told her that this was unlikely and gave her the correct number to call (written on the office door).
It was all over the news and thanks to the heroic work of IT/IS the world was saved.
I got my start in IT focusing on Y2K in the mid-90s. I had no business doing the work I was given at the time, but the greybeards of that era were something else. They mentored me in ways that I could never have gotten in school, it was a different time, pre-google. I was also very appreciative that my young age wasn't held against me :)
Bad things could have happened. And probably would have without all the media attention which forced frugal business leaders to take the issue seriously.
It's important to realise an averted disaster was still a potential disaster before it happened.
If anything this problem really put technical debt on the map. Which is a problem in more than just date formats.
I distinctly remember the Wired article that claimed power plants would fail - in an anecdote they said - "Imagine a temperature sensor reading comes in and the system then divides it by the year, the system now sees the temp as infinity and the whole plant shuts down". I read it a few times to make sure I followed, and then asked myself: why exactly would you divide the temperature by the last two digits of the year? Wired, anyway, thought you would do that and that it was going to cause massive failure.
Consider a system with a PID controller that samples temperature every 15-30 minutes.
The measurements are not perfectly regular, so we have to subtract timestamps to calculate the duration between samples.
Let's say we got one measurement with timestamp 99-12-31 23:50 and a second one with 00-01-01 00:15.
What will the calendar code do if we subtract 99-12-31 23:50 from 00-01-01 00:15?
Does year 00 mean 2000 or 1900?
And what will the controller do when the temperature differential suddenly changes with no relation to physical reality?
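A sketch of exactly that failure, assuming a hypothetical Y2K-broken parser that pins every two-digit year to 19xx:

```python
from datetime import datetime

def broken_parse(stamp):
    # Hypothetical Y2K bug: assume every two-digit year is 19xx.
    return datetime.strptime("19" + stamp, "%Y-%m-%d %H:%M")

t1 = broken_parse("99-12-31 23:50")  # 1999-12-31 23:50
t2 = broken_parse("00-01-01 00:15")  # read as 1900-01-01 00:15!
dt = (t2 - t1).total_seconds()
print(dt)  # hugely negative, instead of the real 1500 seconds
```

A controller dividing a temperature delta by that `dt` computes a nonsense rate of change, which is the kind of garbage input that can drive an actuator hard in the wrong direction.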
Many lives would have been lost if so many billions were not spent on fixing Y2K issues.
I don’t know about lives saved directly, but I have second-hand knowledge (I didn’t become a programmer until a few years after Y2K) of systems which, had they broken down, would have cost lives: people would have lost access to desperately needed money, and it was statistically likely that at least a few of them would have died, through suicide if not through the problems created by the lack of money itself.
I'm fine with all that but there were no follow-ups. All lead-up and no release.
Where were the interviews with Cobol nerds standing in front of huge tape drives, sweating even in the chilly server room air as they mumbled inscrutable things?
The folks who fixed Y2K bugs were not the fad-driven, easy-money-chasing web weenies of the dot com boom. They were folks who worked on computers because they actually liked it.
>Where were the interviews with Cobol nerds standing in front of huge tape drives, sweating even in the chilly server room air as they mumbled inscrutable things?
Those didn't sell to the general public nearly as well as "oh my god we are all going to die", which seems to sell gangbusters to the same-ish cohort every ten years.
I could see real problems with plane navigation and even air traffic control. Not that it would mean planes falling from sky... Nuclear plants however... Yeah, no...
I knew someone who worked on a year 2000 project. I was in school at the time.
He started this job around 1997 and I believe it officially ended around Aug 2000.
He worked at oil and gas sites with relations to big companies like BP etc. He was checking all the systems being used - ranging from 2 to 25 years old, maybe more.
I was learning C programming at the time so was (barely) able to have a conversation with him. I found it very interesting. If I had been a few years older he might have given me a low-paid job to get my foot in the door.
Not once did he portray this as some kind of 'doom' like scenario but expressed potential problems if not dealt with - and that was his job. He focused on software used on oil rigs along with oil/gas sites.
There was a Sega Saturn in the room, and he used it as an example. Pointing at it, he said there were 3 outcomes to consider. I think there was more to it than that, but he was making the point to me, a 14-year-old.
1) It won't have any problems going into year 2000 and beyond.
2) The internal clock does not go from 1999 to 2000. Instead, it goes 1999 to 1900.
3) Or - The console stops working entirely when it reaches year 2000.
If the outcome was 2), the console would likely still work fine. Worst case, some games that used the internal clock could stop working. Another example is saved game data becoming a problem.
He didn't think the Saturn would have outcome 3), but if it did stop working, it wouldn't be the end of the world. A lot of parents would be complaining, but that's it. It's just a games console.
He then said to think of other computer systems out there. Think about electricity powering your home. Think about your boiler. Think about the london underground or aeroplanes. Think about banks or hospitals. There are computers doing far more important things than the Sega Saturn. A number of these machines could be more than 15 years old. Back then we would be talking about programs originally created in 1985.
I remember having this conversation with kids in my school who were the "computer whizz" kids at that time. I was a nobody, as I never showed off my skills with computers. I was shocked at their attitude towards it. They would always interrupt me, laughing at me, because I wasn't part of their clique. Because of their reputation in school, even the teachers were siding with them, smirking. Their argument was always about the Windows 95/98 PCs in front of them. They did not think about the bigger picture: software written in COBOL in past decades that was still being used.
The joke of Y2K is not recent. It was a joke back then as well. It was either a joke or fearmongering. The majority sided with one (the non-computer people) or the other (many who only thought they understood it).
Don't worry everyone will just assume that somebody else is fixing the Unix Rollover in 2038. Competence and stability breeds incompetence and instability.
Hopefully, we're lucky and everything just migrates seamlessly to 64bit, but a Minsky Moment for technical rather than monetary debt would be pretty apropos. Maybe the LLM designed to port everything makes a silly mistake that goes unchecked. Who knows maybe it becomes Skynet.
Distros are already planning to change time_t to 64 bit for the 32 bit platforms that are still affected. Debian is planning on doing it this year [1].
It's not actually hard to fix the problem, it's just tricky to manage an ABI break without breaking things. Anybody who compiles their whole system (like Yocto based builds or similar for embedded systems) and doesn't use any closed source binaries can do it today.
The hard part of the 2038 problem is not the ABI changes, but all the higher-level systems and data storage formats that use 32 bit unix time. Such as file systems. Fortunately many of them can be adjusted to treat time as unsigned so they won’t need to break compatibility with old data, but it’s really hard to make a signed -> unsigned change without overlooking corner cases.
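Both the 2038 rollover and the signed-to-unsigned reinterpretation can be simulated directly; since Python integers don't overflow, the 32-bit wrap in this sketch is done by hand:

```python
INT32_MAX = 2**31 - 1  # 2147483647: 2038-01-19 03:14:07 UTC

def as_time32(seconds):
    # Simulate storing epoch seconds in a signed 32-bit time_t.
    seconds &= 0xFFFFFFFF
    return seconds - 2**32 if seconds > INT32_MAX else seconds

def as_utime32(seconds):
    # The signed -> unsigned reinterpretation: the same 32 bits read
    # as unsigned stay valid until 2106 instead of wrapping in 2038.
    return seconds & 0xFFFFFFFF

print(as_time32(INT32_MAX))       # 2147483647 -- last representable second
print(as_time32(INT32_MAX + 1))   # -2147483648 -- wraps back to 1901
print(as_utime32(INT32_MAX + 1))  # 2147483648 -- still a valid 2038 timestamp
```

The corner cases mentioned above live exactly in that last function: any code that compares, sorts, or sign-extends the stored value still sees `0x80000000` as a large negative number unless every such site is found and changed.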