IMO the first graph would make a lot more sense plotted on a log scale.
Also, this way of framing it "As of February 2026, the US dollar has lost 96.9% of its purchasing power relative to January 1914. This means that $100 in 1914 would buy only approximately $3.05 worth of goods today" is of course mathematically correct, but hard to grasp intuitively.
I think it makes more sense to explain it in the opposite direction, or in both directions: "$100 in 1914 would buy only approximately $3.05 worth of goods today, or equivalently, $100 in 1914 is worth ~$3278 nowadays (because 100 / 3.05 ≈ 32.78, and 100 × 32.78 ≈ 3278)".
This also makes it easier to see that the definition "millionaire == person that has 1 million USD" only made sense around 1914, because the equivalent amount of wealth nowadays would be "millionaire == person that has ~32 million USD".
It's still a bad chart because of the "the great inflation destroyed more value than both world wars combined" claim, for two reasons:
1. It's not clear, from the chart at least, that the claim is true. 20.0% + 18.1% = 38.1%, which is greater than 30.2%, yet the quote claims otherwise. True, the red and orange segments cover more than just WW1 and WW2, but if more granular data is available, why not show it?
2. "destroyed more value" might be technically true if we define "value destroyed" as inflation, but that's a non-intuitive definition. If you asked someone about the value destroyed in WW1/WW2, they'd talk about Europe being bombed out, not about higher inflation.
That software is lifetime-license only, at $70 for the basic version and $110 for the pro... I wonder how that pricing model works out these days? It's not common anymore.
Not really. AFAIK it all amounts to something between 600 and 800 MHz for real-world code, at best. About the same for affordable FPGAs.
That aside, I don't really get the nostalgia for these systems. I don't care about Doom, or some port of Quake. While 68K assembly was much nicer to me than anything common today, what do I get out of that without a usable browser, office suite, or "daily driver" apps? Show me how to port Firefox, Chromium or something functionally equivalent to these, and how those perform! :-)
If we're talking about an actual, modern 68060 CPU running at multiple GHz, then it would be trivial to run Firefox or Chromium -- just install Debian m68k and compile. :)
Apart from the nostalgia factor, I suspect there would be no actual benefit from such a system. I doubt m68k would compare well to ARM or x64 in terms of compatibility or modern-app performance.
I took a quick look at your repo and noticed a possible issue: while the Gravis Ultrasound did linear interpolation of samples [1], on the Amiga all sample replay was strictly non-interpolated [2] ... essentially, on each display scanline the hardware would check the period down-counter and load the next sample if zero was reached.
Thanks for looking at the repo and for the links! Initially I just used non-interpolated output, but it was extremely noisy. I suspected the Amiga had some additional analog hardware which avoided this, so I just added some interpolation to make it sound better. But anyway, it's far from finished, if I have the time to pick it up again someday I could make it more accurate...
> Maybe it is all related to errno needing to read the base TLS pointer?
Probably everything that uses __thread requires this, e.g. malloc(), or functions that return pointers to static buffers whose implementers wanted to make them thread-safe.
/*
The interface of this function is completely stupid,
it requires a static buffer. We relax this a bit in
that we allow one buffer for each thread.
*/
static __thread char buffer[18];

char *
inet_ntoa (struct in_addr in)
{
  unsigned char *bytes = (unsigned char *) &in;
  __snprintf (buffer, sizeof (buffer), "%d.%d.%d.%d",
              bytes[0], bytes[1], bytes[2], bytes[3]);
  return buffer;
}
Well.. I just noticed that the implementation of inet_ntoa in glibc is a bit suspect. Using %d with char, without casting to (int) or using %hhd, is tricky, and depending on memory-alignment rules and on whether arguments are passed in registers or on the stack, this can produce incorrect results.
This is because arguments on the stack can be read as ints (typically in batches of 4 octets), but stored there as char (1 octet typically, though alignment might force them to 4-byte boundaries, I guess it depends). Interesting.
Anyway, I really liked this visualization https://mlde8o0xa4ew.i.optimole.com/cb:VNTn.d9a/w:auto/h:aut... which shows how the big value changes are compressed in time.