Helium supply issues are only going to make this worse.
I feel like for the first time in our lives we might have seen peak technology for the next few years. Everyone is going to have to make do instead of depending on ever increasing performance.
>I feel like for the first time in our lives we might have seen peak technology for the next few years.
This happened for a while with CPUs in 2004 or 2005, IIRC. At the end of the Pentium 4 era clock speeds and TDPs were so high that we hit a wall. Nobody was pushing past 4 GHz even with watercooling (I tried).
Dual-core processors were neither widely available nor mainstream yet, and those that were available had much lower clock speeds. It definitely felt like we hit a lull, or a stagnation, in those years. It picked back up with a fury when Intel released the Core 2 Duo in 2006, though.
> Helium supply issues are only going to make this worse.
I believe helium, although important, constitutes a small percentage of the cost of semiconductors, so its effect on chip prices will be less severe. It will be more noticeable in other uses of helium, though - party balloons could get very expensive, etc.
A hospital isn't going to shut down because their MRI's new helium load is getting more expensive - they'll pay a fortune for it. For a lot of other applications there are no suitable alternatives either.
The real question then becomes: what's going to happen when there's a 1000x price increase?
The problem is not that it's finite, the problem is that by the time prices rise enough to discourage people from using it frivolously, you might already be dangerously low on it.
This is a really interesting question. Is it? My intuition would say no since you have no inherent duty to protect or help others. I have no clue though.
If there's no downside to leaving the ladder in place, then I would think yes - there's a reasonable expectation that people will die due to your actions. You'd likely have to argue about "involuntary manslaughter" vs something more intentional though, depending on circumstances.
If there is a reasonable downside, probably no? You have a right to try to keep yourself alive, in nearly all contexts.
There are alternatives to oil for energy, a lot of them. Helium is unique in the universe for the properties it possesses as an element, and once it's gone, it's gone. Hydrogen is similar in density but extremely flammable, whereas helium is inert.
Helium could be made with nuclear fusion, but a 1-gigawatt fusion plant would only produce about 200 kg of helium per year, so it's not a viable path to making the quantities of helium we currently use. Current usage is almost 30 million kg per year.
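A quick back-of-the-envelope check using only the two figures above (200 kg/year per plant and ~30 million kg/year of demand, both as claimed in this thread, not independently vetted):

```python
# Rough estimates from the comment above - not vetted data.
fusion_plant_output_kg_per_year = 200        # claimed output of one 1 GW fusion plant
world_demand_kg_per_year = 30_000_000        # claimed current usage, ~30 million kg/year

plants_needed = world_demand_kg_per_year / fusion_plant_output_kg_per_year
print(f"Fusion plants needed to cover demand: {plants_needed:,.0f}")
# → Fusion plants needed to cover demand: 150,000
```

So even if every plant's byproduct were captured perfectly, you'd need on the order of 150,000 gigawatt-scale fusion plants just to match today's consumption, which is why fusion isn't a realistic helium source.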
The helium that goes into balloons is mostly a byproduct of industrial grade helium production that would otherwise just go to waste. It's not pure enough for industrial uses.
You could always purify it, it's just uneconomic to do so at a smaller scale. But if the price rises enough, that will change and no one will be using helium for party balloons.
Reminds me of a demo my college physics professor did in our first class (presumably to get our attention).
He had two floating balloons, one about twice as big as the other. Pointed a blowtorch at the smaller one and it (of course) popped.
"That one was filled with helium. Now, there's only one gas less dense than helium..." and right as I thought to myself "he's not gonna do what I think he's gonna do", he pointed the blowtorch at the other balloon which exploded into a much larger (and much louder) fireball.
Finally, good efficient code is going to get its moment to shine! Which will totally happen because it's not like 80% of the industry is vibe coding everything, right?
Yeah, I got the AI to convert some code that ran at 30fps in Javascript to C, and it resulted in a program that generated 1 frame every 20 seconds. Then I told it to optimize it, and now it's running at 1 fps. After going back and forth with the AI for hours, it never got faster than 1 fps. I guess I'm "doing it wrong" as the hypesters like to tell me.
> Yeah, I got the AI to convert some code that ran at 30fps in Javascript to C, and it resulted in a program that generated 1 frame every 20 seconds. Then I told it to optimize it, and now it's running at 1 fps. After going back and forth with the AI for hours, it never got faster than 1 fps. I guess I'm "doing it wrong" as the hypesters like to tell me.
Remove the "I actually only want a slideshow" instruction from your prompt :-)
speedrunning super mario world with neural nets is weirdly effective though. i guess you need a genetic algorithm to refine different approaches rather than a neural net.
Honestly speaking, it has started to look like AI coders could actually do a better job than 80% of app developers in writing efficient apps just by being set to adhere to best-practice programming conventions by default (notwithstanding their general tendency of trying to be too clever instead of writing clear and straightforward code).
This is my theory: we're going to see a lot of languages with straightforward and obvious semantics, high guard rails, terrible DX, and great memory allocation and performance behavior out of the box. Assembler or worse, but with extremely strong typing bolted on in a way that no human would ever tolerate - basically something in that vibe.
I vibe coded a library in Nim the other day (a language I view very much as a spiritual continuation of the Pascal/Modula line), complete with a C ABI.
The language has well defined syntax, strong types, and I turned up the compiler strictness to the max, treat all warnings as errors etc. After a few hours I put the agent aside, committed to git then deleted everything and hand coded some parts from scratch.
I then compared the results. I found one or two bugs in the AI code, but honestly, the rest of our differences were "matters of taste" (is a helper function actually justified here or not, that kind of thing).
Yeah, actually, I worked with Pascal early in my career and that's kinda the vibe I'm thinking of, though maybe with a stronger, more Ada-esque type system (composite, partial, and range-and-domain types, all that jazz).
This one might last longer. The AI race is on, and the US tries its best to make it as expensive for China as possible to participate in it. Every dollar China spends on GPUs they get at markup is one not spent on building navy ships.
If there is an escalation over Taiwan, then that will cause the loss of most of the world's high grade chip manufacturing capacity. TSMC is busy doing technology transfers into the US, but it is going to take time, those fabs won't have capacity for the whole world, and they still heavily depend on Taiwan based engineers if something goes wrong etc.
Just like with COVID you don't know how long this shortage will last.
It will be incredibly hard for China to conquer Taiwan. One hundred kilometers of open water across the strait is a brutal geographic hurdle. If anything, the fabs will probably be severely damaged in a war. Plus, most senior execs and elite engineers would be moved to US offices in Arizona.
We are going to have that now in a couple of months regardless. So it won't matter if Taiwan's manufacturing base gets disrupted, the hardware will have already effectively stopped.
Wow, I wasn't aware Samsung, Intel, and SMIC were unable to produce "modern technology." Not everything needs to be on a 3nm TSMC process, believe it or not.
TSMC makes a lot of stuff besides the EUV-scale parts that all the YouTube videos talk about.
Almost everything you own that runs on electricity has some parts from Taiwan in it. TSMC alone makes MEMS components, CMOS image sensors, NVRAM, and mixed-signal/RF/analog parts to name a few.
Also, people seem to assume that TSMC is an autonomous entity that receives sand at one loading dock and ships wafers out at another. That's not how fabs work. Their processes depend on a continuous supply of exotic materials and proprietary maintenance support from other countries, many of them US-aligned. There is no need to booby-trap any equipment at TSMC; it will grind to an unrecoverable halt soon after the first Chinese soldier fires a rifle or launches a missile.
Hopefully Xi understands that. But some say it's a personal beef/legacy thing with him, and that he doesn't even care about TSMC.
Russia wasn't able to take Ukraine even when it could just drive its tanks right up to Kyiv. Modern warfare tech just favors the defender too much. China has ninety km of sea to cross before they even get to Taiwan. Missiles and drones have already taken out much of the Russian naval fleet in the Black Sea. China would lose a lot in the same way if they ever attempted the crossing.
That's what happens when consumer demand rapidly shifts, and businesses start panic-buying and panic-cancelling. As far as I recall, actual chip fab output didn't really change that much.
I asked ChatGPT about this. It says the root cause was the demand collapse at the start of COVID. Fabs stopped producing the many low-end chips required for modern cars and retooled for higher-end chips. When the auto manufacturers came back knocking after COVID, the fabs didn't want or need their low-end chip business.
Moore's law only really works when at least part of the world is functioning under practically ideal conditions. Right now that's far from what's happening.
Helium is almost all captured from gas wells by cryogenically liquefying the nitrogen out of the mixture. I guess you could technically do that with the fab's exhaust air, but that is a LOT of air to liquefy, and it likely costs more than even inflated helium prices.
Most helium from most wells is simply vented, because it is expensive to separate even at its relatively high concentration there, and I imagine even the best case for capturing it from a fab involves an abysmal concentration of helium.

But because most of it is vented, it also means that if the capital were put down to build more helium separators on gas wells, it wouldn't take long to increase supply. Short term, for a year or two, it can be a problem, but beyond that it is simply a cost-versus-demand issue. There is neither a technological nor a source limitation; it is purely a capital investment limitation.
> Helium is almost all captured from gas wells by cryogenically liquefying the nitrogen out of it.
This is wild. I never thought about how they separated gases from natural gas fields. The carbon footprint of each kg of that helium must be astonishingly large.
> Most helium from most wells is simply vented because it is expensive to separate even with its relatively high concentration
I remember a similar situation with neon early in the Ukraine invasion a few years ago. What I expect to happen is some other source coming online that currently doesn't try to capture it for economic reasons.
Helium recovery in scientific settings for cost saving reasons is already done, so it's not like there isn't expertise in using it.
Helium is actually pretty hard to keep ahold of, being a very light and small noble gas. It can diffuse through a surprising amount of materials, flow through far smaller cracks than you would expect, and is quite hard to filter out of a mixture of gases.
Also, superfluid helium (a big chunk of the helium used for refrigeration, e.g. in the LHC) has the weird property of flowing at the same speed through a tiny hole as through a large one, and of coating everything with a molecular film. Superfluid helium is basically a Bose-Einstein condensate at macro scale - totally counterintuitive. Zero viscosity, and essentially a thermal superconductor.
AFAIK they recapture most, but recapturing all simply isn't possible / financially feasible. And they use a lot of helium, so even if they capture most of it, the losses are still higher than the currently available supply.