NASA’s new shortcut to fusion power (ieee.org)
200 points by GordonS on Feb 28, 2022 | 138 comments


> Existing fusion reactors rely on the resulting alpha particles—and the energy released in the process of their creation—to further heat the plasma. The plasma will then drive more nuclear reactions with the end goal of providing a net power gain. But there are limits. Even in the hottest plasmas that reactors can create, alpha particles will mostly skip past additional deuterium nuclei without transferring much energy. For a fusion reactor to be successful, it needs to create as many direct hits between alpha particles and deuterium nuclei as possible.

This is one of the clearest explanations of fusion power I've read so far. Worth a read just for that alone.

> And as the technology matures, it could also find uses here on Earth, such as for small power plants for individual buildings

Distributed power generation is the ideal. Why bother with transporting energy when you can just generate it where you need it?

Really cool article/tech. I hadn't heard of LCF until now. Seems promising.


> Why bother with transporting energy when you can just generate it where you need it?

For the same reason every household no longer grows, harvests, and threshes its own wheat or bakes its own bread, and why Mao Zedong's Great Leap Forward idea of building a blast furnace in every village, in order to increase the country's steel output, was an utter failure.

Power transmission is cheap, and economics greatly favor utility-scale deployments. You also get significantly less need for wasted peak capacity when multiple power producers can pool together into a grid.


> For the same reason every household no longer grows, harvests and threshes its own wheat

This feels like a weird point to make when solar power is as popular and growing as it is.

Localized power generation is not only here now; we already have programs to tie your localized power generation into the existing power grid, and you get paid for it. I don't see how this system couldn't work the same way.


It's popular and growing because of credits and subsidies. Remove all credits and subsidies, and I can build utility-scale solar for less than half the cost per kWh of panels installed on your roof.

There's way too much human labour involved in getting someone to drive to your house, climb onto your roof and bolt panels to it. In that time, that same worker could set up a dozen similarly-sized panels when building out a utility solar farm.

And yes, if your time is worthless, and you don't value your neck, you could DIY, and save on some of those costs.

... But you'd still be tied to the grid (and paying grid fees), unless you are ready to invest $XY,000 for a massive battery bank... That might still leave you without electricity during a period of low generation/high consumption.


> There's way too much human labour involved in getting someone to drive to your house, climb onto your roof and bolt panels to it. In that time, that same worker could set up a dozen similarly-sized panels when building out a utility solar farm.

At some point I'm going to pay someone to come out re-shingle my roof anyway.


Sure, but even shitty tar paper roofs last 20 years. Stone and metal roofs can last much longer.


>And yes, if your time is worthless, and you don't value your neck, you could DIY, and save on some of those costs.

This is such a cop-out. How I spend my time is up to me. If it's a DIY project worked on during a weekend when I had no other plans, then it's not really a cost to me. Sure, professionally, I have my hourly rate that determines my "worth". However, I do not get to bill those hours 24/7/365. Even playing along with your premise, if I'm playing weekend electrician, I'm not a master electrician making the same rates as in my other job, so a 1:1 correlation is just a lame argument.


How you spend your time is fine. But most people lack the expertise to maintain a solar panel. This becomes an issue when you die or sell your house and the new owners don't know anything about maintaining solar roofing. Over time, this will become a massive cost center, especially since solar panel installations (DIY ones above all) have a painful lack of standardization, which drives up costs immensely.


>This becomes an issue when you die or sell your house and the new owners don't know anything about maintaining solar roofing

I'm setting aside the "when you die" part, but acknowledging its presence.

However, the new owners get a small tear and a tune from the world's tiniest violin. They know the panels are there when they buy the house. It's not a surprise at closing. If they decide to buy the house with that knowledge, then that's on them, their realtor, their inspector, etc. It's like someone buying a full-size truck and then complaining they didn't know how expensive fuel would be.


> There's way too much human labour involved in getting someone to drive to your house, climb onto your roof and bolt panels to it. In that time, that same worker could set up a dozen similarly-sized panels when building out a utility solar farm.

Where do you put the solar farm?


Literally anywhere that's not prime real estate. There are three orders of magnitude more of that kind of surface area on this planet than there are single-family home roofs. [1] I think we can figure something out.

[1] 2 billion 'single-family' homes, 800 square feet of roof on average [2], ~50,000 square miles of roof space. Total land area of the Earth is 57 million square miles. You can take your pick of which 50,000 of it can be used for utility solar...

[2] This is a large over-estimate, reality is much smaller than that.
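
A quick back-of-envelope to show the arithmetic (same assumed inputs as footnote [1]; the code and rounding are mine):

  # rough check of the rooftop-vs-land figures in this comment (assumed inputs)
  homes = 2e9                      # 'single-family' homes, per footnote [1]
  roof_sqft = 800                  # assumed average roof area per home, sq ft
  sqft_per_sqmile = 5280 ** 2      # 27,878,400 sq ft in a square mile
  roof_sqmiles = homes * roof_sqft / sqft_per_sqmile
  print(round(roof_sqmiles))                # ~57,000 sq miles of roof, same order as the ~50,000 quoted
  print(round(57_000_000 / roof_sqmiles))   # ~1,000x as much total land area, i.e. three orders of magnitude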


If you take Europe as an example, every area that is not "prime real estate" is either agriculture or protected forests. Transport of electricity over very long distances is not easy, see https://www.pv-magazine.com/2021/05/18/geopolitical-impact-o... for a few of the challenges.


Unproductive farm fields.


The interesting question is what happens if battery prices continue to decline so that $XY,000 becomes $X000 and thereby less than grid fees.

Low generation can be covered by an inexpensive gas generator kept for emergencies, which might see three days of use in a year.


If battery fees get lower than peaker plant-related grid fees, then grids will shut down peaker plants and instead buy warehouses full of batteries.


Isn't a significant fraction of grid fees for the grid itself? The land, paying linemen to repair storm damage, the administrative apparatus to negotiate with everyone and collect payments.

You still need all of that if you put all the batteries together in a warehouse, not if you put them in each person's basement.


Those costs are still significantly lower than building XY square feet of basement that you now can't use for anything, because its space is physically occupied by electric infrastructure.

Again, economies of scale.


Don't you still need the same amount of space at the warehouse?

Tesla Powerwall 2 appears to be <5 cubic feet. That doesn't seem like a lot.


I don't think the grid can ever go away. Regional solar is just too variable. You might get backup for a day, but storage costs scale linearly with duration.


> This feels like a weird point to make when solar power is as popular and growing as it is.

Funny (ironic?) thing about most of the significant residential solar installations in my hood: they exist because they're grid-tied and generate revenue for the households.

If there were no grid for these homeowners to sell their excess power to, they wouldn't have bothered with installing and maintaining solar panels. It's not like there's zero risk involved; hell, just having thousands of dollars of solar panels sitting on your roof is a big fat sign announcing "well-heeled folks reside here, break in and pillage when absent". Not exactly the greatest thing to broadcast in the predominantly low-income desert regions where solar is most applicable.


Utility scale solar has a MASSIVE cost advantage.

Also, the vast majority of houses can’t sustain their energy use off of rooftop solar alone.


Solar power is inherently distributed by the sun shining everywhere on Earth.

Other forms of energy, perhaps with the exception of wind power, do not work like that.


I've read that creating enough food for a family of 4 from a home garden requires something like 8 square meters of land, chickens, and loads of labour. It's non-trivial, and if you make a mistake you go hungry.

Assuming some putative ideal future Mr Fusion, plugging it into the wall would be a completely different proposition, require relatively little space, and zero household labour.

Considering the massive infrastructure and street furniture required to distribute electrons, the unit economics of home fusion would need to be terrible in order for centralisation to remain competitive against the significant benefits for reliability and decentralisation.


> creating enough food for a family of 4 from a home garden requires something like 8 square meters of land, chickens, and loads of labour. It’s non trivial, and if you make a mistake you go hungry.

Not even close. Potatoes produce more calories per unit of land than anything else you can grow in a temperate zone garden. Intensively cultivated potatoes may produce 10,000 calories per square meter, but that would be a stupendously successful crop. In other words, to provide all the calories for a family of four, you'd need on the order of 1 to 2 square meters of potatoes per day.

Since you can't really live on potatoes alone, to get adequate nutrition across the spectrum of human needs, you need quite a bit more than that. You could feed a family of four for a year on less than an acre, if none of your crops failed or did poorly. I'd hate to be responsible for trying it on less than that, though, in the temperate US.
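
Spelling out the arithmetic (assuming roughly 2,000 kcal per person per day, which is my figure, not one from above):

  # back-of-envelope for the potato figures above
  kcal_per_day = 4 * 2000              # family of four at an assumed ~2,000 kcal each
  kcal_per_m2 = 10_000                 # the 'stupendously successful' yield quoted above
  m2_per_day = kcal_per_day / kcal_per_m2
  print(m2_per_day)                    # ~0.8 m2 of crop consumed per day, i.e. order 1-2 m2/day
  print(m2_per_day * 365)              # ~290 m2 for a full year, well under an acre (~4,047 m2)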


To be clear, I was responding to the OP by comparing the difference between centralised agriculture and centralised energy production.

As everyone has pointed out, you need lots of land for food. Far less for energy, and so less reason to centralise it if there is a good alternative.


> I've read that creating enough food for a family of 4 from a home garden requires something like 8 square meters of land, chickens, and loads of labour.

I'm pretty sure you can't feed a family from what you can grow on a balcony.

/edit: >Research in the 1970s by John Jeavons and the Ecology Action Organisation found that 4000 square feet (about 370 square metres) of growing space was enough land to sustain one person on a vegetarian diet for a year,

https://www.growveg.com.au/guides/growing-enough-food-to-fee...


I would be hugely surprised if we have a Mr. Fusion future where you toss garbage into it and get electricity. More likely is that you'd have to buy fuel pellets periodically. It's not free to package them into household-sized packages and ship them to every home.

Oh, god, I can even see the pain... just like inkjet cartridges. "Non-genuine fuel cartridge detected. Please remove and replace with a genuine cartridge."


> 8 square meters of land

Pretty sure this is off by a few orders of magnitude.


Why? Staple crops can be planted very close together and there are many crops that can coexist on the same plot.

Crop nutrients are obviously a concern, but if you're only trying to survive for a couple of cycles it seems totally feasible, and could be extended artificially.


Conservatively each person needs 1500 calories per day and that's cutting it close. That's about 2 million calories per year for 4 people.

Potatoes are one of the most calorie-dense vegetables by weight and by growing space. They have 350 calories per pound, so you need 5700 pounds of potatoes to feed a family of four.

A good potato yield is about 25,000 pounds per acre, so you need nearly a quarter acre of potatoes (1000 square metres) to feed your family for a year.
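
Spelled out (same inputs, my rounding):

  # verifying the numbers in this comment
  kcal_per_year = 4 * 1500 * 365        # ~2.2 million calories for four people
  lbs_potatoes = kcal_per_year / 350    # ~6,300 lb (rounding to 2M kcal gives the ~5,700 lb above)
  acres = lbs_potatoes / 25_000         # at a good yield of 25,000 lb/acre
  print(round(lbs_potatoes), round(acres, 2), round(acres * 4047))  # ~6,257 lb, ~0.25 acre, ~1,000 m2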


It's also worth noting that you're not going to be in shape to do much agricultural work on 1500 calories/day.


Have you actually tried to grow stuff?

I grow a lot of stuff and have for years, and it's an excessive amount of labor; you only get certain food during certain times of the year, sometimes only for a few weeks, and that's assuming you don't have losses due to pests.

Agriculture is best left to the professionals for feeding societies.


On the contrary. I think everyone should try growing some of their own food. It's not only great for your physical health, but your mental health too! Also having some long-term food storage and water on hand is a great way to weather unexpected circumstances. It doesn't have to be an all or nothing thing. It's sort of like having a generator or battery storage for backup power... If everyone had it, the whole grid (or food availability) would be a lot more resilient. Imagine if all of Ukraine's residents had plenty of food and water in their homes for the next 2 months. That would be huge.


Being prepared by storing two months of food and water has nothing to do with growing your own food.

If your municipal water supply is compromised, the last thing you should be doing with water is pouring it on the ground.


Have you actually tested this in any way? How many calories/year do you grow in each square meter of your own garden?

In the real world, people doing this for a living find that they need literally 100 to 1000 times that much land to feed a family of four.


Tell that to my septic system!

If the cost of generating power drops, then distribution is a more viable model. Especially if you get extremely high fuel density. Also as we've found in California, power delivery can be very expensive.


> Distributed power generation is the ideal. Why bother with transporting energy when you can just generate it where you need it?

You need to be able to throttle the power output up and down. That is harder to design and harder to make efficient. Or you need lots of batteries, which is expensive. And it all has to be sized for peak demand rather than being able to benefit from flows across the grid.


Produce at double max and use the excess power to remove carbon from the atmosphere?


The problem with using excess power for things like that (or desalination per sibling comment) is that the infrastructure to do these things is expensive, and it becomes less cost-effective if it's only used part of the time, when the energy is available.

The one application I'm aware of that can cost-effectively use excess power of that sort is crypto mining, since there the vast majority of the cost is electricity; the capital cost is relatively low. Unfortunately it's arguable whether it does anything useful besides enriching the producer.


>The problem with using excess power for things like that (or desalination per sibling comment) is that the infrastructure to do these things is expensive, and it becomes less cost-effective if it's only used part of the time, when the energy is available.

Not if you reverse the grid - instead of providing power to the nodes it takes excess away to scalable workloads. And you have grid access as a backup.


It's nothing to do with the organization of the grid.

Almost any "scalable workload" actually capable of scaling up will have a high $$$ upfront cost for all the machinery / infrastructure involved.

If a carbon-removal plant costs many millions of dollars to set up, nobody is going to let it sit idle waiting for some excess power.

The best use case for smoothing the demand curve IMO is thermal "storage". While production is high, people can crank the AC or heat. That way they don't have to use as much energy when production is lower.


Yes, that's a good one. It can even be automated to some extent using smart thermostats. Water heating too. Even without the thermal storage element, though, there are lots of loads that can be shifted using a financial incentive. Clothes washing and drying is a good one. Electric vehicle charging too. Even if they aren't used to feed back into the grid, just having people charge when power is plentiful will make a big difference as more electric vehicles are added to the system.


Electric vehicles are really the best option IMO - as long as it’s fully charged in the morning when it’s needed, most drivers won’t care if it charges at 9pm or 3am.

I’m less certain about the other things - not many people will eg. put off a load of laundry to a less-convenient time to save 21 cents. And if my partner is too hot because the heater has been running for 30 minutes I really do not want to have a discussion with her about “thermal storage” :)


IMO the better application of the heating ones would be to have people opt-in to potentially having their heating or air conditioning disabled for short periods of time if necessary, but even if it isn't, they get a credit each bill. If there's a spike though, the power company could shut off those circuits remotely, freeing up some room. It would just be used for smoothing of short-term spikes.

I believe in some places this is already done.


This is pretty much the worst implementation of this idea. Nobody is going to be happy stuck in an 85F+ house with no way to cool it. It is also seriously dangerous to just shut off heating randomly. The credit would have to be impractically large for people to participate voluntarily.

It is so much nicer to cool/heat pre-emptively. The power company has a very accurate view of what the next 12-24 hours will look like. If they know the sun will be shining bright and there will be an excess of solar energy, they should crank up people's AC so that they don't need to draw as much power later in the day. This way we smooth the demand curve without letting pipes freeze or giving people heat stroke.


No one is getting heat stroke from having their AC shut off for a few minutes. I'm talking about short term spikes. I don't think many people are going to want to be freezing in the morning to avoid using power when the sun's up later in the day.


But these elaborate uses for excess energy are easier to do at scale using the existing grid. You have a specialist centre in a single location that can take excess power from across a large area.


Yes this is what I was suggesting would probably make most sense - you keep the existing grid even if you distribute the production.


We do this with rooftop solar cells, so the precedent already exists.


You mean you can cost effectively use the excess power to produce even more heat?


Once the power is produced it's going to become heat regardless, so if you're not able to scale down generation you're going to get that heat. But as I said, it's certainly debatable whether there's any value in that type of work.


The most obvious use for excess power is surely storing it. In a battery, a flywheel, a gravity storage system, compressed gas—hugely economical.


Yes. But storage is easier at scale using the existing grid. A large battery storage site can take excess power from across the grid and get economy of scale.


> Unfortunately it's arguable whether it does anything useful besides enriching the producer.

I don't see how that goal is problematic given the use of a carbon-neutral energy source. I actually see it as an advantage, something that might help speed the adoption of fusion energy and thus get us off hydrocarbons.


You could always run Folding@Home or something in that vein.


Or desalinate and clean water, or store the excess for the next peak in, e.g., connected high and low dams, or... that's really a nice problem to have.


Yes, except those things are all designed for large-scale grid systems.


Size it for peak consumption and use the excess for bitcoin mining.

I'd prefer to do something really useful, like desalination, pre-processing of waste water prior to dumping it into the sewers, and so on, but all of these would require a lot of plumbing.


Maybe excess energy could be used to recycle limited and precious elements from landfills via Santa Claus machine-like disassembly with very hot plasma. Horribly energy inefficient but may be useful in some situations.


The reason to use MCF (magnetic confinement fusion), and why IEC can't work, is that it's okay if particles don't collide often, as long as they are confined for sufficiently long. So what if the fast alpha doesn't collide right away? It's charged and thus well confined. It will transfer its energy to other particles eventually.


Inertial confinement fusion works just fine if you are building a 1 megaton device driven by a fission bomb. You probably can't make it work if you're driving with a laser because the wallplug efficiency of a laser is terrible, but at least you can build a failing facility which is only huge as opposed to gargantuan. Real breakeven might be possible with heavy ion beam ignition but the minimum size facility to make an attempt is gargantuan.


IEC is inertial electrostatic confinement: fusors and polywells. Conduction losses through the confinement/accelerator coils being immersed in the plasma cut the confinement time far too short for a reactor.


... like Philo Farnsworth's Fusor

https://en.wikipedia.org/wiki/Fusor


But a really great source of fast neutrons


So cold fusion is viable after all; you just have to make sure you call it something else so you're not laughed at. They are very careful to avoid the label:

"LCF isn’t cold fusion—it still requires energetic deuterons and can use neutrons to heat them."


> So cold fusion is viable after all

No, it isn't. There is more than just a change of name involved with LCF: the statement "it still requires energetic deuterons" means the deuterons still have to be hot. They can't be at room temperature.


> There is more than just a change of name involved with LCF: the statement "it still requires energetic deuterons" means the deuterons still have to be hot.

The original cold fusion experiments' explanation was lattice confinement in heavy metals (i.e., large electron clouds) like Pt/Pd, plus energetic deuterons. What was very unclear is where those deuterons got their energy. It was theorized along the lines that high electrostatic charges in cracks in the metal accelerate the deuterons, etc.

Unfortunately an air of pseudoscience somehow got attached to that research, and for decades that prevented any meaningful research into the source of those deuterons, how to efficiently increase their number, and/or how to efficiently add another source. Only the passage of time and the name change to LCF - marketing, yay! - has allowed the research to restart, though still without due credit to the original research.


> What was very unclear is where those deuterons got their energy.

Yes, and that was because no energy source was being used to start the reaction; the metal with deuterons in it was just sitting there.

In these experiments, an energy source (gamma rays) is used to heat up the deuterons to start the reaction. That's a key difference, and it's why a different term from "cold fusion" is entirely appropriate.


Cosmic gamma rays and cosmic muons are energetic enough to start a reaction, so LENR devices are just boosters. IMHO, this explains why some labs in the mountains are able to reproduce the experiment while other labs cannot. I had plans to put a charged LENR device on a plane to test that.

If this is true, then LENR can be used to power spacecraft in deep space. Maybe it can power airplanes on long routes too, like solar panels, but 24x7.


Strictly speaking, there was an energy source doing the water electrolysis, though of course there is no known mechanism for that to result in such energetic deuterons.


From the article: Electron screening makes it seem as though the deuterons are fusing at a temperature of 11 million °C. In reality, the metal lattice remains much cooler than that, although it heats up somewhat from room temperature as the deuterons fuse.

Sounds pretty much the same as room temperature to me. Also the pictures with the experimental setup suggest that the glass does not melt, which is pretty cool.


"Much the same as room temperature" is only the average temperature of the metal.

The few irradiated deuterons and the products of their collisions have speeds (kinetic energies) many millions of times higher than those corresponding to room temperature.

The average temperature remains low only because few nuclei take part in fusion.

If they succeeded in making enough nuclei take part in fusion reactions to produce more energy than is consumed, it is not clear how high the average temperature of the metal would become.

The temperature of the metal could avoid rising excessively only if most of the energy produced by fusion were carried away by neutrons, which would be absorbed somewhere else, generating useful heat but also creating undesirable radioactive waste.

This approach is indeed very promising, but there are many problems that must be solved, so there is still no chance of a fusion reactor in only a few years.
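
For a sense of the scale involved (my own conversion, not from the article): the quoted effective temperature corresponds to roughly 1 keV of kinetic energy per deuteron, versus a small fraction of an eV at room temperature.

  # k_B * T in eV, comparing the 11 million degree figure with room temperature
  k_B = 8.617e-5                  # Boltzmann constant, eV/K
  print(k_B * 11e6)               # ~950 eV, roughly 1 keV per 'screened' deuteron
  print(k_B * 293)                # ~0.025 eV at room temperature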


>it is not clear how great the average temperature of the metal would become.

Certainly not above the melting point of the metal if the lattice structure is required to sustain fusion.


The part about "electron screening" makes no sense at all. The deuterium nuclei are in the slots between the erbium nuclei. Most of the electrons are very close to the erbium nuclei, so the slots where the deuterium nuclei are have a low electron density. Approximately the same density of an isolated deuterium atom, perhaps the double, but I doubt it's 10x higher.

The orbitals of the electrons of deuterium are like 1000x bigger than the size of the nuclei. So once the incoming deuterium nuclei approach, it will be much closer to the target deuterium nuclei and it will not see the electrons. Note that most of the energy of the repulsion is when the nuclei are close, not when they are far away.

The erbium are useful to keep a lot of deuterium together, but the electrons shelling is probably very small.

The trick they use is to use a very energetic gamma rays that colides (indirectly) with one deuterium, and this deuterium is very fast that is the same effect you get when you have a very hot deuterium.


>> The trick is to use very energetic gamma rays that collide (indirectly) with one deuteron; that deuteron then moves very fast, which is the same effect you get with very hot deuterium.

That notion of "hot" is not the norm. Most of us think in terms of temperature, not "energy". Would you want to get an X-ray if it were described in a way that sounded like high temperatures were going to fry you? No.

When the original cold fusion work was published, physicists across the board declared it impossible, insisting that high temperatures (and/or pressures?) were absolutely required for fusion to happen. The notion of a desktop fusion reaction was categorically ridiculed. Now it's OK so long as we change our conventional definitions to make those earlier denials not seem ignorant. BTW I'm not saying the original CF worked, just that those rejecting it used words that would also exclude the possibility of LCF (or LENR or whatever we call it now).


> The notion of a desktop fusion reaction was categorically ridiculed.

The fusor was invented in 1964 https://en.wikipedia.org/wiki/Fusor


But with neutron reflectors and such, they hope to make it self-sustaining. Insisting that it's not cold fusion really is splitting hairs.

I suspect a magnetic field will help reaction rates too, even though you'll have a hard time finding research supporting or refuting that.

Even I had wondered if firing neutrons into a cold fusion cell might be helpful. Turns out it probably is. But then it's not cold, it's LCF.


How about we call it "Cool Fusion" as a compromise?


it's really more of a tepid fusion


I can see why they might not want to use that name...

Now, if we could just make tepid superconductors...


How about "Average Fusion"?


Speaking of cold fusion, what happened to that Italian scam from a few years back?

Ah yes, https://en.wikipedia.org/wiki/Energy_Catalyzer - still a scam.



https://ecat.com/

I'm disappointed, because the Dec 9 event was previously a new LED light of some sort, and now I can't find it with a cursory search.

$25 maybe for preorder at the time. Maybe I'll check wayback machine or something.


I don't believe this.

If deuterons could fuse with other deuterons in a metal lattice, this would have been seen ages ago, just by accelerating deuterons into a deuterium-loaded material. This is how neutron generator tubes work, and they've been used for longer than you've probably been alive.

What happens in these tubes is that the vast majority of deuterons lose energy by ionization and don't undergo nuclear reactions. There is no radical increase in fusion from electron shielding.


Thanks for the reference to neutron generators. Wikipedia agrees: https://en.wikipedia.org/wiki/Neutron_generator

Just maybe people wanting a neutron source stick with thin foils to let the neutrons out, because that's what they want. I wouldn't be surprised if this method of initiating fusion was completely overlooked for 90 years but turns out to be viable after all. There are still many questions to be answered, since the probability of a fusion event still seems to be too low. I just wish people would do more searching for answers and less outright rejection.


The point I was making was that the physical effect they claim is occurring makes no sense, and would have glaringly shown up elsewhere, decades ago, if it actually was real.


This was seen ages ago.

> In 1989, two electrochemists, Martin Fleischmann and Stanley Pons, reported that their apparatus had produced anomalous heat ("excess heat") of a magnitude they asserted would defy explanation except in terms of nuclear processes. They further reported measuring small amounts of nuclear reaction byproducts, including neutrons and tritium. The small tabletop experiment involved electrolysis of heavy water on the surface of a palladium (Pd) electrode. The reported results received wide media attention and raised hopes of a cheap and abundant source of energy.

https://en.wikipedia.org/wiki/Cold_fusion


Um, no. P&F saw nothing but their own experimental artifacts. Replication of their claims failed. Their setup was also different from this NASA one, with no gamma radiation shining on the palladium (and also palladium, not titanium or erbium as used by NASA.)


Most of the replications failed, but some succeeded. My theory is that the nuclear reaction in hydrogen-loaded palladium is initiated by cosmic gamma rays or cosmic muons, thus laboratories that were open to cosmic rays, such as those on top floors or in mountain areas, were able to reproduce the results, while shielded labs were unable to.

I planned to set up a similar experiment and bombard a hydrogen-loaded target with muons, or move the experiment to the top of a Carpathian mountain, but the institute of nuclear research in Kharkiv, Ukraine is under attack, so it's not possible.

The idea of using ErD3 instead of nickel or palladium is very good, because it saves the weeks of time needed to load the target with deuterium.


Look, P&F's putative results cannot be the same thing as what these NASA guys are claiming. The NASA people are claiming ordinary DD fusion reactions are occurring, just at an enhanced rate. P&F's putative heat excess would have killed P&F from neutron radiation if it had been the ordinary DD reactions.

Physicists knew P&F's nonsense was garbage because it required multiple miracles (that fusion would occur, that the ordinary fusion reactions would not, that some weird non-standard fusion reaction would occur instead). Far, far more likely was that they were just bumblers. And when replication failed (and it did fail; other vaguely similar but different sporadic results are not "replication"), that prior was validated.


- "One promising alternative is lattice confinement fusion (LCF), a type of fusion in which the nuclear fuel is bound in a metal lattice. The confinement encourages positively charged nuclei to fuse because the high electron density of the conductive metal reduces the likelihood that two nuclei will repel each other as they get closer together."

Isn't this just cold fusion? The end paragraph even credits an "International Conference on Cold Fusion".


As many physicists pointed out at the time of the Fleischmann and Pons controversy, fusion at room temperatures may well be possible. In fact we know it's possible, because we have fusors, and then there's muon-catalysed fusion. The problem wasn't with the temperature range, it was with the not working and not being fusion.


> The problem wasn’t with the temperature range, it was with the not working and not being fusion.

And them being chemists not physicists.

And the University of Utah issuing a press release before anything was peer reviewed.

The whole episode is an example of what can go wrong in science. This article shows how what clearly could have been a useful and productive field of investigation became so poisoned that no significant research could go on for a quarter century, and still the authors have to go to great pains to distance themselves from Fleischmann and Pons.


>> And them being chemists not physicists.

That was a huge part of it - I wrote a paper on the controversy for a class. The physicists basically said "That can't work, it must be a chemical reaction." Regardless of the physics working or not, they were effectively saying two chemists couldn't properly do calorimetry on a cell containing 4 elements that don't do very much chemically. This is a much harsher thing than the chemists saying physics overlooked something.


Nothing about fusors is room temperature.


The point is you can operate it unshielded or thermally lagged. A typical Fusor has about the power output of a domestic lightbulb. I have several of those on in the room I’m in right now. Fusion in a desktop apparatus, by itself, is no big deal.


I still don't get this line of reasoning. Pulsed MCF research devices only get hot on divertors and are thermally lagged. These distinctions don't seem important because they're decoupled from the physics.


> Isn't this just cold fusion?

No. You still have to heat the fuel. The claim of cold fusion was that the fuel could just sit there at room temperature and fuse.


Cosmic rays and muons can initiate a nuclear reaction: they can sometimes have very high energy. If the metal plate is large enough, it can be hit often enough to produce significant output when it is not shielded from cosmic rays.

For example, a lab on a top floor or in a mountain area may produce much more heat than a lab in a basement at sea level.


No, because of the gamma ray input.


Yes, it is. The contrary position is motivated by a scurrilous need to justify ignoring the mechanism for 40 years, using semantic obfuscations.


Good to see this field getting some serious investigation. Last I looked it was still very hypothetical with only questionable characters investigating.


> with only questionable characters investigating.

Well, the assumption in the physics community is that if you're investigating this field you're a questionable character, so it's surprising that it got investigated at all.


Physicists look at cold fusion scientists like archaeologists do at ancient alien theorists?


We need fusion, we need more power YESTERDAY.

Whatever amount we are spending to develop these moonshots, it needs to be more.

So many "hard things" become that much easier when we take having to power them out of the equation.

A generator for a forklift is probably unrealistic, but for a huge skyscraper-building crane? Or for one of those giant shipping barges that produce multiple percentage points of our carbon emissions? That no longer sounds so crazy.

Isn't that how the US powers its aircraft carriers and submarines anyway? Only with fission, which clearly is too dangerous to put on civilian ships.


Yeah, portable fusion would make large-scale power grids obsolete and revolutionize shipping. Things get more interesting when you scale it down to fit a passenger aircraft.

I'll believe it when they can power a toaster or something.


There seems to be an inconsistency in the article. First it says the Dynamitron produces gamma rays:

> We can jump-start the fusion process using what is called a Dynamitron electron-beam accelerator. The electron beam hits a tantalum target and produces gamma rays, which then irradiate thumb-size vials containing titanium deuteride or erbium deuteride

But later it says:

> producing neutrons from a Dynamitron is energy intensive. There are other, lower energy methods of producing neutrons including using an isotopic neutron source

Is the input neutrons or gamma rays?


From what I gather the actual lattice requires neutrons. Their current setup uses an electron source (dynamitron) on tantalum which generates gamma rays. The gamma rays have the energy needed to push neutrons around inside the lattice. It seems they're saying it would be more energy efficient just to directly generate neutrons without having to go electron -> gamma -> neutron.


Any good overviews on the different types of fusion and a relative sense of the likelihood they'll achieve their aims?

I know the throwaway comment is that fusion is always 30 years away, but it also does appear from the outside that hype/excitement is picking up for some of the recent advances in magnetic confinement fusion.


Now everybody is saying 10 to 15 years, so it looks like we're getting somewhere.


Or, they are starting to worry about getting their funding cut off because it will obviously never be commercially competitive with renewables.


General Fusion's demo plant should be operational in 3-4 years.


General Fusion's liquid-metal reaction bubbles, unlike tokamaks and stellarators, might have a future.

It seems to me to depend on whether they can find a way to get the reaction rate usefully high.


I didn't see an answer to the big question I have: what about energy loss (and, to a lesser extent, container damage) through neutron loss? There is this quote:

> The released neutron may collide with another deuteron, accelerating it much as a pool cue accelerates a ball when striking it. This second, energetic deuteron then goes through one of two processes: screened fusion or a stripping reaction.

So a neutron "may" collide. But what if it doesn't?

To be fair, the density is a lot higher and there's an erbium lattice, so this may just be a non-issue (or at least a much-reduced issue).

Anyway, I'm glad to see alternatives to "hot" fusion being researched. I'm far from convinced the tokamak approach of massive boondoggles like ITER will ever be commercially viable.


The good thing is that a lot of labs across the world are trying to solve this problem. Here are some I am currently watching: https://e-catworld.com, https://www.youtube.com/watch?v=z01586zdnM0


Do the names "Fieschmann" and "Pons" ring any bells? Here is the wikipedia article that covers these recent developments. Look in the "Later Research" section.

https://en.wikipedia.org/wiki/Cold_fusion


The NASA research isn't mentioned there. Differences from cold fusion: they're hitting the lattice with gamma rays, and seeing 2.45MeV neutrons come out.

Doesn't mean they'll achieve net power this way, or that the lattice will survive the neutrons at practical fusion rates, but they seem to be seeing D-D fusion reactions.


Is it real or could there be unaccounted for neutron sources? That’s been an issue with past metal lattice setups.


No idea, but it'd be odd if neutrons from another source just happened to have the energy of D-D neutrons.


Fusing hydrogen is easy: ionize it and accelerate the plasma with a voltage on the order of 10 to 100 kV; hobbyists do this somewhat regularly. It doesn't sound too surprising that hitting hydrogen with gamma rays produces some fusion. But that's the crucial point: some fusion is not useful as an energy source, and not all fusion methods can be scaled up.
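
To put those voltages in perspective (my own conversion, assuming singly charged ions): a 10-100 kV drop gives each ion 10-100 keV, equivalent to an effective temperature of order 100 million to a billion kelvin, which is why hobbyist fusors get some fusion but nowhere near breakeven.

  # energy of a singly charged ion accelerated through 10-100 kV, and the
  # temperature whose characteristic thermal energy (k_B*T) matches it
  k_B = 8.617e-5                           # Boltzmann constant, eV/K
  for kv in (10, 100):
      e_eV = kv * 1000                     # q*V with q = 1 elementary charge
      print(kv, "kV ->", round(e_eV / k_B / 1e6), "million K equivalent")
  # ~116 million K at 10 kV; ~1,160 million K at 100 kV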


Sure, I'm not denying that at all. It looks like real fusion happening, but it might be no more useful than fusors. They mentioned their current methods are too lossy, but they did have some interesting arguments for it being a practical energy source someday.


If they can use it to produce He3, that would be a valuable contribution independently of any energy production goal.


If you're producing gamma rays, then you have a neutron source. This is exactly how the cold fusion people misled themselves into thinking they had succeeded.


I misread lattice as lettuce and was a bit confused for a while about the purpose of hitting lettuce with gamma rays!



To make Hulk lettuce.


Indeed, came to say the same. Toward the end of the IEEE article it has this:

> We’ve also triggered nuclear reactions by pumping deuterium gas through a thin wall of a palladium-silver alloy tubing, and by electrolytically loading palladium with deuterium. In the latter experiment, we’ve detected fast neutrons.

"electrolytically loading palladium with deuterium" is very, very similar to the Pons/Fleischman setup, which was basically electrolysis with palladium electrodes.


How does one extract power if it barely warms up? Does it generate electricity directly?



High-energy gamma sources don't sound like a fun and easy place to get started. Maybe that's why this is a NASA project for deep space, rather than something you'll be able to buy at Tesco.


It can be done the same way X-rays are generated: accelerate electrons in a magnetic field using a synchrotron.


I really hope this works.

Honestly, there are a number of low-hanging fruits in fusion.

More recent calculations show that, if you include the kinetic energy of the muons, muon-catalyzed fusion may be net positive [0]. This LCF stuff is low-hanging fruit too, ignored for many decades because scientists didn't want to hurt their reputations.

[0]: https://en.wikipedia.org/wiki/Muon-catalyzed_fusion#Alternat...


Do those revised calculations take into account the energy requirements for generating the muons?

The wikipedia article doesn't appear to give a clear answer


This strikes me as the sort of thing that ends up as an engineering enhancement to an RTG, something like a "hybrid radio-fusion heat engine" that uses a radioactive source together with this lattice confinement fusion material and ends up with something like 20% efficiency instead of the current 7.5%, but is otherwise never used outside of space probes.


NASA seems to achieve quite a bit with its relatively meagre budget


I’m looking forward to the work Solomon Epstein is doing in this area


Willen haven been doinging.



