Instead of inverters, there should be a push for DC networks.
Virtually all SMPS would be able to work with DC right now, and they wouldn't even need active power factor correction. Washing machines with brushless DC motors and most ovens would be fine too. When I think about it, almost everything I have at home, save for the geothermal pump and three-phase appliances, would be OK powered by DC(0). DC has its own issues: galvanic corrosion possibly being one, and arcs too (no zero crossing 50/60 times a second to self-extinguish them), but it has no issues with inductance, hence lower transport costs.
(0) Cheap LEDs with a capacitive dropper or triac-based regulation are not compatible.
I see DC used in electric cars, general automotive and RVs.
I will say 12 V is not very useful, except maybe for LED lighting. It has high losses when travelling any distance; powering anything of significance requires higher voltage (or enormous cables).
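To put rough numbers on the cable losses (a back-of-the-envelope sketch; the 1 kW load, 10 m run, and 2.5 mm^2 copper cross-section are just illustrative assumptions):

    # Resistive loss for the same 1 kW load at 12 V vs 230 V,
    # over a 10 m run of 2.5 mm^2 copper (20 m of conductor out and back).
    RHO_CU = 1.68e-8                     # resistivity of copper, ohm*m
    r_wire = RHO_CU * 20.0 / 2.5e-6      # ~0.13 ohm total wire resistance

    for volts in (12.0, 230.0):
        amps = 1000.0 / volts            # current drawn by a 1 kW load
        loss = amps ** 2 * r_wire        # I^2 * R dissipated in the cable
        print(f"{volts:5.0f} V: {amps:5.1f} A, {loss:6.1f} W lost in the wire")

At 230 V that's ~4.3 A and ~2.5 W of loss. At 12 V the nominal "loss" comes out near 900 W, i.e. the voltage drop exceeds the supply, so the circuit can't deliver 1 kW at all without a far thicker cable.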
I do agree that it would be really good to use DC, but I kind of wonder what the reality of efficient and useful DC would be like.
12 V DC is not even useful for LEDs, as it requires current-limiting resistors - hence power wasted as heat. The proper (efficient) way to drive LEDs is via a constant current source.
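A quick illustration with typical (purely assumed) white-LED numbers:

    # Driving a single ~3 V, 20 mA LED from 12 V through a dropper resistor.
    v_supply, v_led, i = 12.0, 3.0, 0.020
    p_led = v_led * i                        # 0.06 W delivered to the LED
    p_resistor = (v_supply - v_led) * i      # 0.18 W burned in the resistor
    print(f"{p_resistor / (p_led + p_resistor):.0%} of input power wasted")  # 75%

Strips mitigate this by putting several LEDs in series, but some resistive waste is unavoidable; a switching constant current driver avoids it almost entirely.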
I meant DC as in 230-330 V DC for home/office use - generally replacing the AC (high voltage) lines with DC ones.
Are they? The US consumes 11.5 TWh of electricity daily. It's estimated that it needs 3 weeks of storage to get to 100% carbon free [1]. By comparison, global battery production in 2019 was 300 GWh [2]. Even if you factor in the fact that the curve of battery production is rising, it'd take decades to fulfill just the US's storage demands with batteries. There are other storage options like hydroelectricity and pumping air into mine shafts, but those are geographically limited.
Apparently each of these structures stores 20 MWh of energy. By comparison, the US consumes 11.5 TWh of electricity daily. To reach the 12 hours of storage estimated to be necessary to reach 80% renewable generation we'd need 287,500 of these towers. To reach the 3 weeks of storage necessary for 100% renewable generation we'd need 12,075,000 of these towers.
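The arithmetic behind those counts, taking the 20 MWh per tower and 11.5 TWh/day figures at face value:

    # Number of 20 MWh gravity towers needed for each storage target.
    DAILY_MWH = 11.5e6                  # 11.5 TWh/day, in MWh
    TOWER_MWH = 20.0

    for label, days in (("12 hours", 0.5), ("3 weeks", 21.0)):
        towers = DAILY_MWH * days / TOWER_MWH
        print(f"{label}: {towers:,.0f} towers")
    # 12 hours: 287,500 towers
    # 3 weeks: 12,075,000 towers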
The former is easily achievable, if the towers are profitable to install. We only have to build them one at a time, and if a collection of 10-50 is cheaper than a peaker gas plant, then the utility decommissions the latter and builds the former.
The latter is simply not necessary. There are a number of ways to solve the last few percent, including leaving a few peaker plants running; presumably, we'll do all of them.
Those peakers can be carbon-neutral, because it's possible to run CO2 extractors. One implication of our renewables is that there are times when they generate power considerably in excess of requirements: using that energy to sequester carbon is plausible.
Of course, if we got over our civilizational horror of nuclear energy, we could solve this problem in a thorough and boring way. I'm not holding my breath.
Thorium reactors or pebble bed reactors might be options. The only reason we have the current designs is the US Navy's need to build a reactor that would fit on submarines and ships. That technology became the basis for our deployed reactors. Since then nothing else has been tried outside of small experimental plants, from what I understand.
This is getting tiring... No, nobody is insane enough to only use batteries. They are primarily used to load-shift by a few hours within a 24 hour window. This is mostly useful for PV because it has a very narrow window during which it generates energy. Longer term storage would simply involve power-to-gas infrastructure. Alternatively you can use natural gas as your storage mechanism until power-to-gas has been implemented.
A lot of complaints against renewables are rooted in the myth of "baseload": the idea that for renewables to be effective they have to be available 24/7. This is completely wrong. The atmosphere doesn't care if you pollute at 3 am or at 3 pm. A kilogram of CO2 is a kilogram of CO2. All you have to do is look at how much renewable energy has been produced per year and divide it by total energy produced per year to calculate the renewable share. As long as that number goes up, you're reducing CO2 emissions. Of course at some point you hit a roadblock where that share ceases to go up because you need to store energy, but no country on earth has reached that point.
Existing energy storage options other than batteries are either geographically limited or not feasible. I explain the issues with natural gas storage in another comment: https://news.ycombinator.com/item?id=24253317 You can't just produce natural gas out of hydrogen and air; you need a source of carbon (and the fraction of a percent of CO2 in the atmosphere cannot feasibly be used).
Yes, solar and wind can mitigate fossil fuel use in the short term. But there is no feasible way to use them for the primary source of energy without massive amounts of storage. It provides an option to produce some carbon-free energy, but does not provide an option to go from "some" to "all". If we actually want to stop climate change, we need to go for "all". The link you posted does not refute this. The solution proposed in the article is to just use fossil fuel backup plants until a storage solution is found.
Germany and California are both already approaching the saturation point for daytime generation. And once you hit that point there's no way to keep going without massive amounts of storage to transfer the excess generation during peak hours to non-producing hours. Intermittent sources cheaply reduce carbon emissions in the short term, but offer no path to fully renewable energy without a miraculous advancement in energy storage. The paper you linked to doesn't refute this. The plan laid out amounts to, "keep burning fossil fuels and hope we eventually figure out energy storage". When the fate of the world is at stake do you want to bet on an unknown and unproven solution, or a solution that has successfully delivered the lion's share of a nation's power since the 1980s?
They are [1]. It's a self-reinforcing cycle. For example, batteries are already cheaper than most gas peakers. That demand (along with EV demand, primarily Tesla) ramps manufacturing capacity, realizing further cost declines along the way.
Remember, the sun and wind are free once the turbines and panels are installed (both of which last at least two decades). That's what fossil and nuclear are competing against, and it's a losing battle. Sounds like those companies that sell batteries to utilities and consumers are in a very favorable position, considering the storage requirements you note.
Yes, the capacity is increasing. But there's still a massive disconnect between the amount of battery storage required to make renewables usable and the capacity that exists. The link I posted already factors in increasing demand, but the amount of battery production in these projections is still orders of magnitude lower than what is required.
3 weeks of energy storage in the US works out to about 240 TWh given its daily 11.5 TWh consumption. Even if we assume that the projections are accurate and that global battery production will reach 1 TWh per year in 2023 and 2 TWh per year in 2030, we're still talking about over a century's worth of global battery production just to supply the United States' storage requirements. Even the 12 hours of storage required to reach 80% renewables is going to be difficult to achieve.
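For concreteness, the same arithmetic, taking those projected production rates at face value:

    # Years of global battery output needed for 3 weeks of US storage.
    storage_twh = 11.5 * 21             # ~241.5 TWh, rounded to 240 above

    for rate in (1.0, 2.0):             # projected TWh/year for 2023 and 2030
        print(f"at {rate:.0f} TWh/yr: {storage_twh / rate:.0f} years")
    # at 1 TWh/yr: ~242 years
    # at 2 TWh/yr: ~121 years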
The reality is that storage even remotely close to what is necessary to run a country predominantly on intermittent sources has never been built. This is totally uncharted territory as far as energy infrastructure goes. And none of our current solutions are capable of delivering the required scale, save for geographically limited options like hydroelectricity. By comparison, multiple countries generate the majority of their electricity from nuclear power, and one has generated over 80% of its electricity from nuclear. Between building a wholly novel solution and simply replicating an existing solution, the latter is almost always the easier task.
This thread is being rate limited, reply in edit:
> 1. Instantaneous excess can electrolyse water, producing hydrogen which can be converted into methane and stored/used in existing natural gas systems.
This is much more difficult to do than it sounds. In order to convert hydrogen into methane you need a source of carbon, and the fraction of a percent of CO2 in the atmosphere cannot feasibly be used. The plans to use synthetic methane as energy storage are contingent on somehow capturing the carbon exhaust from gas turbines. California tried to do this by pumping the exhaust into a mine shaft, but a leak sprang and the gas escaped. Not to mention, places without mine shafts nearby would have to construct structures to store the exhaust.
> 2. Overcapacity in good times implies proportionally higher production in bad times, lowering the storage requirements.
No, it does not. Overcapacity of solar doesn't make the sun shine at night. Overcapacity of wind somewhat increases production on less windy days, but a totally windless day still generates zero energy regardless of overcapacity.
If you look at Figure 2, they model different combinations of overproduction, storage, and regional aggregation (interconnection). Overproduction helps significantly. At 50% (1.5x) overproduction, the United States can supply 75% of demand with no storage and little new interconnection. At 1.5x overproduction with 12 hours of storage, it can supply 95% of demand with even less interconnection.
If the AP1000 units under construction at Plant Vogtle do not go over budget any further, they will have cost $11.20 per watt to construct [1] -- $25 billion spread over 2234 MW of generating capacity. If they have a capacity factor of 93%, that's $12.04 per watt of annualized output.
Utility scale solar in the United States achieves an average capacity factor of 25% as of 2018 [2] and had a median construction cost of $1.60 per watt as of 2018 [3], with costs still falling. That's $6.40 per watt of annualized output, or $9.60 with 1.5x overproduction.
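Re-deriving those per-watt figures (same numbers as above, nothing new assumed):

    # Construction cost per watt of annualized output = ($/W) / capacity factor.
    def per_avg_watt(usd_per_watt, capacity_factor):
        return usd_per_watt / capacity_factor

    print(f"Vogtle AP1000: ${per_avg_watt(11.20, 0.93):.2f}/W")       # $12.04
    print(f"utility solar: ${per_avg_watt(1.60, 0.25):.2f}/W")        # $6.40
    print(f"solar at 1.5x: ${per_avg_watt(1.60, 0.25) * 1.5:.2f}/W")  # $9.60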
Wind costs are similar. Wind has higher up front construction costs but also higher capacity factors in favorable regions.
Once constructed, solar and wind also have lower operation and maintenance costs per MWh generated than nuclear plants do.
I was very enthusiastic about nuclear power from about 2000-2015. I still think that it's safe enough and far cleaner than fossil power. We shouldn't shut any operating reactors down prematurely. But large scale renewable projects have cut costs drastically since the turn of the millennium while new nuclear projects keep going late and over budget. The EPR and AP1000 reactors were supposed to be the standardized, affordable, predictable reactors that retired the nuclear industry's history of cost overruns and schedule blowouts. Instead they have become spectacular contemporary examples of those same old problems.
By comparison, a plant of the same design as the Vogtle plant was built in China at a cost of $7.3B, yielding a cost of $3.17 per watt of output [1] - under 1/3rd of the solar cost with the required overproduction. The Vogtle plant is the first of its kind in the United States. First of a kind plants typically cost about twice as much as subsequent units of the same design.
Furthermore, you're not including the cost of storage in your estimates for solar and wind. Even 12 hours of storage represents a staggering amount - 5.75 TWh given the US's average daily 11.5 TWh consumption. Battery costs are still on the order of hundreds of dollars per kWh. At $200/kWh, 5.75 TWh works out to over a trillion dollars. And this is a recurring cost, as storage needs to be replaced. Hydroelectric storage is cheaper and doesn't need to be replaced, but is geographically limited. Even then, it's not that much cheaper at $100-160/kWh [2].
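Rough totals for 12 hours of US storage at those prices (straight multiplication, nothing else assumed):

    # 12 hours of US consumption, priced per kWh of storage capacity.
    storage_kwh = 11.5e9 / 2            # 12 h of 11.5 TWh/day = 5.75 TWh, in kWh

    for label, usd in (("batteries @ $200/kWh", 200),
                       ("pumped hydro @ $100/kWh", 100),
                       ("pumped hydro @ $160/kWh", 160)):
        print(f"{label}: ${storage_kwh * usd / 1e9:,.0f} billion")
    # batteries @ $200/kWh:    $1,150 billion
    # pumped hydro @ $100/kWh:   $575 billion
    # pumped hydro @ $160/kWh:   $920 billion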
Yes, renewables work if energy storage becomes effectively free. But energy storage is anything but free.
The paper you cited shows that 1.5x overproduction can get the US to 75% of electricity from solar and wind without building any storage.
Every big construction project is cheaper in China than in the US. China can build a solar farm for $0.83/watt [1], half the cost I cited up-thread for an American solar farm. But I don't think that $0.83/watt is a credible construction cost for a solar farm in the US in 2020. Nor is $3.17/watt a credible cost for constructing a nuclear plant in the US in 2020.
I am weary of hearing about how the next reactor design is going to be affordable and take only 5 years to build. That's what I heard from Westinghouse and Areva/EDF when the AP1000 and EPR were still theoretical reactors. I believed it for a while. I don't believe it any more in 2020. I don't believe it about molten salt reactors, traveling wave reactors, or whatever theoretical reactor is currently cherished by Ted Talkers. I'm willing to start believing it again after a reactor design has entered commercial operation in the United States or European Union and lived up to its claims.
I don't want to write off nuclear power entirely. It's much safer than fossil power, both in terms of acute human health risks and climate risks. I still couldn't advise any American utility that new nuclear plants are a fiscally prudent part of decarbonization planning for the next decade.
> The paper you cited shows that 1.5x overproduction can get the US to 75% of electricity from solar and wind without building any storage.
Right, without storage you have to scale back the percentage of electricity you get from intermittent sources. Like I said, without storage, renewables do not present a valid option for a carbon free source of energy.
If you want a US example of a nuclear plant construction, you can take the Diablo Canyon plant [1]. This is not only built in the US recently, but also in an earthquake prone area and thus needed more robust construction. It cost 13.8 billion in 2018 dollars. Plants constructed earlier are even more cost effective, even when adjusted for inflation. The Donald Cook plant [2] produces about as much as the AP1000 plant, for only 3.35 billion 2007 dollars. That's about 4.25 billion in 2020 dollars. And this isn't an anomaly. Plenty of plants built during the 70s were similarly cost effective [3] [4]. You're picking an individual plant to inflate the cost of nuclear dramatically.
You're absolutely right that the next reactor design isn't going to be affordable at first - first of a kind production is always the most expensive. Nuclear power gets cheaper with repeated construction of the same design: you don't need to rebuild the manufacturing pipeline for components, staff become experienced in construction, and so on.
This is why France's nuclear program was one of the most successful. They standardized on 3 designs, and set up serial production of those same 3 designs. The EPR is expensive precisely because it's one of the first large plants that France has built in decades, without that advantage of serial production.
This is also why nuclear plants built during the 1970s in the US are much more cost effective than the ones built today. The US was building a lot of nuclear plants, and so it benefited from this economy of scale.
75% of all electricity consumption is too low to be a "valid option" for decarbonizing electricity? We have different ideas of what's valid.
> If you want a US example of a nuclear plant construction, you can take the Diablo Canyon plant [1]. This is not only built in the US recently, but also in an earthquake prone area and thus needed more robust construction. It cost 13.8 billion in 2018 dollars. ... You're picking an individual plant to inflate the cost of nuclear dramatically.
Diablo Canyon unit 1 construction began in 1968 and unit 2 in 1970 according to the page you linked. That's hardly recent. I was actually trying to be charitable to American nuclear by talking about the new Vogtle units without mentioning the billions spent at V.C. Summer on two incomplete AP1000 reactors that will not enter service at all.
I agree that 1970s era nuclear plants cost less. Part of that was that there was more experience with building them, so the average was lower. Another reason is that when we look at nuclear plants running today and tabulate cost by year of construction, we're cherry picking the projects that went well.
I count 41 American reactors that were cancelled while under construction on this page:
> 75% of all electricity consumption is too low to be a "valid option" for decarbonizing electricity? We have different ideas of what's valid.
The US already generates ~40% of its electricity from carbon free sources. An increase to 75% represents only about a 60% reduction in fossil fuels from the current status quo. This is absolutely not a solution. We're still advancing climate change. Halving the time it takes to reach a disaster is not even remotely close to the same thing as averting a disaster.
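The arithmetic, assuming fossil fuels supply the remaining ~60% today:

    # Going from 60% fossil to 25% fossil is only a ~58% cut.
    fossil_now, fossil_then = 0.60, 0.25
    print(f"{(fossil_now - fossil_then) / fossil_now:.0%} reduction")  # 58%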
> 37 of them started construction in the 1970s. That's a high project failure rate compared to any other generation source
Yes, because of Three Mile Island. Most of the cancellations were for reactors that started in the 1970s, because the US mostly stopped constructing reactors in the 1970s. You don't have cancellations for projects that began in the 1960s because those projects finished before Three Mile Island. You don't have project cancellations in the 1980s and 1990s because the US largely stopped starting new nuclear projects.
This pattern of cancellations demonstrates the fact that the high failure rate is due to political pressure, not economic unsuitability of nuclear reactors. Serialized production yields costs per kilowatt hour well below your estimates for renewables even without the cost of storage.
And no, if we look at reactors that are no longer in operation we don't see a much higher cost [1] [2] [3] [4] [5].
Serialized nuclear construction is by far the cheapest and most effective way to eliminate carbon emissions from electricity generation. Intermittent sources are cheap until they reach saturation during their time of production. Then costs skyrocket when storage becomes necessary. And even more importantly, nuclear is the only proven way of accomplishing this.
The 75% figure is from solar and wind alone. The balance can come from any combination of nuclear, other renewables, and fossil fuels. It doesn't mean 25% electricity supplied by fossil fuels.
Plus 19.7% from existing American reactors, though that number is going to drift downward as retirements continue to outpace construction.
I think that the residual fossil demand would be significantly less than 25%. I can't say how much exactly. That needs another study. In the absence of storage, neither nuclear nor geothermal plants are good at supplying peak demands. Hydro can serve a peaking role to some extent even without building new pumped storage but it's constrained by reservoir volumes, minimum downstream flow rates, seasonal snow melt, etc.
> In the absence of storage, neither nuclear nor geothermal plants are good at supplying peak demands.
This is false. Nuclear and geothermal power can both satisfy peak demand just fine. Nuclear plants do take time to alter the thermal output of the reactor, but they can easily reduce the electric output through overcooling: same MWt, but lower MWe. Geothermal plants just pump less water into the borehole. Better yet, nuclear plants can direct the excess energy to things like desalination. This is much easier than trying to satisfy peak demand with solar or wind, where peak demand often occurs when the sun isn't visible or when wind output is substantially higher or lower than demand.
Again, France at its peak generated over 85% of its electricity from nuclear power. I'm not sure where you're getting this myth that nuclear plants can't satisfy variable demand.
Nuclear plants can throttle down. But the economics of nuclear power make it prohibitive to satisfy peak demand from nuclear reactors. Even France relies on gas plants and power imports to satisfy its electricity demand peaks, despite being a net electricity exporter over the course of a full year.
The CAISO grid had an average power demand of 25 GW in 2019 but a peak demand of 46 GW (Table 1.1):
Keeping enough nuclear reactors operating to supply the most demanding hours of the most demanding season would have extraordinarily high marginal costs for the last few terawatt hours.
This is true not because of any shortcomings of nuclear, but because demand is not uniform. The same need to have excess capacity during non-peak hours in order to have sufficient capacity during peak hours exists with fossil fuels. If peak demand is 130 GW and trough demand is 100 GW, you need 130 GW of capacity.
Nuclear has never been inexpensive enough to be the first choice for meeting peak demand in the absence of storage. Some pumped hydro plants were built in the 20th century to be charged by nuclear generation so nuclear could effectively supply peaks too. Peaking batteries could also be charged by nuclear power.
Gas turbines have a construction cost under $1/watt in the US while Vogtle's AP1000s were estimated at $6/watt even before all the cost overruns started. If you're going to leave an asset idle most of the time, much better to idle a sub-$1/watt asset than a $6/watt asset.
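A sketch of why that matters (the 30-year life and 10% peaker capacity factor are illustrative assumptions; fuel and O&M are ignored):

    # Capital cost per MWh for a plant that runs only 10% of the time.
    lifetime_h = 30 * 8760              # operating lifetime in hours
    cf = 0.10                           # peaker capacity factor

    for label, usd_per_watt in (("gas turbine @ $1/W", 1.0),
                                ("AP1000 @ $6/W", 6.0)):
        usd_per_mwh = usd_per_watt * 1e6 / (lifetime_h * cf)
        print(f"{label}: ~${usd_per_mwh:.0f}/MWh in capital cost alone")
    # gas turbine @ $1/W: ~$38/MWh
    # AP1000 @ $6/W: ~$228/MWh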
This is just factually wrong. France has generated the majority of its electricity from nuclear since the 80s, and Belgium now generates the majority of its power from nuclear. Nuclear has repeatedly been used to satisfy peak demand.
Nobody doubts that fossil fuels are cheaper. Yes, that's why fossil fuels are still in widespread use outside of France and several countries with lots of hydroelectric power. But if we want to halt climate change we need to eliminate - not just reduce - the usage of fossil fuels. Any plan to use renewables as a significant source of energy involves either the continued use of fossil fuels or staggering amounts of energy storage.
Per the beginning of this long thread, wind and solar can supply 75% of annual US electrical demand without storage. France supplies about that much of its annual demand from nuclear power and Belgium supplies a bit over half of annual demand with nuclear power. Both countries rely on fossil generated electricity for meeting demand peaks, both via domestic generating plants and cross-border imports from foreign fossil plants. There was never a year when France met its peak electrical demand without fossil power.
The example of France proves that electrical generation could have been largely decarbonized in the 20th century, if other leaders had made a concerted push to reduce fossil fuel dependency like France's leaders did. It's tragic that other major economies did not do the same. But even France did not eliminate all fossil generation. 10% of French electricity generation is fossil as of 2017 [1].
Getting the USA's electrical generation down to only 10% fossil would be a vast improvement. I don't think that a contemporary optimized plan for getting there involves much new nuclear power even though a 20th century plan would have. A cost optimized plan from 1985 certainly would have called for a lot more nuclear power. The costs of building American solar and wind farms have plummeted since 1985. The costs of building American nuclear plants have not.
France's peak energy demand is more than 40% higher than its trough demand [1]. Most years France generates ~10% of its electricity from fossil fuels, so nuclear has indeed supplied the substantial majority of its peak demand. France's share of fossil fuels actually used to be lower than 10%: the more recent uptick in fossil fuels was accompanied by a drop in the share of nuclear power generation [2]. Renewable energy production increased, but its intermittency led to an increase in fossil fuel consumption - a real world example of how the shortcomings of renewables compared to nuclear power result in more carbon emissions.
As per your own comments, solar and wind even with substantial overproduction leave a quarter of electricity demand unfulfilled. The places that are fortunate to have hydroelectric or geothermal power available could go carbon free, but the rest are left supplying a quarter of energy with fossil fuels. The overproduction puts their price well above the cost of nuclear when you don't cherry pick one of the most infamous cost overruns as a representative example. And let's just be generous and ignore the ecological devastation caused by covering 2-4% of the Earth's land surface in solar panels.
So we have a more expensive option that leaves a quarter of electricity demand to be fulfilled by other sources (mostly fossil fuels), requires massive amounts of land to be cleared and covered in solar panels, and is subject to the challenges of intermittent power generation. And we have nuclear, which is cheaper, much less land intensive, and generates consistent energy. The superior choice is unambiguous. And real world examples demonstrate this: look at the disparity between France and Germany. The former is the posterchild of nuclear, the latter the posterchild of renewables. France's electricity is usually more than four times less carbon-intensive than Germany's. We've already put nuclear and renewables to the test, and nuclear proves to be far superior.
I'll be interested again when France builds new reactors cheaper than new renewables. Flamanville 3 and Olkiluoto 3, even if they don't have further cost overruns, are going to produce electricity at a higher cost per MWh than German solar farms completed in the same year.
If France does build 6 more EPRs, a proposal floated last year, they will have a chance to prove that the problems to date were merely FOAK learning experiences.
France doesn't need to build new reactors, largely because its existing reactor fleet is still enough to fulfill demand. That's one of the biggest advantages of nuclear power: it lasts for the better part of a century, not a decade or two. A serial run of reactor production makes more sense once the disparity between supply and demand is greater.
I'll be interested in renewables once Germany's carbon intensity of electricity is on the same order of magnitude as France's. They need to drop from ~500 grams of CO2 per kWh to under a hundred.
I'd bet there will be some natural gas generation in the US in non-remote areas in 2030. Maybe not new construction, but gas is cheap, responsive, and already in place. I don't think it'll be completely replaced by batteries and added renewable capacity that quickly, though I'd love to be proven wrong.
Although my personal hope is for storage, inverters, controllers and other technologies to get cheaper. Especially longer term storage beyond 4 hours.