Your comment was chemically and biologically decomposed by microorganisms and fungi, which extracted energy from it and returned the remaining nutrients to the surrounding soil, providing a fertile ground for the growth of plants?
No idea what you're going on about. Those in the West who stand for a rules-based international order certainly didn't ask for this war, and Trump, who did start this war, never gave a shit about rules or norms, international or otherwise.
In a lottery, the more tickets you buy, the higher your chances of having the winning number.
If we played roulette and said "the goal is to be the first to have a winning number at the roulette" and I could try 50 times before you started, obviously I would be more likely to win our game, wouldn't I?
> In a lottery, the more tickets you buy, the higher your chances of having the winning number.
Yes, and it's exactly the same in bitcoin with the hashing power. Each hash is a ticket.
> If we played roulette and said "the goal is to be the first to have a winning number at the roulette" and I can try 50 times before you start, obviously I am more likely to win our game, am I not?
In bitcoin the goal is not to be the first. The goal is to find a winning hash that's on a chain that will not be abandoned. As soon as a new block is propagated you start mining on the new head. It doesn't change anything that you previously worked on another chain. The time spent on the previous chain isn't wasted, unless a block found during that time wouldn't have earned you the reward anyway.
There is a kind of race if two blocks are found almost simultaneously. But that's not really what this discussion is about, and in that case the outcome depends mostly on network connectivity.
It is precisely what this discussion is about. From the article:
> The key idea behind this strategy, called Selfish Mining, is for a pool to keep its discovered blocks private, thereby intentionally forking the chain. The honest nodes continue to mine on the public chain, while the pool mines on its own private branch. If the pool discovers more blocks, it develops a longer lead on the public chain, and continues to keep these new blocks private. When the public branch approaches the pool's private branch in length, the selfish miners reveal blocks from their private chain to the public.
> In bitcoin the goal is not to be the first. The goal is to find a winning hash that's on a chain that will not be abandoned.
The goal is to be the first (or very close to the first), because it makes it much more likely that your chain will not be abandoned. If you wait 2 days before you reveal your block, obviously it will be abandoned...
> The key idea behind this strategy, called Selfish Mining, is for a pool to keep its discovered blocks private, thereby intentionally forking the chain. The honest nodes continue to mine on the public chain, while the pool mines on its own private branch. If the pool discovers more blocks, it develops a longer lead on the public chain, and continues to keep these new blocks private. When the public branch approaches the pool's private branch in length, the selfish miners reveal blocks from their private chain to the public.
I don't understand how this scenario is beneficial. If the selfish miner doesn't have 51% of the hashing power, they can discover more blocks than the public chain only if they are very lucky. They don't know in advance that they will be that lucky. Withholding blocks in hope of this luck means putting these blocks at a very high risk of being discarded and losing the rewards. Why would they do that, exactly? If they get lucky, they get the rewards of their chain, and discard the rewards of the other miners. If they don't, they lose a lot of rewards. On the other hand, if they just publish the blocks they find, they're almost guaranteed to get the rewards. Why take the risk? It sounds like putting your own rewards at risk just to put others' rewards at risk. It looks like the risks even out.
> The goal is to be the first (or very close to the first), because it makes it much more likely that your chain will not be abandoned.
Yes, if there are blocks that are found at almost the same time. But that's not the situation discussed here.
In other situations, being first doesn't matter. If a miner finds a block before you do, then you just start mining on top of their block. You haven't lost anything.
> Yes, if there are blocks that are found at almost the same time. But that's not the situation discussed here.
It VERY MUCH is.
Of course if you take another scenario that doesn't make sense, then it doesn't make sense :-).
> They don't know in advance that they will be that lucky.
Whenever you find a block, you know you are one of the first to find it. It's obvious because nobody else has published a block. So you know you are lucky right now. You can decide to wait 1, 2, 5, X seconds before you reveal your block and start mining the new block in the meantime.
Maybe you just mine for 5 seconds before revealing the block, and that's the winning strategy. Maybe you wait until someone else publishes their block and you immediately reveal yours, ending up with two competing chains but knowing that you had a headstart with yours.
The details of whether this is profitable, and how exactly you should do it (wait X seconds? wait until someone publishes a block?), are a matter of statistics and game theory ("What if the others are also withholding their blocks now? What is their strategy?"). The whole question is whether or not there is a practical, profitable strategy for doing that.
But yesterday's jackpot is still in play here. If you find a block on the public chain while the other miner kept their block secret, your block becomes the main chain. If they publish their block after you, both blocks compete for the head, but it's usually the first published one that wins.
There is an advantage because occasionally you find the second block while the first block is still secret; then you can release the two blocks in quick succession. That's the edge.
What advantage does it provide vs not withholding? If you don't keep your first block secret and find a 2nd block, you get the same rewards.
On the other hand, if someone finds a block while you're keeping yours secret, it's very likely you'll lose the reward of your block.
So, you get a chance to discard the block of another miner, but you have to put your own block at risk of being discarded. Maybe there's a gain here, but it's not clear.
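Whether that trade-off ever pays off can be checked with a quick simulation. Below is a minimal Monte Carlo sketch of the withholding strategy, loosely following the Selfish Mining state machine quoted from the article. All names and parameter values are my own illustration: alpha is the withholding pool's share of hash power, gamma is the fraction of honest hash power that happens to build on the pool's block during a tie.

    // Tiny deterministic PRNG so the sketch has no external dependencies.
    struct Lcg(u64);
    impl Lcg {
        fn next_f64(&mut self) -> f64 {
            self.0 = self.0.wrapping_mul(6364136223846793005).wrapping_add(1442695040888963407);
            (self.0 >> 11) as f64 / (1u64 << 53) as f64
        }
    }

    fn selfish_share(alpha: f64, gamma: f64, rounds: u64) -> f64 {
        let mut rng = Lcg(0x9E3779B97F4A7C15);
        let (mut selfish, mut honest) = (0.0_f64, 0.0_f64);
        let mut lead: u32 = 0; // private blocks not yet revealed
        for _ in 0..rounds {
            if rng.next_f64() < alpha {
                lead += 1; // the pool finds a block and keeps it private
            } else {
                match lead {
                    // Nothing withheld: the honest block simply wins.
                    0 => honest += 1.0,
                    // Tie: the pool reveals its block, two heads race.
                    1 => {
                        if rng.next_f64() < alpha {
                            selfish += 2.0; // pool extends its own head first
                        } else if rng.next_f64() < gamma {
                            selfish += 1.0; // an honest miner builds on the pool's head
                            honest += 1.0;
                        } else {
                            honest += 2.0; // the honest head wins the race
                        }
                        lead = 0;
                    }
                    // Lead of 2: reveal everything, orphaning the honest block.
                    2 => { selfish += 2.0; lead = 0; }
                    // Longer lead: reveal one block, stay comfortably ahead.
                    _ => { selfish += 1.0; lead -= 1; }
                }
            }
        }
        selfish / (selfish + honest)
    }

    fn main() {
        for &alpha in &[0.1, 0.2, 0.3, 0.4] {
            println!("alpha = {alpha}: selfish revenue share ≈ {:.3}",
                     selfish_share(alpha, 0.5, 2_000_000));
        }
    }

Under this particular model, the pool's revenue share only beats its plain proportional share above some hash-power threshold; below it, the orphaned private blocks cost more than the orphaned honest blocks gain, which is exactly the risk described above.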
How do you interpret these numbers? If your point is that we can simply overprovision photovoltaic arrays by a factor of 6.67, then that would make solar the most expensive power generation method by far.
And it only gets worse the more households transition to heat pumps, because the consumption in winter is so lopsided. For example, I heat my home with a heat pump, and I have 10 kWp of solar arrays on my roof. In the last week of July, we consumed 84 kWh and generated 230 kWh (273 %). In the last week of November, we consumed 341 kWh and generated 40 kWh (11 %). This means we'd need roughly 10 times as much PV area to match demand (10 roofs?), and huge batteries because most of that consumption is in the evening, at night, and in the morning.
Of course, utility-scale and residential solar behave a bit differently, and it becomes more complicated if wind is factored in. But it shows that you can't just overprovision PV a little to fix the main problem of solar power: that it is most abundant in summer, and most in demand in winter.
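To make that arithmetic explicit, here's a minimal sketch using only the weekly numbers quoted above; the helper function and the rounding are mine, and it deliberately ignores that most of the winter demand falls outside daylight hours.

    fn coverage(generated_kwh: f64, consumed_kwh: f64) -> f64 {
        100.0 * generated_kwh / consumed_kwh
    }

    fn main() {
        // Last week of July: 230 kWh generated vs. 84 kWh consumed.
        println!("July coverage:     {:.0} %", coverage(230.0, 84.0));
        // Last week of November: 40 kWh generated vs. 341 kWh consumed.
        println!("November coverage: {:.0} %", coverage(40.0, 341.0));
        // PV area needed to cover the November week on paper:
        println!("Overprovisioning factor: {:.1}x", 341.0 / 40.0);
    }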
My point was really only that neither is solar what I'd consider negligible in winter, nor are there really weeks with no wind. Other than that, my interpretation is pretty much the same as yours.
Above, I looked at the weekly min/max ratio. Of course the daily ratios are much higher, 1:60 for solar, and about 1:30 for wind. But wind and solar do have a useful anti-correlation: the ratio is "only" about 1:15 for combined solar+wind. Still high, but a huge improvement on both wind and solar individually.
In reality, the ratio is even higher since we routinely have to drop solar and turn off wind turbines when there is more production than demand (and I don't think that generation is reflected in the graph).
I.e., the max is already more a representation of grid and demand than of production, and it'd make more sense to use the min:mean ratio, comparing what we expect PV+wind to produce on average with what they give on the worst day. That gets us a different, more favorable ratio: 195 TWh produced in 2025 so far, let's call it 550 GWh/day, giving a ratio of about 1:6.
Thank you for actually running the numbers. I think the data is quite convincing that overprovisioning won't be the solution to the seasonal storage problem, or at least not the major factor in it.
Personally, I have high hopes for flow batteries. Increasing storage capacity is so easy with them, the liquids can easily be stored for a long time, and it would even make long-distance transport by ship feasible. If only we could find a cheap, suitable electrolyte.
This is just a slightly more sophisticated version of the "solar doesn't work at night" trope.
The implication of bringing it up is that these silly hippies haven't even thought of this basic fact, so how can we trust them with our energy system.
Meanwhile, actual energy experts have been aware of the concept of winter for at least a few years now.
If you want to critique their plans for dealing with it, you'd need to do more than point out the existence of winter as a gotcha.
I don't see you countering my argument, only attempting to ridicule it ("slightly more sophisticated", "trope", "these silly hippies", "been aware of the concept of winter", "existence of winter as a gotcha"). That sucks, man :-(
> If you want to critique their plans for dealing with it […]
There are many ideas for seasonal storage of PV-generated electricity, but so far there is no concrete plan that's both scalable to TWh levels and economically feasible. Here on HN, there's always someone who'll post the knee-jerk response of "just build more panels", without doing the simple and very obvious calculation that 5x to 10x overprovisioning would turn solar from one of the cheapest power generation methods into by far the most expensive one out there [1].
[1] Except for paying people to crank a generator by hand, although that might at least help with obesity rates.
> 5x to 10x overprovisioning would turn solar from one of the cheapest power generation methods into by far the most expensive one out there.
This is trivially false if the cost of solar generation (and battery storage) further drops by 5x to 10x.
Additionally that implies the overprovisioned power is worthless in the summer, which does not have to be the case. It might make certain processes viable due to very low cost of energy during those months. Not trivial as those industries would have to leave the equipment using the power unused during winter months, but the economics could still work for certain cases.
Some of the cases might even specifically be those that store energy for use in winter (although then we're not looking at the 'pure' overprovisioning solution anymore).
> This is trivially false if the cost of solar generation (and battery storage) further drops by 5x to 10x.
That's a huge "if". The cost of PV panels has come down by a factor of 10 in the last 13 years or so, that's true. I doubt another 10x decrease is possible, because at some point you run into material costs.
But the real issue is that the price of the panels themselves is already only about 35% of the total installation cost of utility-scale PV. This means that even if the panels were free, it would only reduce the cost by a factor of 1.5.
> But the real issue is that the price of the panels themselves is already only about 35% of the total installation cost of utility-scale PV. This means that even if the panels were free, it would only reduce the cost by a factor of 1.5.
1. Do the other costs scale with the number of panels? Because if the sites are 5 times the scale of the current ones I would imagine there are considerable scale based cost efficiencies, both within projects and across projects (through standardization and commoditization).
2. Vertically mounted bifacial PV already greatly smooths the power production curve throughout the day, improving profitability. Lower-cost panels make the downside of requiring more panels in such a setup almost non-existent. Additionally, they reduce maintenance/cleaning costs by being mounted vertically.
3. Battery/energy storage (which further improve profitability) costs are dropping and can drop further.
Also, please address the matter of using the overprovisioned power in summer. Possible projects are underground thermal storage ("Pit Thermal Energy Storage", only works in places where heating is required in winter), desalination, producing ammonia for fertilizer, and producing jet fuel.
> 1. Do the other costs scale with the number of panels?
Mostly yes. Once you're at utility scale, installation and maintenance should scale 1:1 with the number of panels. Inverters and balancing systems should also scale 1:1, although you might be able to save a bit here if you're willing to "waste" power during peak insolation.
But think about it this way: If it was possible to reduce non-panel costs by a factor of 5 simply by building 5x larger solar plants, the operating companies would already be doing this. With non-panel costs around 65%, this would result in 65% * (1 - 1/5) = 52% savings and give them a huge advantage over the competition.
I agree that intra-day fluctuations will be solved by cheaper panels and cheaper batteries, especially once sodium-ion battery costs fall significantly. But I'm specifically talking about seasonal storage here.
> Also, please address the matter of using the overprovisioned power in summer.
I'm quite pessimistic about that. Chemical plants tend to be extremely capital-intensive and quickly become non-profitable if they're effectively idle during half of the year. Underground thermal storage would require huge infrastructure investments into distribution, since most places don't already have district heating.
Sorry, very busy today so I can't go into all details, but I still wanted to give you an answer.
> That's a huge "if". The cost of PV panels has come down by a factor of 10 in the last 13 years or so, that's true. I doubt another 10x decrease is possible, because at some point you run into material costs.
A factor of 5 is certainly within the realms of physics, given the numbers I've seen floating around. Note that prices are changing rapidly and any old price may not be current: around these parts, they're already so cheap they're worth it as fencing material even if you don't bother to power anything with them.
> But the real issue is that the price of the panels themselves is already only about 35% of the total installation cost of utility-scale PV. This means that even if the panels were free, it would only reduce the cost by a factor of 1.5.
This should have changed your opinion, because it shows how the material costs are not the most important factor: we can get up to a 3x cost reduction by improving automation of construction of utility-scale PV plants.
I think I've seen some HN startups with this as their pitch, I've definitely seen some IRL with this pitch.
What counts as a "concrete plan"? Right now we're still at the stage where building more generation is the best use of our money, with batteries for shifting load by a few hours only just ramping up. So it's entirely expected that there is no infrastructure for seasonal storage yet. However, the maths for storing energy as hydrogen and heat looks quite favorable, and the necessary technology already exists.
"Concrete plan" means a technology which satisfies all of these requirements:
1) demonstrated ability in a utility-scale plant
2) already economically viable, or projected to be economically viable within 2 years by actual process engineers with experience in scaling up chemical/electrical plants to industrial size
Yes, that's hard to meet. But the thing is, we've seemingly heard of hundreds of revolutionary storage methods over the last decade, and so far nothing has come to fruition. That's because they were promised by researchers making breakthroughs in the lab, and forecasting orders of magnitude of cost reductions. They're doing great experimental work, but they lack the knowledge and experience to judge what it takes to go from lab result to utility-scale application.
> 2) already economically viable, or projected to be economically viable within 2 years by actual process engineers with experience in scaling up chemical/electrical plants to industrial size
Why 2 years?
Even though I'm expecting the current approximately-exponential growth of both PV and wind to continue until they supply at least 50% of global electrical demand between them, I expect that to happen in the early 2030s, not by the end of 2027.
(I expect global battery capacity to be between a day and a week at that point, still not "seasonal" for sure).
Hydrogen from electrolysis is only a little more expensive than hydrogen derived from methane, and electrolyzers with dozens of megawatts of capacity are available. That seems pretty solid to me at this point in the energy transition.
Hydrogen generation isn't the problem; storing it over several months is. Economical, safe, and reliable storage of hydrogen is very much an unsolved engineering challenge. If it weren't, hydrogen storage plants would be springing up left and right: even here in Germany, we have such an abundance of solar electricity during the summer months that wind generators have to be turned off and the spot price of electricity still falls to negative values(!) around noon, almost every day.
Yes, those are easier to store, but more expensive and less efficient to generate.
The question is the same as for hydrogen: If it's easy, cheap, and safe to generate, store, and convert back into electricity, why isn't it already being done on a large, commercial scale? The answer is invariably that it's either not easy to scale, too expensive (in terms of upfront costs, maintenance costs, or inefficiencies), or too unsafe, at least today.
With rapidly dropping PV prices it just gets cheaper. This is only a relatively recent development; the projects that exist to expand production are barely complete yet, and capital plant takes time to build.
Fortescue only piloted the world's first ammonia dual-fuel vessel late last year; give them time to bed that in and advance.
If that's so easy, cheap, and safe, why aren't there companies doing it on a large scale already? We're talking about billions of Euros of market volume.
Right now it's cheaper to make hydrogen from methane, and methane is easier to store and process, so there is no demand for large-scale storage of hydrogen. Nevertheless, storage in salt caverns is a proven process that is in use right now, e.g. Linde does it.
That's a funny meta comment; where are you from? Are you consuming a lot of US-based content? I ask because I mainly see Americans here writing about the "CCP" based on what they regularly hear from government officials and certain news outlets. It's rarely framed as "China"; it's usually "the Chinese Communist Party", emphasizing "Communist" because that word carries negative connotations in the US (and in the EU) given its history. But maybe the framing is similar in your country.
So just to clarify, I'm from the EU, and I'm not paid for anything I write here. Maybe your world model is influenced by propaganda? The world isn't black and white.
I also encourage people to read more about the history and culture of other countries, especially the ones they have strong opinions about, opinions which they often haven't formed themselves. (In my experience, this is often lacking in US education: people learn a lot about US history, but not as much about the rest of the world.)
Reading more philosophy can also broaden your perspective. In particular, I recommend learning about Singapore, its history, Lee Kuan Yew, and why many highly educated people there willingly accept restrictions on individual freedom. If you understand that, you can then start reading about China, its culture, and its history.
It's not. Why would lsl+csel or add+csel or cmp+csel ever be faster than a simple add? Or have higher throughput? Or require less energy? An integer addition is just about the lowest-latency operation you can do on mainstream CPUs, apart from register-renaming operations that never leave the front-end.
In the end, the simple answer is that scalar code is just not worth optimizing harder these days. It's rarer and rarer for compilers to be compiling code where spending more time optimizing purely scalar arithmetic/etc is worth the payback.
For one, signed integer overflow is never Undefined Behavior in Rust (by default it panics in debug builds and wraps around in release builds), while it is Undefined Behavior in C. This means that the LLVM IR emitted by the Rust compiler for signed integer arithmetic can't be directly translated into the analogous C code, because that would change the semantics of the program. There are ways around this and other issues, but they aren't necessarily simple, efficient, and portable all at once.
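A minimal sketch of the difference, using only standard Rust; the remarks about a hypothetical Rust-to-C translator in the comments are my own reading of the problem, not a description of any particular tool:

    fn main() {
        let x: i32 = i32::MAX;

        // Defined in every build mode: explicitly requested two's-complement wrap.
        assert_eq!(x.wrapping_add(1), i32::MIN);

        // Also defined: make the overflow observable instead of silent.
        assert_eq!(x.checked_add(1), None);

        // A plain `x + 1` panics with debug assertions and wraps in release,
        // but is never undefined behavior. In C, `INT_MAX + 1` on a signed int
        // is UB, so a translator can't just emit `x + 1`; it has to emit
        // something with wrapping semantics (e.g. arithmetic on unsigned with
        // a cast back), or lean on an implementation flag like -fwrapv.
    }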
You guys seem to be assuming transpiling to C means it must produce C that DTRT on any random C compiler invoked any which way on the other side, where UB is some huge possibility space.
There's nothing preventing it from being some specific invocation of a narrow set of compilers like gcc-only of some specific version range with a set of flags configuring the UB to match what's required. UB doesn't mean non-deterministic, it's simply undefined by the standard and generally defined by the implementation (and often something you can influence w/cli flags).
> You guys seem to be assuming transpiling to C means it must produce C that DTRT on any random C compiler invoked any which way on the other side, where UB is some huge possibility space.
Yes, that's exactly what "translating to C" means – as opposed to "translating to the very specific C-dialect spoken by gcc 10.9.3 with patches X, Y, and Z, running on an AMD Zen 4 under Debian 12.1 with glibc 2.38, invoked with flags -O0 -g1 -no-X -with-Y -foo -blah -blub...", and may the gods have mercy if you change any of this!
The Gregorian calendar is the de-facto global calendar system today, even in cultures and states that are far removed from its Christian and European roots. You might as well complain about the text on the website being in English.
But he is not complaining that we use the Gregorian calendar. He is pointing out that it is just one calendar among many, and we should be aware that it is a conscious choice the world has made to use it by convention.
> But he is not complaining that we use the Gregorian calendar.
Yes, he is:
>>> Yet this "world" history uses Europe's reference point [of BC/CE] as universal.
It wouldn't make sense to use any other than the Gregorian calendar for this map, and it also wouldn't make sense to mix different calendar systems.
> He is pointing out that it is just one calendar among many […]
But it's not. The Gregorian calendar is the calendar in worldwide use today. Giving dates in BC/CE is not an expression of Eurocentrism; it simply reflects reality.
> Yet this "world" history uses Europe's reference point [of BC/CE] as universal.
What in this sentence indicates he thinks it is wrong to use that calendar? He is saying it is NOT universal. What about that is hard to understand?
> The Gregorian calendar is the calendar in worldwide use today.
Again, you are arguing with a straw man. Please read my comment carefully again. I am not arguing against this statement of yours.
As an analogy, the WWW is the dominant (probably virtually only) form of the internet in use today, but it is only one architecture. There were/can be others, but they failed to gain or maintain traction. A summary from Google:
> Besides Gopher, other historical internet systems and protocols existed before the World Wide Web, including Wide Area Information Servers (WAIS) and the Archie search engine. While the World Wide Web eventually surpassed them all, these systems provided different ways of discovering and navigating information online in the early 1990s.
When you construct an object containing a mutex, you have exclusive access to it, so you can initialize it without locking the mutex. When you're done, you publish/share the object, thereby losing exclusive access.
    struct Entry {
        msg: Mutex<String>,
    }
    ...
    // Construct a new object on the stack:
    let mut object = Entry { msg: Mutex::new(String::new()) };
    // Exclusive access, so no locking needed here
    // (the unwrap only handles lock poisoning, it never blocks):
    let mutable_msg = object.msg.get_mut().unwrap();
    format_message(mutable_msg, ...);
    ...
    // Publish the object by moving it somewhere else, possibly on the heap:
    global_data.add_entry(object);
    // From now on, accessing the msg field would require locking the mutex
Initialization is always special. A mutex can't protect that which doesn't exist yet. The right way to initialize your object would be to construct the message first, then construct the composite type that combines the message with a mutex. This doesn't require locking a mutex, even without any borrow checker or other cleverness.
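For illustration, a minimal sketch of that ordering, reusing the names from the example above: the String is fully built before the Mutex ever exists, so there is no protected-but-unlocked access at any point.

    use std::sync::Mutex;

    struct Entry {
        msg: Mutex<String>,
    }

    fn make_entry() -> Entry {
        // Build the String while it is still plainly owned by this function...
        let mut msg = String::new();
        msg.push_str("hello");
        // ...and only wrap it in the Mutex once the data is ready to be shared.
        Entry { msg: Mutex::new(msg) }
    }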
Dude, it's a simplified example, of course you can poke holes in it. Here, let me help you fill in the gaps:
    let mut object = prepare_generic_entry(general_settings);
    let mutable_msg = object.msg.get_mut().unwrap();
    do_specific_message_modification(mutable_msg, special_settings);
The point is that there are situations where you have exclusive access to a mutex, and in those situations you can safely access the protected data without having to lock the mutex.
Sorry, I don't find that convincing but rather contrived. This still seems like "constructor" type code, so the final object is not ready, and locking should not happen before all the protected fields are constructed.
There may be other situations where you have an object in a specific state that makes it effectively owned by a thread, which might make it possible to forgo locking it. These are all very ad-hoc situations, most of them would surely be very hard to model using the borrow checker, and avoiding a lock would most likely not be worth the hassle anyway.
Not sure how this can help me reduce complexity or improve performance of my software.