
Output from radiating heat scales with area it can dissipate from. Lots of small satellites have a much higher ratio than fewer larger satellites. Cooling 10k separate objects is orders of magnitude easier than 10 objects at 1000x the power use, even if the total power output is the same.
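The scaling claim follows from the cube-square law. A minimal sketch (toy numbers, ignoring internal heat transport): splitting one body into n pieces of the same total volume multiplies the available surface area by n^(1/3).

```python
# Cube-square law sketch: splitting a body of fixed total volume into
# n equal cubes multiplies the total surface area by n**(1/3).
def total_surface_m2(volume_m3: float, n_pieces: int) -> float:
    side = (volume_m3 / n_pieces) ** (1 / 3)
    return n_pieces * 6 * side ** 2

one_big = total_surface_m2(1000.0, 1)       # one 10 m cube: 600 m^2
many = total_surface_m2(1000.0, 10_000)     # 10k small cubes
print(f"area gain: {many / one_big:.1f}x")  # ~21.5x more radiating area
```

So 10k small satellites get roughly 20x the radiating area of one monolith of the same total volume, before counting deployable panels.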

Distributing useful work over so many small objects is a very hard problem, and not even shown to be possible at useful scales for many of the things AI datacenters are doing today. And that's with direct cables - using wireless communication means even less bandwidth between nodes, more noise as the number of nodes grows, and significantly higher power use and complexity for the communication in the first place.

Building data centres in the middle of the Sahara desert still beats space on pretty much every metric, be it price, performance, maintenance, efficiency, ease of cooling, pollution/"trash" disposal, etc. Even communication network connectivity would be easier: for the amount of money this constellation mesh would cost, you could lay new fibre optic cables to build an entire new global network to anywhere on earth, with new trunk connections to every major hub.

There are advantages to being in space - normally around increased visibility for wireless signals, allowing great distances to be covered at (relatively) low bandwidth. But that comes at an extreme cost. Paying that cost for a use case that simply doesn't get much advantage from those benefits is nonsense.



Whatever sat datacenter they build, it would run better/easier/faster/cheaper sitting on the ground in Antarctica than it would in space, or floating on the ocean, without the launch costs. Space is useful for those activities that can only be done from space. For general computing? Not until all the empty parts of the globe are full.

This is a pump-and-dump bid for investor money. They will line up to give it to him.


Yup - my example of the Sahara wasn't really a specific suggestion, so much as an example of "the most inconvenient, inhospitable part of the earth's surface is still much better than space for these use cases". This isn't Star Trek; the world doesn't match sci-fi.

It's like his "Mars Colony" junk - and people lap it up, keeping him in the news (in a not explicitly negative light - unlike some recent stories....)


> Whatever sat datacenter they build, it will run better/easier/faster/cheaper sitting on the ground in Antarctica than it will in space

That is clearly not true. How do you power the data center on Antarctica? May I remind you it will be in the shadow of the earth for half a year.


Space is so expensive that you can power it pretty much any way you want and it will be cheaper. Nuclear reactor, LNG, batteries (truck them in and out if you have to). Hell, space based solar and beam it down. Why would there ever be an advantage to putting the compute in space?


Get those penguins doing something productive for once, put them on treadmills!


Or burn them in a furnace. Pretty much any way you can think of to accomplish something on earth, is vastly cheaper, easier, and faster than doing it in space.


A tanker full of LNG and a turbine would probably work.


Kinda like the ones they are already burning in Starship to put these in space in the first place.

Anywhere on earth is better than space for this application.


> How do you power the data center on Antarctica?

Nuclear power plant?


By tapping into the geothermals of the volcanoes under the ice. Otherwise nukkular.


Then you put another in the high north. Two, or six, is still cheaper than one in orbit.


Why would they bother to build a space data center as massive monolithic structures at all? Use direct cables between semi-independent units the size of a Starlink v2 satellite. That satellite size is large enough to encompass a typical 42U server rack even without much physical reconfiguration. It doesn't need to be "warehouse-sized building, but in space", and neither does it have to be countless objects kilometers apart from each other beaming data wirelessly. A few dozen wired as a cluster is much more than sufficient to avoid incurring any more bandwidth penalties on server-to-server communication with correlated workloads than we already have on earth for most needs.

Of course this doesn't solve the myriad other problems, but it does put dissipation squarely in the category of "we've solved similar problems". I agree there's still no good reason to actually do this unless there's a use for all that compute out there in orbit - but that too is happening, with immense growth in demand expected for pharmaceutical research and various manufacturing capabilities that require low/no gravity.


Not just a 42U rack, but a 42U rack that needs one hundred thousand watts of power, and it also needs to be able to remove one hundred thousand watts of heat out of the rack, and then it needs to dump that one hundred thousand watts of heat into space.
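An order-of-magnitude check with the Stefan-Boltzmann law (assumed emissivity of 0.9 and a range of radiator temperatures; solar and Earth heat loads are ignored, which only makes the real number worse):

```python
# Ideal-radiator area needed to reject 100 kW to deep space: P = e*sigma*A*T^4.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(power_w: float, temp_k: float, emissivity: float = 0.9) -> float:
    return power_w / (emissivity * SIGMA * temp_k ** 4)

for t in (300, 350, 400):
    print(f"{t} K: {radiator_area_m2(100_000, t):.0f} m^2")
```

At 300 K that's on the order of 240 m^2 of radiator per rack - far larger than the rack itself.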


And it needs to communicate the data to and from a ground-based location. It’s all of the problems with satellite internet, but in your production environment!


Based on the filing for launching a million satellites, apparently their solution is to simply launch one GPU per satellite.


Hrrm. Lemme glass-ball it...

Imagine a liquid which can be electrically charged, and has a low boiling point.

(Ask 3M/DuPont/BASF/Bayer... - context 'immersion cooling')

Attach heat-pipes with that stuff to the chips as is common now, or go the direct route via substrate-embedded microfluidics, as is thought of at the moment.

Radiate the shit out of it by spraying it into the vacuum, dispersing into the finest mist with highest possible surface, funnel the frozen mist back in after some distance, by electrostatic and/or electromagnetic means. Repeat. Flow as you go.


This is sort of where I think he is going with it. Run the compute part super cold (-60C) in a dielectric fluid, maybe even at low pressure. It boils off, gets collected, and is then condensed into something way hotter - like boiling-water hot. This is sent through a high-temperature radiator for heat dispersion (because Stefan-Boltzmann has a damned 4), then pumped back into the common storage area. Cycle indefinitely.

Beyond the simple space-whatever nonsense, there is a nugget of a good idea in there. Cold things are going to have less internal resistance, so they will produce less waste heat. If you can keep them at a constant temperature via submerged cooling, they are also going to suffer less thermal stress from heat fluctuations. So the vacuum of space becomes the perfect insulator. You can't have humans getting into them anyway, because then you have to reheat and re-cool, causing stress on the system. Just have to accept your slow component losses.

Microsoft and IBM have been working the same basic concept for a while (a decade plus); Elon is just throwing 'Space!!' into the equation because of who he is. I think it's 50% hype and 50% this is where the industry is going regardless. I always assumed they would just find an abandoned mine or something. But the always-cold, thermally-stable, no-humans-allowed data center is coming. We are hitting the point where the upfront cost of doing it is overshadowed by the tail cost savings.
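The T^4 dependence is why condensing the heat into a hot radiator pays off. A rough ratio under assumed temperatures (a ~250 K cold loop versus rejecting through a ~373 K, boiling-water-hot radiator):

```python
# T^4 lever: at fixed power, required radiator area scales as 1/T^4,
# so the area ratio between two radiator temperatures is (T_hot/T_cold)**4.
def area_ratio(t_cold_k: float, t_hot_k: float) -> float:
    return (t_hot_k / t_cold_k) ** 4

print(f"{area_ratio(250, 373):.1f}x less radiator area at 373 K than at 250 K")
```

Roughly a 5x smaller radiator for the same rejected power, at the cost of the pumping/compression power needed to lift the heat to that temperature.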


> Radiate the shit out of it by spraying it into the vacuum, dispersing into the finest mist with highest possible surface, funnel the frozen mist back in after some distance, by electrostatic and/or electromagnetic means. Repeat. Flow as you go.

Even if that worked, you don’t gain much. It’s not the local surface area that matters — it’s the global surface. A device confined within a 20m radius sphere can radiate no more heat than a plain black sphere of the same radius.

There are only two ways to cheat this. First, you can run hotter. But a heat pump needs power, and you need to get that power from somewhere, and you need to dissipate that power too. But you can at least run your chips as hot as they will tolerate. Second is things like lasers or radio transmitters, but those are producing non-thermal output, which is actually worse at cooling.

At the end of the day, you have only two variables to play with: the effective radiating surface area and the temperature at which you emit blackbody radiation.
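The enclosing-sphere bound is easy to put numbers on (idealized blackbody, absorbed sunlight ignored):

```python
import math

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def max_radiated_w(radius_m: float, temp_k: float) -> float:
    """Upper bound: a black sphere of this radius at this surface temperature."""
    return SIGMA * 4 * math.pi * radius_m ** 2 * temp_k ** 4

# ~4.3 MW for a 20 m radius sphere at 350 K, no matter how the mist swirls inside it
print(f"{max_radiated_w(20, 350) / 1e6:.1f} MW")
```

So a mist cloud confined to that 20 m sphere can never beat a plain black ball of the same size at the same temperature.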


*hits crack pipe used by elon but only after washing it thoroughly* What if we used the waste heat to power a perpetual motion device that generated electricity?


Congratulations! You have formed a fabric made out of satellites. You better hope it doesn't crumple.


> using wireless communication means even less bandwidth between nodes, more noise as the number of nodes grows, and significantly higher power use

Space changes this. Laser-based optical links offer bandwidths of 100-1000 Gbps with much lower power consumption than radio-based links. They are more feasible in orbit due to the lack of atmospheric interference and fogging.

> Building data centres in the middle of the sahara desert is still much better in pretty much every metric

This is not true for the power generation aspect (which is the main motivation for orbital TPUs). Desert solar is a hard problem due to the need for a water supply to keep the panels clear of dust. Also the cooling problem is greatly exacerbated.


You don’t need to do anything to keep panels with a significant angle clear of dust in deserts. The Sahara is near the equator but you can stow panels at night and let the wind do its thing.

The lack of launch costs more than offsets the need for extra panels and batteries.


What’s your source for that claim? Soiling is a massive problem for desert solar, causing as high as 50% efficiency loss in the Middle East.[1]

[1] https://www.nlr.gov/news/detail/features/2021/scientists-stu...


A relevant quote from that article:

“The reason I concentrate my research on these urban environments is because the composition of soiling is completely different,” said Toth, a Ph.D. candidate in environmental engineering at the University of Colorado who has worked at NREL since 2017. “We have more fine particles that are these stickier particles that could contribute to much different surface chemistry on the module and different soiling. In the desert, you don’t have as much of the surface chemistry come into play.”


You’re not summarizing the article fairly. She is saying the soiling mechanisms are environmentally dependent, not that there is no soiling in the desert. Again, it cites an efficiency hit of 50% in the ME. The article later notes that they’ve experimented with autonomous robots for daily panel cleaning, but it’s not a generally solved problem and it’s not true that “the wind takes care of it.”

And you still haven’t provided a source for your claim.


I’m saying the same thing she is: that soiling isn’t as severe in the desert, not that it doesn’t exist.

The article itself said the maximum was 50%, and it was significantly less of a problem in the desert. Even 50% still beats space by miles: that only increases per-kWh cost by ~2c, and the need for batteries is still far more expensive.

So sure I could bring up other sources but I don’t want to get into a debate about the relative validity of sources etc because it just isn’t needed when the comparison point is solar on satellites.


You are again misquoting the article. She did not say soiling was "significantly less of a problem" in the desert. She in fact said it "requires you to clean them off every day or every other day or so" to prevent cement formation.

You claimed it was already a solved problem thanks to wind, which is false. You are unable to provide any source at all, not even a controversial one.

And that's just generation. Desert solar, energy storage and data center cooling at scale all remain massive engineering challenges that have not yet been generally solved. This is crucial to understand properly when comparing it to the engineering challenges of orbital computing.


Now you make me want to come up with a controversial source. The Martian rovers continued to operate at useful power levels for decades without cleaning.

But but lack of water…


Anyway, here’s some actual science on why going vertical makes a big difference.

https://link.springer.com/article/10.1007/s11356-022-19171-5


Thank you for providing a source. That’s an early stage research paper, not the proven solution you originally implied. There are tons of early stage research papers on all these problems on earth and in space. Often we encounter a bunch of complications in applying them at scale such as dew-related cementation[1], which is a key reason why they haven’t been deployed at sufficient scale.

That you point to the Mars rover, a mission with an extremely tight power budget, as proof that soiling doesn’t pose an impediment to mega-scale desert solar farms only underscores the flaw in your reasoning.

[1] https://www.sciencedirect.com/science/article/abs/pii/S22131...


“I don’t want to get into a debate about the relative validity of sources etc”

> Not the proven solution

Yet you quote a paper saying it can work. “This impact can have a positive or negative effect depending on the climatic conditions and the surface properties.”

I have no interest in debating with you because I don’t believe you are capable of an honest debate here. The physics doesn’t change, and the physics is what matters.

> doesn’t pose an impediment

Nope. I said it beats “space”, not that soiling doesn’t exist. That’s what you have to demonstrate here, and you have provided zero evidence whatsoever supporting that viewpoint. Hell, they could replace the entire array every 5 years and it would still beat space. Even if what you said was completely true, you still lose the argument.


The argument here is simply over your false claim that "You don’t need to do anything to keep panels with a significant angle clear of dust in deserts." Your only source does not, in fact, establish that, and cementation is in fact a challenge with desert solar -- something that happens much faster than every five years.

Repeating unsupported claims and declaring yourself the winner does not, it turns out, actually help you win an argument.


Shouldn't swarms of quadcopter drones zipping around the panels be able to handle that?

Wouldn't even need to be that 'autonomous', since the installation is fixed.

More like the things simulating fireworks with their LEDs in preprogrammed formation flight over a designated area.


You don't need quadcopters. Solar panels arranged in rows have rails that cleaning robots can drive on.


Indeed, that seems unnecessarily complex for what is actually needed. I don't understand why the great-grandparent comment seems to suggest it's an "unsolved" problem - as if grid-scale solar buildouts don't already have examples of things like motorized brushes on rails for exactly this.

And it's always a numbers game - sure they're not /perfect/, but a few % efficiency loss is fine when it's competing against strapping every kilo of weight to tons of liquid methane and oxygen and firing it into space. How much "extra" headroom to buffer those losses would that equivalent cost pay for?

And solar panels in space degrade over time too - by 1-5% per year depending on coatings/protections.
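Compounded over a satellite's life, those degradation rates add up quickly (simple geometric decay):

```python
# Remaining panel output after n years at a constant fractional loss per year.
def remaining_fraction(rate_per_year: float, years: int) -> float:
    return (1 - rate_per_year) ** years

for r in (0.01, 0.05):
    print(f"{r:.0%}/yr -> {remaining_fraction(r, 10):.0%} of rated output after 10 years")
```

At the 5%/yr end of the range, a panel is down to roughly 60% of rated output after a decade.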


The same panel produces much more electricity in space than at the bottom of the atmosphere, because the atmosphere absorbs and scatters a significant fraction of the light, and clouds and night cut ground yield further. Additionally, the panel needs less glass or no glass in space, which makes it lighter and cheaper.

Launch costs have shrunk significantly thanks to SpaceX, and they are projected to shrink further with the Super Heavy Booster and Starship.
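For a sense of the yield gap being claimed here, a back-of-envelope sketch (all inputs assumed: the ~1361 W/m^2 solar constant, a near-continuously-lit orbit, and a good desert site averaging ~250 W/m^2 over the year):

```python
# Rough annual-yield ratio per m^2 of panel, orbit vs. desert ground.
space_w_m2 = 1361 * 0.99   # solar constant x assumed orbital sunlit fraction
ground_w_m2 = 250          # assumed year-round average at a good desert site
print(f"~{space_w_m2 / ground_w_m2:.1f}x more energy per m^2 in orbit")
```

A ~5x per-panel gain, which then has to be weighed against launch cost, degradation, and everything else in this thread.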


Space doesn't really change it, though, because the effective bandwidth between nodes is reduced by the overall size of the network and by how much data the nodes need to relay between each other.


Yup. We don't use fibre optics on earth rather than lasers because of some specific limitation of the earth's surface that being in orbit would avoid.

We use them because they're many orders of magnitude cheaper and simpler for anywhere near the same bandwidth for the distances required.


> We don't use fibre optics on earth rather than lasers because of some specific limitation of the earth's surface that being in orbit would avoid.

That's incorrect. Lasers can suffer from atmospheric interference and fogging on earth.

Here is a post from NASA explaining why they like laser communications better than RF in space.[1]

[1] https://solc.gsfc.nasa.gov/modules/kidszone7/mainMenu_textOn...



