Hacker News

No humans go, no information returns.

Sending an acorn-sized probe to another galaxy to make more acorn-sized probes: what even is the point of that? To make very slow grey goo a reality?

If actual humans find a way to go to Andromeda (other than waiting for it to arrive, heh) and want to, good for them. Otherwise, we should actively discourage anything like the proposed project.




This is a study of the feasibility of launching an intergalactic colonization wave (and its implications re: the Fermi paradox), not a proposal that humans should do it (it would be just slightly ahead of its time for that!), nor a discussion of the ethics or higher-level utility of doing so. It would be refreshing to see someone discuss the things the paper actually discusses. To use early-2000s terminology, the paper's future-shock level is higher than that of most HN readers, leading to rather banal discourse.

In any case, I'm fairly sure the authors agree that sending mindless automata to colonize the universe doesn't seem like a great idea. Nevertheless some alien intelligence (including an Earth-based AGI) might find it a completely reasonable, even imperative, goal.

But sentient machines or uploads (assuming for the sake of this thought experiment that they are possible)? That's a different thing.


But the thing is, the Fermi paradox isn't illuminated by scenarios that are technically possible but extremely unlikely. I get that all it takes is some subset of people who want to do it to make it happen, or, as you say, some alien species that decides it's a good idea. But I'd argue the idea is patently bad, and there's good reason to think no species would bother. Not with 100% certainty, obviously, and the steelman argument would say it only takes a fraction of a percent, but I'm personally unconvinced that anyone would bother.

You two debated this as a philosophical or even moral issue, but it changes everything when you look at it from the natural-hazard perspective. It doesn't even have to be a subjective matter. Picture this: you know climate change is happening, and you understand that some colony of animals will surely vanish because of it if you don't do anything. Doing something could mean just relocating a few pairs of animals to a safer area. Do you think the survival of such a descendant colony of animals means anything (to anyone)? Who can argue that it isn't our turn (and obligation) to reduce the risk of having the only known capable civilization residing on only one planet or galaxy?

For sure, but that's my point: "relocating a few pairs of animals to a safer area" is not what this paper is discussing. A better parallel would be "taking digital photos of the endangered animal and circulating them around the internet." The proposed method doesn't spread us -- it spreads teeny tiny machines we made, for no reason at all other than to say we did it. And long after we're gone, when the Sun has died, far-away galaxies will be polluted with little machines, each containing a copy of some data about us.

Right, but it’s only a feasibility study. By definition these only study the minimum system that could accomplish the goal, which was to visit as many galaxies as possible. Given the mass budget and the data-storage density contemplated, there’s no reason the probes couldn’t carry enough information to create real human colonies in the process of replicating themselves.

Fun story: https://www.fimfiction.net/story/368986/message-in-a-bottle




