I neither said nor implied that AGI being firmly in the realm of science fiction is a bad thing. There's nothing wrong with thinking about the implications, and they would be significant. It's just such a stretch from where we currently are that it's not worth investing significant resources in mitigation. The threat from AI that is present and real is janky systems being depended upon for life-and-death decisions. Authors like Vernor Vinge (and I love his stories) depend on bending the rules of both physics and computation: physics, in that computation is somehow fundamentally different in different regions of the universe; computation, in that known computability limits no longer apply.
It's easy to re-imagine science fiction as being closer to reality after the fact.
AI safety is an issue whether AGI is 10 or 100 years away.