In particular, I know that Go's GC is optimized for very low latency (rather than throughput) and is not a stop-the-world GC. So I'm wondering why it doesn't work for games. Are the pauses still too high for games, or is he missing something? Benchmarks or concrete results in Go would be great here.
Edit: specifics from https://blog.golang.org/ismmkeynote - Go hugely improved GC latency: from 300ms before version 1.5, down to 30ms, then again down to well under 1ms, usually 100-200µs for the stop-the-world pause. So I guess it is stop-the-world, but the pauses seem ridiculously small to me. They guarantee less than 500 microseconds "or report a bug", which seems more than fast enough for game frame rates (16ms per frame at 60Hz). Am I missing something?
The thing with games compared to regular services is that they start with the bar of rigor a bit higher than normal in all respects.
GCs really let you trivially not care about a LOT of things... Until you need to care. Things like where and when you allocate, the memory complexity of a function call, etc. With games you need to care about all that stuff so much sooner.
Once you're taking the time to count/track allocations anyway, you might as well just manage memory manually. It just codifies something you're thinking about anyway.
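In practice, "doing it manually" in a GC language often means preallocating a fixed-size pool up front and reusing slots, so the per-frame hot path allocates nothing. A sketch of that pattern, assuming a hypothetical `particle` game object (names are illustrative, not from any real engine):

```go
package main

import "fmt"

// particle is a hypothetical per-frame game object.
type particle struct {
	x, y  float64
	alive bool
}

// particlePool is a fixed-capacity free list: slots are reused across
// frames, so the steady state allocates nothing for the GC to keep up with.
type particlePool struct {
	items []particle
}

func newParticlePool(n int) *particlePool {
	return &particlePool{items: make([]particle, n)} // one up-front allocation
}

// spawn claims the first dead slot, or returns nil when the pool is full.
func (p *particlePool) spawn(x, y float64) *particle {
	for i := range p.items {
		if !p.items[i].alive {
			p.items[i] = particle{x: x, y: y, alive: true}
			return &p.items[i]
		}
	}
	return nil // pool exhausted; the caller decides whether to drop or grow
}

func main() {
	pool := newParticlePool(1024)
	pr := pool.spawn(10, 20)
	fmt.Println(pr != nil, pr.x, pr.y)
}
```

The point of the thread stands either way: once you're writing code like this, the GC is no longer buying you much on the hot path.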
A short stop-the-world pause is all well and good, but some of the performance impact is pushed onto the mutator via write barriers. That can be another can of worms.
Maybe this is an ignorant question, but why do people always cite GC latencies in seconds? Is there an implied reference machine where the latency is 300ms? I would expect it to vary a lot in the wild based on CPU frequency and cache/memory latency. Is there some reason this doesn't matter as much as I think it does?
People understand seconds, and any other measurement would require specifying a lot of computer-specific detail. And if you're going to do that, you might as well fully specify the workload too, to answer those questions before they come up.
It isn't meant to be a precise answer, it's meant to put the GC performance broadly in context.
People don’t write games in GC languages, so GC language developers don’t optimize for games, so people don’t write games in GC languages. It’s just a vicious cycle. It never breaks because for big projects there is too much financial risk.
Also, because everybody who tries optimizing a GC language for games fails, badly. And, GC language developers know that if they promote their language for games, any game developers (temporarily) taken in will hate them forever, and bad-mouth their language.
Languages not used for anything anyone cares much about don't attract much bad press. There are reasons why games are written the way they are. It is not masochism.