Fantastic article. This is the best piece I've seen written on this subject so far, with lots of context and a good background on what's happened over the last five years.
One thing I don't see mentioned, and which has been a huge red flag for me, is Intel's apparent preoccupation with raytracing. As a graphics programmer, I find it absurd that Intel would hope to disrupt an industry that has followed a very steady course since SGI and Pixar were founded in the 1980s. I suspect the Larrabee approach as a whole is a case of arrogance: a belief that Intel can leap ahead of everyone else without slogging through the hard, painful, incremental work that nVidia and ATI/AMD have been doing all along.
That said, the author correctly points out that Intel has invested in plenty of technological dead ends in the past, and has always eventually recognized its mistakes and changed course.
Engineering a massively multi-core general-purpose processor is not trivial. I would try it with MIPS or ARM rather than x86, but I suppose that's not an option for Intel.
AMD, however, has no need to preserve the x86 ISA's status as an industry standard. Quite the contrary: anything that weakens Intel's grip only makes AMD stronger. And AMD has people with massively multi-core expertise in-house.
That chip appears to be worse than Larrabee in every respect. In particular, it is much harder to program, which would only make Intel's already-late drivers later.
Before SIGGRAPH in August 2008, Intel was talking up raytracing, but since then the PR focus has shifted almost completely to rasterization. IMO raytracing was a diversion (much like Cell's "mobile agents with true AI"); it was never serious.