1. The Alioth results are not necessarily final - they compare a single JS engine, and we have several fast ones now (SpiderMonkey with type inference, for example, can be significantly faster on some benchmarks). Even so, the median speed difference there is 2X, which is fairly close. Admittedly there are some bad cases though, in particular pidigits (badly written benchmark code? a bug in V8?).
2. It is true that JS on Mono has had far less work done, and that the DLR exists. However, the fact remains that dynamic languages are a late addition to the JVM/.NET model. For example, one very important thing for dynamic language performance is PICs, and to my knowledge there is no good example of fast PIC performance on the JVM or CLR. In fact, we don't even have a good example of a generic virtual machine that can run multiple dynamic languages fast (Parrot exists, but is not that fast) - all the fast dynamic language implementations are language-specific, so it shouldn't surprise us that VMs built for static languages don't do that well either.
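To make the PIC point concrete, here is a toy sketch of what a polymorphic inline cache does at a property-access site. All names here are illustrative, not taken from any real VM: a real engine caches an object's hidden class and a raw memory offset at each call site, which we mimic with a small map from "shape" to a specialized accessor, consulted before falling back to a generic lookup.

```javascript
// Toy model of a polymorphic inline cache (PIC). Illustrative only;
// real engines cache hidden classes and offsets in machine code.

function makeShape(obj) {
  // Stand-in for a hidden class: the sorted list of property names.
  return Object.keys(obj).sort().join(",");
}

function makeCachedGetter(prop, maxEntries = 4) {
  const cache = new Map(); // shape -> specialized accessor
  return function get(obj) {
    const shape = makeShape(obj);
    let accessor = cache.get(shape);
    if (accessor === undefined) {
      // Slow path: build and cache an accessor for this shape.
      if (cache.size >= maxEntries) cache.clear(); // megamorphic: give up
      accessor = (o) => o[prop];
      cache.set(shape, accessor);
    }
    return accessor(obj); // fast path for repeat calls with the same shape
  };
}

const getX = makeCachedGetter("x");
getX({ x: 1, y: 2 }); // miss: caches an accessor for shape "x,y"
getX({ x: 3, y: 4 }); // hit: same shape, reuses the cached accessor
```

The key property is that a call site which keeps seeing the same object shape pays only a cheap shape check, which is exactly the kind of mechanism static-language VMs historically had no first-class support for.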
In my opinion it makes no difference that we have several fast engines where some are faster at some things than others. When executing in the browser you don't get to pick and choose how and where your application will be executed. If you run into performance problems on one of the engines you can: A) dismiss a subset of your users and their performance problems, telling them to use a browser with a faster engine (they won't), B) only allow certain functionality based on a user agent string, or C) limit your application's scope to one that runs suitably in the slowest of the engines you're willing to support. In essence, if the application runs great in browser A but chokes in browser B, are you willing to say bye-bye to your A users to take advantage of performance gains on B? I've been in this situation, and in my experience I've always had to turn away from the faster browser rather than from the users.
Outside the browser you probably have a little more freedom, but it's not like you get to pick and choose in the style of "Oh, I'll execute this function in V8 since it does this faster, and that function in SpiderMonkey since it's faster there". For this reason, I don't think the fact that Alioth only has measures for one engine would make a significant difference in the overall comparison. You'd be, for the most part, gaining performance in one place by sacrificing it in another.
Anyway, in my personal experience, I've run into performance problems in JS a lot more often than with C#. I also have to go through a lot more tedious practices to ensure my JS code runs as fast as it can, whereas in C# Some.lookup.with.lots.of.dots.does.not.scare.me(). That's why your claim sort of surprised me. Then again, the last serious JS performance problem I had was 6 months ago (before FF4), so maybe a lot has happened in those 6 months.
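For what it's worth, the "tedious practices" I mean are things like hoisting deep property chains out of hot loops. The object and names below are made up for illustration:

```javascript
// Hand-optimization of a dotted lookup chain. The 'app' object is a
// hypothetical example, not from any real codebase.
const app = { config: { physics: { world: { gravity: 9.8 } } } };

function sumNaive(n) {
  let total = 0;
  for (let i = 0; i < n; i++) {
    // Each iteration walks the whole chain; older JS engines did not
    // reliably optimize this away.
    total += app.config.physics.world.gravity;
  }
  return total;
}

function sumHoisted(n) {
  const g = app.config.physics.world.gravity; // look it up once
  let total = 0;
  for (let i = 0; i < n; i++) total += g;
  return total;
}
```

In C# the JIT and static types make the naive version a non-issue; in the JS of that era you had to do the hoisting yourself.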
By the way, I'm not well informed about how type inference is done in SpiderMonkey, so I may be completely wrong in mentioning this, but it sounds like they're trying to speed up a dynamic language by mimicking static typing. If that's how far they're going to improve performance, maybe soon enough JavaScript will in fact fit better on Mono/.NET/the JVM?
I agree with your point about multiple JS engines; indeed you can't pick and choose the best results. What I was trying to say is just that the best results we see are an indication of where things are going. But again, I agree, we are not there yet: right now each user has just one JS engine, with problems on some benchmarks. Static languages have much more consistent performance.
About the last 6 months: Yes, a lot happened during that time, namely FF4's JaegerMonkey and Chrome's Crankshaft. Both are significant improvements.
About typing, yes, in a way that could let this code run faster inside the JVM or Mono. If you can figure out the types, you can generate fast statically typed code for those VMs. However, type analysis needs to be both static and dynamic, it should integrate with the PICs, and so forth. So even with that, I don't expect dynamic languages to be able to run very fast on static language VMs.
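A toy illustration of that point, with invented names: once an analyzer has proved the operand types, the engine can swap in a specialized version and skip the generic dispatch. The "analysis" here is just pretend; real engines combine static inference with runtime feedback, which is what makes bolting this onto a static-language VM hard.

```javascript
// Generic JS '+': must handle numbers, strings, objects with valueOf, etc.
function addGeneric(a, b) {
  return a + b;
}

// Pretend a type analyzer proved both arguments are 32-bit integers.
// (genericFn is unused in this toy; a real compiler would recompile
// its body with integer arithmetic and no dynamic dispatch.)
function specializeForInts(genericFn) {
  return function addInt32(a, b) {
    return ((a | 0) + (b | 0)) | 0; // '| 0' hints int32 to the engine
  };
}

const addInt = specializeForInts(addGeneric);
addInt(2, 3);       // integer-only fast path
addGeneric("2", 3); // the generic path still handles everything else
```

The catch the comment above points at: the specialized version is only valid while the type assumption holds, so it has to cooperate with the PICs and deoptimize when an unexpected type shows up.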