When it comes to complex web apps, it makes a huge difference.
Such as what? Give some examples. Further -- as a developer -- I cannot fathom "big data coming to browsers".
This is racing tires on street cars: Theoretically useful, but of absolutely zero relevance for the overwhelming majority of users.
Never has there been an actually practical benchmark of JavaScript engines against legitimate, real-world websites. Instead it's all nonsense like SunSpider. And don't misunderstand -- this can actually be a net negative for users, because what benefits a 10M-iteration loop can actively hurt an occasionally run event handler.
i'm doing interactive image analysis in canvas. i have pixel iteration/algo loops that easily hit 10M. my canvases are typically 1550x2006.
mozilla also demoed js decoding H.264 video in realtime, the utility of which is of course questionable...
the point is, though, that typed arrays are already a reality, and they are enormous. you can argue about how good an idea that was, but now that they're here, JITs are absolutely necessary.
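To make the scale concrete, here is a minimal sketch of the kind of per-pixel loop being described. The 1550x2006 canvas size comes from the comment above; the grayscale conversion is only a placeholder operation, not the commenter's actual algorithm. One pass touches roughly 12.4M elements of the underlying Uint8ClampedArray -- exactly the typed-array workload JITs are built for.

    // Minimal sketch: one pass over the pixels of a 1550x2006 canvas.
    // image.data is a Uint8ClampedArray of length width * height * 4,
    // so a single pass reads/writes ~12.4M typed-array elements.
    // The grayscale conversion is illustrative, not the actual algorithm.
    const canvas = document.createElement('canvas');
    canvas.width = 1550;
    canvas.height = 2006;
    const ctx = canvas.getContext('2d');

    function grayscalePass(ctx, width, height) {
      const image = ctx.getImageData(0, 0, width, height);
      const data = image.data;
      for (let i = 0; i < data.length; i += 4) {
        // Luma approximation; alpha (data[i + 3]) is left untouched.
        const y = 0.299 * data[i] + 0.587 * data[i + 1] + 0.114 * data[i + 2];
        data[i] = data[i + 1] = data[i + 2] = y;
      }
      ctx.putImageData(image, 0, 0);
    }

    grayscalePass(ctx, canvas.width, canvas.height);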
Fun project, but how many users of your app are there? One, including you?
Such "high intensity" apps see incredibly limited marketplace success because it generally isn't the ideal platform for them, and the marginal gains from JavaScript engine to JavaScript engine (without a major reboot like asm.js) makes them completely non-competitive to alternative platforms.
Talk about goalpost-moving. First you ask for apps that benefit from these kinds of optimizations. Then someone gives you an example and you say it isn't important because it probably doesn't have many users?
That's a catch-22. Apps like that can't have many users because the improvements they depend on aren't widely deployed. However, those improvements will never be widely deployed if people persuasively argue against their importance using evidence like a small user base...
Goalpost-moving? You mean in my OP where I asked whether it benefits the average user? Where I said that the majority of performance issues remain in the DOM?
I moved no goalpost. I am saying exactly what I have always said. What the GP says they are doing is almost certainly an ill-conceived project, because JavaScript -- without a major change like asm.js -- simply cannot offer competitive performance; that is a facet of the language itself. Which is why we don't build a renderer in JavaScript, but instead layer it over WebGL, for instance.
yes, it's an internal tool for several people at the moment. your characterization of "real-world websites" is somewhat misguided, I think. JITs, by design, are targeted towards compute-heavy applications built for the web platform, not "typical" websites. so to say they will have little effect on typical websites is probably accurate, but misplaced. mozilla is building Firefox OS, where these JITs will play a critical role in smoothness of interaction and will almost certainly mean fewer cpu cycles spent on mobile devices, saving battery life.
Photoshop, GIMP, or photo-touchup kinds of apps are for the masses. The GP's app is the first stage of such an app. Photo effects and photo editing involve processing all the pixels; 10M per pass is the norm. Now make that interactive, updating in real time with multiple passes.
The next wave will be audio and video processing and editing. Applying effects like color correction to every frame is very expensive.
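As a rough illustration of that cost, here is a hedged sketch of a naive per-frame color adjustment on a video element. The element ids, the 720p frame size, and the brightness lift are all assumptions for illustration, not anything from this thread; at 720p and 30fps a loop like this touches on the order of 110M typed-array elements per second.

    // Hedged sketch: naive per-frame color adjustment on a <video> element.
    // Element ids, the 1280x720 size, and the brightness lift are
    // illustrative assumptions.
    const video = document.getElementById('source-video');
    const out = document.getElementById('output-canvas'); // same size as video
    const octx = out.getContext('2d');

    function drawFrame() {
      octx.drawImage(video, 0, 0, out.width, out.height);
      const frame = octx.getImageData(0, 0, out.width, out.height);
      const data = frame.data; // Uint8ClampedArray, ~3.7M elements at 720p
      for (let i = 0; i < data.length; i += 4) {
        data[i] += 10;     // R (Uint8ClampedArray clamps at 255)
        data[i + 1] += 10; // G
        data[i + 2] += 10; // B
      }
      octx.putImageData(frame, 0, 0);
      if (!video.paused && !video.ended) requestAnimationFrame(drawFrame);
    }

    video.addEventListener('play', () => requestAnimationFrame(drawFrame));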
I actually ran into an example not long ago. There was a long-running backend job whose status log I wanted to display, beyond just the progress bar. The log records are polled in via Ajax. When the log got pretty big, a couple hundred KB, Chrome crawled to its knees, freezing hard. I thought it was the DOM updates, but it was the JavaScript code. Firefox had no problem. I ended up having to restructure the feature to not show the whole log.
I commented out the DOM-update portion of the code and left just the JavaScript running. It still hung Chrome, so it's JavaScript-related, at least in Chrome's JavaScript engine.
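For what it's worth, here is a hypothetical reconstruction of that kind of pattern -- the actual code isn't shown in the comment, and the endpoint, polling interval, and filtering below are assumptions. Re-processing a few-hundred-KB log on every poll adds up to quadratic total work, and that pure string/array work can stall an engine even with every DOM update removed.

    // Hypothetical reconstruction -- the comment doesn't show the real code.
    // Polling a growing job log and re-processing the full text each time
    // is O(n^2) in total work; the string/array work below can stall an
    // engine even with the DOM updates commented out.
    async function pollLog() {
      const response = await fetch('/api/job/log'); // assumed endpoint
      const logText = await response.text();        // a few hundred KB eventually
      const lines = logText.split('\n');
      const errors = lines.filter(line => line.includes('ERROR'));
      // DOM update intentionally omitted, mirroring the test described above.
      console.log(lines.length + ' lines, ' + errors.length + ' errors');
    }

    setInterval(pollLog, 2000); // assumed polling interval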