Dark Ages would imply that progress was set back significantly by some catastrophic event (ie: the fall of Netscape).
Some of us would argue that JavaScript itself was that catastrophic event. No, really -- I'm not trolling. It's no longer reasonable for people to deny that JavaScript doesn't lend itself well to optimizing run-time performance.
These Herculean efforts are impressive, but perhaps it's time to fix the language, even if that breaks backwards compatibility.
They're working on it; it's called ES6. But in terms of your claim that "It's no longer reasonable for people to deny that JavaScript doesn't lend itself well to optimizing run-time performance", I'm not sure what to say to that. JavaScript code is some of the fastest interpreted code around, and projects like asm.js take that even farther. It's a phenomenally beautiful and expressive language once you get around the fact that it has some minor warts.
I might have agreed with you 10 years ago, but I don't think the claim that JS simply doesn't lend itself to being performant is true at all -- all the evidence I see, both as a web JavaScript and Node.js developer and from following the recent relevant news, points me to quite the opposite: JavaScript is doing great right now.
> JavaScript code is some of the fastest interpreted code around, and projects like asm.js take that even farther.
That says something about market forces, but nothing about the language. JavaScript is the fastest dynamic language because it was the language that was most profitable to optimize.
We don't have any real-world comparisons to other languages where an equal amount of brainpower was spent on optimization so that we could see how the language itself affects things.
> We don't have any real-world comparisons to other languages where an equal amount of brainpower was spent on optimization so that we could see how the language itself affects things.
Yes we do: Lua--optimized by lots of big brains for use in gaming. And it turns out that LuaJIT is much faster than any current Javascript JIT--with the reason frequently given that it is a much simpler language.
By "lots of big brains," I think you mean Mike Pall :). But yeah, LuaJIT is really fast.
Julia (http://julialang.org/) is another dynamic language designed from the start for high performance, although the objectives are slightly different from those of Lua. While LuaJIT is fast at everything, Julia is really optimized so that running the same code many times is fast. Running a function the first time can be much slower than in other dynamic languages, I think because there is no baseline JIT or interpreter.
If that were really true, Mozilla would be writing Servo in JS, not Rust. The reality is, Rust is designed for optimum performance and JS is not, so extracting performance from JS is significantly more complex, and also more fragile in the sense that it's easier to write code that screws up the JIT.
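To illustrate what "screwing up the JIT" looks like in practice (this is a made-up toy, and the fast/slow-path behavior is an engine implementation detail, not anything the spec guarantees): engines like V8 specialize hot functions for the object shapes they've seen, and feeding the same function objects with different property layouts can silently knock it off the fast path while remaining perfectly correct JS.

```javascript
// Hypothetical example: a hot function that an engine can specialize
// as long as every argument has the same "shape" (property layout).
function addX(p, q) {
  return p.x + q.x;
}

// Monomorphic calls: both objects declare only `x`, in the same
// order, so a JIT can compile a direct field load.
const a = { x: 1 };
const b = { x: 2 };
console.log(addX(a, b)); // 3

// Polymorphic calls: same property, different layouts. The result
// is still correct, but the engine may fall back to slower generic
// property lookups for every call from here on.
const c = { x: 3, y: 0 };
const d = { y: 0, x: 4 };
console.log(addX(c, d)); // 7
```

Nothing about the source tells you which version you wrote -- that's the fragility.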
I really wish the effort Mozilla is putting into Rust could be combined with the effort Google is putting into Dart to design a new, from-the-ground-up language for the Web that combines a lot of the benefits of JS, but with more predictable performance.
It's awesome that the JITs are getting better, but they don't really solve all the warts of JS.
A "bit slower" is being generous. And asm.js doesn't really represent how a general purpose web application will perform, like a Gmail, it represents mostly how a WebGL targeted cross-compiled game can perform.
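For context on why asm.js performs the way it does: it's a statically analyzable subset of JS, not a new runtime. A minimal sketch of the style (this toy module is my own illustration, not taken from the spec):

```javascript
// A tiny module in the asm.js style: the "use asm" pragma plus the
// |0 coercions promise a supporting engine that every value here is
// an int32, so it can compile ahead of time instead of speculating.
// In a non-supporting engine it just runs as ordinary JavaScript.
function TinyAsm(stdlib, foreign, heap) {
  "use asm";
  function add(a, b) {
    a = a | 0;          // coerce parameters to int32
    b = b | 0;
    return (a + b) | 0; // int32 result, no boxing
  }
  return { add: add };
}

var mod = TinyAsm(this, {}, new ArrayBuffer(0x10000));
console.log(mod.add(20, 22)); // 42
```

Which is exactly why it suits cross-compiled numeric code like games far better than a DOM-heavy app like Gmail: the subset has no objects, strings, or GC at all.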
The same overselling that happened with Java is happening again, don't worry, a magical JIT is right around the corner that will come within spitting distance of native C. This time it'll be different. When TraceMonkey was announced, there was a lot of excitement about how Tracing JITs were going to rock C level performance.
Are we really thinking that Javascript's VM semantics are going to last for decades? That 30 years from now we'll still be stuck with the same JS, just better VMs?
IMHO, Sooner or later we're going to have to admit that a language designed in 10 days is stretched to its limit and as amazing as the JITs and hacks like asm.js are, eventually we'll need to get together and design the next generation.
> The same overselling that happened with Java is happening again, don't worry, a magical JIT is right around the corner that will come within spitting distance of native C
For many purposes the JVM achieved the goals you mention. Humans have an infinite capacity for hand-made optimizations, which is why it's pretty hard for a virtual machine or a compiler to beat hand-written assembly by a developer that knows his shit, but it takes a herculean effort to build big, long running, highly concurrent apps in C/C++ and it takes companies with the resources of Google or Mozilla to do it.
For instance Mozilla is still struggling with memory leaks in Firefox. Why is that? Because memory gets fragmented due to improper allocation patterns, not to mention hard-to-prevent memory leaks due to cyclic references. And to get around that without a generational garbage collector, you have to use object pools and manage allocations at a really fine level of detail. Or you have to make your app use multiple processes and simply not care much about it, like Google did in Chrome with their one-process-per-tab model, which is why Chrome chokes when you keep many long-running tabs open.
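A minimal sketch of the object-pool pattern mentioned above (written in JS for familiarity -- in C++ the idea is the same, just with raw buffers; the `reset` hook is my own naming): recycle a fixed set of objects so the allocator never sees the churn and the heap doesn't fragment.

```javascript
// Minimal object pool: instead of allocating a fresh object per use
// (fragmenting the heap and pressuring the GC), recycle objects
// through a free list.
class Pool {
  constructor(factory, reset) {
    this.factory = factory; // makes a new object when the pool is empty
    this.reset = reset;     // scrubs an object's state before reuse
    this.free = [];
  }
  acquire() {
    return this.free.length > 0 ? this.free.pop() : this.factory();
  }
  release(obj) {
    this.reset(obj);
    this.free.push(obj);
  }
}

const pool = new Pool(
  () => ({ x: 0, y: 0 }),
  (p) => { p.x = 0; p.y = 0; }
);

const p1 = pool.acquire();
p1.x = 42;
pool.release(p1);
const p2 = pool.acquire();    // same object, recycled
console.log(p2 === p1, p2.x); // true 0
```

The "fine level of detail" is exactly the catch: forget one `release`, or release an object someone still holds, and you've reinvented manual memory bugs inside a GC'd language.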
With a precise generational garbage collector for instance, problems of fragmentation and memory leaks due to cyclic references simply go away. People complain about the latency of JVM's CMS garbage collector, but after seeing it in action in a web app that is able to serve 10000 requests per second per server in under 10ms per request, I'm actually quite impressed. It gets problematic when you've got a huge heap though, because from time to time CMS still has to do stop-the-world sweeps and with big heap sizes the process can block for entire seconds. However CMS is actually old generation and there's also the new G1 from JDK7 that should be fully non-blocking when it matures and if you need a solution that works right now you can shell out the cash for Azul's pauseless garbage collector.
It's really hard to build a precise generational garbage collector on top of a language that allows manual memory allocation. Mono's new garbage collector for instance is not precise for stack-allocated values. Go's garbage collector is a simple parallel mark-and-sweep that's conservative, non-precise and non-generational. The most common complaint you'll hear about Go from people who have actually used it is about its garbage collector, a problem that will simply not go away because Go is too low-level.
And I'm really happy about Mozilla improving Firefox. Firefox is my browser, but how many years did it take for them to solve the memory issues that Firefox had?
Yes, you probably couldn't build a reasonably efficient browser on top of the JVM right now, especially since browsers also have to run on less powerful devices, but most developers can't build browsers anyway. And in a couple of years from now, mark my words, security will be considered much more important than performance, and suddenly the usage of languages in which buffer overflows are a fact of life will be unacceptable.
Also in regards to big iron, I chose Cassandra (a Java app) instead of MongoDB (a C++ app that's the darling of the NoSQL crowd). I did that because Cassandra scales better horizontally and because performance degrades less on massive inserts. Apparently low-level optimizations can't beat architectures more tuned to the problems you're having, go figure.
Rust is not intended to be a web scripting language at all AFAIK. The fact that JavaScript is a bit slower than a systems language is hardly an indictment of JavaScript.
I'm not too keen on the fact that for ALL programming tasks in ANY system you can pick amongst dozens of languages, but the web is still limited to just one. That is not modern at all.
People need and demand choice; that web programming is still a monopoly to this day is bad, even though people do achieve tremendous things with JavaScript every day.
>It's no longer reasonable for people to deny that JavaScript doesn't lend itself well to optimizing run-time performance.
For a language that "doesn't lend itself well to optimizing run-time performance", it did far better than: Python, Ruby, Perl and most other dynamic languages...
I think that is not a really fair comparison, Python with PyPy is pretty fast. Actually I would like to see a comparison for Python on PyPy vs JS in IonMonkey/V8.
There was just a higher incentive to optimize JavaScript, as it was pretty slow to begin with and was exposed to a lot more users (in the sense that it runs on clients rather than servers) than Python, Ruby and Perl.
I think that PyPy is pretty impressive.. I also thought that IronPython's performance was impressive.
If you look at the link to the followup post, it does show that in certain use cases NodeJS does a lot better... though without any code to review/reproduce it's hard to say.
I happen to like JS.. Python's probably next on my list of languages to learn, but right now, I'm so deep in getting more proficient with NodeJS + Grunt + RequireJS, it isn't funny... our next-gen stack is much more NodeJS and MongoDB, as a few tests and backend processes have shown them to work very well together...
We have a newer site on ASP.Net MVC 4 (started as 3, with EF), and an aging site (built on layers of .Net cruft since 2006) that's nearly unmaintainable... So I'm trying to structure things moving forward so that they'll be maintainable for the future as much as possible. Which means some new, and some bleeding edge stuff.
It also means some things I just don't care as much for... I actually like how OneJS/Browserify take CommonJS/NodeJS patterns, more than AMD (RequireJS), but AMD seems better for the client side... I also don't care for Jade so much, but it was a group decision, and we're going that direction to share templates for email/client/server usage.
Still working out sharing Backbone models, etc... it's all work. Sorry for blathering on.
If I were doing desktop development, I'd be far more inclined towards Python today. As it stands, imho JS is a better fit for web development.
I've pretty much ignored javascript until this week.
What exactly is wrong with it? Perhaps that is too broad. Can you give an example of something in javascript that impedes runtime performance optimization?
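Not the parent, but one frequently cited example (hedged: the dictionary-mode fallback described below is a V8 implementation detail, not something the spec promises): JS objects are open dictionaries and JS has a single number type, so a compiler can't assume fixed layouts or fixed types the way it can for a C struct or an `int`.

```javascript
// Objects: properties can be added or deleted at any time, so
// compiled code can never rely on a fixed layout -- it has to
// speculate and be ready to bail out.
const point = { x: 1, y: 2 };
point.z = 3;    // layout just changed under any compiled code
delete point.x; // and again; engines like V8 may demote such
                // objects to a slower dictionary representation

// Numbers: there is only one number type, so the engine must
// prove a value stays an int32 to avoid boxing it as a double.
function sum(arr) {
  let total = 0;
  for (let i = 0; i < arr.length; i++) total += arr[i];
  return total;
}
console.log(sum([1, 2, 3]));  // 6 (same code, integer path)
console.log(sum([1.5, 2.5])); // 4 (same code, float path)
```

None of this makes fast JS impossible -- V8 and SpiderMonkey prove that -- but it's why the optimizations have to be speculative rather than guaranteed.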