ArrayBuffers aren't meant to be executable code either.

JS strings are not UTF-16; they are 16-bit chunks of (potentially) nonsense, and enforcing valid UTF-16 would break quite a few existing uses. For example, anything that stores encrypted data in a string. That "shouldn't" be done (it should be a Uint8Array), but existing APIs basically force you to do it. And there's such a thing as backwards compatibility.
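
A quick sketch of both points, assuming a modern engine (isWellFormed is the ES2024 addition; the variable names are mine):

    // JS strings are sequences of 16-bit code units; a lone surrogate
    // is a perfectly legal string value, just not well-formed UTF-16.
    const lone = "\uD800";            // high surrogate with no pair
    console.log(lone.length);         // 1
    console.log(lone.isWellFormed()); // false (ES2024 method)

    // Why binary-in-strings happens: legacy APIs like btoa()/atob()
    // only take/return strings, so bytes get smuggled into code units.
    const bytes = new Uint8Array([0x00, 0x7F, 0xFF]);
    const binStr = String.fromCharCode(...bytes); // one code unit per byte
    const b64 = btoa(binStr);                     // string-only API
    const back = Uint8Array.from(atob(b64), c => c.charCodeAt(0));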

Your 3rd point is much more feasible. I doubt any "real" mangling would be good enough from a performance standpoint while still being too difficult for attackers to use. But I could imagine, e.g., breaking any invalid UTF-16/UTF-8 string up into separate rope nodes, maybe even ensuring the nodes don't get allocated too close to each other and/or injecting disruptive hardcoded bytes in between them. (I work on SpiderMonkey, the Firefox JS engine, and we do at least make sure to allocate string data in a separate part of the heap from everything else.)
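
A rough sketch of that rope idea in plain JS (conceptual only: a real engine would do this in C++ over the raw allocations, and splitAtLoneSurrogates is a name I made up):

    // Cut a string into chunks ("rope nodes") at every lone/unpaired
    // surrogate, so an invalid-UTF-16 payload never sits in memory as
    // one contiguous run of attacker-chosen 16-bit units.
    function splitAtLoneSurrogates(s) {
      const chunks = [];
      let start = 0;
      for (let i = 0; i < s.length; i++) {
        const u = s.charCodeAt(i);
        const isHigh = u >= 0xD800 && u <= 0xDBFF;
        const isLow  = u >= 0xDC00 && u <= 0xDFFF;
        if (isHigh && i + 1 < s.length) {
          const next = s.charCodeAt(i + 1);
          if (next >= 0xDC00 && next <= 0xDFFF) { i++; continue; } // valid pair
        }
        if (!isHigh && !isLow) continue;     // ordinary BMP code unit
        chunks.push(s.slice(start, i + 1));  // lone surrogate: end chunk here
        start = i + 1;
      }
      if (start < s.length) chunks.push(s.slice(start));
      return chunks;
    }

    splitAtLoneSurrogates("ab\uD800cd\uDC00ef");
    // => ["ab\uD800", "cd\uDC00", "ef"]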



Valid UTF-16 is already being sporadically enforced.
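
For example, the encoding layer already refuses to pass ill-formed UTF-16 through:

    // TextEncoder replaces a lone surrogate with U+FFFD when encoding
    // to UTF-8, so well-formedness is enforced at that boundary.
    const bytes = new TextEncoder().encode("\uD800");
    console.log(bytes); // Uint8Array(3) [239, 191, 189] -- UTF-8 for U+FFFD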

People who hack JS to store arbitrary data in strings are already fighting a losing battle, and I see no point in helping them.

But my point is that we have moved from JS as a scripting language that did not allow arbitrary binary data to one that does, without much thought given to that shift.

Half of the existing problems with zero-click, zero-day, and zero-browse exploits running in the wild, and with Chrome becoming the ActiveX 2.0, come down to that.

There is really no reason for a web browser to do general-purpose computing on the web, and thus no need for binary manipulation in JavaScript on the web.

I'm not saying to axe it from JS, but the browser use case could be restricted to a limited subset of the JS standard.



