We lack a principled approach to JavaScript performance. On the one hand, we're told to ignore performance, just write our code, and leave it to the automagic runtime to make things work. On the other hand, people constantly write articles about JS performance, and invent things like asm.js. So performance does matter. Apparently.
What I want is a report in a regular (automatically parsable) format, provided by the JIT vendors, telling us what they do and do not optimize, and when.
Then people who are writing JavaScript from scratch, or who are writing static checker tools for JS, or who are writing compilers from some other source language to JS, can all consume those reports and use them to figure out how they want to go about writing/generating the final JS.
(The report could get arbitrarily complex, adding more data that is hopefully useful to the consumers: how much an optimization saves, summary data from standard test suites, etc.)
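To make the idea concrete, here is a sketch of what such a report and a consumer might look like. Everything here is hypothetical: the field names (`fn`, `optimized`, `tier`, `reason`), the engine name, and the reasons are invented for illustration, not any actual JIT vendor's format.

```javascript
// Hypothetical report shape -- invented for illustration, not any
// real JIT vendor's output. One record per function the engine saw.
const report = {
  engine: "SomeJIT",
  version: "1.0",
  records: [
    { fn: "hotLoop", optimized: true, tier: "full", reason: null },
    { fn: "parseInput", optimized: false, tier: "baseline",
      reason: "polymorphic property access" },
    { fn: "format", optimized: false, tier: "baseline",
      reason: "arguments object escapes" }
  ]
};

// A consumer (a linter, or the backend of a compile-to-JS tool) could
// surface every function the engine declined to optimize, with the reason.
function unoptimized(report) {
  return report.records
    .filter(record => !record.optimized)
    .map(record => `${record.fn}: ${record.reason}`);
}

console.log(unoptimized(report));
```

A static checker could turn each entry into a warning at the function's definition site; a compiler could use the reasons to steer which code patterns it emits.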