When you build V8 from source, you can have it generate a snapshot of the memory state once the libraries are loaded and this gets packaged in the executable. It makes it slightly bigger but the load time is blazingly fast. Pretty sweet.
That's like the Smalltalk image! (A few years ago, a 10 megabyte VisualWorks image could load faster than the 768k Perl runtime.)
"The most interesting part is a new implementation for the JavaScript Engine called "V8" done by members of the orginal Animorphic team. Animorphic Smalltalk was a Smalltalk system built around the mid-90s as part of a startup that was informally known as Animorphic Systems. In early 1997, Animorphic was acquired by Sun, and much of the underlying VM technology was put to real use in the Java Hotspot VM."
The cool part about that is maybe they could precompile stuff like prototype, jquery, etc and include it with chrome since majority of websites use some kind of framework. All it would have to do is run a checksum on the prototype js file and if it matches, load a precompiled version.
Then again, it probably wouldn't help too much, since most websites employ fancy packaging tricks to speed up javascript file load times.
Or more generally... I've been hearing about these things called caches...
Maybe they should "cache" this data for the most common files. Voila, no favoritism, only sweet sweet performance. (At the cost of having some sort of frequency table.)
And cache eviction and all that is pretty well understood too...
But you don't know that url A and url B in fact point to the same file until after the file has been downloaded. At that point, why not just load the javascript instead of performing an md5 checksum?
Because parsing isn't free. In addition, one interesting optimization it opens up is something akin to the JVM where performance improves as profiling of code is done. Imagine if every site that used a Javascript library served to improve the performance of that library for all future invocations. That would be nifty.
It would also be an incentive to stick with an older version of something. I think I prefer the "publicly hosted on a very highly available site" idea, but it's tough finding someone to do that.
I have to wonder why they chose C++ for writing the VM. Since their architecture is based on compiling Javascript, why not just write the compiler in Javascript to begin with?
The most likely reason I can think of is that it makes it easier to interface with the existing WebKit code, but this doesn't seem like a very strong reason to me.
Because you would need to bootstrap it every time you want to run it, since Javascript is interpreted. This means you'd need a 3rd party Javascript interpreter to run your Javascript interpreter.
No, the whole point is that they already wrote a compiler, so in this case the Javascript is not interpreted. You only have to bootstrap it once.
They would have had to make their compiler provide a batch compilation mode in addition to the incremental compilation mode, but that should not be a pain point. After all, batch compiling is easier to do than incremental compiling.
Ah. So you propose writing a Javascript compiler that spits out a dynamically linkable, PIC executable for multiple architectures using semi-static assembly, and then using that to write V8? Otherwise, if you simply dump a raw image you've got a closed environment that can't really be integrated into a browser very well, or linked into other apps for their use.
That's a whole lot of unnecessary -- and nontrivial -- work, when all you want to do is interpret Javascript.
I'm not really proposing anything, since the V8 team has forgotten more about VMs than I know; I'm just pointing out another common design alternative. The post implies that V8 can already perform some sort of AOT compilation:
When you build V8 from source, you can have it generate a snapshot of the memory state once the libraries are loaded and this gets packaged in the executable. It makes it slightly bigger but the load time is blazingly fast.
There's certainly extra work in this approach, but maybe it would be worth it if it allows the compiler to be written in a more productive language.
This linked to a post where one of the key developers (Dave Griswold) said "The release of the V8 VM is the beginning of a whole new era for dynamic languages"...
From this alone, I don't see that being the case at all. I can't imagine many people wanting to use JavaScript as an intermediate language.
Without an intermediate language of some form, I wouldn't call this a Virtual Machine - it's an interpreter... Granted, a pretty fast and sophisticated one.
Actually, it's more like a compiler than an interpreter. I think people use the term "VM" out of ignorance -- they think VM is a fancy term for "thing that runs javascript".
But anyway, if you want to run your $NEW_LANGUAGE on V8, just compile your language to JavaScript. That is the "IR".
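As a toy illustration of the "JavaScript as IR" point: compile a tiny expression language (prefix lists like `['+', 3, ['*', 4, 5]]`, invented here for the example) down to a JavaScript source string, and let the JS engine be the backend. Real compile-to-JS languages are of course far more involved:

```javascript
// Toy compiler: lowers a prefix-notation expression tree to a JavaScript
// source string, then hands it to the JS engine to run. Illustrative only.
function compileExpr(expr) {
  // expr is either a number or [operator, left, right]
  if (typeof expr === 'number') return String(expr);
  const [op, left, right] = expr;
  return `(${compileExpr(left)} ${op} ${compileExpr(right)})`;
}

const js = compileExpr(['+', 3, ['*', 4, 5]]); // "(3 + (4 * 5))"
const result = eval(js);                       // the JS engine runs the "IR"
```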
Has anybody taken a shot at building a js server stack that can intelligently detect the capabilities of the user-agent, and determine what code can be executed client-side and what code can be executed on the server?
Interestingly enough, server-side Javascript is actually quite old. I remember one of the very first web apps I worked with was in SSJS (aka Server-Side JavaScript), on the Netscape Enterprise Server.
I remember ASP 2.0/3.0 supporting server-side js! I've never seen anything though where you write the js code once, and the server determines where to execute the code.