> Is there a JITing C compiler, or something like that?
Yes, for example, compiling C to JavaScript (or asm.js, etc. [0]) leads to the C code being JITed.
And yes, there are definitely benchmarks where this is actually faster. Any call site where a typical C compiler can't see that inlining makes sense is such an opportunity, because the JIT compiler sees the runtime behavior; see the sketch below. The speedup can be very large. However, in practice, most codebases get inlined well by clang/gcc/etc., leaving few such opportunities.
[0] This may also happen when compiling C to WebAssembly, but it depends on whether the wasm runtime does JIT optimizations - many do not and instead focus on static optimizations, for simplicity.
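To make the inlining point concrete, here's a minimal made-up example (mine, not from any real benchmark): a call through a function pointer whose target depends on program input is opaque to static inlining, but a JIT that observes the target staying stable can speculatively inline it behind a cheap guard.

```
#include <stdio.h>

typedef long (*binop)(long, long);

static long add(long a, long b) { return a + b; }
static long mul(long a, long b) { return a * b; }

/* The hot loop makes an indirect call. A static compiler cannot
 * inline it, as the target is unknown until runtime; a JIT that sees
 * `op` always resolve to the same function can inline through it. */
static long apply_many(binop op, long n) {
    long acc = 1;
    for (long i = 1; i <= n; i++)
        acc = op(acc, i);
    return acc;
}

int main(int argc, char **argv) {
    /* The target is chosen by a command-line flag, so it is
     * unknowable at compile time. */
    binop op = (argc > 1 && argv[1][0] == 'm') ? mul : add;
    printf("%ld\n", apply_many(op, 20));
    return 0;
}
```

Whether a given engine actually performs that speculation varies, of course; the point is just that the needed information exists at runtime but not at static compile time.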
Lots of people use wasm in production right now, and toolchain support is in a good place across several languages. In that sense we are already there.
You likely visited a website using wasm today without realizing it, and major apps like Photoshop have been ported to wasm, which was the original dream behind it all. That has all succeeded.
But if you want to replace containers specifically, as this article wants, then you need more than wasm 1.0 or even 2.0. I don't know when that future will arrive.
There's a difference between a handful of sites that use wasm and it being the mainstream way in which we write web applications and run hosted software. It's still a very, very niche platform that has not fulfilled its promise of being either a first-party web tool or a universal runtime.
Like, how easy is it to write a web application in Wasm? Or how easy is it to compile your average program written for a native platform to Wasm without hand-picking your dependencies to work on the platform?
You're right, but wasm's goals were never to be a mainstream way to write web applications. It was designed very specifically to allow things like Photoshop, Unity, and other high-end applications to run on the Web, things that just didn't run at all, and were really important. But despite their importance, those applications are a small fraction of total websites.
Wasm succeeded at its initial goals, and has been expanding into more use cases like compiling GC languages. Perhaps some day it will be common to write websites in wasm, but personally I doubt it - JavaScript/TypeScript are excellent.
Overall this is very good, but I have one specific note: Lesson 6 says "LLMs aren't conscious."
I think I get what you're saying there - they are not conscious in the same way that humans are - but "consciousness" is a highly-debated term without a precise definition, and correspondingly philosophers have no consensus on whether machines in general are capable of it. Here is one of my favorite resources on that, the Stanford Encyclopedia of Philosophy's page on the Chinese Room Argument:
Things that appear conscious, or that appear to understand a language, are very hard to distinguish from things that actually are those respective things.
Again, I think I get the intended point - some people interact with ChatGPT and "feel" there is another person on the other side, someone that experiences the world like them. There isn't. That is good to point out. But that doesn't mean machines in general and LLMs specifically can't be conscious in some other manner, just like insects aren't conscious like us, but might be in their own way.
Overall I think the general claim "LLMs aren't conscious" is debatable on a philosophical level, so I'd suggest either defining things more concretely or leaving it out.
Philosophy aside - how can an LLM be conscious without a memory or manifestation in the real world? It is a function that, given an input, returns an output and stops existing afterwards. You wouldn't argue that f(x)=x^2 is conscious?
I would maybe accept debates about whether for example ChatGPT (the whole system that stores old conversations and sends the history along with the current user entry) is conscious - but just the model? Isn't that like saying the human brain (just the organ lying on a table) is conscious?
There's a great exploration of this concept in Permutation City, a science fiction novel by Greg Egan. In the book, a deterministic human brain is simulated (perfectly) in random-access order. This thought experiment addresses all three of your arguments.
I don't see why something that doesn't exist some of the time inherently couldn't be conscious. Saying that something's output is a function of its inputs also doesn't seem to preclude consciousness. Some humans don't have persistent memory, and all humans (so far) don't exist for 99.99999999% of history.
I'm not trying to claim a particular definition of consciousness, but I find the counterarguments you're presenting uncompelling.
It is true that human consciousness is continuous over time, but maybe some animals have very little of that?
Or, to look at it like Black Mirror, if you upload your consciousness into a machine, are you not conscious if it pauses the simulation for a moment? Perhaps you would have no memory of that time (like in Severance), but you could still be conscious at other times.
I do agree that a model at rest, just sitting on a hard drive, doesn't seem capable of consciousness. I also agree x^2 is not conscious. But the problem, philosophically, is actually separating those cases from things we know are conscious. The point of Searle's Chinese Room argument is that he thinks no machine - not x^2, not a super-AI that passes the Turing Test - truly "thinks" (experiences, understands, feels, is conscious). But that position seems really hard to defend, even if it gives the "right" answer for x^2.
True that the US's share of global GDP is lower than it has been. But there are many other ways to measure its power (and dominance), so it is easy to argue about this between reasonable people.
Rather than make any specific point, I'd recommend acoup's detailed post about the US's overwhelming dominance across a huge swath of areas:
The US is still the most powerful economy in the world. No question about it.
What I was questioning was the OP's argument that it has never been as dominant since WWII. And no, the US has been far more powerful in the past, even if it is still the most powerful economy.
It looks like the wasm file here is not fully optimized. Reading the wasm and JS, I'd guess -O1, perhaps? Linking with -O3 would make it smaller and possibly faster (10% smaller binary in a quick local test, and it removes 90% of the locals, which should speed things up too).
But it is possible that it's already fast enough for the purposes here, and this doesn't matter.
It's worse, it is -O0 -- this is because of the interaction between the GC and binaryen/llvm. For GC to work we need to spill pointers to the stack across calls (and binaryen has a flag for that!), but at optimization level 1 and above said pointers are sometimes optimized away :3
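For anyone unfamiliar with the technique: "spilling" here means writing live GC pointers out to a known region of memory so the collector can find them when scanning for roots. A conceptual C sketch of the idea, with all names (gc_shadow_stack, gc_sp, gc_alloc) invented for illustration - this is not what Binaryen's pass actually emits:

```
#include <stdio.h>
#include <stdlib.h>

#define SHADOW_STACK_SIZE 1024

static void *gc_shadow_stack[SHADOW_STACK_SIZE]; /* scanned by the GC */
static int   gc_sp = 0;

/* Stand-in for a GC allocation; a real one may trigger a collection,
 * freeing anything the collector cannot see on the shadow stack. */
static void *gc_alloc(size_t size) { return calloc(1, size); }

static void example(void) {
    void *a = gc_alloc(16);
    gc_shadow_stack[gc_sp++] = a;  /* spill: keeps `a` visible as a root */

    void *b = gc_alloc(16);        /* a collection here must still see `a` */
    gc_shadow_stack[gc_sp++] = b;

    printf("a=%p b=%p\n", a, b);

    gc_sp -= 2;                    /* pop both on scope exit */
}

int main(void) {
    example();
    return 0;
}
```

The failure mode described above follows directly: to an optimizer that doesn't know the shadow stack is special, those stores look dead (nothing in the module reads them back), so it may delete them, after which a collection can free objects that are still in use.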
I'm experimenting with WASI and the GC extension for wasm, but a complete port is months away (given my time capacity at the moment).
WasmGC would be the best solution here, yeah - then the VM handles pointers for you.
Otherwise, I could look into the SpillPointers issue for you if you want - optimizations should not remove GC pointers, so that sounds like a bug. If so, feel free to file an issue with a testcase. (But WasmGC would be best, avoiding all that.)
As far as I know, optimization levels higher than -O0 work fine with SpillPointers. But at least in a cursory first look I took a while ago, the optimizations made things slower overall. I guess they might actually lead to more "moving pointers in and out of the heap", since the SpillPointers pass is run at the very end. But this should all be investigated more thoroughly.
Hey! Thanks for the offer and thanks for the correction. I've revisited the relevant threads and it seems that it is indeed -O0, because things are slower with higher optimization levels (I must have misremembered).
```
the optimization level -O0 is used because higher optimization
levels seem to interfere with the binaryen options needed to get the
garbage collector to work correctly and tend to slow down the program
(might be worth experimenting with the optimization options)
```
The wasm file here could be optimized more. It looks like a 43 MB wasm is downloaded, and wasm-opt -O3 shrinks that to just 14 MB. It might also be faster that way (due to inlining and other optimizations). Startup will definitely be faster.
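For anyone wanting to try that, it's the standard invocation (file names here are placeholders):

```
wasm-opt -O3 app.wasm -o app.opt.wasm
```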
Aside from tech, though: practically none of the non-tech people I followed on Twitter moved to Mastodon. Almost all of them went to Bluesky. I follow a mix of people, so I ended up mostly on Bluesky.
I would have been happy on Mastodon too, and I don't know why it didn't catch on with non-tech people, but it just hasn't. So Bluesky is our main opportunity for an open social web, at this time.
It sounds stupid, but I think the bit where you pick your host was too much for normies, or led to people putting off the decision and just not joining. And even once you have an account, you now have to pick a client.
Definitely a lot is missing, yeah, and adding more will take time. But it works well already for pure computational code. For example, Google Sheets uses WasmGC for Java logic:
This really is a case of trading a good addressable market for a much smaller one. Business-wise it makes no sense at all.