Something I really appreciate about Zig is its "colorblind async/await" [0].
To my eyes, it's the best of both worlds. It doesn't require a GC the way Go's "contiguous stacks" approach [1] or Loom's approach [2] does. It also doesn't require coloring like the JavaScript/Rust async/await approach, which can sometimes act as a viral, leaky abstraction, causing incidental complexity in the presence of certain patterns.
The main catch is around recursion, but that's not so bad, as Zig as a whole seems to steer users away from recursion anyway, with another interesting feature that detects it at compile time.
I think new languages should be going this direction: focus on making things easier for the user, avoid causing incidental complexity, and keep abstractions zero-cost.
The mistake that article makes (imo) is missing that "colorblindness" refers to compile time, i.e. you don't have to define both a sync and an async version of a function. Obviously async vs. sync have different representations at run time.
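To make that concrete, here's roughly what the single-definition style looked like under the old stage1 (pre-0.11) implementation; this is a from-memory sketch rather than code from the article, and the exact syntax shifted a bit between releases:

    const std = @import("std");

    // One definition; nothing in the signature marks it sync or async.
    fn computeAnswer() u32 {
        return 42;
    }

    // Blocking-style caller: a plain call.
    fn callDirectly() u32 {
        return computeAnswer();
    }

    // Coroutine-style caller: start a frame, then await the result.
    // Since computeAnswer never suspends here, the compiler doesn't
    // force this caller to become async either -- same source, no
    // second version of the function.
    fn callWithFrame() u32 {
        var frame = async computeAnswer();
        return await frame;
    }

    test "same function, two calling styles" {
        try std.testing.expect(callDirectly() == 42);
        try std.testing.expect(callWithFrame() == 42);
    }

Whether computeAnswer actually gets an async frame at run time is decided by the compiler from what it calls (e.g. evented vs. blocking io_mode), which is the "different representations at run time" part.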
As someone who has used this (to enable a single function to be called either preemptively or cooperatively by a surrounding VM)... It's not leaky. To be frank, until you've actually done it, it's hard to see.
Section 4 of the article I linked demonstrates how it’s leaky. Admittedly just in special scenarios, but it means that the mechanism isn’t fully transparent.
Something I appreciate about the implementation is that the suspend/resume primitives make it nigh-trivial to write coroutines. That saved me several nanoseconds per frame in my toy Gigatron emulator's VGA decoding implementation, and the coroutine version took a fraction of the time to write and debug compared to the traditional struct-based state machine I wrote first.
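Roughly the shape of it, in the old stage1 (pre-0.11) syntax -- a simplified from-memory sketch, not the actual emulator code:

    const std = @import("std");

    // Toy stand-in for a per-step decoder: produce one value, then yield.
    fn decoder(out: *u32) void {
        var i: u32 = 0;
        while (i < 3) : (i += 1) {
            out.* = i;   // this step's output
            suspend {}   // yield back to the driver loop
        }
    }

    pub fn main() void {
        var value: u32 = 0;
        var frame = async decoder(&value); // runs up to the first suspend
        var step: u32 = 0;
        while (step < 3) : (step += 1) {
            std.debug.print("step {} -> {}\n", .{ step, value });
            resume frame; // advance the coroutine by one step
        }
    }

All the "where was I?" bookkeeping lives in the coroutine's frame, which is exactly what the hand-written state-machine struct used to carry.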
Oh totally. I was just surprised to see a well-known (to people on HN!) language on the front page w/o any particular news to drive it. The HN front page can be random like that. :-) (See also: random Wikipedia pages that sometimes show up.)
If anything, Zig is over-represented on Hacker News in posts and positive mentions, particularly when you compare it to other newer languages, which tend to get ignored, bashed, or trolled, while Zig doesn't draw that kind of "heat". But just because Zig might be a "darling" of Hacker News doesn't mean that's the case in general.
These newer languages need to distinguish themselves and give people reasons to turn to them. In a world of established languages like C, Rust, Golang, Object Pascal, etc., why use Zig? For a lot of people, the argument for using it over what they already know might not be there.
Pretty neat language. One of the few that lets you treat allocation failures as regular errors (and makes writing/adding your own allocators really straightforward; rough sketch below).
Although I think HN is probably aware of this lang by now...
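For example, here's a from-memory sketch of the allocation-failure point; FixedBufferAllocator and the allocator() accessor are the std names as I remember them from recent releases, so double-check against your Zig version:

    const std = @import("std");

    pub fn main() void {
        // Any allocator works here; a small fixed buffer makes
        // out-of-memory easy to demonstrate.
        var backing: [256]u8 = undefined;
        var fba = std.heap.FixedBufferAllocator.init(&backing);
        const allocator = fba.allocator();

        // 1024 bytes can't fit in the 256-byte backing buffer, so this
        // fails -- but the failure is just `error.OutOfMemory`, handled
        // like any other Zig error instead of aborting the process.
        const big = allocator.alloc(u8, 1024) catch {
            std.debug.print("allocation failed, falling back\n", .{});
            return;
        };
        defer allocator.free(big);
        std.debug.print("got {} bytes\n", .{big.len});
    }

Nothing about out-of-memory is special-cased by the language; the catch branch is ordinary error handling, and swapping in a different allocator is just passing a different std.mem.Allocator value.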