To test a Claude Skill for analyzing cryptographic implementations for side-channels ([1] see constant-time-analysis), I had Claude vibe-code an Ed448 implementation.
This includes:
1. The Ed448 signature algorithm
2. The Edwards448 elliptic curve group (which could conceivably be used for ECDH)
3. The Decaf448 prime-order group (a much better target for doing non-EdDSA things with)
I've been putting off reviewing it and making the implementation public (as it was an exercise in "is this skill a sufficient guard-rail against implementation error" more than anything), but if there's any interest in this from the Go community, I'll try to prioritize it later this year.
(I'm not publishing it without approval from the rest of the cryptography team, which requires an internal review.)
This could be game-changing for a lot of open source software.
I spent years avoiding X.509 (and ASN.1, for that matter) in my designs because every time someone I trust poked it, a remotely exploitable bug fell out. Most often, it was a Denial of Service issue rather than Remote Code Execution. Moving to Rust would demonstrably improve the security of the entire Internet.
You might be tempted to ask, "What about BouncyCastle?" (or similar queries).
The "classic" example of this is enums as sum types, rather than a thin wrapper over an integral type: Rust makes it impossible to construct an invalid enum variant, whereas plenty of C logic bugs stem from taking untrusted user input and converting it into an enum variant.
My understanding is that Java doesn't allow this directly, but has adjacent historical deficiencies (e.g., not allowing exhaustive enumeration handling until recently).
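To illustrate the point about enums as sum types, here's a minimal sketch (the `CipherSuite` type and wire codes are illustrative, not from any real library): the only path from an untrusted integer to an enum value is a total `match`, so downstream code can never hold an out-of-range variant, whereas a C-style cast would silently succeed.

```rust
#[derive(Debug, PartialEq)]
enum CipherSuite {
    Aes128GcmSha256,
    ChaCha20Poly1305Sha256,
}

impl CipherSuite {
    // The only way to turn an untrusted wire value into a CipherSuite.
    // Unknown codes are rejected up front, so no later code path ever
    // observes an invalid variant.
    fn from_wire(code: u16) -> Option<CipherSuite> {
        match code {
            0x1301 => Some(CipherSuite::Aes128GcmSha256),
            0x1303 => Some(CipherSuite::ChaCha20Poly1305Sha256),
            // In C, `(enum suite)code` compiles and "succeeds" here,
            // leaving a bogus value for some switch to mishandle later.
            _ => None,
        }
    }
}

fn main() {
    assert_eq!(CipherSuite::from_wire(0x1301), Some(CipherSuite::Aes128GcmSha256));
    assert_eq!(CipherSuite::from_wire(0xFFFF), None);
    println!("ok");
}
```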
> I think that attitude vastly underestimates the complexity of a typical TLS implementation
If you ever get the impression that I'm underestimating the complexity of a typical TLS implementation, I promise you that I'm not. I speak to improvements, not panaceas.
Until the end of last year, I was one of the security engineers that the s2n team at AWS consulted on potential security issues. You will never hear me say anything will magically fix all our problems. Especially with TLS.
However, Rust does bring a lot to the table, so I feel I'm allowed to be excited about not reviewing another X.509 library written in C.
This reasoning doesn't make sense. If TLS is astonishingly complex, which it is, then we absolutely want the strongest type system that can simultaneously represent its complexity and afford developer ergonomics. TLS's complexity is a good reason for types that reflect invariants, not a good reason to give up.
I'm not even that good at writing Rust, and even I recognize that countless libs I'm using are written with Rust types in a way that prevents serious misuse, in ways that would be infeasible and unergonomic in other languages, or would require internal library invariant assertions that are prone to bugs.
Sometimes the errors wind up being nasty, but I've also gotten better at trusting that the compiler is giving me helpful info, even if it's a huge message. And usually those errors indicate some library invariant that I've missed that the type system is enforcing.
> while it's nice that the rest of the world is slowly waking up to type systems functional programmers have been bleating on about for the past four decades
> ... having read through the first couple of pages of bc vulns: even a much stronger type system than rust provides wouldn't appear to help very much in this specific example
> however if someone wants to rewrite OpenSSL in Rust that would be a massive massive improvement
I don't think this is true. Rust cannot prevent all possible forms of denial of service, but there are plenty of underlying DoS causes that Rust either outright eliminates (such as memory corruption without further control) or mitigates through stronger types.
A recent example of this is CVE-2024-0567 in GnuTLS: an invariant that would likely have been enforced at the type level is instead checked with an assert, leading to a remotely triggerable DoS.
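To sketch the general pattern (this is a hypothetical example, not GnuTLS's actual code or API): instead of asserting at runtime that a certificate chain is non-empty, make it impossible to construct an empty chain, so the invariant holds by construction and the assert has nothing left to check.

```rust
// A certificate chain that cannot exist in an empty state.
struct CertChain {
    leaf: Vec<u8>,      // the end-entity certificate, always present
    rest: Vec<Vec<u8>>, // zero or more intermediates
}

impl CertChain {
    // Parsing returns None rather than constructing an invalid value,
    // so no later code path needs `assert!(!chain.is_empty())`.
    fn from_der_list(mut certs: Vec<Vec<u8>>) -> Option<CertChain> {
        if certs.is_empty() {
            return None; // where a runtime assert would have aborted
        }
        let leaf = certs.remove(0);
        Some(CertChain { leaf, rest: certs })
    }

    // Infallible: the non-emptiness invariant is baked into the type.
    fn leaf(&self) -> &[u8] {
        &self.leaf
    }
}

fn main() {
    assert!(CertChain::from_der_list(vec![]).is_none());
    let chain = CertChain::from_der_list(vec![vec![0x30], vec![0x31]]).unwrap();
    assert_eq!(chain.leaf(), &[0x30]);
    assert_eq!(chain.rest.len(), 1);
    println!("ok");
}
```

The design choice here is the usual "parse, don't validate" move: push the fallibility to the boundary once, and let the type system carry the invariant everywhere else.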
Nor does it account for the myriad other logic and parsing bugs that led to incorrect behavior (more than just denial of service) in the Java library that was somehow not as good as Rust :/.
Aptitude with a technology certainly correlates with intelligence, but it doesn't necessarily imply above-average intelligence.
Some people attain their skills through Herculean levels of hard work, rather than reading about it once and the information just clicking because of their excellent brains. (Though, in my experience, they tend to also be less arrogant than techno-prodigies, but YMMV.)
Self-awareness is, similarly, orthogonal to intelligence (as you correctly state).
I find it interesting that an assumption of equivalence (or, at least, strong correlation) is so prevalent among tech workers and their friends.
But if you're curious about the efficacy of the Skill, it did discover https://github.com/RustCrypto/signatures/security/advisories...
[1] https://github.com/trailofbits/skills