Hacker News | thadt's comments

So the interesting question: are we safer long term with "simpler," closer-to-hardware, memory-unsafe(ish) environments like Zig, or is the memory-safe but more abstract feature set of languages like Rust still the winning direction?

If a hypothetical build step is "look over this program and carefully examine the bounds of safety using your deep knowledge of the OS, hardware, language, and all the tools that come along with it", then a less abstract environment might be at an overall advantage. In a moment, I'll close this comment and go back to writing Rust. But if I had the time (or tooling) to build something in C and test it as thoroughly as, say, SQLite [1], then I might think harder about the tradeoffs.

[1] https://sqlite.org/whyc.html
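The tradeoff in this comment can be shown in miniature (a toy Python sketch, not tied to any specific language above): a memory-safe runtime turns an out-of-bounds access into a deterministic, catchable error at the access site, rather than undefined behavior.

```python
def read_at(buf, i):
    # In a memory-safe runtime, an out-of-bounds index fails loudly here;
    # in unchecked C, the equivalent read could silently return whatever
    # happens to sit in adjacent memory.
    try:
        return buf[i]
    except IndexError:
        return None  # the fault is contained, not exploitable
```

For example, `read_at([10, 20, 30], 7)` returns `None` instead of leaking neighboring data.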


What about this article raises this question? If anything, this article makes it pretty clear that memory safe languages are a win. It seems like a serious disadvantage to require a nondeterministic program to evaluate your code's safety.

In general I agree and suspect that memory safety is a tool that will continue to pay dividends for some time.

But there are tradeoffs, and there are more ways to write correct and 'safe' code than doing it in a "memory safe" language. If frontier models indeed are a step function in finding vulnerabilities, then they're also a step function in writing safer code. We've been able to write safety-critical C code with comprehensive testing for a long time (with SQLite presenting a well-known critique of the tradeoffs).

The rub has been that writing full-coverage tests, fuzzing, auditing, etc. has been costly. If those costs have changed, then it's an interesting topic to try to understand how.
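The fuzzing cost mentioned here can be sketched as a naive random-input loop (a toy illustration, nothing like a real coverage-guided fuzzer; `parse_record` is a hypothetical target function, not from any project discussed above):

```python
import random

def parse_record(data: bytes) -> dict:
    # Hypothetical function under test: parses a "key=value" record.
    text = data.decode("utf-8", errors="replace")
    key, _, value = text.partition("=")
    return {"key": key, "value": value}

def fuzz(iterations: int = 10_000, seed: int = 0) -> int:
    # Feed random byte strings at the target and count uncaught crashes.
    rng = random.Random(seed)
    crashes = 0
    for _ in range(iterations):
        data = bytes(rng.randrange(256) for _ in range(rng.randrange(64)))
        try:
            parse_record(data)
        except Exception:
            crashes += 1
    return crashes
```

Real fuzzers (AFL, libFuzzer) add coverage feedback, corpus mutation, and sanitizers; the expensive part has historically been running loops like this long enough, and writing harnesses for every entry point.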


> If frontier models indeed are a step function in finding vulnerabilities, then they're also a step function in writing safer code. We've been able to write safety critical C code with comprehensive testing for a long time (with SQLite presenting a well known critique of the tradeoffs).

More like: a few people have been able to write C code where the vulnerabilities are obscure enough that we mostly don't discover them very often.

The result of the phenomenon described in the article is that the gap between 99.9% secure and 100% secure just got a whole lot wider. Using herculean amounts of testing and fuzzing to catch most of the holes in a language that lacks secure-by-construction qualities is going to be even less viable going forward.


They're great at Python and JavaScript, which have lots of tooling. My idea was to make X-to-safe-lang translators, X initially being Python and JavaScript. Let the tools keep generating what they're good at. The simpler translators make it safe and fast.

If translated to C or Java, we can use decades' worth of tools for static analysis and test generation. Meanwhile, Python and JavaScript are easier for humans to analyze and live-debug.

Multiple wins if the translators can be built.
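The translator idea can be sketched in a few lines (a toy Python illustration of my own, handling only a tiny subset: integer arithmetic expressions lowered to C-syntax strings; everything else is rejected rather than guessed at):

```python
import ast

# Map supported Python AST operators to their C spellings.
C_OPS = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*"}

def to_c_expr(node: ast.AST) -> str:
    # Recursively lower a checked subset of Python expressions to C.
    if isinstance(node, ast.Expression):
        return to_c_expr(node.body)
    if isinstance(node, ast.Constant) and isinstance(node.value, int):
        return str(node.value)
    if isinstance(node, ast.Name):
        return node.id
    if isinstance(node, ast.BinOp) and type(node.op) in C_OPS:
        left, right = to_c_expr(node.left), to_c_expr(node.right)
        return f"({left} {C_OPS[type(node.op)]} {right})"
    raise NotImplementedError(f"unsupported construct: {ast.dump(node)}")

def translate(source: str) -> str:
    return to_c_expr(ast.parse(source, mode="eval"))
```

So `translate("a * 2 + b")` yields `"((a * 2) + b)"`. A real translator would need types, control flow, and a memory model, which is where the hard work lives.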


> My idea was to make X-to-safe-lang translators, X initially being Python and JavaScript.

Both of those languages are already safe. Then you talk about translating to C, so you're actually doing a safe-to-unsafe translation. I'm not sure what properties you're checking with the static analysis at that point. I think what would be more important is that your translator maintains safety.


Bananas are like XML that way. If you're not getting the results you want, you're just not using enough of them.

They specifically call out Yingxin Li[1] in the acknowledgements section of the paper?

[1] https://eprint.iacr.org/2024/349


As someone who spent hours playing Jedi Knight with friends and lots of mods, allow me to say - thank you :)


Browsers


Since when are browsers themselves built in JavaScript? Mainstream, fast ones?


Clarification - in the past when I've written high performance data tools in JS, it was almost entirely to support the use case of needing it to run in a browser. Otherwise, there are indeed more suitable environments available.

To your question, I was about to point out Firefox[1], but realized you clarified 'mainstream'[2]...

[1] https://briangrinstead.com/blog/firefox-webcomponents

[2] https://gs.statcounter.com/browser-market-share


Getting a broad overview of "world history" is useful for having basic context for large events, but, IMHO, history gets so much more interesting and educational when you're deep into individual people's lives and stories. I'm probably a bit biased, but tend to agree with the suggestions that you pick a time and place and dive deep into an individual or event that catches your fancy.


Oh man, have I gotten to read a lot of history recently.

And also fiction.

Frequently at the same time.


I like the quote that claims that, as a science, history is probably closer to animal husbandry than anything else.

Don't get me wrong, I like history and think it a critical thing to study. But it is very telling to try one's hand at meta-history, the history of history: look at how the narrative of a historical subject changes through time and space.

An easy one is World War 2 documentaries. The difference in tone and focus of those done right after the cessation of hostilities compared to those done later is fascinating.


I used to love using em dashes.

I still do - but I used to, too.


Hey wait, - isn't one! Did a human write this?


In the '80s we had a way to deal with that kind of thing [1]. Just gotta practice to get the technique right.

[1] https://www.youtube.com/watch?v=D1GyHQiuneU


I had this exact scene in my mind and I am glad I am not alone, friend


Exactly! Also, that random ride across the bridge towards Marin is taking forever


A cursory search doesn't seem to turn up anything solid.

But it would be interesting to know. I'll be in the German diary archive in March and made a note to keep an eye out for it.

