
There are old idioms in C where null pointers are intentionally dereferenced to induce the expected outcome. Not the best way to write that code because beyond being less explicit about intent it also isn't guaranteed to work.

The rule is likely speaking to this code.


I've done it, I think more than once.

I was getting to a point in the code. I could tell by a log statement or some such. But I didn't know in what circumstances I was getting there - what path through the code. So I put in something like

  char *p = 0;
  *p = 1;
in order to cause a core dump. That core dump gave me the stack trace, which let me see how I got there.

But I never checked that in. If I did, I would expect a severe verbal beating at the code review. Even more, it never made it into release.


Early returns make the code more linear, reduce conditional/indent depth, and in some cases make the code faster. In short, they often make code simpler. The “no early returns” rule is a soft version of “no gotos”. There are cases where it is not possible to produce good code while following those heuristics. A software engineer should strive to produce the best possible code, not rigidly follow heuristics even when they don’t make sense.

There is an element of taste. Don’t create random early returns if it doesn’t improve the code. But there are many, many cases where it makes the code much more readable and maintainable.
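As a minimal sketch of the difference (a hypothetical validation function, not from any particular codebase), compare the nested style with the early-return style:

```cpp
#include <string>

// Nested style: every precondition adds an indent level, and the
// result has to be threaded out through a flag.
bool validate_nested(const std::string& name, int age) {
    bool ok = false;
    if (!name.empty()) {
        if (age >= 0) {
            if (age <= 150) {
                ok = true;
            }
        }
    }
    return ok;
}

// Early-return style: reject each bad input up front, keeping the
// happy path linear and unindented.
bool validate_early(const std::string& name, int age) {
    if (name.empty()) return false;
    if (age < 0) return false;
    if (age > 150) return false;
    return true;
}
```

Both versions compute the same result; the early-return version just reads top to bottom with no nesting.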


What comparable alternative is available today? None of the European companies has a production 5th generation aircraft nor the integrated sensing capabilities. This is what is driving the incredible demand despite misgivings. You can't survive in a near peer combat environment without it.

Countries are buying it because it is the only game in town for certain high-value capabilities, not because they necessarily like the implications of there being a single seller of those capabilities. For better or worse, the US has been flying these for 30 years and has 6th generation aircraft in production. Everyone else is still figuring out their first 5th generation offering.

Closing that gap is a tall order. Either way, European countries need these modern capabilities to have a capable deterrent.


I'm no expert, but the narrative is that it really depends what you need them for. And keep in mind that joining the jet fighter programme also means joining the development of it, enacting a certain amount of influence through your funding. For example, it is conceivable that a sufficiently upgraded Gripen tailored to our needs would be just as effective (which aren't really dogfighting, as I understand it), and cheaper.

Anyway we're all just crossing our fingers that the US is just temporarily insane and will eventually come to its senses. What else can you do.


> You can't survive in a near peer combat environment without it.

How well will the European countries survive with it if the US cuts off access to spare parts, software maintenance links, etc.?


> What comparable alternative is available today?

You know the answer, but I'll say it anyway. There is no comparable alternative today, and there will not be one in the near future.


In many regards, the F-35 was the first aircraft explicitly engineered for the requirements of drone-centric warfare. Its limitations are that this capability was grafted onto an older (by US standards) 5th generation tech stack that wasn't designed for this role from first principles. I think this is what ultimately limited production of the F-22, which is not upgradeable even to the standard of the F-35 for drone-centric environments.

The new 6th generation platforms being rolled out (B-21, F-47, et al) are all pure first-principles drone-warfare native platforms.


This is simply not what happened historically.

Drones were not discussed much when the requirements for the F-35 were formed.

The F-22 was considered very open and upgradable for its era. It's just so freakin' old that FireWire was the unproven new hotness when it was designed.

Current AF efforts do focus on drone and loyal-wingman concepts, but these don't have much material impact on avionics. Everything the AF is talking about there is agility in delivering capabilities through open systems architecture. That's why they're doing things like trying out k8s on military aircraft. It's not about drones specifically but about things like delivering new EW capabilities in days or hours instead of decades.

For a dive on the latter stuff look into what Dr Will Roper was talking about during his tenure.


My recollection is that it came down to two factors. Pragmatically, the pool of highly skilled C++ programmers was vastly larger and the ecosystem was much more vibrant, so development scaled more easily and had a lower maintenance risk. By 2005 they had empirical evidence that it was possible, albeit more difficult, to build high-reliability software in C++ as the language and tooling matured.

These days they are even more comfortable using C++ than they were back then due to improvements in process, tooling, and language.


Why would this be relevant?

There isn't much of a conversation to be had here. For low-level systems code, exceptions introduce a bunch of issues and ugly edge cases. Error codes are cleaner, faster, and easier to reason about in this context. Pretty much all systems languages use error codes.

In C++, which supports both, exceptions are commonly disabled at compile-time for systems code. This is pretty idiomatic, I've never worked on a C++ code base that used exceptions. On the other hand, high-level non-systems C++ code may use exceptions.
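As a hedged sketch of the error-code style (the enum and function are hypothetical, for illustration only), errors are plain return values and results come back through out-parameters:

```cpp
#include <cstdint>

// Hypothetical status enum; real systems codebases define their own.
enum class Err : std::uint8_t { Ok = 0, InvalidArg, Overflow };

// Error-code style: the status is the return value, the result comes
// back through an out-parameter. No stack unwinding, no hidden control
// flow; the caller is expected to check the returned code.
Err checked_add(std::int32_t a, std::int32_t b, std::int32_t* out) {
    if (out == nullptr) return Err::InvalidArg;
    if ((b > 0 && a > INT32_MAX - b) || (b < 0 && a < INT32_MIN - b))
        return Err::Overflow;
    *out = a + b;
    return Err::Ok;
}
```

Control flow stays fully visible at every call site, which is a big part of why this style is preferred in low-level code.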


What you wrote is historically correct, but newer analysis shows exceptions are faster than error codes if you actually check the error codes. Of course, checking error codes is tedious, so often you don't. Also, in micro-benchmarks error codes are faster; only when you run more complex benchmarks do exceptions show up as faster.

The performance benefits of exceptions are not borne out in practice in my experience relative to other error handling mechanisms. It doesn't replicate. But that is not the main reason to avoid them.

Exceptions have very brittle interaction with some types of low-level systems code because unwinding the stack can't be guaranteed to be safe. Trying to make this code robustly exception-safe requires a lot of extra code and has runtime overhead.

Using exceptions in these kinds of software contexts is strictly worse from a safety and maintainability standpoint.


thanks for the explanation.

For those interested, the F-35 (née Joint Strike Fighter) C++ coding standards can be found here, all 142 pages of it:

https://www.stroustrup.com/JSF-AV-rules.pdf


From quickly glancing over a couple of pages, that looks sensible. Which makes me curious to see some exceptions to the "shall" rules. With a project of this size, that should give some idea about the usefulness of such standards.

As is common in hard real time code, there is no dynamic allocation during operation:

    allocation/deallocation from/to the free store (heap) 
    shall not occur after initialization.
This works fine when the problem is roughly constant, as it was in, say, 2005. But what do things look like in modern AI-guided drones?

Why would the modern environment materially change this? The initialized resource allocation reflects the limitations of the hardware. That budget is what it is.

I can't think of anything about "modern AI-guided drones" that would change the fundamental mechanics. Some systems support very elastic and dynamic workloads under fixed allocation constraints.


Basic flight control is a fixed-size problem. More military aircraft systems now depend on what the environment and the enemy are doing.

You're just imagining things at this point.

The overwhelming majority of embedded systems are designed around a max buffer size and a known worst-case execution time. Attempting to balance resources dynamically in a fine-grained way is almost always a mistake in these systems.

Putting the words "modern" and "drone" in your sentence doesn't change this.


The compute side of real-time tracking and analysis of entity behavior in the environment is bottlenecked by what the sensors can resolve at this point. On the software side you really can’t flood the zone with enough drones etc such that software can’t keep up.

These systems have limits but they are extremely high and in the improbable scenario that you hit them then it is a priority problem. That design problem has mature solutions from several decades ago when the limits were a few dozen simultaneous tracks.


There are missiles in which the allocation rate is calculated per second and then the hardware just has enough memory for the entire duration of the missile's flight plus a bit more. Garbage collection is then done by exploding the missile on the target ;)

We call this "explosive deallocation". Destructors have a whole new meaning.

What you are actually doing here is moving allocation logic from the heap allocator to your program logic.

In this way you can use pools or buffers of which you know exactly the size. But, unless your program is always using exactly the same amount of memory at all times, you now have to manage memory allocations in your pool/buffers.
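A minimal sketch of that idea (a fixed-capacity free-list pool; the names are made up for illustration): all storage is reserved up front, and acquire/release just move a free-list pointer, so no heap allocation happens during operation.

```cpp
#include <array>
#include <cstddef>

// Fixed-capacity object pool: N slots reserved at construction,
// handed out and returned via an intrusive free list.
template <typename T, std::size_t N>
class FixedPool {
    union Slot {
        Slot* next;
        alignas(T) unsigned char storage[sizeof(T)];
    };
    std::array<Slot, N> slots_;
    Slot* free_head_ = nullptr;
public:
    FixedPool() {
        for (std::size_t i = 0; i < N; ++i) {
            slots_[i].next = free_head_;
            free_head_ = &slots_[i];
        }
    }
    void* acquire() {  // returns nullptr when the budget is exhausted
        if (free_head_ == nullptr) return nullptr;
        Slot* s = free_head_;
        free_head_ = s->next;
        return s->storage;
    }
    void release(void* p) {  // p must have come from acquire()
        Slot* s = reinterpret_cast<Slot*>(p);
        s->next = free_head_;
        free_head_ = s;
    }
};
```

Note the trade-off the parent comment describes: exhaustion is now an explicit, visible condition (acquire returning nullptr) that your program logic has to handle.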


"AI" comes in various flavors. It could be a expert system, a decision forest, a CNN, a Transformer, etc. In most inference scenarios the model is fixed, the input/output shapes are pre-defined and actions are prescribed. So it's not that dynamic after all.

This is also true of LLMs. I’m really not sure of OP’s point - AI (really all ML) generally is like the canonical “trivial to preallocate” problem.

Where dynamic allocation starts to be really helpful is if you want to minimize your peak RAM usage for coexistence purposes (e.g. you have other processes running) or want to undersize your physical RAM requirements by leveraging temporal differences between different parts of code (i.e. components A and B never use memory simultaneously, so either A or B can reuse the same RAM). It also simplifies some algorithms, and if you're ever dealing with variable-length inputs it can help you not have to reason about maximums at design time (provided you correctly handle an allocation failure).


How do you think these modern AI-guided drones use their AI? What part of the drone uses it?

Sensor input evaluation using subsystem produced confidence values?

I wonder if they use static analysis to enforce these rules, or if developers are expected to just know all of this

"shall" recommendations are statically analyzed, "will" are not.

static analysis

In general, are these good recommendations for building software for embedded or lower-spec devices? I don't know how to do preprocessor macros anyhow, for instance - so as I am reading this I am like "yeah, I agree..." until the no stdio.h!

Embedded more so than just lower-spec devices. Depends on the domain too.

stdio.h is fine in some embedded contexts, and very very not fine in others


stdio.h is not what you would use in safe code.

do they use f35io.h?

Depends. You use vendor specific libraries for hard real time systems, or in house libraries, or roll your own functions.

Afair they use a lot of stuff related to the Green Hills toolchain.

The first time I came across this document, someone was using it as an example of how the C++ you write for an Arduino Uno is still C++ despite missing so many features.

Interesting font choice for the code snippets. I wonder if that's been chosen on a whim or if there is a reason for not going with mono space.

The font used for code samples looks nearly the same as "The C++ Programming Language" (3rd edition / "Wave") by Bjarne Stroustrup. Looking back, yeah, I guess it was weird that he used italic variable-width text for code samples, but used tab stops to align the comments!

Interesting they're using C++ as opposed to Ada.

The video goes into the history of why the military eventually accepted C++ instead of enforcing Ada.

i.e. standard practice for every C++ code base I've ever worked on

What industry do you work in? Modern RAII practices are pretty prevalent

This is common in embedded systems, where there is limited memory and no OS to run garbage collection.

Garbage collection in C++?

What does RAII have to do with any of the above?

0 allocations after the program initializes.

RAII doesn't imply allocating.

My guess is that you're assuming all user defined types, and maybe even all non-trivial built-in types too, are boxed, meaning they're allocated on the heap when we create them.

That's not the case in C++ (the language in question here) and it's rarely the case in other modern languages because it has terrible performance qualities.


Open a file in the constructor, close it in the destructor. RAII with 0 allocations.
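As a sketch of that exact pattern (a hypothetical wrapper; fopen may allocate inside the C library, but the wrapper itself performs no new/delete):

```cpp
#include <cstdio>

// RAII without heap allocation by the class itself: the managed
// resource is a FILE* handle. The constructor opens it, the
// destructor closes it, and the wrapper lives on the stack.
class ScopedFile {
    std::FILE* f_;
public:
    ScopedFile(const char* path, const char* mode)
        : f_(std::fopen(path, mode)) {}
    ~ScopedFile() { if (f_) std::fclose(f_); }
    ScopedFile(const ScopedFile&) = delete;             // not copyable
    ScopedFile& operator=(const ScopedFile&) = delete;
    bool ok() const { return f_ != nullptr; }
    std::FILE* get() const { return f_; }
};
```

The same shape works for mutexes, hardware registers, transactions, and so on: RAII is about tying a resource's lifetime to a scope, not about the heap.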

std::vector<int> allocated and freed on the stack will allocate an array for its ints on the heap…

Sure, but my point was that RAII doesn't need to involve the heap. Another example would be acquiring and releasing a mutex.

I've heard that MSVC does (did?) that, but if so that's an MSVC problem. gcc and clang don't do that.

https://godbolt.org/z/nasoWeq5M


WDYM? Vector is an abstraction over dynamically sized arrays so sure it does use heap to store its elements.

I think usefulcat interpreted "std::vector<int> allocated and freed on the stack" as creating a default std::vector<int> and then destroying it without pushing elements to it. That's what their godbolt link shows, at least, though to be fair MSVC seems to match the described GCC/Clang behavior these days.

RAII doesn't necessarily require allocation?

Stack "allocations" are basically free.

No. And they're unsafe. Avoid them at all costs.

Well if you're using the standard library then you're not really paying attention to allocations and deallocations for one. For instance, the use of std::string. So I guess I'm wondering if you work in an industry that avoids std?

I work in high-scale data infrastructure. It is common practice to do no memory allocation after bootstrap. Much of the standard library is still available despite this, though there are other reasons to not use the standard containers. For example, it is common to need containers that can be paged to storage across process boundaries.

C++ is designed to make this pretty easy.
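One common pattern (a hypothetical sketch, not the parent commenter's actual code) is to reserve worst-case capacity once at startup and then enforce the budget in program logic:

```cpp
#include <cstddef>
#include <vector>

// All heap allocation happens in the constructor ("bootstrap");
// after that, add() works within the reserved capacity and reports
// failure instead of growing.
class TrackTable {
    std::vector<int> tracks_;   // hypothetical payload: track IDs
    std::size_t max_;
public:
    explicit TrackTable(std::size_t max) : max_(max) {
        tracks_.reserve(max);   // the only allocation
    }
    bool add(int id) {
        if (tracks_.size() >= max_) return false;  // budget exceeded
        tracks_.push_back(id);  // no reallocation: within capacity
        return true;
    }
    std::size_t size() const { return tracks_.size(); }
};
```

Because push_back never exceeds the reserved capacity, no reallocation can occur after construction, and the failure mode is an explicit return value rather than an allocation at runtime.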


Not an expert but I’m pretty sure no exceptions means you can’t use significant parts of std algorithm or the std containers.

And if you’re using pooling I think RAII gets significantly trickier to do.



And what does "modern" have to do with it anyway?

Not in the US. There is a lot of BLM land if you want to live a nomadic lifestyle in the middle of nowhere.

For hunting in a way you want? Not having to pay taxes? Raising your children in the nomadic hunter lifestyle? I think schooling (and lots of other things) is mandatory in the US as well. And child protective services etc. exist. So it might be easier in the US to cosplay as a forest nomad for some time (and I know some people have done it as hermits for a bit longer), but a real nomadic lifestyle means living together with other people in a tribe. That does not work (just the rule to move camp after 2 weeks prevents that).

It isn't common but it definitely happens in some parts of the US.

There are no taxes to pay if you aren't earning anything. It is legal, if inadvisable, to raise children this way in much of the US. There is a "live and let live" ethos around it, especially in the western US. The true nomads are probably most common in the mountain West of the US in my experience. While the rule is two weeks in one location, in many remote areas there is no enforcement and no one really cares. They sometimes have mutually beneficial arrangements with ranchers in the area. These groups tend to be relatively small.

Alaska is famously popular for groups of families disappearing into the remote wilderness to create villages far from modern civilization. It is broadly tolerated there. Often many years will pass between sightings of people that disappeared into the wilderness.

I always wondered what a high-resolution satellite survey of the Inside Passage of Alaska and the north coast of British Columbia would find in that vast and impenetrable wilderness. Anecdotally there should be dozens of villages hidden in there that have been operating for decades.


Read into it; it happens, and CPS isn’t usually involved until it’s well into horror-show territory.

It’s usually around a cult or similar; we don’t have much in the way of hereditary nomadism, but even that does exist.


I think I did read about it and met folks who are into that. I have never been in the US, though, but the main complaint I got is pretty much, state laws make it impossible. But I am open for reading suggestions.

There’s what is explicitly legal, there is what you can get away with, and there is moving between jurisdictions before they even know you’re there.

The US is large and if you keep your head down and homeschool to some level of competence I bet you could go many generations- especially if you were willing to blend in as necessary.

