Take airplane safety: a plane crashes, the cause is thoroughly investigated, and the report recommends procedures to prevent that type of crash. Sometimes such recommendations become enforced across the whole industry. Result: air travel keeps getting safer, to the point where sitting in a (flying!) airplane all day is safer than sitting on a bench by the street.
Building regulations: similar.
Foodstuffs (hygiene requirements for manufacturers): similar.
Car parts: see the ISO 9000 family of standards & co.
Software: e.g. memory leaks - they've been around forever, yet every day new software ships with them.
C: ancient, not memory safe, should really only be used in niche domains. Yet it's still everywhere.
New AAA game: pay $$ after year(s?) of development, then download a many-MB patch on day 1 because the game is buggy. It could have been tested better, but it was released anyway because getting it out & making sales weighed heavier than shipping a reliable, working product.
All of this = failing to improve our methods.
I'm not arguing C vs. Rust here or whatever. Just pointing out: better tools and better procedures exist, but using them is the exception rather than the rule.
Like I said, the list goes on. Other branches of engineering don't (can't) work like that.
Exactly. The driving force is there, but what also helps is that the industry - for the most part at least - realizes that safety is what keeps them in business. So not only is there a structure of oversight and enforcement, there is also a strongly internalized culture of safety, built up over decades, to build on. An engineer who proposed something obviously unsafe wouldn't get to finish their proposal, let alone implement it.
In 'regular' software circles, if you're unlucky, you can find the marketing department with full access to the raw data and the front end.
How do you know that?