
Be very careful with what you are suggesting. If we are going to call them criminals because of a bug in their code and system, then don't be upset the day you are called a criminal for a bug in your own code, for not having a unit test, or for deploying the wrong thing and not having the right process in place.


Having worked in the aerospace side of software dev, I'd actually be cool with this. Outside of software, real engineers have real liability when their systems fail [1]. PEs [2] are held accountable in most countries that license engineers when they sign off on things that should not have been signed off on. Too many people on the software side, even in safety-critical systems, play fast and loose with the "engineering" aspect. We know how to design sound software, we know how to analyze it, and we know how to test it. Deliberate, systematic approaches to these tasks can bring orders of magnitude greater confidence in the systems we produce; we just choose not to do it. Technically, in aerospace, we have DO-178C, which provides guidelines on artefacts that need to be produced, but too often these artefacts are created after the fact rather than before or during development (where they belong). Criminal liability and a career ruined by loss of licensure are risks that might actually temper some of the recklessness in the industry.

[1] http://en.wikipedia.org/wiki/Hyatt_Regency_walkway_collapse#... - no criminal penalties, but the engineers faced civil penalties and lost their licenses to practice.

[2] http://en.wikipedia.org/wiki/Regulation_and_licensure_in_eng...


You are missing an important point: different systems fail in very different ways. The reason people generally prefer software over other ways of building things is that the cost of failure is often very low. No one is going to die if a website serves an incorrect image, nor will such a bug require billions of dollars' worth of semiconductor inventory to be recalled for repair.

If every software project were run like an avionics project, software would be more reliable and of higher quality. But the world would be worse off; most of the software people use would never come into existence.


I don't think I'm missing the point. I'm not calling for Photoshop plugin devs to have the same requirements as others; I'm saying I wouldn't object to licensure for software devs working on many categories of systems, medical and avionics being two obvious ones. Perhaps I should have stated that more clearly, but I figured most people would get that a browser-based game is in a totally different league than the software controlling acceleration in your vehicle.


Sometimes this is true, but it misses some things.

Sometimes the cost of failure appears low but is actually massive because the failure mode is not understood. For example, a spreadsheet that miscalculates may cause a bad investment decision and a corporate failure.

Software is often chosen because it trades off against weight (in physical systems) or people (more commonly).

Software is fundamentally different because it is not commonly toleranced, or its tolerancing is not understood. Reliability in physical engineering is understood in terms of the limits to which a component can be pushed. This concept seems not to be applicable to software.


But the way "real world" engineering is treated by companies is extremely different from software engineering. A structural engineer tells you that your mega-bridge is going to take another two years to build safely, and you eat the cost; a software engineer tells you they need another two years to make sure your program will work reliably, and you tell them they have six months.

It would be nice to see strict regulation on systems where lives would be endangered should the software fail, but this also raises the issue of how you regulate.

In structural engineering you can say "don't use material X if the forces acting on it exceed Y newtons." The same kind of regulation doesn't make sense in software: you can't say "only use Haskell" or "don't use library Z," because the interactions between the tools we use are much more complicated than in many "real world" engineering tasks.

We then run into the fact that a lot of software engineers have no real power in their companies: they do what management says or they get fired. I'd guess that when any other kind of engineer says "this won't work," managers listen. In my opinion, a better solution than holding software engineers responsible would be holding the company and its managers to account, at least at this point in time.


Correct, you can't declare language use as a requirement the way you can a material. Check out DO-178C; it doesn't do that. It requires artefacts and processes to be in place so that when the system is done, if done right, you have a high degree of confidence that, whether it was written in C, Haskell, OCaml, or Fortran, it was designed and tested well, and consequently that errors are minimized or their impact is mitigated by design.

And this brings us back to licensure. If we had a PE category for this sort of software engineering, where people really staked their livelihood on what they signed off on, these sorts of processes might be taken seriously. So when you're told, after giving a two-year estimate, that you have six months, you can honestly reply: I cannot do that. And you'd have a body to point to that backs up your decision when you get fired and they hire a less reputable "engineer."



