I was intrigued by this observation, near the end:
> 18) Failure free operations require experience with failure.
> Recognizing hazard and successfully manipulating system operations to remain inside the tolerable performance boundaries requires intimate contact with failure. More robust system performance is likely to arise in systems where operators can discern the “edge of the envelope”.
This maps interestingly to the work of strategic thinker John Boyd (http://en.wikipedia.org/wiki/John_Boyd_%28military_strategis...). (I summarized the general thrust of Boyd's thought in a blog post here: http://jasonlefkowitz.net/2013/03/how-winners-win-john-boyd-...)
In analyzing what separates organizations that win victories from those that do not, Boyd wrote of a quality he called Fingerspitzengefühl -- a German word that can be understood as something like "intuition." (The literal translation is "fingertip feeling," as in how a successful baseball pitcher can tell where the ball is going to go solely from how it feels rolling out of his hand.) His point was that winning organizations exposed their people to enough training (good) and experience (better!) that they could learn to react to emergent situations on instinct, rather than by consulting a manual or waiting for instructions from above. The point quoted above sounds like a call for people working on complex systems to get opportunities to develop their own Fingerspitzengefühl.
Which leads to the thought that maybe a completely failure-free system is not something we should strive for. After all, in a completely failure-free system, nobody would ever get enough experience groping around the edge of the envelope to learn how to intuit where the other edges are. All they'd have is "here there be Dragons!" warnings from the past, which would become less compelling the further into the past they receded. People are quick to discount warnings that contradict their personal experience, and if your experience is that the System never fails, it's not hard to imagine people starting to believe that the System cannot fail. Which is fine, until it does fail and nobody has any idea how to fix it.
It's sort of the same thing that happened to the financial sector in the US. After the Crash of 1929 and the Great Depression, a whole set of legal and institutional safeguards was put in place to prevent those things from happening again. But as time passed and generations grew up that had not experienced those crises directly, people began to decry those safeguards as needless bureaucracy. Eventually enough people did so that most of the safeguards were stripped away, at which point the system promptly collapsed again.