It's systemic because the whole point of procedures and policies is to avoid the failure of any one part of the system.
Imagine the power goes out at your company's data center, one of the backup generators fails to start, and as a result your website goes down and you lose $10k/hour. Would management shrug that the generator is an individual outlier and just hope it doesn't happen again? My guess is that a set of policies would go into place such that even if a single generator did fail, the website would not go down. You now have a systemic fix to what was quite likely a very rare systemic problem. Just because it's rare doesn't mean it's not systemic.
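To make the analogy concrete, here's a minimal sketch (hypothetical names, not any real infrastructure code) of the policy described above: power restoration tries each redundant source in turn, so a single generator failing to start doesn't take the site down.

```python
def start_generator(name, healthy):
    """Pretend to start a generator; returns True if it comes up."""
    return healthy

def restore_power(generators):
    """Try each redundant source in order; tolerate individual failures."""
    for name, healthy in generators:
        if start_generator(name, healthy):
            return name  # first source that actually starts carries the load
    raise RuntimeError("total power loss: every redundant source failed")

# Generator A fails to start (the rare outlier), but B covers for it,
# so the outage never propagates to the website.
print(restore_power([("gen-a", False), ("gen-b", True)]))  # → gen-b
```

The point of the sketch is that the fix is structural: no one asks gen-a to promise to do better next time; the loop itself makes any single failure survivable.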
> It's systemic because the whole point of procedures and policies is to avoid the failure of any one part of the system.
I can see that point and agree with it. (Well, I'd say the point is to prevent the failure of any one part of the system from producing unacceptable outcomes rather than preventing the failure of any one part of the system, but that's a minor quibble.)
I think my more significant disagreement with the original post is its description of the failure as systemic and having little to do with the individual. I'll agree that the fact that the system allowed an individual problem to produce a catastrophic undesired outcome is a systemic problem that (provided a reasonable correction is available) ought to be addressed. But that doesn't mean the problem had little to do with the individual(s) involved. The fact that it is an outlier means you aren't dealing with the normal behavior of the system, and if you are going to address this kind of outlier event, you need to understand the contribution of the special circumstances that produced it, of which, again, the contributing features of the individuals involved are quite likely relevant components. You can't effectively address the systemic issue if you view it as unrelated to the individuals.
> Well, I'd say the point is to prevent the failure of any one part of the system from producing unacceptable outcomes rather than preventing the failure of any one part of the system
That's effectively what I was trying to say. I didn't get all the way there, though, and you've stated it quite nicely.
I guess part of the reason people are suggesting this is indicative of systemic problems is that there's very little in the way of compounding factors. Yes, the glide path indicator was non-operational, but that didn't cause 20 crashes that day. It had been out for quite some time (http://www.latimes.com/local/lanow/la-me-ln-equipment-out-of...), and many other planes landed safely. From what I understand (http://www.weather.com/news/san-francisco-plane-crash-weathe...), the weather was fairly clear, so that wasn't a factor either.
A lot of the analysis I've read seems to indicate that this was basically a rookie-esque mistake, one that should never have been possible. There should have been 20 checks that would have kept someone capable of that mistake from being authorized to fly the plane, but for whatever reason, none of them caught it. At that point it's a systemic problem.
I would be looking at this very differently if a bunch of bad circumstances beyond his control had converged in completely unforeseeable ways and resulted in a crash. But from what I can tell, this was a very nice day to be flying around SFO.
EDIT: I just realized that I got really off-topic. Whoops!