A bit more information about the possible cause of the Washington DC Metro crash that killed nine and injured nearly 80 people came out yesterday afternoon in the Washington Post. The National Transportation Safety Board (NTSB) said that, after performing standardized tests, it found "anomalies" in one of the sensor circuits along the track where the crash occurred, a circuit that communicates information to a subway car as well as back to Metro's central computer system.
The Washington Post also reports that: "A senior Metro official knowledgeable about train operations said an internal report confirmed that the computer system appeared to have faltered."
Not sure exactly what "faltered" means, however, since it wasn't explained in the story.
The NTSB also confirmed that the operator did apply the train's brakes, but it has not yet been able to determine for how long. Nor has it said whether the train operator, who was relatively inexperienced (she was 18th from the bottom of 523 operators in seniority), was distracted.
Even if she wasn't, it does bring up an interesting issue: humans acting as the last line of defense in a computer-controlled system. If you were sitting as operator in that train and it started to move on its own, how long would it be before you noticed that it shouldn't be moving, that in fact a serious anomaly had occurred?
As an inexact example, how long is it before you recognize that a pause in your PC's operation means that there is a problem, rather than it just being slow?
So, if you were a new train operator, what would you believe? The computer that was supposed to control the train, and which probably had always "worked" before, or what was in front of your eyes? And would your first thought be that the computer had malfunctioned, or that it was something you did (or didn't do)?
Shades of the Vincennes incident.
I'm speculating of course, but I wonder whether experienced Metro train operators who had been around since the 2005 Metro computer incident would have caught on a little more quickly that something was wrong.
I trust the NTSB will look at Metro operator training in regard to dealing with computer anomalies. I would also be curious about how much time the operator had spent driving a train in manual versus automatic, computer-controlled mode.
It will also be interesting to see whether crash investigators determine that the flight crew of Air France Flight 447 was caught out by a computer problem that went unrecognized until it was too late.
There is a very, very old joke about airlines of the future having only two occupants in the cockpit, a dog and a pilot. The pilot is there to feed the dog; the dog is there to bite the pilot if he touches anything.
Maybe we need another dog to bite the pilot if he or she doesn't touch the controls when necessary.
Robert N. Charette is a Contributing Editor to IEEE Spectrum and an acknowledged international authority on information technology and systems risk management. A self-described “risk ecologist,” he is interested in the intersections of business, political, technological, and societal risks. Charette is an award-winning author of multiple books and numerous articles on the subjects of risk management, project and program management, innovation, and entrepreneurship. A Life Senior Member of the IEEE, Charette was a recipient of the IEEE Computer Society’s Golden Core Award in 2008.