Last week, Popular Mechanics published a long, fascinating, but also controversial story that uses the final few minutes of transcripts from the recovered cockpit voice recorder of Air France Flight 447 to try to reconstruct the final moments of the crash off the coast of Brazil in June 2009. The transcripts come by way of a new edition of the book "Erreurs de Pilotage" (Pilot Error) by French flight instructor Jean-Pierre Otelli.
Mr. Otelli's latest edition drew a stern rebuke (PDF) in October from the French aviation safety bureau, the BEA (Bureau d'Enquêtes et d'Analyses pour la sécurité de l'aviation civile), for publishing sensitive information from the Air France 447 cockpit voice recorder. The BEA, in a press release, stated that:
"The BEA strongly condemns the disclosure of this transcription, which is a violation of Article 14 of the European Regulation of 20 October 2010 that came into effect on 2 December 2010. This transcription mentions personal conversations between the crew members that have no bearing on the event, which shows a lack of respect for the memory of the late crew members. The BEA safety investigation has not yet been completed and any attempt at interpretation at this stage is partial and, as a result, can only fan the flames of the controversies of the last few months, which is harmful to all concerned."
The BEA had published an edited transcript (PDF) of Air France 447's cockpit voice recorder in August. The BEA states that its final accident report is planned to be completed by June 2012.
The Popular Mechanics story cleverly uses the Otelli transcript to examine in depth the final 15 minutes or so of Flight 447, especially the increasing confusion and the inexplicable decisions being made in the cockpit.
For instance, the pilots apparently didn't realize (I say "apparently" because, again, no formal conclusions have been reached by the BEA) that their Airbus 330 was approaching, and then had entered, a stall condition. As the Popular Mechanics story relates:
"The Airbus's stall alarm is designed to be impossible to ignore. Yet for the duration of the flight, none of the pilots will mention it, or acknowledge the possibility that the plane has indeed stalled - even though the word 'Stall!' will blare through the cockpit 75 times. Throughout, Bonin [Pierre Cédric Bonin, the co-pilot the captain left in charge of the aircraft while he took his rest break] will keep pulling back on the stick, the exact opposite of what he must do to recover from the stall."
The Popular Mechanics story also indicates that the pilots at the controls weren't sufficiently familiar with, or trained in, the workings of their Airbus 330 aircraft to understand and diagnose what was going wrong in the situation they encountered and to "correct it." I say "correct it" because the evidence, as argued in the Popular Mechanics story, indicates that the crash was avoidable.
For instance, when the Airbus 330's pitot tubes first iced up, causing the aircraft's autopilot to disconnect, the pilots apparently neither recognized the cause nor knew what to do next. According to the Popular Mechanics story:
"Just then an alarm sounds for 2.2 seconds, indicating that the autopilot is disconnecting. The cause is the fact that the plane's pitot tubes, externally mounted sensors that determine air speed, have iced over, so the human pilots will now have to fly the plane by hand."
"Note, however, that the plane has suffered no mechanical malfunction. Aside from the loss of airspeed indication, everything is working fine. Otelli reports that many airline pilots (and, indeed, he himself) subsequently flew a simulation of the flight from this point and were able to do so without any trouble. But neither Bonin nor Roberts [David Robert, the other co-pilot] has ever received training in how to deal with an unreliable airspeed indicator at cruise altitude, or in flying the airplane by hand under such conditions."
The Popular Mechanics story in its conclusion also raises, without specifically calling it out, the specter of the automation paradox that I wrote about a few years ago for Spectrum magazine:
"But the crash raises the disturbing possibility that aviation may well long be plagued by a subtler menace, one that ironically springs from the never-ending quest to make flying safer. Over the decades, airliners have been built with increasingly automated flight control functions. These have the potential to remove a great deal of uncertainty and danger from aviation. But they also remove important information from the attention of the flight crew. While the airplane's avionics track crucial parameters such as location, speed, and heading, the human beings can pay attention to something else. But when trouble suddenly springs up and the computer decides that it can no longer cope - on a dark night, perhaps, in turbulence, far from land - the humans might find themselves with a very incomplete notion of what's going on. They'll wonder: What instruments are reliable, and which can't be trusted? What's the most pressing threat? What's going on? Unfortunately, the vast majority of pilots will have little experience in finding the answers."
The issue of whether the automation paradox came into play in the crash seems to be under investigation. BEA in September announced that it had formed a "Human Factors" working group:
"...to analyze all aspects connected to the conduct of the flight:
- Crew actions and reactions during the last three phases of the flight described in the third Interim Report (PDF), in particular in relation to the stall warning;
- Cockpit ergonomics;
- Man-machine interfaces."
The BEA expects the working group to complete its effort by the end of this month, and its report will be published with the final accident report.
If you are interested in what happened to Air France Flight 447, the Popular Mechanics story is worth a read, as are the comments, some of which come from Airbus 330 pilots who take issue with a couple of points raised in the article.
Contributing Editor Robert N. Charette is an acknowledged international authority on information technology and systems risk management. A self-described “risk ecologist,” he is interested in the intersections of business, political, technological, and societal risks. Along with being editor for IEEE Spectrum’s Risk Factor blog, Charette is an award-winning author of multiple books and numerous articles on the subjects of risk management, project and program management, innovation, and entrepreneurship. A Life Senior Member of the IEEE, Charette was a recipient of the IEEE Computer Society’s Golden Core Award in 2008.