French aviation authorities have released the final report on the June 1, 2009 crash of Air France 447, one of aviation’s most dramatic and troubling disasters. And though earlier reports painted a fairly clear picture of what transpired, there is one surprise in the final documents—the extent to which the authorities blame not mechanical failures but the actions of the flight crew. The Bureau d’Enquêtes et d’Analyses (BEA) suggests that the accident “shows the limits of the current safety model.” That is, there is an element of fatal risk so deeply baked into modern aviation that it may be unfixable.
A quick recap of the Air France investigation: As PM’s December 2009 cover story reported, suspicion initially centered on the aircraft’s pitot tubes. Data sent by the aircraft in the moments before its disappearance suggested that these airspeed sensors had iced up and stopped working. Most aviation experts assumed that the pitot tubes’ failure and the bad weather (the crew had flown into a severe thunderstorm) were the key factors in this disaster.
But when searchers finally recovered the plane’s flight data recorder and cockpit voice recorders (commonly called black boxes) from the ocean depths in early 2011, those recorders painted a much different picture. Investigators determined that the loss of airspeed was in fact a minor hiccup that would not have endangered the flight if the pilots had followed proper procedures. What doomed the flight was a series of almost incomprehensible mistakes on the part of the flight crew. Two were particularly astounding. First, a co-pilot at the controls pulled the plane up into a climb—simply leveling out the aircraft could have saved it. Second, he and the two other pilots on the flight deck failed to realize that, as a result of climbing, the plane had entered an aerodynamic stall and begun plummeting toward the ocean.
The BEA’s final report offers revealing insight into the psychological factors behind these failures. For instance, of the numerous indications that should have tipped off the pilots that their plane had stalled, the most obvious was a loudspeaker in the cockpit blaring the word “Stall!” every few seconds. Yet, incredibly, the flight crew seemed not to have even noticed it.
How? We’ll never know what the pilots were thinking, of course. But as psychologists have long understood, stress hinders the ability of the brain to process information and to pay attention to multiple things at once. And when human beings are under stress, they tend to prioritize visual information over auditory information and to disregard cues that are unusual or deemed untrustworthy. Once the pilots of AF447 lost faith in the reliability of their airspeed sensors, they apparently distrusted all their other instruments as well and found it particularly easy to ignore the stall-warning horn.
If they had been able to listen and understand the stall warning’s significance, then returning the plane to normal flight would have been a straightforward matter. Instead, as the BEA report puts it: “The failure of the attempts to understand the situation and the destructuring of crew cooperation fed on each other until the total loss of cognitive control of the situation.”
There may be no scarier or more telling phrase in the AF447 final report than “total loss of cognitive control.” One of the fundamental premises of modern aviation, in which so many flight controls are automated, is that the human pilots are onboard a flight to monitor the aircraft’s performance and to take corrective steps if something goes wrong. As the BEA report says:
When crew action is expected, it is always supposed that they will be capable of initial control of the flight path and of a rapid diagnosis that will allow them to identify the correct entry in the dictionary of procedures. A crew can be faced with an unexpected situation leading to a momentary but profound loss of comprehension. If, in this case, the supposed capacity for initial mastery and then diagnosis is lost, the safety model is then in ‘common failure mode.’ During this event, the initial inability to master the flight path also made it impossible to understand the situation and to access the planned solution.
As AF447 demonstrates, if a flight crew lacks the training and the cognitive resources to figure out what the problem is, they not only won’t be able to do anything useful, but they could also turn a minor crisis into a catastrophe.
The final report includes a long list of changes to equipment and training procedures aimed at preventing a repeat of the Air France 447 scenario, in which a jet airliner stalls at high altitude. But the greater issue remains unaddressed. No matter how many possible scenarios a training program can simulate, pilots will continue to find themselves in unexpected circumstances. They’ll have to respond creatively to novel problems and figure out a way to get the plane and its passengers to safety.
Modern civil aviation remains incredibly safe. And thanks to investigations like this one, it will continue to get safer all the time. But the AF447 catastrophe is a chilling reminder that human beings are prone to screw up when they’re needed most.
This is a cross-posting from the Popular Mechanics web site.
Jeff – the BEA report appendix 2 – the FDR – shows Bonin taking “priority” on the control stick only once – a couple seconds before finality. Yet your December article stated he had taken it earlier and that Robert was unaware Bonin had taken priority. Q: Why does the BEA report not note that priority was taken earlier by Bonin? Can you comment?
Bonin was the “pilot flying,” so he had the controls when the autopilot turned off, and it was he who initiated the sharp climb that led to all the trouble. Later he passed control back and forth with Robert, and at the very end took it back again, as you note. I guess this appendix to the final report doesn’t include every detail.
There was one US NTSB AAR with similar circumstances: Northwest Airlines B727-251, N274US Near Thiells, New York on 12/1/1974.
It was an aircraft repositioning flight with a crew of three; the passenger cabin was empty. The crew neglected to activate the pitot head heaters.
As they climbed through the clouds and above them, the water in the pitot tubes froze. The result was a falsely high indicated airspeed (IAS).
The pilot flying believed the IAS indicator and increased the angle of attack to bleed off the apparent excess speed, which only seemed to grow as they climbed. The actual airspeed decayed until the aircraft stalled at 24,800 feet MSL, then entered an uncontrolled spiral descent to ground impact without the flight crew ever recognizing the actual conditions.
I suspect the lack of passengers and cabin crew was instrumental in the subdued media attention the accident received.
For a number of subsequent years, NWA pilot training stressed attitude and power control as the optimal resolution of similar situations.
Very interesting! Of course being nearly 40 years ago, I imagine it was the era of steam gauges, so the information would have been coming to the flight crew in a very different manner — there wasn’t the need then to try to interpret what the computer knew and didn’t know. At any rate, thanks for the tip, I’m going to look into that one.
Interesting article, Jeff. You summed it up well by stating that human beings are prone to screw up when they’re needed most.
Also, some have speculated that the Airbus side-stick controls may have contributed, since they give the second pilot no tactile feedback and only limited visual feedback. So when the less experienced pilot flying was pulling back on the stick and pitching the nose up, the second pilot did not realise it: he felt nothing through his own stick, and in the pitch-dark night there was no visible horizon to serve as a visual reference.