Improving Human Decision Making
On December 28, 1978, United flight 173 took off from JFK airport in New York bound for Portland, Oregon with a stopover in Denver. In the cockpit were Captain Malburn McBroom, First Officer Rod Beebe, and Flight Engineer Forrest Mendenhall. The plane landed in Denver on time and without complication. After the first-leg passengers deplaned and new passengers boarded for the final leg, the DC-8 aircraft pulled away from the gate with enough fuel to cover the flight plus an extra sixty-five minutes in case of an unforeseen delay getting into Portland.
United 173 took off from Denver’s Stapleton Airport at 2:47 p.m. with Beebe at the controls and McBroom acting as co-pilot. At 5:00 p.m. the flight neared Portland under ideal landing conditions. As the plane descended to 7,000 feet, Beebe called for the landing gear. When McBroom lowered the wheels, a small problem popped up: the indicator lamp for the right main landing gear didn’t light. The crew couldn’t tell whether the landing gear was locked in place.
At 5:09 p.m. McBroom radioed the tower to report the problem and entered a holding pattern at 5,000 feet to diagnose and handle the situation. In the scheme of things, this was a relatively minor issue: even if they had to land with the right gear collapsing, they might damage a wing, but the passengers would arrive safely.
McBroom glanced at the gauges, and seeing that there were 12,000 pounds of fuel, he shifted his attention to the landing gear. The crew consulted the thick manual for instructions. They considered retracting and re-extending the gear but feared they could compound the problem. They considered a low pass by the control tower for a visual inspection by the controller but ruled that out because it was dark. At 5:30 p.m. McBroom called United maintenance in San Francisco. They didn’t have any useful information and indicated that the crew had done everything they could. It was now 5:49 p.m. and the crew still didn’t know if the right landing gear was locked or not.
McBroom now shifted his attention to contingency plans for a rough landing. As they began to review their checklist, Mendenhall, with concern in his voice, reported that the fuel pump lights were starting to blink. McBroom reassured him that it was alright: the plane was in a turn, and when fuel was low, the pump lights had a tendency to blink as the fuel sloshed around the sensors. At 6:06 p.m. McBroom started to tell the tower he expected to land in five minutes, but he didn’t finish his sentence because the engine noise had changed. Over the next two minutes, one by one, each of the four engines shut down. The plane had run out of fuel.
McBroom took control of the noiseless jet as it steadily lost altitude. Although the plane was just eight miles from the runway, it could not cover the distance. At 6:13 p.m. Beebe declared a mayday, and at 6:15 p.m. United 173 ripped through two vacant houses and a patch of trees before coming to rest. Because the tanks were empty, there was no fire. Remarkably, 179 of the 189 people on board survived the crash. Among the ten who perished was Flight Engineer Mendenhall. McBroom and Beebe both survived.
The crash baffled the aviation community. How could an experienced three-person crew run out of fuel before landing? They had over an hour of spare fuel, a relatively minor technical issue, and clear protocols for dealing with a landing gear failure.
A few months after the crash, the National Transportation Safety Board (NTSB) issued its report and concluded that the plane could have landed safely 30 or 40 minutes after the landing gear malfunction. The cause of the accident was the captain’s failure to monitor the plane’s fuel state and to properly respond to crewmember advisories regarding the fuel state.
This wasn’t a simple case of pilot error; it was a massive failure in human decision-making. When the investigators listened to the cockpit voice recorder, it was clear that McBroom had tried to keep track of everything by himself. He did not take advantage of the support offered by his crew and was so intent on solving the landing gear problem that he simply wasn’t listening to them. Consistent with the cockpit culture of the time, the captain was the expert, and crew members were merely instruments for executing the captain’s commands. In that culture, everyone understood you don’t challenge the captain.
Nevertheless, this deference was proving deadly. An analysis of aviation accidents in the 1970s showed that poor human decision-making, enabled by an autocratic pilot culture, was causing needless crashes. While captains were experts, they weren’t infallible, and they were capable of senseless errors. Something had to be done. That something was called Crew Resource Management (CRM), and it would forever change the aviation industry for the better.
In 1981, United Airlines trained every one of its pilots in CRM. The training emphasized that, in a crisis, the crew members are a team. While the captain is the leader, he or she is not a dictator. They were taught how to quickly distribute tasks, and everyone was encouraged to speak up. The expectation was made very clear: regardless of your rank or your experience, if you see something, say something. In other words, it was the failure to speak up rather than speaking up that could lead to disciplinary action.
Today, CRM is the cultural norm across all airlines. Pilots understand that speaking up is not just a right but an obligation, and that no one has the authority to silence another team member. They don’t defer to the thinking of the captain. They understand that the laws of physics have no regard for the egos and opinions of experts overly invested in their own thinking. Instead, they blend their collective intelligence by working as a team, and these teams don’t filter data to fit their theories; they adapt their theories to fit the facts. This shift to a team-based mindset, one focused on asking the right questions and forging a shared understanding of how to handle stressful, complex problems, has greatly improved decision making in the cockpit, as evidenced by the outstanding safety record of commercial aviation in recent decades.
To learn more about team-based decision making, see my new book Nobody Is Smarter Than Everybody: Why Self-Managed Teams Make Better Decisions and Deliver Extraordinary Results.