What Public Health Bureaucrats Could Learn from the NTSB
On December 28, 1978, United Flight 173 took off from JFK Airport in New York bound for Portland, Oregon, with a stopover in Denver. In the cockpit were Captain Malburn McBroom, First Officer Rod Beebe, and Flight Engineer Forrest Mendenhall. The plane landed in Denver on time and without complication. After deplaning the passengers from the first leg and boarding new passengers for the final leg, the DC-8 pulled away from the gate with enough fuel to cover the flight plus an extra 65 minutes in case of an unforeseen delay getting into Portland. United 173 took off from Denver’s Stapleton Airport at 2:47 p.m. with Beebe at the controls and McBroom acting as co-pilot. At 5:00 p.m. the flight neared Portland under ideal landing conditions. As the plane descended to 7,000 feet, Beebe called for the landing gear. When McBroom lowered the wheels, a small problem appeared: the indicator lamp for the right main landing gear didn’t light up. The crew wasn’t sure whether the landing gear was locked in place.
At 5:09 p.m. McBroom radioed the tower to report the problem and entered a holding pattern at 5,000 feet to diagnose and handle it. In the scheme of things, this was a relatively minor issue: even if they had to land with the right landing gear collapsing, they might damage a wing, but all the passengers would arrive safely.
McBroom glanced at the gauges and, seeing 12,000 pounds of fuel, shifted his attention to the landing gear. The crew consulted the thick flight manual for instructions. They considered retracting and re-extending the gear but feared they could compound the problem. They considered a low pass by the control tower for a visual inspection by the controllers but ruled that out because it was dark. At 5:30 p.m. McBroom called United maintenance in San Francisco. Maintenance had no useful information and indicated that the crew had done everything they could. It was now 5:49 p.m., and the crew still didn’t know whether the right landing gear was locked.
McBroom now shifted his attention to contingency plans for a rough landing. As they began to review their checklist, Mendenhall, with concern in his voice, reported that the fuel pump lights were starting to blink. McBroom reassured him that it was all right: the plane was in a turn, and the pump lights tended to blink as fuel sloshed around the sensors when the tanks were low. At 6:06 p.m. McBroom started to tell the tower he expected to land in five minutes, but he didn’t finish his sentence because the engine noise had changed. Over the next two minutes, one by one, each of the four engines shut down. The plane had run out of fuel.
McBroom took control of the noiseless jet as it steadily lost altitude. The runway was eight miles away, but without engines the distance was impossible to cover. At 6:13 p.m. Beebe declared a mayday, and at 6:15 p.m. United 173 ripped through two vacant houses and a patch of trees before coming to rest. Because the tanks were empty, there was no fire, and 179 of the 189 people on board survived the crash. Among the ten who perished was Flight Engineer Mendenhall. McBroom and Beebe both survived.
A Failure in Human Decision-Making
The crash baffled the aviation community. How could an experienced three-person crew run out of fuel before landing? They had over an hour of spare fuel, a relatively minor technical issue, and clear protocols for dealing with a landing gear failure.
A few months after the crash, the National Transportation Safety Board (NTSB) issued its report and concluded that the plane could have landed safely 30 or 40 minutes after the landing gear malfunction. The cause of the accident was the captain’s failure to monitor the plane’s fuel state and to properly respond to his crewmembers’ advisories regarding it.
This wasn’t a simple case of pilot error; it was a massive failure in human decision-making. When investigators listened to the cockpit voice recorder, it was clear that McBroom had attempted to keep track of everything by himself. He did not take advantage of the support offered by his crew and was so intent on solving the landing gear problem that he simply wasn’t listening to them. Consistent with the cockpit culture of the time, the captain was the expert, and crew members were merely instruments for executing his commands. In that culture, everyone understood that you didn’t challenge the captain. You did what you were told and you kept your mouth shut. Every pilot knew the two unwritten rules of the cockpit. Rule #1: the captain is always right. Rule #2: if the captain is wrong, see Rule #1.
But these two unwritten rules were contributing to too many crashes. An analysis of aviation accidents in the 1970s showed that poor human decision-making, enabled by an autocratic pilot culture, was causing needless accidents. While captains were experts, they weren’t infallible, and they were capable of senseless errors. Something had to be done. That something was called Crew Resource Management (CRM), and it would forever change the aviation industry for the better.
If You See Something, Say Something
In 1981, United Airlines trained every one of its pilots in CRM. The training emphasized that, in a crisis, the crew members are a team. While the captain is the leader, he or she is not a dictator. Crews were taught how to quickly distribute tasks, and everyone was encouraged to speak up. The expectation was made very clear: regardless of your rank or experience, if you see something, say something. In other words, it was the failure to speak up, rather than speaking up, that could lead to disciplinary action.
To reinforce the new behavioral expectations, pilots were videotaped in simulators as part of the training. The story is told that one captain, reviewing the training tape and seeing for the first time how others saw him, asked the instructor to stop the tape. He turned to his crewmates and asked, “Am I really like that?” When they replied yes, he said, “How can you work with me?” He knew he needed to change the way he managed his crews.
Today, CRM is the cultural norm across all airlines. Pilots understand that speaking up is not just a right but an obligation, and that no one has the authority to silence another team member. This understanding reflects the fundamental operating principles under which the NTSB has worked for decades. When investigating a crash, the NTSB routinely brings together representatives of everyone involved: aviation engineers, data analysts, airplane manufacturers, pilots, and air traffic controllers. While investigators often form hypotheses as they work, they readily abandon them if the data and facts say otherwise. Their goal is to prevent a specific type of accident from ever happening again, and they understand that the laws of physics have no regard for the egos and opinions of experts overly invested in their own thinking. NTSB teams don’t filter data to fit their theories; they adapt their theories to fit the facts. Similarly, when pilots follow the principles of CRM, they don’t simply defer to the captain’s thinking; they blend their collective intelligence by sharing observations and tasks to achieve better decisions, as evidenced by the outstanding safety record of commercial aviation in recent decades.
Lessons for Public Health Experts
We can only wonder how different things might be if public health bureaucrats had emulated the operating principles of CRM. Unfortunately, the behavior of the public health experts has more in common with the crew of United 173 than it does with a CRM-trained crew.
Like Captain McBroom, the bureaucrats have narrowed their focus to a single activity: mass vaccination. But just as safely landing the plane did not depend solely on whether the landing gear was locked, the safety of a population during a pandemic does not depend solely on the extent of vaccination. Other factors, such as natural immunity and the efficacy of treatments, also make important contributions to population safety.
Like Captain McBroom, the public health bureaucrats are not listening to or taking advantage of the support of other health professionals, especially front-line doctors. Instead, they are proactively suppressing valid data that does not fit their theory of the solution and coercively fostering a climate of compliance in which highly intelligent professionals are hesitant to raise questions or point out inconvenient truths for fear of reputational retaliation or, worse yet, loss of their licenses to practice medicine.
The essential dynamic that has enabled CRM to radically improve aviation safety is the obligation of all crew members to raise questions when they see that things aren’t working or could work better. This same dynamic could improve population safety if public health experts would let go of their version of “the captain is always right,” embrace the operating principles of CRM, and behave more like the leaders of the NTSB.