A sunk cost is essentially the price already paid and should have no bearing on future decisions; however, humans are hardwired to consider these irrelevant costs when making decisions. - Heyne, 1977
For example, rapid intervention is a prominent and accepted strategy to “protect our own” when operating at a working emergency incident. We place an incredible amount of stock in rapid intervention as an effective strategy for rescuing firefighters who cannot rescue themselves. Yet it has been demonstrated repeatedly that rapid intervention is a high-risk, low-payoff activity.
What makes us as fire professionals believe that, if we cannot make decisions on an emergency incident that prevent firefighters from becoming lost, trapped, or injured, we can then manage an even more complex situation and effect a successful rapid intervention deployment and rescue?
Murphy’s Law states that “what can go wrong usually does go wrong.” The reality is that what can go wrong usually goes right. Because of this, the more we tempt fate, the more confidence we develop that our avoidance of a bad outcome is based on rational decision-making rather than luck. However, when things do go wrong on an emergency incident, the cause can, with almost absolute certainty, be attributed to human error (Hallinan, 2009). Yet we seem reticent to admit this and confront the actual problem: human decision-making. Our intuitive biases drive our perceptions and decision-making processes, often causing us to ignore salient information while placing emphasis on unimportant artifacts.
If we are willing to at least consider the possibility that as human beings we are limited in our ability to recognize, analyze, formulate and then implement good decisions, we should honestly address the reasons we are biased toward making bad decisions. Fortunately, behavioral science theory sheds much light on decision-making biases that plague the fire service, and the ideals of Crew Resource Management are perfectly suited to provide firefighters an applicable framework to overcome these cognitive biases when confronted with complex situations. Michael Roberto’s research into a Mt. Everest expedition that had a tragic outcome pointedly identifies three cognitive biases that influenced decision-making and are particularly instructive for the fire service.
The three biases Roberto identified are failure to ignore sunk costs, recency effect, and overconfidence bias. These are all root causes of bad decisions, and their effects are only magnified under stress.
Failure to Ignore Sunk Costs
No matter your experience or expertise, Roberto’s research makes it clear that we are all susceptible to biases that affect decision-making. The first factor he develops is the failure to ignore sunk costs when making decisions. “Sunk costs effect refers to the tendency for people to escalate commitment to a course of action in which they have made substantial prior investments of time, money, or other resources,” according to Roberto.
On Everest that fateful day in May of 1996, as Roberto describes it, the expedition was making a final push for the summit. The trek from Camp IV to the top and back takes climbers about 18 hours. Because of this, climbers set out for the summit in the very early morning hours, so they can summit and return to Camp IV before nightfall. The expedition leaders, both highly experienced climbers, set a hard turn-around time for the climbers of two o’clock in the afternoon. This was a safety precaution, because descending the mountain in the dark is extremely dangerous.
At two o’clock on May 10th the climbers had not reached the summit, which under the two o’clock rule meant they had to abandon the summit attempt and return to Camp IV. However, the two expedition leaders and some of the climbers decided to ignore their own rule and continue the climb. The sunk costs influencing this decision included the incredible time, effort, and money the climbers had expended for the opportunity to summit Everest, which they used to justify continuing the climb. This proved to be a fatal decision, as weather moved in and the climbers were unable to reach safety, dying on the mountain.
Failure to ignore sunk costs is a phenomenon that can adversely affect fire ground operations, as crews continue to pour resources into futile efforts based on the resources and effort already expended (i.e. sunk costs). In such examples, firefighters who have worked incredibly hard to extinguish a fire or make a rescue often justify taking additional risk or “escalating their commitment” to a failed strategy based on sunk costs, rather than recognizing that the energy expended should have no bearing on their future strategic or tactical decisions.
How often have you heard an officer or crew request an extra minute from command after being ordered to leave the structure because they think they have “almost got it”? This is an example of sunk costs driving future decisions. The building does not care how hard you have worked or the number of resources used to extinguish the fire. When the final structural element is compromised, the building is coming down. This is no different than the gambler who continues to throw good money after bad in a vain attempt to make up for growing losses.
Recency Effect

The second factor that leads to bad decision-making is recency effect: our natural bias to place too much emphasis on recent events when making decisions (Roberto, 2002). Recency effect biased the Mt. Everest expedition leaders as they debated whether to turn back or continue the climb, having not reached the summit by two o’clock. Major storms that generate high winds, zero visibility, and sub-zero temperatures are common occurrences on Mt. Everest, but in the years leading up to this expedition such conditions had been rare during the peak climbing season, a significant departure from normal weather patterns.
The expedition leaders had enjoyed great success leading climbers to the summit during these years. When it came time to decide whether to continue the climb on May 10th, they reasoned that weather conditions would remain good as they had for recent expeditions.
However, storms on Everest are a significant reason more than 160 people have died attempting to reach the summit. The expedition leaders failed to recognize the differences between previous climbs and this summit attempt. On each prior climb they had been fortunate to avoid such weather dangers, and this run of good luck gave them confidence that the same conditions would hold on this expedition, which proved very wrong.
Firefighters often fall victim to recency effect, as we place an incredible amount of emphasis on prior experiences. Experience is an important component of making good decisions, but every fire or rescue event is independent. The danger of recency effect is not recognizing the differences that may exist from one incident to the next.
For example, just because the last bowstring truss roof did not collapse under fire conditions does not mean the next one won’t. However, we are very willing to justify future decisions on previous experiences that may not actually be similar, because this is an easy and often low-cost means of making decisions. Again, what can go wrong often does not; we should recognize such outcomes as luck rather than good decision-making.
Overconfidence

The final and most influential factor associated with poor decision-making is overconfidence (Hallinan, 2009). Recency effect clearly contributes to overconfidence, as firefighters continue to get away with poor decisions, creating a feeling of invincibility. In addition, the culture of the industry breeds overconfidence. The most beloved and overused word associated with effective firefighting is “aggressive.” Unfortunately, aggressive probably better describes someone who is reckless, takes unnecessary risks, and fails to consider the “big picture.” We compound this culture of “aggressive” firefighting with a sincere belief that a rapid intervention team will miraculously appear if a firefighter gets into trouble.
I dare say that a rapid intervention deployment in even the most benign circumstances is incredibly complex. Think about the circumstances that must exist to prevent the average firefighter or fire crew from self-extricating from an emergency situation: high heat, low visibility, collapse or pending collapse. Most fire departments struggle to effectively deploy a rapid intervention team even under training conditions, where the intensity is far lower, heat is usually non-existent, and fire is not an issue; yet we continue to have confidence in rapid intervention as an effective strategy to rescue downed firefighters. It is this overconfidence in unproven and suspect strategies and tactics that influences and alters our decision-making on the fire ground.
We as humans are limited in our ability to recognize and analyze information, especially under stressful conditions. This limitation is only compounded when individuals must also contend with group dynamics. Failure to ignore sunk costs, recency effect, and overconfidence all bias our decision-making. Recognizing these factors is just the beginning of creating a culture and environment that consistently stimulates good decisions. The fire service must also deal with the groupthink problem and the vertical hierarchy that inhibits the open and honest flow of information up and down the chain of command.
The airline industry has been dealing with the pitfalls of groupthink and cognitive bias in decision-making for decades. Its research in the late 1970s into a myriad of airline crashes pinpointed human error as the main cause of these incidents. Astoundingly, in almost every situation investigated, the co-pilot or engineer in the cockpit knew the plane was going to crash. It seems insane that trained pilots would sit back and let their aircraft crash, but this was in fact happening, as the pilot’s word was final and not to be questioned.
The airline industry recognized how catastrophic cognitive bias and groupthink are when managing complex operations and took action. Crew resource management (CRM) was developed to combat this culture and is incredibly applicable to the fire service.
The failure to have open and honest communication contributes significantly to poor decision-making, whether on the fire ground, in the cockpit, or on the side of the most treacherous mountain in the world. This was exemplified by Roberto in his final analysis of the decision-making that occurred during the tragic expedition to the 29,029-foot summit of Mt. Everest. Throughout the preparation and during the climb, Roberto notes that the two highly experienced expedition leaders made it clear to the group that their decisions were final and were not to be questioned. This ultimatum went unchallenged, as the two expedition leaders had an impeccable track record of guiding groups to the summit of Mt. Everest.
Those in the group with less experience and expertise recognized the knowledge gap and willingly submitted their lives to the decisions made by these two individuals. When the expedition leaders decided to break their own rule, the two o’clock rule, no one questioned the decision. However, interviews after the fact with climbers who were forced to turn back revealed that many were uncomfortable with continuing the climb and were somewhat confused by the expedition leaders’ blatant disregard for their own rule.
However, no one questioned the decision because it had been made clear that the expedition leaders would not tolerate such discussions. This failure to speak truth to power, and the leaders’ refusal to be questioned, ultimately cost five lives.
CRM to Improve Decision-Making
Inculcating a culture that follows the CRM model to open up lines of communication could significantly benefit the fire service, if we are to improve decision-making and reduce the number of firefighter injuries and deaths at emergency incidents. It is critical to recognize that CRM is not about subordinating decision-making or even “flattening” the hierarchy; it is about recognizing that we as humans are incredibly limited and subject to cognitive bias when making decisions.
Each person views the world differently, which is an incredibly powerful virtue of human beings if used effectively. CRM is an effort to break communication barriers, so that important information for decision-making is transferred to the decision-makers. Experience and expertise are valuable but can also contribute to cognitive bias. Each member of a team, no matter their experience or expertise, can contribute important information to a decision, especially when the group is working under stressful conditions.
In fact, it is often an inexperienced person, less inhibited by recency effect or overconfidence, who recognizes critical developments at an event. However, if that information is not communicated, for whatever reason, it is worthless.
An effective leader, whether on or off the fire ground, empowers team members to voice concerns and participate in decision-making. Firefighters are often quick to object that when fighting fires or performing high-stress activities there is no time for a debate, or that an officer who solicits others’ opinions may be perceived as weak. Again, utilizing CRM is not about having a long, drawn-out debate. It is about empowering subordinates to communicate important information to the decision-maker.
Imagine if the Mt. Everest expedition leaders had fostered a more collegial environment and encouraged the other climbers to voice concerns. It might have taken only one of the climbers, who we know in hindsight were uncomfortable with the decision to continue the climb, to question the potential weather changes, the risks of descending at night, or the wisdom of violating the two o’clock rule, and thereby break the cognitive biases that ultimately guided the poor decision-making.
There is no debate involved in this transaction, just the simple and straightforward communication of different perspectives. It is no different on the fire ground. Officers should want and encourage firefighters to communicate their observations. An officer who respects and trusts his or her crew will foster such an environment to improve decision-making. “Moreover, it means that team members do not believe that the group will rebuke, marginalize, or penalize individuals for speaking up and challenging prevailing opinions,” according to Roberto.
However, an officer who is arrogant enough to believe that he or she always knows best is unlikely to receive any information from crew members, and is likely to “crash the plane.”
It is time to change the paradigm for how we train firefighters and officers and to alter the culture of the fire service. Recognizing our limited ability to evaluate large amounts of information, an ability further compromised by stress, is a first step toward shifting how we make decisions. Let’s stop being so reactive in our techniques to protect firefighters and redirect that energy toward being proactive. No rapid intervention team, bright green traffic vest, governed truck, or bailout rope will ever be an effective means to compensate for poor decision-making at emergency incidents.
These devices and techniques may make us feel better and provide plausible deniability when things go bad, but the real solution to creating a more effective fire service is inculcating a culture that demands communication and recognizes the failure to ignore sunk costs, recency effect, and overconfidence as real threats to effective decision-making.
Carr Boyd is currently a Captain with the Charlotte Fire Department, an Adjunct Faculty member at the University of North Carolina at Charlotte, an instructor for the National Fire Academy, and serves as an affiliate with Academy Leadership, LLC facilitating leadership development programs for public and private sector organizations. He is a Nationally Registered Paramedic, has a Master of Public Administration degree, and a Ph.D. in Public Policy.