Good Morning,
Time to talk safety. This week I have another interesting article from Aero Safety World that talks about the decision-making process and provides valuable insight into a subject we can all take advantage of.
Enjoy…………….
Management ultimately means calling the shots. In an airline organization, decision making lies with the management team, in particular with the accountable manager, who has the authority to ensure that all activities are financed and carried out in accordance with the applicable requirements, and who must establish and maintain an effective management system. On the flight deck, the responsibility lies with the pilot-in-command, and, as with senior executives' decisions, the tools matter.
The high-consequence nature of airline operations, with its ever-present threat of catastrophic failure, makes decision making a sensitive process.
A tragic decision-making failure related to flight safety is the 1986 Space Shuttle Challenger disaster. Flight STS-51L was launched from the Kennedy Space Center in Florida on Jan. 28, 1986, but never made it into space. The shuttle broke apart 73 seconds into its flight, leading to the deaths of its seven crewmembers. The spacecraft disintegrated over the Atlantic Ocean.
The accident occurred because an O-ring seal in the shuttle’s right solid rocket booster (SRB) failed at liftoff, and cold weather was a contributing factor. The O-ring failure caused a breach in the SRB joint it sealed, allowing pressurized hot gas from within the solid rocket motor to reach the outside and impinge on the adjacent SRB attachment hardware and external fuel tank. This led to the separation of the right-hand SRB’s aft attachment and the structural failure of the external tank. Aerodynamic forces rapidly broke up the orbiter.
According to the Rogers Commission Report, produced by a U.S. presidential commission charged with investigating the Challenger disaster, there also was a serious flaw in the decision-making process leading up to the launch of flight 51L: “A well-structured and -managed system emphasizing safety would have flagged the rising doubts about the solid rocket booster joint seal. Had these matters been clearly stated and emphasized in the flight readiness process …, it seems likely that the launch of 51-L might not have occurred when it did.” The commission report also said that “the waiving of launch constraints appears to have been at the expense of flight safety. There was no system which made it imperative that launch constraints and waivers of launch constraints be considered by all levels of management.”
The Challenger accident became a case study in academic fields such as engineering safety, safety communication and, most importantly, decision making.
In behavioral economics, the Challenger disaster is used as a case study in trying to boost efficiency in the way humans make decisions individually and collectively. Behavioral economics is a discipline derived from studies in economics as well as in psychology, and it “has demonstrated that people are motivated by impulses that are measurable and predictable, and often irrational.”1
Although the first research activities in the field of behavioral economics date back to the 1970s, until recently the discipline was limited to the academic world. However, since the 2008 global financial crisis, behavioral economics has started to receive more attention from policy makers and developers of risk models.
On the contribution of behavioral economics to more efficient decision making, Iris Bohnet, a behavioral economist and academic dean and professor of public policy at the Harvard Kennedy School, said, "Theories assuming rationality — such as the rational actor model used in economics — prescribe what is optimal but do not do a good job at describing how people really behave."
Critical analysis of decision-making styles, drawing on the considerable body of research produced by behavioral economists, can reduce the probability of flight safety failures like that of Challenger. Bringing sophisticated decision making, that is, decision making as free from biases as possible, into risk management optimizes the results of risk analyses and provides a more targeted treatment of risks.
The alternative to flawed decision making suggested by behavioral economics is sophisticated decision making, which assumes that individuals are biased and biases are systematic. “Biases, however, can be dealt with systematically,” Bohnet said. “There are theories built on these behavioral regularities which describe and predict how people actually behave. Sophisticated (although not fully rational) decision making implies understanding how to overcome biases to make quasi-optimal decisions.”
Common decision-making biases are numerous, and counterstrategies exist for many of them. We have elected to analyze three biases that are evident in aviation: non-systematic consideration of the available options, the availability heuristic and groupthink. We also will outline strategies for encouraging sophisticated safety-related decision making in aviation organizations.
Aviation organizations often require multiple management systems (including several trans-organizational systems), have dispersed operations, have many technical functions requiring skilled employees, and are highly regulated and characterized by overlapping state jurisdiction.2 Within this operational complexity, there is considerable room for inefficiencies in decision making due to the difficulty of accessing the full spectrum of options.
Counterstrategies include graphic techniques — such as fault tree analysis, decision trees and bowtie risk analysis — that are helpful in organizing thinking, and therefore decision making, systematically.
Fault tree analysis (FTA) was developed at the Bell Laboratories in the early 1960s and later was adopted and refined by, among others, the Boeing Co. FTA is a graphical tool for analyzing complex systems to determine potential failure modes and their probabilities of occurrence. FTA uses a logic block diagram with symbols to indicate various states. It is built from the top down, beginning with a potential failure mode. Pathways are used to interconnect events that contribute to the failure. These pathways use standard logic symbols. If the probability of failure of each component is known, a quantitative analysis can be performed.3 Figure 1 contains an example of a fault tree.
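To give a feel for that quantitative side, here is a minimal sketch in Python, assuming independent basic events and using made-up probabilities and a made-up gate structure (none of it drawn from the article): OR gates are folded with the complement rule, and AND gates by simple multiplication.

```python
# Minimal fault tree sketch: independent basic events, OR/AND gates only.
# The gate structure and probabilities below are illustrative, not from any real system.

def p_or(*probs):
    """P(at least one event occurs) for independent events."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*probs):
    """P(all events occur) for independent events."""
    result = 1.0
    for p in probs:
        result *= p
    return result

# Hypothetical basic events feeding a top-level failure mode.
seal_fails      = 0.001
sensor_fails    = 0.002
operator_misses = 0.05

# Top event: the seal fails AND the failure goes undetected
# (undetected = sensor fails OR operator misses the warning).
undetected = p_or(sensor_fails, operator_misses)
top_event  = p_and(seal_fails, undetected)

print(f"P(undetected) = {undetected:.4f}")
print(f"P(top event)  = {top_event:.6f}")
```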
Decision trees conceptually are very similar to fault trees. The main difference is that decision trees are not developed with strict regard to failure events, but more generally to allow for decisions to be made in a more systematic way, representing all paths that a decision maker might follow.
The symbols used in decision trees differ from, and are fewer than, those utilized in FTA (Figure 2). Decisions to be made are represented by squares. Chance events are represented by circles. Options available to the decision maker are represented by branches emanating from a decision node (square). Branches from decision nodes must guarantee that one alternative can be chosen. Outcomes of a chance event are represented by branches emanating from a chance node (circle). Outcomes must be mutually exclusive and collectively exhaustive (no other possibilities exist; probabilities have to sum to 1). Consequences are specified at the ends of the branches.4
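To make the fold-back arithmetic concrete, here is a small sketch in Python with hypothetical payoffs and probabilities (my own illustration, not from the article): chance nodes return probability-weighted expected values, and decision nodes pick the best-valued branch.

```python
# Minimal decision-tree sketch: squares (decision nodes) pick the best branch,
# circles (chance nodes) take probability-weighted expected values.
# All payoffs and probabilities below are illustrative only.

def evaluate(node):
    kind = node["type"]
    if kind == "outcome":                      # end of a branch
        return node["value"]
    if kind == "chance":                       # circle: expected value of its branches
        return sum(p * evaluate(child) for p, child in node["branches"])
    if kind == "decision":                     # square: best available alternative
        return max(evaluate(child) for _, child in node["branches"])
    raise ValueError(f"unknown node type: {kind}")

# Hypothetical example: delay a flight for maintenance vs. dispatch as-is.
tree = {
    "type": "decision",
    "branches": [
        ("delay for maintenance", {"type": "outcome", "value": -50}),
        ("dispatch as-is", {
            "type": "chance",
            "branches": [
                (0.95, {"type": "outcome", "value": 0}),
                (0.05, {"type": "outcome", "value": -2000}),  # costly in-flight turn-back
            ],
        }),
    ],
}

print(evaluate(tree))  # -50 vs. an expected -100: delaying is the better option
```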
Bowtie risk analysis (Figure 3) is an effective way of understanding and managing threats (called hazards in this method). The analysis, already in use in the aviation industry, consists of a simple methodology for framing an undesired event within a standardized scheme built from the following principal components: triggering events, avoidance barriers, hazards, hazardous events, recovery barriers and outcomes.
Bowtie risk analysis makes visible which essential controls should be in place and which still need to be provided and maintained. Bowties can support safety investigations, aid audit teams in tracking controls and enable staff to report when additional controls are needed.
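One lightweight way to keep those controls visible is to record the bowtie as structured data. The sketch below, in Python with entirely made-up content, lists threats and their avoidance barriers on the left of the top event, outcomes and their recovery barriers on the right, and then flags any path with no barrier recorded; it is only an illustration of the idea, not an operational bowtie.

```python
# Minimal bowtie record: threats/avoidance barriers on the left of the
# hazardous (top) event, outcomes/recovery barriers on the right.
# The content below is a made-up illustration, not an operational bowtie.

bowtie = {
    "hazard": "aircraft operating on a contaminated runway",
    "top_event": "loss of directional control on landing roll",
    "threats": {
        "late or missing runway condition report": ["ATIS/NOTAM review", "crew briefing"],
        "unstable approach": ["stabilized-approach gates", "go-around policy"],
    },
    "outcomes": {
        "runway excursion": ["arresting surface (EMAS)", "emergency response plan"],
        "gear damage": [],   # no recovery barrier recorded yet
    },
}

# Flag any threat or outcome that has no barrier recorded against it.
for side in ("threats", "outcomes"):
    for path, barriers in bowtie[side].items():
        if not barriers:
            print(f"Missing barrier ({side[:-1]}): {path}")
```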
Have a good week, thanks for letting the 3DB be a part of your week, and fly safe/be safe.
Robert Novell
August 18, 2014