Accident and incident investigation – lessons from rocket science

The problem of cognitive bias

In his book “What Do You Care What Other People Think?”, the American physicist Richard Feynman describes his involvement in the investigation of the 1986 space shuttle Challenger disaster. Whilst the technical details are interesting, of relevance here are his concerns about the process of investigation and the cognitive biases that arise within it.

Feynman was working with a team of experts including astronauts Neil Armstrong and Sally Ride, an aeronautics professor, aerospace engineers and US Air Force personnel, supported by a team of other investigators. If even this expert and well-resourced team could fall prey to cognitive bias, how much more likely are we to make such errors when we’re short of time and under-staffed?

In Investigating accidents and incidents (HSG 245) the Health and Safety Executive (HSE) explains that an investigation “should be thorough and structured to avoid bias and leaping to conclusions.” We’ll look at some examples of bias and consider how to overcome its impact on accident and incident investigation outcomes.

Why we are biased

We don’t set out to jump to conclusions, or to stereotype some people as careless and others as heroes. However, our natural way of dealing with the world is to take short cuts in receiving and processing information. Our tendency to draw conclusions quickly from limited evidence developed through evolution as a survival technique. The primitive humans who survived were the ones who could decide quickly whether to run from an animal or to kill and eat it. Those who paused to weigh up the evidence missed a meal – or were eaten.

Confirmation bias

Long before the space programme was underway, Arthur Conan Doyle’s most famous fictional character, Sherlock Holmes, understood the concept of confirmation bias. “It is a capital mistake,” Holmes tells Dr Watson in A Study in Scarlet, “to theorise before you have all the evidence. It biases the judgement.”

If we start an accident investigation with a theory, we are drawn to look for evidence that confirms it and to ignore information that doesn’t. If I tell you a vehicle hit a pedestrian, has your brain already started to theorise about the cause? Perhaps you assume the driver was speeding? Or that the pedestrian wasn’t looking where they were going? But it might be the road layout, the lighting or the vehicle itself. The outcome of your investigation could depend on your first assumption, so it’s essential to set aside any theories before gathering information.

This was a problem Feynman identified in the Challenger investigation. The disaster emerged from a combination of factors, one of which was the failure of a small seal known as an “O-ring”. O-rings had been damaged on earlier flights, but those flights had succeeded despite the damage, so previous investigations of the damage had concluded it was safe to continue. As Feynman explained in his appendix to the report on the Challenger disaster:

“The acceptance and success of flights is taken as evidence of safety. But erosion and blow-by are not what the design expected. They are warnings that something is wrong. The equipment is not operating as expected, and therefore there is a danger that it can operate with even wider deviations in this unexpected and not thoroughly understood way. The fact that this danger did not lead to a catastrophe before is no guarantee that it will not the next time.”

Feynman uses a comparison which makes this point even clearer: “When playing Russian roulette, the fact that the first shot got off safely is little comfort for the next.”

To overcome this bias, use open questions, such as “What did you see?” or “What instructions were you given before you started this task?” And when we’re investigating non-injury incidents, we must remind ourselves that just because no one was hurt this time, it doesn’t mean no one could be hurt in the future.

Groupthink

You go to a meeting to discuss a recent accident, equipped with evidence of poor signage, damaged flooring and badly written procedures. But your manager says the operator was careless, and the six people speaking before you agree, with hearsay “evidence” of other times operators have been careless. It’s your turn. You want to suggest a review of written procedures, repairs to the floors, simpler signage – all measures that will cost money and take time. The others are suggesting a team brief to tell operators to be more careful. What do you do?

Solomon Asch’s 1950s experiments on line length showed that even where something is obvious – like one line being longer than another – some people feel compelled to agree with an incorrect majority verdict. Groupthink refers to this tendency to allow the desire for harmony in a group to override critical thinking and the consideration of alternatives.

If group pressure can lead us to call a longer line shorter, how much easier is it to be swayed by other people in an accident investigation, where the answers are more complex and the best solutions might require more resources?

In the Challenger investigation, Feynman saw how the desire for unanimity sometimes overrode the need to consider alternative explanations. Dissent felt disloyal, and that pressure nudged people towards agreeing even when their experience contradicted the consensus. Feynman overcame this by asking people for their views one-to-one rather than as a group. If you do the same, make it explicit that you are not asking for the official line – you want to hear as many alternative explanations as possible.

Another approach is to bring someone new into an investigation after the information-gathering phase. You will have been influenced by the order in which you collected the information, placing more weight on the earliest evidence (primacy) or on the most recently collected (recency).

Avoid primacy and recency effects by presenting the evidence (without theories) in the order events occurred. The newcomer (perhaps from outside your organisation) must be prepared to play devil’s advocate – to challenge thinking and look to see where gaps have been filled inadvertently with bias rather than information. In this way, the newcomer will be less subject to groupthink and can give a more objective view of the incident.

Hindsight bias

When you read anything about the Challenger disaster now it is hard to see how NASA missed the problem with the O-rings. It was surely obvious, even to someone without a degree in rocket science! When we think this, we are experiencing hindsight bias.

Unfortunately, some popular models of incident and accident causation promote hindsight bias. Picture a row of dominoes arranged so that when the first one is pushed, all the others fall. The implication is that every time the unsafe act happens (the first domino falls), the accident happens (the other dominoes fall). If this were true, the unsafe act would not keep happening – people would quickly learn that it always ends badly. The O-rings had been damaged before, but the missions succeeded. Workers took a short cut across the vehicle route hundreds of times before, and no one was killed. Workers might even have been rewarded, because skipping a safety step saved time and got the job done. But when there is an accident, the unsafe act is defined as the cause.

Approaches such as the “five whys” and the classification of immediate, underlying and root causes can also promote hindsight bias if care is not taken. Why did the shuttle break up? Because the O-rings failed in cold temperatures. Why was the launch permitted in cold temperatures when it was known the O-rings might fail? The only answer to that seems to be to blame someone.

After an accident, the decisions taken can look so obviously wrong it is hard to believe anyone would act that way. Instead of working backwards from an accident, start with the situation before the accident, and consider all the options that were available at that time. As an investigator, put yourself in the place of the people who had to make decisions. Ask “What would I have known? What information, training and resources did I have? What were people around me doing?” You need to keep asking questions until you understand why people made the decisions they made – until you believe you would have made the same decision as them with that information.

Conclusion

Avoiding bias as an individual investigator is particularly difficult, so work with a team of differently minded people where you can. The issues that led to the Challenger disaster were more complex than just the O-rings: there were multiple interacting systemic factors, just as there will be in any accident you investigate. Don’t let cognitive bias lead you to the simpler conclusion.

With a first degree in computer science and psychology, Bridget Leathley started her working life in human factors, initially in IT and later in high-hazard industries. After completing an MSc in Occupational Health and Safety Management, she moved full-time into occupational health and safety consultancy, training and writing.

Our one-day Accident Investigation course is designed to equip you with the practical skills and knowledge needed to conduct robust, unbiased accident and incident investigations. The course can be delivered via virtual classroom or in-company.

Ideal if:

  • You're a line manager, supervisor or safety representative
  • You have responsibility for investigating accidents in the workplace.

Book a course or request more information today