The Field Guide to Understanding 'Human Error'
Summary
Sidney Dekker's "The Field Guide to Understanding ‘Human Error’" is a compelling call to action for organizations to abandon outdated and ineffective approaches to safety and embrace a more nuanced, systemic understanding of human error. Dekker argues that the traditional "Old View" of human error, which focuses on blaming individuals, is unjust and counterproductive to creating a truly safe and resilient organization. Instead, he advocates for a "New View" that recognizes human error as a symptom of deeper systemic issues, urging organizations to shift their focus from blame to understanding, learning, and improvement.
Dekker's guide is a comprehensive exploration of the complexities of human error, delving into the psychological, organizational, and systemic factors that contribute to accidents and incidents. He challenges common assumptions, debunks popular myths, and provides practical guidance for organizations seeking to create a more just and effective safety culture.
Five Key Ideas with Detailed Explanations and Quotes
Human Error is a Symptom, Not a Cause:
Explanation: The traditional view of human error treats it as the primary cause of accidents, focusing on individual mistakes and blaming those involved. Dekker argues this approach is flawed, failing to acknowledge the complex interplay of factors contributing to incidents. Instead, he proposes that what we label "human error" is a symptom of deeper systemic issues, such as organizational pressures, inadequate training, poor communication, or flawed system design.
Quotes:
"The behavior which we call ‘human error’ is not a cause of trouble. It is the consequence, the effect, the symptom of trouble deeper inside your organization."
"What we call ‘human error’ is a symptom of deeper trouble."
"‘Human error’ is an attribution, a judgment that we make after the fact."
"Underneath every simple, obvious story about ‘human error,’ there is a deeper, more complex story about the organization."
"‘Human error’ is information about how people have learned to cope (successfully or not) with complexities and contradictions of real work."
Why it Matters: This fundamental shift in perspective is crucial for moving beyond simplistic explanations and addressing the root causes of accidents. By recognizing human error as a symptom, organizations can investigate the underlying systemic issues contributing to unsafe conditions.
Understanding the Context of Human Error:
Explanation: Dekker emphasizes the importance of understanding the context in which errors occur. He argues that people's actions are shaped by their tools, tasks, and environment. To truly understand why an error occurred, we must consider the individual's perspective, understanding of the situation, the pressures they were facing, and the resources available.
Quotes:
"Behavior is systematically connected to features of people’s tools, tasks, and operating environment."
"If it made sense for people to do what they did, then it may make sense for others as well."
"[The New View] tries to understand why people did what they did. [It] asks why it made sense for people to do what they did."
"To understand error, take the view from the inside of the tunnel and stop saying what people failed to do or should have done."
"Failures can only be understood by looking at the whole system in which they took place."
Why it Matters: Understanding the context of human error helps us avoid hindsight bias, which leads to oversimplification and misattribution of blame. By considering the individual's perspective and the surrounding circumstances, we can gain a more accurate and nuanced understanding of why an error occurred.
Learning from Errors, Not Punishing Them:
Explanation: Dekker advocates for a learning-focused approach to human error, emphasizing the importance of creating a culture where employees feel safe to report errors and near misses without fear of punishment. He argues that punishing individuals for errors discourages open communication and hinders the organization's ability to learn from mistakes and prevent future incidents.
Quotes:
"New View investigations have the following characteristics: they are not about individual practitioners; they open a window on problems that all practitioners may be facing. Their “errors,” after all, are symptoms of systemic problems that all practitioners may be exposed to; they are not a performance review; they are not about discipline or blame; they are a learning opportunity."
"Getting rid of Bad Apples tends to send a signal to other people to be more careful with what they do, say, report or disclose. It does not make ‘human errors’ go away, but does tend to make the evidence of them go away; evidence that might otherwise have been available to you and your organization so that you could learn and improve."
"Discipline without understanding the problem is ineffective."
"The challenge is to create a culture of accountability that encourages learning. Every step toward accountability that your organization takes should serve that goal. Every step that doesn’t serve that goal should be avoided."
"Accountability can mean letting people tell their account, their story."
Why it Matters: Creating a learning-focused environment is essential for fostering a culture of safety and resilience. By encouraging open communication and embracing errors as opportunities for learning, organizations can identify systemic weaknesses and implement effective solutions to prevent future incidents.
Shifting from Backward-Looking to Forward-Looking Accountability:
Explanation: Traditional accountability systems often blame individuals for past events, leading to punishment or dismissal. Dekker argues that this backward-looking approach is counterproductive, focusing on retribution rather than learning and improvement. He proposes a shift towards forward-looking accountability, emphasizing understanding the systemic factors contributing to the error and preventing recurrence.
Quotes:
"If holding people accountable means getting them to take responsibility for their work, then the New View does not deny this at all. It is a misunderstanding to think that the New View eschews individual responsibility."
"You cannot fairly ask somebody to be responsible for something he or she had no control over. It is impossible to hold somebody accountable for something over which that person had no authority."
"Go from backward to forward-looking accountability. Backward-looking accountability means blaming people for past events...Forward-looking accountability is consistent with a new type of safety thinking. People are not a problem to control, but a solution to harness."
"If you hold somebody accountable, that does not have to mean exposing that person to liability or punishment. You can hold people accountable by letting them tell their story, literally “giving their account.”"
"Storytelling is a powerful mechanism for others to learn vicariously from trouble."
Why it Matters: Forward-looking accountability encourages a more proactive and constructive approach to safety, focusing on learning from errors and implementing solutions to prevent recurrence. It also promotes a more just and equitable system, recognizing that individuals often operate within a complex system with limited control over all factors.
Embracing a Systemic Approach to Safety:
Explanation: Dekker advocates for a systemic approach to safety, recognizing that accidents and incidents are rarely caused by a single human error but rather by the complex interplay of multiple factors within the system. He argues that organizations must move beyond linear causation models and embrace a more holistic understanding of how systems function, adapt, and fail.
Quotes:
"‘Human error’ is systematically connected to features of people’s tools, tasks and operating environment."
"A ‘human error’ problem is at least as complex as the organization that helps create it."
"Systems are not basically safe. People have to create safety despite a system that places other (sometimes contradictory) expectations and demands on them."
"Cause is not something you find. Cause is something you construct."
"There is no single cause—neither for failure, nor for success."
Why it Matters: A systemic approach to safety allows organizations to identify and address the underlying vulnerabilities within their systems rather than focusing solely on individual behavior. Organizations can implement more effective safety strategies that address the root causes of accidents and incidents by understanding how systems function and interact.
Conclusion
Dekker's "The Field Guide to Understanding ‘Human Error’" is a powerful and insightful guide for organizations seeking to create a more just, effective, and resilient safety culture. By challenging traditional assumptions about human error and advocating for a more nuanced, systemic approach, Dekker provides a roadmap for organizations to move beyond blame and embrace learning, improvement, and a deeper understanding of the complexities of human performance in complex systems.
The guide's practical recommendations, grounded in research and real-world examples, offer actionable steps for organizations to implement the "New View" of human error. These include:
Changing investigation procedures: Shifting from a blame-focused approach to a systemic analysis that seeks to understand the contributing factors and context of the error.
Encouraging open communication: Creating a culture where employees feel safe to report errors and near misses without fear of punishment, fostering transparency and learning.
Investing in system design: Improving the design of tools, tasks, and environments to reduce the likelihood of errors, recognizing the influence of the work environment on human performance.
Resilience training: Equipping employees with skills to adapt to unexpected situations, recover from errors, and contribute to a more resilient organization.
By embracing Dekker's "New View" of human error, organizations can create a safer, more just, and more effective work environment. In this environment, individuals are empowered to contribute to a culture of continuous learning and improvement.
A few interesting stories from the book:
a) The story of Abraham Wald demonstrates this. Wald was born in the Austro–Hungarian empire around the turn of the twentieth century. After studying mathematics in Vienna, he could not get a university position because of his Jewish heritage; in 1938 he emigrated to the United States. There, he applied his statistical skills to the problem of Allied bomber losses due to enemy fire (ground-based anti-aircraft flak as well as bullets from attacking fighter aircraft). A study had been made of the patterns of damage that aircraft returned with, and it was proposed that armor should be added to those places that showed the most bullet damage. Armor, of course, increases the weight of an airplane dramatically and cuts into its payload or range. So you have to be really picky about where you put it, and how much you put on.
Wald, after doing his own extensive statistical analyses of returning bomber aircraft, came to quite a different conclusion. The airplanes that made it back with holes in them, he concluded, were the ones that had taken hits in areas where they could survive and return. Adding armor to those places would not do anything to help them. Instead, he said, we should add armor to those places that did not show holes, because those were the airplanes that didn't come back. His statistical analysis identified the weak spots of the non-returning airplanes: the spots that, when hit, led to the loss of the bomber. Those areas had to be reinforced, he argued, not the areas with holes in them. His insight became seminal for what is today known as operations research. In a sense, the areas with holes in them were the decoy phenomenon. They were evidence of the survivable incident; a marker of resilience. These were not markers of fatal risk that needed to be further controlled. That risk, instead, lay in the areas on the returning bombers that did not show bullet holes.
Ironically, Wald himself died in a plane crash (of quite a different kind) in southern India while on a lecture tour in 1950.
Look in the places where there are no holes, where your people do not see holes, where they do not see things that are worthy of reporting. In other words, look at normal work. Get a sense of their daily experiences and frustrations, their workarounds, adaptations, of the places where they finish imperfect designs and procedures in order to get the job done. It is those places, where there are no holes, that may one day play a role in your organization’s fatality or accident.
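Wald's reasoning can be illustrated with a small simulation (a hypothetical sketch, not from the book; the section names and loss probabilities are invented for illustration). Hits land uniformly across the airframe, but hits to critical sections usually bring the plane down, so the surviving fleet's damage record systematically under-represents the deadly spots:

```python
import random

random.seed(42)

SECTIONS = ["engine", "cockpit", "fuselage", "wings"]
# Assumed per-hit loss probabilities: hits to critical sections bring
# the plane down far more often than hits to the fuselage or wings.
LOSS_PROB = {"engine": 0.8, "cockpit": 0.7, "fuselage": 0.1, "wings": 0.1}

def fly_mission():
    """One sortie: 3 hits land uniformly at random; return (survived, hits)."""
    hits = [random.choice(SECTIONS) for _ in range(3)]
    survived = all(random.random() > LOSS_PROB[h] for h in hits)
    return survived, hits

def observed_damage(n_missions=100_000):
    """Count hits per section, but only on planes that made it back."""
    counts = {s: 0 for s in SECTIONS}
    for _ in range(n_missions):
        survived, hits = fly_mission()
        if survived:
            for h in hits:
                counts[h] += 1
    return counts

counts = observed_damage()
# Hits were uniform, yet survivors show mostly fuselage and wing damage:
# the engine and cockpit hits are "missing" because those planes were lost.
assert counts["fuselage"] > 2 * counts["engine"]
print(counts)
```

Armoring the sections with the most recorded damage would armor exactly the places that were already survivable; the data you can see are conditioned on survival, just as incident reports are conditioned on someone noticing and reporting.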
b) A few years ago, I heard of a woman who was slightly injured at work. She told her supervisor, showed the injury, and went to see the doctor that same afternoon. While in the waiting room, she got a call from school. Her son had fallen ill and been sent home. After her appointment and having her gash cleaned and glued by a nurse, she rushed home to take care of her boy. She later informed her supervisor.
News of the injury made its way to the company’s safety manager. He was horrified. Not necessarily because of the injury or the employee’s fate, but because he had been on a “winning streak.” Next to the entrance of the plant, a sign announced that the company had been without injury for 297 days. 300 had been within reach! The number would have looked so good. It would have made him look so good.
The day after the incident, the safety manager went to see the supervisor before committing, with lead in his shoes, to change the sign to 0. Zero days since the last injury. Then he learnt that the employee had gone home after the incident and doctor’s visit. It was a gift from heaven. He called the HR manager and together they resolved to generously give the woman the previous afternoon off. It was no longer a Loss-Time Injury (LTI). The woman had simply gone home to take care of her child. He also called the clinic. Because no suturing was done and no doctor needed to touch the woman, this was not a medical treatment in the strict definition held by the company. So no Medical-Treatment Injury (MTI) either. The safety manager could breathe again.
A few days later, the sign next to the entrance proudly showed 300.
Other Quotes
Old View
‘Human error’ is the cause of trouble
‘Human error’ is a separate category of behavior, to be feared and fought
‘Human error’ is the target; people’s behavior is the problem we need to control
‘Human error’ is something to declare war on. People need to practice perfection
‘Human error’ is a simple problem. Once all systems are in place, just get people to pay attention and comply
With tighter procedures, compliance, technology and supervision, we can reduce the ‘human error’ problem
We can, and must, achieve zero errors, zero injuries, zero accidents
New View
What we call ‘human error’ is a symptom of deeper trouble
‘Human error’ is an attribution, a judgment that we make after the fact
Behavior is systematically connected to features of people’s tools, tasks and operating environment
‘Human error’ is information about how people have learned to cope (successfully or not) with complexities and contradictions of real work
A ‘human error’ problem is at least as complex as the organization that helps create it
With better understanding of the messy details of people’s daily work, we can find ways to make it better for them
We can, and must, enhance the resilience of our people and organization
“You can help change the way your organization thinks about ‘human error.’ This begins with asking questions, looking at things from another perspective, and using a different language.”
“Some believe that they need to keep beating the “human” until the ‘human error’ goes away.”
“Underneath every simple, obvious story about ‘human error,’ there is a deeper, more complex story about the organization.”
Old View
Asks who is responsible for the outcome
Sees ‘human error’ as the cause of trouble
‘Human error’ is random, unreliable behavior
‘Human error’ is an acceptable conclusion of an investigation
New View
Asks what is responsible for the outcome
Sees ‘human error’ as a symptom of deeper trouble
‘Human error’ is systematically connected to features of people’s tools, tasks and operating environment
‘Human error’ is only the starting point for further investigation
“If it made sense for people to do what they did, then it may make sense for others as well.”
Old View
Says what people failed to do
Says what people should have done to prevent the outcome
New View
Tries to understand why people did what they did
Asks why it made sense for people to do what they did
“The New View does not claim that people are perfect. But it keeps you from judging and blaming people for not being perfect.”
“Practitioners are not all exposed to the same kind and level of accident risk. This makes it impossible to compare their accident rates and say that some, because of personal characteristics, are more accident-prone than others.”
“A “Bad Apple” problem, to the extent that you can prove its existence, is a system problem and a system responsibility.”
“Accountability can mean letting people tell their account, their story.”
“Discipline without understanding the problem is ineffective.”
“The challenge is to create a culture of accountability that encourages learning. Every step toward accountability that your organization takes should serve that goal. Every step that doesn’t serve that goal should be avoided.”
“If you truly want to create accountability and a “just culture” in your organization, forget buying it off the shelf. It won’t work, independent of how much you pay for it. You need to realize that it is going to cost you in different ways than dollars.”
“Justice can never be imposed. It can only be bargained.”
“The more you react to failure, the less you will understand it.”
“Hindsight gets you to oversimplify history. You will see events as simpler, more linear, and more predictable than they once were.”
“Bad process may still lead to good outcomes, and vice versa.”
“What you believe should have happened does not explain other people’s behavior. It just makes you look ignorant and arrogant.”
“To understand error, take the view from the inside of the tunnel and stop saying what people failed to do or should have done.”
“Failures can only be understood by looking at the whole system in which they took place. But in our reactions to failure, we often focus on the sharp end, where people were closest to (potentially preventing) the mishap.”
“Cherry-picking means taking fragments from all over the record and constructing a story with them that exists only in your hindsight. In reality, those pieces may have had nothing to do with each other.”
“It is easy to gather cues and indications from a sequence of events and lob them together as in a shopping bag. This is not, however, how people inside the unfolding situation saw those cues presented to them.”
“See the unfolding world from the point of view of people inside the situation—not from the outside or from hindsight.”
“Loss of situation awareness” is another name for ‘human error’—the failure to notice things that, in hindsight, turned out to be critical.
“Complacency is also a name for ‘human error’—the failure to recognize the gravity of a situation or to follow procedures or standards of good practice.”
“Non-compliance is also a name for ‘human error’—the failure to stick with standard procedures that would keep the job safe.”
“Don’t make a leap of faith, from your facts to a big label that you think explains those facts. Leave an analytic trace that shows how you got to your conclusion.”
“The way to bridge the gap between facts and conclusions (about those facts) is to find a definition or operationalization in the literature for the phenomenon and start looking in your facts for evidence of it.”
“Leaving a trace. Using a definition for “loss of effective Crew Resource Management” that lists misunderstanding the problem, no common goal and uncoordinated corrective actions, you can find evidence for that in your facts.”
“Leaving a trace. Overlapping talk, no response when one is expected, unequal turns at talking and offering repair of somebody else’s talk when none is needed together could point to a “loss of effective Crew Resource Management”.”
“Cause is not something you find. Cause is something you construct.”
“What is the cause of the accident? This question is just as bizarre as asking what the cause is of not having an accident.”
“We make assessments about the world, which update our current understanding. This directs our actions in the world, which change what the world looks like, which in turn updates our understanding, and so forth.”
“In plan continuation, early and strong cues suggest that sticking with the original plan is a good, and safe, idea. Only later, and weaker, cues suggest that abandoning the plan would be better. In hindsight, it is easy to forget to see the cues from the point of view of people at the time, and when and how strongly they appeared.”
Old and New View interpretations of procedural adaptations:
Model 1 (Old View)
Procedures are the best thought-out, safest way to carry out a task
Procedure-following is IF-THEN, rule-based behavior
Safety results from people following procedures
Safety improvements come from organizations telling people to follow procedures and enforcing this.
Model 2 (New View)
Procedures are resources for action (next to other resources)
Applying procedures successfully is a substantive, skillful cognitive activity
Procedures cannot guarantee safety. Safety comes from people being skillful at judging when and how they apply
Safety improvements come from organizations monitoring and understanding the gap between procedures and practice
At any moment, behavior that does not live up to some standard may look like complacency or negligence. However, deviance may have become the new norm across an entire operation or organization.
“Loss of situation awareness” is the difference between what you know now, and what other people knew back then. And then you call it their loss.
“Using complacency is an investigative cop-out. Guess who is being ‘complacent’ when you use the term in your investigation!”
“A continued belief in the triangle can probably make your industry less safe. It rocks you to sleep with the lullaby that the risk of major accidents or fatalities is under control as long as you don’t show minor injuries, events or incidents.”
We may believe that blocking a known pathway to failure somewhere along the way will prevent all similar mishaps.
“The “Swiss Cheese” analogy. Latent and active failures are represented as holes in the layers of defense. These need to line up for an accident to happen.”
“The barrier, or defenses-in-depth, model is still a chain-of-events model too. One breach needs to precede the next. Also, the final link in the chain is an “unsafe act” by a frontline operator. A ‘human error’ in other words.”
“Murphy’s law is wrong. What can go wrong usually goes right, and then we draw the wrong conclusion: that it will go right again and again, even if we borrow a little more from our safety margins.”
“A safety culture is a culture that allows the boss to hear bad news.”
“Drift into failure is hard to see inside of the tunnel. Each step away from the norm is only small and offers success on other important measures (for example, time, cost, efficiency, customer satisfaction).”
“Safety has increasingly morphed from operational value into bureaucratic accountability. Those concerned with safety are more and more removed—organizationally, culturally, psychologically—from those who do safety-critical work at the sharp end.”
“Having a large safety bureaucracy could actually increase the probability of an accident. A bureaucracy can produce “fantasy documents” and demand compliance with cumbersome processes while not dealing with real sources of risk.”
“Safety management systems can become liability management systems when their main role is to prove that management did something about a problem (even if that meant telling everybody else to try harder).”
How to organize safety according to the Old View and New View
Old View Safety
Whoever is boss or safety manager, gets to say
Dominated by staff
Guided by rules and compliance
Make it impossible for people to do the wrong thing
Governed by process and bureaucracy
Strives for predictability and standardization
Safety as accountability that is managed upward
New View Safety
Whoever is expert and knows the work, gets to say
Driven by line
Guided by insight and context
Give people space and possibility to do the right thing
Adjusted by mutual coordination
Strives for diversity and innovation
Safety as a responsibility that is managed downward
“If all you do is look after people’s safety, they may feel treated as children. If you look after people’s work, they will likely feel treated as colleagues.”
“To take responsibility for safety on the line, you should first and foremost look at people’s work, more than (just) at people’s safety.”
“No wonder we blame “safety culture” for more and more accidents. By blaming “safety culture,” we pretty much cover everything. But it explains little.”
Old View Safety and New View Safety
Old View Safety
People seen as a problem to control
Focus on people’s attitudes and behavior
Safety defined as absence of negative events (incident/injury free)
Whoever is boss or safety manager, gets to say
Dominated by staff
Guided by rules and compliance
Make it impossible for people to do the wrong thing
Governed by process and bureaucracy
Strives for predictability and standardization
Safety as accountability that is managed upward
New View Safety
People seen as a resource to harness
Focus on people’s working conditions
Safety defined as presence of positive capacities to make things go right
Whoever is expert and knows the work, gets to say
Driven by line
Guided by insight and context
Give people space and possibility to do the right thing
Adjusted by mutual coordination
Strives for diversity and innovation
Safety as a responsibility that is managed downward
“Behavior-based safety programs focus on workers’ “unsafe acts.” They see ‘human error’ as the cause of trouble and place responsibility for safety on workers themselves.”
“Behavioral safety interventions typically focus on who is responsible, not on what is responsible for creating risk.”
“Initiatives that try to suppress the number of reported incidents see safety as an absence of negatives. These might actually harm safety in the long run.”
“A commitment to zero suggests that we must manipulate a dependent variable. This has got things upside-down.”
“A zero vision does not come out of safety research. It is neither a research result, nor a scientific prediction. There is no accident theory underpinning it.”
“Over time, our industries have become safer, with many now living at the “thin edge of the wedge.” The safer you become, the more difficult it is to achieve safety improvements.”
“Safety actions taken in ultra-safe (near-zero) organizations are often repetitions or retreads of those taken in a less safe organization. They miss the point and do not help in creating additional safety.”
At the thin edge of the wedge:
That which is going to bite you—the unlikely but still possible accident— will likely be something you were not measuring;
It will likely be something that has not been reported as an incident by one of your workers.
“Doing more of the same will not lead to something different. You will stay where you are.”
“At the thin edge of the wedge, holes in layers of defense and formally reported incidents are no longer the herald of accidents or fatalities. Normal work is.”
“How do you make ‘human error’ go away? The answer isn’t as simple as the question. A ‘human error’ problem, after all, is an organizational problem. It is at least as complex as the organization that has helped create it.”
“Hard fixes change something fundamental about, or in, the organization. This makes them hard. But it also makes them real fixes.”
“Reprimanding “Bad Apples” is like peeing in your pants. You did something about the problem and feel relieved. But then it gets cold and uncomfortable. And you look like a fool.”
“The Old View really has efficiency and expediency going for it. You don’t have to do a lot of analytic work, or put in the hard investigative yards. Just say where other people went wrong.”
“The alternative of speaking for the dead is speaking about the dead. Investigations often speak badly about the dead.”