Occupational Overuse Syndrome – Human Error Variant (OOS-HEV) is a condition involving the overuse of the notion of ‘human error’ to explain unwanted events in complex systems. The condition develops as the result of a number of factors such as the desire for a simple explanation, psychological avoidance of complexity and uncertainty, moral outrage and the need to blame, and a lack of desire to understand work and sociotechnical systems.
OOS-HEV is widespread in society, but for those in safety-related professions it tends to be one of a number of conditions increasingly referred to as Trained Incapacity (TI), where one’s abilities function as inadequacies or blind spots. TI includes a range of other syndromes, including Safety-fication.
Signs and Symptoms
Symptoms tend to develop gradually and worsen over time if left untreated. Symptoms mainly seem to occur following unwanted events (but not when comparable ‘human errors’ are involved in wanted or fortuitous outcomes). The primary symptom of OOS-HEV is the habitual use of ‘human error’ and related synonyms as an explanation of an event in a complex system or situation. Other symptoms may include:
- Anger and disbelief.
- Anxiety during periods of uncertainty and lack of explanation.
- Blame and scapegoating.
- A perceived need to use labels.
- A focus on discrete actions and components instead of interactions.
- Fixation on first stories (including shallow and oversimplified accounts of the ‘cause’).
There may be feelings of relief at having explained away the event. But as the condition progresses, the use of the explanation becomes more frequent. The desire to think systemically reduces further, resulting in a total inability to see beyond the surface features of events.
Contributory Factors
Any repetitive or habitual use of the human error explanation can lead to the development of OOS-HEV, but a number of factors may contribute. These factors are both individual and systemic:
- Lack of understanding of complex systems and human performance.
- A mindset that the human is a hazard and source of risk.
- A distorted view of safety as a result of a particular experience of a profession (déformation professionnelle).
- A preference for automation as a means to reduce risk.
- An overwhelming desire to protect company reputation.
- Dislike of an individual.
Treatment
If OOS-HEV is suspected, it is important to seek early treatment to prevent the condition from progressing. Treatment options include:
- Personal reflection.
- Discussion of human performance in a systemic and humanistic context.
- In-depth reading of relevant educational materials.
- Therapeutic use of systems concepts and methods.
- Education via a formal course that adopts a systems thinking approach.
- Observation of people at work.
- Listening to stories of people’s work.
Treatment options are similar to the prevention options below, but require a greater degree of unlearning to overcome personal barriers. A tailored, individual treatment programme can help to achieve the best results. It may be useful to consult a human factors specialist, humanistic psychologist or systems thinking specialist.
Prevention
There are a number of steps that can be taken to prevent OOS-HEV. The following play an important role in prevention:
- Be mindful of your internal reaction to unwanted events. In particular, observe your thoughts and feelings after an unwanted event involving someone else. Compare these with comparable events that have involved yourself. Notice when you focus on the person versus the environment and situation.
- Meditate on how you think about people in complex systems. Reflect on how people are a source of resilience in complex systems.
- Generate and demonstrate empathy for others who are caught up in system accidents. Remember that people come to work to do a good job, not to have an accident. Resist the urge to think of those involved in unwanted events as perpetrators.
- Monitor your use of the ‘human error’ explanation. Try to avoid simply switching to synonyms for ‘human error’ which do not help describe the work and the system (e.g. ‘loss of situation awareness’).
- Use language carefully. Try to use systems concepts when describing work. Stop and think when you notice yourself slipping into old thought patterns.
- Be critical of all news media and politicians, especially the ‘first stories’ that appear in the aftermath of disasters. Be wary of the notion of causation and be skeptical of identified ‘causes’ in system accidents.
- Try to develop a curious attitude, seeking multiple perspectives on the same event.
- Observe and talk about how the work really works. Notice how success and failure both stem from ordinary work. Pay attention to system conditions, in particular demand, pressure, resources and constraints. Consider how human performance variability is essential to respond to variability in system conditions. Try to understand how and when people make trade-offs, and how these help to achieve system goals.
- Use systems methods along with the field experts to help understand both accidents and normal work.
Support and Information
Further support and information is available from a wide variety of sources, including:
- Dekker, S. (2014). The Field Guide to Understanding ‘Human Error’. Third edition. Ashgate.
- EUROCONTROL (2013). From Safety-I to Safety-II: A White Paper. EUROCONTROL.
- EUROCONTROL (2014). Systems Thinking for Safety: Ten Principles. A White Paper. Brussels: EUROCONTROL.
- Shorrock, S. (2013). Human error: The handicap of human factors, safety and justice. HindSight magazine, Issue 13, Winter, 32-37.
- Woods, D.D. & Cook, R.I. (2002). Nine steps to move forward from error. Cognition, Technology & Work, 4 (2), 137-144.
- Woods, D.D., Dekker, S., Cook, R., Johannesen, L. & Sarter, N. (2010). Behind Human Error. Ashgate.