System Safety: Seven Friends of Intervention

In this short series, I highlight seven foes and seven friends of system safety, both for explanation and intervention. Each is a concept, meme, or device used in thinking, language, and intervention (reinforced by more fundamental foes that act as barriers to thinking). They are not the only foes or friends, of course, but they are significant ones that either crop up regularly in discussions and writings about safety, or else – in the case of friends – should do.

In this post, I outline seven friends of intervention. To keep it short (because I usually intend that, but rarely succeed), I limit each explanation to an arbitrary 100 words.

In this series:

  1. Seven foes of explanation in system safety
  2. Seven foes of intervention in system safety
  3. Seven friends of explanation in system safety
  4. Seven friends of intervention in system safety (this post)

1. Acceptance of uncertainty

Whether one is intervening* to try to understand a situation or intentionally to bring about change, it is important to accept that one probably does not and cannot fully understand a complex situation or sociotechnical system. Once one accepts this, unwarranted confidence reduces, and the need for competency, time, and information becomes clearer. With competency, time, and information, the form of practical arrangements for understanding the system at all stages of its lifecycle becomes clearer, including during implementation, when surprises that result from intervention actions will tend to emerge.

2. Competency, expertise and involvement 

If you want to intervene in a system, you need expertise in system safety. It is astonishing how often this simple fact is neglected. Suitably qualified and experienced persons (SQEPs) are needed with recognised multidisciplinary competencies and perspectives, such as from safety science, safety engineering, human factors/ergonomics, psychology, anthropology, and related disciplines. Such expertise is often missing (e.g., HF/E competency in healthcare). And of course competency is needed from those who do the work. Learning teams and action research are examples of the use of competency in intervention.

3. Research

When intervening in a system for understanding or intentional change, an important initial step is to gain knowledge. For system safety, this may include original research for new knowledge, or summaries, reviews or syntheses of existing sources of knowledge. The knowledge may relate to a topic within a safety-related discipline (e.g., in scientific journals), a sector (e.g., aviation, healthcare), or an organisation (e.g., history of interventions). In system safety, this important step is often missing in practice, resulting in ineffective interventions. Greater attention to research provides data, concepts, theories and methods to guide practice, benefiting safety and effectiveness more generally.

4. Listening and observing

Two fundamental methods for understanding systems are observing people at work and listening to people talk about their practice – how and why they intentionally make and transform the world – including the context of practice. These activities, while often lacking in practice, are vital to increase congruence between work-as-imagined and work-as-done, via appropriate alignment rather than simple compliance. Accepting the equivalence of failure and success in terms of their origins in ordinary work, we try to understand not only unusual events, but work in all its forms, whether the outcome is expected or unexpected, wanted or unwanted.

5. Human-centred, activity-focused design 

Human-centred design (HCD, e.g. ISO 9241-210) is a design philosophy and process that aims to align systems with human needs. It is relevant to anyone involved in the design or modification of procedures, equipment, or other artefacts. HCD requires that stakeholders are involved throughout design and development, which is based on an explicit understanding of people, activities, tools, and contexts. The process is refined by iterative user-centred evaluations and learning cycles. A strong focus on activities helps to understand not only how the world should adapt to people, but how people adapt to the world.

6. Multiple perspectives and thick descriptions

There tend to be multiple perspectives on situations, events, problems and opportunities. Each may be partial, but together can give a more complete picture. Shifting between different perspectives illuminates different experiences, perceptions and understandings, and how these interact. Different aspects of systems and situations come to light, along with the trade-offs, adjustments and adaptations that are or were locally rational. Multiple perspectives help generate thick descriptions of human behaviour. Facts, along with commentary and interpretations, explain work-as-done in context, such that it becomes more meaningful to an outsider, and possible implications of situations and proposed ‘solutions’ come to the surface.

7. Systems methods

Systems methods help to understand system boundaries, system structure, and system interactions across time and scale. They can make patterns of system behaviour visible, and can reveal previously unknown or unforeseen influences and interactions between parts of the system. Methods can be used for describing, analysing, changing, and learning about situations and systems. Common methods include system maps, influence diagrams, causal loop diagrams, multiple cause diagrams, stock and flow diagrams, activity theory/systems, FRAM, AcciMaps, and STAMP, among others. Such methods can help to go ‘up and out’ to the system context instead of just ‘down and in’ to components, ‘causes’, or events.
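Stock and flow diagrams, one of the methods listed above, are often explored quantitatively as well as visually. As a minimal illustrative sketch (the scenario, names, and numbers below are invented for illustration, not taken from the post), the following simulates a single stock – a backlog of safety reports – with a constant inflow of new reports and an outflow proportional to the backlog, representing investigation capacity:

```python
# Minimal stock-and-flow sketch (illustrative; all values are assumptions).
# One stock -- a backlog of safety reports -- fed by a constant inflow of
# new reports, drained by an outflow proportional to the current backlog.

def simulate_backlog(inflow_per_week=10.0, clearance_rate=0.2,
                     initial_backlog=0.0, weeks=30):
    """Return the backlog level at the end of each simulated week."""
    backlog = initial_backlog
    levels = []
    for _ in range(weeks):
        outflow = clearance_rate * backlog   # outflow depends on the stock
        backlog += inflow_per_week - outflow
        levels.append(backlog)
    return levels

levels = simulate_backlog()
# The backlog climbs and then settles near inflow / clearance_rate (= 50):
# the balancing-loop pattern a causal loop diagram would show qualitatively.
print(round(levels[-1], 1))
```

The point of such a sketch is the one made in the paragraph above: it makes a pattern of system behaviour visible (here, a balancing loop approaching equilibrium) that is not obvious from looking at the parts in isolation.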

* A note on intervention: The term intervene comes from the Latin intervenire, from inter- ‘between’ + venire ‘come’. To intentionally try to understand a situation, or take action to change it (e.g., improve it or prevent it from getting worse), is to intervene. While there may be no intention to change a situation while observing or measuring, change is very often an unintended consequence.

If you want to learn more about how complex systems fail, in the most concise and easy-to-digest form, read this by Richard Cook.

Author: stevenshorrock

This blog is written by Dr Steven Shorrock. I am an interdisciplinary humanistic, systems and design practitioner interested in human work from multiple perspectives. My main interest is human and system behaviour, mostly in the context of safety-related organisations. I am a Chartered Ergonomist and Human Factors Specialist with the CIEHF and a Chartered Psychologist with the British Psychological Society. I currently work as a human factors and safety specialist in air traffic control in Europe. I am also Adjunct Associate Professor at University of the Sunshine Coast, Centre for Human Factors & Sociotechnical Systems. I blog in a personal capacity. Views expressed here are mine and not those of any affiliated organisation, unless stated otherwise. You can find me on Twitter at @stevenshorrock or email contact[at]humanisticsystems[dot]com.
