Almost a year ago on the safetydifferently blog, Sidney Dekker asked “Can safety renew itself?”. He asked whether the profession was even capable of doing this, given its goal of eliminating what goes wrong: “For a profession that is organized around the elimination, reduction and control of risk, innovation can be a tall order”. Since then, there has been progress. Significantly, a new way of looking at safety has further emerged – so-called Safety-II – and the resilience engineering movement has gathered pace. Rocking the boat can annoy people, but it also gets people thinking and asking some fundamental questions; questions about paradigm, purpose and processes. It is unlikely that safety is unique, that there can be only one paradigm, one way. But that is sometimes how it feels in practice. For all professions, there is so much invested in paradigms that new thinking is resisted.
So I wonder how we might switch our thinking to avoid a dogmatic approach. Perhaps there is a way to try before you buy, without necessarily buying a one-way ticket to nu safety. I rediscovered some of Edward de Bono’s work, and in particular his little book on Six Thinking Hats®. The approach employs parallel thinking, and encourages everyone to think from the same perspective at a particular moment in a workshop. It struck me that the hats provide a way to think differently about safety – not just in a meeting, but more generally, and not necessarily lock, stock and barrel.
Considering the hats with regard to our thinking about safety, I had a few thoughts. Here they are:
White Hat: Facts and Figures
With the White Hat, we might ask ourselves a few questions about our safety data – particularly the neutral(ish) ‘hard facts’. An obvious question is, what facts and figures about work and safety do we actually collect? Considering whatever data we have and use, we need to ask ourselves, what are we really measuring? If we collect only negative outcome data, then this may tell us little about how safe the system actually is. To understand this, we need to understand how ordinary work works, not just exceptional events. But a more fundamental question is this: why do we measure what we measure (and in the way that we measure it)? We rarely ask ourselves these sorts of questions, but purpose is a fundamental aspect of any system. There may well be several purposes, some of which may be incompatible. But the answer might reveal something about our ideas about safety. With purpose in mind, we might now ask, what do we need to measure, how often, and over what time? What we actually measure and what we need to measure according to purpose may be different things. If our purpose is to improve the system, then we need data on how the system works – how the work works. The problem is, this leads to measuring things that don’t obviously (to us) relate to safety. Organisations often collect a vast array of data. Some of the data we need to understand safety is not the data collected by safety departments, but may well already exist somewhere. Perhaps a habit of gathering ‘safety data’, rather than work data, may be part of the problem. If we want to understand safety, we need to get out from behind our desks.
Red Hat: Gut Feelings
The Red Hat concerns emotion and instinct – our gut feelings. We all have them, even if they are accessible in their undistorted form only for moments. Gut feelings can be a problem for safety professionals. Safety professionals are drawn to ‘facts’, logic, analysis, process, method, technique – but not gut feeling. Other safety actors, however – pilots and air traffic controllers, doctors and nurses – clearly do value gut feeling, and act on it. And according to Gerd Gigerenzer, it usually works very well.
With the Red Hat, we might ask, what are our immediate and initial gut feelings, intuitions or emotions with regard to work and safety? Unless we are close to the action, we might not have the exposure needed to cultivate these gut feelings, or to sense those of others. We might have reactions to secondary data, but not first-hand experience of operations. Even if gut feelings don’t show up on our internal radars, others closer to the action will have them, and we can access these. If we are aware of some gut feelings – our own or others’ – do we take notice of them, or disregard them? Our organisational processes and systems, or our own preferred way of thinking, may encourage us to reason away our inner voices. That is a terrible waste of data. Even if we attend to gut feelings, do we talk about them? If we don’t value gut feelings, we are less likely to talk about them because they are not ‘evidence’ or ‘facts’. Perhaps we are happier with partial or distorted statistics. And even if we talk about our gut feelings, do we act on them as an organisation? Front-line workers act on their gut feelings. Under intense time pressure, they have to. So when we safety professionals have strong gut feelings about something, will we act?
Black Hat: What Goes Wrong
The Black Hat is the most comfortable hat for safety professionals. It concerns things that go wrong, or might go wrong, and it is indeed a hat that we must wear. Worn along with the White Hat, it might seem that the other hats are superfluous (even ridiculous). The key Black Hat question seems to be, how do we think about failure? Whether or not we realise it, we all have an accident model. It’s the way we think about causality and failure – in the head or written down. It may be simple, linear and direct, or a bit more complicated, like a set of dominos or slices of Swiss cheese. Or it may be more of a network of influences, characterised by non-linear relationships, causal loops and emergence. Formally, we may use methods that are more reductionist or more holistic. We may or may not really buy into the methods we use. A particular problem arises when the way that we understand systems no longer matches the methods we (have to) use. Another issue is how we consider emergence, including the unwanted consequences of interventions and projects. Emergent properties are surprising, and some would say unpredictable by nature, but we can learn from previous emergent phenomena, and use systemic and creative approaches to try to understand emergence.
Yellow Hat: What Goes Right
The Yellow Hat looks at what goes right: positives, benefits, success. It is not a hat that safety professionals often wear. It’s not that we lack methods – there are hundreds, from operations research, systems thinking and human factors. It’s just that we don’t think in this way. Perhaps there is a difference in thinking between front-line workers, who think of safe operations as ensuring that things go right, and safety professionals, who think of safety as preventing things from going wrong. There is good reason for this – someone has to wear the black hat, and that is to some extent our cross to bear and our hat to wear. But wearing only this hat is counterproductive. It separates out safety (or unsafety) and sets us against other organisational goals. To understand what goes right, we need to look at ordinary work, as well as exceptional performance. How often do safety investigators use their skills to investigate success or performance variability? Not very often. How often do safety assessors consider safety benefits? Some are starting to do this, but it is not the norm. Wearing this hat, we can look at how the work works, and how and why it normally works so well. This can be extended and enhanced, perhaps using appreciative inquiry’s cycle of discover, dream, design and deploy. However we do it, we need to direct some of our attention to understanding the adjustments, trade-offs and conditions from which safe operations emerge.
Green Hat: Creativity and Innovation
Creativity is not something that one naturally associates with the safety profession. And neither is innovation. Perhaps, as Dekker says, the profession is conservative and risk averse by nature. I tend to think systemic factors are at play – particularly regulation. But perhaps, even with such constraints, we can use creativity to generate new ideas and innovate in order to improve work, and therefore safety. This means going beyond analysis, structure, process and order, but it does not mean abandoning them. Can we use creativity to overcome obstacles, or to achieve possibilities? As a start, it might mean stopping doing things the way we routinely do them, for a while at least. On a personal level, I try to look to completely different fields, such as psychotherapy, film, design and photography. For instance, in designing the safety culture discussion cards, I worked with mental imagery and photos (abstract and concrete). In safety culture focus groups, creativity might involve free-wheeling discussions where anything goes – at least for a while – and where there is no structure. Where they end up, without being forced through process, is often surprising – and useful.
Blue Hat: Thinking Process
Blue Hat Thinking is about meta-cognition, and invites us to think about how we think about safety. What is our safety paradigm? Do we have default or habitual ways of thinking, perhaps encoded in tools and methods? Is it possible to switch to new ways of thinking, at least for periods, to think differently about safety issues? The Blue Hat manages the thinking process, so we might consciously switch between different hats and different sorts of thinking. Some tools and methods are more appropriate for some situations, problems or opportunities than others. Similarly, a way of thinking may be appropriate at some times and not others, or we might need to blend aspects of different modes of thinking. But personal and system factors keep us attached to one paradigm. Can we shift? Might we move towards Safety-II thinking – even just to try? If we find our existing paradigm unsatisfactory, how might we change paradigms? Might we even transcend paradigms – open-minded, willing to adapt and try the best of what is? If we can, we have our hands on the most powerful of Donella Meadows’ twelve leverage points to intervene in a system.