Surprises, Fast and Slow: Preparing for the Limits of Work-as-Imagined

This article is a reproduction of the Editorial published in HindSight magazine issue 34 in December 2022 (all issues available at SKYbrary)

In safety-critical industries, surprises are rarely welcome. Aside from unexpected events we perceive as pleasant, like receiving a birthday cake, a thank-you note, or even a day when everything works as expected, surprises are not good things. The unwanted surprises that we may encounter, and how they are handled, differ depending on who we are and where we are in the system, whether in the control room, flight deck, surgical theatre, or boardroom.


Fast Surprises

In operational roles, surprises tend to be experienced over a short period. The most common variety seems to have ‘fast shoots’ and ‘fast roots’, developing quickly, then emerging and becoming detectable quickly, perhaps over seconds or minutes. There is often a rapid change in the context, or a mismatch between expectation (or imagination) and reality, or both. For a pilot or controller, it could be an in-flight medical emergency. For a clinician, it could be a rapidly deteriorating emergency patient.

Such surprises evolve with rapid changes to the operational situation and the associated contexts, such as physical (e.g., aircraft behaviour), environmental (e.g., wind shear; thunderstorm), technological (e.g., automation surprises), informational (e.g., display parameters), temporal (e.g., time pressure, exponential effects), and social (e.g., others’ unexpected actions). These are operational surprises, dealt with operationally. A fast response is usually necessary, which requires training to recognise the signs and react. One well-established model, recognition-primed decision-making (RPDM), applies when people need to make fast and effective decisions in complex situations. What happens is a blend of intuition (recognition) and mental simulation: responses are typically considered serially until the first ‘good enough’ option that fits the developing context is found.
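As a loose illustration of that serial, ‘good enough’ pattern, here is a minimal sketch in Python. It is not from the article; the function names, options, ‘fit’ scores and threshold are invented purely to show the shape of satisficing over recognition-cued options.

# A toy sketch of the RPDM pattern described above: options come to mind in
# order of recognition, each is mentally simulated, and the first one that is
# 'good enough' for the developing context is chosen (satisficing, not optimising).
# All names, scores and the threshold are hypothetical, for illustration only.

def rpdm_choose(situation, recall_options, simulate, good_enough=0.7):
    """Return the first recalled option whose simulated fit is acceptable."""
    for option in recall_options(situation):    # recognition: options recalled serially
        fit = simulate(option, situation)       # mental simulation: would this work here?
        if fit >= good_enough:                  # take the first workable option
            return option
    return None                                 # nothing recognised as workable


if __name__ == "__main__":
    # Toy stand-ins for recognition and simulation (scores are made up).
    options = {"divert to nearest airport": 0.8, "continue to destination": 0.4}
    choice = rpdm_choose(
        situation="in-flight medical emergency",
        recall_options=lambda s: iter(options),
        simulate=lambda opt, s: options[opt],
    )
    print(choice)  # -> 'divert to nearest airport'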

But what we experience as ‘fast surprises’ may develop slowly behind the curtain, sometimes over many years, and peep out to become observable quickly, perhaps in seconds (‘fast shoots, slow roots’). Such surprises may be very difficult to handle because of the interconnected changes in the contexts of work that originate further back in time and space. These may be political (e.g., performance targets), legal and regulatory (e.g., prescriptive limits), organisational (e.g., training cuts; staff shortages), technological (e.g., software updates; new automation), and procedural (e.g., out-of-date procedures; conflicting policies).

Again, a fast response will typically be necessary, but it is more difficult because decision-making faces formidable constraints. Some of these constraints may be invisible, as people have become habituated to how things are. Whatever solution is applied in the moment will not fix the contextual sources of the problem, so more surprises are likely.

For fast surprises, Captain Ed Pooley noted in HindSight 21 that “the ‘system’ in both the flight deck and in the control room must be able to cope with the particular case of a (very) sudden and (entirely) unexpected transition to high workload … Recovery – or at least containment – before overload is reached becomes the aim.” He noted that many situations are covered by procedures, in training and in operations. Others are more novel and demand ad hoc decision-making. To be effective, surprising simulated scenarios must be hidden so that they are indeed surprising, and “a huge library of representative training scenarios must be developed so that the surprise they provide is as near to real as possible.” But not every scenario can be anticipated. Training must therefore assess fundamental competence in coping with surprises.

Talking about firefighting incident command, Sabrina Cohen-Hatton said in HindSight 31 that simulations “can be incredibly powerful learning tools because you can go through the ‘what if’ scenarios and run through a number of different variations of each scenario.” Her team found that well-designed command training simulations elicited similar decision-making processes to those observed in real life.

In a healthcare context, surgeon Euan Green noted in HindSight 33 that “Given the rarity of true surgical emergencies … it is important to continue to run these drills at intervals; while surgeons stay in their roles for many years, nursing and support teams can change regularly.”

Fundamental competencies proved important in the landing of QF32 (see HindSight 29). Four minutes after take-off, engine number two exploded without warning, followed by a second explosion, with 21 out of 22 aircraft systems compromised. Within a few minutes, there were over 100 ECAM checklists. Competency was often in the spotlight when I interviewed Captain Richard de Crespigny. Richard said that controllability checks were critical to the safe landing of QF32. He explained that, while this procedure is habitual for military aviators, it wasn’t documented in any Airbus manual or the airline’s manual until after QF32.

He learned about them in the Air Force: “It’s normal Air Force procedure that if your aircraft has a mid-air collision or has taken damage from an attack, and flight controls are affected, then you must determine the best configuration and the minimum speed that you need to land.” Similarly, during landing, he used a technique that is “not practised in any simulator.”

Slow Surprises

Other surprises develop slowly, and become observable slowly, without the same kind of urgency for response as the kinds described above. Both the ‘roots’ and ‘shoots’ may grow over weeks, months or years, and recognising, understanding and handling them can take a long time. They are still surprises because reality and our expectation are mismatched, but this mismatch is revealed or accepted slowly.

The underlying contexts are similar to the ‘slow roots’ variety above (societal, political, legal and regulatory, organisational, technological, procedural, etc.). There are likely to be cultural implications, as shared assumptions about the world change and develop over years. This cultural context, combined with the slow unfolding of the surprise, creates even more constraints on handling surprises. The reality of the situation may be harder (for some groups, at least) to accept.

From a flight deck perspective, Kathy Abbott explained in HindSight 33 that there can be crucial differences between claims and operational reality when it comes to new technology. “We’ve seen so many cases where there are side effects that were not expected.” She explained that the problem for people in technical roles is not a lack of willingness to consider unintended consequences, but a lack of knowledge about how to do it, or about who can help.

Predicting so-called ‘emergent properties’ of new technology is notoriously difficult, and expertise in individual technical systems or even technical system architecture probably won’t be sufficient. Kathy Abbott highlighted an issue with slow surprises: they can be surprising to some but not others. “I personally have heard design engineers say that they don’t understand why it’s a problem, that it works exactly as designed.” But from an operational point of view, there is a surprise because operational expectations are not met.

In HindSight 25, Suzette Woodward told the story of the World Health Organisation (WHO) surgical checklist, designed in 2006. The checklist includes things to check off prior to surgery to ensure that critical tasks are carried out and that the whole team is adequately prepared for the surgical operation. “During the implementation process, in the main, anaesthetists and nurses were largely supportive of the checklist but consultant surgeons were not convinced. There is currently huge variability in use and implementation. … Using checklists in healthcare is not a way of life and has become simply an administrative task. This is a classic ‘work-as-imagined’ versus ‘work-as-done’ story.”

This brings us to a key point for slow surprises: We tend to overestimate the degree to which future work-as-done will follow our designs and plans. On the one hand, this is because of the nature of the world, and the ever-changing contexts of work. On the other hand, it is because of the nature of us, and the lethal human cocktail of ignorance, fantasy, denial and overconfidence.

Not only do our plans not always work, but our designs and plans often bring more problems. Even small changes to procedures can have disproportionately large effects. And so we experience unwelcome surprises. As work becomes more complex, unintended consequences become the thorn in the side of imagination.

For these kinds of surprises, it is rare to find procedures and training on how to detect and handle them. But in HindSight 27, Anders Ellerstrand reported on requirements for improving resilience: the abilities to respond, monitor, learn and anticipate. In short, competency is needed, from front-line operators to senior managers, to respond to, monitor, anticipate and learn from unexpected events. It should be known who has what expertise and authority to handle a given part of a situation. Expertise is not the only requirement (teamwork is critical), but almost all capability to handle surprises depends upon it.

Investment in expertise, however, is often a victim of cost-cutting in lean times. It is a mistake repeated so often that organisations seem to have lost the ability to learn even from this mistake. Since surprises will continue, and almost none will be pleasant, the question is whether we will continue to commit to our own expertise, and ensure that our organisations and professional associations support us and the wider system.

Author: Steven Shorrock

This blog is written by Dr Steven Shorrock. I am an interdisciplinary humanistic, systems and design practitioner interested in work and life from multiple perspectives. My main interest is human functioning and system behaviour, in work and life generally. I am a Chartered Ergonomist and Human Factors Specialist with the CIEHF and a Chartered Psychologist with the British Psychological Society. I work as a human factors practitioner and psychologist in safety critical industries. I am also an Adjunct Associate Professor at University of the Sunshine Coast, Centre for Human Factors & Sociotechnical Systems. I blog in a personal capacity. Views expressed here are mine and not those of any affiliated organisation, unless stated otherwise. LinkedIn: www.linkedin.com/in/steveshorrock/ Email: contact[at]humanisticsystems[dot]com

