Just Culture in La La Land

Photo: Steven Shorrock CC BY-NC-SA 2.0 https://flic.kr/p/Rpf4za

It was always going to happen.

The wrong Best Picture winner was read out live on air at The Oscars. Someone had to take the blame. Attention first turned to Warren Beatty and Faye Dunaway. They, after all, ‘touched it last’. But they had mitigating circumstances; they were given the wrong envelope. In any case, and perhaps more to the point, they are unsackable.

And so we go back a step, and ask who gave the wrong envelope? Now we find our answer: the PricewaterhouseCoopers auditors Brian Cullinan and Martha Ruiz. Both were sacked from the role of overseer shortly after the mistake.

Three key charges are levelled against Cullinan. First, he gave the wrong envelope, confusing the right envelope with the spare envelope for an award just given. Second, Cullinan posted a photo of Emma Stone to his Twitter account just before the fatal mistake. Third, when the wrong Best Picture winner was read out, he didn’t immediately jump into action. And neither did Ruiz.

They had one job to do. They had one job! And they messed up.

So what should be the response? The relevant concept here is ‘just culture’. In his book ‘Just Culture’, Sidney Dekker says that “A just culture is a culture of trust, learning and accountability”. He outlines two kinds of just culture.

Retributive Just Culture

The first kind of just culture is a retributive just culture. According to Dekker, this asks:

  • Which rule is broken?
  • Who did it?
  • How bad was the breach, and what should the consequences be?
  • Who gets to decide this?

This has been the typical form of just culture found in societies around the world for thousands of years. Most of us have been familiar with it since we were small children.

Dekker explains that with retributive just culture, we have three scenarios:

  • Honest mistake, you can stay.
  • Risk-taking, you get a warning.
  • Negligence, you are let go.

There are even commercialised algorithms to help organisations make this distinction and choose the appropriate response. David Marx’s Just Culture Algorithm advises organisations to console those who make true human errors, coach those who engage in risky behaviours, and ultimately discipline reckless behaviour.
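To make the decision rule concrete, here is a minimal sketch of the console/coach/discipline mapping described above. The category and function names are illustrative assumptions for this post, not part of Marx’s commercial algorithm or Dekker’s text.

```python
from enum import Enum

class Behaviour(Enum):
    HUMAN_ERROR = "honest mistake"   # action-not-as-planned, e.g. a slip
    AT_RISK = "risk-taking"          # a shortcut whose risk was not appreciated
    RECKLESS = "negligence"          # conscious disregard of a substantial risk

def organisational_response(behaviour: Behaviour) -> str:
    """Return the response suggested by the console/coach/discipline rule."""
    responses = {
        Behaviour.HUMAN_ERROR: "console the person and improve the system",
        Behaviour.AT_RISK: "coach against the risky behaviour",
        Behaviour.RECKLESS: "discipline",
    }
    return responses[behaviour]

# The envelope mix-up, classified as an honest mistake, would attract consoling,
# not dismissal.
print(organisational_response(Behaviour.HUMAN_ERROR))
```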

If we look at the Oscars scenario, we can address the three charges made against Cullinan and Ruiz.

On the first charge – giving the wrong envelope – we can conclude that this is an example of the ‘honest mistake’ category. This ‘honest mistake’ was influenced by a confusable envelope. In human factors and psychology, we have researched and catalogued such actions-not-as-planned for decades through diary studies, experiments, report analysis, interviews and naturalistic observation. We have many terms for such errors, including ‘slip’ and ‘skill-based error’. In doctoral research that I began 20 years ago in the context of air traffic control, I developed the ‘technique for the retrospective and predictive analysis of cognitive error’ (‘TRACEr’, download here). With TRACEr, we would probably classify this kind of error as Right action on wrong object associated with Selection error involving Perceptual confusion and Spatial confusion, which would be associated with a variety of performance shaping factors – aspects of the context at the time such as design, procedure, pressure and distraction. We’ve all done it, like when you pick up the wrong set of near-identical keys from the kitchen drawer, or the wrong (identical-looking) suitcase from the airport luggage carousel. In stressful, loud, distracting environments and with confusable artefacts, the chances of such simple actions-not-as-planned increase dramatically.
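As a rough illustration only, the classification described above could be recorded as a simple data structure. The field names and groupings here are assumptions made for this sketch; they are not TRACEr’s official taxonomy or software.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ErrorClassification:
    """Illustrative record of an action-not-as-planned, loosely following the
    description in the text; not the actual TRACEr structure."""
    error_mode: str                              # what happened, observably
    error_mechanisms: List[str]                  # how it is thought to have gone wrong
    performance_shaping_factors: List[str] = field(default_factory=list)

envelope_mix_up = ErrorClassification(
    error_mode="Right action on wrong object (Selection error)",
    error_mechanisms=["Perceptual confusion", "Spatial confusion"],
    performance_shaping_factors=["confusable envelope design", "distraction",
                                 "time pressure", "stress and noise"],
)

print(envelope_mix_up)
```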

On the second charge, some might argue that posing for a photograph and sending tweets just prior to handing out the ‘Best Picture’ envelope is risk-taking, or even negligence. The TMZ gossip site wrote, “Brian was tweeting like crazy during the ceremony, posting photos … so he may have been distracted. Brian has since deleted the tweets.” Meanwhile, People reported an anonymous source who claimed that “Brian was asked not to tweet or use social media during the show. He was fine to tweet before he arrived at the red carpet but once he was under the auspices of the Oscar night job, that was to be his only focus.” The source reportedly continued, “Tweeting right before the Best Picture category was announced was not something that should have happened.” We can’t verify whether this is true and, if so, who asked him not to use social media during the show. It is certainly sensible advice, bearing in mind what we know about distraction in safety-critical industries and its role in accidents such as the 2013 train crash at Santiago de Compostela.

But perhaps the acid test for this assertion is whether people would have said anything about that photograph or tweet had everything gone according to plan. Just culture requires that we isolate the outcome from the behaviour. Applying the definition and principles of retributive just culture, what we are interested in is the behaviour. If the right envelope had been given, then the photo on Twitter would likely have been retweeted hundreds or thousands of times, and reported on various gossip websites and magazines, with no judgement from the press and public about the wisdom of such an activity. Instead, the photo would have been celebrated, and any deviation from alleged instructions ‘not to tweet or use social media during the show’ would have been laughed away.

The third charge, levelled at both accountants, was that they failed to respond in a timely manner on hearing “La La Land”. The prospect of an erroneous announcement was clearly imaginable to Cullinan and Ruiz, who spoke to The Huffington Post about this scenario just a week or so before that fateful night: “We would make sure that the correct person was known very quickly,” Cullinan said. “Whether that entails stopping the show, us walking onstage, us signaling to the stage manager — that’s really a game-time decision, if something like that were to happen. Again, it’s so unlikely.” But could it be that, live on the night of the biggest show on earth, with the eyes of tens of millions upon them, they froze? Again, TRACEr might classify this as an Omission associated with No decision and Decision freeze, with a variety of performance shaping factors such as stress and perhaps a lack of training (e.g., simulation or practice).

The ‘freeze’ response is the neglected sibling of ‘fight’ and ‘flight’, and occurs in traumatic situations. It’s the rabbit-in-the-headlights response. Many people involved in accidents and traumatic events have been known to freeze, including in aircraft accidents. It is a psychophysiological response and few of us can claim immunity. If we take this as an example of freeze, associated with confusion, shock and fear, then can we say this is an ‘honest mistake’? Even this seems not to fit well, but for the sake of the retributive just culture process, let’s classify this omission as such (since it would seem hideously harsh to judge a psychophysiological response as ‘risk taking’ or ‘gross negligence’).

Now we have two counts of ‘honest mistake’ for Cullinan, and one for Ruiz, and one count for Cullinan where we are unsure of its classification. But if the tweet would not have been seen as a problem had the error not occurred, then no harsh personal response is justified.

But they had one job! And such an important job (by Hollywood standards)! And it’s not like they are losing their actual jobs or their liberty. It’s hard to feel sorry for two well-paid accountants, mingling with Hollywood celebs during one of the biggest shows on earth. And remember that the consequences for PwC are not insignificant. An unnamed source told ‘People’ that “The Academy has launched a full-scale review of its relationship with PwC but it is very complicated.” So surely cancelling their involvement is justified, along with a few stories in the media?

Put aside for one moment that the pair are celeb-mingling accountants, and think of them as Brian and Martha – two human beings with families and feelings and ordinary lives outside of this extraordinary day. Most of us have experienced some kind of humiliation in life. It is deeply unpleasant and the memory can resonate for months, years, or a lifetime. Most of us, though, have not felt this humiliation in front of tens of millions of people on live TV, played back by hundreds of millions afterwards. Most of us have not been the subject of thousands of global news stories – and over a million web pages – with front-page stories labelling us a ‘loser’ and a ‘twit’, and a ‘bungling bean counter’, with press hounding us and our families. Most of us have not been subject to hundreds of thousands of comments and memes on social media, nor have we needed bodyguards due to death threats. This is the reality for Brian Cullinan and Martha Ruiz.

Restorative Just Culture

There is another way, and according to Dekker this is restorative just culture. Dekker says that a restorative just culture asks:

  • Who is hurt?
  • What do they need?
  • Whose obligation is it to meet that need?
  • How do you involve the community in this conversation?

Here we might say that those hurt might include the producers of La La Land and Moonlight, though neither have given that impression since the event. We might also list The Academy and PwC, in terms of reputational damage.

But the individuals most hurt are surely Brian and Martha. What do they need? That we don’t know, but it is certain that their needs are not met by the response so far. Whose obligation is it to meet that need? Here one might say it is the obligation of The Academy and PwC, but we all have an obligation at least not to cause further harm.

The event may live on as an example to individuals and organisations in safety-critical, security-critical and business-critical industries when ordinary front-line workers get caught up in accidents that they never wanted to happen. Should we scapegoat pilots and air traffic controllers, or doctors and nurses, for good-will actions and decisions with unintended consequences? Or should we seek to understand and redesign the system to increase the chances of success in the future? The choice will influence whether front-line workers disclose their ‘honest mistakes’, or cover them up. In his book Black Box Thinking, Matthew Syed explains that “Failure is rich in learning opportunities for a simple reason: in many of its guises, it represents a violation of expectation. It is showing us that the world is in some sense different from the way we imagined it to be.”

The event is also a challenge to us, to society. Syed notes that “Society, as a whole, has a deeply contradictory attitude to failure. Even as we find excuses for our own failings, we are quick to blame others who mess up.” He continues, “We have a deep instinct to find scapegoats.” We are deeply hypocritical in our response to failure. He describes examples from healthcare and aviation, where, on reading or hearing about an accident, we feel “a spike of indignation”, “fury”, and a desire to stigmatise.

Paradoxically, the families of victims of accidents often have empathy for the front-line workers involved, and have a far more systemic view of the events than the general public, politicians, or – in many cases – official accident reports. This can be seen in the case of Martin Bromiley, whose wife died during a routine operation. Martin Bromiley went on to set up the Clinical Human Factors Group, and campaigns for just culture (see this video). It can also be seen in the families of those who died in the train crash at Santiago de Compostela in 2013, which was blamed on ‘human error’ both in the press and in the official accident report (Spanish version). Following a review of the official accident report by the European Railways Agency, Jesús Domínguez, chairman of the Alvia victims’ association, told The Spain Report that “it confirms that the sole cause is not human error and that the root causes of the accident still need to be investigated”. On 28 July 2013, Garzón Amo was charged with 79 counts of homicide by professional recklessness and an undetermined number of counts of causing injury by professional recklessness. The charges still stand today. (See Schultz et al., 2016 for a more detailed treatment of the accident.)

Of course, we cannot compare the outcome of The Oscars with any event involving loss of life. But the point is that our corporate and societal responses are similar, and have recursive effects, as Syed explains:

It is partly because we are so willing to blame others for their mistakes that we are so keen to conceal our own. We anticipate, with remarkable clarity, how people will react, how they will point the finger, how little time they will take to put themselves in the tough, high-pressure situation in which the error occurred. The net effect is simple: it obliterates openness and spawns cover-ups. It destroys the vital information we need in order to learn.

A scapegoat, or safer systems? We can’t have both

So we have two options available to us. According to Dekker, retributive justice asks who was responsible, and sets an example where those responsible have crossed the line. Restorative justice asks what was responsible, seeks to change what led up to the incident, and meets the needs of those involved. Both are necessary, and both can work and result in fair outcomes for individuals and society, and better learning. But – especially outside of the judiciary – perhaps the latter is more effective and humane. If we want to learn and improve outcomes in organisations and society, we should focus on human needs and on improving the system.

The Just Culture in La La Land approach takes the retributive route, but gets it badly wrong. Blaming individuals for their actions-not-as-planned in messy environments has destructive and long-lasting effects on individuals, families, professions, organisations, industries and society as a whole.

In the end, we all have one job. Our job is to learn.

See also

Human Factors at The Oscars

Just culture: Who are we really afraid of?

Safety-II and Just Culture: Where Now?

Human Factors at The Fringe: My Eyes Went Dark

Never/zero thinking

‘Human error’ in the headlines: Press reporting on Virgin Galactic

Life After ‘Human Error’ – Velocity Europe 2014

‘Human error’: The handicap of human factors, safety and justice


Human Factors at The Oscars


Photo: Craig Piersma CC BY-NC-ND 2.0 https://flic.kr/p/8NyHL6

“An extraordinary blunder”

It has variously been described as “an incredible and almost unbelievable gaffe” (Radio Times), “the greatest mistake in Academy Awards history” (Telegraph), “an extraordinary blunder…an unprecedented error” (ITV News), “the most spectacular blunder in the history of the starry ceremony” and “the most awkward, embarrassing Oscar moment of all time: an extraordinary failure” (Guardian).

It was, of course, the Grand Finale of the Oscars 2017.

Faye Dunaway and Warren Beatty are all set to announce the Best Picture winner. Beatty looks visibly puzzled, pausing and looking in the envelope to see if there is anything else that he’s missed. He begins to read out the winner’s card, “And the Academy Award…”. He pauses and looks in the envelope again. “…for Best Picture”. He looks at Dunaway, who laughs “You’re impossible!”, then hands the card to her. Dunaway, perhaps assuming this is all for effect, simply reads out what she sees, and announces “La La Land!”

Music sounds and a narrator gives a 17-second spiel about the film: “La La Land has fourteen Oscar nominations this year, and is tied for the most nominated movie in Oscar history, winning seven Oscars…”

The La La Land team exchange embraces and walk to the stage. Jordan Horowitz, a producer, delivers the first thank-you speech. Everything looks normal. But as the second and third thank-you speeches are being delivered, there is visible commotion. A member of the Oscars production team takes back the envelope that has been given to the La La Land producers.

The winner’s envelope is, in fact, the envelope for Best Actress, just given to La La Land’s Emma Stone. Behind him, the PricewaterhouseCoopers overseers – Brian Cullinan and Martha Ruiz – are on stage, examining the envelopes.

At the end of his speech, Producer Fred Berger says nervously: “We lost, by the way”. Horowitz takes over, “I’m sorry, there’s a mistake. Moonlight, you guys won Best Picture“. Confused claps and cries ensue. “This is not a joke“, Horowitz continues. Beatty now has the right card, but Horowitz takes it out of Beatty’s hand and holds it up to show the names of the winning producers.

Beatty tries to explain, and is interrupted by host Jimmy Kimmel: “Warren, what did you do?!” Beatty continues, “I want to tell you what happened. I opened the envelope and it said, ‘Emma Stone – La La Land’. That’s why I took such a long look at Faye and at you. I wasn’t trying to be funny.” Horowitz hands his Oscar to Barry Jenkins, Moonlight’s director.

It was “the first time in living memory that such a major mistake had been made” (Reuters). The accountancy firm PricewaterhouseCoopers has apologised and promised an investigation. In a statement, they said, “The presenters had mistakenly been given the wrong category envelope and when discovered, was immediately corrected. We are currently investigating how this could have happened, and deeply regret that this occurred. We appreciate the grace with which the nominees, the Academy, ABC, and Jimmy Kimmel handled the situation”.

Such a mistake, in an ordinary setting, is usually quite uneventful. Similar sorts of things happen every day. The only thing that is “incredible”, “spectacular” and “extraordinary” is the context. It is worth, then, looking a little deeper at this extraordinary event, and considering how similar sorts of events play out in many ordinary, but critical, contexts.

Design

The design of the envelopes for the various Oscar awards is identical. The only difference between the envelopes is the text that indicates the category. There is no other means of coding (e.g., colour, pattern) to indicate any difference. Several industries have realised the problem with this approach, and in some ways this can be considered the beginnings of the discipline of human factors and ergonomics: “A seminal study that set the agenda for the scientific discipline of human factors was by the experimental psychologists, Fitts and Jones (1947), who adapted their laboratory techniques to study the applied problem of ‘pilot error’ during WWII. The problem they faced was that pilots of one aircraft type frequently retracted the gear instead of the flaps after landing. This incident hardly ever occurred to pilots of other aircraft types. They noticed that the gear and flap controls could easily be confused: the nearly identical levers were located right next to each other in an obscure part of the cockpit” (van Winsen and Dekker, 2016) .

This problem still exists today in settings far more important than The Oscars, but far less newsworthy…until disaster strikes. A notable example is medicine packaging, where very similar labels are used for different drugs or doses. Many packages and labels require users to force attention onto small details of text, perhaps with the addition of a small area of colour which, on its own, is quite inconspicuous. It is asking a lot of people to make critical – sometimes life-and-death-critical – decisions based on small design features. This is in addition to drug names that look alike or sound alike, such as Aminophylline and Amitriptyline, or Carbamazepine and Chlorpromazine, or Vinblastine and Vincristine.
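To illustrate just how confusable such names can be, here is a minimal sketch that screens a hypothetical formulary for look-alike name pairs using a standard string-similarity measure from the Python standard library. The threshold and the screening approach are illustrative assumptions, not a method drawn from the guidelines mentioned below.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical formulary containing known look-alike/sound-alike pairs.
formulary = ["Aminophylline", "Amitriptyline", "Carbamazepine",
             "Chlorpromazine", "Vinblastine", "Vincristine"]

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1]; higher means the names look more alike."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Flag pairs that a designer might want to differentiate with additional
# coding (colour, shape, 'tall man' lettering, packaging layout).
for a, b in combinations(formulary, 2):
    score = similarity(a, b)
    if score > 0.7:  # illustrative threshold
        print(f"{a} / {b}: {score:.2f}")
```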

Experience of human factors suggests a number of coding methods (e.g., shape, colour, size) that, used appropriately, can help to make vital distinctions. There are also several design guidelines for medicines by NHS NPSA (2007) and the European Medicines Agency (2015). In human factors/ergonomics, these are used as part of an iterative human-centred design method that understands stakeholders and context, identifies user needs, specifies design requirements, produces prototypes, and tests them.

In the absence of this process, what is amazing is not that such errors occur, but that they do not occur much more often than they do. Because it happens fairly infrequently, when it does happen it is often (and unhelpfully) branded ‘human error’. But this is not simply a problem of ‘human error’. It is a problem of design, where form (such as branding and aesthetics) so often trumps function. As Hollnagel (2016) states, “The bottom line is that the artefacts that we use, and in many cases must use, should be designed to fit the activity they are intended for“. Form-over-function design places the human in a position where they have to bridge the gap between form and function every time they use an artefact.

Safeguards

For the Oscars, two identical sets of the winners cards are made for ‘safety purposes’. These duplicate envelopes are held in the wings in case anything should go wrong with a presenter or an envelope. In this case, it may be that the duplicate of the Best Actress award, which had just been announced, was handed to Beatty as he walked out to announce the Best Picture winner.

Safeguards feature in most safety critical industries, and are often the result of a risk assessment that specifies a risk control for an identified risk. But the risk assessment process is often a linear cause-effect process, and it often stops at the risk control. And risk controls can have unintended consequences and introduce new risks. Consider this example in the context of aviation and air traffic control:

In early 2014, the UK experienced a prolonged period of low atmospheric pressure. At the same time, there was an unusual cluster of level busts [where aircraft go above or below the flight level or altitude instructed by ATC] at the transition altitude, which were thought to be linked to incorrect altimeter setting on departure into the London TMA [London airspace].

Level busts have been, and remain, a key risk in NATS operation. Longer-term strategic projects, such as the redesign of the London TMA and the raising of the Transition Altitude, are expected to provide some mitigation. However, to respond tactically to the perceived trend in the short-term, it was decided to issue a Temporary Operating Instruction (TOI) to controllers.

The TOI required the inclusion of additional phraseology when an aircraft was cleared from an altitude to a Flight Level during low pressure days. The additional phraseology was “standard pressure setting” e.g. “BigJet123, climb now FL80, standard pressure setting”. The change was designed to remind pilots to set the altimeter to the standard pressure setting (1013 hPa) and so reduce level busts associated with altimeter setting. As this phrase was deemed to be an instruction, it was mandatory for flight crews to read back this phrase.

The TOI was subject to the usual procedural hazard assessment processes and implemented on 20 February 2014 on a trial basis, with a planned end date of 20 May 2014, after which the trial results would be evaluated. The change was detailed in Notices to Airmen (NOTAMs).

During the first day of implementation, several occurrence reports were received from controllers, who noted that flight crews did not understand the meaning of the phraseology, and did not read back as required. This led to additional radio telephony to explain the instruction, and therefore additional workload and other unintended consequences.

Extract from case study by Foster, et al, in EUROCONTROL (2014). 

Every industry has many examples of ‘safeguards gone bad’. We often fail to understand how such changes change the context and introduce secondary problems.

Decision making under uncertainty

Beatty is standing there, with the eyes of tens of millions of viewers upon him. He is being recorded in perpetuity, for viewing by hundreds of millions more. He has to make a decision about an announcement, which will feel like a gold Olympic medal to a few producers. But he isn’t sure what’s going on. As Beatty explained, “I opened the envelope and it said, ‘Emma Stone – La La Land’. That’s why I took such a long look at Faye and at you. I wasn’t trying to be funny”.

Here we cannot be certain what was going through Beatty’s mind, but could it be that – live on one of the most important TV events in the world – Beatty did not want to voice his confusion and uncertainty? He appeared visibly puzzled and gave the envelope to Dunaway to read out the ‘winner’. Dunaway could not have known about Beatty’s thoughts, since his behaviour could easily have been a time-filler or fumbling joke, and of course it made sense to her to simply read what she saw: “La La Land”.

When under pressure, any delay can have associated costs. For Beatty, asking for clarification would have meant an awkward period of filler, a clumsy live-on-air check of envelopes, perhaps a loss of advertising time. In a state of confusion and self-doubt, perhaps it made sense to say nothing and pass the confusing artefact to someone else.

In many safety-critical activities, decisions are made under uncertainty. The information and situation may be vague, conflicting or unexpected. In some cases, there is a need to signal confusion or uncertainty, perhaps to get a check, or to ask for more time. It can seem hard for us to give voice to our uncertainty in this way, especially under pressure. When someone has a command position – in an operating theatre, cockpit, or at the Oscars – it can be difficult for that person to indicate that they are not sure what is going on. This has played out in several accidents, and indeed in everyday life. But sometimes, the most powerful phrase may be something along the lines of, “I do not understand what is going on”. This identifies a problematic situation and opens the door for other members of the team to help problem-solve. This kind of intervention is part of many training programmes for ‘team resource management’ (by whatever name), and can help everyone involved – no matter what their formal position – to voice and resolve their doubts, uncertainties and concerns.

It’s just an awards show

The events of Oscars 2017 will be emblazoned forever on the minds of participants and aficionados. But it will also soon be a feature of a trivia game or TV show. As host Jimmy Kimmel said “Let’s remember, it’s just an awards show.” But for those who have to put up with the same sorts of problems every day, it’s much more than that. In many industries, people help to ensure that things go well despite other aspects of the system and environment in which they work. For the most part, the human in the system is less like a golden Oscar, and more like a Mister Fantastic, using abilities of mind and body to connect parts of systems that only work because people make them work. This aspect of human performance in the wild is usually taken for granted. But in the real world, people create safety. And for that, they deserve an Oscar.

References

EUROCONTROL (2014). Systems Thinking for Safety: Ten Principles. A White Paper. Brussels: EUROCONTROL Network Manager, August 2014. Authors: Shorrock, S., Leonhardt, J., Licu, T. and Peters, C.

Hollnagel, E. (2016). The Nitty-Gritty of Human Factors (Chapter 4). In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. Boca Raton, FL: CRC Press.

van Winsen, R. and Dekker, S. (2016). Human Factors and the Ethics of Explaining Failure (Chapter 5). In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. Boca Raton, FL: CRC Press.

See also

Just Culture in La La Land

‘Human error’ in the headlines: Press reporting on Virgin Galactic

Life After ‘Human Error’ – Velocity Europe 2014

‘Human error’: The handicap of human factors, safety and justice
The HAL 9000 explanation: “It can only be attributable to human error”
Occupational Overuse Syndrome – Human Error Variant (OOS-HEV)
‘Human error’: Still undefined after all these years


The Archetypes of Human Work: 7. Defunct

This is the seventh and last in a series of posts on The Archetypes of Human Work, which are based on the interactions or relationships between The Varieties of Human Work. For an introduction, see here.

The seven archetypes are:

  1. The Messy Reality
  2. Congruence
  3. Taboo
  4. Ignorance and Fantasy
  5. Projection
  6. P.R. and Subterfuge
  7. Defunct (this Archetype)

Each archetype includes a number of examples (currently healthcare-related). If you have further examples – from any industry – please provide an example as a comment or get in touch. More examples will be added over time.

Archetype 7: Defunct


Composition: work-as-prescribed but not as-done. May or may not be as-imagined or as-disclosed.

Short description: Some forms of prescribed work are not enacted, or else drift into disuse, but are still officially in place. Some will imagine that these are in place, while others know or think they are not. However, the existence of the Defunct work may be used to judge actual activity.

What is it? 

Much human work exists in prescribed form, such as regulations, management systems, policies, procedures, guidelines, checklists, good practice, user interface dialogues, etc. Sometimes, this work-as-prescribed does not reflect the reality of work-as-done, which might be characterised as The Messy Reality. The prescribed work still exists, but in a form which is Defunct. Sometimes, this is just a temporary matter, where work-as-prescribed for some reason does not apply. Other times, work-as-prescribed may be permanently Defunct. Work-as-prescribed may even seem quite irrelevant; few would even think about it or discuss it, let alone follow it, especially at the front line of work, or even throughout an organisation or industry sector.

Why does it exist? 

It is often the case that Defunct designed work has been prescribed without adequate attention to the design process, often an efficiency-thoroughness trade-off at the blunt end. A thorough approach to design (of interfaces, procedures, checklists, etc) would require that: 1) the stakeholders (especially the users), system, activities and context are understood; 2) stakeholder needs are investigated and design requirements specified; 3) prototypes are developed; and then 4) prototypes are tested. The testing would reveal any flaws in the implementation of this process, and thus there would be iterative loops back to each stage. If the prototype (e.g., checklist) meets the users’ and other stakeholders’ needs, then we have a final step: 5) implementation. The whole process would be planned with appropriate resources allocated (expertise, time, etc). This is a thorough approach, known as human-centred design (or ergonomics).

The ‘efficient’ approach, which is more common, is to go straight to step 5 (implementation), perhaps with some perfunctory consideration of step 1. Commercial-off-the-shelf/pre-designed systems and artefacts are often purchased, which is understandable and often completely necessary. The problem is, neither the developer nor the purchaser may have completed the previous four steps. Even if the developer has used some kind of human-centred design process, the new context and stakeholders (and therefore the stakeholders’ and users’ needs and design requirements) may well be very different. Since there is no testing, feedback is gathered in real operations, by which time it is too late. Local adaptation of the artefact (e.g., checklist, user interface dialogue) to the users’ needs may be impossible, prohibitively expensive or impractical.

People at the sharp end are now faced with a Catch-22. Either they comply with work-as-prescribed (Congruence) or they find another unprescribed solution (The Messy Reality) and the work-as-prescribed is Defunct. In either case, work-as-done may have unintended and unforeseen consequences.

Even with human-centred design, work-as-prescribed may fall into disuse. Such cases are often a mystery to those at the blunt end and even many at the sharp end. This tends to happen when the work-as-prescribed is not understood, either the details or the purpose. In such cases, continuous monitoring and discussion of work-as-done is likely to be helpful, with appropriate adjustment and education where necessary.

There may also be cases where work-as-prescribed is simply not annulled or abolished when it should be. Many organisations and governments have numerous policies, procedures, regulations, laws and so on that remain officially in place, but that no-one imagines are in use. (British law is replete with such laws. For instance, Section 54 of the Metropolitan Police Act 1839 makes it an offence to carry a plank of wood on a pavement.)

Shadow side

Many of the problems associated with the Defunct archetype concern the nature of work-as-done and work-as-imagined, and so are associated with other archetypes, especially The Messy Reality and Ignorance and Fantasy.

In some cases, work-as-prescribed is Defunct only in particular circumstances. This was the case with the QF32 engine failure. The Airbus A380 ECAM checklists could not be followed as prescribed. In such cases, the people in control are deep into The Messy Reality and have to use their judgement and experience to find alternative solutions to the problems that they face. If appropriate training is not provided to help deal with such exceptional events, then the assumption that work-as-prescribed is universally safe becomes a particular liability.

In other cases, work-as-prescribed is more or less permanently Defunct. This presents some different problems, again mostly associated with other archetypes. A particular problem concerns the consequences of not working to rule. Gaps between work-as-prescribed and work-as-done may be the basis for disciplinary and regulatory/legal action against individuals and organisations. In some cases, such action may be unfair and vindictive, for instance when Defunct rules are used as a tool for workplace bullying.

Finally, an obvious problem with this archetype is that the Defunct work might actually represent good practice with benefits for safety, health, or other goals. In this case we need to try to understand why the work-as-prescribed failed to make it over the line of reality.

Examples (Healthcare)


Of the 2184 policies, procedures and guidelines (PPGs) in my organisation, 28% are currently out of date and may therefore not reflect current practice. More interesting still are the nearly 19% of PPGs that have been opened less than 5 times in total, including by their authors. These documents are often written to meet the requirements of external agencies with the idea that not having a policy leaves the organisation vulnerable to criticism. These documents remain unopened, unused and unrelated to daily work but may be used after incidents as a form of organisational protection: “yes, we had a policy for that”.

Carl Horsley, Intensivist, @horsleycarl


In operating theatres that use lasers, certain rules and safety precautions have to be in place. Part of this is to have a risk assessment and standard written laser protection policy. This risk assessment is normally carried out by a laser protection supervisor from a distant site who has no knowledge of local practice. In addition, this tends to be written when a new laser is purchased and then is never updated. While work-as-imagined would be following the policy to the letter, if the policy is impractical for the local use of the laser, the local team will tend to develop workarounds (The Messy Reality). When there is a site visit by the laser protection supervisor, however, work-as-disclosed will follow work-as-imagined – as they are reassured that everyone follows all the rules to the letter (P.R. and Subterfuge). If a laser protection incident does however occur, the local team would all be held to account by the Defunct laser protection rules.

Craig McIlhenny, Consultant Urological Surgeon, @CMcIlhenny


When the surgical team book a patient for theatre, they are supposed to discuss this with the anaesthetic team, to explain the indication for surgery, the degree of urgency and any medical conditions the patient has. The anaesthetic team should therefore be a central point who are aware of all the patients waiting for theatre to help with appropriate prioritisation. In reality this only happens if they happen to see an anaesthetist when they book the case. More often than not, cases are “booked” with no discussion with the anaesthetist and often the cases are not ready for theatre (may need scans first for example) or may not even need an operation. This only becomes obvious when the anaesthetist goes to review the patient, or perhaps even later. Despite many organisations having guidelines about this, it still seems to happen.

Emma Plunkett, Anaesthetist, @emmaplunkett


 


The Archetypes of Human Work: 6. P.R. and Subterfuge

This is the sixth in a series of posts on The Archetypes of Human Work, which are based on the interactions or relationships between The Varieties of Human Work. For an introduction, see here.

The seven archetypes are:

  1. The Messy Reality
  2. Congruence
  3. Taboo
  4. Ignorance and Fantasy
  5. Projection
  6. P.R. and Subterfuge (this Archetype)
  7. Defunct

Each archetype includes a number of examples (currently healthcare-related). If you have further examples – from any industry – please provide an example as a comment or get in touch. More examples will be added over time.

Archetype 6: P.R. and Subterfuge


Composition: work-as-disclosed and often as-prescribed, but not as-done. May or may not be as-imagined by the discloser. 

Short description: This is what people say happens or has happened, when this does not reflect the reality of what happens or happened. What is disclosed will often relate to what ‘should’ happen according to policies, procedures, standards, guidelines, or expected norms, or else will shift blame for problems elsewhere. What is disclosed may be based on deliberate deceit (by commission or omission), or on Ignorance and Fantasy, or something in between… The focus of P.R. and Subterfuge is therefore on disclosure, to influence what others think.

What is it?

Work-as-disclosed is what people say (in verbal or written form) about work-as-done by themselves or others, and is the dominant variety of human work in the P.R. and Subterfuge archetype. ‘P.R.’, in this context, could stand for ‘Public Relations’ or ‘Press Release’, which focus on disclosure but not necessarily reality. P.R. could also mean ‘Pre-Reality’ (disclosing that something is real before it really is real) or ‘Post-Reality’ (where “words don’t matter nearly as much as the intent, the emotion, the subtext…”, Seth’s Blog). It might also be seen as what is now called ‘alternative facts’ and fake news. P.R. and Subterfuge is commonly associated with politicians, spin doctors, lawyers, lobbyists, reporters, public relations specialists, sales people, and advertisers, but will be familiar to most, to some degree.

P.R. and Subterfuge tends to concern what in-group members say about work-as-done to out-group members. It is especially evident when people have to disclose the circumstances of failures or compliance with regulations, management systems, policies, procedures, guidelines, checklists, good practice, etc. to internal specialists (e.g., auditors, investigators, competency assessors, doctors, HR, senior managers) or outside agencies, organisations or individuals (e.g., regulators, supervisory bodies, professional associations, judiciary, journalists, citizens, interfacing organisations). It includes what is said or written, and what is not, in audits, investigations, inquiries, press releases, interviews, freedom of information requests, corporate communications, social media, etc.

P.R. and Subterfuge may involve varying levels of deception. Generally, where the consequences of disclosure are pertinent, unless the other party is trusted, people will tend to describe the work that they do in a way that accords with work-as-prescribed or (what is thought to be) work-as-imagined by the other party. In some cases, the difference between work-as-disclosed and work-as-done with P.R. and Subterfuge is very much deliberate, from minor omission to large-scale cover-ups. In such cases, a partner archetype will often be found in Taboo; the aspects of work-as-done that cannot be discussed openly will be omitted from P.R. and Subterfuge. In other cases, there may be no intentional deceit on the part of the discloser, but what is disclosed may be fed by subterfuge by others.

Why does it exist?

There is often a need to describe or explain performance, both internally within organisations and outside of organisations. What is said (work-as-disclosed) will clearly influence the work-as-imagined of these others, and this is the primary purpose of P.R. and Subterfuge. Because work-as-disclosed does not align with work-as-done, P.R. and Subterfuge will tend to feed the archetype Ignorance and Fantasy in others, inadvertently or deliberately.

The reasons for P.R. and Subterfuge are varied but many of these can be grouped into two major categories: ignorance and fear. Often, those who are distant from work-as-done talk about it based on Ignorance and Fantasy. Such individuals are reliant on their work-as-imagined, knowledge of work-as-prescribed, and work-as-disclosed by others. For instance, a corporate communications specialist, press officer, or a senior manager, will tend to know little about the specifics of how front-line workers actually work, and will rely on others for this information.

P.R. and Subterfuge can also be motivated by fear of possible consequences should the reality of work-as-done be revealed. These consequences for individuals and organisations may relate to legal action, bad publicity, journalistic inquiry, regulatory investigation or sanctions, fines, cut backs to funding or resources (e.g., staff, training), loss of reputation or status (individual or organisational), loss of profession, operating/professional licence or livelihood, and in extreme cases, loss of liberty. The perceived risk of such consequences will tend to shape what is disclosed, what is not, and what else is said.

It may seem like P.R. and Subterfuge is the product of dishonest organisations and individuals, but a number of systemic features of organisations and industries can  cultivate the archetype. Examples include aspects of regulatory practice, management control measures, procedural constraints, measures, information flows, performance targets, incentive systems, punishments, and goals (especially goal conflicts). In the face of conditions or interventions that get in the way of the work (and potentially make it unsafe or otherwise ineffective), individuals and groups may justify P.R. and Subterfuge via a perceived higher purpose or goal. An illusion of Congruence may be created for out-groups, perhaps in response to the Defunct archetype, or to try to see off damaging interventions based on a superficial and inaccurate perception of work-as-done, such as cutbacks to resources (e.g., cutbacks to staff based on observation of a quiet period) or inappropriate constraints (e.g., procedural diktats based on one incident). P.R. and Subterfuge may therefore offer perceived benefits by protecting people from unwanted and potentially damaging outside influence or intervention which does not recognise the reality of work.

Shadow side

P.R. and Subterfuge, especially in its more deceptive form, involves a variety of ethical problems and dilemmas. More generally, it increases further the distance between work-as-imagined and work-as-done. Work-as-prescribed may become increasingly detached from reality, perhaps Defunct, thus invalidating many organisational and regulatory control measures, which are tied to  work-as-prescribed. Work-as-done (and associated risks) remains unknown to most stakeholder groups. This creates problems of safety, accountability and liability.

In many industries, organisations have been known to cover up work-as-done (especially The Messy Reality) when things have gone wrong (see this reported decades-long cover-up by DuPont, which has long promoted itself as a “world class safety leader”). In explaining failure, the activity of an organisation may be Taboo, and what is disclosed may differ markedly from what is found by an independent inquiry. In 2014, four DuPont workers died in a toxic gas leak (see here). The U.S. Chemical Safety Board inspectors said the reasons for the accident related to the corporate safety culture nationwide, citing design flaws in DuPont’s complex pesticide production unit, inadequate gas detectors, outdated alarms and broken ventilation fans. DuPont, the company originating from the founder of the ‘zero injury’ philosophy (chemist and industrialist Éleuthère Irénée du Pont de Nemours, 1771-1834), attributed the cause of the disaster to actions by rank-and-file employees. The tendency of organisations to point the finger at sharp-end workers is an example of P.R. and Subterfuge which perpetuates P.R. and Subterfuge among rank-and-file employees, in order to protect themselves from blame; a spiral of subterfuge.

Examples (Healthcare)


Commissioners often use CQUINs (Commissioning for Quality and Innovation payments framework) to drive innovation and quality improvement in the NHS. In theory, the metrics relating to individual CQUINs are agreed between commissioners and clinicians. In practice, some CQUINs focus on meaningless metrics. A hypothetical example: a CQUIN target for treating all patients with a certain diagnosis within an hour of diagnosis is flawed due to a failure of existing coding systems to identify relevant patients. Clinicians inform the commissioners of this major limitation and offer suggested improvements to the metrics. These suggested improvements are not deemed appropriate by the commissioning team because they deviate significantly from previously agreed definitions for the CQUIN. The clinicians are demotivated by the process of collecting meaningless data and are tempted to use gaming solutions to report best performance. This situation is exacerbated by pressure from the management team within the NHS Trust who recognise that failure to demonstrate adherence to the CQUIN key performance indicators is associated with a financial penalty. The management team listen to the clinicians and understand that the data collection is clinically meaningless, but insist that the clinical team collect the data anyway. The motivational driver to improve performance has moved from a desire to improve clinical outcomes to a desire to reduce financial penalties. The additional burden is carried by the clinical team who are expected to collect meaningless data without any additional administrative or job plan support.

Anonymous, NHS paediatrician


It is one thing when you find out that your local hospital has suffered serious failures of care resulting in numerous preventable deaths, it is another when you find that hospital is involved, if not in blatant cover-up, in obscuring the extent of the problems. But when you find the organisations responsible for regulating hospitals have not only failed to maintain standards but are complicit in their own cover-ups then you can begin to despair whether you will ever get to the bottom of just how and why these tragedies occurred. [Extract from Joshua’s Story, by James Titcombe – used with permission.]

James Titcombe, Father of Joshua Titcombe, who died nine days after his birth at Furness General Hospital in Barrow in October 2008, @JamesTitcombe.


Healthcare staff often have to complete mandatory online modules, e.g. in fire safety, manual handling, blood transfusion. The modules have a pass rate (e.g. 80%) and sometimes a maximum number of attempts before the healthcare worker is locked out and has to discuss their poor performance with their line manager. Healthcare workers may then sit down in groups to share the correct answers and therefore pass the module.

Anonymous


The use of checklists for the prevention of Central Line Associated Bacteraemia (CLAB) is well described and has been taken up widely in the healthcare system. The purported benefits of the checklist include ensuring all steps are followed as well as opening up communication between team members. After introducing the CLAB bundle into our Intensive Care Unit, we saw very high levels of reported checklist compliance followed by the expected drop in our rates of infection, confirming the previously reported benefits. However, when we observed our staff it became apparent that they were actually filling in the checklist retrospectively without watching the procedure, as they were busy with other tasks. The fall in the CLAB rate could therefore not have been due to the use of a checklist and instead appears to be due to the use of “CLAB packs”. These put all required items for central line insertion into a single pack, thereby making it easier for staff to perform the procedure correctly.

Carl Horsley, Intensivist, @horsleycarl.


 


The Archetypes of Human Work: 5. Projection

This is the fifth in a series of posts on The Archetypes of Human Work, which are based on the interactions or relationships between The Varieties of Human Work. For an introduction, see here.

The seven archetypes are:

  1. The Messy Reality
  2. Congruence
  3. Taboo
  4. Ignorance and Fantasy
  5. Projection  (this Archetype)
  6. P.R. and Subterfuge
  7. Defunct

Each archetype includes a number of examples (currently clinical). If you have further examples – from any industry – please provide an example as a comment or get in touch. More examples will be added over time.

Archetype 5: Projection


Composition: work-as-imagined, often as-prescribed and perhaps as-disclosed. May or may not be as-done.

Short description: We are prone to imagine that things will work according to a plan, and prone to wishful thinking, ignoring the potential for problems. The focus of Projection is the imagination of the future, as we think it will be, or would like it to be.

What is it?

When we need to design or plan human work, we project our imagination into the future. Informally, we plan our or others’ work, at some level, over the coming minutes, hours, days, months or years. Projection may involve planning a task about to be performed, via mental preparation, or the use of specific tools. Or it might involve planning a new system to be implemented some time in the future. This formal Projection might involve new infrastructure or major changes to existing infrastructure or facilities (such as hospitals, airports or railways), changes to equipment, changes to staffing and competency, changes to artefacts of management (such as performance targets or league tables) or changes to procedures. For changes to the design of work, there will be some kind of prescription of how we think things should happen, and this may be communicated to others, in designs, plans, procedures, etc. We might also try to project what we don’t want to happen, perhaps via hazard identification or risk assessment.

Why does it exist?

Projection serves our need to reduce fear and uncertainty about the future, and have some confidence that our future needs will be met.

Shadow side

In our attempts to bring future work-as-done into the present, Projection will often be far from the mark, and will usually be inaccurate in some way or other. Even when you are familiar with work-as-done now, Projection of future work-as-done, and related resources (including time), can be very unreliable. We tend to overestimate the degree to which future work-as-done will follow our designs and plans (due to overconfidence, lack of imagination, wishful thinking, variability in demand and resources, etc.). We also tend not to foresee unwanted side-effects or long-term consequences of our designs and plans. Even small changes can have disproportionately large effects.

It is difficult to project with accuracy even seemingly straightforward activities, but as work becomes more complex, emergence becomes the thorn in the side of Projection. We try to overcome this with the application of formal methods, but most of these involve decomposing future tasks and related systems into parts, considering these parts, and using these parts to project performance. Because of interactions between activities and the environment, adaptation, and the effects of multiple changes over time, future work-as-done cannot always be projected in this way, and so is often not as expected. The mismatch between what we expect and what happens will tend to increase with complexity.

At the blunt end, those involved in the design of future work may engage in Projection on a basis of Ignorance and Fantasy, especially if they are distanced from The Messy Reality of work-as-done even today. Close proximity to work-as-done is no guarantee of success in predicting the future, but increasing distance – which is common – stretches the feedback loop back to imagination and design.

As work-as-done comes to fruition, other archetypes emerge. The Messy Reality will tend to rise to the surface, of course alternating with Congruence, and instances of Taboo may also emerge as certain aspects of work – at the blunt end or sharp end – cannot be discussed openly, perhaps replaced with P.R. and Subterfuge. Unwanted effects are covered up by day-to-day adaptations at the sharp end, perpetuating Ignorance and Fantasy at the blunt end. The Defunct archetype will also tend to unfold over time, as policies, procedures and plans remain docked in the work-as-imagined of days gone by…

Examples (Healthcare)


The computerised estimation of the time it will take to perform a case in theatre can be an example of Projection. Theatre scheduling uses the average time that similar cases have taken in the past to predict how long a case will take in the future. Individual patient, surgical and anaesthetic factors are not considered. Sometimes this is accurate, but other times it is not. It is therefore a crude system, although it is the best that we have at present. The problem comes when staff feel they have failed when cases take longer than the projection and the theatre list overruns. This is inevitable given the nature of the system.

Emma Plunkett, Anaesthetist, @emmaplunkett.
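As a rough illustration of the averaging approach described in the example above, here is a minimal sketch with hypothetical data and function names; the actual theatre scheduling software is not specified here.

```python
from statistics import mean

# Hypothetical record of how long similar past cases took, in minutes.
past_durations = {
    "laparoscopic cholecystectomy": [95, 110, 80, 130, 100],
}

def projected_duration(procedure: str) -> float:
    """Project a case duration from the historical average alone.

    Individual patient, surgical and anaesthetic factors are ignored,
    which is why real cases often overrun the projection.
    """
    return mean(past_durations[procedure])

print(projected_duration("laparoscopic cholecystectomy"))  # 103.0 minutes
```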


Installation of computerised medical systems can display this trait. For instance, with the installation of a fully computerised system for ordering all sorts of tests (radiology requests, lab requests, etc.), work-as-imagined (and as-prescribed) was that this would make work more efficient and safer, with less chance of results going missing or being delayed. Prior to the installation there was much chat (work-as-disclosed), with widespread talk of how effective and efficient this would be. After installation it became apparent that the system did not fulfil the design brief and, while it could order tests, it could not collate and distribute the results. So work-as-done then reverted to the system that was in place before, where secretaries still had to print results on bits of paper and hand them to consultants to action.

Craig McIlhenny, Consultant Urological Surgeon, @CMcIlhenny.


There are a lot of discussions about how electronic solutions will solve all the problems! Medicines reconciliation still remains a challenge, on admission and discharge, and there is great faith put into how electronic solutions will solve these. They are seen as reducing risks but often just introduce other different risks. Fundamentally we still need competent practitioners to be able to use good clinical judgement and clear decision making for them to be effective.

Anonymous, Pharmacist.



The Archetypes of Human Work: 4. Ignorance and Fantasy

This is the fourth in a series of posts on The Archetypes of Human Work, which are based on the interactions or relationships between The Varieties of Human Work. For an introduction, see here.

The seven archetypes are:

  1. The Messy Reality
  2. Congruence
  3. Taboo
  4. Ignorance and Fantasy (this archetype)
  5. Projection
  6. P.R. and Subterfuge
  7. Defunct

Each archetype includes a number of examples (currently healthcare-related). If you have further examples – from any industry – please provide an example as a comment or get in touch. More examples will be added over time.

Archetype 4: Ignorance and Fantasy


Composition: work-as-imagined, often as-prescribed but not as-done (may or may not be as-disclosed).

Short description: This is what people don’t know about real work and what they imagine happens. The imagination may relate to official policy, procedures, standards, guidelines, etc. that people assume are in force, or there may just be a general impression of how things work and should work. The primary focus of Ignorance and Fantasy is the imagination of those removed from the actual work.

What is it?

The Ignorance and Fantasy archetype concerns work-as-imagined, usually in the minds of those who are more distant from the work, who often lack knowledge about how things work, perhaps imagining that work is a reflection of what is actually prescribed. Ignorance and Fantasy may be harmless, but if it is disclosed inappropriately in verbal or written form (e.g., to those who can invalidate it or hold people to account for it), or if it is the basis of decisions about the actual work (e.g., demand, resources, constraints), then it may be harmful. As it applies to current work, Ignorance and Fantasy will tend to apply more to some policy makers, journalists, senior managers, other professions (who do not do the work), and the public. Ignorance and Fantasy may occasionally apply to those who actually do the work, when those people have an imagination about how they work (or would have worked) which is not how it is really done – a fantasy. We may genuinely think and declare that we do work one way but actually do it another way. Ignorance and Fantasy inhabits a different zone to The Messy Reality, and if the two ever come into contact there can be surprise, bewilderment, even outrage…and more mess.

Why does it exist?

Most of those who exhibit the Ignorance and Fantasy archetype are far removed from work-as-done and those who do the work, or lack understanding of it, or both. Lack of knowledge and understanding is therefore a primary reason for the existence of the archetype. A group that knows little about the real work may reinforce shared beliefs about the work: false consensus.

There may be several perceived benefits to Ignorance and Fantasy. A false narrative about the work may reduce the perceived need to really understand The Messy Reality, which is often difficult to grasp. It can also reduce uncertainty, confusion and anxiety about how we think others work. There can be significant cost savings, because Ignorance and Fantasy can negate the perceived need to spend resources to properly understand or improve work-as-done (including the system conditions under which work is done). For journalists and the public, Ignorance and Fantasy offers an easy-to-understand narrative which helps to reduce uncertainty and gives simple explanations for events while masking context, complexity and causation (e.g., a ‘human error‘ or hero/villain narrative).

Shadow side

As mentioned above, Ignorance and Fantasy can be harmless. Most people do not need to know much, or even anything at all, about various types of work-as-done. We may not, however, want to know about the details of work-as-done (when we really ought to know) in light of the consequences of this knowledge for us. As the Irish-British novelist and philosopher Iris Murdoch said: “We live in a fantasy world. A world of illusion. The great task is to find reality. But given the state of the world, is it wise?”

But Ignorance and Fantasy, whether through simple lack of knowledge or not wanting to know, can also be extremely harmful, and lead to problematic decisions or inaction. At a management or regulatory level, this may concern, for instance, staffing, training and equipment, constraints such as rules and regulations, or goals such as performance targets. There can be problems of risk control, accountability and liability. Various means of organisational monitoring, assessment and control – including risk assessments and resulting risk controls – may be relied upon yet rendered impotent, essentially Defunct. At a journalistic level, Ignorance and Fantasy may feed simplistic or inaccurate narratives. Among citizens, it may affect purchasing, shareholding and service-user decisions, and other civic participation.

Examples (Healthcare)


The WHO Surgical Safety Checklist was introduced into the National Health Service following the release of Patient Safety Alert Release 0861 from the National Patient Safety Agency on 29 January 2009. Organisations were expected to implement the recommendations by February 2010, including that ‘the checklist is completed for every patient undergoing a surgical procedure (including local anaesthesia)’. All organisations have implemented this Patient Safety Alert and the WHO Surgical Safety Checklist is an integral part of the process for every patient undergoing a surgical procedure. Whilst the checklist appears to be used for every patient, there is clear evidence of variability in how the checklist is used, both within an organisation and between organisations. Within an organisation, this variability can occur between teams, with differences in the assumed value of using the checklist, and within a team, between individuals or professional groups. Its value can degrade to a token compliance process to ‘tick the box’. The assumption within an organisation at ‘the blunt end’ is that it is done for every patient.

Alastair Williamson, Consultant Anaesthetist, @TIVA_doc


Senior management often believe that all healthcare staff have received basic or intermediate life support training, as these staff work in the acute setting and would, of course, have received this training. In reality, life support competence is merely recommended and not mandated by bodies such as the Resuscitation Council (UK). This means that competence in life support is dependent on the number of resuscitation officers, whether staff have been released from work to go to training, etc.

Anonymous, Anaesthetist.


I think the simplest example of this is hand hygiene. Work-as-imagined (and indeed as-prescribed) in this situation is that all healthcare staff follow the WHO Five Moments for hand hygiene. Multiple audits do of course reveal that our compliance (work-as-done) with hand hygiene is abysmal (especially amongst medical staff), with compliance rates of around 30%. Work-as-disclosed in regard to hand hygiene depends on who is asking – but again generally does not reflect work-as-done. Our patients, however, are mostly ignorant of our very poor levels of compliance in this regard.

Craig McIlhenny, Consultant Urological Surgeon, @CMcIlhenny


In 2005 my wife was admitted to hospital for a routine elective procedure. It took just over 20 minutes for people and a system that didn’t do human factors to leave my wife brain dead. It would be another 13 days before she really was dead. As clinicians the world over have reviewed my late wife’s case, in a quiet break room perhaps, they have all, with very few exceptions, stated clearly: “I wouldn’t have done what they did”. Yet place those same people in a simulated scenario with the same real-world disorder, which deteriorates into the same challenging moment, and most actually do. This gap illustrates the difference between human performance as imagined and human performance in the real world. (Adapted from the Foreword of Human Factors and Ergonomics in Practice [CRC Press].)

Martin Bromiley OBE, Pilot and Chair of Clinical Human Factors Group, @MartinBromiley


There are high levels of burnout. A target-driven culture is exacerbating this problem. A typical example was when the government seemingly became convinced by poor-quality data which suggested that dementia was under-diagnosed. So it decided to offer GPs £55 per new diagnosis of dementia. Targets were set for screening to take place – despite the UK National Screening Committee having said for years that screening for dementia was ineffective, causing misdiagnosis. And when better data on how many people had dementia was published – which revised the figures down – it was clear that the targets GPs were told to meet were highly error-prone. The cash carrot was accompanied by a beating stick, with the results – naming and shaming supposedly poorly diagnosing practices – published online. Setting doctors harmful tasks, leading them almost to “process” patients, fails to respect patient or professional dignity, let alone the principle of “do no harm”. [Extract from article The answer to the NHS crisis is treating its staff better, New Statesman]

Margaret McCartney, General Practitioner, @mgtmccartney


This archetype is at the heart of the clinician–manager divide, to the extent that it exists. (I understand that many clinicians get on well with many managers. And many wear both hats.) Senior managers (blunt end) may be ignorant of what clinicians (sharp end) do. They may have a fantastical view informed by preconceptions, unconscious bias, the views of intermediaries, etc. The genesis of such a view may be consciously or unconsciously purposeful. Humankind cannot bear too much reality. It would be unfair not to say that, in another sense, clinicians (blunt end) may be just as ignorant of what senior managers do at their own equally sharp end. In the interest of the primary objective of the activities of both groups – high-quality, safe patient care – a sympathetic mutual understanding is essential. No to ignorance. No to fantasy.

On the cusp of a major hospital change programme in 2009, I found myself at the centre of a situation that, in the interest of patients, required total cooperation between clinicians and managers. Unfortunately the way it was handled resulted in all-out conflict.

On the one hand we had the board, represented by the CEO. On the other, the consultant body, led by me. I had until recently been clinical director and understood the department, its workings and its history better than anyone. In between the two we had the three service managers: a midwife, an obstetrician and a recently arrived non-clinician who had not managed a clinical department before.

The CEO introduced a major change programme under the slogan “More for Less”. After this there was no direct contact with the consultants. The managers were her main conduit for the top-down communication. The consultants were unable to stand fully together to either co-operate with positive changes or challenge initiatives which would jeopardise care. You may sense Reason’s cheese slices sliding into position here.

An RCPCH review later criticised senior management for the failed change programme being all about cuts and not service improvement, and for naively thinking that the soon-to-be-commissioned PFI hospital would resolve deep relational issues (people are more important than buildings!). The middle managers were criticised for their aggressive managerial style. All nursing staff, for example, were put on notice of possible redundancy in a circular, without any face-to-face meeting. The wiser ones quickly jumped ship to adjacent Trusts. Two managers were accused by nursing staff of bullying – shouting, swearing, threatening job security, etc. The consultants were unable to speak with a common voice. On the whole the clinicians had only one real interest – seeing patients. With the odd exception they were very good at that.

At this time I had become familiar with the ideas of Gerry Robinson, a management “guru” who had achieved a certain media profile. One of his central ideas on NHS management was expressed thus:

“I understand how this culture of multiple managers develops. I think Chief Executives get to a point where it is easier to manage other managers than it is to deal with medical and nursing staff, especially consultants, who can be resistant to being told what to do by those with no medical background. Instead, Chief Executives surround themselves with a safe set of managers who tell them what they want to hear, and perhaps they look to hire more – for business development or finance or new initiatives. Increasingly, the man or woman at the top of the tree is distanced from the reality of leading doctors, nurses and other staff, and delivering care to patients.”

I still believe there is a lot of truth in this. It is an arrangement that strengthens hierarchy and pits different groups against each other. (The remedy is fairly obvious, by the way.) In our case we became locked in a triangle of mistrust. To different extents, we all became the prisoners of our own fantastical views of each other, with little or no desire to understand the other’s perspective. This fed the conditions that militate against co-operative working for high-quality and safe patient care.

Older and wiser now, I have at least come to understand, in terms of organisational psychology, why many of the actors in this tragedy behaved as they did. Where understanding falls short of a full explanation, only agnosticism serves any purpose. A benevolent agnosticism.

I have one piece of evidence for the ignorant and fantastical view the CEO developed of me. The denouement was my own dismissal for, amongst other things, insubordination. Any CEO who views a senior consultant who has led his department for many years as a subordinate (as in some kind of military hierarchy) can only do so out of ignorance. It is essential that clinical/managerial teams are coalitions of equals who come to understand and respect each other. Only insecure leadership could believe otherwise.

In fairness to anyone I have criticised here, there is nothing personal intended. In any case, although I have put a fresh gloss on this, the story has now been in the public domain for some years without those individuals making public comment.

David Drew, Consultant Paediatrician in a former life, @NHSwhistleblowr



The Archetypes of Human Work: 3. Taboo


This is the third in a series of posts on The Archetypes of Human Work, which are based on the interactions or relationships between The Varieties of Human Work. For an introduction, see here.

The seven archetypes are:

  1. The Messy Reality
  2. Congruence
  3. Taboo (this archetype)
  4. Ignorance and Fantasy
  5. Projection
  6. P.R. and Subterfuge
  7. Defunct

Each archetype includes a number of examples (currently healthcare-related). If you have further examples – from any industry – please provide an example as a comment or get in touch. More examples will be added over time.

Archetype 3: Taboo


Composition: work-as-done but not as-disclosed, nor usually as-prescribed, nor usually as-imagined.

Short description: This is activity that people don’t want to talk about outside of one or more groups. It is often not in accordance with official policy, procedures, etc., or there is no relevant policy or procedure, or, if it is described in procedures, others would find the activity unacceptable. As such, the activity is often not widely known outside of specific groups. The main defining feature is that it is not openly discussed.

What is it? 

The Taboo archetype represents activity governed by social norms, but which is kept hidden, deliberately not disclosed outside of a defined group, usually for reasons associated with fear. The activity is often informal and not prescribed, but in some cases some prescription may exist but not be widely known. The activity will usually not be known outside of specific groups, though there may well be suspicion among those outside these groups, and even this suspicion is not widely disclosed. The distinguishing feature of Taboo is that disclosure of the activity is deliberately restricted, more so than will usually be the case with The Messy Reality, which is quite ordinary.

Those familiar with the archetype are those who do the work, and those who sanction the practices (explicitly or implicitly), but it may concern work in any part of an organisation, from front line to senior management. The Taboo archetype may exist in partnership with P.R. and Subterfuge, which may be used to throw out-group members off the scent of Taboo.

Why does it exist? 

At the heart of Taboo are one or more conflicts between goals, needs or values – concerning cost, financial gain, efficiency, productivity, capacity, safety, security, satisfaction, comfort, sustainability, power, etc. – and associated trade-offs and dilemmas. These conflicts may exist within and between groups.

The practices (work-as-done) that are pertinent to Taboo will usually be contrary to a prevailing norm (social, procedural, legal, moral or ethical) or expectation, such that if the activity were widely known, action might be taken that would be detrimental to the continuation of the activity. Hence, disclosure could be damaging to the goals, needs or values of the in-group.

Taboo may simply concern basic human needs, such as the need for rest or sleep, which are not catered for in the design or prescription of work. It is not unusual for sleep to be forbidden on night shifts, and yet arrangements are made among staff to ensure that they get some sleep. In some cases, the practices might involve personal gain (e.g., remuneration, time off, power or prestige), perhaps associated with behaviour that might be seen as unfair or unethical, or that might trigger outrage if aired more widely. Taboo may also concern group-level needs (e.g., the need for survival or influence of an occupation). Often, the reasons for Taboo appear personal but are actually systemic, for instance involving perverse incentives, inadequate organisational processes, poor resources and conditions, inappropriate constraints, and goal conflicts. For instance, unhealthy and unsafe levels of overtime may offer financial benefits to individuals (pay) and organisations (fewer staff required), and thus may form part of a Taboo archetype for both staff and management.

In many instances, there will be an efficiency-thoroughness trade-off (i.e., an emphasis on efficiency over thoroughness) or an acute-chronic trade-off in operation. Increases in demand and pressure, in an environment of inadequate resources, will tend to result in an emphasis on efficiency and short-term goals, which will tend to breed practices which cannot be widely disclosed.

The Taboo archetype can, however, in conjunction with P.R. and Subterfuge, offer groups protection from unhelpful or detrimental outside influence based on Ignorance and Fantasy of complex issues associated with work-as-done (e.g., safety margins or buffers, or the need for resources). This is a complex issue that is difficult to understand without knowledge of the work.

Shadow side

What people can and can’t do and talk about openly sheds light on the shared assumptions, beliefs and values that underlie a group’s culture. Unsustainable, unethical or unacceptably risky practices can remain hidden, leading to ever wider gaps between work-as-imagined and work-as-done, and potentially a drift into failure. Those who break the taboo (often referred to as ‘whistleblowers’) and disclose work-as-done may be cast out from the group, organisation or profession.

Examples (Healthcare)


The case of Dr Raj Mattu provides an example of Taboo. He was suspended by University Hospitals Coventry and Warwickshire NHS Trust in February 2002 on allegations of bullying, 5 months after he spoke to the BBC about the death of a patient in an over-crowded bay at Walsgrave Hospital, Coventry. A 5th bed was put into a 4-bedded bay (so-called ‘5 in 4’) in order that the hospital could never be deemed full. I worked as a Neurology SpR at the Walsgrave between January and December 2000 and it was the most stressful period of my career. I too was appalled at the ‘5 in 4’ policy. Dr Mattu has faced years of mistreatment and ‘detriment’, and the effective end of his career, at a cost for legal expenses alone of around £6 million. His successful employment tribunal was one of the most expensive in NHS history. However, the most disturbing aspect of the Mattu case is that those responsible for the ‘5 in 4’ policy have faced no serious public scrutiny. How can we have any confidence that staff concerns such as Dr Mattu’s will be dealt with any differently the next time? The treatment of whistleblowers in the NHS is a reflection of the Taboo archetype: how whistleblowers are treated is often not openly discussed, nor prescribed, and hard to imagine. (Based on a letter to The BMJ: http://www.bmj.com/content/348/bmj.g2881/rapid-responses.)

Dr David Nicholl, Consultant Neurologist, @djnicholl


When preparing intravenous injections for a patient, guidelines (e.g., NMC medicines management guidelines) and procedures require that the injection must be prepared immediately before it is due to be given, and not prepared in advance of this time. However, under current service pressures, including staff shortages and high acuity, doses may be prepared in advance to save time, or, if prepared on time and then for some reason not given, may be stored to one side for later use, instead of being disposed of and re-made at a later time.

Anonymous, Pharmacist.


Although most people would like to believe that admission to critical care does not depend on the bed status of the unit, this is not the case. If there are many critical care beds available, patients are likely to be admitted who would not be admitted if there was only one bed available.

Anonymous, Anaesthetist.


Taboo describes the attitude of some healthcare workers to uniform policies. For example, hospitals have a “bare below the elbow” uniform policy, where people can wear only a plain wedding band on the hands and forearms. Some people choose to ignore this and wear a watch, or a ring with stones. In theatre, this is most often ignored when people wear theatre gowns, as it is often cold in theatre, and no alternative is provided.

Anonymous.


Nursing staff on night shifts take turns to have a 2-hour sleep if it is quiet. If it is busy then, obviously, it is all hands to the pump. This is not described in any job description but is tacitly known about and approved, to ensure that staff can function when required.

Anonymous. 


With acute prescribing in GP practices, some medicines are kept separate from the repeat prescribing – generally in quantities of no more than a month’s supply – with the general idea that these are meds that require a regular review by the GP to determine the appropriateness of ongoing supply. Often, these are dealt with as “special requests”; the scripts are not run off by the admin staff with the regular repeat meds, but are passed to the GP (or they are run off by the admin staff but stored separately for the GP to review). The idea is that these meds have greater scrutiny and are not supplied in larger quantities, so there is a sort of safety net around them becoming inappropriate long-term medicines. The reality is that these are often not given the greater scrutiny intended, and we see antidepressants and analgesics (to name a couple) issued month after month with no proper patient review.

Anonymous, Pharmacist.

