The Wrong Kidney

The post discusses a short fictional film, ‘The Wrong Kidney’, created to explore the complexities of accidents. The film depicts a fictional scenario where a patient undergoes ‘wrong-site surgery’ resulting in removal of the wrong kidney. It serves as a free educational tool, offering insights into how judgements are formed in complex situations. The film is accompanied by fictional statements from various stakeholders, presenting diverse and sometimes conflicting perspectives. The post suggests various ways to use the film and statements for educational purposes inside or outside of healthcare, with practical tips on facilitating these discussions to enhance understanding of human and systemic factors in accidents and the aftermath.

When things go seriously wrong and people are harmed, there are often mixed views on what happened, how, and why, including culpability and responsibility. How we make sense of accidents and their human contributions has many influences: informational, temporal, personal, social, cultural, historical, organisational, regulatory, legal, and so on. But we can never fully understand work-as-done in a complex situation. Since work-as-done and the associated context are unknowable, we construct an understanding via proxies. These proxies and the multiple contexts of sensemaking interact to inform our work-as-imagined, combined with our imagination of the context of work-as-done. It is on this that we base our judgement, evaluation or appraisal of work – work-as-judged – which has several characteristics.

For all of us, but especially those in certain professions, it is important to learn about how we make judgements, the nature of these judgements, and the effects of them. We can learn about this from books and articles of course, but a more immersive experience results in a different type of learning.

The Film

To help understand our work-as-judged, a short film was made in 2023. The film concerned a fictional serious medical accident or ‘Never Event’ where the wrong kidney was removed from a patient. The film is a mix of narration, imagery, filming in a hospital theatre simulation facility, and an interview.

The filming was done at the Scottish Centre for Simulation and Clinical Human Factors in summer 2023. I coproduced the film with Sebastian Daeunert (safety specialist), Dr Michael Moneypenny (consultant anaesthetist), Mr Craig McIllhenny (consultant urological surgeon) and several others (see Credits below). The idea for the film emerged from a training session that I co-facilitated with Mr Craig McIllhenny, Dr Michael Moneypenny, and Dr Shelly Jeffcott (Human Factors Specialist) in 2017, where we introduced a description of the accident and some statements (see below), written by Mr McIllhenny. Trainee urological surgeons taking part in a systems thinking training day were asked to choose five statements as a basis for discussion about the accident. The session powerfully showed how we can be influenced by opinion, hearsay, and others’ judgements about an event.

So, six years later, the concept behind the film production was to explore judgements and evaluations about the accident and those involved using film instead of a description. The aim was to produce a training simulation video to recreate a medical accident scenario for discussion. The emphasis during recording was on high learning value rather than high production value. (All involved were amateurs when it came to short film production, acting on a volunteer basis.) The whole film was recorded on an SLR camera and smartphones during a short slot in the simulator, and voice-over audio and some additional filming were done afterwards. (Subsequent production by my coproducer Sebastian made the most of the video and audio material that I provided.)

The film is supplemented by 19 statements by various stakeholders, both involved and not involved on the day. There is no statement from the patient or his family. This is the 20th ‘missing statement’, reflecting a lack of patient and family involvement in this (fictional) hospital.

The Accident

The film The Wrong Kidney concerns a fictional accident with features similar to real cases. It shows how several factors can combine to produce a severe medical accident, or ‘never event’. The film features a very serious type of ‘wrong site surgery’ known as wrong site nephrectomy – removal of the wrong kidney. This is defined by NHS Improvement as “an invasive procedure performed on the wrong patient or at the wrong site (eg wrong knee, eye, limb). The incident is detected at any time after the start of the procedure.” This is an extremely rare event, but it has happened many times, as news reports show. There are usually various factors that combine, but each incident is unique.

The fictional hospital setting is ‘St Just’, a modern, well-equipped district general hospital. The patient is Craig Ferguson, a 48-year-old man with a left renal tumour requiring a laparoscopic nephrectomy. On Friday 7th of April, Mr Ferguson was booked in for surgery. The surgery had been planned for a long time, and the date of surgery was just within the target waiting time. Targets have been a critical issue for the government post-pandemic, and waiting times are critical for patients, especially those with deteriorating conditions. 

Background on Wrong Site Nephrectomy and Never Events

In its 2018 Never Events Policy and Framework document, the National Health Service describes Never Events as “Serious Incidents that are wholly preventable because guidance or safety recommendations that provide strong systemic protective barriers are available at a national level and should have been implemented by all healthcare providers.” The Never Event concept was introduced into the NHS in 2008. 

The Never Events policy and framework suggests that “Never Events may highlight potential weaknesses in how an organisation manages fundamental safety processes. Never Events are different from other serious incidents as the overriding principle of having the Never Events list is that even a single Never Event acts as a red flag that an organisation’s systems for implementing existing safety advice/alerts may not be robust.” When a Never Event occurs in a health care facility such as a hospital, it must be reported.

In recent years, the NHS has emphasised that the concept of Never Events is not about apportioning blame to organisations when these incidents occur. The original concept, however, included an option for commissioners to impose financial sanctions when Never Events were reported. This option was subsequently removed. 

The foreword to the framework states “We heard that allowing commissioners to impose financial sanctions following Never Events reinforced the perception of a ‘blame culture’. Our removal of financial sanctions should not be interpreted as a weakening of effort to prevent Never Events. It is about emphasising the importance of learning from their occurrence, not blaming.” 

The Investigation Statements

Those involved on the day of the fictional accident, and several others, have given ‘statements’ (see below). The statements are fictional and were initially written by Mr Craig McIllhenny, and subsequently adapted and edited by me and Dr Michael Moneypenny. The informal verbal statements are from various stakeholders, from close working colleagues to the Chief Executive at St Just.

The statements differ in content and tone. They contain conflicting or incomplete information, reflecting work-as-disclosed and other varieties, proxies and archetypes of human work and the associated contexts. There are clear differences in evaluations (work-as-judged). Some statements are broadly positive toward the surgeon Mr Black and his character or work performance, some are broadly negative, and others are mixed. More generally, the statements give confusing messages, which may reflect personal perceptions, opinions and suppositions, sometimes reflecting relationships between the stakeholders and Mr Black.

The statements are classified red, amber or green according to the overall favourability of perception of Mr Paul Black:

  • (G) 🟢 – Positive
  • (A) 🟠 – Mixed
  • (R) 🔴 – Negative

Involved on the day

Others

Using the Film and the Statements: Some Practical Tips

The film and statements may be used freely for education purposes subject to the Creative Commons license. The work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND 4.0).

The film was first used in a conference on Just Culture and the Judiciary in October 2023. The film was screened to a large group of safety and human factors practitioners, pilots, controllers, and members of the judiciary. Tom Lintner (an experienced safety investigator, controller, pilot and manager) masterfully facilitated the session, which lasted around 60 minutes. Tom asked pre-prepared questions, which were answered by participants immediately after the film, and then again after he read batches of selected statements.

I used the film in a Masterclass on the Safety Culture Discussion Cards for the Chartered Institute of Ergonomics and Human Factors in April 2024. The group was mixed, representing many industries and various professions. In this exercise, the cards were used with sticky notes to reflect on the accident in terms of influences (see ‘Talking about just culture and safety culture’ below), again with selected statements read out in batches at different stages of discussion, from negative to positive. These applications have shown that the film and associated resources can be understood and used by many people in different sectors (the film can be used with YouTube subtitles and translation if required).

There are various options for using the materials. After watching the film, people may be asked a variety of questions about the accident, concerning the people, the operation, the context and the outcome, and the needs and responsibilities arising. The statements are optional and don’t have to be used at all; three are already integrated into the film. If you wish to use the statements, they are probably best read out to participants, or read by them, after viewing the film. Of course, it is also probably best if participants have not seen the film or statements before. A good approach is to give (selected or all) statements in batches, e.g.:

  • negative, then mixed, then positive
  • close working colleagues, then distant colleagues, then managers
  • selections – e.g. “Choose 5 based on job title only”.

In terms of discussion, this may take place as follows, for example:

  • immediately after seeing the film
  • after hearing or reading batch 1 statements
  • after hearing or reading batch 2 statements
  • after hearing or reading batch 3 statements.

Many different approaches may be used for discussion, analysis and synthesis. Some examples are below, with several associated Humanistic Systems albums, EPs and posts, along with some sample questions. Of course, other lenses are available, as are many thousands of articles elsewhere and many other questions. The below is just to give a few ideas.

Reflecting on ‘human error’ (see Album 1 & Album 12)

  • Critically evaluate the human-error-as-cause explanation in the context of the accident. How well does this fit and what are the alternatives?
  • Consider how errors propagated through the system. How could so many things go wrong?
  • Critically evaluate the concept of ‘Never Events’. What are some of the intended and unintended consequences?

Exploring human work: varieties, archetypes and proxies (see Album 2, Album 4, Album 9 & Album 12)

Talking about just culture and safety culture (see EP5, EP8, Album 7, Album 12)

Using the lenses of Safety-I and Safety-II (see Album 6)

  • Consider the scenario and accident through the lenses and principles of Safety-I and Safety-II. What are the differences and similarities between the perspectives?
  • Consider three zones of performance. How do these relate to the event in terms of what happened and what could have happened in different circumstances?
  • Reflect on the situation in terms of everyday or normal work. What conditions and aspects of work were present at some time prior to the accident?

Systems thinking and diagramming methods (see Album 5, Album 8 & Album 12)

  • Consider the influence of targets. How did the risk of the patient breaching the target waiting time seem to influence decision making and the situation as a whole?
  • Reflect on the EUROCONTROL ten principles in the context of the accident. How do they apply separately and together in terms of influence?
  • Consider the event from different perspectives and professions using person and process empathy: judiciary, regulator, management, administrative staff, clinical staff, family, patient, etc. How do the different points of view affect your thinking?
  • Construct a system map (see video). What new insights does it bring about system structure?
  • Construct an actor map (see video). Who influences or is affected by the event?
  • Construct an influence diagram (see video) or AcciMap. How does the method help conversation and understanding?

Exploring complex work (see Album 10)

  • Think about the goal conflicts and trade-offs that seem to be at play. How did they emerge, and were they specific to the event or people involved?
  • Discuss the checklist and how it was used. What were the issues with the use of the checklist and how might these issues be relevant more generally?
  • Consider the surprises that the various stakeholders were faced with. How did they emerge and how were they handled?

Thinking about explanations and interventions for system safety (see EP2)

  • Reflect on the accident in terms of the friends and foes of explanation. How might the explanation look different when viewed through these concepts?
  • Reflect on the accident in terms of the friends and foes of intervention. How might past interventions have influenced the event?
  • Reflect on the accident in terms of the friends and foes of intervention. What might be the influence of future interventions that could be imagined now?

Thinking about the four kinds of Human Factors (see EP3)

  • Consider the four kinds of Human Factors. How does each influence understandings of human and system performance?
  • Think about intervention possibilities through the lenses of the four kinds of Human Factors. What interventions might make sense from each lens?
  • Think about the context that you are familiar with. What kind of Human Factors is likely to be the predominant lens for examining such an event?

The co-producers of this work and I hope that the film and related resources offer a helpful way to discuss and reflect on the complexities of understanding accidents and our responses. We hope you find it as useful and enlightening as we intended, and as we have found in our own screenings.

Credits

Thank you to everyone who contributed to this project. Your efforts have created something really valuable.

  • Scriptwriting – Dr Steven Shorrock
  • Video production – Sebastian Daeunert
  • Filming – Dr Steven Shorrock
  • Statements – Mr Craig McIllhenny, Dr Michael Moneypenny, Dr Steven Shorrock
  • Narrator and interviewer – Dr Steven Shorrock
  • Interviewee – Dr Michael Moneypenny (Consultant Anaesthetist)
  • Surgeon (acted) – Jamie Dickson
  • Anaesthetist (acted) – Dr Michael Moneypenny
  • Scrub nurse (acted) – Catherine Moneypenny
  • Theatre nurse (acted) – Andrew Bain

How to cite (APA)

Shorrock, S. (2024, May 24). The wrong kidney. Humanistic Systems. https://humanisticsystems.com/2024/05/24/the-wrong-kidney/

Creative Commons Licence

The Wrong Kidney © 2024 by Steven Shorrock is licensed under CC BY-NC-ND 4.0

This license requires that reusers give credit to the creator. It allows reusers to copy and redistribute the material in any medium or format.


This blog is written by Dr Steven Shorrock. I am an interdisciplinary humanistic, systems and design practitioner interested in work and life from multiple perspectives. My main interest is human functioning and system behaviour, in work and life generally. I am a Chartered Ergonomist and Human Factors Specialist with the CIEHF and a Chartered Psychologist with the British Psychological Society. I work as a human factors practitioner and psychologist in safety critical industries. I am also an Adjunct Associate Professor at University of the Sunshine Coast, Centre for Human Factors & Sociotechnical Systems. I blog in a personal capacity. Views expressed here are mine and not those of any affiliated organisation, unless stated otherwise. LinkedIn: www.linkedin.com/in/steveshorrock/ Email: contact[at]humanisticsystems[dot]com
