A legendary rivalry: one mountain and two climbers seeking to be the best. We join them at basecamp as they prepare for the challenges of the ascent. Invited into separate tents to join just one of the two climbers, audiences experience the subjective and different sides of this rivalry, sharing only one side of the story. As time passes, the voices travel through the camp and the line between truth and lies, fact and fiction, begins to blur. Award-winning Fever Dream Theatre return after their 2016 sell-out hit Wrecked. ‘Stays with you long after you’ve left’ (NME).
(See Human Factors at The Fringe for an introduction to this series of posts.)
As you meet the two climbers at the venue – ‘BaseCamp’ – you are taken into one of two tents. The climbers are raising money for their next climb, and you will hear about one of their climbing lives.
You are taken into a canvas tent and the climber starts to talk about climbing – her passion. You noticed on being introduced to the two climbers initially that there was tension between the two, and as your host continues her story, the knotty relationship between her and her friend in the other tent surfaces. Your host seems honest and credible. In the other tent, people are hearing from the other climber. You don’t know what she’s saying, and perhaps you never will. You will only hear one side of the story. Do you get the feeling that you’re not hearing the whole story, that you are missing part of the picture? Are you curious to find out? Or are you content with the version of events that you have heard?
In many work situations, we rely on the accounts that people provide. This is what I call Work-as-Disclosed.
“This is what we say or write about work, and how we talk or write about it. It may be simply how we explain the nitty-gritty or the detail of work, or espouse or promote a particular view or impression of work (as it is or should be) in official statements, etc. Work-as-disclosed is typically based on a partial version of one or more of the other varieties of human work: Work-as-imagined, work-as-prescribed, and work-as-done. But the message (i.e., what is said/written, how it is said/written, when it is said/written, where it is said/written, and who says/writes it) is tailored to the purpose or objective of the message (why it is said/written), and, more or less deliberately, to what is thought to be palatable, expected and understandable to the audience. It is often based on what we want and are prepared to say in light of what is expected and imagined consequences.” From The Varieties of Human Work
BaseCamp provides two versions of Work-as-Disclosed. To some extent, each may contain P.R. and Subterfuge.
“This is what people say happens or has happened, when this does not reflect the reality of what happens or happened. What is disclosed will often relate to what ‘should’ happen according to policies, procedures, standards, guidelines, or expected norms, or else will shift blame for problems elsewhere. What is disclosed may be based on deliberate deceit (by commission or omission), or on Ignorance and Fantasy, or something in between… The focus of P.R. and Subterfuge is therefore on disclosure, to influence what others think.” From The Archetypes of Human Work: 6. P.R. and Subterfuge
Each version of events seems credible, and as you listen to the story, for nearly an hour, you develop a felt rapport with the storyteller. How much do you want to hear a second account? And if you do hear another account, how will you respond to conflicts with the account that you have heard, and trusted?
In these sorts of situations, at home, in organisations, in courtrooms, we often hear and accept the stories that we want to hear. Sometimes we choose not to hear the stories that we don’t want to hear. We may also choose the sequence of the stories that we hear, or else this might be forced upon us by others or by circumstance. In safety investigations, formal inquiries, court cases and disputes of all kinds, who you choose to (or are able to) listen to, and the order in which you listen, will affect the story that you create about what happened. By hearing only from clinician(s), but not the patient and family, for example, your story will lack the perspectives and details that are required for a more thorough understanding. And the order in which you listen to people, even when you listen to many, will affect what you hear in subsequent accounts because it will affect your questions, your mental set and perceptual filter. This is an ‘anchoring’ heuristic that has been researched extensively in the context of judgement. Mostly, people think about anchoring in the context of quantitative judgement:
“In many situations, people make estimates by starting from an initial value that is adjusted to yield the final answer. The initial value, or starting point, may be suggested by the formulation of the problem, or it may be the result of a partial computation. In either case, adjustments are typically insufficient (Slovic & Lichtenstein, 1971). That is, different starting points yield different estimates, which are biased toward the initial values. We call this phenomenon anchoring.” Tversky & Kahneman (1974)
Anchoring can also affect our understanding of stories, by anchoring our expectations, questions, and desire for certainty.
There may indeed be misunderstandings between different parties to an event, because each has partial knowledge and information, because each has different goals and expectations, and because each sees things from different perspectives and resolutions. This is the case with BaseCamp. Not only are there inconsistencies between the accounts, there is a crucial unspoken aspect to each of their thinking about the relationship and the factual and counterfactual aspects of a critical event. They don’t know because it is a Taboo, and you will only know if you hear both stories, or if you and another listener can piece together the aspects of the stories between you.
In the EUROCONTROL ‘Systems Thinking for Safety: Ten Principles‘ White Paper, the term field experts was used to describe people who possess expertise relative to their own work-as-done.
“The perspectives of field experts need to be synthesised via the closer integration of relevant system actors, system designers, system influencers and system decision makers, depending on the purpose. The demands of work and various barriers (organisational, physical, social, personal) can seem to prevent such integration. But to understand work-as-done and to improve the system, it is necessary to break traditional boundaries.” From: Systems Thinking for Safety/Principle 1. Field Expert Involvement
There are many influences on who we speak to, how, for how long, and when, for example:
- Desire for certainty – by introducing new accounts, we may well introduce uncertainty, which may bring us anxiety.
- Prejudice and confirmation bias – we may have a predetermined goal to achieve, or a preconceived idea about what happened and who is responsible for an outcome, and choose (more or less consciously) who and how we speak to people in order to confirm our hypothesis.
- Time – listening to different accounts takes time, which is always limited. Even when there is time, we may perceive it as better spent on something else (e.g., analysis, reporting, action). Sometimes, system constraints such as regulations can force the issue (see the example here).
- Theory of causation – we may perceive that those closest to an event (e.g., an air traffic controller) are ‘causal’ to it, and therefore important to hear, while those less close to an event (e.g., a procedure writer) are merely ‘contributory’ to it (and therefore less important to hear). The second group are rarely interviewed, and so we tend to hear the first story, and not the second story (see talk here).
- Expertise – we may simply lack the competency to investigate an issue appropriately.
Broadly these and other influences relate to barriers to new thinking about systems and safety, outlined here.
Multiple perspectives are not a source of weakness. Diversity is a source of resilience, even – or especially – when accounts do not agree. This is counterintuitive for those who wish to have a straightforward, perhaps mechanistic, account.
This advice might help (adapted from Systems Thinking for Safety Ten Principles White Paper and Learning Cards):
- Listen to people’s stories. Consider how people can best tell their stories from the point of view of how they experienced events at the time. Try to understand the person’s situation and world from their point of view, both in terms of the context and their moment-to-moment experience.
- Understand their local rationalities. Be curious about how things make sense to people at the time. Listen to people’s individual goals, plans and expectations, in the context of the flow of work and the system as a whole. Focus on their ‘knowledge at the time’, not your knowledge now. Understand the various activities and focus of attention, at a particular moment and in the general time-frame.
- Seek multiple perspectives. Don’t settle for the first explanation; seek alternative perspectives. Discuss different perceptions of events, situations, problems and opportunities, from different people and perspectives, including those who you might think are not directly involved. Consider the implications of these differing views. One way to do this is to adopt a group approach to debriefing, as explained in this Etsy Debriefing Facilitation Guide on leading groups to learn from accidents, by John Allspaw, Morgan Evans, and Daniel Schauenburg.
I will leave you with this – an advertisement from my childhood, which remains my favourite of all time. I talk about it here.
“An event seen from one point of view gives one impression. Seen from another point of view, it gives quite a different impression. It’s only when you get the whole picture that you fully understand what’s going on.”
You may well have to accept that you can never fully understand what went on. But you can get past the basecamp of understanding.