Human Factors at the Fringe: BaseCamp

A legendary rivalry: one mountain and two climbers seeking to be the best. We join them at basecamp as they prepare for the challenges of the ascent. Invited into separate tents to join just one of the two climbers, audiences experience the subjective and different sides of this rivalry, sharing only one side of the story. As time passes, the voices travel through the camp and the line between truth and lies, fact and fiction, begins to blur. Award-winning Fever Dream Theatre return after their 2016 sell-out hit Wrecked. ‘Stays with you long after you’ve left’ (NME).

Basecamp, Fever Dream Theatre, C South  (Venue 58), Edinburgh, 4-13 & 15-27 August 2018

(See Human Factors at The Fringe for an introduction to this series of posts.)

As you meet the two climbers at the venue – ‘BaseCamp’ – you are taken into one of two tents. The climbers are raising money for their next climb, and you will hear about one of their climbing lives.

You are taken into a canvas tent and the climber starts to talk about climbing – her passion. You noticed on being introduced to the two climbers initially that there was tension between the two, and as your host continues her story, the knotty relationship between her and her friend in the other tent surfaces. Your host seems honest and credible. In the other tent, people are hearing from the other climber. You don’t know what she’s saying, and perhaps you never will. You will only hear one side of the story. Do you get the feeling that you’re not hearing the whole story, that you are missing part of the picture? Are you curious to find out? Or are you content with the version of events that you have heard?

In many work situations, we rely on the accounts that people provide. This is what I call Work-as-Disclosed.

“This is what we say or write about work, and how we talk or write about it. It may be simply how we explain the nitty-gritty or the detail of work, or espouse or promote a particular view or impression of work (as it is or should be) in official statements, etc. Work-as-disclosed is typically based on a partial version of one or more of the other varieties of human work: Work-as-imagined, work-as-prescribed, and work-as-done. But the message (i.e., what is said/written, how it is said/written, when it is said/written, where it is said/written, and who says/writes it) is tailored to the purpose or objective of the message (why it is said/written), and, more or less deliberately, to what is thought to be palatable, expected and understandable to the audience. It is often based on what we want and are prepared to say in light of what is expected and imagined consequences.” From The Varieties of Human Work

BaseCamp provides two versions of Work-as-Disclosed. To some extent, each may contain P.R. and Subterfuge:

“This is what people say happens or has happened, when this does not reflect the reality of what happens or happened. What is disclosed will often relate to what ‘should’ happen according to policies, procedures, standards, guidelines, or expected norms, or else will shift blame for problems elsewhere. What is disclosed may be based on deliberate deceit (by commission or omission), or on Ignorance and Fantasy, or something in between… The focus of P.R. and Subterfuge is therefore on disclosure, to influence what others think.” From The Archetypes of Human Work: 6. P.R. and Subterfuge

Each version of events seems credible, and as you listen to the story for nearly an hour, you develop a felt rapport with the storyteller. How much do you want to hear a second account? And if you do hear another account, how will you respond to conflicts with the account that you have already heard, and trusted?

In these sorts of situations, at home, in organisations, in courtrooms, we often hear and accept the stories that we want to hear. Sometimes we choose not to hear the stories that we don’t want to hear. We may also choose the sequence of the stories that we hear, or else this might be forced upon us by others or by circumstance. In safety investigations, formal inquiries, court cases and disputes of all kinds, whom you choose to (or are able to) listen to, and the order in which you listen, will affect the story that you create about what happened. By hearing only from clinician(s), but not the patient and family, for example, your story will lack the perspectives and details that are required for a more thorough understanding. And the order in which you listen to people, even when you listen to many, will affect what you hear in subsequent accounts, because it will affect your questions, your mental set and your perceptual filter. This is an ‘anchoring’ heuristic, which has been researched extensively in the context of judgement. Mostly, people think about anchoring in the context of quantitative judgement:

“In many situations, people make estimates by starting from an initial value that is adjusted to yield the final answer. The initial value, or starting point, may be suggested by the formulation of the problem, or it may be the result of a partial computation. In either case, adjustments are typically insufficient (Slovic & Lichtenstein, 1971). That is, different starting points yield different estimates, which are biased toward the initial values. We call this phenomenon anchoring.” Tversky & Kahneman (1974)

Anchoring can also affect our understanding of stories, by anchoring our expectations, questions, and desire for certainty.
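Tversky and Kahneman’s anchoring-and-adjustment account can be sketched as a toy simulation. This is purely illustrative: the `adjustment` factor of 0.6, the anchors, and the noise level are assumptions for the sketch, not values from the paper.

```python
import random

def anchored_estimate(true_value, anchor, adjustment=0.6, noise=5.0):
    # Insufficient adjustment: the judge moves only part of the way
    # from the anchor towards the true value, plus some random noise.
    return anchor + adjustment * (true_value - anchor) + random.gauss(0, noise)

random.seed(1)
TRUE_VALUE = 100.0

# Two groups of judges estimate the same quantity from different anchors.
low_group = [anchored_estimate(TRUE_VALUE, anchor=40) for _ in range(1000)]
high_group = [anchored_estimate(TRUE_VALUE, anchor=160) for _ in range(1000)]

mean_low = sum(low_group) / len(low_group)
mean_high = sum(high_group) / len(high_group)

# Different starting points yield different estimates,
# each biased toward its initial value.
print(mean_low < TRUE_VALUE < mean_high)  # → True
```

The two group means end up on opposite sides of the true value, pulled towards their respective anchors, which is the signature of the effect.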

There may indeed be misunderstandings between different parties to an event, because each has partial knowledge and information, because each has different goals and expectations, and because each sees things from different perspectives and resolutions. This is the case with BaseCamp. Not only are there inconsistencies between the accounts; there is also a crucial unspoken aspect to each climber’s thinking about the relationship, and about the factual and counterfactual aspects of a critical event. Neither knows, because it is a Taboo, and you will only know if you hear both stories, or if, as two listeners, you can piece together the aspects of the stories.

In the EUROCONTROL ‘Systems Thinking for Safety: Ten Principles’ White Paper, the term field experts was used to describe people who possess expertise relative to their own work-as-done.

“The perspectives of field experts need to be synthesised via the closer integration of relevant system actors, system designers, system influencers and system decision makers, depending on the purpose. The demands of work and various barriers (organisational, physical, social, personal) can seem to prevent such integration. But to understand work-as-done and to improve the system, it is necessary to break traditional boundaries.” From: Systems Thinking for Safety/Principle 1. Field Expert Involvement

There are many influences on who we speak to, how, for how long, and when, for example:

  • Desire for certainty – by introducing new accounts, we may well introduce uncertainty, which may bring us anxiety.
  • Prejudice and confirmation bias – we may have a predetermined goal to achieve, or a preconceived idea about what happened and who is responsible for an outcome, and choose (more or less consciously) who and how we speak to people in order to confirm our hypothesis.
  • Time – listening to different accounts takes time, which is always limited. Even when there is time, we may perceive it as better spent on something else (e.g., analysis, reporting, action). Sometimes, system constraints such as regulations can force the issue (see the example here).
  • Theory of causation – we may perceive that those closest to an event (e.g., an air traffic controller) are ‘causal’ to it, and therefore important to hear, while those less close to an event (e.g., a procedure writer) are merely ‘contributory’ to it (and therefore less important to hear). The second group are rarely interviewed, and so we tend to hear the first story, and not the second story (see talk here).
  • Expertise – we may simply lack the competency to investigate an issue appropriately.

Broadly these and other influences relate to barriers to new thinking about systems and safety, outlined here.

Multiple perspectives are not a source of weakness. Diversity is a source of resilience, even – or especially – when accounts do not agree. This is counterintuitive for those who wish to have a straightforward, perhaps mechanistic, account.

This advice might help (adapted from Systems Thinking for Safety Ten Principles White Paper and Learning Cards):

  • Listen to people’s stories. Consider how people can best tell their stories from the point of view of how they experienced events at the time. Try to understand the person’s situation and world from their point of view, both in terms of the context and their moment-to-moment experience.
  • Understand their local rationalities. Be curious about how things make sense to people at the time. Listen to people’s individual goals, plans and expectations, in the context of the flow of work and the system as a whole. Focus on their ‘knowledge at the time’, not your knowledge now. Understand the various activities and focus of attention, at a particular moment and in the general time-frame.
  • Seek multiple perspectives. Don’t settle for the first explanation; seek alternative perspectives. Discuss different perceptions of events, situations, problems and opportunities, from different people and perspectives, including those who you might think are not directly involved. Consider the implications of these differential views. One way to do this is to adopt a group approach to debriefing, as explained in this Etsy Debriefing Facilitation Guide on leading groups to learn from accidents, by John Allspaw @allspaw, Morgan Evans @NeonMorgan, and Daniel Schauenburg @mrtazz.

I will leave you with this – an advertisement of my childhood, which remains my favourite of all time. I talk about it here.

“An event seen from one point of view gives one impression. Seen from another point of view, it gives quite a different impression. It’s only when you get the whole picture that you fully understand what’s going on.”

You may well have to accept that you can never fully understand what went on. But you can get past the basecamp of understanding.

See also:

Human Factors at The Fringe

Human Factors at The Fringe: The Girl in the Machine

Human Factors at The Fringe: My Eyes Went Dark

Human Factors at The Fringe: Nuclear Family

Human Factors at the Fringe: Lemons Lemons Lemons Lemons Lemons

Posted in Human Factors/Ergonomics, Safety, systems thinking

The Safety-II Dance: A Podcast by Greater Than Code

A few weeks ago, I had a chat with Jamey Hampton, Jessica Kerr and John K. Sawers of Greater Than Code. Here is the podcast that resulted, expertly produced by Mandy Moore.

In the podcast, we roamed around topics of human factors/ergonomics, system performance and human wellbeing, empathy, appreciative inquiry, asset-based community development (ABCD), and Safety-II.

All Greater Than Code podcasts are on their website and on iTunes.


Posted in Culture, Human Factors/Ergonomics, Safety

Suitably Qualified and Experienced? Five Questions to ask before buying Human Factors training or consultancy

Ergonomics (or human factors) is the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data and methods to design in order to optimize human well-being and overall system performance. (IEA, 2018)

This definition – accepted by human factors and ergonomics (HF/E) societies worldwide – emphasises that HF/E is a discipline and profession. A discipline is “a branch of knowledge, typically one studied in higher education”. A profession is “a paid occupation, especially one that involves prolonged training and a formal qualification” (Oxford dictionaries).

Practitioners of ergonomics and ergonomists contribute to the design and evaluation of tasks, jobs, products, environments and systems in order to make them compatible with the needs, abilities and limitations of people. (IEA, 2018)

This contribution tends to be made by HF/E practitioners in two ways:

  1. as an external human factors consultant/trainer
  2. as an in-house human factors specialist (a typical job description is here).

But how do we assess whether a practitioner is a ‘suitably qualified and experienced person’ (SQEP)?

This is an important question because there is so much at stake for system performance and human well-being, but it is not straightforward to answer. In this post, I provide five questions that will help. The questions are for reflection and discussion. They are not definitive. In considering these questions, the point is not necessarily to answer ‘yes’ to every question. Some will be more relevant than others, and there will be exceptions. But especially where the answer to two or more questions is ‘no’, there should be careful consideration as to why this is the case.

The emphasis of this post is not on those who fulfil specific HF/E roles in-house (e.g., HF/E in medical simulation). In such cases, internal practitioners with HF/E-related roles may well have education and experience in a specific area of HF/E, and use this in their role. But they would probably not describe themselves as ‘HF/E specialists’ (just as I have education in counselling but would not call myself a counsellor). This post does not cover these in-house practitioners, though they may wish to consider the questions and what support they might need.

Rather, the post concerns paid-for HF/E consultancy and training, and also employment as an HF/E specialist, where one has to abide by ethical professional standards in the practice of HF/E.


Marco Bellucci CC BY 2.0

1. Qualification

Do they have a recognised qualification in HF/E?

There are several academic programmes in HF/E in the UK, USA, and other countries, which you can find via the relevant Society or Association in your country. Some of these programmes will be accredited by your national HF/E Society (the Centre for Registration of European Ergonomists offers a guide to such courses in Europe).

An HF/E qualification gives reassurance that the person has undertaken an approved programme of study in HF/E, which addresses the relevant competencies (e.g., the CIEHF Professional Competency Guidance, or the Requirements for Registration of European Ergonomists in Europe). (But note that some qualifying courses are no longer offered and so may not be listed.)

Other academic programmes will not be accredited, but will offer a substantial component of HF/E as part of a mixed programme, or as a substantial part of (e.g., a major in) a programme in experimental psychology, industrial engineering, systems engineering, patient safety, occupational health and safety, etc. This is especially true in the USA, where only a small minority of the programmes listed on the Human Factors and Ergonomics Society website are accredited by the Society. Most Human Factors practitioners (HF being the dominant term used in the US) tend to have academic qualifications in psychology.

For specialist external HF/E consultancy and commercial HF/E training support, a university degree in HF/E (or closely related discipline, as listed by the HFES in the USA) will usually be necessary, and perhaps a higher postgraduate (e.g., Doctorate) degree in very specific circumstances (e.g., expert witness work).

2. Accreditation and Membership

Do they have an appropriate level of accreditation or membership of an HF/E related professional organisation?

Unlike some professions, the terms ‘human factors specialist’ and ‘ergonomist’ are not legally protected or regulated (e.g., by the Health and Care Professions Council in the UK), in contrast to titles such as Registered Occupational Psychologist, Registered Dietitian, and Registered Physiotherapist.

However, HF/E is subject to accreditation (e.g., registration, certification, and chartership) in many countries (e.g., UK, USA, Canada, Australia, NZ, and Europe as a whole). So perhaps the easiest way to have confidence in the competency of an HF/E consultancy, training provider, or individual practitioner is to check for accreditation. This varies between countries. In the UK, the Chartered Institute of Ergonomics and Human Factors provides accreditation via Chartership, which is conferred on those members who fulfil certain criteria. This includes “having a high level of qualification and experience and being able to demonstrate continuing professional development”. Additionally, different grades of membership of the CIEHF – Fellow, Registered Member, Graduate Member, Technical Member – reflect competency, proficiency and experience.

Member and consultancy directories of HF/E Societies and Associations are available to help. For instance, Members of the HFES can be seen here. Chartered Members of the CIEHF can be seen here. Registered Consultancies that are accredited by the CIEHF can be seen here. You can find other directories of individuals and organisations via the relevant Society or Association in your country. (Note that ‘Associate’ or ‘Affiliate’ Membership is, in most cases, available to anyone and indicates interest and commitment – since all members have to abide by the Code of Conduct – but does not provide assurance of qualifications or experience. Therefore a minimum membership grade for paid support should typically be Graduate or Technical Member.)

In some cases, those who identify as ‘human factors specialists’ will have accreditation via other professional organisations. Typically, these relate to psychology and engineering. Some human factors specialists will be Chartered Psychologists in the UK. (There are other organisations relating to psychology and human factors, often in specific sectors, but these are not recognised by the International Ergonomics Association, which is the umbrella organisation for Human Factors and Ergonomics worldwide. These other organisations also sometimes require members to purchase the organisations’ own training for accreditation, which raises questions that are beyond the scope of this post.) The point is that many who are accredited via another route (e.g., Chartered Psychologist or Chartered Engineer) may well be competent HF/E practitioners, but perhaps for specific aspects of HF/E and not the whole scope of HF/E, and may have a different perspective (e.g., more aligned with psychology) and different approach (e.g., more cognitive-behavioural, social-organisational).

Accreditation will require that the person undertakes appropriate continuing professional development, and submits evidence of this. This is important, but difficult for buyers of consultancy and training services to assess. Accreditation and membership remove some of that burden, because the Society does this as a requirement of the person’s membership.

3. Code of Ethics

Do they abide by a code of ethical conduct from an HF/E-related society or association?

This issue is covered by Accreditation above, but it is worth considering specifically because it is so important. A person offering HF/E consultancy or training services who is a member of an IEA Federated Society will have to abide by the Code of Conduct of that Society. The person should be aware of the Code. In any case, the Code (e.g., the CIEHF Code of Conduct) will cover ethical standards such as:

  • working within limits of competence
  • representation and claims of effectiveness
  • supervision
  • respect for evidence
  • confidentiality
  • impartiality
  • probity
  • considerations of religion, gender, race, age, nationality, class, politics or extraneous factors.

Professional societies of other disciplines and professions (e.g., psychology, engineering, health and safety) will also have codes of ethical conduct, and while these will not reference ergonomics, they will refer to similar sorts of issues to those mentioned above; working within competence, for example, would normally be formally recognised as an ethical issue.

This is an important question to ask anyone offering HF/E services and training, or seeking a job as an HF/E specialist.

If the person is not operating under the Code of Conduct of a professional organisation, then the protections available are limited to those under the law.

4. Experience

Do they have experience in the HF/E work and in the domain of interest?

The question here is whether the person has relevant experience in:

  • the kind of HF/E work (e.g., interface design, fatigue assessment, human error identification, cognitive work analysis, manual handling assessment), and
  • the sector of application (e.g., manufacturing, oil and gas, aviation, healthcare).

The first is the more important of the two, since HF/E – more than many other disciplines and professions – applies across sectors. HF/E practitioners tend to spend time in several sectors over their careers. However, sector knowledge is important, and HF/E specialists with a deep knowledge of one sector will have a greater understanding of its stakeholders, activities, procedures, technologies, regulations, cultures, etc. So at a micro level of application (e.g., the design of display elements or manual handling), much in HF/E crosses sectors. But at a macro level (e.g., the integration of HF/E throughout an organisation), this is not the case. When it comes to training others in aspects of HF/E (e.g., short courses), experience in the sector is a huge advantage, if not essential.

If the HF/E specialist offering consultancy or training services is accredited, then these issues will be covered by the Code of Conduct or Ethics of their HF/E Society or Association, and the person will have to abide by the relevant requirements (this is the focus of several items of the CIEHF Code of Conduct).

5. Social recognition

Is the person recognised as an HF/E specialist by other qualified HF/E specialists?

It can be hard to know whether a person is suitably qualified and experienced, though answering ‘yes’ to the questions above will suggest that they are. But there will be occasions when people fall outside one or more of the criteria above, yet HF/E colleagues and associates would say that the person is an HF/E specialist. This will tend to involve those who specialise in a specific aspect of HF/E, but perhaps do not call themselves human factors specialists or ergonomists (using other terms instead, such as UX designer or interaction designer), and who are not members of an HF/E Society or Association (e.g., as a Technical Member of the CIEHF). Such people may well use HF/E theory and methods appropriately, and may even be a recognised expert in their specialism. In this case, social recognition by experienced HF/E specialists will give a good indication.

Summing up

To sum up, here are the five criteria and questions that apply to paid-for human factors and ergonomics (HF/E) consultancy and training support and employment, that may help with reflection and discussion.

1. Qualification – Do they have a recognised qualification in HF/E?

2. Accreditation – Do they have an appropriate level of membership of an HF/E related professional organisation?

3. Code of Ethics – Do they abide by a code of ethical conduct from an HF/E related society or association?

4. Experience – Do they have experience in the HF/E work and the domain of interest?

5. Social recognition – Is the person recognised as an HF/E specialist by other qualified HF/E specialists?

The aim of these criteria and questions is to ensure that professional standards – including ethical standards – are met. The criteria and questions are framed above in the context of HF/E, but in fact they apply to any profession, such as psychology, dietetics, or physiotherapy. Proper consideration of the criteria and questions should help to protect organisations, individuals, and the integrity of the profession.

Further Reading

Education and application are discussed practically (in the context of aviation, but applicable more generally) in:

Hawkins, F. H. (1987). Human factors in flight. Gower Technical Press, pp. 326-341.

Posted in Human Factors/Ergonomics

Work-as-Imagined Solutioneering: A 10-Step Guide

Have you ever come across a ‘problematic solution’ that was implemented in your workplace, and wondered, “How did this come to be?” Wherever you sit in an organisation, the chances are that you have. Many problematic solutions emerge from a top-down process that I will call work-as-imagined solutioneering.

In this post, I outline a typical process of 10 Steps by which problematic solutions come into being. Some of the steps may be skipped, but with the same outcome: a problematic solution.

At the end of the post, you will find 10 ‘Solutions’ from healthcare, provided by healthcare practitioners in a series of posts on this blog on the archetypes of human work. These solutions do not typify the process below (since the process that these solutions were subject to is not known to me). And the solutions will all probably have various advantages and disadvantages. The solutions simply provide rich and messy examples of unimagined and unintended side-effects. But you will be able to think of many others in your own context (please provide an example as a comment or get in touch).

Throughout the 10 Steps, I will use terms to describe seven kinds of systems that must be reckoned with when making changes in socio-technical systems (from Martin’s [2004] Seven Samurai framework).

Step 1. Complex problem situation

The process of work-as-imagined solutioneering starts with a complex problem situation. Complex problem situations occur in systems with:

  • a number of stakeholders with conflicting goals
  • complex interactions between stakeholders and other elements of the socio-technical system (visible and invisible, designed and evolved, static and dynamic, known and unknown),
  • multiple constraints (social, cultural, technical, economic, regulatory, legal, etc), and
  • multiple perspectives on the nature of the problem.

Problems may well be interconnected to form a ‘mess’.

Step 2. Complexity is reduced to something simple

Complex problem situations are hard to understand and have no obvious solutions. This is unappealing to most people. Understanding complex problem situations requires that we seek to understand:

  • the various expressions of, and influences on, the problem,
  • the context system, including the stakeholders, their activities, the tools and artefacts that they use, the context or environment (physical, ambient, social, cultural, technical, economic, organisational, regulatory), and
  • the history of the context system.

One of the hallmarks of work-as-imagined solutioneering is a neglect of one or more of these facets of the problem situation or context system. This is partly because understanding requires:

  • high levels of field expertise – expertise in the work that is influenced by and influences the problem, whatever the work is,
  • an understanding of people (which can be approached via various disciplines: psychology, sociology, anthropology, community development, human factors/ergonomics, etc),
  • an understanding of socio-technical systems and the nature of change in such systems, and
  • sufficient expertise in a human-centred and system-oriented design process.

Once you have approached the problem situation in a sensible way, an analysis of stakeholder assets and needs should follow.

Unfortunately, once a problem is identified, the perceived urgency to do something creates pressure to be efficient, when thoroughness is required – a blunt-end efficiency-thoroughness trade-off. The required thoroughness is time-consuming and difficult. It requires specialist expertise and – crucially – bridging social capital to engage with field experts in order to get the understanding necessary to help, rather than hinder. [There is almost always a lack of expertise, and we should try to understand why solutions make sense to managers and not simply berate them.]

So these critical activities (understanding the context system and problem situation, and understanding stakeholder assets and needs) are often neglected. And complexity is reduced to something simple. For example, a mismatch between demand, resources and capacity may be reduced to a problem of ‘poor performance’. A mismatch between work-as-prescribed and work-as-done is reduced to ‘non-compliance’ or ‘violation’. A mismatch between design and performance is reduced to ‘human error’.

Step 3. Someone has a solution waiting for a problem

While there may be little understanding of the complex problem situation, solutions are at hand. Past experience, ideas from other industries or contexts, and committee-based idea-generation or diktats from authority figures make a number of ‘solutions’ available. Examples include:

  • measures
  • monitoring arrangements
  • quantified performance targets and limits
  • commercial off-the-shelf products (equipment, artefacts)
  • checklists
  • procedures
  • standard training
  • processes
  • incentives
  • punishments
  • reorganisation of activities, processes and reporting lines
  • redistribution of power.

Most of these (aside from targets, in most circumstances) are not inherently Bad Things. The Bad Thing is introducing them – any of them – without a proper understanding of the context system and the problem situation within that context system. But it is too late. The focus is now on the solution – the intervention system.

Step 4. Compromises to reach consensus

As the solution (intervention system) is revealed, people at the blunt end are now at the sharp end of a difficult process of design and implementation. There are disagreements and they start to see a number of complications. But the stability of the group is critical. The intervention system is put out for comment, usually to a limited audience and with the aim to prove its viability. There are further insights about the problem situation and context system, but these arrive in a haphazard way, instead of through a process of understanding involving design and systems thinking. Eventually, compromises are made to achieve consensus, and the intervention system is specified further. Plans are made for its realisation. The potential to resolve the problem situation is hard to judge because neither the problem situation nor the context system is properly understood.

Step 5. The project becomes a thing unto itself

The focus now turns to realisation. The problem situation and context system, which were always out of focus, are now out of view. The assets and needs of all stakeholders were never in view, but the needs of the stakeholders who are invested in the roll-out of the solution (intervention system) have been met: they can now feel reassured that something is being done. The need for corporate anxiety-reduction is now being addressed.

So the focus now switches from the intervention system to the realisation system – the system for bringing the solution into effect (management teams, resources, project management processes, materials, etc).

Step 6. Authorities require and regulate it

As the intervention system (the ‘solution’) gets more attention, authorities believe that this is a Good Thing. Sometimes, solutions will be mandated and regulated, and monitored by those with regulatory power. Now there is no going back.

Step 7. The solution does not resolve the problem situation

As the solution is deployed, it becomes the deployed system. This is not necessarily the same as the original idea (the intervention system). Compromises have been made along the way, both by those responsible for the intervention system (compromising on aspects of the concept), and by those responsible for the realisation system (compromising on aspects of implementation).

The design or implementation (or both) of the solution meets a need (corporate anxiety reduction) but does not resolve the original problem. The original problem remains, perhaps in a different form. Never events still happen (Solution 4), and a ‘paperless’ discharge summary process (Solution 5) still requires paper. The feedback loops, however, contain delays and distortion, which we will come back to.

Step 8. Unintended consequences

Not only does the solution not resolve the original problem, but it brings new problems that were never imagined. These include problems concerning system conditions (e.g., higher unwanted demand, more pressure, more use of resources), and problems concerning system behaviour (e.g., increased workload, unwanted workarounds).

Here are some healthcare examples:

A Duty of Candour (Solution 1) process results in a “highly bureaucratic process which has reinforced the blame culture.”

A Do Not Attempt Resuscitation (DNAR) form (Solution 2) results in patients being “subjected to aggressive, yet ultimately futile, resuscitation measures which may include multiple broken ribs, needle punctures in the arms, wrists and groin, and electric shocks” and nurses and paramedics working “in such fear of not doing CPR when there is no DNACPR that they may override their own professional judgement and do CPR when it is clearly inappropriate.”

Dementia diagnosis targets (Solution 3) result in “naming and shaming supposedly poorly diagnosing practices – published online. Setting doctors harmful tasks, leading them almost to “process” patients.”

Never Events list (Solution 4) – similar to various popular zero harm initiatives, “ignored the potential for using never events as a stick to beat people up with, … ignored the potential for gaming the data, … ignored the potential for people to become fearful of reporting and the loss of learning as a result.”

A ‘paperless’ discharge summary process actually results in more paper, along with “discrepancies between the notes of doctors, nurses, physiotherapists, occupational therapists, and social workers” (Solution 5). Similarly, following the implementation of a computerised medical system, “work-as-done reverted back to the system that was in place before where secretaries still had to print results on bits of paper and hand them to consultants to action” (Solution 6).

Amidst these unintended consequences, the context system has now changed and there may well be competing systems that address the problem, masking the effects of the deployed system. For instance, alongside a Central Line Associated Bacteraemia (CLAB) checklist (Solution 9), another deployed system was CLAB packs: “These put all required items for central line insertion into a single pack thereby making it easier for staff to perform the procedure correctly.” Which has the imagined effect?

Furthermore, there may be inadequate collaboration or support from collaborating systems and sustainment systems (which collaborate with the deployed system to achieve some goal or help it continue to function). Examples include blunt-end roles for monitoring, analysis, feedback, and the supply of tools, materials, and technical support. These stakeholders are typically far removed from operational work-as-done and do not understand the assets and needs of those who work on the front line. It may be that the deployed system cannot even function as intended, as designed, or as originally implemented.

Step 9. People game the system

Many work-as-imagined solutions can be gamed, and it may well be locally rational to the people who do – rather than imagine – the work. This is typical of measures (especially when combined with targets or limits) and processes. Following are some healthcare examples.

Radiology request forms are meant to be completed and signed by the person requesting the procedure. However, “In the operating theatre, the surgeon is usually scrubbed and sterile, therefore the anaesthetist often fills out and signs the form despite this being ‘against the rules’” (Solution 7).

On the introduction of the Commissioning for Quality and Innovation payments framework (CQUINs) to drive innovation and quality improvement in the NHS, clinicians are “demotivated by the process of collecting meaningless data and are tempted to use gaming solutions to report best performance” (Solution 8), having informed the commissioners of problems with the deployed system and offered suggested improvements to the metrics (which do not fit the intervention system concept).

Checklists for the prevention of Central Line Associated Bacteraemia (CLAB) (Solution 9) are completed “retrospectively without watching the procedure, as they were busy with other tasks”.

Step 10. It looks like it works

The gaming, combined with feedback lags and poor measures, may well give the illusion that the deployed solution is working, at least to those not well connected to work-as-done.

After introducing the CLAB bundle (Solution 9), “very high levels of reported checklist compliance” were observed, “followed by the expected drop in our rates of infection, confirming the previously reported benefits”. But the drop instead “appears to be due to the use of CLAB packs. These put all required items for central line insertion into a single pack thereby making it easier for staff to perform the procedure correctly.”

With the WHO Surgical Safety Checklist (Solution 10), “The assumption within an organisation at ‘the blunt end’ is that it is done on every patient” despite “clear evidence that there is variability in how the checklist is used both within an organisation and between organisations”.

Of course, there may well be knowledge that work-as-imagined does not align with work-as-done, but this is an inconvenient truth. Too often, what we are left with is a separation (or even inappropriate congruence) of the four varieties of human work: work-as-imagined, work-as-prescribed, work-as-done, and work-as-disclosed. This is enacted in a number of archetypes of human work.

This is not the end of the process, but by this stage, the project team that worked on the originally intended solution (the intervention system) have moved on. The deployed system remains, and now we must imagine a solution for both the original problem and the new problems.

Solution 1: Duty of Candour

Over the last few years there has been a call to enshrine ‘saying sorry’ in law. This became the ‘duty of candour’. When this was conceived it was imagined that people would find the guidance helpful and that it would make it easier for frontline staff to say sorry to patients when things have gone wrong. Patient advocates thought it would mean that patients would be more informed and more involved, and that it would change the relationship from an adversarial one to a partnership. In practice this policy has created a highly bureaucratic process which has reinforced the blame culture that exists in the health service. Clinical staff are more fearful of what to say when something goes wrong and will often leave it to the official process, or for someone from management to come and deliver the bad news in a clinical, dispassionate way. The simple art of talking to a patient, explaining what has happened and saying sorry has become a formalised, often written, compliance duty. The relationships remain adversarial and patients do not feel any more informed or involved than before the duty came into play. Suzette Woodward, National Clinical Director, Sign up to Safety Team, NHS England @SuzetteWoodward

Solution 2: Do Not Attempt Cardiopulmonary Resuscitation (DNACPR) form

A Do Not Attempt Resuscitation (DNAR) form is put into place when caregivers feel that resuscitation from cardiac arrest would not be in the patient’s best interests. These forms have received a significant amount of bad press, primarily because caregivers were not informing the patient and/or their families that these were being placed. Another problem with DNAR forms is that some clinicians feel that they are being treated as “Do Not Treat” orders, leading (they feel) to patients with DNAR forms in place receiving sub-standard care. This means that some patients who would not benefit from resuscitation are not receiving DNAR forms. As a result when these patients have a cardiac arrest they are subjected to aggressive, yet ultimately futile, resuscitation measures which may include multiple broken ribs, needle punctures in the arms, wrists and groin, and electric shocks. It is not unusual to hope that these patients are not receiving enough oxygen to their brains to be aware during these last moments of their lives. Anonymous, Anaesthetist

What is sad is that this is not an unusual story. Unless a person dying in Hospital or a Nursing Home has a DNACPR then CPR will usually be done. CPR may even be done when a person in frail health dies at home without a DNACPR, because the paramedics may be instructed to do CPR, “just in case it was a cardio-pulmonary arrest”. Nurses and paramedics work in such fear of not doing CPR when there is no DNACPR that they may override their own professional judgement and do CPR when it is clearly inappropriate. Recently a nurse was reprimanded by the Nursing and Midwifery Council for not trying CPR on a nursing home resident who, in my opinion, was clearly already dead. I know of a case in our Hospital in which CPR was started on a person whose body was already in rigor mortis. Dr Gordon Caldwell, Consultant Physician, @doctorcaldwell

Solution 3: Dementia Diagnosis Targets

There are high levels of burnout. A target-driven culture is exacerbating this problem. A typical example was when the government seemingly became convinced by poor quality data which suggested that dementia was underdiagnosed. So it decided to offer GPs £55 per new diagnosis of dementia. Targets were set for screening to take place – despite the UK National Screening Committee having said for years that screening for dementia was ineffective, causing misdiagnosis. And when better data on how many people had dementia was published – which revised the figures down – it was clear that the targets GPs were told to meet were highly error-prone. The cash carrot was accompanied with a beating stick, with the results – naming and shaming supposedly poorly diagnosing practices – published online. Setting doctors harmful tasks, leading them almost to “process” patients, fails to respect patient or professional dignity, let alone the principle of “do no harm”. [Extract from article The answer to the NHS crisis is treating its staff better, New Statesman.] Margaret McCartney, General Practitioner, @mgtmccartney

Solution 4: Never Events List

When we created the list of ‘never events’ at the National Patient Safety Agency we genuinely thought that it would lead to organisations focusing on a few things and doing those well. We thought it was a really neat driver for implementation of evidence based practice (e.g. the surgical safety checklist). We ignored the potential for using never events as a stick to beat people up with, we ignored the potential for gaming the data, we ignored the potential for people to become fearful of reporting and the loss of learning as a result. Importantly, we ignored the fact that in the vast majority of cases things can never be never – that it is a fact of life that things can and do go wrong no matter how much you try to prevent it. There is no such thing as zero harm and the never events initiative unfortunately gave the impression that it could exist. Suzette Woodward, National Clinical Director, Sign up to Safety Team, NHS England @SuzetteWoodward

Solution 5: ‘Paperless’ Discharge Summary Process

Our paperless Discharge Summary process generated about 5 times as many sheets of A4 as the old paper system, as the ‘paperless’ prescription got corrected and refined prior to discharge. Then we were still told we had to print a copy to go into the paper notes, and of course the patient has to have a paper copy because there was no way to email it to the patient. The software could not message pharmacy, so we had to print out the discharge meds to be sent to pharmacy, who then checked them, found the errors, got doctors to correct them, then another print out, and round again. Then there were discrepancies between the notes of doctors, nurses, physiotherapists, occupational therapists, and social workers, and soon we are all working on different problems in different directions, and the patient becomes a ‘delayed discharge’. There are so many paper copies that sometimes an earlier incorrect paper copy gets filed into the notes. Then, unless someone hits ‘Finalise’, the pdf copy never gets emailed to the GP at all. Dr Gordon Caldwell, Consultant Physician, @doctorcaldwell

Solution 6: Computerised Medical systems

With the installation of a fully computerised system for ordering all sorts of tests (radiology requests, lab requests, etc.), work-as-imagined (and -as-prescribed) was that this would make work more efficient and safer, with less chance of results going missing or being delayed. Prior to the installation there was widespread talk of how effective and efficient this would be. After installation it became apparent that the system did not fulfil the design brief and while it could order tests it could not collate and distribute the results. So work-as-done then reverted back to the system that was in place before where secretaries still had to print results on bits of paper and hand them to consultants to action. Craig McIlhenny, Consultant Urological Surgeon, @CMcIlhenny

Solution 7: Radiology Request Forms

Radiology request forms are meant to be completed and signed by the person requesting the procedure. In the operating theatre, the surgeon is usually scrubbed and sterile, therefore the anaesthetist often fills out and signs the form despite this being “against the rules”. Managers in radiology refused to believe that the radiographers carrying out the procedures in theatre were “allowing” this deviation from the rules. Anonymous.

Solution 8: CQUINs (Commissioning for Quality and Innovation payments framework)

Commissioners often use CQUINs (Commissioning for Quality and Innovation payments framework) to drive innovation and quality improvement in the NHS. In theory, the metrics relating to individual CQUINs are agreed between commissioners and clinicians. In practice, some CQUINs focus on meaningless metrics. A hypothetical example: a CQUIN target for treating all patients with a certain diagnosis within an hour of diagnosis is flawed due to a failure of existing coding systems to identify relevant patients. Clinicians inform the commissioners of this major limitation and offer suggested improvements to the metrics. These suggested improvements are not deemed appropriate by the commissioning team because they deviate significantly from previously agreed definitions for the CQUIN. The clinicians are demotivated by the process of collecting meaningless data and are tempted to use gaming solutions to report best performance. This situation is exacerbated by pressure from the management team within the NHS Trust who recognise that failure to demonstrate adherence to the CQUIN key performance indicators is associated with a financial penalty. The management team listen to the clinicians and understand that the data collection is clinically meaningless, but insist that the clinical team collect the data anyway. The motivational driver to improve performance has moved from a desire to improve clinical outcomes to a desire to reduce financial penalties. The additional burden is carried by the clinical team who are expected to collect meaningless data without any additional administrative or job plan support. Anonymous, NHS paediatrician

Solution 9: Central Line Associated Bacteraemia (CLAB) checklists

The use of checklists for the prevention of Central Line Associated Bacteraemia (CLAB) is well described and has been taken up widely in the healthcare system. The purported benefits of the checklist include ensuring all steps are followed as well as opening up communication between team members. After introducing the CLAB bundle into our Intensive Care Unit, we saw very high levels of reported checklist compliance followed by the expected drop in our rates of infection, confirming the previously reported benefits. However, when we observed our staff it became apparent that they were actually filling in the checklist retrospectively without watching the procedure, as they were busy with other tasks. The fall in the CLAB rate could therefore not have been due to the use of a checklist and instead appears to be due to the use of “CLAB packs”. These put all required items for central line insertion into a single pack thereby making it easier for staff to perform the procedure correctly. Carl Horsley, Intensivist, @horsleycarl

Solution 10: WHO Surgical Safety Checklist

The WHO Surgical Safety checklist was introduced into the National Health Service following the release of Patient Safety Alert Release 0861 from the National Patient Safety Agency on 29 January 2009. Organisations were expected to implement the recommendations by February 2010, including that ‘the checklist is completed for every patient undergoing a surgical procedure (including local anaesthesia)’. All organisations have implemented this Patient Safety Alert and the WHO Surgical Safety checklist is an integral part of the process for every patient undergoing a surgical procedure. Whilst the checklist appears to be used in every patient, there is clear evidence that there is variability in how the checklist is used both within an organisation and between organisations. Within an organisation, this variability can occur between teams with differences in the assumed value of using the checklist and within a team between individuals or professional groups. Its value can degrade to a token compliance process to ‘tick the box’. The assumption within an organisation at ‘the blunt end’ is that it is done on every patient. Alastair Williamson, Consultant Anaesthetist, @TIVA_doc


Martin, J. N. (2004). The Seven Samurai of Systems Engineering: Dealing with the Complexity of 7 Interrelated Systems. Presented at the 2004 Symposium of the International Council on Systems Engineering (INCOSE).

Note: This is a post from June that curiously disappeared from the blog. I probably pressed a wrong button somewhere. Like ‘Move to Trash’, or something similarly unclear.


The Loneliest Profession in Healthcare


Steven Shorrock CC BY-NC-SA 2.0

Health and social care is one of the biggest employers in developed countries, and the National Health Service (NHS) in the United Kingdom is one of the largest employers in the world. By some calculations, the NHS is the fifth biggest organisation in the world in terms of number of staff, calculated at around 1.5 million, or 1.25 million full-time equivalent. That figure does not even include temporary staff, general practitioners, dentists, optometrists, and other staff in the independent sector or private hospitals.

It is an organisation facing enormous demand. Looking at just a few headline aspects, the NHS deals with over 1 million patients every 36 hours, with over 16 million hospital admissions in 2015/16, over 23 million attendances at Accident & Emergency departments in 2016/17, and over 89 million outpatient attendances in 2015/16. In terms of NHS net expenditure, meeting this demand cost over £120 billion in 2016/17, and is expected to rise to over £126bn in 2018/19.

Not surprisingly, it is also a bafflingly complex organisation, in terms of: the variable and unpredictable nature of demand; the huge variety of staff roles and competencies; the incalculable number of different types of equipment (which are very often not designed according to ergonomic standards) and medicines (which often look or sound alike); the tens of regulators, professional bodies, and associations; the thousands of laws, regulations, diktats, policies, procedures, guidelines, and good practice documents for clinical and non-clinical staff; the complicated record-keeping and communication channels; the links to government agencies, local authorities, police and fire services, suppliers, independent providers, universities; and the interactions between all of these that make it such a complex sociotechnical system of systems.

Added to this are the elements of the system that cannot easily be seen, let alone counted, but strongly affect human behaviour, including: shifting goals, incentives, punishments, subcultures, and pressures from the public, media, regulatory and professional bodies, politicians, and associations.

You’d expect, then, that Human Factors/Ergonomics (HF/E) would be very relevant to the NHS (the two terms are seen as equivalent within the discipline, though they are used in different contexts). After all, HF/E is:

“the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data and methods to design in order to optimize human well-being and overall system performance” (International Ergonomics Association).

Healthcare organisations should naturally be interested in human well-being, of patients and staff. It is obviously important that the NHS, as a system of systems, performs effectively.

So you’d also expect that many HF/E practitioners would be embedded in the NHS, directly employed by the NHS and its Trusts, just like quality improvement specialists, performance management specialists, human resources specialists, and indeed even staff who work directly with patients, such as Dietitians. After all, the role of HF/E practitioners is directly relevant to the effective provision of health services, as per the ‘7 key principles that guide the NHS in all it does‘.

“Practitioners of ergonomics and ergonomists contribute to the design and evaluation of tasks, jobs, products, environments and systems in order to make them compatible with the needs, abilities and limitations of people.”  (International Ergonomics Association).

HF/E practitioners are well-embedded in a number of sectors, notably aviation, rail, defence, oil and gas, nuclear, manufacturing, regulation, product design, inclusive design, and UX. NATS – a provider of air traffic services in the UK and international airports, airlines and governments – employs around 25 qualified Human Factors practitioners (comprising Human Factors/Ergonomics practitioners and Psychologists). The Rail Safety and Standards Board (RSSB), Network Rail, London Underground, BAE Systems, QinetiQ (formerly part of the Defence Evaluation and Research Agency [DERA]), and other large organisations and regulators such as the Health and Safety Executive, all have long-established teams of qualified HF/E practitioners, who help to optimise system performance and human well-being by contributing to the design and evaluation of tasks, jobs, products, environments and systems in order to make them compatible with the needs, abilities and limitations of people.

Compared to any of these organisations, the NHS is enormous. To make one comparison, NATS is an organisation of around 4,500 staff – a third of the staff in some of Britain’s biggest hospital trusts. The NHS is 333 times bigger than NATS in terms of staff. Even individual hospital Trusts are enormous, with staff counted in the thousands, and up to 15,000 – three times bigger than the whole of NATS.

Of the 1.5 million members of staff, professionally qualified staff make up over half (53.8 per cent) of the Hospital and Community Health Service (HCHS) workforce (based on FTE). Healthcare obviously requires professionally qualified and accredited staff in order to provide effective services.

So how many professionally qualified HF/E practitioners are there in the NHS? It is not straightforward to answer, because ‘human factors practitioner’ (or human factors specialist, etc) and ‘ergonomist’ are not protected titles. Anyone may describe themselves as such (much as anyone can call themselves a ‘psychologist’, which is not a specifically protected title). But the profession of Human Factors/Ergonomics is Chartered in the UK, like others such as Chartered Psychologist, Chartered Engineer and Chartered Accountant. Chartered status in HF/E is conferred by the Chartered Institute of Ergonomics and Human Factors to those members who fulfil certain criteria. This includes “having a high level of qualification and experience and being able to demonstrate continuing professional development”, and operating under a Code of Conduct.

We can therefore count the number of ‘Chartered Ergonomists and Human Factors Specialists’ (CErgHF) employed by NHS Trusts, since a list of CErgHFs is published here. At the time of writing, there are 429 CErgHFs on the register. While there are others with various qualifications, and those who may have ‘human factors’ in their job title, the CIEHF is the only arbiter that provides a countable, unarguable category (as is the case for a Surveyor or Accountant) for the purposes of determining how many work in NHS Trusts.

From my own research, I have determined the number of NHS Trusts that directly employ a Chartered Ergonomics and Human Factors Specialist, and the number of CErgHFs in the 233 NHS Trusts.

That number is 1.

Whichever way you look at it, the number is one.

To my knowledge, after researching the network of CErgHFs, one Trust employs a CErgHF as a Human Factors/Ergonomics specialist, and that one Trust employs – at present – one CErgHF.

Let that sink in for a moment.

An organisation of 1,500,000 staff and £120 billion expenditure.

If the number of HF/E specialists in NATS (which works in an ultrasafe sector – commercial aviation) were scaled up to the number of staff in the NHS, there would be over 8,000 HF/E specialists. Obviously, that is neither feasible nor necessary. But even if there were just one CErgHF per Trust, then there would be 233 CErgHFs in Trusts, plus those who should certainly be in other central bodies, such as NHS Education (NHS Education Scotland has a CErgHF working in an HF/E role), NHS Improvement, NHS Digital, etc.
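The scaling arithmetic behind these comparisons can be sketched as a back-of-envelope check, using only the staff figures quoted above (all approximate):

```python
# Back-of-envelope check of the staffing comparisons quoted above.
# All figures are the approximations used in the text, not official statistics.
nhs_staff = 1_500_000   # approximate NHS headcount
nats_staff = 4_500      # approximate NATS headcount
nats_hfe = 25           # qualified HF/E practitioners at NATS
nhs_trusts = 233        # NHS Trusts at the time of writing

scale = nhs_staff / nats_staff   # how many times bigger the NHS is than NATS
scaled_hfe = nats_hfe * scale    # the NATS ratio applied to the NHS workforce

print(f"The NHS is ~{scale:.0f}x the size of NATS")          # ~333x
print(f"At the NATS ratio: ~{scaled_hfe:,.0f} HF/E specialists")  # ~8,333
print(f"One per Trust would be {nhs_trusts}; the actual number is 1")
```

Even with generous rounding, the gap between the NATS-equivalent figure (thousands), the one-per-Trust figure (233), and the actual figure (1) spans three orders of magnitude.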

Even one CErgHF for a Trust of up to 15,000 staff is, however, inadequate, especially given the thousands of so-called ‘excess deaths’ every year, and the apparent focus on ‘Human Factors’ in various national bodies.

The tens of thousands of so-called ‘excess deaths’ should lead us to question why the NHS Trusts have only one Chartered Ergonomist and Human Factors Specialist. Even if HF/E could help prevent just a relatively small number of deaths and injuries, then we must ask why it is not being integrated professionally. HF/E focuses on many aspects of healthcare that must be designed properly in order to deliver safe, effective, and cost-efficient services, and relates directly to the work of the NHS and the priorities set out by NHS England.

In other industries, HF/E contributes to the design and evaluation of tasks, jobs, products, environments and systems, for improved system performance and human well-being, in terms of:

  • the design and evaluation of equipment
  • the design of tasks and jobs
  • the design of physical and ambient environments
  • the design of policies, procedures, checklists, guidelines and job aids
  • human factors integration into management systems
  • human performance assessment and support
  • safety assessment (and risk assessment generally)
  • incident and accident investigation and analysis
  • staffing and manpower planning
  • shift design and fatigue risk assessment and management
  • stress management
  • communication design
  • safety culture and organisational culture evaluation
  • non-technical skills, team resource management, and (simulation) training.

Of these functions, the main focus of human factors integration in the NHS has been the last of these. Many clinicians and educators have embraced human factors and integrated it into their non-technical skills, team resource management, and simulation training. It’s perhaps an obvious place to start, it’s vitally important, and it’s very well done (in fact, simulation training in some Trusts could teach aviation a thing or two – and has). There are also a few individuals in the NHS with HF/E qualifications and experience, but who are not Chartered. Aside from a small number (mostly in Scotland), these do not work as HF/E specialists per se and/or are not performing the activities above for Trusts.

And there are Chartered HF/E specialists in medical device design companies outside of the NHS, and of course in consultancies and universities, who consult or conduct research in healthcare organisations.

More generally, there is significant interest in HF/E from clinicians; so much so that one would expect that it was actually integrated into the NHS. The term ‘human factors’ regularly crops up on social media and in conferences (though it could mean anything).

But none of this embeds a systematic consideration of designing for humans in healthcare, which is part of normal business in many high-risk industries. This can only really be done with trained and qualified practitioners, just as is the case with Physiotherapists, Dietitians, and Counselling Psychologists. In the absence of the HF/E equivalent in the NHS, front-line staff are having to do what they can, with the time they have, and otherwise work around and patch up problems in resources, constraints and environments that are:

  • not understood at the blunt end
  • always changing
  • not implemented or functioning as originally designed or imagined
  • often not ‘designed’ at all, and
  • degraded and stretched beyond design intent.

When things go wrong, it is patients, families, and clinicians who suffer the consequences.

And yet, of the 233 NHS Trusts, only one Trust employs a Chartered Ergonomist and Human Factors Specialist in an HF/E role, and that Trust employs – at present – one CErgHF.

It’s incredible. But it’s true.


Bonding and Bridging at the Philosophical Breakfast Club

On 26 April 2018, I presented at the ‘Philosophical Breakfast Club’ conference on High Performing Teams. It was a remarkable conference bringing together healthcare professionals, psychologists, sports scientists, athletes, managers, human factors/ergonomics specialists, military officers and specialists, and others. My first conversation while having tea before the conference was with a spinal surgeon and bomb disposal expert. Throughout the conference I had many other fascinating conversations with people from a diverse range of backgrounds.

This leads me to the focus of my talk: collaboration at the interfaces, and what happens between teams, groups, professions, layers of management, organisations…  In this post, I summarise the talk, slide by slide, with tweet-sized explanations.

This is a talk on bonding and bridging. It is about what happens with and between groups. I draw on my and my colleagues’ experience in over 30 countries over 10 years understanding groups and organisations psychometrically and ethnographically. 1/


This infographic from Issue 26 shows the scale and interconnectedness of the European air traffic management system. Many sources of data show how ATC is a very safe part of an ultra-safe industry – commercial aviation. 2/


Air traffic controllers and others routinely state that teamwork is something they value most in creating safety, and they nearly always rate it very positively. Where problems do occur – as in all industries and organisations – they tend to relate to interfaces between groups. 3/


Here is a small example. A new ATC centre was built and, during discussions, it emerged that there were now communication issues between controllers and engineers. In the old centre, controllers had to pass through the engineers’ coffee area. The new centre designed this out. 4/


In another example, problematic relations between two operational groups reached crisis point after a serious incident occurred, when blame turned inward between groups. There wasn’t sufficient trust and openness to cope with the fallout of the event. This had to be worked on reparatively. 5/


Another example involved inadequate coordination between an airport and ATC tower. The airport mandated a new procedure that increased ATC workload significantly and unsustainably. The impact was not understood. Other problems have involved procedures mandated by safety departments, with unintended consequences. 6/


Very few accidents are associated with ATC. One of the few occurred in 2002 over Überlingen in southern Germany. It involved problems of coordination and communication between engineering and ATC, as well as management, and the whole aviation system. 7/

This is a figure from the EUROCONTROL White Paper on Safety-I and Safety-II. To cope with especially problematic circumstances, incidents, emergencies, and the aftermath of accidents, we need to pay attention to collaboration during everyday work, and learn from ‘the best of what is’. 8/


It can seem that the design of organisations gets in the way of collaboration. We organise via ‘Divisions’ and ‘Departments’ (which, from the Old French, mean the same thing). Our org charts don’t show the demand or need for services, nor the customer, nor the flow of work and comms. 9/


The siloisation of work extends to all levels. This figure shows a slightly adapted ActorMap template (Rasmussen), which can be used to map interactions, and in conjunction with AcciMaps to understand issues at the interfaces. 10/


The question is, where is the boundary of the sociotechnical system that you are interested in? We need to appreciate systems theory in order to understand teamwork and collaboration. Teams exist within a much larger interconnected network. (Beer is also part of an eco-consumer system [courtesy of Black Isle Beer].) 11/


So let’s turn to the work that we do, and how we imagine the work of others. 12/


We understand the work that we do. It is hard to understand from afar, because it is complex and messy. It is characterised by variability, adjustments, adaptation and trade-offs between goals (efficiency-thoroughness, acute-chronic, tasks-relationships, etc). 13/


But when we imagine the work of others, our imagination of the work is vastly simplified and wrong in important ways. While work-as-done is dynamic, our imaginations are static. We all have a different image of others’ work. And they are all wrong. 14/


Many people, when reflecting on others’ imaginations of their work – expressed in artefacts such as policies, procedures, diktats, equipment, etc – describe two different worlds. We live in a state of Ignorance and Fantasy of others’ work. 15/


The relationship between work-as-imagined and work-as-done was the theme of HindSight Issue 25 (free). It includes articles by professors of safety and human factors, controllers, pilots, and some clinicians. 16/

Erik Hollnagel made a distinction between egocentric and allocentric work-as-imagined. The former refers to our imagination of our (current and future) work. The latter refers to imagination of others’ work. The two have very different feedback loops. 17/


Within close-knit groups, we are close to others’ work, and we tend to do similar kinds of jobs. So we can more easily imagine others’ work. We also form trust and reciprocity through our interactions – bonding social capital. 18/

Between groups – professions, teams, organisations – it is not so easy. It is hard to see and understand what others do, or why they do it (in the way they do it). We often lack trust and reciprocity. It’s hard to trust someone you don’t see. We lack bridging social capital. 19/

The issue of Safety at the Interfaces was the theme of HindSight Issue 26. Here we explored interfaces between groups within aviation, as well as healthcare, WebOps, and communities. 20/


Some lessons that have struck me really strongly over the last few years have come from seeking to understand how communities work. It seems rare that organisations and professions try to understand communities, and yet there is much to be learned. 21/

One strand of practice is known as Asset-Based Community Development (ABCD). ABCD is an approach to understanding and developing communities from the inside, based on what they have – assets – instead of what (we imagine) they don’t have – deficiencies. It starts with assets. 22/

A major figure in ABCD has been John McKnight. He has worked in activist organizations and civil rights agencies, and learned the Alinsky approach to community organizing. He created a new Department at Northwestern University to support urban change agents. 23/

Another major figure in community development has been Peter Block, known for work on organization development, community building, and civic engagement. He works on building the capacity of community to value its gifts and see its own possibility. 24/


I’m grateful to have been introduced to ABCD, and its history, via Cormac Russell. He has worked with communities in over 30 countries and has brought ABCD to many more. Cormac is a faculty member of the ABCD Institute at Northwestern University. 25/

One of the many insights from ABCD concerns boundaries and invitation. What are the boundaries of our groups? Is there an invitation at the edge? 26/

John McKnight recalled to Cormac Russell a story about John’s ‘County Labrador Retriever Owner Association’, where people and their Labrador dogs got together. One day, someone with a beautiful German Shepherd approached the group. But it wasn’t a Labrador. It illustrated something about the often arbitrary boundaries that we create and maintain. 27/

Boundaries between groups vary in nature. They can be situational, perhaps involving time or place. They can be physical, like a redesigned building that separates groups. They can be more personal, professional and social. Or they can be built into organisations and systems. 28/

Whatever kind of boundary it is that separates groups, it defines who is in, and who is out. It is usually clear to everyone which side of the fence they are on, even though it may never be stated (see this interview in #HindSightMagazine Issue 26). 29/

Here is the challenge. Whether we think about our organisation, profession, group, team, association, community… “How can we keep expanding the limits of our hospitality, our willingness to welcome strangers?” ‘Outsiders’ don’t dilute our group. They invigorate it. 30/

Another key aspect of ABCD concerns mindset. Communities – and people in organisations – are often seen through a deficit lens, so people are defined in terms of (imagined) needs. This is the wrong way to look and the wrong place to start. We need to start with their assets. This is also a way of thinking that resonates with Safety-II. 31/

It is valuable to discover the gifts, skills and passions of our fellows. What are they naturally good at – something their mother might point out? What have they learned as a skill? What are they passionate about? And how can we get these connected up between people? 32/


We want to increase participation in organisations and communities, and between the two. “Community building is about getting the greatest number of contributions by the greatest number of people” (from Looking Back to Look Forward). 33/

I think that this participation requires three things. The first is contribution or capability. We all have something to contribute, but this needs to be discovered. The second is the opportunity to show up and contribute. The third is the motivation or desire to do so. 34/

Increasing the diversity of contributions counters our ‘déformation professionnelle‘ – our tendency to look at things from our limited professional perspective. It also increases the quality of ideas and allows ’emergent expertise’ to arise from our interactions. 35/

In getting our gifts, skills and passions connected, and in increasing participation from the greatest number of people, you don’t start at the centre. You start at the edge. This is a profound insight from ABCD, with implications for organisations and professions, too. 36/

A final insight I’d like to draw from ABCD for the purposes of this thread concerns connection. We all have a role to play in getting people’s gifts, skills and passions connected, but we can learn from some people with particular roles or ways of being. 37/

In my podcast conversation with Cormac Russell, he highlighted four roles. Leaders crystallise issues that people can gather around, and develop followers. Networkers develop their network and may bring people together, but do so more opportunistically. Gappers link together functions and people at the edges or boundaries. Connectors connect in a special and natural way. 38/


You may have met neighbours/colleagues who are like this. They are well connected, see the best of others, are trusted & create trust. They believe in community & move around comfortably between different groups. They get joy from connecting people. They’ve no other agenda. They are connectors. 39/

You can listen to/read the whole discussion with Cormac Russell here. 40/

Asset-Based Community Development has affected my practice in some quite important ways. I routinely try to integrate ABCD and Safety-II insights into safety – which is notoriously deficit-based – and organisational work more generally. How might you start discussions, observations, etc., on an asset footing? 41/






Here are a couple of Editorials on the theme of this thread, from Issue 25 and Issue 26. 42/


And that’s a wrap. It took longer than I imagined…but hopefully was helpful. Thanks for reading/listening. 43/43




Human Factors and Ergonomics: Looking Back to Look Forward

During the Second World War, the United States lost hundreds of planes in accidents that were deemed ‘pilot error’. Crash landings were a particular problem for the Boeing B-17 ‘Flying Fortress’. The planes were functioning as designed, and the pilots were highly trained, but made basic errors. In 1942, a young psychology graduate, Alphonse Chapanis, joined the Army Air Force Aero Medical Lab as its first psychologist. Chapanis noticed that the flaps and landing gear had identical switches that were co-located and were operated in sequence. In the high-workload period of landing, pilots frequently retracted the gear instead of the flaps. This hardly ever happened to pilots of other aircraft types. Chapanis fixed a small rubber wheel to the landing gear lever and a small wedge shape to the flap lever. This kind of ‘pilot error’ almost completely disappeared.

A few years later, in 1947, experimental psychologists Paul Fitts and Richard Jones analysed accounts of 460 errors made in operating aircraft controls, through interviews and written reports. They noted that “It has been customary to assume that prevention of accidents due to materiel failure or poor maintenance is the responsibility of engineering personnel and that accidents due to errors of pilots or supervisory personnel are the responsibility of those in charge of selection, training, and operations.” Fitts and Jones took a different slant altogether. The basis for their study was the hypothesis that “a great many accidents result directly from the manner in which equipment is designed and where it is placed in the cockpit.” What had been called ‘pilot error’ was actually a mismatch between characteristics of the designed world and characteristics of human beings, and between work-as-imagined and work-as-done.

Fitts and Jones considered a range of problems, including operating the wrong control, failing to adjust a control properly, forgetting to operate a control, moving a control in the wrong direction, unknowingly activating a control, and being unable to reach a control when needed. The flap-gear substitution error, and many other ‘pilot errors’ were actually problems of cockpit design. They concluded: “Practically all pilots of present day AAF aircraft, regardless of experience or skill, report that they sometimes make errors in using cockpit controls. The frequency of these errors and therefore the incidence of aircraft accidents can be reduced substantially by designing and locating controls in accordance with human requirements” (p.2). They went on to specify design measures for controls and displays (concerning standardisation, simplification, sequencing, interlocks, and other aspects of compatibility of controls with human characteristics and expectations).

These and other studies brought into focus the ‘obvious fact’ that human performance cannot be separated from the design of tasks, equipment and working environments. We can’t just train and supervise human performance. We have to design for it. Accidents associated directly with cockpit design are now extremely rare, and in 2017 there were no passenger deaths from flights in commercial passenger jets.

The birth of a discipline

Research in the US and UK concerning real work in real environments during and after WWII formed the beginnings of the discipline that was termed ‘human factors’ (US) and ‘ergonomics’ (UK). It was not the intention of early researchers to form a new discipline. Rather, “the intention was much more modest, namely, to facilitate discussion, information exchange and collaboration between scientists working across a range of specialisms” (Waterson, 2016). These specialisms were anatomy, physiology, psychology, industrial medicine, industrial hygiene, design engineering, architecture and illumination engineering (Murrell, 1965).

Over time, human factors/ergonomics (HF/E) became a distinct discipline, with its own societies. The first was the Ergonomics Research Society in the UK in 1950 (now the Chartered Institute of Ergonomics and Human Factors), followed by the Human Factors Society of America in 1957.

Despite the different names for the discipline, a formal definition has been agreed, via the International Ergonomics Association – the umbrella association for national HF/E societies and associations. The definition is accepted by member societies around the world:

“Ergonomics (or human factors) is the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data and methods to design in order to optimize human well-being and overall system performance.” (International Ergonomics Association)

A simpler definition was provided by the late John Wilson, who defined ‘systems ergonomics and human factors’ as follows (extract):

“Understanding the interactions between people and all other elements within a system, and design in light of this understanding.” (Wilson, 2014, p.12)

Simpler still, HF/E is sometimes referred to as ‘design for human use’.

HF/E takes a scientific approach to understanding and design, including the generation and application of associated theory, principles, data and methods. Decades of scientific research in a range of contexts have enabled a sophisticated understanding of human needs, limitations and capabilities, influences on human performance and wellbeing, human influences on system performance, and patterns of interaction between human and other system elements – physical, technical, informational, social, organisational, political, and economic.

Designing interactions

HF/E focuses on the design of these interactions. This differentiates HF/E from other design and engineering disciplines. For industrial applications, a good shorthand for this is ‘work’. So HF/E seeks to optimise the design of work, but with a focus on work-as-done, and not simply work-as-imagined (see also EUROCONTROL, 2016).

Interactions occur at different levels. At a micro level, we have basic interactions such as pulling a lever, pressing a button, turning a dial, or hearing an alarm. At a meso level, interactions combine, bringing more complexity, such as communication and coordination between a pilot, co-pilot, and cockpit. At a macro level, the number of elements and interactions, and associated complexity, increases further, perhaps expanding to air traffic controllers, air navigation equipment, ground staff, airport, airspace, management, regulation, etc. As the lens widens, so does the number of stakeholders, and the number of goals, needs and system or design requirements that need to be considered.

These interactions occur in a context, and context is critical to HF/E. If I turn the wrong burner dial on my stove (which I do, very often), it is not a problem. I simply turn it off, and now I know the correct dial to turn. If I want to be sure, I can bend down to look at the little diagram, but often I can’t be bothered. A similar action by B-17 pilots resulted in retracting the gear instead of the flaps, and accidents. A similar action by an anaesthetist might inadvertently turn off a continuous-flow anaesthetic machine, because of a badly positioned power switch. If the consequences of turning the wrong burner dial were more severe, I would bother to check the little diagram more often, but I would still make mistakes, mostly because the layout of the burners is incompatible with the layout of the dials, which look identical and are co-located. If the consequences were indeed more severe, cooker designers would be forced to design dials to be compatible with burners, along with other designed safety features.

HF/E in practice is a blend of craft, engineering and applied science. The approach tries to make system interaction and influence visible. It uses methods for data collection, analysis and synthesis, to understand and map system interaction at every stage of the life-cycle of a system or product. HF/E can therefore help in the design of interactions in the context of:

  • artefacts (e.g., equipment, signs, procedures)
  • designed environments (e.g., airport layout, airspace design, hospital design, lighting)
  • planned organisational activity (e.g., supervision, training, regulation, handover, communication, scheduling)
  • work and job design (e.g., pacing, timing, sequencing, variety, rostering, critical tasks)
  • emergent aspects of organisations and groups (e.g., culture, workload, trust, teamwork, relationships).

I like to think of human factors and ergonomics as rooted – to some extent – in four kinds of thinking:

  • systems thinking, including an understanding of system goals, system structure, system boundaries, system dynamics and system outcomes;
  • design thinking, including the principles and processes of designing for human use;
  • humanistic thinking, emphasising human agency, awareness, wholeness, intention, meaning, values, choice, and responsibility; and,
  • scientific thinking, purposeful thinking that aims to enhance scientific understanding by problem specification, hypothesising, predicting, observing, measuring, and testing.

The ultimate goals of this design activity are to optimise human well-being and overall system performance. Some argue that this joint ‘and’ purpose characterises the unique holistic nature of HF/E (e.g., see Wilson, 2014). In practice, it means optimising for several goals concerning the effectiveness of purposeful activity (such as efficiency, productivity, maintainability) and particular human values (such as safety, security, comfort, acceptance, job satisfaction, and joy). Some goals are usually of higher priority than others for particular applications, but they often conflict and compete, requiring practical trade-offs and compromises.

Since the 1950s, HF/E specialists – practitioners and researchers – have come from various academic backgrounds and increasingly a wide variety of professional backgrounds and industries. They work with all sorts of people at all levels: consumers and service users, front-line and support staff, supervisors and senior management, regulators and policy makers in almost all industrial sectors (see Shorrock and Williams, 2016, for an overview).

Human factors/ergonomics is booming in certain sectors, where success seems to have bred success. ‘Ultra-safe’ sectors in the UK, such as air traffic management, rail and nuclear power, have well-developed HF/E capabilities. NATS – the UK’s en route air traffic control provider – has a human factors department that has been staffed by 20-30 full-time HF/E specialists and psychologists over the past 15 years or so. The Rail Safety and Standards Board (RSSB) and the Health and Safety Executive have long had a mature and effective human factors capability, as have the nuclear and defence industries. All provide HF/E services in all aspects of their sectors, from concept design through detailed design, prototyping and simulation, construction and commissioning, operation and maintenance, and decommissioning.

But the success has not been evenly spread, and has not matched need. It often appears that those sectors with the greatest need – healthcare, road transport, and farming, for example – benefit least in terms of HF/E practitioners in applied roles. Seventy years after Fitts and Jones’ seminal reports on controls and displays, quite basic design problems remain in many industries.

In healthcare, for instance, different medicines look alike and sound alike, despite the presence of official guidance informed by HF/E. There are thousands of machines with design problems as basic as inconsistent number formats; in a single hospital, one can find pumps with keypads laid out like a telephone, like a calculator, or like a keyboard. This shows how far ahead of its time the work of Chapanis in the 1940s was.

In fact, it was Chapanis who designed the standard telephone numerical keypad configuration that is in use today on every telephone and smartphone around the world. He tested six configurations of buttons: two vertical, two with horizontal rows, and different three-by-three arrangements. All of these variations can still be found in safety-critical equipment. And most of the problems in using controls that were analysed by Fitts and Jones can be found in safety-critical equipment used for mining, oil and gas extraction, agriculture, forestry, fishing, manufacturing, construction, recycling, digital products, telecommunication, transport, and healthcare. There may be several reasons for this.


One reason may be a failure of branding and marketing. HF/E specialists have not come from marketing backgrounds and are not typically good at it. For a start, HF/E is a discipline and profession with two names, seen as equivalent within the discipline but different in industry and the media (with ‘human factors’ associated with accidents, and ‘ergonomics’ associated with design; Gantt and Shorrock, 2016). Its focus on ‘system interactions’ appears to be lost on many outside the profession. It doesn’t have a clear elevator pitch, and it is not instantly recognised and understood by the public in the way that HF/E specialists would like it to be (with ‘ergonomics’ being associated with office furniture, and ‘human factors’ being associated with nothing much).

Staying technical

A second reason may be a failure of ambition and lobbying. Sherwood-Jones (2009) argued that “many ergonomists are committed to an entirely technical career and have no aspirations to management. … The consequence of staying technical is of course that you will be ignored, overruled and brought in when it is too late to do anything useful, but not too late to demonstrate that ergonomics can fail.” There are few (often no) qualified and experienced HF/E specialists on company boards, in national regulators (even in aviation), or among policy makers, let alone in governments. While aviation is often seen as a paragon of HF/E, only one national aviation administration maintains a high level of expertise and a research programme in the discipline: the United States Federal Aviation Administration. With a few exceptions, it seems that HF/E specialists have been happiest at the micro and meso levels of interaction design, and not at the macro level, despite the systemic adverse influence of top-down interventions on system and human performance (e.g., government performance targets; see Shorrock and Licu, 2013).

Shortage of HF/E specialists

A third reason may be a shortage of qualified HF/E professionals (accredited, certified, registered or chartered by relevant societies and associations) situated in industry and government agencies. This is also associated with limited demand and a shortage of HF/E courses. In many countries, there are few or no HF/E professionals even – or especially – in sectors with the highest number of ‘avoidable deaths’.

Taking the UK as an example, in England there are 233 National Health Service Trusts – providers of urgent and planned health care (‘secondary care’). NHS England is an organisation of over 1 million staff, with planned expenditure for 2017/18 of over £123bn. It espouses a focus on patient safety, and its focus areas for 2017/18 clearly require HF/E expertise, including improving investigations, reducing medication error, and developing “an approach to patient safety [that] is widely recognised as world-leading” (NHS England, 2018). The number of qualified full-time HF/E specialists in NHS England care providers can be counted on one hand. In fact, only one of the 233 NHS Trusts employs a Chartered Ergonomist and Human Factors Specialist.

There is some excellent training for clinicians in aspects of behavioural human factors, such as team training, team resource management and non-technical skills, and many Trusts have their own advanced simulation facilities and staff. This does not, however, address the underlying design problems that remain; at best it may provide awareness of these, and compensatory behavioural routines.

Rising popularity

Despite the shortage of HF/E specialists, HF/E is becoming more popular. Over the last decade or so, the term ‘human factors’ and HF/E issues have gained currency with an increasing range of people, professions, organisations and industries. This is a significant development, bringing what might seem like a niche discipline into the open, to a wider set of stakeholders. In healthcare, there is now significant participation in discussions about ‘human factors’, which can be seen especially on Twitter. The same can be seen in other industries, especially new sectors such as web operations and engineering. Front-line workers know that HF/E is relevant. It is obvious to them that work should be designed for human needs and characteristics. The difficulty seems to be in getting commitment for resources at upper levels.

A two-pronged solution

The criticality of HF/E is not in dispute. So how do we gain more traction on designing for human wellbeing and system performance? One way is of course more training opportunities. Another is more lobbying for HF/E posts in commercial, governmental, and intergovernmental organisations. Certain roles, typically involving wide and deep content and method expertise, will always require highly qualified and experienced HF/E practitioners (e.g., certified, registered, chartered). For instance, these specialists are now in higher demand, and having greater impact, in medical device design and pharmaceuticals. But this has been tried for decades, with limited success.

So the other half of the solution is to spread HF/E to others: people who might be familiar with certain aspects of HF/E theory and method, practise certain aspects of HF/E design, or advocate and evangelise HF/E principles, but who are not HF/E specialists as such. The founders of HF/E were not HF/E specialists then (and were probably too specialised to ‘qualify’ as HF/E specialists today!). So this is where you come in. If the idea of designing for human use to optimise performance and human wellbeing appeals to you, then now is a good time to think about how you might learn more, and integrate HF/E into your practice.


EUROCONTROL (2016). HindSight: ‘Work-as-imagined and work-as-done. Issue 25. Brussels: EUROCONTROL

Fitts, P.M. and Jones, R.E. (1947). Analysis of factors contributing to 460 “pilot error” experiences in operating aircraft controls. Dayton, OH: Aero Medical Laboratory, Air Materiel Command, Wright-Patterson Air Force Base.

Gantt, R. & Shorrock, S.T. (2016). Human factors and ergonomics in the media. In Shorrock, S. & Williams, C. (2016). Human factors and ergonomics in practice: Improving system performance and human wellbeing in the real world. CRC Press.

Murrell, K.F.H. (1965). Ergonomics: man in his working environment. London: Chapman and Hall.

NHS England (2018). Patient safety. Accessed on 10/01/18.

Sherwood-Jones, B. (2009). Usability assurance (blog). Accessed on 10/01/18.

Shorrock, S. and Licu, T. (2013) Target culture: Lessons in unintended consequences. HindSight: Safety versus Cost. Issue 17. 10-16. Brussels: EUROCONTROL.

Shorrock S. & Williams, C. (2016). Human factors and ergonomics in practice: Improving system performance and human wellbeing in the real world. CRC Press.

Waterson, P. (2016). ‘Ergonomics and ergonomists’: lessons for human factors and ergonomics practice from the past and present. In Shorrock S. & Williams, C. (2016). Human factors and ergonomics in practice: Improving system performance and human wellbeing in the real world. CRC Press.

Wilson, J. (2014). Fundamentals of systems ergonomics/human factors. Applied Ergonomics, 45(1), 5-13.

For more information on HF/E degree courses in the UK, see here. For shorter courses in the UK, see here.

For more information on HF/E degree courses in the USA, see here.

For information on other HF/E societies and associations and educational opportunities, see here.

See also

Four Kinds of ‘Human Factors’: 1. The Human Factor

Four Kinds of ‘Human Factors’: 2. Factors of Humans

Four Kinds of Human Factors: 3. Factors Affecting Humans

Four Kinds of Human Factors: 4. Socio-Technical System Interaction
