The Real Second Victims

In many professions, specific terms – both old and new – are often established and accepted unquestioningly, from the inside. In some cases, such terms may create and perpetuate inequity and injustice, even when introduced with good intentions. One example that has played on my mind over recent years is the term ‘second victim’.

The term ‘second victim’ was coined by Albert W. Wu in his 2000 paper ‘Medical error: the second victim’. Wu wrote the following:

“although patients are the first and obvious victims of medical mistakes, doctors are wounded by the same errors: they are the second victims”.

As someone with a PhD in ‘human error’, I have come across the potential for trauma associated with one’s own actions and decisions in many interviews and discussions, albeit in a different context – air traffic control. In this context, professionals’ decisions and actions are almost never associated with death, but there are rare examples, and even in the context of a near miss, the prospect of hundreds of lives being lost at once can be devastating.

The term ‘second victim’ in healthcare was further popularised by Sidney Dekker in his 2013 book Second Victim: Error, Guilt, Trauma, and Resilience. There are tens of thousands of webpages on ‘second victims’. It is a term that is accepted by healthcare practitioners who see only too clearly the immediate consequences of mistakes and actions-not-as-planned.

While the term is accepted within the medical professions, important questions have been asked by those who have lost more than their confidence, profession or even – for however long – mental health. Sara Ryan, the mother of Connor Sparrowhawk (popularly known as LB, or Laughing Boy), is one of several family members who have questioned the use of the term in healthcare. Sara remarked on Twitter:

The thread continued:

She later clarified:

Surely families are the second victim? It was one of those questions that could perhaps only come from the profound truth of pain. LB was “a fit and healthy young man, who loved buses, London, Eddie Stobart and speaking his mind” (see the #JusticeforLB website). As described on #JusticeforLB:

LB’s mood changed as he approached adulthood and on 19 March 2013 he was admitted to hospital, the STATT (Short Term Assessment and Treatment Team) inpatient unit run by Southern Health NHS Foundation Trust. LB drowned in the bath on 4 July 2013. An entirely preventable death.

Sara and her family were not only victims following the death of Connor. They were further victimised by organisations responsible for Connor’s death. The process of getting justice has involved an inhumane ordeal, including a good deal of ‘mother blame’, detailed in Sara’s book ‘Justice for Laughing Boy’. This is a book that should be standard reading on a wide range of courses, from medicine to law. In a paragraph from the #JusticeforLB website:

How are you all doing?

Mmm. Good question. Not sure really. I can probably only speak for myself [Sara]. Not brilliant really. The death of a child is an unimaginable happening. That it could have been so simply and easily avoided, in a space in which no one would have thought he was at risk of harm, is almost impossible to make sense of. The actions of Oxfordshire County Council and Southern Health NHS Foundation Trust since his death have been relentlessly battering.

So perhaps it takes an experience of being a real second victim, and of being victimised, to see that the term ‘second victim’ is one that only applies to loved ones.

Then again, it’s obvious. Of course family are the second victims. How could they not be?

But it is not obvious to tens of thousands, perhaps hundreds of thousands or more, of healthcare workers who find personal meaning in the term ‘second victim’, as applied to themselves – actually or potentially.

I asked my partner – an experienced practising psychotherapist and trainee counselling psychologist – what came to mind with the term ‘second victim’. Without hesitation, she said “family”. She had never heard the term ‘second victim’ before and did not know why I was asking.

She said, “If you’d have said ‘secondary trauma’, I’d have said the professional”. That is because, in this sense, the primary trauma is with the family who survive a person who has died. She also mentioned the difference in choice and control between clinicians and family, in that a clinician, for instance, while unable to control the environment and resources, has control over whether she or he is a clinician. While my partner has no control over clients, she has control over her choice to remain a psychotherapist.

Some have tried to combine those who have died and their families as first victims (e.g., https://www.youtube.com/watch?v=YeSvCEpg6ew). But this casual combination of the dead and their loved ones is unconvincing, and seems like a fudge. My own mother died at 45 years old (following delays in treatment and lack of communication between a private and public hospital, which I won’t go into here). I remember my father at the time saying, “People tell me they feel sorry for me. I say they should feel sorry for her. She died at 45!”

There is a very real difference between someone who has died, a loved one who is grieving for that person, and someone who is suffering having witnessed or somehow been involved as a healthcare professional before the person died. Sara writes more about that here. She notes that “I’m not ignoring or denying that healthcare staff may/must be devastated by the death or serious harm of a patient here. It simply ain’t comparable to the experiences of families.”

Questions about first and second victims inevitably imply a ranking. So if loved ones are the real second victims, different in a very real sense to the deceased, then where does this leave professionals, who are different in a very real sense to bereaved families? Logically, however unsavoury the ranking exercise, professionals are third victims. The conversation in the third tweet above continued on this line of inquiry:

While ranking victimhood may seem like a troubling exercise, professionals in healthcare have, in effect, already created a ranking by establishing – quite uncritically it seems – the term ‘second victim’. ‘Second victim’ indicates a first victim, and implies a third victim.

In bereavement, families are sometimes victimised further still by organisations during the natural quest for justice. Justice, in this context, includes apology, truth, genuine involvement, learning, and change. For LB and his loved ones, it included this and this. In effect, justice involves the proper meeting of needs. There are millions more like LB, and millions of families like his, who feel forgotten and discounted by the professionals, organisations, and society who morally and ethically should be involved in meeting these needs.

Sadly, the established ‘second victim’ concept, in effect, further victimises the forgotten. Acknowledging and helping to meet the needs of loved ones as the real second victims, as well as healthcare professionals as third victims, would be a truly restorative act of justice.

Reference

Wu, A.W. (2000). Medical error: the second victim. The doctor who makes the mistake needs help too. British Medical Journal, 320, 726–727.


The Commercialisation and Commodification of Competency


Image: https://www.gapingvoid.com CC BY-NC-ND 3.0

Two or three years ago, I undertook a course involving UX ‘certification’. I had already undertaken courses in HCI and design as part of an MSc(Eng) in Work Design and Ergonomics some years (ahem…21) earlier. And I had already been involved in most aspects of the design and evaluation of interactive systems. So I was interested in what was new. In fact, the course was an overview of an ergonomics standard (and a good one: ISO 9241-210, 2010), which was not new to me but was enjoyable nonetheless. The course lasted two days, with a half day revision session, and a multiple choice exam. The course was well delivered, and the exam was properly invigilated.

But the test, in my view, was primarily a test of memory – of recall or recognition of specific vocabulary. Aspects of the test seemed to focus on dubious and debatable semantic differences, using very similar options that seemed designed to confuse. The certification arrangement seemed to encourage teaching to the test, and ironically it felt like UX (and accessibility) had been ignored in the certification process, which required a high level of English to wade through the semantic quagmire.

Those who undertook the test came out feeling deflated, doubtful, discouraged and demoralised. Their passion for the subject as newcomers was gone, while existing practitioners were now sceptical of certification, at least of this sort. I know this because I spoke to many immediately after the course. After a while, we learned whether or not we had passed the test. Some of the questions were so vague and convoluted that complaints were made. People waited to hear whether their money – or moreover that of their employer – had been well spent and whether they were now certified. I am quite sure, though, that a ‘pass’ would give most a feeling of relief and pride. We humans, indeed mammals generally, like to be members of clubs, and we like ranks. We see this natural preference throughout organisational life.

There are many other such courses, often a day or a few days in duration, relating to all aspects of work (e.g., safety management, crew resource management [sometimes sold as ‘human factors’], safety culture, just culture, error management, etc). In my experience, at their best, they offer a starting point for further exploration, but usually little more than that. That is enough. But they are often sold as much more. Importantly, rather than acting as a springboard for reflection, exploration and divergent learning, they act as a dragnet for further convergent indoctrination and up-selling of a defined set of ideas and tools. More worrying still is when they confer membership of an ‘exclusive’ club (which may benefit the owner of the club much more than the members).

Such training is often associated with ‘tools’ (almost always trademarked) that are licensed for profit, often combined with mandatory commercial training, refresher training, and ongoing subscription to the tool developer. Trademarking and licensing is often a legitimate and necessary way to protect intellectual property (especially for small businesses). But it does not imply quality. Some of these tools lack innovation, have been overtaken by fundamental changes in theory, or are available in similar form elsewhere freely or at reduced cost, and yet subscription and licensing services can lock users into hard or soft dependency.

So here are five things to look out for, and associated questions to ask, when considering products and services of this nature. They are not in any way definitive. There will be other criteria and questions, and some of these may not indicate a problem, but they may be useful things to think about.

  1. Dependency: Does it lock you into dependency? Is it hard to move to something more suitable, with a different supplier or service provider, for hard reasons (e.g., contracts; subscription) or soft reasons (e.g., feelings of commitment; sunk cost)?
  2. Manufactured exclusivity: Does it create ‘exclusivity’, and the sense of being an ‘insider’, or ‘part of something’ (a club, scheme, network, community, user group, benchmarking group)? Does your feeling about it, and evaluation of it, depend on your membership status, or whether you pass a test? Does it involve ranks (belts, ‘Master’ status, bronze/silver/gold) or other appeals to pride?
  3. Dubious value: Can your need realistically be met by reading along with online/in person discussion groups, supervised practice, etc? Is something comparable available elsewhere that provides much of the value, at much reduced cost? Is the product or service outside of a respected, independent not-for-profit regulation or certification body?
  4. Closed: Does it remain fixed, and not updated in light of scientific developments and changes in theory and method? Is independent evaluation precluded? Does it ignore fundamental challenges to its assumptions, theory, method, etc? Is critical reflection and inquiry discouraged? Is exploration of alternative approaches discouraged, without good reason?
  5. Control: Is control (over ideas, information, method, theory, means of interaction and exchange) highly centralised into one person or private commercial entity?

If you can answer ‘Yes’ to a few of these questions, this may not be a problem. The product or service may provide sufficient value, or the questions answered ‘Yes’ may not be significant. But increasing ‘Yes’ responses may indicate a problem, and in this case you might want to consider whether the product or service is what you need, or what someone else wants you to need.


Giving Guidance to Government

This article was published in The Ergonomist, the magazine of the Chartered Institute of Ergonomics and Human Factors, No. 568, Nov-Dec 2018.



From healthcare and patient safety, to the latest developments in driver automation, human factors is not only relevant across many issues of societal concern, it can achieve significant impact too. Steven Shorrock and Sarah Sharples share their experiences contributing to three key government reports.

Human factors and ergonomics seeks to optimise interactions between people and all other elements of the system at all levels. Much of the time, practitioners and researchers are concerned with evaluating and designing work, tools and environments for specific applications. Occasionally, however, opportunities arise at the level of organisational decision-making, regulation and at government level. For many issues of societal concern, human factors expertise is particularly relevant and could have significant impact, if it secures a place at the table.

The following three reports illustrate the span of issues and impact that human factors advisers can achieve when working closely with government.

Learning in the NHS

Steven Shorrock gave oral evidence, with Scott Morrish, father of the late Sam Morrish and member of the Healthcare Safety Investigation Branch (HSIB) Expert Advisory Group, on Tuesday 8 November 2016 in a meeting chaired by Bernard Jenkin MP in the Houses of Parliament.

This report focused on the issues arising from the Parliamentary and Health Service Ombudsman’s (PHSO) July 2016 report, ‘Learning from Mistakes: An investigation report by the Parliamentary and Health Service Ombudsman into how the NHS in England failed to properly investigate the death of a three-year old child’.

‘Learning from Mistakes’ was the PHSO’s second report on the tragic death of a three-year-old child, Sam Morrish, on 23 December 2010. It set out four key findings:

  1. A defensive culture in the NHS.
  2. A lack of competence and sufficient independence in the conduct of NHS investigations into potentially avoidable harm and death.
  3. Poor coordination and cooperation between NHS organisations involved in investigations, and failure to collectively identify and act on lessons.
  4. Insufficient involvement of families and staff in NHS investigations.

The report made conclusions and recommendations regarding:

  • The Investigative Landscape in the NHS in England.
  • HSIB and the learning culture.
  • Learning and accountability: implementation of the ‘safe space’.
  • System-wide ‘just culture’.
  • Improving local competence.
  • Measuring improvement.

In response to discussion surrounding a ‘just culture’ taskforce, Steven said that from his experience in aviation, there must be consensus on the need for a just and fair culture that is about learning as a whole. He said that if you don’t have that consensus from a range of stakeholders, you’ll always have something in your system that is pushing against it. “An inclusive taskforce where people are trying to understand each other’s worlds is really the only way to go about it,” he said. “We have certainly learned that that is the only way to get people to understand the need for a just culture, and also to understand each other’s worlds, that the world of the judiciary is very different to the world of practitioners, and both of those worlds do need to co-exist,” he added.

Responding to Scott Morrish’s comments around blame culture, Steven said he felt that healthcare needed to start looking more at similarities between the ways that things work in different parts of the system. “Fundamentally, most adverse events in healthcare do have at their heart a certain level of pressure, which is one of the system vulnerabilities,” he said.

“Understanding that the system as imagined and the system as found are two different things is vital,” he said. “The system that we imagine is a very different one to the system that really exists, where resources are often inadequate, the constraints affect the work in a way that is counterproductive, and pressure makes everyone’s job, especially practitioners, much more difficult.” Steven went on to say that healthcare managers must focus on the system as they find it – the work as it’s actually done – and not the one that they imagine. “That means we need to involve an awful lot of people to understand how the system really works if we want to understand and improve it.”

Autonomous vehicles

Oral evidence was provided by Professor Sarah Sharples on Tuesday 22 November 2017 in Committee Room 4A at the Palace of Westminster.

The House of Lords Science and Technology Committee heard evidence from the Department for Transport, the Department for Business, Energy and Industrial Strategy as well as leading academics. The Committee explored with Government Ministers how driverless vehicles fit into wider transport strategy and policy and what the Government is doing to ensure knowledge gained in their development benefits all sectors. The Committee also examined with the academics the socio-economic aspects of the deployment of self-driving cars such as how much is really understood about human interaction with the technology.

The four main findings of the report into connected and autonomous vehicles (CAV) were:

  • The Government is too focused on highly-automated private road vehicles (‘driverless cars’), when the early benefits are likely to appear in other sectors, such as maritime and agriculture.
  • The development of CAV across different sectors needs coordination and the Government, working with key stakeholders, must get a grip on this, chiefly by establishing a Robotics and Autonomous Systems (RAS) Leadership Council as soon as possible to play a key role in developing a strategy for CAV.
  • There is a clear need for further Government-commissioned social and economic research to weigh the potential human and financial implications of CAV.
  • This is a fast-moving area of technology and the Government has much to do, alongside industry and other partners, to position the UK so that it can take full advantage of the opportunities that CAV offer in different sectors.

Asked for her view on full-scale trials and live testing, Sarah recommended a mixed-methods approach. Referencing early data from a Transport Systems Catapult demonstration, she said public attitude towards the vehicles was very positive. “It’s only when the public see those vehicles deployed in a real situation that we can start to understand what people might think when they see these new technologies implemented in the context they are so familiar with,” she said.

“Humans are fallible, but humans are also brilliant,” said Sarah in response to the notion that people could be the biggest barrier to autonomous vehicle success. “We know that humans are great at adapting to new situations and changing the way they work with new technologies, but we need to be aware of their capabilities and limitations when we design those technologies.”

Commenting on the potential loss of skills and the responsibility of the driver, Sarah highlighted the control task of the vehicle, the need to maintain both skills and understanding, and the role of the driving test in ensuring that people gain an appropriate level of competence. “Even with fully automated vehicles we need to build in contingency for when the driver will need to take control,” she said.

She went on to suggest that within the conventional driving test, an understanding of the capabilities of those different types of vehicles could be introduced.

Gross negligence manslaughter

Oral evidence was provided by Steven Shorrock at De Vere Grand Connaught Rooms, London, on 6 April 2018.

The Williams Review was a rapid policy review into gross negligence manslaughter in healthcare and was chaired by Professor Sir Norman Williams. The review was set up to make recommendations to support a more just and learning culture in the healthcare system. It covered:

  • The process for investigating gross negligence manslaughter.
  • Reflective practice of healthcare professionals.
  • The regulation of healthcare professionals.

The review heard evidence from a variety of organisations and individuals. It was set up to look at the wider patient safety impact of concerns among healthcare professionals that simple errors could result in prosecution for gross negligence manslaughter, even if they happen in the context of broader organisational and system failings.

Providing evidence

Based on Steven and Sarah’s experience of providing evidence, they offer nine pieces of advice:

  1. Ask for a list of topics or likely questions. You can then consider the kinds of things that you want to discuss. Prepare, but don’t rehearse answers to the questions.
  2. Get advice from people who have done it before. There are likely to be CIEHF members who have participated in similar kinds of committees or reviews.
  3. Maintain good contact with the clerks. They’ll help you to understand what is expected and when.
  4. Find out whether the evidence will be recorded, and how. Evidence may be televised, or transcribed, or not. If the evidence is not recorded, then you may wish to take notes on the themes of your answers during and after the session, in case any notes made by others don’t reflect your answers.
  5. Be comfortable with yourself as an expert. You are expected to base your views on the state of the art, but your opinions are also respected.
  6. Don’t campaign. You need to be objective and evidence-based where possible, and not political. Your answers may be professional opinion or fact, but these must be clearly distinguished.
  7. Follow up with resources and information. There will be things that you won’t mention during oral evidence, or that were not recorded, that you think are pertinent and it’s fine to send these to the clerk after you have given evidence.
  8. Check what extra input will be required and when. You may be sent information to fact check, with very little notice, maybe 24 hours.
  9. Be mindful that your evidence may be used selectively. On publication, you may find that your evidence is used very partially or not in a way that you expect. This may relate to the terms of reference of the review or committee.

Authors’ affiliations

Steven Shorrock is a Chartered Psychologist and Chartered Ergonomist & Human Factors Specialist with experience in various safety-critical industries, including aviation, rail, chemical manufacturing and healthcare.

A former CIEHF President, Sarah Sharples is Faculty Pro-Vice-Chancellor for Research & Knowledge Exchange, and Professor of Human Factors at the Faculty of Engineering at the University of Nottingham. She is also Non-Executive Director of the Transport Systems Catapult.


Further reading

Learning from Mistakes: Oral evidence was given, recorded and broadcast at https://goo.gl/XJyXNB. The evidence transcription is at https://bit.ly/2NnItlY. The report is available at https://bit.ly/2wJ1DbD

Autonomous vehicles: The evidence transcription is at https://bit.ly/2wZKTOi. Supplementary written evidence is at https://bit.ly/2MfuBWv. The report is available at https://bit.ly/2NBbon2

The Williams Review report is available at https://bit.ly/2sN7ADw


The Real Focus of Safety-II

Safety-II has become a talking point. It is discussed not only among safety professionals, but – perhaps more importantly – among front line practitioners, managers, board members and regulators in a wide array of industries. Its practical and inclusive focus on everyday work seems to strike a chord, acknowledging the reality of work for those who actually do the work.

There are, however, a few myths and misconceptions about Safety-II, some of which I highlighted in What Safety-II Isn’t. One is that Safety-II is about exceptional performance – excellence. This is perhaps associated with the use of the term ‘success’ and the phrase ‘go well’ in the literature on Safety-II (e.g., the EUROCONTROL [2013] White Paper). ‘Success’ is used here in a rather general sense, that work achieves its goals, in line with one definition of the term: “The success of something is the fact that it works in a satisfactory way or has the result that is intended” (Collins). The word is also commonly used to refer to exceptional attainment (i.e., that someone is ‘successful’). This is not what is meant from the viewpoint of Safety-II, though the scope of Safety-II is inclusive of excellence, or especially desirable sociotechnical system performance.

Safety-II should be seen as focusing on all forms of work and all outcomes: routine and (perceived as) ‘unremarkable’ work, incidents and accidents, and exceptional performance. It is not about how things go well, so much as how things go, but with the aim of course that things do go well. This is clearly depicted in the graph from the EUROCONTROL (2013) White Paper on Safety-I and Safety-II.

The focus of Safety-I and Safety-II. From EUROCONTROL (2013). From Safety-I to Safety-II: A White Paper. Brussels, p. 25. https://www.skybrary.aero/bookshelf/books/2437.pdf

What this shows is that the focus of Safety-II in terms of work and outcomes includes the focus of Safety-I. But Safety-II does not include Safety-I in terms of its precepts and concepts, which are quite different. (Importantly, both approaches can and should be practised – see Mind your Mindset: Safety-I and Safety-II – though some adjustments and compromises are naturally to be expected.) Both Safety-I and Safety-II include a focus on accidents, actual and potential. (In reality, accidents are typically a fraction of the 0.1% in the graph above, though potential accident scenarios are a much greater, albeit unquantifiable, proportion.) The difference is that this is the whole focus of Safety-I, which reacts to events and risks primarily via an analytical approach, considering the human role in terms of contributions to accidents (causal or mitigating).

For Safety-II, the major focus is on less remarked-upon work and outcomes, as well as work and outcomes that are especially wanted (and might be seen as goals) or especially unwanted (anti-goals). But Safety-II does not focus specifically on ‘excellence’, and does not ignore accidents and other unwanted events. (And ‘best practice’ really makes no sense, since what is best in one context – place or time – will not be best in another. Practice is always contextual.)

A key reason for this focus on everyday work is that work-as-done is the reason why sociotechnical systems are effective, including safe operations, and also the reason why they fail. By ignoring work-as-done, whether it is more or less congruent with work-as-prescribed or work-as-imagined, or whether it is quite different (see the messy reality), we don’t know how the system is functioning and whether it is drifting into an unwanted state, or shifting toward an especially wanted state (see Work and how to survive it: Lesson 2. Understand variation inside your organisation).

Focusing on normal work also makes sense from a Safety-I point of view, with its focus on accidents, actual or potential. This was highlighted in 1984 by sociologist Charles Perrow in his book Normal Accidents. Perrow was making the point that unusual events such as accidents are not fundamentally different to normal, everyday system functioning. They are, in some important senses, equivalent. Big accidents don’t have big causes. It’s just that ‘normal disorders’ combine in unexpected, often emergent, ways. ‘Normal disorders’ might be seen as degraded aspects of the system and context (e.g., technology used beyond design intent, degraded tools, excessive and overly complex procedures, stretched shift systems, competency gaps) along with differences between work-as-imagined and work-as-done. An important point is that it is normally the context of work that is disordered, while work-as-done tends to adapt, adjust and stretch to make things work, in locally rational ways. Work-as-done strives to create order in a system that is fundamentally disordered and not as-imagined from afar.

Adapted from EUROCONTROL (2013). From Safety-I to Safety-II: A White Paper. Brussels. https://www.skybrary.aero/bookshelf/books/2437.pdf

So while we want to ensure that work goes well, aiming for excellence, the focus of Safety-II is on the whole picture, but especially work that we might consider routine, everyday, and even unremarkable. This is the work that may end up in incident reports, or excellence reports, or simply keep the organisation running effectively. If we don’t look, we’ll never know.


The problem with professional appropriation: The case of ‘human factors’ and ‘ergonomics’

In a recent article in the Sydney Morning Herald newspaper by journalist Liam Mannix (A difficult position: Experts question whether ergonomics holds up), a Sydney University professor calls out physical ‘ergonomics’ as bad science and practice:

Every year, companies around the world spend hundreds of millions of dollars on ergonomic chairs, keyboards and consultants, believing they are taking science-backed steps to care for their workers.

Ergonomists are regularly called as expert witnesses in court, where their findings can decide workplace injury claims worth hundreds of thousands of dollars. Ergonomics is promoted by work safety organisations around the country.

Yet “ergonomics does not have a firm basis in science”, says Sydney University professor Chris Maher, a leading authority on back pain.

But it seems that some who operate under the label of ‘ergonomics’ and ‘ergonomist’ are neither qualified nor experienced. The article notes that there are only 82 certified professional ergonomists in Australia, according to the Human Factors & Ergonomics Society of Australia, plus another 250 or so full members qualified to practise. (There would be many more, however, who are full members of other professional Human Factors and Ergonomics [HF/E] societies.)

“But there are thousands of people calling themselves ergonomists who aren’t,” says Associate Professor Jodi Oakman, head of the Centre for Ergonomics and Human Factors at La Trobe University.

“People will go out doing ergonomic work station assessments, they’ll call themselves an ergonomist – and they have no training whatsoever. It’s not a protected title,” she says.

Leon Straker, a Distinguished Professor at Curtin University added:

“I don’t like a product being given the title ‘ergonomic’ – it’s not correct. If you don’t know who I am, what my job is, you cannot know my ergonomic requirements.”

Stephen Hehir, chair of the Human Factors & Ergonomics Society of Australia’s professional affairs board, remarked to Liam Mannix that many of the studies weren’t published in leading ergonomics journals, and most of the interventions they tested weren’t done by qualified ergonomists. 

“Imagine if they were reviewing surgical outcomes and including those operating without a medical licence rather than only qualified surgeons,” he said.

So it seems that the primary problem may not be with the evidence-based discipline and profession, so much as what I will call ‘professional appropriation’.


“There are thousands of people calling themselves ergonomists who aren’t,” says Associate Professor Jodi Oakman, head of the Centre for Ergonomics and Human Factors at La Trobe University. Photo: Jisc infoNet CC BY-NC-ND 2.0 https://flic.kr/p/8N9izX

Professional appropriation

If we accept that HF/E is a profession, with registration schemes, codes of conduct, etc, then the next question is whether it is ethically acceptable to appropriate a professional title. Here, I define professional appropriation as taking as one’s own professional identity the label of a recognised profession, without undertaking the requirements to practise the profession, as accepted by professional bodies. The requirements to join a profession typically involve the following:

  • extended study, resulting in an appropriate qualification (for HF/E, these can include human factors/ergonomics or allied disciplines such as HCI, psychology, industrial engineering, biological sciences)
  • supervised experience
  • registration with a recognised regulator or professional body (professional society, association, or government body)
  • adherence to the Code of Professional Conduct of a professional society
  • other requirements, such as continuing professional development.

Professional appropriation seems to happen when individuals with limited exposure to a discipline appropriate an associated title based on this limited exposure. With limited exposure and experience, it may not be clear that professional appropriation is problematic.

The title ‘human factors specialist’ is sometimes appropriated, and this has happened historically with the title ‘psychologist’, a term that is now legally protected in some countries. Despite HF/E being a discipline (with academic courses, journals, text books, professors, etc) and a profession (with certification, chartership, Codes of Professional Conduct, etc), its professional titles are widely appropriated. Some describe themselves as ‘human factors experts’ without qualifications in human factors and without professional accreditation by a professional body. In most cases, this is probably done quite innocently, without understanding the unintended consequences.

Professional appropriation has occurred with a number of professions. The world of user experience/UX (an emerging profession) is apparently experiencing a growth in the use of terms such as ‘UX Psychologist’ by individuals who are not suitably qualified and experienced in psychology (e.g., Chartered or Registered Psychologists, in the UK). While some titles are legally protected (such as ‘Psychologist’ in Australia), other titles are only protected in their variant forms (e.g. ‘Psychologist’ is not legally protected in the UK, but ‘Counselling Psychologist’ and ‘Occupational Psychologist’ are legally protected). Other than legal protection of titles, we are left with legal protection of services, and associated laws (e.g., advertising laws, health and safety laws).

Equivalency

One could argue that professional titles are archaic, and that anyone should be able to choose whatever title one chooses. This argument seems to fall down quickly once one considers just a few professions, for instance physicians and surgeons, nurses and pharmacists, architects and structural engineers, accountants and solicitors, social workers and psychologists.

If one accepts that appropriate qualifications and experience are necessary to work as a professional (by definition), then the next question is whether Human Factors/Ergonomics should be included in this list of professions. Is HF/E a profession that requires suitably qualified and experienced people?

Whatever our view on this, HF/E is already a profession that requires appropriate qualifications and experience. This is evidenced by professional registration in many countries (including Chartership in the UK, as per Chartered Accountants, Chartered Psychologists and Chartered Architects). If one still rejects the idea that one needs to be suitably qualified and experienced, then one risks saying that professional standards in Human Factors/Ergonomics are unimportant and that the quality of Human Factors/Ergonomics professional services, including ethical considerations, is unimportant. This devalues HF/E to such an extent that to offer professional services becomes illogical. One cannot offer professional services (e.g., consultancy, training, expert witness) in something that one does not consider to be a profession. QED.


NATS employs 25-30 qualified Human Factors/Ergonomics Specialists. Photo: NATS – UK Air Traffic Control CC BY-NC-ND 2.0 https://flic.kr/p/gmWeLo

Risk

From the client’s point of view, the above may not seem terribly relevant. What matters more to clients is risk management. What is the risk of professional appropriation? The ‘risk’ concerns problems or opportunities that may not be properly recognised or managed. The risks could be risks to process safety, occupational safety, health, wellbeing, productivity, efficiency, quality, morale, and so on. By hiring someone who is not suitably qualified and experienced, you are hiring someone who lacks the required competency to help recognise, understand and manage problems and opportunities relating to system performance and human wellbeing. And someone who is not suitably qualified and experienced may be unaware of this. The Dunning-Kruger effect shields us from the limits of our knowledge and skills.

The risks of professional appropriation are quite obvious and immediate for some professions (e.g., surgery, dentistry, anaesthesia), while for others the risks are obvious to some but usually emerge after some time as a project develops (e.g., civil and structural engineering, safety engineering). For still others, the risks are less obvious and may take longer to come to light. HF/E tends to fall into the latter two categories.

One particular risk of hiring someone who is not suitably qualified and experienced is second order problems. With relatively little knowledge and skill in a profession, we tend to be more focused on first order problems – immediate issues. With more knowledge and skill, we are more focused also on second order problems – possible unintended consequences. This requires systems thinking, which happens to be the foundation of HF/E. For instance, focusing only on non-technical skills training and labelling this as ‘human factors training’, without addressing underlying system and design problems to an appropriate degree, can consume an organisation’s ‘Human Factors budget’ and leave people (usually a small and diminishing proportion of the total number of people) to cope with systemic and design problems using their non-technical skills: an ethical dilemma.

And there are very specific risks to professional appropriation. The SMH article recounts a case where a worker was awarded tribunal-ordered compensation – after she suffered an injury caused by a so-called ‘ergonomics intervention’. 

Cakir was working as a web publishing officer with the Department of Employment and Workplace Relations when she was given an “ergonomic assessment of [her] workstation” by an injury management consultant, according to tribunal papers.

The ‘ergonomics intervention’ was apparently not prescribed by a suitably qualified and experienced (SQE) ergonomist, but by an exercise physiologist (the article does not question the validity of exercise physiology).

The risks of professional appropriation are real but hard for clients to see. Clients can, however, ask if those who use the title ‘ergonomist’, ‘human factors expert’ are suitably qualified and experienced. (Note that ‘expert’ is a term that most bona fide experts seem to avoid. I’ve met a handful of people in HF/E who I would truly consider experts. I am not one of them. Though just to confuse matters, note that in some countries, especially in mainland Europe, the term ‘expert’ simply refers to a specialist or someone occupying a particular job role.)

Professional desertification

If anyone can simply adopt any professional title, then one particular system-wide risk is the illusion that the market for associated services is already well-served. For instance, if everyone with a few days of life coaching or NLP training (or even no training at all) adopts the title ‘psychotherapist’, and if employers and clients are none the wiser, then why the need for suitably qualified and experienced psychotherapists (e.g., meeting the standards laid down for full membership by BACP and UKCP, in the UK, requiring many years of formal study, and supervised [often unpaid] practice)? The same goes for any profession.

I wonder if this has become a hidden reality in some sectors when it comes to Human Factors and Ergonomics. As Associate Professor Jodi Oakman pointed out, “there are thousands of people calling themselves ergonomists who aren’t.” In the National Health Service (NHS) in England, there was, at the time of writing this post, just one Chartered Ergonomist and Human Factors Specialist formally practising in the role of an HF/E specialist.


The National Health Service in the UK has a focus on Human Factors, but only a few qualified Human Factors and Ergonomics Specialists, out of 1.5 million staff. Photo: Lydia CC BY 2.0 https://flic.kr/p/9YP29k

And yet, ‘human factors’ is a huge buzzword in the NHS. There are many courses, and many external consultants (often from aviation) describe themselves as human factors specialists or ‘experts’. The training provided is typically in behavioural (non-technical) skills. Non-technical skills are vitally important, but NTS training is – I would estimate – somewhere between 1/100th and 1/1000th of the whole scope of the discipline of HF/E, if one were to count the pages of text books or journal articles, or hours of teaching on HF/E degrees. In fact, NTS training is more properly aligned with Applied Psychology, because its principles are behavioural, not design-led. (HF/E is primarily about fitting the task to the person, not vice versa.)

This is not to de-emphasise the importance of this training. I have supported such training in healthcare and aviation, and strongly encourage it. But the effect of labelling this ‘Human Factors Training’ – something that has been inherited from airlines – seems to have had unintended consequences. The most obvious of these is the widespread lack of understanding (including at Board level) about:

  • the true focus of HF/E (socio-technical systems)
  • its primary means of gaining insight (understanding system interactions, which we might call ‘work’ for our purposes), and
  • its primary means of intervention (design).

In Frank Hawkins’ 1987 book ‘Human Factors in Flight’, he remarked that “There seems to be little justification for any large organisation not employing, in house, one or more degree-qualified Human Factors specialists. In fact, without some level of in-house expertise, Human Factors problems are unlikely to be recognised adequately to generate a call for reference to an external consultant” (pp. 328-329).

It may be the case that the professional appropriation of HF/E is somehow associated with the professional desertification of HF/E. The same would likely happen, to varying degrees, with dietetics, architecture, nursing, and psychology.

Involvement and inclusion

At this point, having described some of the problematic aspects of professional appropriation, I find myself dissatisfied and conflicted. On the one hand, professional services, including those done by people who identify themselves as ‘Human Factors Specialists’ and ‘Ergonomists’, should obviously abide by professional standards, including ethical standards. But there are a few problems (see also Human Factors and Ergonomics: Looking Back to Look Forward).

First, there are not sufficient numbers of SQE HF/E specialists (internal or external) to meet demand for HF/E, let alone get involved in solving problems that could benefit from a professional HF/E approach. (This is similar, however, to clinical and counselling psychology in the NHS, for which there are long waiting lists.)

Second, there are relatively few HF/E courses, and little funding, for those who wish to become suitably qualified in HF/E. This applies more to degree-level courses, which are also a significant investment in time and money. Still, an increasing number of people, for instance front line professionals and others coming from allied professions, are signing up for diploma and degree level courses in order to apply HF/E theory and method to their work. (See here for a discussion of becoming an HF/E practitioner.)

Third, it is crucial that HF/E is not merely a discipline and profession, but a broader endeavour aimed at improving system performance and human wellbeing. This is similar to psychology and psychotherapy (regarding mind, behaviour and mental health) and dietetics (regarding diet). This seems to apply to various disciplines and professions that centre on human needs. HF/E theory and methods can be applied by many professions with various qualifications and experience as part of their professional work, given appropriate competency. It is not necessary that everyone undertakes a degree in HF/E, but neither is it sufficient to undertake a one- or two-day training course alone to be considered a specialist in any aspect of HF/E. There are, however, training courses in aspects of HF/E that are recognised by professional bodies affiliated with the International Ergonomics Association. There are also specific membership grades, such as CIEHF’s ‘Technical Membership’, that apply to specific aspects of HF/E, as relevant to one’s own professional work. Ultimately, I consider HF/E expertise as emergent, from interaction between those with expertise in theory, findings and methods, and those with expertise in work and the context of work.

We can take some practical steps. It is helpful, for instance, when offering HF/E-related training courses or services, to indicate the scope of HF/E covered, relative to the scope of the discipline as a whole. This can be made more obvious in the title of the course. For example, a course entitled ‘Human Factors in <Operating Theatres>’ might cover human factors issues in operating theatres, including the interactions between people, activities, context and tools, and methods for improving these by design (of artefacts, tools, work, etc). Alternatively, a course could be titled ‘Human Factors for <Surgeons/Pilots/etc>’. Such a course would be more adapted to the needs of a particular stakeholder group. This might be a blend of NTS training and training related to the design of various aspects of work (routines, checklists, equipment, etc), with an aim to help improve work design or at least compensate for or mitigate unwanted effects.

And of course, in providing consultancy and training we must be clear about our own qualifications and experience. I ultimately consider my practice cross-disciplinary, and dip into several other disciplines that I find especially helpful in improving system performance and human wellbeing (e.g., philosophy, anthropology, practice theory, community organising, counselling and psychotherapy, graphic design). My approach is to integrate aspects of these into an eclectic, cross-disciplinary practice, but of course I stop short of describing myself as a professional or specialist in any of them. I know that my interpretation and implementation of these disciplines is narrow, often shallow, and selective. So I simply indicate the cross-disciplinary influences on my practice. Even within a discipline, our competency soon reaches its limits, and understanding these limits is a critical aspect of ethical practice. Physical ergonomics, for instance (the topic of the SMH news report), is not an area of competency for me. My last experience was part of my ergonomics post-graduate degree and I have not practised this, outside of basic anthropometry, for 21 years. I am simply not competent to practise it.

Summing up

As with many human-centred professions, there is a balance to strike between professional standards and inclusion. The way to address this balance is through total honesty and clarity: abiding by ethical standards of professional practice, collaborating between different areas of knowledge and practice, carefully drawing from useful theory and applicable methods, but avoiding appropriating professional titles, which can have significant unintended consequences for professional standards, system performance and human wellbeing.


Afterword

From my previous post on this topic (Suitably Qualified and Experienced? Five Questions to ask before buying Human Factors training or consultancy), here are five criteria and questions that apply to paid-for human factors and ergonomics (HF/E) consultancy, training support and employment, and that may help with reflection and discussion.

1. Qualification – Do they have a recognised qualification in HF/E?

2. Accreditation – Do they have an appropriate level of membership of an HF/E related professional organisation?

3. Code of Ethics – Do they abide by a code of ethical conduct from an HF/E related society or association?

4. Experience – Do they have experience in the HF/E work and the domain of interest?

5. Social recognition – Is the person recognised as an HF/E specialist by other qualified HF/E specialists?

The aim of these criteria and questions is to ensure that professional standards – including ethical standards – are met. The criteria and questions are framed above in the context of HF/E, but in fact they apply to any profession, such as psychology, dietetics, or physiotherapy. Proper consideration of the criteria and questions should help to protect organisations, individuals, and the integrity of the profession.




Work and how to survive it: Lesson 2. Understand variation inside your organisation

Much of my practice is informed by counselling and psychotherapy, as well as humanistic psychology more generally. One of my problems with these fields, however, is that insights and discussions are largely kept within the world of psychotherapy. What a waste! The vast majority of people are not engaged in psychotherapy and, for the most part, psychotherapy pays little attention to applying itself to the mundane issues of everyday life, outside of counselling rooms. This is the second in a series reflecting [for now] on excerpts from Life and How To Survive It, by the psychotherapist Robin Skynner and the comedian John Cleese, with some reflections on work and organisations.

Other posts in the series:


In Chapter 1, John Cleese and Robin Skynner are talking about people and families at different levels of mental health. Cleese asks about families that are unusually mentally healthy.

Robin …in trying to describe excellent mental health, and compare it with ill-health, and with the ‘average’ health in between that most of us enjoy most of the time … it’s difficult not to talk as if they are quite different from one another, and inhabited by different people. But, in fact, our level of health is changing all the time. We all feel more healthy in our better moments, when we are ‘in a good mood’, when things are going well, when we feel loved and valued, when we have done our best. And we can all feel less healthy under stress, when our usual sources of support are removed, when we have ‘let ourselves down’, when we ‘get out of bed on the wrong side’. Also, our level of health is not the same in all areas of our functioning. A person who is ‘average’ overall may be outstandingly healthy in some respects, even though functioning poorly in others.
John And obviously the overall level can change over time, too. Otherwise you’d be out of a job. I mean people can get more mentally healthy, can’t they?

In my last post, I wrote about the everyday experience of work, which is often ignored in safety, for several reasons, sometimes beyond the control of safety practitioners. Within this great area of day-to-day activity, many things are happening that we can easily miss unless we pay attention to them. One is that performance changes over time. One aspect of this is what is sometimes called ‘practical drift’. In Friendly Fire, Scott Snook defines practical drift as “the slow uncoupling of practice from procedure” (p. 24). It is one way in which we end up in the work archetype of The Messy Reality.

This is very hard to see from the inside, as it tends to happen slowly and tends to help achieve a range of goals that are more positively reinforced within the organisation (e.g., cost efficiency and production). But without paying attention to normal, everyday work, we don’t see what is going on. Importantly, we don’t see changes in the normal operating point, and associated behaviours, especially when these changes happen slowly and are only exposed to those who are closely associated with the work, whether front-line staff, middle managers or the Board.

Figure 1: Drift toward failure. Adapted from EUROCONTROL (2013).

It often takes an outsider to see this practical drift. As Edward Hall (1959) wrote in his book The Silent Language, “culture hides much more than it reveals, and strangely enough, what it hides, it hides most effectively from its own participants” (p.39). We are victims of our cultures – professional, organisational, and national – and insights often require an outside perspective. By ‘outsider’, I simply mean someone who is seen as an outsider by those in a particular in-group, or at least someone who is on the edge of the group.

Outsiders not only see this drift more clearly, but have ‘permission’ to ask about it. This can be associated with their relative innocence. Outsiders may be able to ask the sorts of questions that a child asks: Why do you do that? Why do you do it like that? What is that for? The outsider will often, however, need a basic knowledge of the work, especially for less observable forms of work and work that is very complex.

‘Permission to question’ can also come about because the questioner has been accepted into a particular role. One such role has been termed ‘barbarian’ by Steele (1975) in Consulting for Organizational Change. Steele characterises this role as “violating comfortable but limiting norms and taboos that are preventing the system from being as effective as it might be. (A counter measure against tunnel vision.)”. This relates to the archetype Taboo. In-group members will find it difficult to raise taboo issues and will often need exceptional interpersonal skill to do so in a way that helps others gain insight.

An outsider may be a cultural insider, e.g., an air traffic control supervisor or anaesthetist from elsewhere. In this case, the person is an outsider in terms of workgroup and location, but an insider in terms of profession. Supervisors observing the work of other workgroups is one way to help people ‘see’ (and improve) their performance. They may be able to see things and ask questions that true insiders can’t.

Another kind of shift or change is where performance moves towards exceptionally good performance, where work is sustainably productive, innovative, healthy, joyful, etc.  Again, if normal, routine, day-to-day performance is unknown and generally ignored (not subject to anything like the same kind of attention as incidents), then we may just gratefully accept the marginally reduced number of incidents (on the left hand side of Figure 2), but not see the way that work is changing for the better, including the ‘good practice’ that contributes to it. In our Ignorance and Fantasy of this day-to-day work, we may well implement changes (rules, limits, targets, league tables, incentives, punishments, etc) that pull the operating point back, halting progress.

Figure 2: Shift toward exceptional performance. Adapted from EUROCONTROL (2013).

As well as changes over time, a second thing is happening that we can easily miss unless we pay attention: there are differences between different parts of an organisation. As Skynner reminds us, “our level of health is not the same in all areas of our functioning”. In travelling to over 50 air traffic control units and centres of various kinds, I have seen and heard about large variations in many aspects of practice and performance. In most cases, where units and facilities are isolated geographically or culturally (e.g., by profession), these differences are unknown or not appreciated beyond the facility, and often beyond the department, work group, or room. Therefore, good practice in one area of an organisation is not known in another that is similar in context and could benefit. For example, one particular air traffic control tower had developed its own refresher training arrangements. These innovative practices could have been of great help to other towers, but because those towers lacked day-to-day contact with the tower in question, the practices remained unknown to them. (See Issues 25 and 26 of HindSight Magazine, on ‘Work-as-Imagined and Work-as-Done’ and ‘Safety at the Interfaces: Collaboration at Work’.)

These differences may also be papered over by the way that we measure performance. For instance, if we average measures across the whole organisation, or if we measure things that do not reflect differences between different areas of an organisation, then again we will be less likely to see and pay attention to them. This means we must pay careful attention to the way that differences may express themselves in terms of department, location, profession, gender, age, experience, and so on. In many cases, the differences within organisations are greater than the differences between them, but if we don’t pay attention to what’s going on, we’ll never really know.
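As a small, hypothetical illustration of this point (the organisations, units and figures below are invented, and the measure is deliberately generic), here is a sketch in Python. Two organisations report almost identical averages on some measure, yet one of them contains far more variation between its units than exists between the two organisations.

from statistics import mean, pstdev

# A generic performance measure per unit (e.g., voluntary reports per 1,000 movements).
org_a = {"Tower 1": 9.5, "Tower 2": 2.1, "Approach": 8.8, "Area Centre": 3.0}
org_b = {"Tower X": 5.6, "Tower Y": 6.1, "Approach": 5.8, "Area Centre": 6.3}

for name, org in [("Organisation A", org_a), ("Organisation B", org_b)]:
    values = list(org.values())
    print(f"{name}: mean = {mean(values):.1f}, "
          f"range = {min(values):.1f} to {max(values):.1f}, "
          f"spread (SD) = {pstdev(values):.1f}")

# The difference between the two organisational averages:
print(f"Difference between organisational means: "
      f"{abs(mean(org_a.values()) - mean(org_b.values())):.1f}")

The organisation-wide averages differ by only a fraction of a point, while the units within Organisation A span a range several times larger than that. A figure reported only at organisational level would show none of this, which is precisely how real differences between departments, locations and professions can go unseen.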

References

EUROCONTROL (2013). From Safety-I to Safety-II: A White Paper. Brussels: EUROCONTROL Network Manager, September 2013. Authors: Hollnagel, E., Leonhardt, J., Shorrock, S., Licu, T. [pdf] (Contributor)

EUROCONTROL (2017). HindSight Magazine. Safety at the Interfaces: Collaboration at Work. Issue 26, Winter. Brussels: EUROCONTROL. [webpage] [pdf]

EUROCONTROL (2017). HindSight Magazine. Work-as-Imagined and Work-as-Done. Issue 25, Summer. Brussels: EUROCONTROL. [pdf]

Skynner, R. and Cleese, J. (1994). Life and How To Survive It. Mandarin.

Snook, S. A. (2000). Friendly Fire: The Accidental Shootdown of U.S. Black Hawks over Northern Iraq. Princeton, NJ: Princeton University Press.


Work and how to survive it: Lesson 1. Understand ‘how work goes’

I have recently been reading Life and How To Survive It, nearly 20 years after first reading it. It is a book on relationships and psychology, written in a conversational question-and-answer style by the psychotherapist Robin Skynner and the comedian John Cleese.

Robin Skynner was a child psychiatrist and family therapist who practised psychotherapy with individuals, couples, families, groups, and institutions, employing group-analytic principles. Skynner, a former WWII bomber pilot, was a pioneering thinker and practitioner whose insights on families and groups are of value to those seeking to understand behaviour in organisations. In this sporadic series of posts, I will share a few of these insights, as they might apply to work and organisations.

Posts in the series:


At the start of Life and How To Survive It (p. 2) John Cleese asks about families that are unusually mentally healthy.

John Well, I’d like to know more about them. Especially as I’ve never heard anyone talk about them.

Robin No, that’s right, the research is hardly mentioned.

John I wonder why. You’d think everyone would want to learn about exceptionally well adjusted people and find out what they know that we don’t. Yet even the other shrinks I know don’t seem familiar with this research. But then, the odd thing about psychiatry is that it’s based upon the study of people who aren’t doing very well – people who have more ‘problems’ than normal.

Robin Yes, that’s basically true.

John And the more you think about that, the stranger it seems. I mean, if you wanted to write a book about how to paint, or play chess, or be a good manager you’d start by studying people who are good at those things. And you wouldn’t expect heavy sales of a book called Play Championship Golf by Learning the Secrets of the Worst 20 Players in the World.

Robin True. Doctors do at least study normal physical functions – anatomy, physiology – before going on the wards to study disease. Psychiatrists seem interested almost entirely in people who are abnormal.

Using this analogy, safety scientists and practitioners might be considered a branch of organisational psychiatry, almost entirely focused on the ‘abnormal’ of work and organisations (albeit in terms of outcomes, not necessarily behaviour or processes). The trouble with this is that we fail to understand ordinary day-to-day work and organisational behaviour, let alone that which is especially effective. (The exceptions to this are branches of study and practice on High Reliability Organisations and Appreciative Inquiry, which are more interested in the latter. But these are rarely part of normal safety management and represent niche areas of safety research.)

For a few reasons, possibly chief among them regulatory requirements in highly regulated industries, the vast majority of the effort of safety scientists and practitioners goes on abnormal and unwanted outcomes, and the work and processes that precede these. My estimation, based on significant contact with safety practitioners and researchers in many countries, is that this tends to take up over 90% of work hours, and many safety practitioners I know place the estimate closer to 100%. Rarely among safety scientists or practitioners is there, or has there ever been, any significant systematic study of normal work (e.g., via ethnography, systems thinking, systems ergonomics, work psychology, or organisational behaviour).

The disconnect between our focus of attention (unwanted events) and what we desire (safe or, more generally, effective work and systems) is what I have previously characterised as déformation professionnelle, a play on words referring to job conditioning or acclimatisation, which affects most or all professions in some way. As noted by literary theorist Kenneth Burke, “A way of seeing is also a way of not seeing — a focus upon object A involves a neglect of object B” (1935/1984, p. 49). In the case of safety, object A (unwanted events) is relatively tiny in number, and object B (normal, everyday work) is huge in number. Because it is so ordinary, we tend not to ‘see’ it (see Figure 1 below, from the EUROCONTROL (2013) White Paper on Safety-I and Safety-II).

Figure 1: The focus of safety.

What this means in practice for safety is that analyses and conclusions about unwanted situations can be based on flawed assumptions about normal work, from the perspective of work-as-imagined (see the archetype Ignorance and Fantasy). Following safety incidents, even or especially ‘first of a kind’ incidents, an investigation might recommend a new rule. In such cases, where normal work has not been studied and understood, the rule can bring unintended consequences. [Readers with front-line experience will now bring several examples to mind.] The reason is that the rule acts as an unreasonable constraint on normal work, perhaps requiring significantly more time or other resources, which are unavailable, or reduced demand, which is not possible. In the absence of additional resources or reduced demand, the rule may be bypassed or, if enforced, may cause secondary problems and leave the system in a more pressured or fragile state. Meanwhile, those recommending the rule remain unaware of its failure, and assume – through lack of feedback and no further related incidents – that the rule is successful. Examples of unintended consequences in interventions can be found under the Congruence archetype.

Seeing only how things go wrong means that we neglect how things go right (e.g., a desired situation), and – most importantly – how things go, in a more ordinary or general sense. Things can go wrong in countless ways, but in many forms of work, desired outcomes tend to come about in a relatively small number of ways, at a fundamental level. A golfer can hit the ball in any direction and at a wide range of angles. The number of ways to miss a hole is effectively infinite. In comparison, the number of ways to hole the ball is relatively small. The same goes for reverse parking a car, landing an aircraft, or piloting a ship to port. There are many variations in how each is done, and some ways are especially effective, but there are countless ways to get it wrong. Hence training focuses, in the main, on how to get it right, and not on how not to get it wrong (though many trip hazards will be important to know about).
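To put a rough (and admittedly playful) number on the golf example, here is a toy Monte Carlo sketch in Python. It considers direction only, ignoring distance control, slope and speed, so it flatters the golfer; the 2-metre putt is an assumption, though the 108 mm hole diameter is standard.

import math, random

random.seed(0)

DISTANCE = 2.0        # metres from the hole (assumed)
HOLE_RADIUS = 0.054   # metres, half of the regulation 108 mm diameter

# Half-angle of the narrow cone of directions that would cross the hole.
half_angle = math.atan(HOLE_RADIUS / DISTANCE)

trials = 1_000_000
on_target = sum(
    1 for _ in range(trials)
    if abs(random.uniform(-math.pi, math.pi)) <= half_angle
)

print(f"Directions that even point at the hole: {on_target / trials:.2%}")

Fewer than one direction in a hundred even points at the hole; all the rest belong to the effectively countless ways to miss. The same asymmetry is why training, in golf or in operational work, concentrates on the comparatively few ways of getting it right.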

By studying ordinary, everyday functioning in organisations, and ‘exceptionally well adjusted’ functioning, we can better understand when a sociotechnical system really is healthy or unhealthy, in what ways, how and when this is expressing itself, who is affected, and why (considering sociotechnical system interaction).

This does mean that safety scientists and practitioners, and anyone else interested in the quality and improvement of human work and sociotechnical systems, must spend more time understanding (and in) the world of work-as-done, and the messy reality of work (remembering that it is for the most part the work context that is messy, not the work itself). This is no mean feat when one’s work is driven by regulatory requirements, but if we wish to understand work and systems, and not just sporadic symptoms of unwanted interactions, then we must somehow prioritise time and other resources. As I reflected in this post, if you want to understand work, you have to get out from behind your desk.
