What Human Factors Isn’t: 1. Common Sense

‘Human Factors’ (or Ergonomics) is often presented as something that it’s not, or as something that is only a small part of the whole. Rather than just explain what Human Factors is, in this sporadic series of short posts I will explain what it isn’t. The posts outline a number of myths, misunderstandings, and false equivalencies.

In this series:

  1. Human Factors isn’t Common Sense (this post)
  2. (Forthcoming)

NATS – UK air traffic control, CC BY-NC-ND 2.0, https://flic.kr/p/ESpkCk

Human Factors Isn’t Common Sense

People sometimes assert that ‘Human Factors’ is common sense. The same is less often said of ‘ergonomics’ (which is equivalent within the discipline or knowledge base) and rarely said of ‘human factors engineering’ (also equivalent, but seems different because of the ‘engineering’ bit). ‘Common sense’ is also notoriously uncommon. Common frustrations with everyday door handles, shower controls, and websites are testament to this. So ‘Human Factors is common sense’ betrays a lack of understanding of both Human Factors and common sense.

Anyone who describes Human Factors as common sense implies that the interaction of physical, biological, social and engineering sciences, and the application of this to the design of work (including the artefacts and environments of work), is obvious and straightforward, and can therefore be done by anyone based on knowledge and skills that are commonly available. This couldn’t be further from the truth. Most aspects of Human Factors are difficult and complex, including: 1) the research and experience bases that contribute to the knowledge base of Human Factors; 2) the interaction of the empirical findings from the research in these fields; 3) the extrapolation and application of the knowledge base to work environments, including highly-regulated safety-critical environments that require specific evidence for claims; and 4) the practice skills, relationships and resources that are needed to do this in environments as diverse as healthcare, power generation, defence, manufacturing, transportation, and agriculture, which Human Factors practitioners, and others who apply Human Factors methods and knowledge, often traverse.

The ‘common sense’ claim betrays a lack of understanding of the foundation, scope and application of Human Factors. Typically, the claim comes from those who confuse Human Factors with ‘behaving safely’. While human performance is a key aspect of Human Factors, the primary method of intervention is (work) design, not behaviour modification. Behaviour modification is usually best filed under ‘Applied Psychology’ (also – related – Human Performance as a sphere of professional activity).

Even when the aim and scope of Human Factors are better understood, the ‘common sense’ claim confuses hindsight with foresight. When a task, artefact or environment is well designed, it is more likely to be unremarkable, or even unnoticeable. It blends in with, and subtly assists, the purposive flow of experience. It is part of ‘how things ought to be’. So it may intuitively feel like common sense because it doesn’t make the day longer and harder than it needs to be. But the activities to bring about these things, including the competencies, relationships, tools, time, project arrangements, and other resources, are not common. In sectors such as air traffic control, rail, defence, and major hazard industries, including regulators, designing for system effectiveness and human wellbeing requires the support of suitably qualified and experienced practitioners working as part of teams in multiple organisational divisions – operational, design and engineering, safety and R&D.

If Human Factors is common sense, then so are architecture, surgery, and electrical engineering, or (as foundation disciplines of Human Factors) psychology, biological sciences, and industrial design.

The common sense claim wouldn’t matter much if it were not for the false and dangerous conclusion that follows: that because ‘Human Factors’ is common sense, no competent design support is needed. People can carry on and ‘Human Factors’ will just happen as the natural order of things. The ‘natural order’ came to light in the 1940s, when ‘common sense’ cockpits led to many gear-up crash landings. Today, the ‘human factors as common sense’ myth leaves healthcare workers with dangerously confusing devices, medicine packaging, and unforgiving work environments, the consequences of which are inherited by them, by patients, by families, and by society generally.


 


The Organisational Homelessness of ‘Human Factors’

Most fields of professional activity have a settled home within the divisional and departmental structures of organisations. Operational staff work in operational divisions. Engineering staff work in engineering divisions. Everyone else tends to know their place: finance, human resources, legal, safety, environment, quality, security, corporate communications, and so on.

Not so for human factors (or ergonomics; HF/E). Within organisations that are large enough to have a divisional structure, ‘human factors’ can be found in a variety of divisions.

In this post, I outline four common homes for HF/E within organisations (after Kirwan, 2000), drawing on personal experience in each of the four organisational divisions in different organisations over the past 21 years, and some of the little literature on this (Kirwan, 2000; Shorrock and Williams, 2016). I conclude with some of the implications of organisational homelessness.


Photo: Dave Gray, Design by Division, CC BY-ND 2.0, https://flic.kr/p/9gPSJj

Human Factors in Operations Divisions

‘Human performance’ is, naturally, core to HF/E (but not equivalent to it), and in sectors such as transportation, energy production, manufacturing, power generation, and mineral extraction, HF/E is sometimes located in the operational divisions of organisations. When housed here, HF/E practitioners may assist with the design and assessment of work, training, non-technical skills and [team/bridge/rail] resource management, procedure and job aid design, observational safety, assessments and advice on fatigue and shiftwork, staffing and rostering, maintenance, personal resilience and confidence, stress management, safety investigation, quality improvement, and advice and support on human performance more generally. Such issues are reflected in texts such as Flin et al’s Safety at the sharp end and Davies and Matthews’ Human performance: Cognition, stress and individual differences.

Being close to operational teams and work-as-done can be especially rewarding. It is the only way to really understand The Messy Reality and Taboo issues. Problems and opportunities for work-as-done are hard to see from afar (if you want to understand risk, you need to get out from behind your desk). This divisional location can provide credibility with front-line operational staff, the beneficiaries of most HF interventions, and allow for the development of the relationships required for problem solving and opportunity management.

The other side of this coin is that there is a particular risk in Ops of becoming too close to operational staff, while also under the operational management structure. Independence can be compromised.

Housed in operations, human factors – as a design discipline – may also be in the unhappy position of inheriting upstream design decisions…and any resulting problematic situations. Without proper involvement in the design process, problems may come to light late in design and development, when there is considerably less opportunity for influence. HF/E practitioners in this context also risk losing design skills and losing track of research; the research-practice gap can seem especially wide from Ops, where research tends to be valued least of all.

The shorter-term focus of operations also brings an acute-chronic trade-off: when time is limited (i.e., all the time), handling today’s problems and opportunities leaves less time for future problems and opportunities.

Human Factors in Engineering Divisions

Human factors is, fundamentally, a design discipline. This is sometimes a surprise to those who perceive it as a behavioural (or ‘human performance’) discipline, which might be seen to be more naturally aligned with operations. However, human factors – by definition – operates primarily through design, not behaviour modification. This is exemplified by various textbooks, including old classics such as Sanders and McCormick’s Human Factors in Engineering and Design and Wilson and Sharples’ Evaluation of Human Work, and, more generally, by ISO 9241 – Ergonomics of human-system interaction (especially Part 210: Human-centred design for interactive systems).

The International Ergonomics Association – the umbrella organisation for all HF/E societies and associations around the world – defines the profession as that which “applies theory, principles, data and methods to design in order to optimize human well-being and overall system performance”. So HF/E specialists can often be found in the engineering divisions of organisations.

In this organisational context, HF/E can help to address the design of equipment, tools, artefacts and infrastructure, such as control rooms, buildings, and signage. In such cases, the costs of not integrating human factors are extremely high. Compared to procedures and work routines in operational contexts, equipment, tools, artefacts and infrastructure are difficult and expensive to modify. Often, operations inherit design problems and have to adjust to them, sometimes with HF/E support in operations…

There are downsides to being aligned with the engineering divisions of organisations. Practitioners will tend to find that they have to work within existing design and engineering processes, which may not be ideal for iterative human factors design. Being part of the design and engineering tribe also brings some distance from operations – socially and culturally. As a result of organisational silos, the practitioner embedded in this context may well be closer to work-as-imagined and work-as-prescribed than to work-as-done. Some who identify as human factors specialists – especially those previously integrated in safety or operations – will need to develop new design and engineering skills to be accepted. Designers and engineers, meanwhile, can naturally find it frustrating to have to pass a ‘human factors test’, or to depend on knowledge that they do not have.

Human Factors in Safety (and Health) Divisions

Many organisations have a division of safety, focusing on operational safety (major hazards) or occupational safety, or both. Human Factors practitioners in this context – especially in high-risk industries – are likely to support activities such as safety investigation, safety assessment (e.g., human reliability assessment), safety surveys, specific activities such as fatigue and stress management, and perhaps safety policy and the development of safety management systems. Safety departments may sit within a broader safety, health, environment, quality and, increasingly, security function, in which case other activities may be supported (e.g., concerning noise, vibration, the thermal environment, vision).

This context can be a good compromise between operations and engineering, affording close cooperation with both engineering project teams and operations, given sufficient attention to forging relationships across organisational boundaries. High level independent influence on strategic decisions (e.g., via safety management system requirements) can also be a benefit.

Safety divisions (and departments) are, however, often seen as external to both operations and engineering (both culturally and organisationally, requiring, for instance, internal contracting for services). HF/E may be seen as an interference, or supporting only one aspect of system performance (accident prevention), and not activities that support effectiveness more generally. Safety (and health) is only one of the goals of HF/E, which seeks to optimise system performance and human well-being.

Human Factors in R&D Divisions

For some HF/E practitioners outside of academia, R&D divisions offer a chance to do industry-centred research and development from the inside. Within government, intergovernmental or commercial organisations, HF/E practitioners conduct applied research on all aspects of the discipline – physical, cognitive, social, and organisational.

R&D work is intellectually stimulating and offers a chance to generate and apply knowledge over a longer time horizon (see Chung et al, 2016). It can offer the chance to imagine future work, and to understand work-as-done now. From a professional development perspective, R&D also offers the best chance of keeping up with the otherwise impossible task of following the literature on any particular aspect of HF/E.

But of the four options outlined above, practitioners in R&D may experience the greatest distance from both front-line staff and senior management. This is reflected in outputs. As Kirwan (2000) notes, “There are three main types of papers, in order of importance to the company: trade journals, conference papers, and journal papers. The order of importance to the company and to the success of the unit is the reverse of the academic ordering of importance” (p. 668). This can be a surprise to practitioners. While Kirwan also noted that “[journal] papers will be of greater perceived importance to the company if the HF group is located within a research division in that company”, there are in practice several barriers to publication, as well as to research application, in organisations (Chung and Shorrock, 2011; Salmon and Williams, 2016). This helps to explain the small minority of industry practitioners who author HF/E journal articles: as low as 3% in 2000 and 2010, compared with 76% and 81% of papers authored only by research institution authors in the same years (Chung and Williamson, 2018).

This may reflect a decline in in-house HF/E R&D. Some major organisations that were previously heavy hitters in R&D no longer have a large R&D function, or no longer perform HF/E R&D.

Organisational Misfits…or Connectors at the Edge?

To many, the organisational homelessness of human factors brings confusion about the nature of the discipline and profession. Is it about design, or engineering, or operations, or safety, or health…? Human factors has a sort of identity problem.

This identity problem might be seen as fundamentally exogenous, existing in large part because of the functional structures of (especially) large organisations, which divide decision making from work, design and engineering from operations, research from practice, system performance from human well-being. These are all within the scope of HF/E; none can be excluded. But organisations are what they are, and command-and-control structures resist systems thinking.

So HF/E is indeed an organisational misfit, which might seem ironic since HF/E is concerned with the fit between system elements. HF/E is no more at home in operations than in engineering, safety, R&D, or other organisational functions. Individual practitioners may feel more at home in one context in particular, but will often be found at the edge of functions, interfacing with other functions and with the organisational system as a whole. Organisations, meanwhile, may see a better fit for HF/E in one division, or indeed – perhaps ideally – spread over several. But there is no universally appropriate home. Traditional organisational structures are simply at odds with systems disciplines that work across functional divisions, especially those divisions that do not reflect the flow of work or influence in a system.

For any individual practitioner, experience of a variety of organisational functions is helpful for understanding the internal processes and sub-cultures that exist within organisations, and for identifying the formal and informal bridges that exist, or can be built, between them.

So organisational homelessness can be a weakness, but also a source of strength. As a systems discipline, HF/E sees the whole, and focuses on interaction and influence, not just parts. As well as providing technical HF/E support, practitioners might ideally combine a systems and a humanistic approach, mediating, bridging and connecting different organisational functions as ‘connectors’. This quote, from an interview with Cormac Russell on learning from communities, describes this ideal well:

“There are people who are loosely called ‘connectors’ at the edge, who move quite fluidly.  I think about them as multicultural in a sense, in that they can move in between any groupings really but they have that competency and capability.” Cormac Russell

In organisations that divide by design, bridging is just as important as bonding…or more so. Organisational homelessness can help practitioners to navigate different worlds, without getting entrenched in one.

References

Chung, A.Z.Q. and Shorrock, S.T. (2011). The research-practice relationship in ergonomics and human factors – surveying and bridging the gap. Ergonomics, 54(5), 413-429.

Chung, A.Z.Q., Shorrock, S., and Williamson, A. (2016). Chapter 9: Integrating research into practice in human factors and ergonomics. In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. CRC Press.

Chung, A.Z.Q. and Williamson, A. (2018). Theory versus practice in the human factors and ergonomics discipline: Trends in journal publications from 1960 to 2010. Applied Ergonomics, 66, 41-51.

Davies, D.R. and Matthews, G. (2013). Human performance: Cognition, stress and individual differences. Psychology Press.

Flin, R., O’Connor, P. and Crichton, M. (2008). Safety at the sharp end: A guide to non-technical skills. Ashgate.

Kirwan, B. (2000). Soft systems, hard lessons. Applied Ergonomics, 31, 663-678.

McCormick, E.J. and Sanders, M.S. (1992). Human Factors in Engineering and Design. McGraw-Hill.

Salmon, P. and Williams, C. (2016). Chapter 10: The challenges of practice-oriented research. In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. CRC Press.

Shorrock, S. and Williams, C. (2016). Chapter 8: Organisational contexts for human factors and ergonomics in practice. In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. CRC Press.


This is a repost of the original, posted 06/04/2018, then lost to the technical vagaries of WordPress.


Reflections from the edge

Image: Steven Shorrock CC BY-NC-SA 2.0 https://flic.kr/p/pdNPXP

I have ‘worked on work’ for my whole professional career. For the majority of that time, I have worked primarily in aviation. Unlike many in the industry, my primary interest is not in aviation, any more than it is in any other activity. My primary interest is not even in safety. My professional interest is, and always has been, in work and people.

I grew up in a family business. My family, on both sides, were very much working class, though my parents were entrepreneurial and opened a market stall, which grew into a small number of shops and a small distribution business. My siblings and I were co-opted into this effort and this took up our Saturdays and holidays for as long as I can really remember. 

I was the more sensitive and reflective of the older siblings, ill-suited to some of the work, though truck driving was enjoyable in later years. So, I was the first in our known family history to decide to – or be able to – enter higher education.

Being raised in a family business, at least of the sort that I was, is not something that I can recommend, and was not a choice. This upbringing did, however, give me an immense interest in work. And so it was clear to me, from teenage years, that I would study work. This was reflected in every subject choice through high school, college and universities.

Growing up in a family business also helped me to develop a particular capacity for observation from the edge. In a sense, my whole late childhood was an exercise in crude ethnography, though I never wrote up my observations. Some of these observations related to myself and our family dynamics, such as the confusing role transitions, blends and conflicts between life as a son, brother, and employee.

Of course, I was never really asked about my observations on work. No one was. Work was just something you got on with, under a particular power structure, with particular unspoken assumptions, and particular pressures. As an insider-outsider, I could see these, and in organisations of all sorts, insider-outsiders have a particular edge on seeing things from a different – less acculturated – perspective.

This made me think about the ‘outsiders within’. There are always people who are more naturally on the edge, of groups, departments, divisions, professions. They may be more interested in the edges, in the connections, and may be naturally drawn to connecting the disconnected. From the edge, they may not be fully accepted as a ‘true’ member of any particular tribe, and so may have relatively little power and may not be heard often. But they may be accepted into many tribes, as a guest, which may well afford them an understanding of the bigger picture, as well as the unseen within. 

As Kurt Vonnegut’s character Finnerty said in Player Piano, “I want to stay as close to the edge as I can without going over. Out on the edge you see all kinds of things you can’t see from the center.” 

So in your organisations, who would these people be? What might they see from the edge that others don’t? 


‘Human Factors’ and ‘Human Performance’: What’s the difference?

The term ‘Human Performance’ (and ‘Human and Organisational Performance’, or HOP) has become increasingly common in recent years in a number of industries, especially those with a safety focus. It is often associated with ‘Human Factors’, or even used as a replacement for that term. But in some cases, different practitioners identify with one term or with both. So I thought it might be useful to clarify a few important distinctions between the two.


Clement127 CC BY-NC-ND 2.0 https://flic.kr/p/qr4XXW

In this post, I use ‘Human Factors’ and ‘Human Performance’ (mixed case) to refer to spheres of academic research/teaching and practice in applied contexts by internal and external consultants (e.g., Human Factors Specialist, Human Performance Specialist). But there is another, more ordinary meaning of ‘human performance’ (lower case): simply what people do and how. This ordinary meaning is not the focus of this post.


Human Factors emerged from many disciplines. ‘Human Factors’ (or Ergonomics) emerged from disciplines including psychology, anatomy, physiology, biomechanics, anthropometry, industrial design and engineering, industrial medicine, industrial hygiene, sociology, architecture, illumination engineering, interaction design, visual design, and user interface design. Still today, journals and textbooks cover all of these disciplines, and none dominates singly, but Human Factors now sits as a discipline itself (see later). Those who practise as qualified Human Factors professionals today tend to come mainly from psychology, engineering and physiology/biomedical academic backgrounds.

Human Performance is related primarily to psychology and physiology. ‘Human Performance’ as a sphere of research and practice is related primarily to psychology in industry, and to physiology and sports science in sport and leisure. For industrial applications, psychology dominates in discussions (evident on social media), and in research, though theory is not particularly well-connected to practice (arguably less so than for Human Factors). One of the very few academic textbooks for industrial applications with ‘Human Performance’ in the title (Matthews et al, 2000) is written by four academic psychologists, and covers cognition, stress, and individual differences. (Other books that mention Human Performance in the subtitle mostly concern sport and exercise.)

‘Human Factors’ emerged from many disciplines, with none dominating completely. Human Performance is related primarily to psychology, physiology, and sports science, with psychology dominating industrial applications. 


Human Factors is a discipline. ‘Human Factors’ emerged as a distinct field of academic study – taught and researched as part of higher education – over time since WWII (see this chapter and this article by Pat Waterson). The first learned Society (now CIEHF) was set up in 1949, and during the 1950s and 1960s, Professorial Chairs, postgraduate degree courses, and scientific journals were established. But for some time, Human Factors/Ergonomics was a “convenient gathering place” (Rodgers, 1959) for a variety of stakeholders, including other disciplines. Human Factors is now considered a distinct scientific and design discipline, with university departments/schools, research institutes, professors, conferences, and scientific journals, including Human Factors, Ergonomics, and Applied Ergonomics (the top three journals in the discipline).

Human Performance is an interdisciplinary focus. ‘Human Performance’ is not a discipline as such, but rather an interdisciplinary focus. It has long been associated with sport and exercise, with performance in extreme environments, and with work, but as a focus of activity for sports scientists, physiologists and industrial-organisational psychologists. There are scientific journals associated with the term Human Performance (but not many). Examples include Human Performance, Journal of Human Performance in Extreme Environments, and Organizational Behavior and Human Performance (1966-1984). There are few university schools/departments and Professors of ‘Human Performance’. Those that exist tend to focus on sport and exercise science.

‘Human Factors’ is a distinct discipline, as well as a forum for other disciplines that share a similar focus. ‘Human Performance’ is not a distinct discipline, though it is a focus of, or umbrella for, allied human sciences. 


Human Factors is a profession. The profession of ‘Human Factors Engineer’/’Ergonomist’ emerged (unexpectedly) over 50 years ago, and is now associated with specialised education, recognised qualification routes, professional associations, and associated codes of conduct. Specialists are now employed in many industries – especially safety critical industries – such as aviation, rail, military, nuclear, oil and gas, and healthcare. These roles tend to require formal, post-graduate degree qualifications in Human Factors (or Ergonomics), and/or certification (‘Chartership’ in the UK) by recognised professional bodies. Membership of professional bodies requires adherence to a Code of Conduct (such as this from CIEHF).

Human Performance is not yet a profession. ‘Human Performance’ cannot be described as a profession, with specialised education, recognised qualification routes, professional associations, and associated codes of conduct. This may emerge in the future. Sometimes, those who identify as ‘Human Performance Specialists’ are full members of professional associations for disciplines such as Human Factors, Industrial/Organisational Psychology, Medicine, Sports Science, etc. More commonly, Human Performance (or Human and Organisational Performance) is a term adopted by health and safety practitioners, and is sometimes described as a ‘movement’.

‘Human Factors’ is a distinct profession, and is also sometimes used by other allied professions with similar aims and scopes. ‘Human Performance’ is not a profession, but is a focus of interest for allied professions.


Human Factors and Ergonomics are considered roughly equivalent. Within the discipline and profession, the terms ‘Human Factors’ and ‘Ergonomics’ are generally considered equivalent. The scope of research units, schools, and journals, and the official internationally accepted definition, are the same for both terms. Different terms are, however, used in different industries and contexts. Human Factors Specialists tend to be happy with either title, depending on the context (the formal Chartered title in the UK is ‘Chartered Ergonomist and Human Factors Specialist’).

Human Performance and Ergonomics are considered more distinct. While human performance (what people do and how they do it – concerning physical, cognitive, social aspects) is of course of critical interest to Ergonomics, the terms are not equivalent. Those who identify as ‘Human Performance Specialists’ tend not to identify as ‘Ergonomists’, unless they are qualified in Ergonomics. ‘Ergonomics’ has clear design connotations, while ‘Human Performance’ tends to have training connotations, or (lowercase) human performance is simply seen as something that people do – perform.

Human Factors and Ergonomics are considered roughly equivalent within the discipline, and by many in the profession. Human Performance is of interest to Ergonomics (Human Factors), but also to many other disciplines.


Human Factors has a design focus. ‘Human Factors’ interventions tend to have a design focus. This has been the method of intervention since the inception of HF in WWII, and is reflected in many definitions since then, including that of the International Ergonomics Association (adopted by all Human Factors [or Ergonomics] professional associations): to apply “theory, principles, data and methods to design in order to optimize human well-being and overall system performance” (IEA). ‘Design thinking’ is therefore inseparable from Human Factors.

Human Performance has a behavioural focus. ‘Human Performance’ interventions by those who identify as Human Performance Specialists tend to have a more direct behaviour modification focus, frequently associated with safe behaviour, leadership, culture, and teamwork. The methods of intervention are primarily training, coaching, awareness-raising, and other behaviour change methods that tend not to be design-led. ‘Design thinking’ is not necessarily associated with Human Performance (though it may be, in some interventions and publications).

Human Factors and Human Performance tend to have different modes of intervention. Human Factors tends to have a design focus, while Human Performance tends to have a behaviour modification focus.


Human Factors is concerned with system performance. ‘Human Factors’ concerns “interactions among humans and other elements of a system” (IEA). It has a military-industrial heritage, with a focus on the sociotechnical system. This system focus can be confusing, especially outside of the discipline, where it is sometimes associated with ‘factors of humans’ (which is, confusingly, more aligned with Human Performance). The focus is therefore not only human performance per se, but system performance more generally, with human performance being a key influence on this. Human performance at an individual or team level could be considered effective (locally), but – by the nature of system interactions – produce unwanted effects at a higher system level, or in another part of the system, or be detrimental to human wellbeing. ‘Systems thinking’ is inseparable from Human Factors.

Human Performance is concerned with individual and team performance. ‘Human Performance’ is primarily focused on the performance of individuals and teams (and organisations, in the case of Human and Organisational Performance) – what people do, and how. Academically, it has a human science heritage, in sport and exercise science, physiology (endurance and survival in extreme environments), and also industrial-organisational psychology. ‘Systems thinking’ is not necessarily associated with Human Performance (though it may be, in some interventions and publications).

Human Factors, despite the name, is concerned with system performance, as a discipline and profession. Human Performance tends to be concerned with individual and team performance, as a focus for various disciplines and professions.


Summing up

‘Human Performance’ is seemingly self-evident in its focus; it is about what it says on the tin – human performance. It is not a distinct discipline or profession, but offers a convenient gathering place for those who are interested in improving human performance. ‘Human Performance’, as used by some health and safety professionals now (who sometimes identify as Human Performance or Human and Organisational Performance specialists) is, in some respects, in a similar position to that of ‘Human Factors’ (and ‘Ergonomics’) in the 1960s. It is also in a similar position to ‘User Experience’ or UX a decade or two ago (compared to Human Computer Interaction, Usability Engineering or Interaction Design).

Whether ‘Human Performance’ should become a discipline and profession is a matter of opinion. But since there are already a number of academic disciplines and professions concerned with human performance, I would say this is unnecessary and unhelpful. I would also say that it is unhelpful to call it a ‘movement’. Rather, the term ‘human performance’ is more useful in a multi-disciplinary, non-professionalised (or multi-professional) way, concerning what people do and how, and as a way to bring people together to talk about this, somewhat like ‘systems thinking’. It is something of interest to many stakeholders.

But I see three key future risks for ‘Human Performance’ as a ‘movement’. The first risk is that – disconnected from a discipline – it becomes allied with populist science, without an evidence base in pragmatic science. Populist science can appeal to industry, but takes practice further from theory, to the point that intervention may be ineffective or counterproductive.

The second risk for the Human Performance movement is that – disconnected from a profession – clients of services related to Human Performance do not really know who or what they are getting, and have no recourse to a code of conduct and associated professional association. Clients therefore have to ensure the person employed or contracted is suitably qualified and experienced for the work, whether it is labelled as ‘Human Performance’ or ‘Human Factors’.

The third risk is that the term ‘Human Performance’, as often used by HP/HOP consultants, may reinforce behavioural approaches to improvement (training, coaching, supervision, monitoring, behaviour-based safety), at the expense of system and design approaches, which may well be more effective. As Sanders and McCormick (1987) stated in their textbook Human Factors in Engineering and Design, “it is easier to bend metal than twist arms”. And so we should be wary of abandoning ‘Human Factors’ for a term that may be on trend, but risks taking us back to an ideology of only fitting the human to the task, rather than (first) fitting the task to the human.


Post-note

This post reflects on developments in a number of industries concerning the growth of ‘Human Performance’ as a movement or sphere of activity for internal and external consultants – separate from, equivalent to, an aspect of, or even subsuming ‘Human Factors’ as a discipline and profession. In some cases, the terms Human Factors and Human Performance refer to rather different things as spheres of professional activity. In others, and for some publications, they refer to closely related things or to the same thing, with one term or the other being used depending on the purpose, scope and readership.

This White Paper on Human Performance in Air Traffic Management Safety, for instance (for which I was lead editor), is arguably more about Human Factors, though it does not include human wellbeing in its scope (which is core to the definition of Human Factors and Ergonomics). This Human Performance Standard of Excellence (also within air traffic management) similarly includes design and behavioural approaches, and also mentions wellbeing. Again, this is more aligned with Human Factors (in a non-professionalised way), but the term human performance is used.

So ‘Human Performance’ as a movement or sphere of research and professional activity is different to ‘human performance’ as simply what people do and how. Both of the above publications essentially concern ‘human performance’ (lowercase) in the ordinary sense – what people do and how they do it, and how to improve that using training, design, management, and other interventions. In summary, in some applications, publications, and contexts, either term may be used with essentially the same meaning, while in others the terms have somewhat different meanings and implications; even the meaning of ‘human performance’/‘Human Performance’ (and ‘human factors’/‘Human Factors’) can differ.


System Safety: Seven Friends of Intervention

In this short series, I highlight seven foes and seven friends of system safety, both for explanation and intervention. Each is a concept, meme, or device used in thinking, language, and intervention (reinforced by more fundamental foes that act as barriers to thinking).  They are not the only foes or friends, of course, but they are significant ones that either crop up regularly in discussions and writings about safety, or else – in the case of friends – should do.

In this post, I outline seven friends of intervention. To keep it short (because I usually intend that, but rarely succeed), I limit each explanation to an arbitrary 100 words.

In this series:

  1. Seven foes of explanation in system safety
  2. Seven foes of intervention in system safety
  3. Seven friends of explanation in system safety
  4. Seven friends of intervention in system safety (this post)


whereisemil CC BY-NC-ND 2.0 https://flic.kr/p/tQcpYy

1. Acceptance of uncertainty

Whether one is intervening* to try to understand a situation or intentionally to bring about change, it is important to accept that one probably does not and cannot fully understand a complex situation or sociotechnical system. Once one accepts this, unwarranted confidence reduces, and the need for competency, time, and information becomes clearer. With competency, time, and information, the form of practical arrangements for understanding the system at all stages of its lifecycle becomes clearer, including during implementation, where surprises that result from intervention actions will tend to emerge.

2. Competency, expertise and involvement 

If you want to intervene in a system, you need expertise in system safety. It is astonishing how often this simple fact is neglected. Suitably qualified and experienced persons (SQEPs) are needed with recognised multidisciplinary competencies and perspectives, such as from safety science, safety engineering, human factors/ergonomics, psychology, anthropology, and related disciplines. Such expertise is often missing (e.g., HF/E competency in healthcare). And of course competency is needed from those who do the work. Learning teams and action research are examples of the use of competency in intervention.

3. Research

When intervening in a system for understanding or intentional change, an important initial step is to get knowledge. For system safety, this may include original research for new knowledge, or summaries, reviews or syntheses of existing sources of knowledge. The knowledge may relate to a topic within a safety-related discipline (e.g., in scientific journals), a sector (e.g., aviation, healthcare), or an organisation (e.g., history of interventions). In system safety, this important step is often missing in practice, resulting in ineffective interventions. Greater attention to research provides data, concepts, theories and methods to guide practice, benefiting safety and effectiveness more generally.

4. Listening and observing

Two fundamental methods for understanding systems are observing people at work and listening to people talk about their practice – how and why they intentionally make and transform the world – including the context of practice. These activities, while often lacking in practice, are vital to increase congruence between work-as-imagined and work-as-done, via appropriate alignment rather than simple compliance. Accepting the equivalence of failure and success in terms of their origins in ordinary work, we try to understand not only unusual events, but work in all its forms, whether the outcome is expected or unexpected, wanted or unwanted.

5. Human-centred, activity-focused design 

Human-centred design (HCD, e.g. ISO 9241-210) is a design philosophy and process that aims to align systems with human needs. It is relevant to anyone involved in the design or modification of procedures, equipment, or other artefacts. HCD requires that stakeholders are involved throughout design and development, which is based on an explicit understanding of people, activities, tools, and contexts. The process is refined by iterative user-centred evaluations and learning cycles. A strong focus on activities helps to understand not only how the world should adapt to people, but how people adapt to the world.

6. Multiple perspectives and thick descriptions

There tend to be multiple perspectives on situations, events, problems and opportunities. Each may be partial, but together can give a more complete picture. Shifting between different perspectives illuminates different experiences, perceptions and understandings, and how these interact. Different aspects of systems and situations come to light, along with the trade-offs, adjustments and adaptations that are or were locally rational. Multiple perspectives help generate thick descriptions of human behaviour. Facts, along with commentary and interpretations, explain work-as-done in context, such that it becomes more meaningful to an outsider, and possible implications of situations and proposed ‘solutions’ come to the surface.

7. Systems methods

Systems methods help to understand system boundaries, system structure, and system interactions across time and scale. They can make patterns of system behaviour visible, and can reveal previously unknown or unforeseen influences and interactions between parts of the system. Methods can be used for describing, analysing, changing, and learning about situations and systems. Common methods include system maps, influence diagrams, causal loop diagrams, multiple cause diagrams, stock and flow diagrams, activity theory/systems, FRAM, AcciMaps, and STAMP, among others. Such methods can help to go ‘up and out’ to the system context instead of just ‘down and in’ to components, ‘causes’, or events.
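To make the idea of going ‘up and out’ a little more concrete, here is a minimal sketch in Python, assuming the networkx library. The node names and influence links are purely hypothetical illustrations, not drawn from any real analysis; the point is only to show how an influence or causal-loop structure might be captured as a directed graph so that feedback loops and upstream influences can be listed.

```python
import networkx as nx

# Hypothetical influence structure: each edge reads "A influences B".
# Node names and links are illustrative only, not a real analysis.
influences = [
    ("production pressure", "time available per task"),
    ("time available per task", "workarounds"),
    ("workarounds", "incident reports"),
    ("incident reports", "procedural controls"),
    ("procedural controls", "time available per task"),
    ("procedural controls", "workarounds"),
]

G = nx.DiGraph(influences)

# Feedback loops (cycles) are where patterns of system behaviour emerge.
for loop in nx.simple_cycles(G):
    print(" -> ".join(loop + [loop[0]]))

# Upstream influences on a node of interest ("up and out", not "down and in").
print(sorted(nx.ancestors(G, "workarounds")))
```

A toy graph like this is no substitute for a proper causal loop, FRAM or STAMP analysis, but it illustrates the design choice behind such methods: the unit of analysis is the interaction, not the component.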


* A note on intervention: The term intervene comes from the Latin intervenire, from inter- ‘between’ + venire ‘come’. To intentionally try to understand a situation, or take action to change it (e.g., improve it or prevent it from getting worse) is to intervene. While there may be no intention to change a situation while observing or measuring, that is very often an unintended consequence. 

If you want to learn more about how complex systems fail, in the most concise and easy-to-digest form, read this by Richard Cook.


System Safety: Seven Friends of Explanation

In this short series, I highlight seven foes and seven friends of system safety, both for explanation and intervention. Each is a concept, meme, or device used in thinking, language, and intervention (reinforced by more fundamental foes that act as barriers to thinking).  They are not the only foes or friends, of course, but they are significant ones that either crop up regularly in discussions and writings about safety, or else – in the case of friends – should do.

In this post, I outline seven friends of explanation. To keep it short (because I usually intend that, but rarely succeed), I limit each explanation to an arbitrary 100 words.

In this series:

  1. Seven foes of explanation in system safety
  2. Seven foes of intervention in system safety
  3. Seven friends of explanation in system safety (this post)
  4. Seven friends of intervention in system safety


Tim Caynes CC BY-NC 2.0 https://flic.kr/p/6BNpAf

1. The [degraded] system

The ‘system’ in system safety does not operate as designed or as prescribed. It is neither fully understood nor fully understandable, and only slivers of performance can be measured. There are degraded resources (staffing, competencies, equipment, procedures, time) and – often – inappropriate constraints, punishments and incentives, whose effects are not as imagined. There are also gaps between these elements of the system, and people – the flexible system element – have to stretch to bridge these gaps, resolving the unforeseen pressures and dilemmas that result. While we are mostly successful, sometimes the reality of the system surfaces in unwanted ways.

2. Goal conflicts 

Safety is just one of several goals, among cost-efficiency, productivity, capacity, security, and environmental factors such as noise and emissions. Safety is rarely of highest priority in any permanent sense. Instead, there are almost always conflicts or tensions between goals, presenting stakeholders with dilemmas. As situations change over time, different goals and relative priorities will be perceived differently by different individuals and groups. Goal conflicts will also look different in hindsight, when one has access to more information, including the outcome. While the solutions to goal conflicts may seem ‘obvious’ looking back, they were gambles when looking forward.

3. Work-as-done

We often base safety-related work on work-as-imagined, -prescribed, and -disclosed. In doing so, we often neglect the real thing – work-as-done. Work-as-done is what people do to meet their goals during expected and unexpected situations. It is characterised by patterns of activity to achieve a particular purpose in a particular context. It may look messy, but in fact it is the environment that is messy. The work is adaptive. Work-as-done varies between people and situations, and much of it is in the head. So any understanding – gained via listening, observing, recording, and modelling – will only ever be partial and approximate.

4. Trade-offs and adjustments

People work not in order to ‘be safe’, but to meet demands. Constant performance adjustments and trade-offs are required in order to meet variable, unpredictable demands, and to resolve goal conflicts. When we look at human performance, even when simply walking in a crowd, all we do is adjust and adapt to a dynamic, uncertain environment. We have to make trade-offs and choose among (often sub-optimal) courses of action, and make adjustments to our plans and responses as situations unfold. This is mostly very successful, and needs to be understood from an inside perspective, whether the outcome is as expected or not.

5. Local rationality

Work-as-done is guided by the local rationality principle: people do things that make sense to them given their goals, the evolving situation, and their understanding of it at a particular moment. Our rationality is not only bounded by human limitations, complexity, and time available, but local to the situation and our experience. Everyone has their own local rationality. We need to understand how people make sense of situations and how they choose to act. This requires empathy and careful discussion and observation to understand work-as-done (in the head and the world) and what helped and hindered it.

6. Interactions and patterns

In a system, everything is connected to something. While we often attend to components, it is the nature of interactions, along with goals, that characterises the system. These interactions – between human, social, organisational, regulatory, political, technical, economic, procedural, informational, and temporal components – should be a focus of attention, whether considering the past, present or future. Viewing the system as a whole, emerging patterns of activity – including flows of activity and information – become evident. These wanted and unwanted patterns can be understood using systems methods, which help to reveal influence in the system, and possibilities for intervention.

7. Strengths and assets

Systems operate successfully, for the most part, because of strengths and assets in the system (especially human strengths such as flexibility, creativity, learning, collaboration, pattern recognition, curiosity, insight, and perspective-shifting). Yet strengths and assets are often all but missing from system safety research and practice. In any discussion or analysis, we should start with what’s strong, not what’s wrong. Instead of focusing only on perceived deficiencies, we must find out what capacities ensure that system goals are balanced appropriately. If we don’t explicitly appreciate what we have, how can we know if interventions – including efficiency-focused cut backs – are wise?


If you want to learn more about how complex systems fail, in the most concise and easy-to-digest form, read this by Richard Cook.


System Safety: Seven Foes of Intervention

In this short series, I highlight seven foes and seven friends of system safety, both for explanation and intervention. Each is a concept, meme, or device used in thinking, language, and intervention (reinforced by more fundamental foes that act as barriers to thinking).  They are not the only foes or friends, of course, but they are significant ones that either crop up regularly in discussions and writings about safety, or else – in the case of friends – should do.

In this post, I outline seven foes of intervention. To keep it short (because I usually intend that, but rarely succeed), I limit each explanation to an arbitrary 100 words.

In this series:

  1. Seven foes of explanation in system safety
  2. Seven foes of intervention in system safety (this post)
  3. Seven friends of explanation in system safety
  4. Seven friends of intervention in system safety


Michael Coghlan CC BY-SA 2.0 https://flic.kr/p/aZBrSZ

1. Haste

When responding to an unwanted event, there is often an urge for urgency to choose a solution. This meets a need to reduce anxiety associated with uncertainty. It often results in premature choice of intervention, without properly understanding the problem situation(s), the system components, interactions and boundary, and the context, and without considering other possible interventions. Effort is then focused on implementation, bringing relief that something is in progress. The intervention itself may be built on false assumptions about the problem and the evolving system in which it exists or existed.

2. Overreaction

A single unwanted event (such as this example), set against perhaps tens of thousands of successes, can trigger a system-wide change that makes work harder for many stakeholders, and perhaps riskier. When overreaction and haste are combined, efficiency is favoured over thoroughness, and critical understanding is missing. Secondary problems are common, and may well be worse than the original one. Because risk assessments often have a component focus, the secondary problems are not foreseen. The result can involve large compensatory adjustments.

3. Component focus

System safety concerns interactions between micro, meso and macro components or elements of socio-technical systems – human, social, organisational, regulatory, political, technical, economic, procedural, informational, and temporal. Everything is connected to and influences something. The system is more than the sum of its parts, and does something that no component can do. But organisations are formalised around functions and silos, and interventions are often at the level of components, instead of interactions and flows. Acting on individual components blinds organisations to the interactions between components, suboptimising the system by changing system-wide patterns, and creating unintended consequences elsewhere.

4. Over-proceduralisation

Work-as-prescribed – rules, procedures, regulations – is necessary to guide work-as-done and keep variation within acceptable limits. But work can rarely, if ever, be completely prescribed. Work-as-done takes work-as-prescribed as a framework for human work, adjusting and adapting to situations, drawing from and connecting disparate procedures, in a dynamic and creative way. But from afar, there can be a fantasy that work-as-done and work-as-prescribed are closer than is the reality, and nailing down more details and tightening regulatory requirements is a favoured intervention strategy. The result is more pressure and fewer degrees of freedom for necessary human performance adjustments.

5. Scapegoating

Blame – whether individual- or system-focused – is a natural human tendency following unwanted events or situations, in all aspects of life. Feeling or assigning some moral responsibility is natural and – in some cases – necessary. It is fundamental to the rule of law, especially to prevent or punish the intentional infliction of harm. But scapegoating singles out and mistreats a person or group for unmerited blame. This relates to component focus, above, since one component is unfairly blamed. The result may satisfy outrage or displace responsibility, without solving a wider or deeper problem, leaving the system vulnerable to similar patterns of dysfunction – a moral and practical problem.

6. Never/zero thinking and targetology

Never/zero thinking and targetology involve conflating a measurement and a goal (or anti-goal, in the case of accidents). With never/zero thinking, the implication is that there can be zero harm/zero accidents, while non-zero targets may refer to a maximum number of unwanted events in a given time frame, often with consequences for breaches. One intention is to motivate people to be safe and to avoid accidents. This misunderstands the nature of accidents, measurement, and human motivation. Unintended consequences tend to be hard to see from afar (e.g., under-reporting), resulting in blunt-end Ignorance and Fantasy, perhaps reinforced by green-light dashboards.

7. Campaigns

Organisational campaigns are a favoured top-down means of change, often triggered by new management. They are characteristic of the ‘done to’ and ‘done for’ modes of change and may concern, for example, safety culture, error management, team/TRM training, ‘hearts and minds’, behavioural safety, or high reliability organisations. This is often done via external training consultants. Unless the activity helps to understand work-as-done (including the messy reality) in the context of the system as a whole, the effects visibly wear off shortly after the campaign ends. Staff know this dynamic well; it has been done to/for them many times.


A note on intervention: The term intervene comes from the Latin intervenire, from inter- ‘between’ + venire ‘come’. To intentionally try to understand a situation, or take action to change it (e.g., improve it or prevent it from getting worse) is to intervene. While there may be no intention to change a situation while observing or measuring, that is very often an unintended consequence. 

If you want to learn more about how complex systems fail, in the most concise and easy-to-digest form, read this by Richard Cook.
