During the Second World War, the United States lost hundreds of planes in accidents that were deemed ‘pilot error’. Crash landings were a particular problem for the Boeing B-17 ‘Flying Fortress’. The planes were functioning as designed, and the pilots were highly trained, yet they made basic errors. In 1942, a young psychology graduate, Alphonse Chapanis, joined the Army Air Force Aero Medical Lab as its first psychologist. Chapanis noticed that the flaps and landing gear had identical switches that were co-located and operated in sequence. In the high-workload period of landing, pilots frequently retracted the gear instead of the flaps. This hardly ever occurred with pilots of other aircraft types. Chapanis fixed a small rubber wheel to the landing gear lever and a small wedge shape to the flap lever. This kind of ‘pilot error’ almost completely disappeared.
A few years later, in 1947, experimental psychologists Paul Fitts and Richard Jones analysed accounts of 460 errors made in operating aircraft controls, gathered through interviews and written reports. They noted that “It has been customary to assume that prevention of accidents due to materiel failure or poor maintenance is the responsibility of engineering personnel and that accidents due to errors of pilots or supervisory personnel are the responsibility of those in charge of selection, training, and operations.” Fitts and Jones took a different slant altogether. The basis for their study was the hypothesis that “a great many accidents result directly from the manner in which equipment is designed and where it is placed in the cockpit.” What had been called ‘pilot error’ was actually a mismatch between characteristics of the designed world and characteristics of human beings, and between work-as-imagined and work-as-done.
Fitts and Jones considered a range of problems, including operating the wrong control, failing to adjust a control properly, forgetting to operate a control, moving a control in the wrong direction, unknowingly activating a control, and being unable to reach a control when needed. The flap-gear substitution error, and many other ‘pilot errors’, were actually problems of cockpit design. They concluded: “Practically all pilots of present day AAF aircraft, regardless of experience or skill, report that they sometimes make errors in using cockpit controls. The frequency of these errors and therefore the incidence of aircraft accidents can be reduced substantially by designing and locating controls in accordance with human requirements” (p.2). They went on to specify design measures for controls and displays (concerning standardisation, simplification, sequencing, interlocks, and other aspects of compatibility of controls with human characteristics and expectations).
These and other studies brought into focus the ‘obvious fact’ that human performance cannot be separated from the design of tasks, equipment and working environments. We can’t just train and supervise human performance. We have to design for it. Accidents associated directly with cockpit design are now extremely rare, and in 2017 there were no passenger deaths from flights in commercial passenger jets.
The birth of a discipline
Research in the US and UK concerning real work in real environments during and after WWII formed the beginnings of the discipline that was termed ‘human factors’ (US) and ‘ergonomics’ (UK). It was not the intention of early researchers to form a new discipline. Rather, “the intention was much more modest, namely, to facilitate discussion, information exchange and collaboration between scientists working across a range of specialisms” (Waterson, 2016). These specialisms were anatomy, physiology, psychology, industrial medicine, industrial hygiene, design engineering, architecture and illumination engineering (Murrell, 1965).
Over time, human factors/ergonomics (HF/E) became a distinct discipline, with its own societies. The first was the Ergonomics Research Society in the UK in 1950 (now the Chartered Institute of Ergonomics and Human Factors), followed by the Human Factors Society of America in 1957.
Despite the different names for the discipline, a formal definition has been agreed via the International Ergonomics Association – the umbrella association for national HF/E societies and associations. The definition is accepted by member societies around the world:
“Ergonomics (or human factors) is the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data and methods to design in order to optimize human well-being and overall system performance.” (International Ergonomics Association)
A simpler definition was provided by the late John Wilson, who defined ‘systems ergonomics and human factors’ as follows (extract):
“Understanding the interactions between people and all other elements within a system, and design in light of this understanding.” (Wilson, 2014, p.12)
Simpler still, HF/E is sometimes referred to as ‘design for human use’.
HF/E takes a scientific approach to understanding and design, including the generation and application of associated theory, principles, data and methods. Decades of scientific research in a range of contexts have enabled a sophisticated understanding of human needs, limitations and capabilities, influences on human performance and wellbeing, human influences on system performance, and patterns of interaction between human and other system elements – physical, technical, informational, social, organisational, political, and economic.
HF/E focuses on the design of these interactions. This differentiates HF/E from other design and engineering disciplines. For industrial applications, a good shorthand for this is ‘work’. So HF/E seeks to optimise the design of work, but with a focus on work-as-done, and not simply work-as-imagined (see also EUROCONTROL, 2016).
Interactions occur at different levels. At a micro level, we have basic interactions such as pulling a lever, pressing a button, turning a dial, or hearing an alarm. At a meso level, interactions combine, bringing more complexity, such as communication and coordination between a pilot, co-pilot, and cockpit. At a macro level, the number of elements and interactions, and associated complexity, increases further, perhaps expanding to air traffic controllers, air navigation equipment, ground staff, airport, airspace, management, regulation, etc. As the lens widens, so does the number of stakeholders, and the number of goals, needs and system or design requirements that need to be considered.
These interactions occur in a context, and context is critical to HF/E. If I turn the wrong burner dial on my stove (which I do, very often), it is not a problem. I simply turn it off, and now I know the correct dial to turn. If I want to be sure, I can bend down to look at the little diagram, but often I can’t be bothered. A similar action by B-17 pilots resulted in retracting the gear instead of the flaps, and in accidents. A similar action by an anaesthetist might inadvertently turn off a continuous-flow anaesthetic machine because of a badly positioned power switch. If the consequences of turning the wrong burner dial were more severe, I would check the little diagram more often, but I would still make mistakes, mostly because the layout of the burners is incompatible with the layout of the dials, which look identical and are co-located. And if the consequences were indeed more severe, cooker designers would be forced to design dials to be compatible with burners, along with other designed safety features.
HF/E in practice is a blend of craft, engineering and applied science. The approach tries to make system interaction and influence visible. It uses methods for data collection, analysis and synthesis, to understand and map system interaction at every stage of the life-cycle of a system or product. HF/E can therefore help in the design of interactions in the context of:
- artefacts (e.g., equipment, signs, procedures)
- designed environments (e.g., airport layout, airspace design, hospital design, lighting)
- planned organisational activity (e.g., supervision, training, regulation, handover, communication, scheduling)
- work and job design (e.g., pacing, timing, sequencing, variety, rostering, critical tasks)
- emergent aspects of organisations and groups (e.g., culture, workload, trust, teamwork, relationships).
I like to think of human factors and ergonomics as rooted – to some extent – in four kinds of thinking:
- systems thinking, including an understanding of system goals, system structure, system boundaries, system dynamics and system outcomes;
- design thinking, including the principles and processes of designing for human use;
- humanistic thinking, emphasising human agency, awareness, wholeness, intention, meaning, values, choice, and responsibility; and,
- scientific thinking, purposeful thinking that aims to enhance scientific understanding by problem specification, hypothesising, predicting, observing, measuring, and testing.
The ultimate goals of this design activity are to optimise human well-being and overall system performance. Some argue that this joint ‘and’ purpose characterises the unique holistic nature of HF/E (e.g., see Wilson, 2014). In practice, it means optimising for several goals concerning the effectiveness of purposeful activity (such as efficiency, productivity, maintainability) and particular human values (such as safety, security, comfort, acceptance, job satisfaction, and joy). Some goals are usually of higher priority than others for particular applications, but they often conflict and compete, requiring practical trade-offs and compromises.
Since the 1950s, HF/E specialists – practitioners and researchers – have come from various academic backgrounds and increasingly a wide variety of professional backgrounds and industries. They work with all sorts of people at all levels: consumers and service users, front-line and support staff, supervisors and senior management, regulators and policy makers in almost all industrial sectors (see Shorrock and Williams, 2016, for an overview).
Human factors/ergonomics is booming in certain sectors, where success seems to have bred success. ‘Ultra-safe’ sectors such as air traffic management, rail and nuclear power in the UK have well-developed HF/E capabilities. NATS – the UK’s en-route air traffic control provider – has a human factors department that has been staffed by 20-30 full-time HF/E specialists and psychologists over the past 15 years or so. The Rail Safety and Standards Board (RSSB) and the Health and Safety Executive have long had mature and effective human factors capabilities, as have the nuclear and defence industries. All provide HF/E services in all aspects of their sectors, from concept design through detailed design, prototyping and simulation, construction and commissioning, and operation and maintenance, to decommissioning.
But the success has not been evenly spread, and has not matched need. It often appears that those sectors with the greatest need – healthcare, road transport, and farming, for example – benefit least in terms of HF/E practitioners in applied roles. Seventy years after Fitts and Jones’ seminal reports on controls and displays, quite basic design problems remain in many industries.
In healthcare, for instance, different medicines look alike and sound alike, despite official guidance informed by HF/E. There are thousands of machines with design problems as basic as inconsistent number formats; in a single hospital, one can find pumps with keypads laid out like a telephone, like a calculator, or like a keyboard. This shows how far ahead of its time the work of Chapanis was in the 1940s.
In fact, it was Chapanis who designed the standard telephone numerical keypad configuration that is in use today on every telephone and smartphone around the world. He tested six configurations of buttons: two vertical, two horizontal, and different three-by-three arrangements. All of these variations can still be found in safety-critical equipment. And most of the problems in using controls that were analysed by Fitts and Jones can be found in safety-critical equipment used for mining, oil and gas extraction, agriculture, forestry, fishing, manufacturing, construction, recycling, digital products, telecommunication, transport, and healthcare. There may be several reasons for this.
[Image: Just three of the numerical interfaces that a clinician may use.]
One reason may be a failure of branding and marketing. HF/E specialists have not come from marketing backgrounds and are not typically good at it. For a start, HF/E is a discipline and profession with two names, seen as equivalent within the discipline but different in industry and the media, with ‘human factors’ associated with accidents and ‘ergonomics’ with design (Gantt and Shorrock, 2016). Its focus on ‘system interactions’ appears to be lost on many outside the profession. It doesn’t have a clear elevator pitch, and it is not instantly recognised and understood by the public in the way that HF/E specialists would like it to be (with ‘ergonomics’ being associated with office furniture, and ‘human factors’ with nothing much).
A second reason may be a failure of ambition and lobbying. Sherwood-Jones (2009) argued that “many ergonomists are committed to an entirely technical career and have no aspirations to management. … The consequence of staying technical is of course that you will be ignored, overruled and brought in when it is too late to do anything useful, but not too late to demonstrate that ergonomics can fail.” There are few (often no) qualified and experienced HF/E specialists on company boards, in national regulators (even in aviation), or among policy makers, let alone in governments. While aviation is often seen as a paragon of HF/E, only one national aviation administration maintains a high level of expertise and a research programme in the discipline: the United States Federal Aviation Administration. With a few exceptions, it seems that HF/E specialists have been happiest at the micro and meso levels of interaction design, and not at the macro level, despite the systemic adverse influence of top-down interventions on system and human performance (e.g., government performance targets; see Shorrock and Licu, 2013).
Shortage of HF/E specialists
A third reason may be a shortage of qualified HF/E professionals (accredited, certified, registered or chartered by relevant societies and associations) situated in industry and government agencies. This is also associated with limited demand and a shortage of HF/E courses. In many countries, there are few or no HF/E professionals even – or especially – in sectors with the highest number of ‘avoidable deaths’.
Taking the UK as an example, in England there are 233 National Health Service Trusts – providers of urgent and planned health care (‘secondary care’). The NHS in England employs over 1 million staff, with planned expenditure for 2017/18 of over £123bn. It espouses a focus on patient safety, and its focus areas for 2017/18 clearly require HF/E expertise, including improving investigations, reducing medication error, and developing an “approach to patient safety [that] is widely recognised as world-leading” (NHS England, 2018). Yet the number of qualified full-time HF/E specialists in NHS England care providers can be counted on one hand. In fact, only one of the 233 NHS Trusts employs any Chartered Ergonomist and Human Factors Specialist.
There is some excellent training for clinicians in aspects of behavioural human factors, such as team training, team resource management and non-technical skills, and many Trusts have their own advanced simulation facilities and staff. This does not, however, address the underlying design problems that remain; at best, it may provide awareness of these problems, and compensatory behavioural routines.
Despite the shortage of HF/E specialists, HF/E is becoming more popular. Over the last decade or so, the term ‘human factors’ and HF/E issues have gained currency with an increasing range of people, professions, organisations and industries. This is a significant development, bringing what might seem like a niche discipline into the open, to a wider set of stakeholders. In healthcare, there is now significant participation in discussions about ‘human factors’, which can be seen especially on Twitter. The same can be seen in other industries, especially new sectors such as web operations and engineering. Front-line workers know that HF/E is relevant. It is fairly obvious that work should be designed for human needs and characteristics. The difficulty seems to be in securing commitment and resources at senior levels.
A two-pronged solution
The criticality of HF/E is not in dispute. So how can we gain more traction on designing for human wellbeing and system performance? One way, of course, is more training opportunities. Another is more lobbying for HF/E posts in commercial, governmental, and intergovernmental organisations. Certain roles, typically involving wide and deep content and method expertise, will always require highly qualified and experienced HF/E practitioners (e.g., certified, registered, chartered). These specialists are now in higher demand, and having greater impact, in medical device design and pharmaceuticals, for instance. But this has been tried for decades, with limited success.
So the other half of the solution is to spread HF/E to others who, while not HF/E specialists as such, might be familiar with certain aspects of HF/E theory and method, practise certain aspects of HF/E design, or advocate or evangelise HF/E principles. The founders of HF/E were not HF/E specialists then (and were probably too specialised to ‘qualify’ as HF/E specialists today!). So this is where you come in. If the idea of designing for human use to optimise system performance and human wellbeing appeals to you, then now is a good time to think about how you might learn more, and integrate HF/E into your practice.
EUROCONTROL (2016). HindSight: Work-as-imagined and work-as-done. Issue 25. Brussels: EUROCONTROL.
Fitts, P.M. and Jones, R.E. (1947). Analysis of factors contributing to 460 “pilot error” experiences in operating aircraft controls. Dayton, OH: Aero Medical Laboratory, Air Materiel Command, Wright-Patterson Air Force Base.
Gantt, R. and Shorrock, S.T. (2016). Human factors and ergonomics in the media. In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human wellbeing in the real world. CRC Press.
Murrell, K.F.H. (1965). Ergonomics: man in his working environment. London: Chapman and Hall.
NHS England (2018). Patient safety. Accessed on 10/01/18 at https://www.england.nhs.uk/five-year-forward-view/next-steps-on-the-nhs-five-year-forward-view/patient-safety/
Sherwood-Jones, B. (2009). Usability assurance (blog). Accessed on 10/01/18 at http://processforusability.blogspot.co.uk/2009/10/usability-assurance.html
Shorrock, S. and Licu, T. (2013) Target culture: Lessons in unintended consequences. HindSight: Safety versus Cost. Issue 17. 10-16. Brussels: EUROCONTROL.
Shorrock S. & Williams, C. (2016). Human factors and ergonomics in practice: Improving system performance and human wellbeing in the real world. CRC Press.
Waterson, P. (2016). ‘Ergonomics and ergonomists’: lessons for human factors and ergonomics practice from the past and present. In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human wellbeing in the real world. CRC Press.
Wilson, J.R. (2014). Fundamentals of systems ergonomics/human factors. Applied Ergonomics, 45(1), 5-13.