The Varieties of Human Work

Understanding and improving human work is relevant to most people in the world, and a number of professions are dedicated to improving human work (e.g., human factors/ergonomics, quality management, industrial/work/organizational psychology, management science). The trouble with many of these professions is that the language and methods mystify rather than demystify. Work becomes something incomprehensible, hard for those who actually design and do the work to think about and improve. Recently, some notions that help to demystify work have gained popular acceptance. One of these is the simple observation that how people think that work is done and how work is actually done are two different things. This observation is decades old in human factors and ergonomics, dating back to the 1950s in French ergonomics (le travail prescrit et le travail réalisé; Ombredanne & Faverge, 1955) and arguably to the 1940s in the analysis of aircraft accidents in terms of cockpit design (imagination vs operation). Early ergonomists realised that the analysis of work could not be limited to work as prescribed in procedures, etc. (le travail prescrit), nor to the observation of work actually done (le travail réalisé). Both have to be considered. But these are not the only varieties of work. Four basic varieties can be considered: work-as-imagined; work-as-prescribed; work-as-disclosed; and work-as-done. These are illustrated in the figure below, which shows that the varieties of human work usually overlap, but not completely, leaving areas of commonality and areas of difference. Here we will consider each variety, one by one.


The varieties of human work.


When we think about human work, we typically think about the things that we actually do. But in thinking about what we or others do, we have already uncovered another important type of work – the work that we imagine. Work-as-imagined is both the work that we imagine others do and the work that we imagine we or others might do, currently or in the future.

The imagination of human work takes place within organisations, between organisations, and from outside of organisations. If we take the example of operational staff (e.g., clinicians, pilots, train drivers, control room operators, road maintenance workers), then the policy makers in national and local government, along with regulators and inspectors, and the general public and various groups (e.g., patient groups, transport user groups, neighbourhood associations) will have an imagination about the work of the operational staff (as well as the work of middle and senior managers within facilities or companies). Within a facility or company, senior management and middle management, along with non-operational specialists and support staff, will also have an imagination of the work of operational staff. Operational staff will have an imagination of the work of all of the above, and of other operational and non-operational staff in different specialities, inside or outside of the organisation.

To a greater or lesser extent, all of these imaginations – or mental models – will be wrong; our imagination of others’ work is a gross simplification, is incomplete, and is also fundamentally incorrect in various ways, depending partly on the differences in work and context between the imaginer and the imagined.

Work-as-imagined is fed by three basic sources: past experience of work-as-done; knowledge and understanding of work-as-prescribed; and exposure to work-as-disclosed. All three of these are problematic. Past experience of work-as-done can be a useful foundation for work-as-imagined, but it can quickly become outdated with changes in demand, resources, constraints, and changes to how the work works. This outdatedness can be a blind spot. Work-as-prescribed is difficult to understand even for those doing the work, let alone those who do not; it is under- or over-specified, and often far from reality. Work-as-disclosed, meanwhile, is partial and often biased.

Work-as-imagined also applies to our imagination of future work, to change – adaptation or transformation. Most human work begins with work-as-imagined, since someone has an imagination about how work could be done, unless new ways of working are discovered accidentally. In practical terms, this future work-as-imagined includes changes to tasks, workflows, jobs, team and organisational structures and processes, and technology. These changes are the subject matter of human factors and ergonomics (i.e., interaction design), but of course changes mostly happen with no HF/E input.

Similarly, work-as-imagined applies to what we think we would do in a scenario, which may well be different to what we would really do. Martin Bromiley (2016) reflected on a tragic incident where his wife Elaine Bromiley died in a routine operation. He said that “As clinicians the world over have reviewed my late wife’s case, many have stated that ‘I wouldn’t have done what they did.’ Yet place those same people in a simulated scenario with the same real-world disorder, which deteriorates into the same challenging moment, and most actually do.” Similarly, Hollnagel (2016) stated that, especially when something goes wrong, “work-as-done differs from what we imagine we would do in the same situation, but thinking about it from afar and assuming more or less complete knowledge.”

Especially in cases where decisions about the work of others (or that affect the work of others) are made by those whose imagined view of work is incorrect, work-as-imagined can become very problematic. In such cases, inadequate involvement of those who do the work (work-as-done), and inadequate analysis and synthesis of the evolving context of work, often leads to badly designed work and work environments, and unintended consequences, including adaptations to work-as-done to overcome constraints and to work around other unintended consequences.

Much analysis work is done on human work, but this is often done on this imaginary variety. In the context of oil and gas engineering, Miles & Randle (2016) stated that “The design and engineering of new assets is usually contracted out … These contractors may not receive direct feedback on the success of, or problems with, their previous designs in the field, and most engineers designing the asset will not have worked on or even visited an operating installation.” The analysis of imagined work can be seen in the use of many ornate methods, used by those who are distant from the work or have inadequate access to those who do the work – the field experts. Hence, what we are deconstructing and analysing, formally or informally, is often an imagined, abstract system, not a real, concrete one.


Our imagination of human work is not necessarily the same as the way that work is prescribed. Work-as-prescribed is the formalisation or specification of work-as-imagined, or work-as-done, or work-as-disclosed, or some combination of the three. It takes on a number of forms in organisations, including: laws, regulations, rules, procedures, checklists, standards, job descriptions, management systems, and so on. Some of these are more task-oriented (e.g., procedures, checklists) while others are more job-oriented (e.g., job descriptions). While there are infinite varieties of work-as-imagined, there is a limited variety of work-as-prescribed, with each task having one or a small number of prescribed methods.

Work is often prescribed by more senior members of an organisation (supervisors and middle managers), and sometimes by those at a greater distance from the work itself, for instance specialists with little contact with the front line, or external organisations, such as regulators (e.g., work time limits) and policy-makers (e.g., four-hour target for accident and emergency admissions). It is not unusual to see work prescribed far away from the actual work by those who have never actually performed the task. Catchpole and Jeffcott (2016) wrote that, in healthcare, “You will quickly find that there is a difference between policy and practice…and that administrators may not be aware of the latter.” Work is often also prescribed by those who do the work (e.g., working groups), and by those who have previously done the work, but no longer do so.

Work-as-prescribed is unique among the four key varieties in that it is assumed to be the safe and the right way to work. As such, it is subject to risk assessment, and risk controls are incorporated, as well as other measures to control and standardise work-as-done. These controls may involve soft constraints, which are possible to overcome, albeit at a cost (e.g., rules), or hard constraints, which are difficult or impossible to overcome (e.g., forcing functions). Work-as-prescribed is also often the ultimate arbiter of whether performance is satisfactory.

The problem is, it is usually impossible to prescribe all aspects of human work, even work that is well understood, except for extremely simple tasks (Hollnagel et al., 2013). First, there are many ways in which work can be done. Even if it is prescribed in one way, it could and will probably be done in other ways, even if we just consider small differences in implementation. Second, work often or usually incorporates task switching, between different aspects of different procedures. This meta-level of work is hard or impossible to capture in prescribed work. Third, the pre-conditions for, and conditions of, work cannot all be foreseen, let alone guaranteed. Assumed system conditions – staffing levels, competency, equipment, procedures, time – are often somewhat more optimal than those found in practice. Fourth, it is just not possible to articulate, especially in a linear written form, the precise way that work is done in a way that is usable or that can reasonably be followed. Pariès and Hayward (2016) noted that “in most current industrial processes, strict adherence to preestablished action guidelines is unattainable, incompatible with the real efficiency targets, and insufficient to control abnormal situations.” They say that many requests from their clients derive from “difficulty in reconciling this ‘old truth’ with the inflation of applicable aviation safety standards and the compliance expectations of safety management system frameworks.” Work-as-prescribed is therefore usually under-specified even in its most complicated forms. Procedures, standards, regulations, etc., lack the detail and richness of actual work. And the more specified the prescribed work, the more incorrect it is likely to become in messy work situations (a good example being the checklists in QF32). In other cases, there may be relatively little prescription.
As an example from an industry that is almost the opposite to aviation in terms of prescription – web operations and engineering – Allspaw (2016) remarked that there is “no singular overarching regulatory, standards, or policy-making body for these services”.

Often, we tend to think that work-as-prescribed is basically the correct way to work. In some cases, this is justified, but in other cases it is not. The famous ‘work to rule’ strategy (used by the National Union of Railwaymen against British Rail) involves a tactical realignment of work-as-done with work-as-prescribed. This has become a standard form of industrial action, also known as a ‘white strike’. The result is that the system cannot function effectively, thus demonstrating the limits of work-as-prescribed. Work to rule has been described as a decision to “Give the rules a meaning which no reasonable man could give them and work to that” (Sir John Donaldson, Secretary of State v. ASLEF (No. 2) [1972], at 959). Similarly, Lord Denning stated that “Those rules are to be construed reasonably…They must be construed according to the usual course of dealing and to the way that they have been applied in practice.” The final arbiter for what is reasonable is, according to these statements, work-as-done. Work-as-done, therefore, may be assumed to be reasonable, for all intents and purposes. Of course, when accidents happen, this perspective reverses, such that work-as-prescribed is reasonable and work-as-done is not.

Work-as-prescribed is very often a basis for various sorts of analysis, often combined with work-as-imagined or knowledge of work-as-done. Like work-as-imagined (but unlike much work-as-done), work-as-prescribed can be deconstructed. It can also be examined and discussed. Examples include risk assessment, task and job analysis, human error analysis, behavioural safety, and so on.

Field expert involvement in the development and use of work-as-prescribed is critical to limit the gap between work-as-prescribed and work-as-done. Two problems are common here. The first is a lack of field expert involvement, e.g., job descriptions written by HR, or procedures written by a procedure department with little contact with operations. The second problem is that those who are involved are constrained in their ability to communicate with others who are affected. Those who do the work may then believe that no field experts were involved in the prescribed work, and may lack the knowledge or channels to feed back flaws in prescribed work.


In addition to the way that we imagine work, and the way that work is prescribed, we can add a third variety of human work: work-as-disclosed (or -explained, -expounded, -exemplified, or -espoused). This is what we say or write about work, and how we talk or write about it. It may be simply how we explain the nitty-gritty or the detail of work, or espouse or promote a particular view or impression of work (as it is or should be) in official statements, etc. Work-as-disclosed is typically based on a partial version of one or more of the other varieties of human work: work-as-imagined, work-as-prescribed, and work-as-done. But the message (i.e., what is said/written, how it is said/written, when it is said/written, where it is said/written, and who says/writes it) is tailored to the purpose or objective of the message (why it is said/written), and, more or less deliberately, to what is thought to be palatable, expected and understandable to the audience. It is often based on what we want and are prepared to say, in light of expectations and imagined consequences.

How we talk or write about work is not necessarily the same as work-as-prescribed, since prescribed work may not be a good basis for how we disclose or explain how things work. As mentioned earlier, work-as-prescribed may be under-specified or over-specified, or just not how things are really done. So a supervisor might explain to a newcomer ‘how things work around here’ (work-as-done), in a summary form, or in a way that is very different to how work is officially prescribed.

But work-as-disclosed is also not necessarily the same as work-as-imagined, because what we think or believe may be different to what we are prepared to say, especially to outsiders. John Wilkinson (2016), a former regulator in the UK Health and Safety Executive, noted that “People choose what they want to say to regulators … The regulator can start to believe that ‘work-as-imagined’ (what the ideal organization does to work safely) should always match ‘work-as-done’ (the ‘real world’ of business). The right position lies somewhere in-between”. Similarly, Cook and Cooper (2016) stated that “many well-intended shortcuts and deficient workplace practices are routinely not detected during audits. The outcomes of this can be an increasing gap between work-as-imagined and work-as-actually-done, and major system failures may be associated with this gap”.

Another example is what a staff member says to a senior manager about work, which may be different to what really happens. There are many reasons not to express how work is really done. For instance, staff may fear that resources will be withdrawn, constraints may be put in place, sanctions may be enacted, or safety margins or buffers will be dispensed with. Hence, secrecy around work-as-done can be a self-protective measure against the drive to improve efficiency at the expense of other goals (such as safety and well-being).

Work is disclosed or explained (and expounded, exemplified, espoused) by many people, both those who do the work, and those who do not. Some work-as-disclosed is therefore based on (or disclosed with) intimate knowledge of work-as-done. A surgeon and an anaesthetist/anaesthesiologist may, for instance, advise a patient about a surgical procedure. What is said will reflect what is done, but only at a high level. Other work-as-disclosed is based only on an imagination of the work (work-as-imagined), or else what others say about the work (work-as-disclosed by third parties). A corporate communications specialist in an airline, air traffic control organisation or professional association may, for example, explain to the news media or via social media the work of pilots or air traffic controllers. Both of these direct and indirect forms of work-as-disclosed will involve simplifications.

In other instances, work-as-disclosed may not deliberately simplify. An example is a train driver explaining his or her work to a human factors specialist/ergonomist who is undertaking some form of task analysis. Here, the driver will be thinking about his or her work in detail, and explaining it perhaps in more detail than ever before, except perhaps to a new train driver who is undergoing on-the-job training.

People will sometimes modify or limit what they say about work-as-done based on consequences. In an environment where people are punished for trade-offs, workarounds, and compromises that staff believe to be necessary to meet demand, the overlap between work-as-disclosed and work-as-done may be deliberately reduced. Some work-as-disclosed is explicitly designed to reassure, perhaps to provide a basis for work-as-imagined in others that aligns with work-as-prescribed (e.g., “We fully comply with all relevant rules and procedures”). In a healthcare context, Catchpole and Jeffcott (2016) wrote that “Direct observation usually illustrates a further difference between what is said and what is done.” The celebrated US anthropologist Margaret Mead is credited with saying “What people say, what people do, and what they say they do are entirely different things” (there is no written evidence that she ever said this, but it is reflective of aspects of her work).

If there is a culture that is mutually experienced as fair and trusting, then there is a good chance that the overlap between work-as-disclosed and work-as-done will be large. In such cases, the areas of lack of overlap may be limited to inconsequential minutiae, or aspects of work that are not easily available to conscious inspection from the inside, bearing in mind that much human work is based on unspoken assumptions and norms, and unconscious patterns of activity.

Formal methods for understanding work via work-as-disclosed include individual and group interviews, using a wide variety of more or less structured methods from psychology, human factors and ergonomics, sociology, ethnography, etc. Some of these methods are used in situ along with work-as-done (e.g., think aloud) but most are used remote from work-as-done (e.g., critical incident technique; focus groups). In some cases, assurances of confidentiality may be required to increase the overlap between work-as-disclosed and work-as-done. In all cases, one needs to be mindful that what is said may well differ from what is done.


Work-as-done is actual activity – what people do. It is characterised by patterns of activity to achieve a particular purpose in a particular context. It takes place in an environment that is often not as imagined, with multiple, shifting goals, variable and often unpredictable demand, degraded resources (staffing, competency, equipment, procedures and time), and a system of constraints, punishments and incentives, which can all have unintended consequences.

Work-as-done is mostly impossible to prescribe precisely and is achieved by adjustments, variations, trade-offs and compromises that are necessary to meet demand. These adaptations are based on operational know-how, but often have not been subject to formal analyses such as risk assessment; such analysis struggles to handle them. While the adaptations are often necessary to meet demand, they can sometimes put the system and practitioners at risk. This raises ethical problems, according to van Winsen and Dekker (2016), who stated that “We need to ask ourselves, if it is ethically right that operators routinely need to work around or loosely interpret many official procedures…to get their work done?”. One should not get the impression that work-as-done is necessarily the right way. In the context of the rail industry, O’Flanagan and Seeley (2016) noted that “sometimes the motivations for the way that the work is actually done are not laudable.” These motivations may arise from various sources, of course, at different levels within and outside the company.

Still, gaps between work-as-prescribed and work-as-done may be known, and accepted and even encouraged – at least implicitly – at supervisory and local management levels, while demand is met. However, these gaps are usually not disclosed liberally and may not be imagined widely. When things go wrong, the adaptations, and the gaps between the varieties of human work, are subject to scrutiny. Hollnagel (2016) stated that we account for the differences “by inferring that what people actually did was wrong – an error, a failure, a mistake – hence that what we thought they should have done was right. We rarely consider that it is our imagination, or idea about work-as-imagined, that is wrong and that work-as-done in some basic sense is right.”

In light of the risks of disclosing all aspects of work-as-done, workers may keep aspects of it secret, or protect their working environment so as not to expose it. Aside from the risk of sanctions, the reason for this is a suspicion that if a decision maker sees a snapshot of work-as-done, then they may generalise from that snapshot, or make assumptions, and change the design of the work system, perhaps changing work patterns (e.g., shift systems or work times, or activities), altering team structures, reducing resources, or tightening safety buffers or margins, usually in order to increase efficiency.

Work-as-done can be examined via observation, but this is challenging. It is particularly prone to change on inspection. This is a known flaw of, for instance, behavioural safety schemes and safety audits, especially those that focus on negatively perceived behaviour, or are perceived as checks of compliance or non-compliance with work-as-prescribed. It can also be very difficult to understand (e.g., the work of a radar controller or radiologist), or unsafe to observe (e.g., military personnel).

Work-as-done is the most important and yet most neglected variety of human work. It is the variety that outsiders (those who do not do the work) pay least attention to. Much attention is paid to the other varieties of work, and this would not be a problem if it were not for the fact that these other varieties are so often mistaken for, or used as a proxy for the real thing: work-as-done. Hollnagel (2016) noted that “We lack models based on what people actually do, on the recurrent patterns of behavior.” This is probably more true of some industries than others. By involving field experts in any activity to understand work, and by getting close to where the work is done, we can help to close the gaps, but there will likely always be differences, and knowing this keeps us humble and aware that our understanding is limited, never complete. Human factors and ergonomics practitioners often find themselves in a privileged but very difficult position, as go-betweens and translators, consciously trying to understand and explain the gaps between the varieties of work to help improve system performance and human wellbeing, without unintentionally bringing about harm along the way.


The early ergonomists were right. The analysis of work cannot be limited to work as prescribed in procedures, etc. (le travail prescrit), nor to the observation of work actually done (le travail réalisé). Similarly, it cannot be limited to work as we imagine it, nor to work as people talk about it. Only by considering all four of these varieties of human work can we hope to understand what’s going on.


Allspaw, J. (2016). Human Factors and Ergonomics Practice in Web Engineering and Operations: Navigating a Critical yet Opaque Sea of Automation (Chapter 25). In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. Boca Raton, FL: CRC Press.

Catchpole, K. and Jeffcott, S. (2016). Human Factors and Ergonomics In Healthcare: Challenges and Opportunities (Chapter 13). In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. Boca Raton, FL: CRC Press.

Cook, B. and Cooper, R. (2016). Human Factors Practice in Military Aviation: On Time and On Target (Chapter 16). In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. Boca Raton, FL: CRC Press.

Hollnagel, E., Leonhardt, J., Shorrock, S. and Licu, T. (2013). From Safety-I to Safety-II: A White Paper. Brussels: EUROCONTROL Network Manager.

Hollnagel, E. (2016). The Nitty-Gritty of Human Factors (Chapter 4). In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. Boca Raton, FL: CRC Press.

Miles, R. and Randle, I. (2016). Human Factors and Ergonomics Practice in the Oil and Gas Industry: Contributions to Design and Operations, (Chapter 17). In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. Boca Raton, FL: CRC Press.

O’Flanagan, B. and Seeley, G. (2016). Human Factors/Ergonomics Practice in the Rail Industry: The Right Way, the Wrong Way and the Railway (Chapter 14). In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. Boca Raton, FL: CRC Press.

Ombredanne, A. & Faverge, J.-M. (1955). L’analyse du travail. Paris: PUF.

Pariès, J. and  Hayward, B. (2016). Human Factors and Ergonomics Practice in Aviation: Assisting Human Performance in Aviation Operations (Chapter 15). In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. Boca Raton, FL: CRC Press.

Secretary of State v. ASLEF (No. 2) [1972] 2 All E.R. 949 at 959 (N.I.R.C.) per Sir John Donaldson. Cited in William Twining and David Miers (2010). How to Do Things with Rules. Cambridge University Press. p. 41.

Wilkinson, J. (2016). Human and Organisational Factors in Regulation: Views from a Former Regulator (Chapter 20). In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. Boca Raton, FL: CRC Press.

van Winsen, R. and Dekker, S. (2016). Human Factors and the Ethics of Explaining Failure (Chapter 5). In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. Boca Raton, FL: CRC Press.


Just culture: Who are we really afraid of?

Douglas Sprott CC BY-NC 2.0


When we think about just culture, we usually think about accidents and incidents, associated ‘honest mistakes’ and ‘negligence’ (by whatever name), as well as official responses to these, at company and judicial level. The notion of just culture is driven partly by fear: fear of being judged and blamed, especially fear of being blamed unfairly. The fear is felt most strongly by operational staff, who are at the sharp end of organisations and have sometimes faced disciplinary or legal action for their parts in accidents. This issue was discussed recently at a conference on just culture and the judiciary. The keynote speaker was Martin Bromiley, who talked about just culture in healthcare in the UK (Bromiley, 2016a). He and others raised the issue, both formally and informally, that judgements do not just come from the judiciary. After many hundreds of hours spent talking to thousands of people in interviews and focus groups – from operational staff to Board members and judiciary – about aspects of safety and fairness, the question that came to my mind was “Who are people really afraid of?”…and why?

Most of my time has been spent talking to operational staff (e.g., air traffic controllers and technicians, and others in different industries). Since they do the sharp-end operational work, it is important to understand what they think and how they think, and how things really work at the sharp end. But I have also spent much time talking to other stakeholders: specialist and support staff, middle managers, senior managers/directors, national judiciary, and policy makers. Whenever talk comes to issues of fairness or justice, there is usually a ‘them and us’ tone to discussions. To front-line staff, it’s the actions of ‘us’, and the responses of ‘them’.

But who is ‘them’ really? The ultimate judge of whether an action constitutes gross negligence is the judiciary. Of course, people have been prosecuted for gross negligence, such as flying while drunk. People have sometimes been prosecuted and even convicted for what many would now term ‘honest mistakes’, perhaps actions or decisions that many others in the same situation, with similar training or experience, could have made at some time or other. But in aviation, these cases are relatively rare. In the past 40 years, the number of pilots, engineers and air traffic controllers convicted is low. Prominent cases involving commercial flights include: Zagreb BA476 & JP550 (1976), Athens SWR213 (1979), Mt Crezzo ATI460 (1987), Habsheim AF296 (1988), London Heathrow BA012 (1989), Schiphol DAL039 (1998), Yaizu JAL907 (2001), Linate MD87 & C525 (2001), Überlingen (2003), Palermo TUI1533 (2005), Athens HCY522 (2005), and Mato Grosso GLO1907 & N600XL (2006) (Smoker and Baumgartner, 2016). Following other accidents, individuals have faced charges, which were later dropped, and others have been acquitted. In aviation, relatively few judgments are made by the judiciary against front-line staff, especially for accidents where there is no sign of reckless or grossly negligent behaviour (what constitutes ‘gross negligence’ is a matter of legal judgment and also varies between states; some do not differentiate between ‘negligence’ and ‘gross negligence’, while other countries have special exceptions for some professionals for ‘minor’ cases of ‘negligence’).

Indeed, people in air traffic management rarely mention the judiciary as a source of significant concern in discussions about just culture in a normal work setting. Even when asked directly, many people have not given prosecutors and judges much thought. People are often unsure what would happen if they got caught up in a prosecution. This is not to say the role of the judiciary is unimportant or that legal and other support in the case of a prosecution is unimportant; it is very important. The judgement of the judiciary is just not something that seems to weigh heavily on people’s minds when the topic of ‘just culture’ arises in discussions about safety – at least in air traffic management. This would be different for other professionals in transportation (e.g., pilots and some train drivers) who travel through different jurisdictions, and for healthcare professionals, who face complaints from patients and are arguably far more exposed.

People in air traffic management also do not often talk about the role of senior management with respect to just culture. The Board is responsible for policy (including just culture policy) but does not frequently make judgements about the performance of individuals. It is not the judgement of the CEO that people seem to fear, nor usually that of the Safety Director (where one exists). In some cases, the HR Director may be a person of concern, but only if judgements about performance are passed to them from someone else, for instance a Director of Operations. Directors of Operations usually come from an operational background themselves (though they rarely remain operational, partly due to lack of time), but the Director of Operations is usually only of concern if judgements about performance are passed to him or her from someone else, for instance via an investigation.

Indeed, the main focus of discussion with operational staff about just culture usually concerns investigations and investigators. Being blamed in the context of a safety investigation is contrary to the purpose of a safety investigation, partly because it is deathly for an occurrence reporting system, and for any subsequent investigations and learning. Trust is built up slowly between people, especially in organisations made up of silos, but it is destroyed in an instant. People immediately lose trust in safety processes and practitioners when they perceive that they are blamed for events in the context of a safety investigation. Again, investigators are typically from an operational background. Some remain operational, while others do not (for instance due to lack of time, or for reasons of competency, age, or health), and so tend to become more distanced from the operational work.

Quite often, what is really interesting about discussions concerning safety and justice is what is not said openly, when it is clear that something is being omitted. These are the taboo subjects. Sometimes, people indicate that there is a problem and that they will not discuss it in a group, but will mention it in private (in interviews), in breaks, and as they are about to leave the room (door-handle moments). Just culture among colleagues is one of these issues. What people fear most of all is not the judgement of those who are most distant from the work, whose judgements are relatively rare. What people fear is the judgement of those closest to the work – their co-workers. Except in the most open of cultures (rare exceptions, such as in Scandinavian countries), people will usually avoid discussing this openly in a group setting. People fear raising the issue of judgement and blame by colleagues because they fear being judged and blamed for raising the issue. A doctor friend of mine who is the head of a department in a French hospital once told me about his attempt to discuss just culture with his colleagues (the doctor is not French). He decided to recount a story of an ‘honest mistake’ in a messy situation, of the sort that is typical of healthcare. After telling the story, his colleagues pounced on him, pointing out what he did wrong and what he should have done. It was the last time he tried such an exercise. This experience is far from unique. Indeed, in healthcare, clinicians seem to fear most the judgement of other clinicians (Bromiley, 2016a). Human beings tend to have a strong need to belong and a strong need for group identity. Discussing internal threats to that group identity can itself seem threatening.

The judgements of those closest to us are of most concern to us for two key reasons. First, we have to continue working with or alongside these people from one day to the next. Strained relations make for an unpleasant working life. Second, people in the same sort of position have an advantage that is not present in those who are far removed from the work (e.g., senior management or the judiciary). The advantage is this: our colleagues and co-workers know how the work is done, and can imagine how they think they would have done the work (i.e., better) (Shorrock, 2016). They have confidence that this imagination is what would actually have happened, but this is far from the case (Bromiley, 2016b). While a co-worker’s Work-as-Imagined is not another worker’s Work-as-Done, it is closer than Work-as-Imagined in the mind of anyone else. Co-worker judgements therefore hit closer to home. Co-workers can point out our errors in the same way that we can point out theirs. They know the work and may do it themselves, so their judgements carry most weight.

It is not just operational staff, of course. It is all of us. Think about how you drive. If you are like most people, you probably spend much more time judging others’ driving (including, or especially, your partner’s) than you spend thinking about how you are driving. In any case, we think our driving is better than average (Roy and Liersch, 2014). Our self-serving bias is strong.

We are then, as groups, our own worst enemies. We demand fairness from others (especially other professionals – out-groups), but continue unfairly to blame others. At this point, you might complain that “judgement by colleagues is less important than judgement by a judge!”. It is probably true that, when it comes to justice, an individual judgment by a judge (especially a conviction) is more important than a judgement by a colleague. But to assume greater importance for judicial judgments overall would be to be captured by the déformation professionnelle of traditional ways of thinking about safety (Safety-I): that rare adverse events are much more important than everyday work, and that we should therefore focus only on accidents. Front-line staff naturally seem to accept that focusing only on accidents in order to understand a lack of accidents doesn’t make sense, and that to improve safety, you have to focus on everyday work, not only accidents (past or future). It follows, then, that to improve fairness or justice, we have to focus on everyday judgements, not just the rare judgments of the sort that arise from judicial investigations (or even safety investigations).

Even accepting that everyday judgements are high frequency, it is a serious mistake to think that they are of low consequence. When we consider the real impact of co-worker judgements about us and our performance, we find that their impact can be enormous. Being judged or blamed for our individual part in routine work in a messy situation and complex system, when outcomes are not as planned, leads to a number of negative thoughts and emotions, including resentment, anger, worry, and preoccupation. Being judged can lead to lost sleep, damaged self-image, mental and physical health problems, interpersonal problems, and strained or ruined relationships. Being judged can lead to company disciplinary proceedings and even legal action (defamation: slander, libel). On an operational level, blame by colleagues can lead to non-cooperation, such as the withholding of operationally relevant information within or between teams. This, in turn, becomes a safety issue.

Demanding justice from an out-group while denying it to others in our in-group is understandable. Constructing a common external threat (out-group derogation) seems to help internal solidarity. But when the real threat is internal, this is a kind of hypocrisy that we should address. And while front-line staff are the most vocal supporters of just culture, for good reason, perhaps the judiciary are the unsung champions of just culture. The judiciary spends weeks and months collecting factual and other information, reviewing and discussing that information, and deliberating upon it, before forming judgments – all for an event that may have lasted minutes. This difference between the time frame for Work-as-Done and the time frame for Work-as-Judged is perhaps one reason why we focus on the judgments formed in criminal and safety investigations, and this is a fair point: we think that such judgments should never be unjust, because there is sufficient time to make a just judgment (even if this ignores the constraints of national legal systems and penal codes). Our everyday judgements, on the other hand, are formed and expressed in haste, in seconds or minutes – a similar timeframe to that of the work being judged.

So what to do? Perhaps the most important actions we can and should take concern us, not them. Addressing our frequent, everyday blaming and shaming judgements in response to outcomes-not-as-planned will likely have the most impact on human wellbeing and safety.

Be mindful of your personal reaction to failure

  1. Reflect on your initial internal reactions. How did you react emotionally to what you observed or heard? What feelings did you experience? Your immediate internal reaction may have been anger or fear, for example. People who are involved and uninvolved will tend to have different internal reactions. Those directly involved, and who could be judged, may be more likely to experience fear. Those uninvolved, or involved but unlikely to be judged, may be more likely to experience anger, or perhaps sympathy (via identification).
  2. Reflect on your judgements and evaluations. Following these reactions and feelings, what did you think about all of this? How did you interpret and evaluate what happened at the time? When considering your involvement in an adverse event or unwanted situation, you may have judged yourself harshly. Perhaps you felt disappointed in yourself, or even doubted your competency. When evaluating another person’s involvement, consider whether your focus is on the individual or on the situation and system, and to what extent you are judging and blaming an individual (whether or not this is expressed). The focus should not be on part of the picture, but on the whole picture. At this stage, be mindful of the outcome bias. Knowing the outcome of an event changes the way that you think about the actions and decisions that took place in the run-up to that outcome. Experimental studies have shown repeatedly that the exact same performance will be judged differently depending on the outcome. This is confirmed in our everyday experience. Often, what makes performance ‘bad’ is not the performance itself, but the outcome (e.g., an accident). Had there been no accident, the performance would often be judged as normal (‘uneventful’), perhaps even rather efficient or effective. Be mindful also of the fundamental attribution error. We have a tendency to form dispositional rather than situational explanations for others’ behaviour. We are prone to blame, but this can be overcome with education, at least at the stage of judgement and evaluation (if not initial internal reaction). If you are involved in investigation, then your responsibility to reflect on your judgements and evaluations is greater still.

Be mindful of your interpersonal reaction to failure

  1. Empathise. Empathise with others to understand their local rationality. If we really want a just culture, then we have to empathise with others and understand why what they did made sense to them at the time. Try to understand the background situation and the person’s world via ‘person empathy’ or ‘background empathy’. Also try to develop a moment-by-moment empathy for the person’s experience – cognitively, emotionally, and physically – using ‘process empathy’. Seek not to judge, but at least to understand. This is an interpersonal activity because it will tend to involve talking to people. Empathy is not a solo activity. It has to be experienced by the other person. Carl Rogers (1957) noted that, “Unless some communication of these attitudes has been achieved, then such attitudes do not exist in the relationship as far as the client is concerned.”
  2. Consider needs. Based on this empathic understanding, think about what others need, and what would get in the way of their needs being met. How would they like to be treated, helped or supported? It might be helpful to ask these questions of yourself, thinking back to a situation where you were in a similar position, and when your needs were met or not met. But remember that your needs are not theirs.
  3. Apologise. We all get it wrong and judge or blame others unfairly from time to time in everyday life, including at work. We cannot stop others from doing this, and we will sometimes relapse into blame ourselves. But we can keep our side of the street clean when we do slip up, by apologising. Some people find this easier than others, but it requires little effort other than swallowing one’s pride. Express how you jumped to judgement without thinking it through or thinking about what they need. Consider how the above might be applied. There is little more restorative in a relationship than an honest and unreserved apology, and perhaps an offer to make amends.

So to answer the question, “Just culture: Who do we fear?”, it is the judgement of those close to us – in or from the same world – that we fear the most. It is also those close to us who we can help the most.


Bromiley, M. (2016a). Healthcare’s just culture journey: A long and winding road. Just Culture and the Judiciary. EUROCONTROL Experience Sharing Enhanced SMS ES2-WS04-16 seminar, “Just culture across industries: Learning from each other”, Lisbon, 22-23 November 2016.

Bromiley, M. (2016b). Foreword. In, S. Shorrock and C. Williams (Eds.), Human Factors and Ergonomics in Practice: Improving System Performance and Human Well-being in the Real World. CRC Press.

Rogers, C. (1957). The necessary and sufficient conditions of therapeutic personality change. Journal of Consulting Psychology, 21, 95-103.

Roy, M. M. and Liersch, M. J. (2014). I am a better driver than you think: examining self-enhancement for driving ability. Journal of Applied Social Psychology, 43(8), 1648–1659. DOI: 10.1111/jasp.12117

Shorrock, S. (2016). Work-as-Imagined, Work-as-Done, and Just culture. EUROCONTROL Experience Sharing Enhanced SMS ES2-WS04-16 seminar, “Just culture across industries: Learning from each other”, Lisbon, 22-23 November 2016.

Smoker, A. and Baumgartner, M. (2016). IFATCA – Experience with accused individuals. Just Culture and the Judiciary. EUROCONTROL Experience Sharing Enhanced SMS ES2-WS04-16 seminar, “Just culture across industries: Learning from each other”, Lisbon, 22-23 November 2016.

Related posts

Safety-II and Just Culture: Where Now?

Six Thinking Hats for Safety

Exploring experiences using Schein’s cycle

The whole picture

Systems Thinking for Safety: From A&E to ATC

Systems Thinking for Safety: Ten Principles

Occupational Overuse Syndrome – Human Error Variant (OOS-HEV)

Human Factors at The Fringe: My Eyes Went Dark


This post was inspired by several conversations and presentations at the conference mentioned in the post.

Note: I have tried to use the British spelling of ‘judgement’ for the everyday use of the term, and the British legal spelling (and routine American English spelling) ‘judgment’ for legal judgments. I have probably not achieved this aim.


Human Factors at The Fringe: Every Brilliant Thing

You’re six years old. Mum’s in hospital. Dad says she’s done something stupid. She finds it hard to be happy. You make a list of everything that’s brilliant about the world. Everything worth living for. 1. Ice Cream 2. Kung Fu Movies 3. Burning Things 4. Laughing so hard you shoot milk out your nose 5. Construction cranes 6. Me A play about depression and the lengths we go to for those we love. “Heart-wrenching, hilarious… possibly one of the funniest plays you will ever see” **** The Guardian

 Every Brilliant Thing, by Duncan MacMillan and Jonny Donahoe/Paines Plough, 28 August, Roundabout @ Summerhall, Edinburgh


There are a few words you wouldn’t associate with depression, such as ‘funny’, ‘heartwarming’ and ‘inspiring’. But these are words that would apply to this one-man play about a seven-year-old boy’s reaction to his mother’s depression and suicide attempt. Johnny decides to create for his mother a list of every brilliant thing in his life, such as ice cream, the colour yellow, chocolate, rollercoasters and being allowed to stay up late. He hopes that his list will cheer up his mother, maybe even help her realise that life is worth living.

But as he grows older, he continues the list, and the brilliant things expand enormously. It reminds him of why life is worth living, despite his own struggles in life, including depression. The brilliant things, more than the experience that prompted him to write them down, seem to define his life. It’s not that everything is brilliant: as Johnny says, “if you got all the way through life without ever being heart crushingly depressed, you probably haven’t been paying attention”. It’s just that there are usually many more brilliant things than bleak things, if we really do pay attention.

This play is about a boy and a family, but the premise clearly applies more widely, to communities and organisations. Even when bad things happen or are happening, it is usually the case that there are many more good things. But so often we don’t pay attention to them. What is good about this community or organisation? What gives life? What brings joy? Very often, we don’t really know, because we have never turned our attention to the question. Instead we tend to focus on deficits – things that are wrong or missing – and associated needs. Anyone who does groupwork with organisations will know that deficit-based discussions can be rather downbeat and dispiriting. The opposite is true of asset-based discussions. And this is reflected in Every Brilliant Thing. As the play progresses, audience members read out items from Johnny’s list of brilliant things as the actor calls out their numbers. Audience members also play out various characters in Johnny’s life. Everyone seemed to do so joyfully.

Perhaps we should be more like Johnny, understanding deficits and attending to associated needs, but first understanding the assets that we value. If an organisation is relatively safe, why is this? What is going on that makes it a safe organisation? For sure there will be problems and risks and threats to safety, but unless we first understand what we have – what makes it safe (or healthy, or fun, or meaningful) – we won’t know what to protect, nourish, and grow. How many organisations and communities make an inventory of every brilliant thing? I have recently paid much more attention to this question, inspired by asset-based approaches and Safety-II. When asked, people list all sorts of things, but these most often concern people and their skills, knowledge, values, and relationships. What they also say is this: “No-one has asked that before”.

Every Brilliant Thing shows how joy can exist despite bleak situations. By attending to the brilliant things that keep us going – as individuals, families, communities and organisations – we find that the things that we had taken for granted, or not even noticed, really do need to be cherished.

See also:

Human Factors at The Fringe

Human Factors at The Fringe: The Girl in the Machine

Human Factors at The Fringe: My Eyes Went Dark

Human Factors at The Fringe: Nuclear Family

Human Factors at The Fringe: Lemons Lemons Lemons Lemons Lemons


Human Factors at The Fringe: Lemons Lemons Lemons Lemons Lemons

Walrus’ award-winning show returns to Edinburgh in Paines Plough’s Roundabout. ‘Let’s just talk until it goes.’ The average person will speak 123,205,750 words in a lifetime. But what if there were a limit? Oliver and Bernadette are about to find out. This two-person show imagines a world where we’re forced to say less. It’s about what we say and how we say it; about the things we can only hear in the silence; about dead cats, activism, eye contact and lemons, lemons, lemons, lemons, lemons. ‘About as promising as debuts get.’ (Time Out).

Lemons Lemons Lemons Lemons Lemons, by Sam Steiner, 28 August, Roundabout @ Summerhall, Edinburgh


What if we had a daily limit on the number of words we could speak? This is the premise of this experimental political fantasy, which focuses on the relationship of Bernadette (a lawyer) and Oliver (a musician) in the context of a new ‘hush law’. The law is introduced by the government to ration citizens to 140 words each per day. Oliver campaigns against the law, while Bernadette seems not to believe that it will actually be voted into effect. Ultimately, for unexplained reasons, it is.

The play is essentially about the dynamics of Bernadette and Oliver’s relationship and how the prospect and reality of the hush law affects their communication. It skips between the couple’s conversations in the past, when they could speak freely, and the present, when they are restricted.

The couple struggle to manage their lexical allowances. On the first day of the hush law, Bernadette wastes nearly half of her allowance ordering a smoothie. Inconsistent use of the quota between the pair causes tension and raises questions about the importance of the other person and of the relationship. When Oliver uses up his daily limit before returning home, Bernadette is frustrated, and comes up with a string of random words to spend the rest of her allowance, using up her last five with “lemons, lemons, lemons, lemons, lemons”. With varying degrees of success, they learn to monitor how they use their word quota over the course of each day, greeting one another with a number reflecting their available words. We are left to consider a number of questions. What words would we use and leave out, when every word counts? Who would we save our words for? How might we learn to communicate without words?

But the backstory is a restriction of freedom of speech, and the social and political implications. The law has some strange effects on society. Songs gradually lose their words, because singing a song would use up more than a day’s allowance for the artists or listeners. Perhaps most intriguing to me was when Oliver exclaimed that the law is inherently discriminatory because, even if everyone has the same limit, those with less power need more words, while those in positions of power already have the influence they need. They have less need for words. This was a thought that lingered after the play.

In organisations, and in society, words are already funnelled and filtered. The ‘140’ limit is obviously borrowed from Twitter, which has today excluded quoted tweets, photos, GIFs, videos and polls from its famous 140-character limit. And between the various strata of organisations and society, the possibility of communicating upwards diminishes with altitude. A front-line worker usually has little or no direct access to the Board, for instance. If they want to express anything, they may have a small quota of words with which to do so, if they are lucky. On matters of safety, individuals may indeed have a limit of around 140 words to pass a concern to senior management, perhaps through a reporting scheme.

When we cannot speak out adequately in organisations and society, the concerns and messages do not go away. They take on new forms: learned helplessness, revolt, or anything in between. In Lemons, the characters learn new workarounds: more efficient words, blends and portmanteaus, rudimentary Morse code (as on Twitter, where people post images containing many more words, or use a series of tweets). Some of this is probably not what the lawmakers imagined. In organisations and societies, competing means of communication emerge in response to limits on communication, including behaviours (e.g., facial expressions, postures, whistleblowing, demonstrations, strikes, riots) or other outcomes (e.g., accidents). As I often say to people in positions of power in organisations, people’s concerns and needs remain whether or not we listen to them. But by spending more time listening – allowing time for more words – everyone’s needs can be met, to some extent at least, before it is too late.


See also:

Human Factors at The Fringe

Human Factors at The Fringe: The Girl in the Machine

Human Factors at The Fringe: My Eyes Went Dark

Human Factors at The Fringe: Nuclear Family


Human Factors at The Fringe: Nuclear Family

Nuclear Family is a gripping piece of interactive theatre which follows Joe and Ellen, nuclear plant workers and siblings, faced with an imminent disaster. Audience members will be privy to what could possibly be their last hours as they struggle with the biggest decisions of their lives. In a heated round table discussion, the audience will experience the pressure of making life and death decisions.

Nuclear Family, 3-29 Aug, Assembly, Edinburgh


To have any chance at understanding why people do the things they do, you have to put yourself in their shoes. Nuclear Family is immersive, interactive theatre that requires you to do just that. The play begins with an introduction by a suited convenor, who explains that you are part of a board of inquiry into an explosion at the Ashtown nuclear power plant in 1996. For the next hour, your decisions are linked to those of two security guards and siblings Joe and Ellen Lynum, who work in the plant. Audience members are seated around the cast and the set – a grim, bunker-esque security office with a desk, some 1990s PCs, telephones, and other peripherals. Joe and Ellen are at the sharp end of the unfolding disaster and the focus is on their decisions, which happen to be yours.

The audience members were taken ‘inside the tunnel’ as events unfolded, watching the ‘video footage’ – the acted scenes. After each superbly acted scene leading up to a critical decision point, we were given short audio recordings of interviews and some documentation, such as police and employment records. We had two minutes to make a binary decision: what would be a reasonable or appropriate thing to do next, given the information available and the desired outcome? As an audience, we had to vote on a collective decision. These decisions – four or five in all – were moral dilemmas. Questions of rule-breaking, relationships and competence arose, and each decision had implications, for liberty and loss of life, for instance. Each decision contributed to an unfolding disaster, but the decisions were set against poor management – under-resourcing and reported problems that had never been acted on.

As the audience made each decision, we could not know the consequences until they arose. It was clear that there were various routes through the mess, and because of this we probably forgot that the ending was actually certain: an explosion. It became a choose-your-own-disaster, but one where we were fooled into counterfactually thinking that we could mitigate the outcome, and maybe prevent it. We felt the regret and anger for each decision in real time as the next scene unfolded.

This is innovative theatre that teaches the audience about local rationality. The audience, like Joe and Ellen, do what seems reasonable at the time. In hindsight, each decision seems like a bad decision, but at the time each decision is just that: a decision. The decisions seemed reasonable to most people, though there was minority dissent for some decisions, which was not explored. Interestingly, the minority could feel some anger that their preferred option was not taken: even though the consequences of neither option were known at the time, the unknown consequences of the unmade decision seemed better.

The division of the storyline into decision points was reminiscent of the method within Sidney Dekker’s Field Guide to Understanding ‘Human Error’, which suggests breaking down a detailed timeline into critical junctures. But there are crucial differences between an accident investigation, Nuclear Family, and real-time operations. In an accident investigation, you have much of the information, and you have knowledge of the final outcome and the outcomes of each decision. You construct the critical junctures (based on the knowledge you now have) and you have many hours or days available to analyse them. In Nuclear Family, you have some background information, and you have knowledge of the final outcome but not of the outcome of each decision. You are told the critical junctures, and you can pause for a couple of minutes while you make a decision. In real-time operations – in control rooms, cockpits, operating theatres – you don’t have all the information and you don’t know the final outcome, nor for sure the outcome of the decision you are about to take. You may not know in advance that a juncture or decision point is critical, and you can’t necessarily pause for long to make a decision.

Understanding local rationality demands a level of empathy, and Nuclear Family cultivated both background empathy (or person empathy) for the characters, and process empathy for their moment-to-moment experience – cognitive, emotional, and social. It is hard to think of a better medium through which to experience this so efficiently than interactive theatre.

See also:

Human Factors at The Fringe

Human Factors at The Fringe: The Girl in the Machine

Human Factors at The Fringe: My Eyes Went Dark


Human Factors at The Fringe: My Eyes Went Dark

Written and directed by Matthew Wilkinson. A thrilling modern tragedy about a Russian architect driven to revenge after losing his family in a plane crash. Cal MacAninch and Thusitha Jayasundera give electrifying performances in this searing new play about the human impulse to strike back. Inspired by real events. Nominated for three Off West End Theatre Awards.

My Eyes Went Dark by Matthew Wilkinson, 28 Aug, Traverse Theatre, Edinburgh


(See Human Factors at The Fringe for an introduction to this post.)

In 2002, a Bashkirian Airlines Tupolev passenger jet en route to Barcelona collided with a DHL Boeing 757 cargo jet over Überlingen, southern Germany, while under air traffic control. Seventy-one people died, 52 of whom were children. The controller on duty instructed the Russian jet to descend, after noticing that the planes were on a collision course. Unbeknown to him, the onboard collision-avoidance systems (TCAS) on the aircraft issued instructions that contradicted the controller’s own instruction. The Russian pilots acted on the controller’s instruction, while the DHL pilots acted on that of TCAS (see here for a description of the accident and the aftermath).

This play is essentially a tragedy, inspired by real events, and concerns the aftermath of that accident. It takes place over the course of five years in Switzerland, Germany, and Ossetia. ‘Nikolai Koslov’, a Russian architect working in France on a major new hotel build, lost his two children and his wife in the accident.

Koslov is consumed with grief, seething with a quiet anger at how such an accident could have occurred. He runs through the possible causes with a search team co-ordinator on the night of the accident, at the scene. He asks about the age and condition of the plane, about who was responsible for maintenance. He wonders about terrorism. But the coordinator offers a mundane reason for the accident.

CO-ORDINATOR: (Gentle) My opinion is, and I know how stupid this must sound, but it could well have been a … a simple mistake.

KOSLOV: A mistake? But who made the mistake?

Koslov presses for the perpetrator of the mistake. Getting no answer, he turns to the mistake itself.

KOSLOV: But what mistake? What sort of mistake?

CO-ORDINATOR: I don’t know, really.

KOSLOV: Then why do you say that?

CO-ORDINATOR: Because – isn’t that usually the reason?

Koslov is incredulous.

KOSLOV: … You cannot put people up there, in aeroplanes, high up there, and then make simple mistakes… it’s completely unheard of.

Koslov’s late wife’s sister, Lizka, comes to a granite memorial to find him. He’s been there for days. While Koslov is full of anger towards the controller, ‘Thomas Olsen’, Lizka has compassion.

LIZKA: I heard him interviewed. He was crying. He said it was his duty and responsibility to prevent such accidents happening. I remember that clearly. He sounded at a total loss. He sounded terrible.

Koslov is angry and yet numb to the world, turning to ultra-dark chocolate to get a sense of something external.

While Koslov cannot understand how a ‘simple mistake’ could happen, Lizka cannot understand how the context for it could exist. Koslov focuses on the actions of the controller. Lizka focuses on the context of work. She starts to recount the ‘second story’.

LIZKA: He said he was left all alone on duty that night. I just can’t understand that. He was all by himself, flitting between two screens. … Why would they allow that? He said he wasn’t even aware that the Russian plane’s warning system had told it to go up. When clearly it should have gone up. Just kept on going. If it had kept on going everything would have been OK. The other plane would have missed it completely.


LIZKA: But they’re saying all his phone lines were down. So no one could call anyone. Then, then maintenance men came in as well…

KOSLOV: I know, I heard him describing it.

LIZKA: It sounds horrific … like some crazy soap opera … like they were there to fix the telly!

KOSLOV: I know.

LIZKA: I mean he couldn’t know what was going on! And he had another plane to land in Germany at the same time! Five minutes before. It was complete confusion! My God, his colleague was outside in the hall fast asleep!

KOSLOV: Lizka –

LIZKA: He was all by himself…

KOSLOV: Lizka –

LIZKA: No. No. I don’t understand.

KOSLOV: Lizka –

LIZKA: You don’t let people fall asleep in halls when there are planes flying around do you? Do you? What for? It doesn’t make sense…

KOSLOV: It was common policy.

LIZKA: To sleep in halls?

KOSLOV: To take it in turns. When traffic was slow.

LIZKA: Really? Was it? Really? But traffic wasn’t slow!


From her outside perspective, the conditions of work don’t seem reasonable.

But Koslov cannot escape the feeling that Olsen is culpable. In a phone call he talks about the statements given to the German and Swiss accident investigation authorities.

KOSLOV: It’s an inescapable fact he did do it. I’m not saying he wasn’t put in a dreadful position. I’m saying he did it. … He commanded those pilots to dive. To ignore their screens and fly into each other. Yes? OK? Whatever the reasons. …

Koslov believes that someone must be held accountable but Thomas Olsen is acquitted by the courts. A representative of Skyways is in court:

WEITNER: In hindsight, you always ask yourself, could I have done more? More to anticipate, more to prepare, more to … mitigate. More.

Two officials received suspended sentences and a fine of twelve thousand Euros. Koslov is offered compensation for his wife and children ($60,000 and $50,000, respectively). For Koslov, this defiles the name of his family. For him, justice has not been done. What justice can there be?

WEITNER: From the trial, did you really think someone was going to be prosecuted? Sent to prison? For an accident? Nobody was going to prison. It’s not how it works. Can you imagine? Private employees, in public service, sent to prison – for making mistakes? Who would be willing to take their place?

WEITNER: I know how difficult this must be for you.

KOSLOV: You can’t even say sorry.

Koslov tracks Olsen down in his family home, and murders him. He is sent to prison.

In his region of Russia, blood feuds were traditionally an accepted means of justice. His counsellor proposes that this might explain his actions.

GEISINGER: We know it wasn’t so long ago, perhaps only fifty years or so, that feuds in your country were decided in this way

… You belong to a history, a cultural history, of resolving trauma this way.

Koslov denies this, and denies planning to kill Olsen, or even remembering what happened.

Koslov is released part way through his sentence. On return to Russia, he receives a hero’s welcome. He is given an official post for architecture and construction and designs an Olympic-standard ski resort in Ossetia.

The play ends with Olsen’s daughter, Helena, arriving unexpectedly at a party for Koslov, seeking answers about why he did what he did, and restorative justice for her mother, who has made multiple requests to speak with Koslov. He has never responded.

HELENA: Speak to her. Please. It must mean something to you. It must do. You were a father. You had children.

KOSLOV: And your father murdered them.

HELENA: No! No! My father was a man, a good man! Who made a mistake!

KOSLOV: He is a murderer.

HELENA: (Screams) You are a murderer!!

My Eyes Went Dark raises questions about causation, culpability, justice, revenge and forgiveness. The first story of ‘human error’ and individual responsibility is set out alongside the second story of system conditions and collective and corporate responsibility. Human error, “a simple mistake” (famously cited as being the ’cause’ of 70% or so of accidents), is the first assumption of the co-ordinator. But a mistake is not innocent in the eyes of Koslov (nor in the eyes of many judicial systems around the world). The system as a whole is the focus for Lizka. She describes how degraded modes of operation stack on top of one another and become accepted as normal as an organisation drifts into failure. She feels compassion for the controller who was put in this position, and who ultimately lost his own life.

A mistake and an individual perpetrator give Koslov a clear reason for the event and an identifiable target for his anger. As recalled by Lizka, the controller said it was his “duty and responsibility to prevent such accidents happening”. An organisation provides neither a clear reason for the event nor a clearly identifiable target for Koslov’s anger.

How would we react to such an event? Would a progressive understanding of human factors and system safety help or hinder forgiveness? Would an understanding of complexity actually make it easier or harder for us to channel our grief, and to get restorative justice? Would our understanding of ‘just culture’ save us from our darkest urges? We hope we’ll never know.


Script: Wilkinson, M. (2015). My eyes went dark. Oberon Books.



See also:

Human Factors at The Fringe

Human Factors at The Fringe: The Girl in the Machine

Human Factors at The Fringe: Nuclear Family

Human Factors at the Fringe: Lemons Lemons Lemons Lemons Lemons


Human Factors at The Fringe: The Girl in the Machine

Polly is a professional, a high achiever and an addict. Her drug of choice is a grade A, top of the range smart phone. She clicks and scrolls for minutes, hours and days at a time. When Polly discovers an app that uses algorithms to create brand new music by long-dead musicians, the line between human and computer begins to blur, and the downloads become increasingly dangerous. A play about networks, nerve endings and Nirvana.

The Girl in the Machine by Stef Smith, 19, 24 & 28 Aug, Traverse Theatre, Edinburgh

(See Human Factors at The Fringe for an introduction to this post.)

This script-in-hand, rehearsed-only-once play for early risers at Edinburgh’s Traverse Theatre was one of a series on the same theme: “Tech will tear us apart (?)” The play features a corporate IP lawyer – Polly – and her tech designer husband – Owen. Polly is addicted to her device, and will spend hours clicking and swiping through apps and the internet. Out of the blue, a new app appears that can create new music by dead artists based on aspects of their existing body of work. This is a problem, because – in her new position – it is Polly’s job to prevent, and now deal with, this legal quagmire.

The app is downloaded by legions of users, and it has a much darker hidden feature. The app includes an aural code by which – it is promised – users can leave their bodies and upload their consciousness to the internet, sending messages to those on the other side. Hundreds of lives are lost as people seek to escape the stress of a hyper-connected, information-overloaded life, ironically putting their faith in everlasting life in a high-tech heaven, as pure information. Polly is blamed for the viral suicides and is sacked for failing to spot the emerging threat. She spirals into depression.

As the pair sit together, Polly is consumed by her phone, much like so many of us today. They grow further apart – physically and emotionally – and the phone becomes a love/hate object in the marriage. Society breaks down as attempts are made to stop the cultish phenomenon. Polly uses the last of her battery to upload her consciousness to the net, or so she thinks.

This sad but riveting play sheds light on our addiction to technology while playing on our fears. It also exposes our faith in technological solutions to socio-technical and even spiritual problems. “When did life get so complicated?” Polly asks. “When we tried to make it simple”, Owen responds.

Does technology simplify life, or make it even more intractable?


See also:

Human Factors at The Fringe

Human Factors at The Fringe: My Eyes Went Dark

Human Factors at The Fringe: Nuclear Family

Human Factors at the Fringe: Lemons Lemons Lemons Lemons Lemons
