Human Factors at The Oscars


“An extraordinary blunder”

It has variously been described as “an incredible and almost unbelievable gaffe” (Radio Times), “the greatest mistake in Academy Awards history” (Telegraph), “an extraordinary blunder…an unprecedented error” (ITV News), “the most spectacular blunder in the history of the starry ceremony” and “the most awkward, embarrassing Oscar moment of all time: an extraordinary failure” (Guardian).

It was, of course, the Grand Finale of the Oscars 2017.

Faye Dunaway and Warren Beatty are all set to announce the Best Picture winner. Beatty begins to read out the winner’s card: “And the Academy Award…”. But he looks visibly puzzled, pausing and looking in the envelope to see if there is anything he has missed. “…for Best Picture”. He looks at Dunaway, who laughs, “You’re impossible!”, then hands the card to her. Dunaway, perhaps assuming this is all for effect, simply reads out what she sees, and announces: “La La Land!”

Music sounds and a narrator gives a 17-second spiel about the film: “La La Land has fourteen Oscar nominations this year, and is tied for the most nominated movie in Oscar history, winning seven Oscars…”

The La La Land team exchange embraces and walk to the stage. Jordan Horowitz, a producer, delivers the first thank-you speech. Everything looks normal. But as the second and third thank-you speeches are being delivered, there is visible commotion. A member of the Oscars production team takes back the envelope that has been given to the La La Land producers.

The winner’s envelope is, in fact, the envelope for Best Actress, just given to La La Land’s Emma Stone. On stage behind the producers, the PricewaterhouseCoopers overseers – Brian Cullinan and Martha Ruiz – are examining the envelopes.

At the end of his speech, producer Fred Berger says nervously: “We lost, by the way”. Horowitz takes over: “I’m sorry, there’s a mistake. Moonlight, you guys won Best Picture”. Confused claps and cries ensue. “This is not a joke”, Horowitz continues. Beatty now has the right card, but Horowitz takes it out of Beatty’s hand and holds it up to show the names of the winning producers.

Beatty tries to explain, and is interrupted by host Jimmy Kimmel: “Warren, what did you do?!”. Beatty continues, “I want to tell you what happened. I opened the envelope and it said, ‘Emma Stone – La La Land’. That’s why I took such a long look at Faye and at you. I wasn’t trying to be funny.” Horowitz hands his Oscar to Barry Jenkins, Moonlight’s director.

It was “the first time in living memory that such a major mistake had been made” (Reuters). The accountancy firm PricewaterhouseCoopers has apologised and promised an investigation. In a statement, it said, “The presenters had mistakenly been given the wrong category envelope and when discovered, was immediately corrected. We are currently investigating how this could have happened, and deeply regret that this occurred. We appreciate the grace with which the nominees, the Academy, ABC, and Jimmy Kimmel handled the situation”.

Such a mistake, in an ordinary setting, would usually be quite uneventful. Similar sorts of things happen every day. The only thing that is “incredible”, “spectacular” and “extraordinary” is the context. It is worth, then, looking a little deeper at this extraordinary event, and considering how similar sorts of events play out in many ordinary, but critical, contexts.

Artefact design

The design of the envelopes for the various Oscar awards is identical. The only difference between the envelopes is the text that indicates the category. There is no other means of coding (e.g., colour, pattern) to indicate any difference. Several industries have realised the problem with this approach, and in some ways this can be considered the beginnings of the discipline of human factors and ergonomics: “A seminal study that set the agenda for the scientific discipline of human factors was by the experimental psychologists, Fitts and Jones (1947), who adapted their laboratory techniques to study the applied problem of ‘pilot error’ during WWII. The problem they faced was that pilots of one aircraft type frequently retracted the gear instead of the flaps after landing. This incident hardly ever occurred to pilots of other aircraft types. They noticed that the gear and flap controls could easily be confused: the nearly identical levers were located right next to each other in an obscure part of the cockpit” (van Winsen and Dekker, 2016).

This problem still exists today in settings far more important than The Oscars, but far less newsworthy…until disaster strikes. A notable example is medicine packaging, where different drugs or doses have very similar labels. Many packages and labels force users to attend to small details of text, perhaps with the addition of a small area of colour which, on its own, is quite inconspicuous. It is asking a lot of people to make critical – sometimes life-and-death-critical – decisions based on such small design features. This is in addition to drug names that look alike or sound alike, such as Aminophylline and Amitriptyline, Carbamazepine and Chlorpromazine, or Vinblastine and Vincristine.

Human factors experience suggests a number of coding methods (e.g., shape, colour, size) that, used appropriately, can help to make vital distinctions. There are also several design guidelines for medicines, such as those from the NHS NPSA (2007) and the European Medicines Agency (2015). In human factors/ergonomics, these are used as part of an iterative human-centred design process that seeks to understand stakeholders and context, identify user needs, specify design requirements, produce prototypes, and test them.

In the absence of this process, what is amazing is not that such errors occur, but that they do not occur much more often than they do. Because such errors occur fairly infrequently, when they do occur they are often (and unhelpfully) branded ‘human error’. But this is not simply a problem of ‘human error’. It is a problem of design, where form (such as branding and aesthetics) so often trumps function. As Hollnagel (2016) states, “The bottom line is that the artefacts that we use, and in many cases must use, should be designed to fit the activity they are intended for”. Form-over-function design places the human in a position where they have to bridge the gap between form and function every time they use an artefact.

Safeguards

For the Oscars, two identical sets of the winners’ cards are made for ‘safety purposes’. These duplicate envelopes are held in the wings in case anything should go wrong with a presenter or an envelope. In this case, it may be that the duplicate envelope for the Best Actress award, which had just been announced, was handed to Beatty as he walked out to announce the Best Picture winner.

Safeguards feature in most safety-critical industries, and are often the result of a risk assessment that specifies a risk control for an identified risk. But risk assessment is often a linear cause-effect exercise, and it often stops at the risk control. And risk controls can have unintended consequences and introduce new risks. Consider this example in the context of aviation and air traffic control:

In early 2014, the UK experienced a prolonged period of low atmospheric pressure. At the same time, there was an unusual cluster of level busts [where aircraft go above or below the flight level or altitude instructed by ATC] at the transition altitude, which were thought to be linked to incorrect altimeter setting on departure into the London TMA [London airspace].

Level busts have been, and remain, a key risk in NATS operation. Longer-term strategic projects, such as the redesign of the London TMA and the raising of the Transition Altitude, are expected to provide some mitigation. However, to respond tactically to the perceived trend in the short-term, it was decided to issue a Temporary Operating Instruction (TOI) to controllers.

The TOI required the inclusion of additional phraseology when an aircraft was cleared from an altitude to a Flight Level during low pressure days. The additional phraseology was “standard pressure setting” e.g. “BigJet123, climb now FL80, standard pressure setting”. The change was designed to remind pilots to set the altimeter to the standard pressure setting (1013 hPa) and so reduce level busts associated with altimeter setting. As this phrase was deemed to be an instruction, it was mandatory for flight crews to read back this phrase.

The TOI was subject to the usual procedural hazard assessment processes and implemented on 20 February 2014 on a trial basis, with a planned end date of 20 May 2014, after which the trial results would be evaluated. The change was detailed in Notices to Airmen (NOTAMs).

During the first day of implementation, several occurrence reports were received from controllers, who noted that flight crews did not understand the meaning of the phraseology, and did not read back as required. This led to additional radio telephony to explain the instruction, and therefore additional workload and other unintended consequences.

Extract from a case study by Foster et al. in EUROCONTROL (2014).

Every industry has many examples of ‘safeguards gone bad’. We often fail to understand how such interventions change the context of work and introduce secondary problems.

Decision making under uncertainty

Beatty is standing there, with the eyes of tens of millions of viewers upon him. He is being recorded in perpetuity, for viewing by hundreds of millions more. He has to make a decision about an announcement which, to a few producers, will feel like an Olympic gold medal. But he isn’t sure what’s going on. As Beatty explained, “I opened the envelope and it said, ‘Emma Stone – La La Land’. That’s why I took such a long look at Faye and at you. I wasn’t trying to be funny”.

Here we cannot be certain what was going through Beatty’s mind, but could it be that – live on one of the most important TV events in the world – Beatty did not want to voice his confusion and uncertainty? He appeared visibly puzzled and gave the envelope to Dunaway to read out the ‘winner’. Dunaway could not have known Beatty’s thoughts, since his behaviour could easily have been a time-filler or a fumbling joke, and of course it made sense to her simply to read what she saw: “La La Land”.

When under pressure, any delay can have associated costs. For Beatty, asking for clarification would have meant an awkward period of filler, a clumsy live-on-air check of envelopes, perhaps a loss of advertising time. In a state of confusion and self-doubt, perhaps it made sense to say nothing and pass the confusing artefact to someone else.

In many safety-critical activities, decisions are made under uncertainty. The information and situation may be vague, conflicting or unexpected. In some cases, there is a need to signal confusion or uncertainty, perhaps to get a check, or to ask for more time. It can seem hard for us to give voice to our uncertainty in this way, especially under pressure. When someone has a command position – in an operating theatre, cockpit, or at the Oscars – it can be difficult for that person to indicate that they are not sure what is going on. This has played out in several accidents, and in everyday life. But sometimes, the most powerful phrase may be something along the lines of, “I do not understand what is going on”. This identifies a problematic situation and opens the door for other members of the team to help problem-solve. This kind of intervention is part of many training programmes for ‘team resource management’ (by whatever name), and can help everyone involved – no matter what their formal position – to voice and resolve their doubts, uncertainties and concerns.

It’s just an awards show

The events of Oscars 2017 will be emblazoned forever on the minds of participants and aficionados. But they will also soon be a feature of a trivia game or TV show. As host Jimmy Kimmel said, “Let’s remember, it’s just an awards show.” But for those who have to put up with the same sorts of problems every day, it’s much more than that. In many industries, people help to ensure that things go well despite other aspects of the system and environment in which they work. For the most part, the human in the system is less like a golden Oscar, and more like Mister Fantastic, using abilities of mind and body to connect parts of systems that only work because people make them work. This aspect of human performance in the wild is usually taken for granted. But in the real world, people create safety. And for that, they deserve an Oscar.


EUROCONTROL (2014). Systems Thinking for Safety: Ten Principles. A White Paper. Brussels: EUROCONTROL Network Manager, August 2014. Authors: Shorrock, S., Leonhardt, J., Licu, T. and Peters, C.

Hollnagel, E. (2016). The Nitty-Gritty of Human Factors (Chapter 4). In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. Boca Raton, FL: CRC Press.

van Winsen, R. and Dekker, S. (2016). Human Factors and the Ethics of Explaining Failure (Chapter 5). In S. Shorrock and C. Williams (Eds.), Human factors and ergonomics in practice: Improving system performance and human well-being in the real world. Boca Raton, FL: CRC Press.

See also

Just Culture in La La Land

‘Human error’ in the headlines: Press reporting on Virgin Galactic

Life After ‘Human Error’ – Velocity Europe 2014

‘Human error’: The handicap of human factors, safety and justice

The HAL 9000 explanation: “It can only be attributable to human error”

Occupational Overuse Syndrome – Human Error Variant (OOS-HEV)

‘Human error’: Still undefined after all these years

Author: stevenshorrock

This blog is written by Dr Steven Shorrock. I am an interdisciplinary humanistic, systems and design practitioner interested in human work from multiple perspectives. My main interest is human and system behaviour, mostly in the context of safety-related organisations. I am a Chartered Ergonomist and Human Factors Specialist with the CIEHF and a Chartered Psychologist with the British Psychological Society. I currently work as a human factors and safety specialist in air traffic control in Europe. I am also Adjunct Associate Professor at University of the Sunshine Coast, Centre for Human Factors & Sociotechnical Systems. I blog in a personal capacity. Views expressed here are mine and not those of any affiliated organisation, unless stated otherwise. You can find me on Twitter at @stevenshorrock or email contact[at]humanisticsystems[dot]com.

10 thoughts

  1. Actually, most of the human factors researchers I know are concerned with errors that have significant, life-threatening consequences. Sorry, the little goof-up with La La Land and Moonlight doesn’t fit that category to me. The bigger error, in my view? The total Oscars snub of Sully, a film that portrays human factors as an influence on aviation safety.

    1. You are right, Jim. Most of us, including me, work in safety-critical industries in practice and/or research. But guess what…articles on specific safety-critical industries are read by relatively few, usually from that industry. Out of the 70 or so posts on this site over the past few years, the most views in a single day came on the day of this post, which is already the third most viewed post. Readers will include many who are able to transfer some of the concepts, which may be quite novel to them, to the safety-critical industries that you refer to, in whatever role. Hopefully that is to the good of human factors, safety and justice or fairness. Thanks for taking the time to read and comment.

  2. Reblogged this on NJ Ergonomics Blog and commented:
    Steven Shorrock provides a great look into what went on with the mishandled envelope at the Oscars leading to the announcement of an incorrect winner by Warren Beatty.

  3. It’s funny – I just read an article about the lady from PwC and how much work she puts into making sure it doesn’t go wrong! Good article Steve – I’ve sent it on to my colleagues.

    1. As Deming said, “a bad system will beat a good person every time.” Well, he was kinda wrong in that absolute sense because people are so good at filling the gaps, but a bad system will usually win in the end!

  4. It would be interesting to be a fly on the wall during the post-event ‘root cause analysis’, or whatever passes for that in TV land. I nominate Steve Shorrock to go over and facilitate the event!
