Work-as-Imagined Solutioneering: A 10-Step Guide

Have you ever come across a ‘problematic solution’ that was implemented in your workplace, and wondered, “How did this come to be?” Wherever you sit in an organisation, the chances are that you have. Many problematic solutions emerge from a top-down process that I will call work-as-imagined solutioneering.

In this post, I will outline a typical process of 10 Steps by which problematic solutions come into being. Some of the steps may be skipped, but with the same outcome: a problematic solution.

At the end of the post, you will find 10 ‘Solutions’ from healthcare, provided by healthcare practitioners in a series of posts on this blog on the archetypes of human work. These solutions do not typify the process below (since the process that these solutions were subject to is not known to me). And the solutions will all probably have various advantages and disadvantages. The solutions simply provide rich and messy examples of unimagined and unintended side-effects. But you will be able to think of many others in your own context (please provide an example as a comment or get in touch).

Throughout the 10 Steps, I will use terms to describe seven kinds of systems that must be reckoned with when making changes in socio-technical systems (from Martin’s [2004] Seven Samurai framework).


Step 1. Complex problem situation

The process of work-as-imagined solutioneering starts with a complex problem situation. Complex problem situations occur in systems with:

  • a number of stakeholders with conflicting goals,
  • complex interactions between stakeholders and other elements of the socio-technical system (visible and invisible, designed and evolved, static and dynamic, known and unknown),
  • multiple constraints (social, cultural, technical, economic, regulatory, legal, etc), and
  • multiple perspectives on the nature of the problem.

Problems may well be interconnected to form a ‘mess’.

Step 2. Complexity is reduced to something simple

Complex problem situations are hard to understand and have no obvious solutions. This is unappealing to most people. Understanding complex problem situations requires that we seek to understand:

  • the various expressions of, and influences on, the problem,
  • the context system, including the stakeholders, their activities, the tools and artefacts that they use, the context or environment (physical, ambient, social, cultural, technical, economic, organisational, regulatory), and
  • the history of the context system.

One of the hallmarks of work-as-imagined solutioneering is a neglect of one or more of these facets of the problem situation or context system. This is partly because understanding requires:

  • high levels of field expertise – expertise in the work that is influenced by and influences the problem, whatever the work is,
  • an understanding of people (which can be approached via various disciplines: psychology, sociology, anthropology, community development, human factors/ergonomics, etc),
  • an understanding of socio-technical systems and the nature of change in such systems, and
  • sufficient expertise in a human-centred and system-oriented design process.

Once you have approached the problem situation in a sensible way, an analysis of stakeholder assets and needs should follow.

Unfortunately, once a problem is identified, the perceived urgency to do something creates pressure to be efficient, when thoroughness is required – a blunt-end efficiency-thoroughness trade-off. The required thoroughness is time-consuming and difficult. It requires specialist expertise and – crucially – bridging social capital to engage with field experts in order to get the understanding necessary to help, rather than hinder. [There is almost always a lack of expertise, and we should try to understand why solutions make sense to managers and leaders, and not simply berate them.]

So these critical activities (understanding the context system and problem situation, and understanding stakeholder assets and needs) are often neglected. And complexity is reduced to something simple. For example, a mismatch between demand, resources and capacity is reduced to a problem of ‘poor performance’. A mismatch between work-as-prescribed and work-as-done is reduced to ‘non-compliance’ or ‘violation’. A mismatch between design and performance is reduced to ‘human error’.

Step 3. Someone has an idea

While there may be little understanding of the complex problem situation, solutions are at hand. Past experience, ideas from other industries or contexts, and committee-based idea-generation or diktats from authority figures make a number of ‘solutions’ available. Examples include:

  • measures
  • monitoring arrangements
  • quantified performance targets and limits
  • commercial off-the-shelf products (equipment, artefacts)
  • checklists
  • procedures
  • standard training
  • processes
  • incentives
  • punishments
  • reorganisation of activities, processes and reporting lines
  • redistribution of power.

Most of these (aside from targets, in most circumstances) are not inherently Bad Things. The Bad Thing is introducing them – any of them – without a proper understanding of the context system and the problem situation within that context system. But it is too late. The focus is now on the solution – the intervention system.

Step 4. Compromises to reach consensus

As the solution (intervention system) is revealed, people at the blunt end are now at the sharp end of a difficult process of design and implementation. There are disagreements, and they start to see a number of complications. But the stability of the group is critical. The intervention system is put out for comment, usually to a limited audience and with the aim of proving its viability. There are further insights about the problem situation and context system, but these arrive in a haphazard way, instead of through a process of understanding involving design and systems thinking. Eventually, compromises are made to achieve consensus and the intervention system is specified further. Plans are made for its realisation. The potential to resolve the problem situation is hard to judge because neither the problem situation nor the context system is properly understood.

Step 5. The project becomes a thing unto itself

The focus now turns to realisation. The problem situation and context system, which were always out of focus, are now out of view. The assets and needs of all stakeholders were never in view, but the needs of the stakeholders who are invested in the roll-out of the solution (intervention system) have been met: they can now feel reassured that something is being done. The need for corporate anxiety-reduction is now being addressed. Something is being done.

So the focus now switches from the intervention system to the realisation system – the system for bringing the solution into effect (management teams, resources, project management processes, materials, etc).

Step 6. Authorities require and regulate it

As the intervention system (the ‘solution’) gets more attention, authorities believe that this is a Good Thing. Sometimes, solutions will be mandated and regulated, and monitored by those with regulatory power. Now there is no going back.

Step 7. The solution does not resolve the problem situation

As the solution is deployed, it becomes the deployed system. This is not necessarily the same as the original idea (the intervention system). Compromises have been made along the way, both by those responsible for the intervention system (compromising on aspects of the concept), and by those responsible for the realisation system (compromising on aspects of implementation).

The design or implementation (or both) of the solution meets a need (corporate anxiety reduction) but does not resolve the original problem. The original problem remains, perhaps in a different form. Never events still happen (Solution 4); a ‘paperless’ discharge summary process (Solution 5) still requires paper. The feedback loops, however, contain delays and distortion, which we will come back to.

Step 8. Unintended consequences

Not only does the solution not resolve the original problem, but it brings new problems that were never imagined. These include problems concerning system conditions (e.g., higher unwanted demand, more pressure, more use of resources), and problems concerning system behaviour (e.g., increased workload, unwanted workarounds).

Here are some healthcare examples:

A Duty of Candour (Solution 1) process results in a “highly bureaucratic process which has reinforced the blame culture.”

A Do Not Attempt Resuscitation (DNAR) form (Solution 2) results in patients being “subjected to aggressive, yet ultimately futile, resuscitation measures which may include multiple broken ribs, needle punctures in the arms, wrists and groin, and electric shocks” and nurses and paramedics working “in such fear of not doing CPR when there is no DNACPR that they may override their own professional judgement and do CPR when it is clearly inappropriate.”

Dementia diagnosis targets (Solution 3) result in “naming and shaming supposedly poorly diagnosing practices – published online. Setting doctors harmful tasks, leading them almost to “process” patients.”

A Never Events list (Solution 4) – similar to various popular zero harm initiatives – “ignored the potential for using never events as a stick to beat people up with, … ignored the potential for gaming the data, … ignored the potential for people to become fearful of reporting and the loss of learning as a result.”

A ‘paperless’ discharge summary process (Solution 5) actually results in more paper. Similarly, following the implementation of a computerised medical system, “work-as-done reverted back to the system that was in place before where secretaries still had to print results on bits of paper and hand them to consultants to action” (Solution 6).

Amidst these unintended consequences, the context system has now changed and there may well be competing systems that address the problem, masking the effects of the deployed system. For instance, along with a Central Line Associated Bacteraemia (CLAB) checklist (Solution 9), another deployed system was CLAB packs: “These put all required items for central line insertion into a single pack thereby making it easier for staff to perform the procedure correctly.” Which has the effect imagined?

Furthermore, there may be inadequate collaboration or support from collaborating systems and sustainment systems (which collaborate with the deployed system to achieve some goal or help it continue to function). Examples include blunt-end roles for monitoring, analysis, feedback, and the supply of tools, materials, and technical support. These stakeholders are typically far removed from operational work-as-done and do not understand the assets and needs of those who work on the front line. It may be that the deployed system cannot even function as intended, as designed or as originally implemented.

Step 9. People game the system

Many work-as-imagined solutions can be gamed, and it may well be locally rational to the people who do – rather than imagine – the work. This is typical of measures (especially when combined with targets or limits) and processes. Following are some healthcare examples.

Radiology request forms are meant to be completed and signed by the person requesting the procedure. However, “In the operating theatre, the surgeon is usually scrubbed and sterile, therefore the anaesthetist often fills out and signs the form despite this being “against the rules”” (Solution 7).

On the introduction of the Commissioning for Quality and Innovation payments framework (CQUINs) to drive innovation and quality improvement in the NHS, clinicians are “demotivated by the process of collecting meaningless data and are tempted to use gaming solutions to report best performance” (Solution 8), having informed the commissioners of problems with the deployed system and offered suggested improvements to the metrics (which do not fit the intervention system concept).

Checklists for the prevention of Central Line Associated Bacteraemia (CLAB) (Solution 9) are completed “retrospectively without watching the procedure, as they were busy with other tasks”.

Step 10. It looks like it works

The gaming, combined with feedback lags and poor measures, may well give the illusion that the deployed solution is working, at least to those not well connected to work-as-done.
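This dynamic can be sketched in a toy simulation. To be clear, this is an illustrative model only, not from the original post: the numbers, month indices, and the ‘confound’ variable are invented for the sketch, loosely echoing the CLAB example. A deployed checklist changes nothing; compliance is recorded (retrospectively) as near-perfect; and a concurrent change actually lowers the incident rate – so, to a blunt-end observer, the metric appears to vindicate the checklist.

```python
import random

random.seed(1)

# Toy model (assumptions, not data): monthly incident rates, a 'solution'
# deployed at month 6 that has no real effect, and a confounding change
# at month 12 (e.g. pre-assembled packs) that does reduce incidents.
MONTHS = 24
SOLUTION_MONTH = 6    # checklist deployed; no causal effect in this model
CONFOUND_MONTH = 12   # concurrent change that actually cuts incidents

true_rates = []
reported_compliance = []
for m in range(MONTHS):
    # The true rate only drops when the confound arrives.
    base = 10.0 if m < CONFOUND_MONTH else 4.0
    true_rates.append(base + random.uniform(-1, 1))
    # Compliance is 'gamed': recorded as near-perfect once the form exists.
    reported_compliance.append(0.98 if m >= SOLUTION_MONTH else 0.0)

# What the blunt-end observer computes: high compliance, then a falling
# rate - and credits the checklist, though it changed nothing here.
before = sum(true_rates[SOLUTION_MONTH:CONFOUND_MONTH]) / (CONFOUND_MONTH - SOLUTION_MONTH)
after = sum(true_rates[CONFOUND_MONTH:]) / (MONTHS - CONFOUND_MONTH)
print(f"Mean rate after checklist, before confound: {before:.1f}")
print(f"Mean rate after confound:                   {after:.1f}")
print(f"Reported compliance since deployment:       {reported_compliance[-1]:.0%}")
```

In this sketch the rate only falls once the confound arrives, yet a dashboard showing compliance and incident rate side by side would read as a success story for the checklist – the illusion described above.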

After introducing the CLAB bundle (Solution 9), “very high levels of reported checklist compliance” were observed, “followed by the expected drop in our rates of infection, confirming the previously reported benefits”, but the drop instead “appears to be due to the use of CLAB packs. These put all required items for central line insertion into a single pack thereby making it easier for staff to perform the procedure correctly.”

With the WHO Surgical Safety Checklist (Solution 10), “The assumption within an organisation at ‘the blunt end’ is that it is done on every patient”, despite “clear evidence that there is variability in how the checklist is used both within an organisation and between organisations”.

Of course, there may well be knowledge that work-as-imagined does not align with work-as-done, but this is an inconvenient truth. Too often, what we are left with is a separation (or even inappropriate congruence) of the four varieties of human work: work-as-imagined, work-as-prescribed, work-as-done, and work-as-disclosed. This is enacted in a number of archetypes of human work.

This is not the end of the process, but by this stage, the project team that worked on the originally intended solution (the intervention system) have moved on. The deployed system remains, and now we must imagine a solution for both the original problem and the new problems.


In summary

Work-as-Imagined Solutioneering
Step 1. Complex problem situation
Step 2. Complexity is reduced to something simple
Step 3. Someone has an idea
Step 4. Compromises to reach consensus
Step 5. The project becomes a thing unto itself
Step 6. Authorities require and regulate it
Step 7. The solution does not resolve the problem situation
Step 8. Unintended consequences
Step 9. People game the system
Step 10. It looks like it works


Solution 1: Duty of Candour

Over the last few years there has been a call to enshrine ‘saying sorry’ in law. This became the ‘duty of candour’. When this was conceived it was imagined that people would find the guidance helpful and that it would make it easier for frontline staff to say sorry to patients when things have gone wrong. Patient advocates thought it would mean that patients would be more informed and more involved and that it would change the relationship from an adversarial to a partnership one. In practice this policy has created a highly bureaucratic process which has reinforced the blame culture that exists in the health service. Clinical staff are more fearful of what to say when something goes wrong and will often leave it to the official process or for someone from management to come and deliver the bad news in a clinical, dispassionate way. The simple art of talking to a patient, explaining what has happened and saying sorry has become a formalised, often written, complied-with duty. The relationships remain adversarial and patients do not feel any more informed or involved than before the duty came into play. Suzette Woodward, National Clinical Director, Sign up to Safety Team, NHS England @SuzetteWoodward

Solution 2: Do Not Attempt Cardiopulmonary Resuscitation (DNACPR) form

A Do Not Attempt Resuscitation (DNAR) form is put into place when caregivers feel that resuscitation from cardiac arrest would not be in the patient’s best interests. These forms have received a significant amount of bad press, primarily because caregivers were not informing the patient and/or their families that these were being placed. Another problem with DNAR forms is that some clinicians feel that they are being treated as “Do Not Treat” orders, leading (they feel) to patients with DNAR forms in place receiving sub-standard care. This means that some patients who would not benefit from resuscitation are not receiving DNAR forms. As a result when these patients have a cardiac arrest they are subjected to aggressive, yet ultimately futile, resuscitation measures which may include multiple broken ribs, needle punctures in the arms, wrists and groin, and electric shocks. It is not unusual to hope that these patients are not receiving enough oxygen to their brains to be aware during these last moments of their lives. Anonymous, Anaesthetist

What is sad is that this is not an unusual story. Unless a person dying in Hospital or a Nursing Home has a DNACPR, CPR will usually be done. CPR may even be done when a person in frail health dies at home without a DNACPR, because the paramedics may be instructed to do CPR “Just in case it was a cardio-pulmonary arrest”. Nurses and paramedics work in such fear of not doing CPR when there is no DNACPR that they may override their own professional judgement and do CPR when it is clearly inappropriate. Recently a nurse was reprimanded by the Nursing and Midwifery Council for not trying CPR on a nursing home resident who, in my opinion, was clearly already dead. I know of a case in our Hospital in which CPR was started on a person whose body was already in rigor mortis. Dr Gordon Caldwell, Consultant Physician, @doctorcaldwell

Solution 3: Dementia Diagnosis Targets

There are high levels of burnout. A target-driven culture is exacerbating this problem. A typical example was when the government seemingly became convinced by poor quality data which suggested that dementia was under-diagnosed. So it decided to offer GPs £55 per new diagnosis of dementia. Targets were set for screening to take place – despite the UK National Screening Committee having said for years that screening for dementia was ineffective, causing misdiagnosis. And when better data on how many people had dementia was published – which revised the figures down – it was clear that the targets GPs were told to meet were highly error-prone. The cash carrot was accompanied with a beating stick, with the results – naming and shaming supposedly poorly diagnosing practices – published online. Setting doctors harmful tasks, leading them almost to “process” patients, fails to respect patient or professional dignity, let alone the principle of “do no harm”. [Extract from article The answer to the NHS crisis is treating its staff better, New Statesman.] Margaret McCartney, General Practitioner, @mgtmccartney

Solution 4: Never Events List

When we created the list of ‘never events’ at the National Patient Safety Agency we genuinely thought that it would lead to organisations focusing on a few things and doing those well. We thought it was a really neat driver for implementation of evidence based practice (e.g. the surgical safety checklist). We ignored the potential for using never events as a stick to beat people up with, we ignored the potential for gaming the data, we ignored the potential for people to become fearful of reporting and the loss of learning as a result. We importantly ignored the fact that in the vast majority of cases things can never be never – that it is a fact of life that things can and do go wrong no matter how much you try to prevent it. There is no such thing as zero harm and the never events initiative unfortunately gave the impression that it could exist. Suzette Woodward, National Clinical Director, Sign up to Safety Team, NHS England @SuzetteWoodward

Solution 5: ‘Paperless’ Discharge Summary Process

Our paperless Discharge Summary process generated about 5 times as many sheets of A4 as the old paper system, as the ‘paperless’ prescription got corrected and refined prior to discharge. Then we were still told we had to print a copy to go into the paper notes, and of course the patient has to have a paper copy because there was no way to email it to the patient. The software could not message pharmacy, so we had to print out the discharge meds to be sent to pharmacy, who then checked them, found the errors, got doctors to correct them, then another print-out, and round again. There are so many paper copies that sometimes an earlier incorrect paper copy gets filed into the notes. Then, unless someone hits ‘Finalise’, the pdf copy never gets emailed to the GP at all. Dr Gordon Caldwell, Consultant Physician, @doctorcaldwell

Solution 6: Computerised Medical systems

With the installation of a fully computerised system for ordering all sorts of tests (radiology requests, lab requests, etc.), work-as-imagined (and -as-prescribed) was that this would make work more efficient and safer, with less chance of results going missing or being delayed. Prior to the installation there was widespread talk of how effective and efficient this would be. After installation it became apparent that the system did not fulfil the design brief: while it could order tests, it could not collate and distribute the results. So work-as-done then reverted back to the system that was in place before, where secretaries still had to print results on bits of paper and hand them to consultants to action. Craig McIlhenny, Consultant Urological Surgeon, @CMcIlhenny

Solution 7: Radiology Request Forms

Radiology request forms are meant to be completed and signed by the person requesting the procedure. In the operating theatre, the surgeon is usually scrubbed and sterile, therefore the anaesthetist often fills out and signs the form despite this being “against the rules”. Managers in radiology refused to believe that the radiographers carrying out the procedures in theatre were “allowing” this deviation from the rules. Anonymous.

Solution 8: CQUINs (Commissioning for Quality and Innovation payments framework)

Commissioners often use CQUINs (Commissioning for Quality and Innovation payments framework) to drive innovation and quality improvement in the NHS. In theory, the metrics relating to individual CQUINs are agreed between commissioners and clinicians. In practice, some CQUINs focus on meaningless metrics. A hypothetical example: a CQUIN target for treating all patients with a certain diagnosis within an hour of diagnosis is flawed due to a failure of existing coding systems to identify relevant patients. Clinicians inform the commissioners of this major limitation and offer suggested improvements to the metrics. These suggested improvements are not deemed appropriate by the commissioning team because they deviate significantly from previously agreed definitions for the CQUIN. The clinicians are demotivated by the process of collecting meaningless data and are tempted to use gaming solutions to report best performance. This situation is exacerbated by pressure from the management team within the NHS Trust who recognise that failure to demonstrate adherence to the CQUIN key performance indicators is associated with a financial penalty. The management team listen to the clinicians and understand that the data collection is clinically meaningless, but insist that the clinical team collect the data anyway. The motivational driver to improve performance has moved from a desire to improve clinical outcomes to a desire to reduce financial penalties. The additional burden is carried by the clinical team who are expected to collect meaningless data without any additional administrative or job plan support. Anonymous, NHS paediatrician

Solution 9: Central Line Associated Bacteraemia (CLAB) checklists

The use of checklists for the prevention of Central Line Associated Bacteraemia (CLAB) is well described and has been taken up widely in the healthcare system. The purported benefits of the checklist include ensuring all steps are followed as well as opening up communication between team members. After introducing the CLAB bundle into our Intensive Care Unit, we saw very high levels of reported checklist compliance followed by the expected drop in our rates of infection, confirming the previously reported benefits. However, when we observed our staff it became apparent that they were actually filling in the checklist retrospectively without watching the procedure, as they were busy with other tasks. The fall in the CLAB rate could therefore not have been due to the use of a checklist and instead appears to be due to the use of “CLAB packs”. These put all required items for central line insertion into a single pack thereby making it easier for staff to perform the procedure correctly. Carl Horsley, Intensivist, @horsleycarl

Solution 10: WHO Surgical Safety Checklist

The WHO Surgical Safety checklist was introduced into the National Health Service following the release of Patient Safety Alert Release 0861 from the National Patient Safety Agency on 29 January 2009. Organisations were expected to implement the recommendations by February 2010 including that ‘the checklist is completed for every patient undergoing a surgical procedure (including local anaesthesia)’. All organisations have implemented this Patient Safety Alert and the WHO Surgical Safety checklist is an integral part of the process for every patient undergoing a surgical procedure. Whilst the checklist appears to be used in every patient, there is clear evidence that there is variability in how the checklist is used both within an organisation and between organisations. Within an organisation, this variability can occur between teams with differences in the assumed value of using the checklist and within a team between individuals or professional groups. Its value can degrade to a token compliance process to ‘tick the box’. The assumption within an organisation at ‘the blunt end’ is that it is done on every patient. Alastair Williamson, Consultant Anaesthetist, @TIVA_doc

Reference

Martin, J. N. (2004). The Seven Samurai of Systems Engineering: Dealing with the Complexity of 7 Interrelated Systems. Presented at the 2004 Symposium of the International Council on Systems Engineering (INCOSE).


About stevenshorrock

I am a systems ergonomist/human factors specialist and work psychologist with a background in practice and research in safety-critical industries. My main interest is human and system behaviour in the context of safety-related organisations. I seek to enable improvement via a combination of systems thinking, design thinking and humanistic thinking. I am a Chartered Ergonomist and Human Factors Specialist with the CIEHF and a Chartered Psychologist with the British Psychological Society. I currently work as a human factors and safety specialist in air traffic control in Europe. I am also Adjunct Associate Professor at University of the Sunshine Coast, Centre for Human Factors & Sociotechnical Systems. I blog in a personal capacity. Views expressed here are mine and not those of any affiliated organisation, unless stated otherwise. You can find me on twitter at @stevenshorrock
This entry was posted in Human Factors/Ergonomics, Safety, systems thinking.
