Traps to Avoid in Safety Investigations, Education and Practice
For those of us in the safety science and human factors profession, it has been apparent for many years that conceptual thinking and approaches around care safety education, policy and practice can often be out of step with those in other high-risk industries. Indeed, our own NES research outputs frequently point to mistranslations, misconceptions and clear misunderstandings in care safety policy and education. Similarly, the recent book on the subject by Wears and Sutcliffe (2020) offers a withering critique of what has unfolded across the healthcare safety and improvement domain over the past two decades, aptly summarised in the quotation below:
“…the patient safety movement itself has gotten things wrong. Its understandings … of concepts such as safety, harm, risks and hazards are incomplete and simplistic and, as a result, its work has been grounded in assumptions and generalisations that are either wrong or lacking in context” (Wears & Sutcliffe, 2020)
Against this background, a recently completed NES research project sought to critically review the safety-related content, language and assumptions embedded in a small but diverse range of health and care safety learning reports, policies, databases and curricula. The following information sources, which were in the public domain or volunteered by care organisations, were selected for review:
- NHS Board Adverse Event Learning Summaries
- Ombudsman Reports on Complaints
- Data from incident reporting and learning systems
- National and organisational management of adverse events policies
- Organisational incident investigation reports
- National and international patient safety curricula
What we found
Examples of our key findings are presented under the following themes as basic guidance on ‘traps to avoid’, supported by a few evidential quotes (Box 1):
Omitting the ‘systems approach’
The systems approach represents ‘best practice’ in modern safety science and is fundamental to understanding why things go wrong in highly complex care systems and extracting meaningful learning. It acknowledges that safety occurrences and complaints are the result of multiple, interacting contributory factors from across the care system, and not solely due to the decisions and actions of individuals at the ‘sharp-end’ of practice. Safety and complaints investigations should always be guided by a relevant system analysis approach, underpinned by the adoption of a ‘systems thinking’ mindset (Box 2).
Using the language of blame and human failure
Pejorative, biased, judgemental and blame-laden language was apparent across multiple information sources, particularly in databases and in some incident investigations and complaints reports. It is unclear to what extent this direct and indirect ‘blaming’ is unintentional, but it is unhelpful and should be avoided, as it runs counter to organisational learning and to supporting the wellbeing of those involved. It also risks further antagonising the staff affected, who may then be reluctant to report incidents, act on recommendations or participate in investigation processes in future.
Overlooking the ‘local rationality’ principle
When looking back with the benefit of hindsight and judging the decisions and actions of individuals, we need to consider the local rationality principle: that is, explore and understand why those decisions and actions made sense to the people involved at the time, given the context and situation they found themselves in. Had they foreseen the outcome that is now apparent, they would not have acted as they did. Doing this provides a more comprehensive explanation of why something has gone wrong and paves the way for more meaningful learning and improvement. There is limited evidence that this concept is known about or applied in care safety policy, education and practice.
Engaging in counterfactual reasoning
Counterfactual reasoning was clearly apparent across many information sources. This involves prioritising the analysis of what the care system did not do, rather than focusing on why it made sense at the time for the system to act as it did (see the local rationality principle). Stating in retrospect that people and organisations should have or could have done something describes an alternative universe; it is contrary to what actually happened, and so offers very limited learning potential. When we think or write in terms of what someone ‘coulda, woulda, shoulda’ done, this is a red flag that we have fallen into the counterfactual trap.
Misunderstanding key concepts
Clear evidence was uncovered of multiple misunderstandings and misuses of fundamental human factors, risk and safety terminology. To focus on a single issue: ‘human error’ and its synonyms (error, medical error, nursing error etc.) should be avoided unless properly used in context, and should never be attributed as the cause of a safety incident or complaint. These terms are frequently misunderstood, applied incorrectly and wrongly conflated with health and care outcomes (e.g. ‘medical error is a leading cause of death in this country’). Their use is unhelpful, potentially stifles learning and foments person-level blame.
The findings of this small study point to potential learning needs around how we understand and approach some aspects of safety and complaints investigations, and how we extract and share related learning. NES is currently developing a Learning Brief to raise awareness of the issues uncovered in this study, which should be of interest to health and care safety educators, practitioners, policy-makers and regulators, as well as those with leadership and advisory roles in related areas.
Prof Paul Bowie
Programme Director (Safety & Improvement)
If you wish to know more about this work, please get in touch: email@example.com
Box 1. Examples of quotes related to language bias, blame, counterfactuals and lack of local rationality
‘In a major departure from accepted medical practice, Dr E agreed to see Caroline and simply forgot about her.’
‘Poor administration and time management by Physio involved.’
‘He was phoned at least three times after her discharge from the city hospital……but failed to realize the seriousness of her condition.’
‘There was unreasonable failings in communication between staff involved in Mr A’s care and treatment’
‘On review, there were signs of sepsis at his initial presentation that should have been recognised leading to appropriate treatment at that stage.’
‘If a scan had been done in A&E this may have led to an earlier diagnosis’
‘an underlying vascular condition was not appropriately considered during initial skin inspections which led to the risk of tissue damage being underestimated.’
‘training and guidance on use and monitoring of security doors to be provided to staff’
Box 2. Examples of fundamental ‘Systems Thinking’ principles for investigating and learning from safety occurrences and complaints
Prioritise all of the people involved in the incident (patients, families, their advocates and the workforce) and seek multiple perspectives when attempting to understand system safety
Avoid blaming individuals, departments and organisations; focus learning at the system level
Safety incidents are caused by multiple, interacting contributory factors from across the care system
Understand that safety (or lack of safety) is an emergent property of highly complex care systems
Consider ‘human error’ as a symptom of a system problem, not its cause
Recognise that there is no ‘root cause’ of a safety incident in highly complex care systems
Adopt a recognised systems approach to investigation, learning and improvement
Explore and reconcile ‘work-as-imagined’ and ‘work-as-done’
Consider Local Rationality when learning from incidents
Explore performance variability (trade-offs and adaptations to practice)
Recommendations for improvement should focus on systemic change and redesign, rather than individual performance
Wears, R. L. & Sutcliffe, K. M. (2020). Still not safe: Patient safety and the middle-managing of American medicine. New York, NY: Oxford University Press.