How reliable is healthcare?
Dr Dan Cohen, an international medical director based in the US, looks at the biggest challenge to healthcare safety: complacency
The healthcare industry is defined by continuous change, but continuous change does not necessarily mean continuous improvement.
Emerging technologies hold great promise for advancing our diagnostic and therapeutic options – but as the frequency and complexity of healthcare interventions increase, so does the risk of system or personal failures that can harm patients.
Through litigation, these failures can also harm institutions and careers. It is vital that healthcare professionals recognise the hazards associated with providing healthcare services and confront the very real challenge of complacency. While we may see harm when it occurs, more often than not we do not see the “near misses” – and because we do not, this feeds our complacency. We are simply not aware of how often something goes amiss!
Every day, thousands of patients are harmed or die in modern, well-equipped hospitals staffed by highly trained individuals. Benevolent intentions do not necessarily translate into safety. The challenge is to understand how so many things can go wrong when the intention is to achieve the highest quality outcomes and assure patient safety.
High reliability organisations (HROs) are those that function safely and efficiently in industries that are very dangerous. HROs have established cultures and supporting processes designed to dramatically reduce the likelihood of human error and harm. They recognise that in the interactions between humans and technologies, it is the humans that represent the most substantial sources of risk.
Industries commonly considered to exemplify the attributes of high reliability include the nuclear power, automotive and aviation industries. In aviation, for example, aeroplanes are so well designed, with redundantly engineered systems, that the risks arise primarily from the aircrew. Human factors are the source of most risks and errors.
It has been argued that if the healthcare industry simply adopted the characteristics and methodologies of HROs, we would raise the bar for quality and safety. If this is true, then why is there so much inertia in our systems of care – inertia that plagues our improvement strategies? Why have we not solved this problem, when so many solutions abound? Complacency is the pernicious confounder. We do not see the sources of harm or the near misses, and we especially do not see ourselves as sources of harm.
The defining characteristics of HROs have been summarised by Weick and Sutcliffe1 and, in abbreviated format, are portrayed below:
- Sensitivity to operations – a constant awareness by leaders and staff to risks and prevention, a mindfulness of the complexities of systems in which they work and on which they rely.
- Reluctance to simplify – avoidance of overly simplistic explanations for risks or failures and a commitment to delve deeply to understand sources of risk and vulnerabilities within systems.
- Preoccupation with failure – a focus on predicting and eliminating catastrophes rather than reacting to them; a “collective mindfulness”2 that things will go wrong and that ‘near misses’ are opportunities to learn.
- Deference to expertise – leaders and supervisors listening to and seeking advice from frontline staff who know how processes really work and where risks arise.
- Resilience – leaders and staff trained and prepared to respond when systems fail, and who work effectively as teams to overcome urgent challenges.
A natural fit?
Healthcare systems entail many unique factors that are at variance with HRO industries. Even though some HRO characteristics have been adopted or adapted by healthcare systems, such as the use of checklists, the unique factors of healthcare pose a challenge: the increased frequency of human-to-human interactions and associated communication challenges, and the complex vagaries of our diagnostic processes.3
Healthcare professionals are not engineers or pilots and our way of doing business is fraught with uncertainty and variability. Many of our diagnostic and therapeutic interventions are based on insufficient evidence and are over-utilised, thus increasing risks and the potential for harm.
Most importantly, patients are not aeroplanes. They are far more complex than aeroplanes. They have morbidities and comorbidities, genetic propensities, fears, belief systems, social and economic confounders, intellectual and cognitive challenges, and language and fluency issues.
Because best and safest outcomes are dependent on patient engagement, patients should be viewed as components of the healthcare system, not passive recipients of healthcare services (like passengers sitting in an aeroplane). This perspective is an integral component in a high-reliability system that is focused on avoiding risk.
Dr Dan Cohen is International Medical Director at Datix Inc. In his role as consultant in patient safety and risk management, Dr Cohen advises global thought leaders and speaks at conferences worldwide on improving patient outcomes.