
How reliable is healthcare?

Dr Dan Cohen, an international medical director based in the US, looks at the biggest challenge to healthcare safety: complacency

The healthcare industry is defined by continuous change, but continuous change does not necessarily mean continuous improvement.

Emerging technologies hold great promise for advancing our diagnostic and therapeutic options – but as the frequency and complexity of healthcare interventions increase, so too does the risk of system or personal failures that can harm patients.

Through litigation, these failures can also harm institutions and careers. It is vital that healthcare professionals recognise the hazards associated with providing healthcare services and confront the very real challenge of complacency. While we may see harm when it occurs, more often than not we do not see the “near misses” – and because we do not, this feeds our complacency. We are not truly aware of how often something goes amiss!

Every day, thousands of patients are harmed or die in modern, well-equipped hospitals staffed by highly trained individuals. Benevolent intentions do not necessarily translate into safety. The challenge is to understand how so many things can go wrong when the intention is to achieve the highest quality outcomes and assure patient safety.

Managing danger

High reliability organisations (HROs) are those that function safely and efficiently in industries that are very dangerous. HROs have established cultures and supporting processes designed to dramatically reduce the likelihood of human error and harm. They recognise that in the interactions between humans and technologies, it is the humans that represent the most substantial sources of risk.

Industries commonly considered to display the attributes of high reliability include the nuclear power, automotive and aviation industries. In aviation, for example, aeroplanes are so well designed, with redundantly engineered systems, that the risks arise primarily from the aircrew. Human factors are the source of most risks and errors.

It has been argued that if the healthcare industry would simply adopt the characteristics and methodologies of HROs, we would raise the bar for quality and safety. If this is true, then why is there so much inertia in our systems of care – inertia that plagues our improvement strategies? Why have we not solved this problem when so many solutions abound? Complacency is the pernicious confounder. We do not see the sources of harm or the near misses, and especially do not see ourselves as sources of harm.

The defining characteristics of HROs have been summarised by Weick and Sutcliffe1 and, in abbreviated format, are portrayed below:

  1. Sensitivity to operations – a constant awareness by leaders and staff to risks and prevention, a mindfulness of the complexities of systems in which they work and on which they rely.
  2. Reluctance to simplify – avoidance of overly simplistic explanations for risks or failures and a commitment to delve deeply to understand sources of risk and vulnerabilities within systems.
  3. Preoccupation with failure – a focus on predicting and eliminating catastrophes rather than reacting to them; a “collective mindfulness”2 that things will go wrong and that ‘near misses’ are opportunities to learn.
  4. Deference to expertise – leaders and supervisors listening to and seeking advice from frontline staff who know how processes really work and where risks arise.
  5. Resilience – leaders and staff trained and prepared to respond when systems fail, and who work effectively as teams to overcome urgent challenges.

A natural fit?

Healthcare systems entail many unique factors that are at variance with HRO industries. Even though some HRO characteristics, such as the use of checklists, have been adopted or adapted by healthcare systems, the unique factors of healthcare pose a challenge: the greater frequency of human-to-human interactions, with their associated communication challenges, and the complex vagaries of our diagnostic processes.3

Healthcare professionals are not engineers or pilots and our way of doing business is fraught with uncertainty and variability. Many of our diagnostic and therapeutic interventions are based on insufficient evidence and are over-utilised, thus increasing risks and the potential for harm.

Most importantly, patients are not aeroplanes. They are far more complex than aeroplanes. They have morbidities and comorbidities, genetic propensities, fears, belief systems, social and economic confounders, intellectual and cognitive challenges, and language and fluency issues.

Because best and safest outcomes are dependent on patient engagement, patients should be viewed as components of the healthcare system, not passive recipients of healthcare services (like passengers sitting in an aeroplane). This perspective is an integral component in a high-reliability system that is focused on avoiding risk.

Dr Dan Cohen is International Medical Director at Datix Inc. In his role as consultant in patient safety and risk management, Dr Cohen advises global thought leaders and speaks at conferences worldwide on improving patient outcomes.

Case study

Recently, I was admitted to a hospital for overnight observation after I tore my calf muscle in a fluke accident. I was at risk of developing a compartment syndrome that could have been very serious. The people who cared for me were kind, sensitive and caring. However, they were complacent and did not recognise their liabilities. Below is the litany of concerns I noted during my care:

  • I was misidentified and given another patient’s ID wristband, despite the fact that I handed my insurance details to the ED (Emergency Department) admissions clerk. The wristband did not include information that would enable me to identify this discrepancy, and only when a nurse tried to enter orders into the system was the discrepancy detected. This was not corrected for 30 minutes, delaying my evaluation even as my leg was becoming increasingly numb and purple. I was pointing this out to the nurse; there was urgency here, but...
  • I was seen by several different nurses, technicians and physicians, and it was the exception rather than the rule that these individuals washed their hands before touching me or touching equipment in the room, even after I jokingly pointed this out.
  • The CT scan technician did not offer me any gonadal shielding, even though he was scanning my entire right leg, and I did not think to ask.
  • When I was admitted, unable to ambulate without assistance, I did not receive a standardised falls risk assessment. I was clearly at very high risk of a fall and, though the nurse was very pleasant, he did not complete the formal risk assessment until morning rounds – and I had to use the toilet twice during the night. I managed, but I should have called for help and didn’t, and thus potentially became part of my own problem.
  • Finally, at discharge, no-one enquired about challenges in ambulation that might be unique to my home situation. I was to be provided a walker as I was not to bear weight on my injured leg. Though I was assured that the walker would be delivered on the afternoon of my discharge, it did not arrive until the evening of the following day, significantly increasing my risk of a fall at home.

In each of these instances, complacency was the pernicious confounder, including my own complacency. Fortunately, I did not encounter any real harm, only inconvenience; but I could have been seriously harmed. I encountered many ‘near misses’ that no-one even seemed to be aware of. What I experienced is not unique to any particular hospital; rather it is the common experience in hospitals worldwide.

In my view, if a healthcare system is a forest of complexities, then a giant coastal redwood of complacency towers high above the forest floor – a floor covered with the moss of ‘near misses’. One colossal tree standing high above the forest floor: it’s not all that complicated.

References

  1. Weick K, Sutcliffe K, Managing the Unexpected: Assuring High Performance in an Age of Complexity, San Francisco: Jossey-Bass (2001)
  2. Chassin M, Loeb J, The Ongoing Quality Improvement Journey: Next Stop, High Reliability, Health Affairs 30:559-568 (2011)
  3. Groopman J, How Doctors Think, Boston and New York: Houghton Mifflin (2007)