Debra Searle was a novice rower when she set off to row solo across the Atlantic in a plywood rowing boat. Three and a half months later, she docked after rowing 3,300 miles and battling 30ft waves, force 8 squalls, tankers and sharks.
Top performers fascinate Ms Searle; she often asks herself what makes them great. “I rely on mindset and attitude techniques; I use a lot of visualisation techniques. When I was rowing across the Atlantic I developed a technique called ‘choose your attitude’. Every day at breakfast I would choose my attitude for the day.
“Being aware and taking control of your attitude is an incredibly powerful thing. I’m convinced that if you choose the right attitude anything is possible.”
Debra Searle’s arguments are echoed in the story of Ben McBean, a Royal Marine, who lost his arm and leg after stepping on a bomb in Afghanistan. He says: “While I will never get used to having one arm and one leg, my injuries have not defined me; they have just changed me.”
Ben surrounds himself with positive people, so that he can lean on them if he’s having a bad day. He chose not to let his injury destroy his life.
Managing your emotions is part of understanding human performance. Olympic athlete Lizzy Yarnold won gold in the skeleton event last year, and she said: “Emotions are a really hard thing to control when you’re under pressure. I try to separate my emotions, so I decide my competition plan three days before the competition, so that when I’m trying to perform it’s a process – everything is pre-decided.”
Beginnings of human factors training
The study of human factors began in the aviation industry in the 1980s. Guy Hirst is a human factors expert; he was a training standards captain on the Boeing 747 and was instrumental in making human factors training a core part of pilot training. “Aviation accidents receive instant press attention, with images of charred hulls appearing in the media more or less immediately after the incident occurred. In the 1980s accidents were being tagged as being caused by ‘human error’ or ‘pilot error’. The authorities finally decided that the status quo was unsustainable and thus research into understanding human error began.”
Human factors in medicine
Medicine has been slower to fully embrace the relevance of human factors in medical error. In 2006 the Chief Medical Officer (CMO) reported in his review Good Doctors, Safer Patients:
“It is only… recently that attention has been focused on patient safety. Despite the relatively high level of risk associated with healthcare – roughly one in ten patients admitted to hospital in developed countries suffers some form of medical error – systematic attempts to improve safety and the transformations in culture, attitude, leadership and working practices necessary to drive that improvement are at an early stage.”1
According to Guy Hirst, medicine is probably more complex than any other field of human endeavour, and patients are far more complex and idiosyncratic than aircraft, ships or power stations. The critical similarity is that they all rely on teams of professionals working together, so there is much to gain from learning about human factors.
Glenn Mead, from the team that launched The Chimp Paradox, an internationally acclaimed mind management model, says that clinicians experience a lot of stress because the consequences of their actions, and the expectations placed on them, are great. “In this highly pressured and charged environment, being aware of how you think, sometimes irrationally and emotionally under pressure, is important. You should be able to step back and observe, getting some perspective on the situation.”
The psychology of human error
Professor James Reason is widely regarded as the world’s leading expert on human error. He argues that there is a paradox at the heart of the patient safety problem. Medical education is expected to bring about a “trained perfectibility”: after an extensive education, healthcare professionals are expected to get it right, but they are fallible human beings like the rest of society. However, for many, error equates to incompetence or worse, meaning mistakes may be stigmatised or ignored rather than seen as chances for learning.
The other part of the paradox is that healthcare, by its very nature, is highly error-prone. Guy Hirst says one of the reasons that healthcare is so challenging is the requirement to make decisions on the basis of incomplete evidence. “Events are constantly surprising, particularly as human anatomy is variable and each patient is unique.”
So how do these stories link to the performance of health professionals? Gretchen Haskins is an expert in human performance, having studied incidents and accidents her whole life. “People say human performance is messy and difficult to measure, but it’s becoming sadly more and more predictable as people are making the same types of errors over and over again.”
She believes that human beings and human performance are the single greatest factors in the success or failure of a system. This approach to risk is an important aspect of the science of human factors: often referred to as team resource management, it involves the study of all aspects of the way humans relate to the world around them, with the aim of improving operational performance and safety. This approach applies whether you’re working as an individual or as part of a team.
Studies of disasters such as Three Mile Island, The Herald of Free Enterprise, and Bhopal have illustrated human factors issues similar to those found in medical practice.2 According to James Reason, all humans make frequent errors and they make errors in predictable and patterned ways. Novices make errors due to incomplete knowledge and experts make errors due to the intrinsic hazards of semi-automated behaviour.3
Professor Reason argues that although error can never be completely eliminated, it can be managed. There are two distinct cognitive processes: first, the conscious cognitive process, which is used when a task is new; and second, an automatic cognitive process, used when the task has been practised and perfected, which occurs at a subconscious level. The salient point is that working memory is extremely limited in capacity; using it is also very effortful, making it the least preferred option.
The case for human factors training
Guy Hirst explains: “When humans work in complex systems, the opportunities for error-inducing conditions are unlimited and may be exaggerated by cultural and systems deficiencies. We have documented many examples of these error-inducing conditions during our own research working in operating theatres. The danger is that eventually the consequences of some of these familiar and generally tolerated conditions may well be fatal.”
Evidence is growing that human factors training should be an essential element of the broader patient safety curriculum. Given the limitations of human information processing, the way to reduce the potential for error-provoking situations is through effective team communication, and through the design of systems and protocols that allow for the shortcomings of human cognition. By being conscious of our attitudes and the cognitive factors discussed above, we can improve professional performance and mitigate the effects of human factors.
1. DH. Good Doctors, Safer Patients (2006)
2. Flin R, O’Connor P, Crichton M. Safety at the Sharp End: A Guide to Non-Technical Skills. Ashgate, Aldershot UK (2008)
3. Reason J. Human Error. Cambridge University Press, New York (1990)