Human error is a concept that seems self-explanatory, but there are different ways of understanding it and its causes. It is perhaps a bit like appreciating art. On the surface people see the same thing, but to the trained eye there is a deeper appreciation of what has produced such an outcome.
In art, professional artists and critics are better able to appreciate the decisions the artist made — the brush strokes, the lighting, the shade, the colour — and are able to situate the work within the painter's portfolio. Similarly, in safety, professional safety experts are better able to see the contextual factors that have led to the accident or error: stress, distractions, pressure to cut corners, a lax approach to following safety procedures, poor design, lack of communication, poor procedures and processes. This appreciation and connoisseurship go beyond what is on the surface to give a deeper, contextualized meaning to what has happened.
An individual perspective towards human error looks for explanations of its causes in people. Essentially, the key question is: "Who messed up? What can we do about them?" Here the error or accident is not seen as inevitable and can usually be traced back to the fault of an individual or individuals who made a wrong decision or were not paying attention. These people are the bad apples, and the remedial action is either to retrain and discipline the bad apples so they become good apples, or simply to get rid of them.
A systems perspective towards human error looks for explanations of its causes in the wider system. Essentially, the key question is: "What wider system factors have influenced this unfortunate event? What can we do to reduce the likelihood of similar events recurring in the future?" What we often hear from safety experts, once they start to investigate an incident, is that it is remarkable it did not happen sooner and that there are not more incidents like it. What we gather from this is that systems are often imperfect in many ways, but this only comes to light when things have gone wrong. From a systems perspective we would look not only at individuals, but also at procedures, practices, technology, communication and the working culture, amongst other things.
Human error connoisseurs are less satisfied with simply blaming the individual; we know that the full story is often much more interesting. Taking action against individuals might also mask what actually caused the accident, and leaving those causes in place might allow it to happen again. One test of whether an individual or a systems perspective is the appropriate response to an incident is the substitution question: if we substituted the person involved with another rational member of staff, would the incident still have happened? Often the same thing would have happened no matter who was involved.
Next time human error is mentioned in the media as the cause of an incident, and an individual is named and shamed, ask what the bigger story is and what could have been done, from a systems perspective, to prevent it from happening.
If you found this interesting, maybe you’d like to take a look at some of the other stories and articles in our Learning Zone?