14: Creating a culture of safety

The most fundamental change that will be needed if hospitals are to make meaningful progress in error reduction is a cultural one.
(LEAPE ET AL., 1998)


A somewhat lethal cocktail of impatience, scientific ignorance and naive optimism may have dangerously inflated our expectations of safety culture.
(COX AND FLIN, 1998)


Both these statements are true, but they point to different roles that culture can play in the struggle for safer healthcare. When Lucian Leape and others talk about changing the culture, they reflect a deeply held belief and commitment to a fundamental change in the way error and safety are approached, and an equally deeply felt conviction that until the culture changes, nothing else will. However, there is in fact comparatively little hard evidence that changing the safety culture has any direct impact on safety. As Cox and Flin (1998) point out, a naïve belief in the concept has far outstripped the evidence for its utility. We will see that these two viewpoints can be reconciled once we distinguish culture as a necessary foundation for change from culture as a force for change in its own right. But first we must examine the concept a little more closely.


The many facets of safety culture in healthcare


Anyone who begins to examine the safety literature comes across a bewildering array of descriptors applied to the word culture, each of which is supposed to illuminate some essential facet of the all-important safety culture. No blame culture, open and fair culture, flexible, learning, reporting, generative, resilient, mindful … the list goes on and on. In part, this reflects the fact that safety culture is not fully understood and that people have not rallied around a single definition or set of concepts. It also reflects, however, that there are a number of important facets to a culture of safety, as can be seen in the various examples of absent or inadequate safety culture (Box 14.1).



BOX 14.1 Safety culture in healthcare


‘There is too often a blame culture. When things go wrong, the response is to seek one or two individuals to blame, who may then be subject to disciplinary measures or professional censure. That is not to say that in some circumstances individuals should not be held to account, but as the predominant approach this acts as a significant deterrent to the reporting of adverse events and near misses’ (Department of Health, 2000, p. 77).


‘Increasingly patients and physicians in the United States live and interact in a culture characterized by anger, blame, guilt, fear, frustration and distrust. The public has responded by escalating the punishment for error. Clinicians and some healthcare organizations generally have responded by suppression, stonewalling and cover-up. That approach has been less than successful’ (Leape et al., 1998, p. 1446).


Absence of safety culture. A young boy died after failing to recover from a general anaesthetic administered at a dental practice. A fatal accident enquiry concluded that the boy’s death could have been prevented if a number of reasonable precautions had been in place. There was no agreement with a local hospital for rapid transfer of patients in emergencies, no heart monitor was attached when the anaesthetic was given, the anaesthetist lacked a specialist qualification and all staff lacked training in medical emergencies (Department of Health, 2000, p. 36).


A culture developed within the hospital that allowed ‘unprofessional, counter-therapeutic and degrading – even cruel – practices to take place. These practices went unchecked and were even condoned or excused when brought to the attention of the hospital. Some staff interviewed did not even recognize the abuse that had taken place as unacceptable practice.’ (Report of the UK Commission for Health Improvement following an investigation into the physical and psychological abuse of elderly patients, 2000.)


These examples of poor culture show, first, the importance attached to culture by experienced clinicians and safety experts. They also illuminate, to some extent, the different facets of culture and the different senses in which the word is used. The first two quotes are primarily concerned with the reaction to errors after they have occurred, and the authors are rightly critical of unthinking, heavy-handed reactions both inside healthcare organizations and in wider society; we are therefore concerned with the culture of both healthcare organizations and wider social mores. Another theme apparent here is that excessive blame prevents recognition of error and impedes learning and effective action to improve safety. The principal theme of the third example, on the other hand, while also concerned with error, is anticipation rather than response. Here safety culture implies that the people concerned should maintain good standards of practice but also be alert to the possibility of error and take steps to reduce or eliminate that possibility. The final example reveals another facet of safety culture, or rather its absence. In a deeply pathological culture, the difficulty is not so much blame as that problems are denied or not even acknowledged. As is sometimes said, the hardest problems to resolve are those where no one recognizes anything is wrong. Here the abuse referred to seems to have become normal, and therefore unnoticed by the staff concerned. Gradually, little by little, in a group isolated from mainstream clinical practice, behaviour that is unthinkable to begin with can become first tolerated, then routine and finally invisible.


All these examples supposedly concern the culture of safety; it seems to be a pretty broad, ill-defined and all-encompassing concept. Does this matter? Well, yes it does. If our challenge is to change the culture, as so many commentators urge, then we need to understand what safety culture is, or at the very least decide what aspects to highlight, and bring as much precision to the definition as can be mustered. First, though, we need to see how the concept emerged.


Organizational culture


The word culture has several different, but related, meanings. We are accustomed to thinking of culture in terms of the literary and artistic heritage of a people or the prevailing values and ethos of a particular nation. In medicine, culture has another meaning, as an environment in which bacteria or other organisms reproduce. This latter meaning could be seen as a metaphor for safety culture – provide the right culture and the required attitudes and behaviours will flourish. In a business environment, the structural school of thought argues that authority, clear hierarchy and rules are the primary determinants of well-functioning organizations; the cultural perspective, on the other hand, considers attitudes, values and norms to be fundamental (Huczynski and Buchanan, 1991). In the safety context, the contrast would be between relying on rules and regulations to produce safety and trying to engender a culture of safety.


While organizational culture has been studied for decades, it came to prominence as an explanatory concept during the 1980s. Rather than looking at particular structures and management practices, management gurus such as Peters and Waterman (1982) emphasized the cultural attributes and clear guiding values of high-performance organizations. Given that quite a few of these companies have since gone to the wall, it may be that the importance of culture was overstated, but the concept of culture as a determinant of organizational performance has nevertheless remained. The person who most clearly articulated the idea of organizational culture was Edgar Schein, in a book called ‘Organisational Culture and Leadership’ (Schein, 1985). The link with leadership will be discussed further below, but what interests us now is the clarity of Schein’s conceptualization of culture. Weick and Sutcliffe (2001) summarize this as:


Schein says that culture is defined by six formal properties: (1) shared basic assumptions that are (2) invented, discovered or developed by a given group as it (3) learns to cope with its problems of external adaptation and internal integration in ways that (4) have worked well enough to be considered valid and therefore (5) can be taught to new members of the group as the (6) correct way to perceive, think and feel in relation to those problems. When we talk about culture therefore, we are talking about assumptions that preserve lessons learned; values derived from those assumptions that prescribe how the organisation should act; and visible markers and activities that embody and give substance to the espoused values.
(WEICK AND SUTCLIFFE, 2001)


So, in a healthcare setting, one basic assumption for all clinicians is that colleagues will always respond to a true emergency call; the priority of patient care in such situations is a core value, overriding all others. Locally, however, culture takes specific forms. Consider the experience of moving to a new hospital or a new ward to work. Very quickly one senses the differences in, for instance, how formal people are, how easy it is to speak up in meetings and whether it is possible to challenge or question senior staff; all these reflect the culture of that particular organization or group. In primary care, different practices organize themselves in different ways, with differing levels of availability to patients, differing degrees of shared responsibility and mutual support, and so on. In short, culture is, as has often been said, ‘the way we do things round here’.


Organizational culture and group culture


Culture, as noted above, is how we do things round here. Notice, however, that ‘here’ can be a small group, part of an organization, a group of professionals or an entire, huge organization like the British National Health Service, the largest employer in Europe. (The Chinese army is apparently larger worldwide, though I do not have definitive figures.) Ideally, members of an organization share the same values and commitment, whether in a university, a business or a nuclear power plant. Safety, one would hope, would be a value on which everyone could agree and around which attitudes and values would cohere. However, the safety culture within an organization may vary markedly between different areas and different groups. For instance, in a survey of employees in the nuclear industry, Harvey et al. (2002) found that managers had largely positive views of their own commitment to safety and saw themselves as taking responsibility for safety issues. Shop floor workers, on the other hand, generally had more negative views about management commitment to safety and management’s ability to listen and respond to safety concerns. This divergence in the views of managers and shop floor workers may sound familiar to anyone who works in healthcare.


Healthcare is particularly complex because of the large number of professional groups, each with their own culture and ways of doing things. Nursing, for instance, tends to have a much stricter disciplinary code and a harsher attitude to errors than medicine; substantive nursing errors are often followed by formal warnings or sanctions, to a much greater extent than in other professional groups. National culture may also be influential, as Bob Helmreich’s work has elegantly shown in the context of aviation (Helmreich and Merritt, 1998). Efforts to train cockpit teams in more open styles of communication, for instance, have had to contend with widely varying cultural attitudes to seniority and hierarchy. Some cultures, particularly in Asian nations, have a much greater ‘power gradient’ than most European countries; there is greater deference to authority and an unwillingness to challenge senior figures; in this case, cockpit attitudes reflect wider social mores. As we begin to explore attitudes to and experiences of patient safety in different countries, these differences are likely to emerge in healthcare as well.


Safety culture


Safety culture is one aspect of the wider culture of the organization. In this section, we will define safety culture and consider some of the most important aspects, those relating to openness, blame, reporting and learning.


The UK Health and Safety Commission (1993) quotes in many of its documents the following definition, originally provided by the Advisory Committee on the Safety of Nuclear Installations (ACSNI). It succinctly captures the essential features:


The safety culture of an organisation is the product of the individual and group values, attitudes, competencies and patterns of behaviour that determine the commitment to, and the style and proficiency of, an organisation’s health and safety programmes. Organisations with a positive safety culture are characterised by communications founded on mutual trust, by shared perceptions of the importance of safety, and by confidence in the efficacy of preventative measures.
(HEALTH AND SAFETY COMMISSION, 1993)


A safety culture is therefore founded on the individual attitudes and values of everyone in the organization. A strong organizational and management commitment is also implied; safety needs to be taken seriously at every level of the organization. The Chief Executive needs to provide clear and committed leadership, communicated throughout the organization, that gives the safety of patients and staff high priority. The cleaner on the wards must be conscious of infection risks, nurses must be alert to potential equipment problems and drug hazards, and managers must monitor incident reports. Finally, as the ACSNI committee indicates, producing and maintaining a safety culture is a long-term, systematic and continuing process. There is never a time when the job of enhancing and maintaining a safety culture is finished. Safety, like trust, is a highly perishable commodity with, as Richard Cook likes to say, the half-life of adrenaline.


An open and fair culture


The tendency for excessive, immediate and unreasoning blame in the face of patient harm, both from within and outside healthcare organizations, has led some to call for a ‘no-blame’ culture. This, if taken literally, would remove personal accountability and with it many of the social, disciplinary and legal strictures on clinical practice. A culture entirely without blame would therefore seem both unworkable and likely to weaken some of the restrictions and safeguards on safe behaviour. A much better objective is to try to develop an open and fair culture, which preserves personal responsibility and accountability but requires a much more thoughtful and supportive response to error and harm when they do occur.


The tendency to blame people for errors that have severe outcomes, satisfying as it may be in the short term, is often unwarranted and certainly not in the long-term interests of patient safety. Yet it takes a very cool-headed and thoughtful clinical leader or chief executive to take a systems view when faced with some awful incident, particularly when they may be under considerable pressure from relatives, the media, even government. Regulatory and professional bodies also face these pressures and equally have to decide whether a clinician’s behaviour deserves censure and disciplinary action. It is no good simply appealing to systems thinking and a just culture; a call has to be made one way or the other and some action taken.


Assessing culpability: the incident decision tree


In order to give form and structure to these decisions about culpability, Boeing developed a decision aid for maintenance error, in which the psychological principles involved in the occurrence of such errors were given concrete form as a step-by-step process examining the nature of the error, the influence of context and contributing factors, health and other pressures, and so forth. James Reason (1997) outlined a more general ‘culpability matrix’, which in turn was adapted by the UK National Patient Safety Agency to produce its ‘Incident Decision Tree’.


The structure of the NPSA’s Incident Decision Tree is shown in Figure 14.1. Essentially, after the incident has been investigated and some thought given to its causes, a series of questions is asked. Were the actions intentional? If yes, was there an intention to cause harm or not? Is there any evidence of a medical condition? Was there a departure from agreed protocols? And so on. Suppose, for instance, a staff nurse gives a dose of diamorphine to an elderly patient in severe pain without waiting for a prescription to be written. Is this justified? Potentially, if there is no other option. Suppose, however, she has made no attempt to contact the relevant doctor. In this case her actions were clearly intentional, the violation of protocols deliberate and without justification. In other cases, protocols and procedures may still have been ignored, but in circumstances that mitigate the error. The NPSA gives the example of a midwife who failed to notice discrepancies in a foetal heart reading, having been on duty for 15 hours without a break to cover for absent colleagues. Finally, there are areas of particular difficulty when the ‘correct’ action is not clear cut and a judgement must be made as to whether the risks outweigh the benefits. The decision aid commendably makes this an explicit issue:



Figure 14.1 Incident Decision Tree (adapted from the UK National Patient Safety Agency).


A surgical patient is receiving opiate analgesia via a syringe pump. A senior nurse, who has just come on duty, realises the pump has been set up to run much too fast and the patient’s breathing is slow and shallow. The nurse urgently summons medical staff assistance but there is no response. The patient stops breathing. The nurse decides there is no option but to deliver a naloxone injection himself to try and save the patient’s life. In doing so, he knowingly breached trust protocols (which were generally clear, workable and in routine use) and his own profession’s standards of accountability. However, the nurse was faced with a life or death situation and the risk to the patient of waiting for medical help was much greater than that of the nurse taking on what was properly a medical decision.
(WWW.NPSA.NHS.UK)


Using the Incident Decision Tree requires an initial analysis of the case and some reflection on the web of causes and contributory factors, and on the intentions and circumstances of the people involved. Deciding whether someone should be supported, praised or disciplined is never easy, but a formal decision process should make the eventual judgement more explicit, fairer to the staff involved and more in the interests of future patients in that healthcare organization.
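Because the tree is, at heart, a fixed sequence of questions, its flow can be sketched in a few lines of code. The sketch below is a minimal illustration in Python, not the NPSA’s actual tool: the question wording, the field names, the suggested outcomes and the inclusion of Reason’s ‘substitution test’ are simplifying assumptions of mine, and no such sketch can substitute for the investigation and reflection just described.

from dataclasses import dataclass

@dataclass
class Answers:
    # Judgements reached only after the incident has been investigated.
    # Field names and wording are illustrative assumptions, not NPSA wording.
    actions_intentional: bool   # Were the actions as intended?
    intended_harm: bool         # If intentional, was harm itself the aim?
    medical_condition: bool     # Any evidence of a medical condition affecting the act?
    protocol_departure: bool    # Was there a departure from agreed protocols?
    protocols_workable: bool    # Were the protocols clear, workable and in routine use?
    peers_would_do_same: bool   # Would a comparable colleague have acted the same way?
                                # (Reason's 'substitution test')

def assess(a: Answers) -> str:
    # Walk the questions in sequence and return a suggested response.
    if a.actions_intentional and a.intended_harm:
        return "possible deliberate harm: disciplinary or criminal process"
    if a.medical_condition:
        return "health problem: occupational health referral, not punishment"
    if a.protocol_departure and not a.protocols_workable:
        return "system problem: review the protocols themselves"
    if a.protocol_departure and a.actions_intentional:
        return "unjustified violation: individual accountability appropriate"
    if a.peers_would_do_same:
        return "system-induced error: support the individual, fix the system"
    return "possible individual failing: consider training or supervision"

# The staff nurse above, who deliberately bypassed workable prescription
# protocols without attempting to contact the doctor:
print(assess(Answers(True, False, False, True, True, False)))
# -> unjustified violation: individual accountability appropriate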


A culture of learning


One of my favourite aphorisms is that practice, in many different areas, is just ‘one mistake after another’. This is partly a rueful acceptance of the humiliating and frustrating nature of acquiring any skill; learning the piano, for example, is inevitably an experience of fumbled notes, incomprehension and strident discords each time one advances to a more difficult piece. More importantly though, the phrase brings out the idea that people, and indeed organizations, learn through noticing and reflecting on errors. The Total Quality Management gurus go as far as to say that every error is a treasure, which may be a step too far for some, but certainly errors can be highly informative. Organizations can advance and evolve by recognizing error or, conversely, decay and become unsafe by suppressing information about error and safety and adopting an ostrich-like ‘head in the sand’ approach to the landscape of error and hazard.


The nature and mechanisms of reporting systems were discussed in Chapter 4, along with some of the reasons why people do and do not report. Returning to this theme, in the cultural context we are more concerned with the attitudes and values that underlie a willingness to report and, more importantly, to reflect and learn. This means not just acknowledging error, but sometimes even celebrating its successful resolution. There is a famous story about Wernher von Braun, the rocket scientist (and inspiration for Dr Strangelove), presenting a bottle of champagne to a NASA engineer who had brought a major problem to his attention. Don Berwick provides a more recent example showing that this tradition continues (Box 14.2).



BOX 14.2


The Titan rocket was powered by liquid oxygen and hydrogen. The design of the rocket required great precision in the use of fuel – every drop had to be consumed before engine shutdown, completely emptying the tanks. To ensure the liquid emptied completely, four small metal baffles were placed at the bottom of the tank to stop the liquid swirling round the exit. Unfortunately, the fitted baffles were a little too big and an expensive, but necessary, fix was organized. The tanks were drained and a man in a diving suit was lowered on a harness to trim the baffles. Four bolts and all metal fragments had to be removed and collected; if metal was left in the tank, it would be sucked into the high-pressure pump and the rocket would explode.


The problem arose when the engineer who did the trimming, Jerry Gonsalves, returned and emptied the cloth sack, to find only three bolts. The team went back, looked carefully for the missing bolt, could not find it and concluded that there must only have been three. That night, Gonsalves could not sleep for thinking about the missing bolt. He returned to the tank and looked down to see if there were any places where the bolt could be hidden. He found two, and called the Director of Safety, Guy Cohen. The next morning they all assembled again, emptied the tank at huge expense, and lowered another engineer in to check. He went to the first of the two hiding places Gonsalves had identified, and found the bolt.


Guy Cohen asked me a question at this stage in the story. ‘Suppose it had been a nurse,’ he asked, ‘and we were talking about a serious drug error. What would happen in one of your hospitals?’ I knew the answer very well. ‘An incident report,’ I said. ‘And the nurse would probably have had some sort of warning put in her file. If the patient had died, she would probably be fired or worse.’


‘Then you’ll never be safe,’ he said. ‘That’s not what we did. We saved that bolt and had it gold plated and mounted on a plaque. And we had the NASA administrator come to the launch of that rocket a couple of days later. And in full view of everyone there, we gave the plaque to Jerry Gonsalves, and we dedicated the launch to him.’


(ADAPTED FROM BERWICK, 1998)
