8: Understanding how things go wrong



Following an Internal Enquiry at QMC, Professor Brian Toft was commissioned by the Chief Medical Officer of England to conduct an enquiry into the death and to advise on the areas of vulnerability in the process of intrathecal injection of these drugs and ways in which fail-safes might be built in (Toft, 2001). The orientation of the enquiry was therefore, from the outset, one of learning and change. We will use this sad story, and Brian Toft’s thoughtful report, to introduce the subject of analysing cases. Although the names of those involved were made public, I have changed them in the narrative, as identifying the people again at this distance serves no useful purpose. This case acts as an excellent, though tragic, illustration of models of organizational accidents and systems thinking.


The systems view of medical error was not, however, the approach taken by the courts. Dr Mitchell was charged with manslaughter, pleaded guilty and was sentenced to eight months' imprisonment. David James's parents considered the sentence ridiculous, pointing out that he would probably have served a longer sentence for theft of hospital equipment (Balen, 2004). The anger and desire for justice are more than understandable, and some would argue that no one, in whatever profession, should be exempt from charges of manslaughter. Conversely, criminalizing fatal medical mistakes and destroying careers and people may not actually help us improve patient safety. As Dr Mitchell said when interviewed by police, ‘I know it’s a lame excuse, but I am a human being’ (Holbrook, 2003). The proper role of the law in healthcare is too complex an issue to be discussed properly here, and in any event is heavily dependent on culture and wider societal attitudes and values. However, we should note the contrast between, on the one hand, the judicial view of error and the concept of manslaughter and, on the other, the view that emerges from Brian Toft’s enquiry. After considering the full circumstances of the case and the way the odds stacked up against the unfortunate patient and doctors involved in this tragedy, the reader can reappraise the verdicts.


Background to the incident


Provided Vincristine is administered intravenously (IV), it is a powerful and useful drug in the fight against leukaemia. The dangers of inadvertent intrathecal administration of Vincristine are well known: there are product warnings to that effect, a literature that stresses the dangers and well publicized previous cases. Medical staff at QMC had put a number of measures in place to prevent inadvertent intrathecal use, and it was clear that these precautions were taken seriously. There was a standard written protocol which, at the request of hospital staff, had been changed so that Cytosine and Vincristine would be administered on different days to avoid any potentially fatal confusion. Drugs for intravenous and for intrathecal use were also supplied separately to the wards, again to reduce the chances of mixing up the different types of drug. Nevertheless, due to a combination of circumstances, all these defences were breached and Mr James died (Box 8.1).



BOX 8.1 The death of David James


Mr James arrived on the ward at about 4.00 p.m.; he was late for his chemotherapy, but staff tried to accommodate him. The pharmacist for the ward had made an earlier request that the Cytosine should be sent up and that the Vincristine should be ‘sent separately’ the following day. The pharmacy made up the drugs correctly and they were put on separate shelves in the pharmacy refrigerator. During the afternoon the ward day case co-ordinator went to the pharmacy and was given a clear bag containing two smaller bags each containing a syringe – one Vincristine and one Cytosine. She did not know they should not be in the same bag.


Dr Mitchell was informed and was approached by Dr North to supervise the procedure, as the protocol required. When it had been established that Mr James’s blood count was satisfactory, Dr Mitchell told Dr North that they would go ahead with Mr James’s chemotherapy. The staff nurse went to the ward refrigerator and removed the transparent plastic bag, placed there by the day case co-ordinator, within which were two separate transparent packets, each containing a syringe. She noted that the name ‘David James’ was printed on each of the syringe labels, delivered the bag and went to carry on with her work.


Dr Mitchell looked at the prescription chart, noting that the patient’s name, drugs and dosages corresponded with the information on the labels attached to the syringes. He did not, however, notice that the administration of Vincristine was planned for the following day or that its route of administration was intravenous. Dr Mitchell, anticipating a cytotoxic drugs system similar to the one at his previous place of work, had presumed that, as both drugs had come up to the ward together, both were planned for intrathecal use. He had previously administered two types of chemotherapy intrathecally and it did not therefore seem unusual.


A lumbar puncture was carried out successfully and samples of cerebrospinal fluid were taken for analysis. Dr Mitchell then read out aloud the name of the patient, the drug and the dose from the label on the first syringe and then handed it to Dr North. Dr Mitchell did not, however, read out the route of administration. Dr North, having received the syringe, now asked if the drug was ‘Cytosine’, which Dr Mitchell confirmed. Dr North then removed the cap at the bottom of the syringe and screwed it onto the spinal needle, after which he injected the contents of the syringe.


Having put down the first syringe, Dr Mitchell handed the second syringe containing Vincristine to Dr North, again reading out aloud the name of the patient, the drug and dosage. Once again, he did not read out the route of administration. However, Dr Mitchell could not later recall if he:


actually said the word ‘Vincristine’ but once again I had clearly fixed in my mind that the drug was Methotrexate and not a drug for administration other than intrathecally. If I had consciously appreciated that the drug was Vincristine I would have stopped the procedure immediately and would never have allowed Dr North to administer it.


Dr Mitchell could not explain why he mentally substituted the word ‘Methotrexate’ for ‘Vincristine’, except that his mindset was that drugs for administration by a route other than intrathecal would simply not be available at the same time.


Dr North was surprised when he was passed a second syringe, because on the only other occasion that he had performed a supervised intrathecal injection only one syringe had been used. However, he assumed on this occasion that ‘…the patient was either at a different stage in his treatment or was on a different treatment regime than the other patient.’ Dr North, with the second syringe in his hand, said to Dr Mitchell ‘Vincristine?’ Dr Mitchell replied in the affirmative. Dr North then said ‘intrathecal Vincristine?’ Dr Mitchell again replied in the affirmative, after which Dr North removed the cap at the bottom of the syringe and screwed it onto the spinal needle. He then administered the contents of the syringe to Mr James, with ultimately fatal results.
(ADAPTED FROM TOFT, 2001)


Defences, discussed further below, are the means by which systems ensure safety. Sometimes the term is used to encompass almost any safety measure, but it more usually refers to particular administrative, physical or other barriers that protect or warn against deviations from normal practice. Usually these defences and barriers will ‘capture’ an error and block the trajectory of an accident. In this example, many defences and barriers existed, in the form of procedures and protocols, custom and practice. Administering Cytosine and Vincristine on separate days, for instance, is clearly intended to be a defence against incorrect administration. The separation of the two drugs in pharmacy and the separate delivery to the ward are other examples of defences against error. Having two doctors present checking labels and doses is another check, another barrier against potential disaster. If one or other of these checks fails, the outcome is usually still good. For instance, as long as the correct drug has been delivered, no harm will result if the doctor does not check conscientiously or is distracted while checking. It is nevertheless good practice to always check ‘just in case’. Sometimes however, as in this case, a series of defences and barriers are all breached at once. This is brilliantly captured in James Reason’s Swiss Cheese (Figure 8.1; Reason, 1990) metaphor of the trajectory of an accident, which gives us the sense of hazard being ever present and occasionally breaking through when all the holes in the Swiss Cheese line up.
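

To make the logic of layered defences a little more concrete, the short sketch below illustrates the general principle in code. It is not drawn from Toft’s report or from Reason’s work, and the failure probabilities and common-cause figure are purely assumed: it simply shows why several independent barriers usually contain a hazard, and why a single shared condition, such as two drugs arriving on the ward in one bag, can line up the holes in every slice at once.

```python
# A minimal sketch of defences in depth. All probabilities are illustrative
# assumptions, not data from the Queen's Medical Centre case.

def p_breach_independent(p_fail_each: float, n_defences: int) -> float:
    """Probability that a hazard penetrates every defence, assuming each
    defence fails independently with the same probability."""
    return p_fail_each ** n_defences


def p_breach_common_cause(p_fail_each: float, n_defences: int,
                          p_common: float) -> float:
    """Crude common-cause model: with probability p_common a single condition
    (e.g. both drugs delivered together) disables every defence at once;
    otherwise the defences fail independently."""
    return p_common + (1 - p_common) * p_fail_each ** n_defences


if __name__ == "__main__":
    # Four nominal defences: separate administration days, separate supply to
    # the ward, the label check, and the supervising doctor's check.
    print(p_breach_independent(0.05, 4))           # about 6 in a million
    print(p_breach_common_cause(0.05, 4, 0.001))   # dominated by the common cause
```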



Figure 8.1 Swiss cheese diagram. (Figure adapted from Reason, 1997)


Death from spinal injection: a window on the system


From the chronology one can see the classic ‘chain of events’ leading towards the tragedy. Dr Mitchell was quite new to the ward, unfamiliar with the chemotherapy regime and did not know the patient. The pharmacy, although it had prepared and stored the two drugs separately, somehow released them to the ward in a single bag. Although the doctors involved can be held responsible for their specific actions and omissions, one can also see that circumstances conspired against them. However, the case also illustrates some much more general themes, issues that pervade healthcare and indeed other organizations, and which are right now, as you read this, putting patients at risk.


Assumption that the system was reliable


The unit where David James died had used these drugs for many years without a major incident. After an event of this kind, and a subsequent analysis, we can see that the systems, while reasonably robust, were nevertheless far from fault free. Huge reliance was placed on custom and practice and on people simply knowing what they were doing. With experienced staff who know the unit’s procedures, this works reasonably well, but when new staff join a unit without clear induction and training, the system inevitably becomes unsafe. In fact, the unit where David James died seems to have been a well run unit, where professionals respected each other’s work and things went well on a day-to-day basis. Paradoxically, safety creates its own dangers in that an uneventful routine lulls one into a false sense of security. The safer one becomes, the more necessary it is to remind oneself that the environment is inherently unsafe. This is what James Reason means when he says that the price of safety is chronic unease (Reason, 2001). In fact, the very assumption that all is well can itself be dangerous.


Assumptions about people


Brian Toft introduces his examination of the tacit assumptions of those involved in this case with an apposite quote:


A newcomer assumes that he knows what the organization is about, assumes that others in the setting have the same idea, and practically never bothers to check out these assumptions.
(TOFT, 2001)


Dr Mitchell, the newest member of staff involved, assumed for instance that chemotherapy for different routes of administration could never be on the ward at the same time. He also assumed that he was competent to supervise Dr North, and that Dr North was allowed to give these drugs under supervision. More rashly still, he assumed that Dr North was familiar with Mr James’s case and so they did not need to consult his records. Dr North, in his turn, assumed that Dr Mitchell knew what he was doing and was authorized to supervise him. He also assumed that, although he should not have administered the drugs, it was permissible when authorized by Dr Mitchell. The assumptions made by each doctor were unfortunately perfectly matched, each tacitly reassuring the other of their mutual competence and the essential normality of the situation.


Senior doctors on the ward, although not directly involved, made their own assumptions. They assumed that Dr Mitchell knew about the dangers of Vincristine, that there was no need for a formal induction for junior staff, and that Dr Mitchell understood that ‘shadowing’ meant that he should not administer Cytosine.


None of the assumptions made by anyone involved was completely unreasonable. We all make such assumptions; in fact we need to in order to get through the day. People are assumed to be competent who are, in reality, ‘winging it’, doing the best they can in the circumstances. In healthcare this happens all the time, as junior staff battle with situations that are unfamiliar to them, or when more senior staff new to a unit feel that they must display more competence than they actually feel. We cannot check everything all the time. However, one can at least realize that many of one’s assumptions are likely to be wrong and begin to look, before disaster strikes, for the holes in the Swiss Cheese that permeate one’s own organization. We will return to this theme of vigilance and the anticipation of error and hazard later in the book.


The influence of hierarchy on communication


When asked why he did not challenge Dr Mitchell, Dr North said:


First of all, I was not in a position to challenge on the basis of my limited experience of this type of treatment. Second, I was an SHO (junior doctor) and did what I was told to do by the Registrar. He was supervising me and I assumed he had the knowledge to know what was being done. Dr Mitchell was employed as a Registrar by QMC which is a centre for excellence and I did not intend to challenge him.
(TOFT, 2001)


Dr North was in a very difficult position. He assumed that Dr Mitchell, as a registrar, knew what he was doing and reasonably points out that he himself had limited experience of the treatment. However, he did know that Vincristine should not be given intrathecally, yet he failed to speak up and challenge a senior colleague. Criticism might be made here both of Dr North, for not having the courage to request further checks, and of Dr Mitchell, for not taking the junior doctor’s query more seriously and at least halting the procedure while checks were made.


The interaction can also be seen as reflecting the more general problem of authority gradients in clinical teams. In a survey asking whether junior members of a team should be able to question decisions made by senior team members, pilots were almost unanimous in saying that they should (Helmreich, 2000). The willingness of junior pilots to question decisions is not seen as a threat to authority but as an additional defence against possible error. In contrast, in the same survey, almost a quarter of consultant surgeons stated that junior members of staff should not question seniors.


Physical appearance of syringes containing cytotoxic drugs


Syringes containing Vincristine were labelled ‘for intravenous injection’ and syringes containing Cytosine ‘for intrathecal use’. You might think this is fairly clear-cut, but on a busy ward with numerous injections being given every day, the design and packaging of drugs is an important determinant of the likelihood of error. In the final few minutes leading up to the fatal injection, the doctors involved were not helped by the similarity in appearance and packaging of the drugs. First, the labels were similar and, while the bold type of the drug and dose stood out, there were no other strong visual cues to draw a reader’s eye to the significance of the route of administration. Second, the syringes used to administer the two drugs were of similar size; the size of the syringe did not give any clues as to the route of administration to be used. Third, both drugs were clear liquids administered in similar volumes; neither colour nor volume gave any indication of the proper route of administration. Finally, the most dangerous physical aspect of all, in Toft’s opinion, is ‘that a syringe containing Vincristine can also be connected to the spinal needle that delivers intrathecal drugs to patients. Clearly, once such a connection has been made, the patient’s life is in danger as there are no other safeguards in place to prevent the Vincristine from being administered.’ (Toft, 2001: p. 14)


We can see therefore, first, that the syringes and labelling are unnecessarily similar and, second, that there are potential design solutions which would reduce, or even eliminate, this type of incident. Most obviously, syringes of drugs for intrathecal use could have their own specific, unique fitting, colour and design. While this might not eliminate the possibility of injecting the wrong drug, it does add a powerful check against administration by the wrong route. In the same way, fatalities in anaesthesia that resulted from switching oxygen and nitrous oxide supplies were eliminated by the simple expedient of making it impossible to connect the nitrous oxide line to the oxygen input. In daily life, there are thousands of such checks and guides to behaviour. When you fill your car with unleaded petrol you use a small nozzle; larger nozzles for leaded fuel or diesel will simply not fit into the filling pipe. In many areas of healthcare, we still have to learn these lessons and make these obvious improvements.
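

For readers who think in software terms, the ‘unique fitting’ idea can be expressed as a type constraint: the wrong connection is one the system simply refuses to make. The sketch below is an analogy only, with invented class names; it is not a design proposed in Toft’s report.

```python
# A minimal sketch of a 'forcing function': the spinal needle only accepts
# syringes of the intrathecal type, mirroring a physical connector that will
# not fit anything else. Class names are invented for illustration.

from dataclasses import dataclass


@dataclass
class IntravenousSyringe:
    drug: str


@dataclass
class IntrathecalSyringe:
    drug: str


class SpinalNeedle:
    def connect(self, syringe: IntrathecalSyringe) -> str:
        # Refuse anything that is not an intrathecal syringe.
        if not isinstance(syringe, IntrathecalSyringe):
            raise TypeError("only intrathecal syringes fit a spinal needle")
        return f"{syringe.drug} given intrathecally"


needle = SpinalNeedle()
print(needle.connect(IntrathecalSyringe("Cytosine")))    # permitted
# needle.connect(IntravenousSyringe("Vincristine"))      # refused: will not fit
```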


Unnecessary differences in practice between hospitals


The Joint Council for Clinical Oncology had published guidelines for the administration of cytotoxic chemotherapy. However, these were only advisory, and indeed the Council probably did not have the power to make them mandatory. Thus, what any particular doctor knew about the practice of administering cytotoxic drugs depended, to some extent at least, on local custom and practice. When doctors move from post to post, therefore, they encounter new practices and there is every possibility of confusion, particularly in the first few weeks.


The administration of cytotoxic drugs cries out for the adoption of national standards, aided by good design and training. Simply having the same procedures in place throughout the country would, if they were well designed, in itself be a safety measure. As an example of a much overdue standardization, the British National Patient Safety Agency has done the NHS a great service by the simple expedient of standardizing the hospital crash call number across the country; previously several different numbers were in use.


Much more could be, and has been, said about the death of David James. Our purpose here, however, is not to resurrect this particular tragedy or to criticize the people involved, but to use the story to show the complexity of events that lead to harm and to illuminate the many facets of patient safety. We can see that a combination of individual errors, assumptions about the workplace, poor design, communication problems, problems in team working and other contributory factors brought about this death. In fact, as we saw in the last chapter, this same blend of personal, design and organizational factors underlies many accidents and disasters. We will now look at this more formally by examining James Reason’s model of organizational accidents and its application in healthcare (Reason, 2001).


Aetiology of ‘organizational’ accidents


Many of the accidents in both healthcare and other industries need to be viewed from a broad systems perspective if they are to be fully understood. The actions and failures of individual people usually play a central role, but their thinking and behaviour are strongly influenced and constrained by their immediate working environment and by wider organizational processes. James Reason has captured the essentials of this understanding in his model of an organizational accident (Reason, 1997). We should emphasize, though, before describing the model, that not every slip, lapse or fall needs to be understood in terms of the full organizational framework; some errors are confined to the local context and can be largely explained by individual factors and the characteristics of the particular task at hand. However, major incidents almost always evolve over time, involve a number of people and a considerable number of contributory factors; in these circumstances the organizational model (Figure 8.2) proves very illuminating.



Figure 8.2 Organizational accident model (adapted from Reason, 1997).


The accident sequence begins (from the left) with the negative consequences of organizational processes, such as planning, scheduling, forecasting, design, maintenance, strategy and policy. The latent conditions so created are transmitted along various organizational and departmental pathways to the workplace (the operating theatre, the ward, etc.), where they create the local conditions that promote the commission of errors and violations (e.g. high workload or poor human-equipment interfaces). Many unsafe acts are likely to be committed, but very few of them will penetrate the defences to produce damaging outcomes. The fact that engineered safety features, such as alarms or standard procedures, can be deficient due to latent failures as well as active failures is shown in the figure by the arrow connecting organizational processes directly to defences.


The model presents the people at the sharp end as the inheritors rather than as the instigators of an accident sequence. Reason points out that this may simply seem as if the ‘blame’ for accidents has been shifted from the sharp end to the system managers. However, managers too are operating in a complex environment and the effects of their actions are not always apparent; they are no more, and no less, to blame than those at the sharp end of the clinical environment (Reason, 2001). Also, any high level decision, whether within a healthcare organization or made outside it by government or regulatory bodies, is a balance of risks and benefits. Sometimes, such decisions may be obviously flawed, but even prima facie reasonable decisions may later have unfortunate consequences.


As well as highlighting the difficulty of assessing the wisdom of strategic decisions, this perspective also extends the analysis of accidents beyond the boundaries of the organization itself to include the regulatory environment. In healthcare many external organizations, such as manufacturers, government agencies, professional and patient organizations, also impact on the safety of the patient. The model shown in Figure 8.2 relates primarily to a given institution, but the reality is considerably more complex, with the behaviour of other organizations impinging on the accident sequence at many different points.


Seven levels of safety


We have extended Reason’s model and adapted it for use in a healthcare setting, classifying the error producing conditions and organizational factors in a single broad framework of factors affecting clinical practice (Vincent, Taylor-Adams and Stanhope, 1998) (Table 8.1).


Table 8.1 Framework of contributory factors influencing clinical practice

Factor types and their contributory influencing factors:

Patient Factors: condition (complexity and seriousness); language and communication; personality and social factors.

Task and Technology Factors: task design and clarity of structure; availability and use of protocols; availability and accuracy of test results; decision-making aids.

Individual (Staff) Factors: knowledge and skills; competence; physical and mental health.

Team Factors: verbal communication; written communication; supervision and seeking help; team leadership.

Work Environment Factors: staffing levels and skills mix; workload and shift patterns; design, availability and maintenance of equipment; administrative and managerial support; physical environment.

Organizational and Management Factors: financial resources and constraints; organizational structure; policy, standards and goals; safety culture and priorities.

Institutional Context Factors: economic and regulatory context; National Health Service Executive; links with external organizations.

(Reproduced from British Medical Journal, Charles Vincent, Sally Taylor-Adams, Nicola Stanhope. “Framework for analysing risk and safety in clinical medicine”. 316, no. 7138, [1154–1157], 1998, with permission from BMJ Publishing Group Ltd.)


At the top of the framework are patient factors. In any clinical situation the patient’s condition will have the most direct influence on practice and outcome. Other patient factors such as personality, language and psychological problems may also be important as they can influence communication with staff. The design of the task, the availability and utility of protocols and test results may influence the care process and affect the quality of care. Individual factors include the knowledge, skills and experience of each member of staff, which will obviously affect their clinical practice. Each staff member is part of a team within the inpatient or community unit, and part of the wider organization of the hospital, primary care or mental health service. The way an individual practises, and their impact on the patient, is constrained and influenced by other members of the team and the way they communicate, support and supervise each other. The team is influenced in turn by management actions and by decisions made at a higher level in the organization. These include policies for the use of locum or agency staff, continuing education, training and supervision and the availability of equipment and supplies. The organization itself is affected by the institutional context, including financial constraints, external regulatory bodies and the broader economic and political climate.


The framework provides the conceptual basis for analysing clinical incidents, in that it includes both the clinical factors and the higher-level, organizational factors that may contribute to the final outcome. In doing so, it allows the whole range of possible influences to be considered and can therefore be used to guide the investigation and analysis of an incident. However, it has also been used to frame and guide broader inquiries and to inform the design of reporting systems such as the ICU-SRS described in Chapter 5. For instance, Bryony Dean and colleagues used this framework in an analysis of a series of 88 potentially serious prescribing errors (Dean et al., 2002). Interviews with the prescribers who made 44 of these errors provided a rich account of the factors contributing to these errors, which were analysed and classified using the seven-level framework, although in practice the influence of higher-level factors could not be identified directly (Box 8.2). Staff identified staffing and workload issues as fundamental, with lack of skills and knowledge and poor physical health also among the most important contributory factors.
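

As a rough illustration of how the framework can support incident analysis in practice, the sketch below encodes the seven levels as a simple lookup and tallies the contributory factors an investigator might tag. The factor names are abbreviated from Table 8.1 and the tagged incident factors are hypothetical; nothing here is taken from the Dean et al. study.

```python
# A minimal sketch (illustrative only) of the seven-level framework used as a
# classification scheme for contributory factors identified in an incident.

from collections import Counter

# Factor names abbreviated from Table 8.1.
FRAMEWORK = {
    "Patient": ["condition", "language and communication", "personality and social factors"],
    "Task and technology": ["task design", "protocols", "test results", "decision aids"],
    "Individual (staff)": ["knowledge and skills", "competence", "physical and mental health"],
    "Team": ["verbal communication", "written communication", "supervision", "leadership"],
    "Work environment": ["staffing and skill mix", "workload", "equipment",
                         "administrative support", "physical environment"],
    "Organization and management": ["resources", "structure", "policy and standards",
                                    "safety culture"],
    "Institutional context": ["economic and regulatory context", "external organizations"],
}


def classify(tagged_factors):
    """Count contributory factors per framework level, rejecting unknown tags."""
    counts = Counter()
    for level, factor in tagged_factors:
        if factor not in FRAMEWORK.get(level, []):
            raise ValueError(f"unrecognized factor: {level} / {factor}")
        counts[level] += 1
    return counts


# Hypothetical tags an investigator might assign to an incident.
print(classify([
    ("Individual (staff)", "knowledge and skills"),
    ("Team", "supervision"),
    ("Task and technology", "protocols"),
    ("Work environment", "equipment"),
]))
```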


The investigation and analysis of clinical incidents


A clinical scenario can be examined from a number of different perspectives, each of which may illuminate facets of the case. Cases have, from time immemorial, been used to educate and reflect on the nature of disease. They can also be used to illustrate the process of clinical decision making, the weighing of treatment options and sometimes, particularly when errors are discussed, the personal impact of incidents and mishaps. Incident analysis, for the purposes of improving the safety of healthcare, may encompass all of these perspectives but critically also includes reflection on the broader healthcare system.

