
Chapter 19 Therapeutic drug monitoring and chemical aspects of toxicology




Therapeutic drug monitoring


The questions that should be addressed when prescribing a drug are summarized in Figure 19.1. All patients treated with drugs should be monitored clinically to assess the efficacy of treatment and to detect any adverse effects; laboratory assessment may also be helpful for these purposes. Thus, it may be possible to measure a particular index of therapeutic response, for example the blood glucose concentration in a patient with diabetes treated with insulin, or thyroid function tests in a patient with thyrotoxicosis treated with carbimazole. In addition, the laboratory may be asked to monitor for possible toxic effects; for example, proteinuria in patients treated with penicillamine, or abnormalities of thyroid function in patients treated with the iodine-containing antiarrhythmic drug amiodarone.



An individual’s response to a particular drug is dependent on many factors. These can be divided into two categories: pharmacokinetic, relating to the effect of the body on the drug (e.g. transport and distribution, metabolism and excretion) and pharmacodynamic, relating to the action of the drug on the body (e.g. interaction with receptors, and the presence of agonists and antagonists). Age, sex, renal function and the concurrent administration of other drugs are of particular importance. These factors should be borne in mind when deciding what dose of drug to prescribe, but in many cases the optimum dosage can be arrived at by commencing treatment with a standard dose and modifying this as necessary in light of the observed response.


This approach is suitable for the many drugs whose effects can be assessed reliably, such as hypotensive agents, anticoagulants, insulin and oral hypoglycaemics, but it is not universally applicable. Obviously, optimization of drug dosage in this way is impossible when the effect of treatment is not easily ascertainable. An example is the use of anticonvulsants as prophylaxis in epilepsy: in many patients, seizures occur infrequently and unpredictably, making it difficult to assess the efficacy of the drug in preventing them. It is also difficult to adjust dosage on the basis of the therapeutic effect when a drug has a low therapeutic ratio (i.e. the dose required to produce a therapeutic effect is close to that at which features of toxicity are seen, as is the case, for example, with lithium), especially if the adverse effects are hard to recognize. In such cases, measurement of the concentration of the drug in the plasma may provide valuable objective information.


It is outside the scope of this chapter to discuss in detail the many factors that can influence the relationship between the dose of a drug and the intensity of its effects. Some of these are listed in Figure 19.2. It is reasonable to assume that the intensity of a drug’s effect will correlate more closely with its plasma concentration than with the dose that the patient takes. Despite this, plasma concentrations and tissue effects may correlate poorly, as the drug must first travel from the plasma to its site of action and, once there, the responsiveness of the tissues may not be constant or predictable. In addition, there may be no correlation at all when a drug is itself inactive (but is metabolized to an active substance in the body) or when it acts irreversibly.



Nevertheless, the correlation between the plasma concentration and pharmacological effect is surprisingly strong for many drugs and provides the rationale for the use of concentration measurement in therapeutic drug monitoring (TDM). It is important that any experimentally determined relationship between plasma drug concentration and the effect of a drug is confirmed in a clinical setting, and that plasma drug concentrations are interpreted in the particular clinical context. The time of sampling in relation to the time of dosage may be critical and the sensitivity of the target organ may vary, being influenced by individual pharmacokinetic and pharmacodynamic factors.


Even if there is good evidence that measuring the plasma concentration of a particular drug can provide useful information, in individual cases there should always be a rational reason for the request (i.e. a specific question should be asked, the answer to which will influence management); the right specimen (particularly with regard to timing) must be provided, and the analysis must be accurate and its result interpreted correctly. Finally, appropriate action should ensue.


In addition to individualizing drug therapy, measurements of plasma concentrations of drugs can be useful in the diagnosis of suspected toxicity and in the assessment of compliance (sometimes known as ‘concordance’ in an attempt to reflect a more equal relationship between doctor and patient).


Although TDM is based mainly on serum or plasma measurements, there has been some interest in developing assays using saliva. These should reflect the plasma concentration of the non-protein-bound drug (i.e. free drug) that is directly available to the tissues; the advantage of this technique is that venepuncture is not required, but there are technical problems with the assays and they are only in limited use. Salivary assays are unsuitable for drugs that are actively secreted into saliva (e.g. lithium) or are strongly ionized at physiological hydrogen ion concentration (pH) (e.g. valproate).



Measuring plasma concentration


The most frequently used assays measure the total plasma concentration of a drug. With drugs that are protein bound, changes in plasma protein concentration may have a disproportionate effect on the total drug concentration relative to the amount free in the plasma and thus available to tissues. The assay chosen must be specific for the drug itself (or its active metabolite where appropriate) and should not measure inactive metabolites or be affected by other drugs that the patient may be taking.


As with other biochemical measurements, plasma concentrations of drugs are compared with standard data. The term ‘reference range’ is inappropriate in this context, because healthy people will not be taking the drug. The term ‘therapeutic’ or ‘target’ range is used instead. This is the range between the minimum effective concentration of the drug and the maximum safe concentration. Often, only the upper limit is stated, as a drug may be efficacious in some individuals at concentrations below the generally accepted minimum effective concentration. On the other hand, optimum management may sometimes require that the concentration of a drug is maintained above the upper limit of the therapeutic/target range. Such ranges are not absolute: for example, hypokalaemia increases sensitivity to digoxin and effectively lowers the upper limit. The plasma concentrations of therapeutic drugs must always be considered in the context of clinical information: decisions should not be based on concentrations alone, unless these are in an unequivocally toxic range.


Readers should be aware that the concentrations of drugs (and toxins) in body fluids may be reported in either mass units (e.g. mg/L) or molar units (e.g. mmol/L). In the UK, the consensus view is that mass units should be used, except for a few substances that have always been reported in molar units (including iron, lithium, methotrexate and thyroxine) or are so reported because the units are enshrined in legislation (e.g. lead).
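Converting between the two conventions requires only the molecular mass of the drug. The short Python sketch below is purely illustrative: phenytoin (molecular mass approximately 252.3 g/mol) is used as the example, and the concentration chosen is arbitrary.

```python
# Illustrative conversion between mass and molar units for a drug concentration.
# Only the molecular mass of the drug is needed; phenytoin (~252.3 g/mol) is
# used as an example, and the concentration value is arbitrary.

PHENYTOIN_MOLAR_MASS_G_PER_MOL = 252.3

def mg_per_litre_to_umol_per_litre(conc_mg_per_l, molar_mass_g_per_mol):
    """Convert a concentration in mg/L to µmol/L."""
    return conc_mg_per_l / molar_mass_g_per_mol * 1000.0

# A phenytoin concentration of 15 mg/L corresponds to roughly 59 µmol/L.
print(mg_per_litre_to_umol_per_litre(15.0, PHENYTOIN_MOLAR_MASS_G_PER_MOL))
```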


When a drug is first taken, the plasma concentration rises relatively rapidly as it is absorbed, and then falls, more slowly, as it is taken up into tissues, metabolized and excreted. Many drugs are taken in doses and at intervals such that a steady-state plasma concentration is achieved. This occurs after a period equivalent to five half-lives, and is often the most relevant concentration to measure. For some drugs with short half-lives, significant fluctuations in plasma concentration occur and it is the peak or trough concentrations, achieved shortly after and immediately before the drug is taken, respectively, that are measured.
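The ‘five half-lives’ rule follows from simple first-order kinetics: with regular dosing, the plasma concentration approaches its steady-state value as 1 − (½)ⁿ after n elimination half-lives. The following sketch is a minimal illustration of this arithmetic, not a dosing tool.

```python
# Minimal illustration of accumulation towards steady state with first-order
# elimination: after n half-lives, the concentration has reached a fraction
# 1 - (1/2)**n of its final steady-state value.

for n_half_lives in range(1, 7):
    fraction = 1 - 0.5 ** n_half_lives
    print(f"after {n_half_lives} half-lives: {fraction:.1%} of steady state")

# After five half-lives the concentration is within about 3% of its final
# value, which is why steady state is conventionally assumed from that point.
```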


In the following section, the use of plasma measurements of a few representative and commonly used drugs is discussed to illustrate the general principles of therapeutic drug monitoring.



Monitoring of specific drugs



Phenytoin


The therapeutic effectiveness of this frequently prescribed anticonvulsant drug is difficult to assess without monitoring. It has a low therapeutic ratio and the signs of toxicity may mimic the neurological diseases that can be associated with epilepsy. Furthermore, phenytoin has unusual pharmacokinetic properties: the enzyme responsible for the elimination of the drug (hepatic CYP2C9) becomes saturated within the therapeutic range of plasma concentrations, giving rise to zero-order kinetics. This phenomenon has several important implications. In particular, the relationship between plasma concentration and dose is non-linear (Fig. 19.3); thus, small increments in dose may lead to disproportionate increases in steady-state plasma concentration. On the other hand, even if the dose is unchanged, a small decrease in drug-metabolizing enzyme activity, or the presence of other drugs that inhibit phenytoin metabolism, could transform a therapeutic plasma concentration into a toxic one. Figure 19.3 also indicates the wide variation in the doses required to achieve therapeutic plasma concentrations in different individuals.
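This non-linearity can be illustrated with the standard Michaelis–Menten steady-state relationship, Css = Km × R / (Vmax − R), where R is the daily dosing rate. The Python sketch below uses purely illustrative values of Vmax and Km for a hypothetical adult; it is not a prescribing aid.

```python
# Sketch of the saturable (Michaelis-Menten) dose-concentration relationship
# for phenytoin: Css = Km * R / (Vmax - R), where R is the daily dosing rate.
# Vmax and Km below are illustrative values for a hypothetical adult.

VMAX_MG_PER_DAY = 500.0   # maximum metabolic capacity (illustrative)
KM_MG_PER_L = 5.0         # concentration at half-maximal metabolism (illustrative)

def steady_state_concentration(dose_mg_per_day):
    """Predicted steady-state concentration (mg/L); valid only for doses below Vmax."""
    if dose_mg_per_day >= VMAX_MG_PER_DAY:
        raise ValueError("dosing rate exceeds metabolic capacity: no steady state")
    return KM_MG_PER_L * dose_mg_per_day / (VMAX_MG_PER_DAY - dose_mg_per_day)

for dose in (200, 300, 350, 400, 450):
    print(f"{dose} mg/day -> Css approximately {steady_state_concentration(dose):.1f} mg/L")

# With these illustrative parameters, increasing the dose from 300 to 400 mg/day
# (a 33% increase) raises the predicted Css from ~7.5 to ~20 mg/L, i.e. by >150%.
```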



The measurement of plasma phenytoin concentration is also useful if adverse effects occur, if there is an unexplained deterioration in the patient’s control, during intravenous therapy in status epilepticus and if a drug known to interact with phenytoin has to be prescribed. It is of particular value in children and during pregnancy, when dramatic fluctuations in plasma concentrations and in epileptic control may occur. As emphasized above, however, measurements should be interpreted in the light of clinical circumstances: some patients only achieve effective control of seizures at plasma concentrations greater than the upper limit of the target range, yet do not experience toxicity, while others, particularly older patients, may achieve good control at relatively low concentrations.



Case history 19.1


A young woman developed idiopathic epilepsy at the age of 19 and had three generalized convulsions in ten days before being started on phenytoin, 150 mg/day. She had a further fit two days after the first dose, but thereafter remained fit-free.




Comment


Phenytoin has a relatively long and unpredictable plasma half-life, and steady-state plasma concentrations may not be reached for 3–4 weeks. The upper limit of the therapeutic range is 20 mg/L. The usual procedure when commencing treatment is to give a standard dose of 150–200 mg/day (in adults) and to measure the plasma concentration after 3–4 weeks. If the patient is well controlled and there are no features of toxicity, the same dose may be continued even if, as in this case, the plasma concentration is low in the therapeutic range: a dose increment is not indicated on the basis of the plasma concentration alone if the patient is fit-free. In the well-controlled patient, this initial plasma concentration may be useful later to help ascertain the cause (e.g. poor compliance, drug interaction) should seizures recur.


If the patient is not well controlled, increments in dose can be made, guided by measurement of plasma concentrations, to produce a steady-state concentration in the therapeutic range. Because of its long half-life, plasma concentrations of phenytoin during chronic administration remain relatively constant throughout the day. For this reason (unusually in therapeutic drug monitoring), the time of sampling in relation to the time the drug is taken is not critical. However, it is essential to leave sufficient time after changing the dose to allow a new steady state to develop. This takes approximately five times the plasma half-life of the drug.



Other anticonvulsants


The value of measuring the plasma concentrations of some other anticonvulsant drugs is shown in Figure 19.4. Carbamazepine induces its own metabolism and interactions occur with other anticonvulsants. Monitoring (of trough concentrations) is valuable when carbamazepine is first prescribed, if seizure control is difficult to achieve and if other anticonvulsants are being used, but is complicated by the fact that the drug has active metabolites, which are not measured in the standard assay. The dosage of ethosuximide can often be adjusted on clinical grounds, as toxicity is easily recognizable when the drug is being used alone. The plasma concentration of lamotrigine reflects its effect and TDM is usually recommended, particularly when the drug is used with phenytoin or carbamazepine (which reduce its plasma half-life) or valproate (which prolongs it). With sodium valproate there is no clear safe maximum concentration, there is a poor correlation between plasma concentration and efficacy, and hepatotoxicity, which is in any case rare, cannot be predicted from the plasma concentration. There is a poor correlation between plasma concentrations of phenobarbital and either clinical or toxic effects, so routine monitoring is of little value (an exception is its use in children as prophylaxis for febrile convulsions). TDM of vigabatrin is unnecessary: plasma concentrations show little relationship with clinical effect, probably because the drug binds irreversibly to its target enzyme (γ-aminobutyric acid transaminase) in the brain. TDM for clonazepam, gabapentin, levetiracetam and oxcarbazepine is not required, nor is it for felbamate, although the toxicity of this drug requires that liver function and full blood count are monitored regularly.




Digoxin


Digoxin is frequently used in the management of cardiac failure with atrial fibrillation, a common problem in the elderly. Plasma digoxin measurements are valuable not only in the assessment of the appropriate dose to prescribe, but also in the diagnosis of digoxin toxicity and in assessing patient compliance. Failure to take a prescribed medication (non-compliance) is a common cause of failure to achieve a therapeutic response.


The therapeutic range for plasma digoxin concentration in heart failure is generally taken as 0.5–1.0 µg/L. The plasma concentration rises considerably after each dose, so a minimum of 6 h should elapse between taking the drug and drawing blood if the result is to reflect the mean steady-state concentration. In practice, it is often simplest, and satisfactory for clinical purposes, if a blood sample is taken shortly before a dose is due.


The therapeutic effect is minimal when the plasma concentration is below 0.5 µg/L; toxicity becomes more common at concentrations above 1.0 µg/L and is almost invariable if they exceed 3.0 µg/L. Nevertheless, there is in general a rather poor correlation between the plasma concentration of digoxin and its therapeutic effect.


This phenomenon is partly a result of the existence of various factors that alter either the therapeutic response to a given plasma concentration of digoxin or the plasma concentration achieved on a particular dose (Fig. 19.5). Hypokalaemia is a particular problem because many patients treated with digoxin are also receiving diuretics, which may cause this (see Case history 21.2). In addition, renal impairment may be a consequence of congestive cardiac failure; this is important because digoxin is mostly eliminated via the kidneys. It is thus very important to consider the clinical setting when assessing the significance of plasma digoxin concentrations. It is good practice always to measure the plasma potassium concentration when digoxin is measured.
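The effect of renal impairment can be appreciated from the general relationship for the average steady-state concentration, Css = F × dose / (CL × τ): at an unchanged dose, halving total clearance doubles the steady-state concentration. The sketch below uses entirely illustrative figures for dose, bioavailability and clearance in a hypothetical patient; it is not a dosing calculator.

```python
# Illustration of why reduced (largely renal) clearance raises the steady-state
# digoxin concentration at an unchanged dose: Css = F * dose / (CL * tau).
# All numbers are illustrative, not values for clinical use.

def average_steady_state_ug_per_l(dose_ug, interval_h, bioavailability, clearance_l_per_h):
    """Average steady-state concentration (µg/L) for regular oral dosing."""
    return bioavailability * dose_ug / (clearance_l_per_h * interval_h)

normal = average_steady_state_ug_per_l(125, 24, 0.7, 6.0)     # ~0.6 µg/L
impaired = average_steady_state_ug_per_l(125, 24, 0.7, 3.0)   # ~1.2 µg/L
print(f"normal clearance: ~{normal:.2f} µg/L")
print(f"halved clearance: ~{impaired:.2f} µg/L (doubled at the same dose)")
```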



Digoxin concentrations are also useful in the diagnosis of digoxin toxicity. This is important because some of the features of toxicity are relatively non-specific (e.g. nausea and vomiting), while others include dysrhythmias that could possibly be a complication of the underlying heart disease. It is important that the possible influence of pathological and physiological factors is considered (see Fig. 19.5).


If a patient taking digoxin is symptom-free yet has a plasma concentration of less than 0.5 µg/L, it is likely that the drug is not required and it may be withdrawn, albeit under supervision.


Endogenous substances that bind to the antibodies used in digoxin immunoassays (digoxin-like immunoreactive substances, DLIS) can occasionally cause spurious apparent elevations in digoxin concentrations. Such interference should be suspected if an unexpectedly high concentration is found; the measurement should then be repeated using a different method (because the extent to which antibodies supplied by different manufacturers react with DLIS varies).
