Therapeutic Drugs and Their Management

Chapter 34





Physicians have long recognized the limitations of empirical drug dosing, such as standard or fixed dose regimens, and have responded with their clinical judgment and knowledge of basic pharmacology to individualize each patient’s drug dosage. Approximately 40 years ago, quantification of drugs in blood or serum, known as therapeutic drug monitoring, became a standard of practice in cardiology, infectious diseases, neurology, and psychiatry, and more recently in transplantation, to facilitate dose adjustments to attain optimal drug response. Therapeutic drug monitoring offered the physician a scientific rather than empirical approach to selecting a drug regimen to optimize therapy. Now known as therapeutic drug management (TDM), this multidisciplinary clinical activity facilitates selection of the drug to which the patient responds best, as well as the optimal dose, and allows assessment of therapeutic compliance and efficacy. It also facilitates detection of drug-drug interactions and is the basis for defining drug-induced toxicity. Laboratory testing to support TDM may include (1) detection of risk factors (e.g., pharmacogenomics; see Chapter 43) that qualify or disqualify a person for a particular therapy, based on the likelihood of predictable pharmacokinetics, toxicity, and response; and (2) quantification of drug and/or drug metabolite concentrations in a biological fluid to assess pharmacokinetics or biomarkers indicative of response. The medical professionals involved in TDM include the ordering physician, the clinical laboratorian (the chemical pathologist, the clinical chemist), the clinical pharmacologist, the pharmacist, and the nurse who handles medication delivery and monitoring.


Once a therapeutic regimen has been selected and initiated, the practice of TDM facilitates optimum therapy by providing the prescribing physician with objective information about drug disposition; TDM describes a patient’s pharmacokinetic status at the moment of specimen collection. Pharmacokinetics is the science that describes the relationship between drug dose and the time course of drug absorption, distribution, and elimination that results in a specific drug concentration in biological systems. Clinical pharmacokinetics involves the application of mathematical relationships to predict whether a drug concentration quantified at a specific time in an individual patient reflects the distribution and metabolism typical of a defined population. Pharmacokinetics that deviates from that typical for a specific population may indicate genetic variants, drug-drug or drug-food interaction, organ failure, or patient noncompliance. Clinical pharmacokinetics can also be applied to predict appropriate change in dose or dosing interval to allow for safe and effective treatment and to attain optimal response to the drug as quickly as possible. It is important therefore for the clinician and the laboratorian to understand how to quantify drugs in biological specimens and how these results are used to achieve effective drug therapy. As mentioned earlier, TDM is a multidisciplinary approach that relies on the cooperative efforts of the physician, nurse, pharmacologist, pharmacist, and clinical laboratorian.
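The population-versus-patient comparison described here ultimately rests on simple first-order arithmetic. As a minimal sketch (hypothetical concentrations and times, assuming a one-compartment drug in the postdistributive phase), two timed serum measurements are enough to estimate a patient's elimination rate constant and half-life and to project a later concentration:

```python
import math

def elimination_constant(c1, c2, dt_h):
    """First-order elimination rate constant (1/h) estimated from two
    postdistributive concentrations (mg/L) measured dt_h hours apart."""
    return math.log(c1 / c2) / dt_h

def half_life(k):
    """Half-life (h) of a first-order elimination process."""
    return math.log(2) / k

def concentration_at(c0, k, t_h):
    """Predicted concentration t_h hours after a measured value c0."""
    return c0 * math.exp(-k * t_h)

# Illustrative values: 12 mg/L falling to 6 mg/L over 8 hours
k = elimination_constant(12.0, 6.0, 8.0)
print(round(half_life(k), 1))                    # 8.0 h in this example
print(round(concentration_at(6.0, k, 8.0), 1))   # 3.0 mg/L one further half-life later
```

If a half-life estimated this way differs markedly from the population value, the causes listed above (genetic variants, interactions, organ failure, noncompliance) become candidates for investigation.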


Knowledge of the impact of genetics on drug disposition developed rapidly in the late 1990s and continues to develop in the 2000s. This knowledge field as it relates to drug disposition has become known as pharmacogenomics (PG). TDM and PG are highly interactive disciplines used in conjunction to elucidate the overall pharmacokinetic status of an individual patient. Although the basic concepts of PG are outlined elsewhere in this text (see Chapter 43), the specific aspects of the discipline that relate to the interpretation of TDM results are explained in this chapter. Reviews by O’Kane142 and Weinshilboum208 and the Internet Website offered by Flockhart63 are good sources of additional information.


To be effective, TDM requires the acquisition of a valid specimen followed by timely determination of the drug concentration in the specimen and interpretation of results in the context of dose, time of last dose, and other drugs present. Results should be reported or collated with the dosing schedule so that they may be interpreted in a pharmacokinetic context.


This chapter focuses on the role of the laboratory in the discipline of drug monitoring. Excellent descriptions of the roles of the physician and the consulting pharmacologist are presented in Melmon and Morrelli’s Basic Principles in Therapeutics,129 Goodman and Gilman’s The Pharmacological Basis of Therapeutics,30 Burton and colleagues’ Applied Pharmacokinetics and Pharmacodynamics: Principles of Therapeutic Drug Monitoring,34 and Mandell and colleagues’ Principles and Practice of Infectious Diseases.122 The Physicians’ Desk Reference (PDR), published annually by Medical Economics of Montvale, New Jersey, is also an excellent source of dosing guidance and pharmacokinetic information.



Definitions


Pharmacology comprises that body of knowledge surrounding chemical agents and their effects on living processes. This is a broad field that has traditionally been confined to drugs useful in the prevention, diagnosis, and treatment of disease. Pharmacotherapeutics is that part of pharmacology concerned primarily with the application or administration of drugs to patients for the purpose of prevention and treatment of disease. For this aspect of medical practice to be effective, the pharmacodynamic and pharmacokinetic properties of drugs should be understood.


Pharmacodynamics describes response to drugs (what the drug does to the body) and encompasses the processes of interaction of pharmacologically active substances with target sites, and the biochemical and physiologic consequences that lead to therapeutic or adverse effects.54 For many drugs, the ultimate effect or mechanism of action at the molecular level is understood poorly, if at all. However, effects at the cellular or organ system level or in the whole body are relatively well understood and usually can be related to the dose of the drug.


Pharmacokinetics describes how drugs are received and handled by the body (what the body does to the drug) and includes the processes of uptake of drugs by the body, the biotransformations they undergo, the distribution of the drugs and their metabolites in tissue, and the elimination of the drugs and their metabolites from the body. Clinical pharmacokinetics is the discipline that applies the principles of pharmacokinetics to safe and effective therapeutic management of an individual patient. It is this aspect of pharmacology that most strongly influences the interpretation of TDM results and that is dealt with in greater detail in this chapter.


Note that the term pharmacology relates to broad knowledge of the systemic effects of a drug; pharmacodynamics refers to the interaction of a drug at its site of action, whereas pharmacokinetics is a mathematical description of drug disposition. These terms are quite different and should not be used interchangeably.


Figure 34-1 illustrates the conceptual relationship between pharmacodynamics and pharmacokinetics. The former relates drug concentration at the site of action to the observed magnitude of the effect (desirable or undesirable). Pharmacokinetics, on the other hand, relates dose, dosing interval, and route of administration (regimen) to drug concentration in the blood over time. For more complete discussions of these basic concepts, the reader is encouraged to review standard textbooks of pharmacology.30,129,167 Toxicology is the subdiscipline of pharmacology concerned with adverse effects of chemicals on living systems. Toxic effects and mechanisms of action may be different from therapeutic effects and mechanisms for the same drug. Similarly, at the high dose of drugs at which toxic effects may be produced, rate processes are frequently altered compared with those at therapeutic doses. For these reasons, the terms toxicodynamics and toxicokinetics are now applied to these special situations.




Basic Concepts


The pharmacologic effect of a drug is elicited by direct interaction of the drug with a receptor controlling a specific function or by drug-mediated alteration of the physiologic process regulating the function; this is known as the mechanism of action. In a given tissue, the site at which a drug acts to initiate events leading to a specific biological effect is called the site of action of the drug. For most drugs, the intensity and duration of the observed pharmacologic effect are proportional to the concentration of the drug at the receptor, predicted by pharmacokinetics.



Mechanism of Action


The mechanism of action of a drug is the biochemical or physical process that occurs at the site of action to produce the pharmacologic effect. Drug action is usually mediated through a receptor. Cellular enzymes and structural or transport proteins are important examples of drug receptors. Nonprotein macromolecules may also bind drugs, resulting in altered cellular functions controlled by membrane permeability or DNA transcription. Some drugs are chemically similar to important natural endogenous substances and may compete for binding sites. Other drugs may block formation, release, uptake, or transport of essential substances. And some may produce an effect by interacting with relatively small molecules to form complexes that actively bind to receptors. These and other examples of receptor binding are more completely discussed in pharmacology texts.30,129,167


Although the exact molecular interactions that describe the mechanism of action remain obscure for many drugs, theoretical models have been developed to explain them. One concept postulates that a drug binds to intracellular macromolecular receptors through ionic and hydrogen bonds and van der Waals forces. This theoretical model further postulates that if the drug-receptor complex is sufficiently stable and is able to modify the target system, an observable pharmacologic response will occur. As Figure 34-2 illustrates, the response is dose dependent until a maximum effect is reached. The plateau may be due to saturation at the receptor or overload of a transport or clearance process.



The utility of monitoring drug concentration is based on the premise that pharmacologic response correlates with the concentration of the drug at the site of action (receptor). Measurement of the concentration at the receptor site in a patient is technically impractical, if not impossible, thus surrogate measures must be used. Because of individual variation in pharmacokinetics, the administered dose is often a poor predictor of the concentration at the receptor and the pharmacodynamic response. However, studies have shown that for many drugs, a strong correlation exists between the serum drug concentration and the observed pharmacologic effect. It is recognized that pharmacodynamic effects may vary between individuals, despite similar serum drug concentrations. For this reason, use of appropriate biomarkers may complement TDM and improve prediction of individual responses to drug therapy; although clinically validated biomarkers have not yet been identified for many therapeutic areas, this is an area of continued research interest.


Years of relating blood concentrations to drug effects have demonstrated the clinical utility of drug concentration information. Nevertheless one must always keep in mind that a serum drug concentration does not necessarily equal the concentration at the receptor; it merely reflects it. However, for pharmacokinetic studies, it is assumed that changes in drug concentration in blood (or serum) versus time are proportional to changes in local concentrations at the receptor site or in body tissue. This assumption is sometimes called the property of kinetic homogeneity and is applicable to all pharmacokinetic models in postabsorptive and postdistributive phases of the time course. Figure 34-3 illustrates that property for a hypothetical compound. Parallel concentrations (log C) are expected in blood, at the receptor, and in tissue as time passes. Concepts depicted in Figure 34-3 are hypothetical; the absolute concentration of a drug in various tissues is highly variable from drug to drug.



The property of kinetic homogeneity is an important assumption in TDM because it is the basis on which all therapeutic and toxic concentration reference values are established. Measurable concentration ranges collectively define a therapeutic range (Figure 34-4) that spans the interval between the minimum effective concentration (MEC) and the minimum toxic concentration (MTC). In the optimal dosing cycle, the trough blood concentration (the lowest concentration achieved just before the next dose) should not fall below the MEC, and the peak blood concentration (the highest concentration achieved within the dosing cycle) should not rise higher than the MTC. This is usually achieved by administering the drug once every half-life, denoted by τ in Figure 34-4. Multiple dosing regimens should achieve steady-state serum drug concentrations consistently greater than the MEC and less than the MTC within the therapeutic range. Steady state is reached when the rate of drug administration equals the rate of elimination, so that the amount of drug in the body remains constant from one dosing cycle to the next. Blood concentrations greater than the MTC put patients at risk for toxicity; concentrations less than the MEC put them at risk for the disorder that the drug is supposed to treat. MTC and MEC are useful guidelines in therapy; this concept is incorporated into tables presented later in this chapter summarizing specific drug data. Doses must be planned to achieve therapeutic concentrations, and these must be monitored to guide dose adjustment if necessary. The smaller the difference between MEC and MTC, the smaller the therapeutic index and the more likely TDM will be necessary. The key concept to remember is that MEC and MTC define the therapeutic range for most drugs. In contrast to the concept of reference intervals in clinical chemistry, no protocol has been generally accepted for establishing the therapeutic range of a drug.
For some therapeutic agents, the onset of toxicity may occur before maximal clinical response; for others, there may exist a threshold above which no further clinical improvement is seen, but which is not associated with adverse effects. The therapeutic range, therefore, represents the range of drug concentrations within which the probability of the desired clinical response is relatively high, and the probability of unacceptable toxicity or failure to achieve further clinical benefit is relatively low.
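The dosing-cycle behavior described above can be sketched by superposing single-dose decay curves. In the following sketch (a hypothetical IV bolus drug with an 8-hour half-life and first-order elimination; all numbers illustrative), giving the drug once every half-life produces steady-state concentrations oscillating between a predictable peak and trough:

```python
import math

def multidose_curve(dose_conc, k, tau, n_doses, dt=0.5):
    """Superpose first-order decay curves for repeated IV bolus doses.
    dose_conc: concentration increment per dose (mg/L), k: elimination
    rate constant (1/h), tau: dosing interval (h). Returns sampled times
    (h) and total concentrations (mg/L)."""
    t_end = tau * n_doses
    times = [i * dt for i in range(int(t_end / dt) + 1)]
    curve = []
    for t in times:
        c = sum(dose_conc * math.exp(-k * (t - j * tau))
                for j in range(n_doses) if t >= j * tau)
        curve.append(c)
    return times, curve

# Hypothetical drug: half-life 8 h, dosed every half-life for 10 doses
k = math.log(2) / 8.0
times, conc = multidose_curve(dose_conc=5.0, k=k, tau=8.0, n_doses=10)

# Peak and trough over the final dosing interval (last 8 h sampled)
print(round(max(conc[-17:]), 1), round(min(conc[-17:]), 1))  # → 10.0 5.0
```

Dosing every half-life yields a steady-state peak of about twice the single-dose increment and a trough of about one increment; choosing the increment so that these fall between the MEC and the MTC keeps the whole cycle within the therapeutic range.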



Antibiotic administration and management deviate from the principles outlined in the previous paragraph. Antibiotics typically are dosed to achieve a peak blood concentration that exceeds the minimal inhibitory concentration (MIC) sufficient to kill the infecting organism but remains below the MTC. Antibiotics are administered at intervals much longer than the half-life, allowing the antibiotic concentration to decay and the host to recover; the ideal trough blood concentration of many antibiotics is nondetectable. Also, because they do not accumulate, antibiotics do not achieve steady state.



Pharmacokinetics



Drug Disposition


Pharmacokinetics is the mathematical description of the physiologic disposition of xenobiotics (drugs, poisons, etc.) or endogenous chemicals. The key processes involved in drug disposition include liberation, absorption, distribution, metabolism, and excretion, commonly referred to by the acronym LADME. These processes are affected by several factors specific to the individual receiving the drug, including disease state, comedication, and demographic elements such as age, weight, and gender (Box 34-1). Such factors contribute to interindividual and intraindividual variability in both drug concentration and pharmacologic response, as summarized in Figure 34-5. The processes of drug absorption, distribution, metabolism, and excretion are discussed in the following sections.



BOX 34-1   Factors That Influence Drug Disposition in Humans






Chemical and Environmental Factors Influencing:








Liberation and Absorption


The simplest and most direct route of administering a drug for systemic therapy is intravenous delivery, as infusion into the bloodstream places the complete dose of a compound into immediate circulation. The question of how much of a given dose reaches the patient is therefore essentially bypassed with intravenous administration. However, for reasons of practicality and patient preference, drugs are frequently delivered by alternate means such as oral, intramuscular, transdermal, or sublingual routes; the most common of these is oral administration.


Oral dosing differs from intravenous in that the drug is required to pass from the gastrointestinal tract into the vascular system through a process known as absorption. To be absorbed, a compound must dissociate from its dosing formulation into digestive fluids (i.e., the process of liberation), then cross both gastrointestinal and vascular biological membranes by passive diffusion or, less commonly, by active transport. The ability to negotiate these steps determines the rate and extent of drug absorption and is affected greatly by the nature of the drug itself (e.g., solubility, pKa), the formulation matrix (e.g., immediate- or sustained-release), and the physiologic environment (e.g., pH, gastrointestinal motility).


Most drugs are weak acids or bases that are able to assume ionized or nonionized forms depending on the surrounding pH. The pKa, or ionization constant, of a compound is the pH at which its ionized and nonionized forms are present in equal amounts. Passive diffusion across lipid membranes requires the drug to be nonionized; thus absorption will occur most readily at a pH where the nonionized form is favored (i.e., below the pKa of an acidic drug or above the pKa of a basic drug). For this reason, alterations in gastrointestinal pH (e.g., antacid use) can affect the ability of a compound to enter the circulation. Likewise, use of absorptive resins (e.g., cholestyramine) or medications that influence gastrointestinal motility (e.g., opiates) can change the extent or rate of drug absorption, as can diseases that adversely affect gastrointestinal function.
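The ionization behavior described above follows the Henderson-Hasselbalch relationship. The sketch below (aspirin's pKa of roughly 3.5 is used as an illustrative value) computes the nonionized, membrane-permeant fraction of an acidic or basic drug at a given pH:

```python
def nonionized_fraction(pH, pKa, acidic=True):
    """Fraction of drug in the nonionized (membrane-permeant) form,
    from the Henderson-Hasselbalch relationship."""
    if acidic:   # HA <-> H+ + A-; nonionized form favored below the pKa
        return 1.0 / (1.0 + 10 ** (pH - pKa))
    else:        # BH+ <-> B + H+; nonionized form favored above the pKa
        return 1.0 / (1.0 + 10 ** (pKa - pH))

# Aspirin (acidic, pKa ~3.5): mostly nonionized in gastric fluid (pH ~1.5),
# almost fully ionized in plasma (pH 7.4)
print(round(nonionized_fraction(1.5, 3.5, acidic=True), 2))   # → 0.99
print(round(nonionized_fraction(7.4, 3.5, acidic=True), 4))   # → 0.0001
```

The two-orders-of-magnitude swing per 2 pH units is why antacids and other agents that shift gastrointestinal pH can materially change absorption.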


The rate of absorption is also an important consideration for oral administration. Absorption of a drug generally occurs much more rapidly than its elimination; however, the oral formulation of many agents can be manipulated to produce sustained-release products. These prolong drug absorption, generally with the intent of allowing less frequent dosing or of lessening the variability in plasma drug concentrations between doses.


The amount of drug absorbed relative to the quantity given is referred to as its bioavailability (f). This is calculated as the ratio of drug exposure after equivalent doses of oral and intravenous forms, where exposure is measured as the area under the curve (AUC) of plasma drug concentration over time:


f = AUC(oral) / AUC(IV)    (1)


Thus, the better a drug is absorbed, the more its exposure (AUC) after oral dosing resembles exposure after intravenous administration, up to a maximum of 100% bioavailability or identical exposure for the two formulations. To be useful as an oral agent, a compound must be absorbed rapidly and extensively enough to provide therapeutically effective concentrations. This typically corresponds to bioavailability greater than 50%, although exceptions to this general rule are certainly known. In some cases, poor bioavailability is advantageous, as with medications (e.g., antibiotics) whose site of action is the gastrointestinal lumen; in this situation, lack of absorption would prevent systemic exposure while still permitting effective therapy.
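In practice, the AUC values in the bioavailability ratio are estimated from discrete concentration-time points, commonly by the trapezoidal rule. A minimal sketch with hypothetical equal-dose oral and intravenous profiles:

```python
def auc_trapezoid(times, concs):
    """Area under the concentration-time curve by the trapezoidal rule."""
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for (t1, c1), (t2, c2) in zip(zip(times, concs),
                                             zip(times[1:], concs[1:])))

# Hypothetical equal-dose profiles (mg/L sampled at hours 0, 1, 2, 4, 8)
times   = [0, 1, 2, 4, 8]
iv_conc = [10.0, 8.0, 6.4, 4.1, 1.7]
po_conc = [0.0, 4.0, 4.8, 3.1, 1.3]

f = auc_trapezoid(times, po_conc) / auc_trapezoid(times, iv_conc)
print(round(f, 2))  # fraction of the oral dose reaching the circulation, ~0.6 here
```

With these invented values the oral AUC is about 60% of the intravenous AUC, a bioavailability that would generally be workable for oral therapy under the rule of thumb above.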


In addition to the steps required for absorption, the bioavailability of a compound can be affected by first-pass metabolism, which reflects the activity of metabolic enzymes in the intestine and liver. After absorption from the gastrointestinal lumen, drugs can be metabolized in intestinal cells before reaching the bloodstream; furthermore, drugs absorbed from the small intestine are transported via the portal vein directly to the liver, where they are exposed to hepatic metabolic enzymes. Thus, first-pass metabolism in the intestine and liver after absorption can preclude entry into the systemic circulation. Similarly, gastrointestinal transporters such as p-glycoprotein can expel an absorbed drug before it is able to reach the bloodstream. For these reasons, some drugs that are well absorbed nevertheless have low bioavailability.



Distribution


Once in the bloodstream, drugs undergo a process termed distribution. This is the spread of a compound from its point of entry (e.g., digestive tract, infusion catheter) throughout the systemic circulation and into various tissues. Some drugs remain primarily in the blood plasma (e.g., ibuprofen, warfarin); others localize extensively to tissue (e.g., amiodarone, chloroquine). The distribution of a drug to a particular site in the body depends on numerous factors, including drug size, degree of ionization, lipid solubility, extent of protein binding, body composition, and perfusion of the tissue in which the drug accumulates. In general, drugs that distribute extensively tend to be lipophilic, as this facilitates passage through cell membranes; widely distributed compounds often show relatively slow clearance because of the need to remove drug stored in tissue. Alteration of parameters related to distribution can affect the disposition of a drug. For example, rapid weight loss in an acutely ill patient may release drugs previously distributed to adipocytes, leading to elevated serum concentrations and possible toxicity.


Many drugs bind to one or more plasma proteins, most notably albumin, globulins such as α1-acid glycoprotein (AAG), and lipoproteins. In general, acidic drugs associate primarily with albumin, whereas basic drugs preferentially bind globulins and lipoproteins. An equilibrium exists between the amount of drug that is protein-bound and the amount free, that is, not bound to protein; disturbances in serum proteins related to pathologic (e.g., stress response, malnutrition) and physiologic (e.g., pregnancy, aging) settings can shift the balance of this equilibrium. Free drug is more readily accessible to cell membranes, drug receptors, and elimination mechanisms; thus the free fraction is considered the active component of the drug responsible for its biological effects. Changes in equilibrium between free and bound drug can greatly affect the physiologic response to that compound. Serum free drug concentrations have been estimated using ultrafiltration or ultracentrifugation techniques; measurement in oral fluid (saliva) has been proposed as an alternative to ultrafiltration but is unacceptable for most compounds.47


The fact that many drugs and endogenous molecules (e.g., fatty acids) bind to albumin and other serum proteins creates the potential for one compound to be displaced by another as they compete for limited binding sites. Factors determining whether this displacement occurs include the relative affinity for the binding protein and the concentrations of the compounds involved. For example, elevations in fatty acids can displace weakly bound drugs without affecting strongly bound drugs. Similarly, the antiepileptic agents valproic acid and phenytoin compete for the same binding sites on albumin; the higher concentration of valproic acid allows it to displace phenytoin, increasing the free fraction of the latter. Finally, some agents (e.g., valproic acid) can saturate all available protein-binding sites at therapeutic concentrations, leading to rapid elevations in free drug concentrations if the dose exceeds the point of saturation. It is important to recognize in such situations that the total drug concentration may remain unchanged, even in the setting of clinically significant elevations in the free fraction.


Physiologic states or diseases that alter serum composition (e.g., pH, electrolyte balance) can also affect the equilibrium of free and bound drugs. Thus in many situations, patients may experience adverse effects, even severe toxicity, as a direct consequence of increased free drug concentrations. For highly protein-bound drugs (typically more than 60 to 70% bound), clinically significant changes in the free fraction can go unnoticed if only the total (i.e., protein-bound plus free) concentration is monitored; the total amount of drug in serum may not change, or may even decrease, in situations that significantly elevate free concentrations. For example, in healthy individuals, the total concentration of phenytoin comprises (on average) 90% protein-bound and 10% free drug; a total plasma phenytoin of 15 mg/L in a healthy person would correspond to a free phenytoin concentration of approximately 1.5 mg/L. Uric acid competes with phenytoin for protein binding; thus uremic patients can have free fractions of 20 to 30% of the total phenytoin concentration. In other words, in uremia, the same total phenytoin (15 mg/L) could correspond to a free concentration of 4.5 mg/L, a potentially toxic concentration. Measurement of free drug concentrations is required to manage such situations.
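The phenytoin arithmetic above reduces to multiplying the total concentration by the free fraction, as this small sketch (using the values from the example in the text) shows:

```python
def free_concentration(total_mg_l, free_fraction):
    """Free (unbound) drug concentration from the total concentration
    and the free fraction."""
    return total_mg_l * free_fraction

# Total plasma phenytoin 15 mg/L in both cases:
print(round(free_concentration(15.0, 0.10), 1))  # healthy, ~10% free → 1.5 mg/L
print(round(free_concentration(15.0, 0.30), 1))  # uremic, up to ~30% free → 4.5 mg/L
```

The identical total concentration masks a threefold difference in the pharmacologically active free drug, which is precisely why free drug measurement is required in such settings.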


Situations involving alterations in protein concentration can also affect the equilibrium between free and protein-bound drug. Acute stress response is one such setting, especially for basic drugs bound to globulins. AAG is a stress response protein, so its concentrations increase notably after physiologic insult; the rise in circulating AAG may necessitate increases in drug dosage to account for the shift in equilibrium toward the protein-bound state. In contrast, hypoalbuminemia is common in pregnancy and in the elderly. Management of these conditions requires careful attention to clinical presentation and, if available, free drug measurements. Elderly patients in particular may manifest atypical signs of toxicity, notably cognitive changes such as confusion; thus analysis of free drug concentrations may be especially helpful in their care.



Metabolism


Metabolism is the process by which the body alters the chemical structure of a compound, whether endogenous or exogenous. In the context of drug therapy, metabolism is typically thought to enhance excretion of xenobiotics, most commonly by increasing water solubility. It is important to note that this does not necessarily coincide with deactivation or detoxification of the drug. Acetaminophen hepatotoxicity, for example, is the result of a minor metabolite (N-acetyl-p-benzoquinone imine) rather than the parent compound. Many drug metabolites are themselves active; an excellent example of this is seen with tamoxifen, a selective estrogen receptor modulator used in breast cancer therapy (Figure 34-6). Not only is tamoxifen active, but three of its metabolites display equal (N-desmethyltamoxifen) or greater (4-hydroxytamoxifen and endoxifen) anticancer activity compared with the parent drug.71 Some therapeutics [e.g., acetylsalicylate (aspirin), codeine, tamoxifen] are delivered as inactive or low-activity compounds, called prodrugs, which require metabolism by the body to exert the desired physiologic effect. Active metabolites must be considered when the clinical effect of a medication is assessed.



Most drug metabolism in humans is the result of enzymatic activity; metabolic enzymes are expressed ubiquitously in tissues and blood components, but by far the greatest abundance is found in hepatocytes. Hepatic metabolism varies with age: in neonates and very young infants (<1 year), the liver is immature and metabolic activity is slow. The metabolic rate accelerates as a child ages, reaching a peak around puberty and declining thereafter. Age-specific differences in dosing are often necessary to accommodate this variability in hepatic metabolism. In contrast, the current understanding of extrahepatic metabolism is poor, although undeniably important for certain settings (e.g., intestinal modification of ingested agents, lung detoxification of inhaled compounds).146 Tissue-specific metabolism may also play a role in interindividual differences in response to drug therapy.67


Metabolism can be described using similar mathematical models (e.g., Michaelis-Menten kinetics) to those applied to other enzymatic processes; readers are referred to Chapter 15 for a discussion of enzyme kinetics, although a brief description of first-order and zero-order processes is appropriate here. Most drugs exhibit first-order metabolism, that is, the rate of their metabolism is proportional to the drug concentration. This occurs when the available metabolic capacity exceeds the amount of drug present; thus the rate of biotransformation primarily depends on how rapidly drug molecules associate with enzyme active sites. Compounds displaying first-order metabolism show a log-linear association of concentration versus time, meaning that a given fraction of drug is metabolized per unit time. This forms the basis for a half-life (i.e., the time required to remove 50% of the drug present), as will be discussed in the following sections.


Several agents (e.g., ethanol, salicylate, phenytoin, theophylline) do not follow first-order kinetics. Physiologically relevant concentrations of these drugs approach or exceed normal metabolic capacity; thus the availability of enzyme to bind substrate becomes the rate-limiting factor. This situation, where the rate of metabolism is independent of drug concentration, is termed zero-order or nonlinear kinetics (Figure 34-7). The most familiar example of zero-order drug kinetics is the oft quoted clearance estimate for alcoholic beverages: roughly one drink is eliminated per hour, regardless of the number of drinks consumed. Thus, in contrast to first-order metabolism, which removes a set fraction of drug per unit time, zero-order kinetics removes a fixed amount of drug per unit time. Although comparatively few drugs display nonlinear behavior at therapeutic concentrations, many will convert to zero-order kinetics in overdoses where available metabolic capacity becomes overwhelmed. An important factor to consider with zero-order behavior is that small dose increases create disproportionately large elevations in serum concentrations because of lack of excess metabolic capacity to accommodate additional drug entering the system.
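The contrast between the two kinetic orders can be made concrete in a few lines (a hypothetical drug with a 4-hour half-life for the first-order case; the zero-order rate echoes the one-drink-per-hour example, in arbitrary units):

```python
import math

def first_order(c0, k, t):
    """Concentration after time t with first-order elimination:
    a fixed *fraction* of the drug present is removed per unit time."""
    return c0 * math.exp(-k * t)

def zero_order(c0, rate, t):
    """Concentration after time t with zero-order elimination:
    a fixed *amount* is removed per unit time (floored at zero)."""
    return max(c0 - rate * t, 0.0)

# First-order, half-life 4 h: each 4-h interval removes half of what remains
k = math.log(2) / 4.0
print([round(first_order(100.0, k, t), 1) for t in (0, 4, 8, 12)])
# → [100.0, 50.0, 25.0, 12.5]

# Zero-order: a constant 10 units/h removed regardless of concentration
print([zero_order(100.0, 10.0, t) for t in (0, 4, 8, 12)])
# → [100.0, 60.0, 20.0, 0.0]
```

The first-order curve is log-linear (a true half-life exists); the zero-order curve is a straight line, so doubling the dose roughly doubles the time to clear it, which is why small dose increases can produce disproportionate accumulation.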



Metabolic processes can be generalized into two major categories: phase I and phase II metabolism. Phase I consists of chemical modifications such as oxidation, reduction, hydrolysis, or removal of a nonpolar group (e.g., demethylation); phase II processes involve conjugation of the xenobiotic to a water-soluble moiety such as glucuronic acid, sulfate, or glutathione. The names phase I and phase II indicate a convenient grouping system rather than the order in which steps occur; although some compounds do undergo phase I metabolism followed by phase II, others undergo phase II first, while still others proceed through only one type of metabolic reaction. Most compounds have several possible metabolic pathways that require both phase I and phase II reactions. So-called phase III metabolism refers to the activity of transporters such as P-glycoprotein, which are key regulators of drug activity and metabolism but do not themselves alter chemical structure.


The most important enzymes in phase I metabolism are the cytochrome P450 (CYP) family, with just a few CYP isoenzymes accounting for biotransformation of the vast majority of current pharmaceuticals. Those isoenzymes (CYP2D6, CYP3A4/5, and CYP2C9/10) account for less than half the mass of CYP proteins expressed in the liver176; thus there is ample opportunity for substrate competition for enzyme binding sites. Coadministration of drugs or “herbal” products that are metabolized by the same CYPs creates the potential for exceeding available enzymatic capacity, resulting in decreased metabolism of all substrates of that enzyme, whether exogenous or endogenous. Such drug-drug interactions can often be managed by adjusting the dose of one or both compounds, so long as the physician is aware of the interaction.


Drug-metabolizing enzymes are subject to a great deal of interindividual variability, both at the level of genetic polymorphisms and at the transcriptional or post-translational level. Pharmacogenetics studies the effects of genetic variation in CYPs and other metabolic enzymes; this topic is covered extensively in Chapter 43 and thus will not be discussed here. At the environmental level, metabolic activity can be induced or inhibited by a wide variety of drugs, herbal products, and foods. Induction refers to an increase in metabolic activity, typically as a result of enhanced expression of genes encoding drug-metabolizing enzymes. An example is the upregulation of CYP3A4 by the herbal product St. John’s Wort; use of this product has been linked to accelerated metabolism of other CYP3A4 substrates, including oral contraceptives and immunosuppressive drugs, leading to unintended pregnancies and transplant rejection.123 Enzymes can also be induced intentionally for therapeutic purposes, as with the use of phenobarbital to induce expression of the glucuronide transferase UGT1A1, an enzyme whose reduced activity results in hyperbilirubinemia (i.e., Gilbert or Crigler-Najjar syndrome).163


Inhibition of metabolic activity is more common than enzyme induction. Inhibition can occur by simple substrate competition, where more than one compound must compete for a limited number of enzyme binding sites. This slows the metabolic rate of both substrates, although the difference in metabolism may be more apparent for one of the involved drugs, particularly if one compound has stronger affinity for the enzyme or is present in greater concentration. Other forms of inhibition (e.g., noncompetitive, uncompetitive) directly affect the inherent enzymatic function of a given molecule by binding to the active site or elsewhere on the protein, thus preventing normal metabolic reactions. Mechanisms of enzyme inhibition are discussed in detail in Chapter 15.


Many compounds found in therapeutic drugs (e.g., antiretrovirals, antifungals), herbal products (e.g., saw palmetto, Ginkgo biloba), and common foods (e.g., garlic, green tea) have been reported to inhibit metabolic enzymes. The site of inhibition can be important. For example, grapefruit juice potently inhibits CYP3A4 in intestinal cells.81 Reduced metabolism in the gut actually increases bioavailability with less influence on elimination; this can greatly affect CYP3A4 substrates with variable absorption, such as the immunosuppressive drug cyclosporine. Several algorithms are available to predict potential drug-drug interactions, using current information on metabolic enzyme (mainly CYP) inhibitors, inducers, and substrates.55,68,74



Excretion


Excretion (or elimination) is the final removal of drugs from the body. This can occur by numerous routes, including secretion into sweat, breath, and breast milk, incorporation into hair and nails, or even crossing the placenta into the fetal bloodstream. However, by far the most common means of drug elimination is excretion into urine or stool, depending on the water solubility of the compound. The rate of elimination into urine can be estimated using the glomerular filtration rate (e.g., calculated from serum creatinine).


Clearance can also be measured directly for a particular drug. This requires multiple samples from the same patient and is infrequently done, except for therapeutic agents with a narrow window between efficacy and toxicity. An example of this is the alkylating agent busulfan, used in high doses to ablate bone marrow precursor cells prior to hematopoietic stem cell transplant. Given the delicate balance between effective ablation (leading to successful transplant engraftment) and excessive treatment (leading to serious complications such as veno-occlusive disease of the liver), serial measurements of busulfan are used clinically to assess exposure to the drug and to individualize subsequent doses.126


Urine can be a useful matrix for drug testing; it is readily collected in a noninvasive manner, is relatively poor in protein and other analytical interferences, and generally shows higher drug concentrations because of the ability of the kidneys to concentrate compounds filtered from the blood. For these reasons, it is the most common matrix for drugs of abuse testing and other toxicologic applications (see Chapter 35). However, it is important to note that the correlation between urine drug concentrations and serum concentrations is poor at best. This is the result of wide variability in several factors that can affect renal drug elimination, including patient hydration status, urine pH, and circadian fluctuations in renal function. Although it may be possible to normalize urine drug concentrations somewhat with 24-hour urine samples and correction to a marker of renal function such as creatinine, in practice urine is rarely used for TDM purposes. Select exceptions, such as assessing patient compliance in pain clinics,137 arise where samples are obtained frequently, serum concentrations are poorly related to therapeutic efficacy, and the risk of drug diversion or misuse is relatively high.



Pharmacokinetic Models


The processes of drug absorption, distribution, metabolism, and elimination are not completely independent steps, but rather occur in an overlapping fashion, often simultaneously, within the body. This is especially true of those agents that are administered serially, as a subsequent dose is typically given before the first dose has been completely eliminated. Thus, it is necessary to have mathematical means of estimating factors such as the amount of drug present at a given time, the rate of clearance of a drug from the system, and the overall exposure to a drug for a given dose. Pharmacokinetic models have been developed to permit such calculations, and the practical aspects of some common models will be discussed here. Readers are referred to previous versions of this chapter for more comprehensive explanation of the derivation of the equations that follow.



Compartmental Models


The concept of physiologic compartments is used to envision the systemic distribution of a drug. A compartment is not a true counterpart of a particular organ or fluid; rather, each compartment can be thought of as a representation of those regions of the body (e.g., fluids, various tissues) to which a compound partitions with similar affinity. To clarify this, contrast two dissimilar therapeutic agents: one, a drug such as ibuprofen that remains preferentially in the plasma, and the other, a drug like digoxin that distributes extensively into lipid-rich organs. For the former, an administered dose distributes throughout the systemic circulation with minimal partitioning into tissues; thus, only the pool of drug in the blood needs to be considered when factors such as clearance rate are estimated. Such a compound is well described by a one-compartment model (Figure 34-8), where the compartment in this example is roughly analogous to the systemic circulation.



Alternatively, digoxin exhibits extensive tissue distribution. After absorption of an administered dose, this agent too will rapidly spread throughout the vasculature (the first compartment). However, because of its lipophilic nature, the drug will undergo a second, typically slower process of partitioning into various organs. This step requires passive or active transport into the tissue, thus its kinetics (e.g., rates of entry into, and departure from, tissues) differs from the initial distribution into the bloodstream. This is modeled with a second compartment, approximating the tissue stores of the compound. Because the only fraction of drug available for transport to sites of metabolism and elimination (e.g., liver and kidneys) is in the circulatory system, removal of distributed drug requires re-entry from the secondary “tissue” compartment back into the “blood” compartment (see Figure 34-8). Thus, the presence of a tissue-bound store of drug can greatly increase the amount of time required to fully eliminate a compound. Note that a drug that distributes to tissue can also be modeled reasonably accurately using a single compartment, so long as the drug exhibits similar kinetics in the tissues and fluids involved; the compartments are not true counterparts of regions of the body, merely representations of the number of distinct pools of drug.


The number of compartments included in the model can be extended to three or more; for example, a third compartment could represent sites of extended storage, as seen with strongly lipophilic drugs distributed into adipocytes. For many drugs, increasing the number of compartments will enhance the accuracy of the model. However, each additional compartment increases the complexity of the equations used to describe expected kinetics for the drug of interest. For simplicity, only the one- and two-compartment models will be discussed here.


In a one-compartment model, only a single pool of distributed drug is present within the body, and the rate of elimination is governed by metabolism or clearance from that pool. With a first-order process, a certain percentage of drug is removed per unit time; this is commonly expressed as the half-life (t1/2), which is a measure of the amount of time required to eliminate 50% of the available drug. Compounds that display zero-order kinetics do not have a true half-life because a constant amount of drug is eliminated per unit time, rather than a constant fraction of the total. However, for a given quantity of a drug with nonlinear kinetics, an apparent half-life can be defined that reflects the time required to eliminate 50% of the initial concentration. Note that the apparent half-life changes with alterations in the total amount of drug present (i.e., increasing with higher concentrations and decreasing with lower concentrations).


Drug concentration following first-order elimination decreases in a log-linear fashion, as is shown graphically in Figure 34-9. The slope of the line describing the decline is the elimination constant, k, which is a measure of overall elimination that includes loss of drug into urine or feces, loss due to metabolism, and so on. The elimination constant is related to half-life according to the following formula:



$$k = \frac{\ln 2}{t_{1/2}} = \frac{0.693}{t_{1/2}} \qquad (2)$$


In this model, the concentration of drug at any time (Ct) following a single dose can be calculated from the original concentration (C0), the elimination constant, and the time (t):


$$C_t = C_0 \, e^{-kt} \qquad (3)$$
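Equations (2) and (3) combine into a short sketch; the half-life and initial concentration below are hypothetical, chosen only to show that the concentration halves with each elapsed half-life:

```python
import math

def elimination_constant(half_life):
    """Equation (2): k = ln(2) / t_half."""
    return math.log(2) / half_life

def concentration_at(c0, k, t):
    """Equation (3): C_t = C_0 * exp(-k * t), first-order elimination."""
    return c0 * math.exp(-k * t)

# Hypothetical drug: half-life of 6 h, initial concentration 20 mg/L.
k = elimination_constant(6.0)
print(round(concentration_at(20.0, k, 6.0), 2))   # 10.0 (one half-life)
print(round(concentration_at(20.0, k, 12.0), 2))  # 5.0  (two half-lives)
```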


In a two-compartment model, the kinetics of distribution and elimination are distinct from one another, in contrast to the simpler one-compartment model. As shown in Figure 34-10, the initial plasma concentration of drug declines rapidly as the compound equilibrates between the two compartments. This is termed the distribution phase. As equilibrium between the two compartments is approached, the dominant kinetic mechanism becomes the elimination of drug from the plasma pool. This is termed the elimination phase. In general, the elimination process is slowed by the need for drug to leave the tissue compartment before it can be cleared from the body. The slopes fitted to the two phases reflect the distribution (α) and elimination (β) constants, which in turn determine corresponding half-lives for each phase. The distribution half-life is commonly called the alpha half-life, while the elimination half-life is the beta half-life. Calculation of concentration following a single dose incorporates both phases, as is evident in the following equation:



$$C_t = A \, e^{-\alpha t} + B \, e^{-\beta t} \qquad (4)$$

where A and B are the zero-time intercepts of the distribution and elimination phases, respectively.
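A numeric sketch of the biexponential model in equation (4), with arbitrary intercepts and rate constants chosen so that α is much larger than β, shows the rapid early (distribution) decline giving way to the slower terminal (elimination) slope:

```python
import math

def two_compartment(a, alpha, b, beta, t):
    """Equation (4): C_t = A*exp(-alpha*t) + B*exp(-beta*t).
    The alpha term dominates the rapid distribution phase; the beta
    term dominates the slower elimination phase (alpha > beta)."""
    return a * math.exp(-alpha * t) + b * math.exp(-beta * t)

# Hypothetical parameters: A = 8 mg/L, alpha = 1.2/h (distribution);
# B = 2 mg/L, beta = 0.05/h (elimination).
for t in (0, 1, 4, 24):
    print(t, round(two_compartment(8, 1.2, 2, 0.05, t), 3))
```

By t = 4 h the alpha term has nearly vanished, so the remaining decline is governed almost entirely by the beta (elimination) half-life.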


As discussed earlier, it is possible for a one-compartment model to describe a variety of drugs, ranging from water-soluble compounds found almost exclusively in the blood, to more lipophilic molecules that simply show similar kinetics in tissues and fluids. To account for differences in the extent of distribution, a volume of distribution (Vd) can be determined for each drug. The Vd is defined by the relationship between a single dose (D0) corrected for bioavailability (f) and the plasma concentration (C0) observed after dosing:


$$V_d = \frac{f \cdot D_0}{C_0} \qquad (5)$$


The Vd is not an actual physiologic volume; rather, it is a calculated parameter that can be much larger than the volume of a human body. A helpful description is that the Vd is the volume of fluid theoretically required to dilute a given dose to its known concentration if the drug were present only in the blood. If the majority of a compound enters tissue, its plasma concentration will be low, resulting in a higher calculated Vd. Thus a large Vd reflects extensive distribution, and a small Vd suggests that the drug is preferentially retained in the vasculature.


The value of Vd in an individual depends on many of the factors that determine distribution of a compound, including drug lipophilicity, body composition, protein binding, and so forth. Thus, although it is commonly provided as an average value, Vd can show substantial interindividual variability. Vd can be expressed as a volume (e.g., in liters) or as a volume per unit body weight (e.g., L/kg).


The Vd is a useful parameter for estimating plasma concentrations after dosing, and for predicting the clearance rate of a drug. Total body clearance (CLT), the amount of blood or plasma completely cleared of drug per unit time, depends on both the Vd and the elimination constant k:


$$CL_T = k \cdot V_d \qquad (6)$$
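Equations (5) and (6) translate directly into code. The dose, bioavailability, and elimination constant below are hypothetical, used only to illustrate the arithmetic:

```python
def volume_of_distribution(dose, f, c0):
    """Equation (5): Vd = f * D0 / C0."""
    return f * dose / c0

def total_clearance(k, vd):
    """Equation (6): CL_T = k * Vd (volume of plasma cleared per unit time)."""
    return k * vd

# Hypothetical: 500 mg oral dose, f = 0.8, observed C0 = 10 mg/L.
vd = volume_of_distribution(500, 0.8, 10)  # 40 L
# With k = 0.1/h, 4 L of plasma are cleared of drug each hour.
cl = total_clearance(0.1, vd)              # ~4.0 L/h
print(vd, cl)
```

Note how a low observed C0 relative to the dose yields a large Vd, consistent with the text's point that extensive tissue distribution inflates the calculated volume.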



Steady State


Although the example of a single dose is helpful for understanding basic pharmacokinetic principles, in practice TDM is performed for drugs administered multiple times over many days, weeks, or even years. Almost invariably, doses are administered before the preceding dose has been completely eliminated; thus to be useful, TDM models must be able to account for both residual and newly introduced drug.


As seen in Figure 34-4, drugs administered at regular intervals will accumulate to a point termed steady state, that is, where the amount of drug entering the systemic circulation is in balance with the amount being eliminated. Each dose still produces a peak (Cmax) and a trough (Cmin), but once steady state is reached, each subsequent dose should provide an identical profile of drug concentration versus time. For the purposes of this discussion, the dosing interval (τ) will be assumed to equal the half-life, although this is not universally true in practice.


Assuming doses are given at each half-life, a drug with first-order kinetics will require more than five doses to approach steady-state concentrations (>95% of Css). Similarly, at the end of therapy, five to seven half-lives after the last dose must pass for more than 95% of the steady-state concentration to be eliminated. Reaching steady state can be more complicated for drugs with very long half-lives, such as the antiarrhythmic agent amiodarone (t1/2 = 25 days). The time to steady state would be prohibitively long if such a compound were administered only once per half-life; thus such agents are usually given in a larger, initial bolus known as a loading dose to rapidly elevate plasma concentrations closer to steady-state concentrations.
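The approach to steady state can be illustrated with a simple simulation, assuming an idealized bolus dose given exactly once per half-life (all numbers are illustrative):

```python
def trough_after_n_doses(increment, n):
    """Trough level after n bolus doses given once per half-life:
    each interval, the dose is added and half of the total is eliminated."""
    level = 0.0
    for _ in range(n):
        level = (level + increment) * 0.5  # one half-life elapses
    return level

css_trough = trough_after_n_doses(10.0, 1000)  # effectively steady state
for n in (1, 3, 5, 7):
    frac = trough_after_n_doses(10.0, n) / css_trough
    print(n, round(frac, 3))
# Fraction of steady state: 0.5 after 1 dose, 0.875 after 3, ~0.969 after 5,
# ~0.992 after 7 -- consistent with the five-to-seven half-life rule above.
```

A loading dose shortcuts this accumulation by placing the level near the steady-state trough with the first administration.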


Particular caution must be used with drugs that display nonlinear kinetics. Recall that the rate of elimination of such compounds is independent of drug concentration; thus an increase in the administered dose is not countered by a corresponding enhancement in clearance. Therefore, a drug with zero-order properties will respond disproportionately to changes in dosing; for example, doubling the dose will result in a greater than twofold elevation in plasma concentrations. In addition, because the apparent half-life changes with alterations in the amount of drug present, the time required to reach a new steady-state concentration varies compared with a drug with linear kinetics (e.g., it is prolonged after a dose increase because of the longer apparent half-life).


Once steady state is achieved, TDM measurements are generally made at trough, that is, immediately before a scheduled dose. The rationale for this is that trough sampling minimizes interpatient variability in absorption, distribution, and so forth, and improves the reliability of comparison of a single plasma concentration versus population therapeutic ranges. For compounds with very long half-lives or those administered as extended-release formulations, less fluctuation between trough and peak concentrations is seen; thus random sampling may be acceptable.


Calculation of trough Css is possible if several parameters [D0, f, Vd, k, τ, and t (time since last dose)] are known:


$$C_{ss} = \frac{f \cdot D_0}{V_d} \cdot \frac{e^{-kt}}{1 - e^{-k\tau}} \qquad (7)$$


If these factors are not known, it is possible to estimate a median steady-state concentration (C) using a model-independent relationship:


$$\bar{C} = \frac{f \cdot D_0}{CL_T \cdot \tau} = \frac{f \cdot D_0}{k \cdot V_d \cdot \tau} \qquad (8)$$


This approach provides easier calculation than compartmental modeling, although with the risk of losing pertinent pharmacologic information compared with the more complex model systems. The previous equation may be used for any drug but is most relevant when the half-life of the drug is considerably greater than the dosing interval.
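Both steady-state estimates are sketched below with a hypothetical regimen (250 mg every 12 h, f = 1.0, Vd = 30 L, k = 0.1/h); setting t = τ in equation (7) yields the trough, while equation (8) gives the model-independent mean:

```python
import math

def css_at_time(dose, f, vd, k, tau, t):
    """Equation (7): concentration t hours after a dose at steady state.
    Setting t = tau gives the trough concentration."""
    return (f * dose / vd) * math.exp(-k * t) / (1 - math.exp(-k * tau))

def css_average(dose, f, vd, k, tau):
    """Equation (8): mean steady-state concentration,
    C-bar = f*D0 / (CL_T * tau), with CL_T = k * Vd."""
    return f * dose / (k * vd * tau)

print(round(css_at_time(250, 1.0, 30, 0.1, 12, 12), 2))  # trough at t = tau
print(round(css_average(250, 1.0, 30, 0.1, 12), 2))      # average over the interval
```

As expected, the mean lies between the trough and the post-dose peak; the simpler average requires no compartmental parameters beyond clearance.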



Clinical and Analytical Considerations


A robust TDM program offers clinicians the means to better manage patients and has the potential to improve patient quality of life through optimizing dose, supporting compliance, and minimizing toxicity. The practice of TDM has been expanded and enhanced by advancements in rapid, sensitive, and specific analytical techniques for a wide variety of therapeutic agents.



Clinical Utility


The best candidate drugs for TDM are those meeting one or more of the following criteria: (1) a narrow therapeutic index; (2) used for long-term therapy; (3) correlation between serum concentration and clinical response; (4) wide interindividual or intraindividual variability in pharmacokinetics; (5) absence of a biomarker associated with therapeutic outcome; or (6) administered with other, potentially interacting compounds. Ideally, TDM allows determination of a baseline drug concentration at a time when the patient is responding well clinically and is known to be compliant; this baseline therapeutic concentration can then be used over time to assess compliance, address physiologic or pathologic changes, and maintain optimal dosing for each individual patient. Single measurements of serum drug concentrations should always be interpreted in the context of clinical presentation, length of therapy, comedications, and other factors capable of affecting serum concentrations.


Chronic pharmacologic therapy is a necessary component in managing many conditions. Some therapeutic agents have convenient biomarkers or clinical indicators of their efficacy; for example, statin treatment can be assessed by quantifying cholesterol, and antihypertensive therapy can be evaluated by following blood pressure. However, for many drugs, biomarkers and clinical indicators are absent or are not visible until after the onset of therapeutic failure (e.g., transplant rejection resulting from inadequate immunosuppression). Such drugs are frequently managed using TDM, particularly when the condition for which they are prescribed involves the potential for serious risk to the patient, as with antiseizure therapy or post-transplant immunosuppression. Even for agents with available biomarkers, use of TDM can often assist clinical decision making; if a patient on antiarrhythmic therapy fails to improve cardiac rhythm, TDM may be able to clarify whether the patient requires a different dose, is refractory to that particular drug, or is simply noncompliant.


The ability to detect noncompliance is a major asset of consistent use of TDM. The World Health Organization estimates that only half of patients on long-term drug therapy comply with the prescribed regimen; noncompliance may be a result of taking the medication erratically, too often, too infrequently, or not at all. The cost of medication noncompliance is estimated at more than $100 billion in the United States alone.59 Patients at particular risk include the elderly, who frequently must manage several drug regimens for comorbidities; those with conditions prone to reducing ability or will to comply (e.g., severe depression); and individuals whose conditions include asymptomatic periods, wherein patients feel better and forget or do not feel the need to continue treatment. Without routine TDM, noncompliance with therapy may remain unnoticed until symptoms resume (e.g., renewed seizure activity in an epileptic individual) or the treatment fails (e.g., rejection of a transplanted organ).


Serum drug concentrations are useful in many stages of treatment. Initial selection and dosing of a drug may be guided by TDM, particularly if wide interpatient variability in absorption, metabolism, or other parameters of drug disposition is noted. Without measuring drug concentrations, it is difficult to discern which patients respond poorly to therapeutic concentrations of a particular drug and which ones simply are not within the therapeutic range. Similarly, the presence of comorbidities (e.g., hepatic failure, renal dysfunction) or comedications can complicate the process of establishing an effective dose; population pharmacokinetics often does not adequately address comorbidities or drug interactions, necessitating TDM for such patients.


Routine TDM is also helpful for detecting and managing alterations in drug disposition within an individual. Such changes can occur with physiologic processes (e.g., puberty, pregnancy, aging); however, they may also reflect development or progression of a pathologic state. Conditions as seemingly simple as weight loss or as complex as severe illness can radically affect the disposition of a drug within a single patient; these changes can occur rapidly and may be very difficult to manage clinically. Both acute and chronic shifts in pharmacokinetic behavior can be addressed more effectively with TDM because dose adjustments can be guided by each individual patient’s serum drug concentrations.



Analytical Concerns


A wide variety of analytical techniques are available to facilitate TDM, including numerous permutations of immunoassay methods such as enzyme multiplied immunoassay technique (EMIT), fluorescence polarization immunoassay (FPIA), cloned enzyme donor immunoassay (CEDIA), and chromatographic techniques such as gas chromatography–mass spectrometry (GC-MS), liquid chromatography–tandem mass spectrometry (LC-MS/MS), and high performance liquid chromatography–ultraviolet (HPLC-UV). These methods are discussed in Chapters 13 and 16. Immunoassays provide rapid results and ready automation; chromatographic techniques improve specificity and limits of detection, although at lower throughput. Unfortunately, commercial immunoassays are not available for many of the newer-generation drugs. LC-MS/MS is progressively replacing other HPLC-based methods; it displays greater selectivity and fewer analytical interferences, allowing development of multianalyte assays with higher throughput and less influence from metabolites or other potentially coeluting compounds. The choice of analytical method typically depends on the availability of resources (e.g., technologist expertise, laboratory funding) and the clinical demand for turnaround.


TDM analysis embodies many of the same concerns as other areas of clinical chemistry: the need for accurate, reproducible methods; the requirement for quality assurance and proficiency testing programs; and the necessity of establishing target ranges (i.e., therapeutic indices) and critical values (e.g., toxic concentrations). Certain preanalytical and analytical issues are of particular importance for drug assays. For example, some pharmaceuticals adsorb to the gel matrix in serum or plasma separator tubes, causing falsely low apparent drug concentrations and making these collection devices unacceptable for many tests. Similarly, the time of blood draw relative to administration of the drug is often a key factor in the interpretation of TDM results. Most TDM protocols require sampling at trough (i.e., immediately before the next scheduled dose), particularly for compounds with short half-lives or variable pharmacokinetics.


Other considerations for TDM include the determination of which metabolites and which drug fractions (e.g., free or protein-bound) are clinically relevant. Active metabolites should be quantified, and if the parent compound is also active (i.e., not a prodrug), the concentrations of parent and metabolite should be considered together in interpretation of the results. Inactive metabolites are often of interest as well. They may be associated with toxicity that is independent of the drug’s intended activity (e.g., the acetaminophen metabolite N-acetyl-p-benzoquinone imine) or may serve as a reservoir for conversion to active drug (e.g., the glucuronide conjugate of the immunosuppressant mycophenolic acid). Metabolites often accumulate at a different rate than the parent drug; thus inactive metabolites may provide longer detection windows or in vivo assessment of an individual’s metabolic capacity.


TDM of drugs with extensive protein binding may benefit from monitoring of free drug concentrations. In reasonably healthy individuals free of conditions affecting protein concentrations (e.g., pregnancy, malnutrition) or of comedications capable of altering the free versus bound equilibrium, analysis of free drug concentrations typically is not necessary. However, illness, physiologic alterations, or changes in comedications may shift the balance of free drug concentrations; similarly, free drug measurements are helpful in managing digoxin overdose treated with a drug-binding agent that nullifies but does not remove the excess digoxin. Equilibrium dialysis is the reference method for most free drug assays but is extremely time-consuming. In practice, ultrafiltration is used to remove larger molecules, including protein-bound drug; removal is followed by analysis of the remaining unbound fraction.


Finally, one further issue of clinical and analytical relevance to TDM is the format in which concentration units are expressed. Measured therapeutic drug concentrations are often expressed in units of micrograms per milliliter (µg/mL) or milligrams per liter (mg/L). However, it is recognized that use of the abbreviation µ could adversely affect patient safety.29 For example, in prescribing medication, a handwritten “µg” can be mistaken for mg (milligram), resulting in a thousand-fold overdose of drug, which clearly has the potential to harm a patient. As part of the National Patient Safety Goals initiative, the U.S. Joint Commission (formerly the Joint Commission on Accreditation of Healthcare Organizations) has identified common abbreviations that might be misinterpreted and therefore should not be used in healthcare, especially when medication is prescribed.2 Of relevance to the clinical laboratory is the use of µg as the abbreviation for microgram. Although µg is not currently on the list of abbreviations to be avoided, it is among a group of notations that are reviewed yearly and considered for inclusion on the “Do Not Use” list.1,2,29


Institutions accredited by the Joint Commission now use “mcg” rather than µg when prescribing medication. Some clinical laboratories have likewise chosen to use mcg in reporting concentrations, although other laboratories continue to use µg in laboratory reports, as this practice does not pose the same risks as those inherent in prescribing medication. The Joint Commission states that the “Do Not Use” list of abbreviations does not currently apply to preprogrammed health information technology systems such as laboratory information systems, electronic medical records, or computerized provider order entry systems.1 Complicating the issue, many national and international organizations [e.g., the Unified Code for Units of Measure (http://unitsofmeasure.org/) and the American Medical Association Manual of Style (http://www.amamanualofstyle.com/)] recommend or mandate the use of µ as the symbol for “micro.” It should be noted that if concentrations are reported in units of mg/L, this obviates any problem with the use of µg/mL without affecting the numeric value. Drug concentrations in this chapter are provided as mg/L (equivalent to µg/mL) or µg/L (equivalent to ng/mL) unless conventionally reported in molar units.


Nov 27, 2016 | Posted by in GENERAL & FAMILY MEDICINE | Comments Off on Therapeutic Drugs and Their Management
