Chapter Eight Electromagnetic radiation and radioactivity
We have already touched briefly on the subject of electromagnetic radiation in the previous two chapters; now it is time to examine this phenomenon in detail. As you will see from Table 8.1, as clinicians, we encounter almost the full range of the electromagnetic spectrum in both everyday and medical life. From tuning our car radio to a long-wave station on the way into work, to treating patients undergoing radiotherapy using gamma rays, we use electromagnetic radiation to see, treat and diagnose patients.
A full understanding of this phenomenon is therefore essential to multiple aspects of our clinical performance. Whether evaluating a patient’s musculoskeletal status or checking their second cranial nerve, we will be using different parts of the electromagnetic spectrum, and without a comprehensive knowledge of the way in which photons of different energies interact with matter – particularly biological tissue – we cannot hope to maintain and improve a patient’s health; or our own!
All electromagnetic radiation is made up of particles called photons, which have neither mass nor charge, and move at a constant speed of 2.998 × 10⁸ m s⁻¹. By now, you will hopefully have mastered the ‘double-think’ of wave–particle duality and the unsettling concept of probability waves, and have no difficulty in accepting that electromagnetic radiation consists of waves that, unlike the waves we encountered in Chapter 6, which required a medium for their propagation, can travel readily through a vacuum. The waveform consists of oscillating, perpendicular electric and magnetic fields: the magnetic field produces an electric field and the electric field produces a magnetic field, making the system self-sustaining. The direction of propagation is at right angles to the two fields (Fig. 8.1).
Figure 8.1 • Propagation. The waveform for electromagnetic radiation consists of oscillating, perpendicular electric and magnetic fields: the magnetic field produces an electric field and the electric field produces a magnetic field, making the system self-sustaining. The direction of propagation is at right angles to the two fields.
You can determine the direction of the fields and their associated photons in a similar way to that for electromagnetic induction: the second finger shows the electric field direction (rather than the electromotive force), the index finger again indicates the lines of magnetic flux, whilst the thumb gives the direction of propagation (movement). The only difference is that this time you have to use your left hand. The reason for this is that electromotive force is associated with conventional current (rather than electron movement) and is therefore ‘back to front’, whilst electric field strength is the ‘right way round’. Fortunately, as the left hand is the mirror image of the right, we have a handy self-correcting mechanism. There are plenty of mnemonics to help one stop confusing the two: most induction ceremonies involve the convention of a handshake (with the right hand), whilst when fielding in baseball you wear the glove on the left hand.
As with all waves, there is a relationship between the distance between each ‘peak’ (or each ‘trough’), known as the wavelength (λ), and the number of times the wave goes up and down each second, known as the frequency (ν or f). We shall be continuing to use f in order to avoid confusion with v, our symbol for velocity; note though that the alternative symbol for frequency is not in fact a ‘v’ but the Greek character ‘nu’ (ν).
This relationship is one of inverse proportionality, as can be seen from Figure 8.1: if you double the wavelength, you halve the frequency and vice versa. High frequency means short wavelength; low frequency means long wavelength.
This is expressed by the wave equation:

v = fλ

where λ and f have already been defined and v = the speed of the wave.
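The relationship between speed, frequency and wavelength can be checked numerically. The following is a minimal Python sketch; the function names and the 1500 m example wavelength are illustrative choices, not part of the text.

```python
# The wave equation v = f * lambda; for electromagnetic radiation in a
# vacuum, v is the speed of light, c.
C = 2.998e8  # speed of light in m/s

def frequency(wavelength_m):
    """Frequency (Hz) of an electromagnetic wave of the given wavelength (m)."""
    return C / wavelength_m

def wavelength(frequency_hz):
    """Wavelength (m) of an electromagnetic wave of the given frequency (Hz)."""
    return C / frequency_hz

# Inverse proportionality: doubling the wavelength halves the frequency.
assert abs(frequency(1500.0) / frequency(3000.0) - 2.0) < 1e-9
```

Note that a 1500 m wave (a typical long-wave broadcast wavelength) works out at roughly 200 kHz.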
At the bottom end of the energy range of electromagnetic radiation are those used for everyday media and communication. It is just as well that these photons have such minuscule amounts of energy; our bodies are penetrated by them every millisecond of every day but with typical energies of 10⁻¹² to 10⁻⁶ eV they have insufficient energy to cause any damage to the atoms from which we are composed (remember, even at the top end of their energy range, 10⁻⁶ eV is equal to 1.6 × 10⁻²⁵ J!).
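The conversion quoted in the parenthesis follows from the definition of the electronvolt; a short sketch (the function name is an illustrative choice):

```python
# Converting photon energies from electronvolts to joules:
# 1 eV = 1.602e-19 J (the energy gained by one electron crossing a
# potential difference of one volt).
EV_IN_JOULES = 1.602e-19

def ev_to_joules(energy_ev):
    """Convert an energy in electronvolts to joules."""
    return energy_ev * EV_IN_JOULES

# The top of the quoted radio-wave energy range, 1e-6 eV:
top_of_range_j = ev_to_joules(1e-6)  # ~1.6e-25 J, as in the text
```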
Radio (and television) signals are generated by high-frequency alternating currents flowing through the aerial of a radio transmitter. If the receiving aerial, which can be a simple piece of wire, is placed in the path of the radiation, an electromotive force is induced, causing a current to flow. The receiving aerial thus acts as the input transducer discussed in Chapter 7; its current is then amplified in the radio circuitry and the output transducer converts the signal to sound waves.
There have traditionally been two means of encoding a radio signal. It is possible to vary the height, or amplitude, of successive waves, known as amplitude modulation (AM), or the frequency of the waves, known as frequency modulation (FM). It is also technically possible to modulate the polarization of individual pulses, that is, the angle at which the electric and magnetic fields are oriented; however, such systems were never developed commercially, even though related technologies such as ultra-wideband radio have considerable advantages over conventional radio.
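The two traditional schemes can be sketched in a few lines of Python. This is an illustrative model only: the carrier and message frequencies, modulation depth and deviation are hypothetical values chosen for clarity, not broadcast parameters.

```python
import math

CARRIER_HZ = 1000.0  # hypothetical carrier frequency
MESSAGE_HZ = 50.0    # hypothetical message (audio) frequency

def am_sample(t, depth=0.5):
    """Amplitude modulation: the carrier's amplitude tracks the message."""
    message = math.sin(2 * math.pi * MESSAGE_HZ * t)
    return (1 + depth * message) * math.sin(2 * math.pi * CARRIER_HZ * t)

def fm_sample(t, deviation_hz=100.0):
    """Frequency modulation: the carrier's instantaneous frequency tracks
    the message; for a sinusoidal message the phase integral has a
    closed form."""
    phase = (2 * math.pi * CARRIER_HZ * t
             - (deviation_hz / MESSAGE_HZ)
             * math.cos(2 * math.pi * MESSAGE_HZ * t))
    return math.sin(phase)
```

Note the difference in the two waveforms: the AM signal's envelope swells and shrinks with the message, whereas the FM signal keeps a constant amplitude and varies only in how tightly its oscillations are packed.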
More recently, digital signals have started to replace the traditional analogue systems. In these, the information is compressed into ‘packets’; these have the advantage that a host of additional information can be encoded – including a request to retransmit the packet if information is lost – thus improving the quality of the output. They have the disadvantage that the circuitry and software needed to decode the signal are complex and expensive, which is why digital radios and televisions have often cost many times the price of their analogue counterparts.
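The retransmission idea relies on each packet carrying enough extra information for the receiver to detect corruption. A toy sketch of the principle, assuming a deliberately simple checksum (real systems use far more robust codes):

```python
def checksum(payload: bytes) -> int:
    """A toy checksum: the sum of all payload bytes, modulo 256."""
    return sum(payload) % 256

def make_packet(payload: bytes) -> bytes:
    """Append the checksum byte to the payload."""
    return payload + bytes([checksum(payload)])

def receive(packet: bytes):
    """Return the payload, or None to signal 'please retransmit'."""
    payload, received_sum = packet[:-1], packet[-1]
    return payload if checksum(payload) == received_sum else None
```

A corrupted packet fails the checksum test and can be re-requested rather than being passed on to the listener as noise.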
Although very long wavelength radio waves are used by the military to communicate with submarines (extremely low frequencies can penetrate seawater, removing the need for submarines to surface in order to communicate), the practical maximum wavelength for commercial radio waves is approximately 1500 m (about 1 mile).
It is the convention that radio stations which utilize AM signals are identified by their wavelength, which can be long wave (> 1000 m) or medium wave (100–999 m). Below this, short wave radio has traditionally been used by the amateur ‘ham’ radio enthusiast (5–99 m).
By contrast, FM stations are identified by the frequency at which they broadcast (sometimes quoted as the range over which the frequency is modulated). Typically, these will be in the range of tens to hundreds of MHz. The majority of mainstream radio stations eventually moved to FM owing to the superior signal quality.
Although some sources define microwaves as part of the radio wave spectrum, their uses are specific and different. Measured in centimetres and millimetres, microwaves were originally a neglected part of the electromagnetic spectrum. This changed during the Second World War, when British physicists and engineers secretly developed RADAR (RAdio Detection And Ranging), enabling them to detect and intercept the waves of German bombers that had been devastating the country’s military, industrial and social infrastructure.
RADAR works by transmitting pulses of electromagnetic waves in the microwave spectrum. These are reflected off distant objects, particularly those made of metal, back to the source, thus indicating the presence or absence of the target. The time taken for an echo to return gives the distance to the target and, by cross-referencing with other transmitters (triangulation), the position and speed of aircraft, ships and other objects can be determined.
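Because the pulse travels out to the target and back, the range is half the total distance covered in the round-trip time. A minimal sketch (the 1 ms example delay is an illustrative value):

```python
# RADAR ranging: range = c * t / 2, where t is the round-trip time of
# the reflected pulse (the pulse travels out and back, so only half the
# path length corresponds to the range).
C = 2.998e8  # speed of light in m/s

def radar_range_m(round_trip_s):
    """Target range (m) from the pulse's round-trip time (s)."""
    return C * round_trip_s / 2

# An echo returning after 1 ms indicates a target roughly 150 km away.
distance = radar_range_m(1e-3)
```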
The question often arises as to why microwave ovens not only cook (and defrost) more quickly than the infrared radiation found in conventional cookers, but also why they appear to cook from the inside out. Although microwave photons are actually less energetic than infrared photons, their longer wavelength allows them to penetrate several centimetres into food, where they deposit their energy directly by agitating water molecules; infrared radiation, by contrast, is absorbed at the surface and depends on conduction to reach the centre of cooking objects. In the same way that the outer layers of, say, a roasting chicken shield the centre from infrared rays, they insulate the centre of the chicken being cooked by microwaves. Meanwhile, the outer layers of both chickens will radiate energy back into the oven. In the case of the microwaved chicken, which is heated more evenly throughout, this means that the insulated inside cooks more quickly than the ‘leaky’ outside … good news if you like sponge pudding; not so good if you like your roast beef medium rare.
As with high-tension cables, mobile phone technology has also been involved in controversy with regard to public health concerns. The difficulty with these is that the symptoms can take a long time to manifest, and therefore even longer to investigate – by which time the technology has almost invariably changed.
A good example of this is so-called first-generation mobile phones. These emitted far higher levels of radiation than current phones, and this radiation was held directly against one of the least shielded routes into the skull, the ear canal. By the time any association with brain tumours could be properly investigated, the phones were obsolete, making epidemiological follow-up all but impossible; whether there are longer-term health issues remains to be seen.
Current concerns focus less on individual handsets and more on radio masts, which are often located on top of residential flats and have an as yet undetermined, largely anecdotal association with clusters of diseases, particularly neoplasms, which, as with high-tension wires, are more prevalent in children.
Mechanisms have been proposed to explain this, and supporting results have been reported in animal experiments, but masts continue to be mounted close to or on top of family homes. Not only is the research that could answer such questions hard to perform, it is even harder to fund.
As researchers into physical therapies can testify, raising finance for research that threatens the profits of major international corporations, be they the multi-billion dollar pharmaceutical concerns or the even richer telecommunication giants, is uphill work. Getting money to show that drugs or mobile phones are harmless or even beneficial is far easier – and the controllers of the purse strings have the ability to ensure that negative results never see the light of day.
The truth is, we simply do not know enough about the effects of medium-energy electromagnetic radiation to be sure whether or not it is safe – but it is worth remembering that a generation ago, continual exposure to the ultraviolet radiation of the sun was considered beneficial, whilst the generation before advocated x-rays for curing skin conditions.
Produced by moderately hot objects, infrared ‘heat rays’ are generated by moderately energetic atoms, in much the same way that visible light is produced, albeit with lower energies and longer wavelengths. Classic examples are the conventional ovens discussed above or fires, be they coal, wood or electric; they all radiate infrared, which can be sensed by heat-detecting nerve endings in our skin, giving a sensation of warmth. Although mammals cannot see infrared, some fish and many snakes possess specialized sense organs capable of detecting the infrared portion of the spectrum; this means that they can effectively find their prey by the heat that its body radiates, which is particularly useful for hunting in the dark.
Most lay people are perhaps most familiar with infrared through this ability to detect things in the dark by sensing warm objects, such as mammalian bodies. This technology, used in passive infrared (PIR) burglar alarm sensors, has also been widely used by wildlife documentary makers, particularly in the days before photon-multiplying night-imaging technology became easily available.
For the clinician, infrared therapy is a familiar, if largely unproven, technology that is commonly used to treat chronic non-healing wounds and musculoskeletal pain as well as a diverse number of other conditions from haemorrhoids to leukaemia. The technology behind this and the evidence base for infrared therapies is discussed in more depth in Chapter 10.
Although visible light forms only 1% of the electromagnetic spectrum as a whole, it is very important both to us and to a large percentage of the Earth’s fauna and flora that rely upon it for vision and as a source of energy.
To a large part of the animal kingdom, ultraviolet radiation forms part of the visible spectrum. Many fish, reptiles, insects, arachnids and all birds can ‘see’ ultraviolet – it is seemingly only mammals that have problems with this part of the spectrum; indeed, if our eyes are over-exposed to ultraviolet radiation, we develop cataracts that can eventually cause blindness.
Our skin is also sensitive to ultraviolet radiation. Although this has an important function – we cannot make vitamin D without it – it can also cause genetic damage to skin cells and cause neoplastic growths such as basal cell and squamous cell carcinomas and malignant melanomas.
Fortunately, we have a built-in defence mechanism; when ultraviolet radiation strikes the skin, it triggers a hormonal reaction causing cells within the epidermis called melanocytes to secrete the pigment melanin, which protects against longer-term ultraviolet damage – this is why the hue of a person’s skin is related to the latitude of their ethnic origins and also why we tan.
Ultraviolet exposure also has a role to play in the clinician’s armoury. Many cases of psoriasis respond well to exposure to ultraviolet – patients will often comment that their skin plaques diminish in the summer and this effect can be maintained by a daily artificial dose in the winter months.
Seasonal affective disorder (SAD) can also be combated by ultraviolet radiation. Everyone shows some physiological response to the diminished light levels of the winter season, and the effect is magnified in populations nearer to the polar circles. The lack of light entering the eye increases the pineal gland’s production of melatonin and reduces levels of the neurotransmitter serotonin. Although this is a useful evolutionary device causing torpor, reduced metabolism or even hibernation in the winter months when food sources are low and energy needs to be preserved, it can in some individuals trigger clinical depression. Wintering in tropical or sub-tropical climes or ski resorts (snow reflects ultraviolet better than longer wavelengths, which is why skiers get tanned, or sunburned, so easily) used to be the treatment of choice for the ‘December blues’. However, the advent of ultraviolet lamps and, more recently, ‘daylight’ electric bulbs has helped reduce the dependency on antidepressant medication for the less affluent sections of society.
The Earth receives considerable protection from ultraviolet radiation through the ozone layer, part of the Earth’s stratosphere that lies between 10 and 50 km above the surface of the Earth. Over 90% of the Earth’s ozone can be found in this narrow band. Ozone is an irritating, corrosive, colourless gas with a smell something like burning electrical wiring. In fact, ozone is easily produced by any high-voltage electrical arc such as spark plugs, Van de Graaff generators and arc welders. It is also, in much lower concentrations, what gives that ‘seaside smell’ near to large bodies of water.
Each molecule of ozone has three oxygen atoms and is produced when normal oxygen molecules (O2) are broken up by energetic electrons or high-energy radiation. The ozone layer absorbs almost 99% of all the sun’s ultraviolet radiation; without it, the planet would be sterilized on a daily basis.
Unfortunately, ozone is broken down by the action of freons, chemicals that for many years were used as propellants in aerosols and coolants in refrigerators. This caused significant depletion of atmospheric ozone layers, particularly in the extreme southern hemisphere, where the protection afforded dropped by more than 10%. Although a rapid ban on the use of these chemicals has allowed the gas to replenish, the long-term effects on the current generation remain unknown. In the past 10 years, rates of skin cancer in many countries have doubled; in Australia, the most affected country, some authorities estimate that the incidence may peak over the coming decades at 12 times the previous levels.
Public education and better health awareness play a vital role in combating this, and physical therapists – who get to see the skin of their disrobed patients far more often than most other physicians – have an important role to play in this, which is why dermatology forms an important part of most undergraduate syllabuses.
In 1895, a professor of physics at Würzburg University called Wilhelm Röntgen was experimenting with the flow of ‘cathode rays’ and noted that they caused certain materials, such as a piece of barium platinocyanide lying on his work bench, to fluoresce. He also noted that photographic plates, kept wrapped in paper in his desk drawer underneath the tube, were being fogged. He theorized that some unknown radiation was being formed when the tube was in operation, caused by the cathode rays (electrons) striking the glass wall of the tube. Having no idea what the rays could be, he called them ‘X-radiation’ and, despite attempts to rename them Röntgen radiation in his honour, the original name stuck.
In 1895, J. J. Thomson hadn’t yet got round to discovering electrons; the apparent flow of current through a vacuum, which we now know to be a beam of electrons and which was later put to use in television screens, was then attributed to ‘cathode rays’ because the ‘rays’ were given off by the cathode (the negatively charged terminal). This is how cathode ray tubes, as found in oscilloscopes and older televisions, got their name.
Within a month, he had determined most of the ray’s properties and taken the first-ever clinical radiograph, an image of his wife’s left hand, clearly showing the phalanges and her wedding ring (Fig. 8.2).
Röntgen was unable to make x-rays demonstrate diffraction, and therefore erroneously concluded that they were not in the same class of phenomenon as light. In fact, x-rays do diffract but, because of their short wavelength, no ordinary diffraction grating will cause them to do so. Instead, the gaps in atomic structures such as metallic lattices and crystals will cause x-ray diffraction to occur, and x-rays later became a vital research tool in the understanding of the properties of matter. Although x-ray crystallography has been used to determine the atomic and molecular structures of thousands of materials, perhaps the most famous instance was Rosalind Franklin, who used the technique to image DNA. Unfortunately for her, Crick and Watson beat her to the interpretation of her findings. In 1901, Röntgen was awarded the first-ever Nobel Prize for physics for his discovery. In 1962, Crick and Watson, together with their colleague Maurice Wilkins, won the Nobel Prize for physiology and medicine. Rosalind Franklin never received the honour; Nobel Prizes cannot be awarded posthumously and, in 1958, she had died at the tragically young age of 37 from cancer most probably brought on by exposure to the x-rays with which she worked.
Over 100 years later, and despite the advent of magnetic resonance and advanced ultrasound imaging, the x-ray still represents the most common and important form of diagnostic imaging, of particular use to the manual physician wishing to evaluate skeletal structures.
Unlike the arbitrary differentiation between radio waves and microwaves, the difference between x-rays and gamma rays is not a functional one but rather is based on their method of production. In fact, high-energy x-rays are more energetic than low-energy γ-rays; however, whilst x-rays are produced by the actions of electrons, gamma rays arise from the nuclei of certain radioactive isotopes when they undergo decay. This process is best understood as part of the subject of radioactive decay as a whole.
Standard diagnostic imaging – plain film x-rays, computed tomography and magnetic resonance imaging – can be enhanced, and one way in which this can be done is to use a radioactive isotope, which is taken up by the body. Once the atoms are selectively absorbed by the target structure, their radioactive emissions are detected by the imaging sensors, traditionally photographic plates, in order to produce an enhanced image.