IN THE WEEKS after the first
medical X-ray images were made
early in 1896, scientists and physicians
began to improve on the faint images
produced by tubes and generators like
the ones Roentgen used. How they
made improvements—borrowing
from advances in physics, chemistry,
pharmacology, nuclear science, computers,
telemetry and information
science—is the story of a century of
medical radiology.
Those early X-ray experiments
also led scientists to observe that the
passage of X rays through living
tissue could cause changes. The low-energy
X rays appeared to have a good
effect on many skin diseases. Open
cancers shrank and the sores dried
up. Arthritis sufferers reported relief
from their pains. When exposures
were seen to make hair fall out, the
X ray was touted as an end to men’s
daily shaving chores. But just as
quickly, workers with X rays noted
that repeated exposures seemed to
cause skin inflammations, ulcers,
sores, superficial and deeper cancers,
blood abnormalities, and even death.
The question arose: must X-ray
workers inevitably forfeit their own
health, as some pioneers did, to the
promise of this new science?
The struggles of radiation scientists
to develop radiation safety
protocols, to devise measurements,
to learn to control X-ray production,
and to exploit the seeming paradox
that higher energies of radiation