Nuclear Power: Overview
Nuclear power plants use the heat released by splitting atoms, a process called fission, to produce steam that drives turbines and generates electricity.
The first nuclear reactor was built by Enrico Fermi below the stands of Stagg Field in Chicago in 1942. The first commercial reactor went into operation in Shippingport, Pa., in December 1957.
In its early years, nuclear power seemed the wave of the future, a clean source of potentially limitless cheap electricity. But progress was slowed by the high, unpredictable cost of building plants, uneven growth in electric demand, the fluctuating cost of competing fuels like oil, and safety concerns.
Accidents at the Three Mile Island plant in Pennsylvania in 1979 and at the Chernobyl reactor in the Soviet Union in 1986 cast a pall over the industry that was deepened by technical and economic problems. In the 1980s, utilities wasted tens of billions of dollars on reactors they couldn’t finish. In the 1990s, companies scrapped several reactors because their operating costs were so high that it was cheaper to buy power elsewhere.
But in the early years of the 21st century, more than a dozen companies around the United States became eager to build new nuclear reactors. Growing electric demand, higher prices for coal and gas, a generous Congress and public support for radical cuts in carbon dioxide emissions all combined to change the prospects for reactors, and many companies were ready to try again.
The old problems remain, however: public fear of catastrophe, the lack of a permanent solution for radioactive waste, and high construction costs. And new problems have emerged: the credit crisis and a worldwide decline in the number of factories that can make reactor components. Competition in the electricity market has also changed.
Most importantly, the 2011 disaster at the Fukushima Daiichi plant in Japan, with its wide impact and huge cleanup costs, has given both environmental groups and governments around the globe new reason for caution.