Nuclear Energy: Past, Present, and Future

To many, the phrase “nuclear energy” evokes strong sentiment: to some it is a clean, low-carbon energy source that could mean the end of carbon emissions; to others, a harebrained scheme destined for disaster. The question of nuclear power is a contentious issue within the scientific community and beyond. While some point to the benefits of a clean, reliable energy source, others warn of calamitous repercussions. New developments in nuclear technology, however, are improving safety and lowering the risk of catastrophic failures, renewing debate over its feasibility.

The idea of nuclear energy first arose with Ernest Rutherford, the New Zealand-born physicist renowned for his gold foil experiment and atomic model. His work revealed much about the structure of the atom, including the existence of a positively charged nucleus, and paved the way for further study of radiation. In a 1904 statement, Rutherford observed that “If it were ever possible to control at will” radioactive decay, “an enormous amount of energy could be obtained from a small amount of matter.” His words would prove prophetic. In 1934, an Italian physicist named Enrico Fermi began the work that would make sustained nuclear energy possible.

Fermi carried out a series of experiments in Rome showing that neutron bombardment could transform the nuclei of various atoms; only later, in 1938, was it recognized that some such reactions split the atom entirely, a process called nuclear fission. Continuing his work, Fermi went on to lead a team of scientists at the University of Chicago in 1942 with one goal: a self-sustaining fission chain reaction. Such a reaction had until then been purely theoretical, owing to the specific conditions and masses of radioactive material required. Using uranium, however, Fermi’s team succeeded in creating one, marking the beginning of the nuclear age.

The creation of the Atomic Energy Commission led to a slew of new nuclear reactors in the United States (Source: U.S. Energy Information Administration)

Since 1942, of course, nuclear technology has advanced considerably. While early research focused on weaponizing nuclear fission, given the context of World War II, peacetime applications came soon after. The creation of the Atomic Energy Commission a year after the war ended eventually led to the first nuclear reactors in the United States, with the rest of the world close behind. By 1991, more than 30 nations either had commercial nuclear power plants or had plans to build them.

Currently, nuclear power produces around one-fifth of the United States’ electricity: not surprising when one considers its advantages. Nuclear reactors are powered by heat from the fission of uranium atoms, specifically an isotope of uranium called U-235. This heat boils water, producing steam that turns a turbine; as the turbine spins, it drives a generator that produces electricity.
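The energy at the heart of this process can be estimated with a back-of-the-envelope calculation. The sketch below uses textbook round numbers, not figures from any specific plant: roughly 200 MeV released per U-235 fission and a typical thermal-to-electric conversion efficiency of about one-third.

```python
# Rough estimate of electricity obtainable from fissioning 1 kg of U-235.
# All figures are textbook approximations, assumed for illustration only.

MEV_TO_JOULES = 1.602e-13          # 1 MeV expressed in joules
ENERGY_PER_FISSION_MEV = 200       # typical energy released per U-235 fission
AVOGADRO = 6.022e23                # atoms per mole
U235_MOLAR_MASS_G = 235            # grams per mole of U-235

def electric_energy_per_kg(efficiency=0.33):
    """Electric energy (joules) from fully fissioning 1 kg of U-235."""
    atoms_per_kg = AVOGADRO / U235_MOLAR_MASS_G * 1000
    thermal_joules = atoms_per_kg * ENERGY_PER_FISSION_MEV * MEV_TO_JOULES
    return thermal_joules * efficiency

joules = electric_energy_per_kg()
gwh = joules / 3.6e12              # 1 GWh = 3.6e12 joules
print(f"~{gwh:.0f} GWh of electricity per kg of U-235")
```

Even with two-thirds of the fission heat lost in conversion, a single kilogram of U-235 yields on the order of several gigawatt-hours of electricity, which is what makes the fuel so compact compared with coal or gas.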

A basic diagram of how a nuclear reactor produces electricity. As the steam turns the turbine, the generator shaft also spins, generating electricity. The excess water escapes as vapor. (Source: U.S. Department of Energy)

Unlike plants that burn fossil fuels such as coal, nuclear plants do not use combustion of any sort. The uranium undergoes fission in a controlled environment, sealed inside fuel rods, and so produces zero carbon emissions. However, other environmental factors must be considered. Nuclear power plants consume large amounts of uranium, approximately 200 tons per year. While recent developments have made it possible to re-enrich and reuse radioactive material, nuclear plants still produce their share of nuclear waste.
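To put that 200-ton figure in context, consider how much coal a comparable plant would burn. The sketch below assumes a hypothetical 1-gigawatt (electric) plant, a typical ~33% conversion efficiency, and a round energy density of about 24 MJ/kg for hard coal; these are illustrative textbook values, not data from the sources cited in this article.

```python
# Rough comparison: annual coal consumption of a 1-GW(e) plant.
# Assumed round numbers for illustration, not plant-specific data.

SECONDS_PER_YEAR = 3.156e7
ELECTRIC_OUTPUT_W = 1e9            # assume a 1-GW(e) plant
EFFICIENCY = 0.33                  # assumed thermal-to-electric efficiency
COAL_ENERGY_J_PER_KG = 24e6        # ~24 MJ/kg, a round figure for hard coal

# Thermal energy the plant must generate over a year to deliver 1 GW(e)
thermal_j = ELECTRIC_OUTPUT_W / EFFICIENCY * SECONDS_PER_YEAR
coal_tonnes = thermal_j / COAL_ENERGY_J_PER_KG / 1000
print(f"~{coal_tonnes / 1e6:.1f} million tonnes of coal per year")
```

By this estimate, a coal plant of similar output burns millions of tonnes of fuel annually, against roughly 200 tons of uranium for a nuclear plant, a difference of four orders of magnitude in fuel mass.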

Concerns abound regarding the effects of nuclear waste on the environment, especially in the area surrounding the reactor itself. In the process of cooling the reactor, some radioactive particles mix with the cooling water. Though the water is filtered and conditions in the plant are carefully monitored, some radiation inevitably makes its way outside. However, the U.S. Nuclear Regulatory Commission has found that the radiation from nuclear plants amounts to “less than 1 percent of the radiation” we are already exposed to “from natural background sources.”

Although most nuclear waste is fairly low-level in terms of radioactivity, such as tools and other items liable to be contaminated, spent reactor fuel is just the opposite. While low-level radioactive items are disposed of fairly easily in keeping with regulations, used fuel is harder to handle. Usually in solid form, the used, or irradiated, fuel must be stored in “specially designed pools of water,” where the water “acts as a radiation shield,” according to the U.S. Energy Information Administration. Nevertheless, the U.S. has no permanent facility for disposing of high-level nuclear waste, predominantly due to opposition from environmental activists and local residents.

Locations of current nuclear waste disposal sites in the United States (Source: Department of Energy)

The environmental effects of nuclear energy, for all the regulations governing waste disposal, are especially jarring in light of the 1986 disaster at Chernobyl, Ukraine. An explosion at one of the plant’s four reactors caused immense damage and sent up a plume of radioactive particles, which travelled as far as Scotland, where they fell mixed with rain. The incident displayed the potential danger of nuclear power plants: radioactive particles can spread with ease, infiltrate the water cycle, and prove difficult to eradicate from an area. In 2011, a similar event at the Fukushima nuclear plant in Japan, triggered by an earthquake followed by a tsunami, led to the melting of reactor cores and the release of radioactive particles.

A satellite image of Chernobyl today, where lethal levels of radiation still exist (Source: NASA)

While danger is inherent in dealing with nuclear energy, strict regulations and advanced technology have curbed the potential for disaster. Furthermore, new research and development could make nuclear power even safer, especially with the advent of accident tolerant fuels. Accident tolerant fuels, as the name suggests, could mitigate the potential for catastrophic meltdowns. These fuels are expected to withstand the extreme conditions of a reactor for longer, and would cut the amount of fuel needed by around 30%, according to the U.S. Department of Energy. They would also increase efficiency and performance, improving both safety and profit margins.

In addition to new fuels, new reactors are also expected to make their debut. Advanced reactors, currently under review by the Nuclear Regulatory Commission, could eliminate the possibility of operator error by allowing the reactor to shut itself down safely. Furthermore, designs can now be better tested with 3D printing and modeling programs, allowing for smaller budgets and less margin for design flaws.

Yet for all the upcoming advances in reactor technology, they only lessen the chances of failure. Nuclear fission is an innately dangerous, unstable, and violent reaction; while new technology allows for effective containment, it cannot account for every possibility. However, groundbreaking research has opened up an alternative to fission-based nuclear plants: nuclear fusion. Where fission splits heavy radioactive elements, fusion creates heavier, more stable elements from lighter atoms. Fusion, incidentally, is the same process that powers stars, including our sun. Just as the sun fuses hydrogen atoms into helium, so would a fusion reactor, a potentially safer and more reliable approach to nuclear energy.
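The energy released in fusion comes from the mass lost in the reaction, via Einstein’s E = mc². As an illustration, the sketch below works through the deuterium-tritium reaction (D + T → He-4 + n), a leading candidate for fusion reactors, using standard atomic masses; the choice of this particular reaction is an assumption for the example, as the article itself mentions only hydrogen-to-helium fusion in general.

```python
# Mass-defect calculation for the deuterium-tritium fusion reaction:
#   D + T -> He-4 + n
# The mass lost (delta_m) is converted to energy via E = delta_m * c^2,
# expressed here through the standard conversion 1 u ≈ 931.494 MeV.

U_TO_MEV = 931.494                 # energy equivalent of one atomic mass unit

masses_u = {                       # standard atomic masses, in u
    "deuterium": 2.014102,
    "tritium":   3.016049,
    "helium-4":  4.002602,
    "neutron":   1.008665,
}

delta_m = (masses_u["deuterium"] + masses_u["tritium"]
           - masses_u["helium-4"] - masses_u["neutron"])
energy_mev = delta_m * U_TO_MEV
print(f"D-T fusion releases ~{energy_mev:.1f} MeV per reaction")
```

The products weigh slightly less than the reactants, and that tiny mass difference, about 0.02 atomic mass units, accounts for the entire energy yield of the reaction.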

Nevertheless, fusion technology is still far from realization, with no viable fusion power plant built to date. The foreseeable future of nuclear energy lies with fission reactors. Although the past has seen its share of disasters, Chernobyl among them, fission reactors are advancing with new technologies that could bring increased efficiency and safety to this clean, carbon-free method of energy production.