In 1945, humanity first released the massive power of nuclear fission, on the plains of the Jornada del Muerto in New Mexico. This was Trinity, the first nuclear explosion in history. The result was far from a controlled harnessing of energy, but it wasn’t meant to be; the Manhattan Project’s mandate was to produce bombs of unprecedented destruction, and in that it succeeded wildly. But thanks to further investigations into fission, largely undertaken to better understand the nature of the atom, six short years later nuclear power was harnessed to produce electricity for the first time. Three years after that, the American submarine USS Nautilus launched, and for the first time nuclear power was used for a sustained and practical purpose. In 1957, the Virginia-based power plant known as SM-1 began supplying power to the electrical grid, and the age of nuclear power was upon us.
Imagine if a particularly prescient person had observed what was occurring. First explained by Lise Meitner and Otto Frisch in late 1938, building on the experiments of Otto Hahn and Fritz Strassmann, nuclear fission had gone from theory to commercial application in less than two decades. Nuclear had gone from tearing apart Hiroshima and Nagasaki to providing clean energy for the citizenry in a dozen years. Even without expert knowledge of nuclear power, our observer would have been entirely justified in believing that the technology would irrevocably change the world. Such a person would not have had a sophisticated modern understanding of the importance of carbon release in the production of energy and its attendant effect on global temperatures, but they would surely still have understood the value of dramatically reducing the amount of air pollution being released into the world.
We now know, of course, that the burning of fossil fuels releases carbon into the atmosphere, which then traps heat, gradually raising the world’s temperature, particularly at the poles. This change in climate presents many potential problems for humanity and the environment, both of which exist in an exquisite balance. Even the more restrained projections of the destructive potential of climate change demonstrate that its effects could be exceedingly costly for civilization. If given an education in the modern understanding of climate, our prescient observer might have noted that, widely and responsibly deployed, nuclear power could have helped prevent one of the biggest global challenges in human history.
If they grasped nuclear power’s capacity for severing the relationship between global supply lines and energy production, the consequences for their current era might have seemed even more vast. The story of the 20th century, to a remarkable extent, was the story of the fight for fossil fuels, the immense strategic, diplomatic, and espionage resources spent in the pursuit of convenient forms of portable energy. Wars were fought for oil, governments toppled, regions destabilized. All of it came at a staggering cost in blood and treasure, and in the serial violation of human rights by great powers. Quite literally, since the end of World War II there has been no question of greater international strategic importance than the question of who has access to fossil fuels, and yes, that holds true even considering the existence of nuclear weapons. Today, nothing changes a country’s international standing more considerably or more quickly than the discovery of oil. It turns ghost towns into boom towns, it props up dictators, it shifts the balance of world power. It’s the stuff that drives the world.
To pick one particular region: in 1953 the effort to secure British access to oil compelled the United States to help depose the elected government of Iran and restore the Shah to power, leading directly to the conditions that enabled a conservative Muslim revolutionary movement that still holds power in the country today; Iran would go on to fight one of the bloodiest wars of the past half-century against fellow regional power Iraq, with the United States hedging its bets by sending weapons to both sides; saddled with war debt, Iraq would then throw its weight around by invading oil-rich Kuwait; this action was seen as imperiling Saudi Arabia, whose own theocratic government controls the second-largest proven oil reserves in the world, and which invited American troops into the country for its defense; the presence of those troops near Islam’s holiest places inflamed a rich zealot named Osama bin Laden, who orchestrated an audacious terrorist attack against the United States; hungry for vengeance, the Americans invaded Afghanistan and, later, Iraq…. You get the idea. In a world where nuclear energy had become a dominant form of energy production, the path of recent history would have been radically different. It’s true that fissile materials are themselves natural resources that require extraction, and could spark conflict, but it’s also true that they are not nearly as rare as the popular imagination holds, and vastly smaller quantities are needed to produce a comparable amount of energy.
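Just how much smaller those quantities are can be made concrete with a back-of-envelope calculation. The figures below are standard textbook approximations, not values from the text: roughly 200 MeV released per fission of uranium-235, and roughly 42 megajoules per kilogram for crude oil.

```python
# Rough sketch: energy density of complete U-235 fission vs. crude oil.
# Assumed figures: ~200 MeV per fission, ~42 MJ/kg for crude oil.
AVOGADRO = 6.022e23          # atoms per mole
MEV_TO_J = 1.602e-13         # joules per MeV

energy_per_fission_j = 200 * MEV_TO_J            # ~3.2e-11 J per fission
atoms_per_kg_u235 = (1000 / 235) * AVOGADRO      # U-235 molar mass ~235 g/mol
fission_j_per_kg = energy_per_fission_j * atoms_per_kg_u235  # ~8.2e13 J/kg

oil_j_per_kg = 42e6          # ~42 MJ/kg for crude oil

ratio = fission_j_per_kg / oil_j_per_kg
print(f"Fission beats oil by a factor of about {ratio:.1e}")
```

On these assumptions, a kilogram of fully fissioned U-235 yields around two million times the energy of a kilogram of oil; real reactor fuel fissions only a fraction of its mass, but even so the gap remains several orders of magnitude.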
There was every reason for a nuclear revolution to happen. Nuclear energy is far safer than many people think, with only a small handful of serious accidents in over 70 years of using fission to generate electricity. Indeed, nuclear plants are among the most reliable generators we have, routinely producing at or near full capacity for more of the year than most other forms of energy generation. Initial costs for establishing plants can be steep, but not unreasonably so relative to building a conventional fossil fuel plant, and such costs can be recouped through nuclear power’s efficient energy generation. The land footprint of nuclear power is small, with a typical large plant occupying roughly one square mile, while wind and solar farms of comparable output take up vast stretches of geography. The actual electricity-generating function of nuclear fission is zero-carbon, and while the plants themselves have some carbon impacts, from construction and fuel processing, these are a rounding error compared to burning gas or coal. Widespread nuclear use could