New Jersey Governor Jon Corzine just announced an ambitious new energy plan designed to reduce the state’s carbon dioxide emissions. Many are supportive of New Jersey’s efforts, in particular its goals to derive 20% of its power from renewable sources and to decrease overall consumption. However, one part of the plan has caused considerable consternation: the recommendation that New Jersey build a new nuclear plant.
The fact that nuclear power is being counted on to improve the health of the planet should give us pause. For several decades, environmentalists identified nuclear energy as one of the planet's greatest dangers, citing the risks of handling and storing radioactive waste and the possibility of plant meltdowns. In addition to its environmental risks, nuclear power has never been economically competitive. No nuclear power plant ordered in the United States since 1973 has been completed, largely because public utilities have chosen to invest in cheaper options.
Even though nothing has been done to alleviate these environmental and economic concerns, nuclear energy is poised for a resurgence, as New Jersey’s energy plan exemplifies. The obvious explanation is global warming. Fears of a warming planet have reframed the environmental and economic case against nuclear energy. The industry has been given new life by nuclear power’s ability to produce large quantities of electricity with comparatively few greenhouse gas emissions (provisions in recent energy bills promising subsidies for nuclear power plants have helped as well).
However, we should not lose sight of the fact that nuclear energy is only attractive because it is the lesser of several evils. This is not the same thing as being a good option. We need to ask ourselves how we got into a situation where our energy options are so limited that building an incredibly expensive and environmentally risky nuclear power plant makes sense.
How did we get here? Three sets of historical decisions have shaped our present position: heavy government subsidy of nuclear energy, a failure to make proactive decisions surrounding climate change, and a lack of investment in renewable energy technologies.
First and foremost, the only reason nuclear power can enter contemporary energy debates is that the industry has received tens of billions of dollars of investment from the U.S. government over the last sixty years. Much of this funding was political in nature: after dropping atomic bombs on Hiroshima and Nagasaki, the U.S. government was eager to show that the most destructive technology ever built could also benefit society. The idea of a nuclear “swords to ploughshares” program justified, to elite policymakers, the investment of billions of dollars in nuclear energy programs. Even though the results rarely matched the hype, continued government support ultimately enabled the industry to become relatively well developed, if never fully financially solvent. The critical point to keep in mind is that without huge quantities of government funding, nuclear power would not be an option today.
The second historical thread is the failure of the United States and other industrialized nations to address global warming proactively. By the 1970s, and certainly by the beginning of the 1990s, there was an emerging scientific consensus that the atmospheric effects of burning fossil fuels were warming the planet. The United States has paid almost no heed to these warnings (governments in Europe and Japan have taken more aggressive steps toward reducing emissions, although rhetoric has exceeded reality most of the time). The failure to take proactive steps despite clear scientific evidence has exacerbated the present situation, forcing us to investigate all options for reducing greenhouse gases, even nuclear power.
Third, and related to the above point, one of the most effective interventions would have been to invest heavily in renewable energy technologies to lessen our dependence on fossil fuels. Nuclear energy itself proves that the government is capable of investing heavily in developing energy technologies. If some of those billions of dollars spent over the past sixty years had gone instead toward developing solar, wind, geothermal, biogas, and other renewable technologies, those technologies would be far more commercially mature, and we would be in a position to derive a much greater share of our energy from them. Such investments would likely have made New Jersey’s goal of deriving 20% of its energy from renewables seem as pathetic as Bush’s latest climate policy.
These past decisions have shaped today’s reality. Since we cannot go back and change the past, the best we can do is learn from it. The nuclear lesson is that whatever we do today will structure the options available to our children, for better or for worse. If we continue on our present course, our children will inherit a world of resource scarcity and a considerably warmer planet. Hopefully we can do better. If we invest heavily in renewable energy technologies that have minimal environmental risks (and not all renewable technologies are equal in this regard; see my post on the problems of ethanol, for example), we have a chance to give the next generation a legacy of solutions to accompany the challenges we will undoubtedly leave them.
The past dealt the present a bad hand for dealing with climate change. It’s up to us to take proactive steps to stack the deck in favor of the future.
Sunday, April 20, 2008