Nuclear Energy Affirmative
1AC – INHERENCY
CURRENT INCENTIVES ARE NOT ENOUGH. NUCLEAR ENERGY FACES MANY HURDLES
TO GET OFF THE GROUND.
CDP 2008 – CONGRESSIONAL DOCUMENTS AND PUBLICATIONS
FURTHER CONGRESSIONAL SUPPORT NEEDED FOR RESURGENCE OF NUCLEAR POWER, US
HOUSE OF REPRESENTATIVES DOCUMENTS, HOUSE COMMITTEE ON SCIENCE AND
TECHNOLOGY, 4-23
WASHINGTON D.C. - The Science and Technology Committee today held a hearing to explore the
potential for nuclear power to provide an increased proportion of electricity in the U.S. Witnesses at the
hearing highlighted the environmental and strategic benefits of nuclear energy and pointed to ways
Congress can support the development of new nuclear power plants. "Nuclear energy has all the properties
and benefits our world needs to successfully combat global climate change and meet our energy needs,"
said Congressman Brian Bilbray (R-CA). "Nuclear energy is one of the cleanest energy sources known to
mankind, but the United States has not built a new nuclear power plant in nearly 20 years. If we are to
truly harness this great technology and solve our environmental problems, we must make a commitment to
nuclear research and development as well as the production of new nuclear facilities." Companies over the
last nine months have filed nine license applications with the Nuclear Regulatory Commission (NRC) to
build a total of fifteen new nuclear reactors in the U.S. No new reactors have been built in the U.S. in over
twenty years, largely due to high upfront costs and uncertainty, deterring investments in such
facilities. Further, Mr. Robert Van Namen, Senior Vice President of Uranium Enrichment at USEC, said
that our domestic companies are at a disadvantage. "Domestic fuel companies constructing new facilities
face stiff competition in a market dominated by foreign, vertically integrated firms, many of which benefit
from the financial and political support of their governments." He continued, "Now is the time for the U.S.
government to encourage the efforts of our domestic companies to rejuvenate the U.S. nuclear fuel cycle
so it can meet the demand of an expanded nuclear power generating capacity in the decades to
come." Many in the industry have expressed that strong federal incentives are necessary to build new
plants. Incentives authorized within the last three years include: loan guarantees for new nuclear plants;
cost-overrun support; a production tax credit; and a joint government-industry cost-shared program to help
utilities prepare for a new licensing process. However, it is expected that currently authorized loan
guarantees will only cover the first 4-6 new plants. Representing the largest owner and operator of
commercial nuclear power plants in the U.S., Marilyn C. Kray, Vice President of Exelon Nuclear and
President of NuStart Energy Development, highlighted the challenges a company faces when attempting to
build a new nuclear plant. These impediments include lack of confidence in a long-term solution for
used fuel disposal, and lack of public confidence in nuclear power.
From studies of long-past climate, including the famous hockey-stick curve of the past millennium's temperature
(Science, 4 August 2006, p. 603), the IPCC concludes that the recent warming is quite out of the ordinary. "Northern
Hemisphere temperatures during the second half of the 20th century were very
likely higher than during any other 50-year period in the last 500 years," the
report concludes, "and likely the highest in at least the past 1300 years." Contrarians
have conceded that greenhouse gases may be warming the planet, but not by much, they say. The climate system is not
sensitive enough to greenhouse gases to overheat the globe, they say. For the first time, the IPCC report directly counters
that argument. Several different lines of evidence point to a moderately strong climate
sensitivity (Science, 21 April 2006, p. 351). The eruption of Mount Pinatubo in 1991 thickened the stratospheric haze
layer and cooled climate, providing a gauge of short-term climate sensitivity. Paleoclimatologists have determined how
hard the climate system was driven during long-past events such as the last ice age and how much climate changed then.
And models have converged on a narrower range of climate sensitivity. The IPCC concludes that both models and past climate changes point to a fairly sensitive climate system. The warming for a doubling of CO2 "is very unlikely to be less than 1.5°C," says the report, not the less than 0.5°C favored by some contrarians. A best estimate is about 3°C, with a likely range of 2°C to 4.5°C. What next? Looking ahead, the report projects a warming of about 0.4°C for the next 2 decades. That is about as rapid as the warming of the past 15 years, but 50% faster than the warming of the past 50 years. By the end of this century, global temperatures might rise anywhere between a substantial 1.7°C and a whopping 4.0°C, depending on the amount of greenhouse gases emitted. In some model
projections, late-summer Arctic sea ice all but disappears late in this century. It is very likely that extremes of heat, heat
waves, and heavy precipitation events will continue to become more frequent. Rain in lower latitudes will decrease,
leading to more drought.
Atmospheric concentrations of CO2 rose by a record amount over the past year. It is the third
successive year in which they have increased sharply. Scientists are at a loss to explain why the rapid rise
has taken place, but fear the trend could be the first sign of runaway global warming.
Runaway Global Warming promises to literally burn up agricultural areas into dust worldwide by 2012, causing global famine, anarchy, diseases, and war on a global scale as military powers including the U.S., Russia, and China fight for control of the Earth's remaining resources. Over 4.5 billion people could die from Global Warming related causes by 2012, as planet Earth accelerates into a greed-driven horrific catastrophe.
TWO – STARVATION
A – EXPERTS SAY GLOBAL WARMING WILL INTENSIFY WORLD FOOD SHORTAGES. THE
FUTURE OF FOOD PRODUCTION IS BLEAK WITHOUT A RESPONSE TO CO2 EMISSIONS.
GREENPEACE IN 07 (International responders to climate control, GreenPeace, February 2007,
http://archive.greenpeace.org/climate/database/records/zgpz0207.html, “EXPERTS SAY GLOBAL
WARMING MAY EXACERBATE WORLD FOOD SHORTAGES” CNDI-TP)
The IPCC Working Group III Subgroup on Agriculture, Forestry and Other Systems (AFOS) report concludes: The anticipated rise in global average temperature of about 2 to 3°C over the next century will most likely lead to severe impacts on agriculture and forestry, such as: a shift of the climatic zones by several hundred kilometres towards the poles, enlarging the arid zones in the tropical and subtropical regions and reducing the land available for agriculture; a rise in sea level of about 0.3 metres, inundating valuable land in coastal areas, especially in tropical and subtropical zones; a gradual breakdown of many ecosystems, like forests in temperate and boreal regions, leading to additional CO2 emissions and thus to further greenhouse warming; and potentially increased effects from pests and weeds.
Marine and land food species may also be affected by the increasing levels of ultraviolet radiation reaching
the earth as a result of unavoidable ongoing depletion of stratospheric ozone. This could lead to a reduced
production of biomass and photosynthesis, thus again enhancing the CO2 content of the atmosphere.
The Group concludes that "it is likely to be [an] enormously difficult task for mankind, not only to limit climate change to a tolerable level, but also to simultaneously achieve sufficient food production for a still rising world population..." (K. Heinloth (Physikalisches Institut der Universität Bonn) & R.P. Karimanzira, "Outcomes and policy recommendations from the IPCC/AFOS working group on climate change response strategies and emission reductions", Climatic Change, v.27(1), p. 139-146, May 1994).
Eminent US scientists Henry Kendall and David Pimentel agree with the conclusions of the IPCC
workshop. In modelling food supply requirements for various population levels, they conclude that global
warming and ozone depletion may have catastrophic effects on global food production. While most
countries were food self-sufficient in the early 1960s, few remain so. The increasing reliance on fertilisers, pesticides and irrigation,
increasing spread of soil erosion, ground and surface water pollution, salinisation, and rapid degradation of productive land has
contributed to significantly reduced food production. In Africa, per capita grain production has decreased by 22 percent since 1967.
Simultaneously, global population is projected to double in 40 years, necessitating a tripling of current food
production to maintain all peoples above the poverty line. Water is considered the major limiting factor, but
the problems associated with irrigation suggest that this is not the answer. Their study finds that while
global warming may benefit some crops, it may also benefit pests, insects and weeds.
So, to repeat, the food bubble is now starting to implode. What does it all mean? It means that as these
economic and climate realities unfold, our world is facing massive starvation and food shortages. The
first place this will be felt is in poor developing nations. It is there that people live on the edge of economic
livelihood, where even a 20% rise in the price of basic food staples can put desperately-needed calories out
of reach of tens of millions of families. If something is not done to rescue these people from their plight,
they will starve to death.
Wealthy nations like America, Canada, the U.K., and others will be able to absorb the price increases, so you won't see mass starvation
in North America any time soon (unless, of course, all the honeybees die, in which case prepare to start chewing your shoelaces...), but
it will lead to significant increases in the cost of living, annoying consumers and reducing the amount of money available for other
purchases (like vacations, cars, fuel, etc.). That, of course, will put downward pressure on the national economy.
But what we're seeing right now, folks, is just a small foreshadowing of events to come in the next couple
of decades. Think about it: If these minor climate changes and foolish biofuels policies are already
unleashing alarming rises in food prices, just imagine what we'll see when Peak Oil kicks in and global oil
supplies really start to dwindle. When gasoline is $10 a gallon in the U.S., how expensive will food be
around the world? The answer, of course, is that it will be triple or quadruple the current price. And that
means many more people will starve.
Fossil fuels, of course, aren't the only limiting factor threatening future food supplies on our planet: There's
also fossil water. That's water from underground aquifers that's being pumped up to the surface to water crops, then it's lost to
evaporation. Countries like India and China are depending heavily on fossil water to irrigate their crops, and not surprisingly, the
water levels in those aquifers are dropping steadily. In a few more years (as little as five years in some cases), that
water will simply run dry, and the crops that were once irrigated to feed a nation will dry up and turn to
dust. Mass starvation will only take a few months to kick in. Think North Korea after a season of floods.
Perhaps 95% of humanity is just one crop season away from mass starvation.
Philip Sutton from Greenleap and David Spratt from Carbon Equity argue that “human activity has
already pushed the planet’s climate past several critical ‘tipping points’, including the initiation of
major ice sheet loss”.
They quote US climate scientist James Hansen who warned in 2007 that the loss of 8 million square kilometres of Arctic sea ice now
seems inevitable, and may occur as early as 2010 — a century ahead of the Intergovernmental Panel on Climate Change projections.
“There is already enough carbon dioxide in the Earth’s atmosphere to initiate ice sheet disintegration in
West Antarctica and Greenland and to ensure that sea levels will rise metres in coming decades”, the report’s
authors say.
“The projected speed of change, with temperature increases greater than 0.3°C per decade and the
consequent rapid shifting of climatic zones will, if maintained, likely result in most ecosystems failing to
adapt, causing the extinction of many animal and plant species. The oceans will become more acidic,
endangering much marine life.
“The Earth’s passage into an era of dangerous climate change accelerates as each of these tipping
points is passed. If this acceleration becomes too great, humanity will no longer have the power to
reverse the processes we have set in motion.”
The authors conclude that we can avert this potential disaster, but warn that the science demands that “politics as usual” be rejected.
“The climate crisis will not respond to incremental modification of the business as usual model.”
“The sustainability emergency is now not so much a radical idea as simply an indispensable course of
action if we are to return to a safe-climate planet”, the authors conclude.
Cam Walker, spokesperson from FoE, used the report’s launch on February 4 to call on the government to urgently review the role of
the Garnaut Climate Change Review which is to make recommendations on carbon emission targets.
Walker criticised the terms of reference for Ross Garnaut, and the government’s policy of a 60% cut in emissions by 2050, saying that
global warming of 3°C would lead to disaster.
“The government is potentially allowing Garnaut to engage in dangerous trade-offs with the lives of many species and many people
rather than setting a safe-climate target”, he said.
Walker said the government is behind the times on climate science and urged it to bring James Hansen, head of the US NASA Goddard Institute for Space Studies, and that country’s most eminent climate scientist, into the review process “so that the science was put first rather than last in making climate policy”.
Walker said that Hansen warned in December that climate tipping points have already been passed for large
ice sheet disintegration and species loss, and there is already enough carbon in the Earth’s atmosphere for
massive ice sheets such as on Greenland to eventually melt away.
“These impacts are starting to happen at less than one degree of warming, yet the government is effectively planning on allowing
warming to run to 3 degrees”, said Walker.
He believes only a massive expansion of nuclear power, which produces almost no CO2, can now check
a runaway warming which would raise sea levels disastrously around the world, cause climatic turbulence
and make agriculture unviable over large areas. He says fears about the safety of nuclear energy are
irrational and exaggerated, and urges the Green movement to drop its opposition.
In today's Independent, Professor Lovelock says he is concerned by two climatic events in particular: the
melting of the Greenland ice sheet, which will raise global sea levels significantly, and the episode of
extreme heat in western central Europe last August, accepted by many scientists as unprecedented and a
direct result of global warming.
These are ominous warning signs, he says, that climate change is speeding up, but many people are still in ignorance of this. Important among the reasons is "the denial of climate change in the US, where governments have failed to give their climate scientists the support they needed".
In May 2004 James Lovelock, originator of the Gaian (earth systems) hypothesis, stirred media interest when he
reiterated his support for nuclear power (NP) as part of the solution to the overwhelming threat that
humanity (and the planet) is facing from global warming. Since then the nuclear industry has been lobbying
hard to restart its failing programme by presenting it as the answer to global warming.
James Lovelock knows better than any of us that the solution to global warming will involve complex
changes involving everything from finance to forestry and gigawatts to goat management, interacting together in a huge system
change. Above all, it will involve a shift in our perception of the world. Literally hundreds of new technologies
will be rolled out, primarily in energy conservation, energy efficiency, and many modes of renewable
energy technology.
The key to all this, as James taught us, is that Gaia moves in cycles that interact in mutually complementary ways, sometimes
facilitating each other and sometimes inhibiting each other. We must leave behind our old ways of thinking in isolated, linear, cause
and effect modules, and learn to think in the way that nature moves, in interrelated web-like systems.
The paradox is that nuclear power is an outstanding example of linear thinking. You dig out your uranium, you
burn it, and you bury it (or fire it off into the sun or something, whatever). From a systems point of view, the main thing to
bear in mind is that you must try to cause as few cancers as you can reasonably get away with, which
means isolating the nuclear cycle as best you can from the rest of nature (and of course, you have to make sure that
nobody with brown skin gets hold of nuclear power, because they might develop nuclear weapons from it, and give them to Osama bin
Laden.).
When I put this systems argument to James Lovelock, his only response was that nuclear fission reactions have occurred in nature.
This is true; but asteroid hits are also a part of nature, and this does not mean that we should contemplate attracting asteroid hits in
an effort to extract energy from them. His response is not a valid defence of his position, and the systems argument against nuclear
power still stands.
James recognises that nuclear power is a risky business, but says that we must use it, because if we
continue to use coal, oil, and gas, it is certain that global warming will cause immense damage to
planet and people.
We must address the question raised by an environmentalist of the stature of James Lovelock. Should we accept nuclear power, despite
its dangers and drawbacks, as a necessary instrument in the battle against global warming?
FIRST, BLACKOUTS!
significant investment: a high-voltage line from the coal fields of West Virginia to Baltimore and Washington, D.C., and another extending from West Virginia to Philadelphia, New Jersey, and Delaware. However, these lines, hundreds of miles long, would not be necessary if the mandate existed to build new nuclear plants where the capacity would be near the load centers. While Virginia and Maryland utilities are considering such new builds, most of the nuclear power plants under consideration by utilities are in the semi-rural Southeast, where there is political support for new plants, and building more high-voltage transmission lines to carry the power is unlikely to be held up for 15 years by "environmental" court challenges. Some of that new nuclear-generated power from the Southeast will be used locally, for growing demand, and some will be wheeled to the energy-short regions of the mid-Atlantic and Northeast, which refuse to build their own capacity. Companies that have been buying up transmission capacity will make a bundle in the process.
Underinvestment in new transmission capacity overall has left the grid system vulnerable to even small instabilities. The industry estimates that $100 billion is needed in new transmission capacity and upgrades, as quickly as possible. The 2003 blackout did spur some increase in investment industry-wide, from $3.5 billion per year to $6 billion in 2006. But profit-minded companies are only willing to invest funds where there is a profit to be made, namely to carry their "economy transfers," regardless of how that destabilizes the grid system overall. In a July 2006 article, three former electric utility executives, who formed the organization Power Engineers Supporting Truth (PEST) out of disgust with the refusal of the government to pinpoint deregulation as the cause of the massive grid failure after the 2003 New York blackout, stated that the "core issue is an almost fundamentalist reliance on markets to solve even the most scientifically complex problems... [P]olicy makers continue to act as if some adjustment in market protocols is all that is required, and steadfastly refuse to acknowledge the accumulating mass of evidence that deregulation ... is itself the problem. Social scientists call this kind of denial, cognitive dissonance."
The engineers, who have among them more than five decades of experience in the electrical utility industry, insist that "new transmission lines will not by themselves improve reliability. They may increase transfer capacities, and hence improve commercial use of the grid," but will not necessarily improve performance of the system. "Reliability standards have already been reduced to accommodate greater use of the grid for commercial transactions," they warned (Table II).
There has been a huge penalty for this disruption of the functioning of the electric grid. PEST estimates that the 2003 blackout incurred economic losses in excess of $5 billion. The California blackouts cost in excess of $1 billion each. The national impact of declining reliability and quality, they estimate, is in excess of $50 billion.
Where To Go From Here
When the California energy crisis of 2000-2001 was raging, distraught state legislators and the embattled Gov. Gray Davis searched for a solution. Although they knew what that solution was, they protested that it would be impossible to put the toothpaste of deregulation back in the tube. Lyndon LaRouche and EIR proposed that that was exactly what needed to be done.
On Monday, July 17, 2006, in the midst of an intense Summer heat wave, one of Con Edison's 22 primary feeder lines failed, below the streets of the City of New York. Over the next several hours, five more feeder lines were lost. Voltage was reduced 8% to limit the instability, and the utility was faced with 25,000 customers—about 100,000 people—in the heat and dark. It took until midnight July 23—seven days later—to restore 20,000 of the affected customers, according to Con Edison.
The New York City blackout was the result not of a Summer heatwave, but of the decades of underinvestment in the infrastructure that distributes electric power from central feeder lines, through transformers, to the wires that deliver power to each home, school, factory, office building, small business, and hospital. Some of Con Edison's underground infrastructure goes back almost as far as Thomas Edison's first central generating station and underground cable, on Pearl Street in lower Manhattan, in 1882. It was a length of 59-year-old cable whose failure was a factor in the July blackout.
A couple of years ago in Philadelphia, workers for PECO Energy found that some underground utility cable still in service dated to 1899. In July 1999, the failure of outdated cable was blamed for power outages in Manhattan affecting 200,000 people. In San Francisco, a failed cable in December 2003 created an outage for 100,000 residents. "We've been using equipment far beyond its original intended life because we've been concerned with the cost of replacement and the need to keep utility rates down," remarked Dean Oskvig, president of Black & Veatch, an engineering firm based in St. Louis, last month.
Industry-wide, there is agreement that weaknesses due to the age of the underground distribution cable have been exacerbated by the way the
system is run in today's deregulated world. To "save money," the industry has turned to a policy of "run to failure," where a company waits for a failure before replacing aged power lines and other equipment. Black & Veatch reports that although utilities currently spend more than $18 billion on local distribution systems, most of that is to string new wire to new housing developments (which will likely come to an end soon, along with the housing boom), and that an additional $8-10 billion per year is needed to replace obsolete and corroded equipment.
On top of this disinvestment policy, local distribution systems, like the transmission system, are being stretched beyond their design limits. In addition to chronological age, equipment that is overheated by heavy electricity use and repeatedly stressed will age faster, and is more likely to fail suddenly.
In 1986, Con Edison began a program to replace all of its older cable with a newer design. It is spending about $25 million per year, and at that rate, the utility will not finish until 2024. By that time, some of its replacement cable will be 38 years old. Con Edison delivers electricity to 3.2 million customers, through 95,000 miles of underground cable and 33,000 miles of overhead wires. Estimates are that about 27% of its underground cable needs to be replaced. Why is it taking decades to replace old cable?
According to media reports, Southern California Edison recently sought approval from the state Public Utilities Commission to replace 800 miles of aging underground cable, after concluding that cable failures were the leading cause of outages that could be prevented. But "consumer advocates" opposed the utility's request to recoup the $145 million cost of replacement, on the grounds that the utility's records were not adequate to ensure the worst cables would be replaced first. The utility will proceed and spend $250 million more than is recouped in customers' bills anyway, because they "don't want to get too far behind." Apparently the shareholder-driven "consumer advocates" never added up the economic, and sometimes life-threatening, costs of the alternative—blackouts.
Before deregulation, companies like Con Edison would make investments in infrastructure that were deemed necessary to maintain a level of service and reliability that met industry-wide standards, assured that state regulators would allow them to recover the costs and maintain their financial health. Today, many states have no authority to either order investments or compensate companies that make them, leaving Wall Street and the "free market" to decide who shall have reliable electric power.
Between 1990 and the year 2000, utility employment in power generation dropped from 350,000 to 280,000, as utilities looked for ways to slash costs, to be "competitive." Over the same decade, employment in transmission and distribution went from 196,000 to 156,000, in a system that is growing more complex by the day. Today, the average age of a power lineman is 50 years.
"Quick profit," deregulation, shareholder values, and environmentalism have all run their course, and nearly taken down the electricity grid. It is time to change the axioms.
Transmitting Power, or Just Profits?
Yes, there need to be more power plants built, to make up for the deficits in electric-generating capacity in many parts of the country. It is also the case that entire regions, in particular the West and East Coasts, have so much congestion on their transmission lines that they cannot import the power they need. And as seen in New York City this past July, breakdowns in 100-year-old underground local distribution systems are now leaving tens of thousands of people in the dark, and must be replaced.
But it is foolhardy to think that the needed investments will be made under the present regime. Today, thanks to deregulation, a company can earn more profits by not building anything, and instead charging more for what it already produces, by creating shortages. This strategy was implemented to perfection six years ago by Enron and other power pirates in California, which withheld power to raise prices through the roof, allowing them to steal tens of billions of dollars out of the pockets of electricity consumers throughout the West Coast.
Today, unregulated utility companies do not plow a large portion of their profits back into improving infrastructure, but instead pay out higher dividends to stockholders. If even a regulated company has any hope of raising hundreds of millions of dollars on Wall Street to finance growth, it must prove itself creditworthy, by cutting costs and showing it can abide by shareholder values.
Individual companies no longer cooperate to ensure the overall reliability of the electric grid. They compete to build power plants and transmission lines based on their return on investment, not on the physical requirements of a regional system. They make themselves "competitive" to undercut the competition by cutting maintenance costs and getting rid of as many employees as they can.
For two decades, industry officials and the North American Electric Reliability Council (NERC) have warned that restructuring the electricity system would destroy it. An understanding of that danger provoked Dr. Anjan Bose, former Dean of Engineering at Washington
State University, to comment, citing the advancement of power systems expertise in China and India, that "the next time a grandstanding politician in North America compares our grid to that of the Third World, he may actually mean it as a compliment."
There is no way to "fix" the system, as Congress has tried to do, by piling on more and more Federal regulations to try to patch up the gaping holes in the broken system that now exists. The only remedy is to return the intention of the industry to one of providing universally reliable service, by putting the toothpaste of deregulation back in the tube.
The nearly two dozen states that have restructured their local industry, forcing utilities to sell their generation assets to conglomerate holding companies in order to "compete," must return responsibility and oversight for electric generation and distribution to the state utility commissions. These public servants should decide what should be built, and where, on the basis of providing for the general welfare, not the profit profiles of companies headquartered a half-continent away.
The now-congested and unstable long-distance high-voltage transmission systems that criss-cross the nation must be used for the purpose for which they were intended: to enable bulk power transfer in case of emergency, not to wheel power from one end of the country to the other so a company can import cheaper power, charge a few cents less, and beat out the competition. Responsibility for the transmission system should be taken out of the hands of the Federal deregulators, and returned to the regional reliability councils that formulated the rules of the road to keep the system robust.
There are no shortcuts. Decisive action is needed to reverse the past thirty years of failed policies.
ONLY NUCLEAR POWER CAN PROVIDE RELIABLE ELECTRICAL POWER TO KEEP OUR
NATION’S INFRASTRUCTURE FROM GOING UNDER
FERTEL 2004
(March 4 2004, Marvin S., Senior Vice President and Chief Nuclear Officer Nuclear Energy Institute,
“United States Senate Committee Energy and Natural Resources Subcommittee on Energy”, Testimony, pg
online @ http://www.nei.org/newsandevents/speechesandtestimony/2004/energysubcmtefertelextended)
America’s 103 nuclear power plants are the most efficient and reliable in the world. Nuclear energy is the
largest source of emission-free electricity in the United States and our nation’s second largest source of
electricity after coal. Nuclear power plants in 31 states provide electricity for one of every five U.S. homes
and businesses. Seven out of 10 Americans believe nuclear energy should play an important role in the
country’s energy future. 1
Given these facts and the strategic importance of nuclear energy to our nation’s energy security and
economic growth, NEI encourages the Congress to adopt policies that foster continued expansion of
emission-free nuclear energy as a vital part of our nation’s diverse energy mix.
OIL PEAK IMMINENT – PRICE SHOCKS WILL RIPPLE THROUGH OUR NATION’S
INFRASTRUCTURE
LANDRY 2007
(March 30 2007, Cathy, of the American Petroleum Institute, “GAO warns of peak oil threat to global
economies”, pg LEXIS)
World oil production will peak sometime between now and 2040, the US Government Accountability
Office said March 29, cautioning that if the phenomenon occurs "soon" and "without warning," it could
cause oil prices to surge to unprecedented levels and result in "severe" economic damage. "The prospect
of a peak in oil production presents problems of global proportions whose consequences will depend
critically on our preparedness," GAO, the nonpartisan investigative arm of Congress, said in a report.
"While these consequences would be felt globally, the United States, as the largest consumer of oil and one
of the nations most heavily dependent on oil for transportation, may be especially vulnerable among the
industrialized nations of the world." Despite the threat of peak oil, the US government currently has no
"coordinated or well-defined strategy" to address the uncertainties about the timing of peak oil or to
mitigate its potential effects. For that reason, GAO recommended that the federal government take
immediate action, and suggested that the US energy secretary take the lead in coordinating a government
strategy. The government effort, GAO said, should include a monitoring of global supply and demand with
the intent of reducing uncertainty about the timing of peak oil production. It also should assess alternative
technologies in light of predictions about the timing of peak oil and periodically advise Congress on likely
cost-effective areas where government could assist the private sector with development or adoption of the
new technologies. GAO pointed out that there are "many possible alternatives" to using oil, but that
alternatives will require large investments and in some cases will require major investments or
breakthroughs in technology. "Investment, however, is determined largely by price expectations, so unless
high oil prices are sustained, we cannot expect private investment to continue at current levels," GAO said.
But if the peak were anticipated, it said, oil prices would rise, signaling industry to increase efforts to
develop alternatives and consumers of energy to conserve and look for more energy-efficient products.
numerous chemical companies. There is no doubt that this powerful new tool will play a major role in
feeding the world's population in the coming century, but its adoption has hit some bumps in the road. In the second essay, Editor-at-
Large Michael Heylin examines how the promise of agricultural biotechnology has gotten tangled up in real public fear of genetic manipulation and
corporate control over food. The third essay, by Senior Editor Mairin B. Brennan, looks at chemists embarking on what is perhaps the greatest intellectual
quest in the history of science—humans' attempt to understand the detailed chemistry of the human brain, and with it, human consciousness. While this
quest is, at one level, basic research at its most pure, it also has enormous practical significance. Brennan focuses on one such practical aspect: the effort
to understand neurodegenerative diseases like Alzheimer's disease and Parkinson's disease that predominantly plague older humans and are likely to
become increasingly difficult public health problems among an aging population. Science and technology are always two-edged swords. They bestow the
power to create and the power to destroy. In addition to its enormous potential for health and agriculture, genetic engineering conceivably could be used
to create horrific biological warfare agents. In the fourth essay of this Millennium Special Report, Senior Correspondent Lois R. Ember examines the
challenge of developing methods to counter the threat of such biological weapons. "Science and technology will eventually produce sensors able to detect
the presence or release of biological agents, or devices that aid in forecasting, remediating, and ameliorating bioattacks," Ember writes. Finally,
Contributing Editor Wil Lepkowski discusses the most mundane, the most marvelous, and the most essential molecule on Earth, H2O. Providing clean
water to Earth's population is already difficult—and tragically, not always accomplished. Lepkowski looks in depth at the situation in Bangladesh—where
a well-meaning UN program to deliver clean water from wells has poisoned millions with arsenic. Chemists are working to develop better ways to detect
arsenic in drinking water at meaningful concentrations and ways to remove it that will work in a poor, developing country. And he explores the evolving
water management philosophy, and the science that underpins it, that will be needed to provide adequate water for all its vital uses. In the past two
centuries, our science has transformed the world. Chemistry is a wondrous tool that has allowed us to understand the structure of matter and gives us the
ability to manipulate that structure to suit our own purposes. It allows us to dissect the molecules of life to see what makes them, and us, tick. It is
providing a glimpse into the workings of what may be the most complex structure in the universe, the human brain, and with it hints about what
constitutes consciousness. In the coming decades, we will use chemistry to delve ever deeper into these
mysteries and provide for humanity's basic and not-so-basic needs.
1AC – PLAN
The United States Federal Government should substantially increase loan guarantees for
the expansion of domestic nuclear power facilities.
We’ll clarify.
1AC – SOLVENCY
development." David Torgerson, chief technology officer and senior vice-president of Atomic Energy of Canada, says the way
uranium resources are used by power generators is driven by cost and supply. During the 1990s, for example, uranium prices were so
low that it made more economic sense to just use it once and then stick the spent fuels in wet or dry storage. But some countries don't
have their own uranium resources, leaving them dependent on imports from other, potentially hostile jurisdictions. As uranium prices
rise, the economics of the once-through fuel cycle also become less appealing when measured against the costs of waste management
and disposal. "As the nuclear renaissance takes off and more reactors are built, it's likely the price of uranium will increase (even
more), and people will be looking at ways of getting more value out of that uranium," says Torgerson. "Any time you can convert a
waste into an asset, then you're going in the right direction." He's quick to point out that the DUPIC process is
also "proliferation resistant," meaning there is no chemical separation of the spent uranium's more
dangerous components, primarily plutonium, which could be used by extremists or rogue nations to
produce nuclear weapons. Only mechanical processing is required to change the shape of the spent fuel
rods into shorter Candu rods. Mechanical reprocessing, while it has some safety and transportation issues,
could be cheaper than conventional chemical reprocessing. "Because this is so much simpler, you have to
expect the economics are going to be so much better," says Torgerson, pointing out that the South Koreans
studied the economics of the DUPIC fuel cycle in the 1990s and found it could compete against other fuel
options. "This is one of the characteristics we're certainly pushing." For countries such as China, which
already have Candu reactors in their fleet, it's an approach that could prove attractive. AECL estimates that
waste fuel from three light-water reactors would be enough to fuel one Candu. Duane Bratt, a political
science instructor and expert on Canadian nuclear policy at Calgary's Mount Royal College, says he can
envision two revenue streams going to Candu operators that choose to embrace the DUPIC process. One
stream would be the revenue that comes in through the generation and sale of electricity; the other would
come from a tipping fee that operators of light-water reactors would pay to unload their spent fuel. "These
(Candu) operators wouldn't be buying the spent fuel, they'd be paid to use the spent fuel for environmental
reasons," says Bratt. "If you can minimize the waste, you bring tremendous value."
NUCLEAR ENERGY IS THE MOST COST EFFECTIVE WAY TO ADDRESS ENERGY AND
POLLUTION CONCERNS
FORATOM 2006 – EUROPEAN ATOMIC FORUM
NUCLEAR ENERGY THE MOST COST EFFECTIVE, 2-1,
http://www.foratom.org/index.php?option=com_content&task=view&id=219&Itemid=938
Each country needs an appropriate energy strategy, reflecting its natural resources and its energy needs.
Nuclear energy enables countries to:
* reduce their reliance on imported fossil fuels and electricity imports
* increase their energy independence
* strengthen security of energy supply.
With greater reliance on nuclear energy, countries are less likely to be seriously affected by fossil fuel
shortages and sudden rises in fossil fuel prices. The uranium used in nuclear fuel is available from various
countries with a long history of political stability, including Australia and Canada. This has a stabilising
effect on uranium prices and supply. Any rise in uranium prices would have only a minor impact on the cost
of a nuclear kilowatt-hour, as fuel makes up a comparatively small part of the total cost of producing
nuclear electricity. Power plants that burn fossil fuels are more fuel-intensive; producers and consumers
therefore face a much greater risk of increased costs due to higher fuel prices.
Many existing nuclear power plants have already been paid for. Their operating costs are therefore low, and
the electricity produced is among the cheapest in comparison with other sources. Cost projections show
that new power reactors will also be competitive, even assuming low gas prices and heavy subsidies
for wind power.
Many studies have recently been conducted to compare the costs of generating electricity by different
energy sources, including nuclear, which concluded that nuclear is the most cost-effective power source.
The OECD/NEA report, “Projected Costs of Generating Electricity”, underlines the cost advantages,
especially at discount rates of 5% and 10%, that nuclear energy has when it comes to generating electricity.
These advantages are all the more significant when one considers that demand for energy is set to continue
growing steadily across the world. A recent report conducted by the World Nuclear Association (WNA),
“The New Economics of Nuclear Power”, draws the conclusion that in most industrialized countries new
nuclear power plants offer the most economical way to generate electricity. Moreover according to a
study commissioned by the German Federation, postponing plans to shut down nuclear power plants would
reduce Germany’s electricity generating costs and cut greenhouse gas emissions.
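The card's claim that uranium prices barely move the cost of a nuclear kilowatt-hour, while fuel prices dominate fossil generation costs, can be illustrated with a toy levelized-cost calculation. All the inputs below are made-up illustrative assumptions (per kW of capacity), not figures from the OECD/NEA or WNA reports; only the direction of the effect is the point.

```python
def lcoe(capex, fuel, om, output_kwh, years, r):
    """Toy levelized cost per kWh: capital spent up front, constant annual
    fuel and O&M costs, constant output, all discounted at rate r."""
    disc = sum((1 + r) ** -t for t in range(1, years + 1))
    return (capex + (fuel + om) * disc) / (output_kwh * disc)

# Illustrative per-kW inputs: 40-year life, ~90% capacity factor, 5% discount
# rate. Nuclear is capital-heavy with cheap fuel; gas is cheap to build with
# costly fuel. The absolute numbers are assumptions for the sketch.
OUT, YRS, R = 7_900, 40, 0.05
nuc_base = lcoe(4_000, fuel=50, om=70, output_kwh=OUT, years=YRS, r=R)
nuc_2x   = lcoe(4_000, fuel=100, om=70, output_kwh=OUT, years=YRS, r=R)
gas_base = lcoe(800, fuel=300, om=100, output_kwh=OUT, years=YRS, r=R)
gas_2x   = lcoe(800, fuel=600, om=100, output_kwh=OUT, years=YRS, r=R)

nuc_rise = nuc_2x / nuc_base - 1   # roughly +14% with these inputs
gas_rise = gas_2x / gas_base - 1   # roughly +67% with these inputs
```

With these assumed inputs, doubling the fuel price raises the nuclear kilowatt-hour cost only modestly but raises the gas kilowatt-hour cost dramatically, which is the fuel-share asymmetry the card describes.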
OTHER ALTERNATIVE ENERGY FAILS – NUCLEAR POWER IS THE ONLY HOPE – BOTH
SIDES OF THE ISSUE AGREE
DISCOVER 2008 – SCIENCE TECHNOLOGY AND THE FUTURE
IS NUCLEAR ENERGY OUR BEST HOPE, 4-25, http://discovermagazine.com/2008/may/02-is-nuclear-
energy-our-best-hope
Four years ago this month, James Lovelock upset a lot of his fans. Lovelock was revered in the green
movement for developing the Gaia hypothesis, which links everything on earth to a dynamic, organic
whole. Writing in the British newspaper The Independent, Lovelock stated in an op-ed: “We have no time
to experiment with visionary energy sources; civilisation is in imminent danger and has to use
nuclear—the one safe, available energy source—now or suffer the pain soon to be inflicted by our
outraged planet.” Lovelock explained that his decision to endorse nuclear power was motivated by his
fear of the consequences of global warming and by reports of increasing fossil-fuel emissions that drive the
warming. Jesse Ausubel, head of the Program for the Human Environment at Rockefeller University,
recently echoed Lovelock’s sentiment. “As a green, I care intensely about land-sparing, about leaving land
for nature,” he wrote. “To reach the scale at which they would contribute importantly to meeting global
energy demand, renewable sources of energy such as wind, water, and biomass cause serious environmental
harm. Measuring renewables in watts per square meter, nuclear has astronomical advantages over its
competitors.” All of this has led several other prominent environmentalists to publicly favor new nuclear
plants. I had a similar change of heart. For years I opposed nuclear power, but while I was researching my
book Power to Save the World: The Truth About Nuclear Energy, my views completely turned
around. According to the Department of Energy, just to maintain nuclear’s 20 percent share of the energy
supply, the United States would need to add three or four new nuclear power plants a year starting in 2015.
(There are 104 nuclear power plants currently in operation in the United States.) But no new nuclear power
plants have been built here in 30 years, partly because of the public’s aversion to nuclear power after the Three
Mile Island accident in 1979 and the Chernobyl disaster in 1986. Now NRG Energy, based in Princeton, New Jersey, is sticking its
neck out with plans to build two new nuclear reactors at the South Texas Project facility near Bay City. The new reactors will be able
to steadily generate a total of 2,700 megawatts—enough to light up 2 million households. The
United States alone pumped the equivalent of nearly 7 billion tons of carbon dioxide into the atmosphere in 2005. More than 2 billion
tons of that came from electricity generation—not surprising, considering that we burn fossil fuels for 70 percent of our electricity.
About half of all our electricity comes from more than 500 coal-fired plants. Besides contributing to global warming, their pollution
has a serious health impact. Burning coal releases fine particulates that kill 24,000 Americans annually and cause hundreds of
thousands of cases of lung and heart problems. America’s electricity demand is expected to increase by
almost 50 percent by 2030, according to the Department of Energy. Unfortunately, renewable energy
sources, such as the wind and sun, are highly unlikely to meet that need. Wind and solar installations
today supply less than 1 percent of electricity in the United States, do so intermittently, and are decades
away from providing more than a small boost to the electric grid. “To meet the 2005 U.S. electricity
demand of about 4 million megawatt-hours with around-the-clock wind would have required wind farms
covering over 780,000 square kilometers,” Ausubel notes. For context, 780,000 square kilometers (301,000
square miles) is greater than the area of Texas. Solar power fares badly too, in Ausubel’s analysis: “The
amount of energy generated in [one quart] of the core of a nuclear reactor requires [2.5 acres] of solar
cells.” Geothermal power also is decades away from making a significant contribution to America’s
electricity budget.
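Ausubel's land-area comparison above is easy to sanity-check arithmetically. The conversion factor and the Texas area below are our own reference values, not figures from the article:

```python
# Sanity-check the quoted wind-farm land-area comparison.
KM2_TO_MI2 = 0.386102          # square kilometres to square miles
wind_area_km2 = 780_000        # Ausubel's figure, as quoted in the card
texas_area_km2 = 695_662       # commonly cited total area of Texas

wind_area_mi2 = wind_area_km2 * KM2_TO_MI2
# ~301,000 square miles, matching the article's parenthetical, and the
# km2 figure does exceed the area of Texas as the card claims
```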
In its report dated January 10, 2005, the NETF identified the unavailability of
financing as a significant obstacle to new nuclear power plant construction. The
NETF recommended that the US government offer a range of financial incentives for the
construction of the first few reactors, such as: secured loans, loan guarantees, accelerated
depreciation, investment tax credits, production tax credits and government power
purchase agreements.
Currently, the world is led to believe that nuclear power is “evil” and does nothing but
harm society.
By Fareed Zakaria | NEWSWEEK
Apr 21, 2008 Issue
Interviewing Patrick Moore, one of the cofounders of Greenpeace
After six years of the Bush administration pushing for more oil production and new
nuclear power, Democrats want to provide new incentives for energy efficiency and
renewable energy while limiting those for nuclear power.
The United States is failing to invest in and fund nuclear energy technology.
W. J. Nuttall, Judge Institute of Management and Cambridge University Engineering
Department, 2005, “Nuclear Renaissance: Technologies and Policies for the Future of
Nuclear Power”
This book does not argue for a return of the days of the welfare state for nuclear
power. What is observed, however, is that the liberalized markets of North America and
western Europe have, thus far, failed to capture properly all aspects of energy policy and
that these shortcomings have disproportionately harmed new investment in nuclear
generation capacity.
Climate change sceptics sometimes claim that many leading scientists question
climate change. Well, it all depends on what you mean by "many" and "leading". For
instance, in April 2006, 60 "leading scientists" signed a letter urging Canada's new
prime minister to review his country's commitment to the Kyoto protocol. This appears
to be the biggest recent list of sceptics. Yet many, if not most, of the 60
signatories are not actively engaged in studying climate change: some are not
scientists at all and at least 15 are retired. Compare that with the dozens of
statements on climate change from various scientific organisations around the
world representing tens of thousands of scientists, the consensus position
represented by the IPCC reports and the 11,000 signatories to a petition
condemning the Bush administration's stance on climate science. The fact is that
there is an overwhelming consensus in the scientific community about global
warming and its causes. There are some exceptions, but the number of sceptics is
getting smaller rather than growing. Even the position of perhaps the most
respected sceptic, Richard Lindzen of MIT, is not that far off the mainstream: he
does not deny it is happening but thinks future warming will not be nearly as
great as most predict. Of course, just because most scientists think something is
true does not necessarily mean they are right. But the reason they think the way
they do is because of the vast and growing body of evidence. A study in 2004
looked at the abstracts of nearly 1000 scientific papers containing the term "global
climate change" published in the previous decade. Not one rejected the consensus
position. One critic promptly claimed this study was wrong – but later quietly withdrew
the claim.
nothing is done, it will get a whole lot worse The last time the Intergovernmental Panel on Climate Change (IPCC)
assessed the state of the climate, in early 2001, it got a polite enough hearing. The world was warming, it said, and human activity was "likely" to be driving most of
the warming. Back then, the committee specified a better-than-60% chance--not exactly a ringing endorsement. And how bad might things get? That depended on
a 20-year-old guess about how sensitive the climate system might be to rising greenhouse gases. Given the uncertainties, the IPCC report's reception was on the
tepid side. Six years of research later, the heightened confidence is obvious. The warming is "unequivocal." Humans are
"very likely" (higher than 90% likelihood) behind the warming. And the climate
system is "very unlikely" to be so insensitive as to render future warming
inconsequential. This is the way it was supposed to work, according to glaciologist Richard Alley of
Pennsylvania State University in State College, a lead author on this IPCC report. "The governments of the world said
to scientists, 'Here's a few billion dollars--get this right,' " Alley says. "They took the money, and 17 years after the first IPCC report, they got it right. It's still science,
not revealed truth, but the science has gotten better and better and better. We're putting CO2 in the air, and that's changing the climate." With such self-
assurance, this IPCC report may really go somewhere, especially in the newly receptive United States (see sidebar, p. 756), where a small band of scientists has
long contested IPCC reports. Coordinating lead author Gabriele Hegerl of Duke University in Durham, North Carolina, certainly hopes their report hits home this
time. "I want societies to understand that this is a real problem, and it affects the life of my kids." Down to work. Created by the World Meteorological
Organization and the United Nations Environment Programme, the IPCC had the process down for its fourth assessment report. Forty
governments nominated the 150 lead authors and 450 contributing authors of Climate
Change 2007: The Physical Science Basis. There was no clique of senior insiders: 75% of nominated
lead authors were new to that role, and one-third of authors got their final degree
in the past 10 years. Authors had their draft chapters reviewed by all comers. More than 600 volunteered, submitting 30,000 comments.
Authors responded to every comment, and reviewers certified each response. With their final draft of the science in hand, authors gathered in Paris, France, with
government representatives. The observed warming itself was perhaps the most straightforward item of business. For starters, the air is 0.74°C warmer
than in 1906, up from a century's warming of 0.6°C in the last report. "Eleven of the last twelve years rank
among the 12 warmest years in the [150-year-long] instrumental record," notes the
summary (ipcc-wg1.ucar.edu). Warming ocean waters, shrinking mountain glaciers, and
retreating snow cover strengthened the evidence. So the IPCC authors weren't
impressed by the contrarian argument that the warming is just an "urban heat
island effect" driven by increasing amounts of heat-absorbing concrete and asphalt. That effect is real, the report says, but it has "a negligible
influence" on the global number. Likewise, new analyses have largely settled the hullabaloo over why thermometers at Earth's surface measured more warming
than satellite instruments did: corrections to the satellite record have increased the satellite-determined warming, largely reconciling the difference. This confidently observed
warming of the globe can't be anything but mostly human-induced, the IPCC finds. True, modeling studies have shown that
natural forces in the climate system--such as calmer volcanoes and the sun's brightening--have in fact led to
warming in the past, as skeptics point out. And the natural ups and downs of climate have at times warmed the globe. But all of these natural
variations in combination have not warmed the world enough, fast enough, and for long enough in the right geographic patterns to produce the observed warming,
the report finds. In model studies, nothing warms the world as observed except the addition of greenhouse gases in the actual amounts emitted. From studies of
long-past climate, including the famous hockey-stick curve of the past millennium's temperature (Science, 4 August 2006, p. 603), the IPCC concludes that Northern Hemisphere temperatures during the second half "of the 20th century were very likely higher than during any other 50-year period in
the last 500 years," the report concludes, "and likely the highest in at least the
past 1300 years." Contrarians have conceded that greenhouse gases may be warming the planet, but not by much, they say. The climate system
is not sensitive enough to greenhouse gases to overheat the globe, they say. For the first time, the IPCC report directly counters that argument. Several
April 2006, p. 351). The eruption of Mount Pinatubo in 1991 thickened the stratospheric haze layer and cooled climate, providing a gauge of short-term climate
sensitivity. Paleoclimatologists have determined how hard the climate system was driven during long-past events such as the last ice age and how much climate
changed then. And models have converged on a narrower range of climate sensitivity. The IPCC concludes that both models and past climate changes point to a
climate sensitivity that is "very unlikely" to be "less than 1.5°C," says the report, not the less than 0.5°C favored by some contrarians. A best
estimate is about 3°C, with a likely range of 2°C to 4.5°C. What next? Looking ahead, the report projects a warming of about 0.4°C for the
next 2 decades. That is about as rapid as the warming of the past 15 years, but 50% faster than the warming of the past 50 years. By the end of
Nuclear power can play a significant role in preventing catastrophic global warming,
maintain William C. Sailor and Bob van der Zwaan, visiting science fellows at the Center
for International Security and Cooperation, Stanford (Calif.) University. They are
affiliated with Nuclear Power Issues and Choices for the 21st Century, a CISAC project
investigating whether nuclear energy has a legitimate role in preventing global
warming."Mankind is facing a tremendous challenge with global climate change. In the
coming two decades, we have to consider new energy sources, including nuclear,"
indicates Van der Zwaan, on leave from the Free University of the Netherlands, though he
admits that widespread public concern has led several countries to halt development of
nuclear energy. "Eighty-five percent of all Dutch people are opposed to it," he notes, and the numbers are similar in other European countries.
Most of the world's energy is derived from fossil fuels like coal, oil, and natural gas. Only
about six percent comes from nuclear power plants. However, burning fossil fuels emits
large amounts of carbon dioxide (CO2) and other gases that trap infrared radiation
from the sun. As a result, say many climatologists, the atmosphere is heating up like the
inside of a greenhouse, and unless the rate of CO2 emissions is reduced, the
temperature of the Earth will increase by as much as 6°F in the 21st century.
Such global warming, according to worst-case scenarios, will cause disastrous floods,
droughts, and erratic changes in ocean currents, and even will spread tropical diseases
and parasites throughout the planet. Advocates say that nuclear power can help prevent
global warming because reactors produce virtually no greenhouse gases. They point to France, where
about 60 nuclear power plants provide three-fourths of the country's electricity. Critics argue that nuclear power is inherently dangerous and prohibitively
expensive. They point out that accidents like the 1986 Chernobyl power plant disaster in the former Soviet Union can result in radiation poisoning that
lasts many generations. Opponents also maintain that safely storing radioactive waste is difficult and that newly designed breeder reactors could make it
easier for plutonium fuel to get into the hands of terrorists and others eager to build small-scale nuclear weapons. Van der Zwaan and Sailor point to
recent studies showing that, to prevent dangerous climate change from occurring in the next 50 years, CO2 emissions must remain at
current levels--despite a projected 50% population increase by the year 2050 that could double or triple world demand for energy. "Lacking a
crystal ball that tells us the future, we simply select one possible scenario that achieves
the emissions target." Their scenario envisions a world in which one-third of all energy
comes from fossil fuels; one-third from renewable resources, like solar and wind power
and one-third from nuclear power. To achieve that ambitious goal, all the nations of the
world would have to consume less oil, coal, and natural gas than they do today, while
increasing renewable and nuclear energy sources at least tenfold. To accomplish that will require increasing
the number of nuclear reactors from about 430 to roughly 4,000, which means that more than one nuclear reactor would have to be built every week for
the next 50 years. "That would require a massive industrial effort," Van der Zwaan concedes, costing trillions of dollars, but he believes that
developed nations like the U.S. can achieve this objective if there is strong popular
support. (According to the Department of Energy, the U.S. has 104 nuclear reactors in operation today. Twenty-eight have been shut down
permanently since 1953, and there are no plans to build new ones.) Sailor, who is on a one-year sabbatical from the Los Alamos (N.M.) National
Laboratory and holds a doctorate in nuclear engineering, argues that renewable forms of energy such as hydro, wind, and solar power are fraught with
technical or environmental problems that make them unlikely substitutes. "Once it's realized that we cannot make ends
meet without nuclear energy, there is a chance that public opinion will turn greatly so that
nuclear power will once again be acceptable."
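The build-rate claim in the card above follows directly from the reactor counts and time span it states, and checks out arithmetically:

```python
# Check "more than one nuclear reactor would have to be built every week
# for the next 50 years", using the card's own figures (430 -> ~4,000).
current_reactors = 430
target_reactors = 4_000
years = 50

per_year = (target_reactors - current_reactors) / years   # 71.4 per year
per_week = per_year / 52                                  # ~1.4 per week
```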
In May 2004 James Lovelock, originator of the Gaian (earth systems) hypothesis, stirred media
interest when he reiterated his support for nuclear power (NP) as part of the solution to the
overwhelming threat that humanity (and the planet) is facing from global warming. Since then the
nuclear industry has been lobbying hard to restart its failing programme by presenting it
as the answer to global warming.
James Lovelock knows better than any of us that the solution to global warming will
involve complex changes involving everything from finance to forestry and gigawatts to goat management, interacting
together in a huge system change. Above all, it will involve a shift in our perception of the world.
Literally hundreds of new technologies will be rolled out, primarily in energy
conservation, energy efficiency, and many modes of renewable energy technology.
The key to all this, as James taught us, is that Gaia moves in cycles that interact in mutually complementary ways, sometimes
facilitating each other and sometimes inhibiting each other. We must leave behind our old ways of thinking in isolated, linear, cause
and effect modules, and learn to think in the way that nature moves, in interrelated web-like systems.
The paradox is that nuclear power is an outstanding example of linear thinking. You dig out
your uranium, you burn it, and you bury it (or fire it off into the sun or something, whatever). From a systems point of
view, the main thing to bear in mind is that you must try to cause as few cancers as you
can reasonably get away with, which means isolating the nuclear cycle as best you can
from the rest of nature; (and of course, you have to make sure that nobody with brown skin gets hold of nuclear power,
because they might develop nuclear weapons from it, and give them to Osama bin Laden.).
When I put this systems argument to James Lovelock, his only response was that nuclear fission reactions have occurred in nature.
This is true; asteroid hits are also a part of nature, but this does not mean that we should contemplate attracting asteroid hits in
an effort to extract energy from them. His response is not a valid defence of his position, and the systems argument against nuclear
power still stands.
James recognises that nuclear power is a risky business, but says that we must use it,
because if we continue to use coal, oil and gas, it is certain that global warming will cause
immense damage to planet and people.
We must address the question raised by an environmentalist of the stature of James Lovelock. Should we accept nuclear power, despite
its dangers and drawbacks, as a necessary instrument in the battle against global warming?
56
Nuclear Energy Affirmative
Understand this: For the past quarter century, nuclear energy has been the nation's most
important source of clean power for avoiding airborne emissions that result from burning
oil, natural gas and coal.
According to a new study by Washington-based Energy Resources International, nuclear
energy - by substituting for fossil-fuel power plants - has prevented 219 million tons of
sulfur dioxide and 98 million tons of nitrogen oxides from being discharged into the
atmosphere since 1973. Emission-free nuclear energy also has avoided the release of
more than 2 billion tons of carbon dioxide, a major greenhouse gas linked to global
warming.
To summarize, human activity is causing the Earth to warm. Bacteria convert carbon in the soil into greenhouse
gases, and enormous quantities are trapped in unstable clathrates. As the earth continues to warm, permafrost clathrates will thaw; peat and soil microbial activity will
dramatically increase; and, finally, vast oceanic clathrates will melt. This global warming chain reaction has happened in the past.
Atmospheric concentrations of CO2 rose by a record amount over the past year. It is the
third successive year in which they have increased sharply. Scientists are at a loss to
explain why the rapid rise has taken place, but fear the trend could be the first sign of
runaway global warming.
Runaway Global Warming promises to literally burn up agricultural areas into dust
worldwide by 2012, causing global famine, anarchy, diseases, and war on a global scale
as military powers including the U.S., Russia, and China, fight for control of the Earth's
remaining resources.
Over 4.5 billion people could die from Global Warming related causes by 2012, as planet
Earth accelerates into a greed-driven horrific catastrophe.
The IPCC Working Group III Subgroup on Agriculture, Forestry and Other Systems
(AFOS) report concludes: The anticipated rise in global average temperature of about 2 to
3 °C over the next century will most likely lead to severe impacts on agriculture and
forestry such as: a shift of the climatic zones by several hundred kilometres towards the
poles, enlarging the arid zones in the tropical and subtropical regions and reducing the
land available for agriculture; a rise in sea level of about 0.3 metres, inundating valuable land in
coastal areas, especially in tropical and subtropical zones; a gradual breakdown of many ecosystems like
forests in temperate and boreal regions, leading to additional CO2 emissions and thus to further greenhouse
warming; and potentially increased effects from pests and weeds.
Marine and land food species may also be affected by the increasing levels of ultraviolet
radiation reaching the earth as a result of unavoidable ongoing depletion of stratospheric
ozone. This could lead to a reduced production of biomass and photosynthesis, thus again
enhancing the CO2 content of the atmosphere.
The Group concludes that "it is likely to be [an] enormously difficult task for mankind, not
only to limit climate change to a tolerable level, but also to simultaneously achieve
sufficient food production for a still rising world population..." (K. Heinloth (Physikalisches Institut
der Universität Bonn) & R.P. Karimanzira, "Outcomes and policy recommendations from the IPCC/AFOS working group on climate
change response strategies and emission reductions", Climatic Change, v.27(1), p. 139-146, May 1994).
Eminent US scientists, Henry Kendall and David Pimentel, agree with the conclusions of
the IPCC workshop. In modelling food supply requirements for various population levels,
they conclude that global warming and ozone depletion may have catastrophic effects on
global food production. While most countries were food self-sufficient in the early 1960s, few remain
so. The increasing reliance on fertilisers, pesticides and irrigation, increasing spread of soil erosion, ground
and surface water pollution, salinisation, and rapid degradation of productive land has contributed to
significantly reduced food production. In Africa, per capita grain production has decreased by 22 percent
since 1967. Simultaneously, global population is projected to double in 40 years,
necessitating a tripling of current food production to maintain all peoples above the
poverty line. Water is considered the major limiting factor, but the problems associated
with irrigation suggest that this is not the answer. Their study finds that while global
warming may benefit some crops, it may also benefit pests, insects and weeds.
So, to repeat, the food bubble is now starting to implode. What does it all mean? It
means that as these economic and climate realities unfold, our world is facing massive
starvation and food shortages. The first place this will be felt is in poor developing
nations. It is there that people live on the edge of economic livelihood, where even a 20%
rise in the price of basic food staples can put desperately-needed calories out of reach of
tens of millions of families. If something is not done to rescue these people from their
plight, they will starve to death.
Wealthy nations like America, Canada, the U.K., and others will be able to absorb the price increases, so you won't see mass starvation
in North America any time soon (unless, of course, all the honeybees die, in which case prepare to start chewing your shoelaces...), but
it will lead to significant increases in the cost of living, annoying consumers and reducing the amount of money available for other
purchases (like vacations, cars, fuel, etc.). That, of course, will put downward pressure on the national economy.
But what we're seeing right now, folks, is just a small foreshadowing of events to come in
the next couple of decades. Think about it: If these minor climate changes and foolish
biofuels policies are already unleashing alarming rises in food prices, just imagine what
we'll see when Peak Oil kicks in and global oil supplies really start to dwindle. When
gasoline is $10 a gallon in the U.S., how expensive will food be around the world? The
answer, of course, is that it will be triple or quadruple the current price. And that means
many more people will starve.
Fossil fuels, of course, aren't the only limiting factor threatening future food supplies on
our planet: There's also fossil water. That's water from underground aquifers that's being pumped up to the surface to water crops,
then it's lost to evaporation. Countries like India and China are depending heavily on fossil water to irrigate their crops, and not
surprisingly, the water levels in those aquifers are dropping steadily. In a few more years (as little as five years in
some cases), that water will simply run dry, and the crops that were once irrigated to feed
a nation will dry up and turn to dust. Mass starvation will only take a few months to kick
in. Think North Korea after a season of floods. Perhaps 95% of humanity is just one crop
season away from mass starvation.
Philip Sutton from Greenleap and David Spratt from Carbon Equity argue that “human
activity has already pushed the planet’s climate past several critical ‘tipping points’,
including the initiation of major ice sheet loss”.
They quote US climate scientist James Hansen who warned in 2007 that the loss of 8 million square kilometres of Arctic sea ice now
seems inevitable, and may occur as early as 2010 — a century ahead of the Intergovernmental Panel on Climate Change projections.
“There is already enough carbon dioxide in the Earth’s atmosphere to initiate ice sheet
disintegration in West Antarctica and Greenland and to ensure that sea levels will rise
metres in coming decades”, the report’s authors say.
“The projected speed of change, with temperature increases greater than 0.3°C per decade
and the consequent rapid shifting of climatic zones will, if maintained, likely result in
most ecosystems failing to adapt, causing the extinction of many animal and plant
species. The oceans will become more acidic, endangering much marine life.
“The Earth’s passage into an era of dangerous climate change accelerates as each of these
tipping points is passed. If this acceleration becomes too great, humanity will no longer
have the power to reverse the processes we have set in motion.”
The authors conclude that we can avert this potential disaster, but warn that the science demands that “politics as usual” be rejected.
“The climate crisis will not respond to incremental modification of the business as usual model.”
“The sustainability emergency is now not so much a radical idea as simply an
indispensable course of action if we are to return to a safe-climate planet”, the authors
conclude.
Cam Walker, spokesperson from FoE, used the report’s launch on February 4 to call on the government to urgently review the role of
the Garnaut Climate Change Review which is to make recommendations on carbon emission targets.
Walker criticised the terms of reference for Ross Garnaut, and the government’s policy of a 60% cut in emissions by 2050, saying that
global warming of 3°C would lead to disaster.
“The government is potentially allowing Garnaut to engage in dangerous trade-offs with the lives of many species and many people
rather than setting a safe-climate target”, he said.
Walker said the government is behind the times on climate science and urged it to bring James Hansen, head of the US NASA
Goddard Institute for Space Studies, and that country’s most eminent climate scientist, into the review process “so that the science
was put first rather than last in making climate policy”.
Walker said that Hansen warned in December that climate tipping points have already
been passed for large ice sheet disintegration and species loss, and there is already
enough carbon in the Earth’s atmosphere for massive ice sheets such as on Greenland to
eventually melt away.
“These impacts are starting to happen at less than one degree of warming, yet the government is effectively planning on allowing
warming to run to 3 degrees”, said Walker.
Increased carbon dioxide, apart from affecting climate, will have serious toxic effects on humans and other
mammals. Higher carbon dioxide concentration affects health by reducing blood pH,
causing difficulty in breathing, rapid pulse rate, headache, hearing loss, sweating and
fatigue. Some studies have also shown possibilities of embryonic or foetal abnormalities
due to increase in atmospheric carbon dioxide.
A study on health effects of high indoor carbon dioxide concentrations has established
that at 600 ppm, occupants felt stuffy, and above this level, symptoms of poisoning
started to show. At 1,000 ppm, nearly all the occupants were affected.
All these effects were observed with only a transient exposure and not over a lifetime. On average, carbon dioxide levels in offices reach 800-1,200 ppm and up to 2,000 ppm in
some spaces. When the outdoor concentration reaches 600 ppm, the Earth will have a permanent atmosphere
exactly like that of a stuffy room, which life may not adapt to.
The deaths were due to lung and heart ailments linked to ozone and polluting particles in
the air, which are spurred by carbon dioxide that comes from human activities, according
to the study's author, Mark Jacobson of Stanford University.
As the planet warms due to carbon dioxide emissions, the annual death rate is forecast to
climb, with premature deaths in the United States from human-generated carbon dioxide
expected to hit 1,000 a year when the global temperature has risen by 1.8 degrees F (1
degree C).
When the planet gets that hot, which could happen this century, the world annual death rate is
estimated to rise to 21,600, Jacobson said on Friday in a telephone interview.
Earth has warmed about 1.4 degrees F (0.8 degrees C) in the last 150 years, with most of that gain in the last three decades.
Jacobson said about 700 to 800 US annual deaths in the most recent years can be
attributed to human-caused carbon emissions.
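Jacobson's figures as reported here are internally consistent, which a one-line calculation shows. The linear death-rate-to-warming relationship is our simplifying assumption for the check, not a claim from the study itself:

```python
# Consistency check of the mortality figures quoted above: if ~1,000
# annual U.S. deaths correspond to 1 degree C of warming, and the
# relationship is roughly linear (our assumption), then the ~0.8 C of
# warming already observed implies about 800 deaths per year, matching
# the 700-800 range attributed to recent years.
deaths_per_degree_us = 1000      # annual U.S. deaths at 1 C of warming
observed_warming_c = 0.8         # warming over the last 150 years
estimated_deaths = deaths_per_degree_us * observed_warming_c
print(estimated_deaths)          # 800.0
```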
Greenhouse gas pollution has spurred the global warming that is resulting in a damaging rise in the sea level, droughts and possibly more
severe storms this century. This is the first time a scientist has specifically linked one human-
generated greenhouse gas to human mortality.
Carbon dioxide is one of several greenhouse gases blamed for climate change, but it is the one
humans have the most ability to control through regulation of activities that burn fossil
fuels like coal and oil. It is also emitted by natural processes.
Using a complex computer model and data on carbon emissions from the US
Environmental Protection Agency, Jacobson found the impact was worse in places that
are populous and polluted.
"Of the additional ... deaths per year due to ozone and particles ... about 30 percent of
those occurred in California, which has 12 percent of the (US) population," he said,
noting that California has six of the 10 most polluted US cities.
"So it was pretty clear ... that climate change was affecting Californians' health disproportionately to its population," Jacobson said.
What happens in California is important, since this populous state has long been a testing
ground for US pollution regulation.
Immediate action to change the current main source of energy is essential to avoid
a collapse of the economy due to the inevitable oil peak.
Landry 2007
(March 30 2007, Cathy, of the American Petroleum Institute, “GAO warns of peak oil threat to
global economies”, pg LEXIS)
World oil production will peak sometime between now and 2040, the US Government Accountability
Office said March 29, cautioning that if the phenomenon occurs "soon" and "without warning," it could
cause oil prices to surge to unprecedented levels and result in "severe" economic damage. "The
prospect of a peak in oil production presents problems of global proportions whose consequences
will depend critically on our preparedness," GAO, the nonpartisan investigative arm of Congress, said in a report.
"While these consequences would be felt globally, the United States, as the largest consumer of oil
and one of the nations most heavily dependent on oil for transportation, may be especially
vulnerable among the industrialized nations of the world." Despite the threat of peak oil, the US government
currently has no "coordinated or well-defined strategy" to address the uncertainties about the
timing of peak oil or to mitigate its potential effects. For that reason, GAO recommended that
the federal government take immediate action, and suggested that the US energy secretary take
the lead in coordinating a government strategy.
The average price of gasoline in the U.S. hit $4 a gallon for the first time Sunday, the latest milestone in a run-up in fuel prices
that is sapping consumer confidence and threatening to nudge the nation into recession. The record nationwide average for
regular-gasoline prices, announced by auto club AAA, follows Friday's near-$11 surge in oil prices to a record $138.54 a barrel.
Both are part of what, by some measures, is the worst energy-price shock Americans have faced for a generation, in terms of its
toll on their pocketbooks.

In recent days, soaring fuel prices and disappointing employment data have reignited fears that the
nation's economy -- which has taken a pounding over the past year from a housing downturn, credit crunch and weakening job
market -- will slip into recession, or pull back further if a recession is already under way. Rising fuel prices are straining
household budgets, damping the spending that drives more than two-thirds of the nation's economic activity. "What we're seeing
here is a lot of additional pressure on a consumer sector that was soft to begin with," said Alliance Bernstein economist Joseph
Carson. "Is it a tipping point by itself? It's close."

Gasoline prices, which have risen 29% over the past year, have been high for
months, and in some markets, such as Alaska and California, consumers have been paying more than $4 a gallon at the pump for
weeks. But the latest increase at the nationwide level from a previous average of nearly $3.99 a gallon seems likely to deliver at
least a psychological blow to many Americans. The current drain on consumers' income from rising fuel prices is greater than it
was during most of the worst energy-price run-ups of the past. Spending on fuel as a share of wage income has shot above 6%.
That exceeds the percentage seen during the 1974-75 and 1990-91 oil-price shocks and approaches the 7% to 8% seen during the
1980-81 price surge, according to Mr. Carson. Comparing the rise in fuel spending to income growth, which has been especially
weak in recent years, the current shock is far worse than any of the three prior ones, he said.

"It's just gotten out of hand," said 53-year-old Yvonne Brune of Des Moines, Iowa, referring to the rising cost of gasoline. Because of higher gasoline prices, Ms.
Brune, who works for a printing company doing marketing on weekdays and separately as a bridal consultant on nights and
weekends, no longer makes the drive home at lunchtime -- a 30-mile round trip -- to spend time with her dogs. Because of rising
airfares, she has canceled plans for a trip to Texas to visit relatives. "I think the airlines are going to see their industry implode
because people are going to stop flying," she said.

Some economists hold out hope the current oil-price surge won't be as
devastating as some in the past. For one thing, consumers and businesses are far more fuel-efficient today than they were during
the oil shock of the mid-1970s, requiring half as much energy to produce a unit of economic output. Interest rates also are far
lower than they were then, and the Federal Reserve is expected to hold its interest-rate target steady at 2% for much of this year.
The dollar's weakness, meanwhile, is raising overseas demand for American products, and growth in exports is a key reason why
the U.S. economy has continued to expand -- albeit slowly -- over the past six months. Most important, consumers have shown
surprising resilience over the past five years, despite continued surges in their fuel costs. "While it certainly makes it tougher for
the economy for the next few quarters, I still believe consumers can adapt," said Peter Kretzmer, a Bank of America
economist.

Still, as gasoline prices climb, they eat up money that consumers might otherwise spend on appliances or movie tickets
or vacations. That could force businesses, hit by weaker consumer demand and an increase in their own costs, to pare operations
and cut more jobs in an already weak labor market. The government reported Friday that the unemployment rate jumped to 5.5%
in May from 5% in April as employers shed 49,000 jobs last month -- a fifth-straight monthly decline.
Global demand for energy is at an all-time high. Oil can’t handle it, and politicians need
to make people less dependent on oil.
Dallas Morning News 2008
(June 22 2008, “Energy crisis turns globalism to localism”, pg online @
http://www2.ljworld.com/news/2008/jun/22/energy_crisis_turns_globalism_localism/)
Cheap, abundant and accessible fossil fuels allowed us to create a world in which we are relatively unconstrained
by geography. That era is passing into history, and it is not likely this process can be reversed. There is
simply not enough oil being extracted quickly or inexpensively enough to meet global demand –
nor, in all likelihood, will there be again. This is called peak oil. Last week, economic analysts said Americans
have never before spent a greater portion of their income on energy costs. The sooner we come
to terms with this reality, the sooner we can begin taking serious steps to adapt . By this fall, chances are
John McCain and Barack Obama will be talking more about energy than any other issue. They'll have to. That would be a real
change from now. Peak oil is a far more urgent crisis than climate change, yet its economic and social effects are not even on the
candidates' agendas. Every petroleum-dependent aspect of our economy, from the far-flung distribution systems
for consumer goods to the daily commute, will be difficult to sustain. The only question is how soon it will
happen and how traumatic the transition will be. National, state and local politicians would be smart to
approach it with a series of policy proposals based on the concept of relocalization. It's the idea that in a
world of costly energy, most economic and social activity will, of necessity, be local. A
comprehensive domestic energy policy should be geared toward helping regions, cities and
neighborhoods depend as little as possible on petroleum.
More Evidence
Discover 2008
(April 25 2008, “Is Nuclear Energy Our Best Hope?”, pg online @
http://discovermagazine.com/2008/may/02-is-nuclear-energy-our-best-hope)
America’s electricity demand is expected to increase by almost 50 percent by 2030, according to
the Department of Energy. Unfortunately, renewable energy sources, such as the wind and sun,
are highly unlikely to meet that need. Wind and solar installations today supply less than 1
percent of electricity in the United States, do so intermittently, and are decades away from
providing more than a small boost to the electric grid. “To meet the 2005 U.S. electricity demand
of about 4 million megawatt-hours with around-the-clock wind would have required wind farms
covering over 780,000 square kilometers,” Ausubel notes. For context, 780,000 square kilometers (301,000 square
miles) is greater than the area of Texas. Solar power fares badly too, in Ausubel’s analysis: “The amount
of energy generated in [one quart] of the core of a nuclear reactor requires [2.5 acres] of solar
cells.” Geothermal power also is decades away from making a significant contribution to
America’s electricity budget.
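Ausubel's land-area claim can be sanity-checked with rough arithmetic. Both inputs below are our assumptions, not figures from the article: annual U.S. demand of roughly 4 billion MWh (the quoted "4 million megawatt-hours" appears to understate by a factor of a thousand), and an effective areal power density for spaced wind farms of about 0.6 W/m², at the low end of published estimates:

```python
# Back-of-envelope check of the wind-farm area figure (inputs are our
# assumptions): convert annual demand to an average power draw, then
# divide by an assumed areal power density for spaced turbines.
HOURS_PER_YEAR = 8760
demand_mwh = 4e9                                   # ~4 billion MWh/year (assumed)
avg_power_w = demand_mwh * 1e6 / HOURS_PER_YEAR    # average draw in watts
power_density_w_per_m2 = 0.6                       # assumed, low end of estimates
area_m2 = avg_power_w / power_density_w_per_m2
area_km2 = area_m2 / 1e6
print(round(area_km2))                             # on the order of 760,000 km^2
```

The result lands near the article's 780,000 km², so the claim is at least the right order of magnitude under these assumptions.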
"Electricity is the key fabric of the economy," said Dan Rastler, a technical leader with
the Electric Power Research Institute (EPRI), a nonprofit energy research consortium that promotes
science and technology. "There's a real need to get the industry as well as stakeholders on
track."
Deliberate attacks on grid infrastructure can cripple nations' economies and undermine
their stability.
The grid became a frequent victim of war in Chechnya, where Chechen rebels and Russian troops have
fought off and on since the mid-1990s.
In Iraq, guerrillas continue to attack power lines and towers in an effort to impede recovery and foster
unrest. The grid is often cited as a vulnerable target for terrorism in the United States and in other
developed nations, particularly after the Sept. 11, 2001 attacks in New York City, Washington, D.C., and
Pennsylvania.
Garden-variety outages from storms and other causes sap $119 billion from the U.S.
economy every year, according to an analysis by the EPRI. The nation lost between $4
billion and $10 billion when a blackout shut down parts of the East and Midwest last
August.
Canada, which also went dark in the cascading outage, estimated that its gross domestic product declined
0.7 percent that month.
Most energy experts agree that making the grid less vulnerable to intentional and natural
assaults, and more resilient when such assaults do occur, is critical. They see wholesale
change as prohibitively expensive, risky and impractical.
"We're not going to rip out the entire infrastructure," said John Del Monaco, manager of emerging
technologies and transfer at Public Service Electric & Gas (PSE&G) in New Jersey.
PSE&G initiated a program to use MEMS-based acoustic sensors to monitor transformers, and is
developing similar technologies for cables and power lines. "You overlay on top of what you already have,"
said Del Monaco.
New technologies aren't enough on their own; they need to complement and be
compatible with both the existing grid and the grid of the future, said T.J. Glauthier, president
and chief executive of the Electricity Innovation Institute (E2I).
An affiliate of EPRI, E2I is charged with orchestrating the coordinated integration of next
generation technologies. This year it offered $500,000 in grants to researchers developing
nanotechnologies for electric power systems.
"What we need to really have is functionality, but we need to apply it in an evolutionary way," Glauthier
said. "We need to find companies that will be able to replace and upgrade where there is the most
congestion and demand. We're looking for ways to help ease that burden."
Fixing the grid from within would likely require giving it nerves in the form of remote sensors that track its
health, a network for collecting and distributing the data and a brain for interpreting and perhaps even
acting on the information. But making such a "smart grid" would require engineers to design
around high temperatures, strong electromagnetic forces and other difficult conditions.
About four years ago, PSE&G technology consultant Harry Roman and colleagues at the
New Jersey Institute of Technology decided to tackle the first challenge: the nerves. They
proposed developing a MEMS acoustic sensor to monitor transformers, using sound
rather than electrical signals to inspect the innards of the transformer.
In theory, sensors would track the telltale sounds of sparks that are emitted when the insulating oil within
the transformer wears down or becomes contaminated. Early detection could allow utilities to avoid power
failures or costly fires.
Developing the sensor hardware proved to be the easier part of the equation, Roman said. Once the project
was underway, he discovered that the oil's temperature affected the sound of arcing. The team had to
develop software that accounted for that relationship before it could get an accurate read on the
transformer's inner workings.
The sensors have progressed from lab-based tests to a mockup placed on a pole-mounted
transformer, to this year's challenge: several months of trials in a small oil tank.
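The detection logic described above can be sketched in a few lines. This is a hypothetical illustration of the approach, not PSE&G's actual algorithm; the threshold and the linear temperature correction are invented for the example:

```python
# Hypothetical sketch of temperature-compensated acoustic arc detection
# in a transformer. Sound from arcing changes with the insulating oil's
# temperature, so the raw level is scaled by a simple linear correction
# anchored at a 40 C reference (assumed form and coefficients).

def arc_score(acoustic_db: float, oil_temp_c: float) -> float:
    """Return a temperature-compensated acoustic level."""
    correction = 1.0 + 0.004 * (oil_temp_c - 40.0)
    return acoustic_db * correction

def is_arcing(acoustic_db: float, oil_temp_c: float,
              threshold_db: float = 65.0) -> bool:
    """True if the compensated level exceeds an alarm threshold."""
    return arc_score(acoustic_db, oil_temp_c) > threshold_db

# A quiet transformer at the reference temperature should not alarm;
# a loud reading in hot oil should.
print(is_arcing(50.0, 40.0))   # False
print(is_arcing(70.0, 80.0))   # True
```

Early detection along these lines is what lets a utility intervene before insulation failure causes an outage or fire.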
Roman said "realistic implementation" is about two to four years away. In the meantime, he is
developing similar sensors for gauging the motion of underground cables to detect mechanical stresses, and
temperature sensors to monitor transmission lines.
Roman and Del Monaco emphasized that gathering data from sensors alone won't make the grid more
robust. Knowing how to analyze information to detect and then deflect problems would lead to improved
reliability, they said.
"This is outage management," Roman said. "Our whole philosophy has been to be more proactive. (Sept.
11) also prompted us to think about security. How do we use these microsensors for security?"
PSE&G may be ahead of the curve. Roger Anderson, an advocate of a Web-enabled smart grid, said the
energy industry as a whole shies away from new technologies until it has little choice but
to adapt.
The 2001 terrorist attacks and last year's massive outage jolted the industry, but didn't
prompt any revolutionary change.
The GridWise chips developed at Pacific Northwest National Laboratory (PNNL) combine the lab's expertise in microsystems with its mission to provide clean
and energy-efficient technologies to the nation. The chips detect when the grid is
becoming overloaded, for instance, when it is being taxed by air-conditioning demands
on a hot and humid day.
The chips temporarily shut down air conditioners or other appliances until the grid has
recovered. At most, temporary brownouts inconvenience homeowners. But similar
outages at energy-reliant high-tech facilities such as computer chip-making plants can
prove ruinous.
"The bottom line is, we can't protect it (the grid) because it is so diverse," said Robert Pratt, a
staff scientist at PNNL and program manager for GridWise. "We need resiliency. We need the flexibility
to make sure it doesn't turn into a blackout."
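The shed-and-restore behaviour described above amounts to a small control loop. The sketch below is illustrative only: the grid-frequency signal and thresholds are our assumptions, not the actual GridWise design:

```python
# Minimal sketch of appliance-level demand response: shed a deferrable
# load when grid frequency sags (a common overload symptom) and restore
# it once the grid recovers. The hysteresis band between the two
# thresholds prevents rapid on/off cycling. All values are assumed.

NOMINAL_HZ = 60.0
SHED_BELOW_HZ = 59.95     # assumed under-frequency trip point
RESTORE_ABOVE_HZ = 59.99  # assumed recovery point (hysteresis)

def control_appliance(running: bool, grid_hz: float) -> bool:
    """Return the appliance's next on/off state given grid frequency."""
    if running and grid_hz < SHED_BELOW_HZ:
        return False          # grid stressed: shed this load
    if not running and grid_hz > RESTORE_ABOVE_HZ:
        return True           # grid recovered: restore it
    return running            # otherwise hold state inside the band

state = True
for hz in (60.0, 59.93, 59.96, 60.0):
    state = control_appliance(state, hz)
    print(hz, state)
```

One appliance shedding a few hundred watts is negligible; the design only works in aggregate, which is exactly Pratt's point below.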
Pratt said the incentive for consumers would be in cost savings more than concerns about
grid reliability. He envisions consumers installing GridWise into appliances, or buying appliances
already wired with GridWise, and enrolling in utility programs that then give them cheaper rates.
Their individual energy conservation would be small, but "it's the aggregate that makes it great," Pratt said.
EPRI's Rastler takes working outside the grid even further. The technical leader for its distributed energy
resources program, he is looking at technologies such as stationary fuel cells that can provide alternative
energy to consumers and thus ease the burden placed on the grid.
His program also explores the feasibility of renewables such as solar cells. Both will
likely benefit from nanotechnologies being honed in companies and research labs.
"Several of the electric companies are interested in seeing whether these technologies can
be part of the toolbox," Rastler said. "There's been a lot of hope, and a lot of over promise."
Change is coming to the grid, even if its engineering remains unchanged, according to
Anderson. An oceanographer for 20 years, he recognizes in the grid the same kind of
dynamic interplay of forces that make complex systems like the climate so difficult to
predict.
His tracking of blackouts in the U.S. over several decades shows a recent shift toward
instability, with the frequency and magnitude of blackouts on the rise. The five-year trend
serves as a warning that another multi-state meltdown like last August's could occur
unless the grid is healed.
"it scares us," he said, "like the way the global warming people are scared."
As NERC warned a decade ago, the transmission system was not designed to handle
rapidly-changing bulk, so-called "economy" power transfers. On the three-year
anniversary of the "Great 2003 Blackout," NERC vice president Donald Cook explained, "There's
no question that the grid is being used now in ways for which it wasn't really designed. It
was built to connect neighbor to neighbor, over the last several decades. It was not
designed to move large blocks of power from one region to another. "
The Federally built Tennessee Valley Authority system is illustrative. TVA built, owns, and operates 17,000
miles of transmission lines, to service its customers over an area including all or parts of seven
Southeastern states. FERC has been trying to force the TVA to join a Federally regulated Regional
Transmission Organization, which would require it to cede control of its transmission grid, and force it to
build new transmission capacity (for which its customers would have to pay), not to service its own
ratepayers, but to allow "economy" wheeling over its wires. So far, the TVA has refused.
It is often stated that the solution to this transmission congestion is to build new power lines. But while
more transmission capacity is certainly needed, that in itself, will not solve the problem.
Blackout Blowback
Following the August 2003 blackout, which left 50 million people from the Midwest to
the East Coast in the dark, multiple Congressional hearings and a Federal investigation
were conducted to examine the problem and propose solutions. The Department of
Energy was tasked with identifying the cause. Its final report blamed everything possible
—including operators and fallen trees—except deregulation.
But the Congress mandated that the Department produce a report, the National Electric
Transmission Congestion Study, which it released in August 2006. The report duly noted what
everyone already knew—that areas of Critical Congestion included the New York City and Connecticut
service areas, with Congestion Areas of Concern all the way from New York through Northern Virginia.
The Los Angeles area was noted as a Critical Congestion area, with parts of the West Coast, from Seattle to
San Diego, in the Areas of Concern category. But it is not in these regions that profit-conscious, and even
foreign-owned companies, are proposing to build new power lines, or the new local generating plants that
would obviate the need for long-distance transmission lines. Why?
Under the no-holds-barred market of deregulation, this "elsewhere" has moved further and further away
from the large cities, with their large power requirements, to areas of the country where power can be
produced more cheaply, and new plants can be built with the minimum amount of local political opposition
and legal interference.
For example, PJM is a regional transmission interconnection, which coordinates the operation of the
transmission grid that now includes Delaware, Indiana, Illinois, Kentucky, Maryland, Michigan, New
Jersey, North Carolina, Ohio, Pennsylvania, Tennessee, Virginia, West Virginia, and the District of
Columbia. It oversees 56,070 miles of transmission lines, and plans regional transmission expansion to
maintain grid reliability and relieve congestion.
In March, PJM identified transmission constraints in its region, which were standing in the way of
"bringing resources to a broader market." PJM identified two transmission paths requiring significant
investment: a high-voltage line from the coal fields of West Virginia to Baltimore and Washington, D.C.
and another, extending from West Virginia to Philadelphia, New Jersey, and Delaware. However, these
lines, hundreds of miles long, would not be necessary, if the mandate existed to build new nuclear plants
where the capacity would be near the load centers.
While Virginia and Maryland utilities are considering such new builds, most of the nuclear power plants
that are under consideration by utilities are in the semi-rural Southeast, where there is political support for
new plants, and building more high-voltage transmission lines to carry the power is unlikely to be held up
for 15 years by "environmental" court challenges. Some of that new nuclear-generated power from the
Southeast will be used locally, for growing demand, and some will be wheeled to the energy-short regions
of the mid-Atlantic and Northeast, which refuse to build their own capacity. Companies that have been
buying up transmission capacity will make a bundle, in the process.
Underinvestment in new transmission capacity overall has left the grid system vulnerable to
even small instabilities. The industry estimates that $100 billion is needed in new
transmission capacity and upgrades, as quickly as possible. The 2003 blackout did spur
some increase in investment industry-wide, from $3.5 billion per year to $6 billion in
2006. But profit-minded companies are only willing to invest funds where there is a
profit to be made, namely to carry their "economy transfers," regardless of how that
destabilizes the grid system overall.
In a July 2006 article, three former electric utility executives, who formed the organization, Power
Engineers Supporting Truth (PEST), out of disgust with the refusal of the government to pinpoint
deregulation as the cause of the massive grid failure, after the 2003 New York blackout, stated
that the
"core issue is an almost fundamentalist reliance on markets to solve even the most
scientifically complex problems... [P]olicy makers continue to act as if some adjustment
in market protocols is all that is required, and steadfastly refuse to acknowledge the
accumulating mass of evidence that deregulation ... is itself the problem. Social scientists
call this kind of denial, cognitive dissonance."
The engineers, who have among them more than five decades of experience in the electrical utility
industry, insist that "new transmission lines will not by themselves improve reliability. They
may increase transfer capacities, and hence improve commercial use of the grid," but will
not necessarily improve performance of the system. "Reliability standards have already been
reduced to accommodate greater use of the grid for commercial transactions," they warned
(Table II).
There has been a huge penalty for this disruption of the functioning of the electric grid.
PEST estimates that the 2003 blackout incurred economic losses in excess of $5 billion.
The California blackouts cost in excess of $1 billion each. The national impact of
declining reliability and quality, they estimate, is in excess of $50 billion.
When the California energy crisis of 2000-2001 was raging, distraught state legislators and the embattled
Gov. Gray Davis searched for a solution. Although they knew what that solution was, they protested that it
would be impossible to put the toothpaste of deregulation back in the tube. Lyndon LaRouche and EIR
proposed that that was exactly what needed to be done.
On Monday, July 17, 2006, in the midst of an intense Summer heat wave, one of Con Edison's 22 primary
feeder lines failed, below the streets of the City of New York. Over the next several hours, five more feeder
lines were lost. Voltage was reduced
8% to limit the instability, and the utility was faced with 25,000 customers—about 100,000 people—in the
heat and dark. It took until midnight July 23—seven days later—to restore 20,000 of the affected
customers, according to Con Edison.
The New York City blackout was the result not of a Summer heatwave, but of the decades
of underinvestment in the infrastructure that distributes electric power from central feeder
lines, through transformers, to the wires that deliver power to each home, school, factory,
office building, small business, and hospital. Some of Con Edison's underground
infrastructure goes back almost as far as Thomas Edison's first central generating station
and underground cable, on Pearl Street in lower Manhattan, in 1882. It was a length of 59-year-old
cable whose failure was a factor in the July blackout.
Industry-wide, there is agreement that weaknesses due to the age of the underground
distribution cable have been exacerbated by the way the system is run in today's
deregulated world. To "save money," the industry has turned to a policy of "run to
failure," where a company waits for a failure before replacing aged power lines and other
equipment. Black & Veatch reports that although utilities currently spend more than $18 billion on local
distribution systems, most of that is to string new wire to new housing developments (which will likely
come to an end soon, along with the housing boom), and that an additional $8-10 billion per year is needed
to replace obsolete and corroded equipment.
On top of this disinvestment policy, local distribution systems, like the transmission system, are being
stretched beyond their design limits. In addition to chronological age, overheating of equipment that is
caused by heavy electricity use and is repeatedly stressed will age faster, and is more likely to fail suddenly.
In 1986, Con Edison began a program to replace all of its older cable with a newer design. It is spending
about $25 million per year, and at that rate, the utility will not finish until 2024. By that time, some of its
replacement cable will be 38 years old. Con Edison delivers electricity to 3.2 million customers, through
95,000 miles of underground cable, and 33,000 miles of overhead wires. Estimates are that about 27% of its
underground cable needs to be replaced. Why is it taking decades to replace old cable?
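The article's own figures make the pace concrete. A quick arithmetic check, using only the numbers quoted above:

```python
# Check of the replacement pace implied by the figures in the article:
# 95,000 miles of underground cable, ~27% needing replacement, a program
# running from 1986 to a projected 2024 finish.

underground_miles = 95_000
share_to_replace = 0.27                 # ~27% needs replacement
start_year, finish_year = 1986, 2024    # program start and projected finish

miles_to_replace = underground_miles * share_to_replace
program_years = finish_year - start_year
miles_per_year = miles_to_replace / program_years

print(f"{miles_to_replace:,.0f} miles over {program_years} years "
      f"= {miles_per_year:,.0f} miles/year")
# → 25,650 miles over 38 years = 675 miles/year
```

At roughly 675 miles a year against more than 25,000 miles of aged cable, the decades-long timeline follows directly from the $25 million annual budget.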
According to media reports, recently Southern California Edison sought approval from the state Public
Utilities Commission to replace 800 miles of aging underground cable, after concluding that cable failures
were the leading cause of outages that could be prevented. But "consumer advocates" opposed the utility's
request to recoup the $145 million cost of replacement, on the grounds that the utility's records were not
adequate to ensure the worst cables would be replaced first. The utility will proceed and spend $250
million more than is recouped in customers' bills anyway, because they "don't want to get too far
behind." Apparently the shareholder-driven "consumer advocates" never added up the
economic, and sometimes, life-threatening costs, of the alternative—blackouts.
Between 1990 and the year 2000, utility employment in power generation dropped from 350,000 to
280,000, as utilities looked for ways to slash costs, to be "competitive." Over the same decade, employment
in transmission and distribution went from 196,000 to 156,000, in a system that is growing more complex
by the day. Today, the average age of a power lineman is 50 years.
"Quick profit," deregulation, shareholder values, environmentalism, have all run their
course, and nearly taken down the electricity grid. It is time to change the axioms.
Yes, there need to be more power plants built, to make up for the deficits in electric-generating capacity in many parts of the country. It is also the case that entire regions, in
particular the West and East Coasts, have so much congestion on their transmission lines,
that they cannot import the power they need. And as seen in New York City this past July,
breakdowns in 100-year-old underground local distribution systems are now leaving tens
of thousands of people in the dark, and must be replaced.
But it is foolhardy to think that the needed investments will be made under the present regime. Today,
thanks to deregulation, a company can earn more profits by not building anything, and instead charging
more for what they already produce, by creating shortages. This strategy was implemented to perfection six
years ago by Enron and other power pirates in California, which withheld power to raise prices through the
roof, allowing them to steal tens of billions of dollars out of the pockets of electricity consumers throughout
the West Coast.
Today, unregulated utility companies do not plow a large portion of their profits back into
improving infrastructure, but instead pay out higher dividends to stockholders. If even a
regulated company has any hope of raising hundreds of millions of dollars on Wall Street to finance
growth, it must prove itself creditworthy, by cutting costs and showing it can abide by
shareholder values.
Individual companies no longer cooperate to ensure the overall reliability of the electric
grid. They compete to build power plants and transmission lines based on their return on
investment, not on the physical requirements of a regional system. They make themselves
"competitive" to undercut the competition by cutting maintenance costs and getting rid of
as many employees as they can.
For two decades, industry officials and the North American Electric Reliability Council (NERC) have
warned that restructuring the electricity system would destroy it. An understanding of that
danger provoked Dr. Anjan Bose, former Dean of Engineering at Washington State University, to comment,
citing the advancement of power systems expertise in China and India that "the next time a grandstanding
politician in North America compares our grid to that of the Third World, he may actually mean it as a
compliment."
There is no way to "fix" the system, as Congress has tried to do, by piling on more and
more Federal regulations, to try to patch up the gaping holes in the broken system that
now exists. The only remedy is to return the intention of the industry to one of providing
universally reliable service, by putting the toothpaste of deregulation back in the tube.
The nearly two dozen states that have restructured their local industry, forcing utilities to sell their
generation assets to conglomerate holding companies, in order to "compete," must return responsibility and
oversight for electric generation and distribution to the state utility commissions. These public servants
should decide what should be built, and where, on the basis of providing for the general
welfare, not the profit profiles of companies headquartered a half-continent away.
There are no shortcuts. Decisive action is needed to reverse the past thirty years of failed
policies.
Funding for new grids will be provided with grants

New technologies aren't enough on their own; they need to complement and be compatible with both the existing grid and the grid of the future, said T.J. Glauthier, president and chief executive of the Electricity Innovation Institute (E2I). An affiliate of EPRI, E2I is charged with orchestrating the coordinated integration of next-generation technologies. This year it offered $500,000 in grants to researchers developing nanotechnologies for electric power systems. "What we need to really have is functionality, but we need to apply it in an evolutionary way," Glauthier said. "We need to find companies that will be able to replace and upgrade where there is the most congestion and demand. We're looking for ways to help ease that burden."

Fixing the grid from within would likely require giving it nerves in the form of remote sensors that track its health, a network for collecting and distributing the data, and a brain for interpreting and perhaps even acting on the information. But making such a "smart grid" would require engineers to design around high temperatures, strong electromagnetic forces and other difficult conditions.

About four years ago, PSE&G technology consultant Harry Roman and colleagues at the New Jersey Institute of Technology decided to tackle the first challenge: the nerves. They proposed developing a MEMS acoustic sensor to monitor transformers, using sound rather than electrical signals to inspect the innards of the transformer. In theory, sensors would track the telltale sounds of sparks that are emitted when the insulating oil within the transformer wears down or becomes contaminated. Early detection could allow utilities to avoid power failures or costly fires.

Developing the sensor hardware proved to be the easier part of the equation, Roman said. Once the project was underway, he discovered that the oil's temperature affected the sound of arcing. The team had to develop software that accounted for that relationship before it could get an accurate read on the transformer's inner workings. The sensors have progressed from lab-based tests to a mockup placed on a pole-mounted transformer, to this year's challenge: several months of trials in a small oil tank. Roman said "realistic implementation" is about two to four years away. In the meantime, he is developing similar sensors for gauging the motion of underground cables to detect mechanical stresses, and temperature sensors to monitor transmission lines.

Roman and Del Monaco emphasized that gathering data from sensors alone won't make the grid more robust. Knowing how to analyze information to detect and then deflect problems would lead to improved reliability, they said. "This is outage management," Roman said. "Our whole philosophy has been to be more proactive. (Sept. 11) also prompted us to think about security. How do we use these microsensors for security?"

PSE&G may be ahead of the curve. Roger Anderson, an advocate of a Web-enabled smart grid, said the energy industry as a whole shies away from new technologies until it has little choice but to adapt. The 2001 terrorist attacks and last year's massive outage jolted the industry, but didn't prompt any revolutionary change.
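The temperature-compensation problem Roman describes can be illustrated with a minimal sketch: correct the raw acoustic amplitude for oil temperature before comparing it against an alarm threshold. The correction coefficient and threshold below are hypothetical values for illustration, not PSE&G's actual software.

```python
# Minimal sketch of temperature-compensated arcing detection. The acoustic
# signature of arcing shifts with oil temperature, so the raw amplitude is
# normalized to a reference temperature before the alarm test.
# Coefficient and threshold are hypothetical, not real calibration data.

ALARM_THRESHOLD = 1.0    # normalized acoustic amplitude (hypothetical)
REFERENCE_TEMP_C = 25.0  # calibration temperature
TEMP_COEFF = 0.01        # assumed amplitude drift per degree C (hypothetical)

def arcing_alarm(raw_amplitude: float, oil_temp_c: float) -> bool:
    """Return True if the temperature-corrected amplitude suggests arcing."""
    corrected = raw_amplitude / (1.0 + TEMP_COEFF * (oil_temp_c - REFERENCE_TEMP_C))
    return corrected > ALARM_THRESHOLD

# At calibration temperature a reading of 1.2 trips the alarm...
print(arcing_alarm(1.2, 25.0))   # → True
# ...but the same raw reading in hot oil may be thermal drift, not arcing.
print(arcing_alarm(1.2, 65.0))   # → False
```

This is the point of Roman's discovery: without the temperature term, hot-oil readings would generate false arcing alarms.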
Researchers at the Department of Energy's Pacific Northwest National Laboratory (PNNL) in Washington state attack the problem from another angle. They created what they call GridWise, chips that can be installed into household appliances to monitor and assist the grid. The chips combine PNNL's expertise in microsystems with its mission to provide clean and energy-efficient technologies to the nation. The chips detect when the grid is becoming overloaded, for instance, when it is being taxed by air-conditioning demands on a hot and humid day. The chips temporarily shut down air conditioners or other appliances until the grid has recovered. At most, temporary brownouts inconvenience homeowners. But similar outages at energy-reliant high-tech facilities such as computer chip-making plants can prove ruinous.

"The bottom line is, we can't protect it (the grid) because it is so diverse," said Robert Pratt, a staff scientist at PNNL and program manager for GridWise. "We need resiliency. We need the flexibility to make sure it doesn't turn into a blackout." Pratt said the incentive for consumers would be in cost savings more than concerns about grid reliability. He envisions consumers installing GridWise into appliances, or buying appliances already wired with GridWise, and enrolling in utility programs that then give them cheaper rates. Their individual energy conservation would be small, but "it's the aggregate that makes it great," Pratt said.

EPRI's Rastler takes working outside the grid even further. The technical leader for its distributed energy resources program, he is looking at technologies such as stationary fuel cells that can provide alternative energy to consumers and thus ease the burden placed on the grid. His program also explores the feasibility of renewables such as solar cells. Both will likely benefit from nanotechnologies being honed in companies and research labs. "Several of the electric companies are interested in seeing whether these technologies can be part of the toolbox," Rastler said. "There's been a lot of hope, and a lot of over promise."

Change is coming to the grid, even if its engineering remains unchanged, according to Anderson. An oceanographer for 20 years, he recognizes in the grid the same kind of dynamic interplay of forces that make complex systems like the climate so difficult to predict. His tracking of blackouts in the U.S. over several decades shows a recent shift toward instability, with the frequency and magnitude of blackouts on the rise. The five-year trend serves as a warning that another multi-state meltdown like last August's could occur unless the grid is healed. "It scares us," he said, "like the way the global warming people are scared."
Dependence on natural gas is likely to cause economic damage. Nuclear energy will
keep the economy stable.
Fertel 2004
(March 4, Marvin S., Senior Vice President and Chief Nuclear Officer Nuclear Energy
Institute, “United States Senate Committee Energy and Natural Resources Subcommittee
on Energy”, Testimony, pg online @
http://www.nei.org/newsandevents/speechesandtestimony/2004/energysubcmtefertelexten
ded)
Second, new nuclear power plants provide future price stability that is not available from electric
generating plants fueled with natural gas. Intense volatility in natural gas prices over the last several
years is likely to continue, and subjects the U.S. economy to potential damage. Although nuclear
plants are capital-intensive to build, the operating costs of nuclear power plants are stable and
can dampen volatility of consumer costs in the electricity market.
Third, new nuclear plants will reduce the price and supply volatility of natural gas, thereby
relieving cost pressures on other users of natural gas that have no alternative fuel source.
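Fertel's price-stability argument can be illustrated with a toy comparison: a gas plant's cost per MWh moves with the fuel price, while a nuclear plant's cost is dominated by stable operating costs. All figures below are hypothetical, chosen only to show the mechanism, not actual plant economics.

```python
# Toy comparison of generating-cost volatility. A gas plant's $/MWh tracks
# the fuel price (fuel dominates its cost); a nuclear plant's $/MWh is
# mostly stable O&M. Every number here is a hypothetical assumption.

from statistics import pstdev

gas_prices = [3.0, 7.5, 4.2, 9.8, 5.5]  # hypothetical $/MMBtu over five years

GAS_HEAT_RATE = 7.0   # MMBtu per MWh (assumed plant efficiency)
GAS_OM = 4.0          # gas plant non-fuel O&M, $/MWh (assumed)
NUKE_FUEL = 5.0       # nuclear fuel cost, $/MWh (assumed, nearly flat)
NUKE_OM = 14.0        # nuclear non-fuel O&M, $/MWh (assumed)

gas_cost = [GAS_HEAT_RATE * p + GAS_OM for p in gas_prices]
nuke_cost = [NUKE_FUEL + NUKE_OM for _ in gas_prices]  # flat regardless of gas

print(f"gas cost swing:     {pstdev(gas_cost):5.1f} $/MWh")
print(f"nuclear cost swing: {pstdev(nuke_cost):5.1f} $/MWh")
```

Under these assumptions the gas plant's cost swings with every fuel-price spike while the nuclear plant's does not, which is the dampening effect on consumer electricity costs that the testimony describes.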
And, price shocks will lead to BILLIONS of deaths, threatening humanity’s very
existence.
Final Frontier 2008
(May 6 2008, “Economic Collapse”, pg online @ http://www.ff2012.com/EconCollapse.htm)
The straw that breaks the camel's back may very well be the loss of cheap energy. Oil
production has been stagnant since May of 2005 even though demand has been increasing.
Mexico, one of the largest suppliers of oil to the US, has stated that it will soon have no oil to
export and will become an oil importing nation. Saudi Arabia has promised to increase its
production several times, but did not, perhaps because they are currently unable to. Because of the
time required to bring a new oil field into commercial production, there will not be enough time to mitigate the oil situation
before 2012. Add to that the fact that there are currently no alternative energy supplies which can
come close to supplying the energy this country has become used to and dependent upon; and it
becomes obvious that life cannot continue its present course. Economic collapse equals death to
millions, perhaps billions as the life supporting infrastructure collapses. People living in cold
climates will not be able to heat their homes, resulting in death from cold and illness. Health
care will decline, as people out of work lose all health care. Food production will drop as farms
can no longer operate without fuel, or meet their property tax burden. The system doesn't have to suffer a
total collapse to kill off people. Those already living on the margins of society will easily be pushed too far, and they will be the
first to succumb. Is this likely? Well, it is a possibility. Only time will tell how deep our hole is, and whether we can climb out
of it.
The national power crisis has hit farmers and other food producers hard, according to
economics professor Johan Willemse. He predicts an increase of 10% to 15% in the
prices of staple foods in the next few months. "Load shedding has this unfortunate ripple effect",
Willemse said. "I know of small butchers who have had to throw away meat valued at more than
R50000 because refrigerators went off. At the end of the day, business has to make up for
these losses by asking higher prices."
What he called the"broken cold-food chain" would have an inflationary effect on food
prices because large quantities of staples, such as milk, are being discarded daily,
resulting in shortages, he said. "This is not even taking into account the number of hours lost
by production lines", warned Willemse. "It is absolutely chaotic" is how Agri SA’s director of natural
resources, Nic Opperman, described the effect of the electricity cuts. He said his agricultural association
was trying to establish how many working hours, and how many crops, had been lost as a result of load
shedding.
"I cannot give a figure now, but I know it is going to be huge. This is a tremendous problem for
every farmer in the country." Koos Coetzee, an economist with the Milk Producers’ Organisation,
said: "We are being modest when we say it is costing the dairy industry about R100-million a month. "It
is
not only the production phase that is being hurt. Shop owners and consumers also suffer
because milk goes sour when their electricity goes off." He predicted an increase in the price of
milk, but could not quantify it.
Coetzee said the unreliable power supply had left dairy farmers with no option but to spend
more than R240-million on power-support systems. But, he said, even “their fancy
equipment won’t stop the damage this [load shedding] is doing to the industry." He
estimated that monthly milk production has dropped by 20 million litres. Under normal
circumstances, about 200 million litres of milk are produced every month.
Coetzee said the milk producers had sought legal advice on suing Eskom. In a submission to Business
Unity SA, which held a meeting with Eskom yesterday, Agri SA asked for special attention to be given to
farming needs. Opperman said: "Agriculture was not informed in time of the magnitude of the
crisis and was therefore unable to put contingency plans in place … fruit destined for the
export market cannot be refrigerated in time and the cold chain, which is also so vital for
dairy products, is often interrupted or simply not available."
Opperman pleaded with Eskom and the government to play "open cards" with them.
Agri SA, too, would consider suing the electricity utility, he said. "We are in an industry in which
everything is time-related. You can’t postpone a harvest because of a blackout. "Farmers are
becoming anxious and want a solution."
This Summer, three decades of underinvestment and looting of the U.S. electrical industry grid system came home to roost. A week-long blackout in New York City, calls for "voluntary" conservation, the shutting off of power to large industrial enterprises, and lowering of voltages across the nation, were all evidence of the wreckage that has been made of this most critical infrastructure. For the past three decades, financial warfare, and attacks by anti-technology fanatics and free-market ideologues, have created the "perfect storm" that has left the U.S. electric grid in a condition of increasing instability. The restructuring of the electric utility industry, begun during the mid-1970s Carter Administration, has changed the rules of the road that had created an electric generation and delivery system that was the envy of the world.

This wreckage was accomplished by changing the axioms. From the time of President Franklin Roosevelt's regulation of the industry in 1935, the intention of the engineers who designed the electric grid was to deliver reliable, economical electricity, to every farm, family, and factory in the United States. Now this extraordinarily complex and fragile system has been degraded into a hodgepodge of hundreds of competing interests, run not by engineers, but by financiers and lawyers, where states are increasingly losing regulatory oversight, and reliability has taken a backseat to shareholder values.

Wheeling Power

The first sector of the electric utility industry to be deregulated was the network of high-voltage transmission wires, which were designed to make bulk power transfers, over relatively short distances, from large power-generating plants to the cities and towns where the power was needed. They were built by the utility company that had built the power plant, and as the grid grew, local lines were connected to other utilities' power lines to be available in case of emergencies. During the 1977 blackout in New York, for example, power was transferred in from the Tennessee Valley Authority system in the Southeast, to restabilize the grid.

After the mid-1970s Middle East War and orchestrated "oil crisis," which quadrupled prices, the Carter Administration proposed, and Congress passed, the 1978 Public Utility Regulatory Policies Act, which promoted "conservation," and poured billions of wasted Federal dollars into the development of small non-utility power generators, using "non-traditional" sources of power, such as biofuels, solar, and wind energy. This insane turning back the clock to pre-industrial 19th-Century methods was reinforced by attacks on nuclear power, reversing the policy of massive additions of new nuclear plants then underway. The 1978 law required the traditional utility companies to purchase power from these expensive "alternative" power sources.

The utility companies objected to this potential anarchic use of the transmission grid, and refused to provide these non-utility generators access to their systems. So, the Federal Energy Regulatory Commission, which had been established to restructure the industry, promulgated a superseding Federal rule forcing "open access" for these new non-utility generators to the transmission system. This "open access" rule was the foot in the door for the chaos and congestion in the transmission system that exists today. One of the huge electric industry conglomerates, American Electric Power, is an instructive case in point.

On Dec. 20, 1906, a certificate of incorporation was filed in Albany, New York for the American Gas and Electric Company. Over the ensuing 30 years, the company began electric, gas, water, steam, transit, and even ice services, in New Jersey, New York, Pennsylvania, West Virginia, Virginia, Ohio, Indiana, Michigan, and Illinois.

In 1928, the Federal Trade Commission launched a comprehensive inquiry into the entire electric power industry, as abuses mounted, from financial pyramid schemes and the stock market speculation of the "Roaring Twenties." The investigations culminated in the 1935 passage of President Franklin Roosevelt's Public Utility Holding Company Act, which forced the breakup of many holding companies, and several of American Electric Power's holdings were divested. Other legislation made it incumbent upon utilities to provide universal service, and gave the states overall regulatory oversight. While what became American Electric Power still maintained operations stretching from Virginia to Michigan, each state regulated its utility companies, defined the level of reliability to be maintained, and, in return, assured each company a modest return on investment.
As NERC warned a decade ago, the transmission system was not designed to handle
rapidly-changing bulk, so-called "economy" power transfers. On the three-year
anniversary of the "Great 2003 Blackout," NERC vice president Donald Cook explained, "There's
no question that the grid is being used now in ways for which it wasn't really designed. It
was built to connect neighbor to neighbor, over the last several decades. It was not
designed to move large blocks of power from one region to another. "The Federally built
Tennessee Valley Authority system is illustrative. TVA built, owns, and operates 17,000 miles of
transmission lines, to service its customers over an area including all or parts of seven Southeastern states.
FERC has been trying to force the TVA to join a Federally regulated Regional Transmission Organization,
which would require it to cede control of its transmission grid, and force it to build new transmission
capacity (for which its customers would have to pay), not to service its own ratepayers, but to allow
"economy" wheeling over its wires. So far, the TVA has refused.It is often stated that the solution to this
transmission congestion is to build new power lines. But while more transmission capacity is certainly
needed, that in itself, will not solve the problem.Blackout BlowbackFollowing the August 2003
blackout, which left 50 million people from the Midwest to the East Coast in the dark,
multiple Congressional hearings and a Federal investigation were conducted to examine
the problem and propose solutions. The Department of Energy was tasked with
identifying the cause. Its final report blamed everything possible—including operators
and fallen trees—except deregulation.But the Congress mandated that the Department
produce a report, the National Electric Transmission Congestion Study, which it released in
August 2006. The report duly noted what everyone already knew—that areas of Critical Congestion
included the New York City and Connecticut service areas, with Congestion Areas of Concern all the way
from New York through Northern Virginia. The Los Angeles area was noted as a Critical Congestion area,
with parts of the West Coast, from Seattle to San Diego, in the Areas of Concern category. But it is not in
these regions that profit-conscious, and even foreign-owned, companies are proposing to build new power
lines, or the new local generating plants that would obviate the need for long-distance transmission lines.
Why? Thanks to 30 years of irrational "environmentalist" brainwashing of sections of the
U.S. population, particularly in "liberal" large urban regions such as New York and
California, it is almost impossible to build new generating capacity—much less nuclear
power plants—where the greatest needs are. Therefore, these regions, which do not
generate enough power locally, are forced to import power from other utilities. Thanks to
the efforts of the same so-called environmentalists, these cities have not even been able to
build enough power lines to bring in the electricity from elsewhere. Under the no-holds-barred
market of deregulation, this "elsewhere" has moved further and further away from the large cities, with
their large power requirements, to areas of the country where power can be produced more cheaply, and
new plants can be built with the minimum amount of local political opposition and legal interference. For
example, PJM is a regional transmission interconnection, which coordinates the operation of the
transmission grid that now includes Delaware, Indiana, Illinois, Kentucky, Maryland, Michigan, New
Jersey, North Carolina, Ohio, Pennsylvania, Tennessee, Virginia, West Virginia, and the District of
Columbia. It oversees 56,070 miles of transmission lines, and plans regional transmission expansion to
maintain grid reliability and relieve congestion. In March, PJM identified transmission constraints in its
region, which were standing in the way of "bringing resources to a broader market." PJM identified two
transmission paths requiring significant investment: a high-voltage line from the coal fields of West
Virginia to Baltimore and Washington, D.C. and another, extending from West Virginia to Philadelphia,
New Jersey, and Delaware. However, these lines, hundreds of miles long, would not be necessary, if the
mandate existed to build new nuclear plants where the capacity would be near the load centers. While
Virginia and Maryland utilities are considering such new builds, most of the nuclear power plants that are
under consideration by utilities are in the semi-rural Southeast, where there is political support for new
plants, and building more high-voltage transmission lines to carry the power is unlikely to be held up for 15
years by "environmental" court challenges. Some of that new nuclear-generated power from the Southeast
will be used locally, for growing demand, and some will be wheeled to the energy-short regions of the mid-
Atlantic and Northeast, which refuse to build their own capacity. Companies that have been buying up
transmission capacity will make a bundle in the process. Underinvestment in new transmission capacity
overall has left the grid system vulnerable to even small instabilities. The industry
estimates that $100 billion is needed in new transmission capacity and upgrades, as
quickly as possible. The 2003 blackout did spur some increase in investment industry-
wide, from $3.5 billion per year to $6 billion in 2006. But profit-minded companies are
only willing to invest funds where there is a profit to be made, namely to carry their
"economy transfers," regardless of how that destabilizes the grid system overall. In a July
2006 article, three former electric utility executives, who formed the organization Power Engineers
Supporting Truth (PEST), out of disgust with the refusal of the government to pinpoint deregulation as the
cause of the massive grid failure, after the 2003 New York blackout, stated that the "core issue is an
almost fundamentalist reliance on markets to solve even the most scientifically complex
problems... [P]olicy makers continue to act as if some adjustment in market protocols is
all that is required, and steadfastly refuse to acknowledge the accumulating mass of
evidence that deregulation ... is itself the problem. Social scientists call this kind of
denial, cognitive dissonance." The engineers, who have among them more than five decades of
experience in the electrical utility industry, insist that "new transmission lines will not by
themselves improve reliability. They may increase transfer capacities, and hence improve
commercial use of the grid," but will not necessarily improve performance of the system.
"Reliability standards have already been reduced to accommodate greater use of the grid for
commercial transactions," they warned (Table II). There has been a huge penalty for this
disruption of the functioning of the electric grid. PEST estimates that the 2003 blackout
incurred economic losses in excess of $5 billion. The California blackouts cost in excess
of $1 billion each. The national impact of declining reliability and quality, they estimate,
is in excess of $50 billion.

Where To Go From Here

When the California energy crisis of 2000-2001
was raging, distraught state legislators and the embattled Gov. Gray Davis searched for a solution.
Although they knew what that solution was, they protested that it would be impossible to put the toothpaste
of deregulation back in the tube. Lyndon LaRouche and EIR proposed that that was exactly what needed to
be done. On Monday, July 17, 2006, in the midst of an intense Summer heat wave, one of Con Edison's 22
primary feeder lines failed, below the streets of the City of New York. Over the next several hours, five
more feeder lines were lost. Voltage was reduced 8% to limit the instability, and the utility was faced with
25,000 customers—about 100,000 people—in the heat and dark. It took until midnight July 23—seven days
later—to restore 20,000 of the affected customers, according to Con Edison. The New York City
blackout was the result not of a Summer heatwave, but of the decades of underinvestment
in the infrastructure that distributes electric power from central feeder lines, through
transformers, to the wires that deliver power to each home, school, factory, office
building, small business, and hospital. Some of Con Edison's underground infrastructure
goes back almost as far as Thomas Edison's first central generating station and
underground cable, on Pearl Street in lower Manhattan, in 1882. It was a length of 59-year-old cable
whose failure was a factor in the July blackout. A couple of years ago in Philadelphia, workers for
PECO Energy found that some underground utility cable still in service dated to 1899. In
July 1999, the failure of outdated cable was blamed for power outages in Manhattan
affecting 200,000 people. In San Francisco, a failed cable in December 2003 created an
outage for 100,000 residents. "We've been using equipment far beyond its original intended life
because we've been concerned with the cost of replacement and the need to keep utility rates down,"
remarked Dean Oskvig, president of Black & Veatch, an engineering firm based in St. Louis, last
month. Industry-wide, there is agreement that weaknesses due to the age of the
underground distribution cable have been exacerbated by the way the system is run in
today's deregulated world. To "save money," the industry has turned to a policy of "run to
failure," where a company waits for a failure before replacing aged power lines and other
equipment. Black & Veatch reports that although utilities currently spend more than $18 billion on local
distribution systems, most of that is to string new wire to new housing developments (which will likely
come to an end soon, along with the housing boom), and that an additional $8-10 billion per year is needed
to replace obsolete and corroded equipment. On top of this disinvestment policy, local distribution systems,
like the transmission system, are being stretched beyond their design limits. In addition to chronological
age, equipment that is overheated by heavy electricity use and repeatedly stressed will age
faster and is more likely to fail suddenly. In 1986, Con Edison began a program to replace all of its older
cable with a newer design. It is spending about $25 million per year, and at that rate, the utility will not
finish until 2024. By that time, some of its replacement cable will be 38 years old. Con Edison delivers
electricity to 3.2 million customers, through 95,000 miles of underground cable, and 33,000 miles of
overhead wires. Estimates are that about 27% of its underground cable needs to be replaced. Why is it
taking decades to replace old cable? According to media reports, Southern California Edison recently
sought approval from the state Public Utilities Commission to replace 800 miles of aging underground
cable, after concluding that cable failures were the leading cause of outages that could be prevented. But
"consumer advocates" opposed the utility's request to recoup the $145 million cost of replacement, on the
grounds that the utility's records were not adequate to ensure the worst cables would be replaced first. The
utility will proceed and spend $250 million more than is recouped in customers' bills
anyway, because they "don't want to get too far behind." Apparently the shareholder-driven
"consumer advocates" never added up the economic, and sometimes life-threatening,
costs of the alternative—blackouts. Before deregulation, companies like Con Edison
would make investments in infrastructure that were deemed necessary, to maintain a level
of service and reliability that met industry-wide standards, assured that state regulators
would allow them to recover the costs, and maintain their financial health. Today, many
states have no authority to either order investments or compensate companies that make
them, leaving Wall Street and the "free market" to decide who shall have reliable electric
power. Between 1990 and the year 2000, utility employment in power generation dropped from 350,000 to
280,000, as utilities looked for ways to slash costs, to be "competitive." Over the same decade, employment
in transmission and distribution went from 196,000 to 156,000, in a system that is growing more complex
by the day. Today, the average age of a power lineman is 50 years. "Quick profit," deregulation,
shareholder values, and environmentalism have all run their course, and nearly taken down
the electricity grid. It is time to change the axioms.

Transmitting Power, or Just Profits?

Yes, there
need to be more power plants built, to make up for the deficits in electric-generating
capacity in many parts of the country. It is also the case that entire regions, in particular
the West and East Coasts, have so much congestion on their transmission lines, that they
cannot import the power they need. And as seen in New York City this past July,
breakdowns in 100-year-old underground local distribution systems are now leaving tens
of thousands of people in the dark; those systems must be replaced. But it is foolhardy to think that the
needed investments will be made under the present regime. Today, thanks to deregulation, a company can
earn more profits by not building anything, and instead charging more for what it already produces, by
creating shortages. This strategy was implemented to perfection six years ago by Enron and other power
pirates in California, which withheld power to raise prices through the roof, allowing them to steal tens of
billions of dollars out of the pockets of electricity consumers throughout the West Coast. Today,
unregulated utility companies do not plow a large portion of their profits back into
improving infrastructure, but instead pay out higher dividends to stockholders. If even a
regulated company has any hope of raising hundreds of millions of dollars on Wall Street to finance
growth, it must prove itself creditworthy, by cutting costs and showing it can abide by
shareholder values. Individual companies no longer cooperate to ensure the overall
reliability of the electric grid. They compete to build power plants and transmission lines
based on their return on investment, not on the physical requirements of a regional
system. They make themselves "competitive" to undercut the competition by cutting
maintenance costs and getting rid of as many employees as they can. For two decades,
industry officials and the North American Electric Reliability Council (NERC) have warned that
restructuring the electricity system would destroy it. An understanding of that danger provoked
Dr. Anjan Bose, former Dean of Engineering at Washington State University, to comment, citing the
advancement of power systems expertise in China and India that "the next time a grandstanding politician
in North America compares our grid to that of the Third World, he may actually mean it as a
compliment." There is no way to "fix" the system, as Congress has tried to do, by piling on
more and more Federal regulations, to try to patch up the gaping holes in the broken
system that now exists. The only remedy is to return the intention of the industry to one
of providing universally reliable service, by putting the toothpaste of deregulation back in the
tube. The nearly two dozen states that have restructured their local industry, forcing utilities to sell their
generation assets to conglomerate holding companies, in order to "compete," must return responsibility and
oversight for electric generation and distribution to the state utility commissions. These public servants
should decide what should be built, and where, on the basis of providing for the general
welfare, not the profit profiles of companies headquartered a half-continent away. The
now-congested and unstable long-distance high-voltage transmission systems that criss-
cross the nation must be used for the purpose for which they were intended: to enable
bulk power transfer in case of emergency, not to wheel power from one end of the
country to the other so a company can import cheaper power, charge a few cents less, and
beat out the competition. Responsibility for the transmission system should be taken out
of the hands of the Federal deregulators, and returned to the regional reliability councils
that formulated the rules of the road to keep the system robust. There are no shortcuts.
Decisive action is needed to reverse the past thirty years of failed policies.
Immediate action to move away from the current main source of energy is essential to avoid
a collapse of the economy due to the inevitable oil peak. ***POSSIBLE
1AC***
Landry 2007
(March 30 2007, Cathy, of the American Petroleum Institute, “GAO warns of peak oil
threat to global economies”, pg LEXIS)
World oil production will peak sometime between now and 2040, the US Government Accountability
Office said March 29, cautioning that if the phenomenon occurs "soon" and "without warning," it could
cause oil prices to surge to unprecedented levels and result in "severe" economic damage. "The
prospect of a peak in oil production presents problems of global proportions whose consequences
will depend critically on our preparedness," GAO, the nonpartisan investigative arm of Congress, said in a report.
"While these consequences would be felt globally, the United States, as the largest consumer of oil
and one of the nations most heavily dependent on oil for transportation, may be especially
vulnerable among the industrialized nations of the world." Despite the threat of peak oil, the US government
currently has no "coordinated or well-defined strategy" to address the uncertainties about the
timing of peak oil or to mitigate its potential effects. For that reason, GAO recommended that
the federal government take immediate action, and suggested that the US energy secretary take
the lead in coordinating a government strategy. The government effort, GAO said, should include a
monitoring of global supply and demand with the intent of reducing uncertainty about the timing of peak oil
production. It also should assess alternative technologies in light of predictions about the timing of
peak oil and periodically advise Congress on likely cost-effective areas where government could
assist the private sector with development or adoption of the new technologies. GAO pointed out that
there are "many possible alternatives" to using oil, but that alternatives will require large
investments and in some cases will require major investments or breakthroughs in technology.
"Investment, however, is determined largely by price expectations, so unless high oil prices are sustained, we cannot expect
private investment to continue at current levels," GAO said. But if the peak were anticipated, it said, oil prices would rise,
signaling industry to increase efforts to develop alternatives and consumers of energy to conserve and look for more energy-
efficient products.
And, price shocks will lead to BILLIONS of deaths, threatening humanity’s very
existence.
Final Frontier 2008
(May 6 2008, “Economic Collapse”, pg online @
http://www.ff2012.com/EconCollapse.htm)
The straw that breaks the camel's back may very well be the loss of cheap energy. Oil production has
been stagnant since May of 2005 even though demand has been increasing. Mexico, one of the largest
suppliers of oil to the US has stated that it will soon have no oil to export and will become an oil
importing nation. Saudi Arabia has promised to increase its production several times, but did not, perhaps
because they are currently unable to. Because of the time required to bring a new oil field into commercial production,
there will not be enough time to mitigate the oil situation before 2012. Add to that the fact that there are currently no
alternative energy supplies which can come close to supplying the energy this country has become used to
and dependent upon; and it becomes obvious that life cannot continue its present course. Economic
collapse equals death to millions, perhaps billions as the life supporting infrastructure collapses. People
living in cold climates will not be able to heat their homes, resulting in death from cold and illness.
Health care will decline, as people out of work lose all health care. Food production will drop as farms
can no longer operate without fuel, or meet their property tax burden. The system doesn't have to suffer a total
collapse to kill off people. Those already living on the margins of society will easily be pushed too far, and they will be the first to
succumb. Is this likely? Well, it is a possibility. Only time will tell how deep our hole is, and whether we can climb out of it.
the nuclear nonproliferation regime has come under attack from a group of academics and policymakers
who argue that traditional tools such as export controls, diplomatic pressure, arms control agreements, and threats of economic
sanctions are no longer sufficient to battle proliferation. They point to North Korea's reinvigoration
of its plutonium program, Iran's apparent progress in developing a nuclear
capability, and the breadth of the Abdul Qadeer (A.Q.) Khan network as
evidence that the regime is failing.1 In addition, they claim that proliferation is
driven by the inevitable spread of technology from a dense network of suppliers
and that certain "rogue" states possess an unflagging determination to acquire
nuclear weapons. Consequently, they argue that only extreme measures such as aggressively enforced containment or regime
change can slow the addition of several more countries to the nuclear club. This “proliferation determinism,”
at least in rhetoric, is shared by many prominent members of President George W. Bush’s administration and has become the main
thrust of U.S. counterproliferation policy.2 Yet current proliferators are neither as “dead set” on proliferating nor as advanced in their
nuclear capabilities as determinists claim.3 To dismantle the network of existing proliferation programs, the administration should
instead move toward a policy of “proliferation pragmatism.” This would entail abandoning extreme rhetoric, using a full
range of incentives and disincentives aimed at states seeking to acquire a nuclear capability, targeting the hubs of proliferation
networks, and engaging in direct talks with the Islamic Republic of Iran and the Democratic Peoples’
Republic of Korea (DPRK). In practice, the Bush administration’s nonproliferation policies have been more varied and less aggressive
than its rhetoric would suggest. For example, it has been willing to enter talks with North Korea and Libya despite describing
both as “rogues.” Strong words can be used strategically to convince proliferators that accepting a settlement offer would be better
than continuing to hold out. Yet the administration’s unyielding rhetoric has placed the United States in a position from which it is
difficult to back down;4 combined with a lack of positive incentives, this stance has convinced proliferators that the
United States will not agree to or uphold any settlement short of regime change. Moreover, the administration has not formulated any
coherent counterproliferation policies other than regime change and an aggressive form of export control enforcement known as the
Proliferation Security Initiative. With respect to two of the key proliferators today—Iran and North Korea—the Bush administration
has shown little interest in offering any signiªcant incentives or establishing any clear red lines. Instead, it has relied almost
exclusively on China to convince the DPRK to give up its nuclear program and has declined to join the United Kingdom, France, and
Germany in talks with Iran. Proliferation determinists present two arguments. First, dense networks
among second-tier proliferators such as Iran, North Korea, and Libya and private
agents—including A.Q. Khan and two of his middlemen, Buhary Seyed
Abu (B.S.A.) Tahir and Urs Tinner—have rapidly accelerated proliferation and
lowered technological barriers.5 Because these networks are widespread and
decentralized, global measures rather than strategies targeted at individual
states are necessary to slow these processes. Second, certain rogue states are
dead set on proliferating and thus have no interest in bargaining.
GNEP is committed to spreading nuclear power and preventing weaponized use and
proliferation.
Lacy in '08 (Ian Hore-Lacy, Director for Public Communications at the World Nuclear Association,
“Global Nuclear Energy Partnership
(GNEP)”,http://www.eoearth.org/article/Global_Nuclear_Energy_Partnership_(GNEP), 6/24/2008)
The Global Nuclear Energy Partnership (GNEP) is a comprehensive strategy to expedite the
development of nuclear power around the world while improving the use of resources and
providing greater disincentives to the proliferation of nuclear weapons. It was initiated by the
USA early in 2006, but picked up on concerns and proposals from the International Atomic Energy Agency (IAEA) and Russia. The vision
was for a global network of nuclear fuel cycle facilities all under IAEA control or at least
supervision. Broadly, GNEP’s mission is the global expansion of nuclear power in a safe and secure
manner while reducing the threat of nuclear weapons proliferation and the spread of sensitive
nuclear technology for non-peaceful purposes. The possible spread of nuclear material and
technology for developing weapons of mass destruction must be countered to avoid increasing
the present threat to global security.
The greatest threat to the US and the entire civilized [but unbalanced] world is the
proliferation of former Soviet nuclear weapons to radical terrorists.
Cohen '05 (Ariel Cohen, Ph.D., Senior Research Fellow in Russian and Eurasian Studies in the Douglas and
Sarah Allison Center for Foreign Policy Studies, a division of the Kathryn and Shelby Cullom Davis
Institute for International Studies at The Heritage Foundation, "Preventing a Nightmare Scenario: Terrorist
Attacks Using Russian Nuclear Weapons and Materials", 5/20/2008,
http://www.heritage.org/Research/HomelandSecurity/bg1854.cfm, 6/27/2008)
Since the terrorist attacks on September 11, 2001, Americans have been lucky that there have not been more
atrocities on U.S. soil. However, the enemy, while weakened, is far from destroyed. Osama bin Laden and Ayman
al-Zawahiri continue to issue threats against America from their hideouts. Their strength and support base, while diminished, is not
eliminated. Other terrorist organizations inspired by radical Islamist ideology are still at large in Europe, the Middle
East, the Caucasus, Central Asia, the Indian subcontinent, Southeast Asia, and (presumably) the Americas,
and some of them are willing to use weapons of mass destruction (WMD) to bring down America.
There are also media reports of al-Qaeda buying or stealing up to 20 nuclear warheads from the
former Soviet republics, bin Laden providing $3 million and large commercial amounts of opium
to Chechens in exchange for nuclear weapons or material, and four Turkmen nuclear scientists
working to create an al-Qaeda weapon.[3] The veracity of these reports cannot be independently evaluated.[4] In February
2005, Director of Central Intelligence Porter Goss testified that al-Qaeda might possess radioactive material of Russian or Soviet origin.
That’s because the interest rate that any commercial bank would charge on a loan for a
nuclear facility would be so high — because of all the risks of lawsuits or cost overruns
— that it would be impossible for Exelon to proceed. A standard nuclear plant today costs
about $3 billion per unit. The only way to stimulate more nuclear power innovation,
Crane said, would be federal loan guarantees that would lower the cost of capital for
anyone willing to build a new nuclear plant. The 2005 energy bill created such loan
guarantees, but the details still have not been worked out. “We would need a robust loan
guarantee program to jump-start the nuclear industry,” Crane said — an industry
that has basically been frozen since the 1979 Three Mile Island accident. With cheaper
money, added Crane, CO2-free nuclear power could be “very competitive” with CO2-
emitting pulverized coal.
Estimates of the levelized costs of power from new reactors are shown in Table 26.
These results, which range from 3.7 cents per kWh to 9.8 cents per kWh, are largely
driven by capital cost and financing assumptions. According to Joskow, the 6.7 cents
per kWh MIT study estimate that is shown in Table 26 falls to 5.2 cents per kWh if
the plant is built and financed by a regulated utility with ratepayers bearing the
investment risk (CEEPR 2006, pp.15, 28). Similarly, federal loan guarantees can
reduce the financing costs of a plant. According to an April 2007 Cambridge Energy
Research Associates report, government funding or loan guarantees can reduce the
levelized cost of nuclear generation by 10-15 percent (CERA 2007).
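The sensitivity to financing described above can be checked with a back-of-envelope levelized-cost calculation. The sketch below is purely illustrative and does not reproduce the MIT or CERA models: the $3,000/kW overnight cost, 40-year life, 90% capacity factor, 1.5 c/kWh operations-and-fuel adder, and the 10% vs. 7% discount rates are all assumed numbers chosen for the example.

```python
# Back-of-envelope LCOE sketch. All inputs are illustrative assumptions,
# NOT figures taken from the studies cited in the card above.

def capital_recovery_factor(rate, years):
    """Annual payment per dollar of upfront capital at a given discount rate."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoe_cents_per_kwh(overnight_cost_per_kw, rate, years,
                       capacity_factor=0.90, om_cents=1.5):
    """Levelized cost = annualized capital cost per kWh + O&M/fuel adder."""
    annual_kwh_per_kw = 8760 * capacity_factor
    capital_cents = (overnight_cost_per_kw * 100 *
                     capital_recovery_factor(rate, years) / annual_kwh_per_kw)
    return capital_cents + om_cents

# Merchant financing (~10% cost of capital) vs. loan-guaranteed debt (~7%).
merchant = lcoe_cents_per_kwh(3000, 0.10, 40)
guaranteed = lcoe_cents_per_kwh(3000, 0.07, 40)
print(f"merchant: {merchant:.1f} c/kWh, with guarantee: {guaranteed:.1f} c/kWh")
```

Even with these made-up inputs, lowering the cost of capital by three points cuts the levelized cost by roughly a fifth, the same order of magnitude as the financing effects the card describes.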
Loan guarantees and tax credits have been recommended by the government for
nuclear power.
Charles F. Carroll and John E. Matthews, 2005
In its report dated January 10, 2005, the [Nuclear Energy Task Force] identified the
unavailability of financing as a significant obstacle to new nuclear power plant
construction. The NETF recommended that the US government offer a range of
financial incentives for the construction of the first few reactors, such as: secured
loans, loan guarantees, accelerated depreciation, investment tax credits, production tax
credits and government power purchase agreements. The NETF’s recommended
“menu” of incentives is intended to address the anticipated financing needs of
companies thought likely to pursue new plant construction without prescribing a
particular financial model. The three financial models cited by the NETF as likely to be
used for new plant construction are: the regulated utility model; the unregulated merchant
generator model; and the non-recourse project finance model.
US Global Leadership Key. If the US creates nuclear energy incentives, the rest of the world will follow,
in turn providing new technology and economic returns.
Condoleezza Rice, Secretary of State, March 13, 2006, The Washington Post
The week before last President Bush concluded a historic agreement on civilian nuclear cooperation with India, a rising
democratic power in a dynamic Asia. This agreement is a strategic achievement: It will strengthen international security. It will
enhance energy security and environmental protection. It will foster economic and technological development. And it will help
transform the partnership between the world's oldest and the world's largest democracy.
First, our agreement with India will make our future more secure, by expanding the reach of the
international nonproliferation regime. The International Atomic Energy Agency would gain access to
India's civilian nuclear program that it currently does not have. Recognizing this, the IAEA's director
general, Mohamed ElBaradei, has joined leaders in France and the United Kingdom to welcome our
agreement. He called it "a milestone, timely for ongoing efforts to consolidate the non-proliferation
regime, combat nuclear terrorism and strengthen nuclear safety."
Our agreement with India is unique because India is unique. India is a democracy, where citizens of many ethnicities and faiths
cooperate in peace and freedom. India's civilian government functions transparently and accountably. It is fighting terrorism and
extremism, and it has a 30-year record of responsible behavior on nonproliferation matters.
Aspiring proliferators such as North Korea or Iran may seek to draw connections between themselves and India, but their
rhetoric rings hollow. Iran is a state sponsor of terrorism that has violated its own commitments and is defying the international
community's efforts to contain its nuclear ambitions. North Korea, the least transparent country in the world, threatens its
neighbors and proliferates weapons. There is simply no comparison between the Iranian or North Korean regimes and India.
The world has known for some time that India has nuclear weapons, but our agreement will not
enhance its capacity to make more. Under the agreement, India will separate its civilian and military
nuclear programs for the first time. It will place two-thirds of its existing reactors, and about 65 percent
of its generating power, under permanent safeguards, with international verification -- again, for the
first time ever. This same transparent oversight will also apply to all of India's future civilian reactors,
both thermal and breeder. Our sale of nuclear material or technology would benefit only India's civilian
reactors, which would also be eligible for international cooperation from the Nuclear Suppliers Group.
Second, our agreement is good for energy security. India, a nation of a billion people, has a massive
appetite for energy to meet its growing development needs. Civilian nuclear energy will make it less
reliant on unstable sources of oil and gas. Our agreement will allow India to contribute to and share in
the advanced technology that is needed for the future development of nuclear energy. And because
nuclear energy is cleaner than fossil fuels, our agreement will also benefit the environment. A threefold
increase in Indian nuclear capacity by 2015 would reduce India's projected annual CO2 emissions by
more than 170 million tons, about the current total emissions of the Netherlands.
Third, our agreement is good for American jobs, because it opens the door to civilian nuclear trade and cooperation between our
nations. India plans to import eight nuclear reactors by 2012. If U.S. companies win just two of those reactor contracts, it will
mean thousands of new jobs for American workers. We plan to expand our civilian nuclear partnership to research and
development, drawing on India's technological expertise to promote a global renaissance in safe and clean nuclear power.
Finally, our civilian nuclear agreement is an essential step toward our goal of transforming America's
partnership with India. For too long during the past century, differences over domestic policies and
international purposes kept India and the United States estranged. But with the end of the Cold War,
the rise of the global economy and changing demographics in both of our countries, new opportunities
have arisen for a partnership between our two great democracies. As President Bush said in New Delhi
this month, "India in the 21st century is a natural partner of the United States because we are brothers
in the cause of human liberty."
Under the president's leadership, we are beginning to realize the full promise of our relationship with
India, in fields as diverse as agriculture and health, commerce and defense, science and technology,
and education and exchange. Over 65,000 Americans live in India, attracted by its growing economy
and the richness of its culture. There are more than 2 million people of Indian origin in the United
States, many of whom are U.S. citizens. More Indians study in our universities than students from any
other nation. Our civilian nuclear agreement is a critical contribution to the stronger, more enduring
partnership that we are building.
We are consulting extensively with Congress as we seek to amend the laws needed to implement the agreement. This is an
opportunity that should not be missed. Looking back decades from now, we will recognize this moment as the time when
America invested the strategic capital needed to recast its relationship with India. As the nations of Asia continue their dramatic
rise in a rapidly changing region, a thriving, democratic India will be a pillar of Asia's progress, shaping its development for
decades. This is a future that America wants to share with India, and there is not a moment to lose.
November 28, 2003, The Korea Herald, Yoo Soh-jung, staff reporter
Opening the electric power market is currently a global trend. Intense competition therefore looms in
Korea as demand for electric power in developing countries continues to rise.
Such conditions call for the Ministry of Science and Technology to reinforce itself with the technology
and know-how it has amassed to make inroads into markets overseas.
The ministry says it aims to strengthen Korea's participation in electric power businesses abroad. Once
it achieves this goal, the government plans to overcome the growth limitations of the domestic electric
power market and contribute to improving the national economy.
Korea's overseas business partners in this area predominantly involve nations that entered the nuclear
power industry later than the first-mover countries. The network includes countries such as China,
Romania and Vietnam. The ministry has noted that its foreign partners expect to build a strong
cooperative relationship with Korea.
These countries are currently making plans to advance on the international nuclear power market
with the support of the Korea Hydro & Nuclear Power Co. and other domestic corporations such as
Doosan Heavy Industries & Construction Co., and through these corporations' ties with foreign
organizations such as the World Energy Council of the United States and Atomic Energy of Canada
Ltd.
With its rich experience and strong technology base, the government says it is confident it can supply
globally competitive services.
Furthermore, despite a slowdown for the nuclear energy industry in the U.S. and Europe, the
government says it is steadily promoting the nuclear power generation business in response to Korea's
increasing electricity demand. It is also seeking new sites for nuclear power plants and supporting the
development of commercial technology.
As of the end of 2001, 16 nuclear power units were in operation in Korea, with four units under
construction. Construction of four new units began this year. Korea has about 13 gigawatts of nuclear
power generating capacity, which accounts for 28 percent of its electric power generation.
Under the Ministry of Commerce, Industry and Energy's "Fifth Long-Term Plan for Electric Power
Demand and Supply," which was finalized in December 2001, 12 new nuclear power units will be built
by 2015. The government expects their completion to increase the share of nuclear-power capacity and
generation to 33 percent and 44.5 percent, respectively.
In addition to improving Korea's competitiveness by expanding the industry, Korea's rising status
partly comes from its relations with international organizations, particularly the International Atomic
Energy Agency. For instance, since the country became an IAEA member in 1957, it has received
assistance in training the atomic energy work force through the agency's technical cooperation projects.
Following the conclusion of a memorandum of understanding with the IAEA in 1998, the Ministry of
Science and Technology said that Korea has played a role in expanding atomic energy education and
training programs for developing countries, and has plans to strengthen the activities and programs at
international training and education centers.
Furthermore, Korea hosted the regional office of the Regional Cooperative Agreement for Research,
Development and Training Related to Nuclear Science and Technology in the Asia and Pacific Region
to strengthen technical cooperation and facilitate technology transfers among member states in March
2002.
Moreover, since joining the Nuclear Energy Agency in 1993, Korea has participated in joint research
projects of the Organisation for Economic Cooperation and Development and NEA, such as the Halden
Reactor, RASPLAV, International System on Occupational Exposure and International Cooperative
Decommissioning Program.
The United States must lead in nuclear energy in order to maintain global leadership and
dominance.
Keir A. Lieber and Daryl G. Press, Council on Foreign Relations, March/April 2006
This debate may now seem like ancient history, but it is actually more relevant than ever
-- because the age of MAD is nearing an end. Today, for the first time in almost 50 years,
the United States stands on the verge of attaining nuclear primacy. It will probably soon
be possible for the United States to destroy the long-range nuclear arsenals of Russia or
China with a first strike. This dramatic shift in the nuclear balance of power stems from a
series of improvements in the United States' nuclear systems, the precipitous decline of
Russia's arsenal, and the glacial pace of modernization of China's nuclear forces. Unless
Washington's policies change or Moscow and Beijing take steps to increase the size and
readiness of their forces, Russia and China -- and the rest of the world -- will live in the
shadow of U.S. nuclear primacy for many years to come.
One's views on the implications of this change will depend on one's theoretical perspective. Hawks, who believe that the United States is a benevolent
force in the world, will welcome the new nuclear era because they trust that U.S. dominance in both conventional and nuclear weapons will help deter
aggression by other countries. For example, as U.S. nuclear primacy grows, China's leaders may act more cautiously on issues such as Taiwan, realizing
that their vulnerable nuclear forces will not deter U.S. intervention -- and that Chinese nuclear threats could invite a U.S. strike on Beijing's arsenal. But
doves, who oppose using nuclear threats to coerce other states and fear an emboldened and unconstrained United States, will worry. Nuclear primacy
might lure Washington into more aggressive behavior, they argue, especially when combined with U.S. dominance in so many other dimensions of
national power. Finally, a third group -- owls, who worry about the possibility of inadvertent conflict -- will fret that U.S. nuclear primacy could prompt
other nuclear powers to adopt strategic postures, such as by giving control of nuclear weapons to lower-level commanders, that would make an
unauthorized nuclear strike more likely -- thereby creating what strategic theorists call "crisis instability."
ARSENAL OF A DEMOCRACY
For 50 years, the Pentagon's war planners have structured the U.S. nuclear arsenal according to the goal of deterring a nuclear attack on the United States
and, if necessary, winning a nuclear war by launching a preemptive strike that would destroy an enemy's nuclear forces. For these purposes, the United
States relies on a nuclear triad comprising strategic bombers, intercontinental ballistic missiles (ICBMs), and ballistic-missile-launching submarines
(known as SSBNs). The triad reduces the odds that an enemy could destroy all U.S. nuclear forces in a single strike, even in a surprise attack, ensuring
that the United States would be able to launch a devastating response. Such retaliation would only have to be able to destroy a large enough portion of the
attacker's cities and industry to deter an attack in the first place. The same nuclear triad, however, could be used in an offensive attack against an
adversary's nuclear forces. Stealth bombers might slip past enemy radar, submarines could fire their missiles from near the enemy's shore and so give the
enemy's leaders almost no time to respond, and highly accurate land-based missiles could destroy even hardened silos that have been reinforced against
attack and other targets that require a direct hit. The ability to destroy all of an adversary's nuclear forces, eliminating the possibility of a retaliatory strike,
is known as a first-strike capability, or nuclear primacy.
The United States derived immense strategic benefits from its nuclear primacy during the early years of the Cold War, in terms of both crisis-bargaining
advantages vis-à-vis the Soviet Union (for example, in the case of Berlin in the late 1950s and early 1960s) and planning for war against the Red Army in
Europe. If the Soviets had invaded Western Europe in the 1950s, the United States intended to win World War III by immediately launching a massive
nuclear strike on the Soviet Union, its Eastern European clients, and its Chinese ally. These plans were not the concoctions of midlevel Pentagon
bureaucrats; they were approved by the highest level of the U.S. government.
U.S. nuclear primacy waned in the early 1960s, as the Soviets developed the capability to carry out a retaliatory second strike. With this development
came the onset of MAD. Washington abandoned its strategy of a preemptive nuclear strike, but for the remainder of the Cold War, it struggled to escape
MAD and reestablish its nuclear dominance. It expanded its nuclear arsenal, continuously improved the accuracy and the lethality of its weapons aimed at
Soviet nuclear arms, targeted Soviet command-and-control systems, invested in missile-defense shields, sent attack submarines to trail Soviet SSBNs, and
built increasingly accurate multiwarhead land- and submarine-launched ballistic missiles as well as stealth bombers and stealthy nuclear-armed cruise
missiles. Equally unhappy with MAD, the Soviet Union also built a massive arsenal in the hope of gaining nuclear superiority. Neither side came close to
gaining a first-strike capability, but it would be a mistake to dismiss the arms race as entirely irrational: both superpowers were well aware of the benefits
of nuclear primacy, and neither was willing to risk falling behind.
Since the Cold War's end, the U.S. nuclear arsenal has significantly improved. The United States has replaced the ballistic missiles on its submarines with
the substantially more accurate Trident II D-5 missiles, many of which carry new, larger-yield warheads. The U.S. Navy has shifted a greater proportion
of its SSBNs to the Pacific so that they can patrol near the Chinese coast or in the blind spot of Russia's early warning radar network. The U.S. Air Force
has finished equipping its B-52 bombers with nuclear-armed cruise missiles, which are probably invisible to Russian and Chinese air-defense radar. And
the air force has also enhanced the avionics on its B-2 stealth bombers to permit them to fly at extremely low altitudes in order to avoid even the most
sophisticated radar. Finally, although the air force finished dismantling its highly lethal MX missiles in 2005 to comply with arms control agreements, it is
significantly improving its remaining ICBMs by installing the MX's high-yield warheads and advanced reentry vehicles on Minuteman ICBMs, and it has
upgraded the Minuteman's guidance systems to match the MX's accuracy.
IMBALANCE OF TERROR
Even as the United States' nuclear forces have grown stronger since the end of the Cold War, Russia's strategic nuclear arsenal has sharply deteriorated.
Russia has 39 percent fewer long-range bombers, 58 percent fewer ICBMs, and 80 percent fewer SSBNs than the Soviet Union fielded during its last
days. The true extent of the Russian arsenal's decay, however, is much greater than these cuts suggest. What nuclear forces Russia retains are hardly ready
for use. Russia's strategic bombers, now located at only two bases and thus vulnerable to a surprise attack, rarely conduct training exercises, and their
warheads are stored off-base. Over 80 percent of Russia's silo-based ICBMs have exceeded their original service lives, and plans to replace them with
new missiles have been stymied by failed tests and low rates of production. Russia's mobile ICBMs rarely patrol, and although they could fire their
missiles from inside their bases if given sufficient warning of an attack, it appears unlikely that they would have the time to do so.
The third leg of Russia's nuclear triad has weakened the most. Since 2000, Russia's SSBNs have conducted approximately two patrols per year, down
from 60 in 1990. (By contrast, the U.S. SSBN patrol rate today is about 40 per year.) Most of the time, all nine of Russia's ballistic missile submarines are
sitting in port, where they make easy targets. Moreover, submarines require well-trained crews to be effective. Operating a ballistic missile submarine --
and silently coordinating its operations with surface ships and attack submarines to evade an enemy's forces -- is not simple. Without frequent patrols, the
skills of Russian submariners, like the submarines themselves, are decaying. Revealingly, a 2004 test (attended by President Vladimir Putin) of several
submarine-launched ballistic missiles was a total fiasco: all either failed to launch or veered off course. The fact that there were similar failures in the
summer and fall of 2005 completes this unflattering picture of Russia's nuclear forces.
Compounding these problems, Russia's early warning system is a mess. Neither Soviet nor Russian satellites have ever been capable of reliably detecting
missiles launched from U.S. submarines. (In a recent public statement, a top Russian general described his country's early warning satellite constellation
as "hopelessly outdated.") Russian commanders instead rely on ground-based radar systems to detect incoming warheads from submarine-launched
missiles. But the radar network has a gaping hole in its coverage that lies to the east of the country, toward the Pacific Ocean. If U.S. submarines were to
fire missiles from areas in the Pacific, Russian leaders probably would not know of the attack until the warheads detonated. Russia's radar coverage of
some areas in the North Atlantic is also spotty, providing only a few minutes of warning before the impact of submarine-launched warheads.
Moscow could try to reduce its vulnerability by finding the money to keep its submarines and mobile missiles dispersed. But that would be only a short-
term fix. Russia has already extended the service life of its aging mobile ICBMs, something that it cannot do indefinitely, and its efforts to deploy new
strategic weapons continue to flounder. The Russian navy's plan to launch a new class of ballistic missile submarines has fallen far behind schedule. It is
now highly likely that not a single new submarine will be operational before 2008, and deployment may slip later still.
Even as Russia's nuclear forces deteriorate, the United States is improving its ability to track submarines and mobile missiles, further eroding Russian
military leaders' confidence in Russia's nuclear deterrent. (As early as 1998, these leaders publicly expressed doubts about the ability of Russia's ballistic
missile submarines to evade U.S. detection.) Moreover, Moscow has announced plans to reduce its land-based ICBM force by another 35 percent by
2010; outside experts predict that the actual cuts will slice 50 to 75 percent off the current force, possibly leaving Russia with as few as 150 ICBMs by
the end of the decade, down from its 1990 level of almost 1,300 missiles. The more Russia's nuclear arsenal shrinks, the easier it will become for the
United States to carry out a first strike.
To determine how much the nuclear balance has changed since the Cold War, we ran a computer model of a hypothetical U.S. attack on Russia's nuclear
arsenal using the standard unclassified formulas that defense analysts have used for decades. We assigned U.S. nuclear warheads to Russian targets on the
basis of two criteria: the most accurate weapons were aimed at the hardest targets, and the fastest-arriving weapons at the Russian forces that can react
most quickly. Because Russia is essentially blind to a submarine attack from the Pacific and would have great difficulty detecting the approach of low-
flying stealthy nuclear-armed cruise missiles, we targeted each Russian weapon system with at least one submarine-based warhead or cruise missile. An
attack organized in this manner would give Russian leaders virtually no warning.
This simple plan is presumably less effective than Washington's actual strategy, which the U.S. government has spent decades perfecting. The real U.S.
war plan may call for first targeting Russia's command and control, sabotaging Russia's radar stations, or taking other preemptive measures -- all of which
would make the actual U.S. force far more lethal than our model assumes.
According to our model, such a simplified surprise attack would have a good chance of destroying every Russian bomber base, submarine, and ICBM.
[See Footnote #1] This finding is not based on best-case assumptions or an unrealistic scenario in which U.S. missiles perform perfectly and the warheads
hit their targets without fail. Rather, we used standard assumptions to estimate the likely inaccuracy and unreliability of U.S. weapons systems. Moreover,
our model indicates that all of Russia's strategic nuclear arsenal would still be destroyed even if U.S. weapons were 20 percent less accurate than we
assumed, or if U.S. weapons were only 70 percent reliable, or if Russian ICBM silos were 50 percent "harder" (more reinforced, and hence more resistant
to attack) than we expected. (Of course, the unclassified estimates we used may understate the capabilities of U.S. forces, making an attack even more
likely to succeed.)
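The "standard unclassified formulas" the authors mention can be illustrated with a short sketch. The open-literature version assigns each warhead a single-shot kill probability from its accuracy (CEP), its yield, and the target's hardness, then compounds shots and reliability. The lethal-radius coefficient below is one common open-source approximation, and all the numeric inputs in the usage note are hypothetical placeholders, not the authors' actual model or data.

```python
import math

def lethal_radius_nmi(yield_mt, hardness_psi):
    # Open-literature approximation: radius (nautical miles) at which a
    # warhead of yield_mt megatons produces hardness_psi peak overpressure.
    # The 2.62 coefficient is a commonly cited unclassified estimate.
    return 2.62 * yield_mt ** (1 / 3) / hardness_psi ** (1 / 3)

def sspk(yield_mt, cep_nmi, hardness_psi):
    # Single-shot probability of kill ("cookie-cutter" formula):
    # SSPK = 1 - 0.5 ** ((LR / CEP) ** 2)
    lr = lethal_radius_nmi(yield_mt, hardness_psi)
    return 1 - 0.5 ** ((lr / cep_nmi) ** 2)

def kill_probability(yield_mt, cep_nmi, hardness_psi, reliability, shots):
    # Compound kill probability for several independent shots, each
    # discounted by overall weapon-system reliability.
    p = reliability * sspk(yield_mt, cep_nmi, hardness_psi)
    return 1 - (1 - p) ** shots
```

With hypothetical inputs of a 0.455 Mt warhead, a CEP of about 0.065 nmi (~120 m), and a 2,000 psi silo, the single-shot kill probability comes out near 0.98 before reliability is applied, which shows why the authors' sensitivity checks focus on accuracy, reliability, and hardness margins.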
To be clear, this does not mean that a first strike by the United States would be guaranteed to work in reality; such an attack would entail many
uncertainties. Nor, of course, does it mean that such a first strike is likely. But what our analysis suggests is profound: Russia's leaders can no longer
count on a survivable nuclear deterrent. And unless they reverse course rapidly, Russia's vulnerability will only increase over time.
China's nuclear arsenal is even more vulnerable to a U.S. attack. A U.S. first strike could succeed whether it was launched as a surprise or in the midst of
a crisis during a Chinese alert. China has a limited strategic nuclear arsenal. The People's Liberation Army currently possesses no modern SSBNs or long-
range bombers. Its naval arm used to have two ballistic missile submarines, but one sank, and the other, which had such poor capabilities that it never left
Chinese waters, is no longer operational. China's medium-range bomber force is similarly unimpressive: the bombers are obsolete and vulnerable to
attack. According to unclassified U.S. government assessments, China's entire intercontinental nuclear arsenal consists of 18 stationary single-warhead
ICBMs. These are not ready to launch on warning: their warheads are kept in storage and the missiles themselves are unfueled. (China's ICBMs use liquid
fuel, which corrodes the missiles after 24 hours. Fueling them is estimated to take two hours.) The lack of an advanced early warning system adds to the
vulnerability of the ICBMs. It appears that China would have no warning at all of a U.S. submarine-launched missile attack or a strike using hundreds of
stealthy nuclear-armed cruise missiles.
Many sources claim that China is attempting to reduce the vulnerability of its ICBMs by building decoy silos. But decoys cannot provide a firm basis for
deterrence. It would take close to a thousand fake silos to make a U.S. first strike on China as difficult as an attack on Russia, and no available
information on China's nuclear forces suggests the existence of massive fields of decoys. And even if China built them, its commanders would always
wonder whether U.S. sensors could distinguish real silos from fake ones.
Despite much talk about China's military modernization, the odds that Beijing will acquire a survivable nuclear deterrent in the next decade are slim.
China's modernization efforts have focused on conventional forces, and the country's progress on nuclear modernization has accordingly been slow. Since
the mid-1980s, China has been trying to develop a new missile for its future ballistic missile submarine as well as mobile ICBMs (the DF-31 and longer-
range DF-31A) to replace its current ICBM force. The U.S. Defense Department predicts that China may deploy DF-31s in a few years, although the
forecast should be treated skeptically: U.S. intelligence has been announcing the missile's imminent deployment for decades.
Even when they are eventually fielded, the DF-31s are unlikely to significantly reduce China's vulnerability. The missiles' limited range, estimated to be
only 8,000 kilometers (4,970 miles), greatly restricts the area in which they can be hidden, reducing the difficulty of searching for them. The DF-31s
could hit the contiguous United States only if they were deployed in China's far northeastern corner, principally in Heilongjiang Province, near the
Russian-North Korean border. But Heilongjiang is mountainous, and so the missiles might be deployable only along a few hundred kilometers of good
road or in a small plain in the center of the province. Such restrictions increase the missiles' vulnerability and raise questions about whether they are even
intended to target the U.S. homeland or whether they will be aimed at targets in Russia and Asia.
Given the history of China's slow-motion nuclear modernization, it is doubtful that a Chinese second-strike force will materialize anytime soon. The
United States has a first-strike capability against China today and should be able to maintain it for a decade or more.
INTELLIGENT DESIGN?
Is the United States intentionally pursuing nuclear primacy? Or is primacy an unintended byproduct of intra-Pentagon competition for budget share or of
programs designed to counter new threats from terrorists and so-called rogue states? Motivations are always hard to pin down, but the weight of the
evidence suggests that Washington is, in fact, deliberately seeking nuclear primacy. For one thing, U.S. leaders have always aspired to this goal. And the
nature of the changes to the current arsenal and official rhetoric and policies support this conclusion.
The improvements to the U.S. nuclear arsenal offer evidence that the United States is actively seeking primacy. The navy, for example, is upgrading the
fuse on the W-76 nuclear warhead, which sits atop most U.S. submarine-launched missiles. Currently, the warheads can be detonated only as air bursts
well above ground, but the new fuse will also permit ground bursts (detonations at or very near ground level), which are ideal for attacking very hard
targets such as ICBM silos. Another navy research program seeks to improve dramatically the accuracy of its submarine-launched missiles (already
among the most accurate in the world). Even if these efforts fall short of their goals, any refinement in accuracy combined with the ground-burst fuses
will multiply the missiles' lethality. Such improvements only make sense if the missiles are meant to destroy a large number of hard targets. And given
that B-2s are already very stealthy aircraft, it is difficult to see how the air force could justify the increased risk of crashing them into the ground by
having them fly at very low altitudes in order to avoid radar detection -- unless their mission is to penetrate a highly sophisticated air defense network
such as Russia's or, perhaps in the future, China's.
During the Cold War, one explanation for the development of the nuclear arms race was that the rival military services' competition for budget share
drove them to build ever more nuclear weapons. But the United States today is not achieving primacy by buying big-ticket platforms such as new SSBNs,
bombers, or ICBMs. Current modernization programs involve incremental improvements to existing systems. The recycling of warheads and reentry
vehicles from the air force's retired MX missiles (there are even reports that extra MX warheads may be put on navy submarine-launched missiles) is the
sort of efficient use of resources that does not fit a theory based on parochial competition for increased funding. Rather than reflect organizational
resource battles, these steps look like a coordinated set of programs to enhance the United States' nuclear first-strike capabilities.
Some may wonder whether U.S. nuclear modernization efforts are actually designed with terrorists or rogue states in mind. Given the United States'
ongoing war on terror, and the continuing U.S. interest in destroying deeply buried bunkers (reflected in the Bush administration's efforts to develop new
nuclear weapons to destroy underground targets), one might assume that the W-76 upgrades are designed to be used against targets such as rogue states'
arsenals of weapons of mass destruction or terrorists holed up in caves. But this explanation does not add up. The United States already has more than a
thousand nuclear warheads capable of attacking bunkers or caves. If the United States' nuclear modernization were really aimed at rogue states or
terrorists, the country's nuclear force would not need the additional thousand ground-burst warheads it will gain from the W-76 modernization program.
The current and future U.S. nuclear force, in other words, seems designed to carry out a preemptive disarming strike against Russia or China.
The intentional pursuit of nuclear primacy is, moreover, entirely consistent with the
United States' declared policy of expanding its global dominance. The Bush
administration's 2002 National Security Strategy explicitly states that the United States
aims to establish military primacy: "Our forces will be strong enough to dissuade
potential adversaries from pursuing a military build-up in hopes of surpassing, or
equaling, the power of the United States." To this end, the United States is openly seeking
primacy in every dimension of modern military technology, both in its conventional
arsenal and in its nuclear forces.
Light water reactors are safe from overheating and internal accidents.
S.S. Penner, R. Seiser, both professors at the University of California Center of Atomic Research, and
K.R. Schultz, General Atomics, 07
When a nuclear reactor is shut down, the radioactive materials in the core continue to generate some
heat. This heat must be removed to keep the reactor temperature at safe levels. Current light-water
reactors utilize active means for this purpose: operational and back-up pumps, pipes, and heat
exchangers cool the reactor. In recent years, emphasis has been placed on making the heat-removal
systems operate even in case of an accident, without human intervention, to protect the reactor
and thus the public. This approach is referred to as using a passive safety measure. Passively safe designs
are features of the Generation III+ and later light-water reactors. Designs that tolerate massive pipe
breaks or other equipment failures are being developed for Generation IV gas-cooled reactor designs. It is
worth noting how the AP600 system is designed to produce passive safety in case of a loss of coolant
accident (LOCA). The basic idea is simply that application of gravity cannot fail. The AP600 is the first
reactor with this passive safety feature. It is certified by the US NRC. Tanks elevated with respect to the
reactor core are filled with cold water containing a dissolved salt (e.g., sodium borate). If a LOCA occurs
while the reactor-core pressure remains elevated, the core make-up tanks (CMTs) circulate cold water
through the core as the result of negative buoyancy. At somewhat reduced pressures, forced water injection
is caused by high-pressure nitrogen. At low pressures, forced water injection occurs from water-storage
tanks. The flashing steam condenses on the internal walls before it is recycled back to the water-storage
tank and the lower containment compartment. Heat generated in the reactor core drives convection-cooling
processes to operate as long as injection cooling is needed. The AP600 has sufficient redundancy to
guarantee continued reactor cooling even when some but not all of the systems fail. In the highly unlikely
event that all of the passive safety systems for water-cooled reactors fail simultaneously, safety
systems such as the containment vessels and active cooling systems are designed to ensure public
safety. Preferred over passively safe systems are inherently safe designs. These generally contain fuel
elements made entirely of ceramics that have melting points higher than the steady-state temperatures
reached in the reactor without cooling. The passive safety features of modular pebble-bed reactors
(MPBRs) and prismatic helium-cooled reactors should generally ensure operational safety (see Section 7).
Most nuclear plants use light water, which is safe and cheap, alleviating any security
or cost concerns.
Most nuclear reactors, particularly in the USA, are of the light water (LWR)
variety, in either pressurized liquid or boiling variants. Improved light water systems,
focusing on safety and cost, have recently been proposed by Westinghouse, Siemens, and
General Electric. Additionally, there are departures from the LWR modes that use closed
systems for cooling and moderation. Closed systems are cooled by gas, heavy water,
liquid sodium, or liquid lead, and some employ innovative fuel types such as TRISO or
breeder fuels. They hold the most promise for redressing nuclear power concerns (Box 1).
Table I characterizes representative advanced reactors. The comparative costs of
advanced power systems, both nuclear and conventional, are of significance to investors
and consumers. The best estimates are that advanced LWR systems are about 10
percent less costly to build and run than newly installed combined-cycle gas-fired
plants, and that TRISO-fueled reactors that employ Brayton cycle power conversion
units are 20 percent less costly. Breeder reactors, by way of comparison, are significantly
more expensive, up to twice the cost of conventional plants. Table II displays estimated
comparative costs in a very general way. Even though the capital costs of nuclear plants
are higher, the life cycle costs are lower due to much lower fuel costs. Note that
although lower costs are an investment incentive, they are not sufficient to alleviate
concerns about nuclear power.
Pressurized and boiling water reactors
Most of the nuclear
reactors in the world use ordinary light water as coolant and moderator; a small number
uses heavy water. The most common design is the PWR (pressurized water reactor)
encompassing 65 percent of the total; boiling water reactors (BWR) are at 23 percent.
The principal disadvantages of PWR and BWR include waste disposal issues, heat
pollution, and vulnerability to terrorist attack. Light water reactors have also been
criticized on grounds of safety and cost, and although they actually rate highly in
these regards, the public perception is otherwise. Ironically, only safety and cost are
addressed in the newer PWR and BWR proposals.
Box 1
Moderation is the process
whereby fast neutrons are slowed to what are known as thermal levels. Fast neutrons can
breed fertile materials; thermal neutrons cannot. However, thermal neutrons interact with
fissile materials more efficiently with large release of energy. Typical moderators are
hydrogen and carbon.
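The cost pattern described above, higher capital costs for nuclear but lower life cycle costs because fuel is much cheaper, can be sketched with a toy calculation. Every figure below is a hypothetical illustration, not an estimate drawn from Table II:

```python
# Illustrative only: all per-kW dollar figures are hypothetical assumptions
# chosen to mirror the pattern in the text, not the source's estimates.
def lifecycle_cost(capital, fuel, om, years):
    """Total cost per kW over the plant's life: upfront capital plus
    annual fuel and operations/maintenance charges, summed over the years."""
    return capital + (fuel + om) * years

# Nuclear: high capital, very low fuel. Gas: low capital, high fuel.
nuclear = lifecycle_cost(capital=2000, fuel=15, om=60, years=40)
gas = lifecycle_cost(capital=700, fuel=120, om=40, years=40)

# Despite the higher capital cost, the life cycle cost is lower.
assert nuclear < gas
```

The sketch ignores discounting and construction time, which in practice matter a great deal to investors; it only illustrates why low fuel costs can outweigh high capital costs over a 40-year life.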
Light water reactors are cheaper, safer, and more efficient than any other nuclear
reactor.
The Daily Yomiuri – 2007
The Daily Yomiuri, July 14 2007, Tokyo
The Economy, Trade and Industry Ministry plans to create a basic design for large light water
reactors that can generate 1.8 million kilowatts of electricity, or about 1.3 times more than
current large nuclear reactors.
The ministry plans to introduce the large light water reactors in about 2025 and jointly
develop them in conjunction with electric power companies and nuclear power-plant builders.
The research and development cost of 60 billion yen will be split equally between the
government and the private sector.
As the nation's nuclear power plants are expected to be rebuilt from the second half of the
2020s, the ministry has carried out basic research on light water reactors to fill the void
before fast breeder reactors are introduced.
Because fast breeder reactors use sodium--which has a high heat efficiency--as a coolant,
they need a completely new design. However, light water reactors will be an extension of the
current reactors, which use water, which is easy to handle.
In addition to increasing power output significantly, the enrichment of the uranium fuel
will be raised so that it burns longer, reducing the amount of spent nuclear fuel by
about 40 percent.
With the ministry currently finding it difficult to decide on sites for the disposal of high-level
radioactive waste, finding ways to dispose of nuclear waste has become a top priority.
There has always been doubt as to the superiority, both technical and economic, of the light water reactor. It
is difficult to document the claim that light water is inferior in an ex post sense: light water may be
relatively good now, but had a different technology dominated, we might have an even better reactor.
Nonetheless, there are indications that this hypothesis is true. In the fifties, following a debate on the
relative merits of enriched uranium (light water) and natural uranium (heavy water and gas graphite), the
journal Nucleonics stated that "to the observer of this debate it seems that enriched reactors must rely
heavily upon their development potential to do much better than match the power costs of natural uranium
systems." Further, the cost estimates made throughout the fifties, detailed later, by no means pointed to light
water as the most efficient technology.
Both the gas graphite and heavy water reactors have much lower volumetric power densities (the ratio of
power output to core volume) than do light water reactors. While this tends to raise capital costs and reduce
design flexibility, it also provides a safety advantage. In the event of a coolant loss, the core will provide a
much larger heat sink (particularly in the case of the graphite core) and so the temperature transients will be
much smaller, giving operators more time to effect an adequate response. The use of a gas coolant also has
the advantage of being safe from phase changes with changes in pressure or temperature. Thus under many
fault conditions cooling can be maintained in the gas graphite reactor, when it would be lost with liquid
coolant technologies. A second, related advantage of gas coolants is that they can be heated to higher
temperatures, which gives the advanced gas graphite reactors a higher
thermal efficiency than others.
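The heat-sink argument above can be sketched with a lumped-capacitance estimate: with decay heat P and core heat capacity m·c, the temperature rises at roughly P / (m·c). The decay-heat, mass, and specific-heat figures below are purely illustrative assumptions:

```python
# Illustrative sketch: a bigger, higher-heat-capacity core warms more
# slowly after a coolant loss, giving operators more time to respond.
# All numbers are hypothetical, not reactor data.
def temp_rise_rate(decay_heat_mw, core_mass_t, specific_heat_kj_per_kg_k):
    """Approximate core temperature rise in kelvin per second, treating
    the core as a single lumped heat sink with no cooling."""
    power_kw = decay_heat_mw * 1_000
    heat_capacity_kj_per_k = core_mass_t * 1_000 * specific_heat_kj_per_kg_k
    return power_kw / heat_capacity_kj_per_k  # kW / (kJ/K) = K/s

light_water_core = temp_rise_rate(20, core_mass_t=120, specific_heat_kj_per_kg_k=0.3)
graphite_core = temp_rise_rate(20, core_mass_t=1200, specific_heat_kj_per_kg_k=0.7)

# The large graphite core heats far more slowly for the same decay heat.
assert graphite_core < light_water_core
```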
An element of considerable concern during the British debate over the merits of light water and gas
graphite technologies was the steel pressure vessel of the pressurized water reactor (PWR, Westinghouse's
light water reactor). The safety principle in the PWR was, and still is, that the vessel never comes close to
failure. If a crack does happen to reach the critical size (much smaller than the thickness of the vessel),
however, it can grow at speeds up to the speed of sound. There would be no time for reaction. To
manufacture a vessel sufficiently free of flaws to be safe from this problem requires very high technology
manufacturing abilities, which are beyond the capabilities of many countries and were beyond most
countries in the fifties. Both the Canadian heavy water reactor, the Candu, and the second-generation
British gas graphite reactor, the AGR, avoid this problem through systematic redundancy. The Candu uses
many pressure tubes rather than a single vessel. The failure of a single tube is not critical and gives warning
of other potential failures. This makes Candu less prone to meltdown due to coolant loss.
The AGR uses
a prestressed concrete pressure vessel. There is considerable mechanical redundancy in the system of steel
load-bearing cables. Cables can be replaced individually, and again, the failure of a single cable is not
fatal and gives warning of other potential failures.
In terms of operating experience, light water has not been significantly better than the
other technologies in spite of having logged many more reactor years–an order of
magnitude more than heavy water and three times more than gas graphite. While
occupational radiation exposure with light water has been approximately equal to that of
heavy water, it has been more than 10 times that of the British gas graphite reactors. The
annual load factor of a reactor is the ratio of the total amount of power produced in a year
to the amount it would have produced had it operated at full capacity, never shutting
down, throughout the year. This is the standard measure of reactor availability. The
average annual load factors of light water and gas graphite reactors have been
approximately equal at 63 percent. Heavy water reactors, however, have had an average
annual load factor of 73 percent. This difference is due in part to the on-load refueling
capabilities of the Candu, which have been adopted for the AGR.
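The load-factor definition in the card reduces to a one-line calculation. The reactor size and annual outputs below are hypothetical, chosen only to reproduce the 63 and 73 percent averages quoted:

```python
# Annual load factor: energy actually produced in a year divided by the
# energy the reactor would have produced at full capacity all year.
def annual_load_factor(energy_produced_mwh, capacity_mw, hours=8_760):
    """Standard measure of reactor availability (8,760 hours per year)."""
    return energy_produced_mwh / (capacity_mw * hours)

# A hypothetical 1000 MW light water reactor producing 5,518,800 MWh:
lwr = annual_load_factor(5_518_800, 1000)
# A hypothetical 1000 MW heavy water reactor producing 6,394,800 MWh:
hwr = annual_load_factor(6_394_800, 1000)

assert abs(lwr - 0.63) < 1e-9  # the ~63% average cited for light water
assert abs(hwr - 0.73) < 1e-9  # the 73% average cited for heavy water
```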
Hugh McIntyre estimated that the heavy water Candu reactors at Pickering generate
power at about 75 percent of the cost of the light water reactors of equivalent size at the
Zion 2 generating station in Illinois. This is consistent with analyses done by Ontario
Hydro, which suggest that if Ontario Hydro had a mature light water reactor program, the
costs of nuclear electricity would be 20 to 25 percent higher than with the current heavy
water systems.
There is considerable evidence, then, that other technologies have inherent advantages
over light water and that with equivalent amounts of development and use might well
have proven to be better. While it is not possible to document definitively that light water
is an inferior technology, it seems clear that the dominant position held by light water
cannot be due to a unanimous belief in its technical and economic superiority.
SOLVENCY – DUPIC
By enacting DUPIC processing the waste from light water reactors can be used to fuel heavy
water reactors (CANDU reactors)—solving waste problems.
The Toronto Star (Newspaper), February 12, 2007, Lexis-Nexis “The Candu edge; Canada's
heavy-water reactors can run on spent fuel from most light-water reactors, eliminating 2
headaches: skyrocketing uranium prices and waste disposal concerns”
The international potential of Candu nuclear reactors may not be obvious to some, but rising uranium
prices and heightened concern over nuclear-waste disposal could soon shine a light on this made-in-
Canada technology.
Nobody sees this more than Myung Seung Yang of South Korea's atomic energy institute. Yang and his
fellow nuclear scientists have spent the past 15 years exploring ways of using Candu reactors to
recycle highly radioactive waste, or "spent fuel," from a majority of the world's nuclear reactors.
The approach, Yang wrote in an email message to the Star, "would have many benefits when
practically implemented." South Korea is determined to try.
It's little known - at least outside the nuclear power industry - that the heavy-water reactor technology
that lies at the heart of Candu's design can, with some technical tinkering, directly use waste fuel from
most rival light-water reactors. Candu developer Atomic Energy of Canada Ltd. calls this the DUPIC
process - standing for the Direct Use of Spent Pressurized Water Reactor Fuel in Candus. In 1991, the
Canadian government established a joint research program with the Korean Atomic Energy Research
Institute to investigate the approach, and both sides have demonstrated that it technically works.
The long-term implications, if DUPIC processing can be done safely and economically, are potentially
enormous. There are hundreds of pressurized light-water reactors (PWRs) around the world being used
to generate electricity and propel submarines and aircraft carriers.
In the United States alone, two-thirds of the 104 reactors in operation are based on PWR designs,
according to the U.S. Energy Information Administration. This has led over the years to the
accumulation of 36,000 metric tonnes of spent fuel, which is kept in temporary storage at dozens of
locations until a safe permanent-storage site can be found.
With DUPIC processing, that waste can be turned into a reusable fuel. This can significantly reduce a
country's dependence on uranium, which many analysts predict will rise above $100 (U.S.) per pound
by the end of next year - a tenfold price increase since January 2001.
Perhaps most important, the spent light-water fuel that eventually comes out of a Candu reactor will
contain less toxic material than the fuel that goes in, shrinking the amount of radioactive waste that
must ultimately go into long-term storage.
"The DUPIC fuel cycle could reduce a country's need for used PWR fuel disposal by 70 per cent while
reducing fresh uranium requirements by 30 per cent," according to the World Nuclear Association.
It's for this reason South Korea is keen on the DUPIC process. It currently has 20 operating reactors -
16 PWRs and four Candus. Another eight PWRs are on order or being built. It sees the reuse of spent
fuel in Candus as a key strategy for managing radioactive waste.
"The accumulation of spent fuel is an urgent issue that should be resolved," Yang and his colleagues
wrote in a briefing document that was presented at the 15th Pacific Nuclear Conference in Australia
last October. They called the eventual commercial development of the DUPIC process "an extremely
important turning point in the history of nuclear power development."
David Torgerson, chief technology officer and senior vice-president of Atomic Energy of Canada, says
the way uranium resources are used by power generators is driven by cost and supply. During the
1990s, for example, uranium prices were so low that it made more economic sense to just use it once
and then stick the spent fuels in wet or dry storage.
But some countries don't have their own uranium resources, leaving them dependent on imports from
other, potentially hostile jurisdictions. As uranium prices rise, the economics of the once-through fuel
cycle also become less appealing when measured against the costs of waste management and disposal.
"As the nuclear renaissance takes off and more reactors are built, it's likely the price of uranium will
increase (even more), and people will be looking at ways of getting more value out of that uranium,"
says Torgerson.
"Any time you can convert a waste into an asset, then you're going in the right direction."
He's quick to point out that the DUPIC process is also "proliferation resistant," meaning there is no
chemical separation of the spent uranium's more dangerous components, primarily plutonium, which
could be used by extremists or rogue nations to produce nuclear weapons. Only mechanical processing
is required to change the shape of the spent fuel rods into shorter Candu rods.
Mechanical reprocessing, while it has some safety and transportation issues, could be cheaper than
conventional chemical reprocessing.
"Because this is so much simpler, you have to expect the economics are going to be so much better,"
says Torgerson, pointing out that the South Koreans studied the economics of the DUPIC fuel cycle in
the 1990s and found it could compete against other fuel options. "This is one of the characteristics
we're certainly pushing."
For countries such as China, which already have Candu reactors in their fleet, it's an approach that
could prove attractive. AECL estimates that waste fuel from three light-water reactors would be
enough to fuel one Candu.
Duane Bratt, a political science instructor and expert on Canadian nuclear policy at Calgary's Mount
Royal College, says he can envision two revenue streams going to Candu operators that choose to
embrace the DUPIC process.
One stream would be the revenue that comes in through the generation and sale of electricity; the other
would come from a tipping fee that operators of light-water reactors would pay to unload their spent
fuel.
"These (Candu) operators wouldn't be buying the spent fuel, they'd be paid to use the spent fuel for
environmental reasons," says Bratt. "If you can minimize the waste, you bring tremendous value."
SOLVENCY – DUPIC
The DUPIC process is much simpler than conventional wet-chemistry techniques for reprocessing, and
promises to be cheaper. It presents a significant anti-proliferation benefit as well, since radioactive fission
products and fissile material are not separated. In addition, since the heat load of spent DUPIC fuel is
similar to that of the original spent LWR fuel, disposal requirements do not increase. However, since
approximately 50% more energy can be derived from LWR fuel by burning it as DUPIC fuel in a CANDU
reactor, the disposal cost is expected to be lower than either spent LWR or CANDU fuel (Baumgartner,
1998).
SOLVENCY – DUPIC
DUPICs use less fuel, are more energy efficient, minimize proliferation
possibilities, and burn more dangerous radioactive waste than other nuclear
reactors.
Countries with existing and new light-water reactors could use the spent uranium fuel in
those reactors on a separate fleet of Candus, meaning less consumption of new uranium
fuel.
When the spent light-water fuel is run through a Candu, it packs two times the amount
of energy as when the original fuel was used.
Turning spent light-water fuel into usable fuel in the DUPIC process only requires
mechanical separation and repackaging, a more proliferation resistant process than the
so-called "wet chemical" approach used to re-enrich spent fuel.
Finally, when the spent fuel is recycled and used in a Candu reactor, more of the
dangerous radioactive materials are burned away, meaning less bad stuff to handle when
it does eventually go into long-term storage.
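These advantages can be tied back to the numbers quoted in the Toronto Star evidence: AECL's rule of thumb that spent fuel from three light-water reactors fuels one Candu, and the World Nuclear Association's 70 percent disposal reduction. The fleet size below is a hypothetical assumption:

```python
# Hypothetical 12-reactor LWR fleet; AECL's rule of thumb in the evidence
# is that spent fuel from three light-water reactors fuels one Candu.
lwr_reactors = 12  # hypothetical fleet size
candus_fueled = lwr_reactors // 3
assert candus_fueled == 4

# World Nuclear Association figure quoted in the evidence: DUPIC cuts
# used-PWR-fuel disposal needs by 70 percent. Applied to the 36,000-tonne
# U.S. accumulation cited in the article:
spent_fuel_t = 36_000
remaining_for_disposal_t = spent_fuel_t * (1 - 0.70)
assert abs(remaining_for_disposal_t - 10_800) < 1e-6
```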
SOLVENCY – DUPIC
The DUPIC fuel cycle has several benefits—it helps solve nuclear proliferation and
meets energy requirements.
Korea Atomic Energy Research Institute, “The Status and Prospect of DUPIC Fuel
Technology”, Nuclear Engineering and Technology, Vol. 38, No. 4, June 2006,
http://article.nuclear.or.kr/jknsfile/v38/JK0380359.pdf
The DUPIC fuel cycle is a unique spent nuclear fuel management technology that can be
implemented in South Korea. In the past, the Tandem fuel cycle development [38], which
recycles mixed oxide fuel in a CANDU reactor through a reprocessing, was not
successful. The Korea Hydro Nuclear Power also tried reprocessing outside of Korea, but
this work was unsuccessful due to the increasing concern about proliferation and
adherence to the nonproliferation treaty as it concerns the Korean peninsula. Nonetheless,
the accumulation of spent fuel is an urgent issue that should be resolved. Therefore, a
technology should be developed that can be implemented in Korea under the non-
proliferation policy. The DUPIC fuel cycle is known to be the most representative
example that has technically overcome the international and domestic restrictions
involved with the Tandem fuel cycle. Though it is yet too early to launch the
commercialization of DUPIC fuel based on the basic DUPIC fuel technology currently
developed, it is also true that the key technologies have been developed for the DUPIC
fuel cycle. Therefore, it is expected that there should be no technical problems to
develop commercial DUPIC
fuel technology once the DUPIC fuel technology and its performance are demonstrated
through a practical use of the DUPIC fuel, which will be an important turning point in the
history of nuclear power development. By utilizing spent fuel via an internationally-
proven proliferation-resistant technology, it is expected that the burden of spent fuel
accumulation will be relieved not only in the domestic nuclear grid but also in the
worldwide nuclear power industry.
SOLVENCY – DUPIC
The DUPIC process has many key advantages.
SOLVENCY – ACCIDENTS
SOLVENCY – ACCIDENTS
Nuclear power plants are extraordinarily safe, with accident rates far below those of other industries.
Nuclear Power to Play Key Role in Meeting Energy, Environmental Goals, House Panel
Told
WASHINGTON, Wed Mar 12, 2008 /PRNewswire-USNewswire/
"Our nuclear plants are not only environmentally sound by avoiding the emission of 681
million metric tons of CO2 each year, they are also extraordinarily safe. In 2006, our lost-
time accident rate was 0.12 accidents per 200,000 worker hours. That is significantly
safer than the 3.5 accidents per 200,000 worker hours in the manufacturing sector," Flint
said.
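Flint's comparison can be checked directly, since both rates are already normalized to the same 200,000 worker-hour base:

```python
# Lost-time accident rates per 200,000 worker-hours, as quoted by Flint.
nuclear_rate = 0.12
manufacturing_rate = 3.5

# How many times higher is the manufacturing accident rate?
ratio = manufacturing_rate / nuclear_rate
assert 29 < ratio < 30  # roughly 29 times the nuclear industry's rate
```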
SOLVENCY – ACCIDENTS
SOLVENCY – ACCIDENTS
Nuclear power plants are the most efficient and reliable form of alternative energy.
As noted earlier, the operation costs of nuclear power include high levels of
internalization compared with other major electricity generating technologies. In
particular, significant attempts have been made to internalize the environmental costs of
the industry. Other factors of national energy policy that remain problematic in the design
of liberalized electricity markets are reliability, security of fuel supply and generation
capacity margins. Conventional nuclear power plants are well suited to continuous high
quality base load electricity generation. ‘Renewables’ on the other hand can suffer from
intermittency and poor power quality, although important progress is now being made in
both these areas.
SOLVENCY - ACCIDENTS
The transportation of nuclear waste has so far never resulted in an accident and already
has strict guidelines.
December 3, 2007 Dispelling Myths About Nuclear Energy by Jack Spencer and Nick
Loris
FACT: The NRC and other regulatory agencies around the world take the strictest
precautions when dealing with spent nuclear fuel. Since 1971, more than 20,000
shipments of spent fuel and high-level waste have been transported more than 18
million miles worldwide without incident.
A staggering amount of evidence directly refutes this myth. Nuclear waste has been
transported on roads and railways worldwide for years without a significant incident.
Indeed, more than 20 million packages with radioactive materials are transported globally
each year--3 million of them in the United States. Since 1971, more than 20,000
shipments of spent fuel and high-level waste have been transported more than 18 million
miles without incident.[9] Transportation of radioactive materials is just not a problem.
The NRC and other regulatory agencies around the world take the strictest precautions
when dealing with spent nuclear fuel. The NRC outlines six key components for
safeguarding nuclear materials in transit:
SOLVENCY – ACCIDENTS
While environmentalists opposing nuclear power may exaggerate the proportions of nuclear reactor accidents, the
accidents have in fact not been a big problem.
December 3, 2007 Dispelling Myths About Nuclear Energy by Jack Spencer and Nick Loris
MYTH: Incidents at Davis-Besse, Vermont Yankee, and Kashiwazaki-Kariwa demonstrate that continued use of nuclear power will
lead to another Chernobyl. FACT: The real consequences of these three incidents demonstrate that nuclear power is safe.
Perhaps the greatest myths surrounding nuclear power concern the consequences of past accidents and their association with current
risks. All of these myths depend on a basic construct of flawed logic and misrepresentations that is riddled with logical and factual
errors.
First, the consequences of Chernobyl are overblown to invoke general fear of nuclear power. Next, the Three Mile
Island accident is falsely equated with Chernobyl to create the illusion of danger at home. Finally, any accident, no
matter how minor, is portrayed as being ever so close to another nuclear catastrophe to demonstrate the dangers of new
nuclear power.
This myth can be dispelled outright simply by revisiting the real consequences of Chernobyl and Three Mile Island in terms of actual
fatalities. Although any loss of life is a tragedy, a more realistic presentation of the facts would use these accidents to demonstrate the
inherent safety of nuclear power.
Chernobyl was the result of human error and poor design. Of the fewer than 50 fatalities,[12] most were rescue workers
who unknowingly entered contaminated areas without being informed of the danger.
The World Health Organization says that up to 4,000 fatalities could ultimately result from Chernobyl-related cancers,
but this has not yet happened. The primary health effect was a spike in thyroid cancer among children, with 4,000-
5,000 children diagnosed with the cancer between 1992 and 2002. Of these, 15 children died, but 99 percent of cases
were resolved favorably. No clear evidence indicates any increase in other cancers among the most heavily affected
populations. Of course, this does not mean that cancers could not increase at some future date.
Interestingly, the World Health Organization has also identified a condition called "paralyzing fatalism," which is caused by
"persistent myths and misperceptions about the threat of radiation."[13] In other words, the propagation of ignorance by anti-nuclear
activists has caused more harm to the affected populations than has the radioactive fallout from the actual accident.
The most serious accident in U.S. history involved the partial meltdown of a reactor core at Three Mile Island, but no
deaths or injuries resulted. The local population of 2 million people received an average estimated dose of about 1
millirem--insignificant compared to the 100-125 millirems that each person receives annually from naturally occurring
background radiation in the area.[14]
Other incidents have occurred since then, and all have been resolved safely. For example, safety inspections revealed a
hole forming in a vessel-head at the Davis-Besse plant in Ohio. Although only an inch of steel cladding prevented the
hole from opening, the NRC found that the plant could have operated another 13 months and that the steel cladding
could have withstood pressures 125 percent above normal operations.[15]
A partial cooling tower collapse at the Vermont Yankee plant was far less serious than the Davis-Besse incident but is nonetheless
presented by activists as evidence of the potential risks posed by power reactors. Non-radioactive water was spilled in the collapse, but
no radiation was released.
As for vulnerability to earthquakes, the NRC requires that each nuclear plant meet a set of criteria to protect against
earthquakes.[16] Earthquakes at the Kashiwazaki-Kariwa site demonstrate the effectiveness of modern earthquake
precautions. In 2004, the site survived without incident an earthquake measuring 6.9 on the Richter scale. A slightly
weaker earthquake in July 2007 caused the plant to suspend operations, but inspectors have since concluded that the
plant's safety features performed properly. While some radiation was released, it was well below dangerous levels and
did not come close to approaching Chernobyl-like levels.
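The Three Mile Island dose comparison reduces to simple arithmetic on the figures quoted above:

```python
# Doses in millirem (mrem), using the figures quoted in the evidence.
tmi_average_dose = 1.0       # average dose to the local population from TMI
background_annual = 100.0    # low end of annual natural background in the area

# The accident dose was one percent of a single year's natural background.
frac = tmi_average_dose / background_annual
assert frac == 0.01
```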
SOLVENCY – WASTE
Nuclear waste materials are not being used for violent purposes as the environmentalist
propaganda would have the population believe.
SOLVENCY – WASTE
The nuclear waste can be reused, and the part that is not can be safely deposited in a
remote location.
By Jack Spencer, SPECIAL TO THE WASHINGTON TIMES, Washington Times,
October 28, 2007
But what about the disposal of nuclear waste, the No-Nukers ask? Actually, industry
solved that problem decades ago. Spent fuel is removed from the reactor. The
reusable portion is recycled by separating it and re-using it; the remainder is placed
in either interim or long-term storage, in remote locations such as Yucca Mountain.
Other countries, including France, safely do this every day. Politicians and bad
public policy prevent it from occurring in the U.S.
Waste transportation is another favorite target. The truth is that nuclear waste has
been transported on roads and railways worldwide for years without incident.
Indeed, more than 20 million waste packages are transported globally each year, and
more than 20,000 shipments have traveled some 18 million miles since 1971. It's
just not a problem.
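The transport statistics imply a simple average that underscores the safety record:

```python
# Figures from the card: more than 20,000 shipments of spent fuel and
# high-level waste covering some 18 million miles since 1971.
shipments = 20_000
total_miles = 18_000_000

# Average distance per shipment, with zero incidents over the whole record.
avg_miles_per_shipment = total_miles / shipments
assert avg_miles_per_shipment == 900.0
```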
SOLVENCY – WASTE
DUPIC solves waste problems by using the waste from light water reactors to
fuel heavy water reactors (CANDUs).
SOLVENCY – WASTE
The best way to deal with the waste problem is already being practiced today.
December 3, 2007 Dispelling Myths About Nuclear Energy by Jack Spencer and Nick
Loris
FACT: The nuclear industry solved the nuclear waste problem decades ago.
Spent nuclear fuel can be removed from the reactor, reprocessed to separate unused fuel,
and then used again. The remaining waste could then be placed in either interim or long-
term storage, such as in the Yucca Mountain repository. France and other countries carry
out some version of this process safely every day. Furthermore, technology advances
could yield greater efficiencies and improve the process. The argument that there is no
solution to the waste problem is simply wrong.
"Closing the fuel cycle" by reprocessing or recycling spent fuel would enable the U.S. to
move away, finally, from relying so heavily on the proposed Yucca Mountain repository
for the success of its nuclear program. This would allow for a more reasonable mixed
approach to nuclear waste, which would likely include some combination of Yucca
Mountain, interim storage, recycling, and new technologies. Regrettably, the federal
government banned the recycling of spent fuel from commercial U.S. reactors in 1977,
and the nation has practiced a virtual moratorium on the process ever since.[3]
Nuclear power produces energy through clean, environmentally friendly methods
Greener energy Thursday, April 3, 2008 Former Energy Secretary Spencer Abraham,
For one, it is the most environmentally friendly source of all clean-air electricity options.
In the latest report from the Nobel Prize-winning Intergovernmental Panel on Climate
Change (IPCC), nuclear power was distinguished as an integral part in humanity's
attempt to mitigate the effects of climate change. This is because nuclear power plants
emit zero greenhouse gases or pollutants related to ground-level ozone formation,
smog or acid rain.
The CO2 emissions from the construction of nuclear plants are outweighed in the
long run by the reductions in fossil fuel dependence that nuclear energy delivers.
Nuclear power is a “carbon-free” energy source, and, thus, does not contribute to
climate change. Expanding nuclear power will help stave off climate change.
John M. Deutch and Ernest J. Moniz, Professors at the Massachusetts Institute of Technology, 06
Nuclear power supplies a sixth of the world's electricity. Along with hydropower (which
supplies slightly more than a sixth), it is the major source of "carbon-free" energy
today. The technology suffered growing pains, seared into the public's mind by the
Chernobyl and Three Mile Island accidents, but plants have demonstrated remarkable
reliability and efficiency recently. The world's ample supply of uranium could fuel a
much larger fleet of reactors than exists today throughout their 40- to 50-year life span.
With growing worries about global warming and the associated likelihood that greenhouse gas emissions will be regulated in some
fashion, it is not surprising that governments and power providers in the U.S. and elsewhere are increasingly considering building a
substantial number of additional nuclear power plants. The fossil-fuel alternatives have their drawbacks. Natural gas is attractive in a
carbon-constrained world because it has lower carbon content relative to other fossil fuels and because advanced power plants have
low capital costs. But the cost of the electricity produced is very sensitive to natural gas prices, which have become much higher and
more volatile in recent years. In contrast, coal prices are relatively low and stable, but coal is the most carbon-intensive source of
electricity. The capture and sequestration of carbon dioxide, which will add significantly to the cost, must be demonstrated and
introduced on a large scale if coal-powered electricity is to expand significantly without emitting unacceptable quantities of carbon
into the atmosphere. These concerns raise doubts about new investments in gas- or coal-powered plants. All of which points to a
possible nuclear revival. And indeed, more than 20,000 megawatts of nuclear capacity have come online globally since 2000, mostly
in the Far East. Yet despite the evident interest among major nuclear operators, no firm orders have been placed in the U.S. Key
impediments to new nuclear construction are high capital costs and the uncertainty surrounding nuclear waste management. In
addition, global expansion of nuclear power has raised concerns that nuclear weapons ambitions in certain countries may inadvertently
be advanced.
Nuclear Energy Institute. 2/16/2005. “New Nuclear Power Plants Are Vital to Effective
National Energy Policy, NEI Tells Congress”
(http://www.nei.org/newsandevents/newplantsvital/)
“If the United States is to have an effective national energy policy, it must chart a path for
a diverse energy mix that includes a strong role for nuclear energy,” said John Kane, the
Nuclear Energy Institute’s senior vice president of governmental affairs, in testimony
before the panel. “We simply cannot meet the twin challenges of increased electricity
production and fewer emissions without the reliable, affordable electricity that new
nuclear plants will provide.
“America will need 50 percent more electricity by 2025 to fuel an ever-expanding
economy while at the same time meeting even more stringent environmental goals,”
Kane said. “Nuclear energy provides 70 percent of the emission-free electricity in the
U.S. and is the only readily expandable source of clean energy.”
Nuclear is becoming widely recognized as a safe, effective way to deal with the
current fuel dilemma.
The Boston Globe, "Energy and the Simpsons," by Gilbert J. Brown, a professor of
nuclear engineering and the coordinator of the Nuclear Engineering Program at
UMass-Lowell and a member of the CASEnergy Coalition, August 2, 2007
There are now 104 nuclear electric power reactors safely producing 20 percent of the
nation's electricity. Finally, nuclear is being widely recognized as a safe, economical
source of energy. And because it produces none of the greenhouse gases believed to
be a major factor in climate change, environmental groups are taking a more
favorable stance on nuclear energy as well.
Unlike the '90s when energy consumption was an unquestioned way of life, energy
conservation is now the hot topic in the United States. A recent Gallup poll reports
that Americans rank energy issues as the Number 4 priority for Washington,
coming in behind only Iraq, terrorism and national security, and the economy. As
some of the world's greatest consumers of energy, we are looking for cleaner and
more efficient sources to meet the growing demand for electricity - expected to rise
40 percent in the United States by 2030.
Today, more and more Americans understand that real nuclear by-products are not
uncontrolled green ooze but rather used nuclear fuel that is managed safely and
securely on-site. And, as nuclear technology advances, over 90 percent of used fuel
could be recycled to fuel nuclear power plants again and again. A survey
conducted by the Clean and Safe Energy Coalition last year found that the more
people learn about nuclear, the more supportive they are of it. After a quick lesson
about energy issues and nuclear's capabilities, 73 percent of respondents said that
they felt favorably or somewhat favorably about the use of nuclear. Similarly,
Bisconti Research found that 86 percent of Americans see nuclear energy as an
important part of meeting future electricity needs and 77 percent agree that utilities
should prepare now to build new nuclear plants in the next decade.
Even some policy makers who have been lukewarm to nuclear seem to be coming
around to its merits. People like House Speaker Nancy Pelosi and Senator Barack
Obama are beginning to understand that nuclear energy needs to be part of the
energy mix if we are going to meet our future energy demands safely and cleanly.
Light water reactors are highly effective and well established in providing
nuclear energy.
December 3, 2007 Dispelling Myths About Nuclear Energy by Jack Spencer and Nick
Loris
Given that nuclear fission does not produce atmospheric emissions, NukeFree's carbon
dioxide (CO2) witch-hunt focuses on other, emissions-producing activities surrounding
nuclear power, such as uranium mining and plant construction. Finding fault with nuclear
energy on the basis of these indirect emissions simply holds no merit. Whether the
activists like it or not, the world runs on fossil fuel. Until the nation changes its energy
profile--which can be done with nuclear energy--almost any activity, even building
windmills, will result in CO2 emissions.
The United States has not built a new commercial nuclear reactor in over 30 years, but
the 104 plants operating today prevented the release of 681.9 million metric tons of CO2
in 2005, which is comparable to taking 96 percent of cars off the roads.[2] If CO2 is the
problem, emissions-free nuclear power must be part of the solution.
What makes nuclear energy so exciting from an environmental standpoint is not the
pollution that it has prevented in the past, but the potential for enormous savings in the
future. Ground transportation is a favorite target of the environmental community, and the
members of this community are correct insofar as America's transportation choices are a
primary source of the nation's dependence on and demand for fossil fuels. Plug-in electric
hybrid cars, which require significant development to achieve subsidy-free market
viability, are looked upon as a potential solution to the problem. Yet if the electricity
comes from a fossil-fuel power plant, the pollution is simply transferred from a mobile
energy source to a fixed one, while the problem is solved if the electricity comes from an
emissions-free nuclear plant.
Jang Jin PARK, Myung Seung YANG, Ki Kwang BAE, Hang Bok CHOI, Ho Dong KIM, Jong
Ho KIM, Hyun Soo PARK, “This work has been carried out under the Nuclear Research and
Development Program of Korea Ministry of Science and Technology.” Korea Atomic Energy
Research Institute, “TECHNOLOGY AND IMPLEMENTATION OF THE DUPIC
CONCEPT FOR SPENT NUCLEAR FUEL IN THE ROK”
https://eed.llnl.gov/ncm/session1/JangJin_Park.pdf
One of the important DUPIC features is its excellent proliferation resistance, since no fissile
material is separated in the DUPIC fuel fabrication process. Moreover, DUPIC fuel is refabricated
directly from highly radioactive PWR spent fuel, and therefore access to the sensitive material is
extremely difficult.
A nuclear material accounting system for DUPIC safeguards such as the DSNC (DUPIC Safeguards
Neutron Counter) and ICS (Intelligent Containment and Surveillance) is being developed in
cooperation with the USA. DSNC is a well-type neutron coincidence counter and can measure the
amount of curium in the fuel. Pu and U contents are inferred from the amount of curium. The
proportionality between the coincidence neutron counter rate, burn-up and curium-244 content
has been verified experimentally. It has been proved that DSNC is a reliable technology for use in
DUPIC process safeguards.
The Toronto Star (Newspaper), February 12, 2007, Lexis-Nexis “The Candu edge; Canada's
heavy-water reactors can run on spent fuel from most light-water reactors, eliminating 2
headaches: skyrocketing uranium prices and waste disposal concerns”
He's quick to point out that the DUPIC process is also "proliferation resistant," meaning there is no
chemical separation of the spent uranium's more dangerous components, primarily plutonium, which
could be used by extremists or rogue nations to produce nuclear weapons. Only mechanical processing
is required to change the shape of the spent fuel rods into shorter Candu rods.
Hore-Lacy, Ian. 2006. Environmental scientist, manager of the Uranium Information Centre, Melbourne,
and Head of Communications for the World Nuclear Association. “Nuclear Energy in the 21st Century.”
Since 1987 the USA and countries of the former USSR have signed a series of disarmament treaties to
reduce the nuclear arsenals of the signatory countries by approximately 80%. The weapons contain a great
deal of uranium enriched to over 90% U-235 (i.e. about 25 times the proportion in most reactor fuel). Some
weapons have plutonium-239, which can be used in diluted form in either conventional or fast breeder
reactors.
The surplus of weapons-grade highly enriched uranium (HEU) has led to an agreement between the USA
and Russia for the HEU from Russian warheads and military stockpiles to be diluted for delivery to the
USA and then used in civil nuclear reactors. Under the "megatons to megawatts" deal signed in 1994, the
US government is purchasing 500 tonnes of weapons-grade HEU over 20 years from Russia for dilution
and sale to electric utilities, for US$ 12 billion. This acquisition reached its halfway point in 2005 with the
claim that this eliminated 10,000 nuclear warheads. Weapons-grade HEU is enriched to over 90% U-235
while light water civilian reactor fuel is usually enriched to about 3% to 4%. To be used in most
commercial nuclear reactors, military HEU must therefore be diluted about 25:1 by blending with depleted
uranium (mostly U-238), natural uranium (0.7% U-235), or partially enriched uranium. The contracted
HEU is being blended down to 4.4% U-235 in Russia, using 1.5% U-235 (enriched tails). The 500 tonnes
of weapons HEU is resulting in just over 15,000 tonnes of low-enriched (4.4%) uranium over the 20 years.
This is equivalent to about 153,000 tonnes of natural uranium, more than twice annual world demand.
The purchase and blending down is being done progressively. Since 2000 the dilution of 30 tonnes per year
of military HEU is displacing about 10,600 tonnes of uranium oxide mine production per year, representing
about 13% of the world's reactor requirements.
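The blend-down figures in this card can be checked with a simple U-235 mass balance. The sketch below is illustrative only; the 0.30% tails assay used for the natural-uranium-equivalent step is an assumption the card does not state.

```python
# U-235 mass balance for blending weapons-grade HEU down to reactor fuel,
# using the card's own figures: ~90% HEU blended with 1.5% U-235 "enriched
# tails" to produce 4.4% LEU.
def blend_ratio(x_heu, x_blend, x_product):
    """Tonnes of blendstock required per tonne of HEU."""
    return (x_heu - x_product) / (x_product - x_blend)

r = blend_ratio(0.90, 0.015, 0.044)    # ~29.5 t of blendstock per t of HEU
leu = 500 * (1 + r)                    # total LEU from 500 t of HEU
print(round(leu))                      # ~15,259 t: "just over 15,000 tonnes"

# Natural-uranium equivalent of that LEU (standard feed-to-product ratio),
# assuming 0.711% U-235 natural feed and a 0.30% tails assay (an assumption
# for illustration; the card does not state a tails assay).
feed_per_product = (0.044 - 0.003) / (0.00711 - 0.003)
print(round(leu * feed_per_product))   # ~152,000 t vs. the card's "about 153,000"
```

Both results land within a percent or so of the tonnages the card reports, so its figures are internally consistent.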
In addition, the US Government has declared 174 tonnes of highly enriched uranium (of various
enrichments) to be surplus from its military stockpiles, and this is being blended down to about 4300 tonnes
of reactor fuel. In the short term most of the military uranium is likely to be blended down to 20% U-235,
then stored. In this form it is not useable for weapons.
Disarmament will also give rise to some 150-200 tonnes of weapons-grade plutonium. In 2000 the USA
and Russia agreed to dispose of 34 tonnes each by 2014. While it was initially proposed to immobilize
some of the US portion, the general idea is now to fabricate it with uranium oxide as a MOX fuel for burning
in existing reactors. A plant is under construction in South Carolina for this fuel fabrication, and meanwhile
some trial MOX assemblies (made in France from US military plutonium) are being trialled in a US
reactor.
However, Europe has a well-developed MOX capacity and Japan is developing its use. This suggests that
weapons-grade plutonium could be disposed of relatively quickly. Input plutonium would need to be about
half reactor-grade and half weapons-grade, but using such MOX as 30% of the fuel in one third of the
world’s reactor capacity would remove about 15 tonnes of warhead plutonium per year. This would amount
to burning 3000 warheads per year to produce 110 billion kWh of electricity.
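The MOX disposal figures above can be sanity-checked directly. The per-warhead and per-kilogram numbers in the sketch below are implied by the card's own figures, not stated in it.

```python
# Sanity check of the quoted MOX figures: 15 tonnes of warhead plutonium
# burned per year is described as "3000 warheads per year" yielding
# 110 billion kWh of electricity.
pu_tonnes_per_year = 15
warheads_per_year = 3000
kwh_per_year = 110e9

pu_per_warhead_kg = pu_tonnes_per_year * 1000 / warheads_per_year
print(pu_per_warhead_kg)              # 5.0 kg of plutonium per warhead

kwh_per_kg = kwh_per_year / (pu_tonnes_per_year * 1000)
print(round(kwh_per_kg / 1e6, 1))     # ~7.3 million kWh per kg of Pu consumed
```

Roughly 5 kg of plutonium per warhead and a few million kWh per kilogram burned are plausible orders of magnitude, so the card's three figures hang together.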
Over 35 reactors in Europe are licensed to use mixed oxide fuel, and 20 French reactors are using it or
licensed to use it as 30% of their fuel. New reactors may be able to run with full MOX cores.
Russia intends to use all of its plutonium as a fuel, burning it in both late-model conventional reactors and
particularly in fast neutron reactors. If all the plutonium were used in fast neutron reactors in conjunction
with the depleted uranium from enrichment plant stockpiles, there would be enough to run the world’s
commercial nuclear electricity programmes for several decades without any further uranium mining. In
Russia a thorium-uranium fuel is being developed which is intended to use weapons-grade plutonium in
conventional reactors.
Most of this book is concerned with uranium as a fuel for nuclear reactors. However, in future, thorium is
also likely to be utilized as a fuel for particular reactors. The thorium fuel cycle has some attractive
features, and is described further in section 4.7.
Existing neutron efficiency reactor designs, such as the Canadian Deuterium Uranium (CANDU) reactor,
are capable of operating on a thorium fuel cycle, once they are started using a fissile material such as U-235
or Pu-239. Then the thorium (Th-232) captures a neutron in the reactor to become fissile uranium (U-233),
which continues the reaction. However, there are some practical problems with using thorium this way.
Thorium is about three times as abundant in the Earth’s crust as uranium. Australia and
India have considerable amounts of thorium, and India is developing its whole nuclear
energy programme to make use of it.
December 3, 2007 Dispelling Myths About Nuclear Energy by Jack Spencer and Nick
Loris
This myth relies on creating an illusion of cause and effect. This is why so much anti-
nuclear propaganda focuses on trying to equate nuclear weapons with civilian nuclear
power. Once such a spurious relationship is established, anti-nuclear activists can mix and
match causes and effects without regard for the facts.
Furthermore, this "argument" is clearly irrelevant inside the United States. As a matter of
policy, the United States already has too many nuclear weapons and is disassembling
them at a historic pace, so arguing that expanding commercial nuclear activity in the
United States would somehow lead to weapons proliferation is disingenuous. The same
would hold true for any other state with nuclear weapons.
As for states without nuclear weapons, the problem is more complex than simply arguing
that access to peaceful nuclear power will lead to nuclear weapons proliferation. Nuclear
weapons require highly enriched uranium or plutonium, and producing either material
requires a sophisticated infrastructure. While most countries could certainly develop the
capabilities needed to produce these materials, the vast majority clearly have no intention
of doing so.
For start-up nuclear powers, the preferred method of acquiring weapons-grade material
domestically is to enrich uranium, not to separate plutonium from spent nuclear fuel.
Uranium enrichment is completely separate from nuclear power production. Furthermore,
nothing stops countries from developing a nuclear weapons capability, as demonstrated
by North Korea and Iran. If proliferation is the concern, then proper oversight is the
answer, not stifling a distantly related industry.
Greener energy Thursday, April 3, 2008 Former Energy Secretary Spencer Abraham.
Currently, nuclear power accounts for 20 percent of our nation's energy without carbon
emissions. But because we have not built a power plant since 1986 and because our
electricity demands continue to rise exponentially, that percentage will dwindle down to
15 percent by 2030 and eventually to zero as the last American plants are
decommissioned. This will result in growing reliance on imported fuels and carbon-based
power. In fact, a failure to invest in the creation of new nuclear plants not only impacts
the state of our environment, but it also affects the state of our economy. As the price of
oil exceeds $100 a barrel and our economy hints at a recession, the American public is
understandably growing more concerned about our energy policies.
Why do you favor nuclear energy over other non-carbon-based sources of energy?
Other than hydroelectric energy—which I also strongly support—nuclear is the only
technology besides fossil fuels available as a large-scale continuous power source, and I
mean one you can rely on to be running 24 hours a day, seven days a week. Wind and
solar energy are intermittent and thus unreliable. How can you run hospitals and factories
and schools and even a house on an electricity supply that disappears for three or four
days at a time? Wind can play a minor role in reducing the amount of fossil fuels we use,
because you can turn the fossil fuels off when the wind is blowing. And solar is
completely ridiculous. The cost is so high—California's $3.2 billion in solar subsidies is
all just going into Silicon Valley companies and consultants. It's ridiculous.
Solves energy – Nuclear electricity is competitive with electricity produced by coal, and nuclear energy is
safe and reliable.
Nuclear Energy Institute. 9/20/2004. “Independent Economic Study Confirms Future
Competitiveness of New Nuclear Power Plants”
(http://www.nei.org/newsandevents/econmicstudy/).
The U.S. Department of Energy today released an economic study which concludes that
new nuclear power plants can be highly competitive with baseload, large-scale gas-fired
and coal-fired electricity generation once the first few new nuclear plants have been built.
The independent study was produced by the University of Chicago, under the auspices of
Argonne National Laboratory. The following is a statement from Marvin Fertel, chief
nuclear officer of the Nuclear Energy Institute.
“This independent study ratifies the emerging consensus that new nuclear power plants
are competitive with other baseload sources of electricity once first-of-a-kind engineering
costs are absorbed, construction experience gained, and other near-term financing issues
resolved.
“Notably, the study concludes that even the first of the next series of new nuclear plants
can approach competitiveness if limited federal financial policies are put in place that
afford new nuclear plants the same type of government incentives available to other
sources of electricity generation.
“Nuclear energy supplies one in every five U.S. homes and businesses with safe, clean,
reliable electricity, and can play an even larger role in the future. As the study notes, ‘A
successful transition from oil-based to hydrogen-based transportation could, in the long
run, increase the demand for nuclear energy as a nonpolluting way to produce
hydrogen.’”
Not only is nuclear power very cost effective but it also produces many jobs.
Greener energy Thursday, April 3, 2008 Former Energy Secretary Spencer Abraham,
The average nuclear plant, for instance, generates approximately $430 million in
production of goods and services and provides more than $20 million in state and local
tax revenue benefiting schools, roads and infrastructure. Additionally, each nuclear plant
provides around 1,400-1,800 construction jobs, and 400 to 700 permanent positions to
support continued operations. As evidenced by statistics from the Department of Labor,
these are well-paying jobs with the median annual salary for nuclear engineers standing
at $82,900 — which is some $8,000 higher than all other engineering disciplines except
petroleum engineering. These high wages, in addition to a burgeoning demand for
nuclear engineers, has translated into an increasing number of students seeking degrees in
the field, according to the Energy Department.
Nuclear power provides the most cost effective source of energy available.
A number of analyses say that nuclear power isn't cost competitive, and that without
government subsidies, there's no real market for it. That's simply not true. Where the
massive government subsidies are is in wind and solar. I know that France, which
produces 80 percent of its electricity with nuclear, does not have high energy costs.
Sweden, which produces 50 percent of its energy with nuclear and 50 percent with hydro,
has very reasonable energy costs. I know that the cost of production of electricity among
the 104 nuclear plants operating in the United States is 1.68 cents per kilowatt-hour.
That's not including the capital costs, but the cost of production of electricity from
nuclear is very low, and competitive with dirty coal. Gas costs three times as much as
nuclear, at least. Wind costs five times as much, and solar costs 10 times as much.
Without federal financial policies, the first new nuclear plants coming on line will
have a levelized cost of electricity (LCOE) that ranges from $47 to $71 per
megawatt-hour (MWh), compared to $33 to $41 for coal-fired plants and $35 to $45
for gas-fired plants.
Once engineering costs are paid and the first few plants have been built, the 4th or 5th
new nuclear plants could have costs as low as $33 per MWh.
Federal financial policies combining a 20% investment tax credit and an $18 per
MWh production tax credit for 8 years could lower first-plant nuclear costs to $25 to
$45 per MWh.
The DUPIC system is cost effective because not as much fuel is needed to power the CANDU
reactors.
The Toronto Star (Newspaper), February 12, 2007, Lexis-Nexis “The Candu edge; Canada's
heavy-water reactors can run on spent fuel from most light-water reactors, eliminating 2
headaches: skyrocketing uranium prices and waste disposal concerns”
Mechanical reprocessing, while it has some safety and transportation issues, could be cheaper than
conventional chemical reprocessing.
"Because this is so much simpler, you have to expect the economics are going to be so much better,"
says Torgerson, pointing out that the South Koreans studied the economics of the DUPIC fuel cycle in
the 1990s and found it could compete against other fuel options. "This is one of the characteristics
we're certainly pushing."
For countries such as China, which already have Candu reactors in their fleet, it's an approach that
could prove attractive. AECL estimates that waste fuel from three light-water reactors would be
enough to fuel one Candu.
Duane Bratt, a political science instructor and expert on Canadian nuclear policy at Calgary's Mount
Royal College, says he can envision two revenue streams going to Candu operators that choose to
embrace the DUPIC process.
One stream would be the revenue that comes in through the generation and sale of electricity; the other
would come from a tipping fee that operators of light-water reactors would pay to unload their spent
fuel.
"These (Candu) operators wouldn't be buying the spent fuel, they'd be paid to use the spent fuel for
environmental reasons," says Bratt. "If you can minimize the waste, you bring tremendous value."
The current emphasis of the English and Welsh electricity market on flexibility and
short-termism therefore seems to come with a risk to the national interest. If a nuclear
power plant is constructed, it can reasonably be expected to operate trouble-free and
relatively inexpensively for 40 years or more. Long periods of profitable, trouble-free
operation in fact allow full life-cycle assessments of the economics of nuclear power to
be attractive in principle. Unfortunately, however, these timescales of nuclear power are
poorly matched to the timescales of the human experience. The current time horizons of
the investment community are particularly poorly matched to the time scales needed to
bring new nuclear power plants on stream.
December 3, 2007 Dispelling Myths About Nuclear Energy by Jack Spencer and Nick
Loris
Investors are not averse to nuclear power. Utility companies with nuclear experience have
sought to purchase existing plants, are upgrading their existing power plants, and are
extending their operating licenses so that they can produce more energy for a longer time.
Indeed, nuclear energy is so economically viable that it provides about 20 percent of
America's electricity despite the incredibly high regulatory burden.
However, investors are averse to the regulatory risk associated with building new plants.
The regulatory burden is extreme and potentially unpredictable. In the past, opponents of
nuclear power have successfully used the regulations to raise construction costs by filing
legal challenges, not based on any underlying safety issue, but simply because they
oppose nuclear power.
People who live in Ramsar, Iran, a resort on the Caspian Sea, are exposed to natural
background radiation of 79,000 mrem per year, 5,266 times more than what the EPA’s
15-mrem/year radiation safety standard allows. The local river and its streams have a
high concentration of radium, which is 15 times more radioactive than plutonium. Its
2,000 residents do not have an increased incidence of cancer, as the linear hypothesis
would predict, and their life span is no different than that of other Iranians.
Fortunately, for that resort, EPA regulations don’t apply there, or to people in Guarapari,
Brazil, who get 17,500 mrem of radiation per year with no ill effect.
One place with high background radiation where EPA regulations do apply is a park in Santa Fe, Fountainhead Rock Place. It has
radioactive rock of volcanic origin that emits 760 mrem of gamma radiation, 14 times the amount allowed by the EPA. Regulators,
however, have chosen to make an exception here and have not closed the park off to the public.
A process known as radiation hormesis mediates the beneficial effect that radiation has on
health. Investigators have found that small doses of radiation have a stimulating and
protective effect on cellular function. It stimulates immune system defenses,
prevents oxidative DNA damage, and suppresses cancer.
Epidemiological studies that document the beneficial effects of radiation include one
done on atom bomb survivors. Despite what you might expect, atom bomb survivors in
Nagasaki who received 1,000 to 19,000 mrem of radiation have a lower incidence of
cancer, especially with regard to leukemia and colon cancer, than the non-irradiated
control population. And it is turning out that Japan’s atom bomb survivors are living
longer. They have a death rate after the age of 55 that is lower than matched Japanese
people not exposed to radiation. (Don’t expect to hear this on the evening news.)
SOLVENCY – RADIATION
Low-dose nuclear radiation does not harm humans.
Beyond the variation in national averages there are some locations with strong terrestrial sources
or very high radon concentrations. There are places in India with natural doses of 1500 mrem per year. In
Brazil, Iran, and Sudan, local factors produce natural doses of up to 3800 mrem per year. In a few isolated
places in Europe, radon pockets have given annual doses of 5000 mrem per year. In studying the
populations in these areas, it is hard to find any clear radiation linked health effects at low doses. The
General Accounting Office (June 2000) report on radiation standards, referring to studies on the variation
in natural background radiation and the attempt to link this to increased human health risk, put it this way:
“…we examined 82 studies, which generally found little or no evidence of elevated cancer risk
from high natural background radiation levels. A large number of studies reported a lack of evidence of
cancer risks; some others reported evidence of slightly elevated risk, and some reported evidence of slightly
reduced risks. Overall, the studies' results are inconclusive, but they suggest that at exposure levels of a few
hundred millirem a year and below, the cancer risks from radiation may be either very small or nonexistent."
Other studies have looked at workers within the nuclear weapons production complexes and
variously found no linkage or a strong linkage, depending on the study.
Some examined this data and came to the conclusion that some “extra” radiation is actually good
for you, citing data that indicates that those with somewhat elevated radiation doses were healthier and
lived longer than those with average or lower radiation doses.
Even at doses higher than long-term, low-level exposure, there is some mystery in how humans respond to
radiation. Animal studies indicate that radiation can damage reproductive cells, resulting in genetic effects
in the offspring of the exposed animal. However, exhaustive studies of the atom bomb survivors show no
genetic effects at all, none.
So what are you going to do? Lab tests and first principles physics tell you harm is being done,
but, when you look in the real world, the effects are sufficiently small that they are hard to find. In 1994, the
United Nations Scientific Committee on the Effects of Atomic Radiation summed up the scientific dilemma
this way:
“…there are theoretical reasons based solely on the nature of DNA damage and repair to expect
that cancer can occur at the lowest doses without a threshold in the response, although this effect would
perhaps not be statistically demonstrable.”
Which, translated from the careful wording into plain speech, is: “well, any level of harm is harm, but it
is unlikely that you will even see it at the low dose levels.”
The current treatment of low-level radiation effects is called the “linear no-threshold” model.
What this comes down to is that we can’t see clear effects below 10 to 50 rem (10,000 to 50,000 mrem), but
it still bothers us. So we draw a line from the last clear impact we can see downward to lower doses, making
the assumption that even that tiniest dose does something bad. This means that in estimating the impact of
some radiation release, we would treat 1 mrem given to 100,000 people the same as 100 rem given to a
single person. Those opposed to this view say this is silly because with 100,000 people, you have 100,000
repair mechanisms working to fix the minor harm. The proponents say that might be, but since we can’t see
the actual effects, the linear model assures us that we are bounded because it can’t be any worse.
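The collective-dose bookkeeping described in the paragraph above can be made concrete with a short illustrative calculation (my sketch of the model as the passage describes it, not code from any regulatory source):

```python
# Under the linear no-threshold model, predicted harm scales with
# collective dose (person-rem), so 1 mrem spread over 100,000 people
# is booked the same as 100 rem delivered to a single person.

def collective_dose_rem(people, dose_mrem_each):
    """Collective dose in person-rem (1 rem = 1,000 mrem)."""
    return people * dose_mrem_each / 1000.0

diffuse = collective_dose_rem(100_000, 1)       # 1 mrem each, 100,000 people
concentrated = collective_dose_rem(1, 100_000)  # 100 rem to one person

print(diffuse, concentrated)  # both 100.0 person-rem
```

The critics' point is precisely that this equivalence ignores the 100,000 independent cellular repair mechanisms at work in the diffuse case.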
Recently there has been quite an outcry against the linear no-threshold model, attacking both its
scientific basis and the very large negative impacts its extreme and unwarranted conservatism has on
medical and nuclear fuel cycle operations.
I am not a doctor or a nuclear health physicist, so I can’t really offer an opinion on a scientific
basis. However, logically it would seem that there is a level at which we can repair ourselves. The fact that
globally there is a wide variation in natural radiation without clear health impacts seems to support that. It
would appear that we are straining at gnats when we should be a lot more worried about other things.
SOLVENCY – RADIATION
People who live in Ramsar, Iran, a resort on the Caspian Sea, are exposed to natural
background radiation of 79,000 mrem per year, 5,266 times more than what the EPA’s
15-mrem/year radiation safety standard allows. The local river and its streams have a
high concentration of radium, which is 15 times more radioactive than plutonium. Its
2,000 residents do not have an increased incidence of cancer, as the linear hypothesis
would predict, and their life span is no different than that of other Iranians.
Fortunately, for that resort, EPA regulations don’t apply there, or to people in Guarapari,
Brazil, who get 17,500 mrem of radiation per year with no ill effect.
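The ratio quoted above follows from simple division of the figures given (my arithmetic check, not the source's):

```python
# Ramsar's natural background dose versus the EPA's 15 mrem/year standard.
ramsar_mrem_per_year = 79_000
epa_standard_mrem_per_year = 15

ratio = ramsar_mrem_per_year // epa_standard_mrem_per_year
print(ratio)  # 5266, matching the "5,266 times" figure in the text
```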
One place with high background radiation where EPA regulations do apply is a park in Santa Fe, Fountainhead Rock Place. It has
radioactive rock of volcanic origin that emits 760 mrem of gamma radiation, 14 times the amount allowed by the EPA. Regulators,
however, have chosen to make an exception here and have not closed the park off to the public.
A process known as radiation hormesis mediates the beneficial effect that radiation has on
health. Investigators have found that small doses of radiation have a stimulating and
protective effect on cellular function. It stimulates immune system defenses,
prevents oxidative DNA damage, and suppresses cancer.
Epidemiological studies that document the beneficial effects of radiation include
one done on atom bomb survivors. Despite what you might expect, atom bomb
survivors in Nagasaki who received 1,000 to 19,000 mrem of radiation have a lower
incidence of cancer, especially with regard to leukemia and colon cancer, than the non-
irradiated control population. And it is turning out that Japan’s atom bomb survivors
are living longer. They have a death rate after the age of 55 that is lower than matched
Japanese people not exposed to radiation. (Don’t expect to hear this on the evening
news.)
SOLVENCY – RADIATION
Radiation causes beneficial medical effects, not death and mutation as thought.
Ludwig E. Feinendegen, Medical Doctor,
http://bjr.birjournals.org/cgi/content/abstract/78/925/3, “Evidence for beneficial low level
radiation effects and radiation hormesis”, The British Journal of Radiology, 2005
Abstract. Low doses in the mGy range cause a dual effect on cellular DNA. One is a
relatively low probability of DNA damage per energy deposition event and increases in
proportion to the dose. At background exposures this damage to DNA is orders of
magnitude lower than that from endogenous sources, such as reactive oxygen species.
The other effect at comparable doses is adaptive protection against DNA damage from
many, mainly endogenous, sources, depending on cell type, species and metabolism.
Adaptive protection causes DNA damage prevention and repair and immune
stimulation. It develops with a delay of hours, may last for days to months, decreases
steadily at doses above about 100 mGy to 200 mGy and is not observed any more after
acute exposures of more than about 500 mGy. Radiation-induced apoptosis and terminal
cell differentiation also occur at higher doses and add to protection by reducing genomic
instability and the number of mutated cells in tissues. At low doses reduction of damage
from endogenous sources by adaptive protection may equal or outweigh
radiogenic damage induction. Thus, the linear-no-threshold (LNT) hypothesis for cancer
risk is scientifically unfounded and appears to be invalid in favour of a threshold or
hormesis. This is consistent with data both from animal studies and human
epidemiological observations on low-dose induced cancer. The LNT hypothesis should
be abandoned and be replaced by a hypothesis that is scientifically justified and causes
less unreasonable fear and unnecessary expenditure.
SOLVENCY – RADIATION
Nuclear plants do not emit any dangerous amounts of radiation.
December 3, 2007 Dispelling Myths About Nuclear Energy by Jack Spencer and Nick
Loris
MYTH: Nuclear power releases dangerous amounts of radiation into the atmosphere.
FACT: Nuclear power plants do emit some radiation, but the amounts are
environmentally insignificant and pose no threat.
This myth relies on taking facts completely out of context. By exploiting public fears of
anything radioactive and not educating the public about the true nature of radiation and
radiation exposure, anti-nuclear activists can easily portray any radioactive emissions as a
reason to stop nuclear power. However, when radiation is put into the proper context, the
safety of nuclear power plants is clear.
Nuclear power plants do emit some radiation, but the amounts are environmentally
insignificant and pose no threat. These emissions fall well below the legal safety limit
sanctioned by the Nuclear Regulatory Commission (NRC).
Indeed, less than 1 percent of the public's exposure to radiation comes from nuclear
power plants. The average American is exposed to 360 millirem of radiation a year.[4]
About 83 percent (300 millirem) of this annual radiation dose comes from natural
sources, such as cosmic rays, uranium in the Earth's crust, and radon gas in the
atmosphere. Most of the rest comes from medical procedures such as X-rays, and about 3
percent (11 millirem) comes from consumer products.[5]
The Department of Energy reports that living near a nuclear power plant exposes a person
to 1 millirem of radiation a year.[6] By comparison, an airline passenger who flies from
New York to Los Angeles receives 2.5 millirem.[7] As Chart 1 illustrates, radiation
exposure is an unavoidable reality of everyday life, and radiation exposure from living
near a nuclear power plant is insignificant.
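The percentages in this card follow directly from the quoted millirem figures; a minimal sanity check using only the numbers given above:

```python
# Annual radiation dose breakdown for the average American, per the
# figures quoted in the card (360 mrem total).
total = 360.0
natural = 300.0      # cosmic rays, crustal uranium, radon
consumer = 11.0      # consumer products
near_plant = 1.0     # living near a nuclear power plant

print(round(100 * natural / total))        # 83 percent from natural sources
print(round(100 * consumer / total))       # 3 percent
print(round(100 * near_plant / total, 1))  # 0.3 -- under 1 percent
```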
SOLVENCY – TERRORISM
Nuclear power plants are fully protected in the case of an attack.
By Jack Spencer, SPECIAL TO THE WASHINGTON TIMES, Washington Times,
October 28, 2007
The new No-Nuke crowd then warns of the ripe targets that nuclear plants provide
terrorists. Really? Now Jackson Browne is a terrorism expert? But his credibility is,
we must say, "Running on Empty." Nuclear plants were among the nation's most
protected assets before September 11, 2001, and have had numerous security
upgrades since. But none of the world's 443 nuclear power plants have been
attacked. Why?
Simply put, they're not easy targets. Nuclear plants are built to withstand airplane
impacts, are heavily guarded and are under constant review. If risks are discovered,
the answer is to fix the problem, not shut down the industry.
SOLVENCY – TERRORISM
Nuclear power plant facilities are adequately prepared against terrorist attacks.
Mark Thompson, TIME magazine author, 6-12-05
For his part, Diaz insists that the improvements made in the nation's nuclear plants
since 9/11 are adequate. They have included adding physical barriers, checking
approaching vehicles at greater stand-off distances and improving coordination with local
police and military authorities. Says the NRC chief: "Any terrorist who looks at one of
these facilities is going to say, 'This is a hardened target, and I'm not going to have
any confidence that I am going to be successful [attacking it].'" Plants have also
improved training for guards and capped their workweeks at 72 hours to eliminate the
not-uncommon tendency of overworked employees to fall asleep on duty. Previously,
guards sometimes worked 80 to 90 hours a week. The NRC chief says that when it
comes to hiring, plant operators are using "a much finer-toothed comb" than before
9/11 to keep troublemakers out. Potential employees are screened through numerous
databases, checked for, among other things, mental-health problems, criminal records
and questionable behavior in previous jobs. The NRC's confidence in its "insider
mitigation program" is so high that the DBT specifically rules out the need to defend
against an "active violent insider"--a turncoat employee willing to shoot and kill fellow
workers. The DBT does consider the possibility of a single, nonviolent insider working
with the terrorists. The Peach Bottom Atomic Power Station in southeastern Pennsylvania
is a good place to see some of the enhancements ordered by the NRC after 9/11. The
facility is newly ringed with 990 11-ton concrete blocks and $200-a-foot fencing topped
with razor wire. Ten new guard towers-- some six stories high--give armed guards broad
vistas of possible approaches to the plant. "Since 9/11 we have more security officers
here, and we've enhanced their weaponry," says Jeff Benjamin, a vice president of
Exelon Corp., which operates the plant on the bank of the Susquehanna River. "We have
a number of sensors, cameras and lighting," he told a visiting TIME correspondent,
declining to elaborate for security reasons. The reactor itself is deep inside walls of
concrete and steel. Says Benjamin: "All of the design and construction we do to keep
bad stuff in is also pretty darn good at keeping bad stuff out."
SOLVENCY – TERRORISM
Nuclear reactors are well defended, and the risk of a successful terrorist attack is very
slim.
December 3, 2007 Dispelling Myths About Nuclear Energy by Jack Spencer and Nick
Loris
FACT: Nuclear reactors are designed to withstand the impact of airborne objects
like passenger airplanes, and the Nuclear Regulatory Commission has increased
security at U.S. nuclear power plants and has instituted other safeguards.
A successful terrorist attack against a nuclear power plant could have severe
consequences, as would attacks on schools, chemical plants, or ports. However, fear of a
terrorist attack is not a sufficient reason to deny society access to any of these critical
assets.
The United States has 104 commercial nuclear power plants, and there are 446
worldwide. Not one has fallen victim to a successful terrorist attack. Certainly, history
should not beget complacency, especially when the stakes are so high. However, the NRC
has heightened security and increased safeguards on site to deal with the threat of
terrorism.
A deliberate or accidental airplane crash into a reactor is often cited as a threat, but
nuclear reactors are structurally designed to withstand high-impact airborne threats, such
as the impact of a large passenger airplane. Furthermore, the Federal Aviation
Administration has instructed pilots to avoid circling or loitering over nuclear or
electrical power plants, warning them that such actions will make them subject to
interrogation by law enforcement personnel.[8]
The right response to terrorist threats to nuclear plants--like threats to anything else--is
not to shut them down, but to secure them, defend them, and prepare to manage the
consequences in the unlikely event that an incident occurs. Allowing the fear of terrorism
to obstruct the significant economic and societal gains from nuclear power is both
irrational and unwise.
In those countries that have moved towards market liberalization it would seem
that there can be no going back. The 2003 UK Energy White Paper reports:
“Liberalized energy markets are a cornerstone of our energy policy. Competitive
markets incentivize suppliers to achieve reliability. “
Before we look at the present and consider the future, it is important to recognize that
nuclear power, in every country where it was developed, benefited from huge amounts of
public subsidy. These subsidies acted directly for a plant development and in some cases
indirectly via defense expenditures for naval propulsion and nuclear weapons. Detractors
make a persuasive argument that such defense-sunk costs have given nuclear an unfair
advantage over alternative generation technologies. Furthermore it can be argued that this
advantage persists to this day. It should be remembered, however, that gas turbine
technology underpinning combined cycle gas turbine and natural gas-fired power stations
is also a beneficiary of public subsidies to aeroplane engine manufacture, and hence,
these gas turbine innovations also represent a military technology development.
Giving nuclear energy corporations incentives, such as subsidies, is the only way to
begin the “nuclear renaissance”.
For the most part, nuclear fission is a mature technology. While numerous
innovations and improvements would surely lie ahead in the event of a nuclear
renaissance, the fundamentals of the fission process, its development and its industrial
scale deployment are now well known. This is in contrast to several of the renewable
forms of low carbon electricity generation, such as solar photovoltaics and wind turbines.
In many respects these technologies are in an analogous position to nuclear power in the
1960s. In considering a level playing field between technologies in the electricity
generation market it would seem only fair that renewables receive substantial public
subsidy at this time. Recent policy announcements in the UK and the US are consistent
with such thinking.
The nuclear energy industry does not provide capital; incentives are needed.
Nuclear power capital costs fall into two broad categories: the physical
plant and the project finance costs. Costs associated with the physical plant include labor,
materials and infrastructure, while the project finance costs include the cost of capital
(discount rate over project development time), the ability to benefit from economies of
scale and learning. The nuclear industry and energy policy makers recognize the
difficulties caused to highly capital-intensive technologies by the move to liberalized
electricity markets, and significant attention has been devoted to better understanding the
problem and to reducing capital costs as far as possible.
Clearly nuclear power, with its large up-front capital costs, is unattractive
when costs of capital (discount rates) are high. Importantly the cost of such capital is
completely beyond the control of the nuclear industry. In simple terms it is given by the
return that the plant developers would have to pay to match the returns on offer to
investors from other parts of the economy. The IEA report notes that doubling the
discount rate from 5 to 10% increases the levelized cost of nuclear electricity by an
average of 50%. In comparison, the equivalent figures are 28% for coal fired generation
and only 12% for combined cycle gas turbine generation.
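The IEA's sensitivity figures can be illustrated with a standard annuity calculation. The capital shares below are assumed round numbers chosen for illustration, not the IEA's actual inputs, but they reproduce the qualitative pattern: the more capital-intensive the plant, the harder a doubled discount rate bites.

```python
# Sketch: levelized cost of a plant with a given capital share, relative
# to its cost at a 5% discount rate. Capital is annuitized with the
# capital recovery factor; fuel/O&M is assumed insensitive to r.

def crf(r, years):
    """Capital recovery factor: annual payment per unit of up-front capital."""
    return r * (1 + r) ** years / ((1 + r) ** years - 1)

def relative_lcoe(capital_share, r, years=40):
    # Normalized so the 5% case costs exactly 1.0.
    return capital_share * crf(r, years) / crf(0.05, years) + (1 - capital_share)

# Assumed capital shares of total levelized cost (illustrative only).
for name, share in [("nuclear", 0.70), ("coal", 0.45), ("gas CCGT", 0.20)]:
    rise = relative_lcoe(share, 0.10) - 1
    print(f"{name}: ~{rise:.0%} higher cost when the discount rate doubles")
```

With these assumed shares the sketch yields increases in roughly the same rank order and magnitude as the IEA's 50% / 28% / 12% figures.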
Incentives are needed to start up the nuclear energy industry which would lead to
more investors, overall starting up the nuclear renaissance.
As Malcolm Grimston and Peter Beck point out, issues of economic risks
are central to nuclear power plant investment decisions. There have been numerous
examples of construction cost over-runs and delays. Regulatory risks are significant in
nuclear power plant planning and licensing. There are also risks of generic defects in
design and of accidents during construction. In a free market these risks would all fall to
the investors before a single unit of electricity had been sold. Investors confronting issues
of nuclear power would therefore seek to internalize these issues of economic risk via a
substantial risk premium.
Investors’ negative perceptions of the economic risks associated with
nuclear power are in addition to any aversion to invest that might arise from political
attitudes among the community of investors. For instance, early ethical investment funds
typically refused to invest in any nuclear power related projects. Recently with the
explosion in these forms of investment the ethical fund management community have
become more pragmatic and discerning. This has been characterized as being a move
from a dark green to a light green perspective on investment decisions. Nevertheless with
the growth of such lifestyle investments the future of nuclear power is vulnerable in ways
that did not apply in the old days of state control.
Grimston and Beck’s argument goes further, in that the economic risks of
nuclear power tend to fall to the private investors during the ten years, or more, of power
plant construction and licensing. By contrast, however, the economic risks associated
with gas-fired generation tend to relate to the operational phase and the risk of primary
fuel price volatility once electricity is being sold. In the case of gas-fired generation these
economic risks do not fall on the original investors in the construction, but rather are
passed directly to electricity consumers.
TOPICALITY – AFFIRMATIVE
Nuclear energy is an alternative energy and, thus, is topical.
Random House Unabridged Dictionary, 2006
Alternative energy- energy, as solar, wind, or nuclear energy, that can replace or
supplement traditional fossil-fuel sources, as coal, oil, and natural gas.
TOPICALITY – NEGATIVE
Since nuclear power uses up natural resources, namely uranium, it is NOT an
alternative energy.
Compact OED, 2008
alternative energy-
• noun energy fuelled in ways that do not use up natural resources or harm the
environment.
I think that it is fair to say that prior to the passage of the Energy Policy Act of
2005 ("the Act"), investors were not exactly beating down the door of the NRC to file
applications for Early Site Permits or COLs. However, the Act provides a number of
significant financial incentives to the first few plants that enter the COL process, are
built and ultimately begin to operate. These incentives, combined with rising fossil fuel costs,
rising wholesale market prices, and growing recognition that CO2 prices may be imposed
at some point within the life of a new plant that enters construction today, have
stimulated much more serious interest among investors in building new nuclear plants.
The Act provides for a 1.8 cent/kWh investment tax credit for new nuclear
capacity during its first 8 years of operation. This subsidy is limited to no more than
$125 million per year per 1,000 Mw of capacity and no more than 6,000 Mw of new
capacity can receive this subsidy. In addition, new nuclear plants are eligible to apply for
loan guarantees for up to 80% of a plant's construction cost. These loan guarantees will
reduce the cost of debt financing for projects that receive them and allow the financing
of
the projects to be more highly leveraged. These subsidies reduce the life-cycle costs of a
new nuclear plant by on the order of $20/Mwh, assuming that they operate with 85%
capacity factors (IEA, p. 376). The Act also provides "insurance" against regulatory
delays for the first 6,000 Mw of new capacity that applies for a COL. The first two plants
are eligible for up to $500 million of payments for the costs of regulatory delay and the
next four plants for up to $250 million each. The details of how much in loan guarantees
will actually be made available by the federal government (all generating plants that do
not produce greenhouse gases are eligible), how investment tax credits will be allocated if
more than 6,000 Mw of new capacity enters service during the eligibility window
specified in the Act, and how the costs of regulatory delay will be determined are yet to
be specified by the federal government.
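The interaction between the 1.8 cent/kWh credit and the $125 million annual cap can be checked with back-of-envelope arithmetic (my calculation from the figures quoted above, not from the Act's text):

```python
# A 1,000 MW plant at an 85% capacity factor generates about 7.45 billion
# kWh a year; at 1.8 cents/kWh the uncapped credit (~$134M) exceeds the
# $125M/year cap, so the cap binds.
capacity_kw = 1_000 * 1_000   # 1,000 MW expressed in kW
capacity_factor = 0.85
hours_per_year = 8_760
credit_per_kwh = 0.018        # dollars

annual_kwh = capacity_kw * capacity_factor * hours_per_year
uncapped = annual_kwh * credit_per_kwh
paid = min(uncapped, 125_000_000)

print(f"uncapped ${uncapped/1e6:.0f}M, paid ${paid/1e6:.0f}M")
```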
ALTERNATE CAUSES
Global Economic Instability in Housing Markets
New York Times 2008
(April 14 2008, “Housing Woes In U.S. Spread Around Globe”, pg online @
http://proquest.umi.com/pqdweb?index=0&did=1461417571&SrchMode=1&sid=10&F
mt=3&VInst=PROD&VType=PQD&RQT=309&VName=PQD&TS=1214675356&clien
tId=1566)
The collapse of the housing bubble in the United States is mutating into a global
phenomenon, with real estate prices swooning from the Irish countryside and the Spanish
coast to Baltic seaports and even parts of northern India. This synchronized global
slowdown, which has become increasingly stark in recent months, is hobbling economic growth
worldwide, affecting not just homes but jobs as well. In Ireland, Spain, Britain and
elsewhere, housing markets that soared over the last decade are falling back to earth. Property
analysts predict that some countries, like this one, will face an even more wrenching adjustment
than that of the United States, including the possibility that the downturn could become a
wholesale collapse. To some extent, the world's problems are a result of American
contagion. As home financing and credit tightens in response to the crisis that began in
the subprime mortgage market, analysts worry that other countries could suffer the
mortgage defaults and foreclosures that have afflicted California, Florida and other states. Citing
the reverberations of the American housing bust and credit squeeze, the International
Monetary Fund last Wednesday cut its forecast for global economic growth this year and warned that
the malaise could extend into 2009. "The problems in the U.S. are being transmitted to
Europe," said Michael Ball, professor of urban and property economics at the University of Reading in
Britain, who studies housing prices. "What's happening now is an awful lot more grief than we
expected." For countries like Ireland, where prices were even more inflated than in the United States, it has
been a painful education, as homeowners learn the American vocabulary of misery. "We know we're already
in negative equity," said Emma Linnane, a 31-year-old university administrator. She bought a cozy,
one-bedroom apartment in the Dublin suburbs with her fiance, Paul Colgan, in May 2006, at the peak of the
market. They paid $575,000 -- at least $100,000 more than it would fetch today. "I sometimes get shivers
thinking about it," Ms. Linnane said, "but I'll let the reality hit me when I go to sell it." That reality is
spreading. Once-sizzling housing markets in Eastern Europe and the Baltic states are cooling
rapidly, as nervous Western Europeans stop buying investment properties in Warsaw, Tallinn, Estonia and
other real estate Klondikes. Further east, in India and southern China, prices are no longer
surging. With stock markets down sharply after reaching heady levels, people do not have as
much cash to buy property. Sales of apartments in Hong Kong, a normally hyperactive market, have
slowed recently, with prices for mass-market flats starting to drop.
Pursuing nuclear power also perpetuates the myth that increasing atomic energy, and thus
increasing uranium enrichment and spent-fuel reprocessing, will increase neither
terrorism nor proliferation of nuclear weapons. This myth has been rejected by both the
International Atomic Energy Agency and the U.S. Office of Technology Assessment.
More nuclear plants means more weapons materials, which means more targets,
which means a higher risk of terrorism and proliferation. The government admits that
Al Qaeda already has targeted U.S. reactors, none of which can withstand attack by a
large airplane. Such an attack, warns the U.S. National Academy of Sciences, could cause
fatalities as far away as 500 miles and destruction 10 times worse than that caused by the
nuclear accident at Chernobyl in 1986.
Nuclear energy actually increases the risks of weapons proliferation because the
same technology used for civilian atomic power can be used for weapons, as the cases
of India, Iran, Iraq, North Korea and Pakistan illustrate. As the Swedish Nobel Prize
winner Hannes Alfvén put it, "The military atom and the civilian atom are Siamese twins."
Yet if the world stopped building nuclear-power plants, bomb ingredients would be
harder to acquire, more conspicuous and more costly politically, if nations were caught
trying to obtain them. Their motives for seeking nuclear materials would be unmasked as
military, not civilian.
COST. In its first four decades, nuclear power cost this country more than $492 billion,
by conservative estimate-nearly twice the cost of the Vietnam War and the Apollo moon
missions combined-according to a study titled "The Economic Failure of Nuclear Power."
Even with those astronomical numbers, government and industry have deliberately
underestimated many costs for nuclear power, such as those for the permanent
disposal of nuclear wastes, the "decommissioning" (shutting-down and cleaning-up) of
retired nuclear power plants (which can be more than $4 billion per reactor), and the
consequences of nuclear accidents, all of which, according to the authors, could well
total another $375 billion. (As they say, a hundred billion here, a hundred billion there,
pretty soon you're talking about real money.)
In return for this massive investment, we have an energy source that contributes about 8 to 10 percent of our total energy consumption.
According to the same study, nuclear power has received more than $97 billion in direct and indirect subsidies from the federal
government, such as deferred taxes, fuel fabrication writeoffs, and artificially low limits on liability in case of nuclear accidents
(thanks to the Price-Anderson Act). No other industry has enjoyed such privilege.
Peter Grinspoon, director of Greenpeace's Nuclear Power Campaign at the time of the
report, said that "without even counting liabilities such as accidents and waste, nuclear
power has failed on economic grounds. Nuclear power is untenably expensive. ... It
simply can't compete."
Nuclear energy will always produce radioactive waste, for which there is no safe
storage method or site.
Jim Rice, Sojourners Magazine, August 2007, Volume 36, Issue 8, “Is Nuclear Power the
Answer?” From Proquest.com
The nuclear industry has no answer to the waste problem. Some nuclear backers
propose the "reprocessing" of spent fuel to extract plutonium and uranium, which can be
used in nuclear weapons or as new fuel for nuclear power plants. The Natural Resources
Defense Council warns that such projects threaten to "compromise efforts to keep
dangerous nuclear technology out of unsafe hands and substantially increase the flow of
nuclear waste for which there is no established means of disposal."
The vulnerability of nuclear power plants as potential targets of terrorist attack was
recognized long before 9/11, and concerns have only heightened since then. After Sept.
11, Dr. Edwin Lyman, a physicist and scientific director for the Washington-based
Nuclear Control Institute, said that a similar strike on a nuclear plant by a commercial
airliner "would in fact have a high likelihood of penetrating a containment building" and
that as a result "the possibility of an unmitigated loss of coolant accident and significant
release of radiation into the environment is a very real one." In other words, nuclear
power plants contain the potential to turn a conventional terrorist attack into, in
effect, a massive "dirty bomb," with the resultant spread of radioactive material. A
report from the Government Accountability Office criticized efforts by the Nuclear
Regulatory Commission to implement new security plans, concluding that the "NRC
cannot yet provide assurances that its efforts will protect nuclear power plants against
terrorist attacks."
Paul Gunter, Director of the Reactor Watchdog Project at NIRS and a report author, Nuclear Information
and Resource Service, March 2001, http://www.nirs.org/factsheets/pbmrfactsheet.htm, “THE PEBBLE
BED MODULAR REACTOR (PBMR)”
NO REACTOR CONTAINMENT BUILDING AND REDUCED SAFETY SYSTEMS CUT PBMR COSTS
Unlike light water reactors that use water and steam, the PBMR design would use pressurized helium heated in the
reactor core to drive a series of turbine compressors that attach to an electrical generator. The helium is cycled to a
recuperator to be cooled down and returned to cool the reactor while the waste heat is discharged to the
environment. Designers claim there are no accident scenarios that would result in significant fuel damage and
catastrophic release of radioactivity.
These industry safety claims rely on the heat resistant quality and integrity of the tennis ball-sized graphite fuel
assemblies or "pebbles," 400,000 of which are continuously fed from a fuel silo through the reactor "little by little"
to keep the reactor core only marginally critical. Each spherical fuel element has an inner graphite core embedded
with thousands of smaller fuel particles of enriched uranium (up to 10 %) encapsulated in multi-layers of non-porous
hardened carbon. The slow circulation of fuel through the reactor provides for a small core size that minimizes
excess core reactivity and lowers power density, all of which is credited to safety.
However, so much credit is given to the integrity and quality control of the coated fuel pebbles to retain the
radioactivity that no containment building is planned for the PBMR design. While the elimination of the
containment building provides a significant cost savings for the utility—perhaps making the design economically
feasible—the trade-off is public health and safety.
The protective containment building also is nixed because it would hinder the design’s passive cooling feature of the
reactor core through natural convection (air cooling). Exelon also proposes a dramatic reduction in additional reactor
safety systems and procedures (i.e. no emergency core cooling system and a reduced one-half mile emergency
planning zone as compared to a 10-mile emergency planning zone for light water reactors) to provide for further
reducing PBMR construction and operation costs.
To date, however, Exelon has not submitted to the Nuclear Regulatory Commission descriptions of challenges that
could lead to a radiological accident such as a fire that ignites the combustible graphite loaded into the core. Fire and
smoke then become the transport vehicle for radioactivity released to the environment from damaged fuel.
In addition, the lack of containment would require 100%-perfect quality control in the manufacture of the fuel
pellets—an impossible goal. Imperfections in fuel pellet manufacture could lead to higher radiation releases during
normal operation than is the case with conventional reactors.
Paul Gunter, Director of the Reactor Watchdog Project at NIRS and a report author, Nuclear Information
and Resource Service, March 2001, http://www.nirs.org/factsheets/pbmrfactsheet.htm, “THE PEBBLE
BED MODULAR REACTOR (PBMR)”
A single 110-megawatt PBMR will produce 2.5 million irradiated fuel elements during a 40-year
operational cycle. Nuclear waste remains dangerous over geological spans of time and a threat to life from
radioactive contamination would persist long after a PBMR has closed. The health and environmental
uncertainties associated with a historically mismanaged radioactive legacy from continued operation of
nuclear technology are yet another reason the public will not accept the PBMR.