

The plan codifies outside-the-fence regulation --- justifies
limitless federal grid regulation
Tribe 15, (Carl M. Loeb University Professor, Harvard University and Professor of
Constitutional Law, Harvard Law School, EPA'S PROPOSED 111(d) RULE FOR
EPA's plan spectacularly fails that test, and the rule of law commands us to be consistent. Some people seem to practice fair weather federalism, rediscovering States' rights when it allows them to sustain a federal policy they favor, but abandoning the same principles when it suits them. The Constitution demands more than that. EPA's proposal would comprehensively re-order national electricity policy, allowing the agency to seize the role of National Electricity Czar and to elbow state as well as federal regulators out of the way. For the first time in the agency's history, EPA's proposal contains standards requiring outside-the-fence actions, i.e., measures (such as energy conservation programs or renewable energy quotas) that take place entirely outside the physical boundaries of the electrical generating units identified in the Clean Air Act and traditionally subject to EPA jurisdiction. EPA evidently believes it is empowered to regulate anyone who might have an effect on CO2 emissions from power plants, up to and including the retail consumer who uses electricity from a plant to recharge her phone. The proposal also seeks to regulate renewable energy generators, energy distributors, and large industrial or commercial users of electricity.

This sort of plant to plug regulation would permit the EPA to regulate any use of electricity as long as it affects CO2 emissions, a standard that would reach virtually every use of electricity in the United States. There is no limiting principle. The Affordable Care Act may not compel health insurance consumers to eat or buy broccoli, but EPA seeks to interpret the Clean Air Act to allow it to regulate every watt used in growing broccoli and moving it to the market, as well as every watt used for any other activity within a State. Even assuming that such an ambitious and unprecedented plan was precisely what Congress directed EPA to promulgate (and the statute, as I will show, makes clear that Congress did the opposite), the plan would dramatically violate the Tenth Amendment's well-established anticommandeering principle. Indeed, this plan would violate that principle in a remarkably sweeping and novel way, well beyond EPA's usual mandate. It would require States to base their energy and emissions policies on the needs of other States (and even other nations, such as Canada) with which they are inextricably linked through the power distribution system, the national power grid. And the breathtaking scope of authority asserted by EPA to regulate outside-the-fence would give it greater power than Congress has granted even to the Federal Energy Regulatory Commission (FERC), even though national grid management lies within FERC's mandate rather than EPA's and falls far outside EPA's expertise. Both the extent of federal interference and the degree of coercion are qualitatively different here from those present in any other Clean Air Act program. One would expect that, if Congress had intended to confer such revolutionary power on EPA to command the States to do the Federal Government's bidding, it would have said so clearly. Indeed, as I will show, core constitutional principles and precedents governing the Federal-State relationship plainly forbid such blatant federal commandeering. The Supreme Court has instructed that Congress does not hide elephants in mouseholes. If ever there were an elephant in a mousehole, the EPA's plan is it, and it's an unconstitutional elephant to boot.

Ensures blackouts
Scherman 15, Former General Counsel of the Federal Energy Regulatory
Commission and currently Chairman of the Energy, Regulation and Litigation Group
at Gibson, Dunn & Crutcher LLP (William, EPA's Dangerous Desire To Become
America's Energy Regulator,
it should be a concern to every American that does not want to work in the dark and sleep with the lights on that this federal agency, the one that would fail first year electrical engineering, is the very same agency proposing to radically change the Nation's power grid: the Environmental Protection Agency. The EPA's slow but steady creep toward becoming the Nation's energy czar stems from its actions under the Clean Air Act, last amended in 1990. Under the CAA, the EPA has the regulatory authority to limit certain pollutants and other smog-forming substances emitted from the smokestacks of fossil fuel-fired power plants (coal, natural gas, oil, etc.). Sometimes the EPA's measures have directly impacted the cost of generating power and, as a result, have indirectly influenced the States' and power producers' planning decisions. But the EPA's past actions have always stopped there. The EPA has never before sought to radically trample upon the individual States' ultimate and historic authority over the fundamental question of how to ensure a reliable, affordable, and sustainable supply of energy to the millions of homes and businesses from coast to coast. Moreover, there is another federal agency, the Federal Energy Regulatory Commission, that is an expert on the electricity market and whose mission is to assure Reliable, Efficient and Sustainable Energy for Customers of the Nation's bulk power system.

EPA's proposed Clean Power Plan would dramatically change all this and in doing so would arrogate to EPA control over how every American gets and uses electricity service, whether that customer is a homeowner in Des Moines, a factory in Ohio, or someone running their air conditioner in California. Simply put, the EPA's proposal reaches into every aspect of the generation and use of electricity in the United States through its so-called plant to plug approach to CO2 emissions. The EPA's sweeping new plant to plug approach is radically different from any other regulation the EPA has previously imposed on electricity generators. Instead of merely saying to an existing power plant thou shalt not emit more than X from your smokestack, the Clean Power Plan would insinuate the EPA into every aspect of the Nation's energy grid. For instance, the EPA proposes to reduce the use (demand in industry terms) of electricity by requiring States to impose energy efficiency standards that meet the EPA's approval. This is a laudable goal to be sure, but the demand reductions the EPA proposes cannot possibly be implemented as quickly as they want (if ever), and if they could, they would change consumer and industrial consumption patterns forever. At the same time, the EPA is requiring States to massively shift generation away from fossil fuel-fired power plants to renewable sources of electricity such as wind and solar. But as any first year electrical engineering student knows, you can't simply substitute wind or solar power for coal power on a megawatt-for-megawatt basis if you want to keep the lights on when the wind isn't blowing or the sun isn't shining. The electric grid simply does not work that way. Whether the EPA even has the legal authority to promulgate the Clean Power Plan will eventually be decided by the Courts. But, putting the EPA's lack of legal authority aside, the key question every American should be asking is whether EPA has the substantive and technical expertise to be the Nation's energy regulator. With the greatest respect, an agency that views all electricity as fungible lacks the substantive expertise to adequately consider the impact its proposed rule might have on the reliability of the electric grid and the long-term effect on costs to the American electricity consumer. That expertise has always rested with FERC and the States. There has been a great deal written about how the Clean Power Plan was developed and proposed without much of a role for FERC. One FERC Commissioner even testified to Congress that in a meeting with the EPA's Joe Goffman and Janet McCabe, the EPA refused to allow FERC to look at documents relating to the Clean Power Plan. And, when FERC did have some initial views, the EPA appears to have simply ignored FERC's advice. The Director of FERC's Office of Reliability memorialized in a memo that in one private meeting, FERC advised the EPA that it had doubts about the EPA's proposal to vastly increase the use of natural gas-fired generation in lieu of coal-fired generation. FERC also advised that there were unresolved questions about the proposed increased reliance on renewables, and that the EPA's aggressive timeline for relying on renewables would be difficult to accomplish. In essence, FERC, the federal experts on questions of electric reliability, advised the EPA that its Clean Power Plan may have serious reliability implications for the Nation's electric grid, but the EPA refused to listen. More recently, the EPA has publicly stated that it wants to work more closely with the States and FERC on reliability issues. This has led many to propose, in various forms, a so-called reliability safety valve that would be included in the final Clean Power Plan. That a general consensus has emerged that a safety valve is needed all but concedes that the EPA's proposal will cause reliability problems.

But, almost all of the current safety net proposals appear to leave the ultimate decision making concerning electric reliability in the EPA's hands, rather than in FERC's and the States', despite the reality that FERC and the States have the statutory mandate and the necessary technical experience to ensure the reliability of the electric power grid. Having the EPA administer an after-the-fact reliability safety valve is no different than acknowledging that there is going to be a major car wreck but saying that as long as there is a good auto-body shop down the road, all will be fine. The Better get to Maaco approach may work for cars, but it isn't the way to protect the integrity of the Nation's power grid. Everyone wants a cleaner environment for our kids and our future. But how we reach those goals should not put ideology and federal agency turf battles ahead of safety and reliability. Nor should the basic physical realities of our electric grid be sacrificed to pie-in-the-sky notions of endless carbon-free green energy. The sweeping changes envisioned in the Clean Power Plan should cause every American to ask: When did the EPA become our Nation's energy regulator? When did the EPA acquire both the statutory mandate from Congress and the required subject-matter expertise to do FERC's and the States' jobs? When did the EPA gain the expertise to determine the optimal and most reliable mix of coal and natural gas power plants? When did the EPA acquire the expertise to determine how much power can (or should) be reliably generated using wind farms and solar arrays? The short answer is that the EPA does not have the expertise to be our Nation's energy regulator. Congress has left that job to FERC and the States. But the EPA is now aggressively expanding its regulatory chokehold over the United States' energy industry in a blatant attempt to seize control from FERC and the individual States on questions of how all electricity will be generated and consumed in the U.S. There has to be a better way of getting both a cleaner environment and keeping the lights on.

Nuclear war
Andres and Breetz, 11, Professor of National Security Strategy at the
National War College AND doctoral candidate in the Department of Political Science
at The Massachusetts Institute of Technology (Richard and Hanna, Small Nuclear
Reactors for Military Installations: Capabilities, Costs, and Technological
Implications, Strategic Forum, February 1, 2011, dml) [ableist language
modifications denoted by brackets]

Grid Vulnerability. DOD is unable to provide its bases with electricity when the civilian electrical grid is offline for an extended period of time. Currently, domestic military installations receive 99 percent of their electricity from the civilian power grid. As explained in a recent study from the Defense Science Board: DOD's key problem with electricity is that critical missions, such as national strategic awareness and national command authorities, are almost entirely dependent on the national transmission grid ... [which] is fragile, vulnerable, near its capacity limit, and outside of DOD control. In most cases, neither the grid nor on-base backup power provides sufficient reliability to ensure continuity of critical national priority functions and oversight of strategic missions in the face of a long term (several months) outage. The grid's fragility was demonstrated during the 2003 Northeast blackout in which 50 million people in the United States and Canada lost power, some for up to a week, when one Ohio utility failed to properly trim trees. The blackout created cascading disruptions in sewage systems, gas station pumping, cellular communications, border check systems, and so forth, and demonstrated the interdependence of modern infrastructural systems. (8) More recently, awareness has been growing that the grid is also vulnerable to purposive attacks. A report sponsored by the Department of Homeland Security suggests that a coordinated cyberattack on the grid could result in a third of the country losing power for a period of weeks or months. (9) Cyberattacks on critical infrastructure are not well understood. It is not clear, for instance, whether existing terrorist groups might be able to develop the capability to conduct this type of attack. It is likely, however, that some nation-states either have or are working on developing the ability to take down the U.S. grid. In the event of a war with one of these states, it is possible, if not likely, that parts of the civilian grid would cease to function, taking with them military bases located in affected regions. Government and private organizations are currently working to secure the grid against attacks; however, it is not clear that they will be successful. Most military bases currently have backup power that allows them to function for a period of hours or, at most, a few days on their own. If power were not restored after this amount of time, the results could be disastrous. First, military assets taken offline by the crisis would not be available to help with disaster relief. Second, during an extended blackout, global military operations could be seriously compromised; this disruption would be particularly serious if the blackout was induced during major combat operations. During the Cold War, this type of event was far less likely because the United States and Soviet Union shared the common understanding that blinding [debilitating] an opponent with a grid black-out could escalate to nuclear war. America's current opponents, however, may not share this fear or be deterred by this possibility.

Chemical industry resilient even when profit falls
CNI 8 (Chemical News & Intelligence, This Week in ICIS Chemical Business, 8-18,
Engineering and construction companies are expanding to specialties and photovoltaics. Global engineering and construction companies report that the projects are changing, but the chemical sector continues to show a surprising amount of resilience. Profitability analysis reveals North American petrochemical industry's demise is exaggerated. Profits in the North American petrochemical industry are expected to decline sharply following Middle Eastern and Asian capacity additions. But contrary to the prevailing view, fears of its long-term demise will prove to be exaggerated. Shell's Omega MEG process kicks off in South Korea. The big goal for a process engineer could be the development of a technology that converts all the raw materials to the desired end product with the minimum theoretical energy consumption, no emissions and the lowest capital cost.

No impact to economic decline --- countries respond with cooperation not conflict
Clary 15, PhD in Political Science from MIT and a Postdoctoral Fellow at the Watson Institute for International
and Public Affairs at Brown [Christopher, Economic Stress and International Cooperation: Evidence from
International Rivalries, MIT Political Science Department, Research Paper No. 2015-8, p. 4]

Economic crises lead to conciliatory behavior through five primary channels. (1)
Economic crises lead to austerity pressures, which in turn incent leaders
to search for ways to cut defense expenditures. (2) Economic crises also encourage
strategic reassessment, so that leaders can argue to their peers and their publics
that defense spending can be arrested without endangering the state . This
can lead to threat deflation, where elites attempt to downplay the seriousness of the threat posed by
a former rival. (3) If a state faces multiple threats, economic crises provoke
elites to consider threat prioritization, a process that is postponed during
periods of economic normalcy. (4) Economic crises increase the political and
economic benefit from international economic cooperation. Leaders seek
foreign aid, enhanced trade, and increased investment from abroad during
periods of economic trouble. This search is made easier if tensions are reduced with
historic rivals. (5) Finally, during crises, elites are more prone to select leaders who are
perceived as capable of resolving economic difficulties, permitting the
emergence of leaders who hold heterodox foreign policy views. Collectively, these
mechanisms make it much more likely that a leader will prefer conciliatory
policies compared to during periods of economic normalcy. This section reviews this causal logic in greater
detail, while also providing historical examples that these mechanisms recur in practice.

Price volatility small

Kruger 8/23 Vincent Kruger, Will US Retail Electricity Prices Climb in 2017?,
Market Realist, August 23rd 2016,
According to the EIA (Energy Information Administration), residential electricity prices in the US averaged 13 cents per kilowatt-hour in July 2016. For the month, the highest electricity price was 17.9 cents per kilowatt-hour in New England, and the lowest was 10.7 cents per kilowatt-hour in the East South Central area. According to the EIA forecast, residential electricity prices are expected to fall by 0.3% in 2016 and then increase by 3% in 2017. Favorable weather and expansion of the customer base provided a moderately offsetting effect on the low price scenario so far in 2016. In 2017, the expected electricity price rise is likely to have a positive impact on utilities' (XLU) (IDU) (FXU) revenues.

CPP destroys the economy---energy costs and electricity
Loris 15, Research Fellow at Heritage's Roe Institute for Economic Policy Studies
and an MA in economics from George Mason [Nicolas, The Many Problems of the
EPA's Clean Power Plan and Climate Regulations: A Primer, The Heritage Institute, 7
Jul 2015,]
The Costs: Higher Energy Prices, Fewer Jobs, Less Growth. Energy is a key building block for economic opportunity. Carbon-emitting fuels, such as coal, oil, and natural gas, provided 87 percent of America's energy needs in the past decade and have been the overwhelming supplier for over a century.[4] Throughout that time, particularly during the Industrial Revolution, access to energy was a critical catalyst to improved health, comfort, progress, ingenuity, and prosperity.[5] Evidence in the United States and around the world demonstrates that the availability of energy positively impacts economic growth or, at the very least, the two jointly impact one another.[6] On the other hand, restricting the production of carbon-emitting conventional fuels with heavy-handed regulations, such as the Clean Power Plan, will significantly harm the U.S. economy. Americans feel the pain of higher energy prices directly, but also indirectly through almost all of the goods and services they buy, because energy is a necessary component of production and service. Companies will pass higher costs on to consumers or absorb the costs, which prevents hiring and new investment. As prices rise, consumer demand falls, and companies will drop employees, close entirely, or move to other countries where the cost of doing business is lower. The result is fewer opportunities for American workers, lower incomes, less economic growth, and higher unemployment. Without the details of the final regulations, and given the complexities of state plans, it is difficult to fully model the economic effects of the Administration's Clean Power Plan; however, economic models can provide a snapshot of the economic losses that CO2 regulations would impose. The economic consulting firm NERA projects that whether a plan is state-administered or EPA-administered, electricity prices will increase considerably. If states administer the plan, electricity prices will increase by an average of 12 percent between 2017 and 2031; if the rulemaking is left to the EPA, prices will rise an average of 17 percent during that time period.[7]

No extinction from warming
Barrett, professor of natural resource economics Columbia University, 7
(Scott, Why Cooperate? The Incentive to Supply Global Public Goods,
climate change does not threaten the survival of the human species.5 If unchecked, it will cause other species to become extinct (though biodiversity is being depleted now due to other reasons). It will alter critical ecosystems (though this is also happening now, and for reasons unrelated to climate change). It will reduce land area as the seas rise, and in the process displace human populations. Catastrophic climate change is possible, but not certain. Moreover, and unlike an asteroid collision, large changes (such as sea level rise of, say, ten meters) will likely take centuries to unfold, giving societies time to adjust. Abrupt climate change is also possible, and will occur more rapidly, perhaps over a decade or two. However, abrupt climate change (such as a weakening in the North Atlantic circulation), though potentially very serious, is unlikely to be ruinous. Human-induced climate change is an experiment of planetary proportions, and we cannot be sure of its consequences. Even in a worst-case scenario, however, global climate change is not the equivalent of the Earth being hit by a mega-asteroid. Indeed, if it were as damaging as this, and if we were sure that it would be this harmful, then our incentive to address this threat would be overwhelming. The challenge would still be more difficult than asteroid defense, but we would have done much more about it by now.

Oceans resilient
Kennedy 2 (Victor, Environmental science prof, Maryland, Former Director,
Cooperative Oxford Laboratory, PhD, Coastal and Marine Ecosystems and Global
Climate Change, 2002)

There is evidence that marine organisms and ecosystems are resilient to environmental change. Steele (1991) hypothesized that the biological components of marine systems are tightly coupled to physical factors, allowing them to respond quickly to rapid environmental change and thus rendering them ecologically adaptable. Some species also have wide genetic variability throughout their range, which may allow for adaptation to climate change.

China relations inevitable

Shannon Tiezzi 15, Associate Editor @ The Diplomat, Taking US-China Relations
We're still over six months away from Chinese President Xi Jinping's visit to the United States, but you wouldn't know it from the number of bilateral meetings being billed as in preparation for Xi's arrival. The latest, a meeting between U.S. National Security Advisor Susan Rice and Chinese State Councilor Yang Jiechi in New York City, provides some interesting insights into focal points for the big bilateral summit in September. Both China and the U.S. released reports summarizing the Rice-Yang visit, and the focus was decidedly global. Instead of touching on bilateral subjects, both governments directed attention to U.S.-China cooperation on global issues: the Ebola crisis, the North Korea nuclear issue, the P5+1 negotiations with Iran, and ensuring stability in Afghanistan. The emphasis on Afghanistan is especially interesting, as it marks a new area of cooperation between Washington and Beijing. The Diplomat has previously reported on the signs China is willing to take a more active role in mediating between Afghanistan and the Taliban, including bringing Pakistan to the negotiating table. The U.S. role in all of this has been unclear, with some reports indicating that the U.S. has plans to participate in negotiations with Afghan officials and Taliban leaders, which was denied by U.S. government officials. The prospect of a negotiation process with the Taliban led by China and sanctioned by the U.S. could be a critical development for Afghanistan's future. Details on possible U.S.-China cooperation on this front remain murky. The statement from National Security Council spokesperson Bernadette Meehan made clear that Rice and Yang discussed Afghanistan in their meeting, but did not offer any additional details. The report from China's Foreign Ministry (translated here by Xinhua) did not mention Afghanistan at all. But given the shared concern for Afghanistan's stability, the U.S. and China are undoubtedly having serious discussions on how to coordinate their efforts. Official summaries of the meeting paid more attention to a long-time point of emphasis: the North Korean nuclear program. We've entered another round of speculation as to when (if at all) North Korea will conduct another nuclear test. When it comes to North Korea, U.S. administrations are always eager to show they have buy-in from China, even if verbal commitments never translate to action (something my colleague Ankit and I discussed in more detail in our latest podcast, featuring Joel Wit). This time around, according to the NSC, Rice and Yang agreed that North Korea would not succeed in its twin pursuit of nuclear weapons and economic development. The Chinese summary, predictably, was far more muted, saying only that China adheres to the principles of denuclearization and peaceful settlement through dialogue and negotiations. Yang added his hope that all related parties will exercise restraint, avoid any irritating rhetoric and acts, and jointly maintain peace and stability on the peninsula. Taken together, these two reports don't spark much hope for a breakthrough on how to approach North Korea's nuclear program. The relative length given to the North Korea issue in each side's statement shows that both Beijing and Washington are focusing on this issue in the lead-up to Xi's visit. However, the problem is that the two sides have different goals for what a breakthrough would look like. China wants a return to the Six Party Talks or another form of dialogue, while Washington wants greater Chinese commitment to the sanctions regime and/or a solid North Korean concession on its nuclear program as a precursor to talks. U.S.-China relations have always had a global component, but this trend is only increasing as China becomes more influential on the world stage. In Meehan's statement, the very second sentence underlines that Rice and Yang agreed to strengthen coordination on regional and global challenges. The U.S. and China have different agendas for the international order (see, for example, my piece on China's vision for the U.N.), but when it comes to various security challenges, whether pandemics like Ebola or the threat of Afghanistan becoming a terrorist haven, there is much common ground.

And, we just hit 400 ppm --- that makes warming irreversible

Chitransh 16 (Anugya, science writer for Kicker, MSN, and Times of India, MA in
Journalism from CUNY, Time to freak out: Earth's carbon dioxide levels are now
The world is getting ready to cross a major climate change marker which will
more or less prove that global warming is irreversible . This is Cape Grim, located in the
northwestern part of Tasmania, Australia: It's the most accurate spot in the southern
hemisphere to monitor carbon dioxide in the atmosphere. And for the first time, Cape
Grim will be at 400 parts per million ( ppm ) of carbon dioxide. Just FYI, the safe level of carbon
dioxide is 350 ppm . Once this happens, the level of carbon dioxide on Earth
will never again go below 400 ppm , according to scientists. In other words, we're in a
state of permanent danger from CO2. Another station in Hawaii already
crossed the 400 ppm milestone in 2013. The last time the levels were this
high, humans did not exist.

Warming doesn't cause conflict -- best statistical evidence

Erik Gartzke 11, Associate Professor of Political Science at UC-San Diego, March
16, 2011, Could Climate Change Precipitate Peace?, online:
An evolving consensus that the earth is becoming warmer has led to increased interest in the social consequences of climate change. Along with rising sea levels, varying patterns of precipitation, vegetation, and possible resource scarcity, perhaps the most incendiary claims have to do with conflict and political violence. A second consensus has begun to emerge among policy makers and opinion leaders that global warming may well result in increased civil and even interstate warfare, as groups and nations compete for water, soil, or oil. Authoritative bodies, leading government officials, and even the Nobel Peace prize committee have highlighted the prospect that climate change will give rise to more heated confrontations as communities compete in a warmer world. Where the basic science of climate change preceded policy, this second consensus among politicians and pundits about climate and conflict formed in the absence of substantial scientific evidence. While anecdote and some focused statistical research suggests that civil conflict may have worsened in response to recent climate change in developing regions (c.f., Homer-Dixon 1991, 1994; Burke et al. 2009), these claims have been severely criticized by other studies (Nordas & Gleditsch 2007; Buhaug et al. 2010; Buhaug 2010).1 In contrast, long-term macro statistical studies find that conflict increases in periods of climatic chill (Zhang et al. 2006, 2007; Tol & Wagner 2010).2 Research on the more recent past reveals that interstate conflict has declined in the second half of the twentieth century, the very period during which global warming has begun to make itself felt (Goldstein 2002; Levy et al. 2001; Luard 1986, 1988; Hensel 2002; Sarkees, et al. 2003; Mueller 2009).3 While talk of a climatic peace is premature, broader claims that global warming causes conflict must be evaluated in light of countervailing evidence and a contrasting set of causal theoretical claims.4

CPP does too little to provide leadership

Guardian 15
The Guardian, August 4, 2015, Obama clean power plan welcomed but won't avoid dangerous warming
Pierre Radanne, a French energy expert, said the US curbs were weak compared to the EU's aim to cut emissions by 40% by 2030 over 1990 levels. The US cannot stay at this level. This is not leadership, said Radanne, noting that the US target would represent a mere 13% reduction if measured from 1990 to

Military alt cause

Neslen 15 (Arthur Neslen is the Europe environment correspondent at the Guardian. He has previously
worked for the BBC, the Economist, Al Jazeera, and EurActiv, where his journalism won environmental awards. 1214-2015, "Pentagon to lose emissions exemption under Paris climate deal," Guardian,

Although the US never ratified the Kyoto protocol, it won an opt-out from having to fully report or act on its armed
forces greenhouse gas emissions, which was then double-locked by a House national defence authorisation bill in

Under the Paris agreement, countries would not be obliged to cut their
military emissions but, equally, there would be no automatic exemption for them either. US officials
privately say that the deal adopted on Saturday has no provisions covering military
compliance one way or another, leaving decisions up to nation states as to which national sectors should
make emissions cuts before 2030. "If we're going to win on climate we have to make
sure we are counting carbon completely, not exempting different things
like military emissions because it is politically inconvenient to count them,"
Stephen Kretzmann, Oil Change International's director, told the Guardian. "The atmosphere certainly
counts the carbon from the military, therefore we must as well." The US
military is widely thought to be the world's biggest institutional consumer of
crude oil, but its emissions reporting exemptions mean it is hard to be sure. According to Department
of Defence figures, the US army emitted more than 70m tonnes of CO2 equivalent
per year in 2014. But the figure omits facilities including hundreds of military bases
overseas, as well as equipment and vehicles. Activities including
intelligence work, law enforcement, emergency response, tactical fleets
and areas classified as national security interests are also exempted from

reporting obligations. The US military requested the original Kyoto exemption on national security grounds. While

the Obama administration is not looking to the military for emissions cuts
before 2030, US Republicans argue that future presidents, such as the socialist candidate Bernie Sanders for
the Democrats, could. "Let's face it, vast swathes of our military are big carbon
emitters: tanks, Jeeps, humvees, jet planes and of course much of our navy is not nuclear-powered, so [the
Paris agreement] could be used as a trojan horse," said Steven Groves, a senior research fellow at the US thinktank
the Heritage Foundation. He added: "This might be a good opportunity for people concerned with national security
to go to congress and get some type of legislative exemption in the same way as was done during the Kyoto time
period." One of the first advocates of the House double-lock on the Kyoto exemption was Dick Cheney, according to
a book called The Greening of the US Military by Terry Lee Anderson, a senior fellow at Stanford University. Cheney
argued that the Kyoto clause would not cover US unilateral actions in a letter, which was also signed by other

former security officials. The Iraq war was responsible for 141m tonnes of carbon releases in its first four years, according to an Oil Change International report. On an annual basis, this was more than the emissions from 139 countries in this period, or about the same as putting an extra 25m cars on to US roads for a year. The paper found that projected US spending on the Iraq war
could cover all global investments in renewable energy needed to halt global warming trends in the period to 2030.

Topical aff must be legislation -- judicial action is not T
Establish means legislate -- courts only rule on established law
Webster's 10 -- Webster's New World College Dictionary, Copyright 2010 by
Wiley Publishing, Inc., Cleveland, Ohio.
Used by arrangement with John Wiley & Sons, Inc.
establish to order, ordain, or enact (a law, statute, etc.) permanently

Policy requires Congress

Koch 6 - Dudley W. Woodbridge Professor of Law, William and Mary School of Law.
B.A., University of Maryland; not that Charles Koch (Charles, FCC v. WNCN
Listeners Guild, Administrative Law Review vol. 58, Hein Online)
Of these, Judge McGowan's opinion, in particular, provides a theoretically sound and useful framework. Judge
McGowan focused the Circuit's disagreement on the "reading of the [a]ct" in which judicial authority is dominant. 8
Thus, he selected the battleground advantageous to [BEGIN FOOTNOTE] 3. See FCC v. Sanders Bros. Radio Station,
309 U.S. 470, 475 (1940) (stating that Congress wished to allow broadcasters to compete and to succeed or fail
based on the ability to offer programs attractive to the public). 4. FCC v. WNCN Listeners Guild, 450 U.S. at 589. 5.
Id. at 591. In the broad sense, "policy" decisions are those that advance or protect some collective goals of the
community as opposed to those decisions that respect or secure some individual or group rights. See also Ronald
Dworkin, Hard Cases, 88 HARV. L. REV. 1057, 1058 (1975), reprinted in RONALD DWORKIN, TAKING RIGHTS
SERIOUSLY 81-130 (1977) (exploring the distinction between arguments of principle and policy); HENRY M. HART, JR. (William N. Eskridge, Jr. & Philip P. Frickey ed., 1994) ("A policy is simply a statement of objectives."). Here the term "policy" means such decisions assigned to the agency, and policies made by legislators are embodied in the statutory language and hence are not "made" either by the agency or the courts, but are derived through the various techniques of statutory interpretation. 6. FCC v. WNCN Listeners Guild, 450 U.S. at 592-93. See, e.g., Ronald M. Levin, Identifying Questions of Law in Administrative Law, 74 GEO. L.J. 1 (1985) (scrutinizing the difference between questions of law and other questions, such as policy). 7. WNCN Listeners Guild v. FCC, 610 F.2d 838, 838 (D.C. Cir. 1979). 8. Id. at 842.

The Chevron doctrine makes no change in this fundamental principle. See, e.g., Great Plains Coop. v. CFTC, 205 F.3d 353, 356 (8th Cir. 2000) (using the Chevron opinion as supporting the conclusion that "statutory interpretation is the province of the judiciary"); Antipova v. U.S. Att'y Gen., 392 F.3d 1259, 1261 (11th Cir. 2004) (explaining that the court reviews "the agency's statutory interpretation of its laws and regulations de novo .... However, we defer to the agency's interpretation if it is reasonable and does not contradict the clear intent of Congress"). See generally 3 CHARLES H. KOCH, JR., ADMINISTRATIVE LAW AND PRACTICE 12.32[1] (2d ed. 1997) (offering many more examples). [END FOOTNOTE] the court. He nonetheless
noted that an administrative decision under delegated policymaking authority would be subject only to hard look
review, which he properly characterizes: "[The Commission] must take a 'hard look' at the salient problems." 9 That
is, the court must assure that the agency took a hard look, not take a hard look itself. "Only [the Commission], and
not this court, has the expertise to formulate rules well-tailored to the intricacies of radio broadcasting, and the
flexibility to adjust those rules to changing conditions .... And only it has the power to determine how to perform its
regulatory function within the substantive and procedural bounds of applicable law."' 0 In other words, the court
must assure that the agency is acting within its statutory authority and, once it determines the agency is acting
within delegated policymaking authority, the court is largely out of the picture. Upon crossing this boundary, the
judicial job is limited to assuring that the policy is not arbitrary by determining whether the agency took a hard
look. The basic review system is revealed as Judge McGowan continues: "[The prior case] represents, not a policy,
but rather the law of the land as enacted by Congress and interpreted by the Court...." He properly noted that this distinction not only implicates the allocation of decisionmaking authority between a reviewing court and an agency, but between both and Congress: "This court has neither the expertise nor the constitutional authority to make 'policy' as the word is commonly understood .... That role is reserved to the Congress, and, within the bounds of delegated authority, to the Commission. But in matters of interpreting the 'law' the final say is constitutionally committed to the judiciary .... Although the distinction between law and policy is never clearcut, it is nonetheless a touchstone of the proper relation between court and agency that we ignore at our peril."

The affirmative interpretation is bad for debate.

Limits and ground are necessary for negative preparation and
clash. We permit a good number of cases, but they make the
topic too big. The agent adds a whole new set of plans -- all
the district and circuit courts are possible agents. This is
compounded by issues of legal precedents and impacts having
nothing to do with emissions.

We meet -- Congress brings the law into force
CI -- Establish means to put into force
Webster, Merriam-Webster products and services are backed by the largest team
of professional dictionary editors and writers in America, and one of the largest in
the world, "establish," no date,
Simple Definition of establish: to cause (someone or something) to be widely known
and accepted; to put (someone or something) in a position, role, etc., that
will last for a long time

Prefer this interpretation:

1) Neg ground -- you get disads to the enforcement of the law;
includes politics, because enforcement has electoral
consequences. CPs are in the lit, and disad links are comparative
to the status quo.
2) Aff ground -- defending new programs isn't means tested;
solvency advocates are weak.
3) CPP is core of the topic -- it's the biggest regulatory attempt
by the EPA. Excluding it is bad for education and guarantees
neg DA links are always non-unique.
Any interp that excludes the CPP is bad -- core of the topic
Gamboa, Suzanne, NBC News Senior writer covering Latinos and politics, "5
Questions: Latina Climate Scientist On Carbon Emissions Rule," June 29, 2015,
NBC: What is the big announcement on plant emissions from the president? Hernandez Hammer: The White House is releasing a Clean Power Plan that will start moving the U.S. toward a clean energy economy - it's the first ever restriction on carbon emissions. We've had other restrictions - but we haven't had any on carbon emissions and this is a huge step in limiting carbon pollution. (Carbon dioxide is the primary greenhouse gas contributing to climate change.) The goal is that by 2030, the U.S. would reduce carbon emissions from coal-fired plants by 32 percent. It's not only good for the U.S. but also, in terms of our position in the world, it allows us to be leaders in encouraging other countries to take more steps toward clean energy

Prefer reasonability -- competing interpretations is a race to the
bottom. We should have to significantly alter negative link
ground to lose, and our aff is the most core of the topic.



Warming much slower than their impacts assume -- their
models are flawed and our authors use the newest and best --
volcanoes, solar forcing, natural variability
Fyfe et al. 16 [John, Canadian Centre for Climate Modelling and Analysis,
Environment and Climate Change Canada, University of Victoria, Gerald Meehl, National
Center for Atmospheric Research, Boulder, Colorado, Matthew England, ARC Centre
of Excellence for Climate System Science, University of New South Wales, Michael
Mann, Department of Meteorology and Earth and Environmental Systems Institute,
Pennsylvania State University, Benjamin Santer, Program for Climate Model
Diagnosis and Intercomparison (PCMDI), Lawrence Livermore National Laboratory,
Gregory Flato, Canadian Centre for Climate Modelling and Analysis, Environment
and Climate Change Canada, University of Victoria, Ed Hawkins, National Centre for
Atmospheric Science, Department of Meteorology, University of Reading, Nathan
Gillett, Canadian Centre for Climate Modelling and Analysis, Environment and
Climate Change Canada, University of Victoria, Shang-Ping Xie, Scripps Institution of
Oceanography, University of California San Diego, Yu Kosaka, Research Center for
Advanced Science and Technology, University of Tokyo, "Making sense of the early-2000s warming slowdown," Nature Climate Change, March 2016, pg. 227-28]

Our results support previous findings of a reduced rate of surface warming over
the 2001-2014 period -- a period in which anthropogenic forcing increased at a relatively
constant rate. Recent research that has identified and corrected the errors and inhomogeneities in the surface air
temperature record4 is of high scientific value. Investigations have also identified non-climatic artefacts in
tropospheric temperatures inferred from radiosondes30 and satellites31, and important errors in ocean heat
uptake estimates25. Newly identified observational errors do not, however, negate
the existence of a real reduction in the surface warming rate in the early
twenty-first century relative to the 1970s-1990s. This reduction arises through the
combined effects of internal decadal variability11-18, volcanic19,23 and solar
activity, and decadal changes in anthropogenic aerosol forcing32. The warming
slowdown has motivated substantial research into decadal climate variability and uncertainties in key external
forcings. As a result, the scientific community is now better able to explain temperature variations such as
those experienced during the early twenty-first century33, and perhaps even to make skilful predictions of such
fluctuations in the future. For example, climate model predictions initialized with recent observations indicate a transition to a positive phase of the IPO with increased rates of global surface temperature warming (ref. 34, and G.A. Meehl, A. Hu and H. Teng, manuscript in preparation). In summary, climate models did not (on average) reproduce the observed temperature trend over the early twenty-first century6, in spite of the continued increase in anthropogenic forcing. This mismatch focused attention on a compelling science problem -- a problem deserving of scientific scrutiny.

Based on our analysis, which relies on physical understanding of the key processes and forcings involved, we find that the rate of warming over the early twenty-first century is slower than that of the previous few decades. This slowdown is evident in time series of GMST and in the global mean temperature of the lower troposphere. The magnitude and statistical significance of observed trends (and the magnitude and significance of their differences relative to model expectations) depends on the start and end dates of the intervals considered23. Research into the nature and causes of the slowdown has triggered improved understanding of observational biases, radiative forcing and internal variability. This has led to widespread recognition that modulation by internal variability is large enough to produce a significantly reduced rate of surface temperature increase for a decade or even more -- particularly if internal variability is augmented by the externally driven cooling caused by a succession of volcanic eruptions. The legacy of this new understanding will certainly outlive the recent warming slowdown. This is particularly true in the embryonic field of decadal climate prediction, where the challenge is to simulate how the combined effects of external forcing and internal variability produce the time-evolving regional climate we will experience over the next ten years.

Adaptation innovation is limitless -- their impact authors don't
assume innovation
Indur Goklany 10, policy analyst for the Department of the Interior, Ph.D. from
MSU, "Population, Consumption, Carbon Emissions, and Human Well-Being in the
Age of Industrialization (Part IV -- There Are No PAT Answers, or Why Neo-Malthusians Get It Wrong)," April 26,
Neo-Malthusians believe that humanity is doomed unless it reins in population, affluence and technological change,
and the associated consumption of materials, energy and chemicals. But, as shown in the previous posts and

empirical data on virtually every objective indicator of human wellbeing indicates that the state of humanity has never been better, despite
unprecedented levels of population, economic development, and new
technologies. In fact, human beings have never been longer lived, healthier,
wealthier, more educated, freer, and more equal than they are today. Why
does the Neo-Malthusian worldview fail the reality check? The fundamental
reasons why their projections fail are because they assume that population,
affluence and technology the three terms on the right hand side of the IPAT equation are
independent of each other. Equally importantly, they have misunderstood the
nature of each of these terms, and the nature of the misunderstanding is
essentially the same, namely, that contrary to their claims, each of these
factors instead of making matters progressively worse is, in the long run,
necessary for solving whatever problems plague humanity. Compounding
these misunderstandings, environmentalists and Neo-Malthusians frequently
conflate human well-being with environmental well-being. While the latter influences
the former, the two aren't the same. Few inside, and even fewer outside, rich countries would rank

environmental indicators among the most important indicators of human well-being except, possibly, access to safe
water and sanitation. These two environmental indicators also double as indicators of human well-being because they have a large and direct bearing on human health. In any case, they are subsumed within life expectancy, which, as noted, is the single most important indicator of human well-being. The UNDP's Human Development Index, for instance, uses three indicators: life expectancy, per capita income and some combined measure of education and literacy. None of these three are related to the environment. The disconnect between environmental indicators and indicators of human well-being is further evidenced by the fact that over the last century, the most critical indicators of human well-being -- life expectancy, mortality rates, prevalence of hunger and malnutrition, literacy, education, child labor, or poverty -- generally improved regardless of whether environmental indicators (e.g., levels of air and water pollution, loss of biodiversity) fluctuated up or down (see, e.g., the previous post and here). Moreover, fears that the world's population would continue to increase exponentially have failed to materialize. The world's population growth rate peaked in the late 1960s. Population increased by 10.6% from 1965-70, but only

6.0% from 2000-05. Many countries are now concerned that fewer young people means that their social security systems are unsustainable. Projections now suggest that the world's population may peak at around 9 billion around mid-century (see here). The slowdown in the population growth rate, unanticipated by Neo-Malthusians, can be attributed to the fact that population (P) is dependent on affluence (or the desire for affluence) and technology (A and T in the IPAT equation). Empirical data show that as people get wealthier or desire greater wealth for themselves or their offspring, they tend to have fewer children. Cross-country data shows that the total fertility rate (TFR), which measures the number of children per women of child-bearing age, drops as affluence (measured by GDP per capita) increases (see Figure 1). Moreover, for any given level of affluence, TFR has generally dropped over time because of changes in technology, and societal attitudes shaped by the desire for economic development (see here). Most importantly, it is not, contrary to Neo-Malthusian fears, doomed to rise inexorably, absent coercive policies. Neo-Malthusians also overlook the fact that, in general, affluence, technology and human well-being reinforce each other in a Cycle of Progress (Goklany 2007a, pp. 79-97). If existing technologies are unable to reduce impacts or otherwise improve the quality of life, wealth and human capital can be harnessed to improve existing technologies or create new ones that will.

HIV/AIDS is a case in point. The world was unprepared to deal with HIV/AIDS when it first appeared. For practical
purposes, it was a death sentence for anyone who got it. It took the wealth of the most developed countries to
harness the human capital to develop an understanding of the disease and devise therapies. From 1995 to 2004,
age-adjusted death rates due to HIV declined by over 70 percent in the US (USBC 2008). Rich countries now cope
with it, and developing countries are benefiting from the technologies that the former developed through the
application of economic and human resources, and institutions at their disposal. Moreover, both technology and
affluence are necessary because while technology provides the methods to reduce problems afflicting humanity, including environmental problems, affluence provides the means to research, develop and afford the necessary technologies. Not surprisingly, access to HIV therapies is greater in developed countries than in developing countries. And in many developing countries access would be even lower but for wealthy charities and governments from rich countries (Goklany 2007a, pp. 79-97). Because technology is
largely based on accretion of knowledge, it ought to advance with time, independent of affluence provided
society is open to scientific and technological inquiry and does not squelch technological change for whatever reason. Consequently, indicators of human well-being improve not only with affluence but also with time (a surrogate for technology). This is evident in Figure 1, which shows TFR dropping with time for any specific level of GDP per capita. It is also illustrated in Figure 2 for life expectancy, which shows that wealthier societies have higher average life expectancies, and that the entire life expectancy curve has been raised upward with the passage of time, a surrogate for technological change (broadly defined). Other indicators of human well-being -- e.g., crop yield, food supplies per capita, access to safe water and sanitation, literacy, mortality -- also improve with affluence and, separately, with time/technology (see here and here). This indicates that secular technological change and economic development, rather than making matters worse, have actually enhanced society's ability to solve its problems and advanced its quality of life. Moreover, population is not just a factor in consumption. It is the basis for human capital. No humans, no human capital. Humans are not just mouths, but also hands and brains. As famously noted by Julian Simon, they are the Ultimate Resource.
This is something Neo-Malthusians have difficulty in comprehending. Notably, a
World Bank study, Where is the Wealth of Nations?, indicated that human capital and the value of institutions
constitute the largest share of wealth in virtually all countries. A population that is poor, with low human capital,
low affluence, and lacking in technological knowhow is more likely to have higher mortality rates, and lower life
expectancy than a population that is well educated, affluent and technologically sophisticated, no matter what its

These factors -- human capital, affluence and technology -- acting in concert over the long haul, have enabled technology for the most part to improve matters faster than any deterioration due to population, affluence (GDP per person) or their product (GDP). This has helped keep environmental damage in check,

(e.g., for cropland, a measure of habitat converted to human uses) or even reverse it (e.g., for water pollution,
and indoor and traditional outdoor air pollution), particularly in the richer countries. Note that since the product of
population (P) and affluence (A or GDP per capita) is equivalent to the GDP then according to the IPAT identity,
which specifies that I = P x A x T, the technology term (T) is by definition the impact (I) per GDP (see Part II in this
series of posts). Ill call this the impact intensity. If the impact is specified in terms of emissions, then the technology
term is equivalent to the emissions intensity, that is, emissions per GDP. Therefore the change in impact intensity
(or emissions intensity) over a specified period is a measure of technological change over that period. Since matters
improve if impact/emissions intensity drops, a negative sign in front of the change in impact intensity denotes that
technological change has reduced the impact. Table 1 shows estimates of the changes in impacts intensity, or
technological change, over the long term for a sample of environmental indicators for various time periods and
geographical aggregations. Additional results regarding technological change over different time periods and

countries are available from the original source (here). These results indicate that, in the long run, technological change has, more often than not, reduced impacts. The reduction in many cases is by an order of magnitude or more! Thus, notwithstanding

plausible Neo-Malthusian arguments that technological change would eventually increase environmental impacts,

historical data suggest that, in fact, technological change ultimately reduces

impacts, provided technology is not rejected through an inappropriate exercise of the precautionary principle or
compromised via subsidies (which usually flow from the general public to politically favored elements of society). To summarize, although population, affluence and technology can create some problems for humanity and the planet, they are also the agents for solving these very problems. In the IPAT equation, the dependence of the I term on the P, A and T terms is not fixed. It evolves over time. And the Neo-Malthusian mistake has been to assume that the relationship is fixed, or if it
is not, then it changes for the worse. A corollary to this is that projections of future
impacts spanning a few decades but which do not account for technological
change as a function of time and affluence, more likely than not, will
overestimate impacts, perhaps by orders of magnitude. In fact, this is one
reason why many estimates of the future impacts of climate change are
suspect, because most do not account for changes in adaptive capacity
either due to secular technological change or increases in economic
development (see here and here). Famously, Yogi Berra is supposed to have said, "It's tough to make predictions, especially about the future." Most analysts recognize this. They know that just because one can explain and hindcast the past, it does not guarantee that one can forecast the future. Neo-Malthusians, by contrast, cannot hindcast the past but are confident they can forecast the future. Finally,

had the solutions they espouse been put into effect a couple of centuries ago, most of us alive today would be dead
and those who were not would be living poorer, shorter, and unhealthier lives, constantly subject to the vagaries of
nature, surviving from harvest to harvest, spending more of our time in darkness because lighting would be a
luxury, and our days in the drudgery of menial tasks because under their skewed application of the precautionary
principle (see here, here and here) fossil fuel consumption would be severely curtailed, if not banned. Nor would the
rest of nature necessarily be better off. First,

lower reliance on fossil fuels would mean

greater demand for fuelwood, and the forests would be denuded. Second, less
fossil fuels also means less fertilizer and pesticides and, therefore, lower
agricultural productivity. To compensate for lost productivity, more habitat
would need to be converted to agricultural uses. But habitat conversion
(including deforestation) -- not climate change -- is already the greatest threat to


UNQ Prices Low

Energy-intensive industry growth is increasing and electricity
prices are low now
Dyl 16 Katie Dyl, Curtin University, Ph.D., Geochemistry, UCLA, Industrial and
electric power sectors drive projected growth in U.S. natural gas use, U.S. Energy
Information Administration, May 26th, 2016,
U.S. consumption of natural gas is projected to rise from 28 trillion cubic feet (Tcf) in 2015 to 34 Tcf in 2040, an average increase of about 1% annually, according to EIA's Annual Energy Outlook 2016 (AEO2016) Reference case. The industrial and electric power sectors make up 49% and 34% of this growth, respectively, while consumption growth in the residential, commercial, and transportation sectors is much lower. Much of this growth in natural gas consumption results from relatively low natural gas prices. In the AEO2016 Reference case, average annual U.S. natural gas prices at the Henry Hub are expected to remain around or below $5.00 per million British thermal units (MMBtu) (in 2015 dollars) through 2040. The Henry Hub spot price averaged $2.62/MMBtu in 2015, the lowest annual average price since 1995. Prices rise through 2020 in the AEO2016 Reference case projection as natural gas demand increases, particularly for exports of liquefied natural gas (LNG). Currently, most U.S. natural gas exports are sent to Mexico by pipeline, but LNG exports, including those from several facilities currently built or under construction,
account for most of the expected increases in total U.S. natural gas exports through 2020. The persistent, low price of U.S. natural gas is the primary driver for increased natural gas consumption in the industrial sector. Energy-intensive industries and those that use natural gas as a feedstock, such as bulk chemicals, make up most of the increase in natural gas consumption. Low natural gas prices also support long-term consumption growth in the electric power sector. Natural gas use for power generation reached a record high in 2015 and is expected to be high in 2016 as well, likely surpassing coal on an annual average basis. However, a relatively steep rise in natural gas prices through 2020 (rising 11% per year) and rapid growth in renewable generation, spurred by renewable tax credits that were extended in 2015, also contribute to a decline in power generation fueled by natural gas between 2016 and 2021. In the 2020s and 2030s, electricity generation using natural gas increases again. Because natural gas-fired electricity generation produces fewer carbon dioxide emissions than coal-fired generation, natural gas is expected to play a large role in compliance with the Clean Power Plan for existing generation from fossil fuels, which takes effect in 2022. The electric power sector's total consumption of natural gas from 2020 through 2030 is 6 Tcf greater in the AEO2016 Reference case than in a case where the Clean Power Plan is not implemented (No CPP).

Manufacturing is up -- PMI index proves
Moutray 12/15 (Chad Moutray, Ph.D., Economics from Southern Illinois
University, is chief economist for the National Association of Manufacturers (NAM).
December 15, 2016)
U.S. Manufacturing Output in December Grew at Strongest Rate Since
March 2015 -- The Markit Flash U.S. Manufacturing PMI edged up from 54.1 in
November to 54.2 in December, a 21-month high. This mostly mirrored
assessments about new orders growth (up from 55.5 to 55.6), which also expanded at the
fastest pace over that time frame. Other indicators were mixed but encouraging. Employment
expanded at its highest rate in 18 months (up from 52.4 to 54.1), whereas output grew
modestly but pulled back a little in December (down from 56.0 to 55.1). On a more disappointing note, exports
slowed to a near crawl but were positive for the sixth time in the past seven months

(down from 51.0 to 50.3). Softer international demand, however, should not be surprising given the strong U.S.
dollar. Overall, this report provides some encouragement for manufacturers,
many of whom have been rather cautious in their economic outlook for much of the past two years.

A2 Green Tech
Only our studies take this effect into account
Robert Michaels and Robert Murphy, January 2009. Michaels is a professor of economics at
California State University and a senior fellow at the Institute for Energy Research. Murphy is director of the Institute
for Energy Research. Green Jobs: Fact or Fiction? Institute for Energy Research.

Even if job creation per se is the goal, the studies fail to properly account for the job destruction that their recommendations would entail. For example, the Center for

American Progress (CAP) study recommends a $100 billion expenditure to be financed through the sale of carbon
allowances under a cap-and-trade program. CAP estimates that this fiscal stimulus will result in the creation of two million jobs [v]. Yet the CAP methodology treats the $100 billion as manna from heaven; it does not consider the direct and indirect adverse effects (including job destruction) of imposing higher costs on a wide array of energy-intensive industries and thereby raising prices for consumers.

Double counting of jobs and overly simplistic treatment of the labor market. The green studies critiqued in this report implicitly assume that there is a limitless pool of idle labor which can fill the new green slots created by government spending. Yet to the extent that some of the new green jobs are filled by workers who were previously employed, estimates of job creation are overstated, perhaps significantly so. In addition, the studies do not account for the rise in worker productivity over time. Thus their long-range forecasts of total jobs created by green programs are inflated, even on their own terms.

To its credit, CAP alludes to potential "inflationary labor shortages from job creation" [vi] due to its proposed program, but dismisses the concern as irrelevant for an economy in recession. The thinking is that the workers going into the new green jobs will simply reduce the unemployment rate, rather than siphoning talented people away from other industries. The CAP analysis ignores the fact that other industries, not favored by the green subsidies or mandates, would have been able to draw on the pool of unemployed workers as the economy recovers. With fewer workers seeking jobs, job creation in non-green sectors will be lower than it otherwise would have been.

Some of the infrastructure plans will require a long time to implement and then reach completion. Their implementation over time could contribute to inflationary labor shortages once the current recession has passed.

Massively drives up prices --- EPA agrees
Jarrett, MPSC former commissioner and energy attorney, 2016
(Terry, "States are right to worry about clean power plan costs," 7-8)
For starters, the EIA says the plan will mean significantly higher prices for residential and commercial electricity. They attribute this to higher transmission and distribution costs coming at a time when electricity consumption will also grow slightly (in 2015-2040). Interestingly, the EIA projects that these higher electricity prices will actually reduce demand 2% by 2030. Why? Because compliance actions and higher prices will force cash-strapped consumers to adopt their own austerity measures. A key part of the CPP is the dismantling of coal-fired power in the U.S. As the EIA sees it, "Coal's share of total electricity generation, which was 50% in 2005 and 33% in 2015, falls to 21% in 2030 and to 18% in 2040." Coal power plants currently anchor America's base-load electricity generation, so it's understandable that their elimination would drive up prices. But is such a move justified? The EIA projects that renewable energy (solar and wind) will play a significant role in meeting electricity demand growth throughout most of the country. It's a bold gamble, since the EIA believes that renewables will account for 27% of total U.S. generation by 2040. But EIA data shows wind and solar power supplying only 5.6% of U.S. electricity generation in 2015. So, the jump to 27% will require significant investments. What's instructive is EIA data on Germany, where residential retail electric prices have risen, and are expected to keep rising, due to higher taxes and fees for renewable power. Overall, Germany's foray into green energy has driven the average residential electricity price to 35 cents/kWh, almost three times the U.S. average of 13 cents/kWh. Along with Denmark, Germany has some of the highest residential electricity prices in Europe.


2NC Intermittency Impact OV

Blackouts cause nuclear conflict --- strategic awareness
and C&C are expressly dependent on the grid --- the
ability to blind the military causes competitors to doubt
the credibility of deterrence and miscalculate --- that
O/w's on timeframe: lashout is fast and unpredictable
because any hotspot becomes a potential nuclear conflict.
Warming is a multi-decade process --- vote neg to live to
fight another day

Accidental launch impact

Broad blackout causes accidental launch
Earth Island Journal 2K (Winter 2000, Vol. 14, No. 4, eijournal/win2000/wr_win2000y2k.html)
NIRS notes that increasingly severe winter storms have caused power outages in
the eastern US in recent years. Such wintertime power failures "could lead to
extended blackouts and resultant nuclear catastrophes." The NIRS has petitioned the NRC to require all nuclear power stations to stockpile a 20-day supply of
fuel for diesel generators. Batteries charged by solar cells, windmills, hydroelectric
or geothermal energy would give the greatest assurance of long-term stability. In
September, the NRC ruled that it would not comply with the NIRS request and
declared that US nuclear plants would only need to have seven days worth of
emergency fuel available on-site. The Pentagon has been exploring ways to prevent
Y2K failures from causing the accidental launch of nuclear missiles. A more likely
scenario is that missiles could explode at their launch sites. Last March, a
Government Accounting Office report revealed that when the North American
Aerospace Defense Command ran a test for Y2K readiness, "testing problems
occurred." Fortunately, NORAD was able to "recover and continue the mission."
"Computer errors are, by their very nature, idiosyncratic," notes the British
American Security Information Council. Because of this, "The real cure is to take the
weapons off alert." Rep. Ed Markey (D-MA) and other members of Congress have
called for a global "nuclear stand-down" before December 31, 1999. "De-alerting of
nuclear warheads would ensure that Y2K would not start an accidental nuclear war,"
Kikuchi says. "US and Russian nuclear weapons are on hair-trigger alert even
though the Cold War is over. De-alerting means to disable the weapons delivery
systems in such a way that human action is required for a launch to succeed.
Currently, all other nuclear weapons states are in de-alert status."

2NC Turns Warming

Causes leakage --- turns warming
Institute for Energy Research, 2/25/2016 The Escalating Cost of Electricity,
The CPP is requiring the 47 states to collectively reduce carbon dioxide emissions from electric generators by 32
percent in 2030 from 2005 levels. However, each state is given a separate target by the EPA with many of the coal

generators in these states will have to prematurely shutter coal-fired plants
and replace them with plants that produce less carbon dioxidemost likely wind and
generating states being assigned reductions that are much larger than the 32 percent. As a result,

solar power plants that need natural gas or coal-fired plants to back them up when the sun is not shining and/or the

the reductions that the United States makes will quickly be

scooped up by Chinaby far the worlds leader in carbon dioxide emissions
or India, who will surpass the United States as the second largest carbon dioxide
emitter, since neither country expects to make reductions in their carbon dioxide
emissions by 2030. China expects to peak carbon dioxide by 2030 and India will
not agree to a firm reduction. Both countries need to bring electricity to many
residents that do not have access to it and coal is the least expensive way to
accomplish that globally. Thus, EPA wants U.S. consumers of electricity to pay
escalating prices as the above graphs indicate just so countries like China
and India can emit more carbon dioxide, in hopes they will start reducing
carbon dioxide sometime down the road. Further, while the regulations on carbon dioxide
emissions would shut down roughly 40 percent of Americas coal-fired power generation, the end result
would be just a 0.01 degree Celsius reduction in global temperatures by
2100[iii]hardly a reason to cause U.S. electric consumers such financial pain,
which impacts the elderly, minorities, and the poor the most. Conclusion Electric rates are increasing
nationally and to a greater extent in certain states, such as Colorado. These increases are caused by
onerous regulations that result in shuttering existing coal and nuclear plants
prematurely and replacing them with new plants that are higher in cost. Electric rates will
continue to escalate as EPA imposes more regulations, such as its regulation of carbon
dioxide emissions from power plants. Consumers need to be aware that their electric rates will
increase for very little gain as other countries will surely increase their
carbon dioxide emissions as they provide electricity to their citizens to improve
wind is not blowing. Further,

their quality of life.

2NC Turns Econ

Major power outages take down the economy
Montgomery 12 --- Montgomery, Dan D'Ambrosio, Greg Clary and Todd B. Bates, Gannett News Service, Aug 27, 2012, Asbury Park Press, "A high risk of failure? Power grid put to the test"
A major blackout in hyper-wired America would also have crippling consequences, with some experts predicting economic losses up to $180 billion. "This is really the fundamental linchpin for everything in our society, our economy, our quality of life," said Massoud Amin, a University of Minnesota professor and longtime electric industry analyst and consultant. "By deferring infrastructure upgrades, we are basically increasing the risk for the whole system." In America, extreme weather is driving the discussion. Terry Boston, president of the PJM bulk electricity management grid that serves 60 million residents in parts of 13 states, including New Jersey, said doubts are growing over forecasts based on long-term weather trends, typically 30-year averages. PJM experts, he said, could soon factor climate change and extreme events into their planning models for delivering power and for restoring it when big storms turn off the lights. "I cannot think of any year in my career with more challenges," Boston said. U.S. Energy Secretary Steven Chu said there's urgency in moving forward quickly. "Blackouts and brownouts already cost our economy tens of billions of dollars a year, and we risk ever more serious consequences if we continue to rely on outdated and inflexible infrastructure," Chu recently told a Congressional committee.

2NC Chem impact

Persistent blackout wrecks the chem industry
Latynina 3 --- Yulia Latynina, journalist for Novaya Gazeta, World Press Review (Vol. 50, No. 11)
The scariest thing about the cascading power outages was not spoiled
groceries in the fridge, or elevators getting stuck, or even, however cynical it
may sound, sick patients left to their own devices without electricity-powered
medical equipment. The scariest thing of all was chemical plants and
refineries with 24-hour operations, which, if interrupted, can result in
consequences even more disastrous and on a larger scale than those of
an atomic bomb explosion. So it is safe to say that Americans got lucky
this time. Several hours after the disaster, no one could know for certain whether
the power outage was caused by an accident or someone's evil design. In fact, the
disaster on the East Coast illustrates just one thing: A modern city is in itself a
bomb, regardless of whether someone sets off the detonator intentionally
or by accident. As I recall, when I was writing my book Industrial Zone, in which
business deals were bound to lead to a massive industrial catastrophe, at some
point in time I was considering making a cascading power outage the cause of a
catastrophe. Back then, I was amazed and shocked at the swiftness of the process.
Shutting down at least one electric power plant is enough to cause a drop
in power output throughout the entire power grid. This is followed by an
automatic shutdown of nuclear power plants, a further catastrophic drop
in power, and finally a cascading outage of the entire grid system. To start
with, the electric power plant may burn out because of just about
anything. In Ekibastuz [Kazakhstan] under the Soviet regime, a large hydroelectric
power station was burned to the ground because of the negligence of one extremely
smart worker, who used a wrench to unscrew the cap from a pressurized oil vessel.
A stream of oil shot up to the ceiling; the worker got scared and dropped the
wrench, which hit against the steel floor and created a spark that set the stream of
oil on fire. Then the lights went off. Which brings us back to our main thesis. In
order to destroy a modern city, one does not need to have nuclear
weapons, because the modern city is in itself a weapon. The city
infrastructure is an infrastructure with dual purpose. Why should terrorists
need chemical weapons if their enemies already have chemical plants? Why should
terrorists need nuclear weapons if their enemies already have skyscrapers and
airplanes with tanks full of fuel, which can be hijacked with the help of a penknife?
Why would they need sophisticated military technologies and stolen explosives if
the KamAZ truck that blew up the hospital in Mozdok was carrying a load of, let us
say, fertilizer? So-called dictatorship regimes and terrorists themselves have long
since figured that out. That is exactly why there were no nuclear or bacteriological
weapons in Iraq. Why not? A bomb planted on an airplane would kill dozens fewer
people than a failure of the air traffic control system of a large airport. Sept. 11
taught the world that the infrastructure of the modern civilization could be
as lethal as the weapons themselves. Last week, a significant and major
addition was made to the lesson of Sept. 11: The actions of terrorists can't always
be distinguished from the actions of a drunken dispatcher or random lightning.

Tribe Link Ext.

The plan wrecks the grid:
No limiting principle --- the Clean Air Act isolated EPA regs
to power plants --- CPP breaks that framework by
regulating the supplier and user, which authorizes the fed
to regulate any application of electricity, which can be
anything --- creates "plant to plug" regulations that burden
power distributors
Centralization --- it makes the EPA a "National Electricity
Czar" with authority to set and adjudicate all state grid
policies --- it goes vastly farther than FERC jurisdiction
and threatens dynamic management that's sustained the
grid for decades --- that's Scherman and Tribe

A2 link turn
CPP will strain the grid and cause blackouts via dispersion --- also
causes international free-riding
Segal 15 --- Scott Segal is executive director of the Electric Reliability Coordinating Council and former Emory debater, August 31, 2015, "EPA's Clean Power Plan ignores costs, threatens reliability"

We can expect significant potential threat to the electric reliability upon which
our modern way of life depends. Experts argued the proposed rule strained essential
services. The final rule is little improved. While it offers welcome delays in
the onset of compliance, it still creates a choke point by requiring draft
plans to be filed in just 12 months. Further, the "safety valve" it creates for
reliability is too little, too late --- like pulling an emergency brake after the
accident is already occurring. Renewables are an important element of a
balanced portfolio --- as power companies know, because they are making
substantial investments in them --- but they cannot come at the expense of
maintaining critical base-load power plants. Is all the pain worth it for the
benefits of the Clean Power Plan? No. The rule is not likely to decrease the
harms associated with global warming, given that climate change is an
international phenomenon. While the White House hopes other nations
will follow our lead at December's climate summit in Paris, it is just as
likely that they will simply exploit competitive advantages created by our

EPA is woefully inexperienced --- disrupts dynamic management of the grid
Heins 15 (Energy and Regulatory Consultant @ The World Merchant, "Clean Power Plan: Why Has the EPA Given Up on Cooperative Federalism?")
The idea of "Cooperative Federalism" began with the New Deal in the 1930's, when it came to include a division of responsibilities among the states and the federal government agencies of electric power and distribution. By the passage of the Clean Air Act of 1970, the EPA set the minimum standards for the states to best implement their individual utility plans to meet air pollution goals with approval of the EPA. This dynamic partnership, with the State Utility Commissioners, state utilities, Federal Energy Regulatory Commission, Nuclear Regulatory Commission, Department of Energy, state and regional transmission lines, has lasted for almost 80 years, with very positive impacts. More importantly, this state and federal electrical grid partnership developed the necessary long term planning expertise, engineering sophistication, vast financing mechanisms and political mandate to develop the most robust electrical grid in the world. It also had "the machinery for change," as Leonard Cohen put it. Then, suddenly the EPA announced its "Clean Power Plan" in 2013. Several constitutional scholars saw this plan, using 111(d) of the Clean Air Act, as a significant federal agency over-reach that some have called "regulatory capture." Experts such as William Yeatman of the Competitive Enterprise Institute believe that the EPA should avoid this aggressive intervention and continue a policy of "Cooperative Federalism" by using the "normal tools of government" including the electoral process and political

The facts seem to support the historical approach of this well-rounded cooperation. In a recent news release, the EPA said that it has recorded state efforts that consistently met or exceeded the federal requirements for energy efficiency, fuel use, renewable energy, and other high-performance sustainable building metrics. In 2013, for example, EPA oversaw the 24 percent energy intensity reduction from its FY 2003 baseline, a reduction from the FY 2013 energy intensity by 25.6 percent from FY 2003. In FY 2013, EPA also measured a reduced fleet petroleum use by 38.9 percent compared to the FY 2005 baseline, exceeding the goal of 16 percent. In addition, the EPA reports that greenhouse gases in the US have been reduced by 10 percent 2005-2012. In the states, the 50 separate Public Utility Commissions (and their National Association of Regulatory Utility Commissioners) have been exercising their authority and responsibility for working with state governments, power plant operators, business community, state environmental groups, consumer groups and transmission companies to provide electricity to power the largest economy in the world. Currently, 47 states have demand-side energy efficiency projects, all with measurable results, 38 states have Renewable Portfolio Standards (RPS), 10 states have voluntary market-based Green House Gas (GHG) emission trading programs and numerous large private companies and publicly traded utility companies have been pursuing voluntary emission reduction strategies.

In a recent presentation at a conference of the American Meteorological Society in Phoenix, EPA Administrator Gina McCarthy said that "Science is under attack like it has never been before," which seems like hyperbole, at the least, or a highly political rationalization, at the most. In a recent editorial in Science Magazine, the executive publisher Alan I. Leshner said: "If the general public is to share more opinions with members of the scientific community, scientists themselves cannot ignore concerns that people may have about the research process or findings. There needs to be a conversation, not a lecture." Adding to the overall scientific confusion are recent stories about "global warming" by many news outlets like the BBC, Forbes, the New York Times, The Economist and CBS; they have reported that there has been no measurable increase in temperature over the last 15 years, also known as the "global warming pause." On the other hand, other media sources like the World Meteorological Organization, The Guardian and Climate Central are reporting that the 10 warmest years have been since 1998. Surely, these disparities represent a major disagreement between respected sources of weather science information. For the record, the United Nations International Panel on Climate Change's (IPCC) latest study shows a temperature increase of 0.09 degrees Fahrenheit since 1998. Unsurprisingly, a recent Ohio State University 2015 study suggests that "both liberals, conservatives have science bias" when they are presented with facts that challenge some of their political beliefs. Finally, there are several EPA Climate Change assertions which can be vigorously debated. For example, the EPA News Release of October 31, 2014, talks about the impacts of climate change across the country, "ranging from more severe droughts and wildfires to record heat waves and damaging storms." One could easily argue that none of the events need necessarily have been caused by global warming. In fact, there is no detailed scientific evidence to ascribe "climate change" to any of these natural events. All of this leads me back to my original point: Why has the EPA given up on cooperative federalism and replaced it with the "Clean Power Plan"? This complex plan simply does not take into consideration many of the costs related to its comprehensive plan, like new transmission infrastructure, new power plant construction and the stranded costs created by shuttering many coal-burning power plants and current transmission lines. The EPA looks woefully unprepared for the planning, oversight and execution necessary for its own Clean Power Plan. Without an immediate global warming crisis, the rationale and political will for such precipitous action as proposed in the Clean Power Plan seems more political than practical, especially given the fact that the EPA has almost none of the technical, financial and engineering expertise developed over 80 years by the group of US electric grid stakeholders. Most surely, the current state of Cooperative Federalism has proven to be capable of providing inexpensive and abundant energy, which is environmentally progressive and economically sound. Ultimately, the EPA should be part of the total solution, not a part of the problem and the creator of state and federal uncertainty many years into the future.

A2 k2 reliability
NERC agrees the CPP harms reliability --- multiple args --- no link turn
NERC 16 --- North American Electric Reliability Corporation ("Potential Reliability Impacts of EPA's Clean Power Plan")
The results of the CPP analysis underscore significant changes occurring on the BPS both as a matter of the course of business as well as a direct effect of CPP implementation. The results of the IPM and AURORAxmp models have been delineated in the preceding chapters along with an overview of the key assumptions that went into deriving the aforementioned results. A deviation from the relevant input assumptions can introduce additional BPS risks. Additionally, the outputs themselves pose potential reliability challenges. These reliability challenges are discussed in more detail below.

Essential Reliability Services Risks: The CPP will accelerate wind and on-grid solar development that will escalate the need for ERSs, particularly in MISO and SPP. These critical services will take time and greater investment to develop. If these services are unable to keep up with the renewable growth, grid reliability can become increasingly challenged, particularly given that the renewable market share exceeds 30 percent of all generation in multiple power pools. The North American BPS is undergoing a fundamental shift to a resource mix that relies less on conventional generation resources such as coal,

nuclear, etc., to more asynchronous, distributed and storage-enabled resources such as wind, solar, and storage. In
addition, the modern grid system will change in future to incorporate microgrids, smart networks, and other advancing technologies. NERC formed the Essential Reliability Services Task Force (ERSTF) that studied the implications to planning and operating the BPS in the face of these changing resource mix. Essential Reliability Services (ERSs) include three important building

blocks of reliability, namely; frequency support, ramping, and voltage. In order to maintain the reliability of electric
grid, resources need to be able to provide frequency support, voltage control, and ramping capability. The ERSTF
evaluated the capabilities of newer resources in terms of providing ERSs to see if they are able to provide them.
ERSs will be needed for future as we transition from conventional generation mix to newer resource mix. Based on
the analysis of geographic areas that are experiencing the greatest level of change in their types of resources, a number of measures and industry practices are recommended to identify trends and prepare for the transition in resource mix.

Frequency [20]: The electric grid is designed to operate at a frequency of 60 hertz.

Deviations from 60 Hz can have destructive effects on generators, motors, and equipment of all sizes and types. It is critical to maintain and restore frequency after a disturbance such as the loss of generation. An instantaneous (inertial) response from some resources and a fast response from other resources help to slow the rate of frequency drop during the arresting period by providing a fast increase in power output during the rebound period to stabilize the frequency.

Ramping [21]: Ramping capability (the ability to match load and generation at all times) is necessary to maintain system frequency. Changes to the generation mix or the system operator's ability to adjust resource output can impact the ability of the operator to keep the system in balance.

Voltage [22]: Voltage must be controlled to protect

system reliability and move power where it is needed in both normal operations and following a disturbance.
Voltage issues tend to be local in nature, such as in sub-areas of the transmission and distribution systems.
Reactive power is needed to keep electricity flowing and maintain necessary voltage levels.

NERC determined the ERSTF work to be ground breaking and accommodating to newer technologies to be integrated in the BPS. Thus, there is a further
need to evaluate the ERS measures and identify their sufficient levels for various geographic areas. ERSTF will
continue its work as Essential Reliability Services Working Group (ERSWG) to further analyze these

measures. In addition, the ERSWG in a separate effort will evaluate the impact of distributed energy resources on the BPS.

Replacement Capacity Risk: This report discusses the level of retirements for coal resources as well as the need to replace that capacity with new resources, primarily natural gas and renewables in order to meet electricity demand. Replacement capacity may take years to plan, permit, finance, and build. The assessment produces the results

indicating the level of resources that need to be built in order to meet demand; however, system planners must
adequately plan for new resources to ensure that they can be built in a timely fashion to accommodate retirements and additional demand growth. Given that it takes up to five years or more to bring new natural gas-fired generation online, the program compliance timelines are already very tight. If the large block of accelerated unit retirements projected by the IPM model occur as early as this year, some reliability margin requirements could be missed if there are an insufficient number of replacement capacity projects planned or projects are delayed. Without replacement capacity, systems may be left with the choice of continuing to operate their higher-emitting coal-fired capacity and exceed state caps or operate systems below their reserve margin requirements.

Program Schedule Risk: In addition to the necessary planning for resource adequacy, uncertainty exists around program scheduling. Timing considerations must be accounted for in regards to program development and receiving all necessary authorizations throughout the regulatory process. That process would include time for public comment as well as receipt of EPA approval and the necessary time to implement appropriate compliance strategies. The 2022 implementation date could be a challenge for states that need to pass authorizing state legislation to develop the needed program framework and enforcement policies required to gain approval of their state implementation programs. In addition, the states need to develop state implementation plans, gain EPA approval, and provide sufficient time for the affected sources to plan, permit, finance, and build replacement capacity, transmission, and any needed upstream infrastructure (e.g., natural gas pipelines). With the CPP being litigated, it is uncertain what the final rule deadlines will be and when all states can complete this process.

The only argument we need to win is that the aff makes that planning harder because minor fluctuations spill up
Johnston 15 --- honestly can't find quals, but it's in a law review? (Victoria Johnston, "Storage Portfolio Standards: Incentivising Green Energy Storage," 20 J. Envtl. & Sustainability L. 25 (2015))
In addition to the limitations discussed above in bringing renewable energy online, the intermittent nature of renewable energy faces a large infrastructure challenge. One large limitation in the current electrical system, no matter what the generating source is, is the electrical grid's susceptibility to power failures. Grid failure occurs either when insufficient electricity is available for consumers to use or when energy suppliers put more electricity on the grid than consumers demand. 104 This imbalance can lead to blackouts,

which can be costly and take time to mend. Because the balance of the system is so central to avoiding blackouts, the intermittent nature of renewable energy makes those sources more complicated to manage. In the normal course of use, generators inevitably fail on occasion. When one generator fails, the electrical grid operator will compensate for that failure by pulling power from other sources. 105 Electrical engineers rely on "spinning reserves" when this occurs. 106 Spinning reserves are "the extra generating capacity that is available by increasing the power output of generators that are already connected to the power system." 107 This extra generating capacity is "ready to instantaneously respond to control signals from the system operator in order to maintain transmission system integrity." 108 However, when grid operators pull too much energy from compensating sources, they can cause the generators providing the extra generating capacity to fail, which in turn causes the grid operator to over-tax different power generators, which may also fail, creating what is known as a "cascading failure." 109 Cascading failures can eventually lead to blackouts. 110 One of the largest cascading failures resulting in a blackout in North America occurred on August 14, 2003. 111 That blackout affected an estimated fifty million people in eight U.S. states and Ontario, Canada for up to a week. 112 The blackout cost the United States between $4 billion and $10 billion, 113 and in Canada, "gross domestic product was down 0.7% in August, there was a net loss of 18.9 million work hours, and manufacturing shipments in Ontario were down $2.3 billion (Canadian dollars)." 114 To avoid
blackouts, the electrical grid requires the balancing of supply and demand to a near-perfect degree. In the United States, operators maintain the electrical grid at 60 hertz ("Hz"). 115 Ideally, this means that "the transmission grid would always operate precisely at 60 Hz . . . even as its millions of consumers impose varying loads at tens of thousands of substations." 116 A demonstrative case of electrical grid balancing occurs in England, in what is known as the "TV pick-up." 117 After a popular television show or sporting event ends, "[m]illions of lights and kettles are simultaneously switched on" while "[t]he National Grid . . . must keep the frequency at 50hz." 118 This often requires turning additional peak load generators online specifically to combat the power surge. 119 To combat the threat of blackouts, electrical operators "have to forecast [energy demand] second by second, minute by minute." 120 They base predictions on what customers required on a similar day with "exactly the same weather." 121

Maintaining the grid at an exact frequency is nearly impossible. Electrical grid operators must not only balance supply with changing power demands, but also compensate for equipment failure. To make the task even more difficult, the electrical grid operator needs to take into account the "finite response time of each generator."122 That is, some generators react more quickly to commands than others: some generators can come online at the flip of a switch (i.e., natural gas), while others take some time to warm up.123 Because an electrical operator cannot predict demand with perfect accuracy, operators set limits that define a range of frequencies within which they must maintain operations. "Operators of the grid maintain its reliability by ensuring that deviations [from the ideal 60 Hz] never grow to catastrophic size."124 These limits give operators a minimal amount of wiggle room to balance the grid, but if the electrical grid operator allows the frequency to go outside of the outer bounds, the grid can fail and cause blackouts. Time-intermittent wind and photovoltaic power create even more challenges for grid operators. As the demand for renewable energy sources increases, the shortcomings of the electrical grid will become more problematic. "The changes will lead to grids that are more stochastic and exhibit dynamics requiring new stability criteria that address emerging problems and can be evaluated faster, closer to real time."125 In Germany, where renewable energy sources have priority over traditional power plants, transmission companies send excess renewable energy to other countries because of the grid's inability to store electricity efficiently.126 In the United States, the Bonneville Power Administration, based in the Pacific Northwest, has taken another approach, which includes booting wind energy supplies offline in favor of hydroelectric power when too much supply exists, allowing clean energy to go unused.127 While grid operators make these decisions based on the necessity to balance the grid, decisions like Bonneville's can make investments in clean energy
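
The balancing problem the card describes can be sketched as a toy simulation. This is a minimal illustration, not a real grid model: the linear frequency response, the gain, and the tolerance band are all assumed numbers.

```python
# Toy sketch of grid frequency balancing: frequency drifts with the
# supply-demand imbalance, and the operator's tolerance band around
# 60 Hz bounds how long a mismatch can persist. All numbers illustrative.

def simulate(supply_mw, demand_mw, steps, nominal_hz=60.0, band_hz=0.5, k=0.01):
    freq = nominal_hz
    for _ in range(steps):
        imbalance = supply_mw - demand_mw      # MW of mismatch this step
        freq += k * imbalance                  # simplistic linear response
        if abs(freq - nominal_hz) > band_hz:   # outside the wiggle room
            return "outside limits: risk of cascading failure"
    return f"stable at {freq:.2f} Hz"

print(simulate(1000, 1000, 100))  # balanced load: stays at 60.00 Hz
print(simulate(1005, 1000, 100))  # a sustained 5 MW surplus drifts out of band
```

Even a small sustained mismatch eventually leaves the band, which is why operators must rebalance continuously rather than occasionally.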

A2 Timeframe/Adapt
Ev is about CPP shift to smart grids

US grids are secure now; the shift to smart grids invites hacker and foreign-nation tampering.
Mills, 16
[Mark, Senior Fellow at the Manhattan Institute and author of Exposed: How America's Electric Grids Are Becoming Greener, Smarter, and More Vulnerable, June 2016]
Electric grids have always been vulnerable to natural hazards and malicious physical attacks. Now the U.S. faces a new risk: cyberattacks that could threaten public safety and greatly disrupt daily life. Utility executives and other experts argue persuasively that U.S. grids, especially long-distance grids, are currently well secured. Yet the key issue is not today's security but tomorrow's. Here the risks are growing rapidly. The push for greener and smarter grids requires far greater grid-Internet connectivity to ensure the continuous delivery of electricity. These greener, smarter grids will involve a vast expansion of the Internet of Things that greatly increases the cyberattack surface available to malicious hackers and hostile nation-state entities. Cyberattacks overall have been rising 60 percent annually for the past half-dozen years, and utilities are increasingly targeted. A Cisco study found that 70 percent of utility-security professionals say that they have experienced at least one security breach. For their part, federal and state governments genuflect to the goal of reliable, resilient, and affordable electric service. Yet comparatively trivial sums are directed at ensuring that grids are more secure, compared with the vast funding to promote, subsidize, and deploy green energy on grids. The central challenge for U.S. utilities in the twenty-first century is to accommodate the conflict between political demands for more green energy and society's demand for more reliable delivery of electricity. Greater grid cybersecurity in the future means that policymakers must rethink the deployment of green and smart grids until there are assurances that security technologies have caught up. While the government needs to improve its vital role in helping with cyber situational awareness, the private sector must lead the way in defending against cyberphysical threats that evolve and move at tech-sector, not bureaucratic,
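
Mills's "attack surface" point can be made concrete with a toy probability model. The per-device compromise probability below is an assumption for illustration, not a measured figure.

```python
# Toy model of attack-surface growth: if each Internet-connected grid
# device has a small, independent chance of carrying an exploitable
# flaw, the chance that at least one does rises fast with device count.
# The 0.1% per-device figure is an assumption, not a measurement.

def p_at_least_one_flaw(n_devices, p_per_device=0.001):
    return 1.0 - (1.0 - p_per_device) ** n_devices

for n in (100, 10_000, 1_000_000):
    print(f"{n:>9} devices -> {p_at_least_one_flaw(n):.4f}")
```

The point is qualitative: adding millions of connected devices makes some exploitable entry point close to a certainty, which is what "vastly expanding the attack surface" means in practice.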

Smart grids are easier to hack; the plan makes grids less secure.
Mills, 16
[Mark, Senior Fellow at the Manhattan Institute and author of Exposed: How America's Electric Grids Are Becoming Greener, Smarter, and More Vulnerable, "Smart Grids: Greener & Easier to Hack," Real Clear Policy, July 14, 2016]

The problem? Smarter and greener requires that the grid be more fully connected with the Internet. Smart grids depend on Internet smarts. And solar and wind energy both require Internet-centric mechanisms to meet the challenge of using episodic supplies to fuel society's always-on power demand. Thus, policies from California to New York, as well as the EPA's Clean Power Plan, envision adding millions of Internet-connected devices to electric grids, hospitals, and cities. For hackers, this is called vastly expanding the attack surface. In that smarter future, the cyber-hacking skills bad actors have honed to break into private and financial data can be directed at breaking into and controlling critical physical infrastructures. Experts have demonstrated hacks into the entire panoply of devices associated with smart and green power, from smart lights and power meters to the power electronics on solar panels. Cybersecurity has simply not been the priority in green policy domains, even though technical and engineering message boards and publications are filled with examples of cyber-vulnerabilities or weak or non-existent cybersecurity features.

With the full flowering of smarter infrastructures, just what are we likely to face? Imagine it's a scorching-hot summer day in Los Angeles sometime in the near future, and the power in one wing of a hospital goes down, taking with it the air conditioning and all the critical hospital equipment, from MRIs to life-support. The CEO gets a text from her facilities manager a few minutes before another wing in a different, larger hospital in the network goes black, too, as the back-up generator fails to start. This is followed by an email from the hacker stating that the power at all the hospitals will be shut down within an hour. The ransom is, say, $10 million in Bitcoins. Now imagine a different scenario, this time a hot Manhattan evening when several blocks go dark. It's not a ransom this time but a threat: more is coming. The mayor gets an image on his smartphone of the July 25th, 1977 cover of Time Magazine with its headline "Night of Terror." That 1977 New York City blackout lasted 25 hours, involved thousands of ransacked stores and fires, 4,000 arrests, and $300 million in damages. This time, the mayor also worries that the attacker could be coordinating an array of Orlando-type physical assaults to fuel the chaos. In the first case, the ransom gets paid and power comes back. In the second scenario, no physical attacks happen, but it takes two days and heroic efforts from ConEd's crews to restore power by reverting to older manual systems that bypass the smart stuff. But the terrorists made their point. And in both cases, forensic teams from the Department of Homeland Security, the FBI, and DOD's Cyber Command descend. They learn that a sophisticated phishing scam inserted a computer worm, combined with malware loaded earlier in a backdoor hack into a power monitoring device, enabling the remote seizure of local power network controls. The NSA traces the cyber breadcrumbs to anonymous servers in Georgia (the country, not the state), or Iran, or China, and a dead end. Sound far-fetched?
Consider where we are today: ransomware attacks are already a scourge. The American Hospital Association reported that several health care companies and hospitals were hit earlier this year with ransomware (most paid). But, so far, hackers can only shut down a target organization's access to its own computer system or e-commerce Web site. As for the future, consider that for hackers, today's Internet-connected cars look just like tomorrow's connected grids. Researchers have hacked the Ford Escape, Toyota Prius, Nissan Leaf, and, to great fanfare, a Jeep Grand Cherokee. Last year's cyber-jacking of a Jeep took full control from ten miles away by exploiting vulnerabilities in the Internet-connected infotainment system to backdoor into the car's microcomputers that operate the steering and brakes. In the wake of that stunt, Chrysler recalled over a million cars and corrected those particular vulnerabilities. Earlier this year, the FBI and NHTSA issued a general alert regarding vehicle cyber vulnerabilities. Everyone on both sides knows it's only the tip of the cyber-berg. In fact, there have already been cases of grid-like cyber-jacking. In 2008, a Polish teenager hacked a city's light-rail controls and caused a derailment. In 2010, the world learned of a clandestine hack, ostensibly U.S.-Israeli, that inserted the Stuxnet computer virus to damage the electrical infrastructure of Iran's nuclear facilities. In 2015, hackers breached the operating system of a German steel mill, causing enormous physical damage. And this past December, hackers blacked out Ukraine's electric grid. So far there have been no such hacks on U.S. power grids that we know about. And experts testifying before Congress about the Ukraine event credibly asserted that America's long-haul grids are better protected, at least for now. But that's not the issue. Exposure is a problem not so much with long-haul grids but with local grids in cities and communities, where all the Internet smarts are planned. As green connectivity is accelerated onto those grids, the attack surface expands. Today's grids are, by Silicon Valley standards, dumb, even if deliberately so. But we already know what adding more Internet connectivity enables. The Department of Homeland Security asserts that America's manufacturing and energy sectors are the top two targets for attacks on cyber-physical systems. And Cisco reports that 70 percent of utility IT security professionals discovered a breach last year, compared with 55 percent in other industries. Here's the rub: green grid advocates are pushing policies that will create more Internet-exposure precisely when bad actors and hostile nation states are rapidly escalating their hacking skills.

Renewables now solve the aff, but the plan rushes the transition and causes our impacts
Porter, et al, 15, (Michael, Bishop William Lawrence University Professor, Harvard Business School, with David Gee, Senior Partner and Managing Director at The Boston Consulting Group, and Gregory Pope, Principal at The Boston Consulting Group)
Policies at both the state and federal level will continue to encourage lower-carbon energy solutions. State Renewable Portfolio Standards will cumulatively require a minimum of 60 GW of new renewable generation by 2030, 40% higher than is mandated today.158 In addition, 13 states have introduced greenhouse gas emissions limits that will require further shifts to lower-carbon power.159 Federal standards will also ensure that vehicles and appliances continue to improve their energy efficiency. There are also a growing number of other proposals that would encourage carbon reductions over the next 10-15 years and longer. The Obama Administration, for example, has recently introduced the proposed Clean Power Plan (CPP)160 that covers carbon reductions in the power sector, signed a greenhouse gas emissions accord with China,161 and made U.S. greenhouse gas reduction pledges to the Paris round of international climate negotiations.162 Each proposal targets a 25-30% reduction in carbon emissions by 2030 compared with 2005 levels. These proposals face stiff political and legal challenges, but the reality is that numerous factors are likely to encourage additional reductions, particularly as the economics increasingly favor cleaner energy. Addressing climate change is not just a U.S. trend, but increasingly a global one. (See Figure 22 on page 38.) The European Union, long a leader in climate action, has extended emissions reductions targets to 2050.163 Mexico has announced an unconditional 25% emissions reduction from its business-as-usual scenario by 2030, which would increase to 40% with a global climate deal.164 Even China, a traditional opponent to any restrictions on its carbon emissions, has agreed to carbon targets for the first time, pledging to begin reducing emissions by 2030 in its recent accord with the U.S.165 Political debates over climate change will continue, as in Australia, which enacted carbon limits and then repealed them.166 Some countries have also missed their Kyoto Protocol commitments.167 However, while the right targets and the best policies are still being debated, the general trend and current momentum for carbon reductions are greater than at any time over the last 15 years. NATURAL GAS AND THE U.S. ENERGY TRANSITION The U.S. position in natural gas is a crucial asset in making America's energy transition both feasible and at a competitive cost across a range of carbon reduction scenarios, at least through
2030. Natural gas can replace up to 50% of the existing coal capacity by 2022 at lower cost,168 providing significant economic and carbon benefits, regardless of other climate policies.
EPA Administrator Gina McCarthy put it well in April 2015 when she said, "[Fracking] has changed the game for me in terms of how the energy system is working. The inexpensive gas that's being produced has allowed us to make leaps and bounds in progress on the air pollution side and, frankly, to make the Clean Power Plan."169 Natural gas essential for near-term carbon reductions: Natural gas is the only fuel that can cost-effectively deliver large-scale carbon emissions reductions in the near term, including the 30% carbon emissions reduction targeted by the proposed Clean Power Plan. A 2014 CSIS/Rhodium Group study170 shows that increasing natural gas's share of power generation from 28% today171 to 43% by 2030 allows the U.S. to meet the 30% reduction target of the Clean Power Plan without significantly increasing the cost of electricity in the U.S.172 The study estimates that power rates would rise by around 4%, while overall energy expenditures would remain nearly flat, assuming that states coordinate their implementation.173 (See Figure 23 on page 39.) Unconventional natural gas also gives the U.S. a competitive advantage in moving to a low-carbon energy system over other countries that lack abundant natural gas resources. Without a supply of low-cost gas, Germany, for example, set aggressive renewables goals and then spent $400 billion in direct government subsidies to support renewable growth.174 The price of electricity for residential customers increased by 70% between 2004 and 2014.175 The share of renewables has increased to about 25%,176 but the share of coal-fired power has actually increased as well.177 Greenhouse gas emissions have only fallen approximately 10% since 2000.178 An all-renewables approach not feasible: Switching the U.S. to all-renewable power in the near term is neither technically nor economically viable. A faster transition to renewables would require significant increases in electricity rates immediately. While renewable energy is becoming more cost-effective with each passing year, the current average unsubsidized cost differential with natural gas is 20-100% higher for wind and 90-175% for solar, depending on the state.179 As the German example shows, major subsidies or much higher
electricity bills would be required to meet the Clean Power Plan, or similar reduction goals, using renewables alone. In addition to the higher cost of generation, the transition to a high renewable share will require an estimated $750 billion in grid improvements in the U.S. to handle large volumes of intermittent renewables and the more sophisticated forms of energy management and efficiency needed.180 Transmission and distribution lines will require additional capacity and two-way flows to manage widening sources of intermittent renewables. Smart grid metering and control systems need to become more sophisticated and widespread to allow grid operators to harmonize the new, complex flows of power supply and demand. Practically, this process will require a 20- to 30-year period.181 Natural gas needed for standby power: Natural gas power plants are a necessary complement to the scale-up of renewables. As renewables gain share, backup capacity will need to grow significantly to ensure that a large volume of on-demand power can come online over extremely short periods to compensate for absences of wind or sun. (See Figure 24.) The particular levels of backup capacity required will depend on the percentage and distribution of intermittent renewables, as well as the ability of the grid to utilize demand response and storage, but will amount to a significant portion of the total installed renewables capacity. Natural gas power plants are by far the most efficient source of backup power, at least over the medium term. Natural gas plants can be brought online in under an hour, in some cases as rapidly as 15 minutes,182 compared with eight to 48 hours to start up a coal-fired plant.183 Natural gas plants can also operate more efficiently across a variety of load factors, allowing them to meet varying needs throughout the day. While energy storage solutions, such as large-scale batteries, may eventually become economic to provide backup power, they are years away from being competitive with gas-fired plants.184
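
The ramp-time argument in this card can be sketched in a few lines. The start-up times echo the card's figures (gas in roughly 15-60 minutes, coal in 8-48 hours); the capacities and the greedy dispatch rule are assumptions for illustration only.

```python
# Sketch of why ramp speed makes gas the practical backup for renewables:
# only plants that can start before the shortfall hits are usable.
# Capacities and the fastest-first dispatch rule are illustrative assumptions.

BACKUP_SOURCES = [
    {"name": "natural gas peaker", "startup_min": 15, "capacity_mw": 500},
    {"name": "coal plant", "startup_min": 8 * 60, "capacity_mw": 800},
]

def cover_shortfall(shortfall_mw, deadline_min):
    # Usable plants are those that can come online before the deadline,
    # dispatched fastest-first until the shortfall is covered.
    usable = sorted(
        (s for s in BACKUP_SOURCES if s["startup_min"] <= deadline_min),
        key=lambda s: s["startup_min"],
    )
    dispatched, covered = [], 0
    for s in usable:
        if covered >= shortfall_mw:
            break
        dispatched.append(s["name"])
        covered += s["capacity_mw"]
    return dispatched, covered >= shortfall_mw

# A 400 MW wind lull forecast an hour out: gas can cover it, coal cannot start in time.
print(cover_shortfall(400, 60))
```

With an hour of warning, only the gas peaker qualifies; coal's eight-hour start-up excludes it entirely, which is the card's point about standby power.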