
1NC

Off-Case
1NC---States CP
The states and relevant sub-federal entities should uniformly substantially
increase pre-discharge permitting and monitoring of Concentrated Animal
Feeding Operations (CAFOs) in the United States.
States solve CAFOs best – resource constraints hamper federal enforcement
Kulkarni 19 [Madhavi Kulkarni, JD Candidate, William & Mary Law School, 2020; Editor-in-
Chief, William & Mary Environmental Law and Policy Review, Volume 44; BA Economics, Rutgers
University, 2017, summa cum laude. Out Of Sight, But Not Out Of Mind: Reevaluating The Role
Of Federalism In Adequately Regulating Concentrated Animal Feeding Operations. William &
Mary Environmental Law & Policy Review, Vol. 44:285, 2019]

Against a meager federal backdrop, states have to shoulder the responsibility of picking up the environmental slack. Because limited funds and resources may prohibit a significant increase in inspections and enforcement actions under a federal inter-region compact, state legislatures and agencies must take advantage of their ability to impose additional requirements on CAFOs. While it may be the case that a single state agency controls the implementation and enforcement of regulations regarding CAFOs, sometimes this responsibility is delegated to more than one state agency. In this case, a cross-agency accountability system, similar to the inter-region compact at the federal level, may incentivize these agencies to work together better. States are also more likely to be motivated to enact stricter laws, given that they are more directly impacted by the activities that occur at CAFOs within the state. It is often easier for individuals to reach their state governments than to reach the federal government, so state governments will most directly interact with those adversely affected by CAFOs.

States are already regulating CAFOs more stringently than federal law
Kulkarni 19 [Madhavi Kulkarni, JD Candidate, William & Mary Law School, 2020; Editor-in-
Chief, William & Mary Environmental Law and Policy Review, Volume 44; BA Economics, Rutgers
University, 2017, summa cum laude. Out Of Sight, But Not Out Of Mind: Reevaluating The Role
Of Federalism In Adequately Regulating Concentrated Animal Feeding Operations. William &
Mary Environmental Law & Policy Review, Vol. 44:285, 2019]

Several states have also placed more stringent effluent limits on CAFOs than required by federal regulations. Many states also impose limitations on the land application of waste, which is the primary mechanism of waste disposal for CAFOs. Under an NPDES permit, the rate of land application of solid animal waste, known as the agronomic rate, is typically determined based on the nitrogen needs of crops. Forty states require that solid waste from CAFOs be applied at this agronomic rate. Thirty-eight states require the development and use of WMPs, though not all require consideration of odor while developing these plans. One state, Georgia, requires a land application system ("LAS") permit. This type of permit prohibits discharge to surface water, and requires ground water and soil monitoring, as well as quarterly reporting. Many states also require that all CAFOs, or at least a larger subset of CAFOs than is required by federal law, obtain NPDES permits. For example, Arkansas requires that all animal feeding operations, of which CAFOs are a subset, that use a liquid waste management system obtain a permit. Some states have required that large farms that have even the potential to discharge waste must obtain permits. Therefore, this permitting requirement reaches further than the NPDES permitting requirements under the CWA, which only address CAFOs that have discharged or expect to discharge animal waste.
1NC---Incentives CP
Text: The United States federal government should provide financial incentives
for Concentrated Animal Feeding Operations to acquire discharge permits
pursuant to the Clean Water Act.

Incentives solve better than regulations and avoid the politics DA


Ruhl, 8 – Matthews & Hawkins Professor of Property, The Florida State University College of Law (J.B., “Symposium: Breaking
the Logjam: Environmental Reform for the New Congress and Administration: Panel IV: Protecting Ecosystems On Land: Agriculture
and Ecosystem Services: Strategies for State and Local Governments.” New York University Environmental Law Journal, 17 N.Y.U.
Envtl. L.J. 424. Lexis.)

One might think implementing this win-win for agriculture [*425] and the environment is a policy "no-brainer," but agriculture
has long been the Rubik's Cube of environmental policy. Although agriculture is a leading cause
of pollution and other environmental harms, n2 it has been resistant to regulation and, for the most
part, remarkably successful at being paid to do the right thing. n3 While other industries have advanced to
flexible, market-based "second generation" environmental policies and beyond, agriculture somehow keeps dodging the bullet. n4
Federal and state agencies have tried to overlay small pieces of conventional regulation on farms,
which farm interests have resisted at every turn, n5 and Congress opens debate on Farm Bills every few years
with promises of innovative policy reform, only to drift back into business as usual. n6 Seldom has so much time, money, and energy
been expended year after year, decade upon decade, to keep policy of any other kind exactly where it started out. Agricultural
economist David Freshwater sums up this history well: [*426] With each farm bill cycle there are calls for a major rethinking of U.S.
farm policy to make it better suit current farm conditions and the expectations of the broader American public about the roles of
agriculture. These calls for reform have been for the most part unsuccessful because there has been no argument compelling
enough to overcome advocates of the status quo. But as time passes the wisdom of maintaining a set of policies that have their basis
in the 1930s and were designed to support a structure of agriculture that no longer exists becomes more questionable. n7 Paying
farmers to do the "right thing" environmentally has been a theme of federal farm policy for
decades, embodied in programs such as the Conservation Reserve Program (CRP), which pays farmers to take land out of
production for defined periods to enhance its conservation values, and the Conservation Security Program (CSP) and Environmental
Quality Incentives Program (EQIP), which pay farmers to employ better practices on working lands. n8 And either paying or forcing
farmers to preserve agricultural land uses at the urban fringe has become a primary driver of state
and local land use policy. n9 In this sense farms have long been understood as land units that have the capacity to
contribute to environmental and cultural values.
1NC---Agriculture DA
Expanded permit requirements wreck the ag industry and the economy writ large
Yager 18 – Chief Environmental Counsel for the National Cattlemen's Beef Association, previously worked for the US
Environmental Protection Agency as a law fellow in the Office of Water where he worked in the branch overseeing the CAFO permit
and other Clean Water Act permits that affect agriculture. Scott Yager & Mary-Thomas Hart, The Tipping Point Source: Clean Water
Act Regulation of Discharges to Surface Water via Groundwater, and Specific Implications for Nonpoint Source Agriculture, 23 DRAKE
J. AGRIC. L. 439 (2018). HeinOnline.

A significant portion of agricultural producers and other regulated entities would face a constant, unpreventable risk of discharge under the direct hydrologic connection or conduit theory. A negligible risk of discharge always exists, so proactive regulated stakeholders will attempt to obtain a CWA NPDES permit to avert liability. The conduit and direct hydrologic connection theories move the goalposts, forcing agricultural producers out of voluntary conservation partnerships and into mandated permitting requirements. Such blurring of the lines will have significant economic impact on regulated stakeholders, and the American economy by consequence. As the EPA under the Trump Administration works to rescind the Clean Water Rule (CWR), one driving force is the negative impact that increased federal jurisdiction could have on economic development, with ever-diminishing environmental benefits. Should the EPA rescind the CWR, returning surface water jurisdiction to the status quo, implementation of the conduit or direct hydrologic connection theory would counteract this action by stretching federal jurisdiction through other means.

That kills US food exports and ag competitiveness


Maday, 12 – managing editor of Drovers magazine, M.S. in extension education, spent six years at the University of Florida developing instructional materials and in-service training programs for vocational agriculture education. John Maday, 3/2/2012. “The cost of regulations and shrinking herds.” http://www.cattlenetwork.com/cattle-news/The-cost-of-regulations-and-shrinking-herds-141240653.html

The researchers found that leading the charge on adopting new regulations that impact production costs is often followed by a substantial decline in production. This impact appears to be magnified when nearby jurisdictions with less additional market-access costs are able to step in and fulfill demand. In other words, production moves to a less-restrictive environment. “The causal effect
between the regulations and drop in production is not well documented,” the authors note, “but we simply note that from our
research, where we have data, the two coincide more often than not. Regulations
should be adopted no faster
than absolutely necessary or as dictated by the market. In the short term, production for the domestic market is
unlikely to move overseas, due to the established infrastructure and feed-production capacity of the United States. Production
could, however, continue to relocate domestically. There is no clear evidence that food safety would worsen with a shift from
domestically produced to imported meat, poultry, and eggs. Data on food safety are poor, the researchers note. Of the markets
reviewed in this assessment, the United States has the most detailed tracking, but even here, the cause of 80 percent of all
foodborne illnesses cannot be attributed to a specific food, much less whether it is imported. The more likely threat
for U.S.
farmers and ranchers from excess regulation comes from reduced competitiveness in export
markets. Much of the growth in output in U.S. animal agriculture is driving exports rather than fueling
internal demand, which has flattened on a per capita basis. Much of the growth potential for our food
products is in international markets.
Strong US ag averts great power war.
Castellaw 17 (John – 36-year veteran of the U.S. Marine Corps and the Founder and CEO of Farmspace Systems LLC,
“Opinion: Food Security Strategy Is Essential to Our National Security,” 5/1/17, https://www.agri-pulse.com/articles/9203-opinion-
food-security-strategy-is-essential-to-our-national-security)

The United States faces many threats to our National Security. These threats include continuing wars with extremist elements such as ISIS and potential wars with rogue state North Korea or regional nuclear power Iran. The heated economic and diplomatic competition with Russia and a surging China could spiral out of control. Concurrently, we face threats to our future security
posed by growing civil strife, famine, and refugee and migration challenges which create
incubators for extremist and anti-American government factions. Our response cannot be one dimensional
but instead must be a nuanced and comprehensive National Security Strategy combining all elements of National Power including a
Food Security Strategy. An American Food Security Strategy is an imperative factor in reducing the
multiple threats impacting our National wellbeing. Recent history has shown that reliable
food supplies and stable prices produce more stable and secure countries. Conversely, food
insecurity, particularly in poorer countries, can lead to instability, unrest, and violence. Food
insecurity drives mass migration around the world from the Middle East, to Africa, to
Southeast Asia, destabilizing neighboring populations, generating conflicts, and threatening our own security by disrupting our economic, military, and diplomatic relationships. Food system
shocks from extreme food-price volatility can be correlated with protests and riots. Food price
related protests toppled governments in Haiti and Madagascar in 2007 and 2008. In 2010 and in 2011, food prices and grievances
related to food policy were one of the major drivers of the Arab Spring uprisings. Repeatedly, history has taught us that a strong
agricultural sector is an unquestionable requirement for inclusive and sustainable growth, development progress, and broad-based long-term stability. The impact can be remarkable and far reaching. Rising income, in addition to reducing the opportunities for an upsurge in extremism, leads to changes in diet, producing demand for more diverse and nutritious foods provided, in many cases, from
American farmers and ranchers. Emerging markets currently purchase 20 percent of U.S.
agriculture exports and that figure is expected to grow as populations boom. Moving early to ensure
stability in strategically significant regions requires long term planning and a disciplined, thoughtful strategy. To combat current
threats and work to prevent future ones, our national leadership must employ the entire spectrum of our power including
diplomatic, economic, and cultural elements. The best means to prevent future chaos and the resulting instability is positive
engagement addressing the causes of instability before it occurs. This is not rocket science. We know where the instability is most
likely to occur. The world population will grow by 2.5 billion people by 2050. Unfortunately, this massive population boom is
projected to occur primarily in the most fragile and food insecure countries. This alarming math is not just about total numbers.
Projections show that the greatest increase is in the age groups most vulnerable to extremism. There are currently 200 million
people in Africa between the ages of 15 and 24, with that number expected to double in the next 30 years. Already, 60% of the
unemployed in Africa are young people. Too often these situations deteriorate into shooting wars requiring the
deployment of our military forces. We should be continually mindful that the price we pay for committing military forces is
measured in our most precious national resource, the blood of those who serve. For those who live in rural America, this has a
disproportionate impact. Fully 40% of those who serve in our military come from the farms, ranches, and non-urban communities
that make up only 16% of our population. Actions taken now to increase agricultural sector jobs can provide economic opportunity
and stability for those unemployed youths while helping to feed people. A recent report by the Chicago Council on Global Affairs
identifies agriculture development as the core essential for providing greater food security, economic growth, and population well-
being. Our active support for food security, including agriculture development, has helped
stabilize key regions over the past 60 years. A robust food security strategy, as a part of our
overall security strategy, can mitigate the growth of terrorism, build important relationships,
and support continued American economic and agricultural prosperity while materially
contributing to our Nation’s and the world’s security.
1NC---EPA Tradeoff DA
Existing source methane regulations are coming now. They require significant
EPA resources, and are critical to meeting international climate commitments.
Leber 21 [Rebecca Leber, covers climate change for Vox; Published: 6-1-2021; "There’s a
ticking climate time bomb in West Texas"; Vox; Accessed: 7-14-2021;
https://www.vox.com/22407581/gas-texas-biden-climate-change-methane-permian-basin]//KL
Around 265 million years ago, much of modern-day Texas was underwater, and the vast region known as the Permian Basin was a flourishing coral reef. Today, the organisms that once thrived there have been transformed into enormous deposits of fossil fuels — and they have made the area one of the most treacherous front lines in President Joe Biden’s domestic fight against climate change. The Permian Basin, which stretches hundreds of miles across West Texas and southeast New Mexico, accounts for 40 percent of US oil production and 15 percent of its natural gas, according to February data. Less than a year after oil prices dipped into negative territory because of the Covid-19 pandemic, production in the region has bounced back almost to pre-pandemic levels. Already, the region is the nation’s No. 1 source of methane, a greenhouse gas that warms the planet far more
efficiently than carbon dioxide in the short term. The US oil and gas industry has pinned much of
its future hopes on the region, especially in the next decade: If it gets its way, the Permian Basin will still grow through 2029, outranking every country except for Saudi Arabia in
liquid fuel production, according to one analysis from Oil Change International. At this rate, by 2050, it would account for 39 percent of the world’s new oil and gas emissions. The world can’t afford this if it is to meet international climate goals. That’s what the International Energy Agency recently made clear in a report that argued for halting new investment in fossil fuel production, starting in 2021. Yet under Biden, the Permian could undergo expansion if the industry sees through its plans to export its gas and oil. That means any credible US response to the climate crisis will need to include a plan for the West Texas Permian Basin. But wrangling Texas oil and gas
emissions could test President Biden’s powers like nothing else. When Biden signaled early in his
presidency that fighting climate change must involve reining in the fossil fuel industry, Texas Gov. Greg Abbott
immediately signaled that he would protect the state’s oil and gas industry at all costs. On January 28, Abbott signed an executive order to direct every state agency to use all lawful powers and tools to challenge
any federal action that threatened the Texas energy sector. Biden has committed to slash US climate pollution in half by 2030 to contain the worst of global warming, but the administration has few levers to limit
pollution in a red state infamous for deregulating the industry. The Texas side of the Permian is the biggest challenge. The land is entirely state- and privately-held, compared to the federal lands in New Mexico,
which makes it hard to discourage future oil production by blocking new leases. And unlike New Mexico, the state regulators and politicians have shown no interest in coming to terms with the Permian’s pollution.
That leaves the Biden administration with a thorny choice: It could take a gentler approach and regulate the Permian’s climate emissions but risk not doing enough. Or it could swing a political sledgehammer by
declaring a climate emergency and cutting off the Permian from its global customers — which could provoke intense backlash from Abbott, industry, and voters in upcoming elections. Sharon Wilson, an environmental activist with Earthworks, worries the Biden administration is underestimating the challenges, estimating it’s “going to take something like an army” to enforce environmental rules in West Texas. “The Texas regulatory agencies are underfunded, and they are understaffed, and they’re not really motivated. I don’t think that the Biden administration has a realistic picture of what it’s going to take to enforce these methane rules,” she says. There is no easy answer here, but environmentalists intend on pushing Biden to go bigger and bolder in the Lone Star State. As Abbott has made clear, Texas won’t go along without a fight. Perhaps nowhere in the country captures the test of political will to address climate change — and the limits of the Environmental Protection Agency under Biden — better than the Permian Basin. When Biden first said he would issue “stronger standards like controls from methane leaks,” in January, he never named the nation’s No. 1 source of methane pollution. The Permian Basin tops the list
— and this runaway pollution is the primary reason why gas can be as bad for climate change as carbon-intensive coal. Methane was once a mostly ignored greenhouse gas because it is less prevalent in the
atmosphere than carbon dioxide. But methane concentrations rose to new heights in 2020, with the largest jump since records began in 1983, and it’s now impossible to ignore. Compared to carbon, methane is
about 80 times as effective at trapping heat in the atmosphere over a 20-year period (though it lasts in the atmosphere for a shorter time — decades as opposed to centuries). Not all human-caused methane
emissions come from oil and gas production — landfills and cattle are major sources too. But those are harder to address, and we already have plenty of readily available fixes to address runaway methane
emissions from oil and gas. That was the conclusion of the United Nations’ Climate and Clean Air Coalition’s May report, which singled out oil and gas as a sector that countries should tackle aggressively, and soon.

To help avert 1.5 degrees Celsius of global warming, the report found, the world needs to cut methane emissions 40 to 45 percent by 2030. An astounding 30 percent of those urgently needed reductions could be achieved by fixing leaks from oil operations. Many of the large companies, including ExxonMobil, Shell, and BP, have promised to reduce their methane
intensity. Of course, there are reasons this hasn’t been fixed by now. One is that gas has been so plentiful and cheap in the United States. In the Permian, where the majority of producers are interested in making
bigger profits from oil, gas is little more than a waste product. Even with some industry support, methane regulations have faced nothing but setbacks. Until it faced renewed pressure under the Biden
administration, the oil lobbying group American Petroleum Institute was dead set against methane rules. A front group for gas companies, Texans for Natural Gas, still falsely claims that methane emissions are
falling. Producers often say they already have an incentive to conserve methane, because natural gas can be sold for use in heating and powering homes. But scientists have observed the opposite phenomenon.

There has been rampant growth in methane emissions from booming oil fields like the Permian Basin. In the absence of regulations, companies continue to intentionally release methane, either by releasing it to the atmosphere with unlit flares or burning it off. These flares contribute not only to climate change but also to smog. Even when
methane is not released on purpose, the gas escapes from pipelines, tanks, and refineries throughout the supply chain for oil and gas, at the site of extraction. As Texas saw in its massive February blackout, the gas
kept flowing when equipment was frozen offline or not working because of power outages, resulting in a big release of methane. Even worse: We don’t have a great sense of how big these methane releases are
because Texas doesn’t regulate it, and former President Donald Trump reversed reporting requirements. The Environmental Defense Fund, for instance, flew planes over the Permian and found that methane
levels were three times the EPA’s official estimates. An EDF report published in the journal Science Advances in April 2020 found “the highest emissions ever measured from a major U.S. oil and gas basin. There’s
so much methane escaping from Permian oil and gas operations that it nearly triples the 20-year climate impact of burning the gas they’re producing.” The methane that’s lost, EDF estimates, could supply power

to 7 million households. Methane is a problem everywhere oil and gas are thriving, and even in areas where oil is long gone, because abandoned wells can continue to
leach the gas. The problem is especially acute in Texas, where the industry has been left to regulate itself. Biden has promised to fix the methane
crisis, and Congress is expected to reinstate former President Barack Obama’s first regulation of methane from oil and gas operations — a fairly limited rule that forced companies to install the latest
technologies to monitor and limit leaks from a portion of new oil and gas wells. Trump reversed it, replacing it with a rule that would actually increase emissions. Using the Congressional Review Act, the Senate
voted 52-42 to nix Trump’s rule, and it faces a vote next in the House. But if the US is going to get really serious about climate change in the next 10 years, as Biden has promised, it’s even more important to fix hundreds of thousands of wells — and more than a million miles of pipeline — that already exist. By September 2021, Biden’s EPA plans to issue its first draft rule to strengthen national standards for methane emissions from new, reconstructed, and modified oil and gas sources. Importantly, it will exercise its authority to tackle methane emissions from existing sources as well.

Increased permits buckle the EPA.


EPA 02 [Environmental Protection Agency; Christine Todd Whitman, Administrator; G. Tracy
Mehan III, Assistant Administrator, Office of Water; Sheila E. Frace, Director, Engineering and
Analysis Division; Renee Selinsky Johnson, Project Co-Lead/Economist; Published: December
2002; "Economic Analysis of the Final Revisions to the National Pollutant Discharge Elimination
System Regulation and the Effluent Guidelines for Concentrated Animal Feeding Operations";
Engineering and Analysis Division Office of Science and Technology U.S. Environmental
Protection Agency; Accessed: 7-13-2021;
https://www3.epa.gov/npdes/pubs/cafo_econ_analysis_p1.pdf]//KL

ES.4.2 Costs to the NPDES Permitting Authority

The NPDES permitting authority would incur additional costs to alter existing State programs
and obtain EPA approval to develop new permits, review new permit applications, and issue
revised permits that meet the final regulatory requirements. EPA expects that NPDES permitting
authorities will incur administrative costs related to the development, issuance, and tracking of
general or individual permits.

State and Federal administrative costs to issue a general permit include costs for permit
development, public notice and response to comments, and public hearings. States and EPA
might also incur costs each time a facility operator applies for coverage under a general permit
due to the expenses associated with a NOI. These per-facility administrative costs include initial
facility inspections and annual record-keeping expenses associated with tracking NOIs.
Administrative costs for an individual permit include application review by a permit writer,
public notice, and response to comments. An initial facility inspection might also be necessary.

Mitigating methane emissions is key and paves the way for future action.
Newburger 21 – covers climate change and breaking news for CNBC.com. (Emma; Published:
May 6, 2021; “The world needs to dramatically cut methane emissions to avoid worst of climate
change, UN says”; CNBC; Accessed: July 15, 2021; https://www.cnbc.com/2021/05/06/world-
must-cut-methane-emissions-to-avoid-worst-of-climate-change-un-says.html)//CYang

A landmark United Nations report has declared that drastically cutting emissions of methane, a key component of natural gas, is necessary to avoid the worst impacts of global climate change. The report, published Thursday by the Climate and Clean Air Coalition and the U.N. Environment Programme, represents a shift in the
worldwide conversation on how to best address the climate crisis, which has focused on longer-term carbon dioxide reduction.
Methane is 84 times more potent than carbon and doesn’t last as long in the atmosphere before it breaks down.
This makes it a critical target for reducing global warming more quickly while simultaneously
working to reduce other greenhouse gases. More than half of global methane emissions come from
oil and gas extraction in the fossil fuel industry, landfills and wastewater from the waste sector, and livestock emissions from
manure and enteric fermentation in the agricultural sector.

The world could slash methane emissions by up to 45% this decade, or 180 million tons a year, according to the
U.N.’s Global Methane Assessment. Such a target will avoid nearly 0.3 degrees Celsius of warming by 2045 and help limit the
rise in global temperatures to 1.5 degrees Celsius, a goal of the Paris climate accord.

The report comes after methane emissions surged to record highs last year despite worldwide lockdowns during the coronavirus
pandemic, according to research from the National Oceanic and Atmospheric Administration. Methane emissions are also rising
faster than ever since record-keeping began in the 1980s. “Cutting
methane is the strongest lever we have to
slow climate change over the next 25 years and complements necessary efforts to reduce
carbon dioxide,” Inger Andersen, executive director of the U.N. Environment Programme, said in a statement.

Warming causes extinction---AND every increment is key because of invisible thresholds and exponential feedbacks.
Dr. Yew-Kwang Ng 19, Winsemius Professor of Economics at Nanyang Technological University,
Fellow of the Academy of Social Sciences in Australia and Member of Advisory Board at the
Global Priorities Institute at Oxford University, PhD in Economics from Sydney University,
“Keynote: Global Extinction and Animal Welfare: Two Priorities for Effective Altruism”, Global
Policy, Volume 10, Number 2, May 2019, pp. 258–266

Catastrophic climate change

Though by no means certain, CCC causing global extinction is possible due to interrelated factors of non-
linearity, cascading effects, positive feedbacks, multiplicative factors, critical thresholds and
tipping points (e.g. Barnosky and Hadly, 2016; Belaia et al., 2017; Buldyrev et al., 2010; Grainger, 2017; Hansen and Sato, 2012;
IPCC 2014; Kareiva and Carranza, 2018; Osmond and Klausmeier, 2017; Rothman, 2017; Schuur et al., 2015; Sims and Finnoff, 2016;
Van Aalst, 2006).7

A possibly imminent tipping point could be in the form of ‘an abrupt ice sheet collapse [that]
could cause a rapid sea level rise’ (Baum et al., 2011, p. 399). There are many avenues for positive
feedback in global warming, including:

• the replacement of an ice sea by a liquid ocean surface from melting reduces the
reflection and increases the absorption of sunlight, leading to faster warming;

• the drying of forests from warming increases forest fires and the release of more carbon; and

• higher ocean temperatures may lead to the release of methane trapped under the ocean floor,
producing runaway global warming.

Though there are also avenues for negative feedback, the scientific consensus is for an overall
net positive feedback (Roe and Baker, 2007). Thus, the Global Challenges Foundation (2017, p. 25) concludes, ‘The
world is currently completely unprepared to envisage, and even less deal with, the consequences of CCC’.
The threat of sea-level rising from global warming is well known, but there are also other likely
and more imminent threats to the survivability of mankind and other living things. For example,
Sherwood and Huber (2010) emphasize the adaptability limit to climate change due to heat stress from
high environmental wet-bulb temperature. They show that ‘even modest global warming could ...
expose large fractions of the [world] population to unprecedented heat stress’ p. 9552 and that with
substantial global warming, ‘the area of land rendered uninhabitable by heat stress would
dwarf that affected by rising sea level’ p. 9555, making extinction much more likely and the relatively
moderate damages estimated by most integrated assessment models unreliably low.

While imminent extinction is very unlikely and may not come for a long time even under business as usual, the main
point is that we cannot rule it out. Annan and Hargreaves (2011, pp. 434–435) may be right that there is ‘an upper
95 per cent probability limit for S [temperature increase] ... to lie close to 4°C, and certainly well
below 6°C’. However, probabilities of 5 per cent, 0.5 per cent, 0.05 per cent or even 0.005 per
cent of excessive warming and the resulting extinction probabilities cannot be ruled out and are
unacceptable. Even if there is only a 1 per cent probability that there is a time bomb in the
airplane, you probably want to change your flight. Extinction of the whole world is more
important to avoid by literally a trillion times.
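The card’s probability reasoning can be made concrete with a small expected-value comparison. The sketch below is illustrative only, using figures lifted from Ng’s own analogy (a roughly 1 percent chance of excessive warming leading to extinction, and an extinction loss weighted “a trillion times” larger than an ordinary-scale harm); the symbols and numbers are assumptions drawn from the card, not calculations reported in the source.

```latex
% Illustrative expected-value sketch (assumed numbers from the card's analogy).
% p      : assumed probability that excessive warming causes extinction (1 percent)
% L_cat  : loss from an ordinary-scale catastrophe (arbitrary units)
% L_ext  : loss from extinction, taken as 10^{12} L_cat per Ng's "trillion times"
\[
E[\mathrm{loss}] = p \cdot L_{\mathrm{ext}}
  = 0.01 \times 10^{12}\, L_{\mathrm{cat}}
  = 10^{10}\, L_{\mathrm{cat}}
\]
% Even at p = 0.0005 (0.05 percent), the expected loss is still 5 x 10^{8} L_cat,
% which is why the card treats even very small extinction probabilities as unacceptable.
```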
1NC---Infrastructure DA
Bipartisan infrastructure is making progress, but Biden’s PC is key.
Lisa Mascaro et al. 7/27, Alexandra Jaffe, Kevin Freking; covers Congress for the Associated
Press, “Senators, White House in talks to finish infrastructure bill”, 7/27/21,
https://apnews.com/article/joe-biden-government-and-politics-business-
14f3ae962d9501993c2bfde9399a6ffb/micahw

Senators and the White House were locked in intense negotiations Monday to salvage a bipartisan
infrastructure deal, with pressure mounting on all sides to wrap up talks and show progress on
President Joe Biden’s top priority.

Despite weeks of closed-door discussions, senators from the bipartisan group blew past a Monday deadline set for
agreement on the nearly $1 trillion package. Instead they hit serious roadblocks over how much would be spent on public
transit and water infrastructure and whether the new spending on roads, bridges, broadband and other projects would be required
to meet federal wage requirements for workers. They’re also at odds over drawing on COVID-19 funds to help pay for it.

Republican negotiator Sen. Rob Portman of Ohio, who took the lead in key talks with a top White House aide,
insisted the bipartisan group was “making progress.”

“This is heading in the right direction,” Portman told reporters at the Capitol. “It’s a big, complicated bill.”

Biden struck a similarly upbeat tone, telling reporters at the White House he remained optimistic about
reaching a compromise.

This is a crucial week after more than a monthlong slog of negotiations since Biden and the bipartisan group
first celebrated the contours of the nearly $1 trillion bipartisan agreement in June.

Senators were warned they could be kept in session this weekend to finish the work.

The White House wants a bipartisan agreement for this first phase, before Democrats go it alone
to tackle broader priorities in a bigger $3.5 trillion budget plan that’s on deck. A recent poll from The Associated Press-
NORC found 8 in 10 Americans favor some increased infrastructure spending, and the current package could be a
political win for all sides as lawmakers try to show voters that Washington can work. Securing the bipartisan bill
is also important for some centrist Democrats before engaging in the broader undertaking.

But as talks drag on, anxious Democrats, who have slim control of the House and Senate, face a timeline
to act on what would be some of the most substantial legislation in years. Senate Majority Leader Chuck Schumer
wants progress on both packages before the August recess, and he told senators to brace for a Saturday or Sunday
session.

White House Press Secretary Jen Psaki said Biden himself “worked the phones all weekend,” and the administration
was encouraged by the progress. But Psaki acknowledged “time is not endless.”
Adding to the mix, Donald Trump issued a statement Monday disparaging Senate Republicans for even dealing with the Democrats on infrastructure, though it’s unclear what
influence he has. The former president had failed at an infrastructure deal when he was in office.

“It’s time for everyone to get to ‘yes,'” Schumer said as he opened the Senate.

Schumer said Trump is “rooting for our entire political system to fail” while Democrats are “rooting for a deal.”

The bipartisan package includes about $600 billion in new spending on public works projects, with broad support from Republicans and Democrats for many of the proposed
ideas.

Yet there was little to show Monday after a grinding weekend of talks, putting the deal at risk of stalling out.
The Democrats and the White House had sent what they called a “global” offer to Republicans on remaining issues late Sunday, according to a Democratic aide close to the talks
and granted anonymity to discuss them.

But Republicans rebuffed the ideas, saying the new proposal attempted to reopen issues that had already been resolved, according to a GOP aide also granted anonymity to
discuss the private talks.

Sen. Susan Collins, R-Maine, said it’s time for Biden to become more involved. “I think it’s imperative
that the president indicates strongly that he wants a bipartisan package,” she said.

Passing environmental laws, such as the plan’s water protections, is difficult and controversial – they sidetrack Biden and Congress.
Richard J. Lazarus, Professor of law at Harvard University Law School, 1-5-2021,
"Environmental Law & Politics," American Bar Association,
https://www.americanbar.org/groups/public_education/publications/insights-on-law-and-
society/volume-19/insights-vol--19---issue-1/environmental-law---politics/
In 2007, the environment—in particular, global warming—burst into the public consciousness. Former Vice President Al Gore won the Nobel Peace
Prize and saw his film, “An Inconvenient Truth,” win an Oscar and become the fourth highest-grossing documentary of all time. International leaders
gathered in Bali at the close of the year to launch a new round of climate change negotiations. And here at home, the United States Congress took up
varying proposals to combat global warming while the Supreme Court issued a landmark ruling on the issue. Global warming may represent the newest
frontier in environmental law, but the lawmaking institutions working to address it have more than 30 years of history on which to
build. This history has been tumultuous, but throughout it environmental law has grown, overcoming challenges and demonstrating a
surprising resilience. Whether tackling global warming, water pollution, or the protection of endangered species, environmental
lawmaking is uniquely and inherently difficult. As such, its persistence over the past three decades is even more remarkable.
Making environmental law is difficult in part because the environment itself is so complex. Ecological
systems are complicated and dynamic, as are the factors that contribute to environmental change. Environmental law must take this complexity into
account. For example, when setting standards limiting the amount of a particular air pollutant, the Environmental Protection Agency (EPA) must
consider ecological factors including temperature, atmospheric pressure, and wind. The EPA also must take into account the many sources that
contribute the pollutant into the air. Finally, the agency must consider how all these factors relate to human health—a measure by which it sets
standards under the Clean Air Act. Our
constitutional system itself poses additional hurdles. Lawmaking
institutions are divided vertically between the federal government and states, as well as horizontally among Congress,
executive branch agencies, and the courts. This deliberately fragmented system makes any type of lawmaking difficult and
incremental. Enacting environmental laws is particularly difficult, because the injuries they seek to prevent are
often far-off and diffuse, while their economic impact may be immediate. In addition, the Constitution limits the
powers of Congress to an enumerated list that does not expressly mention the environment. As a result, legislators must hitch federal environmental
laws to authority Congress does possess, such as the ability to regulate economic activity under the Constitution’s Commerce Clause.

Infrastructure bill is vital to the grid – it modernizes it through green energy and
increased transmission
Christian, Potter, and Hale ’21; [Molly Christian; Experienced energy professional with
extensive knowledge of US energy markets. Over a 10-year history of reporting and editorial
leadership on publications covering US and international energy markets, including coal,
electricity and emissions; Ellie Potter; a journalist covering federal energy policy while pursuing a
master's degree in energy policy and climate at Johns Hopkins University; Zack Hale; reports on
federal energy policy for S&P Global Market Intelligence, a leading business news organization
delivering actionable insights to industry professionals, NGOs, government officials and the
broader public. My coverage primarily focuses on the Federal Energy Regulatory Commission
and U.S. Environmental Protection Agency, but I often write about state-level energy policy, as
well; 3/31/21; S&P Global Market Intelligence; “Biden's $2 trillion infrastructure plan aims to
'reenergize' US power grid”;
https://www.spglobal.com/marketintelligence/en/news-insights/latest-news-headlines/biden-s-
2-trillion-infrastructure-plan-aims-to-reenergize-us-power-grid-63426704; accessed: 7/13/21;
YS]

The White House on March 31 released the outline of a jobs and infrastructure plan that calls on
Congress to invest $100 billion to "reenergize America's power infrastructure."

The 10-year, $2 trillion American Jobs Plan also seeks to upgrade roads, bridges and water
systems and help make U.S. infrastructure more resilient to the impacts of climate change.

Noting that the U.S. is still struggling economically from the coronavirus and ranks 13th in the
quality of its infrastructure despite being the wealthiest country in the world, the White House
said "it has never been more important for us to invest in strengthening our infrastructure and
competitiveness, and in creating the good-paying, union jobs of the future."

The White House will need to work with Congress to draft comprehensive legislation to realize
President Joe Biden's infrastructure vision. But finding ways to fund the plan and gain sufficient
support in Congress could be challenging with Democrats holding only thin majorities in both
chambers.

Power grid

Along with repairing other infrastructure, the plan calls for the modernization of the country's
aging electric grid, which Biden wants to produce 100% carbon-free electricity by 2035.

The plan proposes $100 billion in spending on upgrading and building out the nation's aging and
regionally siloed electric transmission system. Various studies have estimated the U.S. will need
to double or even triple its electric transmission capacity to successfully decarbonize its
economy by midcentury.

The proposal would "put hundreds of thousands of people to work laying thousands of miles of
transmission lines," according to a White House fact sheet. To that end, the plan would establish
a new Grid Deployment Authority within the U.S. Department of Energy to leverage existing
rights-of-way for transmission lines along roads and railways. The new office would also support
"creative financing tools to spur additional high priority, high-voltage transmission lines."

In terms of related legislation, Rob Gramlich, executive director of Americans for a Clean Energy
Grid, said a bill introduced last week by Sen. Martin Heinrich, D-N.M., to establish a tax credit for
the installation of "regionally significant" transmission lines may be part of the legislative
package.

"There's an opportunity to go really big and look at the needs of a $100 billion-plus macro grid
but there's also an opportunity to focus on near-term deployment, and that's what the
transmission tax credit would do," Gramlich said in an interview.

Gramlich said his firm has calculated that nearly two dozen large-scale interregional projects
around the country, representing approximately 20 GW of transmission capacity, have stalled
due to a lack of financing. "They're most of the way through the permitting process, so they
really could get going soon," Gramlich said.
The grid is extremely vulnerable right now – terrorists and nation-states have the capability to conduct major physical or cyber attacks in various ways
Wintch ’21; [Timothy M. Wintch; an active-duty Major in the United States Air Force. He is
currently a graduate student at the Oettinger School of Science & Technology Intelligence,
National Intelligence University, in Bethesda, Maryland. Mr. Wintch has over 11 years of
experience in command-and-control operations as an Air Battle Manager. He holds a Bachelor of
Arts in Politics from the University of California, Santa Cruz, and a Master of Arts in Military
Studies from American Military University; 4/20/21; Homeland Security Today; “PERSPECTIVE:
Cyber and Physical Threats to the U.S. Power Grid and Keeping the Lights on”;
https://www.hstoday.us/subject-matter-areas/infrastructure-security/perspective-cyber-and-
physical-threats-to-the-u-s-power-grid-and-keeping-the-lights-on/; accessed: 7/13/21; YS]

Among critical infrastructure sectors in the U.S., energy is perhaps the most crucial of the 16
sectors defined by the Department of Homeland Security. This sector is so vital because it
provides the energy necessary to run every other critical infrastructure sector. However, the
U.S. power grid, the backbone of the energy sector, is built upon an aging skeleton that is
becoming increasingly vulnerable every day. Whether from terrorists or nation-states like
Russia and China, the power grid is susceptible to not just physical attacks, but also to cyber
intrusion as well. However, much of this threat can be mitigated if the U.S. takes the
appropriate steps to safeguard the power grid and avoid a potential catastrophe in the future.

Since Sept. 11, 2001, terrorism on U.S. soil has been at the forefront of American consciousness.
Critical infrastructure provides an appealing target because of the disproportionally large
impact even a small attack can have on the sectors. In particular, the power grid represents a
particularly lucrative target, both in terms of the ease of access and the large impact it can
make. The National Research Council stated that the U.S. power grid is “vulnerable to intelligent
multi-site attacks by knowledgeable attackers intent on causing maximum physical damage to
key components on a wide geographical scale.”[1] Additionally, the physical security of
transmission and distribution systems is difficult due to the dispersed nature of these key
components, which in turn is advantageous to attackers as it reduces the likelihood of their
capture.[2] From 2002-2012, approximately 2,500 physical attacks occurred against transmission
lines and towers worldwide and approximately 500 attacks against transformer substations.[3]
Terrorists have the motivation to attack the U.S. power grid but the very nature of the grid
makes it highly vulnerable. The power grid is not only at risk from physical attacks, but also
nation-state cyberattacks.

One nation that has shown both the capability and intent to use attacks against critical energy
infrastructure is Russia, as demonstrated in their 2015 annexation of Crimea from Ukraine. A
Russian cyber threat group known as Sandworm, which used its BlackEnergy malware, attacked
Ukrainian computer systems that provide remote control of the Ukraine power grid.[4] This
attack, and another in 2016, each left the capital Kiev without power, prompting cyber experts
to raise concern about the same malware already existing in NATO and the U.S. power grids.[5] In any conflict between Russia and NATO, not only would similar cyberattacks pose a threat,
but so would potential physical attacks severing fuel oil and natural gas lines to Western
Europe. Russia has both the capability and intent to attack critical infrastructure, particularly
power grids, during future conflicts in their “hybrid warfare” approach.

Another nation that has the capability to attack critical energy infrastructure is China,
representing a threat to not just the U.S. energy infrastructure but also that of our allies whose
support would be vital in a major conflict. A recent NATO report highlighted this threat from
China’s Belt and Road Initiative, stating that “[China’s] foreign direct investment in strategic
sectors [such as energy generation and distribution] …raises questions about whether access
and control over such infrastructure can be maintained, particularly in crisis when it would be
required to support the military.”[6] Like Russia, China has been active with cyber intrusions in
U.S. energy infrastructure. The Mission Support Center at Idaho National Laboratory
characterized these attacks as “multiple intrusions into US ICS/SCADA [Industrial Control
Systems/Supervisory Control and Data Acquisition] and smart grid tools [that] may be aimed
more at intellectual property theft and gathering intelligence to bolster their own
infrastructure, but it is likely that they are also using these intrusions to develop capabilities to
attack the [bulk electric system], as well.”[7] China, therefore, has both the capability and
intent to conduct cyber intrusions and attacks for myriad reasons.

Another arm of this threat is the reliance the U.S. energy industry has on imports from China,
especially transformers. In early 2020, federal officials seized a transformer in the port of
Houston that had been imported by the Jiangsu Huapeng Transformer Company before sending
it to Sandia National Laboratory in Albuquerque. Sandia is contracted by the U.S. Department of
Energy for mitigating national security threats.[8] The Wall Street Journal reported that “Mike
Howard, chief executive of the Electric Power Research Institute, a utility-funded technical
organization, said that the diversion of a huge, expensive transformer is so unusual – in his
experience, unprecedented – that it suggests officials had significant security concerns.”[9]
Previously destined for the Western Area Power Administration’s Ault, Colo., substation, the
transformer is believed to have been seized due to “backdoor” exploitable hardware emplaced
by the Chinese prior to shipment.[10] Shortly after these events, President Trump issued
Executive Order 13920, “Securing the United States Bulk-Power System,” essentially limiting the
import of Chinese-built critical energy infrastructure components due to concerns about
cybersecurity.[11] Interestingly, Jiangsu Huapeng “boasted that it supported 10 percent of New
York City’s electricity load.”[12]

Franklin Kramer, the former Assistant Secretary of Defense for International Security Affairs,
testified before a U.S. House of Representatives Energy and Commerce subcommittee during an
energy and power hearing in 2011 and said that a “highly-coordinated and structured cyber,
physical, or blended attack on the bulk power system, however, could result in long-term
(irreparable) damage to key system components in multiple simultaneous or near-
simultaneous strikes.” He added that “an outage could result with the potential to affect a wide
geographic area and cause large population centers to lose power for extended periods.”[13]
Even the inclusion of features such as smart grids to the overall grid structure poses new
vulnerabilities through their connectivity. Kramer stated that “such connectivity means that the
distribution system could be a key vector for a national security attack on the grid.”[14]

Power generation represents a key vulnerability of the U.S. energy infrastructure. Physical
security measures vary by site and type of power plant; however, most are still limited in their
security measures beyond chain-link fences, with the notable exception of nuclear power plants.[15] The very nature of power plants does provide some physical security, with plants often
residing in rural areas over large areas with multiple buildings, which makes locating and
accessing critical components more difficult. While an attack on a power plant would have a
large effect, it would also result in increased security at other plants. Finally, the nature of the
U.S. energy grid provides the capability to provide some level of self-healing, meaning that even
if a power plant were to go offline other sites can mitigate that loss and prevent cascading
effects.

System Control Centers represent another key vulnerability. These centers contain not only
important technical control systems, but also the personnel who operate those systems and
their unique intricacies. However, like power plants, the physical security of these sites varies,
ranging from minimal security to extensive hardening.[16] Fortunately, these centers have
redundant facilities that can mitigate losses to the rest of the system.[17]

Power lines may be viewed as a key vulnerability as the most visible aspect of the transmission
infrastructure, but the number of lines, ability to redirect power, coupled with the relative ease
of replacement, mean that an attack on power lines is likely to be limited in both scope and
duration. Therefore, while still a required part of the power infrastructure, transmission lines are
not a significant vulnerability especially when other critical infrastructure sectors often have
their own temporary backup power such as batteries and motor-generator sets (e.g. an on-site
diesel motor running an electrical generator at a hospital).

Perhaps the most vulnerable aspect of the U.S. power grid is the high-voltage transformers that
allow efficient transmission from power plants to distribution substations. Bottom line is that
power generation is of no consequence if it cannot be delivered to the end user, but “there is
general agreement among security planners that key high-voltage substations are the most
worrisome terrorist targets within the power transmission system.”[18] This fact is complicated
in that the transformers are “difficult to protect” and “replacement parts are difficult to obtain,
and damage to substations can separate customers from generation for long periods,” often
taking over a year to replace under ideal conditions.[19] As previously stated, the power
industry is heavily reliant on imports for these transformers, with many coming from China.
Finally, these substations are often unprotected by more than a perimeter fence, making them
vulnerable to standoff and penetration attacks.[20] The critical nature of these transformers,
combined with the difficulty in manufacturing and replacing them, makes the transmission
substations one of the most vulnerable aspects of the U.S. power grid.

A further vulnerability of energy infrastructure is the increased use of remote-control mechanisms to operate critical equipment and manage energy loads all the way from power
generation to transmission. The more connected critical energy infrastructure is to a network,
the more vulnerable it becomes to cyberattack. Kramer described the potential effects of a
cyberattack in 2011, following the STUXNET attack on Iranian nuclear facilities, stating, “We
have had even further confirmation of the problem of the [US power] grid’s vulnerability, as
demonstrated by the STUXNET attacks. STUXNET – while not grid-directed, showed the
vulnerability of control machines – which are the very type of machines upon which the grid
depends for effective operation.”[21] This vulnerability is further described by the Mission
Support Center, which stated, “Growth of networks and communication protocols used
throughout ICS networks pose vulnerabilities that will continue to provide attack vectors that
threat actors will seek to exploit for the foreseeable future. The interoperable technologies
created for a shift toward a smart grid will continue to expand the cyberattack landscape.”[22]

As evident in the example of the seized Chinese transformer in Houston, software and networks
are not the only mechanisms for cyberattacks. In fact, ICS and hardware, such as transformers,
present a significant vector for cyber intrusion as well.[23] The added danger of this vector is
that ICS controls can be affected without the people monitoring even knowing. This was the
case with STUXNET, where Iranian engineers could see that something abnormal was occurring
but could not pinpoint the cause in time to avert destruction of the centrifuges.[24] Thus,
vectors exist for cyberattacks in the U.S. energy infrastructure from software, networks, and
malware installed in imported hardware including components such as transformers.

Loss of critical infrastructure causes extinction


Friedemann 16 (Alice Friedemann, transportation expert, founder of EnergySkeptic.com and
author of “When Trucks Stop Running: Energy and the Future of Transportation,” worked at American President Lines for 22 years, where she developed computer systems to coordinate
the transit of cargo between ships, rail, trucks, and consumers, citing Dr. Peter Vincent Pry. Pry
is executive director of the Task Force on National and Homeland Security, a Congressional
advisory board dedicated to achieving protection of the United States from electromagnetic
pulse and other threats. Dr. Pry is also the director of the United States Nuclear Strategy Forum,
an advisory body to Congress on policies to counter weapons of mass destruction. Dr. Pry has
served on the staffs of the Congressional Commission on the Strategic Posture of the United
States, the Commission to Assess the Threat to the U.S. from an EMP Attack, the House Armed
Services Committee, as an intelligence officer with the CIA, and as a verification analyst at the
U.S. Arms Control and Disarmament Agency. 1-24-16, accessed 1/1/19 “Electromagnetic pulse
threat to infrastructure (U.S. House hearings)” http://energyskeptic.com/2016/the-scariest-u-s-
house-session-ever-electromagnetic-pulse-and-the-fall-of-civilization/)

Modern civilization cannot exist for a protracted period without electricity. Within days of a blackout
across the U.S., a blackout that could encompass the entire planet, emergency generators would
run out of fuel, telecommunications would cease as would transportation due to gridlock, and
eventually no fuel. Cities would have no running water and soon, within a few days, exhaust their food
supplies. Police, Fire, Emergency Services and hospitals cannot long operate in a blackout. Government and
Industry also need electricity in order to operate. The EMP Commission warns that a natural or nuclear EMP event,
given current unpreparedness, would likely result in societal collapse. Terrorists, criminals, and even lone
individuals can build a non-nuclear EMP weapon without great trouble or expense, working from Unclassified designs publicly
available on the internet, and using parts available at any electronics store. In 2000, the Terrorism Panel of the House Armed
Services Committee sponsored an experiment, recruiting a small team of amateur electronics enthusiasts to attempt constructing a
radiofrequency weapon, relying only on unclassified design information and parts purchased from Radio Shack. The team, in 1 year,
built two radiofrequency weapons of radically different designs. One was designed to fit inside the shipping crate for a Xerox
machine, so it could be delivered to the Pentagon mail room where (in those more unguarded days before 9/11) it could slowly fry
the Pentagon’s computers. The other radiofrequency weapon was designed to fit inside a small Volkswagen bus, so it could be
driven down Wall Street and disrupt computers— and perhaps the National economy. Both designs were demonstrated and tested
successfully during a special Congressional hearing for this purpose at the U.S. Army’s Aberdeen Proving Ground. Radiofrequency
weapons are not merely a hypothetical threat. Terrorists, criminals, and disgruntled individuals have used home-made
radiofrequency weapons. The U.S. military and foreign militaries have a wide variety of such weaponry. Moreover, non-nuclear EMP
devices that could be used as radiofrequency weapons are publicly marketed for sale to anyone, usually advertised as ‘‘EMP
simulators.’’ For example, one such simulator is advertised for public sale as an ‘‘EMP Suitcase.’’ This EMP simulator is designed to
look like a suitcase, can be carried and operated by one person, and is purpose-built with a high energy radiofrequency output to
destroy electronics. However, it has only a short radius of effect. Nonetheless, a terrorist or deranged individual who knows what he
is doing, who has studied the electric grid for a major metropolitan area, could—armed with the ‘‘EMP Suitcase’’— black out a major
city. A CLEAR AND PRESENT DANGER. An EMP weapon can be used by state actors who wish to level the battlefield by neutralizing
the great technological advantage enjoyed by U.S. military forces. EMP is also the ideal means, the only means, whereby rogue
states or terrorists could use a single nuclear weapon to destroy the United States and prevail in the War on Terrorism or some
other conflict with a single blow. The EMP Commission also warned that states or terrorists could exploit U.S. vulnerability to EMP
attack for coercion or blackmail: ‘‘Therefore, terrorists or state actors that possess relatively unsophisticated missiles armed with
nuclear weapons may well calculate that, instead of destroying a city or military base, they may obtain the greatest political-military
utility from one or a few such weapons by using them—or threatening their use—in an EMP attack.’’ The EMP Commission found
that states such as Russia, China, North Korea, and Iran have incorporated EMP attack into their military doctrines, and openly
describe making EMP attacks against the United States. Indeed, the EMP Commission was established by Congress partly in response
to a Russian nuclear EMP threat made to an official Congressional Delegation on May 2, 1999, in the midst of the Balkans crisis.
Vladimir Lukin, head of the Russian delegation and a former Ambassador to the United States, warned: ‘‘Hypothetically, if Russia
really wanted to hurt the United States in retaliation for NATO’s bombing of Yugoslavia, Russia could fire an SLBM and detonate a
single nuclear warhead at high altitude over the United States. The resulting EMP would massively disrupt U.S. communications and
computer systems, shutting down everything.’’ China’s military doctrine also openly describes EMP attack as the ultimate
asymmetric weapon, as it strikes at the very technology that is the basis of U.S. power. Where EMP is concerned, ‘‘The United States
is more vulnerable to attacks than any other country in the world’’: ‘‘Some people might think that things similar to the ‘Pearl Harbor
Incident’ are unlikely to take place during the information age. Yet it could be regarded as the ‘Pearl Harbor Incident’ of the 21st
Century if a surprise attack is conducted against the enemy’s crucial information systems of command, control, and communications
by such means as… electromagnetic pulse weapons… Even a superpower like the United States, which possesses nuclear missiles
and powerful armed forces, cannot guarantee its immunity…In their own words, a highly computerized open society like the United
States is extremely vulnerable to electronic attacks from all sides. This is because the U.S. economy, from banks to telephone
systems and from power plants to iron and steel works, relies entirely on computer networks… When a country grows increasingly
powerful economically and technologically…it will become increasingly dependent on modern information systems… The United
States is more vulnerable to attacks than any other country in the world.’’ Iran—the world’s leading sponsor of international
terrorism—in military writings openly describes EMP as a terrorist weapon, and as the ultimate weapon for prevailing over the West:
‘‘If the world’s industrial countries fail to devise effective ways to defend themselves against dangerous electronic assaults, then
they will disintegrate within a few years… American soldiers would not be able to find food to eat nor would they be able to fire a
single shot.’’ The threats are not merely words. The EMP Commission assesses that Russia has, as it openly declares in military
writings, probably developed what Russia describes as a ‘‘Super-EMP’’ nuclear weapon—specifically designed to generate
extraordinarily high EMP fields in order to paralyze even the best protected U.S. strategic and military forces. China probably also
has Super-EMP weapons. North Korea too may possess or be developing a Super-EMP nuclear weapon, as alleged by credible
Russian sources to the EMP Commission, and by open-source reporting from South Korean military intelligence. But any nuclear
weapon, even a low-yield first generation device, could suffice to make a catastrophic EMP attack on the United States. Iran,
although it is assessed as not yet having the bomb, is actively testing missile delivery systems and has practiced launches of its best
missile, the Shahab–III, fuzing for high- altitude detonations, in exercises that look suspiciously like training for making EMP attacks.
As noted earlier, Iran has also practiced launching from a ship a Scud, the world’s most common missile—possessed by over 60
nations, terrorist groups, and private collectors. A Scud might be the ideal choice for a ship-launched EMP attack against the United
States intended to be executed anonymously, to escape any last-gasp U.S. retaliation. Unlike a nuclear weapon detonated in a city, a
high-altitude EMP attack leaves no bomb debris for forensic analysis, no perpetrator ‘‘fingerprints.’’ Under present levels of
preparedness, communications would be severely limited, restricted mainly to those few military communications networks that are
hardened against EMP. Today’s microelectronics are the foundation of our modern civilization, but are over 1 million times more
vulnerable to EMP than the far more primitive and robust electronics of the 1960s, that proved vulnerable during nuclear EMP tests
of that era. Tests conducted by the EMP Commission confirmed empirically the theory that, as modern microelectronics become
ever smaller and more efficient, and operate ever faster on lower voltages, they also become ever more vulnerable, and can be
destroyed or disrupted by much lower EMP field strengths. Microelectronics and electronic systems are everywhere, and run
virtually everything in the modern world. All of the civilian critical infrastructures that sustain the economy of the United States, and
the lives of 310 million Americans, depend, directly or indirectly, upon electricity and electronic systems. Of special concern is the
vulnerability to EMP of the Extra-High-Voltage (EHV) transformers, that are indispensable to the operation of the electric grid. EHV
transformers drive electric current over long distances, from the point of generation to consumers (from the Niagara Falls
hydroelectric facility to New York City, for example). The electric grid cannot operate without EHV transformers—which could be
destroyed by an EMP event. The United States no longer manufactures EHV transformers. They must be manufactured and imported
from overseas, from Germany or South Korea, the only two nations in the world that manufacture such transformers for export.
Each EHV transformer must be custom-made for its unique role in the grid. A single EHV transformer typically requires 18 months to
manufacture. The loss of large numbers of EHV transformers to an EMP event would plunge the United States into a protracted
blackout lasting years, with perhaps no hope of eventual recovery, as the society and population probably could not survive for even
1 year without electricity. Another key vulnerability to EMP are Supervisory Control And Data Acquisition systems (SCADAs). SCADAs
essentially are small computers, numbering in the millions and ubiquitous everywhere in the critical infrastructures, that perform
jobs previously performed by hundreds of thousands of human technicians during the 1960s and before, in the era prior to the
microelectronics revolution. SCADAs do things like regulating the flow of electricity into a transformer, controlling the flow of gas
through a pipeline, or running traffic control lights. SCADAs enable a few dozen people to run the critical infrastructures for an entire
city, whereas previously hundreds or even thousands of technicians were necessary. Unfortunately, SCADAs are especially
vulnerable to EMP. EHV transformers and SCADAs are the most important vulnerabilities to EMP, but are by no means the only
vulnerabilities. Each of the critical infrastructures has their own unique vulnerabilities to EMP: The National electric grid, with its
transformers and generators and electronic controls and thousands of miles of power lines, is a vast electronic machine—more
vulnerable to EMP than any other critical infrastructure. Yet the electric grid is the most important of all critical
infrastructures, and is in fact the keystone supporting modern civilization, as it powers all the
other critical infrastructures. As of now it is our technological Achilles Heel. The EMP Commission found
that, if the electric grid collapses, so too will collapse all the other critical infrastructures. But, if the
electric grid can be protected and recovered, so too all the other critical infrastructures can also be restored. Transportation is
a critical infrastructure because modern civilization cannot exist without the goods and services moved by
road, rail, ship, and air. Cars, trucks, locomotives, ships, and aircraft all have electronic components, motors, and controls
that are potentially vulnerable to EMP. Gas stations, fuel pipelines, and refineries that make petroleum products depend upon
electronic components and cannot operate without electricity. Given our current state of unpreparedness, in the aftermath of a
natural or nuclear EMP event, transportation systems would be paralyzed. Traffic control systems that avert traffic jams
and collisions for road, rail, and air depend upon electronic systems, that the EMP Commission discovered are especially vulnerable
to EMP. Communications is a critical infrastructure because modern economies and the cohesion and operation of
modern societies depend to a degree unprecedented in history on the rapid movement of
information—accomplished today mostly by electronic means. Telephones, cell phones, personal computers, television, and
radio are all directly vulnerable to EMP, and cannot operate without electricity. Satellites that operate at Low-Earth-Orbit (LEO) for
communications, weather, scientific, and military purposes are vulnerable to EMP and to collateral effects from an EMP attack.
Within weeks of an EMP event, the LEO satellites, which comprise most satellites, would probably be inoperable. Banking
and
finance are the critical infrastructure that sustain modern economies. Whether it is the stock market, the
financial records of a multinational corporation, or the ATM card of an individual—financial transactions and record keeping all
depend now at the macro- and micro-level upon computers and electronic automated systems. Many of these are directly
vulnerable to EMP, and none can operate without
electricity. The EMP Commission found that an EMP event could
transform the modern electronic economy into a feudal economy based on barter. Food has
always been vital to every person and every civilization. The critical infrastructure for producing,
delivering, and storing food depends upon a complex web of technology, including machines for
planting and harvesting and packaging, refrigerated vehicles for long-haul transportation, and
temperature-controlled warehouses. Modern technology enables over 98 percent of the U.S.
National population to be fed by less than 2 percent of the population. Huge regional warehouses that resupply
supermarkets constitute the National food reserves, enough food to feed the Nation for 30–60 days at normal consumption rates,
the warehoused food preserved by refrigeration and temperature control systems that typically have enough emergency electrical
power (diesel or gas generators) to last only about an average of 3 days. Experience with storm-induced blackouts
proves that when these big regional food warehouses lose electrical power, most of the food
supply will rapidly spoil. Farmers, less than 2 percent of the population as noted above, cannot feed 310
million Americans if deprived of the means that currently makes possible this technological
miracle. Water too has always been a basic necessity to every person and civilization, even more crucial than food. The critical
infrastructure for purifying and delivering potable water, and for disposing of and treating waste water, is a vast networked machine
powered by electricity that uses electrical pumps, screens, filters, paddles, and sprayers to purify and deliver drinkable water, and to
remove and treat waste water. Much of the machinery in the water infrastructure is directly vulnerable to EMP. The system cannot
operate without vast amounts of electricity supplied by the power grid. A natural or nuclear EMP event would immediately deprive
most of the U.S. National population of running water. Many natural sources of water—lakes, streams, and rivers—would be
dangerously polluted by toxic wastes from sewage, industry, and hospitals that would backflow from or bypass wastewater
treatment plants, that could no longer intake and treat pollutants without electric power. Many natural water sources that would
normally be safe to drink, after an EMP event, would be polluted with human wastes including feces, industrial wastes including
arsenic and heavy metals, and hospital wastes including pathogens. Emergency services such as police, fire, and hospitals are the
critical infrastructure that upholds the most basic functions of government and society—preserving law and order, protecting
property and life. Experience from protracted storm-induced blackouts has shown, for example in the aftermath of Hurricanes
Andrew and Katrina, that when the lights go out and communications systems fail and there is no gas for squad cars, fire trucks, and
ambulances, the worst elements of society and the worst human instincts rapidly take over. The EMP
Commission found that, given our current state of unpreparedness, a natural or nuclear EMP event could create
anarchic conditions that would profoundly challenge the existence of social order.
Case
1NC---Adv 1
1) Even with the aff, CAFOs are not legally required to report their location and name -- that opens loopholes and makes enforcement impossible
NSAC 12 NSAC, National Sustainable Agriculture Coalition, "EPA DUCKS RESPONSIBILITY TO
GATHER INFORMATION ON CAFOS", 7-18-2012, https://sustainableagriculture.net/blog/cafo-
reporting-rule-withdrawn//mb

The U.S. Environmental Protection Agency (EPA) has announced that it is withdrawing a
proposed Clean Water Act regulation that would have required Concentrated Animal Feeding
Operations (CAFOs) to report basic information to EPA. The information is necessary to ensure that CAFOs
properly handle their waste to avoid water pollution. When Congress enacted the Clean Water Act in 1972, it
specifically included CAFOs as point sources of pollution subject to regulation under the Act.
Despite that fact, in a 2008 report the U.S. Government Accountability Office concluded that no
federal agency, including the EPA, had consistent, reliable data on CAFOs . The Report noted
that EPA did not even have accurate information on the number of permitted CAFOs nationwide.
EPA proposed the reporting regulation as part of settlement with environmental plaintiffs in the lawsuit National Pork Producers
Council v. EPA. The agency is authorized to collect the information under Section 308 of the Clean Water Act. Environmental groups
fought for the reporting regulation rule because the very weak 2008 CAFO permit rule does not require all CAFOs to obtain a Clean
Water Act permit. In addition, the CAFO permit rule has large gaps that leave tons of CAFO manure and other waste unregulated.
The biggest of many flaws is that waste applied on land that is not under the control of the CAFO operator is not subject to the permit’s
nutrient management plan. In addition, EPA omitted from the permit controls over land application of heavy metals, antibiotics,
pathogens, growth hormones, and other substances commonly found in CAFO waste. EPA’s
proposed regulation was
already very weak, with EPA proposing to require CAFOs to submit information on only five out
of fourteen items addressed in the settlement, including: Type of facility; Number and type(s) of
animals; Whether the CAFO land applies CAFO waste; Available acreage for land application; and
Whether the CAFO has an NPDES permit. EPA proposed to omit the following information requested
by the environmental plaintiffs: Name and address of the owner and operator; If a contract
operation, the name of the vertical integrator; Location (longitude and latitude) of the
operation; Type and capacity of manure storage; Quantity of manure, process wastewater and
litter generated by the CAFO; If the CAFO land applies, whether it implements a nutrient
management plan for land application; If the CAFO land applies, whether it employs nutrient
management practices and keeps records on site consistent with the CAFO permit regulations; If
the CAFO does not land apply, alternative uses of manure, litter, and/or wastewater; and
Whether the CAFO transfers manure off-site, and if so, quantity transferred to recipients of
transferred manure. In addition, EPA wanted to collect the information only in “focus
watersheds” which are already known to have water quality degraded by CAFO pollution. This
approach would ignore watersheds where sources of pollution were not known and watersheds
where inadequate CAFO waste handling could cause future impairments. In this week’s notice
that EPA was withdrawing the reporting rule altogether, the agency concluded that it could rely
on information on CAFOs from the states and other sources. This conclusion, however, is belied by numerous reports
and statements from EPA regions and state regulators.

2) CAFOs can dodge requirements and there’s zero enforcement


Glibert 20 – Professor with the University of Maryland Center for Environmental Science
Patricia M., “From hogs to HABs: impacts of industrial farming in the US on nitrogen and
phosphorus and greenhouse gas pollution.” Biogeochemistry volume 150, pages 139–180 (2020).
https://link.springer.com/article/10.1007/s10533-020-00691-6

By definition, CAFO lagoons are “point sources” of pollution and, depending on the size of operation and waste
handling procedures, must be permitted under the Clean Water Act, which requires operators to have a nutrient
management plan and which defines the limits on the allowable amount of discharge to local waters. Such regulations have been
regularly revised (US EPA 2010) and regularly challenged in court. As noted above, state-wide
reporting–and therefore the
transparency of state-wide statistics–of CAFOs is low for almost every state (Miller and Muren 2019). Permitting
can be avoided if the size of the operation falls just under the regulatory limit, and the
percentage of CAFOs reporting permits to the EPA
(https://www.epa.gov/sites/production/files/2019-09/documents/cafo_tracksum_endyear_2018.pdf) is astonishingly low,
especially for those states where hog production is high (Fig. 11b; Online Resources Fig. S6c). Permitting
can also be avoided if the facility does not discharge directly to a waterway. Lack of permitting
does not imply illegal operation, only that the configuration (i.e., number of confined animals or waste
management procedures) of the farm differs from that required to be regulated. The animals from
unpermitted operations nevertheless still release nutrients. Moreover, federal inspections and
enforcement of CAFOs have declined every year since 2011; in 2016, enforcement actions were down 75%
and inspections down more than 50% compared to those actions taken during the Obama administration (Walton 2016).

3) The plan can’t solve pollution regulation – the EPA has no way of
knowing when to prosecute CAFOs because there is no clear definition of the permitted amount of emissions
Office of Inspector General 17 EPA Office of Inspector General, U.S. ENVIRONMENTAL PROTECTION AGENCY,
"Eleven Years After Agreement, EPA Has Not Developed Reliable Emission Estimation Methods
to Determine Whether Animal Feeding Operations Comply With Clean Air Act and Other
Statutes", 7-19-2017,
https://www.epa.gov/sites/default/files/2017-09/documents/_epaoig_20170919-17-p-
0396.pdf//mb

Planning for Draft Development of EEMs Was Not Systematic Ideally, under a systematic planning process, a
methodology for producing a final product at the desired quality is determined up front. This methodology then drives the data
collection efforts. When data are to be used to make some type of decision or estimation, the EPA recommends that the desired
level of quality be expressed in the form of DQOs. As noted in Chapters 1 and 2, the EPA collaborated with external scientists to
develop the monitoring protocol. However, several factors influenced the scope of the NAEMS, and that
effort was not specifically designed to produce data to satisfy acceptance criteria for the EEMs.
Among these factors was that, prior to the study, the EPA did not know which variables most
impact air emissions at AFOs. Thus, the EPA tried to create an EEM development methodology using the data that was
available from the NAEMS. The NAEMS protocol stated that the NAEMS and subsequent data analyses
and interpretation would allow the EPA and livestock and poultry producers to “reasonably
determine” which AFOs were subject to CAA regulatory provisions and CERCLA/EPCRA reporting
requirements. However, as part of its planning, the EPA did not define what was meant by “reasonably
determine.” The EPA developed a quality assurance project plan for its efforts to develop the draft EEMs that were published in
2012, but it focused on assessing the quality of incoming data from the NAEMS and other sources. [Sidebar in original: “Unless some form of planning is conducted prior to investing the necessary time and resources to collect data, the chances can be unacceptably high that these data will not meet specific project needs.” Guidance on Systematic Planning Using the Data Quality Objectives Process, EPA QA/G-4, February 2006] The quality assurance project plan did not include DQOs
or other performance criteria defining the acceptable level of uncertainty for EEM predictions,
or the quality control measures the EPA would use to assure its statistical models were
scientifically and statistically sound. The EPA had its draft EEMs peer reviewed by the SAB, but
the agency did not involve the SAB in its planning process to ensure that the NAEMS would
provide sufficient data for EEM development. As discussed in Chapter 2, the SAB concluded that the EPA’s draft
EEMs were not useful for making compliance determinations nationwide due to problems with the underlying data and analysis.

4) *The EPA can stop enforcing regulations whenever it wants -- that means the counterplan is a better option, and the new Delta wave shows that nothing is certain
Knickmeyer 20 ELLEN KNICKMEYER, AP News, "Citing virus, EPA has stopped enforcing environmental laws", 5-27-2020,
https://apnews.com/article/business-public-health-donald-trump-virus-outbreak-environment-
45fdad55aaa3bccde3d09939c631c208//mb

WASHINGTON (AP) — The Environmental Protection Agency on Thursday abruptly waived enforcement on a range of legally mandated public health and environmental protections,
saying industries could have trouble complying with them during the coronavirus pandemic. The
oil and gas industry was among the industries that had sought an advance relaxation of environmental and public health
enforcement during the outbreak, citing
potential staffing problems. The EPA’s decision was sweeping,
forgoing fines or other civil penalties for companies that failed to monitor, report or meet some
other requirements for releasing hazardous pollutants. The move was the latest,
and one of the broadest, regulation-easing moves by the EPA, which is seeking to roll back
dozens of regulations as part of President Donald Trump’s purge of rules that the administration
sees as unfriendly to business. Civil and criminal enforcement of polluters under the administration has fallen sharply.
Former Obama-era EPA chief Gina McCarthy, now president of the Natural Resources Defense Council, called the announcement “an
open license to pollute.” The
administration was “taking advantage of an unprecedented public health
crisis to do favors for polluters that threaten public health,” McCarthy said, part of a flurry of condemnation of the announcement from environmental groups. In a statement, EPA Administrator Andrew Wheeler said the open-
ended waiver was temporary and retroactive to March 13. “EPA is committed to protecting human health and the environment, but
recognizes challenges resulting from efforts to protect workers and the public from COVID-19 may directly impact the ability of
regulated facilities to meet all federal regulatory requirements,” Wheeler said. “This temporary policy is designed to provide
enforcement discretion under the current, extraordinary conditions, while ensuring facility operations continue to protect human
health and the environment.” The EPA directive said industries would be expected to comply with
regulations “where reasonably practicable.” Businesses that broke regulations would have to be
able to show that they tried to reduce the harm, and show how any violations were caused by
the coronavirus outbreak, the EPA said. Collin O’Mara, president of the National Wildlife Federation, called the move
“an assault on our public health and an absolute abdication of the legal responsibilities of the EPA.” The EPA
said the advance pass on enforcement did not apply to criminal violations by polluters. While
there were circumstances where a disaster like the pandemic might make compliance impossible, those instances called for narrow
decisions by regulators on clemency, said Cynthia Giles, a former senior EPA enforcement official during the Obama administration. Giles
said she knew of no previous time in the EPA’s half-century history where it “relinquished its fundamental authority” as she said it
did Thursday.

5) Ag’s not the primary cause of their impacts


Johnson, 15 – assistant vice provost for faculty recruitment and associate professor of
environmental science for the University of Texas at Arlington
Ashanti, with Melanie Harrison, March-April. “The Increasing Problem of Nutrient Runoff on the
Coast.” American Scientist Volume 103, Number 2, p. 98.
http://www.americanscientist.org/issues/pub/problem-of-nutrient-runoff-on-coast/3

Some of the starkest examples of coastal eutrophication occur in the United States, where the
phenomenon has multiple causes. Increased prevalence of sewage, fertilizers, and other
chemicals for industry, household life, and agriculture are important contributors, but they are not
the only way that people cause nutrient levels to rise. By cutting down trees, paving roads and
parking lots, and developing in wetlands, nutrients are less likely to be taken up by plant roots
and soil microbes or filtered through groundwater before reaching rivers and sea.

6) The aff can't solve social inequality


Klusener 17 Edgar Klusener, Manchester University, "Are capitalism and inequality linked?",
5-8-2017, https://sites.manchester.ac.uk/global-social-challenges/2017/05/08/are-capitalism-
and-inequality-linked//mb
Capitalism is the political and economic system in which a country’s trade, investment, industry, and production are controlled by
private individuals and for-profit corporations rather than by the state. The emergence and growth of capitalism in the 19th and
20th centuries has led it to become the globally dominant economic system. Capitalism has outperformed all competing systems
such as socialism and communism (Muller. 2013). The global shift towards capitalism due to its potential for higher profits, equality
of opportunity, economic freedom, and the reduced role of the state has led to the major problem of rising economic inequality
because some individuals and groups are abler than others to exploit and take advantage of what capitalism allows them to. A
society is capitalist when most of the distribution of products occurs through markets in which
people and for-profit organisations trade goods, contracts and services to generate profit. The
markets allow the circulation of capital into new investment, as invested capital is mixed with
labour power to produce goods and services that are sold to generate a profit which provides more capital for
further enterprise. Investors can continue to do this and create a cycle of increasing amounts of profit. This shows how the
ownership of capital in a capitalist society will produce inequality to an extent as it creates
further accumulation through investment. Cycles can continue as the annual rate of return on capital invested is
often 3-5 percent, whereas the annual income or output growth rate has a long-term average of 1.5 percent. This makes it difficult
to generate capital to invest in a capitalist society (Piketty 2014). However, many economists believed until recently this wouldn’t
have as big of an effect on inequality as we have seen. They thought that the early winners in the capitalist society and the richest’s
investments would decrease over time because a number of people lower down the social ladder would get a chance to invest
capital as they haven been able to produce capital over time. Today, eight people own as much wealth as 50
percent of the global population of 7.4 billion, and in the USA, the richest 1 percent own 34 percent of the wealth
and the richest 10 percent own 74 percent of the wealth (Hodgson. 2016). These figures highlight the economic
inequality in the world today, however, this isn’t just a problem for the poorest in society,
because if these figures continue to grow and the inequality isn’t addressed then the rising
inequality and economic insecurity can erode social order and create a backlash against the
capitalist system. The inequality in capitalist societies is often not down to differences in talent and skill, but it is driven by
resource inequality and inequalities of inheritance leading to differences in education, economic capabilities, and the number of
opportunities available. An
increasingly unequal society may lead to higher levels of other social
problems within a country or region such as obesity, teenage births, homicides, and
imprisonment rates. These problems are also more experienced by the poorest people in an
unequal society and are independent of poverty. These problems are very difficult for a capitalist society to
alleviate as continuous investment from the top into the police and health service, for example, may be relatively ineffective as
these problems are being endlessly recreated in each generation in the most deprived areas of society. Reducing
inequality
in societies is thus the best way to improve the quality of the social environment, the real
quality of life for everyone, and national standards of performance as inequality is the
underlying problem in societies that creates a steeper social gradient in areas such as health and
educational performance. (Wilkinson, Pickett. 2010) The inequality created by capitalism, however, is not the same all over
the world. Increasing income, inequality, and demand for more valuable goods have created a symbolic nature of inequality in the
most developed capitalist societies, where a good or service’s status or identity may be more important than the goods use or
function, leading people to continuously consume to try to gain a higher status. Inequality is of a different nature in the least
developed societies. Although they are still affected by the capitalist system, the bare necessities are more important for people
living in poverty who have no or only very limited access to electricity or clean water. Furthermore, the poorest in those societies
are often being exploited by the richest corporations of MEDCs. (Wilkinson, Pickett. 2010) The
nature of capitalism has
led to society’s biggest problem of huge gaps of widening inequality that has led to further
worsening problems that are becoming continuously more difficult to solve.
1NC---Adv 2
1) Have a high threshold for spill-up claims, especially when there’s evidence that the precautionary principle for CAFOs isn’t enforceable
2) The precautionary principle overcorrects---leads to permanent inaction.
Sunstein 03 [Cass R. Sunstein, Karl N. Llewellyn Distinguished Service Professor of
Jurisprudence, Law School and Department of Political Science, University of Chicago. January
2003, "Beyond the Precautionary Principle," University of Pennsylvania Law Review Vol. 151, No.
3 (Jan., 2003), pp. 1003-1058, accessed 7-23-2021, https://doi.org/10.2307/3312884] //BY, we
do not endorse ableist language

The most serious problem with the strong version of the precautionary principle is that it offers no
guidance-not that it is wrong, but that it forbids all courses of action, including inaction. To understand this
point, it will be useful to anchor the discussion in some concrete problems:

1. One of the most controversial environmental issues faced in the first year of the Bush administration involved the regulation
of arsenic.7 There is a serious dispute over the precise level of risks posed by low levels of arsenic in drinking
water, but in the "worst case" scenario, over one hundred lives might be lost each year as a result of the
fifty parts per billion (ppb) standard that the Clinton administration sought to revise. At the same time, the proposed ten
ppb standard would cost over two hundred million dollars each year, and it is possible that it
would save as few as five lives annually.76

2. Genetic modification of food has become a widespread practice.77 But the risks of that
practice are not known with precision.78 Some people fear that genetic modification will result
in serious ecological harm and large risks to human health.79

3. Scientists are not in full accord about the dangers associated with global warming,80 but there
is general agreement that global warming is in fact occurring.81 It is possible that global warming will
produce, by 2100, a mean temperature increase of 4.5 degrees Celsius,82 that it will result in well over five trillion
dollars in annual monetized costs,83 and that it will also produce a significant number of deaths from
malaria. The Kyoto Protocol would require most industrialized nations to reduce greenhouse gas emissions to between ninety-
two percent and ninety-four percent of 1990 levels.84

4. Many people fear nuclear power on the grounds that nuclear power plants raise various
health and safety issues, including some possibility of catastrophe.85 But if a nation does not rely
on nuclear power, it might well rely instead on fossil fuels, and in particular on coal-fired power plants.86 Such plants create risks of their own, including risks associated with global warming. China, for
example, has relied on nuclear energy in part as a way of reducing greenhouse gases and in part as a way of reducing other air
pollution problems.87

5. There is a possible conflict between the protection of marine mammals and military exercises.
The United States Navy, for example, engages in many such exercises, and it is possible that marine
mammals will be threatened as a result. Military activities in the oceans might well cause
significant harm, but a decision to suspend those activities, in cases involving potential harm,
might also endanger military preparedness.88

In these cases, what kind of guidance is provided by the precautionary principle? It is tempting
to say, as is in fact standard, that the principle calls for strong controls on arsenic, on genetic
engineering of food, on greenhouse gases, on threats to marine mammals, and on nuclear
power.89 In all of these cases, there is a possibility of serious harms, and no authoritative scientific
evidence suggests that the possibility is close to zero. If the burden of proof is on the proponent of the activity or
process in question, the precautionary principle would seem to impose a burden of proof that cannot be met. Put to one side the
question of whether the precautionary principle, understood to compel stringent regulation in these cases, is sensible. Let us ask a
more fundamental question: Is that more stringent regulation therefore compelled by the precautionary
principle?

The answer is that it is not. In some of these cases, it should be easy to see that, in its own way, stringent
regulation would actually run afoul of the precautionary principle. The simplest reason is that such
regulation might well deprive society of significant benefits, and for that reason produce a large
number of deaths that otherwise would not occur. In some cases, regulation eliminates the
"opportunity benefits" of a process or activity, and thus causes preventable deaths.90 If this is so, regulation is
hardly precautionary. The most familiar cases involve the "drug lag," produced by a highly precautionary approach to the
introduction of new medicines and drugs into the market.91 If a government takes such an approach, it might protect people against
harms from inadequately tested drugs; but it will also prevent people from receiving potential benefits from those very drugs.92 Is it
"precautionary" to require extensive premarketing testing, or to do the opposite?

Or consider the case of genetic modification of food. Many people believe that a failure to allow genetic
modification might well result in numerous deaths, and a small probability of many more.93 The
reason is that genetic modification holds out the promise of producing food that is both cheaper
and healthier-resulting, for example, in "golden rice," which might have large benefits in developing
countries. Now the point is not that genetic modification will definitely have those benefits or that the benefits of genetic
modification outweigh the risks. The point is only that if the precautionary principle is taken in its strongest
form, it is offended by regulation as well as by nonregulation. So too for regulation of ground-level
ozone. Such regulation does seem justified by the precautionary principle, for responsible people believe that
low levels of ozone produce a range of health harms, including risks of death. But there is also evidence that ground-
level ozone produces health benefits by reducing risks of cataracts and skin cancer.96 Because
the precautionary principle calls for protection when causal connections are unclear, it would
appear to require, with respect to ground-level ozone, both stringent regulation and no
regulation at all.

Sometimes regulation would violate the precautionary principle because it would give rise to
substitute risks, in the form of hazards that materialize, or are increased, as a result of regulation.97 Consider the case of
nuclear power. It is reasonable to think that in light of current options, a ban on nuclear power will
increase dependence on fossil fuels,98 which contribute to global warming. If so, such a ban would
seem to run afoul of the precautionary principle. Or consider the EPA's effort to ban asbestos,99 a ban
that might well seem justified or even compelled by the precautionary principle. The difficulty,
from the standpoint of that very principle, is that substitutes for asbestos also carry risks.100 Or return to possible
risks to marine mammals from the United States Navy. Some people are concerned that efforts to
eliminate those risks will endanger military preparedness, if only because of administrative
barriers to training exercises.101 In these circumstances, what is the appropriate approach, according to
the precautionary principle?

The problem is pervasive. The Administrator of the EPA has expressed concern that arsenic
regulation, by virtue of its cost, will lead people to cease using local water systems and to rely on
private wells, which have high levels of contamination.102 If this is so, stringent arsenic regulation violates the precautionary principle no less than less stringent regulation does. This is a common
situation, for opportunity benefits and substitute risks are the rule, not the exception.103 Or consider
the continuing debate over whether certain antidepressants impose a (small) risk of breast cancer.104 A
precautionary approach might seem to caution against the use of such antidepressants because of
their carcinogenic potential; but the failure to use those depressants might well impose risks of its
own, both psychological and physical. Or consider the Soviet Union's decision to evacuate and relocate
400,000 people in response to the risk of adverse effects from the Chernobyl fallout.105 It is not clear that, on balance, this
massive relocation project was justified on health grounds: "A comparison ought to have been made between
the psychological and medical burdens of this measure (anxiety, psychosomatic diseases, depression, and suicides) and the harm
that may have been prevented.”106 More generally, it
is possible that a sensible government ignores low
levels of radiation, on the grounds that precautionary responses are likely to cause fear that
outweighs any health benefits from those responses.107

Or consider a more general question about how to handle low-level toxic agents, including carcinogens: Do such agents
cause adverse effects? If we lack clear evidence, it might seem "precautionary" to assume that
they do, and hence to assume, in the face of uncertainty, that the dose-response curve is linear
and without safe thresholds.108 In fact, this is the default assumption of the EPA.109 But is this approach
actually precautionary? Some evidence suggests that many toxic agents that are harmful at high levels are
actually beneficial at low levels.110 Thus, hormesis is a dose-response relationship in which low doses stimulate desirable effects and high doses inhibit them.111 When hormesis is involved, use of a
linear dose-response curve, without safe thresholds, will actually cause mortality and morbidity
effects. Which default approach to the dose-response curve is precautionary?112 To raise this question is not to take any stand on
whether some, many, or all toxic agents are beneficial or instead harmful at very low doses; it is only to say that the simultaneous
possibility of benefits at low levels and of harms at low levels makes the precautionary principle paralyzing.

It is possible to go much further. A great deal of evidence suggests the possibility that an expensive
regulation can have adverse effects on life and health.113 To be sure, both the phenomenon and the
underlying mechanisms are disputed.114 It has been urged that a statistical life can be lost for
every expenditure of $7.25 million,115 and one study suggests a cutoff point, for a loss of life per regulatory expenditure, of $15 million.116 A striking paper suggests that poor people are especially vulnerable to this effect-that a
regulation that reduces wealth for the poorest twenty percent of the population will have twice
as large a mortality effect as a regulation that reduces wealth for the wealthiest twenty
percent.117 I do not mean to accept any particular amount here, or even to suggest that there has been an unambiguous demonstration of an association between mortality and regulatory expenditures.118 The only point is that reasonable people
believe in that association. It follows that a multimillion-dollar expenditure for "precaution" has-as a
worst case scenario-significant adverse health effects, with an expenditure of $200 million
leading to perhaps as many as thirty lives lost.

This point makes the precautionary principle hard to implement not merely where regulation
removes "opportunity benefits" or introduces or increases substitute risks, but also in any case
in which the regulation costs a significant amount. If this is so, the precautionary principle, for that very reason,
seems to argue against many regulations. If the precautionary principle draws into doubt any action that
carries a small risk of significant harm, then we should be reluctant to spend a lot of money to
reduce risks, simply because those expenditures themselves carry risks. Here is the sense in which the
precautionary principle, taken for all that it is worth, is paralyzing: it stands as an obstacle to regulation and
nonregulation, and to everything in between.
To say this is not to say that the precautionary principle cannot be amended in a way that removes the problem.119
But once it is so amended, it is much less distinctive and increasingly resembles an effort to
weigh the health benefits of regulation against the health costs,120 or even to measure benefits against
costs. I will return to this point below.
It is now easier to understand the earlier suggestion that despite their formal enthusiasm for the precautionary principle, European
nations are not "more precautionary" than the United States. Jonathan Wiener and Michael Rogers have demonstrated this point
empirically.121 It would be most valuable to attempt a comparative study, to see which nations are especially precautionary with respect to specific risks, and also to explore changes over time. In the modern period, for example, the United States has appeared
to take a highly precautionary approach to the risks associated with abandoned hazardous waste dumps,122 terrorism, and
universal health care, but not to take a highly precautionary approach to the risks associated with global warming, indoor air
pollution, poverty, and obesity. What I have been urging is that the selectivity
of precautions is not merely an
empirical fact; it is a conceptual inevitability. Simply as a logical matter, no society can be highly
precautionary with respect to all risks.

3) No i/L between an unenforceable CAFO policy and upholding the democratic administrative state
4) Alt causes mean they can’t solve—ACB and nationalist uprisings outweigh
5) They have no solvency for ag exceptionalism/neolib ag when Big Ag can still lobby in the world of the aff and CAFOs can circumvent regs
