
1AC

Collisions – 1AC

Unanticipated collisions cause US-China miscalc and nuclear war


Fabian 19 [Christopher David, Bachelor of Science, USAFA. Master of Science in Space
Studies Thesis. “A Neoclassical Realist’s Analysis Of Sino-U.S. Space Policy” January 2019
https://commons.und.edu/cgi/viewcontent.cgi?article=3456&context=theses]
Morgan points out that the nature of space deterrence has fundamentally changed since the end of the Cold War. First, a decoupling of space and nuclear warfare has destroyed
the tacit red lines that guaranteed an attack on space systems would result in nuclear retaliation.60 Furthermore, technologies have been developed that allow for incremental
escalation and nonlethal functional kills of space assets.61 A paradigm is created where escalation is probable, but the extent to which it will happen is unknown. This is a
problem for Sino-U.S. space relations because China is a nuclear-capable power that believes itself to have achieved nuclear deterrence with the United States, yet does not have
the implied strategic understanding that it took the U.S. and the U.S.S.R. four decades to build. The rules of the game have changed, but the danger of
nuclear apocalypse is still real and the risk of miscalculation has increased.
Freese’s assertion that the dual-use phenomenon complicates deterrence and extends the reasoning on offensive dominance by adding valuable insight
on the state of first-strike stability. In short, first-strike stability is difficult to maintain because the disproportionate gain from a first strike outweighs
any cost a recipient can impose in response. The United States’ overwhelming reliance on and comparative advantage from space-based effects gives a
prospective attacker very high payoff and satellites being relatively soft targets increases the likelihood of success and further adds to the benefit of a
first-strike.62 Conversely, the emphasis on system based warfare means that an effective attack on space assets drastically reduces the ability of the U.S.
to impose costs. Also, its overreliance on space and the fragility of the space environment require an asymmetric response to both avoid a tit-for-tat
spiral and protect the continued use of the domain. Furthermore, a lack of space situational awareness (SSA) prevents a rapid response.63 Chinese
military planners are acutely aware of the asymmetric advantage to be gained from a first-strike in space and have integrated it into military doctrine.
This further strengthens the argument of space warfare as a flash point in East Asia. The structural factors examined in the literature thus far paint a
bleak picture for a peaceful restructuring of East Asia. However, a bipartisan grand strategy
that preempts conflict, is sustained and
refined over decades, and has an acute sense of both a competitor and one’s own culture and history may be able to

subvert structural determinism.64 When imperfect rationality and miscalculation result in deterrence failure, it is
difficult to overestimate the importance of understanding a competitor’s decision-making
apparatus. Strategic culture, political climate, and soft power interactions are the core of this
apparatus. Joan Johnson-Freese, who is equal parts East Asia policy and space policy expert, asserts that, “it might
be generally possible to grasp the mechanics of the Chinese space program without the benefits of historical
information, but the likelihood of truly understanding the policy aspects without this contextual
information is slightly less, and attempts at analysis and extrapolation become superficial at best.”65 Likewise,
competitive strategy will be ineffective absent an understanding of one’s own limitations. Resources such as latent
military capacity, budget, political capital, strategic culture, and soft power/international prestige should be easy to calculate, but many times within
the space program’s short history the failure to grasp internal limitations has been a stumbling block. Henry Kissinger’s On China is a nuanced
examination of Chinese strategic culture that benefits from the author’s understanding of Chinese history and the nation’s role in late-20th/early-21st
century global power politics. He conveys a unified message through On China, that continual diplomatic engagement
between the two powers is the key to peace, and develops two motifs throughout the work. First, misapprehension
of Chinese intent by western powers has repeatedly resulted in conflict, which could be
avoided with better understanding of Chinese strategic culture. Traditional Chinese strategic culture, shaped for
millennia by geography and Confucian principles, was not destroyed by Mao and the communist revolution as many assert. Kissinger uses the traditional martial games of wei qi
(go) and chess to exemplify Chinese and western strategic cultures respectively. Where wei qi teaches the art of strategic flexibility by emphasizing encirclement, protracted and
asymmetric warfare, generating imperceptibly small advantages, and momentum; chess teaches total victory achieved by attrition, decisive moves, centers of gravity, and
symmetry. Carl von Clausewitz teaches that war is policy by other means, implying that war is a distinct phase of politics; while Sun Tzu emphasizes victory before fighting by
achieving psychological advantage with military means as a small part of overall strategy. The ideal Chinese military conflict is geographically limited and easily contained; the

American way of war concludes only upon total victory. Kissinger then describes the feedback loop that results from conflicting
strategic perspectives. The western desire for control threatens Chinese freedom of maneuver and exacerbates their siege mentality. In response, China assumes a policy of active defense (preemption) in order to maintain
the strategic initiative. This, in turn, is seen as hostile by the west and typically results in escalation in order to establish deterrence through cost imposition. The western idea of deterrence is incompatible with ambiguity and flexibility while Chinese preemption
demands it.67 This results in a distinguishable pattern. First, a state consolidates power on China’s periphery, surrounding China and threatening its structural integrity on both physical and psychological levels. Second, ever aware of shi, Chinese strategists employ
measures to maintain their strategic flexibility and prevent total encirclement. Third, the peripheral power misinterprets preemption for aggression and escalates the conflict. At this point, China is either able to contain the threat and achieve its geopolitical aims or
it is too weak to do so and is thrown into existential crisis. In the 20th century, this pattern has been exemplified by Chinese involvement in the Korean War and its continued support of an independent state to buffer the U.S. alliance bloc from a historical ingress
point to the Chinese mainland; its own Vietnam War to prevent the emergence of a competitive power bloc led by Vietnam in Southeast Asia; and Chinese political maneuvering against the Soviet Union to prevent its consolidation of power over the Eurasian
landmass. Disregarding the similarities between these disputes and the current Sino-U.S. position in East Asia is impossible.68 Second, the Sino-centric worldview is rising in China as she emerges from a century of humiliation to become an economic and military
superpower. The over-proselytized American belief that the implementation of democracy should be the end goal of global politics and unapologetically moralist positions conflict with Sino-centrism. It is seen by China as an extension of colonial interventionism and
a threat to their fiercely held autonomy. U.S. diplomacy is often contingent on the improvement of China’s human rights record. Widespread support for China’s various separatist movements and public outcry over the Tiananmen Square incident has exacerbated
this problem. American reluctance to recognize the legitimacy of a communist government, give up democratization as long term policy goals, or give China its due in international relations has weakened Sino-U.S. relations. America’s moralist rather than pragmatic
approach to policy threatens China’s delicate social order and undermines CCP legitimacy, resulting in missed diplomatic opportunities. Other policy analysts are certainly influenced by Kissinger, but add their own insight into Chinese Strategic culture. Kenneth
Johnson and Andrew Scobell writing for the Strategic Studies Institute both attribute the apparent cognitive dissonance in Chinese policy to a curious blend of Confucian ideals and realpolitik thought, supporting Kissinger’s assertion that Confucianism is not dead.
There is a cult of defense within China, accompanying a deeply held belief that China’s strategic culture is overwhelmingly pacifist. However, preemptive action is permissible as long as it can be a justifiable “defense” of Chinese strategic interests.70 In addition,
China bemoans aggressive territorial expansion and hegemony by force. This historical sensitivity has only been exacerbated by the “century of humiliation” at the hands of European powers.71 However, the benevolent expansion of influence and the use of force for
Chinese national unification is just. Furthermore, the Chinese fear of encirclement could cause a disproportionate reaction to the U.S. force realignment and restructuring of alliances in East Asia. This could exacerbate the security dilemma that
alliance forming typically causes. Joan Johnson-Freese emphasizes the influence of Confucianism in internal decision making and the penchant for isolationism. Confucianism emphasizes peace, order, and knowing one’s place within society. This invites
authoritarianism and the Chinese people have little experience with participation in the political process. Rather, there is an instability lurking beneath the calm surface of society that leaders must constrain and satisfy in order to maintain their mandate to rule. The
social contract has a simple results-based nature where political stability and prosperity are exchanged for continued political power. The Chinese Communist Party then is less beholden to communist ideology than it is to continued prosperity. Also, despite the
negative connotation of nepotism in the West, it is an institution of Chinese culture (known as Guan Xi). To an outsider, the familial ties, importance of relationships,
compartmentalization, and ambiguity in the Chinese bureaucracy are confusing and frustrating. This research
paints the picture of the U.S. and China as diametrically opposed cultures that are almost designed
to create misunderstanding between the two. Therefore, being aware of cultural and political sensitivities
is necessary to create sound strategy. Michael Pillsbury identifies 16 psycho-cultural pressure points that, if correctly considered in reassurance, cost imposition, or
dissuasion strategies, will yield disproportionate effects, whether positive or negative. Each of these factors is referred to as a “fear”. Eleven of the sixteen fears are linked to the ability of the U.S. military
to project power into East Asia and the strategic sea lines of communication from the Strait of Malacca to the Bohai Gulf, which is contingent on the ability to deliver space effects in support of U.S. military
operation. Pillsbury identifies the fear of attack on their anti-satellite capabilities as a specific Chinese fear, but warfighting in the space domain is intrinsically linked to the other 11. Another of the sixteen fears is
the fear of escalation and loss of control. This is particularly important because the Chinese view ASAT weaponry as a legitimate cost imposition option designed to limit conflict. Contrast that with the American
strategy of threatening escalation in order to prevent the spread of the conflict into space and implicit red lines that fail to account for limited conflict in a strategic domain. Space’s role in soft power links it to two
final fears, the fear of regional competitors and the fear of internal instability. Space technology development is essential to the CCP’s techno-nationalist narrative as it is assigned great importance internally to
strengthen CCP’s mandate to rule and externally to legitimize China as a regional leader.

The plan’s process is key – direct dialogue builds experience working with each other – that solves crisis prevention and management
Grego 18 [Laura, Senior scientist in the Global Security Program at the Union of Concerned
Scientists, Technical advisor for the Woomera Manual on the International Law of Military
Space Operations project, Associate editor of Science and Global Security, delegate to the
American Physical Society’s Panel on Public Affairs for the Forum on Physics and Society.
“Space and Crisis Stability” 3-19-18 *edited for clarity
https://www.law.upenn.edu/live/files/7804-grego-space-and-crisis-stabilitypd]

The compressed timelines characteristic of crises combine with these “use it or lose it” pressures to
shrink timelines. This dynamic couples dangerously with the inherent difficulty of determining the
causes of satellite degradation, whether malicious or from natural causes, in a timely way. Space is a difficult environment in which
to operate. Satellites orbit amidst increasing amounts of debris. A collision with a debris object the size of a

marble could be catastrophic for a satellite, but objects of that size cannot be reliably tracked. So a failure due to a collision with a small piece of
untracked debris may be left open to other interpretations. Satellite electronics are also subject to high levels of damaging radiation. Because of their remoteness, satellites as a
rule cannot be repaired or maintained. While on-board diagnostics and space surveillance can help the user understand what went wrong, it is difficult to have a complete
picture on short timescales. Satellite failure on-orbit is a regular occurrence19 (indeed, many satellites are kept in service long past their intended lifetimes). In the past, when

fewer actors had access to satellite-disrupting technologies, satellite failures were usually ascribed to “natural” causes. But increasingly, even during times
of peace, operators may assume malicious intent. More to the point, in a crisis when inaction
may be perceived to be costly, there is an incentive to choose the worst-case interpretation of events even if the
information is incomplete or inconclusive. Entanglement of strategic and tactical missions During the Cold War, nuclear and conventional arms were well separated, and escalation pathways were relatively clear. While space-
based assets performed critical strategic missions, including early warning of ballistic missile launch and secure communications in a crisis, there was a relatively clear sense that these targets were off limits, as attacks could undermine nuclear deterrence. In the
Strategic Arms Limitation Treaty, the US and Soviet Union pledged not to interfere with each other’s “national technical means” of verifying compliance with the agreement, yet another recognition that attacking strategically important satellites could be
destabilizing.20 There was also restraint in building the hardware that could hold these assets at risk. However, where the lines between strategic satellite missions and other missions are blurred, these norms can be weakened. For example, the satellites that provide
early warning of ballistic missile launch are associated with nuclear deterrent posture, but also are critical sensors for missile defenses. Strategic surveillance and missile warning satellites also support efforts to locate and destroy mobile conventional missile
launchers. Interfering with an early warning sensor satellite might be intended to dissuade an adversary from using nuclear weapons first by degrading their missile defenses and thus hindering their first-strike posture. However, for a state that uses early warning
satellites to enable a “hair trigger” or launch-on-attack posture, the interference with such a satellite might instead be interpreted as a precursor to a nuclear attack. It may accelerate the use of nuclear weapons rather than inhibit it. Misperception and dual-use
technologies Some space technologies and activities can be used both for relatively benign purposes but also for hostile ones. It may be difficult for an actor to understand the intent behind the development, testing, use, and stockpiling of these technologies, and see
threats where there are none. (Or miss a threat until it is too late.) This may start a cycle of action and reaction based on misperception. For example, relatively low-mass satellites can now maneuver autonomously and closely approach other satellites without their
cooperation; this may be for peaceful purposes such as satellite maintenance or the building of complex space structures, or for more controversial reasons such as intelligence-gathering or anti-satellite attacks. Ground-based lasers can be used to dazzle the sensors
of an adversary’s remote sensing satellites, and with sufficient power, they may damage those sensors. The power needed to dazzle a satellite is low, achievable with commercially available lasers coupled to a mirror which can track the satellite. Laser ranging
networks use low-powered lasers to track satellites and to monitor precisely the Earth’s shape and gravitational field, and use similar technologies.21 Higher-powered lasers coupled with satellite-tracking optics have fewer legitimate uses. Because midcourse missile
defense systems are intended to destroy long-range ballistic missile warheads, which travel at speeds and altitudes comparable to those of satellites, such defense systems also have inherent ASAT capabilities. In fact, while the technologies being developed for long-
range missile defenses might not prove very effective against ballistic missiles—for example, because of the countermeasure problems associated with midcourse missile defense— they could be far more effective against satellites. This capacity is not just theoretical.
In 2007, China demonstrated a direct-ascent anti-satellite capability which could be used both in an ASAT and missile defense role, and in 2009, the United States used a ship-based missile defense interceptor to destroy a satellite, as well. US plans indicated a
projected inventory of missile defense interceptors with capability to reach all low earth orbiting satellites in the dozens in the 2020s, and in the hundreds by 2030.22 Discrimination The consequences of interfering with a satellite may be vastly different depending
on who is affected and how, and whether the satellite represents a legitimate military objective. However, it will not always be clear who the owners and operators of a satellite are, and users of a satellite’s services may be numerous and not public. Registration of
satellites is incomplete23 and current ownership is not necessarily updated in a readily available repository. The identification of a satellite as military or civilian may be deliberately obscured. Or its value as a military asset may change over time; for example, the
share of capacity of a commercial satellite used by military customers may wax and wane. A potential adversary’s satellite may have different or additional missions that are more vital to that adversary than an outsider may perceive. An ASAT attack that creates
persistent debris could result in significant collateral damage to a wide range of other actors; unlike terrestrial attacks, these consequences are not limited geographically, and could harm other users unpredictably. In 2015, the Pentagon’s annual wargame, or
simulated conflict, involving space assets focused on a future regional conflict. The official report out24 warned that it was hard to keep the conflict contained geographically when using anti-satellite weapons: As the wargame unfolded, a regional crisis quickly
escalated, partly because of the interconnectedness of a multi-domain fight involving a capable adversary. The wargame participants emphasized the challenges in containing horizontal escalation once space control capabilities are employed to achieve limited
national objectives. Lack of shared understanding of consequences/proportionality States have fairly similar understandings of the implications of military actions on the ground, in the air, and at sea, built over decades of experience. The United States and the Soviet
Union/Russia have built some shared understanding of each other’s strategic thinking on nuclear weapons, though this is less true for other states with nuclear weapons. But in the context of nuclear weapons, there is an arguable understanding about the crisis
escalation based on the type of weapon (strategic or tactical) and the target (counterforce—against other nuclear targets, or countervalue—against civilian targets). Because of a lack of experience in hostilities that target space-based capabilities, it is not entirely clear
what the proper response to a space activity is and where the escalation thresholds or “red lines” lie. Exacerbating this is the asymmetry in space investments; not all actors will assign the same value to a given target or same escalatory nature to different weapons.
For example, the United States is the country most heavily dependent on military space assets. Its proportionally higher commitment to expeditionary forces make this likely to be true well into the future. So while the United States seeks to create a deterrence
framework, punishment-based deterrence would not likely target its adversary’s space assets. But then there is difficulty finding targets on the ground that would be credible but also not unpredictably escalate a crisis. If an American military satellite were attacked
but without attendant human casualties (“satellites have no mothers”), retaliation on an adversary’s ground-based target is likely to escalate the conflict, perhaps justifying the adversary’s subsequent claim to self-defense, even if the initial satellite attack didn’t support such a claim. Little experience in engaging substantively in these issues Related to this issue is that there is relatively little experience
among the major space actors in handling a crisis with the others. The United States and the Soviet Union, then
Russia, have had a long history of strategic discussions and negotiations. This built up a shared
understanding of each other’s point of view, developed relationships between those conducting those
discussions, and created bureaucracies and expertise to support those discussions. This experience and
these relationships are important to interpreting events and to resolving disputes before they turn
into a crisis, and to managing [a crisis] once it begins. There is nothing like this level of
engagement around space issues between these two states, and much less between the US and China. One of
the participants in a 2010 US space war game, a diplomatic veteran, imagined25 how things would play out if one or more militarily important US
satellites failed amidst a crisis with an adversary known to have sophisticated offensive cyber and space capabilities: The good news is that there has
never been a destructive conflict waged in either the space or cyber domains. The bad news is that no one around the situation
room table can cite any history from previous wars, or common bilateral understandings with the
adversary, relating to space and cyber conflict as a guide to what the incoming reports mean, and what may or
may not happen next. This is the big difference between the space-cyber domains, and the nuclear domain. There is, in this future scenario,
no credible basis for anyone around the president to attribute restraint to the adversary, no track record from which to

interpret the actions by the adversary. There is no crisis management history : the
president has no bilateral understandings or guidelines from past diplomatic discussions, and no
operational protocols from previous incidents where space and cyber moves and counter-
moves created precedents. Perhaps the adversary intended to make a point with one series of limited
attacks, and hoped for talks with Washington and a compromise; but for all the president knows, sitting in
the situation room, the hostile actions taken against America’s space assets and information systems are
nothing less than early stages of an all-out assault on US interests. Where to start? How to prioritize efforts Using
this lens, what does this say about where efforts around space security should be focused? Start a substantive, high-level arms control discussion
Starting a credible high-level discussion will require countries to identify key domestic stakeholders, assemble teams of experts on relevant issues, and
develop detailed policy positions. The resulting informed dialogue will increase understanding between
countries, identify important areas of agreement and disagreement, clarify intentions, and establish better
channels of communication.

Coop on SSA data practices provides unique insight on each other’s institutional cultures that resolves grave mistrust and realigns both countries’ incentives towards de-escalation
Johnson-Freese 15 [Joan, Professor of National Security Affairs at the U.S. Naval War
College “China’s Space & Counterspace Programs” 2-18-15
https://www.uscc.gov/sites/default/files/Johnson%20Freese_Testimony.pdf]

Congressman Wolf’s perspective assumes that working with the United States would give China opportunities in
terms of surreptitiously obtaining U.S. technology otherwise unavailable to it. But we live in a globalized
world. Attempting to isolate Chinese space activities has proved futile, and in fact pushed China
and other countries into developing indigenous space industries — totally beyond any U.S. control —
that they might not have done otherwise, and arguably they reap more political and prestige benefits
from doing so than if they had gotten the same technology from partnering with the U.S. The
only outcome of the past two decades of strict export control there is hard data on is the damage to
the U.S. commercial space sector. Second, Wolf’s rationale assumes the United States has nothing to gain by working with
the Chinese. On the contrary, the United States could learn about how they work — their decision-making
processes, institutional policies and standard operating procedures. This is
valuable information in accurately deciphering the intended use of dual-use space technology,
long a weakness and so a vulnerability in U.S. analysis. Working together on an actual project where
people confront and solve problems together, perhaps a space science or space debris project where both
parties can contribute something of value, builds trust on both sides, trust that is currently
severely lacking. It also allows each side to understand the other’s cultural proclivities,
reasoning and institutional constraints with minimal risk of technology sharing.
Perhaps most importantly, cooperation would politically empower Chinese individuals and institutions who are
stakeholders in Chinese space policy to be more favorably inclined toward the United States. A
cooperative civil and commercial relationship creates interests that could inhibit aggressive or
reckless behavior, as opposed to Chinese space policy being untethered to any obligations, interest
or benefits it might obtain through cooperation with the United States. The National Academies of Science (NAS) 2014 report titled Pathways to
Exploration: Rationales and Approaches for A U.S. Program of Space Exploration, includes a specific recommendation that it is in U.S. interests to work with China.39 NAS has also successfully completed the first
Forum for New Leaders in Space Science with the Chinese Academy of Science in 2014. It brought together 16 early career space scientists from China and the US to meet over two workshops where they shared
research results and discussed future research opportunities. A second forum is being planned. Wolf further stated that the United States should not work with China based on moral grounds. While clearly the
United States would prefer not to work with authoritarian and/or communist regimes, it has done so in war and in peacetime when it has served American interests, and continues to do so today. That is the basis
of realism: Serve American interests first. While the United States would prefer not to work with Stalin, we continue to work with Putin when it benefits us to do so. Were the U.S. not to work with authoritarian
regimes, it would have few regimes to work with at all in the Middle East. The U.S. supported Saddam Hussein’s regime in the Iran-Iraq War. Chinese politicians are interested in the ISS for symbolic
reasons, specifically, being accepted as part of the international family of spacefaring nations as a sign of regime legitimacy. But it is unrealistic to expect that withholding
U.S. cooperation on space issues can influence regime change in China. A similar approach was considered
with the Soviet Union, and it failed. Further, in terms of the U.S. doing China a favor by working with it, perhaps ironically many Chinese space professionals fear that cooperation with the United States would
just slow them down. American politics are viewed as fickle and without the will to see programs to completion. This view is reflected in changing European views regarding space leadership. A 2013 piece in Germany’s Der Spiegel suggested that Europe is thinking of
redirecting its primary space alliance from the United States to China, due to China’s “rising power” status in space. The question of whether China is challenging U.S. leadership in space has received considerable media attention in the form of a U.S. – China “space
race,” prompted largely by perceptions of declining U.S. space leadership. The U.S. civil space program is not dying, military space activities continue to expand, and no country is doing anything in space that has not already been done by the United States. But
having started with such a spectacular accomplishment as the Apollo Program, it has been difficult to maintain the public enthusiasm required to fund further space spectaculars, such as a human spaceflight mission to Mars. Although not completely unsupportive,
the U.S. public treats the space program as expendable to other government programs. The reality is that space, as with other areas of international relations, will likely be a multipolar environment in the future.42 America’s unipolar moment is over, and as long as it
is reluctant to work with rising partners such as China, the perception of its space leadership will continue to decline as well. That is not to say that the United States will not continue to lead in some areas of space activity. If only by virtue of a heftier budget, the
United States will be able to lead in select areas. But the days of total leadership are over. It will be a tough pill to swallow for those who crave exceptionalism— but if we are unwilling to pay the price tag, then swallow it, we must.43 In that respect, China has not

“usurped” the perception of U.S. space leadership; it is being ceded to them. This rebuttal to Congressman Wolf’s views assumes that the United States has a choice regarding whether or not to work with China. If, however,
sustainability of the space environment upon which the U.S. generally and the U.S. military specifically rely for
advantages is to be maintained, the space debris issue alone requires that the U.S. not exclude

diplomacy as a policy option. While missile defense/ASAT testing has been conducted in ways to minimize debris issues since 2007,
the potential threat to the space environment in non-test circumstances has become clear. If there was any upside to the 2007 Chinese test, it
was the frightening realization by all countries of the fragility of the space environment. With regard to China specifically, since this 2007 test China has done nothing further in space that can be considered irresponsible or outside the norms set by the United States. Mankind’s dependence on space assets thereby makes it in the best interests of all spacefaring nations to cooperate to maintain that environment. China was scheduled to host an
international meeting of the Inter-Agency Space Debris Coordinating Committee (IADC) only days after its 2007 ASAT test that significantly worsened space debris, resulting in China cancelling the meeting out of
embarrassment. There is a certain (understandable) glee in the U.S. military, which has the most sophisticated government space tracking abilities, at being able to warn China of potential collisions between its own space junk and its own satellites.44 More recent constructive Chinese involvement with the IADC indicates recognition of need to sustain the space environment and cooperate on relevant issues, particularly the space debris issue.45 These are the type of “common ground” issues that provide opportunities to work with all spacefaring nations to protect the “congested, contested and competitive” space environment. U.S.
emphasis on counterspace is often presented as in response to actions and intentions of other countries, specifically China, presumably recent. Increasingly, however, it seems speculation about Chinese intentions is based on material not publicly shared, making the feasibility of both the speculation and appropriate U.S. responses difficult to assess. For example, to my knowledge China has done nothing since its admittedly irresponsible 2007 ASAT test that goes beyond what the U.S. considers international norms of responsible behavior. Pursuing efforts to enhance transparency, confidence-building measures toward identifying “common ground among all space-faring nations,” and resiliency for military systems (NSSS, p.8) all must be pursued with the same energy and commitment as counterspace operations. Otherwise, just as efforts to isolate Chinese space activities have backfired on the U.S. in areas such as export control, the unintended consequences of a principally “deter, defend, defeat” strategy could trigger an arms race that puts the sustainability of the space environment at significant risk, to the detriment of U.S. national security. With regard to resilience, specifically the purview of the Department of Defense (DOD) and Office of the Director of National Intelligence (ODNI), resilience has faced
resistance from elements within as being too expensive or, as with space arms control, just too difficult.46 The Air Force appears to be taking the time honored approach of studying the problem rather than acting
on it. Center for Strategic and Budgetary Assessments analyst Todd Harrison characterized part of the problem as a lack of interest on the part of Pentagon leaders. He stated, “While everyone recognizes space as a
critical enabler for the war fighter at all levels of conflict, from low to high end, it is not the sexy weapon system that puts hot metal on a target. So it doesn’t attract much interest from senior leaders.”47
Counterspace, however, offers that sexy option. Regarding transparency, the need to share information about satellite locations was recognized by the private satellite owners and operators, promoting the formation of the Space Data Association. At the government level, Space Situational Awareness (SSA) efforts have largely been to “formalize the existing model of one-way data flow from the American military to other countries and satellite operators” and the U.S. signing bi-lateral agreements with France49 and Japan, and the U.S., United Kingdom (U.K.), Canada and Australia signing a limited agreement in 2014.50 While U.S. efforts to provide collision-avoidance information to other countries – including China – are admirable, as an increasing number of countries place an increasing number of satellites in orbit, improving current techniques and increasing collaboration and cooperation on exchanges of information must be aggressively pursued.
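The collision-avoidance warnings described in the card reduce, at their core, to predicting close approaches between tracked objects. As a rough illustration only (not from the source; the state vectors below are invented, and operational conjunction screening propagates full orbits with covariance data rather than assuming straight-line motion), a linearized closest-approach check looks like:

```python
import math

def closest_approach(r1, v1, r2, v2):
    """Estimate time and miss distance of closest approach between two
    space objects, assuming straight-line relative motion over a short
    screening window (a deliberate simplification of real SSA screening)."""
    dr = [b - a for a, b in zip(r1, r2)]   # relative position, km
    dv = [b - a for a, b in zip(v1, v2)]   # relative velocity, km/s
    dv2 = sum(c * c for c in dv)
    # Minimize |dr + t*dv|  ->  t* = -(dr . dv) / (dv . dv)
    t = max(0.0, -sum(p * q for p, q in zip(dr, dv)) / dv2)
    miss = math.sqrt(sum((p + t * q) ** 2 for p, q in zip(dr, dv)))
    return t, miss

# Hypothetical state vectors for two objects on crossing tracks
t, d = closest_approach([7000, 0, 0], [0, 7.5, 0],
                        [7000, 75, 1], [0, -7.5, 0])
print(f"closest approach in {t:.1f} s at miss distance {d:.1f} km")  # 5.0 s, 1.0 km
```

Exchanging exactly this kind of ephemeris-derived prediction, in both directions, is what a bilateral SSA data framework would formalize.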
Aff is best middle ground – China says yes but doesn’t link to tech
transfer
Wang 19 [Kent Wang is a research fellow at the Institute for Taiwan-America Studies. 3-4-
2019 https://www.asiatimes.com/2019/03/opinion/why-space-exploration-should-be-a-joint-
effort/]
China, which until now has developed its ambitious space program completely on its own, seems more
open to international cooperation, especially with the United States. I recently met a
group of Chinese scientists on a visit to the National Air and Space Museum in Washington, DC. I
asked them what they made of their country’s advances in space and their thoughts for the
future. They all spoke of their desire to see a fellow citizen on the moon and said they would like to
work cooperatively with the United States in space exploration. They also said they believed that
the more the US and China cooperate in space, the less likely it is that they will confront each
other militarily.
I am not suggesting that the US should share all its technology with China. However, the truth is that
the relations between those two countries are nowhere near as strained as those between the US
and the Soviet Union in the mid-1970s, so why not work toward joint missions? At the very least,
both nations could develop a capability to monitor climate change, space weather, threats from
asteroids, and other useful initiatives. Such collaboration has facilitated the development of new
technologies, created commercial opportunities, identified opportunities for shared
missions, and inspired younger generations to undertake the challenge of space
exploration.
International cooperation is a great thing. It can be a useful foreign-policy tool as well as improve
the space capabilities of the United States and its partners. A joint venture approach to manned
space programs and deep space exploration has the potential to yield diplomatic, political and
military benefits that vastly exceed in magnitude the cost of implementation.
Plan
The United States federal government should significantly increase data exchange and administration of basic orbital data on nonsensitive space assets with the People’s Republic of China.
Space Engagement – 1AC

SSA coop with China spills over to other space engagement and space
weather
Weeden 15 - Brian Weeden is the Technical Advisor for Secure World Foundation and a
former U.S. Air Force Officer with sixteen years of professional experience in space operations
and policy (“An Opportunity to Use the Space Domain to Strengthen the U.S.-China
Relationship” 9/9, http://nbr.org/research/activity.aspx?id=602
The U.S.-China relationship in space has the potential to be a stable foundation for a stronger
overall relationship between the two countries. Space was arguably a stabilizing element in the
relationship between the United States and Soviet Union during the Cold War by providing national
capabilities to reduce tensions and an outlet for collaboration. Although the future of the U.S.-China relationship
will be characterized by both competition and cooperation, taking concrete steps to stabilize relations in space can be
part of the solution to avoiding the “Thucydides trap,” where an established power’s fear of a rising
power leads to conflict.
The Role of Space in the U.S.-China Relationship

Space is a critical domain to the security of the United States. Space capabilities enable secure, hardened communications with nuclear forces, enable
the verification and monitoring of arms control treaties, and provide valuable intelligence. Such capabilities are the foundation of the United States’
ability to defend its borders, project power to protect its allies and interests overseas, and defeat adversaries. The space domain, however, is currently
experiencing significant changes that could affect the United States’ ability to maintain all these benefits in the future. A growing number of state and
nonstate actors are involved in space, resulting in more than 1,200 active satellites in orbit and thousands more planned in the near future. Active
satellites coexist in space along with hundreds of thousands of dead satellites, spent rocket stages, and other pieces of debris that are a legacy of six
decades of space activities. As a result, the most useful and densely populated orbits are experiencing significant increases in physical and
electromagnetic congestion and interference.

Amid this change, China is rapidly developing its capabilities across the entire spectrum of space activities. It has a robust and successful human
spaceflight and exploration program that in many ways mirrors NASA’s successes in the 1960s and 1970s and is a similar source of national pride.
Although it still has a long way to go, China is developing a range of space capabilities focused on national security that one day might be second only to
those of the United States. Some
of China’s new capabilities have created significant concern within the
U.S. national security community, as they are aimed at countering or threatening the space
capabilities of the United States and other countries.
The massive changes in the space domain and China’s growing capabilities have affected the U.S.-China relationship in space. There is growing mistrust between the two countries, fueled in part by their development and testing of
dual-use technologies such as rendezvous and proximity operations and hypervelocity kinetic
kill systems. This mistrust is compounded by a misalignment in political and strategic priorities: China is focused on developing and increasing
its capabilities in the space domain, whereas the United States is focused on maintaining and assuring access to its space capabilities.

Recommendations for Managing Tensions and Promoting Positive Engagement

Despite these challenges and concerns, there are concrete steps that the United States and China can take to manage
tensions and possibly even work toward positive engagement. In 2011, President Barack Obama and then Chinese president
Hu Jintao issued a joint statement on strengthening U.S.-China relations during a visit by President Hu to the White House. As one of the steps
outlined in the statement, the two presidents agreed to take specific actions to deepen dialogue and exchanges in the field of space and discuss
opportunities for practical future cooperation.

President Xi Jinping’s upcoming visit presents an opportunity to build on the 2011 agreement and take steps toward these goals. The first step should
be to have a substantive discussion on space security. President Obama should clearly communicate the importance that the United States places on
assured access to space, U.S. concerns with recent Chinese counterspace testing, and the potential negative consequences of any aggressive acts in
space. Both countries should exchange views on space policies, including their interpretations of how self-defense applies to satellites and hostile
actions in space. Doing so can help avoid misunderstandings and misperceptions that could lead either country to unwittingly take actions that escalate
a crisis.

Second, Presidents Obama and Xi should discuss specific ideas for cooperation in civil and scientific space activities and the use of space for peaceful applications on earth. Continuing to exclude China
from civil space cooperation will not prevent it from developing its own capabilities; this
approach will only ensure that China cooperates with other countries in space in a way that advances its own
national interests and goals. Space weather, scientific research, exploration, capacity building for disaster response, and global environmental monitoring are all areas where the United
States and China share joint interests and could collaborate with each other and other interested countries to help
establish broader relationships outside the military realm.

In addition, the United States should take steps on its own to stabilize the relationship. First and foremost, it should get serious about making U.S.
space capabilities more resilient. Increasing resilience would support deterrence by decreasing the benefits an adversary might hope to achieve and also
help ensure that critical capabilities can survive should deterrence fail. While resilience has been a talking point for the last few years, the United States
has made little progress toward achieving the goal. Radical change is thus needed in how Washington develops and organizes national security space
capabilities. Moreover, the United States should embrace commercial services to diversify and augment governmental capabilities, while encouraging
allies to develop their own space capabilities.

Second, the United States should continue to bolster the transparency of space activities by increasing the
amount of space situational awareness (SSA) data available to satellite operators and the public. Greater transparency reinforces
ongoing U.S. and international initiatives to promote responsible behavior in space and also helps mitigate the
possibility for accidents or naturally caused events to spark or escalate tensions. Shifting responsibility for space safety to a civil agency that can share
and cooperate more easily with the international community and working with the international community to develop more publicly available sources
of SSA data outside the U.S. government are two steps that would enhance trust, improve data reliability, and reinforce norms of behavior.

The consequences of not addressing the current strategic instability in space are real. A future conflict in space between the United States and China
would have devastating impacts on everyone who uses and relies on space. Both the United States and China have acknowledged the dangers of
outright conflict and have pledged their interest in avoiding such an outcome. Taken together, the
initial steps outlined here could
help stabilize the U.S.-China strategic relationship in space, mitigate the threat of the worst-case
scenario, and work toward a more positive outcome that benefits all.

There’s robust US data disclosure in the squo but only bilateral SSA
framework spills over to broader space engagement with China
Sankaran 14 [Jaganath Sankaran, postdoctoral fellow at the Belfer Center for Science and
International Affairs at Harvard’s Kennedy School of Government and was previously a Stanton Nuclear
Security Fellow at the RAND Corporation. Sankaran received his doctorate in international security from
the Maryland School of Public Policy, Winter 2014 “Limits of the Chinese Antisatellite Threat to the
United States” JSTOR]

The United States should also study and improve its ability to use measures like satellite sensor
shielding and collision avoidance maneuvers for satellites. These would dilute an adversary’s
ASAT operation and increase the apparent uncertainty of the consequences of an ASAT
attack.46 Monitoring mechanisms—both technical and nontechnical—that provide long warning
times and the ability to definitively identify an attacker in real time should also be a priority. The
US Air Force has started to invest in such capabilities on a small scale. Gen William Shelton,
head of Air Force Space Command, announced on 21 February 2014 the upcoming launch of the
geosynchronous space situational awareness (SSA) system designed to “have a clear,
unobstructed and distinct vantage point for viewing resident space objects.”47 Such systems will
help in attributing an ASAT attack. Similarly, the ground-based Rapid Attack, Identification,
Detection, and Reporting System (RAIDRS) is a valuable US asset to identify, characterize, and
geolocate attacks against US satellites.48

However, these unilateral measures offer no direct positive inducement for the Chinese
decision maker to desist from taking an aggressive posture on space security. Such inducements
will require more cooperative ventures that integrate China more deeply into the global
space community. The United States could, for example, make available its data on satellite
traffic and collisions, which would help China streamline its space operations. Such gestures
demonstrate a modicum of goodwill which can encourage further cooperation. The United
States has already put in place policy actions to share SSA data with allies. The latest guidance
document on US space policy, the National Security Space Strategy released in 2011 by the
Office of the Secretary of Defense and the Office of the Director of National Intelligence, states
that “the United States is the leader in space situation awareness (SSA) and can use its
knowledge to foster cooperative SSA relationships, support safe space operations, and protect
US and allied space capabilities and operations.”49 However, the United States has been more
forthcoming and willing to ink data-sharing arrangements with allies than with China. The US
Strategic Command (USSTRATCOM) has signed SSA data agreements with Japan, Australia,
the UK, Italy, Canada, and France.50 Although there may be security reasons behind this
preference to engage primarily with allies, it is important to realize that China is the nation that
most needs to be induced to contribute to the peaceful development of space operations. The
United States should use all available diplomatic leverage to partner with China and share
SSA data to make it a part of the global space community.

China’s satellite data is uniquely key to advanced space weather forecasting
Aghajanian 12 [Liana Aghajanian, journalist, citing Dr. Rainer Schwenn, one of the
developers of KuaFu; Dr. William Liu, a senior scientist at the Canadian Space Agency; the 2008
National Academy of Sciences Report; May 14, 2012. “Cloudy With a Chance of Catastrophe:
Predicting the Weather in Space.” http://mentalfloss.com/article/30665/cloudy-chance-
catastrophe-predicting-weather-space]
In 1859, while observing sunspots, a young astronomer named Richard Carrington recorded a geomagnetic storm so powerful,
the electrical currents it sent to Earth were enough to keep the newly invented telegraph operating without a battery. Centuries later, though humans have sent robots to Mars and even strong-armed a couple engineers into walking on the moon, the science of space weather, the changing environmental conditions in near-Earth space, has
largely managed to elude us. In fact even the term “space weather” is new; it wasn’t used regularly until the 1990s.
Now, an international project led by China is hoping to advance the study of space weather by light-
years in order to minimize the dangerous impact a storm in space might have on us fragile
Earthlings. If experts are correct, there's a chance that a serious space weather threat will arrive sooner rather than later – and the risk to humans is greater than you think. Oddly, the
trouble is that we’ve become too advanced. Because humans today are so dependent upon modern
electrical technology, a space storm the size of the one Carrington recorded in 1859 could
cause catastrophic problems if it occurred tomorrow. According to a 2008 National Academy of
Sciences Report, from long-term electrical blackouts to damage to communication satellites and
GPS systems (not to mention billions in financial losses), the results could be devastating
worldwide. Luckily, scientists are hopeful the KuaFu project will prevent (or at least minimize the impact of) this kind of disaster. Our Eyes on the Sun, The Sun in Our Eyes. Named for Kua Fu, a sun-chasing
giant from a Chinese folktale whose pursuit to tame the brightest star in our solar system ended after he died of thirst, the
KuaFu project will create a space weather forecasting system 1.5 million kilometers from the
Earth's surface. The goal is similar to the one from the legend: to observe changes in solar-terrestrial storms, investigate flows of energy and solar material, and improve the forecasting of space weather. Not
necessarily to tame the sun, but, at least, to understand it. Proposed in 2003 by scientist Chuanyi Tu from the Chinese Academy
of Sciences, the project will place three separate satellites at strategic points in our solar system to observe the inner workings of
space weather. China's National Space Administration along with the European and Canadian Space Agencies will work together
to man them. “Being aware of the impending blindness to space weather and its effects, we
consider a mission like KuaFu absolutely mandatory,” said Dr. Rainer Schwenn, one of the developers of KuaFu. “If 'space weather' keeps being considered an important science goal, then KuaFu is a real key project.” The satellites will offer an unprecedented ability to glean information about the
often tumultuous relationship between the sun and Earth, by allowing scientists to observe both the star
and its effects on the planet simultaneously. To now, this process has been viewable only via computer
simulation. “You have to look at the two systems simultaneously [to most accurately forecast space weather]” said Dr. William
Liu, a senior scientist at the Canadian Space Agency who took over as project leader when Chuanyi Tu retired two years ago. “It's
a real observation; it's what's actually happening.” Space Storm Showdown: What Do We Do? So, if
the power-grid frying, billion-dollar damage-wreaking storm is inevitable, how much will forecasting it actually help? Lots. According to Liu, predicting space weather activity can give the operators who
maneuver satellites in space the information they need to protect them and us from harm. For
example: If companies know a storm is approaching, it gives them a chance to tweak their loads
before their systems descend into chaos and shut off power for, say, the entire East Coast of the
United States. “That's how you prevent catastrophe,” Liu explained. “You reduce the load on the parts that
are more sensitive.” While the project was originally scheduled to be completed this year, Liu’s current estimates put its debut at
2016. Despite the delays, he remains optimistic it will come to fruition, pointing out that international collaborations like this one
often stir up scientific and financial challenges that delay the launch process. Whether the KuaFu project will be able to predict
space weather accurately all of the time is up for debate. Liu, however, is confident that, at the very least, it's a step toward that
direction. “With this launch and operation, we'll make our predictions better. Whether it will be 100 percent, that will be too
much to ask, but it will definitely improve our knowledge.”

Lack of space weather data causes global catastrophe


Weiss 19 – Matthew Weiss, American Jewish University, and Martin Weiss, UCLA-Olive
View Medical Center, “An assessment of threats to the American power grid” Energy,
Sustainability and Society December 2019, 9:18, 29 May 2019,
https://link.springer.com/article/10.1186/s13705-019-0199-y
Introduction In testimony before a Congressional Committee, it has been asserted that a
prolonged collapse of this
nation’s electrical grid—through starvation, disease, and societal collapse—could result in the
death of up to 90% of the American population [1]. There is no published model disclosing how these
numbers were arrived at, nor are we able to validate a primary source for this claim. Testimony given by the Chairman of the
Congressional EMP Commission, while expressing similar concerns, gave no estimate of the deaths that would accrue from a
prolonged nationwide grid collapse [2]. The
power grid is posited to be vulnerable to geomagnetic storms
generated by solar activity, electromagnetic pulses (EMP, also referred to as HEMP) produced by high altitude nuclear
detonations, cyberattack, and kinetic (physical) attack. Evidence for and against the validity of each of these threats follows below.
Much of the knowledge on these matters is classified. The studies for and against EMP, other than for limited testing of a few
components of the infrastructure by the EMP commission, are based not on physical demonstrations but mathematical models and
simulations. Moreover, the underlying physics and technology involved—the electrical engineering and materials science—is likely
beyond the understanding of the reader, and certainly beyond that of these writers. With these limitations in mind, we proceed. The electrical grid. Geomagnetic storms are due to coronal mass ejections (CMEs)—massive eruptions of plasma expelled
from the sun’s corona. Plasma is the fourth fundamental state of matter, consisting of free electrons and positively charged ions. The
sun, like all stars, is plasma. HV (high voltage) transformers—transmitting voltages of greater than
100 kV—are what make it possible to send electricity over great distances to thousands of substations, where smaller transformers reduce the voltage. HV transformers are the weak link in the system, and the Federal Energy Regulatory Commission (FERC) has identified 30 of these as being critical. The simultaneous loss of just 9, in various combinations, could cripple [destroy] the network and lead to a cascading failure, resulting in a “coast-to-coast blackout” [3]. If the HV transformers are irreparably damaged it is problematic whether they can be replaced. The great majority of these units are custom built. The lead time between order and delivery for a domestically manufactured HV transformer is between 12 and 24 months [4], and this is under benign, low demand conditions. The first practical application of the transformer was invented in
the USA by William Stanley, but largely as a consequence of American trade policy (“It doesn’t make any difference whether a
country makes potato chips or computer chips”- attributed to Michael Boskin, Chairman of President George H W Bush’s Council of
Economic Advisors) [5] there is little manufacturing capability remaining in the USA. Worldwide production is less than 100 per
year and serves the rapidly growing markets of China and India. Only Germany and South Korea produce for export. Ordered today, delivery of a unit from overseas (responsible for 85% of current American purchasing) would take nearly
3 years [6]. The factory price for an HV transformer can be in excess of $10 million—too expensive to
maintain an inventory solely as spares for emergency replacement.
Potential mechanisms of collapse: geomagnetic storms. Coronal mass ejections often occur
with solar flares, but each
can also take place in the absence of the other. The latter emits radiation in all bands of the electromagnetic
spectrum (e.g., white light, ultraviolet light, X-rays, and gamma rays) and unlike CMEs, affect little more than radio
communications. CME’s take several days to reach the Earth. The radiation generated by solar flares on the other hand arrives in
8 min. Coronal mass ejections carry
an intense magnetic field. If a storm enters the earth’s
magnetosphere, it causes rapid changes in the configuration of the earth’s magnetic field. Electric
current is generated in the magnetosphere and ionosphere, generating electromagnetic fields at ground level. The movement of
magnetic fields around a conductor, i.e., a wire or pipe, induces an electric current. The longer the wire, the greater the
amplification. The current induced is akin to DC (direct current), which the electrical system poorly tolerates. Our grid is based on
AC. The
excess current can cause voltage collapse, or worse, cause permanent damage to large
transformers. The current flowing through HV transformers during a geomagnetic disturbance can be estimated using
storm simulation and transmission grid data [7]. From these results, transformer vulnerability to internal heating can be assessed.
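The induction mechanism the card describes (a geoelectric field integrated along a long conductor drives quasi-DC current through the line/transformer/ground loop) can be sketched numerically. This is illustrative only: the 200 km line length and 10-ohm loop resistance below are invented, and the 8 V/km field is the NERC benchmark value discussed later in the card; real GIC studies use full network models and ground-conductivity data.

```python
def induced_gic_amps(e_field_v_per_km, line_length_km, loop_resistance_ohms):
    """Rough quasi-DC estimate of geomagnetically induced current (GIC):
    the storm's geoelectric field, integrated over the line length, acts
    as a DC voltage source driving current through the loop resistance."""
    driving_voltage = e_field_v_per_km * line_length_km  # volts
    return driving_voltage / loop_resistance_ohms        # amps

# NERC's 8 V/km benchmark field over a hypothetical 200 km line with an
# assumed 10-ohm loop resistance:
print(induced_gic_amps(8.0, 200.0, 10.0))  # -> 160.0 amps quasi-DC
```

Even tens of amps of quasi-DC bias can drive an HV transformer core into half-cycle saturation, which is the internal-heating pathway the storm simulations assess.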
The largest recorded geomagnetic storm occurred Sept 1–2, 1859—the Carrington event, named after the English amateur
astronomer, Richard Carrington. Auroras were seen as far south as the Caribbean. Campers in the Rocky Mountains were awakened
shortly after midnight by “an auroral light so bright that one could easily read common print. Some of the party insisted it was
daylight and began preparation for breakfast” [8]. Telegraph wires transmitted electric shocks to operators and ignited fires. In May
1921, there was another great geomagnetic disturbance (GMD), the railroad storm. The National Academy of Sciences estimates that
if that storm occurred today, it could cause 1–2 trillion dollars damage and full recovery could take 4–10 years [9]. The basis for this assertion is a presentation made by J Kappenman of Metatech, the Goleta California
engineering consulting firm, given as part of the NAS Space weather workshop titled “Future Solutions, Vulnerabilities and Risks”,
on May 23, 2008. The simulation asserts that a 1921 intensity storm could damage or destroy over
300 transformers in the US, and leave 130 million people without power [10]. Elsewhere,
Kappenman states that in a worst case situation, geomagnetic disturbances could instantly create loss of over
70% of the nation’s electrical service [11]. In March 1989, a geomagnetic storm caused collapse of the power
grid in Quebec, leaving 6 million without power for 9 h. NERC (the North American Electric Reliability Council), a self-regulated
trade organization formed by the electric utility industry, asserts that the blackout was not due to overheating of transformers from
geomagnetically induced current, but to the near-simultaneous tripping of seven relays, and this is correct [12]. The rapid voltage
collapse (within 93 s) likely prevented transformer thermal damage. The same storm, however, destroyed a major transformer at the
Salem nuclear plant in New Jersey [13]. The 1989 Hydro-Quebec storm was 1/10th the intensity of the 1921 Railroad Storm [14]. A
report for Lloyd’s in 2013 states a Carrington-level extreme geomagnetic storm is almost inevitable in
the future. Using its own models and simulations, it puts the US population at risk at between 20 and 40 million, with the
outages lasting up to 1–2 years [15]. Because of geography and ground conductivity, the risk of a
transformer sustaining damage is 1000 times greater in some US counties than in others. The highest risk is
to the counties along the corridor between Washington DC and New York [16]. The first written account of a
solar storm is possibly in the book of Joshua. Written reports of aural sightings by Greeks and Romans begin in 371 BC. A
Carrington-level storm narrowly missed the earth in 2012 [17]. NASA has produced a video on the
CME [18]. Formerly considered a 1 in 100-year event, the likelihood of a Carrington intensity storm striking the earth has most
recently been placed at 12% per decade [19].

Mitigation. The EMP Commission, in its 2008 report, found that it is not practical to try to protect the entire
electrical power system or even all high-value components. It called however for a plan designed to reduce recovery and
restoration times and minimize the net impact of an event [20]. This would be accomplished by “hardening” the grid, i.e., actions to
protect the nation’s electrical system from disruption and collapse, either natural or man-made [21]. The shielding is accomplished
through surge arrestors and similar devices [22]. The cost to harden the grid, from our tabulation of Congressional EMP figures, is
$3.8 billion. There has been no hardening of the grid. The commission and organization that are responsible for
public policy on grid protection are FERC and NERC. FERC (The Federal Energy Regulatory Commission) is an independent agency
within the Department of Energy. NERC, the self-regulatory agency formed by the electric utility industry, was renamed the North
American Electric Reliability Corporation in 2006. In June of 2007, FERC granted NERC the legal authority to enforce reliability
standards for the bulk power system in the USA. FERC cannot mandate any standards. FERC only has the authority to ask NERC to
propose standards for protecting the grid. NERC's position on GMD is that the threat is exaggerated. A report by NERC in 2012 asserts that geomagnetic storms will not cause widespread destruction of transformers, but only a short-term (temporary) grid instability [23]. The NERC report did not use a model that was validated against past storms, and their work was not peer-reviewed. The NERC report has been criticized by members of the Congressional EMP commission. Dr. Peter Pry asserts that the final draft was "written in secret by a small group of NERC employees and electric utility insiders… The report relied on meetings of industry employees in lieu of data collection or event investigation" [22]. NERC, in turn, criticizes Kappenman's work. NERC states that the Metatech work cannot be independently confirmed [24]. NERC reliability manager Mark Lauby criticized the report for being based on proprietary code [24].
Kappenman's report, however, received no negative comments in peer review [24].

The NERC standards
The reliability standards and operational procedures established by NERC, and approved by FERC, are disputed [25]. Among the points are these:

1. The standards against GMD do not include Carrington-class storm levels. The NERC standards were arrived at by studying only the storms of the immediate prior 30 years, the largest of which was the Quebec storm. The GMD "benchmark event", i.e., the strongest storm which the system is expected to withstand, is set by NERC as 8 V/km [26]. NERC asserts this figure defines the upper-limit intensity of a 1 in 100-year storm [26]. The Los Alamos National Laboratory, however, puts the intensity of a Carrington-type event at a median of 13.6 V/km, ranging up to 16.6 V/km [27]. Another analysis finds the intensity of a 100-year storm could be higher than 21 V/km [28].

2. The 15–45 min warning time of a geomagnetic storm provided by space satellites (ACE and DSCOVR) will be insufficient for operators to confer, coordinate, and execute actions to prevent grid damage and collapse.

Testimony of Edison Electric Institute official Scott Aaronson under questioning by Senator Ron
Johnson in a hearing before the Senate Homeland Security and Governmental Affairs Committee in 2016
encapsulates some of the issues. Video of the exchange is available on the web [29]. The Edison Electric Institute (EEI)
is the trade association that represents all US investor-owned electric companies. Johnson: Mr. Aaronson, I just have to ask you –
the protocol of warning 15–30 min – who is going to make that call? I mean, who is going to make that for a massive geomagnetic
disturbance, that nobody knows how many of these transformers are going to be affected. Who is going to make that call to shut
them off line – to take them off line – so those effects do not go through those wires and destroy those large transformers that
cannot be replaced? Aaronson: So, the grid operators are tightly aligned. We talked about the fact that there are 1900 entities that
make up the bulk electric system. There are transmission operators and so on… Johnson (interrupting): Who makes the call? Who
makes the call – we are going to shut them all down in 30 min, in 15 min? Aaronson: It’s not as simple as cutting the power. That’s
not how this is going to work but there is again, there is this shared responsibility among the sector. Johnson: Who makes the call?
Aaronson: I do not know the answer to that question [29]. Mr. Aaronson is Managing Director for Cyber and Infrastructure Security at EEI. Congressman Trent Franks, R-Az, introduced HR 2417, the SHIELD Act, on 6/18/2013. The bill would give FERC the
authority to require owners and operators of the bulk power system to take measures to protect the grid from GMD or EMP attack.
The costs would be recovered by raising regulated rates. Franks states he had been led to believe that his bill would be brought to the
House floor for a vote. But he states House Energy and Commerce Committee Chairman Fred Upton R, Mich., let it die in
committee. He has been unable to get an explanation from Upton [30]. Between 2011 and 2016, Mr. Upton has received $1,180,000
in campaign contributions from the electric utility industry [31]. The electric utility industry is heavily involved in campaign
donations. During the 2014 federal election cycle, the electric utility industry made $21.6 million in campaign contributions [32].
The electrical utility industry is particularly involved in state politics. For instance, in Florida, between 2004 and 2012 electric utility
companies donated $18 million into legislative and state political campaigns. In that state, the electric utilities employ one lobbyist
for every two legislators [33]. Electric utility revenue in 2015 was 391 billion dollars [34].
Electromagnetic pulse Of the scenarios that might lead to electrical network collapse, EMP has received the widest public attention. It has been the subject of television series,
films, and novels. HEMP (for high altitude) is the more accurate acronym, but as media and the public use EMP, we will use both interchangeably. The issue has become highly
politicized. The most prominent article in the media against EMP as a threat is by Patrick Disney, “The Campaign to Terrify You about EMP” published in the Atlantic in 2011.
“From Newt Gingrich to a Congressional ‘EMP Caucus’, some conservatives warn the electronic frying blast could pose gravely underestimated dangers on the U.S…..Ballistic
missile defense seems to be the panacea for this groups concern, though a generous dose of preemption and war on terror are often prescribed as well” [35]. As of 2009, Mr.
Disney was acting Policy Director for the National Iranian American Council (NIAC). NIAC has been accused of acting as a lobby for the Islamic Republic of Iran [36]. Mr.
Disney is quoted as stating his strategy, in advancing an Iranian interest, is to “create a media controversy” [36]. The campaign to discredit EMP has been largely successful. To a
very large part of the body politic EMP is identified as a cause limited to the far right. A high-altitude electromagnetic pulse (EMP) is produced when a nuclear device is
detonated above the atmosphere. No radiation, blast, or shock wave is felt on the ground, nor are there any adverse health effects, but electromagnetic fields reach the surface.
An EMP has three components, E1 through E3, defined by speed of the pulse. Each has specific characteristics, and specific potential effects on the grid. E1, the first and fastest
component, affects primarily microelectronics. E3, the later and slower component, affects devices attached to long conductive wires and cables, especially high-voltage
transformers. A single nuclear blast will generate an EMP encompassing half the continental USA [37]. Two or three explosions, over different areas, would blanket the entire
continental USA. The potential impact of an EMP is determined by the altitude of the nuclear detonation, the gamma yield of the device, the distance from the detonation point,
the strength and direction of the earth’s magnetic field at locations within the blast zone and the vulnerability of the infrastructures exposed. The E1 gamma signal is greatest for
bursts between 50 and 100 km altitude. E3 signals are optimized at bursts between 130 and 500 km altitude, much greater heights than for E1 [38]. Higher altitude widens the
area covered, but at the expense of field levels. The 1963 atmospheric test ban has prevented further testing. E1 and its effects The E1 pulse (“fast pulse”) is due to gamma
radiation (photons), generated by a nuclear detonation at high altitude, colliding with atoms in the upper atmosphere. The collisions cause electrons to be stripped from the
atoms, with the resultant flow of electrons traveling downward to earth at near the speed of light. The interaction of the electrons with the earth’s magnetic field turns the flow
into a transverse current that radiates forward as an intense electromagnetic wave. The field generates extremely high voltages and current in electrical conductors that can
exceed the voltage tolerance of many electronic devices. All this occurs within a few tens of nanoseconds. The Congressional EMP Commission postulated that E1 would have its
primary impact on microelectronics, especially SCADAs (Supervisory Control and Data Acquisition), DCSs (digital control systems), and PLCs (programmable logic controllers).
These are the small computers, numbering now in the millions, that allow for the unmanned operation of our infrastructure. To assess the vulnerability of SCADAs to EMP, and
therefore the vulnerability of our infrastructure, the EMP Commission funded a series of tests, exposing SCADA components to both radiated electric fields and injected voltages
on cables connected to the components. The intent was to observe the response of the equipment, when in an operational mode, to electromagnetic energy simulating an EMP.
“The bottom line observation at the end of the testing was that every system tested failed when exposed to the simulated EMP environment” [6]. E1 can generate voltages of
50,000 V. Normal operating voltages of today’s miniaturized electronics tend to be only a few (3-4) volts. States the EMP Commission: “The large number and widespread
reliance on such systems by all the nation’s critical infrastructures represent a systemic threat to their continued operation following an EMP event” [39]. A scenario seen in
films is all automobiles and trucks being rendered inoperable. This would not be the case. Modern automobiles have as many as 100 microprocessors that control virtually all
functions, but the vulnerability has been reduced by the increased application of electromagnetic compatibility standards. The EMP Commission found that only minor damage
occurred at an E1 field level of 50 kV/m, but there were minor disruptions of normal operations at lower peak levels as well [40]. There is a self-published post (J. Steinberger,
Nobel laureate physics, 1988) disputing the potential effects of E1 [41]. This is an isolated opinion. Shielding against E1 could theoretically be accomplished through the
construction of a Faraday cage around specific components or an entire facility. The cage is composed of conductive materials and an insulation barrier that absorbs pulse
energy and channels it directly into the ground. The cage shields out the EM signals by “shorting out” the electric field and reflecting it. To be an effective Faraday cage, the
conductive case must totally enclose the system. Any aperture, even microscopic seams between metal plates, can compromise the protection. To be useful, however, a device
must have some connection with the outside world and not be completely isolated. Surge protective devices can be used on metallic cables to prevent large currents from
entering a device, or the metallic cables can be replaced by fiber optic cables without any accompanying metal. The US Military has taken extensive measures to protect
(“harden”) its equipment against E1. “On the civilian side, the problem has not really been addressed” [42]. E3 and its effects E3 is caused by the motion of ionized bomb debris
and atmosphere relative to the geomagnetic field, resulting in a perturbation of that field. This induces currents of thousands of amperes in long conductors such as transmission
lines that are several kilometers or greater in length. Direct currents of hundreds to thousands of amperes will flow into transformers. As the length of the conductor increases,
the amperage amplifies. The physics of E3 are similar to that of a GMD, but not identical. GMD comes from charged particles showering down from space creating current flow
in the ionosphere. These currents create magnetic fields on the ground. A nuclear burst on the other hand generates particles which create a magnetic bubble that pushes on the
earth’s magnetic field producing a changing magnetic field at the Earth’s surface. A geomagnetic storm will have substorms that can move over the Earth for more than 1 day,
while the E3 HEMP occurs only immediately following a nuclear burst. There are three studies on the potential effects of a HEMP E3 on the power grid. The first study,
published in 1991, found there would be little damage [43]. Although supporting the utility industry’s position, it has not been subsequently cited by either NERC or the industry.
The study is criticized for expressing a smaller threat intensity [44]. The second, published in 2010 by Metatech, calculated that a nuclear detonation 170 km over the USA would
collapse the entire US power grid [45]. The third study, by EPRI (an organization funded by the electric utility industry) published in February 2017, asserts that a single high-
altitude burst over the continental USA would damage only a few, widely scattered transformers [46]. The study is disputed for underestimating threat levels and using
erroneous models [44]. These results are incompatible. One’s interpretation of the studies on E3 (and GMD) is based largely on the credibility one gives to the underlying
Commission or Institute, and not the published calculations. FERC has decided not to proceed with a GMD standard that includes EMP [47]. It will be recalled the GMD
standard is 8 V/km. The EMP Commission, utilizing unclassified measured data from the Soviet era nuclear tests, found an expected peak level for E3 HEMP for a detonation
over the continental USA would be 85 V/km [48]. The position of the electric utility industry is that E3 from a nuclear detonation is not a critical threat [49]. Others have come
to a different conclusion. Israel has hardened her grid [50]. She perceives herself to face an existential threat, and it is not the Sun. The electric utility industry states the cost of
hardening the grid against EMP is the government’s responsibility, not the industry’s [51]. Cyberattack The vulnerability from a cyberattack is exponentially magnified by our
dependence on SCADAs. In 2010, a computer worm attacking SCADA systems was detected. Although widely spread, it was designed to only attack SCADA systems
manufactured by Siemens for P-1 centrifuges of the Iranian nuclear enrichment program. The attack destroyed between 10 and 20% of Iranian centrifuges. Iran’s program was
likely only briefly disrupted [52]. In December 2015, a cyberattack was directed against the Ukrainian power grid. It caused little damage as the grid was not fully automated.
There is an argument that the cyber threat is exaggerated. Thomas Rid states that viruses and malware cannot at present collapse the electric grid. “(The world has) never seen a
cyber-attack kill a single human being or destroy a building" [53]. The electric utility industry offers a similar perspective. In testimony on cybersecurity before the Senate Homeland Security and Governmental Affairs Committee, its representative states that "There are a lot of threats to the grid… from squirrels to nation states, and frankly, there have been more blackouts as a result of squirrels (gnawing wire insulation) than there are from nation states" [54]. Others however express concern [55]. Moreover, in a report
by the Department of Defense in 2017, it is noted that “the cyber threat to critical US infrastructure is outpacing efforts to reduce pervasive vulnerabilities.” [56] That report
notes that “due to our extreme dependence on vulnerable information systems, the United States today lives in a virtual glass house” [57]. On March 15, 2018, the Department of
Homeland Security issued an alert that the Russian government had engineered a series of cyberattacks targeting American and European nuclear power plants and water and
electric systems [58]. It is reported these attacks could allow Russia to sabotage or shut down power plants at will [59]. The ability to operate a system in the absence of
computer-driven actions is fast disappearing. The electric power industry spends over $1.4 billion dollars annually to replace electromechanical systems and devices that involve
manual operation with new SCADA equipment [60]. With modest increases in efficiency come exponential increases in vulnerability. The extent to which reduced labor costs
(and perhaps reduced energy costs) are passed on to the public is uncertain. Kinetic attack An internal FERC memo obtained by the press in March 2012 states that “destroy
nine interconnector substations and a transformer manufacturer and the entire United States grid would be down for 18 months, possibly longer” [61]. The mechanism is
through the megawatts of voltage that would be dumped onto other transformers, causing them to overheat and in cascading fashion cause the entire system overload and fail.
At Metcalf California (outside of San Jose) on April 16, 2013, a HV Transformer owned by PG&E sustained what NERC and PG&E claimed was merely an act of vandalism [1].
Footprints suggested as many as 6 men executed the attack. They left no fingerprints, not even on the expended shell casings [1]. US FERC Chairman Wellinghoff concluded that
the attack was a dry run for future operations [62]. Information on how to sabotage transformers has been available online [63]. There is a disincentive for management to
invest in security. As stated in a report by the Electric Research Power Institute: “Security measures, in themselves, are cost items, with no direct monetary return. The benefits
are in the avoided costs of potential attacks whose probability is generally not known. This makes cost-justification very difficult” [64]. CEO pay at large American companies is
based on the Harvard Business School theory that the best measure of managerial performance is a company’s stock price. This does not necessarily align the interests of CEOs
with shareholders, let alone the public. It "encourages short-term boosts to profits rather than investing for long-term growth" [65]. In 2014, the CEO of PG&E, Anthony Earley Jr., had a compensation of $11.6 million. Over 90% was from bonuses based on stock performance. The President of PG&E, Christopher Johns, had a compensation of $6 million [66]. There is no evidence, however, that any of this is in play in the positions of the electric utility industry vis-à-vis securing the grid. States PG&E spokesman
Jonathan Marshall, “The majority of compensation for senior executives is shareholder funded and dependent on achieving targets related to safety, reliability and other results”
[66].

Consequences of a sustained power outage The EMP Commission states "Should significant parts of the electrical power infrastructure be lost for any substantial period of time, the Commission believes that the consequences are likely to be catastrophic, and many people will die for the lack of the basic elements necessary to sustain life in dense urban and suburban communities." [67]. Space constraints preclude discussion on how the loss of the grid would render synthesis and distribution of oil and gas inoperative. Telecommunications would collapse, as would finance and banking. Virtually all technology, infrastructure, and services require electricity. An EMP attack that collapses the electric power grid will collapse the water infrastructure—the delivery and purification of water and the removal and treatment of wastewater and sewage. Outbreaks that would result from the failure of these systems include cholera. It is problematic whether fuel will be available to boil water. Lack of water will cause death in 3 to 4 days [68]. Food production would also collapse. Crops and livestock require water delivered by electrically powered pumps. Tractors, harvesters, and other farm equipment run on petroleum products supplied by an infrastructure (pumps, pipelines) that requires electricity. The plants that make fertilizer, insecticides, and feed also require electricity. Gas pumps that fuel the trucks that distribute food require electricity. Food processing requires electricity. In 1900, nearly 40% of the population lived on farms. That percentage is now less than 2% [69]. It is through technology that 2% of the population can feed the other 98% [68].
The acreage under cultivation today is only 6% more than in 1900, yet productivity has increased 50-fold [69]. As stated by Dr. Lowell L. Wood in Congressional testimony: "If we were no longer able to fuel our agricultural machine in the country, the food production of the country would simply stop, because we do not have the horses and mules that used to tow agricultural gear around in the 1880s and 1890s". "So the situation would be exceedingly adverse if both electricity and the fuel that electricity moves around the country… stayed away for a substantial period of time, we would miss the harvest, and we would starve the following winter" [70]. People can live for 1–2 months without food, but after 5 days, they have difficulty thinking and at 2 weeks they are incapacitated [68]. There is typically a 30-day perishable food supply at regional warehouses, but most would be destroyed with the loss of refrigeration [69]. The EMP Commission has suggested food be stockpiled for a possible EMP event.

Space weather risk is systemically underestimated


Rosen 16 [Julia Rosen, science reporter for the Los Angeles Times, PhD in geology,7-14-2016
https://www.sciencemag.org/news/2016/07/here-s-how-world-could-end-and-what-we-can-
do-about-it]
As end-of-humanity scenarios go, that bleak vision from Fritz Leiber's 1951 short story "A Pail of Air" is a fairly remote possibility. Scholars who ponder such things think a self-induced catastrophe such as nuclear war or a bioengineered pandemic is most likely to do us in. However, a number of other extreme natural hazards—including threats from space and geologic upheavals here on Earth—could still derail life as we know it, unraveling advanced civilization, wiping out billions of people, or potentially even exterminating our species.

Yet there’s been surprisingly little research on the subject, says Anders Sandberg, a catastrophe researcher at the
University of Oxford’s Future of Humanity Institute in the United Kingdom. Last he checked, “there are more papers about dung
beetle reproduction than human extinction,” he says. “We might have our priorities slightly wrong.”

Frequent, moderately severe disasters such as earthquakes attract far more funding than low-probability apocalyptic ones. Prejudice may also be at work; for instance, scientists who pioneered studies of asteroid and comet impacts complained about confronting a pervasive "giggle factor." Consciously or unconsciously, Sandberg says, many researchers consider catastrophic risks the province of fiction or fantasy—not serious science.

A handful of researchers, however, persist in thinking the unthinkable. With enough knowledge
and proper
planning, they say, it’s possible to prepare for—or in some cases prevent—rare but devastating natural
disasters. Giggle all you want, but the survival of human civilization could be at stake.

Threat one: Solar storms


One threat to civilization could come not from too little sun, as in Leiber’s story, but from too much. Bill Murtagh has seen how it
might start. On the morning of 23 July 2012, he sat before a colorful array of screens at the National Oceanic and Atmospheric
Administration’s Space Weather Prediction Center in Boulder, Colorado, watching twin clouds of energetic particles—known as a
coronal mass ejection (CME)—erupt from the sun and barrel into space. A mere 19 hours later, the solar buckshot blazed past the
spot where Earth had been just days before. If it had hit us, scientists say, we might still be reeling.

Now the assistant director of space weather at the White House Office of Science and Technology Policy in Washington, D.C.,
Murtagh spends much of his time pondering solar eruptions. CMEs don’t harm human beings directly, and their
effects can be spectacular. By funneling
charged particles into Earth’s magnetic field, they can trigger
geomagnetic storms that ignite dazzling auroral displays. But those storms can also induce dangerous
electrical currents in long-distance power lines. The currents last only a few minutes, but they can take out
electrical grids by destroying high-voltage transformers—particularly at high latitudes, where Earth’s
magnetic field lines converge as they arc toward the surface.
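The scale of those induced currents can be sketched with rough numbers. A storm's geoelectric field is expressed in V/km, and the quasi-DC voltage driven along a transmission line grows roughly in proportion to field strength times line length. The field values below are the NERC benchmark and Carrington-class estimates cited elsewhere in this document; the 200 km line length and the simple linear model are illustrative assumptions, not engineering analysis.

```python
# Rough geomagnetically induced voltage on a long transmission line:
# V ≈ E * L, with E the geoelectric field (V/km) and L the line length (km).
FIELDS_V_PER_KM = {
    "NERC benchmark (1-in-100-yr)": 8.0,    # NERC GMD benchmark event
    "Carrington median (LANL)": 13.6,       # Los Alamos estimate, median
    "Carrington upper (LANL)": 16.6,        # Los Alamos estimate, upper bound
    "100-yr storm (other analysis)": 21.0,  # alternative analysis
}
LINE_LENGTH_KM = 200  # assumed length of a long high-voltage line

def induced_voltage(e_field_v_per_km: float, length_km: float) -> float:
    """Linear estimate of the quasi-DC voltage driven along the line."""
    return e_field_v_per_km * length_km

for label, e_field in FIELDS_V_PER_KM.items():
    volts = induced_voltage(e_field, LINE_LENGTH_KM)
    print(f"{label}: ~{volts:,.0f} V driving DC through transformer windings")
```

Even this crude estimate shows why the benchmark dispute matters: a Carrington-class field of 13.6–21 V/km drives roughly two to three times the quasi-DC current of an 8 V/km design event, and it is that sustained DC, not the nominal figure, that saturates and overheats transformer cores.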

The worst CME event in recent history struck in 1989, frying a transformer in New Jersey and leaving 6 million people in Quebec
province in Canada without power. The largest one on record—the Carrington Event of 1859, named after the
U.K. astronomer who witnessed the accompanying solar flare—was up to 10 times more intense. It sent searing currents racing
through telegraph cables, sparking fires and shocking operators, while the northern lights danced as far south as Cuba.

"It was awesome," says Patricia Reiff, a space physicist at Rice University in Houston, Texas. But if another storm that size struck today's infrastructure, she says, "there would be tremendous consequences."

Some researchers fear that another Carrington-like event could destroy tens to hundreds of transformers, plunging vast portions of entire continents into the dark for weeks or months—perhaps even years, Murtagh says. That's because the custom-built, house-sized replacement transformers can't be
bought off the shelf. Transformer manufacturers maintain that such fears are overblown and that most equipment would
survive. But Thomas Overbye, an electrical engineer at the University of Illinois, Urbana-Champaign, says nobody knows for sure.
“We don’t have a lot of data associated with large storms because they are very rare,” he says.

What's clear is that widespread blackouts could be catastrophic, especially in countries that depend on highly developed electrical grids. "We've done a marvelous job creating a great vulnerability to this threat," Murtagh says. Information technologies, fuel pipelines, water pumps, ATMs, everything with a plug would be rendered useless. "That's going to affect our ability to govern the country," Murtagh says.

Space science and tech collaboration vital to stop pandemics, make telemedicine successful, and distribute vaccines
Krishnamurthy 18 [Ramesh, researcher in Health Systems and Innovation Cluster,
World Health Organization, with; Jason Hatton; 1/1/18, “Space science and technologies to
advance health-related sustainable development goals,”
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5791876/]
Human health, animal health and ecosystems constantly interact; satellite-based geospatial information, including earth observations and remote sensing data, can accelerate our understanding of intricate ecological and environmental interactions and their impact on human health. Analysing these interactions can guide public health decision-making efforts. Large-scale temporal pattern recognition, such as the decline of arctic ice over a 20-year period, global deforestation rates or global daily surface temperatures, cannot be easily accomplished without earth observation satellites. Satellite-based remote sensing data have been used to identify environmental factors for monitoring Rift Valley fever and other infectious diseases.2–4

Space science and technologies have wide applications, for example in managing public health
emergencies , forecasting epidemics , facilitating early warning and disaster management
plans, as well as monitoring environmental parameters .5 The by-products of space-based technologies
and innovations can make substantial contributions to injury prevention from road crashes.

In health services delivery, innovative space technologies are now applied in assistive robotic surgeries, predictive diagnosis, compact water filtration systems, injection safety devices and precision medicine. Furthermore, satellite communications-based tele-health, telemedicine services and tele-guided ultrasound systems now connect patients and caregivers in hard-to-reach or resource-constrained settings. In addition,
satellite images can assist in delivering vaccines or accessing health-care facilities by rapidly detecting road features in an image through feature extraction, producing a map of road networks where maps are unavailable or nonexistent.

In health systems research, space-based research, such as on the International Space Station, can provide unique data on physiologic and biological processes, which may allow potential novel therapeutic approaches to identify diseases. Space science and technology thus contribute to epidemic intelligence, health emergencies and the research agenda on the benefits to public health. Space-based innovation in the health sector is poised to bring significant health and economic gains through the adoption of product and process innovations at all levels of health services. This will contribute to delivering better health for all people and could reduce the disease burden. These innovations have a clear place in prevention, preparedness, response and recovery – all the important stages of national public health management. Thus far, the use of space science and technology in public health has been sporadic. Earth observation data from orbiting satellites and ground-based sensors are already used by a few countries to make public health decisions, but more countries could use space-based technologies and geospatial information in this way.

To use space science and technologies in advancing the health sustainable development goal, appropriate national-level policies and
governance mechanisms are essential. National
governments are encouraged to strengthen policy and
governance mechanisms for closer collaboration between health ministries, other relevant ministries and space agencies to
leverage the benefits of space science and technologies for health gains. Governments should ensure
national technical readiness for geospatial information management as well as the use of space-based innovations and integrate the
use of geospatial data in health systems strengthening efforts.

Pandemics cause extinction


Dhillon 17 – Ranu Dhillon, instructor at Harvard Medical School and a physician at Brigham and Women's Hospital in Boston. He works on building health systems in developing countries and served as an advisor to the president of Guinea during the Ebola epidemic. Harvard Business Review, 3-15-17, "The World Is Completely Unprepared for a Global Pandemic", https://hbr.org/2017/03/the-world-is-completely-unprepared-for-a-global-pandemic

We fear it is only a matter of time before we face a deadlier and more contagious pathogen ,
yet the threat of a deadly pandemic remains dangerously overlooked . Pandemics now occur with
greater frequency , due to factors such as climate change , urbanization , and international
travel . Other factors, such as a weak World Health Organization and potentially massive cuts to funding for
U.S. scientific research and foreign aid, including funding for the United Nations, stand to deepen our vulnerability. We
also face the specter of novel and mutated pathogens that could spread and kill
faster than diseases we have seen before . With the advent of genome-editing technologies, bioterrorists could
artificially engineer new plagues, a threat that Ashton Carter, the former U.S. secretary of defense, thinks could rival nuclear weapons in deadliness . The two of us have advised the president of Guinea on stopping Ebola. In addition, we have worked on ways
to contain the spread of Zika and have informally advised U.S. and international organizations on the matter. Our experiences tell us
that the world is unprepared for these threats. We urgently need to change this trajectory. We can start by learning four
lessons from the gaps exposed by the Ebola and Zika pandemics. Faster Vaccine Development The most effective way to stop pandemics is with
vaccines. However, with Ebola there was no vaccine, and only now, years later, has one proven effective. This has been the case with Zika, too. Though
there has been rapid progress in developing and getting a vaccine to market, it is not fast enough, and Zika has already spread worldwide. Many other
diseases do not have vaccines, and developing them takes too long when a pandemic is already under way. We need faster pipelines, such as the one
that the Coalition for Epidemic Preparedness Innovations is trying to create, to preemptively develop vaccines for diseases predicted to cause outbreaks
in the near future. Point-of-Care Diagnostics Even with such efforts, vaccines will not be ready for many diseases and would not even be an option for
novel or artificially engineered pathogens. With no vaccine for Ebola, our next best strategy was to identify who was infected as quickly as possible and
isolate them before they infected others. Because Ebola’s symptoms were identical to common illnesses like malaria, diagnosis required laboratory
testing that could not be easily scaled. As a result, many patients were only tested after several days of being contagious and infecting others. Some were
never tested at all, and about 40% of patients in Ebola treatment centers did not actually have Ebola. Many dangerous pathogens similarly require
laboratory testing that is difficult to scale. Florida, for example, has not been able to expand testing for Zika, so pregnant women wait weeks to know if
their babies might be affected. What’s needed are point-of-care diagnostics that, like pregnancy tests, can be used by frontline responders or patients
themselves to detect infection right away, where they live. These tests already exist for many diseases, and the technology behind them is well-
established. However, the process for their validation is slow and messy. Point-of-care diagnostics for Ebola, for example, were available but never used
because of such bottlenecks. Greater Global Coordination We need stronger global coordination . The
responsibility for controlling pandemics is fragmented, spread across too many players
with no unifying authority . In Guinea we forged a response out of an amalgam of over 30 organizations, each of which had its
own priorities. In Ebola’s aftermath, there have been calls for a mechanism for responding to
pandemics similar to the advance planning and training that NATO has in place for its
numerous members to respond to military threats in a quick, coordinated fashion . This is the right
thinking, but we are far from seeing it happen. The errors that allowed Ebola to become a crisis replayed with Zika, and the WHO, which
should anchor global action, continues to suffer from a lack of credibility. Stronger Local Health Systems
International actors are essential but cannot parachute into countries and navigate
local dynamics quickly enough to contain outbreaks . In Guinea it took months to establish the ground game
needed to stop the pandemic, with Ebola continuing to spread in the meantime. We need to help developing countries establish health systems that can
provide routine care and, when needed, coordinate with international responders to contain new outbreaks. Local health systems could be established
for about half of the $3.6 billion ultimately spent on creating an Ebola response from scratch. Access to routine care is also
essential for knowing when an outbreak is taking root and establishing trust. For months, Ebola spread before anyone knew it was happening,
and then lingered because communities who had never had basic health care doubted the intentions of foreigners flooding into their villages. The
turning point in the pandemic came when they began to trust what they were hearing about Ebola and understood what they needed to do to halt its
spread: identify those exposed and safely bury the dead. With
Ebola and Zika, we lacked these four things — vaccines, diagnostics,
global coordination, and local health systems — which are still urgently needed . However, prevailing
political headwinds in the United States, which has played a key role in combatting pandemics around the world , threaten to make things worse. The Trump administration is seeking drastic budget cuts in funding for foreign aid
and scientific research. The U.S. State Department and U.S. Agency for International Development may lose over one-third of their budgets, including
half of the funding the U.S. usually provides to the UN. The National Institutes of Health, which has been on the vanguard of vaccines and diagnostics
research, may also face cuts. The Centers for Disease Control and Prevention, which has been at the forefront of responding to outbreaks, remains
without a director, and, if the Affordable Care Act is repealed, would lose $891 million, 12% of its overall budget, provided to it for immunization
programs, monitoring and responding to outbreaks, and other public health initiatives. Investing in our ability to prevent and
contain pandemics through revitalized national and international institutions should be our shared
goal. However, if U.S. agencies become less able to respond to pandemics, leading institutions from other nations, such as Institut Pasteur and the
National Institute of Health and Medical Research in France, the Wellcome Trust and London School of Hygiene and Tropical Medicine in the UK, and
nongovernmental organizations (NGOs), which have done instrumental research and response work in previous pandemics, would need to step in to fill the void. There is no border wall against disease. Pandemics are an existential threat on par with climate change and nuclear conflict . We are at a critical crossroads, where we must either take the steps needed to prepare for this threat or become even more vulnerable. It is only a matter of time before we are hit by a deadlier, more contagious pandemic. Will we be ready?

Vaccine distribution saves millions


Stern 5 - Alexandra, assistant professor in the Department of Obstetrics and Gynecology and
the Program in American Culture at the University of Michigan, and Howard, George E. Wantz
Professor of the History of Medicine at the University of Michigan and a Professor in the
Department of Pediatrics and Communicable Diseases, Psychiatry, Health Policy, Management,
and History, May, "The History of Vaccines And Immunization: Familiar Patterns, New
Challenges", http://content.healthaffairs.org/content/24/3/611.full
Human beings have benefited from vaccines for more than two centuries. Yet the pathway to effective vaccines
has been neither neat nor direct. This paper explores the history of vaccines and immunization, beginning with Edward Jenner’s
creation of the world’s first vaccine for smallpox in the 1790s. We then demonstrate that many of the issues salient in Jenner’s era—
such as the need for secure funding mechanisms, streamlined manufacturing and safety concerns, and deep-seated public fears of
inoculating agents—have frequently reappeared and have often dominated vaccine policies. We suggest that historical awareness can
help inform viable long-term solutions to contemporary problems with vaccine research, production, and supply. If
we could
match the enormous scientific strides of the twentieth century with the political and economic investments of
the nineteenth, the world’s citizens might be much healthier . The gasping breath and distinctive sounds of whooping
cough; the iron lungs and braces designed for children paralyzed by polio; and the devastating birth defects caused by
rubella: To most Americans, these infectious scourges simultaneously inspire dread and represent
obscure maladies of years past. Yet a little more than a century ago, the U.S. infant mortality rate
was a staggering 20 percent, and the childhood mortality rate before age five was another disconcerting 20 percent.1 Not
surprisingly, in an epoch before the existence of preventive methods and effective therapies,
infectious diseases such as measles, diphtheria, smallpox, and pertussis topped the list of childhood killers.
Fortunately, many of these devastating diseases have been contained, especially in industrialized nations,
because of the development and widespread distribution of safe, effective, and affordable
vaccines. Indeed, if you asked a public health professional to draw up a top-ten list of the achievements of the
past century, he or she would be hard pressed not to rank immunization first .2 Millions of
lives have been saved and microbes stopped in their tracks before they could have a chance to wreak
havoc. In short, the vaccine represents the single greatest promise of biomedicine: disease
prevention.3

IoT is inevitable – telehealth integration is key to successful innovation within IoT
Ellis 17 – Megan Ellis, Journalist at Make Use Of, Honors Degree in New Media, “5 Ways the
Internet of Things Is Revolutionizing Healthcare”, Make Use Of, 10-5,
http://www.makeuseof.com/tag/internet-things-revolutionizing-healthcare/
While most of us are familiar with fitness trackers and other commonplace smart devices, true
innovation is sitting (mostly unnoticed by
the ordinary consumer) where the Internet of Things (IoT) and healthcare meet.

Healthcare has a lot to gain from the wearable and smart device industry — and in turn, it
might be the medical industry that ensures the success of these devices . From
Bluetooth-enabled hearing aids to robotic caretakers, there are some IoT healthcare innovations
we would have relegated to science fiction only a decade ago .
But the technology is here, and it’s more advanced than you may think. As a white paper by Freescale Semiconductor points out:

“The long-predicted IoT revolution in healthcare is already underway … The IoT plays a
significant role in a broad range of healthcare applications, from managing chronic diseases at
one end of the spectrum to preventing disease at the other.”
1. Precise Patient Tracking
Smart devices and wearables benefit both patients and doctors. Doctors are using various medical apps to track their patients' ongoing health concerns. Meanwhile, patients are able to receive recommendations and new advice as their treatment plan progresses.

This kind of tracking can take place in a hospital setting, but also at home . In some cases, the technology is a literal lifesaver. As early as
2009 there were cases where patients were saved thanks to their Wi-Fi-enabled pacemaker, as documented in an article in the journal Europace, which states:

“A 66-year-old patient with a Medtronic Concerto CRT-D for primary prevention of sudden death phoned the clinic complaining of fatigue since two days, without any malaise
or ICD shocks. Remote interrogation of the device . . . showed slow irregular VT (Ventricular Tachycardia).”

Doctors immediately told the patient to go to the clinic. Only 15 minutes after arriving, the patient collapsed from a heart attack. Luckily, doctors were ready and able to help.
Other patients with heart disease continue to benefit from the technology, which is becoming more advanced every day.

2. Smarter Health Aids


Not all smart medical devices are about saving lives. Some are about making the lives of patients, especially those with chronic illness or disabilities, easier.

Diabetes is one condition that inspires the development of many smart healthcare devices. These connected devices don’t only help patients monitor their condition, but also
help with preventative healthcare.

iHealth Smart, for example, is a Bluetooth-enabled device that helps users track their glucose levels. It comes with a paired app to help them understand and keep track of the
data.

Connected versions of everyday items also play a role. Siren Smart Socks, for example, use temperature monitoring to help prevent foot ulcers — a complication many diabetic patients face. The socks send this data to a companion app, which will notify users as to whether they should do a visual foot check, change their shoes, reduce physical activity or visit a doctor.

Some connected healthcare devices add an element of convenience to patients’ lives. This is the aim of the Widex COM-DEX, a Bluetooth-enabled
hearing aid that users can adjust and customize with an app.

This connection with the user’s phone also brings other benefits, such as the ability to take calls or listen to music through the hearing aid.

“I chose a smart hearing aid mainly because of its size and style. I also liked the fact that I could control it from my phone via the COM-DEX,” a user of the device, Amy Bell, told
MakeUseOf.

Bell requires a hearing aid in one ear due to a diving accident that damaged her cochlear hair cells. This makes her sensitive to high-pitched noises, while she cannot properly
hear low or bass tones.

“With the COM-DEX connected to my hearing aid and phone I’m able to hear phone calls through my hearing aid as well as play music. Bigger or older models don’t come with
this feature,” she says.

3. Smart Surgery
Google Glass might be a commercial failure, but it found its niche by providing augmented reality (AR) glasses for factory workers.

But factories aren’t the only places using this technology. In fact, AR-assisted surgery is already a reality.

Other companies are also creating augmented reality tools for surgery, such as Augmedics. The company is developing an AR headset dubbed The Vizor, which will allow doctors
to view a patient’s CT scan during surgery.

Augmedics believes that using AR headsets could allow for minimally invasive spinal surgery because the technology reduces the need for large incisions.

Meanwhile, another company called Scopis has combined the Microsoft HoloLens with their surgery navigation technology. By doing this, they have created a system that
provides AR guidance for surgeons on numerous procedures.

“Scopis’ Holographic Navigation Platform is a universal solution that offers specific advantages for spinal surgeries and can also be applied in the many other areas where the
highest levels of precision and speed are critical . . . In neurosurgery, for example, brain tumors could be located faster and with higher accuracy.” — Brian Kosmecki, Scopis CEO

4. Preventative Healthcare and Early Detection

With connected devices comes a slew of data, allowing researchers to analyze trends and
risks for patients. While it’s not a precise science yet, more companies are implementing data solutions that can help with
preventative healthcare.
PwC has introduced a predictive engine named Bodylogical which processes patient data to help give health insights into potential health trends and problems in the future.
Consumer devices can then use the engine to provide users with possible impacts of their decisions on their health.

“[Using health data from wearables, connected health devices, or other consumer data sources] Bodylogical can visualize how a person’s present choices — both positive and
negative — will personally impact them. It can show people how to achieve greater results with the least amount of effort. Or it can help individuals focus on the one or two
specific things that can deliver the most improvement.” — PwC

This includes sending users notifications on their smartphones to remind them of tasks or interventions.

Meanwhile, connected devices are also powerful tools for the early detection of diseases and health
issues. Cyrcadia Health, for example, has a device called the iTBra, that assists in the early detection of breast cancer.
The smart breast patches use predictive analytics software and algorithms to identify potential abnormalities in breast tissue. This data is sent to Cyrcadia Health for analysis,
with results sent to the user and their doctors.

5. Robotic Healthcare Assistants


Remember those science fiction scenarios we mentioned? Well, this new technology is definitely something you might expect from a futuristic novel. However, robotic
healthcare assistants already exist.

Smart sensors are well-known as a way that the elderly can maintain independence in their homes. But IBM and Rice University are taking it a step further with the Multi-
Purpose Eldercare Robot Assistant (IBM MERA). The technology is still in its early stages, but a prototype was up and running by the end of 2016.

“IBM MERA will be used to help study innovative ways of measuring an individual’s vital signs, such as heart rate, heart rate variability and respiratory rate; answer basic
health-related questions; and, determine if an individual has fallen by reading the results of an accelerometer.” — IBM
Research by PwC Global has noted that AI and robots are playing an increasingly important role in the healthcare sector. More surprisingly, the firm also found that the public are ready to accept these advances. Their survey revealed that
consumers around the world are ready to engage with new technology designed to enable health
and wellness.
It will probably be many years before you can buy a robot with the smarts and functionality of Rosie from "The Jetsons." Nonetheless, today’s robots are moving in that
direction.


Another robotic solution named Giraff helps those with chronic illnesses. The robot facilitates virtual visits for people confined to their homes or beds . It does this by providing patients with a visual feed and control over the robot's movements.

Meanwhile the robot is also useful in the patient's everyday life, where it can monitor their health and stream consultations with healthcare professionals. Patients can also interact with their loved ones through the telecare robot.
A Two-Way Street

The impact of connected devices on healthcare isn’t one-sided — in fact, the healthcare industry
is one of the major drivers of innovation and production in the smart device industry. This
is especially prevalent when it comes to wearables.

Global Industry Analysts Inc expects that the wearable medical devices market will reach a value of $4.5 billion by 2020. The company says that this is
due to the “growing need for effective management of chronic diseases.”

“The global wearable devices market continues to gain momentum from the rapid adoption of these devices among patients,” the company said in a
report.

It also notes that the increased concerns over healthcare affordability are driving greater interest in using connected devices to manage chronic conditions.
Data from Statista shows that between 2015 and 2016, sales revenue from healthcare wearable devices tripled. By the
end of 2017, they anticipate the revenue will increase another 2.5 times from 2016.

So while
healthcare is seeing immense improvements thanks to the Internet of Things, the
connected device industry is also significantly benefiting from its connections to healthcare.

Extinction
Steer 16 – Dr. Andrew Steer, President and CEO of the World Resources Institute, Global
Agenda Trustee for the World Economic Forum, Former Director General at the UK Department
of International Development, “How Information Technology Will Enable a Sustainable
Future”, Hewlett Packard Enterprise, 9-26, https://community.hpe.com/t5/Inspiring-
Progress/How-Information-Technology-Will-Enable-a-Sustainable-Future/ba-
p/6901717#.WLxR0TvafHw
Two recent developments offer great hope for the future.
For the first time ever, all the countries in the world have agreed on global goals (to 2030) that
apply to rich and poor alike. In September 2015, world leaders adopted the UN Sustainable
Development Goals (SDGs) with a vision that all people have a good quality of life—free of
hunger, poverty, and injustice—while our environment thrives. The SDGs have a set of 169
targets, most of which are measurable.
Second, thanks to massive advances in information technology , we can now measure
important things much more accurately in real time—in turn creating an ability to
react quickly to problems , and an accountability for delivering on commitments made.

Take forests, for example. One of the SDG targets is to eliminate deforestation and restore
degraded land. But how to measure what is happening? As recently as five years ago this would
have required referring to a thick book of statistics, often several years out of date. Now, real
time data is available at a resolution of one-fifth of an acre to anybody with a smartphone or
laptop anywhere—all for free! This all thanks to billions of data points provided by satellites
each day, massive gains in cloud computing, and incredible progress in communications and
visualization technology. This is all provided by Global Forest Watch , a partnership between the
World Resources Institute (WRI) and a number of leading technology companies, research
institutions and national governments.
New technology is also helping us increase the efficiency of buildings and build smarter, lower-
carbon, higher-functioning cities. Data platforms like our Aqueduct water risk model are helping
business leaders and countries measure and monitor environmental risks and track progress.

At WRI we focus on six urgent global challenges: climate , water , forests , food , energy ,
and cities . In each of these IT is facilitating better data collection and analysis, heightened
transparency, better communication and ultimately better decisions —all critical to WRI’s
moto: count it, change it, scale it. From satellite images of coral reefs to crowd-sourced data on
urban traffic congestion, our ability to make decisions based on increasingly better information
points to tremendous potential for sustainability gains.
It is a pleasure for us to partner with HPE in a number of important initiatives, including
exploring the intersection of Smart Cities and the I nternet o f T hings , and helping shape HPE's
science-based target. It’s only by tapping the ingenuity and excitement of those within the IT
sector that we can address existential threats like climate change while increasing standards
of living for billions now living in poverty. For this reason, we see companies like HPE at the
very forefront of solving today’s most difficult challenges.

Only engagement with China solves – rural distribution and pandemic risk uniquely high
Liu 17 [Melinda Liu has served as the Beijing bureau chief at Newsweek since 1998, November
2017 https://www.smithsonianmag.com/science-nature/china-ground-zero-future-pandemic-
180965213/]
Guan Yi, a virus expert and noted flu hunter at the University of Hong Kong School of Public Health , has
predicted that H7N9 “could be the biggest threat to public health in 100 years .” Specialists at the Centers
for Disease Control and Prevention warned this past June that out of all the novel influenza strains they'd recently evaluated, H7N9 has the highest potential "to emerge as a pandemic virus and cause substantial human illness."
Yin says he’d heard about H7N9 on TV, but when his wife started to vomit, they didn’t make the connection. Instead of seeking Western-style medicine, they did what many
rural Chinese people do when they’re under the weather: They went to the local herbalist and sought inexpensive, traditional treatments for what they hoped was a simple
illness. As a small-scale farmer with four children, Yin takes temporary construction jobs (as many rural Chinese do) to boost his income to about $550 a month. He had always
been terrified that someone in his family might develop a serious health problem. “That’s a farmer’s worst nightmare,” he explains. “Hospital costs are unbelievable. Entire
family savings could be wiped out.”

When the herbs didn’t work, Long’s family hired a car and drove her 20 miles to the Ziyang Hospital of Traditional Chinese Medicine. There she was diagnosed with
gastrointestinal ulcers and received various treatments, including a medication often prescribed for colic and a traditional Chinese medicine (jingfang qingre) used to reduce
fever. She didn’t improve. Two days later, Long went into intensive care. The next day, Yin was shocked when doctors told him his wife was, in fact, infected with H7N9.
The diagnosis was especially surprising, given that Long hadn’t done anything different than usual in the period leading up to her illness. She’d looked after her 73-year-old
mother, who lived nearby, and worked in the cornfields. And just a few days before she became ill, Long had walked about an hour to the local market, approached a vendor
selling live poultry and returned home with five chickens.

**********

Officially, the live-bird markets in Beijing have been shuttered for years. In reality, guerrilla vendors run furtive slaughterhouses throughout this national capital of wide
avenues, gleaming architecture and more than 20 million residents—despite warnings that their businesses could be spreading deadly new strains of the flu.

In one such market, a man in sweatstained shorts had stacked dozens of cages—jammed with chickens, pigeons, quail—on the pavement outside his grim hovel.

I picked out two plump brown chickens. He slit their throats, tossed the flapping birds into a greasy four-foot-tall ceramic pot, and waited for the blood-spurting commotion to
die down. A few minutes later he dunked the chickens in boiling water. To de-feather them, he turned to a sort of ramshackle washing machine with its rotating drum studded
with rubber protuberances. Soon, feathers and sludge splashed onto a pavement slick with who knows what.

I asked the vendor to discard the feet. This made him wary. Chicken feet are a Chinese delicacy and few locals would refuse them. “Don’t take my picture, don’t use my name,” he
said, well aware that he was breaking the law. “There was another place selling live chickens over there, but he had to shut down two days ago.”

Many Chinese people, even city dwellers, insist that freshly slaughtered poultry is tastier and more healthful than refrigerated or frozen meat. This is one of the major reasons
China has been such a hot spot for new influenza viruses: Nowhere else on earth do so many people have such close contact with so many birds.

At least two flu pandemics in the past century—in 1957 and 1968—originated in [ China ] the Middle Kingdom and
were triggered by avian viruses that evolved to become easily transmissible between
humans . Although health authorities have increasingly tried to ban the practice, millions of live birds are
still kept, sold and slaughtered in crowded markets each year. In a study published in January, researchers in
China concluded that these markets were a “main source of H7N9 transmission by way of human-poultry contact and
avian-related environmental exposures.”

In Chongzhou, a city near the Sichuan provincial capital of Chengdu, the New Era Poultry Market was reportedly closed for two months at the end of last year. “Neighborhood
public security authorities put up posters explaining why bird flu is a threat, and asking residents to co-operate and not to sell poultry secretly,” said a Chongzhou teacher, who
asked to be identified only as David. “People pretty much listened and obeyed, because everyone’s worried about their own health.”

When I visited New Era Poultry in late June, it was back in business. Above the live-poultry section hung a massive red banner: “Designated Slaughter Zone.” One vendor said he
sold some 200 live birds daily. “Would you like me to kill one for you, so you can have a fresh meal?” he asked.

Half a dozen forlorn ducks, legs tied, lay on a tiled and blood-spattered floor, alongside dozens of caged chickens. Stalls overflowed with graphic evidence of the morning’s brisk
trade: boiled bird carcasses, bloodied cleavers, clumps of feathers, poultry organs. Open vats bubbled with a dark oleaginous resin used to remove feathers. Poultry cages were
draped with the pelts of freshly skinned rabbits. (“Rabbit meat wholesale,” a sign said.)

These areas—often poorly ventilated, with multiple species jammed together—create ideal conditions for
spreading disease through shared water utensils or airborne droplets of blood and other secretions. “That provides
opportunities for viruses to spread in closely packed quarters, allowing ‘amplification’ of the viruses,” says Benjamin John Cowling, a
specialist in medical statistics at the University of Hong Kong School of Public Health. “The risk to humans becomes so much higher.”

Shutting live-bird markets can help contain a bird flu outbreak. Back in 1997, the H5N1 virus traveled from mainland China to Hong Kong, where it started killing chickens and
later spread to 18 people, leaving six dead. Hong Kong authorities shut down the city’s live-poultry markets and scrambled to cull 1.6 million chickens, a draconian measure that
may have helped avert a major epidemic.

In mainland China, though, the demand for live poultry remains incredibly high. And unlike the Hong Kong epidemic, which visibly affected its avian hosts, the birds carrying
H7N9 initially appeared healthy themselves. For that reason, shuttering markets has been a particularly hard sell.

China’s Ministry of Agriculture typically hesitates to “mess with the industry of raising and selling chickens,” says Robert Webster, a world-renowned virologist based at St. Jude
Children’s Research Hospital in Memphis. He has been working with Chinese authorities since 1972, when he was part of a Western public health delegation invited to Beijing.
He and a colleague were eager to collect blood samples from Chinese farm animals. At a state-run pig farm, Webster recalls, he was allowed to get a blood sample from one pig.
“Then we said, ‘Could we have more pigs?’ And the Chinese officials replied, ‘All pigs are the same.’ And that was it,” he concludes with a laugh. “It was a one-pig trip.”

The experience taught Webster something about the two sides of Chinese bureaucracy. “The public health side of China gave us absolute co-operation,” he says. “But the
agricultural side was more reluctant.” He says the Chinese habit of keeping poultry alive until just before cooking “made some sense before the days of refrigeration. And now it’s
in their culture. If you forcibly close down government live-poultry markets, the transactions will simply go underground.”

Tiny porcelain and wood figurines of chickens, geese and pigs dot a crowded windowsill in Guan Yi’s office at the School of Public Health, framing an idyllic view of green, rolling
hills. Famed for his work with animal viruses, Guan is square-jawed and intense. Some call him driven. In another incarnation, he might have been a chain-smoking private
investigator. In real life he’s a blunt-spoken virus hunter.

Working out of his Hong Kong base as well as three mainland Chinese labs, including one at Shantou University Medical College, Guan receives tips about unusual flu trends in
China from grassroots contacts. He has trained several dozen mainland Chinese researchers to collect samples—mostly fecal swabs from poultry in markets and farms—and
undertake virus extraction and analysis.

At a lab in Hong Kong, a colleague of Guan’s sits before rows of chicken eggs, painstakingly injecting droplets of virus-containing liquid into living embryos. Later the amniotic
fluid will be analyzed. Another colleague shows off an important tool for their work: a sophisticated Illumina next-generation sequencing machine, which, he says, “can sequence
genes at least 40 times faster” than the previous method.

Guan is concerned that H7N9 may be undergoing mutations that could make it spread easily between people. He’s alarmed that the most recent version of H7N9 has infected
and killed so many more people than other avian flu viruses. “We don’t know why,” he frets.

Then there was that moment last winter when colleagues analyzing H7N9 were startled to discover that some of the viruses—previously non-pathogenic to birds—now were
killing them. This virus mutation was so new that scientists discovered it in the lab before poultry vendors reported unusually widespread bird deaths.
Flu viruses can mutate anywhere. In 2015, an H5N2 flu strain broke out in the United States and spread throughout the country, requiring the
slaughter of 48 million poultry. But China is uniquely positioned to create a novel flu virus that kills
people. On Chinese farms, people, poultry and other livestock often live in close proximity.
Pigs can be infected by both bird flu and human flu viruses, becoming potent "mixing vessels" that allow genetic material from
each to combine and possibly form new and deadly strains. The public's taste for freshly killed
meat, and the conditions at live markets, create ample opportunity for humans to come in
contact with these new mutations. In an effort to contain these infections and keep the poultry industry alive, Chinese officials have
developed flu vaccines specifically for birds. The program first rolled out on a large scale in 2005 and has gotten mixed reviews ever since. Birds often spread new viruses
without showing signs of illness themselves, and as Guan notes, “You can’t vaccinate every chicken in every area where bird flu is likely to emerge.” In July, after H7N9 was
found to be lethal to chickens, Chinese authorities rolled out H7N9 poultry vaccines; it’s still too early to assess their impact.

Meanwhile, there is no human vaccine yet available that can guarantee protection against the most recent variant of H7N9. Guan’s team is helping pave the way for one. They’ve
been looking deeply into the virus’ genesis and infection sources, predicting possible transmission routes around the globe. They’re sharing this information with like-minded
researchers in China and abroad, and offering seasonal vaccine recommendations to international entities such as the World Health Organization and the Food and Agriculture
Organization of the United Nations. Such data could prove life-saving—not just in China but worldwide—in the event of a full-on pandemic.

**********

When Long Yanju’s illness was diagnosed in April, she became one of 24 confirmed cases of H7N9 in Sichuan province that month. Hospitals there weren’t well equipped to
recognize signs of the virus: This wave marked the first time H7N9 had traveled from the densely populated eastern coast westward to rural Sichuan. “With the spread across
wider geographical areas, and into rural areas,” says Uyeki, the CDC influenza specialist, “it’s likely patients are being hospitalized where hospitals aren’t so well resourced as in
the cities, and clinicians have less experience managing such patients.”

Yin is now alleging that the hospital committed malpractice for not properly diagnosing or treating his wife until it was too late. He initially asked for $37,000 in damages from
the hospital. Officials there responded with a counterdemand that Yin pay an additional $15,000 in medical bills. “In late September I agreed to accept less than $23,000. I’d
run out of money,” he says. “But when I went to collect, the hospital refused to pay and offered much less. It’s not enough.” A county mediation committee is trying to help both
sides reach an agreement. (Hospital representatives declined to comment for this article.)

Whatever the outcome of Yin’s legal battle, it seems clear that shortcomings in the Chinese health care system are playing a role in the H7N9 epidemic. Along with rural people’s
tendency to avoid Western-style medicine as too expensive, it’s routine for hospitals in China to demand payment upfront, before any tests or treatment takes place. Families are
known to trundle ailing relatives on stretchers (or sometimes on stretched blankets) from clinic to clinic, trying to find someplace they can afford. “Everybody feels the same way
as I do,” Yin says. “If the illness doesn’t kill you, the medical bills will.”

And any delay in receiving treatment for H7N9 is dangerous, physicians say. Although nearly 40 percent of people known
to be infected with H7N9 have died so far, the odds of surviving may be much higher if medication such as the antiviral oseltamivir, known as Tamiflu, could be administered
within 24 to 48 hours. "Chinese with H7N9 usually take two days to see a doctor, another four days to check into a hospital, and then on Day 5 or 6 they get Tamiflu," says Chin-Kei Lee, the medical officer for emerging infectious diseases at the WHO China office. "Often people die within 14 days. So especially in rural
areas, it's hard to get treated in time—even if doctors do everything right."

Though health authorities worldwide acknowledge that China is often an influenza epicenter, most Chinese people themselves
don’t receive an annual flu shot. The logistics of administering mass vaccinations to a nation of
more than one billion are daunting. While nearly half of Americans receive seasonal flu vaccinations, only about 2 percent of Chinese do. "Not enough,"
admits Lee. “We always want to do better than yesterday.”

Earlier this year, Lee was one of 25 experts who gathered in Beijing under the umbrella of the United Nations to discuss the H7N9 threat. The meeting reviewed some of the
measures in place at live-bird markets—such as mandatory weekly disinfection and bans on keeping poultry overnight—and concluded that they were insufficient.

Despite such shortcomings, Western experts say Chinese officials have come a long way since their wobbly handling of the 2002 outbreak of SARS, the severe respiratory disease
caused by a previously unknown coronavirus; Chinese apparatchiks initially tried to cover up the epidemic, creating a worldwide scandal. But after the first H7N9 outbreak in
2013, Webster observes, Chinese authorities did “exactly what should have been done. You need to get the word out as fast as possible, with transparency and urgency, so the
world can respond.”

Global cooperation is crucial. Along China's southwestern underbelly lies a string of less developed countries such as Laos, Vietnam and
Myanmar. (The last of these is of particular concern, since it imports large amounts of Chinese poultry.) Some of China's border regions are
themselves relatively impoverished, raising the possibility of persistent and recurring
outbreaks on both sides of the rugged frontier.

And – US-China data sharing now – lunar landing


Huang 19 [Echo Huang is a reporter based in Hong Kong, where she covers China's
technology and business. 1-15-2019 https://qz.com/1523812/nasa-and-china-cooperated-in-
recent-landing-on-far-side-of-the-moon/]
The small ways NASA still cooperates with China’s space program, despite a ban
In its recent landing on the far side of the Moon, China had help from scientists from a handful of countries, while more and more
institutes around the world are cooperating with China in space exploration. NASA, however, is left out, thanks to restrictions
imposed by the US government since 2011.

The US banned the space agency from working with China and its state-owned companies out of concerns regarding national
security and technology transfers. As a result, China was locked out of the International Space Station because NASA is one of the
participating bodies. More recently, scientists from other countries such as Germany and Sweden who were helping China with its
exploration of the far side of the Moon were cautious of not falling afoul of US export controls on sensitive technology.

China’s space agency, however, announced that the two countries had shared data on its exploration of the
far side of the Moon. “Cooperation is the joint will of scientists,” said Wu Yanhua, deputy director
of China’s National Space Agency in a press conference yesterday (Jan. 14). He also noted that both organizations
have met “frequently.”
According to Wu, China had discussed the possibility of NASA using its Lunar Reconnaissance Orbiter
(LRO), which orbits the Moon, to observe the landing of the Chang’e-4 spacecraft on the lunar far side. Wu said that
China had told NASA the exact landing time and position of the spacecraft, but the LRO wasn’t in the right position to do so as it
wasn’t able to adjust its orbit with what little fuel it had left. Before the touchdown on Jan. 3, the LRO managed to capture pictures
of the landing site.

In a statement last week, NASA confirmed those discussions, and said that “for a number of reasons” the LRO wasn’t
able to be at the optimal location for the landing. But it added that the orbiter has been collecting data since
Chang’e-4’s arrival on the far side, and will take photos of the landing site on Jan. 31. The agencies have
agreed to share data on significant findings, if any, at a meeting of a subcommittee of the UN Committee on the
Peaceful Uses of Outer Space to be held next month. “NASA’s cooperation with China is transparent, reciprocal
and mutually beneficial,” the US space agency said.

NASA’s administrator, James Bridenstine, in an earlier interview with Quartz (paywall), had also said that the
agency can share data with China. “When they do a science mission to the Moon, we’re hopeful they will be able
to share with us the data they receive, and when we do a mission to the moon, we can share data with them,”
Bridenstine told Quartz. “Understanding and characterizing the Moon and doing that kind of science is in the interest of
all humanity. It’s not something any one country should try to retain for itself.”

The plan isn’t “growth,” but growth is also sustainable, solves global
problems, and requires no mindset shift
Lomborg 13 – Bjørn Lomborg, Adjunct Professor at the Copenhagen Business School, “The
Limits to Panic”, Project Syndicate, 6-17, http://www.project-
syndicate.org/commentary/economic-growth-and-its-critics-by-bj-rn-lomborg
The genius of The Limits to Growth was to fuse these worries with fears of running out of stuff.
We were doomed, because too many people would consume too much. Even if our ingenuity
bought us some time, we would end up killing the planet and ourselves with pollution. The only
hope was to stop economic growth itself, cut consumption, recycle, and force people to have
fewer children, stabilizing society at a significantly poorer level.

That message still resonates today, though it was spectacularly wrong. For example, the authors
of The Limits to Growth predicted that before 2013, the world would have run out of aluminum,
copper, gold, lead, mercury, molybdenum, natural gas, oil, silver, tin, tungsten, and zinc.
Instead, despite recent increases, commodity prices have generally fallen to about a third of
their level 150 years ago. Technological innovations have replaced mercury in batteries, dental
fillings, and thermometers: mercury consumption is down 98% and, by 2000, the price was
down 90%. More broadly, since 1946, supplies of copper, aluminum, iron, and zinc have
outstripped consumption, owing to the discovery of additional reserves and new technologies to
extract them economically.
Similarly, oil and natural gas were to run out in 1990 and 1992, respectively; today, reserves of
both are larger than they were in 1970, although we consume dramatically more. Within the past
six years, shale gas alone has doubled potential gas resources in the United States and halved
the price.
As for economic collapse, the Intergovernmental Panel on Climate Change estimates that global
GDP per capita will increase 14-fold over this century and 24-fold in the developing world.
The Limits to Growth got it so wrong because its authors overlooked the greatest resource of all:
our own resourcefulness. Population growth has been slowing since the late 1960’s. Food
supply has not collapsed (1.5 billion hectares of arable land are being used, but another 2.7
billion hectares are in reserve). Malnourishment has dropped by more than half, from 35% of
the world’s population to under 16%.
Nor are we choking on pollution. Whereas the Club of Rome imagined an idyllic past with no
particulate air pollution and happy farmers, and a future strangled by belching smokestacks,
reality is entirely the reverse.
In 1900, when the global human population was 1.5 billion, almost three million people –
roughly one in 500 – died each year from air pollution, mostly from wretched indoor air. Today,
the risk has receded to one death per 2,000 people. While pollution still kills more people than
malaria does, the mortality rate is falling, not rising.
Nonetheless, the mindset nurtured by The Limits to Growth continues to shape popular and
elite thinking.
Consider recycling, which is often just a feel-good gesture with little environmental benefit and
significant cost. Paper, for example, typically comes from sustainable forests, not rainforests.
The processing and government subsidies associated with recycling yield lower-quality paper to
save a resource that is not threatened.

Likewise, fears of over-population framed self-destructive policies, such as China’s one-child
policy and forced sterilization in India. And, while pesticides and other pollutants were seen to
kill off perhaps half of humanity, well-regulated pesticides cause about 20 deaths each year in
the US, whereas they have significant upsides in creating cheaper and more plentiful food.
Indeed, reliance solely on organic farming – a movement inspired by the pesticide fear – would
cost more than $100 billion annually in the US. At 16% lower efficiency, current output would
require another 65 million acres of farmland – an area more than half the size of California.
Higher prices would reduce consumption of fruits and vegetables, causing myriad adverse
health effects (including tens of thousands of additional cancer deaths per year).

Obsession with doom-and-gloom scenarios distracts us from the real global threats. Poverty is
one of the greatest killers of all, while easily curable diseases still claim 15 million lives every
year – 25% of all deaths.
The solution is economic growth. When lifted out of poverty, most people can afford to
avoid infectious diseases. China has pulled more than 680 million people out of poverty in the
last three decades, leading a worldwide poverty decline of almost a billion people. This has
created massive improvements in health, longevity, and quality of life.

The four decades since The Limits to Growth have shown that we need more of it, not less. An
expansion of trade, with estimated benefits exceeding $100 trillion annually toward the end of
the century, would do thousands of times more good than timid feel-good policies that result
from fear-mongering. But that requires abandoning an anti-growth mentality and using our
enormous potential to create a brighter future.

Large scale societal transformation causes transition wars


Mead, 12 -- Professor of Foreign Affairs and Humanities at Bard College [7/28/2012, Walter
Russell, The American Interest, “The Energy Revolution 4: Hot Planet?” http://blogs.the-
american-interest.com/wrm/2012/07/28/the-energy-revolution-4-hot-planet/]

Capitalism is not, Monbiot is forced to admit, a fragile system that will easily be replaced. Bolstered by
huge supplies of oil, it is here to stay. Industrial civilization is, as far as he can now see, unstoppable.
Gaia, that treacherous slut, has made so much oil and gas that her faithful acolytes today cannot protect her from the consequences
of her own folly. Welcome to the New Green Doom: an overabundance of oil and gas is going to release so much greenhouse gas that
the world is going to fry. The exploitation of the oil sands in Alberta, warn leading environmentalists, is a tipping point. William
McKibben put it this way in an interview with Wired magazine in the fall of 2011: I think if we go whole-hog in the tar sands, we’re
out of luck. Especially since that would doubtless mean we’re going whole-hog at all the other unconventional energy sources we can
think of: Deepwater drilling, fracking every rock on the face of the Earth, and so forth. Here’s why the tar sands are important: It’s a
decision point about whether, now that we’re running out of the easy stuff, we’re going to go after the hard stuff. The Saudi Arabian
liquor store is running out of bottles. Do we sober up, or do we find another liquor store, full of really crappy booze, to break into? A
year later, despite the success of environmentalists like McKibben at persuading the Obama administration to block a pipeline
intended to ship this oil to refineries in the US, it’s clear (as it was crystal clear all along to anyone with eyes to see) that the world
has every intention of making use of the “crappy liquor.” Again, for people who base their claim to world leadership on their superior
understanding of the dynamics of complex systems, greens
prove over and over again that they are
surprisingly naive and crude in their ability to model and to shape the behavior of the
political and economic systems they seek to control. If their understanding of the future of the earth’s
climate is anything like as wish-driven, fact-averse and intellectually crude as their approach to international affairs, democratic
politics and the energy market, the greens are in trouble indeed. And as I’ve written in the past, the contrast between green claims to
understand climate and to be able to manage the largest and most complex set of policy changes ever undertaken, and the evident
incompetence of greens at managing small (Solyndra) and large (Kyoto, EU cap and trade, global climate treaty) political projects
today has more to do with climate skepticism than greens have yet understood. Many people aren’t rejecting science; they are
rejecting green claims of policy competence. In doing so, they are entirely justified by the record. Nevertheless, the future of
the environment is not nearly as dim as greens think. Despairing environmentalists like McKibben
and Monbiot are as wrong about what the new era of abundance means as green energy analysts
were about how much oil the planet had. The problem is the original sin of much environmental thought:
Malthusianism. If greens weren’t so addicted to Malthusian horror narratives they would be
able to see that the new era of abundance is going to make this a cleaner planet
faster than if the new gas and oil had never been found. Let’s be honest. It has long been clear to students of
history, and has more recently begun to dawn on many environmentalists, that all that happy-clappy carbon treaty stuff was a pipe
dream and that nothing like that is going to happen. A humanity that hasn’t been able to ban the bomb despite the clear and present
dangers that nuclear weapons pose isn’t going to ban or even seriously restrict the internal combustion engine and the generator.
The political efforts of the green movement to limit greenhouse gasses have had very little effect so far, and it is highly unlikely that
they will have more success in the future. The green movement has been more of a group hug than a curve
bending exercise, and that is unlikely to change. If the climate curve bends, it will bend the way the population curve did: as
the result of lots of small human decisions driven by short term interest calculations rather than as the result of a grand global plan.
The shale boom hasn’t turned green success into green failure. It’s prevented green failure from
turning into something much worse. Monbiot understands this better than McKibben; there was never any real doubt
that we’d keep going to the liquor store. If we hadn’t found ways to use all this oil and gas, we wouldn’t have embraced the
economics of less. True, as oil and gas prices rose, there would be more room for wind and solar power, but the real winner of an oil
and gas shortage is… coal. To use McKibben’s metaphor, there is a much dirtier liquor store just down the road from the shale
emporium, and it’s one we’ve been patronizing for centuries. The US and China have oodles of coal, and rather than walk to work
from our cold and dark houses all winter, we’d use it. Furthermore, when and if the oil runs out, the technology exists to get liquid
fuel out of coal. It isn’t cheap and it isn’t clean, but it works. The newly bright oil and gas future means that we aren’t entering a new
Age of Coal. For this, every green on the planet should give thanks. The second reason why greens should give thanks for shale is that
environmentalism is a luxury good. People must survive and they will survive by any means necessary. But they would
much rather thrive than merely survive, and if they can arrange matters better, they will. A poor society near the edge of
survival will dump the industrial waste in the river without a second thought. It will burn
coal and choke in the resulting smog if it has nothing else to burn. Politics in an age of survival is
ugly and practical. It has to be. The best leader is the one who can cut out all the fluff and the folderol and keep you alive
through the winter. During the Battle of Leningrad, people burned priceless antiques to stay alive for
just one more night. An age of energy shortages and high prices translates into an age of
radical food and economic insecurity for billions of people. Those billions of hungry,
frightened, angry people won’t fold their hands and meditate on the ineffable wonders of Gaia
and her mystic web of life as they pass peacefully away. Nor will they vote George Monbiot and Bill McKibben into power.
They will butcher every panda in the zoo before they see their children starve, they will
torch every forest on earth before they freeze to death, and the cheaper and the meaner their lives are, the
less energy or thought they will spare to the perishing world around them. But, thanks to shale and other unconventional energy
sources, that isn’t where we are headed. We are heading into a world in which energy is abundant and
horizons are open even as humanity’s grasp of science and technology grows more secure. A
world where more and more basic human needs are met is a world that has time to think
about other goals and the money to spend on them. As China gets richer, the Chinese
want cleaner air, cleaner water, purer food — and they are ready and able to pay for them. A
Brazil whose economic future is secure can afford to treasure and conserve its rain forests. A
Central America where the people are doing all right is more willing and able to preserve its biodiversity. And a world in
which people know where their next meal is coming from is a world that can and will take
thought for things like the sustainability of the fisheries and the protection of the coral
reefs. A world that is more relaxed about the security of its energy sources is going to be able to do more about improving the
quality of those sources and about managing the impact of its energy consumption on the global commons. A rich, energy
secure world is going to spend more money developing solar power and wind power and other
sustainable sources than a poor, hardscrabble one. When human beings think their basic problems are solved,
they start looking for more elegant solutions. Once Americans had an industrial and modern economy, we
started wanting to clean up the rivers and the air . Once people aren’t worried about
getting enough calories every day to survive, they start wanting healthier food more elegantly
prepared. A world of abundant shale oil and gas is a world that will start imposing more environmental regulations on shale and gas
producers. A prosperous world will set money aside for research and development for new
technologies that conserve energy or find it in cleaner surroundings. A prosperous world facing
climate change will be able to ameliorate the consequences and take thought for the future
in ways that a world overwhelmed by energy insecurity and gripped in a permanent
economic crisis of scarcity simply can’t and won’t do. Greens should also be glad that the new energy
is where it is. For Monbiot and for many others, Gaia’s decision to put so much oil into the United States and Canada seems like her
biggest indiscretion of all. Certainly, a United States of America that has, in the Biblical phrase, renewed its youth like an eagle with
a large infusion of fresh petro-wealth is going to be even less eager than formerly to sign onto various pie-in-the-sky green carbon
treaties. But think how much worse things would be if the new reserves lay in dictatorial kleptocracies. How willing and able would
various Central Asia states have been to regulate extraction and limit the damage? How would Nigeria have handled vast new
reserves whose extraction required substantially more invasive methods? Instead, the
new sources are concentrated in
places where environmentalists have more say in policy making and where, for all the
shortcomings and limits, governments are less corruptible, more publicly accountable and in
fact more competent to develop and enforce effective energy regulations. This won’t satisfy McKibben
and Monbiot (nothing that could actually happen would satisfy either of these gentlemen), but it is a lot better than what we could be
facing. Additionally, if there are two countries in the world that should worry carbon-focused greens more than any other, they are
the United States and China. The two largest, hungriest economies in the world are also home to enormous coal reserves. But based
on what we now know, the US and China are among the biggest beneficiaries of the new cornucopia. Gaia put the oil and the gas
where, from a carbon point of view, it will do the most good. In
a world of energy shortages and insecurity, both
the US and China would have gone flat out for coal. Now, that is much less likely. And there’s one more
reason why greens should thank Gaia for shale. Wind and solar aren’t ready for prime time now, but by the time the new sources
start to run low, humanity will have mastered many more technologies that can used to provide energy and to conserve it. It’s likely
that Age of Shale hasn’t just postponed the return of coal: because of this extra time, there likely will never be another age in which
coal is the dominant industrial fuel. It’s virtually certain that the total lifetime carbon footprint of the human race is going to be
smaller with the new oil and gas sources than it would have been without them. Neither the world’s energy problems nor its climate
issues are going away any time soon. Paradise is not beckoning just a few easy steps away. But the new availability of these energy
sources is on balance a positive thing for environmentalists as much as for anyone else. Perhaps, and I know this is a heretical
thought, but perhaps Gaia is smarter than the greens.

Crisis causes casino capitalist efforts to rebound---that’s worse


Trainer 95 – Ted Trainer, Teaches at the School of Social Work at the University of New
South Wales, The Conserver Society: Alternatives for Sustainability. p. 78-79
It has been evident since the mid-1970s that the global economy is in considerable
trouble. Growth rates have been low, inflation and unemployment rates have been high, and debt has risen to extraordinary
levels. This critical state is basically caused by the fact that manufacturers can't sell all the goods they
can produce. They can't find profitable investment outlets for all that constantly accumulating capital. Obviously an
economy which doubles the amount of capital available per person every 20 years will soon set its people an impossible and farcical
problem of how to consume all the goods that can be produced, and must be produced if all that capital is to be profitably invested.
Now that they can't make normal profits, producing more useful goods, what they are doing is
speculating, i.e., gambling. In the last decade or so there has been a marked increase in
gambling on the share markets (hence the 1987 crash), in financial markets, on
commodity prices, and in company takeovers. Indeed this has been labeled the era of
‘casino capitalism’ (Strange 1986). Since the end of the long boom there has been an
accelerating process of restructuring within the global economy in an effort to
restore the conditions that will permit normal profits to be made again. Corporations
have relocated plants, streamlined operations and worked for greater access to a more unified world market. It is important to them
not to have to get permission to deal with this region and then that one, but to be able to put their goods and services on sale in, if
possible, a single global market-place. Governments are desperate to ‘get their economies going’,
so they accommodate to these demands of business by opening their countries to the
activities of foreign corporations, deregulating economies, and privatizing and thereby reducing
government activities, expenditures and taxes on firms. Getting the economy going involves giving the global
business sector more of what it wants: greater access, fewer restrictions, less protection for local
firms, lower taxes, a more compliant workforce and fewer trade barriers. From here on
the crisis is likely to deepen, especially because of worsening resource, energy and environmental costs and because of the
polarisation that condemns most people in the world to very low incomes and gives them little chance of becoming significant
consumers. Consequently the growth and affluence economy has a powerful tendency to focus only on the relatively small sector
where a few higher-income people purchase and can get jobs and where the profitable investments are to be found. Meanwhile
desperate politicians and economists jump at the chance to invest in mega-buck developments like the
Eastern Creek Motorcycle Speedway in Sydney, because these mean more investment, turnover, subcontracts and jobs, and after all
isn't that development and progress? Evidently it is beyond the capacity of conventional economists and politicians to grasp the vast
gulf between this merely capitalist development and appropriate development, i.e., development of the landscape, cooperatives,
farms, workshops and arrangements that would enable communities to flourish. They will
scramble to get the
economy going in the way they know, especially by giving the foreign corporations
more favorable conditions, cutting state spending and binding us more tightly into
the unifying global economy.

Collapse causes struggle for control of resources---turns every impact


Monbiot 9 – George Monbiot, Columnist for The Guardian, has held visiting fellowships or
professorships at the universities of Oxford (environmental policy), Bristol (philosophy), Keele
(politics), Oxford Brookes (planning), and East London (environmental science), 8-17, “Is There
Any Point In Fighting To Stave Off Industrial Apocalypse?,”
http://www.guardian.co.uk/commentisfree/cif-green/2009/aug/17/environment-climate-
change

From the second and third observations, this follows: instead of gathering as free
collectives of happy householders, survivors of this collapse will be subject to the will
of people seeking to monopolise remaining resources. This will is likely to be imposed
through violence. Political accountability will be a distant memory. The chances of
conserving any resource in these circumstances are approximately zero. The
human and ecological consequences of the first global collapse are likely to persist for many
generations, perhaps for our species' remaining time on earth. To imagine that good could
come of the involuntary failure of industrial civilisation is also to succumb to denial. The
answer to your question – what will we learn from this collapse? – is nothing.

This is why, despite everything, I fight on. I am not fighting to sustain economic growth. I am
fighting to prevent both initial collapse and the repeated catastrophe that follows. However faint
the hopes of engineering a soft landing – an ordered and structured downsizing of the global
economy – might be, we must keep this possibility alive. Perhaps we are both in denial: I, because I
think the fight is still worth having; you, because you think it isn't.

Neolib is sustainable
--accelerating market-driven growth solves poverty and resources
--consumption will stabilize
--tech solves

Saunders 16 – MA in Environment and Planning, PhD in engineering-economic systems from Stanford University (Harry, “Does Capitalism Require Endless Growth?,” The Breakthrough Institute, http://thebreakthrough.org/index.php/journal/issue-6/does-capitalism-require-endless-growth)
6. There is another path to stable and declining calls on the planet’s endowment of natural
capital. Malthus erred not only because he failed to understand the relationship between fertility
rates and food consumption but also because he underestimated the rate at which agricultural
productivity would improve. By growing more food on every acre of land, human societies
avoided mass starvation. More broadly, rising economic productivity due to technological
advances raises incomes, creates economic surplus that can be reinvested in new capital and
infrastructure, and produces more economic output from less natural capital input. So long as
there are large populations living in deep poverty, gains in economic productivity will be put
toward greater output, assuring that some or all of the efficiencies associated with productivity
gains will be put toward greater production and consumption. But once everyone on the
planet achieves a satisfactory level of consumption, consumption of goods and
services should stabilize while calls on natural capital should stabilize and then
decline .34 By satisfactory levels of consumption, what I mean is a standard of living that
would be recognizable to the average citizen of an advanced developed economy — modern
housing, an ample and diverse diet, sufficient electricity for run-of-the-mill household
appliances, roads, hospitals, well-lit public spaces, garbage collection, and so on. The
saturation of demand for goods and services in advanced developed economies in the latter
half of the twentieth century provides a reasonable proxy for the point at which most people
start to see diminishing utility from further household consumption. In a zero-growth world, in
which household consumption has saturated while labor- and resource-sparing technological
change continues, leisure time grows continually over time while societal calls on natural capital
decline.35 Given these conditions, how quickly a zero-growth economy is achieved, and calls on
natural capital globally peak and then decline, depends upon three closely related phenomena:
how rapidly global population stabilizes, how rapidly incomes among the global poor rise, and
the rate at which resource-sparing technological change occurs. 7. Getting to a zero-growth
steady state economy with declining calls on natural capital will require, then, sustaining — or
better yet, accelerating — two trends that capitalism has proven better able to advance than
any alternative economic arrangement to date: lifting large agrarian populations out of poverty,
and improving resource productivity through technological change. The former, as noted above,
is also the key to stabilizing global population.
engagement
US-China coop vital to address regional pollution – independent data
verification key
Orts 18 [Eric W. Orts is the Guardsmark Professor at the Wharton School of the University of Pennsylvania, 7-2-2018, https://knowledge.wharton.upenn.edu/article/chinese-environmental-regulation/]
A new paper published earlier this month by the National Academy of Sciences has found a patchy and troubling record for coal-fired power
plants in China in their compliance with the country’s anti-pollution laws. It found inconsistencies between pollution data recorded by plant-level continuous emissions monitoring systems, or CEMS, and remote sensing data captured by satellites of the National Aeronautics and Space Administration (NASA). Those mismatches occurred in areas with relatively higher pollution levels, where regulatory standards may have been difficult to meet, potentially prompting falsification or misreporting of data. One big takeaway from the research is that environmental regulation ought to factor in the development path and economic reforms of a country, so that it is not overly heavy-handed so as to encourage unethical conduct, according to experts at Wharton and elsewhere. The other takeaway is that remote sensing data could be more widely used to monitor compliance with environmental regulations — not just in China, but also the U.S. “The stakes are really high in China, because China is the world’s largest energy user, and
nearly 80% of its electricity production comes from coal,” said Valerie Karplus, professor of global economics and management at MIT’s Sloan School of Management and a co-author of the paper. Also, many people in China live in close proximity to those power plants, she added. Remote sensing data could be used to inform policy-making and to enhance the understanding of how firms respond to
pollution-control regulations, she noted. Titled “Quantifying Coal Power Plant Responses to Tighter SO2 Emissions Standards in
China,” the paper’s other co-authors are Shuang Zhang, professor of economics at the University of Colorado, Boulder; and Douglas Almond, professor
of economics and international and public affairs at Columbia University. They tracked sulfur dioxide emissions between July 2014 and July 2016 at
256 coal-fired power plants in China. Data Disconnects The study found that the monitoring systems at the 256 plants reported a 13.9% fall in sulfur
dioxide emissions. However, data captured by NASA’s satellites found higher pollution levels than what the monitors logged at 113 plants in so-called
“key regions,” or places with higher populations and pollution levels than “non-key regions.” “A potential explanation for the discrepancy in the
two data sources in key regions is that plants overstated or falsified reductions,” the paper stated. “The stricter new standards and greater pressure to
comply may have generated incentives for plant managers to falsify or selectively omit concentration data.” “The stakes are really high in China,
because China is the world’s largest energy user, and nearly 80% of its electricity production comes from coal.”–Valerie Karplus Remote-sensing data
helps understand “responses on the ground, especially in developing countries, where there is, at least in the past, a widespread idea that maybe the
rule of law is weak or that there are challenges in governance and actually in implementing standards at plants,” said Karplus. It would help policy
makers also understand how firms respond to environmental policy against the backdrop of “ongoing economic reforms and development trajectory of
the country,” she said. “It’s important to realize that environmental policy doesn’t exist in a vacuum and that lots of supporting rules and institutions
can have an impact on how businesses manage their environmental footprint.” Shaping Environmental Regulations The research findings could bring
lessons in how stringent environmental regulations ought to be in order to be successful, according to Eric Orts, Wharton professor of legal studies and
business ethics who is also faculty director of the school’s Initiative for Global Environmental Leadership. “This is speculative, but one conclusion from
this research is that in the key areas where the Chinese government wanted to regulate heavily, they were not able to do it,” he said. Those key regions
include Beijing, Shanghai and other cities in the greater Beijing–Tianjin–Hebei area, the Pearl River Delta and the Yangtze River Delta. “One
hypothesis would be they just gave up and [tried to] figure out some way to not get in trouble [for non-compliance],” Orts said as a possible explanation
for misreporting or falsification of data. Under China’s anti-pollution regulations, sulfur dioxide emissions were set at 50 milligrams per cubic meter in
the key regions and at 200 milligrams in non-key regions that were relatively less polluted. The new regulations required 14,140 firms to post hourly
data on emissions on publicly available online platforms from 2014. “Where it was a little bit of a more moderate target, it seems that the satellite data
confirms the on-the-ground reporting that there was success [in reducing emissions],” Orts noted. “ Regulating heavily does create
some perverse disincentives for action and actually runs counter to your own environmental regulatory goals,” said Jackson Ewing, a
senior fellow at Duke University’s Nicholas Institute of Environmental Policy Solutions and an adjunct professor at the university’s Sanford School of
Public Policy. “Some
of China’s regulations in the past couple of years have taken fairly onerous approaches to
quickly reducing the number of heavy air pollution days, particularly in major metropolitan areas.” Those approaches have
had “unintended consequences,” including temporary shutdowns at factories and even the shutdown of heating in cold months
“when households really need that heat … with some obvious human impacts there,” Ewing pointed out. “So while China’s climate-change, domestic
environmental goals and stricter regulation are certainly laudable, you do see second- and third-order effects that include potentially not
putting forward numbers that we can believe in, and some consequences for human development and economic growth …
that we would like to see rolled back.” Why Measurement Is Important “From the U.N.-led efforts on climate change to China’s own domestic goals,
measurement [of pollution] has never been more important,” said Ewing. “We essentially have created a system in
which we’re just calling upon countries to, in good faith, report what they are doing on greenhouse gas mitigation and to come forward to their
international peers on a regular basis using similar methodologies for measuring and showing the results of their efforts. It’s
through that
measuring and reporting that we will have the fundamental starting point for negotiations and discussions
about how targets, goals and efforts can be scaled up, how resources can be shared and how cooperation can take hold.” “ If the core
reporting is problematic or inaccurate, then the entire system rests on a house of cards.”–Jackson Ewing. Any shortcomings in such environmental reporting could of course frustrate the goals. “If that core
reporting is problematic or inaccurate, then the entire system rests on a house of cards,” said Ewing. In that context, satellite monitoring would be
useful in the second- and third-party reviews of the actions declared by countries, he added. Karplus said one
innovation in their paper is the
ability to use satellite data for real-time monitoring of emissions. “The effects of air
pollution on human health depend on the timing of emissions,” she explained. “Being able to resolve emissions on
an hourly basis is incredibly important to thinking about when, where and how these plants are cleaning up.” Putting in
place independent verification measures is important because many of the mechanisms are
subject to human oversight, she added. Need to Strengthen Compliance According to Orts, NASA satellite data
verification could be used in the U.S. as well to influence adherence to environmental
standards. “There’s some concern that [Scott Pruitt], the secretary of the Environmental Protection Agency, might not be so keen on enforcing
the basic environmental laws in the country,” he said. NASA’s satellites are orbiting the earth constantly and pick up changes in air pollution as well as
a range of other substances, said Karplus. The ozone-monitoring instrument that captures such data has improved over time to provide accurate and
real-time measurements, she added. Many countries have installed CEMS at their power plants, but their adoption has to be significantly expanded,
including in China, to generate real-time data, Karplus said. In the U.S., CEMS are being used both for greenhouse gases as well as for local air
pollutants, “and this signals the state of the art,” she added. China does not yet cover carbon dioxide emissions as part of its monitoring systems at coal-
fired power plants, she noted, pointing to areas for improvement. Towards Global Environmental Governance The research study also holds pointers
for “global environmental governance” mechanisms, as they are made possible with remote
sensing capabilities, said Orts. With independently verified plant-level data, “you are able to then focus your policy development better,
because you have research driving that,” he added. “What you don’t want is a situation where you’re a business and you are

complying with the local regulatory scheme, but you’re losing out to someone who is cheating and
gaming the situation.”–Eric Orts. Businesses will also find comfort in the regulatory consistency and trustworthiness of data it could bring about, said Orts. “What you don’t want is a situation where you’re a business and you are complying with the local regulatory scheme, but you’re losing out to someone who is cheating and gaming the situation or paying off a government official or whatever.” Challenges in China — China has
worked resolutely to address its environmental challenges in recent years, according to Karplus. She pointed to, for example,
its efforts to set up an emissions/carbon trading system, and to do third and fourth-party verification checks on all plant reported data. “It’s important
that we acknowledge how far China has come in making data more available and more transparent both to a domestic and international scholarly
audience, [although] there’s always a long way to go,” she added. “If
we are able to effectively monitor the impact of
particular policies, we will have a better chance of measuring and understanding what’s working,
what’s working well and what’s working poorly and adjusting our policies in turn as we go
forward,” said Ewing. “China’s a fascinating laboratory for that,” he added. “They have command-and-control measures to curtail emissions from
industries, transportation, everyday activities and housing, while they also have market mechanisms such as feed-in tariffs, the emissions trading
scheme, etc.”

Regional pollution causes melting of the Tibetan plateau and conflict


Vince 19 [Gaia Vince is a freelance science reporter based in London, previously news editor
of Nature and online editor of New Scientist 9-15-2019
https://www.theguardian.com/environment/2019/sep/15/tibetan-plateau-glacier-melt-ipcc-
report-third-pole]

Khawa Karpo lies at the world’s “third pole”. This is how glaciologists refer to the Tibetan plateau, home to the vast Hindu Kush-Himalaya ice sheet, because it contains the largest amount of snow and ice after the Arctic and Antarctic – the
Chinese glaciers alone account for an estimated 14.5% of the global total. However, a quarter of its ice
has been lost since 1970. This month, in a long-awaited special report on the cryosphere by the Intergovernmental Panel on Climate Change (IPCC), scientists will
warn that up to two-thirds of the region’s remaining glaciers are on track to disappear by the end of

the century. It is expected a third of the ice will be lost in that time even if the internationally agreed target of limiting global warming by 1.5C above pre-industrial
levels is adhered to.

Whether we are Buddhists or not, our lives affect, and are affected by, these tropical glaciers that span eight countries. This frozen “water tower of Asia” is the source of 10 of the
world’s largest rivers, including the Ganges, Brahmaputra, Yellow, Mekong and Indus, whose flows support at least 1.6 billion people directly – in drinking water, agriculture,
hydropower and livelihoods – and many more indirectly, in buying a T-shirt made from cotton grown in China, for example, or rice from India.
Joseph Shea, a glaciologist at the University of Northern British Columbia, calls the loss “depressing and fear-inducing. It changes the nature of the mountains in a very visible
and profound way.”

Yet the fast-changing conditions at the third pole have not received the same attention as those at the north and
south poles. The IPCC’s fourth assessment report in 2007 contained the erroneous prediction that all Himalayan glaciers would be gone by 2035. This statement
turned out to have been based on anecdote rather than scientific evidence and, perhaps out of embarrassment, the third pole has been given less attention in subsequent IPCC
reports.

There is also a dearth of research compared to the other poles, and what hydrological data exists has been jealously guarded by the Indian government and other interested
parties. The Tibetan plateau is a vast and impractical place for glaciologists to work in and confounding factors make measurements hard to obtain. Scientists are forbidden by
locals, for instance, to step out on to the Mingyong glacier, meaning they have had to use repeat photography to measure the ice retreat.

In the face of these problems, satellites have proved invaluable, allowing scientists to watch glacial shrinkage in real time. This summer, Columbia University researchers also
used declassified spy-satellite images from the cold war to show that third pole ice loss has accelerated over this century and is now roughly double the melt rate of 1975 to 2000,
when temperatures were on average 1C lower. Glaciers in the region are currently losing about half a vertical metre of ice per year because of anthropogenic global heating, the
researchers concluded. Glacial melt here carries significant risk of death and injury – far more than in the sparsely populated Arctic and Antarctic – from glacial lake outbursts
(when a lake forms and suddenly spills over its banks in a devastating flood) and landslides caused by destabilised rock. Whole villages have been washed away and these events
are becoming increasingly regular, even if monitoring and rescue systems have improved. Satellite data shows that numbers and sizes of such risky lakes in the region are
growing. Last October and November, on three separate occasions, debris blocked the flow of the Yarlung Tsangpo in Tibet, threatening India and Bangladesh downstream with
flooding and causing thousands to be evacuated.

One reason for the rapid ice loss is that the Tibetan plateau, like the other two poles, is warming at a rate up to three times as fast as the global average, by 0.3C per decade. In
the case of the third pole, this is because of its elevation, which means it absorbs energy from rising, warm, moisture-laden air. Even if average global temperatures stay below
1.5C, the region will experience more than 2C of warming; if emissions are not reduced, the rise will be 5C, according to a report released earlier this year by more than 200
scientists for the Kathmandu-based International Centre for Integrated Mountain Development (ICIMOD). Winter snowfall is already decreasing and there are, on average, four
fewer cold nights and seven more warm nights per year than 40 years ago. Models also indicate a strengthening of the south-east monsoon, with heavy and unpredictable
downpours. “This is the climate crisis you haven’t heard of,” said ICIMOD’s chief scientist, Philippus Wester.

There is another culprit besides our CO2 emissions in this warming story, and it’s all too evident on the dirty surface of the Mingyong glacier: black carbon, or soot. A 2013 study

found that black carbon is responsible for 1.1 watts per square metre of the Earth’s surface of extra energy being stored in the atmosphere (CO2 is responsible for an estimated 1.56 watts per square metre). Black carbon has multiple climate effects, changing clouds and monsoon circulation as well as accelerating ice melt. Air pollution from the Indo-Gangetic Plains – one of the world’s most polluted regions – deposits this black dust on glaciers, darkening their surface and hastening melt. While

soot landing on dark rock has little effect on its temperature, snow and glaciers are particularly vulnerable
because they are so white and reflective. As glaciers melt, the surrounding rock crumbles in landslides,
covering the ice with dark material that speeds melt in a runaway cycle. The Everest base camp, for
instance, at 5,300 metres, is now rubble and debris as the Khumbu glacier has retreated to the icefall.

The immense upland of the third pole is one of the most ecologically diverse and vulnerable regions on Earth. People have only attempted to conquer these mountains in the last
century, yet in that time humans have subdued the glaciers and changed the face of this wilderness with pollution and other activities. Researchers are now beginning to
understand the scale of human effects on the region – some have experienced it directly: many of the 300 IPCC cryosphere report authors meeting in the Nepalese capital in July
were forced to take shelter or divert to other airports because of a freak monsoon.

But aside from such inconveniences, what do these changes mean for the 240 million people living in the mountains? Well, in many areas, it has been welcomed. Warmer,
more pleasant winters have made life easier. The higher temperatures have boosted agriculture – people can grow a greater variety of crops and benefit from more than one
harvest per year, and that improves livelihoods. This may be responsible for the so-called Karakoram anomaly, in which a few glaciers in the Pakistani Karakoram range are
advancing in opposition to the general trend. Climatologists believe that the sudden and massive growth of irrigated agriculture in the local area, coupled with unusual
topographical features, has produced an increase in snowfall on the glaciers which currently more than compensates for their melting.

Elsewhere, any increase in precipitation is not enough to counter the rate of ice melt and places that are wholly reliant on meltwater for irrigation are feeling the effects soonest.
“Springs have dried drastically in the past 10 years without meltwater and because infrastructure has cut off discharge,” says Aditi Mukherji, one of the authors of the IPCC
report.

Known as high-altitude deserts, places such as Ladakh in north-eastern India and parts of Tibet have already lost many of their lower-altitude glaciers and with them their
seasonal irrigation flows, which is affecting agriculture and electricity production from hydroelectric dams. In some places, communities are trying to geoengineer artificial
glaciers that divert runoff from higher glaciers towards shaded, protected locations where it can freeze over winter to provide meltwater for irrigation in the spring.

Only a few of the major Asian rivers are heavily reliant on glacial runoff – the Yangtze and Yellow rivers are showing reduced water levels because of diminished meltwater and the Indus (40% glacier-fed) and Yarkand (60% glacier-fed) are particularly vulnerable. So although mountain communities are suffering from glacial disappearance, those downstream are currently less affected because rainfall makes a much larger contribution to rivers such
as the Ganges and Mekong as they descend into populated basins. Upstream-downstream conflict over extractions, dam-

building and diversions has so far largely been averted through water-sharing treaties between nations,
but as the climate becomes less predictable and scarcity increases, the risk of unrest within – let alone between – nations grows.

Towards the end of this century, pre-monsoon water-flow levels in all these rivers will drastically reduce without glacier buffers, affecting agricultural output as well as hydropower generation, and these stresses will be compounded by an increase in the number and severity of devastating flash floods. “The impact on local water
resources will be huge, especially in the Indus Valley. We expect to see migration out of dry, high-altitude areas first but populations across the region will be affected,” says
Shea, also an author on the ICIMOD report.
As the third pole’s vast frozen reserves of fresh water make their way down to the oceans, they are contributing to sea-level rise that is already making life difficult in the heavily populated low-lying deltas and bays of Asia, from Bangladesh to Vietnam. What is more, they are releasing dangerous pollutants. Glaciers are time capsules, built snowflake by snowflake from the skies of the past and, as they melt, they deliver back into circulation the constituents of that archived air. Dangerous pesticides such as DDT (widely used for three decades before being banned in 1972) and perfluoroalkyl acids are now being washed downstream in meltwater and accumulating in sediments and in the food chain.


Ultimately, the future of this vast region, its people, ice sheets and arteries depends – just as Khawa Karpo’s devotees believe – on us: on reducing our emissions of greenhouse gases and other pollutants. As Mukherji says, many of the glaciers that haven’t yet melted have effectively “disappeared because in the dense air pollution, you can no longer see them”.

Tibetan water scarcity escalates


Ninkovic 13 – Nina Ninkovic, Doctorate and Master’s Degree in International Relations
from the Geneva School of Diplomacy and International Relations, and Jean-Pierre Lehmann,
Emeritus Professor of International Political Economy at IMD and Contributing Editor at The
Globalist, “Tibet and 21st Century Water Wars”, The Globalist, 7-11,
http://www.theglobalist.com/tibet-and-21st-century-water-wars/
The crucial global role that Asia will play in the 21st century cannot be underestimated. The pivot of global economic power is shifting east. Asia represents the new arena for analysis, power and influence. The narrative of the coming Asian century is dominated by a variety of factors — rising economic power, demography, ecology, fierce resource competition, water and food supply and security, as well as increasing military expenditure. In addition, Asia has the greatest number of geopolitical “hot spots” and nuclear powers. In the latter context, the Tibetan Plateau stands out. Strategically located between the two Asian giants, China and India, the Tibetan Plateau and its surroundings have come to represent Asia’s most critical 21st century battleground. Potentially, they may also be the world’s battleground. The narrative of this century and of Asia will be written, to a very large extent, in terms of what is the “hottest geopolitical issue” — water security. The Tibetan Plateau extends from the Hindu Kush in central Afghanistan, through Pakistan, India, Nepal,
Bhutan and onto the borders of Myanmar. The geopolitical significance of Tibet has been tremendous historically, too. It was invaded by Britain in 1904 for that reason. Forty-
five years later China’s Liberation occurred and, almost immediately afterwards, the People’s Liberation Army annexed Tibet in 1950-51. For Mao Zedong, the strategic
importance of Tibet was clear. It was fundamental to national security. The Tibetan Plateau acts as a major buffer zone and provides China with leverage over almost the entire
Eurasian continent. From a national security point of view, the vast barriers of the Tibetan Plateau shield China’s internal populace in the east from military aggression
originating from the west. This strategic importance was clearly manifested in the Sino-Indian War of 1962 — the only war in which the People’s Liberation Army has been
successful so far! Losing Tibet would be seen as greatly weakening China strategically and a national humiliation. Apart from its strategic location, the renewed importance of
Tibet for China lies in its water riches. China has long been eyeing the water reserves of Tibet, especially during and since the period of Mao (1949-1976). Located at a high altitude on an average of 4,500 meters, it is richly endowed with fresh water contained in its oxygen-deprived vast glaciers and huge underground reservoirs. It is in fact the largest repository of freshwater after the two poles, Arctic and Antarctic, thus claiming the sobriquet, the “third
pole.” Many of the world’s greatest rivers flow out of the Tibetan Plateau — the Yellow River, Yangtze Kiang,
Mekong, Salween, Sutlej and the Brahmaputra. More important, in terms of human geography,
almost half of the global population currently lives in the watershed of the Tibetan Plateau. This
explains the enormous importance of Tibetan freshwater for China. China, on the whole, is an extremely arid country. One quarter of the
country consists of deserts. China has severe water shortage challenges. At the same time, most of its rivers are either too polluted or are too silted to quench the thirst of 1.3
billion people. The basic internal issue for China regarding water security is to transfer fresh water from the Tibetan Plateau in the country’s west to its industrial and populated
corners in its north and east. This has resulted in a spree of building dams, canals, irrigation systems, pipelines and water diversion projects. As Brahma Chellaney points out in
his seminal work, “Water: Asia’s Next Battleground,” China has created more dams in the last five decades than the rest of the world combined, largely in order to divert the flow
of rivers from the south to its north and east corners. The end result was the diversion of routes of various rivers originating in the Tibetan Plateau. China considers such
diversions to be an internal security matter. But these inter-basin and inter-river water transfer projects in the Tibetan Plateau have tremendous consequences on other
downstream countries that draw water from those rivers. Thus, what was seen as a national concern for China in reality has vast external ramifications. Chinese policy so far has

been to seek to minimize issues to be negotiated with its neighbors. But diversion of rivers could boil into a hot conflict in the
near future. Water wars could largely destabilize not just the wider Tibetan region, but also all of Asia. Based on current trends, the question is not “how” and “why” such conflicts would arise, but “when”?
miscalc
Remote sensing capabilities are public, redundant and omnipresent
Zhu 17 [Lingli, Finnish Geospatial Research Institute FGI, National Land Survey of Finland
2017 https://www.intechopen.com/books/multi-purposeful-application-of-geospatial-data/a-
review-remote-sensing-sensors]

4.5. Commonly used remote sensing satellites. So far, more than 1000 remote sensing satellites have been

launched. These satellites have been updated with new generation satellites. The few spectral sensors from the earliest missions have been
upgraded to hyperspectral sensors with hundreds of spectral bands. The spatial and spectral resolutions have been improved on the order of 100-fold.
Revisit times have been shortened from months to daily. In addition, more
and more remote sensing data are available as
open data sources. Table 8 gives an overview of the commonly used remote sensing satellites and their parameters. A common
expectation from the remote sensing community is the ability to acquire data at high resolutions (spatial, spectral,
radiometric, and temporal), at low cost, with open resource support and for the creation of new applications by the integration of
spatial/aerial and ground-based sensors. The development of smaller, cheaper satellite technologies in recent years has led many companies to explore
new ways of using low Earth orbit satellites. Many companies have focused on remote imaging, for example, to gather optical or infrared imagery. In
the future, a low-cost communications network between low Earth orbit satellites can be established to form a spatial remote sensing network. This
network would integrate with a large number of distributed ground sensors to establish ground-space remote sensing. In addition, satellites can easily
cover large swaths of territory, thereby supplementing ground-based platforms. Thus, data distribution and sharing would become very easy. Openness
and sharing resources can promote the utilization of remote sensing and maximize its output. In
recent years, open remote sensing
resources have made great progress. Beginning on April 1, 2016, all Earth imagery from a widely used
Japanese remote sensing instrument operating aboard NASA’s Terra spacecraft since late 1999 has been available to
users everywhere at no cost [41]. On April 8, 2016, ESA announced that an amazing 40-cm resolution WorldView-2 European cities dataset would be available for download through the Lite Dissemination Server. These data are made available free of charge.
This dataset was collected by ESA, in collaboration with European Space Imaging, over the most populated areas in Europe at 40-cm resolution. These
data products were acquired between February 2011 and October 2013. The dataset is available to ESA member states (including Canada) and
European Union Member states [42]. In open remote sensing resources, NASA (USA) was a pioneer in sharing its imagery data. NASA has been
cooperating with the open source community, and many NASA projects are also open source. NASA has also set up a special
website to present these projects. In addition, some commercial companies like DigiGlobal (USA) have also partly opened their data to the public. In the future, more and more open resources will become
available.

Remote sensing’s redundant


RNT 16 [Resilient Navigation and Timing Foundation, 501(c)3, Martin Fega, Chairman of the
Board, Vice President, Director, former Assistant Secretary of the Air Force and Director of the
National Reconnaissance Office, and Dana Goward, former Director, Marine Transportation
Systems for the US Coast Guard 11-30 https://rntfnd.org/wp-content/uploads/12-7-
Prioritizing-Dangers-to-US-fm-Threats-to-GPS-RNTFoundation.pdf]

10. Space Debris – GPS service degraded or interrupted because of damage to a satellite by space debris. Vulnerability: Individual GPS satellites can be easily damaged by space debris. However, there are 31 satellites and damage to one is unlikely to impact service as a whole. Low – Vector able to impact less than 5% of users

no eradication in the squo which proves sats aren’t being utilized for
tracking OR if they are, soaring opium production disproves their
impact
Dr. Vanda Felbab-Brown 17, Senior Fellow in the Center for 21st Century Security and
Intelligence in the Foreign Policy Program at Brookings, PhD in Political Science from MIT,
“Afghanistan’s Opium Production is Through The Roof —Why Washington Shouldn’t
Overreact”, Brookings Report, 11/21/2017, https://www.brookings.edu/blog/order-from-
chaos/2017/11/21/afghanistans-opium-production-is-through-the-roof-why-washington-
shouldnt-overreact/
The diversity of the Taliban’s income portfolio has important implications for
counternarcotics and counterinsurgency strategies , especially since eliminating the Taliban’s financial base through
counternarcotics efforts is often seen as a key element of the counterinsurgency strategy. There is simply no easy way to
bankrupt the Taliban by wiping out the opium poppy economy. And as discussed below, any such move
would be disastrous for the counterinsurgency efforts. The Taliban is not the only group profiting from the opiate business in Afghanistan. So are various criminal gangs,
which often are connected to the government, the Afghan police, tribal elites, and many ex-warlords-cum-government-officials. Many of these
powerbrokers are also key anti-Taliban counterinsurgency actors, including in the north of the country where opium too has expanded. NO MAGIC
BULLET Most counternarcotics measures adopted since 2001 have been ineffective or counterproductive
economically, politically, and with respect to counterinsurgency and stabilization efforts .
Eradication and bans on opium poppy cultivation, often borne by the poorest and most socially marginalized,
have generated extensive political capital for the Taliban and undermined
counterinsurgency . They sparked provincial revolts , alienated the rural
population from the Afghan government, and drove the rural population into Taliban
hands . The Taliban presented itself as a protector of the people’s poppies and cast the Afghan
government and its international sponsors as apostates and infidels trying to kill the Afghan
people with hunger. The Obama administration’s decision to defund centrally-led eradication
was a courageous break with U.S. counternarcotics dogma, and such a policy is
still correct today. Aerial spraying would be the only way to do any large-scale eradication since
manual eradication teams have been attacked. That would be disastrous from the counterinsurgency perspective, since it would cement the Taliban’s political capital rather than bankrupting it.
Eradication never bankrupted insurgents anywhere, not even in Colombia. Nor is it sustainable without an end to conflict.

No Afghanistan escalation
Lamb 14 [Robert, senior fellow and director of the Program on Crisis, Conflict, and
Cooperation at the Center for Strategic and International Studies, PhD in policy studies from the
University of Maryland School of Public Policy; Sadika Hameed, fellow with the Program at
CSIS, MA in international policy studies from Stanford University; Kathryn Mixon, program
coordinator and research assistant
http://csis.org/files/publication/140116_Lamb_SouthAsiaRegionalDynamics_WEB.pdf]
In the event of conflict escalation in Afghanistan, China would be in direct contact with the
Afghan government, the power brokers it has a relationship with, and with Pakistani civilian and
(especially) military leaders to strongly encourage a political settlement. It would put its economic projects on hold temporarily. But it would not become involved militarily; instead, it would try to contain the fallout with, for example, stronger border security. Iran would certainly take similar measures to contain spillover from an escalated Afghan conflict, but otherwise its involvement would depend almost entirely on the state of its conflicts and rivalries in the Levant and the Gulf, much higher-priority areas than Afghanistan. If things settle down to its west and south, Iran might turn some
attention eastward to Afghanistan’s conflict. This would not be in the form of direct military incursions
but rather of funding, military equipment, and possibly safe haven to Hazara, Tajik, and Uzbek groups, as it has in the past, with a particular priority on protecting Afghanistan’s Shi‘ite minority. Saudi Arabia, working with Pakistan, would probably offer support to groups that oppose Iranian-supported groups. Qatar might follow Saudi Arabia’s lead or it might offer to mediate talks between opposing groups, as it has recently.
Beyond that, Qatar and UAE would probably stay out . Russia would probably increase its
security presence in Central Asia, as noted above, but work diplomatically with the United States,
European powers, or NATO to find ways to contain the spread of violence from Afghanistan into
Central Asia.

But sats are comparatively more essential to solve their laundry list
Salazar 18 [Doris Elin, Science & Astronomy "Solving Earth's Climate Challenges Requires
More Satellite Vision: Report," 1-6-18 https://www.space.com/39306-earth-climate-science-
satellites-future.html]
Space observations are crucial to solving the challenges presented by Earth's complex climate,
which will play a pivotal role in humanity's success or demise, argued an extensive report by the
U.S. National Academies. The new, 700-page report released today (Jan. 5) is titled "Thriving on Our Changing Planet: A
Decadal Strategy for Earth Observation from Space." In it, the National Academies of Sciences, Engineering and Medicine
(NASEM) announced their recommendations for what federal research agencies — such as
NASA, the National Oceanic and Atmospheric Administration (NOAA), and the United States
Geological Survey (USGS) — should do over the next 10 years. This was the second decadal
survey for Earth science and applications from space; the first was published in 2007. The report's
co-chairs — Waleed Abdalati, the director of Cooperative Institute of Research in Environmental Sciences at the University of
Colorado Boulder, and Bill Gail, chief technology officer at the Global Weather Corporation — addressed the survey's
recommendations during a press conference at the National Academies' Keck Center in Washington, D.C. [See the Effects of Climate
Change Across Earth (Video)] "There is a perspective from space that cannot be gained any other
way ," Abdalati said early in the press conference. Understanding the ways in which human activity and
non-anthropogenic changes are shaping societies across the world ought to be considered like
an extension of infrastructure , he added. Gauging weather systems and predicting
sea-level rise , for example, are as vital to a "thriving" society as fixing highways and
maintaining railroads. "If you go back 10-12 years, we were in a different place when it came to Earth information from
space" Abdalati said. "We were not using weather apps on our phones and planning our days' activities around them. We were not
using online mapping applications to get to and from where we're going in the most efficient way. The military [also] relies heavily
on information from NASA, NOAA and USGS." Space observations are crucial for society in a myriad of ways across the commercial,
public health and national safety sectors, the co-chairs said. The report is the product of 290 suggestions of the most important
issues to tackle in the near future, contributed by the scientific community. From those suggestions, the report's compilers extracted
103 objectives and then synthesized them into 35 final goals. The
report calls for prioritizing advances in, for
example, forecasting air quality and weather so that predictions provide a lead time of up to two
months. In addition, the report calls for knowing how biodiversity changes over time, predicting
future geological hazards within a more accurate time frame and understanding more precisely
how the ocean stores heat, among many other goals. The co-chairs said that the report focuses on
recommendations that are achievable within budget constraints and prioritizes the suggestions
the committee believed were most important for the next decade . The report invites the scientific
community at NASA, NOAA and USGS to focus first on achieving ambitious solutions to climate challenges and then following up
with ways to accelerate technology to meet those ends, rather than the other way around. The report recommends that NASA cap the
budget for its current projects —both flying and soon-to-be-flying missions —at $3.6 billion, to leave room in the agency's funding to
serve the report's 35 objectives over the next decade. Abdalati stressed, however, that it was important to fly the missions already in
development. NASA should also continue studying how small particles of material, known as aerosols, can affect air quality and
should learn more about the traits of vegetation on Earth's surface, the co-chairs said. The report additionally suggests that NASA
start a competitive new Explorer program for medium-size agile instruments and missions (with a budget of $500 million or lower),
in which participants would take a shot at addressing one of seven identified topics from the survey. Those topics include mapping
ocean-surface winds and developing 3D models of the terrestrial ecosystem. The competition may also reveal what objectives will be
more easily achieved in the next decade, Gail said. Gail concluded his presentation by addressing the report's title. "It really is about this tension between our ability to thrive over the next decade and longer, and the fact
that as the planet is changing around us, the information we need to acquire about our
planet is changing as rapidly as we try to acquire it," he said. "So this will be a decade in which we
will find growing community and public reckoning between two things: broad reliance on Earth
information ... and this growing challenge of obtaining that information."

Their ev concludes satellites can’t sustain precision ag due to broadband issues – the plan makes collapse inevitable
Farm Management 18 [Assoc. Prof. Dr. Ignacio Ciampitti: PhD. Crop Physiology and
Plant Nutrition 4-10-18 https://www.farmmanagement.pro/satellite-big-data-how-it-is-
changing-the-face-of-precision-farming/]
The world of precision farming is ever changing, always growing and adapting to the challenges of agriculture as they arise and
develop. One such challenge is collecting accurate data and then being able to interpret it in a way that not only helps
farmers learn and understand but also gives them all the knowledge tools they need to make a real difference come harvest time.
Perhaps surprisingly, the solution to this particular challenge has come by way of satellites , orbiting the
Earth at around 400 miles up. These sophisticated bits of tech are collecting, mapping and channelling
important information about farms and farmers are beginning to see big pay offs . How Satellites Are
Helping. Since 2014 there have been a number of satellites that are providing crucial information
to agricultural databanks. These include high resolution imaging satellites such as Sentinel 1 and 2 along with Landsat-8.
With the advanced data storage methods, it is now easier than before to store plenty of data in one place i.e. cloud storage. This
storage and the unique
satellite capabilities have seen a huge amount of accurate and vital data being
stored in unprecedented levels. All of this data can then be utilised and accessed by agricultural schools, universities and
research facilities the world over to build an accurate picture of how we are faring when it comes to producing on our farms. These
facilities would also employ the use of drones to corroborate satellite imagery and before long the data collected was so substantial
and so valuable that it became a key area of investment for big agricultural companies. These same companies developed ways that
allowed farmers access to information that could help them on their individual farms. Some providing satellite mapping
services, others providing satellite-based crop monitoring and some providing key information from
datasets to allow farmers the opportunity to make informed decisions. Advantages of Satellites. From a ground perspective it is
very difficult to collate a lot of information quickly, accurately and in a manner that can be instantly stored. For example; a
ground reconnaissance vehicle will cost a lot to put into operation, will not be able to easily identify all
crops, will only have a limited field of vision and will then have to upload any data it has collected at the end of the day on to a
database. Although the initial cost of putting a satellite into orbit is very high, once in orbit it requires very little financial
expenditure to maintain. A set of data collection vehicles will always have initial purchase costs along with ongoing maintenance and
fuel costs. Satellites have advantages in 3 other key areas above and beyond those we have just mentioned. Progressive Data
Analysis. Because a satellite orbits, it will circle and cover an individual area of land more than once over the course of a growing
season. This means farmers not only get a snapshot of how their farm is but will get a sequence of snapshots of the farms
performance. The Sentinel satellites for example provide images every 6 days of any given location. More Accurate. Satellite imagery
provides a more detailed and more accurate picture of any given field. It highlights things across many different criteria, from crop
type, crop health, irrigation and problem areas. Because a satellite image can capture weather fronts, it is able to give farmers a real
time assessment of which weather is imminent.

[NEXT PARAGRAPH]

Satellites Have Disadvantages Too. Because weather systems can sometimes obscure imagery it is sometimes difficult for
satellites to either map an area or build up an accurate picture. In Ukraine, research facilities consistently send up drones to verify
information collected by satellites as well as fill in data gaps. This means that satellites are extremely useful, but another data collection tool should also be there to back up the information collected by the satellites. It is also very data intensive.
Because satellites are harvesting so much data and not really able to discern for themselves which
data is important , there is a lot of information that goes through the processing system. Primarily
this has an internet speed issue as there is a lot of information that is transmitted from the
satellite via the internet to the cloud. Research facilities and agriculture companies are investing where they can in internet
infrastructure to counter this problem as it can sometimes cause delays or backlogs of information in the
system.
Insufficient broadband dooms precision ag even assuming strong
satellites
Strenkowski 15 [Jeff, JD, Morgan, Lewis & Bockius LLP, Statement Before the Rural
Utility Service, U.S. Department of Agriculture, and the National Telecommunications and
Information Administration, U.S. Department of Commerce, 6-10, p. 2-6]
II. High-Tech Agriculture Requires Increased Wireless M2M Internet Broadband The U.S. agricultural economy is
increasingly high-tech and mobile. Expanded broadband facilities and services are critical economic drivers to rural
communities. In particular, high speed broadband is not only essential to business centers in rural towns and traditional
anchor institutions, it is also an essential service for agricultural operations that form the economic heart of many American rural
communities. Agricultural producers are facing growing demands to produce more food, fuel and fiber for a growing, more prosperous world population, and they must do so with limited resources and increasing regulation. Not only is it critical that farm buildings have access to high speed broadband to communicate with their customers and vendors, follow commodity
markets, gain access to new markets around the world, and manage regulatory compliance, but more and more farmers
are demanding capability for M2M communications from the field that make possible significant improvements in real-time productivity and cost
management. Over the past several decades, technology has enabled farmers to achieve ever greater levels of productivity. The first wave focused on
optimizing the vehicle. The second wave focused on optimizing the fleet. The third wave is focusing on connecting the farmer “in the cab” to the
cooperative, agronomist, or other agriculture service providers who can help reduce input costs, increase yields, and further enable sustainable farming
practices. Much of the future of enhanced farming efficiency and productivity turns on the grower’s ability to gather, process, and transmit data using advanced information and communications technologies. Technology-
equipped machine solutions enable agronomic decision-making to advance productivity, improve agriculture profitability and global
competitiveness, and optimize inputs for continuous environmental improvement. With superior, precise, site-specific data, a farmer can
analyze and carefully adjust their farming methods to be the most efficient, most economical, and most environmentally friendly possible, thus
improving productivity and sustainability. However, enabling farmers to utilize M2M data fully requires significantly improved communications capacity and access to high speed mobile broadband. Today, many of Deere’s customers are challenged with a lack of
adequate cellular coverage in the fields where agricultural equipment operates. Deere’s JDLink™ data service,
for example, currently relies on the cellular telephone network to transmit telemetric machine operation data. The lack of coverage needed for these
solutions to transmit telemetric data from the machines is already a concern, but the
shortfall in coverage will only become more
problematic as data volumes increase. Due to significant gaps in cell coverage in rural areas where farm machines operate, today
JDLink™ data transmissions have only a 70% successful call completion rate. Absent significant improvements in cell coverage in cropland areas,
Deere expects that this figure will drop to about 50% in two to three years as agricultural demand for broadband services increases. These data
communication services depend on stable, reliable high speed connections to equipment
operating in remote locations. This is not a problem that can be resolved by relying on
satellite services or even more spectrum. In addition to fiber-to-farm buildings, rural areas need more wireless
antenna towers, all of which must be connected by fiber backhaul to the broadband network provider.
Towers provide the wireless coverage-- the problem is there are simply not enough towers in the cropland areas where significant

productivity enhancements could be gained. The Council should examine ways to remove regulatory barriers to the deployment of
tower and backhaul infrastructure in rural areas, and seek to stimulate business funding for this critical national infrastructure.

But, weak satellites mean more effective remote sensing tech fills in
and resolves barriers to precision ag – squo makes their impact
inevitable
Byrum 17 [Joseph, Senior R&D and Strategic Marketing Executive in Life Sciences - Global
Product Development, Innovation, and Delivery at Syngenta 3-14-17
https://agfundernews.com/remote-sensing-powers-precision-agriculture.html]

Sensors can be grouped according to their enabling technology — ground sensors, aerial sensors and
satellite sensors. Ground sensors are handheld, mounted on tractors and combines, or free-
standing in a field. Common uses for these include evaluating nutrient levels for more
specific chemical and nutrient application , measuring weather , or the moisture
content of the soil . Aerial sensors have become far more affordable with the advent of
drone technology that places the bird’s-eye view of a field within reach of most farmers . They
are also attached to airplanes , another relatively cheap option. The systems are capable of
capturing high-res olution images and data slowly enough , at low altitude, to enable
thorough analysis . Typical uses include plant population count, weed detection, yield estimates, measuring chlorophyll
content and evaluating soil salinity. The downside of aerial platforms is that wind and cloud cover can limit their use. Satellite
sensors provide coverage of vast land areas and are especially useful for monitoring crops status, calculating losses from severe
weather events and conducting yield assessments. Initially, such systems were tailored to the needs of the military and
government, not agriculture. So the main downside, aside from cost , was that these systems were tasked in
advance — usually months — to look at a specific area at a certain time. Worst of all, cloud
cover could ruin that expensive purchase. Now many governments have opened up satellite imaging databases
to the public, providing an important and accessible resource for understanding crop conditions.
Environment’s improving and no collapse
Easterbrook 18 – Gregg Easterbrook, Contributing Editor at The New Republic and The
Atlantic, Former Fellow in Economics and Government Studies at The Brookings Institution,
Lecturer at the Aspen Institute and Chautauqua Institution, It's Better Than It Looks: Reasons
for Optimism in an Age of Fear, p. 52
INSTEAD, IN 2017, I WATCHED a bald eagle glide peacefully above my home near Washington, DC. North American eagles have proliferated so much that the International Union for the Conservation of Nature (IUCN), which keeps the books on species gains and losses, now classifies the bird under "least concern.”

The eagle flew through air that was free of smog , as air almost always is in American cities. Newspapers in my
driveway reported that oversupply of petroleum and natural gas was pushing energy prices toward record lows. "Oil Glut Worries"—
here, Wall Street Journal, March 10, 2017; "Natural Gas Glut Deepens," same paper, same page, a week later. Society was expected
by now to be in full panic mode regarding oil and gas exhaustion, and instead the apprehension is too much fuel. Another newspaper
in the driveway reported so many otters frolicking off California that tourists were crowding seaside enclaves to watch. Acid rain was nearly stopped, the stratospheric ozone hole was closing. Water quality alarms
were ongoing in Flint, Michigan, and along Long Island Sound, but in general cleanliness was rising , with
Boston Harbor, Chesapeake Bay, Puget Sound, and other major water bodies, filthy a generation ago, mostly
safe for swimming and fishing , meeting the 1972 Clean Water Act's definition of success. Nearly every
environmental barometer in the United States was positive and had been so for years if not
decades .
Watching the bald eagle soar did not make me feel complacent regarding the natural world, rather, made me feel that greenhouse
gases can be brought to heel, just as other environmental problems have been. Climate change reforms will be the subject of a
coming chapter. Here, let's contemplate why nature did not collapse, despite ever more people consuming ever more resources.

Man-made damage to nature can be atrocious. Think of the Exxon Valdez oil spill, which destroyed
forever the wildlife in Prince William Sound, Alaska. At least that's what was said in 1989 when the tanker struck Bligh Reef. Today
most sea and intertidal life in Prince William Sound has returned to pre-spill numbers , while the
sound's combination of beauty and biology makes it a popular destination for whale- watching tours. Exxon, now ExxonMobil,
deserved the billions in fines and settlements the company paid. But the whole thing was over in a snap of the
fingers in geologic terms.

Humanity is hardly the only force that damages nature. In 1980, pressurized magma inside Mount Saint Helens in
Washington State exploded with the power of about 1,500 Hiroshima bombs . "Some 19 million old-
growth Douglas firs, trees with deep roots, were ripped from the ground and tossed about like cocktail
swizzles," one analyst wrote. Hundreds of square miles burned to cinders, animals and fifty- seven
people near the eruption turned to char. Commentators of the time called the Mount Saint Helens area destroyed
forever. When I hiked the blast zone in 1992, I was amazed to behold areas that had been lifeless moonscapes in 1980; just a
dozen years later , they were bright with biology : wildflower, elk, sapling firs. Today Mount
Saint Helens National Volcanic Monument is a recommended destination for backpackers. Through the eons, nature has
healed after insults far worse than the worst ever done by people — ice ages ,
asteroid strikes , thousand-year periods of volcanism so extreme that global ash clouds
blocked the sun for years at a time. The mega-volcanism that long ago created Siberia is estimated to
have unleashed three billion times the force of the Hiroshima blast, plus far more smoke than
humanity 's wars and factories combined. Nature has evolved defenses against such harm in the
same way that the body has evolved defenses against pathogens. This does not make harm to nature insignificant, any more than
having an immune system makes germs insignificant. But
before asking whether nature will collapse, it's good
to remind ourselves that our ongoing existence is evidence that the biosphere is a green
fortress .

Automation is good for growth and their predictions are apocalyptic nonsense
Sarah Kessler 17, Reporter at Quartz, “The optimist’s guide to the robot apocalypse,”
3/9/2017, https://qz.com/904285/the-optimists-guide-to-the-robot-apocalypse/
Machines, you may have heard, are coming for all the jobs.
Robots flip burgers and work warehouses. Artificial intelligence handles insurance claims and basic bookkeeping, manages
investment portfolios, does legal research, and performs basic HR tasks. Human labor doesn’t stand a chance against them—after
the “automation apocalypse,” only those with spectacular abilities and the owners of the robots will thrive.

Or at least, that’s one plausible and completely valid theory. But before you start campaigning for a universal basic income and
set up a bunker, you might want to also familiarize yourself with the competing theory: In the long run, we’re going to be
just fine .

We’ve been here before


Our modern fear that robots will steal all the jobs fits a classic script. Nearly 500 years ago, Queen
Elizabeth I cited the same fear when she denied an English inventor named William Lee a patent for an automated
knitting contraption. “I have too much regard for the poor women and unprotected young maidens who obtain their daily bread by
knitting to forward an invention which, by depriving them of employment, would reduce them to starvation,” she told Lee, according
to one account of the incident. The lack of patent didn’t ultimately stop factories from adopting the machine.

Two hundred years later, Lee’s invention, still being vilified as a jobs killer, was among the machines destroyed by protestors during
the Luddite movement in Britain. More
than 100 years after that, though computers had
replaced knitting machines as the latest threat to jobs, the fear of technology’s impact on
employment was the same. A group of high-profile economists warned President Lyndon Johnson of a “cybernation
revolution” that would result in massive unemployment. Johnson’s labor secretary had recently commented that new machines had
“skills equivalent to a high school diploma” (though then, and now, machines have trouble doing simple things like recognizing
objects in photos or packing a box), and the economists
were worried that machines would soon take over
service industry jobs. Their recommendation: a universal basic income, in which the government pays everyone a low salary
to put a floor on poverty.

Today’s version of this scenario isn’t much different. This time, we’re warned of the “Rise of
Robots” and the “End of Work.” Thought leaders such as Elon Musk have once again turned to a universal basic income
as a possible response.

But widespread unemployment due to technology has never materialized before . Why, argue the
optimists, should this time be any different?

Automating a job can result in more of those jobs


Though Queen Elizabeth I had feared for jobs when she denied Lee’s patent, weaving
technology ended up creating
more jobs for weavers. By the end of the 19th century, there were four times as many factory weavers as
there had been in 1830, according to James Bessen, the author of Learning by Doing: The Real Connection between
Innovation, Wages, and Wealth.

Each human could make more than 20 times the amount of cloth that she could have 100 years earlier. So
how could more textile workers be needed?
According to the optimist's viewpoint, a factory that saves money on labor through automation will either:
• Lower prices, which makes its products more appealing and creates an increased demand that may lead to the need for more workers.
• Generate more profit or pay higher wages. That may lead to increased investment or increased consumption, which can also lead to more production, and thus, more employment.
Amazon offers a more modern example of this phenomenon. The company has over the last three years increased the
number of robots working in its warehouses from 1,400 to 45,000. Over the same period, the rate at
which it hires workers hasn't changed.

The optimist’s take on this trend is that robots help Amazon keep prices low, which means people buy more
stuff, which means the company needs more people to man its warehouses even though it needs fewer human
hours of labor per package. Bruce Welty, the founder of a fulfillment company that ships more than $1 billion of ecommerce orders
each year and another company called Locus Robotics that sells warehouse robots, says he thinks the
threat to jobs from
the latter is overblown —especially as the rise of ecommerce creates more demand for
warehouse workers. His fulfillment company has 200 job openings at its warehouse.
A handful of modern studies
have noted that there’s often a positive relationship between new
technology and increasing employment—in manufacturing firms, across all sectors, and specifically in firms that
adopted computers.

How automation impacts wages is a separate question. Warehouse jobs, for instance, have a reputation as grueling and low-paying.
Will automation make them better or worse? In the case of the loom workers, wages went up when parts of their jobs
became automated. According to Bessen, by the end of the 19th century, weavers at the famous Lowell factory earned more
than twice what they earned per hour in 1830. That’s because a labor market had built up around the new skill (working the
machines) and employers competed for skilled labor.

Cap is environmentally sustainable


Westergård 18 – Rune Westergård. Entrepreneur, Engineer and Author, founder of the
technical consulting company CITEC. 2018. “Real and Imagined Threats.” One Planet Is
Enough, Springer International Publishing, pp. 71–80. CrossRef, doi:10.1007/978-3-319-60913-
3_7.
Threatening reports about our ability to create disasters and even exterminate ourselves are not a new idea. A standard example is
the British national economist Thomas Malthus in the early 19th century, who predicted that population growth
would come to a halt because of starvation. Malthus calculated that the available food in the world
couldn’t feed more than one billion people. He extrapolated the development from a still picture of his own time and
couldn’t fathom that food production would increase tremendously thanks to new knowledge and technology. Our present
food production is sufficient for seven times as many. Malthus didn’t pay attention to the fact that we live in
a continuously changing civilisation, and the same kind of miscalculations are still made today. There are people who have even
achieved the status of media superstars by presenting various dystopias and catastrophe scenarios. As early as 1968, Professor Paul
Ehrlich at Stanford University published the bestseller The Population Bomb, where he predicted that
an imminent population explosion would result in hundreds of millions of deaths by starvation
in the 1970s and 80s. Basically, he made the same mistake as Malthus, i.e. he treated knowledge and technology as if they were static
phenomena. The most widely read environment report in the world, State of the World , was a loud whistle-blower when
it was first published
in the early 1980s. The Swedish version, Tillståndet i världen, was published yearly from 1984 and
some years into the 2000s by the Worldwatch Institute Norden; I still have some of the early issues left. This
report contains
many valuable observations and suggestions, but also several basic analytical mistakes . In
other words, it
acts as an eye-opener, but it suffers from being tainted by political ideology . Its
main weakness is that it doesn’t take the intrinsic driving forces of progress into account . State
of the World was translated into most major languages and is, as already mentioned, the world’s most widely read environmental
report. It has affected us all, directly or indirectly, through school and media. Even if the Swedish version I refer to was written some
years ago, it is still worthy of discussion, firstly because it maintains an appearance of scientific validity, and secondly because it has
served as a trendsetter for the general ideology which has been adopted by many later books and reports on the subject at hand. It
still lives on as an engraved pattern in our conception of the world. In the report we can, for instance, read the following: A world
where human desires and needs are fulfilled without the destruction of natural systems demands an entirely new economic order,
founded on the insight that a high consumption level, population growth, and poverty are the powers behind the devastation of the
environment. The rich have to reduce their consumption of resources so that the poor can increase their standard of living. The
global economy simply works against the attempts to reduce poverty and protect the environment. We stubbornly insist to regard
economic growth as synonymous with development, even though it makes the poor even poorer. Even if we up to this point have
mainly described the environment revolution in economic terms, it is, in its most fundamental meaning, a social revolution: to
change our values. Massive threat scenarios are still presented, for instance in the British scientist Tim Jackson’s book
Prosperity Without Growth from 2009, which is one of the most widely read and frequently quoted works in this
area. Tim Jackson, who is an economist and professor in sustainable development, explains how we humans are
indulging in a ruthless pursuit of new-fangled gadgets in a consumption society running at full
speed towards its doom. He also claims that material things in themselves cannot help us to
flourish; on the contrary, they may even restrain our welfare . In other words, we cannot build our hopes that
the economy, technology or science can help us to escape from the trap of Anthropocene, which has brought us to the brink of an
ecological disaster. There are hundreds on books on this theme, and they all agree that the general state of the world is pure misery;
everything is getting worse, the resources are being depleted, and that man will soon have destroyed the entire planet. The apparent
reason for this, of course, is due to the consumption culture and the present financial system—which exposes man as a greedy,
ruthless and ultimately weak creature. This
attitude may serve a purpose as an eye-opener. But it is not
very credible, and it may even be counterproductive . Of course, we can see a lot of
problems ahead of us; but to solve them, we need the correct diagnostics instead of
dubious doomsday prophesies. Focus: The Problem Since the focus of attention is so profoundly
fixated on the problems in the climate and environmental debate, the progress already made —
and the opportunities at hand—are often overshadowed . The example below will help to illustrate this point: In the year
2014, the Nobel Prize in physics was awarded to three scientists who had invented blue light emitting diodes—a technology that has
made high-bright and energy-efficient LED lighting possible. As lighting accounts for 20% of the world’s total electrical
consumption, this invention has the potential to radically reduce energy consumption and greenhouse gas emissions. In an interview
made by the major Swedish daily newspaper Dagens Nyheter, one of the prize winners, Hiroshi Amano, says the following about
energy-efficient, inexpensive and high-bright LED lights: “They are now being used all over the world. Even children in the
developing countries can use this lighting to read books and study in the evenings. This makes me very very happy”. Shortly after
this announcement, the news headlines declared that LED lighting was a threat to the environment. This statement was based on a
report showing that LED lighting could be hazardous to flies and moths, which in turn might disturb the eco system. This is a typical
example of how progress pessimists and, not least, the media, think and act. In this case, they focused on a
potential problem associated with LED lighting, and ignored the tremendous possibilities that the
new technology offered to dramatically reduce greenhouse gases and thus spare the eco system (not to mention all the
other advantages). Books and reports of the kind mentioned above tell us repeatedly about disasters,
threats, problems, collapses and famines. On the other hand, they are notoriously silent about
the great improvements actually made —the reduction of extreme poverty (not only as a
percentage but also in absolute numbers), longer lifespans, dramatic global progress in
education and healthcare, etc. The lack of positive media coverage on the environment means
that many people believe that too little is being done , which is quite understandable considering the one-
sided nature of the information they are presented with. Alarmist reporting almost always reminds me of
pirates: they are unreliable and half their vision is blocked by their eye patches. It is vital that
the media not only one-sidedly focus on the misery without presenting the progress made and
suggesting constructive courses of action. The quality of our decisions in all respects depends on our knowledge,
insight and attitude. Real and Imagined Threats Many people are convinced that the climate and environmental problems are
growing. It is certainly true that our planet has its limitations, but many of the predictions from alarmist literature have been proven
false. In the 1980s, the forest dieback was a frequently discussed subject. To quote the well-known
German news magazine Der Spiegel, an “ecological Hiroshima” was imminent. Most experts at the time claimed that a wide-spread
forest death seemed unavoidable. Additionally, the general mood of impending doom was augmented by the threat of a nuclear
disaster during the cold war. I remember the pessimistic discussions among friends and how frequently the gloomy reports appeared
in Swedish and Finnish television. The future of humankind appeared to be depressingly bleak.
But the forest dieback never happened. On the contrary, the forest area has been constantly expanding in Europe,
even during the entire period when the forest was believed to be dying . Today, only two thirds of the
yearly accretion in Europe are cut down, according to the Natural Resource Institute in Finland. There are different opinions as to
why the large-scale forest dieback didn’t occur. One theory is that the researchers’ evidence and conclusions had been incomplete
and too hasty; the forest was actually never in danger. Others suggest that the emission limitations implemented prevented the
disaster. My point is that the environmental catastrophe did not happen . Some other environmental
problems, exaggerated or not, that have concerned us during the last decades have also
disappeared from the immediate agenda: overpopulation, DDT, the ozone hole, heavy
metals, lead poisoning, soot particles, the waste mountain, and the acidification of our
lakes. Unfortunately, some environmental problems, like soot particles and waste, still remain in some areas, especially in poorer
countries, where there are other, even worse problems that have yet to be resolved. The conclusion is, however, that we and our
society in most cases have handled threatening situations quite well . When alarming
symptoms are noted, scientists and other experts are summoned, and we act according to their
diagnoses. It is no big deal that the diagnoses are sometimes wrong, as long as the side effects are not too severe. The main
thing is that we do our best to avoid disasters, and on the whole, humankind has succeeded
rather well this far . As individuals, we react very differently to various kinds of threats. The closer and more tangible the
threat is, the more violent are the reactions—while distant and invisible symptoms, like the depletion of the ozone layer, concern us
less. In the latter cases, we have to trust the scientists’ and later the politicians’ reactions. Does this mean that disasters are avoided
thanks to war headlines, threats, and anxiety? I don’t think that this is the most important explanation; rather, it is factual and
science-based information that produces effective results. But if exaggerated threat scenarios and reports of misery are needed to
inspire the necessary political opinion, acquire research funding and create behavioural changes, we will have to live with that. The
most important thing to remember in this context is that the actions shouldn’t cause more harm than the original problem itself. The
risk with exaggerated threat and misery reporting is that it may inspire an over-reaction based on misleading diagnoses, or the
opposite—a paralysing feeling of helplessness. It is necessary to take threats against the climate and the environment seriously, but
not to a degree where our ability to reason and act is blocked by fear or anxiety. Many environmental debaters claim that the fall of
the Inca and Roman empires were caused by the same causes that are now threatening our present civilisation—a short-sighted
over-exploitation and rape of nature. Easter Island is another popular example. However, in my opinion it is both worthless and
irresponsible to judge the world situation of today by copying the outcome of earlier cultural endeavours in history. The inhabitants
of the Inca empire and Easter Island didn’t have anything even remotely comparable with the organisations, technology, medicine or
general knowledge of today. It would be like comparing a case of appendicitis in the past to a case today. In pre-modern times, it was
a fatal condition. In this day and age, it is cured by a simple routine operation. Today,
humankind is conscious of the
climate changes and other ecological challenges. And we also have the knowledge and
resources needed to act . Facts, Propaganda and Hidden Messages During all the years I have followed the
development of technology and society, I have repeatedly observed how a mishmash of serious research, political propaganda, and
the hidden agendas of individuals have been distributed more or less randomly by the media. There are of course many different
kinds of alarmism— everything from well-founded research reports to exaggerated prophesies of doom. It is far from simple to
separate the wheat from the chaff. The actions
taken against ozone depletion, lead emissions and the toxic
chemical, dioxin, are all examples of how research has shown the way to successful results.
Today, greenhouse gas emissions top the list of issues deserving our gravest attention, as it is a
global phenomenon—just as the depletion of the ozone layer once was. There are also a
considerable number of local environmental problems, such as drought, air pollution, forest depletion and
overfishing. All of these are real threats that have to be acted upon, even though they are not global. However, I am always disturbed
when a single global environmental issue is bundled with an assortment of several local issues, rather like a simplified trademark
advertisement for the negative consequences of civilisation. This makes the information abstract and inaccurate, ignoring the fact
that different locales require different solutions. Fear and alarmism are natural reactions that once protected us when we were living
at the mercy of nature—they are evolutionary relics from our life in the savanna. Today, the same properties can be significant
drawbacks. The transition from a primitive, animal-like state to the society we have today must, on the whole, be counted as a great
success. But many people regard the same world as over-exploited, depleted, unjust, war-ridden and balancing on the brink of
destruction. How can people living in the same epoch have so entirely different views of the world? In the sustainability debate,
there is one faction dealing with the natural resources and ecosystems, and another focusing on the redistribution of wealth. There is
even a third faction discussing a minimalistic lifestyle; for example, downshifting, with less work and less material welfare. When all
these ingredients are mixed without discretion, the result is an anxiety soup that many have choked on. In a situation like that, we
cannot expect any constructive initiatives to materialise. Instead, it would be far better to explore, research and discuss each
dimension separately. What Is the Real State of the Planet? It is easy to generalise and say that
we over-exploit the planet’s resources and pollute the world with our waste. But how many care
to examine these statements in detail and ask exactly which resources are over-exploited? • Are
fish becoming extinct? It is true that overfishing occurs in many places, which is, of course,
unsustainable. However, this is not an unavoidable threat to the world’s total food resources .
Fortunately, there are several examples of fish stocks that have either recovered or started to
replenish once the fishing effort has been eased. • Is the air being poisoned? Many are
convinced that the air we breathe is becoming dirtier all the time. But that isn’t true , at least not in
the Western world. From the year 1990, emissions of sulphur dioxide have been reduced by 80%,
nitrogen oxides by 44%, volatile organic substances by 55%, and carbon monoxide by 62%.
Despite these dramatic improvements, 64% of Europeans believe that pollution is increasing. •
Are the forests dying? It is a general belief that the forests in the developed countries are dwindling. But that isn’t
true; on the contrary, the wooded areas are expanding . However, the forests are decreasing in the poor
countries, where forestry and farming are still major sources of income, as they once were in the industrialised countries. • Are
we drowning in waste? There are many who believe that we are surrounded by constantly
growing mountains of waste. In the developed countries, the truth is that increasing amounts of
waste are being recycled and the landfills are decreasing. • Will there be enough
phosphorus? Phosphorus is an important nutrient in farming, extracted from phosphate ore .
Many scientists fear that the finite natural resource of phosphate ore will become depleted in the future, which may jeopardise the
world’s food supply. But there
are already working solutions for this problem, such as by reclaiming
phosphorus through digestion residues and sewage sludge. There are also technological
solutions for the chemical extraction of phosphorus from polluted water—the remediation of
lakes and rainwater by removing phosphorus is already a common procedure. Here we achieve a
win-win situation—phosphorus is collected while preventing the eutrophication of lakes.
• Will there be enough energy to go around? A common statement is that the earth’s
population is too large, and that we consume too much energy with respect to the climate . This is
one of those issues where we have to think in terms of symptoms, diagnoses, and medication. The symptoms are there for all to see:
climate change. On the other hand, the diagnosis that we consume too much energy is wrong . The
correct diagnosi

s is that we are not using the right technology ; i.e. energy efficient power production without harmful
emissions. Consequently, the correct statement would be that we consume energy that is produced by technologies that are harmful
the remedy will
to the climate. The difference in wording is important. As the first diagnosis is “too high energy consumption”,
be to use a different medication than a diagnosis based on “the wrong technology”. Alarmist reporting
can inspire bad decisions if the statements aren’t systematically reviewed and evaluated. It can also be misguiding to express
environmental threats in general terms. Actions must be based on precise specific symptoms with corresponding diagnoses. If the
doctor discovers that the patient is lame and suffers from a high fever, it doesn’t help to predict imminent death. Maybe the
lameness and the fever have different causes altogether! A successful cure would probably include two different diagnoses with
separate medications. Several recent surveys of the general conception of the world have been made— one is Project Ignorance by
Gapminder and Novus in Sweden. One
of the questions asked was whether CO2 emissions per
capita and year had increased or decreased in the world during the last 40 years. The surveyed
group was large and representative in order to give a fairly accurate picture of the common
opinion. No less than 90% believed that CO2 emissions had increased. The truth is that they
haven’t increased at all. It is important that decision makers on all levels learn how to see the wood from the trees.
Decisions based on false preconditions can halt technological development, and thus also the development of the economy, welfare,
and a healthier environment. The
flow of innovations in the climate and environmental areas is
accelerating rapidly. This can be seen in the number of improvements that have occurred in
recent years, which can be counted in the thousands . Such improvements have to be weighted
on the same scale as the problems in this area. That is not to say the problems should be ignored—they need to be
acted upon. But they should not be allowed to occupy our brains to the extent that our power to act is paralysed. Is the Notion of
Sustainable Technology-Driven Growth Over-Optimistic? The development of a technological society has always
been questioned. In the 19th century, critics claimed that the technological revolution would create poverty. In the 1970s, it
was generally believed that the forest dieback would cause a disaster. In the 1980s, the acidification of lakes and throwaway
mentality of society were regarded as manifestations of the devastating properties of growth and industrialisation. Today, many fear
the environmental effects of air travel and the production of electronic devices. There
are people who seriously wish to
halt economic growth and wind back the clock to the society of the 1960s. They recall this
time period as small-scaled and down-to-earth, stress-free and idyllic. But they tend to forget
that the refrigerators of that time required 90% more electricity than today , and
that our teeth were repaired with mercury fillings instead of plastic . There were no X-
ray CT scanners and no medicines against ulcers. In addition, there were many more people living
without electricity . There was also more widespread malnutrition , a higher infant mortality , and,
in fact, more wars . Cars were fuelled by leaded petrol, and sulphur emissions were 90% higher
than today. The acidification of lakes, as well as polluted streams and fields, were serious
concerns. Since then, technological innovations have reduced sulphur emissions and removed
the lead from car fuel. At any given point in history, there have been critics claiming that this was the
time when we had reached the optimal point in the development of the modern society. But we
hadn’t, not then and not now . And the more our countries are modernised, the greater our
possibilities to care for animals and nature become. In the mid-1800s, the killing of large animals
like sperm whales didn’t concern people to any significant degree, despite the cruel hunting methods using harpoons. The benefits of
the whale fat, mainly used for lamp oil to facilitate reading in the evenings, overshadowed any empathic impulses. In the 1850s more
than 70,000 people were employed by the American whaling industry. There were 900 ships in the world hunting whales, and
during one of the most active years, 8000 whales were butchered, which provided more than 300,000 barrels of oil. The oil
extracted from the head of the sperm whale, the so-called spermaceti oil, was especially sought-after. It was of very high quality and
sold for 1.50 US dollars per litre in today’s monetary value. As a consequence, the number of sperm whales in the world rapidly
dwindled. However, when oil drilling started in Pennsylvania in the year 1859, the price of whale oil began to fall. The fast transition
to petroleum products for lighting and other applications is considered to have saved the last of the sperm whales. Thus, new
technology can both contribute to the protection of threatened animal species and provide the wealth to make it affordable for us to
even save predators. Imagine what would happen if we were able to bring back someone from the 19th century and tell them that
today we move wolves though the air by helicopter in order to save the species and expand its habitat; our ancestor would probably
rather go back to sleep than listen to such apparent stupidity. Pessimism Does not Support a Sustainable Development There is a lot
of progress going on in the world today, but not without negative side effects. When improving the world and dealing with the side
effects, an optimistic attitude provides us with a much better chance of success than a pessimistic view. The optimist carries a
positive inner beacon to follow, while the pessimist is always looking for potential traps and drawbacks. As visions and conceptions
of ideas often become self-fulfilling, it isn’t difficult to realise what’s most constructive. All decisions—big or small, conscious or not
—are affected and guided by our inner beacon. When solving a problem, such as developing a new product for example, it is
necessary to have a conception of a working solution in mind. As a product developer, it is of course necessary to review every
minute step in the process and question the choices made. You have to ask yourself if there may be a better material or a smarter
design. Strange as it seems, this continuous struggle in the mind of the developer may appear to be a kind of pessimism, as it is all
about looking for weaknesses in the imagined solution. It is not dissimilar from the process a doctor follows when selecting a
diagnosis and a remedy. You start with certain hypotheses, examine, exclude, test, question and verify until you are satisfied that you
have made the correct diagnosis. Then the choice of medication becomes much simpler. It would be fatal if the doctor was
pessimistic from the start and worked in the belief that it would be impossible to find a reason for the illness, or a working remedy.
This could then be the conclusion that such a doctor would unconsciously try to verify. Would you like to have a doctor like that? The
same is true for climate and environmental problems—we need optimists armed with critical thinking to solve them. There are also
so-called climate change deniers, who believe that man hasn’t really affected the planet and its ecosystems to any significant degree.
Some of them claim that the influence of the sun and other natural phenomena are so enormous that human activities have no
bearing on global warming. Perhaps these deniers are so deeply pessimistic that they cannot imagine any possible solutions. For
ages, man has harboured a certain distrust of his own species. Throughout history, various religions have emphasised human
shortcomings and presented assorted consequential threats. During the last 30 years, such prophesies have increasingly often been
introduced by environmental activists and some political groups, whose messages have been significantly supported by the media.
The underlying conception of humanity isn’t flattering. The human race is considered to be fundamentally ruthless, greedy, short-
sighted and evil. Threats against the climate and much other misery on earth are caused by human failure. However, if we take the
time to study the progress that has been made by the human race throughout the ages, we actually get the opposite picture. Can it
really be evil, greedy, and short-sighted beings who put their own lives at stake to treat people infected by Ebola or HIV in poor
countries? Who are the ones that are continuously reducing the number of starving people on earth? Who are the ones that invent
vaccines for the children of the world? Who are the ones that have developed a civilisation where an increasing number of people get
educated, and who struggle to reduce the casualties of war? Why
blame an entire species for atrocities that are
actually committed by a mere fraction? Establishing a firm belief in humankind should be
the first step on the road to sustainable development.

Alt fails---no mindset shift and causes transition wars


Aligica 3 (fellow at the Mercatus Center, George Mason University, and Adjunct Fellow at the
Hudson Institute (Paul, 4/21. “The Great Transition and the Social Limits to Growth: Herman
Kahn on Social Change and Global Economic Development”, April 21,
http://www.hudson.org/index.cfm?fuseaction=publication_details&id=2827)
Stopping things would mean if not to engage in an experiment to change the human nature, at least in an equally difficult
experiment in altering powerful cultural forces: "We firmly believe that despite
the arguments put forward by
people who would like to 'stop the earth and get off,' it is simply impractical to do so. Propensity
to change may not be inherent in human nature, but it is firmly embedded in most contemporary cultures.
People have almost everywhere become curious, future oriented, and dissatisfied with their conditions. They want
more material goods and covet higher status and greater control of nature . Despite much propaganda to
the contrary, they believe in progress and future" (Kahn, 1976, 164). As regarding the critics of growth that stressed the issue of the
gap between rich and poor countries and the issue of redistribution, Kahn noted that what
most people everywhere
want was visible, rapid improvement in their economic status and living standards , and
not a closing of the gap (Kahn, 1976, 165). The people from poor countries have as a basic goal the
transition from poor to middle class. The other implications of social change are secondary for
them. Thus a crucial factor to be taken into account is that while the zero-growth advocates and their
followers may be satisfied to stop at the present point, most others are not. Any serious attempt
to frustrate these expectations or desires of that majority is likely to fail and/or create
disastrous counter reactions . Kahn was convinced that "any concerted attempt to stop or even
slow 'progress' appreciably (that is, to be satisfied with the moment) is catastrophe-prone". At the
minimum, "it would probably require the creation of extraordinarily repressive
governments or movements, and probably a repressive international system" (Kahn, 1976, 165; 1979,
140-153). The pressures of overpopulation, national security challenges and poverty as well as the
revolution of rising expectations could be solved only in a continuing growth environment. Kahn
rejected the idea that continuous growth would generate political repression and absolute poverty. On the contrary, it
is the limits-to-growth position "which creates low morale , destroys assurance, undermines
the legitimacy of governments everywhere, erodes personal and group commitment
to constructive activities and encourages obstructiveness to reasonable policies and
hopes". Hence this position "increases enormously the costs of creating the resources needed for
expansion, makes more likely misleading debate and misformulation of the issues, and make
less likely constructive and creative lives". Ultimately "it is precisely this position the one that
increases the potential for the kinds of disasters which most of its advocates are trying to avoid"
(Kahn, 1976, 210; 1984).
