Collisions – 1AC
the implied strategic understanding that it took the U.S. and the U.S.S.R. four decades to build. The rules of the game have changed, but the danger of
nuclear apocalypse is still real, and the risk of miscalculation has increased. Morgan echoes Johnson-
Freese’s assertion that the dual-use phenomenon complicates deterrence and extends the reasoning on offensive dominance by adding valuable insight
on the state of first-strike stability. In short, first-strike stability is difficult to maintain because the disproportionate gain from a first strike outweighs
any cost a recipient can impose in response. The United States’ overwhelming reliance on and comparative advantage from space based effects gives a
prospective attacker very high payoff and satellites being relatively soft targets increases the likelihood of success and further adds to the benefit of a
first-strike.62 Conversely, the emphasis on system based warfare means that an effective attack on space assets drastically reduces the ability of the U.S.
to impose costs. Also, its overreliance on space and the fragility of the space environment require an asymmetric response to both avoid a tit-for-tat
spiral and protect the continued use of the domain. Furthermore, a lack of space situational awareness (SSA) prevents a rapid response.63 Chinese
military planners are acutely aware of the asymmetric advantage to be gained from a first-strike in space and have integrated it into military doctrine.
This further strengthens the argument that space warfare is a potential flash point in East Asia. The structural factors examined in the literature thus far paint a
bleak picture for a peaceful restructuring of East Asia. However, a bipartisan grand strategy
that preempts conflict, is sustained and
refined over decades, and has an acute sense of both a competitor and one’s own culture and history may be able to
subvert structural determinism.64 When imperfect rationality and miscalculation result in deterrence failure, it is difficult to overestimate the importance of understanding a competitor’s decision-making apparatus. Strategic culture, political climate, and soft power interactions are the core of this
apparatus. Joan Johnson-Freese, who is equal parts East Asia policy and space policy expert, asserts that, “it might
be generally possible to grasp the mechanics of the Chinese space program without the benefits of historical
information, but the likelihood of truly understanding the policy aspects without this contextual
information is slightly less, and attempts at analysis and extrapolation become superficial at best.”65 Likewise,
competitive strategy will be ineffective absent an understanding of one’s own limitations. Resources such as latent
military capacity, budget, political capital, strategic culture, and soft power/international prestige should be easy to calculate, but many times within
the space program’s short history the failure to grasp internal limitations has been a stumbling block. Henry Kissinger’s On China is a nuanced
examination of Chinese strategic culture that benefits from the author’s understanding of Chinese history and the nation’s role in late-20th/early-21st-century global power politics. He conveys a unified message through On China, that continual diplomatic engagement between the two powers is the key to peace, and develops two motifs throughout the work. First, misapprehension
of Chinese intent by western powers has repeatedly resulted in conflict , which could be
avoided with better understanding of Chinese strategic culture. Traditional Chinese strategic culture, shaped for
millennia by geography and Confucian principles, was not destroyed by Mao and the communist revolution as many assert. Kissinger uses the traditional martial games of wei qi
(go) and chess to exemplify Chinese and western strategic cultures respectively. Where wei qi teaches the art of strategic flexibility by emphasizing encirclement, protracted and
asymmetric warfare, generating imperceptibly small advantages, and momentum; chess teaches total victory achieved by attrition, decisive moves, centers of gravity, and
symmetry. Carl von Clausewitz teaches that war is policy by other means, implying that war is a distinct phase of politics, while Sun Tzu emphasizes victory before fighting, by
achieving psychological advantage with military means as a small part of overall strategy. The ideal Chinese military conflict is geographically limited and easily contained; the
American way of war concludes only upon total victory. Kissinger then describes the feedback loop that results from conflicting
strategic perspectives. The western desire for control threatens Chinese freedom of maneuver and exacerbates their siege mentality. In response, China assumes a policy of active defense (preemption) in order to maintain
the strategic initiative. This, in turn, is seen as hostile by the west and typically results in escalation in order to establish deterrence through cost imposition. The western idea of deterrence is incompatible with ambiguity and flexibility while Chinese preemption
demands it.67 This results in a distinguishable pattern. First, a state consolidates power on China’s periphery, surrounding China and threatening its structural integrity on both physical and psychological levels. Second, ever aware of shi, Chinese strategists employ
measures to maintain their strategic flexibility and prevent total encirclement. Third, the peripheral power misinterprets preemption for aggression and escalates the conflict. At this point, China is either able to contain the threat and achieve its geopolitical aims or
it is too weak to do so and is thrown into existential crisis. In the 20th century, this pattern has been exemplified by Chinese involvement in the Korean War and its continued support of an independent state to buffer the U.S. alliance bloc from a historical ingress
point to the Chinese mainland; its own Vietnam War to prevent the emergence of a competitive power bloc led by Vietnam in Southeast Asia; and Chinese political maneuvering against the Soviet Union to prevent its consolidation of power over the Eurasian
landmass. Disregarding the similarities between these disputes and the current Sino-U.S. position in East Asia is impossible.68 Second, the Sino-centric worldview is rising in China as she emerges from a century of humiliation to become an economic and military
superpower. The over-proselytized American belief that the implementation of democracy should be the end goal of global politics and unapologetically moralist positions conflict with Sino-centrism. It is seen by China as an extension of colonial interventionism and
a threat to their fiercely held autonomy. U.S. diplomacy is often contingent on the improvement of China’s human rights record. Widespread support for China’s various separatist movements and public outcry over the Tiananmen Square incident have exacerbated
this problem. American reluctance to recognize the legitimacy of a communist government, give up democratization as a long-term policy goal, or give China its due in international relations has weakened Sino-U.S. relations. America’s moralist rather than pragmatic
approach to policy threatens China’s delicate social order and undermines CCP legitimacy, resulting in missed diplomatic opportunities. Other policy analysts are certainly influenced by Kissinger, but add their own insight into Chinese strategic culture. Kenneth
Johnson and Andrew Scobell writing for the Strategic Studies Institute both attribute the apparent cognitive dissonance in Chinese policy to a curious blend of Confucian ideals and realpolitik thought, supporting Kissinger’s assertion that Confucianism is not dead.
There is a cult of defense within China, accompanying a deeply held belief that China’s strategic culture is overwhelmingly pacifist. However, preemptive action is permissible as long as it can be a justifiable “defense” of Chinese strategic interests.70 In addition,
China bemoans aggressive territorial expansion and hegemony by force. This historical sensitivity has only been exacerbated by the “century of humiliation” at the hands of European powers.71 However, the benevolent expansion of influence and the use of force for
Chinese national unification is just. Furthermore, the Chinese fear of encirclement could cause a disproportionate reaction to the U.S. force realignment and restructuring of alliances in East Asia. This could exacerbate the security dilemma that alliance formation typically causes. Joan Johnson-Freese emphasizes the influence of Confucianism in internal decision making and the penchant for isolationism. Confucianism emphasizes peace, order, and knowing one’s place within society. This invites
authoritarianism and the Chinese people have little experience with participation in the political process. Rather, there is an instability lurking beneath the calm surface of society that leaders must constrain and satisfy in order to maintain their mandate to rule. The
social contract has a simple results-based nature in which political stability and prosperity are exchanged for continued political power. The Chinese Communist Party, then, is less beholden to communist ideology than it is to continued prosperity. Also, despite the
negative connotation of nepotism in the West, it is an institution of Chinese culture (known as guanxi). To an outsider, the familial ties, importance of relationships,
compartmentalization, and ambiguity in the Chinese bureaucracy are confusing and frustrating. This research
paints the picture of the U.S. and China as diametrically opposed cultures that are almost designed
to create misunderstanding between the two. Therefore, being aware of cultural and political sensitivities
is necessary to create sound strategy. Michael Pillsbury identifies 16 psycho-cultural pressure points that, if correctly considered in reassurance, cost imposition, or dissuasion strategies, will yield disproportionate effects, whether positive or negative. Each of these factors is referred to as a “fear”. Eleven of the sixteen fears are linked to the ability of the U.S. military
to project power into East Asia and the strategic sea lines of communication from the Strait of Malacca to the Bohai Gulf, which is contingent on the ability to deliver space effects in support of U.S. military
operations. Pillsbury identifies the fear of attack on their anti-satellite capabilities as a specific Chinese fear, but warfighting in the space domain is intrinsically linked to the other eleven. Another of the sixteen fears is
the fear of escalation and loss of control. This is particularly important because the Chinese view ASAT weaponry as a legitimate cost imposition option designed to limit conflict. Contrast that with the American
strategy of threatening escalation in order to prevent the spread of the conflict into space and implicit red lines that fail to account for limited conflict in a strategic domain. Space’s role in soft power links it to two
final fears, the fear of regional competitors and the fear of internal instability. Space technology development is essential to the CCP’s techno-nationalist narrative as it is assigned great importance internally to
strengthen the CCP’s mandate to rule and externally to legitimize China as a regional leader.
The compressed timelines characteristic of crises combine with these “use it or lose it” pressures to further shrink decision windows. This dynamic couples dangerously with the inherent difficulty of determining the
causes of satellite degradation, whether malicious or from natural causes, in a timely way. Space is a difficult environment in which
to operate. Satellites orbit amidst increasing amounts of debris. A collision with a debris object the size of a
marble could be catastrophic for a satellite, but objects of that size cannot be reliably tracked. So a failure due to a collision with a small piece of
untracked debris may be left open to other interpretations. Satellite electronics are also subject to high levels of damaging radiation. Because of their remoteness, satellites as a
rule cannot be repaired or maintained. While on-board diagnostics and space surveillance can help the user understand what went wrong, it is difficult to have a complete
picture on short timescales. Satellite failure on-orbit is a regular occurrence19 (indeed, many satellites are kept in service long past their intended lifetimes). In the past, when a satellite failed in a time of peace, a technical cause was the natural assumption; in a crisis, operators may assume malicious intent. More to the point, in a crisis when the costs of inaction may be perceived to be high, there is an incentive to choose the worst-case interpretation of events even if the
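The claim above that a marble-sized debris object could be catastrophic follows from simple kinetic-energy arithmetic. As an illustrative back-of-the-envelope sketch (the object size, material, and closing speed below are assumptions for illustration, not figures from the evidence):

```python
# Rough order-of-magnitude sketch: kinetic energy of a "marble-sized"
# piece of orbital debris at typical low-Earth-orbit collision speeds.
# Assumed values (illustrative only): ~1 cm aluminum sphere, ~10 km/s
# relative velocity.

import math

DENSITY_AL = 2700.0       # kg/m^3, aluminum
RADIUS_M = 0.005          # 1 cm diameter "marble"
REL_SPEED_M_S = 10_000.0  # typical LEO closing speed, ~10 km/s

# mass of a solid sphere: (4/3) * pi * r^3 * density
mass_kg = DENSITY_AL * (4.0 / 3.0) * math.pi * RADIUS_M ** 3

# kinetic energy: (1/2) * m * v^2
energy_j = 0.5 * mass_kg * REL_SPEED_M_S ** 2

# TNT equivalence for intuition (1 g TNT ~ 4184 J)
tnt_grams = energy_j / 4184.0

print(f"mass: {mass_kg * 1000:.2f} g")
print(f"kinetic energy: {energy_j / 1000:.1f} kJ (~{tnt_grams:.0f} g of TNT)")
```

Under these assumptions the ~1.4 g object carries on the order of 70 kJ, roughly the kinetic energy of a small car at city speeds, delivered to a structure never designed to absorb it, which is why untracked centimeter-class debris is treated as catastrophic.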
information is incomplete or inconclusive. Entanglement of strategic and tactical missions During the Cold War, nuclear and conventional arms were well separated, and escalation pathways were relatively clear. While space-
based assets performed critical strategic missions, including early warning of ballistic missile launch and secure communications in a crisis, there was a relatively clear sense that these targets were off limits, as attacks could undermine nuclear deterrence. In the
Strategic Arms Limitation Treaty, the US and Soviet Union pledged not to interfere with each other’s “national technical means” of verifying compliance with the agreement, yet another recognition that attacking strategically important satellites could be
destabilizing.20 There was also restraint in building the hardware that could hold these assets at risk. However, where the lines between strategic satellite missions and other missions are blurred, these norms can be weakened. For example, the satellites that provide
early warning of ballistic missile launch are associated with nuclear deterrent posture, but also are critical sensors for missile defenses. Strategic surveillance and missile warning satellites also support efforts to locate and destroy mobile conventional missile
launchers. Interfering with an early warning sensor satellite might be intended to dissuade an adversary from using nuclear weapons first by degrading their missile defenses and thus hindering their first-strike posture. However, for a state that uses early warning
satellites to enable a “hair trigger” or launch-on-attack posture, the interference with such a satellite might instead be interpreted as a precursor to a nuclear attack. It may accelerate the use of nuclear weapons rather than inhibit it. Misperception and dual-use
technologies Some space technologies and activities can be used both for relatively benign purposes and for hostile ones. It may be difficult for an actor to understand the intent behind the development, testing, use, and stockpiling of these technologies, leading it to see
threats where there are none. (Or miss a threat until it is too late.) This may start a cycle of action and reaction based on misperception. For example, relatively low-mass satellites can now maneuver autonomously and closely approach other satellites without their
cooperation; this may be for peaceful purposes such as satellite maintenance or the building of complex space structures, or for more controversial reasons such as intelligence-gathering or anti-satellite attacks. Ground-based lasers can be used to dazzle the sensors
of an adversary’s remote sensing satellites, and with sufficient power, they may damage those sensors. The power needed to dazzle a satellite is low, achievable with commercially available lasers coupled to a mirror which can track the satellite. Laser ranging
networks use low-powered lasers to track satellites and to monitor precisely the Earth’s shape and gravitational field, and use similar technologies.21 Higher-powered lasers coupled with satellite-tracking optics have fewer legitimate uses. Because midcourse missile
defense systems are intended to destroy long-range ballistic missile warheads, which travel at speeds and altitudes comparable to those of satellites, such defense systems also have inherent ASAT capabilities. In fact, while the technologies being developed for long-
range missile defenses might not prove very effective against ballistic missiles—for example, because of the countermeasure problems associated with midcourse missile defense— they could be far more effective against satellites. This capacity is not just theoretical.
In 2007, China demonstrated a direct-ascent anti-satellite capability which could be used in both an ASAT and a missile defense role, and in 2008, the United States used a ship-based missile defense interceptor to destroy a satellite as well. US plans indicated a
projected inventory of missile defense interceptors with capability to reach all low earth orbiting satellites in the dozens in the 2020s, and in the hundreds by 2030.22 Discrimination The consequences of interfering with a satellite may be vastly different depending
on who is affected and how, and whether the satellite represents a legitimate military objective. However, it will not always be clear who the owners and operators of a satellite are, and users of a satellite’s services may be numerous and not public. Registration of
satellites is incomplete23 and current ownership is not necessarily updated in a readily available repository. The identification of a satellite as military or civilian may be deliberately obscured. Or its value as a military asset may change over time; for example, the
share of capacity of a commercial satellite used by military customers may wax and wane. A potential adversary’s satellite may have different or additional missions that are more vital to that adversary than an outsider may perceive. An ASAT attack that creates
persistent debris could result in significant collateral damage to a wide range of other actors; unlike terrestrial attacks, these consequences are not limited geographically, and could harm other users unpredictably. In 2015, the Pentagon’s annual wargame, or
simulated conflict, involving space assets focused on a future regional conflict. The official report out24 warned that it was hard to keep the conflict contained geographically when using anti-satellite weapons: As the wargame unfolded, a regional crisis quickly
escalated, partly because of the interconnectedness of a multi-domain fight involving a capable adversary. The wargame participants emphasized the challenges in containing horizontal escalation once space control capabilities are employed to achieve limited
national objectives. Lack of shared understanding of consequences/proportionality States have fairly similar understandings of the implications of military actions on the ground, in the air, and at sea, built over decades of experience. The United States and the Soviet
Union/Russia have built some shared understanding of each other’s strategic thinking on nuclear weapons, though this is less true for other states with nuclear weapons. But in the context of nuclear weapons, there is an arguable understanding about the crisis
escalation based on the type of weapon (strategic or tactical) and the target (counterforce—against other nuclear targets, or countervalue—against civilian targets). Because of a lack of experience in hostilities that target space-based capabilities, it is not entirely clear
what the proper response to a space activity is and where the escalation thresholds or “red lines” lie. Exacerbating this is the asymmetry in space investments; not all actors will assign the same value to a given target or same escalatory nature to different weapons.
For example, the United States is the country most heavily dependent on military space assets. Its proportionally higher commitment to expeditionary forces makes this likely to be true well into the future. So while the United States seeks to create a deterrence
framework, punishment-based deterrence would not likely target its adversary’s space assets. But then there is difficulty finding targets on the ground that would be credible but would not unpredictably escalate a crisis. If an American military satellite were attacked
but without attendant human casualties (“satellites have no mothers”), retaliation against an adversary’s ground-based target is likely to escalate the conflict, perhaps justifying the adversary’s subsequent claim to self-defense, even if the initial satellite attack didn’t support such a claim. Little experience in engaging substantively in these issues Related to this issue is that there is relatively little experience among the major space actors in handling a crisis with the others. The United States and the Soviet Union, then
Russia, have had a long history of strategic discussions and negotiations. This built up a shared
understanding of each other’s point of view, developed relationships between those conducting those
discussions, and created bureaucracies and expertise to support those discussions. This experience and
these relationships are important to interpreting events and to resolving disputes before they turn into a crisis, and to managing a crisis once it begins. There is nothing like this level of
engagement around space issues between these two states, and much less between the US and China. One of
the participants in a 2010 US space war game, a diplomatic veteran, imagined25 how things would play out if one or more militarily important US
satellites failed amidst a crisis with an adversary known to have sophisticated offensive cyber and space capabilities: The good news is that there has never been a destructive conflict waged in either the space or cyber domains. The bad news is that no one around the situation room table can cite any history from previous wars, or common bilateral understandings with the
adversary, relating to space and cyber conflict as a guide to what the incoming reports mean, and what may or
may not happen next. This is the big difference between the space-cyber domains, and the nuclear domain. There is, in this future scenario,
no credible basis for anyone around the president to attribute restraint to the adversary, no track record from which to
interpret the actions by the adversary. There is no crisis management history: the
president has no bilateral understandings or guidelines from past diplomatic discussions, and no
operational protocols from previous incidents where space and cyber moves and counter-
moves created precedents. Perhaps the adversary intended to make a point with one series of limited
attacks, and hoped for talks with Washington and a compromise; but for all the president knows, sitting in
the situation room, the hostile actions taken against America’s space assets and information systems are
nothing less than early stages of an all-out assault on US interests. Where to start? How to prioritize efforts Using
this lens, what does this say about where efforts around space security should be focused? Start a substantive, high-level arms control discussion
Starting a credible high-level discussion will require countries to identify key domestic stakeholders, assemble teams of experts on relevant issues, and
develop detailed policy positions. The resulting informed dialogue will increase understanding between
countries , identify important areas of agreement and disagreement, clarify intentions, and establish better
channels of communication.
Congressman Wolf’s perspective assumes that working with the United States would give China opportunities in terms of surreptitiously obtaining U.S. technology otherwise unavailable to it. But we live in a globalized
world. Attempting to isolate Chinese space activities has proved futile, and in fact has pushed China and other countries into developing indigenous space industries (totally beyond any U.S. control) which they might not have done otherwise; they arguably reap more political and prestige benefits from doing so than if they had gotten the same technology from partnering with the U.S. The only outcome of the past two decades of strict export control there is hard data on is the damage to
the U.S. commercial space sector. Second, Wolf’s rationale assumes the United States has nothing to gain by working with
the Chinese. On the contrary, the United States could learn about how they work: their decision-making processes, institutional policies, and standard operating procedures. This is valuable information for accurately deciphering the intended use of dual-use space technology, long a weakness and so a vulnerability in U.S. analysis. Working together on an actual project where
people confront and solve problems together, perhaps a space science or space debris project where both parties can contribute something of value, builds trust on both sides, trust that is currently severely lacking. It also allows each side to understand the other’s cultural proclivities, reasoning, and institutional constraints with minimal risk of technology sharing.
Perhaps most importantly, cooperation would politically empower Chinese individuals and institutions who are stakeholders in Chinese space policy to be more favorably inclined toward the United States. A cooperative civil and commercial relationship creates interests that could inhibit aggressive or reckless behavior, as opposed to Chinese space policy being untethered to any obligations, interests,
or benefits it might obtain through cooperation with the United States. The National Academies of Sciences (NAS) 2014 report titled Pathways to
Exploration: Rationales and Approaches for A U.S. Program of Space Exploration, includes a specific recommendation that it is in U.S. interests to work with China.39 NAS has also successfully completed the first
Forum for New Leaders in Space Science with the Chinese Academy of Science in 2014. It brought together 16 early career space scientists from China and the US to meet over two workshops where they shared
research results and discussed future research opportunities. A second forum is being planned. Wolf further stated that the United States should not work with China based on moral grounds. While clearly the
United States would prefer not to work with authoritarian and/or communist regimes, it has done so in war and in peacetime when it has served American interests, and continues to do so today. That is the basis
of realism: Serve American interests first. While the United States would have preferred not to work with Stalin, it did, and we continue to work with Putin when it benefits us to do so. Were the U.S. not to work with authoritarian regimes, it would have few regimes to work with at all in the Middle East. The U.S. supported Saddam Hussein’s regime in the Iran-Iraq War. Chinese politicians are interested in the ISS for symbolic reasons; China has not “usurped” the perception of U.S. space leadership, it is being ceded to them. This rebuttal to Congressman Wolf’s views assumes that the United States has a choice regarding whether or not to work with China. If, however,
the sustainability of the space environment upon which the U.S. generally, and the U.S. military specifically, relies for advantages is to be maintained, the space debris issue alone requires that the U.S. not exclude diplomacy as a policy option. While missile defense/ASAT testing has been conducted in ways to minimize debris issues since 2007,
the potential threat to the space environment in non-test circumstances has become clear. If there was any upside to the 2007 Chinese test, it
was the frightening realization by all countries of the fragility of the space environment. With regard to China specifically, since this 2007 test China has done nothing further to significantly worsen the debris environment, and it is in the best interests of all spacefaring nations to cooperate to maintain that environment. China was scheduled to host an
international meeting of the Inter-Agency Space Debris Coordinating Committee (IADC) only days after its 2007 ASAT test that significantly worsened space debris, resulting in China cancelling the meeting out of
embarrassment. There is a certain (understandable) glee in the U.S. military, which has the most sophisticated government space tracking abilities, at being able to warn China of potential collisions between its satellites and space debris. China has since shown recognition of the need to sustain the space environment and cooperate on relevant issues, particularly the space debris issue.45 These are the type of “common ground” issues that provide
opportunitie s to work with all spacefaring nations to protect the “congested, contested and competitive” space environment. U.S.
emphasis on counterspace is often presented as a response to the presumably recent actions and intentions of other countries, specifically China. Increasingly, however, it seems speculation about Chinese intentions is based on material not publicly shared, making the validity of the speculation and the appropriateness of U.S. responses difficult to assess. For example, to my knowledge China has done nothing since its admittedly irresponsible 2007 ASAT test that goes beyond
what the U.S. considers international norms of responsible behavior. Efforts to enhance transparency and confidence-building measures, to identify “common ground among all space-faring nations,” and to build resiliency for military systems (NSSS, p. 8) must all be pursued with the same energy and commitment as counterspace operations. Otherwise, just as efforts to
isolate Chinese space activities have backfired on the U.S. in areas such as export control, the unintended
consequences of a principally “deter, defend, defeat” strategy could trigger an arms race that puts
the sustainability of the space environment at significant risk, to the detriment of U.S.
national security. With regard to resilience, specifically the purview of the Department of Defense (DOD) and the Office of the Director of National Intelligence (ODNI), resilience has faced
resistance from elements within as being too expensive or, as with space arms control, just too difficult.46 The Air Force appears to be taking the time-honored approach of studying the problem rather than acting
on it. Center for Strategic and Budgetary Assessments analyst Todd Harrison characterized part of the problem as a lack of interest on the part of Pentagon leaders. He stated, “While everyone recognizes space as a
critical enabler for the war fighter at all levels of conflict, from low to high end, it is not the sexy weapon system that puts hot metal on a target. So it doesn’t attract much interest from senior leaders.”47
Counterspace, however, offers that sexy option. Regarding transparency, the need to share information about satellite locations was recognized by the private satellite owners and operators, prompting the formation of the Space Data Association. At the government level, Space Situational Awareness (SSA) efforts have largely been to “formalize the existing model of one-way data flow from the American military to other countries and satellite operators” and the
U.S. signing bilateral agreements with France49 and Japan, and the U.S., United Kingdom (U.K.), Canada, and Australia signing a limited agreement in 2014.50 While U.S. efforts to provide
collision-avoidance information to other countries – including China – are admirable, as an
increasing number of
countries place an increasing number of satellites in orbit, improving current techniques and increasing collaboration and cooperation on exchanges of information must be aggressively
pursued.
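The “basic orbital data” exchange discussed above comes down to sharing positions and velocities precise enough for collision-avoidance (conjunction) screening. As a hedged illustration of the simplest form of that computation (a toy straight-line-motion sketch over a short window, not the algorithm any actual SSA center uses):

```python
# Toy conjunction-screening sketch: given two objects' positions (km) and
# velocities (km/s) at a common epoch, assume straight-line relative motion
# over a short window and solve analytically for the time and distance of
# closest approach. Relative state: r(t) = dr + dv*t; minimizing |r(t)|^2
# gives t* = -(dr . dv) / (dv . dv).

import math

def closest_approach(r1, v1, r2, v2):
    """Return (t_star_s, miss_km) for two objects under linear motion."""
    dr = [a - b for a, b in zip(r1, r2)]          # relative position, km
    dv = [a - b for a, b in zip(v1, v2)]          # relative velocity, km/s
    dv2 = sum(c * c for c in dv)
    t_star = 0.0 if dv2 == 0 else -sum(a * b for a, b in zip(dr, dv)) / dv2
    t_star = max(t_star, 0.0)                     # only look forward in time
    sep = [a + b * t_star for a, b in zip(dr, dv)]
    return t_star, math.sqrt(sum(c * c for c in sep))

# Hypothetical example: two objects on crossing tracks.
t, miss = closest_approach([0.0, 0.0, 0.0], [7.5, 0.0, 0.0],
                           [10.0, 1.0, 0.0], [0.0, 7.5, 0.0])
print(f"closest approach in {t:.2f} s at {miss:.2f} km")  # 0.60 s, 7.78 km
```

Even this stripped-down version shows why one-way data flow is limiting: the screening is only as good as the least accurate state vector, so mutual exchange of nonsensitive orbital data directly improves warning quality for both sides.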
Aff is best middle ground – China says yes but doesn’t link to tech
transfer
Wang 19 [Kent Wang is a research fellow at the Institute for Taiwan-America Studies. 3-4-
2019 https://www.asiatimes.com/2019/03/opinion/why-space-exploration-should-be-a-joint-
effort/]
China, which until now has developed its ambitious space program completely on its own, seems more
open to international cooperation , especially with the United States. I recently met a
group of Chinese scientists on a visit to the National Air and Space Museum in Washington, DC. I
asked them what they made of their country’s advances in space and their thoughts for the
future. They all spoke of their desire to see a fellow citizen on the moon and said they would like to
work cooperatively with the United States in space exploration. They also said they believed that
the more the US and China cooperate in space, the less likely it is that they will confront each
other militarily.
I am not suggesting that the US should share all its technology with China. However, the truth is that
the relations between those two countries are nowhere near as strained as those between the US
and the Soviet Union in the mid-1970s, so why not work toward joint missions? At the very least,
both nations could develop a capability to monitor climate change, space weather, threats from
asteroids, and other useful initiatives. Such collaboration has facilitated the development of new
technologies, created commercial opportunities, identified opportunities for shared
missions, and inspired younger generations to undertake the challenge of space
exploration.
International cooperation is a great thing. It can be a useful foreign-policy tool as well as improve
the space capabilities of the United States and its partners. A joint venture approach to manned
space programs and deep space exploration has the potential to yield diplomatic, political and
military benefits that vastly exceed in magnitude the cost of implementation.
Plan
The United States federal government should significantly increase data exchange and administration of basic orbital data on nonsensitive space assets with the People’s Republic of China.
Space Engagement – 1AC
SSA coop with China spills over to other space engagement and space
weather
Weeden 15 - Brian Weeden is the Technical Advisor for Secure World Foundation and a
former U.S. Air Force Officer with sixteen years of professional experience in space operations
and policy (“An Opportunity to Use the Space Domain to Strengthen the U.S.-China
Relationship” 9/9, http://nbr.org/research/activity.aspx?id=602)
The U.S.-China relationship in space has the potential to be a stable foundation for a stronger
overall relationship between the two countries. Space was arguably a stabilizing element in the
relationship between the United States and Soviet Union during the Cold War by providing national
capabilities to reduce tensions and an outlet for collaboration. Although the future of the U.S.-China relationship
will be characterized by both competition and cooperation, taking concrete steps to stabilize relations in space can be
part of the solution to avoiding the “Thucydides trap,” where an established power’s fear of a rising
power leads to conflict.
The Role of Space in the U.S.-China Relationship
Space is a critical domain to the security of the United States. Space capabilities enable secure, hardened communications with nuclear forces, enable
the verification and monitoring of arms control treaties, and provide valuable intelligence. Such capabilities are the foundation of the United States’
ability to defend its borders, project power to protect its allies and interests overseas, and defeat adversaries. The space domain, however, is currently
experiencing significant changes that could affect the United States’ ability to maintain all these benefits in the future. A growing number of state and
nonstate actors are involved in space, resulting in more than 1,200 active satellites in orbit and thousands more planned in the near future. Active
satellites coexist in space along with hundreds of thousands of dead satellites, spent rocket stages, and other pieces of debris that are a legacy of six
decades of space activities. As a result, the most useful and densely populated orbits are experiencing significant increases in physical and
electromagnetic congestion and interference.
Amid this change, China is rapidly developing its capabilities across the entire spectrum of space activities. It has a robust and successful human
spaceflight and exploration program that in many ways mirrors NASA’s successes in the 1960s and 1970s and is a similar source of national pride.
Although it still has a long way to go, China is developing a range of space capabilities focused on national security that one day might be second only to
those of the United States. Some
of China’s new capabilities have created significant concern within the
U.S. national security community, as they are aimed at countering or threatening the space
capabilities of the United States and other countries.
The massive changes in the space domain and China’s growing capabilities have affected the U.S.-China relationship in space. There is growing mistrust between the two countries, fueled in part by their development and testing of
dual-use technologies such as rendezvous and proximity operations and hypervelocity kinetic
kill systems. This mistrust is compounded by a misalignment in political and strategic priorities: China is focused on developing and increasing
its capabilities in the space domain, whereas the United States is focused on maintaining and assuring access to its space capabilities.
Despite these challenges and concerns, there are concrete steps that the United States and China can take to manage
tensions and possibly even work toward positive engagement. In 2011, President Barack Obama and then Chinese president
Hu Jintao issued a joint statement on strengthening U.S.-China relations during a visit by President Hu to the White House. As one of the steps
outlined in the statement, the two presidents agreed to take specific actions to deepen dialogue and exchanges in the field of space and discuss
opportunities for practical future cooperation.
President Xi Jinping’s upcoming visit presents an opportunity to build on the 2011 agreement and take steps toward these goals. The first step should
be to have a substantive discussion on space security. President Obama should clearly communicate the importance that the United States places on
assured access to space, U.S. concerns with recent Chinese counterspace testing, and the potential negative consequences of any aggressive acts in
space. Both countries should exchange views on space policies, including their interpretations of how self-defense applies to satellites and hostile
actions in space. Doing so can help avoid misunderstandings and misperceptions that could lead either country to unwittingly take actions that escalate
a crisis.
disaster response , and global environmental monitoring are all areas where the United
States and China share joint interests and could collaborate with each other and other interested countries to help
establish broader relationships outside the military realm.
In addition, the United States should take steps on its own to stabilize the relationship. First and foremost, it should get serious about making U.S.
space capabilities more resilient. Increasing resilience would support deterrence by decreasing the benefits an adversary might hope to achieve and also
help ensure that critical capabilities can survive should deterrence fail. While resilience has been a talking point for the last few years, the United States
has made little progress toward achieving the goal. Radical change is thus needed in how Washington develops and organizes national security space
capabilities. Moreover, the United States should embrace commercial services to diversify and augment governmental capabilities, while encouraging
allies to develop their own space capabilities.
Second, the United States should continue to bolster the transparency of space activities by increasing the
amount of space situational awareness (SSA) data available to satellite operators and the public. Greater transparency reinforces
ongoing U.S. and international initiatives to promote responsible behavior in space and also helps mitigate the
possibility for accidents or naturally caused events to spark or escalate tensions. Shifting responsibility for space safety to a civil agency that can share
and cooperate more easily with the international community and working with the international community to develop more publicly available sources
of SSA data outside the U.S. government are two steps that would enhance trust, improve data reliability, and reinforce norms of behavior.
The consequences of not addressing the current strategic instability in space are real. A future conflict in space between the United States and China
would have devastating impacts on everyone who uses and relies on space. Both the United States and China have acknowledged the dangers of
outright conflict and have pledged their interest in avoiding such an outcome. Taken together, the
initial steps outlined here could
help stabilize the U.S.-China strategic relationship in space, mitigate the threat of the worst-case
scenario, and work toward a more positive outcome that benefits all.
There’s robust US data disclosure in the squo but only bilateral SSA
framework spills over to broader space engagement with China
Sankaran 14 [Jaganath Sankaran, postdoctoral fellow at the Belfer Center for Science and
International Affairs at Harvard’s Kennedy School of Government and was previously a Stanton Nuclear
Security Fellow at the RAND Corporation. Sankaran received his doctorate in international security from
the Maryland School of Public Policy, Winter 2014 “Limits of the Chinese Antisatellite Threat to the
United States” JSTOR]
The United States should also study and improve its ability to use measures like satellite sensor
shielding and collision avoidance maneuvers for satellites. These would dilute an adversary’s
ASAT operation and increase the apparent uncertainty of the consequences of an ASAT
attack.46 Monitoring mechanisms—both technical and nontechnical—that provide long warning
times and the ability to definitively identify an attacker in real time should also be a priority. The
US Air Force has started to invest in such capabilities on a small scale. Gen William Shelton,
head of Air Force Space Command, announced on 21 February 2014 the upcoming launch of the
geosynchronous space situational awareness (SSA) system designed to “have a clear,
unobstructed and distinct vantage point for viewing resident space objects.”47 Such systems will
help in attributing an ASAT attack. Similarly, the ground-based Rapid Attack, Identification,
Detection, and Reporting System (RAIDRS) is a valuable US asset to identify, characterize, and
geolocate attacks against US satellites.48
However, these unilateral measures offer no direct positive inducement for the Chinese
decision maker to desist from taking an aggressive posture on space security. Such inducements
will require more cooperative ventures that integrate China more deeply into the global
space community. The United States could, for example, make available its data on satellite
traffic and collisions, which would help China streamline its space operations. Such gestures
demonstrate a modicum of goodwill which can encourage further cooperation. The United
States has already put in place policy actions to share SSA data with allies. The latest guidance
document on US space policy, the National Security Space Strategy released in 2011 by the
Office of the Secretary of Defense and the Office of the Director of National Intelligence, states
that “the United States is the leader in space situation awareness (SSA) and can use its
knowledge to foster cooperative SSA relationships, support safe space operations, and protect
US and allied space capabilities and operations.”49 However, the United States has been more
forthcoming and willing to ink data-sharing arrangements with allies than with China. The US
Strategic Command (USSTRATCOM) has signed SSA data agreements with Japan, Australia,
the UK, Italy, Canada, and France.50 Although there may be security reasons behind this
preference to engage primarily with allies, it is important to realize that China is the nation that
most needs to be induced to contribute to the peaceful development of space operations. The
United States should use all available diplomatic leverage to partner with China and share
SSA data to make it a part of the global space community.
Mitigation
The EMP Commission, in its 2008 report, found that it is not practical to try to protect the entire electrical power system or even all high-value components. It called, however, for a plan designed to reduce recovery and restoration times and minimize the net impact of an event [20]. This would be accomplished by “hardening” the grid, i.e., actions to protect the nation’s electrical system from disruption and collapse, either natural or man-made [21]. The shielding is accomplished through surge arrestors and similar devices [22]. The cost to harden the grid, from our tabulation of Congressional EMP figures, is $3.8 billion.
There has been no hardening of the grid
The commission and organization that are responsible for
public policy on grid protection are FERC and NERC. FERC (The Federal Energy Regulatory Commission) is an independent agency
within the Department of Energy. NERC, the self-regulatory agency formed by the electric utility industry, was renamed the North
American Electric Reliability Corporation in 2006. In June of 2007, FERC granted NERC the legal authority to enforce reliability
standards for the bulk power system in the USA. FERC cannot mandate any standards. FERC only has the authority to ask NERC to
propose standards for protecting the grid. NERC’s position on GMD is that the threat is exaggerated. A report
by NERC in 2012 asserts that geomagnetic storms will not cause widespread destruction of
transformers, but only a short-term (temporary) grid instability [23]. The NERC report did not use a model that
was validated against past storms, and their work was not peer-reviewed. The NERC
report has been criticized by members of the Congressional EMP commission. Dr. Peter Pry asserts that the final draft
was “written in secret by a small group of NERC employees and electric utility insiders….. The report
relied on meetings of industry employees in lieu of data collection or event
investigation” [22]. NERC, in turn, criticizes Kappenman’s work. NERC states that the Metatech work cannot be
independently confirmed [24]. NERC reliability manager Mark Lauby criticized the report for being based on proprietary code [24].
Kappenman’s report, however, received no negative comments in peer review [24].
The NERC standards
The reliability standards and operational procedures established by NERC, and approved by FERC, are disputed [25].
Among the points are these:
1. The standards against GMD do not include Carrington storm class levels. The NERC standards were arrived at studying only the storms of the immediate prior 30 years, the largest of which was the Quebec storm. The GMD “benchmark event”, i.e., the strongest storm which the system is expected to withstand, is set by NERC as 8 V/km [26]. NERC asserts this figure defines the upper-limit intensity of a 1-in-100-year storm [26]. The Los Alamos National Laboratory, however, puts the intensity of a Carrington-type event at a median of 13.6 V/km, ranging up to 16.6 V/km [27]. Another analysis finds the intensity of a 100-year storm could be higher than 21 V/km [28].
2. The 15–45 min warning time of a geomagnetic storm provided by space satellites (ACE and DSCOVR) will be insufficient for operators to confer, coordinate, and execute actions to prevent grid damage and collapse.
Testimony of Edison Electric Institute official Scott Aaronson under questioning by Senator Ron
Johnson in a hearing before the Senate Homeland Security and Governmental Affairs Committee in 2016
encapsulates some of the issues. Video of the exchange is available on the web [29]. The Edison Electric Institute (EEI)
is the trade association that represents all US investor-owned electric companies.
Johnson: Mr. Aaronson, I just have to ask you – the protocol of warning 15–30 min – who is going to make that call? I mean, who is going to make that for a massive geomagnetic disturbance, that nobody knows how many of these transformers are going to be affected. Who is going to make that call to shut them off line – to take them off line – so those effects do not go through those wires and destroy those large transformers that cannot be replaced?
Aaronson: So, the grid operators are tightly aligned. We talked about the fact that there are 1900 entities that make up the bulk electric system. There are transmission operators and so on…
Johnson (interrupting): Who makes the call? Who makes the call – we are going to shut them all down in 30 min, in 15 min?
Aaronson: It’s not as simple as cutting the power. That’s not how this is going to work but there is again, there is this shared responsibility among the sector.
Johnson: Who makes the call?
Aaronson: I do not know the answer to that question [29].
Mr. Aaronson is Managing Director for Cyber and Infrastructure Security at EEI. Congressman Trent Franks, R-Ariz., introduced HR 2417, the SHIELD Act, on 6/18/2013. The bill would give FERC the
authority to require owners and operators of the bulk power system to take measures to protect the grid from GMD or EMP attack.
The costs would be recovered by raising regulated rates. Franks states he had been led to believe that his bill would be brought to the
House floor for a vote. But he states House Energy and Commerce Committee Chairman Fred Upton R, Mich., let it die in
committee. He has been unable to get an explanation from Upton [30]. Between 2011 and 2016, Mr. Upton has received $1,180,000
in campaign contributions from the electric utility industry [31]. The electric utility industry is heavily involved in campaign
donations. During the 2014 federal election cycle, the electric utility industry made $21.6 million in campaign contributions [32].
The electrical utility industry is particularly involved in state politics. For instance, in Florida, between 2004 and 2012 electric utility
companies donated $18 million into legislative and state political campaigns. In that state, the electric utilities employ one lobbyist
for every two legislators [33]. Electric utility revenue in 2015 was 391 billion dollars [34].
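The dispute over the NERC benchmark is, at bottom, arithmetic: every Carrington-class estimate quoted above exceeds the 8 V/km design level by a wide margin. A quick back-of-the-envelope comparison, using only the figures from the text (the dictionary labels are ours):

```python
# GMD field-intensity figures as quoted in the text (V/km)
nerc_benchmark = 8.0  # NERC 1-in-100-year "benchmark event" [26]

carrington_estimates = {
    "Los Alamos median, Carrington-type": 13.6,  # [27]
    "Los Alamos upper range": 16.6,              # [27]
    "100-year storm, other analysis": 21.0,      # [28]
}

for label, v_per_km in carrington_estimates.items():
    ratio = v_per_km / nerc_benchmark
    print(f"{label}: {v_per_km} V/km = {ratio:.1f}x the NERC benchmark")
```

Even the most conservative of these estimates implies fields roughly 70% above the level the grid is required to withstand, which is the critics' core objection to the standard.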
Electromagnetic pulse
Of the scenarios that might lead to electrical network collapse, EMP has received the widest public attention. It has been the subject of television series,
films, and novels. HEMP (for high altitude) is the more accurate acronym, but as media and the public use EMP, we will use both interchangeably. The issue has become highly
politicized. The most prominent article in the media against EMP as a threat is by Patrick Disney, “The Campaign to Terrify You about EMP” published in the Atlantic in 2011.
“From Newt Gingrich to a Congressional ‘EMP Caucus’, some conservatives warn the electronic frying blast could pose gravely underestimated dangers on the U.S…..Ballistic
missile defense seems to be the panacea for this groups concern, though a generous dose of preemption and war on terror are often prescribed as well” [35]. As of 2009, Mr.
Disney was acting Policy Director for the National Iranian American Council (NIAC). NIAC has been accused of acting as a lobby for the Islamic Republic of Iran [36]. Mr.
Disney is quoted as stating his strategy, in advancing an Iranian interest, is to “create a media controversy” [36]. The campaign to discredit EMP has been largely successful. To a
very large part of the body politic EMP is identified as a cause limited to the far right. A high-altitude electromagnetic pulse (EMP) is produced when a nuclear device is
detonated above the atmosphere. No radiation, blast, or shock wave is felt on the ground, nor are there any adverse health effects, but electromagnetic fields reach the surface.
An EMP has three components, E1 through E3, defined by speed of the pulse. Each has specific characteristics, and specific potential effects on the grid. E1, the first and fastest
component, affects primarily microelectronics. E3, the later and slower component, affects devices attached to long conductive wires and cables, especially high-voltage
transformers. A single nuclear blast will generate an EMP encompassing half the continental USA [37]. Two or three explosions, over different areas, would blanket the entire
continental USA. The potential impact of an EMP is determined by the altitude of the nuclear detonation, the gamma yield of the device, the distance from the detonation point,
the strength and direction of the earth’s magnetic field at locations within the blast zone and the vulnerability of the infrastructures exposed. The E1 gamma signal is greatest for
bursts between 50 and 100 km altitude. E3 signals are optimized at bursts between 130 and 500 km altitude, much greater heights than for E1 [38]. Higher altitude widens the area covered, but at the expense of field levels. The 1963 atmospheric test ban has prevented further testing.
E1 and its effects
The E1 pulse (“fast pulse”) is due to gamma
radiation (photons), generated by a nuclear detonation at high altitude, colliding with atoms in the upper atmosphere. The collisions cause electrons to be stripped from the
atoms, with the resultant flow of electrons traveling downward to earth at near the speed of light. The interaction of the electrons with the earth’s magnetic field turns the flow
into a transverse current that radiates forward as an intense electromagnetic wave. The field generates extremely high voltages and current in electrical conductors that can
exceed the voltage tolerance of many electronic devices. All this occurs within a few tens of nanoseconds. The Congressional EMP Commission postulated that E1 would have its
primary impact on microelectronics, especially SCADAs (Supervisory Control and Data Acquisition), DCSs (digital control systems), and PLCs (programmable logic controllers).
These are the small computers, numbering now in the millions, that allow for the unmanned operation of our infrastructure. To assess the vulnerability of SCADAs to EMP, and
therefore the vulnerability of our infrastructure, the EMP Commission funded a series of tests, exposing SCADA components to both radiated electric fields and injected voltages
on cables connected to the components. The intent was to observe the response of the equipment, when in an operational mode, to electromagnetic energy simulating an EMP.
“The bottom line observation at the end of the testing was that every system tested failed when exposed to the simulated EMP environment” [6]. E1 can generate voltages of
50,000 V. Normal operating voltages of today’s miniaturized electronics tend to be only a few (3-4) volts. States the EMP Commission: “The large number and widespread
reliance on such systems by all the nation’s critical infrastructures represent a systemic threat to their continued operation following an EMP event” [39]. A scenario seen in
films is all automobiles and trucks being rendered inoperable. This would not be the case. Modern automobiles have as many as 100 microprocessors that control virtually all
functions, but the vulnerability has been reduced by the increased application of electromagnetic compatibility standards. The EMP Commission found that only minor damage
occurred at an E1 field level of 50 kV/m, but there were minor disruptions of normal operations at lower peak levels as well [40]. There is a self-published post (J. Steinberger,
Nobel laureate physics, 1988) disputing the potential effects of E1 [41]. This is an isolated opinion. Shielding against E1 could theoretically be accomplished through the
construction of a Faraday cage around specific components or an entire facility. The cage is composed of conductive materials and an insulation barrier that absorbs pulse
energy and channels it directly into the ground. The cage shields out the EM signals by “shorting out” the electric field and reflecting it. To be an effective Faraday cage, the
conductive case must totally enclose the system. Any aperture, even microscopic seams between metal plates, can compromise the protection. To be useful, however, a device
must have some connection with the outside world and not be completely isolated. Surge protective devices can be used on metallic cables to prevent large currents from
entering a device, or the metallic cables can be replaced by fiber optic cables without any accompanying metal. The US Military has taken extensive measures to protect
(“harden”) its equipment against E1. “On the civilian side, the problem has not really been addressed” [42].
E3 and its effects
E3 is caused by the motion of ionized bomb debris
and atmosphere relative to the geomagnetic field, resulting in a perturbation of that field. This induces currents of thousands of amperes in long conductors such as transmission
lines that are several kilometers or greater in length. Direct currents of hundreds to thousands of amperes will flow into transformers. As the length of the conductor increases,
the amperage amplifies. The physics of E3 are similar to that of a GMD, but not identical. GMD comes from charged particles showering down from space creating current flow
in the ionosphere. These currents create magnetic fields on the ground. A nuclear burst on the other hand generates particles which create a magnetic bubble that pushes on the
earth’s magnetic field producing a changing magnetic field at the Earth’s surface. A geomagnetic storm will have substorms that can move over the Earth for more than 1 day,
while the E3 HEMP occurs only immediately following a nuclear burst. There are three studies on the potential effects of a HEMP E3 on the power grid. The first study,
published in 1991, found there would be little damage [43]. Although supporting the utility industry’s position, it has not been subsequently cited by either NERC or the industry.
The study is criticized for expressing a smaller threat intensity [44]. The second, published in 2010 by Metatech, calculated that a nuclear detonation 170 km over the USA would
collapse the entire US power grid [45]. The third study, by EPRI (an organization funded by the electric utility industry) published in February 2017, asserts that a single high-
altitude burst over the continental USA would damage only a few, widely scattered transformers [46]. The study is disputed for underestimating threat levels and using
erroneous models [44]. These results are incompatible. One’s interpretation of the studies on E3 (and GMD) is based largely on the credibility one gives to the underlying
Commission or Institute, and not the published calculations. FERC has decided not to proceed with a GMD standard that includes EMP [47]. It will be recalled the GMD
standard is 8 V/km. The EMP Commission, utilizing unclassified measured data from the Soviet era nuclear tests, found an expected peak level for E3 HEMP for a detonation
over the continental USA would be 85 V/km [48]. The position of the electric utility industry is that E3 from a nuclear detonation is not a critical threat [49]. Others have come
to a different conclusion. Israel has hardened her grid [50]. She perceives herself to face an existential threat, and it is not the Sun. The electric utility industry states the cost of
hardening the grid against EMP is the government’s responsibility, not the industry’s [51].
Cyberattack
The vulnerability from a cyberattack is exponentially magnified by our
dependence on SCADAs. In 2010, a computer worm attacking SCADA systems was detected. Although widely spread, it was designed to only attack SCADA systems
manufactured by Siemens for P-1 centrifuges of the Iranian nuclear enrichment program. The attack destroyed between 10 and 20% of Iranian centrifuges. Iran’s program was
likely only briefly disrupted [52]. In December 2015, a cyberattack was directed against the Ukrainian power grid. It caused little damage as the grid was not fully automated.
There is an argument that the cyber threat is exaggerated. Thomas Rid states that viruses and malware cannot at present collapse the electric grid. “(The world has) never seen a
cyber- attack kill a single human being or destroy a building” [53]. The electric utility industry offers a similar perspective. In testimony on cybersecurity before the Senate
Homeland Security and Governmental Affairs Committee, its representative states that “There are a lot of threats to the grid…..from squirrels to nation states, and frankly, there
have been more blackouts as a result of squirrels (gnawing wire insulation) then there are from nation states” [54]. Others however express concern [55]. Moreover, in a report
by the Department of Defense in 2017, it is noted that “the cyber threat to critical US infrastructure is outpacing efforts to reduce pervasive vulnerabilities.” [56] That report
notes that “due to our extreme dependence on vulnerable information systems, the United States today lives in a virtual glass house” [57]. On March 15, 2018, the Department of
Homeland Security issued an alert that the Russian government had engineered a series of cyberattacks targeting American and European nuclear power plants and water and
electric systems [58]. It is reported these attacks could allow Russia to sabotage or shut down power plants at will [59]. The ability to operate a system in the absence of
computer-driven actions is fast disappearing. The electric power industry spends over $1.4 billion dollars annually to replace electromechanical systems and devices that involve
manual operation with new SCADA equipment [60]. With modest increases in efficiency come exponential increases in vulnerability. The extent to which reduced labor costs
(and perhaps reduced energy costs) are passed on to the public is uncertain.
Kinetic attack
An internal FERC memo obtained by the press in March 2012 states that “destroy
nine interconnector substations and a transformer manufacturer and the entire United States grid would be down for 18 months, possibly longer” [61]. The mechanism is
through the megawatts of power that would be dumped onto other transformers, causing them to overheat and, in cascading fashion, cause the entire system to overload and fail.
At Metcalf California (outside of San Jose) on April 16, 2013, a HV Transformer owned by PG&E sustained what NERC and PG&E claimed was merely an act of vandalism [1].
Footprints suggested as many as 6 men executed the attack. They left no fingerprints, not even on the expended shell casings [1]. US FERC Chairman Wellinghoff concluded that
the attack was a dry run for future operations [62]. Information on how to sabotage transformers has been available online [63]. There is a disincentive for management to
invest in security. As stated in a report by the Electric Research Power Institute: “Security measures, in themselves, are cost items, with no direct monetary return. The benefits
are in the avoided costs of potential attacks whose probability is generally not known. This makes cost-justification very difficult” [64]. CEO pay at large American companies is
based on the Harvard Business School theory that the best measure of managerial performance is a company’s stock price. This does not necessarily align the interests of CEOs
with shareholders, let alone the public. It “encourages short-term boosts to profits rather than investing for long term growth” [65]. In 2014, the CEO of PG&E, Anthony Earley Jr., had a compensation of $11.6 million. Over 90% was from bonuses based on stock performance. The President of PG&E, Christopher Johns, had a compensation of $6 million [66]. There is no evidence, however, that any of this is in play in the positions of the electric utility industry vis-à-vis securing the grid. States PG&E spokesman
Jonathan Marshall, “The majority of compensation for senior executives is shareholder funded and dependent on achieving targets related to safety, reliability and other results”
[66].
Consequences of a sustained power outage
The EMP Commission states: “Should significant parts of the electrical power infrastructure be lost for any substantial period of time, the Commission believes that the consequences are likely to be catastrophic, and many people will die for the lack of the basic elements necessary to sustain life in dense urban and suburban communities” [67]. Space constraints preclude discussion of how the loss of the grid would render synthesis and distribution of oil and gas inoperative. Telecommunications would collapse, as would finance and banking. Virtually all technology, infrastructure, and services require electricity. An EMP attack that collapses the electric power grid will collapse the water infrastructure: the delivery and purification of water and the removal and treatment of wastewater and sewage. Outbreaks that would result from the failure of these systems include cholera. Fuel to boil water may not be available. Lack of water will cause death in 3 to 4 days [68]. Food production would also collapse. Crops and livestock require water delivered by electrically powered pumps. Tractors, harvesters, and other farm equipment run on petroleum products supplied by an infrastructure (pumps, pipelines) that requires electricity. The plants that make fertilizer, insecticides, and feed also require electricity. Gas pumps that fuel the trucks that distribute food require electricity. Food processing requires electricity. In 1900, nearly 40% of the population lived on farms. That percentage is now less than 2% [69]. It is through technology that 2% of the population can feed the other 98% [68].
The acreage under cultivation today is only 6% more than in 1900, yet productivity has increased 50-fold [69]. As stated by Dr. Lowell L. Wood in Congressional testimony: “If we were no longer able to fuel our agricultural machine in the country, the food production of the country would simply stop, because we do not have the horses and mules that used to tow agricultural gear around in the 1880s and 1890s. So the situation would be exceedingly adverse if both electricity and the fuel that electricity moves around the country……… stayed away for a substantial period of time: we would miss the harvest, and we would starve the following winter” [70]. People can live for 1–2 months without food, but after 5 days they have difficulty thinking, and at 2 weeks they are incapacitated [68]. There is typically a 30-day perishable food supply at regional warehouses, but most would be destroyed with the loss of refrigeration [69]. The EMP Commission has suggested food be stockpiled for a possible EMP event.
Yet there’s been surprisingly little research on the subject, says Anders Sandberg, a catastrophe researcher at the
University of Oxford’s Future of Humanity Institute in the United Kingdom. Last he checked, “there are more papers about dung
beetle reproduction than human extinction,” he says. “We might have our priorities slightly wrong.”
Frequent, moderately severe disasters such as earthquakes attract far more funding than low-probability apocalyptic ones.
Prejudice may also be at work; for instance, scientists who pioneered studies of asteroid and comet impacts
complained about confronting a pervasive “giggle factor.” Consciously or unconsciously, Sandberg says,
many researchers consider catastrophic risks the province of fiction or fantasy—not serious science.
A handful of researchers, however, persist in thinking the unthinkable. With enough knowledge
and proper
planning, they say, it’s possible to prepare for—or in some cases prevent—rare but devastating natural
disasters. Giggle all you want, but the survival of human civilization could be at stake.
Now the assistant director of space weather at the White House Office of Science and Technology Policy in Washington, D.C.,
Murtagh spends much of his time pondering solar eruptions. CMEs don’t harm human beings directly, and their
effects can be spectacular. By funneling
charged particles into Earth’s magnetic field, they can trigger
geomagnetic storms that ignite dazzling auroral displays. But those storms can also induce dangerous
electrical currents in long-distance power lines. The currents last only a few minutes, but they can take out
electrical grids by destroying high-voltage transformers—particularly at high latitudes, where Earth’s
magnetic field lines converge as they arc toward the surface.
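The induction mechanism above lends itself to a back-of-envelope estimate (all numbers hypothetical, not from the source): a storm-time geoelectric field integrated along a long transmission line produces an EMF that drives a quasi-DC current through the line and its grounded transformer windings, by Ohm's law.

```python
# Sketch of a geomagnetically induced current (GIC) estimate:
# EMF = geoelectric field (V/km) x line length (km), and the quasi-DC
# current through the grounded circuit is I = EMF / loop resistance.

def gic_amps(e_field_v_per_km, line_km, loop_resistance_ohms):
    """Quasi-DC current driven through a grounded line, by Ohm's law."""
    emf_volts = e_field_v_per_km * line_km  # EMF induced along the line
    return emf_volts / loop_resistance_ohms

# A severe storm can produce fields of a few V/km at high latitudes.
i = gic_amps(e_field_v_per_km=5.0, line_km=200.0, loop_resistance_ohms=10.0)
print(round(i))  # → 100
```

Even a DC bias of this rough magnitude is significant because transformers are designed for pure AC; a sustained DC offset drives the core into saturation, causing the overheating the passage describes.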
The worst CME event in recent history struck in 1989, frying a transformer in New Jersey and leaving 6 million people in Quebec
province in Canada without power. The largest one on record—the Carrington Event of 1859, named after the
U.K. astronomer who witnessed the accompanying solar flare—was up to 10 times more intense. It sent searing currents racing
through telegraph cables, sparking fires and shocking operators, while the northern lights danced as far south as Cuba.
What’s clear is that widespread blackouts could be catastrophic, especially in countries that depend on highly developed electrical grids. “We’ve done a marvelous job creating a great vulnerability to this threat,” Murtagh says. Information technologies, fuel pipelines, water pumps, ATMs, everything with a plug would be rendered useless. “That’s going to affect our ability to govern the country,” Murtagh
says.
Space science and technologies have wide applications, for example in managing public health emergencies, forecasting epidemics, facilitating early warning and disaster management plans, as well as monitoring environmental parameters.5 The by-products of space-based technologies and innovations can make substantial contributions to injury prevention from road crashes. In health systems research, space-based research, such as on the International Space Station, can provide unique data on physiologic and biological processes, which may point to potential novel therapeutic approaches to disease. Space science and technology thus contribute to epidemic intelligence, health emergencies and the research agenda on the benefits to public health. Space-based innovation in the health sector is poised to bring significant health and economic gains through the adoption of product and process innovations at all levels of health services. This will contribute to delivering better health for all people and could reduce the disease burden. These innovations have a clear place in prevention, preparedness, response and recovery: all the important stages of national public health management. Thus far, the use of space science and technology in public health has been sporadic. Earth observation data from orbiting satellites and ground-based sensors are already used by a few countries to make public health decisions, but more countries could use space-based technologies and geospatial information in this way.
To use space science and technologies in advancing the health sustainable development goal, appropriate national-level policies and
governance mechanisms are essential. National
governments are encouraged to strengthen policy and
governance mechanisms for closer collaboration between health ministries, other relevant ministries and space agencies to
leverage the benefits of space science and technologies for health gains. Governments should ensure
national technical readiness for geospatial information management as well as the use of space-based innovations and integrate the
use of geospatial data in health systems strengthening efforts.
We fear it is only a matter of time before we face a deadlier and more contagious pathogen, yet the threat of a deadly pandemic remains dangerously overlooked. Pandemics now occur with greater frequency, due to factors such as climate change, urbanization, and international travel. Other factors, such as a weak World Health Organization and potentially massive cuts to funding for U.S. scientific research and foreign aid, including funding for the United Nations, stand to deepen our vulnerability. We also face the specter of novel and mutated pathogens that could spread and kill faster than diseases we have seen before. With the advent of genome-editing technologies, bioterrorists could artificially engineer new plagues, a threat that Ashton Carter, the former U.S. secretary of defense, thinks could rival nuclear weapons in deadliness. The two of us have advised the president of Guinea on stopping Ebola. In addition, we have worked on ways
to contain the spread of Zika and have informally advised U.S. and international organizations on the matter. Our experiences tell us
that the world is unprepared for these threats. We urgently need to change this trajectory. We can start by learning four
lessons from the gaps exposed by the Ebola and Zika pandemics.
Faster Vaccine Development
The most effective way to stop pandemics is with vaccines. However, with Ebola there was no vaccine, and only now, years later, has one proven effective. This has been the case with Zika, too. Though there has been rapid progress in developing and getting a vaccine to market, it is not fast enough, and Zika has already spread worldwide. Many other diseases do not have vaccines, and developing them takes too long when a pandemic is already under way. We need faster pipelines, such as the one that the Coalition for Epidemic Preparedness Innovations is trying to create, to preemptively develop vaccines for diseases predicted to cause outbreaks in the near future.
Point-of-Care Diagnostics
Even with such efforts, vaccines will not be ready for many diseases and would not even be an option for
novel or artificially engineered pathogens. With no vaccine for Ebola, our next best strategy was to identify who was infected as quickly as possible and
isolate them before they infected others. Because Ebola’s symptoms were identical to common illnesses like malaria, diagnosis required laboratory
testing that could not be easily scaled. As a result, many patients were only tested after several days of being contagious and infecting others. Some were
never tested at all, and about 40% of patients in Ebola treatment centers did not actually have Ebola. Many dangerous pathogens similarly require
laboratory testing that is difficult to scale. Florida, for example, has not been able to expand testing for Zika, so pregnant women wait weeks to know if
their babies might be affected. What’s needed are point-of-care diagnostics that, like pregnancy tests, can be used by frontline responders or patients
themselves to detect infection right away, where they live. These tests already exist for many diseases, and the technology behind them is well-
established. However, the process for their validation is slow and messy. Point-of-care diagnostics for Ebola, for example, were available but never used
because of such bottlenecks.
Greater Global Coordination
We need stronger global coordination. The responsibility for controlling pandemics is fragmented, spread across too many players with no unifying authority. In Guinea we forged a response out of an amalgam of over 30 organizations, each of which had its own priorities. In Ebola’s aftermath, there have been calls for a mechanism for responding to pandemics similar to the advance planning and training that NATO has in place for its numerous members to respond to military threats in a quick, coordinated fashion. This is the right thinking, but we are far from seeing it happen. The errors that allowed Ebola to become a crisis replayed with Zika, and the WHO, which should anchor global action, continues to suffer from a lack of credibility.
Stronger Local Health Systems
International actors are essential but cannot parachute into countries and navigate local dynamics quickly enough to contain outbreaks. In Guinea it took months to establish the ground game needed to stop the pandemic, with Ebola continuing to spread in the meantime. We need to help developing countries establish health systems that can
provide routine care and, when needed, coordinate with international responders to contain new outbreaks. Local health systems could be established
for about half of the $3.6 billion ultimately spent on creating an Ebola response from scratch. Access to routine care is also
essential for knowing when an outbreak is taking root and establishing trust. For months, Ebola spread before anyone knew it was happening,
and then lingered because communities who had never had basic health care doubted the intentions of foreigners flooding into their villages. The
turning point in the pandemic came when they began to trust what they were hearing about Ebola and understood what they needed to do to halt its
spread: identify those exposed and safely bury the dead. With Ebola and Zika, we lacked these four things — vaccines, diagnostics, global coordination, and local health systems — which are still urgently needed. However, prevailing political headwinds in the United States, which has played a key role in combatting pandemics around the world, threaten to make things worse. The Trump administration is seeking drastic budget cuts in funding for foreign aid
and scientific research. The U.S. State Department and U.S. Agency for International Development may lose over one-third of their budgets, including
half of the funding the U.S. usually provides to the UN. The National Institutes of Health, which has been on the vanguard of vaccines and diagnostics
research, may also face cuts. The Centers for Disease Control and Prevention, which has been at the forefront of responding to outbreaks, remains
without a director, and, if the Affordable Care Act is repealed, would lose $891 million, 12% of its overall budget, provided to it for immunization
programs, monitoring and responding to outbreaks, and other public health initiatives. Investing in our ability to prevent and
contain pandemics through revitalized national and international institutions should be our shared
goal. However, if U.S. agencies become less able to respond to pandemics, leading institutions from other nations, such as Institut Pasteur and the
National Institute of Health and Medical Research in France, the Wellcome Trust and London School of Hygiene and Tropical Medicine in the UK, and
nongovernmental organizations (NGOs), which have done instrumental research and response work in previous pandemics, would need to step in to fill the void. The pandemic threat belongs alongside climate change and nuclear conflict. We are at a critical crossroads, where we must either take the steps needed to
prepare for this threat or become even more vulnerable. It is only a matter of time before we are hit by a deadlier, more contagious pandemic. Will we
be ready?
Healthcare has a lot to gain from the wearable and smart device industry — and in turn, it might be the medical industry that ensures the success of these devices. From Bluetooth-enabled hearing aids to robotic caretakers, there are some IoT healthcare innovations we would have relegated to science fiction only a decade ago.
But the technology is here, and it’s more advanced than you may think. As a white paper by Freescale Semiconductor points out:
“The long-predicted IoT revolution in healthcare is already underway … The IoT plays a
significant role in a broad range of healthcare applications, from managing chronic diseases at
one end of the spectrum to preventing disease at the other.”
1. Precise Patient Tracking
Smart devices and wearables benefit both patients and doctors. Doctors are using various medical apps to track their patients’ ongoing health concerns. Meanwhile, patients are able to receive recommendations and new advice as their treatment plan progresses.
This kind of tracking can take place in a hospital setting, but also at home. In some cases, the technology is a literal lifesaver. As early as
2009 there were cases where patients were saved thanks to their Wi-Fi-enabled pacemaker, as documented in an article in the journal Eurospace, which states:
“A 66-year-old patient with a Medtronic Concerto CRT-D for primary prevention of sudden death phoned the clinic complaining of fatigue since two days, without any malaise
or ICD shocks. Remote interrogation of the device . . . showed slow irregular VT (Ventricular Tachycardia).”
Doctors immediately told the patient to go to the clinic. Only 15 minutes after arriving, the patient collapsed from a heart attack. Luckily, doctors were ready and able to help.
Other patients with heart disease continue to benefit from the technology, which is becoming more advanced every day.
Diabetes is one condition that inspires the development of many smart healthcare devices. These connected devices don’t only help patients monitor their condition, but also
help with preventative healthcare.
iHealth Smart, for example, is a Bluetooth-enabled device that helps users track their glucose levels. It comes with a paired app to help them understand and keep track of the
data.
Connected versions of everyday items also play a role. Siren Smart Socks, for example, use temperature monitoring to help prevent foot ulcers — a complication many diabetic patients face. The socks send this data to a companion app, which will notify users as to whether they should do a visual foot check, change their shoes, reduce physical activity or visit a doctor.
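The kind of rule such a companion app might apply can be sketched as follows (a hypothetical illustration, not Siren's actual algorithm; the threshold and sensor values are assumed): a sustained temperature difference between matching spots on each foot can indicate the inflammation that precedes an ulcer.

```python
# Toy sketch of a paired-sensor alert: flag any sensor site where the
# left/right temperature difference exceeds a threshold (hypothetical).

def foot_check_alert(left_temps_c, right_temps_c, threshold_c=2.2):
    """Flag each sensor site whose left/right temperatures diverge."""
    return [abs(l - r) > threshold_c
            for l, r in zip(left_temps_c, right_temps_c)]

# Six paired sensor sites; the fourth shows a 3.1 C asymmetry.
left = [30.1, 30.4, 29.8, 33.5, 30.0, 30.2]
right = [30.0, 30.2, 29.9, 30.4, 30.1, 30.3]
print(foot_check_alert(left, right))
# → [False, False, False, True, False, False]
```

Comparing the two feet against each other, rather than against a fixed temperature, is what makes a rule like this robust to ambient conditions.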
Some connected healthcare devices add an element of convenience to patients’ lives. This is the aim of the Widex COM-DEX, a Bluetooth-enabled
hearing aid that users can adjust and customize with an app.
This connection with the user’s phone also brings other benefits, such as the ability to take calls or listen to music through the hearing aid.
“I chose a smart hearing aid mainly because of its size and style. I also liked the fact that I could control it from my phone via the COM-DEX,” a user of the device, Amy Bell, told
MakeUseOf.
Bell requires a hearing aid in one ear due to a diving accident that damaged her cochlear hair cells. This makes her sensitive to high-pitched noises, while she cannot properly
hear low or bass tones.
“With the COM-DEX connected to my hearing aid and phone I’m able to hear phone calls through my hearing aid as well as play music. Bigger or older models don’t come with
this feature,” she says.
3. Smart Surgery
Google Glass might be a commercial failure, but it found its niche by providing augmented reality (AR) glasses for factory workers.
But factories aren’t the only places using this technology. In fact, AR-assisted surgery is already a reality.
Other companies are also creating augmented reality tools for surgery, such as Augmedics. The company is developing an AR headset dubbed The Vizor, which will allow doctors
to view a patient’s CT scan during surgery.
Augmedics believes that using AR headsets could allow for minimally invasive spinal surgery because the technology reduces the need for large incisions.
Meanwhile, another company called Scopis has combined the Microsoft HoloLens with their surgery navigation technology. By doing this, they have created a system that
provides AR guidance for surgeons on numerous procedures.
“Scopis’ Holographic Navigation Platform is a universal solution that offers specific advantages for spinal surgeries and can also be applied in the many other areas where the
highest levels of precision and speed are critical . . . In neurosurgery, for example, brain tumors could be located faster and with higher accuracy.” — Brian Kosmecki, Scopis CEO
With connected devices comes a slew of data, allowing researchers to analyze trends and
risks for patients. While it’s not a precise science yet, more companies are implementing data solutions that can help with
preventative healthcare.
PwC has introduced a predictive engine named Bodylogical which processes patient data to help give health insights into potential health trends and problems in the future.
Consumer devices can then use the engine to provide users with possible impacts of their decisions on their health.
“[Using health data from wearables, connected health devices, or other consumer data sources] Bodylogical can visualize how a person’s present choices — both positive and
negative — will personally impact them. It can show people how to achieve greater results with the least amount of effort. Or it can help individuals focus on the one or two
specific things that can deliver the most improvement.” — PwC
This includes sending users notifications on their smartphones to remind them of tasks or interventions.
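The underlying idea of such a predictive engine can be illustrated with a minimal sketch (this is not PwC's actual Bodylogical model; the data and the projection method are assumptions for illustration): fit a trend to a wearable's daily readings and extrapolate it forward to warn the user before a metric drifts out of a healthy range.

```python
# Minimal trend projection: least-squares slope through daily readings,
# extrapolated a given number of days into the future.

def project(readings, days_ahead):
    """Project a daily metric forward along its least-squares trend line."""
    n = len(readings)
    xs = range(n)
    mean_x = (n - 1) / 2
    mean_y = sum(readings) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
             / sum((x - mean_x) ** 2 for x in xs))
    # Evaluate the fitted line at day (n - 1 + days_ahead).
    return mean_y + slope * (n - 1 + days_ahead - mean_x)

# Resting heart rate creeping upward ~0.5 bpm/day (hypothetical data):
rhr = [62, 62.5, 63, 63.5, 64, 64.5, 65]
print(round(project(rhr, days_ahead=30), 1))  # → 80.0
```

A real engine would model physiology rather than fit a straight line, but the product idea is the same: turn a slow drift that no single reading reveals into an early, actionable notification.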
Meanwhile, connected devices are also powerful tools for the early detection of diseases and health issues. Cyrcadia Health, for example, has a device called the iTBra, which assists in the early detection of breast cancer.
The smart breast patches use predictive analytics software and algorithms to identify potential abnormalities in breast tissue. This data is sent to Cyrcadia Health for analysis,
with results sent to the user and their doctors.
Smart sensors are well-known as a way that the elderly can maintain independence in their homes. But IBM and Rice University are taking it a step further with the Multi-
Purpose Eldercare Robot Assistant (IBM MERA). The technology is still in its early stages, but a prototype was up and running by the end of 2016.
“IBM MERA will be used to help study innovative ways of measuring an individual’s vital signs, such as heart rate, heart rate variability and respiratory rate; answer basic
health-related questions; and, determine if an individual has fallen by reading the results of an accelerometer.” — IBM
Research by PwC Global has noted that AI and robots are playing an increasingly important role in the healthcare sector. More surprisingly, the firm also found that the public are ready to accept these advances. Their survey revealed that consumers around the world are ready to engage with new technology designed to enable health and wellness.
It will probably be many years before you can buy a robot with the smarts and functionality of Rosie from "The Jetsons." Nonetheless, today’s robots are moving in that
direction.
These medical apps can be very useful, even if they won't fully replace your doctor. Telecare robots allow patients confined to their homes or beds to communicate with healthcare professionals. They do this by providing patients with a visual feed and control over the robot’s movements. Patients can also interact with their loved ones through the telecare robot.
A Two-Way Street
The impact of connected devices on healthcare isn’t one-sided — in fact, the healthcare industry
is one of the major drivers of innovation and production in the smart device industry. This
is especially prevalent when it comes to wearables.
Global Industry Analysts Inc expects that the wearable medical devices market will reach a value of $4.5 billion by 2020. The company says that this is
due to the “growing need for effective management of chronic diseases.”
“The global wearable devices market continues to gain momentum from the rapid adoption of these devices among patients,” the company said in a
report.
So while
healthcare is seeing immense improvements thanks to the Internet of Things, the
connected device industry is also significantly benefiting from its connections to healthcare.
Extinction
Steer 16 – Dr. Andrew Steer, President and CEO of the World Resources Institute, Global
Agenda Trustee for the World Economic Forum, Former Director General at the UK Department
of International Development, “How Information Technology Will Enable a Sustainable
Future”, Hewlett Packard Enterprise, 9-26, https://community.hpe.com/t5/Inspiring-
Progress/How-Information-Technology-Will-Enable-a-Sustainable-Future/ba-
p/6901717#.WLxR0TvafHw
Two recent developments offer great hope for the future.
For the first time ever, all the countries in the world have agreed on global goals (to 2030) that
apply to rich and poor alike. In September 2015, world leaders adopted the UN Sustainable
Development Goals (SDGs) with a vision that all people have a good quality of life—free of
hunger, poverty, and injustice—while our environment thrives. The SDGs have a set of 169
targets, most of which are measurable.
Second, thanks to massive advances in information technology, we can now measure important things much more accurately in real time—in turn creating an ability to react quickly to problems, and an accountability for delivering on commitments made.
Take forests, for example. One of the SDG targets is to eliminate deforestation and restore
degraded land. But how to measure what is happening? As recently as five years ago this would
have required referring to a thick book of statistics, often several years out of date. Now, real-time data is available at a resolution of one-fifth of an acre to anybody with a smartphone or laptop anywhere—all for free! This is all thanks to billions of data points provided by satellites each day, massive gains in cloud computing, and incredible progress in communications and visualization technology. This is all provided by Global Forest Watch, a partnership between the
World Resources Institute (WRI) and a number of leading technology companies, research
institutions and national governments.
New technology is also helping us increase the efficiency of buildings and build smarter, lower-
carbon, higher-functioning cities. Data platforms like our Aqueduct water risk model are helping
business leaders and countries measure and monitor environmental risks and track progress.
At WRI we focus on six urgent global challenges: climate, water, forests, food, energy, and cities. In each of these IT is facilitating better data collection and analysis, heightened transparency, better communication and ultimately better decisions—all critical to WRI’s motto: count it, change it, scale it. From satellite images of coral reefs to crowd-sourced data on
urban traffic congestion, our ability to make decisions based on increasingly better information
points to tremendous potential for sustainability gains.
It is a pleasure for us to partner with HPE in a number of important initiatives, including exploring the intersection of Smart Cities and the Internet of Things, and helping shape HPE’s science-based target. It’s only by tapping the ingenuity and excitement of those within the IT
sector that we can address existential threats like climate change while increasing standards
of living for billions now living in poverty. For this reason, we see companies like HPE at the
very forefront of solving today’s most difficult challenges.
highest potential “to emerge as a pandemic virus and cause substantial human illness.”
Yin says he’d heard about H7N9 on TV, but when his wife started to vomit, they didn’t make the connection. Instead of seeking Western-style medicine, they did what many
rural Chinese people do when they’re under the weather: They went to the local herbalist and sought inexpensive, traditional treatments for what they hoped was a simple
illness. As a small-scale farmer with four children, Yin takes temporary construction jobs (as many rural Chinese do) to boost his income to about $550 a month. He had always
been terrified that someone in his family might develop a serious health problem. “That’s a farmer’s worst nightmare,” he explains. “Hospital costs are unbelievable. Entire
family savings could be wiped out.”
When the herbs didn’t work, Long’s family hired a car and drove her 20 miles to the Ziyang Hospital of Traditional Chinese Medicine. There she was diagnosed with
gastrointestinal ulcers and received various treatments, including a medication often prescribed for colic and a traditional Chinese medicine (jingfang qingre) used to reduce
fever. She didn’t improve. Two days later, Long went into intensive care. The next day, Yin was shocked when doctors told him his wife was, in fact, infected with H7N9.
The diagnosis was especially surprising, given that Long hadn’t done anything different than usual in the period leading up to her illness. She’d looked after her 73-year-old
mother, who lived nearby, and worked in the cornfields. And just a few days before she became ill, Long had walked about an hour to the local market, approached a vendor
selling live poultry and returned home with five chickens.
**********
Officially, the live-bird markets in Beijing have been shuttered for years. In reality, guerrilla vendors run furtive slaughterhouses throughout this national capital of wide
avenues, gleaming architecture and more than 20 million residents—despite warnings that their businesses could be spreading deadly new strains of the flu.
In one such market, a man in sweatstained shorts had stacked dozens of cages—jammed with chickens, pigeons, quail—on the pavement outside his grim hovel.
I picked out two plump brown chickens. He slit their throats, tossed the flapping birds into a greasy four-foot-tall ceramic pot, and waited for the blood-spurting commotion to
die down. A few minutes later he dunked the chickens in boiling water. To de-feather them, he turned to a sort of ramshackle washing machine with its rotating drum studded
with rubber protuberances. Soon, feathers and sludge splashed onto a pavement slick with who knows what.
I asked the vendor to discard the feet. This made him wary. Chicken feet are a Chinese delicacy and few locals would refuse them. “Don’t take my picture, don’t use my name,” he
said, well aware that he was breaking the law. “There was another place selling live chickens over there, but he had to shut down two days ago.”
Many Chinese people, even city dwellers, insist that freshly slaughtered poultry is tastier and more healthful than refrigerated or frozen meat. This is one of the major reasons
China has been such a hot spot for new influenza viruses: Nowhere else on earth do so many people have such close contact with so many birds.
At least two flu pandemics in the past century—in 1957 and 1968—originated in [China] the Middle Kingdom and were triggered by avian viruses that evolved to become easily transmissible between humans. Although health authorities have increasingly tried to ban the practice, millions of live birds are
still kept, sold and slaughtered in crowded markets each year. In a study published in January, researchers in
China concluded that these markets were a “main source of H7N9 transmission by way of human-poultry contact and
avian-related environmental exposures.”
In Chongzhou, a city near the Sichuan provincial capital of Chengdu, the New Era Poultry Market was reportedly closed for two months at the end of last year. “Neighborhood
public security authorities put up posters explaining why bird flu is a threat, and asking residents to co-operate and not to sell poultry secretly,” said a Chongzhou teacher, who
asked to be identified only as David. “People pretty much listened and obeyed, because everyone’s worried about their own health.”
When I visited New Era Poultry in late June, it was back in business. Above the live-poultry section hung a massive red banner: “Designated Slaughter Zone.” One vendor said he
sold some 200 live birds daily. “Would you like me to kill one for you, so you can have a fresh meal?” he asked.
Half a dozen forlorn ducks, legs tied, lay on a tiled and blood-spattered floor, alongside dozens of caged chickens. Stalls overflowed with graphic evidence of the morning’s brisk
trade: boiled bird carcasses, bloodied cleavers, clumps of feathers, poultry organs. Open vats bubbled with a dark oleaginous resin used to remove feathers. Poultry cages were
draped with the pelts of freshly skinned rabbits. (“Rabbit meat wholesale,” a sign said.)
These areas—often poorly ventilated, with multiple species jammed together—create ideal conditions for
spreading disease through shared water utensils or airborne droplets of blood and other secretions. “That provides
opportunities for viruses to spread in closely packed quarters, allowing ‘amplification’ of the viruses,” says Benjamin John Cowling, a
specialist in medical statistics at the University of Hong Kong School of Public Health. “The risk to humans becomes so much higher.”
Shutting live-bird markets can help contain a bird flu outbreak. Back in 1997, the H5N1 virus traveled from mainland China to Hong Kong, where it started killing chickens and
later spread to 18 people, leaving six dead. Hong Kong authorities shut down the city’s live-poultry markets and scrambled to cull 1.6 million chickens, a draconian measure that
may have helped avert a major epidemic.
In mainland China, though, the demand for live poultry remains incredibly high. And unlike the Hong Kong epidemic, which visibly affected its avian hosts, the birds carrying
H7N9 initially appeared healthy themselves. For that reason, shuttering markets has been a particularly hard sell.
China’s Ministry of Agriculture typically hesitates to “mess with the industry of raising and selling chickens,” says Robert Webster, a world-renowned virologist based at St. Jude
Children’s Research Hospital in Memphis. He has been working with Chinese authorities since 1972, when he was part of a Western public health delegation invited to Beijing.
He and a colleague were eager to collect blood samples from Chinese farm animals. At a state-run pig farm, Webster recalls, he was allowed to get a blood sample from one pig.
“Then we said, ‘Could we have more pigs?’ And the Chinese officials replied, ‘All pigs are the same.’ And that was it,” he concludes with a laugh. “It was a one-pig trip.”
The experience taught Webster something about the two sides of Chinese bureaucracy. “The public health side of China gave us absolute co-operation,” he says. “But the
agricultural side was more reluctant.” He says the Chinese habit of keeping poultry alive until just before cooking “made some sense before the days of refrigeration. And now it’s
in their culture. If you forcibly close down government live-poultry markets, the transactions will simply go underground.”
Tiny porcelain and wood figurines of chickens, geese and pigs dot a crowded windowsill in Guan Yi’s office at the School of Public Health, framing an idyllic view of green, rolling
hills. Famed for his work with animal viruses, Guan is square-jawed and intense. Some call him driven. In another incarnation, he might have been a chain-smoking private
investigator. In real life he’s a blunt-spoken virus hunter.
Working out of his Hong Kong base as well as three mainland Chinese labs, including one at Shantou University Medical College, Guan receives tips about unusual flu trends in
China from grassroots contacts. He has trained several dozen mainland Chinese researchers to collect samples—mostly fecal swabs from poultry in markets and farms—and
undertake virus extraction and analysis.
At a lab in Hong Kong, a colleague of Guan’s sits before rows of chicken eggs, painstakingly injecting droplets of virus-containing liquid into living embryos. Later the amniotic
fluid will be analyzed. Another colleague shows off an important tool for their work: a sophisticated Illumina next-generation sequencing machine, which, he says, “can sequence
genes at least 40 times faster” than the previous method.
Guan is concerned that H7N9 may be undergoing mutations that could make it spread easily between people. He’s alarmed that the most recent version of H7N9 has infected
and killed so many more people than other avian flu viruses. “We don’t know why,” he frets.
Then there was that moment last winter when colleagues analyzing H7N9 were startled to discover that some of the viruses—previously non-pathogenic to birds—now were
killing them. This virus mutation was so new that scientists discovered it in the lab before poultry vendors reported unusually widespread bird deaths.
Flu viruses can mutate anywhere. In 2015, an H5N2 flu strain broke out in the United States and spread throughout the country, requiring the
slaughter of 48 million poultry. But China is uniquely positioned to create a novel flu virus that kills
people. On Chinese farms, people, poultry and other livestock often live in close proximity.
Pigs can be infected by both bird flu and human flu viruses, becoming potent “mixing vessels” that allow genetic material from
each to combine and possibly form new and deadly strains. The public’s taste for freshly killed
meat, and the conditions at live markets, create ample opportunity for humans to come in
contact with these new mutations. In an effort to contain these infections and keep the poultry industry alive, Chinese officials have
developed flu vaccines specifically for birds. The program first rolled out on a large scale in 2005 and has gotten mixed reviews ever since. Birds often spread new viruses
without showing signs of illness themselves, and as Guan notes, “You can’t vaccinate every chicken in every area where bird flu is likely to emerge.” In July, after H7N9 was
found to be lethal to chickens, Chinese authorities rolled out H7N9 poultry vaccines; it’s still too early to assess their impact.
Meanwhile, there is no human vaccine yet available that can guarantee protection against the most recent variant of H7N9. Guan’s team is helping pave the way for one. They’ve
been looking deeply into the virus’ genesis and infection sources, predicting possible transmission routes around the globe. They’re sharing this information with like-minded
researchers in China and abroad, and offering seasonal vaccine recommendations to international entities such as the World Health Organization and the Food and Agriculture
Organization of the United Nations. Such data could prove life-saving—not just in China but worldwide—in the event of a full-on pandemic.
**********
When Long Yanju’s illness was diagnosed in April, she became one of 24 confirmed cases of H7N9 in Sichuan province that month. Hospitals there weren’t well equipped to
recognize signs of the virus: This wave marked the first time H7N9 had traveled from the densely populated eastern coast westward to rural Sichuan. “With the spread across
wider geographical areas, and into rural areas,” says Uyeki, the CDC influenza specialist, “it’s likely patients are being hospitalized where hospitals aren’t so well resourced as in
the cities, and clinicians have less experience managing such patients.”
Yin is now alleging that the hospital committed malpractice for not properly diagnosing or treating his wife until it was too late. He initially asked for $37,000 in damages from
the hospital. Officials there responded with a counterdemand that Yin pay an additional $15,000 in medical bills. “In late September I agreed to accept less than $23,000. I’d
run out of money,” he says. “But when I went to collect, the hospital refused to pay and offered much less. It’s not enough.” A county mediation committee is trying to help both
sides reach an agreement. (Hospital representatives declined to comment for this article.)
Whatever the outcome of Yin’s legal battle, it seems clear that shortcomings in the Chinese health care system are playing a role in the H7N9 epidemic. Along with rural people’s
tendency to avoid Western-style medicine as too expensive, it’s routine for hospitals in China to demand payment upfront, before any tests or treatment takes place. Families are
known to trundle ailing relatives on stretchers (or sometimes on stretched blankets) from clinic to clinic, trying to find someplace they can afford. “Everybody feels the same way
as I do,” Yin says. “If the illness doesn’t kill you, the medical bills will.”
And any delay in receiving treatment for H7N9 is dangerous, physicians say. Although nearly 40 percent of people known
to be infected with H7N9 have died so far, the odds of surviving may be much higher if medication such as the antiviral oseltamivir, known as Tamiflu, could be administered
within 24 to 48 hours. “Chinese with H7N9 usually take two days to see a doctor, another four days to check into a hospital, and then on Day 5 or 6 they get Tamiflu,” says Chin-
Kei Lee, the medical officer for emerging infectious diseases at the WHO China office. “Often people die within 14 days. So especially in rural
areas, it’s hard to get treated in time—even if doctors do everything right.”
Though health authorities worldwide acknowledge that China is often an influenza epicenter, most Chinese people themselves
don’t receive an annual flu shot. The logistics of administering mass vaccinations to a nation of
more than one billion are daunting. While nearly half of Americans receive seasonal flu vaccinations, only about 2 percent of Chinese do. “Not enough,”
admits Lee. “We always want to do better than yesterday.”
Earlier this year, Lee was one of 25 experts who gathered in Beijing under the umbrella of the United Nations to discuss the H7N9 threat. The meeting reviewed some of the
measures in place at live-bird markets—such as mandatory weekly disinfection and bans on keeping poultry overnight—and concluded that they were insufficient.
Despite such shortcomings, Western experts say Chinese officials have come a long way since their wobbly handling of the 2002 outbreak of SARS, the severe respiratory disease
caused by a previously unknown coronavirus; Chinese apparatchiks initially tried to cover up the epidemic, creating a worldwide scandal. But after the first H7N9 outbreak in
2013, Webster observes, Chinese authorities did “exactly what should have been done. You need to get the word out as fast as possible, with transparency and urgency, so the
world can respond.”
Global cooperation is crucial. Along China’s southwestern underbelly lies a string of less developed countries such as Laos, Vietnam and
Myanmar. (The last of these is of particular concern, since it imports large amounts of Chinese poultry.) Some of China’s border regions are
The US banned the space agency from working with China and its state-owned companies out of concerns regarding national
security and technology transfers. As a result, China was locked out of the International Space Station because NASA is one of the
participating bodies. More recently, scientists from other countries such as Germany and Sweden who were helping China with its
exploration of the far side of the Moon were cautious of not falling afoul of US export controls on sensitive technology.
In a statement last week, NASA confirmed those discussions, and said that “for a number of reasons” the LRO wasn’t
able to be at the optimal location for the landing. But it added that the orbiter has been collecting data since
Chang’e-4’s arrival on the far side, and will take photos of the landing site on Jan. 31. The agencies have
agreed to share data on significant findings, if any, at a meeting of a subcommittee of the UN Committee on the
Peaceful Uses of Outer Space to be held next month. “NASA’s cooperation with China is transparent, reciprocal
and mutually beneficial,” the US space agency said.
NASA’s administrator, James Bridenstine, in an earlier interview with Quartz (paywall), had also said that the
agency can share data with China. “When they do a science mission to the Moon, we’re hopeful they will be able
to share with us the data they receive, and when we do a mission to the moon, we can share data with them,”
Bridenstine told Quartz. “Understanding and characterizing the Moon and doing that kind of science is in the interest of
all humanity. It’s not something any one country should try to retain for itself.”
The plan isn’t “growth,” but growth is sustainable, solves global
problems, and requires no mindset shift
Lomborg 13 – Bjørn Lomborg, Adjunct Professor at the Copenhagen Business School, “The
Limits to Panic”, Project Syndicate, 6-17, http://www.project-
syndicate.org/commentary/economic-growth-and-its-critics-by-bj-rn-lomborg
The genius of The Limits to Growth was to fuse these worries with fears of running out of stuff.
We were doomed, because too many people would consume too much. Even if our ingenuity
bought us some time, we would end up killing the planet and ourselves with pollution. The only
hope was to stop economic growth itself, cut consumption, recycle, and force people to have
fewer children, stabilizing society at a significantly poorer level.
That message still resonates today, though it was spectacularly wrong. For example, the authors
of The Limits to Growth predicted that before 2013, the world would have run out of aluminum,
copper, gold, lead, mercury, molybdenum, natural gas, oil, silver, tin, tungsten, and zinc.
Instead, despite recent increases, commodity prices have generally fallen to about a third of
their level 150 years ago. Technological innovations have replaced mercury in batteries, dental
fillings, and thermometers: mercury consumption is down 98% and, by 2000, the price was
down 90%. More broadly, since 1946, supplies of copper, aluminum, iron, and zinc have
outstripped consumption, owing to the discovery of additional reserves and new technologies to
extract them economically.
Similarly, oil and natural gas were to run out in 1990 and 1992, respectively; today, reserves of
both are larger than they were in 1970, although we consume dramatically more. Within the past
six years, shale gas alone has doubled potential gas resources in the United States and halved
the price.
As for economic collapse, the Intergovernmental Panel on Climate Change estimates that global
GDP per capita will increase 14-fold over this century and 24-fold in the developing world.
The Limits to Growth got it so wrong because its authors overlooked the greatest resource of all:
our own resourcefulness. Population growth has been slowing since the late 1960s. Food
supply has not collapsed (1.5 billion hectares of arable land are being used, but another 2.7
billion hectares are in reserve). Malnourishment has dropped by more than half, from 35% of
the world’s population to under 16%.
Nor are we choking on pollution. Whereas the Club of Rome imagined an idyllic past with no
particulate air pollution and happy farmers, and a future strangled by belching smokestacks,
reality is entirely the reverse.
In 1900, when the global human population was 1.5 billion, almost three million people –
roughly one in 500 – died each year from air pollution, mostly from wretched indoor air. Today,
the risk has receded to one death per 2,000 people. While pollution still kills more people than
malaria does, the mortality rate is falling, not rising.
Nonetheless, the mindset nurtured by The Limits to Growth continues to shape popular and
elite thinking.
Consider recycling, which is often just a feel-good gesture with little environmental benefit and
significant cost. Paper, for example, typically comes from sustainable forests, not rainforests.
The processing and government subsidies associated with recycling yield lower-quality paper to
save a resource that is not threatened.
Obsession with doom-and-gloom scenarios distracts us from the real global threats. Poverty is
one of the greatest killers of all, while easily curable diseases still claim 15 million lives every
year – 25% of all deaths.
The solution is economic growth. When lifted out of poverty, most people can afford to
avoid infectious diseases. China has pulled more than 680 million people out of poverty in the
last three decades, leading a worldwide poverty decline of almost a billion people. This has
created massive improvements in health, longevity, and quality of life.
The four decades since The Limits to Growth have shown that we need more of it, not less. An
expansion of trade, with estimated benefits exceeding $100 trillion annually toward the end of
the century, would do thousands of times more good than timid feel-good policies that result
from fear-mongering. But that requires abandoning an anti-growth mentality and using our
enormous potential to create a brighter future.
Capitalism is not, Monbiot is forced to admit, a fragile system that will easily be replaced. Bolstered by
huge supplies of oil, it is here to stay. Industrial civilization is, as far as he can now see, unstoppable.
Gaia, that treacherous slut, has made so much oil and gas that her faithful acolytes today cannot protect her from the consequences
of her own folly. Welcome to the New Green Doom: an overabundance of oil and gas is going to release so much greenhouse gas that
the world is going to fry. The exploitation of the oil sands in Alberta, warn leading environmentalists, is a tipping point. William
McKibben put it this way in an interview with Wired magazine in the fall of 2011: I think if we go whole-hog in the tar sands, we’re
out of luck. Especially since that would doubtless mean we’re going whole-hog at all the other unconventional energy sources we can
think of: Deepwater drilling, fracking every rock on the face of the Earth, and so forth. Here’s why the tar sands are important: It’s a
decision point about whether, now that we’re running out of the easy stuff, we’re going to go after the hard stuff. The Saudi Arabian
liquor store is running out of bottles. Do we sober up, or do we find another liquor store, full of really crappy booze, to break into? A
year later, despite the success of environmentalists like McKibben at persuading the Obama administration to block a pipeline
intended to ship this oil to refineries in the US, it’s clear (as it was crystal clear all along to anyone with eyes to see) that the world
has every intention of making use of the “crappy liquor.” Again, for people who base their claim to world leadership on their superior
understanding of the dynamics of complex systems, greens
prove over and over again that they are
surprisingly naive and crude in their ability to model and to shape the behavior of the
political and economic systems they seek to control. If their understanding of the future of the earth’s
climate is anything like as wish-driven, fact-averse and intellectually crude as their approach to international affairs, democratic
politics and the energy market, the greens are in trouble indeed. And as I’ve written in the past, the contrast between green claims to
understand climate and to be able to manage the largest and most complex set of policy changes ever undertaken, and the evident
incompetence of greens at managing small (Solyndra) and large (Kyoto, EU cap and trade, global climate treaty) political projects
today has more to do with climate skepticism than greens have yet understood. Many people aren’t rejecting science; they are
rejecting green claims of policy competence. In doing so, they are entirely justified by the record. Nevertheless, the future of
the environment is not nearly as dim as greens think. Despairing environmentalists like McKibben
and Monbiot are as wrong about what the new era of abundance means as green energy analysts
were about how much oil the planet had. The problem is the original sin of much environmental thought:
Malthusianism. If greens weren’t so addicted to Malthusian horror narratives they would be
able to see that the new era of abundance is going to make this a cleaner planet
faster than if the new gas and oil had never been found. Let’s be honest. It has long been clear to students of
history, and has more recently begun to dawn on many environmentalists, that all that happy-clappy carbon treaty stuff was a pipe
dream and that nothing like that is going to happen. A humanity that hasn’t been able to ban the bomb despite the clear and present
dangers that nuclear weapons pose isn’t going to ban or even seriously restrict the internal combustion engine and the generator.
The political efforts of the green movement to limit greenhouse gasses have had very little effect so far, and it is highly unlikely that
they will have more success in the future. The green movement has been more of a group hug than a curve
bending exercise, and that is unlikely to change. If the climate curve bends, it will bend the way the population curve did: as
the result of lots of small human decisions driven by short term interest calculations rather than as the result of a grand global plan.
The shale boom hasn’t turned green success into green failure. It’s prevented green failure from
turning into something much worse. Monbiot understands this better than McKibben; there was never any real doubt
that we’d keep going to the liquor store. If we hadn’t found ways to use all this oil and gas, we wouldn’t have embraced the
economics of less. True, as oil and gas prices rose, there would be more room for wind and solar power, but the real winner of an oil
and gas shortage is… coal. To use McKibben’s metaphor, there is a much dirtier liquor store just down the road from the shale
emporium, and it’s one we’ve been patronizing for centuries. The US and China have oodles of coal, and rather than walk to work
from our cold and dark houses all winter, we’d use it. Furthermore, when and if the oil runs out, the technology exists to get liquid
fuel out of coal. It isn’t cheap and it isn’t clean, but it works. The newly bright oil and gas future means that we aren’t entering a new
Age of Coal. For this, every green on the planet should give thanks. The second reason why greens should give thanks for shale is that
environmentalism is a luxury good. People must survive and they will survive by any means necessary. But they would
much rather thrive than merely survive, and if they can arrange matters better, they will. A poor society near the edge of
survival will dump the industrial waste in the river without a second thought. It will burn
coal and choke in the resulting smog if it has nothing else to burn. Politics in an age of survival is
ugly and practical. It has to be. The best leader is the one who can cut out all the fluff and the folderol and keep you alive
through the winter. During the Battle of Leningrad, people burned priceless antiques to stay alive for
just one more night. An age of energy shortages and high prices translates into an age of
radical food and economic insecurity for billions of people. Those billions of hungry,
frightened, angry people won’t fold their hands and meditate on the ineffable wonders of Gaia
and her mystic web of life as they pass peacefully away. Nor will they vote George Monbiot and Bill McKibben into power.
They will butcher every panda in the zoo before they see their children starve, they will
torch every forest on earth before they freeze to death, and the cheaper and the meaner their lives are, the
less energy or thought they will spare to the perishing world around them. But, thanks to shale and other unconventional energy
sources, that isn’t where we are headed. We are heading into a world in which energy is abundant and
horizons are open even as humanity’s grasp of science and technology grows more secure. A
world where more and more basic human needs are met is a world that has time to think
about other goals and the money to spend on them. As China gets richer, the Chinese
want cleaner air, cleaner water, purer food — and they are ready and able to pay for them. A
Brazil whose economic future is secure can afford to treasure and conserve its rain forests. A
Central America where the people are doing all right is more willing and able to preserve its biodiversity. And a world in
which people know where their next meal is coming from is a world that can and will take
thought for things like the sustainability of the fisheries and the protection of the coral
reefs. A world that is more relaxed about the security of its energy sources is going to be able to do more about improving the
quality of those sources and about managing the impact of its energy consumption on the global commons. A rich, energy
secure world is going to spend more money developing solar power and wind power and other
sustainable sources than a poor, hardscrabble one. When human beings think their basic problems are solved,
they start looking for more elegant solutions. Once Americans had an industrial and modern economy, we
started wanting to clean up the rivers and the air. Once people aren’t worried about
getting enough calories every day to survive, they start wanting healthier food more elegantly
prepared. A world of abundant shale oil and gas is a world that will start imposing more environmental regulations on shale and gas
producers. A prosperous world will set money aside for research and development for new
technologies that conserve energy or find it in cleaner surroundings. A prosperous world facing
climate change will be able to ameliorate the consequences and take thought for the future
in ways that a world overwhelmed by energy insecurity and gripped in a permanent
economic crisis of scarcity simply can’t and won’t do. Greens should also be glad that the new energy
is where it is. For Monbiot and for many others, Gaia’s decision to put so much oil into the United States and Canada seems like her
biggest indiscretion of all. Certainly, a United States of America that has, in the Biblical phrase, renewed its youth like an eagle with
a large infusion of fresh petro-wealth is going to be even less eager than formerly to sign onto various pie-in-the-sky green carbon
treaties. But think how much worse things would be if the new reserves lay in dictatorial kleptocracies. How willing and able would
various Central Asia states have been to regulate extraction and limit the damage? How would Nigeria have handled vast new
reserves whose extraction required substantially more invasive methods? Instead, the
new sources are concentrated in
places where environmentalists have more say in policy making and where, for all the
shortcomings and limits, governments are less corruptible, more publicly accountable and in
fact more competent to develop and enforce effective energy regulations. This won’t satisfy McKibben
and Monbiot (nothing that could actually happen would satisfy either of these gentlemen), but it is a lot better than what we could be
facing. Additionally, if there are two countries in the world that should worry carbon-focused greens more than any other, they are
the United States and China. The two largest, hungriest economies in the world are also home to enormous coal reserves. But based
on what we now know, the US and China are among the biggest beneficiaries of the new cornucopia. Gaia put the oil and the gas
where, from a carbon point of view, it will do the most good. In
a world of energy shortages and insecurity, both
the US and China would have gone flat out for coal. Now, that is much less likely. And there’s one more
reason why greens should thank Gaia for shale. Wind and solar aren’t ready for prime time now, but by the time the new sources
start to run low, humanity will have mastered many more technologies that can be used to provide energy and to conserve it. It’s likely
that the Age of Shale hasn’t just postponed the return of coal: because of this extra time, there likely will never be another age in which
coal is the dominant industrial fuel. It’s virtually certain that the total lifetime carbon footprint of the human race is going to be
smaller with the new oil and gas sources than it would have been without them. Neither the world’s energy problems nor its climate
issues are going away any time soon. Paradise is not beckoning just a few easy steps away. But the new availability of these energy
sources is on balance a positive thing for environmentalists as much as for anyone else. Perhaps, and I know this is a heretical
thought, but perhaps Gaia is smarter than the greens.
From the second and third observations, this follows: instead of gathering as free
collectives of happy householders, survivors of this collapse will be subject to the will
of people seeking to monopolise remaining resources. This will is likely to be imposed
through violence. Political accountability will be a distant memory. The chances of
conserving any resource in these circumstances are approximately zero. The
human and ecological consequences of the first global collapse are likely to persist for many
generations, perhaps for our species' remaining time on earth. To imagine that good could
come of the involuntary failure of industrial civilisation is also to succumb to denial. The
answer to your question – what will we learn from this collapse? – is nothing.
This is why, despite everything, I fight on. I am not fighting to sustain economic growth. I am
fighting to prevent both initial collapse and the repeated catastrophe that follows. However faint
the hopes of engineering a soft landing – an ordered and structured downsizing of the global
economy – might be, we must keep this possibility alive. Perhaps we are both in denial: I, because I
think the fight is still worth having; you, because you think it isn't.
Neolib is sustainable
--accelerating market-driven brown solves poverty and resources
--consumption will stabilize
--tech solves
data recorded by plant-level continuous emissions monitoring systems, or CEMS, and remote
sensing data captured by satellites of the National Aeronautics and Space Administration (NASA). Those mismatches
occurred in areas with relatively higher pollution levels, where regulatory standards may have
been difficult to meet, potentially prompting falsification or misreporting of data. One big
takeaway from the research is that environmental regulation ought to factor in the development path and economic reforms of a country, so that it is
not overly heavy-handed so as to encourage unethical conduct, according to experts at Wharton and elsewhere. The other takeaway is that remote
sensing data could be more widely used to monitor compliance with environmental
regulations — not just in China, but also the U.S. “The stakes are really high in China, because China is the world’s largest energy user, and
nearly 80% of its electricity production comes from coal,” said Valerie Karplus, professor of global economics and management at MIT’s Sloan School
of Management and a co-author of the paper. Also, many people in China live in close proximity to those power plants, she added. Remote
sensing data could be used to inform policy-making and to enhance the understanding of how firms respond to
pollution-control regulations, she noted. Titled “Quantifying Coal Power Plant Responses to Tighter SO2 Emissions Standards in
China,” the paper’s other co-authors are Shuang Zhang, professor of economics at the University of Colorado, Boulder; and Douglas Almond, professor
of economics and international and public affairs at Columbia University. They tracked sulfur dioxide emissions between July 2014 and July 2016 at
256 coal-fired power plants in China.
Data Disconnects
The study found that the monitoring systems at the 256 plants reported a 13.9% fall in sulfur
dioxide emissions. However, data captured by NASA’s satellites found higher pollution levels than what the monitors logged at 113 plants in so-called
“key regions,” or places with higher populations and pollution levels than “non-key regions.” “A potential explanation for the discrepancy in the
two data sources in key regions is that plants overstated or falsified reductions,” the paper stated. “The stricter new standards and greater pressure to
comply may have generated incentives for plant managers to falsify or selectively omit concentration data.” Remote-sensing data
helps understand “responses on the ground, especially in developing countries, where there is, at least in the past, a widespread idea that maybe the
rule of law is weak or that there are challenges in governance and actually in implementing standards at plants,” said Karplus. It would help policy
makers also understand how firms respond to environmental policy against the backdrop of “ongoing economic reforms and development trajectory of
the country,” she said. “It’s important to realize that environmental policy doesn’t exist in a vacuum and that lots of supporting rules and institutions
can have an impact on how businesses manage their environmental footprint.” Shaping Environmental Regulations The research findings could bring
lessons in how stringent environmental regulations ought to be in order to be successful, according to Eric Orts, Wharton professor of legal studies and
business ethics who is also faculty director of the school’s Initiative for Global Environmental Leadership. “This is speculative, but one conclusion from
this research is that in the key areas where the Chinese government wanted to regulate heavily, they were not able to do it,” he said. Those key regions
include Beijing, Shanghai and other cities in the greater Beijing–Tianjin–Hebei area, the Pearl River Delta and the Yangtze River Delta. “One
hypothesis would be they just gave up and [tried to] figure out some way to not get in trouble [for non-compliance],” Orts said as a possible explanation
for misreporting or falsification of data. Under China’s anti-pollution regulations, sulfur dioxide emission limits were set at 50 milligrams per cubic meter in
the key regions and at 200 milligrams in non-key regions that were relatively less polluted. The new regulations required 14,140 firms to post hourly
data on emissions on publicly available online platforms from 2014. “Where it was a little bit of a more moderate target, it seems that the satellite data
confirms the on-the-ground reporting that there was success [in reducing emissions],” Orts noted. “Regulating heavily does create
some perverse disincentives for action and actually runs counter to your own environmental regulatory goals,” said Jackson Ewing, a
senior fellow at Duke University’s Nicholas Institute of Environmental Policy Solutions and an adjunct professor at the university’s Sanford School of
Public Policy. “Some
of China’s regulations in the past couple of years have taken fairly onerous approaches to
quickly reducing the number of heavy air pollution days, particularly in major metropolitan areas.” Those approaches have
had “unintended consequences,” including temporary shutdowns at factories and even the shutdown of heating in cold months
“when households really need that heat … with some obvious human impacts there,” Ewing pointed out. “So while China’s climate-change, domestic
environmental goals and stricter regulation are certainly laudable, you do see second- and third-order effects that include potentially not
putting forward numbers that we can believe in, and some consequences for human development and economic growth …
that we would like to see rolled back.” Why Measurement Is Important “From the U.N.-led efforts on climate change to China’s own domestic goals,
measurement [of pollution] has never been more important,” said Ewing. “We essentially have created a system in
which we’re just calling upon countries to, in good faith, report what they are doing on greenhouse gas mitigation and to come forward to their
international peers on a regular basis using similar methodologies for measuring and showing the results of their efforts. It’s
through that
measuring and reporting that we will have the fundamental starting point for negotiations and discussions
about how targets, goals and efforts can be scaled up, how resources can be shared and how cooperation can take hold.” Any shortcomings in such environmental reporting could of course frustrate the goals. “If that core
reporting is problematic or inaccurate, then the entire system rests on a house of cards,” said Ewing. In that context, satellite monitoring would be
useful in the second- and third-party reviews of the actions declared by countries, he added. Karplus said one
innovation in their paper is the
ability to use satellite data for real-time monitoring of emissions. “The effects of air
pollution on human health depend on the timing of emissions,” she explained. “Being able to resolve emissions on
an hourly basis is incredibly important to thinking about when, where and how these plants are cleaning up.” Putting in
place independent verification measures is important because many of the mechanisms are
subject to human oversight, she added. Need to Strengthen Compliance According to Orts, NASA satellite data
verification could be used in the U.S. as well to influence adherence to environmental
standards. “There’s some concern that [Scott Pruitt], the administrator of the Environmental Protection Agency, might not be so keen on enforcing
the basic environmental laws in the country,” he said. NASA’s satellites are orbiting the earth constantly and pick up changes in air pollution as well as
a range of other substances, said Karplus. The ozone-monitoring instrument that captures such data has improved over time to provide accurate and
real-time measurements, she added. Many countries have installed CEMS at their power plants, but their adoption has to be significantly expanded,
including in China, to generate real-time data, Karplus said. In the U.S., CEMS are being used both for greenhouse gases as well as for local air
pollutants, “and this signals the state of the art,” she added. China does not yet cover carbon dioxide emissions as part of its monitoring systems at coal-
fired power plants, she noted, pointing to areas for improvement. Towards Global Environmental Governance The research study also holds pointers
for “global environmental governance” mechanisms, as they are made possible with remote
sensing capabilities, said Orts. With independently verified plant-level data, “you are able to then focus your policy development better,
because you have research driving that,” he added. Businesses will also find comfort in the regulatory consistency and trustworthiness of data it could
bring about, said Orts. “What you don’t want is a situation where you’re a business and you’re complying with the local regulatory scheme, but you’re
losing out to someone who is cheating and gaming the situation or paying off a government official or whatever.” Challenges in China China has
worked resolutely to address its environmental challenges in recent years, according to Karplus. She pointed to, for example,
its efforts to set up an emissions/carbon trading system, and to do third- and fourth-party verification checks on all plant-reported data. “It’s important
that we acknowledge how far China has come in making data more available and more transparent both to a domestic and international scholarly
audience, [although] there’s always a long way to go,” she added. “If
we are able to effectively monitor the impact of
particular policies, we will have a better chance of measuring and understanding what’s working,
what’s working well and what’s working poorly and adjusting our policies in turn as we go
forward,” said Ewing. “China’s a fascinating laboratory for that,” he added. “They have command-and-control measures to curtail emissions from
industries, transportation, everyday activities and housing, while they also have market mechanisms such as feed-in tariffs, the emissions trading
scheme, etc.”
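The discrepancy analysis the card describes — comparing plant-reported CEMS concentrations against independent satellite-derived estimates to flag possible falsification — can be sketched in a few lines. This is only an illustrative sketch: the plant names, values, and the 20% tolerance threshold below are invented assumptions, not the paper's actual method, which involves far more careful matching of hourly CEMS records to NASA satellite retrievals.

```python
# Illustrative sketch: flag plants whose self-reported SO2 levels are
# substantially lower than independent satellite-derived estimates.
# All plant IDs, values, and the tolerance threshold are invented for
# illustration; they do not come from the study itself.

def flag_discrepancies(reported, satellite, tolerance=0.2):
    """Return plant IDs where the satellite estimate exceeds the
    self-reported value by more than `tolerance` (fractional)."""
    flagged = []
    for plant_id, rep in reported.items():
        sat = satellite.get(plant_id)
        if sat is None:
            continue  # no satellite coverage for this plant
        if sat > rep * (1 + tolerance):
            flagged.append(plant_id)
    return flagged

# Invented example values (mg/m^3-equivalent units)
reported = {"plant_a": 45, "plant_b": 48, "plant_c": 190}
satellite = {"plant_a": 44, "plant_b": 80, "plant_c": 195}

print(flag_discrepancies(reported, satellite))  # ['plant_b']
```

Only plant_b is flagged: its satellite estimate (80) exceeds its reported value (48) by well over the tolerance, mirroring the study's finding that discrepancies concentrated in the heavily regulated key regions.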
Khawa Karpo lies at the world’s “third pole”. This is how glaciologists refer to the Tibetan plateau, home to the vast Hindu Kush-Himalaya ice sheet, because it contains the largest amount of snow and ice after the Arctic and Antarctic – the Chinese glaciers alone account for an estimated 14.5% of the global total. However, a quarter of its ice
has been lost since 1970. This month, in a long-awaited special report on the cryosphere by the Intergovernmental Panel on Climate Change (IPCC), scientists will
warn that up to two-thirds of the region’s remaining glaciers are on track to disappear by the end of
the century. It is expected a third of the ice will be lost in that time even if the internationally agreed target of limiting global warming to 1.5C above pre-industrial
levels is adhered to.
Whether we are Buddhists or not, our lives affect, and are affected by, these tropical glaciers that span eight countries. This frozen “water tower of Asia” is the source of 10 of the
world’s largest rivers, including the Ganges, Brahmaputra, Yellow, Mekong and Indus, whose flows support at least 1.6 billion people directly – in drinking water, agriculture,
hydropower and livelihoods – and many more indirectly, in buying a T-shirt made from cotton grown in China, for example, or rice from India.
Joseph Shea, a glaciologist at the University of Northern British Columbia, calls the loss “depressing and fear-inducing. It changes the nature of the mountains in a very visible
and profound way.”
Yet the fast-changing conditions at the third pole have not received the same attention as those at the north and
south poles. The IPCC’s fourth assessment report in 2007 contained the erroneous prediction that all Himalayan glaciers would be gone by 2035. This statement
turned out to have been based on anecdote rather than scientific evidence and, perhaps out of embarrassment, the third pole has been given less attention in subsequent IPCC
reports.
There is also a dearth of research compared to the other poles, and what hydrological data exists has been jealously guarded by the Indian government and other interested
parties. The Tibetan plateau is a vast and impractical place for glaciologists to work in and confounding factors make measurements hard to obtain. Scientists are forbidden by
locals, for instance, to step out on to the Mingyong glacier, meaning they have had to use repeat photography to measure the ice retreat.
In the face of these problems, satellites have proved invaluable, allowing scientists to watch glacial shrinkage in real time. This summer, Columbia University researchers also
used declassified spy-satellite images from the cold war to show that third pole ice loss has accelerated over this century and is now roughly double the melt rate of 1975 to 2000,
when temperatures were on average 1C lower. Glaciers in the region are currently losing about half a vertical metre of ice per year because of anthropogenic global heating, the
researchers concluded. Glacial melt here carries significant risk of death and injury – far more than in the sparsely populated Arctic and Antarctic – from glacial lake outbursts
(when a lake forms and suddenly spills over its banks in a devastating flood) and landslides caused by destabilised rock. Whole villages have been washed away and these events
are becoming increasingly regular, even if monitoring and rescue systems have improved. Satellite data shows that numbers and sizes of such risky lakes in the region are
growing. Last October and November, on three separate occasions, debris blocked the flow of the Yarlung Tsangpo in Tibet, threatening India and Bangladesh downstream with
flooding and causing thousands to be evacuated.
One reason for the rapid ice loss is that the Tibetan plateau, like the other two poles, is warming at a rate up to three times as fast as the global average, by 0.3C per decade. In
the case of the third pole, this is because of its elevation, which means it absorbs energy from rising, warm, moisture-laden air. Even if average global temperatures stay below
1.5C, the region will experience more than 2C of warming; if emissions are not reduced, the rise will be 5C, according to a report released earlier this year by more than 200
scientists for the Kathmandu-based International Centre for Integrated Mountain Development (ICIMOD). Winter snowfall is already decreasing and there are, on average, four
fewer cold nights and seven more warm nights per year than 40 years ago. Models also indicate a strengthening of the south-east monsoon, with heavy and unpredictable
downpours. “This is the climate crisis you haven’t heard of,” said ICIMOD’s chief scientist, Philippus Wester.
There is another culprit besides our CO2 emissions in this warming story, and it’s all too evident on the dirty surface of the Mingyong glacier: black carbon, or soot. A 2013 study
found that black carbon is responsible for an extra 1.1 watts per square metre of energy being stored in the atmosphere (CO2 is responsible for an estimated 1.56 watts per square metre). Black carbon has multiple climate effects, changing clouds and monsoon circulation as well as
accelerating ice melt. Air pollution from the Indo-Gangetic Plains – one of the world’s most polluted regions – deposits this black dust on glaciers, darkening their surface and hastening melt. While
soot landing on dark rock has little effect on its temperature, snow and glaciers are particularly vulnerable
because they are so white and reflective. As glaciers melt, the surrounding rock crumbles in landslides,
covering the ice with dark material that speeds melt in a runaway cycle . The Everest base camp, for
instance, at 5,300 metres, is now rubble and debris as the Khumbu glacier has retreated to the icefall.
The immense upland of the third pole is one of the most ecologically diverse and vulnerable regions on Earth. People have only attempted to conquer these mountains in the last
century, yet in that time humans have subdued the glaciers and changed the face of this wilderness with pollution and other activities. Researchers are now beginning to
understand the scale of human effects on the region – some have experienced it directly: many of the 300 IPCC cryosphere report authors meeting in the Nepalese capital in July
were forced to take shelter or divert to other airports because of a freak monsoon.
But aside from such inconveniences, what do these changes mean for the 240 million people living in the mountains? Well, in many areas, the change has been welcomed. Warmer,
more pleasant winters have made life easier. The higher temperatures have boosted agriculture – people can grow a greater variety of crops and benefit from more than one
harvest per year, and that improves livelihoods. This may be responsible for the so-called Karakoram anomaly, in which a few glaciers in the Pakistani Karakoram range are
advancing in opposition to the general trend. Climatologists believe that the sudden and massive growth of irrigated agriculture in the local area, coupled with unusual
topographical features, has produced an increase in snowfall on the glaciers which currently more than compensates for their melting.
Elsewhere, any increase in precipitation is not enough to counter the rate of ice melt and places that are wholly reliant on meltwater for irrigation are feeling the effects soonest.
“Springs have dried drastically in the past 10 years without meltwater and because infrastructure has cut off discharge,” says Aditi Mukherji, one of the authors of the IPCC
report.
Known as high-altitude deserts, places such as Ladakh in north-eastern India and parts of Tibet have already lost many of their lower-altitude glaciers and with them their
seasonal irrigation flows, which is affecting agriculture and electricity production from hydroelectric dams. In some places, communities are trying to geoengineer artificial
glaciers that divert runoff from higher glaciers towards shaded, protected locations where it can freeze over winter to provide meltwater for irrigation in the spring.
Only a few of the major Asian rivers are heavily reliant on glacial runoff – the Yangtze and Yellow rivers are showing reduced water levels because of diminished meltwater, and the Indus (40% glacier-fed) and Yarkand (60% glacier-fed) are particularly vulnerable. So although mountain communities are suffering from glacial disappearance, those downstream are currently less affected because rainfall makes a much larger contribution to rivers such as the Ganges and Mekong as they descend into populated basins. Upstream-downstream conflict over extractions, dam-building and diversions has so far largely been averted through water-sharing treaties between nations,
but as the climate becomes less predictable and scarcity increases, the risk of unrest within – let alone between – nations grows.
glacier buffers, affecting agricultural output as well as hydropower generation , and these stresses
will be compounded by an increase in the number and severity of devastating flash floods . “The impact on local water
resources will be huge, especially in the Indus Valley. We expect to see migration out of dry, high-altitude areas first but populations across the region will be affected,” says
Shea, also an author on the ICIMOD report.
As the third pole’s vast frozen reserves of fresh water make their way down to the oceans, they are contributing to sea-level rise that is already making life difficult in the heavily populated low-lying deltas and bays of Asia, from Bangladesh to Vietnam. What is more, they are releasing dangerous pollutants. Glaciers are time capsules, built snowflake by snowflake from the skies of the past and, as they melt, they
deliver back into circulation the constituents of that archived air. Dangerous pesticides such as DDT (widely used for three decades before being banned in
1972) and perfluoroalkyl acids are now being washed downstream in meltwater and accumulating in sediments
factors — rising economic power, demography, ecology, fierce resource competition, water and food supply and
security, as well as increasing military expenditure. In addition, Asia has the greatest number of geopolitical
“hot spots” and nuclear powers. In the latter context, the Tibetan Plateau stands out.
Strategically located between the two Asian giants, China and India, the Tibetan Plateau and its surroundings have come to represent
Asia’s most critical 21st-century battleground. Potentially, they may also be the world’s battleground. The narrative of this century and of Asia will be written, to a very large extent, in terms of what is the “hottest
geopolitical issue” — water security. The Tibetan Plateau extends from the Hindu Kush in central Afghanistan, through Pakistan, India, Nepal,
Bhutan and onto the borders of Myanmar. The geopolitical significance of Tibet has been tremendous historically, too. It was invaded by Britain in 1904 for that reason. Forty-
five years later China’s Liberation occurred and, almost immediately afterwards, the People’s Liberation Army annexed Tibet in 1950-51. For Mao Zedong, the strategic
importance of Tibet was clear. It was fundamental to national security. The Tibetan Plateau acts as a major buffer zone and provides China with leverage over almost the entire
Eurasian continent. From a national security point of view, the vast barriers of the Tibetan Plateau shield China’s internal populace in the east from military aggression
originating from the west. This strategic importance was clearly manifested in the Sino-Indian War of 1962 — the only war in which the People’s Liberation Army has been
successful so far! Losing Tibet would be seen as greatly weakening China strategically and a national humiliation. Apart from its strategic location, the renewed importance of
Tibet for China lies in its water riches. China has long been eyeing the water reserves of Tibet, especially during and since the period of Mao (1949-1976). Located at a high
altitude of 4,500 meters on average, it is richly endowed with fresh water contained in its vast, oxygen-deprived glaciers and huge underground reservoirs. It is in fact the
largest repository of freshwater after the two poles, Arctic and Antarctic, thus claiming the sobriquet, the “third
pole.” Many of the world’s greatest rivers flow out of the Tibetan Plateau — the Yellow River, Yangtze Kiang,
Mekong, Salween, Sutlej and the Brahmaputra. More important, in terms of human geography,
almost half of the global population currently lives in the watershed of the Tibetan Plateau. This
explains the enormous importance of Tibetan freshwater for China. China, on the whole, is an extremely arid country. One quarter of the
country consists of deserts. China has severe water shortage challenges. At the same time, most of its rivers are either too polluted or are too silted to quench the thirst of 1.3
billion people. The basic internal issue for China regarding water security is to transfer fresh water from the Tibetan Plateau in the country’s west to its industrial and populated
corners in its north and east. This has resulted in a spree of building dams, canals, irrigation systems, pipelines and water diversion projects. As Brahma Chellaney points out in
his seminal work, “Water: Asia’s Next Battleground,” China has created more dams in the last five decades than the rest of the world combined, largely in order to divert the flow
of rivers from the south to its north and east corners. The end result was the diversion of routes of various rivers originating in the Tibetan Plateau. China considers such
diversions to be an internal security matter. But these inter-basin and inter-river water transfer projects in the Tibetan Plateau have tremendous consequences on other
downstream countries that draw water from those rivers. Thus, what was seen as a national concern for China in reality has vast external ramifications. Chinese policy so far has
been to seek to minimize issues to be negotiated with its neighbors. But diversion of rivers could boil into a hot conflict in the near future. Water wars could largely destabilize not just the wider Tibetan region, but also all of Asia. Based on current trends, the question is not “how” or “why” such conflicts would arise, but “when.”
Remote sensing capabilities are public, redundant and omnipresent
Zhu 17 [Lingli, Finnish Geospatial Research Institute FGI, National Land Survey of Finland
2017 https://www.intechopen.com/books/multi-purposeful-application-of-geospatial-data/a-
review-remote-sensing-sensors]
4.5. Commonly used remote sensing satellites So far, more than 1000 remote sensing satellites have been launched, and these have been superseded by successive new generations. The few spectral sensors from the earliest missions have been
upgraded to hyperspectral sensors with hundreds of spectral bands. The spatial and spectral resolutions have been improved on the order of 100-fold.
Revisit times have been shortened from months to daily. In addition, more
and more remote sensing data are available as
open data sources. Table 8 gives an overview of the commonly used remote sensing satellites and their parameters. A common
expectation from the remote sensing community is the ability to acquire data at high resolutions (spatial, spectral,
radiometric, and temporal), at low cost, with open resource support and for the creation of new applications by the integration of
spatial/aerial and ground-based sensors. The development of smaller, cheaper satellite technologies in recent years has led many companies to explore
new ways of using low Earth orbit satellites. Many companies have focused on remote imaging, for example, to gather optical or infrared imagery. In
the future, a low-cost communications network between low Earth orbit satellites can be established to form a spatial remote sensing network. This
network would integrate with a large number of distributed ground sensors to establish ground-space remote sensing. In addition, satellites can easily
cover large swaths of territory, thereby supplementing ground-based platforms. Thus, data distribution and sharing would become very easy. Openness
and sharing resources can promote the utilization of remote sensing and maximize its output. In
recent years, open remote sensing
resources have made great progress. Beginning on April 1, 2016, all Earth imagery from a widely used
Japanese remote sensing instrument operating aboard NASA’s Terra spacecraft since late 1999 has been available to
users everywhere at no cost [41]. On April 8, 2016, ESA announced that an amazing 40-cm resolution WorldView-2 European cities
dataset would be available for download through the Lite Dissemination Server. These data are made available free of charge.
This dataset was collected by ESA, in collaboration with European Space Imaging, over the most populated areas in Europe at 40-cm resolution. These
data products were acquired between February 2011 and October 2013. The dataset is available to ESA member states (including Canada) and
European Union Member states [42]. In open remote sensing resources, NASA (USA) was a pioneer in sharing its imagery data. NASA has been
cooperating with the open source community, and many NASA projects are also open source. NASA has also set up a special
website to present these projects. In addition, some commercial companies like DigitalGlobe (USA) have also partly
opened their data to the public. In the future, more and more open resources will become
available.
10. Space Debris – GPS service degraded or interrupted because of damage to a satellite by space debris. Vulnerability: Individual GPS satellites can be easily damaged by space debris. However, there are 31 satellites and damage to one is unlikely to impact service as a whole. Low – vector able to impact less than 5% of users.
no eradication in the squo which proves sats aren’t being utilized for
tracking OR if they are, soaring opium production disproves their
impact
Dr. Vanda Felbab-Brown 17, Senior Fellow in the Center for 21st Century Security and
Intelligence in the Foreign Policy Program at Brookings, PhD in Political Science from MIT,
“Afghanistan’s Opium Production is Through The Roof —Why Washington Shouldn’t
Overreact”, Brookings Report, 11/21/2017, https://www.brookings.edu/blog/order-from-
chaos/2017/11/21/afghanistans-opium-production-is-through-the-roof-why-washington-
shouldnt-overreact/
The diversity of the Taliban’s income portfolio has important implications for
counternarcotics and counterinsurgency strategies , especially since eliminating the Taliban’s financial base through
counternarcotics efforts is often seen as a key element of the counterinsurgency strategy. There is simply no easy way to bankrupt the Taliban by wiping out the opium poppy economy. And as discussed below, any such move would be disastrous for the counterinsurgency efforts. The Taliban is not the only group profiting from the opiate business in Afghanistan. So are various criminal gangs,
which often are connected to the government, the Afghan police, tribal elites, and many ex-warlords-cum-government-officials. Many of these
powerbrokers are also key anti-Taliban counterinsurgency actors, including in the north of the country where opium too has expanded. NO MAGIC
BULLET Most counternarcotics measures adopted since 2001 have been ineffective or counterproductive
economically, politically, and with respect to counterinsurgency and stabilization efforts. Eradication and bans on opium poppy cultivation, often borne by the poorest and most socially marginalized, have generated extensive political capital for the Taliban and undermined counterinsurgency. They sparked provincial revolts, alienated the rural population from the Afghan government, and drove the rural population into Taliban hands.
government and its international sponsors as apostates and infidels trying to kill the Afghan
people with hunger. The Obama administration’s decision to defund centrally-led eradication
was a courageous break with U.S. counternarcotics dogma, and such a policy is
still correct today. Aerial spraying would be the only way to do any large-scale eradication since
manual eradication teams have been attacked. That would be disastrous from the counterinsurgency
perspective, since it would cement the Taliban’s political capital rather than bankrupting it.
Eradication never bankrupted insurgents anywhere, not even in Colombia. Nor is it sustainable without an end to conflict.
No Afghanistan escalation
Lamb 14 [Robert, senior fellow and director of the Program on Crisis, Conflict, and
Cooperation at the Center for Strategic and International Studies, PhD in policy studies from the
University of Maryland School of Public Policy; Sadika Hameed, fellow with the Program at
CSIS, MA in international policy studies from Stanford University; Kathryn Mixon, program
coordinator and research assistant
http://csis.org/files/publication/140116_Lamb_SouthAsiaRegionalDynamics_WEB.pdf]
In the event of conflict escalation in Afghanistan, China would be in direct contact with the
Afghan government, the power brokers it has a relationship with, and with Pakistani civilian and
(especially) military leaders to strongly encourage a political settlement. It would put its economic projects on hold
temporarily. But it would not become involved militarily; instead, it would try to contain the fallout
with, for example, stronger border security. Iran would certainly take similar measures to contain spillover from an
escalated Afghan conflict, but otherwise its involvement would depend almost entirely on the state of its conflicts
and rivalries in the Levant and the Gulf, much higher-priority areas than Afghanistan. If things settle down to its west and south, Iran might turn some
attention eastward to Afghanistan’s conflict. This would not be in the form of direct military incursions
but rather of funding, military equipment, and possibly safe haven to Hazara, Tajik, and Uzbek groups, as it has in the past, with a particular priority on protecting Afghanistan’s Shi‘ite minority. Saudi Arabia, working with Pakistan, would probably offer support to groups that oppose Iranian-supported groups. Qatar might follow Saudi Arabia’s lead or it might offer to mediate talks between opposing groups, as it has recently.
Beyond that, Qatar and UAE would probably stay out. Russia would probably increase its
security presence in Central Asia, as noted above, but work diplomatically with the United States,
European powers, or NATO to find ways to contain the spread of violence from Afghanistan into
Central Asia.
But sats are comparatively more essential to solve their laundry list
Salazar 18 [Doris Elin, Science & Astronomy "Solving Earth's Climate Challenges Requires
More Satellite Vision: Report," 1-6-18 https://www.space.com/39306-earth-climate-science-
satellites-future.html]
Space observations are crucial to solving the challenges presented by Earth's complex climate,
which will play a pivotal role in humanity's success or demise, argued an extensive report by the
U.S. National Academies. The new, 700-page report released today (Jan. 5) is titled "Thriving on Our Changing Planet: A
Decadal Strategy for Earth Observation from Space." In it, the National Academies of Sciences, Engineering and Medicine
(NASEM) announced their recommendations for what federal research agencies — such as
NASA, the National Oceanic and Atmospheric Administration (NOAA), and the United States
Geological Survey (USGS) — should do over the next 10 years. This was the second decadal
survey for Earth science and applications from space; the first was published in 2007. The report's
co-chairs — Waleed Abdalati, the director of Cooperative Institute of Research in Environmental Sciences at the University of
Colorado Boulder, and Bill Gail, chief technology officer at the Global Weather Corporation — addressed the survey's
recommendations during a press conference at the National Academies' Keck Center in Washington, D.C. [See the Effects of Climate
Change Across Earth (Video)] "There is a perspective from space that cannot be gained any other
way," Abdalati said early in the press conference. Understanding the ways in which human activity and
non-anthropogenic changes are shaping societies across the world ought to be considered like
an extension of infrastructure, he added. Gauging weather systems and predicting sea-level rise, for example, are as vital to a "thriving" society as fixing highways and
maintaining railroads. "If you go back 10-12 years, we were in a different place when it came to Earth information from
space," Abdalati said. "We were not using weather apps on our phones and planning our days' activities around them. We were not
using online mapping applications to get to and from where we're going in the most efficient way. The military [also] relies heavily
on information from NASA, NOAA and USGS." Space observations are crucial for society in a myriad of ways across the commercial,
public health and national safety sectors, the co-chairs said. The report is the product of 290 suggestions of the most important
issues to tackle in the near future, contributed by the scientific community. From those suggestions, the report's compilers extracted
103 objectives and then synthesized them into 35 final goals. The
report calls for prioritizing advances in, for
example, forecasting air quality and weather so that predictions provide a lead time of up to two
months. In addition, the report calls for knowing how biodiversity changes over time, predicting
future geological hazards within a more accurate time frame and understanding more precisely
how the ocean stores heat, among many other goals. The co-chairs said that the report focuses on
recommendations that are achievable within budget constraints and prioritizes the suggestions
the committee believed were most important for the next decade. The report invites the scientific
community at NASA, NOAA and USGS to focus first on achieving ambitious solutions to climate challenges and then following up
with ways to accelerate technology to meet those ends, rather than the other way around. The report recommends that NASA cap the
budget for its current projects—both flying and soon-to-be-flying missions—at $3.6 billion, to leave room in the agency's funding to
serve the report's 35 objectives over the next decade. Abdalati stressed, however, that it was important to fly the missions already in
development. NASA should also continue studying how small particles of material, known as aerosols, can affect air quality and
should learn more about the traits of vegetation on Earth's surface, the co-chairs said. The report additionally suggests that NASA
start a competitive new Explorer program for medium-size agile instruments and missions (with a budget of $500 million or lower),
in which participants would take a shot at addressing one of seven identified topics from the survey. Those topics include mapping
ocean-surface winds and developing 3D models of the terrestrial ecosystem. The competition may also reveal what objectives will be
more easily achieved in the next decade, Gail said. Gail concluded his presentation by addressing the report's title. "It really is
about this tension between our ability to thrive over the next decade and longer, and the fact
that as the planet is changing around us, the information we need to acquire about our
planet is changing as rapidly as we try to acquire it," he said. "So this will be a decade in which we
will find growing community and public reckoning between two things: broad reliance on Earth
information ... and this growing challenge of obtaining that information."
[NEXT PARAGRAPH]
Satellites Have Disadvantages Too. Because weather systems can sometimes obscure imagery, it can be difficult for satellites to either map an area or build up an accurate picture. In Ukraine, research facilities consistently send up drones to verify
information collected by satellites as well as fill in data gaps. This means that satellites are extremely useful, but another data collection tool should also be there to back up the information collected by the satellites. It is also very data intensive.
Because satellites are harvesting so much data and not really able to discern for themselves which
data is important, there is a lot of information that goes through the processing system. Primarily this creates an internet-speed issue, as a large volume of data must be transmitted from the satellite via the internet to the cloud. Research facilities and agriculture companies are investing where they can in internet
infrastructure to counter this problem as it can sometimes cause delays or backlogs of information in the
system.
Insufficient broadband dooms precision ag even assuming strong
satellites
Strenkowski 15 [Jeff, JD, Morgan, Lewis & Bockius LLP, Statement Before the Rural
Utility Service, U.S. Department of Agriculture, and the National Telecommunications and
Information Administration, U.S. Department of Commerce, 6-10, p. 2-6]
II. High-Tech Agriculture Requires Increased Wireless M2M Internet Broadband The U.S. agricultural economy is
increasingly high-tech and mobile. Expanded broadband facilities and services are critical economic drivers to rural
communities. In particular, high speed broadband is not only essential to business centers in rural towns and traditional
anchor institutions, it is also an essential service for agricultural operations that form the economic heart of many American rural
communities. Agricultural producers are facing growing demands to produce more food, fuel and fiber for a growing, more prosperous world
population, and they must do so with limited resources and increasing regulation. Not only is it critical that farm buildings have access
to high speed broadband to communicate with their customers and vendors, follow commodity
markets, gain access to new markets around the world, and manage regulatory compliance, but more and more farmers
are demanding capability for M2M communications from the field that make possible significant improvements in real-time productivity and cost
management. Over the past several decades, technology has enabled farmers to achieve ever greater levels of productivity. The first wave focused on
optimizing the vehicle. The second wave focused on optimizing the fleet. The third wave is focusing on connecting the farmer “in the cab” to the
cooperative, agronomist, or other agriculture service providers who can help reduce input costs, increase yields, and further enable sustainable farming
practices. Much of the future of enhanced farming efficiency and productivity turns on the grower's ability to gather, process, and transmit data using advanced information and communications technologies. Technology-
equipped machine solutions enable agronomic decision-making to advance productivity, improve agriculture profitability and global
competitiveness, and optimize inputs for continuous environmental improvement. With superior, precise, site-specific data, a farmer can
analyze and carefully adjust their farming methods to be the most efficient, most economical, and most environmentally friendly possible, thus
improving productivity and sustainability. However, enabling farmers to utilize M2M data fully requires significantly improved communications capacity and access to high speed mobile broadband. Today, many of Deere's customers are challenged with a lack of
adequate cellular coverage in the fields where agricultural equipment operates. Deere’s JDLink™ data service,
for example, currently relies on the cellular telephone network to transmit telemetric machine operation data. The lack of coverage needed for these
solutions to transmit telemetric data from the machines is already a concern, but the
shortfall in coverage will only become more
problematic as data volumes increase. Due to significant gaps in cell coverage in rural areas where farm machines operate, today
JDLink™ data transmissions have only a 70% successful call completion rate. Absent significant improvements in cell coverage in cropland areas,
Deere expects that this figure will drop to about 50% in two to three years as agricultural demand for broadband services increases.6 These data
communication services depend on stable, reliable high speed connections to equipment
operating in remote locations. This is not a problem that can be resolved by relying on
satellite services or even more spectrum. In addition to fiber-to-farm buildings, rural areas need more wireless
antenna towers, all of which must be connected by fiber backhaul to the broadband network provider.
Towers provide the wireless coverage; the problem is there are simply not enough towers in the cropland areas where significant
productivity enhancements could be gained. The Council should examine ways to remove regulatory barriers to the deployment of
tower and backhaul infrastructure in rural areas, and seek to stimulate business funding for this critical national infrastructure.
But, weak satellites mean more effective remote sensing tech fills in
and resolves barriers to precision ag – squo makes their impact
inevitable
Byrum 17 [Joseph, Senior R&D and Strategic Marketing Executive in Life Sciences - Global
Product Development, Innovation, and Delivery at Syngenta 3-14-17
https://agfundernews.com/remote-sensing-powers-precision-agriculture.html]
Sensors can be grouped according to their enabling technology — ground sensors, aerial sensors and
satellite sensors. Ground sensors are handheld, mounted on tractors and combines, or free-
standing in a field. Common uses for these include evaluating nutrient levels for more
specific chemical and nutrient application, measuring weather, or the moisture content of the soil. Aerial sensors have become far more affordable with the advent of drone technology that places the bird's-eye view of a field within reach of most farmers. They are also attached to airplanes, another relatively cheap option. The systems are capable of capturing high-resolution images and data slowly enough, at low altitude, to enable thorough analysis. Typical uses include plant population count, weed detection, yield estimates, measuring chlorophyll
content and evaluating soil salinity. The downside of aerial platforms is that wind and cloud cover can limit their use. Satellite
sensors provide coverage of vast land areas and are especially useful for monitoring crops status, calculating losses from severe
weather events and conducting yield assessments. Initially, such systems were tailored to the needs of the military and
government, not agriculture. So the main downside, aside from cost, was that these systems were tasked in
advance — usually months — to look at a specific area at a certain time. Worst of all, cloud
cover could ruin that expensive purchase. Now many governments have opened up satellite imaging databases
to the public, providing an important and accessible resource for understanding crop conditions.
Environment’s improving and no collapse
Easterbrook 18 – Gregg Easterbrook, Contributing Editor at The New Republic and The
Atlantic, Former Fellow in Economics and Government Studies at The Brookings Institution,
Lecturer at the Aspen Institute and Chautauqua Institution, It's Better Than It Looks: Reasons
for Optimism in an Age of Fear, p. 52
INSTEAD, IN 2017, I WATCHED a bald eagle glide peacefully above my home near Washington, DC. North American
eagles have proliferated so much that the International Union for the Conservation of Nature (IUCN), which
keeps the books on species gains and losses, now classifies the bird under "least concern.”
The eagle flew through air that was free of smog, as air almost always is in American cities. Newspapers in my
driveway reported that oversupply of petroleum and natural gas was pushing energy prices toward record lows. "Oil Glut Worries"—
here, Wall Street Journal, March 10, 2017; "Natural Gas Glut Deepens," same paper, same page, a week later. Society was expected
by now to be in full panic mode regarding oil and gas exhaustion, and instead the apprehension is too much fuel. Another newspaper
in the driveway reported so many otters frolicking off California that tourists were crowding seaside enclaves to watch. Acid
rain was nearly stopped, the stratospheric ozone hole was closing. Water quality alarms were ongoing in Flint, Michigan, and along Long Island Sound, but in general cleanliness was rising, with Boston Harbor, Chesapeake Bay, Puget Sound, and other major water bodies, filthy a generation ago, mostly safe for swimming and fishing, meeting the 1972 Clean Water Act's definition of success. Nearly every environmental barometer in the United States was positive and had been so for years if not decades.
Watching the bald eagle soar did not make me feel complacent regarding the natural world, rather, made me feel that greenhouse
gases can be brought to heel, just as other environmental problems have been. Climate change reforms will be the subject of a
coming chapter. Here, let's contemplate why nature did not collapse, despite ever more people consuming ever more resources.
Man-made damage to nature can be atrocious. Think of the Exxon Valdez oil spill, which destroyed
forever the wildlife in Prince William Sound, Alaska. At least that's what was said in 1989 when the tanker struck Bligh Reef. Today
most sea and intertidal life in Prince William Sound has returned to pre-spill numbers, while the
sound's combination of beauty and biology makes it a popular destination for whale- watching tours. Exxon, now ExxonMobil,
deserved the billions in fines and settlements the company paid. But the whole thing was over in a snap of the
fingers in geologic terms.
Humanity is hardly the only force that damages nature. In 1980, pressurized magma inside Mount Saint Helens in
Washington State exploded with the power of about 1,500 Hiroshima bombs. "Some 19 million old-
growth Douglas firs, trees with deep roots, were ripped from the ground and tossed about like cocktail
swizzles," one analyst wrote. Hundreds of square miles burned to cinders, animals and fifty-seven
people near the eruption turned to char. Commentators of the time called the Mount Saint Helens area destroyed
forever. When I hiked the blast zone in 1992, I was amazed to behold areas that had been lifeless moonscapes in 1980; just a
dozen years later, they were bright with biology: wildflower, elk, sapling firs. Today Mount
Saint Helens National Volcanic Monument is a recommended destination for backpackers. Through the eons, nature has
healed after insults far worse than the worst ever done by people — ice ages, asteroid strikes, thousand-year periods of volcanism so extreme that global ash clouds blocked the sun for years at a time. The mega-volcanism that long ago created Siberia is estimated to have unleashed three billion times the force of the Hiroshima blast, plus far more smoke than humanity's wars and factories combined. Nature has evolved defenses against such harm in the
same way that the body has evolved defenses against pathogens. This does not make harm to nature insignificant, any more than
having an immune system makes germs insignificant. But
before asking whether nature will collapse, it's good
to remind ourselves that our ongoing existence is evidence that the biosphere is a green
fortress.
Or at least, that’s one plausible and completely valid theory. But before you start campaigning for a universal basic income and
set up a bunker, you might want to also familiarize yourself with the competing theory: In the long run, we’re going to be
just fine.
Two hundred years later, Lee’s invention, still being vilified as a jobs killer, was among the machines destroyed by protestors during
the Luddite movement in Britain. More
than 100 years after that, though computers had
replaced knitting machines as the latest threat to jobs, the fear of technology’s impact on
employment was the same. A group of high-profile economists warned President Lyndon Johnson of a “cybernation
revolution” that would result in massive unemployment. Johnson’s labor secretary had recently commented that new machines had
“skills equivalent to a high school diploma” (though then, and now, machines have trouble doing simple things like recognizing
objects in photos or packing a box), and the economists
were worried that machines would soon take over
service industry jobs. Their recommendation: a universal basic income, in which the government pays everyone a low salary
to put a floor on poverty.
Today’s version of this scenario isn’t much different. This time, we’re warned of the “Rise of
Robots” and the “End of Work.” Thought leaders such as Elon Musk have once again turned to a universal basic income
as a possible response.
But widespread unemployment due to technology has never materialized before. Why, argue the
optimists, should this time be any different?
Each human could make more than 20 times the amount of cloth that she could have 100 years earlier. So
how could more textile workers be needed?
According to the optimist's viewpoint, a factory that saves money on labor through automation will either: (1) lower prices, which makes its products more appealing and creates an increased demand that may lead to the need for more workers; or (2) generate more profit or pay higher wages. That may lead to increased investment or increased consumption, which can also lead to more production, and thus, more employment.
Amazon offers a more modern example of this phenomenon. The company has over the last three years increased the
number of robots working in its warehouses from 1,400 to 45,000. Over the same period, the rate at
which it hires workers hasn’t changed.
How automation impacts wages is a separate question. Warehouse jobs, for instance, have a reputation as grueling and low-paying.
Will automation make them better or worse? In the case of the loom workers, wages went up when parts of their jobs
became automated. According to Bessen, by the end of the 19th century, weavers at the famous Lowell factory earned more
than twice what they earned per hour in 1830. That’s because a labor market had built up around the new skill (working the
machines) and employers competed for skilled labor.
s is that we are not using the right technology; i.e., energy-efficient power production without harmful
emissions. Consequently, the correct statement would be that we consume energy that is produced by technologies that are harmful
to the climate. The difference in wording is important. As the first diagnosis is “too high energy consumption”, the remedy will be to use a different medication than a diagnosis based on “the wrong technology”. Alarmist reporting
can inspire bad decisions if the statements aren’t systematically reviewed and evaluated. It can also be misguiding to express
environmental threats in general terms. Actions must be based on precise specific symptoms with corresponding diagnoses. If the
doctor discovers that the patient is lame and suffers from a high fever, it doesn’t help to predict imminent death. Maybe the
lameness and the fever have different causes altogether! A successful cure would probably include two different diagnoses with
separate medications. Several recent surveys of the general conception of the world have been made— one is Project Ignorance by
Gapminder and Novus in Sweden. One
of the questions asked was whether CO2 emissions per
capita and year had increased or decreased in the world during the last 40 years. The surveyed
group was large and representative in order to give a fairly accurate picture of the common
opinion. No less than 90% believed that CO2 emissions had increased. The truth is that they
haven’t increased at all. It is important that decision makers on all levels learn how to see the wood for the trees.
Decisions based on false preconditions can halt technological development, and thus also the development of the economy, welfare,
and a healthier environment. The
flow of innovations in the climate and environmental areas is
accelerating rapidly . This can be seen in the number of improvements that have occurred in
recent years, which can be counted in the thousands. Such improvements have to be weighed
on the same scale as the problems in this area. That is not to say the problems should be ignored—they need to be
acted upon. But they should not be allowed to occupy our brains to the extent that our power to act is paralysed. Is the Notion of
Sustainable Technology-Driven Growth Over-Optimistic? The development of a technological society has always
been questioned. In the 19th century, critics claimed that the technological revolution would create poverty. In the 1970s, it
was generally believed that the forest dieback would cause a disaster. In the 1980s, the acidification of lakes and throwaway
mentality of society were regarded as manifestations of the devastating properties of growth and industrialisation. Today, many fear
the environmental effects of air travel and the production of electronic devices. There
are people who seriously wish to
halt economic growth and wind back the clock to the society of the 1960s. They recall this
time period as small-scaled and down-to-earth, stress-free and idyllic. But they tend to forget
that the refrigerators of that time required 90% more electricity than today, and that our teeth were repaired with mercury fillings instead of plastic. There were no X-ray CT scanners and no medicines against ulcers. In addition, there were many more people living without electricity. There was also more widespread malnutrition, a higher infant mortality, and, in fact, more wars. Cars were fuelled by leaded petrol, and sulphur emissions were 90% higher
than today. The acidification of lakes, as well as polluted streams and fields, were serious
concerns. Since then, technological innovations have reduced sulphur emissions and removed
the lead from car fuel. At any given point in history, there have been critics claiming that this was the time when we had reached the optimal point in the development of the modern society. But we hadn't, not then and not now. And the more our countries are modernised, the greater our
possibilities to care for animals and nature become. In the mid-1800s, the killing of large animals
like sperm whales didn’t concern people to any significant degree, despite the cruel hunting methods using harpoons. The benefits of
the whale fat, mainly used for lamp oil to facilitate reading in the evenings, overshadowed any empathic impulses. In the 1850s more
than 70,000 people were employed by the American whaling industry. There were 900 ships in the world hunting whales, and
during one of the most active years, 8000 whales were butchered, which provided more than 300,000 barrels of oil. The oil
extracted from the head of the sperm whale, the so-called spermaceti oil, was especially sought-after. It was of very high quality and
sold for 1.50 US dollars per litre in today’s monetary value. As a consequence, the number of sperm whales in the world rapidly
dwindled. However, when oil drilling started in Pennsylvania in the year 1859, the price of whale oil began to fall. The fast transition
to petroleum products for lighting and other applications is considered to have saved the last of the sperm whales. Thus, new
technology can both contribute to the protection of threatened animal species and provide the wealth to make it affordable for us to
even save predators. Imagine what would happen if we were able to bring back someone from the 19th century and tell them that
today we move wolves though the air by helicopter in order to save the species and expand its habitat; our ancestor would probably
rather go back to sleep than listen to such apparent stupidity. Pessimism Does not Support a Sustainable Development There is a lot
of progress going on in the world today, but not without negative side effects. When improving the world and dealing with the side
effects, an optimistic attitude provides us with a much better chance of success than a pessimistic view. The optimist carries a
positive inner beacon to follow, while the pessimist is always looking for potential traps and drawbacks. As visions and conceptions
of ideas often become self-fulfilling, it isn’t difficult to realise what’s most constructive. All decisions—big or small, conscious or not
—are affected and guided by our inner beacon. When solving a problem, such as developing a new product for example, it is
necessary to have a conception of a working solution in mind. As a product developer, it is of course necessary to review every
minute step in the process and question the choices made. You have to ask yourself if there may be a better material or a smarter
design. Strange as it seems, this continuous struggle in the mind of the developer may appear to be a kind of pessimism, as it is all
about looking for weaknesses in the imagined solution. It is not dissimilar from the process a doctor follows when selecting a
diagnosis and a remedy. You start with certain hypotheses, examine, exclude, test, question and verify until you are satisfied that you
have made the correct diagnosis. Then the choice of medication becomes much simpler. It would be fatal if the doctor was
pessimistic from the start and worked in the belief that it would be impossible to find a reason for the illness, or a working remedy.
This could then be the conclusion that such a doctor would unconsciously try to verify. Would you like to have a doctor like that? The
same is true for climate and environmental problems—we need optimists armed with critical thinking to solve them. There are also
so-called climate change deniers, who believe that man hasn’t really affected the planet and its ecosystems to any significant degree.
Some of them claim that the influence of the sun and other natural phenomena are so enormous that human activities have no
bearing on global warming. Perhaps these deniers are so deeply pessimistic that they cannot imagine any possible solutions. For
ages, man has harboured a certain distrust of his own species. Throughout history, various religions have emphasised human
shortcomings and presented assorted consequential threats. During the last 30 years, such prophesies have increasingly often been
introduced by environmental activists and some political groups, whose messages have been significantly supported by the media.
The underlying conception of humanity isn’t flattering. The human race is considered to be fundamentally ruthless, greedy, short-
sighted and evil. Threats against the climate and much other misery on earth are caused by human failure. However, if we take the
time to study the progress that has been made by the human race throughout the ages, we actually get the opposite picture. Can it
really be evil, greedy, and short-sighted beings who put their own lives at stake to treat people infected by Ebola or HIV in poor
countries? Who are the ones that are continuously reducing the number of starving people on earth? Who are the ones that invent
vaccines for the children of the world? Who are the ones that have developed a civilisation where an increasing number of people get
educated, and who struggle to reduce the casualties of war? Why
blame an entire species for atrocities that are
actually committed by a mere fraction? Establishing a firm belief in humankind should be
the first step on the road to sustainable development.