
OCTAS---Michigan---Wiki Doc

1AC---Single Payer---Michigan
1AC---Full Text
Adv---Health---1AC
Advantage 1 is HEALTH.
Rural hospital closures are on the brink.
HealthDay 23, world’s largest syndicator of health news and a leading independent creator of
evidence-based health content. “Hundreds of Hospitals Could Close Across Rural America”,
https://www.usnews.com/news/health-news/articles/2023-01-16/hundreds-of-hospitals-could-close-
across-rural-america, 16 January 2023, DOA: 8/30/2023, SAoki

Hundreds of rural hospitals across the United States are teetering on the edge of closure, with their financial
status increasingly in peril, a new report reveals.

More than 200 rural hospitals are at immediate risk of closure because they aren’t making enough
money to cover the rising cost of providing care, and their low financial reserves leave them little
margin for error, the Center for Healthcare Quality and Payment Reform report states.

Overall, more than 600 rural hospitals — nearly 30% of rural hospitals nationwide — are at risk of closing in the near future, according to the report.

“Costs have been increasing significantly and payments, particularly from commercial insurance plans,
have not increased correspondingly with that,” said Harold Miller, president and CEO of the Center for Healthcare Quality and Payment Reform (CHQPR).
“And the small hospitals don't have the kinds of financial reserves to be able to cover the losses.”

The COVID-19 pandemic actually provided temporary relief to cash-strapped hospitals, thanks to federal grants that helped keep them open and serving patients.


More than 150 rural hospitals nationwide closed between 2005 and 2019, the CHQPR report noted. Another 19 shut down in 2020, more than any year in the previous decade.

But only six more closed in 2021 and 2022, because of the financial assistance hospitals received while the pandemic raged.

Now that federal assistance has ended, the financial crisis for rural hospitals looms larger than ever,
Miller said.

“The hospitals are going to be facing the situation where they have to spend more than they can take in to be able to pay for care, and they don't have the reserves to be able to deal with that,” he explained.
The problem owes to the need for hospitals to have a certain number of staffers on duty every hour of every day, while reimbursement is based on the number of patients treated, Miller said.

Urban hospitals serve larger populations and are able to make ends meet due to the constant churn of
patients coming in and out. But rural hospitals serving less populated areas are less likely to see enough
patients on average to cover costs of care.
“An emergency department in the hospital, it has to have a physician there 24/7 to be able to deal with emergencies, so there's a certain fixed cost associated with having that physician there around the clock available for patients who need it,” Miller said. “Well, if you have fewer visits because the community is smaller, then there are fewer patients to be able to cover that cost.”
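
As a minimal illustration of the fixed-cost dynamic Miller describes, the short Python sketch below uses hypothetical numbers (the staffing cost and visit volumes are assumptions, not figures from the article) to show how the per-visit cost of round-the-clock emergency coverage rises as patient volume falls.

# Illustrative only: an assumed annual cost of keeping an emergency physician
# available 24/7, spread over different annual visit volumes.
ANNUAL_FIXED_ED_COST = 2_000_000  # hypothetical fixed staffing cost in dollars

for annual_visits in (20_000, 5_000, 1_500):
    cost_per_visit = ANNUAL_FIXED_ED_COST / annual_visits
    print(f"{annual_visits:>6} visits/year -> ${cost_per_visit:,.0f} of fixed cost per visit")

The same standby cost divided across a small rural caseload produces a far higher cost per visit than an urban volume would, which is why volume-based reimbursement leaves small hospitals short.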

Rising health care costs are adding an additional strain for these hospitals, due both to inflation and to
workforce shortages.

In addition, many people still avoid going to the hospital for care, much as they did during the pandemic, said
Brock Slabach, chief operations officer of the National Rural Health Association.

“Many hospitals I talked to are reporting that volumes have not returned since the pandemic has been over, in general,” Slabach said.
Nearly every state has a rural hospital at risk, according to the CHQPR report.

In almost half the states, 25% or more of rural hospitals are at risk of closing. In 10 states, 40% or more are at risk.

States with the highest percentage of at-risk hospitals include Alabama, Arkansas, Connecticut, Hawaii, Kansas, Mississippi, New York, Oklahoma, Tennessee and Texas, according to the report.

People in urban areas tend to think of a hospital as one of many health care facilities in the community, functioning alongside urgent care centers, testing labs, radiology centers, doctors’
offices and the like.

“In many of the smallest rural communities, the only thing there is is the hospital,” Miller said. “The hospital is
the only source. Not only is it the only emergency department and the only source of inpatient care, it's
the only source of laboratory services, the only place to get an X-ray or radiology. It may even be the
only place where there is primary care.”

A lot of these small rural hospitals run rural health clinics, Miller added.

“There literally wouldn't be any physicians in the community at all if it wasn't for the rural hospital
running that rural health clinic,” he said. “So if the hospital closes, you're literally eliminating all health care
services in the community.”

Private insurance is putting the financial squeeze on hospitals more than public plans like Medicare, Miller
said.

“Medicare actually pays small rural hospitals based on their cost, recognizing that it costs more to deliver
health care in a small rural community. Private payers don't do that,” Miller said. "So in many cases, these hospitals, the
reason they're losing money is because they're losing money on their privately insured patients, not on
Medicare.”
He said the best way to preserve rural hospitals is to shift to a two-part payment schedule, where the facilities are paid to maintain standby capacity and to actively treat patients.

“If you think about a fire department in the community, fire departments don't get paid only when there's a fire. We have fire departments available in case there is a fire and there is a cost
associated with doing that,” Miller said. “We have police departments. We don't pay them based on how many crimes there are. We pay the police department to have enough salary and staff
to be able to protect the community in case there are crimes.”

Slabach, meanwhile, said private insurers also need to recognize differences in operating costs between rural and urban health care facilities.

“We need to make sure that they understand that they have a responsibility to support access to care in rural areas through paying higher unit rates, paying for these higher costs,” Slabach
said.

Businesses and average folks in rural communities can help by choosing health plans that best support their local hospital, Miller said.

For example, seniors can choose traditional Medicare coverage over a Medicare Advantage plan, he said.

“People don't realize that Medicare Advantage plans typically pay rural hospitals less than regular Medicare does, so when people are picking a Medicare Advantage plan or when they're picking a commercial insurance plan, they need to ask the question first, is that health plan paying our local hospital adequately to deliver services in the community?” Miller said. “Because it doesn't do any good to have insurance if there's no place to use it.”

If folks pick a health insurance plan that either doesn't pay or doesn't pay a rural hospital adequately, that may
put the rural hospital out of business, he said. “Then it doesn't do you any good that you have insurance,” Miller added.
Businesses can help by asking hospital administrators which health plans provide the best and easiest financial support to their local hospital, and then offering those plans to employees,
Miller suggested.

Officials also can pitch in by strengthening Medicare and Medicaid, and making sure the programs are adequately reimbursing rural hospitals, Slabach said.

“Americans that live in rural communities producing all of our food, fuel and fiber should expect and have access to the same level of primary and secondary care services that every other American in the United States has available to them,” he said.
Slabach called it "incredibly alarming" that these sites are in jeopardy.
“We know that it took several generations to get to where we are now in terms of these services being available,” he said. “Replacing them after they've been ended will be much harder to do. Sustaining them is much easier than trying to replace them once they close.”

A single-payer system reverses trends, solving closures.


Meagan C. Fitzpatrick & Alison P. Galvani 20. Meagan C. Fitzpatrick, Ph.D. in Epidemiology of
Microbial Diseases, Yale University, B.S. in Biological Sciences, University of Notre Dame, Postdoctoral
Associate, Center for Infectious Disease Modeling and Analysis, Yale School of Public Health,
Postdoctoral Fellow, Center for Vaccine Development, University of Maryland School of Medicine. Alison
P. Galvani, Ph.D. in Theoretical Epidemiology, BA (Honors, with Distinction) in Biological Sciences from
University of Oxford, founding director of the Yale Center for Infectious Disease Modeling and Analysis
and the Burnett and Stender Families Professor of Epidemiology at the Yale School of Public Health, Yale
School of Medicine. “The effect of Medicare for All on rural hospitals – Authors' reply,”
https://doi.org/10.1016/S0140-6736(20)32216-9, 31 October 2020, DOA: 8/30/2023, SAoki

The USA's rural hospitals are a lifeline for the communities that they serve, as evidenced by the
catastrophic effects of their closures. Subsequent to the closure of a rural hospital, mortality rises by
5.9% among residents of the service area.1

Contrary to concerns raised by Ankit Agarwal and Trevor Royce, the Medicare for All Act proposed by Senator Bernard Sanders will bolster the
financial sustainability of rural hospitals. Although our analysis finds that national spending on hospital care would fall by 5.6% under the
Medicare for All Act,2 rural hospitals specifically would see their revenue increase. This increase is driven by two

simultaneous changes: adjusting all fees to current Medicare rates and eliminating the financial burden
that uncompensated care places on hospitals.

Compared with urban hospitals, rural hospitals receive a larger share of their revenue from Medicare (45% of revenue)
and Medicaid (11% of revenue), with the remainder from private insurance.3, 4 Under the Medicare for All Act, alignment of all
fees to the Medicare schedule would result in no change for the current revenue from Medicare, an
increase of 20% in the current revenue from Medicaid, and a decrease of 22% in the revenue from
private insurance.2 Combined, the new revenue stream would be 93% of current revenue. Additionally, the costs
of uncompensated care in rural hospitals are equivalent to 10% of current revenue.3 As this shortfall
would be eliminated by Medicare for All, the mean projected revenue for rural hospitals would correspond to 103% of current revenue, for
the same level of service provision and, therefore, operating costs. Notably, the magnitude of the shift would be largest for hospitals

serving the least affluent communities, which tend to have substantial balances for uncompensated
care and receive a substantial proportion of their revenue from Medicaid.
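
The revenue projection in the paragraph above can be reproduced with simple arithmetic. The Python check below uses the payer-mix shares and fee adjustments the authors cite, treating the private-insurance share as the remainder of revenue (an assumption consistent with their description).

# Worked check of the authors' figures: 45% Medicare, 11% Medicaid, remainder private;
# Medicare fees unchanged, Medicaid fees +20%, private fees -22%.
medicare_share, medicaid_share = 0.45, 0.11
private_share = 1 - medicare_share - medicaid_share  # ~0.44

new_revenue = (medicare_share * 1.00
               + medicaid_share * 1.20
               + private_share * 0.78)
print(f"After fee alignment: {new_revenue:.0%} of current revenue")  # ~93%

uncompensated = 0.10  # uncompensated care, currently ~10% of revenue, is eliminated
print(f"With uncompensated care recovered: {new_revenue + uncompensated:.0%}")  # ~103%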

Additional aspects of the Medicare for All proposal would further improve the financial outlook for rural hospitals.
Medicare for All would facilitate service use, which is a stabilising factor given that occupancy rates are
a predictor of rural hospital profitability.5 Individuals who are uninsured often forego needed services,
seeking health care at half the rate of people with adequate insurance,2 and 12.3% of rural residents
(excluding older adults [ie, aged 65 years and older]) are uninsured.6 As people who are newly insured begin to access

health care at rates commensurate with their currently insured counterparts, hospital use will expand
and lives will be saved. Hospital administration costs will simultaneously fall by 53%, as billing will be
streamlined into a single-payer system under the Medicare for All proposal.7

The disparities in health-care access between rural, suburban, and urban Americans are already stark. Medicare
for All is urgently needed to
stem and reverse the accelerating trends of hospital closure in rural regions across the USA.
Effective rural healthcare access is key to US agriculture---current insurance plans fail.
Florence Becot et al. 20, PhD in Environment and Natural Resources/Rural Sociology from Ohio State
University, M.Sc. in Community Development and Applied Economics from University of Vermont,
Equivalent B.A. in Economics and Social Sciences from Universite de Rennes. Shoshanah Inwood, Ph.D.
in Rural Sociology from Ohio State University, M.S. in Environmental Science Graduate Program from
The Ohio State University, B.A. in Biology from Oberlin College. Casper Bendixsen, Ph.D. in Social-
Cultural Anthropology from Rice University, M.A. in Social-Cultural Anthropology from Rice University,
Graduate Certificate in Study of Women, Gender, and Sexuality from Rice University, B.S. in
Anthropology from University of Idaho, B.S. in Philosophy from University of Idaho. Carrie Henning-
Smith, PhD in Health Services Research and Policy and Administration from University of Minnesota, MS
in Health Services Research and Policy and Administration from University of Minnesota, MPH in Health
Behavior and Health Education from University of Michigan, MSW in Interpersonal Practice and Mental
Health with a minor in Management of Human Services. Specialist in Aging Certificate, BA in
International Relations from Claremont McKenna College. Journal of Agromedicine pg 374-377, “Health
Care and Health Insurance Access for Farm Families in the United States during COVID-19: Essential
Workers without Essential Resources?,” 12 September 2020,
https://doi.org/10.1080/1059924X.2020.1814924, DOA: 9/13/2023, SAoki
Introduction

Agriculture is one of the most dangerous occupations, as well as one that experiences some of the highest stress, depression, and suicide rates.1 As such, access

to timely, high-quality health care, along with comprehensive health insurance, are essential resources
for the health and well-being of the farm population. Yet, as farmers, farm families, and farm workers have been deemed essential
workers across the world in the midst of the COVID-19 pandemic, the pandemic and associated lockdowns are illuminating the deep inequities in our social,
political, economic, and food systems. In
the United States, COVID-19 is accentuating the reality that many farm families and
farm workers have lacked accessible and affordable health insurance and health care2 while being especially
vulnerable to the virus. This vulnerability to the virus first stems from existing risk factors such as older average ages of farmers and high rates of pre-existing
conditions of farmers and farm workers.1,2 This vulnerability also stems from difficulties adopting control measures such as keeping safe distances, staying home
when sick, and lack of personal protective equipment, with variations in the ability to adopt control measures based on living arrangements, financial resources,
farm commodity, farm structure, and marketing channels. In this commentary, we discuss three main challenges farm families face meeting their health needs and
highlight the ways in which the pandemic is likely exacerbating these pressures. While this commentary does not address the situation of farm workers, we
acknowledge the harsh and unjust realities that farm workers, in particular undocumented workers, will continue to face until major reforms address the
problematic underpinnings of our immigration and agricultural systems. Furthermore, while this commentary is focused on farm families, many of the difficulties
these families face are reflective of the challenges low- and medium-income Americans face. Unless otherwise noted, farmer-level data are from the work of the
first two authors on the U.S. Department of Agriculture funded Health Insurance, Rural Economic Development and Agriculture (HIREDnAg) project with insights
drawn from survey data and interviews with key stakeholders and farm families in 10 U.S. states pre-COVID-19.

Vulnerability due to reliance on off-farm employment for health insurance coverage

Off-farm employment has long been an important source of income for farm families, but for about half of farm families in our sample, health
insurance
coverage was their main reason for having an off-farm job. Off-farm employment creates competition
for time and resources between the off-farm job, farm work, and household needs. In turn, it can become a
source of stress that negatively impacts farm productivity, farm safety, and family quality of life.

Most recently, COVID-19


has shed further light on the deep yet fraught connection between employer-based
health insurance and farm wellbeing. This is because COVID-19-related disruptions to labor markets have led to massive lay-offs, furloughs, and
decreases in work hours. In early May 2020, an estimated 27 million Americans had lost employer-sponsored health insurance (either for themselves or through the
loss of dependent coverage).4 Additionally, the U.S. Bureau of Labor Statistics reported the loss of nearly 1.3 million local education and non-education government
jobs.5 Nearly half of all farm families reliant on employer-based health insurance are insured through public-sector jobs (health, education, government). In

rural areas, public-sector jobs tend to offer the highest wages and most generous benefits. Changes in
public- and private-sector employment options and benefits can directly affect access to health care,
financial stability, and the social well-being of farm families, with impacts felt throughout rural
communities.
Health insurance marketplace and the need to predict income in an unpredictable industry

For farmers and farm families under the age of 65, after employer-based health insurance, the
second most important source of coverage
is the direct purchase of private plans. Despite half of the farm families receiving a health insurance tax credit in 2016, private plans’
monthly premiums were higher compared to employer-based and public coverage. The need to forecast
income for the coming year when purchasing insurance on the marketplace presents important challenges for
self-employed farmers. This is largely due to the fluctuating nature of farm income and deductible business
expenses. During interviews for our HIREDnAg project, farm families shared about the unexpected financial burden of having
underestimated their income. Others spoke of their worries of getting it wrong and “played it safe” by overestimating their farm
income, meaning they were likely overpaying for health insurance or forgoing Medicaid eligibility.

When plans are purchased on the insurance marketplace, sudden


changes in income can impact monthly premiums, tax
credits, and Medicaid eligibility. Therefore, income changes must be reported to the insurance marketplace. For farm families, even in “normal”
years, upfront estimations of farm yields, market conditions, and farm income are a form of soothsaying. However, farmers might not be cognizant of the
requirement to update their income information. As a global pandemic, COVID-19 has disrupted supply chains and prices for both farm inputs and farm commodity
markets. Early estimates indicate that farm income will experience a 20 USD billion loss in 2020.6 These
impacts are building on top of
several years of unstable commodity markets, trade disputes, extreme weather, and a farm income
crisis that has weakened the farm economy in many parts of the country. In combination with potential lay-offs and
furloughs, COVID-19 related decreases in income may require farm families to re-think their calculus around deductible business expenses vs. health insurance
subsidies so that farmers are able to meet their farm and household financial needs in the short term.
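
As a purely hypothetical illustration of the forecasting problem described above, the sketch below assumes a simplified subsidy scheme in which the household's expected premium contribution is a flat share of projected income. Actual marketplace tax credits use a sliding scale and are reconciled on the tax return, but the direction of the error is the same: under-estimating income can mean repaying credits, while over-estimating means paying more up front or forgoing Medicaid eligibility.

def annual_subsidy(income, benchmark_premium, contribution_share=0.085):
    """Hypothetical subsidy: benchmark premium minus a flat income contribution."""
    return max(benchmark_premium - income * contribution_share, 0.0)

benchmark = 18_000   # assumed annual benchmark plan premium
projected = 40_000   # farm income estimated at enrollment
actual = 55_000      # realized income after an unexpectedly good year

advance = annual_subsidy(projected, benchmark)  # credit advanced during the year
owed = annual_subsidy(actual, benchmark)        # credit actually qualified for
print(f"Advance credit: ${advance:,.0f}; owed: ${owed:,.0f}; repayment: ${advance - owed:,.0f}")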

Barriers to health care in rural areas

Farmers are often portrayed as stoic, self-reliant individuals who keep their problems to themselves and delay care until the problem can no longer be ignored.7,8
Behaviors and beliefs associated with these agrarian ethics have certainly come up in our work. Yet, we have found structural
barriers to health
care in rural areas play a larger role in limiting access to timely care for farm families. In particular, rural areas
have long experienced disproportionate health-care workforce shortages, hospital and other facility
closures, and infrastructure barriers to accessing care, including transportation and technology.9 Such issues are longstanding and
will likely continue well beyond the current pandemic. However, early assessments indicate that COVID-19 will likely complicate rural health-care systems' capacity.
As the threat of COVID-19 increased in the spring of 2020, many non-essential medical appointments were cancelled, effectively foregoing vital revenue for rural
hospitals while reducing access to care. Changes to telehealth rules to loosen restrictions and increase insurance reimbursements are, at first glance, a promising
solution both in the face of COVID-19 and to increase health-care access in general. Indeed, telehealth is a key strategy to increase behavioral care to farmers, as it
provides more anonymity while partially addressing the shortage of providers.10 Yet, until the digital divide and lack of reliable high-speed Internet in rural areas is
addressed, the potential for telehealth may be limited.

Where do we go from here?

As essential workers, farmers and farm families need access to essential resources that maintain their health and
vitality during the current pandemic but also during “normal” times. Policy intervention is urgently needed to ensure
that farmers can access necessary care while continuing to produce food and provide essential services
for the population at large. In similar ways that major crises in the past have led to major shifts in economic, social, and political systems (the Social
Security Act of 1935 in the midst of the Great Depression is perhaps the best example in the U.S. context), the disruptions brought on by COVID-19 could be
leveraged to work toward increasing access to affordable and adequate health insurance and health care. Immediate
policy action might include
stabilizing farm household incomes, expanding access to affordable and comprehensive health insurance, and additional flexibility

in telehealth provision, coupled with the expansion of broadband Internet. Long-term, sustainable, and structural solutions are

needed to ensure that farmers can continue to feed the population while also maintaining a humane
standard of living that can support their own health and wellbeing.

Policy action requires collective and transparent efforts in which researchers also have a role to play. This
includes providing scientifically valid data and communicating about findings broadly beyond the traditional confines of the peer-reviewed literature.
Responsive policy requires research that recognizes our agricultural population is diverse and accounts
for the lived reality different subgroups experience in meeting their health needs. Such work should directly
confront gender discrimination and structural racism embedded within agriculture, and the additional
health and economic threats that Black, Indigenous, and other farmers of color face from COVID-19. This also
includes the need for researchers, including in the farm health and safety fields, to recognize that too often the focus of US-based research is on individual-level
behaviors and attitudes, without adequate consideration of the ways in which larger systems shape those behaviors and attitudes. While a paradigm shift for some,
a relational approach to research will open the door to different types of interventions compared to the ones that are commonly prescribed. Last, a particularly
fertile area for future research is the exploration of the ways in which different types of health insurance and health-care systems shape farm families’ ability to
meet their health needs and bolster these families’ resilience. Given
the reliance on farmers as essential workers, not just during
COVID-19, but throughout history, we cannot ensure the health and safety of these workers and their families

without essential resources like health insurance and access to health care.

Domestic insecurity cascades globally and fuels conflict. US leadership is key.


Kimberly Flowers 16, director of the Global Food Security Project at the Center for Strategic and
International Studies in Washington, D.C., executive director of the Goldfarb Center for Public Affairs at
Colby College. “The Food Security Solution,” https://www.csis.org/analysis/food-security-solution, DOA:
9/20/2023, SAoki
Agriculture’s Economic Power

Agriculture is the primary source of employment and income for 70 percent of the world’s rural poor, and it
contributes more than a third of gross domestic product (GDP) in many of the least developed countries. In light of evidence that GDP growth
originating in agriculture can be four times more effective than growth in other sectors in raising
incomes of the extremely poor, the economic leverage of agriculture for development is hard to dispute.

Aligning foreign assistance with country-led strategies for agricultural growth is the most effective
approach to achieving results for vulnerable smallholder farmers, their families, and their communities. Government
ownership is critical to sustaining development investments and to ensuring a sound policy
environment for private-sector engagement. In order for agriculture to reach its potential to generate
employment, raise smallholder incomes, and catalyze markets, both the will of country leadership to
dedicate resources and the ability of local and international private companies to invest along the value
chain are required. In some cases, this translates into tough policy reforms that take time to understand, to
implement, and to enforce.
National Security Risks

There is a causal relationship between food insecurity and political instability, as escalating and volatile
food prices have resulted in urban riots, toppled governments, and regional unrest from the Caribbean to the Middle
East. A paper released by the Chicago Council on Global Affairs in April 2016 reminds us that “food price shocks can act as a catalyst for both nonviolent and

armed conflict.” Global food security undergirds economic security, national security, and human
security; it goes well beyond a moral obligation or humanitarian response.
The intelligence community recognizes this nexus and the increasing security risk in the face of dwindling resources. Last October, the U.S. Office of the Director of National Intelligence produced a report stating that “the overall risk of food insecurity in many countries of strategic importance to the United States will increase during the next 10 years because of production, transport, and market disruptions to local food availability, declining purchasing power, and counterproductive government policies.” One of the greatest global
development challenges that wealthy and poor countries face together is increasing agricultural production to meet shifting consumer preferences and a growing population while using less
water and fewer hectares and managing the unpredictable effects of climate change.

A Changing Global Climate

Erratic weather patterns, emerging pests and diseases, and extreme natural disasters are among the overwhelming obstacles to ensuring that all people have access to safe, affordable, and
nutritious food. The U.S. Global Research Program, which is a consortium of 13 federal agencies, published a report in December 2015 that said “climate change is very likely to affect global,
regional, and local food security by disrupting food availability, decreasing access to food, and making utilization more difficult.”
A changing global climate poses a unique set of interwoven challenges to agricultural growth in developed and developing countries alike. Shifting and increasingly variable temperatures and
modified rainfall and humidity throughout the growing season impact not only crop maturation, but also the array of weeds, pests, and diseases that farmers must contend with. Increasingly
arid conditions across the Sahel and soil salinity in South Asia both highlight the need for improved seed varieties, irrigation techniques, and other inputs to help smallholders adapt to new
conditions. It is also a reminder of the importance of investing in and scaling up innovative technological solutions from transgenic crops to mobile solutions. Without strategic interventions
directed at mitigating climate change–induced agricultural productivity losses, the consistency and predictability of staple crop supplies and prices in local markets is far less assured. Because
households in many developing countries spend over 60 percent of their budgets on food, even modest price fluctuations mean that many will go hungry.

Volatility as the New Normal

According to the World Meteorological Organization, 2016’s El Niño is one of the most extreme weather patterns on record, worsening the existing impacts of climate change in many places in
sub-Saharan Africa and Southeast Asia. In Ethiopia, the government estimates that 10.2 million people will need humanitarian assistance in 2016 due to a drought severely exacerbated by the
effects of El Niño. Meanwhile, it is unlikely that the conflicts in Syria, South Sudan, Nigeria, and Yemen will be quickly resolved; the protracted unrest has disrupted agricultural production and
market activity and damaged critical infrastructure, causing billions of dollars in losses that will take decades to recover from.

The Malnutrition Continuum

The irreversible effects of childhood malnutrition are not limited to conflict environments but may contribute to their instability. Poor nutrition causes about 3.1 million deaths among children
under five each year, and one in three children in developing countries is stunted. Undernutrition in early childhood has been linked to adverse health outcomes throughout life in addition to
reduced educational attainment and lower earnings as an adult. In the long run, malnutrition undermines a country’s economic growth potential by diminishing the cognitive and physical
capacity of its emergent workforce. Addressing nutritional deficits is thus a critical component of any strategy that seeks to harness the potential of youth.

Development Glass Half Full

Agricultural development may take time to show results, but with the right kind of partnerships and country leadership, it works. While Feed the Future, the U.S. global hunger and nutrition initiative launched by the Obama administration, has room for improvement, its achievements in poverty reduction and improved nutrition in select focus countries are laudable. Smallholder farmers are learning improved cultivation and management practices and utilizing new technologies. The private sector has been engaged: small and large actors alike are making investments across various value chains. Children are eating more diverse and nutritionally complete diets. And the United States has established itself as a global development leader by fulfilling its promise to address food insecurity while leveraging substantial investments of other countries to complement its initiative.
Bipartisan Support

There are few topics that have broad bipartisan support in both congressional chambers, but global food security has proven to bring both sides of
the aisle together in solidarity. The Global Food Security Act, which would codify Feed the Future into law and authorize $1 billion a year for
the initiative, was passed by the House on April 12 with 370 votes of support and by the Senate with unanimous

consent on April 20. There are slight differences in the two versions, including an emergency food aid component, that will need to be worked out in conference, but it looks likely that
the bill will be signed into law ahead of a new administration in 2017. This is nothing short of a ground-breaking moment that signals strong U.S. leadership and,
more importantly, continues services to the millions of smallholder farmers and families who currently receive direct support to sustainably increase their incomes and to improve their diets.

Nuke war.
Julian Cribb 19, Fellow of the UK Royal Society for the Arts, Australian Academy of Technological
Science and Engineering, Australian National University Emeritus Faculty, Director of National
Awareness, CSIRO, Co-founder at Council for the Human Future. Food or War pg 181-184, “Food as an
Existential Risk,” DOI: 10.1017/9781108690126, DOA: 9/20/2023, SAoki
Weapons of Mass Destruction

Detonating just 50–100 out of the global arsenal of nearly 15,000 nuclear weapons would suffice to end civilisation in
a nuclear winter, causing worldwide famine and economic collapse affecting even distant nations, as we
saw in the previous chapter in the section dealing with South Asia. Eight nations now have the power to terminate civilisation should they desire

to do so – and two have the power to extinguish the human species. According to the nuclear monitoring group Ploughshares, this arsenal is
distributed as follows:

– Russia, 6600 warheads (2500 classified as ‘retired’)

– America, 6450 warheads (2550 classified as ‘retired’)

– France, 300 warheads

– China, 270 warheads


– UK, 215 warheads

– Pakistan, 130 warheads

– India, 120 warheads

– Israel, 80 warheads

– North Korea, 15–20 warheads.11
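
A quick tally of the Ploughshares figures listed above (taking the midpoint of the North Korea range) confirms the "nearly 15,000" total cited at the start of this passage; the sketch is only a sum of the numbers as printed.

# Sum the listed national arsenals; counts as given above, per Ploughshares.
arsenal = {
    "Russia": 6600, "USA": 6450, "France": 300, "China": 270, "UK": 215,
    "Pakistan": 130, "India": 120, "Israel": 80, "North Korea": 18,  # midpoint of 15-20
}
print(sum(arsenal.values()))  # 14,183 warheads, i.e. roughly 15,000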

Although actual numbers of warheads have continued to fall from its peak of 70,000 weapons in the mid 1980s, scientists argue the danger of nuclear conflict in fact increased in the first two decades of the twenty-first century. This was due to the modernisation of existing stockpiles, the adoption of dangerous new technologies such as robot delivery systems, hypersonic missiles, artificial intelligence and electronic warfare, and the continuing leakage of nuclear materials and know-how to non-nuclear nations and potential terrorist organisations.
In early 2018 the hands of the ‘Doomsday Clock’, maintained by the Bulletin of the Atomic Scientists, were re-set at two minutes to midnight, the highest risk to humanity that it has ever
shown since the clock was introduced in 1953. This was due not only to the state of the world’s nuclear arsenal, but also to irresponsible language by world leaders, the growing use of social
media to destabilise rival regimes, and to the rising threat of uncontrolled climate change (see below).12

In an historic moment on 17 July 2017, 122 nations voted in the UN for the first time ever in favour of a treaty banning all nuclear weapons. This called for comprehensive prohibition of “a full
range of nuclear-weapon-related activities, such as undertaking to develop, test, produce, manufacture, acquire, possess or stockpile nuclear weapons or other nuclear explosive devices, as
well as the use or threat of use of these weapons.”13 However, 71 other countries – including all the nuclear states – either opposed the ban, abstained or declined to vote. The Treaty vote
was nonetheless interpreted by some as a promising first step towards abolishing the nuclear nightmare that hangs over the entire human species.

In contrast, 192 countries had signed up to the Chemical Weapons Convention to ban the use of chemical weapons, and 180 to the Biological Weapons Convention. As of 2018, 96 per cent of
previous world stocks of chemical weapons had been destroyed – but their continued use in the Syrian conflict and in alleged assassination attempts by Russia indicated the world remains at
risk.14

As things stand, the only entities that can afford to own nuclear weapons are nations – and if humanity is to be wiped out, it will most likely be as a
result of an atomic conflict between nations. It follows from this that, if the world is to be made safe from such a fate it will need to get rid of nations as a
structure of human self-organisation and replace them with wiser, less aggressive forms of self-governance. After all, the nation state really only began in the early nineteenth century and is by
no means a permanent feature of self-governance, any more than monarchies, feudal systems or priest states. Although many people still tend to assume it is. Between them, nations have
butchered more than 200 million people in the past 150 years and it is increasingly clear the world would be a far safer, more peaceable place without either nations or nationalism. The
question is what to replace them with.

Although there may at first glance appear to be no close linkage between weapons of mass destruction and food, in the twenty-first century with world resources of food, land and water under growing stress, nothing can be ruled out. Indeed, chemical weapons have frequently been deployed in the Syrian civil war, which had drought, agricultural failure and hunger among its early drivers. And nuclear conflict remains a distinct possibility in South Asia and the Middle East, especially, as these regions are already stressed in terms of food, land and water, and their nuclear firepower or access to nuclear materials is multiplying.

It remains an open question whether panicking regimes in Russia, the USA or even France would be ruthless enough to deploy atomic weapons in an attempt to quell invasion by tens of millions of desperate refugees, fleeing famine and climate chaos in their own homelands – but the possibility ought not to be ignored.
That nuclear war is at least a possible outcome of food and climate crises was first flagged in the report The Age of Consequences by Kurt Campbell and the US-based Centre for Strategic and International Studies, which stated 'it is clear that even nuclear war cannot be excluded as a political consequence of global warming'.15 Food insecurity is therefore a driver in the preconditions for the use of nuclear weapons, whether limited or unlimited.

A global famine is a likely outcome of limited use of nuclear weapons by any country or countries – and would be
unavoidable in the event of an unlimited nuclear war between America and Russia, making it unwinnable for either. And that, as the mute
hands of the ‘Doomsday Clock’ so eloquently admonish, is also the most likely scenario for the premature termination of the

human species.

Single-payer is vital for cost containment which unlocks new-wave pharma innovation. Bargaining power AND administrative streamlining.

Michael Drummond et al. 18, Centre for Health Economics, University of York, York, North Yorks, UK.
Rosanna Tarricone, Department of Policy Analysis and Public Management, Centre for Research in
Health and Social Care Management (CERGAS), Bocconi University, Milan, Italy. Maurizio de Cicco is
Chairman of the Board of Directors, Managing Director Roche, Italy; Vice President of Farmindustria,
Monza, Italy. 2018. “Myth #5: Health Care Is Rightly Left to the Private Sector, for the Sake of Efficiency.”
The Myths of Health Care, Springer International Publishing, pp. 123-154.
3.3 Public Financing in Health care

So what are the main arguments for or against, the public financing of health care? The first argument is that, under public
financing, it
may be easier to prevent escalating costs. As mentioned above, since the consumers of health care often do
not know the value of the services on offer, they have to rely on the providers of those services (i.e. their
physicians) to make the choices for them. This leaves open the possibility of “over-selling” the benefits of
health care, leading to a continual escalation of expenditure.

Of course, there are many actions that can be taken, within a privately financed healthcare system, to
counteract these forces. Patients could be required to make co-payments at the time of receiving care,
physicians and hospitals could be given appropriate incentives to provide care efficiently and utilization
reviews could be conducted to ensure that appropriate care was delivered. However, it may still be
more difficult to control the growth of expenditure than it is in a publicly financed system with a
single [payer], mainly because there are multiple sources of funding that need to be controlled.
Reinhardt (2003) points out that, in comparing the Canadian and US healthcare systems, the existence of public
financing (through national health insurance) in Canada has helped to reduce cost escalation, with no
apparent reduction in the outcomes, in terms of improved health, produced.

The second argument is that the administrative costs associated with the private financing may be
higher than those for public financing. It was mentioned above that many policies may be required to ensure that the private
market functions efficiently. In addition, with private health insurance the costs of marketing, writing insurance
contracts and processing claims are substantial. Comparisons of administrative costs show that these
tend to be higher, as a percentage of total expenditure in privately financed systems. For example, Woolhandler
et al. (2003) reported that, in looking at the insurance element, the Canadian single payer insurance system operated
with overheads of 1.3%, comparing favourably with Canadian private insurance overheads of 13.2%, US
private insurance overheads of 11.7% and US Medicare and Medicaid overheads of 3.6 and 6.8%,
respectively. Of course, these types of comparisons are fraught with definitional judgments and the
administrative costs of publicly financed systems are often regarded as a cause for concern.
Nevertheless, it is likely that the administrative costs of operating a privately financed system are high.
In addition, it is often cheaper to raise the funds for publicly financed systems, since raising funds
through taxation is relatively inexpensive and the costs of borrowing are usually lower for governments
than for private industry.

The third argument relates to the level of control over how the funds are allocated within the healthcare
system. Even when the funding has been raised and the overall level of expenditure controlled, it is still
important to ensure that those funds are allocated efficiently, since the signals provided by a perfectly functioning
market are absent. In the last 30 years, one of the major forces to increase the efficiency of health care has
been the increased use of health technology assessment. Here, the benefits and costs of treatment
alternatives are compared in order to assess which treatment strategy will deliver the greatest value for
money. Henry Mintzberg (2012) was somewhat skeptical of the use of measurement and “evidence-based medicine” as a way of improving
health care. However, while he is right that measurement should not be exclusively pursued at the expense of
judgment, it is undeniable that assessment of the outcomes of healthcare treatments and programmes
is critical, owing [to] the information asymmetry mentioned above and the fact that many common practices
in health care have not been adequately evaluated.

While in principle health technology assessment could be used in all healthcare systems, it has thrived
to a much larger extent in publically financed healthcare systems, such as Australia, Canada, Sweden
and the United Kingdom. The reasons for this are complex, but are in part related to the explicit nature
of the budgetary constraint in publicly financed systems and the relative ease of concentrating
resources to conduct studies to satisfy the needs of a single public payer. Health Technology Assessment
is not totally absent in privately funded healthcare systems like the US, but as Sullivan et al. (2009) point out, it is
“fragmented and uncoordinated, and includes both the public and private sector”. They note that a number
of US health technology programmes predate those in countries with well-known efforts such as
Australia, Canada, Sweden and the UK, but that “regrettably, a number of these early efforts have been
discontinued or have been substantially altered in large part because of political, financial and
commercial pressures.”
A key component of health technology assessment is the use of economic evaluation, where the benefits of alternative healthcare treatments and programmes are compared with their costs. A major area for the application of economic evaluation has been in decisions on the pricing and
reimbursement of pharmaceuticals. In reviewing worldwide experience with the use of economic evaluation over the last 20 years, Drummond
(2013) argues that it has been effective in helping to target expensive therapies to the patients who would benefit most and in securing price
reductions. In addition, in comparing the United Kingdom, a heavy user of HTA, with the US, Mason et al. (2010) note that cancer drugs licensed
since 2004 were all reimbursed by payers in the US, often without restrictions, but that they received considerably more scrutiny in the UK.

In conclusion, there
are few compelling reasons to believe that leaving the financing of health care to the
private sector would automatically lead to greater efficiency. In fact there are some reasons to believe
that the opposite might be true.
8.4 The Future of Complementary Health care
8.4.1 The Context

Moving from the considerations related to the evolution of health management models—from the biomedical one (totally linear and impervious to the external environment) to the industrial-fordist one which introduces measurement parameters, such as the duration of visits, rigid division of operators’ work and economic performance
evaluations, this chapter aims to be a reflection about the myth #5, which emphasizes pros and cons of the growing involvement of the private sector in the organization and management of the health service. When talking about health care in Italy, two important premises must be considered. Firstly, the Italian health system has been ranked second best in the world by the World
Health Organization, with only the French system ranked higher. Secondly, the Italian National Healthcare Service (SSN) has deep roots. Created in 1978 to replace a previous system based on a multitude of insurance schemes, the NHS was inspired by the British National Health Service and has two underlying principles: every Italian citizen and foreign resident has the right to health
care and the system covers all necessary services and treatments. The starting point of any reflection is, therefore, that in Italy the management approach toward health is fundamentally paternalistic, providing the right for all, under article 32 of the Italian Constitution for universality, equality and equity. The best universalistic experiment developed worldwide. Nevertheless, even if
Italy is the European country with the lowest healthcare expenditure, we have been unfortunately experiencing a growing unsustainability of the public system to guarantee adequate levels of assistance, services and accessible innovation. Unacceptable discrepancies have emerged over time between the Northern and Southern Regions. Moreover, the declining conditions of public
finances have emphasized the need for further spending restraint policies and new rules have been defined to control Regional Governments: not only they are now required to break-even but they are also subjected to sanctions in case of failure, ranging from a general loss of autonomy, to a financial recovery plan or repayment plan supervised by the Central Government. While
searching for pragmatic solutions to save the universalistic concept of health care, the Italian situation has been rapidly worsening—as demonstrated by the latest Research lead by Fondazione Censis and RBM Salute (“Building the complementary health care”)—with immediate and direct effects on the population, who has to face endless waiting lists, even for services of primary
importance. Italian citizens are now finding themselves at a crossroad; those who can afford private healthcare insurance, deciding to pay for what the NHS should guarantee, and those who cannot afford it, having to deal with the System’s inefficiencies, giving up prevention exams, tests, check-ups and therapies. Today health care is mainly funded by public funds. In 2014, 77% of
expenditure was covered by the NHS, whilst the remaining 23% was paid by citizens (a considerable share and on the increase in recent years.). In 2015, about 11 million Italians had to give up health benefits. Private health expenditure has grown to 34.5 billion euro (+3.2% over the past two years): double the increase in overall spending on household consumption in the same
period (+1.7%). A sum that, divided by every citizen, amounts to 570 euro per year, or 2000 euro per household. This result becomes even more significant if we consider the deflationary dynamic, relevant in the case of some health products and services. 10 million Italians resort more to the private sector and 7
million to Intramoenia because they cannot wait. It is not by chance that one citizen out of two—26 million —affirms he is likely to accede to integrative health solutions. More than half of the population believes that those who can afford a health insurance policy or who work in a company where integrative health care is available should take out a policy and stick to it. On one side,
this would bring public benefits because many people would use private facilities therefore freeing up space in the public sector, and on the other side this would introduce more resources in the health system. The need for concrete and sustainable solutions will surely increase. In June 2016 the Lancet published a study of the Institute for Health Metrics and Evaluation, funded by
the Bill & Melinda Gates Foundation. The researchers carried out the forecasts for 184 countries. The global healthcare spending, both public and private, will increase from 7.83 trillion dollars in 2013 to 18.28 trillion dollars in 2040. During this period, health spending per capita will increase by 2.7% per year in high-income countries, by 3.4% per year for those with upper middle
income, by 3% for those with low and middle-incomes and finally 2.4% in low-income countries. In Italy an annual growth rate of 2.6% is estimated which would lead in 2040 to a total (public and private) health spending per capita of $5968 (with a range between $5013 and $6804), compared to $3077 in 2013 with almost 80% covered by public health spending and the rest as
private spending. The conclusions of the study do not sound positive for many countries, including Italy. Despite the many advances made in the field of health, the low and middle-income countries will not be able to effectively meet health expenditure objectives at a global level by 2040. Even the health expenditure gap between the poorest and the richest countries will not
significantly shrink in the coming years unless action is taken today, with important policy interventions and concerted actions. Italy is currently ranked fourth among the G7 countries with a share of private spending higher than Japan, the UK and France, not much less than Germany and Canada and well below the United States, where private health expenditure is always more than
50% of the total expenditure and health insurance represents a substantial proportion of this share. What differentiates Italy from other countries is the high incidence of out-of-pocket spending rather than the overall private spending, with consequences less and less equitable from the viewpoint of distribution and with some concerns for the future stability of the overall health
system. Our National Health Service recognizes the opportunity for citizens to supplement the benefits provided by the public service by resorting to private insurances or forms of voluntary mutual aid. The health funds are included in supplementary mutuality and thus in the non-profit sector being directed to provide additional
benefits to those provided by the NHS according to a logic based on the principle of solidarity between occupational categories and groups of citizens. The regulatory framework distinguishes between the matching funds into two categories: so-called funds “Doc” and “non Doc.” The former are traditionally defined matching funds of the National Health Service, while the latter are
corporations, welfares and mutual aid societies which have exclusively charitable purposes. These not only integrate the performance of the NHS, but also offer support measures related to performance which fall within the essential levels of care guaranteed by the NHS. Along with these types of funds there are private insurances.

8.4.2 Present and Future Role of Complementary Health care

Currently integrative health plays a marginal role, the prerogative of a “few privileged people”, even if someone believes today it is considered a way to obtain additional benefits (avoiding queues, shortening time, improving the hospital stay, etc.) and not really used in case of particularly complex situations, where the NHS remains the first choice. In the last couple of
years private expenditure has increased by 6.5% and since 2010 beneficiaries have nearly doubled, amounting to 88%. Nevertheless, the percentage of private health expenditure negotiated by insurance fell from 14 to 13% of the total, although more and more Italians think they can one day join some form of supplementary assistance. Moreover if we consider supplemental health
insurance policy costs, on average, about 70% of what each citizen spends on visits and examinations in a year, the opportunity coming from this form of funding becomes more evident, also because big funds can provide much more affordable prices than those a single person or patient could ever obtain. According to politicians, policy makers and technicians, private health care
should be extended to the entire population as far as possible, with a general objective of safeguarding public health. Some technical commission representatives believe the role of collective policies could be a good opportunity for risk-pooling, thus enabling those people with a higher risk to obtain insurance coverage. Certainly, opportunities from integrative health care have not
yet been fully grasped and it could become a synergic element of the System. Like collective health funds, to which from the past subscriptions have been encouraged and in a near future they could substitute lots of services of the NHS, promoting a better efficiency and a deeper transparency for future development. According to Censis—RBM Salute research, massive adherence to
these systems would free up about 15 billion euro of additional resources in health care. The perspective changes if we consider the position of Citizens Associations, such as Active Citizenship. They worryingly underline how Italians pay twice for a service: they pay for health care through public taxes on the one hand
and pay for healthcare insurance on the other. While arguing, however, that innovation is not insurable and that no health insurance can offer the patient the opportunity to be treated with innovative drugs, the Association proposes forms of experimentation where the supplementary health can raise the standards of NHS structures, thus engaging a positive circle with better
services for everyone, both for those who use private health care and for those who remain exclusively within the scope of public health. What is lacking in the Italian scenario is a centralized political choice on the role that the private health sector should have in the overall system. Alongside the undeniable défaillances of the public mechanism, it appears to be in fact a fragmented
and largely unregulated sector. Just consider that Italy is the only Organization for Economic Cooperation and Development (OECD) country in which there coexists an integrative health service (providing, for example, dentistry or social assistance services), a complementary health service (for example in case of complex radiological examinations difficult to access) and a
replacement or substitute health service (which mainly covers hospitalizations and operations/interventions). The image represents a peculiarity which is completely local, that comes up again when you look at those who buy insurance policies; ranging from Public Administration/Government categories, to social security institutions of professionals, from individual companies to
national categories contracts, passing for funds subscribed on a territorial basis in some regions and individual citizens. From this picture the need to proceed to a rationalization of the landscape is evident. In order to gain improved efficiency in a sector, like the private one for health services, Institutions cannot continue pretending it does not exist.

How can this be achieved? Censis—RBM Salute research suggests Italy should consider a new system of tax deductions and credits, replacing the current rules that penalize individually subscribed policies, and one that rewards the ability to mediate citizens' health expenses. In many cases, after all, health insurance policies are not a luxury good but a necessity.

The current social security model urgently needs to be reformed toward more flexible dynamics, so that it can better adapt to evolving needs. We can aspire to integrate the first, and foremost, public pillar with the private and complementary one, through synergy among different dimensions and mechanisms (company and community category funds).

8.4.3 Insurance Scheme for Innovative Drugs

And what about drugs? Over the last 10–20 years a health revolution has begun. First, there is a new approach in Government policies and in people's mindset: moving beyond the notion of merely treating disease towards prevention and wellness, with several pioneering programmes leading the way. Second, pharmaceutical companies have shifted their Research and Development focus from primary care, where medical need has been reduced and the most common diseases can be adequately treated with established and mostly generic treatments (e.g. cardiovascular, pain, GI, anti-inflammatory/infections), to specialty care, where medical need is highest, in disease areas with currently no or inadequate treatment options. New treatment options (e.g. for certain cancers, HCV, hospital infections) offer significant additional clinical and economic value compared with the standard of care, or offer, often for the first time, treatment options for orphan/rare diseases (e.g. IPF), which result in higher costs per treatment and per patient. The more therapeutic solutions become available, the more budget pressure will increase, with access issues for all eligible patients.

At the same time, these are the areas where it has become increasingly difficult for Pharma to demonstrate incremental value and cost-effectiveness versus the options offered by effective generics, and to achieve pricing adequate to reward the investments made in R&D.

One thing is clear: a storm of innovative drugs is on its way, and it will need to be driven, guided and controlled.
While Italian political and technical healthcare institutions have made genuinely praiseworthy efforts, such as creating a dedicated fund for innovative drugs and reconsidering pricing mechanisms, we should admit that, although these moves are certainly in the right direction, they will not be enough in the medium term.
Our National Health Service alone can no longer cope with citizens' new health needs, care needs that have changed considerably since the end of the 1970s, when the NHS began to govern health care in Italy.

Besides the demographic development of ageing societies across Europe, the need for, and consumption of, medical care and treatment (in particular specialty care products in oncology, CNS and immunology) is closely correlated with ageing and contributes strongly to budget constraints. Immigration into the EU has increased significantly, and because many immigrants cannot immediately find employment and therefore do not contribute to financing health care, this has further strained existing healthcare budgets.

This mix of factors, namely increasing costs for primary care, secondary care and rehabilitation driven by the increasing number of patients, leads to growing budget pressure and in parallel reduces the savings potential in these healthcare areas.

Consequently, payers, who must secure the financial sustainability of the health system without undermining the shared values of universal coverage, solidarity in financing, equity of access and the provision of high-quality health care, are seeking additional resources through decisions increasingly driven by cost-minimization rather than value. As a direct result, they keep drawing from the pharma sector.

Yet today we need to fund research, prevention and new drugs, and to face the challenge of longevity.
There is no one to blame, but if the NHS, like those of the other major OECD countries, is decidedly shaky and its sustainability is at risk, we need to take stock and act to identify the necessary countermeasures. We must focus firmly on health system reform in our country, with the contribution of all healthcare stakeholders, including insurance companies and supplementary health funds but also pharmaceutical companies, to identify additional sources of financing that enable citizens to maintain adequate levels of assistance.

Some argue that it might be useful to start this process at the national level, defining healthcare priorities with the active participation of citizens. Such a process has been running for over 20 years in several European countries, such as Sweden and Norway, which still have national health services that certainly have not been "dismantled". Responding to the growing needs of the population on the basis of ethical criteria of solidarity could therefore make it possible to identify priority classes by which to allocate the available financial resources. The lowest priority classes could be assigned to supplementary forms of assistance, in a perspective of cooperation and of "taking care" of the patient along shared assistance pathways.

Our country is called upon to make choices.

We are certainly proud to have a universalistic model, though we are perfectly aware of all its differences. At least 50% of out-of-pocket expenditure is due to the fact that universalistic health care often does not work.
Considering the current situation and the future scenario, one of the points under discussion concerns the possible benefit of a specific treatment and the role a pharmaceutical company could play in identifying new paths.

From my perspective, while pharmaceutical companies should take part in this evolution, insurance companies should for their part identify and present a business model that can be applied to drugs, and in particular to innovative treatments.

What is to come are truly revolutionary medicines, able to treat or control not only diseases with a high social impact, like Alzheimer's disease or Parkinson's, but also many types of tumours that can now be considered chronic.
For all these reasons it is crucial to find a new model to sustain our healthcare system, and the solution is certainly not simply to increase the healthcare fund set year by year.

“The” decision maker, the institutional role predominantly called upon to manage this matter, has to seize the opportunity offered by this historical moment, in which different players are debating possible solutions, analyzing models and benchmarking other countries' systems, finally admitting that the current universalism cannot continue as it is.

Taking stock of all the positions and considerations above, I would like to offer some final food for thought. A few years ago we started talking about “health consumption”, that is, we started measuring health care through the parameters of service consumption. Eventually, this trend led to the creation of “local healthcare companies” (ASL).

What has been going on in the last 20–30 years—perhaps without even really realizing it—is the transformation of health, which has become
more and more subordinate to market imperatives and economic criteria. Thus, citizens have become customers, rights have become needs,
and welfare has become a social market.

While avoiding waste and rationalizing expenditure must be the starting point, this should be done not by reducing spending but by making it more efficient. The growing corporatization of health (and social) care ultimately sacrifices quality, the human dimension, and the efficiency and performance of care. But health is not a commodity.

Pharma innovation stops extinction from natural disease and bioweapons.


Dr. Piers Millett 17, PhD, Senior Research Fellow at the University of Oxford, Future of Humanity
Institute, and Andrew Snyder-Beattie, MS, Director of Research at the University of Oxford, Future of
Humanity Institute, “Existential Risk and Cost-Effective Biosecurity”, Health Security, Volume 15,
Number 4, 8/1/2017, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5576214/
Abstract

In the decades to come, advanced bioweapons could threaten human existence. Although the probability of
human extinction from bioweapons may be low, the expected value of reducing the risk could still be large, since such risks jeopardize the
existence of all future generations. We provide an overview of biotechnological extinction risk, make some rough initial estimates for how
severe the risks might be, and compare the cost-effectiveness of reducing these extinction-level risks with existing biosecurity work. We find
that reducing human extinction risk can be more cost-effective than reducing smaller-scale risks, even when using conservative estimates. This
suggests that the risks are not low enough to ignore and that more ought to be done to prevent the worst-case scenarios.

Keywords: Biothreat, Catastrophic risk, Existential risk, Cost-effectiveness, Cost-benefit analysis

How worthwhile is it spending resources to study and mitigate the chance of human extinction from biological risks? The risks of such a
catastrophe are presumably low, so a skeptic might argue that addressing such risks would be a waste of scarce resources. In this article, we
investigate this position using a cost-effectiveness approach and ultimately conclude that the expected value of reducing these risks is large,
especially since such risks jeopardize the existence of all future human lives.

Historically, disease events have been responsible for the greatest death tolls on humanity. The 1918
flu was responsible for more than 50 million deaths,1 while smallpox killed perhaps 10 times that many in
the 20th century alone.2 The Black Death was responsible for killing over 25% of the European population,3
while other pandemics, such as the plague of Justinian, are thought to have killed 25 million in the 6th century—
constituting over 10% of the world's population at the time.4 It is an open question whether a future pandemic could result in
outright human extinction or the irreversible collapse of civilization.

A skeptic would have many good reasons to think that existential risk from disease is unlikely. Such a disease would need
to spread worldwide to remote populations, overcome rare genetic resistances, and evade detection, cures, and countermeasures. Even
evolution itself may work in humanity's favor: Virulence and transmission are often a trade-off, and so evolutionary pressures could push against
maximally lethal wild-type pathogens.5,6

While these arguments point to a very small risk of human extinction, they do not rule the possibility out entirely. Although
rare, there are recorded instances of species going extinct due to disease —primarily in amphibians, but also in 1
mammalian species of rat on Christmas Island.7,8 There are also historical examples of large human populations being
almost entirely wiped out by disease, especially when multiple diseases were simultaneously
introduced into a population without immunity. The most striking examples of total population collapse include native
American tribes exposed to European diseases, such as the Massachusett (86% loss of population), Quiripi-Unquachog (95% loss of population),
and the Western Abenaki (which suffered a staggering 98% loss of population).9

In the modern context, no single disease currently exists that combines the worst-case levels of transmissibility, lethality, resistance to
countermeasures, and global reach. But many diseases are proof of principle that each
worst-case attribute can be realized
independently. For example, some diseases exhibit nearly a 100% case fatality ratio in the absence of
treatment, such as rabies or septicemic plague. Other diseases have a track record of spreading to virtually every
human community worldwide, such as the 1918 flu,10 and seroprevalence studies indicate that other pathogens, such as
chickenpox and HSV-1, can successfully reach over 95% of a population.11,12 Under optimal virulence theory, natural evolution would be an
unlikely source for pathogens with the highest possible levels of transmissibility, virulence, and global reach. But advances
in
biotechnology might allow the creation of diseases that combine such traits. Recent controversy has already
emerged over a number of scientific experiments that resulted in viruses with enhanced transmissibility,
lethality, and/or the ability to overcome therapeutics.13-17 Other experiments demonstrated that mousepox could be
modified to have a 100% case fatality rate and render a vaccine ineffective.18 In addition to transmissibility and lethality, studies have shown
that other disease traits, such as incubation time, environmental survival, and available vectors, could be
modified as well.19-21
Although these experiments had scientific merit and were not conducted with malicious intent, their implications are still worrying. This is
especially true given that there
is also a long historical track record of state-run bioweapon research applying
cutting-edge science and technology to design agents not previously seen in nature . The Soviet bioweapons
program developed agents with traits such as enhanced virulence, resistance to therapies, greater environmental resilience, increased difficulty
to diagnose or treat, and which caused unexpected disease presentations and outcomes.22 Delivery capabilities have also been subject to the
cutting edge of technical development, with Canadian, US, and UK bioweapon efforts playing a critical role in developing the discipline of
aerobiology.23,24 While there is no evidence of state-run bioweapons programs directly attempting to develop or deploy bioweapons that
would pose an existential risk, the logic of deterrence and mutually assured destruction could create such incentives in more unstable political
environments or following a breakdown of the Biological Weapons Convention.25 The possibility of a war between great powers could also
increase the pressure to use such weapons—during the World Wars, bioweapons were used across multiple continents, with Germany
targeting animals in WWI,26 and Japan using plague to cause an epidemic in China during WWII.27

Non-state actors may also pose a risk, especially those with explicitly omnicidal aims. While rare, there are
examples. The Aum Shinrikyo cult in Japan sought biological weapons for the express purpose of causing
extinction.28 Environmental groups, such as the Gaia Liberation Front, have argued that “we can ensure Gaia's survival only through the
extinction of the Humans as a species … we now have the specific technology for doing the job … several different [genetically engineered]
viruses could be released”(quoted in ref. 29). Groups such as R.I.S.E. also sought to protect nature by destroying most of humanity with
bioweapons.30 Fortunately, to date, non-state
actors have lacked the capabilities needed to pose a catastrophic bioweapons
threat, but this could change in future decades as biotechnology becomes more accessible and the pool of experienced
users grows.31,32
What is the appropriate response to these speculative extinction threats? A balanced biosecurity portfolio might include investments that reduce a mix of proven and speculative risks, but striking this balance is still difficult given the massive uncertainties around the low-probability, high-
consequence risks. In this article, we examine the traditional spectrum of biosecurity risks (ie, biocrimes, bioterrorism, and biowarfare) to categorize biothreats by likelihood and impact, expanding the historical analysis to consider even lower-probability, higher-consequence events
(catastrophic risks and existential risks). In order to produce reasoned estimates of the likelihood of different categories of biothreats, we bring together relevant data and theory and produce some first-guess estimates of the likelihood of different categories of biothreat, and we use
these initial estimates to compare the cost-effectiveness of reducing existential risks with more traditional biosecurity measures. We emphasize that these models are highly uncertain, and their utility lies more in enabling order-of-magnitude comparisons rather than as a precise measure
of the true risk. However, even with the most conservative models, we find that reduction of low-probability, high-consequence risks can be more cost-effective, as measured by quality-adjusted life year per dollar, especially when we account for the lives of future generations. This
suggests that despite the low probability of such events, society still ought to invest more in preventing the most extreme possible biosecurity catastrophes.

Here, we use historical data to analyze the probability and severity of biothreats. We place biothreats in 6 loose categories: incidents, events, disasters, crises, global catastrophic risk, and existential risk. Together they form an overlapping spectrum of increasing impact and decreasing
likelihood (Figure 1).*

A spectrum of differing impacts and likelihoods from biothreats. Below each category of risk is the number of human fatalities. We loosely define global catastrophic risk as being 100 million fatalities, and existential risk as being the total extinction of humanity. Alternative definitions can
be found in previous reports,33 as well as within this journal issue.34

The historical use of bioweapons provides useful examples of some categories of biothreats. Biocrimes and bioterrorism provide examples of incidents.† Biological warfare provides examples of events and disasters. These historical examples provide indicative data on likelihood and
impact that we can then feed into a cost-effectiveness analysis. We should note that these data are both sparse and sometimes controversial. Where possible, we use multiple datasets to corroborate our numbers, but ultimately the “true rate” of bioweapon attacks is highly uncertain.

Biocrimes and Bioterrorism

Historically, risks of biocrime‡ and bioterrorism§ have been limited. A 2015 Risk and Benefit Analysis for Gain of Function Research detailed 24 biocrimes between 1990 and 2015 (0.96 per year) and an additional 42 bioterrorism incidents between 1972 and 2014 (1 per year).36 This is
consistent with other estimates of biocrimes and bioterrorism frequency, which range from 0.35 to 3.5 per year (see supplementary material, part 1, at http://online.liebertpub.com/doi/suppl/10.1089/hs.2017.0028).

Most attacks typically result in no more than a handful of casualties (and many of these events include hoaxes, threats, and attacks that had no casualties at all). For example, the anthrax letter attacks in the United States in 2001, perhaps the most high-profile case in recent years,
resulted in only 17 infections with 5 fatalities.37 The 2015 Risk and Benefit Analysis for Gain of Function Research detailed only a single death from the recorded biocrimes.** Only 1 of the bioterrorism incidents in the report had associated deaths (the 2001 anthrax letter attacks).36
Based on this data, for the purposes of this article, we assume that we could expect 1 incident per year resulting in up to tens of deaths.

Biological Warfare

Academic overviews of biological warfare†† detail 7 programs prior to 1945.38 A further 9 programs are recorded between 1945 and 1994.39 For most of the last century, at least 1 program was active in any given year (Table 1).

The actual use of bioweapons by states is less common: Over the 85 years covered by these histories (1915 to 2000), 18 cases of use (or possible use) were recorded, including outbreaks connected to biological warfare (see supplementary material, part 2, at
http://online.liebertpub.com/doi/suppl/10.1089/hs.2017.0028). Extrapolating this out (dividing 18 by 85), we would have about a 20% chance per year of biowarfare. It is worth noting the limitations of these data. Most of these events occurred before the introduction of the Biological
Weapons Convention and were conducted by countries that no longer have biological weapons programs. Since many of these incidents occurred during infrequent great power wars, we revise our best guess to around 10% chance per year of biowarfare.
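
To make the extrapolation above concrete, here is a minimal Python sketch, assuming only the figures quoted in the card (18 recorded uses over 85 years, then halved to roughly 10%); the variable names are illustrative and not drawn from the source.

# Minimal sketch of the biowarfare base-rate extrapolation described above.
# Numeric inputs are taken from the quoted text; names are illustrative assumptions.
recorded_uses = 18        # cases of use or possible use, 1915-2000
years_covered = 85        # span of the historical record

naive_annual_rate = recorded_uses / years_covered
print(f"Naive annual chance of biowarfare: {naive_annual_rate:.0%}")  # ~21%, i.e. "about 20%"

# The authors revise this downward because most incidents predate the Biological
# Weapons Convention and clustered in infrequent great-power wars.
revised_annual_rate = 0.10   # the authors' stated best guess, roughly half the naive rate
print(f"Revised best guess: {revised_annual_rate:.0%} per year")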

We use 2 sets of data to estimate the magnitude of such events. The first dataset was Japanese biological warfare in China,40 where records indicate a series of attacks on towns resulted in a mean of 330 casualties per event and 1 case in which an attack resulted in a regional outbreak
causing an estimated 30,000 deaths (see supplementary material, part 3, at http://online.liebertpub.com/doi/suppl/10.1089/hs.2017.0028). The second data set came from disease events that were alleged to have an unnatural origin.41 In one case study, a point source release of
anthrax resulted in at least 66 deaths. In a second case study, a regional epidemic of the same disease resulted in more than 17,000 human cases. While these events were not confirmed as having been caused by biological warfare, contemporary or subsequent analysis has suggested
that such an origin was at least feasible. Combined, these figures provide an estimated impact of between 66 to 330 and 17,000 to 30,000.

For the purposes of this analysis, we are assuming the lower boundary figures from biological warfare are indicative of events, with a likelihood of 10% per year and an impact ranging between tens and thousands of fatalities. The upper boundary figures from biological warfare are
indicative of disasters, with a likelihood of 1% per year and an impact range of thousands to tens of thousands of fatalities.‡‡

Unlike standard biothreats, there is no historical record on which to draw when considering global catastrophic or existential risks. Alternative approaches are required to estimate the likelihood of such an event. Given the high degree of uncertainty, we adopt 3 different approaches to
approximate the risk of extinction from bioweapons: utilizing surveys of experts, previous major risk assessments, and simple toy models. These should be taken as initial guesses or rough order-of-magnitude approximations, and not a reliable or precise measure.

An informal survey at the 2008 Oxford Global Catastrophic Risk Conference asked participants to estimate the chance that disasters of different types would occur before 2100. Participants had a median risk estimate of 0.05% that a natural pandemic would lead to human extinction by
2100, and a median risk estimate of 2% that an “engineered” pandemic would lead to extinction by 2100.42

The advantage of the survey is that it directly measures the quantity that we are interested in: probability of extinction from bioweapons. The disadvantage is that the estimates were likely highly subjective and unreliable, especially as the survey did not account for response bias, and the
respondents were not calibrated beforehand. We therefore also turn to other models that, while indirect, provide more objective measures of risk.§§

Recent controversial experiments on H5N1 influenza prompted discussions as to the risks of deliberately creating potentially pandemic pathogens. These agents are those that are highly transmissible, capable of uncontrollable spread in human populations, highly virulent, and also
possibly able to overcome medical countermeasures.44 Previous work in a comprehensive report done by Gryphon Scientific, Risk and Benefit Analysis of Gain of Function Research,36 has laid out very detailed risk assessments of potentially pandemic pathogen research, suggesting that
the annual probability of a global pandemic resulting from an accident with this type of research in the United States is 0.002% to 0.1%. The report also concluded that risks of deliberate misuse were about as serious as the risks of an accidental outbreak, suggesting a 2-fold increase in
risk. Assuming that 25% of relevant research is done in the United States as opposed to elsewhere in the world, this gives us a further 4-fold increase in risk. In total, this 8-fold increase in risk gives us a 0.016% to 0.8% chance of a pandemic in the future each year (see supplementary
material, part 4, at http://online.liebertpub.com/doi/suppl/10.1089/hs.2017.0028).

The analysis in Risk and Benefit Analysis of Gain of Function Research suggested that lab outbreaks from wild-type influenza viruses could result in between 4 million and 80 million deaths,36 but others have suggested that if some of the modified pathogens were to escape from a
laboratory, they could cause up to 1 billion fatalities.45 For the purposes of this model, we assume that for any global pandemic arising from this kind of research, each has only a 1 in 10,000*** chance of causing an existential risk. This figure is somewhat arbitrary but serves as an
excessively conservative guess that would include worst-case situations in which scientists intentionally cause harm, where civilization permanently collapses following a particularly bad outbreak, or other worst-case scenarios that would result in existential risk. Multiplying the
probability of an outbreak with the probability of an existential risk gives us an annual risk probability between 1.6 × 10^-8 and 8 × 10^-7.†††
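
The probability chain in the two paragraphs above can be reproduced in a few lines. The sketch below is a minimal illustration assuming only the figures quoted from the Gryphon Scientific analysis and the authors' 1-in-10,000 conditional factor; the variable names and the composition into a single script are illustrative additions.

# Sketch of the gain-of-function risk chain quoted above.
# All numeric inputs come from the text; variable names are illustrative.
accident_low, accident_high = 0.00002, 0.001   # 0.002%-0.1% annual US lab-accident pandemic risk
misuse_multiplier = 2            # deliberate misuse judged roughly as risky as accidents
global_multiplier = 4            # US assumed to host ~25% of relevant research worldwide
existential_given_pandemic = 1 / 10_000        # deliberately conservative conditional factor

pandemic_low = accident_low * misuse_multiplier * global_multiplier    # 0.016% per year
pandemic_high = accident_high * misuse_multiplier * global_multiplier  # 0.8% per year
x_risk_low = pandemic_low * existential_given_pandemic
x_risk_high = pandemic_high * existential_given_pandemic

print(f"Annual engineered-pandemic risk: {pandemic_low:.3%} to {pandemic_high:.1%}")
print(f"Annual existential risk: {x_risk_low:.1e} to {x_risk_high:.0e}")  # 1.6e-08 to 8e-07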

Model 3: Naive Power Law Extrapolation

Previous literature has found that casualty numbers from terrorism and warfare follow a power law distribution, including terrorism from WMDs.46 Power laws have the property of being scale invariant, meaning that the ratio in likelihood between events that cause the deaths of 10
people and 10,000 people will be the same as that between 10,000 people and 10,000,000 people.‡‡‡ This property results in a distribution with an exceptionally heavy tail, so that the vast majority of events will have very low casualty rates, with a couple of extreme outliers.

Past studies have estimated this ratio for terrorism using biological and chemical weapons to be about 0.5 for 1 order of magnitude,47 meaning that an attack that kills 10^x people is about 3 times less likely (10^0.5) than an attack that kills 10^(x–1) people (a concrete example is that attacks with more than 1,000 casualties, such as the Aum Shinrikyo attacks, will be about 30 times less probable than an attack that kills a single individual). Extrapolating the power law out, we find that the probability that an attack kills more than 5 billion will be (5 billion)^–0.5 or 0.000014. Assuming 1 attack per year (extrapolated on the current rate of bio-attacks) and assuming that only 10% of such attacks that kill more than 5 billion eventually lead to extinction (due to the breakdown of society, or other knock-on effects), we get an annual existential risk of 0.0000014 (or 1.4 × 10^–6).

We can also use similar reasoning for warfare, where we have more reliable data (97 wars between 1820 and 1997, although the data are less specific to biological warfare). The parameter for warfare is 0.41,47 suggesting that wars that result in more than 5 billion casualties will comprise (5 billion)^–0.41 = 0.0001 of all wars. Our estimate assumes that wars will occur with the same frequency as in 1820 to 1997, with 1 new war arising roughly every 2 years. It also assumes that in these extreme outlier scenarios, nuclear or contagious biological weapons would be the cause of such high casualty numbers, and that bioweapons specifically would be responsible for these enormous casualties about 10% of the time (historically bioweapons were deployed in WWI, WWII, and developed but not deployed in the Cold War—constituting a bioweapons threat in every great power war since 1900). Assuming that 10% of biowarfare escalations resulting in more than 5 billion deaths eventually lead to extinction, we get an annual existential risk from biowarfare of 0.0000005 (or 5 × 10^–7).
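
Both power-law extrapolations above follow the same template: take the annual event rate, multiply by the scale-invariant tail probability of exceeding 5 billion deaths, and apply the conditional adjustment factors. The sketch below is a minimal illustration using only the exponents, rates, and 10% factors quoted in the card; the helper function and its name are illustrative assumptions, not from the source.

# Sketch of the power-law extrapolations quoted above.
# Exponents, event rates, and 10% adjustment factors come from the text;
# the helper function is an illustrative assumption.
def annual_extinction_risk(events_per_year, exponent, threshold, *adjustments):
    """Annual event rate times the power-law tail probability of exceeding the
    casualty threshold, times any conditional adjustment factors."""
    risk = events_per_year * threshold ** (-exponent)
    for factor in adjustments:
        risk *= factor
    return risk

FIVE_BILLION = 5e9

# Bioterrorism: ~1 attack/year, exponent 0.5, 10% of >5-billion-fatality attacks end in extinction.
bioterror = annual_extinction_risk(1, 0.5, FIVE_BILLION, 0.10)
# Biowarfare: ~1 war every 2 years, exponent 0.41, bioweapons the cause 10% of the time,
# and 10% of such escalations end in extinction.
biowar = annual_extinction_risk(0.5, 0.41, FIVE_BILLION, 0.10, 0.10)

print(f"Annual existential risk from bioterrorism: {bioterror:.1e}")  # ~1.4e-06
print(f"Annual existential risk from biowarfare:   {biowar:.1e}")     # ~5e-07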

Perhaps the most interesting implication of the fatalities following a power law with a small exponent is that the majority of the expected casualties come from rare, catastrophic events. The data also bear this out for warfare and terrorism. The vast majority of US terrorism deaths
occurred during 9/11, and the vast majority of terrorism injuries in Japan over the past decades came from a single Aum Shinrikyo attack. Warfare casualties are dominated by the great power wars. This suggests that a typical individual is far more likely to die from a rare, catastrophic
attack as opposed to a smaller scale and more common one. If our goal is to reduce the greatest expected number of fatalities, we may be better off devoting resources to preventing the worst possible attacks.

Why Uncertainty Is Not Cause for Reassurance

Each of our estimates rely to some extent on guesswork and remain highly uncertain. Technological breakthroughs in areas
such as diagnostics, vaccines, and therapeutics, as well as vastly improved surveillance, or even eventual space colonization,
could reduce the chance of disease-related extinction by many orders of magnitude. Other
breakthroughs such as highly distributed DNA synthesis or improved understanding of how to construct
and modify diseases could increase or decrease the risks. Destabilizing political forces, the breakdown of the
Biological Weapons Convention, or warfare between major world powers could vastly increase the amount of investment in
bioweapons and create the incentives to actively use knowledge and biotechnology in destructive ways .
Each of these factors suggests that our wide estimates could still be many orders of magnitude off from the true risk in this century. But
uncertainty is not cause for reassurance. In instances where the probability of a catastrophe is thought to be extremely low (eg, human
extinction from bioweapons), greater uncertainty around the estimates will typically imply greater risk of the catastrophe, as we have reduced
confidence that the risk is actually at a low level.48 §§§
Given that our conservative
models are based on historical data, they fail to account for the primary source of future
risk: technological development that could radically democratize the ability to build advanced
bioweapons. If the cost and required expertise of developing bioweapons falls far enough, the world
might enter a phase where offensive capabilities dominate defensive ones . Some scholars, such as Martin Rees,
think that humanity has about a 50% chance of going extinct due in large part to such technologies.49
However, incorporating these intuitions and technological conjectures would mean relying on qualitative arguments that would be far more
contentious than our conservative estimates. We therefore proceed to assess the cost-effectiveness on the basis of our conservative models,
until superior models of the risk emerge.

Pandemics cause extinction. The risk is higher than ever.


Noah B. Taylor 23, Guest Lecturer in Peace and Conflict Studies at the University of Innsbruck,
Existential Risks in Peace and Conflict Studies, “Peace, Pandemics, and Conflict”, Palgrave Macmillan,
May 2023, pg. 87-89, HBisevac
Pandemics should instead be understood as falling between a natural and an anthropogenic risk. Non-engineered pathogens that emerge from
nature do not represent the same type of direct natural risk as an event such as an asteroid strike. Understanding
a pandemic in
terms of a global catastrophic risk is more practical, meaning that it could threaten permanent
civilizational collapse (Bostrom and Ćirković 2011). The Johns Hopkins Center for Health Security has followed this idea in defining
pandemics as possible “Global Catastrophic Biological Risks” (GCBRs). Which they define as

biological events—deliberate, accidental, or emerging—that could lead to sudden, extraordinary, widespread
disaster beyond the collective capability of national and international governments and the private
sector to control [that] if unchecked […] would lead to great suffering, loss of life, and sustained
damage to national governments, international relationships, economies, societal stability, or global
security. (Schoch-Spana et al. 2017, 323)

Though humanity has lived with diseases since its earliest history, the contemporary possibility of such a risk
has increased. The global population recently climbed to over 8 billion, and the overall population density
continues to increase (Roser et al. 2013). With more individuals, there are more possible origins of new
pandemic diseases. As of 2018, 55% of the world’s population lived in urban areas (UN DESA 2018), and an estimated
65% by 2050 (UN DESA 2019). People living closer to one another will likely increase the transmission rates.

Many facets of modern life further increase a pandemic’s possible severity, scope, and scale (Jones et al.
2008; Morse 1995). Deforestation, industrial farming, and meat production practices combined with climate
change increase the likelihood of zoonotic transmission of pathogens from animals to humans. The
melting permafrost, increased ultraviolet immunosuppression, changing weather patterns, and arctic
thawing that comes with global warming may unleash pathogens frozen long ago (Hofmeister et al. 2021), or alternatively trigger pathogenic mutations in previously non-pathogenic organisms. The ease of global
transportation also dramatically increases the risk of future pandemics (Ord 2020). In addition to these variables,
war is a significant factor that results in the spread of pandemic disease, both unintentionally as troops move into foreign lands and
intentionally and potentially when used as a weapon (A. T. Price-Smith 2009).

Another often singled-out but under-recognized future pandemic risk is so-called superbugs, antimicrobial-resistant bacteria. Superbugs fall into
three categories defined by their susceptibility to antimicrobial agents. Multidrug-resistant pathogens are not susceptible to at least one agent
in three or more categories, extensively drug-resistant are not susceptible to at least one agent in all but two or fewer categories, and the most
concerning the pandrug-resistant which have no known susceptibility to any antimicrobial agents (Magiorakos et al. 2012).
The potential risk from superbugs is grave. The general director of the WHO described it as being a “fundamental threat
to human health, development and security” (Fox 2016). In the USA alone, in 2019 the rate of superbug infections was
2.8 million each year with more than 35,000 deaths (CDC 2019). This rate has likely risen by 15% between 2019 and 2020
(Mishra 2022). Globally the situation is worse. In 2019 an estimated 1.27 million deaths were directly tied to
these superbugs, with another 4.95 million associated with such infections. The global burden of these infections is likely
higher than that of HIV or malaria. As with many other pandemic-related topics, the global burden of these diseases is unequally spread, with
much higher concentrations in Sub-Saharan Africa and South Asia (Antimicrobial Resistance Collaborators 2022).

Biomedical researcher Dr. Brian K. Coombes described the severity of the situation, “ antibiotics
are the foundation on which all
modern medicine rests. Cancer chemotherapy, organ transplants, surgeries, and childbirth all rely on
antibiotics to prevent infections. If you can’t treat those, then we lose the medical advances we have
made in the last 50 years” (Miller 2015).

Even if bio risks, such as pandemics, are not technically defined as a genuine existential threat and are instead understood as a global
catastrophic risk, there is a consensus that they should be an issue of global priority (Connell 2017; Palmer et al. 2017).
These evaluations of the destructive potential of pandemics focus on it being the direct cause of an existential or global catastrophic threat.
The possibility of pandemics following similar pathways as discussed regarding Great Power Conflicts makes a
pandemic's indirect or compounding risk a topic of concern. A pandemic could hinder the global cooperation needed to address another existential risk, or it could occur alongside another existential threat, such as a Great Power Conflict or runaway global warming, overtaxing the systems that might otherwise make us resilient to such a risk.

Bioweapons kill all. Short timeframe.


Bryan Walsh 20, Future Correspondent for Axios, Editor of the Science and Technology Publication
OneZero, Former Senior and International Editor at Time Magazine, BA from Princeton University, End
Times: A Brief Guide to the End of the World, Orion Publishing Group, Limited Edition, p. 204-206
I’ve lived through disease outbreaks, and in the previous chapter I showed just how unprepared we are to face a widespread pandemic of flu or
another new pathogen like SARS. But a
deliberate outbreak caused by an engineered pathogen would be far
worse. We would face the same agonizing decisions that must be made during a natural pandemic: whether to ban travel from affected
regions, how to keep overburdened hospitals working as the rolls of the sick grew, how to accelerate the development and distribution of
vaccines and drugs. To that dire list add the terror that would spread once it became clear that the death and disease in our midst was
not the random work of nature, but a deliberate act of malice. We’re scared of disease outbreaks and we’re scared of terrorism—put them
together and you have a formula for chaos.
As deadly and as disruptive as a conventional bioterror incident would be, an attack that employed existing pathogens could only spread so far,
limited by the same laws of evolution that circumscribe natural disease outbreaks. But a
virus engineered in a lab to break those
laws could spread faster and kill quicker than anything that would emerge out of nature. It can be
designed to evade medical countermeasures, frustrating doctors’ attempts to diagnose cases and treat
patients. If health officials manage to stamp out the outbreak, it could be reintroduced into the public
again and again. It could, with the right mix of genetic traits, even wipe us off the planet, making
engineered viruses a genuine existential threat.

And such an attack may not even be that difficult to carry out. Thanks to advances in biotechnology that
have rapidly reduced the skill level and funding needed to perform gene editing and engineering, what
might have once required the work of an army of virologists employed by a nation-state could soon be done by a handful of talented
and trained individuals. Or maybe just one.

When Melinda Gates was asked at the South by Southwest conference in 2018 to identify what she saw as the biggest threat facing
the world over the next decade, she didn’t hesitate: “A bioterrorism event. Definitely.”2
She’s far from alone. In 2016, President Obama’s director of national intelligence James Clapper identified CRISPR as a “weapon of mass
destruction,” a category usually reserved for known nightmares like nuclear bombs and chemical weapons. A 2018 report from the National
Academies of Sciences concluded that biotechnology had rewritten what was possible in creating new weapons, while
also increasing the range of people capable of carrying out such attacks.3 That’s a
fatal combination, one that plausibly threatens
the future of humanity like nothing else.

“The existential threat that would be most available for someone, if they felt like doing something,
would be a bioweapon,” said Eric Klien, founder of the Lifeboat Foundation, a nonprofit dedicated to helping humanity survive
existential risks. “It would not be hard for a small group of people, maybe even just two or three people, to kill a hundred million people using a
bioweapon. There are probably a million people currently on the planet who would have the technical knowledge to
pull this off. It’s actually surprising that it hasn’t happened yet.”

Single payer solves pandemics.


Alison P. Galvani 22, Center for Infectious Disease Modeling and Analysis, Yale School of Public Health,
6/13/2022, “Universal healthcare as pandemic preparedness: The lives and costs that could have been
saved during the COVID-19 pandemic,” Economic Sciences,
https://www.pnas.org/doi/10.1073/pnas.2200536119#sec-4, kav

First, the prevalence of those underlying conditions which exacerbate COVID-19 severity would be
reduced via equitable access to care. For example, uninsured adults are significantly more likely than insured adults to
be unaware of their hypertension (33, 34), much less likely to be receiving treatment (35), and much less likely to have their
hypertension under control (36). Hypertension specifically increases the risk of COVID-19 mortality by 188% (37). Additionally, uninsured
women were found to have a higher prevalence of obesity (35), which is another risk factor for severe COVID-19.
Diabetes has similarly been associated with significantly increased COVID-19 severity and mortality (38).
Uninsured adults with diabetes were half as likely to be aware of their condition as their insured counterparts (39).

Early diagnosis and access to life-saving medical care.

Financial barriers reduce and delay care for COVID-19. Due to apprehension about their ability to pay, 14% of US adults reported that even if
they experienced the two most common symptoms of COVID-19, fever and dry cough, they would still avoid seeking care (40). These financial
concerns are justified as 18% of the US population had medical debt even prior to the COVID-19 pandemic, collectively totaling $140 billion
(41). Medical debt ballooned further during the pandemic as the confluence of lost insurance and lost income makes it more challenging to pay
medical bills (42, 43). Removing financial obstacles to care can accelerate diagnosis. More timely medical attention increases the probability of
recovery from COVID-19 infection (10, 44). For example, treatment with monoclonal antibodies early during the infection reduced the risk of
severe outcomes (hospitalization or death) by 85% (45). Reducing the time to diagnosis also ensures more prompt isolation, which in turn
reduces transmission to others.

The Coronavirus Aid, Relief, and Economic Security (CARES) Act subsidizes all testing and medical bills for the uninsured with COVID-19.
However, hurdles still exist that may prevent individuals from seeking care. Since this program does not directly offer insurance to patients,
uninsured individuals may be unaware that their COVID-19–related medical expenses can be reimbursed, which may prevent them from
seeking treatment (46, 47). Furthermore, the CARES Act provides financial assistance for only one disease. It is possible that people without
insurance and seeking medical care for COVID-19 symptoms might be billed for a different diagnosis or other services associated with testing
(48).

Geographic inequities in healthcare access have exacerbated case fatality rates in the United States (49). Specific
provisions in the
Medicare for All 2021 bill to iteratively monitor and address geographic and racial inequity (12) would have
particular importance during a pandemic. Rural hospitals are disproportionately reimbursed at the
relatively low rates paid by Medicaid (50) and have heavy burdens of uncompensated care (51), which has
challenged the survival of rural facilities (52). Although hospital fees nationwide would be reduced by
Medicare for All, applying Medicare rates across the board would actually increase support to those
rural hospitals which currently serve substantial populations of Medicaid and uninsured patients (53).
Furthermore, clinical outcomes such as mortality have been elevated among rural communities during the
COVID-19 pandemic (49). Rural hospitals were more prone to shortages of ventilators, personal
protective equipment, ICU capacity, and healthcare workers (54). These factors would be rectified by
investment to expand healthcare facilities and hospital capacity in rural areas (12).
Facilitation of COVID-19 preventative measures.

COVID-19 mortality rates have been higher and vaccination rates lower among Black and Hispanic
individuals relative to White individuals (55, 56). A key driver of these disparities is inequitable access to
primary care (57, 58). For example, recommendation of COVID-19 vaccination to patients by their trusted
primary care providers is effective in overcoming vaccine hesitancy (59, 60). Consequently, it is unsurprising
that individuals without access to a primary care provider have lower rates of vaccine uptake despite it
being free to the public. Universal healthcare would ameliorate such inequities, particularly given the
provisions for investment to address racial and other disparities.
Alleviating pressure on hospitals during a pandemic.

Beyond COVID-19 outcomes, high hospital caseloads impact non-COVID care and also lead to
premature death from non-COVID causes during a pandemic. Removal of cost barriers and alleviation
of comorbidities would have reduced not only the risk of COVID-19 death but also hospitalizations, with
positive externalities specific to the pandemic context. Particularly during outbreak surges, high demand for COVID-19
hospital services often delayed procedures related to other health conditions. Reduced hospitalization rates facilitated by
Medicare for All would have blunted these COVID-19 peaks and thereby freed capacity for non-COVID
care.
Fully addressing health disparities requires a multifactorial strategy.

Not all disparities in COVID-19 mortality could have been alleviated by adoption of a single-payer universal healthcare system. Even in countries
with single-payer healthcare systems, there can be a steep income gradient associated with COVID-19 outcomes (61). While single-payer
healthcare is paramount to addressing health disparities in the United States, the challenge is multifactorial. Pervasive inequities regarding
income, education, and housing impact nutrition, mental health, exposure to pollution, and feasibility of accessing healthcare services. Paid sick
leave, nutrition programs and affordable housing are among initiatives that are necessary to alleviate disease burdens overall and mitigate
systemic gaps in health.

Economic Savings That Could Have Been Realized by Single-Payer Universal Healthcare during the COVID-19 Pandemic.

We calculated the economic savings of a single-payer universal healthcare system in 2020, relative to status quo, by examining two sets of
costs: medical costs unrelated to COVID-19 and those attributable to the treatment of COVID-19.

National healthcare costs unrelated to COVID-19.

Per capita healthcare spending in the United States increased from US$10,682 in 2017 to $11,582 in 2019 (62). We previously calculated that a
single-payer universal healthcare system would have saved $458 billion in 2017 (3). There are several factors driving these savings. Streamlined
administration, negotiated pharmaceutical prices, and application of the Medicare fee schedule throughout the healthcare system are major
reforms that would achieve substantial reductions in national medical costs (SI Appendix). Combined, the savings from these mechanisms more
than compensate for the expanded utilization when coverage is extended to the entire population (SI Appendix).

Taking into account shifts in demography, healthcare utilization, and coverage composition, we updated our previous analysis and found that
single-payer universal healthcare would have saved $438 billion in 2019. A key factor driving the slight reduction in savings compared to 2017 is
the increase in the number of Americans who are underinsured from 41 million (63) to 45 million (13). In our analysis, we take into account that
these individuals would likely have expanded utilization once provided with full healthcare coverage (64). Given the small difference in savings
between 2017 and 2019, we anticipate that 2020 savings would have been similar in the absence of the COVID-19 pandemic.

National healthcare costs related to COVID-19.

Hospitalization and healthcare fees attributable to cases of COVID-19 are much higher under the current system than would have been incurred
under Medicare for All. In general, Medicare charges are 22% lower than those charged by private insurance for the same services (65).
However, the discrepancy is even greater for COVID-19 in particular. Private insurers paid more than double the Medicare rate for a
hospitalized COVID-19 case (66, 67).
To calculate the national expense associated with COVID-19 hospitalizations, we used the estimated cost of COVID-19 hospitalization with or
without a ventilator stratified by whether the patient was insured and, if so, their type of insurance (67). While the average Medicare and
Medicaid costs for a COVID-19 hospitalization that requires mechanical ventilation were $57,822 and $47,396, respectively, the average charge
to private insurance was $114,842. Charges for hospitalizations that did not require ventilation were lower but showed a similar pattern across
insurance types (SI Appendix, Table S5). We combined the age distribution of 11,832,077 [95% uncertainty interval (UI): (10,586,595,
13,077,559)] estimated hospitalizations through 12 March 2022 (68, 69) (SI Appendix), age-specific insurance coverage by type, and age-specific
probability of ventilation given hospitalization to estimate the proportion of hospitalizations in each age group that were reimbursed at each
different cost level (SI Appendix, Table S5). Since ratification of the CARES Act, the Federal Government has been reimbursing hospitals for the
care of uninsured COVID-19 patients at Medicare rates. Therefore, we applied Medicare rates for uninsured individuals who were hospitalized.
We calculated that the expense of COVID-19 hospitalization has totaled $365.8 [95% UI: (327.3, 404.3)] billion, of which $141.2 [95% UI: (126.3,
156.1)] billion occurred in 2020. If the Medicare rate had been applied to all hospitalizations, $105.6 [95% UI: (94.5, 116.7)] billion would have
been saved during the pandemic thus far and $39.4 [95% UI: (35.2, 43.5)] billion in 2020.
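
To illustrate the mechanics of this costing exercise, the sketch below shows the structure of the calculation rather than reproducing it: the per-hospitalization charges for ventilated cases and the total hospitalization count come from the text, but the ventilation share and coverage mix used here are simplified placeholders, since the paper's actual values are age-stratified and reported in its SI Appendix (Table S5).

# Structural sketch of the COVID-19 hospitalization costing described above.
# Ventilated-case charges and the case count come from the text; the ventilation
# share and coverage mix below are placeholder assumptions, not the paper's
# age-stratified inputs.
VENT_COST = {"medicare": 57_822, "medicaid": 47_396, "private": 114_842}

hospitalizations = 11_832_077       # estimated COVID-19 hospitalizations through 12 March 2022
ventilation_share = 0.15            # placeholder; the paper uses age-specific probabilities
coverage_mix = {"medicare": 0.45, "medicaid": 0.20, "private": 0.35}  # placeholder shares;
# uninsured patients are reimbursed at Medicare rates under the CARES Act.

ventilated_cases = hospitalizations * ventilation_share
status_quo = sum(ventilated_cases * share * VENT_COST[payer]
                 for payer, share in coverage_mix.items())
medicare_rates = ventilated_cases * VENT_COST["medicare"]

print(f"Ventilated-case spending, status quo:     ${status_quo / 1e9:.1f} billion")
print(f"Ventilated-case spending, Medicare rates: ${medicare_rates / 1e9:.1f} billion")
print(f"Illustrative savings:                     ${(status_quo - medicare_rates) / 1e9:.1f} billion")
# The paper applies the same logic to ventilated and non-ventilated cases across
# age groups, arriving at $105.6 billion saved overall and $39.4 billion in 2020.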

Consolidating the expected general savings from a transition to Medicare for All with savings specific to
COVID-19, single-payer universal healthcare could have cost $459 billion less in 2020 than our current
system. These savings would alleviate the burden on employers and individuals to cover insurance
premiums, copays, and deductibles. Since Medicare for All would achieve savings overall, the tax
revenue needed to fund Medicare for All would be significantly lower than the healthcare premiums
that are currently paid by employers and households.
The Consolidated Omnibus Budget Reconciliation Act (COBRA) of 1985 provided a mechanism by which individuals who become unemployed
may temporarily retain healthcare insurance for themselves and their families. However, the unemployed individuals must shoulder the entire
premium payments, including the proportion that was previously paid by their former employer, which on average is $21,342 annually for
family coverage (70). Due to a 2% COBRA administrative fee, the premiums paid by the unemployed worker are actually higher than those paid
when they were employed. The American Rescue Plan of 2021 included subsidies to cover COBRA premiums for individuals who lost
employment during the pandemic. While essential to relieve household financial strain due to prolonged reliance on COBRA, an estimated $57
billion in subsidies will ultimately flow to insurers (71). In 2020, many of these companies made multibillion-dollar second quarter profits,
double the amount for the previous nonpandemic year (72). Under a universal single-payer system, the recently unemployed keep their
coverage, and the taxpayer is not subsidizing these profits.

COVID-19 can also have long-term health and economic consequences. Among survivors of COVID-19, there can be substantial and often long-
term morbidity (73). The debilitating symptoms of long COVID can include pulmonary and cardiovascular disorders, mental health impairments,
neurologic symptoms, and functional mobility impairments (73). The cost of treatment for these symptoms is substantial (74). Uninsured
individuals, including many whose unemployment was precipitated by the pandemic, would have to bear the full cost of these treatments.
Further, long COVID may affect the ability to work, potentially reducing income or leading to insurance loss (75). As with acute COVID,
affordability may be a deterrent to seeking needed care for long COVID, particularly for low-income families (75). By curtailing the spread of
COVID-19, Medicare for All would also have reduced the incidence of long COVID. Additionally, medical expenses associated with long COVID
would be lower under a more efficient healthcare system and covered for the patient regardless of employment status.

Medicare for All as Pandemic Preparedness.

The COVID-19 outbreak has underscored the societal vulnerabilities that arise from the fragmented
healthcare system in the United States. Universal healthcare coverage decoupled from employment
and disconnected from profit motivations would have stood the country in better stead against a
pandemic. Emergence of virulent pathogens is becoming more frequent, driven by climate change and
other global forces (76). Universal single-payer healthcare is fundamental to pandemic preparedness.
We determined that such a system could have saved 211,897 lives in 2020 alone. Strikingly, it would
have done so at lower cost than the current healthcare system, saving the US $459 billion in 2020 at a
time of economic tumult. To facilitate recovery from the ongoing crisis and bolster pandemic
preparedness, as well as safeguard well-being and prosperity more broadly, now is the time to transition to a healthcare
system that can better serve the American people.

Only progressive taxation is sustainable.


Lycourgos Liaropoulos & Ilias Goranitis 15. Lycourgos Liaropoulos, PhD in Economics from Michigan
State University. MA in Economics from Michigan State University. Ilias Goranitis, PhD, Associate
Professor of Health Economics and Deputy Director of the Health Economics Unit at University of
Melbourne, Health Economics Lead of Australian Genomics. “Health care financing and the sustainability
of health systems,” 15 September 2015, https://doi.org/10.1186/s12939-015-0208-5, DOA: 9/13/2023,
SAoki
Financing sustainable health care: who must pay and how?

The answer to the question of who must pay for health care and how lies in the moral fabric and the value system of a society. It is a deeply ideological and political question with undertones
of social involvement, personal responsibility, and freedom of choice. Big changes in health care financing happen rarely, usually after major events6, and are more likely to take place in
countries with social cohesion high on their value scale7. This is possibly why discussions on health system sustainability continue to “finesse” the question of financing, and perhaps to avoid
two uncomfortable truths. One, that reliance on out-of-pocket expenditure is not acceptable on equity and financial protection grounds. Two, that only some kind of income transfer, such as
taxation, can cover the increasing cost of health care.

The moral determinant of “who pays” and “how” must now gain importance, as ageing societies,
technological advances, globalization, and economic recessions put a strain on the sustainability of
financing sources. The question therefore should now focus, not only on whether society as a whole will bear the cost but also on how to
obtain and manage the needed savings, and on the efficiency and competitiveness of the economy
which must produce them.
For the increasing cost of care many “blame” the demographic factor, although the major part of life-time health cost occurs in the last two years of life [19]. Life expectancy indeed rose
significantly in the last fifty years together with total lifetime cost [20]. The average retirement age, however, remained more or less the same at around 65. There are, therefore, twenty years
in which a citizen incurs health costs without producing income as “insurance”. People of working age today must finance the health needs of their children, themselves and, mainly, the 3rd
and 4th generation. Labor contributions legislated thirty years ago are clearly not enough for today’s medical costs8, while contributions sufficient to cover health costs thirty years from now

only savings in the form of taxes on all incomes produced by society, including wealth and
would make labor extremely expensive. Therefore,

capital, appear to be a sustainable source of funding in the long-term.

In addition, cyclical fluctuations are now common events rather than rare occurrences. Health financing may determine how pressures on health systems are weathered without loss of equity, quality and financial protection.

Social Health Insurance has been found to have negative labor market effects [21] and to hurt competitiveness [7] due to higher labor costs. This is crucial in monetary unions where devaluation during economic crises is not an option and competitiveness gains are the only way for the economy to adjust to pre-crisis levels. In addition, as unemployment increases, incomes decline and pressures on health budget and public infrastructure are pushed to extremes, evidence has indicated that public health systems financed through taxation can be more responsive to economic pressures and more effective in health expenditure consolidation [22]. Although conclusive evidence is lacking, the experiences of Canada and Greece may be indicative.

Evidence from Canada, where health is financed mainly through taxation, suggests that patient satisfaction, hospital performance and health outcomes were maintained despite the financial strain [23]. Concerns that reliance on taxation may be associated with higher private payments, especially during economic downturns [22], or that corruption may inhibit administrative capacity to collect taxes [24], may be put to rest by the fact that during economic turmoil individuals become more price-sensitive and administrative capacity tends to improve.
In Greece, Social Insurance historically covered approximately 40 % of health care cost. In the face of severe unemployment (27 %) caused by 25 % GDP contraction, reliance on employer-
employee contributions proved an inadequate funding base for health care. Between 2009 and 2012, Social Insurance expenditure declined by 29.3 %, with the fairness of the system and
quality of care severely affected [25, 26]. Greece is now a country where the need of re-orientation of health care financing is pressing [25, 27].

In conclusion, employment contributions as a source of health financing are incompatible with universal coverage, quality of services, and rising life expectancy. A move towards general taxation to meet health care needs can boost economic growth through increased competitiveness, and achieve major non-health objectives, like equity, financial protection, quality and responsiveness even during economic downturns. Health system sustainability, as a system objective, must turn to financing through progressive taxation of all types of income. “Uncomfortable” as this may appear, it is a reality not to be overlooked. Political concerns associated with economic imperatives as well as moral considerations may force changes in health services financing in both the developed and developing world. National health insurance financed through taxation should gain momentum in the quest for more sustainable and responsive health systems.
Plan---1AC
Therefore:
The United States federal government should increase fiscal redistribution by
expanding Medicare to eliminate the age restriction.
Adv---Modelling---1AC
Advantage two is MODELLING.
There’s a leadership void in public health.
Frederick Burkle et al. 22, Ph.D., Professor, Senior Fellow & Scientist, Harvard Humanitarian Initiative,
Harvard University & T. H. Chan School of Public Health, Krzysztof Goniewicz, Ph.D., Professor,
Department of Strategic Studies, Polish Air Force University, Mariusz Goniewicz, M.D., Department of
Emergency Medicine, Medical University of Lublin, Amir Khorram-Manesh, Associate Professor,
Lecturer, Department of Surgery, University of Gothenburg, "Global public health leadership: The vital
element in managing global health crises", Journal of Global Health, 2022, Vol. 12, pg. 2-4, HBisevac

The key to success in the face of a public health crisis lies in prevention, preparedness, communication,
and control of infectious and environmental diseases. Often called the “invisible health profession”, public health specialty
is largely responsible for the majority of “improvements in global health expectancy” [2]. Developed countries expect public
health to play a significant role in managing outbreaks of infectious diseases and epidemics, and overall, it
has performed well. Adequate epidemiological analysis and effective preventive measures have not overwhelmed the global healthcare system
allowing timely treatment of patients. As
the prevalence of some diseases increases, health promotion strategies
bring more resources and financial support to both local and global health resources. This was clearly evident in
epidemic outbreaks of H1N1, SARS, and Ebola, to name but a few. Societies rely on public health professionals who know
how to utilize resources efficiently, creating, organizing, and implementing a variety of public health strategies
and programs for the benefit of the global population [3].

Public health aims to serve the whole population, recognizing that populations are not homogenous entities. Factors such as sex, race,
disability, migration status and socioeconomic position can intersect to result in certain groups in society being particularly at-risk during health
crises [2]. Indeed, population-based management, not individual one-on-one care, is the mainstay of public health management and success.
COVID-19 pandemic has resulted in disparities in health outcomes, with the most vulnerable being disproportionately impacted throughout the
pandemic [4]. Therefore, public health preparedness plans must consider the impact of a health crisis on the whole population and include
underserved, vulnerable or stigmatized groups (such as undocumented migrants) [4]. The rationale for this inclusion is for both justice and
enlightened self-interest; to reduce the exacerbation of existing health disparities and to optimize the collective emergency response.

The field of public health offers a multiagency and multi-professional collaboration that teaches openness and readiness for changes and
challenges in the course of a disease. A public health specialist can identify threats to the health of the population, that is, threats beyond the
disease of the individual patient alone [5]. Epidemiology enables the analysis of the prevalence of diseases in society and the study of the
factors that influence their emergence. The most important features of a public health specialist are the willingness to search for the causes
and origin of diseases and selecting appropriate tools for various tasks, communication, and cooperation in population-based teams [3]. There
is a need for efficient coordination of pro-population-based health activities to prevent the consequences of disease or at least mitigate their
unwanted outcomes. That is why unfairly, public health has often been labelled as a passive specialty focusing only on preventable diseases [6].

The ongoing pandemic of COVID-19 has emphasized the significance of public health as a unique specialty and its leading role in emerging
public health emergencies [5]. It has also shown the importance of treating the origin of such a tumour on society and not only its metastasis.
2020 has proven to be a challenging year for public health leaders who had to go through an unimaginable crisis for which many health care
systems were not prepared [7]. Collaborative efforts are crucial in managing public health crises. Epidemic control demands a response that
outstrips the capacity and authority of any single organization; public health leaders, often in positions without direct authority, must therefore
be skilled in influencing a coherent multi-stakeholder response [8].

In the United States, both economic and political leaders at state and national levels subsumed the leadership of
public health decisions from the start, with many public health experts summarily being ignored, dismissed, threatened, or
completely disregarded [9]. One in eight Americans - roughly 40 million people - lives in a community that has lost its local public health
department leader during the coronavirus pandemic, all because of lack of political support or controlling political interference that prevented
these leaders from making unpopular but necessary public health decisions [1]. Collectively, public
health experts say the loss of
expertise and experience has created a leadership vacuum in the profession [3].
Public trust in public health agencies is an essential prerequisite for a successful response to a health
crisis. Trust is required for motivating the public to undertake both voluntary action (such as vaccination uptake [10])
or to maintain compliance with legally binding regulations (such as stay-at-home orders), recognizing that government policies in
free societies require the consent of the governed [11]. Public health agencies must not succumb to naive utilitarianism and instead consider
the long-term consequences of individual policies on trust in the overall public health response [10,11]. Trustworthy public health messages are
required to be scientifically based and non-political, a challenging task given inherent scientific uncertainty during health crises (which can
result in policy inconsistencies) and the requirement for public health officials to work closely with politicians to deliver an effective response
[11].

If there is one universal lesson that this pandemic teaches us, it is that a new generation of public health leadership and authority, better trained,
respected, and managed, will have to be established in most communities worldwide and under a strong and independent WHO.

Resolute public health leadership at global, national and local levels has decisively influenced crisis response. At the global level, Gro Harlem
Brundtland’s leadership of the WHO during SARS was characterized by swiftly galvanizing the international community to action and honestly
engaging with (and at times challenging) national governments to contain the outbreak [12]. At the national level, Nigeria’s response to Ebola in
2014 saw strong political and public health leadership build on a pre-existing commitment to strengthening public health capacity to rapidly
contain the outbreak, which resulted in only twenty cases and eight deaths [13]. At the local level, insight from three UK case studies identified
factors grouped into “getting started”, “maintaining momentum” and “indicators of success” themes that contribute to success in systems
leadership [14]. One of these cases was the 2018 Salisbury Novichok poisoning response. The challenges of leading this incident’s response
have recently been given a wider audience through the BBC’s television drama The Salisbury Poisonings, resulting in media interest in the
complexities of public health decision-making and communication [15].

On the other hand, it must be remembered that there have been cases where politicians have taken the role of scientific experts and given
recommendations on which vaccines are “good” and which ones should not be taken [16]. National public health responses should outline a
clear delineation between the roles and responsibilities of political leaders, who have the legitimacy and authority to lead the policy response
and health officials who give medical and technical advice to officials and the public, such as the UK’s Joint Committee on Vaccination and
Immunisation.

Public health emergencies will become more frequent in the future with accelerating climate changes, rapid urbanization, scarcities in food,
water and energy resources, and deforestation to name but a few [3]. Public
health leadership must focus on improving
communication and leadership skills [6]. It cannot afford to be a passive behind the scenes profession. This must be
reflected in improved quality of data, data analysis, forecasting, infrastructure improvements, a strong focus
on mitigating economic and social (including racial) community-wide inequalities and improved community resilience to
help public health leaders successfully move through all phases of future pandemic and global public health challenges [2].

RECOMMENDATIONS

Several measures should be implemented immediately, such as alternative local leadership and alternative
facilities for care to facilitate a bid for a more flexible surge capacity to increase the four essential elements of surge
capacity, ie, staff, stuff, structure, and systems [3]. A major role for Public Health and public education is a critical ingredient
of a flexible response system in an age of increasing weather/climate disasters and epidemic and pandemic risks. One of the main
obstacles to implementing public health policies and strategies is the low prevalence of health literacy globally. Several reports from the US,
Southeast Asian countries and Europe have indicated a low or limited global health literacy, resulting in worse health care and poorer health
outcomes [17]. This health literacy is consistently associated with several factors such as education, ethnicity, and age. The lack of necessary
skills required to understand and manage their health does not allow the active participation of vulnerable groups in disease prevention
programs [18]. Having a sustainable program guarantees public health’s continuous use to achieve its aims and population outcomes. To
achieve such sustainability, public health leaders need to have the authority to evaluate whether a program is continued or halted for any
reason. They also need to create systems and control measures to evaluate the benefits or outcomes of a program for its consumers. The latter
requires good collaboration with the social and community-related organizations before, during and after program implementation. A course of
sustained collaboration may also identify other sites in need of immediate improvement and where the program, sensitive to environmental,
cultural, and socioeconomic demands, must be replicated to guarantee the long-term outcomes of the entire global program [3].

Lastly, population-based management of health crises must become a recognized health specialty which has as its core the training of future
Health Crisis Managers and scientists trained across the entire “disaster cycle”, not just that of the “response phase” alone. Such recognized
experts, all with a core of public health training and expertise, would become the population-based managers serving every WHO-recognized
community level program [19].

CONCLUSIONS
Global crisis management demands responsible public health leadership, receptive to well-established experiences
and new suggestions and exercising vocal anti-discrimination policies effectively [20]. Public health schools worldwide have to equip their
students with the tools and resources they need to become effective leaders in uncertain times. They must look to not only train the best
epidemiologists and biostatisticians, but also equip students with the skills and expertise to discharge public health leadership roles within the
political environment. These important steps must be taken to guarantee a better future for global health and offer opportunities and new
insights to health care leadership in a flexible response system [8]. With improvements in the inherent knowledge, innovation, education, and
support from all emergency and non-emergency organizations, public
health must be the leading specialty to build upon and
create a new culture of safety and resilience at all levels of society measured by innovative priorities for
action in disaster risk reduction and response [3]. If the world was able to develop global communications and air travel, it
should also be eager and able to develop what we now appreciate as the future of “global public health” [9].

Only US single payer fills in by fostering collaboration across diverging international systems.
Louis Rowitz 13, Ph.D., Professor Emeritus, School of Public Health, University of Illinois at Chicago,
Public Health Leadership: Putting Principles Into Practice, "The Global Public Health Leader", Jones &
Bartlett, January 2013, Chp. 28, pg. 577-578
The requirements associated with successful leadership seem to present leaders with a moving target. Our world is ever changing at the same
time that it appears to be flattening. As
the world changes and flattens, public health leaders need to be flexible
enough to adjust to these changes. Not only is public health important for building the infrastructure of public health in the
United States, but leadership is important for building the infrastructure of public health around the
world.1 With the health issues of the world becoming more complex, the number of agencies addressing
these issues appears to be increasing as well. Leadership is critical for this growing response to health
and disease around the world. Public health leaders are needed for policy development, development
and formulation of innovative public health and primary care programs, monitoring of global health
issues, and evaluation. Leadership development programs similar to those in the United States are needed throughout the world. If
these programs can be collaborative with the leadership development programs in the United States, coordinated global public health
initiatives become more possible. As we look forward, we discover the necessity of understanding people throughout the world.

With an understanding of the importance of broadening our view of leadership and the recognition that being a leader on a global level may be
different, it is worth briefly looking at the GLOBE Project (Global Leadership and Organizational Behavior Effectiveness Research Project), begun
in 1991 by Robert J. House of the Wharton School of Business. The GLOBE Project became a 62-society, 11-year study involving 170 researchers
throughout the world. The respondents included more than 17,000 middle managers from about 950 organizations in the food processing,
financial services, and telecommunications industries. The 62 societies were classified into 10 cultural clusters: Anglo Cultures, Latin Europe,
Nordic Europe, Germanic Europe, Eastern Europe, Latin America, Sub-Saharan Africa, Arab Cultures, Southern Asia, and Confucian Asia. The
GLOBE work was guided by the Implicit Leadership Theory, which states that from childhood, people gradually develop beliefs about the
characteristics and behaviors of leaders. House and his colleagues reported that across all societies included in the GLOBE study, people
expect their leaders to be trustworthy, just, honest, decisive, encouraging, positive, motivational, able to build confidence in others, and
dynamic, and to have foresight. And yet leader effectiveness is clearly contextual and may vary in its presentation in different cultures and
societies. With the flattening of the world, however, our overall expectation of our leaders may coalesce across cultures.

With a shift from a domestic look at leadership to a global look, public health issues such as global
warming, pandemic influenza, child survival initiatives, human immunodeficiency virus/acquired immune deficiency
syndrome, and other emerging and reemerging global infections become more and more important.1
Public health leaders need to remove the barriers between countries to work in partnership with our
public health colleagues all over the world. Hesselbein has stated that the time for partnership is now.5 A world vision related
to healthy families and children, excellent schools, decent and available housing and work opportunities, and health equity is necessary for
international and global public health to become a reality.

It will not be possible to attain global health equity without partnerships. Collaboration becomes critical
if we are to make these partnerships productive. In order to make collaboration work, it is important to move from a
business-as-usual approach to a new stage of transformation that makes collaboration lead to positive change. In real collaboration, the
partners become an integrated team that dialogues and debates health challenges on a global level in
order to come up with potential solutions to these challenges.6 Rosenberg and his colleagues discuss the partnership
pathway as going from its beginning or genesis through the first mile, the journey, and the last mile. In the genesis comes the realization that
positive changes can occur. After the right partners come together during the first mile, a shared goal or goals are set, an appropriate structure
is created, system-based strategies are created, and organizational roles are defined. During the journey, management issues predominate in
which there needs to be a disciplined and flexible approach that guides the partnership. The individuals who compose the partnership will also
need to take leadership roles at an individual level to bring about change. In the last mile, there will need to be adaptation to sustain the
momentum created, transfer of control in a supportive way, understanding and communicating of the lessons learned, and finally a method for
dissolving the partnership when the goal or goals are reached.

A cautionary note is necessary here. Public health is not the same everywhere. It has different meanings
from country to country. For example, leadership is practiced differently in countries where health
service is provided at the national level. In these countries, there is a clinical focus, and almost all the
leaders are physicians. Public health does not extend to social and behavioral scientists and other professions in the same way. In Asia,
Africa, and some European countries, public health leaders put public health and clinical health together with a primary care focus. In the
United States, we tend to separate the clinical focus from the public health focus. There are strengths
and weaknesses in all of these systems. In partnerships, the cultural differences must be dealt with first
before the partners can ever deal with collaboration and discover ways to work together.

Synergizing global health solves every impact.


Lancet 21, world’s highest-impact academic peer-reviewed medical journal. “Health as a foundation
for society,” 2 January 2021, https://doi.org/10.1016/S0140-6736(20)32751-3, DOA: 9/13/2023, SAoki

In 2020, a virus that thrived on chronic disease and inequality became the great revealer. COVID-19
revealed the fragility of civilisations
built on social injustices, short-term policies, and a dangerous disregard for the environment. The need
to become more resilient to crises of all kinds is almost universally agreed on. But to construct that resilience, a
philosophical change in how we care for each other and our environment must be made. Health
improvement is the guiding principle to lead a recovery away from regressive policies that harm the
most vulnerable (and will result in future catastrophes) and point us towards change that supports
equity and sustainability, and reinvigorates the Sustainable Development Goal agenda.

Societies start and end with the collective security of the planet. Climate stabilisation must be the cornerstone of the 2020s and
beyond, closely entwined with equity. Equity for now, but also for future generations. 2020 was supposed to be the year that The Lancet focused on child and
adolescent health, but many of our initiatives were delayed. 2021 demands renewed activity. The Lancet Countdown on climate and health and our 2020 WHO–
UNICEF–Lancet Commission, A future for the world's children?, will continue to investigate the impacts of the climate crisis on health and the type of environment
that young people can expect to inherit, ahead of COP26.

Countries might justifiably start to look inward to repair the damage after COVID-19. But equitable
access, whether to a vaccine, food, or
finance, will require global collaboration. The health community should nurture and encourage
multilateral partnerships, in which countries share responsibility for each other, as the best way to build strong and just institutions. A
complex, synergistic relationship exists between the environment, conflict, migration, and equity, in
which the desire for good health is a common denominator. As reported in a World Report, a record number of people will
require humanitarian assistance in 2021. The Lancet will publish a Series on women and children living in conflict-affected areas (representing over half of all women
and children), and a standing Commission on migration and health will investigate these dynamic relationships.

COVID-19 has proven that the economic and political success of individual countries is founded on the
health of its population. The disproportionate impact of COVID-19 on the USA has made clear where the
Trump administration's lack of public health response and espousal of health-harming policies have accelerated
negative outcomes. A forthcoming Lancet Commission on public policy and health in the Trump era will serve as a call to action for the
new Biden administration to refresh the way health is valued in the USA. Without health, there is no productivity, no GDP, no trade, and no education.


The case for universal health coverage has never been clearer; yet it has not guaranteed success against COVID-19. The UK NHS is a renowned universal health-care system, but years of underfunding and short-term political agendas have led to an unnecessarily poor response to COVID-19. The Lancet Commission on the NHS will be published in early 2021 and explores how a system run on efficiency and restricted resources has resulted in a lag in life expectancy and infant mortality compared with other high-income countries. Health-care systems will need to prioritise resilience and sustainability to overcome the collective challenges of shifting demographics, climate change, and increased demand.

This week, a Health Policy piece in The Lancet scrutinises the disparities
between a global health security agenda and
fragmented universal health coverage systems. It indicates that a new understanding of preparedness
must develop—one that appreciates that the baseline level of health in a population dictates how well
a country will fare in a crisis. New Zealand and Germany are examples of countries whose sustained investment in the health of their people has
paid off during COVID-19.

AND conflict in every hotspot.


Roberto Nang & Keith Martin 17, M.D., MPH, Joint Medical Chair for Global Health, Adjunct
Professor, National Defense University; M.D., P.C., Executive Director, Consortium of Universities for
Global Health, “Global Health Diplomacy: A New Strategic Defense Pillar”, Military Medicine, January
2017, Vol. 182, Iss. 1-2, pg. 1456-1458, HBisevac

The world appears unhinged. Instability from the Middle East, Caucasus, Africa, and Central America to
Asia abounds. The Study of Terrorism and Responses to Terrorism database identified fewer than 300 major terrorist incidents between
1998 and 2004 in the Middle East and North Africa. In 2013, they listed 4,650 such incidents.1 Quieter cracks tear at the fabric of South America
and parts of Asia. Although
geographically distinct, many of these areas of instability share underlying causes
that give rise to threats to the United States and the global community.
Human-generated causes include corruption, poor governance, absence of the rule of law, violence, gross human rights abuses, climate change,
environmental degradation, a weak civil society, and a lack
of professional capabilities across skill sets within the
government departments needed to effectively manage the operations of a well-run state.2 Natural causes include disasters,
disease, demographic changes, and limited access to the resources essential for life.

When these human or natural causes create conditions that result in poor provision of, or unequal access to essential
services, such as water, food, shelter, health services, education, and economic opportunity, people lose confidence in
government and hope for their children and their future. They become restless, demonstrate, can become violent and
overthrow their governments (such as the self-immolation of Mohamed Bouazizi, the Tunisian cart vendor, which
sparked 35 more self-immolations by extralegal businessmen and started the Arab Spring), or can result in mass migrations.3
Desperate human security conditions create desperate people, undermining stability and creating even more demands from host nation governments and governments in neighboring states.

Although force and counter terrorism programs are sometimes needed to address security threats, enormous opportunities are available to use
nonkinetic capabilities within the Department of Defense (DoD), Department of State, U.S. Agency for International Development, other U.S.
Government agencies, and civilian organizations to address the underlying causes of instability. Global
health diplomacy is an
underutilized strategic asset to do this. At a far lower cost, it will save lives, decrease economic losses, reduce
the need for kinetic military operations, increase security cooperation, improve diplomatic relations,
encourage trade, and create the foundations for long-term stability.
HEALTH IS A NATIONAL SECURITY IMPERATIVE—DISTANT HEALTH THREATS ARE GLOBAL THREATS

Health is a national security imperative. The second- and third-order effects of a strategic health or global health issue that severely impacts
and overwhelms the stability of a far-distant nation can have broad and multiplying effects that transcend boundaries and can become regional
and global security threats. When human immunodeficiency virus/acquired immunodeficiency syndrome first started to be seen in the United
States, there were U.S. leaders that were not too concerned about its impact on the general public, alluding to the fact that it was a disease
that mostly affected the four H's: homosexuals, heroin addicts, hemophiliacs, and Haitians.4 From its first known cases in 1981 up to 2013,
human immunodeficiency virus has infected almost 78 million people and killed about 39 million.5

The Chernobyl power plant accident that occurred on April 26, 1986, was a catastrophic nuclear accident. Several studies have been done to
estimate the increase in health effects and cancer-related morbidity and mortality in Europe.6 Communicable diseases can be easily carried
from a distant area of the world to a teeming metropolis within 24 hours because of the ease and affordability of plane travel. The
interconnectedness of countries as a result of trade has its drawbacks—biological or chemical contamination of food or products commonly
occur across oceans and continents.7

Noncommunicable diseases are also affecting not just high-income countries but also low-to-middle income countries. Ubiquitous exports of
fast-food meals, high-fructose drinks, and salty, fried foods have contributed to a tremendous increase in obesity and hypertension.8 Obese
and sedentary populations negatively impact the workforce of a nation and its productivity. The offices of military personnel and readiness cite
obesity as the number one disqualifying reason for new recruits.9 Twenty seven percent of the U.S. young adults are not fit to serve in the
military.10

Addiction to illegal drugs is an important global health threat. The problems created by the manufacture of opium in Afghanistan,
methamphetamine in Mexico, and cocaine in Peru and Colombia create tremendous and devastating health effects, loss of productivity, social
disruptions, breed corruption in a nation's military and police forces, and create turbulent violence all along its wake, both in the countries
manufacturing the drugs and the countries importing them.

Weather forecasters often discuss the multiplying effects that the fluttering of a butterfly's wings in one country may have on the regional
weather of another distant country. Global health professionals and more and more of our military and political leaders are now concerned
that the disease that we see in a child in Africa or a pig in Asia may have tremendous impacts on the public health, economic productivity,
military readiness, and strategic security interests of their nation. In addition, a weak health and political system anywhere can be a threat
everywhere.

LINKAGES: GLOBAL HEALTH, SECURITY, AND STRATEGIC CHALLENGES

Global health encompasses the basic needs required for human security: respect for people's universal rights,
personal protection, the rule of law, access to food, water, health care, education, basic infrastructure, and shelter.11 Their absence
leaves populations vulnerable to the depredations of insurgent groups and corrupt, venal cabals that can
hijack a region or state for the benefit of themselves and a select group of people. This creates an environment of the privileged and
abused, the included and excluded, and an environment ripe for insecurity and conflict.12

For a nation to provide the environment where people's basic needs can be met requires capabilities within their governing
infrastructure and communities. This includes management, finance, education, social sciences, law, medicine, public health, engineering,
veterinary medicine, agronomy, and more. Their absence cripples [undermines] a nation's ability to support a
foundation for human security and stability, inhibits its ability to thrive in good times, and respond effectively to natural and man-
made threats in bad times. It breeds corruption, poverty, poor health outcomes, spread of lethal diseases, gross
human rights abuses and conflict. This we have seen played out with grim efficiency in Afghanistan, Pakistan, Iraq,
Syria, Sudan, Democratic Republic of the Congo, Central African Republic, Libya, Yemen, Somalia, Nigeria, Honduras, and beyond. All
have had disastrous regional effects, many have created direct threats to U.S. interests.
Islamic State in Iraq and Syria was borne out of the brutal kleptocracy of Assad's Syria and a destructive government in Iraq. Al-Shabaab was
created in the failed state of Somalia. Boko Haram grew in the destitute and neglected regions of northern Nigeria. Al Qaeda and the Taliban
secured a haven in the lawless western regions of Pakistan. Weak governments in Central America created a fertile ground for organized
criminal gangs to terrorize the populace and profiteer off the illegal drug trade that destroys lives, and drives people to desperately flee
northward into the United States.

Insurgencies, terrorist organizations, and other nonstate actors thrive in the presence of an
incompetent or abusive state government that violates segments of its citizenry and fails to provide an environment where
peoples' rights are protected and their basic needs met. These groups divine counter narratives that take advantage of people's lack of hope
and fears. They create a refuge and an outlet for people's rage. Such
messages and place of belonging can be a powerful
magnet for youths, the poor, and the disenfranchised, who see little hope in the future.
Security threats are not only manmade but also can come from nature. The international
community's failure to dramatically
reduce our carbon footprint leaves us vulnerable to an increasing number of extreme weather events
that threaten everything from coastal communities to food and water security. This will amplify
existing tensions over natural resources and could result in the forced migrations of massive numbers of
vulnerable people. The world's population is expected to reach 9 billion by 2030. The growth will primarily occur in cities in the developing
world most of which already have fractured or nonexistent infrastructure. Climate change will have a dramatic effect on densely populated
poor urban areas, especially those in arid zones and in littoral areas. This is a recipe for disaster.

Environmental degradation is also increasing the spread of infectious diseases and facilitating zoonoses
to jump the species barrier and infect humans. The Ebola outbreak, like severe acute respiratory syndrome and H1N1 before it, is part
of a long list of diseases that have infected humans from an animal reservoir with devastating impact. Many zoonoses exist and
more will come. Using history's guide, the next pandemic will likely be a zoonotic agent. Recognizing this, the United States last year led
the creation of the Global Health Security Agenda to prevent, detect, and respond to deadly disease outbreaks.13 Though accepted by many
countries, it has been implemented by few.

No amount of force can resolve these challenges. However, global health diplomacy, exercised through civil-
military and military-military programs, is a promising strategic tool that should be employed to address these
wicked strategic or global health problems and improve domestic and international security.

AN OPPORTUNITY TO ACT

Despite a growing level of interest in academia and government agencies, there is little agreement on how to define “global health
diplomacy.”14 Michaud defined it as “international diplomatic activities that (directly or indirectly) address issues of global health importance,
and is concerned with how and why global health issues play out in a foreign policy context.”14 The World Health Organization (WHO) states
that it “brings together the disciplines of public health, international affairs, management, law, and economics, and focuses on negotiations
that shape and manage the global policy environment for health.”15 We summarize global health diplomacy as the application of a broad range
of skill sets to cooperatively improve human security throughout the world. A vital area of focus
must be to strengthen public
service, governance capabilities, and civil society in unstable regions. Doing so will enable nations to
create an environment where their citizens' basic needs can be met, universal rights respected, and the ability to hold a
government to account, secure. This includes building and retaining capabilities to manage effective, noncorrupt,
justice, finance, health, education, defense, public works, and environmental departments. The absence of these
structures cripples [undermines] a country's ability to govern itself and leaves it vulnerable to the causes of
instability, both human and natural.

The United States, by virtue of its strengths across diplomacy, defense, development, trade, and its inherent
domestic civilian capabilities, has an opportunity to exercise its leadership and mobilize these assets.
Using global health diplomacy to comprehensively strengthen public service and governance capabilities has
been chronically neglected by the international development community. It needs a leader to start this process and the United States
has the ability and authority to do so in the national and international interest.

Failure causes extinction. Collaborative projects mitigate antagonisms.


Wilmot James 17, Ph.D. from UW Madison, Honorary Professor in the Division of Human Genetics at
the University of Cape Town's Medical School and Non-residential Senior Fellow at Bard College’s
Hannah Arendt Centre, “In an Age of Zika and a Threat of Biochemical Terror, Health Security Must Be
Everybody’s Concern”, Daily Maverick, 2017, https://www.dailymaverick.co.za/article/2017-04-02-op-
ed-in-an-age-of-zika-and-a-threat-of-biochemical-terror-health-security-must-be-everybodys-concern/
#.WOY8xTvDHHw

Health security is humanity’s shared concern. Promoting health and preventing death define us at our
most altruistic and advanced. The Hippocratic Ideal, the concept of the physician as the guardian of human health, encapsulates a
fundamental human quality common to all the world’s great religions. Medicine is one of the earliest and greatest human
achievements because it is a co-operative enterprise involving highly skilled individuals; and it is as a
result of cooperation – and our unusual ability for complex language – that cumulative civilisation is possible.

In the age of globalisation, it is health security, a recent Lancet editorial stated, that “is now the most important
foreign policy issue of our time”. The rapid emergence and re-emergence of pathogenic infectious
disease, of which Zika is the most recent, the slow but steady cumulative acts of nature associated with
climate change, high-risk forced migration caused by desperation and war, the creeping reality of
biochemical terror and the threat of nuclear war, propel human survival and well-being to the frontline of
what today must be everybody’s concern.

The field of health diplomacy provides an unprecedented opportunity to build human solidarity. It is an
area of human endeavour that cuts through inherited antagonisms. Governments that offer health
improvements as part of aid to nations with whom they wish to develop stronger diplomatic links
succeed in cultivating deeper cultural relationships precisely because of their direct benefit to citizens . To
advance health diplomacy requires health leaders with an inclusive global vision...

AMR threatens the SDG agenda. Absent the plan, initiatives fail. Extinction.
Dr. Raphael Chanda et al. 23, Medical Doctor at ReAct Africa, Clinical Microbiologist at the University
Teaching Hospital, Lusaka, Zambia, Clinical Microbiologist and Microbiology Registrar at Groote Schuur
hospital in Capetown, South Africa. Dr. Francesca Chiara, director of the CIDRAP Antimicrobial
Stewardship Project, PhD in Neuroscience from University College London, master’s in public health
from the London School of Hygiene and Tropical Medicine, master’s research degree in Pharmaceutical
Biotechnology from the University of Milan, post-doctoral fellowship at King’s College London. Mirfin
Mpundu, Public Health expert, Global Advocate for AMR, Clinical Pharmacist, Engagement Lead Africa at
ICARS, Director at React Africa. Philip Matthew, Technical Officer, WHO Geneva. Steering Against
Superbugs: The Global Governance of Antimicrobial Resistance, chapter 7, pg 83-91, DOA: 8/29/2023,
SAoki. [AMR = anti-microbial resistance] [SDG = Sustainable Development Goals]

AMR has been recognized as a global problem of public health concern (Ramanan Laxminarayan et al., 2013). AMR
is an existential
threat that leads to treatment failure, increased costs related to treatments, longer hospital stays, predisposal
of patients to healthcare-associated infections (HAIs) especially in those who are immunosuppressed, and
affects both morbidity and mortality (Dadgostar, 2019). Left unabated, AMR threatens to reverse the major gains made
in modern medicine over the past 100 years. Procedures such as surgeries, treatment of infectious diseases,
and neonatal and maternal health will be difficult to perform and manage. While it is well known that the
interconnectedness of human, animal, plant, and the environment contributes to the spread of
resistance, very few initial steps have been taken to address AMR in its entire complexity (Destoumieux-
Garzon et al., 2018; see Chapters 11, 12, and 13). In addressing AMR, the focus should not be dominated by research
and development of newer antimicrobial agents but should also target public health interventions which are
essential in addressing the emerging threat of superbugs (Boucher et al., 2009). The veterinary and human sectors largely
share the same antimicrobial classes and consequently when resistance is developed in one context, it can quickly spread to other species and
environments (Woolhouse et al., 2015). Transmission can happen through the food chain, environment (especially sewage), and direct contact.
Recently, the Global Leaders Group on AMR called for an urgent reduction of the use of antimicrobials in animals, including significant
investment in infection prevention and control in the food system (WHO, 2021). The biggest challenge for controlling the spread of AMR is to
limit infections happening in the first place, thus reducing the need to use antimicrobials in all sectors (Klein, Levin, et al., 2018; Klein, Van
Boeckel, et al., 2018). Overuse and misuse, including inadequate access to the correct treatments, are still considered to be major drivers of
AMR. Poor hygiene and sanitation and inadequate infection prevention measures pose the highest risk to increased burden of resistant
infections. The greatest effect has been felt by LMICs, whose water, sanitation, and hygiene (WASH) infrastructures and health systems are
weak, and which suffer from chronic stock-outs of critical and essential antimicrobials, critical human resources, and weak laboratory systems.
The World Health Organization (WHO) during the sixty-eighth World Health Assembly (WHA) of 2015 adopted the global action plan on AMR
with five key strategic areas: raising awareness and knowledge of AMR; strengthening the knowledge base through surveillance; using infection
prevention practices to reduce infections; optimizing the use of antimicrobials in both the human and animal sectors; and making an economic
case for sustainable investment (WHO, 2015). Member states agreed to develop and implement NAPS within two years of this resolution to
address and contain AMR in their countries. Additionally, in recognition of the complexity of addressing AMR as a One Health issue, across
sectors, the WHO, Food and Agriculture Organization (FAO), and World Organization for Animal Health (OIE) together formed the Tripartite that
is responsible for AMR governance across all sectors. According to WHO Regional Office for Africa, thirty-six countries have developed NAPs and
are at different stages of implementation (WHO, n.d.).

Five years down the line it has become apparent that in addressing AMR, the challenge for LMICs does not rest in development of NAPS but
rather in securing sufficient political interest, public attention, and financial support for implementation, as concluded by the Interagency
Coordination Group (IACG) on AMR (Frumence et al., 2021; IACG, 2019; see also Chapter 14 for discussion of lack of NAP implementation).
Reasons include lack of knowledge and data on the burden of AMR and inability to translate its impact on both human and animal health
effectively (Pokharel et al., 2019). Describing consequences related to AMR has always proven to be a challenge due to a number of factors
which include involvement of diverse pathogens, unique mechanism of transmission, and its association with a variety of clinical syndromes.
Since antibiotic resistance is not a disease entity by itself, these factors provide a cloak of invisibility obscuring its impact on health (Cars and
Nordberg, 2005). Additionally, AMR has a language problem: it is a term commonly used in clinical practice among health professionals with an
assumption that its meaning is well understood by patients and policymakers (Mendelson et al., 2017). Several surveys conducted worldwide in
different settings have concluded that knowledge and understanding of antibiotics is lacking and an innovative approach is required to address
these problems in order to create awareness in the community (Bakhit et al., 2019; Brookes-Howell et al., 2012; Mokoena et al., 2021). Until
very recently, AMR did not feature in the UN's 2030 Sustainable Development Goals (SDGs) but it now
has one indicator (percentage of bloodstream infections due to selected antimicrobial-resistant
organisms) although many of the SDGs are dependent on AMR being curbed for their delivery (WHO, 2018). If
we fail to frame and communicate the dramatic impact that AMR could have beyond the health sector,
as happened successfully for other global health challenges, the issue will not win the support of
politicians and the public.
Governance challenges for LMICS

This section discusses three central governance challenges for LMICs. First, we look at political will and leadership, then: advocacy and awareness, and third: resources and capabilities.

1. Political will and leadership

Addressing a cross-cutting public health issue such as AMR requires strong functional governance structures. Governance of AMR at the global, regional, and country level has become a huge and complex problem, and is a relatively weak link at the country level. Though most countries in
Africa have moved forward and developed their NAPs, there are issues with implementation. As NAPs are legally non-binding, most countries are not persuaded to follow through with implementation. Many countries have made very little efforts to develop operational and budget plans
for implementation of NAP activities, and many have limited themselves to a strategic action plan only (FAO, 2018). On paper, most countries may have good structures for NAP implementation that include a ministerial committee across sectors, an AMR Secretariat with a National AMR
Focal Point, technical/thematic working groups across sectors, and sector-specific AMR focal points, but these are dysfunctional and rarely meet. There is no accountability, clear scope of work, nor terms of references for some of these committees and positions. Lack of the needed
coordination across sectors and absence of robust engagement with various stakeholders is contributing to inaction or misaligned implementation of NAPs. Where some activities for AMR are taking place, these are largely donor driven and often not guided by country priorities. In
addition, most AMR Focal Points or coordinators have other primary responsibilities that make it very difficult to focus and concentrate on AMR functions. Depending on the professional background of the national AMR Focal Point, the implementation efforts around NAPs will be skewed
in favour of that domain. There are also regulatory constraints that drive the issue of AMR at country level. Medicines regulation in both human animal and agriculture sectors ensures appropriate use of antimicrobials but unfortunately, most LMICs lack appropriate legislation and
policies. Even in countries with policies in place, monitoring and enforcement mechanisms are often lacking (Pokharel et al., 2019; Roth et al., 2018).

As AMR is a complex multi-sectorial problem it requires the involvement of multiple stakeholders in animal, human, agricultural, and environmental sectors. It requires intersectoral planning and implementation, a true One Health approach (Destoumieux-Garzon et al., 2018; Queenan et
al., 2016; see also Chapters 11, 12, and 13). This One Health approach to AMR, however, is challenging because it requires concerted actions across different ministries and sectors, especially for LMICs. The political will at global level has minimal impact at country level and does not
always translate into action, mainly due to the absence of effective governance structures and wholehearted commitment that leads to a lack of provision of resources and problems with ownership and accountability. In most cases, the governments or regulators fail to appreciate or they
do not have enough data to appreciate the true extent of the burden and impact of AMR at country level (Iskandar et al., 2021). The scientific community has also not been able to frame the AMR issue in a way that can interest the policy community at the country level. All these factors
hinder the translation of AMR policy from paper to action in LMICS. AMR requires long-term actions which are more challenging to achieve compared to short-term and one-sector focused solutions. As highlighted in earlier work, AMR is identified as a super wicked problem with a global
impact that requires willingness of national states to collaborate internationally. The change needed represents an overwhelming challenge at the political and policy level (Baekkeskov et al., 2020; Levin et al., 2012).

In general, most LMICs have failed to move away from vertical programmes towards more integrated systems that could move AMR interventions into the mainstream. Most developing countries are implementing projects directed at WASH, HIV, TB, maternal and child health, and non-
communicable diseases. However most of these programmes are still siloed with very few efforts to integrate AMR interventions in them. Beyond line ministries, there are very few that co-develop activities such as intervention activities on AMR. The way budgeting is done typically by
various ministries contributes to the failure of ministries to work together. Governments need to take leadership in creating the platform and forum for interministerial planning.

Monitoring and evaluation systems are important in NAP implementation as these systems help track performance indicators. Surveillance data improve accountability and transparency, facilitate regular monitoring and evaluation, and highlight progress achieved in attaining goals laid
out in the global action plan. This ensures continued stakeholder support and ultimately contributes to the global momentum on the AMR issue. However, few LMICs have developed such functional and integrated systems, highlighting potential gaps that need to be addressed (Kariuki et
al., 2018).

Among the recommendations of the IACG on AMR was accelerating progress in AMR actions, a focus on innovation to secure the future, collaboration and engagement of various stakeholders, increased investments for sustainable response, and a focus on strengthening accountability
and global governance (IACG, 2019). While some progress has been made in this area, a lot needs to be done through appropriate monitoring and evaluation.

2. Advocacy and awareness

Some scholars describe AMR as an advocacy crisis, building on the argument that in comparison to other well-established
global health programmes directed, for instance, at tuberculosis, HIV, and malaria, AMR has not received enough support from
civil society, national advocacy associations, and the general public (Fraser et al., 2021). The high burden of
communicable and non-communicable diseases (NCDs) with better advocacy and visibility profiles displaces AMR positioning on national
agendas, with AMR coordinators in most countries having little influence, which leads to poor integration of activities (Baekkeskov et al., 2020).
Most of the advocacy and policy work on AMR has been driven by high-income countries (HICs), which have represented the issue at a high
political level, although experiencing a lower burden of resistant infections compared to LMICs (see Chapters 2 and 14). In particular, health
agencies, international funders, and a few HIC governments with a high stake in global health have set the AMR agenda. This raises questions
on how well needs and challenges facing LMICS have been truly represented.
A good example comes from stewardship and responsible use of antimicrobials campaigns, which have proven effective in HICs but yielded
poor outcomes in low-resource settings.

First, excess versus access to antibiotics is a critical dilemma in LMICS (Cox et al., 2017). Although
increased use of
antimicrobials has been recently documented, suggesting better access to these life-saving medicines, it
is also true that many people still suffer and die due to lack of their availability (Browne et al., 2021). The recently
published AMR Benchmark (Access to Medicine Foundation, 2021) shows that only 54 producers of antibacterial, antifungals,
and vaccines out of the 166 tracked have a strategy to ensure provision of these products in LMICs. This
leads prescribers to use inappropriate treatments, despite the existence of guidelines or awareness campaigns pointing to the correct drug of
choice. Campaigns designed to promote responsible use of antimicrobials should be adjusted to the needs of the contexts where they have
been implemented (Laxminarayan et al., 2016). Secondly, lack of reliable surveillance data in LMICs is also an important limiting factor to the
design of advocacy, stewardship, and awareness campaigns. HICs collect and have access to reliable surveillance data that can inform on
changes in drug resistance patterns and consumption of antimicrobials. However, these programmes do not often take into account the
heterogeneity of health systems in LMICs, the interrelationship between patients and doctors and the role of other healthcare professionals
such as pharmacists and community workers as prescribers, and lack of access to basic services such as microbiology labs, sanitation facilities,
clean water, and cold supply chains, to name a few. While targeting behaviour seems to be the primary focus of many programmes, this might
have little effect in contexts plagued by lack of widespread access to healthcare services and other health inequalities (Charani et al., 2021; see
also Chapter 8). The COVID-19 pandemic and the slow distribution and scarce availability of vaccines in the Global South has been a painful
reminder of the reality that many resource-limited settings face.

Thirdly, AMR has been described as a slow-moving pandemic. While this framing serves the political purpose of creating urgency on policy agendas, it might dangerously shift the focus from strengthening health systems to epidemic preparedness. While epidemic preparedness obviously plays a very important role in strengthening the capacity of health systems, what many LMICs need is to lay the basis of functional and sustainable health systems rather than building capacity on a system that lacks the foundation.
Finally, the epidemiological transition that many LMICs have experienced in the last few decades has also resulted in reduced interest in infectious conditions and, subsequently, in AMR. Many agendas have shifted to raise awareness of the health impact of NCDs. As the gross domestic product (GDP) of countries has increased, habits have changed, including around food and nutrition.

The consumption of animal protein and processed food has increased. The demand for animal protein is being fuelled by a general understanding that its quality is better than that of plant-based alternatives, and by the increase in per-capita income seen over time in low-resource settings.
Agricultural intensification has increased production dramatically and this has also brought down the effective price of animal products. Most of the intensification is not supported by robust infection control measures and is therefore propped up by the use of antibiotics in livestock
(Godfray and Garnett, 2014). The small/medium farmers in LMICs often do not have access to resources or incentives to improve the adoption of infection prevention (biosecurity) measures, which might be common practice in HICs. The price discovery process for farm products in low-
resource settings also has several structural flaws, which results in poor farmer incomes. The inability of the scientific community to decipher the AMR issues for the public results in poor awareness across the board. In most countries, consumers are not willing to pay a premium for
products made without routine use of antibiotics.

3. Resources and capabilities

AMR-focused funding opportunities for LMICs are few and not accessible for most countries. As a result, most countries report the lack of financial resources as one of the major hindrances to NAP development and implementation. Equally, countries are at different stages of NAP
development/implementation and challenges in comparing progress indicators may arise. Among the critical roles of the national AMR Secretariats is resource mobilization that would facilitate NAP implementation. However, most AMR Secretariats at the national level are understaffed
and lack the skills needed for resource mobilization.

As mentioned above, the lack of involvement from different stakeholders is another area of concern. In developing NAPs, inclusion of key stakeholders and use of local data generated through a comprehensive situational analysis are important. Aligning AMR activities with country priority areas, in addition to advocacy efforts, may play an important role, especially in resource-limited settings. These can only be achieved through governance and delegated activities.

In general, LMICs have limited capacity to conduct AMR surveillance in all sectors, and where data are collected they are mostly fragmented and do not reflect the true burden of AMR (WHO, 2020). In addition, quality assurance is a concern in many of the existing laboratory facilities,
thereby limiting the insights which can be generated from the laboratory results (see Chapter 12 for an illustrative example of this specific challenge in Malawi). Contributing factors include poor infrastructure, lack of human resources, and reliance on external funding (Iskandar et al.,
2021). AMR surveillance systems are an important component in management of infectious diseases and they form the basis of understanding the dynamics of AMR transmission. The data collected contribute to improved public health services, inform policy formulation, and act as an
early warning mechanism for monitoring emerging resistance trends (Hay et al., 2018). A true One Health approach requires the integration of human surveillance systems with environmental, animal, and food systems to track the spread of resistant pathogens and antibiotic residues to
identify areas of risk and priority for action (WHO, 2013). The limited capacity to conduct surveillance in LMICs impacts the ability to conduct situational analysis and generate country level data, and the absence of the data presents a further challenge for most countries to document and
build a case for funding of AMR activities. In order to overcome this challenge in LMICs, innovative approaches to improving resources and financing of laboratory services are required. In 2020, Zambia developed a framework for an integrated surveillance approach which follows the lead
example of high-income countries such as Canada, Denmark, United Kingdom, and Sweden, to name a few, with already established systems. Though it is too early to judge its success, it is an important step. Lack of technical capacity to implement novel stewardship interventions in
human, animal, and environmental sectors is also a problem which drives the global AMR issue.

An often overlooked cause of AMR is the impact of antimicrobial residues and resistance genes in wastewaters on the spread of resistant pathogens. Contamination can happen from hospitals, households, farms, and pharmaceutical manufacturing facilities. Drug-resistant organisms have been systematically detected in communities living around manufacturing plants in India and China, which manufacture most of the world’s antibiotics (Rutgersson et al., 2014). In 2020, India was set to become one of the first countries to set limits on levels of antibiotic effluents in wastewater; however, the India Drug Manufacturers Association successfully argued against the implementation of the policy (Schaaf, 2020). There is a need to strengthen surveillance capacity to include monitoring of the impact of wastewaters on human health and to invest in
innovative solutions to develop wastewater treatment plants for LMICs (see also Chapters 17 and 18 for a discussion of regulations of pharmaceutical industries).

Recommendations

Foundations of strong governance structures require the appreciation of AMR as a public health threat that needs urgent attention. Currently, few political leaders and other key decision-makers are aware of AMR as a public health problem and the proportionate response it requires. More efforts should be targeted towards political leaders and the policy community to take up AMR as a critical public health issue. National AMR secretariats must be empowered and funded to carry out their many activities, including AMR NAP implementation. The COVID-19 pandemic has demonstrated that
with good advocacy and leadership AMR can be placed on the global and national agenda (Cars et al., 2021). Though COVID-19 has taken away
some of the political capital available to the AMR issue, the increased interest in health among the public and policy community could yield rich
dividends if the AMR issue is placed strategically on the political landscape.
For coordinating the efforts around the UN SDGs, most LMICs have an expert on the issue who generally sits in the head of
state's office. Engaging with these experts will gain AMR the necessary visibility for resource mobilization.
Rather than supporting the creation of yet another AMR coordination body or office, we advocate for appropriate funding being made available
with recruitment of dedicated AMR staff at country level. The AMR national focal points should be intended as full-time jobs and their positions
supported by staff to spearhead NAP implementation, engage with stakeholders, convene technical and thematic working groups, liaise with
funders, and leverage current programmes such as WASH, infection prevention and control (IPC), and vaccination programmes.

Provision of finances

Implementing AMR NAP activities is cost-intensive and requires planning and apportioning the necessary funds after thorough costing of the
action plan activities. AMR should be viewed as a development issue and a security threat that requires funding, and it should be integrated into SDG agendas. It has been twenty years since African governments signed the Abuja declaration
pledging to allocate 15 per cent of annual budgets to the health sector. However, this target has been elusive with most countries failing to
honour the commitment in full; as of 2011, only South Africa and Rwanda had reached the 15 per cent target, and data from the WHO Global Health Expenditure Database showed that at the end of 2018 all Southern African Development Community (SADC) Member States were struggling
to honour the pledge (Bwalya, 2021; WHO, 2011). Increased public health financing is also required for countries to attain universal health
coverage; as highlighted during the COVID-19 pandemic, disparities in access to health still exist in LMICs. In
addressing AMR it is
important to frame the issue more broadly, articulating it not only as a health issue but also as a
development issue as part of the SDG agenda.
Accountability

As with any national programme or intervention, governments need to build accountability measures into AMR progress. The relevant ministries have to take ownership of the action plan and allocate resources. Clear core indicators should be reported every year to track progress and tease out lessons, using monitoring and evaluation strategies that follow yearly activities, with annual reports presented not only to political leaders but also to the communities affected by AMR.

Raising awareness and knowledge

Governments working with partners and other key stakeholders need to develop novel strategies to raise awareness and knowledge within communities. Given the ease with which antibiotics are accessed without a prescription in most LMICs, civil society needs to be engaged to tackle this issue effectively. Such public health campaigns and interventions cost money but are rewarding. Mobilizing local communities around the AMR issue is likely to contribute to behavioural change and to reduce unwarranted antibiotic requests when antibiotics are not medically indicated. Civil society organizations could work with and support governments on this agenda. Deliberate efforts should be made towards
addressing interventions that target behavioural change. Understanding social, economic, gender,
cultural, and religious dimensions and related inequalities is key to AMR containment. These range from social structures and poverty that affect literacy and hinder access to healthcare, to the lack of universal health coverage, which burdens the most vulnerable. Cultural and religious beliefs affect the relationship
between patients and healthcare professionals and trust in standard medical practices. Holistic and symbolic forms of healing are still common
and widely used in many contexts. Additionally, the effect of urbanization has shifted populations in search of better jobs from rural areas into
cities in many fast-growing economies of the Global South. As a result, in
emerging economies there are huge income
disparities with a high level of poverty forcing many to live in crowded and inadequate housing with
poor access to hygiene and sanitation facilities, exposing them to a higher risk of contracting infectious
diseases. Women might be more susceptible as they suffer inadequate access to WASH during pregnancy and childbirth (Charani et al.,
2021). The successful containment of AMR needs to address the pressing social determinants of health that still characterize LMICs.

Independently, only expanded public investment prevents SDG collapse.


Anne Mills et al. 15, London School of Hygiene and Tropical Medicine. Viroj Tangcharoensathien,
International Health Policy Program, Ministry of Public Health. Toomas Palu, World Bank, East Asia and
Pacific Office. “Accelerating health equity: the key role of universal health coverage in the Sustainable
Development Goals,” https://doi.org/10.1186/s12916-015-0342-3, DOA: 8/28/2023, SAoki [UHC =
Universal Health Coverage]
Background

The recognition
that health is a precondition for, an outcome of, and an indicator of all three
dimensions of sustainable development [1] has led to a series of extensive negotiations among United Nations
(UN) member states on the text of the post-2015 Sustainable Development Goals (SDGs; see Box 1). The SDGs follow, and expand upon, the
Millennium Development Goals (MDGs), which are due to expire at the end of 2015, though all health-related MDGs continue to be included in the SDGs with newer
targets. The SDGs are due to be finalized in September 2015, and will be the result of the largest consultation process by the UN.

Despite the critique of the number of SDGs (17 goals and 169 targets), all are
interlinked, reflecting the fact that sustainable
development in a country requires multidimensional and multisectoral policy interventions. These
include addressing poverty, hunger, food insecurity and malnutrition, environmental protection, quality education, universal health
coverage (UHC), employment, and decent work. All of these issues are embraced within an equity framework and interwoven with health considerations.
Take the case of malnutrition. Children with severe malnutrition have a higher mortality risk; malnutrition accounts for 45% of total annual child mortality [2]. While
management of acute malnutrition within the health sector is cost-effective [3], food and nutritional security realized by sustainable resilient agriculture and
improved capacity to adapt to climate change, drought, flooding, and disasters in SDG2, is equally important and synergistic. Or consider the case of tobacco as a
significant contributor to the noncommunicable disease (NCD) epidemic. Strengthening implementation of the Framework Convention on Tobacco Control and
controlling harmful use of alcohol will face industry resistance, and in some countries is hampered by free trade agreements and trade interests dominating health
goals. Addressing these cross-sectoral complexities requires strong leadership, active civil society organizations, and effective intersectoral actions to ensure that a
health lens is taken by other policies.

The 13 targets (nine specific and four cross-cutting) of the health goal in SDG3 are raised to a level much higher than in the MDGs, such as reducing the maternal
mortality rate to no more than 70 per 100,000 live births, ending preventable deaths in newborns and children, reducing premature mortality from NCDs by one-third, halving global deaths and injuries from road traffic accidents, and achieving UHC.

UHC is a significant SDG health target combining financial protection against catastrophic health
spending and medical impoverishment as well as ensuring access to essential services. It is both a measurable goal in itself, with a significant contribution to welfare valued by societies, and an important means for achieving the other SDG3 health targets. It is also high on the global agenda, as
reflected in the 2012 United Nations General Assembly Resolution. To reflect this key role of UHC this commentary reviews different trajectories
countries have taken in making progress toward UHC, and accelerating the achievement of health equity, financial protection, and long-term sustainability [4].

Universal health coverage: different trajectories

Although countries take different routes in making progress toward UHC, based on their socio-economic and political context, a common trend emerges: different
financing sources are used to cover different population groups. Public and private sector employees are covered by payroll-tax financed contributory schemes,
often taking the form of mandatory social health insurance (SHI). The poor are usually covered by tax-financed mechanisms either directly managed by the Ministry
of Health or as part of the SHI as in Vietnam and the Philippines. Coverage of the large informal sector is financed by a range of funding sources, from full premium contributions by households to partially and fully tax-subsidized premiums. Most countries in Asia gradually shift from full contributions to tax funding depending on
the government fiscal space and, most importantly, political leadership. Countries find it difficult to expand coverage of the informal sector through contributory
schemes because of ineffective mechanisms to enforce contribution payment [5].

Another trajectory is in countries where policy choice is to achieve UHC via services that are (in theory) provided free of charge in public health facilities. In this
trajectory, in some countries public spending on health may not match the increased demand for health services, resulting in high levels of household out-of-pocket
payments, for example 45% of total health spending in Sri Lanka [6]. Also, wealthier members of the population may opt out of government services, preferring to
pay out of pocket for private services (Malaysia). But, on the other hand, in the Pacific Island States, publicly provided health services at relatively high cost to the
governments have actually minimized out-of-pocket payments by the population.

The design and inter-relationship between health delivery and financing have major ramifications for health systems performance. Evidence from Organization for
Economic Cooperation and Development (OECD) countries suggests that public contract systems, where there is a direct relationship between the purchaser organization and healthcare providers, or reimbursement systems, where the purchaser organization reimburses patients for their medical bills, are more efficient than public integrated systems, where healthcare providers are owned by a purchaser organization [7]. But this efficiency is also a function of strong institutions in OECD
countries compared to those in developing countries.

Universal health coverage: contribution to health equity

To achieve a favorable UHC outcome, strengthening physical access by improving geographical coverage
of health services, and financial access by extension of financial risk protection mechanisms , are two
essential parallel synergistic interventions [8]. The higher the coverage of skilled birth attendance (SBA), the smaller
the rich-poor disparities [9]. In countries with very low SBA coverage, that is, less than 30%, the rich-poor disparities are large, at around 60
percentage points. A smaller disparity, less than 20 percentage points, is observed in countries having high coverage.
Where 100% SBA coverage is reached, as in Thailand, there are no gaps whether by maternal education or by socio-economic status [10]. In Thailand, universal
coverage of Maternal and Child Health (MCH) services resulted in rapid reduction in the rich-poor gap of child mortality between the 1990 and 2000 censuses [11].
Relative inequalities tend to be larger in countries with lower overall levels of health care use [12]. The
US Affordable Care Act coverage
expansion has resulted in improved access to a usual care provider for millions of black and Hispanic
Americans, and reduced the likelihood of going without care because of cost [13].
Functioning close-to-client primary health care (PHC) which the majority of the poor can access [14] acts as a major hub in translating UHC political intentions into
pro-poor outcomes such as service utilization and government subsidies [15]; a comprehensive benefit package results in high levels of financial risk protection,
preventing non-poor households from becoming poor due to medical payments [16].

Health workforce: a backbone of health systems

The health workforce is critical to functioning health services. Shortages and maldistribution of the health workforce, a common problem facing many MDG off-track countries, have been a constant challenge despite the 2008 Kampala commitment [17]. Investment in the health workforce remains low, with large gaps between
demand and supply; health workforce planning is often weak without intersectoral coordination; policies on retention of the health workforce in rural areas and
within countries are not fully implemented; scaling and transforming health professional education is at an early stage of reform [18].

Future projections demonstrate that low income countries will face a widening gap between the supply and need for health workers, but have limited capacity to
employ more workers, even if supply can be increased. Upper middle income countries will face a similar widening gap, but created by demand factors, which could
drive up health care costs or encourage in-migration of health workers. Projection by the International Labour Organization (ILO) shows that 10.3 million additional
health workers globally are required to close the current gaps and ensure universal health coverage, of which 7.1 million are needed in Asia and 2.8 million in Africa
[19]; these gaps will hardly be met unless governments make strong commitments to produce and retain health workers in their countries. OECD countries are the major
destinations for international migration of health workers, often the highly skilled workers from low and middle income countries. Demand for health workers in
high income and emerging countries due to aging and needs for long-term care stimulates international migration. This is exacerbated by the unresolved “push
factors” in source countries, such as low pay, lack of career paths, and poor working conditions. Despite the World Health Assembly adopting by consensus the
WHO Global Code of Practice on International Recruitment of Health Personnel [20], implementation of the Code is suboptimal, as reflected by the first report of
the Code’s implementation [21]. But on the other side of the equation is the macro-economic calculus of the professional migration from the demographic dividend
countries that goes beyond individual push and pull factors. In the Philippines, the remittances from migrants, of which health professionals make up a significant
part, contribute more than 10% to the gross national income (GNI). In a global economy, win-win situations may be possible if importing countries adhere to the
Code, and donor countries organize their health professional education systems and labor markets so that local populations’ access to qualified health professionals
does not suffer.

Skill mix, cadre mix, and task shifting [22], clinical and public health competency, performance, and social accountability are as important as numbers of health
workers. These require transformation of the instructional and institutional dimensions of health professional education systems. A more diverse composition of the
health workforce, and expansion of health workers in the community and of mid-level health workers, needs careful planning [23].

Finding fiscal space

Progressively achieving UHC will require a significant increase in public investment. Countries would need to
systematically review the opportunities under the five domains of fiscal space creation [24]. Macroeconomic conditions remain challenging over the medium term
with slow growth in developed countries and slowing growth in Asia. But Africa has just had a decade of the fastest economic growth that should create
opportunities for fiscal space for health. The recent Lancet Commission report on Global Health 2035 makes a strong economic case for health that should facilitate
greater prioritization of health by the economic ministries in countries [25]. The Philippines has recently demonstrated success in raising additional resources for
health through a sin tax reform for tobacco and alcohol, with 80% of the revenues accruing to speeding up progress toward UHC. In spite of global economic
problems, the UK has just reaffirmed its commitment to allocate 0.7% of the gross domestic product (GDP) to overseas development assistance [26], and the recent
Chatham House Global Health Financing report [27] calls for 0.15% to go toward health. But perhaps the most untapped resource for increasing fiscal space for
health is efficiency gains from existing allocations by using evidence-based approaches to priority setting, resource allocation, performance-oriented provider
payment mechanisms, and strengthened public financial management and accountability.

Conclusion

UHC and the health workforce are two among 13 health targets in the SDGs, and jointly contribute to the achievement of the SDGs.
The upcoming health targets in the SDGs, more inspirational and demanding than the previous health-related MDGs, are achievable
only when countries demonstrate investment in health systems strengthening beyond the rhetorical
statements made at the United Nations General Assembly by Heads of State.

Collapse undermines crisis response and locks in global shortages.


Tom Cernev 20, MPhil in Engineering for Sustainable Development from the University of Cambridge,
BA in Mechanical Engineering from the University of Adelaide, Winter School Attendee at the Australian
National University, and Dr. Richard Fenner, Reader at the University of Cambridge and Director of the
MPhil in Engineering for Sustainable Development, “The Importance of Achieving Foundational
Sustainable Development Goals in Reducing Global Risk”, Futures, Volume 115, January 2020,
https://doi.org/10.1016/j.futures.2019.102492
4.2. Existential and catastrophic risk

The level and consequences of these risks may be severe. Existential Risks (ER) have a wide scope, with extreme danger, and are “a risk
that threatens the premature extinction of humanity or the permanent and drastic destruction of its potential for desirable
future development” (Farquhar et al., 2017), essentially being an event or scenario
that is “transgenerational in scope and
terminal in intensity” (Baum & Handoh, 2014). With a smaller scope, and lower level of severity, global catastrophic risk is defined as a
scenario or event that results in at least 10 million fatalities, or $10 trillion in damages (Bostrom & Ćirković, 2008). Global Catastrophic Risk
(GCR) events are those which are global, but they are durable in that humanity is able to recover from them (Bostrom & Ćirković, 2008; Cotton-
Barratt, Farquhar, Halstead, Schubert, & Snyder-Beattie, 2016) but which still have a long-term impact (Turchin & Denkenberger, 2018b).

Achieving the Sustainable Development Goals can be considered to be a means of reducing the long-term global
catastrophic and existential risks for humanity. Conversely if the targets represented across the SDGs
remain unachieved there is the potential for these forms of risk to develop. This association combined with
the likely emergence of new challenges over the next decades (Cook, Inayatullah, Burgman, Sutherland, & Wintle, 2014) means that it is of
great value to identify points within the systems representations of the Sustainable Development Goals that could both lead to global
catastrophic risk and existential risk, and conversely that could act as prevention, or leverage points in order to avoid such outcomes. This
identification in turn enables sensible policy responses to be constructed (Sutherland & Woodroof, 2009).

Whilst existential threats are unlikely, there is extensive peril in global catastrophic risks. Despite being lesser in severity than existential risks, they increase the likelihood of human extinction (Turchin & Denkenberger, 2018a) through chain reactions (Turchin & Denkenberger, 2018a) and by inhibiting humanity’s response to other risks (Farquhar et al., 2017). It is
necessary to consider risks that may seem small, as when acting together, they can have extensive consequences (Tonn, 2009). Furthermore,
the high adaptability potential of humans, and society, means that for humanity to become extinct, it is most likely that there would be a series
of events that culminate in extinction as opposed to one large scale event (Tonn & MacGregor, 2009; Tonn, 2009).

Whilst the prospect of existential risk or global catastrophic risk can seem distant, the Stern Review on the Economics of Climate Change estimated the risk of extinction for humanity at 0.1% annually, which compounds to a 9.5% risk of extinction over the next century (Cotton-Barratt et al., 2016). With respect to identifying these risks, it is known that, in particular, “positive
feedback loops…
represent the gravest existential risks” (Kareiva & Carranza, 2018), with pollution also having the potential to pose an existential
risk.
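
As an arithmetic check on the Stern figure quoted above (an illustrative calculation added here, not part of the original card), a constant 0.1 per cent annual extinction probability compounds over a century as
\[ 1 - (1 - 0.001)^{100} \approx 1 - 0.905 \approx 0.095, \]
i.e. roughly the 9.5 per cent century-level risk cited from Cotton-Barratt et al. (2016).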

With respect to reinforcing feedback loops, there is particular concern about the effects of time delay, and the level of uncertainty
when feedback loops interact (Kareiva & Carranza, 2018). It is difficult to identify the exact thresholds that are associated with tipping
points (Moore, 2018), which leads to global catastrophic risk or existential risk, and thus it is necessary to understand the events that can
lead to existential risks (Kareiva & Carranza, 2018).

Table 1 identifies possible global catastrophic risks and existential risks as reported in the literature and from Fig. 3 these are aligned to the
Sustainable Development Goals they impact on the most.

Table 1. Sustainable Development Goals and global catastrophic risk and existential risk scenarios.

Risk Scenario (with sources), Relevant SDG, and Goal classification:

Pandemic (GCR and ER): Posner (2004); Avin et al. (2018); Cotton-Barratt et al. (2016); Turchin and Denkenberger (2018b). Relevant SDGs: Good Health and Well-being [3] (Outcome/foundational); Poverty [1] (Outcome/foundational).

Global Warming (GCR and ER): Posner (2004); Ćirković, Sandberg, and Bostrom (2010); Matheny (2007); Farquhar et al. (2017); Cotton-Barratt et al. (2016); Bostrom and Ćirković (2008); Turchin and Denkenberger (2018b). Relevant SDGs: Climate Action [13] (Human input); Life on Land [15] (Outcome/foundational); Life Below Water [14] (Outcome/foundational); Responsible Consumption and Production [12] (Physical assets); Affordable and Clean Energy [7] (Physical asset).

Depletion of Resources (GCR): Posner (2004). Relevant SDG: Responsible Consumption and Production [12] (Physical assets).

Biodiversity Loss (GCR): Posner (2004); Avin et al. (2018). Relevant SDGs: Life on Land [15] (Outcome/foundational); Life Below Water [14] (Outcome/foundational).

Global Agricultural Shortfall (GCR): Turchin and Denkenberger (2018b). Relevant SDGs: Hunger [2] (Human input); Inequality [10] (Human input).

4.3. Linking risks with progress in the SDGs

Generally, it is the Outcome/Foundational and Human input SDGs that are most directly related. For example, as the movement of refugees increases pandemic risk, poverty levels in low- and middle-income countries rise, reducing the health of the population and restricting access to education, which further entrenches poverty; birth rates rise as family sizes increase, generating unsustainable population growth, which in turn furthers the migration of refugees (Fig. 5). Fig. 3 shows that leverage points to reduce refugees lie in SDG 16 (Peace, Justice and Strong Institutions), reducing malnutrition by alleviating SDG 2 (Zero Hunger), and taking SDG 13 (Climate Action) to avoid the mass movement of people fleeing the impacts of global warming.

Global warming itself will drive disruptive changes in both terrestrial and aquatic ecosystems, affecting SDG 15 (Life on Land) and SDG 14 (Life Below Water) and adding to their vulnerability to increases in pollution driven by a growing economy. Loop B (in Fig. 4) shows that the constraints associated with SDG 13 (Climate Action) may slow economic investment in industry and infrastructure, reducing the pollution generated and encouraging adoption of SDG 7 (Affordable and Clean Energy), whilst stimulating carbon reduction and measures such as afforestation, which will also improve the foundational environmental goals.

Depletion of resources and biodiversity are strongly linked to SDG 12 (Responsible Consumption and Production)
through measures such as halving global waste, reducing waste generation through recycling reuse and reduction
schemes, and striving for more efficient industrial processes. The more resources that are used, the less responsible Consumption and Production becomes, which may thus reduce biodiversity (Fig. 3) and increase the amount of waste accumulating in the environment.
The final driver of Global Catastrophic Risk is an agricultural shortfall which will increase global Hunger
(SDG 2) and widen the Inequality (SDG 10) between rich and poor nations and individuals. Quality Education (SDG 4) is
important as a key leverage point to stimulate the generation and adoption of new technologies to
improve energy (SDG 7) and water supplies (6) which can enhance agricultural production. Such linkages are
convincingly examined and demonstrated in the recent film “The Boy Who Harnessed the Wind” (2019), based on a factual story of water
shortages in Malawi in the mid 2000s.

These examples may appear self evident, but it is the connections between the goals and how they adjust together that is important to
consider so the consequence of policy actions in one area can be fully understood. Because
of the underlying system
structures global threats can quickly transmit through the system. Water Crises will limit the water available for
agriculture and basic needs which in turn will stimulate a decline in Gender Equality (SDG 5). Technology disruption from cyber attacks will
restrict the ability to operate Sustainable Cities and Communities (SDG 11) and potentially expose populations to extreme events by disrupting
transport, health services, and the ability to pay for adaptation and mitigation of climate related threats from a weakened economy. Conflict
(in all forms) will increase refugees and climate change provides the backdrop against which all these interactions will play out.

Whilst it is possible that global catastrophic risk or existential risk scenarios may eventuate from the non-achievement of the Sustainable Development Goals, there are certain aspects within the causal loop diagram which, if prioritised, will reduce this risk. For example, to reduce the risk of pandemic, ensuring that the number of Refugees is minimised is a leverage point.
Similarly, prioritising SDG 3 (Good Health and Well-being) is essential and is enabled by many of the other goals. However, a feature missing
from the SDGs is a recognition of the precautionary principle, with an implicit assumption that technological innovation alone may create
improvements in many of the goals.

Shortages collapse civilization.


Julian Cribb 23, Fellow of the UK Royal Society for the Arts, Australian Academy of Technological
Science and Engineering, Australian National University Emeritus Faculty, Director of National
Awareness, CSIRO, Co-founder at Council for the Human Future, How to Fix a Broken Planet: Advice for
Surviving the 21st Century, “Resources for Living”, Cambridge University Press, 1/15/23, pg. 28-43,
HBisevac
The Problem

Humanity’s hunger for resources devours over 100 billion tonnes of materials a year, 12 tonnes for each of us –
while people living in well-off countries consume 6 to 10 times as many resources as those in poor ones.1 Yet very few people are aware how
much stuff it takes to support them, or its true cost to the planet. In recent decades, human demand for material goods has exploded – each of
us now consumes 10 times what our own great-grandparents needed to live their thrifty lives a century ago.
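
A quick sanity check on the per-person figure (an illustrative calculation added here, assuming a world population of roughly 8 billion, which the card does not state):
\[ \frac{100 \times 10^{9}\ \text{tonnes per year}}{8 \times 10^{9}\ \text{people}} \approx 12.5\ \text{tonnes per person per year.} \]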

This is far more than the Earth can support in the long run. Over a lifetime, each of us now:

• uses 35,000 tonnes of fresh water (mostly in the form of food);

• causes the loss of 650 tonnes of topsoil (mainly through food demand);

• uses 120 tonnes of pure energy (oil equivalent);

• wastes 13.5 tonnes of food;

• causes the emission of 119 tonnes of often toxic chemicals;

• causes the emission of 350 tonnes of climate-wrecking CO2.2

This is a colossal personal impact which, collectively, is now running way beyond our planet’s capacity
to supply, absorb, cleanse, and renew, as many scientists have warned. The Global Footprint Network calculates that
we are consuming materials equivalent to the output of 1.75 planet Earths each year – and that we overshoot the globe’s renewable resource
budget by July each year.3 The rest of the year is spent running up a deficit that, sooner or later, will be paid for in famine, scarcity, war,
disease, and human and animal suffering. Human consumption of material resources is skyrocketing: it has tripled
from 29 billion tonnes a year in 1972 to 101 billion tonnes in 2021 – and is on track to reach 170 billion tonnes by
2050.4
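
A back-of-envelope reading of the 1.75-Earths figure (my illustrative calculation, not the Global Footprint Network’s published methodology): if annual consumption equals 1.75 times the planet’s yearly regenerative budget, that budget is exhausted after roughly
\[ \frac{365\ \text{days}}{1.75} \approx 209\ \text{days}, \]
i.e. toward the end of July, consistent with the card’s claim that we overshoot the budget by July each year.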

Resources are the mainspring of unprecedented growth in the old world economy and today’s high
material living standards. Their growing scarcity is also one of the triggers for civilisational collapse. Somehow,
we have to reduce our dependency on them.

Water

The world water crisis is already upon us. Two out of every three humans face acute water scarcity for one
month or more a year5 – the actual number is forecast to reach 5.7 billion by mid-century. Globally, rivers, lakes, and wetlands are
dying and drying up, mountain glaciers are shrinking and groundwater reserves are running out in many
countries and entire regions, while the human population and its insatiable demand for water continues to rise
relentlessly, exacerbated by a warming climate which accelerates the hydrological cycle.

All told, humans now use about 4 trillion cubic metres of water a year. In the time that our population has
tripled, our use of water has grown fourfold (see Figure 3.1). This means that the average person now consumes (and wastes)
about 500 tonnes of water annually, around two-thirds of which is used to produce our food while much of the rest goes to make the material
goods we buy and use – from cement to furniture and clothing. Our personal daily use for washing and drinking is relatively small and
is therefore not a reliable guide to our actual water impact on the planet; we must look mainly to our buying habits to reduce the water we
consume. To put our water use in perspective: you will personally consume enough fresh water in your lifetime to
float a battleship.
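
The card’s own figures are internally consistent (an illustrative check added here, again assuming roughly 8 billion people, and noting that one cubic metre of fresh water weighs about one tonne):
\[ \frac{4 \times 10^{12}\ \text{m}^{3}\ \text{per year}}{8 \times 10^{9}\ \text{people}} \approx 500\ \text{m}^{3} \approx 500\ \text{tonnes per person per year.} \]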

[FIGURE OMITTED]

The amount of fresh water on Earth is finite and has been since the planet was formed. The current scarcity is due
mainly to greed, universally bad management, and gross pollution of the resource in almost all societies. The obvious
answer – recycling – is only being adopted, marginally, by states such as Singapore and a handful of cities. Water shortages are
everybody’s problem: while supplies remain adequate in cooler parts of the world such as Europe and North
America, even these will be impacted if dangerous water shortages occur in Asia, the Middle East, and
Africa – and billions of people are forced to flee their home countries. Water shortages also spell food
scarcity and higher world food prices for all.

The global water crisis is the most immediate of the catastrophic risks facing humanity. It is here now,
and will only get worse as the century advances without drastic reform, major changes in food production and urban
planning, and worldwide personal action to save and reuse precious water.

Forests

The world’s forests are dwindling at a rate of 6.6 million hectares a year, and deserts are spreading across 12 million hectares of fertile farmland every year. The UN Food and Agriculture Organization says:

Deforestation and forest degradation continue to take place at alarming rates, which contributes
significantly to the ongoing loss of biodiversity. Since 1990, it is estimated that 420 million hectares of
forest have been lost through conversion to other land uses, although the rate of deforestation has decreased over the
past three decades. Between 2015 and 2020, the rate of deforestation was estimated at 10 million hectares per year, down from 16 million
hectares per year in the 1990s. The area of primary forest worldwide has decreased by over 80 million hectares since 1990.6

Forest loss is far more serious for humans than the mere absence of trees. The clearing and thinning of
forests for wood and paper production is making global heating worse, as depleted forests change from being
carbon absorbers to carbon emitters. It is exacerbating the loss of wild animals, the extinction of many
species, and the collapse of the very ecosystems that support humanity. It is destroying rivers, lakes, and
wetlands through soil erosion and turbid drinking water supplies. The clearance of large areas of tropical rainforest in
the Amazon and Congo basins and south-east Asia is expected to cause significant local and global climatic disruption, including the spread of
deserts (see Chapter 5). The drying
of cleared landscapes can shatter local farming and food production. As
forests dwindle, they also release new diseases into the human population, some of which
become pandemic (see Chapter 7).
Soils

The crisis in the world’s soils is as great as the crisis in water – and is expanding at similar rates as more land is cleared to feed a burgeoning
human population and ‘profligate consumerism’ grows. Recent estimates indicate that 40 per cent of the planet’s land area is
degraded.7 Various scientific studies put the global rate of soil loss at between 36 and 75 billion tonnes a year8 – meaning that an astonishing
5–10 kilograms of soil are lost for every meal you eat. It also represents the loss of around 1 per cent of the world’s farmed soils every year.
Sheffield University researchers estimate that a third of our soils are already gone9 – and over the next half-century, at current rates of
destruction, half the world’s remaining farmland is going to disappear. This will affect the health and survival of all humans and most of the
Earth’s living species.
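
The 5–10 kilograms-per-meal figure follows from the quoted loss range (an illustrative check added here, assuming roughly 8 billion people eating three meals a day, assumptions the card does not state):
\[ \frac{36 \times 10^{12}\ \text{kg per year}}{8 \times 10^{9} \times 3 \times 365\ \text{meals per year}} \approx 4\ \text{kg}, \qquad \frac{75 \times 10^{12}}{8.76 \times 10^{12}} \approx 8.6\ \text{kg}, \]
i.e. roughly 4–9 kilograms per meal, in line with the card’s 5–10 kilogram claim.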

Humanity still relies on agriculture to produce about 95 per cent of our food, but by 2070 we will have only one-third of the farmland we had in
1975 – to furnish twice the amount of food to support a projected 10–11 billion people. As anyone can see, that places our future food security
in extreme jeopardy. A system that destroys itself cannot endure. Even more serious than the shortage of water and the increasingly erratic
climate, the loss of topsoil combined with the spread of deserts is a gravely underestimated threat to our human future – and, so far, is proving
extremely difficult to reverse, or even to slow.10 However, the UN says that a worldwide effort to restore our vanishing soils is urgent and must
be achieved by 2030.11

Oceans

In the oceans, 762 polluted ‘dead zones’ embrace an area of 245,000 square kilometres,12 and 94 per
cent of the world’s fisheries are either maxed out or overfished.13 Acidification (caused by the burning of fossil
fuels) threatens the entire oceanic food chain, from plankton and corals up to large sea life.14 Furthermore,
there are alarming signs that the oceans are giving up their oxygen15 – in other words, their ability to support life –
as well as losing their capacity to absorb humanity’s excessive carbon emissions. Humans are also
dumping around 14 million tonnes of plastic in the oceans every year, along with much other toxic
waste, creating a further source of poisoning in the marine food chains that supply our food.16
Metals

The world currently mines about 10 billion tonnes of metal ores a year, whose 100–150 billion tonnes of
waste cause widespread destruction of landscapes, pollution of water, and harm to the health of
humans and wildlife. By mid-century the world’s reserves of cheap phosphorus and potash, critical to the
global food supply, will run low, endangering food security. While supplies of base metals are deemed adequate to meet
future demand, shortages are already emerging among 35 rare earths and metals of strategic value to the
electronics and renewable energy sectors, such as lithium.17
Overuse

Resource overuse endangers humanity at several levels – through contamination and poisoning of
people and wildlife, and through destruction of the natural systems that support us, especially air, water,
soil, and biodiversity. Throughout history, resource scarcity and population pressure have often led to
war (as was the case with World War II) – indeed, national borders are usually drawn around food-producing
resources.
2AC
T
T-SS---2AC
B. The SSA agrees.
SSA 10, Social Security Administration, Compilation Of The Social Security Laws, “Entitlement to
Hospital Insurance Benefits”, March 2010, Social Security Act §226, 42 U.S.C. 426, HBisevac

Sec. 226. [42 U.S.C. 426] (a) Every individual who—

(1) has attained age 65, and

(2)(A) is entitled to monthly insurance benefits under section 202, would be entitled to those benefits except
that he has not filed an application therefor (or application has not been made for a benefit the entitlement to which for any
individual is a condition of entitlement therefor), or would be entitled to such benefits but for the failure of another individual, who
meets all the criteria of entitlement to monthly insurance benefits, to meet such criteria throughout a month, and, in conformity
with regulations of the Secretary, files an application for hospital insurance benefits under part A of title XVIII,

(B) is a qualified railroad retirement beneficiary, or

(C)(i) would meet the requirements of subparagraph (A) upon filing application for the monthly insurance benefits
involved if medicare qualified government employment (as defined in section 210(p)) were treated as employment (as
defined in section 210(a)) for purposes of this title, and (ii) files an application, in conformity with regulations of the
Secretary, for hospital insurance benefits under part A of title XVIII,

shall be entitled to hospital insurance benefits under part A of title XVIII for each month for which he meets the
condition specified in paragraph (2), beginning with the first month after June 1966 for which he meets the conditions specified in paragraphs
(1) and (2).

C. ‘Social Security’ CURRENTLY is old age, survivors, and disability insurance


LII ’23 [Legal Information Institute; “Social Security Law: An Overview,” page last updated May 22,
2023, https://www.law.cornell.edu/wex/social_security#:~:text=social%20security%20law%3A%20an
%20overview,full%20burden%20of%20such%20occurrences.]
Social Security

social security law: an overview

Social security is designed, as the title suggests, to provide security. To protect individuals from unforeseen catastrophes, the government
spreads certain risks among all members of society so that no single family bears the full burden of such occurrences.

In the United States, the Social Security Program was created in 1935 (42 U.S.C. 401 et seq.) to provide old age,
survivors, and disability insurance benefits to workers and their families. Unlike welfare, social security
benefits are paid to an individual or his or her family at least in part on the basis of that person's
employment record and prior contributions to the system. The program is administered by the Social Security
Administration (SSA). Since the establishment of the Medicare program in 1965, it and Social Security have been closely linked. While
the original act used "Social Security" in a broader sense, including federally funded welfare programs
and unemployment compensation within its scope, and the Medicare legislation took the form of amendments to that act,
current usage associates the phrase with old age, survivors, and disability insurance.

The Federal Old Age, Survivors, and Disability Insurance (OASDI) program pays out monthly benefits to
retired people, to families whose wage earner has died, and to workers unemployed due to sickness or accident. Workers qualify for its
protection by having been employed for a minimum amount of time and by having made contributions to the program. Once an individual has
qualified for protection, certain other family members are, as well. Financial need is not a requirement but continuing to earn substantial sums
is inconsistent with eligibility for certain benefits (disability insurance) and can reduce the benefit amount with others (including retirement or
survivors benefits).
2. C/I---Social Security is any program established by the SSA---only expands the topic
to health programs.
Law Dictionary, no date [Law Dictionary; no date given; Online legal dictionary feat. Black’s Law
Dictionary, 2nd ed. “Social Security,” https://thelawdictionary.org/social-security/]

SOCIAL SECURITY Definition & Legal Meaning


Definition & Citations:

A program that has been mandated by the Social Security Act in 1935 that provides for the elderly,
disabled and provides health insurance to certain members of a community.

Or it doesn’t matter because expand means to increase its scope.


Fahey ’19 [Eugene; October 22; Judge on the Court of Appeals of New York, dissenting; Westlaw,
“Adirondack Wild: Friends of the Forest Pres. v. New York State Adirondack Park Agency,” 34 N.Y.3d
184]

The Rivers Act does not define the word “expanded.” “In
the absence of a statutory definition, ‘we construe words of
ordinary import with their usual and commonly understood meaning, and in that connection have
regarded dictionary definitions as useful guideposts in determining the meaning of a word or phrase’ ”
(Yaniveth R. v LTD Realty Co., 27 NY3d 186, 192 [2016], quoting Rosner v Metropolitan Prop. & Liab. Ins. Co., 96 NY2d 475, 479-480 [2001]).
One ordinary meaning of “expand,” and the one relevant here, is “to increase the extent, number,
volume, or scope of” (Merriam-Webster Online Dictionary, expand [http://www.merriam-webster.com/dictionary/expand]; see
also Webster's New Collegiate Dictionary 402 [1977] [“to increase the extent, number, volume, or scope of”]). Accordingly, for DEC's
determination that motor vehicle use on the road would not “expand” to be rational, there must be some basis in the record upon which DEC
could reasonably conclude that once the road is opened to the public, motor vehicle use on the road would not increase in extent, number,
volume, or scope.
SSA PIC
SSA PIC---2AC
Social Security means any benefits provided under the SSA or the Railroad Retirement
Act

Disability Benefit Plan ’20 [Summary Plan Description of the California Correctional Peace
Officers Association Benefit Trust Fund’s DISABILITY BENEFIT PLAN (Plan 502); December;
effective June 1, 2021;
https://web.archive.org/web/20230128091730/https://www.ccpoabtf.org/documents/DBP-
SPD.pdf]

1.46 “Social Security” means the benefits provided under the United States Social Security Act,
the Railroad Retirement Act, or any similar plan provided under the laws of the United States, or
any plan provided as an alternative to such plans.
Econ DA
Econ DA---2AC
A. Recession coming now
Fox 10/23 (Matthew Fox | Matt covers macro research and stock market news at Markets Insider. He
contributes to the markets and investing teams. Matt joined Insider in April 2020. | “A recession is about to hit the
US economy and these 3 warning signs are defying the consensus view, Raymond James says” | Business Insider |
October 23, 2023 | https://www.businessinsider.com/recession-outlook-us-economy-rising-risks-3-warning-signs-
gdp-2023-10 | DOA: 10/26/23 | Saaπ)

Don't be fooled by a resilient consumer as a recession is poised to hit the US economy within the next nine months,
Raymond James' chief investment officer, Larry Adam, said in a recent note. Wall Street forecasts of a recession have been getting pushed
further and further out into the future as consumers continue to show solid spending habits and are overall in good financial shape. But Adam
said a convergence of several risk factors suggested the economy would be unable to avoid a mild recession within the next year. These are the
three warning signs he said he was monitoring ahead of a potential recession.
1. Growing headwinds for consumers
From the
resumption of student-loan payments to elevated borrowing costs, there are a lot of risks that everyday consumers are forced to navigate.
Tailwinds that drove strong consumer spending since the pandemic are ending, and excess savings have
been nearly depleted. "Sure, consumers have jobs and income right now, but their ability to continue to consume
indiscriminately is coming to an end," Adam said. He pointed to recent comments from Bank of America CEO Brian Moynihan,
who said consumer spending was running at levels consistent with a low-inflation, low-growth economy that was prevalent before the
pandemic. Finally, growing credit-card debt and rising delinquencies suggest more Americans are starting to fall behind on debt obligations.
"While we are not suggesting that consumption is going to fall off a cliff, a moderation in spending should be expected," Adam said. 2. High
borrowing costs High borrowing costs for cars, homes, and credit cards pose a threat to economic
growth, especially if they persist for longer than expected. Adam said the housing market's ongoing affordability crisis suggested
residential real-estate activity would stay "frozen," and that is weighing on homebuilder confidence, which has declined to
its lowest level since January. Consumers are navigating this scenario by utilizing adjustable-rate mortgages, which now make up a nearly 10%
share of new home loans. The higher interest rates are also impacting business capital-expenditure plans. "A composite of regional Fed
capex surveys shows that business capex spending plans over the next six months have fallen to their second
lowest level in the post-COVID era," Adam said.
3. Macro risks are building
Risks surrounding the economy and the stock market
are rising — and quickly. Elevated gas prices, conflicts in the Middle East, and souring consumer sentiment are
just a few of the lingering threats. The Conference Board's Expectations Index tumbled to its lowest level in four
months, "a level that historically signals a recession within the next year," Adam said. The index measures consumers'
attitudes toward the short-term outlook for the economy and jobs market. All of these risks should ultimately weigh on consumer spending
habits as the crucial holiday season approaches. "Add
in the possible disruptions from the ongoing autoworkers
strikes and a potential temporary federal government shutdown in mid-November, and growth could look
considerably weaker in the months ahead," Adam said.

D. Every one of our internal links turn growth. R&D innovation, health improvements,
etc.
David E. Bloom et al. 18, PhD in Economics and Demography from Princeton University, MA in
Economics from Princeton University, BSc in Industrial and Labor Relations at New York State School of
Industrial and Labor Relations from Cornell University. Michael Kuhn, PhD in economics from University
of Rostock. Klaus Prettner, Ph.D. in Economics from University of Vienna. “Health and Economic
Growth,” https://www.iza.org/publications/dp/11939/health-and-economic-growth, November 2018,
DOA: 9/23/2023, SAoki
The Channels by Which Health Affects Economic Growth in Developed Countries

Despite some remaining controversy, health improvements are perceived to be an important component of economic development in general and of the takeoff to sustained economic growth in particular. Much more skepticism surrounds the role of health as
a driver of economic growth in developed economies. Indeed, many view the high costs of advanced health-care systems as having the potential to deter growth. The debate centers on two
main concerns. First, given that longevity improvements in developed countries are concentrated among the elderly (see Breyer, Costa-Font, & Felder, 2010; Eggleston & Fuchs, 2012), further
expansions of longevity may lower the economic support ratio (i.e., the ratio of effective labor supply to the dependent population) and thereby lead to a decline in per capita consumption
levels. A related concern is that productivity gains insufficiently offset the elderly’s high medical costs, which therefore impose a drag on economic growth. Second, with health expenditure
shares in many OECD countries approaching or past the 10% mark (OECD, 2017), the absorption of productive resources by “oversized” health-care sectors is feared to compromise economic
performance (as debated in Pauly & Saxena, 2012). Particular concerns relate to health insurance as a source of inefficiency and to medical progress as a key cost driver (Chandra & Skinner,
2012).
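A minimal formalization of the "economic support ratio" invoked above, offered as an illustrative sketch using the card's own definition (it is not part of the Bloom, Kuhn, and Prettner text): SR = L_eff / N_dep, where L_eff is effective labor supply and N_dep is the dependent population. If longevity gains accrue mostly after retirement, N_dep rises while L_eff does not, so SR falls; holding output per worker y fixed and assuming all output is consumed, per capita consumption c = y · L_eff / (L_eff + N_dep) falls with it. The evidence below argues that savings, R&D, and labor-supply responses offset this arithmetic.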

This section begins by describing how health affects human capital accumulation, overall investment, and R&D-based economic growth in developed countries. The literature, which mostly
treats health and longevity improvements as exogenous, also sheds light on whether aging stifles economic growth. Blanchard (1985) undertook the first attempts to model the effects of an
increase in life expectancy on economic performance in developed economies, replacing the representative agent assumption of a standard neoclassical growth model with an overlapping generations structure in which individuals face a constant risk of death. An increase in life expectancy raises aggregate savings in this setting and therefore, according to the mechanisms in the neoclassical growth model, raises the economy’s growth rate during the transition to the steady state.4 However, life expectancy itself has no effect on the long-run growth rate at the steady state.5

Subsequently, endogenous growth models based on learning-by-doing spillovers (Romer, 1986) replaced the underlying neoclassical framework. Articles utilizing this framework include Reinhart (1999), Heijdra and Mierau (2011), and Mierau and Turnovsky (2014b).6 The baseline conclusion from this literature is that increasing life expectancy
positively affects long-run economic growth: aggregate savings rise with longevity and aggregate
returns to capital accumulation are non-diminishing due to knowledge spillovers. Consequently, capital
accumulation alone can sustain economic growth. While this effect is derived analytically for age-independent mortality, it can only be illustrated
numerically in case of age-dependent mortality. The argument presented so far refers to the impact of mortality reductions on capital accumulation as a supply-side driver of economic growth.

By contrast, Kuhn and Prettner (2018) explore mortality reductions’ impact on consumption growth as a demand-side driver. In overlapping generations economies, a reduction in
the average death rate leads to greater consumption growth if the average consumption of those who
die exceeds per capita consumption across the population (Kuhn & Prettner, 2018). This condition is typically satisfied if consumption increases
with age up to age groups that exhibit significant mortality. Kuhn and Prettner (2018) employ data from the National Transfer Accounts to show that this holds for countries such as Finland,
Germany, Japan, and the United States.
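The mortality–consumption condition in the Kuhn and Prettner (2018) passage can be made concrete with a one-line accounting sketch (an editor's illustration, not the authors' derivation). Let c_bar be per capita consumption, c_d the average consumption of those who die, and mu the average death rate. Deaths alone move per capita consumption of the surviving population to

(c_bar − mu·c_d) / (1 − mu) ≈ c_bar + mu·(c_bar − c_d),

so mortality drags average consumption down exactly when c_d > c_bar. Reducing mu removes part of that drag, which is the sense in which a falling average death rate raises consumption growth when decedents consume more than the population average.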

Models of learning-by-doing spillovers or human capital accumulation as exclusive drivers of long-run growth do not leave room for technological progress to explain economic growth.
Prettner (2013) introduces an overlapping generations structure with age-independent mortality into the research and development (R&D)-driven (semi-)endogenous growth models pioneered by Romer (1990) and Jones (1995). Prettner shows that increasing life expectancy has an unambiguously positive effect on
technological progress and on long-run economic growth in the cases of both endogenous and semi-
endogenous growth. The central mechanism is that increasing life expectancy results in higher aggregate savings,
which puts downward pressure on the equilibrium interest rate. This in turn raises the discounted stream
of income derived by investing in successful R&D projects. Consequently, the incentive to carry out R&D
increases, which boosts technological progress and long-run economic growth.
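The interest-rate channel in the Prettner (2013) passage can be illustrated with simple perpetuity arithmetic (an editor's sketch using hypothetical numbers, not figures from the card). The present value of a patent paying a constant profit flow pi forever is PV = pi / r. If higher aggregate savings push the equilibrium interest rate from r = 5% to r = 4%, the value of the same flow rises from 20·pi to 25·pi, a 25 percent increase in the payoff to successful R&D; cheaper capital therefore strengthens the incentive to innovate, boosting technological progress.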

In another strand of the R&D-based growth literature, Strulik, Prettner, and Prskawetz (2013) show that the aggregate human capital
stock, rather than the size of the workforce allocated to R&D, is what matters for long-run economic
growth. In older models of this type, lower fertility always implies a lower growth rate of the economy. However, Strulik et al. (2013) show that increased schooling investments
accompany lower fertility through the quality-quantity substitution at the household level (see Becker & Lewis, 1973). This effect tends to be strong enough to overturn falling fertility’s
negative effect on aggregate human capital accumulation such that economic growth rises as fertility falls. Drawing from Bloom and Canning (2005), Prettner, Bloom, and Strulik (2013) show
that the human capital dimension in Strulik et al.’s (2013) model consists of both education and the stock of health. Baldanzi et al. (2017) use this insight to construct a fully-fledged dynamic general equilibrium growth model in which parental investments endogenously determine health and education. The complementarity between health and
education is crucial to raise the human capital level and therefore the central input into the R&D sector.
Consequently, long-run economic growth rises with health investments .

Another important aspect is the impact of health as transmitted through changes in labor supply. Indeed, evidence is mounting that the life years
gained with greater longevity are increasingly spent in good health (e.g., Sanderson & Scherbov, 2010), implying that, in principle,
working lives can be extended so as to avoid an increase in old-age dependency (e.g., Loichinger & Weber, 2016).7 The
extent to which increases in healthy life span translate into the elderly’s higher labor force participation varies strongly across countries, depending on the retirement incentives from the

pension scheme, among other factors (Milligan & Wise, 2015). The relationship also varies substantially across subpopulations (Dudel & Myrskylä, 2017). Theoretical models
reflect the mixed empirical evidence: While Prettner and Canning (2014) and Chen and Lau (2016) show that an increase in longevity
should lead to an increase in labor supply and savings, caveats arise in the presence of social security and

in economies where knowledge spillovers (Romer, 1986) or R&D (Romer, 1990) drive economic growth.8 Considering the role of
a pay-as-you-go pension scheme within a Romer (1986) economy, Heijdra and Mierau (2011) show that declining mortality promotes economic

growth due to a sufficient saving response. But this effect is muted under a defined benefits – as opposed to a defined contributions – pension scheme.
Surprisingly, the growth-promoting effects are also dampened if the retirement age is raised, due to a reduction in retirement savings (and, thus, capital accumulation) in response to an
extended working life. A similar effect is present in Kuhn and Prettner (2016), who consider a Romer (1990) economy: if health improvements lead to a decline in morbidity that dominates the
decline in mortality, then labor participation is boosted and retirement savings are reduced. The resulting increase in the interest rate dampens R&D activity and economic growth. While
certainly welcome on many grounds, health-related increases in the working life play a surprisingly ambiguous role for economic growth.

The models reviewed so far help allay some of the concerns that aging, in the sense of rising longevity, may undermine economic growth. However, a second important policy concern is that
oversized health-care sectors and medical progress may place an undue burden on economic growth.

Kuhn and Prettner (2016) introduce an explicit health-care sector into an overlapping generations version of Romer’s (1990) R&D-based growth model. The health-care
sector employs labor to raise the population’s health level along three dimensions: longer life
expectancy, higher worker productivity, and later exit from the labor market. Because providing health
care absorbs labor from the production and R&D sectors, and due to diminishing returns of health care in lowering mortality, an interior size of
the health-care sector exists that maximizes long-run economic growth.9 Kuhn and Prettner (2016) show that an increase in the size of the health-care sector beyond its growth-maximizing level constitutes a Pareto improvement that makes all generations, young and old, better off. This result is consistent with rational individuals being willing to spend an increasing
fraction of a growing income on health care such that economic development goes hand in hand with an
increasing share of the health-care sector in the aggregate economy (Hall & Jones, 2007).
Schneider and Winkler (2016) and Frankovic, Kuhn, and Wrzaczek (2017) study the impact of medical innovations on macroeconomic performance. Comparing balanced growth paths for a

Romer (1986) economy, Schneider and Winkler (2016) show that a life-saving technology enhances economic growth as long as the absorption of
additional labor into the health-care sector does not lead to reduced spillovers in the production sector. Frankovic et al. (2017) study the impact of life-saving medical innovations within a

calibrated overlapping generations model of the US economy.10 By rendering health care more effective in lowering mortality,
especially among the elderly population, medical innovations increase the demand for health care and,
thereby, lead to an expansion of the health-care sector (as evidenced, e.g., in Cutler & Huckman, 2003; Roham, Gabrielyan, Archer, Grignon, & Spencer,

2014; Wong, Wouterse, Slobbe, Boshuizen, & Polder, 2012). However, for the following reasons this does not lower per capita GDP at the general

equilibrium: the longevity increase that comes with more effective health care and the prospect of
purchasing such care in the future trigger a strong increase in saving (as evidenced, e.g., in Bloom, Canning, & Graham, 2003; De Nardi,
French, & Jones, 2010). The resulting expansion of the capital stock overcompensates for the decline in the

economic support ratio that comes with most survival gains accruing after retirement. Moreover, the sectoral shift
toward the relatively labor-intensive production of health care, combined with increasing wages, induces price inflation in the health-care sector. This general equilibrium effect strongly
dampens the initial increase in the demand for health care.

E. Lower drug prices are a holistic boon for the economy, even assuming pharma profit
declines.
Steve Lippman 7, Vice President of Trillium Asset Management, a private equity investment firm;
Daniel Rosan, CFO of Ascidian Therapeutics, Former Head of Investor Relations and R&D Operations;
Adam Seitchik, President of New Summit Investments, a Venture Capital Firm, 5/3/2007, "Why Lower
Drug Prices Benefit Institutional Investors: an application of universal ownership theory," Corporate
Governance: An International Review, Vol. 15, Issue 3, pp. 455-466, https://doi.org/10.1111/j.1467-
8683.2007.00578.x

In the first instance, lower drug prices could reduce drug manufacturers’ profits and thus reduce their share prices. However some analysts
suggest that increased volumes of drug sales resulting from lower prices would mitigate reduced margins
and could even boost drug manufacturers’ profits. Still, assuming drug profits are impinged by lower prices,
offsetting gains could accrue elsewhere in the economy from savings to companies, governments, and
individuals purchasing prescription drugs. High healthcare costs are a major drag on the profitability of
many U.S. companies, and savings from lower drug prices would benefit firms providing employee healthcare
coverage. More affordable prescription drugs would also provide individual consumers with additional
discretionary income to spend or invest in other sectors of the economy. Greater access to prescription drugs could
reduce employee absenteeism and improve health and productivity. Even assuming a zero-sum outcome, where there are
no net benefits of lower drug prices, any lost profits from drug makers represent gains for other sectors
of the economy. As such, even if pharmaceutical profits decline, lower drug prices would have offsetting
gains for a broad portfolio representing the entire economy – exactly the type of portfolio most institutional investors hold.
However, lower prescription drug prices might not simply shift profits around within an investment portfolio. There would likely be additional
dynamic implications, leading to net financial benefits for investments held by universal owners. Lower
drug prices might lead to
increased utilization of necessary drugs, and a healthier and more productive U.S. workforce. And despite
pharmaceutical company arguments, we found little evidence to suggest lower drug prices would reduce investment
in research & development. Research has actually been declining as a percentage of sales even as drug prices have
dramatically increased. (Instead, spending on marketing, particularly direct-to-consumer marketing, has increased as prices have
increased.) In short, the impact of lower drug prices on investors heavily exposed to pharmaceuticals could be negative, with some lesser
offsetting benefits from the increased demand for drugs as a result of lower prices. However, for universal institutional investors broadly
exposed to all market sectors, the impact of lower drug prices is likely to be neutral at worst, and could very well
be positive on a net basis. In addition, institutions such as public pension funds that purchase health
care services for their beneficiaries may see additional benefits from lower drug prices. For the majority of
institutional investors, then, the downside risk of pursuing lower drug prices is minimal and likely to be exceeded by the potential upside. These
findings suggest that investor action to reduce prescription drug prices is likely to have a neutral or positive impact on their overall portfolios
and represents a responsible exercise of fiduciary duty.
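The card's universal-ownership arithmetic can be made concrete with a small numerical sketch. The sector weights, profit growth rates, and shock size below are hypothetical assumptions chosen purely for illustration; this is an editor's toy model, not the authors' analysis.

# Stylized "universal owner" portfolio: lower drug prices cut pharma profit growth,
# and in the zero-sum case the same dollars reappear as savings for drug purchasers
# elsewhere in the portfolio. All numbers are hypothetical.

sector_weights = {"pharma": 0.10, "rest_of_economy": 0.90}    # portfolio weights
baseline_growth = {"pharma": 0.06, "rest_of_economy": 0.05}   # profit growth rates

pharma_hit = -0.02  # assumed hit to pharma profit growth from lower prices
# Scale the offset by relative sector size so the dollar transfer nets to zero.
offset = -pharma_hit * sector_weights["pharma"] / sector_weights["rest_of_economy"]

def portfolio_growth(d_pharma=0.0, d_rest=0.0):
    """Value-weighted profit growth of the whole portfolio."""
    return (sector_weights["pharma"] * (baseline_growth["pharma"] + d_pharma)
            + sector_weights["rest_of_economy"] * (baseline_growth["rest_of_economy"] + d_rest))

print(round(portfolio_growth(), 4))                            # 0.051  baseline
print(round(portfolio_growth(pharma_hit, offset), 4))          # 0.051  zero-sum: portfolio unchanged
print(round(portfolio_growth(pharma_hit, offset + 0.002), 4))  # 0.0528 with modest dynamic gains

In the zero-sum case the broad portfolio is unaffected even though the pharma sleeve loses, and any net productivity or utilization gains show up as a positive portfolio-level effect, which is the card's claim about diversified institutional investors.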

Economic decline won’t cause war -- diversionary theory is wrong


Segev et al. 22 (Elad Segev, Associate Professor in International Communication at Tel Aviv University; Atsushi Tago, Professor of International Relations at the School of Political Science and Economics, Waseda University; Kohei Watanabe, PhD in Social Research Methodology from the LSE and MA in Political Science from Central European University. “Could leaders deflect from political scandals? Cross-national experiments on diversionary action in Israel and Japan,” International Interactions, 3/15/22 – gm)

The diversionary theory of war is one of the best-known conflict initiation theories focusing on democratic leaders’ incentives.
According to the theory, democratic leaders who face greater electoral challenges, either due to political scandals or an
economic downturn, are more likely to choose provocative foreign policies and seek to lead the country
into diplomatic crises, in hopes of inciting nationalistic sentiments that will boost their approval ratings
via the so-called “rally around the flag” effect (e.g. Gaines 2002; Hetherington and Nelson 2003; Mueller 1973).
Despite the intuitive appeal of this theory, empirical studies have been largely unable to find
consistent evidence to corroborate the purported theoretical mechanisms. Findings from observational
studies have been quite mixed. The fact that a diverse set of findings have been reported from observational studies suggests that
unobservable confounders arising from strategic interactions greatly hinder our ability to tease out the causal effect of electoral hardship on
conflict behaviors.
In this research note, we claim that the key assumption of the theory does not work as expected. That is, a political
leader cannot divert attention from his/her political scandals by emphasizing a foreign threat and
alerting the general public that the country may go to war against an enemy. Although the assumptions that the
threat or use of force is salient and that an acute enemy threat would create a rally-around-the-flag effect are common, they have rarely
been tested at a micro-level in an experimental setting. Our team conducted a cross-national experiment to find out whether and how
political leaders could divert the public’s attention away from their political scandals.
We selected Japan and Israel as fields of the experiment. Both are comparable parliamentary democracies that have witnessed a series of
serious political scandals. Moreover, the general Japanese and Israeli public could plausibly expect a hawkish national security policy against the
potentially nuclear-armed enemy states of North Korea and Iran, respectively, both of which are widely represented as “evil” and “mad”
enemies. While we acknowledge that Japan and Israel have significantly different military cultures and distinct records of militarized
interstate disputes, we consider them to be interesting, important, and comparable cases that could generate significant findings in testing the
above-mentioned assumptions of diversionary war theory.
Our contribution is twofold. First, we confirm that, in both Japan and Israel, diverting public attention
from salient political scandals would fail even if a political leader emphasizes the enemy threat or alerts
the public to potential escalatory moves against the enemy. In particular, the most escalatory hawkish policy—a
preemptive move—would not help the government hide its political scandals from the general public.
Second, we found that, when we showed a (mock) news article predicting the prime minister’s hawkish policy (i.e. an escalation against a
potentially nuclear-armed enemy), this
would not directly lead to greater support for the prime minister
compared to the mere emphasis on the threat level posed by the enemy. Just warning of an imminent threat from
North Korea or Iran proves critical and sufficient to induce political support from the general public; we call this threat-induced political support
MedTech DA
MedTech DA---2AC
Biotech post COVID overwhelms any link

Alexandra Zemp et al 21 (Alexandra Zemp is a leader in McKinsey’s Life Sciences Practice, with extensive experience in organizational transformations and leading projects, 04/30/21, accessed 05/25/22, “What’s ahead for biotech: Another wave
or low tide?”, https://www.mckinsey.com/industries/life-sciences/our-insights/whats-ahead-for-
biotech-another-wave-or-low-tide)DShah

Unlike most industries in these extraordinarily challenging times, biotech is experiencing a high. Executives in many
other sectors are becoming more pessimistic about the outlook for their businesses as the global pandemic continues to spread.1
But the search to understand and find treatment or preventive solutions to COVID-19 has focused intense government, media, and public attention on science and medicine, reinforcing the perception that biotech
acquisitions and partnerships represent a good investment. In an effort to understand worldwide biotech
financing in the context of the COVID-19 crisis, McKinsey analyzed the sector’s financial performance and interviewed 20 C-level
executives from small and midsize biotechs and venture-capital (VC) firms. The pandemic has had an enormous financial impact on
many sectors, but biotech has weathered the storm: after a brief downturn early in the crisis, it recovered
quickly (Exhibit 1). Between January 2020 and January 2021, the average share price for European and US
biotechs increased at more than twice the rate of the S&P 500, and Chinese biotechs performed
more than six times better, with their average share price more than doubling in a year. Overall,
biotech is outperforming its sister industry, pharmaceuticals, as well as many household-name consumer-goods and
technology companies. With acquisitions, partnerships, IPOs, and fundraising still increasing, biotech’s star has, if
anything, risen higher than it was before the pandemic. The industry’s response to the crisis, its record
of innovation, and its reputation as a safe haven for investment have all served it well. But whether biotech can sustain this
performance is open to question. This article looks at the industry’s record of growth, its resilience during the global
pandemic, and the factors that could determine whether the biotech wave continues. Between 2019 and 2020, biotech saw
double-digit annual growth in fundraising from VCs and deals such as partnerships,
codevelopments, and joint ventures. It also saw triple-digit growth in IPOs (Exhibit 2).
SSA DA
SSA DA---2AC
No organized crime impact

Vanda Felbab-Brown, 18, senior fellow in the Center for Security, Strategy, and Technology
in the Foreign Policy program at Brookings, "The Crime–Terror Nexus and its Fallacies," Oxford
Handbooks Online,
https://www.oxfordhandbooks.com/view/10.1093/oxfordhb/9780198732914.001.0001/oxfordhb-
9780198732914-e-22)SEM

Not all organized crime groups are equally willing and capable buddies of terrorists. There is a
considerable variation in the capacity of organized crime groups to penetrate (p. 375) new
territories or domains, as Frederico Varese (2011) has shown. Few criminal groups are polycrime enterprises:
smuggling cocaine is not the same as smuggling fissile material. Although it is frequently
suggested that organized crime groups will easily make alliances or without restraint cooperate with terrorist
and militant groups, the relationship between the two actors is often fraught and violent (see e.g. Farah 2013).
Most organized crime groups are not simply blind profit maximizers, but also they weigh risks —
including the risk of cooperating with militant groups and thus drawing a far different level of
scrutiny and repression from law enforcement.
Ukraine DA
Ukraine DA---2AC
Johnson is against additional aid---dooms it in the House OR waters it down to nothing
on passage.
Suter 10/25 (Tara Suter | Tara Suter is a breaking news reporter for The Hill. She graduated from The George
Washington University with a degree in journalism and mass communications. While at GW, Tara worked in
various roles of increasing importance on the student newspaper, The GW Hatchet, including events editor and
research assistant. She has previously worked as an editorial intern on WGBH’s FRONTLINE program and a
journalism intern at OpenSecrets. Tara is originally from Washington State. | Where House Speaker Mike Johnson
stands on Ukraine, Israel | The Hill | October 25, 2023 | https://thehill.com/homenews/house/4274682-where-
gop-speaker-nominee-mike-johnson-stands-on-ukraine-israel/ | DOA: 10/26/23 | Saaπ)
President Biden announced the sending of a budget request to Congress requesting aid for both Israel and Ukraine in a speech last Thursday.
The request is expected to be about $100 billion, with a large portion of the funds for Ukraine. ¶ Johnson
has shown some resistance
to more funding for Ukraine in the past but seems to wholeheartedly support Israel. ¶ Here’s what you need to know about his
history of views on both conflicts:¶ Ukraine¶ Johnson made a strong statement in support of Ukraine in its effort to fight back against Russia in the wake of the
invasion in February 2022.¶ “We should impose debilitating sanctions on Russia’s economic interests,” Johnson said in the statement, posted to
X, the platform then known as Twitter. “We should return to robust American energy production to provide greater stability and security here
and for our European allies. We should exclude Russia from global commerce and international institutions. Even though the best time to take
these actions has passed, we must act decisively.”¶ “America’s prayers remain with the Ukrainian people,” Johnson continued.¶ In April 2022,
he voted for The Ukraine Democracy Defense Lend-Lease Act of 2022, a bill that aimed to ease the process for the U.S. to send military aid to
Ukraine. The bill was later signed by President Biden and became law. ¶ However, in
recent times, he has taken a skeptical
stance toward aid for Ukraine. He voted against two different appropriations bills that provided aid to
Ukraine, one in 2022 and another last month.¶ “American taxpayers have sent over $100 billion in aid to Ukraine in the
last year,” Johnson said in an X post in February. “They deserve to know if the Ukrainian government is being entirely
forthcoming and transparent about the use of this massive sum of taxpayer resources.” ¶ After gaining the
gavel, Johnson was asked whether he supports additional aid to Ukraine.¶ “We all do…we are going to have
conditions on that, so we’re working through,” Johnson said while walking through the Capitol, in a clip posted Wednesday to X, the
platform formerly known as Twitter.¶ “We want accountability, and we want objectives that are clear from the White House,“ Johnson later
said in response to a question about what the conditions would be.

Biden’s PC isn’t key – it’s entirely up to the House Speaker election, which he cannot influence
Lexie Schapitl 10/6 – “The fate of Ukraine funding lies in the balance with speaker's race”, 2023,
https://www.npr.org/2023/10/06/1203961930/ukraine-funding-congress-speakers-race

The next speaker of the House will have the power to decide what policies come up for a vote in the House of
Representatives, leaving funding for U.S. involvement in Ukraine in the balance.

Last week, former Speaker Kevin McCarthy, R-Calif., made a last-minute decision to move ahead with a short-term government spending bill
without the $24 billion for military, humanitarian and economic aid for Ukraine requested by President Biden. That move avoided an impending
government shutdown but it may have doomed any future funding.

Biden is distracted and won’t push aid.


Amna Nawaz 10-9, Co-Anchor of PBS newshour, 10/9/2023, “Tamara Keith and Amy Walter on
pressure to elect House speaker after attack in Israel,” https://www.pbs.org/newshour/show/tamara-
keith-and-amy-walter-on-pressure-to-elect-house-speaker-after-attack-in-israel [Person speaking left for
added clarification]
Tam, you heard John Kirby tell Geoff earlier that he's very optimistic, feels very good that the White
House and the administration is
prepared to handle all of these multiple fronts, right, the war in Ukraine, countering China, and now supporting Israel.

But what does your reporting show? Do they have both the political capital and the consensus to do all that?
Tamara Keith:

What's fascinating is, President Biden last week announced that he was going to give a major address on Ukraine to try to make the case to the
American people that this is something that they should still care about and should still be invested in.

Now he may have to give a major address on Israel and Ukraine. And as I was looking into sort of the approach to that
speech, I talked to a number of allies of the White House who said that President Biden hadn't really made the case to
the American people.

Like, he had sort of been quiet on Ukraine for a domestic audience and that that was a mistake. Well, now you have yet
another front here. One thing that is important to point out, a big difference between Iraq and Afghanistan is, there are
not U.S. boots on the ground.
Amna Nawaz:

Yes.

Amy Walter:

Right.

Tamara Keith:

And that is a very big difference and something that the White House hasn't exactly driven home as part of
their pitch.

PC fails unless Biden’s popular with the public.


MS 21 – Meeting Street, a polling organization, 5/7/2021, “Presidential Approval Ratings At 100 Days:
How President Biden’s 100th Day Approval Ratings Compare To His Predecessors, And What It Might
Mean For The Midterms,” https://meetingst.com/presidential-approval-ratings-at-100-days/

Since then, the first 100 days of a President’s administration have been seen as an opportunity for them to pursue their top priorities and set a legislative tone for what they hope to achieve. Presidents are typically aided in this by a honeymoon period of high approval ratings, which affords them political capital. The higher their approval rating, the more political capital they have to wield over Congress.
While the sheer output of Roosevelt’s first 100 days will be difficult to match, Presidents since have been similarly ambitious, and President
Biden is no exception. During Biden’s first 100 days, Congress passed a massive $1.9 trillion spending package his administration has called “the
most progressive piece of legislation in history.”2 Now as President Biden reaches his hundredth day and turns his attention to even
larger spending packages, his approval rating will be watched closely, not only as a reflection of the popularity of what he’s done, but also as a rough gauge of how much political capital he has left in the tank.

So far, Biden’s approval rating isn’t as lofty as his legislative aspirations. According to Gallup, which has been conducting polling on Presidents’ approval ratings since the Second World War, President Biden’s approval rating among adults 100 days into his term stands at just 57%. His underwhelming honeymoon period ranks toward the bottom historically. While 57% may seem like a decent
rating, Presidents’ first few months tend to be high water marks.

Winners-win.
Paul Kane 21, Senior Congressional Correspondent and Columnist; Washington Post, 7/24/2021, “Day-
to-day, Biden’s Agenda Looks Rocky. But Congressional Democrats Say Things Are Far Rosier If You Take
the Long View,” https://www.washingtonpost.com/powerpost/biden-agenda-democrats-congress/
2021/07/24/83b776be-ebc0-11eb-ba5d-55d3b5ffcaf1_story.html

There is, so far at least, little fear that Democrats are spreading themselves too thin by eschewing the
traditional practice of focusing on a handful of domestic policy issues in the first two years of an
administration. “Political momentum and political capital is like a muscle. The more you exercise it, the
more of it you have. It is not like a finite resource that you can run out of if you spend too much of it.
What happens is that if we do a lot of positive things, then we’ve got more political clout to do even
more positive things,” Sen. Brian Schatz (D-Hawaii) said.

The plan generates the necessary support.


Sayed and Johnson 21 – Abdul El-Sayed is the Towsley Foundation Policymaker in Residence at the University of Michigan; Micah Johnson is a resident physician at Brigham and Women’s Hospital. Abdul El-Sayed and Micah Johnson, 2021, Medicare for All: A Citizen’s Guide, Chapter 5: How to Pay for It.

Public support for M4A is high. A nationally representative Pew Research study found that most Americans
agree “it is the responsibility of the federal government to make sure all Americans have health care
coverage.”1 And a majority of Americans support M4A as a way to achieve this goal. In March 2018, the Kaiser
Family Foundation asked 1,212 American adults, “Do you favor or oppose having a national health plan, or Medicare-for-all, in which all
Americans would get their insurance from a single government plan?” Among all respondents, 59 percent were in favor and 38
percent were opposed.2

Medicare expansion is publicly supported


KFF ND (Kaiser Family Foundation | “Public Opinion on Single-Payer, National Health Plans, and Expanding
Access to Medicare Coverage” | KFF | No Date | https://www.kff.org/slideshow/public-opinion-on-single-payer-
national-health-plans-and-expanding-access-to-medicare-coverage/ | DOA: 10/27/23 | Saaπ)

KFF polling finds more Democrats and Democratic-leaning independents would prefer voting for a candidate who wants to
build on the ACA in order to expand coverage and reduce costs rather than replace the ACA with a national Medicare-for-all
plan (Figure 12). Additionally, KFF polling has found broader public support for more incremental changes to expand the public health
insurance program in this country including proposals that expand the role of public programs like
Medicare and Medicaid (Figure 13). And while partisans are divided on a Medicare-for-all national health plan, there is robust
support among Democrats, and even support among four in ten Republicans, for a government-run
health plan, sometimes called a public option (Figure 14). Notably, the public does not perceive major differences in how a
public option or a Medicare-for-all plan would impact taxes and personal health care costs. However, there are some differences in
perceptions of how the proposals would impact those with private health insurance coverage (Figure 15).
KFF polling in October 2020 finds about half of Americans support both a Medicare-for-all plan and a public option (Figure 16). So while the
general idea of a national health plan (whether accomplished through an expansion of Medicare or some other way)
may enjoy fairly broad support in the abstract, it remains unclear how this issue will play out in the 2020 election and beyond.

No war---Ukraine is fine on their own and Russia is weak.


Samuel Charap 23, Senior Political Scientist at the RAND Corporation and served on Policy Planning
Staff of the U.S Department of State, 10/3/2023, “Rightsizing the Russia Threat,”
https://www.foreignaffairs.com/eastern-europe-and-former-soviet-union/rightsizing-russia-threat
The trouble with seeing Putin as a maximalist or a génocidaire is that it ignores his inability to be either one of
those things—unless he resorts to use of weapons of mass destruction. When Russia’s conventional military was at the peak of its
power at the start of the war, it was incapable of taking control of any major Ukrainian city. Since the retreat from Kyiv and
the northeast, Russian forces have demonstrated little capacity to conduct successful offensive operations. Their
last attempt—a winter offensive in the south of the Donetsk region—ended in a bloodbath for the Russian side. At this rate,
Putin will never succeed at taking control of Ukraine by force, let alone wipe out its inhabitants, even if
Western support for Kyiv wanes. If he cannot take Ukraine, it seems far-fetched that he could go beyond it. These
Russian weaknesses are widely invoked, but they are usually ignored in assessments that focus on Putin’s intentions.

Moreover, Moscow’s soft-power instruments have been revealed to be equally ineffective as its hard power ones.
Despite many fears to the contrary, German dependence on Russian natural gas has not allowed Moscow to stop Berlin
from leading efforts to counter aggression in Ukraine. In addition, the shallowness of Russia’s capital markets and
the general weakness of its industrial sector have driven former Soviet countries toward the West and China in
search of trade opportunities and investments—despite elaborate attempts by Moscow to foster economic integration in the
region. In addition, Putin’s Russia, unlike its Soviet predecessor, has no power of attraction with which to co-opt foreign elites
into larger political projects. The Kremlin under Putin has neither a powerful, transnational ideology nor a
developmental model that could attract elites outside its borders. Whatever soft power Russia wielded to attract elites through
more banal means—say, bribery on a grand scale—has been largely squandered by now, thanks to the brutality of its war.

The Ukraine war has revealed that Putin does not have the resources—short of using nuclear weapons—to fulfill
maximalist or genocidal objectives. The Russian military has improved its performance during the war; its destructive power
should not be dismissed. And Putin’s intentions do matter. But it is now clear that his forces cannot defeat the Ukrainian
military, let alone occupy the country. Perhaps he might dream of wiping Ukraine off the map or of marching onward
from Ukraine to the rest of the continent. But his dreams matter little if he cannot realize them on the ground.
1AR
SSA DA
SSA DA---1AR
Any solution to the crime-terror nexus makes it worse
Vanda Felbab-Brown, 18, senior fellow in the Center for Security, Strategy, and Technology in the
Foreign Policy program at Brookings, "The Crime–Terror Nexus and its Fallacies," Oxford Handbooks
Online, https://www.oxfordhandbooks.com/view/10.1093/oxfordhb/9780198732914.001.0001/
oxfordhb-9780198732914-e-22)SEM

Yet the conventional view is often wrong-headed. Despite the fact that states often have intimate knowledge of using organized
crime for their purposes and exploiting illicit economies, many policy interventions to combat organized crime and illicit
economies— whether linked to violent conflict or in the absence of one—have rarely been highly effective. Not only do belligerents
and illicit crop farmers find ways to adapt to the policies implemented under the siren song of eradication, but such policies are

counterproductive. Eradication alienates rural populations from the government and thrusts them into
the hands of the insurgents. And partnering with quasi-criminal actors has often turned out to be
counterproductive with respect to other objectives, such as mitigating violent conflict, fostering good governance,
and promoting human rights, and at times even counterproductive with respect to very direct objectives, such as
weakening criminal groups and their linkages to terrorist organizations. This is because although illicit economies
pose multiple threats to states, their effects on societies are often highly complex. Indeed, large populations around the world in areas

with inadequate or problematic state presence, great poverty, and social and political marginalization continue to be dependent on illicit

economies, including the drug trade, for economic survival and the satisfaction of other socio-economic needs. For many, (p. 368)
participation in informal economies, if not outright illegal ones, is the only way to satisfy their human security and provide
any chance of their social advancement, even as they continue to live in a trap of insecurity, criminality, and marginalization. Winning the military

conflict or negotiating peace often requires the halting of suppression actions against labor-intensive illicit
economies. As much as external actors may condemn tacitly or explicitly permitting illicit economies, such practices are often crucial for
winning hearts and minds and ending conflict. They may also be crucial for giving belligerent groups a
stake in peace. But this “narcopeace” may come at the cost of severely negative public-goods side-effects, such as extensive drug production and
unrestrained environmental destruction due to logging and wildlife trafficking. Development-based policies aimed at reducing illicit drug

production are crucial for avoiding such negative side-effects while maximizing the chance for peace and social justice. But they
must equally focus on preventing the emergence of unrestrained logging and wildlife trafficking and other environmentally destructive replacement economies.
