Policy Review

February & March 2011, No. 165, $6.00

THE ROAD TO (AND FROM) THE 2010 ELECTIONS
David W. Brady, Morris P. Fiorina, & R. Douglas Rivers

A CLIMATE POLICY FOR THE REAL WORLD
Paul J. Saunders & Vaughan Turekian

THE PERSISTENCE OF GENOCIDE
David Rieff

PTSD’S DIAGNOSTIC TRAP
Sally Satel

ALSO: ESSAYS AND REVIEWS BY MICHAEL GONZALEZ, GREGORY CONKO & HENRY I. MILLER, JAMES KIRCHICK, PETER BERKOWITZ, HENRIK BERING, YING MA, DAVID R. HENDERSON

A Publication of the Hoover Institution
Stanford University

The Hoover Institution was established at Stanford University in 1919 by Herbert Hoover, a member of Stanford’s pioneer graduating class of 1895 and the thirty-first president of the United States. Since 1919 the Institution has evolved from a library and repository of documents to an active public policy research center. Simultaneously, the Institution has evolved into an internationally recognized library and archives housing tens of millions of books and documents relating to political, economic, and social change.

The Hoover Institution’s overarching purposes are:
• To collect the requisite sources of knowledge pertaining to economic, political, and social changes in societies at home and abroad, as well as to understand their causes and consequences
• To analyze the effects of government actions relating to public policy
• To generate, publish, and disseminate ideas that encourage positive policy formation using reasoned arguments and intellectual rigor, converting conceptual insights into practical initiatives judged to be beneficial to society
• To convey to the public, the media, lawmakers, and others an understanding of important public policy issues and to promote vigorous dialogue

Ideas have consequences, and a free flow of competing ideas leads to an evolution of policy adoptions and associated consequences affecting the well-being of a free society. The Hoover Institution endeavors to be a prominent contributor of ideas having positive consequences. In the words of President Hoover:

This Institution supports the Constitution of the United States, its Bill of Rights, and its method of representative government. Both our social and economic systems are based on private enterprise from which springs initiative and ingenuity. . . . The Federal Government should undertake no governmental, social or economic action, except where local government, or the people, cannot undertake it for themselves. . . . The overall mission of this Institution is . . . to recall the voice of experience against the making of war, and . . . to recall man’s endeavors to make and preserve peace, and to sustain for America the safeguards of the American way of life. . . . The Institution itself must constantly and dynamically point the road to peace, to personal freedom, and to the safeguards of the American system.

Policy Review
February & March 2011, No. 165

Features
THE ROAD TO (AND FROM) THE 2010 ELECTIONS
What happened to the president and his party?
David W. Brady, Morris P. Fiorina, & R. Douglas Rivers

A CLIMATE POLICY FOR THE REAL WORLD
Less international negotiation, smarter domestic decisions
Paul J. Saunders & Vaughan Turekian

THE PERSISTENCE OF GENOCIDE
“Never Again,” again and again
David Rieff

PTSD’S DIAGNOSTIC TRAP
Locking some veterans into long-term dependence
Sally Satel

CUBA’S LOST HISTORY
Reclaiming the pre-Castro national character
Michael Gonzalez

THE RUSH TO CONDEMN GENETICALLY MODIFIED CROPS
Impractical regulations and nuisance lawsuits
Gregory Conko & Henry I. Miller

Books
THE CENTER-RIGHT HONORABLE TONY BLAIR
James Kirchick on A Journey: My Political Life by Tony Blair.

THINKING ABOUT TORTURE
Peter Berkowitz on Because It Is Wrong: Torture, Privacy, and Presidential Power in the Age of Terror by Charles Fried and Gregory Fried.

BRUTISH AND SHORT
Henrik Bering on Brute: The Life of Victor Krulak, U.S. Marine by Robert Coram.

MARKET CAPITALISM, STATE-STYLE
Ying Ma on The End of the Free Market: Who Wins the War Between States and Corporations? by Ian Bremmer.

HOME ECONOMICS
David R. Henderson on At Home: A Short History of Private Life by Bill Bryson.



Editor Tod Lindberg
Research Fellow, Hoover Institution

Consulting Editor Mary Eberstadt
Research Fellow, Hoover Institution

Managing Editor Liam Julian
Research Fellow, Hoover Institution

Office Manager Sharon Ragland

Policy Review® (ISSN 0146-5945) is published bimonthly by the Hoover Institution, Stanford University. For more information, write: The Hoover Institution, Stanford University, Stanford, CA 94305-6010. Or visit www.hoover.org. Periodicals postage paid at Washington, DC, and additional mailing offices. POSTMASTER: Send address changes to Policy Review, Subscription Fulfillment, P.O. Box 37005, Chicago, IL 60637-0005. The opinions expressed in Policy Review are those of the authors and do not necessarily reflect the views of the Hoover Institution, Stanford University, or their supporters.

Editorial and business offices: Policy Review, 21 Dupont Circle NW, Suite 310, Washington, DC 20036. Telephone: 202-466-3121. Email: polrev@hoover.stanford.edu. Website: www.policyreview.org.

Subscription information: For new orders, call or write the subscriptions department at Policy Review, Subscription Fulfillment, P.O. Box 37005, Chicago, IL 60637. Order by phone Monday through Friday, 8 a.m. to 5 p.m. Central Time, by calling (773) 753-3347, or toll-free in the U.S. and Canada by calling (877) 705-1878. For questions about existing orders please call 1-800-935-2882. Single back issues may be purchased at the cover price of $6 by calling 1-800-935-2882. Subscription rates: $36 per year. Add $10 per year for foreign delivery.

Copyright 2011 by the Board of Trustees of the Leland Stanford Junior University.

The Road to (and from) the 2010 Elections
By David W. Brady, Morris P. Fiorina, & R. Douglas Rivers

The 2008 elections gave the Democrats the House, the presidency, and a “filibuster proof” Senate. Pundits spoke of the election as a “game changer.” Evan Thomas wrote that “Like Franklin Roosevelt in 1932 and Reagan in 1980, the Obama run of 2008 marks a real shift in real time. It is early yet, but it is not difficult to imagine that we will, for years to come, think of American politics in terms of Before Obama and After Obama.”1 According to Borosage and Greenberg: “But election 2008 was not simply a testament to the remarkable candidacy of Barack Obama, nor a product of Bush’s catastrophic presidency. Rather, the results suggest that this may not be simply a change election but a sea-change election . . . we may be witness to the emergence of a new progressive majority, that contrary to conservatives’ claims, America is now a center-left nation.”2
David W. Brady, Morris P. Fiorina, and R. Douglas Rivers are senior fellows of the Hoover Institution and professors of Political Science at Stanford University.

Even James Carville’s 40 More Years: How the Democrats Will Rule the Next Generation did not seem as outlandish when published in early 2009 as it does in the aftermath of the 2010 congressional elections.

This article examines what happened to the president and his party between the electoral zenith of November 2008 and the nadir of November 2010. We begin by reviewing an analysis that appeared in these pages early in 2009. Contrary to much commentary at the time, that analysis showed that the Obama victory was less a reflection of an electorate that had moved to the left than it was a negative judgment on the performance of the Bush administration. We extend that analysis, showing that in 2009 and 2010 the public came to view the performance of the new administration increasingly negatively, in part because of policies it pursued that did not enjoy wide popular support. In particular, we show that the administration’s focus on health care and, to a lesser extent, cap and trade probably cost the Democrats their House majority. We conclude with brief speculations about the prospects that the new Republican House can make progress on its avowed goal of reducing government expenditures.

The road to 2008
As we pointed out in our earlier article, between 2004 and 2006, numerous public opinion polls reported a significant increase in Democratic Party identifiers and a corresponding decrease in Republican identifiers.3 Since these polls were cross-sectional snapshots, however, the causes of the change were unclear. Thus, we commissioned YouGov/Polimetrix, an internet-based polling firm, to sample nearly 13,000 respondents who had been in their large database since 2004. This allowed us to track changes in party identification over the four-year period and relate them to questions about the policies and performance of the Bush administration. Figure 1 shows that 2004 Republicans who stuck with their identification in 2008 were much more likely to approve of the Bush administration’s overall performance as well as its handling of the war in Iraq and the economy than were stable independents and 2004 Republicans who had moved to Independent in 2008. The latter in turn were more favorable to the administration than stable Democrats and 2004 Republicans and Independents who had moved to the Democratic side in 2008.
1. Evan Thomas and the staff of Newsweek, A Long Time Coming: The Inspiring, Combative 2008 Campaign and the Historic Election of Barack Obama (Public Affairs Press, 2010).
2. Robert Borosage and Stanley B. Greenberg, “The Emerging Center-Left Majority,” American Prospect (November 13, 2008), available at http://www.prospect.org/cs/articles?article=the_emerging_centerleft_majority. (This and subsequent weblinks accessed December 17, 2010.)
3. David Brady, Douglas Rivers, and Laurel Harbridge, “The 2008 Democratic Shift,” Policy Review 152 (December 2008 & January 2009).

figure 1 Evaluations of Bush administration performance, 2008 (percent approving of the Bush administration overall and of its handling of Iraq and the economy, among Republicans, Independents, and Democrats) [chart not reproduced]

But while disapproval of the Bush administration was strongly associated with movement away from the Republican Party, disapproval did not indicate that voter sentiment was shifting to the left across a range of policy issues. We used the same survey to examine public positions on a set of issues where party differences were clearly evident: universal healthcare, global warming, gay marriage, abortion rights, and illegal immigration. The Republican response was coded as: oppose universal health care, think effects of global warming are overstated, oppose any legal recognition of same sex couples, allow no abortions or only in case of rape and incest, and favor deportation of illegal immigrants. Figure 2 plots the percent taking the Republican response for each of the three categories of respondents: stable Republicans, stable and new independents, stable and new Democrats. Large majorities of those who remained Republican between 2004 and 2008 favor the Republican position on these issues, but the opinions of those in the independent category are much closer to the opinions of stable Republicans than to the Democratic side. Thus, the battering the Republicans experienced in 2006–2008 was much more a result of dissatisfaction with the performance of the Bush administration than an indication of a policy realignment in the electorate.

figure 2 Issue positions, 2008 (percent taking the Republican position on universal health care, global warming, gay marriage, abortion, and illegal immigration, among Republicans, Independents, and Democrats) [chart not reproduced]

From 2008 to 2010

Contrary to our 2009 conclusion, many commentators assumed that the 2008 elections had ushered in a new progressive era. President Obama came to Washington with an ambitious policy agenda featuring a stimulus package, health care reform, energy and environmental legislation, financial reform, and foreign policy change, among other lesser goals. At first things went smoothly. The stimulus package passed handily with the promise that it would keep unemployment at 8.2 percent or less. But the health care and environmental proposals faced tougher sledding than the stimulus. The Democratic majority in the House was more liberal than the Democratic majority in the Senate.
In particular, it was clear from the beginning that the single payer and public option plans favored by the most liberal wing of the Democratic Party had no chance of passing the Senate. After months of contentious politics, even a much weaker version of health care reform appeared doomed as late as February 2010. But through a series of parliamentary procedures and House compromises, the Democrats finally enacted the Patient Protection and Affordable Care Act in March of 2010. The story on cap and trade was simpler. The House passed it, but Senate Democrats representing coal-producing states and states dependent upon coal for energy joined with Republicans to prevent consideration. After a lengthy process the two chambers agreed on a financial reform bill in July of 2010. All in all, despite the failure of cap and trade and the compromises entailed in passing health care and financial reform, the Democrats had enacted a stimulus package, a major health care bill, financial reform, and other less significant legislation. From the standpoint of legislative production, the 111th Congress excelled. For Joe Klein in Time, “the legislative achievements have been stupendous,” and in the judgment of Doris Kearns Goodwin, “I don’t think we’ve ever seen anything like Obama since Roosevelt.”4

figure 3 Health care bill approval [chart not reproduced]

From an electoral standpoint, however, this legislative productivity was a different story. As Figure 3 shows, from the summer of 2009 to the election, more Americans opposed the health care reform than favored it, although parts of it were popular. The burst of legislative activity on multiple fronts made middle-of-the-road voters receptive to Republican charges that federal power and spending were out of control. Moreover, by summer 2010, the claim that the stimulus would keep unemployment below 8.2 percent had clearly proved wrong. A very slow and weak economic recovery, concern about the deficit, and negative opinion of the president’s policies combined to bring the president’s approval ratings down. The most significant drop in approval and increase in disapproval occurred during the health care debate from late May 2009 through December. From more than 70 percent approval at the time of his inauguration, the president’s approval ratings fell below his disapproval ratings by the summer of 2010. Perhaps most significantly, approval among independents fell most quickly; a plurality of them already disapproved by late summer of 2009 (Figure 4).

figure 4 Obama approval among independents [chart not reproduced]

According to our YouGov/Polimetrix surveys, many Americans were skeptical about their prospects under the new health care regime. By mid-2009 about 40 percent said they would receive worse care if the bill passed, compared to slightly less than 20 percent who thought they would be better off than before the bill. After passage the gap between worse and better care widened slightly, with 40 percent holding worse off and about 15 percent better off. And on the cost side, Americans were skeptical about claims that they would get improved care with less cost: Over 50 percent of those polled believed that their costs for health care would increase, compared to 10 percent or so who thought their costs would decrease.

Independents had moved very sharply to the Democrats in the 2006 congressional elections — by a margin of 18 percentage points according to the national exit polls. Thus, their growing disenchantment with the president threatened Democratic prospects in 2010. In addition, the number of self-professed independents had increased from about 31 percent in October of 2008 to over 36 percent in October 2010. The increase in independents came at the expense of Democrats, who declined from about 36 percent to about 32 percent. During the same period Republicans stayed at the same level — about a quarter of the electorate.

4. Quoted in Howard Kurtz, “Beware the GOP Coronation,” Daily Beast (October 31, 2010), available at http://www.thedailybeast.com/blogs-and-stories/2010-10-31/republican-election-wins-will-drawglowing-press-just-like-obama-once-did/.

The 2010 Democratic shellacking
The preceding developments, superimposed on a long, deep drop in employment and a very slow recovery, generated stiff headwinds for Democrats in the 2010 elections. Democratic members from moderate and conservative districts were left particularly vulnerable. That condition proved fatal for many of them as Republicans targeted districts where Obama had lost to McCain, as well as districts that had been lost to Democrats in the previous two elections. In the election, eleven of 21 members of the class of 2006 lost their seats while 21 of 24 members of the
class of 2008 lost. In total, Republicans gained 63 seats in the House, the largest gain in a midterm election since 1938. Republicans also gained six Senate seats, seven governorships, control of nineteen state legislative chambers, and almost 700 new state legislative seats.

Exit polls revealed huge shifts to Republicans between the 2006 and 2010 midterms. Republicans turned an 18-point 2006 deficit among independents into a 17-point lead in 2010. Rural voters, older voters, and Catholics all showed four-year swings of over 20 percentage points. Republicans gained nearly as much among white voters and high school graduates. They gained 13 percentage points over 2006 among those making less than $50,000 and those above $250,000 in income. They gained heavily in the Northeast and the Midwest. In 2010, pluralities thought Republicans better able than Democrats to handle the economy (+23%), spending (+27%), and taxes (+31%). In sum, voters, almost across the board, turned away from the Democrats.

As usual, post-election analyses by pundits and party spokespersons differed according to their political orientation. For the most part Republicans viewed the election as a repudiation of Obama and his policies, while Democrats claimed they had not properly communicated their policy achievements or that Republicans had benefitted from illicit corporate contributions. Thus, the day after the election Rush Limbaugh exulted that Nancy Pelosi, whom he referred to as “the wicked witch of the West,” had a House fall on her.5 In contrast, in her day-after column Maureen Dowd wrote, “Republicans outcommunicated a silver-tongued president who was supposed to be Ronald Reagan’s heir in the communications department. They were able to persuade a lot of Americans that the couple in the White House was not American enough, not quite ‘normal,’ too Communist, too radical, too Great Society.”6
5. See http://noisyroom.net/blog/2010/11/03/wipe-out-rush-limbaugh-celebrates-2010-election/.

6. Maureen Dowd, “Republican Party Time,” New York Times (November 3, 2010), available at http://www.nytimes.com/2010/11/03/opinion/03dowd.html?ref=maureendowd.

In spite of the predictability of the responses to the election, the question of what factors caused the Democratic debacle is an important one to answer. In the remainder of this article we turn to an analysis of the election results in an attempt to discern how much of the Democratic loss is attributable to the economy and how much attributable to the choices made by Obama and the Democratic congressional leadership.

The president’s party typically loses seats in midterm elections; the average loss for first-term Democratic presidents in the postwar era is 30 seats. A very poor economy and a president below 50 percent in approval by themselves would generate significant losses for the incumbent party. But political science forecasting models based on the “fundamentals” badly under-predicted the Democratic seat loss — several even forecast that the Democrats would easily retain their House majority.7 Our hypothesis is that by pursuing a policy agenda to the left of the electorate’s comfort zone, the president and the House leadership exacerbated the problems for Democrats from moderate to conservative districts — especially the districts captured in the 2006 and 2008 elections. Three-quarters of the Democratic candidates running in the 48 districts carried by McCain in 2008 lost. Tellingly, eight of fifteen who voted no on both health care and cap and trade survived, while only three of sixteen who voted yes on one or the other survived, and all seven of those who voted yes on both bills lost. By pushing highly controversial legislation, the Democratic leadership in the House of Representatives caused the party to lose significantly more seats than it would have from the poor economy alone.

In the analyses that follow we have used standard statistical procedures to estimate the electoral harm of the health care and cap and trade votes.8 These were not the only controversial votes that figured in the campaign, of course, but TARP II had only ten Democrats in opposition and the stimulus only eleven. Moreover, several of these defectors did not seek reelection in 2010, so there is simply not enough variation on these votes to analyze. We use the vote cast for the member in 2008 to measure his or her electoral vulnerability, and Obama’s 2008 vote in the district to measure district support for Obama. We also include variables for members elected in 2006 and 2008. We expect that the more conservative the district, the more that support for issues like health care and cap and trade hurt Democratic candidates. Conversely, the more liberal the district, the smaller the damage from a vote in support of such issues; in very liberal districts, of course, a yes vote would be an electoral benefit. We used four different statistical estimation procedures to ensure the robustness of our findings. Tables 1 and 2 report the results most favorable and least favorable to our hypothesis.
7. See James E. Campbell, ed., “Forecasts of the 2010 Midterm Elections,” PS: Political Science & Politics 43 (2010), 625–648. 8. See David W. Brady, Morris P. Fiorina, and Arjun Wilkins, “The 2010 Elections: Why Did Political Scientists’ Forecasts Go Awry?,” forthcoming in PS: Political Science & Politics 44 (2011).

From Table 1, yes votes on health care and cap and trade severely damaged Democrats from marginal districts. In districts where Obama got only 45 percent of the vote and the Democratic incumbent voted yes on health care, we estimate a 9 percent vote penalty. At the 50 percent Obama vote level, a yes vote hurts less than it did at 45 percent, but still carries about a 7 percent vote penalty. The more liberal the district, the less it costs to vote yes on either bill. At 60 percent 2008 Obama support the penalty is less than 4 percent. The penalties for voting yes on cap and trade are smaller, and at 60 percent Obama support in 2008 a yes vote becomes a positive.

table 1 Vote share loss from yes v. no votes on health care and cap and trade

                    Health care vote share     Cap and trade vote share
  Obama vote        yes        no              yes        no
  .45               41.4       50.7            42.3       49.1
  .50               46.4       53.9            47.2       51.9
  .55               51.5       57.1            52.2       54.8
  .60               56.6       60.3            57.1       57.7
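To make the shape of such an estimation concrete, here is a minimal sketch of the kind of calculation that could produce Table 1-style predicted vote shares. It is not the authors’ code: the data file, column names, and the simple interacted least-squares specification are illustrative assumptions only.

# Illustrative sketch only -- not the authors' estimation procedure.
# Assumes a hypothetical CSV of Democratic-held House seats with columns:
#   dem_share_2010, dem_share_2008, obama_2008, freshman_2006, freshman_2008,
#   yes_health_care (0/1), yes_cap_trade (0/1)
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("dem_house_2010.csv")  # hypothetical file name

# Linear model of the member's 2010 vote share, interacting each roll-call
# vote with district support for Obama so that the estimated penalty can
# shrink (or reverse) as the district becomes more Democratic, as in Table 1.
model = smf.ols(
    "dem_share_2010 ~ dem_share_2008 + freshman_2006 + freshman_2008 + "
    "yes_health_care * obama_2008 + yes_cap_trade * obama_2008",
    data=df,
).fit()

# Predicted vote share at the Obama-vote levels shown in Table 1, comparing
# a yes with a no vote on health care (cap and trade held at no, other
# covariates at typical values).
grid = pd.DataFrame({
    "obama_2008": [0.45, 0.50, 0.55, 0.60] * 2,
    "yes_health_care": [1] * 4 + [0] * 4,
    "yes_cap_trade": 0,
    "dem_share_2008": df["dem_share_2008"].mean(),
    "freshman_2006": 0,
    "freshman_2008": 0,
})
grid["predicted_share"] = model.predict(grid)
print(grid[["obama_2008", "yes_health_care", "predicted_share"]])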

The analysis underlying Table 1 examines the gain or loss in percentage of the vote, but a drop in vote percentage does not automatically translate into loss of a seat, so the analysis that underlies Table 2 examines the probability of winning versus losing a seat. The results are again instructive. Democrats who came from districts where Obama won 45 percent of the vote in 2008 and who voted yes on either health care or cap and trade had almost no chance of reelection, compared to a reelection probability of 40 percent or more if they voted no. At an Obama 2008 level of 50 percent, reelection probabilities are still two to three times higher for “nay” voters. A “yea” vote on cap and trade becomes a positive factor as the 2008 vote for Obama in the district nears 55 percent, and a health care vote becomes a positive factor as Obama’s 2008 vote climbs from 55 to 60 percent.

table 2 Probability of winning given district Obama vote and votes on health care and cap and trade

                    Health care                Cap and trade
  Obama vote        yes        no              yes        no
  .45               .02        .45             .05        .40
  .50               .20        .60             .27        .25
  .55               .63        .74             .66        .69
  .60               .93        .85             .92        .80

While there is something of a gap between these most favorable and least favorable statistical analyses (this is not rocket science, after all), the preceding results clearly indicate that Democrats coming from more conservative
districts were hurt by votes in favor of health care and cap and trade. Democrats from districts that had given Obama a big majority in 2008 were helped by a yes vote on either health care or cap and trade, but they were generally likely to win anyway.

How many seats did it cost the president and his party to push for health care and cap and trade legislation? To suggest an answer to that question, we take the statistical equations underlying Tables 1 and 2 and set all Democratic votes on health care and cap and trade at no, which takes away (as much as can be done) the effect of yes votes on health care and cap and trade. In this counterfactual case, where every Democrat opposes these two pieces of legislation, Democrats would have saved between 22 seats (from Table 2) and 40 seats (from Table 1), probably allowing them to save their majority.

Counterfactual exercises like that just reported are problematic in that other things presumed to stay constant would not have stayed constant had rank-and-file Democratic House members voted down health care and cap and trade. In that event they might have provoked primary challenges and/or defections from angry voters in the Democratic base. But this exercise strongly suggests that members from more-conservative districts who supported health care and cap and trade, either out of personal belief or party pressure or both, paid an electoral penalty. The economy and the president’s middling ratings indicated that the Democrats would lose seats. But by advancing controversial legislation the Democratic leadership appears to have turned a probable big loss into one of historic proportions.
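The counterfactual itself can be sketched in the same spirit. Again, this is an illustrative reconstruction rather than the authors’ procedure; the logit specification, the column names, and the data file are assumed for the example.

# Illustrative sketch only -- not the authors' actual counterfactual.
# Assumes a hypothetical CSV of Democratic-held House seats with a 0/1
# outcome column won_2010 plus the covariate columns named below.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("dem_house_2010.csv")  # hypothetical file name

x_cols = ["obama_2008", "dem_share_2008", "freshman_2006",
          "freshman_2008", "yes_health_care", "yes_cap_trade"]
X = sm.add_constant(df[x_cols])
logit = sm.Logit(df["won_2010"], X).fit()

# Expected Democratic seats under the votes actually cast.
expected_actual = logit.predict(X).sum()

# Counterfactual: every Democrat votes no on both bills.
cf = df.copy()
cf[["yes_health_care", "yes_cap_trade"]] = 0
expected_counterfactual = logit.predict(sm.add_constant(cf[x_cols])).sum()

print(f"Estimated seats saved by universal no votes: "
      f"{expected_counterfactual - expected_actual:.0f}")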

Implications for 2012
What does the Republican “shellacking” of Democrats in 2010 portend for 2012? If history is any guide, the answer is not good news for Republicans, at least for the post-World War II period. In 1946, the Republicans gained 55 seats in the House and 12 in the Senate to take control of Congress for the first time in 16 years. Democratic prospects for 1948 looked so poor that Senator Fulbright, a Democrat, proposed that President Truman appoint a Republican secretary of state (next in line for the presidency after Truman’s elevation), resign, and cede the presidency to the Republicans. Contrary to expectations, Truman campaigned against the “do-nothing” Republican Congress and won reelection.

A generation later, after the 1994 elections gave Republicans control of Congress for the first time in 40 years, some in the media wondered whether a weakened President Clinton was still relevant. But after two government shutdowns Clinton was reelected overwhelmingly in 1996. The key to understanding how these two new Republican Congresses managed to reelect sitting Democratic presidents lies in the policy choices they made. In the flush of a big victory they overreached.

In the 112th Congress, a key issue will be government spending. The one common principle across all the Tea Party movements in states and localities was that the country cannot afford our current deficits, let alone those looming in the not too distant future. Addressing the deficit problem involves raising taxes, cutting spending, or some combination of the two. Thus far, Republicans have been insistent on cutting spending while keeping the Bush tax cuts in place and enacting no new taxes. In principle, very strong economic growth could increase incomes sufficiently to increase revenue without increasing the tax rate; however, few expect the economy to do this in the near future. Given that the next election will occur before economic growth can solve the problem, the key issue for Republicans is reducing government spending.

While the electorate in 2010 yelled a loud “no” to the policies of the president and Democratic Congress, the negative verdict was by no means carte blanche for Republicans to carry out their own wish list. Given the centrality of spending issues, we conducted a YouGov/Polimetrix poll on sixteen federal programs, asking whether spending on each should be increased, decreased, or kept the same. Table 3 presents the results of this poll.

table 3 Do you think federal spending on the following programs should be increased or decreased or kept the same? (percent)

                            Increase   Keep the same   Decrease   Not sure
  Social Security               42           42             9          8
  National Defense              31           39            25          6
  Medicare                      40           42            12          6
  Aid to the Poor               34           39            20          7
  Medicaid                      32           46            15          7
  Veterans Benefits             52           39             4          6
  Health Research               43           42             9          5
  Education                     53           30            12          5
  Highways                      28           53            13          6
  Mass Transit                  29           41            20         10
  Foreign Aid                    3           22            67          8
  Unemployment Benefits         31           40            23          6
  Science and Technology        36           45            12          7
  Agriculture                   23           44            24          9
  Housing                       21           42            30          7
  The Environment               37           36            21          6

February & March 2011

13

Brady, Fiorina & Rivers
In fifteen out of sixteen programs, a majority of the public would like spending to be increased or kept the same. The only program that a majority of Americans would cut is foreign aid. Most importantly, in the large entitlement programs, Social Security, Medicare, and Medicaid, only small minorities favor the cuts that must come if the budget deficit is to be brought under control. Defense enjoys a similarly privileged status, with 70 percent favoring either current spending or an increase. Even in agriculture and housing, over 60 percent of Americans favor keeping expenditures where they are or increasing them.

Data like these should inform the agenda of the new Republican House majority. In 2010 the country voted no on Democrats, not yes on Republicans, and certainly not yes to across-the-board spending cuts. The new majority faces the hard reality that, in general, voters want spending reduced, but when it comes to specific programs, there are none that stand out, save foreign aid, which, if eliminated entirely, would not dent the deficit of the United States.

Leading the country on a new spending path will require skill and leadership. The obvious place to start is to begin a serious effort to educate the American public about both the seriousness and complexity of the problem. Everything must be on the table. For example, Democrats can no longer label any proposal to slow the growth rate of Social Security benefits as “unacceptable.” And for their part Republicans must address Ron Paul’s query about why the U.S. has more than 700 military bases overseas. In recent days we have heard a lot about the need to have an “adult conversation” with the American public. That conversation must begin with some serious political leadership.


A Climate Policy for the Real World
By Paul J. Saunders & Vaughan Turekian

Government officials worldwide are trying to put the best face on the 2010 United Nations climate change negotiations in Cancun, especially after 2009’s debacle in Copenhagen. But the talks produced little real progress and led many to wonder whether the two global climate meetings represent a necessary, albeit somewhat sideways, step in the long process towards an eventual global treaty reducing greenhouse gases or, alternatively, the gradual and unsurprising end to a nearly twenty-year effort to achieve binding international mandates. Advocates of a binding global treaty on greenhouse gas emissions are divided over the importance of the new agreement coming out of Cancun.
Paul J. Saunders is executive director of The Nixon Center. Vaughan Turekian is a non-resident fellow in foreign policy at the Brookings Institution and holds a Ph.D. in atmospheric geochemistry. They worked together on climate issues as advisors to the under secretary of state for global affairs during the George W. Bush administration.

For any who might harbor doubts, the Obama administration’s approach to the negotiations is revealing: Neither the president nor the secretary of state (nor the vice president, for that matter) traveled to Mexico, leaving the negotiations in the hands of State Department Special Envoy Todd Stern. Key congressional leaders also skipped this year’s talks.

The administration’s reduced emphasis on the UN meetings, and continuing international disagreements over climate change, demonstrate an uncomfortable fact for many greens: U.S. efforts to stem climate change thus far have largely vindicated the Bush administration’s approach to global action on climate change during its final years. The failures of the high-profile Copenhagen talks — and of U.S. domestic legislation — reflect structural political and economic realities that will be profoundly difficult to overcome, if they can be overcome at all. Obama would do well to understand the lessons of Copenhagen and cap-and-trade and move on to a more practical approach — especially after the 2010 midterm elections.

The pragmatic wing of the activist community has cautiously praised the Cancun summit, which produced a deal that brought a voluntary international climate agreement reached on the margins in Copenhagen inside the UN process and created a fund to help poor developing countries reduce their greenhouse gas emissions and manage the consequences of a warming Earth. At the other end of the spectrum, the climate movement’s doctrinaire ideologues have denounced the talks’ failure to produce a binding agreement on deep reductions. They are all the more bitter after the Copenhagen fiasco, years of resentment of the Bush administration’s approach, and earlier surety that the Democrats controlling the White House and the Congress would accomplish what the prior Republican president was unwilling to try. The fact that they have no “Plan B” for addressing the climate problem — and apparently cannot conceive of a solution other than unprecedented and therefore very unlikely global regulation — only adds to their frustration.

Equally troubling to both of these camps is the Kyoto Protocol’s looming expiration in 2012, with its results limited and no follow-on arrangements in place. Since Kyoto’s modest emissions targets were secondary to its goal of establishing a global system for deeper future reductions, supporters of binding international targets and timetables for emissions are alarmed by the relentless ticking of the clock. Compounding activists’ worries is the refusal of key parties to the Kyoto Protocol — including Japan and Russia — to agree to an extension through a new so-called “commitment period.” Japan sensibly refuses to accept deeper emissions reductions without commitments from the United States and China. Russia — whose ratification brought Kyoto across the threshold that made the pact legally binding — seems more interested in its ability to sell emissions credits than in preventing climate change. Moscow was an enormous beneficiary of Kyoto’s 1990 base year for measuring emissions
reductions; the combination of the Soviet Union’s vast and highly inefficient industrial base and Russia’s subsequent economic collapse meant that the country did not have to do anything to meet its targets and could sell both its natural gas and its leftover emissions to Europe.

With the most invested in Kyoto, European leaders may be particularly eager to make a deal in the remaining time before 2012 — and they may eventually do so. However, neither of the two options available is likely to produce meaningful results. Efforts to negotiate a new global agreement will force a choice. One option is to include the United States and China, the two largest emitters, and India, where emissions are rising rapidly; but this would weaken any deal because none will commit to significant emissions reductions. The alternative is to exclude them, which would limit the impact of an agreement by leaving out the nations together responsible for over 45 percent of global greenhouse gas emissions. It could be worse if Japan, Russia, and others are unwilling to accept new limits without a comprehensive deal. This makes a meaningful international agreement on climate change very improbable.

The reasons for this are clear. While both developments were shocking to many inside the echo chamber that surrounds climate change discussions, the breakdown of the Copenhagen negotiations and the slow death of emission-limiting legislation in the United States were eminently predictable. Moreover, while the Obama administration has clearly tempered its ambitions, at least for the time being, there is little evidence that the president and other senior officials have drawn necessary conclusions from their first two years and reassessed U.S. climate change strategy. This is a mistake; the United States needs new pragmatic and creative policies to address climate change at the local, national, and international levels. But making these changes requires clearly understanding what has happened so far.

What really happened in Copenhagen?
One problem with the December 2009 Copenhagen climate summit was that expectations had soared wildly beyond the limits of rationality, in part due to wholly unrealistic hopes tied to President Barack Obama. As a result, the meetings evolved into a summit of heads of state without adequate diplomatic preparation. Climate change is far too complex an issue to resolve in negotiating sessions among national leaders if the central parameters of the deal have not been resolved in advance. Absent this, the administration allowed the United States to be drawn into a high-stakes gamble that was very unlikely to succeed, especially in view of the many other flaws in its approach to the talks.

The second problem was one of strategic sequencing. Since the administration had not succeeded in passing climate legislation prior to Copenhagen, it was trying to pursue an international agreement without a domestic
consensus on climate policy. Broadly speaking, this repeated the major error in the Clinton administration’s decision to sign the Kyoto Protocol in the face of clear Senate opposition. Thus, even if the administration had succeeded in reaching a deal that went beyond a political declaration, subsequent events have demonstrated that it would not have been able to deliver at home. Given this, the lack of a deal at Copenhagen — which the press and others fortunately and accurately blamed as much on China’s reluctance as on U.S. policy paralysis — was probably the best outcome for the administration itself. Reaching an agreement with major emitters in the developed and developing world only to see it die in the Senate would have been a major blow to American credibility and to Obama’s domestic leadership.

The third problem with the U.S. approach to Copenhagen (a problem shared by the Europeans, who often appeared to be observers rather than participants in the negotiations) was tactical and diplomatic. Washington and European capitals gave far too much attention to China — which is admittedly central to any successful effort to reduce greenhouse gas emissions at a global level — thereby placing Beijing in the driver’s seat and limiting American and Western negotiating leverage. This was especially damaging in the wake of the global financial crisis, when China’s sense of indispensability was already unprecedented (not to say inflated). China luxuriated in its Copenhagen role, sending a second-tier diplomat to a negotiating session among heads of state and repeatedly making them sit and wait during phone calls to decision-makers.

The final two problems are fundamental structural weaknesses of the UN Framework Convention on Climate Change. One is that UN-based negotiating processes inherently give all parties equal formal status (though obviously not equal influence). It is simply too difficult to negotiate a highly complex agreement incorporating emissions limits, verification measures, development support, and other components with 200 parties around the table. While they are well-meaning and legitimately concerned and involved, the vast majority of the delegates in such a conversation have little to contribute beyond their grievances. Denmark’s weak chairmanship didn’t help this already difficult situation.

A similarly deep underlying problem of the UNFCCC is the historically and morally reasonable but impractical and unmanageable legal concept of “common but differentiated responsibility,” the idea that all nations share responsibility for managing climate change but that the developed world has greater responsibility because of its past contribution to today’s greenhouse gas concentrations in the atmosphere. Unfortunately, the climate problem is well beyond the point at which it could be solved through even drastic measures by the U.S., Europe, and Japan alone. In fact, even if the U.S. became a
zero-emission economy by 2030, China’s expected new emissions — driven by an economic engine increasingly important to global growth — would expand to fill nearly all the gap, leaving the world with essentially no net change in emissions.

The statistics tell the story. According to the Department of Energy’s Energy Information Administration, the developing world’s portion of global carbon dioxide emissions has grown from 46.4 percent in 1990 to 57.0 percent in 2010, and is projected to reach 64.2 percent by 2030. China’s share of global CO2 emissions has grown from 10.7 percent in 1990 to 23.4 percent in 2010, now somewhat exceeding the U.S. share, and is projected to hit 29.2 percent by 2030, close to double America’s expected share at that time. It is not realistic for developed countries, now making up significantly less than half of total global CO2 emissions, to make vast reductions in their own emissions simply to allow developing countries more room to increase emissions.

Separately, while the developed world’s past emissions may be fair game in global negotiations, it is somewhat disingenuous to disconnect the developed world’s progress from developing nations. Setting aside the excesses of the colonial era, during which emissions were still quite low, economic growth in developed countries has in fact made a real difference to those living in developing economies, providing export-oriented jobs as well as improvements in public health, education, and other fields. This is perhaps most spectacular in the case of China, where rapid growth in the last 30 years is substantially attributable to Western investment and perhaps excessive consumer demand for cheap imports.

What happened to domestic climate legislation?
The failure in Copenhagen was both a contributor to and a consequence of the breakdown in domestic action attempted before and after the meeting. It was a consequence of the Obama administration’s strategic decision to use its then-large congressional majorities to push health care reform as its top priority in the months leading up to the summit. Given the rancor associated with this debate — especially severe during the summer of 2009 — this had an immediate impact on the prospects for climate legislation. That impact was compounded by earlier polarizing debates on the economic stimulus package and continuing weak growth after the stimulus.

Simultaneously and unsurprisingly, the administration, Senator John Kerry, and others behind the cap-and-trade bill faced considerable skepticism from fellow Democrats representing coal-producing and coal-using states in the Senate. These Democrats were quite concerned about the effects
the legislation could have on coal producers and utilities or, in other words, on jobs and energy prices. In an already bitter political environment, and against the background of a growing grassroots Tea Party movement energized by attempts at federal government intervention, Senate Republicans not passionately committed to the climate issue had little reason to be more accommodating than these Senate Democrats.

After Copenhagen, with no real deal to trumpet, the argument for a climate bill forcing significant emissions cuts was dramatically weakened by the fact that developing countries, especially China, had not made a commitment to take any new emissions-limiting measures they were not previously expected to take. China’s emissions had already been a major congressional concern when the Kyoto Protocol was negotiated and signed, as reflected in the 1997 Byrd-Hagel Resolution. Approved 95–0 prior to the Clinton administration’s decision to sign Kyoto, Byrd-Hagel explicitly expressed the sense of the Senate that “the exemption for Developing Country Parties [in the UN Framework Convention on Climate Change] is inconsistent with the need for global action on climate change and is environmentally flawed” and stated that the United States should not undertake any commitment to reduce its own emissions without “new specific scheduled commitments” by developing countries.

The interrelationship between the domestic and international levels of the climate issue through China’s role may actually prevent action in either arena by creating a catch-22. In brief, Congress won’t approve strong emissions limits without a commitment from China — and China won’t make a commitment before the United States does (if Beijing will make a commitment at all, which is subject to question). Climate bill advocates knew before Copenhagen that they would need to defend themselves against charges that the plan was not only costly domestically, but could further weaken U.S. competitiveness vis-à-vis China during a recession, and senators backing the cap-and-trade bill tried to avoid the catch-22 by attempting to demonstrate sufficient support for the bill without actually passing it. They sought to strengthen the administration’s negotiating position, hoping that an agreement in the Copenhagen talks would in turn provide the momentum they needed to get cap-and-trade through the Senate. This was far too complex a strategy to work in practice.

With nothing to show from China in the wake of Copenhagen, Senate Democrats were in a weaker substantive position and unable to give the climate bill sustained attention as the 2010 midterm elections approached. After courting Republican Senator Lindsey Graham before and after the summit, in mid-2010 Senate Majority Leader Harry Reid alienated him by sidelining climate legislation in favor of immigration reform, to strengthen
his own struggling reelection campaign. Either bill would have been quite difficult to pass in an election year, but Senator Reid’s decision effectively killed the most prominent and promising bipartisan negotiations on a climate bill.

Underlying all of the back-and-forth on Capitol Hill was the biggest obstacle to climate change legislation: the fact that the American people were never truly behind emissions limits. Public support for cap-and-trade was basically illusory: Though 66 percent supported emissions limits in principle in Pew Research Center polling in the summer of 2010, only 32 percent viewed climate change as a “priority” — compared to 81 percent who focused on jobs and 67 percent on energy needs. Thus, while the idea of emissions reductions had some appeal, most people subordinated it to other concerns, and legislation that appeared either to put jobs at risk or to raise energy costs had little support. Climate bill advocates were well aware of this problem, which was one of the factors behind proposals to create “green jobs,” but they were never able convincingly to overcome it in the public eye.

In fact, though they have tried many different arguments, climate advocates have thus far largely failed in making a sufficiently strong case for emissions limits on any basis. While the scientific case for climate change is solid, the “approaching calamity” argument about its expected consequences hasn’t gained traction. This appears partially due to good public relations by climate skeptics (helped recently by foolish and highly publicized emails among a handful of scientists) and to record-high snowfall throughout the United States in the winter of 2009–10 that was consistent with climate change modeling but confused many Americans. The moral argument for action to save indigenous peoples, animals, and glaciers is closely related to the calamity argument and often has a greater emotional appeal. However, despite support from some evangelical Christian groups focused on humanity’s stewardship of God’s creation, this has also fallen short.

Some conservatives have been attracted to two different national security arguments for measures that address climate change. One has highlighted the possible security consequences of floods, droughts, and refugee flows in failed and failing states and has been promoted by former senior military officers. The other has targeted reductions in oil consumption to improve energy security, usually combined with dubious claims that lower American oil imports will deny revenue to hostile regimes or groups.

These arguments appear insufficient largely because most people see the benefits of reducing greenhouse gas emissions as long-term, abstract, and distant, while they see the costs as immediate, concrete, and personal. As a
result, strong limit-based policies are almost inherently impractical in democratic societies or, indeed, in authoritarian systems that are not prepared to impose them without regard to public reaction. Actually, without inexpensive and widely applicable new technologies to break the link between energy consumption and greenhouse gas emissions, the effectiveness of any limits is inversely proportional to their popularity. Making emissions limits more effective requires making energy more expensive, with public support for the policy declining as it becomes more effective. Conversely, making limit-based policies sufficiently popular to win public support and legislative approval requires either making them ineffective, by restricting energy price increases, or extremely costly, by providing offsetting subsidies. The fate of the U.S. climate bill is telling in this regard in that its emission reductions of 17 percent below 2005 levels by 2020 were more modest even than the reductions the Clinton administration accepted under Kyoto — and the bill still failed.

Even in Europe, where climate policies are seemingly embraced by the public, much of the reduction in greenhouse gas emissions is in a sense artificial. The European Union’s population growth rate has been half America’s rate over the last decade, something that in itself sharply slows emissions growth in comparison to the United States and other countries with more rapidly increasing populations, requiring less effort to make emissions cuts measured from a common baseline year. Europe also benefits from accounting rules that compare current emissions to years when emissions were artificially high, especially by combining emissions for West Germany and East Germany in 1990 and comparing them with today’s united Germany, in which many of East Germany’s highly inefficient power plants and factories no longer exist for economic reasons. According to Eurostat, Germany represented around 20 percent of Europe’s total emissions in 2008, but accounted for 43 percent of the decline in emissions since 1990. Finally, Europe (and Germany in particular) is in a sense outsourcing greenhouse emissions through its extensive use of Russia’s natural gas rather than domestic coal, even as Moscow substitutes its own coal for gas internally to maintain export revenues.

Europe seems likely to confront the same dilemmas America faces moving forward and may, in fact, already be facing them. For while citizens in most European countries are accustomed to higher gasoline and electricity prices — and, for that matter, higher taxes — it is not the absolute level of these costs that excites public opinion but rather the changes up and down. So while Europeans may tolerate higher costs, it is far from assured that they would accept considerable new increases. Moreover, at the level of the European Union, European advocates of steep emission reductions face a
problem similar to that of the Senate Democrats but more severe: They need to accommodate coal-dependent national governments, rather than coal-dependent senators. After bailing out Greece and Ireland, how much will Germans be willing to pay to reduce greenhouse gas emissions in Poland, Hungary, and other new members of the European Union? And how much will the EU and its member states be able to pay while implementing austerity packages to address calamitous deficits?

At the deepest level, it is so difficult to make climate policy because it is not really climate policy at all, but a back door to energy policy. And energy is one of the most politically sensitive issues in modern society, because it is so intimately intertwined with so many other issues, both economically and in daily life. In America, energy policy intersects with life in countless ways, from how people get to work (and, in fact, whether they have a job in some cases) to how comfortable they are in their homes, how much they pay for energy, and how much they have left over for other things. As a result, making energy policy is an extremely dangerous pursuit for politicians: It carries within it scores of potential booby traps, any of which might end a career in elected office. From a political perspective, trying to pass climate change legislation is like trying to walk through a minefield with a blindfold — and a dozen different sleeve-tugging guides, each of whom is sure that his path is the safe one.

Climate change lessons
With this in mind, the first climate policy lesson for the Obama administration is that the climate issue simply does not, and in the foreseeable future cannot, provide a sufficiently broad political base to make the policy changes necessary to address climate change successfully. What America really needs is more effective and focused economic policy, including energy policy, driven by America’s economic needs but sensitive to climate concerns. Trying to make economic policy or even energy policy via climate policy puts the politics upside down and will not succeed in preventing climate change.

The second and related lesson is that even seemingly minor yet still binding international commitments will be difficult if not impossible to ratify within the United States, especially in a more closely divided Senate. Many have discussed a compromise solution for the climate negotiations, under which countries would sign a treaty codifying their existing domestic policies. While this appears to be noncontroversial on its face (and limited in its impact), even this outcome is unlikely in view of America’s domestic political realities. In view of the attention to the economy — and to China — during the 2010 election campaign, virtually any U.S. climate legislation, even a further watered-down domestic cap-and-trade bill, would have a minimal chance to overcome a Senate filibuster with 60 votes, let alone win the 67
February & March 2011 23

W

Paul J. Saunders & Vaughan Turekian
votes required by the Constitution to ratify a treaty and internationalize such a policy. The door to a global treaty involving the United States is all but shut. A third lesson is that China, like the United States, is making decisions based on its domestic needs rather than any particular sense of global obligation to reduce climate change impact in more vulnerable countries. This situation is even less likely to change in China than in America, because of differences between what leaders in the two countries might reasonably fear. The president, his cabinet, senators, and House members might lose their jobs as a result of costly policies that slow growth, but many of them would probably move quickly into new jobs with higher pay. China’s leaders have considerably more at stake: They fear that a slowing China, like the economy could produce widespread protests, political instability, or even the collapse of the United States, is Communist Party’s control. China’s climate policy making decisions will be driven by its leaders’ need to maintain exports and create jobs as well as their interest in based on its reducing energy consumption as a matter of ecodomestic needs nomic policy and energy security rather than special environmental concern. rather than any Another lesson, the fourth, is that U.S. climate particular sense policy cannot be disconnected from America’s broader national interests. The sharp reductions in of global China’s emissions that advocates seek would likely obligation. require a foreign investment effort on the scale of the Marshall Plan — something difficult to reconcile with America’s economic and security interests vis-à-vis China at a time when Beijing is already becoming increasingly assertive and when China’s prosperity is in large measure attributable to U.S. policy in the first place. The massive transfers of wealth some advocates seek from the United States and Europe to China and other developing nations are totally impractical. Even if it materializes, the $100 billion per year by 2020 envisioned under the Cancun agreement will make only a modest difference due to the scale of investment required. The administration should not need to learn that un-based processes are often ineffective, but this must be the fifth lesson. The two most important participants in un climate talks — the United States and China — are unlikely to accept un-mandated emissions limits, especially limits sufficiently tough to avert climate change impacts. (Still, each will probably do much more than it is prepared to promise.) The most committed participants, in Europe, don’t produce a sufficiently large share of global emissions for even drastic cuts to succeed on their own. And the most anxious participants, in poorer developing countries, can do little more than watch the process with diminishing hope, while trying to extract compensation payments from wealthier economies.
The sixth and final lesson is the central role of technology. The history of international climate talks shows that governments and societies will generally commit only to limits that they believe to be economically viable, which from a policymaking perspective means limits that nations can reasonably expect to satisfy on the basis of existing technologies and expected improvements — which we already know will be inadequate to prevent climate change. At the same time, if we achieve a technological breakthrough that makes radical emissions reductions economically attractive, binding limits will not be necessary to produce the required action.

What to do?
Taken together, these lessons force a broad conclusion that must be the basis for any successful policy: It is very unlikely that humanity will be able to stop or considerably slow climate change by relying on binding emissions limits, whether domestic or international.
At the international level, this has several policy implications. The first is that emissions-reduction discussions should focus on action-oriented dialogue among major emitters, the top 21 of which accounted for 79 percent of global emissions in 2007, according to the International Energy Agency. UN-based processes create the illusion of action, consuming considerable time and energy in the process (as well as producing a lot of CO2 to bring delegates to international conferences), but are secondary to solving the climate challenge. As in Cancun, the United States should reduce its diplomatic commitment and presence at future UNFCCC meetings, concentrating on using the sessions for coordination and information-sharing rather than negotiation. While the Bush administration made plenty of mistakes early on, theatrically pulling out of Kyoto when it could just as easily have allowed it to languish in the Senate, as it surely would have, and vocally denying the science of climate change, it eventually saw the need to address climate change through discussion with major emitters. Combined with an apparent preexisting bias against the United Nations, this led President Bush to launch the Major Economies Meeting, bringing together the world's largest economies to discuss energy technology and related issues. The Obama administration re-branded this effort as the Major Economies Forum on Energy and Climate.
Secondly, rather than trying to cajole or shame China into taking on binding commitments in international negotiations, which will not produce important results, America should focus on encouraging further Chinese action to reduce emissions. This could include offering expanded economic, scientific, and technical cooperation with China (while protecting U.S. economic interests, including intellectual property rights) to accelerate its efforts. Simultaneously, the United States should underscore to Beijing the climate change impacts predicted in major studies and how associated extremes in weather would be especially problematic for China, with its lower per capita GDP and less resilient political, economic, and social systems. This should not be a public message, but it can be a clear one.
In the realms of diplomacy and global public opinion, the United States should cede no ground, working to prevail in rhetorical battles and to win over media and thought leaders. Tactics in these areas could include working with delegations from smaller developing countries to shift the onus of action (and blame) to China by demonstrating an American commitment to taking real and measurable steps to reduce emissions. Already committed to the principle of common but differentiated responsibility with respect to their own actions, poor developing nations are increasingly accepting the notion that not all developing countries have equal obligations and might at a minimum deprive China of their public support. The U.S. appeared to make some headway in this direction at Cancun. Realistically, these efforts will probably have little impact on Chinese policy; for many if not most of these governments, concerns over climate change impacts are long-term and secondary to their immediate hopes for economic growth, infrastructure projects, and the political benefits of both. Chinese investment can make a key contribution to achieving these objectives.
What is important is to recognize that there is a difference between China and other developing economies, and even between China and other large developing economies like India and Brazil. The difference is a matter of scale; one can credibly refer to the United States and China as a G2 because their combined economies — and greenhouse gas emissions — are so large. India's emissions are one-fifth to one-quarter of U.S. or Chinese emissions, and are growing at two-thirds the rate of China's emissions. China, India, Brazil, and other developing nations all share an emphasis on development rather than emissions reductions and an unwillingness to accept binding international limits on their emissions, but they do not have equal responsibility for projected growth in emissions from developing economies or equal abilities to reduce emissions or address their consequences.
Domestically, the United States should focus on energy policy rather than climate policy, recognizing that altering energy consumption patterns takes quite some time and seeking both incremental improvements in efficiency and breakthroughs. The central goal of such a policy would be to orient American energy policy to serve broader national economic and security goals, by increasing efficiency and therefore productivity, contributing to economic growth and creating jobs, maintaining and extending America's global leadership in science and technology, and limiting our exposure to volatile commodity prices. New technologies also require an adept and technically literate society, which in turn requires education reform. President Obama's efforts to build support for a new economic growth strategy based on developing new products rather than new financial instruments are constructive. Taking political facts of life into account, policies that focus on incentives to develop new technologies rather than applying penalties to existing technologies will be much more likely to succeed, though in an environment of increasing concern about deficits, it will not be easy to establish and maintain incentives.
The highest priority in this approach must be investment in research and development as well as measures to speed the commercialization and implementation of successful new technologies. Public-private investment funds could be one option for accelerating the deployment of new technologies. At the international level, new collaborative research programs, intensified exchanges of existing best practices, and expanded technology-sharing will be important. Protecting intellectual property rights will be essential to stimulating the innovation necessary to produce breakthroughs. While some exotic technologies, such as geo-engineering (attempting to reduce temperatures by increasing the atmosphere's reflectivity, dissipating more of the sun's energy into space), seem fraught with problems, it would be irresponsible for the policymaking and scientific communities to ignore them. Further study of the technologies and their political and economic implications is important.
A greater focus on concrete national-security-related issues could also be helpful. The general argument that climate change can lead to greater instability and requires a response on security grounds has been ineffective. Yet it is clear that the U.S. military's reliance on fossil fuels — and the supply chains they demand — is a real vulnerability when American forces are in the field; the frequent destruction of U.S. and NATO fuel tankers in Pakistan (and earlier in Iraq) illustrates this. Use-based research seeking to solve specific problems like this one, in this case developing low-emission alternative energy technologies that do not require massive distributed infrastructure, can lead to important progress toward both security and climate goals. Such research is often more suited to bipartisan support than broad, ambitious, and controversial programs. Moreover, given the track record of military-origin technologies adapted and commercialized for civilian use, these investments might eventually contribute to economy-wide reductions in emissions over the longer term.
The Obama administration should also look at innovative programs to encourage state and local measures. In the absence of federal action during the Bush administration, states launched a number of creative efforts to meet climate and energy goals. As the incubators for ideas, in constant competition with their neighbors for population and economic vitality, states are well-placed to develop practical solutions that respond to their varied circumstances. The Department of Education's "Race to the Top" competition for federal funds might serve as a model to catalyze new ideas; one can imagine a similar program of incentives that encourages states (or groups of states) to compete to formulate the best emissions reduction strategies in targeted sectors, such as transportation, electricity, or commercial or residential buildings. Effective approaches could be implemented on a wider scale.
Finally, recognizing that the world is unlikely to stop climate change, America will need to intensify research on climate change impact and begin federal, state, and local assessments of the policies needed to adapt to the most likely domestic consequences. While important steps can and will be taken to reduce greenhouse gas emissions, any country, industry, or community that does not increase its understanding of the impact of climate change and build up resilience is putting itself at considerable risk.

The Persistence of Genocide
By David Rieff

David Rieff is a New York-based writer and policy analyst who has written extensively about humanitarian aid and human rights. He is the author of eight books, including A Bed for the Night: Humanitarianism in Crisis and At the Point of a Gun: Democratic Dreams and Armed Intervention, and is currently writing a book on the global food crisis.
According to the great historian of the Holocaust, Raul Hilberg, the phrase "Never Again" first appeared on handmade signs put up by inmates at Buchenwald in April 1945, shortly after the camp had been liberated by U.S. forces. "I think it was really the Communists who were behind it, but I am not sure," Hilberg said in one of the last interviews he gave before his death in the summer of 2007. Since then, "Never Again" has become a kind of shorthand for the remembrance of the Shoah. At Buchenwald, the handmade signs were long ago replaced by a stone monument onto which the words are embossed in metal letters. And as a usage, it has come to seem like a final word not just on the murder of the Jews of Europe, but on any great crime against humanity that could not be prevented. "Never Again" has appeared on monuments and memorials from Paine, Chile, the town with proportionately more victims of the Pinochet dictatorship than any other place in the country, to the Genocide Museum in Kigali, Rwanda. The report of CONADEP, the Argentine truth commission set up in 1984 after the fall of the Galtieri dictatorship, was titled "Nunca Más" — "Never Again" in Spanish. And there is now at least one online Holocaust memorial called "Never Again."
There is nothing wrong with this. But there is also nothing all that right with it either. Bluntly put, an undeniable gulf exists between the frequency with which the phrase is used — above all on days of remembrance most commonly marking the Shoah, but now, increasingly, other great crimes against humanity — and the reality, which is that 65 years after the liberation of the Nazi concentration camps, "never again" has proved to be nothing more than a promise on which no state has ever been willing to deliver. When, last May, the writer Elie Wiesel, himself a former prisoner in Buchenwald, accompanied President Barack Obama and Chancellor Angela Merkel to the site of the camp, he said that he had always imagined that he would return some day and tell his father's ghost that the world had learned from the Holocaust and that it had become a "sacred duty" for people everywhere to prevent it from recurring. But, Wiesel continued, had the world actually learned anything, "there would be no Cambodia, and no Rwanda and no Darfur and no Bosnia."
Wiesel was right: The world has learned very little. But this has not stopped it from pontificating much. The Obama administration's National Security Strategy Paper, issued in May 2010, exemplifies this tendency. It asserts confidently that "The United States is committed to working with our allies, and to strengthening our own internal capabilities, in order to ensure that the United States and the international community are proactively engaged in a strategic effort to prevent mass atrocities and genocide." And yet again, we are treated to the promise, "never again." "In the event that prevention fails," the report states, "the United States will work both multilaterally and bilaterally to mobilize diplomatic, humanitarian, financial, and — in certain instances — military means to prevent and respond to genocide and mass atrocities." Of course, this is not strategy, but a promise that, decade in and decade out, has proved to be empty. For if one were to evaluate these commitments by the results they have produced so far, one would have to say that all this "proactive engagement" and "diplomatic, financial, and humanitarian mobilization" has not accomplished very much.
No one should be surprised by this. The U.S. is fighting two wars and still coping (though it has fallen from the headlines) with the floods in Pakistan, whose effects will be felt for many years in a country where America's security interests and humanitarian relief efforts are inseparable. At the same time, the crisis over Iran's imminent acquisition of nuclear weapons capability is approaching its culmination. Add to this the fact that the American economy is in shambles, and you do not exactly have a recipe for engagement. The stark fact is that "never again" has never been a political priority for either the United States or the so-called international community (itself a self-flattering idea with no more reality than a unicorn). Nor, despite all the bluff talk about moral imperatives backed by international resolve, is there any evidence that it is becoming one.
And yet, however at variance they are with both geopolitical and geoeconomic realities, the arguments exemplified by this document reflect the conventional wisdom of the great and the good in America across the "mainstream" (as one is obliged to say in this, the era of the tea parties) political spectrum. Even a fairly cursory online search will reveal that there are a vast number of papers, book-length studies, think tank reports, and United Nations documents proposing programs for preventing or at least halting genocides. For once, the metaphor "cottage industry" truly is appropriate. And what unites almost all of them is that they start from the premise that prevention is possible, if only the "international community" would live up to the commitments it made in the Genocide Convention of 1948, and in subsequent international covenants, treaties, and UN declarations. If, the argument goes, the world's great powers, first and foremost of course the United States, in collaboration with the UN system and with global civil society, would act decisively and in a timely way, we could actually enforce the moral standards supposedly agreed upon in the aftermath of the Holocaust. If they do not, of course, then "never again" will never mean much more than it has meant since 1945 — which, essentially, is "Never again will Germans kill Jews in Europe in the 1940s."
The report of the United States Institute of Peace's task force on genocide, chaired by former Secretary of State Madeleine Albright and former Secretary of Defense William Cohen, is among the best of these efforts. As the report makes clear, the task force undertook its work all too painfully aware of the gulf between the international consensus on the moral imperative of stopping genocide and the ineffectiveness to date of the actual responses. Indeed, the authors begin by stating plainly that 60 years after the United Nations adopted the Genocide Convention and twenty years after it was ratified by the U.S. Senate, "The world agrees that genocide is unacceptable and yet genocide and mass killings continue." To find ways to match words and "stop allowing the unacceptable," Albright and Cohen write with commendable candor, "is in fact one of the most persistent puzzles of our times."
Whether or not one agrees with the task force about what can or cannot be done to change this, there can be no question that sorrow over the world's collective failure to act in East Pakistan, or Cambodia, or Rwanda is the only honorable response imaginable. But the befuddlement the authors of the report confess to feeling is another matter entirely. Like most thinking influenced by the human rights movement, the task force seems imbued with the famous Kantian mot d'ordre: "Ought implies can." But to put the matter bluntly, there is no historical basis to believe anything of the sort, and a great deal of evidence to suggest a diametrically opposing conclusion.
Of course, history is not a straitjacket, and the authors of the report, again echoing much thinking within the human rights movement, particularly Michael Ignatieff's work in the 1990s, do make the argument that since 1945 there has been what Ignatieff calls "a revolution of global concern" and they call a "revolution in conscience." In fairness, if in fact they are basing their optimism on this chiliastic idea, then one better understands the degree to which the members of the task force came to believe that genocide, far from being "A Problem From Hell," as Samantha Power titled her influential book on the subject, in reality is a problem if not easily solved then at least susceptible to solution — though, again, only if all the international actors, by whom the authors mean the great powers, the UN system, countries in a region where there is a risk of a genocide occurring, and what they rather uncritically call civil society, make it a priority.
Since it starts from this presupposition, it is hardly surprising that the report is upbeat about the prospects for finally reversing course. "Preventing genocide," the authors insist, "is a goal that can be achieved with the right institutional structures, strategies, and partnerships — in short, with the right blueprint." To accomplish this, the task force emphasizes the need for strengthening international cooperation both in terms of identifying places where there is a danger of a genocide being carried out and coordinated action to head it off or at least halt it. Four specific responses are recommended, one predominantly informational (early warning) and three operational (early prevention, preventive diplomacy, and, finally, military intervention when all else has failed). None of this is exactly new, and most of it is commonsensical from a conceptual standpoint. But one of the great strengths of the report, as befits the work of a task force chaired by two former cabinet secretaries, is this practical bent — that is to say, its emphasis on creating or strengthening institutional structures within the U.S. government and the UN system and showing how such reforms will enable policymakers to respond effectively to genocide.
However, this same presupposition leads the authors of the report to write as if there were little need for them to elaborate the political and ideological bases for the "can do" approach they recommend. Francis Fukuyama's controversial theory of the "End of History" goes unmentioned, but there is more than a little of Fukuyama in their assumptions about a "final" international consensus having been established with regard to the norms that have come into force protecting populations from genocide or mass atrocity crimes. It is true that there is a body of such norms: the Genocide Convention, the UN's so-called Responsibility to Protect doctrine, adopted by the World Summit (with the strong support of the Bush administration) in 2005, and various international instruments limiting impunity, above all the Rome Statute that created the International Criminal Court. And, presumably, it is with these in mind that the report's authors can assert so confidently that the focus in genocide prevention can now be on "implement[ing] and operationalizing the commitments [these instruments] contain."
It is here that doubt will begin to assail more skeptical readers. Almost since its inception, the human rights movement has been a movement of lawyers. And for lawyers, the establishment of black-letter international law is indeed the "end of the story" from a normative point of view — an internationalized version of stare decisis, but extended to the nth degree. On this account such a norm, once firmly established (which, activists readily admit, may take time; they are not naifs), can within a fairly short period thereafter be understood as an ineradicable and unchallengeable part of the basic user's manual for international relations. This is what has allowed the human rights movement (and, at least with regard to the question of genocide, the members of the task force in the main seem to have been of a similar cast of mind) to hew to what is essentially a positivist progress narrative.
However, the human rights movement's certitude on the matter derives less from its historical experience than it does from its ideological presuppositions. In this sense, human rights truly is a secular religion, as its critics, and even some of its supporters, have long claimed. Of course, strategically (in both polemical and institutional terms) the genius of this approach is of a piece with liberalism generally, of which, in any case, "human rights-ism" is the offspring. Liberalism is the only modern ideology that will not admit it is an ideology. "We are just demanding that nations live up to the international covenants they have signed and the relevant national and international statutes," the human rights activist replies indignantly when taxed with actually supporting, and, indeed, helping to midwife, an ideological system. It may be tedious to have to point out in 2010 that law and morality are not the same thing, but, well, law and morality are not the same thing. The problem is that much of the task force report reads as if they were.
An end to genocide: It is an attractive prospect, not to mention a morally unimpeachable goal in which Kantian moral absolutism meets American can-do-ism, where the post-ideological methodologies (which are anything but post-ideological, of course) of international lawyers meet the American elite's faith, which goes back at least to Woodrow Wilson if not much earlier in the history of the republic, that we really can right any wrong if only we commit ourselves sufficiently to doing so. Unfortunately, far too much is assumed (or stipulated, as the lawyers say) by the report's authors. More dismayingly still, far too many of the concrete examples of what could have been done but wasn't are presented so simplistically as to make the solutions offered appear hollow, since the challenge as described bears little or no resemblance to the complexities that actually exist.
Darfur is a good example of this. The report mentions Darfur frequently, both in the context of a nuts and bolts consideration of the strengths and weaknesses of various states and institutions such as the UN and the African Union, which have intervened, however unsatisfactorily, over the course of the crisis, and as an example of how the mobilization of civil society can influence policy. "In today's age of electronic media communication," the report states, "Americans are increasingly confronted in their living rooms — and even on their cell phones — with information about and images of death and destruction virtually anywhere they occur. . . . The Internet has proven to be a powerful tool for organizing broad-based responses to genocide and mass atrocities, as we have seen in response to the crisis in Darfur."
The problem is not so much that this statement is false but rather that it begs more questions than it answers, and, more tellingly still, that the report's authors seem to have no idea of this. There is no question that the rise in 2005 and 2006 of a mass movement calling for an end to mass killing in Darfur (neither the United Nations nor the most important relief groups present on the ground in Darfur agree with the characterization of what took place there as a genocide) was an extraordinarily successful mobilization — perhaps the most successful since the anti-Apartheid movement of the 1970s and 1980s. Beginning with the activism of a small group of college students who in June 2004 had attended a Darfur Emergency Summit organized by the U.S. Holocaust Memorial Museum and addressed by Elie Wiesel, and shortly afterwards founded an organization called Save Darfur, the movement rapidly expanded and, at its height, included the U.S. Congressional Black Caucus, right-wing evangelicals, left-leaning campus activists, mainline human rights activists, and American neoconservatives. But nowhere does the task force report examine whether the policy recommendations of this movement were wise, or, indeed, whether the effect that they had on the U.S. debate was positive or negative. Instead, the report proceeds as if any upsurge in grassroots interest and activism galvanized by catastrophes like Darfur is by definition a positive development.
In reality, the task force's assumption that any mass movement that supports "more assertive government action in response to genocide and mass atrocities" is to be encouraged is a strangely content-less claim. Surely, before welcoming the rise of a Save Darfur (or its very influential European cousin, SOS Darfour), it is important to think clearly not just about what they are against but what they are for. And here, the example of Save Darfur is as much a cautionary tale as an inspiring one. The report somewhat shortchanges historical analysis, and what little history does make it in is painted with a disturbingly broad brush. Obviously, the task force was well aware of this, which I presume is why its report insists, unwisely in my view, that it was far more important to focus on the present and the future than on the past. But understanding the history is not marginal, it is central.
Put the case that one believes in military intervention in extremis to halt genocide. In that case, intervening in late 2003 and early 2004, when the killing was at its height, would have been the right thing to do. But Save Darfur really only came into its own in late 2005, that is, well after the bulk of the killing had ended. In other words, the calls for an intervention reached their height after the moral imperative for such an intervention had started to dissipate. An analogy can be made with the human rights justification for the U.S. overthrow of Saddam Hussein. As Kenneth Roth, the head of Human Rights Watch, has pointed out, had this happened during Baghdad's murderous Anfal campaign against the Kurds in 1988, there would have been a solid justification for military intervention, whether or not Human Rights Watch would have agreed with it. But to intervene fifteen years later because of the massacre was indefensible on human rights grounds (though, obviously, there were other rationales for the war that would not have been affected by such reasoning).
If you want to be a prophet, you have to get it right. And if Save Darfur was wrong in its analysis of the facts relevant to their call for an international military intervention to stop genocide, either because there had in reality been no genocide (as, again, the UN and many mainstream NGOs on the ground insisted) or because the genocide had ended before they began to campaign for intervention, then Save Darfur's activism can just as reasonably be described in negative terms as in the positive ones of the task force report. Yes, Save Darfur had (and has) good intentions and the attacks on them from de facto apologists for the government of Sudan like Mahmood Mamdani are not worth taking seriously. But good intentions should never be enough.1
1. Under attack from a number of quarters, the leadership of Save Darfur has claimed that they were never calling for a military intervention to overthrow the Bashir regime in Khartoum but rather for an international protection force to protect the people of Darfur. Leaving aside whether, in practical terms, this is a distinction without a difference (i.e., that the latter would have required the former, as other pro-Darfur activists like Eric Reeves and Gerard Prunier had the courage to acknowledge), the record of their statements belies this claim.
In fairness, had the task force decided to provide the history of Darfur, or Bosnia, or Rwanda, in all their frustrating complexity, they would have produced a report that, precisely because of all the nuance, the ambiguity, the need for "qualifiers," doubtless would have been of less use to policymakers, whose professional orientation is of necessity toward actionable policies. But when what is being suggested is a readiness for U.S. soldiers (to be sure, preferably in a multilateral context) in extreme cases to kill and die to prevent genocide or mass atrocity crimes, then, to turn human rights Kantianism against them for a change, it is nuance that is the moral imperative. Again, good intentions alone will not do. Qui veut faire l'ange, fait la bête, Pascal said. Who wishes to act the angel, acts the beast. History, in all its unsentimentality, is almost always the best antidote to such simplicities.
And yet, if anything, the task force's report is a textbook case of ahistorical thinking and its perils. The authors emphasize that, "This task force is not a historical commission; its focus is on the future and on prevention." The problem is that unless the past is looked at in detail, not just conjured up by way of illustrations of the West's failures to intervene that the task force hopes to remedy, then what is being argued for, in effect, are, if necessary, endless wars of altruism. To put it charitably, in arguing for that, I do not think the authors have exactly established their claim to occupying the moral high ground. If they had spent half the time thinking about history in as serious a way as they did about how to construct the optimal bureaucratic architecture within the U.S. government, then what the task force finally produced would have been a document that was pathbreaking. Instead, they took the conventional route, and, in my view, will simply add their well-reasoned policy recommendations to the large number that came before and, indeed, as in the case of the recent initiative of the Montreal Institute for Genocide and Human Rights Studies on the so-called Will to Intervene, have already begun to come after. With the best will in the world, what is one to make of arguments made at the level of generalization of the following?
Grievances over inequitable distribution of power and resources appear to be a fundamental motivating factor in the commission of mass violence against ethnic, sectarian, or political groups. That same inequality may also provide the means for atrocities to be committed. For example, control of a highly centralized state apparatus and the access to economic and military power that comes with it makes competition for power an all-or-nothing proposition and creates incentives to eliminate competitors. This dynamic was evident in Rwanda and Burundi and is serious cause for concern in Burma today.

The fact is that, vile as they are, there is actually very little likelihood of the butchers in Rangoon committing genocide — their crimes have other characteristics. It is disheartening that the members of the task force would allow the fact that they, like most sensible people, believe that Burma is one of the worst dictatorships in the world to justify their distorting reality in this way, when they almost certainly know better. And since they do precisely that, it is hard not to at least entertain the suspicion — whose implications extend rather further than that and beg the question of what kind of world order follows from the task force's recommendations — that consciously or (and this is worse, in a way) unconsciously they reasoned that if they could identify the Rangoon regime as genocidal, this would make an international intervention to overthrow it far more defensible.
If this is right, then, if implemented, the report (again, intentionally or inadvertently) would have the effect of helping nudge us back toward a world where the prevention of genocide becomes a moral warrant for other policy agendas (as was surely the case with Saddam Hussein in 2003, and was the case with General Bashir in Khartoum until the arrival of the Obama administration). I write this in large measure because the task force's description of why mass violence and genocide occur could be a description of practically the entire developing world. Analysis at that level of generalization is not just useless, it is actually a prophylactic against thought. It gets worse. The authors write:
It is equally important to focus on the motivations of specific leaders and the tools at their disposal. There is no genocidal destiny. Many countries with ethnic or religious discrimination, armed conflicts, autocratic governments, or crushing poverty have not experienced genocide while others have. The difference comes down to leadership. Mass atrocities are organized by powerful elites who believe they stand to gain from these crimes and who have the necessary resources at their disposal. The heinous crimes committed in Nazi-occupied Europe, Cambodia, and Rwanda, for example, were all perpetrated with significant planning, organization, and access to state resources, including weapons, budgets, detention facilities, and broadcast media. There are also key triggers that can tip a high-risk environment into crisis. These include unstable, unfair, or unduly postponed elections; high-profile assassinations; battlefield victories; and environmental conditions (for example, drought) that may cause an eruption of violence or heighten the perception of an existential threat to a government or armed group. Sometimes potential triggers are known well in advance and preparations can be made to address the risk of mass atrocities that may follow. Poorly planned elections in deeply divided societies are a commonly cited example, but deadlines for significant policy action, legal judgments, and anniversaries of highly traumatic and disputed historical events are also potential triggers that can be foreseen.

I tax the reader's patience with such a long quotation to show how expertise can produce meaninglessness. For apart from the mention of poorly planned elections — a reference to Rwanda that is perfectly correct as far as it goes — the rest of this does not advance our understanding one iota. To remedy or at least alleviate these vast social stresses, the task force recommends "effective [sic] early prevention"! The authors themselves were obliged to admit that, "Such efforts to change underlying social, economic, or political conditions are difficult and require sustained investment of resources and attention." Really, you think? But about where these resources, as opposed to institutional arrangements, are to come from, they are largely silent, apart from emphasizing the need to target with both threats and positive inducements leaders thought likely to choose to commit such crimes. But the authors know perfectly well that, as they themselves put it, "early engagement is a speculative venture," and that "the watch list of countries 'at risk' can be long, due to the difficulty of anticipating specific crises in a world generally plagued by instability." Surely, people like Secretary Albright and Secretary Cohen know better than anyone that such ventures are never going to be of much interest to senior policymakers, just as the global Marshall Plan that would be required to effectively address the underlying causes of genocidal wars is never going to be on offer.
To a great power, and to the citizens of a great power, powerlessness is simply an unconscionable destiny. The task force report, with its strange imperviousness to viewing historical tragedy as much more than an engineering problem, is a perfect illustration of this. Unsound historically, and hubristic morally, for all its good intentions, the task force report is not a blueprint for a better future but a mystification of the choices that actually confront us and between which we are going to have to choose if we are ever to prevent or halt even some genocides. My suspicion is that the reason the very accomplished, distinguished people who participated in the task force did not feel obliged to face up to this is that the report gives as much weight to the national interest basis for preventing or halting genocide as it does to the moral imperative of doing so. As the report puts it:
First, genocide fuels instability, usually in weak, undemocratic, and corrupt states. It is in these same types of states that we find terrorist recruitment and training, human trafficking, and civil strife, all of which have damaging spillover effects for the entire world. Second, genocide and mass atrocities have long-lasting consequences far beyond the states in which they occur. Refugee flows start in bordering countries but often spread. Humanitarian needs grow, often exceeding the capacities and resources of a generous world. The international community, including the United States, is called on to absorb and assist displaced people, provide relief efforts, and bear high economic costs. And the longer we wait to act, the more exorbitant the price tag. For example, in Bosnia, the United States has invested nearly $15 billion to support peacekeeping forces in the years since we belatedly intervened to stop mass atrocities. Third, America's standing in the world — and our ability to lead — is eroded when we are perceived as bystanders to genocide. We cannot be viewed as a global leader and respected as an international partner if we cannot take steps to avoid one of the greatest scourges of humankind. No matter how one calculates U.S. interests, the reality of our world today is that national borders provide little sanctuary from international problems. Left unchecked, genocide will undermine American security. A core challenge for American leaders is to persuade others — in the U.S. government, across the United States, and around the world — that preventing genocide is more than just a humanitarian aspiration; it is a national and global imperative.

Again, apologies for quoting at such length, but truthfully, is one meant to take this seriously? There is absolutely no evidence that terrorist recruiting is more promising in failed states than, say, in suburban Connecticut, where the (very middle-class) Faisal Shahzad, son of a retired Pakistani Air Force vice-marshal, plotted to explode a car bomb in Times Square. Nor, in the U.S. case, is there any basis for concluding that the main source of immigration is from places traumatized by war. To the contrary, most of our immigrants are the best and the brightest (in the sense not of the most educated but the most enterprising) of Mexico, the Philippines, India, and China. The proportion of migrants from Sudan or Somalia is small by comparison. As for the costs of peacekeeping, are the authors of the report serious? Fifteen billion dollars? The sum barely signifies in the rubric of the military budget of the United States. And lastly, the report's claim that the U.S. won't be viewed as a global leader and respected as an international partner if it doesn't take the lead to stop genocide is absurd on its face. Not respected by whom, exactly? Hu Jintao in Beijing? Merkel in Berlin? President Felipe Calderon in Mexico City? To put it charitably, the claim conjures up visions of Pinocchio, rather than Theodore Roosevelt or Woodrow Wilson.
The report calls for courage, but courage begins at home. Pressed by Armenian activists at one of the events held to launch the report as to why they had both earlier signed a letter urging the U.S. not to bow to Armenian pressure and formally recognize the Armenian genocide, Secretary Cohen and Secretary Albright refused over and over again to characterize the Armenian genocide as, well, a genocide. It is true that the Armenian activists had come looking for a confrontation. But there can be little question that both secretaries did everything they could to avoid committing themselves one way or the other. "Terrible things happened to the Armenians," Secretary Albright said, refusing to go any further. The letter, she explained, had been primarily about "whether this was an appropriate time to raise the issue." For his part, Secretary Cohen emphasized that angering the Turks while the Iraq war was raging could lead to Turkish reactions that would "put our sons and daughters in jeopardy." And, in any case, the task force was not "a historical commission."
This is a perfectly defensible position from the perspective of prudential realpolitik. The problem is that what the task force report constantly calls for is political courage. And whatever else they were, Secretaries Albright and Cohen's responses were expedient, not courageous. There will always be reasons not to intervene — compelling pressures, I mean, not trivial ones. Why should a future U.S. government be less vulnerable to them than the Bush or Obama administrations? About this, as about so many other subjects, the task force report is as evasive as Secretary Albright and Secretary Cohen were at the press conference at which the Armenian activists confronted them. Doubtless, they had to be. For the solutions they propose are not real solutions, the history they touch on is not the actual history, and the world they describe is not the real world.

PTSD’s Diagnostic Trap
By Sally Satel

Sally Satel is a psychiatrist, a resident scholar at the American Enterprise Institute, and a lecturer at Yale University School of Medicine.
Military history is rich with tales of warriors who return from battle with the horrors of war still raging in their heads. One of the earliest examples was enshrined by Herodotus, who wrote of an Athenian warrior struck blind "without blow of sword or dart" when a soldier standing next to him was killed. The classic term — "shell shock" — dates to World War I; "battle fatigue," "combat exhaustion," and "war stress" were used in World War II. Modern psychiatry calls these invisible wounds post-traumatic stress disorder (PTSD). And along with this diagnosis, which became widely known in the wake of the Vietnam War, has come a new sensitivity — among the public, the military, and mental health professionals — to the causes and consequences of being afflicted. The Department of Veterans Affairs is particularly attuned to the psychic welfare of the men and women who are returning from Operation Iraqi Freedom and Operation Enduring Freedom. Last July, retired Army General Eric K. Shinseki, secretary of Veterans Affairs, unveiled new procedures that make it easier for veterans who believe they are disabled by wartime stress to file benefit claims and receive compensation. "[Psychological] wounds," Shinseki declared, "can be as debilitating as any physical battlefield trauma."
This is true. But gauging mental injury in the wake of war is not as straightforward as assessing, say, a lost limb or other physical damage. For example, at what point do we say that normal, if painful, readjustment difficulties have become so troubling as to qualify as a mental illness? How can clinicians predict which patients will recover when a veteran's odds of recovery depend so greatly on nonmedical factors, including his own expectations for recovery; social support available to him; and the intimate meaning he makes of his distress? Inevitably, successful caregiving will turn on a clear understanding of post-traumatic stress disorder.
One of the most important and paradoxical lessons to emerge from these insights is that lowering the threshold for receipt of disability benefits is not always in the best interest of the veteran and his family. Without question, some veterans will remain so irretrievably damaged by their war experience that they cannot participate in the competitive workplace. These men and women clearly deserve the roughly $2,300 monthly tax-free benefit (given for "total," or 100 percent, disability) and other resources the Veterans Administration offers. But what if disability entitlements actually work to the detriment of other patients by keeping them from meaningful work and by creating an incentive for them to embrace institutional dependence? And what if the system, well-intentioned though it surely is, does not adequately protect young veterans from a premature verdict of invalidism? Acknowledging and studying these effects of compensation can be politically delicate, yet doing so is essential to devising reentry programs of care for the nation's invisibly wounded warriors.

What is PTSD?
The most recent edition of the Diagnostic and Statistical Manual (DSM-IV) of the American Psychiatric Association defines PTSD according to symptoms; their duration; and the nature of the "trauma" or event. Symptoms fall into three categories: re-experiencing (e.g., relentless nightmares; unbidden waking images; flashbacks); hyperarousal (e.g., enhanced startle, anxiety, sleeplessness); and phobias (e.g., fear of driving after having been in a crash). These must persist for at least 30 days and impair function to some degree. Overwhelming calamity — or "stressor," as psychiatrists call it — of any kind, such as a natural disaster, rape, accident, or assault, can lead to PTSD.
Notably, not everyone who confronts horrific circumstances develops PTSD. Among the survivors of the Oklahoma City bombing, for example, 34 percent developed PTSD, according to a study by psychiatric epidemiologist Carol North. After a car accident or natural disaster, fewer than 10 percent of victims are affected, while among rape victims, well over half succumb. The reassuring news is that, as with grief and other emotional reactions to painful events, most sufferers get better with time, though periodic nightmares and easy startling may linger for additional months or even years.
In contrast to the sizeable literature on PTSD in civilian populations and in active-duty soldiers, data on veterans are harder to come by. To date, the congressionally mandated National Vietnam Veterans Readjustment Study (NVVRS) remains the landmark analysis. Data were collected during 1986 and 1987 and revealed that 15.2 percent of a random sample of veterans still met criteria for PTSD. Yet, a number of scholars found those estimates to be improbably high (e.g., if roughly one in six Vietnam veterans suffered from PTSD, as the NVVRS suggests, this would mean that virtually each and every soldier who served in combat — a ratio of 1 combatant to every 6 in support specialties — developed the condition). To help clarify the picture, a team of researchers from Columbia University undertook a reanalysis of the NVVRS. After their results appeared in Science in 2006, it became impossible for responsible researchers to consider the original findings of the NVVRS as definitive.
According to the Columbia reanalysis, the psychological cost of the war was 40 percent lower than the original NVVRS estimate — that is, 9.1 percent were diagnosed with PTSD at the time of the study. The researchers arrived at this prevalence rate by considering information — collected by the original NVVRS investigators but not used — on veterans' functional impairment (i.e., their ability to hold a job, fulfill demands of family life, maintain friendships, etc.). However, the Columbia team used a rather lenient definition of "impairment," stipulating that even veterans with "some difficulty" but who were "functioning pretty well" despite their symptoms had PTSD. This spurred yet another reanalysis. In a 2007 article in the Journal of Traumatic Stress, Harvard psychologist Richard McNally took the definition of impairment up a notch so that only veterans who had at least "moderate difficulty" in social or occupational functioning could qualify as having PTSD. In doing so, he further reduced the estimate of affliction to 5.4 percent. If nothing else, this analytic sequence — from the NVVRS, to the Columbia reevaluation, and to the McNally recalibration — serves as an object lesson in the definitional fluidity of psychiatric syndromes.

From the wars in Iraq and Afghanistan, researchers have collected data on thousands of active-duty servicemen, but very little on veterans of those conflicts. The most rigorous evaluation to date appeared in the Archives of General Psychiatry last summer. It was conducted by investigators at the Walter Reed Army Institute of Research who applied rigorous and uniform diagnostic standards. This distinguished their work from other studies on the current Gulf wars, which were deficient in one or more ways: failure to perform in-depth diagnostic assessments; use of broad sampling that did not distinguish combat from support personnel; or assessment by snapshot rather than longitudinal follow-up. The Walter Reed team assessed over 18,000 Army soldiers in infantry brigade combat teams at three points: pre-deployment (to establish a baseline); three months after deployment; and at twelve months post-deployment. After three months the rate of PTSD (symptoms accompanied by "serious impairment") was 6.3 percent higher than the pre-deployment baseline. At a year, it was 7.3 percent higher.

The new VA rule

n july 12, 2010, General Shinseki penned an op-ed in USA Today (“For Vets with ptsd, End of an Unfair Process”) announcing a new Veterans Administration rule making it easier for veterans suffering from ptsd to file disability claims. Part of the rule was straightforward: The va would no longer require that a veteran provide documentation of his exposure to combat trauma, seeing how such paperwork is often very difficult for veterans to obtain. Streamlining the lumbering claims bureaucracy is one thing, and welcome it is, but the new rule does not end there. It also establishes that noninfantry personnel can qualify for ptsd disability if they had good reason to fear danger, such as firefights or explosions, even if they did not actually experience it. “[If] a stressor claimed by a veteran is related to the veteran’s fear of hostile military or terrorist activity, he is eligible for a ptsd benefits,” according to the Federal Register. This is a strikingly novel amendment. The idea that one can sustain an enduring and disabling mental disorder based on anxious anticipation of a traumatic event that never materialized is a radical departure from the clinical — and common-sense — understanding that traumatic stress disorders are caused by events that actually do happen to people.1 However, this is by no means

1. The new rule is actually quite confusing. See the Federal Register, 75:133 (July 13, 2010), 39847, available online at http://www.thefederalregister.com/d.p/2010-07-13-2010-16885. (This and subsequent weblinks accessed December 13, 2010.) While the Federal Register states that a diagnosis of ptsd cannot be made “in the absence of exposure to a traumatic event,” in keeping with the formal psychiatric conception of ptsd, it also says, apparently contrarily, that “constant vigilance against unexpected attack” can constitute a stressor and that ptsd can result from “veteran’s fear of hostile military or terrorist activity.”

However, this is by no means the first time that controversy and ambiguity have swirled around the diagnosis of ptsd. During the Civil War, some soldiers were said to suffer “irritable heart” or “Da Costa’s Syndrome” — a condition marked by shortness of breath, chest discomfort, and pounding palpitations that doctors could not attribute to a medical cause. In World War I, the condition became known as “shell shock” and was characterized as a mental problem. The inability to cope was believed to reflect personal weakness — an underlying genetic or psychological vulnerability; combat itself, no matter how intense, was deemed little more than a precipitating factor. Otherwise well-adjusted individuals were believed to be at small risk of suffering more than a transient stress reaction once they were removed from the front. In 1917, the British neuroanatomist Grafton Elliot Smith and the psychologist Tom Pear challenged this view. They attributed the cause more to the experiences of war and less to the character or fiber of soldiers themselves. “Psychoneurosis may be produced in almost anyone if only his environment be made ‘difficult’ enough for him,” they wrote in their book, Shell Shock and Its Lessons. This triggered a feisty debate within British military psychiatry, and eventually the two sides came to agree that both the soldier’s predisposition to stress and his exposure to hostilities contributed to breakdown. By World War II, then, military psychiatrists believed that even the bravest and fittest soldier could endure only so much. “Every man has his breaking point,” the saying went. The story of ptsd, as we know it today, starts with the Vietnam War. In the late 1960s, a band of self-described antiwar psychiatrists — led by Chaim Shatan and Robert Jay Lifton, who was well known for his work on the psychological damage wrought by Hiroshima — formulated a new diagnostic concept to describe the psychological wounds that the veterans sustained in the war. They called it “Post-Vietnam Syndrome,” a disorder marked by “growing apathy, cynicism, alienation, depression, mistrust, and expectation of betrayal as well as an inability to concentrate, insomnia, nightmares, restlessness, uprootedness, and impatience with almost any job or course of study.” Not uncommonly, the psychiatrists said, these symptoms did not emerge until months or years after the veterans returned home. Civilian contempt for veterans, according to Messrs. Shatan and Lifton, further entrenched their hostility and impeded their return. This vision inspired portrayals of the Vietnam veteran as a kind of “walking time bomb,” “living wreckage,” or rampaging loner, images immortalized in films such as “Taxi Driver” and “Rambo.” In the summer of 1972, the New York Times ran a front-page story on Post-Vietnam Syndrome. It reported that 50 percent of all Vietnam veterans — not just combat veterans — needed professional help to readjust, and contained
phrases such as “psychiatric casualty,” “emotionally disturbed,” and “men with damaged brains.” By contrast, veterans of World War II were heralded as heroes. They had fought in a popular war, a vital distinction for understanding how veterans and the public give meaning to their wartime hardships and sacrifice. Historians and sociologists note that the high-profile involvement of civilian psychiatrists in the wake of the Vietnam War was another feature that set those returning soldiers apart. “The suggestion or outright assertion was that Vietnam veterans have been unique in American history for their psychiatric problems,” writes the historian Eric T. Dean Jr. in Shook over Hell: Post-Traumatic Stress, Vietnam, and the Civil War. As the image of the psychologically injured veteran took root in the national conscience, the psychiatric profession debated the wisdom of giving him his own diagnosis.

ptsd becomes official

In 1980, the American Psychiatric Association adopted post-traumatic stress disorder (rather than the narrower post-Vietnam syndrome) as an official diagnosis in the third edition of its Diagnostic and Statistical Manual. A patient could be diagnosed with ptsd if he experienced a trauma or “stressor” that, as the dsm described it, would “evoke significant symptoms of distress in almost everyone.” Rape, combat, torture, and fires were those deemed to fall, as the dsm III required, “generally outside the range of usual human experience.” Thus, while the stress was unusual, the development of ptsd in its wake was not. No longer were prolonged traumatic reactions viewed as a reflection of an individual’s constitutional vulnerability. Instead, stress-induced syndromes were a natural process of adapting to extreme stress. With the introduction of ptsd into the psychiatric manual, the single-minded emphasis on the importance of one’s pre-morbid state in shaping response to crisis gave way to preoccupation with the trauma itself and its supposed leveling effect on human response. Surely, it was wrong of earlier psychiatrists to attribute war-related pathology solely to the combatant himself, but the dsm III definition embodied an equal but opposite error: It obliterated the role of an individual’s own characteristics in the development of the condition. Not surprisingly, perhaps, this blunder served a political purpose. As British psychiatrist Derek Summerfield put it, the newly minted diagnosis of ptsd “was meant to shift the focus of attention from the details of a soldier’s background and psyche to the fundamentally traumatic nature of war.” Shatan and Lifton clearly saw ptsd as a normal response. “The placement of post-traumatic stress disorder in [the dsm] allows us to see the policies of diagnosis and disease in an especially clear light,” writes combat veteran and sociologist Wilbur Scott in his detailed 1993 account The Politics of Readjustment: Vietnam Veterans Since the War.
The diagnosis of ptsd is in the dsm, Mr. Scott writes, “because a core of psychiatrists and Vietnam veterans worked conscientiously and deliberately for years to put it there . . . at issue was the question of what constitutes a normal reaction or experience of soldiers to combat.” Thus, by the time ptsd was incorporated into the official psychiatric lexicon, it bore a hybrid legacy — part political artifact of the antiwar movement, part legitimate diagnosis. Over the years, the major symptoms of ptsd have remained fairly straightforward — re-experiencing, anxiety, and phobic avoidance — but what counted as a traumatic experience turned out to be a moving target in subsequent editions of the dsm. In 1987, the dsm III was revised to expand the definition of a traumatic experience. The concept of stressor now included witnessing harm to others, such as a horrific car accident in progress. In the fourth edition in 1994, the range of “traumatic” events was expanded further to include hearing about harm or threats to others, such as the unexpected death of a loved one, or receiving a fatal diagnosis, such as terminal cancer, oneself. No longer did one need to experience a life-threatening situation directly or be a close witness to a ghastly accident or atrocity. As long as one experienced “intense fear, helplessness, or horror” in response to a catastrophic event (e.g., after watching the September 11 terrorist attacks on television, or being in a minor car accident), he could conceivably qualify for a diagnosis of ptsd if symptoms of re-experiencing, arousal, and phobias persisted for a month. There is pitched debate within the field of traumatology as to whether a stressor should be defined as whatever traumatizes a person. True, a person might feel “traumatized” by, say, a minor car accident — but to say that a fender-bender counts as trauma alongside such horrors as concentration camps, rape, or the Bataan Death March is to dilute the concept. “A great deal rides on how we define the concept of traumatic stressor,” says Richard J. McNally. In the civilian realm, he says, “the more we broaden the category of traumatic stressors, the less credibly we can assign causal significance to a given stressor itself and the more weight we must place on personal vulnerability.” In the context of war, too, while anticipatory fear of being thrust into harm’s way could conceivably morph into a crippling stress reaction, this will almost surely be more likely among individuals who struggled with anxiety-related problems prior to deployment. Surely, their distress merits treatment from military psychiatrists, but the odds that such symptoms persist after separation from the military, let alone harden into a serious, lasting state of disablement, are probably very low.

The troubled va disability system

Secretary Shinseki’s move to reduce the bureaucratic hurdles to the va disability system and broaden eligibility for ptsd will add to the already accelerating stream of veterans who are applying to enter it.
Thus, it is imperative that the va turn its attention to that system itself. Two overarching problems need remedies. The first is the culture of clinical diagnosis. Some disability evaluators now use a detailed interview checklist to gauge the degree to which daily function is impaired, but its implementation is uneven across medical centers. Thus, it is still easy for clinicians — especially those whose diagnostic skills were honed during the Vietnam era — to label problems such as anxiety, guilt over comrades who died, and chronic sleep disturbance as mental illnesses. This is facile, of course, as symptoms splay out along a continuum ranging from normal, if painful, readjustment difficulties to chronic, debilitating pathology. Further, not all symptoms of distress in someone who has been to war reflexively signal the presence of ptsd, as some clinicians seem to think. Among veterans whose problems are indeed war-related, however, the distinction between reversible and lasting incapacitation matters greatly when the veteran is seeking disability status. And this brings us to the second matter: the inadvertent damage that disability benefits themselves can sometimes cause. Imagine a young soldier wounded in Afghanistan. His physical injuries heal, but his mind remains tormented. Sudden noises make him jump out of his skin. He is flooded with memories of a bloody firefight, tormented by nightmares, can barely concentrate, and feels emotionally detached from everything and everybody. At 23 years old, the soldier is about to be discharged from the military. Fearing he’ll never be able to hold a job or fully function in society, he applies to the va for “total” disability compensation for ptsd (the maximum designation, which provides roughly $2,300 per month). This soldier has resigned himself to a life of chronic mental illness. On its face, this seems only logical, and granting the benefits seems humane. But in reality it is probably the last thing the young soldier-turning-veteran needs — because compensation will confirm his fears that he is indeed beyond recovery. While a sad verdict for anyone, it is especially tragic for someone only in his twenties. Injured soldiers can apply for and receive va disability benefits even before they have been discharged from the military — and, remarkably, before they have even been given the psychiatric treatment that could help them considerably. Imagine telling someone with a spinal injury that he’ll never walk again — before he has had surgery and physical therapy. A rush to judgment about the prognosis of psychic injuries carries serious long-term consequences insofar as a veteran who is unwittingly encouraged to see himself as beyond repair risks fulfilling that prophecy. Why should I bother with treatment? he might think. A terrible mistake, of course.

The months before and after separation from the service are periods when mental wounds are fresh and thus most responsive to therapeutic intervention, including medication. Told he is disabled, the veteran and his family may assume — often incorrectly — that he is no longer able to work. At home on disability, he risks adopting a “sick role” that ends up depriving him of the estimable therapeutic value of work. Lost are the sense of purpose work gives (or at least the distraction from depressive rumination it provides), the daily structure it affords, and the opportunity for socializing and cultivating friendships. The longer he is unemployed, the more his confidence in his ability and motivation to work erodes and his skills atrophy. Once a patient is caught in such a downward spiral of invalidism, it can be hard to climb back out. What’s more, compensation contingent upon being sick often creates a perverse incentive to remain sick. For example, even if a veteran wants very much to work, he understandably fears losing his financial safety net if he leaves the disability rolls to take a job that ends up proving too much for him. This is how full disability status can undermine the possibility of recovery.

What to do: Treatment first
For many veterans, the transition between military and civilian life is a critical juncture marked by acute feelings of flux and dislocation. Recall the scene in The Hurt Locker (one of the few scenes, incidentally, that former soldiers have deemed realistic) in which Sergeant William James stares at the wall of cereal boxes in the supermarket, disoriented by the tranquil and often trivial nature of the civilian world. As Sebastian Junger wrote in his powerful book War, “Some of the men worry they’ll never again be satisfied with a ‘normal life’ . . . They worry that they may have been ruined for anything else.” Returning from war is a major existential project. Imparting meaning to the wartime experience, reconfiguring personal identity, and reimagining one’s future take time. Sometimes the emotional intensity can be overwhelming — especially when coupled with nightmares and high anxiety or depression — and even warrants professional help. When this happens, the veteran should receive a message of promise and hope. This means a prescription for quality treatment and rehabilitation — ideally before the patient is even permitted to apply for disability status. However, under the current system, when a veteran files a disability claim, a ratings examiner is assigned to determine the extent of incapacitation, irrespective of whether he has first received care. As part of the assessment, the examiner requests a psychiatric evaluation with a psychiatrist or a psychologist to obtain a diagnosis. If the veteran is diagnosed with ptsd by the clinician, the ratings examiner then assigns a severity index to his disability. The Veterans Benefits Administration recognizes different levels of disability.
As detailed in the Code of Federal Regulations, a ten percent severity rating for a mental illness denotes “mild or transient symptoms which [affect] occupational tasks only during periods of significant stress.” A patient assigned 30 percent disability has “intermittent periods of inability to perform occupational tasks although generally functioning satisfactorily.” A 50 percent rating begins to denote significant deficits including “difficulty in understanding complex commands” and reduced reliability and productivity. The most severe level, 100 percent, corresponds to “total occupational and social impairment.” Something is terribly wrong with this picture. To conclude that a veteran has dismal prospects for meaningful recovery before he or she has had a course of therapy and rehabilitation is premature in the extreme.2 To be sure, the va is trying hard to make treatment accessible, but administrators, raters, and clinicians cannot require patients to accept it as a condition of being considered for disability compensation. Absent a course of quality treatment and rehabilitation, evaluators simply do not have enough evidence to make a determination. Unwittingly, this policy has set in motion a growing dependence on the va and disincentive to meaningful improvement. In 2008, Senator Richard Burr of North Carolina, the ranking member of the Senate Veterans Affairs Committee, sought a limited remedy. He introduced the Veterans Mental Health Treatment First Act. The purpose of this bill was to induce new veterans to embark upon a path to recovery. Any veteran diagnosed with major depression, post-traumatic stress disorder, or other anxiety disorders stemming from military activity would be eligible for a financial incentive (which Burr called a “wellness stipend”) to adhere to an individualized course of treatment and agree to a pause in claims action for at least a year or until completion of treatment, whichever came first. The bill died in committee.

Don’t fight the same war twice
Mental health experts have learned a lot about how not to treat veterans from our experience during the Vietnam era. I speak from my experience as a psychiatrist at the West Haven Veterans Affairs Medical Center in Connecticut from 1988 to 1992, a time of blossoming interest in ptsd within both the va and the mental-health establishment.
2. Studies of Vietnam veterans have found that 68 to 94 percent of claimants seeking treatment for the first time are also applying for ptsd disability benefits; for review see B. Christopher Frueh, et al., “Disability compensation seeking among veterans evaluated for posttraumatic stress disorder,” Psychiatric Services 54 (January 2003), 84–91. According to Nina Sayer and colleagues, “most claimants reported seeking disability compensation for symbolic reasons, especially for acknowledgement, validation and relief from self-blame”; see Nina Sayer, et al., “Veterans seeking disability benefits for posttraumatic stress disorder: Who applies and the self-reported meaning of disability compensation,” Social Science & Medicine 58:11 (June 2004), 2133–43. I could not find comparable studies on oef and oif veterans, but the Compensation and Pension examiners I interviewed suggest that the “disability first” approach is not uncommon.

Good intentions were abundant, but, in retrospect, much of our treatment philosophy was misguided. For example, clinicians tended to view whatever problem beset a veteran as a product of his war experience. In addition, therapists spent too much time urging veterans to experience catharsis by reliving their war experiences in group therapy, individual therapy, art therapy, and theatre reenactments. Groups of twenty or so veterans were admitted to the hospital and stayed together, platoonlike, for four months. This practice took them out of their communities and away from their families. I remember some of the men coming back from a day’s leave from the hospital ward with new war-themed tattoos and combat fatigues — not exactly readjustment! It is clear, in retrospect, that instead of fostering regression, we should have emphasized resolution of everyday problems of living, such as family chaos, employment difficulties, and substance abuse. The good news is that most of these inpatient programs are now shuttered. Studies showed them to be largely ineffective. What followed over the years was a wholesale shift away from cathartic reenactment of war trauma and a growing emphasis on forward-looking rehabilitation and evidence-based treatments such as cognitive therapy, behavioral desensitization (some techniques involving virtual reality recreations of combat scenarios), and medication if needed. The va does appear to be making serious efforts to ensure that all mental health clinics are equipped to offer state-of-the-art treatment for ptsd. Some clinicians, myself included, would even like to see the diagnosis of ptsd downplayed altogether in favor of trying to understand patients’ symptoms in context. As Texas psychiatrist Martha Leatherman puts it, “behaviors such as easy startling, hypervigilance, and sleep disturbance that are common in combat situations are normal survival mechanisms.” Unfortunately, when they return, veterans are told that these symptoms mean ptsd. “This stirs up visions of Vietnam veterans living under bridges,” Leatherman says, “and then, in a panic, they apply for disability compensation for ptsd so that they won’t end up homeless too.” Regrettably, the legacy of Vietnam-era ptsd haunts the current generation of veterans. “It has been very troubling to me to see oef/oif veterans who truly need mental health treatment refuse it because it would mean having an illness that is associated with Vietnam-era chronicity and thus is incurable.” The clinicians’ job, of course, is not to incite morbid preoccupations, but to dispel misconceptions about Vietnam veterans (the vast majority of whom went on to function well) and steer veterans, as early as possible, to healthier interpretations of their symptoms. Early intervention also leverages the well-established fact that prognosis after trauma greatly depends on what happens to the individual in its immediate wake.
That is why serious attention must be paid to the everyday problems that beset many veterans during the readjustment period, such as financial stress, marital discord, parenting strains, and occupational needs.3 Finally, the balkanization of the veterans’ services complex demands attention. The federal Veterans Benefits Administration (vba) and the Veterans Health Administration (vha) tend to operate in separate universes. The vba is geared toward helping veterans maximize benefits and gives little to no attention to improving their clinical situation. On the other hand, the vha is focused on treatment, as it should be, but doesn’t extend its expertise to helping veterans with the financial hardships they face. (These can be the kinds of problems that might lead a patient to turn to disability compensation — not because he is incapable of work but because the reliable check is a rational solution to his financial woes.) County-based Veterans Service Officers actively help veterans file for disability — not necessarily a bad thing at all, but because they are advocates, their job is to get a veteran what he wants, which is not necessarily in his best clinical interest. Lastly, the Veterans Service Organizations, which, as a matter of principle, are driven to funnel largesse to their constituents, tend to be extremely suspicious of proposed reforms of the disability system, as they were of Senator Burr’s proposal. With the missions of both agencies and the agendas of pressure groups all working at cross purposes, disability reform is a daunting challenge indeed. Anyone who fights in a war is changed by it, but few are irreparably damaged. For those who never regain their civilian footing despite the best treatment, full and generous disability compensation is their due. Otherwise, it is reckless to allow a young veteran to surrender to his psychological wounds without first urging him to pursue recovery. Over the last hundred years or so, psychiatry has taken very different perspectives on war stress: from an overly harsh, blame-the-soldier stance in World War I, to the healthy recognition in World War II that even the most psychologically healthy individual can develop war-related symptoms, to the misguided expectation in the wake of Vietnam that lasting ptsd was routine.

3. The problem of fraud, too, cannot be overlooked. See B. Christopher Frueh, et al., “US Department of Veterans Affairs disability policies for ptsd: Administrative trends and implications for treatment, rehabilitation, and research,” American Journal of Public Health 97:12 (December 2007), 2143–2145; see also Gail Poyner, “Psychological Evaluations of Veterans Claiming ptsd Disability with the Department of Veterans Affairs: A Clinician’s Viewpoint,” Psychological Injury and Law 3:2 (2010), 130–2. Poyner, who had received praise from va personnel for her careful diagnostic evaluations, was told by the va in 2009 that her services were no longer needed after she began using approved psychological tests to distinguish between veterans who were claiming to have ptsd when they did not and those whose complaints were clinically authentic.

The new va rule, which expands ptsd disability eligibility to noncombatants who have experienced the dread of harm but have not had an actual encounter with it, alters the meaning yet again. What should have been a welcome bureaucratic reform by the va — waiving documentation that might be difficult or impossible to obtain — ended up distorting the diagnosis. Add to this the practice of conferring disability status upon a veteran before his prospects for recovery are known, and the long journey home will now be harder than it already is.


Cuba’s Lost History
By Michael Gonzalez

When I was a child and the communist authorities would send a volunteer worker to inquire why I had not yet joined the Pioneros or generally was not going along with the rhetoric of the Cuban Revolution, my grandmother would react in a way I found puzzling. She would show the visitor, usually a woman, to the sitting room my family used for people with whom we were not intimate, and then serve coffee in the good china (not the chipped and weathered cups used with family and friends). Then she invariably would launch into a version of the same routine. Abuela would employ an exceedingly cordial but distant manner, with none of the comfortable banter that Cubans with bonds of friendship use with one another — though she was careful to drop here and there a well-calibrated cubanismo, to display her roots in the native soil. A cold smile fixed on her lips, she would get around to informing our visitor that “this family has been in these parts for many generations — centuries.”
Michael Gonzalez, a former journalist, is vice president of communications at The Heritage Foundation (www.heritage.org) in Washington, D.C.
Yes, she would go on, “we’ve noticed there is a Revolution going on outside. The important thing to us is that we’re a familia cubanisima. And what was it that you said about my grandson again? Oh, no, he won’t take part in that civic event Sunday morning. He will be at church, you see. He’s an altar boy there. Our family built that church, by the way, the one you saw on the way here. Yes, our name is on the first brick that was laid there.” Why on earth Abuela wanted to share these things with these visitors, I always wondered. Today, four decades on, I know why she proclaimed we were a cornerstone. We had memories of a time past, and history put things in context. From earliest childhood I was taught that my father’s family’s roots in Cuba ran deep. “Childhood” is perhaps too quaint a word for what was a daily anomaly: Outside raged the Revolution, with such foreign visitors as Angela Davis and Stokely Carmichael descending on Havana to join Fidel in marches, sloganeering, and taunting counterrevolutionaries; inside our walls, laden with portraits of ancestors, angels, and saints, we had ourselves, books, and hallways that echoed of La traviata and Cavalleria rusticana, emanating from my father’s record player and prized 78-rpm record collection. Our knowledge of the past amounted to more than just the point of pride that people take in their family histories in stable countries. History gave us a sense of permanence that assured us of our daily survival, just as oaks can withstand gale-force winds because of deep tap roots. The people of Cuba today need that permanence, that stability and sense of belonging. This may jar those who have visited Cuba and seen a deprived but proud people. Yet, as the Cuban blogger Yoani Sanchez has written, “Most young people’s eyes are looking to the outside, because they see that they cannot make change in their country. They desire to take a plane to Miami or Europe and in ten hours change their lives completely.” Convincing them that they have a future on the island will require wholesale change: massive amounts of capital, a huge infusion of technical know-how, and restoration of the rule of law (respect for private property, for starters). But even these major fixes won’t be sufficient to secure the island’s revival. As the Cuban writer Jose Azel recently observed, “Post-Castro Cuba will need to rebuild much more than its economy; it will need to rebuild its national identity.” That is because wiping out that identity — one that had grown organically through the centuries and had produced an enterprising and creative national character — was Job One for the Revolution; it was a necessity, even an obsession, to communists intent on imposing an alien blueprint on the people of Cuba. “Cubanism” had to be wrung out of the people’s consciousness so that the much-touted “revolutionary consciousness” could be installed in its place.
Timeless habits had to be changed, ways of thinking rewired, history rewritten. One of the myriad ways the Revolution’s henchmen and apparatchiks went about doing that was to completely reorder the way history was imparted, teaching it — when at all — through a political prism that denigrated the past and explained why it had to lead to Castro’s Marxist experiment. To ensure that Cubans could not henceforth have access to history books that did not comport with the Marxist view, Cuba’s communists resorted to the same instruments their counterparts have used everywhere from Moscow to Beijing: strict censorship. Books that laid out a different view were confiscated and taken out of circulation. The island was hermetically shut. Since 1959, Cubans have not had access to a non-Marxist version of current or historical events about the outside world or about themselves. Today most Cubans are barred from accessing the internet. Much attention is paid to the economic failure and political repression that end up being the sine qua non of Marxism, but relatively little notice is taken of the attendant necessity to wipe out a culture and the deleterious effect this destruction continues to have on a nation even after the communists have had their guns taken away. China is slowly regaining a 5,000-year-old culture that was ruthlessly suppressed during the Cultural Revolution; Russia is working through difficulties in that arena. Regaining memory will be important for freedom itself. The reason is simple. As British historian Simon Schama puts it, “History and memory are not the antithesis of free will but the condition of it.” Memory allows humans to contrast different events and outcomes, a freedom which any dictator intent on enforcing a blueprint must eradicate. “And because history is the enemy of tyranny,” Schama notes, “oblivion is its greatest accomplice.” To be sure, the roots for what went wrong in Cuba can also be traced to flaws in the Cuban character and identity. Given Cuba’s penurious state, the task of sifting through the historical record to have some sense of what these flaws are is urgent. Cuba, after all, wasn’t reduced to impoverished incarceration by Martians arriving on flying saucers. Something Cuban, very Cuban, allowed this situation to emerge. What character flaw was there and how might some future government go about ameliorating its effects? In the case of Cuba, one flaw was obviously the institution of slavery and its pernicious generational impact. Another was very probably a strong class consciousness, one that was much more like Europe’s than what one finds in the U.S. Both provided some tinder for the revolutionary match to light. But we should also look at the strengths of the Cuban character that are products of the cauldron of history. They help explain Cuba’s success prior to Fidel Castro’s revolution.

A review of the historical record helps us understand Cubans’ innate attachment to private property, entrepreneurial spirit, cosmopolitanism, and pursuit of individual freedom and self-government, all characteristics anathema to the Marxist ideal. Such a review would help explain why those who have tried to impose the foreign ideology of Marxism, with its initiative-killing collectivism, arbitrary and resentful division of the world between north and south, and political repression, would out of necessity seek to quash a previous understanding of history. Imposing communism in Cuba required wiping the historical record and wringing Cuban-ness from each individual, especially since every element of Marxism’s New Man was diametrically opposed to the Cuban temperament. This is why Grandma acted as she did. She was registering her impatience with revolutionaries coming to her babbling about “revolutionary consciousness.” Her message was simple: Please don’t tell us about our national character; we have been refining it for centuries. I saw the toll that the imposition of this unfamiliar ideology took on my family, how it eventually drove my grandmother insane, killed my father, and tore my family asunder. The oak was shaken. But the end of this bizarre experiment is now within sight — after a very long half-century, the displacement of hundreds of thousands, and the ruination of the once-stately Havana and the countryside. The revolutionary leaders have become gerontocrats. It is time now to peer into the historical record and understand the roots of Cuba’s character, so that Cubans themselves can start rediscovering why it is that they think a certain way and can start responding to ancient instincts once the shackles are removed. In this, America can be a model, because it has been especially well served by a deep understanding of its history: of the Constitution, its earliest settlers, and its founding generation, but also of how it has dealt with its problems, and of how all of these have informed the American character. This has been a guarantee of its freedom and prosperity.

Recalling a national character
An anniversary upon us should spark interest in an often overlooked period of Cuban history. This year, 2011, marks half a millennium of Cuba’s existence. In early 1511 (historians are not sure about the exact date, but they think it was in January), nearly a century before the settlement of Jamestown, a group of Spanish knights clad in heavy armor arrived on boats to conquer the island and recreate their medieval ways in the Caribbean. They couldn’t have been more unlike the Pilgrims, but like them, they sowed the seeds of success. Some have placed the spark that ignited Cuba’s identity at other times. Historian Hugh Thomas, for instance, places it at 1762, when the British invaded Havana. Others say 1898, when Spain lost to the U.S. in the Spanish-American War; or 1902, when the U.S. left and the Republic began (Castro obviously believes Year Zero is 1959, when he took over).
I believe, however, that the markers the conquistadors put down led directly to the formation of the Cuban character, though they are not often given credit. Placing the genesis of the Cuban nation at this Spanish landing is not meant to deny the existence of Indian nations, some of which had inhabited the island for centuries (others had come to Cuba only a few decades before the Spanish, hopping around the West Indies, island to island, all the way from the basin of the Orinoco River). But unlike with Peru, Mexico or, say, Ireland, in Cuba the population in place before the imperial power’s arrival left relatively little trace. The Indians bequeathed the cultivation of some tasty tubers, the enjoyment of the hammock and, much more famously, the use of tobacco. But the country with the culture, ethnicities, language, religions, customs, and common history that it has today (in short, with all the attributes that make a nation) had its moment of birth at the point of the conquistadors’ landing. As soon as these Spaniards arrived, they started putting down the pillars on which the Cuban economy, culture, and national character were built. The leaders of the conquistadors came mostly from the landless gentry of Castile and especially Andalusia in southwestern Spain — on whose dialect Cuban Spanish is based. They cut through the serpentine island in a matter of months, subduing the natives. In four years they established the seven villages from which Cuba sprang. The seven foundational villages — Baracoa, Bayamo, Santiago, Trinidad, Sancti Spiritus, Puerto Principe, and Havana — are all cities today. A quick historical review, guided by the ghosts of abuela’s ancestors, suggests why Cuba was once a success, why Cubans still tend to thrive outside communism, and — most important of all — why the country has its present problems. Within decades of their arrival, the conquistadors thrust Cuba into a global trading system which produced conditions for wealth creation for centuries to come and gave Cubans a cosmopolitan outlook that allowed them to think of themselves as the equals of Europeans and North Americans. The conquistadors also laid the foundations of representative government. Yes, the conquistadors could be implacable, even sadistic. One of abuela’s ancestors — one Vasco Porcallo de Figueroa, a native of Extremadura, as were Cortes, Pizarro, and many other leading conquistadors — is said by many historians to have been among the cruelest. He had a hand in founding four of the first seven towns. He also reportedly mutilated and castrated many natives — and had children with scores more (it is to one of these “cross-cultural” encounters that my children and I owe our existence).
These first settlers were infamous for self-dealing in land, introducing the roots of the corruption and the strong sense of class consciousness that marked the island for centuries. They did this through the Havana Council (El Cabildo), which they took over early on. Conquistadors of noble birth got the best land, followed by those of baser origin, followed by simple settlers who hadn’t fought the Indians and shed blood, then followed by mixed-bloods and, finally, Indians and blacks. There’s no doubt that this created a wedge the communists were able to expand and exploit centuries later. The identification of family honor with private property by these first Cubans did, however, prove a boon to Cuba for centuries to come, assuring its economic success. In ignoring edicts from Madrid to establish commons, they avoided the near-fatal mistake of Jamestown and Massachusetts, where land was at first held jointly. In the English colonies commons led to penury, as they do everywhere. Luckily for Virginia and Massachusetts, settlers switched to private holdings in time to avoid starvation. By disobeying Madrid and apportioning land among themselves, the Cuban conquistadors eluded “the tragedy of the commons.” The primacy of private property rights was finally encoded into legislation in the first decade of the 19th century, in a law that condemned “all government intervention in the management and development of private capital.” The revolution’s early historian, Manuel Moreno Fraginals, in his 1964 work The Sugar Mill, decried the fact that with this legislation, the Cuban “sugarocracy won its greatest legal victory.” Respect for private property lasted in Cuba from Vasco to Fidel, surviving numerous corrupt regimes. Abuela’s husband, my grandfather, Luis Miguel Gonzalez, fought against the dictators Machado and Batista in the 1930s and ’40s. Sometimes he had to flee to the countryside for weeks, to escape arrest. My father Miguel Angel was likewise thrown in prison by Batista in the 1950s. But never were their houses confiscated. And this organic Cuban attachment to one’s own patch of land explains why the strongest rejection of the revolution’s “Agrarian Reform” — the expropriation of people’s property — came from the guajiros (which is translated as “peasants” by leftists and as “ranchers” by everyone else), not from the upper classes. On this foundation of private property, success was built. Soon the descendants of the conquistadors were planting sugarcane, using technology brought to the island by merchant Genoese families. In a matter of decades, Havana was linked to different ports throughout the world, via Seville and the Canary Islands, honing the enterprising spirit that is one of Cuba’s most enduring traits. As the new Cuban expert at the University of Pittsburgh, the bright historian Alejandro de la Fuente, has documented in Havana and the
Atlantic in the 16th Century, by the early 1600s, Havana had become a prime center in the Atlantic and the Caribbean web of distribution and commerce. As de la Fuente puts it, the key was making Havana the nodal point of the Spanish colonial fleet system (la Carrera de Indias), which gave the port city “the ability to redistribute European products among colonial markets in the circum-Caribbean area.” He continues:
Through this constant flow of vessels and commodities local merchants and residents got access to a large variety of goods. These goods came from production centers all over the world, from Amsterdam to Ceylon, and they were part of life in Havana and other Atlantic port cities. Local merchants, residents, and transients consumed and traded these commodities, shifting them from one sea route to another depending on the available information about local, regional, and distant markets and demands. (Italics added.)

In making full use of this world system, Cuba was soon producing the coffee, sugar, and tobacco that fueled the democratic explosion of the 1680s in London’s coffee houses, the incubators of Britain’s Glorious Revolution (whose understanding of freedoms informed the American Revolution). This trading spirit was in full swing at the dawn of the 1700s, and made its adherents feisty enough to battle an empire. In the 1720s, the newly installed Bourbon dynasty in Madrid, in the person of Philip V, brought statist French ways. In France itself, of course, within 60 years this statism would lead to the French Revolution and the beheadings of the French Bourbons. In Cuba, the Spanish Bourbon king forced Cuban farmers to sell their tobacco to a state monopoly. The tobacco farmers rebelled, albeit unsuccessfully. Defeated, many of abuela’s ancestors fled to western Cuba — through sheer luck happening upon Vuelta Abajo, the best land in the world for growing the weed. Soon they were selling tobacco and other wares to privateers from Holland, France, and England. Cubans, indeed, traded with all comers, often breaking the law if they had to. Many of the legal cases in the Archivo de Indias in Seville have to do with the authorities’ taking reprisals against smugglers. Pirates were so welcome from the start of the colony that the Cuban historian Levy Marrero quotes a 1603 letter from Bishop Juan de las Cabezas to Philip III in which he laments that the island was so lost to the pirate trade that there was even a man “who has not wanted to baptize a son until a pirate could be his Godfather.” Commercial liberty had to come amid constant attempts by the crown to regulate trade, just as the conquistadors ignored Madrid’s edicts on setting aside commons land. The settlers found different means to get their way on trade. As de la Fuente wrote, one of the ways “to circumvent legal limits and prohibitions was to falsify cargo registries.” This commercial spirit gave the island a great deal of prosperity, especially after the British takeover in 1762, which lasted only a few months but
which transformed Cuba by making trade with the entire world possible. After that, Seville and the Canary Islands could no longer be the only gateways to Europe, as they had been up to that point. As the British historian Hugh Thomas writes, Cuba in the 19th century became “the richest colony in the world.” By the mid-20th century, as the Cuban-American business historian Oscar A. Echevarria has written in Cuba’s Builders of Wealth Prior to 1959, “Cuba’s entrepreneurial and managerial class was disproportionally large.” He concludes that it was not just good soil and proximity to the U.S. that accounted for Cuba’s economic and social success, but its entrepreneurial know-how. This resolve to trade freely with one another, defying authority when the law was an ass, was alive even under the revolution. In 1960s Havana, it propelled my poor father (and countless others) to risk prison by trading in the black market, so he could secure food for our family. Throughout my childhood, this highly illegal barter system was the only sector of the Cuban economy that I saw function. Politically, too, the conquistadors made the first strides in electoral politics, with lasting consequences, and, as it happens, another of abuela’s ancestors played a part. They were not all rogues like Vasco, but included also such pious Christian men as Manuel de Roxas, an early governor who in the 1520s made a point of complaining to the court in Madrid about the cruelty being meted out to Indians, and about the incipient licentiousness in the colony. Unsurprisingly, given de Roxas’s moral rectitude, he also secured for the fledgling Cuban colony a level of political self-determination and, yes, democracy, that Madrid ended up denying the towns and cities in the Iberian Peninsula itself. These embryonic political developments show that a desire for representative government — one quite at odds with what Cubans have now — has existed in Cuba from the very beginning and was culturally ingrained before Castro’s half-century experiment in Marxist oppression. De Roxas’s achievements have been chronicled by a few historians, but the most comprehensive look I have seen was by the great chronicler of Cuba’s early years, the intrepid American Irene Aloha Wright, in her 1916 classic, An Early History of Havana. The key event, according to Wright, came in a communication to the Emperor Charles I in 1528, in which de Roxas requested for the fledgling Cuban settlement the same privileges for which European towns and guilds were fighting at the time. The gist of the request was that the 17-year-old colony be able to elect its own judges and legislators. After some consideration, Charles did accede to a mixed system: Some judges and legislators would be appointed by the crown — but some would be elected by the settlers in Cuba.
The suffrage was limited to landowning heads of households, to be sure, but, lest we forget, this was the case in the United States and Britain as late as the 1800s. Local elections continued throughout most of the colonial period, albeit with limited suffrage. It was only with the arrival of Governor Miguel Tacon in 1834 that Madrid began to take back the degree of autonomy that landowning Cuban families had enjoyed for three centuries. Tacon canceled local elections in 1836, expelled the archbishop of Santiago, and even engaged in something as silly as attempting to delay the installation of railways in Cuba so that tracks could be laid in Spain first. Needless to say, such a power grab alienated Cubans, leading to their insurrection, and eventually to the Spanish-American War and the defeat of Spain in 1898. The Cuban colonials, who likewise had economic and some political freedoms, saw themselves as very much a part of Spain and the rest of Europe, and they felt their lives were interwoven with events on the other side of the ocean. A copy in my possession of the 1817 will and testament of Matias Jose Duarte, the brother of abuela’s great-great grandfather, shows that among the legacies he left to churches, relatives, and slaves, he also worried about Napoleon’s victims in Spain and left 1,000 pesos “to Europe so it can [be] distributed at the rate [of] 10 pesos for each poor person who lost a husband or a father in the wars that they have had with France.” It is important to note that Matias Jose Duarte was a fifth-generation Cuban. It is also noteworthy that this will was written seven years after Mexico had declared its independence from Spain, and while Simon Bolivar, Jose de San Martin, and their peers were fighting to liberate the South American continent from Spain. Until Tacon, the majority of white Cubans had wanted no part of this independence. To be sure, the privileges of Cuba’s cities and towns in the colonial period were not as far-reaching as the rights that American colonists in Virginia, Massachusetts, and the Carolinas enjoyed in the 1700s. It is important, however, to lay out a predicate for representative government in Cuba. And free elections were part of Cuba all the way to the 1902–1959 republic. Abuela’s father was elected in the free elections of 1902 to the first Havana Cabildo in the republic, and my grandfather ran for the Constitutional Assembly in 1940 (he lost). Much is made of dictators Machado and Batista in the 1930s and 1950s, and rightly so, but there were more democratically elected presidents than dictators during those six decades. In the 16th, 17th, 18th, and 19th centuries, economic freedom was reserved for the free, white or black.
Excluded were the island’s slaves, until slavery was abolished in the 1880s, and anyone who goes looking for the roots of Cuba’s present-day ills must stop and consider slavery’s pernicious and lasting influence. And how could it be otherwise? Here in the United States, the New England Yankee John Adams worried about the colonies in the American South and wondered how people who deny liberty to others could ever obtain it for themselves. Slavery existed in Cuba from 1511. Because it associated personal degradation with one race, it led directly to racism. It runs as deeply in the Cuba of today as it ever has, in ways that the revolution’s starry-eyed supporters often fail to grasp. One of its most lasting and corrosive political impacts may have been a toleration of subjugation, which was accepted by all races, the subjugated and the subjugators. Cuba having had slavery till the 1880s, the folk memory of an institution that lasted for centuries can reach across the generations. But it is important to remember that racism did not prevent free blacks from having private property. There are records in the Archivo de Indias in Seville of blacks and mixed-race individuals receiving grants of land from the Havana Cabildo as early as the 1560s. A history of slavery and racism also does not condemn a society (if it did, the whole world would be doomed). During the 20th century, Cuba was so prosperous that it attracted hundreds of thousands of immigrants from Europe. Four of these 20th century European immigrants were my mother’s grandparents. They came penniless to Cuba. Yet one generation later, my mother was graduating from law school. The 20th century republic succeeded because it was a direct descendant of the colonial center of commerce incubated in the 1500s. Cubans’ sense of enterprise was sharpened over the centuries; it became part of the national character. Cubans were ferociously competitive, and international competition honed global managerial best practices on the island. United Nations statistics for 1959, the year Castro took over, bear this out. In infant mortality, Cuba’s 32 deaths per 1,000 live births was well ahead of Japan, West Germany, Luxembourg, Ireland, France, Italy, and Spain (40, 36, 39, 33, 34, 50, and 53 respectively) among other nations. In terms of calories per day, Cuba was ahead of all of Latin America except beef producers Argentina and Uruguay. In cars per 1,000 inhabitants, Cuba’s 24 was ahead of everyone in Latin America except oil-producing Venezuela (27). And so on. In most vital statistics, Cuba was on a par with Mediterranean countries and Southern U.S. states. One reason for Cuba’s economic and (to a lesser degree) political success in the 1902–59 republican period was the beneficial impact of the 1899–1902 U.S. occupation, and the strong role the U.S. continued to play in the island during the first half of the twentieth century. That is, of course, not something that you will find in the history books read in Cuba today.
But Cuba in 1898 was prostrate after three decades of war with Spain, and the U.S. occupation not only made it possible to have a transition to an elected government, but also did much to fix other ills, such as public sanitation. One of the first things Governor Leonard Wood did was set up a sanitation commission, which was led by abuela’s father, the surgeon and sociologist Ramon Maria Alfonso.

Reclaiming a history
Discussion of the Cuban national character often emphasizes other aspects than I have — individualism being a positive one, informality one less so — and places the historical formative period of national character during the independence wars fought against Spain in 1868–78 and 1895–98. Castro’s Revolution emphasizes these wars of independence in its teachings of history, and reinterprets them as proto-Marxist events. The wars of independence have been purposely left out of my treatment, which has concentrated on the roots of such Cuban characteristics as attachment to private property, an entrepreneurial spirit, an international cosmopolitanism, and the pursuit of freedom (the latter of which was the title of Hugh Thomas’s classic work on Cuban history). I posit that the seeds for these traits were sown in those early years of the conquest. My use of personal events, from my life and my family’s past, is meant simply to demonstrate the always implicit contract between the dead, the present generation, and those still to be born. Today, the overwhelming majority of Cubans have been denied any of this history. A global trading system and wealth creation not being part of the socialist grand plan, virtually all pre-revolutionary history has been denigrated in Castro’s Cuba. Castroite ideologists especially spurn what they call the “pseudo-republic” of the 20th century — the state that was the land of promise to my four Spanish maternal great grandparents. Yet, it should be clear to all that the revolution’s victory in 1959 reversed the flow of humanity; after that event people stopped coming in and began to exit en masse. To think that Europeans today would immigrate to Cuba is risible, whereas I, like hundreds of thousands of other sons and daughters of Cuba, have had to go elsewhere to make a good life and find happiness. Castro knew he had to smash Cuba’s old identity, smear its old pride, and degrade its traditions in order to create the “new man” called for in the bizarre 19th century collectivist cult — Marxism — that he forced the poor Cubans to embrace. The British writer Theodore Dalrymple even theorizes that Castro needed to wipe out the physical evidence of the previous culture, and that that is why the Revolution purposely destroyed Havana and its once stately architecture and left it to crumble. Dalrymple is not optimistic about what it will take to rebuild Havana, that symbol of Cuban identity:


The terrible damage that Castro has done will long outlive him and his regime. Untold billions of capital will be needed to restore Havana; legal problems about ownership and rights of residence will be costly, bitter, and interminable; and the need to balance commercial, social, and aesthetic considerations in the reconstruction of Cuba will require the highest regulatory wisdom. In the meantime, Havana stands as a dreadful warning to the world — if one were any longer needed — against the dangers of monomaniacs who believe themselves to be in possession of a theory that explains everything, including the future.

In that way — and other ways — Castro has been just another banal dictator. Every tyrant from Hitler to Pol Pot has tried to restart the clock in order to put his plans in place. And like all of them, Pol Pot and Hitler included, Castro has found useful idiots abroad to parrot his denigration of his country’s previous history.

The historian Manuel Moreno Fraginals’s own trajectory brings to light the revolution’s distaste for history. He started out as a darling to some, and Che Guevara even praised The Sugar Mill for its “rigorous Marxist analytic method.” But the revolution’s men quickly decided that Fraginals was writing too much history. As the American Historical Association puts it in its entry on Fraginals, The Sugar Mill “was not sympathetically received by Cuban official historians, who claimed at the time that Marxist historians should apply themselves to reinterpret the past, not to reconstruct it using new evidence and methodology.” Poor Fraginals fell into disfavor quickly; the Revolutionary government in fact never allowed him to teach at the University of Havana. He ended up dying in exile in Miami in 2001.

To convince young Cubans today that there are better goals than becoming a foreigner, as a wry Havana joke puts it, Cubans will need to look at the whole vista of the past 500 years, and put the past half century in that context. That process can start today, up to a point, by adding more cultural and historical content to American transmissions to Cuba. Radio Martí, despite its many problems, has a following on the island and could certainly be improved. The concept is good even if the implementation may sometimes fall short. But the real heavy lifting will need to come after the Castros have passed from the scene and a real transition to freedom is under way. The transitional administration should give high priority to celebrating Cuba’s heritage, all of it, with honesty. History and culture, from Vasco to Castro, need to be taught again. Cuba’s zarzuela, “Cecilia Valdes,” needs to play at Havana’s main theater again. This cultural recapturing is a process that was undertaken in Eastern Europe after it regained its freedom. Not for nothing was the first president of a free Czech Republic a playwright. America will unquestionably play a role in post-Castro Cuba. The only question is whether America will be well prepared to accept this responsibility or will be dragged unwillingly into it. It

will be the former only if the U.S. government accepts that it is a force for good in the world — that it already has been that in Cuba — and has a deep understanding of the challenge at hand.

I have a cousin who lives in Europe. He and I have led parallel lives. I left Cuba almost 40 years ago, when I was twelve. He left seven years ago, in his mid-30s, and knows little of Cuba’s history except the skewed version the revolution taught him. Of Vasco, the centuries in between, and the presidents of the republic, he knows next to nothing and, honestly, doesn’t much care. He also thinks it pretty bizarre that I like guajiro music from the countryside. He would not go anywhere near it, and prefers Lady Gaga. I get him. After being force-fed Cuba as “revolutionary consciousness” for decades, he wants to turn the page.

But if Cuba is to have a shot at being as successful as it was before, the Cubans who will make a go of the country need to know what came before them. They need to understand what abuela intuited. My grandmother, that great transmitter of culture, knew what she was doing. Her father, husband, brother, and many of her ancestors were all involved in the making of Cuba to one degree or another. My grandmother’s whole life had been about history, the present, and, through me, the future. Her attachment to Cuba and its survival was personal. And that’s where national identity and character must be felt — at the personal level. Without the romanticism of culture, life becomes purely transactional and not worth living. And only by transmitting to present-day Cubans the importance of the contract between the generations that are dead and those not yet born can Cuba hope to survive.


New From Hoover Institution Press

Healthy, Wealthy, and Wise: Five Steps to a Better Health Care System, 2nd Edition

By John F. Cogan, R. Glenn Hubbard, and Daniel P. Kessler

Health care in the United States has made remarkable advances during the past forty years. Yet our health care system also has several well-known problems: high costs, significant numbers of people without insurance, and glaring gaps in quality and efficiency — and the Patient Protection and Affordable Care Act of 2010 is not the answer. The second edition of Healthy, Wealthy, and Wise details a better approach, offering fundamental reform alternatives centering on tax changes, insurance market changes, and redesigning Medicare and Medicaid.

The book proposes five specific reforms to improve the ability of markets to create a lower-cost, higher-quality health care system that is responsive to the needs of individuals, including increasing individual involvement, deregulating insurance markets and redesigning Medicare and Medicaid, improving availability and quality of information, enhancing competition, and reforming the malpractice system. The authors show that, by promoting cost-conscious behavior and competition in both private markets and government programs such as Medicare and Medicaid, we can slow the rate of growth of health care costs, expand access to high-quality health care, and slow down runaway spending.

John F. Cogan is the Leonard and Shirley Ely Senior Fellow at the Hoover Institution, Stanford University. R. Glenn Hubbard is the Dean and Russell L. Carson Professor of Finance and Economics, Graduate School of Business, and a professor of economics at Columbia University. Daniel P. Kessler is a professor of economics, law, and policy at the Stanford University Graduate School of Business and a senior fellow at the Hoover Institution.

March 2011, 130 pages
ISBN: 978-0-8179-1064-8
$19.95, cloth

To order, call 800.621.2736
Hoover Institution Press, Stanford University, Stanford, California 94305-6010
www.hooverpress.org

The Rush to Condemn Genetically Modified Crops
By Gregory Conko & Henry I. Miller

In spite of more than twenty years of scientific, humanitarian, and financial successes and an admirable record of health and environmental safety, genetic engineering applied to agriculture continues to be beleaguered by activists. Gene-spliced, or so-called genetically modified, crop plants are now grown on nearly 150 million acres in the United States alone, helping farmers to increase yields, reduce pesticide spraying, and save topsoil — and without injury to a single person or damage to an ecosystem. But this remarkable record hasn’t kept radical environmentalists from condemning and obstructing the technology. When they can’t sway public opinion with outright misrepresentations or induce regulators to reject products, activists have resorted to vandalism of field trials and, finally, to harassment with nuisance lawsuits.

Environmental activists succeeded in alarming the American public about gene-spliced crops and foods for a time during the 1990s and the early part
Gregory Conko is a senior fellow at the Competitive Enterprise Institute in Washington, D.C. Henry I. Miller, a physician and fellow at Stanford University’s Hoover Institution, was the founding director of the FDA’s Office of Biotechnology.


of last decade, but they cried wolf so often in the face of an unbroken string of successes that the public began to tune them out. More recently, the activists have had to dig deeper into their bag of tricks and revive a proven strategy for obstructing progress: litigation that challenges the procedural steps government agencies take when approving individual gene-spliced crops. Since 2007, a coalition of green activist groups and organic farmers has used the courts to overturn two final approvals for gene-spliced crop varieties and the issuance of permits to test several others. At least one additional case is now pending.

The nepa process
Under the National Environmental Policy Act (nepa), which took effect in 1970, all federal government agencies are required to consider the effects that any “major actions” they take may have on the “human environment.” Agencies can exempt whole categories of routine or repetitive decisions, but most other decisions — such as the issuance of a new regulation, the siting of a new road, bridge, or power plant, or the approval of a new agricultural technology — trigger the nepa obligation to evaluate environmental impacts. If the regulatory agency concludes that the action will have “no significant impact” (a legal term of art), it issues a relatively brief Environmental Assessment explaining the basis for that decision. If the environment is likely to be affected significantly, however, the agency must prepare a comprehensive Environmental Impact Statement, which typically requires thousands of man-hours, details every imaginable effect, and runs to hundreds (or even thousands) of pages.

The obligation under nepa is wholly procedural, which means that even significant environmental effects do not prohibit the agency from ultimately taking the proposed actions. Its purpose is merely to force government agencies to consider the possibility that their actions may result in environmental effects, but courts have interpreted the law broadly and expanded its impact, requiring a comprehensive review of every imaginable effect on the “human environment.” This category now encompasses not only harm to the natural ecology but also economic, social, and even aesthetic impacts. It has become easy, therefore, for agencies to miss some tangential matter and be tripped up by an irresponsible litigant who alleges that the environmental review was incomplete. And even when regulators actually do consider a potential impact but reject the concern due to its unimportance or improbability, they can run afoul of nepa by failing to extensively and comprehensively document their reasoning. That is exactly what happened with the challenges to approvals of gene-spliced crops.

The U.S. Department of Agriculture has approved 74 different gene-spliced crop varieties for commercial-scale cultivation, including dozens of varieties of corn, cotton, canola, potato, soybean, and tomato, as well as a


handful of other crop species. In each case, the Department’s Animal and Plant Health Inspection Service (aphis) reviewed copious amounts of data from several years’ worth of controlled field trials that evaluated the variety’s agronomic and environmental effects. And because each one of these plants is highly similar to conventionally-bred varieties already grown throughout the United States — differing only in the addition of, at most, a few genes that introduce useful and well-characterized traits — aphis concluded that they would have no significant environmental impact.

Bad-faith activism
But anti-biotechnology activists never permit facts or the judgments of experts to get in the way of their antagonism, so a coalition of environmental groups organized by the radical Center for Food Safety joined with a handful of organic farmers to sue in federal court to halt the planting of alfalfa, sugar beets, and turf grass modified for resistance to the herbicide glyphosate (better known by its trade name, Roundup). The plaintiffs were unable to make any substantive arguments that these crops were unsafe but instead claimed that aphis’s environmental assessment was legally — that is, procedurally — insufficient. They argued that a comprehensive environmental impact statement should have been prepared instead.

What were the supposedly significant environmental impacts that aphis allegedly failed to account for in its environmental analysis? The plaintiffs made two claims: First, that the widespread adoption of gene-spliced, herbicide-resistant crops could accelerate the development of herbicide-resistant weeds and force farmers to switch to other, less safe herbicides. Second, they claimed that cross-pollination by gene-spliced crops could jeopardize the certification for nearby organic crops (because organic farming prohibits the use of gene-spliced plants).

These are hardly the kinds of ecological impacts that Congress had in mind when it enacted nepa, but the scope of that law has been expanded beyond all recognition by judges who interpret the term “human environment” to include effects that are largely economic or social. Moreover, the judges in all three cases ignored the fact that the development of herbicide-resistant weeds is not unique to gene-spliced crops but occurs commonly in all crops exposed to herbicides. The usda is well aware of the phenomenon and considers it routinely when making an approval decision. Worse still, cross-pollination by gene-spliced plants does not, in fact, jeopardize the organic certification of those crops.

Opponents have long argued that use of gene-spliced crops poses an economic threat to organic farmers, who are not permitted to use gene-spliced varieties. Pollen is disseminated by the wind, and it has been argued that plants on an organic farm cross-pollinated by a neighbor’s gene-spliced


crops would no longer be organic and therefore would be denied the higher price such foods command in the marketplace. This argument is without foundation, however, because it ignores the way that “organic” is defined. The usda’s rules for organic production are based on process, not outcomes. As long as organic growers adhere to permissible practices and do not intentionally plant gene-spliced seeds, unintentional cross-pollination by a gene-spliced plant (or for that matter, the drift of a prohibited pesticide onto their crops) does not cause those crops to lose their organic status.

The possibility that weeds could eventually become resistant to herbicides at least has some basis in reality — but again, the phenomenon is a natural one and it should never have risen to the level of a cognizable environmental harm that triggers the need for an Environmental Impact Statement. Just as bacteria are known to develop innate resistance to certain antibiotics, it is well-known to farmers that weeds and pests will, over time, become resistant to individual herbicides and pesticides, respectively. The biochemical mechanisms that produce resistance vary from species to species but typically rely on subtle genetic variations in a small population that affords those microorganisms, insects, or weeds a selective advantage. Corn, wheat, and rice plants are naturally resistant to the herbicide atrazine, for example, so the herbicide is useful to control weeds when those crops are cultivated. But after repeated use of atrazine, resistant weeds began to appear just a decade after it was first introduced in 1958, long before the techniques of gene-splicing were even invented. (Selective pressure from a herbicide induces resistance, just as exposure to antibiotics eventually causes the appearance of antibiotic-resistant bacteria.)

Similarly, all plants — whether crops or weeds — already possess the biological precursors necessary to develop resistance to the herbicide glyphosate, which disrupts the accumulation of chlorophyll by degrading a protein that all plants make naturally. Gene-spliced, glyphosate-resistant crops are simply modified to produce more of that protein, which overwhelms the effects of the glyphosate and prevents the plants from being killed. For a glyphosate-resistant weed to emerge, it merely takes a few wild plants to evolve over time to produce more of the protein; with that selective advantage, they survive, reproduce, and become prevalent.

usda scientists know all of this, of course, and they understand that herbicide-resistant weeds are nothing new. And although they can be problematic for farmers, weeds that have become resistant to a given herbicide can be combated simply by switching to a different one. Moreover, there is nothing particularly novel about herbicide-resistant crop plants; breeders have been introducing herbicide-resistant varieties for decades. Indeed, many more such varieties have been developed using a

“conventional” method called mutation-breeding than with gene-splicing. With the former method, breeders use x-rays, gamma radiation, or mutagenic chemicals to randomly damage the plant’s dna to create mutations — that is, new genetic variants. Mutation-breeding has been in common use since the 1950s, and more than 2,250 known mutant varieties of dozens of different crops and ornamental plants have been bred in at least 50 countries, including France, Germany, Italy, the United Kingdom, and the United States. While seed companies like Monsanto, Pioneer, and Syngenta have been busy producing gene-spliced crop varieties, chemical giant basf has used chemical mutagenesis extensively to produce an entire line of varieties, including wheat, rice, and canola, that are resistant to various basf herbicides. And during the four decades preceding the usda’s approval of glyphosate-resistant alfalfa, sugar beet, and turf grass, plant breeders in a number of countries worldwide developed at least sixteen known mutant varieties of those same three crop species.

It is worth repeating that induced mutation is considered to be a conventional breeding method, so its use is not opposed by environmental activists, nor is it subject to any kind of regulation in most of the world. Still, scientists agree that the gross, crude modification of plant genetic material inherent in induced-mutation and other conventional breeding methods makes those varieties less predictable and arguably less safe than gene-spliced plants.

Discrimination against gene-splicing
From a nepa perspective, the only thing that makes gene-spliced herbicide-resistant plants unique is that they are regulated by a federal government agency and that approval to commercialize them constitutes, in nepa-speak, a “major action.” Even though plant scientists are virtually unanimous that gene-splicing is an extension, or refinement, of earlier techniques and that it is far safer and more predictable than mutation breeding, only the former is subject to the kind of government approval that triggers the eis obligation. In other words, simply the use of gene-splicing technology to create a plant is the trigger for, literally, an extraordinary regulatory regime.

It is ironic that the safer, gene-spliced crop varieties, whose development and environmental impacts are carefully scrutinized by the usda, are subject to the extra burden of a compulsory Environmental Impact Statement, while the potentially less-safe mutant varieties and new genetic hybrids are subject to neither pre-market assessment nor the nepa rules. This violates two fundamental principles of regulation: that similar things should be regulated in a similar way, and that the degree of regulatory scrutiny should be proportional to expected risk. usda’s approach (as well as that of the Environmental Protection Agency, which also subjects only gene-spliced


plants to a pre-market approval regime) gets it exactly backward: The amount of scrutiny, delay, and expense is inversely proportional to risk!

This eis obligation for gene-spliced crop approvals could have been avoided if the usda and other federal regulatory agencies had heeded the advice of the scientific community some 25 years ago and chosen not to subject the products of biotechnology to special, discriminatory government regulation. As long ago as 1987, the U.S. National Academy of Sciences (nas) studied the then-emerging field of recombinant dna technology (the technical name for gene-splicing) applied to plants and found that using these techniques to introduce new traits into crop plants posed no unique hazards — that is, that the risks of gene-spliced organisms are the same in kind as those associated with the introduction of unmodified organisms and organisms modified by so-called conventional breeding methods. Consequently, the nas concluded that there is no scientific justification for regulating gene-spliced crops any differently than conventional ones, which are not subject to any pre-market assessment or approval requirements at all. The Academy has reiterated that recommendation in at least four subsequent studies, the most recent of which was published in 2004, as have countless other scientific bodies around the world, including the American Medical Association, the United Kingdom’s Royal Society, and the un’s Food and Agriculture Organization and World Health Organization. Indeed, many of these more recent studies have suggested that gene-spliced crop plants are often safer than their conventional counterparts because the techniques are more precise and their effects more predictable.

Nevertheless, bowing to pressure from environmental activists, big agribusiness companies, and the packaged food industry, the usda and the epa established a set of complex, excessive, expensive, and scientifically unjustified testing and approval regulations that applied only to gene-spliced plants. Had the regulators instead followed the scientific consensus, they would not have been tripped up by the National Environmental Policy Act’s paperwork requirement because, in the absence of an approval process for gene-spliced plants, there would be no “major actions” to trigger the eis obligation.

The environmental impact subterfuge


It comes as no surprise to legal analysts steeped in environmental law that the scope of nepa’s coverage has experienced continuous creep. Through a combination of prodding by environmental activists and judicial overreach, the meaning of the statute’s term “human environment” has been expanded to include not just tangible ecological harms that affect people and human communities, but also impacts that are economic, social, cultural, historical, and aesthetic, according to a decision

of the U.S. Sixth Circuit Court of Appeals. Another decision, by a federal district court in Minnesota, illustrates how liberally the statute has been interpreted:
Relevant as well is whether the project will affect the local crime rate, present fire dangers, or otherwise unduly tap police and fire forces in the community . . . the project’s impact on social services, such as the availability of schools, hospitals, businesses, commuter facilities, and parking . . . harmonization with proximate land uses, and a blending with the aesthetics of the area . . . [and a] consideration of the project’s impact on the community’s development policy . . . [such as] urban blight and decay [and] neighborhood stability and growth.

In other words, almost any possible effect that anyone can imagine may constitute a significant impact under nepa if a sympathetic judge agrees. Even wholly outlandish claims can be sufficient to scuttle an agency’s decision. In the first-ever nepa lawsuit precipitated by a gene-spliced product, environmental activist Jeremy Rifkin successfully challenged a 1984 decision by the National Institutes of Health, which had regulatory jurisdiction at the time, to permit the field testing of gene-spliced bacteria modified to help protect crop plants from frost damage. Researchers at the University of California had discovered that Pseudomonas syringae, a harmless bacterium found on many plants, contains an “ice nucleation” protein that helps to initiate the growth of ice crystals that damage growing crops. These scientists removed the gene sequence that produces the ice-nucleation protein in the hope that spraying the resulting “ice minus” variants on plants could inhibit ice crystal formation and thereby reduce frost damage.

Rifkin’s Foundation on Economic Trends sued to stop the tests, arguing that the nih did not sufficiently consider, among other things, the possibility that spraying the modified bacteria might alter wind circulation patterns and interfere with aircraft flying overhead. Not surprisingly, the nih had dismissed such a ridiculous theory out of hand because the proposal was for a single small, well-circumscribed field trial. Perhaps more important, Pseudomonas syringae bacteria with a missing or nonfunctioning ice-nucleation gene were known to arise spontaneously in nature due to natural mutations, but they produced none of these hypothetical dangers.

Remarkably, without a shred of credible evidence, or even a plausible theory for such effects to occur, both a federal district court and a federal appeals court allowed the challenge and overturned the nih decision. The district court found in Rifkin’s favor, and the appeals court held that “the National Institutes of Health, in approving an experiment planned by scientists at the University of California, had not adequately assessed its environmental impact, nor had the experiment met the standard of environmental review necessary before an agency by law may decline to prepare a formal Environmental Impact Statement.” However, because Rifkin had neither

proved his allegations nor was required to do so, the appellate court also ruled that the nih could authorize such experiments in the future if their environmental effects were properly evaluated. With such a history, it is little wonder that federal courts subsequently would recognize the possibility of agronomic impacts as the basis to challenge the approval of gene-spliced crop plants, or that they would respond by invalidating the approvals and requiring usda to prepare an Environmental Impact Statement before they could be reapproved.

nepa’s effect on farmers
Although unsurprising, the resolution of these lawsuits has been a nightmare for American plant breeders and farmers. In one case that involved glyphosate-resistant turf grass, the plaintiffs challenged usda’s decision to permit the Scotts garden products company to test its seeds outside a greenhouse in order to generate sufficient data to support an application for full commercial approval. Another case involved usda field trial permits to four different seed companies to grow small, geographically isolated plots of corn and sugarcane that had been genetically modified to produce proteins that would be used in medical products. The cases of the glyphosate-resistant alfalfa and sugar beet were particularly vexing to farmers, however, because the lawsuits were filed after usda had approved the varieties for commercial release and farmers had already begun to cultivate the seeds.

The alfalfa case involves the usda’s 2005 approval of a glyphosate-resistant variety, sold under the trade name Roundup Ready by its co-developers Monsanto and Forage Genetics. To secure approval, the firms conducted nearly 300 usda-monitored field trials over a period of eight years. aphis scientists evaluated the data generated from these trials to determine the likely environmental impacts of an approval — called “deregulation” in agency parlance — and of widespread use of Roundup Ready alfalfa by American farmers. Since 1996 the same genetic trait already had been incorporated into dozens of approved varieties of corn, soybeans, canola, and other crop plants grown in the United States. Therefore, aphis was quite familiar with the glyphosate-resistance trait’s likely effects, not just from the field tests needed to secure approval but also from a decade of real-world experience. Regulators also considered hundreds of public comments on the proposed approval and concluded that deregulating the crop would not have any significant environmental impact because the glyphosate-resistance gene is harmless to humans and other animals and was already in common use in American agriculture. Therefore, usda prepared a shorter Environmental Assessment explaining its “finding of no significant impact” and, accordingly, deregulated Roundup Ready alfalfa.


Using herbicides to control weeds — especially in the initial growing stages — is vital to the health of a crop, because weeds can compromise yield by out-competing immature plants for water, nutrients, sunlight, and space. For alfalfa growers, weed control is especially important. The crop is usually grown to be harvested as hay, which is fed to dairy cows throughout the country. If the feed hay is laden with weeds, its nutritional value plummets and both the alfalfa grower and dairy farmers lose money. Consequently, a ban on Roundup Ready seeds would not stop growers from using herbicides; they would just shift to different ones. In fact, they would be forced to use more of them, and more often. Experience with other Roundup Ready crops, for example, shows that growers typically only need to apply glyphosate herbicide once or twice, while conventional crops usually require repeated applications of several different herbicides throughout the growing season in order to achieve the same level of weed control.

Roughly 5,500 farmers across the United States had planted more than a quarter million acres of Roundup Ready alfalfa by 2007 when a federal district judge in San Francisco issued his opinion. After studying the plaintiffs’ arguments, Judge Charles Breyer determined that usda’s Environmental Assessment was legally insufficient and he issued an injunction revoking the approval and prohibiting new seeds from being sold until usda completed a full Environmental Impact Statement to evaluate concerns about glyphosate-resistant weeds and the possible impacts on organic farmers.

Preparing the eis took usda another three years, but in December 2010 it issued the final document and solicited public comments. Unsurprisingly, the eis concluded what agency scientists and the scientific community already knew: Cross-pollination with and gene flow into non-gene-spliced alfalfa is unlikely because the crop is almost always harvested before it matures enough to grow seeds and pollen, so this kind of out-crossing would happen less than once in every 100,000 plants. And as discussed above, unintentional cross-pollination does not affect the organic status of a neighboring farmer’s crops. Finally, because breeders have for decades used conventional methods to develop herbicide-resistant crop varieties, farmers long ago developed common sense methods for delaying the development of herbicide-resistant weeds and they know how to cope with them when they do arise. The potentially negative impacts of approving Roundup Ready alfalfa are therefore minimal and manageable, and they are in any event far outweighed by the crop’s many substantial benefits.

usda must now reapprove the crop, a decision eagerly awaited by tens of thousands of alfalfa growers. But even though the substantive work on the eis is complete, the administrative process of moving to final reapproval

could take many additional months and probably will not be completed in time for new Roundup Ready alfalfa seed to be grown for planting by farmers in the spring of 2011. Worse still, usda announced that it will consider reapproving the variety with geographical planting restrictions intended to protect organic farmers from a risk of cross-pollination that the eis concluded was remote, and which would not be particularly damaging if it did occur. Such an unscientific and unwarranted decision would set a bad precedent for future approvals, and it could eventually prevent hundreds of thousands of American farmers from planting demonstrably safe and beneficial crop varieties.

Fortunately, the original 5,500 farmers who planted Roundup Ready alfalfa when it was first approved were permitted to continue growing and then harvest that initial crop. They constitute only a small percentage of the country’s alfalfa growers, so the overall impact on the nation’s alfalfa farmers for the next growing season will be limited. A far worse fate is in store for America’s sugar beet growers.

In August 2010, another federal district court judge in San Francisco revoked the usda’s approval of gene-spliced sugar beet seeds. That decision by Judge Jeffrey White has sown monumental confusion among sugar beet farmers, who have enthusiastically embraced the new seeds. An estimated 95 percent of the sugar beets currently in the ground are the gene-spliced Roundup Ready variety. As was the case with Roundup Ready alfalfa, the court’s injunction will permit the already planted crops to be harvested and processed. But before usda may reapprove the variety so new seeds can be sold and planted, a full Environmental Impact Statement will be necessary. The process will take years, leaving farmers in regulatory limbo in the meantime.

Perhaps the biggest problem arising from this decision, however, is a looming shortage of non-gene-spliced sugar beet seeds for next spring’s planting. About 10,000 farmers grow a total of more than one million acres of sugar beets in the United States every year, accounting for about half the refined sugar produced here (with the rest coming from sugar cane). Because the Roundup Ready variety so quickly became popular with American beet growers (which is typical generally of gene-spliced products), seed companies cut back on their production of inferior, non-gene-spliced seed. According to Duane Grant, chairman of the Idaho-based Snake River Sugar Co., which produces about 20 percent of the nation’s beet sugar, many growers fear they will have nothing to plant in 2011. “There has been no incentive, no market, no demand for conventional seed since 2008, and we believe there is not enough conventional seed available for our growers to plant a full crop in 2011,” Grant told the New York

Times. All this is thanks to bad-faith litigation initiated by invidious activists and to a myopic jurist. Seed companies are, of course, rapidly trying to ramp up production of conventional sugar beet seed, and it may be possible to import some from other countries, but it seems likely that the court’s decision will nevertheless result in an actual shortage of sugar beet seed next year. Such a scenario would be troubling in any circumstances, but may be particularly acute now because sugar production has fallen substantially worldwide during the past few years as a result of harsh weather and poor harvests. U.S. beet growers were poised to capitalize on the global sugar shortage by expanding acreage. But with seeds in short supply next year, consumers will surely feel the pinch of sharply rising prices — wholesale prices increased 55 percent between August and November last year, largely as a result of the nepa litigation — and farmers and others who care deeply about protecting the environment will actually be denied the proven beneficial effects of these crops until they are restored to the marketplace. The usda had issued permits so that four breeders could continue planting Roundup Ready beets under tightly confined circumstances in the hope of preserving a supply of seed for use after the eis is prepared. In December 2010, however, the judge ordered those plants dug up and destroyed even though usda insisted that doing so would cause “substantial harm” to seed companies and beet growers, and although the department’s lawyers observed that “not a single piece of evidence has been put forward to suggest that these fields could cause harm.” Judge White wrote in the court order that “the legality of [usda’s and the plant breeders’] conduct does not even appear to be a close question. It appears clear that [usda and the breeders] were merely seeking to avoid the impact of the Court’s prior order.” Thus, even the strictly controlled cultivation of a small number of perfectly safe plants was sufficient to violate the gratuitous paperwork obligation.

nepa’s one-sided analysis
What the courts’ nepa decisions fail to take into consideration is that gene-spliced herbicide-resistant crop varieties have been a huge boon to farmers, consumers, and the natural environment. For farmers, the most important benefit of Roundup Ready varieties seems to be the relative simplicity and efficacy of weed control provided by use of the herbicide-tolerant seeds. They produce higher yields with lower inputs and reduced environmental impacts. They enable more environment-friendly, no-till farming practices that prevent topsoil erosion, reduce runoff into streams and lakes, and release less carbon dioxide into the atmosphere. Furthermore, because glyphosate is not harmful to anything but plants and biodegrades quickly once it’s sprayed on crops, even the


Environmental Defense Fund has called it one of the most ecologically benign herbicides ever developed. Merely switching from older herbicides to glyphosate therefore yields substantial environmental benefits. These are just a few of the reasons why farmers have made biotech crops the most rapidly adopted farming technology in history. Gene-spliced varieties with herbicide resistance and other traits are now grown on over 300 million acres by more than thirteen million farmers in 25 countries. The fact that gene-spliced crops have clearly delivered substantial environmental benefits is irrelevant to the outcome of an Environmental Impact Statement, however, because agency actions are presumed to have some benefits. Otherwise, why would the agency take them? Ironically, even without such offsetting benefits, the National Environmental Policy Act would not forbid the government from taking actions that would on balance degrade the environment. The only function of nepa is to ensure that agencies consider whether their actions may harm the environment. As the influential U.S. Court of Appeals for the District of Columbia Circuit has noted, Congress’s enactment of nepa “did not establish environmental protection as an exclusive goal; rather, it desired a reordering of priorities, so that the environmental costs and benefits will assume their proper place with other considerations.” On the rare occasions in which an agency decided to take an action when the environmental impacts were likely to be great and the benefits of the project small, activists could challenge the decision as arbitrary and capricious under the Administrative Procedure Act. Generally, however, the statute’s purposes are transparency and inclusiveness. It requires agencies to consider all possible environmental impacts, consider alternative approaches that would lessen the impact on the environment, and then state explicitly the justification for moving forward with a project. The public can then better judge the appropriateness of the final decision and reward or punish the president and members of Congress at the polls. Unfortunately, the presence of offsetting environmental benefits is not enough to save the project from a court’s injunction. Courts are empowered to reverse the agency decision until all the impacts are reconsidered and discussed in the form of a lengthy and often superfluous Environmental Impact Statement.

Time for nepa reform
The goal of transparency is a laudable one, but it is manifestly not the intention of most nepa litigation. Instead, the statute has been hijacked by environmental activists in order to slow down or prevent government agencies from taking actions they do not like. Since the law requires agencies to consider almost any conceivable ecological, economic, social, or aesthetic impact, it is literally impossible for nepa to be followed to the letter. No matter how meticulous the agency is in preparing


an Environmental Assessment or eis, the statute offers fertile ground for bad-faith, obstructionist litigation. The National Environmental Policy Act is therefore a recipe for stagnation, a particular problem when “gatekeeper” regulatory agencies must grant approvals before a product can be tested or commercialized. It seems clear, then, that something must be done to change the system. But what?

Short of substantive reform of the underlying statute by Congress — the preferable solution — agencies themselves can take some minor steps to mitigate the law’s worst effects. Under the nepa statute itself and accompanying regulations promulgated by the White House’s Council on Environmental Quality, every agency may establish a set of “Categorical Exclusions” that exempt whole classes or types of activities from the eis obligation. These may include routine or repetitive actions that, based on past experience, do not involve significant impacts on natural, cultural, recreational, historical, or other resources, and those that do not otherwise, either individually or cumulatively, have any significant environmental impacts. Indeed, for these very reasons, the usda has already categorically excluded most small-scale field trials of gene-spliced plants from both the Environmental Impact Statement and Environmental Assessment requirements. The exclusion stipulates that all large-scale field tests, as well as any field release of gene-spliced organisms involving unusual species or novel modifications, still generally require an ea or eis, but any of these features could be modified if there were sufficient scientific justification for the change.

The crafting of categorical exclusions for certain actions is not trivial. It must be accomplished via notice-and-comment rulemaking in which the agency sets forth the complete analysis and rationale for excluding the activity — which is essentially tantamount to the drafting of a comprehensive eis for the entire class of future actions. Nevertheless, given the usda’s two decades of pre-commercial and commercial experience with Roundup Ready crop plants, one could easily imagine that a decision to exempt all future glyphosate-resistant varieties could be defended scientifically. Still, even when a class of activities may appear to meet the criteria for a categorical exclusion, courts are authorized to second-guess the decision to exempt them. Therefore, no matter how scientifically sound the decision to categorically exclude Roundup Ready varieties as a class may be, it would only take an aggressive plaintiff and a sympathetic judge to render the entire effort moot on the grounds that some supposedly relevant impact was not taken into consideration when preparing the exclusion.

The most reasonable and definitive approach would be simply to eliminate the agency action that triggers the nepa obligation in the first instance

— namely, case-by-case reviews of virtually all field trials and the commercialization of gene-spliced plant varieties. That would offer the dual advantages of relieving aphis’s nepa difficulties and also making regulators’ approach to gene-splicing more scientifically defensible and risk-based.

The activists’ strategy is reminiscent of the old courtroom dictum: When the facts are on your side, pound the facts; when the facts are against you, pound the table. Because there is no scientific evidence to support allegations about negative effects of genetic engineering, they are pounding the table, resorting to scare tactics and wholly unfounded assertions. Nowhere in the peer-reviewed studies or monitoring programs of the past 30 years is there persuasive evidence of health or environmental problems stemming from genetically engineered seeds or crops. Quite the opposite: The technology used to produce these seeds is a paragon of agricultural progress and benefit to the natural environment.

These obstructionist lawsuits have prevented the marketing of products that offer palpable, demonstrated benefits to the environment and to the welfare of farmers and consumers. Nuisance lawsuits intended to slow the advance of socially responsible technologies are abusive, irresponsible, and antisocial. And so are those who file them. It is long past time for nepa’s burdensome paperwork requirements to be lifted from such an important and beneficial technology.


Books
The Center-Right Honorable Tony Blair
By James Kirchick
Tony Blair. A Journey: My Political Life. Knopf. 720 Pages. $35.00.

Aside from George W. Bush, has any other Western political leader in modern times been so reviled and savaged by the intellectual elite, the media, or his own people as Tony Blair? As the targets of novelists, satirists, and polemicists, democratically elected politicians have been portrayed in some very unflattering ways, but rarely as so callous as to orchestrate the murder of their own colleagues. So loathed is the former British prime minister that he was recently depicted, albeit in fictional form, as doing just that. In The


James Kirchick is writer at large with Radio Free Europe/Radio Liberty and a contributing editor of the New Republic. He worked for a Labor Minister and Member of the House of Commons in 2005.

Ghostwriter, a commercially successful 2010 film directed by Roman Polanski (based on a novel by Robert Harris), a suave, handsome ex-prime minister is not only shown to be an unctuous lapdog to the American imperium, the most frequent charge leveled at Blair. He is also alleged to be in the Americans’ pay, through the connivance of his shrewish, Lady Macbeth of a wife.

Blair shares an important trait with his American counterpart, George W. Bush: He inspires the most zealous feelings in people. In 2005, as Blair reveals in his colloquial and enjoyable memoir, A Journey, he sat for a presentation by a pollster who had surveyed the British public’s attitudes towards their prime minister. He “had never conducted research on a person and seen such strong feelings aroused,” Blair writes, with the pollster describing the British people’s relationship with Blair as “like a love match or a marriage.” In the 1997 British general election, Blair won the greatest single-party majority in British history, largely due to his own persona and promise of reforms. Ten years later he left office with dismal approval ratings and nary a supporter in the media that had once fawned over him.

To comprehend the full extent of Blair’s transformative role in British politics — and why he elicits such passionate emotions — it is first necessary to understand the state of the Labor Party when he took the reins as leader in 1994. At the time, the Labor movement was still dominated by the country’s trade unions, which had historically played a fundamental, though disproportionate, part in its decision-making. Prominent leaders were still talking

of the “class enemy,” advocating a policy of unilateral nuclear disarmament and the eviction of American military bases from the British Isles, and clinging hard to a clause in the party constitution calling for the “common ownership of the means of production, distribution and exchange.” In 1981, a small but influential group of mps, distraught over the hard left’s domination over Labor, broke away to form the Social Democratic Party (which later merged with the Liberals to form today’s Liberal Democrats). In preparation for the 1983 general election, Labor produced a manifesto which, in the words of one rueful mp, amounted to “the longest suicide note in history.” Several years later, the party purged members of an entryist sect called the “Militant Tendency,” a “party within a party” founded by former members of the Revolutionary Communist Party.

Labor’s 1983 election defeat — as decisive a loss as its 1945 win was a victory — was an edifying experience for Blair. To most neutral observers, it was pretty clear that a party that was forced to fend off a conspiracy by Marxist infiltrators, called for abject surrender in the face of mounting Soviet aggression, and supported widely unpopular industrial actions by public sector unions that threatened to paralyze the country was not fit to govern a modern democracy. Yet to Blair’s astonishment, his fellow party members almost universally informed him that Labor had lost the election because it was not left-wing enough. “Progressive parties are always in love with their own emotional impulses,” he writes. Under a long line of leaders who headed the party for nearly two decades of Conservative

dominance, Labor had become “a party of protest, not of government.” With a determined band of young allies, Blair brought a massive wrecking ball to Labor and refashioned it in his own image. One of the first things he did was to abolish Clause IV of the party’s constitution, the one committing Labor to common ownership, which he describes as nothing less than a “graven image, an idol.” Gone was the class war rhetoric; in came policies that recognized the legitimate economic aspirations of the working class without attacking the wealthier echelons of society. These moves obviously infuriated both the party’s intellectual classes and militant trade unionists, who had sentimental and practical motives to oppose anything that had the whiff of public sector reform. A question which arises at this point of the book is why Blair ever joined the Labor Party in the first place, what with his instincts, temperament, and policy preferences falling more comfortably within the Tory paradigm. Blair did not come from a traditional Labor household; indeed, his father, a self-made lawyer, had ambitions to run for parliament as a Conservative but was foiled by a stroke. The buzzword “modernization,” which Blair uses countless times to describe his reform agenda, was essentially a campaign to make the Labor Party more conservative. Some of Blair’s left-wing critics (and not a few of his right-wing admirers) argue that he chose Labor as it provided a better vehicle for his political ambitions given the fact that it couldn’t sink any lower and had a dearth of young political talent. The greatest revelation to come from this book is the one his enemies

on the left always threw at him and which he still denies: that he is ultimately a man of the center-right. “New Labor,” despite all of Blair’s attempts to market it as anything but, is essentially a reformed, more family-friendly conservatism. On issues from education, to the provision of health care and other public services, to immigration and the recent swarm of asylum-seekers, Blair’s instinct has been to cast aside the trendy notions expounded by the intellectual left and trade unions in favor of policies that are more market-friendly and jibe with a commonsense consensus. Indeed, the entire New Labor project was to shred the party’s Old Left nostrums, while simultaneously stealing the best ideas from the Tories and making them more palatable to the general public. He introduced choice and competition into realms once deemed sacrosanct by the left as preserves of the state. As to crime, Blair “hated the liberal middle-class attitudes towards it” and adopted tough policies under the slogan “tough on crime, tough on the causes of crime,” citing Rudy Giuliani as his inspiration.

With regard to foreign policy, however, he frequently expresses befuddlement that the term “neoconservative” is applied to a view of interventionism which he sees, not without justification, as arising from old-fashioned left-wing notions of international solidarity and humanitarianism. Blair admits that he had never thought deeply about foreign policy or Britain’s role in the world until after he assumed the office of prime minister. Though Blair is now remembered mostly for his international role, he admits that “if you had told me on that bright May morning as I first went

blinking into Downing Street that during my time in office I would commit Britain to fight four wars, I would have been bewildered and horrified.” This is the sort of observation frequently made by, and about, American presidents, who enter office with grand domestic agendas only to face the ineluctable pull of the outside world. For Blair, his “awakening” was the Serbian campaign of ethnic cleansing against Albanian Kosovars. And without Blair’s insistence, it is unlikely that Bill Clinton would have made the decisive push for American intervention.

Blair is unapologetic about the most controversial and momentous decision he made as prime minister: to take Britain into war against Saddam Hussein’s Ba’athist regime alongside the United States. He reminds readers that repeated inquiries into pre-war intelligence have absolved the British government of manipulating information to make the case for war. One of the most fearsome claims included in an intelligence report — that Iraq had the ability to deploy weapons of mass destruction within 45 minutes of an order to do so — was later pounced upon by war opponents and the media as a textbook case of the Blair government’s deception. But the dossier in which the estimate, later proven false, was published had been authored by the Joint Intelligence Committee, not Downing Street, and was never cited by Blair in his own arguments.

Blair admits that, faced with the choice between a realist foreign policy of maintaining the status quo with rogue regimes and dealing with them forcefully, he “had become a revolutionary.” One gets the feeling that Blair, emboldened by his

Books
domestic political success ("I felt a growing inner sense of belief, almost of destiny," he writes about his years rising through the ranks of the Labor Party) carried that confidence over to the foreign realm, where he sought to make Britain a world power that was equally comfortable maintaining both the "special relationship" with the United States and a no less productive one with the continent. At the same time, Blair is glib on the question of Europe; while rightfully attacking the more extreme anti-European sentiments of the British right, he never fully confronts the arguments that a deeper relationship with Europe might threaten his cherished alliance with America. And so the man who entered Downing Street giving nary a thought to foreign policy left admitting that he "would have loved to" overthrow Zimbabwean tyrant Robert Mugabe, yet desisted from doing so only because "it wasn't practical." In the context of British politics, there is nothing particularly "conservative" about Blair's foreign policy interventionism. There has always been a strong, antitotalitarian wing of Labor, and a significant bloc within the Conservative Party has long been isolationist and patronizing towards American power. Yet as Blair is forced to admit, his strong support for American global primacy and a robust U.K.-U.S. alliance finds few followers in the mainstream left today. Indeed, for all of his eloquence and passion, Blair left behind a Britain that is far more anti-American, pacifistic, and inert on the world stage than it was when he became prime minister in 1997. Maybe that was the inevitable consequence of a set of policies that were as
radical as they were necessary, but it is a legacy with which Blair never seriously grapples. In passages that will make him even more unpopular with British and American liberals, Blair frequently defends George W. Bush from charges of dangerous Manichaeism, not because it is the polite thing for an ex-statesman to do in service to a fellow former leader, but because he shares Bush's view of the world. Blair's clear-eyed and moralistic understanding of Islamism, and the force of arms and ideas he believes must be married to defeat it, is of the sort that's frequently ridiculed by Britain's chattering classes (Blair would be criticized as hyperbolic by most Labor members and intellectuals not only for comparing the threat posed by militant Islamism with that of "revolutionary Communism," but for his casting the latter as anything we ought to have been seriously worried about in the first place). As prime minister, Blair formed his closest political relationships not with his fellow members of the Socialist International but with European conservatives: Germany's Angela Merkel, Italy's Silvio Berlusconi, Spain's José María Aznar, and France's Nicolas Sarkozy, all of whom he frequently praises throughout the memoir. Even former Israeli Prime Minister Ariel Sharon, a man more despised by European leftists than perhaps any political leader in recent times, wins praise from Blair. Bill Clinton, who emerges from this book as a godlike figure, came from the right wing of the Democratic Party and essentially governed as a moderate Republican. In the final chapter, which addresses the major global events that have taken place since he left office, Blair confirms
himself as a man of the center-right, at least on economic questions. "I profoundly disagree with the statist, so-called Keynesian response to the economic crisis," he writes. The schadenfreude visible in left-wing quarters over the supposed collapse of capitalism must be addressed head-on with the argument that "'the market' did not fail." Blair writes that "The role of government is to stabilize and then get out of the way as quickly as is economically sensible." Setting the retirement age at 60 is an idea he finds "absurd; horrifying, in fact." These prescriptions may sound insufficiently libertarian to the American Tea Party right, but they are outright heresies in European left-of-center politics.

Obsessive followers of the Tony Blair-Gordon Brown relationship will find much here to pore over. Brown, who served as chancellor of the exchequer throughout Blair's decade in office, was always more of a Labor man than Blair, which is precisely why he wasn't suited to lead the party out of the wilderness at its most desperate moment. The two men began as friendly colleagues, with a relationship that soon turned difficult ("we were like a couple who loved each other, arguing whose career should come first"), then utterly sour, with long stretches of passive-aggressive (and sometimes openly aggressive) hostility. That there existed a prolonged situation in which the prime minister and chancellor of the exchequer would literally not speak to one another is one of the debilitating peculiarities of the British parliamentary political system; if Blair had sacked Brown, he would have remained in parliament on the backbenches as at best an irritant, at worst an organizer of an intraparty coup. Still, keeping Brown around meant that he became a major distraction and impediment, and Blair's hesitance to fire him over his conspiring is one of his most widely acknowledged failures. One is struck by Blair's magnanimity; the arrows shot at Brown are an exception rather than the rule when it
comes to enemies and irritants. Although the media have relished these barbs (which, in light of the men's disputatious relationship, are remarkably mild, not to mention prescient), A Journey is not the prototypical, score-settling political memoir. Tony Benn, one of the most prominent far-left figures in 20th-century British politics, who frequently attacked Blair as a liar and war criminal, "is something of a national treasure." To dismantle the U.K.'s nuclear deterrent, a move Blair obviously opposed, "would not have been stupid." Blair's high-minded attitude is best evinced when he deals with Clare Short, a minister for international development who quit the cabinet over the Iraq War and later went on to disparage her former boss repeatedly and
mercilessly in the media. "She thought people who disagreed with her were wicked rather than wrong — a common failing of politicians — and when she turned sour, she could be very bitter indeed," Blair writes, with hardly a trace of what must surely be his own bitterness at her betrayal. "But we should be proud of our aid record and she of her part in it." If only those who disagreed with Blair showed him the same charity that he has shown them.

The conventional wisdom on Blair is that he, along with aide-de-camp Alistair Campbell, introduced "spin" to British politics. This always seemed exaggerated; double-talk and outright lying are hardly recent phenomena in human nature, never mind political life. Yet there are several instances in this book which make one doubt Blair's sincerity, and where his overwrought piety gets the better of him. For instance, Blair insists that it's "the honest truth" that he "was never desperate to be prime minister or to stay as prime minister." Such an avowal, expressed by many a politician, is better left in one's head, no matter how sincerely he may actually think he believes it. Elsewhere, Blair denies any impropriety in the Bernie Ecclestone affair, in which a £1 million donation to the Labor Party by the head of Formula One Racing was curiously followed by the British government's attempt to exempt the league from a ban on tobacco advertising in sports. (Ecclestone, Blair writes credulously, "had genuinely never made a linkage, not even implicitly" between his massive donation and the proposed loophole.) Blair strangely absolves Russian President (now Prime Minister)
Vladimir Putin of being a ruthless KGB thug on more than one occasion, writing that he "had prosecuted the war in Chechnya with vigor and, some said, brutality." Some said?

As to the prose, Blair sprinkles his memoir with colloquialisms and honest observations about family and political life that demonstrate the common touch he had with voters, a connection that, though short-lived, was nonetheless unprecedented in British politics. "Your average Rottweiler on speed can be a lot more amiable than a pensioner wronged," he writes of an elderly woman at a public event holding a sign labeling Blair an unprintable name. "I like to have time and comfort in the loo," he declares. Rupert Murdoch "had balls." Deputy Prime Minister John Prescott, a burly former ship steward who acted as Blair's muscle in dealing with the old Labor Party guard, had a knack for sniffing out young smart-alecks "like a pig with a truffle." On children, Blair observes, "from about age three onwards, they get interesting and remain like that up to around twelve, when the dark mists of hell envelop them." He writes self-deprecatingly of his initially awkward interactions with the royal family: "Had it been a dry event, had the Queen been a teetotaler or a temperance fanatic, I don't believe I could have got through the weekend."

Most frustrating to Blair's detractors — on left and right — was his uncanny ability to read the British mood. Even when he knew he was swimming against the tide (as on, say, Iraq), he was cognizant of the divide between his views and those of the public, never deluding himself that he had maintained the popular support that was so apparent in the aftermath of Labor's 1997 landslide victory. Blair persisted because he thought he was in the right, and he admits here that he was willing to lose his premiership over Iraq if his party or the electorate so decided. It was perpetually confounding that a man so loathed by so many people, and whose decision to take his country to war in Iraq soon became almost universally unpopular, could still win a massive electoral majority in 2005. Blair has his own explanation for why the voters were willing to give him and his party an unprecedented third term: "they had a keener appreciation of how tough it was to decide the issue [of Iraq] than the black-and-white predilection of the media." Blair makes repeated mention of the strange alliance between Britain's reactionary right-wing press (embodied by the "little England" tabloid Daily Mail) and the highbrow, left-wing outlets (the Guardian, Independent, and BBC), which united against him over his decision to take Britain to war. Yet despite their best efforts to portray him as a man who misled the country (the difference between Guardian editorials decrying Blair's alleged mendacity and posters at antiwar protests screaming "bliar!" was a matter of articulacy, not reason), attempts by Blair's enemies to destroy his political career were a complete failure. It says something about Blair's instincts that the one issue where his "emotional intelligence" was completely off-base was on the matter of fox hunting, a practice about which Blair too readily assumed the politically correct prejudices and class resentments that he usually saw as misinformed.
Blair’s genuine fear that Brown would not continue through with the reforms he championed persuaded him to remain in the top office for so long, a decision which served only to heighten the poison between the two men. Blair warned Brown not to abandon the policies he had implemented over the course of ten years in Downing

Even when he knew he was swimming against the tide, Blair was cognizant of the divide between his views and those of the public, never deluding himself that he had maintained the popular support.
Street, telling his successor that “there was no alternative vision” to that of New Labor, a conscious invocation of Margaret Thatcher’s catchphrase emphasizing the indispensability of liberal economic policies to a modern economy. At its annual conference in late September, the Labor Party chose Ed Miliband — a longtime ally of Brown and a rising star on the Labor left — as its next leader. Ed defeated his older brother David, a Blairite and former foreign secretary, for the job, and did so thanks to the support of the unions, which, despite Blair’s best efforts, still maintain a disproportionate role in party affairs thanks to an outdated electoral college system. When the results came in, a trade union delegate

leaned over to former Labor leader Neil Kinnock and said, "We've got our party back." That may very well be the case. But the chances of the Laborites getting the country back grow slimmer the more they distance themselves from the legacy of Anthony Charles Lynton Blair.

Thinking About Torture
By Peter Berkowitz

Charles Fried and Gregory Fried. Because It Is Wrong: Torture, Privacy, and Presidential Power in the Age of Terror. W. W. Norton & Company. 222 Pages. $24.95.

Peter Berkowitz is the Tad and Dianne Taube Senior Fellow at the Hoover Institution, Stanford University. His writings are posted at www.PeterBerkowitz.com.

This in many ways admirable book was "born of conversations" between father and son. The father, Charles Fried, a professor at Harvard Law School, is the author of many works on legal and political philosophy, and served as solicitor general of the United States under President Ronald Reagan. The son, Gregory Fried, is chair of the Philosophy Department at Suffolk University and author of Heidegger's Polemos: From Being to Politics. The father voted for George W. Bush in 2000 and supported the war in Iraq; the son opposed Bush and "only reluctantly" backed the Iraq war. The conversations in question arose out of shared concerns about the dangers posed to the United States by those who attacked the country on 9/11 and about the legality and morality of measures adopted by the Bush administration to defend the nation. Father and son found that their conversations increasingly focused on two controversial tactics in the war against the terrorists — brutal interrogations of suspected terrorists abroad and pervasive electronic surveillance at home — tactics used to get desperately needed intelligence about a hidden and unfamiliar enemy. What the two tactics shared, the Frieds realized, was the question of whether an executive has the right to break the law in a time of crisis, for both were indeed illegal.

To assess whether Bush administration interrogation and surveillance policies were nevertheless proper, the Frieds examine the meaning of torture and the significance of suffering and inflicting it; the sphere of privacy and the cost to liberty when government invades it; and the conditions under which executives may honorably and justly break the law. Despite their political differences, father and son succeed in producing a single voice and, up until their final pages, a single line of argument. They diverge on what to do about the Bush administration officials implicated in the use of harsh interrogation techniques that, they assert unequivocally,
broke domestic and international law prohibiting torture. The younger Fried believes that Bush administration officials should be prosecuted to the full extent of the law, while the older Fried believes the extraordinary circumstances under which they acted and the need in a democracy for winners in elections to refrain from using their power to pursue their defeated rivals counsel forbearance. They articulate their difference of opinion in the same lucid tones and restrained terms that characterize even their most uncompromising claims. At the same time, their argument is marked by evasions, equivocations, and rash conclusions. The evasions and equivocations begin with their failure to state clearly that the genuinely hard questions they laudably confront arose not in the struggle against terrorists of all kinds but in a battle against Islamic extremists who have chosen terror as their tactic and are determinedly seeking weapons of mass destruction to kill vast numbers of American civilians and strike crippling blows against the United States. Prominent among their rash conclusions are the two most dramatic in their book: the philosophical opinion that torture is “absolutely wrong,” and the legal and political claim that in ordering the use of harsh interrogation and warrantless electronic surveillance the Bush administration brazenly, if on behalf of the national interest as the president and his team understood it, defied the law. Although scholars of law and politics will profit from their book, which draws on art, philosophy, moral and political theory, history, and legal analysis, it was not in the first place written for specialists. It is intended,
rather, for "fellow citizens, and particularly for those engaged in public service, be it in the military or the government, who find themselves running up against those limits in the crisis we now face and may continue to face in new and unexpected forms." Indeed, among the impressive features of the book is the way in which the force of the Frieds' analysis, in its graceful movement between a more philosophic perspective and a more political perspective, compels them to run up against limits to their own view and at the end to call into question the cogency and morality of their central contention that torture is absolutely wrong. The Frieds recognize, as so many critics do not, that Bush administration authorization of harsh interrogation of enemy combatants and use of supercomputers to scan electronic communications for hints of terrorist activities sprang from legitimate concerns. Intelligence, they emphasize, while critical in all military operations, is even more so in the struggle against terrorists, because "governments know so little about where the enemy is or even who he is." In this context they might have added the still greater demand for intelligence against jihadists whose language, history, culture, religion, grievances, and goals were virtually unknown in the United States in 2001 and remain today poorly understood in and out of government. While examining the implications of what they regard as the Bush administration's trampling on the law is an important objective of their book, the Frieds also seek to expand the debate about post-9/11 terror and surveillance policy beyond what they regard as the "weirdly legal terms" in which it has
largely been conducted. To deepen and refine the issues, they pose, and pursue answers to, “basic human questions” implicated in our judgments about torture and surveillance. These questions concern our rights and responsibilities as citizens and human beings, and in the Frieds’ book they culminate with an exploration of whether “our leaders [are] bound by the same rules as the rest of us? Or because they are responsible for all of us, may they do things (and order things to be done) that the rest of us must not?” Even where their answers fall short, their book demonstrates that answers to the hard legal questions raised by 9/11 depend on an inquiry that is at once moral, political, strategic, and philosophical. Take torture. The Frieds argue that it is not merely “intrinsically bad” but absolutely wrong — everywhere, always, and no matter what the circumstances. The use of violence to obtain information cannot be justified, they argue, when the police have good reason to believe a criminal suspect under their control has information that could lead to saving the life of a kidnapped child. The use of violence in interrogations cannot be justified even in ticking-time-bomb scenarios, in which a terrorist may possess information that might save tens or hundreds of thousands of lives or more. The Frieds respect but reject Alan Dershowitz’s sober view that in extreme circumstances, when officials conclude that the balance of considerations favors using torture, they should be able to go to a court for permission. And while they share Richard Posner’s opinion that since the use of courts would routinize torture and implicate the whole system in an immoral act,
torture should remain illegal, they respect but reject his unblinking proposal that since torture may nevertheless be necessary in extreme circumstances, officials who order it should be prepared to face the legal consequences in court and, if extreme circumstances did warrant its use, judges should be prepared to show leniency. The Frieds certainly understand that law is coercive. And they know that war, which can be necessary and lawful, is brutal. But torture — an act which, as a first approximation, is designed to elicit information or punish and which inflicts acute physical or mental pain in order to destroy the prisoner's will — in their view is different. The difference is grounded, they contend, in the sacred character of every individual human life.

According to the Frieds, torture is absolutely wrong because human beings are created in the image of God and torture desecrates the divine image in man. It is a bold step for these two philosophically minded professors to invoke the beautiful and tremendously significant teaching in the Bible's first chapter that God created man, male and female (as the Bible emphasizes, Genesis 1:27) in His image. Deriving an absolute prohibition from it, however, is beset with problems. That human beings are created in God's image signifies, the Frieds contend, that in all human beings there is something "of a value and significance than which nothing is greater." Set aside the philosophical and theological problem that the image would seem to be of less value than the original. Strangely, given that they wish to rest
an absolute prohibition on it, they deny that the notion that man is created in God's image rests on belief in God, since secularists "have similarly celebrated the sacredness of the human person." That's true but begs the question. At issue is whether secularists can coherently and convincingly affirm the sacredness, or transcendent value, of the human person while denying God's existence. The words of Hamlet (Hamlet, Act 2, Scene 2, lines 319–323) comparing man's actions to those of angels and man's understanding to that of God that they cite to support the idea that secularists can believe in the sacredness of individual human life may suggest the inescapability of religious belief for finding absolute value in humanity. But Hamlet's speech does nothing to show that it is coherent or convincing for nonbelievers to affirm the absolute significance of human life. More to the point — if it was philosophical clarity rather than affirmation of a secular faith that they were after — would have been addressing Nietzsche's powerful argument that our modern morality of freedom and equality is a descendant of biblical faith, and collapses with the collapse of the belief in God. Instead, the Frieds weakly assert that "nothing is lost in the logic of our argument (though perhaps something of its force) if we substitute the humanist for the religious conception of this sacredness." But if the logic of the argument for an absolute prohibition on torture depends on the premise that human beings are composed of a divine element, then allowing that the premise is true only in a metaphorical sense substantially diminishes the force of the argument's logic.
Moreover, even if, as the Bible teaches, human beings are created in God's image, what justifies an absolute prohibition of torture? The Frieds accept that it is just to kill in self-defense, both for individuals threatened by criminals and for nations facing enemies. And they know that under the laws of war doctrine of proportionality, while innocent civilians must not be targeted, in the pursuit of legitimate military objectives soldiers may, if their actions are not excessive, incidentally injure or kill civilians without committing a crime. Why then, if the information gained might save hundreds of thousands, or hundreds, or tens of innocent lives, shouldn't an enemy combatant, in extreme and extraordinary circumstances, be harshly interrogated? Isn't placing an absolute prohibition on the use of violence in interrogations — declaring it not just morally and legally wrong but absolutely forbidden under all conceivable circumstances — tantamount to playing God? In part, the Frieds' reply is that torture generally is absolutely wrong not merely because of the pain and humiliation inflicted on the victim, but because of the degradation to his humanity suffered by the torturer. To commit torture "is to become the agent of ultimate evil no matter how great the evil we hope to avert by what we do." The Frieds' powerful examination of the experience of torture will leave few doubting that it is vicious and cruel to the victim and corrupting to the torturer. And yet their argument arouses rather than stills concern that it would be an act of selfishness, in some cases monstrous selfishness, to permit a hundred thousand innocents to die terrible deaths — or perhaps one innocent to perish — in order to keep one's own hands clean and conscience clear.

In moving from the morality of torture to the conduct of the Bush administration, the Frieds show a good deal more appreciation of the moral and political complexity of the context in which administration officials acted than most of their academic colleagues, but on the question which requires the greatest care they cannot contain a propensity to rush to judgment. They treat as uncontroversial, indeed they assert without the slightest legal argument, the conventional progressive wisdom that in authorizing enhanced or harsh interrogation the Bush administration committed torture and knowingly and clearly broke the law. Astonishingly, however, the lawyer and law professor father and the philosophy professor son never perform the elementary work of analyzing the deliberately ambiguous legal definition of torture. In fact, one of the striking features of the Bush administration war effort rendered invisible by the Frieds' account was that even in adopting extreme measures it sought and obtained legal authorization. To be sure, the legal reasoning employed by Deputy Assistant Attorney General John Yoo and his boss in the Office of Legal Counsel, Assistant Attorney General Jay Bybee, in their 2002 memo which provided the initial legal basis for the use of enhanced interrogation techniques, including waterboarding, was overbroad and sloppy. But that is hardly the end of the story, or even the crux of the matter. Despite pressing for interpretations of the Constitution that gave the executive wide and arguably unprecedented latitude, by constantly seeking and receiving legal opinions and ultimately complying when its lawyers said no and when courts said no, the Bush administration demonstrated its determination to adhere to the law. Accordingly, it is a grave mistake to present the Yoo and Bybee memo, which went so far as to suggest that under the Constitution the commander in chief could not be bound by torture law, as somehow emblematic of the Bush administration view. The Frieds bury in a footnote that Bush administration Assistant Attorney General Jack Goldsmith, who replaced Bybee as the head of the Office of Legal Counsel in 2003, withdrew the controversial Bybee and Yoo memo. Nor, in leveling the charge of lawlessness against the Bush administration, do the Frieds give sufficient consideration to the unprecedented involvement in military matters in the midst of war that the Supreme Court undertook after 9/11 through a series of cases concerning the rights of enemy combatants. And when, with the military campaign against al Qaeda and affiliates far from over, the Supreme Court ruled against it — in 2004 in Hamdi v. Rumsfeld and Rasul v. Bush, and in 2006 in Hamdan v. Rumsfeld — the Bush administration did not hesitate to comply with the Court's rulings, and thereby powerfully demonstrated a core respect for the rule of law.

As with harsh interrogation, the Frieds probe beneath the legal controversy over the government's massive efforts to glean information about terrorists' activities from the constant flow through cyberspace of immense amounts of electronic communications to bring into focus the human questions. We value privacy, they observe, because we fear that information about the personal details of our lives will be used, particularly by the government, to harm us, and because part of our notion of personal autonomy is that we and not others should decide what is publicly known about our lives. Laws aimed at securing individual freedom necessarily put a premium on safeguarding privacy, contend the Frieds, but unlike the prohibition on torture, the prohibition on invading privacy, they argue, is not absolute. Invasions of privacy are a lesser evil in part because the boundaries of privacy are conventional. It is also partly because the Constitution's central promise of privacy (a term actually not found in the document) in the Fourth Amendment is limited: "the right of the people to be secure in their persons, houses, papers, and effects" is not protected absolutely but against "unreasonable searches and seizures" (emphasis added) and "probable cause" is sufficient to justify the issuance of a warrant that allows the police to examine intimate details of one's life. Although they are certain that the Bush administration's expansive interpretations of its authority to conduct warrantless electronic wiretapping violated the 1978 Foreign Intelligence Surveillance Act (FISA) — the Frieds again do not pause to examine the Bush administration argument that it operated within the law — the Frieds insist that drawing the boundaries depends on "precedent and practical wisdom" and "convention and historical practice." Despite their condemnation of his decision to authorize harsh interrogations and their criticism of his electronic surveillance programs, the Frieds
find no reason to doubt that in responding to 9/11 President Bush was moved by "a sense of honor, the sense of an unexpected, awesome responsibility, and the imperative to do his duty." And like wartime Presidents Jefferson and Lincoln, Bush, the Frieds maintain, "thought that the crisis he faced required him to break the law." In 1807, Jefferson on his own authority spent money to fortify U.S. naval vessels against British attacks, even though Article I, section 9, clause 7 of the Constitution makes clear that funds from the Treasury must be appropriated by Congress. And in 1861, at the outbreak of the Civil War, Lincoln unlawfully suspended the writ of habeas corpus, ultimately imprisoning between 10,000 and 15,000 Maryland citizens without due process of law. To make sense of executive power in emergency circumstances, and to vindicate Jefferson's and Lincoln's exercise of it, the Frieds provide a thoughtful discussion of seminal explanations that Aristotle, under the name of equity, and Locke, under the name of prerogative, give for the need when the law fails in its purpose to correct it by completing or even acting against it. But according to the Frieds, Bush's conduct in the wake of 9/11 was "a radical departure from the behavior of Jefferson and Lincoln in a time of crisis." In contrast to Bush, Jefferson and Lincoln
both openly recognized the danger in their own actions, and acknowledged that they had violated the Constitution, seeking ratification by Congress after the fact. Bush and his administration insisted that the power of the president to violate the law was an entailment of presidential authority to act in his capacity as commander in chief in a time of war and emergency, allowing Congress only so much oversight as he could not avoid or evade.

Their own formulation, however, suggests that while there was an important difference between Bush and his great predecessors, the Frieds misunderstand it. Unlike Jefferson and Lincoln, Bush insisted that he did not break the law. Indeed, in contrast to Jefferson and Lincoln, his administration went to considerable lengths to show that all of its actions were authorized by the Constitution, properly understood. To the detriment of the judicious analysis to which they laudably aspire and for the most part achieve, the Frieds throughout their book blur the difference between proceeding in defiance of the law and proceeding on the basis of controversial, dubious, or even weak but nevertheless recognizably legal rationales. In their final chapter, perhaps as a result of their intervening reflections on the need for practical wisdom to correct the inherent limitations of law, the Frieds exhibit a certain anxiety about the moral and political implications of an absolute prohibition on torture. While they disapprove of the distinctive combination of good humor and cold-bloodedness with which Machiavelli in The Prince advises rulers "to learn how not to be good" and to use goodness or conventional morality according to the dictates of necessity, they recognize that the great Florentine had a point. Political responsibility brings dirty
hands, tragic choices, and “a sadder but wiser prudence.” The Frieds even permit themselves the daunting thought, at odds with their conclusion that torture is absolutely wrong and an ultimate evil, that “the most costly kind of moral heroism” may be a political leader’s “ultimate responsibility to be prepared to lose even his soul in a cause that all can understand.” Contrary to their signature contention that torture is absolutely wrong, father and son seem to be acknowledging that even the firmest and finest principles must yield to the judgments of practical wisdom.

Brutish and Short
By Henrik Bering
Robert Coram. Brute: The Life of Victor Krulak, U.S. Marine. Little, Brown and Company. 359 Pages. $27.99.

Henrik Bering is a writer and critic.

Small men were never the Marine Corps' target audience in its recruitment drives: The world has yet to see a Marine poster calling for "a few tiny men." But it just so happens that one of the Corps' true legends was the smallest man ever to graduate from Annapolis. When Victor "Brute" Krulak faced the Medical Examination Board back in 1933, he measured only 5'4" and weighed some
116 pounds — two inches too short for commission as an officer. One of the stories he told of passing the examination was how he paid one of his friends to whack him over the head with a plank so that the resulting knot would add the extra inches. Other versions have him gaining height by sheer willpower. Actually, according to his biographer Robert Coram, he received a waiver, because one of his classmates had gotten one, and thus a precedent had been established. As to his nickname, just as cynics will name their pet Chihuahua Tyson, Krulak got his when a huge upperclassman surveyed him contemptuously and asked "Well, Brute?" Meant as an insult, the name was one Krulak embraced and set out to fill. What he lacked in size, he made up for in ambition and energy. He failed to make commandant of the Corps only because of LBJ's vengefulness, but his impact on the Corps and the way it fights is immense, as documented by Robert Coram's splendidly entertaining biography, Brute. Coram sees Krulak as "the most important officer in the history of the Marine Corps."

Krulak's leadership style was certainly colorful. Among his standard observations was, "This place needs an enema," and the phrase became part of Marine argot. It was not unusual for colonels to take early retirement rather than serve under him. His bible was a book called A Message to Garcia. During the Spanish-American War, a lieutenant was told to deliver a message to an insurgent leader in Cuba, whose whereabouts were unknown. No further instructions were given. Undeterred, the lieutenant got himself off to Florida, landed a small boat on
Cuba during the night, tracked down the elusive Garcia in the mountains and handed him the message. This was Krulak's idea of a resourceful officer, and countless copies were handed out to his staff officers. Occasionally, he would reveal a more human aspect: During a parade, a major accidentally knocked off his own "cover," his hat, during the sword salute. Afterwards he was ordered to pick up the sorry remains, by now ground into the dust by marching feet. Humiliated, the major did not show up at a party at Krulak's place that same evening. A note arrived at the major's house, stating that another officer had once had a similar experience without it harming his career. The same thing had happened to Krulak himself.

Krulak had less admirable sides, though. Throughout his life, he carefully kept his Jewish roots hidden. From a career point of view, this is understandable, given the bias against Jews and blacks at the time, but it caused pain to his family. Worse, not content with the truth, he constantly felt the need to improve on reality, which is odd, given all his achievements. Much of it was harmless, Coram notes, but as he reminds us, according to the Corps' honor code, a Marine is not supposed to lie.

Back in 1920, a Marine Corps intelligence officer, Lieutenant Colonel Earl Ellis, published a report predicting a future war with Japan. Entitled "Advanced Base Operations in Micronesia," it is, Coram notes, "one of the most prescient military studies ever written." In order to gain points of support and airfields from which Japan
could be reached, America would have to leapfrog its way across the Pacific. This necessitated amphibious landings on defended coasts, which ever since the British disaster at Gallipoli were deemed “almost impossible,” in the words of the influential British military theorist Liddell Hart. The Marines disagreed; in their view, Gallipoli had failed because of poor planning and lack of coordination between the services. The Corps set about developing the concept of amphibious landings, which was to prove the war winner in World War II, both in the Pacific and in Europe. What the Marines lacked was the right kind of landing craft. While stationed in San Diego as a young lieutenant, Krulak was tasked with monitoring the trials of the various experimental craft, none of which worked. Because of their exposed propellers, they would get stuck in the sand if they got too close to the beach, which meant that the Marines would often plop in where they could not reach the bottom. The high gunwale did not make things easier. A 1937 transfer to China brought the solution, and ironically, Coram notes, it came courtesy of the Japanese. When the Japanese during the attack on Shanghai staged an amphibious flanking movement from the sea, Krulak as a military observer got himself a U.S. tugboat and sailed it in right among the Japanese warships. From his tug, he got a good look at the Japanese landing craft, whose flat bow would open and form a ramp when hitting the beach. At a later stage, he also got a view of the bottom of the craft, which were being repaired on land, revealing how the propeller was protected in a
tunnel, and how the bottom had two skegs, stabilizing it. Krulak's copious notes and photographs formed the basis of his report, "Japanese Landing Operations Yangtze Delta Campaign," which he forwarded to the Navy in the naïve belief that they would act on it immediately. When Krulak came to Washington to check on its progress, a Navy lieutenant finally located the report in some dusty cabinet. On the cover was scrawled a note: "Prepared by some idiot out in China." The Navy still operated with versions of Atlantic fishing boats. Resolutely, he returned to Quantico and produced a balsa-wood model which incorporated the revolutionary Japanese features, and took it to his superior, General Holland Smith, who set up a briefing with the Marine commandant. The outcome was that Krulak became the Corps' "boat man." Krulak tested and suggested improvements to Donald Roebling's Alligator, which had tracks enabling it to negotiate coral reefs, and which became known as the Amtrac. And he aggressively promoted the Higgins boat, which incorporated the drop bow and which in trials blew the Navy's entrants out of the water. As Coram notes, it is normally Holland Smith who is credited with the development of the Higgins boat, but it was Krulak who was point man in developing the landing craft.

In World War II, Krulak got his baptism by fire leading a diversionary attack on Choiseul in the Solomon Islands. The object was to trick the Japanese into believing that the main attack would come there rather than on Bougainville.
As regards vegetation, Choiseul was the most inhospitable of the islands, and of course it was infested with Japanese snipers. To make it harder for the snipers, who fire on officers first, he ordered all insignia removed, and only first names used. "If any of you call me Colonel, I will reply loudly, 'Yes, General.'" Because Krulak's targets were far apart, he had to divide his force. The second force ran into a Japanese battalion and had to be extracted at night by two PT boats, one of which was driven by John F. Kennedy — their own Higgins boats having been destroyed by mistake by friendly aircraft. Krulak's own force fought on, destroying food and ammunition dumps. Altogether, his men racked up 72 confirmed kills before getting taken off the island. He lost only six men and was awarded the Navy Cross. On Okinawa, he again showed great courage, coordinating attacks on the front lines. He suggested to the overall commander, General Buckner, that the Marines make an amphibious landing on the southeast coast of the island, which would force the Japanese to divert forces away from the main front; Buckner, an Army man, turned down the proposal. "General Buckner did not like the water. We liked the water. It is a very useful route to get from A to B," Krulak observed, and he later wrote in the Marine Corps Gazette, "For the force that has the skill and the courage to use it, the ocean is an immense tactical ally." Clearly, writes Coram, Krulak's words were "a slap at the army." Incidentally, Buckner died when inspecting a Marine position on Okinawa. A Japanese artillery barrage
caught him, probably, suggests Coram, because, unlike the Marines, Army officers insisted on wearing insignia.

Relations between the Marines and the other services have always been tense, and Brute is as much a book about the Marines' constant bureaucratic battle for survival as it is about Krulak himself. The tension goes all the way back to Belleau Wood in World War I, the spot where the Marines blocked Ludendorff's offensive and the German drive for Paris. Not only did the Marines block the German advance, they pushed them back. Afterwards, the Army jealously refused the Marines a memorial plaque in Belleau Wood, a situation that was not rectified until 1955. In World War II, for the Marines, Guadalcanal became the equivalent of Belleau Wood. But though in Coram's view General Holland Smith and his Marines achieved as much in the central Pacific as the Army did in the western Pacific, at the surrender, MacArthur could not find it in himself to invite the commandant of the Marine Corps to be present during the ceremony. There was similar bad blood between the Navy and the Marines. Immediately after the Marines' landing at Guadalcanal, Admiral Frank Fletcher had declared his aircraft carrier short on fuel and had left them to fend for themselves. As to when the Marines will forgive this, Coram quotes a recent editor of the Marine Gazette: "the 12th of Never." And at Iwo Jima, the Navy had been asked for nine days of pre-invasion bombardment, but only delivered three. A direct threat to the Corps' existence emerged towards the end of the
war, when George Marshall launched a plan for "a single department of war in the post-war period," i.e., a German-type staff system with a single military chief of staff controlling all branches. The proposal would weaken the controlling powers of Congress and it would reduce the Marine Corps to a mere Navy police force. As a member of the Chowder Society, Krulak was in the forefront of the successful efforts to defuse Marshall's scheme. The society got its name from a comic strip, Barnaby, whose main character belonged to a social club called "The Elves, Leprechauns, Gnomes, and Little Men's Chowder and Marching Society," which an officer had tacked on the wall with the last six words underlined and an arrow pointing to Barnaby. Instead of Barnaby, the officer had written Krulak. The society fought tooth and nail for the survival of the Corps. Among Krulak's efforts was the script for a film, Bombs over Tokyo, which made it plain that Japan could not have been bombed if it hadn't been for the Marines. An exasperated Truman at one point accused the Marines of having "a propaganda machine that is almost equal to Stalin's," a comment for which the president had to apologize.

As Coram notes, America's mistrust of a standing army leaves it dismally unprepared when war is forced upon it, as the case of Korea amply illustrates. When the North Koreans attacked, MacArthur rushed in garrison troops from Japan, but getting combat-ready Army troops on the way from America would take weeks. The Marines were asked how soon a Marine reinforced battalion of 1,200 men and a reinforced regiment of 3,600 men respectively could be ready to sail. His boss being on holiday, Krulak's answer was: "48 hours" for the battalion, while the larger force would take "five days, including a Marine aircraft group." As Coram notes, this was a key moment in Marine history: Krulak had no idea if he could deliver, but he knew that "If we can't, we're dead." The Corps needed to demonstrate its indispensability.

And the Marines met the challenge. Sailing from Hawaii, they were the first troops to arrive from America. Here they found MacArthur and his troops about to be pushed into the sea at Pusan, and their first feat was to stabilize the Pusan perimeter. Subsequently the First Marine Division led the assault at Inchon, and pushed north. The war looked as good as won, when the Chinese hordes came pouring across the border, something MacArthur had said would not happen and which left the First Marine Division completely surrounded. In the breakout from the Chosin Reservoir — about which their commander, General O.P. Smith, stated that the Marines were not retreating, they were just attacking in a different direction — the Marines again proved their mettle, killing more than 37,500 Reds, against 4,418 casualties of their own. In all three places — at Pusan, Inchon, and Chosin — Krulak distinguished himself. He was all over the battlefield in his helicopter, issuing orders left and right and reporting back to headquarters. As Coram notes, the helicopter introduced a new element in warfare: It performed reconnaissance duties, dropped supplies, and evacuated the wounded; most significantly, the
war saw the first helicopter assault in history in October 1951. This was very much a result of Krulak's vision back in 1948, when the helicopter was still in its infancy. Krulak saw its potential to add an extra dimension to the attack, as "a cavalry of the sky" that "would carry men into combat from all directions." "The evolution of the set of principles cannot wait for the perfection of the craft itself, but must proceed concurrently with that development," he wrote, and he set about creating the doctrine for what was to become known as "vertical envelopment."

The 1950s also saw the two most embarrassing cases to hit the Marine Corps. One was the Schwable case: Colonel Frank Schwable's plane had been shot down in 1952 by the North Koreans, and in order not to give away the nuclear targeting details, to which he had had access when working for the Joint Chiefs of Staff, he instead said the U.S. had conducted germ warfare, a real propaganda windfall for the North Koreans. To make POWs less susceptible to brainwashing techniques, Krulak instituted training reforms with particular emphasis on American history and values. The other was an exercise at Ribbon Creek in 1956, where a drunken Parris Island drill instructor ordered a bunch of recruits out on an unauthorized night march and six of them drowned in the creek. The drill sergeant had made no attempt to recover the bodies. Unwisely, the Marine Corps tried to downplay the incident, with the predictable result that the media went into a frenzy. (By comparison, after a freak accident on Okinawa under Krulak's command, in which eleven soldiers
drowned during a typhoon, he laid out the facts immediately.) Krulak started the long process of reforming recruit training methods by issuing new guidelines for drill instructors and seeing to it that several got fired. You want tough hombres as drill instructors in the Marine Corps. You do not want sadists.

Under John F. Kennedy, Krulak became special assistant for counterinsurgency and special assistant for special activities, i.e., covert action. These were key positions, given the administration's emphasis on guerrilla warfare, which meant he wielded influence way beyond his two-star rank. From a variety of sources, he put together the country's counterinsurgency doctrine — how it was essential to gain the trust of the population, and where the protection of the locals is priority one. During the presidential campaign, he shamelessly puffed up his relationship with JFK, telling how Kennedy had rescued him with his PT boat in the Pacific; how after the rescue he had promised Kennedy a bottle of Three Feathers, the rotgut whiskey consumed by World War II troops; and how, after Kennedy had won, he had dropped in on the new president in the White House with a bottle of the stuff. All this was pure invention. "The truth is that Krulak and Kennedy never met in the Pacific," Coram writes, as it was the other part of his force Kennedy had rescued, and he refers to an addendum to Krulak's oral history at the Kennedy Library where Krulak admits they never met until after Kennedy became president. The addendum was to be read only after Krulak's death.
As the Vietnam War intensified, Krulak's clashes with General William Westmoreland, the commander in chief in Vietnam, were unavoidable. As a conventional Army general, trained to fight in Europe, Westmoreland did not buy any of this newfangled counterinsurgency stuff: Forget about hearts and minds, Westmoreland preferred raw firepower. What was particularly hard to accept for Krulak was Westmoreland's use of the Marines in a traditional battle of attrition at Khe Sanh, a place of no strategic value. So as regards counterinsurgency, Coram rates Krulak's efforts to get his message across "an abject failure." Krulak's run-ins with journalists David Halberstam, Stanley Karnow, Peter Arnett, and Morley Safer over their strong anti-military bias were equally bitter. Safer's famous sequence with the Marines setting fire to a hut with a Zippo lighter particularly angered Krulak. What Safer conveniently left out, Krulak noted in a subsequent internal paper, was that the Marines had been under automatic fire from the village, which was crisscrossed by trenches and booby traps. Coram cites Chicago Daily News reporter Keyes Beech, who was present at the time, in support of Krulak's version.

In the end, Krulak's outspokenness was his downfall. Krulak told Lyndon Johnson to his face how men were being needlessly killed because of his approach, the only senior general with the guts to do so, Coram writes. A photograph shows Krulak lecturing Johnson with pointed finger and Johnson looking hugely uncomfortable. The meeting ended with Johnson propelling him out of the Oval Office. As a result, Krulak did not get the appointment as commandant of the Corps for which he had been regarded as a shoo-in. He was not fired either, just left to dangle, in a classic example of LBJ's vindictiveness. But while Krulak never made commandant, his legacy outweighs most of those who did.

Market Capitalism, State-Style
By Ying Ma
Ian Bremmer. The End of the Free Market: Who Wins the War Between States and Corporations? Portfolio. 240 Pages. $26.95.

Ying Ma is a visiting fellow at the Hoover Institution.

Ian Bremmer believes that the free market is worth defending. Though market capitalism has taken a severe beating in the recent global financial crisis, he insists that it remains the best model for creating wealth, promoting growth, and delivering prosperity worldwide. Yet, as he explains in his new book, The End of the Free Market, market capitalism now faces a major threat from within capitalism itself. That threat is state capitalism, and around the world it appears to be the new fad. Its adherents range from powerful former communist countries like Russia and China, to Arab monarchies
of the Persian Gulf, such as Saudi Arabia and the United Arab Emirates, to energy-rich authoritarian states like Iran and Venezuela. Democracies such as India and Mexico showcase elements of state capitalism too, and vibrant emerging markets such as Brazil flirt with it. The governments may be different, but the underlying logic is the same. According to The End of the Free Market, state capitalism is a system in which the government acts “as the dominant economic player and uses markets primarily for political gain. The ultimate motive is not economic (maximizing growth) but political (maximizing the state’s power and the leadership’s chances of survival).” State presence in the economy, however, does not automatically make a country state capitalist. After all, the global financial crisis that began in 2008 has made clear that even market capitalist governments, under dire circumstances, will not hesitate to carry out massive interventions in their economies. The distinction between them and state capitalists, says Bremmer, is that the latter see government intervention not as a “temporary series of steps meant to rebuild a shattered economy or to jump-start an economy out of recession” but as a “strategic long-term policy choice.” For state capitalists, markets function “primarily as a tool that serves national interests, or at least those of ruling elites, rather than as an engine of opportunity for the individual.” Unfortunately for market capitalism, the great crash of 2008 has greatly burnished state capitalism’s credentials. As the excess of unfettered capitalism wreaked havoc and chaos worldwide,
the free market lost much respectability, and the United States, the source of the financial crisis and the market’s most vocal champion, lost a great deal of its global economic swagger. Nonetheless, state capitalism is not the way to go, according to The End of the Free Market. The book describes the nature and appeal of the state capitalist model, the distortions it creates in the free market, and the challenges it poses to global prosperity. The author exhorts market capitalists to defend their economic model — and its virtues — against the encroachments of an increasingly attractive alternative. To that end, two important possibilities could hinder state capitalism’s power grab: political liberalization in state capitalist countries and sustained economic renewal in free-market economies. Yet Bremmer fails to recognize the former and pays insufficient attention to the latter. As such, this book offers a thought-provoking read but paints an incomplete map of the way forward.

Since the end of the Cold War, one authoritarian country after another — from Asia to the Middle East, from Russia to Latin America — has embraced capitalism. The trend took place not because authoritarian rulers believed in free markets or free peoples but because they believed they could keep a leash on both. By the time that President Obama strode into office, state capitalism was proudly holding court in Moscow, Beijing and elsewhere.1
1. For an in-depth discussion of the appeal of authoritarian countries that pursue economic liberalization while shunning political reform, see Ying Ma, “The Fate of the Freedom Agenda,” Wall Street Journal Asia (August 1–2, 2008).

The End of the Free Market paints a troubling picture of the current condition of state capitalism. Already, three quarters of global crude oil reserves are owned by national oil companies controlled by state capitalist countries. Privately owned oil companies, like Chevron and BP, produce only ten percent of the world's oil and hold just three percent of its reserves. State-owned oil and gas giants outcompete their privately-owned counterparts by relying on state-backed support to pay above-market prices, or by doing business with repressive regimes that private multinationals avoid. Worse yet, the state capitalist home base may outright deny energy to other countries for thinly veiled political reasons. Russia did exactly that when Gazprom turned off gas supplies to neighboring Ukraine in the winters of 2006 and 2009. As national oil companies lock up natural resources, state-owned enterprises (SOEs), another agent of state capitalism, dominate domestic industries ranging from mining to petrochemicals to telecommunications to steel production. They are complemented by privately-owned entities designated as national champions, which receive from the state heavy subsidies, tax breaks, contracts and low-interest loans. As a result, SOEs and national champions can gobble up or compete against foreign competitors, both at home and abroad, with resources and advantages often not available to private companies that do not enjoy state backing. Meanwhile, a number of state capitalist governments further sharpen their economic edge with yet another tool, sovereign wealth funds. A vast majority of the world's sovereign wealth fund assets are controlled by state capitalist
104

countries such as China, the United Arab Emirates, Saudi Arabia, and Russia. Concerns abound over how the more opaque funds, such as the China Investment Corporation, might make investments based on political rather than financial objectives. espite sharing various common, troubling characteristics, state capitalist countries are not all equal. The End of the Free Market devotes much (perhaps too much) time to identifying countries — ranging from Algeria to South Africa to the Ukraine — that are state capitalist or exhibit state capitalist tendencies. Some of these countries are not that state capitalist while others are not that significant. As Bremmer himself admits, powerful regimes such as Saudi Arabia, Russia, and China ultimately give state capitalism its glow because of their significant global economic clout. In the 2 0 th century, Arab states demonstrated through oil embargoes their willingness to wield rich oil supplies as a political weapon. In the 21st century, the rising power of China and Russia has given state capitalism much of its new shine. Both former communist regimes have built viable and sizable economies. Though they give no thought to a return to the socialist, command policies of the past, leaders in Moscow and Beijing show every interest in keeping a leash on their respective economies in order to maintain a monopoly on political power. Of course, no country can rival China as the granddaddy of state capitalism today. Since it embraced market reforms three decades ago, the country has drastically expanded its economy and offered its people improved living
Policy Review

D

Books
standards that were previously unimaginable. China’s explosive growth makes the case that state capitalism not only works but can work miracles. As the country emerges from the global financial crisis relatively unscathed, the aura of its state capitalism has only intensified. Beijing, for its part, has not been hesitant to tout the success of its political-economic model. In May 2009, China’s vice foreign minister, He Yafei, asked Bremmer and a small group of economists and scholars, “Now that the free market has failed, what do you think is the proper role for the state in the economy?” In many ways, China had already formulated its own answer. At the height of the financial crisis in 2008, Chinese Premier Wen Jiabao explained on c n n ’s Fareed Zakaria GPS that China’s resounding economic success came from the important thought that “socialism can also practice market economy” and that the formulation of China’s economic policy “give[s] full play to the basic role of market forces in allocating resources under the macroeconomic guidance and regulation of the government.” Within China, the national government takes a much less polite, or subtle, approach. As Willy Lam has written in the Jamestown Foundation’s China Brief, in the wake of the financial crisis Beijing made the exaltation of the China model the theme of a nationwide ideological campaign. The campaign lauded the government’s wisdom of balancing “between growth and stability — and between market initiatives and state control.” The point was to emphasize the virtues of a model that can serve as “an antidote to the kind of
February & march 2011 105

unbridled capitalism that underpins the financial woes in the Western world.”

Bremmer is not the only one who has noticed the rising power of autocratic regimes that have embraced state capitalism. He is, however, one of the few who argues that economic freedom is worth defending for its own sake. Often, when noneconomist policy types talk about the rising economic influence of China or Russia, they prefer to discuss this phenomenon’s ramifications for democracy or power politics or its offensiveness to political freedom. Some scholars, like Robert Kagan of the Carnegie Endowment for International Peace, prefer to think of China and Russia as part of an “informal league of dictators” that challenges a U.S.-led world order and opposes U.S. priorities like preventing a nuclear Iran. Others, like certain unimaginative neoconservatives and left-wing activists, see the promotion of global trade and business with nondemocratic regimes as inherently unjustified when it conflicts in any way with human rights. For these observers, the free market is merely a sidekick in the game of power or the pursuit of universal political freedom. While the protection of U.S. power and the promotion of American ideals have a central place in U.S. foreign policy, policymakers and analysts who mouth these foreign policy priorities often overlook the fact that the free market — and the economic freedom it ensures — undergirds U.S. power and ideals, and is worth defending for its own sake.

The End of the Free Market provides precisely that defense. In fact, the book is animated by the belief that state capitalism’s biggest threat is its affront to market capitalism. As Bremmer points out, practitioners of state capitalism, who are mainly autocratic, could decide to disrupt the global market when politically necessary or convenient, whether by thwarting foreign competition or fostering heavy protectionism. Instead of letting markets create greater wealth and prosperity for all, state capitalism fosters a world in which “politics trumps efficiency, entrepreneurship and innovation.” To some extent, this is already happening. For example, China suspended the export of rare earth minerals to Japan for two months in late 2010 as a result of a dispute over islands claimed by both countries. China mines 95 percent of the world’s rare earth minerals, which are crucial for the manufacturing of high-tech products such as iPhones, wind turbines, and hybrid gasoline-electric cars. Though it is certainly not alone, Beijing has said loud and clear that it will not hesitate to use trade as a political weapon.

To counter the negative global effects of state capitalism, Bremmer warns first and foremost against protectionism. Practitioners of market capitalism should allow the market to do what it does best: offer a free and open place for global competition. This means keeping the door open to trade, foreign investment, and immigration, even if state capitalist countries refuse to do the same or populist sentiments demand otherwise. After all, those who wish to defend the free market would do well to uphold market principles. Bremmer also recommends that the United States continue to invest in hard power. Nothing will safeguard the free flow of goods around the world, especially oil and gas supplies, better than the preeminence of U.S. military power.

These and Bremmer’s other recommendations make plenty of sense. He forgets, however, that the future of the free market depends not just on how free-market countries and state capitalist countries interact with each other but also on what they do internally. In that regard, political liberalization in state capitalist countries and sustained economic renewal in free-market economies could thwart or even end the influence of state capitalism.

Although The End of the Free Market makes it abundantly clear that state capitalism finds itself most at home in autocracies, the book refuses to take the next logical step: acknowledging that peaceful liberalization or democratization of autocracies could lead to an unraveling of the state capitalist systems they run. Though democratic regimes are not immune to elements of state capitalism and the absence of democracy does not prevent the formation of a free-market economy, liberal democracy and its attributes — the rule of law, fair elections, civil society, a free press, and other checks on state power — make it difficult for state capitalism to take root. Autocrats, on the other hand, can intervene in their economies with far greater latitude. As such, internal political reforms can play a big role in undermining or changing the authoritarian nature of regimes as well as the state capitalism they practice. This does not mean that the United States should forcibly, surreptitiously, or carelessly try to democratize state capitalist autocracies, or act as if freedom would materialize as long as the United States mouths it fervently enough. Rather, it means that pursuing wise and workable efforts to promote political freedom in politically unfree countries could pay dividends for economic freedom. Unfortunately, this is something that The End of the Free Market does not recognize.

Of course, state capitalist autocracies could survive for a long, long time without succumbing to internal or external pressures for political liberalization. This makes it all the more useful to keep in mind another important element that Bremmer glosses over in his prognosis: It matters whether leading free-market economies like the United States can get their economic groove back. If the dynamism of China’s economy has done more than anything else to enhance state capitalism’s luster, then a return to growth, vibrancy, and innovation in the American economy will be an unambiguous indication of market capitalism’s dominance. The United States cannot force state capitalist countries down a more liberal or enlightened political path, but it can do a whole lot to make sure that its own economic system is sound and its private sector is competitive. Bremmer agrees, but then avoids saying much about this country’s heated internal debate about how best to move forward economically. He argues that President Obama is not a state capitalist but ignores the implications of Obama’s policies: higher tax burdens, more intrusive government regulations, higher costs for doing business, and grander ambitions to fund programs that the United States cannot afford. To emerge victorious from the global financial crisis, America will have to choose
between Obama’s vision and one that favors a freer marketplace and more responsible spending habits. The 2010 sovereign debt crises that hit Europe offer a fine example of the dreadful future in store for the United States should it continue down the path charted by President Obama and many in his political party, a path of chronic fiscal indiscipline and crushing deficits. To his credit, Bremmer urges Democrats, who tend to prefer a greater role for government intervention in the economy, to speak up for free markets, but the problem requires much more than political rhetoric.

The choice between Obama’s vision and small, limited government is not merely the one that snotty East Coast liberal elites regularly dismiss as the paranoia of Tea Party activists looking for an excuse to vent their racism against America’s first black president. It has a whole lot to do with the battle between the free market and state capitalism, too. As it turns out, those in America who advocate unprecedented new powers for the U.S. government to regulate the private sector and direct private initiatives are also the very people who have taken to ogling state capitalism. As a presidential candidate, Obama tipped his hat to Chinese state capitalism when he bemoaned the crumbling infrastructure of his home country and noted that China’s state-directed infrastructure spending has produced ports, trains, and airports that were “vastly superior” to those in America. When Senator John Kerry introduced comprehensive climate change legislation in May 2010, he exhorted the U.S. government to heavily subsidize green technology in part because “The
Chinese aren’t waiting around” and have “surpassed us in renewable energy investment.” Those who are predisposed to worshipping government intervention in the economy see state capitalism, especially as successfully practiced by China, as overflowing with authoritarian chic.2

These big-government types, like Obama and his allies, would not dare to turn the United States into a state capitalist country. But as the Chinese economy hums along with three decades of explosive growth under its belt and emerges relatively unscathed from the global financial crisis, they increasingly believe that the U.S. must adopt certain Chinese, or state capitalist, characteristics to be competitive in the 21st century.

That certainly is not what Bremmer supports. Unlike the big-government types, he believes that state capitalism is “burdened by . . . shortsighted, short-term thinking, especially when powerful players within the system have their own set of incentives for earning short-term rewards.” In the long term, he is confident that the free-market system will prevail because “virtually all people value an opportunity to create prosperity for themselves and for their families and because free markets have proven again and again that they can empower virtually anyone.”

Unfortunately, though, Bremmer fails to adequately acknowledge that the choices the United States makes could induce or impede economic recovery, which would ensure or erode the dominance of the free-market model in the world. Stagnant growth, bloated government, out-of-control deficits, uncompetitive businesses — all have resulted from the policies of President Obama, whom Bremmer defends as a market capitalist. Though the president has maligned the free market less since the American people roundly rebuked him and his political party in the midterm congressional elections of 2010, it remains unclear whether he will in fact change course. Unless he does, or is forced to do so by the new Congress, his policies could do a whole lot to make the title of Bremmer’s book a more imminent reality.

2. For a discussion of the ogling of Chinese authoritarian chic in America’s climate change debate, see Ying Ma, “China’s View of Climate Change,” Policy Review 161 (June & July 2010).

Home Economics
By David R. Henderson
Bill Bryson. At Home: A Short History of Private Life. Doubleday. 497 Pages. $28.95

David R. Henderson is a research fellow with the Hoover Institution and an associate professor of economics at the Graduate School of Business and Public Policy at the Naval Postgraduate School. He blogs at www.econlog.econlib.org.

I have long enjoyed Bill Bryson’s books on travel. He has a rare ability to both entertain his readers, often with side-splitting humor, and get them interested in the history of the places he travels. My favorite, in part because of the humor, is In a Sunburned Country, his book on Australia. But if I were to judge his books solely on the importance of the history he uncovers, my favorite, by far, would be his latest, At Home: A Short History of Private Life.

Bryson has pulled off a marvelous feat. He devotes almost every chapter to a room in his Victorian house in England. He then considers why the room is the way it is and what preceded it. In doing so he produces an important economic history, only some of which will be familiar to economic historians and almost all of which will be unfamiliar to pretty much everyone else. A large percentage of it is important, for two reasons: One, you get to pinch yourself, realizing just how wealthy you are; and two, you get a better understanding than you’ll get from almost any high school or college history textbook of the economic progress that made you wealthy. Not surprisingly, given that I’m an economist and Bryson isn’t, I have a few criticisms of places where he misleads by commission or omission. But At Home’s net effect on readers is likely to be a huge increase in understanding and appreciation of how we got to where we are.

One of Bryson’s most striking descriptions is of the life of a servant when houses had two stories but no running water. The worst days for the servant were when family members or guests wished to take a bath. Bryson writes:
A gallon of water weighs eight pounds, and a typical bath held forty-five gallons, all of which had to be heated in the kitchen and brought up in special cans — and there might be two dozen or more baths to fill of an evening.

Nor did servants seem to get much appreciation from their mistresses. Although Bryson too rarely specifies the time periods he is writing about, one gets the impression that this attitude toward servants lasted into the 20th century. He quotes two 20th-century mistresses’ complaints about servants. Virginia Woolf said that servants were as irritating as “kitchen flies,” and Edna St. Vincent Millay stated, “The only people I really hate are servants. They are really not human beings at all.” Bryson describes, in detail, one of the toughest jobs — that of the laundry maid. One highlight of that description is what she (yes, Ms. Millay, they actually were human beings) needed to do to get stains out: the way to deal with stained linens was to steep them in stale urine or a diluted solution of poultry dung.

Bryson leads off his chapter on the drawing room with a discussion of the words “comfort” and “comfortable.” Until 1770, he writes, the idea of being comfortable at home “was so unfamiliar that no word existed for the condition.” “Comfortable” meant simply “capable of being consoled.” But by the early 19th century, it was quite common for people to talk about having a comfortable home or making a comfortable living. “The history of private life,” writes Bryson, “is a history of getting comfortable slowly.” In other words, standards of living increased gradually due to the many labor-saving inventions that — though they reduced the demand for servants — made even the lives of servants easier.

Indeed, improvements in technology were so important that the one chapter Bryson devotes to something other than a room or physical area in or around the house is his chapter on the fuse box. Electricity truly revolutionized life. Bryson writes, “The world at night for much of history was a very dark place indeed.” A good candle, he adds, “provides barely a hundredth of the illumination of a single 100-watt lightbulb.” Although Bryson makes a good case for how important lighting was and is, he would have made an even stronger case had he drawn on the pathbreaking work of Yale University economist William D. Nordhaus. In a study done in 1996, Nordhaus found that the failure to adjust appropriately for the plummeting cost of light has led economic historians to dramatically understate the growth of real wages over the last 200 years.

That one invention, plus many others, led to a burgeoning middle class. The term “middle class,” writes Bryson, was coined only in 1745. By the early 19th century, of course, the middle class was substantial. Something that fueled this growth, besides labor-saving inventions such as running water and light, was the increasing globalization of production through free trade. Take wood. Before the British engaged in extensive international trade, they used only one kind of wood in their furniture: oak. But Bryson notes that the British started getting (he doesn’t say when) walnut from Virginia, tulipwood from the Carolinas, and teak from Asia.

Because At Home is about various rooms in the home, not all of it is about technological change. Some of it is simply about how people’s consumption patterns changed as Britain industrialized and became wealthier, and it is no less interesting for that. In a chapter titled “The Cellar,” for example, Bryson details the enormous increase in coal usage for heating British homes. By 1842, he notes, Britain alone used “two-thirds of all coal produced in the Western world.” Coal burning became a bigger problem in cities as the cities grew: During Queen Victoria’s lifetime, writes Bryson, the population of London alone rose from one million to seven million. It would have been nice to see Bryson lay out how many lives were saved by the switch from coal in fireplaces to coal burning in electricity generation and to oil, natural gas, and nuclear power for heating. But, as I noted, that’s not his point. Bryson wants to detail how his 19th-century house contains a lot of history.

Bryson occasionally breaks with the pattern by using a room as an excuse to discuss interesting technological developments that had little to do with the room. No matter — his discussion is always illuminating. A chapter called “The Study,” for example, doesn’t really deal with the study but does discuss mice, mousetraps, rats, plagues (naturally), mites, bedbugs, and germs. I will never put my head on a pillow in a hotel room again without remembering that ten percent of the weight of a six-year-old pillow (six years, says Bryson, is the average age of a pillow) is made up of “sloughed skin, living and dead mites, and mite dung.” On a somewhat more comforting note, Bryson points out a positive change in the insect world: the disappearance of locusts a little over a century ago. We are so used to hearing about species disappearing because of man’s activity that we sometimes forget to notice that the disappearance of some species is a welcome development. It turns out that the locusts hibernated and bred every winter in the high plains east of the Rocky Mountains. When new farmers there plowed and irrigated a little over a century ago, they killed the locusts and their pupae.

It is impossible to read Bryson’s chapter on the bedroom without emerging with an appreciation of economic growth and modern medicine. At inns, strangers often shared beds into the 19th century, and “diaries frequently contain entries lamenting how the author was disappointed to find a late-arriving stranger clambering into bed with him.” He tells of a squabble in 1776 between Benjamin Franklin and John Adams when they shared a bed in New Brunswick, New Jersey. The issue disputed was not the role of the federal government. It was the far more important question of “whether to have the window open or not.” With increasing wealth, people no longer had to share beds.

Bryson also tells just how primitive medical knowledge was before 1850, and sometimes even later than that. For example, virtually all doctors were men, and it was not considered proper for men to examine a woman’s private parts. The American Medical Association expelled a gynecologist named James Platt White for allowing his students to observe a woman giving birth, even though the woman had given them permission. Nor did doctors
seem to understand much about germ theory. Bryson writes that when President James Abram Garfield was shot in 1881, he wasn’t killed by the bullet but by doctors “sticking their unwashed fingers in the wound.”

One shortcoming of the book is that Bryson doesn’t seem to have much appreciation for — or maybe it’s just a lack of interest in — how wealth is created. This comes out most strikingly in his discussion of “Commodore” Cornelius Vanderbilt. Vanderbilt, writes Bryson, “had a positively uncanny gift for making money.” True. But then he tells the reader of Vanderbilt’s immense wealth without saying anything about how he acquired it. The story that Bryson leaves out is that Vanderbilt made a large part of his wealth by making steamship travel relatively cheap for many Americans in the New York area, in the process challenging a monopoly that the New York legislature had unconstitutionally granted to Robert Fulton. In other words, Vanderbilt created wealth for himself by also creating wealth for consumers.

Bryson also accepts many of the myths about the evils of child labor that various defenders of the aristocracy and advocates of socialism propagated in the 19th century as part of their opposition to British industrialization. Bryson does what virtually every opponent of child labor in factories has done: discuss the horrible conditions of work in mines and factories — they really were horrible — without comparing them to the even worse conditions in agriculture, which is where these same children would have been employed had they not worked in mines and factories. Bryson’s work is so well researched generally that it’s a
pity that he didn’t come across economist William H. Hutt’s careful refutations of the critics of child labor. (See his “The Factory System of the Early Nineteenth Century” in the Hayek-edited 1954 book Capitalism and the Historians.)

So, where are these old Victorian houses and the huge mansions that the newly rich built in the 19th century? Many of them, notes Bryson, were torn down after their contents were sold. And the reason so many of them disappeared is ironic given the author’s criticism of property rights. On the one hand, Bryson chides 19th-century critics of historical preservation laws, who saw such laws as “an egregious assault on property rights.” Just two pages later, on the other hand, he details the legal attack on property rights that caused many of these treasures to be sold off. The law I refer to is the British government’s death duty — in the United States, it is called an estate tax. This tax started in the late 19th century at a modest eight percent rate on estates valued at one million pounds or more. But by 1939, the rate was a hefty 60 percent. By the 1950s, writes Bryson, the stately homes were disappearing at the rate of about two a week.

There is so much more in At Home than I’ve discussed here. One thing Bryson does often, for example, is tell the price of various goods and services at various points in time. Many of these prices were very high relative to wages. Realizing this gives the reader still another way of understanding and appreciating the awesome wealth created for all economic classes — in Britain and the United States — by two centuries of economic growth.

Statement of Ownership, Management, and Circulation: 1) Publication title: Policy Review; 2) Publication no.: 0146-5945; 3) Filing date: 02/19/10; 4) Issue frequency: Bi-monthly; 5) No. of issues published annually: 6; 6) Annual subscription price: $36; 7) Address of office of publication: 21 Dupont Circle NW, Suite 310, Washington DC 20036; 8) Address of headquarters of publisher: The Hoover Institution, Stanford University, Stanford CA 94305; 9) Names and addresses of: Publisher: The Hoover Institution, Stanford University, Stanford CA 94305; Editor: Tod Lindberg, 21 Dupont Circle NW, Suite 310, Washington DC 20036; Managing Editor: Liam Julian; 10) Owner: The Hoover Institution, Stanford University, Stanford CA 94305; 11) Known bondholders, mortgagees, and other security holders: None; 12) Tax status: Has not changed during preceding 12 months; 13) Publication title: Policy Review; 14) Issue date for circulation data: Dec/Jan 2010; 15) Extent and nature of circulation: A. Total no. copies: ave. no. of copies each issue during preceding 12 mos: 6,600; (no. copies of single issue published nearest to filing date: 6,535); B. Paid and/or requested circulation: 1. paid/requested outside-county mail subscriptions: 1,666; (1,666); 2. paid in-county subscriptions: 0; (0); 3. sales through dealers: 1,890; (1,890); 4. other classes mailed: 123; (123); C. Total paid/requested: 3,675; (3,675); D. Free dist. by mail: 1. outside-county: 2,536; (2,536); 2. in-county: 0; (0); 3. other classes mailed: 0; (0); E. Free dist. outside mail: 50; (50); F. Total free dist.: 2,586; (2,586); G. Total dist.: 4,202; (4,202); H. Copies not dist.: 355; (355); I. Total: 6,600; (6,535); J. Percent paid/requested: 53%; (53%); 16) Publication of statement of ownership: required; 17) I certify that all information furnished on this form is true and complete (signed): Liam Julian, Managing editor, 2/19/10.


New from Hoover Institution Press
Skating on Stilts
Why we aren’t stopping tomorrow’s terrorism
BY STEWART A. BAKER
“A most unusual memoirist, Baker is a government bureaucrat with a philosopher’s bent. This tough-minded, candid work is a cautionary tale to those who claim that we do not have to make choices between our values and our security. As Baker points out, security is a value, and those who pretend otherwise — be they business interests, privacy advocates, or international groups — put Americans at risk.”

General Michael Hayden, Director of the Central Intelligence Agency (2006–9) and Director of the National Security Agency (1999–2005)

“Stewart Baker was one of the leading thinkers in developing the architecture for Homeland Security. His insights and experience provide a unique perspective on our national security challenges.”

Michael Chertoff, former Secretary of Homeland Security
In this book Stewart A. Baker examines the technologies we love — jet travel, computer networks, and biotech — and finds that they are likely to empower new forms of terrorism unless we change our current course a few degrees and overcome resistance to change from business, foreign governments, and privacy advocates. He draws on his Homeland Security experience to show how that was accomplished in the case of jet travel and border security but concludes that heading off disasters in computer networks and biotech will require recognition that privacy must sometimes yield to security, especially as technology increases the risks to both.

Stewart A. Baker was the first Assistant Secretary for Policy at the United States Department of Homeland Security and the former General Counsel of the National Security Agency.

June 2010, 370 pages
ISBN: 978-0-8179-1154-6
$19.95, cloth

To order, call 800.621.2736 or visit hooverpress.org

HOOVER INSTITUTION . . . ideas defining a free society
HOOVER INSTITUTION, Stanford University
Toll-free: 877.466.8374 Fax: 650.723.1687
info@hoover.stanford.edu www.hoover.org
