An International Review of Culture & Society


Issue No. 10

Published by the Council on Foreign Relations

Winter 2002/2003

The Changing Nature of CITIZENSHIP






GROWING INEQUALITY: Reports from Africa, China, Japan, and India
SPENGLER



If our previous issue was inevitably dominated by the aftermath of the September 11 terrorist attacks on New York and Washington, Issue 10 of Correspondence has no single, dominating theme. There are clusters of pieces, however, that deal with aspects of our post-9/11 world: changing ideas about citizenship and immigration, rising anti-Americanism in some quarters, and a renewed interest in growing economic inequality. We feature a report from Holland on how a country that has long seen itself as a model of tolerance is trying to define what it means to be Dutch in an era of growing anti-immigration feeling; an article on the role that immigration and fear of crime played in the recent French elections; efforts in Great Britain to curb arranged marriages that involve immigrants importing their spouses from their home countries; the difficulties of a European correspondent reporting on the U.S. after September 11, caught between sympathy for the victims and the growing anti-Americanism back home;

Germany’s love-hate relationship with American popular culture. We have reports on different reactions to growing inequality in societies as diverse as the U.S., Japan, China, India, and Bhutan, as well as the relationship between the spread of AIDS and growing inequality in Africa. There is also a broad miscellany of pieces that fall outside these categories: Timothy Lenoir’s fascinating account of the strange alliance between the U.S. Defense Department and the video-game industry in making flight-simulator and combat-simulation software; Masakazu Yamazaki on the ways in which the recent Soccer World Cup reflected the decline of Japanese nationalism; Gadi Taub on the Israeli website that has opened up the previously closed world of the Arab press; Mark Lilla on French ambivalence about cloning and genetic engineering; Nina L. Khrushcheva on the politics of language reform in Russia. — Alexander Stille


3 Dutch (In)tolerance by Theodor Dunkelgrün
5 Marriage Arranged or Forced by David Jacobson
6 An End to “Vandal-Style” Divorce by David Jacobson
7 French Citizenship: Blood or Soil? by Julie Pecheur
7 French Immigration and Integration by Julie Pecheur
10 Importing “Zero Tolerance” by Ted Leggett

TECHNOLOGY AND BIOTECHNOLOGY
11 To Each His Clone by Mark Lilla
12 Who’s Your Daddy? by Mark Lilla
14 Fashioning the Military-Entertainment Complex by Timothy Lenoir

EUROPEAN ANTI-AMERICANISM
16 Between Hammer and Anvil by Martin Burcharth
18 The Americanization of Anti-Americanism by Michael Rutschky

JOURNALISM AND THE MIDDLE EAST
20 MEMRI Gains by Gadi Taub
22 Quoted Out of Context: Reflections on the Online Ha’aretz by Hanoch Marmari
24 Saramago or The Demon of Analogy
   From the Stones of David to the Tanks of Goliath by José Saramago
   Reply to José Saramago by Barbara Probst Solomon

REFASHIONING NATIONAL IDENTITIES
26 The Srebrenica Massacre and Holland’s Lost Illusions by Theodor Dunkelgrün
28 Italy’s Kinder, Gentler Fascism by Alexander Stille
30 A Parody of Nationalism: Soccer and the Japanese by Masakazu Yamazaki
31 The Last or the First of the Mohicans? by Takako Hikotani

THE QUESTION OF INEQUALITY
32 The Case For and Against Inequality by Alexander Stille
34 The Debate on Inequality in Japan by Masayuki Tadokoro
35 Inequality and Indifference in Today’s China by Xin Liu
37 Indian Inequality: In Pursuit of the Shadowy Oppressor by Ashok V. Desai
38 African Inequality and AIDS by Helen Epstein
39 Developing Happiness by Darrin M. McMahon
40 Inequality and Health by Helen Epstein

LANGUAGE, LITERATURE, AND THE ARTS
42 A Demand for Last Days: Apocalypse Now and the Decline of the West by Wolf Lepenies
43 The Crack Squad by Naomi Daremblum
44 “The Russian Language Is Our Wealth” (Soviet Slogan) by Nina L. Khrushcheva
45 Verlan: The Vanguard of Backwards by Alexander Stille
47 The Fabric of Society: Madagascar’s Textile Art by Edgardo Krebs
48 The Return of South Korean Cinema by Helen Koh
50 China’s New Intellectual Landscape by Meng Li

REFORMING JAPAN
52 Judicial Reform in Japan by Naoyuki Agawa
54 Is Japan Really “Sad”? by Masayuki Tadokoro

IMAGE CREDITS
Matteo Pericoli: pages 3, 9, 11, 17, 21, and 38. Screen shot from Half-Life: Counter-Strike (Valve, LLC), page 15. Mayerhofer (Die Presse, Vienna, Austria), page 19. Zoudine (Russia), page 30. Nishikawa Sukenobu (1671–1750), “Beauty at Her Toilette,” detail, page 31. Peter Mundy (1634, China sketchbook), page 36. I. Puni (“Still Life with Letters”), page 45. Anonymous (Portrait of Chunhyang), page 49. Honoré Daumier (“Grand escalier du Palais de justice. Vue de face”), page 53. No-río Aomori, page 55.


The Changing Nature of Citizenship


The assassination last spring of Pim Fortuyn—the controversial leader of a new Dutch political party urging a ban on immigration—was powerful proof that the Netherlands, and Europe, had entered a new political era. At the time of Fortuyn’s killing on May 6—the first political assassination in Holland in 330 years—opinion polls showed that his party was the most popular in the country. In its Golden Age during the 17th and 18th centuries, Holland—refuge to Jews fleeing the Inquisition and Huguenots fleeing France—became a model of the modern multicultural state, establishing separation of church and state, welcoming and fully accepting economically valuable foreigners, and creating a cosmopolitan society of Calvinists, Lutherans, Catholics, Mennonites, and Jews. Today, Holland—in a very different context—once again is in the vanguard of debate on immigration and national identity, tolerance and intolerance. Although Fortuyn rhetorically exploited post-September 11 Islamophobia, he was no traditional right-wing xenophobe, no Dutch Le Pen. His proposals—including Muslim acceptance of equal education for women—were rooted in beliefs in freedom of religion, liberal democracy, free-market capitalism, and civil liberties that defy easy left-right categorization. His party’s popularity would probably have made him the West’s first openly gay prime minister. His insistence that Dutch Muslims accept the terms of liberal democracy both challenged Holland’s notions of tolerance and continued its long, complex history. Paradoxically, Dutch tolerance—down to its recent liberal policies toward drugs, prostitution, euthanasia, and religious asylum—is a form of secular religion central to a Dutch identity grounded in Calvinist notions of moral purity and ethical calling.

During its creation as a separate entity in the late 16th century, Holland was a theater of bloody wars between Protestants and Catholics; four centuries later, in the modern kingdom of the Netherlands, three rivers (the Rhine, Maas, and Waal) divide the largely Protestant north from the Catholic south. Although early Holland was highly tolerant for its age, it had its limits. Until the late 18th century, Catholic churches had to be out of public sight—hidden in courtyards, for example. Yet, emulating the French Revolution in the 1790s, the new Batavian Republic adopted the Rights of Man and Citizen, granting all citizens equal rights. Still, Dutch Reformed Calvinists have always held a dominant cultural position. Though Holland became a monarchy in 1815, as late as the 1970s two of the queen’s sisters gave up their rights to the throne in order to marry Catholics, while the newlywed crown prince’s Catholic bride had to agree to raise any children in the Dutch Reformed faith. To this day, it is rumored that Catholics (considered more loyal to Rome than to the House of Orange) are rarely placed in highly sensitive posts such as the secret service. As the 19th-century constitutional monarchy also became a formal colonial empire, it ruled over nearly 100 million Muslims in what would become, in 1949, the Republic of Indonesia. Beyond defining the empire by racial division, Holland, in the conviction that its colonial rule was a moral exception, attempted to inscribe racial differences constitutionally. Several generations of preeminent Dutch legal scholars-cum-ethnographers classified and codified native cultures. Their fascinating intellectual project, involving law, anthropology, and Calvinist moral calling, blended into what one might today call “The Dutchman’s Burden,” humane and paternalistic. There were different laws for full Dutch citizens of the Netherlands, for indigenous peoples of the Indies’ hundreds of islands, for Arab traders, and for the wealthy ethnic Chinese community (“alien Orientals”). The civil code of law pertained only to the category of “Europeans.” The indigenous peoples, out of a combination of tolerance and pragmatism, were left to practice their own faiths and customs. Dutch law was not imposed on them, allegedly out of respect for their traditions, but also because the legal institutions in the Indies couldn’t deal with the many millions that inhabited the colony.

With a paternalistic, Calvinist moral calling to be “guardian” of the “culturally infantile” native peoples and raise them to “adulthood” within their own cultures, Holland studied and tolerated local traditions but purged them of practices deemed immoral in Christian eyes, such as widow-burning in Bali, and suppressed a Muslim uprising in Sumatra. This freedom had economic limits as well: the “cultural system” of the 19th century guaranteed the state a monopoly on trade. The colony was run like one big plantation, and about one-third of government income came from trade with the Indies. Writers like Multatuli voiced the moral indignation that led politicians to feel a “debt of honor” to the peoples of the Indies, and to be their cultural and ethical guardians. Thus, already a century ago, the Netherlands was a sophisticated, legally defined multicultural state that understood itself as



positively racist. But complex reality—chiefly the phenomenon of Dutch men fathering children with local women—was soon dissolving the ideal model of the professors. If a Dutch colonial civil servant had a child with an indigenous woman, a legal loophole allowed him to grant that child, especially if it was a son, the rights of a Dutch citizen and still not recognize it. The child would take the father’s last name, spelled backwards: Been, for instance, became Neeb. As colonial society industrialized and modernized, an Indies middle class emerged, and with it, a generation of natives educated at Dutch universities, and, as with other European empires, an indigenous nationalism. Some Indonesian nationalists invoked the Dutch Golden Age Republic to contest the racism of the Dutch Colonial State (though Indonesia would become as, if not more, racist from a Javanese-Muslim perspective). When, in 1949, under U.S. pressure, Indonesia became independent, most people of mixed parentage came to the Netherlands. Whereas the first model of multiculturalism disintegrated with the colonial state, Dutch society itself became multiracial, and mixed-race Dutchmen so ubiquitous they were fairly invisible. But few were religious Muslims, and all spoke Dutch. Consequently, their integration into a secularizing Dutch society was much smoother. In the 1960s, the traditional pillars of Dutch society, empire and religion, fell as part of the general upheaval of the time, leaving the monarchy and the national soccer team as the only vestiges of national sentiment. Meanwhile, large groups of immigrants arrived: Surinam’s independence in 1975 brought nearly half its population (part-Indian, part-Javanese, many of African slave descent). After initial culture shocks, they have integrated well into Dutch society, mostly in the middle class.
It is the first and second generation of Turkish and especially Moroccan immigrants, together some 700,000 people, who present greater socioeconomic problems. Reluctant to “integrate” into Dutch society at large, they have brought conflicted notions of citizenship and Dutchness to the forefront of public debate, spotlighting the extent to which cultural relativism, secularization, and a dogmatic multiculturalism have eroded much positive sense of nationhood. For without a clearly defined sense of what it means to be Dutch, there is no model to which they can be encouraged to adapt. The Dutch notion of nationalism falls somewhere between two European models, the German and the French. Germans are born, Frenchmen made; the Dutch are a mixture of both, or rather, “Dutchness” occupies an ill-defined space in between. On the one hand, post-Napoleonic emulation of France (citizenship is legal citizenship, religion a private matter); on the other, an implicit but prevalent sense of ethnic, that is extralegal, “Dutchness.” This ambiguity connects with larger social concerns. The lack of positive national identity is the heart of current political and cultural problems in which the legal and cultural notions of “Dutchness” are at odds. Dutch law allowed for family unification of migrant workers: Moroccan and Turkish gastarbeiders (guest workers) could thus

bring their wives and children to Holland, while children born in Holland became Dutch citizens. Furthermore, Dutch citizenship is eventually granted to anyone who marries a Dutch citizen (a basic salary is required for a residency permit), however labyrinthine the bureaucracy tries to make this. Moreover, the Moroccan and Turkish communities bring in brides from rural regions back home, while secular Dutchmen marry less and less frequently. Muslims form some 5 percent of the population of 16 million, but account for some 25 percent of all marriages. Last year, 37,200 immigrants came to join their family members; 17,300 came for political asylum, 15,300 for work, and some 6,400 for studies. In this way, the Turkish and Muslim populations of Holland have increased tenfold in the past thirty years and are still increasing rapidly. Pim Fortuyn broke the taboo on criticizing immigration policy and the illiberal aspects of Dutch Muslim culture. Unlike the Dutch scholars of the colonial era, Fortuyn developed an anti-Islamic politics that was nonracist and deeply rooted in Holland’s ideas of liberal democracy—indeed, unabashedly convinced of the superiority of the Dutch versions of Western values. (The party’s second-in-command has been a 28-year-old black immigrant from the Cape Verde Islands.) At the same time, Fortuyn exposed deep contradictions in the left-wing, politically correct notions of Dutchness: Can a society champion gay marriage and women’s equality while also tolerating a community of homophobic, sexist Muslim peasants? Cultural relativism and secularization had gone so far in Holland, Fortuyn argued, that any positive sense of Dutch nationhood had become taboo. Famed Dutch tolerance had become a contradiction in terms. Fortuyn’s politics survive his assassination.
This July, the Pim Fortuyn Party entered the coalition government of Jan-Peter Balkenende, a conservative Christian-Democrat who has vowed to change immigration legislation and declared that the country’s entire multicultural project had failed. The new coalition reflects a post-September 11 world of Dutch politics. Balkenende is careful to invoke “Dutch society” and “Dutch values” rather than a Dutch nation, since defining identity by blood or race is still very much taboo. But that distinction belies much of the anti-immigrant tone in the new government’s more conservative politics. The terms “autochthonous” and “allochthonous” have emerged to refer to indigenous and non-indigenous Dutchmen, showing how secondary some have come to consider formal, legal citizenship. “Non-indigenous” almost always refers to those with darker skin. Unable to describe Dutchness (unlike Germanness) in racial terms, yet confronted by a growing community of immigrants who are legally Dutch but culturally, religiously, and racially different, many Dutch people find themselves at a loss in their own fast-changing country. The enormous success in May’s election of the Christian Democrats, who most represent mainstream Dutch culture, reflects this confusion. Holland’s difficulty with immigration stems in part from asylum laws that are more about what immigrants are than what they do. The privileging of political refugees and relative disdain for “economic migrants,” who “merely” seek a better life economically,


keeps the country from seeing immigrants for what they contribute—as the United States does. Though de facto Dutch citizens, many immigrants are naturally left confused over national belonging in general. To their credit, the Fortuynists equate belonging to Dutch society not with being “racially European,” nor with merely having Dutch citizenship, but with being loyal to the democratic values that guaranteed its freedom in the first place. Declaring the country “full,” Fortuyn proposed legislation banning marriages arranged in Muslim countries. He wanted mandatory schooling of immigrants, in particular women, until they spoke Dutch. He also wanted legislation enabling the expulsion of immigrants who disagree with Dutch law—something that conflicts with Holland’s constitution. When the religious leader of the Rotterdam Moroccan community called homosexuals “worse than dogs and pigs,” Fortuyn claimed that freedom of expression was in conflict with other, even more fundamental, freedoms. When democracy itself is threatened, he argued, it must impose its values upon all its citizens. Without their charismatic head, the Fortuynists’ leadership has already collapsed. But Fortuyn’s understanding of citizenship, based on the values of the liberal democratic state rather than on the ghost of colonialist racism, remains a starting place for Holland’s debate on citizenship. That debate is likely to be inflected increasingly by second-generation Moroccan and Turkish immigrants, many of whom feel more Dutch than Turkish or Moroccan. In fact, second-generation women in these communities, having struggled hardest to shake off the power of their patriarchal culture while embracing Dutch civil liberties, have emerged as their communities’ most vocal political representatives. As with colonial intermarriage, facts outpace the categories developed to describe them. 
The second generation’s participation in the middle class is growing, placing more and more of its politicians, writers, and entrepreneurs in mainstream Dutch society. The cultural clashes remain sharp and painful, but also bear fruit. The Dutch again feel the relevance of politics, of political and cultural debate. In the process, the democracy of which Dutch citizens—all of them—are so proud, is becoming stronger. —Theodor Dunkelgrün

MARRIAGE Arranged or Forced
While European Union countries, with their need of foreign labor and fear of foreigners’ unassimilability, debate curbs on immigration, France and Great Britain have stepped up efforts to intervene in immigrant families’ traditional marriage practices that clash with national laws. The violations have long been particularly clear-cut in France, but the British Foreign Office also claims to have handled more than 240 cases of forced marriage, which have, at least since 1998, received court annulments (annulments which shariah itself allows). Only in some 15 percent of these cases has the unwilling partner been the husband. In France, mostly teenage daughters of North African, sub-Saharan, or Turkish immigrants, French-born and thus with no direct experience of their families’ native countries, are forcibly flown back to unknown husbands, cousins or neighbors from home, or delivered locally—virtually sold—to unknown immigrant husbands. Since most of these girls are still in high school, the department of education has intervened after school nurses and social workers received complaints filed directly or passed on informally by the victims’ friends, usually compatriot schoolmates. In contracting these marriages, the parents, often under pressure from extended family or their community, act to save a family line from extinction, or their daughters from a “corrupting” Western lifestyle. This spring, a French court arraigned the Malian parents who married off their 16-year-old daughter to a stranger as accomplices in her rape. If they are found guilty, the case would set an important precedent for curbing excesses barely conveyed by the education ministry’s declared commitment to “the values of fostering freedom and self-development.” “If they are convicted, the case promises to have profound implications for arranged marriages generally,” reported the British Guardian (“The French case that could spell the end for abusive arranged marriages,” May 28, 2002).
Yet for Britain’s predominantly Asian immigrants, this report’s very conflation of “arranged,” i.e., consensual, with “forced” marriage is itself disturbing, as a symptom of British cultural insensitivity in general and that of Home Secretary David Blunkett in particular. Being consensual, family-arranged marriage is a fully lawful tradition; in India it, together with what is now called “semi-arranged marriage” (young adults’ own selection under family supervision), accounts for an estimated 90 percent of matrimony. What Blunkett proposed this winter in his white paper on immigration, “Secure Borders, Safe Haven,” was that Asian Britons should confine arranged marriages to partners already in Britain rather than flying in spouses from the Indian subcontinent (or, in his more cautious phrasing: “We believe there is a discussion to be had within those communities that continue the practice of arranged marriages as to whether more of these could be undertaken within the settled community here”). Intranational marriage, he suggested, would aid integration by breaking down the “terrible tension that exists when people are trapped between two cultures and backgrounds.” Some Asian Britons endorsed Blunkett’s view. Manzoor Moghal, a national official of the Muslim Council of Britain, agreed that arranged marriages with overseas spouses increased both partners’ chances of suffering racism and isolation. “It is a very good idea for the resident Asian population to seek partners from within their own society, which is quite different now from the people of the subcontinent.” But many other subcontinental Britons bristled at being told whom to wed, just as they had resented Blunkett’s call in December 2001 for ethnic communities to adopt “norms of acceptability,” through conversance with English, citizenship classes, and



loyalty oaths, when the memory of the summer’s race riots in northern towns and anti-Muslim looting was still fresh. Others voiced suspicion that Blunkett’s implicit assumption that overseas marriages are contracted largely to circumvent immigration laws veiled an essential desire to bar immigrants—a virtual return to the xenophobia Labour supposedly reduced when it repealed the Conservative “primary purpose rule,” which had allowed consular staff to deny a visa when they thought it served mainly to give the foreign fiancé(e) entry to Britain. According to one estimate, “[i]n 2000, there were over 21,300 grants of entry clearance to spouses and dependent relatives from the subcontinent, while the comparable figure for 1990 was 17,600.” But recent government figures differ slightly, showing that in 2000, through marriage, more than 38,000 people were granted the right to live in the U.K.; almost 18,000 grants of entry clearance were made to spouses from India, Bangladesh, and Pakistan, nearly twice the 1996 figure. The Home Office demand for migrants’ loyalty and integration has roused one British commentator to remark: “We shall have group ceremonies where the chosen will pledge their absolute loyalty to the Queen and to the law. There are some contradictions here. A third of Her Majesty’s existing subjects decline to offer her such unequivocal support. A Home Secretary prescribing obeisance to all laws and liberties has himself nibbled away at habeas corpus and judicial review. A government preaching community cohesion also plugs divisive faith schools.” Native Britons would be hard-pressed, the argument suggests, to identify what British culture is.
Yet native British public opinion toward foreign influx remains wary: “A survey carried out not long ago for The Observer showed that nine in 10 respondents thought the rise in multiculturalism was linked with negative issues, such as increased crime and social tension, and half of all Britons cannot name a single ethnic minority figure they admire. The average citizen vastly overestimated the number of ethnic minority people in Britain at 24 per cent; the real figure is 7.1” (“Citizenship Lessons,” Observer, Feb. 10, 2002). Most Asians surveyed believe from their own experience that the number of British Asians seeking spouses from “back home” is declining; others praise the “blend of eastern and western influence” mixed-continent marriage can offer a family. Yet British Asian women’s groups, in particular, point to alarming trends in families: forced as well as merely arranged marriage, high rates of domestic violence, women’s suicide rates two to three times higher than those of the general population, mental health problems among abused wives whom ethnic taboos against extrafamilial disclosure keep from seeking outside, professional help. The brutal case of Fadime Sahindal, a young Kurdish woman in Sweden murdered this February by her father after she refused to accept an arranged marriage and abandon a Swedish boyfriend, haunts assimilationist first-generation Asians in Europe. One such Guardian reader pointed out to its editors: “By coincidence, claims that the home secretary is interfering with cultural traditions came on the day that you reported the case of an Asian man whose defence to the charge of stabbing his daughter to death is that his religion demands it.” —David Jacobson
Source: Guardian Weekly, March 14–20, 2002, translation from Le Monde, March 8: Sylvia Zappi, “France steps in to curb forced marriages.”

An End to “Vandal-Style” DIVORCE
With three utterances of the word talaq (divorce), a Muslim husband—by any means of communication, whether he is enraged or drunk, whether or not his wife is present, and without due process, lawyers, courts, or registration in any secular institution—has hitherto been able to obtain instant divorce, with no obligation to provide his wife with an explanation or material support. In most cases the rejected woman must return to her parents, who must then help raise her children. Pakistan, Bangladesh, and Malaysia have banned instant divorce, and Muslim women’s groups in India consider the practice a travesty of the Koranic rules that a month should separate each talaq and reconciliation be sought. Orthodox Muslims in India counter that Islam permits instant divorce in “some circumstances” and fiercely contest the Bombay High Court’s ruling last month that if a wife challenges her husband’s instant declaration he must justify his decision in court. Divorce, they insist, should be “left to the community to sort out,” not the deliberation of a non-believing judge. The Muslim Personal Law on divorce, in fact, includes the requirement that, beyond providing reasons and trying reconciliation, a husband must give his wife mehr, or pledge-money, as security in the event of divorce. In practice, though, the reasons behind the constant threat of arbitrary abandonment Indian Muslim women face typically vary with social class, ranging from sexual to cultural or culinary shortcomings. The Koran urges men to give the mehr or their souls will have no rest. Women not allowed the mehr are often simply left destitute. The new ruling takes care not to interfere with Islamic law and a Muslim man’s divorce right. Legislators have learned, perhaps, from the furor that ensued 17 years ago after the High Court ruled in favor of a woman who dared to contest her imposed divorce after being left penniless. The government overturned the ruling at the time to calm Muslim outrage.
Still, though it upholds the Muslim man’s divorce right, the High Court verdict allows a secular judge to adjudicate on a religious tenet. The All India Muslim Personal Law Board plans to challenge the ruling and, as some Muslims declare informally, considers it a Hindu “conspiracy” of interference. —David Jacobson
Source: Amrit Dhillon, “Marriage knot is tightened,” Financial Times, July 6–7, 2002.




Blood or Soil?


Are people French because their parents are French or because they were born or live on French soil? These competing ideas of droit du sang (the right of blood) and droit du sol (the right of the soil) have coexisted and competed according to ideological fluctuations and demographic and economic needs since the middle of the 16th century. Despite attempts by various governments to restrict it, the droit du sol has prevailed, conferring on the country, until the last 15 years, the reputation of a welcoming nation. The Civil Code, introduced by Napoleon in 1804, broke with the ancien régime tradition by establishing the droit du sang as the primary way to become French. It also stated, however, that children born in France of foreign parents could ask to become French on attaining their majority (then 21) if they were to live in France. In 1851, as the French population was declining and the Industrial Revolution was calling for more workers, the double droit du sol was introduced: A child of foreign descent would be born French if one of his parents was born in France. To further facilitate naturalization, the 1889 law (valid until 1993) established that a child would automatically become a French citizen on attaining his majority (18 since 1974) if he had lived in the country for some time. After World War I, in which 1.5 million men died and some 2 million were handicapped, France encouraged the flow of immigrants to replace its depleted workforce. Under a 1927 law, for example, children born of a French mother and a foreign father would automatically become French, which had not been the case before. Between 1927 and 1938, there was an average of 38,000 naturalizations per year, rising to a high of 81,000 in 1938 alone. After the Nazi defeat of France during World War II, the Vichy government declared that the 1927 law “created French citizens too easily.” Vichy stopped naturalizing immigrants, and 15,000 people, including 6,000 Jews, lost their citizenship as a result. Many in the Resistance also lost their citizenship, including Pierre Mendès-France, Marshal Leclerc, and de Gaulle himself! In 1945, Vichy laws were abolished and many foreigners who had served in the Resistance were naturalized.

In 1993, another turn in political life served to limit the droit du sol. Under pressure from the growing anti-immigrant movement in France, the conservative government led by Prime Minister Balladur added that, along with having lived in France, a child on attaining majority must make an “expression of will” to be French: to obtain French citizenship, a child born in France of foreign parents had to fill out a form (and thereby “express his will”) between the ages of 16 and 21. (And, as you may know, confronting French bureaucracy requires a huge amount of will.) In 1998, after the left returned to power, it changed the law back again. Now a child with foreign parents automatically becomes French on attaining majority under two conditions: he must live in France when attaining majority, and he must have lived in France for at least five years between the ages of 11 and 18. Although a center-right government is back in power, there are no proposals at the moment to make naturalization more difficult. The fact that Jacques Chirac won by running against the anti-immigrant Jean-Marie Le Pen certainly makes the citizenship question too controversial to be reexamined anytime soon. The French government has instead focused on curbing illegal immigration, especially from Eastern Europe. — Julie Pecheur

French Immigration and Integration


Four years ago, nearly a million rapturous French gathered on the Champs-Élysées to cheer their heroes: France’s World Cup-winning multiethnic team. Enchanting images of young blacks, whites, and Arabs sharing tears of joy and waving the Tricolor prompted the media and many politicians to declare that this team mirrored the new “Blacks Blancs Beurs” society, a place where all races and ethnicities live side by side. The French declared Zinedine Zidane, the soccer star born in France of Algerian parents, their most popular countryman. But the exhilarating power of sport masked a complex reality. In October 2001, a friendly match between the Algerian and French teams (the first since Algeria’s independence in 1962) turned ugly. In front

of a stoic Prime Minister Lionel Jospin, young supporters, mainly of Arab descent, booed the French national anthem and invaded the field to stop the game when Algeria was losing 4–1. Some said they did not want to see the country of their parents humiliated again. The fantasy of the 1998 victory was finally shattered on April 21, 2002, when the anti-Europe, anti-immigrant Jean-Marie Le Pen finished second, behind the conservative incumbent president Jacques Chirac, in the first round of the presidential elections and won up to one-third of the votes in some districts. In the final runoff, Chirac was reelected president with 82 percent of the vote, and his coalition of conservative parties gained the parliamentary majority in the legislative elections;


Le Pen’s party won no seat. (Five days before, France crashed out of the World Cup.) Yet Le Pen’s historic and unexpected first-round result provoked a political earthquake, exposing, among other things, voters’ deep disaffection with the traditional parties that had ignored the lower classes and their troubles: recent layoffs, chronic unemployment, meager salaries, and uncertain futures. Neither Lionel Jospin’s Socialist Party nor Chirac’s Rally for the Republic was able to address these social and economic expectations. This proved most damaging for Jospin and the far left, who were particularly hard hit. Their traditional voters, essentially clerks and blue-collar workers, who represent more than half the electorate, largely abstained or voted for Le Pen. In the 1978 legislative elections, the French Communist Party won 36 percent of these voters; this year, 3 percent. Similarly, in the 1970s the Socialist Party attracted about 70 percent of this vote, against less than 40 percent today.

In his new book, Democracy Against Itself, the philosopher and historian Marcel Gauchet reexamines what in 1990 he called the “social fracture,” the gap between the economic and political elite and the lower classes. The emergence of the sentiment d’insécurité (feeling of insecurity), Gauchet explains, destroyed the fundamental myth of the protective state. According to Gauchet, la fracture sociale was first revealed when politicians failed to pay attention to delinquency and to curb it, even though it disproportionately affects the lower classes: vandalism and delinquency are firmly established in poorer neighborhoods.

Recently, delinquency has indeed been pushed to the foreground of public attention. Last year, the Interior Ministry published statistics revealing an increase in crime of 7.7 percent in 2001, up from 5.7 percent in 2000. (Crime had decreased from 1995 to 1997 and was stable in 1998 and 1999.) 
These statistics provoked heated debate because they used tools dating from the 1970s and combined all sorts of crimes, from mere insults to murder, from pot smoking to financial fraud. But despite all the interpretations and controversies over the figures, nobody denies that violent acts and incivilities are on the rise. Nonviolent theft constituted two-thirds of last year’s increase; violent assaults, however, increased tenfold between 1987 and 1999 (while homicides have decreased by 20.8 percent since 1992). More important, all polls show a steady increase in the “feeling of insecurity,” which for the past two years has been the number one issue for the French.

The media did their part to sustain the polemic by showering their audiences with sordid reports. According to an investigation by Le Monde, the number of stories dealing with delinquency rose by 126 percent between February and March 2002 and decreased by 50 percent after the first round of elections. Television accounted for 60 percent of the increase. From January to May, “insecurity” was covered eight times more often than unemployment. TV documentaries bore alarmist titles such as “The Evildoing Generation 2000: Investigation on Delinquency” or “Suburbs: Investigation on Violence Without Remedy.” On the eve of the first round of elections, television screens filled with images of the

bruised, swollen face of an old man, beaten up by two kids who then set fire to his humble house. The all-news channel LCI mentioned this “Papy Voise” story 19 times that day, devoting nearly half an hour to it. Lurking behind this obsession with insecurity is the propensity of many, including Le Pen, to link it directly to immigrants and their descendants, using rhetoric that merges the delinquency issue with what has been falsely called the “immigration issue.” Racist remarks made in public increased markedly after the first round, as if Le Pen’s vote total had lifted a social taboo.

The delinquency that is the center of this rage (insults, shoves, thefts, and assaults, as opposed to dangerous driving, organized crime, or financial fraud) is perpetrated by youngsters who are, for the most part, born in France and are neither foreigners nor immigrants. Obtaining paperwork and citizenship are not their main problems; sharing in France’s historic motto, Liberty, Equality, Fraternity, is. They are the children of the immigrants who were recruited en masse, mainly from the French colonies, during the decades of economic prosperity from 1950 to 1970. At the time, it was thought they would leave when France no longer needed them. In 1974, anticipating the economic slowdown resulting from the oil crisis, the government ended the immigration of unskilled workers. Soon afterward, it became obvious that the immigrants were in fact staying. They settled in the socially and geographically isolated poor neighborhoods ringing the cities and were joined by their spouses and children. In the mid-1970s, they were hit particularly hard by the rise of unemployment, which became chronic in the late 1980s. Immigrants’ children are caught between two cultures, lacking social markers and an economic base. 
In the absence of acceptance by the general culture, they developed underground economies, formed gangs, defended territories, and acted out their hatred of public symbols, from police officers to bus drivers. Inside and outside their communities, they exhibit wounded pride and a desire to provoke. Their neighborhoods, mainly ethnic working-class outer suburbs filled with decaying projects, are now referred to as quartiers sensibles (sensitive neighborhoods), and they are themselves often designated as the génération sacrifiée (sacrificed generation). Since the beginning of the 1990s, as juvenile delinquency has increased steadily, many have wondered whether the third generation will follow the same path.

Traditionally, the right and the left have had differing approaches to the problem of crime. The right has advocated some kind of repression and has condemned police weakness and lax justice, while the left has sympathized with the hardship of daily life in the cités (projects) and has believed that social and economic intervention would render repression unnecessary. Over the years, the insecurity issue became taboo among leftists: Perpetrators were often of foreign descent, and the left feared that talking about it risked aiding Le Pen’s political progress. By 1997, however, as prevention policies and economic improvement failed to stop the increase in crime, Jospin, then the new prime minister, set about revolutionizing the left’s approach. He asserted that


“security policy must stress individual responsibility rather than sociological excuses,” and crime became the second priority of his government, after unemployment. His government tried to open the police force to descendants of immigrants, allowed for faster prosecution of crimes, and created youth detention centers. But despite the new reforms, daily offenses, especially juvenile delinquency, continued to rise. During the presidential campaign, Jospin and Chirac narrowed the debate solely to the insecurity issue, offering virtually the same solutions, including the creation of a special ministry dedicated to the crime issue. Chirac asserted in February that “nobody in France is safe” or “feels safe.” Chirac, who avoided prosecution for corruption by virtue of presidential immunity, tried to borrow the rhetoric of New York’s Mayor Giuliani to advocate a “zero impunity” (sic) policy. Jospin immediately responded, “Every offense must have a punishment.” Le Pen was furious: Everybody was stealing his favorite themes. But he was also hopeful; according to his famous quip, voters always “prefer the original to the copy.” Last May, 86 percent of those polled believed “justice was not tough enough on petty delinquents” and 76 percent believed “the police should have much more power.”

Whether or not repression is needed is a debate that masks a deeper question: Is the French integration system working? The “republican ideal” assumes a unified French identity in which differences are minimized. French culture and values are to be transmitted (or created) through the unified and centralized education system—public and free. This model insists on the social and geographical assimilation of the various segments of the population and rejects separation according to ethnic or cultural communities. 
As a consequence, it is difficult to know exactly how many residents are black or of Arab descent, or what jobs they hold, because keeping such records by race or ethnicity is illegal. Recent passionate and ongoing public controversies over whether Muslim girls should be allowed to wear the veil and whether regional languages (Corsican or Breton) should be taught in schools reveal the rather strict definition ascribed to French identity. In some respects, integration has worked. In the past 20 years, for instance, mixed marriages (between French and foreigners) have doubled. In 1999, they represented more than one out of ten weddings, without counting those between Français de souche (French of French descent) and French of foreign descent, for which records are not available. Yet, for many, the notion of otherness is still difficult to accept. A solid 59 percent of the French think there are too many immigrants in France. In fact, foreigners account for 5.6 percent of the population, and according to some estimates, 7 to 8 million of France’s 60 million people are

of African, Arab, or Asian descent. Yet, except for a couple of Beurs in the entertainment industry, there are virtually no nonwhite faces in the media or in politics. Among the 8,424 candidates in the recent parliamentary elections, only 123 were of Arab or African descent. The new government appointed Tokia Saïfi, a woman of Algerian descent, as junior minister for sustainable development; it is the first time that a member of a minority has been named to the caretaker cabinet.

For the first time in its history, Europe, which is currently trying to sketch a common immigration and asylum policy by 2004, has become not only a continent of immigration but the region that receives the most immigrants in the world. In the 1980s, 700,000 people officially entered the region annually. From 1989 to 1993, the number increased to between 1 and 1.5 million annually. It has now subsided to about 700,000 again. Even so, foreigners still represent only 5.1 percent of the European population, and excluding European immigrants, only 3.5 percent—13 million out of 380 million people. Unlike the rest of Europe, France has always been a country of immigration, receiving 119,000 immigrants in 2000, about the same as in the previous two years (26,000 from the European Community and 93,000 from other countries, half of them from Africa). Like the rest of the Union, it needs foreigners to compensate for its aging population and low birth rate and to fill labor shortages in some economic sectors. No doubt France, like Europe, faces a cultural revolution. —Julie Pecheur

The Changing Nature of Citizenship

“Zero Tolerance”
As France’s “zero impunity” (see “French Immigration,” p. 7) shows, America’s anticrime policy of “zero tolerance” has recently caught on around the world as a model of policing. Mexico City, for instance, has just retained one of the policy’s first champions, New York City ex-mayor Rudolph W. Giuliani, as an anticrime consultant. Few articles, however, better illustrate the problems of adapting it to local conditions than the following, written by an editor of the South African Crime and Conflict Quarterly and published as “Zero tolerance makes little sense” in the Johannesburg Weekly Mail and Guardian (Feb. 5, 1999).

The police have promised a “zero tolerance” approach to the Western Cape crisis, a term that has been bandied about a lot lately. There is a growing sense that they are being too soft on criminals, that a concern for human rights has emasculated what was once a potent police service. “Zero tolerance” sounds sufficiently manly to serve as a rallying point for those who would adopt a “no nonsense” tack on law enforcement, the “nonsense” generally being large parts of the Constitution. The South African version of the zero tolerance idea was brought here largely by advisers from New York, like Jack Maple, who recently visited these shores at the invitation of the New National Party.

New York has experienced a remarkable decline in crime, attributed by the city’s police to their targeting of “lifestyle offences,” such as begging and vagrancy. By getting the riff-raff off the street, the theory goes, the general atmosphere of lawlessness is reduced. By enforcing every statute, no matter how minor, respect for the law can be won again. What advocates of zero tolerance fail to recognise is that New York and South Africa are about as alike as footlongs and boerewors. Both are sausage-shaped meat products, but that’s about where the similarity ends. New York is concentrated around Manhattan, which is an island. 
Working with limited space, developers were forced to build up, rather than out. The result is one of the highest population densities in the world: eight million people packed into an area of 515 km². This means the city’s 38,000 police have a relatively small land mass to patrol, smaller than the Durban metro area. There is virtually a bobby on every street corner.

New York is also one of the wealthiest cities on earth. The incomes of Americans are taxed not just by the national government but by their home state and city as well. The high income, property and sales taxes paid by New Yorkers are sufficient to give the New York Police Department (NYPD) an annual budget of $2.4 billion—roughly equal to the budget of the entire South African Police Service. In addition to their numbers and density, New York police are well paid, with overtime and benefits sufficient to attract a huge pool of applicants for available positions. There is usually a year-long waiting list of qualified people wanting to become officers. All of them have matric certificates and driver’s licences and are functionally literate. Being a cop is a prestige profession in the United States, so much so that many departments require a four-year degree to join the force.

The NYPD drills human rights into the minds of all recruits, and with good reason: the almighty American lawsuit. A police officer who abuses a citizen in the U.S. can expect himself, his department, and the city to be served with legal papers within 24 hours. There is an entire community of lawyers who work for a percentage of whatever their clients are awarded and who watch every move the NYPD makes. A strong internal affairs division, under direct pressure from elected officials, is just the mustard on the pretzel.

All this means that New Yorkers can afford to talk about zero tolerance. They can back it up by arresting every litterbug and vandal they come across. They can expect their officers will not misconstrue the tough-guy rhetoric and manhandle the public. Until South Africa has the capacity to identify a criminal when he is arrested for the umpteenth time, and until human rights become an entrenched part of local culture, we’d probably be better advised to look for guidance from nations more like ours.

This is not to say South Africa has nothing to learn from the New York experience. What the NYPD has shown is that little crimes do make a difference, that they create an atmosphere of lawlessness and impunity that leads to bigger things. But zero tolerance is about generating a sustained ambience of order, not just responding intensively to crises. In South Africa, zero tolerance could only practically be applied in heavily patrolled urban areas. Using scarce resources to make sure there are no loiterers in Sandton or drunks on the Durban beachfront will result in the rest of the country being relatively neglected. The zero tolerance approach has the potential to reinforce a kind of apartheid between those who have access to the police and those who do not. 
While private security firms and the new municipal police forces ensure that the rich continue to be better protected than the poor, it is incumbent on the government to minimise this differentiation. Disguising preferential treatment for the most visible sectors of society as “zero tolerance” is not the way forward. —Ted Leggett

Technology and Biotechnology

To Each His Clone
When Michel Houellebecq’s The Elementary Particles, the most important French novel of the 1990s, was published in English translation two years ago, reviewers had trouble understanding what the fuss had been about in Paris. The story concerns two half-brothers, Michel and Bruno, who were abandoned by their hippie mother in the ’60s and meet up later in life. Most reviewers focused on Bruno, whose sexual obsessions and dysfunctions were taken as an indictment of the psychological havoc that, in the author’s view, sexual and women’s liberation had wrought on young people over the past 40 years. Less attention was paid to the other brother, Michel, whose problems in the novel are not sexual but metaphysical.

Michel is a biologist who does genetic research, a field he appears to have chosen by chance and in which he performs brilliantly. His heart is not in it, though, so he takes an extended sabbatical, hoping to find a new sense of direction and purpose. Yet all he discovers is loneliness and the vacuum of his empty soul. By the end of the novel, this emptiness itself becomes a kind of inspiration, as Michel withdraws to the Irish countryside to lay the foundations of a new genetic science that will abolish from the human psyche both love and pain (the first causing the second) and thus permit the construction of posthuman man.

It is fair to say that Houellebecq’s dystopian vision of the aims and probable outcome of genetic research struck as deep a chord in France as his portrayal of the sexual revolution. And with good reason, for one of the most important tropes in French intellectual life since World War II has been the “death of man.” The theme was introduced by the Hegelian philosopher-prophet Alexandre Kojève, who wrote about the prospect of the “last man,” one who would live solely for the sake of consumption and sexual gratification [see Lilla, Correspondence 6, “Alexandre Kojève: Russian Agent?”]. 
It was elaborated by anthropologist Claude Lévi-Strauss, who spoke favorably of the end of the humanistic fantasy of “man” as having a fixed nature, and reached its most extreme formulation in the work of historian Michel Foucault, who predicted that the notion of “man” was an artifact destined to disappear like a figure traced in the sand as the next wave approached. So when we open a French book on genetic research and read an opening sentence like the following—“Man as we conceive him today is destined to disappear”—we know we are on familiar terrain. Bernard Debré’s La grande transgression (Laffont, 2000) is a fairly typical “shock” book about genetics of the sort being written in most developed countries today, though like them, it also reflects


intimate national obsessions and fears. Debré’s main fear seems to be that genetic research will lead to a genetic engineering industry dominated by American values (utilitarianism, materialism, profit) that will give birth to a new social order hostile to modern French values (justice, equality, social solidarity). Even a thoughtful observer like Axel Kahn, a respected biomedical researcher who, like Debré, has served on national ethics boards, opens his most recent book, Et l’homme dans tout ça? [And what about man?] (NiL, 2000), by informing readers that his otherwise sober investigation into genetics and human evolution was inspired by recent protests against economic globalization and Americanization.

Nothing opens a wider window into national psychology today than technological change; and there is no more intimate and potentially frightening technology than biotechnology. A good book could be written, and perhaps someday will be, on how different nations have reacted to the advances in biotechnology at century’s end. In Germany, for example, the memories of Nazi eugenics continue to cast a long shadow on discussions of everything from genetic manipulation to animal rights. In the United States, as recent debates over stem-cell research illustrate, discussion inevitably revolves around “rights,” whether those of the unborn or those of potential victims of diseases that research might one day help cure.

Even the study of the ethics of such research takes on a different cast in different countries. There has been an enormous growth in academic programs in ethics in American universities in recent decades, especially medical ethics, and most large hospitals now have someone on staff trained to deal with such questions. 
This university-centered approach has been reinforced by the fact that roughly 3–5 percent of the American genome budget was earmarked for research on its ethical implications, making it, in the words of one project director, “the largest ethics project in history.” France has taken another approach, at once more centralized and more focused on quality-of-life issues than on rights. The main public debate on genetic research takes place in the Comité national d’éthique, which was the first such body in the world when founded in 1983 and whose members (sages)—doctors, biologists, philosophers, lawyers—are appointed by the national government. And the questions raised in this venue tend to return time and again to what it means to be human (and a French human being) rather than to melodramatic cases of particular individuals suffering from illnesses. The best treatment we have of this theme is American anthropologist Paul Rabinow’s French DNA (1999), which is


based on fieldwork he did in Paris in 1994 on a formal commercial collaboration between an American biotech start-up company and a French genomics laboratory. Rabinow notes that the term one hears most frequently in French bioethical debates is dignité. Harking back to France’s Declaration of the Rights of Man, but also perhaps to its Catholic heritage, “French law basically refuses the individual the right to dispose of his or her body and its parts” on the grounds that such commerce removes human dignity. In a 1988 report delivered to the French president, the principle was stated in this manner: “The body is the person; this is one of the modern elements of France’s eternal civilizing mission—to make this principle triumph over the mercantilism of industrial society. [Therefore] human dignity forbids that man be given a right to own his own body.” For most Americans, it can probably be said, human dignity requires just such a right.

French hostility to the commercialization of biology is also reflected in the repeated use of the term bénévole, which literally means “benevolent” but in this context refers to the system of voluntary contributions to things like blood banks. How societies manage their blood supplies, and the degree to which that management reflects their conceptions of the sacred and the human, has long been a subject of anthropological reflection. The French system, at least since the Second World War, has operated on the assumptions of anonymity of donor and recipient, and of non-commercialization. In recent years, with the rise of HIV and the development of new screening and other technologies, this system has come under pressure, producing at least one horrific scandal. 
When an American firm discovered in 1983 that a heat treatment could inactivate viruses in blood stocks, it informed the French National Center for Blood Transfusion, which failed to act on the advice until it could develop its own technique; in the meantime, HIV-tainted blood was distributed to hundreds of hemophiliacs, who became infected. When an HIV test was developed by the American company Abbott Laboratories in early 1985, it was not given a license in France until much later in the

year, in large part because French researchers were trying to develop their own antibody test, but also because of resistance to the notion that bénévole donors should be screened, and thereby identified. Since this decision was taken at the highest reaches of the French government, a number of researchers, counselors, and politicians were taken to court by the victims’ families, and many were convicted of crimes.

Rabinow’s book is enlightening, if marred by a knowing suspicion of medical and ethical “discourses” that grows tiresome. Where it falls short, however, is in failing to see the connection between the French DNA debates and the larger questions of human destiny that have plagued the French—and not just French—imagination for decades. This connection appears more clearly in a recent collective work edited by Lucien Sfez, titled L’utopie de la santé parfaite [The utopia of perfect health] (PUF, 2001). What is particularly refreshing about this volume is that it puts genuine concerns about such biotechnology into the wider context of the extreme hopes that the notion of “perfect health” has evoked in the modern age, and especially in the last century. The practice of cloning, the authors suggest, has given rise both to fantasies of absolute self-mastery and to nightmares of eugenic strategies to rid the world of weak individuals or disfavored races. Understandable concern about the ozone layer and diminishing biodiversity has encouraged some ecologists to imagine a utopian, worldwide management of the planet along the lines of James Lovelock’s 1991 best-seller Gaia: The Practical Science of Planetary Medicine. The pursuit of supermodel thinness, the campaign against smoking, the development of what one author calls “nutritional correctness”—all these phenomena appear linked to a utopian urge for perfect health that lies beneath both our search for the genome Grail and our fears of a world populated by inhuman clones. 
As Michel, the researcher in The Elementary Particles, might have put it: Every utopia spawns the dystopia it deserves. — Mark Lilla

Who’s Your Daddy?
Adoption, divorce, remarriage, single parenthood, in vitro fertilization, surrogate motherhood: never have the questions of paternity and maternity seemed so complicated. As in so many other cases (see “To Each His Clone,” p. 11), the European debate on these issues differs from the American in that principles of right and choice tend to be less important than those of social cohesion and solidarity. Nonetheless, as historian and anthropologist Agnès Fine argues in a fascinating article, “Vers une reconnaissance de la pluriparentalité?” [“Toward a recognition of multiparenthood?”] (Esprit, March–April 2001), as the forces of individualization and atomization of the family increase, all Western societies will have to face up to a new social phenomenon: multiparenthood (pluriparentalité).

At least since the Middle Ages, Fine writes, Europe has followed a “genealogical” model of descent (filiation) in establishing legal norms concerning the family: Each individual is considered to have just two parents, each of whom also has two, and only two, parents. With the spread of legal adoption, beginning in the 1920s, the distinction between biological and legal parenthood had to be faced, and most European countries followed the American example in opting for rigorous anonymity: Neither children nor adopting parents were permitted to know the identity of the child’s biological mother (or, for that matter, father). France adopted a rather strict law in this regard in 1966.

But with the rise in divorce rates over the past fifty years, this exclusive model of parenthood has become harder to sustain. Legally, French law gives clear and distinct rights to divorced parents over their children, and none to their new spouses. In practice, though, it appears that a significant proportion of legal adoptions involve one partner’s new spouse and his or her children from a previous marriage, throwing into legal doubt the children’s relations with one of their biological parents, grandparents, even brothers and sisters. Who, then, are the parents, that is, those with both rights over and duties to the children? This remains a legal gray area.

Similar problems arise with artificial insemination, whether used by a couple or a single mother. As Fine suggests, infertile fathers are anxious to claim paternal rights over the children born to their wives through such technologies, not only out of pride but as a way of affirming the maternity of their spouses. But does that mean sperm donors have no rights (even to information) with regard to the children born of their sperm, or that children have no right to know who their biological fathers are? Another gap.

Slowly, European countries are trying to address the new shape of the modern family. In Britain, for example, the Children Act of 1989 gives stepparents limited rights and responsibilities regarding their spouses’ children who have been under their care for at least two years. France has been slow to follow this evolution, though new European Community statutes relating to family relations may push it more in the British direction. As Fine notes, one already sees television reports of two sets of parents joyfully awaiting the birth of a child: one set including the surrogate mother, the other the biological father. Nor is it unheard of today for two homosexual couples, one lesbian and one gay, to serve as parents to a child conceived through reproductive technology.

All of which raises a number of questions, which Fine enumerates but does not try to answer. Who, for example, is to be held responsible for the child’s upbringing and well-being in such cases? From whom can the child expect to inherit? Should he or she be able to “choose” whom to recognize as parents from among these many “parents”? This new legal landscape is complicated further by the question of designating a child’s last name. 
For centuries the standard European practice was to give all children, male and female, the names of their (presumed) biological fathers. The significant exception was Spain, which developed a system of hyphenated last names made up of the father’s first last name (always placed first) followed by the mother’s; when those children had children of their own, they passed on only their first (paternal) last name, which was then hyphenated to the first last name of the child’s other parent. Now both these systems are undergoing changes, and confusion is growing.

In early 2001, a member of the French parliament, Gérard Gouzes, submitted a bill that would give parents the right to choose whether their children would receive the mother’s or the father’s last name, or some amalgam of the two. This seems like a simple proposal but, in fact, it raises weighty questions, both practical and moral, which have been the subject of debate in France for the past year. In a recent symposium on the Gouzes proposal in the magazine Esprit (February 2002), the sociologist Bernard Zarca pointed these questions out, in a comic vein at first:
Imagine a pregnancy in the year 2035. Madame and Monsieur, who each received hyphenated last-names from their parents at the turn of the century, are chatting before the fireplace. Monsieur opens hostilities. He finds his father-in-law’s name ridiculous and doesn’t want his children to inherit it. Madame, in a mood for compromise, agrees to contribute her mother’s last name instead, but only if it appears first. But Monsieur responds that his name should go first, for the sake of alphabetical order. Madame is getting angry: actually, she says, she doesn’t find her father’s name ridiculous (even though she prefers her mother’s name); and if alphabetical order is what counts, she’ll use her father’s name, which comes first. Monsieur counters by proposing his mother’s name, which alphabetically would precede that of his wife’s father. No, says Madame, because your mother doesn’t like me.

Humor aside, Zarca has a proposal. He notes that Spain recently revised its law in this area, so that children, at the age of maturity, have the right to decide for themselves which of their parents’ names they wish to have first; they would then pass on their first choice to all their own children, who would in turn have double last-names in an order determined by the couple involved. For example, when Pedro González-Hernández reaches maturity he
would choose whether he wished to be called González-Hernández (in which case he would pass on González to his children) or Hernández-González (in which case, Hernández). His wife would do the same. If, then, Pedro Hernández-González (as he is now called) marries Maria Vargas-Santiago, their children would be named either Hernández-Vargas or Vargas-Hernández, as the parents chose. However, when the children reached maturity, they could invert the names. And so on . . . .

For sociologist Irène Théry, responding in the same issue of Esprit, proposals like Gouzes’s and Zarca’s, while clever, avoid the important social issues buried in this debate. One has to do with taking rule-making authority in this domain away from society and its conventions and placing it in the hands of individual families, further fraying social bonds. “By trying to open a space for liberty, the law would do the reverse: It closes the little nuclear family onto itself. The child apparently belongs only to its parents, until it decides to give itself a name.” Another problem has to do with the asymmetry between men and women in parenthood. At a time when families are breaking up through divorce and remarriage, and when more children are being born out of wedlock, more and more of the burden of parenthood is falling on women—which also means that intergenerational filiation is becoming more matriarchal, not shared. Théry prefers the Spanish system before the reforms: although somewhat sexist, because it placed men’s family names before women’s, it ensured that parents would share a name with each other and with their children, who would in turn share a lineage into the past. Why not revert to this system, she asks, or modify it by allowing each couple to form a common hyphenated last-name, which they would pass along to all their children?
This seems a practical proposal—so long as one assumes that children have only two, clearly designated “parents.” But if Fine is right, that is less and less the case. How then will we know what to call ourselves? Take a number? — Mark Lilla

Technology and Biotechnology

Fashioning the Military-Entertainment Complex


On Independence Day, the traditional summer blockbuster date in the entertainment industry, the U.S. military released its new video game, America’s Army: Operations. Designed by the Modeling, Simulation, and Virtual Environments Institute (MOVES) of the Naval Postgraduate School in Monterey, California, the game, intended as a recruiting device, is distributed free on the Internet. Produced with brilliant graphics and the most advanced commercial game engine available, at a cost of around $8 million, the game is a first-person multiplayer combat simulation that requires players to complete several preliminary stages of combat training in an environment mirroring one of the military’s own main training grounds—a cyber-boot camp. On the first day of its release, the military added additional servers to handle the traffic—a reported 400,000 downloads of the game. The site continued to average 1.2 million hits per second through August. Gamespot, a leading review site, not only gives the game a 9.8 rating out of a possible 10 but also regards the business model behind the new game as itself deserving an award.

As the military’s new blockbuster video game illustrates, the military-industrial complex, contrary to initial expectations, did not fade away with the end of the cold war. It has simply reorganized itself. In fact, it is more efficiently organized than ever before. Indeed, a cynic might argue that whereas the military-industrial complex was more or less visible and identifiable during the cold war, today it is invisibly everywhere, permeating our daily lives. The military-industrial complex has become the military-entertainment complex. The entertainment industry is both a major source of innovative ideas and technology and the training ground for what might be called posthuman warfare. How has this change come about?
In the 1990s, with the end of the cold war came an emphasis on a fiscally efficient military built on sound business practices, with military procurement interfacing seamlessly with industrial manufacturing processes. The Federal Acquisition Streamlining Act of 1994 directed a move away from the DOD’s historical reliance on contracting with dedicated segments of the U.S. technology and industrial base. In Secretary of Defense William Perry’s newly mandated hierarchy of procurement, commercially available off-the-shelf alternatives were to be considered first, while a service-unique development program had the lowest priority. In effect, these changes transformed military contracting units into business organizations. In keeping with this shift in mentality, “company” websites now routinely list their “product of the month.”

This shift in policy radically transformed the fields of computer simulation and training. Throughout the 30-year history of these fields, developments in computer graphics, networking, and artificial intelligence had always been driven by the demands of military and aerospace contractors because of the importance of simulation
technology to military training. Currently, in fact, the simulation budget alone constitutes 10 percent of annual U.S. military spending. In this context, the shift in procurement policy had immediate consequences for relations between military contractors in the simulation business and the entertainment industry. From the late 1980s through the mid-1990s, the game industry (including video console games, PC games, and arcade games) was growing at a brisk pace, and a number of high-end military contractors decided to spin off some of their products into the game market. Evans and Sutherland, for instance, a major producer of stand-alone flight and tank simulators, repurposed some of its simulators as arcade games. Silicon Graphics made a major move in contracting with Nintendo to produce the graphics hardware for the Nintendo 64 and its extremely successful Super Mario game series. So successful was this venture that Silicon Graphics management admitted that while their heart was still in the business of scientific and medical simulation, company revenues were mainly flowing from the game console market. The largest military contractor, Martin Marietta, spun off Real 3D, a company founded on several of Martin Marietta’s major patents in graphic chip design. Real 3D contracted with Sega to produce its next-generation arcade game platforms.

The new policies resulted in a flow not only of technology from the military to the entertainment industry but of highly talented people as well. Steven Woodcock, a chief designer of AI components for the military simulation network SIMNET, moved to Real 3D, where he designed several popular games, including Thundering Death, which used AI to generate the first-ever learning opponent in a video game. Two other SIMNET warriors, Warren Katz and John Morrison, founded Mäk Software, specializing in constructing simulation training environments as well as commercial games.
In an illustration of the new era of open collaboration between the military and commercial sectors, Mäk produced a war game called Spearhead under contract to the Marines; it was simultaneously released as a commercial game differing only in certain classified details.

The flow of technology has been bidirectional. Upholding its new policy of using off-the-shelf technology, the military has adapted game software to its own purposes. The reason is obvious: The game industry has advanced rapidly in the past five years, taking advantage of hardware developments to produce spectacular, realistic graphic displays and games with increasingly sophisticated AI components. Game software now outstrips the best the military has to offer. Consider the military’s adoption of Falcon 4.0 as the training program for its F-16 fighter pilots. Falcon 4.0 mimics the look and feel of real military aircraft and allows users to play against computer-generated forces or, over a network, against other pilots, thus facilitating team-training opportunities. This video
game’s extreme realism led to work with Spectrum HoloByte Inc. to modify the Falcon 4.0 flight-simulator game for military training.

Just as the military has leveraged the commercial sector for advanced technology, the game industry has pursued the open-source community for some of its hottest developments. This pattern began with id Software’s release of the code for its pathbreaking first-person shooter game, Doom, so that shareware gamers could modify the game by adding new rooms and levels (called “mods”). Id followed this innovative step by making available the scripting language for its hit game Quake, which radically changed the level of interactivity in games. A large shareware community of gamers has evolved, contributing tools from level editors to scripting languages for creating new environments and even changing the look and feel of the games. Other developers have followed suit, allowing players to alter their computer opponents directly through scripts and code plug-ins. This entire development has spilled over into the production of networked games, such as Counter-Strike, that host upwards of 165,000 players. These developments have had enormous implications for the industry and raise some interesting security issues as well.

The U.S. military has joined the fun of modifying games as well. In 1996 a group of Marine simulation experts from the Marine Corps Modeling and Simulation Management Office acquired the shareware version of Doom and adapted it as a military fire-team simulation, using software tools developed by shareware Doom gamers on the Internet. Real-world images were scanned into the game files so that 3-D scans of GI-Joe action figures replaced the stock game monsters. The game was also modified from its original version to include fighting holes, bunkers, tactical wire, “the fog of war,” and friendly fire.
Marine Doom trainees used Marine-issue assault rifles to shoot it out with enemy combat troops in a variety of terrain and building configurations. The simulation was later reconfigured for a specific mission in the Balkans immediately prior to engagement.

Such developments encouraged several top officials in the military simulation command to seek more formal collaborative relations with the video-game and entertainment industries. In December 1996, the National Academy of Sciences, acting on the initiative of Professor Michael Zyda, a computer scientist specializing in artificial intelligence at the Naval Postgraduate School in Monterey, hosted a workshop on modeling and simulation to investigate the possibility of organized cooperation between the entertainment industries and defense. Zyda’s report and follow-up proposal stimulated the Army in August 1999 to award a $45 million, five-year grant to the University of Southern California to create a research center, the Institute for Creative Technologies (ICT), to support collaboration between the entertainment and defense industries; to apply entertainment-software technology to military simulation, training, and operations; and to leverage entertainment software for militarily relevant academic research. The ICT’s mission is to enlist film studios and video-game designers in the effort, with the promise that any technological advances can also be applied to make more compelling video games and theme-park rides. Although Hollywood and the Pentagon may differ markedly in culture, they now overlap in technology: War games are big entertainment. In opening the new institute, Secretary of the Army Louis Caldera said, “We could never hope to get the expertise of a Steven Spielberg . . . working just on Army projects.” But the new institute, Caldera said, will be “a win-win for everyone.”

As part of the drive to leverage the entertainment market, the ICT is working on commercial games. Two games, Combat System XII and C-Force, are scheduled for release by the end of 2002. Designed for Microsoft’s Xbox, the games are intended to have the same holding power and repeat value as mainstream entertainment software and will be available commercially as well as for military training. The goal of the ICT games project is to create immersive, interactive, real-time training simulations to help the Army teach decision making and leadership rather than combat skills, so it is unlikely they will enjoy the success of America’s Army.

The rise of the military-entertainment complex is not without a certain irony. Military-supported games, it turns out, are considerably less violent than their competitors. America’s Army: Operations, for instance, renders only a puff of blood when a player is hit. Real War, a game commissioned by the Office of the Joint Chiefs of Staff, released by Rival Interactive, Inc., and published by Simon & Schuster in October 2001, is based on an official military simulation called Joint Forces Employment. The only differences between the two versions are that the official one contains more learning objectives and that the player has only a finite number of military resources—tanks, planes, and battleships. Visually, the game-play is nearly identical. Real War is particularly notable for its premise—a U.S. war against terrorism—created entirely before September 11.
In the game, players can assume the role of the U.S. military or the terrorist organization. Just to even the odds, the latter has a military strength comparable to that of Washington. Real War is rated “Teen” because of its lack of gore. Although Rival Interactive’s president, James Omer, defends the game as a strategy challenge, not an actual simulator, several online game reviews have criticized Rival’s product for not
being realistic enough, calling the movements jerky and cartoonish. Gamespot gave the game a 3 out of 10. What scores a 10 in the game community? Games like Rockstar Games’ Grand Theft Auto, a role-playing game in which the player, betrayed and left for dead, curries favor with mob bosses and crooked cops while avoiding a lethal street gang, or Max Payne, in which a fugitive undercover cop framed for murder is hunted by the mob.

To date, the ICT has not followed the game-industry strategy of opening its game editor and level-design software to the mod-developer community, but if its intent is truly to leverage the commercial market for military interests in the new era of cyberwarfare, that step cannot be far behind. Indeed, it may not even be necessary: The Unreal game engine used by the MOVES Institute for America’s Army has spawned a very large mod community of its own, visible, for instance, on the website. One group currently recruiting there is developing an Unreal-engine mod called Terrorism: Fight for Freedom, expected to be completed in early 2003. The architects of this multiplayer Web-based game—a distributed multinational group—describe their project in an update from August 11, 2002, as “a modern-day, small-scale warfare Total Conversion for Unreal Tournament 2003. The mod is based upon wars that are currently occurring in the world.”

The military is using newly minted best practices of game design and business models to compete in the arena for young, highly trained cyberwarriors. In a post-9/11 world where distributed collaboration in a military context has come to signify “terrorist cells,” the potential mods based on the Unreal engine conjure up an all-too-frightening potential reality.
No doubt, somewhere, either in the game industry itself or among the worldwide community of mod builders, a group is currently developing a cyberterrorist game based on attacking the computer infrastructure of a country, disabling its power grid, infiltrating its financial networks, and hacking into mainstream news media such as the New York Times to confuse the public about what’s going on. Will this be a market in which the U.S. military can choose (or afford) not to compete? — Timothy Lenoir

European Anti-Americanism

It was Sunday morning after 9/11. My editor called from Copenhagen to discuss the situation, and I told him that the attack was an earthshaking event we would have to deal with for a very long time. To my surprise, I sensed that he disagreed, and I felt compelled to clarify: “What do you expect the Americans to do? Take it sitting down? No, they’ll go to war.” He sighed and suggested I was exaggerating the importance of 9/11—a terrorist act like many others before it, of which Europe has had its full share. This conversation marked the start of a turbulent, at times emotionally trying, period in my work as U.S. correspondent for the left-of-center Danish daily Information—in which I would eventually be denounced as a CIA agent by a Danish author.

Having covered the U.S. on and off for more than fifteen years, I understood and identified with Americans’ reaction to the first attack on their territory since Pearl Harbor and felt it was my duty to explain that viewpoint to my readers. Many other Europeans felt solidarity with the U.S. in those days. “We are all Americans,” coined by a front-page editorial in Le Monde, was a popular refrain on the other side of the Atlantic. But as I gradually discovered, the European left’s rather simplistic view of America was more deeply rooted than I had supposed just after 9/11. Obviously, the Bush administration’s go-it-alone style and occasionally arrogant rhetoric significantly aided the prompt revival of old clichés. No shock there. What surprised me was the emergence of an emotional kind of anti-Americanism so prejudiced that facts, circumstance, and historical context seemed to have lost importance for many of my friends and colleagues at home and across Europe. September 11 seemed to trigger an explosion of pent-up frustration with America that many had felt but had not wanted to express.
I suspect that many U.S.-based foreign correspondents faced a similar conundrum, caught between their awareness of the human tragedy before their eyes and the swell of anti-Americanism among their home audience. From the first days of reporting on 9/11 in Information, it became clear to me that the editors and reporters pursuing the story from our Copenhagen office almost exclusively sought out those Americans, Europeans, and Arabs who espoused the line: The U.S. had it coming. . . . Those voices too should be heard, but the excessive shrillness of their blame game seemed offensive. Information’s front- and back-page lead editorials, usually written by editors, staff reporters, and correspondents, hewed to the European leftist interpretation of the terror attacks. To my great surprise and distress, I was not asked to write a single editorial. Between September 11 and November 3, I wrote only one leader, published three days after 9/11. This may sound petty; many colleagues wanted to air their views. But like many European newspapers, Information has a long tradition of letting staff reporters—not just editors—write leaders; indeed, it is a standard part of their job, particularly when the area they cover is at the center of attention. That the paper’s U.S. correspondent should not write editorials after an event of this magnitude was unusual in the extreme.

I think I know why it happened. A couple of weeks after 9/11, I started criticizing my fellow reporters’ work in the newspaper’s internal electronic debate forum. I insistently remarked that a newspaper founded by the pluralistic Danish resistance to the German occupation in World War II
had a special obligation to distinguish between legitimate resistance and terrorism. I felt we must make it crystal clear in our reporting and editorials that we sided with the civilized against the terrorists; that certain extreme acts should be condemned a priori—after which you can criticize America all you want. The three top editors made no response. The staff reporters who responded all criticized me, but I also received a lot of private e-mail from journalists who shared my views but, evidently, were afraid to speak up themselves.

In mid-October two editors left Information, creating a sudden need for extra lead-editorial writers. I was back, and glad to have the privilege of writing a leader once a week. Yet it was then that I discovered how difficult my job had become. Angry letters from readers poured in to our opinion-page editor’s desk and my e-mail address. A trade unionist thought it outrageous that in one editorial I had called the war in Afghanistan “just and necessary”—a view held by a small minority at Information. “I don’t know whether war can be just, but I know that the U.S.A. has no right to attack another country. . . . Why not spend defense money on combating poverty in countries that give shelter to terrorists,” wrote John Hansen. This was all very well, only Mr. Hansen added: “But as long as Burcharth and other Americans believe that war is necessary and just, there’ll be more terrorism.” I’d become an American!

Slowly but surely, I came to realize Hansen was not alone in mistaking my national identity simply because I didn’t follow a certain politically correct interpretation of 9/11. Another reader—an American living in Denmark—

objected to an editorial of mine which, while sharply critical of the Bush administration for its denial of basic legal rights to detainees at Guantanamo, referred to them as “illegal combatants.” “Martin Burcharth takes over American politicians’ views. Why can’t he get his act together and interview Noam Chomsky?” fumed Stuart Goodale, apparently unaware that a Human Rights Watch report had suggested that detained Al Qaeda members be designated “illegal combatants.”

For the sake of balance, I should note that not all American readers in Denmark shared this view of my work. One American leftist with whom I corresponded regularly got so fed up with Information’s take on 9/11 and the Middle East that he intended to cancel his longtime subscription. But these were lonely voices.

In February 2002, I received criticism from several staff journalists and the editor-in-chief for my views in a memorial editorial for Daniel Pearl. I wrote that the American journalist’s execution was an unprecedented event in Western journalism, and a direct attack on the freedom of expression of Western journalists covering 9/11 in Muslim countries, on those journalists’ duty “to defend freedom of expression and mercilessly challenge authority and power in whatever guise it appears”—and on their newspapers back home. The editor-in-chief (who has since stepped down) was particularly annoyed because—as he wrote to me—he and several assistant editors had agreed in a meeting, prior to calling me, that I should draw parallels between the terrorist murder of Daniel Pearl and such state bombings of media as the Al-Jazeera office in Kabul, the official Serbian TV station in Belgrade during the war in Kosovo, and a Palestinian TV station in the occupied areas. Their point was: Freedom of the press is not only threatened by terrorist organizations; it is also under attack by sovereign states. I was stunned when I received these instructions by phone and refused to commit myself one way or the other.
Never had I been given such strict coaching before writing a leader. Though very uneasy, I decided to trust my own moral compass. How could I equate the U.S., Israel, and NATO countries with criminals who had murdered an innocent journalist? Of course I had to ignore the instructions. This incensed the editor-in-chief. In an angry letter to me, he claimed that I had been unduly influenced by the American media’s reporting on the Pearl case. “We are not Americans,” he wrote. “I personally believe that there’s no difference between the murder of Pearl and the bombing of a TV station filled with journalists in Belgrade. It is equally terrible and horrifying.” I too had been horrified by NATO’s 1999 bombing, but had pointed out at the time that it was carried out at 1:00 A.M. to avoid civilian casualties. It was certainly not a deliberate murder of Serbian journalists. In my response I also indicated that an American journalist reporting on radical Muslim groups in Pakistan was hardly comparable to Slobodan Milosevic’s propaganda station, which had helped create the conditions for mass murder and ethnic cleansing. But such arguments seemed unavailing in the wake of 9/11.


As the action moved from Afghanistan to the Middle East later in 2002, the question of terrorism as a legitimate means of warfare arose again. The subject of debate became Palestinian suicide bombings in Israel. None of our editorials condemned this practice, which was deplored as horrifying but generally accepted as a legitimate, last-resort means of defense.

In early April, I was asked to write a leader on Colin Powell’s trip to the Middle East. My main point was that the secretary of state’s mission would fail unless Arab states were able to convince Arafat to denounce suicide bombings. Of course, this would not happen, since the Arab League and the Organization of the Islamic Conference considered terrorist attacks on civilians legitimate warfare. “But this is morally repugnant. Arab states have passed a limit all civilized nations must obey. They feed a widespread illusion that young Palestinian suicide bombers can defeat Israel by turning themselves into martyrs. Nothing could be more untrue,” I wrote. The editorial also condemned the Israel Defense Forces’ mistreatment of Palestinian civilians and took strong exception to the army’s refusal to let humanitarian organizations enter the Jenin refugee camp.

The response from within and outside the newspaper was one of shock. How could I, wrote the editor-in-chief, “make the fight against terrorism the focal point in a conflict which is about the right of two states to exist. One state is occupied by the other. Terrorism/warfare is just a terrible consequence of this.” Again I was suspected of toeing the American line. “You’re awfully close to espousing the policy of the Bush administration, which sees the fight against terrorism as the central issue,” my editor-in-chief asserted. One of our Middle East experts got so upset that she published a long list of questions on our internal debate site, which
she demanded I reply to. I virtually had to pass an exam—a litmus test—on Middle East policy.

A well-known Danish author felt offended for another reason. How could I say “it was unreasonable and unconstructive of Arab leaders to claim that the U.S. had given Israel a free hand to finish up its operations in the occupied areas before the arrival of Powell,” he asked in a letter that the editor decided not to publish. “Everyone knows that the U.S. has a grand and elaborate strategy for hegemony, world dominion, one not drawn up yesterday in Bush’s nursery. All Danish journalists know this, so why don’t they say so,” he wrote. The author’s explanation was quite simple. “It would be such a relief if journalists like Martin Burcharth revealed their political views. Their unwillingness to do so creates unfortunate suspicions. A member of the U.S. secret services once claimed the CIA bribed 10% of European journalists.” “But whether you have been hired by a foreign power or not, you can still be an agent, that is, under cover of objectivity, veil the principal aspects of the subject you write about and underscore the less important perspective. As Burcharth does.”

A chilling message from a well-known Information reader. Of course, it is just one person’s view, yet it is indicative of the anti-American outlook that has gripped the European left since 9/11. The scope of the debate on America’s role in the world has narrowed significantly. You are generally scorned, perhaps even excommunicated, if you dare to write something balanced and—God forbid—anything positive about American foreign policy. For someone who considers the United States—for all its arrogance and mistakes, shortsightedness and insularity—indispensable, and transatlantic understanding imperative, these are trying times indeed to have one’s feet planted in both camps. —Martin Burcharth

The Americanization of Germany

Until quite recently, it seemed easy to reflect on Germany’s Americanization, from the establishment of basic democratic political rule in postwar West Germany to our canonization of Hollywood movies and pop music’s penetration of youth culture. But September 11, the war in Afghanistan, and the peculiar reaction of our leftist and liberal intelligentsia have changed things. On November 15, 2001, Stern magazine surveyed prominent authors, politicians, and TV celebrities under the headline “Stop This War!” Among those quoted was Franz Xaver Kroetz, a 55-year-old stage actor and playwright who had had close ties with the German Communist Party in the 1970s:


A country that takes an entire Volk hostage to serve its own interest is guilty of crimes against humanity and is itself a war criminal. The bombing of the Serbian Volk is such a crime. To heap suffering upon the Afghan Volk, and then to bomb those corrupt bandits known as the Northern Alliance into power is just such a crime. The U.S.A. has committed many political crimes over the past several years. I thank God I am not an American [. . .] .

Such a strongly worded statement was obviously meant as a provocation, especially the words “Volk” and “crimes” (which the U.S. is ostensibly enticing the German Volk to commit). As a dramatist, Kroetz gained notoriety for reviving the Volksstück, a typically southern German and Austrian melodrama which


attempts to elucidate the sufferings and conflicts of peasants and the urban poor to educated theater audiences. As an actor, Kroetz won fame in a TV series, playing a gossip columnist who, by exposing scandals surrounding Munich’s trendsetters, gives them media exposure. Traditional German culture sees the Volksstück as highbrow culture and the TV series as lowbrow, attacking programs such as Dallas as trash culture. Yet TV critics heaped praise on this series, Kir Royal, convinced that an “illegitimate genre” (Pierre Bourdieu) could be dignified through clever screenplays and artistry—an instance of the mechanism whereby certain cadres of the intelligentsia legitimize products of the American culture industry simply by proclaiming them pop culture.

Recent anti-American protests have followed their own traditions and iconography, but many have been inflected by the new post-reunification alliances between East and West German intellectuals. In early October 2001, East German author Daniela Dahn denounced the U.S. in a lecture on terrorism and September 11, at eastern Berlin’s symbolically important Gethsemane Church, a rallying point in the events of autumn 1989 that led to the GDR’s collapse. “Fundamentalist terrorism is a malignant cancer—but it is not the only cancer,” she said. “Economic terrorists have long sought to privatize power. We must assume responsibility for having stood idly by while these purveyors of death have been allowed to portray themselves as champions of Western values. We must stop this hypocrisy! We can no longer accept human-rights violations in poor countries as the price of our own prosperity. It is precisely this un-Christian humiliation that is the reason for fundamentalist rage.
We must create a world where no terrorist can claim the right to self-defense.” The September 11 attacks were perpetrated by terrorists, but in her view, terrorists also existed among the victims, at any rate, among capitalist functionaries who had wreaked more havoc than the Islamic fundamentalists.

But the recent protests raise an interesting and troublesome question: Can this type of anti-American discourse and protest be considered the logical consequence of Americanization itself? Along with older genres, including the lay sermon in a church, anti-American sentiment appears in the magazine opinion poll and the TV talk show—quintessentially American forms. The TV talk show is a venue for discussing all types of issues and venting all kinds of sentiments, including anti-American ones. Like blue jeans and rock ’n’ roll, talk rules the West German airwaves, and since 1989 those of the ex-GDR too. Ex-GDR citizens have exploited this genre very effectively: Gregor Gysi, the leading transformer of the SED, the ex-GDR’s Socialist Unity Party, is now the PDS’s most popular politician due to frequent talk-show appearances. By having to present any and all opinions, the TV talk show lets an erstwhile left-wing radical actor pretend to be an expert on armed conflict, and an author specializing in GDR nostalgia an economist. Of course, such “opinionatedness” was utterly rejected by the attackers of September 11. They left the task of interpreting their mute actions as a form of rampant anti-Americanism to the countless talk shows of German television, thereby revealing just how Americanized we really are.

This conundrum echoes the anti-American protests in the West Germany of the ’60s, when literally all forms of political protest—sit-ins, go-ins, teach-ins, mass demonstrations against the Vietnam war and U.S. imperialism—were adopted part and parcel from American prototypes. Earlier Germanic forms of youth protest were rejected as reactionary, from the anti-Western, anti-Napoleonic quasi-military physical conditioning of the early 19th century to the nature-worshipping, antimodernist protests of the 1920s. Since high school and college students of the sixties seldom had access to record players or money for records (still extremely expensive), radio was the medium of choice and the U.S. Army radio station AFN our favorite program, confirming Hegel’s concept of the “cunning of reason” (List der Vernunft). Whereas German radio stations played rock ’n’ roll sporadically, AFN played it around-the-clock, thus providing the soundtrack to the motion picture in which we acted during our youth. The radical left-wing student, vehemently opposed to America’s involvement in Vietnam, spent most of the day listening to the “enemy’s” rock music.
A decade earlier, of course, the so-called Teddy-Boy revolt of the fifties was also influenced by Americanization. In his exhaustive Bravo Amerika, cultural anthropologist Kaspar Maase relates how the first German youth revolt emanated more from lower classes than did the later student rebellion, comprising apprentices, young workers, and salaried employees simply driven by consumerism and “lifestyle” choices: particular haircuts, “riveted trousers,” as jeans were then called, and cruising on mopeds. Teddy-Boys were the first group to embrace a peer-group subculture. Public attention turned to open-air rock ’n’ roll concerts that often ended in clashes with the police, but lacked any real political motivation. [. . .]

Journalism and the Middle East

The Americanization of East German youth also manifested itself through pop culture. [. . .] Although the SED regime sought to forbid rock ’n’ roll and dungarees, the same impelling force took hold that already gripped Western Germany. [. . .] One myth has it that blue jeans toppled the GDR, since the country was simply not capable of producing the necessary quantity of stonewashed jeans to satisfy the fashion needs of its younger generation. [. . .] For a time I believed that a traditional Teutonic aesthetic fundamentalism would gain a foothold among East Germany’s young artists, authors, and theorists disgruntled by reunification, and resentful of West Germany. But in fact, these young East Germans were too permeated with rock ’n’ roll to elevate their anti-Westernism into an aesthetic program, despite their anti-American, anticapitalist criticism, and loathing of consumerism. From the Swingboys of the Third Reich to West German Teddy-Boys and rock-accompanied youth protest in the GDR, younger generations have invariably embraced Americanization by simply adopting the corresponding music and clothing, and the attitude that goes with them.

Part of the appeal of Americanization is its inclusiveness and ease of access. Traditional European high culture (a “culture of distinction”) involves complex learning processes, inculcated at a very young age, the lack of which is difficult to compensate for later in life; Americanization simply provides the unlimited opportunity to consume—the prospective customer can at least get his foot in the shop without being barred from the outset. So Americanization immeasurably broadens the concept of art and culture, allowing all players to become consumers of art and culture. Americanization brought Hollywood to Western Europe and to Germany, prompting a widespread passion for cinema; the subsequent cultural elevation of cinema eventually superseded the notions of highbrow and lowbrow, legitimate and illegitimate, art and consumerism.
But since the ecumenical moment of Jean-Luc Godard’s Breathless, when Jean-Paul Belmondo adopted a screen persona akin to that of Humphrey Bogart, the need for Europeans to differentiate between European and Hollywood cinema has grown acute. Hollywood means Rambo; Europe—well, we’re not quite sure what. It’s more of an ideal, even an undiscovered missing link, one that directors, critics, and moviegoers alike insist is “not Hollywood.” The German Wim Wenders has tried to direct such anti-Hollywood films, albeit using Hollywood actors, with very perplexing results. Such is the nature of anti-Americanism as reflected also in the German cinema, itself a leading industry Americanization helped reestablish in Europe after 1945: Not only are certain forms of anti-Americanism essential components of Americanization itself, they are now even helping to drive the process of further Americanization. — Michael Rutschky
A longer version of this essay was presented at the “Democracy and Popular Culture” conference at the University of Chicago’s John M. Olin Center for Inquiry into the Theory and Practice of Democracy, April 20, 2002.

In June 1996, the Egyptian newspaper Al-Ahbar claimed to have exposed a terrible plot by Israel’s security services, a devastating new weapon that would bring ruin on the whole Arab world. This terrible weapon had a misleadingly innocent appearance: It was a small, Chiclet-like rectangle of chewing gum. The gum, to be marketed in Arab countries, so the story went, would raise sexual appetite to an unprecedented pitch, creating a sexual frenzy all over the Muslim world. According to Al-Ahbar, “the sexual excitement brought about by the chewing gum is not an end in itself. The ultimate object is the negative consequences of this stimulation, for if this gum increases the activity of the sexual glands in an extraordinary way, multiplying it by at least 50 times its normal rate, the chewing of this gum, even if only rarely, causes impotence through the destruction of the reproductive organs. . . . The whole thing will result in total ceasing of sexual activity among the inhabitants of the Arab countries within a few months. . . . And the aim lying behind this infamous Israeli plot is the decrease in the birth rate in the Arab countries in order to narrow the wide demographic gap between Arabs and Israel.” This newspaper story was not only taken seriously. It prompted a parliamentary-level inquiry. That what would strike citizens of Western democracies as utterly bizarre may sound plausible to the readers of a respected mainstream paper like Al-Ahbar, as well as to the Egyptian parliament, should give us pause. Westerners, it would seem, are more than slightly misinformed not just about popular Arab conceptions of the West, but also about a whole world of cultural and political imagination.
And this ignorance rouses some sad reflections on amazing Western arrogance and self-confidence: In recent decades the trendiest Western academics, those who were supposed to help us understand—there are of course quite a few notable exceptions—have not really been interested in the Arab world. Instead, they helped turn the profession of scholarly study of the Islamic culture into a study of . . . well, themselves. Inspired by Edward Said’s influential Orientalism (1978) and a host of “postcolonial studies,” the West’s Arabists in general, and those in U.S. academia in particular, have turned to the self-indulgent study of their own texts about the Arab world. Like Said himself in Orientalism, a great many scholars do not even consider data about the actual Islamic world relevant to evaluating what Westerners write about it. Even curiosity about another culture, let alone serious scholarship, cannot take the nose of fashionable “postcolonial theory” out of its own belly button. So we can scratch our heads with Gayatri Chakravorty Spivak’s now canonical “postcolonial” text, and ask “can the subaltern speak?” (since, the logic goes, representations ipso facto impose the view of the representer on the represented). We would then not need to bother with listening to what the so-called subalterns are actually saying, writing, even shouting, until they happen to shout something that we cannot help but hear. The collapse of the World Trade Center, for example.



Instead of asking if “the subaltern” can speak, how about reading what he is writing? This simple insight was the basis for the Washington-based think tank MEMRI (Middle East Media Research Institute), essentially a massive translation operation, disseminated mostly through the Internet. With branch offices in London, Berlin, Moscow, and Jerusalem, MEMRI translates the Arab press, television, radio, published religious sermons, programmatic speeches, and school textbooks into all major European languages, plus Hebrew and Turkish. If you want to know how New York Times columnist Thomas Friedman stays informed about the Arab world, you need only log on to the MEMRI website, register, and join the tens of thousands who get free daily MEMRI translations by e-mail.

The think tank was primarily the initiative of one man, Yigal Carmon, its current president. Carmon, a trained Arabist, served for two decades in Israeli intelligence and the Civil Administration (a clean-sounding name for the Israeli occupation apparatus in the territories), and was an adviser on terrorism to two Israeli prime ministers, Yitzhak Shamir and Yitzhak Rabin. He participated in the Arab-Israeli negotiations in Madrid and Washington prior to the Oslo initiative. Carmon, however, opposed the Oslo Accord itself from the very beginning on the grounds, now more plausible than they seemed before the failure of the Camp David negotiations, that Yasir Arafat did not intend to follow through on what he signed. But in Israel, this would not qualify Carmon as a hard-liner or hawk. He believes in what used to be the left-wing position and is by now the view of the majority of Israelis: The occupation should end and the only long-term solution to the conflict is “two states for two peoples.” But MEMRI, supported by foundations dedicated to research institutes, and also by private donations from American Jews, was not designed, Carmon insists, to promote or inhibit this or that political plan.
It was set up in 1998 to burst two bubbles: the Western refusal to look at what is going on in the Arab world, and the Arabs’ solipsistic talk about the West to Arab ears only. This puncturing operation, says Carmon, is a cure for their schizophrenia and our wishful thinking. Neither apologetic nor wary of politically correct sensibilities, Carmon believes in the straightforward principle that ignoring facts is hazardous. “Journalists, academics, and intelligence agencies in the West have failed tragically,” he tells me in an interview, “in supplying the most basic facts. They have just not been doing their job. Here and there we hear something about internal Arab debates. But there is a great difference between getting your information from Western correspondents, who usually don’t speak Arabic, and following editorials or other media for yourself.” Carmon dismisses the idea that MEMRI’s selection of material is geared to emphasizing Arab xenophobia, anti-Semitism, and fundamentalist radicalism. “We give special emphasis to dissenting writers who are liberal and aim to promote democratic reform in Arab countries,” he says. “We supply lists of such writers and their writings to anyone who seeks them, and the lists are surprisingly long.” Apart from the aspiration to expand its translating operations to countries not yet covered (Jordan, Sudan, and the Emirates), MEMRI has a budding project of analyzing Middle Eastern economies, and an annual report on anti-Semitism in the Arab world.

That much of what MEMRI translations reveal is unflattering to the Arab world cannot, however, even when combined with Carmon’s biography, easily be dismissed as the result of bad intentions or a smear campaign. For what MEMRI does, and what Carmon says, are actually remarkably reminiscent of what dissenting Arab liberal voices urge. The chewing gum affair, for example, was reported not by malicious Westerners, but in a piece on masculinity in the Arab world by Lebanese intellectual Mai Ghoussoub. Ms. Ghoussoub contributed it to an illuminating anthology on the topic by Arab writers, which she also coedited (with Emma Sinclair-Webb: Imagined Masculinities, London: Saqi Books, 2000). To take another notable example, Hazim Saghieh, editor of the op-ed page of the London-based Arabic Al-Hayat, has been consistently arguing against the paranoid style of Arab conspiracy theories and what he deems to be a superficial anti-Semitism, assembled hastily by propagandists for political purposes. In the spring 2002 issue of Dissent, exiled Iraqi writer Kanan Makiya (“The Arab World After Sept. 11”) argued that part of what sends young people on terrorist missions to the West is a failure of Arab intellectuals to help build a more liberal civil society. Instead, Makiya argues, “we”—i.e., Arab intellectuals in general—have helped nurture isolationist anti-West fantasies of victimhood and redemption ever since the collapse of great hopes in Pan-Arabism and Arab Marxism.

In the aftermath of September 11, the American Middle East Studies Association (MESA) issued a request to its members not to cooperate with the U.S.
government initiative to encourage the teaching of Arabic (among other languages), since this initiative is subordinate to narrow conceptions of “national security”— meaning the war on terrorism. This is, of course, the old Said paradigm all over again—knowledge of the Orient as a means of subduing it. By now all this should seem more than slightly anachronistic. Regardless of political opinion, no Western democracy can afford to ignore what is said and written in the Arab world today; if MESA wants to do so, there is no good reason to follow its example. But since not all will rush to study Arabic, at least what MEMRI translates and distributes is widely available. —Gadi Taub




Reflections on the Online Ha’aretz
Following is an excerpt from a lecture (“Digging Beneath the Surface in the Middle East Conflict”) delivered by the editor-in-chief of the Israeli daily Ha’aretz on May 27, 2002, at the ninth World Editors’ Forum in Bruges, Belgium.

First, the good news: Abu Ali’s nine children are alive and well—as well as children can be among the ruins of the Jenin refugee camp. Please deliver this news to all of your friends who may have read, a few weeks ago, Abu Ali’s mournful declaration: “All my nine children are buried beneath the ruins.” Abu Ali’s photograph was spread across a double page in a very distinguished and influential European magazine, under the title: “The survivors tell their story.” Israeli tanks and bulldozers had entered the camp, Abu Ali recalled. He went out to fill his car, telling his nine children to meet him at a nearby intersection. But the Israeli forces blocked his way back, and it was a week, he told the reporter, before he could return to the ruins of what had been his home. [. . .] “I am sure all my children are buried beneath the rubble. Come back in a week and you will see their corpses.” The reporter and his editors did not wait a week. [. . .] The desire to hype the story blunted their healthy journalistic instincts to doubt and double-check any story before publishing it. I made some inquiries about Abu Ali’s case. First, final numbers indicate that three children and four women were killed during the fighting in the Jenin refugee camp. Second, Abu Ali’s children were not among them. And third, the magazine did not bother to tell its readers of this relatively happy end to its story. The past 20 months of the Israeli-Palestinian conflict have created a real crisis of values for journalism. I believe I can compress the enormous volume of coverage and comment into four fundamental sins: obsessiveness, prejudice, condescension, and ignorance. The story of Abu Ali conveniently exemplifies all four.
It is impossible to cover an ancient dispute in postmodern idiom, using 21st-century technology, without recognizing the inherent dissonance. But such recognition is not always there. That is perhaps why the intensive media coverage of the conflict is often so self-absorbed and so harmful to the region. [. . .] One day, historians examining this period of crisis will have to consider the circular process by which the media were transformed from observers to participants. [. . .] The media in this cruel Israeli-Palestinian conflict are like a very rich junkie who parks his Mercedes on the high street of a slum. You can be sure that in no time at all, everyone will be out there, pushing a whole variety of merchandise. [. . .] The months of violence have forced our venerable, 84-year-old newspaper to play its part in the collective national ethos, though our critics claim we do not show sufficient enthusiasm for this role. Daily, we feel the impact of our work in our contacts with Israeli public opinion, and we can trace our impact, though less measurable, on world public opinion.


[. . .] Recently, a best-selling Israeli author, politically middle-of-the-road, wrote us: [. . .] “. . . I have reached the conclusion that you and I don’t live in the same place. A large and growing proportion of the reports and articles in your newspaper stink of the foreign press, which regards the State of Israel as a different, distant and repulsive territory.” [. . .] [Yet] unlike those who report the conflict as a grand adventure, we live the consequences of our reporting with every inch of our being. Ha’aretz is a small paper in a small country. Our paid daily circulation, Hebrew and English—the English edition is a joint venture with the International Herald Tribune—reaches 100,000 copies. This is less than 10 percent of the Israeli newspaper market. Nonetheless, in the 15 months since we launched our online edition, our Hebrew-language Web site has grown to log half-a-million page-views a day, and our English-language site, another 700,000, mainly from outside Israel. [. . .] [W]e have yet to earn a single penny from it. [. . .] [D]espite our modest pretensions, we have been chosen by many on the Net as producers, suppliers, and packagers of information from the Middle East. We are servicing individuals, media groups, communities, and organizations all around the world. [. . .] Are we one of the dealers that hang around the Mercedes parked on our high street? We certainly are not, but we constantly need to persuade others in our neighborhood that we aren’t. The Israeli-Palestinian conflict may lack mystery, but it is deceptive. Practically nothing obstructs acquiring information from the region, but it is no simple task to assess to what extent that information reflects reality. [. . .] Exaggeration, disinformation, and provocation are the region’s stock-in-trade. At the most basic level of sight and sound, the conflict is easy to cover. But [. . .] [n]othing is what it appears to be.
For example, one day last August, while on a family vacation in a peaceful seaside town in Brittany, France, I couldn’t miss the front-page headline of the regional newspaper [. . .]: “Israel assassinated Palestinian political leader.” The uncredited story told how an Israeli helicopter fired a missile through the office window of Abu Ali Mustafa, the secretary-general of the Popular Front for the Liberation of Palestine in Ramallah, killing him instantly. Now, the PFLP is indeed a political movement, but it is also an active terrorist organization. I could not help but wonder how this news report, as it appeared in the paper, enriched a local reader’s perception of the conflict. [. . .] Obviously, the editor who wrote the headline was not aware of definite information concerning Mustafa’s involvement in coordinating a terrorist attack on an Israeli school that took place the following week, on September 1. To know that in real time, the editor would have needed deep, reliable sources inside the secret service. If he had had that knowledge, would he have phrased the headline differently, or would he just have made it a short foreign-news story on a back page? As you see, even simple, neutral coverage is often loaded. [. . .] So is the use of contradictory terminology that often reflects the two sides’ conflicting narratives. “Shaheed” (martyr) or “suicide bomber”? “Resistance fighter” or “terrorist”? These are all different expressions for the same person. By choosing to use one of them, you expose your own take on the conflict. In the Middle East, naïveté is an intolerable professional failing, especially when it comes to terminology. No one in the region uses the present tense to describe the actual moment. There is only past or future. Retaliation for what happened, or prevention of what is yet to happen. As our children tell us: “Everything started when she hit me back. . . .” And yet, the story as depicted in the media is sometimes so painfully present-tense, lacking in context and lacking in consequence. For example, the image of the Palestinian suspects, stripped to their underpants, with the Israeli soldier aiming his rifle at them, is inevitably shocking to anyone who does not know how much blood has been spilled by people wearing explosive belts under their clothes. [. . .] Quite a few of our reporters are driven by an ambition to improve society, [. . .] but [. . .] editors must make a constant, careful effort to remove the “over-enthusiasm” from news reports. In our own case, since both editors and local readers are intimately familiar with the local scene, these instances can usually be handled with a certain degree of success. But when a correspondent serves a distant, uninformed audience, his editors can often fail to filter out the distortions. Some correspondents might have been obsessive in their determination to unearth a massacre in a refugee camp.
Prejudice and ignorance were at work here, too. A more professional approach would have factored in the five million cellular phones in Israel, and a half-million more in the Palestinian areas, which would make a cover-up impossible. Even before the first reporters were on the scene in the Jenin camp, it was obvious that there had been no massacre there. Hundreds of soldiers who were involved in the operation are reservists, [. . .] and each one had a cellular phone in his pocket that he used constantly. [. . .] As Israel has gradually disengaged from the Palestinian territories over recent years, our coverage of those territories has become more like foreign correspondence in some respects than like domestic reporting. Yet, at the same time, we remain intimately familiar with the territories and with the Palestinian community—as though they were parts of our domestic beat. Over the years, our coverage has spanned most areas of Palestinian society. Our reporters have acquired a deep knowledge of its mores and culture, and deep relationships with their sources of information. Ha’aretz today has nine reporters covering various aspects of the Palestinian side of the story, and many others who take on special assignments. [. . .] [A] senior member of our editorial staff, Amira Hass, has lived in the territories since 1993 [. . .] This is unique for an Israeli.

Part of the special skills required of a Ha’aretz reporter covering these beats is the ability to critically examine manipulative information of all kinds and to filter it. Only someone deeply informed and intimately connected can, sometimes within a few hours, scotch a rumor or reduce an exaggerated report to its natural proportions. Thus, thanks to Amira Hass’s presence in Jenin as soon as the camp was opened, and thanks to the credibility of her reports from the chaotic scene, Ha’aretz was able to quickly and reliably report that there was no massacre in Jenin during or after the fighting. Because of Ha’aretz’s years of readiness to listen to the Palestinian side, and because of the natural inclination of the newspaper to regard the exposure of wrongdoing as its mission, there are reporters at Ha’aretz who have specialized in documenting the humanitarian cases on the Palestinian side. This is not new for us. [. . .] As attacks proliferated and more and more innocent Israelis fell, antipathy has grown toward those reporters who continue to describe the suffering on the other side, and they are now the main target of criticism leveled against the newspaper. [. . .] Trying to be conclusive about the basic question, “What actually happened there?” is not always fruitful [. . .] We make a huge effort to give our reader a clear picture, but nevertheless some of the stories seem equivocal. They cite two or more conflicting versions, but sometimes make no final judgment. And that, of course, can leave your reader frustrated and angry. Over the past year, there has been a dramatic change in the demographics of the Ha’aretz reader. That is a direct result of our 24/7, free-access online editions, both in English and Hebrew.
The newspaper’s content is now exposed through the Internet to two new communities we never knew before: the non-subscribing Israeli who browses for the latest news, using several sources of media for his information, and the foreign reader. [. . .] The English Internet edition has meant that Ha’aretz is now quoted in unprecedented numbers of articles and reports. [. . .] Sometimes, we discover that material that ran in Ha’aretz is taken out of context and used to serve various political or media purposes—sometimes deliberately distorting the intentions of our writers and editors. Sometimes we find ourselves being overly cautious because of our ongoing direct discourse with the Palestinians, with the Arab world, and with world public opinion. The newspaper’s reputation is sometimes exploited in order to legitimize anti-Israeli propaganda, and we are worried about that. If the paper exposes cases of vandalism by soldiers during the recent massive military operation on the West Bank, we do so in good faith, trusting that our work helps to clean the system. Then, when the story is quoted widely, under our brand name, as proof of Israel’s profound and pervasive evil, I find myself thinking that perhaps there is a fifth major sin in running a paper in this region: the sin of naïveté. —Hanoch Marmari



or The Demon of Analogy
On March 25, 2002, the Portuguese novelist and Nobel laureate José Saramago visited Ramallah as one of eight delegates of the International Parliament of Writers, begged by Mahmoud Darwish, a Palestinian poet stranded in the town, to bear witness to the Israeli occupation. The group met with Chairman Arafat but also, in Israel, with Jewish writers and politicians committed to peace negotiations. In one such encounter, Saramago compared Ramallah to Auschwitz; as a writer, he explained the next day to Ha’aretz, it was his job “to make emotional comparisons that would shock people into understanding.” Weeks later Saramago extended his shock tactics to biblical analogy in an El País article excerpted below, with a rebuttal from the paper’s New York correspondent. — David Jacobson


From the Stones of David to the Tanks of Goliath
El País, April 21, 2002 [. . .] The well-known biblical legend of the combat between the little shepherd-boy David and the Philistine giant Goliath (which never took place) has—wrongly—been told to children for at least 25 or 30 centuries. Over time, various parts of this story have been embellished, with the uncritical consent of more than a hundred generations of believers, Jews and Christians alike, a whole deceptive mystification about the unequal strength between the brutal four-meter-tall Goliath and the fair, physically frail David. This obviously enormous difference was compensated and turned around in the Israelite’s favor by the fact that David was a shrewd lad, and Goliath a hulking moron; the former was so shrewd that, before going to face the Philistine, he stopped by a brook, found some five smooth stones, and put them in his sack; the latter was so stupid that he didn’t realize David came armed with a pistol. But it wasn’t a pistol, lovers of sovereign mythical truths will protest indignantly, it was only a slingshot, a very humble shepherd’s slingshot, the kind used from time immemorial by the servants of Abraham as they watched over their herds. True, it didn’t look like a pistol, it had no gun butt, no trigger, no cartridges; just two strong, thin cords, tied at the ends to a small piece of pliant leather, and, into its hollow, David’s expert hand placed the stone which, from far away, shot swiftly, powerfully like a bullet against Goliath’s head, which it struck, leaving him at the mercy of the sword the expert marksman was holding. If the Israelite managed to kill the Philistine and bring victory to the army of the living God and Samuel, it was not by being more shrewd, but simply because he wielded a long-range weapon and knew how to use it. [. . .] 
Historical truth, modest and utterly unimaginative, concurs here, revealing to us that Goliath hadn’t a chance to lay a hand on David; mythical truth, well-known for spinning fantasies, has beguiled us for 30 centuries with the marvelous story of a little shepherd triumphing over the brutality of a giant warrior whose heavy bronze helmet, armor, shin pads, and shield were of no help to him. Whatever conclusion we can draw from the elaboration of this edifying episode, David, in the many battles that made him a king of Judah and Jerusalem and extended his power to the right bank of the Euphrates, never again used a slingshot or stones. Nor does he use them now. Over the last 50 years, David’s strength and size have grown so much, it’s impossible to tell the difference between him and the haughty giant; it may even be said, with no harm to the dazzling clarity of fact, that he has turned himself into a new Goliath. Today, David is Goliath, though a Goliath who doesn’t bother wielding useless, heavy bronze weapons. The fair David of olden days flies in a helicopter over occupied Palestinian land and shoots missiles at innocent, unarmed civilians; the delicate David of old now mans the world’s heaviest tanks, and crushes and blows up all that he finds in his way; the lyrical David who sang Bathsheba’s praises, now in the Gargantuan guise of a war criminal named Ariel Sharon, sends the “poetic” message that he must first finish off the Palestinians, then negotiate afterward with the ones who are left. And this, to put it briefly, has, with slight, merely tactical variations, been Israel’s political strategy since 1948.
Mentally misguided by the messianic idea of a Greater Israel which once and for all realizes the expansionist dreams of the most radical Zionism, dreams contaminated by the monstrous, strongly rooted “certainty” that, in this absurd and catastrophic world, there exists a chosen people of God who, as such, receive automatic justification and authorization, by dint of the horrors of the past and the fears of the present, for all the actions born of an obsessive racism—a people psychologically and pathologically exclusivist, educated and trained with the idea that whatever suffering they have inflicted, are inflicting, or will inflict on others, particularly the Palestinians, will always be less than that which they suffered in the Holocaust, the Jews ceaselessly scratch at their own wound so that it never stops bleeding, to keep it incurable, and they brandish it to the world like a flag. Israel appropriates God’s terrible words in Deuteronomy: “Vengeance is mine.” Israel wants us all to feel guilty, directly or indirectly, for the horrors of the Holocaust; Israel wants us to abandon the most basic critical judgment and make ourselves a mere echo of its will; Israel wants us to acknowledge de jure what is its de facto practice: absolute impunity. From the Jews’ point of view, Israel can never be subject to judgment, because it was tortured, gassed, and cremated at
Auschwitz. I wonder if the Jews who died in Nazi concentration camps, were persecuted throughout history, or forgotten in ghettoes, I wonder if that vast multitude of the wretched would not feel ashamed to see the vile acts their descendants are committing. I wonder if having suffered so much would not be the main reason to spare others suffering. David’s stones have changed hands; now it’s the Palestinians who throw them. Goliath is on the other side, armed and fitted out as no soldier in the history of war has ever been, except, of course, for their North American friends. Oh, yes, the horrible murders of civilians caused by the so-called suicide terrorists. . . Horrible, indeed; blameworthy, indeed; yet Israel has a lot to learn if it can’t comprehend why human beings turn themselves into bombs. —translated by David Jacobson


Reply to José Saramago
[in Spanish, El País, May 1, 2002] This past week José Saramago was in New York; his original visit, scheduled for September, had been postponed due to 9/11. The literary critic Harold Bloom delivered an enormously analytic homage to the Nobel laureate, going into each of his novels, before a jam-packed audience at the New York Public Library. I had read about Saramago’s remark to Israeli writers that described the West Bank as Auschwitz, but I had hoped it was a boutade. [. . .] It never would have occurred to me that Saramago would use the occasion of the terrible tragedies taking place in the Mideast to resuscitate the caricature of THE JEW. But when I read Saramago’s diatribe against the Jews in Sunday’s El País, [. . .] I realized this wasn’t a boutade, but a rant. You can have all sorts of points of view about the Middle East, about the history of the Middle East, and about the bloodbaths of the last months, but, [. . .] however grim things look, you have to be for the peace process and international law; Saramago isn’t addressing solutions for the Middle East, [. . .] he barely mentions Sharon. Saramago doesn’t want to dwell on a specific Israeli general, or any other of the specifics. [. . .] Instead he wants to deal with THE JEW that is roiling around in his head; [. . .] his remark about Auschwitz wasn’t based on anything he directly saw—apparently he was bused from his overnight hotel in Ramallah directly to the writers’ meeting in Israel—but on a strange conclusion about Auschwitz [. . .]. In this dangerous moment of xenophobia toward Arab immigrant workers and growing anti-Semitism in Europe, [. . .] Saramago has chosen to conjure up the picture of the eternal, immutable JEW. [. . .] 
I’m not as up as he is on the Bible and can’t comment on some devious meaning in the tale of David and Goliath, but I would have assumed that a writer of his talent wouldn’t be rummaging around in the Bible, which, in addition to producing some of the world’s most breathtaking poetry, has
enough nutty stuff in it to keep Hollywood going forever. Saramago’s portrayal of the vengeful Jew carrying out the work of a vengeful God [. . .] isn’t an original literary observation, [. . .] but standard anti-Semitism as it has rattled down through the centuries. The educated among us don’t rifle the Koran for the one sentence to vilify the Muslim religion. [. . .] Suppose I were to write the following gibberish: The loss of the Spanish colonies has caused Spaniards to hijack the world. In the words of the poet Cernuda, “Hate and destruction silently endure forever in the depths of the terrible Spaniard.” This Spanish appetite for blood, documented in the title of García Lorca’s play Bodas de Sangre [Blood Wedding], has caused their torture of North Africans, also documented by Amnesty International. The description of the Jews roiling around in Saramago’s head wasn’t formed overnight. His trajectory of THE JEW, from biblical times to the present, has nothing to do with an outer reality involving the UN, borders, the settlements, and the necessary solutions. Follow his logic: 1) The Jews are vengeful because they’ve been historic victims, which has made them obsessive racists. Really? So why have they been at the forefront of every civil rights and freedom movement? 2) [. . .] Jews have an unfair advantage: because they were tortured or incinerated they are answerable to no one, and have tremendous power to dominate the world. This is a new wrinkle on an old theme: Jewish capitalists control the world, Jewish Communists control the world, now murdered Jews control the world. Saramago resolves whatever debate is going on in his mind about all these dead Jews: [. . .] Would these dead reemerge, they would shun their descendants. His slurring provocation is not aimed at bringing any group to the peace table: “Oh, and yes, the horrible civilian deaths caused by the so-called terrorists . . .” doesn’t merit comment. 
My own dead relative, my father’s cousin, the Austrian writer Joseph Roth, [. . .] believed in seeing all sides. In The Wandering Jew, written in the early ‘30s, he expressed his evenhanded pessimism concerning the Middle East: “The Jew has a right to Palestine, not because he once came from there, but because no other country will have him. . . . Unfortunately he is as much a European as he is a Jew. He brings the Arabs electricity, engineers, machine guns and shallow philosophies. . . . The Arab’s fear for his freedom is just as easy to understand as the Jew’s genuine intention to play fair by his neighbor. . . .” [. . .] Caricatures, whether of an Arab, an African, a Jew, etc., direct our attention only toward the malicious stereotype. One of the things that a writer does, particularly those of us who have had the good fortune to know different parts of the world, is to make the human experience complex, and the unfamiliar familiar. In indulging himself by writing of the present crisis as though Jews were part of some novel taking shape in his head, in a country where Jews are an abstraction, a country whose relation to Israel is relatively recent, Saramago has done nothing to forward the cause of reasonableness, a cause which the Sadats, Rabins, and Martin Luther Kings paid for with their lives.

Refashioning National Identities

The Srebrenica Massacre and
How often does it happen that a report by an academic institution causes a government to fall? This, however, is what happened earlier this year when Holland’s center-left government resigned in the wake of a report by the Netherlands Institute for War Documentation criticizing the inaction of Dutch United Nations troops during the massacre of Srebrenica, the town in Eastern Bosnia whose male Muslim population was rounded up and killed by the Bosnian-Serb Army in July 1995, while the area was supposed to be under UN protection. The government in The Hague—political center of a country used to judging war criminals, as home to the UN’s international courts of law—was forced to acknowledge “the political responsibility of the international community, of which the Netherlands are part.” The report and the government crisis that followed are reflections of an evolution in thinking about humanitarian intervention and of the changing ethics of a small, idiosyncratic Western-European democracy that has been accustomed, until recently, to a position of comfortable neutrality and moral superiority. Morality in the Netherlands has long been a matter of black-and-white. A highly secular society with deep roots in Calvinism, Holland has long felt itself to have a moral calling. Colonialism in the Dutch East Indies, based on the “ethical policy,” was “different” from other European countries’ imperialism. (See article on p. 3.) The first World Peace Conferences, 1899 and 1907, sponsored by Nicholas II and Theodore Roosevelt, were held in The Hague. Dutch neutrality in WWI led to the city’s Peace Palace becoming a center of international law and arbitration. Moreover, since the Second World War, with its long debate about wartime resistance and collaboration, the words goed and fout (“right” and “wrong”) became moral absolutes that could be used to explain all actions, with Holland generally on the good side of things. 
This has begun to change in recent years, in part because of the important role played by the Netherlands Institute for War Documentation (NIOD), based in Amsterdam, most of whose archives deal with what the Dutch still call “the War.” Lou de Jong, its founding director, wrote the 29-volume canon of WWII historiography of the Netherlands and, through a popular TV series, did much to form Holland’s particular postwar moral landscape. De Jong was an historian-judge; politicians resigned over his findings. A Dutch Jew who continued living in Holland, he and the institute became the guarantor of national ethical purity, guardian of a spotless mirror in which the country could look in the morning. In recent years, cracks started to appear. Queen Beatrix, granddaughter of the wartime queen, confessed during a 1995 visit to Israel how few Dutchmen had helped Jewish fellow-citizens. Courageously she broke the myth of widespread Dutch heroism, admitting that though tens of thousands were resistance members, an equal number had collaborated, and that the vast majority of some 11 million had stood by as the largest number of Jews of any Western European country were deported. In the wake of the scandal over Swiss banks keeping the money of Jewish depositors after World War II, Dutch banks and insurance companies, which had not paid out life insurance to those without death certificates; the Amsterdam Stock Exchange, which had remained silent about stocks and bonds unclaimed after the War; and the postwar finance ministry, which had distributed jewelry confiscated during the war, all followed confessional suit. Official committees concluded that several postwar governments had robbed the surviving 15 percent of the Jewish population. Historians researched, a verdict—“fout”—was reached. Reparations were paid, Prime Minister Wim Kok apologized. The mirror seemed spotless again. Two years ago, it became known that crown prince Willem-Alexander was romantically involved with Maxima Zorreguieta, daughter of a former secretary of agriculture in Argentina’s Videla dictatorship. Privately, Kok arranged for a historian to study her father’s past. He found no evidence that Mr. Zorreguieta had committed any crime himself, but concluded that there was no way he could not have known that thousands were being killed by the government he served. Within the parameters of Dutch morality, formed by WWII, Zorreguieta was “fout.” Then Kok dispatched Max van der Stoel, the éminence grise of Dutch diplomacy, to persuade the girl’s father not to attend the wedding. When their engagement was formally announced, Maxima apologized for her “loving” father’s having served in a “fout” regime—carefully using the necessary Dutch moral terminology. 
Her charm and fairy-tale wedding last February captured Dutch hearts, while Dutch minds had suddenly forgotten their moralizing. Needing both to keep the future king uncompromised by marriage into a “wrong” family and to persuade the Dutch parliament to approve the marriage (as required by law), Kok brilliantly choreographed keeping the country’s moral mirror clean. It seemed as if, having led the country through nearly a decade of strong economic growth while almost ending unemployment, his second and last four-year term would end peacefully. But the happy ending to the royal wedding story masked a larger problem of gray complicity. Holland had continued trading with Argentina through the years of the “dirty war,” and dissidents were thrown alive into the Atlantic from airplanes purchased from the Dutch manufacturer Fokker. The Dutch soccer
team even played Argentina in the 1978 World Cup finals in Buenos Aires. Holland’s selective indignation connects with a much wider set of problems of democratic culture. For all their open-mindedness, the Dutch have often mistaken inaction for neutrality, indifference for tolerance, multiculturalism for cultural relativism. War in Europe, “cradle of Western Civilization,” represented an enormous challenge, particularly after the war in Rwanda, where a far greater massacre was perpetrated and where the scandal for the international community was something much worse: not having intervened at all. To try to prevent a repetition of Rwanda, Holland felt compelled to send troops to Bosnia, putting itself in a place where it could not evade direct responsibility. The Srebrenica massacre occurred in the “safe-area” after Bosnian Serb forces had intimidated, kidnapped, blackmailed, and ultimately ousted Dutch troops. NIOD was called on to investigate the tragedy as part of a newly expanded role. No longer simply an archive of WWII, the center has been designated to research all subsequent military conflicts involving the Netherlands. At the same time, NIOD has a new director, Hans Blom, representative and agent of a cultural shift. Moving away from the moral Manicheanism of De Jong, Blom is part of a more culturally relativist generation of scholars, convinced that moral matters are often far more complicated than “right” and “wrong,” and that historians should analyze events in their context, not issue verdicts. In 1996, Kok’s first government requested a report on Srebrenica. It was an ethical imperative but also a welcome maneuver: As long as historians were working, public debate on Srebrenica could be postponed. Finally, in April 2002, the NIOD published its 3,000-page study. Supported by previously classified documents, it revealed meticulously what the public at large already knew. 
Though the perpetrators of the largest massacre in Europe since WWII were unquestionably Bosnian-Serb soldiers led by Ratko Mladic, it was made possible by the failures of many others: of Dutch troops and their commander, who were naive, unprepared, and intimidated; of the Dutch military leadership, which withheld information to save face; of the Dutch government, which sent troops whose safety it could not guarantee; of a European Union incapable of preventing war in Europe; of a NATO policy of containment and appeasement of Serbs, which made it complicit in the ethnic cleansing of Bosnia; of nation-states that too easily accepted homogenization along ethnic lines as a solution for the Balkans, where different ethnic, religious, and linguistic groups were traditionally mingled in the sprawling lands of the Ottoman and Austro-Hungarian empires. Perhaps most of all, there was excessive confidence in diplomacy, reluctance to use force, and inability to distinguish between aggressor and victim in ethnic warfare. Dogmatically, the UN placed neutrality above human rights, allowing itself to be manipulated—and hated—by all sides. Though the NIOD had bent over backwards to contextualize the “dismal position of Dutch troops,” it proved insufficient to conceal the incompetence of both the small battalion on the ground and the international community as a whole in preventing bloodshed and exile in the face of the anarchy and fear that followed the death of Tito and the disintegration of socialist Yugoslavia. The day after the release of the report, mere weeks from general elections and his announced retirement, Kok resigned in what seemed to some like an attempt to save the parties of his coalition embarrassment on the eve of an election. Coming seven years too late, it seemed to add insult to injury. Still, the toppling of a government in the name of human rights, prompted by a work of academic scholarship, seems unthinkable in any other country. Did it express Calvinist virtue or Dutch hypocrisy? It might be better to think about it as both the last instance and the swan song of a Dutch Weltanschauung. In The Hague, the international community, in the name of humanity, prosecutes the perpetrators of the crimes it has failed to prevent. Thus the city had come to embody a contradiction unbearable to the ethics of Kok’s Holland, a failing of the UN as much as an eclipse of Holland’s moral light: The prime minister’s residence looks out over the Tribunal. Expressing the extent to which historiography and a country’s moral record had merged, Kok declared, “This is a black page in a book that will never close” and “I have to be able to look at myself in the mirror in the morning.” To Kok’s credit, his last visit as prime minister was a private one to Srebrenica, which he did not announce to the press. Then, on the day after the holiday when Holland remembers its war-dead and Liberation, Pim Fortuyn, a leading candidate for prime minister (see p. 3), was assassinated. 
Among other things, Fortuyn had criticized the center-left coalition, which had volunteered troops for the mission in Srebrenica, but then was reluctant to debate the issue after the mission failed. The assassination—the first of its kind in 330 years—further shattered the Dutch sense of moral exceptionalism. In a matter of weeks, Holland lost its innocence. Its typical moral culture had forced it to admit failure, albeit only to itself. But by definition, crimes against humanity are never local. In its small way, it was expressive of a global shift. The legacy of WWII both called on Europe to act and made it reluctant ever to use force again. In Srebrenica, European politics of appeasement again led to atrocity. The paralysis of democratic states trying to agree to wage war reflected the local problems of Dutch consensus politics in times of turmoil. The age of morally ambiguous humanitarian intervention, the long decade since the fall of the Soviet Union, is over. The illusion lost is one of the greatest in Western thought: that there might be such a thing as political neutrality. — Theodor Dunkelgrün



Kinder, Gentler Fascism
This summer, the head of the Italian state broadcasting system (RAI), Antonio Baldassarre, addressed the national congress of the National Alliance, the right-wing party led principally by “postfascists,” and announced that it was time to “rewrite history” as it is presented on Italian television. “The old RAI represented only one culture and not others,” he said. “Often, they didn’t tell real history, but told fables, offered one-sided interpretations.” This call to “rewrite history,” before a party many of whose leaders were ardent admirers of Italian fascism, had a very clear meaning: no more “one-sided” portrayals of antifascists as noble patriots and fascists as evil villains. The disinterested pursuit of historical truth is supposed to take no note of the shifting political winds. The reality, of course, is more complex. History, some cynics say, is written by the winners. At the end of World War II, the antifascists—kept out of public life for 20 years of fascism—got to tell their story and named streets and piazzas after their heroes. With the return last year of a center-right government, whose second largest party is the National Alliance, many on the right feel that it is their turn. Much history has been rewritten in Europe after the fall of the Berlin Wall in 1989, but in Italy the process has been doubly complex. The last 13 years have seen both the dissolution of the old Italian Communist Party—Western Europe’s largest—and the rehabilitation of a party that, until recently, did nothing to hide its admiration for Mussolini and fascism. Until recently, the Italian Communists represented roughly one third of the Italian electorate, scrupulously obeyed the rules of parliamentary democracy, and governed major cities and regions. 
With an unusually high number of writers, university professors, filmmakers, journalists, book publishers, and museum directors gravitating in its orbit, the Partito Comunista Italiano (PCI) enjoyed particular cultural prestige (some would say hegemony). At the same time, the new Italian Republic made it illegal to reconstitute the fascist party of Benito Mussolini. Those who refused to renounce their faith in Il Duce regrouped in a thinly disguised neofascist party called the Movimento Sociale Italiano (MSI), which garnered between 3 and 8 percent of the vote. Its cultural weight was even less than its electoral strength. For most of its history, the MSI was headed by Giorgio Almirante, whose principal distinction was having been an editor of La Difesa della Razza (Defense of the Race), a magazine created in 1938 by the fascist regime when it decided to embrace the anti-Semitic and racist politics of Nazi Germany, and the party appeared to offer little more than fading nostalgia. With the death of Almirante in 1988 and the MSI vote stuck at about 5 percent, the party seemed destined for extinction, the relic of an era now truly ended. But events in Italy offered it unexpected opportunities for a new life: the corruption scandals that began in 1992 and wiped out the country’s principal governing parties, leaving a huge vacuum waiting to be filled. In 1994, under the leadership of Gianfranco Fini, the party changed its name to Alleanza Nazionale, distanced itself from its fascist past, and moved decisively toward the center. Suddenly, the party jumped from 5.4 to 13.6 percent of the vote and became the principal partner of prime minister Silvio Berlusconi, the TV magnate turned politician. Last year, Berlusconi returned to power with Fini as his deputy prime minister. (At the same time, the old Italian Communist Party has split into two separate parties, their share of the vote has diminished, and they are badly on the defensive.) This transformation depended on significant changes in Italy’s discussion of its history, of communism and of fascism. Fini could not have become deputy prime minister had he and his party not taken steps to revise their views on fascism. Fini criticized Mussolini’s racial laws and alliance with Hitler’s Germany, and made trips to Auschwitz and Israel. More recently, he publicly recanted a statement he had made ten years ago about Mussolini being the greatest statesman of the twentieth century. At the same time, Fini proclaimed that April 25—commemorating the day in 1945 when World War II ended in Italy—was a day of celebration for all Italians, bringing the return of liberty and democracy. This was a significant move. Most of the leaders of the old MSI and many in the newer National Alliance had been young volunteers in the Republic of Salò, the Italian government that fought alongside Hitler, which Mussolini had reconstituted in late 1943, after Italy had officially withdrawn from the war and thrown in its lot with the Anglo-American allies. For them, April 25—the equivalent of July 4 in the United States or July 14, Bastille Day, in France—was a day of bitter defeat. 
But it is not just Fini who has made changes. The rehabilitation of the National Alliance would not have been possible, in all likelihood, without a gradual softening of the portrayal of fascism both in the scholarly literature and the popular press. For much of the postwar period, Mussolini was depicted as part-criminal, part-buffoon, whose regime was a 20-year “parenthesis” in the democratic history of Italy, begun with its independence in 1860. This consensus was challenged during the 1970s by the historian Renzo De Felice, who spent more than 30 years working on a multivolume biography of Mussolini, which has become a cornerstone of all historiography of fascism. De Felice insisted that the demonization of fascism failed to explain adequately how fascism could arise and hold power for 20 years in one of the principal countries of Europe. Mussolini, De Felice argued, enjoyed widespread popularity and governed with the consensus of most of the country up until World War II. De
Felice’s biography of Mussolini rested on a fundamental distinction between Italian fascism and German national socialism. Fascism, despite its own claim to being a “totalitarian” regime, was a softer dictatorship, which left much of the liberal bureaucracy in place, made peace with the Catholic Church, and did not share Hitler’s obsession with racism and the Jews. Although De Felice was unfairly accused by some of being an apologist for fascism, he did seem to respond to the attacks on him by becoming increasingly polemical and defensive, soft-pedaling the uglier side of fascism, minimizing Mussolini’s personal responsibility in some of its crimes, the killing of his political opponents, and the disastrous conduct of World War II. During the 1980s, cruder and more simplified versions of the De Felice thesis began to circulate. Contrary to Baldassarre’s assertion that RAI has only told one side of the story of fascism, the state broadcasting system used De Felice as consultant on numerous broadcasts and hired a number of his less-refined acolytes to make documentaries, some of them offering admiring portraits of leading fascist figures. (The 1985 documentary Tutti gli uomini del Duce (All the Duce’s men) offered a sympathetic gallery of portraits of leading fascists, focusing on the personal traits of individuals rather than the consequences of their actions.) There were a couple of films on the efforts of Italian Army officials to save and rescue Jews during World War II. The stories these films told were true enough but tended to gloss over the other side of the coin: Mussolini’s draconian racial laws forced Jews out of all public life and made it much easier for the Nazis to track down and deport Jews during the German occupation—a task in which the fascists of the Republic of Salò provided crucial aid. Some of the historical revisionism was healthy and necessary for restoring a more three-dimensional view of Italy’s recent past. 
Other historians, even some who were critical of De Felice, acknowledged that fascism was no parenthesis, but had built on powerful elements of nationalism, colonialism, and antidemocratic feeling that were integral, important parts of prefascist Italian life. Among historians with impeccable antifascist credentials, Claudio Pavone wrote a book, Una Guerra Civile (A civil war), that offered a more complex view of the struggle between partisans and fascists at the end of World War II. Most antifascist historians, especially those close to the Communists, used the term “War of Liberation” to refer to the efforts of the Allies and the partisans to defeat the Nazis and Mussolini’s reconstituted “Republic of Salò.” To use the words “civil war,” in their view, was to grant the fascists equal status with the partisans. But, as Pavone pointed out, many partisans used the term “civil war,” which better described the reality of the times: However misguided, tens of thousands of “repubblichini” had voluntarily fought, risking or giving their lives, out of a sense of loyalty to fascism. This historical view crept into public life in 1997 when Luciano Violante, president of the lower house of the Italian parliament and a member of the post-Communist Democratic Party of the Left, asked that “republican” soldiers be recognized as men of good faith who, right or wrong, had fought to defend their country. In the last several years, these revisionist tendencies have gone considerably farther. In 1996, Ernesto Galli della Loggia, a respected historian and political commentator, in his book La Morte della patria (The death of the fatherland), even blamed the antifascist partisans for the death of national feeling in Italy: By their suddenly switching sides from Germany to the Allies, the antifascists helped divide the country in two. Galli della Loggia ignored other simpler and far more plausible explanations for the diminution of patriotic feeling in Italy. Twenty years of fascist propaganda—beating the drums of war, denigrating other nations—revealed as completely empty by a disastrous war that left the country in ruins with more than a million Italians dead, had thoroughly discredited nationalism in Italy. Patriotism met a similar fate in Germany, where there was no partisan civil war. One of the more curious publications in this apologetic vein was a short memoir by Roberto Vivarelli, a respected historian of fascism, long considered “antifascist,” who revealed that, as a young boy of 14, he had volunteered and fought with the Republic of Salò. The child of a fascist family, whose father had been killed in Yugoslavia and whose 16-year-old brother had run away to fight with the fascists, Vivarelli explained how it seemed morally imperative that he, too, should fight to defend the Fatherland. But what most surprised and shocked many people was that rather than presenting this as a choice dictated by extreme youth and inexperience, he continued to defend it energetically. “Certain debts of loyalty must be paid, even if they involve defeat. . . . I do not regret my choice, on the contrary, I would repeat it.” This romantic version of the Republic of Salò has seeped into popular culture. 
Recent broadcasts treating the “civil war,” according to historian Massimo Salvadori, have tended to portray partisans no longer as heroes but as opportunistic turncoats who jumped onto the winning bandwagon, and “repubblichini” as more morally coherent, refusing to abandon their cause even when it was clear it was destined to lose. The need to recognize former “repubblichini” in the government as people of good faith has distorted the historical discussion, Salvadori says. “What matters to me as an historian is not good faith, but the objective consequences of people’s actions,” he said. “Many Nazi storm troopers were no doubt also in good faith, believing they were serving their country, but what were the consequences of their actions? The Republic of Salò fought alongside Hitler and if they had won, it would have meant dictatorship in Italy and the rest of Europe. The consequence of the partisan struggle was to restore democracy and civil liberties to Italy. So, on the plane of objective consequences, I think it’s possible to say one was right and the other wrong.” Oddly enough, in a period in which many are bending over backwards to be fair to fascism, it takes a former neofascist like Gianfranco Fini to state publicly that the end of fascism was a victory for freedom and democracy that should be celebrated by all Italians. — Alexander Stille

Refashioning National Identities

A Parody of Nationalism
The 2002 FIFA World Cup will go down in history as the most peacefully staged World Cup ever. With the cooperation of their counterparts in Europe, security authorities in Japan and South Korea completely prevented hooligans from entering their countries. Despite the nationwide fever, neither country saw rioting or destructive behavior. For Japan, it was an opportunity to dissolve the bonds between sports and nationalism, and to show how far its internationalist sentiment has matured. Seeing headbands with the “Rising Sun” and the inscription Hissho (roughly, “certain victory”) and hearing tens of thousands of fans chanting “Nippon, Nippon” may have given some the opposite impression. But if you looked more closely, beneath those headbands you would have noticed that more than half of the younger Japanese had dyed hair and shiny earrings. They chant the Spanish “olé” in unison and sing Verdi’s Triumphal March from Aida as their fight song. They say Hidetoshi Nakata, hero of the Japanese team, currently playing for Parma in Italy’s Serie A, started the trend. But that wasn’t the only thing to surprise European journalists. They saw the uniforms of all the different national teams being sold along the streets and Japanese spectators wearing them and cheering for a favorite foreign team. One popular uniform was that of England’s David Beckham. Prior to the game in Kakegawa, Shizuoka, jerseys were selling briskly at 7,000 yen (approximately $60) apiece. One young man who came from London to sell them earned enough to pay for his stay in Japan and happily told reporters, “This would never happen in another country.” At the game between England and Denmark in Niigata, the Japanese spectators, who accounted for more than half of the crowd, wore red and white in support of England, even though it was still possible for Japan to meet England in the semifinals.
This kind of naïveté, unheard of in Europe, shocked Vladimir Novak, a reporter for a Yugoslav newspaper, and moved Kevin Miles, president of the England Supporters’ Association. Japan lost to Turkey in the first game of the knockout tournament, but no one was happier than the residents of a small village at the southern end of Wakayama Prefecture. It turns out that toward the end of the 19th century, a Turkish warship ran into trouble off the coast of the village. The villagers went to the rescue, and ever since there have been feelings of friendship between the two peoples. On the day of the game, children packed in front of the television with the Turkish flag painted on their cheeks, and their mothers wore traditional Turkish dresses. A middle-aged housewife interviewed on TV proudly told the cameras, “Japan is my mother. Turkey is my lover. Of course, I support my lover.” Naturally, the Japanese at this event also raised the Rising Sun flag and acted out a nationalistic parody of their own. Soccer became popular in Japan only at the end of the 20th century, and those who love it tend to be young and, unlike followers of Olympic sports, indifferent to any association with nationalism. The coach of the Japanese team is French, several of the players belong to European clubs, and there is even one player from South America who is now a naturalized Japanese citizen. Japan’s soccer fans know all too well that the FIFA World Cup is anachronistic and that the “battle between nations” is almost a bad joke. But since the event is clearly a festival, isn’t it fun to act out nationalism? After all, “choosing a side and rooting for it is more fun.” That is how a young businessman, who went to several games in different uniforms and sat in the supporters’ sections, responded to a newspaper interview. As sports today float between nationalism and commercialism, little by little they are favoring the commercial side, regardless of country. In its wisdom, the FIFA organizing committee, seeking commercial success of its own, tried to hide its commercial intentions under the cloak of the nationalism that inevitably arises when two nations meet on the playing field. Apparently, though, the supposedly simpleminded Japanese fan, while pretending to be part of all that, actually showed that nationalism today is no more than a way to spice up the fun. The offspring of the “transistor radio salesman” probably thinks commercialism is better than nationalism.
Regrettably, however, there is every indication that this year’s World Cup failed to lift Japan’s faltering economy and, worse still, left local governments saddled with huge debts. So whether the World Cup was even a commercial success is very much open to doubt. — Masakazu Yamazaki

Refashioning National Identities


The Last of the Mohicans?
The 2002 World Cup made lots of news beyond the games themselves and stirred much debate within Japan. One news item was chapatsu (literally, “brown hair” in Japanese), or the hair color and hairstyles of the national team members. In contrast to the 1998 World Cup, when Hidetoshi Nakata (now playing professionally in Italy) was the only “blond” on the team, a solid majority of the players this year had dyed their hair brown, blond, gold, silver, and even red. The player who made the most news was the midfielder Kazuyuki Toda, who not only dyed his hair red, but had it cut in a “Mohican,” and indeed was selected by the British Daily Mail as one of the five best Mohicans (along with the “soft Mohican” David Beckham) in the World Cup. In an interview Toda said he did not intend to stand out through his appearance, but rather to motivate himself so that his performances in soccer would “stand out” as much as his hairstyle. One Liberal Democratic Party politician, Takami Eto, seized on this remark to complain, “[t]he National Team members should have more pride as representatives of Japan. Too many are ‘chapatsu.’ One even looks like a rooster. You don’t have to look like a rooster (i.e., ridiculous) to look strong.” Prime Minister Koizumi, known for his affinity with the younger generation, took the opposite stance when the national team paid him a courtesy visit after the tournament, exclaiming to Toda, who had gone back to his natural hair, “Why did you change your hairstyle? You looked so cool with the Mohican!” So, is the chapatsu issue a big deal or not? The national soccer team, it should be noted, was not exceptional, but more or less representative of the level of “chapatsu-ness” among the young in Japan, men and women alike. To those who take the conservative view exemplified by Takami Eto, this is a sign of spiritual and moral deterioration among Japanese youth.
In a society in which long black hair against light-colored skin has traditionally been a sign of feminine beauty, chapatsu among men and women conjured stereotypes of outcasts, even delinquents. This view became apparent in an incident last year when more than 100 people were injured and 11 killed on a crowded bridge after a fireworks show. The press initially reported that the accident was caused by “a few ‘chapatsu’ young men fooling around on a crowded bridge.” In fact, as many readers indignantly wrote to the newspapers, the chapatsu young men had actually been trying to save women and children once people began to be crushed. Thus, the opponents of chapatsu tend to read symbolic meanings into it, while the younger generation dye their hair simply because it is “cool” to do so. Toda may be rather unusual in seeking to give meaning to his hairstyle. Once hair-dyeing becomes “un-cool,” people will probably start to dye their hair back. It is simply a fashion statement, not a political statement, as having long hair was in the ’70s. As a popular song from that decade put it, “I cut my long hair to prepare for the real world, telling myself I’m no longer young.” It seems unlikely that the younger generation now will invest that much meaning in chapatsu. Meanwhile, it is still not clear whether the practice is considered socially acceptable across all institutions (schools and companies) in Japan. Although many schools officially forbid their students to dye their hair, to judge from youth on the street, the rules are no longer strictly enforced. It has become more commonplace to see businesspeople with lighter hair colors, but it is still rather uncommon in more conservative workplaces, such as large corporations, banks, government agencies, and the military. Although one “blond” joined the conservative Ministry of Finance this year, he is reported to have since dyed his hair back to “brown” (though still not black). It is becoming increasingly unacceptable to discriminate against someone for the color of his or her hair, or to ban hair dyeing altogether. Yet it seems that people may have second thoughts about dyeing their hair if they want to be taken seriously.
The “chapatsu-ness” of Japan’s World Cup team was a clear gauge of new latitude: You can become a national hero in Japan with any hair color (a point recent Japanese Olympic medalists had already made, of course), though the style may not be universally acceptable. What is symbolic about the debate over chapatsu during the World Cup may be that Japan can no longer afford to deny diversity and individuality, whether everyone likes it or not. —Takako Hikotani

The Question of Inequality

The Case For and Against
As economists first detected signs of growing inequality in the United States during the 1980s, they began to debate whether the trend was real or apparent, temporary or enduring. Now, even after the 1990s boom, whose benefits were more widely shared, the trend clearly is real, and inequality is a basic feature of the “new” American economy, which—at least until Enron and WorldCom—was the envy of much of the world. “Forty-seven percent of the total real income gain between 1983 and 1998 accrued to the top 1 percent of income recipients, 42 percent went to the next 19 percent, and 12 percent accrued to the bottom 80 percent,” writes Edward Wolff, a professor of economics at New York University, in his book Top Heavy. “Why there has been increasing inequality in this country has been one of the big puzzles in our field and has absorbed a lot of intellectual effort,” says Martin Feldstein, a professor of economics at Harvard University and former chairman of President Ronald Reagan’s Council of Economic Advisers. The question for conservative economists like Feldstein is no longer whether inequality is growing, but why, and what, if anything, we should do about it. “If you ask me whether we should worry about the fact that some people on Wall Street and basketball players are making a lot of money, I say no,” says Feldstein. While they disagree about the causes—technological change, declining unionization, and global trade are common suspects—economists, both liberal and conservative, have in recent years come to see inequality as a consequence of a postindustrial economy that has begun to reward talent, skills, education, and entrepreneurial risk with increasing efficiency. “There is no doubt that market forces have spoken in favor of more inequality,” says Richard Freeman, another Harvard economics professor, whose writing stresses the need for greater equality. The trend goes back to the early 1970s, when the U.S.
began shifting from a heavily unionized, manufacturing society to a service- and information-based one. And it has involved one of the largest redistributions of wealth and income in modern history. The richest 1 percent in the U.S. has seen its share of the national wealth grow from 20 percent in 1969 to 35 percent today, approaching the levels it reached before the 1929 stock market crash. The effects on family income have been easily as pronounced. The top fifth of American households saw their incomes increase from $86,325 to $125,627, while the lowest fifth actually lost ground, from $11,640 to $11,388. The most dramatic gains were made by the top five percent, whose family incomes went from $131,450 to $217,355. At the same time, those holding only a high school diploma saw their earnings decrease by an incredible 21 percent in just over a decade (1979–1990). These changes have persisted through Democratic and Republican administrations for nearly 30 years and began at the same time in the United Kingdom, even before Margaret Thatcher’s market-oriented policies. They have occurred, albeit to a lesser degree, in social-democratic Sweden as well as in the laissez-faire U.S., indicating that they are not simply the product of economic policy but reflect deep structural changes in the economy. But where economists split is on whether anything should be done about the growing divide. “The question is whether you lean against the wind of the market to try to preserve decent living standards for working and poor people,” Freeman said. “Europe and Japan are leaning against the wind.” The richest 10 percent in Germany and Japan have only four times more wealth than the average family; the top tenth of American families have 35 times more than the average. Too much inequality can be a problem, but so can too much equality, according to Ronald Inglehart, a sociologist at the University of Michigan and director of the World Values Survey. The survey, which has been monitoring attitudes in countries across the globe for some thirty years, has registered the highest levels of dissatisfaction at the two extremes. “Interestingly, the levels of dissatisfaction are highest both in the most equal countries in the world—the Communist countries or those that have just emerged from Communism—and in those with very high levels of inequality,” he said. The only way you can achieve total equality is through coercion, which makes people feel they have no control over their lives and no way to benefit from their labors, he explained. But dissatisfaction reaches almost equal levels in highly stratified societies, where people feel crushed under the weight of privilege they can never overcome. Conservative economists insist that the problem with the new economy is not that some people make a lot of money, but that a series of noneconomic factors are holding others back. “The problem is not inequality but poverty,” Feldstein said.
Poor education, the breakdown of the family, and what Feldstein terms “low cognitive ability” together may be responsible for persistent poverty. He also says that some poor people may choose to skip school or not to work as hard as investment bankers putting in 70 hours a week. James Heckman, a professor of economics at the University of Chicago, argues, “We have to look at some of the basic structural reasons why we have so many unskilled, poorly educated, poorly motivated people in our society. One of the most amazing phenomena of recent years is why so many people, especially minorities, have not responded to the opportunities out there. There has been a big increase in demand for skills; educated blacks have done very well, but college completion rates have leveled off.” Perhaps never before, economists say, has education been such an important factor in predicting future earnings. In the early 1970s, economists worried that education had ceased to pay for itself. According to Marvin Kosters, a senior economist with the American Enterprise Institute, a conservative think tank in Washington, the earnings advantage of those who attend college over those who do not has doubled, from 30 percent in the early 1970s to about 60 percent. “Even finishing high school improves your income by about thirty percent,” says Heckman. One large factor in people failing to finish school is the rise in teenage pregnancy and single-parent households. Most teen mothers drop out of high school for a significant period of time. “More than 40 percent of all female-headed families with children had incomes below the poverty line,” writes professor Marc L. Miringoff of Fordham University in his recent The Social Health of the Nation. The high school dropout rate for Hispanics—the fastest-growing ethnic group in the country—remains over 30 percent, more than twice that of blacks and nearly three times that of non-Latino whites. We may be at the beginning of an even greater move toward inequality, according to David Ellwood, a professor at the Kennedy School of Government at Harvard. “We are in the midst of a huge demographic change,” he said. “The native-born work force has stopped growing. In the last 20 years, our prime labor force—people between 25 and 54—grew by 20 million people, 16 million of them native-born. In the next 20 years, the growth of that labor force will be zero, all growth will be by immigrants and by older workers.” Immigrants generally do not have the same educational advantages and language skills as native-born workers, meaning that skilled labor will become scarcer and unskilled labor more plentiful, accentuating the income gap. Some European countries, like Germany, are beginning to experience the same trend: a growing immigrant labor pool that is culturally isolated and has diminishing levels of educational achievement. Economists like Feldstein consider the level of inequality—measured by the Gini coefficient—irrelevant.
“If the stock market goes up, which we generally regard as a good thing, the Gini coefficient may well go up, too, as stockholders get richer. If the stock market collapses and everyone gets poorer, the Gini coefficient would go down. So I think that’s the wrong way to focus the question.” If one person’s good fortune does not harm others, why should we worry? The problem, says Alan Krueger, an economist at Princeton University, is that this looks at money in the abstract. “There is a relationship between economic power and political power,” he says, noting that, as the income gap has widened, political participation among those at the bottom of the economic ladder has dropped precipitously in recent decades, while remaining strong among the well-off. As money becomes increasingly important to political campaigns, poorer people may be dropping out of the process because they feel they have no voice while special interests prevail. To Ellwood, a host of social ills comes with growing inequality. “Income inequality may be linked to other non-financial disparities that ultimately can affect opportunities for future generations,” he said. Problems like drug use, teenage pregnancy, crime, and lowered life expectancy all have worsened with widening income inequality, and improved somewhat when the poverty rate went down and real wages went up in the late 1990s.

Conservatives and liberals tend to interpret these data differently. Conservatives tend to see pathological behavior as a cause of poverty, while liberals tend to see it as a result, but both acknowledge that there are genuine dangers in a two-tiered society with a substantial (and growing) underclass. “Even if you don’t care about inequality, it should worry you that real wages have gone down or are stagnant for a generation,” says Ellwood. The U.S. claim of being the land of opportunity is being undermined by recent data showing that upward mobility is declining and economic status is increasingly determined by who your parents are. “The correlation between fathers’ and sons’ earnings is 0.40 or higher in the United States, 0.23 in Canada, 0.34 in Germany, and 0.28 in Sweden,” Krueger wrote in a recent paper. “Only South Africa—still scarred by apartheid—and the United Kingdom have close to as much immobility across generations as the United States.” “If you are born to a woman in the bottom quarter of the income scale or grow up in a single-parent household, the odds that you will go on to college, even if you have the same grades and the same test scores, are much lower,” Ellwood said. Here, curiously, the views of conservatives like Heckman converge with those of liberals like Krueger. “Never has the accident of birth mattered more,” says Heckman. “If I am born to educated, supportive parents, my chances of doing well are totally different than if I were born to a single parent or abusive parents. I am a University of Chicago libertarian, but this is a case of market failure: Children don’t get to ‘buy’ their parents, and so there has to be some kind of intervention to make up for these environmental differences.” Heckman and Krueger squared off in a long debate last spring sponsored by Harvard’s Department of Economics.
Both agreed that much more accessible preschool education—programs such as Head Start—could have long-lasting effects on educational opportunity and future earning power. One intensive preschool program that Heckman touts produced an annual “return on investment” of 8 percent. (Heckman and Krueger disagree about the effectiveness of other programs, such as job training for young people past a certain age, citing alternative studies offering differing results.) Krueger points out that the educational gap between rich and poor students grows dramatically in the summers, when children are not in school. American public schools average a mere 180 days a year, as opposed to 240 in Japan, and only 9 percent of students, generally the better-off, attend summer school. Krueger advocates major investment in education—expanding preschool education, extending the school year, and offering more job training. Heckman, while granting that some of these programs could be helpful, cites their great cost. In 1994, he conducted a study indicating that restoring educational earnings ratios to their 1979 level would require $1.66 trillion. “$1.66 trillion for a major national initiative sounds like a lot less money in 2002 than it did in 1994,” Krueger countered. “The estimated cost of the Bush tax cut from 2001 to 2011 is $1.8 trillion, and another $4.1 trillion from 2012 to 2021. The illustrative target of $1.66 trillion does not seem so far out of reach if there is the will to reduce inequality.” — Alexander Stille

The Question of Inequality

The Debate on Inequality in Japan
With Japan’s defeat in World War II, all Japanese became equally poor, and the prewar privileged class disappeared. After this, as Japan’s astounding economic growth progressed, an extremely egalitarian society came into being. At least, the overwhelming majority of Japanese people thought of themselves as “middle-class.” The late Yasusuke Murakami’s classic Shin chukan taishu no jidai (The age of the new middle mass; 1984) presents a brilliant theoretical treatment of this postwar Japanese society. The perception of the Japanese was consistent with economic statistics. The Gini coefficient, which measures levels of income inequality, declined sharply during the 1960s as Japan’s economy grew very rapidly. After a temporary rise of inequality during the oil crisis of the 1970s, the index fell again into the 1980s and then started to climb. For several years now, however, a lively debate has been unfolding among Japan’s opinion leaders over the proposition that inequality has been swiftly advancing in Japanese society. Toshiaki Tachibanaki, an economist at Kyoto University, has argued that inequality of income distribution has been growing since around 1980, with the middle-income stratum shrinking and polarizing into high-income and low-income strata. As a result, he maintains, Japanese income distribution, which had once boasted a level of equality close to that of the Scandinavian countries, is now about as unequal as in Britain, France, and other West European countries, although not yet as extreme as in the United States. International comparisons of income distribution are notoriously difficult to make. But according to Tachibanaki, the Gini coefficient of income distribution after taxes in Japan was 0.365 in 1992, considerably higher than that of Sweden (0.22), Finland (0.21), Norway (0.33), and even Britain (0.35).
(The Gini coefficient is measured on a scale from 0 to 1, with 0 representing a perfectly egalitarian society in which all incomes are equal and 1 representing a situation in which one person receives all the income and the rest nothing.) For the U.S., only pre-tax income figures are available. When considered before taxes, Japan’s Gini coefficient, 0.433 in 1992, was, surprisingly, even higher than that of the U.S. (0.40). But because of Japan’s steeply progressive income tax system, its after-tax income distribution is thought to be somewhat more equal than that of the U.S. This is borne out by comparisons of the percentage of people living in poverty. In 1994, 8.1 percent of the whole Japanese population was counted as poor (defined as those whose incomes are less than half of the average). The figure is higher than in Holland (4.9 percent), Sweden (5.4 percent), Germany (6.5 percent), or France (7.5 percent), but lower than in the U.S. (18.4 percent) or Canada (12.2 percent). Tachibanaki’s view drew immediate criticism. Fumio Otake of Osaka University voiced doubts about the international comparability of the data used by Tachibanaki and the way they were processed. He argued that the apparent widening of the income gap was largely due to the aging of Japan’s population. He also attributed it to the increasing entry of women into the workforce and the growing number of couples with double incomes, but he found no evidence that Japan was becoming a class society. The debate on inequality has gone beyond income distribution, moreover. Sociologist Toshiki Sato of the University of Tokyo contends that one can no longer rise up the social ladder by one’s own efforts. According to Sato’s research, the proportion of people belonging to the upper stratum of Japanese society—white-collar workers as well as Japan’s intellectual elite, who enjoy an elevated social position and are economically secure—is increasingly made up of people whose parents also belonged to this upper echelon. Does this mean that Japan is becoming a class society, where the children of the elite also become members of the elite? Counterarguments have also been made to this thesis. Another University of Tokyo sociologist, Kazuo Seiyama, maintains that the basis for Sato’s bold proposition is weak because Sato’s method of sampling is problematic and the amount of data on which his thesis rests is too small. Seiyama accepts that the “closed” nature of the top white-collar stratum has become more marked, the proportion of people who have moved up into it from a lower stratum having dropped from 83 percent in 1990 to 76 percent in 1995; but there is no way that this top stratum could be called “closed” in absolute terms. Moreover, it is quite impossible to foresee how the trend will evolve in the coming years. Be that as it may, although the proposition that inequality is increasing remains unproven, the debate has become the focus of much interest. Ultimately, the degree of inequality in a society as a whole can only be known at a later time.
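For readers who want to see what lies behind the figures quoted throughout this debate: the Gini coefficient has a simple arithmetic definition, namely the average absolute income gap between every pair of people, divided by twice the mean income. A minimal sketch in Python (the function name and sample figures are illustrative, not drawn from any study cited here):

```python
def gini(incomes):
    """Gini coefficient of a list of incomes, on a 0-to-1 scale.

    0 means everyone earns the same; values approaching 1 mean one
    person receives nearly all the income.
    """
    n = len(incomes)
    mean = sum(incomes) / n
    # Sum of |x - y| over all ordered pairs of incomes.
    total_gap = sum(abs(x - y) for x in incomes for y in incomes)
    return total_gap / (2 * n * n * mean)

# A perfectly equal four-person society scores 0:
print(gini([100, 100, 100, 100]))  # 0.0
# Concentrating all income in one person pushes the score toward 1
# (for n people, the maximum is (n - 1) / n):
print(gini([0, 0, 0, 400]))  # 0.75
```

Real statistical agencies compute the coefficient from grouped survey data rather than individual incomes, which is one reason cross-country comparisons of the kind discussed above are so delicate.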
For people living in the real world, inequality is not a matter of statistics but of their perception of future opportunities. When people’s conviction that effort will unfailingly be rewarded is shaken, their feeling that society is not equitable will deepen. With seniority-based remuneration and hitherto secure white-collar employment tottering because of the drawn-out recession and increasingly fierce global economic competition, it is now more difficult in Japan to have a stable image of one’s future career. The resultant anxiety may draw people to the thesis that only a minority of people are enjoying an affluent lifestyle and that the middle stratum has shrunk. To white-collar workers, who up to now have enjoyed secure employment, the intensified competition and greater labor mobility may be a threat. To others, though, it could mean greater opportunities for social advancement. It is not clear whether this means more inequality or not. Moreover, even if the white-collar upper stratum is becoming more entrenched, as claimed by Sato, is this so important? Class is also partly a creation of people’s perception of their own social position, linked with a political standpoint or a particular lifestyle. In a class society, the kind of consciousness represented by statements like “The children of laborers do not need education” or “I’m a laborer, so I naturally vote for a left-wing party” is accepted as a matter of course. But by no means can one say, in today’s Japan, that belonging to the top stratum of white-collar workers is accompanied by such class consciousness. Moreover, the white-collar upper stratum in Japan cannot really be called an intellectual elite. True, up to now people have believed that the surest path to success is to study hard at school, earn good educational qualifications, and climb the ladder of success in a big corporation or a central government agency. To this end, young high school students have been caught up in what has been called excessive competition to enter one of the prestigious universities. But in a Japan characterized by affluence, it is only natural that people will aspire to a variety of different lifestyles. The lifestyle represented by graduating from the University of Tokyo and having a successful career in a big corporation is no longer the token of social success it once was, nor is it attractive to many of the younger generation. So what is happening now may be no more than the diversification of lifestyles amid greater affluence. Indeed, even if inequality has increased, is it such a bad thing? Some commentators maintain that we should face up to the fact that Japan’s postwar egalitarianism had many harmful effects. Since the end of World War II, Japan has systematically eliminated the elite from society. Even the white-collar upper stratum that Sato refers to as the elite has not had any particular elite consciousness or special sense of social responsibility.
I do not imagine that such a form of elite consciousness would have been at all welcomed by Japanese society. Japan’s overall affluence came with great speed, and some people no doubt became richer than others. But was this affluence, under an egalitarianism without a wealthy class, anything more than petits nouveaux riches vying with one another for success? A wealthy class with a genuine elite consciousness, one that makes its private wealth available for public ends, as once existed in Japan, might be preferable, suggests the writer Jun Sakada. Conditions in Japan since World War II—democratization, the dissolution of institutionalized social rank, high economic growth, and so on—have made a reality of the idea that anyone can become rich if they make the effort. Now that an affluent society has been achieved, economic growth has slowed down and the younger generation is no longer able to find meaning in making an effort. At the same time, with the arrival of a more competitive and flexible employment system, we can no longer avoid the question of adjusting pay scales to account for differences in age, academic qualifications, and ability. In other words, the debate over inequality has heated up because it is deeply connected to many other aspects of the Japanese national consensus that have begun to wobble. The old egalitarian ethos of the postwar period was based on a social contract: people would put up with high, progressive taxation in exchange for a system that guaranteed that hard work would allow them to rise and would assure them a high measure of lifetime security. With that contract breaking down, questions of fairness and inequality have assumed greater importance, even if inequality itself may not have increased substantially. —Masayuki Tadokoro
Source: Translated from “Nihon ni okeru fubyodo rongi,” in Ronso: Churyu hokai (The debate on the collapse of the middle class), ed. Chuo Koron Editorial Department (Chuo Koron Shinsha, 2001).

Inequality and Indifference in China
The street was elegantly lined with two beautiful rows of trees whose shade veiled spectacular modern buildings. Cars of different sizes and shapes were driving slowly down the road. His first feeling on stepping into this city after a couple of years of absence was intense curiosity and excitement. It was colorful, lively, and dynamic—in starkest contrast to what he remembered of the urban scene during the years of Maoist revolution in the 1960s and ’70s: hordes of people on their bicycles against the grayish walls of old architecture. But this wandering ethnographer’s cheerful meditation on the new urban feeling in today’s China was cut short by a different sight: In front of a huge seafood restaurant, a young mother, with a baby in her arms, was trying to pull out something to eat from a trash can. In her mid-twenties if not younger, she was quite clean, fair-skinned, and dressed in a white shirt and blue jeans. Sitting cross-legged on the pavement, she placed her baby between her knees. The heavy-looking trash can probably contained some edible things but, as usual, smelled pretty bad in the heat of a South China summer. Reaching out her left hand—revealing stubble under her armpit—and holding the trash can toward her, she foraged for food. She looked quite desperate, and eagerly swallowed her finds without careful inspection. Pawing at the mother’s nipples, the baby, which could not have been more than a few months old, consumed whatever was left in the mother’s hand.

There were five kinds of people on the street. The most varied group were men, a mix of middle-aged businessmen and less well-dressed youth carrying suitcases. Some shuffled along gingerly as though hoping to go unnoticed; others moved steadily, with chin


jutting proudly to show they were the true masters of the city; still others were nearly running, as if about to miss their flights. Most people ignored the scene around them or seemed indifferent in their haste. Others, a mixture of men and women, often in pairs or small groups, were more cheerful, restlessly turning their heads as they walked aimlessly, often quite loud, though never threatening, in their joyous talk. A few almost stepped on the young mother’s foot, but recoiled immediately, still laughing, seemingly determined not to be disturbed by anything unpleasant. Some young women were equally cheerful but less noisy. They walked along chatting together, holding plastic boxes that probably contained their lunches. Several wore a dark blue uniform. Their faces looked a bit weary and, when they saw the young mother and her child, they registered a quick sympathy but instantly walked on. Other people stood apart on corners or edged their way through the parting crowds, and either muttered to themselves or, as they watched the passersby, tried to speak to them; they looked around but paid no attention to the scene. A last, aimless-looking group of women, in short skirts or tights, sometimes smiled at strangers but stared at the mother and child with disgust or indifference.

This was my experience on a recent trip to Shenzhen, a city near Hong Kong that is famous because it stands as a symbol of the economic reforms China initiated in the late 1970s. Two major themes that inevitably come up whenever contemporary China is talked about are its rapid economic growth in the past two decades and the development of great economic and social disparity. It is a truth, generally acknowledged, that a new structure of inequality has emerged in today’s China.
In the latest World Bank figures, China’s Gini quotient—the measure of a society’s level of inequality—is .403, well above those of the capitalist countries of Western Europe and close to the level of inequality of the United States, whose Gini quotient of .408 is among the highest of industrialized nations. Moreover, this new structure, unlike what happened in the Maoist revolutionary past, is openly supported by the ideology of the current State. The official slogan “to get rich first” was supported by the belief that economic differentiation is a necessary condition for the initial accumulation of wealth. This belief has remained central to the spirit of Chinese capitalism or, as the government has called it, “socialism with Chinese characteristics.”

What this conversation about China misses, though often unconsciously, is an examination of the reshaping of the moral outlook that has accompanied the new structure of inequality. For example, beggars have once again become a common phenomenon in Chinese cities, especially in the big, prosperous ones. However, more than the poverty itself, what is astonishing is the attitude toward it: that nobody pays attention to a young mother and her child searching for food in a trash can on the street. The State authority has left society to “the invisible hand”; philanthropy is not yet known or institutionalized; but “a structure of feelings,” to borrow a phrase from Raymond Williams, a deadly mode of indifference, has been born, as seen in the masked expression of a new affluent generation.

This attitude of indifference is indeed new to the history of the People’s Republic. Nobody would claim that Maoist society was one of equality and justice, yet the poor and the weak during those years were probably cared for, or at least cared about, by the people surrounding them. This is no longer the case in today’s China, as was evident to me in the contrast between the well-dressed diners and the poor just outside, swallowing thrown-away leftovers. No passerby gasped with astonishment. Businessmen, tourists, female factory workers, pimps and other street people, working girls in various entertainment facilities, all wore the same expression of indifference.

Economic disparity is a social fact in today’s China, but what has come with it is a change in the moral outlook of a society that still carries within itself a revolutionary past. This change is primarily one of feeling. I am not accusing those with indifferent faces, those seemingly indifferent to the immediate suffering of others—myself included—but simply noting that a certain kind of tolerance has grown as a result of the economic reforms. Neither the government nor the people seem to see, for instance, the increasing number of beggars as a serious problem. Neither has paid much attention to those struggling for food on the street.
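As an aside, for readers unfamiliar with the measure: a Gini quotient such as the .403 cited above can be computed directly from a list of incomes. The sketch below (in Python, with invented numbers purely for illustration, not World Bank data) uses the standard sorted-sample formula.

```python
def gini(incomes):
    """Gini coefficient of a list of incomes: 0 means perfect
    equality; values approaching 1 mean extreme concentration.

    Uses the rank-weighted formula on the sorted sample:
        G = 2 * sum(i * x_i) / (n^2 * mean) - (n + 1) / n
    where x_1 <= x_2 <= ... <= x_n.
    """
    xs = sorted(incomes)
    n = len(xs)
    mean = sum(xs) / n
    # Each income is weighted by its rank in the sorted order.
    weighted = sum(rank * x for rank, x in enumerate(xs, start=1))
    return 2 * weighted / (n * n * mean) - (n + 1) / n

# A perfectly equal society scores 0; shifting income to the top raises G.
print(round(gini([100, 100, 100, 100]), 3))  # 0.0
print(round(gini([20, 40, 60, 400]), 3))
```

By this yardstick, China’s .403 sits in the same neighborhood as the U.S. figure of .408 mentioned above, well above Western European levels.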
This is a sign of change related not only to the question of inequality but also, perhaps more significantly, to the problem of morality and ethics. A new generation of businessmen and workers passes a similar scene every day and has gotten used to it. More than calculating the statistics of different incomes, observers of China must note this process of “transvaluation of values,” to borrow Nietzsche’s term. Yet, as Gilles Deleuze remarks in his Nietzsche & Philosophy, transmutation or transvaluation means not a change of values, but a change in the element from which the value of values derives. To put it plainly, it is the way in which the system of values is formed that has changed, rather than the values themselves. Central to this process of transvaluation is the emergence of a structure of feelings marked by sheer indifference to the sufferings of others. —Xin Liu



In Pursuit of the Shadowy Oppressor
Inequality has an uglier face in poor countries. In a rich country, inequality may mean a smaller house in old age, a holiday in the Algarve instead of Alaska, beer instead of wine. A poor country’s inequality may be between those who wear shoes and those who do not, those who walk and those who drive, those who live under tin roofs and those who spend summers in Europe. And India is a poor country. It is also an old country; two-thirds of its people live in villages.

Preindustrial India had three major classes: soldiers led by chieftains who owed allegiance to kings, peasants who paid taxes out of their crops, and workers who provided the rest with services like weaving, leatherworking, or toilet cleaning, and who were generally landless and often untouchable. Revenue needs varied with warfare, and crops varied with the monsoon; so there was much scope for conflict between peasants and courtiers. Sometimes, especially in the north, kings gave chieftains the right to collect taxes. The British mistook this for property rights and turned revenue collectors into feudal landowners. They also began to collect revenue in money; that brought in moneylenders and a new cause of conflict. Students of the subaltern have uncovered many rebellions of the less equal in Indian history.

The imprint of that old society was still strong when I grew up. The fundamental difference then was between manual workers and the rest. The vast majority labored with their hands, legs, and shoulders; they lived in thatch houses, walked barefoot, and made their children work. A minority was literate, worked with words, lived in brick and stone houses, and took the bus. The clothes revealed the class; the accent told the rest. There must have been an untouchable subclass; especially in village communities, they must have been marked out for special oppression. But they were no different from the majority in their dress or speech.
There was an upper subclass of the minority—the Maharajas, the tycoons, and the British rulers—but they were too few to matter.

Today, the minority has grown enormously. The majority has become more like the minority: It wears sandals, puts on trousers, can generally read at least its own language, and takes an active part in politics. Manual work has drastically declined; vehicles have taken over from handcarts, and tractors from ploughs. The majority is still ill-housed, especially in cities. Since the poor cannot buy land, they squat. But the vote protects squatters; and as they grow more comfortable, so do their huts. The new class differences are subtler: The majority dresses more loudly, goes to the cinema, rides buses, and sends its children to government schools; the minority wears more sober colors, watches television, drives cars, and sends its children to private schools. Regional inequality has widened: While the coastal areas have prospered and industrialized, the northern plains teem with fecund and illiterate poor.

After 1947, land taxes withered away under democratic pressures. Feudal rights were abolished and went underground. As the population grew, landholdings became smaller, and the number of landless workers grew. As the economy diversified, work outside agriculture grew. So today, the old conflicts have eased—except in pockets of north India where the descendants of feudatories try to get work out of workers who now have more choices and are more conscious of political power. Economic relationships have become less personal and more commercial; mobility and markets have made their dissolution easier, and the old, stable relationships between master and servant have decayed.

Studies a quarter century ago showed Indian income distribution to be more equal than in Latin America and less so than in East Asia. There are no more recent comparisons, but those differences probably persist. Rebellions take more orderly forms now; democracy has given the propertyless a new arena to fight in. In the south, parties of these low-caste majorities have ruled for almost 50 years, and have brought them education, government jobs, homesteads, cheap rice, free irrigation, and often free electricity. In the north, parties of the poor are more recent, more divided, and less focused; so their argument with the minority is more acrimonious and violent. Politicians of the low castes squabble with those of lower castes, and both make common cause to deny privileges to the upper-lower castes. A bill to reserve a third of the seats in Parliament for women languishes because low-caste parties want a third of that third to be reserved for their own.

Before the poor can rebel against inequality, they must earn their living. Rebellions occur especially when livelihoods break down. Such breakdowns have become rare as food production has grown and become more stable. The economy continues its muddled expansion, and the growing poor find a niche. In hard times, family bonds help them out.
There are, no doubt, times of desperation, and the idea of revolt is never absent. But Indian society is an atomistic one; it has few big employers other than the government, and few concentrations of workers. So class struggle is too difficult to organize; political action is easier, and there is plenty of it.

But politics, education, and the media awaken the urge in the poor to be more like the rich. Jean Drèze, who walked 1,500 kilometers through the tribal lands of central India, encountered children who could not write their own language but had picked up some English by watching television; a motor rickshaw driver in my neighborhood has adopted the unshaven, sulky look of his favorite film gangster. The manifestations of this Sanskritization are not always attractive or harmonious; but as the poor become more like us, their campaigns are corrupted by our civility. The growing north-south divide is dangerous; and violence mars social change in the north, where the poor often lose. But a long-changeless society is now crumbling, and hitherto the change has been in about the right direction. —Ashok V. Desai


African Inequality and AIDS
Last November, I traveled around East Africa visiting HIV prevention programs that were supported by an American foundation. I had expected to be shown condom demonstrations and HIV testing centers, but instead I was taken on a drive. My guide and I drove and drove all day, over muddy tracks, through endless pineapple and coffee plantations, rural villages and slums, through all of Africa, it seemed, to arrive at a small field, perhaps half an acre, with some weeds growing in it, and an old woman with a hoe. About twenty women had saved up for two years to buy this land. All of them were supporting orphans whose parents had died of AIDS, and they hoped the land would produce enough food for about fifty people in all. On a nearby hill, a coffee plantation loomed like the edge of the sea, and the old woman kept looking at it, as though she was afraid it might sweep her away.

Development agencies are increasingly concerned about land inequality in sub-Saharan Africa, as well they should be, because it may be exacerbating not only poverty and underdevelopment, but also the spread of HIV. Traditionally, most people in this part of the world are farmers, who grow their own food and sell small surpluses for cash. However, this way of life is becoming more and more difficult. In some places, population growth has meant the carrying capacity of the land has nearly been reached; in others, the best land is occupied by plantations and ranches belonging to a few wealthy businessmen and corporations. Corruption and land grabbing often push poor people onto more and more marginal land, or off the land altogether. The consequent disruption of rural life is contributing to many social ills, including unemployment and poverty, and it also creates favorable conditions for HIV transmission. The HIV epidemic in a Tanzanian town illustrates this.
Morogoro, about two hours by car from Dar es Salaam, is an important agricultural region, with sugar and sisal plantations, and the surrounding Uluguru mountains support thousands of subsistence farmers. But today the green farms lie in brown and stony folds of barren earth on the flanks of the mountains. During the past 30 years, the population of Morogoro has grown considerably. More and more trees have been cut down for firewood, and farms passed from one generation to the next have been subdivided into smaller and smaller plots; much of the soil is exhausted. Meanwhile, the corporate-owned plantations lie on the most arable land.

In Morogoro town, groups of idle young men lounge on the roadside, and girls in the brothels brew beer and entertain numerous guests. Some of these young people are university graduates, educated for white-collar urban jobs that don’t exist in this impoverished and underdeveloped nation. Young people now find themselves caught between the poverty and land shortages of the rural areas and the high unemployment—well over 50 percent nearly everywhere—of the towns and cities. The highest HIV rates in Tanzania are to be found in the cities, plantations, and mining towns where these young migrants look for work, and their sexual behavior seems particularly difficult to change. On the plantations, they earn meager wages and breathe air thick with fertilizer and herbicide fumes; at the diamond and tanzanite mines of Arusha and Shinyanga, they scavenge precious stones from the tailings left over after a day of blasting. These mines and plantations are surrounded by informal settlements, where girls as young as ten exchange sex for as little as a drink or a plate of chips.

Meanwhile, in such Tanzanian towns and cities as Morogoro and Dar es Salaam, hundreds of adolescents arrive daily from the countryside. The boys become garbage pickers or machingas—hawkers selling bubble gum, crackers, cigarettes, clothes, pineapples, and other things. The girls find work as barmaids or house girls; some become sex workers. The rural villages from which these young migrants depart, and to which they periodically return, very often experience high HIV infection rates as well. Young women left behind in impoverished rural villages are easily impressed by adventurous, sophisticated male returnees, especially those with a little money. Perhaps this is why Morogoro, whose rural areas send migrant youths all over Tanzania, and whose urban center receives migrants as well, has an HIV infection rate of 20 percent, twice the national average.
Migrancy is not a new phenomenon in Africa, but it does seem to be increasing, and Africa is now urbanizing faster than any other continent on earth. One significant cause of migration from rural areas is land inequality. A recent study funded by the European Union confirmed findings from the 1970s that all over the developing world, migration is highest from those villages where land distribution is most unequal. The explanation for this is that when most land is concentrated on a


few large farms, demand for labor actually falls, because wealthy farmers can afford tractors, herbicides, and other labor-saving devices. This contributes to unemployment among poorer neighbors, who have often already been pushed by the big farms onto marginal land—or no land at all. So the poor must look elsewhere, or send young relatives to look elsewhere, for survival.

The relationship between the social conditions under which people live and the intimate decisions they make about sex is mysterious. But lonely migrants are probably at risk for HIV because they are more likely to fall into casual sexual affairs. In addition, life in the mining settlements and urban slums where migrant youths end up does seem to encourage risky sexual behavior. Researchers from the Rockefeller Foundation recently interviewed some of these working children. They complained that adults demonized them, even though they were only trying to survive. “To be an adolescent is like living in hell,” said one. “If we do not go out and look for any jobs, our parents, brothers, and sisters back home will suffer.” Drug abuse is common, but they claim it is a necessity. “Without drugs, there is no hope,” said another boy. When asked why he didn’t go home to his rural village, he replied that there, “life is even tougher!” Some of the boys in the Rockefeller study believed that they had to have sex as often as possible to maintain their virility, even though the “gifts” they had to pay their girlfriends created a considerable financial burden for them.

On the other hand, stable rural villages, where many generations live together, where elders know who everyone is, where it is hard to keep secrets, and where young people are able to envision a positive future for themselves, might be particularly effective at encouraging the kind of sexual restraint that prevents the spread of HIV.
However, in rural Africa, the number of households where three generations live together is falling, as young people escape, not only from their destitute family homesteads, but also from the moral order of agrarian society.

Unemployment used to be an urban phenomenon in Africa, because virtually all rural people worked as farmers, either on their own land, or on that of their neighbors. But today, rural unemployment is an emerging concern, although no one seems to know exactly what should be done about it. The World Bank is encouraging African countries to develop formal systems of land tenure, based on individual ownership, so that poor farmers will have a concrete asset they can borrow against or sell for cash.

Many people, including Francis Ssekandi, former General Counsel of the African Development Bank, are skeptical about the World Bank’s plans. Before European colonization, most land in sub-Saharan Africa was farmed communally, by large extended families, every member of which had customary rights to use it. As long as they could manage to farm it productively, it was theirs for generations. Then the colonial powers gave title over some tribal land to individuals, usually chiefs. The chiefs could then sell the land right out from under their subjects, and very often they did sell it, to colonial farmers.

Ssekandi worries that if African countries follow the World Bank’s advice, the poor could lose far more than they could ever possibly gain by selling and mortgaging their land, at least in the foreseeable future. This is because land provides people with something money, as yet, cannot. “The ordinary African man or woman,” he writes, “holds land not as a commodity of exchange, but as a reservoir of wealth, prestige, and social status. A source of belonging in the community, and existence on this earth.” Until Africa industrializes so that unemployment rates fall, land will continue to provide a vital safety net for millions of people. But in the meantime, land markets based on individual ownership could exacerbate rural poverty, migrancy, and the AIDS epidemic, by further concentrating the best land in an ever smaller number of wealthy hands. — Helen Epstein

Developing HAPPINESS
When the delegation from the tiny Himalayan kingdom of Bhutan presented its 2001–2010 development goals last year at the United Nations Conference of the Least Developed Nations, in Brussels, it vowed to measure progress by the standard of “Gross National Happiness” (GNH). “Development in Bhutan is not judged merely in terms of income growth,” the published Action Program states, but “in terms of happiness, contentment, and the spiritual and emotional well-being of its people.”

Happiness is not entirely new to the leadership vision of this isolated and desperately poor Buddhist monarchy. King Jigme Singye Wangchuck first invoked the concept in the 1970s to emphasize the “non-material components” of development, and he has recently taken to invoking it again. National integrity, protection of the environment, and the safeguarding of the country’s traditional Buddhist culture, he argues, should take precedence


over unrestrained development and the exploitation of natural resources. In this way, Bhutan might avoid the tourist inundation and environmental destruction that have plagued such neighboring countries as Nepal.

The rhetoric of GNH has played well with foreign observers who are skeptical of the benefits of globalization. It has also helped to draw attention away from the unhappy fate of Bhutan’s Hindu and ethnic Nepali minority, tens of thousands of whom were expelled in the early 1990s in an effort to maintain the country’s national “integrity,” and who now linger in refugee camps. What is less clear, however, is the impact these policies have had on the happiness of those who remain. Unfortunately, it is impossible to know. For despite his love of the term, King Wangchuck has done precious little to try to measure “happiness,” rendering GNH, in this case, something of a sham.


One may object that happiness, which is difficult to define, let alone to quantify, does not lend itself to calculation. But in other parts of the world—where public opinion polls and statistical surveys are conducted in the absence of royal fiat—researchers have made real progress in measuring this elusive state. The most ambitious effort is the World Values Survey, headed by Ronald Inglehart, Chair of the Institute of Social Research at the University of Michigan. Carried out in collaboration with a massive team of international social scientists and local polling organizations, the survey attempts to gauge the “basic values and beliefs of publics in more than 65 societies on all six inhabited continents.” The questions are broad, dealing with topics that range from family and gender roles to religion and politics. But happiness, or as the experts say, “subjective well-being,” figures centrally on the list.

The term itself makes clear that this is not exact science. Since the 18th century, in fact, when the English utilitarian Jeremy Bentham famously proposed his “felicific calculus,” social planners have searched in vain for a Holy Grail of happiness, a pure mathematics of pleasure. Bentham’s project melted down on the sticky stuff of its primary variable (how on earth to measure pleasure?), and no end of regression analyses has brought his successors much closer to the mark. The World Values Survey and those like it, by contrast, adopt a different tack. In effect, they do for entire populations what we do for our friends: They ask how they feel. “Taking all things together, would you say that you are very happy, quite happy, not very happy . . . ?” When refined to capture cultural nuances, and supplemented to emphasize life satisfaction over the long term, such questions can provide considerable insight into the happiness of peoples.
From the perspective of developing nations, the most interesting result of the Survey in each of its four major waves is the strong correlation between economic development and well-being, or GNP and GNH. In certain respects, this is just what one would expect: Richer nations, on average, report higher levels of happiness than poorer ones. Yet there is a curious twist. When average national income is plotted against average reported happiness on

a graph, the curve shoots up steeply, but then levels off around the $10,000–13,000 mark. At this point, a rise in income produces diminishing happy returns. That piece of data has given social scientists great room to speculate about what causes happiness to rise or fall beyond that point. Why, for example, do poorer Iceland, Denmark, Switzerland, and Norway consistently report higher levels of happiness than Japan, Germany, the United States, or France? Is it a question of income distribution and social services? Specific cultural and historical patterns, the evolution of democracy, closer family connections, better genes? The possibilities are endless, and so are the theories that argue for one over the other.

What is less in dispute, however, is that initially GNP and GNH are closely linked. As Inglehart observes, “The early stages of economic development seem to have a major impact on subjective well-being.” Given that, as almost all economists agree, globalization fuels economic growth in developing countries, it follows that for those below the $10,000 per annum mark, globalization and happiness go hand in hand. There is an odd irony in this. As the journalist Robert Wright has pointed out, measured in happiness terms, the developed world is arguably benefiting far less from globalization than the developing world—the very opposite of claims made by opponents of globalization. On the other hand, it is also true that getting to the $10,000 mark can involve its own set of costs: environmental damage, social dislocation, political instability. As any student of economic history knows, in the short term at least, these can be great.

For now, though, Bhutan has managed to avoid these upheavals, and will probably do so for some time. Just this January, the country signed a landmark agreement with the World Wildlife Fund and the Field Museum in Chicago to help ensure the preservation of its environment, and build a biodiversity museum.
This is to be applauded. Still, with an average annual per-capita income of $550, the Bhutanese people would probably be happier with less talk of happiness, and more cash. —Darrin M. McMahon

Inequality and HEALTH
We Americans are certainly richer, on average, than we were 20 years ago, but are we happier? Not according to The Health of Nations: Why Inequality Is Harmful to Your Health by Ichiro Kawachi and Bruce P. Kennedy (New York: The New Press, 2002). Today, Americans work longer hours than the citizens of any other nation, but still rack up record credit card debts, which pressure us to work even more. We have less time to spend with our families and friends, and less time to participate in community life. An epidemic of individual financial worries has added up to a general tendency to support politicians who favor reduced taxes and lower spending on the common good. As a result,


our public infrastructure is falling apart, including many of our roads and mass transit systems, our public hospitals and public schools. Therefore, contempt for the government continues to grow, and the nation becomes increasingly polarized along economic lines. At one end, four million people live in gated communities, while at the other, the U.S. prison population has grown sixfold since 1972, so that the U.S. now imprisons a greater percentage of its citizens than any other nation, including Russia, China, and Burma. At the root of all these ills, say Kawachi and Kennedy, is inequality of wealth and income, which has soared in America during the past 30 years. Inequality is not the same as poverty.


Those who live near the bottom of an unequal society may not be deprived of the necessities for a decent life, but they may feel deprived when they compare themselves to others. The effects of inequality manifest themselves largely through perceptions and behavior.

Over the past century, many economists have argued that the subjective effects of inequality are generally favorable: Inequality makes us all better off, because envy spurs us to work harder and be more productive. Kawachi and Kennedy dispute this. They say that rising inequality has only served to make the rich extremely rich, and the poor even poorer than they were, and all of us feel increasingly disadvantaged. The more other people have, the more the rest of us seem to want, and our own fulfilled desires beget new, ever more costly, desires. Increasingly aware of what we don’t have, many of us long for material goods that fail to satisfy more genuine human needs such as efficient public transportation, a clean environment, affordable health care, decent education—not to mention more spiritual needs such as love and a sense of community, which require investments of time rather than money.

The authors claim that inequality is not only creating a materialistic, selfish, lonely society, but damaging business too. They cite findings that baseball teams in which the players earn wildly unequal salaries tend to perform less well than those in which player salaries are more equal. American business may be similarly affected. Giant corporations reward bureaucrats rather than workers or researchers, and this also stifles creativity. These companies then compensate for declining creativity by merging together, laying off workers, and pouring ever more money into advertising, often far more than they invest in research. Could this tendency have contributed to the spate of recent scandals, in which so many struggling, bloated companies turned to devious accountants to increase profits?
The Health of Nations actually says surprisingly little about health. For years, the authors have argued that income inequality is harmful to the health of all of us, rich and poor, because it damages the immune system by contributing to stress and loneliness, and because unequal societies tend to invest less in health-promoting public goods, like pollution controls, public transportation, and public health care. However, the relationship between income inequality and health has been challenged by a number of authors (in particular Angus Deaton, “Health, Inequality and Economic Development,” Research Program in Development Studies Working Paper, Woodrow Wilson School of Government, Princeton University, 2001). A small number of studies have shown that child death rates are higher in some highly unequal developing countries, and there is anecdotal evidence that inequality contributes to the spread of HIV in sub-Saharan Africa, but for most causes of death, poverty itself is far more dangerous and important than inequality. Even in a rich country like the U.S., the health of the poor is threatened by the fact that 30 million people are hungry and 40 million lack health insurance. The uninsured often delay consulting a doctor until their health problems are beyond repair. The poor also tend to have more stressful lives, and thus they smoke more and find it harder to quit. High levels of stress hormones also seem to derange the normal metabolism of fat, and this contributes to higher rates of heart disease among poor people. The poor also consume bad diets, partly because nutritious foods, such as fresh vegetables, whole-grain bread, and fish, are far more expensive than frozen food, fast food, and canned drinks. The real-estate market in unequal cities pushes the poor into polluted, dangerous neighborhoods. Also, poor ancestors can cast an unhealthy shadow on their descendants: Rich people whose parents were rich outlive equally rich people whose parents were poor. This may partly be an effect of low birth weight, which is more common in poor families. Low birth weight predisposes people to a range of health problems in later life, including type 2 diabetes, heart attacks, and strokes. These factors all help explain the extremely subtle effect of social status on health. A study of U.K. civil servants found that men who lived in four-bedroom houses were healthier and lived longer than men who lived in three-bedroom houses. In most cases, the effects of poverty on health swamp the effects of inequality. But there may be exceptional cases in which inequality really is bad for your health. In the U.S., the only health problem that seems to be strongly influenced by inequality itself is homicide, which is more common in more unequal cities. The surge in alcohol-related deaths in Russia in the 1990s was also associated with rising inequality. It is interesting to note that, according to the World Health Organization, the fastest-rising causes of death in developing countries today include not only homicide, HIV, and alcoholic bingeing, but also other causes associated with risky behavior, such as road-traffic crashes, drug abuse, war-related casualties, burns, falls, and other accidents. Whether these trends are somehow connected to rising global inequality is unknown.
Because inequality is in some ways built into the American rags-to-riches ideal, it is hard to know what to do about the general trends Kawachi and Kennedy describe. Despite the damage that inequality is apparently doing to our society, Americans continue to support lower taxes for the wealthy, and smaller government for all. Revolution could not be further from the minds of most of us. And the rest of the world? While there is general agreement that inequality has grown in the U.S. in recent decades, there is debate about whether it has also grown elsewhere. Even if inequality in absolute financial terms isn’t increasing, the experience of it certainly is. Globalization has brought home to people in the remotest parts of the world how much they are missing. Not only do the media make people in developing countries more aware of how poor they are, but multinational corporations, skewed development policies, and corrupt governments have created islands of wealth in oceans of poverty. Could a growing sense of inequality and exclusion be contributing to the staggering recent increases in HIV infection, drug abuse, alcoholism, and other afflictions associated with risky behavior in developing countries? And how much inequality is the rest of the world willing to accept? —Helen Epstein

Language, Literature, and the Arts


Apocalypse Now and the Decline of the West
Bertelsmann is a firm whose business policy is, in keeping with the will of its founder, firmly marked by Christian principles. If anyone doubted the caritas of this megacorporation, started in Gütersloh, they had a chance to stand corrected recently. The publishing house Bantam Dell, an American Bertelsmann subsidiary, paid $45 million to secure the continuation of a series of Christian apocalyptic novels that have climbed to the top of best-seller lists in the U.S. The authors of the hit book series Left Behind are Tim LaHaye and Jerry B. Jenkins. Jenkins, who previously worked as a ghostwriter for the Reverend Billy Graham, is the writer, LaHaye the inspirer of these apocalyptic novels that are giving millions of Americans sleepless nights. Tim and Jerry are both veterans of assembly-line book production; to date, the new novels of “last days” have sold over 50 million copies. In Germany, Verlag Blanvalet, also a Bertelsmann subsidiary, will market them under the general title Die letzten Tage der Erde [The last days of earth]. According to biblical prophecy, the world’s end will be preceded by “the Rapture,” or the welcoming of the Chosen into the kingdom of God. The rest of us will remain on earth—“left behind.” The basic idea for the novel series came when frequent flier LaHaye took to wondering, during one particular flight, what might happen if the Rapture were to take place right then and there. Rayford Steele is the pilot of a Boeing 747 flight from Chicago to London. During the Atlantic crossing—just as he gives himself up to sinful thoughts of adultery—a stewardess bursts into the cockpit to report the disappearance of dozens of passengers. Left behind in the seats of those taken in the Rapture are their shoes and clothes, their eyeglasses and hearing aids. To Steele and a reporter who happens to be on board the flight, it becomes instantly clear that what has begun is the seven-year period that precedes the apocalyptic end of the world.
The enlightened reader feels compelled to smile at this remake of the Revelation of Saint John. One of the high points of the 27th annual convention of the American Atheists’ Association, besides a screening of the movie Blasphemy, was a lecture on apocalyptic kitsch. American critics ranked the film version of Left Behind among the 200 worst of all time. One waggish critic wrote that anyone who had seen this flick had already lived through apocalypse. More maliciously still, the Chicago Reader described it as a film George W. Bush might have scripted and even directed. But a contemporary individual afraid of flying, feeling betrayed by the stock market, and threatened by the incursions of natural disasters even into the temperate zones of earth, might well feel a little uneasy making fun of this present-day adaptation of the Apocalypse. He opens the Revelation of John and to his horror finds that the “four angels . . . which were prepared for an hour, and a day, and a month, and a year, for to slay the third part of men” (Revelation 9:15) have been “loosed,” of all places, on the banks of the Euphrates—in Iraq. Adepts in the school of prophets LaHaye has founded are initiated into the secret of secrets: They learn “how History will end.” In everyday America, new prophets decipher the warning signs of apocalypse: among them, society’s acceptance of homosexuality and the legalization of abortion. The dissolution of Western values, which America has upheld longer than Europe, heralds the world’s end. Two years before the start of the Left Behind series, an American millenarian began developing a computer program that would predict the precise beginning of the end of the world. The data for timing this were provided by Oswald Spengler’s Untergang des Abendlandes (The decline of the West, 1918). Tim LaHaye and Jerry B. Jenkins are far from such crazy algorithms; but their keen authorial pride in knowing “how the story ends” may remind European readers of the matchless arrogance of Spengler’s opening sentence: “This book will attempt for the first time to predict history.”

Beneath the decades of dust that has settled over it, even the title of Spengler’s book has come to seem barely readable. If one does reread it against the background of the present vogue for apocalypse, one’s reaction may be mixed. The reader may feel as Theodor W. Adorno did in 1938 in his essay “Spengler nach dem Untergang” (Spengler after the decline). On the one hand, Spengler, for Adorno, is not a prophet but a fortune-teller, one as sure of his material “as a hangman [is] once the judges have handed down their verdict. . . . Behind this gigantic, destructive soothsaying is the triumph of the petit bourgeois.” Triumphant (and, to boot, stinking rich) petits bourgeois—nothing could better characterize the authors of Left Behind.
On the other hand, Adorno marveled that Spengler, with his sergeant-like tone, finds no opponent worthy of giving satisfaction: “Forgetfulness functions as an excuse, an escape.” Spengler was right enough about the course of world history “for it to be startling, if one still recalls the prognoses. Forgotten, Spengler has his revenge, in that he threatens to be right still.” Grudging surprise over Spengler’s foresight compelled Adorno, in the second part of his essay, to distance himself emphatically from the stance and style of an author who had been not just a fortune-teller but rather a prophet, whose prophecies had a reality content that couldn’t fully be denied. Even if Adorno compared The Decline of the West to a warehouse that job-lots the dried fruits of reading from its culture’s bankrupt assets, he could not shake off his reluctant fascination. Today’s reader marvels no less over Spengler’s prognoses than over the disquiet that the speculative exploitation of the Book of Revelation can still elicit in millions of people at the start of the third millennium. Left Behind’s highly filmable updating of biblical apocalypse is eminently suited to deflect attention from today’s core problems. In the wake of this commercially geared updating, Spengler’s dusty book seems startlingly modern in many passages.

Spengler’s power, according to Adorno, is revealed not least in focusing his philosophy of history on the phase of “Caesarism.” Its chief characteristics—domination of the masses, propaganda, and mass art—mark the end-phase of political life. With the outbreak of World War II, Adorno in fact had to view “certain tendencies of democracy to turn abruptly into dictatorship” no longer as prophecy but rather as a premonition of political developments that were embodied in “Caesars” such as Hitler, Mussolini, and Stalin. Spengler, however, had thought further. He spoke of private economic forces that sought free rein to conquer vast fortunes: “No legislation should stand in their way. They want to remake the laws in their own interest, and to this end they avail themselves of the tool they themselves have forged, democracy, the bought party.” The result was the end of democracy and the implementation of Caesarism. For Adorno, Spengler had hit on something of lasting value that people would rather have consciously forgotten. Today we have again forgotten Spengler—again, precisely where he was right. For the prognosis that has long since become reality sounds as quaintly out-of-date as any wrong prediction.

Adorno described Spengler’s understanding of economic processes as helpless. Spengler would rail against the omnipotence of money “in the same tone as a petit-bourgeois agitator inveighs against the world conspiracy of the stock market.” Given the writing we’ve seen on the wall, which bears the names Enron and WorldCom, Spengler sounds prophetic when he writes that democracy destroys itself through money after money has destroyed the spirit: “One is weary to the point of disgust with the money economy.” The combination of political moroseness and disgust with the economy makes a dangerous cocktail indeed. But that isn’t the only reason the perils of Caesarism mount. Spengler underestimated how quickly the world would merge into a single world economy. Today we can glimpse the contours of an American empire that may be no less world-defining than the Roman Empire once was. Whether this new empire will be able to manage without Caesarism—that form of domination born out of the destruction of democracy—is the question that arises before the onset of Apocalypse. —Wolf Lepenies, translated by David Jacobson

The Crack Squad
The lead story of the March 9 Babelia, the literary supplement of Spain’s El País, was an obituary. It announced the extraordinarily long-predicted death of el boom, which swept Latin American literature beginning in the 1960s and made it synonymous with magic realism. But the piece wasn’t exactly a mournful farewell to the glory years. Rather, it declared that the Boom had been superseded by a new trend in Latin American literature. Its steady ascendancy over the last ten years was unexpected, given its unabashedly cosmopolitan style and ambition, something until recently atypical for the region. All the up-and-coming writers interviewed for the article concurred that their goal is to create a literature at home in the world at large—one that needn’t entice with layers of native color and whimsy, much less flourishes of magic realism, which, while refreshing and innovative thirty years ago, have grown feeble with mechanical repetition over the last decade. While novels like One Hundred Years of Solitude by Gabriel García Márquez justly rank among the greatest in the Spanish language, uninspired replicas seem to have overtaken all of literature, producing magic-realism outposts the world over. Contemporary Latin American authors writing in the new style believe it will serve to renovate and recover some great traditions of Spanish-language writing that were sacrificed at the altar of indigenous enchantment. The writers aspire to replace the formulaic ease of the Boom with the ambition and depth of novels such as Pedro Páramo by Juan Rulfo, Farabeuf by Salvador Elizondo, and La obediencia nocturna (Nighttime obedience) by Juan Vicente Melo.

El País’s report on the range and steady success of this new literature was long overdue. Although it is always difficult to pinpoint the beginning of a new literary trend, in this case the birth pangs are very well documented. Back in 1996 a now-defunct Mexican literary magazine agreed to publish a manifesto by five young local writers (Jorge Volpi, Ignacio Padilla, Eloy Urroz, Vicente Herrasti, and Ricardo Chávez). They identified themselves as el grupo del crack, a reference to the economic crash Mexico suffered as they came of age in the 1990s, and their manifesto called for the abandonment of “banana-republic literature” and the rebirth of Latin American writing. El manifiesto del crack, as it became known, caused a sensation in Mexican literary and intellectual circles. The five writers found themselves at the center of a scandal of national proportions, accused of no less than betraying their roots and rejecting 500 years of literary tradition. Although the crack writers hailed as their forebears such nationally respected authors as Juan Rulfo, José Revueltas, and Agustín Yáñez, Boom devotees refused to consider the new movement a project of renovation. The new Mexican authors were labeled vendepatrias (turncoats) by their peers and, disheartened, turned toward Europe.


And in Europe el crack succeeded. In 1999 the Barcelona publishing house of Seix-Barral, an institution of tremendous cultural prestige, decided to revive its famous Biblioteca Breve Prize. The award was created in 1958 to spur on young Spanish-language writers to join the literary renewal then taking place in Europe. It was this spirit of renewal and the authors to whom the prize was awarded (among them Mario Vargas Llosa, Carlos Fuentes, Vicente Leñero, and Guillermo Cabrera Infante) that launched the original Boom. In 1972, after 14 years of nurturing a generation that would shape the Hispanic literary world for decades, the publishing house decided to discontinue the prize. When it revived the Biblioteca Breve in 1999, Seix-Barral continued to honor the tradition of innovation, and awarded the prize to crack member Jorge Volpi for his novel En busca de Klingsor (In search of Klingsor). Unlike the prizewinning novels of the 1960s, whose writers tackled universal questions while imbuing them with the scenery and the aroma of Latin America, Klingsor broaches scientific and philosophical questions in a European setting. So, in spite of this official cultural imprimatur, skeptical voices complained that no reader could tell from the novel that Volpi was Mexican. After decades of magic realism, who would expect a Mexican storyteller to credibly explore the moral dilemmas of nuclear physics, war, and politics through the characters of Niels Bohr and Werner Heisenberg during the Second World War? The members of the crack group did expect this, of themselves and of a whole generation of writers in the region. And the public and critics agreed. At the 1999 Hamburg Book Fair, translation rights to En busca de Klingsor were sold to prestigious publishing houses in the U.S. and Europe.
Nor is Volpi’s success isolated: Ignacio Padilla won the Primavera Prize in Spain in 2000 for his novel Amphitryon, which grapples with the issue of individual responsibility during war by following, through a series of chess games, the lives of three different men in Europe during the First and Second World Wars. Since Klingsor, the revived Breve Prize has consistently gone to crack-like novels. And other noncrack Latin American authors have been making the news with groundbreaking fiction, such as the Argentinian Rodrigo Fresán, whose latest novel Mantra has been hailed as the “great Mexican novel” no Mexican had yet written. Yet voices decrying the abandonment of the native and the colorful have found an echo. The vehement defense of the Boom is perhaps rooted in a fear that once novelists pose universal questions without the trappings of the quaint and the fantastical there will no longer be an avenue to convey Latin American culture to the world. This doesn’t mean that crack novels are not here to stay, just that they might have a harder time becoming the hallmark of their generation than the Boom novels did in the ‘70s, the age of radical chic. With anti-globalization feeling spreading in the Latin American and European intellectual and literary circles that make up their essential readership, it might be harder for crack novels to engage the world and not be damned for it. —Naomi Daremblum

“The Russian Language Is Our Wealth”
For the past year a group of State Duma lawmakers have been informing the public of their decision to protect the Russian language from expert reformers—demonstrating, in the process, what a bad precedent the American/Western example of obsessive legislation can set for Russia. The aim of this recent linguistic reform—the first in nearly half a century—is to simplify and modernize Russian orthography, reduce the number of grammatical exceptions, and officially allow use of contemporary words such as “Internet.” The many additions since 1991 have, unsurprisingly, favored English-language business and technical terminology, as IT churns out the jargon of “byte,” “hacker,” and “www” faster than any Russian equivalent can be found. The reform, which would amend Russia’s last language rulings, passed in 1956, must also decide, for instance, whether the word “god” should begin with a capital letter, a usage banned in the atheistic Soviet Union. In short, the goal is to supplant once-autocratic language with the new realities of democracy. But not so fast! Linguists in charge of the reform have met with resistance, from Duma lawmakers, ordinary Russians (mostly of an older generation), and even the media, which have dubbed the new measures “idiotic.” The First Lady, Lyudmila Putin, a linguist by training, has also felt compelled to join in the debate. Nothing new here, scholars would argue: Initiatives to control the language recur throughout Russian history.
Boris Yeltsin sought to replace russkie with the more pompous, unwieldy rossiyane; Stalin weighed in with “Marxism and Questions of Linguistics”; while Leonid Brezhnev was known more for mangling Russian (as well as geography) than protecting it, for example, confusing the pronunciation of “Afghanistan” and “Azerbaijan.” For much of the 18th century a linguistic feud raged between those seeking to revitalize Russian and thereby bequeath it its own literary style, and those insisting on a return to Church Slavic (a medieval adaptation of Old Church Slavonic used in Slavic churches) as the root of all Russian words. Today’s discussion over whether “default” should be defolt or the ungainly nevypolneniye obyazatelstv is merely the latest round of the debate, which is as fierce as ever. There are even advocates of punishment for those who violate the language’s purity or who swear in public. What kind of punishment remains to be seen: being forced to write out irregular verbs 500 times, or having the mouths of cursing politicians or postmodern authors washed out with soap? Or should writers who fail Pushkinian standards of usage, such as Victor Pelevin, become the target of a “writers’ campaign”? In a drive against “degenerate literature,” which the extremely popular Pelevin incarnates, the Russian Writers Union encourages its readers (familiar echoes of the “socialist realism” resolutions!) to exchange his debasing books for the more “wholesome,” linguistically acceptable Russian classics.

In a democracy, books that don’t meet the state’s standards can no longer be banned and their authors cannot be sent to Siberia, but they can be argued against or legislated against if there is enough public consensus. Creating such a consensus can even be a PR coup. And that’s where the linguistically credible, homey-looking First Lady comes in. She’s not her husband, after all: No one can accuse the demure Lyudmila Putin of being Stalinesque. “It is too early to launch a language reform,” she calmly and authoritatively argued at the Saint Petersburg Congress on the Russian Language in April. “The reform proposed by the Academy of Sciences is pure conjecture.” If the influx of “degenerate literature” and foreign words irks the First Lady, Duma deputies, and the older generation of Russians, it is, some experts explain, because “[d]isturbing books and spelling reforms fare badly when the country’s situation is stable. People consider all change a violation of their private life.” As Mrs. Putin put it recently at a business breakfast with the Rossiyskaya Gazeta, “With the economy in bad need of patching up, with teachers being paid meager wages, [. . .] such a reform is not needed.” To prevent Westernization from flooding Russia all at once, she is using conservatism, if not linguistic and cultural stagnation, to counterbalance her husband’s political and economic Westward drive, where speed and competition are musts.

But whether we like it or not, the new lexicon “invading” Russian today is probably the largest since Russia’s conversion to Christianity a thousand years ago, when a whole wave of new concepts found expression by pinching words from other Slavic languages. When and to what extent language should embrace change has occupied many great Russian minds. The peasant academic Mikhail Lomonosov (1711–1765) devoted years to the formation of the Russian language; writers like Nikolai Karamzin in the late 18th century adapted entire French and German phrases in their literature; and Alexander Pushkin largely resolved the linguistic questions of his day through sheer genius (although he did advise women to speak French, lest Russian’s complexities overheat their delicate brains). All were indeed experts, similar to the modern Academy of Sciences philologists who desperately try to push the necessary and timely reforms through. Since Russia is no longer an autocratic society settling all matters through the will of one man, the State’s interest in language is welcome. But aiding in cultural matters in today’s busy market economy should not mean policing them. It evokes bad memories; besides, neither Yeltsin nor anyone before him will be remembered for the brilliance of their linguistic input. The smart money has it that this latest group will fare no better: In her rare speaking engagements, the Russian First Lady has not been on the level of Cicero; quite frankly, not even of Uncle Joe. —Nina L. Khrushcheva

Those who have studied French but haven’t been in France for a while may find themselves confused when they overhear conversations that sound familiar but remain largely incomprehensible. Gradually they may realize, or some kind soul may explain, that what they are hearing is a popular slang called Verlan, in which standard French spellings or syllables are reversed or recombined, or both. Thus the standard greeting “Bonjour, ça va?” or “Good day, how are you?” becomes “Jourbon, ça av?” “Une fête” (a party) has become “une teuf”; the word for woman or wife, “femme,” has become “meuf”; a café has become “féca”; and so on. The word Verlan itself is a Verlanization of the term l’envers, meaning “the reverse.”


Within a couple of decades, Verlan has spread from the peripheral housing projects of France’s poorest immigrants, heavily populated with Africans and North African Arabs, and gained widespread popularity among young people across France. It has seeped into film dialogue, advertising campaigns, French rap and hip-hop music, and the mainstream media. It has even made it into some of the country’s leading dictionaries. A language of alienation that has, paradoxically, also become a means of integration, Verlan expresses France’s love-hate relationship with its immigrant community and has begun to attract a number of scholarly studies. “Speaking backwards becomes a metaphor of opposition, of talking back,” writes Natalie Lefkowitz, a professor of French applied linguistics at Central Washington University in Ellensburg, Wash., and the author of Talking Backwards, Looking Forwards: The French Language Game Verlan (Gunter Narr, 1991), which, when it was published, was one of the first major studies of Verlan. But along with its subversive element, Lefkowitz explained in an interview, “for the young urban professional, Verlan is a form of political correctness expressing solidarity with and awareness of the immigrant community at a time of anti-immigrant politics.”

The first documented uses of Verlan date to the 19th century, when it was used as a code language among criminals, said the French scholar Louis-Jean Calvet. But the current and most widespread use of Verlan has its origins in the growth of France’s banlieues, the peripheral areas outside major cities, where the government built high-rise housing for its immigrant-worker population after World War II. In the 1960s and ’70s, many North African workers were joined there by their wives and families. “This housing that was supposed to be temporary, and was built intentionally apart from the mainstream society, became permanent,” said Meredith Doran, an assistant professor of French applied linguistics at Penn State University, who recently finished a dissertation on the culture and language of the French banlieues. Verlan caught on among the second generation of immigrants who were living between cultures. “They were born in France and often did not speak Arabic,” Lefkowitz said, “but they did not feel integrated into France.” Doran explained, “Verlan was a way of their establishing their language and their own distinct identity.” The term beur, which is a Verlanization of the French word Arabe, refers specifically to the second- and third-generation North Africans. Until recently, there was even a radio station of French North Africans called Radio Beur.
“Verlan has many functions,” writes Vivienne Méla, an anthropologist who teaches at the University of Paris VIII, in a recent article called “Verlan 2000.” “Initially, it was a secret language that allowed people to speak about illicit activities without being understood. And while Verlan conserves this function, its principal function is for young people to express both their difference and their attachment to a French identity. They have invented a culture that is in between the culture of their parents, which they no longer possess, and the French culture to which they don’t have complete access.” Verlan, however, is also widely spoken by the other immigrant groups of the banlieue, mainly sub-Saharan Africans and Caribbean blacks. And Verlan, along with reversing syllables and words, has also incorporated terms from Creole, Arabic, Rom (the language of the Gypsies), and American slang to create a kind of speech of the disenfranchised. “Verlan serves as an interface between these different groups who do not have a common language,” said Alain Rey, one of the editors of the Petit Robert, the first of the standard dictionaries to incorporate a number of Verlan terms.

More than just reversing words, scholars say, Verlan reverses what have traditionally been regarded as negative qualities in France—ethnic and religious differences, non-French identity, nonstandard speech—and turns them into positive attributes that are consciously cultivated. “In a country obsessed with linguistic purity, it turns a stigma into a positive emblem, a form of covert prestige,” Lefkowitz said. Verlanizing words, she and others say, changes their tone and meaning. “When you say téçi for cité, it is a way of expressing affection, like saying homeland,” she added. Verlan, in the views of Rey and others, is also a playful way for the French to forge a language for dealing with ethnic, racial, and religious differences. The Verlanized words for Arab, black, or Jew “allow you to mark racial and cultural differences without insulting people,” Lefkowitz said. But Leyla Habane, a Moroccan-French university student who provided research assistance when Doran was working on her dissertation, is leery of that interpretation. “I think these terms can be pejorative in any form,” she said, although she admitted that they could also be used playfully. Perhaps because it has been so widely adopted by most French, she finds the term beur offensive. But there is no question that Verlan is used to discuss race, ethnicity, and other taboo subjects. In one recent study, the French scholars Jean-Luc Azra and Véronique Cheneau, both of the University of Paris VIII, documented about 350 Verlan terms, which tended to be clustered around a handful of subjects: illegal activities like theft and drugs; race, ethnicity, and national origin; and taboo topics like sex, as well as everyday objects on the street and in the subway. Verlan was discovered by mainstream French in the 1980s after a series of major riots and confrontations with police brought the problems of “les cités” to the attention of most French. “These riots put a spotlight on the youth subculture of the banlieue, and that’s when everybody noticed that these youths had this language of their own,” Doran said.
A series of books and films about life in the banlieue followed. The 1995 movie La Haine (Hate), about the lives of three housing-project friends, with much of its dialogue in Verlan, was a revelation to many French, though some found parts of it incomprehensible. Also very popular was a film thriller called Les Ripoux, a Verlanization of the French word pourri, meaning rotten. Ripoux has become a common term for corrupt police officers. Verlan became so popular that even former French President François Mitterrand showed off his knowledge of it during a television interview several years ago. When he was asked whether he knew the word chébran (Verlan for branché, which means “hip” or “trendy”), he answered, of course, but added, “That’s already passé; you should say cablé,” which literally means “wired for cable,” but means “plugged in” or “with-it” in current slang. Lefkowitz explained: “There are now different kinds of Verlan. There is the Verlan of the original group, the working-class immigrants from the banlieue. Then there is the Verlan of the urban professionals, bourgeois Verlan or ‘Verlan geoisbour.’ There is also the Verlan of the teenagers who use it to distinguish themselves from the adult world as a game and a form of amusement.” The appropriation of Verlan by mainstream French culture
is viewed with some uneasiness by those in the banlieues. “They find it annoying,” Habane said. “They feel it is their language, and now they want to take this from us, too.” As a result, Verlan keeps renewing itself, so that the speech of the housing projects stays a step ahead of geoisbour Verlan. Many terms have also been “re-Verlanized.” Beur, Habane said, now that it has been widely adopted by the French, is sometimes seen as pejorative, with many North African speakers using the term reub, which is beur itself turned inside out. As a Frenchwoman of Moroccan descent pursuing a university degree, Habane expressed mixed feelings about Verlan. “I worry that it creates a kind of linguistic gap between these young people and the rest of the world that can become a trap,” she said. “When I speak to some kids in my neighborhood, they often don’t understand me.”

And while the emulation of Verlan and banlieue culture might be flattering, she worries about recent polls showing that a majority of French feel that there are too many Arabs in the country. Whatever the case, Verlan has made its mark on the language, said Rey, the lexicographer. “Many of them have become so common, they are not even thought of as Verlan,” he said, and their proliferation in newspapers and novels has forced Le Petit Robert to include many Verlan terms in its most recent editions, to the annoyance of purists at the Académie Française, whose dictionary has resisted. “We feel that a dictionary should reflect the language that is actually spoken,” Rey said. “Besides, I think, on balance, there is much creativity in Verlan, and it shows that the French language is very much alive.” —Alexander Stille

The Fabric of Society
Marco Polo introduced the word Madagascar to the medieval languages of Europe by referring to the island in his famous book of travels. He accurately depicted it as one of the world’s largest islands, but fantastically populated it with Saracens and a fauna of elephants, tigers, and camels. On the lemurs, Madagascar’s emblematic primates, he was silent. The word “lemur,” in fact, has a much older pedigree: The ancient Romans used it to name the spirits of the dead. On the poet Ovid’s authority, we know that every mid-May all Roman households would placate the wandering spirits of dead relatives by leaving offerings of black beans at their tombs. This ritual spared the living mischief while helping the dead on their journey toward Lemuria, a mythical land somewhere in the Indian Ocean. In modern times, when the Swedish naturalist Linnaeus had to choose a name for the primitive sub-family of primates found only in Madagascar, and eerily resembling humans, he cleverly called them lemurs, recasting old Roman beliefs into the idiom of natural history. Madagascar still carries the burden of these evocative Western associations. Outsiders persistently tend to see it as “an island out of time,” a Noah’s Ark of biodiversity, the habitat where several species of lemurs do indeed survive; we hear far less often of the island’s people, divided into more than twenty ethnic groups, and of their history and culture. “Gifts and Blessings: The Textile Arts of Madagascar,” an exhibition mounted last year at the Smithsonian’s National Museum of African Art, aimed to remedy this. This first comprehensive historical survey of the island’s extraordinary hand-weaving traditions focused on cloth because of its importance for the Malagasy. “Cloth is something miraculous for them,” says anthropologist Sarah Fee, the show’s guest curator. “It is a cultural artifact that establishes bonds between men
and women, sovereigns and subjects, the living and the dead, locals and strangers.” Even today, in busy Antananarivo, the nation’s capital, which is nestled in the highlands among emerald fields of rice and rolling hills and increasingly crowded by migrants from the coasts and the south, most inhabitants still cover themselves in a lamba, a rectangular wrap worn as a shawl or around the hips. “It is cloth that makes people,” runs an island adage. In traditional Malagasy terms, a person not clad in a lamba is not a person, but an unknowing and asocial being. The prescribed ritual giving of cloth creates or cements kinship ties. “Kinship is like spinning silk,” says another popular Malagasy proverb, “even if it breaks it can be repaired.” The two most important ritual occasions for giving cloth are weddings and funerals. Among the Tandroy of Southern Madagascar, when a man wants to marry, he offers a lamba to his intended. If for some reason the couple later decides to divorce, the man takes back the cloth, thereby ending the union in the eyes of the community. A village funeral is always a momentous affair. Among the Tandroy studied by Fee, the ceremony takes months if not years of preparation. On the appointed day, relatives and extended family members stream down country roads, herding before them the livestock they bring as a gift. But lambas are the main offering to the deceased. All the accumulated pieces are wrapped around the coffin and borne with it to the family’s forest tomb. Among the Merina of the highlands the ancestors are celebrated and appeased in ceremonies called famadihanas. Because of their expense, they take a long time to prepare. The family that hosts a famadihana must be ready to feed the hundreds of relatives and friends that will attend. During the ceremony the remains of ancestors are
removed from the tomb and wrapped in new lambas. As Fee explains, this is a gift from the living to the dead, in exchange for which they hope to receive their ancestors’ blessings. One Malagasy myth traces the origin of cloth to a kokolampo, a forest spirit who teaches a woman to spin, dye, and weave. Today most lambas are cut from factory-made fabrics, but until about 50 years ago, boldly striped, vividly colored cloths were handwoven by the island’s women from indigenous fibers—cotton, raffia, banana stem, palm and hemp, as well as reeds and spun tree bark. Among Madagascar’s artists currently reinterpreting the lamba, one of the most remarkable is Zoarinivo “Mme. Zo” Razakaratrimo, who incorporates such eclectic objects as newspapers, Super-8 film strips, and aromatic herbs into her creations, uniting, in her words, “the beautiful with the ordinary.” When pulled over the head, a lamba can express shyness or modesty. Secured around the body, it denotes action and determination. Worn by Merina girls on the left shoulder it indicates acceptance to their suitors; on the right, rejection. They are used as hip wraps in the rice fields and to carry babies on mothers’
backs. There is a whole mundane language expressed by the lamba beyond its more sacred role of defining what is human and serving as a gift to the ancestors (an ambivalent gift, Fee suggests, because by wrapping the remains of the dead in ever-smaller, tighter bundles, the living are also crushing and containing their power). The central pieces in this Washington show were two intricately woven silk textiles sent by Madagascar’s Queen Ranavalona III to U.S. President Grover Cleveland in 1886. During the 19th century, the U.S. was Madagascar’s main commercial partner. When the queen saw her island kingdom menaced by the imperial pretensions of France, she characteristically sought to cement an alliance with the American president through a gift of cloth. Such a gesture, in the language of Malagasy ritual, attempts to bind the recipient of the gift into a kinship network and a circle of rights and obligations. This symbolism was apparently lost on Cleveland: The U.S. pursued a non-interventionist foreign policy. In 1896 a French colonial army invaded Madagascar, exiled the queen, and incorporated the island into the French empire. —Edgardo Krebs

The Return of
South Korean cinema has undergone a major revival in recent years, recapturing a domestic audience and meeting with phenomenal success elsewhere in Asia, particularly in Japan, Taiwan, Hong Kong, and Vietnam. At the start of the 1990s, Korean movies represented a dismal 20 percent of the domestic market; by 1999, domestic films had climbed to 38 percent of films viewed, and in 2001, domestic films reached an unprecedented 50 percent market share. The most popular film of last year, Friend (Ch’ingu), was seen by over eight million people—the best-selling Korean film of all time. In fact, eight domestic films attained audience levels of well over two million each last year. This makes South Korea one of the few countries where locally made films are favored over Hollywood. What can explain such a rapid revival? A new generation of filmmakers, perhaps: The majority of feature films released in the last five years were made by first-time directors. Only about half a dozen directors who began making films before the 1990s continue to be active today. Im Kwôn-taek is the only director over the age of 60 still making movies. None of the 30-something directors trained under the old Chungmuro apprenticeship system that dominated the industry through the 1980s. Instead, many attended Western art or film schools and look to Hollywood and European directors for inspiration. With little enthusiasm for creating a national cinema, given its associations with the country’s authoritarian history, these filmmakers are framing their own vision of filmmaking, often in genres popular in the West—among them,
big-budget action movies, horror movies, art pornography, documentary-type work, and independent short films. Yet the prediction that Korea would follow Iran, Hong Kong, and Taiwan as the next great “new” international cinema center has largely failed to come true. And the reasons for this global failure may be closely related to its local success. Korean cinema has avoided a recognizable national style, and its reliance on Hollywood techniques and motifs from French or Russian cinema strikes many Western viewers as derivative. Western film critics fault Korean films for lacking a distinct style or vocabulary of images and themes that distinguishes them from Japanese or the various Chinese cinemas. At times dismissed as imitation Hollywood or just plain bizarre, the recent smattering of South Korean films released outside Korea has failed to find a comfortable niche in the global market. Some of this also has to do with the historical invisibility of Korea, often overshadowed by its more powerful neighbors. A brief history of the Korean film industry helps explain why the current generation of filmmakers has little interest in working within the scope of a national cinema, even if it provides a marketing tool for foreign promotion. Rather than give national definition to South Korean films, young filmmakers would prefer to win back a domestic audience that for some thirty years abandoned local films for Hollywood’s. Even foreign cinephiles may be surprised to learn that Korea had a thriving film industry from the mid-1920s, and that the mid-1950s through the mid-1960s is called the Golden Age of South Korean film. The most successful directors
during this decade—Shin Sang-ok, Yu Hyôn-mok, and Kim Ki-yông—made innovative features that captured the impoverished society after the Korean War and experimented with then-popular European avant-garde techniques. These directors gained domestic acclaim as well as international success, and received prizes at major film festivals during the 1960s (more recently Shin, Yu, and Kim have been rediscovered in film retrospectives in Korea). The Korean film industry was particularly active from 1926 to 1935, with 80 films a year being produced by over 40 production companies. But the Korean War (1950–1953) severely crippled the industry: Only 15 films were made in 1955. By 1959, however, the South Korean film industry had surpassed its former output, with annual production leaping to 108 films. In the late 1960s, production peaked at more than 200 films a year; the average viewer in 1968 saw six movies, and favored melodramas, which comprised 75 percent of the films made, along with comedies, historical films, thrillers, and action movies. Business-savvy directors like Shin Sang-ok had their own production companies. During the apex of filmmaking, 75 production companies and independent filmmakers competed for market share in South Korea. But the heyday of South Korean cinema gradually waned with the 1963 coup d’état and authoritarian rule, which did not end until 1987. The Motion Picture Law passed in 1973 resulted in massive state interference at all levels of filmmaking. The film industry was subjected to a state reward system that limited the number of production and distribution companies to 20 and strictly enforced quotas that tied the lucrative importation of foreign movies to domestic film production; but it was also hampered by severe censorship laws that banned political discussion and sought to regulate moral and social content.
Art films and social realist films were deemed subversive, sexually explicit scenes were prohibited, and even comedies were scrutinized for social criticism. Many formerly successful directors, actors, and comedians were hounded out of the industry on trumped-up charges. Throughout the 1970s and 1980s, the state intervened in film production with a view to creating a national cinema. Particularly in the 1980s, the South Korean government encouraged the visibility and export of Korean culture to improve the country’s poor image abroad. Film was designated an invaluable tool for enhancing international prestige, and South Korean directors were honored and financially rewarded if their films were screened at international film festivals. Domestically, however, audiences had begun turning away from locally made movies, whose quality they found poor and uninspired. Within a decade after 1969’s peak figure of 173 million moviegoers, the audience for Korean films had plunged to 65 million.

One paradoxical outcome of censorship was that filmmakers increasingly resorted, within the limits of the law, to sex scenes that often featured top actresses being brutalized, in order to attract audiences. During the 1980s, while political commentary remained strictly monitored, the ban on sexual content was loosened. South Korean cinema now became internationally known for the ubiquitous degradation of women on screen. By the end of the 1980s, when the authoritarian regime was finally overthrown by a pro-democracy movement, the state’s attempt to create a national cinema had stifled artistic freedom and almost destroyed a once-thriving film industry. Though not “national” in style, the recent crop of films applies familiar, generally foreign techniques and genres to local settings, and nonetheless grapples with social and political issues salient to South Koreans. Shot within national borders, they necessarily bear a “Korean” stamp that will have a different meaning for the domestic viewer than for those who live outside the country. The disconnection between the films most popular in Korea and Asia and those distributed in the U.S. is striking. Many of the most popular films in South Korea during the last decade have continued to be love stories like Contact (Chôpsok, 1997), A Promise (Yaksok, 1998), Art Museum by the Zoo (Misulyôp tongmulwôn, 1998), and Christmas in August (1998), which was a big hit in Japan as well. Even more successful with audiences, the politically topical thrillers about North Korean espionage Shiri (1999) and Joint Security Area (JSA, 2000) became mega-blockbusters in South Korea and in Hong Kong, Taiwan, and Japan, where the threat of North Korean spy ships also makes news. None of these films, with the exception of Shiri, released in the U.S. this year, has been distributed outside of Asia. Distribution, indeed, remains the single biggest obstacle for Korean films.
In contrast to the South Korean film boom in Japan, in North America and Europe Korean cinema is almost entirely relegated to the film-festival and art-house circuit. The few films that are distributed in the U.S. constitute a strange representation of South Korean filmmaking today. Im Kwôn-taek’s Chunhyang (2000) became the first Korean film to enjoy regular distribution in the U.S. Im’s filmic adaptation of a popular folktale is an understandable choice, because Chunhyang makes Im an “auteur”—but one whose pristinely traditional images, mountain scenery, and pansori music soundtrack are exotically Korean—and because it was the first Korean film selected for the main competition at Cannes. But what about the other distributed films? Features such as Lies (2000) and The Isle (2000) are arguably the most sexually explicit of the films released in the last two years, testing the country’s boundaries of film censorship in their fascination with perverse sexuality and the interplay between sex, violence, and mutilation. But
the apparent common denominator among U.S.-distributed Korean films is a preponderance of female characters who are subjected to degradation and/or are themselves perpetrators of violence. Even in Im’s folktale, the tortures that the virtuous Chunhyang endures are not unlike the masochism demonstrated by the modern-day women in Lies, The Isle, and Shiri, except that those women strike back. This is really not representative of recent South Korean films. In fact, one of the most salient features of the new cinema’s individualistic approach to filmmaking has been its refreshing treatment of women, sexuality, and social relations. The late 1990s, indeed, marked a coming of age for women in the industry. In the past six years, no fewer than three women have debuted with well-received feature films, and countless more have produced documentaries and short films to critical acclaim. This is an astounding increase in the number of women active in the film industry, given that before the last decade only five films had been directed by women. The film industry’s first women’s association was launched in 2000 with 220 members, and it is reported that half of the workforce in sound and stage lighting is now female. Three female producers are recognized as being among the most powerful in the industry, with such hits to their credit as Attack of the Gas Station (Chuyuso sûpgyôk sakôn, 1999), Foul King (Panch’ik wang, 2000), Happy End (1999), and JSA (2000). Among the important new films by women directors are Im Su-rye’s Three Friends (Se ch’ingu, 1996), Lee Jeong-hyang’s popular romantic comedy Art Museum by the Zoo (1998) and her recent mega-hit The Way Home (Chipûro, 2002), and Jeong Jae-eun’s debut film, Take Care of My Cat (Koyangirûl put’akhe, 2001). In all these movies, female and male characters relate to each other on screen as companions more than sexual objects. This avoidance of stereotypes extends to the portrayal of men and children too.
The best of the recent films produced are richly inventive in subject matter and keen to examine the complexity of social relationships. This, I think, is the real breakthrough of the industry today, as South Korean filmmakers struggle to capture the vast potential of the human subject, while creating their own visions of the world through filmmaking. —Helen Koh


New Intellectual Landscape
Before 1989, there was widespread consensus among Chinese intellectuals. Ever since the famous debate about the young Marx’s humanism, which was initiated in the late 1970s by Wang Ruoshui (then the chief editor of People’s Daily) and helped free intellectuals from rigid dogmatic Marxism, most Chinese intellectuals had gradually come to accept that modernization in its broadest sense was a necessary and desirable goal. They also assumed that tension among the different aspects of modernity—the market system, scientific positivism, technological advancement, and liberal democracy—would naturally disappear as modernization progressed in China. The intellectuals’ responsibility in this process, they also thought, was to enlighten the common people, as well as those officials imprisoned in the outmoded ideas of dogmatic Marxism and ingrained Chinese traditionalism. “Liberation in Thought,” or the New Enlightenment, became the major political task of the intellectuals, and was closely linked to the whole enterprise of economic and political reform led by Deng Xiaoping. The close relation between the New Enlightenment and reform did not narrow intellectual exploration in China, however. Instead it opened up the whole question of “civilization” (wenming). The success of modernization, it was thought, depended on achieving a more comprehensive horizon, one by no means confined to practical politics. As a result, the total problem of “civilization,” or culture, dominated all the important disputes among the intellectuals in the 1980s, even in such fields as law, political science, and economics. The Chinese Culture Academy, founded by Tang Yijie, a Chinese philosophy professor at Peking University, was an influential example in this respect. It sought an understanding of Chinese and Western culture from a comparative-historical approach and revived the old problem of culture prevalent in China at the beginning of the 20th century.
Another sign of this orientation was the highly influential translation series “Culture: China and the World,” which began in the late 1980s and remains very important today. This series links the old problem of Eastern versus Western civilization with a new reflection on the nature of modernity by introducing classics of Western thought to China. Many intellectuals of the older generation who had taken part in the New Enlightenment movement had an optimistic picture of the West, while the young generation had begun to adopt the West’s self-critique as the starting point for understanding it. The translations selected by the series editors (Gan Yang, Su Guoxun, and Liu Xiaofeng, among others) revealed this: Martin Heidegger’s Being and Time (more than 50,000 copies of the first version of this translation were sold in one year), Walter Benjamin’s work on Baudelaire, selections from Nietzsche, and Max Weber’s The Protestant Ethic and the Spirit of Capitalism. It is striking that these critiques of Western modernity, rather than classical figures of the modern tradition such as Locke, Adam Smith, Rousseau, and Kant, were chosen by younger intellectuals seeking profound social change in China. The same tendency could be found in some of the more important books published by young Chinese thinkers themselves. Poetic Philosophy (1986), the first work of Liu Xiaofeng, a rising star of the young generation, examined the thought of German Romanticism, Nietzsche, and Heidegger. His next book, Delivering and Dallying (1988), was more controversial. It employed Christian theology to criticize the Chinese cultural ideal of “dallying” (xiaoyao, wandering boundless and free) and became the focus of many related disputes. Gan Yang’s famous article “From Rational to Cultural Critique” (1987), which appeared in Reading, the most influential public intellectual journal since the 1980s, made use of European thinkers like Hans-Georg Gadamer and Paul Ricoeur to advocate for a “Chinese cultural consciousness.” In part, this turn from practice to “civilization” represented a politics of nonpolitics and was meant to cultivate an implicitly autonomous community of intellectuals. Since, it was thought, the Chinese problem could not be solved at the level of politics alone, only an independent community of intellectuals could cultivate the mature thinking needed to decide China’s fate. However, this lofty perspective did produce some unintended political thinking and strategies, which often paid too much attention to full-scale transformation from above while ignoring the complicated mechanisms of China’s more earthbound political and social reality. This approach would soon confront harsh challenges and expose its inherent defects in the Tiananmen crisis and its aftermath. Yet in the 1980s, there was already an implicit tension between ideas about the revival of Chinese civilization and nation-state building. Wang Jiaxiang and Xiao Gongqin, influenced mainly by Samuel Huntington’s Political Order in Changing Societies (at least two Chinese translations were published between 1987 and 1989), suggested that an enlightened authoritarian government, though itself unjustified, could make use of its powerful central authority to promote reform. This so-called neo-authoritarian school did not oppose the ideal of modernization, but thought it could be realized only with nonmodern or even anti-modern means. But the most important distinction between this school and the mainstream intellectuals lay in their emphasis on the specificity of Chinese reality and tradition. They believed that this specificity determined the real course of Chinese reform. Some themes of this debate were taken up again in the debate between the liberals and the New Left in the mid-’90s.
The recent split among Chinese intellectuals along these lines arose from their reflection on the pro-modernization tendencies of the ‘80s generation. The latter’s optimistic but naïve attitude toward the West became a target, as did the concept of “civilization.” Instead, a complex and plural conception of modernity took the place of a simple and harmonious one. Most intellectuals came to understand that even if China still wanted to become a modern country, she had to choose among often-conflicting modern goals. In other words, the problem was not whether China should be modern, but what modernity she should strive for. While the critique of the modern West was confined to some elite circles of intellectuals in the 1980s, it now became the popular topic of the culture industry and had widespread popular resonance. While the earlier critique of the West was often accompanied by a self-critique and seldom used as a weapon to defend the established regime, the new stress on the complexity of modernity was far more ambivalent. In order to understand the failure of the original project of modernization in the 1980s, Chinese intellectuals, especially those who had academic backgrounds in social science and cultural studies, began to pay attention to the robustness of the established Chinese regime. It was against this background that the debate between the
liberals and the New Left emerged. In comparison with the lofty themes widely discussed in the ’80s, the new opposition centers on more concrete issues such as the market system, mass democracy, foreign policy, and national identity. The liberal camp is represented by Liu Junning, a major political scientist who insists on the fundamental applicability of the Western model to China and especially stresses the primacy of freedom as embodied in the market, property rights, and freedom of speech. For members of the New Left, such as Wang Hui and Cui Zhiyuan, these emphases are misplaced. Economic problems are the main source of the discontents that have fueled the New Left. Social injustice and corruption, inevitably following from partial economic reform without political reform, are regarded by many intellectuals as the necessary results of capitalism. Some even assert that China is no longer a totalitarian country, but more capitalistic than many Western ones, especially in its ruthless social differentiation between rich and poor, the lack of social justice and social security, and the hidden alliance between wealth and power. Though the New Left does not really believe that a return to a Maoist regime would work, they definitely do want to create a regime quite different from Western “capitalism.” Liberal intellectuals reject these criticisms of capitalism as misleading because they overlook the more fundamental problems of contemporary China. In their view, only a more liberal regime can be trusted to solve China’s social problems. The source of corruption and social injustice is not capitalism, they argue, but the outmoded system of government, or the transitional phenomena between the old Communist regime and a future liberal one. Moreover, they point out that capitalistic society is no longer the exploitive system described by Marx, but a complex regime involving equally complex political, legal, and social institutions.
Under an efficient market system, such a regime can provide citizens the basic conditions for their free pursuit of various higher goals. Democracy is another issue dividing liberals and the New Left. The liberals continue to worry that mass democracy might lead China back to a vicious cycle of mass mobilization, violent revolution, disorder, and dictatorship. Even though most of them accept the basic value of democracy, they prefer to put liberty before democracy as a more important, or at least more urgent, goal for China. Without a liberal regime, they assert, democracy can become a dangerous instrument of some new tyranny. The New Left regards this implicit or explicit fear of democracy as proof of the liberals’ hidden complicity with the established regime. For the New Left, only a democratically oriented reform can overcome increasingly harsh social injustices. Moreover, democratization will provide the central government the legitimacy it needs and foster social integration on a new basis. The New Left critique of capitalism becomes more convincing when one considers the latter’s international dimension. The New Left suspects that even though the market system is economically efficient, the incorporation of China into the capitalist global system will make China highly dependent on the West.


This anxiety has become intense in the recent dispute over whether China should join the WTO. In this respect the New Left echoes nationalistic sentiments prevalent everywhere in contemporary China, arguing that liberals are encouraging a colonization of Chinese thought by the West. Liberal intellectuals strongly object and rebuke the New Left for overstating the dangers of international capital and understating the defects of the current regime. More generally, while liberals tend to stress the primacy of the internal political economy of China, the New Left insists that these internal problems have to be considered from an international perspective. Despite these differences, liberals and the New Left share a certain outlook that informs both their defense and their criticism of the West: a basic utilitarianism that distinguishes them from the ’80s generation. This might also be the source of their limitations. While the New Left employs the self-criticism

of the West to explore a distinctively Chinese way to modernity, it tends to romanticize China itself. The liberals base their criticism of the current regime on mainstream Western opinion, which believes it has discovered the universal fate of all societies, but its proponents are incapable of showing convincingly how the Chinese can establish such a regime under present conditions. In a sense, the current debate between the New Left and the liberals continues the ancient quarrel about the fate of the Chinese “world” in a modern world among the Chinese intellectuals, begun with the first Chinese encounter with the West in the late imperial age. It proves that the search for a decent way to fulfill their aspirations is still a serious problem for the Chinese people. It is not easy to predict the future of this debate, but we can learn something from the attempts of both sides to face up to the grave difficulties of this enduring problem. — Meng Li

Reforming Japan

Judicial Reform in Japan
Americans often deplore their country’s litigiousness: there are too many lawyers, who encourage people to sue others over almost anything. In sharp contrast, there are only 23,000 lawyers in Japan; 3,000 of them are judges and 2,000 are prosecutors. Japan has one lawyer for roughly every 6,300 people, a ratio 22 times that of the U.S., 9 times that of the U.K. and Germany, and 4 times that of France. In some remote areas of Japan, practically no attorney is available to give legal advice to local citizens. People prefer settling disputes with others through informal negotiation and conciliation. They seldom bring their cases to court. Is there a problem with this? Wouldn’t it be better to run a society with a smaller number of lawyers? Haven’t the Japanese, with their own judicial system, been more successful than the U.S. in keeping their streets safe? Of course the Japanese judicial system may look strange from an American perspective. Certainly, it is different, and there are reasons for that difference. But Japan has developed her own workable judicial system by adopting judicial systems from abroad throughout her long history. In fact, Japan’s first efforts in this field took place in the eighth century, when Japan adopted the Chinese legal system. More recently, Japan drastically changed her judicial system when, beginning in the 1870s, she adopted the French and German legal systems. Further changes were made under American influence immediately after WWII. The contemporary Japanese judicial system is thus a somewhat awkward mixture based on both the European (mostly German) and American models. However, each time Japan adopted formal legal systems from abroad, it modified and naturalized them over the years to fit the indigenous needs of the Japanese people, resulting in a uniquely Japanese legal culture.

For example, despite the strong American influence on the rules of criminal procedure after WWII, public prosecutors’ offices in Japan have remained exceedingly powerful. Some believe they almost play the role of courts by exercising broad discretionary power to decide whom to indict on the basis of their thorough pretrial investigations. Thus, in Japan, almost 99 percent of those who are indicted will be found guilty by the courts. This creates a presumption of guilt at the point of indictment. Such a criminal procedure is drastically different from the American system, in which one is presumed innocent until proven guilty. But it has worked reasonably well in Japan, where people have traditionally trusted the integrity of the law-enforcement authorities. Despite these successes of the Japanese judicial system as it stands today, there has been a growing sense that a major overhaul is needed. First, many believe that the current judicial system is incapable of handling increasingly complex and numerous legal disputes. Traditionally in Japan, the executive branch of the government has handled the bulk of public disputes: bureaucrats mediated and resolved disputes between different interest groups by means of decrees, administrative guidance, and other informal methods. Litigation was considered the last and least desirable means of dispute resolution. As a result, the judicial system played a rather minor role in the society. This situation began to change with the new emphasis on smaller government and deregulation. People are more willing to resort to litigation and fight it out in court. The judicial system is thus expected to play a more active role. Unfortunately, however, the system is not ready to assume a bigger role. Hence the need for reform. Secondly, many fear that an arcane and inflexible judicial system in Japan may ultimately decrease the country’s competitiveness in the global market. Multinationals today can pick and choose the country in which to do business. A country with a user-friendly judicial system tends to attract investment and business activity; one with an inflexible legal system chases away businesses and loses tax revenues. Japan urgently needs to make her judicial system competitive with those of other developed countries. That includes the need to make her legal profession more internationally minded and capable of handling cross-border disputes. This is another reason for reform. Thirdly, many feel that the Japanese judicial system is detached from society and isolated from the general populace. The members of the judicial system—judges, prosecutors, and attorneys—are a highly professional, exclusive, and elitist group. Having passed the notoriously difficult bar examination, they are small in number and regard themselves as above commoners. There is relatively little participation by the general public in the running of the judicial system, and ordinary citizens feel the distance. That has to change if the judicial system is to play a greater role in the society. In an attempt to make the judicial system more user-friendly, the Japanese government established a blue-ribbon committee in July 1999 called the Judicial Reform Council (the “Council”) to work out a blueprint for reform. The Council, after intensive discussion, submitted its final report on Japan’s judicial reform to the prime minister on June 12, 2001.
The Council’s numerous and comprehensive recommendations include the following measures:

The number of people passing the bar examination should be increased from the current 1,000 a year to 3,000 a year by 2010. If this objective is achieved, the number of actively practicing licensed attorneys will reach 50,000 by 2018. The number of judges and prosecutors is to be increased proportionately. (By way of comparison, the United States produces approximately 40,000 attorneys a year, and approximately 800,000 attorneys practice nationwide.)

A three-year graduate-level law school system should be established to produce better-trained, more competent lawyers. Currently, legal education is conducted at the undergraduate level. Only a fraction of those who receive an undergraduate law degree take and pass the bar exam, and many of them prepare for the bar not through university courses but by attending outside cram courses for years. The new law school system is to be established to correct this abnormal state of affairs. The new system, clearly modeled on U.S. law schools, is expected to attract students with wider educational and professional backgrounds, to offer smaller classes and use the Socratic method to train students to think like lawyers, and thus to produce more competent legal professionals. Approximately 70 to 80 percent of those who complete the new law school education are expected to pass the bar examination. The bar exam itself will be reformed to match the new method of legal education.

The bar should be more open to the outside world. Traditionally, the bar in Japan has been a closed society, with a high wall between those inside and outside it. Attorneys were not allowed to assume positions with the government, and typically did not become members of the boards or legal departments of private corporations. The bar, in turn, strictly prohibited those outside it from providing any legal advice except in extremely limited circumstances. The Council recommends that attorneys play a wider role in the society, including working for the government. They should be allowed to advertise their expertise and practice experience, and given greater freedom to form partnerships with foreign attorneys. Conversely, certain other legal professionals, such as tax and patent lawyers, as well as experienced in-house counsel, should be allowed to engage in limited practice before the courts.

Judges should be selected from a wider pool. Historically, Japanese judges have been career bureaucrats: a portion of those who pass the bar exam choose a judicial career after basic practical training at the national legal institute.
The Council believes that judges should have more varied experience of the outside world in order to be effective magistrates. More judges should be recruited from among practicing attorneys. The appointment of judges, including Supreme Court judges, should be subject to transparent and objective review by a body reflecting popular will.

Lay persons should sit on the bench with professional judges at criminal trials. Japan adopted a jury system for a brief period beginning in 1928. It worked relatively well at first, but soon proved ineffective and was suspended during the war; it has never been resumed. The Council believed that Japan should adopt some form of popular participation in trials, but decided not to try the jury system again. Instead, it recommends a German-style system in which judges and lay persons sit together on the bench, jointly conduct a trial, consult with each other, and render a decision together. It is also contemplated that non-lawyer experts will participate in trials involving highly technical issues by sitting on the bench with professional judges. Would the implementation of all these recommendations mean a total Americanization of the Japanese judicial system?

Probably not. Like a number of increasingly obsolete and ineffective postwar Japanese social institutions, the judicial system will have to evolve to meet new social and cultural challenges, including globalization. Nevertheless, a workable and effective legal system is possible only when it is consistent with indigenous cultures and traditions. Although the reform envisages introducing a number of American-style legal institutions and methods, it is probably safe to predict that if adopted, they will again, as in the past, be modified and domesticated by the Japanese. —Naoyuki Agawa


For its February 16 issue, the British weekly The Economist ran a cover story titled “The Sadness of Japan.” The article argued that Japan is in a seemingly endless free fall. The Koizumi cabinet has clearly proved disappointing, despite its brave pledge of reform. Thus, “the charitable consider Japan an irrelevance,” while others—“the panicky”—“see it as a liability” if not “a danger.” But who is sad? If Japan were only a business partner, it would not be surprising if foreigners were saddened by Japan’s lackluster economic performance. But the Japanese themselves? In fact, they have reason to be less sad than The Economist expects them to be. Their country’s material wealth is no cause for “sadness.” Japan’s per capita GDP in 2000 was approximately $35,000, among the highest in the world. Its unemployment rate, reported as “serious,” is 5 percent, only slightly higher than the United States’ historically low 4.8 percent, and only half that of continental European countries such as Germany, France, and Italy, whose rates have remained around 10 percent for quite some time. Curiously, this is something that is rarely reported. According to the UNDP human development index (which evaluates and ranks quality of life using criteria not limited to material well-being), Japan consistently ranks high: ninth place in 2001, just after the Netherlands (the top country was Norway) but higher than Switzerland, France, or Germany. The United States ranked sixth, Korea 26th, and China 87th in the same year. Remarkably, too, it is untrue to assert that no notable changes or achievements took place in Japan during the “lost” decade. Ironically, the good news may be that the ludicrous “Japan-as-threat” or “Japan exceptionalism” arguments are nowhere in sight, although trade “imbalances” still exist. Changes in the security sphere are quite substantial. The Gulf War in 1991 built momentum to make Japanese views on security more realistic.
Neither participation in peacekeeping activities nor the dispatch of SDF vessels to the Indian Ocean was even thinkable during the days of the bubble economy. Further good news is the democratization of Korea and Taiwan, and the economic development of Asian countries (which Japan actively supported in the form of development assistance). Japan is no longer the only non-Western major player in the Western-dominated world community. Although Japan’s relative economic presence in East Asia will inevitably shrink, its cultural, particularly pop-cultural, influence is expanding in the region. Japan’s problem, in fact, is not its sluggish economy but its excessive postwar reliance on “the economy,” which has paradoxically strangled that economy. What created this dependence in the first place? Survival was the primary goal for the postwar Japanese after complete defeat in World War II, but this alone cannot account for their energy in making Japan the world’s second largest economic power. One explanation may be that the Japanese became skeptical of “public consciousness,” “public institutions,” and “the state” in general after feeling betrayed by the government for years of personal sacrifice for the greater cause (the war). “Economic well-being” became the people’s safe, romantic slogan. Isao Nakauchi, the founder and former CEO of a major supermarket chain that once dominated the era of rapid economic growth with its emblematic slogan “more and more good things for less,” had been drafted to fight in the mountains of the Philippines, where he saw the bravest men lose their lives in a meaningless operation. Seriously injured in combat, he returned in a grave state of starvation with just one wish in mind: to fill his stomach with sukiyaki. For him, offering consumers good prices and filling Japanese stomachs was a requiem for fellow soldiers who fell in vain, and a revenge on the inept and irresponsible military leaders of the time. “The economy” was a form of “requiem” and “revenge” for a great many other postwar Japanese. Secondly, fundamental philosophical divisions persisted at the forefront of postwar Japanese politics.
To this day, we have failed to establish a political culture in which basic national goals are determined through public debate and elections. In such a climate, economic well-being offered a safe pathway across the Japanese version of the Berlin Wall, the deep divide in Japanese politics. Particularly notable is that Hayato Ikeda, who became prime minister immediately after the turmoil surrounding revision of the U.S.-Japan security treaty, spoke of “tolerance and perseverance” and “income-doubling.” This shows his desire to heal national divisions through economic growth, opening up the political possibility of negotiation and adjustment in place of fundamental conflict between conservatives and progressives, and between big business and the labor unions. In other words, “economy” was the largest common denominator available to unify the whole nation. Furthermore, since military power was no longer an option for raising Japan’s international status, the alternative was to become an economic superpower. Many of the Japanese engineers who led postwar industrialization were motivated by a nationalist desire “not to be looked down on by Americans.” “The economy” was at once a “revenge” on the state, a “national strategy” for attaining international standing, “proof” of Japan’s modernization, and a source of “pride.” It was also a bright dream that concealed past historical debts for dishonorable prewar conduct as well as present domestic philosophical divisions. However, the cultural, emotional, and spiritual situation that supported postwar Japanese economic growth has long since changed. Filling one’s stomach and “making more and more good things for less” no longer inspire most Japanese. Symbolically, Nakauchi’s supermarket chain is about to collapse. What prevails in Japan today is not “sadness” but a shortage of spiritual energy. Until recently, Japan was reported to be a country showing no sign of the “decadence” common among developed countries.
Washington Post reporter Fred Hiatt, who continued writing cool-headed articles even when sensationalism swept Western reporting on Japan, wrote with Margaret Shapiro in August 1990: “There is no question that Japanese society works in ways that often surprise Americans. People are diligent. Students study, workers work, no one begs and no one slouches.” Only a decade or so later, uneasiness and anxiety are present, but no longer the strong emotions that fueled “requiem,” “revenge,” or “national pride.” This is because Japanese society is now “out of date,” and no longer motivates people. It is clear that Japanese people think quite differently about “the economy” than they once did. Now, “bringing food to the table” is no longer a heroic gesture but effortless business as usual. To the victims of prewar repression and sacrifice, “economy” was an emotional goal in and of itself, and not just a means of eking out a living. It is not to be regretted that such emotion should wither away as Japan becomes more affluent as a democratic nation. “Economy” is no longer fulfilling, even as a game, as it was during the period of high growth within Japan’s egalitarian socioeconomic system. Nonetheless, because various Japanese institutions adapted all too well, and all too successfully, to popular efforts in the period of poverty, those outdated institutions have become ill-suited to what motivates people today. Lifetime employment and intracompany unions that eased the tension between business and labor, together with socialist-style systems of social security and health insurance, made it possible for all those who were poor to “eat well constantly,” and thus created incentives for work. Moreover, rapid economic growth raised wages and enabled the government to build roads and increase subsidies. This cleverly tied the prosaic goal of “eating well constantly” to the various feelings invested in the economy, producing the solidarity and vitality that the outside world found almost discomforting, even ominous. But once those feelings toward the “economy” disappeared, what remained in these institutions were petty bureaucratic self-preservation and disillusionment among motivated professionals and those who seek a sense of self-worth in the work they do. Japan needs to rebuild an incentive structure—but an incentive for what, drawing on what values? “The economy” emerged as a value under postwar conditions and somehow filled the postwar value gap. “Economy first”-ism proved successful in avoiding excessive ideological conflict, but it also avoided the question necessary for any political community, “Toward what goal, at what cost?” and resulted in excessive reliance on money to settle all problems.
But “the economy,” because of its unexpectedly great success, can no longer serve as the largest common denominator making up for the postwar lack of “value”; it has proved to be only a means, not an end. In fact, the main agenda for Japanese society today is not economic renewal. The agenda is to bid farewell to an economy-centered Japan. I believe there is no more reason for pessimism about today’s Japan than there was about Japan during the bubble. What may make Japan “sad” is not that the economy is bad, but that there is nothing but the economy. —Masayuki Tadokoro
A longer version of this article appeared as “Keizai-danomi no sengoteki ‘mukachi’ ni ketsubetsu o” (A farewell to value-blind economy worship), Chuo Koron, August 2002.

The Committee on Intellectual Correspondence is an international project sponsored by the Suntory Foundation (Japan) and the Wissenschaftskolleg zu Berlin.

Directors: Daniel Bell, Wolf Lepenies, Masakazu Yamazaki

Editor: Alexander Stille

Managing Editor: David Jacobson

Associates of the Directors: Masayuki Tadokoro (Japan), Michael Becker, Mark Lilla

Graphic Designers: Gene Crofts, Janet Noble

Illustrations: Matteo Pericoli (or as credited on page 2)

U.S. Address: Correspondence, Council on Foreign Relations, 58 East 68th Street, New York, NY 10021. Telephone (212) 434-9574; Fax (212) 434-9832.

The Council on Foreign Relations and the Committee on Intellectual Correspondence gratefully acknowledge the generous support of the Sasakawa Peace Foundation of Japan and the Zeit-Stiftung Ebelin und Gerd Bucerius, a foundation of the German newspaper Die Zeit. Correspondence is published twice a year by the Council on Foreign Relations, which does not accept responsibility for the views expressed in the articles presented here.

Naoyuki Agawa, until recently a law professor in Japan, is a Japanese minister to
the U.S. in charge of public relations and cultural affairs. He has published many books on the U.S. and its legal culture. • Martin Burcharth is the U.S. correspondent for the Danish daily Information. His forthcoming book is tentatively titled The Other America. • Naomi Daremblum is international editor of the Mexican monthly review Letras Libres. • Ashok Valji Desai is a consulting editor of the Business Standard, New Delhi. His publications include The Price of Onions (1999), My Economic Affair (1993), and Technology Absorption in Indian Industry (1988). • Theodor Dunkelgrün is completing an M.A. in General Studies in the Humanities at the University of Chicago. • Helen Epstein is a consultant and writer on reproductive health in developing countries, and a contributor to The New York Review of Books. • Takako Hikotani is Assistant Professor of Public Policy at the National Defense Academy in Yokosuka, Japan. • Nina L. Khrushcheva teaches Media and International Affairs at New School University in New York. • Helen Koh has taught Korean literature and film at the University of Chicago and Columbia University, at whose East Asia Institute she was a visiting fellow for three years. • Edgardo Krebs is a research associate in the Department of Anthropology at the Smithsonian’s National Museum of Natural History in Washington. • Timothy Lenoir is Professor of History of Science and Technology at Stanford University, Chair of the Program in History and Philosophy of Science, and leader of a project called “How They Got Game,” which is designing a multi-sited art installation addressing critical theoretical issues of new media. • Wolf Lepenies, a director of the Committee on Intellectual Correspondence, is a permanent fellow and former director of the Wissenschaftskolleg zu Berlin. • Meng Li, formerly a lecturer in the Department of Sociology at Peking University, is a doctoral student in the Committee on Social Thought at the University of Chicago. • Mark Lilla is Professor of Social Thought at the University of Chicago and author, most recently, of The Reckless Mind: Intellectuals in Politics. • Xin Liu is Associate Professor of Anthropology and Chair of the Center for Chinese Studies at the University of California, Berkeley. His recent publications include In One’s Own Shadow (U.C. Press, 2000) and The Otherness of Self (U. of Michigan Press, 2002). He is now writing on contemporary China’s statisticalization. • Darrin M. McMahon is writing a history of happiness in Western thought for Grove-Atlantic Press. • Julie Pecheur, a freelance journalist in New York, writes mostly for Le Nouvel Observateur. • Matteo Pericoli, author of Manhattan Unfurled, is an Italian architect and illustrator who lives in New York. His next book will depict the city’s inner skyline in 32 feet of color drawing. • Michael Rutschky is a freelance writer in Berlin and author, most recently, of Berlin—Die Stadt als Roman (Berlin: The city as novel). • Alexander Stille is the author, most recently, of The Future of the Past. • Masayuki Tadokoro is Professor of International Relations in the Faculty of Law at Keio University in Tokyo. • Gadi Taub is co-editor of Mikarov, an Israeli journal of literature and society. • Masakazu Yamazaki, a director of the Committee on Intellectual Correspondence, is artistic director of the Hyogo Prefecture Theater in Kobe, Japan.