Ten Years and Beyond: Economists Answer NSF's Call for Long-Term Research Agendas

Edited by:

Charles L. Schultze, Brookings Institution and Chair, AEA Committee on Government Relations
Daniel H. Newlon, Director, AEA Government Relations


Introduction
We would like to acknowledge and thank the National Science Foundation's Directorate for the Social, Behavioral and Economic Sciences (NSF/SBE) for challenging economists and other relevant research communities "to step outside of present demands and to think boldly about future promises." Specifically, in August 2010 NSF/SBE invited groups and individuals to write white papers describing grand challenge questions in their sciences that transcend near-term funding cycles and are "likely to drive next generation research in the social, behavioral, and economic sciences." NSF/SBE planned to use these white papers "to frame innovative research for the year 2020 and beyond that enhances fundamental knowledge and benefits society in many ways. This request is part of a process that will help NSF/SBE make plans to support future research." At the conclusion of the submission period on October 15, 2010, NSF/SBE had received 252 papers. A compendium of abstracts of the 252 white papers [1] and most of the full texts can be downloaded from http://www.nsf.gov/sbe/sbe_2020/. We are disseminating the white papers of interest to economists independently of the NSF because these papers offer a number of exciting and at times provocative ideas about future research agendas in economics that merit further consideration by economists. They could also generate other compelling ideas for infrastructure projects, new methodologies and important research topics. In addition, some of these papers are not available at the NSF website because they were not submitted successfully by the deadline. We have placed 54 of the white papers on our website http://www.aeaweb.org/econwhitepapers/ and have assembled them in this electronic publication. The following white papers are of possible interest to economists:

Acemoglu, Daron, "Challenges for Social Sciences: Institutions and Economic Development" .......... 9
Alesina, Alberto, "Why Certain Countries Have Developed and Others Have Not?" .......... 15
Altonji, Joseph, "Multiple Skills, Multiple Types of Education, and the Labor Market: A Research Agenda" .......... 21
Autor, David, and Lawrence Katz, "Grand Challenges in the Study of Employment and Technological Change" .......... 27
Baily, Martin Neil*, "Some Foundational and Transformative Grand Challenges in Economics" .......... 37
Berry, Steven, "A Proposal for Future SBE/NSF Funded Research: Refocusing Microeconomic Policy Research" .......... 49
Bloom, Nick, "Key Outstanding Questions in Social Sciences" .......... 55
Blume, Lawrence, "Robustness and Fragility of Markets: Research at the Interface of Economics and Computer Science" .......... 59
Boskin, Michael, "Ideas About Possible NSF Grand Challenges in Economics Over the Next Twenty Years" .......... 65
Brown, Charles, Dan Brown, Dalton Conley, Vicki Freedman, Kate McGonagle, Fabian Pfeffer, Narayan Sastry, Robert Schoeni, and Frank Stafford, "Future Research in the Social, Behavioral, and Economic Sciences with the Panel Study of Income Dynamics" .......... 69
Brunnermeier, Markus, Lars Peter Hansen, Anil Kashyap, Arvind Krishnamurthy, and Andrew W. Lo, "Modeling and Measuring Systemic Risk" .......... 75
Card, David, Raj Chetty, Martin Feldstein, and Emmanuel Saez, "Expanding Access to Administrative Data for Research in the United States" .......... 81
Charness, Gary, and Martin Dufwenberg, "Future Research in the Social, Behavioral & Economic Sciences" .......... 85
Cramton, Peter*, "Market Design: Harnessing Market Methods to Improve Resource Allocation" .......... 87
Cutler, David, "Why Don't People and Institutions Do What They Know They Should?" .......... 91
Darity, William, Gregory N. Price, and Rhonda V. Sharpe, "Broadening Black and Hispanic Participation in Basic Economics Research" .......... 97
Diamond, Peter, "Three Important Themes: Taxation of Capital Income, Behavioral Economics in Equilibrium Analyses, and Systemic Risk" .......... 105
Duflo, Esther, "A Research Agenda for Development Economics" .......... 111
Eaton, Jonathan, and Samuel Kortum*, "The Contribution of Data to Advances in Research in International Trade: An Agenda for the Next Decade" .......... 117
Fischer, Stanley, "Questions about the Future of the International Economy" .......... 123
Fudenberg, Drew, "Predictive Game Theory" .......... 125
Gintis, Herbert, "Long-range Research Priorities in Economics, Finance, and the Behavioral Sciences" .......... 131
Goulder, Lawrence*, "Integrating Economic and Political Considerations in the Analysis of Global Environmental Policies" .......... 135
Greenstein, Shane, Josh Lerner, and Scott Stern, "The Economics of Digitization: An Agenda for NSF" .......... 139
Gruber, Jon, "What is the Right Amount of Choice?" .......... 149
Haltiwanger, John, "Making Drill Down Analysis of the Economy a Reality" .......... 151
Hanson, Gordon, "Future Directions for Research on Immigration" .......... 157
Hanushek, Eric, "Developing a Skills-based Agenda for 'New Human Capital' Research" .......... 163
Hart, Oliver, "Making the Case for Contract Theory" .......... 169
Heckman, James, "A Research Agenda for Understanding the Dynamics of Skill Formation" .......... 173
Hubbard, Glenn, "Some Compelling Broad-Gauged Research Agendas in Economics" .......... 181
Imbens, Guido, "Challenges in Econometrics" .......... 183
Jackson, Matthew, "Research Opportunities in Social and Economic Networks" .......... 189
Jorgenson, Dale, "A New Architecture for the U.S. National Accounts" .......... 193
Kapteyn, Arie, "Measurement and Experimentation in the Social Sciences" .......... 199
Kroszner, Randall, "Implications of the Financial Crisis" .......... 205
Levine, David, "Virtual Model Validation for Economics" .......... 209
Lo, Andrew, "A Complete Theory of Human Behavior" .......... 215
McCloskey, Deirdre, "Language and Interest in the Economy: A White Paper on 'Humanomics'" .......... 223
Moffitt, Robert, "A New Household Panel in the U.S." .......... 229
Nelson, Julie, and Evelyn Fox Keller, "Economics, Climate, and Values: An Integrated Approach" .......... 233
Nordhaus, William, "Some Foundational and Transformative Grand Challenges for the Social and Behavioral Sciences: The Problem of Global Public Goods" .......... 239
Page, Scott, "Complexity in Social, Political, and Economic Systems" .......... 245
Poterba, James, "Research Opportunities in Economics: Suggestions for the Coming Decade" .......... 251
Reis, Ricardo, "Three Outstanding Challenges for Economic Research" .......... 257
Rodrik, Dani, "A Research Agenda in Economic Diagnostics" .......... 263
Rogoff, Kenneth, "Three Challenges Facing Modern Macroeconomics" .......... 267
Roth, Al, "Market Design: Understanding Markets Well Enough to Fix Them When They're Broken" .......... 273
Samuelson, Larry, "Future Research in the Social, Behavioral and Economic Sciences" .......... 279
Stavins, Robert, "Some Research Priorities in Environmental Economics" .......... 285
Van Reenen, John, "The Productivity Grand Challenge: Why Do Organizations Differ So Much?" .......... 291
Varian, Hal, "Clinical Trials in Economics" .......... 297
Weir, David, "Grand Challenges for the Scientific Study of Aging" .......... 299
Yitzhaki, Shlomo, "Sensitivity Analysis through Mixed Gini and OLS Regressions" .......... 303

* Not available at the NSF website

[1] National Science Foundation, Directorate for Social, Behavioral, and Economic Sciences. 2011. SBE 2020: White Papers: Titles, Authors, and Abstracts. Arlington, VA: National Science Foundation.

We have grouped the papers below by the infrastructure investments proposed and by the economic and social issues motivating fundamental research agendas. White papers also recommend more support by NSF for groups underrepresented in the economics profession (Darity), new NSF initiatives that transcend disciplinary boundaries (Gintis, Lo, McCloskey, Nelson), new statistical methods (Imbens, Yitzhaki), better theoretical tools (Fudenberg, Hart, Jackson, Samuelson) and NSF prizes for the research accomplishments of established researchers (Charness). Economics has been transformed by the increased availability of data and the methods, measures and computational power needed to analyze the data (Card, Eaton). According to some of the white papers, advances in economics could accelerate over the next decade if investments were made in:

• cross-country research data that fill gaps in international data analysis, such as the absence of international firm datasets with basic information on inputs, outputs, growth, management practices and technology of firms (Bloom, Eaton, Van Reenen), or more harmonization of measures between datasets for different countries (Alesina, Brown, Eaton, Weir);
• a new longitudinal survey of US households to address limitations in the aging data infrastructure for studying economic and social dynamics in the US (Altonji, Moffitt);
• an advanced data collection laboratory to gather longitudinal socioeconomic data for US households over the internet, from administrative records, and from new forms of data collection including personal digital assistants, webcams and self-administered devices (Kapteyn, Moffitt);
• direct, secure access to the US government's comprehensive micro-economic administrative electronic files for households and businesses, so that researchers have access to a rich archive of information covering almost every aspect of socio-economic behavior at different levels of aggregation (Card, Eaton, Haltiwanger, Hanson, Hanushek);
• direct, secure access to large amounts of underutilized data collected by private sector firms and kept secret (Van Reenen, Varian);
• data infrastructure for cumulative, transparent, and high quality research on the digital economy and on the rules and policies that govern the economic incentives to create, store and use digital information (Greenstein, Reis);
• a new system of National Income Accounts that better reflects the global economy and includes psychological measures of well-being and better measures of non-market activities (Boskin, Jorgenson, Reis);
• a special program in experimental design and analysis that would support field experiments/clinical trials designed to resolve fundamental debates in economics and encourage public-private research co-operation in this area (Varian);
• development and validation of an agent-based virtual economy with sophisticated agents that mimic human behavior and well-developed models of production, trade and consumption, scaling up existing agent-based models, and/or many small-scale research projects using agent-based models (Blume, Gintis, Levine, Page; a minimal illustrative sketch of such a model appears after this list);
• sustainability science environmental observatories with social science data collection efforts sufficient to capture the bi-directional linkages between human actions and natural-environmental processes (Brown); and
• improvements in existing longitudinal surveys (Altonji, Brown, Weir) by collecting better information on job content, skill requirements, education, genetics and human-environment interactions.
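To make the agent-based item above concrete, here is a deliberately minimal sketch, not drawn from any of the cited white papers, of the kind of "virtual economy" such projects would scale up: heterogeneous agents with Cobb-Douglas preferences are repeatedly matched at random and a bilateral trade executes only if it raises both parties' utility. All names, parameters, and the trading protocol are illustrative assumptions.

```python
# Minimal agent-based exchange economy (illustrative sketch only):
# agents hold endowments of two goods and trade bilaterally when mutually beneficial.
import random

random.seed(0)

class Agent:
    def __init__(self, alpha):
        self.alpha = alpha                      # Cobb-Douglas preference weight
        self.goods = [random.uniform(1, 10),    # endowment of good 0
                      random.uniform(1, 10)]    # endowment of good 1

    def utility(self, bundle=None):
        x0, x1 = bundle if bundle is not None else self.goods
        return (x0 ** self.alpha) * (x1 ** (1 - self.alpha))

def try_trade(a, b, step=0.5):
    """Propose swapping `step` units of good 0 from a for `step` units of good 1
    from b; execute only if feasible and strictly utility-improving for both."""
    if a.goods[0] < step or b.goods[1] < step:
        return False
    new_a = [a.goods[0] - step, a.goods[1] + step]
    new_b = [b.goods[0] + step, b.goods[1] - step]
    if a.utility(new_a) > a.utility() and b.utility(new_b) > b.utility():
        a.goods, b.goods = new_a, new_b
        return True
    return False

agents = [Agent(alpha=random.uniform(0.2, 0.8)) for _ in range(100)]

for period in range(200):                       # repeated random bilateral matching
    a, b = random.sample(agents, 2)
    try_trade(a, b) or try_trade(b, a)          # attempt the swap in either direction

avg_u = sum(ag.utility() for ag in agents) / len(agents)
print(f"average utility after trading: {avg_u:.2f}")
```

The proposals cited above envision far richer agents (production, learning, financial frictions) and systematic validation against data; the sketch only shows the basic architecture of agents, preferences, and decentralized interaction.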

Many of the white papers propose agendas of fundamental research motivated by important and persistent economic and social issues, including:
• Financial crises and economic instability (Baily, Boskin, Blume, Cramton, Diamond, Gintis, Haltiwanger, Hansen, Hart, Jackson, Kroszner, Lo, Nordhaus, Poterba, Rodrik, Rogoff, Van Reenen)
• Gaps between rich and poor countries (Acemoglu, Alesina, Baily, Duflo, Eaton, Fischer, Samuelson, Van Reenen)
• Global warming and other environmental problems (Berry, Brown, Goulder, Nelson, Nordhaus, Page, Poterba, Stavins, Van Reenen)
• Education and training (Altonji, Autor, Baily, Berry, Hanushek, Heckman, Moffitt, Roth, Van Reenen)
• US economic inequality (Autor, Brown, Heckman, Moffitt, Page, Weir)

• Health care costs and health disparities (Baily, Berry, Brown, Card, Cramton, Cutler, Gruber, Hubbard, Kroszner, Roth, Weir)
• Taxation and government spending (Boskin, Card, Diamond, Hubbard, Page, Reis, Roth, Samuelson)
• Immigration (Hanson)
• Work and family balance (Bloom)

Some white papers would develop new approaches to applied research, such as economic diagnostics to determine which among multiple plausible models best applies to a particular problem (Rodrik), or more collaboration within economics and with allied fields to strengthen the link between methods and policy (Berry).

Charles L. Schultze, Brookings Institution and Chair, The American Economic Association's Committee on Government Relations
Daniel H. Newlon, Director, AEA Government Relations


Challenges for social sciences: institutions and economic development

Daron Acemoglu, Massachusetts Institute of Technology

Introduction

Why some countries are much poorer than others is one of the oldest questions in social science. It will also be one of the most challenging and important questions in the next several decades. This is for several reasons. First, despite spectacular growth in per capita incomes in much of the world during the 20th century, the gaps between rich and poor countries have expanded, rather than abating. Second, these gaps have meant that while the rich world has become richer, poverty, disease and social injustice are still widespread in many parts of the world, notably in much of sub-Saharan Africa, in parts of South Asia and in various pockets of poverty in the Caribbean and Central America. This pattern is challenging to most of our theories because many of the barriers to the spread of prosperity have disappeared: ideas travel around the world almost instantaneously; various impediments to trade in goods and to financial flows and foreign direct investments have largely disappeared; and any nation should today be able to easily copy any economic or social practice that it wishes. But the wide gaps in incomes and living standards remain. Challenging though these issues may be, we are now much better equipped to understand, and perhaps work towards redressing, the causes of these widespread disparities. Much of the progress on this issue has been made in economics (see Acemoglu, 2009, for an overview), but the next step will require us to combine the insights and tools developed in economics with perspectives from other social sciences.

From proximate to fundamental causes

Economic analysis has documented that differences in per capita incomes and prosperity across countries are related to differences in human capital, physical capital and technology. We understand the extent to which differences in the quantity and quality of education, differences in the availability of machines, and differences in the use of new technologies and the allocation of resources between activities with different levels of productivity contribute to incomes. We also understand how the current large differences in prosperity have resulted from lack of steady growth in many parts of the world, while other nations achieved sustained growth. But these are only proximate causes in the sense that they pose the next question: why do some countries have less human capital, physical capital and technology and make worse use of their factors and opportunities?

This has motivated economists and social scientists more broadly to look for potential fundamental causes, associated with differences in the organization of society, shaping why some countries have incomes per capita 30 or 40 times greater than those of others. Institutions have emerged as a potential fundamental cause. There is now vibrant theoretical and empirical research documenting the importance of institutions for economic outcomes, including economic development, growth, inequality and poverty. But the next stage, which requires an understanding of which specific configurations of institutions are most likely to encourage growth in the decades to come, why institutions differ across countries, why they change, and why they often fail to change, is more challenging. Here research and our understanding are still in their infancy.

Institutions

Douglass North (1990, p. 3) offers the following definition: "Institutions are the rules of the game in a society or, more formally, are the humanly devised constraints that shape human interaction." Three important features of institutions are apparent in this definition: (1) they are "humanly devised," which contrasts with other potential fundamental causes, like geographic factors, which are outside human control; (2) they are "the rules of the game" setting "constraints" on human behavior; (3) their major effect will be through incentives. The notion that incentives matter is second nature to economists, and institutions, as a key determinant of incentives, should have a major effect on economic outcomes. Institutional differences, contrasting, for example, with geographical differences or cultural factors (even as we recognize that cultural factors are central for understanding the evolution, and the persistence, of institutions), shape economic and political incentives and affect the nature of economic equilibria via these channels.

Economic institutions matter for economic growth because they shape the incentives of key economic actors in society; in particular, they influence investments in physical and human capital and technology, and the organization of production. Economic institutions not only determine the aggregate economic growth potential of the economy, but also the distribution of resources in the society. Herein lies part of the problem: different institutions will not only be associated with different degrees of efficiency and potential for economic growth, but also with different distributions of the gains across different individuals and social groups.

But if institutions matter so much for economic outcomes, why do many societies choose institutions that are inimical to economic growth? To think about possible answers to these questions, it is useful to consider the relationship between three institutional characteristics: (1) economic institutions, (2) political power, and (3) political institutions. How are economic institutions determined? Although various factors play a role here, including history and chance, economic institutions are collective choices, and because of their influence on the distribution of economic gains, not all individuals and groups prefer the same set of economic institutions, and often many will prefer to maintain economic institutions that do not maximize the growth potential of a nation. This leads to a conflict of interest among various groups and individuals over the choice of economic institutions, and the political power of the different groups will be the deciding factor.

The distribution of political power in society is also endogenous. To make more progress here, let us distinguish between two components of political power: de jure (formal) and de facto political power (see Acemoglu and Robinson, 2006). De jure political power refers to power that originates from the political institutions in society. Examples of political institutions include the form of government, for example democracy vs. dictatorship or autocracy, and the extent of constraints on politicians and political elites. A group of individuals, even if they are not allocated power by political institutions, may possess it: they can revolt, use arms, hire mercenaries, co-opt the military, or undertake protests to impose their wishes on society. This type of de facto political power originates from both the ability to solve its collective action problems and from access to economic resources (which determines the capacity to use force against others).

Political institutions, similar to economic institutions, determine the constraints on and the incentives of the key actors, but this time in the political sphere. Since, like economic institutions, political institutions are collective choices, the distribution of political power in society is the key determinant of their evolution. This creates a central mechanism of persistence: political institutions allocate de jure political power, and those who hold political power influence the evolution of political institutions, and they will generally opt to maintain the political institutions that give them political power. A second mechanism of persistence comes from the distribution of resources: when a particular group is rich relative to others, this will increase its de facto political power and enable it to push for economic and political institutions favorable to its interests, reproducing the initial disparity.

This discussion highlights that we can think of political institutions and the distribution of economic resources in society as two state variables, affecting how political power will be distributed and how economic institutions will be chosen. An important notion is that of persistence: the distribution of resources and political institutions are relatively slow-changing and persistent. Despite these tendencies for persistence, the framework also emphasizes the potential for change. In particular, "shocks" to the balance of de facto political power, including changes in technologies and the international environment, have the potential to generate major changes in political institutions, and consequently in economic institutions and economic growth.
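The recursive structure described in the preceding paragraphs can be compressed into a schematic. This is our summary of the text, in the spirit of Acemoglu and Robinson (2006), not a formal model presented in the white paper:

$$
\begin{aligned}
\text{political institutions}_{t} &\;\longrightarrow\; \text{de jure political power}_{t}\\
\text{distribution of resources}_{t} &\;\longrightarrow\; \text{de facto political power}_{t}\\
\text{political power}_{t} &\;\longrightarrow\; \text{economic institutions}_{t}\ \text{and}\ \text{political institutions}_{t+1}\\
\text{economic institutions}_{t} &\;\longrightarrow\; \text{economic performance}_{t}\ \text{and}\ \text{distribution of resources}_{t+1}
\end{aligned}
$$

Persistence arises because the two state variables (political institutions and the distribution of resources) reproduce themselves through the power they confer; change arises when shocks shift de facto power.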
The challenges ahead

Despite much promising research, many fundamental and applied questions remain unanswered. Recent research has shown how theoretical and empirical progress can be made on the effects of institutions and on the factors affecting institutional equilibria, both at the national and sub-national levels. Major questions for future research include, among others:

• Why do institutions persist? Recent research has documented that several institutional features of current economies have historical roots going back several centuries or sometimes even more. There is also evidence that even after major institutional reforms, important institutional continuities remain. Despite their importance, we have only made limited progress on the sources of persistence. These are partly in expectations and beliefs. The belief among the majority of US citizens that the Constitution safeguards their rights undoubtedly plays an important role in enabling the Constitution to do just that. But the citizens of many other countries do not hold similar beliefs, and institutional outcomes are often very different. Yet appealing to such beliefs without understanding what the sources of these differential beliefs might be is not satisfactory. Theoretical and empirical investigation of various sources of institutional persistence, including the dynamics of political and social beliefs, remains a major area for future research.

• Though institutions persist, they are not historically predetermined. Major institutional reforms have taken place in many countries, for example following the end of colonial rule in Latin America, Asia and Africa or the fall of military regimes, and in some cases such changes have altered the economic trajectories of nations fundamentally, as in Botswana, South Korea and China. What enables institutional reform? Why do many attempts at reform fail and even backfire? How can we work towards successful reforms? These questions are both academically interesting and central to inform policy debates.

• While the role of secure property rights for investment and the importance of checks and balances in the political sphere for stability are well understood, we do not yet know which specific combinations of economic and political institutions are most conducive to economic growth. We also do not know which combinations of property rights, judicial institutions, financial institutions, education and various dimensions of social institutions are most conducive to economic development. Despite much rhetoric on this topic, we have little theory to guide us and few applied insights. For example, while during the early phases of the growth experiences of many East Asian nations the state was heavily involved in the economy, the African evidence suggests that the weakness of the state is a major barrier to economic development. Yet this does not mean that greater state involvement is necessarily part of the cluster of institutions encouraging growth. Many pernicious dictatorships from North Korea to Burma highlight the dangers of all-powerful states. We currently do not know whether greater involvement of the state ensures a level playing field and facilitates economic development, or whether it inexorably leads to insecure property rights and opens the way to more heightened political conflicts to control the all-powerful state.

• Relatedly, we are also far from a consensus on the role of democracy and checks on political power in fostering an environment that is conducive to innovation and economic growth. Even though many of the economies spearheading economic growth over the last two centuries have been relatively democratic, and many of the most disastrous economic performances have been under authoritarian regimes ranging from colonial rule to military dictatorships and personal rules, in the postwar era democratic countries do not have appreciably higher growth rates than nondemocratic ones. The rapid growth of China under a highly authoritarian regime has made some commentators conclude that authoritarian rule might be more conducive to economic growth. Nevertheless, there are good reasons to think that authoritarian regimes will ultimately become incompatible with innovation and the creative destruction that accompanies most growth experiences. The extent to which this is the case, and the various interactions between political regimes and economic growth, are other major questions that will require much future research.

References

Acemoglu, Daron (2009) Introduction to Modern Economic Growth. Princeton University Press.
Acemoglu, Daron and James Robinson (2006) Economic Origins of Dictatorship and Democracy. Cambridge University Press.
North, Douglass (1990) Institutions, Institutional Change, and Economic Performance. Cambridge University Press.

Why Certain Countries Have Developed and Others Have Not?

Alberto Alesina, Harvard University
September 2010

Question 1

The fundamental question for economists is to understand why certain countries (nations, regions) have successfully developed and others are lagging. Answering this question will of course help us understand how to defeat poverty. In recent years economists have made progress by extending the realm of variables included in their models, their empirical analysis and their overall thinking. The most promising and exciting areas of research in economics are those which lie at the border of the field (strictly defined) and touch upon other disciplines. Examples include political economics (bordering with political science), behavioral economics (bordering with psychology), law and economics (bordering with law, of course) and, recently, cultural economics (bordering with sociology and anthropology). These developments have also led to a welcome deeper attention to long-term trends, historical analysis and the development of new and rich data sets. This process needs to continue if we want to be successful. We are of course far from having definitive answers on many issues, and more energy needs to be devoted along these lines.

I will elaborate on probably the least known of the subject matters mentioned above, which is the most recent and, in my opinion, very exciting but very challenging: cultural economics. How many times in our casual conversations do we mention the word "culture" as an explanation of many things which are of relevance for economists, such as saving rates, trust, attitudes toward work, hard work, the Protestant ethic, family relationships, the role and education of women, poverty traps, and social norms? How many times in our casual conversations do we wonder where different cultures come from? How many times do we wonder which, how, and how quickly different cultures melt in the pot? Many times. But then, when as economists we try to understand those variables, we ignore culture. Researchers in other fields did not forget about culture. Weber postulated a cultural root for the development of capitalism, but neoclassical economists ignored it. A new but rapidly growing body of research is taking, instead, the idea of including "culture" in our framework of analysis seriously.

Let's begin with a definition of culture: "the customary beliefs, social norms, and material traits of a nation or of a racial, religious or social group." A paper by Guiso, Sapienza and Zingales (2006) discusses this definition and the methodological issues related to the development of this field.[1] Rather than discussing the question of culture in general, let's discuss one example of a specific cultural trait: family relationships. In certain cultures families are very "tight" and family relationships are considered very important, for instance in Mediterranean and Latin American countries; in other cultures the family is important, but attitudes are more individualistic and family relationships are less important (say, Anglo-Saxon and Scandinavian countries). How do these cultural traits affect many economic decisions?

With strong family ties, the family becomes an organized production unit, and this has important implications for: 1) the amount of home production (stronger ties mean more home production, thus less demand for publicly provided social services); 2) lower participation of women in the labor market, and lower education of women; 3) lower participation of youngsters in the labor market (they live at home longer); 4) lower geographical mobility and, as a consequence, less flexible labor markets; 5) more reliance on the family as a producer of social insurance and care for the elderly and children; 6) more inward-looking attitudes and less trust towards non-family members, and in general lower social capital; 7) a lower tendency to participate in social activities and lower political participation. The strength of family ties is measured by several answers from surveys about relationships between family members. A paper by Alesina and Giuliano (2010) is an overview of results regarding the role of the family relationship.[2] Strong or weak family ties have different effects; they lead to different social and economic organizations. One cannot be ranked above the other, and there is obviously no attempt to be normative here. However, it is clear that these correlations (and potential causation) are extremely important for understanding various aspects of the economic structure, growth potential and poverty-reduction policies. By ignoring these cultural aspects we may design the wrong policies, and we may not understand why certain policies, say labor market regulation, may or may not work in different countries. For instance, certain labor market and social policies may have very different effects depending on the nature of family relationships. This, for instance, is the point of a recent paper by Alesina, Algan, Cahuc and Giuliano (2010).[3]

Cross-country comparisons are very suggestive and provocative, but from a scientific point of view they tell very little, since too many things vary across countries. One needs to identify micro evidence, that is, one has to look at how different individuals within the same country behave as a function of their levels of family ties. By looking within a country one can hold constant all the other characteristics and institutions of a country.

Progress along the line of uncovering causation is done by looking at immigrants in another country, typically in the US. This is how it is done. One can attribute to, say, a Brazilian immigrant in the US the average cultural trait (in this case family ties) of his or her country of origin, Brazil. Then one can look at how a first-, second-, etc. generation Brazilian immigrant behaves in the US. If he displays a behavior consistent with the strength of family ties in Brazil, this means that such cultural traits persist even in a different environment. Not only that, but the objective of isolating causality is reached by attributing to this Brazilian immigrant not his own views (as measured by his answers to polls) but the average views of Brazilians. To some extent the US is a successful melting pot, but cultural differences in behavior persist. This opens up another fascinating question, namely how quickly cultures melt and how deeply. What determines the speed of assimilation? How does the geographical distribution of ethnic groups matter, and how does it affect such speed? For instance, are small ethnic groups more likely to assimilate quickly, or, since they are small, will they have a tendency to hold on more tightly to their cultural traits? Which cultural aspects assimilate more or less quickly? The answers to these questions may lead us to better understand immigration policies and to better design policies to deal with assimilation.
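A minimal sketch of the "epidemiological" strategy just described, with entirely hypothetical file names, variable names, and data (this is not Alesina's actual specification): each immigrant is assigned the average family-ties measure of his or her origin country, and an individual outcome is regressed on that origin-country average plus individual controls.

```python
# Hypothetical illustration of the epidemiological approach described above:
# attribute to each immigrant the origin-country average family-ties score,
# then regress an individual outcome on that average. All column names and
# data files are assumptions for illustration only.
import pandas as pd
import statsmodels.formula.api as smf

# one row per immigrant: outcome (e.g., weekly home-production hours),
# origin country, and individual controls
df = pd.read_csv("immigrant_survey.csv")                  # hypothetical file

# origin-country average family ties, e.g., computed from the World Values Survey
ties = pd.read_csv("family_ties_by_origin_country.csv")   # hypothetical file
df = df.merge(ties, on="origin_country", how="left")

# outcome regressed on the ORIGIN-COUNTRY average trait (not the respondent's
# own stated views), holding individual characteristics constant
model = smf.ols(
    "home_production_hours ~ origin_family_ties + age + female + years_in_us + C(education)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["origin_country"]})

print(model.summary())
```

Clustering the standard errors by origin country reflects the fact that the key regressor varies only at the country-of-origin level, which is a standard precaution in this design.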

The next question is where culture comes from: why, in certain social groups, regions, ethnicities or nations, are families tighter than in others? One has to look deep into history to understand the answer. For instance, one hypothesis is that in the distant past the adoption of certain technologies rather than others created more or less need for women's work in the fields. That led to a certain development of the role of women as "stay home" mothers and wives rather than workers, which may affect for centuries afterwards the role of women and the organization of the family and society.

This type of analysis more generally asks the question of "where preferences come from." We as economists always start with the assumption that preferences are primitive, exogenously given, and that we have nothing to say about where they come from. Not only do we think of preferences as primitive but also as constant over time. Cultural economics will lead in the direction of being more ambitious. Perhaps we can make some progress in explaining where certain attitudes are born, how they persist and what leads to a change. This cultural analysis will also help us understand the evolution of preferences and will link up with other fascinating areas of research like that of "persuasion," that is, how certain messages may change not only information and beliefs but also the utility function of individuals. Another point of contact here is with the literature on identity, pushed amongst others by George Akerlof.

The nature of family relationships is only one example. Another widely studied cultural trait is trust. The importance of trust in economics cannot be underemphasized. In Ken Arrow's words, "Virtually every commercial transaction has within itself an element of trust, certainly any transaction conducted over a period of time." It can be plausibly argued that much of the economic backwardness in the world can be explained by the lack of mutual confidence. What determines trust, its evolution, and its implications have been at the core of research in cultural economics. Work on trust spans from corporate finance to growth and development to international trade, where bilateral trust between countries has been shown to determine trade patterns. This point highlights another fundamental issue: individuals trust and interact better with those who are more similar to themselves. The latter consideration has important implications for issues concerning the costs and benefits of ethnic fragmentation.

Religious beliefs may also matter and are certainly part of a broad definition of culture. Beliefs in the afterlife may have implications for activities in the current life. Thrift may depend on your religious views. The role of women varies greatly across different religions. Weber's views about the differences between Protestant and Catholic beliefs are the primary example of this point.

One has to admit that studying culture is not easy. It is a concept that is hard to measure, and it is easy to fall into the trap of "anything goes." Identification problems are huge. Reverse causality always looms in the background of these studies of culture. We may perfectly identify certain things based upon "natural experiments," but what we uncover may be quite small and not very general. But we should not shy away from tackling big issues in economics. In my opinion our profession is slipping too much into perfectly tight methodologies applied to smaller and smaller problems. We should maintain the rigor that economists have even in the study of culture, and graduate students should be trained to think outside the box and push their creativity.

Precisely because we are pushing the analysis towards domains not typically travelled by economists, one often feels the need to extend the coverage of data and to build new data sets, including historical data sets, geographical data sets, surveys, experiments, etc. Economists have begun using (and extending when possible) surveys like the General Social Survey for the US, the World Values Survey and various regional surveys. Many experiments have been run. Great effort has been devoted to going back in history and collecting data on early institutions, human capital, and agricultural technologies. This is because one of the findings of this literature has been the long-term persistence of cultural traits. This large collection (and dissemination) of existing but previously unknown-to-economists data, and of new data sets, has been a very important outcome of this literature; the construction of new data sets has been one of the most important outputs of this research.

Question 2

The answer to Question 1 implicitly answers the second as well. The domain is advanced by including important but overlooked variables in the analysis.

[1] Guiso, L., P. Sapienza and L. Zingales (2006), "Does Culture Affect Economic Outcomes?", Journal of Economic Perspectives, 20: 23-48.
[2] Alesina, A. and P. Giuliano (2010), "The Power of the Family," Journal of Economic Growth, June 2010.
[3] Alesina, A., Y. Algan, P. Cahuc and P. Giuliano (2010), "Family Values and the Regulation of Labor," unpublished.

Multiple Skills, Multiple Types of Education, and the Labor Market: A Research Agenda [1]

Joseph G. Altonji, Department of Economics, Yale University
September 2010

[1] This work is licensed under the Creative Commons Attribution-NoDerivs 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nd/3.0/.

Summary

I propose a major program of research on skill, education, and the labor market. The program will build on four facts. First, ability and skill are multidimensional. Second, secondary and postsecondary education is heterogeneous in quality and in the types of skills and knowledge provided. Third, jobs differ substantially in what they require. Finally, technical change, globalization, and shifts in the composition of demand for goods and services alter the demand for particular skills in the labor market relative to supply, with important implications for the wage distribution. In essence, the research program will place the multidimensionality of ability, skills, education, and knowledge at the center stage of theoretical and empirical research on child development, educational attainment, and labor market careers. In this document, I discuss why the program is needed and why the prospects for success are high. I provide a brief sketch rather than a full-blown proposal and of necessity use a very broad brush.

Why is Research on Multiple Types of Skill and Education Needed?

Since the pioneering work of Gary Becker and Jacob Mincer, a large community of scholars has studied the demand for education and the economic return to education. There have been important advances in the use of instrumental variables methods and in the use of structural models of education choice and labor market outcomes. As a result of these developments, we know much more than before about the average return to a year in school. However, the overwhelming majority of the studies abstract from type of education. The focus is either on years of school completed or on broad education categories such as high school, some college, etc. This is unfortunate because basic descriptive analyses show large differences in the labor market payoff across subject areas. The substantial differences by gender and race in course of study contribute to observed gaps in labor market outcomes. And mismatch between the skills and knowledge the education system produces and the types valued in the labor market is a perennial public concern.

Over the past ten years, the role of non-cognitive traits and cognitive traits in the acquisition of human capital and in the labor market return to human capital has received considerable attention in economics. Much of this work has focused upon child development. (See, for example, Cunha, Heckman and Schennach (2010).) There is also current research on how cognitive skills and personality traits arise from genetic influences, early childhood environment, and formal education, and how they shape educational attainment and early labor market success. Progress in developmental psychology, genetics, cognitive psychology, and neuroscience makes this a promising area for research by economists on the production of human capital. So far, however, this research has not been systematically extended into models of the type of secondary and higher education people acquire or into models of the effects of skills and education on career paths.

Finally, research on the distribution of earnings has paid increasing attention to the effects of technical change, globalization, and changes in product demand on the demand for particular types of skills. Autor, Levy and Murnane (2003) and Autor and Handel (2009) are good examples. To understand trends in the level and distribution of wages, employment, and unemployment in the US and other countries, we need models that distinguish workers along multiple dimensions of skill and knowledge, and that distinguish jobs in parallel fashion.

Potential for Success

The time is ripe, for several reasons. First, research on the child development process is progressing rapidly. This is likely to continue, fueled by advances in the understanding of the role of genes and environment in shaping the talents and personality traits that matter for particular programs of study and subsequent career paths.

Second, progress has been made in the development of models of course selection in high school and in college that place the role of differences in predetermined abilities, knowledge, and preferences at center stage. Advances in computer power and econometric methods have made estimation of such models feasible. In particular, researchers in the U.S. and Europe have had success in quantifying the importance of occupation-specific skills for wages and job mobility patterns, and there has been considerable progress in this area over the past decade for researchers to build on. Empirical research on the causal effects of particular courses of study is in an early stage but should follow two paths. The first is to use dynamic choice models to understand educational decisions and use the restrictions of the model to account for selection bias when measuring causal effects. The second path is to use quasi-experimental methods that exploit variation in institutional features that influence how students are assigned to course sequences in high school or to majors in college. Arcidiacono's (2004) study of college major is one of a small set of papers that can be built upon. Research in other countries that rely more heavily on test scores to decide the type of primary and secondary education and admission to particular college majors will be valuable.

Third, the data are getting better. State-level longitudinal data systems that track individual students are revolutionizing research on the "education production function". Data for Florida and Texas have been matched to data on higher education and to earnings records. These longitudinal data systems have enormous potential for the study of how student achievement and field of study affect labor market performance. Research on the return to student achievement, high school curriculum, vocational programs, and college major requires more data sets that track individuals well into the labor market.

Fourth, advances in modeling and especially in computation will make it feasible to incorporate multiple skill and education types into general equilibrium models of the supply and demand for labor that have been used in macroeconomic studies of wage growth and distribution.

Fifth, advances in computation and in simulation-based estimation methodologies are making it possible to estimate an integrated model of child development, human capital accumulation, and labor market success that spans birth to adulthood by combining data sets that individually lack the necessary information. The Early Childhood Longitudinal Study is having an enormous impact on research on the role of family and schools in child development. The NLSY79: Children and Young Adults data project has the promise of accomplishing this. In a few more years NLSY:97, which started with about 9,000 children between 12 and 16 in 1997, will become an extremely valuable resource. The longer it is continued, the more valuable it will be.

Sixth, the data can and should be improved. Excellent survey-based panel data sets beginning in childhood have been collected in other countries. Other countries (notably Denmark) have administrative data sets that can be used to research student achievement, field of study, and labor market success. Part of the research program should be to invest in new data sets that begin in early childhood and continue until career patterns are well established.

Re-surveying members of existing panel data sets that start in adolescence but stop in the mid-20s would have a huge payoff. Here I have in mind the National Education Longitudinal Survey: 1988, an extraordinarily rich data set that started with a national sample of 8th graders in 1988 but ended in 2000. To the extent that its age coverage overlaps with the early period of other data sets that follow children into adulthood, it can play a key role in research designs that use multiple data sets for estimation.

Extending survey data through matches to administrative earnings records is another avenue that should be explored, perhaps using advances in statistical methods for the construction of synthetic data sets that mimic the statistical properties of the original data but completely hide data on individuals.
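As a toy illustration of the synthetic-data idea mentioned above (not a description of any agency's actual disclosure-avoidance procedure), one can fit a simple statistical model to the confidential microdata and release draws from the fitted model rather than the records themselves:

```python
# Toy illustration: release synthetic records that mimic the first and second
# moments of confidential earnings microdata. Real synthetic-data systems are
# far more sophisticated; this only shows the basic idea. All numbers are made up.
import numpy as np

rng = np.random.default_rng(0)

# stand-in for confidential data: log earnings, years of schooling, experience
confidential = rng.multivariate_normal(
    mean=[10.5, 13.0, 20.0],
    cov=[[0.40, 0.15, 0.05],
         [0.15, 6.00, -2.0],
         [0.05, -2.0, 90.0]],
    size=5_000,
)

# fit mean and covariance on the confidential file...
mu_hat = confidential.mean(axis=0)
sigma_hat = np.cov(confidential, rowvar=False)

# ...and release draws from the fitted model, not the underlying records
synthetic = rng.multivariate_normal(mu_hat, sigma_hat, size=5_000)

print("confidential means:", np.round(mu_hat, 2))
print("synthetic means:   ", np.round(synthetic.mean(axis=0), 2))
```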

A critical need is for better data on job content and skill requirements in panel data sets. The NLSY79 and NLSY97, the PSID, and cross-sectional surveys such as the CPS provide very little information on the tasks people perform at work and the skills that they need to perform them. Most researchers rely upon variables from the Dictionary of Occupational Titles and its successor data set, O*NET, that can be merged into household surveys using 3-digit occupation codes. However, jobs vary a lot within a broad occupation classification. A major effort is needed to develop survey modules that can be used to provide more information about what people do and the skills that are involved. Autor and Handel (2009) achieved some success in using a short series of questions to elicit information about what people do on the job and what skills they need. Similar data are available for other countries.

References Cited

Arcidiacono, Peter, "Ability Sorting and the Returns to College Major," Journal of Econometrics, Vol. 121, Nos. 1-2 (August 2004), 343-375.

Autor, David, and Michael Handel, "Putting Tasks to the Test: Human Capital, Job Tasks, and Wages," NBER Working Paper No. 15116, June 2009.

Autor, David, Frank Levy, and Richard J. Murnane, "The Skill Content of Recent Technological Change: An Empirical Exploration," Quarterly Journal of Economics, 118(4), November 2003, 1279-1334.

Cunha, Flavio, James J. Heckman, and Susanne M. Schennach, "Estimating the Technology of Cognitive and Noncognitive Skill Formation," Econometrica, 78 (May 2010).

Grand challenges in the study of employment and technological change: A white paper prepared for the National Science Foundation* David H. To view a copy of this license.0/ or send a letter to Creative Commons. visit http://creativecommons. Autor MIT and NBER Lawrence F. Katz Harvard University and NBER September 29.0 Unported License. 2010 This document contains 1. California. USA. 171 Second Street. San Francisco.org/licenses/by-nd/3.968 words excluding this title page * This work is licensed under the Creative Commons Attribution-NoDerivs 3. 27 . 94105. Suite 300.

Autor and Lawrence F. 41 percent of the U. Hence. More recent technological changes from electrification to computerization have expanded the demand for highly-educated workers but substituted for less-skilled production workers. workforce worked in agriculture. The result has been a labor market that greatly rewards workers with college and graduate degrees but is unfavorable to the less-educated. increasing demand for labor throughout the economy. But it did not reduce total employment. and clerks) and for less-skilled operatives. It is not fallacious. shifting workers from older to newer activities. second. 9/27/2010 David H.S. in the long run technological progress affects the composition of jobs not the number of jobs. The economic and social repercussions are only starting to receive study.Leading economists from Paul Samuelson to Paul Krugman have labored to allay the fear that technological advances may reduce overall employment. a reversal of the gender gap in higher education (a supply-side force). After a century of astonishing agricultural productivity growth. Two forces are rapidly shifting the quality of jobs. The shift from the artisanal shop to the factory with mechanization in the nineteenth century reduced the demand for skilled craft workers and raised it for more educated workers (managers. for example. the number stood at 2 percent in 2000. altering economic mobility. Katz 28 . to posit that technological advance creates winners and losers. however. This Green Revolution transformed physical and cognitive skill demands and the fabric of American life. and the unemployment rate fluctuated cyclically with no trend increase. In 1900. particularly less-educated males. and redefining gender roles in OECD economies. reshaping the earnings distribution. Technological improvements raise overall living standards but may adversely affect the quality of jobs for some workers. These forces are. causing mass unemployment as workers are displaced by machines. The employment-to-population ratio rose over the twentieth century as women moved from home to market. What is fallacious in the ‘lump of labor fallacy’ is the supposition that there is a limited quantity of jobs. engineers. reflecting women's rising educational attainment and men's stagnating educational attainment. Higher productivity raises incomes. Technological improvements create new products and services. This ‘lump of labor fallacy’—positing that there is a fixed amount of work to be done so that increased labor productivity reduces employment —is intuitively appealing and demonstrably false. first. employment polarization (a demand-side force) and.

low-wage jobs. professional. The next four columns display employment growth in middle-educated and middle-paid occupations. On the left-hand side of the figure are managerial. These occupations were hard hit by the Great Recession with absolute employment declines from 7 to 17 percent. and operatives—accounted for 57 percent of employment in 1979 but only 46 percent in 2009. high-wage jobs and low-skill.S. non-agricultural employment. employment growth is polarizing with job opportunities increasingly concentrated in relatively high-skill. and technical occupations. Low-wage occupations increased as a share of employment in 11 of 16 and high-wage occupations increased in 13 of 16. and high-wage—covering non-agricultural employment and grouped by average wages.Employment polarization In the United States and other advanced countries. Autor and Lawrence F. In all 16 countries low-wage occupations expanded relative to middle-wage occupations. Figure 1 plots changes in employment by decade for 1979 through 2009 for ten major occupational groups encompassing U. fabricators and laborers. These occupations divide into three groups. In all 16 countries. Their growth rate lags the economy-wide average and slows in each subsequent time interval.S. caring for. office workers. production workers. and operators. office workers. craft and repair. production. Employment growth in high-skill occupations was robust for the past three decades. The consequence has been a sharp decline in the share of U. employment in traditional “middleskill” jobs. Employment growth in service occupations has been rapid in the past three decades. middle-wage occupations declined as a share of employment. employment growth in service occupations was modestly positive. Katz 29 . Workers in service occupations disproportionately have no post-secondary education and relatively low hourly wages. or assisting others. expanding by double digits in the 1990s and the pre-recession years of the past decade. The polarization of employment is widespread in the OECD. The final three columns depict employment trends in service occupations involved in helping. middle-. including sales. These are highly-educated and highly-paid occupations. Figure 2 plots the change in the share of employment between 1993 and 2006 in 16 European Union economies for three sets of occupations— low-. Employment polarization: Demand-side causes 9/27/2010 David H. The four “middle-skill” occupations—sales. Even during the Great Recession.

Katz 30 . medicine. Rather. These advances have also dramatically lowered the cost of offshoring information-based tasks to foreign worksites. For a task to be autonomously performed by a computer. We refer to the procedural. engineering. such as bookkeeping. William Nordhaus (2007) estimates that the real cost of performing a standardized set of computational task fell at least 1. persuasion. it must be sufficiently well defined (i. but as per the Green Revolution example. The price of information technology has fallen at a stunning pace. The automation and offshoring of routine tasks reduces the domestic demand for workers in these tasks. and monitoring jobs. altering the composition of jobs and the tasks workers perform within jobs. Their ability to accomplish a task depends upon the ability of a programmer to write a set of procedures or rules that appropriately direct the machine at each possible contingency. Abstract tasks are activities that require problem-solving. The substantial decline in clerical and administrative occupations is substantially a consequence of the falling price of machine substitutes for such tasks. storing.7 trillion-fold between 1850 and 2006. These non-routine tasks can roughly be subdivided into two major groups on opposite ends of the occupational-skill distribution: abstract tasks and manual tasks. science. with the bulk of this decline occurring in the last three decades. technical and creative occupations.e. Measures of job task content uniformly find that routine tasks are most pervasive in middle-skilled cognitive and manual jobs. Although computers are now ubiquitous. managerial. it creates significant advantages for workers whose skills are complementary to computers and it disadvantages those whose tasks are easily substituted by computers. and manipulating information are increasingly codified in computer software and performed by machines. it raises relative demand for workers who can perform ‘non-routine’ tasks that are complementary to the automated activities. These tasks are characteristic of professional. Autor and Lawrence F. Job tasks that primarily involve organizing. intuition. retrieving. they do not do everything. clerical work. repetitive production. 9/27/2010 David H. and design. does not necessarily reduce overall labor demand.. Simultaneously. secular price decline in the real cost of symbolic processing creates enormous economic incentives for employers to substitute information technology for expensive labor whenever feasible. codifiable) that a machine can execute the task successfully by following the steps set down by the programmer. such as law.A leading explanation for the polarization of employment in the OECD focuses on the computerization of many job tasks. and creativity. rule-based activities to which computers are currently well-suited as ‘routine’ tasks. The rapid. Workers who are most adept in these tasks typically have high levels of education and analytical capability.

Manual tasks, on the other hand, are activities that require situational adaptability, visual and language recognition, and in-person interactions, which are precisely the job tasks that are challenging to automate because they require responsiveness to unscripted interactions. Such tasks demand workers who are physically adept and, often, able to communicate fluently in spoken language. Driving a truck through city traffic, food preparation and serving, haircutting, or installing a carpet are all activities that are intensive in non-routine manual tasks. Yet such jobs are often organized in ways that may require little or no education beyond high school. This latter observation applies with particular force to service occupations. These jobs demand interpersonal and environmental adaptability, and they are also difficult to offshore because they typically must be performed in person, often in direct contact with final consumers (e.g., food service, cleaning and janitorial work, house-cleaning, and maintenance).

A consequence of these forces—rising demand for highly-educated workers performing abstract tasks and for less-educated workers performing 'manual' or service tasks—is the partial hollowing out or polarization of employment opportunities seen in Figures 1 and 2. This hypothesis is supported by a rapidly growing body of research that links the process of computerization to occupational change over time and across countries. Employment projections from the U.S. Bureau of Labor Statistics forecast these shifts to continue for (at least) the next decade.

Educational gender reversal

The polarization of employment opportunities in the last three decades has been accompanied by a substantial secular rise in the earnings of those who complete post-secondary education. The hourly wage of the typical college graduate in the U.S. was approximately 1.5 times the hourly wage of the typical high-school graduate in 1979. By 2009, this ratio stood at 1.95. This enormous growth in the earnings differential between college- and high-school–educated workers reflects the cumulative effect of three decades of more or less continuous increase. Many other OECD countries have seen increases in the wage gap between college and non-college workers, though the U.S. case is more extreme. The polarization of job opportunities is half the explanation for the growing wage gap. If the rate of growth of educational attainment had kept pace with the rising relative demand for highly-educated workers, the increase in these earnings differentials may have been held in check. But it did not in the United States. The explanation for why it did not is a puzzle and cause for concern.

Throughout the OECD, female educational attainment rose substantially in these decades. As shown in Figure 3, the share of females attaining post-secondary ('tertiary') education increased remarkably in this period. Comparing the fraction of women ages 25-34 with college education in 2009 with that of women ages 45-54 in the same year, we can see that college attainment among women more than doubled in many countries over two decades. In Spain, it rose from 23 to 43 percent. In the U.S., the gains were more modest but substantial, rising from 28 to 35 percent.

The counterpoint to gains in female skill investment is the lackluster increase among males. Figure 4 shows that male college attainment rose only weakly in most countries over the same period. In Spain, it rose from 27 to 33 percent. Indeed, in the U.S., college attainment in 2009 was several percentage points lower among males ages 25-34 than among males who completed schooling two decades earlier. Figure 5 shows that female rates of college attainment now greatly exceed those of males in most industrialized countries. For cohorts that were ages 45-54 in 2009, female-to-male college attainment was roughly at or below parity in eight of eleven countries. Among younger cohorts (ages 25-34) in 2009, however, the ratio of female-to-male college attainment exceeded parity in all eleven countries in the figure. For the European countries this ratio averaged 1.3, almost identical to the U.S. ratio.

The rising educational attainment of women is good news, but males' failure to keep pace is problematic. It means fewer young males will gain entry to high-end occupations and that the supply of workers who can perform high-end abstract tasks is not increasing as fast as demand. This exacerbates rising wage inequality and retards the growth of advanced economies, which depend on their best-educated workers to develop and commercialize the innovative ideas that drive economic growth.

The cross-national phenomenon of polarizing employment growth and stagnating male educational attainment presents a grand research challenge on two fronts: (1) understanding the sources of gender differences and trends in college attainment, and (2) analyzing the social and political implications of job polarization and the decline in traditional "middle-class" jobs. The extent to which in-person services can be reorganized and professionalized into higher-skill, higher-wage jobs may be a key to whether we see continued economic polarization or the emergence of new middle-class jobs and shared prosperity.

[Figure 1. Percent Change in Employment by Occupation, 1979-2009. Ten major occupational groups (managers, professionals, technicians, sales, office/admin, production, operators/laborers, protective service, food/cleaning service, personal care), by sub-period: 1979-1989, 1989-1999, 1999-2007, 2007-2009.]
[Figure 2. Change in Employment Shares by Occupation, 1993-2006. Sixteen European economies plus the EU average and the USA; occupations grouped by wage tercile (lower, middle, upper third).]

[Figure 3. Female College Education Attainment Rates in 2009, by Birth Cohort (ages 25-34, 35-44, 45-54) and Country. Source: Eurostat; U.S. Census Bureau.]
[Figure 4. Male College Education Attainment Rates in 2009, by Birth Cohort (ages 25-34, 35-44, 45-54) and Country. Source: Eurostat; U.S. Census Bureau.]

[Figure 5. Ratio of Female to Male College Education Attainment Rates in 2009, by Birth Cohort (ages 25-34, 35-44, 45-54) and Country. Source: Eurostat; U.S. Census Bureau.]


Some Foundational and Transformative Grand Challenges in Economics

Suggestions from Martin Neil Baily, The Brookings Institution

Bullet point format

• The fundamental question: There is no generally accepted theory of macroeconomics that is adequate to respond to the challenges thrown up by the financial crisis and resulting deep and persistent recession.
o Much of the research in macroeconomics over the past forty years has been focused on developing models of rational behavior with rational expectations that are consistent with observed macro data. As a comprehensive model of business cycles, the real business cycle model has failed dismally. However, there is evidence that technology shocks or productivity shocks can have major impacts on cyclical movements, as happened in the late 90s and as seems to be happening in the current recovery. This has resulted in a gap between macroeconomic policymakers and the research community because, in practice, policy is made with a hybrid of Keynesian, neoclassical, and monetarist ideas.
o The gap between theory and practice has consequences. Although some economists saw the dangers of a housing bubble, almost no one predicted how severe the housing collapse would be and how it would damage the financial system and the economy. The FED Chairman and successive Treasury secretaries were forced to improvise over weekends in the face of collapsing financial institutions and a potential breakdown of the financial system. The macroeconomic forecasting models that serve policymakers, such as the FED, or the business sector are extended IS-LM models that work pretty well under normal circumstances but do not predict sharp changes, such as severe recessions.
o The NSF should launch a large-scale effort to re-examine what we know and do not know about the macro-economy and how research can move forward constructively. There should be a willingness to examine new approaches and not be tied to past frameworks. For example, Tversky and Kahneman taught us in the 1970s that people do not make good economic decisions under uncertainty. Behavioral economics has looked at issues such as how and why people save, but their insights have not been incorporated into mainstream macroeconomics. To improve macroeconomics, we need to develop these new economic tools to understand how people and markets behave when there is widespread fear and panic. Other important topics are: 1. To define and measure systemic risk. 2. Given that monetary policy is being limited by the zero bound on interest rates, maybe we should revisit the optimal inflation question. 3. At the present time, neither monetary nor fiscal policy is capable of ensuring a solid economic recovery.

o Macroeconomics needs to re-boot: not discarding what has been learned, but reassessing what is of value and what needs to be done to provide a better understanding of the economy.
o In a related argument, Alice Rivlin and Isabel Sawhill commented that the NSF should fund research that embeds what we know about behavior at the microeconomic level into macroeconomics. Sawhill pointed to the gap between Guy Orcutt and Alice's original vision of building microsimulation models that can be used for policy analysis – and the reality. We have not, except for a few underfunded and modest efforts. Other countries have developed such models with government funds and use them for all kinds of analysis.

• The fundamental question: Most young people in the United States will not complete a two or four year college degree and most of them will end up in jobs that do not pay well, given the current structure of the economy. How can we structure the US education and training system so that it provides the skills that young people need to earn middle class wages?
o The combination of changing technology and globalization has resulted in a relative decline in the earnings of people without high levels of skill or education. Despite the earnings premium that is achieved by holding a high school diploma and by college completion, an increasing number of students, especially young men, are dropping out. Education is something they hate and feel humiliated by. It is essential that the education and training system in the United States do a better job for non-college bound students.
o There is a strand of thinking about policy in this area that is essentially nostalgia based. In the past, wages were set with a large institutional component, union bargains or routine pay increases. These approaches will not work in the future. Companies and unions that refuse to face competitive forces will go broke. In order to receive middle-class earnings, workers must have productivity to match.
o Much of the economics literature on education and skills has focused on tracing the relationship between the amount of education and the subsequent wage, or, for job training programs, the extent to which these programs improve subsequent wages.
o It does not appear to be working to teach watered down academic curricula to non-college bound students because they do not see the value of what they are asked to learn. How can curricula be reformed to provide economic value and attract students? The current training of teachers does not equip them to train students in marketable skills in the job market. How can teacher training and pay structures be reformed so that schools can serve the non-academic students?

• The fundamental question: What is the value of work and how does it change with age? (Based on a suggestion by Henry Aaron.)

o Economics treats work as a source of disutility, at least at the margin. But work produces huge consumer surplus for many people and, in the modern world, perhaps for most. Some college professors would choose to continue working until they drop, but those who make a living carrying pianos feel differently. Furthermore, limitless leisure can be devastating for some people and may be devastating for all.
o As policymakers face the pressure of budget deficits, it is important to have a better understanding of the value and cost of work. Given population aging, it is almost inevitable that the retirement age will rise and people will be required to work longer. There are all sorts of margins that are relevant—at what age one starts working, how long one works each day, how many weeks a year, how many years, when one retires, whether one retires abruptly or suddenly.
o European countries have taken a very different path than has the United States, choosing policies that result in much shorter working hours per year through shorter work weeks and longer vacations. In some (not all) European economies, labor force participation is also much lower than in the US, with more early retirement and fewer women working. Faced with budget pressures in Europe, policymakers are moving to increase the amount of time spent working. If everyone is required to work longer, what policy steps would be helpful as older workers manage the transition from jobs requiring strength and endurance to jobs that are less physically demanding?

• The fundamental question: How are innovation, productivity and employment being affected by globalization and the rise of new economic powers?
o Popular discussion of the challenges facing the US economy almost always ends up talking about India and China and what is happening in these economies. Some economists, for example Paul Krugman, Alan Krueger, Gene Grossman, Lawrence Katz and Robert Lawrence, write about these and related issues, but a casual check of recent economics journals does not reveal a broad enough interest.
o How important is innovation in emerging markets? By some accounts China's manufacturing sector remains concentrated in low-value activities, but there are also reports that it is moving up the technology ladder. India is clearly generating innovation in its provision of offshore services and is starting to make manufacturing innovations in autos and medical devices. Does innovation in emerging markets result in a reduction in innovation in the United States or is it complementary? Do American consumers benefit from overseas innovation?
o How do innovation and productivity growth affect employment? It seems that industries that developed within a national market and are then exposed to global competition are forced to restructure and make sharp employment reductions as they increase productivity. Other industries are on the steep part of their S-curve and innovation spurs rapid productivity

increases and price declines that lead to increased demand and stable or increasing employment. Successful export industries are able to drive increased domestic employment.

• The fundamental question: There is a wide variation in the cost of treating any given disease, depending on the hospital or region where it is treated. There is a similar large variation across advanced economies in health care costs. There seems to be little or no relation between the cost of treatment and the effectiveness of the treatment. How can this be? How can this result be utilized to improve the quality and reduce the cost of the US health care system? (Also suggested by Henry Aaron.)
o The Dartmouth studies have revealed wide variations within the United States, and they found best practice treatments were not the most expensive. The McKinsey Global Institute compared health care costs in the United States to other OECD economies and found that costs were much higher here and that health outcomes were generally as good or better in other countries.
o Despite statements to the contrary, there is evidence that treatment protocols used by doctors are heavily influenced by the economic incentives that they face. For the treatment of cancer, surgeons recommend surgery and radiation oncologists recommend radiation. Medicare-driven reimbursement patterns have driven very short hospital stays in the United States. In Germany hospital stays are much longer because they provide an economic return to doctors and hospitals.
o The health sector is very backward in its use of information technology, not because of problems in the technology, but because incentives to increase efficiency are not in place.
o The National Institutes of Health is barred from considering cost-effectiveness as it evaluates different treatment protocols. This puts the onus on economists and other social scientists to undertake studies of this area.

• The fundamental question: Can evolutionary economics provide new insights into areas where conventional economics has proven inadequate?
o Attached to this statement is a copy of the recent speech by Charles Taylor in which he makes the case for evolutionary economics. Taylor is at the Pew Charitable Trusts but is expressing his own views.

Attachment to: Some Foundational and Transformative Grand Challenges in Economics

Macro-Prudential Regulation and the New Road to Financial Stability: Looking Through Darwin's Glasses

Charles Taylor, Director, Financial Reform Project, Pew Charitable Trusts
(Chicago Federal Reserve Bank/IMF conference: "Macro-prudential Regulatory Policies: The New Road to Financial Stability", September 23-24, 2010)

Thank you to the IMF and the Federal Reserve Bank of Chicago for inviting me to speak to you tonight. I speak on my own behalf and not that of my employer.

When I was at the Group of Thirty, I got to know Brian Quinn, who was Executive Director of Supervision at the Bank of England. Brian said to me once that a good year for him was one in which a small bank or two failed somewhere in the United Kingdom. That was a provoking thing to say on the face of it, because his job was to make sure that banks didn't fail. But he had a good reason. He wanted his staff to keep their hand in. With a small failure or two each year, they would know what they were doing if, heaven forbid, one day something more serious happened.

I think in fact there may be a deeper reason why some positive rate of failure among financial institutions is good. You need turnover in any population for it to prosper in a changing environment, and financial institutions most certainly live in a changing economic, social and technological environment. It's a complex environment too. There is no way to understand everything that's happening and so there is no way to rely entirely on planned adaptation. Individual institutions will inevitably make mistakes – however good their managers, shareholders and regulators.

Then again, consider the spread of best practice. Adopting best practices is by definition beneficial for any individual institution that does it, whether we are talking about marketing, product development, risk management or any other aspect of business. But if everyone tries to adopt best practice in all things, two things happen. The population will become increasingly homogeneous, and herd behavior will be the result if any small thing shocks the system -- increasing the chances of instability. And something that might cause one member to fail is increasingly likely to wipe out everyone. Carried to an extreme, the universal and strict adoption of best practices can be highly destabilizing for any system as a whole.

It is not just greed that's good. So is some failure.

what to watch for and what to do about it. Evolutionary theory is a big enough tent to accommodate several other theories and insights about systemic stability. It certainly has through my sixty years on this planet – pretty relentlessly. specialization. the technological. death makes way for new life and diversity is a form of insurance. it may help us pick up signals of future instability rather than just explanations for what went wrong in the past. talent and technology is fierce. Because it is comprehensive and deep. • There is selection based on fitness. Indeed. speciation. Looking at financial stability through Darwin’s glasses will bring a great deal into focus. Macro-prudential regulation is then simply the art of constraining evolution.at least when governments don’t intervene. Now. Evolutionary algorithms: • Work locally on individuals or families by changing them or their descendants and there is an element of unpredictability involved. Why is this? Well. For evolving populations. it is a mathematical certainty that many things happen. • The environment changes unpredictably. The list is long: diversity. co-evolution and predator-prey relationships emerge. cooperation. the initial population can then “spawn” so to speak evolving secondary populations of 42 . Let’s examine each of these assertions. 1. however. and the competition for funds. And networks of interaction and interdependency appear for all of the above. complexity. markets. history does matter. initiated by individual players or small groups clubbing together. I believe we should think in terms of evolution when we think about financial systems and macro-prudential regulation. evolutionary theory provides an array of insights into systemic instability – its causes. given enough time and a large enough population. seem sensible enough if you are thinking in terms of evolution. And • The environment provides only limited resources so that competition is inevitable as successful populations grow. you can see that the financial system also changes in large measure as an evolutionary system in the strict scientific sense: change is local. Still. there certainly is selection based on profitability -. evolution can be viewed as a family of algorithms for changing populations. symbiosis. Remarkably. if you stop and think for a minute. Both.*** Both of these insights are paradoxical if you think in comparative static terms. • History matters in that evolution usually tinkers and only occasionally does something radical: what comes after usually resembles what went before. The Financial System is an Evolutionary System As many of you know. There is no doubt that the financial system evolves in the colloquial sense all the time. economic and social environment is always changing unpredictably.

processes and strategies are all populations where change is local.networks. or bump along. it doesn’t always improve its fortunes. processes and strategies. evolution is a tough task master that throws often throws a curve ball. So much for the canard that evolution cannot make predictions! Because it is an evolutionary system in the strict sense. What happens to one of them is changing the environment of the others. Evolution is All About Instability While evolution always changes a population. Evolutionary theory predicts all these things. selection rules. And I mean that strictly. Anyway. they bump up against resource constraints. frankly. Let’s talk about instability. Evolving populations can only ever have limited intelligence – limited knowledge of the past and the present and limited foresight into the future. The investment bank extinction could be viewed that way: they had reached the limits to growth and had begun to compete in ways that undermined their own resilience and reliability – sacrificing capital and liquidity for profits. it turns out that heritable limited knowledge and foresight has interesting implications in evolutionary theory – especially when it comes to instability. We will come back to that in a moment. all anticipating intelligence of econ 101 because for such brainiacs there is no uncertainty and. in the sense that networks. 2. Consider the uneasy equilibrium between foxes and rabbits in the English countryside. There is a rich literature of mathematical modeling and agent based simulation that tells us that co-evolution can be highly unstable. competition or predator-prey relationships develop between them. “That sounds more like the bond trader I know.” Me too. no need to evolve. characteristics.” I am sure you are thinking. Dismal indeed. the environment changes and resources are limited. If • 43 . One more thing before we turn to financial instability. The facts confirm the theory remarkably well. Those creatures inhabit timelessness – heaven with the all the boredom and none of the joy. characteristics. they start to coevolve. Co-Evolution: When species arise and try to co-habit and either cooperation. not just metaphorically. or just die out. Leaving aside extinctions. Instability arises because of: • Resource Constraints: When populations grow steadily for a long time. which clearly are sub-optimal from the viewpoint of those concerned. history matters. Some evolutionary systems feature intelligent populations. they will either break into sub-populations that compete. we should expect to see all of these things occur in the financial system – and indeed we do. Nature does too. Limited foresight. If they can’t then adjust to zero growth or if the resources are not renewable. “Ah. They never feature the omniscient all knowing.

If rabbit slyness emerges fast enough. the rabbit population is becoming better at evading foxes creating negative feedback. Particularly where evolution is accelerating and new complexity is popping up all over the place. increasing both complexity and diversity. negative feedback comes on the scene too late. slower rabbits. Complexity. the pressure is on to become more cunning still as the rabbit population thins. it can be very destabilizing.foxes start to grow more cunning and hunt down more and more rabbits. Speciation or specialization begets more speciation and specialization. Financial positive feedback loops that eventually abate include credit. then cycles can continue forever. however. On average. Less attentive. Both require networks of interdependence. If we were likely to be wrong. avoiding costly and time-consuming random variation. Management fashions certainly influence the senior managers of large banks who face extraordinary • • 44 . negative feedback counteracts positive feedback quickly enough. which quite evidently co-evolve whether we are talking about nature or finance. are being culled continually. (When I was a currency analyst on Wall Street. It can eliminate a lot of really bad variations in thought experiments. busts and collapses play out. or to put it another way. It obscures the past and the present and makes the future hard to predict. adaptive Lamarkian evolution can enhance Darwinian evolution in intelligent species – a huge advantage. and to get good at preying on others. liquidity and leverage cycles and.) But extrapolation and imitation tend to produce herd behavior and homogeneity. But. These can be powerful survival tools for the below average much of the time and for everyone when times become uncertain – moving with the crowd offers some protection. What works for krill or starlings as a defense against whales or eagles can make matters a lot worse in a financial boom or a bust. it gets easier to make mistakes and extrapolation and imitation strategies become increasingly attractive. if they emerge too quickly or last too long. however. the dangers of concentration and about other characteristics of networks that can hamper or help contagion. unfortunately. Network theorists and epidemiologists have a thing or two to tell us about network stability – the importance of super-spreaders and critical nodes. however. Put another way. Intelligence allows a species to anticipate the responses of predator and prey. it adds significantly to instability because it makes strategies of extrapolation and imitation more likely. Complexity: One of the best things about evolution is that it explains complexity so satisfactorily. Limited Intelligence: Intelligence is a powerful advantage for any species. adds to instability for those of limited intelligence. we wanted to be wrong together. I was fascinated to see that when times got uncertain the dispersion of forecasts from different analysts diminished. There is positive feedback driving cycles of increasing foxy cunningness. they can often be found at the scene of the crime when booms. If. • Networks of Interdependence: Two good survival strategies for species are to cooperate.

products and services. It is not just that networks provide pathways for domino effects but. and strategies.of processes. Evolutionary systems are competitive and they drive individuals toward the edge of their abilities. • Self-criticality: A sixth source of instability is the tendency for evolutionary systems to be self-critical -. markets. but I think I have made my point.complexity both within and without. as avalanches get bigger. • Multiple small causes – that are associated with impending avalanches of selfcriticality. But. • Concentration – a form of unstable interdependence. • Complexity. practices. The size of these avalanches are often governed by a power law – there is a constant that dictates how. and • Interconnectedness – which can be creating super-spreaders. evolution lines up wobbly dominos. declining capital or liquidity. • Declining robustness and resilience (or declining “wellness. They tend to over-specialize – think of developing a particular skill but neglecting your general education. filaments of fragility of uncertain length develop and from time to time generate avalanches of failure. incomprehensibility of new instruments and trading strategies. strategies. Implications for Macro-Prudential Strategy I could go on. opacity and speed of change – including complexity of organizational structure. • Co-evolution -. increasing leverage. critical nodes and new pathways for other kinds of instability. their frequency declines.not in the sense that Wall Street traders spontaneously start to organize after work classes where they can criticize themselves -. and rapid growth in activity and profits. institutions. Evolution suggests plenty of things to monitor for early signs of instability: • Homogeneity – of organizational structures.” one might say) – meaning declining excess capacity. as I said. 3. • Incentives misalignment – a form of informational asymmetry that creates corrosive positive feedbacks • Turnover – which should be neither too low nor too high but just right. given half a chance. practices. Moreover.but rather that evolutionary systems have a strong tendency of their own accord toward states in which they are teetering on the edge of failure. 45 . across any network of interdependency. exhausting their reserves. • Positive feedback mechanisms – and things that encourage positive feedback such as information asymmetries and misaligned incentives (including moral hazard). which can be symptoms of something going wrong as often as something going right. extrapolation and imitation strategies breed herd behavior and homogeneity which are destabilizing.

he told me all about a new piece of software his firm had developed. evolution may help us ready ourselves to fight the next war rather than the last one. increasing transparency and making sure that institutions of all sizes can be broken up or allowed to disappear when they fail. Agent based simulation (ABM). but forever changing. He was very excited because it allowed him to try out different routes and then to “drive” along them to see how the lay of the land changed. Evolution is a forward-looking context for other theories and insights. he supervised the construction of the M4. products and processes along with institutions. Only “some comfort” because we are creatures of limited intelligence and. Beyond that.-lengthening those filaments of instability associated with self-critical systems. I was studying mathematics at Cambridge at the time and one day when he came to pick me up at my college for a lift to London.While some of these can be measured by traditional types of economic indices and surveys for others. there are some familiar ideas to consider such as discouraging concentration. CoVaR and credit exposure mapping address network instability by addressing the relationship between the size of links and vulnerability – how close and how wobbly the dominoes are. Evolution also gives us a more balanced theory of micro-prudential regulation and market discipline. The possibilities for undesirable side-effects not only are legion. first of all. the motorway that runs West from London across the Cotswolds toward Bristol and Bath. In the late 1960s. raise capital and liquidity requirements. tread carefully. Both are capable of adding to stability by raising wellness. monitoring the co-evolution of markets. such as homogeneity or process evolution there is experience in management and the sciences to draw on. Finally. What evolution can tell us about policy levers? Well. and looking for circumstances in which many small things may be going wrong together . 4. he and his colleagues 46 . But both are also capable of adding to instability through positive feedback loops and increasing homogeneity. Conclusions I am reminded of a old friend of mine who was a civil engineer. counteracting protracted positive feedback loops in the credit and liquidity cycles. avoiding threshold effects that can precipitate rapid positive feedback in microprudential and market regulations. Over the succeeding months. lowering permitted leverage and loan-to-value ratios. critical details are bound to elude us from time to time. Then there are some less familiar ones such as encouraging (and certainly not inhibiting) diversity. complex adaptive systems (CAS) analysis and network analysis all clearly apply to evolutionary systems. aligning incentives. while we may be looking at the right thing when we study evolution.

we need glasses that will help us focus on the landscape both near and far. For that. I hope that Darwin’s glasses were bifocals.worked out a route of gentle turns. The result is arguably the most beautiful motorway in England. they focused on the trees ahead and the individual sheep. we need to study the lay of the land and nudge the financial system away from cliffs and precipices. the view across the hills and valleys was blurred in the extreme. Macro-prudential regulators are also trying to look across the landscape and try out different routes. We need to ensure the co-evolving populations of which the system is made up are strong enough to run the occasional rapids and weather the occasional storm. elegant bridges and lovely vistas. They rarely looked at the woods or the flock as a whole. In the future. They are not building a motorway but rather shepherding an independent minded flock through the trees. If they did look up. In the past. 47 . I think we should try them on.


A Proposal for Future SBE/NSF Funding: Refocusing Microeconomic Policy Research

Steven Berry
Yale University Department of Economics & Cowles Foundation and NBER

Abstract

How can the NSF harness large and vital research efforts in econometrics and economic theory to address our era's most important microeconomic, social and climate policy questions? The goal presented in this white paper is to refocus the economics profession's more technical fields of inquiry on ideas and tools that are relevant to policy, while making sure that the most useful newly developed ideas and tools are actually adopted in policy analysis. This white paper highlights the potential for collaborative research efforts that focus on applying new ideas and methods to important policy questions, with the aim of providing better policy analysis while also ensuring that the more technical researchers receive important feedback as to what tools are actually of real-world use. Examples of applications include bio-fuels / global warming, health care reform and education choice. Policy analysis in each of these examples requires the use of econometrics and economic theory together with a well-informed understanding of institutions and policy.

This work is licensed under the Creative Commons Attribution-NoDerivs 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nd/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.

Introduction

The broad field of economics invests tremendous resources (including money from the NSF) into various kinds of theoretical and methodological endeavors, which oftentimes show great promise as potential tools for policy relevant research. But as is usual in academia, "method" and "policy" often remain disconnected, to the detriment of both. Increased "cross-sub-disciplinary" efforts within economics and allied fields could have important social payoffs. It is already the case that many economists are working on this kind of collaboration, either directly in teams of researchers or via less formal interactions across fields. However, there is also strong pressure for increased specialization. At a time when a vocal minority of empirical economists actively rejects the usefulness (for their particular research agendas) of much of econometrics and almost all of economic theory, the incentives for econometricians and theorists to

define themselves as ``pure'' practitioners of their craft, devoid of policy concerns, increase. And once this happens, a negative feedback loop can set in, where policy-minded researchers correctly observe that (for example) many parts of recent economic theory do not seem to have much policy-relevance. Indeed, recent technical advances in econometric method and economic theory may sometimes have particularly high value in just those allied fields where they are least disseminated.

The purpose of this proposal is not to argue against ``pure'' econometrics or theory (which has great long-run value), and neither is it to argue against simple empirical strategies that are useful for policy analysis (whose value is obvious), but rather to suggest that the NSF solicit novel and important proposals for tying together theory, data, econometrics and policy. The opportunity for the NSF is to tilt the profession's portfolio of research in a socially productive way.

Refocusing Policy Research

Drawing on the author's particular expertise, this white paper provides three, very much non-exhaustive, policy examples where the more rapid dissemination of methods into a policy arena would have high value:
• Bio-fuels and global warming,
• Educational choice, and
• Health care reform.
While this proposal emphasizes cross sub-disciplinary research within economics, the logic will obviously often extend to policy experts outside of economics; the examples mostly focus on the problem of policy in market equilibrium. When the model under consideration is behavioral, organizational or political, researchers from allied social science disciplines may be the appropriate policy-oriented analysts, for example in health care or climate change. Another possibility would be to start an initial inquiry as to what broad policy areas would most benefit from the integration of methodology and policy researchers and then call for detailed proposals in those fields.

Methods and Empirical Policy Analysis

Recent empirical economics has often emphasized the learning that can take place without the use of economic models, as through field experiments or through thinking directly about ``causal effects'' of a policy change. For example, one can learn a lot about the effectiveness of a particular policy intervention in African villages by running a randomized experiment across a set of representative villages, without much of an economic model. In other cases, some economists argue that a particular policy change represents a ``natural experiment'' that allows one to infer the effect of a policy fairly directly from data. However, economists traditionally teach their undergraduate and graduate students that many kinds of counter-factual policy analysis require us to uncover an underlying policy-invariant function (or parameter) that cannot directly be observed from data. A classic, and still vividly relevant, example is the need to uncover supply and demand elasticities in order to predict the effects of a change in a tax, for example a gasoline tax that is intended to reduce carbon emissions, which is a classic setting where econometrics, theory and policy analysis have to work together. These elasticities are called ``structural'' because they represent the underlying structure of the model that allows us to make a prediction about a policy that has not been previously implemented.
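As a concrete illustration of why the separate supply and demand elasticities (rather than a single "price effect") matter for tax policy, the sketch below is an editorial illustration with invented elasticity values, not part of the white paper; it applies the textbook first-order incidence formula for a small unit tax in a competitive market:

```python
# Illustrative only: first-order incidence of a small unit tax in a competitive
# market depends on the separate supply and demand elasticities, not on a single
# reduced-form "price effect". The elasticity values below are made up.
def consumer_share(eps_supply: float, eps_demand: float) -> float:
    """Fraction of a small tax passed through to consumer prices."""
    return eps_supply / (eps_supply - eps_demand)

# The same tax in two hypothetical markets:
print(consumer_share(eps_supply=1.0, eps_demand=-0.25))  # ~0.80: inelastic demand, consumers bear most
print(consumer_share(eps_supply=0.5, eps_demand=-1.00))  # ~0.33: elastic demand, producers bear most
```

The same tax is mostly borne by consumers in the first case and mostly by producers in the second, which is exactly the kind of counterfactual statement that a single estimated "price effect" cannot deliver.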

These structural elasticities must be estimated because the ``demand elasticity'' is not observed directly from data on market outcomes, and representative experimental data is likely unavailable. Even if we can run localized experiments, the experiments have to be designed to tease out the separate demand and supply price-elasticities that are required by the theory (as opposed to, say, a vaguely defined single ``price effect.'') The theory tells us what are the relevant elasticities that we need to know, and only theory together with econometrics can tell us how to estimate them. In the case of the gasoline tax (as in much equilibrium policy analysis), theory, econometrics and policy are tied up in an inextricable way.

The idea of ``structural estimation'' of such parameters is controversial, in part because of unproductive feuds over what is meant by the term. However, unless economists and policy-makers believe that nearly the entire approach of traditional undergraduate and graduate economics ought to be declared irrelevant to real-world policy debates, we will often need to estimate (somehow) such underlying models and interpret their implications in light of some more or less explicit economic theory.

Now, the supply-and-demand model is not at all new, and the basic econometrics of ``instrumental variables'' that would allow us to estimate simple linear demand and supply functions is not new either, so it might be argued that any well-trained empirical economist could tackle the problem and that there is no need for ``cross-sub-disciplinary'' research. However, it turns out that new instrumental variable methods are an extremely hot topic in theoretical econometrics, with much work that focuses on relaxing traditional assumptions that might be inappropriate or ad-hoc. Such methods are sometimes implemented on a question of policy relevance, but often (as it turns out) by a relatively small handful of researchers who work on a handful of problems. In practice, ``policy applications'' using new methods are often presented as mere stylized examples of the method, rather than as serious policy research.

Policy Examples

This section considers richer examples of policy questions that could greatly benefit from interaction with new methods, leading (one hopes) to both better method and better policy analysis. The list is intended to be purely illustrative of the kinds of policy questions that would benefit from the proposed initiative. Much of the focus is on the interaction of policy and equilibrium markets, but that (again) reflects the knowledge base of the author as opposed to the limits of a broader strategy for supporting research.

Global Warming and Bio-fuels. Crop-based bio-fuels (like soy diesel) have been suggested as one part of a solution to global warming, since the carbon released from burning the bio-fuel is recaptured when the next crop is grown. The US and the EU are both considering policies that would effectively require that a significant fraction of world crop output be converted to bio-fuels. Recent research (see, for example, the 2008 paper in Science by Searchinger, et al) points out that the source of increased crop production for bio-fuels is critically important. The new literature on bio-fuels makes explicit the fact that crops used for bio-fuels have to come from some combination of new land, yield increases and demand reduction. To the degree that new land is cleared for bio-fuel production, the carbon released from the process of land-clearing may more than offset any carbon gain.
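The following sketch is an editorial illustration on simulated data, not part of the white paper: it shows the basic instrumental-variables logic discussed above, in which ordinary least squares confounds the demand elasticity with simultaneity, while a cost-side instrument that shifts supply but is excluded from demand recovers it.

```python
# Minimal 2SLS sketch (simulated data, invented parameters): estimating a demand
# elasticity with a cost-shifter instrument that moves supply but not demand.
import numpy as np

rng = np.random.default_rng(0)
n = 5_000
cost_shifter = rng.normal(size=n)              # e.g., an input-cost or weather shock
demand_shock = rng.normal(size=n)
supply_shock = 0.5 * cost_shifter + rng.normal(size=n)

true_elasticity = -0.8
log_p = supply_shock + 0.4 * demand_shock      # price is endogenous: demand shocks raise price
log_q = true_elasticity * log_p + demand_shock

def ols(y, x):
    X = np.column_stack([np.ones_like(y), x])
    return np.linalg.lstsq(X, y, rcond=None)[0]   # [intercept, slope]

# OLS is biased toward zero because price and the demand shock move together.
print("OLS slope :", ols(log_q, log_p)[1])

# 2SLS: project log price on the instrument, then regress quantity on fitted price.
a, b = ols(log_p, cost_shifter)
log_p_hat = a + b * cost_shifter
print("2SLS slope:", ols(log_q, log_p_hat)[1])   # close to the true elasticity of -0.8
```

Modern work of the kind the paper cites relaxes the linearity and homogeneity assumptions baked into this two-line version (and proper standard errors require the usual 2SLS adjustment); the point here is only the identification logic.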

there is only a little research effort currently being placed on improving the methods to account for the unique features of health care competition. Much policy research on health care looks at the ``directly observed'' effects of various health-policy experiments and interventions. In addition to the classic kind of “supply and demand” issues of endogeneity and equilibrium. Levinsohn and Pakes. David Dranove. Counterfactual policy analysis about introducing competition to health care markets. Recently. is necessary. bio-fuel policy makers are using very old Computable General Equilibrium (CGE) models that ignore the recent decades of research on world equilibrium trade. Mark Satterthwaite. Gautam Gowrisankaran. It turns out that the empirical agricultural economics literature on these topics fails to use even modestly up-to-date econometrics and ignores the fact that crop prices. by necessity. modeling exercises by the California Air Resources Board and by the EU “indirect land use” research initiative have placed large weight on such exercises. the policy analysis of bio-fuels requires us to know the market price-elasticity of crop yields.D. theses have shown that the “new trade theory” can have important real-world applications that improve on older models. On the theory side. 52 . a policy designed to combat global warming could instead result in the Brazilian rainforest being cleared to grow soybeans for bio-diesel. Market-wide “experiments” in changing the amount and nature of competition are likely to be very limited in scope. involves thinking about economic primitives of supply and demand. Recent work by Katherine Ho. However. which can be quite complex in the health-care context. Notable examples include contributions by Melitz and by Eaton and Kortum. recent work by the present author and colleagues (Berry.Refocusing Policy Research Thus. the FTC has apparently adopted some recently developed discrete-choice demand methods as an “official” basis for hospital merger analysis. Recognizing the importance of appropriate models. an appropriate set of empirical models would need to be able to move from disaggregated output and landuse data up to higher levels of regional and world aggregation. landuse and also the price-elasticity of demand. yields and demands are jointly determined in market equilibrium. The problem is that policy-makers are not being offered the insights of many years worth of advances in applied equilibrium trade theory and empirics. or poorly specified models. A number of recent economics Ph. Health Care and the Role of Markets. It is key to note that the policy application in that methodological paper is not nearly as important as the bio-fuels policy debate and that the general methods have not spread to the empirical studies relevant to bio-fuel policy. If bio-fuels policy is made on the basis of incorrect elasticity estimates. a serious unintended consequence indeed. For example. However. a frequently useful exercise. 2004) presents a strategy for dealing with “micro” and “macro” data in an equilibrium context. much of the 2008-2010 health care debate revolved around the role of health care markets in a partly competitive equilibrium. Josh Lustig and others shows that progress is possible. so models and estimates are necessary. Bio-fuel policy-makers clearly realize that some kind of world-equilibrium model. fit to empirically estimated elasticity estimates.
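For readers unfamiliar with the discrete-choice demand methods mentioned above, the sketch below (an editorial illustration with invented utilities, not the FTC's or any published model) shows the multinomial-logit share and diversion-ratio calculations that sit at the core of such analyses:

```python
# Stylized multinomial-logit illustration of discrete-choice demand for hospitals.
# Mean utilities are hypothetical; shares follow the logit formula, and a
# diversion ratio approximates where hospital A's patients would go without A.
import numpy as np

hospitals = ["A", "B", "C"]
mean_utility = np.array([1.0, 0.6, 0.2])      # invented mean utilities

def logit_shares(v):
    expv = np.exp(v - v.max())                # subtract max for numerical stability
    return expv / expv.sum()

shares = logit_shares(mean_utility)
print(dict(zip(hospitals, shares.round(3))))

# Diversion from A to B under logit: B's share of A's patients if A is removed.
shares_without_A = logit_shares(mean_utility[1:])
print(f"Diversion ratio A -> B: {shares_without_A[0]:.2f}")
```

Richer empirical models add patient-level heterogeneity, hospital characteristics, and estimation from data rather than assumed utilities, but the substitution logic used in merger analysis is the same.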

Much useful empirical research on education focuses on the possible outcome of direct policy interventions like changing class size. equilibrium and selection are all critical to policy analysis. and the role of potentially unobserved variation in tastes and school quality.Educational Choice. Justine Hastings. As in many other policy areas. School choice is a good example. so progress is clearly possible. Refocusing Policy Research 53 . it is not clear that complicated econometric or economic theory is necessary. Simple models that ignore the heterogeneity of schools and students.Patrick Bayer. are likely to provide misleading results. society overall would benefit if a new group of highly skilled methodological researchers would add their efforts to the existing research agenda. To answer some of these important questions. but they are often applied in situations of limited policy relevance. There is a lot of research on very sophisticated models of choice among differentiated alternatives in equilibrium. there are other examples of educational research where models of demand. Some researchers -. However.have made good progress in applying empirical equilibrium models to this class of policy-relevant models. The way that different students are sorted into a set of possibly very different quality schools is complicated. and Holger Seig (among others) -.


So a key policy question is should the developed world follow the French and Scandinavian model of regulating holidays to force workers to spend time away from work. White Paper for NSF Grand Challenges Nick Bloom (Stanford and NBER) September 15th 2010 Abstract This short piece outlines some of the topics I see as part of the “Grand Challenges” for the social sciences over the next 10-20 years. To view a copy of this license. I just highlight that neither approach is well suited for the type of causal analysis that is required for policy making. 2010).gov/files/documents/100331-cea-economics-workplace-flexibility. http://www. March 2010. 1) Balancing work and family The President and First Lady jointly launched the Council of Economic Advisers’ paper1 “Worklife balance and the economics of workplace flexibility” at the Whitehouse in March 2010. I regularly teach case-studies and am a co-author of one of these survey articles (Bloom. I have focused on three areas where I see the policy agenda being particularly constrained by the lack of high-quality research. Changes in technology and the demographics of the US workforce have led to increasing conflict between work and family life.2 The CEA viewed this prior literature as so limited that even in its Executive summary it devoted one of it’s main bullet points to call for more substantive research.5 in much of Southern Europe). Kretschmer and Van Reenen.0/ or send a letter to Creative Commons. stating: “A factor hindering a deeper understanding of the benefits and costs of flexibility is a lack of data on the prevalence of workplace flexibility arrangements. USA. 94105. San Francisco. and was amazed to discover the prior research is primarily case-study or cross-sectional survey based.pdf 2 This is not meant as an attack on these research approaches.This work is licensed under the Creative Commons Attribution-NoDerivs 3. and more research is needed on the mechanisms through which flexibility influences workers’ job satisfaction and firm profits to help policy makers and managers alike” [Executive summary. Suite 300. Council of Economic Advisers. far below the replacement ratio to sustain population levels. This report makes clear the importance of balancing economic growth with family friendly working practices to policymakers. 1 55 .whitehouse. voters and firms. California. In Europe these pressures are claimed to be one of the key reasons that birth rates have now fallen below 2 (and below 1. preceding page 1] “Work-life balance and the economics of workplace flexibility”. or the US and UK model of allowing firms and individuals the freedom to organize their time? Despite the huge economic and policy interest in this question the empirical evidence is extremely limited.org/licenses/by-nd/3.0 Unported License. visit http://creativecommons. 171 Second Street. I was personally involved in assessing the prior literature for the CEA report.

Kretschmer and van Reenen. management and technology. To advance our knowledge we have to employ the tools of modern economics . although the tools of economics in pushing for causal identification are clearly at the core of this research. The organizations that do provide international comparisons like the OECD tend to provide industry or macro-level aggregates. or reverse causation (women are attracted to better work-life employers) or some other correlated factor (maybe firms with more enlightened managers hire more women and provide better working conditions). and even within these countries to publicly listed firms (Compustat and Datastream) or to a few basic data fields (Amadeus). to what extent is Chinese trade now driving US manufacturing productivity and technology upgrading. This would move beyond the assumption that correlation implies causation. For example. or carried out by international organizations like the World Bank which have focused solely on developing countries. From my own experience running international management surveys it is feasible to raise grants of a few hundred thousand dollars to run one-off small scale surveys.The reason for the poor state of the literature is it is inherently difficult to examine the impact of work-life balance practices on firms and households. along with the weak state of the current research base. Without further evidence distinguishing these stories undertaking evidence based policy is extremely difficult. 2) Building cross-country micro databases for productivity and growth research The increasing globalization of economic activity means that answering any domestic economic question increasingly requires a global analysis. This is a challenging research study to pull off. no international datasets exist with basic information on inputs. Comparing. but extremely difficult to fund large 3 This is not to say no international data exists – for example Compustat in the US. But what does this mean? It could be a causal effect (women push for better work-life balance). 2010) we find more women managers work in firms with a balance work-life balance culture. I think this area deserves special NSF support. which the CEA was unsurprisingly not satisfied with in the prior research.worldvaluessurvey.org/ 56 . Datastream or Amadeus in Europe – but the coverage of these are limited to a small set of countries. long-lived international research projects. Given the top level policy interest. running experiments in firms in allowing some randomly chosen groups of employees better working conditions and comparing the impact against other control groups. Surveys struggle to elicit causation – for example in my paper (Bloom. This has led to huge gaps in international data analysis – for example. because comparability is only possible at these broad levels. potentially spanning several disciplines.3 So to date research undertaken across countries has either been set-up and funded by individual research groups like the World Values Survey or the World Management Survey4. for example. 4 See http://www.org/ and http://worldmanagementsurvey. and what should policymakers do in response? But collecting firm-level data has traditionally been the preserve of national agencies – for example the US Census and the Bureau of Economic Analysis – with very little international data comparisons. This is true both at the macro level but also at the micro level – for example. 
R&D expenditure and education levels of firms on a broad country by country basis is currently impossible. the growth rates.natural experiments and field experiments. growth. I view the challenge of funding large-scale. outputs.

highlighted the waste in the US healthcare system from poor management practices. but again I think economics is the central discipline here because of it’s emphasis on large-scale. This is a common issue across the whole of social sciences. international panel surveys. that could be used to exploit natural experiments to try and estimate the impact of management on performance. 57 . There has been a long debate on the importance of management practice in social science. Several recent articles in the New York Times have. and what evidence there was for market failures in this. I often see papers using individually collected data samples which after the paper is published never see the light of day. Senior policymakers including Larry Summers had apparently asked what evidence there was for the positive impact of improved management practices on firms performance. So my final suggestion is for substantially increased funding for research to evaluate the causal impact of management (and organizational) practices on firm performance.scale. given the important role that management practices presumably play in driving US economic growth. correlations of good management with productivity could be due to reverse causation – productive firms have the resources to hire in management consulting firms. To do this I would suggest doing two things: A) Building a large scale public access management database: This would start to build up a strong common survey infrastructure. While researchers in management and strategy often claim overwhelming evidence for large impacts of management on productivity. 3) Causal evidence on the impact of management on productivity After the crisis of 2008 the US government considered ways to assist its domestic manufacturing industry. both of which are problematic in terms of drawing causal inference. I would reallocate funding away from dozens of small-scale proprietary research projects to a few largescale public projects. I am involved in running a wave of such a survey at the US Census Bureau to survey around 50. Second. but also in public sectors like healthcare and education. rigorous panel data collection. and think it would be valuable to repeat this in future and extend to 5 See. and currently has a budget of around $100m per year. The MEP is a government funded agency that provides management assistance to US firms. the overview of the international management surveys in Bloom and Van Reenen (2010).5 My advice is to firstly increase absolute funding to primary data collection – without better basic international data no amount of clever estimation or clever theory will answer basic questions on the drivers of productivity and growth. for example. But from my discussions with economists in the administration one major reason holding this back was the lack of evidence for the impact of management practices on firms productivity. This is clearly a major research and also policy hole. with the debate in 2008 and 2009 about whether this should be doubled. for example. For example. This is true not only true in manufacturing and retail. fund one or two large projects that provide public access longitudinal data across countries then 50 smaller projects which collect their own data but keep this proprietary. economists have typically been skeptical.000 US manufacturing plants. The reason for their skepticism is the lack of (arguably) causal evidence – research to date has been based on surveys and case studies. for example. 
including providing a massive increase in funding to the Manufacturing Extension Partnership (MEP). but with unfortunately very little consensus. It would be much more useful to.

Nicholas Bloom. David McKenzie and John Roberts. But nothing remotely similar has been undertaken in the US. management and sociology about measurement and practices would also be very helpful. Fall 2010 “Does management matter: evidence from India”. This will be a public access database in that all researchers will have access (within the limits of protecting confidentiality of Census data). natural experiments and field experiments this would naturally be an area to have them involved in.would be invaluable in filling the current gaps in management research. 58 . and their performance compared to a control group . Discussion with other disciplines over like organizational behavior. References: “Why do management practices differ across firms and countries?”. so that the returns to building a large scale public panel database on management practices will be very high. To date nothing like this exists. Greg Fischer and myself have run management field experiments in developing countries (see Bloom et al. Benn Eifert. Running such field experiments in the US – where some firms are helped to randomly improve their management practices. Aprajit Mahajan.other industries like healthcare and retail. Strategic Management Journal. These have been extremely informative on the impact of management on firm performance. Antoinette Schoar. March 2010 “Determinants and consequences of family friendly workplace practices”. Journal of Economic Perspectives. Miriam Bruhn. B) Running management field experiments: to uncover the causal impact of management practices on firm performance. given the extensive experience of economists in large scale data collection. Again. Nicholas Bloom and John Van Reenen. Toby Kretschmer and John van Reenen. Chris Udry. 2010 and the references therein). A number of researchers like Dean Karlan. Stanford Mimeo 2010. Nicholas Bloom.

Markets are the central topic of economics. 1 59 . or disordered markets. We know next to nothing about This work is licensed under the Creative Commons Attribution-ShareAlike 3. Joe Halpern. if abstract. is based on a coarse description of how markets function. Yet while economists have a good understanding of the behavior of well-functioning markets. This is entirely sensible a key insight of economics is that at a certain level. however. We know even less about sound principles of market design. Suite 300. Robert Kleinberg and Eva Tardos.Robustness and Fragility of Markets Research at the Interface of Economics and Computer Science Lawrence Blume Cornell University Abstract Market behavior is the central topic of economics. we have little to say about market fragility. especially from the standpoints of robustness and resiliency. This understanding. market resiliency.0 Unported License. visit http://creativecommons. I am grateful for the many interesting discussions on the topic of disordered markets I have had with my ´ colleagues David Easley. well-functioning markets all look the same. and market collapse. taxonomy or typology of broken.org/licenses/by-sa/3.0/ or send a letter to Creative Commons. We have a very incomplete understanding of the causes and remedies of market breakdown. USA. Economists now have a broad. To view a copy of this license. Research emerging at the frontier between computer science and economics offers new ways of addressing this important issue. 94105. San Francisco. we have little in the way of a classification. understanding of the virtues of market allocation. California. Jon Kleinberg. 171 Second Street. But while well-functioning markets all look alike.

the finance market micro-structure literature purports to model financial markets as non-cooperative games. or desirable trades at a quoted market price may not execute. They are just as important in commodities markets. specifically utility. The gap between real markets and their game theoretic models is huge. market-clearing) prices and their associated market resource allocations. The workhorse model of market behavior is the general equilibrium model of production and exchange. The key phrase in the preceding discussion is “well-functioning”.and profit-maximization. there may be no price at which to trade. the auction. Consumers and producers are black-boxed. Allocation in disordered markets can no longer be described by the scissors of supply and demand. This claim has been validated by decades of economic practice. The claim is that the performance of all well-functioning markets can be captured at this level of abstraction. Moving beyond simple empirical descriptions of market collapse requires theories of market performance that are based in the social. and are especially important for economies in the developing world. initial resource allocations (and perhaps information and beliefs) from which consumer and producer behaviors are derived. The market outcome function maps environments into equilibrium (that is. on the frequency and magnitude of such events. social norms and other institutional arrangements under which trade takes place. has been intensively studied. roughly modeled as responders to the incentives provided by market prices. Second. and the exogenous market environment is specified by a list of tastes. First one particular form of market organization. for example. and the May 2010 flash crash. is a pointless exercise. legal and technological description of market institutions. This model contains no description of transaction rules. calibrating general equilibrium models to determine the effects of. regulatory policy.the transmission of failure across interlinked markets. These issues appear to be particularly important in light of recent financial history the credit freezes in overnight lending and commercial paper in September and October 2008. The transition from “well-functioning” to “disordered” is determined by the institutional and social arrangements of the markets and the volatility of their environment. technologies. Often (but not always) these black boxes are derived from reduced-form descriptions of behavior. There are three different research programs that address the social and institutional frameworks of markets. a point made by Amartya Sen in his path-breaking work on famines. Market design and market collapse have not been totally ignored by economists. Consequently. When markets collapse. This literature has not pro2 60 .

Little is known about the co-evolution of individual behavior and network structure as agents seek out advantageous network connections. Obvious examples include online auctions 3 61 . in the presence of incentives. Nonetheless. both disciplines are concerned with the consequences of agents interacting through networks. work in CS/Econ has contributed to the development of new kinds of markets. As a result. both disciplines have been trying to design and analyze complex interconnected systems. Issues relevant to market breakdown include the effects of network topology on information and liquidity flows in markets and the contagion of market collapse.vided much guidance in developing design principles for new financial asset markets or in understanding the consequences of regulation for market performance. Research at the CS/Econ interface is concentrated on three themes: Networks. and into fundamental research on economic systems more broadly through computational ideas and models. and individual decision-making. Furthermore. with a strong research interface forming between the two. and a level of complexity that makes it difficult to determine how these aggregate outcomes arise from the behavior of the participants. fragility and collapse has emerged suggests that a new approach and a fresh set of ideas are needed. This inter-disciplinary area of CS/Econ has had a number of significant successes. Computer science has for many years been concerned with the performance of systems in which agency is distributed across some network. but perhaps the most interesting of the three in its deployment of different research methodologies and in the conversation between theory and data. Finally. There are natural reasons for this: over the past decades. the two fields have increasingly interacted. the fact that no paradigm for understanding market robustness. there is a literature on the behavior and design of matching markets that is small. CS in recent years has become interested in networks with human actors. Issues relevant to market creation include the possibilities new technology affords for the creation of new markets and the reorganization of older markets. a number of crucial tools for reasoning about these issues have been developed over the past several years by researchers at the interface of computer science and economics. a set of different possible global outcomes that range from highly efficient to catastrophic. with adaptive agents. providing important insights both into new styles of economic interaction facilitated by computing technology. such as the market for search advertising. All markets that operate at an interesting scale share several important features: Interconnected groups of economic agents that act and learn in response to incentives. While the conventional economics toolkit has made little progress on these issues. mechanisms. Beyond this. While interest was originally focused on networks of machines.

com. These questions include the computational feasibility of mechanisms. principles of learning other than the Bayesian formulation which dominates economic analysis. Mechanism design had been a popular research topic among economic theorists. and CS researchers have raised a new set of questions that are now capturing the interest of economists. and which considers models of knowledge and belief alternative to the probabilistic model underlying dynamic expected utility. designing incentives to guide the behavior of self-interested agents toward a collective goal. Markets are important exemplars of resource allocation mechanisms. Despite this critique. The difficulties of enabling the emergence of a new research community extend beyond sources of available research funding. which considers the problem of institutional design. more recently it has become important to computer scientists. few decision-theory models have emerged that are sufficiently expressive to model alternatives to the behavioral hypotheses that comprise EU and sufficiently tractable to deploy in problems such as dynamic choice and portfolio choice where the structure of the decision problem is complicated. The interaction between computer science and economics has not been ignored by the NSF. dissenting economists were decrying its limitations. and other business-related social network sites (monster. machine learning models for the analysis of highdimensional data sets. In particular the awkwardly named CISE-CCF ICES program. Issues include the design of computationally feasible heuristics for complicated choice problems. however. Interface between Computer Science and Economics and Social Science is now collecting its first round of proposals. Equally fascinating is research conducted by a community of scholars including economists and computer scientists on the economic effects of cell phones in rural Africa and India. Even before the satisfactory axiomatization of the now-dominant expected utility theory emerged in the early 1950s. and market design is a special case of the general problem of mechanism design. that is.(eBay). job search. Computer scientists bring new problems and solutions to the table. A pervasive challenge in this area is the lack of people who have expertise in all the different facets of reasoning about complex economic systems. LinkedIn). and the analysis of mechanisms under a wide variety of behavioral postulates that go beyond the classic decision-theoretic models that economists favor. the identification of secondbest mechanisms when mechanisms that actually achieve the social goals do not exist. including their interconnected4 62 . the robustness of mechanisms to bad behavior by individuals and to environmental shocks. Nonetheless it was only in the 1980s that the discussion moved from a few well-conceived examples to a systematic critique.

feedbacks and the sources of their inherent complexity. this conference also celebrated the birth of the Cornell Center for the Interface of Networks. The research program described here is part of a broader theme that has captured attention in different parts of the economics community. Tardos. Computation. There is every reason to believe that these three disciplines together will have interesting things to say about the behavior of disordered markets.edu/conferences workshops/CSECON 09/post-workshop.cornell. Sociologists and economists have been exploring their shared interests for decades now. and Economics. The shared interests of computer science and economics can only be fully explored by a new generation of graduates who are well-trained in both disciplines. and E. surveying current work and exploring future possibilities. J.) The final report expands on some of the themes discussed here. The study of these arrangements is an active research area in sociology. Easley.cis. L. (Incidentally.ness. 5 63 . Kleinberg. Sociologists have also been collaborating with computer scientists in the study of on-line communities. REFERENCES ´ Blume. Kalai. and this includes informal social arrangements governing market organization as well as formal transaction rules.pdf. “Research Issues at the Interface of Computer Science and Economics: Report on an NSF-Sponsored Workshop. An NSF-sponsored conference on the emerging collaboration between economics and computer science. D. was held at Cornell University in 2009. E..” available online at http://www. A proper study of the transition between a given markets wellfunctioning and disordered regimes depends on the details of market organization. that institutions (sometimes) matter.

64 .

The failure of the fiscal stimulus was a playing out of permanent income theory. multipliers in New Keynesian theory when short-run fiscal stimulus is accompanied by expectations of large future expenditures and taxes beyond the period when the Fed is at the zero lower bound on interest rates. Excessively loose monetary policy was the biggest cause of the bubble. for which funding is available from other government agencies and private sources. perhaps negative. Before presenting a few ideas. I think the main issue for economics is explaining why frenzies. Nor do I think that the evidence of little response to the 2008-2009 stimulus bills is inconsistent with much of what is known in economics. I think the problem has been much more the failure to implement policies based on sound economics. Behavioral and Economic Sciences National Science Foundation Michael J. I also believe economics as a discipline is in better shape than recent commentaries would suggest.September 17. Myron Gutmann. 2010 To: Dr. Friedman Professor of Economics Stanford University Request for ideas about NSF “Grand Challenges” Dear Myron and Colleagues: This brief note is in response to your request for ideas about possible NSF grand challenges in economics over the next twenty years. I do not believe the primary causes of the housing bubble and severe recession were big gaps in economic knowledge. as opposed to very narrow programmatic specific research. The serial social engineering of housing was a predictable off-budget political response to budget restrictions. Boskin Tully M. Indeed. and the tiny. let me congratulate you for being pro-active and focusing on a long time frame. From: RE: 65 . rather than purely reactive. Let me begin by saying that I think of the National Science Foundation’s economics program as the core component in funding basic economics research of potentially broad applicability. Head Directorate for Social.

institutional issues. some fraction of which will help generate new technologies that are useful in analyzing more applied problems. many economic statistics and data lag behind. dynamic. and people. I have generally focused on applying theory and econometrics to real-world problems and policies. constantly evolving economy. I would put these in five interrelated categories: theory. “This Time is Different”. For those who think as I do that it is important to input the best economic research into the design and implementation of economic policy. and the implications for the design and evaluation of policy responses to deep. and while early in my career I developed some new econometric estimators. and new approaches to fiscal measurement and analysis. In my career.manias. where in each case. long-lived recessions. short-run Keynesian consumption out of disposable income. Perhaps the answer is a combination of neuroscience and difficulty in judging when a boom has really become (or almost become) a bubble. It has real uses. advances in recent years from a variety of agencies and academe produce a set of important ongoing revisions and perspectives. modern economic theory. in some cases considerably behind. research and measurement. As an example. I believe a renewed effort on the theory underlying basic measurement. at least as much as can be done given political and other constraints. there is considerable ongoing research in academe and within the statistical agencies themselves. Despite numerous improvements made by our government statistical agencies. Some of this is certainly inevitable in a flexible. Some of it reflects agency budgets. All that said. economics still has plenty of challenges. 66 . Let me briefly say a word or two about each: Theory. economic theory as the R&D part of the portfolio. economic theory is more than just an exercise for bright people. This should continue to be an important part of the NSF portfolio. measurement. importantly including the units of account – in the real world these have changed radically due to demography and social and economic patterns – would be well worth the investment. Sometimes the consequences are immensely consequential when compounded over a long period of time. But we need to think of economic research as a portfolio. and a proper reticence to adopt new procedures until they are extremely well tested. Of course. econometrics. Measurement. vs. which now spans four decades. not fundamental. What theory I have developed myself would more properly be called applied theory. in the words of Ken Rogoff and Carmen Reinhart. or alternatively life cycle theory. they were for use on particular problems. So I have always viewed. indeed encouraged in my own department. some part of which is shared and eventually incorporated into government and private measurement. and bubbles persist and why people keep thinking. aggregation. recall my discussion opening this letter on permanent income theory. Important recent examples are the “New Architecture for the National Income Accounts”.

with very different saving propensities. tend to be few. And we know from microeconomic data. as demography is evolving. on the effects of public debt on growth. But it runs up against some serious problems. for example. and its entreaty for the U.. Aggregation. The cases when we have a truly convincing identification strategy. for example. But they raise difficult questions. age. a weighted average works or does not work. while on balance an immense improvement. can have very different patterns of behavior. but it seems to me that much more work needs to be done in this area. The most commonly used model in economics. quicker fiscal consolidation is based on their estimate that each 10% increase in the debt-GDP ratio decreases the growth rate by almost one-quarter point. education. Importantly. The development of controlled experiments. We have a much greater understanding of the likelihood that we are identifying something that we think of as a relevant economic parameter rather than a reduced-form mishmash that is scientifically hard to interpret. I would like to see renewed emphasis on alternative econometric approaches to confronting these issues.I can certainly report that increasing the accuracy of economic data would be most welcome in the private sector and among policy makers.) earlier in life. There has been some work on trying to analyze how far off one gets from adopting representative agent models. perhaps deriving from a decision theoretic framework (as the minimax regret principle led to Stein estimators in statistics). say household data. we are going to have a larger and larger fraction of the population saving late in life relative to those saving (for college education for their kids. For example. etc. but on many of the big issues of economics they remain elusive. Econometrics. beg the question of whether they are permanent or transitory responses to policy interventions deemed likely to be temporary or permanent. etc. Capital income taxes affect savers and dissavers differently. e. retirement. In macroeconomics. to embark on a larger. Ingenuity and data development have sometimes solved this problem. The experimental studies. The measurement issues are closely linked to the discussion below of aggregation. is that of a representative consumer. a good teaching device. The structural revolution in econometrics. say. and on some occasions a decent rough first approximation. most extensive in recent years in development economics. under what conditions. patient and impatient. but again. That is convenient. there are several suggestive studies (Reinhart and Rogoff. in macroeconomics and public finance. recommendations and critiques. location. the really convincing identification of cause and effect is far from strong. instruments that we would widely agree are appropriate. leaves economists with a very thorny problem. that observationally equivalent households in terms of family composition. often analyzing differences in differences. Euler equations don’t aggregate if you have two classes of consumers.S. for example) and certainly official agencies make estimates and use those in their policy prescriptions. It is unclear for what purposes. The structural revolution in econometrics was an immense improvement. terrible for savers. the Survey of Consumer Finances or Consumer Expenditure Surveys. and related econometrics are also important arrows in the economist’s quiver. The IMF.g. 
among a host of other factors (see Angus Deaton’s piece in the recent Journal of Economic Literature). Low interest rates are good for borrowers. etc. 67 .

Almost always. This is more important than defining specific areas in advance into which to allocate research dollars. They may be a good first approximation in many. I certainly feel strongly about the areas mentioned above. differing beliefs (an idea that has such distinguished forbears as Hayek and Solow). not by people trained in “interdisciplinary issues”. creative. please do not hesitate to call upon me. the best research on topics and issues that span disciplines is done by great scholars in the core disciplines. I hope these comments are useful. energetic. and it is important for NSF to be funding them. Boskin 68 . In this regard. Scholars were cobbled together but didn’t produce much serious research. People. depend most of all on attracting bright. let me make my most important point. It is not clear how far off the representative agent models are in addressing many concerns. Behavioral. Interdisciplinary programs generally do not attract scholars of as high quality as the core disciplines. but the general support of economics research and the research enterprise is likely to continue to be the most valuable public good financed by NSF. If I may be of any further assistance. The future of economics research. one can think of the economy as a complex dynamic interaction of a variety of agents making decisions under incomplete information. committed scholars to the serious study of economics and to a career in cutting-edge economic research. importantly. MJB:jb Michael J. Finally. it is a large part of Stanford’s capital campaign – and has lots of people excited. but an overemphasis on interdisciplinary research at the expense of continued R&D investment in core economics is likely to provide some short-run popularity at the expense of the serious long-run mission of the NSF Social. helps retain and. NSF funding has played a valuable role in that regard. uncertainty. perhaps even most cases. and Economics Directorate. The best environmental economics is done by high-quality economists. But there are innumerable examples historically of vast amounts of funding poured into interdisciplinary research that wound up producing very little. let me also say something about interdisciplinary research. It is in vogue – in fact. Sincerely. There certainly are some potentially valuable areas. and the social value of that research and development. and I would view this as the single most important aspect of the program: it attracts. That’s not easy to model. frees time from teaching and administrative duties for productive scholars advancing the frontiers of economics.More generally. But it seems to me important to delve much farther in this direction. There certainly are some areas where it is important – the typical applied medical researcher is now a systems engineer.

University of Michigan Kate McGonagle. measuring genetic information in PSID will open a wide range of new studies on social and economic behavior and outcomes. University of Michigan Abstract: There are extraordinary opportunities to address the next generation of research challenges in the social. University of Michigan Robert Schoeni.Future Research in the Social. and Economic Sciences with the Panel Study of Income Dynamics 15 October 2010 Charles Brown. University of Michigan Narayan Sastry. PSID offers untapped opportunities to examine questions of relevance to our understanding of environmental sustainability. University of Michigan Frank Stafford. Behavioral. University of Michigan Dalton Conley. behavioral. Second. Advances in these areas will provide a foundation for future research and for new interdisciplinary collaborations. New York University Vicki Freedman. cross-national harmonization of PSID with other national panel surveys will be instrumental for developing and facilitating new research on the effects of policies and institutions. Third. University of Michigan Fabian Pfeffer. First. and economic sciences that build on the Panel Study of Income Dynamics (PSID). University of Michigan Dan Brown. Abstract word count (200 word limit): 108 words Main Word Count (2000 word limit): 1950 words 69 .

We describe three such opportunities relating to the Panel Study of Income Dynamics (PSID) that focus on human-environment interactions. utilities. 70 . and materials consumption. An important outcome of these investigations might be a better understanding of the degree to which changes in the economic. behavioral. Some additional information. housing. through new questions or a new module. PSID is the longest-running nationally representative panel survey in the world and is an important component of the National Science Foundation’s investment in research infrastructure for the social. PSID offers a unique opportunity to demonstrate the value of individual-level longitudinal information in the investigation of the human role and response in environmental systems. As this program and others like it matured. contributions to charities. economic. crossnational research. We believe that pursuing this research will generate important new understandings of the role and response of humans in environmental processes. Sustainability science has emerged as an important paradigm for investigating the bi-directional linkages between human actions and natural-environmental processes with the goal of helping to solve a variety of environmental problems. information. existing information on various expenses. cultural. We identify two major areas of environmental research that might benefit from engagement with and expansion of PSID. and location). Although not measured directly in PSID. The existing NSF Program on the Dynamics of Coupled Natural and Human Systems aims to investigate these problems. and behavioral dimensions of human activity that can complement processlevel understanding and data in the natural sciences. with a particular emphasis on modeling approaches. behavioral. education. would allow investigation of the direct consumption variables and their association with these more indirect measures. or natural environments might be most likely to yield changes in behavior. geospatial. Although some new social science data collection programs are emerging as part of existing or planned environmental observatories (like NEON and WATERS. and economic sciences. PSID offers untapped opportunities to examine questions of relevance to our understanding of environmental sustainability. income levels. and so forth can be used to investigate the social. but also demonstrate the value to the scientific community and to society of further investment in significant empirical social scientific research. economic. and economic sciences.Introduction There are extraordinary opportunities to address the next generation of research challenges in the social. Braden et al. Furthermore. PSID and Human-Environment Interactions First. research results have highlighted the needs for data on longitudinal. and cultural determinants of consumption behaviors. Influences of intergenerational processes on consumption behaviors might also be investigated. consumption in these areas is strongly related to existing PSID data (including information on travel. water. social science data collection efforts are still not implemented at the scale envisioned for these systems. One important area that can benefit from existing and expanded PSID data is in our understanding of the social and economic determinants of energy. and genetics. 2009).

linked through the geocode. in a globalized world nations may serve as the new laboratories of social and economic policy. how those choices influence the development of urban forms and structures we observe. For example. many industrialized countries face common challenges. focusing on demographic processes of migration and urban-rural movements. In short. and social and economic inequalities. and. In other words. Beyond description. ultimately. There is ample supply of novel policy approaches and alternative institutional arrangements around the globe. For instance. However. 71 . how those choices are influenced by contextual factors at multiple scales. in today’s increasingly interconnected world. detecting barriers to educational access among disadvantaged children yields important information for inferring why different forms of educational financing do or do not impact educational opportunities. such as population aging. Many of the most pressing concerns. There are several strategies for investigating the role of institutional characteristics and policies in explaining observed cross-national differences. studying the causal role of specific aspects of different policies and institutions based on cross-national comparative research suffers an important inherent problem: there are many more explanations for crossnational differences than there are countries to compare. on neighborhood social and physical characteristics to understand residential preferences across a number of land markets. pinning down the causal mechanisms that are at work at the individual-level is integral to our effort to make meaningful cross-national comparisons that have the potential to identify a “best practice” policy or institutional arrangement that may be transferable to a different nation. future research must strive to identify policy solutions that have proven successful in other nations. These studies would likely be most profitable in combination with environmental data on land use and cover. High quality. Detailed information on the housing and land related expenditures. One promising strategy begins by reliably establishing the individual-level mechanisms that account for the observed phenomena in each nation. nationally representative data that cover a wide array of topics from different spheres of life have already served to illuminate some of the common challenges as well as their differential consequences. Understanding why given social or economic phenomena occur increases our chances for understanding how a given policy or institutional arrangement may affect these phenomena. the complexity of educational careers and life-long learning in the knowledge society and transitions into and out of unemployment and poverty in times of economic downturn require this dynamic perspective and consequently rely on longitudinal data. the integration of immigrants. Although many of these problems affect modern nations similarly.Another important area for investigation and application of the PSID is the locational characteristics of participants’ places of residence and places of work and travel. These studies could be investigated at multiple scales. however. are often only captured in a dynamic perspective. house and land value. and at lot scales to investigate characteristics of residential land consumption and management. and movement can be used to better understand the residential choices of participants. their severity and impact on individual lives can differ markedly across countries. 
PSID and Cross-National Research Second.

in general. Because these data sources provide the most potent basis for cross-national research. in which important data collection efforts in one country inspire and guide similar projects in other countries. in particular. longitudinal data that not only allow a dynamic view on important social and economic phenomena but that also facilitate the search for best practices that have proven successful in other countries and that hold promise in being applied to the U.S. and many other national panel studies. such as the Cross-national Comparative Equivalent File project. the World Values Survey). For example. and economic data have not yet embraced the integration of genetic information. Although recently there has been increased interest in collecting biomarkers. provides an 72 . nor are these nationally representative samples of the entire adult population across the age spectrum. and genetic data. bestselling book The Bell Curve (1994). the main future challenge will be to further increase the harmonization of measures between these datasets. introducing genetics to discussions of social behavior in humans has been morally suspect. partially harmonized. nationally representative. the British Household Panel Study. no existing social science study collecting genetic information is intergenerational in nature. PSID has served as model for the German SocioEconomic Panel. demographic. however.Large-scale longitudinal surveys provide a strong foundation for the study of the causal mechanisms underlying a wide range of social and economic dynamics. Meanwhile. is that of harmonization by imitation. continued opportunities for ex-ante harmonization should be pursued wherever possible. aside from the PSID. and one that may be predicted to gain in importance in the future. Only the PSID provides what would be a “full service” socioeconomic dataset with gene markers. in social science surveys. Further. no present study. studies that focus primarily on collecting social. The most important data requirement for future research. for instance in the case of new panel surveys or new topical modules in existing surveys. And there is no study better positioned to maximize the intellectual return on investment in this area than the PSID. Another successful model. The social sciences have profited immensely from existing large-scale projects that provide comparable data for a number of nations. the focus to date of such studies has generally been on health dynamics. So far. Different organizational models have been successful. those U. Although the latter efforts should be expanded.S. For instance. from the days of Francis Galton’s eugenic theories of the heritability of intelligence and criminality through the controversial. The time is right for a nationally representative socioeconomic study to collect genetic markers. both in terms of sample construction as well as measurement. This will improve the foundation for fruitful cross-national comparative research by providing high-quality. ex-ante harmonization has mostly involved informal cooperation among the founding survey administrators while ex-post harmonization is beginning to take place in more formal initiatives. is that of cross-national comparability. Some international collaborations are dedicated to the ex-post harmonization of existing surveys (such as the Luxembourg Income Study) while others have accomplished ex-ante standardization of a set of core questions that are asked in a large number of countries (for instance. 
NSF support will be instrumental for developing and facilitating crossnational data harmonization. PSID and Genetics Research Third. This has led to an intellectual firewall between mainstream social science and biological data.

M.. 45: W11301.M. and.W. Social science in a water observing system. and Murray.G. 73 . for example.. and economic sciences. a problem is that alleles are not necessarily distributed randomly across sub-populations thus potentially biasing the observed phenotypic associations with those alleles.S. New York: Free Press.L. on one hand. J. Dozier. Hughes. within a particular subgroup such as ethnic group). then it suggests non-independence of the units of analysis for classic heritability analysis. They will be a national resource for conducting transformative research that will also strengthen links between the social. J. P. Schneider.. Brown. Swallow.S. If the PSID were to collect genetic markers.? Has accelerated immigration since the 1960s affected this distribution? How much genetic in-breeding occurs in the U. Herrnstein. • How do genes interact with exogenous economic shocks? The basic logic until now has been the following: a certain proportion of a population sample is found to have a variant of a particular allele. Schultz. or schizophrenia) within that same population (or subgroup).B. Gober. behavioral. If the behavioral phenotype of an individual is not just contingent on her/his own genotype but that of her/his siblings. it could be adaptive to have a putatively more emotionally “reactive” allele when one is the only offspring to be homozygous for this allele. Maidment.opportunity to obtain genetic information across three generations. References Braden. 1994. Water Resources Research. and Werner. and economic research on human-environment interactions. such as: • How does the distribution of haplotypes (unique sets of polymorphic markers in an individual) vary by race.. and region in the U... and the environmental. a number of important research questions will be able to be answered. and genetics.S.. In closing. Shortle. The Bell Curve.. the three research opportunities described above will provide a foundation for the next generation of social.K. shyness. 2009. This has been the approach of most work to date in both the social and biological sciences that have used observational data. thereby garnering more parental attention.. the expression of genetically-based propensities toward depression depend on growing up with a depressed parent? Does the effect of an individual’s genetic background on social and economic outcomes depend not only on the observed behavior of family members but also on their (unexpressed) genetic makeup? For example. C. However. C.? • How do the phenotypes and genotypes of our family and household environments affect individual outcomes? Does. J.. for example. and genetics sciences on the other hand. PSID would allow for within-family (cross-sibling or cross-cousin) and across-time (within-person) analysis what would alleviate some of these population stratification concerns. then researchers often look for specific outcomes which covary with the presence or absence of that particular allele. If this allele is shown to be randomly distributed across demographic subgroups (or. cross-national research. likewise. it is found to be associated with a specific social outcome or tendency (such as addictiveness. development and learning. S. class. D. D.. P. R. behavioral.R. S.. S.

94105. USA.0 Unported License. Suite 300.org/licenses/bync-sa/3. California.0/ or send a letter to Creative Commons. To view a copy of this license.This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3. visit http://creativecommons. 171 Second Street. San Francisco. 74 .

psychologists. Investment in risky ventures can be socially productive even when this risk cannot be diversified away. and neuroscientists. “We must have a ∗ This work is licensed under the Creative Commons Attribution–NonCommercial–ShareAlike 3. Key questions include: What components of aggregate risk exposure of the private sector are problematic for a society? How might we measure these in meaningful ways. It is neither feasible nor desirable to eliminate all aggregate risk. Lo October 15. USA. statisticians. financial regulation around the world largely consisted of a patchwork arrangement with a bevy of regulators overseeing various institutions and markets in isolation. decision theorists. To view a copy of this license. California. 171 Second Street. There are at least three major components to this challenge: modeling. Andrew W. No single regulator was responsible for looking across the global financial system and identifying vulnerabilities that might be building up from the complex interactions of actors throughout the economy. and data accessibility. San Francisco. 2010 Abstract An important challenge worthy of NSF support is to quantify systemic financial risk. Anil Kashyap. Progress on this challenge will require extending existing research in many directions and will require collaboration between economists. visit: http://creativecommons. any meaningful discussion and implementation of such policy requires better measurements and better models of the interaction of the role of financial markets in the macroeconomy that motivate or justify these measures. Calls for regulation based on concerns of systemic risk are premised on concerns that the potential excess risk-taking within the financial system will lead to government bailouts when losses mount. or send a letter to Creative Commons. Modeling and Measuring Systemic Risk 75 . Proposal An important challenge worthy of NSF support is to quantify systemic financial risk. 94105.0/. measurement. Suite 300.0 Unported License. However. Lars Peter Hansen.Modeling and Measuring Systemic Risk∗ Markus Brunnermeier. sociologists. and what data can be used to support these measurements? What guidance do models provide on the best way for regulators and private agents to manage systemic risk? Prior to the crisis. The recent financial crisis has focused widespread attention on systemic risk in the global financial system.org/licenses/by-nc-sa/3. Arvind Krishnamurthy. Designing appropriate policy interventions that do not create perverse incentives for the private sector is important. As Federal Reserve Chairman Ben Bernanke put it.

See Ben S. We argue that systemic risk is a major social problem because of the potential for significant spillover from the financial sector to the real economy. Policy concerns along these fronts have been articulated by many.. yet existing models that identify externalities in the financial system with macroeconomic consequences are highly stylized and fall short of generating formal guidance for statistical measurement. not just its individual components. Meaningful measurement requires a clear definition of systemic risk and thoughtful modeling of this construct. Currently. There are at least three major components to the challenge of monitoring these risks: modeling. D. there is a much smaller literature on equilibrium models that include a role for financial market frictions and can speak meaningfully to financial stability. a substantial portion of the Dodd-Frank Wall Street Reform and Consumer Protection Act details how systemic risk should be regulated.”1 The global regulatory response to the crisis has followed Bernanke’s dictum. and regularly.strategy that regulates the financial system as a whole. There is a sharp contrast between our understanding of price stability and our understanding of financial stability and systemic risk. In the last decade there has been a substantial literature that explores dynamic stochastic equilibrium models estimated by formal econometric methods. from a policy perspective and social welfare objective. and data accessibility. how best to contain it. March 10. including by Former Fed Chairman Paul Volcker in a September 24th speech at the Federal Reserve Bank of Chicago. it is impossible to determine the appropriate trade-off between such risk and its rewards and. quantitatively. In the United States. But fulfilling this object will be extremely challenging. the term “systemic risk” is mere jargon that could support the continued use of discretionary regulatory policy applied to financial institutions and lead to ad-hoc policies that are inconsistent and fraught with unintended consequences. These models have gained considerable prominence in research departments of central banks and have improved our understanding of price stability. creating various agencies and committees that are charged with monitoring and controlling these risks. Washington. This is the current grand challenge that faces us today. In contrast. “Financial Reform to Address Systemic Risk” at the Council on Foreign Relations. Unless we are able to measure systemic risk objectively. but also the data needed to measure it. in a holistic way. Without the potential for measurement. measurement. The transparency and rationality of regulatory policy would be greatly enhanced by the thoughtful modeling and reliable measurement of systemic risk. But modeling in this area is still primitive. Bernanke. we lack not only an operational definition of systemic risk.C. 2009 1 Modeling and Measuring Systemic Risk 76 . where the gaps in our knowledge are much more pronounced.

What is the current level of systemic risk in the global financial system? We cannot manage what we do not measure. Thanks to basic macroeconomic models from decades past, which motivated national income accounting measures, we can quantify the state of the economy in many ways. For instance, we know GDP growth (1.7% for 2010Q2), how non-farm payrolls have changed (–95,000 in September 2010), the level of unemployment (9.6% as of September 2010), the number of housing starts (598,000 in August 2010), and the rate of inflation in consumer prices (0.3% relative to the previous month in August 2010). We can measure the current risk of the U.S. stock market through the implied volatility of the S&P 500 index (19.88% as of October 14, 2010), and we can measure the relative value of the U.S. dollar compared with other currencies (76.666 as of October 14, 2010).

The increased complexity and connectedness of financial markets is a relatively new phenomenon that requires a fundamental shift in our linear mode of thinking with respect to risk measurement. Small perturbations in one part of the financial system can now have surprisingly large effects on other, seemingly unrelated, parts of that system. These effects have been popularized as so-called "Black Swan" events—outliers that are impossible to predict—but they have more prosaic origins: they are the result of new connections between sectors and events that did not exist a decade ago, thanks to financial innovation and technological progress.

Given the complexity of the financial system, it is unlikely that a single measure of systemic risk will suffice. We anticipate that the variety of inputs ranging from leverage and liquidity to codependence, concentration, and connectedness will all be revealing. A more integrated approach to studying these challenges will lead to enhanced understanding of their economic interactions and statistical relationships. Moving beyond standalone inputs to a joint study will be difficult but is necessary if this task is to be achieved. This will push modeling in new directions and reveal new challenges for measurement.

The required models for measuring systemic risk will need to have quantitative ambitions of sufficient scope to confront real externalities that are induced by financial market behavior. Existing research from a variety of areas may be a useful catalyst for this new research agenda, but it requires significant modification, extension, and integration. Some research does exist that builds on measures of risk exposures of stochastic cash flows in asset pricing models, including discussions of volatility fluctuations and tail risk. This research has a long history, characterizing risk and return relations using statistical methods, but it is not tailored to the regulatory challenges going forward. To support this new research agenda, additional data must be collected, and the newly created Office of Financial Research offers one promising avenue to meet this challenge. The Census Department currently supports empirical investigations with confidential data, and it may be necessary to draw on their experience.

For instance, one intriguing approach to modeling the interaction of financial firms is to view the financial industry as a network. Network models have been used in a variety of scientific disciplines, including economics and other social sciences. When applied to financial markets, they capture direct spillover effects such as counterparty credit risk. The study of systemic risk also requires the study of indirect spillovers that occur through the prices that clear markets, because in a crisis situation these indirect effects might be even more potent. To push this approach in quantitative directions will require building on prior research from other fields that features quantitative modeling and empirical calibration.

In the crisis, policy-makers have had to fall back on qualitative models of systemic failure, such as the well-known Diamond-Dybvig model of bank runs. While these models have provided useful insights, policy could have been better calibrated if regulators could have relied on more sophisticated representations of the financial system. Going forward, insights from corporate finance and asset pricing, including research on asset prices that confront financial market frictions, the nature and dynamics of liquidity, a network structure, and corporate governance structures related to risk management are critical to building rational and practical models of systemic risk.

Research on mechanism design and incentives in the presence of private information has been a demonstratively successful research program. This program, however, has been more qualitative than quantitative in nature. Nevertheless, with the appropriate enrichments, it promises to provide one way of understanding better the systemic consequences of the failure of key components of a financial network. Mechanical models of market frictions run the danger of failing to provide reliable guides to behavior in response to changes in the underlying governmental regulations of financial firms.

How individuals, firms, and other entities respond to uncertainty in complex environments remains a challenge in economics and other social sciences. Concerns about ambiguity and, more generally, the challenge of learning and assigning probabilities in complex environments motivate the study of alternatives to the simple risk aversion model that has been a workhorse in economics. There are a variety of advances in decision theory, probability theory, and the cognitive neurosciences that give some guidance for how people do and should confront uncertainty, and there is scope for productive exchange with closely related literatures from sociology, psychology, and neuroscience. Converting these various insights into operational quantitative models is only in the early stages of development, but they offer promise in helping us understand better the challenges of measuring systemic uncertainty.

As mentioned previously, there is an extensive literature on measuring risk-return relations using statistical methods. Along some dimensions, this literature is now quite advanced. It features time variation in volatilities, typically measured using high-frequency data, and there are interesting extensions that confront tail risk using so-called Levy processes as alternatives to the mixture-of-normals models that have been analyzed extensively. This line of inquiry may provide some valuable inputs going forward, but the systemic risk research challenge will require that this statistical literature be pushed in new directions: away from the problem of characterizing risk-return patterns and providing inputs into pricing formulas for derivative claims, and towards identifying and characterizing the systemically important components of existing financial enterprises. New measures of risk or uncertainty will need to confront and quantify spillover effects that should be the target of regulation. High-frequency risk measures that are now commonly employed in the private sector and in academic research will have to be supplemented by low-frequency quantity information that measures the magnitude of imbalances that can trigger so-called "systemic events".

In summary, this is an exciting research challenge that can build upon a variety of previously disparate literatures to provide valuable insights, with major challenges going forward that involve collaboration among several disciplines in the SBE Directorate and beyond. Finally, systemic risk presents an attractive and intellectually stimulating area of inquiry that will attract young researchers.
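The direct counterparty spillovers described above can be made concrete with a toy calculation. The sketch below (in Python, with a purely hypothetical exposure matrix, capital buffers, and recovery rate that are illustrative assumptions rather than anything taken from this white paper) propagates a default cascade through a small interbank network until no further institutions fail:

    import numpy as np

    # Hypothetical interbank exposure matrix: entry (i, j) is the amount
    # bank i is owed by bank j.  All numbers are illustrative only.
    exposures = np.array([
        [0.0, 8.0, 2.0, 0.0],
        [1.0, 0.0, 6.0, 3.0],
        [0.0, 2.0, 0.0, 5.0],
        [4.0, 0.0, 1.0, 0.0],
    ])
    capital = np.array([5.0, 6.0, 4.0, 3.0])  # hypothetical equity buffers

    def default_cascade(exposures, capital, initial_defaults, recovery_rate=0.4):
        """Mark a bank as defaulted when write-downs on its claims against
        defaulted counterparties exceed its capital; iterate to a fixed point."""
        n = len(capital)
        defaulted = np.zeros(n, dtype=bool)
        defaulted[list(initial_defaults)] = True
        while True:
            losses = exposures[:, defaulted].sum(axis=1) * (1.0 - recovery_rate)
            newly = (losses > capital) & ~defaulted
            if not newly.any():
                return defaulted
            defaulted |= newly

    # Which institutions fail if bank 1 is the initial casualty?
    print(default_cascade(exposures, capital, initial_defaults={1}))

Even this toy version illustrates why leverage, capital, and connectedness need to be measured jointly: the same initial failure can be absorbed or amplified depending on how exposures and buffers are distributed across the network.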


Expanding Access to Administrative Data for Research in the United States

David Card, UC Berkeley
Raj Chetty, Harvard University
Martin Feldstein, Harvard University
Emmanuel Saez, UC Berkeley

Abstract

We argue that the development and expansion of direct, secure access to administrative micro-data should be a top priority for the NSF. Administrative data offer much larger sample sizes and have far fewer problems with attrition, non-response, and measurement error than traditional survey data sources. Administrative data are therefore critical for cutting-edge empirical research, and particularly for credible public policy evaluation. We discuss the value of administrative data using examples from recent research in the United States and abroad. We then outline a plan to develop incentives for agencies to broaden data access for scientific research based on competition, transparency, and rewards for producing socially valuable scientific output.

Eroding US Leadership

Traditionally, empirical research in the social sciences has relied on survey data sources such as the decennial Census, the Current Population Survey (CPS), or the Panel Study of Income Dynamics. In the post-war period the US led the way in the development of modern survey methods and, not coincidentally, in the development of the statistical techniques for analyzing these data. During the second half of the 20th century, the fields of political science, sociology, and economics were all revolutionized by US researchers using US-based survey data sources. The combination of data and methods established the nation's dominant position in the conduct of empirical social science research. Unfortunately, that dominant position is now at risk as the research frontier moves to the use of administrative data. Although a number of agencies have successful programs to provide access to administrative data – most notably the Centers for Medicare and Medicaid Services – the United States generally lags far behind other countries in making data available to researchers.

A Wealth of Administrative Data

Governments create comprehensive micro-economic files to aid in the administration of their tax and benefit programs. The Internal Revenue Service and the various state income tax administrations compile income data for all individuals and businesses. The Social Security Administration (SSA), for example, records annual data on earnings and retirement and disability benefit payments for virtually the entire US population. State agencies collect quarterly earnings reports from firms on behalf of the Department of Labor for nearly all paid workers in the private sector. The Medicare and Medicaid programs record information on the health care services received by their beneficiaries. School districts record detailed information on academic outcomes, classes, and teachers for all public school students. Counties record every real estate transaction. Indeed, a rich archive of information covering most aspects of socio-economic behavior from birth to death (including education, earnings, income, family composition, health and retirement, workplace and living place) is recorded in administrative data. Government agencies are also required to produce statistical reports that inform the public about their activities, and hence have already established statistical offices and set up the necessary files to produce such information. With the advent of modern computer systems, all these administrative data are stored in electronic files that can be used for statistical analysis.

Administrative data are highly preferable to survey data along three key dimensions. First, since full population files are generally available, administrative records offer much larger sample sizes. The full population earnings data from SSA or tax records, for example, is about 2,000 times larger than the CPS. Larger sample sizes can be harnessed to generate more compelling research designs and to study important but relatively rare events, like a plant downsizing that affects some workers but not others, or a severe local weather event. Second, administrative files have an inherent longitudinal structure that enables researchers to follow individuals over time and address many critical policy questions, such as the long-term effects of job loss or the degree of earnings mobility over the life cycle. Third, administrative data provide much higher quality information than is typically available from survey sources, which suffer from high and rising rates of non-response, attrition, and under-reporting.

The availability of detailed administrative data abroad has led to a shift in the cutting edge of empirical research in many important areas of social science, particularly in the fields of health and K-12 education, away from the United States and toward the countries with better data access. Outside the US, many countries have developed systems to allow access to administrative data for research purposes. In Denmark, for example, Statistics Denmark prepares de-identified data by combining information from administrative databases for approved research projects. Researchers apply for data access through accredited "centers" at major universities, and access is provided through an open competition process based on scientific merit. The data extracts can then be accessed by researchers remotely (from any computer, including the researcher's office desktop) through a secure server. To the best of our knowledge, research access to de-identified data has never resulted in the improper disclosure of confidential information. The record shows that access can be achieved in a way that maintains the strictest standards of privacy while still allowing researchers direct access to individual records.

Because the US retains worldwide leadership in the quality of its academic researchers, US-based researchers are often involved in research using administrative data from other countries. However, this situation is less than ideal for at least two reasons. First and most important, many questions of central importance for US policy making cannot be tackled using evidence from other countries. Access to existing administrative US data is required to evaluate the effects of various specific US government policies, such as stimulus spending, on job creation and overall personal income. US public policy would be far better served by having top researchers focus on US policy issues using US data. Second, in the long run, the development of administrative data access abroad will foster the development of empirical and econometric research programs in those countries, in the same way that the development of US survey data was accompanied by great scientific progress in empirical methods in the social sciences in the United States in the 20th century.

Regaining US Leadership

Over the years, the United States has developed a number of initiatives to provide access to administrative data for research. A leading example of the research impact of routine access to administrative micro-data is CMS, the Centers for Medicare and Medicaid Services. Many hundreds of medical studies each year use the agency's Research Data Assistance Center (ResDAC) to develop requests for micro-data files (including data protection plans), which are then reviewed by CMS. Routine access to Medicare and Medicaid files has enabled US healthcare researchers to maintain their global leadership position in the field and has yielded many important public benefits. Numerous examples, from CMS and from a variety of pilot efforts at federal, state, and local government agencies, show that it is possible to provide secure access to de-identified administrative data (i.e., data that have been stripped of individual identifiers such as names, addresses, and social security numbers) to researchers.

However, access to data on income and earnings is not as satisfactory, although some valuable initiatives exist. In recent years, unemployment insurance records for many states can be accessed through the LEHD program at the Census Bureau, although this is onsite at a Census RDC. SSA earnings data have been accessed by researchers through internships or co-authorship with SSA researchers. The Statistics of Income division of the US Treasury has also launched a promising tax data access program for statistical research purposes. In all these cases, the lack of sufficient resources and cumbersome data access severely limit the research potential.

Because of confidentiality and security concerns, administrative data cannot be made publicly available. In principle, some access can be provided through synthetic data, which is simulated micro data constructed to mimic some features of the actual data. This approach is much less attractive than providing direct access to the full administrative data set because in practice it is virtually impossible for researchers to fully specify the contents of the ideal synthetic dataset in advance. The option of sending computer programs to agency employees, while providing some data access, is also substantially inferior to direct data access because it does not allow for the inductive phase of data analysis that is critical for many empirical projects.

Based on experiences from other countries and these pilot initiatives, we believe that five conditions must be satisfied to make a data access program sustainable and efficient: (a) fair and open competition for data access based on scientific merit; (b) sufficient bandwidth to accommodate a large number of projects simultaneously; (c) inclusion of younger scholars and graduate students in the research teams that can access the data; (d) direct access to de-identified micro data through local statistical offices or, more preferably, secure remote connections; and (e) systematic electronic monitoring to allow immediate disclosure of statistical results and prevent any disclosure of individual records. We emphasize that direct access to micro-data is critical for success. Alternatives such as access to synthetic data or submission of computer programs to agency employees will not address the key problem of restoring US leadership with cutting-edge policy-relevant research.

The Value of Competition

In principle, having a centralized agency that is able to obtain administrative data from all government branches and then maintain it and supply de-identified data to approved research projects, as in the Danish case, is an attractive model. However, this model is less attractive in the US for three reasons. First, there is a long tradition of distrust of centralized government in the US, and in particular of monopoly control by a single government agency. Any successful data access program must acknowledge the salience and value of this tradition. Second, relative to other countries, the US government is far more decentralized, with multiple agencies at three different levels covered by different privacy laws and statutory limits on inter-agency data sharing. Finally, it would seem reasonable to leverage the existing statistical offices of US administrative agencies for both their expertise and as a base for access to such confidential data.

We therefore believe that it is preferable to leverage the multiple-agency setting and the principle of interagency competition by allowing and encouraging different agencies to provide their own data access systems, subject to agency-specific rules that ensure the strictest standards of privacy. This model, which closely parallels the model of the Centers for Medicare and Medicaid Services, is much more robust than the centralized agency model from the perspective of both privacy and efficiency. Currently, however, the main hurdle in the development of research partnerships between agencies and external researchers is the lack of internal incentives and the lack of dedicated agency resources. This could be addressed by rewarding agencies for performance. Performance in scientific production is easily measurable via metrics such as peer-reviewed publications. Rewards to agencies could take the form of resources provided by the major research funders (NSF and NIH) that would help agencies strengthen their statistical offices and develop partnerships with researchers. A well designed system would encourage agencies to improve their statistical capabilities and data access, and would unleash the forces of innovation as agencies compete for the best research projects. This model can also be extended to private institutions that gather data valuable for research (such as utilities, for the analysis of energy and resource conservation) to create incentives for research partnerships.

The Value of Cooperation

Experience from abroad and from the United States shows that there is tremendous value in carrying out research by merging data, for example educational data and earnings data. A centralized agency, as in Denmark, naturally allows such merging. However, starting from the decentralized landscape we have described, it should be possible to encourage partnerships between two government statistical agencies (or between a statistical agency and an external partner such as a non-profit or business) to accommodate research requiring merged data. Such cooperation will naturally arise if all parties can share the benefits of the scientific output. Precedents for this kind of cooperation exist even in the US. Recently, the Florida Department of Education has teamed up with the state UI agency to allow linking of student education records to subsequent earnings outcomes. Both government agencies and private institutions already have multiple business contracts for data work where outside contractors access the data for a specific business purpose. Scientific research should follow the same model, where NSF or NIH funds researchers to carry out scientific projects with the data.

Another important case where data cooperation is valuable is the long-term analysis of randomized field experiments. Field experiments are a powerful but costly method for scientific evaluation of alternative policy choices, and the US was an early leader in the use of field experiments to evaluate negative income tax policies in the 1960s. The ability to systematically merge experimental data with administrative data can overcome difficulties of tracking, non-response, and under-reporting in conventional survey-based measures, and allow the analysis of long-term outcomes, hence substantially expanding the scientific value of randomized experiments at low cost.
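Merging de-identified records across agencies, as in the Florida example above, is mechanically straightforward once the parties agree on a privacy-preserving join key. The following minimal sketch (in Python; the shared key, identifiers, and record contents are hypothetical, and a real program would add per-project salting, key management, and disclosure review) is offered only as an illustration of the idea, not as anything proposed by the authors:

    import hmac
    import hashlib

    # Two agencies agree on a secret key out of band and exchange only keyed
    # hashes (pseudonyms) of the raw identifier, never the identifier itself.
    SHARED_KEY = b"project-specific-secret"  # assumption: provisioned securely

    def pseudonym(raw_id: str) -> str:
        """Return a stable keyed hash of a raw identifier (e.g., an SSN)."""
        return hmac.new(SHARED_KEY, raw_id.encode(), hashlib.sha256).hexdigest()

    education_records = {pseudonym("123-45-6789"): {"highest_grade": 12}}
    earnings_records = {pseudonym("123-45-6789"): {"earnings_2010": 41000}}

    # Merge on the pseudonym rather than the raw identifier.
    linked = {
        k: {**education_records[k], **earnings_records[k]}
        for k in education_records.keys() & earnings_records.keys()
    }
    print(linked)

The point of the sketch is only that record linkage does not require either party to reveal raw identifiers such as Social Security numbers to the other.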

White paper for NSF/SBE 2020: Future Research in the Social, Behavioral & Economic Sciences (October 14, 2010)

Title: Prize good research!

Authors: Gary Charness (UCSB) & Martin Dufwenberg (U of Arizona)

Abstract: We propose that rather than financing projects that have been proposed, the funding agency should award prizes for research that has already been done.

Main text: NSF/SBE has invited individuals and groups to contribute white papers outlining grand challenge questions that are foundational and transformative. We consider the following question important: Which procedures should be used for evaluating and incentivizing research? We propose a new approach: rather than financing projects that have been proposed, the NSF should award prizes for research that has already been done. We imagine that this proposal has several benefits and few disadvantages:

• As regards incentives for researchers to do good research, little will change, and if so for the better. Under current conditions researchers have to make a case that they will do well in the future to get supported. Under our scheme, they actually have to do well to get supported (again).
• Evaluating the quality of research already done is easier than evaluating research to be done. There will be less risk of mistakes.
• Under our new proposal, evaluating applications will take less time and effort than now, since published research has already gone through a review process; evaluators (to some extent) can view that work as delegated to the referees of the journals that have accepted the work.

Our proposal has one obvious drawback: young researchers may be disadvantaged if they have not had time to establish good track records. That takes more or less five years. Therefore we propose a 'junior exception': researchers less than five years out of their PhD may choose to apply for funding for research they propose to do rather than for prizes for research that they have already completed.

In regards to the NSF's scope question #1, our proposal is important because, whatever the goal for the research that SBE/NSF wishes to support, our method may improve the accuracy of getting there. In regards to scope question #2, the method should require less infrastructure for conducting the evaluation.

This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.


Market Design: Harnessing Market Methods to Improve Resource Allocation1

Peter Cramton2

15 October 2010

Abstract

The emerging field of market design applies auctions and matching to solve resource allocation problems. This paper focuses on auction design, the branch of market design where money is used to facilitate the exchange of goods and services. Within auctions, the paper examines applications involving government regulated resources. Tremendous opportunities lie ahead and will be realized in coming decades with further scientific advancement in auction design. The rewards to society from improved markets will be immense.

As deficits grow and baby-boomers age, governments face increasing challenges in making the best use of public resources. Who should use the scarce radio spectrum and at what prices? How should electricity markets be organized? How should financial markets be regulated? And how should runway access be assigned at congested airports? All of these are important questions in major industries. Researchers in market design have made substantial progress in answering these questions over the last fifteen years. Despite this rapid progress, the field holds much promise to provide better answers in even more complex economic environments over the next two decades.

One successful innovation to improve the allocation of scarce public resources is for the government to harness market methods to improve decision making. The spectrum auctions, conducted by the Federal Communications Commission (FCC) since 1994, are an excellent example. These auctions, which arose from a collaboration between the FCC and scientists with auction expertise, both public and private, involved interdisciplinary teams of economists, computer scientists, and engineers, all working to solve real problems. The efforts have led to over $100 billion in new non-distortionary U.S. government revenue and, more importantly, have put the scarce spectrum resource into the hands of those best able to use it. This innovation has been a win-win for taxpayers, the companies participating in the auctions, and the hundreds of millions now enjoying advanced wireless communications services. The auction program has been replicated worldwide and remains a key example of effective government.

The spectrum auction program stimulated key scientific innovations in our understanding of how to auction many related items, both in the U.S. and abroad (Milgrom 2004). The advances, although at the forefront of theory, have been closely tied to practice, and these innovations have been applied not just to spectrum auctions but to e-commerce and industry. Still, the advances to date, while important, are only the tip of the iceberg. Communication and computational advances have certainly played an important role, but the development of simple and powerful auction methods has been important too. Market designers now have a much richer set of tools to address more complex problems, and auction applications are rapidly expanding.

1 This work is licensed under the Creative Commons Attribution-NoDerivs 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nd/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.
2 Professor of Economics, University of Maryland, www.cramton.umd.edu.

One example is a package auction (or combinatorial auction), in which bidders can bid on packages of items (Cramton et al. 2006). In a package auction, bidders can express preferences for complementary items without running the risk that they will win just some of what they need. This is important, for example, in spectrum auctions in which different technologies require that the spectrum be organized in different ways. In the past, the regulator has been forced to decide how the spectrum is organized with a specific band plan, effectively deciding how much spectrum is available for each technology. A package auction enables the regulator to conduct a technology-neutral auction, in which the band plan is determined through the competitive bids of the bidders, not by the regulator. A good example is the recent spectrum auctions in Europe, in which the quantity of paired versus unpaired spectrum is determined in the auction.

One of the challenges of package auctions is finding an effective way for bidders to convey preferences: there are simply too many packages to ask for preferences for all possible packages. A common approach is to begin with a clock auction. The auctioneer names a price for each product, and bidders respond with their most preferred packages. The price is then raised on all products with excess demand, and the bidding continues. This price-discovery process focuses the bidders' attention on the packages that are most relevant. Once this price discovery is over, the bidders are in a much better position to submit any additional bids, as well as improve the bids already submitted. An optimization is then done to determine the value-maximizing assignment, as well as competitive prices that satisfy the stability constraints. Typically, there are many such prices, so a further optimization is done to find the prices that provide the best incentives for truthful bidding.

Package auctions are also proposed for auctioning takeoff and landing rights at congested airports, such as the three New York City airports. Left to their own devices, airlines will overschedule flights during peak hours, creating congestion and costly delay. The goal of the auction is to make the best use of scarce runway capacity. The package auction enables each airline to bid for its preferred package of slots. The resulting competitive prices motivate airlines to substitute away from expensive slots, either by shifting flights to less expensive times or by using larger aircraft to carry the same number of passengers with less runway use.

Another example of market design is electricity markets. Modern electricity markets are organized as a number of auction markets. Spot markets determine how much each supplier is generating on a minute-by-minute basis, forward energy markets enable customers and suppliers to lock in medium-term prices for electricity, and long-run investment markets coordinate new entry to cover any expansion in electricity demand. The markets, taken together, are designed to provide reliable electricity at the least cost to consumers, and these auction markets must be carefully designed to work together to achieve the goal of least costly, reliable supply.

Design failures can be quite costly, as the California electricity crisis of 2000–2001 demonstrated. Design failures are all too common and persistent in government settings. For example, the Medicare competitive bidding program, which began over ten years ago, is still in a pilot stage and experiencing serious problems, in large part as a result of the implementing agency failing to apply state-of-the-art methods and principles to the problem of how to price Medicare equipment and supplies. The recent financial crisis is another example where the principles of market design, if effectively harnessed by regulators, could have prevented or at least mitigated the crisis. These failures can involve trillions of dollars of cost to society, and certainly involve many billions. Good auction design in complex environments involves more than good intentions—it requires exploiting the substantial advances that we have seen in market design over the last fifteen years. When the stakes are high, an important step in market design is building prototypes and then testing those prototypes in the experimental lab or in the field before full-scale implementation.
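The clock phase described above is easy to prototype. The sketch below (in Python) runs a bare-bones price clock on two hypothetical products; the supply, starting prices, increment, bidder valuations, and the single-unit demand rule are all illustrative assumptions and not part of Cramton's proposal:

    # Toy clock auction: raise prices on products with excess demand until
    # aggregate demand fits supply.
    supply = {"A": 2, "B": 1}             # units available per product
    prices = {"A": 10.0, "B": 10.0}       # starting clock prices
    increment = 5.0

    # Each bidder demands one unit of its best affordable product (toy rule).
    valuations = [
        {"A": 40.0, "B": 25.0},
        {"A": 30.0, "B": 45.0},
        {"A": 35.0, "B": 20.0},
        {"A": 15.0, "B": 50.0},
    ]

    def demand(prices):
        """Aggregate demand per product at the current clock prices."""
        counts = {g: 0 for g in prices}
        for v in valuations:
            best = max(prices, key=lambda g: v[g] - prices[g])
            if v[best] - prices[best] >= 0:
                counts[best] += 1
        return counts

    while True:
        d = demand(prices)
        over = [g for g in prices if d[g] > supply[g]]
        if not over:
            break
        for g in over:                    # raise prices only where demand exceeds supply
            prices[g] += increment

    print("clock prices:", prices, "final demand:", demand(prices))

As the paper notes, a real implementation would follow this price-discovery phase with an optimization stage that selects the value-maximizing package assignment and supporting prices.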

One exciting aspect of market design is working at the forefront of theory and bringing that theory to practice. In both auctions and matching, solving real problems has proved to be an excellent way to develop new theory: the applications benefit from the improved markets, and the theory is enriched in the process. The process typically has involved scientists from several disciplines, especially economics, computer science, operations research, and engineering. New and powerful specialties have emerged, such as algorithmic game theory within computer science (Nisan et al. 2007).

Market design is a young and vibrant field. Courses in market design are now offered at many leading research universities at both the undergraduate and graduate levels. Specialized interdisciplinary conferences are common, even within research organizations that have historically focused on traditional fields. For example, the National Bureau of Economic Research now has an interdisciplinary market design group that meets annually. About one-half of this year's attendees were advanced doctoral students from elite research universities from around the world.

Over the last fifteen years, the emerging field of market design has demonstrated the power of harnessing market methods to allocate scarce resources. The process has involved interdisciplinary efforts among economists, computer scientists, and engineers, focused on solving practical problems of resource allocation. The societal gains from these efforts have been substantial in several major industries such as telecommunications, energy, and transportation. The field holds much promise for future advances in both theory and application. Given the close and complementary connection between the science and the practice, these advances will produce substantial and lasting welfare gains to society over the next twenty years.

References

Cramton, Peter, Yoav Shoham, and Richard Steinberg (2006), Combinatorial Auctions, Cambridge, MA: MIT Press.

Milgrom, Paul (2004), Putting Auction Theory to Work, Cambridge: Cambridge University Press.

Nisan, Noam, Tim Roughgarden, Eva Tardos, and Vijay V. Vazirani (2007), Algorithmic Game Theory, Cambridge: Cambridge University Press.


Why Don't People and Institutions Do What They Know They Should?

David M. Cutler
Harvard University and NBER

Allegheny General Hospital is a 728-bed academic health center located just outside of Pittsburgh and serving the surrounding five-state area. The hospital is big and complex. In 2003, the medical and cardiac intensive care units at Allegheny saw 1,753 patients and placed 1,110 central lines – tubes leading to a main artery to administer nutrition and monitor blood gases. That year, there were 49 Central Line Associated Bloodstream Infections (CLABs), resulting in 19 deaths.1 A CLAB rate of 4.4 percent is the norm for American hospitals and certainly good enough for a hospital with many other pressing issues. But it was not good enough for the chief of medicine at Allegheny General. In the next few years, the chief introduced several changes in its central line practice. It standardized the placement and duration of central lines and authorized everyone involved in patient care to stop the process if a step was not followed. It monitored infections in real time and undertook corrective action when an infection was observed. The intervention worked. Within just three years, the rate of central line infections fell by 95 percent. Since the cost of a central line associated bloodstream infection is about $50,000, the hospital saved nearly $2 million.

So far, so good. In economic theory, other hospitals observe what has happened at Allegheny General and imitate it, and health care as a whole gets cheaper and safer. But that has not happened. The problem is what comes next. Despite widespread publication of the results at Allegheny General and a few like institutions, rates of hospital-acquired infection are going up. Nationally, about one in twenty hospital patients are harmed because of the care provided in the hospital, and medical errors are among the leading causes of death. Hospital-acquired infections cost the medical system about $30 billion annually. Hospital infection control officers – every hospital has one – are frustrated. They know that medical errors lead to death and higher cost, but they can't get their institutions to focus on the problem. Standards, monitoring, and the ubiquitous checklist exist in theory, but not yet in practice. Indeed, Peter Pronovost at Johns Hopkins, the leading evangelist for checklists, when asked how long it would take for them to diffuse throughout the medical system, replied "At the current rate, it will never happen."

Throughout the medical system – indeed, in every facet of life – people and institutions do not do things that are valuable, inexpensive, and relatively straightforward to do. This problem is not a minor one. In addition to the central lines example, consider a few others:

o American automobile firms never found a way to match the quality practices of Japanese automakers, despite a willingness of Japanese firms to share best practices.
o Three-quarters of Americans prescribed a drug for a chronic condition have stopped taking the medication by one year later. Even when the drug is free, long-term adherence is low.
o Only 69 percent of Americans always wear a seatbelt when they drive, even though 95 percent of Americans believe that a seat belt would help them in an accident.

These examples share common features. In all cases, everyone agrees on the right thing to do. There is little serious debate about whether seat belts save lives and no debate that giving people infections is a bad idea. Further, the monetary costs of undertaking the actions are low. The monetary cost of the infection reduction program at Allegheny General was trivial, and fastening a seat belt costs nothing. But yet, the actions are not taken. As the list illustrates, these features are common to many economic and social settings. I propose as a central question for the social and behavioral sciences the understanding of such problems: why do people and institutions not do things that are so obviously in their self-interest, even when they want to do so?

The literature in the social sciences has addressed this question in various guises. Behavioral economists have examined individual propensities to engage in different actions. Why do people not save for retirement or give up smoking? A major theme of that research is that people are prone to procrastination. People do not take their medications because the cost of not taking today's pill is trivial, if one will start taking pills tomorrow. This theory is relevant in some settings; there are demonstrated successes getting people to save more by reducing the ability to procrastinate. But the theory fails in other settings. When queried, hospital managers rarely announce that they will start infection control operations next month. Rather, they assert that they are already doing the best they can – the Allegheny General experience notwithstanding.

In sociology, the peer effects literature confronts similar questions. People wear their seatbelt if others around them do as well. Smoking is clearly a social action, and so too are obesity and mood. Again, this theory has strengths. But the theory is not right in all settings. Allegheny General Hospital did not get better because like-minded people came to the conclusion that it had to change or because of peer interactions. It improved because the Chief of Medicine imposed changes. Further, the change had to be continually monitored and stressed, or gains made one month were at risk of being undone.

In organizational behavior, there is a large focus on principal-agent problems within the firm. The firm's manager wants to do something new, but does not want to create new problems while addressing existing ones. There are a variety of strategies that firms might use to surmount this issue. Organizational behavior specialists study the combination of hiring, compensation, and promotion processes that lead to better and worse outcomes. The question then becomes why some firms successfully tackle the problem and others do not. Aside from a specific person, what is different about Allegheny General relative to the thousands of other US hospitals that still have high rates of hospital-acquired infections?

All of these disciplines are right in some circumstances, but they all have limits. What we need to make progress is a scientific study of doing the right thing – what makes the right outcome happen or not, and what are the barriers to repeating success? I do not know what the answer to this question will be. But I believe there are some ways to address it. Three features of inquiry strike me as particularly salient.

First, we need to better understand how people view their social environment. Some people are motivated by the desire to fit in with others – they 'go with the flow' as much as possible. Others have a strong moral compass to always do what they perceive as right. Still others are motivated to be at the top of the hierarchy, or to avoid being at the bottom. What are the characteristics of people in each of these groups? Do people of similar types cluster together, or do different types co-exist? To date, we are not good at this type of measurement. Analyzing individual personality is likely to involve standard survey methodology, but the type of questions asked will be different from what is usual. For this type of analysis, we will almost surely need new measurement techniques.

Second, we need to understand the processes of group decision-making. There is relatively little literature on how to characterize an organization, as opposed to a collection of individuals. When people in an organization disagree about the best strategy, how are decisions made? Initial decisions are often made in a top-down setting – witness Allegheny General – but they are sustained by a culture of individual belonging and empowerment. Even at Allegheny General, infection control would not happen if every nurse and every doctor did not participate. How are organizational cultures born, and how do they spread? Not all cultural changes are the same. Many firms have imitated Toyota's production methods, for example, but not all firms have been successful. When asked, employees describe it as part of the culture.

Third, we need to conduct experiments to understand different theories of behavior and test different interventions. The most influential studies in economics have come from interventions – changes in the information people possess, the incentives they face, or the environment they operate in. The use of experiments has revolutionized the study of economic development, labor economics, and health economics, to name just a few areas. Experiments are by nature costly and time-consuming. But the return more than justifies the cost.

To see this, return to the health care example. In the past 18 months, the United States engaged in a prolonged health care debate. One of the major points of contention was whether health care reform could 'bend the cost curve' – that is, limit the increase in medical spending over time. If we can bend the cost trajectory, health reform will be a huge success. If we cannot, reform will be a failure, and we may well repeal the recent reform legislation.

Bending the cost curve is ultimately in the hands of institutions like Allegheny General. If hospitals in general can do what Allegheny General has done, overall medical costs will fall more than enough to pay for the promises made. Thirty billion dollars of medical errors, after all, is a lot of money to save. At present, we know that savings are possible. In contrast, if Allegheny General remains an outlier a decade from now, the health reform effort will have failed. If the social and behavioral sciences can turn possible into certain, or even probable, we will have contributed more than our fair share to improving human welfare.

__________
1 Insert cite to Allegheny General.

A Challenge For the National Science Foundation: Broadening Black and Hispanic Participation In Basic Economics Research

William A. Darity Jr.*
Gregory N. Price**
Rhonda V. Sharpe***

October 15, 2010

Summary

This white paper considers the low participation rate of black and Hispanic Principal Investigators (PIs) and the distribution by institution of National Science Foundation (NSF) basic economics research grants. An analysis of NSF economics grants between 1990 – 2010 shows that black and Hispanic PIs received a very small share of awards and that 15 institutions received over 50% of the funds awarded. Such an outcome represents a challenge for science policy if indeed broadening participation is a serious objective. We conclude that NSF economics should 1) aim to cultivate and sponsor research that examines the causes and consequences of black and Hispanic underrepresentation among NSF economics grantees, 2) make concerted efforts to recruit proposal reviewers and proposal review panelists from a diverse set of institutions, and 3) incentivize broad participation and racial/ethnic diversity in basic economics research by penalizing institutions for not achieving respectable levels of racial/ethnic diversity on their economics faculties.

_____________
*Sanford Institute of Public Policy, Duke University, 302 Towerview Rd., Durham NC 27708, (919) 613-7336, email: william.darity@duke.edu
**Department of Economics, Morehouse College, 830 Westview Dr. SW, Atlanta GA 30314, (404) 653-7870, email: gprice@morehouse.edu
***Division of Business & Economics, Bennett College for Women, 900 E. Washington St., Greensboro NC 27401, (336) 517-2193, email: rsharpe@bennett.edu

As one of its core science policy goals, the National Science Foundation (NSF) has gone on record as being committed to "broadening participation in terms of individuals from underrepresented groups as well as institutions and geographic areas that do not participate in NSF research programs at rates comparable to others,"1 underscoring its commitment to promoting science in a racially/ethnically diverse society.

Arguably, innovations and new discoveries in economic science have significantly transformed society over the past century. Many of these innovations and new discoveries have been funded in part by NSF grants from the Economics Program―the largest disciplinary program in the Social, Behavioral and Economic Sciences (SBE) Directorate at NSF. As research funds are a core input into the growth of basic scientific knowledge, the NSF's role as a provider of basic research funds is important to the community of scientists. In this context, broadening participation to ensure that individuals from underrepresented racial/ethnic groups receive adequate basic research funding is a sound science policy goal. Indeed, as the U.S. becomes increasingly racially/ethnically diverse, so should the pipeline of potential and actual scientists.

Notwithstanding the importance of the NSF's Economics Program to sustaining the growth of knowledge in economic science, our analysis suggests that it falls far short of satisfying the NSF's goals of broadening participation in the basic research enterprise. A consideration of NSF economics awards made during 1990 – 2010 suggests that one challenge NSF faces is broadening the participation of black and Hispanic PIs in basic economic research. As it currently stands, the participation rate of blacks and Hispanics, as measured by the percentage of grants they received in recent history, is in our view intolerably low and incompatible with a science policy goal of diversifying our nation's cadre of economic scientists engaged in the basic research enterprise. The funding rate to black and Hispanic economic scientists mimics the apparent color line in the hiring of economics faculty in U.S. colleges and universities (Price, 2009).2

__________
1 See: Broadening Participation At The National Science Foundation: A Framework For Action. National Science Foundation, Arlington VA, 2008.

If one considers, for example, the number of economics awards made to black and Hispanic Principal Investigators (PIs), the results are rather sobering. Table 1 reports the number of black and Hispanic PIs we could identify over the 1990 – 2010 period from data made publicly available by the NSF.3 Black PIs were identified on the basis of a roster of known black economists in academia, as reported in Price (2009). Hispanic PIs were inferred by selecting those individuals who received an economics award and had either a recognizably Hispanic first name and/or surname.4 In addition, our counts of awards made to blacks and Hispanics could be downwardly biased, as our data only reflect individuals who were PIs and not Co-PIs. Our analysis reveals that over the 1990 – 2010 period NSF awarded grants to 31 and 50 black and Hispanic PIs respectively.5

Table 2 provides an overview of the economics awards with respect to their distribution across race/ethnicity. The shares of economics awards received by black and Hispanic PIs were approximately 1.0 and 1.7 percent respectively. Given the total dollar value of awards received, the shares of award dollars going to black and Hispanic PIs were approximately 0.5 and 1.1 percent respectively of a total of 703 million dollars awarded over the time period under consideration. In general, the economics award distribution reported in Table 2 underscores a vulgar racial/ethnic inequality in access to basic research funds in economics. An analysis of economics awards by race/ethnicity also reveals the extent to which the NSF replicates existing racial/ethnic inequality―the underrepresentation of blacks and Hispanics on the economics faculties of research universities (Price, 2009). At its best, a continuation of this funding policy by NSF will only serve to replicate the existing racial/ethnic inequality on the economics faculties of our nation's research universities. At its worst, a continuation of this funding policy sends a signal that black and Hispanic research scientists are less capable and/or worthy of engaging in research that merits NSF support. As we

__________
2 See Gregory N. Price, "The Problem of the 21st Century: Economics Faculty and the Color Line", Journal of Socio-Economics, 38(2), 2009, pp. 331–343.
3 These data are available at http://www.nsf.gov/awardsearch. We use award data for which only the Principal Investigator can be identified.
4 We recognize that this approach to imputing Hispanic PIs may impart an upward bias, as non-Hispanic females who marry a Hispanic male may retain a Hispanic surname.
5 As there were instances in which several black and Hispanic PIs had multiple awards during 1990 – 2010, the percentages reported in Table 2 do not reflect the raw counts of the PIs identified in Table 1.
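As a quick check on these shares, the percentages can be recomputed directly from the award counts reported in Table 2 (31, 50, and 2,861 awards for black, Hispanic, and all other PIs respectively). The snippet below is only that arithmetic, not part of the authors' analysis:

    # Award shares recomputed from the counts reported in Table 2.
    counts = {"black": 31, "hispanic": 50, "all_other": 2861}
    total = sum(counts.values())
    for group, n in counts.items():
        print(f"{group}: {n}/{total} = {n / total:.2%}")
    # black: 31/2942 = 1.05%, hispanic: 50/2942 = 1.70%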

see it, this is a fundamental challenge for NSF economics. It is our hope that in this 21st century, NSF economics will rise up to this challenge and eradicate this apparent color line in the funding of basic economics research.

The analysis of awards by race/ethnicity provides insight into the pattern of awards important for the professional development opportunities afforded by NSF grants. However, it does not shed light on the pattern of awards to institutions that either do not house, or have only a small fraction of, black and Hispanic PIs. The examination of awards to institutions is an approach to identifying the effects of exclusionary social capital associated with particular institutions. Table 3 supports the findings of Feinberg and Price (2004) that National Bureau of Economic Research (NBER) membership matters with respect to NSF funding.6 Between 1990 and 2010, 15 institutions received 55 percent of all awards funded and 71 percent of all dollars allocated by the economics program (element code 1320). The NBER alone received 536 awards, 18% of the total and nearly 5 times as many awards as the second-ranked institution, Northwestern University, with 112 awards. While the number of awards received by NBER is alarming, the inequality portrayed in Table 3 suggests that Feinberg and Price's (2004) recommendation that "NSF and other funding agencies ... consider supporting alternative scholarly networks which could generate social capital for economists who are not NBER associates"7 may not be a sufficient remedy alone.

The award distribution data in Table 2 provide evidence of a vulgar "color line" in the funding of basic economics research that denies black and Hispanic economists an opportunity to be full participants in the funded research enterprise. In the case of black economists, the institutions in Table 3 account for approximately 5 percent of all known black economists in academia.8 The dominance of these institutions in the awarding of grants by the Economics Program thus crowds out the possibility of success by a broader cross-section of black PIs at other institutions. These institutional inequalities could indeed be a barrier to broadening the participation of black and Hispanic PIs as NSF Economics Program grantees. This is particularly ominous if these institutions have advantages in the grant awarding process that are not necessarily tied to merit, but to the fact that these very institutions dominate the ranks of NSF Economics advisory panels and external reviewers.

__________
6 See: Feinberg, Robert M. and Gregory N. Price, "The Funding of Economics Research: Does Social Capital Matter for Success at the National Science Foundation?", The Review of Economics and Statistics 86(1), 2004, pp. 245–252.
7 Ibid., page 9.
8 See Price (2009) for an itemized roster of black economists on the faculties of colleges/universities.

We recommend that NSF economics consider and implement strategies that would increase the fraction of awards going to black and Hispanic economists. This would include, for example, sponsoring research that explores the causes and consequences of the submission and success rates of black and Hispanic economists—which would inform strategies for broadening their participation in the basic economics research enterprise that requires funding. Additionally, we urge NSF to make a concerted effort to diversify advisory review panels, the population of external referees, and rotating program officers. To the extent that the low participation of black and Hispanic PIs reflects their absence on the faculties that typically receive economics grants, we encourage NSF to consider putting real teeth into its "Broader Impact" merit criteria for evaluating grants by penalizing institutions that fail to achieve respectable levels of racial and ethnic diversity on their faculties. The penalty could include not making grants to institutions that, for example, have either never hired a black or Hispanic economist or have a persistently small share of black or Hispanic economists. A simple perusal of Table 3 can reveal economics programs that have benefitted tremendously from NSF economics support but have never had a black or Hispanic economist on their faculty. We suggest that this is very low "Broad Impact for the Buck". NSF can, and should, do better by incentivizing behavior that is consistent with good science policy in a society that is becoming increasingly racially/ethnically diverse.

NSF Economics does warrant some noteworthy praise for its support of mentoring/education programs that enable a pipeline of minority economic scientists. For at least 20 years, NSF economics has supported the American Economic Association Summer Minority Program. In more recent years, it has supported the Minority Pipeline Project and the Diversity Initiative for Tenure In Economics (DITE). Such support is indeed laudable, and if these programs are effective, they will significantly promote racial diversity in the supply of capable economic scientists. Nonetheless, our view is that minority economic scientists would also benefit from having fair and reasonable access to basic research support. While not reflected in our data, we suspect that the ratio of research to non-research support to minority economists is

rather low, and should be increased to enable the minority pipeline catalyzed by NSF support to develop further into capable and effective research scientists.9

Underlying our descriptive analysis exists evidence that providing funding for basic economics research to black and Hispanic economists has beneficial effects on both the pipeline and the practicing community of minority economic scientists. Granger and Price (1998, 2000) find that the research productivity of minority economics faculty is positively correlated with the production of minority graduates who go on to earn doctorates in economics.10 Chung (2000), for example, demonstrates that an increase in the proportion of minorities on the faculty has the effect of enhancing confidence among minority students that they too can succeed as college professors.11 Price (2007) reports evidence suggesting that the receipt of NSF funding by minority economists has a substantial effect on their research productivity as measured by publication in refereed science journals.12 Collectively, these findings suggest that broadening the participation of black and Hispanic economists in the receipt of NSF economics grants would promote a valuable social goal by effecting racial/ethnic diversity in both the pipeline and the practicing community of research scientists.

However challenging the implications of our analysis are for NSF science policy, we are confident that our descriptive analysis of NSF Economics funding provides a useful framework for future science policy interventions that would make for a substantive "Broader Impact" as it relates to racial/ethnic diversity.

This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.

__________
9 We suspect that total NSF economics support for the AEA Summer Minority Program, the Minority Pipeline Program, and DITE between 1990 and 2010 is a nontrivial fraction of total research support to black PIs over the same period—which was $3,630,342 (see Table 2).
10 See: Agesa, Jacqueline, Maury Granger and Gregory N. Price, "Economic Research at Historically Black Colleges and Universities: Rankings and Effects on the Supply of Black Economists", Review of Black Political Economy 25:4, 1998, pp. 41–54; and Agesa, Jacqueline, Maury Granger and Gregory N. Price, "Economics Research at Teaching Institutions: Are Historically Black Colleges and Universities Different?", Southern Economic Journal 67:2, 2000, pp. 427–447.
11 See: Chung, Kim-Sau, "Role Models and Arguments for Affirmative Action", American Economic Review 90:3, 2000, pp. 640–648.
12 See: Price, Gregory N. (2007), "Would Increased National Science Foundation Research Support to Economists at Historically Black Colleges and Universities Increase Their Research Productivity?", Review of Black Political Economy 38(1/2), pp. 87–109.

999 12.992 50.064 150.2010 Black Principal Investigator Institution University of Kansas Wake Forest University Yale University Spelman College North Carolina A&T NBER University of North Carolina /Duke University Williams College Harvard University/NBER North Carolina A&T Stanford University Syracuse University NBER University of North Carolina Columbia University New York University Rutgers University/William and Mary University of North Carolina University of Central Florida Yale University University of Michigan Elizabeth Asiedu Sylvain Boko Donald Brown Myra Burnett Darnell Cloud Susan Collins* William Darity Economics Pipeline No No No # of Awards 1 1 1 1 1 1 4 1 1+Career 1 Career 1 2 2 1 1 4 1 1 Career 1 2 1+Career 3 7 1 5 1 1 5 1 4 3 4 2 2+Career 2 1 5 Value of Awards ($) 39.050 571.000 20.001 108.000 717.946 428.887 88.905 20.000 73.367 966.789 17.789 216.779 20.000 131.670 801.000 129.272 360.007 675.363 130.972 250.359 686.000 753.858 152.278 No Yes No Yes Yes No No No No No Yes No No No No No No Kaye Fealing* Roland Fryer * Maury Granger Peter Henry William Horrace Caroline Hoxby William Jackson Philip Jefferson* Yaw Nyarko William Rodgers Rhonda Sharpe Kasaundra Tomlin Ebonya Washington Warren Whatley Fernando Alvarez Manuel Amador Andre Lopez-Aradillas Hispanic Principal Investigator NBER No Stanford University No Princeton University/University of No Wisconsin Ricardo Cabellero NBER/MIT/Columbia University No Graciela Cabana None No Ann Carlos University of Colorado No Kathyrn Dominguez NBER No Linda Fernandez UC-Santa Barbara No Raquel Fernandez NBER/New York University No Ivan Val-Fernandez Boston University No Jesus VillaverdeNBER/University of Pennsylvania/Duke No Fernandez University Edward Miguel UC-Berkeley No Jose Victor Rios-Rull University of Pennsylvania/CarnegieNo Mellon Julio Rotemberg NBER No Emmanuel Saez NBER/UC-Berkeley No Xavier Salai-i-Martin NBER No Manuel Santos Arizona State No Jose Schienkman Princeton/University of Chicago No *Denotes alumni of the AEA Summer Program and Minority Scholarship Program 103 .994 20.055.398 169.666 374.260 81.280 386.771 51.690 248.500 225.283 20.307 701.000 469.339 1.320 360.Table 1: Black/Hispanic Principal Investigators: 1990 .

494.274 14.870 13.312 76.57 University of Chicago* 11.309 703.013.787. http://www.984 Table 3: Schools with the Largest Dollar Awards: 1990 .2 241.020.81 2.833.390.084.924 16.75 2.816 Total Number of Awards 27 536 67 111 31 112 86 102 69 86 84 77 9 81 68 59 1.2010 Number of Awards Black Hispanic All Other 31 50 2861 Percentage of Awards .248.380 16.86 2.786.gov/awardsearch ∗ Denotes institutions that have never hired a black.605 Total Source: NSF Awards Data Base.110.874.000 Aggregate Value share .92 3.77 1.098 Harvard University 11.371.974.973 Average Award Size ($) 107.2010 Organization University of Illinois at Urbana-Champaign ∗ National Bureau of Economic Research Inc University of Michigan Ann Arbor Stanford University National Opinion Research Center* Northwestern University University of Pennsylvania* Princeton University University of Minnesota-Twin Cities* University of Wisconsin-Madison* New York University University of California-Berkeley* Santa Fe Institute* Yale University Aggregate Value of Awards ($) 127.010 .47 2.05 3.000.01 54.686.Table 2: Summary of NSF Economics Awards by Race/Ethnicity: 1990 .9 Aggregate Value of Awards ($) 3.826.200 110.260. 104 .011 .5 161.000.749.949 14.533 12.605 Percentage of All Awards 0.936 14.28 3.721 496.35 2.92 18.092.017 .081 13.342.nsf.22 2.005 .62 0.31 2.561.630 8.31 2.027 14.000 17.566.484 12.92 2.

Three research themes

Peter Diamond 1

I am an applied theorist and policy analyst. I want to identify three areas of research that are important and have large potential payoffs. In all three areas I focus on needs and opportunities for theoretical analyses, since these are areas and research methods with which I am familiar, without any intention to underplay the importance as well of other research inputs, including empirical and experimental work. It seems to me important for NSF to have a balanced portfolio, incorporating areas where important progress is likely and others where there is higher risk and the potential of more seminal advances. The first, optimal taxation of capital income, is an area of steadily advancing "normal science" that is making significant progress. The other two, incorporating behavioral economics into equilibrium analyses and understanding systemic risk, are more foundational.

Taxation of capital income

As long as income has been taxed, an issue has been how capital income should be taxed relative to how labor income is taxed. The optimal tax literature addresses tax setting to accomplish social goals in light of the constraints and behavioral dimensions in the economy. While democratic governments will inevitably compromise different goals, there is an equity-efficiency tradeoff which needs to be understood when thinking how to accomplish social goals. Different mixes of social goals, different revenue needs, and different behavioral parameters call for different tax solutions. In other words, understanding the links among tax structures, behavioral parameters and equilibrium outcomes is central to having an informed debate about tax policies.

The famous Mirrlees (1971) optimal income tax paper launched the modern analysis of progressive taxation of earnings. Since then, there have been repeated advances in theory, extending the framework to incorporate elements not present in the initial approach, and in understanding how to use the insights from theoretical, empirical and simulation analyses for policy recommendations. As one example I cite the analysis by Saez and by Judd and Su that moved beyond the assumption of a one-dimensional distribution of workers differing only in skill, not preferences, to a recognition of how to approach optimization while recognizing a higher dimensional and so more diverse population. Another example is in the work of Diamond, Saez, and Laroque incorporating an extensive margin (the participation decision) for lower paid workers as more important than the intensive margin (number of hours worked) in the response of workers to taxes. The assumptions used by Mirrlees did not allow for a relevant extensive margin. The Earned Income Tax Credit (EITC) approach of subsidizing work by low earners is consistent with optimal taxation with an important extensive margin, while such subsidizing of work is not part of an optimum in the Mirrlees model.

1 Institute Professor, MIT, pdiamond@mit.edu
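To make the link between behavioral parameters and tax prescriptions concrete, a benchmark formula often cited in this literature (it is not stated in Diamond's note itself; the notation follows the standard later exposition, and the numerical values in the comment are purely illustrative) gives the revenue-maximizing top marginal tax rate as a function of two estimable quantities:

    % e = elasticity of taxable income with respect to the net-of-tax rate (1 - tau)
    % a = Pareto parameter of the upper tail of the earnings distribution
    \[
      \tau^{*} \;=\; \frac{1}{1 + a\,e}
    \]
    % Illustrative values: with e = 0.25 and a = 1.5, tau* = 1/(1 + 0.375), roughly 0.73.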

I was first exposed to the cognitive psychology that underlies much of recent advances in behavioral economics over 40 years ago. From time to time I tried writing theoretical analyses incorporating insights from 106 . In contrast. While intertemporal concerns are real. A start has been made on incorporating diversity of savings behavior. Roth IRAs. including 401(k)s. I have long thought that incorporating empirically supported behavioral models is of great potential for the usefulness of economics. savings behavior has been one of the prime areas in which behavioral economics has identified and documented widespread responses that are not consistent with the standard individual choice model. Further complication comes from the diversity of savings behavior identified in empirical studies. Moreover. and the Savers Credit. Analyses now recognize a key role for uncertainty of future earnings in optimizing capital income taxation. I cite the work of Farhi and Werning. there is wide evidence and modeling advances to reflect the limits on the applicability of this model. Indeed.better developed theory. this has not yet happened with tax analyses. one can not make sense of savings decisions and the proper role of the taxation of capital income without paying attention to the dynamic perspective that saving is being done for future rewards. important progress in recent years. there has been significant. Improved access to government tax data would help this effort. advances in techniques of analysis of intertemporal behavior under uncertainty will influence other areas of analysis. Moreover. While Social Security analyses have considered a mix of underlying behavioral savings models. along with some recognition of intertemporal connections as workers may forego some earnings in order to accumulate more human capital. The fundamental question for this theme is how taxes out to be set to best accomplish social goals. IRAs. Incorporation of behavioral economics in equilibrium analyses The explosion of empirical findings and policy modifications arising in behavioral economics is extremely exciting. While the life-cycle model is the standard starting place for savings analysis. and Diamond and Spinnewijn. they have not been viewed as so central as to undercut the value of the insights from the one-period analyses. There is a “standard model” of taxes and labor supply in a single year that is the widely used starting place for analyzing earnings taxation. Taxation is extremely important for the functioning of an economy. Neither precautionary balances in the presence of an uncertain future nor very large accumulations by some fit in the standard model. Thus the literature has not yet come to grips in detail with the best tax treatment of retirement savings. Continuing support for advances in both theoretical understanding and empirical findings on behavioral responses is central to the potential for improving government policy. modeled as diverse preference parameters among standardly modeled savers. we can better examine and analyze the choice of parameters for the EITC in a coherent theoretical framework. Golosov and Tsyvinski with various co-authors. through the late Amos Tversky. While analysis of optimal taxation of capital income began in the mid 1970s.

The limits were not from any resistance to psychological ideas nor from over-attachment to the standard model, but from the difficulty of making valuable theoretical advances incorporating behavioral insights. Great advances have been made in recent years. So far the advances have been in empirically documenting and formally modeling behaviors that show the workings of psychological insights in actual markets. However, the impact on models of entire markets or the entire economy has been limited. That is not to say that people do not recognize the presence of behavioral biases in phenomena such as the recent housing bubble, but that our ability to incorporate realistic aspects of such behavior in formal models has not advanced very far. One important advance is by Gabaix and Laibson.

The speed with which some insights and empirical documentation have moved into government policy has been heartening. The potential for improved understanding of the workings of the economy and for improved individual, business and government policies is enormous. Supporting the full range of behavioral economics, empirical, experimental, and partial and general equilibrium modeling must be a very high priority. The fundamental question for this theme is how more realistic pictures of individual decision-making affect the allocation of resources throughout the economy, while identifying policies that can improve outcomes.

Systemic risk

The global financial crisis and the great recession are critical events for the economy and for recognition of research needs. The magnitude of the crisis has largely surprised economists and non-economists. There have been heated discussions of the extent to which existing macroeconomic studies have contact with recent experience and can help inform the design of policy. 2 Finance economists and macroeconomists need to revisit the foundations of their subjects. Narrative histories of the crisis and its effects have identified a number of elements that, in combination, contributed to the magnitude of the crisis. My view is that many of the important elements have been identified in individual research papers. However, by and large, these papers stand as small parts of the macro research output and have not been put together into a systematic, widely analyzed overall view.

2 Of course we do need macroeconomics to deal with more than just great crises, but at present supporting research on great crises seems very valuable. See for example the statement of Robert Lucas: "The problem is that the new theories, theories embedded in general equilibrium dynamics of the sort that we know how to use pretty well now-there's a residue of things they don't let us think about. They don't let us think about the U.S. experience in the 1930's or about financial crises and their real consequences in Asia and Latin America. They don't let us think, I don't think, very well about Japan in the 1990s. We may be disillusioned with the Keynesian apparatus for thinking about these things, but it doesn't mean that this replacement apparatus can do it either. It can't. In terms of the theory that researchers are developing as a cumulative body of knowledge-no one has figured out how to take that theory to successful answers to the real effects of monetary instability." ("My Keynesian Education," History of Political Economy, 36(4), 2004, page 23.)

I identify three examples of central elements: bubbles, financial engineering, and the role of large institutions and their interactions in counter-party risk. These are not the only areas needing significant research support, but ones where I have identified a need and opportunity with some detail.

Bubbles have happened for a very long time, so deep understanding will be of long-lasting importance. Based on their long history and experimental findings, I think bubbles will always be with us. Better studies of the psychological and rational bases that lead people to generate and continue to participate in a bubble are needed, as are analyses of the nature of the generating process. The work of Case and Shiller and of Campbell and Shiller have been valuable, but there is much to do. Out of such studies should come a picture of how policies can limit the magnitude and time extent of bubbles and when such policies might be called for.

Financial innovation, financial engineering, and the role of such contracts in response to differences in subjective probability beliefs (and not just differences in information) as well as in shifting risks, are very important for how the economy deals with risk. And yet the properties and limited understanding of some financial assets, along with the housing bubble, have been important ingredients in the financial meltdown. We should understand behavior by both buyers and sellers with limited understanding of financial engineering. Better understanding of how to use, and how to limit the use, of existing risk sharing innovations, as well as new ones bound to appear, is of critical importance. (Gorton and Stein have identified some key issues.)

Financial intermediation is central for a well-functioning economy, with a particularly important, and dangerous, role for mediating between different maturity desires of savers and investors, with the associated maturity mismatch. Currently, to a large extent this intermediation is done by large financial institutions that deal with each other a great deal. Models based on price-taking behavior can not come to grips with the risks from such interconnections among large firms. This issue has been well identified by Hellwig. Yet price-taking models have, historically, played a key role in the development of finance theory, with the tools and approaches spreading to many other areas. Making significant advances in understanding the roles of large players in financial markets, which has begun, will be difficult and important. Economic equilibrium is an inherently complex phenomenon, made more so in recognition of the fact that the allocation of resources plays out in real time, not in some timeless, coordinated way as in the long-standing general equilibrium (Arrow-Debreu) model.

The fundamental question for this theme is how to avoid a repeat of the global financial crisis that we have just experienced and, more generally, how to better understand the workings of asset markets in order to have better individual decisions and better regulatory policies from better understanding.

The three themes I have explored all center on basic theoretical research. Having better models and better understanding of the determinants of equilibrium in models incorporating intertemporal decisions under uncertainty, behavioral decision-making, diversity of individual decision types, large financial firms, and complex assets will advance the fundamental science of economics. The same applies to the techniques developed for optimizing the government tax and regulatory policies. Both the model structures and the analytical methods are likely to have a wide range of applications. Successful advances in any of these areas would immediately appear in graduate education, opening up new avenues for research for students and faculty alike. Supporting these themes means advancing the domain, building capacity, and providing infrastructure.








The Contribution of Data to Advances in Research in International Trade: An Agenda for the Next Decade

Jonathan Eaton 1 and Samuel Kortum 2

September 2010

Abstract

Observations from new sources of data spawned at least two revolutions in research in international trade during the last several decades. Yet many sources of data remain inaccessible to researchers. The situation calls for both the gathering and dissemination of data and the construction of modeling frameworks that can link data of various types at different levels of aggregation. Applying such a framework to the appropriate data will link the aggregate outcomes that policy-makers focus on with their implications for individual households and producers in the economy. Such an agenda has the potential to confront a wide range of issues. An important one is understanding the connections between the invention and international diffusion of technology and growth, employment, and welfare.

1 The Pennsylvania State University
2 The University of Chicago

In recent decades, how mainstream economics has approached research has undertaken a rapid  evolution.   Economists are less willing to develop theories solely for the sake of their logical elegance or  their contribution to a canonical tradition.   The analysis of data has played a much more central role not  only as a means of testing theory but as a guide to developing theory.  Economists have made use of a  vast array of datasets to learn about the world and how to model it.  This evolution is a response to two related shifts in the environment.  One is much easier access to a  wide range of data, both at the aggregate level and datasets of individuals, families, establishments, and  firms.  The second is the computational power to handle large datasets and to estimate models that  exploit them.  Hence gathering new sources data and making them accessible to a wide range of  researchers is a major public good worthy of support.  We provide a brief overview of how this rapid evolution has transformed the field of international trade  and what we think are the remaining challenges in that field.  We then discuss what this transformation  may have to say about the direction of research in the discipline generally.  New datasets changed the direction of research in international trade:  Before looking forward a decade to 2020, it’s useful to look back at recent progress in international  trade.   Many individual researchers have contributed to this progress, but what stands out is the crucial  role played by the introduction of rich new data sources.   Before the 1980s, research in international trade typically involved the construction of elegant  theoretical models and the exploration of their internal logic.   Occasionally implications of these models  were “tested” on some available numbers but the analysis of the data themselves was never seen as the  driver of research.   What data were available pertained to a small range of industries and products.   Hence the theory never delved beneath these broad aggregates, treating all producers in these sectors  as using common technologies to make a homogeneous good.  When more detailed product‐level data on international trade became available, an early empirical  study by Grubel and Lloyd (1975) documented the prevalence of intra‐industry trade, a finding totally at  odds with the standard approach to thinking about international trade.  This provocative result led to  the development of the new trade theory, with its emphasis on product differentiation.  An important  insight was that trade not only benefitted people by exploiting comparative advantage and differences  in factor endowments, but gave producers and consumers access to a much wider variety of inputs and  products.    Initially getting access to these data was difficult, so few researchers made use of them.  They were  made widely available (and clearly documented) by Feenstra, Lipsey, and Bowen (1997).  These data  cover annual trade in goods between essentially all countries, disaggregated into hundreds of individual  products. Their availability stimulated a vast literature on estimation of gravity equations, and on trade  theories compatible with the gravity equation.  This work has proven useful for a wide range of  questions, including assessing the welfare benefits of tariff reductions.  Recently, even richer data have  become easily accessible from COMTRADE and WITS. 
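To illustrate what "estimation of gravity equations" involves in practice, here is a minimal, self-contained sketch (not from the paper; the country count, distance measure, parameter values, and the log-linear OLS specification are illustrative assumptions) that fits log bilateral trade on log GDPs and log distance using synthetic data:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 40                                    # hypothetical countries
    gdp = rng.lognormal(mean=10, sigma=1, size=n)
    coords = rng.uniform(0, 100, size=(n, 2))

    rows = []
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dist = np.linalg.norm(coords[i] - coords[j]) + 1.0
            # Synthetic "true" gravity relationship with log-normal noise.
            flow = gdp[i] * gdp[j] / dist**1.2 * rng.lognormal(0, 0.3)
            rows.append((np.log(flow), np.log(gdp[i]), np.log(gdp[j]), np.log(dist)))

    y = np.array([r[0] for r in rows])
    X = np.column_stack([np.ones(len(rows))] +
                        [np.array([r[k] for r in rows]) for k in (1, 2, 3)])

    beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS on the log-linear form
    print("constant, exporter-GDP, importer-GDP, and distance coefficients:")
    print(np.round(beta, 2))

Recovering coefficients close to the assumed elasticities (here about 1, 1, and -1.2) is the basic exercise; real applications add importer and exporter fixed effects and confront zeros in the trade matrix.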


More recently, data on the exporting behavior of individual producers have become available.  Bernard  and Jensen (1995) introduced such data for U.S. plants as did Roberts and Tybout (1997) for developing  countries.  Biscourp and Kramarz (2002) went directly to French Customs for micro data to analyze  imports and labor market outcomes.  Eaton, Kortum, and Kramarz (2004, 2008) examined the exporting  side of these data, using them to estimate a model of firm heterogeneity and export activity.  While  these micro datasets typically cannot be widely distributed, as can the bilateral trade data, researchers  know where they can go to work with them.  Each year we see data from additional countries becoming  available.  These new data have stimulated a host of new models that incorporate the underlying  heterogeneity of firms participating in international trade, and the resulting distributional consequences  of trade policies.   A key insight was that international competition can increase productivity by weeding  out inefficient firms and giving efficient ones room to expand.   The results of all these new data are a better understanding of basic facts, theories that are motivated  by these facts rather than only by their own internal logic, and more precise estimates of model  parameters.  Yet, while new data has stimulated much empirical work, it has not undone the long  tradition of careful general‐equilibrium reasoning that has been a hallmark of the international trade  field for nearly 200 years. Thus, the quantitative models coming out of this work remain useful for policy  analysis.  But to do so successfully they need to integrate the detail in the micro‐level data with the  aggregates of interest to policy‐makers.  Lack of data impedes the investigation of key questions about technology and employment:  While much progress has been made in the last decade, some basic questions are still unanswered.  Looking forward, perhaps the most important question in the field is whether economic openness has  consequences for living standards that differ substantially from estimates obtained from static models  of trade.  A leading candidate for these dynamic gains is the diffusion of technology from one place to another,  facilitated by international commerce.  Here, the lack of good data has held back progress.  Whereas  flows of goods are measured directly, flows of knowledge are not. One source of information is patent  citations, which have been usefully exploited due to the work of Jaffe and Trajtenberg (2002) in making  such data widely available. Other sources are international patents, royalty payments, and foreign direct  investment positions.  Each of these sources of information is valuable but suffers from its own set of  problems.  Creative ideas about new measures or indicators in this area, followed up by the hard work  of assembling them, is of the upmost importance.  While the effect of international trade on labor market outcomes is a constant subject of popular  discussion, our knowledge of the connection between the two remains very limited, mainly because  little information is available about workers and where they worked.   Exceptions are France and  Denmark, which have detailed data sets matching workers and firms.  These datasets tell us how the  employment history of workers shapes their earnings, and how exposure to foreign markets, both as a  destination for sales and as a source of inputs, affects labor market outcomes.  
Datasets of these sorts  pose serious challenges for economic theory as well, as they require rich and flexible frameworks for 


understanding them.  Two studies that successfully dealt with these challenges are Postel‐Vinay and  Robin (whose 2002 Econometrica paper was awarded the Frisch Medal) and recent work by Lentz and  Mortensen (2010).   But these studies are only first steps of a highly challenging research agenda.  What is particularly needed here is access to data on firms decisions about investment, in particular the  nature of the capital goods they are using, about the inputs that they use and where they come from,  and how these interact with employment and productivity.  The challenge across fields is to design frameworks to accommodate data at different levels of  aggregation  Economists have typically addressed critical issues in economic policy‐making, such as: (1) the sources of  economic growth, (2) the determinants of economic fluctuations, (3) labor market transitions and  unemployment, and (4) how the first three relate to the interactions among individual countries in the  global economy, with aggregate data. Examples of such datasets include the Penn World Tables,  national accounts, unemployment statistics, and COMTRADE, data now readily available to researchers.   As a consequence, researchers working on these problems have tended to develop models capable of  explaining the data at this level of aggregation.   The real business cycle model, for example, interprets  national accounts data as reflecting the decisions of a representative consumer.  Another example is the  traditional analysis of international trade data in terms of a model in which firms in sectors of the  economy share a common technology.  The aggregate data themselves, however, are constructed from records of individuals and  establishments that economists rarely see.    Data at this level are usually confidential and are very  difficult for researchers to access, with a few exceptions.   For some time, however, economists have had access to data on individual households and firms, largely  based on surveys.  These data have led to significant advances in a number of fields such as labor  economics and industrial organization.   But the survey data are not typically the basis of what goes into  the aggregate data.  Hence research on individual units has proceeded quite independently from  research at the aggregate level.   In order to address macroeconomic policy issues the microdata  economists use need to serve as the basis for the aggregate data of interest to policy‐makers.  International trade data provide one example of why models that only operate at the level of aggregates  can be inadequate.  Bilateral trade flows are not the consequence of some aggregate forces, but of  decisions by individual firms to sell to individual buyers in a foreign market.  In some cases the number  of agents involved is quite small, to the point of becoming zero.  Understanding the decision to  participate in trade is thus as important as understanding how much to trade.  Confronting this feature  of the data requires a very different approach to modeling bilateral trade flows, building from the  individual decision‐maker up.   For other macroeconomic data enough agents usually participate so that  their individual decisions do not show through so strongly, but it is just as crucial to understand that  aggregate investment or research and development data, for example, reflect the very heterogeneous  decisions of individual firms. 
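As a minimal sketch of the "building from the individual decision-maker up" point (illustrative only; the Pareto productivity draws, the reduced-form profit function, the fixed export cost, and the market sizes below are assumptions, not estimates from the papers cited), one can simulate heterogeneous firms deciding whether to serve each foreign market and then aggregate their choices into bilateral flows, including zeros for small markets:

    import numpy as np

    rng = np.random.default_rng(1)
    n_firms = 1000
    productivity = (1 - rng.random(n_firms)) ** (-1 / 4.0)   # Pareto(shape=4) draws
    market_size = np.array([0.05, 0.5, 2.0, 10.0])           # four hypothetical destinations
    fixed_cost = 1.0                                          # per-market entry cost

    for m, size in enumerate(market_size):
        # Variable profit rises with productivity and market size (reduced form).
        variable_profit = size * productivity ** 2 / 100.0
        exporters = variable_profit > fixed_cost              # extensive margin
        exports = np.where(exporters, variable_profit, 0.0).sum()
        print(f"market {m}: size={size:5.2f}  exporters={exporters.sum():4d}  "
              f"total exports={exports:8.2f}")

Only the most productive firms enter small markets, so aggregate bilateral flows can be zero even though the aggregate "forces" (incomes, distance) are smooth, which is the sense in which aggregate-only models can be inadequate.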


What is required is not only access to the data themselves, but the design of analytic frameworks  consistent with data at these different levels of aggregation.  Researchers are just beginning to meet  these challenges.  What is the challenge question, the capability to be created, and the scientific strategy?  We pose one challenge question facing international economics.  How can we access and combine data  on trade, foreign direct investment, patenting, royalty payments, labor flows, and other measurable but  perhaps overlooked phenomena ,  to quantify the invention and international diffusion of technology  and its effects on employment and welfare?  The capabilities to be created are datasets, readily accessible to researchers, that provide information  about the individual units behind the aggregate data.  To address the issue of confidentiality progress  has been made on the creation of “fuzzy” datasets that hide the identities of individual agents while  revealing the moments in the data of interest to researchers.  The scientific strategy is the development of models that can connect the aggregates measures of  interest to policy makers with what is going on at the level of heterogeneous households and producers.    Bernard, A. B. and B. J. Jensen. 1995. “Exporters, Jobs, and Wages in U.S. Manufacturing Plants, 1972‐ 1986.” Brookings Papers on Economic Activity: Microeconomics, 67‐119.  Biscourp, P. and F. Kramarz. 2007. “Employment, Skill Structure, and International Trade: Firm‐Level  Evidence for France.” Journal of International Economics, 72: 22‐51.  Eaton, J., S. Kortum, and F. Kramarz. 2004. “Dissecting Trade: Firms, Industries, and Export  Destinations.” American Economic Review, Papers and Proceedings, 94: 150‐154.   Eaton, J., S. Kortum, and F. Kramarz. 2008. “An Anatomy of International Trade: Evidence from French  Firms.” NBER Working Paper No. 14610.  Feenstra, R. C.  R. E. Lipsey, and H. P. Bowen. 1997. “World Trade Flows, 1970‐1992, with Production and  Tariff Data.’’ NBER Working Paper No. 5910.  Grubel, H.G. and P.J. Lloyd. 1975. Intra‐industry Trade: The Theory and Measurement of International  Trade in Differentiated Products. New York: Wiley.  Jaffe, A. and M. Trajtenberg.  2002. Patents, Citations, and Innovations: A Window on the Knowledge  Economy. MIT Press.  Lentz, R. and D. Mortensen. 2010.  ``Labor Market Frictions, Firm Heterogeneity, and Aggregate  Employment and Productivity,  manuscript. 
 


Postel‐Vinay, F. and J.‐M. Robin. 2002. “Wage Dispersion with Worker and Employer Heterogeneity.”  Econometrica, 70: 2295‐2350.  Roberts, M. and J. R. Tybout. 1997. “The Decision to Export in Colombia: An Empirical Model of Entry  and Sunk Costs.” American Economic Review, 87: 545‐564.   


This work is licensed under the Creative Commons Attribution-NoDerivs 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nd/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.

Questions about the Future of the International Economy

Stanley Fischer, Bank of Israel

The problems that stand out are: (1) Assuming the center of gravity of the global economic system is moving towards Asia and the emerging market countries more generally, what are the implications for the management of the international economy, and for the future behavior of the international economy? (Among the issues worth thinking about, there's an article in a recent, or perhaps the latest, Foreign Affairs, by Jorge Castaneda, called something like "Not ready for prime time", which is about the move from the G-7 to the G-20. The title explains what he contends.) (2) What can/should be done to try to channel this process in a constructive direction? (3) What are the political implications of this shift? (4) What are the factors that could derail this process (remember that in the 1980s many believed that Japan would take over the world) and what would be the political and economic implications of such a derailing? (5) It's clear that the futures of China and India are critical to this process, and I don't know whether enough work is being done on those sets of questions. In addition, (6) the information explosion/Google/Facebook/government censorship of their activities in many countries is a critical and little-studied issue. (7) Demography and demographic trends. It's hard to believe that Russia, Japan, China, and Europe are simply going to stand by while their countries and economies become smaller and relatively less significant as a result of demographic trends. Presumably at some point they will make intensive efforts to reverse current trends (in China's case, the trend that must bother them is that India will become more populous). (a) What will/can they do to this end? (b) What consequences will such attempts have? (Note this is a long-term issue, but it's clear you're looking for long-term and profound issues.) (8) The energy issue, including the potential role of nuclear power and all its geopolitical ramifications, is not going away. NSF could advance understanding of this issue, which is important not only for the supply of energy, but also for the global power balance and for international imbalances.

No doubt all these subjects are already on your list. In any case, good luck with your well-timed and much-needed initiative.



Predictive Game Theory

Drew Fudenberg
Harvard University

Abstract: Game theory is used in a variety of fields inside and outside of social science. The standard methodology is to write down a description of the game and characterize its Nash or subgame perfect equilibria, but this is only sometimes a good approximation of observed behavior. The goal of predictive game theory is to develop models that better predict actual behavior in the field and in the lab. Core questions include: What determines people's behavior the first time they play an unfamiliar game? When are social or altruistic preferences important, and what do people believe about other people's social preferences? How do people update their play based on their observations? What sorts of "theories of mind," if any, are commonly used to guide play? How do people think about games with a very large number of actions, and what sort of "pruning" is involved? When will play resemble an equilibrium of the game, and which equilibrium will tend to emerge? Similarly, in a decentralized matching market, when will play converge to a stable outcome, and which one? To develop answers, researchers will need to combine insights from behavioral economics and psychology with formal modeling tools from economics and computer science.

1. Research Agenda

The standard methodology in applying game theory is to write down a description of the game and characterize its Nash or subgame perfect equilibria. This was a good starting point for game theoretic analysis, and has provided a number of qualitative insights. It also yields a good approximation of observed behavior in some cases, but in many others it is either too vague to be useful or precise but at odds with how games are actually played. With the increased use of game theory in a variety of fields inside and outside of social science, it is time to go beyond equilibrium analysis to get more accurate predictions of behavior in the field and in the lab. There have already been some tentative steps towards this goal, from several different directions; the challenge is to extend and perhaps unify these initiatives to build a coherent predictive theory.

A. Relaxing Equilibrium Analysis

A key component of this program is the further development of the adaptive justification for equilibrium, which holds that equilibrium arises as the long-run outcome of a non-equilibrium process of learning or evolution. Existing work has focused on tractable learning rules that yield qualitative insights about long-run outcomes. Researchers should now consider learning rules that more accurately describe how subjects update their play in light of their observations, and how this depends on importance and context. One possibility is to take into account various cognitive limitations on learning that have been observed in decision problems, such as the use of coarse categories, errors in computing posterior probabilities, and so on. Another avenue for improvement is the addition of explicit models of the subjects' "theories of mind": their beliefs about how other subjects think about the game. In addition, the literature on adaptation and learning in extensive form games should move beyond the rational or almost-rational approach to off-path experimentation by considering other reasons that subjects might test the consequences of an apparently suboptimal action. Also, researchers should begin to complement results on asymptotic behavior with results on the rate of convergence, and also with results that apply to laboratory settings, where subjects typically play ten, and at most fifty, repetitions of the game, so that their initial beliefs and attitudes can play a large role in determining what is observed over the relevant horizon.

In an extensive form game, even experienced players may not have learned how opponents respond to actions that have rarely if ever been used; as a result learning processes can converge to non-Nash outcomes such as those of self-confirming equilibria. Furthermore, in many cases in the lab and in the field, agents do not have enough experience with the game to learn even the path of play. This motivates a more careful and less agnostic treatment of the players' initial beliefs and attitudes. We also need a theory of how these beliefs are updated in light of observations and what the resulting play will be.

This is related to the second key component of the program, the further development of models of cognitive hierarchies and level-k thinking, which describe the outcome the first time people play an unfamiliar game. These models take as a primitive the players' beliefs about the play of unsophisticated "level-0" agents. Early work focused on simple matrix games, and supposed that level-0 agents give each action equal probability, but fitting these models to more complex games requires alternative ad-hoc modifications of level-0 play, and when all distributions over level-0 play are allowed the theory has very little predictive content. Thus, the cognitive hierarchy models should be complemented with an a priori method of determining level-0 play, which is especially important for making the technique useful for field data. Once again insights from behavioral psychology and economics should be brought to bear.

B. Multiple Equilibria

Many games of interest have multiple equilibria, even when restricting to standard solution concepts, and allowing players to have incorrect off-path beliefs (as in self-confirming equilibrium) only makes the set of equilibria larger. Yet there is no general and empirically valid way of selecting between them. There is a sizable theoretical literature that provides evolutionary/adaptive arguments for why cooperation should be observed in repeated games, but the existing theories are a poor match for the data from lab experiments: subjects do seem to cooperate when the gains to cooperation are sufficiently high, but do not cooperate in some settings that have cooperative equilibria.
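A minimal sketch of the level-k logic described above (purely illustrative; the 3x3 payoff matrix is made up, and level-0 is taken to be uniform, as in the early work mentioned):

    import numpy as np

    # Row player's payoffs in a made-up symmetric 3x3 game (columns = opponent actions).
    payoffs = np.array([[3, 0, 2],
                        [1, 4, 1],
                        [0, 2, 5]], dtype=float)

    def best_reply(belief, payoff_matrix):
        """Pure best reply to a probability vector over the opponent's actions."""
        expected = payoff_matrix @ belief
        return int(np.argmax(expected))

    belief = np.ones(3) / 3.0        # level-0: uniform play over the three actions
    for k in range(1, 4):
        action = best_reply(belief, payoffs)
        print(f"level-{k} plays action {action}")
        belief = np.zeros(3)         # level-(k+1) believes the opponent is level-k
        belief[action] = 1.0

The restrictiveness the text points to shows up immediately: changing the assumed level-0 distribution changes every subsequent level's prediction, which is why an a priori way of pinning down level-0 play matters.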

There is also a smaller experimental literature that has focused on the special cases of coordination games and signalling games. There is also a sizable theoretical literature on "equilibrium refinements," and a literature using stochastic stability to select equilibria; once again what is needed is an empirical characterization of behavior to serve as a constraint on theories of equilibrium selection. So one research question is to empirically characterize when cooperation occurs (varying payoff functions, what subjects observe about other subjects' play, etc.) and to then organize the findings in a way that makes testable predictions.

C. Heuristics for Tree Pruning and Similarity

How do people simplify complex strategic interactions: what classes of strategies are viewed as equivalent and which ones are discarded? How do people extrapolate from past experience in one game to play in a "similar" one, and what sorts of games are viewed as related? Ideas from computer science as well as psychology may be helpful here: computing the set of Nash equilibria of arbitrarily large games is complex, but some classes of games have more parsimonious representations that allow polynomial-time complexity. These same ideas may permit more efficient estimation of behavior rules in complex economic environments, as the behavior rules are based on the agents' simplified models of the environment as opposed to the environment itself.

D. Matching Theory

Classic matching theory is based on the idea of a stable match, but stability is not a good approximation of the outcomes of laboratory experiments on decentralized matching except in extremely small markets with a unique stable outcome. When there are multiple stable outcomes, the analysis of decentralized markets closely parallels that of equilibrium analysis, and raises similar questions: when will a stable outcome arise, and when it does, which one?

E. Empirical Validation

Work on predictive game theory should draw on lab and field data, and in many cases will be accompanied by explicit data analysis. A challenge in using field data is that the standard methodology imposes a form of subgame-perfect equilibrium as an identification condition to estimate model parameters. Recent work by Fershtman and Pakes relaxed this, allowing for players to maintain incorrect beliefs that are consistent with their observations. The challenges here are (1) to theoretically identify the sorts of equilibria that their algorithm tends to select, (2) to test if the implicit equilibrium selection is stable over time and to changes in government policy, and (3) to develop a way of testing if the equilibrium assumption is valid or if players have not even learned the path of play. A further challenge is to study non-equilibrium adaptation and learning on field data. Individual learning rules are notoriously hard to identify from laboratory data, so one focus will be the aggregate consequences of a population of agents using a distribution of rules. Another possibility is the use of exit surveys and in-game belief elicitations. Moreover, the current wave of internet-based field experiments would benefit from a grounding in the theory of non-equilibrium learning; this could be facilitated by running field experiments on the internet, either on "laboratory" sites or on commercial ones.

2. Implications

This program will require the use and support of existing game theory labs, and may well justify the construction of new ones. It will also require graduate students who are trained in game theory, experimental methods, and econometrics; at present many of the best theory students neglect these more applied domains. The program would also benefit from a more modern program for lab clusters than z-tree, with cleaner code and a more intuitive interface. Both the experimental and field components would benefit from improvements in computational game theory: this literature should continue to improve methods for computing Nash or subgame perfect equilibria in economically relevant games, but it should also take up the problems of computing and estimating equilibrium concepts that allow for incorrect off-path beliefs and/or cognitive errors, the sorts of non-Nash equilibrium outcomes that can persist even when players have a lot of experience with the game, and of simulating and estimating non-equilibrium dynamics.
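To illustrate the point above about the aggregate consequences of a population of agents using a distribution of rules, here is a toy simulation (not any author's model; the 2x2 coordination game, the half-and-half mix of rules, and all parameters are assumptions) in which half the agents use fictitious play and half use simple reinforcement:

    import numpy as np

    rng = np.random.default_rng(2)
    payoff = np.array([[2.0, 0.0],
                       [0.0, 1.0]])            # symmetric 2x2 coordination game
    n_agents, periods = 100, 200
    rule = np.array([0] * 50 + [1] * 50)        # 0 = fictitious play, 1 = reinforcement
    opp_counts = np.ones((n_agents, 2))         # pseudo-counts of opponents' past actions
    attraction = np.ones((n_agents, 2))         # reinforcement "attractions"

    for t in range(periods):
        order = rng.permutation(n_agents)
        actions = np.zeros(n_agents, dtype=int)
        for a in range(n_agents):
            if rule[a] == 0:                    # fictitious play: best reply to empirical beliefs
                belief = opp_counts[a] / opp_counts[a].sum()
                actions[a] = int(np.argmax(payoff @ belief))
            else:                               # reinforcement: choose in proportion to attractions
                p = attraction[a] / attraction[a].sum()
                actions[a] = rng.choice(2, p=p)
        for i in range(0, n_agents, 2):         # random pairwise matching
            a, b = order[i], order[i + 1]
            opp_counts[a, actions[b]] += 1
            opp_counts[b, actions[a]] += 1
            attraction[a, actions[a]] += payoff[actions[a], actions[b]]
            attraction[b, actions[b]] += payoff[actions[b], actions[a]]
        if (t + 1) % 50 == 0:
            print(f"period {t+1}: share playing action 0 = {np.mean(actions == 0):.2f}")

Tracking only the aggregate action frequencies, rather than each individual's rule, is the kind of object such simulations suggest can be identified from data even when individual learning rules cannot.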

3. Who is Doing Provocative Research?

The following very incomplete list is intended to give a sense of the scope of this agenda; it is far from exhaustive and reflects the availability biases of the author. Many people are doing exciting work on cognitive limitations in decision problems, including Xavier Freixas, David Laibson, Sendhil Mullinaithan, and Matt Rabin, but so far little of this work has been applied to learning in games. Colin Camerer, Miguel Costas-Gomes, Vince Crawford, Tek Ho, Rosemarie Nagel, and Dale Stahl are leading the surge in work on cognitive hierarchies. Pedro Dal Bó, Anna Dreber, Guillaume Frechette, and Dave Rand are doing intriguing experimental work on cooperation in repeated games. Ignacio Esponda, Philippe Jehiel, and David K. Levine are leaders in studying adaptive processes in extensive form games. Michel Benaïm, Josef Hofbauer, William Sandholm, and Sylvain Sorin are making important advances in the mathematics of dynamical systems and applying them to non-equilibrium dynamics. Federico Echinique, Muriel Niederle, and Leeat Yariv are studying decentralized matching in the lab. Chaim Fershtman and Ariel Pakes are developing estimation methods for field data that allow for incorrect off-path beliefs. Tim Salmon and Nathaniel Wilcox are pioneers in the econometrics of laboratory learning rules. Andrew Schotter has made provocative use of in-game belief elicitation. Bernhard von Stengel is a leader of computational game theory, and Jeff Shamma is a pioneer in bringing techniques from the feedback-control literature to the study of learning in games. Konstaninos Daskalakis and Tuomas Sandholm are exciting algorithmic game theorists with an interest in economic problems.

Long-range Research Priorities in Economics, Finance, and the Behavioral Sciences

Herbert Gintis
Santa Fe Institute

September 16, 2010

Abstract

In macroeconomic theory, I suggest supporting agent-based models of decentralized market systems with sophisticated financial sectors, as well as theoretical research that provides the analytical foundation for the phenomena discovered through agent-based models. In rational choice and game theory, I suggest increasing support for laboratory and field experiments in choice and strategic interaction, as well as support for analytical modeling of the phenomena discovered through experimental interventions. Finally, I urge the formation of a transdisciplinary department of NSF devoted to peer reviewed support of transdisciplinary work, with advisors drawn from all the behavioral sciences.

My remarks will cover two areas: macroeconomic theory and transdisciplinary research in rational choice and strategic interaction.

1 Macroeconomic Theory

Traditional macroeconomic theory has focussed on problems of monetary and fiscal policy in handling the stochastic nature of output and employment. Yet the financial interdependence induced by globalization and the increasingly critical role of finance in the modern economy have elevated financial instability to a preeminent position in the theory of economic fluctuations. Traditional macroeconomic theories are not equipped to handle these new problems because these models use highly aggregated models with one financial instrument (money) and they carry out only comparative static as opposed to dynamic analyses. This body of theory is not suited for dealing with financial instability. In their stead, we need models of the price and quantity adjustment processes in a highly disaggregated, decentralized market economy. The obvious candidate, of course, is the Walrasian general equilibrium model.

There is, however, a serious impediment to its use, and a strong one, I believe: while existence theorems for Walrasian general equilibrium were perfected some sixty years ago, there has been virtually no progress in analytically modeling the dynamics of general equilibrium. In Gintis (2007a) I locate the problem with attempts to dynamicize the Walrasian model in the incoherent notion that out of equilibrium, there exists a common system of prices, although one that does not clear markets. In fact, out of equilibrium there are no "public prices" at all, but rather, each agent has a set of "private prices" that he deploys in engaging in transactions (individuals and firms trade when their private price systems "overlap" appropriately). Using agent-based modeling techniques (an agent-based model is constructed so as to be directly implemented in a computer program), I show in Gintis (2007a) and other papers that the general equilibrium system is stable, and private prices converge rapidly to "quasi-public prices" that entail a "quasi-equilibrium." However, I also show that the resulting model is fragile (amplifying rather than attenuating random shocks), and the economy occasionally experiences significant excursions from its quasi-equilibrium state (so-called "bubbles").

Figure 1, taken from Gintis (2007a), shows the history of the prices of goods in a ten-sector agent-based model. After about 100 periods, prices settle down to an average of unity, but in many periods, significant excursions from equilibrium are experienced by one sector or another. A next step, which would require considerably more computer power, would be to extend this to one hundred sectors with endogenously generated heterogeneous sector sizes and a more realistic (set of) financial sector(s).

Based on my research in this area, I project that there is much to gain from financing research in agent-based models of the macroeconomy with two interrelated goals. First, use large-scale agent-based models to predict economic fluctuations, and apply this in a way that includes a sophisticated, highly articulated financial sector. Second, use the insights gained from agent-based models to develop an analytical model of a decentralized Walrasian system. This endeavor will involve collaboration between economists and computer programmers. Economists funded in this area should be thoroughly capable of writing such agent-based programs at a professional level and supervising the work of programmers who are not trained in economic theory. It would be prudent in this rather novel research area for funders to encourage many small-scale "pilot" research projects for five or ten years, and avoid the temptation to fund a massive "blockbuster" project with pretensions to forecasting real-economy fluctuations until we know a lot more about the algorithmic and analytical foundations of dynamic market economies. This sponsored research should go hand-in-hand with support for economic theory that provides analytical foundations for the phenomena discovered through agent-based modeling.
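A minimal sketch of the private-price mechanism described above (a toy version written for illustration, not Gintis's actual model or code; the matching rule, the adjustment step, and all parameters are assumptions):

    import numpy as np

    rng = np.random.default_rng(3)
    n_agents, n_goods, periods = 200, 5, 300
    # Each agent carries a vector of private prices, initially dispersed around 1.
    private_prices = rng.lognormal(mean=0.0, sigma=0.5, size=(n_agents, n_goods))

    for t in range(periods):
        buyers = rng.permutation(n_agents)
        sellers = rng.permutation(n_agents)
        goods = rng.integers(0, n_goods, size=n_agents)
        for b, s, g in zip(buyers, sellers, goods):
            if b == s:
                continue
            bid, ask = private_prices[b, g], private_prices[s, g]
            if bid >= ask:                       # private price systems "overlap": trade occurs
                price = 0.5 * (bid + ask)
                # Both parties nudge their private price for this good toward the trade price.
                private_prices[b, g] += 0.1 * (price - private_prices[b, g])
                private_prices[s, g] += 0.1 * (price - private_prices[s, g])
            else:                                # no trade: buyer bids up, seller comes down
                private_prices[b, g] *= 1.01
                private_prices[s, g] *= 0.99
        if (t + 1) % 100 == 0:
            spread = private_prices.std(axis=0).mean()
            print(f"period {t+1}: mean cross-agent price dispersion = {spread:.3f}")

Even this stripped-down version shows cross-agent price dispersion shrinking over time without any auctioneer, which is the "quasi-public prices" phenomenon the text describes; a serious implementation would add endowments, production, and a financial sector.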

Figure 1: Sectoral prices in an economy with ten sectors. All relative prices are analytically computed to be unity in equilibrium.

2 The Age of Transdisciplinary Research

Over the past two decades, the natural sciences have experienced an explosion of transdisciplinary research. Cross-disciplinary research, once considered unfruitful and ad hoc, has become creative and innovative in dealing both with foundational problems in the interactions among physics, chemistry, biology, geology and other basic disciplines, and with epidemiology, biomedicine and other areas of applied research. Moreover, such research has been invaluable in dealing with environmental problems. Few students of the behavioral sciences doubt but that similar rewards will flow from an increased emphasis on transdisciplinary research in biology, economics, psychology, sociology, anthropology, and political science. Indeed, we have already seen some of these rewards in the maturation of experimental economics and economically-oriented experimental psychology, and tantalizing preliminary findings in neuroeconomics.

These transdisciplinary areas, which can be given the transdisciplinary name "behavioral game theory," have revolutionized economic theory and have provided the empirical evidence for new theories of individual choice behavior and strategic interaction.

It must be recognized, however, that transdisciplinarity in the social sciences, unlike the natural sciences, is hampered by the persistence of fundamental differences in modeling human behavior across the disciplines (Gintis 2007b, Gintis 2009). For instance, foundationally important insights from sociology and anthropology are simply ignored in disciplines that rely on game theory and the rational actor model. Conversely, the rational actor model and game theory, which are part of the theoretical core in economics, and the biology of social behavior are widely shunned in psychology, sociology, and anthropology. This situation, and especially the reluctance of most social scientists to address the situation, is a deep embarrassment to the scientific status of the social sciences, and must be overcome if transdisciplinarity in the social sciences is to have more than a brief life. I have sketched in Gintis (2009) how this might be achieved, but my efforts are at best a first step that needs much more supported research.

It is quite clear that there are huge gains to be made in the support of behavioral game theory executed by researchers in psychology, economics, sociology, anthropology, and biology. This research should involve both laboratory and field studies, and theorists should be involved in both specifying what are the important questions to be empirically addressed, and in formulating models that explain the empirical data generated thereby. To this end, I think NSF should set up a transdisciplinary department or section with peer review boards drawn from all the social sciences, staffed by researchers who are intellectually at home with the theoretical and empirical materials from diverse disciplines.

REFERENCES

Gintis, Herbert. 2007. "The Dynamics of General Equilibrium." Economic Journal 117 (October): 1289-1309.

Gintis, Herbert. 2007. "A Framework for the Unification of the Behavioral Sciences." Behavioral and Brain Sciences 30(1): 1-61.

Gintis, Herbert. 2009. The Bounds of Reason: Game Theory and the Unification of the Behavioral Sciences. Princeton, NJ: Princeton University Press.

135 . Stanford University. given the irreversibilties. These features imply that. E-mail: goulder@stanford. Three sources of political failure seem especially important (1) special interests. Similarly. and the loss of biodiversity.” This note is in response to the invitation by Myron Gutmann (Assistant Director of the National Science Foundation) to contribute white papers outlining grand challenge questions that are both foundational and transformative. efforts to reduce emissions of greenhouse gases must begin early enough to prevent atmospheric concentrations from reaching a level implying very serious damages from climate change. scholars have been far less successful in devising policy approaches that can overcome political barriers. The environmental problems are unprecedented in scope.edu. These problems are especially worrisome because of system inertia and associated irreversibilities. 2010 Abstract: Economists and other scholars have offered useful diagnoses of the market failures that underlie major global environmental problems. the damages will persist for centuries. Goulder1 October 15. action needs to be taken well before the worst costs or damages are observed.A Grand Challenge for the Social and Behavioral Sciences: Integrating Economic and Political Considerations in the Analysis of Global Environmental Policies Lawrence H. If instead emissions reductions are initiated after this point. The note suggests that NSF offer support to research that can help overcome critical bottlenecks associated with pressing global environmental problems. to avoid huge losses of human welfare. 1 Shuzo Nishihara Professor of Environmental and Resource Economics. For example. overfishing needs to stop before the fisheries in question are depleted. Environmental degradation resulting from human activities is nothing new. But human activities are now having an unprecedented global impact. (2) the public goods nature of global environmental problems and associated problem of free-riding. given the long atmospheric lifetimes of these gases. Particularly severe problems are global climate change. and (3) problems in news reporting stemming from changes in the technology for communicating environmental (and other) information to the public. the depletion of marine fisheries. They have identified the market failures without offering solutions to political failures. and species loss must be curtailed before the ecosystem services of critical species are lost. and some of the bottlenecks stem from new technological developments. a reflection of greater human numbers and higher output or consumption per capita. However. I recommend that the NSF give considerable support to studies that combine attention to the economics impacts with attention to any of these three underlying sources of “political failure.

Economists and other scholars have offered useful diagnoses of the market failures that underlie these problems. In particular, they have provided helpful templates of “first-best” solutions – the sorts of policies that would cure the market failures and produce efficient outcomes. However, scholars have been far less successful in devising policy approaches that can overcome political barriers. They have identified the market failures without offering solutions to political failures. This white paper encourages the NSF to devote considerable support to studies that aim to identify policies that attend to both failures – that is, studies that identify policies that are both economically attractive and politically feasible.

Three sources of political failure seem especially important:

• Special interests. Concentrated groups can block efforts that would be beneficial to society as a whole. It can also lead to political paralysis. By definition, with lump-sum side payments any policy that yields aggregate net benefits could be designed to be Pareto improving – no party would be made worse off. But institutional and informational barriers make such payment schemes difficult or impossible. Arriving at practical solutions requires attention to the distribution of policy outcomes. In particular, it seems to require finding ways to achieve distributional outcomes that can “oil the squeaky wheels.” The targeted distributional outcome could be achieved either through specific elements of the central policy instrument (for example, the way fishing licenses are allocated) or through side-payments that accompany the central instrument (for example, the way revenues from auctioned fishing licenses are rebated to fishing enterprises). There’s a strong need for innovative research that identifies how policies can be designed to achieve distributional outcomes consistent with political feasibility. (A stylized numerical illustration of the side-payment logic follows this list.)

• The public goods nature of global environmental problems and the associated problem of free-riding. Nations are sovereign and participation in international environmental efforts is voluntary. Because of the public goods nature of environmental problems (including the three problems mentioned above), any individual nation may have an incentive to free ride rather than make the economic sacrifices associated with participation in an international environmental agreement. There’s a strong need for analyses that indicate how international agreements can be made sufficiently attractive to overcome the free rider problem.

• Changes in the technology for communicating environmental (and other) information to the public. Nowadays any individual has a choice of thousands of news outlets. As a result, he or she can select for whatever radio, television, or internet source reinforces his prior convictions. In addition, specific news networks may find it profitable to present a view that is deliberately slanted or “customized” to appeal to particular political groups. These demand- and supply-side factors together imply a system of news generation that hardens prior convictions rather than educates. There is no simple solution to this problem.
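A stylized numerical illustration of the lump-sum side-payment logic in the first bullet may be helpful (the numbers are hypothetical and added here for exposition; they are not from the white paper). Suppose a policy yields net benefits of +10 to group A and -4 to group B, so aggregate net benefits are +6. Any lump-sum transfer t from A to B satisfying

\[ 4 \le t \le 10 \]

leaves A with 10 - t \ge 0 and B with t - 4 \ge 0, so both groups end up at least as well off as without the policy. The difficulty stressed in the bullet is that the institutional and informational requirements for actually implementing such transfers are rarely met.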

There appears to be a need for better oversight regarding news accuracy and balance. Clearly there is great potential for abuse here: How can “balance” be fairly measured? How can freedom of individual expression be safeguarded? New regulations to address the selection problem could easily violate basic values and individual rights. However, the selection problem seems sufficiently serious to deserve attention. Indeed the problem seems so central to the effective political functioning of democracies that it seems critical to address it, despite the risks.

I would recommend that the NSF give considerable support to studies that combine attention to the economic impacts with attention to any of these three underlying sources of “political failure.” Such studies would need to harness the expertise in a range of social and behavioral sciences, including economists, political scientists, and ethicists.

I hope these comments are useful.

This work is licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-sa/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.


The Economics of Digitization: An Agenda for NSF

By Shane Greenstein, Josh Lerner, and Scott Stern

Motivation

Our starting point is the gap between research and recent changes brought about by digitization. In less than a generation digitization has transformed social interactions, facilitated entirely new industries and undermined others, and reshaped the ability of people – consumers, job seekers, managers, government officials, and citizens – to access and leverage information. The increasing creation, support, use, and consumption of digital representations of information has touched a wide breadth of economic activities. The impact of digitization on the economy and society at large depends on the rules and policies that govern the economic incentives to create, store and use digital information.

The increasing scale of digitization has not generated a similar increase in theoretically grounded empirical research on the economic consequences of digitization. While a dispersed set of researchers addresses some important questions, no research community with a recognizable identity facilitates cumulative research across this community. A related gap also motivates this agenda: data infrastructure for cumulative, transparent, and high-quality research on digitization is poorly developed. Also, no research institution has it as a primary mission to develop novel databases related to digitization that are easy to access for follow-on researchers and policy analysts.

This work is licensed under the Creative Commons Attribution-NoDerivs 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nd/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.

To accurately account for the impact of digitization, key characteristics of digital content must be taken into account. Traditional calculations evaluating the benefits of digitization have, at best, taken these fundamental features into account in only a limited way, and there are key features that have been overlooked entirely. These features include increasing returns, zero marginal cost, and a “long tail” pattern of usage of digitized content. Most notably, the existence of a wealth of “free” digitized information on the Internet (including a significant amount that is created by and for users) has effectively eluded systematic economic measurement.

Too often the policy discussion is dominated by the narrow concerns of individual stakeholders rather than objective policy analysis, and objective policy analysis of key policy issues, such as in the area of copyright, is rare. This is particularly true regarding recent (or proposed) changes in governance and policy. An established community of economic researchers could help to bring high-quality research to the discussion, reshaping policy evaluation of the consequences of digitization.

We have identified several key areas we consider priorities for investigation, particularly from an economics perspective, in markets shaped by digitization.

• Understanding changes in market structure and market conduct: Increasing digitization initiated significant shifts in market structure and significant revisions in longstanding competitive behavior in newspapers, movies, music, and other media markets. What are the relevant economic frameworks for analyzing behavior in markets for information where the costs of distributing, accessing, and sharing digital information are much lower than the costs of creating it? What determines market value when the fixed costs of creating information are large, but the costs of distributing and accessing are low or approach zero?
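One stylized way to see the tension behind these questions (the notation here is illustrative and not from the white paper) is a cost function with a large fixed creation cost and a near-zero marginal distribution cost:

\[ C(q) = F + c\,q, \qquad c \approx 0, \]

so that average cost, F/q + c, falls continuously as the quantity distributed rises (increasing returns). Pricing at marginal cost, p = c \approx 0, then generates essentially no revenue toward covering F, which is one reason standard competitive benchmarks and traditional welfare calculations fit these markets awkwardly.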

• Rethinking the design of copyright: If the digitization of information has dramatically changed the form of expression, its delivery, and its use, then, perhaps, so too the policies for governing its ownership may have changed its role. Copyright law is one of the most important mechanisms for protecting intellectual property in information industries. What are the short and long run economic implications of increasing digitization for the design of copyright and its redesign? What would be the economic effects of various alternative copyright arrangements and proposals for its redesign?

• Redesigning incentives for innovation and creativity: In broad context copyright law is one of several mechanisms for protecting intellectual property and regulating market behavior. What new tradeoffs shape the design of other forms of intellectual property, such as patents, trade secrets, trademarks, and database protection? What types of institutions and organizations enhance the incentives for innovation and creativity while also enhancing its diffusion and impact?

• The Economics of Commons: Experimental forms of copyright, such as the creative commons license, have begun to play a prominent role in online experimentation and everyday online activity. Several related experimental forms of commons licenses have begun to play an important role in scientific discourse. What factors shape the returns society receives from employing commons licenses? What factors shape the effectiveness of different governance structures for commons licenses?

• The economics of privacy: Increasing digitization has altered the fundamental cost of collecting, retaining, and distributing personal information. Commercial actors have the ability to learn an enormous range of details about consumer conduct, both in their online behavior and offline. What economic factors shape privacy in the new economics of digitization?

• Measuring digitization with an eye towards open policy issues: Many of the topics just discussed require measuring one or another aspect of digitization. The measurement questions are central in some cases, supplemental in others. Measuring digitization is, alas, an underdeveloped field. What is the appropriate way to measure the extent of activity in digitization? What framework appropriately assesses the rate of return on investments in digitization by public and private organizations?

• The absence of analysis untied to stakeholders: A range of policy questions require many voices, but there is not yet a recognizable independent academic voice on a range of governance and economic policy issues affected by digitization, so the policy discourse is underdeveloped as a result. That independent academic voice could express skepticism in the face of the slanted or parochial view, and could work alongside practitioners to assemble the relevant data and answer the basic economic questions that occupy public conversation. Where will such a community come from? What needs to happen to hasten the development of such a community?

Why these topics and why an economic approach to them?

It is not difficult to notice that digitization shapes a large part of the economy. While there is no definitive estimate of the size of the impact from increasing digitization, there is general agreement that digitization has affected a broad range of economic activities. Moreover, the changes have been dramatic, and because the change has arisen rapidly, symptoms arise in many domains. These changes have arisen in less than one generation. There has been a tremendous change in the allocation of time within many households. According to the NTIA, for example, over 70 million US households have Internet access. More than 90% of those households have a broadband connection, up from 4% a decade earlier.

Adding across various household and business markets, the Internet access market alone accounts for $45 billion in revenue, according to the latest data from the Census Bureau. Two decades ago this market did not register more than several hundred million dollars in revenue at households (from bulletin board services).

There has also been a tremendous change in the investment of business. According to the Bureau of Economic Analysis (BEA, 2009), investment by US businesses in information and communication technology in 2009 was $522 billion (down from prior years). This number is symptomatic of the large changes ICTs have brought to how firms relate to their employees, customers, and business partners. Increasing digitization has vastly reshaped economic activity in markets related to, and organizations built around, retailing, logistics, reservations, news, music, advertising, and enterprise IT, to name a few areas. For some industry specialists the motivation seems self-evident, and no economic number can summarize the disruption of the last two decades.

What do stakeholders and researchers do in the areas related to this proposal?

It is not in any present stakeholder’s interest to provide the institutional commitment, provide funding, and provide management to coalesce a dispersed economic research community. Nor is it in any stakeholder’s interest to develop institutional support to collect, to archive, and to standardize data related to measuring the digital economy – particularly in a manner not beholden to any stakeholder. Nor is it in any present stakeholder’s interest to develop an institutional home and support for expertise and research founded in data analysis. The existing economic researchers in this area lack a community with a recognizable identity, and researchers lack the institutional commitment to support a community focused on economic analysis of digitization, as well as policy analysis based on economic analysis and the measurement of digitization.

Despite all the recent attention from governments, we do not see much research on the aspects that overlap with this agenda – namely, how broadband – that is, widespread inexpensive access to high speed internetworking – has changed user and vendor behavior in economic activity shaped by digitization, and especially how it shapes market conduct. These topics have attracted some interest in economics, but in no way should this activity be characterized as a unified field, or a large field, and work usually appears only in areas where existing stakeholders have interests in funding research. One of the healthier niches is the economics and marketing literature on the pricing of goods on and off line. So is research about market design in the search engine market – and especially the related market for keyword auctions, which is a subset of the broad literature on multi-sided platforms. There is also a small literature on piracy online (particularly in music) and its consequences for offline sales. The literature on Schumpeterian competition has largely not focused on recent events in markets affected by digitization except in a few select areas. At the same time, and in spite of some work about open source, the economics of online communities governed by Creative Commons licenses is far less developed in economics. We perceive considerable room for more economic analysis of these licenses. There is a substantial group of legal scholars who study issues in intellectual property, copyright and privacy. Researchers in these two communities do not yet address many open issues in copyright, privacy, and related topics in economic measurement. Once again, a small amount of funding could yield high returns.

Various pieces of the economy shaped by digitization do, in fact, get measured. The Census Bureau’s E-Stats program reports the value of goods and services sold online, whether over open networks such as the Internet or otherwise. The National Telecommunication Information Administration also publishes a report about Internet access at homes, using data collected as a supplement to the CPS, which the Bureau of Labor Statistics administers. The Bureau of Economic Analysis recently initiated the publication of an aggregate stock of Information and Communication Technology. There also are initiatives underway to map broadband infrastructure at the Federal Communications Commission and the National Telecommunications Information Administration. Besides government-sponsored surveys, there are a range of quasi-public Internet traffic statistics and surveys: Akamai publishes statistics about web traffic, and Alexa and Google both publish statistics about web traffic for specific sites. So does Andrew Odlyzko at the University of Minnesota. The Pew Internet and American Life Project also releases general results from its many surveys about Internet adoption.

Each of these efforts has its respective strengths and deficiencies. Unfortunately, a great deal of measurement of the digital economy is ad-hoc. For example, Census Service Supplements provide measures of total revenue, oriented to traditional definitions of “industries”, not digitization per se, and, as a result, very little economic research has made use of this data. As another example, the Pew survey tends to change focus frequently – each year the survey at Pew focuses on different (social) aspects of American use of the Internet – so it has little consistency over time. There is no sustained effort to develop a theoretically grounded empirical research tradition on the economic consequences of digitization, or on related policy issues. We also draw attention to the lack of institutional commitment to provide forums for objective policy research

with economic foundations. Finally, we note the (almost) complete absence of any institutional support for data collection and economic measurement of digitization, so there is no home, for example, for the economics and governance of copyright or Creative Commons.

Further reading

Shane Greenstein. 2010. “The Economics of Digitization. An Agenda.” V4.0, August. Accessed at http://www.kellogg.northwestern.edu/faculty/greenstein/images/research.html

Scott Stern and Michael Zhang. 2010. “The Economics of Digitization and Copyright: Theoretical and Measurement Challenges.” Mimeo, MIT Sloan School of Management, July 20.


Jon Gruber
“Grand Challenges” in Economics: What is the Right Amount of Choice?

A fundamental tenet of neoclassical economics is that more choice is good. More choices expand the possibilities set and can only lead to individuals finding outcomes that they prefer. Yet what has been apparent to lay-people for many years has become clear to economists as well in recent years: too much choice can reduce welfare. A wide variety of papers in behavioral economics has shown how increasing the size of choice sets can reduce participation in the market. Other papers have shown consumers choosing clearly dominated options in choice environments, particularly the elderly who may face cognitive challenges in making appropriate choices. For example, in recent work with Jason Abaluck, I have found clear evidence that the substantial majority of elders choosing prescription drug plans under the Medicare Part D plan do not choose the cost minimizing option.

This research suggests that in a variety of contexts we may want to limit choice – but how much? And, if choice is to be limited, should it be limited through simply reducing the number of options, or by restricting the space set in which suppliers can compete to provide a more “organized” choice framework? This issue is highlighted by the recent move of government social insurance policy away from government mandated monopoly options to marketplaces where individuals can choose from a variety of government subsidized options. This approach was pioneered by the Medicare Part D program, and has reached a new level with the insurance exchanges that are included in the recently passed health care reform legislation. A central element of health care reform was the establishment of state-level insurance exchanges where individuals and small firms will be offered a choice of a variety of health insurance options. But the reform provides little guidance as to the proper design of these exchanges – and in particular the number and diversity of options that should be offered through the exchange. Should states operate a “yellow pages” sort of exchange, where any licensed insurer is allowed to offer any product they like? Or should there be a more restricted set of choices with limited differences between plan options?

Addressing this question potentially requires tackling a very difficult question for economists: normative analysis with deviations from the neoclassical model. The welfare analysis in basic economic research is predicated on the premise that individuals are making the “right” choice. Many econometric models of choice, such as the standard logit choice model, by definition have error structures that show an increase in welfare as choices increase. If there are failures in choice, then we need a new welfare framework – and there is no alternative framework which has gathered broad acceptance. This existing literature suffers, however, from the standard problem with empirical work in behavioral economics: it clearly documents a positive anomaly, but leaves us with little normative guidance as to the policy implications.

Alternatively, recent research along the lines of that by Raj Chetty and others on “sufficient statistics” approaches to policy analysis may allow researchers to avoid the thorny problem of specifying an alternative welfare function. If there is a reduced form approach to documenting clear welfare improvements from local changes in budget sets, then a larger welfare structure is superfluous for evaluating these local changes.
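A standard textbook result (stated here for illustration; it is not taken from the white paper) shows why the logit model mentioned above mechanically equates more options with higher welfare. With J alternatives, utilities U_j = V_j + \varepsilon_j and \varepsilon_j i.i.d. type-I extreme value, expected maximum utility is

\[ E\Big[\max_{j=1,\dots,J} U_j\Big] = \ln \sum_{j=1}^{J} e^{V_j} + \gamma, \]

where \gamma is Euler's constant. The log-sum term can only rise when an alternative is added, so measured consumer surplus increases with the size of the choice set by construction, regardless of whether consumers actually choose well.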

The challenge for economists is therefore to move beyond documenting choice anomalies to actually developing guidance for policy makers as they design choice mechanisms such as insurance exchanges. This is a tremendous challenge but a critical one.

Grand Challenges – Making Drill Down Analysis of the Economy a Reality

By John Haltiwanger

The vision

Here is the vision. A social scientist or policy analyst (denoted analyst for short hereafter) is investigating the impact of the “great” recession and anemic recovery (as of September 2010) on businesses and workers. The analyst can ascertain, for example, whether it really is the case that it is small, young businesses – businesses that normally would be creating jobs given their productivity and profitability but that cannot get credit – that account for the anemic recovery as of September 2010.

The analyst begins by exploring the latest aggregate data showing economy-wide, sector-level and broad regional-level variation in terms of business productivity, output, employment, unemployment and population, job postings, capital investment, wages, prices, consumption and wealth. The data are high frequency (monthly or quarterly) and timely (data for the most recent quarter or month). The data are available not only for the present time period but historically for several decades, permitting analysis of both secular trends and cyclical variation. The analyst can conduct empirical studies at the economy-wide, broad sectoral and broad regional level with data broken down by all of these dimensions.

Starting at the economy-wide level, the analyst can drill down into the various key indicators by detailed worker and firm characteristics such as gender, age, education, and immigration status of workers and business size and business age for firms. In addition, the analyst can drill down to the individual and firm level, creating a longitudinal matched employer-employee data set with all of this information at the micro level. The drilled down data aggregates to the national key indicators that receive so much attention. This permits panel data analysis using rich cross sectional and time variation data tracking the outcomes of businesses, workers and households. These outcomes can be tracked at the very detailed location (Census block or tract) and detailed characteristics level.

The data on employment changes can be decomposed into hiring, quits, layoffs, and job creation and destruction. The data on unemployment can be decomposed into gross worker flows tracking flows into and out of unemployment. The business data permits identifying firm startups and also permits tracking firm exits. The sources of financing for businesses by type of financing and by type of business (e.g., by business size, age, industry and geographic location) are available (e.g., amount of financing from debt, debt type, debt terms such as the interest rate, amount of financing from equity and equity type such as angel financing, venture capital financing, hedge fund financing, private equity financing and public equity financing). The firm characteristics include measures of intangible capital assets such as investments in R&D and innovation as well as tracking foreign trade and outsourcing activity. The data on workers is linked to measures from household data tracking income, consumer finances and household composition.

The analyst could track what type of financing has especially decreased relative to other economic recoveries.

The analyst could analyze the impact of policy interventions historically and how they have or have not had influence on different types of businesses and in turn on the workers employed by these businesses.

Beyond analysis of the recession and recovery, a wide range of socio-economic issues can be investigated. The drill down data infrastructure enables tracking the education experiences and outcomes of individuals so the factors impacting human capital investment can be studied. The impact of immigrant flows on native and immigrant labor market outcomes can be tracked and studied. The role of business startups can be tracked in terms of their contribution to productivity, innovation, and job growth. The origins of business startups can be tracked given the longitudinal matched employer-employee data – e.g., what is the career path of entrepreneurs and what are the factors that impact career path as well as success and failure of business startups? The drill down data infrastructure also permits rich studies of the outcomes of households and workers. In addition, the drill down data infrastructure also is integrated with a variety of other measures of experiences – the housing experiences of children and adults, the health experiences of children and adults, and the criminal activity of, as well as the victimization by crime of, children and adults. With these added dimensions, in principle, the drill down infrastructure would permit a range of analyses of the factors driving economic growth and other important economic outcomes for households and businesses.

The Reality as of 2010

The above vision is not as far from reality in 2010 as one might first surmise, but achieving the above vision faces many different challenges. The good news is that progress has already been made on many of these challenges. The bad news is that many very difficult challenges remain that may take decades to overcome.

First, the good news. The U.S. federal statistical agencies, state agencies, and private sector data developers have been working on various components of this vision for the last decade or more. The rapid increases in computer speed and disk storage have meant that the ability to track every business, household and worker in the U.S. on many of these outcomes is already a reality. The U.S. Census Bureau has developed longitudinal business databases tracking every establishment and firm in the U.S. that can be integrated into the rich surveys and censuses of economic businesses conducted by Census. The Bureau of Labor Statistics has developed a similar longitudinal business database tracking every establishment in the U.S. that can be integrated into the surveys on businesses conducted by BLS.1 The Census Bureau has also developed a longitudinal matched employer-employee database that tracks the employment relationships between all employers and employees in the U.S.

1 Of course, one might immediately ask why this duplication? We turn to this below.

This data can in turn be linked to the business data at Census as well as the household databases such as the Current Population Survey and the American Community Survey. A variety of administrative databases on housing, education and health outcomes can be integrated into these data. State agencies are likewise developing longitudinal databases tracking education experiences and outcomes and in turn to labor market outcomes for their workers. The Federal Reserve has rich data on balance sheets of the financial sector and is working on developing longitudinal data on financial sector firms and activity. Other major efforts developing the longitudinal micro data on households and businesses include the efforts tracking health and other outcomes for older Americans in the Health and Retirement Survey by the University of Michigan as well as efforts by the private sector (such as Dunn and Bradstreet) in tracking U.S. businesses.

The other part of the good news is that it is not only the hardware of computer processing that has advanced rapidly but so has the software. The software and methodology are increasingly available for massive data integration – permitting matching of records at a variety of levels of aggregation using all available information. The software and methodology are increasingly available to address the inherent problems of protecting the confidentiality of such drill down data.

However, in spite of enormous progress, the bad news is that the U.S. statistical system (broadly defined to track economic, housing, education, health, and other outcomes) remains incredibly balkanized. Legal issues still block the major U.S. federal agencies from sharing their data (hence the duplication between BLS and Census – which not only leads to unnecessary duplication but also to significant limitations in the quality of key economic indicators like U.S. productivity and GDP). Legal issues block the federal and state agencies from working collaboratively. Even within statistical agencies, there is a “silo” approach towards data collection – collection and processing methods are designed to produce the specific micro and aggregate data of interest without regard to data integration. We turn to discussing such challenges in the next section.

The Challenges

The challenges are many. At a broad level, the challenge is that attaining this vision requires integration of a vast array of administrative and survey data from a variety of sources with different objectives and legal requirements for using and protecting the confidentiality of the data. Attaining this vision requires a change in the way data are collected and processed. The statistical system needs to be smart in terms of using scarce resources to avoid duplication but also to collect the many different components in a fashion so they can be integrated. Beyond the legal issues, many technical challenges for the cyber infrastructure needed for a drill down data infrastructure remain. There are many, many details – a non-exhaustive list is as follows:

1. Overcoming the legal challenges is a significant obstacle in its own right. The problem is much worse across federal and state agencies. There needs to be a mandate that administrative data from all of the types of sources discussed above can be used for this type of statistical analysis. Of course, this vision can likely only be achieved with the intensive use of administrative data that are collected for other purposes (i.e., the administration of some other program including the collection of taxes and the monitoring of public programs).

2. Further advances in this drill down data infrastructure will require further advances in both hardware and software. Vast amounts of data will need to be stored and processed in a timely fashion. The efforts discussed above to integrate administrative and survey data have been severely hampered by the disparate nature of the data. The development of standards and the agreed upon use of common identifiers would greatly facilitate data integration. Of course, the use of common identifiers raises many privacy concerns which we discuss below.

3. There are many issues in dealing with privacy concerns which I will not deal with adequately here. Privacy advocates rightfully express concerns that the above vision creates “big brother”. One critical issue is insuring a secure environment for developing and accessing this type of data infrastructure. One working model so far is to create secure enclaves (the Census-NSF research data centers). Secure enclaves have shown that they can provide access to many valuable projects from the research community without endangering the privacy and confidentiality of respondents. This has already happened to some degree with agencies such as NCHS now housing their data at the Census RDCs. While this represents great progress relative to 20 years ago, this likely is not the long run solution. Still, even if this is not the long run solution (more on this below), if over the next decade or so the Census-NSF data centers can become Federal-State data center enclaves where a host of federal and state data can be housed, this would represent major progress.

4. The long run solution is to create a cyber-infrastructure environment that creates the data infrastructure, is securely protected from hostile attacks as well as inadvertent disclosure, is accessible to the user community from many locations (ideally the desktop of the user), and generates statistically valid inferences from the above data infrastructure without the user ever being able to observe enough details to identify any business or person. It requires a 21st century approach to disclosure protection – not simply cell suppression or top coding of some sort but rather the use of statistical methodology that will protect confidentiality in a flexible manner. This requires real-time disclosure protection on an interactive basis. The use of synthetic methods to create and analyze micro data has made great progress in the last decade – but the statistical and research community has much work to do before these methods are ready for widespread use.

5. In addition, data integration methods need further development and refinement.

Currently, the data matching programs are capable of developing data matches using a variety of criteria with measures of the quality of matches. But we often need more than this – we need an ability to deal with differences in units of observation and frequency. The smart systems need to be able to take data using a variety of different units of observation (e.g., individuals vs. households, establishments vs. firms, larger geographic areas and smaller geographic areas, detailed industry and broad sectoral classifications) and integrate them. The smart systems need to use the insights and continuing developments from statisticians on missing data imputations, with statistical software packages that can adjust standard errors appropriately based upon sampling variation as well as associated imputation. Smart systems are also needed for the collection of data. Many administrative data sources are in principle available on almost a real time basis (often with filings in the last week, month or quarter) but only become available to the statistical system with a considerable lag. Smart systems that can integrate weekly, monthly, quarterly and annual data readily in terms of both disaggregation (e.g., create the best possible estimates of weekly data using available data) and aggregation (e.g., creating the best possible annual data) are also needed. (A minimal illustrative sketch of such linkage and aggregation appears at the end of this list.)

6. The private sector is already in many ways further along in attaining this vision for profit-making purposes using data mining techniques. This competition with the public sector in the provision of statistics is not necessarily a bad thing. However, the ability of the scientific community to develop the methodology and standards for data integration and associated analysis is limited for most private sector developments. Of related concern, many in the social science community are already beginning to conduct studies using these private sector data sources. Again, this is not a bad thing per se, but often the representativeness and statistical properties are not known and the access is not regulated in a manner that permits replication – a critical feature in the advancement of science. A very bad outcome would be that in a few decades the private sector developments have superseded the public sector developments so much that social scientists are essentially forced to use private sector data without adequate quality and methodological standards. The private sector data mining developments will not go away and nor should they go away. Finding a way to create public-private partnerships and to help set standards for methodology, access and privacy protection that have some commonality in the public and private sector developments would be valuable.

7. Achieving this data vision requires champions. It requires champions in the executive and legislative branches of government who are committed both to developing this type of data infrastructure and to protecting the privacy and confidentiality of respondents. It is hard to find champions from these branches of government since “data integration” is not a hot button topic. The ironic thing of course is that these branches of government continually ask questions that have critical consequences for the future of the U.S. economy and involve billions or even trillions of dollars, and the only way to answer the questions would be to have this type of data infrastructure available.
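As a minimal sketch of the kind of record linkage and roll-up such smart systems would automate (illustrative only; the data, identifiers, and column names below are hypothetical and not part of the white paper), the following Python fragment links two toy micro files on a shared firm identifier and aggregates the matched monthly records to industry-by-quarter totals:

# Illustrative sketch (not from the white paper): linking two micro files on a
# shared identifier and rolling monthly micro records up to quarterly industry
# aggregates, in the spirit of the "smart systems" described above.
# All identifiers and column names (firm_id, industry, employment) are hypothetical.

import pandas as pd

# Toy employer records from two hypothetical source agencies.
agency_a = pd.DataFrame({
    "firm_id": [101, 102, 103],
    "industry": ["retail", "manufacturing", "retail"],
})
agency_b = pd.DataFrame({
    "firm_id": [101, 103, 104],
    "employment": [12, 55, 8],
    "month": pd.to_datetime(["2010-01-01", "2010-01-01", "2010-02-01"]),
})

# 1) Integrate: merge on the common identifier; the indicator column records
#    a crude measure of match quality (matched in both files vs. one only).
linked = agency_a.merge(agency_b, on="firm_id", how="outer", indicator="match_status")

# 2) Aggregate: roll matched micro records up to industry-by-quarter totals,
#    the kind of aggregation to key indicators the vision describes.
matched = linked.dropna(subset=["industry", "employment", "month"]).copy()
matched["quarter"] = matched["month"].dt.to_period("Q")
industry_quarter = (
    matched.groupby(["industry", "quarter"], as_index=False)["employment"].sum()
)

print(linked[["firm_id", "match_status"]])
print(industry_quarter)

A production system would add probabilistic matching on multiple criteria, imputation with appropriately adjusted standard errors, and disclosure protection before any aggregate is released, which are the pieces the white paper identifies as needing further development.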

Future Directions for Research on Immigration

September 2010

Gordon Hanson, UC San Diego and NBER

Abstract. How does the international migration of talent affect the creation of knowledge, the organization of work, and the rate of economic growth across nations? In recent decades, much of the intellectual firepower in research on immigration has been aimed at estimating the impact of the inflow of low skilled foreign labor on the economic well being of native-born workers in the United States and other high income countries. For the United States, at least, it is not at all clear that low skilled immigration matters very much for national welfare. In coming decades, it is how the world allocates skilled labor that will help determine which countries advance economically and which do not. Future research on immigration should focus on the empirical analysis of how the inflow of skilled foreign labor affects productivity growth and innovation in receiving countries and how the outflow of talent affects prospects for growth and development in sending countries. The literature is yet to produce compelling empirical evidence on the costs and benefits of skilled migration for either origin or destination countries. Currently, policy makers are setting immigration policy governing skilled labor flows largely in the dark. Sound empirical analysis requires exploiting natural experiments or conducting experiments in the field. Recent events suggest prospects are favorable on both fronts.

Challenge Question

How does the international migration of talent affect the creation of knowledge, the organization of work, and the rate of economic growth across nations?

1. Background

A primary lesson of research on economic growth over the last several decades is that the pace of economic progress in a country is strongly associated with its access to highly skilled labor. Countries produce skilled labor through educating and training their own workers or by importing talent from other nations. American universities, as well as educational institutions in other advanced countries, have long sought to attract the best and the brightest worldwide. Virtually all major US corporations in technology fields search for talent internationally, no longer viewing labor markets as defined by national borders. Students see studying abroad as a way to get their foot in the door of foreign labor markets. The United States, the dominant country in higher education for the last half of the 20th century, is seeing its market lead erode as other countries, including China, rapidly improve their educational institutions, introducing more competition in the global search for skilled labor (Freeman, 2010).

Governments, for their part, often appear ambivalent about skilled immigration, being relatively generous in providing visas to foreign students but stingy in granting work visas to these students upon graduation or to other prospective skilled immigrants. In the United States, foreign born students account for over 40 percent of PhDs awarded in science and engineering.

One would think that given the importance of skilled labor for economic growth, research on immigration would have made skilled labor flows a central focus of study. Alas, this is not the case. To be sure, there has in fact been an outpouring of research on immigration, motivated in part by the increase in labor inflows in high income countries. Over the last three decades, the share of foreign born individuals in the population increased from 5% in 1970 to 13% in 2008. While the literature has covered a wide range of topics – including immigrant assimilation, the causes of illegal immigration, the education of foreign students, and the political economy of immigration policies – the bulk of intellectual firepower has been aimed at estimating the impact of low skilled immigration on the economic well being of native-born workers in the United States and other high income countries (Hanson, 2009). Because immigration changes the national supply of labor, economists are predisposed to consider the impact of such supply shifts on prices. Since 1980, earnings inequality in the United States has increased sharply, leading many economists to ask whether the arrival of large numbers of low skilled foreign workers could be driving changes in the US wage structure. The academic debate, then, has been primarily about the distributional consequences of immigration. Reputable economists can be found on both sides of the issue, with some claiming that immigration has significant negative effects on wages and others claiming that no such effects exist. Despite the immense volume of work, we are far from a consensus on how much immigration matters for the low end US labor market. To be sure, the consequence of low skilled immigration for wages is an important issue. Whether or not immigration hurts low skilled workers, however, most economists agree that the aggregate effects – which involve summing gains to employers and losses to workers – are small. The impact of immigration on growth has been lost in the mix.

2. Immigration and economic growth

If immigration is going to transform an economy, it must be through its effect on innovation and total factor productivity. Once we raise the issue of productivity, the focus immediately shifts from low skilled to high skilled immigration. Research on economic growth identifies the relative supply of high skilled workers – and in particular those in science and engineering – as a key factor underlying a country’s R&D capacity and thereby its growth rate (Jones, 1995). While doubling the supply of illegal immigrants (currently 5% of US workers) in the US labor force would likely have at most second order effects on economic growth, doubling the supply of high skilled immigrants could have a first order effect.
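To make the mechanism referenced above concrete, a stylized knowledge-production relation of the kind used in R&D-based growth models in the spirit of Jones (1995) can be written (the notation here is illustrative, not from the white paper) as

\[ \dot{A}_t = \delta\, L_{A,t}^{\lambda}\, A_t^{\phi}, \qquad 0 < \lambda \le 1,\ \ \phi < 1, \]

where A_t is the stock of ideas and L_{A,t} the number of researchers (scientists and engineers). Along a balanced growth path the growth rate of ideas is tied to the growth of the research workforce, g_A = \lambda n / (1 - \phi), which is one way to see why shifts in the supply of high skilled labor can have first order effects on productivity growth while shifts in low skilled labor largely do not.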

The development economics literature has devoted considerable attention to brain drain from developing countries. Of course, if the US absorbs more high skilled labor from the rest of the world, countries losing these workers will be affected. If an engineer leaves, say, Pakistan to work in the United States, the supply of engineers will change by a larger percentage amount in Pakistan than in the US, owing to the fact that in Pakistan skilled labor is relatively scarce. That scarcity, however, isn’t sufficient to yield high wages for Pakistani engineers. The paucity of physical capital, the use of outdated technology, and the persistence of weak legal and political institutions hold down the productivity of high skilled labor in Pakistan, keeping wages low. One possibility is that the exodus of skilled labor hurts Pakistan by directly reducing the supply of human capital. Another possibility is that the prospect of migrating to the United States is sufficiently attractive that capable students in Pakistan obtain more education than they would have absent the opportunity to emigrate, giving Pakistan more human capital on net with emigration than without it (yielding a brain gain). Whether high skilled emigration raises or lowers the stock of human capital in developing countries is therefore an empirical question, which the literature has failed to answer. The literature has produced intriguing evidence in international cross sectional data in support of the brain gain hypothesis, but we have not yet seen convincing time series or panel data which shows that increasing prospects for emigration causes students in a country to increase their schooling by enough to offset the exodus of talent, arriving at a conclusion that any two-handed economist would love. The literature is yet to produce compelling empirical evidence on the costs and benefits of skilled migration for either origin or destination countries. Currently, policy makers are setting immigration policy governing skilled labor flows largely in the dark.

3. The future of immigration research

We are left then with two fundamental and interrelated questions about international labor flows: if we move one skilled worker from a low income country to the United States, by how much does US productivity growth change and by how much does the low income country’s stock of human capital change? In coming decades, how the world allocates skilled labor will in part determine which countries advance economically and which do not. How should we go about attempting to understand the impact of international migration on growth? Arguably, theory is well ahead of empirical analysis. We have well developed bodies of theoretical work on how migration affects growth rates internationally. What we lack is empirical analysis that identifies the sign and magnitudes of these effects and the theoretical mechanisms that account for them. Simply correlating the supply of immigrants or emigrants with productivity growth or other outcomes is not informative, as the migration of labor is not random.

Labor moves across borders in response to economic incentives, leaving countries with poor growth and moving to ones with better prospects. Rigorous empirical analysis requires sound experimental design, either by exploiting natural experiments in the data or by conducting experiments in the field. Regarding natural experiments, environmental shocks (e.g., earthquakes in Haiti, tsunamis in Indonesia, floods in Pakistan) and geopolitical events (e.g., the events of 9/11) disrupt either the outflow of labor from sending countries or the inflow of labor in receiving countries, providing opportunities for causal identification of migration’s impacts on productivity growth in one set of countries or the other (the receiving countries in the first case, the sending countries in the second). One or two recent papers exploit such an approach. Field experiments would require getting governments to agree to randomize how they allocate visas across individuals, companies, countries, and/or time. Such randomization may not be as farfetched as it sounds. Currently, the United States already allocates about five percent of its permanent residence visas through an annual lottery. And when applications for temporary work visas for high skilled labor (H-1B visas) exceed the allocated quota in a given year, the entire stock of visas is allocated through a lottery among applicants (as occurred in 2007 and 2008). Simply giving researchers access to data on these randomization episodes would advance migration research immensely. However, given the desire for knowledge among government officials regarding how immigration affects growth, one would expect at least some high income countries to be willing to subject their immigration policies to rigorous analysis involving at least some degree of randomization.

4. What disciplines would be involved?

Economists come first to mind, as scholars in the discipline have spent a great deal of time thinking about the determinants of economic growth and the consequences of international migration. Further, the set of questions involved extends well beyond the economics discipline and even beyond the social sciences. For sociologists, the questions include how combining native and foreign workers within a business affects the organization of firms, how international labor markets are organized, and the manner in which migration affects the international transmission of ideas. For political scientists, there are the questions of what determines political support for high skilled immigration, why countries are more open to international trade than to international labor flows (which has received some attention in the recent literature), and how the exodus of skilled labor affects decisions governing economic policy in origin countries. And for engineers, there are questions about how combining native-born and foreign workers, either in one country or in multiple countries, affects the optimal organization of production processes both within and across firms, and how corporations manage innovation across borders.
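Returning to the visa lotteries discussed above, their attraction is that they deliver a transparent estimand. In stylized notation (illustrative, not from the paper), if W_i \in \{0,1\} records whether applicant i wins a randomly allocated visa and Y_i is an outcome of interest (say, the applicant's subsequent patenting, or schooling among household members left behind), then randomization implies that

\[ \tau = E[\,Y_i \mid W_i = 1\,] - E[\,Y_i \mid W_i = 0\,] \]

identifies the causal effect of visa receipt among applicants and can be estimated by a simple comparison of means, which is exactly the kind of design that access to lottery records would make possible.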

5. Closing thoughts

The political debate on immigration in the United States, as in other advanced countries, remains mired in tired invective about the adverse consequences of admitting individuals from poor countries. While the popular discussion may occasionally make for entertaining political theatre, it is a poor guide for rigorous analysis of international migration. The literature has misallocated time and energy on the low skilled end of the spectrum, where the aggregate welfare consequences for the United States are likely to be small. It is time that empirical research shifted towards the consequences of skilled labor flows for economic growth.

References

Freeman, Richard. 2010. “What Does Global Expansion of Higher Education Mean for the US?” NBER Working Paper No. 14962.

Hanson, Gordon. 2009. “The Economic Consequences of International Migration.” Annual Review of Economics, 1: 179-208.

Jones, Charles. 1995. “R&D Based Models of Economic Growth.” Journal of Political Economy, 103(4): 759-784.


But. But more than that. there is now substantial reason to believe that many of our models and perspectives have been seriously distorted in the process. human capital is frequently taken as synonymous with school attainment. and they represent clever and powerful adaptations to available data. Hanushek and Woessmann (2008) 163 . The current state of research on schools and human capital does not. At the same time. however. 1. • First. Hanushek Stanford University Recent research points to a need for expanding the research agenda related to the production and the impact of human capital. much of the discussion both in research and in its public incarnations has been reduced to very simplistic shells of the underlying ideas. The central element of this is expanding analysis to identify and to incorporate different dimensions of skills – including new study into underlying measurement issues. requiring no discussion or explanation. This simplification to spending is perhaps even more prevalent in theoretical work.Developing a Skills-based Agenda for “New Human Capital” Research Eric A. for example. Stixrud. the concept of human capital has become thoroughly integrated into theoretical and empirical studies in economics and other social sciences – so much so that policy makers routinely pick up on it and infuse discussions into a wide variety of policies with this terminology. reflecting the belief in the central role of schooling for the future of society and our economy. Newly available data and newly minted researchers make this a propitious research investment. and Urzua (2006). recent research has highlighted the importance of both cognitive and noncognitive skills for individual earnings and careers. reflect either its importance or the possibilities that currently exist for a much deeper and useful research program. the existing analyses show clearly how expanded measures of human capital . How would an expanded and refined “new human capital” concept improve our understanding of economic and social outcomes? For the last half century.1 trillion annual spending on formal schooling ($660 billion for K-12) represents 7 ½ percent of GDP. The $1. A number of factors point to the productivity of a new initiative that would take the research and analysis into new lines of research related to the skills of individuals. investment in human capital is measured simply by spending on schools or other training activities. These narrowed perspectives have resulted largely from efforts to develop testable hypotheses. Understanding the role of schools in society and the economy needs little justification. In other cases. In empirical work. it represents the largest component of state and local budgets. ones that indicate more reliably the 1 Heckman. 1 While measurement issues remain.

• • • The “new human capital” agenda would pursue a rich view of the measurement and range of skills that are important. Specifically. labor market outcomes. and well-trained scholars who can take a new research agenda in human capital forward. And. and the like indicates a dramatic expansion in the detailed information that is relevant for consideration of human capital issues. This work also shows various paths to refining our knowledge.variations in skills across individuals. It would consider both the variety of factors that influences the success of investments in these skills and how these skills affect lifetime outcomes. military service. It would capitalize on the growth in well-trained new researchers and on the new data that are becoming available. The recent. This improvement in explanation holds for individual income and employment determination. say in the context of income determination. and warranted. vastly improve our ability to understand the underlying economic outcomes and processes. • Second. Fifth. and for aggregate productivity and economic growth. 164 . the considerable increase in study of education issues. cannot generally yield unbiased estimates of the impacts of schools. public support programs. Such an initiative would serve to develop the intellectual roots of much of the current policy discussion. Third. because these factors are correlated. the potential availability of extensive administrative data on schools. Fourth. energetic. 2. once the focus turns to the analysis of more refined skills. for consideration of the distribution of incomes. skepticism about the interpretation of many past statistical analyses indicate that much of what we “know” should be revisited. particularly among economics PhD students. Central to both. simple analyses of schooling. incarceration. is an overarching third question of basic measurement that will fit into both fundamental lines of research. The foundation elements for a new and exciting initiative around the “new Human capital” now exist. recent advances in understanding the analysis of causal effects provide relevant approaches to refining our understanding of the role of human capital in determining outcomes and in how skills are produced. work on the determinants of achievement and cognitive skills (often labeled educational production functions) suggest that skills come from a range of inputs including families and neighborhoods in addition to schools. however. indicates a vast group of bright. There are two complementary lines of research that are important – one looks at how skills are produced while the other looks at the effects of skills on individual and societal outcomes. other research becomes relevant and suggests a modified direction for much analysis. and it would contribute quite directly to governmental policy making at all levels.
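To fix ideas about the two complementary lines of research just described, a stylized pair of equations (the notation is added here for illustration and is not taken from the white paper) might be

\[ A_{it} = f(F_i, N_i, S_{it}) + \epsilon_{it}, \qquad \ln w_i = \beta_0 + \beta_1\, \text{Schooling}_i + \beta_2\, \text{Cog}_i + \beta_3\, \text{Noncog}_i + u_i, \]

where achievement A_{it} depends on family (F), neighborhood (N), and school (S) inputs (the educational production function), and earnings depend on school attainment together with separate measures of cognitive and noncognitive skills. The agenda sketched here amounts to measuring the skill terms well and estimating both relationships credibly.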

The measurement issues reflect the fact that most empirical research into either cognitive or noncognitive skills has been opportunistic. It has relied on the measures currently available – such as measures from math or science tests used in school accountability or survey indicators of personality factors in the noncognitive domain. It has not involved much comparison of alternative measures. Nor has it followed a purposeful program of development based on the external validity of any measures. With the intensive efforts currently under way to develop measures of achievement at the K-12 level and the nascent efforts for higher education, a program that focused on the external validity of any measures could be extraordinarily productive. It could capitalize on these other development efforts – ones that emphasize internal validity and common standards developed almost entirely within existing schools – while adding a useful dimension to thinking in those developments.

One element of this is much larger and more accurate data on human capital development and outcomes than previously available. The recent explosion of work on the determinants of achievement – i.e., educational production functions – has followed the wider availability of state administrative databases, particularly in Florida, North Carolina, New York, and Texas. Each existing database has some common elements of following students and their achievement over time but then also has a variety of special elements and advantages. The movement toward developing longitudinal educational records across the nation and making them available to researchers has been proceeding rapidly; the U.S. Department of Education has already given $500 million in grants to states to develop their capacity. It is important that researchers become involved early in these state developments, because they will still be malleable for a number of years. A concerted effort to develop the capacity for interstate comparisons and analyses could provide considerable new evidence about the operations and effectiveness of schools.

The fact that school histories of specific schooling experiences and measured outcomes are now routinely included in administrative databases at both K-12 and higher education levels suggests a variety of opportunities. These school histories are beginning to cover the full length of schooling experiences for individuals, allowing researchers to follow individuals as they progress through school, including moves across schools and programs. More than that, many states now make it possible to link to experiences out of school such as unemployment insurance records, Medicaid usage, military records, juvenile justice involvement, and more. These greatly expanded data dovetail nicely with the recent appreciation for the potential biases from incomplete identification of statistical models. But, because of a variety of exogenous factors that impinge on individual schooling and career paths, it becomes much more feasible to identify causal influences on individual outcomes.
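To make the identification point above a bit more concrete, the educational production function work referred to here is often summarized with a schematic specification of the following form. The notation is illustrative only and is not taken from any particular study cited in this paper:

\[
A_{it} = f\big(S_{it},\, F_{it},\, N_{it},\, \alpha_i\big) + \varepsilon_{it},
\]

where \(A_{it}\) is measured achievement (or a noncognitive skill) of student \(i\) at grade \(t\), \(S_{it}\) denotes cumulative school inputs, \(F_{it}\) family inputs, \(N_{it}\) neighborhood and peer inputs, and \(\alpha_i\) unobserved ability. Because the inputs are correlated with one another and with \(\alpha_i\), simple regressions of outcomes on schooling alone cannot generally recover the causal impact of schools; the linked longitudinal records described above matter precisely because they support value-added and quasi-experimental designs that exploit exogenous variation in individual schooling and career paths.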

An additional element of developing analyses of the new human capital is the international dimension. There has been an effort to assess student achievement across countries, particularly in math and science, since the mid-1960s, but this effort has now picked up in terms of the number of participating countries, the frequency of the testing cycle, and the quality of the assessments. The two major assessments – PISA and TIMSS – now cover all OECD countries and a large number of developing countries. These assessments routinely include survey information of students and their schools. A natural extension of any U.S.-based research program would be to incorporate the international analyses and comparisons made possible by these. This cross-country analysis, for example, offers the possibility of investigating issues such as national institutions that cannot be assessed within a single country or of how labor markets in different countries demand and reward skills. It also fits naturally in terms of understanding the importance of STEM education for economic growth and development. Much of this international research is currently being conducted by researchers outside of the U.S., and the flow of such work has now increased significantly.2

The availability of new, much richer longitudinal databases has attracted large numbers of new PhDs into the study of schooling and other aspects of skill production. The recent evolution of understanding the impacts of different skills has been led largely by economists. The work on educational production functions and the analysis of specific policies, such as the impacts of accountability or of charter schools, has been carried out by economists and by a broader social science community including sociologists and political scientists, although the largest expansion of new PhDs in the area has been in economics.

There are of course a series of challenges in this area. One that is apparent now is the need to maintain confidentiality of individual data. Currently much of the administrative data from schools is covered by federal law (the Family Education Rights and Privacy Act, or FERPA). It will be increasingly important to develop research procedures and protocols that satisfy FERPA (and the underlying ideals). The concerns about confidentiality of data become particularly acute when one considers merging the administrative records with other survey and programmatic information. These issues will require separate attention.

Summary

The paradigm of individuals investing in skills that falls under the heading of human capital has proved to be very useful across theoretical and empirical research endeavors. But it has also developed in a constrained way, driven by data availability and by a longstanding set of research questions. Recent analyses have suggested that expanding research to investigate both the production and the impacts of a range of individual skills would yield large dividends. As a result, the area is poised for dramatic expansion that could in part provide extraordinarily valuable research that supports a range of crucial policy issues facing the U.S.

2 The majority of such international studies has been produced within the last decade. See Hanushek and Woessmann (2010).

References

Hanushek, Eric A., and Ludger Woessmann. 2008. "The role of cognitive skills in economic development." Journal of Economic Literature 46, no. 3 (September): 607-668.

Hanushek, Eric A., and Ludger Woessmann. 2010. "The economics of international differences in educational achievement." In Handbook of the Economics of Education, Vol. 3, edited by Eric A. Hanushek, Stephen Machin, and Ludger Woessmann. Amsterdam: North Holland.

Heckman, James J., Jora Stixrud, and Sergio Urzua. 2006. "The Effects of Cognitive and Noncognitive Abilities on Labor Market Outcomes and Social Behavior." Journal of Labor Economics 24, no. 3: 411-482.


Making the Case for Contract Theory

Oliver Hart

Abstract: Economics has changed a great deal in the last thirty years and there is every reason to think that the changes in the next twenty to thirty years will be at least as great. Although theory may not be as prominent as it once was, it remains essential for understanding the (increasingly) complex world we live in. One cannot analyze the bewildering amount of data now available, or make sensible policy recommendations, without the organizing framework that theory provides. Contract theory is a good example of an area where great progress has been made in the last thirty years, and yet where much remains to be done. In this short essay I will discuss some of the major themes of contract theory and also issues that are still not well understood.

Economics has changed a great deal in the last thirty years and there is every reason to think that the changes in the next twenty to thirty years will be at least as great. In the 1970s and 80s theory was dominant. In the first part of the twenty-first century this is no longer the case: there has been a huge shift towards empirical work. Also new fields have become established that were in their infancy in 1980: behavioral economics is the most obvious example. At the same time, although much has changed, some things stay the same. Although theory may not be as prominent as it once was, it remains essential for understanding the (increasingly) complex world we live in. One cannot analyze the bewildering amount of data now available without the organizing framework of theory. Moreover, I would also suggest that one cannot understand the extraordinary events that we have recently witnessed, such as the financial crisis, or make sensible policy recommendations in response to these events, without the organizing framework that theory provides. Although exciting developments in other fields of economics understandably attract attention, basic research in theory remains vital. Contract theory is a good example of an area where great progress has been made over the last thirty years, and yet where much remains to be done. There is much that we still do not understand.

There is, of course, a sense in which contracts have always been basic in economics. Any trade—as a quid pro quo—must be mediated by some form of contract, whether explicit or implicit. However, much of traditional economics is concerned with spot trades, where the two sides of the transaction occur simultaneously and where the contractual element is relatively trivial. In recent years economists have become much more interested in long-term relationships where a considerable amount of time elapses between the quid and the quo. In these circumstances a contract becomes an essential part of the trading relationship.

The basic philosophy behind contract theory is the idea that parties can design their relationship to be efficient and that a contract is the means to do this. In this respect there is significant overlap with the mechanism design literature. However, there are also important differences. In mechanism design theory it is usually assumed that there is an impartial planner who oversees the system, and may indeed design it. In contract theory the mechanism is designed by the parties themselves and the only (possibly) impartial player is a judge who adjudicates disputes. Each literature has learned from the other, but they have developed independently.

The techniques of contract theory have permeated many areas of economics, including labor economics, industrial organization, corporate finance, macroeconomics, international trade, public finance, and development economics. Contract theory also draws on and contributes to ideas in law and economics.

A classic topic of contract theory is the design of incentive schemes. and others. she may be discouraged from collaborating with other teachers. In recent years. e. As an example of how this more formal approach can be useful. pays himself too much (or in the wrong sort of way). what’s different about transactions inside and between firms. An implication of the theory is that assets will be owned by those whose investments are important. under symmetric information.g. an employer. Or suppose that the principals are parents and the agent is the teacher of their children. The problem may be that it is hard to measure the true outcome of teaching. To the extent that one can identify a firm with the assets it owns this yields a theory of firm boundaries. to act in her interest. The starting point of this recent literature—known as the property rights approach—is the idea that if parties can anticipate all future eventualities and include these in a contract then the boundaries of the firm are irrelevant: it is only if contracts are incomplete that boundaries matter. that is.economics. contract theorists have developed formal models to elucidate these issues. because more information makes it easier to write good contracts. There is no shortage of proposals for improving matters. Suppose that the principals are the shareholders of a public company and the CEO is the agent. The more recent literature has emphasized different issues. Paying a teacher according to test performance may encourage the teacher to focus on the wrong things: rote learning rather than more creative material. e. But is such a trend desirable? Or might it interfere with the reason that the employees are under the umbrella of a single firm in the first place? The question of what constitutes a firm. is one that contract theorists have studied intensively. teachers. and what determines the boundaries of firms.g. and the employee’s risk aversion as the main reason why making compensation very sensitive to outcome—high‐powered incentives—might not be a perfect solution. Williamson. The property rights approach takes the view that the owner of an asset has residual control rights. The problem may not be that the CEO does not want to work hard: rather it may be that the CEO is an empire‐builder. The compensation of CEOs. The early literature emphasized the employee’s desire to shirk as the main incentive problem. the division of surplus will depend on the assets they own. if a teacher is rewarded narrowly according to the test scores of children directly under her control. or is overconfident about his ability to run things. However. can motivate an agent. they will reach an ex post efficient outcome. is highly topical. In the simplest property rights model parties can renegotiate an incomplete contract once an unforeseen contingency has occurred and. advances in information technology will favor 170 . ways. the right to make decisions not covered by the contract. A formal contract can tie the agent’s compensation to the outcome of the agent’s actions. Principal‐agent theory studies how a principal. and. was insightful but largely informal. Also educating a child is a team process. by Coase. consider the question of how improvements in information technology will affect firm boundaries.. and possibly high‐powered. In practice contracts are incomplete and a key question is who has residual rights of control. Performance on tests can be assessed but this may be a very imperfect measure of what children should be learning. 
In this short essay I will discuss some of the major themes of contract theory and also issues that are still not well understood. Contract theory is enormously useful in clarifying the trade‐offs and helping us to avoid the adoption of policies that may actually be counter‐productive. an employee. The early transaction cost literature on this topic. This division of surplus will in turn influence the incentives of parties to invest. and others. It is often argued that. takes excessive risks. Advances in technology make it possible to measure performance more finely and in the future it will become feasible to pay people in increasingly subtle.

S. The property rights approach has been applied extensively in the recent international trade literature on the structure of multinational companies. studied in the standard corporate finance literature. Given this. most explanations are based on the idea that key institutions had excessive debt. that much of this debt was short‐term. The property rights approach provides a more nuanced perspective. Antras (2003) uses the approach to explain why U. It seems likely that in the future collaborations between contract theorists and experimentalists—both in the lab and in the field—will yield important new insights.independent contracting: carrying out transactions outside the firm. including the idea that contracts are reference points for entitlements. There is also a widely held view that banks and other financial institutions are different: they are more 171 . In contrast. Economists are still grappling with the causes of the recent financial crisis. This does not square with an observation of Coase that inside firms the price mechanism is superseded. Consider an entrepreneur who has an idea for a firm or project but does not have the funds to finance it. Although there is not yet consensus. the financial contracting literature considers all possible contracts or securities and tries to explain why debt or equity may be optimal among these. Psychological and behavioral elements can broaden the scope of contract theory in many interesting ways. The difference is that this literature tends to take the form of the securities a firm issues as given: equity or debt. Recent theoretical and experimental work has argued that explicit contracts can interfere with feelings of fairness and trust and as a consequence extrinsic motivation can crowd out intrinsic motivation. But should the borrowing be short‐term or longterm? How much collateral does the entrepreneur need to provide? Might it be better for the entrepreneur to issue equity rather than debt? Or might some sort of hybrid security be preferable to both? Many of these questions are. Recent work has argued that it is possible to explain Coase’s observation if one is willing to step outside the standard framework and introduce some psychological considerations. companies are less likely to own foreign suppliers if the goods they import are labor intensive (in which case the human capital investment of the foreign firm is likely to be important) than if they are capital intensive (in which case the physical capital investment of the U. Indeed this is an implication of transaction cost economics. and help contract theorists to refine the assumptions they make. The entrepreneur might borrow from an investor. A reduction in contracting costs also makes it easier to carry out transactions inside a firm and so firms may become bigger rather than smaller. informal and incomplete contracts may outperform formal and complete contracts even when the latter are feasible. Another significant application of contract theory has been to understand firms’ financing decisions. Contracts may also be written by one party to take advantage of the cognitive limitations of another party. One limitation of the property rights approach is that the standard model does not explain why transactions inside firms have a different character from those between firms: the theory supposes that parties will use monetary sidepayments to bargain to an ex post efficient outcome whether the parties are in the same firm or in different firms. 
and that the failure of one institution triggered the failure of others. firm is likely to be important). Support for this possibility has been found in empirical work on the trucking industry by Baker and Hubbard (2004). This has yielded new insights.S. of course. and why parties may deliberately write incomplete contracts. Many other papers have extended this work. This provides new insights into why high‐powered incentives may be costly. All this work is informed by experiments.

and hence their failure is more serious. But why? What does a bank do that makes it different from other firms? Did institutions write suboptimal contracts with their investors (or for that matter with their customers, e.g., home-owners), or were these contracts individually optimal but collectively suboptimal? How should large financial institutions be regulated to prevent the next financial crisis? Economists do not have fully convincing answers to these questions. Understanding the financial crisis requires putting contract theory into a general equilibrium perspective; researchers have made a notable start in this direction, but much remains to be done. The tools of modern contract theory seem indispensable if we are to make progress on these vital questions. The next twenty years promise to be both challenging and exciting.

References

Antras, Pol (2003), "Firms, Contracts, and Trade Structure", Quarterly Journal of Economics, November, 1375-1418.

Baker, George and Thomas N. Hubbard (2004), "Contractibility and Asset Ownership: On-Board Computers and Governance in US Trucking", Quarterly Journal of Economics, November, 1443-1479.

Kiyotaki, Nobuhiro and John Moore (1997), "Credit Cycles", Journal of Political Economy, April, 211-248.

A Research Agenda For Understanding the Dynamics of Skill Formation

James J. Heckman∗

October 4, 2010

I American Society is Becoming Polarized and Less Productive

In the past 30 years, American society has polarized. A greater percentage of children is attending and graduating college. At the same time, a greater percentage is dropping out of secondary school, producing a growing underclass, neither working nor going to school. 20% of the U.S. work force has such a low rate of literacy that it cannot understand the instructions on a vial of pills. The slowdown in the growth of the skills of the workforce is reducing U.S. productivity.

Analysts blame the public schools, rising tuition costs, or the failure of a number of other social institutions. These problems are usually discussed in a piecemeal fashion. This has produced an array of competing proposals that lack coherence or a firm grounding in science and social science. This position paper summarizes a body of research that articulates a coherent approach to addressing these problems that is rooted in the economics, psychology, and biology of human development.1

∗ James Heckman is the Henry Schultz Distinguished Service Professor of Economics at the University of Chicago and a senior fellow of the American Bar Foundation.
1 This paper is based on Heckman [2008] and the references therein.

II A Coherent Approach to Skill Policy

The current state of the literature can be summarized by eighteen points.

1. Many major economic and social problems such as crime, teenage pregnancy, high school dropout rates, and adverse health conditions can be traced to low levels of skill and ability in society.

2. In analyzing ability, society needs to recognize its multiple facets.

3. Current public policy discussions focus on promoting and measuring cognitive ability through IQ and achievement tests. For example, in the U.S., the accountability standards in the No Child Left Behind Act concentrate attention on achievement test scores, not evaluating a range of other factors that promote success in school and life.

4. Cognitive abilities are important determinants of socioeconomic success.

5. So are socioemotional abilities ("soft skills"), physical and mental health, perseverance, attention, motivation, and self confidence. They contribute to performance in society at large and even help determine scores on the very tests that are used to monitor cognitive achievement.

6. Ability gaps between the advantaged and disadvantaged open up early in the lives of children.

7. Family environments of young children are major predictors of cognitive and socioemotional abilities, as well as crime and obesity.

8. More than genetics is at work.

9. The evidence that documents a powerful role of early family influence on adult outcomes is a source of concern because family environments in the U.S. and many other countries around the world have deteriorated over the past 40 years.

10. Experimental evidence on the effectiveness of early interventions in disadvantaged families is consistent with a large body of non-experimental evidence that adverse family environments, especially adverse parenting, substantially impair child outcomes.

11. If society intervenes early enough, it can raise the cognitive and socioemotional abilities and the health of disadvantaged children.

12. Early interventions reduce inequality by promoting schooling, reducing crime, and reducing teenage pregnancy. They also foster workforce productivity. These interventions have high benefit-cost ratios and rates of return.

13. Early interventions have much higher economic returns than later interventions such as reduced pupil-teacher ratios, public job training, convict rehabilitation programs, adult literacy programs, tuition subsidies or expenditure on police.

14. The longer society waits to intervene in the life cycle of a disadvantaged child, the more costly it is to remediate disadvantage.

15. Life cycle skill formation is dynamic in nature. Skill begets skill; motivation begets motivation. If a child is not motivated and stimulated to learn and engage early on in life, it is more likely that when the child becomes an adult, she/he will fail in social and economic life.

16. Similar dynamics appear to be at work in creating child health and mental health.

17. A fruitful direction for future research is to improve the core evidence on the dynamics of skill formation.

18. A major refocus of policy is required to understand the life cycle of skill and health formation and the importance of the early years in creating inequality and opportunity and in producing skills for the workforce.
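The dynamics summarized in these points ("skill begets skill," the higher return to early investment) are commonly formalized with a technology of skill formation along the following lines. This is a schematic statement of a standard formulation in this literature, not an equation reproduced from the paper:

\[
\theta_{t+1} = f_t\big(\theta_t,\, I_t,\, h\big),
\]

where \(\theta_t\) is the vector of cognitive and noncognitive skills at stage \(t\) of childhood, \(I_t\) is investment at that stage, and \(h\) denotes the parenting environment. Self-productivity, \(\partial f_t/\partial \theta_t > 0\), captures the idea that skills acquired early raise later skills; dynamic complementarity, \(\partial^{2} f_t/\partial \theta_t\,\partial I_t > 0\), captures the idea that early skills raise the productivity of later investment. Together they imply that later investment is more productive when an early base has been created and much more costly to substitute for when it has not, which is the economic logic behind the points on remediation and the timing of intervention above.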

delay of gratification. and health in a variety of life outcomes. company job training. college attendance. the ability to work with others. self-esteem. Those with higher cognitive and noncognitive abilities are more likely to take schooling. personality factors and the ability to work with others—what are sometimes called “soft skills. In the U. Yet they have been measured and have been shown to be predictive of success. as is intuitively obvious and commonsensical. Recent research documents the predictive power of motivation. labor force participation. employment. attention. By noncognitive abilities I mean socioemotional regulation. compliance with health protocols and participation in crime strongly depend on cognitive abilities and noncognitive skills. participation in risky activities. and to participate in civic life. Cognitive and noncognitive ability are important determinants of schooling and socioeconomic success.S. delay of gratification. Cognitive and noncognitive skills are equally predictive of success in many aspects of life. 4 176 . Yet much evidence shows that. and many countries around the world.” Much public policy discussion focuses on cognitive test scores or “smarts.” The No Child Left Behind initiative in the US focuses on achievement on a test administered at certain grades to measure the success or failure of schools. schooling gaps across ethnic and income groups have more to do with ability deficits than family finances in the school-going years.III The Importance of Cognitive and Noncognitive Skills Recent research has shown that earnings. self-control. sociability. They are less likely to be obese and have greater physical and mental health. much more than smarts is required for success in a number of domains of life. teenage pregnancy. The importance of noncognitive skills tends to be underrated in current policy discussions because they are thought to be hard to measure.

Those born into disadvantaged environments are receiving relatively less stimulation and child development resources than those from advantaged families. Schooling after the second grade plays only a minor role in alleviating these gaps. in the U. and also in interaction with genes. VI Family Environments The evidence that family environments matter greatly in producing abilities is a source of concern because a greater fraction of American children is being born into disadvantaged families. Schooling quality and school resources have relatively small effects on ability deficits and only marginally account for any divergence by age in test scores across children from different socioeconomic groups. Deficits in college going between minority and majority groups are not caused by high tuition costs or family income at the age children are deciding to go to college. American family life is under challenge. The real source of child 5 177 .IV Ability Gaps Are the Major Reason for the Schooling Achievement Gap Controlling for ability measured at the school-going age. V Ability Gaps Open Up at Early Ages Gaps in the abilities that play such an important role in determining diverse adult labor market and health outcomes open up at early ages across socioeconomic groups. The evidence on the early emergence of gaps leaves open the question of which aspects of families are responsible for producing ability gaps. Is it due to genes? Family environments? Family investment decisions? The evidence from intervention studies suggests an important role for investments and family environments in determining adult capacities above and beyond genes. Measured by the quality of its parenting. This trend is occurring in many countries around the world. minorities are more likely to attend college than others despite their lower family incomes. A divide is opening up in early family environments.S.

Different types of abilities appear to be manipulable at different ages. Those born into disadvantaged environments are receiving relatively less stimulation and child development resources than those from advantaged families. at the same time. the less effective it is. suggesting a sensitive period for their formation below age 10. The available evidence suggests that for many skills and human capacities. the later remediation is given to a disadvantaged child. More educated women are working more. but. IQ scores become stable by age 10 or so. Less educated women are also working more but are not increasing their child investments. VIII Key Policy Issues From the point of view of social policy. VII Critical and Sensitive Periods There is a large body of evidence on sensitive and critical periods in human development. This creates persistence of inequality across generations through the mechanism of differentials in parenting. On average. the key questions are how easy is it to remediate the effect of early disadvantage? How costly is it to delay addressing the problems raised by early disadvantage? How critical is investment in the early years and for what traits? What is the optimal timing for intervention to improve abilities? 6 178 . and the gap is growing over time. later intervention for disadvantage may be possible.disadvantage is the quality of parenting. A lot of evidence suggests that the returns to adolescent education for the most disadvantaged and less able are lower than the returns for the more advantaged. are spending more time in child development. but that it is much more costly than early remediation to achieve a given level of adult performance.

Sensitive periods come earlier in life for cognitive traits. The powerful role of noncognitive traits and the capacity of interventions to improve these traits is currently neglected in public policy discussions. X Summary Many current social problems have their roots in deficits in abilities. This pattern is associated with slower development of the prefrontal cortex. and social behaviors. long after the interventions end. Evidence from a variety of studies shows that there are critical and sensitive periods for development. The age pattern is less pronounced for noncognitive traits. Longitudinal studies of the experimental groups demonstrate substantial positive effects of early environmental enrichment on a range of cognitive and “non-cognitive” skills. Children from advantaged environments by and large receive substantial early investment. Children from disadvantaged environments typically do not. They produce inequality and reduce productivity. job performance. Reliable data come from experiments that provide substantial enrichment of the early environments of children living in low-income families.IX Enriched Early Environments Can Compensate In Part For Risk Features of Disadvantaged Environments Experiments that enrich the early environments of disadvantaged children show that the effects of early environments on adolescent and adult outcomes are causal. A portfolio of childhood investment weighted toward the early years is optimal. schooling achievement. Society currently ignores this pattern in its investment in disadvantaged children. Later investment is more productive if early investment is made. The econometric evidence is consistent with the evidence from neuroscience. Noncognitive traits stimulate production of cognitive traits and are major contributors to human performance. devoting more resources to adolescent remediation than childhood prevention. Improvements in family environments enhance children’s adult outcomes and operate primarily through improvements in noncognitive skills. Later life investment is less productive if an adequate base has not been created in early life. Ability deficits open up early in life and persist. 7 179 .

The appropriate measure of disadvantage is the quality of parenting, not income per se. Quality of schools and tuition do not matter as much as is often thought. Interventions should be directed toward the malleable early years if society is to successfully reduce inequality and promote productivity in American society. Late remediation is very costly. Making these arguments more precise and rooting them more firmly in data on biology and behavior will lay the groundwork for addressing the core problem of rising inequality in a rigorous and meaningful way.

Reference

Heckman, J. J. (2008, July). Schools, skills, and synapses. Economic Inquiry 46(3), 289–324.

SOME COMPELLING BROAD-GAUGED RESEARCH AGENDAS IN ECONOMICS

Glenn Hubbard, Columbia University

Since the financial crisis, many political leaders (and indeed social scientists in universities) have called for putting economics in its place and redirecting support to other disciplines. The concerns are that economists are too axiomatic, too doctrinaire, and too unwilling to learn from other disciplines. But this is actually the time to increase support for broad-gauged economic research substantially in my view. And there are scholars and agendas that could benefit from and quickly deliver results from this support. To wit:

INTERDISCIPLINARY WORK ON ECONOMIC QUESTIONS
• Can we use failures in the financial crisis to discriminate between models of poor incentives and models of overconfidence?
• Can we use economic and behavioral insights to design and evaluate products in "consumer finance"? (We are doing this at Columbia in research in our Center for Decision Sciences and in a new MBA course entitled Consumer Finance.) Much research is done on topics in corporate finance, but consumer finance is at least as important conceptually and empirically.
• Can we discriminate between models of entrepreneurial risk-taking and innovation from economics and models based on overconfidence and/or tolerance for ambiguity from psychology?

THE FRONTIER OF RESEARCH ON ECONOMIC GROWTH
• How can we enrich our knowledge about the impact of management practices and firm-level productivity growth?
• Can we explain differences in recoveries from severe financial crises across countries and time periods?
• What more can we learn from historical episodes of major innovations about determinants of major changes and incremental innovations?
• How can we model and estimate effects of major fiscal reforms (e.g., entitlement reform in the United States) – as spending changes and tax changes – on economic growth?

ECONOMIC ANALYSIS OF MAJOR POLICY QUESTIONS
• How can we enrich our understanding of health policy choices on insurance and care arrangements (many insights still date to the old RAND study)?
• How can we encourage more systematic modeling and estimation of fiscal policy multipliers (outside of the heat of battle of individual policy debates)?

of course. 182 . there are many more topics one could cover.n How important are rising health care costs in explaining wage income stagnation for many Americans in the past decade? n How effective are large scale asset purchases by the Federal Reserve in altering the term structure or risk structure of interest rates? n What kinds of financial contracts can best address risk-sharing for job loss or retirement or disability? Dan. Large-scale NSF support for new data or to support teams and research colloquia could have a very high payoff. I would be happy to discuss any of this with you. but I think this short list makes the point. There are big areas in which progress can be made and in which scholars are ready.

Challenges in Econometrics

Guido W. Imbens, Harvard University, Sept. 2010

1. Introduction

To frame what is in my view of the main challenges facing researchers in econometrics, let me set the stage by describing the current state of research. Much of the traditional research in econometrics can be divided into two branches, the first comprising cross-section and panel data econometrics and the second time series analysis. In the cross-section branch of econometrics researchers have data on a large number of units – individuals, firms, groups of individuals, or markets. For each unit there is information on a relatively small number of variables, sometimes measured at a single point in time, sometimes with repeated measures as in panel data. The units are viewed as exchangeable, or independent in the sense that there is no interaction between the units: what happens to one unit does not affect other units. In time series analysis the typical setting is one with observations on a small number of variables, at many points in time, with relatively unrestricted dependencies between the different variables.

For models designed for data configurations of these two types we have learned much in the last few decades. In fully parametric models, as well as in the more flexible semi and non parametric models, we have gained an impressive understanding of the appropriate ways of analyzing such data, and the properties of many estimators and methods for inference. In my view the biggest challenges faced by economists in terms of analyzing economic data concern fundamentally different configurations of the data, with complex, largely unknown, dependence patterns and a relatively large number of variables per unit. In such cases the current methods to do approximate inference based on large sample results, which are specifically designed to exploit laws of large numbers and central limit theorems, are likely to be inadequate. Moreover, trying to fit these more

but for the most these are unexplored areas for research. typically binary: individuals either influence each other in a constant way.. In economics it may be of more interest to understand how the spatial correlations generate effects of policies implemented in one location on outcomes in another nearby location. presence of natural resources at one location given measurements on measures of resources or proxy variables at nearby locations. relative to the number of questions. These dependencies are likely to weaken as the distances between units increase. 2. One branch of econometrics that has studied such questions is spatial econometrics. We may have information on units located in physical or economic spaces that exhibit strong. Such effects may operate with unknown lags. or not at 2 184 . For example. with often strong prior beliefs about the appropriate distance measures. dependencies in economic behavior. necessitating the combination of time series methods and spatial analysis.complex data configurations into the old methods would be unlikely to lead to much progress. but the appropriate notion of distance is likely to be partly unknown. Related to spatial statistics but with a different set of challenges. e. the dependencies between economic behavior may arise from what is sometimes called peer effects. in the statistics literature the focus was often on predicting outcomes in particular locations given outcomes in nearby locations. or social interactions.g. but complex and partly unknown. but this is still a relatively undeveloped part of the econometrics profession. Data Configurations More and more data are becoming available to researchers that do not fit the standard mold. Much of the work in spatial econometrics relies heavily on methods imported from the statistics literature where the focus was on different questions. Correlations may be stronger in some parts of the population than in others. In some cases econometricians and statisticians have made some progress on such alternative data configurations. There may be little prior knowledge about the relative importance of different distance measures. Here distances between units are often modeled as discrete.

through. tracking in educational settings. If successful. or from feedback in behavior. Controlling for shared background is also a difficult challenge. Jackson. Sacerdote. and empirical work has demonstrated the presence of correlations in behavior associated with such networks. 2008).all. Some individuals may be connected to many others through self-chosen friendship links. Ultimately a key question is whether these social interactions can be exploited by policy makers to improve the distribution of outcomes in society. While there have been numerous empirical studies documenting correlations in outcomes for individuals in the same class. Important is the fact that the peer groups in Manski’s analysis partition the population. the mechanism would be through the induced interactions associated with the squadron assignments. For example. An interesting paper in this respect is Carrol. but our understanding of the statistics and econometrics of these models is still in its infancy. Observed correlations may simply affect choices of individuals to team up with similarly minded individuals. Behavior of units in different peer groups is not correlated. In an important paper Manski (1993) studied identification questions in a special case where a population was divided into peer groups. both in the short and in the long run. Economic theorists have analyzed such network settings in considerable depth (e.. and the effect of two different peers on the same individual may be different. and West (2010) who attempt to improve average test scores by optimally assigning incoming recruits at the Air Force Academy to different squadrons. rather than effects on peers’ behavior. there is still a great deal of uncertainty whether these arise from teacher effects or interactions between students. Many questions arise when the groups within which the dependencies are present are partly the result of choices made by individuals.g. The analyses get even more complicated when the peer groups do not simply partition the population. 3 185 . for example. Within groups correlations may arise from correlated backgrounds. from a shared environment. Often individuals within a peer group are viewed as exchangeable: all individuals influence each other to the same degree.
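For reference, the setup studied in Manski (1993) and discussed above is usually written as a linear-in-means model; in rough notation,

\[
y_{ig} = \alpha + \beta\,\mathbb{E}[\,y \mid g\,] + \gamma\,\mathbb{E}[\,x \mid g\,] + \delta\, x_{ig} + \varepsilon_{ig},
\]

where \(y_{ig}\) is the outcome of individual \(i\) in peer group \(g\): \(\beta\) captures endogenous effects (responses to peers' behavior), \(\gamma\) captures contextual effects (responses to peers' characteristics), and within-group correlation in \(\varepsilon_{ig}\) captures correlated effects (shared environments and self-selection). Because the group mean outcome is itself a function of the group mean characteristics in equilibrium, the three channels cannot be separately identified from group averages alone, which is the reflection problem referred to in the text.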

sometimes more even variables than units. There are a number of specific challenges in analyzing such data sets. With possible dependence in behavior for many individuals in such networks. as well as detailed information per unit. and there is little knowledge about the sensitivity of empirical results to violations of these assumptions. in the biostatistics literature.the literature almost exclusively deals with exogenously formed networks. we may have detailed genetic information on a small number of individuals. Theorists have focused on the difficulty of defining useful equilibrium concepts in the context of network formation. the basis for conventional large sample results is unclear for even for simple statistics such as sample averages. we may have for a moderate number of individuals extremely detailed information about their behavior. with links between individuals either present or absent rather than of varying intensity. They arise from common features of such data. or all purchases made during visits to a supermarket. A general question in this area concerns the presence of data sets with a many variables relative to the number of units. Especially with some of these data sets drawn from internet communities. They often contain information on a large number of units. When taking account of the changing environment the dynamics of the equilibrium may lead to even more problems. Using such data to infer patterns in behavior that can inform policy questions is fundamentally different from that of inferring parameters of parsimonious models in large samples. or. all social interactions experienced. and with little attention to the dynamics of and feedback in the network formation processes. None of these are plausible assumptions. Simply following the 4 186 . For example. one may have information on a very large number of individuals. Questions of interest for economists include the effect of encouraging interactions by facilitating opportunities to form links. and the effects of interventions in some individuals on outcomes for those connected to them. including all web sites visited. followed over a period of time during which they were subject to many stimuli from outside and during which many interactions with other individuals took place.

standard approach of approximating the distribution of estimators by joint normal distributions is unlikely to be a generally satisfactory approach in such settings with many parameters. A specific example of this is the study of regression models with more potential explanatory variables than individuals. Some methods have been developed for the covariate selection problem in the statistics literature (e.g., Lasso and related methods), but these methods have not found many applications yet in economics.

There are also huge computational challenges in this literature. Most of the sophisticated modeling has been done in the context of very small data sets. Even in such settings the number of possible links and networks can quickly be very large. In practice even the number of units in the networks can be very large, leading to even greater computational problems.

Research related to these questions has been conducted in multiple disciplines and is a fertile area for interdisciplinary research. Sociologists have a long tradition of studying communities and social interactions, and have contributed many substantive questions to this area, as well as some statistical methodology, although little specifically for network data (see the Holland and Leinhardt (1981) paper and the subsequent literature). They have also collected interesting data sets. Computer scientists have focused on properties of networks emerging from various network formation processes; specifically they have looked at models that generate few large connected networks rather than many disconnected groups. Statisticians have developed methodology for spatial data. None of these disciplines have focused much on the type of questions economists tend to be interested in, but many have made progress on related issues.
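As a concrete illustration of the covariate-selection methods mentioned above (Lasso and related approaches), the following is a minimal sketch on simulated data. It assumes the NumPy and scikit-learn libraries, and the variable names and data-generating process are invented for the example; it is meant only to show the mechanics of fitting a sparse regression when there are more candidate regressors than observations.

```python
import numpy as np
from sklearn.linear_model import LassoCV

# Simulated data: n units, p > n candidate explanatory variables,
# of which only the first five actually matter.
rng = np.random.default_rng(0)
n, p = 100, 500
X = rng.standard_normal((n, p))
true_beta = np.zeros(p)
true_beta[:5] = [3.0, -2.0, 1.5, 1.0, -1.0]
y = X @ true_beta + rng.standard_normal(n)

# Lasso with the penalty chosen by cross-validation.
model = LassoCV(cv=5).fit(X, y)

selected = np.flatnonzero(model.coef_)
print("penalty chosen by cross-validation:", model.alpha_)
print("number of regressors selected:", selected.size)
print("first selected indices:", selected[:10])
```

In economic applications the selected regressors would often feed into a further causal or structural analysis rather than being interpreted directly, which is part of why, as noted above, these methods have so far found relatively few applications in economics.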

References

Carrell, S., B. Sacerdote, and J. West (2010), "Beware of Economists Bearing Reduced Forms? An Experiment in How Not To Improve Student Outcomes," Unpublished Working Paper.

Holland, P., and S. Leinhardt (1981), "An Exponential Family of Probability Distributions for Directed Graphs," Journal of the American Statistical Association, 76(373): 33-50.

Jackson, M. (2008), Social and Economic Networks, Princeton University Press.

Manski, C. (1993), "Identification of Endogenous Social Effects: The Reflection Problem," Review of Economic Studies, 60, 531-542.

Research Opportunities in the Study of Social and Economic Networks

Matthew O. Jackson, Stanford University

September 21, 2010

White paper prepared for the NSF/SBE

Abstract: Social network patterns of interaction influence many behaviors including consumption, voting, education, criminal activity, employment, investment decisions and market activity, risk sharing, and even participation in micro-finance. Networks of relationships among various firms and political organizations affect research and development, trade patterns, and political alliances. Beyond the role of social networks in determining various economic behaviors, the study of social and economic networks can also benefit from an economic perspective. Tools from decision theory and game theory can offer new insight into how behavior is influenced by network structure, and can also be used to analyze network formation, and they are beginning to shed new light on the impact of social interactions ranging from favor exchange to corruption and economic development. In addition, network analysis provides new opportunities and challenges for econometrics and for laboratory and field experiments.

Examples of the effects of social networks on economic activity are abundant and pervasive. Our beliefs, decisions and behaviors are influenced by the people with whom we interact, as social interaction plays a key role in the transmission of information about jobs, new products, technologies, and political opinions. Networks also serve as channels for informal insurance and risk sharing, and influence decisions regarding education, career choice, hobbies, investment, and even participation in micro-finance. Networks of relationships among firms and political organizations also impact research and development, patent activity, international trade patterns, and political alliances. For example, the fact that information about jobs is largely disseminated through social networks has significant implications for patterns of wages, employment, unemployment, and education. The study of how network structure influences (and is influenced by) economic activity is becoming increasingly important because it is clear that many classical models that abstract away from patterns of interaction leave certain phenomena unexplained. Beyond the many economic settings where social structure is critical, there are also many business and political interactions that are networked.
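As a small, stylized illustration of the kind of network statistics this research agenda builds on, the following sketch compares average social distance and clustering in a ring lattice and in a "small-world" network of the same size and average degree. It assumes the Python networkx library, and the parameter values are arbitrary; the paper itself does not prescribe any particular software.

```python
import networkx as nx

# A ring lattice and a small-world network with the same size and degree:
# rewiring a small share of links sharply reduces average path length
# while leaving clustering high.
n, k, p = 1000, 10, 0.1
ring = nx.connected_watts_strogatz_graph(n, k, 0.0, seed=1)
small_world = nx.connected_watts_strogatz_graph(n, k, p, seed=1)

for name, g in [("ring lattice", ring), ("small world", small_world)]:
    print(name,
          "| average path length:", round(nx.average_shortest_path_length(g), 2),
          "| clustering:", round(nx.average_clustering(g), 3))
```

The collapse in average distance once a few links are rewired is one way to see why social distances can be short even in very large societies, a point the essay returns to in discussing the diffusion of information.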

anthropology. as well as new tools for analyzing social interactions. sociology. It is an exciting area not only because of the explosion of ``social networking'' that has emerged with the internet and other advances in communication. and has implications for inefficient investment. it was noted early on in both the sociology and economics literatures that substantial amounts of information about job opportunities often comes from friends and acquaintances. and resulting inefficiencies. statistical physics and computer science. there was limited study of the wage and employment implications of that fact. political science. It was only in the last decade that it has been shown that incorporating network-based models of job information provides significant new insights into patterns of unemployment. and a perspective of social structure being symbiotic with social behavior. The other disciplines have much to contribute because they bring new perspectives on applications. and providing new tools for analyzing social interactions and their relation to human behavior. education. As an example. prices. Nonetheless. The study of social networks has a rich history in sociology. As mentioned above. especially in economics. and persistent racial wage gaps. a substantial portion of the current explosion in the study of social networks comes from expansions outside of sociology. theories of social structure. Let us briefly outline in turn these two important dimensions of economic studies of networks: supplying new perspectives on the role of networks in many applications. time series of wages.Research Opportunities in the Study of Social and Economic Networks/Jackson Given their importance. stylized views of markets as anonymous systems miss details that are critical in understanding many empirical patterns of trade. applied mathematics. Social network analysis has already taught us a great deal and it holds tremendous potential for future application. For instance. Other important questions of how patterns of interaction 190 . These are complementary aspects of the study of networks. but also because of the fundamental role that many varieties of social networks play in shaping human activity. and suggest abundant and pressing areas for research. the study of social and economic networks is expanding rapidly and naturally cuts across many disciplines including economics. The study of how network structure relates to economic activity is becoming increasingly important because many classical economic models that abstract away from patterns of interaction are unable to provide insight into certain phenomena. the role of social networks in disseminating job information affects wages and unemployment patterns. Despite this observation. The sociology literature includes the seminal references on studies of opinion leaders. strength of ties. homophily (the tendency of similar individuals to associate with each other). and many other things. with a variety of detailed case studies.

It also provides new insights into why the average social distance between people is so small even in very large societies and what this implies for the spread of information. That is. the endogeneity of social structure leads to a pervasive problem in analyzing behavior as a function of social structure. as well as opportunities in experimental economics. how they share risk and exchange favors. new econometric and statistical techniques are needed (and starting to emerge) to analyze network data and improve our understanding of peer effects in many areas. and which goods are traded? How do the patterns of liabilities among financial intermediaries relate to the potential for financial contagion? How are education and other human capital decisions influenced by social network structure? More generally. when and how are consumption and voting patterns influenced by friendships and acquaintances and what does this imply for efficiency in decision making? How do people learn and communicate by word of mouth? Will the networks of interactions that emerge in a society be the efficient ones in terms of their implications for economic growth and development? As an encouraging example. the recent awakening of network research in development economics has provided exciting new insights into a diversity of important questions such as how people choose production technologies in agriculture. Do friends behave similarly because of their influence on each other. For example.. or are they friends because of their 191 . and game theory. A second important area of the study of networks from an economic perspective derives from the fact that economic tools and reasoning are very useful in analyzing both network formation and network influence. and how they learn about new programs and opportunities. economic reasoning provides important new insights regarding how people self-organize and why certain patterns will emerge. and these tools are quite complementary to those from other disciplines.g. the internet) change interaction patterns.Research Opportunities in the Study of Social and Economic Networks/Jackson affect economic outcomes include: How does price dispersion in markets depend on network structure? How do new market technologies (e. the efficiency of markets. explicit modeling of individual choices can help us to understand homophily: why people tend to associate with other people who are similar to them along a number of dimensions. These sorts of reasoning can be used to predict behavior along the lines discussed above: which choices people make and how choices depend on friends’ choices. Particularly effective economic tools come from decision theory. It also provides new implications for resulting behavior. behavioral economics. even beyond the implications of network structure for economic activity and welfare. including how students’ study habits and human capital investment decisions depend on their peers’ and how the choice of their friendships relate to these choices. Empirical analyses of social interactions provide many issues for research in applied econometrics. In particular. both in the lab and in the field. and such reasonings are also very useful in analyzing network formation. In addition to the modeling tools that economics can provide.

similar behavior, or even because of some latent trait that correlates with their behavior? Given this, and other related issues that lead to hurdles in determining causality, it is important to base empirical analyses of social and economic networks either on careful structural models that account for endogeneity, new statistical tools, or to be able to take advantage of laboratory, field, or natural experiments to control for that endogeneity. One very positive aspect in this regard is that recently emerging research in networks exhibits natural and healthy interactions between theory, empirics, econometrics, and various field and laboratory experiments.

In summary, the study of social and economic networks provides many exciting opportunities. This derives from the facts that (i) there are many instances where the network patterns of interactions are fundamental to understanding emergent economic behaviors, (ii) economic reasoning can lead to new insights regarding social interaction patterns, and (iii) the endogeneity of social interaction presents challenging hurdles in interpreting data that requires the use of structural models. Thus, there are many important and pressing areas for the study of social and economic networks.

Three References to Relevant Readings:

David Easley and Jon Kleinberg (2010) Networks, Crowds, and Markets: Reasoning about a Highly Connected World. Cambridge University Press: Cambridge UK.

Matthew O. Jackson (2008) Social and Economic Networks. Princeton University Press: Princeton, NJ.

Stanley Wasserman and Katherine Faust (1994) Social Network Analysis: Methods and Applications. Cambridge University Press: Cambridge UK.

This work has a Creative Commons Attribution Non-Commercial Share Alike license: http://creativecommons.org/about/licenses/

HARVARD UNIVERSITY, DEPARTMENT OF ECONOMICS
Dale W. Jorgenson, Samuel W. Morris University Professor
122 Littauer Center, Cambridge, MA 02138-3001
PHONE: (617) 495-4661  FAX: (617) 495-4660
EMAIL: djorgenson@harvard.edu  WEB: http://post.economics.harvard.edu/faculty/jorgenson

A NEW ARCHITECTURE FOR THE U.S. NATIONAL ACCOUNTS

by Dale W. Jorgenson
September 20, 2010

Introduction. The purpose of this Grand Challenge is to accelerate the development of new economic data for the resolution of policy issues involving long-term growth. Significant examples include public and private provision for retirement income and the outlook for health care expenditures and public programs to cover health care costs. The public programs for retirement income and health care are critical components of the long-term development of the federal budget. Other important examples include broadening the concept of investment to include investment in human capital through health care and education and investment in intangibles, such as research and development.

The first question to be addressed is, why do we need a new architecture for the U.S. national accounts? In this context "architecture" refers to the conceptual framework for the national accounts, including production, income and expenditure, capital formation, and wealth accounts. The purpose of such a framework is to provide a strategy for developing the national accounts. An example of such a framework is the new seven-account system employed by the Bureau of Economic Analysis (BEA).1 A second example is the United Nations' System of National Accounts 2008.2 Both provide elements of a complete accounting system.

The U.S. national accounts were originally constructed to deal with issues arising from the Great Depression of the 1930's, focusing on the current state of the economy. Recovery from the economic crisis of 2007-2009 has shifted the policy focus from economic stabilization to enhancing the U.S. economy's growth potential. In addition, the economy is confronted with new challenges arising from rapid changes in technology and globalization. Meeting these challenges will require a new architecture for the U.S. national accounts.

The key elements of the new architecture are outlined in a "Blueprint for Expanded and Integrated U.S. Accounts," by Jorgenson and Landefeld.3 They present a prototype system that integrates the national income and product accounts with productivity statistics generated by BLS and balance sheets produced by the Federal Reserve Board. The system features GDP, as does the National Income and Product Accounts; however, GDP and domestic income are generated along with productivity estimates in an internally consistent way. The prototype system of accounts developed by Jorgenson and Landefeld incorporates the cost of capital and the flow of capital services for all productive assets employed in the U.S. economy. The balance sheet covers the U.S. economy as a whole and fills a gap in the existing Flow of Funds Accounts.

The production account for the prototype system of accounts is based on the gross domestic product (GDP) and gross domestic income (GDI) in current and constant prices. This production account has been disaggregated to the level of individual industries, as in Jorgenson, Ho, and Samuels (2010).4 The underlying source data on employment, hours worked, and labor compensation include public use data for individuals from the decennial Censuses of Population and the monthly Current Population Surveys generated by the Bureau of the Census. The parallel flow of labor services is broken down by age, sex, education, and class of employment. Hours worked for each category of labor services are weighted by total labor compensation per hour worked. This provides a unifying methodology for integrating the National Income and Product Accounts generated by BEA and the productivity statistics constructed by BLS. This methodology conforms to the international standards presented in the OECD Productivity Manual (2001).5

The European Union (EU) has recently completed a project to develop systems of production accounts based on this methodology for the economies of all EU member states.6 This has been expanded to an initiative involving more than forty countries on all six continents.7

1 The BEA's seven-account system is summarized by Dale W. Jorgenson and J. Steven Landefeld, "Blueprint for Expanded and Integrated U.S. Accounts: Review, Assessment, and Next Steps," in Dale W. Jorgenson, J. Steven Landefeld, and William D. Nordhaus, eds., A New Architecture for the U.S. National Accounts, Chicago: University of Chicago Press, 2006, pp. 13-112. An electronic version of Jorgenson and Landefeld, "Blueprint for Expanded and Integrated U.S. Accounts," is available.
2 United Nations, Commission of the European Communities, International Monetary Fund, Organisation for Economic Co-operation and Development, and the World Bank, 2009, System of National Accounts 2008, ST/ESA/STAT/SER.F/2/Rev.5, New York: United Nations.
3 See Jorgenson and Landefeld, "Blueprint for Expanded and Integrated U.S. Accounts," in A New Architecture for the U.S. National Accounts, op. cit.
4 The methodology follows that of Jorgenson, Ho, and Stiroh (2005), Information Technology and the American Growth Resurgence; see Jorgenson, Ho, and Samuels (2010), "New Data on U.S. Productivity Growth by Industry," May 2010, http://www.worldklems.net/conferences/worldklems2010_jorgenson.pdf
5 See Paul Schreyer, Productivity Manual: A Guide to the Measurement of Industry-Level and Aggregate Productivity Growth, Paris: Organisation for Economic Co-operation and Development, 2001.
6 For details on the EU project, see: www.euklems.net/
7 http://www.worldklems.net/
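To make concrete the compensation-weighting of hours described above, the following sketch shows one standard way such weighting is implemented in the productivity literature, a share-weighted (Tornqvist) aggregation of labor input across worker categories. It is an illustration added here with made-up numbers, not official BEA or BLS code.

```python
# Illustrative sketch: share-weighted (Tornqvist) aggregation of labor input
# across worker categories, using compensation per hour as weights.
# Numbers are made up; this is not official BEA/BLS code.
import math

# hours worked and hourly compensation by category, in two adjacent years
categories = {
    #            (hours_t0, wage_t0, hours_t1, wage_t1)
    "young_hs":  (100.0, 15.0,  98.0, 15.5),
    "young_col": ( 80.0, 25.0,  85.0, 26.0),
    "older_col": ( 60.0, 35.0,  62.0, 36.0),
}

def comp_shares(hours_wage):
    comp = {k: h * w for k, (h, w) in hours_wage.items()}
    total = sum(comp.values())
    return {k: v / total for k, v in comp.items()}

shares0 = comp_shares({k: (h0, w0) for k, (h0, w0, _, _) in categories.items()})
shares1 = comp_shares({k: (h1, w1) for k, (_, _, h1, w1) in categories.items()})

# Tornqvist growth of labor input: average compensation shares times log-change in hours
growth = sum(
    0.5 * (shares0[k] + shares1[k]) * math.log(h1 / h0)
    for k, (h0, _, h1, _) in categories.items()
)
print(f"labor input growth: {growth:.4%}")  # differs from raw hours growth
```

Because hours are weighted by compensation shares, a shift of hours toward more highly paid categories raises measured labor input even when total hours are unchanged.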

The prototype system of Jorgenson and Landefeld begins with the NIPAs and generates the income and product accounts in constant prices as well as current prices. An important advantage of beginning with the NIPAs is that the impact of globalization on the U.S. economy is reflected in BEA's system of international accounts. This system includes the Foreign Transactions Current Account, which records imports and exports, as well as receipts from the Rest of the World, payments to the Rest of the World, and the Balance on Current Account. The international accounts also include the Foreign Transactions Capital Account, which registers Net Lending and Borrowing from the United States to the Rest of the World. BEA's international accounts are undergoing substantial improvements intended to enhance the quality of information available to policy makers dealing with globalization.8 Another important advantage of beginning with the NIPAs is that the existing U.S. national accounts could be incorporated without modification. Improvements in the NIPAs could be added as they become available.

Next Steps. The next step in unifying the National Income and Product Accounts with the productivity statistics is to develop a more detailed version of the production account. This would incorporate BEA's new system of official statistics on output, investment, intermediate input, employment, fixed assets, and imports and exports by industry. The system of industry production accounts would use the North American Industry Classification System (NAICS) employed in BEA's official statistics. The accounts would include capital and labor inputs for each industry, based on the methodology of Jorgenson, Ho, and Stiroh (2005) and Jorgenson, Ho, and Samuels (2010). Industry outputs, as well as intermediate, capital, and labor inputs, would be presented in current and constant prices along with productivity. For example, the BEA is currently engaged in a major program to improve the existing system of industry accounts. This program integrates the NIPAs with the Annual Input-Output Accounts and the Benchmark Input-Output Accounts produced every five years. Improvements in the source data are an important component of this program, especially in measuring the output and intermediate inputs of services. The Census Bureau has generated important new source data on intermediate inputs of services, and BLS has devoted a major effort to improving the service price data essential for measuring output.9

The next step in integrating the NIPAs with the Flow of Funds Accounts would be to extend the national balance sheet for the U.S. economy generated by Jorgenson and Landefeld to incorporate balance sheets for the individual sectors identified in the Flow of Funds Accounts. The Integrated Macroeconomic Accounts for the U.S., produced by Teplin, et al., have focused on the income and expenditure accounts, rather than balance sheets and the wealth accounts.10 These accounts are updated annually by BEA and the FRB. A comprehensive wealth account for the U.S. economy is currently unavailable. Such an account is essential for measuring the accumulation of wealth to meet future financial needs for both public and private sectors, as well as assessing the levels of domestic and national saving and their composition.

8 See, for example, Ralph Kozlow, "Globalization, Offshoring, and Multinational Companies: What are the Questions and How Well Are We Doing at Answering Them," BEA Working Paper, January 6, 2006.
9 See the Panel Remarks by Thomas L. Mesenbourg of the Census Bureau and Kathleen P. Utgoff of BLS in Jorgenson, Landefeld, and Nordhaus, op. cit., pp. 611-625.
10 BEA national income and FRB flow of funds data on income and expenditure are combined by Albert M. Teplin, Rochelle Antoniewicz, Susan Hume McIntosh, Michael G. Palumbo, Genevieve Solomon, Charles Ian Mead, Karin Moses, and Brent Moulton, "Integrated Macroeconomic Accounts for the United States: Draft SNA-USA," in Jorgenson, Landefeld, and Nordhaus, op. cit., pp. 471-541.

Future Research. The final question to be addressed is, why not leave this as a Grand Challenge to the statistical agencies? The answer is that no agency has responsibility for producing a new architecture for the national accounts. Each of the agencies has a well-defined scope of activities supported through the federal budget. These activities have been developed over many decades of experience of operating within the decentralized U.S. statistical system. The existing architecture of the U.S. national accounts was developed through collaboration between the statistical agencies and intellectual leaders in the private sector such as Simon Kuznets and Wassily Leontief, but this architecture has important gaps and inconsistencies and is now in need of major updating and extension. The initial steps described above were carried out through collaborations among the agencies and between private and public sector investigators. The new architecture project would involve close collaboration with the statistical agencies.

The creation of a new architecture for the U.S. national accounts will open new opportunities for development of our federal statistical system. The boundaries of the U.S. national accounts are defined by market and near-market activities included in the gross domestic product. An example of a market-based activity is the rental of residential housing, while a near-market activity is the rental equivalent for owner-occupied housing. The new architecture project is not limited to these boundaries. Under the auspices of the National Research Council, the Committee on National Statistics has outlined a program for development of non-market accounts, covering areas such as health, education, household production, and the environment.11 The conceptual framework for non-market accounts is presented by Nordhaus, "Principles of National Accounting for Nonmarket Accounts," in Jorgenson, Landefeld, and Nordhaus, op. cit., pp. 143-160.

An example of future opportunities for development of federal statistics is the integration of rental values for housing, the asset value of the housing stock, and the level of investment in residential structures. All three have been the focus of intense media attention during the recent housing boom and bust, in part because of the importance of housing as a component of national wealth. The value of the housing stock includes the value of residential structures, as well as the value of residential land. The value of land is included in the national wealth, but not in BEA's accounts for reproducible assets. Investment in housing also involves important long-term policy issues, such as the impact of federally subsidized mortgages, the effect of tax incentives for housing through income tax deductions for mortgage interest and state and local property taxes, and the role of investment in public housing. New accounts for health and education could make use of new data sources, such as the American Time Use Survey (ATUS), recently instituted by the Bureau of Labor Statistics.12

11 The NRC report is summarized by Katharine G. Abraham and Christopher Mackie, "A Framework for Nonmarket Accounting," in Jorgenson, Landefeld, and Nordhaus, op. cit., pp. 161-192.

An important part of investment in education is the value of time spent by students enrolled in educational programs. Since this time is not evaluated in the labor market, the value of investment in education is outside the boundary of the national accounts, but could be included in non-market accounts. Dale W. Jorgenson and Barbara M. Fraumeni have provided estimates of investment in human capital, including education.13 BEA has recently undertaken a project to update the Jorgenson-Fraumeni estimates of investment in education as part of a program to measure the output of public educational institutions.

The Jorgenson-Fraumeni estimates of education incorporate a detailed system of demographic accounts for the U.S. population, based on the work of Land and McMillen.14 This includes a breakdown of the population by age, sex, education, and labor force status. Employed members of the labor force are included in the labor data base that underlies the prototype system of accounts developed by Jorgenson and Landefeld. Time spent in labor market activities is also included in the labor data base. Time spent in non-market activities, such as education, is included in the data base employed by Jorgenson and Fraumeni.

The availability of data on time use would also facilitate the implementation of measures of well-being that incorporate social and psychological dimensions, as well as the economic dimension captured by the measure of income in constant prices employed by Jorgenson and Landefeld, following Paul Samuelson, William Nordhaus and James Tobin, and Martin Weitzman.15 For example, a System of National Well-Being Accounts has been proposed by Daniel Kahneman and Alan Krueger.16 This is based on the Day Reconstruction Method, in which time use is associated with domain-specific satisfaction. Measures of satisfaction can be compared over time and among groups of individuals to measure levels of well-being and their evolution over time.

Finally, the World KLEMS project is now generating industry-level production accounts, like those described above for the U.S., for the economies of EU members and fifteen other major U.S. trading partners such as Brazil, China, India, Japan, and Korea. These data will greatly facilitate international comparisons and research into the impact of globalization on the major industrialized economies and the future impact of globalization on the U.S. economy.

12 This provides detailed accounts for time use for the U.S. population. See the BLS website for details about ATUS: www.bls.gov/tus/.
13 See Dale W. Jorgenson and Barbara M. Fraumeni, in Dale W. Jorgenson, Postwar U.S. Economic Growth, Cambridge, MA: The MIT Press, 1996. The estimates of Jorgenson and Fraumeni have been updated by Michael S. Christian, Human Capital Accounting in the United States: 1994 to 2006 (PDF). An overview of issues in measuring investment in education is presented by Katharine G. Abraham, Accounting for Investments in Formal Education (PDF).
14 See Kenneth C. Land and Marilyn M. McMillen, "Demographic Accounts and the Study of Social Change, with Applications to Post-World War II United States," in F. Thomas Juster and Kenneth C. Land, eds., Social Accounting Systems, New York: Academic Press, 1981, pp. 242-306.
15 See Paul A. Samuelson, "The Evaluation of 'Social Income'," in Friedrich A. Lutz and Douglass C. Hague, eds., The Theory of Capital, London: Macmillan, 1961, pp. 32-57; William Nordhaus and James Tobin, "Is Growth Obsolete?" in Milton Moss, ed., The Measurement of Economic and Social Performance, New York: Columbia University Press, 1973, pp. 509-532; and Martin Weitzman, Income, Wealth, and the Maximum Principle, Cambridge: Harvard University Press, 2003.
16 See Alan B. Krueger, ed., Measuring the Subjective Well-Being of Nations: National Accounts of Time Use and Well-Being, Chicago: University of Chicago Press, 2009.


Measurement and Experimentation in the Social Sciences

Arie Kapteyn
RAND

October 8, 2010

Abstract

I propose to build an advanced data collection environment for the social sciences that maximizes opportunities for innovation, and is fast, cost effective, and easy for everyone in the scientific community to use. The core of this "laboratory" is a representative panel of households in the United States who have agreed to be available for regular interviews over the Internet. The Internet panel is representative in the sense that respondent recruitment is based on a probability sample. Internet access will not be a prerequisite for participation in the panel: if a respondent does not have Internet access at the time of recruitment into the panel, he or she will be provided with a laptop and broadband access. The laboratory will incorporate and pioneer new forms of data collection, including, but not limited to, laboratory experiments, physical measurements and biomarkers, self-administered measurement devices for the collection of biomarkers, accelerometers to measure physical activity, Global Positioning System (GPS) devices, smartphones, web cameras, and eye tracking equipment.

This work is licensed under the Creative Commons Attribution-NoDerivs 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nd/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.

The challenge of measurement

Social scientists use many sources of information to construct their models of human behavior in a social and societal context. These sources include introspection, participatory observation, surveys (panel surveys and cross sectional surveys), administrative data, laboratory experiments, field experiments, natural experiments, experience sampling, etc.1 Every source of information has strengths and weaknesses. Which source is used may depend on the research question at hand, but in many instances also reflects the personal preferences and skills of researchers and the size of research budgets. Measurement is typically limited to one domain or at most a few. Thus numerous studies are conducted, each addressing one of a variety of domains of human life but mostly ignoring the relationship with other domains. This situation can partly be attributed to budgetary limitations: after all, who has money to study "everything"? Yet in fact, resources are wasted because different studies often collect overlapping information while missing opportunities to capture a more complete understanding of human behavior in its various aspects.

1 This list is not an exhaustive typology and several finer distinctions can be made, e.g. between experiments with real stakes and hypothetical experiments.

More importantly, empirical social sciences tend to be fragmented, not only because of disciplinary differences in approach and the communication issues associated with these differences, but also because data themselves are fragmented. Admirable exceptions exist: the US Health and Retirement Study (HRS, http://hrsonline.isr.umich.edu/) is the most prominent example.2 Since 1992, the HRS has collected information biennially on individuals 50 and older about income, work, assets, pension plans, health insurance, disability, physical health and functioning, cognitive functioning, and health care expenditures. Additionally, off-year surveys cover topics such as the Consumption and Activities Mail-Out survey (CAMS) and one-off surveys on topics such as Medicare Part D and time use. Recently (since 2006), HRS has started to collect biomarkers such as grip strength, breathing tests, saliva (to extract DNA), and dried blood spots (for Hemoglobin A1c, total cholesterol and HDL cholesterol). In addition, administrative data (e.g. social security earnings records) have been added, subject to consent by respondents. The HRS is such a successful scientific model that it has been reproduced in some 20 countries (England, several continental European countries, Mexico, South Korea, Japan, China, and India), with largely similar set-ups and comparable questionnaires. Thus although the HRS has been revolutionary in its multidisciplinary approach and in its continuous incorporation of innovations, many factors continue to limit what we can learn (including the age requirement for respondents). In short, despite the collection of additional information in off years and the growing breadth of information that is being collected (like biomarkers), there are obvious limitations to the amount of information that can be collected in any single domain and to the number of experiments one can do.

What would be a next step? Imagine a survey like the HRS that would allow researchers to recontact study participants at any time, include all ages, combine conventional surveys with physical measures and biomarkers, use modern technology to monitor behavior, and allow for data collection across a broad swath of domains and over an extended time period. This kind of survey would allow researchers across multiple disciplines to consider all relevant domains for empirical analysis, to quickly monitor the effects of major events (e.g. the financial crisis or the swine flu pandemic), and to design experiments that take advantage of a wealth of readily available background information. The format would also support in-depth studies on subsamples (e.g. conducting functional Magnetic Resonance Imaging experiments or qualitative interviews with a subset of study participants). Naturally we would also want to link administrative data to individual records (subject to respondents' consent and with adequate data protection safeguards).

2 This is not to say that other surveys don't cover material from different disciplines. For instance, the PSID, which started as primarily a socio-economic panel, has added content over the years and now has a substantial health component.

What is possible today?

My proposal is not to build something totally new with unproven technology, but rather to build on what has been proven to work. I propose to build a virtual laboratory: an advanced data collection environment for the social sciences that maximizes opportunities for innovation, and that is fast, cost effective, and easy for everyone in the scientific community to use. The core of this laboratory is a representative panel of households who have agreed to be available for regular interviews over the Internet. The Internet panel is population representative in the sense that respondent recruitment is based on a probability sample. Internet access is not a prerequisite for panel participation: if a respondent lacks Internet access at the time of recruitment, he or she is provided with a laptop and broadband. Current technology allows respondents to participate in surveys using their preferred hardware device, such as a netbook, desktop computer, iPhone or other smartphone, or any other device like the browser on a game console.

The virtual laboratory will develop and test new modes of data collection and the collection of new types of data, including, but not limited to, web cams, self-administered measurement devices for the collection of biomarkers (e.g. infrared blood sugar monitors), accelerometers and heart rate monitors for measuring physical activity and physiological responses, devices for experience sampling, Day Reconstruction Methods, preloading, intensive methods to increase both unit and item response, and data quality checks. It is neither possible nor useful to describe the many kinds of data that might be collected. The idea is that the laboratory we propose will be able to follow new technological and scientific developments, without committing to one particular technology ex ante, using technology that currently exists or is right around the corner. A few examples will illustrate the point.

Example 1, GPS tracking: With more and more cell phones equipped with GPS, GPS tracking is becoming more sophisticated and yet more affordable. Software is now available or can be easily developed to track a respondent's GPS-enabled cell phone from the web and combine it with real-time location based information. This combination would have many possible applications, such as allowing researchers to initiate a small survey by text messaging questions to a respondent when s/he leaves the gym or visits a tax consultancy office.

Example 2, Eye tracking: In recent years, the increased sophistication and accessibility of eye tracking technologies have generated a great deal of interest in the commercial sector. Most applications focus on web or software usability, e.g. presenting a target stimulus to a sample of consumers while an eye tracker is used to record the activity of the eye. By examining fixations, saccades, pupil dilation, blinks, and a variety of other behaviors, researchers can determine a great deal about the effectiveness of the web or software interface. This technology can be easily adopted and used to test alternative interviewing techniques and to examine respondent behavior during an interview, e.g. to gauge which information on a screen is actually taken into account when answering a question.
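As a purely illustrative sketch of the location-triggered survey invitation described in Example 1 above (the coordinates, radius, and send_text_survey stub are hypothetical assumptions, not part of any existing system), incoming GPS fixes could be checked against places of interest and a short questionnaire fired when a respondent leaves one of them:

```python
# Hypothetical sketch: trigger a short survey when a GPS-enabled phone leaves a
# place of interest (e.g., a gym). Coordinates, radii, and the send_text_survey()
# stub are illustrative assumptions, not a real API.
from math import radians, sin, cos, asin, sqrt

PLACES = {"gym": (34.0522, -118.2437, 150.0)}  # latitude, longitude, radius in meters

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def send_text_survey(respondent_id, topic):
    print(f"sending '{topic}' questions to respondent {respondent_id}")

def on_gps_fix(respondent_id, lat, lon, state):
    """state maps place name -> was the respondent inside it at the last fix?"""
    for name, (plat, plon, radius) in PLACES.items():
        inside = distance_m(lat, lon, plat, plon) <= radius
        if state.get(name) and not inside:          # the respondent just left
            send_text_survey(respondent_id, f"visit to {name}")
        state[name] = inside

# Example: two fixes, the second one well outside the gym's radius
state = {}
on_gps_fix("R123", 34.0522, -118.2437, state)
on_gps_fix("R123", 34.0600, -118.2500, state)
```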

Example 3, Accelerometers to measure physical activity: An accelerometer is a device that measures proper acceleration, the acceleration experienced relative to free fall. Accelerometers are increasingly being incorporated into personal electronic devices like the iPhone and allow researchers access to objective measurements of physical activity. These measurements can be retrieved in real time or can be uploaded to a central location when the respondent has access to a computer, allowing for follow-up questions based on the measured activity.

Example 4, Telemetry: Telemetry is a technology that allows the remote measurement and reporting of information of interest to a central location for further analysis. Thus it can be used to link the output of all these new technologies.

Example 5, Integrating survey information with social network information: Having access to social networking sites like Facebook (only with a respondent's permission, of course) provides researchers with ample information about a respondent without actually asking questions. This technology can help reduce the respondent burden, give the respondent more flexibility and a familiar interface, and allow for consistency checks based on the data retrieved from the social networking site.
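To illustrate Example 3 above, here is a minimal sketch (with made-up cut-points and data, not a validated activity-classification algorithm) of how per-minute accelerometer counts uploaded from a device might be summarized into daily activity minutes that could then drive follow-up questions:

```python
# Illustrative sketch: summarize per-minute accelerometer counts into activity
# minutes. The cut-points and sample data are made up for illustration only.
from typing import List

SEDENTARY_MAX = 100      # counts per minute (illustrative threshold)
MODERATE_MIN = 2020      # counts per minute (illustrative threshold)

def summarize_day(counts_per_minute: List[int]) -> dict:
    sedentary = sum(1 for c in counts_per_minute if c < SEDENTARY_MAX)
    mvpa = sum(1 for c in counts_per_minute if c >= MODERATE_MIN)
    light = len(counts_per_minute) - sedentary - mvpa
    return {"sedentary_min": sedentary, "light_min": light, "mvpa_min": mvpa}

# A fake day: mostly sedentary, with a 40-minute brisk walk
day = [50] * 600 + [800] * 100 + [2500] * 40
summary = summarize_day(day)
print(summary)

if summary["mvpa_min"] >= 30:
    print("Follow-up: ask the respondent what activity they did today.")
```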

What exists today?
MESS
An existing facility that comes closest to what I am proposing is the MESS project in The Netherlands (http://www.centerdata.nl/en/TopMenu/Projecten/MESS/index.html).3 The core element of the MESS project, currently about mid-way through its first seven years of funding, is the so-called LISS panel (Longitudinal Internet Studies for the Social Sciences). The LISS panel consists of approximately 5,000 households representing the Dutch-speaking population in the Netherlands. The panel is based on a probability sample drawn from the population registers. Households without prior Internet access are provided with broadband access (and a PC) to participate. The LISS panel has been fully operational since early 2008 and has now collected two years of data. Annual interview time is about 300 minutes. Panel members complete relatively brief (30-minute) online questionnaires monthly and are paid for each completed questionnaire. Half of the interview time is reserved for the LISS core study. This core study is repeated yearly (spread out over several months) and borrows from various national and international surveys to facilitate comparison with other data sources. The core survey covers a much broader range of topics and approaches than would be possible with other surveys using more traditional interview methods. The remaining interview time is used for experiments and innovation. Respondents can complete online questionnaires at any time during the month.

3 For the sake of full disclosure, I am one of the principal investigators of the MESS project. I am also the director of the American Life Panel, discussed below.


The application and review procedures for experiments are similar to those of TESS (see below), but there is no a priori restriction on the size or duration of the experiment that one can propose. In the first two years, about 40 proposals for experiments were accepted.

TESS
TESS (Time Sharing Experiments for the Social Sciences) is somewhat similar to the MESS project in its use of a standing Internet panel, Knowledge Networks. The panel is available at no charge to researchers who complete an application. The TESS web-site lists about 125 papers based on experiments conducted with the panel between 2003 and 2008. MESS and TESS do have some notable differences: TESS does not collect much core information about the panel members, except for basic demographics, and the number of items in a questionnaire as well as sample sizes are strictly limited (essentially, the more items, the smaller the sample size). Also, unlike LISS (MESS), TESS considers proposals only for experiments, not for regular surveys. Nevertheless, TESS services are clearly in demand.

American Life Panel
The RAND American Life Panel (ALP) is similar to the Knowledge Networks and LISS panels in its reliance on a probability-based sample and its ability to include respondents without prior Internet access by offering a laptop and Internet subscription. The panel currently includes approximately 3,000 US households, with firm plans to increase the number to 5,000 (including a Spanish language subpanel). Since 2007, some 120 experiments or surveys have been conducted. The HRS survey instrument has been programmed for the ALP and administered to the ALP respondents, so the full HRS core information on all panel members is available. Use of the panel for surveys or experiments is open to all researchers, but is not free.4 The ALP is used intensively (approximately three surveys or experiments per month), and one might worry about survey fatigue and hence increased attrition. The annual attrition is between 5 and 6% a year. The low attrition rate may be partly due to the relatively generous incentives offered to respondents ($20 per half hour of interview time). Occasional comparisons with other surveys about similar topics show broad consistency. Data are disseminated through a web-site that allows free download of datasets and the construction of a custom-made dataset by combining variables from different waves and putting them in a "shopping cart". One can download data at any time during or after the field period. (https://mmicdata.rand.org/alp/index.php/Main_Page)

4 A substantial part of the experiments and data collection is supported by grants from the National Institute on Aging. Other major funders are the Social Security Administration and several non-profit institutions. The pricing is $3 per interview minute for the first 500 respondents, $2.50 for the next 500, and $2 per interview minute beyond 1,000 respondents.
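As a purely illustrative reading of the tiered pricing quoted in the footnote above (the interpretation of the tiers is an assumption; actual ALP billing may differ), the cost of fielding a module could be computed as:

```python
# Illustrative cost calculator for the tiered per-minute pricing quoted above.
# The tier interpretation is an assumption; actual ALP billing may differ.
def survey_cost(n_respondents, minutes_per_interview):
    tier1 = min(n_respondents, 500)                # $3.00 per interview minute
    tier2 = min(max(n_respondents - 500, 0), 500)  # $2.50 per interview minute
    tier3 = max(n_respondents - 1000, 0)           # $2.00 per interview minute
    per_minute_total = 3.00 * tier1 + 2.50 * tier2 + 2.00 * tier3
    return per_minute_total * minutes_per_interview

# Example: a 10-minute module fielded to 2,000 respondents
print(f"${survey_cost(2000, 10):,.2f}")  # $47,500.00
```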



Conclusion
A laboratory as proposed will both dramatically expand opportunities for social science research and be highly cost effective. The technology exists; we only have to put it together.


Implications of the Financial Crisis for the Grand Challenge Questions for the NSF/SBE

Randall S. Kroszner
October 2010

The recent crisis has highlighted areas and questions that would be extremely valuable to investigate in greater detail. I will touch on only a few of these topics in this limited space and will purposely range widely rather than try to deeply develop each subset of ideas.

The Role of Economic History and Comparative Economics: Perhaps the single most important piece of economic research that provided guidance to Federal Reserve Board members during the crisis was Milton Friedman and Anna Schwartz's A Monetary History of the United States, especially the sections related to the "Great Contraction." In a crisis, policy-makers must act quickly and on limited information. Although the banking and financial markets have changed dramatically since the 1930s, the Friedman and Schwartz book provides a detailed exposition of a previous crisis and an analysis of policy interventions, or lack thereof, that mitigated or exacerbated the crisis. This style of analysis – motivated by economic theory but heavily focused on institutions and data to provide a broad coherent view of how to think about and respond to a crisis – is one that may not be fashionable today but can be extremely powerful. Understanding past crises, both domestically and internationally, is crucial to the advancement of macroeconomic and finance theories. Comparative and historical analyses are also fundamental to capacity building. Students will gain an enormous amount by having a greater appreciation for what has happened, and history can provide valuable examples, possibly even "natural experiments," to advance our understanding of the fragilities of the macroeconomy and financial system. Gathering more systematic historical and comparative data sets on asset prices and the structure of markets is a necessary part of this capacity building.

The Expectations Formation Process and Learning: The process through which key economic agents develop and modify expectations is not well understood. There is much evidence, for example, that explicit inflation targets are correlated with lower and less volatile inflation. The usual explanation is that the articulation of a target leads expectations to be better anchored and thereby allows a central bank to achieve its goals more easily. While this may be true, we understand little about the learning process itself: what types of actions and/or communications cause economic actors to update their beliefs and to change their behavior? Once beliefs become anchored, in which circumstances can they become "unanchored"? Precisely how does this affect their behavior, as price-setters and price-takers? Work in psychology and sociology could be helpful in developing richer data sets and theories of learning and updating behavior. Such research would be extremely valuable for helping central banks to decide on the most effective communication strategy and help to shape how to respond to a crisis.
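One deliberately simple formalization that researchers use for this kind of belief updating is an adaptive, constant-gain learning rule. The sketch below is an illustration added here (it is not a model from this white paper, and all numbers are made up); it shows how a single gain parameter governs how quickly expectations re-anchor after a shift in policy.

```python
# Illustrative sketch (not from the paper): constant-gain learning of inflation
# expectations. A small gain means beliefs stay "anchored"; a large gain means
# they adjust quickly when actual inflation shifts. All numbers are made up.
def simulate(gain, periods=40, target_before=0.02, target_after=0.05, shift_at=20):
    expectation = target_before
    path = []
    for t in range(periods):
        actual = target_before if t < shift_at else target_after
        # update: move expectations toward the latest observation by 'gain'
        expectation += gain * (actual - expectation)
        path.append(expectation)
    return path

for gain in (0.05, 0.5):
    path = simulate(gain)
    print(f"gain={gain}: expectation 5 periods after the shift = {path[24]:.3f}")
```

With a gain of 0.05 expectations remain close to the old 2 percent target five periods after the shift, while with a gain of 0.5 they have nearly reached the new 5 percent level, which is one way to make the "anchored versus unanchored" distinction concrete.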


Similarly, we understand little about the expectations development and change process in asset markets. In a wonderful paper titled "Noise," Fischer Black emphasized that the most we can hope for in pricing and valuing many classes of assets in markets is to try to be within two standard deviations of what our models might suggest are the "fundamental values." In many cases, those standard deviations can be quite wide, so large movements in asset prices could be a "normal" part of the financial markets. Add to that updating of beliefs about appropriate discount rates, future cash flows, etc., and the result could be swings in market prices. Such volatility could in turn affect saving behavior and macroeconomic outcomes. Building a greater understanding of the interaction of the formation and change of expectations at the individual level and the implications for market-wide and economy-wide behavior, perhaps by drawing on experimental methods and theories in other disciplines, is a great challenge for economics but one with potentially high pay-offs.

Interaction between and Effectiveness of Monetary and Fiscal Policy: The crisis has underscored the knowledge gaps that we have in the effectiveness of and interaction between different aspects of government policy in mitigating (or exacerbating) economic and financial volatility. On the fiscal side, we have a paucity of systematic empirical research on the "fiscal multipliers" – what types of spending seem to be most/least effective, and what types of tax changes seem to have the largest/smallest impacts in the short run? Much of the current debate has been focused on the size of fiscal actions rather than on what types of changes in taxes and spending have the greatest impact on behavior and incentives in the short and intermediate/long runs. Historical and comparative data sets once again can be extremely helpful here in building capacity. Also, interaction effects between fiscal and monetary policy have not been fleshed out. Can monetary policy simply "accommodate" fiscal policy changes and offset them? Is this symmetric; that is, if tighter monetary policy can offset looser fiscal policy in an expansion, is it also true that looser monetary policy can be effective in offsetting tighter fiscal policy in a downturn? Studying past combinations of monetary and fiscal actions, such as the tightening that occurred in the late 1930s in the US that led to a form of "double dip," would be valuable. It will also be an important part of further capacity building to develop our theoretical understanding of monetary policy effectiveness when the interest rate falls towards the zero lower bound.

Interconnectedness and Too Big/Too Interconnected to Fail: The interconnectedness of financial markets and institutions, and the implications for macroeconomic outcomes, is another grand challenge for economics (see Kroszner 2010). Moral hazard problems arise from the existence of any private insurance or public safety-net scheme. Studying mechanisms to mitigate excess risk-taking behavior is a longstanding endeavor, but one that certainly needs increased attention going forward. In particular, there is relatively little systematic evidence on the size and types of distortions that arise from implicit or explicit safety nets. For example, does the potential for moral hazard distort incentives in financial innovation towards products with difficult-to-measure tail risks?
In addition, how does the interconnectedness of markets and institutions make the system more fragile and does “too interconnected to fail” make this problem worse?


Improving the dialogue between financial economics and macroeconomics to better understand the sources of fragilities, propagation mechanisms, and macroeconomic implications of moral hazard in markets and institutions would be valuable. Making students aware of these interactions and unanswered questions, and encouraging the recruitment of faculty who do not easily fit into a macro or finance slot but can teach and research across these fields, would provide important capacity building.

References:

Black, Fischer. (1986) "Noise," Journal of Finance, Presidential Address to the American Finance Association.

Friedman, Milton and Anna Schwartz. (1963) A Monetary History of the United States, 1867-1960. Princeton: Princeton University Press.

Kroszner, Randall. (2010) "Interconnectedness, Fragility, and the Financial Crisis," presented to the Financial Crisis Inquiry Commission, February, http://www.fcic.gov/hearings/pdfs/2010-0226-Kroszner.pdf

This work is licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-sa/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.



Virtual Model Validation for Economics

David K. Levine
September 9, 2010

White Paper prepared for the National Science Foundation. www.dklevine.com. Released under a Creative Commons Attribution Non-Commercial Share Alike license.

Abstract: How can economic policies lead us to greater wealth, welfare and happiness? There is no bigger question in economics. The answer lies in correct economic theories that capture the causality linking policies to outcomes. Economic theories are a dime a dozen – we have more theories than we have human beings. Do we live in an Austrian world? In a Keynesian world? A world of rational expectations? This White Paper proposes that major advances in simulating virtual economies are possible and would form the basis for rapid and accurate assessment of current and future economic models. I make general proposals for developing infrastructure, as well as presenting specific ideas about the nature of models of sophisticated expectations that are needed to allow artificial agents to mimic the behavior of real human beings.

The key need to answer any economic question lies in our ability to validate theories. One of the most essential needs for developing better economic theories and policy prescriptions is improved methods of validating theories. Originally economics depended largely on field data – usually gathered from large or small scale surveys. The introduction of laboratory experiments added a new dimension to validation: a good theory ought to be able to predict outcomes in the artificial world of the laboratory. Modern economics has extended this in two directions: to field experiments, keeping many of the controls of laboratory experiments while conducting experiments in more natural environments, and to internet experiments, extending the size and scope of the populations used in experiments. The importance of these innovations is great, and they have been discussed in depth by List, among others. In particular, issues of causality that are difficult to analyze with field data can be addressed.

On the other hand, laboratory, field and internet experiments all have important limitations. Subjects are expensive to pay, especially in large scale experiments. Even the largest internet experiment is orders of magnitude smaller than a small real economy: thousands of subjects rather than millions of real decision makers. Control in experiments is still and necessarily imperfect; for example, it is not possible to control for either risk aversion or social preferences. Experiments are faster than waiting for new data to arrive, but are still time-consuming – the more so with the National Institutes of Health trying to apply inappropriate medical ethics to harmless economics experiments.

An alternative method of validating theories is through the use of entirely artificial economies. To give an example, imagine a virtual world – something like Second Life, say – populated by virtual robots designed to mimic human behavior. A good theory ought to be able to predict outcomes in such a virtual world. Moreover, such an environment would offer enormous advantages: complete control – for example, over risk aversion and social preferences – independence from well-meant but irrelevant human subjects "protections," and great speed in creating economies and validating theories. Not only is it easier, faster, and more practical to validate theories, but the greater control possible in the experimental setting reduces the time needed to improve and develop new theories.

If we were to look at the physical sciences, we would see the large computer models used in testing nuclear weapons as a possible analogy. In the economic setting the great advantage of such artificial economies is the ability to deal with heterogeneity and with small frictions, along with the study of economies that are of interest to economists and policy makers. These are difficult or impractical to include in existing calibrations or Monte Carlo simulations.

The notion of virtual economies is not new: the general concept has become known as agent-based modeling. In economics, the most influential work has been that of Nelson and Winter examining the evolution of growth and change. Yet this work has not had a substantial impact on our understanding of economics, and despite three decades of effort, existing agent-based models are too primitive to be used either for evaluating economic policies or for validating economic theories. The problematic aspect of agent-based modeling has been the focus on frameworks for agents interacting – the development of languages such as SWARM or Cybele – and the fact that agents are limited to following simple heuristic decision rules. Agent-based models are interesting from the perspective of modeling order arising from the interaction of many simple decision rules – along the lines of Becker's observation that demand curves slope downwards if people choose randomly along the budget line. These models are also useful in constructing examples to illustrate special points. However, agent-based models are largely limited to studying phenomena such as traffic patterns.

Although some behavioralists argue that people are very simple-minded and follow simple heuristic rules, real people in the laboratory and the field are able to recognize sophisticated patterns and anticipate future events. One of the simplest examples is the learning that takes place in the laboratory when subjects discover the idea of dominated strategies. Simple rules are not a good representation – for example – of how stock market traders operate. Existing agent-based models focus on the simple evolution of rules, with expectations that are backward looking rather than determined in equilibrium. The practical problem faced by virtual methodology is that people are far better learners and vastly more sophisticated than the best existing computer models. What is needed are agents who use sophisticated algorithms.

The key to developing useful virtual economies is modeling inferences about causality. A useful place to start thinking about the issues is with Sargent's The Conquest of American Inflation and the follow-on papers with Cogley. There the Federal Reserve is modeled as a sophisticated Bayesian learner equipped with powerful econometric methods and sophisticated intertemporal preferences – but limited to the data on hand. The model is validated against the last 50 years of data on monetary policy, inflation and unemployment. Dynamic Bayesian optimization, including the use of policy experiments, enables the Fed to learn the true relationship between unemployment and inflation, leading over time to superior monetary policy.

Notice that in the Sargent-Cogley world, the decision problem is relatively narrowly circumscribed: how best to choose the rate of monetary expansion. The incoming data is also narrowly circumscribed, and issues such as learning by analogy do not arise. First, they assume one of the underlying models is correct: in an environment where none of the underlying models is correct, Bayesian methods are not so useful.
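To give a concrete feel for the kind of Bayesian updating described above, the following is a drastically simplified sketch added here for illustration (it is not Sargent's model, and all numbers are made up): a learner with a normal prior updates its estimate of the slope in a one-parameter inflation-unemployment relation as data arrive.

```python
# Drastically simplified sketch (not Sargent's model): Bayesian updating of the
# perceived slope in a one-parameter relation
#   inflation = slope * unemployment_gap + noise,  noise ~ N(0, sigma^2),
# with a normal prior on the slope. All numbers are made up.
import random

random.seed(0)
true_slope, sigma = -0.5, 0.3
mean, var = 0.0, 1.0          # prior mean and variance for the slope

for _ in range(50):
    x = random.uniform(-2.0, 2.0)                  # observed unemployment gap
    y = true_slope * x + random.gauss(0.0, sigma)  # observed inflation
    # conjugate normal update for a single regression coefficient
    precision = 1.0 / var + (x * x) / sigma**2
    mean = (mean / var + x * y / sigma**2) / precision
    var = 1.0 / precision

print(f"posterior slope estimate: {mean:.3f} (true value {true_slope})")
```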

A useful framework for thinking about this problem is the computer science problem that underlies boosting: the choice among experts. A carefully chosen randomization strategy giving greater weight to experts with better track records can do as well asymptotically as the best expert – this is true even when all the experts are wrong. The framework can be extended to dynamic decision making by putting time into blocks – the technique often used in analyzing repeated games. If the block is long enough the payoff is approximately the same as the infinite present value.

To take a simple example, imagine a repeated Prisoners' Dilemma game where your opponent plays tit-for-tat, starting by not cooperating. An expert who says your opponent will always cheat will lead you to cheat – and his forecasts will be correct. Of course an expert who says you should always cooperate and that your opponent will cooperate after the first period is equally correct, and you will do much better following his advice.
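One standard algorithm with the asymptotic guarantee just described is the multiplicative-weights ("Hedge") rule from the boosting literature. The sketch below is added here for illustration with made-up losses and an arbitrary learning rate; it is not code from this white paper.

```python
# Illustrative sketch: multiplicative-weights ("Hedge") randomization among
# experts. Losses and the learning rate eta are made up; this is one standard
# algorithm with the guarantee described above, not code from the white paper.
import math
import random

def hedge(expert_losses, eta=0.3, seed=0):
    """expert_losses[t][i] = loss in [0, 1] of expert i in period t."""
    rng = random.Random(seed)
    n_experts = len(expert_losses[0])
    weights = [1.0] * n_experts
    total_loss = 0.0
    for losses in expert_losses:
        total_w = sum(weights)
        probs = [w / total_w for w in weights]
        chosen = rng.choices(range(n_experts), weights=probs)[0]
        total_loss += losses[chosen]
        # exponentially down-weight each expert in proportion to its loss
        weights = [w * math.exp(-eta * loss) for w, loss in zip(weights, losses)]
    return total_loss

# Two experts who are both often wrong: one errs 40% of the time, one 90%.
random.seed(1)
periods = 500
losses = [[1.0 if random.random() < p else 0.0 for p in (0.4, 0.9)] for _ in range(periods)]
print("cumulative loss of the randomization:", hedge(losses))
print("cumulative loss of the best expert:  ", min(sum(l[i] for l in losses) for i in range(2)))
```

Over time the randomization concentrates on the less-wrong expert, so its cumulative loss approaches that of the best expert even though neither expert is correct.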
While this may be a useful benchmark for learning about causality, it is a weak criterion. First, there is little point in sticking with an expert when it is clear that he is doing a poor job. Second, while the evaluation of dynamic plans requires that those plans be maintained for some period of time, blocking periods means that the length of time taken to learn is enormous, and causality between periods is ignored. In the economic setting there are additional considerations.

If we accept the basic framework of replacing a prior over models with a probability of choice over experts – where an expert is a tractable rule for making forecasts and recommendations – it is possible to outline the issues that need to be resolved. Experts make recommendations that can be evaluated directly – the weak criterion for asymptotic success has already been described. They also provide suggestions of evidence that demonstrate their ability as experts. That evidence needs to be assessed on several dimensions:

1. Calibration – how accurate are the predictions?

2. Precision – are the predictions vague or are they sharp? Does the expert always say "it might rain or shine with equal probability," or does he say half the time "it will rain for sure" and half the time "it will shine for sure"? The latter prediction is more precise. Calibration and precision are the traditional criteria for model evaluation.

3. Relevance: A molecular biologist may be able to make very accurate forecasts about the formation of molecules – but why should that lead me to take his investment advice? Notice that there is surely heterogeneity among people in evaluating the relevance of forecasts: some will believe that a good molecular biologist can better forecast stock prices than a bad one, while others will be more focused on their records as stock forecasters. Utility is directly related to relevance: two experts may both recommend that I not jump off a bridge. One may simply say "if you jump you will die" while the other may provide a more detailed and accurate evaluation of the consequences – the speed at which I will hit the water, how far my body parts will be flung, and so forth. But this additional information is of no use in decision making. Detailed information about inferior plans is not especially useful.

4. Scope: experts differ in the number of things they can forecast. It is natural to put more weight on the advice of an expert who can predict a great many things well over one who can predict only a few things well. Notice that this goes the opposite direction from relevance.

5. Ease of implementation. Some advice may be difficult to follow in practice. Here models of impulsive behavior such as those of Fudenberg and Levine and the empirical work of Cunha and Heckman may play a useful role.

Notice that the expert approach gets at several tricky issues. One is the issue of generalization. For example, we may want to get at the idea that when someone learns the idea of dominated strategies they do not merely learn not to play a dominated strategy in a particular game, but they learn not to play dominated strategies in any game. This can be done in the expert framework by providing an expert who advises against playing dominated strategies in all games. An expert who makes forecasts in many domains implicitly provides a formula for generalizing results from one domain to another. Second, the framework deals well with the transmission of ideas – experts can be communicated from one person to another – and unlike the sending of messages or provision of data there is no issue of the reliability of the information: the recipients can test the ideas implicit in the expert for themselves. However, it deals less well with the need to experiment with "off the equilibrium path" behavior to determine the causal consequences, because it does not tell us what is the option value of experimentation.

A large part of the advancement of the science must be the development of this and other learning models: understanding which ones have the best theoretical properties, which ones work best in practice, and which ones are most descriptive of actual behavior. The validation against behavior may benefit from neuro-economic experimental methods such as those of Glimcher and Rustichini. At the extreme, efforts such as the blue-brain project can provide additional paths of validation.

The infrastructure requirements for this project are large. At the human level the infrastructure requires the collaboration of economic theorists and practitioners with computer scientists, psychologists and neuroscientists. The development and validation of sophisticated agent models is only a part of the huge infrastructure required, as are thoughtful and well-developed models of production, trade and consumption. To combine many agent models into a single economy requires reliable high speed networking and substantial computer power at each end. Existing agent-based modeling frameworks may provide a starting point, but are not equipped to handle the kind of information flows or individual agent computations that simulating an artificial economy requires.

References

Fudenberg, D. and D. K. Levine [2009]: "Learning-Theoretic Foundations for Equilibrium Analysis," Annual Review of Economics, forthcoming.

Levitt, S. and J. List [2009]: "Field Experiments in Economics: The Past, The Present, and The Future," European Economic Review.

Nelson, R. and S. Winter [1982]: An Evolutionary Theory of Economic Change, Cambridge: Harvard University Press.
 Cambridge:  Harvard University Press      213 . R. Winter [1982] An Evolutionary Theory of Economic Change. and S.  Nelson.

214 .

SBE 2020: A Complete Theory of Human Behavior

Andrew W. Lo*
September 30, 2010

Abstract

I propose the following grand challenge question for SBE 2020: can we develop a complete theory of human behavior that is predictive in all contexts? The motivation for this question is the fact that the different disciplines within SBE do have a common subject: Homo sapiens. Therefore, the sociological, psychological, neuroscientific, and economic implications of human behavior should be mutually consistent. When they contradict each other—as they have in the context of financial decisions—this signals important learning opportunities. By confronting and attempting to reconcile inconsistencies across disciplines, we develop a more complete understanding of human behavior than any single discipline can provide. The National Science Foundation can foster this process of "consilience" in at least four ways: (1) issuing RFPs around aspects of human behavior, not around disciplines; (2) holding annual conferences where PI's across NSF directorates present their latest research and their most challenging open questions; (3) organizing "summer camps" for NSF graduate fellowship recipients at the start of their graduate careers, where they are exposed to a broad array of research through introductory lectures by NSF PI's; and (4) broadening the NSF grant review process to include referees from multiple disciplines.

If economists could manage to get themselves thought of as humble, competent people on a level with dentists, that would be splendid. — John Maynard Keynes, 1931

* Harris & Harris Group Professor, MIT Sloan School of Management, and Chairman and Chief Investment Strategist, AlphaSimplex Group, LLC. I would like to thank Dr. Myron Gutmann of the National Science Foundation's Social, Behavioral, and Economic Sciences Division for soliciting white papers on the grand challenge questions facing our respective disciplines, and Dr. Daniel Newlon for encouraging me to submit a response. By nature, such an exercise is meant to be speculative, not rigorous; nevertheless, I feel an obligation to apologize in advance to my academic colleagues for the informal and discursive nature of this essay. Research support from the National Science Foundation (SES–0624351) and the MIT Laboratory for Financial Engineering is gratefully acknowledged. Please direct all correspondence to: Andrew W. Lo, MIT Sloan School of Management, 100 Main Street, E62–618, Cambridge, MA 02142. This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. To view a copy of this license, visit: http://creativecommons.org/licenses/by-nc-sa/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.

I believe the most important grand challenge question facing the NSF's Social, Behavioral, and Economic Sciences Directorate is relatively easy to state, but extraordinarily difficult, if not impossible, to achieve by 2020:

Can we develop a complete theory of human behavior that is predictive in all contexts?

That this should be the grand challenge question for SBE 2020 is by no means clear. But before attempting to defend this proposal, let me explain more fully what the question asks. By "predictive", I mean an empirically validated and repeatable cause-and-effect relation. By "all contexts", I mean all situations in which humans may find themselves, including economic, social, political, cultural, and physical. And by "complete theory", I mean a theory that is consistent with all known facts of human behavior, and which is sufficient for making correct predictions of human behavior in novel contexts.

The motivation for seeking an answer to this ambitious question is the simple observation that the social, behavioral, and economic sciences have a single common focus: Homo sapiens. Because these disparate fields share the same object of study, their respective theories must be mutually consistent when there is any overlap in their implications; otherwise flaws exist in one or both of these bodies of knowledge. For example, anthropological theories of mating rituals must be consistent with the biology of human reproduction. Of course, in many cases, implications may not overlap. The particular mechanisms of genetic mutation have no direct bearing on the sources of time-varying stock market volatility, so checking for consistency between the former and the latter is unlikely to yield new insights. But because all SBE disciplines involve the study of the very same human behaviors and institutions, opportunities for consistency checks should arise often.

One of the most prominent inconsistencies among the SBE disciplines is the rational expectations paradigm of economics and the many behavioral biases documented by psychologists, behavioral economists, sociologists, and neuroscientists. Rational expectations, and its close cousin, the efficient markets hypothesis, have come under fire recently because of their apparent failure in predicting and explaining the current financial crisis. Some of this criticism is undoubtedly an unwarranted populist reaction to the life-altering economic consequences of the national decline in U.S. residential real-estate prices from 2006 to 2009. In such an emotionally charged atmosphere, it is easy to forget the many genuine breakthroughs

However, there are legitimate arguments that the rigorous and internally consistent economic models of rational self-interest—models used implicitly and explicitly by policymakers, central bankers, and regulators to formulate laws, manage leverage, and rein in risk-taking in the economy—have failed us in important ways over the past decade. Even the most sophisticated stochastic dynamic general equilibrium models did not account for the U.S. housing market boom and bust, nor were they rich enough to capture the consequences of securitization, credit default insurance, financial globalization, and the political pressures influencing Fannie Mae and Freddie Mac. The fact that the 2,319-page Dodd-Frank financial reform bill was signed into law on July 21, 2010, six months before the Financial Crisis Inquiry Commission is scheduled to report its findings, and well before economists have developed any consensus on the crisis, underscores the relatively minor scientific role that economics apparently plays in policymaking. Imagine the FDA approving a drug before its clinical trials are concluded, or the FAA adopting new regulations in response to an airplane crash before the NTSB has completed its accident investigation.

But any virtue can become a vice when taken to an extreme. Rather than discarding rationality altogether, a more productive response is to confront the inconsistencies between economic models of behavior and those from other disciplines, and attempt to reconcile them and improve our models in the process. While frustrating, contradictions often present opportunities for developing a deeper understanding of the phenomena in question.

Consider the example of probability matching: an experimenter asks a subject to guess the outcome of a coin toss, where, unknown to the subject, the coin is biased—75% heads and 25% tails—and the experimenter agrees to pay the subject $1 if she guesses correctly, but will expect the subject to pay $1 if she guesses incorrectly. This experiment is then repeated many times with the same subject and coin (and the tosses are statistically independent). After a sufficiently long sample of tosses, it should be possible for the subject to observe that the coin is biased toward heads, at which point she should always guess heads so as to maximize her cumulative expected winnings. However, the vast majority of subjects do not follow this expected-wealth-maximizing strategy; instead, they appear to randomize, guessing heads 75% of the time and tails 25% of the time!
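[Editors' note: the following arithmetic is a quick check implied by the setup just described, not a calculation quoted from the original paper. Under the stated $1 gain / $1 loss payoffs and a 75/25 coin,

    E[\text{always guess heads}] = 0.75(+\$1) + 0.25(-\$1) = +\$0.50 \text{ per toss},
    E[\text{guess heads 75\% of the time}] = 0.625(+\$1) + 0.375(-\$1) = +\$0.25 \text{ per toss},

since a matching strategy guesses correctly with probability 0.75(0.75) + 0.25(0.25) = 0.625. Probability matching therefore forgoes half of the attainable expected winnings, which is what makes the behavior so puzzling from the standpoint of individual rationality.]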

This strange and well-known example of irrationality in human judgment may not be so irrational after all when viewed from the perspective of evolutionary biology (Lo and Brennan, 2009). To see why, consider the hypothetical case of an animal deciding whether to build its nest in a valley or on a plateau. If the weather is sunny, nesting in the valley will provide shade, yielding many offspring, whereas nesting on the plateau provides no cover from the sun, leading to no offspring. The opposite is true if the weather is rainy: the valley floods, hence any offspring will drown in their nests, but nests on the plateau survive, leading to many offspring. Now suppose the probability of sunshine is 75% and the probability of rain is 25%. The "rational" behavior for all individuals to follow is to build their nests in the valley, for this maximizes the expected number of each individual's offspring. But suppose the entire population exhibits such individually optimal behavior—the first time there is rain, the entire population will cease to reproduce, leading to extinction. Similarly, if the entire population behaves in the opposite manner, always choosing the plateau, the first time sunshine occurs, extinction also follows. Using a simple binary choice model, Lo and Brennan (2009) show that the behavior that maximizes the growth of the population is for individuals to randomize their nesting choice, choosing the valley with probability 75% and the plateau with probability 25%. Matching probabilities confers an evolutionary advantage, not for the individual, but for the population as a whole. And since, by definition, the current population consists of the survivors, it will reflect such advantageous behavior disproportionately to the extent that behavior is heritable.

While probability matching is, indeed, irrational from the perspective of maximizing an individual's expected wealth, its evolutionary advantage is clear. Probability matching is likely to be a vestigial evolutionary adaptation that may not increase the chances of survival in the current environment, but nevertheless is still part of our behavioral repertoire. This broader perspective suggests that the economic notion of rationality is not wrong, but simply incomplete—humans usually do maximize their expected wealth but, under certain circumstances, they may engage in other types of "hard-wired" behavior that are far more primitive.
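[Editors' note: the population-growth logic can be illustrated with a short simulation. The sketch below is a stylized stand-in for the Lo–Brennan binary-choice model, not their actual specification: each individual independently nests in the valley with probability f, only valley nests reproduce in a sunny generation and only plateau nests in a rainy one, and offspring per surviving nest are normalized to one, so only the ranking of growth rates across values of f is meaningful.

    import math
    import random

    def avg_log_growth(f_valley, p_sun=0.75, generations=100_000, seed=1):
        """Average log growth factor per generation when each individual nests in the
        valley with probability f_valley; sunny generations favor valley nests, rainy
        generations favor plateau nests, offspring normalized to 1 per surviving nest."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(generations):
            sunny = rng.random() < p_sun
            reproducing_share = f_valley if sunny else 1.0 - f_valley
            if reproducing_share == 0.0:
                return float("-inf")  # one bad season halts reproduction entirely: extinction
            total += math.log(reproducing_share)
        return total / generations

    for f in (1.0, 0.9, 0.75, 0.5):
        print(f"valley probability {f:.2f}: average log growth {avg_log_growth(f):.3f}")

Deterministically choosing the valley (f = 1.0) is wiped out by the first rainy season, while f = 0.75, matching the probability of sunshine, yields the highest long-run growth rate, consistent with the claim that matching probabilities is advantageous for the population rather than for the individual.]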

Brennan and Lo (2009) show that several commonly observed behaviors such as risk aversion, loss aversion, and randomization are adaptive traits that can emerge organically through evolution. The natural follow-on question—one that lies at the heart of the grand challenge question posed above—is why we choose one particular behavior from our repertoire for a given occasion and not another, and how that repertoire changes over time and across circumstances. The answer to this question has obvious consequences for virtually all economic models. Other examples of important questions about economic behavior that fall outside standard economics are:

- How do emotions affect the stability of preferences over time and circumstances?
- What role does memory play in economic decisionmaking?
- What do "theory of mind" experiments imply for strategic behavior?
- Can robust optimal control explain the regulatory challenges of fast-paced innovation?
- Does network analysis provide new insights for systemic risk in the financial system?

By reconciling the inconsistencies and contradictions between disciplines, we can develop a broader and deeper understanding of Homo sapiens. These examples illustrate the value of "consilience", a term re-introduced into the popular lexicon by E. O. Wilson (1998), who attributes its first use to William Whewell's 1840 treatise The Philosophy of the Inductive Sciences, in which Whewell wrote: "The Consilience of Inductions takes place when an Induction, obtained from one class of facts, coincides with an Induction, obtained from another different class. This Consilience is a test of the truth of the Theory in which it occurs."

In comparing the rate of progress in the medical vs. the social sciences, Wilson (1998, p. 182) makes a thought-provoking observation: There is also progress in the social sciences, but it is much slower, and not at all animated by the same information flow and optimistic spirit… The crucial difference between the two domains is consilience: The medical sciences have it and the social sciences do not. Medical scientists build upon a coherent foundation of molecular and cell biology. They pursue elements of health and illness all the way down to the level of biophysical chemistry…

Social scientists by and large spurn the idea of the hierarchical ordering of knowledge that unites and drives the natural sciences. Split into independent cadres, they stress precision in words within their specialty but seldom speak the same technical language from one specialty to the next.

Although economics occupies an enviable position among the social sciences because of its axiomatic consistency and uniformity, Homo economicus is a fiction that can no longer be maintained in light of mounting evidence to the contrary from allied fields in SBE. This is a bitter pill for economists to swallow, but it provides a clear directive for improving the status quo. Developing a complete theory of human behavior that is truly predictive in all contexts will require contributions from and collaborations between many disciplines: economics, psychology, sociology, anthropology, evolutionary biology, ecology, neuroscience, medicine, engineering, and computer science.

For disciplines in which controlled experimentation is possible, consilience may be less critical to progress because inconsistencies can be generated and resolved within the discipline through clever experimental design. But for disciplines such as economics in which controlled experimentation is more challenging, consilience is an essential means for moving the field forward. And even in fields where experiments are routine, consilience can speed up progress dramatically. The revolution in psychology that transformed the field from a loosely organized collection of interesting and suggestive experiments and hypotheses to a bona fide science occurred only within the last three decades, thanks to synergistic advances in neuroscience, computer science, medicine, and even evolutionary biology.

The NSF's SBE Directorate has a unique opportunity to foster consilience in the Social, Behavioral, and Economic sciences by taking up the grand challenge question proposed at the start of this essay. This could be the future of NSF funding: unlike the usual inter-disciplinary grants—which are often as effective as arranged marriages—RFPs centered on particular aspects of human behavior rather than specific disciplines will naturally draw the relevant fields together in productive ways. Beyond issuing new RFPs, the NSF can encourage consilience through other means.

Holding annual conferences at NSF in which principal investigators from different disciplines are invited to come together to share their latest research, as well as their frustrations and open challenges, would be a natural extension of the NSF's activities. Providing "summer camps" for NSF graduate fellowship recipients at the start of their graduate careers, where they are exposed to a broad array of NSF PIs who would be asked to deliver overview lectures about the biggest challenges in their respective disciplines, is another way to "seed" the next generation of scholars. Finally, changing the very review process of NSF grants to be more cross-disciplinary may create greater diversity in the type of research conducted, increasing the likelihood of consilience in the SBE Directorate and across the entire NSF research portfolio.

References

Brennan, T., and A. Lo. 2009. "The Origin of Behavior." To appear in Quarterly Journal of Finance.
Lo, A., and M. Mueller. 2010. "WARNING: Physics Envy May Be Hazardous To Your Wealth." Journal of Investment Management 8: 13–63.
Wilson, E. 1998. Consilience. New York: Alfred A. Knopf.


Language and Interest in the Economy: A White Paper on "Humanomics"

Deirdre N. McCloskey¹

Economics ignores persuasion in the economy, "sweet talk." Sweet talk accounts for a quarter of national income, and so is not mere "cheap talk." Sweet talk is deeply unpredictable, which connects it to the troubled economics of entrepreneurship, discovery, and innovation. The research would direct economics and the numerous other social sciences influenced by economics back towards human meaning in speech—meaning which has even in the most rigorously behaviorist experiments been shown to matter greatly to the outcome. The massive innovation leading to the Great Fact of modern economic growth since 1800 is an important case in point. Some economic historians are beginning to find that material causes of the Great Fact do not work, and that changes in rhetoric such as the Enlightenment or the Bourgeois Revaluation do. A new economic history emerges, using all the evidence for the scientific task: books as much as bonds, entrepreneurial courage and hope as much as managerial prudence and temperance.

A worrying feature of economics as presently constituted is that it ignores language working in the economy. Adam Smith spoke often of "the faculty of speech" and considered meaning, but his followers gradually set them aside. Until the 1930s the setting aside was gentle and non-dogmatic, allowing for occasional intrusions of human meaning such as Keynes on animal spirits or Dennis Robertson on economized love. But in the shadow of 20th-century positivism, and under the influence of Lionel Robbins and Paul Samuelson and Gary Becker and others, the study of the economy was reduced strictly to "behavior" (yet oddly ignoring linguistic behavior). To put it another way, economics has ignored the humanities and related social sciences, such as cultural anthropology, with their studies of human meaning. But what, an economist would ask, of studies by Marschak and Stigler and Akerlof and others on the transmittal of information? Yes: information is linguistically transmitted, and surely one of the main developments in economics since the 1970s has been the introduction of information and signaling. Yet the economics of asymmetric "information" or common "knowledge" over the past 40 years reduces to costs and benefits but bypasses persuasion.

¹ Distinguished Professor of Economics, History, English, and Communication, University of Illinois at Chicago.

But the sort of language that can be treated by routine application of marginal benefit and marginal cost—which is the bed on which all studies of language in the economy have been laid down—is merely the transmittal of information or commands: "I offer $4.15 for a bushel of corn"; "I accept your offer"; "You're fired." The trouble is that a large part of economic talk is not merely informational or commanding but persuasive: "Your price is absurdly high"; "We need to work together if our company is to succeed"; "The new iPhone is lovely"; "I have a brilliant idea for making cooling fans for automobiles, and you should invest in it."

Does it matter? Does persuasive economic talk have economic significance? Yes, as a matter of logic. If language in the economy was merely "cheap talk," as the non-cooperative game theorists put it, then ignoring it would not matter. The chattering character of people in markets and firms and households about their economic affairs would be like left-handedness or red hair: interesting for some purposes doubtless in the Department of English, but irrelevant to the tough, scientific matter of the economy, and its share of economic value would drift towards zero: an economic agent would be no more valuable if she were sweet than if she were a mere pipe for transmitted bids and asks. But that is not the case. One can show on the basis of detailed occupational statistics for the U.S. that about a quarter of income in a modern economy is earned by "sweet talk"—not lies or trickery always, but mainly the honest persuasion that a manager must exercise in a society of free workers, or that a teacher must exercise to persuade her students to read books, or that a lawyer must exercise if a society of laws is to be meaningful. The economy values sweet talk at one quarter of its total income, a gigantic and economically meaningful sum. Formal maximum-utility economics cannot explain the sweet talk.

The research would need to establish the fact beyond doubt, bringing together for example mathematical economists and rhetorical theorists. At one level, sweet talk emerges as crucial to experiments and field studies. It can be treated mathematically by showing that cooperative equilibria (for example) cannot be achieved without a trust created by earnest talk. In a way it is the oldest and most obvious finding of game theory that games have of course always a context of rules and customs and relationships, all of them affected by language. "The bonds of words are too weak," Hobbes declared, "to bridle men's ambition, avarice, anger, and other passions, without the fear of some coercive power." It appears that Hobbes was wrong. Businesses work with trust, as Eleanor Ostrom and her colleagues have shown. Indeed, experimental economics in the past twenty years has shown that allowing experimental subjects to establish relationships through conversation radically changes the degree of cooperation: "Good old trustworthy Max"—not Max U, the maximizer of utility in a Samuelsonian way, who cannot be trusted at all. But the main emphasis in a research that would matter for the future of the social sciences would focus steadily on the facts of the matter, and not chiefly on the abstract theory (the abstract theory can yield any conclusion if permitted to choose any assumptions; the facts constrain the conclusions scientifically).
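[Editors' note: to make the style of the occupational-statistics calculation above concrete, here is a deliberately simplified sketch. The earnings shares and "persuasion weights" are hypothetical placeholders invented for illustration; the actual estimate referred to in the text is built from detailed U.S. occupational data. The arithmetic, however, is the same: weight each occupation's share of earnings by the fraction of its work that consists of persuasion.

    # Hypothetical illustration of the "quarter of income is sweet talk" calculation.
    # Both the earnings shares and the persuasion weights are made-up placeholders.
    earnings_share = {
        "managers and supervisors": 0.15,
        "sales and marketing":      0.08,
        "law, PR, and consulting":  0.04,
        "teachers and professors":  0.06,
        "all other occupations":    0.67,
    }
    persuasion_weight = {
        # fraction of each job that is persuasion rather than command,
        # physical work, or bare information transfer
        "managers and supervisors": 0.75,
        "sales and marketing":      0.75,
        "law, PR, and consulting":  1.00,
        "teachers and professors":  0.50,
        "all other occupations":    0.05,
    }
    sweet_talk_share = sum(earnings_share[k] * persuasion_weight[k] for k in earnings_share)
    print(f"sweet talk share ≈ {sweet_talk_share:.0%} of labor earnings")

With weights in this spirit the share lands in the neighborhood of one quarter, which is the point of the exercise: the estimate is driven by how much ordinary managerial, professional, and commercial work is talk intended to change minds.]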

Maximizing utility is not human meaning, as one can see in mothers and suicide bombers. The framing of bargaining anyway depends on the stories people tell. The "cooperative equilibria" are gigantically important to the success of a modern economy, and the language, the trust, the conversations, the sweet talk all depend on ethical commitments beyond "I'm all right, Jack."

In particular the Austrian economists such as Friedrich Hayek and Israel Kirzner have long recognized the importance of discovery and other human activities beyond maximization, but stop short of grasping the role of language. They point out that real discoveries, such as that a separate condenser makes a steam engine much more efficient or that treating the bourgeoisie with something other than contempt results in enormous economic growth, arise as it were by accident. They cannot be pursued methodically—or else they are known before they are known, a paradox. Once a discovery is made by what Kirzner calls "alertness," it requires sweet talk to be brought to fruition. An idea is merely an idea until it has been brought into the conversation of humankind. The research would show in empirical detail that conversation is the crux of discovery, and especially of the astounding series of discoveries that have made the modern world. And so the modern world has depended on sweet talk. The literature bearing on the matter even in economics alone has become quite large, ranging from Vernon Smith to Herbert Gintis.

The best way to persuade that a multi-disciplinary study of language in the economy and society might matter is to exhibit a possible sub-project, on which a good deal of preliminary work has been done (Mokyr 2010, Goldstone 2009). Thus: what was the conversational context of invention and the sweet talk entailed by innovation in the era of the Industrial Revolution? The Great Fact of an enrichment by a factor of 20 or 30 or much, much more since 1800 is the most astounding economic change since the domestication of plants and animals. Historians, economists, and economic historians have been trying to explain it since Smith, and recently have come to concentrate on it. The Great Fact has usually been explained by material causes, such as expanding trade (itself of great importance) or rising saving rates or the exploitation of the poor. The trouble is that such events happened earlier and in other places, and cannot therefore explain the Industrial Revolution and its astounding continuation. One can show in considerable detail, as in McCloskey 2010 and in the work of the economic historians Joel Mokyr and Eric Jones, the historian Margaret Jacob, the historical sociologist Jack Goldstone, and the anthropologist Alan MacFarlane, that the material causes, alas, do not work. One can also show how attitudes towards the bourgeoisie began to change in the 17th century, first in Holland and then in an England with a new Dutch king and new Dutch institutions. What appears to be needed to explain the Great Fact is a "humanomics," that is, an economics and sociology and history that acknowledges humans as speakers of meaning.

Two things happened 1600-1848, and the more so 1848 to the present. The social position of the Third Estate was raised, and the material methods of production were transformed. Whether the two were connected as mutual cause and effect through language remains to be seen. Or perhaps not: that is the matter for research.

What appears to be the case (say many of the economic historians who have been looking into the question since the 1950s) is that foreign trade, imperial extractions, domestic thrift, legal change, changing psychology, the scientific revolution (not, however, in its direct technological effects, which were postponed largely until the 20th century), and the like do not explain the onset of economic growth in northwestern Europe (while the Rest stagnated). Material causes do not appear to work, and so we must recur to non-material causes. The causes, one might conclude (I repeat: it remains to be seen), were freedom (this last was in essence the late economist Milton Friedman's Thesis) and above all a change in the rhetoric of social conversations in Holland and then in England and Scotland and British North America about bourgeois virtue.

(1.) One hypothesis would go as follows: if the social position of the bourgeoisie had not been raised in the way people spoke of it, aristocrats and their governments would have crushed innovation, by regulation or by tax, as they had always done. And the bourgeois gentilhomme himself would not have turned inventor, but would have continued attempting to rise into the gentle classes. Yet if the material methods of production had not thereby been transformed, the social position of the bourgeoisie would not have continued to rise. One could put it shortly: without spoken honor to the bourgeoisie, no modern economic growth; and without modern economic growth, no spoken honor to the bourgeoisie, and no women and slaves and colonial people and all the others freed by the development of bourgeois virtues (this last is in essence the economist Benjamin Friedman's Thesis). The two Friedmans capture the essence of freed men.

(2.) Another question is the ethical: can a businessperson be ethical without abandoning her business? What then was the role of ethical change in the Bourgeois Revaluation of 1600-1800 in the Industrial Revolution? One might reply that the seven primary virtues of any human life—prudence, temperance, justice, courage, faith, hope, and love—also run a business life. "Bourgeois virtues" would therefore not be a contradiction in terms. On the contrary, capitalism works badly without the virtues—a fact long demonstrated by economic sociologists, and now admitted even by neo-institutional and behavioral economists. The virtues can be nourished in a conversation about the market, and often have been. Businesspeople are people, too. Humanomics to the scientific rescue.

You can see why the neologism "humanomics" is appropriate here: a serious inquiry into the ethical context of the Industrial Revolution (and of development in presently poor countries, too) would require collaboration between the social sciences as behavior and the humanities of philosophy, history, anthropology, and even theology as meaning (as in Robert Nelson's books on economic theology).

(3.) One can ask how an explicitly and persuasively bourgeois ideology emerged after 1700 from a highly aristocratic and Christian Europe, a Europe entirely hostile—as some of our clerisy still are—to the very idea of bourgeois virtues. In 1946 the great student of capitalism, Joseph Schumpeter, declared that "a society is called capitalist if it entrusts its economic process to the guidance of the private businessman" (Encyc. Brit., 1946). Such a society, Schumpeter explained, entails private property, private profit, and private credit. It is the best short definition of that essentially contested concept, "capitalism." Yet what Schumpeter leaves aside in the definition, though his life's work embodied it, is that the society—or at any rate the people who run it—must admire businesspeople. "Entrusting" the economy to businesspeople means, that is, that they must think the bourgeoisie capable of virtue. (In such terms you can see the rockiness of the transition to capitalism in Russia, where agricultural land is still not private, and where private profit is still subject to prosecution by the state—the jailing of billionaires, the cutting down of tall poppies. It's this admiring of the bourgeois virtues that Russia lacks, whether ruled by boyars or tsars or commissars or by secret police.)

(4.) Attributing great historical events to ideas was not popular in professional history for a long time. Men and women of the left were supposed to believe in historical materialism, and many on the right were embarrassed to claim otherwise. A hardnosed calculation of interest was supposed to explain all, every time. But the "dream of objectivity," as the historian Peter Novick called it, 1890-1980, hasn't worked out all that well. Actual interest—as against imagined and often enough fantasized interest—did not cause World War I. Non-slave-holding whites did not constitute most of the Confederate armies for economic reasons. The Pals Brigades did not go over the top at the Somme because it was in their prudent interest to do so. Nor did abolition become a motivating cause because it was good for capitalism. And on and on, back to Achilles and Abraham. We do well to watch for cognitive-moral revolutions, and not simply assume that Matter Rules. A showing that ideas matter is not so unusual nowadays among historians, such as in works by Skinner or Israel. But it is another matter to show that the material base itself is determined by habits of the lip and mind—that conclusion evokes angry words among most people on the economistic side of the social sciences, and often enough from historical materialists in the humanities.

In short, the sub-project proposes to give a big example of the force of language in the economy—its linguistic embeddedness, as the sociologists would put it. The larger point, I repeat, is to demonstrate that in the economy the force of language is not to be ignored. (Or that it is to be ignored: if the research is genuine, the possibility must be lively that the hypothesis turns out to be wrong.) Thus "humanomics."

Ignoring the burden of art and literature and philosophy in thinking about the economy is strangely unscientific: it throws away, under orders from a largely unargued law of method, a good deal of the evidence of our human lives. I do not mean that "findings" are to be handed over from novels and philosophies like canapés at a cocktail party. I mean that the exploration of human meaning from the Greeks and Confucians down to Wittgenstein and Citizen Kane casts light on profane affairs. A human with a balanced set of virtues, beyond the monster of interest focusing on Prudence Only, characterizes our economies, too. And so (the hypothesis goes) economics without meaning is incapable of understanding economic growth, business cycles, or many other of our profane mysteries. The research extends, but also to some degree calls into question, modern economics and the numerous other social sciences from law to sociology now influenced by an exclusively Max U economics.

References

Bowles, Samuel, and Herbert Gintis. Forthcoming. A Cooperative Species: Human Sociality and Its Evolution.
Goldstone, Jack A. 2009. Why Europe? The Rise of the West in World History, 1500–1850. New York: McGraw-Hill.
Mokyr, Joel. 2010. The Enlightened Economy: An Economic History of Britain 1700–1850. New Haven: Yale University Press; London: Penguin Press.

This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

A New Household Panel in the U.S.

Robert A. Moffitt
Krieger-Eisenhower Professor of Economics
Johns Hopkins University

October 13, 2010

The NSF SBE has invited the research community to submit white papers outlining grand challenge questions that are both foundational and transformative. This white paper argues that a new long-term household panel would be foundational and transformative both to economics and to related social sciences. A major investment in new data infrastructure is needed to provide the capability for new research and to inject new energy into social science research on economic and social dynamics. Such an investment would have enormous payoffs to the research community, including educators and students, as well as to policy-makers. In the remainder of this paper, these points are elaborated.

(1) The important questions are the major ones concerning economic and social dynamics in the U.S. What are the causes of economic and social disadvantage, where in the life course do their origins lie, and what policies might US society follow to address those challenges? What are the contributors to disadvantage during childhood, and how does childhood disadvantage affect later life outcomes? How important are schools, neighborhoods, family, and other social groupings to the evolution of economic and social wellbeing over the life course? How do US workers' earnings and labor market success or lack of success evolve over their lifetimes, how has this changed, what are the reasons for these trends, and what are their implications for US society? How have the dynamics of cohabitation, marriage, child-bearing, residential mobility, and divorce and remarriage changed over time? Economics, sociology, psychology, and related disciplines have long studied these questions. It is by now well understood that the dynamics of population, education, the labor market, health, and other key features of US society cannot be properly understood without data that follow the same individuals over time.

(2) Current understanding both of trends and of their causes and implications is poor, and one of the major reasons lies in limitations in the data infrastructure in the US for studying these issues. The U.S. has long been a leader in the world in the development of household panel surveys. The NSF-supported Panel Study of Income Dynamics (PSID), begun in 1968, was the first of the modern panels to follow a representative sample of the population over time. The major data set for studying long-term economic and social dynamics in the population as a whole is the Michigan PSID, which, while in some sense a national treasure because of its unique ability to study intragenerational and intergenerational dynamics over a 40-year period, has limitations which will increasingly prevent it from serving the necessary role in the future.

The research accomplishments of the PSID are virtually incalculable. The research output from the PSID is enormous, and there are certain areas of research (the long-term dynamics of poverty, intergenerational dynamics) which are essentially defined by the PSID, for it is the only extant panel which can study those issues.

However, despite its accomplishments, the PSID is suffering from problems of its age. It has had major cumulative attrition over time, attrition which has affected the representativeness of second- and third-generation PSID respondents in ways that are difficult to ascertain. While a number of studies have shown that the PSID has roughly maintained its cross-sectional representativeness, there are serious questions about its representativeness of dynamic patterns, and there is evidence that it is increasingly composed of individuals with more stable life trajectories. Weights in the PSID do not necessarily restore its representativeness. The original sample of the PSID was also quite small, only 5,000 families, and less than that if the low-income oversample (which has since been largely dropped) is omitted. It necessarily omits immigrants to the US since 1968 (an attempt to bring them in was unsuccessful). Also, it is to some extent locked into its history by using what are now regarded as outdated methods of data collection in its first few years (e.g., not attempting to recontact attritors from earlier rounds).
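[Editors' note: a simple compounding calculation shows why even modest wave-on-wave attrition of the kind discussed above becomes a first-order problem in a panel of this age. The per-wave loss rates below are hypothetical round numbers used only for illustration, not the PSID's actual response rates.

    # Cumulative retention after 40 annual waves for several hypothetical
    # per-wave attrition rates (illustrative only).
    for per_wave_loss in (0.02, 0.03, 0.04):
        retained = (1 - per_wave_loss) ** 40
        print(f"{per_wave_loss:.0%} lost per wave -> {retained:.0%} of the original sample after 40 waves")

Losing two to four percent of respondents per wave translates into losing roughly half to four-fifths of the original sample over four decades, and if the households who leave differ systematically from those who stay, the survivors overrepresent stable life trajectories in exactly the way described above.]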
At the same time, other countries are starting new household panels that completely dominate the PSID or any other general-population survey in the U.S. The most prominent example is the UK panel USoc (Understanding Society), a new 20,000-household panel with an ethnic minority oversample and an additional 2,000-household panel for methodological experimentation (the "Innovation Panel"), plus capability for linkage to administrative data sets and collection of biomeasures. All members of the household are interviewed, with a self-completion instrument for 10-15 year olds, and interviewing is annual. Fieldwork started in January 2009. Germany is copying the USoc design with a similar new survey with similar sample sizes. Canada is in the advanced stages of planning a new, large household panel. Other countries in Europe are discussing adopting these new, large, innovative panels.

While the NSF should continue its vital support for the PSID in the next and future renewal rounds, since it will continue to be the only data set in the country capable of examining medium-term and long-term economic and social dynamics, in the long run it should not be the only US national household long panel. There is no other panel in the US to fulfill this function. The closest substitute is the Survey of Income and Program Participation (SIPP), the representative Census Bureau survey. However, the SIPP is focused on short-term dynamics and no panel has lasted more than a few years, so the long-term dynamics that have been possible to study with the PSID cannot be examined with the SIPP. The SIPP has also never been interested in some of the innovative survey additions like biomeasures, which would probably not be allowed by the Census Bureau. There are several panels of the aged, most prominently the Health and Retirement Survey, but these necessarily are not useful for studying the non-aged population. The Department of Labor has conducted two panels of youth, one started in 1979 and one started in 1997, but a third (originally, the idea was for a new one every ten years) has not been started. These surveys are focused on labor market issues and are aimed only at examining specific birth cohorts.

With these developments, the US is in danger of falling behind other countries in its ability to analyze important economic and social issues for the society as a whole. The piecemeal approach in the US, where individual agencies fund special-population panels, cannot provide the same level of research potential as those being developed in other countries. This is the reason for the critical importance of a new U.S. panel.

The primary obstacle to starting a new household panel of such a large size has always been cost. A 20,000-household survey would cost, by some estimates, somewhere in the neighborhood of $20 million to $30 million on an annual basis. If the panel were followed for a sufficient number of years (say, long enough to generate a longer panel than the SIPP), the cumulative cost would be high indeed. It may be that the US does not have the resources, or the political will, to support this cost even though several other countries are able to do so. Given this, other modes of data collection must be considered. One alternative widely discussed but never attempted on a very large scale is internet surveys. However, research on internet surveys is still in its infancy, and we know relatively little about whether participating households are representative or about what nonresponse rates would look like over a longer-term panel. Such surveys have difficulty being representative and tend to underrepresent lower-income households, and the underrepresentation cannot be satisfactorily addressed simply by reweighting. Innovative alternatives, such as offering to hook up household TVs to allow households without computers to participate in the survey, are one means of reaching more households in the country. Another possibility is a mixed-mode survey which combines traditional in-person or telephone household surveys with internet surveys, with the mix designed to optimize the relative advantages of both and to generate a more representative sample. The different modes of collection might generate mode effects, but this has, again, not been studied.

The appropriate course of action at this point would be for NSF to sponsor a working group, a conference, or a series of working groups and conferences of experts to discuss the possibilities for a new household survey, alternative designs, and the costs of each. Experts at traditional surveys as well as experts in new modes of data collection should be involved, as well as economists and other social scientists who would be using the data for applications. Experts from survey firms who are capable of generating realistic cost estimates should also be brought into the discussion.

The NSF has invited the research community to submit ideas that would "unlock a new cycle of research." Nothing would unlock a new cycle of research on the dynamics of economic and social behavior in the U.S. more than a fresh household panel adequate to investigate the key social issues. The number of new theories and ideas for exploring social dynamics has far outrun the available data, and no amount of new ideas for research is capable of advancing social knowledge if there are not the data to test them on. With the myriad socioeconomic problems facing the US at the current time, it is essential that we have the research basis to address those problems.

This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Economics, Climate, and Values: An Integrated Approach

Julie A. Nelson
Department of Economics, University of Massachusetts Boston

Evelyn Fox Keller
Program in Science, Technology and Society, Mass. Inst. of Technology

Abstract

How can we integrate the role of values and ethics in economic analysis of climate change without sacrificing the positive aspirations of that science? Given the urgency of the measures required by climate change, economic analysis has never been as important as it is today. And given the necessity of value judgments in economic analyses of policy options, the tension between fact and value has never been more conspicuous. But while significant strides have recently been made in the understanding of both the inadequacy and impracticality of a fact/value dichotomy in scientific research, many in economics seem to continue to adhere to outmoded (and now clearly inappropriate) images of science. The net effect has been to undermine the usefulness of economic advice to policy makers. The ideal of objectivity to which economists aspire needs to be reframed and broadened in ways that take advantage of new resources from the philosophy of science, environmental philosophy, and other social sciences. Ultimately, changes in the education of young economists, as well as in patterns of support for practicing economists, will be necessary to effect a shift to an ideal of objectivity in which the role of values can be properly integrated.

1. The Need for a New Understanding of the Role of Values in Climate Economics

How can we integrate the role of values and ethics in economic analysis of climate change without sacrificing the positive aspirations of that science? It is not hyperbole to say that this generation's major challenge is climate change. What we do about it will have larger consequences for future human well-being—or future human suffering—than our actions on any other issue. As the findings of climate science gather increasing scientific support, attention shifts to the question of how we can effect the transformations in our economies required by the expected changes in climate. For this, economic analysis is indispensable. Unfortunately, the economics profession in the United States is in large part failing to meet its responsibilities in this area. (One might ask about the contrast between Europe and the US.)

Trapped in an outmoded view of science as an enterprise that must eschew discussion of values in order to preserve detachment, the analyses of our most prominent economists lend themselves to a critical undermining of responsible policy responses. This was acutely apparent in the response of economists such as William Nordhaus and Gary Yohe to the British Treasury's Stern Review on the Economics of Climate Change. The Stern Review's choice of a near-zero discount rate was, they claimed, evidence of unjustified moralizing. Nordhaus's rationale for using a market rate of interest as a discount rate, for example, is based on the intuition that such a rate might in principle be observable by anyone. The fact that there actually is no such single market rate on which economists agree, however, obviously weakens Nordhaus's argument, and, as many have pointed out, such analyses ignore the morally preposterous implications of their results. Indeed, views such as Nordhaus's have given ammunition to those who argue that economic analysis is worse than useless and should be entirely abandoned in favor of exercises in, for example, visioning and participatory methods. Both sides of that debate, however, remain entrapped by the same fact-value dichotomy, claiming the high status of science and rationality for their own work.

Significant strides have recently been made in the understanding of both the inadequacy and impracticality of a fact/value dichotomy in scientific research, but many in economics seem to continue to adhere to outmoded (and now clearly inappropriate) images of science. The belief that mathematical formalization combined with rigorous empirics automatically provides value-free results remains a foundational assumption of the contemporary mainstream discipline.¹ It is often supposed that any alternative to such methodology-based objectivity implies a rejection of science and a slide into relativism and unfounded emotion-based claims. Yet this way of attempting to achieve unbiased research actually leads to a pronounced bias—a bias in favor of the status quo: evaluation of most meaningful changes requires the sort of explicit ethical reflection that is being avoided.

¹ Such techniques give one only the assurance that someone else starting from the same assumptions and data will reach the same conclusions.
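[Editors' note: the stakes in the discount-rate dispute are easy to see with a back-of-the-envelope present-value calculation. The two rates below are only representative of the near-zero-rate position associated with the Stern Review and the market-rate position associated with Nordhaus-style models; they are not taken from either study.

    # Present value today of $1 trillion of climate damages occurring 100 years
    # from now, under two illustrative discount rates.
    damages_in_100_years = 1_000_000_000_000
    for r in (0.014, 0.055):
        pv = damages_in_100_years / (1 + r) ** 100
        print(f"discount rate {r:.1%}: present value ≈ ${pv / 1e9:,.0f} billion")

Roughly $250 billion versus roughly $5 billion: a fifty-fold difference in how much it is "worth" spending today to avoid the same future harm, driven entirely by a parameter choice that embodies an ethical judgment about the weight given to future generations.]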

There is, however, another solution, which involves recognizing the inescapable intertwining of fact and value while continuing the systematic search for reliable knowledge. Amartya Sen has called this "transpositional" objectivity. This (in fact more exacting) standard of objectivity requires that the viewpoints and values underlying the analysis be brought out into the open and subjected to scrutiny. Adherence to methodological strictures alone cannot assure this, particularly when the group of peers is constrained to an overly small group of like-minded scholars. Because viewpoints may be shaped by factors such as nationality, gender, race, class, status, generation, and habits peculiar to particular professions, this requires being able and willing to articulate the reasoning behind one's research in ways that can be understood by a larger community than the one composed of one's closest peers, and an openness to dialog with such larger communities. In the case of climate economics, the perspective of future generations, while it cannot actually be brought to the table, cannot be neglected.

2. How to Advance

Re-evaluating the role of ethics in economics challenges assumptions that are deep-seated in the mainstream of U.S. economics. Accordingly, improving economic analysis of climate change will require a multi-pronged effort.

One prong should seek to transform and revitalize economics education, from the undergraduate (or even K-12) level on upwards. Both environmental concerns and questions of ethics are currently largely neglected in the core curriculum: students are generally taught that resource-blind growth models that assume complete substitutability among different kinds of capital reflect simply "the way the world works," while explicit discussions of values are considered "soft" and "not economics." This is further amplified by systems of peer review in economics journals. Building the capacity for graduate students to think competently about the relation of ethical questions to their work would require special interventions such as summer institutes and innovative teaching materials, since in many cases current faculty are largely unprepared to take this on. The rising generation, given their energy and larger stake in the outcomes of climate change policy, should be a key part of this transformation. A more ethically grounded approach may also appeal to some groups who currently may be disproportionately disaffected with economics, including women and minorities.

The climate change questions are of such urgency, however, that we cannot wait for the outflow from such a new pipeline of training. Creating an environment in which presently practicing economists could receive support, rather than censure, for ethically sophisticated and sustainability-promoting work must hence be another priority. The NSF could intervene in an important way in these professional systems by examining its own funding priorities. To the extent that review boards that make decisions about funding and promotion remain dominated by those who confuse adherence to methodological conventions with objectivity, projects that hide important value judgments under a veneer of technical sophistication will continue to receive funding. Funding individuals and institutes whose work exemplifies a healthy consideration of both facts and values, and promises productive work on transformative economic change, could help shift the mainstream from its current course.

Actual dollar awards would help persuade economists through extrinsic incentives, and the imprimatur of NSF approval of "strongly objective" research would reinforce investigators' intrinsic motivation to act in accord with our important values.

3. Relevant Research

Fortunately, there are rich resources that can inform a better understanding of the relationship of ethics and knowledge. The fact/value dichotomy has been well explored—and exploded—by economist Amartya Sen and a number of those who work in his wake, as well as by philosophers Martha Nussbaum and Hilary Putnam (2003). Philosophers of science Evelyn Fox Keller (author) and Philip Kitcher give rich and convincing arguments on the subject, along with discussions from the philosophy of science. A number of critical or heterodox groups within economics, including Institutionalist, socio-, feminist, ecological, and evolutionary economists, have also developed analyses which challenge the fact/value distinction and pioneer innovative methodologies. Recent research in behavioral economics, cognitive psychology, and social psychology has also greatly advanced our understandings of topics including human motivation toward ethical action; some of the results from these disciplines could be drawn on to enrich the discussion. Oddly, much of economics still retains the assumption that we economic investigators are ourselves untouched by emotional motivation, cognitive bias, or social mores.

Within climate economics, there are also economists who are not afraid to make explicit their valuing of the future, and a number of economists are pursuing economic analysis with an explicit goal of valuing human well-being. These include Frank Ackerman (2009), Paul Baer, Stephen DeCanio, Richard Howarth, Julie Nelson (author, 2008), Kristen Sheeran, and Elizabeth Stanton. Baer's work, for example, includes a proposal for "greenhouse development rights" which looks at equity issues affecting more- and less-affluent groups around the globe. As another example, one of Stanton's essays examines how regionally disaggregated Integrated Assessment Models are slanted to preserve rich-world privilege under the cover of merely-technical-seeming "Negishi weights." Also worth mention, while less prominent than some of the other voices in U.S. debates, are the works of environmental philosophers including Stephen Gardiner, Dale Jamieson, and Karen Warren, for their potential contribution to better economics.

It is important to recognize that critiques of mainstream economics are widespread, and discrimination must be exercised. Not every contribution by critical, heterodox, philosophical, or other thinkers outside the economics mainstream is, in our judgment, helpful. Some critics of mainstream economics seem merely to exchange an obsession with detachment, quantification, and technological progress for an equally one-sided emphasis on, for example, relationships, social mores, qualitative work, and pristine wild environments.

4. Other Questions

Given the singular importance of climate change as an issue that must be faced by our generation, the urgency of the need for effective climate policy, and the damage that is being done by ethically irresponsible research, we hope that the SBE program will keep two questions in mind when evaluating the results of this request for white papers.

First, do the suggestions deal with issues of importance for the well-being of humans (and other species)? While a great many projects may be intellectually fascinating, it is a fundamental economic insight that devoting resources to any one project generally involves the opportunity cost of forgoing others. Choices have to be made, and some scholars may argue for a concentration on "basic" (meaning the most highly generalizable) science. To be clear, this is not what we are advocating. We believe that consideration of the well-being of future generations demands that the projects that hold hope for the mitigation of climate change be given priority, so that we may try, to the extent still possible, to create a livable environment for the generations to come.

Second, do the suggestions deal adequately with the issue of ethics and knowledge? Neither projects that pretend that ethical issues are irrelevant to research, nor projects that propose ethical reflection detached from theory, empirics, and policy, are likely to be helpful. We are urging improvements upon, rather than rejection of, existing modes of analysis. The issue of values is therefore at the very center of what science in society is about.

We applaud the SBE for launching this request for input on "grand challenge questions that are both foundational and transformative." The need to transform climate economics is a "next-generation research" challenge in more ways than one: it requires the creation of a new generation of economic analysis. We believe the NSF is well-positioned to help economists and other social scientists address the question we raise in this essay: How can we integrate the role of values and ethics in economic analysis of climate change without sacrificing the positive aspirations of that science?

References

Ackerman, Frank. 2009. Can We Afford the Future? The Economics of a Warming World. New York: Zed Books.
Nelson, Julie A. 2008. "Economists, Value Judgments, and Climate Change." Ecological Economics 65(3): 441-447.
Putnam, Hilary. 2003. "For Ethics and Economics Without the Dichotomies." Review of Political Economy 15(3): 395-412.

This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Some Foundational and Transformative Grand Challenges for the Social and Behavioral Sciences: The Problem of Global Public Goods¹

William Nordhaus

October 3, 2010

In a letter to colleagues, Myron Gutmann, Assistant Director of the National Science Foundation, invited people "to contribute white papers outlining grand challenge questions that are both foundational and transformative." This paper will address some foundational issues that cross the boundaries of many social and natural sciences: the issue of how to deal with global public goods.

The problem of global public goods

Many critical issues facing humanity today – global warming and ozone depletion, banking crises and cyber warfare, oil-price shocks and nuclear proliferation – are ones whose effects are global and resist the control of both markets and national governments. These are examples of global public goods, which are goods whose impacts are indivisibly spread around the entire globe. They are not new phenomena, but they are becoming more important because of rapid technological change. Global public goods differ from other economic issues because there is no workable mechanism for resolving these issues efficiently and effectively. Markets can work wonders, but they routinely fail to solve the problems caused by global public goods. If a terrible storm destroys a significant fraction of America's corn crop, the reaction of prices and farmers will help equilibrate needs and availabilities. If scientists discover the lethal character of lead in the American air and soil, the government is likely, eventually and often haltingly, to undertake to issue the necessary regulations to reduce lead in gasoline and paint. But if problems arise for global public goods, such as global warming or nuclear proliferation, there is no market or government mechanism that contains both the political means and the appropriate incentives to implement an efficient outcome.

¹ The author is Sterling Professor of Economics at Yale University. He is a member of the National Academy of Sciences and the American Academy of Arts and Sciences, and a member of the National Academy of Sciences' Division of the Behavioral and Social Sciences and Education Committee. I am grateful for the advice and comments of several readers of earlier drafts. Email: william.nordhaus@yale.edu. This work is licensed under the Creative Commons Attribution-NoDerivs 3.0 Unported License.

Global Public Goods and the Westphalian Dilemma

In theory, global public goods are well understood as the polar case of a Samuelsonian public good. In practice, they raise the most intractable issues for real-world resolution, primarily because of what has been called the Westphalian dilemma. Whenever we encounter a social, economic, or political problem, one of the first questions raised concerns the appropriate organizational level at which the problem should be addressed (it is called fiscal federalism in public finance). We expect households to deal with children's homework assignments and take out the trash; we expect local or regional governments to organize schools and collect the trash; we expect national governments to defend their borders and manage their currencies. National governments have the actual power and legal authority to establish laws and institutions within their territories; under international law as it has evolved in the West and then the world, this includes the right to internalize externalities within their boundaries and provide for national public goods. By contrast, for the case of global public goods, there exist today no workable market or governmental mechanisms that are appropriate for the problems.

Participants of the Treaty of Westphalia recognized in 1648 the Staatensystem, or system of sovereign states, each of which was a political sovereign with power to govern its territory. As the system of sovereign states evolved, it led to the current system of international law under which international obligations may be imposed on a sovereign state only with its consent. With a few exceptions, the Westphalian system leads to severe problems for global public goods. Because nations are deeply attached to their sovereignty, there are no mechanisms by which global citizens can make binding collective decisions to slow global warming, to form a world army to combat dangerous tyrants, to cure overfishing of blue-fin tuna, or to rein in dangerous nuclear technologies. Even for life and death issues such as nuclear weapons, there is at present no adequate legal mechanism by which disinterested majorities, or even supermajorities, can coerce reluctant free-riding countries into mechanisms that provide for global public goods.

Not only does each nation face a powerful incentive to free-ride off the public-good efforts of other nations, but each is likely to perceive the costs and benefits of cooperation through a biased cognitive lens that justifies free-riding. One answer to the political vacuum is to create international institutions. Such organizations generally work by unanimity, have few provisions that are binding on recalcitrant countries, and generally apply only to countries which have agreed to participate. The de facto requirement for unanimity or broad consensus is in reality a recipe for inaction. Particularly where there are strong asymmetries in the costs and benefits (as is the case for nuclear non-proliferation or global warming), the requirement of reaching consensus means that it is extremely difficult to reach universal and binding international agreements. If a state like North Korea declines to participate in the Non-Proliferation Treaty, there is no provision for forcing its adherence.
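[Editors' note: the free-riding incentive described above can be captured in a toy N-country abatement game. The payoff numbers are hypothetical, chosen only so that each country's private benefit from one unit of abatement (b) falls short of its own cost of abating (c), while the worldwide benefit N*b exceeds c.

    # Toy global public-goods game: each of N countries chooses whether to abate.
    # Abating costs c; every unit of abatement, by anyone, benefits every country by b.
    N, b, c = 20, 1.0, 5.0   # hypothetical values with b < c < N * b

    def payoff(abates: bool, others_abating: int) -> float:
        total_abatement = others_abating + (1 if abates else 0)
        return b * total_abatement - (c if abates else 0.0)

    for others in (0, 10, N - 1):
        gain_from_abating = payoff(True, others) - payoff(False, others)
        print(f"{others} others abating: abating changes my payoff by {gain_from_abating:+.1f}")
    print(f"all abate: {payoff(True, N - 1):+.1f} each; none abate: {payoff(False, 0):+.1f} each")

Whatever the others do, abating lowers a country's own payoff by c minus b, so abstaining is a dominant strategy; yet universal abatement (payoff N*b - c per country) leaves every country better off than universal abstention. Without a mechanism that can bind or compensate reluctant participants, the Westphalian equilibrium is the bad one.]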

I have outlined the difficulties that are at the intersection of game theory. ecologists. slavery. One critical task. or weakest-link. and will become increasingly important in the years ahead. resides in the power of the U. But the exceptions are limited and do not cover many critical areas. international relations. the local expertise involves climate scientists.Korea declines to participate in the Non-Proliferation Treaty. It should be emphasized that the nature of the syndromes may differ according to whether they are benign or harmful. such as whether they are additive. and the like. then. is to explore the perverse outcomes as well as possible mechanisms involved in addressing global public goods. The rules governing international trade have evolved toward multinational decision-making. and associated social sciences is to devise mechanisms that overcome the bias toward the status quo and the voluntary nature of current international law in life-threatening issues. genocide. Security Council. In the environmental arena. and cyber warfare. on the structure of the production technologies. piracy.N. The Westphalian system is an increasingly dangerous vestige of a different world. although these require consent of the five permanent Members. Some are rules such as prohibitions on torture. our international institutions and analyses must come to grips with the fact that national sovereignty often cannot deal effectively with critical global public goods. Grand Challenges for the Social Sciences October 3. Just as economists recognize that consumer sovereignty does not apply to children. particularly important for national security.) To take the example of global warming. Challenges in Dealing with Global Public Goods Dealing effectively with global public goods has two intellectual grand challenges. and on the scale of the problem. The grand challenge for economics. But the local expertise is not sufficient to deal with global public goods. political science. which are both critical and complementary. The central proposition of this White Paper is that global public goods are becoming more important. economics. and international law. there is no provision for forcing its adherence. Each of the problems mentioned in this White Paper (global warming. treaties to reduce ozonedepleting chemicals have been an important contribution. overfishing. and racial discrimination. The first challenge is the analytical one. local denotes intellectual as well as geographical proximity. political science. There exists a substantial core of work on cooperative games and public-goods mechanisms. marine biologists. on the distribution of gains and losses. 2010 241 . It is also __________________________________________________________________________________ Nordhaus. energy specialists. This involves understanding the behavioral aspects that underlie the problems associated with global public goods. as examples) has a specific structure and a “local” expertise of knowledge. The second challenge consists of actual problems that pose dangers to human societies. There are important examples where the international system has responded to this set of problems. best-shot. (In this context. Another area.

The second challenge consists of the actual problems that pose dangers to human societies. Each of the problems mentioned in this White Paper (global warming, overfishing, and cyber warfare, as examples) has a specific structure and a "local" expertise of knowledge. (In this context, local denotes intellectual as well as geographical proximity.) To take the example of global warming, the local expertise involves climate scientists, ecologists, marine biologists, energy specialists, and the like. But the local expertise is not sufficient to deal with global public goods. It is also necessary to recognize the analytical issues involved (the nature of the externalities and the mechanisms for reaching solutions), and this is where the complementarity between the analysis in the first challenge and the local knowledge in the second challenge arises.

However, those who have studied the history of international agreements will recognize that it is insufficient to tell countries that a terrible future awaits them if they do not act, particularly when the national costs of action are large and the national costs of inaction appear small. It will be necessary to design systems in which affirmative national steps to contribute to global action serve a country's own national self-interests.

Strategy for Research for the National Science Foundation and other Agencies

Finally, I address the strategy for the NSF and other agencies in addressing the programmatic study of global public goods. I would suggest two complementary approaches.

First, for the NSF, I do not recommend establishing a special program to deal with such issues. It is generally fruitless to attempt to establish programs in the social sciences to address specific challenges that spring up from time to time. Rather, I suggest that each program within the Directorate for the Social, Behavioral, and Economic Sciences make a special effort to solicit and recognize research that is targeted to research on aspects of global public goods. This goal might be accomplished by establishing matching funds to provide incentives to programs. This would also encourage cross-disciplinary research programs (such as economics and international relations, or political science and behavioral psychology) that address specific issues that arise in the context of global public goods. The definition of the analytical areas to be supported, as well as the specific problems that need examination, should be determined by a panel specifically asked to delineate the issues.

A second approach transcends the boundaries of the social sciences and includes the study of the actual problems raised by global public goods. The federal government directly and indirectly supports a wide variety of research programs on the substantive issues discussed here. Indeed, terrorism, climate change, security issues, environmental research, and public health are extensively studied in different parts of the federal government. However, these problems are often viewed in isolation as technical, scientific, or security issues. In fact, they are just as much social and political issues. To return to the example of global warming, research and policy have sometimes foundered because they did not incorporate the relevant social-science insights from the very conception. It is essential to have a mechanism by which social-scientific analyses can be included in such research programs and for social scientists to be at the table when the scope of the problems and the research programs are defined. So the grandest challenge of all is to ensure that research on global problems be seen in the social as well as the technical context when the substantive problems are considered.

What are the stakes? I conclude with the warning from Rockström et al. on global environmental issues:

    Human activities increasingly influence the Earth's climate and ecosystems. The Earth has entered a new epoch, the Anthropocene, where humans constitute the dominant driver of change to the Earth System. The exponential growth of human activities is raising concern that further pressure on the Earth System could destabilize critical biophysical systems and trigger abrupt or irreversible environmental changes that would be deleterious or even catastrophic for human well-being. This is a profound dilemma because the predominant paradigm of social and economic development remains largely oblivious to the risk of human-induced environmental disasters at continental to planetary scales.

While this warning is only a hypothesis at this stage, it does indicate the stakes involved in the grand challenge of finding solutions for global public goods.

References

Barrett, S. Environment & Statecraft. Oxford University Press, 2003.
Nordhaus, W. Managing the Global Commons. MIT Press, 1994.
Rockström, Johan, et al. Ecology and Society, 2009.
Samuelson, P. ReStat, 1954: 387-389.


Complexity in Social, Political, and Economic Systems

Scott E. Page
University of Michigan-Ann Arbor
Santa Fe Institute

We live in a time of rising complexity, both in the internal workings of our social, economic, and political systems and in the outcomes that those systems produce. Increasing complexity has implications for social science: it hinders our ability to predict and explain and to prevent large deleterious events. Harnessing complexity will require several changes: we must develop practical measures of social complexity that we can use to evaluate systems; we must see variation and diversity as not just noise around the mean, but as sources of innovation and robustness; we must support methodologies like agent-based models that are better suited to capture complexity; and, finally, we must learn how to identify combinations of interventions that improve systems. To make headway on the problems that animate social and behavioral scientists: economic inequality, achievement gaps, health disparities, segregation, terrorism, climate change, and polarization among voters, we must acknowledge their complexity through interdisciplinary teams. These changes will improve our ability to predict outcomes, design institutions, identify effective policy changes, and, ultimately, to transform society.

Introduction

Confronting and harnessing complexity will be among the greatest challenges facing social scientists over the coming decades. No mere metaphor, complexity has been formally defined in dozens of ways. What, then, is complexity? Complexity can refer to either the attributes of a system or to the outputs a system produces: the social life of a city can be characterized as complex because it has diverse actors whose behaviors are interdependent, as can prices in the stock market, a non-stationary time series that features unpredictable booms and busts. Some characterize output complexity as lying between ordered and random, others as being difficult to explain, to describe, or to predict.

According to any of these many definitions, the outcomes of our social, political, and economic systems have become more complex. So too have the inner workings of those systems. We're more interconnected and more adaptive than ever before. Technology has been a major cause. We are now much less geographically limited in our friendships, and lags in information have all but disappeared. For example, just a few decades ago businesses received quarterly inventory updates; now it is said that when you purchase milk, Wal-Mart phones the cow.

To the extent that rising complexity means a lack of predictability, it limits the efficacy of social science. How do we predict the unpredictable? To the extent that it means incomprehensibility, it means that we have little chance of designing effective policies. And, to the extent that it implies more large events, it has enormous implications for society. We need only consider the damage wrought by the home mortgage crises and the resulting recession.

Take just one example, the problem of rising obesity. Social and behavioral science research reveals scores of causes ranging from the economic (the low price of corn syrup infused Big Gulps) to the genetic (the tendency for some people to store fat). Individually, these causes have small magnitude; they explain little of the variation. We also know that collectively, these scores of factors have a large effect, and they act on each person differently. We therefore expect that individual policy interventions, such as taxing sugary drinks, will have only modest effects. How then do we design interventions? How can we pull levers in combination to reverse the trend?

Implications of Increasing Complexity

We can see the increasing complexity as creating problems, as making it hard for us to predict and design, and as making us more susceptible to large, deleterious events. We can also see it as an opportunity. The opportunity derives from the potential for complex systems to produce emergent functionalities, such as the consciousness and cognition that emerge from interacting neurons and synapses (Holland 1999).

In emergence, the whole not just exceeds but transcends the parts. The physicist Phillip Anderson famously commented that "more is different." Yet, human society has only begun to learn to harness the potential of more.

Current social science models cannot help us harness complexity because, for the most part, they rely on an equilibrium paradigm. The relevance of complexity does not deny the value of equilibrium models. Equilibrium may well remain at the core of our disciplines. However, even the most casual observer recognizes that most markets, political systems, and social systems do not sit at rest but are constantly in flux. Changes in outcomes are seen as movements in equilibria and not as natural progressions in a dynamic process.

To account for this incommensurability between our models and reality, social scientists add in randomness in the form of shocks or uncertainty. Most, though not all, equilibrium models that toss in noise see the internal complexity of systems as disorganized. As Warren Weaver pointed out over sixty years ago, disorganized complexity cancels out, so it cannot add up to more than the parts. The economy and other social systems contain organized complexity. Hence, the whole can be more than its parts. Yet, the large unexpected outcomes produced in complex systems are anything but random. For this reason, complex systems scholars often refer to social outcomes as generated from the bottom up; that is, the term self-organization has become widespread within complexity research. Self-organized systems can produce cooperative, robust outcomes, but they can also spiral into chaos. We need to understand how to encourage the former and guard against the latter.

How Social Science Must Change to Include Complexity

The challenge and the opportunity for social science are to build in the robustness necessary to limit the damage of large events and to harness complexity to produce better social outcomes. To accomplish these tasks requires at least four changes in practice.

First, we must advance our methodologies for measuring and categorizing the complexity of social processes. At present, we make little or no effort to measure and categorize the complexity of social processes. How complex is our welfare system or the international financial system? How complex is the U.S. tax code or our legal system? Why should we care about these questions? Organizational theorists have long claimed that if you can measure it, you can manage it. Managing the complexity of systems may be just as important as working to maintain their efficiency.
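As a purely editorial sketch of what one crude starting point for such a metric might look like (the white paper does not propose a specific measure): the block entropy of a coded series of system outcomes is easy to compute and distinguishes ordered from random behavior. The binary up/down coding and the block length below are assumptions made only for the illustration.

```python
import random
from collections import Counter
from math import log2

def block_entropy(symbols, k=3):
    """Shannon entropy (in bits) of overlapping length-k blocks of a symbol sequence."""
    blocks = [tuple(symbols[i:i + k]) for i in range(len(symbols) - k + 1)]
    counts = Counter(blocks)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Toy example: code weekly movements of some social indicator as up (1) or down (0).
random.seed(0)
periodic = [0, 1] * 50                              # pure order: low block entropy
noise = [random.randint(0, 1) for _ in range(100)]  # pure randomness: near-maximal block entropy

print(round(block_entropy(periodic), 2), round(block_entropy(noise), 2))
```

Entropy by itself rewards randomness rather than structure, so it is only a building block; the point of the sketch is that computable, comparable metrics of this kind already exist and could be adapted to social processes.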

As mentioned, physical and computational measures of complexity exist in abundance. These can provide a starting point for creating social complexity metrics, but they need refinement for the simple reason that electrons don't think; it's relatively easy to understand how their behaviors aggregate. People, on the other hand, do think. We base our behaviors on mental models, information, belief systems, cultural norms, psychological predispositions, and passion. We can also copy others whom we perceive as being successful. This last observation, that we often mimic others, implies a positive feedback and a close link between social and evolutionary systems. Positive feedbacks along with interdependencies are a major driver of large events. Hence, social and evolutionary systems may be more prone to fluctuations than physical systems.

In addition, once we know the complexity of a system, we have some idea about how predictable it is and how likely it will produce large unexpected events. We can even consider complexity as a policy consideration in and of itself. We might even ask whether a new policy will make a system more complex, and, if so, whether or not the benefits of the policy are worth the potential costs of the complexity.

Second, we must take a generative perspective and see social outcomes as produced by purposive actors responding to incentives. A large part of that process of taking a generative perspective will be rethinking variation and diversity. Social and behavioral scientists must think more like ecologists, who see variation as central, and less like statisticians, who perceive variation from average effects as noise or individual differences that average out. In complex systems, variation (differences within types) and diversity (differences in the number and distribution across types) drive innovation and contribute to system-level robustness.

To harness complexity, to borrow a term from Robert Axelrod and Michael Cohen, the third necessary change is that we must promote interdisciplinary research on specific problems, such as improving education. As in the aforementioned case of obesity, educational success depends on individual, family, peer, and community influences. Empirical studies of educational performance include psychological variables (IQ), social variables (crime rates), health variables (presence of lead in the bloodstream and obesity), and economic variables (family income). Thus, to explain academic success we can create a comprehensive model with lots of weak individual effects but strong collective effects. But if we break a complex system into disciplinary parts, we ignore the complex interactions that enable the whole to be more than its parts. We need interdisciplinary teams to unpack how those many forces interact.

Robustness, or what some call resilience, refers to the ability of a system to maintain functionality in response to external shocks and internal adaptations.

Note that robustness differs from stability: the capacity for a perturbed system to return to the same equilibrium. Robust systems often maintain functionality by locating a new arrangement of their parts. Variation and diversity also provide the building blocks for emergent phenomena and for complexity itself.

Finally, we must advance computational agent-based modeling, even though this methodology is not, as some claim, a panacea. Many people conflate computational methods with complexity. This is a mistake. We must disconnect scientific methodologies from the properties of the systems that they are used to study. Agent-based models consist of a set of objects (agents) situated in place and time that follow and adapt rules of behavior. The modeler designs a system, sets the agents loose, and watches what transpires. The behaviors included in the models need not be ad hoc, mechanistic rules. They can be calibrated to actual behaviors revealed in the laboratory, identified in field studies, or discerned from empirical studies. In point of fact, agent-based models produce aggregate outcomes that fall into one of four broad categories: static equilibria, periodic equilibria (patterns), random paths, or complex trajectories. Social systems exhibit all four of these behaviors as well. We see phenomena ranging from stable market prices, to political cycles, to random walks on Wall Street, to complex intra-industry dynamics. A goal of social science should be to explain why some processes produce outcomes that fall into one category and others fall into another. Thus, empirical studies that assume a single type of actor or behavior may be woefully inaccurate in their estimations if in fact the systems contain multiple types of actors.
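To make the design-release-and-watch workflow concrete, here is a deliberately minimal editorial sketch of an agent-based model; it is an illustration, not a model from the paper, and the population size, peer-sample size, and experimentation rate are arbitrary assumptions. Agents repeatedly imitate the majority choice of a few randomly sampled peers and occasionally experiment, and the modeler simply watches the aggregate share evolve.

```python
import random

random.seed(1)

N, SAMPLE, EPSILON, PERIODS = 500, 5, 0.02, 60
choices = [random.randint(0, 1) for _ in range(N)]   # each agent starts with a random 0/1 choice

def step(choices):
    """One period: every agent adopts the majority choice of a small random sample of
    peers, except that with probability EPSILON it experiments with a random choice."""
    new = []
    for _ in range(N):
        if random.random() < EPSILON:
            new.append(random.randint(0, 1))
        else:
            peers = random.sample(range(N), SAMPLE)
            votes = sum(choices[j] for j in peers)
            new.append(1 if votes * 2 > SAMPLE else 0)
    return new

for t in range(PERIODS):
    choices = step(choices)
    if t % 10 == 0:
        print(f"period {t:2d}: share choosing option 1 = {sum(choices) / N:.2f}")
```

Runs of this kind typically tip toward one option and lock in, with occasional noise-driven swings, illustrating how positive feedback in a bottom-up process can generate the path dependence and large aggregate events discussed above.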

Summary

On the positive side, increased engagement with complexity research can enable social scientists to better explain and predict what occurs in our increasingly complex world and to anticipate large events. On the normative side, a deeper engagement with complexity can help us to identify and pull levers within systems to effect change, to design rules, laws, and incentive structures that limit the prevalence of large deleterious events, and to leverage the potential for emergence to improve outcomes.

References

Holland, John. 1999. Emergence: From Chaos to Order. Basic Books.
Page, Scott. 2010. Diversity and Complexity. Princeton University Press.
U.S. House of Representatives, Committee on Science and Technology. "Building a Science of Economics for the Real World." July 20, 2010.

This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.

Research Opportunities in Economics: Suggestions for the Coming Decade

James Poterba, MIT and NBER
September 2010

The remarkable events in the global economy in the last two years have drawn new attention to the central role that economic institutions and economic policies play in determining the well-being of virtually all participants in modern industrial societies, even those whose livelihoods are far removed from the financial sector that was the epicenter of the global economic crisis. The global financial crisis suggests a number of critical research opportunities for the next decade. While they have attracted less media attention, recent developments in energy markets, in environmental policy, in information technology, and in the availability of data on the economic activity of households are also opening new research vistas for economics. This short research prospectus begins by summarizing key research issues that have been raised by the global financial crisis, and then moves on to issues that are associated with energy and environmental policy and to the wealth of new data being created by the rise of electronic commerce and internet search.

Measuring and Modeling Interdependencies in Financial Markets. One of the central regulatory challenges that has been identified in the aftermath of the collapse of Lehman Brothers and the rescue of AIG is the need to develop metrics for evaluating whether a participant in the financial system, whether a bank, an investment firm, or a sovereign borrower, is "too big to fail." The conceptual challenge in answering this question is enormous because, as the events of October 2008 demonstrated, the importance of a single actor depends critically on that actor's links to other actors, and on the way those other actors will respond to signs of distress at the actor in question. This challenge needs to be addressed with empirical research directed at measurement and modeling issues, as well as with theoretical research directed at understanding market cascades and the interdependencies across markets. A central question for regulatory policy is what information needs to be disclosed by each actor in the financial system, and to whom does it need to be disclosed? Put more simply, what should regulators regulate, and how should they measure it? We now recognize that there are substantial links between financial actors, and that regulatory authorities need to be able to judge the exposure of one systemically important financial institution to financial distress at another. Some proposals that have been discussed in the context of the newly-created Office of Financial Regulation call for financial institutions to disclose their holdings of securities, at varying levels of disaggregation, to a regulatory body that will search for patterns and excessive risk-loading on particular dimensions. Empirical research is needed to determine the value of different types of disclosures, to create risk-assessment models that offer maximum predictive power for financial distress, and to help guide the design of regulatory policy in this sphere.

Theoretical issues are at least as critical as the unresolved empirical questions. What criteria should be used to determine whether a particular institution is systemically important? Answering that question requires a framework for assessing the consequences of its failure, which in turn requires assessing the set of related institutions that might be affected by such a
failure. The modeling of such network effects in financial markets is just beginning. The transmission of financial shocks from one firm to another, or from one sovereign borrower to the broader capital market, clearly depends on the nature and timing of policy response. This suggests that the research challenge is not simply to describe and quantify the nature of interrelationships, but to describe how the links across actors in the financial market will depend on the policy environment. The answers to these questions will surely build on several decades of theoretical research on incentives and contract structure, but they need to recognize the specific institutional features of modern global financial markets. Similarly, research on economic models of regulation, which advanced rapidly in the 1970s and helped create a wave of regulatory changes in that era, is vital to evaluating the trade-offs that will confront regulatory policy-makers. There are rich opportunities for collaborative studies by financial economists, who have the expertise to evaluate the risk attributes of specific financial products or transactions, and regulatory economists, who have analyzed the challenges of regulating risk-taking and firm or individual behavior that may generate externalities in a range of market settings.

Designing and Evaluating Fiscal Policy. For at least two decades, the near-consensus among academic researchers has been that fiscal policy, with its "long and variable lags," is poorly suited to stabilization of economic fluctuations, while monetary policy offered at least a potentially effective tool for high-frequency macroeconomic fine-tuning. There have been active debates about whether monetary policy should be used for such purposes, but relatively little attention to the macroeconomic effects of fiscal policy. The recent economic downturn has reignited interest in fiscal policy, and reveals important gaps in our current understanding of this policy tool. One critical set of issues concerns the stimulative effects of increases in government spending and reductions in taxes. The debate over the magnitude of the economic stimulus in early 2009 revealed widespread disagreement over the extent to which higher spending and lower taxes would raise aggregate economic activity. Moreover, while there was broad agreement that different types of spending would have different effects -- that outlays on infrastructure might have different employment consequences than extending unemployment benefits, and that cutting business taxes might have different effects than reducing payroll taxes -- there was a limited base of research available to refine estimates of these policy impacts. Analyzing the economic effects of fiscal policy, studying differences across U.S. states and across nations, and measuring how different fiscal policies appear to affect economic activity under different macroeconomic circumstances are important opportunities for future research.

A related, but distinct, set of questions concerns long-term fiscal balance. The U.S. and many other developed nations are currently on unsustainable fiscal trajectories. Current payroll and income taxes are not sufficient to cover the cost of current-law spending programs. This means that it is inevitable that changes will need to be made in spending programs, in taxes, or in both. The devil will be in the details of programmatic reform, and it is essential to build a solid research base for evaluating different reform strategies.
The effect of changing reimbursement rules in the health care sector, for example, may affect R&D decisions by pharmaceutical firms, labor supply decisions by doctors and other health care providers, and the nature of treatment received by patients. Changes in Social Security benefit rules similarly could have differential
effects on the "oldest old" and on recent retirees, and they might influence household behaviors as diverse as savings decisions, retirement choices, and living arrangements. There is rich variation in past policies that can provide input on calibrating these behavioral responses, and this information must be brought to bear in policy design. Leaving specific programs aside, there are unanswered questions about the macroeconomic effect of a higher level of government debt, one of the inevitable consequences of the near-term continuation of an unsustainable fiscal stance. To evaluate policies that might bring the U.S. closer to a sustainable fiscal position, it is important to evaluate the costs of deviating from such a rule. That challenge presents a distinct but very rich set of research opportunities.

Energy and the Environment. The extent to which human consumption of fossil fuels is associated with global warming has been one of the most studied scientific issues of the last three decades. As scientific advance narrows the uncertainties associated with this question, there will be greater opportunities to craft policies that will alter emissions of greenhouse gases and more generally alter patterns of energy consumption. The operation of energy markets and the design of policy interventions to achieve particular objectives are central economic issues that warrant further study. Economists have long understood the basic principles of optimal tax design that apply to "Pigouvian" corrective taxes: set the tax rate equal to the marginal social damage of a good's consumption. There are refinements, such as recognizing the effect of the corrective tax on the demand for other goods, that must be considered, but this broad principle is still a useful guide. Yet there are many unanswered questions that are associated with a potential transition to an economic environment in which the tax-inclusive prices of fossil fuels are substantially greater than those prices today. Two of the most important issues concern the long-run changes in consumer behavior that will flow from higher energy prices, and the response of firms to changes in the tax and regulatory environment.

First, consider consumer responses. Much of the existing research on energy demand focuses on relatively short-run adjustments to price changes, such as the price elasticity of demand for gasoline and the demand for vehicles with different miles-per-gallon ratings in different fuel cost regimes. Yet there are unresolved puzzles about energy consumption, and open issues about general equilibrium responses to higher energy prices. For example, there are widely documented inefficiencies in energy use in both the commercial and residential sectors -- opportunities to reduce energy consumption with little or no change in the production or consumption opportunities of energy users. The underlying behavior that generates these patterns warrants investigation, both as a positive question and as policy-related input that may provide information on how behavior will adjust in response to higher taxes or other regulatory policies. It may be possible to carry out controlled experiments to learn how consumers respond to various types of incentives for energy conservation, and how the amount that can be saved with particular interventions affects the take-up of such interventions. Just as there are important opportunities to study the behavior of consumers, there are important needs and opportunities in modeling the behavior of firms.
A small but growing literature has explored the economic effects of energy policies when firms are imperfect competitors. In these settings, some firms may have market power, or there may be opportunities for firms to collude. The effects of various policies on energy production and on the prices facing consumers can be quite sensitive to the nature of firm competition. Exploring models of imperfect competition, and calibrating them for the energy-producing sector, is therefore an exciting direction for new research. Beyond the detailed analysis of consumer demand and energy supply, there are open issues concerning the impact of publicly- and privately-provided infrastructure on energy use. To assess the impact of expanding light rail service in a metropolitan area on energy use, it is necessary to model how intermodal transportation choices will change, to consider how residential location decisions may evolve, and to carry out the analysis at a disaggregated level. Will the former drivers who switch to light rail be drawn disproportionately from the well-to-do groups that drive late-model and relatively fuel-efficient vehicles, or will they come from lower-income groups that drive less fuel-efficient cars? How does the pricing policy for public transit affect the relative responses of these groups?

Finally, there are important opportunities to study the operation of energy markets. While much energy-related research may be supported by other funders, the core research on the markets for various fossil fuels, for electricity, and for emissions associated with the combustion of fossil fuels falls squarely within the purview of social scientists. Insights from market design and auction theory have already played a central role in helping to create markets for trading emission rights, but there are many other potential applications in the energy and environmental sectors. Regulatory policies and tax policies have long been central to the markets for crude oil, natural gas, and nuclear power in the United States, and these policies are likely to play a key role in shaping these markets in the future. Supporting research on regulatory policy analysis and on the operation of energy markets is therefore of direct relevance for the policy design process.

Networking and Household Behavior. The advent of cellphones and handheld devices, coupled with social media such as Twitter and Facebook, has greatly expanded the frequency of interaction between individuals and the availability of information for a host of economic decisions. In deciding on a product purchase at a retail outlet, consumers can easily access a host of product reviews. When selecting a restaurant in an unfamiliar city, travelers can investigate ratings from previous diners in real time. There is an extraordinary range of transactions for which there have been similar expansions in information access. What are the implications for consumer choices? Will enhanced information access result in greater "herding" in product choices? How does the possibility of such herding alter strategy for firms? Does the need to establish an early product success change the way firms might develop new products and introduce them to the marketplace? How do the opportunities for networking between individuals affect job search, housing markets, and other economically significant sectors of the economy? This is an area in which there are important opportunities for both conceptual research, modeling network structure and the factors that might influence the strength of linkages across households, and for empirical work.
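As one editorial illustration of why the herding question above is sharp, the textbook-style information-cascade toy model below compares product choices made on private information alone with choices that also weigh earlier buyers' decisions. The signal precision, the two-purchase imitation threshold, and the market sizes are assumptions made for the sketch, not parameters taken from the text.

```python
import random

random.seed(2)

P_CORRECT = 0.7      # probability a consumer's private signal points to the better product
N_CONSUMERS = 200
N_MARKETS = 2000

def simulate_market(observe_others):
    """Sequential product choice (1 = truly better product). If observe_others is True,
    a consumer follows the crowd once earlier purchases lean by more than one buyer;
    otherwise she follows her own noisy private signal."""
    choices = []
    for _ in range(N_CONSUMERS):
        signal = 1 if random.random() < P_CORRECT else 0
        if observe_others:
            lead = 2 * sum(choices) - len(choices)   # purchases of product 1 minus product 0
            if lead > 1:
                choice = 1
            elif lead < -1:
                choice = 0
            else:
                choice = signal
        else:
            choice = signal
        choices.append(choice)
    return sum(choices) / N_CONSUMERS

for observe in (False, True):
    shares = [simulate_market(observe) for _ in range(N_MARKETS)]
    wrong_herds = sum(1 for s in shares if s < 0.5) / N_MARKETS
    print(f"observe_others={observe}: mean share buying the better product = "
          f"{sum(shares) / len(shares):.2f}, markets that herd on the worse product = {wrong_herds:.2%}")
```

In runs of this sketch, access to others' choices raises the average share buying the better product but also creates a nontrivial fraction of markets that lock onto the worse one, which is exactly the kind of firm-strategy and welfare question the paragraph above raises.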
Moreover, there are a host of regulatory design issues about information sharing and privacy, and about the extent to which firms can share information and customize marketing efforts, for which economic analysis will be a key input. There are many issues in this research area that may be amenable to cross-disciplinary work, including for example sociologists, marketing scientists, computer scientists, and applied mathematicians with experience on networking problems.

At the same time that consumers are gaining access to extraordinary volumes of new information, firms are gathering much more information on their customers and on other market participants than ever before. Internet search firms that collect information on the queries of their users can build voluminous databases that provide new insights on the products that consumers are interested in, and on related topics that involve economic activity. Google, for example, has explored the use of the frequency of searches for "temporary help" or "unemployment benefits" as a way of gauging the state of macroeconomic activity in real time. As large quantities of information on individual consumers, and on networks of households, become available, there will not only be exciting research opportunities but also important challenges. One will be finding secure ways for researchers outside of the firms that collect this information to analyze these data -- subject of course to corporate approval. This may involve designing new forms of data protection, or creating data warehouses in which data files that remove any potential identifiers for individuals can be stored for research access. Federal research support may be needed to create such data facilities and to "clean" the data that are collected by private firms. The same sort of infrastructure could be deployed to provide access to information on consumer financial records and related administrative record information, currently held by firms, that would enable researchers to move well beyond the limitations of existing household surveys. Creating model contracts for such data access, supporting the researchers who might work to extract such data from corporate files, and even supporting corporate partnerships to facilitate such data access, are promising research directions.

These four issues represent promising opportunities for research in economics and allied fields. There is a fifth issue -- the implications of an aging U.S. and global population for economic institutions and economic performance -- that is also of great importance. It is widely understood that the decline in the birth rate beginning in the mid-1960s, and the coincident decline in old-age mortality rates, will lead to a gradual aging of the U.S. population in coming decades. These trends will put pressure on pay-as-you-go transfer programs to the elderly, such as Social Security and Medicare, and will raise a host of other questions about economic activity. How will the organization of work and the design of workplaces respond to an aging labor force? Will there be important effects on asset markets, housing markets, and on the structure of economic activity across sectors? How will the aging population affect the rate of technological progress, and how will the health care sector respond to the growing need for its services? Many of these important research topics are supported by the National Institute on Aging, and while the research issues are as important as those in the four areas outlined above, they may not command the same funding priority in light of the potential availability of other funding sources.


National Science Foundation white paper on future research in macroeconomics

Ricardo Reis, Columbia University

The economic crisis of the past two years has brought a tremendous amount of excitement to the field of macroeconomics. Students are flocking into macro classes. Applications to graduate programs are higher than ever, which will in a few years lead to more economists applying for grants than ever. PhD theses and seminar papers are becoming more creative and connected to the real world, and research is shifting towards the type of basic fundamental research that the NSF almost solely funds. It is a good bet that soon new approaches will emerge to think about macroeconomic phenomena. Harder to forecast is whether they will require a change in paradigm or a more intense use of existing ideas and models. Either way, it is hard to think of a time in the evolution of economic science where funding research may get more bang for its buck. In this report, I will present three areas or questions where I currently see large outstanding questions, but where I also see active work, much of it funded by the NSF.

Fiscal policy

From the end of 2007 to the end of 2009, government spending in the United States increased by 4.4% of GDP, the largest 2-year increase on government spending since 1953. The motivation for this rise was the longest and deepest economic recession in the post-war period. Yet, there has been little economic research in the past decade on the aggregate impacts of fiscal policy on unemployment or output, or on its potential use to fight recessions.

In the last two years, macroeconomists were very responsive and adapted the tools that they had developed to study business cycles to look at the effects of government consumption. However, these studies barely addressed some of the biggest challenges in this literature. Empirical work faces the obstacle that fiscal programs go through multiple steps to be approved, and many provisions and interests get tacked on at each of these stages. Therefore, even measuring the fiscal change is hard, let alone identifying its effects. Moreover, the literature on monetary policy has taught us that the impact of a change in policy can be wildly different depending on what people expect will be the subsequent policy path.

Looking at this outstanding challenge, I am filled with optimism. Economists working on monetary policy faced similar seemingly overwhelming challenges 25 years ago. After 15 or 20 years of work, a wealth of new empirical techniques emerged together with clever exercises at data collection or empirical identification that today leave us with some confidence as to what is the effect of a monetary policy shock on inflation, GDP or employment. With attention now turned to fiscal policy, I expect that we will see progress in this field over the next 15 years as well. In current theory, the mechanism by which government policy stimulates the economy in standard models is a caricature of reality at best: government spending is expansionary because it takes

resources from private hands, making households poorer, and inducing them to work harder to compensate for their lost wealth. Thinking harder about how it is that government consumption affects decisions to invest and work will require a transformation in the models, but the toolkit that macroeconomists use leaves much room for creativity.

Of the increase in government spending in the past two years, only 25% of it has come from government consumption and investment, the two spending categories to which all of the research that I have described above applies. The remaining 75% was due to increases in social transfers. Noticeably, there is virtually no research on the impact of transfers on aggregate consumption or employment. In representative-agent models, lump-sum transfers from one group of agents to another, potentially with budget deficits in between, are neutral to economic activity. Distortionary transfers in turn may improve welfare, but they discourage work effort and investment, depressing, rather than expanding, economic activity. An important challenge is therefore to build models where transfer programs have aggregate effects. Whether expanding Medicaid, or tuition assistance, or eligibility for disability, or early retirement may be effective at fighting recessions: these all seem like questions that could have a large impact on public policy and for which we currently do not have even the most basic research.

There was a striking contrast between the confidence with which monetary policy was set by the Fed in response to the crisis vis-a-vis our ignorance about the Treasury and its fiscal expansion programs. It is easy to ridicule the models, but we cannot forget that they were built to understand the macroeconomic dynamics that follow either monetary policy shocks or technology shocks. The impact of fiscal shocks has simply not been the focus of much business-cycle research. Here I am again optimistic: the study of monetary policy went through a fundamental transformation in the 80s and 90s.

Limited attention in a world of limitless information

There is an old tradition in economic models that assumes that people have imperfect information about what surrounds them, and uses this insight to explain a wide variety of economic phenomena. However, these theories have always had a clear Achilles' heel: to justify why people happened to know some pieces of information, but not others. Many approaches developed, some more popular than others: whether people know everything (the model, what others know, and any relevant variable), whether they just don't know some variables (as in rational expectations models), whether they also don't know what others know (as in imperfect common knowledge models), and finally whether they do not know even the model itself (as in theories of learning and imperfect knowledge).

While debate over this topic raged, the world changed. Information on almost anything is cheap today. For the typical household, it would take a few minutes to discover most of the relevant information it needs to plan its savings for retirement. Yet, there has been no noticeable progress (or even change) in the way people save.

And the typical result from surveys of people is that most lack even the basic knowledge that economic models assume would be so valuable to them. The answer to this puzzle is likely that, while free information is today almost limitless, the human brain is not. In particular, our mental energy and attention span are very limited. How to model these limits is an outstanding challenge that could profoundly change economics. The challenge is great, but also well-defined and appreciated, the usual pre-requisites for any glimmer of progress.

If I was to bet on where the next revolution in economics will come from, I would say it is here. Economists have just started using tools from neuroscience to measure brain activity, and have for a while drawn on philosophy and psychology to understand the limits of human knowledge. There is much active work on this area, and the scope for interdisciplinarity is great. The new data from brain scans, as well as the amazing information from surveys and activities of people using the Internet, is allowing the discussion to move from ideological positions to testable propositions. One problem with the cooperation across disciplines is that it is terribly expensive. This has led, in my view, the growing field of behavioral economics to too often focus on "cute" results that are quicker to obtain and grab the attention of sponsors. If we want economics to deal better with information, funding for basic research is essential.

Measurement of macroeconomic aggregates and forecasting

The last great revolution in the measurement of economic activity is more than half a century old, with the introduction of price indices and the national income and product accounts. Since then, most research has revolved around improving the quality of the data, without changes to the fundamental concepts of what is being measured. However, there are reasons to think that reformulating economic statistics will soon become an active area of research.

A good example comes from our division of economic activity into three broad sectors: agriculture, manufacturing and services. With employment in the services sector getting very close to 90%, this classification is almost vacuous. If everything is in the old "services", it is time to come up with meaningful ways to group the activities within services. One useful insight came recently from Alan Blinder, who suggested that jobs could instead be divided into offshorable or not, a much more useful categorization when informing trade policy.
Another economic index in need of reassessment is the consumer price index.

I was shocked by the recent work of Christian Broda and David Weinstein showing that, using the data on consumer prices of AC Nielsen, which has a sample a few times larger than that available to the Bureau of Labor Statistics, sampling error alone meant that 65% of the time it was impossible to say whether inflation was accelerating or decelerating from one quarter to the next. A second reason for concern is that the measures of inflation that the models of optimal monetary policy say should guide monetary policy have only the slightest resemblance with the CPI.

All of this work is painful, for any work that argues for measuring and forecasting economic activity in a different way has to come up with at least provisional estimates. This is time consuming, expensive, and requires assembling teams of research assistants similar to the laboratories in the natural sciences but which are inconceivable for economists with access to only very limited funding.

Much of this work has been left to do as macroeconomists have steered away from forecasting in the past decade to focus more on theory and structural modeling. Yet, there has been great progress in the science of forecasting in the previous years, in part driven by increases in computing power that allows us to use information from thousands of time series to inform the forecasts. The theory of dynamic principal components models applied to forecasting has taken great strides, and there are many new approaches to forecasting business cycles that are just now being applied. The embarrassment of the past crisis, that almost no one forecasted, may provide the right stimulus to reignite interest in forecasting.
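As an editorial illustration of the diffusion-index idea behind the dynamic factor and principal-components forecasting literature mentioned above: extract a few principal components from a large panel of indicators and use them as regressors for next-quarter activity. The data below are simulated, and the two-factor choice, the AR(1) persistence, and the panel size are assumptions made only for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N, K = 200, 120, 2                    # time periods, indicators, factors to extract

# Simulate persistent latent factors and a large noisy panel that loads on them.
factors = np.zeros((T, K))
for t in range(1, T):
    factors[t] = 0.8 * factors[t - 1] + rng.normal(size=K)
loadings = rng.normal(size=(K, N))
panel = factors @ loadings + rng.normal(scale=2.0, size=(T, N))
target = factors[:, 0] + 0.5 * factors[:, 1] + rng.normal(scale=0.5, size=T)  # stand-in for GDP growth

# Principal components of the standardized panel serve as "diffusion indexes".
z = (panel - panel.mean(axis=0)) / panel.std(axis=0)
u, s, _ = np.linalg.svd(z, full_matrices=False)
pcs = u[:, :K] * s[:K]                   # estimated factors, up to sign and rotation

# One-step-ahead forecasting regression: target(t+1) on a constant and the PCs at t.
X = np.column_stack([np.ones(T - 1), pcs[:-1]])
y = target[1:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
r2 = 1 - (resid ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"In-sample R^2 of the two-factor forecast: {r2:.2f}")
```

The point of the sketch is only that a handful of estimated factors can summarize the predictive content of hundreds of series, which is the computational development credited above with reviving interest in forecasting.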

      261 . USA.This work is licensed under the Creative Commons Attribution‐NoDerivs 3. Suite 300.0 Unported License.0/  or  send  a  letter  to  Creative  Commons. San Francisco. 94105.  visit  http://creativecommons. 171 Second Street. To view a  copy  of  this  license.org/licenses/by‐nd/3. California.


A RESEARCH AGENDA IN ECONOMIC DIAGNOSTICS

A Note Prepared for the National Science Foundation

Dani Rodrik
Harvard University
September 8, 2010

Economists work with models. Economics as a science makes progress one model at a time. A model is at best a gross simplification of reality. That is our great strength, as the discipline imparted by specifying well-articulated cause-and-effect relationships checks our logic and prevents us from falling into incoherence. But models are also our great weakness, because each single model is necessarily false. We can get overly enamored of a particular model that happens to be inappropriate to the circumstances at hand. So we can end up misunderstanding the world and making the wrong recommendations.

This is in fact not a bad characterization of what happened in economics in the run-up to the recent financial crisis. Many commentators, including some within mainstream economics, interpreted the failure of economists to recognize the housing bubble, emphasize the risks created by financial innovation, and agree on the solutions to be pursued once the crisis struck as evidence that economics had become bankrupt as a discipline. My view is different. The fault lies less with economics than with how economists have used the tools at their disposal. The problem was that economists (and those who listen to them) became overconfident in their preferred models of the moment: markets are efficient, self-regulation works best, financial innovation transfers risk to those best able to bear it, and government intervention is ineffective and harmful. Economists put too much faith in particular financial and macro models at the expense of others, not because these had better empirical validation, but because they were, to put it bluntly, in fashion. They forgot that there were many other models that led in radically different directions. Hubris creates blind spots.

Without recourse to the economist's toolkit, we cannot even begin to make sense of the financial crisis. Why, for example, did China's decision to accumulate foreign reserves result in a mortgage lender in Ohio taking excessive risks? It is impossible to provide a coherent answer to this question without resorting to constructs from behavioral economics, agency theory, information economics, and international economics, among others.

Non-economists tend to think of economics as a discipline that idolizes markets and a narrow concept of (allocative) efficiency. If the only economics course one takes is the typical introductory survey, or if one is a journalist asking an economist for a quick opinion on a policy issue, that is indeed what one encounters.

But take a few more economics courses, or spend some time in advanced seminar rooms, and one gets a different picture. Advanced training in economics requires learning about market failures in detail, and about the myriad ways in which government intervention is required to help markets work better. Labor economists focus not only on how trade unions can distort markets, but also how, under certain conditions, they can enhance productivity. Trade economists study how globalization can exacerbate inequality within and across countries. Finance theorists have written reams on the consequences of the failure of the "efficient markets" hypothesis. Open-economy macroeconomists examine the instabilities of international finance. Macroeconomics is perhaps the only applied field within economics today in which more training puts greater distance between the specialist and the real world. Sadly, macroeconomists have made little progress on policy since John Maynard Keynes explained how economies could get stuck in unemployment due to deficient aggregate demand. Some, like Brad DeLong and Paul Krugman, would say that the field has actually regressed, owing to its reliance on highly unrealistic models that sacrifice relevance to technical rigor.

Economics is really a toolkit with multiple models, each a different, stylized representation of some aspect of reality. One's skill as an economist depends on the ability to pick and choose the right model for the situation. The shocking thing about economics is that very little research is devoted to what might be called economic diagnostics: figuring out which among multiple plausible models actually applies in a particular setting. The profession places a large premium on developing new models that shed light on as yet unexplained phenomena, but no one gets brownie points for research that informs how appropriate models and remedies can be selected in specific contexts.

Diagnostic research can help us figure out how to apply economics in different settings in an intelligent way. The reality is that different economies suffer from different constraints, and the appropriate models and remedies depend on the nature of the more binding constraints. My colleagues and I have brought such ideas to bear on problems of growth policy in developing countries.1 But clearly this ought to be part of a much more general research agenda. With better diagnostic tools, development economics would not gravitate from one extreme to another, relying on market-led and state-led strategies in turn, and moving from one "big idea" to another (and sometimes repudiating all big ideas altogether). And to give an example from an entirely different domain: prior to the financial crisis, perhaps economists would have been more skeptical of applying perfect-information, zero-agency-costs models to the U.S.

1 See Dani Rodrik, "Diagnostics Before Prescription," Journal of Economic Perspectives, Summer 2010, pp. 33-43.

Randomized policy evaluations that seem at first sight to be model-free are not immune from this criticism. every new model enlarges the toolkit. And that in turn requires methods for making intelligent diagnostic decisions. Even in the best kind of empirical work – the kind that really hones in on the essentials of the problem at hand – the manner in which the honing in has been accomplished is typically left unspecified. The trend towards empirical work in many economics subfield has pushed the problem only further back into the sub-consciousness of the researcher. Suppose the researcher finds that free distribution of bed nets reduce malaria incidence or that cameras in the classroom deter teacher absenteeism. For every piece of empirical work requires a background theoretical model in order to be interpreted. But s/he will not be exposed to a systematic exposition on when it is appropriate to apply one of these models and not another. It would contribute expertise about which model to apply where.” Economics constrains that structure but does not provide a unique mapping. It all depends on the specific model we want to apply. It would make researchers better applied economists and more useful policy advisers. the work is done instinctively and rarely becomes codified or expounded at any length. good economists develop a knack for performing the needed diagnostics. the model behind empirical work is selected in an ad hoc manner or for reasons of convenience.The absence of serious research on “choosing among models” results also in graduate programs in economics producing PhDs who are woefully undertrained when it comes to applying their trade to the real world. Instead. necessarily simplified representations of reality. of course. A research program in economic diagnostics would help economists think systematically about how to choose among competing. The promise of economics as a discipline is that it is an “applied science. and makes us better able to deal with different and new circumstances. Because these experiments are necessarily carried out in highly specific locales and under very specific experimental conditions. A student of industrial organization. the discipline remains incomplete unless we develop better rules for navigating among the diverse models that it contains. one needs a well articulated theory to be able to infer anything at all about the likely effects of similar policy interventions in different settings. In other words. say. More commonly. will be exposed to many different game-theoretic models of imperfect competition. Over time. extrapolation requires “structure.” The “science” part in this definition does not imply that every successive model displaces previous ones. Approached as such. 265 . Even then.

266 .
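[Editor's illustration, not part of Rodrik's paper.] One very narrow, purely statistical slice of what such diagnostics might involve is letting a setting's own data arbitrate between candidate specifications before prescribing. The sketch below is a hedged toy example in Python with hypothetical variables; Rodrik's notion of diagnostics is far broader, encompassing institutional and qualitative evidence that no out-of-sample fit statistic can capture.

import numpy as np

# Toy example: two candidate "models" of how an outcome responds to a policy
# variable, compared by out-of-sample predictive fit on held-out data.
rng = np.random.default_rng(0)
x = rng.uniform(1, 10, 200)                              # hypothetical policy/price variable
y = 2.0 + 1.5 * np.log(x) + rng.normal(0, 0.3, 200)      # data generated by the "log" model

def fit_and_score(design, x_tr, y_tr, x_te, y_te):
    X_tr, X_te = design(x_tr), design(x_te)
    beta, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)   # OLS fit on the training sample
    resid = y_te - X_te @ beta
    return np.mean(resid ** 2)                           # out-of-sample mean squared error

linear = lambda v: np.column_stack([np.ones_like(v), v])
loglin = lambda v: np.column_stack([np.ones_like(v), np.log(v)])

train, test = slice(0, 150), slice(150, 200)
for name, design in [("linear", linear), ("log-linear", loglin)]:
    mse = fit_and_score(design, x[train], y[train], x[test], y[test])
    print(f"{name:10s} out-of-sample MSE: {mse:.3f}")

The point of the sketch is only that "which model applies here?" can sometimes be posed as an explicit, answerable question rather than left to instinct.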

Three Challenges Facing Modern Macroeconomics
White paper submitted to the National Science Foundation
Kenneth Rogoff, Professor of Economics, Harvard University
September 21, 2010

Abstract:
There are three great challenges facing researchers in modern macroeconomics today, all brought into sharp relief by the recent financial crisis. The first is to find more realistic, and yet tractable, ways to incorporate financial market frictions into our canonical models for analyzing monetary policy, particularly in the response to a financial crisis where credit markets seize. The second is to rethink the role of countercyclical fiscal policy. A third great challenge is to achieve a better cost-benefit analysis of financial market regulation.

There are three great challenges facing researchers in modern macroeconomics today, all brought into sharp relief by the recent financial crisis. The first is to find more realistic, and yet tractable, ways to incorporate financial market frictions into our canonical models for analyzing monetary policy, particularly in the response to a financial crisis where credit markets seize. The second is to rethink the role of countercyclical fiscal policy. A third great challenge is to achieve a better cost-benefit analysis of financial market regulation.

Prior to the financial crisis, economists had been encouraged by the apparent success of their frameworks in modeling monetary policy, not just in the United States but around the world. Perhaps most important, the consensus monetary policy model assumed frictionless "perfect" financial markets in every aspect of the economy. The perfect financial markets assumption may seem absurd to a lay person, but economists often choose it because it proved a huge simplifying assumption, allowing analysis to concentrate on, say, labor markets, where distortions and imperfections were thought (by many) to be much larger. The argument was that whereas financial markets might not be quite perfect, they were far more so than labor and goods markets, and any departures from idealized perfection were of only minor consequence. This was in contrast to product and labor markets, where transitory wage and price rigidities created the possibility that unemployment and capacity could temporarily deviate from equilibrium levels, both in response to shocks and, importantly, in response to monetary policy.

Certainly, economists had developed sophisticated models of financial frictions and of debt repudiation.1 These are both important examples of departures from perfect financial markets. In addition, any departure from frictionless markets where prices (including sophisticated futures and derivative prices) move to equate demand and supply creates considerable complications. However, there was no consensus model of frictions, making it hard to know what direction to push.

The financial crisis, of course, deeply undercut that confidence. The models not only failed to predict the crisis itself, they failed to give meaningful warning signs of any kind, and they continued to perform poorly in analyzing the aftermath of the crisis. With the benefit of hindsight, it has become apparent that, despite the canonical models' obviously strong assumptions, part of the consensus models' "success" may be attributed to the relative ease of forecasting during tranquil periods.2 Instead, using historical data to develop benchmark trajectories based on past deep financial crises around the world has proven to be a far more powerful tool both for predicting the crisis and for projecting the economy's post-crisis recovery path. The failure of the consensus models is hardly a satisfactory state of affairs; policymakers need a more nuanced framework for analyzing their policy choices.

1 For financial market frictions, see Bernanke and Gertler (1990). Obstfeld and Rogoff (1996) review analyses of sovereign default.
2 See Reinhart and Rogoff (2009).

The challenge facing macroeconomists is a daunting one and, in many ways, parallel to the challenge economists faced after the Great Depression of the 1930s. With a quarter of the population unemployed at the peak of the Depression, the notion that frictionless markets equate the supply and demand for labor appeared patently absurd. This observation was a central tenet of Lord John Maynard Keynes' seminal work. Keynes, while making some profoundly insightful empirical observations, did not really offer a clear approach to how to formally model labor market frictions. To make a long story short, economists debated the right approach for more than half a century, and never found a completely satisfactory solution. Nevertheless, even after the financial crisis, it is clear that "New Keynesian" and related models are a vast improvement not only over Keynes but over later "new neoclassical" and "real business cycle" models that essentially rejected all frictions entirely; the latter canonical models not only assumed perfect financial markets (to the extent that concept was understood at the time), but also perfect markets for all non-financial transactions as well. On the eve of the financial crisis, the consensus monetary model incorporated price and wage rigidities in a way that seemed to capture empirical reality in a useful way, although the underlying rationale for the rigidities remained somewhat crude and mechanical. (At least, the new models are an improvement for purposes of analyzing monetary policy, which would be virtually impotent in the absence of frictions.) The challenge ahead is to now also incorporate financial frictions. Although many young economists are already working on the problem, there is no reason to presume that a consensus will arise any more quickly than after Keynes, and it might well take many decades before the dust settles. Before then, and until macroeconomics meets this challenge, the credibility of its models will remain deeply compromised.

A second great challenge is to develop a better understanding of how government fiscal and debt policy affects the economy. The inadequacy of economists' models of fiscal and debt policy was again brought to the fore by the financial crisis. The US government had to make profoundly difficult choices on how much fiscal stimulus to introduce on the back of disturbingly thin economic research. On top of all the issues confronting analysis of monetary policy (introducing frictions in financial, labor and product markets), there are several additional problems. In the case of government spending increases, it has to matter greatly what the government is spending money on. An increase in infrastructure spending presumably has very different effects than an increase in military spending. Also, deficits that are due to tax cuts arguably have a very different impact than deficits that are due to government spending increases. There is also a question of how private savings might be influenced by deficit spending and the prospect of higher taxes in the future, a problem famously emphasized by Harvard economist Robert Barro. A related question is how large a government debt burden an economy can sustain without risking a loss in market confidence. Up to a point (a "debt ceiling") countries seem to be able to borrow freely with little consequence on the interest rate they pay. But as debt rises, and especially if growth slows, interest rates on a country's debt can rise quite suddenly, prompting either default or a sharp and painful adjustment.i Future research needs to better incorporate the striking non-linearities that historical analyses reveal in the data. Fiscal and debt policy will of

course become a much more popular topic now, but again, as in the case of financial market frictions, it  will take a great deal of research to make lasting progress.    The third great challenge is to develop a better cost benefit analysis of financial market  regulation.  Most analyses of regulation take a microeconomic industry or firm level perspective.  But in  the case of financial market regulation, there are important economy wide risks.  Remarkably, whereas  economists have looked  a great deal at how financial deepening fosters development, there is far less  understanding of how to balance risks in a more sophisticated economy.  How does one do a proper  cost benefit analysis of bank capital adequacy rules?  Does high frequency trading improve an  economy’s stability and growth, or is it more likely to be destabilizing?  Again, these are issues that have  always existed, but have now been given fresh urgency by the global financial crisis.    I have detailed three important challenges facing modern macroeconomic research.  In  concluding, I want to take up the issue of methodology in economics.  My basic contention is that  although macroeconomists should certainly give more attention to historical analysis and empirics, the  profession still very much needs to continue deepening its mathematical and analytical frameworks,  certainly along the lines of the three challenges outline above.    A central thrust of modern economics, especially since World War II, has been to introduce  greater mathematical rigor and discipline into analysis.  Although this approach has been much  criticized, mathematical rigor serves two essential roles.  First, it makes it far easier to make the field  cumulative, so that researchers can generalize, refine, advance and refute existing theories.  Secondly, in  conjunction with modern statistical methods, it has made possible to formally parameterize and test  specific theoretical models, greatly expanding their applicability.    As noted, the recent financial crisis has raised huge criticism and discontent with the canonical  approach to macroeconomics, some justified, some not.  A fair criticism is that because academic  researchers place great emphasis on internal consistency, there is tendency to give far less rigorous  attention to external consistency.  As noted, the small number of economists who looked at long‐term  historical data on the history of financial crises were far better able to analyze and predict the  economy’s vulnerability to the financial crisis, as well to project its likely aftermath.    But the current limitations of sophisticated mathematical and statistical models for real world  macroeconomic applications should not be viewed as a reason to reject modern technical economics.   Over the very long‐term, as economics advances as a science, frameworks that are amenable to  concrete mathematical and statistical methods are likely to continue to improve dramatically, especially  as computational methods expand and databases become deeper and easier to manipulate.  One can  imagine that future developments will allow much more nuanced models of how large‐scale markets  work, and of the interconnection between financial variables, political and regulatory constraints and  macroeconomic outcomes.    Ultimately, success in meeting the three challenges detailed here must  involve a deepening of research in technical economic methods, not abandonment.   


References
Bernanke, Ben S., and Mark Gertler, "Financial Fragility and Economic Performance," The Quarterly Journal of Economics, Vol. 105, No. 1 (February 1990), pp. 87-114.
Obstfeld, Maurice, and Kenneth S. Rogoff, Foundations of International Macroeconomics, Cambridge: MIT Press, 1996.
Reinhart, Carmen M., and Kenneth S. Rogoff, This Time is Different: Eight Centuries of Financial Folly, Princeton: Princeton University Press, 2009.
                                                              
i See Reinhart and Rogoff (2009) and Reinhart and Rogoff, American Economic Review, May 2010.



NSF white paper, 9/8/10

Market Design

Alvin E. Roth

Market Design: Understanding markets well enough to fix them when they’re broken “Grand Challenge” White Paper for Future Research in the Social, Behavioral & Economic Sciences By Alvin E. Roth, Harvard University Abstract In the past fifteen years, the emerging field of Market Design has solved important practical problems, and clarified both what we know and what we don’t yet know about how markets work. The challenge is to understand complex markets well enough to fix them when they’re broken, and implement new markets and market-like mechanisms when needed. Among markets that economists have helped design are multi-unit auctions for complementary goods such as spectrum licenses; computerized clearinghouses such as the National Resident Matching Program, through which most American doctors get their first jobs; decentralized labor markets such as those for more advanced medical positions and for academic positions; school choice systems; and kidney exchange, which allows patients with incompatible living donors to exchange donor kidneys with other incompatible patient-donor pairs. These markets differ from markets for simple commodities, in which, once prices have been established, everyone can choose whatever they can afford. Most of these markets are matching markets, in which you can’t just choose what you want, you also have to be chosen. One of the scientific challenges is to learn more about the workings of complex matching markets, such as labor markets for professionals, college admissions, and marriage.


1. Fundamental questions

“Market design” is the term used to refer to a growing body of work that might also be called microeconomic engineering, and to the theoretical and empirical research in economics, computer science and other disciplines that supports this effort and is motivated by it. At its heart is the fundamental question: How do markets work? For competitive commodity markets, economists have a good grasp of some of the basic elements. When price discovery and adjustment operate smoothly, agents choose what they want at the prices they see. But many markets are more complicated than that; you can’t simply choose what you want, even if you can afford it; you also have to be chosen. Examples of such “matching markets” abound: colleges don’t select their entering classes by raising tuition until just enough students remain interested, rather they set tuition so that lots of students would like to attend, and then they admit some fraction of those who apply. (And colleges also can’t just choose their students, they have to woo them, since many students are admitted to multiple colleges.) Neither do employers of professionals reduce wages until just enough applicants remain to fill the positions; there’s courtship on both sides (e.g. many new economics Ph.D.s would like to work for Stanford at the wages they offer, but Stanford receives many applications and makes only a few offers, and then has to compete with other top universities to actually hire those they make offers to). Particularly for entry-level professionals, wages are often rather impersonal (e.g. many new assistant professors of economics, or new associates at large law firms, earn around the same wage, just as many students are offered the same tuition packages). Prices seem to play a different role in clearing matching markets than in markets for commodities. Labor markets and college admissions are thus more than a little like the marriage market; each is a two sided matching market that involves searching and courting on both sides. Among markets that economists have helped design are multi-unit auctions for complementary goods such as spectrum licenses; computerized clearinghouses such as the National Resident Matching Program, through which most American doctors get their first jobs; decentralized labor markets such as those for more advanced medical positions, and for academic positions; the school choice systems used to assign children to big city schools; and kidney exchange, which allows patients with incompatible living donors to exchange donor kidneys with other incompatible patient-donor pairs. For surveys, see Milgrom (2004) and Roth (2002, 2008). Auction design is perhaps the part of market design most closely connected to the traditional function of commodity markets: price discovery and efficient allocation. However, when multiple, heterogeneous goods are offered, and buyers may want to consume packages of complementary goods, matching bidders to packages becomes necessary, and recent research motivated by FCC auctions of radio spectrum has drawn close parallels between auctions and matching markets. Many open questions


remain, on the interface of economics and computer science, about the design and conduct of auctions that will make it safe and simple for bidders to bid on packages of goods. Matching markets sometimes suffer persistent market failures, and this has opened a door through which economists have engaged in market design. For example, a number of labor markets have lost thickness due to the unraveling of transaction dates: e.g. presently big law firms often hire new associates while they still have a year remaining of law school, and appellate judges hire law clerks via exploding offers that don’t allow them to compare offers. Failures associated with phenomena like these caused the market for new doctors to explore various forms of centralized clearinghouse. In 1995 I was asked to direct the redesign of the clearinghouse for new doctors (the National Resident Matching Program), to address a number of issues, including the fact that there are a growing number of married couples in that labor market who seek two positions in the same vicinity. Each of these issues raises questions whose further answers will be important for understanding and designing complex markets: • How does the timing of transactions influence market clearing? In particular, what is needed to create a marketplace in which sufficiently many transactions are available at the same time to achieve the benefits of a thick market? (Economists have devoted great effort to understanding the price of transactions, but much less is known about other features of transactions.) The timing of transactions concerns not just when they are made, but also their duration, as in e.g. the case of “exploding” offers. How does the growing number of two-career households influence the labor market? How does it influence the marriage market? How are these related (e.g. in migration to cities, and in spousal hiring policies of firms such as universities located outside of cities, and labor law involving what kinds of questions applicants can be asked about their marital status)? Some of these are questions that will involve collaboration among economists, demographers, and sociologists.

A marketplace that successfully becomes thick by attracting many participants may face a problems of congestion resulting from all the transactions that can potentially be considered, since in many markets such consideration take time (e.g. interviews in labor markets, time between offers and acceptances, etc.) Congestion was the problem that led to the redesign of the high school assignment process in New York City, and it has led to the redesign of a number of other markets, such as the market for clinical psychologists. Many open questions remain about the management of congestion. Some markets fail to reach efficient outcomes because it isn’t safe for market participants to reveal the necessary private information. This was what led to the redesign of the school choice system for Boston: the old Boston algorithm made it risky for families to reveal what schools they wished their children to attend, since a family that failed to get the choice it listed first would likely drop far down in the rankings. The new assignment mechanism makes it safe—a “dominant strategy”-- for families to state their true preferences. However in many cases it can be shown to be impossible to make safe participation a dominant strategy, and so many questions remain about how to make participation safe. Recent results in economics and computer science suggest that some of these problems may become more tractable as markets grow large.
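[Editor's illustration, not part of Roth's paper.] To make the mechanics of such clearinghouses concrete, here is a minimal sketch of the applicant-proposing deferred acceptance algorithm that underlies designs like the resident match; the data and names are invented, and the sketch omits real-world features such as couples, reversions, and tie-breaking rules.

# Applicant-proposing deferred acceptance (Gale-Shapley), toy version.
def deferred_acceptance(applicant_prefs, program_prefs, capacities):
    """applicant_prefs: dict applicant -> list of programs, most preferred first.
       program_prefs:   dict program -> list of applicants, most preferred first.
       capacities:      dict program -> number of slots."""
    rank = {p: {a: i for i, a in enumerate(prefs)} for p, prefs in program_prefs.items()}
    next_choice = {a: 0 for a in applicant_prefs}   # index of the next program to propose to
    held = {p: [] for p in program_prefs}           # tentatively held applicants
    free = list(applicant_prefs)
    while free:
        a = free.pop()
        prefs = applicant_prefs[a]
        if next_choice[a] >= len(prefs):
            continue                                # applicant has exhausted their list
        p = prefs[next_choice[a]]
        next_choice[a] += 1
        if a not in rank[p]:
            free.append(a)                          # unacceptable to this program
            continue
        held[p].append(a)
        held[p].sort(key=lambda x: rank[p][x])      # keep the program's favorites
        if len(held[p]) > capacities[p]:
            free.append(held[p].pop())              # reject the least preferred held applicant
    return {p: sorted(held[p]) for p in held}

if __name__ == "__main__":
    applicants = {"ann": ["city", "general"], "bob": ["city", "general"], "cal": ["city"]}
    programs = {"city": ["bob", "ann", "cal"], "general": ["ann", "bob"]}
    print(deferred_acceptance(applicants, programs, {"city": 1, "general": 1}))

The output is a stable matching, and with applicants proposing it is a dominant strategy for them to report their true preference lists, which is the sense of "safe participation" discussed above.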


Developing kidney exchange in the United States involved many people working to overcome each of the problems mentioned above. At each step of the process, there has been collaboration between economists and doctors. First, a thick marketplace had to be made possible by establishing databases of incompatible patient-donor pairs. Then, congestion had to be overcome, in the form of the number of operating rooms and surgical teams who could be assembled simultaneously to carry out larger exchanges. (The recent development of non-simultaneous chains has helped.) Presently, kidney exchange programs are grappling with the problem of how to make it safe for transplant centers to participate fully, by revealing all of their incompatible patient-donor pairs to the exchange.

More broadly, kidney exchange and the shortage of transplantable organs also make clear that not every kind of market transaction is welcomed: some kinds of market transactions are viewed as repugnant. For example, it is against the law in the U.S. and in most developed nations to buy or sell organs for transplantation. At present, market solutions are not welcomed for a variety of transactions, some of them more profound than others. Understanding the sociology and psychology of repugnant transactions and markets is a big task, which is likely to illuminate many aspects of markets and market design.

A characteristic of market design is that it is going to require a great deal of collaboration among all sorts of people to design appropriate markets, and get them adopted and implemented. In this latter category, this is work that brings together economists, psychologists, sociologists, legal scholars, and philosophers.

Note that market design is not just about computerized or even centralized marketplaces, but also about the rules, procedures and customs of decentralized markets, what might be called their market culture. For example, in helping repair an unraveled market for gastroenterologists, an essential feature was changing the rules about whether applicants could change their minds about offers received before a specified time, or job applicants can send a certified number of signals (as in the signaling mechanism now used in the market for new economists).

Many of the market designs referred to above involve computer-assisted markets, and lately also computer scientists (about which more in a moment). Computers can assist markets in a number of ways. Markets can be accessed over the internet, so that many more people can participate than could at a marketplace in the physical world. Markets can be run on computers, so that transactions are recorded and processed in an orderly way. In addition, computers can add computational intelligence to the market. Markets can use computers as trusted intermediaries, to accomplish something more or more cheaply than could be done without computers (for example the computer can hold a reserve price without revealing it unnecessarily, instead of just reporting bids and asks). Finally, market outcomes can be determined by (possibly computationally intensive) algorithms that process market information in ways that couldn't be done, or done quickly, without computers. For example, while many labor market clearinghouses take as input rank order lists and use a deferred acceptance algorithm to find a stable matching, note that finding optimal kidney exchanges of constrained size is an NP hard problem solved by integer programming.
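[Editor's note.] For readers unfamiliar with the optimization just mentioned, a standard way to write the generic cycle formulation is sketched below; the notation is an assumption for illustration and is not drawn from the paper.

\[
\max_{x}\ \sum_{c \in \mathcal{C}_k} |c|\, x_c
\qquad \text{s.t.} \qquad \sum_{c \,\ni\, i} x_c \le 1 \ \ \text{for every pair } i, \qquad x_c \in \{0,1\},
\]

where \(\mathcal{C}_k\) is the set of feasible exchange cycles of length at most \(k\) among incompatible patient-donor pairs and \(|c|\) counts the transplants in cycle \(c\). The cap \(k\) on cycle length, driven by how many operating rooms and surgical teams can be run simultaneously, is what makes the problem computationally hard and motivates the integer-programming approach.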

To summarize, the last fifteen years have increased our understanding of how markets fail and how they can sometimes be fixed. The theory and practice of market design are deeply intertwined.

2. For economics as a discipline, market design provides a fresh source of theoretical problems and empirical data, about the most fundamental questions of economics, concerning how markets work, and how they can be fixed when they fail. As market design grows, it will become more like an engineering discipline, demanding both design knowledge and knowledge of particular domains of application, and each particular market design brings economists into close contact with experts in the particular domain, and in other academic disciplines.

3. Who is doing provocative research? Market designers are starting to be too numerous for a short list (here's a link to a very partial list of mostly economists and computer scientists), but there are big active groups in the Boston area and at Stanford. The Stanford group includes Milgrom, Bulow, Levin, Niederle, Ostrovsky, Hatfield, and Kojima, and the Boston group includes Roth, Pathak and Ashlagi at MIT, Sonmez and Unver at Boston College, and Athey, Edelman, Parkes, and Coles at Harvard. Other centers include Maryland: Ausubel and Cramton; Michigan: Chen, Resnick, and Leider; Chicago: Budish; CMU: Sandholm…

4. What are the implications for advancing the domain? For building capacity? And for providing infrastructure? As we understand more about markets (and perhaps about repugnant transactions) we'll know more about where and in what ways better markets can improve welfare, and perhaps also more about where we might pause to look for alternatives before instituting simple or unregulated or monetary markets or relaxing the restrictions against them. So, as market design develops, we'll have to nurture a market design literature that judges and recognizes frontier work in appropriate ways. Right now, design papers mostly are judged by the journals as theory papers. But frontier design papers might not necessarily have the same focus on theory, or empirical work, that standard papers do; they might derive their value from how those things are combined in novel ways on some new domain of application.

References
Milgrom, Paul, Putting Auction Theory to Work, Cambridge U. Press, 2004.
Roth, Alvin E., "The Economist as Engineer: Game Theory, Experimentation, and Computation as Tools for Design Economics," Econometrica, 70, July 2002, 1341-1378.
Roth, Alvin E., "What have we learned from market design?" Hahn Lecture, Economic Journal, 118 (March), 2008, 285-310.

Creative Commons License: "Market Design" by Alvin E. Roth is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Future Research in the Social, Behavioral and Economic Sciences
Larry Samuelson
Department of Economics, Yale University, 30 Hillhouse Avenue, New Haven, CT 06525-8281
Larry.Samuelson@yale.edu
September 20, 2010

Abstract: This paper describes a research program organized around the theme of "What Makes Societies Work?" There are two stages. The first is a study of how context and institutions affect people's incentives. Why do people's preferences appear to be helpfully prosocial in some settings, and narrowly self-interested in others, and how can we design interactions to amplify the former? Why are institutions such as constitutions and courts effective in shaping social behavior in some settings but not others? Why are the relational incentives created by repeated interactions more effective in some settings than others? Next, these insights are to be put to work in addressing questions of how we can influence or even design social outcomes. How do we achieve a consensus on using future interactions to create current incentives? What institutions can we design that will induce people to coordinate on contributions to the public good rather than hoarding private wealth as the route to status? Questions such as these are fundamental to making economics and social science more generally a useful part of our intellectual arsenal. Addressing these questions will take a concerted effort on the part of economists and others from across the range of social sciences.

This work is licensed under the Creative Commons Attribution-NoDerivs 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nd/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.

but rather that people in the US unilaterally supply. Behavioral and Economic Sciences Larry Samuelson September 20. but the categories are conceptually useful. growth rate. What Makes Societies Work? Why are some societies more successful than others? This is perhaps the most fundamental of social science questions. However. Like many concepts in economics. Early economic models. but have not produced a precise idea as to what it might be.Future Research in the Social. happiness. One might define success in many ways. but have not advanced to careful quantitative evaluation of their models. The 280 . The other social sciences provide a collection of intriguing ideas. taking them as fixed and beyond either explanation or influence. Incentivees An understanding of how a society works begins with an understanding of the incentives motivating its members. physical or mental health. in the sense that people express a willingness to comply only to the extent that they think others are also doing so. a higher degree of compliance. longevity. Economists typically have very little to say about preferences. as well as quite important in shaping social behavior. however. Adding human capital to the analysis provides some improvement. focused on physical capital accumulation. However. The difference appears to be not that audit probabilities are higher in the United States or penalties more severe. with per capita income. there is good reason to believe that internal incentives are quite sensitive to context. The United States is known for the extent to which its citizens comply with its tax code. or prefer. the boundaries between these categories are blurred and context dependent. is contractural if the person chooses A in return for some explicit (typically immediate) benefit. This preference is fragile. II. but still leaves large gaps in our understanding. explained woefully little of the variation in economic performance across countries. and relational if the person chooses A in return for some commonly understood but implicit (typically future) benefit. and political stability being just a few of the possibilities. from both a positive viewpoint—it is important to understand the patterns we see—and a normative viewpoint—it is important to draw lessons for how we should organize society. while many other countries struggle with tax collection. Economics has not produced a convincing answer. Introduction: II. such as the short run and long run. Recent appeals to “social capital” reflect a recognition that something is missing. 2010 I. there is sufficient common ground in these measures that we can move beyond this potentially endless diversion to concentrate attention on the underlying mechanisms. Internal incentives are simply a matter of preferences. :Let us say that the incentive for a person to do A rather than B is internal if the person prefers A to B.

but to a greater extent because they would choose not to flee even if certain of impunity. Political interactions similarly hinge on relational incentives. We expect to pay for a restaurant meal. What is the nature of these incentives? More importantly. Far from being confined to seemingly minor matters of social etiquette. much of behavioral economics has been concerned with arguing that people’s preferences are not narrowly self-interested. but wouldn’t think of asking for compensation upon giving a colleague a ride home. and obeying traffic laws to life-style decisions that are reflected in the “broken windows” theory of neighborhood behavior. Contractural incentives mediate a relatively small portion of our interactions. virtually every interaction between individuals requires a willingness to forego some individual advantage. the latter relational.” Interactions between firms and consumers similarly hinge on relational incentives—one makes no explicit promise to return to a provider who has given good service. what determines them? It may appear at first glance as if we are appealing here simply for more behavioral economics. and so on. It is important for the proposed research to move beyond this to investigate what determines the social aspects of preferences. What Shapes Internal Incentives? As noted by Arrow (1971). There are three important differences. The former interaction in each case is contractural. The first stage would address the following three questions: 1. but by relational considerations of the form “we’ll make this up next time.same is true of many other activities. from littering. Such incentives come into play whenever we make a purchase. waiting in line. It is clear that these internal incentives go beyond the narrow conception of self-interest that serves economics. Why are people willing to behave socially in some environments and not others? How do we design interactions to take advantage of these social aspects? Answering these questions will require insight from and collaboration with 281 . and to trust that others will do so as well. there is evidence that a great proportion of business-to-business interactions are governed not by explicit agreements. Contractural incentives are the hallmark of economic exchange. yet everyone views the interaction differently if such return is impossible. but a typical response to a fine dinner at a friend’s house is that one has acquired an obligation to reciprocate. III. people have internal incentives to complete transactions and respect the property of others. In the language of this proposal. People invest in private property partly because law enforcement resources help protect that property. but to a greater extent because they understand that most others will not try to seize it. trade financial instruments. with relational incentives playing the key role in the remainder. A First Round of Questions These distinctions allow us to outline a research program in two stages. The retail sector of our economy works as it does partly because customers fear arrest if they flee without paying. We readily pay taxi fares. First. accept employment.

The different questions stressed by the two disciplines clearly call for different methods. Notice that we need not simply more experiments. Third. but produced a surprisingly 282 . It is imperative that the proposed research make progress in experimental method. Perhaps most distressingly. while replication is commonly touted in the sciences as the essence of experimental inquiry. in return for simplicity and transparency. much of experimental and behavioral economics has been concerned with showing that one can find behavior that cannot be explained by the simple economic models with which we work.” allowing discoveries of behavior inconsistent with standard models to be accompanied by a meaningful discussion of whether such a finding warrants a more complicated model. Unfortunately. What Gives Rise to Contractual Incentives? At first glance. We are still far from having developed commonly accepted and workable standards for doing economic experiments. but give rise to remarkably different outcomes. Constitutions protect private property. the answer to this question seems obvious. Second. This is interesting. Upon closer examination. and so on. drawing insight not only from economic theory but also psychology and the other social sciences. they produce because they can sell. the link between the formal institutions and the resulting contractural incentives is complex and fragile. courts enforce contracts. in contrast to the extent to which other sciences view experimental design and the ability to collect additional data as a substitute for statistics. but also including sociology and anthropology. most notably psychology. finding behavior inconsistent with our current models is a useful contribution only if one can argue that elaborating our existing models to accommodate such behavior is worth the resulting erosion of simplicity and transparency. we currently have no techniques for making such comparisons. psychologists have a long history of experimentation. For example. but it is unlikely that we have nothing to learn from decades of experimental research in psychology. but rather more work on how we should do experiments. markets are designed to facilitate trade. Existing econometric methods have been imported into experimental economics. These incentives are backed up by formal institutions that ensure these transactions can be made reliably. but very little has been done to bring experimental methods from psychology into economics. We need research in quest of the theoretical equivalent of an “adjusted R-squared. all in the shadow of mutually-embraced and officially-sanctioned coercion. there is virtually no emphasis on replication in experimental economics. Hence. Contractural incentives are the bread-and-butter of economics. What does it mean to say that a constitution guarantees certain liberties or protects private property? The constitutions of Liberia and the United States are quite similar.the other social sciences. and hence are deliberately constructed so as to not explain some behavior. 2. The constitution of the Soviet Union included a host of civil liberties. or is simply another reminder that models are indeed models. and indeed no common language for discussing the issues. Models are by design approximations of a hopelessly complex reality. behavioral economics relies crucially on experimental methods. but this alone tells us very little. People work because they are paid.

different outcome. What does it mean to say that courts enforce contracts? Law enforcement personnel or juries simply decline to enforce laws they find sufficiently unpalatable. Examples include cases in which juries in Victorian England simply failed to convict, in response to penalties they judged too severe, and the current concept of jury nullification. What does it mean to say that incentives are created by the potential for officially-sanctioned coercion? The propensity for super-bowl celebrations to turn into riots is but one indication that law enforcement is effective only if most people comply voluntarily. In effect, formal institutions are simply cheap talk, suggesting (perhaps quite vividly) an equilibrium in the "game of society," but with no power to do anything other than suggest. When are these suggestions effective, and when are they irrelevant? How do we design formal institutions to give rise to effective contractural incentives? Mailath, Morris and Postlewaite (2001) provide one intriguing attempt at examining this problem. Much more work is needed, drawing not only on economics, but also history, psychology, sociology, and political science.

3. How Do We Harness Relational Incentives? The incentives in the vast bulk of our interactions arise neither internally nor out of the expectation of contractural reward, but out of the implications of current actions for future payoffs. We have a well-developed theory of repeated games to deal with such situations. At the same time, this theory is missing an essential element, namely an understanding of which of the many equilibria in a repeated game is the relevant one. Schelling (1980) raised this problem long ago, noting that in many cases there appears to be an obvious equilibrium, despite being distinguished by nothing in the formal structure of the equilibrium. The intervening decades have provided many more examples and filled in many details, but have brought little progress on a general understanding of "focal points." How do we structure relationships so that salutary equilibria not only exist, but are "selected" by the participants? This is perhaps the most important question.

IV. Implications

The second stage of the proposed research will build on these foundations to ask the following type of questions. How can we rewrite institutional economics to include not only formal institutions—banks, credit markets, legal systems, and so on—but also the internal and relational incentives that supplement these formal institutions? To what extent should development assistance concentrate on building such incentives? Should the study and perhaps creation of such incentives play a role in our educational system? Can we describe culture as a shared set of incentives, and if so, can we design culture to be more effective? What is the optimal level of diversity in a society? Psychologists have stressed that heterogeneous groups of people often make better decisions, while effective relational incentives may require sufficiently homogeneous behavioral expectations. How do we balance these conflicting forces? These questions are the most speculative raised in this proposal, but are also ultimately the most important, and accordingly are the most briefly described. We have the tools for examining such questions, but are at the very beginning of formulating and understanding them, again calling for reinforcements from the other social sciences. Answering this research challenge will require elements from all of the social sciences, but promises tremendous rewards.
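[Editor's illustration, not part of Samuelson's paper.] To fix ideas about the relational incentives and the equilibrium-multiplicity problem raised under question 3 above, the textbook repeated prisoner's dilemma is a useful benchmark; payoff labels here are the standard generic ones. With stage payoffs \(T > R > P > S\) (temptation, reward, punishment, sucker) and discount factor \(\delta\), mutual cooperation supported by the threat of permanent reversion to defection is an equilibrium exactly when

\[
\frac{R}{1-\delta} \;\ge\; T + \frac{\delta P}{1-\delta}
\quad\Longleftrightarrow\quad
\delta \;\ge\; \frac{T-R}{T-P}.
\]

For any such \(\delta\), however, perpetual defection (and a vast range of other behavior patterns) is also an equilibrium, which is precisely the multiplicity that makes the "focal point" question central to this research program.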

V. References
1. Kenneth J. Arrow (1974), The Limits of Organization (W. W. Norton & Company).
2. George J. Mailath, Stephen Morris, and Andrew Postlewaite (2001), "Laws and Authority," unpublished.
3. Thomas Schelling (1980), The Strategy of Conflict (Harvard University Press).

first with regard to natural resources. poorly-defined property-right regimes. Stavins1 Albert Pratt Professor of Business and Government. that is. Suite 300. the same forces of open-access C whether characterized as externalities. visit http://creativecommons. Economic research within academia and think tanks has improved our understanding of the causes and consequences of excessive resource depletion and inefficient environmental degradation. Clearly. Resources for the Future Research Associate. No over-arching authority can offer complete control.0/ or send a letter to Creative Commons. and commons problems have spread across communities and even across nations. following Pigou.org/licenses/by-nd/3. and numerous other species of plant and animal C have been depleted below socially efficient levels. USA. Kennedy School of Government.@ and economic research is well-positioned to make increasingly important contributions. depletion of stratospheric ozone. natural resource and environmental economics is a productive field of the discipline and one which shows considerable promise for the future. 285 . 171 Second Street. the scale of society has grown. and the atmospheric accumulation of greenhouse gases linked with global climate change. and more recently with regard to environmental quality. principally because of commons problems. the carrying-capacity of the planet C in regard to both natural resources and environmental quality C has become a greater concern. 1 This work is licensed under the Creative Commons Attribution-NoDerivs 3. and so commons problems have become more commonplace and more severe. John F. following Coase C have led to the degradation of air and water quality. and thereby has helped identify sensible policy solutions. While small communities frequently provide modes of oversight and methods for policing their citizens. The stocks of a diverse variety of renewable natural resources C including water. forests. Harvard University University Fellow. This is particularly true for common-property and open-access resources. To view a copy of this license. the world surely faces some Agrand challenges.Some Research Priorities in Environmental Economics Robert N. Economics C as a discipline C has gradually come to focus more and more attention on these commons problems. Today. or public goods. 94105. fisheries. The key challenges are associated presently not with better scientific understanding of the nature of the problem C although that surely is important C but rather with the fundamental economic and political barriers to policy action.0 Unported License. inappropriate disposal of hazardous waste. the greatest challenge the world faces in this realm is the threat of global climate change linked with the accumulation of greenhouse gases C including carbon dioxide (CO2) C in the atmosphere. In the environmental and resource sphere. San Francisco. Likewise. California. National Bureau of Economic Research As the United States and other economies have grown.

Second. In both cases. a carbon tax would raise revenues that can be used for beneficial public purposes. it is a stock. with greenhouse gases remaining in the atmosphere for decades to centuries. From an economic perspective. The ubiquitous nature of energy generation and use and the diversity of CO2 sources in a modern economy mean that conventional technology and performance standards would be infeasible and C in any event C excessively costly. I limit my attention to the means C the instruments C of climate policy. an auction mechanism under a cap-and-trade system can do the same. a tax approach eliminates the potential for price volatility that can exist under a cap-and-trade system. Whether a policy as significant as a national carbon tax would turn out to be simple in its implementation is an open question. Of course. In the temporal domain. it is important to ask what economics can say about the best instruments for national action. the direct benefits of taking action will inevitably be less than the costs. thereby lowering the social cost of the overall policy. Experience with previous cap-and-trade systems. whereas a minority have endorsed the use of cap-and-trade mechanisms. that is. in which firms would not need to manage and trade allowances. In the spatial domain. Therefore. A carbon tax C if implemented upstream (at the point of fossil-fuels entering the economy) and hence economy-wide C would appear to have some advantages over an equivalent upstream cap-and-trade system. Third. the nations of the world. magnitude. and thereby suggesting the importance of international C if not global C cooperation. for any individual political jurisdiction. First is the simplicity of the carbon tax system. With a cap-and-trade system. producing a classic free-rider problem. 286 . at least in the United States and other industrialized countries. There is widespread agreement among economists (and a diverse set other policy analysts) that economy-wide carbon pricing will be an essential ingredient of any policy that can achieve meaningful reductions of CO2 emissions cost-effectively. and the government would not need to track allowance transactions and ownership. this temporal flexibility needs to be built in through provisions for banking and borrowing of allowances. before turning to the topic of international cooperation. and so the nature. such as for cutting existing distortionary taxes. Despite the apparent necessity of international cooperation for the achievement of meaningful GHG targets. There is considerably less agreement among economists regarding the choice of carbon-pricing policy instrument. although economists have and will continue to make important contributions to analyses of the ends C the goals C of climate policy. It is fair to say that most academic economists have favored the use of carbon taxes. greenhouse gases uniformly mix in the atmosphere. indicates that the costs of trading institutions are not significant. however.Climate change is a commons problem of unparalleled magnitude along two key dimensions: temporal and spatial. not a flow problem. the key political unit of implementation Cand decision-making C for any international climate policy will be the sovereign state. and location of damages are independent of the location of emissions. This happens automatically with a carbon tax. Hence. it makes sense to allow emissions (of a stock pollutant) to vary from year to year with economic conditions that affect aggregate abatement costs.

To some degree. Instead. the compensation associated with free distribution of allowances based on historical activities can be mimicked under a tax regime. That system. First among these is the resistance to new taxes in many countries. but these do not raise the overall cost of the program nor affect its climate impacts. it has functioned as anticipated since then. Of course. New Zealand has launched a CO2 cap-and-trade system. and Australia and Japans are planning to do likewise. politics slowed developments in 2010. In June. The cap-and-trade approach avoids likely battles over tax exemptions among vulnerable industries and sectors that would drive up the costs of the program. despite the fact that the 2008-2009 recession has led to significantly lower allowance prices and hence fewer emission reductions. but domestic U. because with the former firms incur both abatement costs and the cost of tax payments to the government. Some observers worry about the political process= propensity under a cap-and-trade system to compensate (with free allowance allocations) sectors that claim to be burdened. Although the system had its share of problems in its pilot phase. Second. a carbon tax is more costly than a cap-and-trade system to the regulated sector. it has been argued that the two key questions that should be used to decide between these two policy approaches are: which is more politically feasible. and a cap-and-trade system without auctions). and may be expected to succumb in ways that are ultimately more harmful. In addition. responses to these questions have been provided by the political revealed preference of individual countries. including but not limited to the United States.R. and provide a straightforward means to compensate burdened sectors. a cap-and-trade system leads to battles over the allowance allocation. relative to 2005. but it is legislatively more complex. if enacted. The world=s largest cap-and-trade system is addressing Europe=s CO2 emissions C the European Union Emission Trading Scheme (EU ETS). it is easier to harmonize with other countries= carbon mitigation programs. cap-and-trade systems generate a natural unit of exchange for international harmonization linkage: allowances denominated in units of carbon content of fossil fuels (or CO2 emissions). CO2 emissions by 17 percent in 2020 and by 80 percent in 2050. through the effects of 287 . the two approaches have much in common. However. which are more likely to employ cap-and-trade than tax approaches. cap-and-trade approaches leave distributional issues up to politicians. A carbon tax is sensitive to the same pressures. international linkage can include carbon tax systems. Hence. through appropriate mechanisms. Despite these differences between carbon taxes and cap-and-trade. Third. the U. 2009.S.There are also a set of apparent disadvantages of carbon taxes. thereby simultaneously compromising environmental integrity.S. economy-wide capand-trade regime as part of H. This is the crucial political-economy distinction between the two approaches. Canada has indicated that it will launch a domestic system when and if the United States does so. 2454 C the American Clean Energy and Security Act of 2009 (otherwise known as the Waxman-Markey bill).S. as more and more sources (emission-reduction opportunities) are exempted from the program. and which is more likely to be well-designed. relative to a cap-and-trade regime. would reduce U. Therefore. 
In their simplest respective forms (a carbon tax without revenue recycling, for example), such market-based policies work through price signals on energy efficiency, fuel switching (coal to natural gas), land-use changes, and technological change. The best estimates of the costs are that they would be considerably less than 1 percent of GDP annually in the long term, thereby reducing the pace of economic growth such that 2050's expected economic output would be delayed by a few months. Political feasibility, however, is another matter. Although the U.S. House of Representatives passed an ambitious cap-and-trade bill, in 2010 the U.S. Senate opted to delay any and all climate legislation.

Although the industrialized countries accounted for the majority of annual CO2 emissions until 2004, that is no longer the case. China has surpassed the United States as the world's largest emitter, emissions in nearly all OECD countries are close to stable or falling, and most growth in CO2 emissions in the coming decades will come from countries outside of the Organization of Economic Cooperation and Development (OECD). Given the spatial and temporal nature of this global commons problem, it is clear that international cooperation will eventually be necessary. The Kyoto Protocol (1997) to the United Nations Framework Convention on Climate Change (1992) will expire in 2012 and is, in any event, insufficient to the long-term task, due to the exclusion of developing countries from responsibility. The most promising alternatives can, in principle, achieve reasonable environmental performance cost-effectively by including not only the currently industrialized nations but also the key emerging economies. A wide range of potential paths forward are possible, including top-down international agreements involving targets and timetables that involve more countries as they become more wealthy; harmonized national policies, such as domestic carbon taxes; and bottom-up, loosely coordinated national policies, such as the linkage of regional and national cap-and-trade systems through bilateral arrangements. Since sovereign nations cannot be compelled to act against their wishes, and political incentives around the world are to rely upon other nations to take action, successful cooperation, whether in the form of international treaties or less formal mechanisms, must create internal incentives for compliance, along with external incentives for participation.

With political stalemate in Washington, attention may increasingly turn to sub-national policies intended to address climate change. The Regional Greenhouse Gas Initiative (RGGI) in the Northeast has created a cap-and-trade system among electricity generators, and California's Global Warming Solutions Act (Assembly Bill 32) will lead to the creation of an ambitious set of climate initiatives, including a statewide cap-and-trade system, unless it is stopped by ballot initiative or a new Governor. The California system will be linked with systems in seven other states and four Canadian provinces under the Western Climate Initiative. These sub-national policies will interact in a variety of ways with Federal policy when and if a Federal policy is enacted. Some of these interactions would be problematic, such as the interaction between a Federal cap-and-trade system and a more ambitious cap-and-trade system in California under AB 32, while other interactions would be benign, such as RGGI becoming irrelevant in the face of a Federal cap-and-trade system that was both more stringent and broader in scope. Because no single approach guarantees a sure path to ultimate success, the best strategy to address this ultimate commons problem may be to pursue a variety of approaches simultaneously.
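The "delayed by a few months" claim is simple growth arithmetic. A minimal sketch of the calculation (Python; the 2.5 percent growth rate is an illustrative assumption, not a figure from the text):

# If mitigation shaves roughly 1 percent off the level of GDP in 2050, how long a
# delay in reaching that output level does this imply? The growth rate below is an
# assumed, illustrative value.
import math

annual_growth = 0.025   # assumed long-run growth rate
cost_share = 0.01       # mitigation cost as a share of GDP (the "less than 1 percent")

delay_years = math.log(1.0 + cost_share) / math.log(1.0 + annual_growth)
print(f"Implied delay: {delay_years * 12:.1f} months")   # roughly five months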


White Paper for NSF Grand Challenges
John Van Reenen, Professor of Economics at the London School of Economics and Director of the Centre for Economic Performance
September 6th 2010

Abstract
I discuss some developments in economics and what I think are "Grand Challenges" for the social sciences over the next 10-20 years. One recurrent theme is the importance of heterogeneity in performance between firms and how this links to management practices. Splitting economics from other NSF funding would also be desirable.

I. Introduction
I am writing this in response to a letter by Myron Guttman requesting "Grand Challenges" for the social sciences over the next two decades. This is a formidable task, akin to writing the music of the future. After all, if one could indeed write such lyrics, they would already have been penned. Nevertheless, I will take this opportunity to reveal my prejudices regarding key areas that would benefit from increased resources.

Two of the most important developments in economics in the last decade have been (i) the massive growth of micro-economic data and (ii) the methodological move towards credible identification. The growth of huge databases of firm-level information in the public and private sector has been driven by the phenomenal fall in the quality-adjusted price of information technology. This has made the storage, manipulation and analysis of data much easier and has led to a "Golden Age" of micro-econometric work. Liberalization of access to Census Bureau information has also helped, as have greater regulatory requirements for the disclosure of company accounts.

Alongside the flourishing of large-scale datasets has been a move towards more transparent methods of understanding the causal relations between variables. Researchers are now much more careful to seek to identify exogenous changes in the variable of interest (either from nature or policy-makers) or, if this is not possible, to design and implement their own (often randomized controlled) experiments. Although writing down a structural model and using an "off the shelf" secondary dataset (even if highly unsuited) still goes on, it is not in the dominant position that it once was. This is not to say that structural modelling has no place – it definitely does (see the sub-section on "methodology" below) – but theory is no substitute for good empirical design.

II. Grand Challenges – Some main themes

Organizational Heterogeneity

In my view, one of the most profound facts uncovered about modern economies is the huge variation in performance between plants and firms in narrowly defined industries. For example, within a typical four-digit sector in US manufacturing, output per worker is four times as high for the plant at the 90th percentile as for the plant at the 10th percentile. And for total factor productivity the difference is still about double. Even wider distributions are evident in other nations.

Most economists' initial reaction to these performance differences was denial. First, the view was that inputs and outputs were badly mismeasured. This is true, but better measurement actually tended to make the differences larger. For example, plant-level price information has recently become available for some industries, and when this is used to correct the measure of output (which typically used industry deflators), productivity differences were even wider (as the more efficient firms tended to charge lower prices). Second, it was said that these differences were purely transitory; they were not, as they are relatively persistent. Thirdly, it was argued that the estimation of production function parameters was flawed. There has been significant methodological advance in this area (and still more is needed), but the bottom line is that the differences persist under a wide variety of estimation procedures.

Many papers suggest that the evolution of productivity differences – through the creative destruction process of allocating more output to the most efficient and driving the less productive from the market – is a key factor in the time-series aggregate growth of nations and in aggregate TFP differences between nations (about half of the US-India difference, for example). The key challenge then is: what is the cause of these between-plant productivity differences?

Management Practices
One answer to the question on the causes of productivity heterogeneity is that the differences lie in technology. There has been a large and substantial literature looking at the various "hard" technological variables that influence productivity – R&D, patents, observable innovation measures, diffusion measures (especially information and communication technologies, ICT), etc. This is valuable, but (1) a large residual remains after accounting for these observable indicators of technology, and (2) the impact of technologies on productivity is very heterogeneous and seems to depend in a substantive way on the management of firms. Bloom, Sadun and Van Reenen (2007), for example, find that the impact of ICT is much stronger for firms with better "people management" (i.e. careful hiring, pay and promotion based on effort/ability rather than just tenure, rigorous procedures for dealing with underperformers, etc.). This suggests that management is a key factor in understanding productivity. This is, of course, only a proximate answer, because the deeper responses need to rest on structural features of societies – product, financial and labor markets, culture, etc. Nevertheless, unravelling the first part of the puzzle would be a start.
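The dispersion statistics cited at the start of this section are straightforward to reproduce once plant-level data are in hand. A minimal sketch (Python/pandas; the data and column names are invented placeholders, not a description of any specific Census product):

# Within-industry productivity dispersion: the 90th/10th percentile ratio of output
# per worker, computed separately for each narrowly defined (e.g. four-digit) industry.
# The toy data and column names below are hypothetical.
import numpy as np
import pandas as pd

plants = pd.DataFrame({
    "industry": ["3674", "3674", "3674", "3674", "2011", "2011", "2011", "2011"],
    "output":   [120.0, 300.0, 80.0, 500.0, 90.0, 60.0, 150.0, 210.0],
    "workers":  [10, 12, 9, 15, 8, 7, 9, 11],
})
plants["labor_productivity"] = plants["output"] / plants["workers"]

def p90_p10_ratio(x):
    return np.percentile(x, 90) / np.percentile(x, 10)

dispersion = plants.groupby("industry")["labor_productivity"].apply(p90_p10_ratio)
print(dispersion)   # one 90/10 ratio per industry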

There are three big challenges here. First, how to quantify management practices across different organizations in a comparable way. Second, is the correlation of management with productivity causal? And third, what are the theories that can account for the relationship? We discuss these three questions of measurement, identification and theory in turn.

On the measurement side, there have been some advances in recent years (see Bloom and Van Reenen, 2010, for a discussion), but the challenge is how to develop such methods further and how to integrate them into standard statistical series such as the Economic Census. Can the (high-skilled) labor-intensive methods of Bloom and Van Reenen (2007) be simplified so that they can be mainstreamed into statistical agencies' routine data collection? This needs to be done internationally to obtain cross-country comparisons. (A stylized illustration of this kind of score-building appears below.)

On the identification side, how can we get at causal effects? The gold standard approach here is, in my view, randomised control trials. Although these are expensive, it is difficult to see how the evidence can be made secure without this type of approach. The empirical basis for organizational economics is based too much on case studies and anecdote rather than solid data.

On the theory side, there are now a wide range of models that seek to account for the heterogeneity of management. Although some management styles are fads and fashions, mainstream modern economics correctly deems them as part of the chosen organizational design of the firm. This "design" approach applies much of standard optimization and equilibrium concepts to the theory of the firm (e.g. Personnel economics). Although powerful, there is increasingly an element of management that is linked to productivity that makes it more akin to a technology. This may be static and non-transferable (embodied in people as in Lucas, 1978, or in firms as in Melitz, 2003) or dynamic and transferred between firms like any other technology. This is still poorly understood and needs theoretical development.

Intangible Capital
Management is one part of the intangible capital of the firm. Much more so than conventional forms of capital, intangible capital is beset by uncertainties, externalities and potential failures of financial markets. Further, the core assets of firms are not easily captured on the company balance sheet, and the assets of a nation are barely tracked (human capital, intellectual property, brands and marketing, for example). There is a challenge to better measure and understand the accumulation of these intangible assets.

Unlocking Business Data
Large amounts of data are collected by private-sector firms and kept secret. This also used to be the case for governments, but increasingly these are being opened. Firms tended to underutilize their data, but with the new abundance of information, firms are starting to use their data more systematically. Just as with government-academic cooperation, there is a huge opportunity to make more business data available to tackle the questions of heterogeneity and the causal impact of business practices.
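As a rough illustration of the measurement challenge above: survey scores of the Bloom-Van Reenen type are typically turned into a firm-level index by normalizing each practice item and averaging. The sketch below assumes that convention; the item names and data are invented, so it is illustrative rather than a description of the actual survey instrument.

# Aggregate interview-based management practice scores (each item scored 1-5) into a
# firm-level index by z-scoring each item and averaging, then relate the index to a
# synthetic productivity outcome. Item names and data are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_firms = 200
items = ["monitoring", "targets", "incentives"]    # hypothetical practice items
scores = pd.DataFrame(rng.integers(1, 6, size=(n_firms, len(items))), columns=items)

z = (scores - scores.mean()) / scores.std(ddof=0)  # z-score each item
management = z.mean(axis=1)                        # firm-level management index

log_tfp = 0.3 * management + rng.normal(0.0, 1.0, n_firms)   # synthetic outcome
slope = np.cov(management, log_tfp, ddof=1)[0, 1] / np.var(management, ddof=1)
print(f"OLS slope of (synthetic) log TFP on the management index: {slope:.2f}")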

Macro-economics and Finance
The trends towards credible identification and deeper use of micro-data have penetrated some fields more than others. Macro-economics at some point seemed to turn its back on data and retreat into a focus on tightly specified models, with empirical data used loosely to calibrate the parameters of these models. Some of its problems are inescapable – a paucity of data on severe downturns, and difficulty in running experiments. But a grand challenge for macro is to reflect the economic reality of frictions much more seriously. Most macro-models share the unfortunate assumption of frictionless financial markets, an assumption that has fared extraordinarily badly over the financial crisis. Macro-economics needs a Perestroika moment where the imperfections of financial markets take pride of place. It also needs to re-discover respect for data and causal identification.

III. Other Themes and Challenges
These are mainly obvious, so I will list them in a rather staccato way:
- A richer conception of human capital. On the other end of the scale to macro, we need a richer concept of human capital. People's faculties rely not just on their physical and cognitive endowments, but also on their non-cognitive resiliency. Studies of the human brain and behavior have shown how important these non-cognitive aspects are in economic behavior. How can we model the accumulation of mental health? What policies best influence the development of human capital in this respect?
- Climate change and its economic effects. What are the adaptation policies? How can policy be used to influence innovation to tackle climate change?
- The growth of emerging powers, above all China. What effect will this have on the political economy of the world?
- Demographics. The impact of aging and changing demographics.
- Africa. Why has Africa stayed so poor? Is this going to change?
- Methodology. The best methods combine credible identification with good theory. Encouragement for work which combines experimental and quasi-experimental evidence with theory (so structural estimation is possible) is the ideal.

IV. Recommendations
Funding should focus on the areas identified above, especially for III. Economists have the best tools to tackle the questions I have identified, and I think it would be better to split NSF funding so there was a distinct stream solely for economics, rather than mixing the funding stream with other disciplines. Since many of my themes cross disciplines, there is ample scope for inter-disciplinary work. Yet my experience is that the best research is done in the discipline one knows, and setting up explicit inter-disciplinary funding leads only to tokenism.

References
Bloom, Nick and John Van Reenen (2007) "Measuring and Explaining Management Practices Across Firms and Nations," Quarterly Journal of Economics 122(4), 1351–1408. http://cep.lse.ac.uk/pubs/download/dp0716.pdf
Bloom, Nick, Raffaella Sadun and John Van Reenen (2007) "Americans Do I.T. Better: US Multinationals and the Productivity Miracle," NBER Working Paper No. 13085, forthcoming, American Economic Review. http://cep.lse.ac.uk/pubs/download/dp0788.pdf
Bloom, Nick and John Van Reenen (2010) "Human Resource Management and Productivity," in Handbook of Labor Economics, Volume IV (edited by Orley Ashenfelter and David Card). http://cep.lse.ac.uk/pubs/download/dp0982.pdf

j.vanreenen@lse.ac.uk


Grand Challenge for NSF-SBE in the Next Decade: Clinical Trials in Economics
Hal Varian
6 September 2010

Abstract. The gold standard for scientific research is reproducible controlled experiments. In the last two or three decades economics has made much progress in implementing experiments in both the laboratory and in the field. I propose that the NSF should set up a program to fund field experiments/clinical trials in a variety of areas in economics. These clinical trials should be designed to resolve fundamental debates in economics.

Discussion. For example, over the last several decades economists have built up an impressive body of literature on game theory and strategic behavior, but there has been comparatively little empirical work, except in the relatively constrained environment of the laboratory. This is a ripe area for analysis. I am well aware that the NSF has been funding field experiments in a variety of areas, including welfare payments, educational issues, and development economics, to name just a few. My proposal is to create a special program in experimental design and analysis for these and other topics.

Proposals for experimental designs should be submitted to a special program and be reviewed by referees and a panel of experts. Unlike current proposals, we would expect some iteration with respect to the experimental design. When a consensus (or a significant majority) is reached about experimental design, funding would be offered to the researchers, as with current experiments. It would be helpful to involve researchers from public health and other fields who are familiar with the problems involved with large clinical trials. It would be particularly helpful for researchers to be on the alert for "natural experiments" and recruit subjects in both treatment and control groups to facilitate analysis of the experiments. I believe that there should be a bias towards policy-relevant experiments (think of rent control as an example), but proposals for fundamental topics in human behavior should also be entertained. There should also be educational funding for summer courses in experimental design and related topics to make sure that all economics graduates have some training in experimental design and analysis.

Long-running panels such as the PSID have been hugely helpful in understanding economic behavior at the individual and household level. My understanding is that interviews and surveys are still the basis for much of the analysis. I believe that monitoring technology available today can offer substantial improvement on these traditional methods by making the gathering of data less onerous and more accurate. Such monitoring technology would also be very helpful for the shorter clinical trials I am describing here.
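For the clinical trials proposed here, the core analysis is often nothing more exotic than a difference in means between randomized groups, with a standard error attached. A minimal sketch (Python; the outcome data are synthetic and purely illustrative):

# Analyze a simple randomized trial: the difference in mean outcomes between treatment
# and control, with an unequal-variance standard error. Synthetic data for illustration.
import numpy as np

rng = np.random.default_rng(42)
control = rng.normal(loc=100.0, scale=15.0, size=500)   # e.g. spending or a test score
treated = rng.normal(loc=104.0, scale=15.0, size=500)   # a true effect of +4 is built in

effect = treated.mean() - control.mean()
se = np.sqrt(treated.var(ddof=1) / treated.size + control.var(ddof=1) / control.size)
print(f"Estimated treatment effect: {effect:.2f} "
      f"(95% CI roughly {effect - 1.96 * se:.2f} to {effect + 1.96 * se:.2f})")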

A substantial amount of money is being invested by the private sector in the design and funding of panels for purposes of marketing. In the marketing literature, these are referred to as single-source panels. Many of these are quite sophisticated and there is a strong possibility for public-private research co-operation in this area. In many cases, the panel members can receive different experimental treatments using quite sophisticated techniques. The data from such studies could be hugely valuable to economists studying household behavior.

For example, Nielsen Homescan maintains a standing panel of consumers who scan every item purchased on a weekly basis. This provides a wealth of data for marketing, of course, but imagine what you could do by examining how purchases respond to changes in employment, taxes, consumer confidence, family composition, and the like. With only a small amount of effort, these panels could provide highly useful scientific data. This is only a single example; there are several other single-source panels that are providing on-going data for various commercial purchases. The marketing uses of the data require current data, but many of the other economic uses can be conducted with historical data, providing a natural way of addressing scientific and commercial needs. Conversely, one might speculate that existing efforts such as the PSID would provide commercially useful data as well (on a non-proprietary, non-exclusive basis).

This work is licensed under the Creative Commons Attribution Non-Commercial Share Alike license.

GRAND CHALLENGES FOR THE SCIENTIFIC STUDY OF AGING
David R. Weir
University of Michigan
October 15, 2010

The demographic transition that began at the end of the eighteenth century saw dramatic reductions in infant and child mortality, accompanied by declining fertility. Both trends resulted in populations whose stable dynamics imply much older populations. Moreover, the enormous success at defeating infant mortality has left very little room for further improvements to have much quantitative effect on life expectancy. The demographic transition of the recent past and long-term future is one in which adult and especially older-age mortality and morbidity will be the main stage for large changes in population dynamics. In contrast to that first demographic transition, in which the prevention and "curing" of infectious diseases was central, this one involves chronic diseases that cannot in general be definitively prevented or cured but rather postponed and managed to limit their impact on healthy functioning and mortality. These two features—population aging as a social-demographic fact, and individual aging as a target for health improvement—pose the grand challenges for behavioral science I wish to address.

Integrate behavioral and biological sciences. The first grand challenge is to better integrate the behavioral and biological sciences to their mutual benefit. The Health and Retirement Study (HRS), a cooperative agreement between its sponsor the National Institute on Aging (NIA) and the University of Michigan, which I now direct, is one of several population-based surveys that are integrating biological measures. We are now moving into a leadership role in the integration of genetics, with funding to genotype 20,000 respondents using a current state-of-the-art 2.5 million SNP chip. This will open the door to investigation by many and varied multidisciplinary teams of scholars to better understand the behaviors and health conditions that either advance or delay the progression of aging. Behavior and decision-making are important to chronic disease management, as they are to other aspects of healthy aging. Understanding the genetics of both chronic disease and the behaviors and decision propensities needed to manage it will be critical to further progress.

What then is the benefit to genetics or the biological sciences from an integration with behavioral sciences? One crucial link is gene expression. To use a common analogy, DNA is the blueprint and gene expression is the contractor who actually implements the design. Genes express their influence over the construction of proteins differently at different times, and this appears to be subject to social influences in at least some cases. That means that studying genetic associations without understanding the influence of social environments is inherently limited. Moreover, the fact that humans can both choose and manipulate their environments (e.g., choose a low-risk environment if one has genes that produce a low biological tolerance for risk) means that the links between genetics and outcomes can appear distorted.

This work is licensed under the Creative Commons Attribution-NoDerivs 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nd/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.

The challenge to the institutions that support scientific research is to support the large sample sizes needed to make and replicate genetic inferences. The challenge to the scientists is to better understand the genome and its expression to focus the effort more economically on the right genes, the right proteins, and the right phenotypic traits.

Expand life-course perspective. Another way in which this new demographic transition differs from the old is that the health of older persons depends on their own individual histories in ways that the health of a newborn simply cannot, and longitudinal study designs are crucial to research on aging. But most longitudinal studies begin in middle age or later. Greater attention must be given to study designs that allow early-life exposures, experiences, and characteristics to be included in the analysis of outcomes in later life. Birth cohort studies, of which there are several good examples, can achieve this and deserve support now even though their usefulness for aging research is several decades off. The more immediate priority is to fill in individual histories on persons currently in late adulthood. Retrospection has some obvious dangers if recall of earlier risk factors is influenced by the current presence of outcomes the respondent believes are due to those risk factors. That bias is not universal and can be minimized by obtaining retrospective reports before the outcomes are present, and through better question design as we have attempted to do in HRS. The other approach is to exploit studies done many years ago that provide observations on early life. Finding such resources, and finding and studying their participants, should be a top scientific priority.

Promote international comparison. Longitudinal studies are critical to the study of aging because it is fundamentally about change and the pace of change in functioning (Hauser and Weir, forthcoming). A persistent problem in behavioral research is endogeneity, or reverse causality, which clouds the association of two variables. Longitudinal data can be valuable in this regard, but not always sufficient. A recent example is the relationship of retirement to cognitive decline. Across individuals, it would be nearly impossible to determine whether cognitive decline caused early retirement or early retirement caused cognitive decline. But some countries have developed generous early retirement policies and others have not, resulting in fairly large differences in retirement ages across countries that are not likely due to cognition. International comparisons exploit the natural experiments of national histories to provide some truly exogenous variation. Combining several studies based on the HRS model, Rohwedder and Willis (2010) showed that country variation in retirement age predicted country differences in the pace of cognitive decline. The HRS model has been successful at encouraging comparable designs elsewhere. The ability to conduct international comparisons requires a high degree of cooperation or "harmonization" among scientists and studies. Such cooperation can be further encouraged by the right incentives from research organizations.

REFERENCES
Hauser, Robert M., and David R. Weir. Forthcoming. "Recent Developments in Longitudinal Studies of Aging." Demography.
Rohwedder, Susann, and Robert J. Willis. 2010. "Mental Retirement." Journal of Economic Perspectives 24(1): 119–38.


Sensitivity Analysis through mixed Gini and OLS regressions
Paper submitted to the NSF
By Shlomo Yitzhaki, The Government Statistician, Israel, and Professor Emeritus, The Hebrew University
Shlomo.yitzhaki@huji.ac.il

Abstract
About thirty years ago Edward Leamer criticized the credibility of empirical research in economics. Since then there have been huge improvements in research design, data collection and econometric methodology. On the other hand, the huge increase in computing power has increased the number of instruments available for the use of the over-zealous researcher who wants to prove his point. I suggest developing the mixed Gini and Ordinary Least Squares regression. It enables unraveling, tracing and testing the role of several whimsical assumptions imposed on the data in regression analysis. Among those assumptions are the linearity assumption, the use of monotonic increasing transformations, and the symmetry between distributions that is imposed by the Pearson correlation coefficient. My conjecture is that the new technique will reduce drastically the number of results that are claimed to be supported by empirical "proofs".

This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.

1. What is the fundamental question?
A popular method in reaching quantitative conclusions in research is the method of regression. The most popular one is Ordinary Least Squares, which is based on the properties of the variance. It is clear that this area of research suffers from a lack of credibility. This was pointed out by Edward Leamer, who states that "Hardly anyone takes data analysis seriously."[1] Leamer traced the lack of credibility to lack of robustness because of sensitivity to key assumptions he called "whimsical". My interpretation of whimsical assumptions is those that are imposed on the data, affect the coefficients in a drastic way, but are not supported by the data. In a recent paper Angrist and Pischke responded by pointing out the huge improvements in research design, better data collection, better definitions of the research question, and more.[2] I do not deny the improvements pointed out. However, as far as I can see, the methodology of estimation has not changed in a qualitative way. More computer power allows more complicated modeling and data mining. But some of the assumptions that are not supported by the data, and that may drive the results, are still there.

To see whether Leamer's criticism is still valid, let me ask the following question: is it possible that two investigators, using the same data and an identical model, can reach opposite conclusions concerning the partial effect of one variable on another? Golan and Yitzhaki supply a positive answer to this question.[3] They show that if one investigator uses Gini regression and the other OLS regression, then the sign of some regression coefficients differ. Yet both regression methods rely on plausible properties of the regression model and can be described as innocent applications of the methodology. To simplify the presentation, I will restrict my arguments to the OLS, although the arguments apply, with some modifications, to other methods.

[1] E. Leamer, Let's Take the Con Out of Econometrics, American Economic Review 73, no. 1 (1983): 37.
[2] J. Angrist and J.S. Pischke, The Credibility Revolution in Empirical Economics: How Better Research Design is Taking the Con out of Econometrics, NBER Working Paper No. 15794, http://www.nber.org/papers/w15794
[3] Y. Golan and S. Yitzhaki, Who Does Not Respond in the Social Survey: An Exercise in OLS and Gini Regressions, Draft, Presented at the 31st IARIW, http://www.iariw.org/c2010.php

My argument is that there are too many tools in the arsenal of the researcher to influence the results of the regression. In some sense the target of the over-zealous researcher is to prove his point, which in some cases translates into searching for the model that can get the desired results. I believe that Leamer's critique can be answered by developing a better technique that incorporates the properties of OLS as a special case, but reveals more about the critical underlying statistical assumptions. My aim is to provide a method that exposes the hidden and redundant assumptions that are responsible for the results. For this purpose, we need a methodology that "reveals more" (the term was coined by Lambert and Aronson). The suggested methodology is based on the properties of Gini's Mean Difference (and the Extended Gini family), which has many properties that are similar to those of the variance (and which nests the variance). Hence, it can be used together with the OLS to see how robust the conclusions derived by the regression are.

The "reveal more" includes the following. The basic assumption in a regression is that there exists a linear model connecting the variables. If the underlying model is not linear, then the method of regression would result in a different estimate. In such a case, linearity is a "whimsical" assumption. To see the effect of this assumption, consider the following: some variables are not related to each other in a monotonic way. As an example, let us consider age. The association of many variables with age is a U-shape (or an inverse U-shape) relationship. Yitzhaki showed that both the OLS and Gini methods result in a regression coefficient which is a weighted average of slopes between adjacent points of the independent variable.[4] The method of regression determines the weighting scheme. The composition of the sample, the way the age variable is introduced in the regression, the use of monotonic increasing transformations, and the regression method may therefore determine the sign of the regression coefficient with respect to age. These factors may also determine the sign of the regression coefficients of the other variables that are included as independent variables together with age.

[4] S. Yitzhaki, On Using Linear Regression in Welfare Economics, Journal of Business & Economic Statistics 14, no. 4 (October 1996): 478-86.
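To make the age example concrete: the simple Gini regression slope is commonly computed as cov(y, F(x)) / cov(x, F(x)), where F(x) is the empirical rank of x, while OLS uses cov(y, x) / var(x); both are weighted averages of the slopes between adjacent observations, just with different weights. The sketch below (Python, with an invented skewed "age"-like variable and a U-shaped outcome) only illustrates that the two weighting schemes can disagree, even on sign; it is not taken from the paper.

# Compare the OLS slope with the simple Gini regression slope,
#   beta_gini = cov(y, rank(x)) / cov(x, rank(x)),
# when y is a U-shaped (non-monotonic) function of a skewed regressor x.
# All data are synthetic; with a uniformly distributed x the two estimators coincide.
import numpy as np

rng = np.random.default_rng(7)
n = 5000
x = 25.0 + rng.exponential(scale=15.0, size=n)        # skewed "age"-like regressor
y = (x - 50.0) ** 2 / 50.0 + rng.normal(0.0, 1.0, n)  # U-shaped relationship

rank_x = np.argsort(np.argsort(x)) / (n - 1.0)        # empirical cdf (ranks scaled to [0, 1])

def c(a, b):
    return np.mean((a - a.mean()) * (b - b.mean()))

beta_ols = c(y, x) / c(x, x)
beta_gini = c(y, rank_x) / c(x, rank_x)
print(f"OLS slope:  {beta_ols:+.3f}")    # positive in expectation for this design
print(f"Gini slope: {beta_gini:+.3f}")   # negative in expectation for this design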

The Gini methodology enables the researcher and the reader to know whether non-monotonic relationships among the variables exist.[5] The same problem exists if the relationship is monotonic but not linear. The Gini method also enables the researcher to test for linearity. If linearity is rejected, then the model should be viewed as a linear approximation that is not useful for prediction.

Monotonic transformations, which are a legitimate tool to use in modeling the relationship between variables, can change the sign of the relationship in the case of a non-monotonic relationship. Monotonic transformations include using a different functional form, restricting the sample from above or/and below, and binning (making a continuous variable a discrete one). The reason is that the regression method or a monotonic transformation can change the magnitude of the regression coefficient and the magnitude of its correlation with other variables. This in turn can change the sign of a regression coefficient of another variable in the regression. Of course such transformations of variables change the properties of the data, and therefore should be used sparingly and, when used, should be carefully documented and justified.

In at least two areas of social science, economic theory calls for asymmetric treatment of the data. It arises because of the assumption of declining marginal utility of income that is made in the areas of decision making under risk and income distribution. To illustrate, a researcher that uses a linear expenditure system estimated by OLS in order to design poverty-reducing subsidies estimates the income elasticities of the commodities by the properties of the top deciles, totally ignoring the poor.[6] It is shown in Yitzhaki that statistical theory may contradict economic theory if some "whimsical" assumptions (like linearity of the model that includes income as an independent variable) are imposed but not supported by the data. The extended Gini allows the researcher to impose (and reveal to the reader) her social (or risk) attitude, and to impose the statistical measure of variability that reflects the social attitude on the analysis. This way one first reveals one's social attitude and then imposes it on the analysis. The Extended Gini regression keeps the properties of the data intact but reveals more, by applying a transformation on the weighting scheme.

[5] E. Schechtman, S. Yitzhaki and T. Pudalov, Gini's Multiple Regressions: Two Approaches and Their Interaction (2010), http://ssrn.com
[6] Yitzhaki (1996), Ibid.

Another instrument in the hand of an over-zealous or sophisticated researcher is the Pearson correlation coefficient. Its "official" range is between minus one and one. But if the underlying distributions of the variables are different, then its range can be limited. As an example, consider two lognormally distributed variables, for which the Pearson correlation coefficient is bounded from below by -0.36.[7] That is, by applying the transformation exp(x) to two normally distributed variables, we are able to change the correlation coefficient from minus one to -0.36. The problem is especially relevant and severe in the field of finance, where additive and multiplicative relationships are mixed. (A transfer of a dollar from one asset to the other is additive, while interest over time is multiplicative, because compound interest is used.) Levy and Schwarz have shown that the Pearson correlation between the returns of two assets will converge to zero, no matter what the periodical correlation coefficient is, provided that one estimates them over a long enough period.[8]

The implications of advancing the domain: to answer these questions, we need to explain the difference between the OLS/variance and Gini-based methodologies. Both are based on averaging the difference between all pairs of observations. The difference is in the metric used to define the distance between observations: the variance is based on a Euclidean metric, the Gini on the "city block" metric (that is, one can move east/west or north/south as in Manhattan). It is not obvious a priori which metric is more appropriate for particular social science applications. As Yitzhaki develops, the Gini is the only measure of variability that can be decomposed in a way that resembles the decomposition of the variance.[9] In this context "reveals more" means that the structure of the decomposition of the Gini of a linear combination of random variables can be identical to the structure of the decomposition of the variance, provided that certain properties of the underlying distributions hold.

[7] E. Schechtman and S. Yitzhaki, On the Proper Bounds of the Gini Correlation, Economics Letters 63, no. 2 (May 1999): 133-138.
[8] H. Levy and G. Schwarz, Correlation and the Time Interval over which the Variables Are Measured, Journal of Econometrics 76 (1997): 341.
[9] S. Yitzhaki, Gini's Mean Difference: A Superior Measure of Variability for Non-Normal Distributions, Metron LXI, no. 2 (2003): 285-316.
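The lognormal bound mentioned above is easy to check by simulation. A minimal sketch (Python, assuming standard normal variables before the exponential transformation, which is the textbook case rather than anything specified in this paper):

# Two standard normal variables with correlation -1 become, after the transformation
# exp(.), two lognormal variables whose Pearson correlation is only about -0.37,
# close to the lower bound discussed in the text. Purely a simulation check.
import numpy as np

rng = np.random.default_rng(123)
z = rng.standard_normal(1_000_000)
x, y = z, -z                       # perfectly negatively correlated normals
u, v = np.exp(x), np.exp(y)        # lognormal transforms

print(f"corr(x, y)           = {np.corrcoef(x, y)[0, 1]:+.3f}")   # -1.000
print(f"corr(exp(x), exp(y)) = {np.corrcoef(u, v)[0, 1]:+.3f}")   # about -0.37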

Therefore, by using the decomposition of the Gini we can learn about the implicit assumptions behind the use of the variance method. Among the implicit assumptions is the imposition of a symmetric relationship in correlations, and linearity.

To see how this can be done, note that the Gini methodology allows for mixed regressions in the following sense: one can run an OLS regression as well as a Gini regression. If the signs of all regression coefficients are identical and the values do not differ very much, then we may conclude that the regression results are robust. If they are not equal, then one can investigate the simple regression coefficients for a non-monotonic or non-linear relationship. Should one detect such a change, then the researcher should run a mixed regression where the investigator can choose which independent variable to treat according to OLS and which according to the Gini. This enables the researcher to move from one regression to the other in a step-wise way, so that one can identify the variable(s) that are responsible for a change in sign that can happen in another independent variable. Under the Gini methodology there are two correlation coefficients between two variables; if they are equal, then one gets the same decomposition as is done under the variance, and one can replicate every textbook in econometrics. On the other hand, if the signs are not equal, then a symmetric relationship is imposed on the asymmetric relationship, leading to some of the results mentioned above.[10]

Thus using the Gini methodology expands the number of sensitivity tests one has to perform to demonstrate a robust relationship. Adopting the methodology will reduce the number of new "findings" in the social sciences, which will result in a reduction of the quantity of results "proven" by regressions. The range of cases for which researchers will have to admit that no answer can be confidently claimed will likewise expand, which in turn will increase the trust in empirical results, further supporting the "credibility revolution" championed by Angrist and Pischke.[11] However, at this stage the Gini regression is much more complicated than the OLS: friendly software has to be developed, and it is not clear that there are low-hanging fruits in every area.

[10] Golan and Yitzhaki (2010), Ibid.
[11] Angrist and Pischke (2010), Ibid.
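A sketch of the comparison step, treating the Gini multiple regression as a rank-instrumented analogue of OLS (one common computational route; this is an illustration under that assumption, with synthetic data, not an implementation of the full Extended Gini machinery described above):

# Compare coefficient signs from OLS with those from a Gini-type regression whose
# normal equations use cov(x_j, rank(x_k)) in place of cov(x_j, x_k). Synthetic data.
import numpy as np

rng = np.random.default_rng(3)
n = 2000
x1 = np.exp(rng.standard_normal(n))           # a skewed regressor
x2 = 0.5 * x1 + rng.standard_normal(n)        # a second regressor, correlated with x1
y = 1.0 - 0.5 * np.log(x1) + 0.3 * x2 + rng.normal(0.0, 0.5, n)   # nonlinear in x1

def ranks(v):
    return np.argsort(np.argsort(v)) / (n - 1.0)

X = np.column_stack([np.ones(n), x1, x2])
R = np.column_stack([np.ones(n), ranks(x1), ranks(x2)])

beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
beta_gini = np.linalg.solve(R.T @ X, R.T @ y)

print("OLS  coefficients on (x1, x2):", np.round(beta_ols[1:], 3))
print("Gini coefficients on (x1, x2):", np.round(beta_gini[1:], 3))
# If the signs agree and the magnitudes are close, the result is robust in the sense
# described above; if they disagree, the mixed OLS/Gini regression can be used to
# trace which variable is responsible.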

For Further Reading:
Schechtman, E., S. Yitzhaki and T. Pudalov (2010). Gini's multiple regressions: two approaches and their interaction. Metron, forthcoming. http://ssrn.com.
Yitzhaki, S. (1996). On Using Linear Regression in Welfare Economics. Journal of Business & Economic Statistics, 14, 4, October, 478-86.
Yitzhaki, S. (2003). Gini's mean difference: A superior measure of variability for non-normal distributions. Metron, LXI, 2, 285-316.
