Psychological Review
Psychological Entropy: A Framework for Understanding Uncertainty-Related Anxiety
Jacob B. Hirsh, Raymond A. Mar, and Jordan B. Peterson

Online First Publication, January 16, 2012. doi: 10.1037/a0026767

CITATION
Hirsh, J. B., Mar, R. A., & Peterson, J. B. (2012, January 16). Psychological Entropy: A Framework for Understanding Uncertainty-Related Anxiety. Psychological Review. Advance online publication. doi: 10.1037/a0026767
Psychological Entropy: A Framework for Understanding Uncertainty-Related Anxiety
Jacob B. Hirsh
University of Toronto
Raymond A. Mar
York University
Jordan B. Peterson
University of Toronto
Entropy, a concept derived from thermodynamics and information theory, describes the amount of uncertainty and disorder within a system. Self-organizing systems engage in a continual dialogue with the environment and must adapt themselves to changing circumstances to keep internal entropy at a manageable level. We propose the entropy model of uncertainty (EMU), an integrative theoretical framework that applies the idea of entropy to the human information system to understand uncertainty-related anxiety. Four major tenets of EMU are proposed: (a) Uncertainty poses a critical adaptive challenge for any organism, so individuals are motivated to keep it at a manageable level; (b) uncertainty emerges as a function of the conflict between competing perceptual and behavioral affordances; (c) adopting clear goals and belief structures helps to constrain the experience of uncertainty by reducing the spread of competing affordances; and (d) uncertainty is experienced subjectively as anxiety and is associated with activity in the anterior cingulate cortex and with heightened noradrenaline release. By placing the discussion of uncertainty management, a fundamental biological necessity, within the framework of information theory and self-organizing systems, our model helps to situate key psychological processes within a broader physical, conceptual, and evolutionary context.
Keywords: entropy, uncertainty, anxiety, behavioral inhibition, self-organization
Recent years have witnessed a growing interest in the topic of uncertainty (Heine, Proulx, & Vohs, 2006; Hogg, 2000; McGregor, Zanna, Holmes, & Spencer, 2001; Peterson, 1999; van den Bos, 2001). As the body of research on uncertainty continues to grow, the need for an integrative theoretical framework to establish its psychological significance and provide a context for its neural underpinnings and behavioral consequences has become increasingly apparent. In the current article, we propose that the concept of entropy as derived from information theory provides a useful framework for understanding the nature and psychological impact of uncertainty. By drawing upon dynamical models of self-organizing systems, we argue that uncertainty presents a fundamental (and unavoidable) challenge to the integrity of any complex organism. The entropy-based model developed throughout this article provides an organizing framework for understanding the critical importance of uncertainty management for an individual's survival, well-being, and productivity, situated within a broader evolutionary and physical context. In doing so, it helps to draw together numerous research literatures in which uncertainty plays an important role, integrating them into a coherent theoretical framework for conceptualizing the neural and behavioral responses to uncertain situations.

The article proposes the entropy model of uncertainty (EMU), a framework based on four major tenets: (a) Uncertainty poses a critical adaptive challenge for any organism, so individuals are motivated to keep it at a manageable level; (b) uncertainty emerges as a function of the conflict between competing perceptual and behavioral affordances; (c) adopting clear goals and belief structures helps to constrain the experience of uncertainty by reducing the spread of competing affordances; and (d) uncertainty is experienced subjectively as anxiety¹ and is associated with activity in the anterior cingulate cortex and heightened noradrenaline release.

We begin by describing the origins and definitions of the entropy construct, outlining its relevance for biological organisms in general and human behavior in particular. We then apply this idea to cognitive processes by introducing the construct of psychological entropy, defined as the experience of conflicting perceptual and behavioral affordances. We next examine how EMU accounts for our current understanding of the neurophysiology of uncertainty. Finally, we discuss how the cognitive and behavioral consequences of heightened uncertainty can be understood within this entropy-based framework.
¹ We are using the term anxiety in the same manner as Gray and McNaughton (2000), who distinguished it from the emotion of fear. This distinction is further elaborated upon later.

Jacob B. Hirsh, Rotman School of Management, University of Toronto, Toronto, Ontario, Canada; Raymond A. Mar, Department of Psychology, York University, Toronto, Ontario, Canada; Jordan B. Peterson, Department of Psychology, University of Toronto.

Correspondence concerning this article should be addressed to Jacob B. Hirsh, 105 St. George Street, Toronto, Ontario M5S 3E6, Canada. E-mail: jacob.hirsh@rotman.utoronto.ca
Psychological Review © 2011 American Psychological Association, Vol. 00, No. 00, 000–000. 0033-295X/11/$12.00 DOI: 10.1037/a0026767
Entropy
Rudolf Clausius (1865), working in the field of thermodynamics, originally defined entropy as the amount of energy within a system that cannot be used to perform work (i.e., cannot be used to transform the system from one state to another). Maximum entropy occurs during complete thermodynamic equilibrium, when energy is equally dispersed across all parts of a system. At this point, no useful work can be performed, as work always depends on the movement of energy from one area to another.

Ludwig Boltzmann (1877), a defining figure in statistical mechanics, extended this work by defining entropy as a function of the number of microstates that could potentially comprise a particular macrostate, mathematically linking this definition to Clausius's thermodynamic concept. The more microstates that are possible, given any particular macrostate, the higher the entropy of the observed system. In this respect, entropy reflects the amount of uncertainty about a system: The greater the number of plausible microstates, the more uncertainty about which microstate currently defines the system.

Since World War II, the concept of entropy has been generalized to all information systems, not just thermodynamic ones. Claude Shannon (1948), a seminal figure in the field of information theory, defined entropy as the amount of uncertainty associated with a random variable. Shannon demonstrated that the information content of a given signal could be measured as a function of the number of signals that could potentially have been received. The information content of a signal is thus quantified in relation to the amount of uncertainty reduced by receiving the message; this is directly linked to the prior distribution of possible outcomes. Although these various definitions of entropy are all mathematically related, only this latter conceptualization has the advantage of generalizing to a broad range of information systems (Jaynes, 1957; Pierce, 1980).
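Shannon's measure has a simple closed form: for a discrete distribution, entropy is the sum of -p log2(p) over all outcomes, measured in bits. The following minimal Python sketch (ours, not part of the original article) makes the definition concrete:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy: sum of -p * log2(p), in bits.
    Outcomes with p = 0 contribute nothing and are skipped."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit.
print(shannon_entropy([0.5, 0.5]))    # 1.0
# A heavily biased coin is far more predictable, so entropy drops.
print(shannon_entropy([0.99, 0.01]))  # ~0.08 bits
# A certain outcome carries no uncertainty at all.
print(shannon_entropy([1.0]))         # 0.0
```

For a fixed number of outcomes, the uniform distribution maximizes this quantity, mirroring Boltzmann's point that a greater number of equally plausible microstates means greater uncertainty about the system.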
Building on this work, Norbert Wiener (1961), the founder of cybernetics, defined entropy as the disorganization within cybernetic information systems (goal-directed, self-regulating systems). As a system's disorder and uncertainty increase, its ability to perform useful work is hampered by reduced accuracy in specifying the current state, the desired state, and the appropriate response for transforming the former into the latter.

It is the cybernetic view of entropy that plays a prominent role in modern nonlinear dynamical systems approaches. These approaches deal with the emergent properties of complex systems, with an emphasis on tendencies toward self-organization (Nicolis & Prigogine, 1977). Self-organization describes the emergence of a patterned structure of relationships between the constituent elements of a complex system (Ashby, 1947, 1956). High-entropy states, in this context, reflect a lack of internal constraints among the system's interacting parts, such that knowing the state of one component provides minimal information about the others. Because dynamical systems continually change over time, entropy can be related to the predictability of successive states given knowledge of the current state (Shannon & Weaver, 1949).

According to the second law of thermodynamics, the amount of entropy within a closed system can only increase over time. Any mechanical process involves some irreversible energy loss (e.g., inefficiencies resulting in heat loss), and heat will not move from a colder body to a warmer body without additional energy input. Accordingly, a system moves closer to a state of thermodynamic equilibrium as it performs work. Unless more energy is added, the amount of potential work it can produce will inevitably decrease with time. Information systems also lose energy over time as a result of inefficiencies, so they too will eventually dissipate and dissolve unless additional energy is incorporated to sustain structural coherence and minimize internal disorder.
We propose that understanding the relationship between entropy and the potential of systems to perform work (i.e., to pursue and achieve goals) can illuminate the significance of uncertainty to biological systems in general and psychological systems more specifically.
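The link between entropy and the predictability of successive states can be illustrated (again, our illustration, not the authors') with the entropy rate of a hypothetical two-state Markov chain: the average uncertainty about the next state, given knowledge of the current one.

```python
import math

def h(probs):
    """Shannon entropy of a discrete distribution, in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

def entropy_rate(transition, stationary):
    """Conditional entropy of the next state given the current state:
    the stationary-weighted average entropy of the transition rows."""
    return sum(pi * h(row) for pi, row in zip(stationary, transition))

# Ordered system: each state almost always persists, so knowing the
# current state makes the next one highly predictable.
ordered = [[0.95, 0.05], [0.05, 0.95]]
# Disordered system: the next state is unrelated to the current one.
disordered = [[0.5, 0.5], [0.5, 0.5]]
uniform = [0.5, 0.5]  # stationary distribution for both (by symmetry)

print(entropy_rate(ordered, uniform))     # ~0.29 bits per step
print(entropy_rate(disordered, uniform))  # 1.0 bit per step: maximal
```

In these terms, a well-organized system is one whose internal constraints keep the entropy rate low; as constraints dissolve, successive states become unpredictable and the rate approaches its maximum.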
Entropy Management as a Fundamental Principle of Organized Systems
Application of the second law of thermodynamics to psychology produces the first major tenet of EMU, that uncertainty poses a critical adaptive challenge, resulting in the motive to reduce uncertainty. This tenet is partly predicated on research examining the emergence and maintenance of order within complex systems. In his groundbreaking book, What Is Life?, the physicist Erwin Schrödinger (1944) argued that living systems survive by reducing their internal entropy, while simultaneously (and necessarily) increasing the entropy that exists in their external environment. Although the total amount of entropy in the universe as a whole can only increase (as expressed in the second law of thermodynamics), living organisms can stem the rise of entropy found within their biological systems by consuming energy from the environment, using it to maintain the integrity and order of their own biological systems, and displacing their entropy into the outside world.

In the dynamical systems literature, this entropy-reduction framework has been extended to the view of biological organisms as dissipative systems (Prigogine & Stengers, 1997). For an organism to survive, it must effectively dissipate its entropy into the environment. Dissipative systems are open systems operating far from thermodynamic equilibrium, requiring energy intake to sustain a stable structural organization. If the environment changes to produce more entropy for an organism (thereby challenging its structural coherence), that organism must adopt new patterns of self-organization that are capable of accommodating the environmental changes. Self-organization describes the process by which novel dissipative structures emerge in response to higher entropy levels. Dynamical systems theorists therefore propose that stable information systems survive only insofar as they are able to effectively manage their internal entropy. Those that cannot effectively dissipate this entropy are destroyed, in a Darwinian fashion (Kauffman, 1993). One consequence of this process is that complex systems tend to return to a relatively small number of stable, low-entropy states (known as attractors; Grassberger & Procaccia, 1983). This is because the vast majority of states that these systems could theoretically inhabit do not provide effective entropy management and are therefore characterized by instability.

Given that the principles of entropy and self-organization can be employed to examine any complex information system, it may not be surprising that these frameworks have also been used to study psychological phenomena (Barton, 1994; Carver & Scheier, 2002; Hollis, Kloos, & Van Orden, 2009; Vallacher, Read, & Nowak, 2002). For instance, researchers have observed self-organizing dynamics during the problem-solving process (Stephen, Boncoddo, Magnuson, & Dixon, 2009; Stephen, Dixon, & Isenhower,