Psychological Inquiry

1990, Vol. 1, No. 3, 181-197

Copyright 1990 by Lawrence Erlbaum Associates, Inc.


Lay Epistemic Theory in Social-Cognitive Psychology

Arie W. Kruglanski
University of Maryland, College Park

A theory of lay epistemics is described and applied to a range of topics within social-cognitive psychology. The theory addresses the process whereby human knowledge is formed and modified, and it highlights the epistemic functions of hypothesis generation and validation. Hypothesis generation is assumed to depend on knowers' cognitive capability and their epistemic motivations. Hypothesis validation is assumed to be based on preexisting inference rules that, in the knower's mind, connect given categories of evidence with given hypotheses. The same knowledge-acquisition process is assumed to underlie numerous social-cognitive phenomena including attribution, dissonance, attitude formation, and judgmental accuracy. The lay epistemic analysis thus serves to integrate seemingly diverse social psychological topics under the same fundamental principles. The same analysis also has implications for synthesizing notions of adaptive and maladaptive thinking, and of lay and scientific inference. Besides the unifying coherence it lends to previously separate domains of study, the epistemic framework offers novel suggestions for future research on numerous social-cognitive topics.

This article proposes that attention to ways in which persons form, modify, and employ their knowledge affords an integration of numerous, previously separate, topics in social-cognitive psychology. This work commenced more than a dozen years ago when my colleagues and I realized that the various causal attribution models actually depict the same epistemic process and differ in the causal contents they address (Kruglanski, 1975, 1980; Kruglanski, Hamel, Maides, & Schwartz, 1978). We were intrigued to further realize that the same process-contents distinction that orders the area of attribution also lends new coherence to the area of cognitive-consistency models (Abelson et al., 1968; Kruglanski & Klar, 1987) and affords novel insights into other classic topics in social cognition (e.g., attitudes, perceptual accuracy, social comparison processes). Finally, the epistemic focus seems capable of synthesizing conceptions of normal and neurotic inference, and of lay and scientific inference. In subsequent pages these notions are explored in some detail, following a more complete account presented in my Lay Epistemics and Human Knowledge (Kruglanski, 1989a). Let us begin, therefore, by considering how human knowledge on any topic forms and changes.

The Process of Lay Epistemics

Knowledge is defined in terms of propositions (or bodies of propositions) in which a person has a given degree of confidence. Such definition imposes two functional requirements on a model of knowledge formation:

1. Propositional contents must be engendered somehow, requiring a phase of hypothesis generation.

2. A degree of confidence needs to be bestowed on (some of) the generated hypotheses, requiring a phase of hypothesis validation.

Hypothesis generation and validation models have been employed previously to depict epistemic activities on levels of perception (Bruner, 1951, 1973; Gregory, 1970, 1973), concept formation (Levine, 1975), problem solving (Bourne, Dominowski, & Loftus, 1979; Newell & Simon, 1972), and scientific discovery (Popper, 1972). Beyond its general affinity with those earlier formulations, the present model makes some unique assumptions about ways in which the hypothesis validation and generation functions are carried out.

Hypothesis Validation

Hypotheses are validated on the basis of relevant evidence. Relevance, in turn, is determined by preexisting inference rules that, by the knower's assumption, link together different cognitive categories. Some such assumed linkages may be logical in form as in the statement "If an interviewee emerges smiling, (then) the interview must have been a success," which renders "smiling" relevant evidence for a "success" inference. Other linkages may be probabilistic or statistical in form as in the statement "20% of College Park residents are students," by which "residence in College Park" is relevant evidence for inference of probable "student status" (for a detailed discussion of various linking structures, see Klar, in press).

Thus, both statistical and logical inferences are assumed to be mediated via the same general process in which the appropriate rules or "heuristics" are accessed and applied to the problem at hand. Occasionally, only a single rule is considered. At other times competing rules are cognized in which the same category of evidence (e.g., "smiling") is tied to possible alternative inferences (e.g., "pleased with the interview" vs. "trying to appear brave"). To the extent that such alternatives are in true competition (i.e., are mutually exclusive), they define a logical inconsistency that, if left unresolved, prevents confident inference concerning either alternative (Kelley, 1971; Kruglanski, 1980). Choice between plausible competing alternatives (i.e., inconsistency resolution) is often accomplished via further inference rules incorporating diagnostic evidence (Kruglanski & Mayseless, 1988; Trope & Bassok, 1983), capable of differentiating among the rival hypotheses (e.g., "If the interviewee continues to smile when in private, the 'pleasing interview' hypothesis is true and the 'brave appearance' hypothesis, false").
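The validation process just described, rules linking evidence to hypotheses, competition between mutually exclusive alternatives, and resolution via diagnostic evidence, can be sketched as a toy model. This is a schematic illustration only; the rule set, function names, and string encoding are invented for this example and are not part of the theory itself.

```python
# Toy model of hypothesis validation via preexisting inference rules.
# Rules link a category of evidence to a hypothesis (+1) or against it (-1);
# confident inference requires that no mutually exclusive rival remains.
# All rule contents are illustrative, echoing the article's interview example.

RULES = [
    ("smiling", "pleased with the interview", +1),
    ("smiling", "trying to appear brave", +1),
    # diagnostic rule: private smiling rules out the 'brave appearance' rival
    ("smiling in private", "pleased with the interview", +1),
    ("smiling in private", "trying to appear brave", -1),
]

# Pairs of hypotheses in true competition (mutually exclusive)
EXCLUSIVE = [frozenset({"pleased with the interview", "trying to appear brave"})]

def confident_inferences(evidence):
    supported = {h for e, h, sign in RULES if e in evidence and sign > 0}
    ruled_out = {h for e, h, sign in RULES if e in evidence and sign < 0}
    candidates = supported - ruled_out
    # an unresolved inconsistency blocks confident inference on both rivals
    return {h for h in candidates
            if not any(h in pair and (pair - {h}) & candidates
                       for pair in EXCLUSIVE)}

print(confident_inferences({"smiling"}))
# ambiguous evidence: set() -- neither rival can be confidently inferred
print(confident_inferences({"smiling", "smiling in private"}))
# diagnostic evidence resolves the competition: {'pleased with the interview'}
```

The sketch reproduces only the structural point of the passage: the same evidence category may feed competing hypotheses, and confidence becomes possible once diagnostic evidence eliminates the rival.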

Hypothesis Generation

In principle, the knower may continue generating further and further linkages in which the same category of evidence is tied to various competing hypotheses (Campbell, 1969; Weimer, 1979). Given that, at most times, we do possess definite knowledge on various topics, such generation of alternatives must have come to an end in those cases. The lay epistemic model identifies two categories of conditions that may bring this about, related respectively to notions of cognitive capability and epistemic motivation.

Cognitive capability: Availability and accessibility. Recent social-cognitive literature identifies two notions relevant to the capability to generate hypotheses on a given topic. Long-term capability relates to the availability of constructs in memory (Higgins, King, & Mavin, 1982), and short-term capability relates to their momentary accessibility (Bruner, 1973; Higgins, Bargh, & Lombardi, 1985; Higgins & King, 1981). Consider a person whose car has suddenly stalled on the highway. The possible explanations he or she may be able to generate for the mishap are assumed to depend on the set of automotive concepts stored in that individual's memory (availability) and on the subset of such notions recently activated by some event (accessibility; e.g., an article about defective carburetors in the morning paper).

Epistemic motivations: Needs for nonspecific or specific closure and the avoidance of closure. Beyond capability considerations, the knower's tendency to generate hypotheses on a topic (and search for information relevant to those hypotheses) is assumed to depend on that person's epistemic motivations, that is, on motivations toward knowledge as object (Kruglanski, 1990). Epistemic motivations, in turn, are classified in terms of two orthogonal dimensions: of closure seeking versus avoidance and of nonspecificity versus specificity. This yields a typology of four motivational orientations labeled as needs for (a) nonspecific closure, (b) avoidance of nonspecific closure, (c) specific closure, and (d) avoidance of a specific closure (see Table 1).

Table 1. A Two-Dimensional Classification of Epistemic Motivations

Type of Motivating    Disposition Toward Closure
Closure               Avoidance                           Seeking
------------------------------------------------------------------------------
Nonspecific           Need to avoid nonspecific closure   Need for nonspecific closure
Specific              Need to avoid a specific closure    Need for a specific closure

Nonspecific closure refers to definite knowledge on a given topic, irrespective of the particular content of such knowledge. Possession of definite knowledge may be considered desirable or undesirable in various circumstances. Thus, a motivational continuum is envisaged with a strong need for (nonspecific) closure at one end and strong need to avoid closure at the other. Specific closure is knowledge with some special properties (e.g., esteem-enhancing, or optimistic contents). A person who desires given knowledge (e.g., knowledge that he or she did well on an exam) is said to have a need for specific closure, whereas one who wishes to avoid given knowledge (e.g., knowledge that he or she failed) is said to have a need to avoid a specific closure.

All epistemic motivations are assumed to derive from the individual's cost-benefit analysis of the appropriate episternic end states. Those perceived costs and benefits are assumed to vary as function of the situation as well as the person. For instance, cognitive closure may be perceived as very advantageous under time pressure to reach a decision; in addition, some persons may generally value closure more than others.

As implied earlier, the epistemic process can be active or at rest. From the motivational standpoint, rest is assumed to ensue in cases of match (i.e., an absence of discrepancy) between currently experienced and desired epistemic states. In such circumstances, the epistemic process is "frozen" as it were (Lewin, 1943), hypothesis generation is arrested, and the individual becomes generally insensitive to relevant stimulus information (Ross, Lepper, & Hubbard, 1975). The system is "unfrozen" or activated by a discrepancy between experienced and desired epistemic states (e.g., a lack of a desired closure). This facilitates hypothesis generation on a topic and increases sensitivity to relevant stimulus information (see also Kruglanski & Freund, 1983).
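The match condition for "freezing" described in the preceding paragraph can be written out schematically. The state representation, field names, and example contents below are invented for illustration; the theory itself specifies no such formalism.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EpistemicState:
    has_closure: bool               # is definite knowledge held (or wanted)?
    content: Optional[str] = None   # particular content, for specific closures

def is_frozen(experienced: EpistemicState, desired: EpistemicState) -> bool:
    """The process rests ('freezes') when experienced and desired epistemic
    states match; any discrepancy reactivates hypothesis generation and
    sensitivity to new stimulus information."""
    if experienced.has_closure != desired.has_closure:
        return False
    # a need for a *specific* closure also demands matching content
    if desired.content is not None and experienced.content != desired.content:
        return False
    return True

# Need for nonspecific closure, and some closure is held: frozen (at rest).
print(is_frozen(EpistemicState(True, "I did well"), EpistemicState(True)))   # True
# Need for a specific closure other than the one held: active (unfrozen).
print(is_frozen(EpistemicState(True, "I failed"),
                EpistemicState(True, "I did well")))                          # False
```

The two calls illustrate the asymmetry in the text: any attained closure satisfies a nonspecific need, whereas a specific need is satisfied only by closure of the desired content.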

Beyond their common features, the epistemic motivations may differ in the specific costs and benefits on which they are based. For instance, some perceived benefits of cognitive closure may relate to predictability, the basis for action, or social status accorded the possessors of knowledge (i.e., "experts"). Similarly, some perceived costs of lacking closure may relate to the additional time and effort required to attain closure, or the unpleasantness of the process whereby closure must be reached. Occasionally, however, lack of closure may be perceived to offer various advantages such as freedom from a constraining commitment, neutrality in an acrimonious dispute, the maintenance of a romantic mystery (Snyder & Wicklund, 1981), and so on. Particularly diverse are benefits and costs associated with specific closures. Those may stem from the heterogeneous contents of such closures and their manifold implications (for one's esteem, physical or material welfare, the welfare of significant others, etc.).

Particular needs in the specific closure and closure avoidance categories have received ample research attention in the social-cognitive literature. In fact, the very notion of a motivational influence on cognition has been often interpreted as a directional bias toward a favored (e.g., esteem-enhancing or ego-defensive) conclusion. Much motivational work in attribution (e.g., Miller, 1976; Ross & Sicoly, 1979; Zuckerman, 1979) has such a flavor, as does much work on cognitive dissonance (Cooper & Fazio, 1984). By contrast, the nonspecific needs for closure and closure avoidance, although of historic interest to personality researchers (e.g., Frenkel-Brunswik, 1949; Rokeach, 1960), have been largely neglected by social-cognitive psychologists.

Recent research on nonspecific closure needs supported the lay epistemic model in contexts of (a) knowledge formation, (b) knowledge utilization, and (c) social interaction. Thus, in forming knowledge, subjects under high (vs. low) need for closure processed less information before reaching a judgment, exhibited higher judgmental confidence (Mayseless & Kruglanski, 1987), and tended to base judgments on information encountered early rather than late in the process (Kruglanski & Freund, 1983). Furthermore, under high (vs. low) need for closure, judgments tended to be driven more by preexisting prejudices, stereotypes, and attitudes rather than by the individuating stimulus information (Bechtold, Zanna, & Naccarato, 1985; Jamieson & Zanna, 1989; Kruglanski & Freund, 1983). Finally, subjects under high need to avoid (or postpone) closure tended more to socially compare with dissimilarly (vs. similarly) minded others (Kruglanski & Mayseless, 1988), whereas those under high need for closure tended more to compare with similarly minded others (Kruglanski & Mayseless, 1988) and to reject or devalue dissimilarly minded others (Kruglanski, 1988; Kruglanski & Webster, 1989).

Two Kinds of Epistemic Integration

The lay epistemic analysis serves to integrate social-cognitive conceptions in two distinct ways. It functionally interrelates previously separate aspects of the judgmental process and pieces them together into one interdependent whole. This unitary process may then be abstracted from seemingly separate effects in social-cognitive psychology, demonstrating their interrelation as special cases of the same general phenomenon.

The discussion thus far has stressed the first, functional integration by noting how several judgmentally relevant principles work in concert toward the production of subjective knowledge. Notions like "inference rules," "accessibility," or "epistemic needs" were all portrayed as mutually complementary and collectively indispensable to understanding how human opinions are shaped and changed. Particularly significant in this connection is the presumption of fundamental interdependence between cognitive and motivational facets of the judgmental process (see also Sorrentino & Higgins, 1986). Such a synthesis contrasts with previously drawn partitions between "cold" and "hot" cognition, of which the former was often implied to be more rational than the latter.

The remainder of this article is devoted to the second, abstractive integration in which heterogeneous lower level phenomena in social cognition are shown to constitute special cases of the same higher level constructs, identified in the functional integration. This abstractive task commences by offering a lay epistemic reinterpretation of attribution theory.

Underlying Epistemics of Social-Cognitive Psychology

Attribution Theory

Research on causal attribution addresses ways in which lay persons form their knowledge about causes of various events. In this sense, causal attribution represents a specific content domain where general knowledge-acquisition processes should be evident. If the lay epistemic framework depicts such processes it should afford a more general way of thinking about attributional concepts and findings. To start with, the lay epistemic analysis suggests that attributional notions may be readily mapped onto constructs describing knowledge acquisition in general. In what follows, such mapping is, therefore, attempted by clustering attributional notions in accordance with their epistemic significance for (a) the logic whereby various causal hypotheses are validated, (b) particular contents of those hypotheses, (c) motivational factors, and (d) cognitive capability factors in the formation of causal knowledge.

Logic of causal proof. The major inference rule, or "heuristic," in Kelley's (1967) seminal analysis of causal validation is covariation. Specifically, the attributor is assumed to use the rule "If X covaries with Y, it is the cause of Y." Such a heuristic follows directly from common assumptions about the nature of causality in which covariation between cause and effect represents a major ingredient of the concept. For instance, if the attributor assumed that, for Johnny, smiling covaried with victory in tennis, she or he could infer that the cause of Johnny's smile on a given occasion was winning a match. Such a causal inference is logically consistent with or deducible¹ from premises involving (a) the presumption of covariation between smiling and winning, (b) the "covariation heuristic" equating covariation with causation, and (c) evidence that smiling has occurred.
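The deduction from premises (a) through (c) can be illustrated with a minimal covariation check. The data, set encoding, and function names below are invented, and "covariation" is simplified here to perfect co-occurrence across observed occasions.

```python
# Sketch of the covariation heuristic: "If X covaries with Y, X is the
# cause of Y."  Each observation is the set of features present on one
# occasion; the data are fabricated for illustration only.

def covaries(observations, cause, effect):
    """Crude covariation test: the effect is present exactly when the
    candidate cause is present, across all observed occasions."""
    return all((cause in obs) == (effect in obs) for obs in observations)

def attribute(observations, candidate_causes, effect):
    """Apply the covariation rule to each candidate cause."""
    return [c for c in candidate_causes if covaries(observations, c, effect)]

# Johnny's smiling covaries with winning but not with sunny weather:
occasions = [
    {"won match", "smiling", "sunny"},
    {"won match", "smiling"},
    {"sunny"},      # lost the match, did not smile
    set(),          # lost the match, did not smile
]
print(attribute(occasions, ["won match", "sunny"], "smiling"))  # ['won match']
```

Real covariation assessment would of course be probabilistic rather than all-or-none; the sketch captures only the form of the inference rule.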

Just as consistency with one's premises may foster confidence in an attribution, inconsistency may undermine confidence. In attribution theory, this notion is represented by the concept of discounting whereby "the role of a given cause in producing a given effect is discounted if other plausible causes too are present" (Kelley, 1971, p. 8). The lay epistemic analysis refines our understanding of conditions under which discounting occurs. Specifically, alternative plausible causes should detract from confidence in a given cause only if they represent hypotheses logically inconsistent, hence truly in competition, with that cause. For instance, in attempting to account for an athlete's poor performance one may entertain the competing hypotheses that it was caused by "lack of training" versus "overtraining." The attributor's confidence in each of those hypotheses should be lowered to the extent that its alternative appeared plausible because lack of training and overtraining are logically inconsistent or mutually exclusive.

1 Throughout the present discussion, the term logical is not meant to imply that lay persons are competent logicians or that they do not err on a variety of logical tasks (Wason & Johnson-Laird, 1972). Rather, it is suggested that the validation process is generally deductive in that conclusions are inferred from evidence, implying a prior premise whereby if the designated evidence turned up the conclusion would be warranted. For further discussion, see Kruglanski (1989a, Chap. 2).


Often, however, alternative causes need not be inconsistent with a given attribution. If so, they need not lower attributional confidence. For example, the hypothesis that Frank succeeded on an exam because of hard work is not logically inconsistent with the hypothesis that he succeeded because of high ability. Both work and ability may well have been involved; hence, assuming one does not exclude assuming the other, or vice versa. Indeed, the attributional literature has by now accumulated numerous reported instances of failure to discount or insufficient discounting (e.g., Billig, 1982; Einhorn & Hogarth, 1983; Hanson & Hall, 1985; Taylor & Koivumaki, 1976). Such "failures to discount" need not be seen as anomalous but rather as reflecting situations in which the multiple causes considered (e.g., "external" and "internal" causes of aggressive behavior; Rosenfield & Stephan, 1977) are not perceived as logically inconsistent.

Consider now the relation of covariation and discounting in more general terms. These constructs seem to vary in two separate ways:

1. Covariation refers to a relation of an effect to a given cause, denoting a within-hypothesis relation, whereas discounting refers to a relation one potential cause has to another, denoting a between-hypotheses relation.

2. Covariation represents information consistent with a (causal) hypothesis, whereas discounting represents an inconsistency.

One implication of this analysis is that covariation and discounting need not be thought of as disjunctive routes to causal inference. Rather, both could be at work simultaneously. For instance, one could have evidence for covariation of the effect with each of two competing "causes," combining informational consistency within hypotheses with an inconsistency between hypotheses. Such evidence would bolster the plausibilities of the competing alternatives, and hence lead to discounting (Einhorn & Hogarth, 1983; Kruglanski, Schwartz, Maides, & Hamel, 1978). Furthermore, the confidence-undermining effect of inconsistency should not be restricted to a between-hypotheses relation (as in discounting) but should be similarly evident for a within-hypothesis relation (e.g., where inconsistent findings about covariation of an effect with a given cause were reported). As the foregoing discussion illustrates, a general epistemic perspective on the logic of subjective proof casts novel light on fundamental attributional notions regarding the validation and invalidation of causal hypotheses.
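As a schematic illustration of the claim that only mutually exclusive rivals discount, one might write the following. The linear weighting is an arbitrary assumption of this sketch, not a quantitative claim of the theory, and the function name is invented.

```python
# Schematic arithmetic for discounting: a plausible rival lowers confidence
# in a cause only when the two hypotheses are mutually exclusive.  The
# linear weighting below is arbitrary and purely illustrative.

def discounted_confidence(prior, rival_plausibility, exclusive):
    if not exclusive:
        return prior          # 'hard work' and 'high ability' can coexist
    return max(0.0, prior * (1.0 - rival_plausibility))

# 'lack of training' vs. 'overtraining' are mutually exclusive: discounting.
print(discounted_confidence(0.8, 0.5, exclusive=True))    # 0.4
# 'hard work' vs. 'high ability' are compatible: no discounting.
print(discounted_confidence(0.8, 0.5, exclusive=False))   # 0.8
```

The two calls mirror the article's contrast between the overtraining example (discounting) and the work-versus-ability example (failure to discount as a non-anomaly).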

Contents of causal categories. Kelley's (1967, p. 197) attributional criteria conjoin the covariation principle to the categories of "entity," "time," "modality," and "person." Thus, the distinctiveness criterion refers to the case where the effect covaries with the external entity (effect is present when the entity is present and absent when it is absent), whereas the consistency and consensus criteria assert that the effect does not covary with modality, time, and person.

Note that the covariation principle pertains to causal inference as such, whereas the categories of entity, person, time, and modality represent the contents of particular causal categories. It seems likely that under the appropriate conditions numerous alternative categories could figure in persons' causal hypotheses. In such a case, the covariation principle would be applied to entirely different concepts, yielding different "criteria" for a causal assignment. Recent attribution research supports this interpretation. Over the last two decades, a rich variety of causal categories have been investigated (Abramson, Seligman, & Teasdale, 1978; Anderson, 1983; Fincham, 1985; Orvis, Kelley, & Butler, 1976; Weiner, 1979, 1985a). From the present perspective, numerous further such categories are possible, pertaining to the varied causal hypotheses that persons may construct. It also follows that the criteria of distinctiveness, consensus, and consistency should be of interest only where the attributor tested causal hypotheses about entity, person, and time/modality, but not where she or he tested causal hypotheses about alternative categories (e.g., ability vs. effort categories). Support for this notion was obtained in research by Kruglanski, Hamel, Maides, and Schwartz (1978).

To summarize, attribution theory represents the general knowledge-formation process in the content domain of causal knowledge. Even more specifically, attribution theory has dealt with particular types of causal knowledge tied to particular causal categories. Across those various specifics, however, causal knowledge seems to be validated in the same basic way as all knowledge, namely on the basis of inference rules that connect particular categories of evidence to particular conclusions. In attribution theory, one such major inference rule involved the linkage of covariation with causality, although alternative causal inference rules may exist (e.g., linking causality to the temporal ordering of associated events). Accordingly, the knower's confidence in a causal attribution may rise if, given the causal inference rule she or he is using, the evidence received was consistent with the attribution. Similarly, the knower's confidence may wane on receiving evidence inconsistent with an attribution, such as evidence supporting a competing attribution.

Motivational bases of attribution. Beyond the use of inference rules, causal attribution represents the general knowledge-formation process also in the ways it is affected by the several epistemic motivations. Most motivational research in attribution concerns the effects of "needs for specific closures," particularly, closures desired for their positive (e.g., flattering or ego-enhancing) contents. Heider (1958, pp. 120-121) noted the existence of such biasing motivations when he commented on "subjective forces of needs and wishes" possibly affecting attributions. A similar idea underlies Jones and Davis's (1965) concept of hedonic relevance and Kelley's (1967) discussion of ego-protective biases. Empirical research on self-serving biases in attribution (e.g., Miller, 1976; Ross & Sicoly, 1979) supported their motivational bases despite earlier reservations (Ajzen & Fishbein, 1975; Miller & Ross, 1975). Recent reviews of attribution research (Harvey & Weary, 1984; Kelley & Michela, 1980) have also concluded in favor of motivational interpretations (but see Tetlock & Levi, 1982) lending credence to the idea that attributions often are affected by needs for specific closures.

It also appears that causal assignments are often instigated by a "need for nonspecific closure." Such need seems implicit, for example, in Kelley's (1967) reference to the desire for "cognitive mastery of the causal structure of the environment," considered fundamental to attribution. Research on outcome-dependent situations (Berscheid, Graziano, Monson, & Dermer, 1976; Erber & Fiske, 1984; Fiske & Neuberg, 1988) has further indicated that such nonspecific needs to explain causally another's behavior are particularly likely where the individual's own outcomes may depend on the ability to predict and control the activities of the other. In the same vein, Weiner (1985a) reviewed evidence that the (nonspecific) need for causal closure may often arise, hence that causal activity may ensue, where closure is disrupted by an unexpected (inconsistent) event, or by a failure.

Finally, attributional activity may be occasionally affected by the need to avoid (causal) closure. Thus, Snyder and Wicklund (1981) reported research in which subjects were motivated to increase the ambiguity of causal attributes. In such circumstances, individuals generated competing hypotheses to a current attribution and/or actively sought out evidence inconsistent with such attribution (Snyder & Wicklund, 1981, p. 211). Similarly, with little causal closure to start with, individuals motivated to avoid closure refrained from attributional activity, hence maintaining ambiguity by "leaving the field" (Snyder & Wicklund, 1981, pp. 205-207).

Research by Tetlock (1985a, 1985b) suggests that persons held accountable for their judgments and decisions may be motivated to avoid premature closure. This may sensitize them to information inconsistent with initial causal hunches and dispose them to consider alternative hypotheses. A need to avoid premature closure may also arise in situations where the possibility of failure appeared either costly and/or likely. For instance, Pittman and his colleagues (see Pittman & D'Agostino, 1985, for a review) found that, following failure on a concept formation task (presumably making the costs of failure salient), subjects displayed greater vigilance on a subsequent attributional activity and paid more attention to competing possibilities, hence delaying the formation of causal closure.

Thus, causal attribution seems to be affected by the same motivational forces that drive the epistemic process in general. Such motivational effects need not be thought of as representing a separate mode of causal inference, disjointed from the informational mode. More likely, the motivational and informational aspects of attribution (and of knowledge acquisition in general) are functionally interdependent and mutually indispensable: The motivational forces may determine the extent of information processing (indexed, e.g., by its duration or thoroughness), as well as its directionality or selectivity (e.g., reflected in biases toward or against specific causal closures). Within those motivational constraints, however, causal inference is assumed to be informationally driven in accordance with the logic of proof and inference discussed earlier.

Cognitive capability factors: Salience and accessibility effects. Early attributional models reflected a somewhat noncontingent image of lay knowers generally interested in the same basic causal categories and in the same informational criteria. It seems likely, however, that preoccupation with specific causal contents depends on the attributor's momentary capability to cognize various constructs. Recent research indeed attests to the significance for causal attribution of such determinants of capability as construct accessibility and stimulus salience. For instance, Ferguson and Wells (1980) found that attributors' responsiveness to consensus, consistency, and distinctiveness information was affected by priming manipulations designed to vary information accessibility. Anderson (1983) demonstrated that inference rules are more likely to be utilized when depicted in a concrete (vs. abstract) fashion that renders them more accessible from memory. Finally, several studies found that a stimulus is more likely to draw causal attributions the greater its perceptual salience (McArthur & Ginsburg, 1981; Pryor & Kriss, 1977; Smith & Miller, 1979; Taylor & Fiske, 1975; Taylor & Thompson, 1982).

To conclude, the lay epistemic analysis synthesizes diverse attributional notions in terms of the epistemic functions they depict. For instance, attributional criteria, covariation, and discounting all relate to the process of (causal) hypothesis validation. Notions such as entity, time, ability, and luck relate to the causal contents being considered. Determinants of momentary capability such as accessibility and salience contribute to the selection of such contents, and defensive attributions and outcome-dependency effects relate to the impact of various epistemic motivations on the formation of causal knowledge.

The epistemic reinterpretation of attributional activity has several implications for further research. To take just one example, in past research, effects of needs for specific closure (e.g., self-serving ones) were demonstrated primarily via attributional products, that is, by contents of attributions ultimately made. By contrast, effects of needs for nonspecific closure were demonstrated via the (spontaneous) initiation of attributional activity and/or the extent of such activity. According to the present analysis, however, extent of attributional activity should also be impacted by specific closure needs; specifically, it should vary inversely with the desirability of an initial attribution.

Furthermore, needs for nonspecific closure should interact with accessibility to determine the products of attribution. To the extent that a given causal category was readily accessible (say, the person category) and the individual felt a high need for nonspecific closure, that category might likely be assigned as cause of the event. Similarly, the likelihood of causally attributing an effect to a less readily accessible category should be increased under a high need to postpone closure (for a more complete discussion, see Kruglanski, 1989a, Chap. 4).

Cognitive Consistency Models

Logical inconsistency between cognitions, encountered already in connection with the discounting principle, is central to a major paradigm in social-cognitive psychology, featuring the cognitive consistency models. Whereas, in the case of discounting, the main consequence of inconsistency has been the undermining of confidence (in a hypothesis confronted by an alternative), in cognitive consistency theories a major mechanism of interest is the negative affect (tension, unpleasantness, arousal) engendered by the inconsistency. According to the lay epistemic analysis, inconsistency-based undermining of confidence could give rise to negative affect if the individual desired confidence to begin with. Thus, the purely epistemic effect of inconsistency, a lowering of subjective confidence, is not in and of itself aversive. Rather, negative affect arises from a motive frustrated by the inconsistency. For instance, a person with a high need for nonspecific closure on a topic may feel upset when closure is undermined by an inconsistency. Similarly, a person with a high need for a specific closure may feel badly when confidence in that closure is lowered by inconsistent information. On the other hand, a person with a high need to avoid closure might welcome an inconsistency allowing him or her to suspend judgment on a topic. Similarly, a person motivated to avoid a specific closure (e.g., that the boss disparaged his or her performance) might be pleased to encounter information inconsistent with that closure.

How does this analysis compare to traditional accounts in the cognitive consistency literature? To begin with, cognitive consistency theorists (e.g., Abelson, 1968; Festinger, 1957) stressed that the inconsistency they were addressing was psychological rather than logical. On close examination, however, it turns out that every case of "cognitive inconsistency" is actually both. Consider Festinger's (1957) example of dissonance that exists when a loyal Democrat prefers a Republican candidate for some office. Presumably, the knower had assumed here that "If a person is a Democrat, she or he will prefer a Democratic candidate" (major premise of the syllogism). Knowing, furthermore, that the person is a Democrat (minor premise) implies a Democratic preference (the conclusion). Yet it is known for a fact that the person does not prefer a Democratic candidate (indeed, she or he prefers the Republican candidate), producing a simple logical contradiction of A (as implied) with not-A (as factually known).
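The claim that such inconsistency is logical as well as psychological can be made fully explicit. As an illustrative formalization (not part of the original article; the propositional labels are invented), the Democrat example reduces to deriving a contradiction from the major premise, the minor premise, and the contrary fact, here expressed in Lean notation:

```lean
-- D : "the person is a Democrat"
-- P : "the person prefers the Democratic candidate"
-- h1 : major premise; h2 : minor premise; h3 : the factually known contrary preference
example (D P : Prop) (h1 : D → P) (h2 : D) (h3 : ¬P) : False :=
  h3 (h1 h2)  -- h1 h2 yields P, which h3 refutes: A and not-A
```

The psychological questions then concern only why the knower came to subscribe to h1, h2, and h3 in the first place.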

The psychological aspects of inconsistency relate to the factors that led the individual to subscribe to his or her premises in the first place. Not everyone may be equally convinced that Democratic party membership automatically implies support for the Democratic candidate, nor that the Democrat in question indeed prefers the Republican candidate. Such notions would have been acquired through a psychological process of belief formation whose specific (informational and/or motivational) conditions may well vary for different individuals.

Similar considerations apply to a more recently investigated prototype of dissonant cognitions where the actor assumes personal responsibility for an irrevocable decision with highly negative consequences (Cooper & Fazio, 1984). The key contradiction here seems to exist between the cognition "I made a highly inappropriate and harmful decision" and the hidden assumption "I do not, of my own accord, make inappropriate and harmful decisions." Awareness of personal responsibility in this case disconfirms, that is, logically contradicts, one's expectancy (Aronson, 1968) concerning one's own judgmental competence. This undermines one's confidence in a belief of considerable motivational significance, that is, a highly desired specific closure. The consequence may often be arousal and experienced displeasure, as well as efforts to undo the disconfirmation in various accessible ways (Cooper & Fazio, 1984), for example, through reaffirmation of one's general self-worth (Steele, 1988). If it is true that the "desired closure" often undermined by dissonant cognitions concerns one's decision-making competence, reaffirming, more specifically, the high quality of one's decisions should serve to reduce dissonance even more directly than reaffirming one's self-worth in general.

To the extent that cognitive inconsistency represents logical contradiction, it can only be resolved by negating or denying one of the inconsistent cognitions. A and not-A cannot both be true; a way must therefore be found to deny one of the two. In this sense, denial underlies all modes of inconsistency resolution rather than representing, as is often thought, merely one among several modes.

Consider, for instance, inconsistency resolution via "differentiation." In an example given by Abelson (1968, p. 120), a person is exposed to an inconsistency-inducing statement that the Pope endorses LSD. To deal with the inconsistency, she or he might posit that LSD contains some good aspects, "say, medical use in certain appropriate cases and that the Pope endorsed these good aspects." But note that the original inconsistency must have been between the assumption that the Pope condemned LSD universally and the new cognition that he endorsed LSD at least in some cases. The differentiation resolution thus amounts to denying the original belief in the Pope's global condemnation of the drug.

It is of interest to specify whether, in a particular case, inconsistency will be resolved through straight denial or through more elaborate schemes such as "differentiation" or "transcendence" (in which both seemingly inconsistent cognitions are derived from the same overarching principle; see Abelson, 1968). According to the present analysis, the specific mode of resolution may depend on the knower's epistemic motivations and the accessibility of constructs relevant to the resolution. For instance, under high need for nonspecific closure, a person might opt for straight denial as it provides a most direct need-satisfaction and dispenses with a laborious consideration of complex solutions. Furthermore, persons for whom notions pertaining to differentiation (e.g., concerning the potential medical benefits of LSD) were made accessible may be more likely to resolve inconsistency via differentiation than persons for whom such notions were less accessible. Those possibilities could be profitably pursued in future research.

Finally, note that the issue of inconsistency resolution is relevant to a central topic in contemporary social cognition, notably the conditions and manner of changing one's stereotypes when confronted with exceptions (e.g., Pavelchak, 1989). Thus, the lay epistemic analysis of inconsistency resolution applies also to the problem of stereotype change, and could usefully guide future investigations in this domain of study.

Let us take stock and consider how cognitive consistency theories relate to attribution theories. Both types of theory address the same general process whereby persons form or change their knowledge, albeit doing so from different perspectives. Attribution theory stresses the phenomenon of confidence, heightened when an information pattern is consistent with an attribution, and lowered in case of an inconsistency. By contrast, cognitive consistency models largely bypass the issue of confidence and focus instead on the affective consequences of an inconsistency. However, to the extent that an individual desired causal knowledge on a topic (under high need for nonspecific closure) or desired particular causal knowledge (under high need for specific closure) he or she would presumably experience positive affect given an informational pattern consistent with such knowledge, and negative affect given a pattern inconsistent with it. Thus attributional models may profitably tap the affective consequences of informational consistency and inconsistency (see Weiner, 1985a). Moreover, to the extent that the affective consequences of inconsistency are mediated by waning confidence, cognitive consistency models could profitably incorporate confidence measures to check on the postulated causal path from inconsistency to confidence-decline to affect, given the appropriate motivations.

If the lay epistemic analysis is correct, needs for specific closure other than esteem-related ones (Cooper & Fazio, 1984; Greenwald & Ronis, 1978) should result in the arousal, upset, and cognitive change typically associated with dissonance. This ought to occur when knowledge compatible with those needs was undermined by an inconsistency. For instance, one may evidence upset, arousal, and cognitive bias upon being informed that one's flight to an important business conference has been canceled or that bad weather threatens to spoil one's skiing trip, even though no personal responsibility for those events was assumed. Those notions could be explored in future cognitive consistency research.

Attitudes as Knowledge Structures

Attitudes may be thought of as types of subjective knowledge or judgments whose content is affective or evaluative. Thus, in making an attitudinal statement that one likes or values a given object, the person is actually expressing a judgment of an affective or evaluative type ("I like X," "X is good"). This implies that attitudes should be formed or modified according to the principles that govern the epistemic process at large. The following sections attempt to demonstrate that this is so.

The logic of attitude inference. First note that attitudes are validated in a logical fashion based on inference rules, or linkage structures in which the attitude is linked with specific categories of evidence. Hovland, Janis, and Kelley (1953) stated this clearly. In their words, "a major basis for acceptance of a given opinion is provided by arguments or reasons which, according to the individual's own thinking habits, constitute 'rational' or 'logical' support for the conclusions" (p. 11).

The same logical process is assumed to underlie contextual bases of attitudes and those rooted in message contents. Thus, one may form an attitude on the basis of a "heuristic" linking communicator expertise (or trustworthiness) to presumed validity of his or her pronouncements. In this case, a communicator's expressed opinions represent "evidence" for a given attitude. In like fashion, an attitude may be derived from substantive evidence contained in the communicator's arguments. More generally, my analysis suggests that attitudes may be deduced from disparate evidential bases. Consistent with this notion is Zanna and Rempel's (1988) proposal that attitudes can be based on three classes of information: (a) utilitarian beliefs concerning the value of the attitude object, (b) affect experienced in the presence of the object, and (c) behavior enacted toward the object. In fact, according to my proposal, attitudes may be deduced from any type of evidence. In addition to the bases identified by Zanna and Rempel (1988), some attitudes may be deduced from perceived characteristics of the source (such as credibility or trustworthiness), from nonutilitarian message arguments (e.g., ethical appeals), or from reactions of referent others to the communicator's appeals.

Classical conditioning of attitudes (Staats, 1975; Zanna, Kiesler, & Pilkonis, 1970) may be understood as an "erroneous" deduction of attitudes from affect evoked by the unconditioned stimulus. For instance, Zanna et al. (1970) paired an initially neutral word, light, with the onset or offset of electric shock, respectively increasing negative or positive attitudes to the word. According to the present interpretation, aversion generated by the onset of shock, for example, could be used as "evidence" for deducing one's attitude to light, as if the subject was thinking "I am annoyed whenever light is flashed; this word must really be annoying." A similar attributional interpretation of classical conditioning phenomena was offered by Testa (1974). Finally, the positive effect on attitude change of rewards for attitude-consistent statements (Scott, 1957) could be interpreted as a case of deductive reasoning: Reward dispensed by the experimenter could be taken to signify his or her agreement with the expressed position, in turn interpreted as supportive evidence for the position from which one's own appropriate attitude could be inferred.

Thus, the notion that attitudes are deduced "logically" from subjective inference rules integrates disparate concepts and findings on attitude formation and change including message and source factors in persuasion, effects of rewards and classical conditioning, as well as behavioral, affective, and cognitive bases for attitudes. Furthermore, this analysis has implications for new attitude-change research, for instance, on the effects of personal experience with an attitude object (Fazio & Zanna, 1981; Regan & Fazio, 1977). Specifically, personal experience might not prove a very efficient means of acquiring an attitude if one doubted one's ability to validly assess the evidence (i.e., applied to oneself a "negative expertise" heuristic). In such circumstances, one might more readily accept an attitude expressed by a respected external authority. For instance, a person who doubted his or her taste in clothes might feel uncomfortable about shopping alone for a suit and might feel the need to bring along a friend with an understanding of fashion. Consistent with those ideas is research by S. Ellis (1984) in which subjects who rated highly their authority in a given domain (mathematics or interpersonal skills) benefited from experiential learning in this domain more than subjects with low self-ascribed authority; the two groups of subjects did not differ, or differed in the opposite direction, when learning was based on a frontal lecture delivered by an external authority.

Cognitive capability factors in attitude change. Numerous attitude-change effects are interpretable in terms of the salience or accessibility of evidence for or against an attitude. For instance, the concept of "biased scanning" (Janis & Gilmore, 1965), invoked to explain cases where extrinsic incentives exerted positive effects on attitude change, involves the selective accessing of pro-attitudinal evidence. Similarly, the "sleeper effect" (Hovland, Lumsdaine, & Sheffield, 1949), in which the persuasiveness-reducing impact of negative sources is weakened by the passage of time, may reflect a decline in accessibility of negative evidence for the attitudinal position. Increases in persuasibility when a subject is distracted from listening to a persuasive communication (Allyn & Festinger, 1961; Festinger & Maccoby, 1964) are often interpreted in terms of subjects' reduced ability to counterargue the position-that is, their momentary inattention to refuting evidence. Attentional effects could be at least partially responsible for the finding that information that people generate by themselves (hence, inescapably attend to) is a more important determinant of persuasion than information provided by others (for a review, see Cialdini, Petty, & Cacioppo, 1981).

Motivational effects in attitude change. If attitudes constitute knowledge structures, their formation and change should be appropriately influenced by the several epistemic motivations identified earlier. For instance, Katz's (1960) "knowledge function" of attitudes closely resembles the need for (nonspecific) closure. As Katz put it: "Individuals ... seek knowledge to give meaning to what would otherwise be an unorganized, chaotic universe. People need standards or frames of reference for understanding their world, and attitudes help to supply such standards" (pp. 175-176).

More recently, findings of Tesser and colleagues on the "mere thought" phenomenon (Sadler & Tesser, 1973; Tesser, 1976, 1978; Tesser & Leone, 1977) could reflect need-for-closure effects. A major conclusion drawn from this research is that an explicit instruction to think about an issue tends to polarize subjects' attitudes on the topic. It is possible, however, that the instruction to "think" is often interpreted by subjects as a demand to form an opinion, inducing a need for closure that might polarize attitudes to make them more definitive or less ambiguous. If this analysis is correct, alternative modes of inducing the need for closure should exert effects similar to those induced by thought instructions. For example, making an action or a decision contingent on an attitude should increase the degree of polarization, as should previous exposure to an aversive confusion (inducing "closure deprivation"). Furthermore, in some semantic contexts, the instruction to think may be interpreted in ways other than as a request to form an opinion, for example, as an injunction to be cautious and accurate. In such conditions, attitudes may become less rather than more extreme in "thought" conditions. Those possibilities could be profitably pursued in future research.

Numerous motivational constructs in the attitudes domain are classifiable as needs for specific closure or for the avoidance of specific closure. Thus, Katz's (1960) notion of the "ego-defensive" function suggests that people often adopt attitudes that avoid conclusions damaging to the ego. Sherif and Cantril's (1945) assumption that ego-involved attitudes resist change can be assimilated to our more general proposal that a conclusion congruent with a need for a specific closure is more likely to survive despite counterevidence. Specific-closure effects could underlie "anticipatory shifts" in attitude change-that is, shifts resulting from the mere expectation of persuasive attacks. Attitude researchers have indeed noted that such shifts may represent attempts to move toward the "defensible and admirable moderate positions on the attitude scale" (Cialdini et al., 1981, p. 393)-in other words, attempts to adopt more socially desirable attitudinal closures.

Finally, several attitude-change effects are made intelligible in terms of the motivational concept of "need to avoid closure." In research by Petty, Cacioppo, and Goldman (1981), when an issue was highly consequential for subjects, attitude change was determined mostly by argument quality, whereas, when the issue was inconsequential, perceived source expertise was the primary determinant. Expressed in terms of lay epistemic theory, the high-consequence condition might have induced in subjects a fear of invalidity, hence a desire to (temporarily) avoid closure. This would have led to extensive processing of available evidence, including arguments contained in the message. By contrast, the high-consequence condition might, in different circumstances, induce in subjects a need for closure stemming from the experimental demand to form some opinion. This might motivate subjects to use the shortest possible route to an opinion, in this case represented by the expertise heuristic.


If this analysis is correct, it is not essential that the consequences of an attitudinal issue be personally involving to subjects, only that the perceived costs to oneself or others of an invalid position be high. For instance, in a study by Freund, Kruglanski, and Schpitzajzen (1985), fear of invalidity was manipulated via the perceived costs, to a target person, of subjects' judgmental mistakes; this significantly mitigated subjects' tendency to base their judgments predominantly on early information. Furthermore, subjects' reliance on the expertise heuristic should be enhanced through various ways of inducing the need for closure (e.g., via time pressure, or prior fatigue that makes information processing seem costly and aversive). Those possibilities could also be investigated.

The attitude-behavior relation. When might an attitude dictate behavior? The lay epistemic analysis suggests three conditions:

1. The attitude should be accessed by the individual.

2. The accessed attitude should have clear implications for action.

3. The individual should be motivated to use the attitude as a guide for action, rather than seeking alternative possible guides.

Relevant to the first condition, Fazio and his colleagues (for a summary, see Fazio, 1989) recently reported evidence that attitude accessibility moderates the power of attitudes to guide behavior. The second condition implies, however, that even if an attitude is momentarily accessed from memory, its particular behavioral implications might not be. Thus, one might access one's positive attitude to physical fitness without at the same time recalling that it implies reducing alcohol intake or lowering cholesterol in the diet. Such implications might fail to be recalled because they are not congruent with one's momentary wishes and desires (e.g., "to have a good time at a party"). The action implications of an attitude represent a topic on which systematic research is lacking and badly needed.

Finally, relevant to the third condition is work by Zanna and colleagues (e.g., Bechtold et al., 1985; Jamieson & Zanna, 1989) suggesting that the attitude-behavior relation is strengthened under time pressure (i.e., under conditions assumed to magnify the need for nonspecific closure). Bolstering this interpretation, individuals with a stable, high predilection toward closure exhibited stronger attitude-behavior relations than those with a low disposition toward closure. This agrees with the lay epistemic analysis whereby the need for closure should augment the tendency to use accessible attitudes as guides to action and to refrain from looking for alternative guides (e.g., those embedded in situational norms).

Two routes to persuasion or only one? Petty and Cacioppo (1981) distinguished between two routes to attitude change-the central and the peripheral routes. The central route makes extensive use of information about the attitude object or issue. The peripheral route is guided by "rewards and punishments with which the message is associated . . . or the simple inferences about why a speaker advocated a certain position" (Petty & Cacioppo, 1981, p. 255). The use of the central route is more likely under high issue involvement, that is, when the issue is of high consequence to the person. Similarly, the peripheral route is more likely to be used under low involvement. Attitude change via the central route is thought to be more enduring than that via the peripheral route. In a study by Chaiken (1980) relevant to those notions, subjects in a high-consequence condition tended to be more persuaded by message arguments and less by source likability than were subjects in a low-consequence condition, and high-consequence subjects exhibited less decay of attitude change 10 days later.

From the present perspective, the central-peripheral distinction ties together several factors that are essentially unrelated, such as extent of information processing and content of the information processed (e.g., about issue vs. source). It is easy to envisage situations in which the person is motivated (e.g., by a fear of invalidity) to extensively process information about source characteristics, especially when his or her own perceived ability to make sense of "message information" is limited. Furthermore, various punishments and rewards might often lead to the processing of endogenous message arguments. A person insulted by a critic might experience a high need for a specific closure, contrary to the critic's opinion. Such a person might be motivated to think up numerous message-relevant arguments to bolster his or her position.

The foregoing discussion implies that central (i.e., message-based) and peripheral (i.e., context-based) processing of persuasive information are similarly affected by the epistemic motivations. A related point was recently made by Chaiken, Liberman, and Eagly (1989) regarding the similarity of motivational influences on "heuristic" and "systematic" processing. Note that Chaiken et al. suggested that heuristic processing is predominantly rule-based (or "top-down"), whereas systematic processing is predominantly data-based (or "bottom-up"). In contrast, the lay epistemic approach suggests that information processing in general involves an interplay between inference rules and relevant data, even if some rules pertain to the message contents and other rules to the message source. In this sense, then, heuristic processing and systematic processing are assumed to be governed by the same principles.

In short, the lay epistemic analysis features a unified view of attitude change, stressing the fundamental commonalities between the central and peripheral routes. Furthermore, the formation, utilization, and change of persons' attitudes are shown to be governed by the same process affecting causal attribution and cognitive consistency effects. As noted earlier, this is the general epistemic process whereby all knowledge may form or change.

Accuracy of Social Judgment

Social-cognitive psychologists have long maintained an interest in the accuracy of persons' perceptions and inferences. Recent work on judgmental biases and errors (Kahneman, Slovic, & Tversky, 1982; Nisbett & Ross, 1980) made salient the fallibility of human judgment. In reaction, several authors argued for the essential adaptiveness of everyday perception (Funder, 1987; McArthur & Baron, 1983; Swann, 1984). Yet several fundamental questions implicit in the accuracy debate remain unanswered: for example, whether persons are generally accurate or inaccurate, under what conditions they are or are not accurate, and what process mediates accuracy. The lay epistemic framework provides a general perspective on such issues.

A most prevalent definition of accuracy has been that of a correspondence between judgment and a criterion (Cronbach, 1955; Gage & Cronbach, 1955; Hastie & Rasinsky, 1988; Kenny & Albright, 1988; Kruglanski, 1989b). The criterion, however, is often difficult to establish and often boils down to someone else's judgments, namely, those of some "criterion setter." Such criterion setters are typically well informed and in possession of compelling evidence for their opinion. For instance, in most social psychological research on bias, the criterion setters are the experimenters, who have more evidence and better evidence than the subjects. Consequently, "in most research on social judgment even the subjects agree, when the basis for the researchers' judgment is explained, that the researcher has more reasons and better reasons for his or her judgment" (Hastie & Rasinsky, 1988, p. 22).

However compelling or warranted, the accuracy criteria are, nonetheless, human judgments. As such, they are in principle modifiable under the appropriate motivational or informational conditions. Furthermore, it seems rather implausible to assume that every shift in criterion will be immediately tracked by a corresponding shift in judgment. The individual making the judgment may not be privy to information that occasioned the shift, or may be unmotivated to attend to such information even if it was readily available. The assumption of a shift potential in the accuracy criterion has several implications. First, it renders it unlikely that any particular schema or model will universally yield accurate inferences. For instance, the finding that people are insufficiently regressive in their predictions (Kahneman & Tversky, 1973) assumes that the regression model applies to a given situation. As Einhorn and Hogarth (1981) noted, however, this may not always be the case. In my terms, a "standard setter" might come by information suggesting that, in a given case, variation in outcomes represents a systematic effect rather than random fluctuation around a stable parameter. Based on such information, he or she may set a standard of accuracy in reference to which regressive predictions would be, in fact, erroneous.
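The regression point can be illustrated with a small simulation (a hypothetical sketch, not from the original article; all quantities are invented). When outcomes mix a stable parameter with random fluctuation, predicting that an extreme first observation will simply repeat is systematically erroneous; were the variation entirely systematic, the regressive prediction would be the error instead:

```python
import random

random.seed(0)
# Each person has a stable "true score"; each observation adds random fluctuation.
true_scores = [random.gauss(100, 10) for _ in range(10000)]
obs1 = [t + random.gauss(0, 10) for t in true_scores]
obs2 = [t + random.gauss(0, 10) for t in true_scores]

# Select people whose first observation was extreme.
top = [i for i, x in enumerate(obs1) if x > 115]
mean_obs1 = sum(obs1[i] for i in top) / len(top)
mean_obs2 = sum(obs2[i] for i in top) / len(top)
# mean_obs2 falls between the population mean and mean_obs1 ("regression"),
# so the non-regressive prediction obs2 == obs1 is systematically in error.
# With the noise terms removed (purely systematic variation), obs2 would
# equal obs1 exactly, and the regressive prediction would be the erroneous one.
```

Which prediction counts as "accurate" thus depends on whether the standard setter takes the variation to be random or systematic.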

Second, it seems implausible that any particular set of conditions would generally affect the likelihood of accurate judgments. Consider the suggestion that persons' judgments are more likely to be accurate in natural versus artificial settings (Funder, 1987; McArthur & Baron, 1983; Swann, 1984). But note that, irrespective of its being "natural" or not, any judgment is influenced by (informational and/or motivational) factors that may operate differently on the judging subject and the standard setter. Even in natural settings people disagree a great deal; granting some of those people standard-setter status, their dissenters would have to be deemed in error. Nor does it seem that forces toward the forging of a "shared reality" (Swann, 1984) can assure social agreement and, hence, the accuracy of social perception in natural settings. That is because forces promoting disagreement seem quite as natural and potent as those promoting agreement (e.g., the forces of social differentiation or intergroup conflict; see Tajfel, 1981; Tajfel & Turner, 1979). Similar arguments counterindicate the proposal that social judgments about specific or "circumscribed" situations are more likely to be correct than those about general or "global" situations (Swann, 1984). Both circumscribed and global judgments may or may not match a criterion, and it does not seem possible to estimate meaningfully the frequency with which each outcome occurs.


Finally, it is doubtful whether a general process may be uncovered whereby accurate judgments are reached. Thus, it is unlikely that the accuracy of judgments depends in any simple way on the amount of relevant information the knower has considered. As several authors have noted (e.g., Campbell, 1969; Weimer, 1979), any amount of information is compatible with multiple alternate hypotheses. Thus, one may continue to hold on to an inaccurate judgment despite considerable information that, although consistent with the "correct" alternative endorsed by the standard setter, is also consistent with the "incorrect" hypothesis. In other words, considerable information could be nondiagnostic (Trope & Bassok, 1983), hence fail to differentiate between "correct" and "incorrect" hypotheses. Nor may the motivation to be accurate necessarily contribute to accuracy (McArthur & Baron, 1983). Occasionally, one's initial intuition could be "correct." The motivation for accuracy might lead one to abandon such an intuition on the basis of further, possibly invalid, information (Tetlock & Boettger, in press).
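The point about nondiagnostic information can be expressed in the odds form of Bayes' rule (a minimal sketch, not from the original article; the probabilities are invented): evidence that is equally likely under the "correct" and the "incorrect" hypothesis leaves the odds between them untouched, however much of it accumulates:

```python
# Odds form of Bayes' rule: posterior odds = prior odds * likelihood ratio.
prior_odds = 1.0       # "correct" vs. "incorrect" hypothesis, initially even
p_e_correct = 0.8      # the evidence is quite probable under the "correct" hypothesis...
p_e_incorrect = 0.8    # ...but exactly as probable under the "incorrect" one
likelihood_ratio = p_e_correct / p_e_incorrect   # = 1.0: the evidence is nondiagnostic
posterior_odds = prior_odds * likelihood_ratio   # unchanged, however often repeated
```

Only a likelihood ratio different from 1 would move the knower between hypotheses, which is the sense in which "considerable information" can still fail to differentiate the "correct" from the "incorrect" hypothesis.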

The foregoing discussion implies the futility of attempts to draw general conclusions about human accuracy, identify general classes of conditions for accurate judgments, or discover a general process or method whereby accuracy may be attained. Does that mean that all accuracy research is worthless and misguided? The answer is an emphatic No.

First, it is appropriate to ask how accuracy may be improved relative to a given, currently accepted, criterion. For instance, assuming the appropriateness of statistical reasoning in many situations, it is meaningful to ask how such reasoning might be improved. Thus, recent work by Nisbett and his colleagues (Fong, Krantz, & Nisbett, 1986; Jepson, Krantz, & Nisbett, 1983; Nisbett, Fong, Cheng, & Lehman, 1987; Nisbett, Krantz, Jepson, & Kunda, 1983) attests that the teaching of statistical rules can increase the likelihood of statistical reasoning, hence, of statistically accurate judgments. Whereas the teaching of statistical rules may make them available in the individual's long-term memory, their use in a given situation may also depend on rule accessibility (Higgins, Bargh, & Lombardi, 1985; Higgins & King, 1981) and on a motivationally determined tendency to search for alternative rules (Kruglanski, 1989b). Those issues could be profitably studied in the future.

Beyond the focus on a specific criterion, research could identify more general classes of conditions where accuracy could be improved. For example, a general category of learning situations characterized by negative transfer may involve the knower's initial tendency to produce inadequate hypotheses; in such situations, a useful set of "thinking skills" (Ennis, 1987) may include the motivation to avoid (or postpone) closure, increasing the attention to "correct" notions (e.g., offered via instruction). By contrast, in situations of positive transfer, a useful thinking skill might involve the motivation for closure, increasing the likelihood of "freezing" on one's ("correct") hunches. In short, our conception of the epistemic process suggests accuracy-relevant research for cases where specific standards are assumed, as well as for broader classes of situations where given judgmental tendencies toward a criterion (e.g., positive or negative transfer) may be postulated.

Finally, the lay epistemic approach suggests ways of studying perceived accuracy-that is, cases in which the comparison of judgment with criterion is carried out by the subjects themselves. For instance, the target judgment may be formed in a different psychological context from the criterion, resulting in discrepancy or perceived error (Higgins & Stangor, 1988). To take an example, a target judgment might involve ascribing a political opinion to oneself as a young student in the 1960s. Such a topic might bring forth an association of a "militant student body," activating constructs that may lead to a self-ascription of a strong antiestablishment attitude. This might not correspond to one's current views of the 1960s establishment, conceivably shaped by one's current self-perception (e.g., as a successful Wall Street broker), contributing to the assessment of one's prior opinions as incorrect.

Furthermore, motivations to form judgments of specific contents, or needs for specific closure (Kruglanski, 1989b), may affect one's target as well as criterion judgments. Illustrative of such a process are findings by Vallone, Ross, and Lepper (1985) that pro-Arab viewers judged major networks' coverage of the Beirut Massacre of 1982 very differently from pro-Israeli viewers, each group concluding that the coverage was substantially biased against their own side. Vallone et al. suggested that divergent perceptions of the media position (target judgment) could be one factor responsible for the perception of bias. Another likely factor is the perception of what actually took place (the criterion). Both target and criterion judgments in this case could be substantially colored by subjects' specific sympathies and identifications within the Israeli-Arab conflict, that is, their needs for specific closure in the situation.

The Epistemic Approach to Cognitive Therapy

Most schools of cognitive therapy accept that the end of treatment is the uprooting of clients' dysfunctional beliefs. Dysfunctionality, in turn, is often equated with cognitive distortion of reality. A. Ellis (1977), for example, wrote: "Virtually all serious emotional problems with which humans are beset directly stem from their magical, superstitious, empirically invalidatable thinking" (p. 172). Similarly, Beck (1976) asserted that "psychological problems ... result from making incorrect inferences, and not distinguishing between imagination and reality" (p. 19).

Contrasting with those and other similar views (e.g., Raimy, 1975), my analysis suggests that truth status or veridicality of an idea is largely unrelated to the emotional welfare it promotes. This follows directly from the assumption that the criterion for accuracy (veridicality) in social judgment is itself tentative and modifiable. Thus, the same happiness-inducing (or pain-inducing) idea could be proclaimed as veridical at one time and as nonveridical the next time, counterindicating a general relation between emotional well-being (i.e., happiness or pain) and veridicality.

Indeed, life seems replete with examples where uncompromising "truth" has highly traumatic consequences for its possessor: The betrayed lover, the bankrupt businessman, or the terminally ill patient may not only suffer profoundly upon discovering the truth about their fate, but often may be driven to self-destruction in those circumstances (so that not even possible "delayed" benefits of truth may be hoped for). Indeed, recent analyses and reviews have stressed the occasional palliative value of cognitive distortions and biases (Alloy & Abramson, 1979, 1988; Lewinsohn, Mischel, Chapin, & Barton, 1980; Mahoney, 1985; Mischel, 1979; Nelson & Craighead, 1977; Taylor & Brown, 1988). As Mischel (1979) put it:

It is tempting to conclude that a key to avoiding depression is to see oneself less stringently and more favorably than others see one. If so, the beliefs that unrealistic self-appraisals are a basic ingredient of depression and that realism is the crux of appropriate affect may have to be seriously questioned. To feel good about ourselves we may have to judge ourselves more kindly than we are judged. Self-enhancing information processing and biased self encoding may be both a requirement for positive affect and the price for achieving it. (p. 752)

One should avoid the implication, however, that judgmental veridicality is, in general, negatively correlated with well-being. Consider research on "depressive realism" (Alloy & Abramson, 1988; Alloy, Albright, Abramson, & Dykman, 1989). Although initial findings in this domain suggested, intriguingly, that depressives are often more accurate in their judgments than nondepressives, subsequent research uncovered boundary conditions for the effect. Alloy et al. (1989) concluded that "both depressed and nondepressed individuals . . . exhibit both biased and unbiased processing, depending on the relative match of the information to be processed to each group's self schema" (p. 9). In line with our earlier analysis, then, for depressives and nondepressives alike, accuracy is a matter of match between judgment and criterion. Judgment, in turn, is determined by various epistemic factors including the individual's accessible constructs or schemata: When those match the situational criterion (the "information to be processed"), there will be accuracy; when they don't, inaccuracy will follow. Thus, theoretical analysis and empirical data (Albright, Alloy, Barch, & Dykman, 1989; Dykman, Abramson, Alloy, & Hartlage, 1988) agree that, generally speaking, psychological well-being is neither positively nor negatively related to accuracy. Furthermore, no one class of persons (e.g., depressives) is likely to be generally more accurate than another, though it might be so in specific circumstances where those persons' judgmental process happened to yield criterion-matching outcomes.

If psychological malaise is not the function of inaccuracy, what might it be the function of? Most likely, the contents of the individual's beliefs connoting the frustration of important objectives. For instance, if a person's objective was to excel in absolutely everything, he or she would be likely to suffer on failing at some task. Similarly, if a person's objective was to be well liked by all, he or she would suffer if rejected by some people. According to Beck, Rush, Shaw, and Emery (1979), these exemplify states of "loss," that is, past frustrations leading to depression. In contrast, expected or future frustrations represent impending "dangers" experienced as anxiety or fear (Beck et al., 1979).

Recent work by Higgins and his associates (Higgins, 1987; Higgins, Straumann, & Klein, 1985) suggests that the type of suffering depends on the type of objective that is frustrated. Frustrating an objective of fulfilling one's duty (discrepancy from an "ought" standard) may give rise to agitation-type affect (stemming from the implicit possibility of punishment), whereas frustrating an objective of attaining a desirable object or state (discrepancy from an "ideal" standard) may give rise to a dejection-type affect.

Whatever its specific contents, the frustrative belief needs to be modified if suffering is to be alleviated. Structurally, a frustrative belief involves the elements of (a) a goal, (b) a means to a goal, and (c) a relation of ineffectiveness connecting the two, suggesting that whatever means the person is currently using fails to achieve the goal. Accordingly, modification of a frustrative belief may involve one of three types of shifts. A person may reassess the conclusion that his or her goal is indeed frustrated and be persuaded to accept the contrary conclusion (e.g., a person who had assumed that she or he was rather untalented may come to believe that she or he is in fact quite gifted). A person may come to believe that she or he has other, more effective, means of attaining the same goal (e.g., that assertive firmness is a more efficient way of dealing with a co-worker than is conciliatory affability). Finally, a person may change his or her mind about the attractiveness of the goal itself, and be persuaded to abandon it and/or replace it by another goal (e.g., the goal of avoiding all interpersonal conflicts may be abandoned, and one may learn to live with some conflicts).
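The three-part structure just described, and the three routes for modifying it, can be rendered as a toy data model. This is an illustrative sketch only; the class and function names are my own, not part of the theory's formal apparatus:

```python
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class FrustrativeBelief:
    """A frustrative belief: (a) a goal, (b) a means to the goal, and
    (c) a relation of ineffectiveness connecting the two."""
    goal: str
    means: str
    ineffective: bool = True  # the means is seen as failing to reach the goal

    def is_frustrative(self) -> bool:
        return self.ineffective


# Route (a): reassess the conclusion that the goal is frustrated.
def reassess_conclusion(b: FrustrativeBelief) -> FrustrativeBelief:
    return replace(b, ineffective=False)


# Route (b): adopt a different, more effective means to the same goal.
def substitute_means(b: FrustrativeBelief, new_means: str) -> FrustrativeBelief:
    return replace(b, means=new_means, ineffective=False)


# Route (c): abandon or replace the goal itself.
def replace_goal(b: FrustrativeBelief, new_goal: str) -> FrustrativeBelief:
    return replace(b, goal=new_goal, ineffective=False)


belief = FrustrativeBelief(goal="avoid all interpersonal conflicts",
                           means="conciliatory affability")
revised = replace_goal(belief, "learn to live with some conflicts")
assert belief.is_frustrative() and not revised.is_frustrative()
```

The point of the sketch is merely that each therapeutic route alters a different element of the same triadic structure while dissolving the ineffectiveness relation.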

Whatever therapeutic change seems most desirable, its attainment should be facilitated through considering the epistemic process whereby all subjective knowledge is formed and modified. Thus, in order that therapeutic arguments qualify as compelling evidence, they should be founded on the client's own beliefs and assumptions. In part, such assumptions may pertain to the perceived "authority" of the arguments' source. For some clients, the therapist may represent a highly revered authority; other clients might be more readily swayed by a group consensus or their own experiences. Advance knowledge of clients' beliefs in these regards allows a judicious selection of therapeutic situations most likely to facilitate the desired change.

Beyond attunement to clients' subjective logic, successful therapy may utilize cognitive capability principles in attempts to replace the dysfunctional cognitions by more adaptive alternatives. Thus, the therapist may impart to the client useful ways of framing his or her problem that have been previously unavailable or inaccessible. Furthermore, the therapist may try to lower the accessibility of dysfunctional cognitions. Occasionally, this may take the form of minimizing the client's contact with situations or persons likely to activate such maladaptive notions. Following prolonged inactivity, the dysfunctional ideas may become dissociated in the client's mind from their evidential base (Cook, Gruder, Hennigan, & Flay, 1979; Greenwald, Baumgardner, & Leippe, 1979) and in this way become more tractable to alteration attempts.

In addition, therapeutic change may take advantage of the client's epistemic motivations. Thus, the therapist may demonstrate that a given dysfunctional belief is incongruent and/or that a substitute belief is more congruent with the client's basic desires. For example, the therapist who suspects that a client's belief that he or she is a failure is prompted by a need for compassion and sympathy might point out that such a self-conception invites instead rejection and disdain. An alternative, more socially acceptable belief might be suggested, for example, that, like most people, the client too has a mixed success-failure record.

In the foregoing example, therapeutic intervention engages the client's need for specific closure (to believe that others are sympathetic to him or her). Furthermore, the need for nonspecific closure might be utilized as a consolidating influence that may freeze the client's belief on a given hypothesis (ideally that proposed by the therapist) and suppress his or her tendency to generate rival alternative hypotheses. Contracts and commitments often employed in cognitive therapy (e.g., Kanfer, 1975) may work in part because they arouse closure needs, ultimately resulting in acceptance by the client of the (adaptive) notions relevant to the commitment.

Finally, engagement of the need to avoid closure may be helpful at the stage of uprooting the old, dysfunctional beliefs assumed responsible for clients' pain and suffering. For instance, a person might be inclined to abandon a belief portrayed as invalid by the therapist or members of the therapy group. A. Ellis's (1977) therapeutic approach often invokes the client's fear of invalidity to unfreeze his or her maladaptive beliefs. The same process seems to underlie Beck's (1976) technique of intellectually challenging the clients' beliefs.

The last examples highlight an important issue regarding the actual value of the lay epistemic reanalysis of cognitive therapy processes. Does it offer a genuinely new approach, or merely relabel in new terms established therapeutic practices? The answer is twofold. First, the analysis accounts for much of what the cognitive therapists actually do, though it differs from what they say they do. Second, as a consequence, it may help identify unproductive therapeutic techniques and furnish generative concepts for inventing new, more productive, techniques instead. Consider, for example, a person's belief "I must succeed in absolutely everything I try." Many cognitive therapists would view such a belief as a good candidate for alteration because it is "irrational," "invalid," an "overgeneralization," and so on. By contrast, from the present perspective the belief may be thought of as worrisome because it connotes a high likelihood of frustration. On the other hand, if the client actually believed that she or he is, in fact, "absolutely successful at everything," no frustration would seem to exist, hence, no warrant for uprooting the belief.

For another example, the therapist's derisive portrayal of a dysfunctional thought as a "selective abstraction" or an "overgeneralization" (Beck, 1976) might contribute to a successful unfreezing of such a thought through arousal of the client's fear of invalidity, hence, the need to avoid closure. In some cases, however, the argument might fall flat: The client might be a (depressed) philosopher of science fully convinced that all statements are selective abstractions or overgeneralizations anyway. If so, alternative means of unfreezing the dysfunctional beliefs would need to be devised.

The foregoing examples indicate that the lay epistemic reinterpretation of cognitive therapy is of more than just an academic interest, and is capable of yielding practical implications of some importance.

Epistemics of the Standard Setters: Sociocognitive Constraints on Scientific Knowledge

In discussing accuracy, I noted earlier that the "reality" criterion against which subjects' judgments are assessed is often some standard setter's judgment and that the two types of judgment are formed via essentially the same process. Let us pursue this notion more fully by exploring how knowledge is formed and modified among a particularly influential group of standard setters, the scientific community at large.


The notion that scientific and common-sense methods of knowledge formation are similar is not particularly startling. It has been acknowledged both by leading social psychologists (e.g., Heider, 1958; Jones & Davis, 1965; Kelley, 1967) and major philosophers of science (e.g., Feyerabend, 1976; Kuhn, 1962; Laudan, 1977; Popper, 1935/1959, 1966, 1972) among others. But let us see exactly how deep is the similarity and what implications it has for everyday scientific practices.

Just as do lay persons, scientists validate their hypotheses via relevant evidence identified in previously assumed linkages between data and hypotheses. More specifically, the scientist attempts to collect diagnostic evidence (Trope & Bassok, 1983) that rules out plausible competing alternatives to a hypothesis (Campbell & Stanley, 1963; Cook & Campbell, 1979). Formally speaking, the scientist acts as if he or she assumed a linkage whereby "only if the hypothesis were true (and its alternatives false) would a given (diagnostic) datum be in evidence." Adopting such a premise (whose logical form is material equivalence) warrants a positive inference of the hypothesis from the evidence (rather than its mere falsification by failure to obtain the evidence, as Popper, 1935/1959, had it).
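The contrast between the two inference patterns can be written out schematically (my notation: H for the hypothesis, D for the diagnostic datum):

```latex
% Linkage assumed by the scientist: only if H were true (and its
% rivals false) would D be in evidence -- material equivalence:
H \leftrightarrow D

% This equivalence licenses a positive inference from the evidence:
(H \leftrightarrow D) \wedge D \;\vdash\; H

% Popper's falsificationist scheme, by contrast, uses only the
% one-way conditional, so evidence can refute but never confirm:
(H \rightarrow D) \wedge \neg D \;\vdash\; \neg H
```

On the equivalence reading, obtaining the diagnostic datum positively supports the hypothesis; on the conditional reading, only its absence is informative.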

Obviously, the diagnosticity of a given datum depends on the context of competing hypotheses that the scientist is considering (Kruglanski & Mayseless, 1988). In turn, the generation of such alternative hypotheses is affected by factors of (a) accessibility and (b) epistemic motivation, referred to earlier. In principle, it should be possible to continue generating hypotheses indefinitely on any topic of interest. That is why no scientific proposition is ever established or proven with finality (Campbell, 1969; Cronbach, 1980, 1982; Weimer, 1979). Thus, scientific knowledge is perennially tentative and potentially changing (Popper, 1972). These notions have some interesting implications for everyday problems of research design.

Consider the commonly accepted notion of "design hierarchies" whereby some research paradigms (e.g., an experiment) are seen as inherently superior to others (e.g., a case study). This notion assumes a finite list of "threats to validity" and the argument that some designs control for more of those threats than do others. But no lists of threats can be finite or exhaustive. Threats represent (alternative) hypotheses generated by scientists in the course of a potentially limitless epistemic process. Any list of threats is relative to a state of knowledge within a field of research, which determines the availability/accessibility of plausible linkages between data and various hypotheses. For instance, Campbell and Stanley (1963) listed as threats to "internal validity":

1. History.
2. Maturation.
3. Testing.
4. Instrumentation.
5. Statistical regression.
6. Selection biases.
7. Experimental mortality.
8. Selection-maturation interaction.

Subsequently, Cook and Campbell (1976) added to that list several further threats:

9. Diffusion or imitation of the treatment.
10. Compensatory equalization of treatment.
11. Compensatory rivalry.
12. Local history.

Presumably, with further evolution of a science, yet-further threats to validity could be added and others dropped from the list. For instance, at one point Campbell and Stanley (1963) expressed concern about the reactive effects of the pretest; this concern was subsequently allayed (Campbell, 1969) by Lana's (1969) reassuring review of research on this topic.

However, note that, if lists of threats are relative (to a state of knowledge), so must be design hierarchies founded on those lists. Thus, a "true" experiment could be occasionally less interpretable than a correlational design or a case study (Campbell, 1975). In fact, most developmental studies are correlational; this does not make them any less valid than many experimental studies. More generally then, the issue is not whether a given research design is of a particular type, but whether it controls for currently plausible interpretations of the data. A research design controlling for specific threats out of a fixed list could be controlling for the "wrong" or implausible ones, while failing to control for those of real concern. A given experiment may invite several alternative interpretations of the data, whereas a single-shot case study may invite none. For instance, a physician may be quite confident that a patient was poisoned, say, by antimony and may be unable to generate any alternative explanations for the observed symptoms.

Not only is the utility of a research design relative to a state of knowledge within a community of scientists, but it may also vary across communities. A research design that convinces the psychiatric community may fail to convince the community of clinical psychologists, and vice versa. A social psychologist may worry about "demand characteristics" (Orne, 1962) or "reactivity" of research procedures (Webb, Campbell, Schwartz, & Sechrest, 1966); such threats may be somewhat foreign (i.e., relatively inaccessible) to a cognitive psychologist. Thus, to achieve scientific acceptance of one's views, one may need to furnish different types of evidence to different scientific audiences. It is in this vein that Cronbach (1982) recently commented that the methodology of scientific inference may well be moved "from the realm of pure logic into the realm of rhetoric, or persuasion" (p. 109).

Official and unofficial scientific "methodologies." Scientific inferences considered so far were based on linkage structures connecting substantive evidence on a problem to one or more explanatory hypotheses. It is such inferences that design textbooks typically address; ways of reaching them are what official scientific methodology is all about. But just as lay attitudes are often deduced from "heuristics" (Chaiken, 1980) unrelated to the message substance, so apparently are scientific opinions. A case in point is "source effects" in scientific inference, namely, credibility boosts to scientific arguments due to endorsement by prestigious scientific authorities. Enhancing the persuasiveness of one's views to the scientific audience through the attainment of appropriate endorsements may thus be classified as an aspect of unofficial scientific methodology.

Darwin, for instance, published his theory of natural selection in the prestigious Journal of the Linnaean Society, under the sponsorship of Sir Charles Lyell, a famed geologist. Einstein's general theory of relativity gained popularity in England following an endorsement by Eddington, and the legitimation of quantum optics is often ascribed to the support of Henri Poincare. Failure to gain the approval of established scientific sources is sometimes overcome by cultivating an alternative community of followers for whom one commands authority. A case in point is Freud, who, upon rejection by the psychiatric establishment, proceeded to found his own scientific network complete with learned (psychoanalytic) societies and publication outlets (e.g., the Zentralblatt für Psychoanalyse).

Implicit inferences from "source heuristics" are not the only way in which the process of scientific persuasion transcends the maxims of official methodology. To exert impact, researchers need to make their ideas accessible to others, which in the present age of information explosion is not always easy. To accomplish this, it may be useful to have prolific co-workers or graduate students whose work may amplify the accessibility of their supervisor's notions. Some scientists may attempt to render their research noticeable by choosing faddish problems. A cost of such a strategy can be excessive reactivity to shifting trends, a sort of scientific "knee-jerk relevance," at the expense of a more serious commitment to a topic of study.

Finally, persuasion in science appears very much affected by the researchers' epistemic motivations. For example, the opposition that the psychiatric establishment put up against Freud's theorizing is typically explained by his emphasis on sexuality, which may have offended those holding the traditional middle-class values of his age. Mendel's genetic notion that human traits are transmitted from parent to child could have been desirable, hence persuasive, to him and his fellow members in the Augustinian order, stressing as it does the "original sin" that, since Adam's fall, is also transmitted down the generations. In short, credibility of scientific propositions seems importantly affected by the values, interests, and needs they may serve. As Cronbach (1980) put it: "To call for value-free standards of validity is ... a nostalgic longing for a world that never was" (p. 105). Sensitivity to such motivational considerations may also count as part of unofficial scientific methodology.

Whereas the foregoing examples pertain to needs for specific closures, developments in scientific knowledge may also be shaped by needs for nonspecific closure or the avoidance of closure. Kuhn's (1962) seminal work on scientific revolutions stressed the motivational significance for researchers of paradigms that offer guiding structure or a firm assumptive basis for further inquiry. Popper (1935/1959), on the other hand, recommended adopting a perennially questioning stance, in which the most deeply venerated assumptions are constantly "on the line" as candidates for falsification. In present terms, the Popper-Kuhn controversy about the virtues of "revolutionary" versus "normal" science may relate in part to the relative dominance within a scientific community of concerns for the avoidance versus maintenance of closure. Based on such an analysis, it should be possible to speculate which scientists might be inclined to undermine existing paradigms (e.g., scientists with little hope of making a major contribution within the existing system), and which might come to the staunch defense of the received paradigms (e.g., scientists threatened by ambiguity, or ones with strong career interests in those paradigms). More generally, it is suggested that "paradigm shifts" may be propelled by more than just the accumulation of anomalies (Kuhn, 1962). Specifically, a motivational base is postulated for taking the anomalies seriously, or even investing efforts in their discovery.

In summary, the lay epistemic framework offers a sociocognitive perspective on the scientific process. The same psychological factors (inference rules, construct accessibility, or epistemic motivations) that affect the formation of everyday knowledge appear also to govern the formation of scientific knowledge. According to this perspective, beyond the official scientific methodology of proof via the substance of scientific arguments, persuasiveness of scientific propositions is inevitably affected by several unofficial factors related to the psychological context of inference. As in one of M. C. Escher's graphical paradoxes, where the lower plane turns out to represent simultaneously the higher plane, social-cognitive psychology too may be thought of at once as a subdiscipline of science and as a metatheory for science.

Concluding Comment

This article proposed that a functionally integrated theory of the processes whereby persons form and change their knowledge offers a synthesis of numerous domains of study within social-cognitive psychology. Such a synthesis offers two distinct advantages: (a) It identifies unifying coherence and continuity in the research efforts of social-cognitive psychologists over the past several decades, and (b) it offers a fresh perspective on fundamental social-cognitive issues, opening up numerous avenues for future inquiry.


I am indebted to Judson Mills for comments on a draft of this article.

Arie W. Kruglanski, Department of Psychology, University of Maryland, College Park, MD 20742.


Abelson, R. P. (1968). Psychological implication. In R. P. Abelson, E. Aronson, W. J. McGuire, T. M. Newcomb, M. J. Rosenberg, & P. H. Tannenbaum (Eds.), Theories of cognitive consistency: A sourcebook (pp. 112-140). Chicago: Rand McNally.

Abelson, R. P., Aronson, E., McGuire, W. J., Newcomb, T. M., Rosenberg, M. J., & Tannenbaum, P. H. (Eds.). (1968). Theories of cognitive consistency: A sourcebook. Chicago: Rand McNally.

Abramson, L. Y., Seligman, M. E. P., & Teasdale, J. D. (1978). Learned helplessness in humans: Critique and reformulation. Journal of Abnormal Psychology, 87, 48-74.

Ajzen, I., & Fishbein, M. (1975). A Bayesian analysis of attribution processes. Psychological Bulletin, 82, 261-277.

Albright, J. S., Alloy, L. B., Barch, D., & Dykman, B. M. (1989). Depression and social comparison: The role of comparison set and target other. Manuscript submitted for publication.

Alloy, L. B., & Abramson, L. Y. (1979). Judgment of contingency in depressed and nondepressed students: Sadder but wiser? Journal of Experimental Psychology: General, 108, 441-485.

Alloy, L. B., & Abramson, L. Y. (1988). Depressive realism: Four theoretical perspectives. In L. B. Alloy (Ed.), Cognitive processes in depression (pp. 17-42). New York: Guilford.

Alloy, L. B., Albright, J. S., Abramson, L. Y., & Dykman, B. M. (1989). Depressive realism and nondepressive optimistic illusion: The role of the self. In R. E. Ingram (Ed.), Contemporary psychological approaches to depression: Treatment, research and theory. New York: Plenum.

²Due to space limitations, only an abridged form of these arguments was presented here. For a fuller discussion, the interested reader is referred to Kruglanski (1989a).

Allyn, J., & Festinger, L. (1961). The effectiveness of unanticipated persuasive communications. Journal of Abnormal and Social Psychology, 62, 35-40.

Anderson, C. A. (1983). The causal structure of situations: The generation of plausible causal attributions as a function of type of event situation. Journal of Experimental Social Psychology, 19, 185-203.

Aronson, E. (1968). Dissonance theory: Progress and problems. In R. P. Abelson, E. Aronson, W. J. McGuire, T. M. Newcomb, M. J. Rosenberg, & P. H. Tannenbaum (Eds.), Theories of cognitive consistency: A sourcebook. Chicago: Rand McNally.

Bechtold, J., Zanna, M. P., & Naccarato, M. (1985). [Effects of need for closure on the prejudice-discrimination relation]. Unpublished data, University of Waterloo, Waterloo, Canada.

Beck, A. T. (1976). Cognitive therapy and the emotional disorders. Madison, CT: International Universities Press.

Beck, A. T., Rush, A. J., Shaw, B. F., & Emery, G. (1979). Cognitive therapy of depression: A treatment manual. New York: Guilford.

Berscheid, E., Graziano, W., Monson, T., & Dermer, M. (1976). Outcome dependency: Attention, attribution and attraction. Journal of Personality and Social Psychology, 34, 978-989.

Billig, M. (1982). Ideology and social psychology. Oxford: Basil Blackwell.

Bourne, L. E., Dominowski, R. L., & Loftus, E. F. (1979). Cognitive process. Englewood Cliffs, NJ: Prentice-Hall.

Bruner, J. S. (1951). Personality dynamics and the process of perceiving. In R. Blake & G. V. Ramsey (Eds.), Perception: An approach to personality (pp. 24-54). New York: Ronald.

Bruner, J. S. (1973). Beyond the information given: Studies in the psychology of knowing. New York: Norton.

Campbell, D. T. (1969). Prospective: Artifact and control. In R. Rosenthal & R. L. Rosnow (Eds.), Artifact in behavioral research. New York: Academic Press.

Campbell, D. T. (1975). Degrees of freedom and the case study. Comparative Political Studies, 8, 178-193.

Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research on teaching. In N. L. Gage (Ed.), Handbook of research on teaching. Chicago: Rand McNally.

Chaiken, S. (1980). Heuristic versus systematic information processing and the use of source versus message cues in persuasion. Journal of Personality and Social Psychology, 39, 752-766.

Chaiken, S., Liberman, A., & Eagly, A. H. (1989). Heuristic and systematic information processing within and beyond the persuasion context. In J. S. Uleman & J. A. Bargh (Eds.), Unintended thought: Limits of awareness, intention, and control (pp. 212-252). New York: Guilford.
Cialdini, R. B., Petty, R. E., & Cacioppo, J. T. (1981). Attitude and attitude change. Annual Review of Psychology, 32, 357-404.

Cook, T. D., & Campbell, D. T. (1976). The design and conduct of quasi-experiments and true experiments in field settings. In M. D. Dunnette (Ed.), Handbook of industrial and organizational psychology. Chicago: Rand McNally.

Cook, T. D., & Campbell, D. T. (1979). Quasi-experimentation: Design and analysis issues for field settings. Chicago: Rand McNally.

Cook, T. D., Gruder, C. L., Hennigan, K. M., & Flay, B. R. (1979). History of the sleeper effect: Some logical pitfalls in accepting the null hypothesis. Psychological Bulletin, 86, 662-679.

Cooper, J., & Fazio, R. H. (1984). A new look at dissonance theory. In L.

Berkowitz (Ed.), Advances in experimental social psychology (Vol. 17). New York: Academic.

Cronbach, L. J. (1955). Processes affecting scores on "understanding of others" and "assumed similarity." Psychological Bulletin, 52, 177-193.

Cronbach, L. J. (1980). Validity on parole: How can we go straight? New Directions for Testing and Measurement, 5, 99-107.

Cronbach, L. J. (1982). Designing evaluations of educational and social programs. San Francisco: Jossey-Bass.

Dykman, B. M., Abramson, L. Y., Alloy, L. B., & Hartlage, S. (1988). Processing of ambiguous and unambiguous feedback by depressed and nondepressed college students: Schematic biases and their implication for depressive realism. Unpublished manuscript, University of Wisconsin-Madison.

Einhorn, H. J., & Hogarth, R. M. (1981). Behavioral decision theory: Processes of judgment and choice. Annual Review of Psychology, 32, 53-88.



Einhorn, H. J., & Hogarth, R. (1983). A theory of diagnostic inference: Judging causality (Memorandum). Chicago: University of Chicago, Center for Decision Research.

Ellis, A. (1977). The basic clinical theory of rational emotive therapy. In A. Ellis & R. Grieger (Eds.), Handbook of rational emotive therapy (pp. 48-73). New York: Springer.

Ellis, S. (1984). The effect of self-epistemic authority on the relative efficacy of cognitive change methods in different subject matters. Unpublished doctoral dissertation, Tel Aviv University, Israel.

Ennis, R. H. (1987). A taxonomy of critical thinking dispositions and abilities. In J. Boykoff-Baron & R. J. Sternberg (Eds.), Teaching thinking skills: Theory and practice. New York: Freeman.

Erber, R., & Fiske, S. T. (1984). Outcome dependency and attention to inconsistent information. Journal of Personality and Social Psychology, 47, 709-726.

Fazio, R. H. (1989). On the power and functionality of attitudes. In A. R. Pratkanis, S. J. Breckler, & A. G. Greenwald (Eds.), Attitude structure and function. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

Fazio, R. H., & Zanna, M. P. (1981). Direct experience and attitude-behavior consistency. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 14, pp. 17-39). New York: Academic.

Ferguson, T. J., & Wells, C. L. (1980). Priming of mediators in causal attribution. Journal of Personality and Social Psychology, 38, 461-470.

Festinger, L. (1957). A theory of cognitive dissonance. Stanford, CA: Stanford University Press.

Festinger, L., & Maccoby, N. (1964). On resistance to persuasive communication. Journal of Abnormal and Social Psychology, 68, 359-366.

Feyerabend, P. (1976). Against method. New York: Humanities Press.

Fincham, F. D. (1985). Attributions in close relationships. In J. H. Harvey & G. Weary (Eds.), Attribution: Basic issues and applications (pp. 66-92). New York: Academic.

Fiske, S. T., & Neuberg, S. L. (1988). A continuum of impression formation from category-based to individuating processes: Influence of information and motivation on attention and interpretations. In M. P. Zanna (Ed.), Advances in experimental social psychology (Vol. 23, pp. 23- 47). New York: Academic.

Fong, G. T., Krantz, D. H., & Nisbett, R. E. (1986). The effects of statistical training on thinking about everyday problems. Cognitive Psychology, 18, 253-292.

Frenkel-Brunswik, E. (1949). Intolerance of ambiguity as an emotional and perceptual personality variable. Journal of Personality, 18, 108-143.

Freund, T., Kruglanski, A. W., & Schpitzajzen, A. (1985). The freezing and unfreezing of impressional primacy: Effects of the need for structure and the fear of invalidity. Personality and Social Psychology Bulletin, 11, 479-487.

Funder, D. C. (1987). Errors and mistakes: Evaluating the accuracy of social judgment. Psychological Bulletin, 101, 75-91.

Gage, N. L., & Cronbach, L. J. (1955). Conceptual and methodological problems in interpersonal perception. Psychological Review, 62, 411-422.

Greenwald, A. G., Baumgardner, M. H., & Leippe, M. R. (1979). In search of reliable persuasion effects: III. The sleeper effect is dead. Long live the sleeper effect! Unpublished manuscript, Ohio State University, Columbus.

Greenwald, A. G., & Ronis, D. L. (1978). Twenty years of cognitive dissonance: Case study in the evolution of a theory. Psychological Review, 85, 53-57.

Gregory, R. L. (1970). The intelligent eye. New York: McGraw-Hill.

Gregory, R. L. (1973). The confounded eye. In R. L. Gregory & E. H. Gombrich (Eds.), Illusions in nature and art (pp. 7-23). London: Duckworth.


Hansen, R. D., & Hall, C. A. (1985). Discounting and augmenting facilitatory and inhibitory forces: The winner takes almost all. Journal of Personality and Social Psychology, 49, 1482-1493.

Harvey, J. H., & Weary, G. (1984). Current issues in attribution theory and research. Annual Review of Psychology, 35, 427-459.

Hastie, R., & Rasinsky, K. A. (1988). The concept of accuracy in social judgment. In D. Bar-Tal & A. W. Kruglanski (Eds.), The social psychology of knowledge (pp. 193-209). Cambridge, England: Cambridge University Press.

Heider, F. (1958). The psychology of interpersonal relations. New York: Wiley.


Higgins, E. T. (1987). Self-discrepancy: A theory relating self and affect. Psychological Review, 94, 319-340.

Higgins, E. T., Bargh, J. A., & Lombardi, W. (1985). The nature of priming effects on categorization. Journal of Experimental Psychology: Learning, Memory, and Cognition, 11, 59-69.

Higgins, E. T., & King, G. A. (1981). Accessibility of social constructs: Information-processing consequences of individual and contextual variability. In N. Cantor & J. F. Kihlstrom (Eds.), Personality, cognition and social interaction (pp. 69-121). Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

Higgins, E. T., King, G. A., & Mavin, G. H. (1982). Individual construct accessibility and subjective impressions and recall. Journal of Personality and Social Psychology, 43, 35-47.

Higgins, E. T., & Stangor, C. (1988). Context-driven social judgment and memory. In D. Bar-Tal & A. W. Kruglanski (Eds.), The social psychology of knowledge (pp. 262-299). Cambridge, England: Cambridge University Press.

Higgins, E. T., Strauman, T., & Klein, R. (1985). Standards and the process of self-evaluation: Multiple affects from multiple stages. In R. M. Sorrentino & E. T. Higgins (Eds.), Motivation and cognition: Foundations of social behavior (pp. 23-63). New York: Guilford.

Hovland, C. I., Janis, I. L., & Kelley, H. H. (1953). Communication and persuasion. New Haven, CT: Yale University Press.

Hovland, C. I., Lumsdaine, A. A., & Sheffield, F. D. (1949). Experiments on mass communication. Princeton, NJ: Princeton University Press.

Jamieson, D. W., & Zanna, M. P. (1989). Need for structure in attitude formation and expression. In A. R. Pratkanis, S. J. Breckler, & A. G. Greenwald (Eds.), Attitude structure and function (pp. 73-89). Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

Janis, I. L., & Gilmore, J. B. (1965). The influence of incentive conditions on the success of role playing in modifying attitudes. Journal of Personality and Social Psychology, 1, 17-27.

Jepson, C., Krantz, D. H., & Nisbett, R. E. (1983). Inductive reasoning: Competence or skill? Behavioral and Brain Sciences, 6, 94-101.

Jones, E. E., & Davis, K. E. (1965). From acts to dispositions: The attribution process in person perception. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 2). New York: Academic.

Kahneman, D., Slovic, P., & Tversky, A. (Eds.). (1982). Judgment under uncertainty: Heuristics and biases. New York: Cambridge University Press.

Kahneman, D., & Tversky, A. (1973). On the psychology of prediction. Psychological Review, 80, 237-251.

Kanfer, F. H. (1975). Self-management methods. In F. H. Kanfer & A. P. Goldstein (Eds.), Helping people change (pp. 7-24). Elmsford, NY: Pergamon.


Katz, D. (1960). The functional approach to the study of attitudes. Public Opinion Quarterly, 24, 163-204.

Kelley, H. H. (1967). Attribution theory in social psychology. In D. Levine (Ed.), Nebraska Symposium on Motivation (pp. 192-238). Lincoln: University of Nebraska Press.

Kelley, H. H. (1971). Attribution in social interaction. In E. E. Jones, D. E. Kanouse, H. H. Kelley, R. E. Nisbett, S. Valins, & B. Weiner (Eds.), Attribution: Perceiving the causes of behavior (pp. 1-26). Morristown, NJ: General Learning Press.

Kelley, H. H., & Michela, J. L. (1980). Attribution theory and research. Annual Review of Psychology, 31, 457-501.

Kenny, D. A., & Albright, L. (1988). Accuracy in interpersonal perception: A social relations analysis. Psychological Bulletin, 102, 390-403.

Klar, Y. (in press). Linking-structures: Another look at logical and statistical inference biases. Journal of Personality and Social Psychology.

Kruglanski, A. W. (1975). The endogenous-exogenous partition in attribution theory. Psychological Review, 82, 387-406.

Kruglanski, A. W. (1980). Lay epistemo-logic - process and contents: Another look at attribution theory. Psychological Review, 87, 70-87.

Kruglanski, A. W. (1988). Knowledge as a social psychological construct. In D. Bar-Tal & A. W. Kruglanski (Eds.), The social psychology of knowledge (pp. 109-141). Cambridge, England: Cambridge University Press.

Kruglanski, A. W. (1989a). Lay epistemics and human knowledge: Cognitive and motivational bases. New York: Plenum.

Kruglanski, A. W. (1989b). The psychology of being "right": The problem of accuracy in social perception and cognition. Psychological Bulletin, 106, 395-409.

Kruglanski, A. W. (1990). Motivations for judging and knowing: Implications for causal attributions. In E. T. Higgins & R. M. Sorrentino (Eds.), Handbook of motivation and cognition: Foundations of social behavior (Vol. 2, pp. 13-37). New York: Guilford.


Kruglanski, A. W., & Freund, T. (1983). The freezing and unfreezing of lay inferences: Effects on impressional primacy, ethnic stereotyping, and numerical anchoring. Journal of Experimental Social Psychology, 19, 448-468.

Kruglanski, A. W., Hamel, I. A., Maides, S. A., & Schwartz, J. M. (1978). Attribution theory as a special case of lay epistemology. In J. H. Harvey, W. J. Ickes, & R. F. Kidd (Eds.), New directions in attribution research (Vol. 2, pp. 299-333). Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

Kruglanski, A. W., & Klar, Y. (1987). A view from a bridge: Synthesizing the consistency and attribution paradigms from a lay epistemic perspective. European Journal of Social Psychology, 17, 211-241.

Kruglanski, A. W., & Mayseless, O. (1988). Contextual effects in hypothesis testing: The role of competing alternatives and epistemic motivations. Social Cognition, 6, 1-20.

Kruglanski, A. W., Schwartz, J. M., Maides, S. A., & Hamel, I. A. (1978). Covariation, discounting and augmentation: Toward a clarification of attributional principles. Journal of Personality, 46, 176-180.

Kruglanski, A. W., & Webster, D. M. (1989). [Group members' reactions to an opinion deviate under varying degrees of the need for closure]. Unpublished data, University of Maryland, College Park.

Kuhn, T. S. (1962). The structure of scientific revolutions. Chicago: University of Chicago Press.

Lana, R. E. (1969). Pretest sensitization. In R. Rosenthal & R. L. Rosnow (Eds.), Artifact in behavioral research (pp. 121-146). New York: Academic.


Laudan, L. (1977). Progress and its problems. Berkeley: University of California Press.

Levine, M. A. (1975). A cognitive theory of learning. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

Lewin, K. (1943). Forces behind food habits and methods of change. Bulletin of the National Research Council, 108, 35-67.

Lewinsohn, P. M., Mischel, W., Chaplin, W., & Barton, R. (1980). Social competence and depression: The role of illusory self-perception. Journal of Abnormal Psychology, 89, 203-212.

Mahoney, M. (1985). Psychotherapy and human change processes. In M. J. Mahoney & A. Freeman (Eds.), Cognition and psychotherapy. New York: Plenum.

Mayseless, O., & Kruglanski, A. W. (1987). What makes you so sure? Effects of epistemic motivations on judgmental confidence. Organizational Behavior and Human Decision Processes, 39, 162-183.

McArthur, L. Z., & Baron, R. M. (1983). Toward an ecological theory of social perception. Psychological Review, 90, 215-238.

McArthur, L. Z., & Ginsburg, E. (1981). Causal attributions to salient stimuli: An investigation of visual fixation mediators. Personality and Social Psychology Bulletin, 7, 547-553.

Miller, D. T. (1976). Ego involvement and attributions for success and failure. Journal of Personality and Social Psychology, 34, 901-906.

Miller, D. T., & Ross, M. (1975). Self-serving biases in the attribution of causality: Fact or fiction? Psychological Bulletin, 82, 213-225.

Mischel, W. (1979). On the interface of cognition and personality: Beyond the person-situation debate. American Psychologist, 34, 740-754.

Nelson, R. E., & Craighead, W. E. (1977). Selective recall of positive and negative feedback, self-control behaviors, and depression. Journal of Abnormal Psychology, 86, 379-388.

Newell, A., & Simon, H. A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice-Hall.

Nisbett, R. E., Fong, G. T., Cheng, P. W., & Lehman, D. (1987). Teaching reasoning. Unpublished manuscript, University of Michigan, Ann Arbor.

Nisbett, R. E., Krantz, D. H., Jepson, C., & Kunda, Z. (1983). The use of statistical heuristics in everyday inductive reasoning. Psychological Review, 90, 339-363.

Nisbett, R. E., & Ross, L. (1980). Human inference: Strategies and shortcomings of social judgment. New York: Prentice-Hall.

Orne, M. (1962). On the social psychology of the psychological experiment: With particular reference to demand characteristics and their implications. American Psychologist, 17, 776-783.

Orvis, B. R., Kelley, H. H., & Butler, D. (1976). Attributional conflict in young couples. In J. H. Harvey, W. Ickes, & R. F. Kidd (Eds.), New directions in attribution research (Vol. 1, pp. 353-386). Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

Pavelchak, M. A. (1989). Piecemeal and category based evaluation: An idiographic analysis. Journal of Personality and Social Psychology, 56, 354-363.


Petty, R. E., & Cacioppo, J. T. (1981). Attitudes and persuasion: Classic and contemporary approaches. Dubuque, IA: Brown.

Petty, R. E., Cacioppo, J. T., & Goldman, R. (1981). Personal involvement as a determinant of argument based persuasion. Journal of Personality and Social Psychology, 41, 847-855.

Pittman, T. S., & D'Agostino, P. R. (1985). Motivation and attribution: The effects of control deprivation and subsequent information processing. In J. H. Harvey & G. Weary (Eds.), Attribution: Basic issues and applications (pp. 43-66). New York: Academic.

Popper, K. R. (1959). The logic of scientific discovery. New York: Harper. (Original work published 1935).

Popper, K. R. (1966). Conjectures and refutations. New York: Harper.

Popper, K. R. (1972). Objective knowledge. Oxford: Oxford University Press.

Pryor, J. B., & Kriss, M. (1977). The cognitive dynamics of salience in the attribution process. Journal of Personality and Social Psychology, 35, 49-55.

Raimy, V. (1975). Misunderstandings of the self. San Francisco: Jossey-Bass.

Regan, D. T., & Fazio, R. H. (1977). On the consistency between attitudes and behavior: Look to the method of attitude formation. Journal of Experimental Social Psychology, 13, 38-45.

Rokeach, M. (1960). The open and closed mind: Investigations into the nature of belief systems and personality systems. New York: Basic.

Rosenfield, D., & Stephan, W. G. (1977). When discounting fails: An unexpected finding. Memory & Cognition, 5, 97-102.

Ross, L., Lepper, M. R., & Hubbard, M. (1975). Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm. Journal of Personality and Social Psychology, 32, 880-892.

Ross, M., & Sicoly, F. (1979). Egocentric biases in availability and attribution. Journal of Personality and Social Psychology, 37, 322-336.

Sadler, 0., & Tesser, A. (1973). Some effects of salience and time upon interpersonal hostility and attraction during social isolation. Sociometry, 36, 99-112.

Scott, W. A. (1957). Attitude change through reward of verbal behavior. Journal of Abnormal and Social Psychology, 55, 72-75.

Sherif, M., & Cantril, H. (1945). The psychology of attitudes: I. Psychological Review, 52, 295-319.

Smith, E. R., & Miller, F. D. (1979). Salience and the cognitive mediation of attribution. Journal of Personality and Social Psychology, 37, 2240-2252.

Snyder, M. L., & Wicklund, R. A. (1981). Attribute ambiguity. In J. H. Harvey, W. Ickes, & R. F. Kidd (Eds.), New directions in attribution research (Vol. 3, pp. 78-103). Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

Sorrentino, R. M., & Higgins, E. T. (Eds.). (1986). Handbook of motivation and cognition: Foundations of social behavior. New York: Guilford.

Staats, A. W. (1975). Social behaviorism. Homewood, IL: Dorsey.

Steele, C. M. (1988). The psychology of self-affirmation. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 21, pp. 261-299). New York: Academic.

Swann, W. B. (1984). Quest for accuracy in person perception: A matter of pragmatics. Psychological Review, 91, 457-477.

Tajfel, H. (1981). Social stereotypes and social groups. In J. C. Turner & H. Giles (Eds.), Intergroup behavior (pp. 7-42). Oxford: Basil Blackwell.

Tajfel, H., & Turner, J. C. (1979). An integrative theory of intergroup conflict. In W. G. Austin & S. Worchel (Eds.), The social psychology of intergroup relations (pp. 35-62). Monterey, CA: Brooks/Cole.

Taylor, S. E., & Brown, J. D. (1988). Illusion and well-being: A social psychological perspective on mental health. Psychological Bulletin, 103, 193-210.

Taylor, S. E., & Fiske, S. T. (1975). Point of view and perception of causality. Journal of Personality and Social Psychology, 32, 439-445.

Taylor, S. E., & Koivumaki, J. H. (1976). The perception of self and others: Acquaintanceship, affect, and actor-observer differences. Journal of Personality and Social Psychology, 33, 403-408.

Taylor, S. E., & Thompson, S. C. (1982). Stalking the elusive "vividness" effect. Psychological Review, 89, 155-181.

Tesser, A. (1976). Thought and reality constraints as determinants of attitude polarization. Journal of Research in Personality, 10, 183-194.

Tesser, A. (1978). Self-generated attitude change. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 11, pp. 289-338). New York: Academic.



Tesser, A., & Leone, C. (1977). Cognitive schemas and thought as determinants of attitude change. Journal of Experimental Social Psychology, 13, 340-356.

Testa, T. J. (1974). Causal relationships and the acquisition of avoidance responses. Psychological Review, 81, 491-505.

Tetlock, P. E. (1985a). Accountability: The neglected social context of judgment and choice. Research in Organizational Behavior, 7, 297-332.

Tetlock, P. E. (1985b). Accountability: A social check on the fundamental attribution error. Social Psychology Quarterly, 48, 227-236.

Tetlock, P. E., & Boettger, R. (in press). Accountability: A social magnifier of the dilution effect. Journal of Personality and Social Psychology.

Tetlock, P. E., & Levi, A. (1982). Attribution bias: On the inconclusiveness of the cognition-motivation debate. Journal of Experimental Social Psychology, 18, 68-88.

Trope, Y., & Bassok, M. (1983). Information-gathering strategies in hypothesis testing. Journal of Experimental Social Psychology, 19, 560-576.

Vallone, R. P., Ross, L., & Lepper, M. R. (1985). The hostile media phenomenon: Biased perception and perceptions of media bias in coverage of the Beirut massacre. Journal of Personality and Social Psychology, 49, 577-585.

Wason, P. C., & Johnson-Laird, P. N. (1972). Psychology of reasoning: Structure and content. Cambridge, MA: Harvard University Press.

Webb, E. J., Campbell, D. T., Schwartz, R. D., & Sechrest, L. B. (1966). Unobtrusive measures: Nonreactive research in the social sciences. Chicago: Rand McNally.

Weimer, W. B. (1979). Psychology and the conceptual foundations of science. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

Weiner, B. (1979). A theory of motivation for some classroom experiences. Journal of Educational Psychology, 71, 3-25.

Weiner, B. (1985a). An attributional theory of achievement motivation and emotion. Psychological Review, 92, 548-573.

Weiner, B. (1985b). "Spontaneous" causal thinking. Psychological Bulletin, 97, 74-84.

Zanna, M. P., Kiesler, C. A., & Pilkonis, P. A. (1970). Positive and negative attitudinal affect established by classical conditioning. Journal of Personality and Social Psychology, 14, 321-328.

Zanna, M. P., & Rempel, J. K. (1988). Attitudes: A new look at an old concept. In D. Bar-Tal & A. W. Kruglanski (Eds.), The social psychology of knowledge (pp. 315-334). Cambridge, England: Cambridge University Press.

Zuckerman, M. (1979). Attribution of success and failure revisited, or: The motivational bias is alive and well in attribution theory. Journal of Personality, 47, 245-287.
