Cognitive Engineering: Human Problem Solving with Tools
D. D. WOODS¹ and E. M. ROTH,² Westinghouse Research and Development Center, Pittsburgh, Pennsylvania
Cognitive engineering is an applied cognitive science that draws on the knowledge and techniques of cognitive psychology and related disciplines to provide the foundation for principle-driven design of person-machine systems. This paper examines the fundamental features that characterize cognitive engineering and reviews some of the major issues faced by this nascent interdisciplinary field.
INTRODUCTION

Why is there talk about a field of cognitive engineering? What is cognitive engineering? What can it contribute to the development of more effective person-machine systems? What should it contribute? We will explore some of the answers to these questions in this paper. As with any nascent and interdisciplinary field, there can be very different perspectives about what it is and how it will develop over time. This paper represents one such view. The same phenomenon has produced both the opportunity and the need for cognitive engineering. With the rapid advances and dramatic reductions in the cost of computational power, computers have become ubiquitous in modern life. In addition to traditional office applications (e.g., word processing, accounting, information systems), computers increasingly dominate a broad
¹Requests for reprints should be sent to David D. Woods, Department of Industrial and Systems Engineering, Ohio State University, 1971 Neil Ave., Columbus, OH 43210.
²Emilie Roth, Department of Engineering and Public Policy, Carnegie-Mellon University, Pittsburgh, PA 15213.
range of work environments (e.g., industrial process control, air traffic control, hospital emergency rooms, robotic factories). The need for cognitive engineering occurs because the introduction of computerization often radically changes the work environment and the cognitive demands placed on the worker. For example, increased automation in process control applications has resulted in a shift in the human role from a controller to a supervisor who monitors and manages semiautonomous resources. Although this change reduces people's physical workload, mental load often increases as the human role emphasizes monitoring and compensating for failures. Thus computerization creates an increasingly larger world of cognitive tasks to be performed. More and more we create or design cognitive environments. The opportunity for cognitive engineering arises because computational technology also offers new kinds and degrees of machine power that greatly expand the potential to assist and augment human cognitive activities in complex problem-solving worlds, such as monitoring, problem formulation, plan
© 1988, The Human Factors Society, Inc. All rights reserved.
This question defines the central focus of cognitive engineering: to understand what is effective support for human problem solvers.. an uneasy mixture of other types of description of a complex situation has been substituted -descriptions in terms of the application itself (e. 1985. constructed in a variety of media and technologies including current AI tools.. internal medicine or power plant thermodynamics).g. This is highly creative time when people are exploring and testing what can be created with the new machine power-displays with multiple windows and even rooms (Henderson and Card.g. as witnessed by early attempts to develop computerized alarm systems in process control (e. 1985). This is the dark side: the capability to do more amplifies the potential magnitude of both our successes and our failures. What are the cognitive implications of some application's task demands and of the aids and interfaces available to the practitioners in the system? How do people behave/perform in the cognitive situations defined by these demands and tools? Because this independent cognitive description has been missing. Wiener. Although our ability to build more powerful machine cognitive systems has grown and been promulgated rapidly. performance breakdowns have been observed repeatedly with support systems. 1978) and attempts to convert paper-based procedures to a computerized form (e. The effort required to provide a cognitive support function in different kinds of media or technology may also vary. Rasmussen.. Elm and Woods. Hirschhorn. Different kinds of media or technology may be more powerful than others in that they enable or enhance certain kinds of cognitive support functions. This means that factors related to tool usage can and should affect the very nature of the tools to be used. Pope. 1986). Different choices of media or technology may also represent trade-offs between the kinds of support functions that are provided to the practitioner. 1984. 
Hoogovens Report. when issues of tool use were not considered (see Roth. Today we can describe cognitive tools in terms of the tool-building technologies (e.g. and fault management. The conditions under which the machine will be exercised and the human's role in problem solving have a profound effect on the quality of performance. 1986). This observation is not new-in actual work contexts.. the implementation technology of the interfaces/aids.g. In any case. and Woods. Noble. The capability to build more powerful machines does not in itself guarantee effective performance.. tiled or overlapping windows).g.416-August 1988 HUMAN FACTORS generation and adaptation. (Clancey. The impediment to systematic provision of effective decision support is the lack of an adequate cognitive language of description tool building to assist human performance. 1987). This view of cognitive technology as com- . 1985). The question we continue to face is how we should deploy the power available through new capabilities for 1984. Careful examination of past shifts in technology reveals that new difficulties (new types of errors or accidents) are created when the shift in machine power has changed the entire human-machine system in unforeseen ways (e. Bennett. 1976. the user's physical activities. The new capabilities have led to large amounts of activity devoted to building new and more powerful tools-how to build better-performing machine problem solvers. our ability to understand how to use these capabilities has not kept pace. performance aiding requires that one focus at the level of the cognitive support functions required and then at the level of what technology can provide those functions or how those functions can be crafted within a given computational technology.
the cognitive demands that they create. Rasmussen.. Lanir. and Johnson. Studying human behavior in complex worlds (and designing support systems) is one case of people engaged in problem solving in a complex world.. 1985. about Complex Worlds Cognitive engineering is about human behavior in complex worlds. the role of problem formulation and reformulation in effective performance is often overlooked. to understand human behavior in complex situations. knowledge engineering). . Newell and Card. First. some aspects of problem solving may emerge only when more complex situations are directly examined. the actual problem solver must cope with the possibility of multiple failures. the strategies researchers and designers use to cope with complexity are similar as well.. This strategy is limited because it is not clear whether the relevant aspects of the whole have been captured. 1986. Pople.. 1986). Fischhoff. the designer of a machine problem solver may assume that only one failure is possible to be able to completely enumerate possible solutions and to make use of classification problem-solving techniques (Clancey. 1986). 1986. 1981. and interacting disturbances (e. Norman and Draper. particularly in this time of advancing machine power. Norman. analogous to the task of other human problem solvers (e. cognitive factors. (See Dorner..COGNITIVE ENGINEERING August 1988-417 plementary to computational technology is in stark contrast to another view whereby cognitive engineering is a necessary but bothersome step to acquire the knowledge fuel necessary to run the computational engines of today and tomorrow. Rasmus- . Montmollin and De Keyser. 1983. and some of the cognitive failure forms that emerge when these demands are not met is essential if advances in machine power are to lead to new cognitive tools that actually enhance problem-solving performance. It is one major source of failure in the design of machine problem solvers. second. Not surprisingly. operators. 
Thus one might address only a single time slice of a dynamic process or only a subset of the interconnections among parts of a highly coupled world. However. In this section we will examine some of the characteristics of cognitive engineering (or whatever label you prefer-cognitive technologies. COGNITIVE ENGINEERING IS . There has been growing recognition of this need to develop an applied cognitive science that draws on knowledge and techniques of cognitive psychology and related disciplines to provide the basis for principle-driven design (Brown and Newman. 1985. 1985. parts of the problem-solving process may be missed or their importance underestimated. cognitive ergonomics. Woods and Roth. 1985). The specific perspective for this exposition is that of cognitive systems engineering (Hollnagel and Woods. 1983. troubleshooters) who confront complexity in the course of their daily tasks. in press. 1985. Reducing the complexity of design or research questions by bounding the world to be considered merely displaces the complexity to the person in the operational world rather than providing a strategy to cope with the true complexity of the actual problem-solving context. For example. What makes problem solving complex? How does complexity affect the performance of human and machine problem solvers? How can problem-solving performance in complex worlds be improved and deficiencies avoided? Understanding the factors that produce complexity. For example. a standard tactic to manage complexity is to bound the world under consideration. The result is that we need. Klein.g.. 1986).g. For example. misleading signals. Woods.
) .. have pointed out the need for a semantic and pragmatic analysis of environment-cognitive agent relationships with respect to the goals/resources of the agent and the demands/constraints in the environment. The studies by De Keyser (e. the introduction of computerized alarm systems into power plant control rooms inadvertently so badly undermined the strategies operational personnel used to cope with some problem-solving demands that the systems had to be removed and the previous alarm system restored (Pope. for other discussions on the nature of complexity in problem solving. quite a lot could be learned from examining the nature of the tools that people spontaneously create to work more effectively in some problem-solving environment. 1987. To guard against this danger. Rissland.g. we run the risk of eliminating the critical features of the world that drive behavior. one has to pay very close attention to what people actually do in a problem-solving world. Roth. The danger is to fall into the psychologist's fallacy of William James (1890) whereby the psychologist's reality is confused with the psychological reality of the human practitioner in his or her problemsolving world.418-August 1988 HUMAN FACTORS sen. Coombs. To decide this question. 1987). Rasmussen. For example. It is about multidimensional. An example of the ecological perspective is the need to study humans solving problems with tools (i. as occurred in Roth et al. or examining how tools provided for a practitioner are really put to use by practitioners. As a result.. This creates the problem of deciding what counts as an effective stimulus (as Gibson has pointed out in ecological perception) or. 1986.. .g. The point is that these worlds encompass more than the design intent-they exist in the world as natural problem-solving habitats. 1984. Selfridge. 1987). given the actual demands that they face (Woods. 
In reducing the target world to a tractable laboratory or desktop world in search of precise results. . 1987). 1978). 1986) are extremely unique with respect to the latter. 1986. Coombs and Hartley. . and Pople. From this viewpoint. Woods and Hollnagel.e. Principle-driven design of support systems begins wi th understanding what are the difficult aspects of a problemsolving situation (e.. decid- ing what counts for a symbol. Gibson (1979) and Dennett (1982).. the psychologist or cognitive engineer must start with the working assumption that practitioner behavior is reasonable and attempt to understand how this behavior copes with the demands and constraints imposed by the problem-solving world in question..g. virtually all of the work environments that we might be interested in are man-made. 1986. or examining how preexisting mechanisms are adapted to serve as tools.. to use an alternative terminology. support systems) as opposed to laboratory research that continues. among others. to examine human performance stripped of any tools.g. open worlds and not about the artificially bounded closed worlds typical of the laboratory or the engineer's desktop (e. for the most part. Of course. 1987). Purely syntactic and exclusively tool-driven approaches to develop support systems are vulnerable to the error of the third kind: solving the wrong problem. Coombs and Hartley. (1987). How to put effective cognitive tools into the hands of practitioners is the sine qua non for cognitive engineering. about the Semantics of a Domain A corollary to the foregoing points is that cognitive engineering must address the contents or semantics of a domain (e. Funder. Ecological Cognitive engineering is ecological. and Arbib..
formance of the current problem-solving system. The specification is typically embodied as a computer program and consti- . it would impose strong practical constraints on principle-driven development of support systems. Buggy knowledge-missing. Gentner and Stevens. or even that they always produce acceptable levels of performance. . More ambitious are attempts to build a formal cognitive language-for example.. and identify pragmatic reasoning situations.. This means that there is a need for cognitive engineering to understand where. 1985.g. human. as well as the language of particular computational mechanisms. Roth and Woods. To achieve the goal of enhanced performance. 1980. Basic concepts are confirmed only when they generate treatments (aiding either on-line or off-line) that make a difference in the target world. Holyoak. how.. and Oliver (1986). that by Coombs and Hartley (1987) through their work on coherence in model generative reasoning. after Cheng and Holyoak (1985) and Cheng.g. it is about changing behavior/performance in that world.COGNITIVE ENGINEERING August 1988-419 The question is not why people failed to accept a useful technology but. but only that understanding how they function in the initial cognitive environment is a starting point to develop truly effective support systems (e. and humanplus-machine problem solving breaks down in natural problem-solving habitats.. If each world is seen as completely unique and must be investigated tabula rasa. 1988).. the cognitive language must be able to escape the language of particular worlds. Nisbett. about Improved Performance Cognitive engineering is not merely about the contents of a world. restricting it to cases in which the consequences of poor performance are extremely high. Cheng's concepts about human deductive reasoning (Cheng and Holyoak. The buggy knowledge approach provides a specification of the knowledge structures (e. 1983).g. If this were the case. rather. Brown and Burton. 
These reasoning situations are abstract relative to the language of the particular application in question and therefore transportable across worlds. This is not to say that the strategies developed to cope with the original task demands are always optimal.. incomplete or erroneous knowledge) that are postulated to produce the pattern of errors and correct responses that characterize the performance of particular individuals. 1986) generated treatments that produced very large performance changes both absolutely and relative to the history of rather ineffectual alternative treatments to human biases in deductive reasoning. but they are also pragmatic in the sense that the reasoning involves knowledge of the things being reasoned about. cognitive engineering must first identify the sources of error that impair the per- To achieve relevance to specific worlds and generalizability across worlds. Brown and VanLehn. This is both a practical consideration-improving performance or reducing errors justifies the investment from the point of view of the world in question-and a theoretical consideration-the ability to produce practical changes in performance is the cri terion for demonstrating an understanding of the factors involved. and why machine. incomplete. are vulnerable to myopia. how the original alarm system supported and the new system failed to support operator strategies to cope with the world's problem-solving demands. on the other hand. or erroneous knowledge-is one source of error (e. Cheng et aI. 1978. Semantic approaches. then cognitive engineering can be no more than a set of techniques that are used to investigate every world anew.
g.. Vye. Ferrara.420-August 1988 HUMAN FACTORS tutes a "theory" of what these individuals "know" (including misconceptions). Training has to be about more than simply student knowledge acquisition. Perkins and Martin.g. Cheng et aI. and Lichtenstein (1979) and Kruglanski. Fischhoff. 1983. Sherwood.. and Rieser. Does he or she know that it is relevant to the problem at hand.. This is the issue of expression of knowledge. The general conclusion of studies on the problem of inert knowledge is that possession of the relevant domain knowledge or strategies by themselves is not sufficient to ensure that this knowledge will be accessed in new contexts.. it must also enhance the expression of knowledge by conditionalizing knowledge to its use via information about "triggering conditions" and constraints (Glaser. Similarly. The critical question is not to show that the problem solver possesses domain knowledge. Friedland. Bransford. From the point of view of computational power. and Simon. 1985). Off-line training experiences need to promote an understanding of how concepts and procedures can function as tools for solving relevant problems (Bransford et aI.. representativeness) were greatly reduced or eliminated when aspects . people will fail to apply a recently learned problem-solution strategy to an isomorphic problem (see also Kotovsky. 1986). for examples). Slovic. Cheng et aI. 1986. Thus the fact that people possess relevant knowledge does not guarantee that this knowledge will be activated when needed. human performance may be how knowledge is activated and utilized in the actual problem-solving environment (e .. In contrast. and does he or she know how to utilize this knowledge in problem solving? Studies of education and training often show that students successfully acquire knowledge that is potentially relevant to solving domain problems but that they often fail to exhibit skilled performance-for example. 
Education and training tend to assume that if a person can be shown to possess a piece of knowledge in any circumstance. 1986. The question concerns not merely whether the problem solver knows some particular piece of domain knowledge. This has been called the problem of inert knowledge-knowledge that is accessed only in a restricted set of contexts. Hayes. a variety of research has revealed dissociation effects whereby knowledge accessed in one context remains but rather the more stringent criterion that situation-relevant knowledge is accessible under the conditions in which the task is performed. 1986. The fact that people possess relevant knowledge does not guarantee that that knowledge will be activated and utilized when needed in the actual problem-solving environment. and Campione. 1986). Gick and Holyoak (1980) found that unless explicitly prompted. 1986). on-line representations of the world can help or hinder problem solvers in recognizing what information or strategies are relevant to the problem at hand (Woods. then this knowledge should be accessible under all conditions in which it might be useful. For example. 1986. 1984). But the more critical question for effective inert in another (Bransford et aI. For example. Brown and Campione. such as the relationship between two entities. differences in solving mathematical exercises versus word problems (see Gentner and Stevens. and Farkash (1984) found that judgmental biases (e. Brown. Bransford. 1986). 1983. Human performance aiding then focuses on providing missing knowledge and correcting the knowledge bugs. more knowledge and more correct knowledge can be embodied and delivered in the form of a rule-based expert system following a knowledge acquisition phase that determines the fine-grained domain knowledge.
1983). and human users of the interface. Notice that the application world (what the interface is used for) is deemphasized. Kieras and Polson... From one point of view the computer program being executed is the end application of concern. one often speaks of the interface. a medium through which agents come to know and act on the world-troubleshooting electronic devices (Davis. plan generation. Mancini. Therefore we will generally refer to people as domain agents or actors or problem solvers and not as users. situation assessment. Mitchell and Saisi. word processors for document preparation or copying machines for duplicating material). logistic maintenance systems. In this case. It is in designing interfaces and aids for . and practitioner intentions (Wiecha and Henrion. and plan monitoring and adaptation are significantly more complex. The human is not a passive user of a computer program but is an active problem-solver in some world. air traffic control systems. and Pople. medical diagnosis) in which problem formulation.e. Issues of concern include designing for learnability (e. Woods and Roth. system state.. Brown and Newman.COGNITIVE ENGINEERING August 1988-421 of the situation cued the relevance of statistical information and reasoning. The interface is an external representation of an application world.g. replace word 1 with word 2) and the steps required to accomplish them are relatively straightforward.. For example. there are many decision-making and supervisory environments (e. 1987. The bulk of work on humancomputer interaction takes this perspective.g. Lesh. 1983. 1984. 1986... Woods. 1987). Research on person-computer interaction has typically dealt with office applications (e. Mancini. 1988)? . Miyata and Norman. and composition.. in which the goals to be accomplished (e.. text-editing tasks are performed only in some larger context such as transcription. that is. 1986. and Woods. 1986).g. Shneiderman. 
about Systems Cognitive engineering is about systems. 1986).. 1987). Tasks are properties of the world in question. 1987. In part. 1986)? What representations cue people about the knowledge that is relevant to the current context of goals. 1986.. These applications fall at one extreme of the cognitive complexity space. and Hollnagel.g. data entry. 1983. Stefik et aI. One source of tremendous confusion has been an inability to clearly define the "systems" of interest. 1986). demands) is affected by the design of the external representation (e.. in press. In contrast. Perkins and Martin. managing power distribution networks. although performance of these fundamental tasks (i. Roth.g. 1985. The challenge for cognitive engineering is to study and develop ways to enhance the expression of knowledge and to avoid inert knowledge. Carroll and Carrithers. the tasks performed within the syntax of the interface. 1985). 1985) and designing for ease and pleasureableness of use (Malone. the difference in the foregoing views can be traced to differences in the cognitive complexity of the domain task being supported. aircraft and helicopter flight decks (Pew et aI. Fischhoff et aI. Norman. and command and control of a battlefield (e.g. process control accident response (Woods. Rasmussen. 1987. 1984. Gadd and Pople. medical diagnosis (Cohen et aI.. Thus one dimension along which representations vary is their ability to provide prompts to the knowledge relevant in a given context. managing data communication networks. 1987). military situation assessment. What training content and experiences are necessary to develop condi tionalized knowledge (Glaser. goal definition.. A second perspective is to distinguish the interface from the application world (Hollnagel.
. and Roberts. Because of the expansions in computational powers. Mancini. When a system includes these machine agents. If this stage of technology introduction is mishandled (for example. nor should the focus of interest in evaluation be the performance of the machine problem solver alone. et ai. The "system" of interest in design should not be the machine problem solver per se. command and control. Schum. One metaphor that is often invoked to frame questions about the relationship between human and machine intelligence is to examine human-human relationships in multi person problem-solving or advisory situations and then to transpose the results to human-intelligent machine interaction (e. e. Following this metaphor leads Muir (1987) to raise the question of the role of "trust" between man and machine in effective performance. Coombs and Alty. Cognitive engineering must be able to address systems with multiple cognitive agents.. This means that factors about how new technology is introduced into the work environment can playa critical role in building or undermining trust in the machine problem solver. 1980). the human role is not eliminated but shifted. 1986.422-August 1988 HUMAN FACTORS these applications that it is essential to distinguish the world to be acted on from the interface or window on the world (how one comes to know that world). March and WeisingerBaylon. 1986... and the design goal is to maximize overall performance. and Woods. This means that changes in automation are changes in the joint human-machine cognitive system. and from agents who can act directly or indirectly on the world.. the machine element can be thought of as a partially autonomous cognitive agent in its own right. One provocative question that Muir's analysis generates is. data communication networks). This applies to multiple human cognitive systems (often called distributed decision making). leading to excessive trust or mistrust. 
Ultimately the focus must be the design and the performance of the human-machine problem-solving ensemble -how to "couple" human intelligence and machine power in a single integrated system that maximizes overall performance.g.g. . Not only do we need to be clear about where systemic boundaries are drawn with respect to the application world and interfaces to or representations of the world. 1984). Rochlin. how does the level of trust between human and machine problem solvers affect performance? The practitioner's judgment of machine competence or predictability can be miscalibrated. La Porte. in press). joint cognitive systems (Hollnagel. Trust or mistrust is based on cumulative experience with the other agent that provides evidence about enduring characteristics of the agent such as competence and predictability. 1986. This raises the problem of how to build a cognitive system that combines both human and machine cog- nitive systems or. Either a system will be underutilized or ignored when it could provide effective assistance or the practitioner will defer to the machine even in areas that challenge or exceed the machine's range of competence.. Fischhoff.. (Fischhoff. we also need to be clear about the different agents who can act directly or indirectly on the world.g. practitioners are exposed to the system before it . Mancini et aI. Another question concerns how trust is established between human and machine. process control. 1986. in press. about Multiple Cognitive Agents A large number of the worlds that cognitive engineering should be able to address contain multiple agents who can act on the world in question (e. in other words.
COGNITIVE ENGINEERING August 1988-423

People are tool builders and tool users. In this first metaphor, machines are tools, and the question of the relationship between machine and human takes the form of: what kind of tool is an intelligent machine (e.g., Ehn and Kyng, 1984; Suchman, 1987)? Technological development has moved from physical tools (tools that magnify the capacity for physical work) to perceptual tools (extensions to the human perceptual apparatus, such as medical imaging) and now, with the arrival of AI technology, to cognitive tools. (Although this type of tool has a much longer history, e.g., aide-memoires or decision analysis, AI has certainly increased the interest in and the ability to provide cognitive tools.) At one extreme, the machine can be a prosthesis that compensates for a deficiency in human reasoning or problem solving. This could be a local deficiency for the population of expected human practitioners or a global weakness in human reasoning. At the other extreme, the machine can be an instrument in the hands of a fundamentally competent but limited-resource human practitioner (Woods, 1986).

A second metaphor that is frequently invoked is supervisory control (Rasmussen, 1986; Sheridan and Hennessy, 1984). Here the machine element is thought of as a semiautonomous cognitive system, albeit a lower-order subordinate: the human supervisor generally has a wider range of responsibility, and he or she possesses ultimate responsibility and authority. Boy (1987) uses this metaphor to guide the development of assistant systems built from AI technology. In order for a supervisory control architecture between human and machine agents to function effectively, several requirements must be met that, as Woods (1986) has pointed out, are often overlooked when tool-driven constraints dominate design. First, the supervisor must have real as well as titular authority; machine problem solvers can be designed and introduced in such a way that the human retains the responsibility for outcomes without any effective authority. Second, the supervisor must be able to redirect the lower-order, albeit partially autonomous, machine cognitive system. Roth et al. (1987) found that some practitioners tried to devise ways to instruct an expert system in situations in which the machine's problem solving had broken down, even when the machine's designer had provided no such mechanisms. Third, in order to be able to supervise another agent, there is need for a common or shared representation of the state of the world and of the state of the problem-solving process; otherwise communication between the agents will break down (Woods and Roth, 1988a). Significant attention has been devoted to the issue of how to get intelligent machines to assess the goals and intentions of humans without requiring explicit statements (e.g., Allen and Perrault, 1980; Suchman, 1987), but the supervisory control metaphor highlights that it is at least as important to pay attention to what information or knowledge people need to track the intelligent machine's "state of mind" (Woods and Roth, 1988a). Effective supervision also depends on trust: if the machine's problem solving breaks down (e.g., before the system is adequately debugged), the practitioner's trust in the machine's competence can be undermined (Muir, 1987). Muir's analysis shows how variations in explanation and display facilities affect how the person will use the machine, by affecting his or her ability to see how the machine works and therefore his or her level of calibration. Muir also points out how human information-processing biases can affect how the evidence of experience is interpreted in the calibration process.

A third metaphor is to consider the new machine capabilities as extensions and expansions along a dimension of machine power. In this view the machine aids the practitioner by providing increased or new kinds of resources (either knowledge resources or processing resources such as an expanded field of attention). The extra resources may support improved performance in several ways. One path is to off-load overhead information-processing activities from the person to the machine to allow the human practitioner to focus his or her resources on "higher-level" issues and strategies. Examples include keeping track of multiple ongoing activities in an external memory, performing basic data computations or transformations, and collecting the evidence related to decisions about particular domain issues. Extra resources may help to improve performance in another way, by allowing a restructuring of how the human performs the task; new levels of performance then become possible, shifting performance onto a new higher plateau. This restructuring concept is in contrast to the usual notion of new systems as amplifiers of user capabilities. As Pea (1985) points out, the amplification metaphor implies that support systems improve human performance by increasing the strength or power of the cognitive processes the problem solver goes through to solve the problem, but without any change in the underlying activities, processes, or strategies that carry out the cognitive functions relevant to performing domain tasks.

The instrumental perspective suggests instead that the most effective power provided by good cognitive tools is conceptualization power (Woods and Roth, 1988b). Support systems that increase conceptualization power (1) enhance a problem solver's ability to experiment with possible worlds or strategies (e.g., Coombs and Hartley, 1987; Hollan, Hutchins, and Weitzman, 1984; Pople, 1985); (2) enhance his or her ability to visualize or to make concrete the abstract and uninspectable (analogous to perceptual tools) in order to better see the implications of a concept and to help one restructure one's view of the problem (Becker and Cleveland, 1984; Hutchins, Hollan, and Norman, 1985); and (3) enhance error detection by providing better feedback about the effects/results of actions (Rizzo, Bagnara, and Visciola, 1987). The importance of conceptualization power in effective problem-solving performance is often overlooked because the part of the problem-solving process that it most crucially affects, problem formulation and reformulation, is often left out of studies of problem solving and of the design basis of new support systems.

Problem-Driven

Cognitive engineering is problem-driven, not tool-constrained. This means that cognitive engineering must be able to analyze a problem-solving context and understand the cognitive problems to be solved or challenges to be met, the different cognitive demands imposed by the application world in question, the resources provided (or not provided) by new performance aids and interface systems, and the sources of both good and poor performance. When new support systems restructure a task, the cognitive problems to be solved and the kinds of errors one is prone to (and therefore the consequences of errors) change as well, as occurred recently with new computer-based displays in nuclear power plant control rooms. To build a cognitive description of a problem-solving world, one must understand how representations of the world interact with the activities, processes, or strategies that determine how the problem is solved, with the characteristics of the cognitive agents, and with existing and prospective changes in the world.
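The off-loading path described above (keeping track of multiple ongoing activities in an external memory so that the practitioner can attend to higher-level strategy) can be illustrated with a minimal sketch. The class, method names, and sample activities below are our own illustrative inventions, not part of any system discussed in this paper; the sketch simply shows the bookkeeping a machine partner might absorb.

```python
# Hypothetical sketch (not from the paper): an "external memory" that
# off-loads the bookkeeping of multiple ongoing activities from the
# human practitioner to the machine.
from dataclasses import dataclass, field

@dataclass
class Activity:
    name: str
    status: str = "in progress"   # "in progress", "suspended", or "done"
    notes: list = field(default_factory=list)

class ExternalMemory:
    """Tracks parallel activities so the practitioner need not hold them in mind."""
    def __init__(self):
        self._activities = {}

    def start(self, name):
        self._activities[name] = Activity(name)

    def suspend(self, name, note=""):
        act = self._activities[name]
        act.status = "suspended"
        if note:
            act.notes.append(note)   # context needed to resume the activity later

    def resume(self, name):
        self._activities[name].status = "in progress"

    def complete(self, name):
        self._activities[name].status = "done"

    def pending(self):
        """Everything not yet finished: the practitioner's open obligations."""
        return [a.name for a in self._activities.values() if a.status != "done"]

# Illustrative use with invented activity names:
mem = ExternalMemory()
mem.start("verify auxiliary feedwater flow")
mem.start("monitor steam generator level")
mem.suspend("verify auxiliary feedwater flow", note="waiting on pump start")
mem.complete("monitor steam generator level")
print(mem.pending())   # -> ['verify auxiliary feedwater flow']
```

The design point is that the suspended activity, and the note needed to resume it, survive outside the practitioner's working memory; this is the overhead reduction the text describes, quite apart from any restructuring of the task itself.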
1988). 1988.g.COGNITIVE ENGINEERING August 1988-425 world. it was possible to redesign the system to support domain actors and eliminate the getting-lost problems (Elm and Woods. Because cognitive engineering issues were not considered in the application of the new technology. Mitchell and Saisi. Gruber and Cohen. Techniques for analyzing cognitive demands not only help characterize a particular world but also help to build a repertoire of general cognitive situations that are transportable. the language of implementation is used as a cognitive language. 1987. representation aids. AN EXAMPLE OF HOW COGNITIVE AND COMPUTATIONAL TECHNOLOGIES INTERACT To illustrate the role of cognitive engineering in the deployment of new computational powers. 1988. The results of this process then can be deployed in many possible ways as constrained by tool-building limitations and tool-building possibili ties-explora tion training worlds. The results from this analysis are used to define the kind of solutions that are needed to enhance successful performance-to meet cognitive demands of the world. McCraken. for the most part. Woods and Roth. There is a clear trend toward this conception of knowledge acquisition in order to achieve more effective decision support and fewer brittle machine problem solvers (e. to eliminate or mitigate error-prone points in the total cognitive system (demand-resource mismatches). Acquiring and using a domain semantics is essential to avoiding potential errors and specifying performance boundaries when building "intelligent" machines (Roth et aI. 1985. Roth and Woods. The alternative is to provide an umbrella structure of domain semantics that organizes and makes explicit what particular pieces of knowledge mean about problem solving in the domain (Woods. 1984).. Building of a cognitive description is part of a problem-driven approach to the application of computational power. Based on a cognitive analysis of the world's demands. 
to help the human function more expertly. 1987). In this case an attempt was made to implement a computerized procedure system using a commercial hypertext system for building and navigating large network data bases. The system was built based on a network data base "shell" with a builtin interface for navigating the network (Robertson. a highlevel person-machine performance problem resulted-the" getting lost" phenomenon (Woods. advisory systems. and Newell. Coombs. knowledge acquisi tion focuses on describing domain knowledge in terms of the syntax of computational mechanisms-that is. 1981). It is one example of how purely technologydriven deployment of new automation capabilities can produce unintended and unforeseen negative consequences. Clan- cey. 1988). so it was assumed possible to create the computerized system simply by . 1987. consider a case in human-computer interaction (for other cases see Mitchell and Forren. The data base application in question was designed to computerize paper-based instructions for nuclear power plant emergency operation. The shell aiready treated human-computer interface issues. 1986. Semantic questions are displaced either to whomever selects the computational mechanisms or to the domain expert who enters knowledge. In tool-driven approaches.. 1987). new information. Through cognitive engineering it proved possible to build a more effective computerized procedure system that. was within the technological boundaries set by the original technology chosen for the application. 1985). Woods and Roth. or machine problem solvers (see Roth and Woods. 1988).
Many activities that are inherently easy to perform in a physical book turned out to be very difficult to carry outfor example. Operators needed to maintain awareness of the global context (i. 1984). there was a very high proportion of menu frames relative to content frames. to anticipate the need for actions by looking ahead.e. The result was a mismatch between user information-processing activities during domain tasks and the structure of the interface as a representation of the world of recovery from abnormalities. 1982. reading ahead. or unable even to find places where they knew they should be (i. rather. The system contained two kinds of frames: menu frames. Because of the size of a frame. and the content frames provided a narrow window on the world.. step-by-step execution of instructions. Second. As a result. which served to point to other frames.e. unable to determine where they were in the network of frames. Following procedures was not simply a matter of linear. to see the organization of the steps.e. What was the source of the disorientation problem? It resulted from a failure to analyze the cognitive demands associated with using procedures in an externally paced world. yet could not find the relevant procedural steps in the network). Roth et aI. First. rather than sequentially." unable to keep procedure steps in pace with plant behavior. which contained instructions from the paper procedures (generally one procedure step per frame). unable to decide where to go next. they diagnosed the situation. the current instructions as implemented for the paper medium) into the interface and network data base framework provided by the shell. Transition options were presented at several grains of analysis to support moves from step to step as easily as moves across larger units in the structure of the procedure system. knew the appropriate responses as trained operators. a spatial metaphor was used to make the system more like a physical book. 
how a given step fits into the overall plan). In preliminary tests of the system it was found that people uniformly failed to complete recovery tasks with procedures computerized in this way. in using the procedures the operator often is required to interrupt one activity and transition to another step in the procedure or to a different procedure depending on plant conditions and plant responses to operator actions. display selection/movement options were presented in parallel. with procedural information (defining two types of windows in the interface)... to mark steps pending completion and return to them easily. and content frames. In addition. incomplete steps were automati- . or to mark a "trail" of activities carried out during the recovery. operators need to be able to rapidly transition across procedure boundaries and to return to incomplete steps. 1987). and to monitor for changes in plant state that would require adaptation of the current response plan.426-August 1988 HUMAN FACTORS entering domain knowledge (i. A variety of cognitive engineering techniques were utilized in a new interface design to support the demands (see Woods.. This structure made it difficult to read ahead to anticipate instructions. These results were found with people experienced with the paper-based procedures and plant operations as well as with people knowledgeable in the frame-network software package and how the procedures were implemented within it. For example. it required the ability to maintain a broad context of the purpose and relationships among the elements in the procedure (see also Brown et aI.. They became disoriented or "lost. These results triggered a full design cycle that began with a cognitive analysis to determine the user information-handling activities needed to effectively accomplish recovery tasks in emergency situations.
and those steps were made directly accessible (e. Repair theory: A generative theory of bugs in procedural skills... and Campione. A. J. To provide the global context within which the current procedure step occurs. (1987). K. Brown. S. Boy.. M. J. and Burton. R. Woods. there can be severe penalties for failing to adequately map the cognitive demands of the environment. American Psychologist. J. S. 27. and understanding. and Rieser. and E. C. P.). 41. Woods. International Journal of Man-Machine Studies. 15.. T. J. Note how the cognitive analysis of the domain defined what types of data needed to be seen effectively in parallel. (in press). A. (1985). and VanLehn. Palo Alto.. N. then a variety of techniques can be employed to build support systems for those functions. Markman (Eds. C. J. Also in G. A. 41... (1980). Report). Mancini. As the foregoing example saliently demonstrates. and Campione.. 1078-1089. Psychological theory and the study of learning disabilities. Vye. H.. Cognitive engineering in dynamic worlds. 1. S. directly implementable within the base capabilities of the interface shell.. G. (1983).. Brown. if we understand the cognitive requirements imposed by the domain. Moran. Diagnostic models for procedural bugs in basic mathematics. In this paper we have attempted to outline the questions that need to be answered to make this promise real and to point to research that already has begun to provide the necessary concepts and techniques. (1984). 359-391.. Sherwood. 4. or are we condemned to simply build what can be built practically and wait for the judgment of experience? Is principle-driven design possible? A vigorous and viable cognitive engineering can provide the knowledge and tech- niques necessary for principle-driven design. which then determined the number of windows required. S. Ferrara. 1984). . Carmichael's manual of child psychology. Murray Hill. CA: Xerox Palo Alto Research Center. D. Teaching and problem solving: Research foundations. 
the step of interest is presented in detail and is embedded in a skeletal structure of the larger response plan of which it is a part (Furnas. Brushing the scatlerplot matrix: High-interaction graphical methods for analyzing multidimensional data (Tech. C. SUMMARY The problem of providing effective decision support hinges on how the designer decides what will be useful in a particular application. D. D. 1986. In J. R. Flavell and E. Also note that the cognitive engineering redesign was. REFERENCES Allen. 541-554. Cognitive engineering does this by providing the basis for a problem-driven rather than a technology-driven approach whereby the requirements and bottlenecks in cognitive task performance drive the development of tools to support the human problem solver. Can researchers provide designers with concepts and techniques to determine what will be useful support systems.1059-1068. A.143-178. and Perrault. M.COGNITIVE ENGINEERING August 1988-427 cally tracked. (1978). Becker.. Report). (1982). ArtificialIntelligence.). R. However. and Williams. L. E. The semantics of procedures (Tech. Brown. Context sensitivity was supported by displaying the rules for possible adaptation or shifts in the current response plan that are relevant to the current context in parallel with current relevant options and the current region of interest in the procedures (a third window). Bransford. (1986). 5.. New York: Wiley. J. J. Cognitive Science. Analyzing intention in utterances. Operator assistant systems. Learning. Issues in cognitive and social ergonomics: From our house to Bauhaus. electronic bookmarks or placeholders). W. Brown. R. and Newman. American Psychologist. NJ: AT&T Bell Laboratories. 2. 379-426. (1986). L. J. J. A.. Hollnagel (Eds. Brown. remembering. (1980). 5. J. with a few exceptions. R. Human-Computer Interaction. London: Academic Press. 155-192. and Cleveland. Cognitive Science. 
Cognitive engineering can address (a) existing cognitive systems in order to identify deficiencies that cognitive system redesign can correct and (b) prospective cognitive systems as a design tool during the allocation of cognitive tasks and the development of an effective joint architecture. Brown.g. Bransford.
Carroll, J. M., and Carrithers, C. (1984). Training wheels in a user interface. Communications of the ACM, 27, 800-806.

Cheng, P. W., and Holyoak, K. J. (1985). Pragmatic reasoning schemas. Cognitive Psychology, 17, 391-416.

Cheng, P. W., Holyoak, K. J., Nisbett, R. E., and Oliver, L. M. (1986). Pragmatic versus syntactic approaches to training deductive reasoning. Cognitive Psychology, 18, 293-328.

Clancey, W. J. (1985). Heuristic classification. Artificial Intelligence, 27, 289-350.

Cohen, P., Day, D., Delisio, J., Greenberg, M., Kjeldsen, R., and Suthers, D. (1987). Management of uncertainty in medicine. In Proceedings of the IEEE Conference on Computers and Communications. New York: IEEE.

Coombs, M., and Alty, J. (1984). Expert systems: An alternative paradigm. International Journal of Man-Machine Studies, 20, 21-43.

Coombs, M., and Hartley, R. (1987). The MGR algorithm and its application to the generation of explanations for novel events. International Journal of Man-Machine Studies, 27, 679-708. Also in G. Mancini, D. D. Woods, and E. Hollnagel (Eds.), Cognitive engineering in dynamic worlds. London: Academic Press.

Davis, R. (1983). Reasoning from first principles in electronic troubleshooting. International Journal of Man-Machine Studies, 19, 403-423.

De Keyser, V. (1986). Les interactions hommes-machine: Caracteristiques et utilisations des differents supports d'information par les operateurs (Person-machine interaction: How operators use different information channels). Rapport Politique Scientifique/FAST. Liege, Belgium: Psychologie du Travail, Universite de l'Etat a Liege.

Dennett, D. (1982). Beyond belief. In A. Woodfield (Ed.), Thought and object. Oxford: Clarendon Press.

Dorner, D. (1983). Heuristics and cognition in complex systems. In R. Groner, M. Groner, and W. Bischof (Eds.), Methods of heuristics. Hillsdale, NJ: Erlbaum.

Ehn, P., and Kyng, M. (1984). A tool perspective on design of interactive computer support for skilled workers (Tech. Report). Stockholm: Swedish Center for Working Life.

Elm, W. C., and Woods, D. D. (1985). Getting lost: A case study in interface design. In Proceedings of the Human Factors Society 29th Annual Meeting (pp. 927-931). Santa Monica, CA: Human Factors Society.

Fischhoff, B. (1986). Decision making in complex systems. In E. Hollnagel, G. Mancini, and D. D. Woods (Eds.), Intelligent decision support in process environments. New York: Springer-Verlag.

Fischhoff, B., Slovic, P., and Lichtenstein, S. (1979). Improving intuitive judgment by subjective sensitivity analysis. Organizational Behavior and Human Performance, 23, 339-359.

Funder, D. C. (1987). Errors and mistakes: Evaluating the accuracy of social judgments. Psychological Bulletin, 101, 75-90.

Furnas, G. W. (1986). Generalized fisheye views. In M. Mantei and P. Orbeton (Eds.), Human factors in computing systems: CHI'86 Conference Proceedings (pp. 16-23). New York: ACM/SIGCHI.

Gadd, C. S., and Pople, H. E. (1987). An interpretation synthesis model of medical teaching rounds discourse: Implications for expert system interaction. International Journal of Educational Research.

Gentner, D., and Stevens, A. (Eds.). (1983). Mental models. Hillsdale, NJ: Erlbaum.

Gibson, J. J. (1979). The ecological approach to visual perception. Boston: Houghton Mifflin.

Gick, M., and Holyoak, K. (1980). Analogical problem solving. Cognitive Psychology, 12, 306-365.

Glaser, R. (1984). Education and thinking: The role of knowledge. American Psychologist, 39, 93-104.

Gruber, T., and Cohen, P. (1987). Design for acquisition: Principles of knowledge system design to facilitate knowledge acquisition (special issue on knowledge acquisition for knowledge-based systems). International Journal of Man-Machine Studies, 26, 143-159.

Henderson, D. A., Jr., and Card, S. K. (1986). Rooms: The use of multiple virtual workspaces to reduce space contention in a window-based graphical user interface (Tech. Report). Palo Alto, CA: Xerox Palo Alto Research Center.

Hirschhorn, L. (1984). Beyond mechanization: Work and technology in a postindustrial age. Cambridge, MA: MIT Press.

Hollan, J. D., Hutchins, E. L., and Weitzman, L. (1984). Steamer: An interactive inspectable simulation-based training system. AI Magazine, 5, 15-27.

Hollnagel, E., Mancini, G., and Woods, D. D. (Eds.). (1986). Intelligent decision support in process environments. New York: Springer-Verlag.

Hollnagel, E., and Woods, D. D. (1983). Cognitive systems engineering: New wine in new bottles. International Journal of Man-Machine Studies, 18, 583-600.

Hoogovens Report. (1976). Human factors evaluation: Hoogovens No. 2 hot strip mill (Tech. Report FR251). London: British Steel Corporation/Hoogovens.

Hutchins, E. L., Hollan, J. D., and Norman, D. A. (1985). Direct manipulation interfaces. Human-Computer Interaction, 1, 311-338. Also in D. A. Norman and S. W. Draper (Eds.), User-centered system design: New perspectives on human-computer interaction. Hillsdale, NJ: Erlbaum.

James, W. (1890). The principles of psychology. New York: Holt.

Janvier, C. (Ed.). (1987). Problems of representation in the teaching and learning of mathematics. Hillsdale, NJ: Erlbaum.

Kieras, D. E., and Polson, P. G. (1985). An approach to the formal analysis of user complexity. International Journal of Man-Machine Studies, 22, 365-394.

Klein, G. A. (in press). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine systems research (vol. 5). Greenwich, CT: JAI Press.

Kotovsky, K., Hayes, J. R., and Simon, H. A. (1985). Why are some problems hard? Evidence from Tower of Hanoi. Cognitive Psychology, 17, 248-294.

Kruglanski, A. W., Friedland, N., and Farkash, E. (1984). Lay persons' sensitivity to statistical information: The case of high perceived applicability. Journal of Personality and Social Psychology, 46, 503-518.

Lanir, Z. (1983). Military risk taking and modern C3I (Tech. Report). Eugene, OR: Decision Research.

Malone, T. W. (1983). How do people organize their desks: Implications for designing office automation systems. ACM Transactions on Office Information Systems, 1, 99-112.

Mancini, G., Woods, D. D., and Hollnagel, E. (Eds.). (1987). Cognitive engineering in dynamic worlds (special issue of International Journal of Man-Machine Studies, vol. 27). London: Academic Press.

March, J. G., and Weisinger-Baylon, R. (Eds.). (1986). Ambiguity and command. Marshfield, MA: Pitman Publishing.

McKendree, J., and Carroll, J. M. (1986). Advising roles of a computer consultant. In M. Mantei and P. Orbeton (Eds.), Human factors in computing systems: CHI'86 Conference Proceedings (pp. 35-40). New York: ACM/SIGCHI.

Miller, P. L. (1983). ATTENDING: Critiquing a physician's management plan. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-5, 449-461.

Mitchell, C. M., and Forren, M. G. (1987). Multimodal user input to supervisory control systems: Voice-augmented keyboard. IEEE Transactions on Systems, Man, and Cybernetics, SMC-17, 594-607.

Mitchell, C. M., and Saisi, D. (1987). Use of model-based qualitative icons and adaptive windows in workstations for supervisory control systems. IEEE Transactions on Systems, Man, and Cybernetics, SMC-17, 573-593.

Miyata, Y., and Norman, D. A. (1986). Psychological issues in support of multiple activities. In D. A. Norman and S. W. Draper (Eds.), User-centered system design: New perspectives on human-computer interaction. Hillsdale, NJ: Erlbaum.

Montmollin, M. de, and De Keyser, V. (1985). Expert logic vs. operator logic. In G. Johannsen, G. Mancini, and L. Martensson (Eds.), Analysis, design, and evaluation of man-machine systems. Ispra, Italy: CEC-JRC/IFAC.

Muir, B. M. (1987). Trust between humans and machines. International Journal of Man-Machine Studies, 27, 527-539. Also in G. Mancini, D. D. Woods, and E. Hollnagel (Eds.), Cognitive engineering in dynamic worlds. London: Academic Press.

Newell, A., and Card, S. K. (1985). The prospects for psychological science in human-computer interaction. Human-Computer Interaction, 1, 209-242.

Noble, D. (1984). Forces of production: A social history of industrial automation. New York: Alfred A. Knopf.

Norman, D. A. (1982). Steps towards a cognitive engineering: Design rules based on analyses of human error (Tech. Report). San Diego: University of California, Program in Cognitive Science.

Norman, D. A., and Draper, S. W. (Eds.). (1986). User-centered system design: New perspectives on human-computer interaction. Hillsdale, NJ: Erlbaum.

Pea, R. D. (1985). Beyond amplification: Using the computer to reorganize mental functioning. Educational Psychologist, 20, 167-182.

Perkins, D. N., and Martin, F. (1986). Fragile knowledge and neglected strategies in novice programmers. In E. Soloway and S. Iyengar (Eds.), Empirical studies of programmers. Norwood, NJ: Ablex.

Pew, R. W., et al. (1986). Cockpit automation technology (Tech. Report 6133). Cambridge, MA: BBN Laboratories Inc.

Pople, H., Jr. (1985). Evolution of an expert system: From Internist to Caduceus. In I. De Lotto and M. Stefanelli (Eds.), Artificial intelligence in medicine. New York: Elsevier Science Publishers.

Quinn, L., and Russell, D. M. (1986). Intelligent interfaces: User models and planners. In M. Mantei and P. Orbeton (Eds.), Human factors in computing systems: CHI'86 Conference Proceedings (pp. 314-320). New York: ACM/SIGCHI.

Rasmussen, J. (1986). Information processing and human-machine interaction: An approach to cognitive engineering. New York: North-Holland.

Rizzo, A., Bagnara, S., and Visciola, M. (1987). Human error detection processes. International Journal of Man-Machine Studies, 27, 555-570. Also in G. Mancini, D. D. Woods, and E. Hollnagel (Eds.), Cognitive engineering in dynamic worlds. London: Academic Press.

Robertson, G., McCracken, D., and Newell, A. (1981). The ZOG approach to man-machine communication. International Journal of Man-Machine Studies, 14, 461-488.

Rochlin, G. I., La Porte, T. R., and Roberts, K. H. (1987). The self-designing high-reliability organization: Aircraft carrier flight operations at sea. Naval War College Review.

Roth, E. M., Bennett, K. B., and Woods, D. D. (1987). Human interaction with an "intelligent" machine. International Journal of Man-Machine Studies, 27, 479-525. Also in G. Mancini, D. D. Woods, and E. Hollnagel (Eds.), Cognitive engineering in dynamic worlds. London: Academic Press.

Roth, E. M., and Woods, D. D. (1988). Aiding human performance: I. Cognitive analysis. Le Travail Humain, 51(1), 39-64.

Schum, D. (1980). Current developments in research on cascaded inference. In T. S. Wallsten (Ed.), Cognitive processes in decision and choice behavior. Hillsdale, NJ: Erlbaum.

Selfridge, O., Rissland, E., and Arbib, M. (Eds.). (1984). Adaptive control of ill-defined systems. New York: Plenum Press.

Sheridan, T., and Hennessy, R. (Eds.). (1984). Research and modeling of supervisory control behavior. Washington, DC: National Academy Press.

Shneiderman, B. (1986). Seven plus or minus two central issues in human-computer interaction. In M. Mantei and P. Orbeton (Eds.), Human factors in computing systems: CHI'86 Conference Proceedings (pp. 343-349). New York: ACM/SIGCHI.

Stefik, M., Foster, G., Bobrow, D., Kahn, K., Lanning, S., and Suchman, L. (1985, September). Beyond the chalkboard: Using computers to support collaboration and problem solving in meetings (Tech. Report). Palo Alto, CA: Intelligent Systems Laboratory, Xerox Palo Alto Research Center.

Suchman, L. (1987). Plans and situated actions: The problem of human-machine communication. Cambridge: Cambridge University Press.

Wiecha, C., and Henrion, M. (1987). A graphical tool for structuring and understanding quantitative decision models. In Proceedings of Workshop on Visual Languages. New York: IEEE Computer Society.

Wiener, E. L. (1985). Beyond the sterile cockpit. Human Factors, 27, 75-90.

Woods, D. D. (1984). Visual momentum: A concept to improve the cognitive coupling of person and computer. International Journal of Man-Machine Studies, 21, 229-244.

Woods, D. D. (1986). Paradigms for intelligent decision support. In E. Hollnagel, G. Mancini, and D. D. Woods (Eds.), Intelligent decision support in process environments. New York: Springer-Verlag.

Woods, D. D. (1988). Coping with complexity: The psychology of human behavior in complex systems. In L. P. Goodstein, H. B. Andersen, and S. E. Olsen (Eds.), Mental models, tasks and errors: A collection of essays to celebrate Jens Rasmussen's 60th birthday. London: Taylor and Francis.

Woods, D. D., and Hollnagel, E. (1987). Mapping cognitive demands in complex problem-solving worlds (special issue on knowledge acquisition for knowledge-based systems). International Journal of Man-Machine Studies, 26, 257-275.

Woods, D. D., and Roth, E. M. (1986). Models of cognitive behavior in nuclear power plant personnel (NUREG-CR-4532). Washington, DC: U.S. Nuclear Regulatory Commission.

Woods, D. D., and Roth, E. M. (1988a). Cognitive systems engineering. In M. Helander (Ed.), Handbook of human-computer interaction. New York: North-Holland.

Woods, D. D., and Roth, E. M. (1988b). Aiding human performance: II. From cognitive analysis to support systems. Le Travail Humain, 51, 139-172.

Woods, D. D., Roth, E. M., and Pople, H., Jr. (1987). Cognitive Environment Simulation: An artificial intelligence system for human performance assessment (NUREG-CR-4862). Washington, DC: U.S. Nuclear Regulatory Commission.