
IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS, VOL. 19, NO. 6, NOVEMBER/DECEMBER 1989

Human-Computer Interaction Using Eye-Gaze Input


THOMAS E. HUTCHINSON, K. PRESTON WHITE, JR., SENIOR MEMBER, IEEE, WORTHY N. MARTIN, KELLY C. REICHERT, AND LISA A. FREY

Abstract - Erica is a computer workstation with a unique user interface. The workstation is equipped with imaging hardware and software, which automatically record a digital portrait of the user's eye. From the features of the current portrait, the interface calculates the approximate location of the user's eye-gaze on the computer screen. The computer then executes commands associated with the menu option currently displayed at this screen location. In this way, the user can interact with the computer, run applications software, and manage peripheral devices, all simply by looking at an appropriate sequence of menu options displayed on the screen. The eye-gaze interface technology, its implementation in Erica, and its application as a prosthetic device are described.

I. INTRODUCTION

IMAGINE yourself the victim of a severely crippling accident. You can no longer move or talk. You're unable to write, to point, or even to nod your head. You communicate solely with your eyes, moving them up and down for yes, left and right for no. Your intellectual and creative abilities remain undiminished, in spite of your physical disabilities, yet your thoughts are trapped within you.

This scenario is disturbing for most of us even to imagine. Yet it is a living reality for many thousands of severely handicapped people, the result of an injury, a stroke, or a progressively disabling disease. Without voice or gesture at their disposal, these individuals lack an effective means to communicate with others, to control their environment, or to entertain and enrich their minds.

In this paper we describe a unique prosthetic device called the eye-gaze-response interface computer aid (Erica). Erica is a stand-alone workstation, based on a standard personal computer, specially adapted with imaging hardware and software. The user interface accepts input directly from the human eye. Menu options are displayed at different positions on the computer monitor.
Manuscript received September 20, 1988; revised April 20, 1989. This work was supported in part under a grant from the Virginia Center for Innovative Technology, and in part by the General Electric Corporation, IBM, and NYNEX. T. E. Hutchinson, K. P. White, W. N. Martin, and L. A. Frey are with the School of Engineering and Applied Science, University of Virginia, Charlottesville, VA 22901. K. C. Reichert was with the University of Virginia. She is now with E. I. DuPont de Nemours and Co., Inc., D4084-2, Wilmington, DE 19898. IEEE Log Number 8930357.

Simply by looking at a given position, the corresponding menu option is invoked. In this way, the disabled Erica user can interact with the computer, run communications and other applications software, and manage peripheral devices.

Although the underlying eye-gaze technology has many potential applications, the initial and abiding goal of the Erica project is to help the physically and vocally disabled to attain or regain some measure of independent communication and control. This population includes approximately 238,000 quadriplegics in the U.S. who require assistance to perform everyday functions [1]. For the many quadriplegics who retain at least some degree of motor control, Erica represents an attractive alternative to existing prosthetic communications systems [2]-[5]. The eye-gaze interface is less cumbersome and potentially faster and more functional than alternative input devices, such as body-activated switches and sip-and-puff mechanisms. Most importantly, for the estimated 150,000 severely motor-dysfunctional individuals who can control only the muscles of their eyes [1], the eye-gaze interface has no substitutes.

Erica was designed and developed by faculty and students in the School of Engineering and Applied Science at the University of Virginia [6]. The project began in 1984. Since its inception, over one hundred engineering undergraduate and graduate students have contributed their time and talents to help make Erica a reality. Erica has been in active use in the laboratory and in beta test sites for several years. The progress of the project has received extensive national press coverage, both on television and in the print media [7]. LC Technologies, Inc., of Fairfax, VA, currently manufactures and markets Erica I systems under a patent licensing agreement with the University and the Virginia Center for Innovative Technology (CIT). The first commercial Erica units were delivered in late 1988.

In the following section, we describe the standard Erica hardware system, explain how the system works, and outline the eye-gaze position detection algorithm. System operation and existing software applications are presented in Section III. In Section IV, we discuss some of the history of system development and describe selected test experiences. In the final sections, we consider the limitations of the current system and discuss continuing research and development directed toward overcoming these limitations.



Fig. 1. Laboratory configuration of the Erica hardware system.

Fig. 2. Erica light-source assembly (camera, light-condensing lens, ball-and-socket joint, IR-passing filter).
II. SYSTEM DESCRIPTION

A. Standard Hardware Configuration

The Erica hardware system consists of a personal computer with hard disk, a color monitor, a near-infrared light source, a light-tracking video surveillance camera with infrared-pass filter, and a light-intensity imaging board. A laboratory configuration of this basic set-up is shown in Fig. 1. Also shown are a printer and a second, smaller, black-and-white video monitor (located to the left of the color monitor), used in the laboratory for camera-image feedback.

The light source is a gallium-arsenide near-infrared, 880-nanometer light-emitting diode (LED), commonly used in communications systems. Fig. 2 shows the assembly used to position the LED in front of the camera lens. The LED is attached to one end of a short tube. A ball-and-socket joint connects the opposite end of this tube to the center of an infrared-pass filter. The filter mounts onto the front of the lens. The socket joint provides the two degrees of freedom required to position the LED so that it illuminates the user's face, and the filter reduces the amount of ambient light reaching the camera sensor. In addition, a light-condensing lens is mounted on a metal sheath, which slides along the tube. The distance from the lens to the LED can be varied from 0.5 to 2.0 cm, focusing the infrared light on the user's face. As can be seen in Fig. 1, the light-source/camera assembly is mounted directly under the color monitor.
B. How it Works

In operation, the light source illuminates the user's face with harmless near-infrared light, as shown in Fig. 3. The direction of the user's eye-gaze is determined from the video-camera image of this light, reflected from one of the user's eyes. The eye image has two significant features used in eye-gaze position detection, as shown in Fig. 4.

The glint: A fraction of the infrared light is reflected off the corneal surface. This is the first Purkinje image of the LED and appears in the camera as a small, intense area of infrared light, called the glint. As long as the user's head remains stationary relative to the camera, the glint position remains fixed in the image field.

The bright eye: A fraction of the infrared light enters the pupil and is reflected off the retina. This illuminated image of the pupil is called the bright eye (a reflection of infrared light from the human retina, similar to the reflection of visible light from a cat's eye at night). The bright eye appears in the camera as an area of infrared light, larger and less intense than the glint, but more intense than the dark image of the surrounding iris. The position of the bright eye moves in the camera image field, following the motion of the eye.

The direction of the user's eye-gaze can be determined from the relative positions of the bright eye and glint in the camera image. When the direction of the eye-gaze coincides with the optical axis of the camera (i.e., when the user looks directly at the light source), the center of the bright eye coincides with the center of the glint. Because the light-source/camera assembly is located immediately beneath the computer monitor, the bright-eye center is raised relative to the glint whenever the user looks at the display screen. Fig. 5 shows an example of the illuminated eye and Fig. 6 the spatial relationship between the bright-eye and glint centers for various eye-gaze positions on the display screen.

In order to measure the vector relationship between the bright-eye and glint centers, an image of the eye is passed to the computer. The current image-processing board records the camera image in a 512 x 480 pixel frame at a rate of 30 frames per second. The frame contains the digitized light-level intensity of each of the pixels in the eye image. The frame is stored in the physical memory of the imaging board, where the host computer can access one-fourth of the frame at a time. Fig. 7 shows a digitized image after it is processed by the imaging board and the host computer software.
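The relationship between the two feature centers reduces to a simple vector computation. The short Python sketch below is our illustration of the idea, not part of the Erica software; the type and function names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class EyeFeatures:
    glint: tuple       # (x, y) center of the corneal glint, in pixels
    bright_eye: tuple  # (x, y) center of the bright-eye (pupil) image

def gaze_offset(f: EyeFeatures) -> tuple:
    """Vector from the glint center to the bright-eye center.

    When the user looks directly at the light source (mounted on the
    camera lens), this vector is approximately (0, 0). Looking at the
    screen above the camera displaces the bright-eye center relative
    to the glint, and the offset indicates the gaze direction.
    """
    gx, gy = f.glint
    bx, by = f.bright_eye
    return (bx - gx, by - gy)
```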

C. Eye-gaze Position Detection Algorithm


In its present state of development, Erica can distinguish among as many as nine reference areas, or menu boxes, arranged in a 3 x 3 matrix on the display screen. This is accomplished by an eye-gaze position detection algorithm that 1) extracts the horizontal and vertical Cartesian coordinates of the glint center and the bright-eye center from the frame representing the current eye-gaze position and 2) maps these coordinate locations into the stored reference ranges corresponding to one of the nine menu boxes. A typical screen layout is shown in Fig. 8.
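As a minimal illustration of step 2), the fragment below maps an estimated on-screen gaze point to one of the nine menu boxes. The screen dimensions are placeholders; Erica's actual reference ranges come from the calibration described later in this section.

```python
def menu_box(gaze_x: float, gaze_y: float,
             screen_w: int = 640, screen_h: int = 480) -> int:
    """Map an on-screen gaze estimate to a 3 x 3 menu-box index (0-8, row-major)."""
    col = min(int(3 * gaze_x / screen_w), 2)
    row = min(int(3 * gaze_y / screen_h), 2)
    return 3 * row + col
```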



Fig. 3. (a) Erica system in use. (b) Schematic showing how the eye-gaze interface operates: (1) a light beam is directed at the eyes; light reflecting back into the camera alerts the computer to the position of the eyes; (2) staring at one of the commands displayed on the computer screen for 0.5 s or more automatically triggers the system.

Fig. 4. Schematic showing glint reflection and bright-eye effect. Image features of interest are: 1) the glint, light reflected from the front corneal surface of the eye; 2) the pupil bright-eye, an image formed by light reflected from the back surface.

Fig. 5. Illuminated eye, showing glint and bright-eye.

Fig. 6. Vector relationship between glint and bright-eye centers, used to determine eye-gaze position. (a) Directed at camera. (b) Directly above camera. (c) To right of camera. (d) Up and right.


Fig. 7. Digitized eye-image displayed on-screen.

Fig. 8. Typical screen layout showing four active menu boxes.

A calibration routine is executed at the beginning of each Erica session to determine the reference values required by the algorithm. During calibration, an icon is placed successively at three locations on the display screen and the user is prompted to follow the icon with his or her eye-gaze. The first icon is placed at the center of the screen. The eye image obtained is used to establish intensity thresholds for the bright eye and the glint. The icon then is moved to the upper-left and lower-right corners of the screen. At each corner, the X and Y components of the vector distance from the glint to the bright-eye center are recorded and averaged over a number of images. Linear regression on these averages is used to determine the reference coordinates of an 80 x 25 matrix of gaze locations spanning the entire display screen. These gaze locations are then associated with the corresponding menu boxes. The entire calibration procedure typically requires approximately 5 s.

The imaging algorithm used during calibration and subsequent operation is the same. Beginning with the first row of the frame, approximately every twentieth pixel is examined (a resolution only slightly smaller than the smallest expected pupil size), until a pixel with intensity greater than the bright-eye threshold is located. This pixel is used to define a smaller region in the frame, containing the bright eye and glint. This smaller region is then rescanned, pixel by pixel, to determine 1) pixels on the circumference of the bright eye (i.e., on an edge with low-intensity neighbors) and 2) pixels within the frame representing the glint (usually from five to eight very high intensity pixels). If the glint cannot be detected, a new frame is grabbed and the process repeated. Repeated failure to detect the glint automatically results in recalibration, after a specified time delay.
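The two-pass scan and the corner calibration fit can be sketched compactly. The Python below is our reconstruction from the description above, not the Erica source; the window size, thresholds, and names are assumptions.

```python
import numpy as np

def find_features(frame, bright_thresh, glint_thresh, step=20, half=40):
    """Locate the bright-eye and glint centers in one digitized frame.

    Pass 1 examines roughly every twentieth pixel until one exceeds the
    bright-eye threshold; pass 2 rescans a small window around that seed
    pixel by pixel. Returns (glint_center, bright_center) or None.
    """
    h, w = frame.shape
    seed = next(((y, x) for y in range(0, h, step)
                 for x in range(0, w, step)
                 if frame[y, x] > bright_thresh), None)
    if seed is None:
        return None                    # no bright eye: grab a new frame

    y0, x0 = max(seed[0] - half, 0), max(seed[1] - half, 0)
    win = frame[y0:y0 + 2 * half, x0:x0 + 2 * half]
    bright = np.argwhere(win > bright_thresh)  # bright-eye pixels
    glint = np.argwhere(win > glint_thresh)    # usually 5-8 intense pixels
    if glint.size == 0:
        return None                    # glint lost: retry, then recalibrate
    return glint.mean(axis=0) + (y0, x0), bright.mean(axis=0) + (y0, x0)

def fit_calibration(offsets, screen_pts):
    """Per-axis linear fit mapping glint-to-bright-eye offsets (averaged
    at the calibration corners) onto screen coordinates."""
    off, scr = np.asarray(offsets, float), np.asarray(screen_pts, float)
    return [np.polyfit(off[:, a], scr[:, a], 1) for a in (0, 1)]
```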

III. OPERATION AND CURRENT APPLICATIONS

Erica currently operates using a tree-structured menu hierarchy. Menu options appear in from one to nine of the menu boxes. The user makes a selection by staring at the desired option for a short period of time. This time typically is preset to two or three seconds, but can be altered depending on the user's experience, skill, and intended application. When the user's eye-gaze is fixed for this period, a tone sounds and an icon (cursor) appears in the menu box in line with the gaze. If the user continues to stare at this enabled option, a second tone sounds and the option is performed. The purpose of the auditory and visual feedback is to allow the user a moment to change or abort the enabled option by altering his or her gaze accordingly (a sketch of this dwell logic appears below). Most menus contain a back-up option, which permits the user to return to the previous menu if desired.

Upon completion of the initial calibration routine, a main menu appears on-screen. Using this menu, the user may select an application area. The current Erica software ensemble includes four applications suites: control, including environmental control and nonvocal communication of personal needs; communications, including word processing and synthesized speech; recreation, including computer games, digitized music, and educational programs; and text reading, including a small library of books and other texts. These applications are described in the following subsections.
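The two-stage dwell selection can be viewed as a small state machine. The sketch below is a hypothetical rendering of that logic in Python; the callbacks and timing constants are our assumptions, not Erica's actual values.

```python
import time

DWELL_ENABLE = 2.0    # fixation time before an option is enabled (pre-set)
DWELL_CONFIRM = 1.0   # additional fixation before the enabled option runs

def dwell_loop(current_box, beep, show_cursor, execute):
    """Fixate to enable an option (tone + cursor); keep fixating to run it.

    `current_box()` returns the menu box under the gaze, or None. Shifting
    the gaze to another box aborts an enabled option, as described above.
    """
    fix_box, fix_start, enabled = None, 0.0, False
    while True:
        box, now = current_box(), time.monotonic()
        if box != fix_box:             # gaze moved: restart dwell, abort enable
            fix_box, fix_start, enabled = box, now, False
            continue
        dwell = now - fix_start
        if not enabled and box is not None and dwell >= DWELL_ENABLE:
            beep(); show_cursor(box)   # first tone: option enabled
            enabled = True
        elif enabled and dwell >= DWELL_ENABLE + DWELL_CONFIRM:
            beep(); execute(box)       # second tone: option performed
            fix_box, fix_start, enabled = None, 0.0, False
```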
A. Control

The control application allows the user to operate electrical devices in the surrounding area, through an X-10 Powerhouse system. This commercial system consists of an encoder unit and a set of appliance modules. Both the encoder and appliance modules are plugged into standard AC outlets on the same AC circuit. Each electrical device to be controlled is plugged into an appliance module, which has been preset with a unique digital code for that device. When the user command to control the device is selected from an Erica menu, the application software passes a command signal to the X-10 encoder through the computer serial port. The encoder in turn sends the appropriate code out over the AC circuit. The appliance module then changes the state of the attached device. In this way, appliances can be turned on and off and lights can be raised and dimmed.
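To make the control path concrete, the sketch below sends an appliance command to a serial-attached encoder using the pyserial library. The ASCII command format is purely a placeholder; a real X-10 encoder defines its own serial protocol, and the port name is hypothetical.

```python
import serial  # pyserial

def send_x10(port: str, house: str, unit: int, on: bool) -> None:
    """Ask the X-10 encoder to switch one appliance module on or off.

    The encoder relays the code over the AC wiring to the module whose
    house/unit address matches. The command bytes below are invented
    for illustration only.
    """
    cmd = f"{house}{unit:02d}{'ON' if on else 'OFF'}\r".encode("ascii")
    with serial.Serial(port, baudrate=1200, timeout=1) as link:
        link.write(cmd)

# e.g., wired to the menu option "lamp on":
# send_x10("COM1", house="A", unit=3, on=True)
```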


The standard X-10 system controls the power supply to an appliance, using a two-state logic signal. Appliance functions that cannot be controlled in this way use specially adapted remote-control peripherals. For example, in order to select television channels, a PC expansion board was designed with the addressing logic required to send instructions to a programmable remote-control unit. This unit translates the instructions into the infrared codes used for channel selection on a video cassette recorder (VCR), which in turn changes the television channel. This control strategy also has been adapted successfully to stereos equipped with remote controls.

The control application also includes the capability for urgent, nonlinguistic communication of frequent personal needs. For example, a menu option labeled "call the nurse" activates a loud buzzer in case of an emergency. Similar options enable the user to communicate thirst, pain, or an itch, and to specify the body region associated with the pain or itch.

B. Communications

The communications application contains several features that allow the handicapped user to communicate through language. The central program in this suite is a fully functional word processor, designed to emulate standard word-processing software. It includes most of the usual typing, editing, file-creation, and printing capabilities. In addition, text files can be converted to verbal communications, using a voice synthesizer.

The principal difference in the Erica word processor is that touch typing is replaced by eye typing, i.e., keyboard entries are replaced by eye-gaze input. Clearly, the limited resolution of eye-gaze position detection makes it impossible for the word processor to use the full screen-image of the standard keyboard. In contrast to 84 keys on a standard AT keyboard, the current word processor has only six keys (menu boxes) available for character entry and editing functions (the bottom row of menu boxes has been disabled, to provide space on the screen to display text while typing and editing). Character entry is accomplished using a tree-structured menu hierarchy. On the first menu, the available character set is partitioned into subsets of approximately equal size. Successive menus explode the subset initially selected, until each menu box ultimately is associated with a unique keystroke. Two, three, and perhaps as many as four menu selections are required to enter a single character.

Direct eye typing is a relatively slow affair, and it can take an experienced user nearly 85 min to enter an entire page of text, one character at a time, using a static menu hierarchy. Several unique features have been added to the word processor to speed up the process of text creation. First, a set of common phrases is available, and phrases can be selected and entered using the word-processor menu hierarchy. The user can edit and customize the phrase file, and alternate phrase files can be used, depending on the context of the communication. Second, a character prediction algorithm has been developed and implemented, which changes the character options on the root text-entry menu dynamically, based on the two preceding character entries. This scheme makes use of the conditional probability structure of English-language text strings (a Markov chain) and reduces text-entry time by an average of 30 percent [8]. Higher-level prediction schemes, artificial intelligence, and other concepts are currently under investigation to further improve eye-typing ease, accuracy, and speed.
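The character-prediction scheme amounts to a second-order Markov model over text: given the two preceding characters, rank the likely next characters and place them on the root menu. Below is a minimal sketch of this idea, ours rather than the Erica implementation, with a tiny inline corpus standing in for real English text.

```python
from collections import Counter, defaultdict

def train(corpus: str):
    """Count next-character frequencies conditioned on the two previous
    characters (a second-order Markov chain)."""
    counts = defaultdict(Counter)
    for i in range(len(corpus) - 2):
        counts[corpus[i:i + 2]][corpus[i + 2]] += 1
    return counts

def root_menu(counts, last_two: str, n_boxes: int = 6):
    """Most likely next characters, used to relabel the root text-entry
    menu so that frequent letters need only a single selection."""
    return [ch for ch, _ in counts[last_two].most_common(n_boxes)]

model = train("the rain in spain stays mainly in the plain. " * 3)
print(root_menu(model, "ai"))   # 'n' dominates after "ai" in this corpus
```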

C. Recreation

The recreation suite includes a selection of standard games, such as blackjack, Mastermind, and Simon, rewritten to allow eye-gaze operation. Many of the recreational programs are educationally oriented, and one application even allows the user to compose music and play back the composition through the computer speaker. These programs are valuable in themselves as entertainment. In addition, we have found that games provide a simple, unintimidating, and compelling means of introducing novice users to Erica operation.

D. Text Reading
Text files can be read using the word processor or a specially designed reading application. The root menu in this application provides a subject index of text files stored on disk. Selecting a subject category brings up a list of file names in that area. The list is indexed by a cursor. Menu options scroll the cursor or select the current file for reading. The principal advantage of this read-only application is that more of the computer screen is available to display text. When a file is selected for reading, only the bottom row of menu boxes is enabled. Two boxes are used to turn pages, backward and forward, and the third to call up a submenu. Options of the submenu allow the user to place a bookmark on the current page, select an alternate text, and exit the application. The amount of text material in the current library is limited, both by available disk space and by the expense of obtaining texts on disk.

IV. SOME TEST EXPERIENCES

The overall design and development of Erica has followed the well-known systems engineering methodology: identification of the problem, evaluation of needs and objectives, selection of the most appropriate design from among the existing alternatives, and implementation. The project has drawn on knowledge from a wide range of areas: optics, image processing, digital logic design, software engineering, human factors engineering, rehabilitation engineering, mechanical design, project management, and marketing. As with most systems engineering projects, however, application of the methodology and synthesis of these knowledge domains have been accomplished


through an iterative, evolutionary, and enlightening development process. Initial objectives reflected the designers' able-bodied preconceptions, with little background knowledge or experience of the disabled user's predicament. In the course of system development and testing, these objectives frequently have been altered and new objectives discovered. This process continues, as it must. In this section, we believe it appropriate to provide a brief example of the design process, relating some of the development history and one specific beta-test experience.

The design of the user interface began with a literature review and with interviews with special-education and medical experts and practitioners. The purpose of this research was to develop a profile of the handicapped and nonverbal user, from which we could comprehend the needs, wants, abilities, and limitations of severely handicapped persons. Profile in hand, we proceeded to design a system prototype with environmental-control and text-reading capabilities, working in the laboratory and within the limits imposed by eye-gaze input. Able-bodied test subjects were abundant and readily available among the undergraduate population. These test subjects provided feedback on the performance of the prototype design, which was unanimously positive.

Demonstrating the prototype to specialists who work with the handicapped, however, as well as subsequently working with several handicapped subjects, yielded a less positive appraisal. What was obvious to the specialists, but not to the project team or able subjects, was the acute need for a system that enables communication. The active environmental control features were impressive technical accomplishments and undoubtedly useful, but severely handicapped persons typically are surrounded by health-care professionals who routinely perform these functions for the handicapped with great competence. The most difficult and tedious aspect of this relationship is the dialogue, in which the handicapped can communicate needs, wants, and feelings of satisfaction or dissatisfaction with the current state of affairs. Because the prototype lacked adequate communications capabilities, we experienced this truth first hand, as a part of our own difficulty in working with handicapped volunteers. This led us back to the laboratory and to the immediate design of a simple word processor.

The first beta test began in the fall of 1986. The test subject was Officer Steven McDonald, a college-educated New York City policeman, who had been shot while on duty the preceding summer. The injury sustained to his C-2 vertebra left him completely paralyzed and, because of his dependence on a respirator, almost completely nonvocal. After several interviews consisting of yes and no questions, to which McDonald could mouth the answers, we were able to assess his specific needs and provide him with a prototype system. As we had hoped and anticipated, Officer McDonald's assessments, suggestions, and special insights were a major source of design improvements in the applications software, the principal intended focus of the beta test.

Among the many unanticipated difficulties encountered in this first beta test, however, was the acute need for hardware redesign. Specifically, the standard mounting system for the computer monitor and light-source/camera assembly, used successfully in the laboratory and with patients in wheelchairs, was ill-adapted for use by a recumbent quadriplegic. The mounting system initially developed for McDonald also was unusable, because it interfered with the doctors' and nurses' care for the patient. The mechanical design of a functional mount was further complicated by the special design of the patient's bed (which reduces the possibility of bed sores); by the need to safeguard the patient, his caretakers, and his visitors (as well as the Erica prototype) against accidental upset in an oftentimes crowded hospital room; and by our prime directive to produce an affordable system. Solving this mechanical design problem delayed software testing for several months.

V. SYSTEM LIMITATIONS AND FUTURE ENHANCEMENTS

Although small numbers of first-generation Erica systems are available commercially, ongoing research and development offer promise for significant functional improvements in second-generation designs. Perhaps the most important limitation of the current technology concerns the bright-eye effect. The strength of this effect varies among subjects, and not all candidates for the system have sufficiently intense bright eyes to permit consistent and reliable detection of their eye-gaze direction. Laboratory studies and beta tests suggest that this may inhibit use of the current system to varying degrees by 5 to 10 percent of the population, at least with the near-infrared light frequency currently employed. The source of this variability is not well understood and may relate to either genetic or environmental factors, or some combination of these. Basic research has been undertaken to improve our understanding of the bright-eye effect and to develop design options which overcome this basic problem.

To operate Erica, the user must maintain his or her head in a nearly stationary position. Lateral head movements greater than two inches in either direction cause the eye image to leave the camera field; movements greater than a few inches toward or away from the camera put the eye image out of focus. Sadly, this is not a problem for many of the target population. Patients who suffer from cerebral palsy and similar disorders, however, have uncontrolled head movements that currently inhibit their use of the system. Hardware alternatives that will make Erica available to this population, such as head-tracking systems and autofocus lenses, are under evaluation.

Operation of the menu system can be quickened by increasing the number of options available at any given level of the menu hierarchy. Research is under way to improve the overall accuracy and precision with which eye-gaze position can be detected. This will increase the density of menu boxes permissible on-screen. Also under


investigation are strategies for making greater use of the entire visual field, beyond the border of the current monitor, to support additional static and/or dynamic menu options. A variety of alternatives to the menu-box pick mechanism also are being studied, as well as the development of dynamic and context-sensitive menu macros for lengthy and frequently used option sequences.

Communication is the most important prosthetic application of the eye-gaze interface, and research leading to further enhancements of the word processor is a principal interest of the project. In addition to the letter-prediction scheme currently in operation, several design concepts appear promising. These concepts make use of our knowledge of word context and higher-level English-language structures. Alternatives under investigation include the use of word and phrase prediction, nonsequential character entry, shorthand symbols and pictographs, a dual-monitor hardware configuration, alternative pick mechanisms and menu structures, limited and special-purpose communication worlds, and learning systems. Also under investigation is the use of the word processor as a keyboard emulator, to permit the use of commercial software not specially designed for eye-gaze computing.

Our principal focus in this paper has been the application of eye-gaze technology for the benefit of the disabled community. Other enhancements under development within this application domain include educational tools for disabled school-aged children, set-up routines that permit easy home reconfiguration of the control applications suite, and an eye-gaze control system for a mobile robot. Clearly, however, the eye-gaze interface is not limited to the prosthetic applications described here. Among the many domains under active consideration are the use of eye-gaze position detection in medical, educational, business, industrial, and military settings, for such applications as testing, training, targeting, data entry, and control. Given the rapid development of the eye-gaze interface over the past several years, we look forward optimistically to the diffusion of this technology into general-purpose computing environments within the next decade.

ACKNOWLEDGMENT

Special thanks are due Janine M. Carley, E. Marshall Newton, IV, Kevin S. Spetz, and Lisa A. Onufrak for their many useful comments on the first draft of this paper. We wish to acknowledge these students, as well as the many graduate and undergraduate students at the University of Virginia, past and present, who have contributed their time and talents in making Erica a reality. We also wish to acknowledge the many handicapped volunteers, most especially Officer Steven McDonald and his family, and Jamie Mitchell, whose strength, courage, and perspectives have inspired and enhanced Erica research and development. Finally, we wish to acknowledge the many consultants, both inside and outside the University, who at various times have donated their expertise and/or material support to the project.

REFERENCES
[1] B. T. Pyk, An Analysis of the High Quadriplegics in the United States for Use in Erica Marketing Strategies, undergraduate thesis, Univ. of Virginia, Charlottesville, VA, 1987.
[2] Aesir Software Engineering, AARON: A Communication Aid, Pinedale, CA, 1984.
[3] L. Cory, P. H. Viall, and R. Walder, A Versatile Communications System for High Quadriplegics, Southern Massachusetts Univ., 1985.
[4] Sentient Systems Technology, EyeTyper Model 300, Pittsburgh, PA, 1986.
[5] Words+, Inc., Equalizer and Equalizer II, Sunnyvale, CA, 1986.
[6] K. C. Reichert, ERICA Hardware and Software Design, M.S. thesis, Univ. of Virginia, Charlottesville, VA, 1987.
[7] "With Thomas Hutchinson's marvelous ERICA, a flick of an eye brings help to the helpless," People, July 20, 1987, p. 87.
[8] L. A. Frey, Preliminary Design of an Intelligent Word Processor for Use in ERICA, M.S. thesis, Univ. of Virginia, Charlottesville, VA, 1988.

Thomas E. Hutchinson received his B.S. and M.S. degrees in physics in 1958 and 1959, respectively, from Clemson University, Clemson, SC, and his Ph.D. degree in physics in 1963 from the University of Virginia, Charlottesville. He was a professor of biomedical and chemical engineering at the University of Minnesota, Minneapolis, MN (1968-1975) and the University of Washington, Seattle, WA (1975-1982) before joining the faculty at the University of Virginia. Upon his appointment to the William Stansfield Calcott Professorship and as Associate Dean of the School of Engineering and Applied Science, he was instrumental in forming the Virginia Center for Innovative Technology with his colleagues at Virginia Polytechnic Institute and State University and Virginia Commonwealth University. He resigned as Associate Dean in 1986 to pursue research in the area of man-machine interfaces, particularly the development of an eye-gaze controlled computer to be applied initially to the severely and chronically disabled. His other research interests include bioengineering, materials science, experimental psychology, fault detection devices for aircraft, and the philosophy of science. Dr. Hutchinson has been an Atomic Energy Commission Research Fellow and is a continuing Senior Research Fellow of the University of Glasgow and an Elected Fellow of Cambridge University. In addition to his academic pursuits, he consults for IBM, Morgan Bank, and several government agencies. He is a regular panel member for NSF and NIH, and has for the past five years been chairman of the Southeastern Universities Research Association Committee on the Future of Materials Science.

K. Preston White, for a photograph and biography, please see page 163 of the March/April 1989 issue of this TRANSACTIONS.

Worthy N. Martin was born in 1951. He received the B.A. degree in mathematics, the M.A. degree in computer science, and the Ph.D. degree in computer science in 1981, all from the University of Texas at Austin. He is currently an Associate Professor of Computer Science at the University of Virginia, having joined the faculty in 1982. He is also a member of the Institute for Parallel Computation at the University of Virginia. Dr. Martin's research interests include dynamic scene analysis, probabilistic problem-solving methods (e.g., genetic algorithms), and applications of computer vision (e.g., user interfaces for the handicapped and computer-aided analysis of imagery from archaeological artifacts). Among his published work is the book Motion Understanding: Robot and Human Vision, co-edited with J. K. Aggarwal (1988). He is a member of the IEEE Computer Society, the Association for Computing Machinery, and the American Association for Artificial Intelligence.


Kelly C. Reichert was born in Frederick, MD, in 1964. She received the B.S. degree in systems engineering and computer science and the M.S. degree in systems engineering in 1987, both from the University of Virginia, Charlottesville. Her thesis research was conducted through the Biomedical Engineering Department and focused on the implementation of image-processing technology for the development of an eye-gaze controlled computer system to be used by nonvocal quadriplegics. She is a staff analyst for E. I. DuPont de Nemours & Co., Inc., currently employed at the Waynesboro Fibers Plant in Waynesboro, VA. Her work currently includes the design and implementation of process monitoring and control systems for the manufacturing of nylon.

Lisa A. Frey was born in Newport News, VA, in 1964. She received the B.S. degree with distinction in computer science in 1986 and the M.S. degree with high distinction in biomedical engineering in 1988 from the University of Virginia, Charlottesville. She is currently a Ph.D. candidate in biomedical engineering at the University of Virginia. She serves as the project manager for the eye-gaze computing laboratory in Charlottesville, VA. Her research interests include the design and implementation of intelligent systems, with an emphasis on human-machine interaction as applied to rehabilitation and biomedical engineering. Miss Frey is a member of RESNA, Tau Beta Pi, and The Raven Society. She is the recipient of the GE-Fanuc Fellowship at the University of Virginia and is the former recipient of the Bertha Lamme Scholarship from the Society of Women Engineers.
