
HCI

• What is HCI?
• How does HCI affect you?
• Why is HCI important to you?
HCI
• “Human‐computer interaction (HCI) is the study of interaction between people (users) and computers. It is an
interdisciplinary subject, relating computer science with many other fields of study and research. Interaction
between users and computers occurs at the user interface (or simply interface), which includes both hardware (i.e.,
input and output devices) and software (for example, determining which, and how, information is presented to the
user on a screen).”
HCI
• Inherently interdisciplinary
–Computer Science
–Human Factors and Ergonomics
• Industrial Engineering
–Industrial Design
–Graphics Design
–Psychology
–Anthropology

Goal of HCI
• Purposeful design of a system.
–Efficient, effective, and error free.

User Centered Design (UCD)


• What is UCD?
• What is involved in UCD?
• Why is UCD important?
UCD
• “User‐centered design (UCD) is a design philosophy and a process in which the needs, wants, and limitations of the
end user of an interface or document are given extensive attention at each stage of the design process.
User‐centered design can be characterized as a multi‐stage problem‐solving process that not only requires designers
to analyze and foresee how users are likely to use an interface, but to test the validity of their assumptions with regard
to user behavior in real‐world tests with actual users. Such testing is necessary as it is often very difficult for the
designers of an interface to understand intuitively what a first‐time user of their design experiences, and what each
user's learning curve may look like.

The chief difference from other interface design philosophies is that user‐centered design tries to optimize the
user interface around how people can, want, or need to work, rather than forcing users to change how they work to
accommodate the system or function.”

UCD
• Involve users early
• Direct contact with users
• Continual user feedback
• Designers ARE NOT users!

User Goals
• What are the user’s goals?
• How does the computer interface facilitate the
user attaining the goals?

HCI
• What does it mean for a computer to be usable?
• What do these terms mean?

Universal Usability
• Ultimate goal: addressing the needs of all users.

Computing Environments/ Universal Usability


• Physical Environment
–Positioning should provide comfortable reaching and easy movements.
• Anthropometry
–Static human measures: height, weight, arm length, etc.
–Dynamic measures: reach distance while seated, etc.
–Safety
–Efficiency: do not require more work than necessary.
–User Space: no discomfort.
–Work Space: Accommodate other work tools.
–Lighting
–Noise
–Pollution

Computing Environments/ Universal Usability


• Social Environment
–Protect user’s privacy.
–Do not announce errors.
–Group settings require the ability for all members to view, hear, and be heard.

Computing Environments/ Universal Usability


• Cultural and International
– Reading direction
• vertical vs. horizontal; left vs. right
– Characters, numerals, special characters
– Formats
• Date and time
• Numerical and currency
• Weights and measures
• Phone numbers and addresses
• Names and titles
– Sorting sequences
– Icons, buttons, images, and colors
– Etiquette, policies, tone, formality, and metaphors.

Computing Environments/Universal Usability


• Cognitive Environments
–Age: Children, Older adults
–Disabilities
–Technical knowledge
–Degree of focus
–Stress

–Memory: Short‐term and long‐term


–Decision making and risk assessment
–Language
–Other: vigilance, fatigue, workload, fear, anxiety, emotion, drugs, smoking, alcohol, etc.

Computing Environments/Universal Usability


• Children
–Skill levels
• Reading and understanding
• Smartphones!

• Keyboard, drag and drop, double‐clicks, small targets


–Attention levels
–Familiar characters, exploratory environments, Repetition

Computing Environments/ Universal Usability


• Older Adults
– Reduced physical, cognitive, and social capabilities.
– Fear of technology.
Computing Environments/Universal Usability
• Disabilities
–Visual
–Hearing
–Motor
–Learning
–Temporarily disabled

Computing Environments/ Universal Usability


• Personality
– Personal interaction preferences
– Extroverts vs. introverts
– Sensing vs. intuition
– Perceptive vs. judging
– Feeling vs. thinking

5W+H
• What/How
–Understand the physical and virtual interface components.
• For example, I/O devices, windows, icons, etc.
• Where/When
–Related to physical environment.
–Differences between office, portable, wearable systems.
• Who/Why
–Types of tasks and skill sets required.

Norman’s Principles
• Provide a good conceptual model.
• Make things visible.
• Use good mappings.
• Provide feedback.

Norman’s Paradox of Technology


• “Added complexity and difficulty cannot be avoided when functions are added, but with clever design, they can
be minimized.”
Problems
• Gulf of Execution
• Gulf of Evaluation

Johnson’s Principles
• Focus on the users and their tasks, not the technology.
• Consider function first, presentation later.
• Conform to the user’s view of the task.
• Do not complicate the user’s task.
• Promote Learning.
• Deliver information, not just data.
• Design for responsiveness.
• Try it on users first, then fix it.
Interaction Paradigms
• Innovation
• Computing Environments
• Analyzing Interaction Paradigms
• Interaction Paradigms

Innovation - Vannevar Bush


• “As We May Think” (1945), published in the July issue of The Atlantic Monthly
• Bush envisioned a device that would help people organize information in a meaningful way.
• He called this device the “Memex”:

Memex: A Memex is a device in which an individual stores all his books, records, and communications, and which
is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement
to his memory.

Innovation - Douglas Engelbart


• Human Augmentation System
–Augmentation Research Center (ARC) of the Stanford Research Institute (SRI) in Menlo Park, CA.
• oNLine System (NLS), presented at the Fall Joint Computer Conference in San Francisco in 1968
• Human Augmentation System: It seemed clear to everyone else at the time that nobody would ever take seriously
the idea of using computers in direct, immediate interaction with people. The idea of interactive computing—well, it
seemed simply ludicrous to most sensible people.

Innovation - Douglas Engelbart


• How do Engelbart’s innovations affect us today?

Innovation - J. C. R. Licklider
• The Computer as a Communication Device (1968)
• Licklider envisioned something like the current manifestation of the Internet with its diversity of computing
technologies and platforms.
• OLIVER (online interactive vicarious expediter and responder)
OLIVER was designed to be a complex of applications, programmed to carry out many low-level tasks, which would
serve as an intermediary between the user and his or her online community.
OLIVER would manage files and communications, take dictation, and keep track of transactions and appointments.

Does OLIVER exist in our current computing environments?

Innovation - Ivan Sutherland


• The Ultimate Display (1965)
• He proposed novel ways of interacting with computers, including the concept of a kinesthetic display.
The Ultimate Display – Ivan Sutherland
The ultimate display would, of course, be a room within which the computer can control the existence of matter. A
chair displayed in such a room would be good enough to sit in. Handcuffs displayed in such a room would be
confining, and a bullet displayed in such a room would be fatal. With appropriate programming such a display could
literally be the Wonderland into which Alice walked.

In what ways has the Ultimate Display become manifest in our current computing environments?

Computing Environments
• Physical Computing Environment
• Social Computing Environment
• Cognitive Computing Environment

Physical Computing Environment


1 – Safety
2 – Efficiency
3 – User Space
4 – Work Space
5 – Lighting
6 – Noise
7 – Pollution

Social Computing Environment


–The social environment affects the way people use computers.
–Computer use has also been shown to affect human social interaction.
–Different computing paradigms imply different social environments.
• For instance, personal computing is usually a solitary activity done in an office or an isolated corner of the
house. Mobile computing is often done outside and in public places

Cognitive Computing Environment


1 – Age
2 – Disabilities
3 – Degree of technical knowledge
4 – Degree of focus
5 – Cognitive Stress

Analyzing Interaction Paradigms


• 5W + H
–What/How
–Where/When
–Who/Why
Terms
• Information Space—Defined by the information artifacts used and the content included, for example, a book and
the topics covered in the book
• Interaction Architecture—The structure of an interactive system that describes the relationship and methods of
communication between the hardware and software components.
• Interaction Mode—Refers to perceptual modalities, for example, visual, auditory, or haptic (sometimes used in the
literature to refer to interaction styles or particular tasks such as browsing or data entry)
• Interaction Paradigm—A model or pattern of human–computer interaction that encompasses all aspects of
interaction, including physical, virtual, perceptual, and cognitive
• Interaction Space—The abstract space defined by complex computing devices such as displays, sensors,
actuators, and processors
• Interaction Style—The type of interface and the interaction it implies, for example, command line, graphical user
interface (GUI), or speech
• Work Space—The place where people carry out work-related activities, which may include virtual as well as
physical locations, as in, for example, flight simulation training

Interaction Paradigms
• Large Scale Computing
• Personal Computing
• Networked Computing
• Mobile Computing
• Collaborative Environments
• Virtual Reality
• Augmented Reality

Interaction Paradigms
Large Scale Computing
• The original mainframe computers were large-scale computing machines, referred to as hosts
• They resided in a central location
• They were accessed by remote alphanumeric terminals equipped with keyboards
–The terminals were referred to as “dumb terminals”
–These systems are also referred to as host/terminal systems
• They were programmed using punch cards
• Time-sharing services (TSSs) were schemes that used one user’s idle time to serve other users who were currently
active.
• Mainframe computers are currently used in enterprise computing environments like Wall Street
EX: IBM Mainframes photo album
• Super Computers
–These highly specialized machines crunch large amounts of data at high speed, as in computing fluid dynamics,
weather patterns, seismic activity predictions, and nuclear explosion dynamics.
–Supercomputers are used for the very high-speed Backbone Network Service (vBNS) connections that constitute
the core of the Internet.
EX: National Center for Super Computing Applications (NCSA)

Personal Computing
Desktop Computing: The Alto, developed at the Xerox Palo Alto Research Center in 1973, was the first computer
to use a GUI built on the desktop metaphor: pop-up menus, windows, and icons.

Personal Computing
• Personal-Public Computing
–Public Access Computing
– The information divide
–Public Information Appliances
Networked Computing
• Licklider –The Galactic Network
• ARPAnet - 10:30 pm on October 29, 1969
• Scope
–WAN – Wide Area Network
–MAN – Metropolitan Area Network
–LAN – Local Area Network
–PAN –Personal Area Network
• Wired - Wireless
–Wi-Fi (IEEE 802.11x)
–Bluetooth
–3G

Mobile Computing
• Mobile computing technologies comprise a very diverse family of devices:
–Laptop computers
–Tablet computers
–Game players
–MP3 players
–PDAs
–Cell phones
• Mobile devices can be connected to global positioning systems (GPS)
– These have touchscreens and voice interaction to alleviate potential visual attention problems during
driving
• Mobile devices can offer situational computing that can take advantage of location-specific information through
location-based mobile services (LMS).
– LMS can be beneficial for location-sensitive advertisements, public service announcements, social interactions,
and location-specific educational information.

Collaborative Environments
• Networks allow members of a group to interact with other members on shared files and documents.
–This creates a virtual space where people can collaborate and work collectively.
–Groupware
• Collaborative work
–Communication
–Coordination
–Organization
–Presentation
• Computer-mediated communication (CMC)
• Computer-supported cooperative work (CSCW)
• What are some of the different types of groupware?
• Remote interaction
–Synchronous
• Video conferencing
• Instant messaging
• Chat rooms
• Remote access white boards
–Asynchronous
• Recommender systems
• Bulletin boards
• Email
• Face-to-face
–Smart rooms
• Projectors
• Smart Boards
• Collaboratory (Laboratories without walls)
–Developed to allow the scientific community to perform and share research projects and results regardless
of physical Location.
EX: The Research Collaboratory for Structural Bioinformatics (RCSB), The Chimpanzee Collaboratory, The
National Fusion Grid

Embodied Virtuality
Some of us use the term “embodied virtuality” to refer to the process of drawing computers out of their electronic
shells. The “virtuality” of computer-readable data—all the different ways in which it can be altered, processed and
analyzed—is brought into the physical world
• How do we disperse computing functionality throughout the environment?
• What form should EV computing take?
• What kind of interface does it require?
• How much control should we retain, and how much should be automated?
• Four discernable currents in EV (location/operation)
Side 1—Portable/manual (sometimes wearable) devices such as cell phones, MP3 players, digital cameras, and
PDAs offer portable functionality the user can manipulate.
Side 2—Manual/fixed devices such as ATMs and kiosks are manipulated by the user but are fixed in place.
Side 3—Portable/automated devices are read by situated sensors, such as the car transceivers used for toll booth
payments. There are no possible manual operations.
Side 4—Automated/fixed devices such as alarm sensors can be used to detect the presence of intruders or industrial
hazards.

• Emerging fields
– Ubiquitous/pervasive computing
– Invisible/transparent computing
– Wearable computing

Embodied Virtuality - Ubiquitous/pervasive


• Third Paradigm (Alan Kay)
• Devices like cameras, video recorders, musical instruments, and picture frames are becoming “smart” through the
introduction of embedded chips.
• The essence of UbiComp is that, to fulfill their potential, computing technologies must be considered a part of the
fabric of our lives and not something that resides in a gray box.
• Ambient computing
– The concept of a computational grid that is seamlessly integrated into our physical environment
• Lighting systems
• Heating systems
• Electrical systems
– Smart environments that sense and recognize people
• Face recognition
• ID tags

Embodied Virtuality - Invisible/transparent


• The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life
until they are indistinguishable from it. (Weiser, 1991, p. 94)
– Two approaches
• Make the interface simple and intuitive
– Driving a car
• Remove the interface entirely
– Automotive braking systems
• Information Appliances
– PDAs, BlackBerry® devices, digital cameras, MP3 players, and portable game players.
An appliance specializing in information: knowledge, facts, graphics, images, video, or sound. An information
appliance is designed to perform a specific activity, such as music, photography, or writing. A distinguishing feature
of information appliances is the ability to share information among themselves.

Embodied Virtuality - Wearable


• The underlying principle of wearable computing is the merging of information space with work space - humionics.
• The goal of humionics is to create an interface that is unobtrusive and easily operated under work-related
conditions.
• Traditional I/O technologies are generally inadequate
• Wearable systems must take advantage of auditory and haptic as well as visual interaction.
MAIN EX: NASA
– Body Wearable Computer (BWC) Project
• Shuttle
• Links –
– Xybernaut
– i-glasses (iPod)
– Ascension Technology
– Wearable Voice Activated Computer (WEVAC) Project.
– MIT Media Lab Wearable Computing
• Personal Area Network (PAN)
– Two types
• Wireless network of wearable and proximal devices
– (IEEE) 802.15 Working Group for WPAN
– Microsoft – Connect to a Bluetooth personal area network (PAN)
• Wearable devices that use the body to transmit signals
– MIT Media Lab – Intrabody Signaling
– IBM Personal Area Network (PAN)
• Wearable computing integrates three spaces
– Information Space
• Defined by the artifacts people encounter such as documents and schedules
– Interaction Space
• Defined by the computing technology that is used
– Work Space
• Any physical location that may be involved in a task.
• Venn diagram of a library space

Embodied Virtuality
• Embodied Virtuality Environments and Their Characteristics

Virtual Reality
• The goals of the virtual reality (VR) community are the direct opposite of the goals of the EV
community.
– EV strives to integrate computer functionality with the real world
– VR strives to immerse humans in a virtual world
• Virtual reality technologies can be divided into two distinct groups:
– Nonimmersive environments
– Immersive environments
• Nonimmersive - screen-based, pointer-driven, three-dimensional (3D) graphical presentations that may involve
haptic feedback
– VRML
– QuickTime VR
• Immersive VR environments are designed to create a sense of “being” in a world populated by virtual objects.
– To create a convincing illusion, they must use as many human perceptual channels as possible.
• VR I/O devices
– Head Mounted Display (HMD)
– Spatial Immersive Display (SID)
– Cave Automated Virtual Environment (CAVE)

• VR I/O devices
– Head-movement-tracking systems
– Passive systems
• Platform device
– Flight simulation
– Active locomotion systems
• Treadmill
– Military training
• Applications
– Engineering
• Computer-aided design (CAD) and VR
• Clemson Research in Engineering Design and Optimization
– Virtual Reality Design Tools > Virtual Reality (VR) related Projects:
– Education
– Psychology
• Treatment of phobias
– Spiders
– Agoraphobia
– Claustrophobia
– Fear of flying

Augmented Reality
• The goal of AR is to create a seamless integration between real and virtual objects in a way that augments the
user’s perception and experience.
• Criteria for AR environments
– The virtual information must be:
• Relevant to and
• in sync with the real-world environment
• AR I/O devices
– Head-Up Displays (HUD)
• Optical see-through EX: GOOGLE GLASS
• Video see-through
Chapter 2:
Interaction Styles
Interaction Styles
• Frameworks for Understanding Interaction
• Coping with Complexity
–Avoid “cluttering”, mixing-up, becoming too eclectic
–Good for analysis/evaluation?
• Consistency is often thought of in relation to interaction styles
• Interaction Styles vs. Paradigms
–Not so much to do with what we plan on using computers for
Frameworks for Understanding Interaction
• We are going to talk about two different (or more) interaction models/frameworks:
–Execution/Evaluation Action Cycle
–Interaction Framework
•A framework is basically a structure that provides a context for conceptualizing something
•We can (also) use these frameworks to:
–Structure the design process
–Help us to identify problematic areas within the design
–Help us to conceptualize the problem space as a whole

Execution/Evaluation Action Cycle (EEC)


•Donald Norman (1990) The Design of Everyday Things
–I think a book that you should consider reading!!!
•The structure of an action has four basic parts:
–Goals: We begin with some idea of what we want to happen; this is our goal.
–Execution: We must then execute an action in the world.
–World: To execute an action, we must manipulate objects in the world.
–Evaluation: Finally, we must validate our action and compare the results with our goal.
•Goals do not specify particular actions
•Goals and intentions do not have a one-to-one relationship
•“Delete text” goal
–Intention that involves the Edit menu
–Intention that involves the Delete key
•Each intention involves a sequence of actions
Goal > Intention > Actions > Execution
•Evaluate Results
–Perceive new state
–Interpret what we perceive
–Evaluate new state with goal
Perceive > Interpret > Evaluate
•Seven Stages of Action
• The cycle can be initiated at any point
–Some goals are data-driven - initiated when an environmental event is perceived
• Event-driven?
o Recipient-designed?
–Others are goal-driven - initiated when the person conceives of a new goal.
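As a rough illustration (not Norman's own formalism), one pass through the seven stages can be sketched in code. The "delete text" example below is taken from the slides; the toy world model and variable names are invented for illustration.

```python
# A toy walk through the seven stages of action for a "delete text" goal.
# The world is modeled as a dict; the editor behavior is invented.

world = {"document": "hello world"}

# 1. Goal: what we want to happen
goal = "document is empty"
# 2. Intention: one of several ways to satisfy the goal
intention = "use the Delete key"
# 3. Action sequence implied by that intention
actions = ["select all text", "press Delete"]
# 4. Execution: manipulate objects in the world
world["document"] = ""
# 5. Perceive the new state of the world
perceived = world["document"]
# 6. Interpret what we perceive
interpretation = "empty" if perceived == "" else "not empty"
# 7. Evaluate the interpretation against the original goal
goal_met = interpretation == "empty"
print(goal_met)  # True: the cycle closes back on the goal
```

The same goal could instead be reached through an Edit-menu intention with a different action sequence, which is the point of the goal/intention distinction above.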

Gulf of Execution
•Does the interface allow us to carry out the actions required by the intention?
Goal = save a file
Intention = use the file menu
Action = click the save option
•Is there a save option in the file menu?
• Given a particular interface design, how easily can you:
–Determine the function of the device?
–Determine what actions are possible?
–Determine mapping from intention to physical movement?
–Perform the action?
–Determine whether the system is in the desired state?
–Determine the mapping from system state to interpretation?
–Determine what state the system is in?

Interaction Framework
• Abowd and Beale expanded on the EEC to include the system
• System (S)—Uses its core language (computational attributes related to system state)
• User (U)—Uses its task language (psychological attributes related to user state)
• Input (I)—Uses its input language
• Output (O)—Uses its output language

Interaction Framework / EEC


• Execution Phase
–Articulation—The user formulates a goal, which is then articulated using the input language.
–Performance—The input language is translated into the core language (operations that the system will
carry out).
–Presentation—The system manifests the result of the core language operations using the output language.
• Evaluation Phase
–Observation—The user interprets the results on the screen and reconciles them with the original goal.
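The four translations can be sketched as a chain of mappings between the four languages. This is only an illustrative model: the "increase volume" task and the mapping tables are invented, not from Abowd and Beale.

```python
# The four languages are modeled as strings; each dict is one translation.
articulation = {"make it louder": "press volume-up"}            # task -> input
performance = {"press volume-up": "set volume := volume + 10"}  # input -> core
presentation = {"set volume := volume + 10": "volume bar grows"}  # core -> output
observation = {"volume bar grows": "it is louder now"}          # output -> task

def interact(task_goal):
    i = articulation[task_goal]   # Articulation: goal stated in input language
    c = performance[i]            # Performance: input translated to core language
    o = presentation[c]           # Presentation: core result shown in output language
    return observation[o]         # Observation: output reconciled with the goal

print(interact("make it louder"))  # it is louder now
```

A gulf appears wherever one of these lookups fails, i.e., where a translation between adjacent languages is missing or ambiguous.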

Coping with Complexity


• We are going to talk about different strategies for doing this:
• Mental Models
• Mapping
• Semantic and Articulatory Distance
• Affordances

Mental Models
• A mental model is a cognitive representation of something that defines a logical and believable estimation as to
how a thing is constructed or how it functions
–Transparent objects expose their functions
• Bicycles
–Opaque objects hide their functions
• Computers
• “Bee in a box”
• Mental models are:
–Unscientific—They are often based on guesswork and approximations.
–Partial—They do not necessarily describe whole systems, just the aspects that are relevant to the persons
who formulate them.
–Unstable—They are not concrete formulations, but evolve and adapt to the context.
–Inconsistent—They do not necessarily form a cohesive whole; some parts may be incompatible with other
parts of the same model.
–Personal—They are specific to each individual and are not universal concepts that can be applied
generically.
MAXIM
Designs that align with a user’s mental model will be easier for him or her to use
•How can we ascertain information about a user’s mental model?

Mapping
• The concept of mapping describes how we make connections between things
Semantic and Articulatory Distance
• Semantic Distance
–The distance between what people want to do and the meaning of an interface element.
• Articulatory Distance
–The distance between the physical appearance of an interface element and what it actually means.

Affordances
• The affordances of some interfaces can be intuitively understood: a steering wheel affords turning, and a doorbell
affords pushing.
• These connections allow us to make predictions about the results of our actions and help us to create usable mental
models.
• Affordance Confusion—when certain aspects of an object do not work in the way we assume they should

• Norman considers an affordance to be a relationship between an object and a user, not a property of an object
• What may be an affordance to one person may not be to another
• The perception of affordance fosters usability
• The affordances a user may need must be present
• Affordances must not contradict the user’s expectations

Interaction Styles (list of)


• Command Line
• Menu-Based Interface
• Form Fill-In
• Question and Answer
• Direct Manipulation
• Metaphors
• Web Navigation
• Three-Dimensional Environments
• Zoomable Interface
• Natural Language

• Command-line interfaces are fast and powerful.


–Many commands are abbreviated
• quick and efficient
–Commands can be applied to many objects simultaneously
• fast input
–Some commands have multiple parameters that can be set and altered
• precise and flexible
• Command Line and the EEC
–Intention formation, specification of the action, and the execution stages are complex
–Requires a rather accurate mental model of the computer’s internal processing
• Command Line and the Interaction Framework
–Translating the user’s task language into the input language requires knowledge of the core language
–The output language can be confusing for inexperienced users - there is very little feedback
• Command Line and Articulatory Distance
–Articulatory distance is large because we are presented with only the command prompt – no
indication of functionality
• Advantages of command-line interfaces:
–Suitable for repetitive tasks
–Advantageous for expert users
–Offer direct access to system functionality
–Efficient and powerful
–Not encumbered with graphic controls
• Low visual load
• Not taxing on system resources
Disadvantages of command-line interfaces:
–Low command retention
–Steep learning curve
–High error rates
–Heavy reliance on memory
–Frustrating for novice users
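The trade-offs above can be made concrete with a toy command interpreter: abbreviated commands, one command applied to many objects via a wildcard, and no confirmation or feedback. The command set and file list are invented for illustration.

```python
# A minimal command interpreter: terse, powerful, and unforgiving.
import fnmatch

files = {"a.txt": "draft", "b.txt": "notes", "c.log": "debug"}

def run(cmd_line):
    cmd, pattern = cmd_line.split()
    targets = fnmatch.filter(files, pattern)  # one command, many objects
    if cmd == "rm":                           # abbreviated form of "remove"
        for name in targets:
            del files[name]                   # no confirmation, no undo
    elif cmd == "ls":                         # abbreviated form of "list"
        return sorted(targets)

run("rm *.txt")       # fast and efficient for the expert...
print(run("ls *"))    # ['c.log'] ...but a mistyped pattern is costly
```

The single `rm *.txt` line deletes two files at once, which is exactly the efficiency (and the error risk) the bullet lists describe.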

Interaction Styles - Menu-Based Interface


• Menu-driven interfaces present users with sequential hierarchical menus that offer lists of functions.
–Textual: key-in number of option
–Graphical: use arrow keys or pointing device
Menus are based on recognition as opposed to recall
• No need to remember commands
• Users search from a list of possible choices
• List provides constraints
• Appropriate for small screens (iPod)
• Menu-based interfaces and the EEC
–Menu constraints can help the user to form the proper intentions and specify the proper action sequence
–Provide a context to evaluate the output language
• Menu-based interfaces and :
–Articulatory Distance
• Menu options create small articulatory distance
–Mental Models
• Menu construction has a direct impact on the user’s mental model
–Affordances
• Menu elements present affordances
• Advantages of menu-based interfaces:
–Low memory requirements
–Self-explanatory
–Easy to undo errors
–Appropriate for beginners
• Disadvantages of menu-based interfaces:
–Rigid and inflexible navigation
–Inefficient for large menu navigation
–Inefficient use of screen real estate
–Slow for expert users
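A minimal "key in the option number" menu of the textual kind described above shows recognition over recall and the constraint a list provides. The option labels are invented for illustration.

```python
# The user recognizes choices from a list rather than recalling a command.
def render_menu(options):
    return "\n".join(f"{i}. {label}" for i, label in enumerate(options, 1))

def select(options, keyed_in):
    """Return the chosen option, or None for out-of-range input:
    the list itself constrains what the user can do."""
    if 1 <= keyed_in <= len(options):
        return options[keyed_in - 1]
    return None

options = ["Open file", "Save file", "Quit"]
print(render_menu(options))
print(select(options, 2))  # Save file
print(select(options, 9))  # None: invalid choices cannot be executed
```

Note how the same constraint that makes errors hard for beginners makes navigation rigid and slow for experts, matching the advantage/disadvantage lists above.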

Interaction Styles - Form Fill-In


• Similar to menu interfaces – present screens of information
• Different from menu interfaces - used to capture information and proceed linearly, not to navigate a hierarchical
structure
Always inform the user about the length of paged forms and where they are within the structure
• Forms can be presented using
–Single scrolling screens
– Multiple linked pages
• Form elements must be grouped logically
• Include “You Are Here” indications
Form elements must be unambiguously labeled to increase data integrity
• Users must understand what data is required and what format should be used
–Date information formats
1/29/2005, 29/1/2005, or January 29, 2005?
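The date ambiguity above is typically resolved by validating input against one explicit format. A sketch using the standard library; accepting only ISO 8601 (YYYY-MM-DD) is a design choice for the example, not something the slide mandates.

```python
# Validate form input against a single unambiguous date format.
from datetime import datetime

def parse_form_date(text):
    """Accept only YYYY-MM-DD; return a date, or None for anything else."""
    try:
        return datetime.strptime(text, "%Y-%m-%d").date()
    except ValueError:
        return None

print(parse_form_date("2005-01-29"))  # 2005-01-29
print(parse_form_date("1/29/2005"))   # None: ambiguous format rejected
```

Labeling the field with the expected format (and rejecting everything else) is what "unambiguously labeled to increase data integrity" means in practice.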
•Advantages of form fill-in interfaces:
–Low memory requirements
–Self-explanatory
–Can gather a great deal of information in little space
–Present a context for input information
•Disadvantages of form fill-in interfaces:
–Require valid input in valid format
–Require familiarity with interface controls
–Can be tedious to correct mistakes

Interaction Styles - Question and Answer


• Question and answer interfaces are also called wizards.
• They are restricting for expert users
• They are easy for novice users
–However, they may not know the required information

• Advantages of question and answer interfaces:


–Low memory requirements
–Self-explanatory
–Simple linear presentation
–Easy for beginners
• Disadvantages of question and answer interfaces:
–Require valid input supplied by user
–Require familiarity with interface controls
–Can be tedious to correct mistakes

Interaction Styles - Direct Manipulation


• Ben Shneiderman (1982)
–Continuous representations of the objects and actions of interest with meaningful visual metaphors.
–Physical actions or presses of labeled buttons instead of complex syntax.
–Rapid, incremental, reversible actions whose effects on the objects of interest are visible immediately
• Three phases in Direct Manipulation - Cooper, Reimann (2003)
–Free Phase—How the screen looks before any user actions
–Captive Phase—How the screen looks during a user action (click, click-drag, etc.)
–Termination Phase—How the screen looks after a user action
• Direct Manipulation and the EEC
–The range of possible intentions is consistently wide
–Users usually have multiple options for specifying action sequences
o Can be overwhelming for novice users
–Provide multiple ways of executing action sequences
• Advantages of direct manipulation interfaces:
–Easy to learn
–Low memory requirements
–Easy to undo
–Immediate feedback to user actions
–Enables user to use spatial cues
–Easy for beginners
• Disadvantages of direct manipulation interfaces:
–Not self-explanatory
–Inefficient use of screen real estate
–High graphical system requirements

Interaction Styles - Metaphors


• GUIs use visual relationships to real-world objects (metaphors)
• Metaphors can help people relate to complex concepts and procedures by drawing on real-world knowledge
• Real-world affordances can be reflected
• What metaphors are used by contemporary GUIs?
A metaphor’s function must be consistent with real-world expectations
• Metaphors that do not behave the way people expect will cause confusion and frustration
• Macintosh trashcan
Don’t force a metaphor
• Potential problems with metaphors
–Run out of metaphors
• Some virtual processes and objects have no real-world counterparts
–Mixed metaphors
–Carry connotations and association

Interaction Styles - Web Navigation


• Two basic interaction styles
–Link-based navigation
• Sensitive to articulatory distance
• Ambiguous link labels increase the gulf of evaluation
–Search
• Sensitive to semantic distance
• Inadequate search engine algorithms increase the gulf of execution
• Slight advantage in development of mental models

Interaction Styles – 3D Environments


• 3D interaction is natural in the real-world
• 3D environments are common in digital games
• Rich graphical 3D environments are processor intensive
• 3D Navigation
–Involves two types of movement
• Translation – movement on a plane
• Rotation – movement around an axis
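The two movement types can be sketched as the standard plane transformations; the 2-D case below is illustrative (3-D adds an axis but works the same way), and the sample point is invented.

```python
# Translation (movement on a plane) and rotation (movement around an axis).
import math

def translate(point, dx, dy):
    x, y = point
    return (x + dx, y + dy)

def rotate(point, theta):
    """Rotate a point about the origin by theta radians
    using the standard 2-D rotation matrix."""
    x, y = point
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

p = translate((1.0, 0.0), 2.0, 0.0)  # (3.0, 0.0)
q = rotate(p, math.pi / 2)           # 90 degrees: approximately (0.0, 3.0)
print(round(q[0], 6), round(q[1], 6))
```

A 3-D navigation scheme composes these two operations: translating the viewpoint across a plane and rotating it about an axis.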

Interaction Styles – 3D Environments


• Web-based 3D
–Use vector-based graphics to decrease file size
–Virtual Reality Modeling Language (VRML)
• Uses polygons with parameters
–Transparency
–Texture maps
–Shininess
–X3D is XML-based - Web3D.org
• Offers greater flexibility and control
• Desktop 3D
–Current GUIs are predominantly 2D
–3D environments presented on 2D screens are difficult to navigate

Interaction Styles - Zoomable Interface


• ZoomWorld (Jef Raskin) is based on the zooming interface paradigm (ZIP)
• ZoomWorld Demo
Zoomable interfaces allow us to use our sense of relative positioning
• ZIP is based on landmarks and relative positioning (organizational cues)
–Proportion
–Color
–Patterns
–Proximity
• Pad++: Zoomable User Interface (ZUI)

Interaction Styles - Natural Language


• Natural Language Interaction (NLI) – Interacting with computers using everyday language
• Obstacles
–Language is ambiguous
–Meaning depends on context
• “Search results”
• “She said she did not know”
–Dependent on visual cues
• Applications for NLI
–Speech Input
• Hands-free operation
• Poor Lighting Situations
• Mobile Applications
• In the home
–Speech Output
• On-board navigational systems
• Two areas of development
–Speech recognition
–Semantics
• Grammar issues
• Vague meanings
• Contradictory statements
• Advantages of NLI:
–Ease of learning
–Low memory requirements
–Flexible interaction
–Low screen requirements
–Appropriate for beginners
• Disadvantages of NLI:
–Requires knowledge of the task domain
–May require tedious clarification dialogues
–Complex system development
