
Submitted To:-
Ramandeep Singh

Submitted By:-
Gurvinder Singh
Ashish Kapoor
Amritpal Singh
Jugvinder Singh Sidhu
TABLE OF CONTENTS

1. What is SCADA?

2. What is Telemetry?

3. What is Data Acquisition?

4. Differences between SCADA and DCS?

5. Components of SCADA
i. Field Instrumentation
ii. Remote Station
iii. Communication Network
iv. Central Monitoring System (CMS)

6. Typical System Configuration

7. Modes of Communication

8. SCADA Example Application

9. SCADA System Benefits

10. Futuristic Technology for SCADA - BAN

11. Limitations

12. Human Machine Interface (HMI)

i. Introduction
ii. Terminology
iii. Definition
iv. Goals

13. Human-Machine Interaction

14. Human Machine Interface

15. Design Methodologies

16. Thirteen Principles of Display Design

17. Modalities And Modes

18. Interaction Technique

19. Human Interface Device

20. Bibliography
Supervisory Control And Data Acquisition
What is SCADA?

A SCADA (Supervisory Control And Data Acquisition) system refers to the combination of telemetry and data acquisition. It consists of collecting information, transferring it back to a central site, carrying out any necessary analysis and control, and then displaying this data on a number of operator screens. The SCADA system is used to monitor and control a plant or equipment. Control may be automatic or can be initiated by operator commands. SCADA systems were first used in the 1960s.

SCADA stands for supervisory control and data acquisition. It generally refers to
an industrial control system: a computer system monitoring and controlling a
process. The process can be industrial, infrastructure or facility-based as
described below:

Industrial processes include those of manufacturing, production, power generation, fabrication, and refining, and may run in continuous, batch, repetitive, or discrete modes.

Infrastructure processes may be public or private, and include water treatment and distribution, wastewater collection and treatment, oil and gas pipelines, electrical power transmission and distribution, wind farms, civil defense siren systems, and large communication systems.

Facility processes occur in both public and private facilities, including buildings, airports, ships, and space stations. They monitor and control HVAC, access, and energy consumption.
SCADA Software
The supervisory computer consists of a PC running either Campbell Scientific's
HMI software or another vendor's software. InTouch, Intellution, Lookout, and
other software packages can be used in conjunction with our OPC client/server
software application. Like other HMI software packages, our software provides a
graphical interface that the operator uses to view the status of remote sites,
acknowledge alarms, and control the units.

What is Telemetry?
Telemetry is usually associated with SCADA systems. It is a technique used in
transmitting and receiving information or data over a medium. The information can
be measurements, such as voltage, speed or flow. These data are transmitted to
another location through a medium such as cable, telephone or radio. Information
may come from multiple locations. A way of addressing these different sites is incorporated in the system.
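To make the idea of addressing concrete, the following Python sketch packs a site address and a raw reading into a simple telemetry frame with a checksum. The frame layout (one address byte, a 16-bit reading, an 8-bit additive checksum) is an assumption for illustration, not a standard SCADA protocol.

    import struct

    def build_frame(site_address, reading):
        # Pack one address byte and a 16-bit raw reading (big-endian).
        body = struct.pack(">BH", site_address, reading)
        checksum = sum(body) & 0xFF  # simple 8-bit additive checksum
        return body + bytes([checksum])

    def parse_frame(frame):
        # Verify the checksum, then recover the address and reading.
        body, checksum = frame[:-1], frame[-1]
        if sum(body) & 0xFF != checksum:
            raise ValueError("corrupted frame")
        return struct.unpack(">BH", body)

    # Example: site 7 reports a raw reading of 1023.
    print(parse_frame(build_frame(7, 1023)))  # (7, 1023)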

What is Data Acquisition?


Data acquisition refers to the method used to access and control information or data from the equipment being controlled and monitored. The data accessed are then forwarded to a telemetry system ready for transfer to the different sites. They can be analog or digital information gathered by sensors, such as flowmeters, ammeters, etc. They can also be data used to control equipment such as actuators, relays, valves, motors, etc.
What are the differences between
SCADA and DCS?
Similar to SCADA systems are the Distributed Control Systems (DCS). A DCS is usually used in factories and located within a more confined area. It uses a high-speed communications medium, such as a local area network (LAN). A significant amount of closed-loop control is present on the system. The SCADA system, by contrast, covers larger geographical areas. It may rely on a variety of communication links such as radio and telephone. Closed-loop control is not a high priority in this system.

Supervision vs control

There is, in several industries, considerable confusion over the differences between SCADA systems and distributed control systems (DCS). Generally speaking, a SCADA system usually refers to a system that coordinates, but does not control, processes in real time. The discussion on real-time control is muddied somewhat by newer telecommunications technology, enabling reliable, low-latency, high-speed communications over wide areas. Most differences between SCADA and DCS are culturally determined and can usually be ignored. As communication infrastructures with higher capacity become available, the difference between SCADA and DCS will fade.
Common system components

A SCADA system usually consists of the following subsystems:

A Human-Machine Interface or HMI: the apparatus which presents process data to a human operator, and through which the human operator monitors and controls the process.

A supervisory (computer) system, gathering (acquiring) data on the process and sending commands (control) to the process.

Remote Terminal Units (RTUs) connecting to sensors in the process, converting sensor signals to digital data and sending the digital data to the supervisory system.

Programmable Logic Controllers (PLCs) used as field devices because they are more economical, versatile, flexible, and configurable than special-purpose RTUs.

Communication infrastructure connecting the supervisory system to the Remote Terminal Units.

Systems concepts
The term SCADA usually refers to centralized systems which monitor and control
entire sites, or complexes of systems spread out over large areas (anything
between an industrial plant and a country). Most control actions are performed
automatically by Remote Terminal Units ("RTUs") or by programmable logic
controllers ("PLCs"). Host control functions are usually restricted to basic
overriding or supervisory level intervention. For example, a PLC may control the
flow of cooling water through part of an industrial process, but the SCADA system
may allow operators to change the set points for the flow, and enable alarm
conditions, such as loss of flow and high temperature, to be displayed and
recorded. The feedback control loop passes through the RTU or PLC, while the
SCADA system monitors the overall performance of the loop.
Data acquisition begins at the RTU or PLC level and includes meter readings and
equipment status reports that are communicated to SCADA as required. Data is
then compiled and formatted in such a way that a control room operator using
the HMI can make supervisory decisions to adjust or override normal RTU (PLC)
controls. Data may also be fed to a Historian, often built on a commodity
Database Management System, to allow trending and other analytical auditing.

SCADA systems typically implement a distributed database, commonly referred to as a tag database, which contains data elements called tags or points. A point
represents a single input or output value monitored or controlled by the system.
Points can be either "hard" or "soft". A hard point represents an actual input or
output within the system, while a soft point results from logic and math
operations applied to other points. (Most implementations conceptually remove
the distinction by making every property a "soft" point expression, which may, in
the simplest case, equal a single hard point.) Points are normally stored as value-
timestamp pairs: a value, and the timestamp when it was recorded or calculated.
A series of value-timestamp pairs gives the history of that point. It's also common
to store additional metadata with tags, such as the path to a field device or PLC
register, design time comments, and alarm information.
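The following Python sketch illustrates the tag/point idea described above: hard points store value-timestamp pairs, while a soft point derives its value from other points. The class and tag names are hypothetical, not any vendor's schema.

    import time

    class Point:
        def __init__(self, tag, metadata=None):
            self.tag = tag                  # e.g. "LINE_A.FLOW"
            self.metadata = metadata or {}  # device path, comments, alarm info
            self.history = []               # list of (value, timestamp) pairs

        def update(self, value):
            self.history.append((value, time.time()))

        @property
        def value(self):
            return self.history[-1][0] if self.history else None

    class SoftPoint(Point):
        # A point computed from other points by a logic/math expression.
        def __init__(self, tag, inputs, formula):
            super().__init__(tag)
            self.inputs, self.formula = inputs, formula

        def recompute(self):
            self.update(self.formula(*[p.value for p in self.inputs]))

    # Hypothetical tags: total flow derived from two hard flow points.
    flow_a, flow_b = Point("LINE_A.FLOW"), Point("LINE_B.FLOW")
    total = SoftPoint("PLANT.TOTAL_FLOW", [flow_a, flow_b], lambda a, b: a + b)
    flow_a.update(12.5); flow_b.update(7.5); total.recompute()
    print(total.value)  # 20.0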
Human Machine Interface

Typical Basic SCADA Animations

A Human-Machine Interface or HMI is the apparatus which presents process data to a human operator, and through which the human operator controls the process.

An HMI is usually linked to the SCADA system's databases and software programs,
to provide trending, diagnostic data, and management information such as
scheduled maintenance procedures, logistic information, detailed schematics for
a particular sensor or machine, and expert-system troubleshooting guides.

The HMI system usually presents the information to the operating personnel
graphically, in the form of a mimic diagram. This means that the operator can see
a schematic representation of the plant being controlled. For example, a picture
of a pump connected to a pipe can show the operator that the pump is running
and how much fluid it is pumping through the pipe at the moment. The operator
can then switch the pump off. The HMI software will show the flow rate of the
fluid in the pipe decrease in real time. Mimic diagrams may consist of line
graphics and schematic symbols to represent process elements, or may consist of
digital photographs of the process equipment overlain with animated symbols.

The HMI package for the SCADA system typically includes a drawing program that
the operators or system maintenance personnel use to change the way these
points are represented in the interface. These representations can be as simple as
an on-screen traffic light, which represents the state of an actual traffic light in
the field, or as complex as a multi-projector display representing the position of
all of the elevators in a skyscraper or all of the trains on a railway.

An important part of most SCADA implementations is alarm handling. The system monitors whether certain alarm conditions are satisfied, to determine when an
alarm event has occurred. Once an alarm event has been detected, one or more
actions are taken (such as the activation of one or more alarm indicators, and
perhaps the generation of email or text messages so that management or remote
SCADA operators are informed). In many cases, a SCADA operator may have to
acknowledge the alarm event; this may deactivate some alarm indicators,
whereas other indicators remain active until the alarm conditions are cleared.
Alarm conditions can be explicit - for example, an alarm point is a digital status
point that has either the value NORMAL or ALARM that is calculated by a formula
based on the values in other analogue and digital points - or implicit: the SCADA
system might automatically monitor whether the value in an analogue point lies
outside high and low limit values associated with that point. Examples of alarm
indicators include a siren, a pop-up box on a screen, or a coloured or flashing area
on a screen (that might act in a similar way to the "fuel tank empty" light in a car);
in each case, the role of the alarm indicator is to draw the operator's attention to
the part of the system 'in alarm' so that appropriate action can be taken. In
designing SCADA systems, care is needed in coping with a cascade of alarm events
occurring in a short time, otherwise the underlying cause (which might not be the
earliest event detected) may get lost in the noise. Unfortunately, when used as a
noun, the word 'alarm' is used rather loosely in the industry; thus, depending on
context it might mean an alarm point, an alarm indicator, or an alarm event.
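As a minimal sketch of the implicit alarm check described above, the following Python fragment flags an analogue point whose value leaves its high/low limits; the tag name, limits and notification mechanism are assumptions for illustration.

    def check_alarm(tag, value, low, high):
        # Implicit alarm: the value is compared against the point's limits.
        state = "ALARM" if (value < low or value > high) else "NORMAL"
        if state == "ALARM":
            notify_operator(tag, value)  # e.g. flash an indicator, send email
        return state

    def notify_operator(tag, value):
        print(f"ALARM: {tag} = {value} (awaiting operator acknowledgement)")

    # Hypothetical point: tank level with limits of 10% and 90%.
    print(check_alarm("TANK1.LEVEL", 97.0, low=10.0, high=90.0))  # ALARM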
Components of SCADA System
A SCADA system is composed of the following:

1. Field Instrumentation

2. Remote Stations

3. Communications Network

4. Central Monitoring Station

Field Instrumentation refers to the sensors and actuators that are directly interfaced
to the plant or equipment. They generate the analog and digital signals that will be
monitored by the Remote Station. Signals are also conditioned to make sure they are
compatible with the inputs/outputs of the RTU or PLC at the Remote Station.

The Remote Station is installed at the remote plant or equipment being monitored
and controlled by the central host computer. This can be a Remote Terminal Unit
(RTU) or a Programmable Logic Controller (PLC).

The Communications Network is the medium for transferring information from one
location to another. This can be via telephone line, radio or cable.

The Central Monitoring Station (CMS) refers to the location of the master or host computer. Several workstations may be configured on the CMS, if necessary. It uses a Man-Machine Interface (MMI) program to monitor the various types of data needed for the operation. The following is a sample configuration of a SCADA system for water distribution.
SCADA Component:
Field Instrumentation

Field Instrumentation refers to the devices that are connected to the equipment or machines being controlled and monitored by the SCADA system. These are sensors for monitoring certain parameters, and actuators for controlling certain modules of the system.

These instruments convert physical parameters (e.g., fluid flow, velocity, fluid level) to electrical signals (e.g., voltage or current) readable by the Remote Station equipment. Outputs can be either analog (a continuous range) or digital (discrete values). Some of the industry-standard analog outputs of these sensors are 0 to 5 volts, 0 to 10 volts, 4 to 20 mA and 0 to 20 mA. The voltage outputs are used when the sensors are installed near the controllers (RTU or PLC). The current outputs are used when the sensors are located far from the controllers.
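The sketch below shows how a 4 to 20 mA current-loop reading is typically scaled to engineering units at the RTU or PLC; the sensor span used here (a 0 to 5 m level sensor) is an assumed example.

    def scale_4_20ma(current_ma, eng_low, eng_high):
        # Map 4 mA to eng_low and 20 mA to eng_high, linearly in between.
        if not 4.0 <= current_ma <= 20.0:
            raise ValueError("signal outside loop range; possible wiring fault")
        fraction = (current_ma - 4.0) / (20.0 - 4.0)
        return eng_low + fraction * (eng_high - eng_low)

    # Example: a level sensor spanning 0 to 5 m reports 12 mA (mid-scale).
    print(scale_4_20ma(12.0, 0.0, 5.0))  # 2.5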

Digital outputs are used to differentiate the discrete status of the equipment. Usually,
<1> is used to mean EQUIPMENT ON and <0> for EQUIPMENT OFF status. This may
also mean <1> for FULL or <0> for EMPTY.

Actuators are used to turn certain equipment on or off. Likewise, digital and analog inputs are used for control. For example, digital inputs can be used to turn modules of the equipment on and off, while analog inputs are used to control the speed of a motor or the position of a motorized valve.
Remote Station
Field instrumentation connected to the plant or equipment being monitored and
controlled are interfaced to the Remote Station to allow process manipulation at a
remote site. It is also used to gather data from the equipment and transfer them to
the central SCADA system. The Remote Station may either be an RTU (Remote
Terminal Unit) or a PLC (Programmable Logic Controller). It may also be a single board
or modular unit.

RTU versus PLC

The RTU (Remote Terminal Unit) is a ruggedized computer with very good radio interfacing. It is used in situations where communications are more difficult. One disadvantage of the RTU is its poor programmability. However, modern RTUs now offer good programmability, comparable to PLCs.

The PLC (Programmable Logic Controller) is a small industrial computer usually found in factories. Its main use is to replace the relay logic of a plant or process. Today, the PLC is being used in SCADA systems due to its very good programmability. Earlier PLCs had no serial communication ports for interfacing to radios for transferring data. Nowadays, PLCs have extensive communication features and wide support for the popular radio units used in SCADA systems. In the near future we will likely see the merging of the RTU and the PLC.

Micrologic offers an inexpensive RTU for SCADA systems where a PLC would be an overkill solution. It is a microcontroller-based RTU and can be interfaced to radio modems for transmitting data to the CMS.
Single Board versus Modular Unit

The Remote Station is usually available in two types, namely, the single board and the
modular unit. The single board provides a fixed number of input/output (I/O)
interfaces. It is cheaper, but does not offer easy expandability to a more sophisticated
system. The modular type is an expandable remote station and is more expensive than the single-board unit. Usually a backplane is used to connect the modules. Any I/O or communication modules needed for future expansion may be easily plugged into the backplane.

Communication Network

The Communication Network refers to the communication equipment needed to transfer data to and from different sites. The medium used can be cable, telephone or radio.

The use of cable is usually implemented in a factory. It is not practical for systems covering large geographical areas because of the high cost of the cables, conduits and the extensive labor of installing them.

The use of telephone lines (i.e., leased or dial-up) is a cheaper solution for systems with large coverage. A leased line is used for systems requiring on-line connection with the remote stations. This is expensive, since one telephone line is needed per site; besides, leased lines are more expensive than ordinary telephone lines. Dial-up lines can be used on systems requiring updates at regular intervals (e.g., hourly updates); here, ordinary telephone lines can be used. The host can dial the number of a particular remote site to get the readings and send commands.

Remote sites are usually not accessible by telephone lines. The use of radio offers an economical solution. Radio modems are used to connect the remote sites to the host. On-line operation can also be implemented on the radio system. For locations where a direct radio link cannot be established, a radio repeater is used to link these sites.
Central Monitoring Station (CMS)

The Central Monitoring Station (CMS) is the master unit of the SCADA system. It is in charge of collecting the information gathered by the remote stations and of generating the necessary actions for any event detected. The CMS can have a single-computer configuration, or it can be networked to workstations to allow sharing of information from the SCADA system.

A Man-Machine Interface (MMI) program runs on the CMS computer. A mimic diagram of the whole plant or process can be displayed on screen for easier identification with the real system. Each I/O point of the remote units can be displayed with a corresponding graphical representation and the present I/O reading. A flow reading can be displayed on a graphical representation of a flowmeter. A reservoir can be displayed with the corresponding fluid contents depending on the actual tank level. Set-up parameters such as trip values, limits, etc. are entered on this program and downloaded to the corresponding remote units to update their operating parameters.

The MMI program can also create a separate window for alarms. The alarm window can display the alarm tag name, description, value, trip point value, time, date and other pertinent information. All alarms are saved in a separate file for later review. Trending of required points can be programmed on the system, and trending graphs can be viewed or printed at a later time. Generation of management reports can also be scheduled for a specific time of day, on a periodic basis, upon operator request, or initiated by alarm events. Access to the program is permitted only to qualified operators. Each user is given a password and a privilege level to access only particular areas of the program. All actions taken by the users are logged in a file for later review.
MMI Screen Showing Pipe System Diagram and Repair Areas

Typical System Configurations

There are two typical network configurations for the wireless telemetry radio-based
SCADA systems. They are the point-to-point and the point-to-multipoint
configurations.

1. Point-to-Point Configuration

2. Point-to-Multipoint Configuration

1. Point-to-Point Configuration
The Point-to-Point configuration is the simplest set-up for a telemetry system. Here data is exchanged between two stations. One station can be set up as the master and the other as the slave. An example is a set-up of two RTUs: one for a reservoir or tank and the other for a water pump at a different location. Whenever the tank is nearly empty, the RTU at the tank will send an EMPTY command to the other RTU. Upon receiving this command, the RTU at the water pump will start pumping water to the tank. When the tank is full, the tank's RTU will send a FULL command to the pump's RTU to stop the motor.

Point-to-Point Configuration
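A minimal Python sketch of the tank/pump exchange described above follows; the RTUs, thresholds and the EMPTY/FULL commands are modelled as plain objects for illustration only.

    class PumpRTU:
        def __init__(self):
            self.running = False

        def receive(self, command):
            if command == "EMPTY":
                self.running = True    # start pumping water to the tank
            elif command == "FULL":
                self.running = False   # stop the motor

    class TankRTU:
        def __init__(self, peer, low=10.0, high=95.0):
            self.peer, self.low, self.high = peer, low, high

        def report_level(self, percent):
            if percent <= self.low:
                self.peer.receive("EMPTY")  # tank nearly empty
            elif percent >= self.high:
                self.peer.receive("FULL")   # tank full

    pump = PumpRTU()
    tank = TankRTU(pump)
    tank.report_level(5.0);  print(pump.running)   # True
    tank.report_level(96.0); print(pump.running)   # False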
2. Point-to-Multipoint Configuration
The Point-to-Multipoint configuration is where one device is designated as the master unit to several slave units. The master is usually the main host and is located at the control room, while the slaves are the remote units at the remote sites. Each slave is assigned a unique address or identification number.

Point-to-Multipoint Configuration
Modes of Communication
There are two modes of communication available, namely, the polled system and the
interrupt system.

1. Polled System

In the Polled or Master/Slave system, the master is in total control of communications. The master makes a regular poll for data (i.e., sends and receives data) to each slave in sequence. The slave unit responds to the master only when it receives a request. This is called the half-duplex method. Each slave unit has its own unique address to allow correct identification. If a slave does not respond within a predetermined period of time, the master retries the poll a number of times before continuing to poll the next slave unit.

Advantages:

The process of data gathering is fairly simple

No collisions can occur on the network

Link failure can be easily detected

Disadvantages:

Interrupt-type requests from a slave requiring immediate action cannot be handled immediately

Waiting time increases with the number of slaves

All communication between slaves has to pass through the master, with added complexity
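The following Python sketch illustrates one polling cycle with retries, as described above; the addresses, retry limit and simulated radio link are assumptions for illustration.

    import random

    def poll_slave(address):
        # Stand-in for a half-duplex request/response over radio or telephone.
        if random.random() < 0.2:  # simulate an unreachable slave
            return None
        return {"address": address, "reading": random.uniform(0, 100)}

    def polling_cycle(slave_addresses, max_retries=3):
        results = {}
        for address in slave_addresses:   # each slave is polled in sequence
            for _ in range(max_retries):  # retry before moving on
                reply = poll_slave(address)
                if reply is not None:
                    results[address] = reply["reading"]
                    break
            else:
                results[address] = "LINK FAILURE"  # easily detected by master
        return results

    print(polling_cycle([1, 2, 3]))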
2. Interrupt System

The interrupt system is also referred to as a Report-by-Exception (RBE) configured system. Here the slave monitors its inputs. When it detects a significant change or when a limit is exceeded, the slave initiates communication with the master and transfers the data. The system is designed with error detection and recovery processes to cope with collisions. Before any unit transmits, it must first check whether any other unit is transmitting. This can be done by first detecting the carrier of the transmission medium. If another unit is transmitting, some form of random delay time is required before it tries again. Excessive collisions result in erratic system operation and possible system failure. To cope with this, if after several attempts the slave still fails to transmit a message to the master, it waits until polled by the master.

Advantages:

The system reduces the unnecessary transfer of data that occurs in polled systems

Quick detection of urgent status information

Allows slave-to-slave communication

Disadvantages:

The master may only detect a link failure after a period of time, that is, when the system is polled

Operator action is needed to obtain the latest values

Collisions of data may occur and may cause delays in communication
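A minimal sketch of report-by-exception with carrier sensing and a random back-off, as described above; the channel model, deadband and attempt limit are illustrative assumptions.

    import random, time

    def channel_busy():
        # Stand-in for detecting another unit's carrier on the medium.
        return random.random() < 0.3

    def report_by_exception(last_value, new_value, deadband, max_attempts=5):
        if abs(new_value - last_value) < deadband:
            return "no report"  # change is not significant
        for _ in range(max_attempts):
            if not channel_busy():  # listen before transmitting
                return "sent"
            time.sleep(random.uniform(0.0, 0.05))  # random delay, then retry
        return "wait for poll"  # give up until polled by the master

    print(report_by_exception(50.0, 50.2, deadband=1.0))  # no report
    print(report_by_exception(50.0, 75.0, deadband=1.0))  # sent or wait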
SCADA Example Application

Sedimentation Tank: Monitor on/off status of pumps. Control coliform, TSS, and on/off status of pumps.

Clarifier: Monitor torque. Control on/off status and torque alarms.

Generator: Monitor and control temperatures and flow rates within the exhaust heat recovery unit and heat exchanger.

Trickling Filter: Monitor on/off status of pumps and blowers, dissolved oxygen, flow rate, and wetwell level. Control on/off status of pumps and blowers.

Chlorine Contact Tank: Monitor ORP. Control Cl2 and SO2 injection.

Digester: Monitor and control temperature.


SCADA System Benefits

1. Control units function as PLCs, RTUs, or DCUs.

2. Control units perform advanced measurement and control independent of the central computer.
3. PID control continues, even if communications to the main computer are
lost.
4. Control units have many channel types to measure most available sensors.
5. Systems are compatible with our own or other vendors' HMI software
packages.
6. Control units have their own UPS; during AC power loss, they continue to measure and store time-stamped data.
7. Control units provide on-board statistical and mathematical processing.
8. Systems are easily expandable: add new sites or add sensors to existing
sites.
9. Control units have wide operating temperature ranges and operate in
rugged environments.
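Benefit 3 above notes that PID control continues locally even if communications to the main computer are lost; see the sketch below. It is a textbook discrete PID loop, not any particular control unit's firmware, and the gains are illustrative.

    class PID:
        def __init__(self, kp, ki, kd, setpoint):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.setpoint = setpoint    # can be changed remotely via SCADA
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, measurement, dt):
            # Classic PID: proportional + integral + derivative terms.
            error = self.setpoint - measurement
            self.integral += error * dt
            derivative = (error - self.prev_error) / dt
            self.prev_error = error
            return (self.kp * error + self.ki * self.integral
                    + self.kd * derivative)

    loop = PID(kp=2.0, ki=0.5, kd=0.1, setpoint=60.0)
    print(loop.update(measurement=55.0, dt=1.0))  # output drives toward 60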
Futuristic Technology for SCADA - BAN
What is BAN?

BAN stands for Body Area Network. It is still under research at Chiba University, Japan, under the supervision of Prof. Hideyuki Nebiya.

This technology works on the basic principle that the human body possesses its own electric field and is a good conductor of electricity.

Prof. Nebiya and his team found that the human body possesses an electric field of its own, and whenever the human body comes in contact with some electrical equipment (say, if someone touches a TV screen), the intensity of the body's electric field increases considerably.

From the above research, they concluded that the human body can be used for transmission using signals of different frequencies.
What is human body communication?

It is a means of communication between devices via the human body. It will contribute to a reduction in information leakage because the communication is carried out through the human body. In addition, the transmission loss is believed to be smaller than that of wireless spatial transmission, enabling wireless communication with low power consumption.

In this technology, an electronic card is used which acts as a transmitter. Signals are transmitted through the human body. The receiver on the other side receives the signal, responds to confirm that the signal has been received, and acts accordingly.

Electronic Transmitter and Electronic Receiver


APPLICATIONS
It can be used for locking and unlocking of doors as well as medical and health
care purposes. It can also be adopted by the entertainment field for transmission
of music and image information. And don't forget automotive keyless entry
systems and wearable computing systems.

1. MEDICAL FIELD
In the medical field, a wristwatch-like device is given to the patient to wear, which keeps an eye on the patient's state and records every change that takes place in the patient's body. Nurses also wear a similar device; when they touch the patient, all the data is transferred to the nurse's device, which then transmits it to the concerned doctor. In this way, many patients can be monitored at the same time from a central control room using SCADA.
2. CORPORATE SECTOR
In this field the technology plays a vital role: company officials need not carry bulky files, or even a pen to sign a contract, as all data is stored in their Electronic Data Cards and is transferred to another person on permission. To sign a contract, two persons just need to shake hands, and their signed contracts are exchanged through their Electronic Data Cards.
3. Entertainment Industry
In this field the technology is going to flourish at the maximum rate. A simple example gives an idea of how useful and amazing a development it is going to make in this sector.

Nowadays iPods have wires for the headphones, but with this technology there is no need for wired headphones. Just switch on the music player, put on your headphones and enjoy; the human body acts as the carrier in this case.

This is not all; the real application is yet to come. Imagine your friend also wants to listen to the same music. All he has to do is wear a headset and hold your hand, and it is done. In a similar way, up to 15 persons can connect and listen to music at the same time. Further research is needed to develop the technology to higher levels.
How about the progress in the development of its usage
and market?
A number of companies have been working on the employment of
human body communication technologies and actually developed
prototypes. A variety of companies, including electronic manufacturers,
mobile phone companies, office equipment manufacturers, automobile
manufacturers and house builders, showed demonstrations. In the
demonstrations, they used the products incorporating human body
communication technologies, such as keys for locking and unlocking,
cash registers for retail shops and transmission and common use of
video, music and textual information in the entertainment field. Some
demonstrations could be seen at Security Show 2009 and IC Card World
2009, both of which took place in March 2009. We have recently
received many inquiries about their applications to sensor networks,
which are drawing attention from not only the industrial sector but also
from the medical and healthcare sectors. People in medical
organizations and healthcare companies seem to be placing greater
expectations on human body communication technologies probably
because the human body information gained by sensing can be
transmitted to devices via the human body.

THIS TECHNOLOGY RECONNECTS WIDELY SEPARATED HUMAN BEINGS WITH A TOUCH.
LIMITATIONS:-
Like every other technology, this technology also has some limitations; some are as follows:-

1. Loss of information and threat of stealing important information.

2. Physical effects on human body.


Researchers are still working on these factors so that the technology can be used commercially.
HUMAN MACHINE INTERFACE (HMI)

Introduction

To work with a system, users have to be able to control and assess the state of
the system. For example, when driving an automobile, the driver uses the steering
wheel to control the direction of the vehicle, and the accelerator pedal, brake
pedal and gearstick to control the speed of the vehicle. The driver perceives the
position of the vehicle by looking through the windshield, and the exact speed of the vehicle by reading the speedometer. The user interface of the automobile is on
the whole composed of the instruments the driver can use to accomplish the
tasks of driving and maintaining the automobile.

Terminology

There is a distinct difference between a User Interface and an Operator Interface or Human-Machine Interface (HMI).

The term user interface is often used in the context of (personal) computer
systems and electronic devices
o where a network of equipment or computers are interlinked through
an MES (Manufacturing Execution System)-or Host.
o An HMI is typically local to one machine or piece of equipment, and is
the interface method between the human and the
equipment/machine. An Operator Interface is the interface method by which multiple pieces of equipment linked by a host control system are accessed or controlled.
The system may expose several user interfaces to serve different kinds of
users. For example, a computerized library database might provide two
user interfaces, one for library patrons (limited set of functions, optimized
for ease of use) and the other for library personnel (wide set of functions,
optimized for efficiency). The user interface of a mechanical system, a vehicle or an industrial installation is sometimes referred to as the human-machine interface (HMI). HMI is a modification of the original term MMI (man-machine interface). In practice, the abbreviation MMI is still frequently used, although some may claim that MMI stands for something different now. Another abbreviation is HCI, but it is more commonly used to mean human-computer interaction. Other terms used are operator interface
console (OIC) and operator interface terminal (OIT). However it is
abbreviated, the terms refer to the 'layer' that separates a human that is
operating a machine from the machine itself.

In science fiction, HMI is sometimes used to refer to what is better described as a direct neural interface. However, this latter usage is seeing increasing application in the real-life use of (medical) prostheses: the artificial extensions that replace missing body parts (e.g., cochlear implants).

In some circumstances, computers might observe the user and react according to
their actions without specific commands. A means of tracking parts of the body is
required, and sensors noting the position of the head, direction of gaze and so on
have been used experimentally. This is particularly relevant to immersive
interfaces.

Definition: The Human-Machine Interface is quite literally where the human and the machine meet. It is the area of the human and the area of the machine that interact during a given task.

Interaction can include touch, sight, sound, heat transference or any other
physical or cognitive function.

Also Known As: Man-Machine Interface


Examples: A typical computer station will have four human-machine interfaces,
the keyboard (hand), the mouse (hand), the monitor (eyes) and the speakers
(ears).
Goals
A basic goal of HMI is to improve the interactions between users and machines (computers) by making computers more usable and receptive to the user's needs. Specifically, HMI is concerned with:

methodologies and processes for designing interfaces (i.e., given a task and
a class of users, design the best possible interface within given constraints,
optimizing for a desired property such as learning ability or efficiency of
use)
methods for implementing interfaces (e.g. software toolkits and libraries;
efficient algorithms)
techniques for evaluating and comparing interfaces
developing new interfaces and interaction techniques
developing descriptive and predictive models and theories of interaction

A long term goal of HMI is to design systems that minimize the barrier between
the human's cognitive model of what they want to accomplish and the
computer's understanding of the user's task.

Professional practitioners in HMI are usually designers concerned with the practical application of design methodologies to real-world problems. Their work often revolves around designing graphical user interfaces and web interfaces.

Researchers in HMI are interested in developing new design methodologies, experimenting with new hardware devices, prototyping new software systems, exploring new paradigms for interaction, and developing models and theories of interaction.
HUMAN-MACHINE INTERACTION

Human-Machine interaction (HMI) is the study of interaction between people (users) and machines (computers). Interaction between users and machines occurs at the user interface (or simply interface), which includes both software and hardware; for example, characters or objects displayed by software on a personal computer's monitor, input received from users via hardware peripherals such as keyboards and mice, and other user interactions with large-scale computerized systems such as aircraft and power plants. The Association for Computing Machinery defines human-machine interaction as "a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them." An important facet of HMI is the securing of user satisfaction.

Because human-machine interaction studies a human and a machine in conjunction, it draws on supporting knowledge on both the machine and the human side. On the machine side, techniques in computer graphics, operating systems, programming languages, and development environments are relevant. On the human side, communication theory, graphic and industrial design disciplines, linguistics, social sciences, cognitive psychology, and human factors are relevant. Engineering and design methods are also relevant. Due to the multidisciplinary nature of HCI, people with different backgrounds contribute to its success. HCI is also sometimes referred to as human-machine interaction (HMI) or computer-human interaction (CHI).
HUMAN-MACHINE (COMPUTER)
INTERFACE

The Human-Machine (computer) interface can be described as the point of communication between the human user and the computer. The flow of information between the human and the computer is defined as the loop of interaction. The loop of interaction has several aspects to it, including:

Task Environment: The conditions and goals set upon the user.
Machine Environment: The environment that the computer is connected
to, i.e. a laptop in a college student's dorm room.
Areas of the Interface: Non-overlapping areas involve processes of the
human and computer not pertaining to their interaction. Meanwhile, the
overlapping areas only concern themselves with the processes pertaining to
their interaction.
Input Flow: The flow of information that begins in the task environment,
when the user has some task that requires using their computer.
Output: The flow of information that originates in the machine
environment.
Feedback: Loops through the interface that evaluate, moderate, and
confirm processes as they pass from the human through the interface to
the computer and back.
Usability

User interfaces are considered by some authors to be a prime ingredient of computer user satisfaction.

The design of a user interface affects the amount of effort the user must expend
to provide input for the system and to interpret the output of the system, and
how much effort it takes to learn how to do this. Usability is the degree to which
the design of a particular user interface takes into account the human psychology
and physiology of the users, and makes the process of using the system effective, efficient and satisfying.

Usability is mainly a characteristic of the user interface, but is also associated with
the functionalities of the product and the process to design it. It describes how
well a product can be used for its intended purpose by its target users with
efficiency, effectiveness, and satisfaction, also taking into account the
requirements from its context of use.

Consistency

A key property of a good user interface is consistency. There are three important aspects. First, the controls for different features should be presented in a consistent manner so that users can find the controls easily. For example, users find it very difficult to use software when some commands are available through menus, some through icons, and some through right-clicks. A good user interface might provide shortcuts or "synonyms" that provide parallel access to a feature, but users should not have to search multiple sources to find what they're looking for. Second, the "principle of least astonishment" is crucial. Various features should work in similar ways. For example, some features in Adobe Acrobat are "select tool, then select text to which to apply it." Others are "select text, then apply action to selection."

Third, user interfaces should not change version-to-version: user interfaces must remain upward compatible.

Good user interface design is about setting and meeting user expectations. Better (from a programmer's point of view) is not better. The same (from a user's point of view) is better.
USER INTERFACES IN COMPUTING

In computer science and human-computer interaction, the user interface (of a computer program) refers to the graphical, textual and auditory information the program presents to the user, and the control sequences (such as keystrokes with the computer keyboard, movements of the computer mouse, and selections with the touchscreen) the user employs to control the program.

Types

Currently (as of 2009) the following types of user interface are the most common:

Graphical user interfaces (GUI) accept input via devices such as the computer keyboard and mouse and provide articulated graphical output on the computer monitor. There are at least two different principles widely used in GUI design: object-oriented user interfaces (OOUIs) and application-oriented interfaces.
Web-based user interfaces or web user interfaces (WUI) accept input and
provide output by generating web pages which are transmitted via the
Internet and viewed by the user using a web browser program. Newer
implementations utilize Java, AJAX, Adobe Flex, Microsoft .NET, or similar
technologies to provide real-time control in a separate program,
eliminating the need to refresh a traditional HTML based web browser.
Administrative web interfaces for web-servers, servers and networked
computers are often called Control panels.

User interfaces that are common in various fields outside desktop computing:

Command line interfaces, where the user provides the input by typing a
command string with the computer keyboard and the system provides
output by printing text on the computer monitor. Used by programmers
and system administrators, in engineering and scientific environments, and
by technically advanced personal computer users.
Tactile interfaces supplement or replace other forms of output with haptic
feedback methods. Used in computerized simulators etc.
Touch user interfaces are graphical user interfaces using a touchscreen display as a combined input and output device. Used in many types of point of sale, industrial processes and machines, self-service machines etc.

Other types of user interfaces:

Attentive user interfaces manage the user's attention, deciding when to interrupt the user, the kind of warnings, and the level of detail of the messages presented to the user.
Batch interfaces are non-interactive user interfaces, where the user
specifies all the details of the batch job in advance to batch processing, and
receives the output when all the processing is done. The computer does not
prompt for further input after the processing has started.
Conversational Interface Agents attempt to personify the computer
interface in the form of an animated person, robot, or other character
(such as Microsoft's Clippy the paperclip), and present interactions in a
conversational form.
Crossing-based interfaces are graphical user interfaces in which the
primary task consists in crossing boundaries instead of pointing.
Gesture interfaces are graphical user interfaces which accept input in the form of hand gestures, or mouse gestures sketched with a computer mouse or a stylus.
Intelligent user interfaces are human-machine interfaces that aim to
improve the efficiency, effectiveness, and naturalness of human-machine
interaction by representing, reasoning, and acting on models of the user,
domain, task, discourse, and media (e.g., graphics, natural language,
gesture).
Motion tracking interfaces monitor the user's body motions and translate them into commands; these are currently being developed by Apple.[2]
Multi-screen interfaces employ multiple displays to provide a more flexible interaction. This is often employed in computer game interaction, both in commercial arcades and more recently in the handheld market.
Noncommand user interfaces, which observe the user to infer his / her
needs and intentions, without requiring that he / she formulate explicit
commands.
Object-oriented user interface (OOUI)
Reflexive user interfaces where the users control and redefine the entire
system via the user interface alone, for instance to change its command
verbs. Typically this is only possible with very rich graphic user interfaces.
Tangible user interfaces, which place a greater emphasis on touch and the physical environment or its elements.
Task-focused interfaces are user interfaces which address the information overload problem of the desktop metaphor by making tasks, not files, the primary unit of interaction.
Text user interfaces are user interfaces which output text, but accept other forms of input in addition to or in place of typed command strings.
Voice user interfaces, which accept input and provide output by generating
voice prompts. The user input is made by pressing keys or buttons, or
responding verbally to the interface.
Natural-language interfaces are used for search engines and on webpages: the user types in a question and waits for a response.
Zero-Input interfaces get inputs from a set of sensors instead of querying
the user with input dialogs.
Zooming user interfaces are graphical user interfaces in which information
objects are represented at different levels of scale and detail, and where
the user can change the scale of the viewed area in order to show more
detail.
DESIGN PRINCIPLES

When evaluating a current user interface, or designing a new user interface, it is important to keep in mind the following experimental design principles:

Early focus on user(s) and task(s): Establish how many users are needed to
perform the task(s) and determine who the appropriate users should be;
someone that has never used the interface, and will not use the interface in
the future, is most likely not a valid user. In addition, define the task(s) the
users will be performing and how often the task(s) need to be performed.
Empirical measurement: Test the interface early on with real users who
come in contact with the interface on an everyday basis. Keep in mind that
results may be altered if the performance level of the user is not an
accurate depiction of the real human-computer interaction. Establish
quantitative usability specifics such as: the number of users performing the
task(s), the time to complete the task(s), and the number of errors made
during the task(s).
Iterative design: After determining the users, tasks, and empirical
measurements to include, perform the following iterative design steps:

1. Design the user interface
2. Test
3. Analyze results
4. Repeat

Repeat the iterative design process until a sensible, user-friendly interface is created.
DESIGN METHODOLOGIES
A number of diverse methodologies outlining techniques for Human-Machine interaction design have emerged since the rise of the field in the 1980s. Most
design methodologies stem from a model for how users, designers, and technical
systems interact. Early methodologies, for example, treated users' cognitive
processes as predictable and quantifiable and encouraged design practitioners to
look to cognitive science results in areas such as memory and attention when
designing user interfaces. Modern models tend to focus on a constant feedback
and conversation between users, designers, and engineers and push for technical
systems to be wrapped around the types of experiences users want to have,
rather than wrapping user experience around a completed system.

User-centered design: user-centered design (UCD) is a modern, widely practiced design philosophy rooted in the idea that users must take center stage in the design of any computer system. Users, designers and technical
practitioners work together to articulate the wants, needs and limitations
of the user and create a system that addresses these elements. Often, user-
centered design projects are informed by ethnographic studies of the
environments in which users will be interacting with the system. This
practice is similar but not identical to Participatory Design, which
emphasizes the possibility for end-users to contribute actively through
shared design sessions and workshops.

Principles of User Interface Design: these are seven principles that may be
considered at any time during the design of a user interface in any order,
namely Tolerance, Simplicity, Visibility, Affordance, Consistency, Structure
and Feedback.

Display designs:- Displays are human-made artifacts designed to support the perception of relevant system variables and to facilitate further processing of that
information. Before a display is designed, the task that the display is intended to
support must be defined (e.g. navigating, controlling, decision making, learning,
entertaining, etc.). A user or operator must be able to process whatever
information that a system generates and displays; therefore, the information
must be displayed according to principles in a manner that will support
perception, situation awareness, and understanding.
THIRTEEN PRINCIPLES OF DISPLAY DESIGN
These principles of human perception and information processing can be utilized
to create an effective display design. A reduction in errors, a reduction in required
training time, an increase in efficiency, and an increase in user satisfaction are a
few of the many potential benefits that can be achieved through utilization of
these principles.

Certain principles may not be applicable to different displays or situations. Some principles may seem to be conflicting, and there is no simple solution to say that
one principle is more important than another. The principles may be tailored to a
specific design or situation. Striking a functional balance among the principles is
critical for an effective design.

Perceptual Principles
1. Make displays legible (or audible)

A display's legibility is critical and necessary for designing a usable display. If the characters or objects being displayed are not discernible, then the operator cannot effectively make use of them.

2. Avoid absolute judgment limits

Do not ask the user to determine the level of a variable on the basis of a single
sensory variable (e.g. color, size, loudness). These sensory variables can contain
many possible levels.

3. Top-down processing

Signals are likely perceived and interpreted in accordance with what is expected based on a user's past experience. If a signal is presented contrary to the user's expectation, more physical evidence of that signal may need to be presented to ensure that it is understood correctly.

4. Redundancy gain

If a signal is presented more than once, it is more likely that it will be understood
correctly. This can be done by presenting the signal in alternative physical forms
(e.g. color and shape, voice and print, etc.), as redundancy does not imply
repetition. A traffic light is a good example of redundancy, as color and position
are redundant.

5. Similarity causes confusion: Use discriminable elements

Signals that appear to be similar will likely be confused. The ratio of similar
features to different features causes signals to be similar. For example, A423B9 is
more similar to A423B8 than 92 is to 93. Unnecessary similar features should be
removed and dissimilar features should be highlighted.

Mental Model Principles

6. Principle of pictorial realism

A display should look like the variable that it represents (e.g. high temperature on
a thermometer shown as a higher vertical level). If there are multiple elements,
they can be configured in a manner that looks like it would in the represented
environment.

7. Principle of the moving part

Moving elements should move in a pattern and direction compatible with the user's mental model of how the element actually moves in the system. For example, the moving element on an altimeter should move upward with increasing altitude.

Principles Based on Attention

8. Minimizing information access cost

When the user's attention is diverted from one location to another to access necessary information, there is an associated cost in time or effort. A display design should minimize this cost by allowing frequently accessed sources to be located at the nearest possible position. However, adequate legibility should not be sacrificed to reduce this cost.

9. Proximity compatibility principle

Divided attention between two information sources may be necessary for the
completion of one task. These sources must be mentally integrated and are
defined to have close mental proximity. Information access costs should be low,
which can be achieved in many ways (e.g. proximity, linkage by common colors,
patterns, shapes, etc.). However, close display proximity can be harmful by
causing too much clutter.

10. Principle of multiple resources

A user can more easily process information across different resources. For
example, visual and auditory information can be presented simultaneously rather
than presenting all visual or all auditory information.

Memory Principles

11. Replace memory with visual information: knowledge in the world

A user should not need to retain important information solely in working memory
or to retrieve it from long-term memory. A menu, checklist, or another display
can aid the user by easing the use of their memory. However, the use of memory
may sometimes benefit the user by eliminating the need to reference some type
of knowledge in the world (e.g. an expert computer operator would rather use
direct commands from memory than refer to a manual). The use of knowledge in a user's head and knowledge in the world must be balanced for an effective design.

12. Principle of predictive aiding

Proactive actions are usually more effective than reactive actions. A display should attempt to eliminate resource-demanding cognitive tasks and replace them with simpler perceptual tasks to reduce the use of the user's mental resources. This will allow the user not only to focus on current conditions, but also to think about possible future conditions. An example of a predictive aid is a road sign displaying the distance from a certain destination.

13. Principle of consistency

Old habits from other displays will easily transfer to support the processing of new displays if they are designed in a consistent manner. A user's long-term memory will trigger actions that are expected to be appropriate. A design must accept this fact and utilize consistency among different displays.
MODALITIES AND MODES
A modality is a path of communication employed by the user interface to carry
input and output. Examples of modalities:

Input: a computer keyboard allows the user to enter typed text; a digitizing tablet allows the user to create free-form drawings.

Output: a computer monitor allows the system to display text and graphics (vision modality); a loudspeaker allows the system to produce sound (auditory modality).

The user interface may employ several redundant input modalities and output
modalities, allowing the user to choose which ones to use for interaction.

A mode is a distinct method of operation within a computer program, in which the same input can produce different perceived results depending on the state of the computer program. Heavy use of modes often reduces the usability of a user interface, as the user must expend effort to remember current mode states, and switch between mode states as necessary.
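The following Python sketch shows mode-dependent input in the spirit of vi-style editors: the same keystroke produces different results depending on the program's current mode. The key bindings are illustrative.

    class ModedEditor:
        def __init__(self):
            self.mode = "COMMAND"
            self.text = ""

        def key(self, ch):
            if self.mode == "COMMAND":
                if ch == "i":
                    self.mode = "INSERT"        # 'i' now means "start typing"
                elif ch == "x":
                    self.text = self.text[:-1]  # 'x' deletes a character
            elif self.mode == "INSERT":
                if ch == "\x1b":                # Escape returns to command mode
                    self.mode = "COMMAND"
                else:
                    self.text += ch             # keys are inserted literally

    editor = ModedEditor()
    for ch in "ihi\x1bx":  # enter insert mode, type "hi", Escape, delete one
        editor.key(ch)
    print(editor.text)  # "h"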

In the industrial design field of human-machine interaction, the user interface is (a place) where interaction between humans and machines occurs. The goal of interaction between a human and a machine at the user interface is effective operation and control of the machine, and feedback from the machine which aids the operator in making operational decisions. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls, and process controls. The design considerations applicable when creating user interfaces are related to or involve such disciplines as ergonomics and psychology.

A user interface is the system by which people (users) interact with a machine.
The user interface includes hardware (physical) and software (logical)
components. User interfaces exist for various systems, and provide a means of:

Input, allowing the users to manipulate a system, and/or

Output, allowing the system to indicate the effects of the users' manipulation.
Generally, the goal of human-machine interaction engineering is to produce a user interface which makes it easy, efficient, and enjoyable to operate a machine in the way which produces the desired result. This generally means that the operator needs to provide minimal input to achieve the desired output, and also that the machine minimizes undesired outputs to the human.

Ever since the increased use of personal computers and the relative decline in
societal awareness of heavy machinery, the term user interface has taken on
overtones of the (graphical) user interface, while industrial control panel and
machinery control design discussions more commonly refer to human-machine
interfaces.

Other terms for user interface include human-computer interface (HCI) and man-
machine interface (MMI).
INTERACTION TECHNIQUE

Fold n' Drop, a crossing-based interaction technique for dragging and dropping
files between overlapping windows.

An interaction technique, user interface technique or input technique is a combination of hardware and software elements that provides a way for computer users to accomplish a single task. For example, one can go back to the previously visited page in a Web browser by clicking a button, pressing a key, performing a mouse gesture or uttering a speech command. It is a key concept in human-computer interaction.

Definition

Although there is no general agreement on the exact meaning of the term "interaction technique", the most popular definition is from the computer graphics literature:

An interaction technique is a way of using a physical input/output device to perform a generic task in a human-computer dialogue.

A more recent variation is: An interaction technique is the fusion of input and
output, consisting of all software and hardware elements, that provides a way for
the user to accomplish a task.
The computing view

From the computer's perspective, an interaction technique involves:

One or several input devices that capture user input,
One or several output devices that display user feedback,
A piece of software that:
o interprets user input into commands the computer can understand,
o produces user feedback based on user input and the system's state.

Consider for example the process of deleting a file using a contextual menu. This
assumes the existence of a mouse (input device), a screen (output device), and a
piece of code that paints a menu and updates its selection (user feedback) and
sends a command to the file system when the user clicks on the "delete" item
(interpretation). User feedback can be further used to confirm that the command
has been invoked.
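The three ingredients just described can be seen in a toy C sketch of the file-deletion example. All names are invented for illustration and do not represent real windowing-system code: input events arrive, feedback is painted, and the "delete" click is interpreted as a command:

    #include <stdio.h>

    enum event_type { RIGHT_CLICK, MENU_ITEM_CLICK };

    struct event { enum event_type type; const char *target; };

    static void show_menu(const char *target)    /* user feedback */
    {
        printf("menu for '%s': [delete] [rename]\n", target);
    }

    static void delete_file(const char *target)  /* interpretation */
    {
        printf("file system command: delete '%s'\n", target);
        printf("'%s' deleted\n", target);        /* feedback confirming the command */
    }

    static void handle_event(struct event e)
    {
        switch (e.type) {
        case RIGHT_CLICK:     show_menu(e.target);   break;
        case MENU_ITEM_CLICK: delete_file(e.target); break;
        }
    }

    int main(void)
    {
        handle_event((struct event){ RIGHT_CLICK,     "report.txt" });
        handle_event((struct event){ MENU_ITEM_CLICK, "report.txt" });
        return 0;
    }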

The user's view

From the user's perspective, an interaction technique is a way to perform a single
computing task and can be informally expressed with user instructions or usage
scenarios. For example: "to delete a file, right-click on the file you want to delete,
then click on the delete item".

The designer's view

From the user interface designer's perspective, an interaction technique is a well-
defined solution to a specific user interface design problem. Interaction
techniques as conceptual ideas can be refined, extended, modified and combined.
For example, contextual menus are a solution to the problem of rapidly selecting
commands, and pie menus are a radial variant of contextual menus.

Level of granularity

Interaction techniques are usually fine-grained entities. For example, a desktop
environment is too complex to be an interaction technique, whereas Exposé fits
the common intuitive understanding of the term perfectly well. In general, a user
interface can be seen as a combination of many interaction techniques, some of
which are not necessarily as explicit as widgets.

Interaction tasks and domain objects

An interaction task is "the unit of an entry of information by the user" [1], such as
entering a piece of text, issuing a command, or specifying a 2D position. A similar
concept is that of domain object, which is a piece of application data that can be
manipulated by the user.[3]

Interaction techniques are the glue between physical I/O devices and interaction
tasks or domain objects. Different types of interaction techniques can be used to
map a specific device to a specific domain object. For example, different gesture
alphabets exist for pen-based text input.

In general, the less compatible the device is with the domain object, the more
complex the interaction technique. For example, using a mouse to specify a 2D
point involves a trivial interaction technique, whereas using a mouse to rotate a
3D object requires more creativity to design the technique and more lines of code
to implement it.
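For instance, one simple (though by no means the only) way to bridge such an incompatibility is to map 2D mouse deltas onto yaw and pitch angles of the 3D object. The C sketch below is an illustrative mapping under that assumption, not a canonical technique:

    #include <stdio.h>

    /* Hypothetical sketch: mapping a 2D input device (mouse deltas)
       to a 3D domain object (an orientation given as yaw/pitch). */
    struct orientation { float yaw, pitch; };   /* radians */

    static void rotate_from_mouse(struct orientation *o, int dx, int dy)
    {
        const float sensitivity = 0.01f;        /* radians per pixel, tunable */
        o->yaw   += dx * sensitivity;
        o->pitch += dy * sensitivity;
        /* Clamp pitch so the object cannot flip over the poles. */
        if (o->pitch >  1.5f) o->pitch =  1.5f;
        if (o->pitch < -1.5f) o->pitch = -1.5f;
    }

    int main(void)
    {
        struct orientation o = { 0.0f, 0.0f };
        rotate_from_mouse(&o, 40, -25);          /* one mouse movement */
        printf("yaw=%.2f pitch=%.2f\n", o.yaw, o.pitch);
        return 0;
    }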

A current trend is to avoid complex interaction techniques by matching physical
devices to the task as closely as possible, as exemplified by the field of tangible
computing. But this is not always a feasible solution. Furthermore, device/task
incompatibilities are unavoidable in computer accessibility, where a single switch
can be used to control the whole computer environment.

Interaction style

Interaction techniques that share the same metaphor or design principles can be
seen as belonging to the same interaction style. General examples are command
line and direct manipulation user interfaces.

Visualization technique

Interaction techniques essentially involve data manipulation and thus place
greater emphasis on input than on output. Output is merely used to convey
affordances and provide user feedback. The use of the term input technique
further reinforces the central role of input. Conversely, techniques that mainly
involve data exploration, and thus place greater emphasis on output, are called
visualization techniques. They are studied in the field of information visualization.

Research and innovation

A large part of research in human-computer interaction involves exploring easier-
to-learn or more efficient interaction techniques for common computing tasks.
This includes inventing new (post-WIMP) interaction techniques, possibly relying
on methods from user interface design, and comparing them with existing
techniques using methods from experimental psychology. Examples of scientific
venues for these topics are the UIST and CHI conferences. Other research
focuses on the specification of interaction techniques, sometimes using
formalisms such as Petri nets for the purposes of formal verification.
HUMAN INTERFACE DEVICES

A human interface device or HID is a type of computer device that interacts
directly with, and most often takes input from, humans, and may deliver output
to humans. The term "HID" most commonly refers to the USB-HID specification.
The term was coined by Mike Van Flandern of Microsoft when he proposed that
the USB committee create a Human Input Device class working group. The
working group was renamed the Human Interface Device class at the suggestion
of Tom Schmidt of DEC, because the proposed standard supported bi-directional
communication.

History

The primary motivations for HID were to enable innovation in PC input devices
and to simplify the process of installing them. Prior to HID, devices usually
conformed to very narrowly defined protocols for mice, keyboards and joysticks;
for example, the standard mouse protocol at the time supported relative X- and
Y-axis data and binary input for up to two buttons. Any innovation in hardware
required either overloading the use of data in an existing protocol or creating
custom device drivers and evangelizing a new protocol to application developers.
By contrast, all HID devices deliver self-describing packages that may contain an
infinite variety of data types and formats. A single HID driver on the PC parses the
data and enables dynamic association of data I/O with application functionality.
This has enabled rapid innovation and proliferation of new human interface
devices.

The HID standard was developed by a working committee with representatives
from several companies; the list of participants can be found in the "Device
Class Definition for Human Interface Devices (HID)" document. The concept of a
self-describing extensible protocol was initially conceived by Mike Van Flandern
and Manolito Adan, working on a project named Raptor at Microsoft, and
independently by Steve McGowan, working on a device protocol for Access Bus
while at Forte. After comparing notes at a Consumer Game Developer
Conference, Steve and Mike agreed to collaborate on a new standard for the
emerging Universal Serial Bus.
HARDWARE
Hardware input/output devices and peripherals:

List of input devices
o Unit record equipment
o Barcode scanner
o Keyboard
   Computer keyboard
   Keyboard shortcut
   Ways to make typing more efficient: command history,
   autocomplete, autoreplace and IntelliSense
o Microphone
o Pointing device
   Computer mouse
   Mouse chording

List of output devices
o Visual devices
   Graphical output device
   Display device
   Computer display
   Video projector
   Computer printer
   Plotter
o Auditory devices
   Speakers
   Earphones
o Tactile devices
   Refreshable Braille display
   Braille embosser
   Haptic devices
Common HIDs
o Keyboard
o Mouse, Trackball, Touchpad, Pointing stick
o Graphics tablet
o Joystick, Gamepad, Analog stick
o Webcam
o Headset

Less common HIDs
o Driving simulator devices and flight simulator devices, which have
   HIDs such as gear sticks, steering wheels and pedals
o Wired glove (e.g. the Nintendo Power Glove)
o Dance pad
o Wii Remote
o Surface computing device
o Apple's Sudden Motion Sensor (SMS) device in Macs

Most operating systems will recognize standard USB HID devices, such as
keyboards and mice, without needing a special driver. When such a device is
installed, a message saying that a "HID-compliant device" has been recognized
generally appears on screen. In comparison, this message does not usually
appear for devices connected via the 6-pin mini-DIN PS/2 connectors that
preceded USB. PS/2 does not support plug-and-play, which means that
connecting a PS/2 keyboard or mouse with the computer powered on does not
always work. In addition, PS/2 does not support the HID protocol. A USB HID is
described by the USB human interface device class.
Components of the HID protocol

In the HID protocol, there are two entities: the "host" and the "device". The
device is the entity that directly interacts with a human, such as a keyboard or
mouse. The host communicates with the device and receives input data from the
device about actions performed by the human. Output data flows from the host
to the device and then to the human. The most common example of a host is a
computer, but some cell phones and PDAs can also be hosts.

The HID protocol makes implementation of devices very simple. Devices define
their data packets and then present a "HID descriptor" to the host. The HID
descriptor is a hard-coded array of bytes that describes the device's data packets.
This includes how many packets the device supports, how large the packets are,
and the purpose of each byte and bit in the packet. For example, a keyboard with
a calculator program button can tell the host that the button's pressed/released
state is stored as the 2nd bit in the 6th byte of data packet number 4 (note: these
locations are only illustrative and are device-specific). The device typically stores
the HID descriptor in ROM and does not need to intrinsically understand or parse
it. Some mouse and keyboard hardware on the market today is implemented
using only an 8-bit CPU.
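As a concrete illustration, the well-known 3-button mouse report descriptor used as the worked example in the HID specification appendix can be stored as the following hard-coded C byte array (the array name and surrounding program are ours; a real product's descriptor may differ):

    #include <stdio.h>

    static const unsigned char mouse_report_descriptor[] = {
        0x05, 0x01,  /* Usage Page (Generic Desktop)         */
        0x09, 0x02,  /* Usage (Mouse)                        */
        0xA1, 0x01,  /* Collection (Application)             */
        0x09, 0x01,  /*   Usage (Pointer)                    */
        0xA1, 0x00,  /*   Collection (Physical)              */
        0x05, 0x09,  /*     Usage Page (Buttons)             */
        0x19, 0x01,  /*     Usage Minimum (Button 1)         */
        0x29, 0x03,  /*     Usage Maximum (Button 3)         */
        0x15, 0x00,  /*     Logical Minimum (0)              */
        0x25, 0x01,  /*     Logical Maximum (1)              */
        0x95, 0x03,  /*     Report Count (3)                 */
        0x75, 0x01,  /*     Report Size (1 bit per button)   */
        0x81, 0x02,  /*     Input (Data, Variable, Absolute) */
        0x95, 0x01,  /*     Report Count (1)                 */
        0x75, 0x05,  /*     Report Size (5) -- padding bits  */
        0x81, 0x01,  /*     Input (Constant)                 */
        0x05, 0x01,  /*     Usage Page (Generic Desktop)     */
        0x09, 0x30,  /*     Usage (X)                        */
        0x09, 0x31,  /*     Usage (Y)                        */
        0x15, 0x81,  /*     Logical Minimum (-127)           */
        0x25, 0x7F,  /*     Logical Maximum (127)            */
        0x75, 0x08,  /*     Report Size (8)                  */
        0x95, 0x02,  /*     Report Count (2)                 */
        0x81, 0x06,  /*     Input (Data, Variable, Relative) */
        0xC0,        /*   End Collection                     */
        0xC0         /* End Collection                       */
    };

    int main(void)
    {
        printf("descriptor length: %zu bytes\n",
               sizeof mouse_report_descriptor);
        return 0;
    }

The host parses this byte array to learn that each input packet carries three button bits, five padding bits, and two signed relative-motion bytes.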

The host is expected to be a more complex entity than the device. The host needs
to retrieve the HID descriptor from the device and parse it before it can fully
communicate with the device. Parsing the HID descriptor can be complicated.
Multiple operating systems are known to have shipped bugs in the device drivers
responsible for parsing the HID descriptors years after the device drivers were
originally released to the public. However, this complexity is the reason why rapid
innovation with HID devices is possible.

The above mechanism describes what is known as HID "report protocol". Because
it was understood that not all hosts would be capable of parsing HID descriptors,
HID also defines a "boot protocol". In boot protocol, only specific devices with
specific features are supported, because fixed data packet formats are used (see
the struct sketch after the list below). The HID descriptor is not used in this mode,
so innovation is limited. However, the benefit is that minimal functionality is still
possible on hosts that would otherwise be unable to support HID. The only
devices supported in boot protocol are:

Keyboard: Any of the first 256 key codes ("Usages") defined in the HID
Usage Tables, Usage Page 7, can be reported by a keyboard using the boot
protocol, but most systems only handle a subset of these keys. Most
systems support all 104 keys of the IBM AT-101 layout, plus the three new
keys designed for Microsoft Windows 95. Many systems also support
additional keys on basic western European 105-, Korean 106-, Brazilian
ABNT 107- and Japanese DOS/V 109-key layouts. Buttons, knobs and keys
that are not reported on Usage Page 7 are not available; for example, a
particular US keyboard's QWERTY keys will function, but its Calculator and
Logoff keys will not, because they are defined on Usage Page 12 and cannot
be reported in boot protocol.

Mouse: Only the X-axis, the Y-axis, and the first three buttons will be
available. Any additional features on the mouse will not function.
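The fixed packet formats behind these two boot-protocol device types can be written down as C structs. The field names below are our own, but the byte layouts follow the formats fixed by the HID specification:

    #include <stdint.h>
    #include <stdio.h>

    /* Boot-protocol keyboard report: 8 bytes, fixed layout. */
    struct boot_keyboard_report {
        uint8_t modifiers;  /* bitmap: left/right Ctrl, Shift, Alt, GUI */
        uint8_t reserved;
        uint8_t keys[6];    /* up to six concurrent Usage Page 7 key codes */
    };

    /* Boot-protocol mouse report: 3 bytes, fixed layout. */
    struct boot_mouse_report {
        uint8_t buttons;    /* bits 0-2: the first three buttons */
        int8_t  x;          /* relative X movement */
        int8_t  y;          /* relative Y movement */
    };

    int main(void)
    {
        printf("keyboard report: %zu bytes, mouse report: %zu bytes\n",
               sizeof(struct boot_keyboard_report),
               sizeof(struct boot_mouse_report));
        return 0;
    }

Because these layouts never change, a host can consume them without ever reading a HID descriptor, which is exactly what makes boot protocol workable in minimal environments.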

One common usage of boot protocol is during the first moments of a computer's
boot-up sequence; directly configuring a computer's BIOS is often done using
only boot protocol.

Other protocols using HID

Although HID was originally defined over USB, it is now also used on other
computer communication buses. This enables HID devices that traditionally were
only found on USB to be used on alternative buses as well. This is done because
existing support for USB HID devices can typically be adapted much faster than
an entirely new protocol could be invented to support mice, keyboards, and the
like. Known buses that use HID are:

Bluetooth HID: used over Bluetooth, a wireless communications
technology; several Bluetooth mice and keyboards already exist in the
marketplace.
Serial HID: used in Microsoft's Windows Media Center PC remote control
receivers.
BIBLIOGRAPHY
Wikipedia Encyclopedia
NHK World.jp
