
1. In your own words -- What is your understanding of HCI? Please explain in detail but limit your response to not more than one page.

ANS:

HUMAN-COMPUTER INTERACTION

Interaction with computers has become so pervasive that it has reshaped the way people live, and future human life will be far more dependent on technology than ever before. HCI is mainly concerned with designing the parts of a system that the user interacts with. Over the past couple of decades, usability has improved immensely, providing far better interaction between humans and computers. Considering society and human life, the growth of HCI should enable a less stressful life, as computers do more of the work while humans simply give instructions. Ever since the computer and its technology became household equipment, interaction with computers stopped being something that belonged only to IT professionals and became a reality for ordinary people as well. Gradually, with the inclusion of applications such as drawing programs, word processors and spreadsheets, HCI became a normal part of people's daily routine.

2. What is HCI and why is it important to all computer users?

ANS:

Human–Computer Interaction (HCI) is the study of the way in which computer technology

influences human work and activities. The term “computer technology” nowadays includes

most technology from obvious computers with screens and keyboards to mobile phones,
household appliances, in-car navigation systems and even embedded sensors and actuators such

as automatic lighting. HCI has an associated design discipline, sometimes called Interaction

Design or User-Centered Design, focused on how to design computer technology so that it is as

easy and pleasant to use as possible.

The human brain is not optimized for the abstract thinking and data memorization that websites

often demand. Many usability guidelines are dictated by cognitive limitations. People can't keep

much information in their short-term memory. This is especially true when they're bombarded

with multiple abstract or unusual pieces of data in rapid succession. People live in a curious and modern world where they go to great lengths to embrace technology, and it is people who pick and choose which tools they can use most efficiently. HCI (Human-Computer Interaction) is one of the catalysts that has revolutionised computer technology over the past three decades. The concept of HCI came into the spotlight after extensive research; the improvement of GUIs paved the way for better and more advanced interaction between humans and computers. Interaction with computers has become so pervasive that it now shapes human lifestyles, and future human life will be far more dependent on technology than ever before.

Human-computer interaction arose as a field from intertwined roots in computer graphics,

operating systems, human factors, ergonomics, industrial engineering, cognitive psychology, and

the systems part of computer science.

3. In discussions of HCI, some distinguish between interaction and interface, while others tend to use these two words interchangeably. In your own words, define interaction and interface in a clear and detailed form using concrete examples.


ANS:

The interface is the part of a system that the user actually sees and manipulates: the screens, windows, buttons, menus and other controls through which input is given and output is received. The interaction is what happens across that interface: the ongoing dialogue in which the user acts, the computer responds, and the user interprets the response and acts again.

For example, when you use an operating system such as Windows, the desktop, icons and dialog boxes you work with are the interface, while the way you drag a file to the Recycle Bin, watch it disappear, and undo the action if you made a mistake is the interaction. The developers who write that interface in a language such as C++ or Java are designing the interaction, because they decide how the system will respond to each thing the user does.

Explaining the differences in labels, especially between these two, is mostly about fine

distinctions. Wherever you go, you should try to learn about all of these areas.

HCI was the term used when I was doing academic research into the psychology of how humans

interacted with computers.

Some examples of why good interfaces and interactions matter include:

• National competitiveness. Information technology is one of the drivers for increased

productivity. As more and more workers use computers in their jobs, training time and

ease-of-use issues become economically more and more important.


• National security. Computer-based command, control, communications, and intelligence

systems are at the heart of our military infrastructure. Interfaces between operators and

computers are found in cockpits, on the bridge, and in the field. To be effective, these

systems must have high-quality human-computer interfaces.


• Growth of the computer and communications industries. Powerful, interesting, and usable

applications are the fuel for continuing growth of these industries. The current growth

cycle is the direct consequence of the graphical user interface developed by Xerox and
commercialized by Apple and Microsoft, and of the lower computer costs made possible

by the microprocessor. The resulting mass market supports commodity pricing for both

hardware and software. Future growth cycles will in part be driven by current HCI

research, which will lead to new applications that are increasingly easy to use.

Important applications of computers in medicine are possible only if they are both useful and

easy to use by doctors, nurses, and aides; similarly, use of computers in education requires that

they be both useful and easy to use by students and teachers.

4. HCI in its current state has moved beyond designing interfaces for desktop computers and embedded devices to extending and supporting all manner of human activities in a wide variety of settings and places. HCI facilitates user experiences by designing interactions that make work effective, efficient and safer. HCI has also improved and enhanced learning and training processes. HCI equally provides enjoyable and exciting entertainment, enhances communication and understanding, and supports new forms of creativity and expression. Please categorize and explain in detail the achievements and benefits of HCI enumerated above.

ANS:

Major changes have occurred within the computer revolution; changes which encompass all

aspects of its role. These are not just quantitative in nature, such as exponential increases in

processing power and storage capacity, but are more fundamental, pointing not only to the
function of computer technology, but its emerging diversity both in terms of its form and place in

the world. Computers are now embedded within a huge range of materials and artefacts, and take

on roles in almost all aspects of life. People and lifestyles are changing. These changes are

sometimes spurred on by technology, but other times work in parallel or provoke technological

innovation. There is a global scale of change which is taking place hand in hand with new

technologies. This gives rise to tensions between individuals and governments, and between

globalization and cultural diversity.

In the last few years, new input techniques have been developed that are richer and less prone to

the many shortcomings of keyboard and mouse interaction. For example, there are tablet

computers that use stylus-based interaction on a screen, and even paper-based systems that

digitally capture markings made on specialized paper using a camera embedded in a pen. These

developments support interaction through sketching and handwriting. Speech-recognition systems support a different kind of ‘natural’ interaction, allowing people to issue commands

and dictate through voice. Meanwhile, multi-touch surfaces enable interaction with the hands and

the fingertips on touch-sensitive surfaces, allowing us to manipulate objects digitally as if they

were physical. From GUIs to multi-touch, speech to gesturing, the ways we interact with

computers are diversifying as never before. Two-handed and multi-fingered input is providing a

more natural and flexible means of interaction beyond the single point of contact offered by

either the mouse or stylus. The shift to multiple points of input also supports novel forms of

interaction where people can share a single interface by gathering around it and interacting

together.

One example is the Hot Hand device: a ring worn by electric guitar players that uses motion sensors and a wireless transmitter to create different kinds of sound effects from various hand gestures.
Eye movements have been used for many years as a way of supporting the disabled in interacting

with computers, but now we are also seeing the advent of ‘brain computer interfaces’. Such

systems allow, for example, people with severe physical disabilities to use their brain waves to

interact with their environment. Real-time brainwave activity is beginning to be used to control

digital movies, turn on music, and switch the lights on and off. These interfaces can even control

robot arms, allowing paralyzed individuals to manipulate objects. Input can also be a by-product

of our activities in the world at large. For example, our location can be sensed through GPS and

our movements can be captured using CCTV cameras, providing inputs to a range of interactive

technologies.

5. User-centered design is an approach to User Interface (UI) design where the needs of the user are paramount, where the user is involved in the design process, and which always involves the development of prototype interfaces. Please elaborate on this premise and explain why prototyping is crucial in UI design.

ANS:

The user-centered design (UCD) process is also called the human-centered design process. ISO 13407 (1999), Human-centred design processes for interactive systems, states: "Human-centered design is an approach to interactive system development that focuses specifically on making systems usable. It is a multi-disciplinary activity."

User-Centered Design (UCD) is a user interface design process that focuses on usability goals,

user characteristics, environment, tasks, and workflow in the design of an interface. UCD
follows a series of well-defined methods and techniques for analysis, design, and evaluation of

mainstream hardware, software, and web interfaces. The UCD process is an iterative process,

where design and evaluation steps are built in from the first stage of projects, through

implementation.

UCD Principles

1. Early focus on users and tasks

o Structured and systematic information gathering (consistent across the board)

o Designers trained by experts before conducting data collection sessions

2. Empirical Measurement and testing of product usage

o Focus on ease of learning and ease of use

o Testing of prototypes with actual users

3. Iterative Design

o Product designed, modified and tested repeatedly.

o Allow for the complete overhaul and rethinking of design by early testing of

conceptual models and design ideas.

Usability

The goal of UCD is to produce products that have a high degree of usability. Usability objectives include:


• Usefulness - the product enables users to achieve their goals - the tasks it was designed to carry out and/or the wants and needs of the user.

• Effectiveness (ease of use) - quantitatively measured by speed of performance or error rate, and tied to a target percentage of users (a sketch of such measurements follows this list).

• Learnability - the user's ability to operate the system to some defined level of competence after a predetermined period of training. It also refers to the ability of infrequent users to relearn the system.

• Attitude (likeability) - user's perceptions, feelings and opinions of the product, usually

captured through both written and oral communication.
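
To make the empirical-measurement and effectiveness objectives above concrete, here is a minimal sketch in Python of how speed of performance and error rate might be summarised from usability-test logs. The record fields, the 30-second benchmark and the sample data are illustrative assumptions, not part of any standard.

# Minimal sketch: summarising effectiveness metrics from usability-test logs.
# The record fields and the 30-second benchmark are illustrative assumptions.
from dataclasses import dataclass
from statistics import mean

@dataclass
class TaskRecord:
    participant: str
    completed: bool        # did the participant finish the task?
    seconds: float         # time taken on the task
    errors: int            # number of errors made during the task

def summarize(records: list[TaskRecord], time_benchmark: float = 30.0) -> dict:
    """Return simple effectiveness measures for one task."""
    completed = [r for r in records if r.completed]
    return {
        "success_rate": len(completed) / len(records),      # share of users who finished
        "mean_time_s": mean(r.seconds for r in completed) if completed else None,
        "mean_errors": mean(r.errors for r in records),
        # "tied to a percentage of users": share finishing within the benchmark time
        "within_benchmark": sum(r.seconds <= time_benchmark for r in completed) / len(records),
    }

if __name__ == "__main__":
    logs = [TaskRecord("P1", True, 24.0, 0),
            TaskRecord("P2", True, 41.5, 2),
            TaskRecord("P3", False, 90.0, 5)]
    print(summarize(logs))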

User Interface Design Basics -

User Interface (UI) Design focuses on anticipating what users might need to do and ensuring that

the interface has elements that are easy to access, understand, and use to facilitate those actions.

UI design brings together concepts from interaction design, visual design, and information architecture.

Interface elements include but are not limited to:

• Input Controls: buttons, text fields, checkboxes, radio buttons, dropdown lists, list boxes,

toggles, date field

• Navigational Components: breadcrumb, slider, search field, pagination, tags, icons

• Informational Components: tooltips, icons, progress bar, notifications, message boxes,

modal windows
• Containers: accordion (a minimal sketch combining a few of these elements follows below)
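
As a rough illustration of how a few of these element categories come together in code, here is a minimal sketch using Python's standard tkinter toolkit; the window contents and labels are invented for the example and do not correspond to any particular product.

# Minimal sketch: a window combining input controls (entry, checkbox, dropdown,
# button) with informational components (progress bar, message box), using
# Python's built-in tkinter toolkit. All labels and values are invented.
import tkinter as tk
from tkinter import ttk, messagebox

root = tk.Tk()
root.title("UI element sketch")

# Input controls: a text field, a checkbox and a dropdown list.
ttk.Label(root, text="Name:").grid(row=0, column=0, sticky="e", padx=5, pady=5)
name = ttk.Entry(root)
name.grid(row=0, column=1, padx=5, pady=5)

subscribe = tk.BooleanVar()
ttk.Checkbutton(root, text="Subscribe to updates", variable=subscribe).grid(
    row=1, column=0, columnspan=2, sticky="w", padx=5)

ttk.Label(root, text="Country:").grid(row=2, column=0, sticky="e", padx=5, pady=5)
country = ttk.Combobox(root, values=["Nigeria", "India", "Sri Lanka"], state="readonly")
country.grid(row=2, column=1, padx=5, pady=5)

# Informational component: a progress bar communicating system status.
ttk.Progressbar(root, mode="determinate", maximum=100, value=40).grid(
    row=3, column=0, columnspan=2, sticky="we", padx=5, pady=5)

# A button whose action is confirmed through a message box (informational component).
def submit():
    messagebox.showinfo("Saved", f"Saved details for {name.get() or 'anonymous'}")

ttk.Button(root, text="Submit", command=submit).grid(row=4, column=0, columnspan=2, pady=5)

root.mainloop()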

There are times when multiple elements might be appropriate for displaying content. When this happens, it’s important to consider the trade-offs. For example, elements that save screen space, such as dropdowns, can put more of a mental burden on the user by forcing them to guess what is hidden within the element.

Best Practices for Designing an Interface –

• Keep the interface simple. The best interfaces are almost invisible to the user. They avoid

unnecessary elements and are clear in the language they use on labels and in messaging.

• Create consistency and use common UI elements. By using common elements in your UI,

users feel more comfortable and are able to get things done more quickly. It is also

important to create patterns in language, layout and design throughout the site to help

facilitate efficiency. Once a user learns how to do something, they should be able to

transfer that skill to other parts of the site.

• Be purposeful in page layout. Consider the spatial relationships between items on the

page and structure the page based on importance. Careful placement of items can help

draw attention to the most important pieces of information and can aid scanning and

readability.

• Strategically use color and texture. You can direct attention toward or redirect attention

away from items using color, light, contrast, and texture to your advantage.
• Use typography to create hierarchy and clarity. Carefully consider how you use typefaces. Different sizes, fonts, and arrangements of the text help increase scannability, legibility and readability.

• Make sure that the system communicates what’s happening. Always inform your users

of location, actions, changes in state, or errors. The use of various UI elements to

communicate status and, if necessary, next steps can reduce frustration for your user.

Unique Aspects of HCI and Usability in the Privacy and Security Domain

Although many HCI techniques are general, there are unique aspects in the design of privacy and

security systems that present challenges and opportunities.

First, a key issue to consider is that security and privacy are rarely the user’s main goal. Users

value and want security and privacy, of course, but they regard them as secondary to primary tasks like completing an online banking transaction or ordering medications.

Users would like privacy and security systems and controls to be as transparent as possible. On

the other hand, users want to be in control of the situation and understand what is happening.

As more of people’s interactions in daily life involve the use of computing technology and

sensitive information, disparate types of users must be accommodated. Security solutions in

particular have historically been designed with a highly trained technical user in mind. The user

community has broadened extensively as organizational business processes have come to include

new types of roles and users in the security and privacy area. Many compliance and policy roles

in organizations are handled by legal and business process experts who have limited technical

skills.
The negative impact that usability problems can have is higher for security and privacy

applications than for many other types of systems. Complexity is at the very heart of many

security and privacy solutions; from an HCI point of view, that complexity is the enemy of

success. If a system is so complex that whole groups of users (e.g., technical users, business

users, and end users) cannot understand it, costly errors will occur.

6. In your view and in your own words, briefly discuss what you foresee as the future of domain-specific interfaces in relation to our future lifestyle, all things being equal. Assume that the entire world is developing at the same rate.

ANS:

Technologists of the future will have to understand much more about human psychology and

physiology

Four converging technologies are going to transform our interaction with the world radically:

genetics, robotics, information and nanotechnology. With a few exceptions, such as pacemakers

and artificial hips, technology has always been at one remove from our bodies and brains. Not for much longer. The interface between the world and us is going to become

almost invisible.

The monolithic device with a screen may be on the verge of disappearing. It is being enhanced

with numerous smaller devices, which may soon replace it as the way to access information. We

will arrive at a more ambient experience where sensors capture information about us and feed
that information into systems quietly working away in the background. Wearables will give way

to "embeddables", nano-scale machinery inside our bodies, which can monitor us and modify us.

The first domains that we are seeing using this technology are the obvious ones, such as

healthcare and fitness. But the possibilities extend further. Communications, entertainment,

socialising, learning, work, even self-actualisation – any human activity we can think of is going to

be modified and amplified with an invisible mesh of data and processing that we will drift

through (mostly obliviously). We can start to think about these systems as mental and sensory

prosthetics, increasing our knowledge and perception of the world.

Humans enhance their bodies both unknowingly and consciously, almost instinctively, either to gain social acceptance or to improve their physical attributes. Few people are totally content with the bodies they ended up in; the difference is that now we have the opportunity to change them in much more fundamental ways, from the gene up. The enhanced human will have improved

attributes such as sensing, thinking (aided by computation) and in more physical ways, such as

endurance, resistance and longevity. If you think this is far-fetched, recent developments in

artificial organ technology and robotics are bringing this sci-fi scenario closer than we may

imagine.

The technologists and designers of the future will have to understand much more about human

psychology and physiology to deliver appropriate services. These new services will be sitting so

close to us that they will have to find the right balance between unobtrusiveness and affordance.

Algorithms and learning systems will be crucial to take effort away from the users of these

services. Services that can predict our needs without us having to intervene will be the ones that

resonate and find an audience. Those that get in the way of our daily activity will be discarded.
The Apple Watch is beautifully designed and engineered, with a great look and feel. Its chunky,

rounded body is faintly reminiscent of the original iPhone, yet simultaneously modern-looking

and very satisfying to hold. The Apple Watch is also pleasingly comfortable on the wrist.

We've seen lots of fitness trackers over the years, and they've typically struck us as pretty

formulaic: plastic wristbands with little fashion appeal. One activity tracker brand tried to

convince us that their activity tracker was designed to appeal to the fashion-conscious woman;

they even thought that women would wear it around their neck like a necklace. But at the end of

the day it wasn't jewelry. None of the fitness trackers are.

It's a similar story with smartwatches. Sure, over the past year they've become more and more

popular with guys looking for the latest tech gadget, but they don't appeal to everyone. One

major issue is that most smartwatches are designed for men. They wouldn't sit comfortably on a

smaller wrist.

Here are the default Apple Watch apps that will be preinstalled on your device:

• Activity & Workout (see below)

• Apple Pay (in the US only, at least for the time being)

• Apple Remote

• Calendar

• Clock

• Digital Touch

• Friends

• Mail

• Maps

• Messages

• Music

• Phone

• Photos

• Passbook

• Remote Camera

• Siri

• Stocks

• Weather

Not only is the Apple Watch accurate to within 50ms of the Universal Time Standard, it will also

spring forward when daylight saving time begins, and 'fall back', as our American friends would

say, in the autumn. It will also adjust according to the time zone you are in.

The watch face is not on view all the time, presumably to save battery. If you want to see the

time, you have to raise your arm to make the screen come on. Quite a few early reviewers have
complained that this isn't quite instantaneous, and as precious as that might sound, even a second

or so of delay is likely to become highly frustrating when repeated dozens of times every day.

But our issue with this experience has been less about speed - which seems fine to us: close

enough to instantaneous that we don't mind - and more about the way the 'raise wrist to activate'

action doesn't always fire correctly.

You'll get false positives, where your arm is on the desk at what the watch considers to be a

suspect angle and the display will suddenly light up. No harm done, if a little distracting. But

more annoyingly, there will be times when you raise your wrist and it doesn't quite register the

movement. This is only an occasional problem, and presumably will get less common still as we

get used to the action required, but it's still mildly irritating.

Apple has received some criticism for introducing the high-priced Apple Watch Edition, a

strategy that was apparently spearheaded by head of design Jony Ive who wanted to offer a

watch for this part of the market. We think that if a few well-known celebrities are seen wearing

Apple Watches it will help promote it to the masses. By targeting fashion and celebrity, Apple

might well be able to make the Apple Watch appeal to the mainstream rather than geeks and

gadget lovers.

7. The spread of computer-based systems and applications into every walk of life and the anticipated widespread use of emerging telemetric services have introduced new

dimensions to the issue of human-machine interaction, necessitating the design of high-quality user interfaces accessible and usable by a diverse user population. Please explain in

your own words and understanding the above statement or premise. Please do not exceed

one page.

ANS:

Telemetrics is a technology that involves the automatic measurement and transmission of data from remote sources. The process of measuring data at the source and transmitting it automatically is called telemetry; the two terms, telemetry and telemetrics, are often used interchangeably. In general, telemetrics works in the following way: sensors at the source

measure either electrical data (such as voltage or current) or physical data (such as temperature

or pressure). These are converted to specific electrical voltages. A multiplexer combines the

voltages, along with timing data, into a single data stream for transmission to the distant receiver.

Upon reception, the data stream is separated into its original components and the data is

displayed and processed according to user specifications.
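
As a rough sketch of the multiplexing and demultiplexing described above, the following Python example packs a few sensor readings together with a timestamp into one binary frame and unpacks them again at the receiver. The frame layout and channel names are invented for illustration and do not follow any actual telemetry standard.

# Minimal sketch of telemetry multiplexing: several sensor readings plus timing
# data are combined into one binary frame, then separated again on reception.
import struct
import time

CHANNELS = ["voltage_v", "current_a", "temperature_c", "pressure_kpa"]
FRAME_FORMAT = "<d" + "f" * len(CHANNELS)   # 8-byte timestamp + one float per channel

def multiplex(readings: dict[str, float]) -> bytes:
    """Combine the channel readings and a timestamp into a single data frame."""
    return struct.pack(FRAME_FORMAT, time.time(), *(readings[c] for c in CHANNELS))

def demultiplex(frame: bytes) -> dict[str, float]:
    """Separate a received frame back into its original components."""
    timestamp, *values = struct.unpack(FRAME_FORMAT, frame)
    decoded = dict(zip(CHANNELS, values))
    decoded["timestamp"] = timestamp
    return decoded

if __name__ == "__main__":
    frame = multiplex({"voltage_v": 11.9, "current_a": 0.42,
                       "temperature_c": 36.7, "pressure_kpa": 101.3})
    print(demultiplex(frame))   # displayed/processed per the user's specification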

Telemedicine is the use of electronic communications and information technologies to provide

clinical services when participants are at different locations. Closely associated with

telemedicine is the term Telehealth. This term is often used to encompass a broader application

of technologies to distance education, consumer outreach, and other applications wherein

electronic communications and information technologies are used to support healthcare services.

Videoconferencing, transmission of still images, e-health including patient portals, remote

monitoring of vital signs, continuing medical education and nursing call centers are all

considered part of telemedicine and telehealth. Within existing healthcare facilities, a few key

clinical staff members have often led the development of telemedicine applications. As a result, the initial telemedical services that are offered reflect the clinical specialties of those leaders.

Leading examples in the past have included radiology, dermatology, cardiology and pathology.

Telemedicine does not represent a separate medical specialty; rather it is a tool that can be used

by health providers to extend the traditional practice of medicine outside the walls of the typical

medical practice. In addition, telemedicine offers a means to help transform healthcare itself by

encouraging greater consumer involvement in decision making and providing new approaches to

maintaining a healthy lifestyle.

10. Throughout this course we understood that testing is one of the important steps in

maintaining the quality of the software which powers the UI. In order to accomplish this process, we

must establish test plan and its associated elements such as: a) establish objectives for each

test phase, b) establish schedules and responsibilities for each test activity, c) determine

availability of tools, facilities, and test libraries, and finally d) establish procedures,

standards and benchmarks to be used for planning and conducting the tests and reporting

the test results. In your own words, state who is responsible for performing the above functions and describe in detail the purpose and need of each function in the software testing process.

ANS:

Learn to analyze your test results thoroughly. Do not ignore the test result. The final test result

may be ‘pass’ or ‘fail’ but troubleshooting the root cause of ‘fail’ will lead you to the solution of

the problem. Testers will be respected if they not only log the bugs but also provide solutions.
Learn to maximize the test coverage every time you test any application. Though 100 percent test coverage might not be possible, you can always try to get close to it.

While writing test cases, write test cases for the intended functionality first, i.e. for valid conditions according to the requirements. Then write test cases for invalid conditions. This will cover both the expected and the unexpected behavior of the application under test.
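
As a small illustration of writing cases for valid conditions first and invalid conditions afterwards, here is a sketch in Python using pytest against a hypothetical withdraw() function; both the function and the test values are invented for the example.

# Minimal sketch: cover the valid (expected) behaviour first, then the invalid
# (unexpected) inputs. The withdraw() function is a hypothetical example.
import pytest

def withdraw(balance: float, amount: float) -> float:
    """Return the new balance after withdrawing `amount`."""
    if amount <= 0:
        raise ValueError("amount must be positive")
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount

# Valid conditions, according to the requirements.
def test_withdraw_reduces_balance():
    assert withdraw(100.0, 40.0) == 60.0

def test_withdraw_entire_balance():
    assert withdraw(50.0, 50.0) == 0.0

# Invalid conditions: unexpected inputs must be rejected cleanly.
@pytest.mark.parametrize("balance, amount", [(100.0, -5.0), (100.0, 0.0), (10.0, 25.0)])
def test_withdraw_rejects_bad_amounts(balance, amount):
    with pytest.raises(ValueError):
        withdraw(balance, amount)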

Make your test cases available to developers prior to coding. Don't keep your test cases to yourself, waiting for the final application release for testing in the hope of logging more bugs. Let developers analyze your test cases thoroughly so they can develop a quality application. This will also save rework time.

If possible, identify and group your test cases for regression testing. This will ensure quick and effective manual regression testing (one way to do this is sketched below).
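
One possible way to group cases for regression runs, assuming a pytest-based suite, is to tag them with a marker so that the regression subset can be executed on its own; the marker name and the placeholder tests below are illustrative assumptions.

# Minimal sketch: grouping test cases for regression with a pytest marker.
# Register the marker in pytest.ini, e.g.:
#   [pytest]
#   markers = regression: core tests re-run after every change
import pytest

@pytest.mark.regression
def test_login_with_valid_credentials():
    assert True  # placeholder for the real check

def test_rarely_changed_report_layout():
    assert True  # not part of the regression group

# Run only the regression group with:  pytest -m regression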

Applications requiring critical response times should be thoroughly tested for performance. Performance testing is a critical part of many applications. In manual testing, this part is often neglected by testers because the large data volumes required for performance testing are not available. Find ways to test your application for performance. If it is not possible to create test data manually, write some basic scripts to create test data for the performance test, or ask the developers to write one for you.
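
Where the text suggests writing basic scripts to create test data, a minimal sketch of such a script is shown below; it generates a large CSV file of made-up records, and the field names, value ranges and row count are all assumptions for illustration.

# Minimal sketch: generate a large volume of made-up records for a performance test.
import csv
import random
import string

def random_name(length: int = 8) -> str:
    return "".join(random.choices(string.ascii_lowercase, k=length))

def generate_test_data(path: str, rows: int = 100_000) -> None:
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["account_id", "customer_name", "balance", "is_active"])
        for i in range(rows):
            writer.writerow([i + 1,
                             random_name(),
                             round(random.uniform(0, 10_000), 2),
                             random.choice([True, False])])

if __name__ == "__main__":
    generate_test_data("perf_test_data.csv")
    print("wrote perf_test_data.csv")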

Programmers should not test their own code. As discussed in our previous post, basic unit testing of the developed application should be enough for developers to release the application to testers, but you (the testers) should not force developers to release the product for testing; let them take their own time. Everyone from the lead to the manager knows when the module or update is released for testing and can estimate the testing time accordingly. This is a typical situation in an agile project environment. Go beyond requirement testing: test the application for what it is not supposed to do.

While doing regression testing, use the previous bug graph (bug graph: the number of bugs found against time for different modules). This module-wise bug graph can be useful for predicting the most bug-prone parts of the application.
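
As a rough illustration, and assuming bug reports are available as simple (module, week) records, the sketch below builds the module-wise counts that such a bug graph would plot; the record format and sample data are assumptions for the example.

# Minimal sketch: build a module-wise bug graph (bugs found against time).
from collections import defaultdict

bug_reports = [            # (module, week in which the bug was found)
    ("payments", 1), ("payments", 1), ("login", 1),
    ("payments", 2), ("reports", 2), ("payments", 3),
]

counts = defaultdict(lambda: defaultdict(int))
for module, week in bug_reports:
    counts[module][week] += 1

# Print a simple table; the same data could be fed to any plotting tool.
weeks = sorted({week for _, week in bug_reports})
print("module      " + " ".join(f"wk{w}" for w in weeks))
for module, per_week in sorted(counts.items()):
    print(f"{module:<12}" + " ".join(f"{per_week.get(w, 0):>3}" for w in weeks))
# Modules with consistently high counts are the most bug-prone areas to focus
# on during regression testing.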

Many times testers or developers make changes in the code base of the application under test. This is a required step in a development or testing environment, for example to avoid executing live transaction processing in banking projects. Note down all such code changes made for testing purposes, and at the time of the final release make sure you have removed all of them from the final client-side deployment resources.

It is good practice to involve testers right from the software requirements and design phases. This way testers gain knowledge of the application's dependencies, resulting in more detailed test coverage. If you are not being asked to be part of this development cycle, then request that your lead or manager involve your testing team in all decision-making processes or meetings.

Testing teams should share best testing practices and experience with other teams in their organization.

Increase your conversations with developers to learn more about the product. Whenever possible, communicate face to face to resolve disputes quickly and to avoid misunderstandings. But once you understand the requirement or resolve a dispute, make sure to confirm the same in writing, for example by email. Do not keep anything verbal.
