Human-Computer Interaction
The study of human-computer interaction (HCI) focuses on how people interact with
technology. Enhancing user interaction with computer systems calls for user-friendly
interaction designs. Traditional human-computer interaction took place through
physical devices such as the desktop, mouse, keyboard, and joystick. Advances in
technology and changes in the way people live have created a need for improved,
easier ways of interacting. This paper discusses advancements in human-computer
interaction, focusing on recent and current technologies. Its main aim is to survey
human-computer interaction methods, analyse them, and thereby contribute to future
research on human-computer interaction technologies.
Keywords: User-centred design, tactile interface, F-Tablet
Introduction
HCI involves investigating how people use computers, including the design of user-friendly
interfaces, the examination of user behaviour, and the evaluation of user satisfaction. It is
an interdisciplinary field that incorporates theories and techniques from a variety of fields,
including computer science, psychology, design, engineering, and others. It aims to develop
computer systems that are user-friendly, effective, and efficient while also enhancing the user
experience. HCI researchers and practitioners use a variety of techniques to accomplish this
goal, including user-centred design, usability testing, and iterative design. Also, they take into
account the cognitive, social, and cultural aspects of human-computer interaction while
designing computer systems that are suited to the requirements and tastes of various users.
This could therefore result in a more positive user experience and more user satisfaction. HCI
research can contribute to the increased accessibility of new technologies for a larger range of
users by enhancing the usability and accessibility of computer systems. This is crucial for
users who might be less tech-savvy or who could face difficulties using conventional
computer systems due to impairments or other issues.
This paper therefore offers a broad view of human-computer interaction by
summarising the available interaction methods and outlining their advantages and
limitations.
A Comprehensive Overview of Available Methods
In the modern world, there are countless ways for individuals to interact with computers and
accomplish their tasks efficiently. However, each method comes with its own set of
advantages and limitations, and it is important to consider all these factors when choosing
the most appropriate option. One important consideration is whether the method requires
additional hardware components, which may demand an extra investment of time and money
to implement. Additionally, certain methods may require separate batteries to power these
hardware components, which must be charged regularly to maintain their
functionality.
Table 1: Available interaction methods with their advantages and limitations

No. | Interaction Method | Advantages | Limitations
1 | Keyboard and Mouse | Easy to use and learn | Slow and not always accurate; needs hand-eye coordination
2 | Touch Screen | Best for mobile devices; supports multi-touch | Not precise; cannot be used for long durations
3 | Voice Recognition | Requires no use of body parts | Unsuitable where there is background noise; words must be pronounced correctly
6 | Joystick/Controller | Well suited to gaming | Input is quite limited; requires extra hardware and user motor skills
7 | Trackpad | Convenient to build into laptops | The trackpad takes up table space; needs hand-eye coordination
8 | Trackball | Good for limited desk space; small movements control the operation | Input is limited; the user needs some dexterity
9 | Pen Input | Good for making digital art and for writing | Input is limited; the user needs some motor skills
11 | Tactile Interface | Usable by blind people | Actions are limited; requires hardware to be built
12 | Eye Tracking | Good for accessibility; supports hands-free work | Accuracy is limited; any glance away from the screen can trigger an unintended operation
13 | Motion Sensor | Well suited to gym applications | Range is limited; requires hardware support
15 | Wearable Technology | Good for health and fitness monitoring; provides hands-free communication | Battery power is limited; uncomfortable for extended use
16 | Autonomous Systems | Good for automatic surveillance | Requires both hardware and software; may raise safety concerns
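Because applications often have to support several of the methods in Table 1 at once, a common design is to translate events from every input source into one shared set of application commands. The sketch below is purely illustrative (the sources, event names, and commands are hypothetical, not taken from the paper); it shows how such a mapping keeps the rest of the program input-agnostic.

```python
# Illustrative sketch: mapping events from different interaction
# methods (keyboard, voice, gesture, eye tracking) onto one set of
# application commands. All names here are hypothetical examples.

COMMANDS = {
    ("keyboard", "space"):       "play_pause",
    ("voice", "pause"):          "play_pause",
    ("gesture", "swipe_left"):   "previous_track",
    ("gesture", "swipe_right"):  "next_track",
    ("eye", "dwell_next"):       "next_track",
}

def dispatch(source, event):
    """Translate a (source, event) pair into an application command.

    Unrecognised pairs are ignored rather than raising, so a noisy
    input channel cannot crash the application.
    """
    return COMMANDS.get((source, event), "ignored")

print(dispatch("voice", "pause"))         # play_pause
print(dispatch("gesture", "swipe_left"))  # previous_track
```

With this indirection, adding a new interaction method only means registering new (source, event) pairs; the command handlers stay unchanged.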
[Figure: Number of times each body part was used across the surveyed human-computer interaction methods]
The graph above shows how often each body part is used during human-computer
interaction. The most used body part is the hand, which accounts for roughly 20% of
total interaction time. This is because gesture-based interaction has become very
popular today; since it is an easy way to interact, much research has been devoted to
this method of human-computer interaction.
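To make the idea of gesture-based interaction concrete, here is a minimal, hypothetical sketch (not from any of the surveyed papers) that classifies a recorded 2D pointer trajectory as a swipe direction. The threshold and coordinate convention are illustrative assumptions.

```python
# Hypothetical sketch of a very simple gesture recogniser: classify a
# list of (x, y) pointer samples as a swipe by its net displacement.

def classify_swipe(points, min_travel=30):
    """Return 'left', 'right', 'up', 'down', or None.

    points     -- list of (x, y) samples, screen y growing downward
    min_travel -- minimum net displacement (pixels) to count as a swipe
    """
    if len(points) < 2:
        return None
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if max(abs(dx), abs(dy)) < min_travel:  # too small: likely jitter
        return None
    if abs(dx) >= abs(dy):                  # dominantly horizontal
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

print(classify_swipe([(0, 0), (40, 5), (120, 10)]))  # right
```

Real gesture recognisers (e.g. the vision-based systems in [7], [8], [31]) work on richer features than net displacement, but the structure is the same: sample a trajectory, extract features, map them to a discrete gesture class.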
Fei Shen and Le Kang [19] used an F-Tablet, which offers a natural interface and good
versatility, but at a high cost and with learning needed before use.
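The F-Tablet of [19] acquires force information from the pen alongside its position. A common use of such pressure data, sketched below under illustrative assumptions (the value ranges and function are hypothetical, not the authors' implementation), is to map normalized pen pressure to stroke width so pressing harder draws a thicker line.

```python
# Hypothetical sketch of one use of force-tablet pen input: mapping a
# normalized pressure reading to a stroke width. Ranges are illustrative.

def stroke_width(pressure, min_w=1.0, max_w=8.0):
    """Map a pressure reading in [0, 1] to a width in [min_w, max_w].

    Readings outside [0, 1] are clamped, since real sensors can
    occasionally report slightly out-of-range values.
    """
    p = min(max(pressure, 0.0), 1.0)
    return min_w + p * (max_w - min_w)

print(stroke_width(0.0))  # 1.0 (lightest touch)
print(stroke_width(1.0))  # 8.0 (hardest press)
```

Linear mapping is the simplest choice; a drawing application might instead use a perceptual curve, but the principle of turning the extra force channel into a drawing parameter is the same.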
Conclusion
Altogether, it can be concluded that various techniques are available for Human-Computer
Interaction, each with its own advantages and limitations. Some techniques are
cost-effective and easily accessible, while others require expensive hardware and software.
Some are fast and accurate, while others have limits on their accuracy and reliability.
As technology moves forward, the way we interact must also change to match our
lifestyles and the pace of development. Overall, advancements in technology and research
continue to provide new and improved techniques for Human-Computer Interaction, with
the potential to revolutionize human-machine interaction.
References
[1] Karam, M.: A framework for research and design of gesture-based
human computer interactions. PhD thesis, University of Southampton (2006)
[2] Just, A.: Two-handed gestures for. Human Comput. Interact. 77, 1 (2006)
[3] Hasan, H., Abdul Kareem, S.: Fingerprint image enhancement and recognition
algorithms: a survey. Neural Comput. Appl. (2012). doi:10.1007/s00521-012-1113-0
[4] Hasan, H., Abdul Kareem, S.: Static hand gesture recognition using neural networks.
Artif. Intell. Rev. (2012). doi:10.1007/s10462-011-9303-1
[5] Moeslund, T., Granum, E.: A survey of computer vision based human motion capture.
Comput. Vis. Image Underst. 81, 231–268 (2001)
[6] Derpanis, K.G.: A Review of Vision-Based Hand Gestures.
http://cvr.yorku.ca/members/gradstudents/kosta/publications/file Gesture review (2004)
[7] Mitra, S., Acharya, T.: Gesture recognition: a survey. IEEE Trans. Syst. Man, Cybern.
(SMC) Part C Appl. Rev. 37(3), 311–324 (2007)
[8] Chaudhary, A., Raheja, J.L., Das, K., Raheja, S.: Intelligent approaches to interact with
machines using hand gesture recognition in natural way a survey. Int. J. Comput. Sci. Eng.
Surv. (IJCSES) 2(1), 122–133 (2011)
[9] Wachs, J.P., Kolsch, M., Stern, H., Edan, Y.: Vision-based handgesture applications.
Commun. ACM 54, 60–71 (2011)
[10] Corera, S., Krishnarajah, N.: Capturing hand gesture movement: a survey on tools,
techniques and logical considerations. In: Proceedings of Chi Sparks 2011 HCI Research,
Innovation and Implementation, Arnhem, Netherlands (2011).
http://proceedings.chi-sparks.nl/documents/Education-Gestures/FP-35-AC-EG
[11] Ionescu, B.: Dynamic hand gesture recognition using the skeleton of the hand. EURASIP
J. 1–9 (2005)
[12] Symeonidis, Klimis: Hand gesture recognition using neural networks. Neural Netw. 13,
1–5 (1996)
[13] Kevin, N.Y.Y., Ranganath, S., Ghosh, D.: Trajectory modeling in gesture recognition
using CyberGloves, TENCON 2004. IEEE Region 10 Conference (2004)
[14] Kanniche, M.B.: Gesture recognition from video sequences. PhD thesis, University of
Nice (2009)
[15] Ramsha Fatima, Atiya Usmani, Zainab Zaheer: Eye Movement Based Human Computer
Interaction, 3rd Int’l Conf. on Recent Advances in Information Technology RAIT-2016
[16] S.S. Deepika and G. Murugesan: A Novel Approach for Human Computer Interface Based
on Eye Movements for Disabled People
[17] Nhan T. Cao: Facial Expression Recognition Using a Novel Modeling of Combined
Gray Local Binary Pattern
[18] Ruhul Amin Khalil: Speech Emotion Recognition Using Deep Learning
Techniques
[19] Fei Shen, Le Kang: The Pen-Computer Interface Based on Force Tablet: Active
Information Acquisition for Human-Computer Interaction
[20] T, Kanongchaiyos, P.: 3D Hand pose modeling from uncalibrated monocular images. In:
Eighth International Joint Conference on Computer Science and Software Engineering
(JCSSE), pp. 177– 181 (2011)
[21] Henia, O.B., Bouakaz, S.: 3D Hand model animation with a new data-driven method.
In: Workshop on Digital Media and Digital Content Management, IEEE, pp. 72–76 (2011)
[22] Ionescu, D., Ionescu, B., Gadea, C., Islam, S.: A Multimodal interaction method that
combines gestures and physical game controllers. In: Proceedings of 20th International
Conference on Computer Communications and Networks (ICCCN), IEEE, pp. 1– 6 (2011a)
[23] Yang, J., Xu, J., Li, M., Zhang, D., Wang, C.: A Real-time command system based on
hand gesture recognition. In: Seventh International Conference on Natural Computation, pp.
1588–1592 (2011)
[24] Huang, D., Tang, W., Ding, Y., Wan, T., Wu, X., Chen, Y.: Motion capture of hand
movements using stereo vision for minimally invasive vascular interventions. In: Sixth
International Conference on Image and Graphics, pp. 737–742 (2011)
[25] Ionescu, D., Ionescu, B., Gadea, C., Islam, S.: An Intelligent gesture interface for
controlling TV sets and set-top boxes. In: 6th IEEE International Symposium on Applied
Computational Intelligence and Informatics, pp. 159–164 (2011b)
[26] Bergh, M., Gool, L.: Combining RGB and ToF cameras for realtime 3D hand gesture
interaction. In: Workshop on Applications of Computer Vision (WACV), IEEE, pp. 66–72
(2011a)
[27] Bao, J., Song, A., Guo, Y., Tang, H.: Dynamic hand gesture recognition based on SURF
tracking. In: International Conference on Electric Information and Control Engineering
(ICEICE), pp. 338– 341 (2011)
[28] Bellarbi, A., Benbelkacem, S., Zenati-Henda, N., Belhocine, M.: Hand gesture
interaction using color-based method for tabletop interfaces. In: IEEE 7th International
Symposium on Intelligent Signal Processing (WISP), p. 16 (2011)
[29] Hackenberg, G., McCall, R., Broll, W.: Lightweight palm and finger tracking for real-
time 3D gesture control. In: IEEE Virtual Reality Conference (VR), pp. 19–26 (2011)
[30] Du, H., Xiong, W., Wang, Z.: Modeling and interaction of virtual hand based on
virtools. In: International Conference on Multimedia Technology (ICMT), pp. 416–419
(2011)
[31] Rautaray, S.S., Agrawal, A.: A novel human computer interface based on hand gesture
recognition using computer vision techniques. In: International Conference on Intelligent
Interactive Technologies and Multimedia
[32] Barathi Subramanian, Bekhzod Olimov, Shraddha M. Naik, Sangchul Kim, Kil-Houm
Park & Jeonghong Kim: An integrated mediapipe-optimized GRU model for Indian sign
language recognition (Springer, 2021)
[33] Itsaso Rodriguez-Moreno, Jose Maria Martinez-Otzeta, Izaro Goienetxea, Igor
Rodriguez, and Basilio Sierra. A New Approach for Video Action Recognition: CSP-Based
Filtering for Video to Image Transformation (IEEE October 2021).
[34] Indriani, Moh. Harris, Ali Suryaperdana Agoes: Applying Hand Gesture Recognition for
User Guide Application Using MediaPipe (2021)
[35] Attaqa Bashir, Faria Malik, Farwa Haider, Muhammad Ehatisham-ul-Haq, Aasim
Raheel, Aamir Arsalan. A Smart Sensor-based Gesture Recognition System for Media
Player Control (IEEE 2020).
[36] Swapna Agarwal & Saiyed Umer. MP-FEG: Media Player controlled by Facial
Expressions and Gestures (IEEE 2015).
[37] Wen-Shyong Yu, Member, IEEE, and Kuo-Yao Huang. Implementation of Media Player
Simulator Using Kinect Sensors (IEEE 2017).
[38] Manuj Paliwal, Gaurav Sharma, Dina Nath, Astitwa Rathore, Himanshu Mishra,
Soumik
Mondal. A Dynamic hand gesture recognition system for controlling VLC media player
(IEEE 2013).
[39] Jaya Prakash Sahoo, Allam Jaya Prakash, Paweł Pławiak, and Saunak Samantray. Real-
Time Hand Gesture Recognition Using Fine-Tuned Convolutional Neural Network (2022).
[40] Ustundag, A., Cevikcan, E.: Industry 4.0: Managing the Digital Transformation. Springer
Series in Advanced Manufacturing, Switzerland (2018). DOI: https://doi.org/10.1007/978-3-
319-57870-5
[41] Rautaray, S.S., Agrawal, A.: Vision Based Hand Gesture Recognition for Human
Computer Interaction: A Survey. Artif. Intell. Rev. (2012). DOI:
https://doi.org/10.1007/s10462-012-9356-9