
LED Control Using Hand Gestures

Jaspreet Kaur, Khushi Gupta, Shreya


Manipal University Jaipur, Rajasthan, India

ABSTRACT

This study presents a methodology for creating a system that utilizes a model train set to control Light Emitting Diodes (LEDs) based on real-time analysis of a physical train layout. The process involves several key steps: image scanning to gather visual data; dataset creation by compiling scanned images labeled with train component information; model training to recognize patterns and features within the images; real-time input and prediction using the trained model on live video feed frames; output translation into LED control instructions; and LED activation based on model predictions. The creation of the dataset involves image acquisition, labeling, compilation, preprocessing, splitting, and documentation to ensure robust model training and evaluation. This approach facilitates the development of an effective system for dynamically controlling LEDs in synchronization with the actions of a model train layout.

Dataset & Methodology

Image Scanning: Acquire images of real trains or train components using a scanner or camera.
Dataset Creation: Compile the scanned images into a dataset, labeling them with relevant information such as train component type (e.g., locomotive, wagon) and action (e.g., moving forward, stopping).

Methodology (cont.)

Model Training: Train a machine learning model on the created dataset to recognize patterns and features within the images.
Real-Time Input and Prediction: Apply the trained model to live video feed frames of a physical model train layout, predicting the identified components and actions.
Output to LED Control Code: Translate model predictions into a format understood by the LED control code, generating specific signals or instructions based on the identified components and actions.
LED Activation: Implement LED control code that receives instructions from the model and activates LEDs accordingly, reflecting the actions of the model train layout.

Result

In this system, the interaction between a machine learning model, Python code, and an Arduino board enables the control of LEDs based on detected hand gestures.
Model Prediction: The machine learning model analyzes input data, likely hand landmarks from video frames, to predict hand gestures.
LED Control Function: The predicted output from the model, such as the number of fingers detected, is passed to the led(total) function in the Python code.
Arduino Control: Within the led(total) function, specific LED pins on the Arduino board are manipulated using the write method from the pyfirmata library. Each LED pin corresponds to a particular finger or gesture, and its state (on/off) is determined by the model's prediction.
Physical Response: The LEDs connected to the Arduino board respond physically to the commands sent by the Python code: they illuminate or turn off, providing a visual representation of the detected hand gesture.
Overall, this system demonstrates a seamless integration of machine learning, Python programming, and hardware control through the Arduino board, enabling real-time interaction and visual feedback based on detected hand gestures.
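The led(total) control path described in the Result can be sketched as below. This is a minimal illustration, not the authors' exact code: the serial port, the choice of digital pins 2–6, and the five-LED layout are assumptions; only the led(total) name and the pyfirmata write call come from the text.

```python
def pin_states(total, num_leds=5):
    """Map a detected finger count (0..num_leds) to on/off states, one per LED."""
    return [1 if i < total else 0 for i in range(num_leds)]


def led(total, board=None, pins=(2, 3, 4, 5, 6)):
    """Light one LED per detected finger; with no board attached, just
    return the computed states (useful for testing without hardware)."""
    states = pin_states(total, len(pins))
    if board is not None:
        for pin, state in zip(pins, states):
            board.digital[pin].write(state)  # pyfirmata digital write
    return states


if __name__ == "__main__":
    # Requires the pyfirmata package and an Arduino running the Firmata sketch;
    # the port name is an assumption and varies by machine.
    from pyfirmata import Arduino
    board = Arduino("/dev/ttyUSB0")
    led(3, board)  # first three LEDs on, remaining two off
```

Keeping the pin-state computation separate from the hardware call lets the gesture-to-LED mapping be exercised without a connected board.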
INTRODUCTION
In our project, we embark on a pioneering endeavor at the intersection of gesture recognition and LED control, harnessing the power of machine learning to enhance accuracy and responsiveness. With the fusion of cutting-edge technologies, we aim to create an intuitive system where users can control LEDs through natural hand gestures, revolutionizing the way we interact with our environment.
At the heart of our project lies the utilization of machine learning algorithms to interpret and classify hand gestures, enabling precise and real-time control of LED illumination. By developing a robust model trained on a diverse dataset of hand gestures, we seek to achieve unparalleled accuracy in gesture recognition, ensuring seamless and intuitive interaction with the LED system.
Through this project, we not only aspire to push the boundaries of human-computer interaction but also to inspire innovation in the realm of smart lighting solutions. By empowering users to effortlessly manipulate their surroundings through simple gestures, we envision a future where technology seamlessly integrates into our daily lives, enhancing both functionality and aesthetics. Join us on this journey as we illuminate the path towards a more intuitive and connected world.

Figure 1: Dataset Collection

In this figure, we are collecting a dataset comprising images of fingers numbered from 1 to 10, captured using a camera. We have used this dataset to train a machine learning model, enabling accurate recognition of finger gestures for diverse applications such as sign language translation or gesture-based interfaces.

Figure 3: EfficientNet Model

Figure 2: Proposed Methodology (flowchart: Dataset Collection → Model Prediction → Output)

Controlling LEDs with Arduino involves connecting the LEDs to specific pins on the Arduino board using wires to regulate current flow. In the Arduino IDE, code is written to define these pins as outputs and to specify instructions for LED control using hand gestures. After uploading the code to the Arduino board, thorough testing ensures proper functionality of the LED control system. Once validated, the system is integrated with external inputs, such as a model train set or sensors, allowing the LEDs to respond dynamically to environmental stimuli. Finally, the integrated system is deployed, ensuring all connections are secure and the LEDs operate as intended in real-world applications.

Conclusion

In conclusion, this project successfully demonstrates the feasibility of controlling LEDs with hand gestures. The developed system showcases the potential of hand gesture recognition technology for creating a more intuitive and interactive way to engage with electronics. While the current prototype focuses on fundamental functionalities, the future holds immense possibilities for expansion. By incorporating more complex gesture recognition, advanced LED control features, and integration with other devices, this technology has the potential to revolutionize human-computer interaction in various domains, from smart homes and virtual reality to assistive technology applications.
Overall, this project serves as a stepping stone towards a future where hand gestures become a natural and ubiquitous way to control the world around us.
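The prediction step that feeds led(total) can be sketched as a finger counter over hand landmarks. This assumes a MediaPipe-style 21-point landmark layout and a right hand facing the camera; the indices and the thumb heuristic are illustrative assumptions, not the authors' exact model.

```python
# Fingertip and PIP-joint indices in the 21-point hand-landmark layout
FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky
FINGER_PIPS = [6, 10, 14, 18]


def count_fingers(landmarks):
    """Count raised fingers from 21 (x, y) landmarks in image coordinates,
    where y grows downward. Returns 0-5, suitable as input to led(total)."""
    total = 0
    for tip, pip in zip(FINGER_TIPS, FINGER_PIPS):
        if landmarks[tip][1] < landmarks[pip][1]:  # tip above joint => raised
            total += 1
    # Thumb opens sideways: compare tip x to the joint below it
    # (heuristic for a right hand facing the camera).
    if landmarks[4][0] > landmarks[3][0]:
        total += 1
    return total
```

In the full pipeline, each video frame's landmarks would be passed through count_fingers and the result forwarded to the LED control function.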

