MINOR PROJECT
Interaction between humans and robots has become part of our daily life. This report describes the development and implementation of a gesture-based interface between a human and a robot. Robots are conventionally controlled with remotes or switches, which are neither convenient nor feasible in every situation. Vision- and sensor-based interfaces have made human-computer interaction more effective, offering a more reliable and user-friendly way of controlling and communicating with machines. Robots are now capable of performing various human tasks that would otherwise be difficult because of physical disability, size constraints, or harsh environmental conditions.
PROPOSED SYSTEM:
This robot is gesture controlled and can be operated from anywhere over the Internet. An MPU-6050 accelerometer collects data about hand movements by measuring the acceleration of the hand along three axes. This data is sent to an Arduino, which processes it and decides in which direction the robot should move. The resulting command is passed to a NodeMCU, which uploads it to ThingSpeak (thingspeak.com), a free IoT platform that stores the data. On the receiver side, another NodeMCU Wi-Fi module reads this data and drives the motors through the motor driver board connected to it.
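The decision logic on both ends can be sketched in plain C++. The tilt thresholds, command names, and H-bridge pin assignments below are illustrative assumptions, not values taken from the project; on real hardware the raw readings would come from the MPU-6050 over I2C and the pin states would go to a motor driver such as an L298N.

```cpp
#include <cassert>
#include <string>

// Assumed tilt threshold in raw MPU-6050 accelerometer counts
// (at the +/-2g setting, roughly 16384 counts = 1 g).
const int TILT_THRESHOLD = 6000;

// Transmitter side: classify a hand tilt into a drive command
// from the raw X/Y acceleration readings.
std::string classify(int ax, int ay) {
    if (ay >  TILT_THRESHOLD) return "FORWARD";
    if (ay < -TILT_THRESHOLD) return "BACKWARD";
    if (ax >  TILT_THRESHOLD) return "RIGHT";
    if (ax < -TILT_THRESHOLD) return "LEFT";
    return "STOP";  // hand held level: no movement
}

// Receiver side: map a command to the four inputs of a typical
// dual H-bridge motor driver. The pin mapping is hypothetical.
struct MotorPins { bool in1, in2, in3, in4; };

MotorPins drive(const std::string& cmd) {
    if (cmd == "FORWARD")  return {true,  false, true,  false};
    if (cmd == "BACKWARD") return {false, true,  false, true };
    if (cmd == "LEFT")     return {false, false, true,  false};
    if (cmd == "RIGHT")    return {true,  false, false, false};
    return {false, false, false, false};  // STOP: all motors off
}
```

Keeping the classification as a pure function of the two axis readings makes the gesture mapping easy to tune and test independently of the hardware.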
COMPONENTS REQUIRED:
Transmitter part:
Receiver part:
- Jumper Wires
- NodeMCU
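The transmitter-side NodeMCU uploads each command to ThingSpeak through its REST API, which accepts a simple GET request of the form `https://api.thingspeak.com/update?api_key=<WriteKey>&field1=<value>`. A minimal sketch of building that request URL is shown below; the API key is a placeholder, not a real credential.

```cpp
#include <cassert>
#include <string>

// Build a ThingSpeak channel-update URL. ThingSpeak's REST API
// accepts: GET https://api.thingspeak.com/update?api_key=...&field1=...
// The caller supplies the channel's Write API key and the value to log.
std::string thingspeakUpdateUrl(const std::string& apiKey, int field1) {
    return "https://api.thingspeak.com/update?api_key=" + apiKey +
           "&field1=" + std::to_string(field1);
}
```

On the NodeMCU this URL would be fetched with an HTTP client; the receiver polls the channel's read endpoint in the same fashion to retrieve the latest command.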
CIRCUIT DIAGRAM:
FUTURE SCOPE:
Instead of a direct supply, batteries could be used, but on-board batteries occupy a lot of space and are quite heavy. Either an alternative power source could be adopted, or the current DC motors could be replaced with ones that draw less power. The proposed system is applicable in hazardous environments, where a camera can be attached to the robot and its feed viewed by a user at a remote station. It can also be employed in the medical field, where miniature robots help doctors perform surgical operations more efficiently. In homes, offices, transport vehicles and elsewhere, gesture recognition can be incorporated to greatly increase usability and reduce the resources needed to build primary or secondary input systems such as remote controls.