
International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056
Volume: 09 Issue: 05 | May 2022 www.irjet.net p-ISSN: 2395-0072

Arduino Based Hand Gesture Controlled Robot

Swarnika Shruti1, Savita Kumari Verma2, Shrishti Singh3, Tanya Gupta4

1,2,3Dept. of Electronics and Communication, KIET Group of Institutions, Ghaziabad, India-201206
4Dept. of Electronics and Instrumentation, KIET Group of Institutions, Ghaziabad, India-201206
Abstract - As robot capabilities become more complex, some, but not all, human tasks are supplanted by robots. Robots, on the other hand, rely on human programming and will continue to do so in the future.

This paper explains how to simplify the difficult and complex techniques of operating robotic devices in a variety of applications. It is quite tough to use a remote or switches to control a robot or a specific machine. The technology described here can recognize our hand motions or gestures and respond to our commands. A camera captures the hand gestures, and image processing algorithms compare the previous and present hand positions. Based on the comparison result, the robot receives the appropriate command over a ZigBee wireless connection. This technology can thus be used to operate a specific machine.

Key Words: Hand gesture, Image processing, Interrupt, Microcontroller, Motor polarity, Arduino, Robot, Transmission, Sensors.

1. INTRODUCTION

Technology has assisted humans in today's culture by enhancing workplace efficiency, regardless of hazardous working circumstances or a complex environment. Robots have simplified work in a variety of fields, from medicine to industry [1]. Even as technology progresses, robots will never be able to complete a job without the supervision of their master, i.e., humans. Apart from using external devices to control robots, simple gestures are the most natural and effective way to communicate with them.

2. Literature Review

2.1 Gesture Controlled Robot using Android & Arduino [2]

The gesture-controlled robot constructed in this work has many future possibilities. The robot can be used for surveillance purposes, and it can be connected to a wheelchair that is then driven by the movements of the rider's hand.

Wi-Fi can be used for transmission instead of Bluetooth to extend the operating range considerably. Edge sensors can be integrated to prevent the robot from falling off any surface, and cameras can be added to record and deliver information to a nearby computer or cell phone. The same approach can be embedded in a watch or in home appliances such as an air conditioner. Modern Arduino boards support intranet and Internet connectivity, which can be exploited to a great degree. The robotic car could also be enhanced for military surveillance, where it could be sent toward enemy headquarters and its actions traced via the Internet. With a little inventiveness, the possibilities are vast.

In that paper, the design and execution of a gesture-controlled robot are illustrated and implemented using an Arduino microcontroller and an Android smartphone. An algorithm is given and works accurately. Since there are many possibilities for revision, upgrading the system is left as future scope. The assembled device is inexpensive and simple to carry from one place to another. Adding extra sensors or cameras would make it more efficient, and the restriction of the hardware being tied to a single system has been greatly reduced. As a future idea, the system will permit the user to control it in a way that narrows the gap between the real world and the digital world, with a more automated outcome.

2.2 Hand gesture recognition using radar sensors for human-computer interaction [3]

A vast expansion and rapid improvement of radar-based hand gesture recognition (HGR) was seen in the past decade. This paper surveyed studies on HGR applications using radar. Currently, researchers rely on commercially available radars produced by companies such as Infineon, Novelda and Texas Instruments. With these systems available on chips, much effort has gone into developing motion detection and recognition algorithms. In the contemporary era, attention is shifting from signal-processing-based HGR algorithms to deep-learning-based algorithms; in particular, variants of CNNs have shown promising applicability. Although radar sensors offer numerous advantages over other HGR sensors (i.e., wearable sensors and cameras), the adoption of radar-based HGR in everyday life still lags behind these competing technologies. Attention must be devoted to miniaturized hardware and to improving real-time recognition algorithms.

© 2022, IRJET | Impact Factor value: 7.529 | ISO 9001:2008 Certified Journal | Page 826

2.3 Hand gesture recognition based on computer vision [4]

It is instructive to survey this research area, since most research surveys focus on computer applications, sign language, and interaction with 3D objects in a real environment. Still, many studies deal with improving frameworks for hand gesture recognition or formulating modern algorithms rather than implementing a practical application, for example in healthcare. The great challenge faced by researchers is to design a robust system that withstands the most common problems under limited constraints and provides a valid and credible outcome. Most proposed hand gesture systems can be divided into two classes of computer vision methods. The first, simpler, method uses image processing techniques via the OpenNI or OpenCV libraries, perhaps with extra tools, to provide interaction in real time; it suffers from the time consumed by real-time processing and from constraints such as lighting issues, appearance differences, distance thresholds, and multi-object or multi-gesture problems. The second technique matches the input gesture against a dataset of gestures, where substantially harder cases require a complicated algorithm; deep learning and artificial intelligence techniques compare the incoming gesture in real time with dataset gestures covering distinct poses and motions.

Although this second strategy can recognize a large number of gestures, it has difficulties in some trials, such as missing gestures because of accuracy discrepancies between classification algorithms. In addition, it takes more time than the first method because of dataset matching when a large dataset is used, and a dataset of motions cannot be reused by other frameworks.

Hand gesture recognition addresses a weakness in interaction systems. Controlling devices by hand is more natural, simpler, more creative, and cheaper, and there is no need to fix problems caused by hardware devices, since none are employed. From the preceding categories, it is apparent that ample effort should be invested in developing durable and powerful algorithms, and that a camera sensor is well suited to confronting practical issues and obtaining credible results. Each procedure discussed above, however, has its benefits and drawbacks, and may perform well in some challenges while remaining inferior in others.

2.4 Hand Gesture Controlled Robot [5]

The Hand Gesture Controlled Robot System offers a more natural way of controlling appliances. The command for the robot to drive along a certain path in the environment is based on the sequence of hand gestures delivered by the user.

Without using any external hardware support for gesture input, unlike the existing systems described above, the user can control a robot from his software station. The proposed system is useful in hazardous environments, where a camera can be mounted on the robot and viewed by the user from his own location. The system can also be applied in the medical field, where miniature robots are built to help physicians perform efficient surgical tasks. For a more productive response, threshold values can be used to detect gestures, and advanced features such as finger counts that enable several functional commands can be adopted.

3. Proposed System

There are three steps to this image processing technique: capture, comparison, and signaling. Typically, the capturing operation is carried out through the system's webcam.

The hand gesture is processed, and the hand gesture's direction is relayed to the robot via the ZigBee module, which subsequently moves in the indicated direction.

Fig.3.1 Block diagram

4. Technologies Used

The software station generates this signal, which is sent to the robot using ZigBee, a wireless technology with a 50-meter range.

4.1 Comparison of Hand Gesture

An image of the plain background is taken initially, followed by an image of the palm, to detect the palm of the hand. From this, a black and white image is created.

Following dynamic movement, the palm is photographed, and a black and white image of this frame is created in the next stage [10].
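The background-subtraction step described in Section 4.1 can be sketched as follows. Since the paper's image-processing software was written in Java (Section 5.5), the sketch uses Java, but the class name, the flat-array frame representation, and the difference threshold of 50 are illustrative assumptions, not the authors' code.

```java
// Minimal sketch of the Section 4.1 comparison step: the stored background
// frame and the current frame are compared pixel by pixel, and any pixel
// that differs by more than a threshold becomes white (255), the rest black (0).
// Frames are modeled as grayscale int arrays; the threshold is an assumption.
public class GestureBinarizer {
    static final int THRESHOLD = 50; // assumed sensitivity, not from the paper

    // Return a black-and-white (0/255) mask of the hand region.
    static int[] binarize(int[] background, int[] frame) {
        int[] mask = new int[frame.length];
        for (int i = 0; i < frame.length; i++) {
            mask[i] = Math.abs(frame[i] - background[i]) > THRESHOLD ? 255 : 0;
        }
        return mask;
    }

    public static void main(String[] args) {
        int[] background = {20, 20, 20, 20};
        int[] frame      = {20, 200, 180, 25}; // hand covers the middle pixels
        System.out.println(java.util.Arrays.toString(binarize(background, frame)));
        // prints [0, 255, 255, 0]
    }
}
```

In the real pipeline the same routine would run once against the plain-background image and again after each dynamic movement, producing the two black-and-white frames that Section 5 compares.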


4.2 Signal Generator

The Java language is used to create action commands for the system. To have the robot move forward, right, backward, stop, and left, simple values are expressed as f, r, b, s, and l, respectively [11]. The letter f, for example, represents the Forward command; the next action, a right turn, is denoted by the letter r, and so on.

4.3 Communication using ZigBee

As soon as the command is formed, it is delivered to the ZigBee (XBee) transmitter configured in the software component [9]. This transmits a digital signal to the robot's ZigBee receiver.

4.4 Accelerometer (ADXL335)

The ADXL335 is a thin, compact, low-power, three-axis accelerometer with a full-scale range of ±3 g. In tilt sensing applications, it can measure both the static acceleration of gravity and the dynamic acceleration caused by motion, shock, or vibration [12].

Fig. 4.1: ADXL335

4.5 Arduino UNO

The Arduino Uno is an open-source microcontroller board built around the ATmega328 AVR microcontroller (Fig. 4.2), a detachable dual-inline-package part that provides flexibility, compactness, and versatility [13]. It can be interfaced with any type of electronic component to construct embedded systems and IoT projects, and it is a low-priced module that is easy for beginners to use.

Fig. 4.2: Arduino UNO

4.6 Comparator IC

The comparator IC used is the LM324, which compares the accelerometer's analogue voltage to a reference value before outputting a specific high or low voltage. The signal received is noisy, with a wide range of voltage levels; this IC compares the voltage levels and produces either a 1 or a 0 voltage level, a method known as signal conditioning. The comparator IC is shown in the diagram below [15]. Pins 1, 7, 8, and 14 are the output pins, each of which goes high when the corresponding LM324 input is high.

Fig. 4.3: LM324

4.7 Encoder IC (HT12E)

The HT12E is used in remote control applications. It connects to RF transmitter modules in order to create secure single-channel or multi-channel RF remote control transmitters [16]. Each address/data input can be set to one of two logic states.

Fig. 4.4: HT12E

4.8 RF Module

Radio frequency is an oscillation rate that corresponds to the currents carrying radio communications, measured in the range of 3 kHz to 300 GHz [18]. Even though radio frequency is a rate of oscillation, the term "radio frequency" or its abbreviation "RF" refers to the use of wireless communication rather than cable transmission.
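The signal generator of Section 4.2 reduces to mapping each detected direction onto one of the single characters f, r, b, s, l before the character is written to the XBee link (Section 4.3). A minimal Java sketch follows; the enum, the method names, and the plain println stand-in for the serial write are illustrative assumptions, since the paper does not publish its code.

```java
// Sketch of the Section 4.2 signal generator: each direction becomes a
// one-character command ('f', 'b', 'r', 'l', 's') destined for the ZigBee
// (XBee) transmitter. Names here are illustrative, not from the paper.
public class SignalGenerator {
    enum Direction { FORWARD, BACKWARD, RIGHT, LEFT, STOP }

    // Map a direction to the single-character command the robot expects.
    static char encode(Direction d) {
        switch (d) {
            case FORWARD:  return 'f';
            case BACKWARD: return 'b';
            case RIGHT:    return 'r';
            case LEFT:     return 'l';
            default:       return 's'; // STOP
        }
    }

    public static void main(String[] args) {
        // In the real system this character would be written to the XBee's
        // serial port; here we simply print it.
        System.out.println(encode(Direction.FORWARD)); // f
        System.out.println(encode(Direction.STOP));    // s
    }
}
```

Keeping the wire protocol to single characters means the receiving microcontroller can branch on one byte per command, which matches the Section 5.3 character list.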


The RF module used here has a range of 50-80 meters and works at 315 MHz. The transmitter and receiver for radio-frequency signals are shown below:

Fig.4.5: RF Transmitter

Fig.4.6: RF Receiver

5. IMPLEMENTATION

Each image acquired is converted into a supporting image file with the help of the camera.

5.1 Imaging of Gesture

When the camera or webcam is turned on, the software first takes the backdrop image. Each subsequent image file is then compared with the originally gathered backdrop image by taking the differences between the hand picture and the backdrop image, assigning each pixel a value of 0 (black) or 255 (white).

5.2 Detecting Gesture

Horizontal scanning is performed on the obtained picture from the top of the image frame until a black pixel spot is identified. Validation then checks horizontally 30 pixels below the spot and again 20 pixels below the spot.

The spot values, value(0) and value(1), determine the robot's direction. Value(1) provides the left and right directions: if value(1) is more than three-quarters of the image's width, move right; if value(1) is less than one-fourth of the image's width, move left. Move forward if value(0) is more than the height value; otherwise, go backward. If value(0) and value(1) do not change and remain the same, the direction is stop.

5.3 Signal Simulation

For each of the indicated directions, a corresponding character is sent. The characters used for transmission are listed below:

• 'f' - the robot moves forward.

• 'b' - the robot moves backward.

• 'r' - the robot turns right.

• 'l' - the robot turns left.

• 's' - the robot stops.

The character is then transferred to the microcontroller, which makes the robot move. ZigBee technology is used for communication.

The three main steps in communication are:

• Initialization - initialization of the data takes place.

• Connection establishment - connections are then established.

• Termination - transmission of the signal takes place.

5.4 Movement of Robot

The PIC16F877A microcontroller receives the digital signals. According to the interrupt signals, the polarity of the motors is set as follows:

The microcontroller's output is sent to a driver that generates 12 V to power the robot's motors.

5.5 Experimental Result

Using these components, the hardware was assembled, culminating in a working robot. The experiment was carried out on a Dell laptop, with the web camera functioning as the video capture input device. The hand motions were analyzed to establish the true orientation, and the image processing software was written in Java.

Using ZigBee, the identified direction was sent to the robot as characters. The robot's final movements are:

• Initially, the robot is stopped.


• As the hand moved to the top, the robot moved forward.

• It moved backward as the hand moved to the bottom.

• The robot moved in the direction of the hand when the hand was shown at an acute angle to the left on the screen.

• The robot moved in the direction of the hand when the hand was shown at an acute angle to the right on the screen.

• When the hand was held motionless with respect to the environment, the robot remained in stop mode.

6. Conclusion

Users may control products more naturally with the Hand Gesture Controlled Robot System. Unlike the previously mentioned methods, the user can control a robot from his software station without requiring any external hardware support for gesture input. The direction-based technique directly indicates the robot's movement direction, whereas each finger count specifies an instruction for the robot to go in a specific direction in the environment.

The design and execution of a Gesture Controlled Robot are demonstrated and developed using an Arduino microcontroller and an Android smartphone. Upgrading the system has been left for the future because there are so many ways to update it [19]. The constructed equipment is low-cost and easy to move from one area to another, and adding extra sensors or a camera would make it more productive. Finally, the system will allow the user to control it in a way that bridges the gap between the physical and digital worlds while providing a more intuitive output.

7. Future Scope

The proposed method can be employed in dangerous circumstances, with the robot equipped with a camera that the user can monitor from his station. This technology can also be applied in the medical field, where miniature robots are being developed to help doctors execute more efficient surgeries.

Threshold values can be used to identify gestures, and advanced features such as finger counts can be used to deliver distinct functional commands for a more efficient response.

REFERENCES

[1] Administrator, "Hand gesture controlled robot using Arduino," Electronics Hub, 23-Feb-2018. [Online]. Available: https://www.electronicshub.org/hand-gesture-controlled-robot/.

[2] "(PDF) Gesture controlled robot using Arduino and Android," ResearchGate. [Online]. Available: https://www.researchgate.net/publication/304624684_Gesture_Controlled_Robot_using_Arduino_and_Android.

[3] S. Ahmed, K. D. Kallu, S. Ahmed, and S. H. Cho, "Hand gestures recognition using radar sensors for human-computer-interaction: A review," MDPI, 02-Feb-2021. [Online]. Available: https://www.mdpi.com/2072-4292/13/3/527/htm.

[4] M. Oudah, A. Al-Naji, and J. Chahl, "Hand gesture recognition based on computer vision: A review of techniques," MDPI, 23-Jul-2020. [Online]. Available: https://www.mdpi.com/2313-433X/6/8/73?type=check_update&version=2.

[5] K. Sekar, R. Thileeban, V. Rajkumar, and S. S. B. Sembian, "Hand gesture controlled robot," International Journal of Engineering Research & Technology, 07-Nov-2020. [Online]. Available: https://www.ijert.org/hand-gesture-controlled-robot. [Accessed: 10-May-2022].

[6] Simplilearn, "What is image processing: Overview, applications, benefits, and who should learn it [2022 edition]," Simplilearn.com, 24-Mar-2022. [Online]. Available: https://www.simplilearn.com/image-processing-article.

[7] L. Rosencrance, "What is Zigbee? - Definition from WhatIs.com," IoT Agenda, 28-Jun-2017. [Online]. Available: https://www.techtarget.com/iotagenda/definition/ZigBee#:~:text=Zigbee%20is%20a%20standards%2Dbased,and%20is%20an%20open%20standard

[8] "International Journal of Science, Engineering and Technology Research." [Online]. Available: http://ijsetr.org/wp-content/uploads/2014/04/IJSETR-VOL-3-ISSUE-4-1024-1028.pdf.

[9] "IAEME publication, IJMET, IJCIET, IJCET, IJECET, IJARET ..." [Online]. Available: https://iaeme.com/MasterAdmin/Journal_uploads/IJMET/VOLUME_8_ISSUE_12/IJMET_08_12_080.pdf.

[10] "(PDF) Hand gesture recognition: A literature review," ResearchGate. [Online]. Available: https://www.researchgate.net/publication/284626785_Hand_Gesture_Recognition_A_Literature_Review.


[11] "Hand gesture controlled robot - IJSER." [Online]. Available: https://www.ijser.org/researchpaper/Hand-Gesture-Controlled-Robot.pdf.

[12] "Hand gesture controlled robot PDF," Scribd. [Online]. Available: https://www.scribd.com/document/412303023/Hand-Gesture-Controlled-Robot-pdf.

[13] "Arduino Uno - Farnell." [Online]. Available: https://www.farnell.com/datasheets/1682209.pdf.

[14] "Arduino for beginners - makerspaces.com." [Online]. Available: https://www.makerspaces.com/wp-content/uploads/2017/02/Arduino-For-Beginners-REV2.pdf.

[15] "LM324 - Single supply quad operational amplifiers." [Online]. Available: https://www.onsemi.com/pdf/datasheet/lm324-d.pdf.

[16] "HT12A/HT12E - 2^12 series of encoders - Farnell." [Online]. Available: https://www.farnell.com/datasheets/1899539.pdf.

[17] "HT12E RF encoder IC," Components101. [Online]. Available: https://components101.com/ics/ht12e-encoder-pin-diagram-datasheet.

[18] "RF based wireless remote using RX-TX modules (434 MHz) - elementzonline." [Online]. Available: https://www.elementzonline.com/downloads/RF_Based_Wireless_Remote.pdf.

[19] "Modeling and designing of gesture control robot - ResearchGate." [Online]. Available: https://www.researchgate.net/publication/325722248_Modeling_and_Designing_of_Gesture_Control_Robot.

[20] K. Sekar, R. Thileeban, V. Rajkumar, and S. S. B. Sembian, "Hand gesture controlled robot," International Journal of Engineering Research & Technology, 07-Nov-2020. [Online]. Available: https://www.ijert.org/hand-gesture-controlled-robot.

[21] "Hand gesture controlled robot - ijert.org." [Online]. Available: https://www.ijert.org/research/hand-gesture-controlled-robot-IJERTV9IS110014.pdf.

[22] "Hand gesture controlled robot," Arduino Project Hub. [Online]. Available: https://create.arduino.cc/projecthub/abdelkader_ch/hand-gesture-controlled-robot-9e4282.

[23] J. Davis and M. Shah, "Visual gesture recognition," IEEE Proc.-Vis. Image Signal Process., vol. 141, no. 2, April 1994.

[24] Chin-Chen Chang, I-Yen Chen and Yea-Shuan Huang, "Hand pose recognition using curvature scale space," in Proc. 16th International Conference on Pattern Recognition, 2002, vol. 2.

[25] "Design and implementation of robotic hand control using gesture recognition - IJERT." [Online]. Available: https://www.ijert.org/research/design-and-implementation-of-robotic-hand-control-using-gesture-recognition-IJERTV6IS040352.pdf.
