https://doi.org/10.22214/ijraset.2023.52285
International Journal for Research in Applied Science & Engineering Technology (IJRASET)
ISSN: 2321-9653; IC Value: 45.98; SJ Impact Factor: 7.538
Volume 11 Issue V May 2023- Available at www.ijraset.com

Controlled Hand Gestures using Python and OpenCV
Chetana D. Patil, Amrita Sonare, Aliasgar Husain, Aniket Jha, Ajay Phirke
Department of Computer Engineering, Dhole Patil College of Engineering, Pune

Abstract: Due to its communicative nature, gesture recognition has been used in recent trends to develop machines. Gestures are a form of non-verbal communication that allows humans and computers to communicate with one another. Artificial intelligence makes heavy use of hand gesture detection to improve functionality and human-computer interaction. In this work, to carry out particular tasks, we used mapped gesture-action pairs and Python libraries (OpenCV, cvzone) that aid in image capture, pre-processing, and detection.
Keywords: Gesture recognition, OpenCV, artificial intelligence, Python, machine learning

I. INTRODUCTION
The hand gesture is one of the most natural means of communication for human-computer interaction. Generally, we use a keyboard, mouse, or other input device to interact with a computer or application. In this project, using Python libraries, we leverage hand motions to provide input to our code. As we make various hand gestures towards the web camera, it reads the image data and analyses it to determine the type of gesture our hands are making. It then processes that data to perform a particular action or produce some output. The first step in this process is capturing the hand gesture, then analysing it to extract the gesture data, after which the action mapped to that gesture is performed. Webcam detection comes first because it detects the hands and serves as the medium through which the computer and the human interact without words or an external input device. Using hand gestures as input makes it very easy to issue commands. Applied to a PowerPoint presentation, a specific task such as scrolling forward or backwards, or pointing at something on the display interface, can be performed simply by using your hand and making gestures.
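As a concrete illustration of this capture-and-detect loop, the sketch below reads frames from a webcam and detects a hand with cvzone's hand-tracking module. It is a minimal example under assumed settings (camera index 0, detection confidence 0.8, cvzone 1.5+ API), not the authors' exact code.

import cv2
from cvzone.HandTrackingModule import HandDetector

# Open the default webcam (index 0 is an assumption; adjust for your device).
cap = cv2.VideoCapture(0)
detector = HandDetector(detectionCon=0.8, maxHands=1)  # assumed confidence threshold

while True:
    success, img = cap.read()
    if not success:
        break
    # findHands returns the detected hands and the annotated frame.
    hands, img = detector.findHands(img)
    if hands:
        # fingersUp gives five 0/1 flags, thumb to little finger.
        fingers = detector.fingersUp(hands[0])
        print(fingers)
    cv2.imshow("Gesture input", img)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()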

II. PROBLEM STATEMENT


Gesture recognition is currently used in different research applications, whether face recognition or body recognition. Developing a recognition system that is effective and works accurately is difficult because it involves a real-world environment. When the camera detects a person's hand movements, the background of the picture matters.
To counter this problem, we eliminate the background and focus on the hand so that even the smallest movements of the fingers and hand are recognised. The camera captures this, analyses which gestures and movements the hand is making, and executes the corresponding actions based on that. Another condition that should be considered is the quality and sharpness of the device's camera, since precise gesture capture is required. In the result-evaluation phase, it is important to check the accuracy of each detected gesture and proceed accordingly.
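The paper does not specify how the background is removed; purely as an illustrative sketch, one common OpenCV approach is frame differencing with a built-in background subtractor, as below. The MOG2 parameters are assumptions, not values from the paper.

import cv2

cap = cv2.VideoCapture(0)
# MOG2 is one possible background-subtraction method; the paper does not name one.
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                      # foreground (moving hand) mask
    mask = cv2.medianBlur(mask, 5)                      # suppress speckle noise
    hand_only = cv2.bitwise_and(frame, frame, mask=mask)
    cv2.imshow("Foreground", hand_only)
    if cv2.waitKey(1) & 0xFF == 27:                     # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()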
III. SCOPE AND OBJECTIVE
The goal is to achieve synchronisation with a gesture system that can recognise gestures in real time, automatically adjusting to the lighting conditions. The intention of this project is to identify and detect hand movements and to build an entire system describing hand movement via computer vision. On one side of the system, human interaction, vision, and AI are combined to determine which hand movements are supported and which are not. The design prioritises simplicity and ease of use and requires no specialised hardware: all functions are displayed on the same screen of an ordinary computer or laptop, with only the webcam used to digitise the images.
IV. LITERATURE SURVEY
Based on how individuals perceive and interpret information about their environment, a vision-based approach offers the potential to produce natural, non-contact solutions. The person interacts with the system using their bare hands while it collects the data needed for recognition. Using visual properties such as texture and colour, it gathers the data required for gesture interpretation.


S.No. | Author | Year | Description
1 | Hand Gesture Recognition using OpenCV and Python | 2021 | The capacity of hand gesture recognition systems to collaborate successfully with machines has led to their rapid development in recent years. Gesturing is seen as the most natural form of communication between humans and computers in a virtual environment.
2 | G. Murthy and R. Jadon (Murthy & Jadon, 2009) | 2009 | The authors give a foundation for the field of gesture identification as a means of interacting with computers.
3 | M. K. Ahuja and A. Singh (Ahuja & Singh, 2015) | 2015 | The authors proposed a database-driven hand motion identification system built upon a skin-colour model approach and thresholding, together with effective template matching using PCA.

V. METHODOLOGY
Computer use is expanding rapidly across all industries. Python offers trained libraries for many vision tasks, including face identification, motion detection, and many others. In any industry, a PowerPoint presentation is a necessity. The system operates by capturing motion and relating the captured action or motion to the task to be performed.
OpenCV is the library that helps detect the motion, and it is combined with the camera, where a region is drawn in the camera frame to restrict the motion to a particular area.
This project focuses on gesture control and how it may be used to carry out particular actions with finger movements, such as moving forward and backwards through presentation slides, clicking, and writing on the screen. The gesture is recorded beneath the green line that the camera projects when it is deployed. The theory below explains how gestures are recorded, detected, and used to carry out certain tasks that can simplify our work.
The entire procedure described below is the focal point of the gesture recognition process. The whole system comprises two sections; the camera (image sensor) module, the detection module, and the interface module compose the back-end structure.

Fig1: Gesture detection process

The camera module is used for capturing gesture images and sending them to the detection module for processing.
The detection module is responsible for image processing: whatever images it receives from the camera module, it processes them, eliminates background and noise, and makes the image readable so that gestures can be identified.
The interface module is in charge of matching the detected hand movements to the intended actions. These assigned actions are then sent to our application, the PowerPoint presentation, and the necessary action is carried out. A minimal skeleton of this three-module structure is sketched below.
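Purely as an illustrative sketch (not the authors' implementation), the three modules can be organised as three small functions; the gesture names and finger patterns in the mapping are hypothetical.

import cv2
from cvzone.HandTrackingModule import HandDetector

detector = HandDetector(detectionCon=0.8, maxHands=1)

def camera_module(cap):
    """Grab one frame from the webcam and hand it to the detection module."""
    ok, frame = cap.read()
    return frame if ok else None

def detection_module(frame):
    """Run hand detection on the frame and return the finger pattern (or None)."""
    hands, _ = detector.findHands(frame)
    return detector.fingersUp(hands[0]) if hands else None

def interface_module(fingers):
    """Map a finger pattern to an application action (names are hypothetical)."""
    actions = {(1, 0, 0, 0, 0): "previous_slide",
               (0, 0, 0, 0, 1): "next_slide",
               (0, 1, 0, 0, 0): "pointer"}
    return actions.get(tuple(fingers)) if fingers else None

cap = cv2.VideoCapture(0)
frame = camera_module(cap)
if frame is not None:
    print(interface_module(detection_module(frame)))
cap.release()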
We propose a very effective design for a gesture recognition system. The gesture detection and image processing workflow is shown below:


Fig2: Workflow of process

The cvzone library handles the processes required for detection and processing. To understand how it does so, consider the tasks performed for gesture detection; the steps below give a brief idea, and an illustrative segmentation sketch follows the list:
1) Hand region segmentation removes superfluous data from the video stream using edge detection.
2) RGB values are used, since the RGB values of the hand differ from those of the image background.
3) The backdrop is eliminated.
OpenCV, cvzone, and its hand-tracking module are the Python packages that handle these procedures.
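As a rough illustration of steps 1 to 3 (cvzone performs the equivalent work internally), hand-region segmentation can be sketched with plain OpenCV as follows. The input file name and the YCrCb skin-colour thresholds are assumptions, not values from the paper.

import cv2
import numpy as np

# A single captured frame (hypothetical file name).
frame = cv2.imread("hand_frame.jpg")
assert frame is not None, "frame not found"

# Steps 2 and 3: separate hand-coloured pixels from the background.
ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
skin_mask = cv2.inRange(ycrcb, np.array([0, 133, 77]), np.array([255, 173, 127]))
skin_mask = cv2.morphologyEx(skin_mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

# Step 1: edge detection on the segmented hand region.
hand_only = cv2.bitwise_and(frame, frame, mask=skin_mask)
edges = cv2.Canny(cv2.cvtColor(hand_only, cv2.COLOR_BGR2GRAY), 50, 150)

cv2.imwrite("hand_edges.png", edges)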
A function for the predefined hand gestures is designed to carry out specific actions, such as clicking, scrolling left and right, and drawing with coloured markers on the screen. After the hand gestures have been recognised, the results are mapped to particular action pairs using OpenCV and the hand-tracking module from the cvzone library.
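A minimal sketch of this gesture-to-action mapping is shown below. The paper does not state how keystrokes are delivered to PowerPoint, so the use of pyautogui, like the specific finger patterns, is an assumption made for illustration.

import cv2
import pyautogui  # assumed here for sending keystrokes; not named in the paper
from cvzone.HandTrackingModule import HandDetector

cap = cv2.VideoCapture(0)
detector = HandDetector(detectionCon=0.8, maxHands=1)

while True:
    ok, img = cap.read()
    if not ok:
        break
    hands, img = detector.findHands(img)
    if hands:
        fingers = detector.fingersUp(hands[0])
        if fingers == [1, 0, 0, 0, 0]:      # thumb only: previous slide (assumed pattern)
            pyautogui.press("left")
        elif fingers == [0, 0, 0, 0, 1]:    # little finger only: next slide (assumed pattern)
            pyautogui.press("right")
        elif fingers == [0, 1, 0, 0, 0]:    # index finger up: would drive the on-screen pointer
            pass
    cv2.imshow("Presentation control", img)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()

A real implementation would also debounce these key presses so that a gesture held across several frames does not advance multiple slides.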

VI. REQUIREMENTS
A. Hardware Requirements
1) Operating system: Windows 10 or later, macOS, or Linux
2) Webcam (for real-time hand detection)
3) System type: 64-bit operating system, x64-based processor
4) Processor: Intel(R) Core(TM) i5-6200U CPU @ 2.30 GHz
5) Installed RAM: 8 GB

B. Software Requirements
1) Editor: VS Code, Jupyter
2) Python version: 3.6 or 3.8
3) Libraries: OpenCV (cv2), NumPy, cvzone
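As a quick way to confirm the environment matches these requirements (assuming the PyPI package names opencv-python, numpy, and cvzone), the snippet below simply checks that the libraries import and prints their versions:

import sys
import cv2
import numpy
import cvzone  # hand tracking additionally relies on the mediapipe package

print("Python:", sys.version.split()[0])
print("OpenCV:", cv2.__version__)
print("NumPy:", numpy.__version__)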


VII. OUTPUT

Fig3: Gesture for showing the pointer on screen
Fig4: Gesture for deleting the pointer's actions
Fig5: Gesture for going to the previous slide
Fig6: Gesture for going to the next slide

VIII. CONCLUSION
We can conclude from the above project that a person can use hand gestures as an input method to perform certain tasks, such as scrolling through PowerPoint presentation slides or pointing at something on the screen. Machine learning is a growing branch of computer technology, and thanks to it we can now employ many new features. The gesture-control feature uses the computer's camera to read data and perform the task that corresponds to each gesture. Python is the main programming language used, and with its help this project was completed. In the huge field of gesture control, many other kinds of gestures can be created to perform whatever task you need; here we focused on controlling a PowerPoint presentation through hand gestures.

IX. FUTURE SCOPE


In the future, we can implement more gestures and different types of gestures so that more actions can be performed and more benefit gained from the system. Additionally, we will pay close attention to accuracy and work to improve it continuously. The scope can also be extended to whole-body actions and facial gestures for performing certain tasks, and to controlling applications other than a PowerPoint presentation. We also aim to make the system accessible to more and more people, so everyone can benefit from it and their work becomes easier.


REFERENCES
[1] Li, L., & Zhang, L. (2012). Corner Detection of Hand Gesture. TELKOMNIKA Indonesia Journal of Electrical Engineering, 10(8), 2088-2094.
[2] Murthy, G., & Jadon, R. (2009). A review of vision-based hand gestures recognition. International Journal of Information Technology and Knowledge
Management, 2(2), 405-410.
[3] Parvini, F., & Shahabi, C. (2007). An algorithmic approach for static and dynamic gesture recognition utilising mechanical and biomechanical
[4] Ahuja, M. K., & Singh, A. (2015). Static vision-based Hand Gesture recognition using principal component analysis. Paper presented at the 2015 IEEE 3rd
International Conference on MOOCs, Innovation and Technology in Education (MITE).
[5] Bretzner, L., Laptev, I., & Lindeberg, T. (2002). Hand gesture recognition using multi-scale colour features, hierarchical models and particle filtering. Paper
presented at the Proceedings of fifth IEEE international conference on automatic face gesture recognition.
[6] Garg, P., Aggarwal, N., & Sofat, S. (2009). Vision based hand gesture recognition. World academy of science, engineering and technology, 49(1), 972-977.
[7] Gupta, S., Jaafar, J., & Ahmad, W. F. W. (2012). Static hand gesture recognition using local gabor filter. Procedia Engineering, 41, 827-832.
[8] Hasan, H., & Abdul-Kareem, S. (2014). Retracted article: Human–computer interaction using vision based hand gesture recognition systems: A survey.
Neural Computing and Applications, 25(2), 251-261.
[9] Hasan, M. M., & Mishra, P. K. (2012). Hand gesture modeling and recognition using geometric features: a review. Canadian journal on image processing and
computer vision, 3(1), 12-26.

