Controlled Hand Gestures Using Python and OpenCV
https://doi.org/10.22214/ijraset.2023.52285
International Journal for Research in Applied Science & Engineering Technology (IJRASET)
ISSN: 2321-9653; IC Value: 45.98; SJ Impact Factor: 7.538
Volume 11, Issue V, May 2023 - Available at www.ijraset.com
Abstract: Because of its natural, nonverbal character, gesture recognition has become a recent trend in machine development. Gestures are a form of nonverbal communication that allows humans and computers to interact with one another. Artificial intelligence makes heavy use of hand gesture detection to improve functionality and human-computer interaction. Here, to carry out particular tasks, we use mapped gesture-action pairs together with Python libraries (OpenCV, cvzone) that handle image capture, pre-processing, and detection.
Keywords: Gesture recognition, OpenCV, artificial intelligence, Python, machine learning
I. INTRODUCTION
Hand gestures are among the most natural and reliable channels for human-computer interaction. Generally, we use a keyboard, mouse, or some other input device to interact with a computer or application. In this project, using Python libraries, we leverage hand motions to provide input to our code. As we make various hand gestures towards the web camera, it reads the image data and analyses it to determine the type of gesture our hands are making. The system then processes that data to perform a particular action or produce some output. The first step in this process is capturing the hand gesture; the gesture is then analysed, and the action mapped to that gesture is performed. Webcam detection comes first because it detects the hands and serves as the medium through which the computer and the human interact without using words or an external input device. Using hand gestures as input makes issuing commands very easy. Applied to a PowerPoint presentation, a specific task such as scrolling forwards or backwards through slides, or pointing at anything on the display interface, can be performed simply by using your hand and making gestures.
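As a minimal sketch of the capture step described above, the snippet below opens the default webcam with OpenCV and displays frames until 'q' is pressed. The camera index, window name, and resolution are illustrative assumptions, not values given in this paper.

import cv2

cap = cv2.VideoCapture(0)           # default webcam (index 0 assumed)
cap.set(3, 1280)                    # property id 3: frame width
cap.set(4, 720)                     # property id 4: frame height

while True:
    success, frame = cap.read()     # grab one frame from the camera
    if not success:
        break
    cv2.imshow("Gesture Input", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):   # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()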
V. METHODOLOGY
Computer use is expanding rapidly across all industries, and Python provides trained libraries and modules for tasks such as face identification and motion detection, among many others. In any industry, a PowerPoint presentation is a necessity. The system operates by capturing motion and relating it to the task to be performed for that specific action or movement.
OpenCV is the library that handles the motion detection. It works on the camera feed, on which a boundary is drawn to restrict detection to a particular area of the frame.
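As a sketch of this restriction, assuming the region is marked by a horizontal green line (the exact placement is not given in the paper), each frame can be annotated like this:

import cv2

cap = cv2.VideoCapture(0)
while True:
    success, frame = cap.read()
    if not success:
        break
    h, w = frame.shape[:2]
    threshold_y = int(h * 0.3)      # assumed: gesture area bounded at 30% of frame height
    # Green line marking the boundary of the detection region
    cv2.line(frame, (0, threshold_y), (w, threshold_y), (0, 255, 0), 3)
    cv2.imshow("Detection Region", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()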
This project focuses on gesture control and how it may be used to carry out particular actions with finger movements, such as moving forwards and backwards through presentation slides, clicking, and writing on the screen. When the system is deployed, gestures are recorded beneath the green line that the camera projects. The discussion below explains how gestures are recorded, detected, and used to carry out certain tasks that can simplify our work.
The entire procedure described below is the focal point of the gesture recognition process. The whole system consists of two sections, with the back-end structure composed of an image sensor (camera) module, a detection module, and an interface module.
The camera module captures gesture images and sends them to the detection module for processing.
The detection module performs the image processing: whatever images it receives from the camera module, it processes them, eliminates background and noise, and makes the image readable enough to identify gestures.
The interface module is in charge of matching the detected hand movements to the intended actions. These assigned actions are then sent to our application, the PowerPoint presentation, and the necessary action is carried out.
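The three modules can be sketched as a simple pipeline. This is a sketch only, assuming cvzone 1.5+ for detection and pyautogui for sending key presses to PowerPoint; the paper does not name the key-injection mechanism, and the specific finger patterns here are assumptions as well.

import cv2
import pyautogui                                   # assumed: simulates slide-change key presses
from cvzone.HandTrackingModule import HandDetector

def camera_module(cap):
    """Capture one frame for the detection module."""
    success, frame = cap.read()
    return frame if success else None

def detection_module(detector, frame):
    """Detect hands; cvzone/MediaPipe handle noise and background internally."""
    hands, frame = detector.findHands(frame)       # cvzone 1.5+: returns (hands, image)
    return hands, frame

def interface_module(detector, hands):
    """Map the detected gesture to a PowerPoint action (assumed mapping)."""
    if not hands:
        return
    fingers = detector.fingersUp(hands[0])         # five 0/1 flags, thumb to pinky
    if fingers == [1, 0, 0, 0, 0]:                 # thumb only: previous slide
        pyautogui.press('left')
    elif fingers == [0, 0, 0, 0, 1]:               # pinky only: next slide
        pyautogui.press('right')

cap = cv2.VideoCapture(0)
detector = HandDetector(detectionCon=0.8, maxHands=1)
while True:
    frame = camera_module(cap)
    if frame is None:
        break
    hands, frame = detection_module(detector, frame)
    interface_module(detector, hands)
    cv2.imshow("Pipeline", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()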
We propose a very effective approach for the gesture recognition system. The gesture detection and image processing workflow is as follows:
The cvzone library handles the processes required for detection and processing. To understand how, consider the tasks carried out for gesture detection. The steps below give a brief idea (a code sketch follows the list):
1) Hand region segmentation: superfluous data is removed from the video stream using edge detection.
2) Colour separation: the colour values of the hand are entirely different from those of the image background.
3) Background elimination: the backdrop is removed so that only the hand remains.
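The sketch below illustrates steps 1-3 with plain OpenCV on a single frame. The skin-tone bounds (in HSV, which is more robust than raw RGB for this purpose) are rough assumptions that would need tuning per camera and lighting, and hand.jpg is a hypothetical sample image.

import cv2
import numpy as np

frame = cv2.imread("hand.jpg")                        # hypothetical sample frame
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)          # HSV separates colour from brightness

# Rough skin-tone bounds -- assumed values, tune per camera and lighting
lower_skin = np.array([0, 30, 60], dtype=np.uint8)
upper_skin = np.array([20, 150, 255], dtype=np.uint8)

mask = cv2.inRange(hsv, lower_skin, upper_skin)       # step 2: keep hand-coloured pixels
mask = cv2.GaussianBlur(mask, (5, 5), 0)              # suppress speckle noise
edges = cv2.Canny(mask, 50, 150)                      # step 1: edges of the hand region
hand_only = cv2.bitwise_and(frame, frame, mask=mask)  # step 3: backdrop eliminated

cv2.imshow("Hand mask", mask)
cv2.imshow("Hand edges", edges)
cv2.imshow("Hand only", hand_only)
cv2.waitKey(0)
cv2.destroyAllWindows()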
OpenCV and cvzone (together with its hand tracking module) are the Python packages that handle these procedures.
The predefined hand-motion functions are designed to carry out specific actions, such as clicking, scrolling left and right, and drawing with coloured markers on the screen. After the hand gestures have been recognised, the results are mapped to particular action pairs using OpenCV and the hand tracking module from the cvzone library.
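As a concrete sketch of the drawing action, the snippet below tracks the index fingertip with cvzone's hand tracking module and paints on the frame while only the index finger is raised. Landmark 8 is the index fingertip in MediaPipe's hand model; the erase gesture and marker colour are assumptions, not taken from the paper.

import cv2
from cvzone.HandTrackingModule import HandDetector

cap = cv2.VideoCapture(0)
detector = HandDetector(detectionCon=0.8, maxHands=1)
annotations = []                                   # recorded marker points

while True:
    success, frame = cap.read()
    if not success:
        break
    hands, frame = detector.findHands(frame)
    if hands:
        fingers = detector.fingersUp(hands[0])
        if fingers == [0, 1, 0, 0, 0]:             # index finger only: draw
            x, y = hands[0]['lmList'][8][:2]       # landmark 8 = index fingertip
            annotations.append((x, y))
        elif fingers == [0, 1, 1, 1, 0]:           # assumed gesture: erase drawings
            annotations.clear()
    for point in annotations:                      # replay strokes as red marker dots
        cv2.circle(frame, point, 6, (0, 0, 255), cv2.FILLED)
    cv2.imshow("Marker", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()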
VI. REQUIREMENTS
A. Hardware Requirements
1) Webcam (for real-time hand detection)
2) System type: 64-bit operating system, x64-based processor
3) Processor: Intel(R) Core(TM) i5-6200U CPU @ 2.30 GHz (2.40 GHz)
4) Installed RAM: 8 GB
B. Software Requirements
1) Operating system: Windows 10 or above, macOS, or Linux
2) Editor: VS Code, Jupyter
3) Python version: 3.6 or 3.8
4) Libraries: OpenCV (imported as cv2), NumPy, cvzone
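A quick sanity check that the listed libraries are installed and importable (a sketch; beyond Python 3.6/3.8, exact versions are not prescribed by the paper):

import sys
import cv2                      # the OpenCV bindings module
import numpy
import cvzone                   # wraps MediaPipe hand tracking

print("Python:", sys.version.split()[0])   # expect 3.6.x or 3.8.x
print("OpenCV:", cv2.__version__)
print("NumPy :", numpy.__version__)
print("cvzone imported successfully")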
VII. OUTPUT
[Gesture figures]
Fig. 4: For deleting the pointer's actions
Fig. 5: For going to the previous slide
Fig. 6: For going to the next slide
VIII. CONCLUSION
We can conclude from the above project that a person can use hand gestures as an input method to perform certain tasks, such as scrolling through PowerPoint presentation slides or pointing at something on the screen. Machine learning is a growing branch of computer technology, and thanks to it we can employ many new features these days. The gesture control feature uses the computer's camera to read data and perform the task that corresponds to each gesture. Python is the main programming language used, and this project was completed with its help. In the huge field of gesture control, many other sorts of gestures can be created to perform any task you need; here, we focused on a PowerPoint presentation and controlling it through hand gestures.