This project focuses on developing a hand gesture recognition system using Python and the MediaPipe library. The primary goal is to recognize specific hand gestures and classify them into meaningful actions, such as “pointing” (index finger raised), “love” (thumb, index, and pinky raised), and “you” (pinky finger raised).
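As a rough sketch of the classification step, the gestures above can be expressed as rules over a per-finger up/down state. The function name, dictionary layout, and gesture labels below are illustrative assumptions for demonstration, not the project's exact code:

```python
# Illustrative sketch: map per-finger up/down states to the gestures described above.
# The function name and dict layout are assumptions, not this project's exact code.

def classify_gesture(fingers_up):
    """fingers_up: dict of booleans for 'thumb', 'index', 'middle', 'ring', 'pinky'."""
    thumb = fingers_up["thumb"]
    index = fingers_up["index"]
    middle = fingers_up["middle"]
    ring = fingers_up["ring"]
    pinky = fingers_up["pinky"]

    if index and not (thumb or middle or ring or pinky):
        return "pointing"   # only the index finger is raised
    if thumb and index and pinky and not (middle or ring):
        return "love"       # thumb, index, and pinky raised
    if pinky and not (thumb or index or middle or ring):
        return "you"        # only the pinky is raised
    return None             # no recognized gesture


# Example usage:
print(classify_gesture({"thumb": True, "index": True, "middle": False,
                        "ring": False, "pinky": True}))  # -> "love"
```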
The system uses real-time webcam input to track hand landmarks and applies custom logic to detect gestures based on finger positions. MediaPipe’s Hands module extracts 21 landmarks per hand, which are then analyzed to determine whether each finger is up or down.
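A minimal sketch of this pipeline is shown below. The landmark indices follow MediaPipe’s documented 21-point hand model, but the confidence threshold, the thumb heuristic, and the tip-above-PIP up/down check are illustrative assumptions rather than this project's exact implementation:

```python
# Minimal sketch: webcam capture, MediaPipe hand landmarks, simple finger up/down check.
# Landmark indices follow MediaPipe's 21-point hand model; the thresholds and the
# up/down heuristic (tip above PIP joint in image coordinates) are assumptions.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

FINGER_TIPS = {"index": 8, "middle": 12, "ring": 16, "pinky": 20}
FINGER_PIPS = {"index": 6, "middle": 10, "ring": 14, "pinky": 18}

def fingers_status(landmarks):
    """Return a dict of finger -> True if raised, based on tip vs. PIP y-coordinate."""
    status = {}
    for name in FINGER_TIPS:
        # y grows downward in image coordinates, so a raised fingertip has a smaller y.
        status[name] = landmarks[FINGER_TIPS[name]].y < landmarks[FINGER_PIPS[name]].y
    # Crude, handedness-agnostic thumb check: horizontal distance between tip and MCP joint.
    status["thumb"] = abs(landmarks[4].x - landmarks[2].x) > 0.1
    return status

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR frames.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            hand = results.multi_hand_landmarks[0]
            mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
            print(fingers_status(hand.landmark))
        cv2.imshow("Hand Tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```

The per-frame finger states produced here can then be fed to a rule-based classifier like the one sketched earlier.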
This project showcases my ability to implement computer vision techniques and apply them creatively in real-world scenarios. It can be used as a foundation for gesture-based control systems, interactive art installations, or accessibility tools.
Technologies used:
Python
OpenCV
MediaPipe