This project uses MediaPipe and TensorFlow/Keras to detect hand gestures in real time from a webcam feed. It recognizes a set of hand gestures and overlays the predicted class name on the video stream.

Hand gesture detection has numerous applications, from sign language recognition to human-computer interaction. This project demonstrates how to combine MediaPipe for hand landmark detection with a TensorFlow/Keras classifier for gesture recognition, providing a simple yet effective solution for real-time gesture detection.
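The glue between the two libraries is landmark preprocessing: MediaPipe reports 21 hand landmarks per hand with coordinates normalized to the 0-1 range, and they are typically converted to pixel coordinates before being drawn or fed to the classifier. A minimal sketch of that step (the landmark format is MediaPipe's; the helper name is our own):

```python
def landmarks_to_points(landmarks, frame_width, frame_height):
    """Convert MediaPipe's normalized (0-1) hand landmarks to pixel coordinates.

    `landmarks` is any iterable of objects with `.x` and `.y` attributes,
    such as `results.multi_hand_landmarks[i].landmark` from MediaPipe Hands.
    """
    return [
        (int(lm.x * frame_width), int(lm.y * frame_height))
        for lm in landmarks
    ]
```

In the main script, the resulting point list is what gets passed to the Keras model (and to MediaPipe's drawing utilities for visualization).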
- Clone the repository:

  ```bash
  git clone https://github.com/yourusername/hand-gesture-detection.git
  ```

- Install the required dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Download the pre-trained gesture recognizer model (`mp_hand_gesture`) and the class names file (`gesture.names`).

- Place the downloaded model and class names files in the project directory.
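The contents of `requirements.txt` are not shown in this README; based on the libraries named above, a plausible minimal version (package names are the standard PyPI ones, versions deliberately left unpinned) might look like:

```text
opencv-python
mediapipe
tensorflow
numpy
```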
- Run the main script:

  ```bash
  python hand_gesture_detection.py
  ```

- Position your hand in front of the webcam and perform various gestures.

- The recognized gesture will be displayed on the video stream in real time.

- Press `q` to quit the application.
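The label drawn on the video stream is obtained by mapping the classifier's output vector back to a class name from `gesture.names`. A sketch of that mapping (the function name and example gesture names are illustrative):

```python
def predict_label(probabilities, class_names):
    """Return the class name corresponding to the highest predicted probability.

    `probabilities` is the classifier's output vector (one score per class);
    `class_names` is the list read from gesture.names, in the same order.
    """
    best = max(range(len(probabilities)), key=lambda i: probabilities[i])
    return class_names[best]

# Example: predict_label([0.1, 0.7, 0.2], ["okay", "peace", "thumbs up"])
# picks the class with score 0.7.
```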
Contributions are welcome! If you want to contribute to this project, please follow these steps:

- Fork the repository
- Create your feature branch (`git checkout -b feature/YourFeature`)
- Commit your changes (`git commit -am 'Add some feature'`)
- Push to the branch (`git push origin feature/YourFeature`)
- Open a new Pull Request