Welcome to the Hand Gesture-Controlled Game project! It combines computer vision, hand tracking, and gesture recognition to let you control a Unity-based game with your bare hands.
With this setup, you can control the player's actions in the game using two simple hand gestures (a detection sketch follows the list):
- Index Finger Up: The player jumps.
- Index Finger Down: The player runs normally.
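Here is a minimal sketch of how that classification can be done with MediaPipe hand landmarks. The tip-above-PIP comparison, the landmark choice, and the confidence threshold are illustrative assumptions, not necessarily the rule the project's code uses:

```python
# Sketch: classify the index finger as "up" or "down" from MediaPipe landmarks.
# The tip-vs-PIP rule below is an assumption, not necessarily the project's code.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def classify_index_finger(hand_landmarks) -> str:
    """Return "up" if the index fingertip sits above its PIP joint.

    Image y grows downward, so a smaller y value means higher on screen.
    """
    tip = hand_landmarks.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
    pip = hand_landmarks.landmark[mp_hands.HandLandmark.INDEX_FINGER_PIP]
    return "up" if tip.y < pip.y else "down"

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            gesture = classify_index_finger(results.multi_hand_landmarks[0])
            print(gesture)  # "up" -> jump, "down" -> run normally
        cv2.imshow("camera", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```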
The result is a fun, interactive way to play: Python, together with OpenCV, MediaPipe, AppOpener, and ClickHandler, detects hand gestures from the webcam feed and translates them into game controls. The game itself is built with the Unity Engine, which provides a robust environment for game development.
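The ClickHandler module's API is not documented in this excerpt, so as a stand-in the sketch below uses the pynput package to inject the key events; Space as the jump key is also an assumption about the Unity game's input map:

```python
# Sketch: turn a detected gesture into a key event. pynput stands in for the
# ClickHandler module, whose API is not shown here; Space as the jump key is
# an assumed binding.
from pynput.keyboard import Controller, Key

keyboard = Controller()

def send_action(gesture: str) -> None:
    if gesture == "up":
        keyboard.press(Key.space)
        keyboard.release(Key.space)
    # "down" sends nothing: running is the game's default state.
```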
The project is built with:
- Python: The main programming language, used for hand gesture detection and game control.
- OpenCV: A computer vision library, used here for webcam capture and image processing.
- MediaPipe: A framework for building multimodal machine learning pipelines, used here for hand tracking and gesture recognition.
- AppOpener: A Python module to programmatically open applications, used to launch the Unity game (a launch sketch follows this list).
- ClickHandler: A module to simulate mouse and keyboard events, enabling interaction with the game.
- Unity Engine: A cross-platform game engine used to develop the game.
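As an example of the AppOpener piece, the game can be launched by its installed application name. This is a sketch: "Gesture Game" is a hypothetical name, and `match_closest` asks AppOpener to fuzzy-match it against the applications installed on the machine:

```python
# Sketch: launch the Unity build by its installed name. "Gesture Game" is a
# hypothetical name; substitute whatever the game is registered as.
from AppOpener import open as open_app  # aliased to avoid shadowing builtins

open_app("Gesture Game", match_closest=True)
```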
Key features:
- Hand Gesture Detection: Recognizes hand gestures through the camera and translates them into game actions.
- Real-Time Interaction: Gestures are detected and applied to the game in real time as you play.
- Customizable Gestures: Easily modify the code to recognize different gestures and assign them to various game actions (see the mapping sketch after this list).
- Cross-Platform: Designed to run on any operating system where both Unity and Python are available.
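As a concrete example of remapping gestures, the gesture-to-action binding can live in a single lookup table, so adding a gesture only means adding an entry. The gesture labels and key bindings here are illustrative assumptions, not the project's actual code:

```python
# Sketch: one table maps gesture labels to key events. Labels and bindings
# are illustrative, not taken from the project's source.
from pynput.keyboard import Controller, Key

GESTURE_TO_KEY = {
    "index_up": Key.space,  # jump
    "index_down": None,     # run normally: no key event needed
}

keyboard = Controller()

def dispatch(gesture: str) -> None:
    key = GESTURE_TO_KEY.get(gesture)
    if key is not None:
        keyboard.press(key)
        keyboard.release(key)
```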