The Hand Gesture Volume Control System is a project that enables users to control the system's volume through hand gestures. Utilizing computer vision techniques, this system recognizes specific gestures and translates them into volume control actions.
- Real-time hand gesture recognition
- Volume control through simple hand movements
- User-friendly interface
- Python: The primary programming language for implementation.
- OpenCV: A library used for computer vision tasks, specifically to capture and process video input.
- MediaPipe: A framework for building multimodal applied machine learning pipelines, used here for hand-tracking.
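MediaPipe's hand tracker reports 21 landmarks per hand in normalized image coordinates, with index 4 the thumb tip and index 8 the index fingertip. As a minimal sketch of the core measurement (the function name and the `(x, y)` tuple representation are illustrative assumptions, not necessarily what main.py does), the "pinch" distance can be computed like this:

```python
import math

# In MediaPipe's 21-point hand model, landmark 4 is the thumb tip
# and landmark 8 is the index fingertip.
THUMB_TIP = 4
INDEX_TIP = 8

def pinch_distance(landmarks):
    """Euclidean distance between the thumb tip and index fingertip.

    `landmarks` is a sequence of 21 (x, y) tuples in normalized
    image coordinates, as produced by MediaPipe's hand tracker.
    """
    x1, y1 = landmarks[THUMB_TIP]
    x2, y2 = landmarks[INDEX_TIP]
    return math.hypot(x2 - x1, y2 - y1)
```

Because the coordinates are normalized to the frame size, the same gesture gives comparable distances at different camera resolutions.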
To run this project, you'll need to have Python installed on your machine. Follow these steps to set up the environment:
- Clone this repository:
git clone https://github.com/arya-io/AI-Volume-Controller.git
- Navigate to the project directory:
cd AI-Volume-Controller
- Install the required libraries:
pip install -r requirements.txt
- Run the main script:
python main.py
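The exact contents of requirements.txt are defined by the repository; based on the libraries listed above, it would typically pin at least OpenCV and MediaPipe, plus a platform-specific audio library (the specific package names below, such as pycaw, are assumptions for illustration):

```
opencv-python
mediapipe
numpy
pycaw       # Windows system-volume control (assumed)
comtypes    # dependency of pycaw (assumed)
```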
- Increase or decrease the distance between the tips of your thumb and index finger to raise or lower the system volume.
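Conceptually, the controller maps the measured finger distance onto a volume range by linear interpolation, muting below a lower threshold and saturating at an upper one. A small sketch of that mapping (the threshold values and function name are assumptions, not the project's actual constants):

```python
def distance_to_volume(dist, d_min=0.03, d_max=0.25):
    """Map a normalized pinch distance to a 0-100 volume level.

    d_min and d_max are illustrative thresholds: distances at or
    below d_min give volume 0, at or above d_max give volume 100,
    with linear interpolation in between.
    """
    # Clamp the distance into the active range, then interpolate.
    clamped = max(d_min, min(d_max, dist))
    return round(100 * (clamped - d_min) / (d_max - d_min))
```

A real implementation would feed this value into the operating system's volume API each frame.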
Contributions are welcome! If you have suggestions for improvements or features, please open an issue or submit a pull request.
This project is licensed under the MIT License - see the LICENSE file for details.
- OpenCV for image processing capabilities.
- MediaPipe for efficient hand tracking.