Motion Bender is a machine learning-based gesture recognition system designed for earables. It uses real-time IMU data (accelerometer and gyroscope) from earbuds to detect head gestures and control music playback.
The system maps specific head movements to media control actions:
- Nod: Play / Pause
- Shake: Next Track
- Tilt: Previous Track
- Background: No action
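The mapping above can be sketched as a small lookup table. This is an illustrative sketch only — the label strings and the `action_for` helper are assumptions, not the project's actual code:

```python
# Hypothetical gesture-to-action table; label and action names are
# illustrative, not taken from the Motion Bender source.
GESTURE_ACTIONS = {
    "nod": "play_pause",
    "shake": "next_track",
    "tilt": "previous_track",
    "background": None,  # background motion triggers no media action
}

def action_for(gesture: str):
    """Return the media-control action for a predicted gesture label."""
    return GESTURE_ACTIONS.get(gesture)
```

Keeping a dedicated "background" class lets the classifier absorb ordinary head motion without firing spurious playback commands.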
Prerequisites: Make sure you have Python installed (3.8+ recommended).
Clone the repository:
git clone https://github.com/yemoeaung1/motion-bender.git
cd MotionBenderServer
Install dependencies: install the required libraries (PyTorch, etc.) with pip:
pip install -r requirements.txt
Run the Python server to start listening for IMU data:
python server.py
The server will start and wait for HTTP POST requests containing sensor data.
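A minimal sketch of such a POST-receiving server, using only the standard library, is shown below. The real server.py likely differs (it may use a web framework and feed samples to a PyTorch model), and the payload layout assumed by `parse_imu_payload` is a guess, not the streaming app's documented schema:

```python
# Minimal sketch of an IMU-receiving HTTP server (standard library only).
# The JSON layout assumed here is hypothetical, not the app's exact schema.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def parse_imu_payload(body: bytes):
    """Extract (sensor_name, values) pairs from a JSON request body."""
    data = json.loads(body)
    return [(s.get("name"), s.get("values")) for s in data.get("payload", [])]

class IMUHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        samples = parse_imu_payload(self.rfile.read(length))
        # ...pass `samples` to the gesture classifier here...
        self.send_response(200)
        self.end_headers()

# To serve (blocks forever):
# HTTPServer(("0.0.0.0", 8000), IMUHandler).serve_forever()
```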
This project works with the Sensor Logger mobile app (or a compatible IMU streamer). Configure the app as follows:
1. Go to settings.
2. Go to Data Streaming.
3. Enable HTTP Push.
Send HTTP POST requests to your computer's IP address (e.g., http://<YOUR_IP>:<SERVER_PORT>/data).
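For testing without the mobile app, a request like the one the streamer would send can be built by hand. This is a hypothetical client sketch: the JSON layout, host, and port are placeholder assumptions, not the app's exact output:

```python
# Hypothetical test client; the payload layout, host, and port are
# illustrative assumptions, not the Sensor Logger app's exact schema.
import json
from urllib import request

def build_request(host: str, port: int, samples: list) -> request.Request:
    """Build the HTTP POST that would be sent to the server's /data endpoint."""
    body = json.dumps({"payload": samples}).encode()
    return request.Request(
        f"http://{host}:{port}/data",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending (requires the server to be running):
# request.urlopen(build_request("192.168.1.42", 8000, [...]))
```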
This project builds on and modifies code from Inertial-based Activity Recognition with Transformers.