Motion Bender

Motion Bender is a machine learning-based gesture recognition system designed for earables. It uses real-time IMU data (accelerometer and gyroscope) from earbuds to detect head gestures and control music playback.

Supported Gestures

The system maps specific head movements to media control actions:

  • Nod: Play / Pause
  • Shake: Next Track
  • Tilt: Previous Track
  • Background: No action
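The gesture-to-action mapping above can be sketched as a simple lookup table. This is an illustrative sketch only; the class labels, the `MediaAction` enum, and the `action_for` helper are hypothetical names, not taken from the actual code.

```python
# Hypothetical sketch of the gesture-to-action mapping; names are
# illustrative and may differ from the real MotionBenderServer code.
from enum import Enum

class MediaAction(Enum):
    PLAY_PAUSE = "play_pause"
    NEXT_TRACK = "next_track"
    PREV_TRACK = "prev_track"
    NONE = "none"

GESTURE_ACTIONS = {
    "nod": MediaAction.PLAY_PAUSE,
    "shake": MediaAction.NEXT_TRACK,
    "tilt": MediaAction.PREV_TRACK,
    "background": MediaAction.NONE,
}

def action_for(gesture: str) -> MediaAction:
    """Return the media action for a predicted gesture label."""
    # Unknown labels fall back to no action, like the background class.
    return GESTURE_ACTIONS.get(gesture, MediaAction.NONE)
```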

Installation

  1. Prerequisites: Make sure you have Python installed (3.8+ recommended).

  2. Clone the repository:

     git clone https://github.com/yemoeaung1/MotionBenderServer.git
     cd MotionBenderServer

  3. Install dependencies: Install the required libraries (PyTorch, etc.) using pip:

     pip install -r requirements.txt

Usage

1. Start the Server

Run the Python server to start listening for IMU data:

python server.py

The server will start and wait for HTTP POST requests containing sensor data.
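A minimal sketch of what such a server could look like, using only the Python standard library. The `/data` endpoint path and the JSON payload shape are assumptions for illustration; the actual server.py may use a different framework and would feed the samples into the gesture classifier.

```python
# Sketch of an HTTP server that accepts JSON-encoded IMU samples via POST.
# Endpoint path and payload shape are assumptions, not the real server.py.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class IMUHandler(BaseHTTPRequestHandler):
    """Handles HTTP POST requests carrying batches of sensor readings."""

    def do_POST(self):
        if self.path != "/data":
            self.send_error(404, "unknown endpoint")
            return
        length = int(self.headers.get("Content-Length", 0))
        message = json.loads(self.rfile.read(length))
        # A real server would buffer these samples into a sliding window
        # and run the trained gesture model on each window.
        samples = message.get("payload", [])
        self.send_response(200)
        self.end_headers()
        self.wfile.write(f"received {len(samples)} samples".encode())

def make_server(host: str = "0.0.0.0", port: int = 8000) -> HTTPServer:
    """Build the server; call .serve_forever() to start listening."""
    return HTTPServer((host, port), IMUHandler)

# make_server().serve_forever()  # blocks, handling incoming POSTs
```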

2. Connect Sensor Logger

This project works with the Sensor Logger mobile app (or any compatible IMU streamer). Configure the app as follows:

1. Go to settings.
2. Go to Data Streaming.
3. Enable HTTP Push.

Set the push URL so the app sends HTTP POST requests to your computer's IP address and the server's port (e.g., http://<YOUR_IP>:<SERVER_PORT>/data).
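To verify the server is reachable before connecting the app, you can send a test POST from another machine on the network. The payload below only mimics the general shape of an IMU streaming message; the exact JSON schema Sensor Logger sends may differ.

```python
# Send one illustrative IMU message to the server; the payload schema
# here is a guess at the general shape, not Sensor Logger's exact format.
import json
import urllib.request

SERVER_URL = "http://192.168.1.42:8000/data"  # replace with your IP and port

sample = {
    "payload": [
        {"name": "accelerometer", "time": 0,
         "values": {"x": 0.01, "y": -0.02, "z": 9.81}},
        {"name": "gyroscope", "time": 0,
         "values": {"x": 0.0, "y": 0.1, "z": 0.0}},
    ]
}

req = urllib.request.Request(
    SERVER_URL,
    data=json.dumps(sample).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment once the server is running
```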

Architecture & Attribution

Base Framework

This project builds on and modifies code from Inertial-based Activity Recognition with Transformers.
