A cutting-edge browser-based 3D physics engine with hand gesture recognition and voice commands
Live Demo โข Documentation โข Contributing
Jarvis 1.0 is a high-performance, browser-native 3D physics simulation engine that brings interactive 3D experiences directly to your browser. Control dynamic physics simulations using cutting-edge hand gesture recognition via your webcam, voice commands, or traditional UI controls. Build immersive interactive experiences without leaving your browser.
Perfect for:
- Educational physics demonstrations
- Interactive art installations
- Game prototyping and experimentation
- Virtual particle system manipulation
- Gesture-controlled applications
- ๐ฎ Interactive 3D Object Creation - Instantly spawn cubes and spheres into the 3D scene
- ๐ Advanced Physics Simulation - Realistic gravity, friction, restitution, and collision handling
- ๐ค Hand Gesture Recognition - Real-time webcam-based hand tracking via MediaPipe
- ๐ค Voice Command Control - Control the engine using natural voice commands
- โจ Particle Systems - Dynamic particle effects responsive to hand gestures
- โ๏ธ Gravity Manipulation - Toggle, increase, or decrease gravity in real-time
- ๐ Performance Monitoring - Real-time FPS and object count visualization
- ๐จ Responsive UI - Modern glassmorphism design with real-time statistics
- โก Optimized Rendering - 60 FPS performance with hardware acceleration
- ๐ No Installation Required - Runs directly in your browser with Vite dev server
- Three.js r183 - 3D rendering and visualization
- Cannon-ES 0.20 - Physics engine for realistic simulations
- MediaPipe Hands 0.4 - Hand gesture recognition and tracking
- Vite 5.0 - Lightning-fast build tool and dev server
- Vanilla JavaScript - Pure ES6+ modules, no frameworks
Interactive 3D scene with physics-enabled objects controlled by hand gestures
- Node.js >= 14.0.0 (with npm or yarn)
- Modern Browser with WebGL and WebRTC support (Chrome, Firefox, Edge)
- Webcam for hand gesture recognition
git clone https://github.com/yourusername/jarvis-3d-engine.git
cd jarvis-3d-enginenpm installnpm run devThe engine will be available at http://localhost:5173 by default.
When the app loads, your browser will request permission to access your webcam. Allow this to enable hand gesture tracking.
npm run buildThis creates an optimized build in the dist/ directory.
| Button | Action |
|---|---|
| Add Cube | Create a new cube in the scene |
| Add Sphere | Create a new sphere in the scene |
| Delete Last | Remove the most recently created object |
| Toggle Gravity | Enable/disable gravity simulation |
| Increase | Increase gravity strength |
| Decrease | Decrease gravity strength |
| Start Voice | Activate voice command recognition |
| Gesture | Action |
|---|---|
| ๐ Waving Motion | Rotate the particle system |
| ๐ค Pinch (Thumb + Index) | Change shape and color of particles |
| โ Hand Proximity | Expand/contract particle formation |
| ๐๏ธ Open Hand | Spread particles outward |
"create cube" - Add a new cube
"create sphere" - Add a new sphere
"delete" - Remove last object
"gravity up" - Increase gravity
"gravity down" - Decrease gravity
"toggle gravity" - Enable/disable gravity
- FPS - Frames per second (target: 60)
- Objects - Total physics-enabled objects in scene
- Gravity - Current gravity magnitude (m/sยฒ)
Jarvis 1.0/
โโโ src/
โ โโโ main.js # Application entry point
โ โโโ core/
โ โ โโโ renderer.js # Three.js rendering engine
โ โ โโโ scene.js # Scene management & object creation
โ โโโ hand/
โ โ โโโ handTracker.js # MediaPipe hand detection
โ โ โโโ gestureDetector.js # Gesture recognition logic
โ โโโ particles/
โ โ โโโ particleSystem.js # Particle effect system
โ โโโ voice/
โ โโโ speechController.js # Voice command processing
โโโ index.html # HTML entry point
โโโ package.json # Dependencies & scripts
โโโ vite.config.js # Vite configuration
โโโ README.md # This file
Manages Three.js canvas setup, camera configuration, and rendering loop.
- Handles WebGL context and canvas initialization
- Manages frame rate and performance metrics
- Configures lighting and post-processing effects
Controls physics world and 3D objects.
- Creates and manages Cannon-ES physics bodies
- Handles object creation (cubes, spheres)
- Manages gravity and physics parameters
- Tracks object lifecycle
Real-time hand detection using MediaPipe.
- Captures video stream from webcam
- Detects hand landmarks (21 points per hand)
- Provides normalized hand position and rotation
Interprets hand movements into actionable gestures.
- Recognizes pinch, wave, and proximity gestures
- Maintains gesture state machine
- Fires events for gesture changes
Dynamic particle effects responsive to gestures.
- Creates and manages particles
- Updates particle behavior based on hand input
- Renders particle geometry
Voice command recognition and processing.
- Uses Web Speech API for speech recognition
- Maps voice commands to engine actions
- Provides visual feedback for commands
Edit src/core/scene.js to modify:
// Gravity (m/sยฒ)
this.world.gravity.set(0, -9.8, 0);
// Default object properties
{
mass: 1, // Object mass
friction: 0.4, // Friction coefficient
restitution: 0.4 // Bounce coefficient
}
// Collision response
{
contactEquationRelaxation: 4,
contactEquationStiffness: 1e6
}Edit src/core/renderer.js to modify:
// Camera configuration
camera.fov = 75;
camera.far = 1000;
// Lighting
ambientLight.intensity = 0.6;
directionalLight.intensity = 0.8;
// Target frame rate
targetFPS = 60;Jarvis 1.0 is optimized for smooth performance:
- Rendering: 60 FPS at 1080p on modern hardware
- Physics: Stable simulation with up to 100+ dynamic objects
- Hand Tracking: Real-time detection at 30 FPS
- Memory: ~150MB typical usage
- Latency: <100ms hand-to-visual response time
- Reduce the number of active objects for better performance
- Disable hand tracking if not in use
- Use the production build for deployment
- Monitor FPS in the UI panel
Contributions are welcome! Here's how you can help:
- Fork the repository
- Create a feature branch (
git checkout -b feature/amazing-feature) - Commit your changes (
git commit -m 'Add amazing feature') - Push to the branch (
git push origin feature/amazing-feature) - Open a Pull Request
- Follow ES6+ conventions
- Use descriptive variable and function names
- Add comments for complex logic
- Test your changes with the dev server
- Ensure no console errors or warnings
- โจ Additional gesture recognition patterns
- ๐จ Visual effects and particle system enhancements
- ๐ค Voice command expansion
- ๐ฑ Mobile optimization
- ๐งช Unit and integration tests
- ๐ Documentation improvements
- ๐ Bug fixes and performance optimization
This project is licensed under the MIT License - see the LICENSE file for details.
MIT License
Copyright (c) 2024 Jarvis 1.0 Contributors
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
- Found a Bug? Open an issue
- Have a Question? Start a discussion
- Want to Contribute? See the Contributing section
โญ If you find this project useful, please consider giving it a star!