Alejandro Fontan · Tobias Fischer · Nicolas Marticorena
Somayeh Hussaini · Ted Vanderfeen · Beverley Gorry · Javier Civera · Michael Milford
VSLAM-LAB is designed to simplify the development, evaluation, and application of Visual SLAM (VSLAM) systems. This framework enables users to compile and configure VSLAM systems, download and process datasets, and design, run, and evaluate experiments — all from a single command line!
Why Use VSLAM-LAB?
- Unified Framework: Streamlines the management of VSLAM systems and datasets.
- Ease of Use: Run experiments with minimal configuration and single command executions.
- Broad Compatibility: Supports a wide range of VSLAM systems and datasets.
- Reproducible Results: Standardized methods for evaluating and analyzing results.
To ensure all dependencies are installed in a reproducible manner, we use the package management tool pixi. If you haven't installed pixi yet, please run the following command in your terminal:
```
curl -fsSL https://pixi.sh/install.sh | bash
```
After installation, restart your terminal or source your shell configuration for the changes to take effect. For more details, refer to the pixi documentation.
If you already have pixi installed, remember to keep it up to date:
```
pixi self-update
```
Clone the repository and navigate to the project directory:
```
git clone https://github.com/VSLAM-LAB/VSLAM-LAB.git && cd VSLAM-LAB
```
You can now execute any baseline on any sequence from any dataset within VSLAM-LAB using the following command:
```
pixi run demo <baseline> <dataset> <sequence> <mode>
```
For a full list of available systems and datasets, see the VSLAM-LAB Supported Baselines and Datasets. Example commands:
```
pixi run demo mast3rslam eth table_3 mono
pixi run demo droidslam rgbdtum rgbd_dataset_freiburg1_xyz rgbd
pixi run demo orbslam2 kitti 04 stereo
pixi run demo pycuvslam euroc MH_01_easy stereo-vi
```
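The `<mode>` argument must be a sensor configuration that the chosen baseline supports (see the baselines table further down). As a rough illustration — not a VSLAM-LAB API — the check can be sketched in Python with the support map transcribed by hand from that table:

```python
# Sensor modes per baseline, transcribed from the "Supported Baselines"
# table in this README (illustrative only, not VSLAM-LAB code).
SUPPORTED_MODES = {
    "droidslam": {"mono", "rgbd", "stereo"},
    "orbslam2": {"mono", "rgbd", "stereo"},
    "mast3rslam": {"mono"},
    "pycuvslam": {"mono", "rgbd", "stereo", "stereo-vi"},
}

def check_demo_args(baseline: str, mode: str) -> bool:
    """Return True if the baseline lists this sensor mode."""
    return mode in SUPPORTED_MODES.get(baseline, set())

print(check_demo_args("droidslam", "rgbd"))     # True
print(check_demo_args("mast3rslam", "stereo"))  # False
```

If a baseline/mode pair is not listed in the table, the corresponding demo command is not expected to work.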
To change the paths where VSLAM-LAB-Benchmark and/or VSLAM-LAB-Evaluation data are stored (for example, to /media/${USER}/data), use the following commands:
```
pixi run set-benchmark-path /media/${USER}/data
pixi run set-evaluation-path /media/${USER}/data
```
With VSLAM-LAB, you can easily design and configure experiments using a YAML file and run them with a single command. To run the experiment demo, execute the following command:
```
pixi run vslamlab configs/exp_vslamlab.yaml [--overwrite]
```
Experiments in VSLAM-LAB are sequences of entries in a YAML file (see example ~/VSLAM-LAB/configs/exp_vslamlab.yaml):
```yaml
exp_vslamlab:
  Config: config_vslamlab.yaml   # YAML file containing the sequences to be run
  NumRuns: 1                     # Maximum number of executions per sequence
  Parameters: {verbose: 1}       # Parameters passed to the baseline executable
  Module: droidslam              # droidslam/monogs/orbslam2/mast3rslam/dpvo/...
```
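The fields above make up one experiment entry. As a small illustrative sketch (not VSLAM-LAB code), this is how such an entry could be validated once parsed from YAML into a plain dict; the class and function names here are hypothetical:

```python
from dataclasses import dataclass, field

# Parsed form of the exp_vslamlab entry above, as a plain dict
# (e.g. what a YAML loader would return).
entry = {
    "Config": "config_vslamlab.yaml",
    "NumRuns": 1,
    "Parameters": {"verbose": 1},
    "Module": "droidslam",
}

@dataclass
class Experiment:
    config: str
    module: str
    num_runs: int = 1
    parameters: dict = field(default_factory=dict)

def parse_experiment(entry: dict) -> Experiment:
    """Build an Experiment, requiring Config and Module keys."""
    return Experiment(
        config=entry["Config"],
        module=entry["Module"],
        num_runs=int(entry.get("NumRuns", 1)),
        parameters=dict(entry.get("Parameters", {})),
    )

exp = parse_experiment(entry)
print(exp.module)  # droidslam
```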
Config files are YAML files containing the list of sequences to be executed in the experiment (see example ~/VSLAM-LAB/configs/config_vslamlab.yaml):
```yaml
rgbdtum:
  - 'rgbd_dataset_freiburg1_xyz'
hamlyn:
  - 'rectified01'
7scenes:
  - 'chess_seq-01'
eth:
  - 'table_3'
euroc:
  - 'MH_01_easy'
monotum:
  - 'sequence_01'
```
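Conceptually, an experiment expands to every (dataset, sequence) pair in the config file, repeated `NumRuns` times. A minimal sketch of that expansion (the helper name is illustrative, not part of VSLAM-LAB):

```python
from itertools import product

# Sequences per dataset, as in configs/config_vslamlab.yaml above
# (a subset, for brevity).
config = {
    "rgbdtum": ["rgbd_dataset_freiburg1_xyz"],
    "eth": ["table_3"],
    "euroc": ["MH_01_easy"],
}

def expand_runs(config, num_runs=1):
    """List every (dataset, sequence, run_index) execution."""
    pairs = [(d, s) for d, seqs in config.items() for s in seqs]
    return [(d, s, i) for (d, s), i in product(pairs, range(num_runs))]

runs = expand_runs(config, num_runs=2)
print(len(runs))  # 6 executions: 3 sequences x 2 runs each
```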
For a full list of available VSLAM systems and datasets, refer to the section VSLAM-LAB Supported Baselines and Datasets.
In addition to running the full automated pipeline, VSLAM-LAB provides modular commands to interact directly with datasets and baselines. For a comprehensive list of all available commands, consult the Wiki: Command-line Interface.
```
pixi run install-baseline <baseline>             # Example: pixi run install-baseline droidslam
pixi run download-sequence <dataset> <sequence>  # Example: pixi run download-sequence eth table_3
pixi run run-exp <exp_yaml>                      # Example: pixi run run-exp configs/exp_vslamlab.yaml
pixi run evaluate-exp <exp_yaml>                 # Example: pixi run evaluate-exp configs/exp_vslamlab.yaml
pixi run compare-exp <exp_yaml>                  # Example: pixi run compare-exp configs/exp_vslamlab.yaml
```
Expand the evaluation suite by integrating custom datasets. Follow the instructions in Wiki: Integrate a new VSLAM Dataset.
Incorporate new algorithms into the framework. Follow the guide in Wiki: Integrate a new VSLAM Baseline. Benchmark your method against state-of-the-art baselines across all supported datasets.
For a reference implementation, see the VGGT-SLAM integration in commit 259f7ae.
VSLAM-LAB is released under the license in LICENSE.txt. For a list of code dependencies that are not the property of the authors of VSLAM-LAB, please check docs/Dependencies.md.
If you're using VSLAM-LAB in your research, please cite:
```bibtex
@article{fontan2025vslam,
  title={VSLAM-LAB: A Comprehensive Framework for Visual SLAM Methods and Datasets},
  author={Fontan, Alejandro and Fischer, Tobias and Civera, Javier and Milford, Michael},
  journal={arXiv preprint arXiv:2504.04457},
  year={2025}
}
```

| Baselines | System | Sensors | License | Label | Conda Pkg | Camera Models |
|---|---|---|---|---|---|---|
| VGGT-SLAM | VSLAM | mono | BSD-2 | vggtslam | ✅ | pinhole |
| MASt3R-SLAM | VSLAM | mono | CC BY-NC-SA 4.0 | mast3rslam | ✅ | radtan5 unknown |
| DPVO | VSLAM | mono | License | dpvo | ✅ | radtan5 |
| DROID-SLAM | VSLAM | mono rgbd stereo | BSD-3 | droidslam | ✅ | radtan5 |
| ORB-SLAM2 | VSLAM | mono rgbd stereo | GPLv3 | orbslam2 | ✅ | radtan5 |
| MonoGS | VSLAM | mono rgbd | License | monogs | ✅ | radtan5 |
| AnyFeature-VSLAM | VSLAM | mono | GPLv3 | anyfeature | ✅ | radtan5 |
| ---------- | ------- | ------- | ---------- | -------- | --- | ---------- |
| PyCuVSLAM | VSLAM | mono rgbd stereo(-vi) | NVIDIA | pycuvslam | ➖ | radtan5 equid4 |
| ORB-SLAM3 | VSLAM | mono(-vi) rgbd(-vi) stereo(-vi) | GPLv3 | orbslam3 | ✅ | radtan5 equid4 |
| OKVIS2 | VSLAM | mono-vi | BSD-3 | okvis2 | ✅ | radtan5 equid4 |
| ---------- | ------- | ------- | ---------- | -------- | --- | ---------- |
| GLOMAP | SfM | mono | BSD-3 | glomap | ✅ | radtan5 equid4 unknown |
| COLMAP | SfM | mono | BSD | colmap | ✅ | radtan5 equid4 unknown |
| VGGT | SfM | mono | VGGT | vggt | ➖ | pinhole |
| Datasets | Features | Label | Sensors | Camera Models |
|---|---|---|---|---|
| ETH3D SLAM Benchmarks | 📸🏠🤳 | eth | mono rgbd | pinhole |
| RGB-D SLAM Dataset and Benchmark | 📸🏠🤳 | rgbdtum | mono rgbd | radtan5 |
| The KITTI Vision Benchmark Suite | 📸🏞️🚗 | kitti | mono stereo | pinhole |
| The EuRoC MAV Dataset | 📸🏞️🚁 | euroc | mono(-vi) stereo(-vi) | radtan4 |
| The Replica Dataset - iMAP | 💻🏠🤳 | replica | mono rgbd | pinhole |
| TartanAir: A Dataset to Push the Limits of Visual SLAM | 💻🏞️🤳 | tartanair | mono | pinhole |
| ICL-NUIM RGB-D Benchmark Dataset | 💻🏠🤳 | nuim | mono rgbd | pinhole |
| RGB-D Dataset 7-Scenes | 📸🏠🤳 | 7scenes | mono rgbd | pinhole |
| OpenLORIS-Scene Dataset | 📸🏠🤳 | openloris-d400/t265 | mono(-vi) rgbd(-vi) stereo(-vi) | pinhole equid4 |
| Monado SLAM Dataset - Valve Index | 📸🏠🥽 | msd | mono(-vi) stereo(-vi) | equid4 |
| ROVER: A Multiseason Dataset for Visual SLAM | 📸🏞️🚗 | rover-picam/d435i/t265 | mono(-vi) rgbd stereo(-vi) | radtan5 equid4 |
| The UT Campus Object Dataset | 📸🏞️🤖 | ut-coda | mono stereo | radtan5 |
| Sesoko campaign | 📸🏞️🌊 | sesoko | mono | pinhole |
Real / Synthetic : 📸 / 💻
Indoor / Outdoor : 🏠 / 🏞️
Handheld / Headmounted / Vehicle / UAV / Robot / AUV : 🤳 / 🥽 / 🚗 / 🚁 / 🤖 / 🌊
- Extend `orbslam3` and `orbslam3-dev` to `rgbd-vi`
- Extend `okvis2` and `okvis2-dev` to `rgbd-vi` and `stereo-vi`
- Implement `monotum`
- Implement `drunkards`
- Implement `hamlyn`
- Implement `caves`
- Implement `hilti2022`
