Basile Terver,
Randall Balestriero,
Megi Dervishi,
David Fan,
Quentin Garrido,
Tushar Nagarajan,
Koustuv Sinha,
Wancong Zhang,
Mike Rabbat,
Yann LeCun,
Amir Bar
An open source library and tutorial for learning representations for
prediction and planning using joint embedding predictive architectures.
Each example is (almost) self-contained, and training takes up to a few hours on a single GPU.
- image_jepa: self-supervised representations from unlabeled images on CIFAR-10, evaluated on classification.
- video_jepa: predicts the next image representation in a sequence.
- ac_video_jepa: a JEPA for world modeling + planning in the Two Rooms environment.
| Planning Episode | Task Definition |
|---|---|
| ![]() | ![]() |
| Successful planning episode | From init to goal state |
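All three examples share the same JEPA core: encode two views, predict the target embedding from the context embedding, and score the prediction error (the "energy") in latent space. A toy numpy sketch of that idea — all names here are illustrative stand-ins, not the library's API:

```python
import numpy as np

rng = np.random.default_rng(0)

def encoder(x, W):
    return x @ W            # linear stand-in for a deep encoder

def predictor(z, P):
    return z @ P            # linear stand-in for the latent predictor

x_ctx = rng.normal(size=(8, 16))    # context view (e.g. frame t)
x_tgt = rng.normal(size=(8, 16))    # target view (e.g. frame t+1)
W = rng.normal(size=(16, 4))        # shared encoder weights
P = rng.normal(size=(4, 4))         # predictor weights

z_ctx, z_tgt = encoder(x_ctx, W), encoder(x_tgt, W)
# Energy = prediction error measured in embedding space, not pixel space
energy = float(np.mean((predictor(z_ctx, P) - z_tgt) ** 2))
```

Training minimizes this energy over the encoder and predictor, with regularizers (the std/cov terms that appear in the configs below) preventing representation collapse.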
We use uv for package management.
# Install dependencies
uv sync
# Option 1: Activate virtual environment
source .venv/bin/activate
python main.py
# Option 2: Run directly with uv
uv run python main.py

If you need conda-specific packages, you can use Conda + uv:
# Create conda environment with Python 3.12
conda create -n eb_jepa python=3.12 -y
conda activate eb_jepa
# Install package in editable mode with dev dependencies (pytest, black, isort)
uv pip install -e . --group dev

Add these to your ~/.bashrc for persistent configuration.
# Required for SLURM jobs to find datasets
export EBJEPA_DSETS=/path/to/eb_jepa/datasets
# Optional: Directory for checkpoints and logs
export EBJEPA_CKPTS=/path/to/checkpoints

# Local training
python -m examples.{image_jepa,video_jepa,ac_video_jepa}.main

Our default configs are tuned for H100 GPUs. On older GPUs (e.g., A100, V100), you may need to reduce the batch size to fit in memory.
All experiments use a unified folder structure:
checkpoints/
└── {example_name}/
    ├── dev_2026-01-16_00-10/        # Single/local runs (dev_ prefix)
    │   └── {exp_name}_seed1/
    │
    ├── sweep_2026-01-16_00-10/      # Auto-named 3-seed sweep
    │   ├── {exp_name}_seed1/
    │   ├── {exp_name}_seed1000/
    │   └── {exp_name}_seed10000/
    │
    └── sweep_my_experiment/         # Custom-named sweep
        └── ...
{exp_name} encodes key hyperparameters to avoid folder collisions, e.g.:
- image_jepa: `resnet_vicreg_proj_bs256_ep300_ph2048_po2048_std1.0_cov80.0`
- video_jepa: `resnet_bs64_lr0.001_std10.0_cov100.0`
- ac_video_jepa: `impala_cov8_std16_simt12_idm1`
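A name like these can be assembled by concatenating an architecture tag with `{key}{value}` pairs. A hypothetical helper illustrating the pattern — the library's actual naming logic may differ:

```python
def make_exp_name(arch: str, hparams: dict) -> str:
    """Encode key hyperparameters into a collision-free folder name."""
    return "_".join([arch] + [f"{k}{v}" for k, v in hparams.items()])

# Reproduces the video_jepa-style name shown above
name = make_exp_name("resnet", {"bs": 64, "lr": 0.001, "std": 10.0, "cov": 100.0})
```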
🖥️ SLURM Launcher (optional)
| Command | Description |
|---|---|
| `--example {name}` | Choose: image_jepa, video_jepa, ac_video_jepa |
| `--fname {path}` | Run the sweep specified in the config at {path} |
| `--single` | Launch a single job (dev mode) |
| `--sweep {name}` | Custom sweep name |
| `--array-parallelism {N}` | Limit the maximum number of concurrent jobs to N |
| `--full-sweep` | Full hyperparameter sweep from config |
| `--use-wandb-sweep` | Enable the wandb sweep UI |
# 3 seeds with wandb averaging (recommended)
python -m examples.launch_sbatch --example image_jepa --fname examples/image_jepa/cfgs/default.yaml
# Custom sweep name
python -m examples.launch_sbatch --example image_jepa --fname examples/image_jepa/cfgs/default.yaml --sweep my_experiment
# Single job
python -m examples.launch_sbatch --example image_jepa --fname examples/image_jepa/cfgs/default.yaml --single
# Full hyperparameter sweep
python -m examples.launch_sbatch --example image_jepa --fname examples/image_jepa/cfgs/default.yaml --full-sweep
# With wandb sweep UI for hyperparameter analysis
python -m examples.launch_sbatch --example image_jepa --fname examples/image_jepa/cfgs/default.yaml --use-wandb-sweep

Replace image_jepa with ac_video_jepa or video_jepa for the other examples.
Full Sweep Configuration: The --full-sweep flag reads the sweep.param_grid section from the example's YAML config file (e.g., examples/image_jepa/cfgs/default.yaml). Without this flag, only a 3-seed sweep is launched. To customize sweep parameters, edit the sweep section in the config:
# Example: examples/image_jepa/cfgs/default.yaml
sweep:
param_grid:
loss.cov_coeff: [0.1, 1.0, 10.0, 100.0]
loss.std_coeff: [1.0, 10.0]
    meta.seed: [1, 1000, 10000]

Runs with the same hyperparameters but different seeds share the same wandb run name, enabling automatic averaging:
1. Go to the wandb web UI → Runs table
2. Click "Group by" → select "Name": groups runs with identical hyperparameters (different seeds) together

To filter runs from a specific sweep:

3. Click "Filter" → "Group" → select your sweep name
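A full sweep expands `param_grid` as a Cartesian product of all listed values. A minimal sketch of that expansion, assuming standard grid semantics (the launcher's actual implementation may differ):

```python
import itertools

# The param_grid from examples/image_jepa/cfgs/default.yaml shown above
param_grid = {
    "loss.cov_coeff": [0.1, 1.0, 10.0, 100.0],
    "loss.std_coeff": [1.0, 10.0],
    "meta.seed": [1, 1000, 10000],
}

keys = list(param_grid)
configs = [dict(zip(keys, combo))
           for combo in itertools.product(*param_grid.values())]
print(len(configs))  # 4 * 2 * 3 = 24 jobs
```

Note that the seed axis multiplies the job count, which is why `--array-parallelism` is useful for capping concurrent jobs.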
For detailed wandb sweep analysis (parallel coordinates, hyperparameter importance):
- Use the `--use-wandb-sweep` flag when launching
- Go to the wandb web UI → left pane → "Sweeps" → click your sweep name
SLURM Configuration: To customize SLURM parameters (partition, account, memory, etc.), edit the SLURM_DEFAULTS dictionary at the top of examples/launch_sbatch.py.
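For orientation, such a defaults dictionary typically looks like the sketch below. All keys and values here are hypothetical placeholders; check examples/launch_sbatch.py for the actual fields:

```python
# Hypothetical shape of the SLURM_DEFAULTS dict; the real keys may differ.
SLURM_DEFAULTS = {
    "partition": "gpu",        # cluster partition to submit to
    "account": "my_account",   # billing account
    "time": "04:00:00",        # wall-clock limit per job
    "mem": "64G",              # memory per node
    "gpus_per_node": 1,        # examples are tuned for a single GPU
}
```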
Libraries added to eb_jepa must have their own test cases. To run the tests:
# With uv sync installation
uv run pytest tests/
# With conda + uv installation (no .venv created)
pytest tests/

Before contributing, please format your code with the following tools:
# Remove unused imports
autoflake --remove-all-unused-imports -r --in-place .
# Sort imports
python -m isort eb_jepa examples tests
# Format code
python -m black eb_jepa examples tests

If you find this repository useful, please consider giving a ⭐ and citing:
@misc{terver2026lightweightlibraryenergybasedjointembedding,
title={A Lightweight Library for Energy-Based Joint-Embedding Predictive Architectures},
author={Basile Terver and Randall Balestriero and Megi Dervishi and David Fan and Quentin Garrido and Tushar Nagarajan and Koustuv Sinha and Wancong Zhang and Mike Rabbat and Yann LeCun and Amir Bar},
year={2026},
eprint={2602.03604},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2602.03604},
}

EB-JEPA is Apache licensed. See LICENSE.