This repository contains the official implementation of the paper "PathFinder: Advancing Path Loss Prediction for Single-to-Multi-Transmitter Scenario".
Project page: https://emorzz1g.github.io/PathFinder
Real-time path loss prediction with a walking transmitter across different urban layouts.
- Multi-GPU training and inference via PyTorch DDP.
- Config-driven experiments (YAML).
- Baselines and ablations supported by swapping config or entry scripts.
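Since training uses PyTorch DDP, a multi-GPU run would typically be launched with torchrun (e.g. `torchrun --nproc_per_node=<N> main.py --config default_pathfinder.yaml --mode train`), which sets a `LOCAL_RANK` environment variable for each worker. The helper below is an illustrative sketch of how an entry script usually maps that variable to a device; it is not the repository's actual code.

```python
import os

def local_device(default: str = "cpu") -> str:
    """Pick the device for this process from the LOCAL_RANK
    environment variable that torchrun sets for each DDP worker.

    Falls back to `default` for single-process runs launched
    without torchrun.
    """
    rank = os.environ.get("LOCAL_RANK")
    if rank is None:
        return default  # no launcher involved; plain single-process run
    return f"cuda:{int(rank)}"
```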
- Python 3.8+ (tested with PyTorch)
- CUDA-capable GPU(s) for training
Install dependencies:
```
pip install -r requirements.txt
```

Repository layout:

- config/: YAML experiment configs
- models/: model code
- logs/: training logs (created at runtime)
- results/: inference outputs (created at runtime)
- trainer/: training and inference logic
You can download our pre-trained models from Google Drive:
https://drive.google.com/drive/folders/1CBaQWyIV5sb2xLmvvwKS_QKQBOZ7ZEov?usp=sharing
Set the dataset path in your config file (see dataset.dataset_dir in
config/default_pathfinder.yaml).
Instructions for downloading the dataset will be provided later.
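A minimal sketch of the dataset block in the config file; only the `dataset.dataset_dir` key is named in this README, and the path shown is a placeholder you should replace with your local dataset root:

```yaml
dataset:
  dataset_dir: /path/to/your/dataset   # local dataset root (placeholder)
```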
Train PathFinder:

```
python main.py --config default_pathfinder.yaml --mode train
```

Test / Inference:

```
python main.py --config default_pathfinder.yaml --mode test
```

Baseline configurations are provided under config/. To run a baseline, switch the config file. Example:

```
python main.py --config default_radiounet.yaml --mode train
```

The baseline_main.py entry point uses a different trainer implementation and can be used for legacy runs.
Key options in YAML:
- `model.name`: model type (e.g., PathFinder, UNet, RadioUNet, PMNet, REM_Net)
- `opti`: optimizer settings and training schedule
- `dataset`: dataset path and split indices
- `load_pretrain` and `ckp_name`: control checkpoint loading
- Checkpoints: `model_path` in config (default `models/`)
- Logs: `log_path` in config (default `logs_pf/` or `logs_rem/`)
- Results: `save_path` in config (default `results/`)
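Putting the options above together, a config sketch might look like the following. All values are illustrative, and the exact nesting of keys may differ; consult config/default_pathfinder.yaml for the authoritative layout:

```yaml
model:
  name: PathFinder          # or UNet, RadioUNet, PMNet, REM_Net
opti: {}                    # optimizer settings and training schedule
dataset:
  dataset_dir: /path/to/your/dataset   # placeholder path
load_pretrain: false        # set true to resume from a checkpoint
ckp_name: checkpoint.pth    # checkpoint file used when load_pretrain is true
save_path: results/         # where inference outputs are written
```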
If you use this code, please cite the paper:
@article{zhong2025pathfinder,
title={PathFinder: Advancing Path Loss Prediction for Single-to-Multi-Transmitter Scenario},
author={Zhong, Zhijie and Yu, Zhiwen and Li, Pengyu and Lv, Jianming and Chen, CL and Chen, Min},
journal={arXiv preprint arXiv:2512.14150},
year={2025}
}
See the LICENSE file for details.