Source code of the paper: "Out-of-distribution Graph Models Merging"
Keywords: Graph OOD, Domain generalization, Graph neural networks
This paper studies the novel problem of out-of-distribution graph model merging, which aims to construct a generalized model from multiple graph models pre-trained on different domains with distribution discrepancies. The problem is challenging because of the difficulty of extracting domain-invariant knowledge that is implicit in model parameters, and of consolidating expertise from potentially heterogeneous GNN backbones. In this work, we propose a graph generation strategy that instantiates the mixture distribution of multiple domains. We then merge and fine-tune the pre-trained graph models via a mixture-of-experts (MoE) module and a masking mechanism for generalized adaptation. Our framework is architecture-agnostic and can operate without any source- or target-domain data. Both theoretical analysis and experimental results demonstrate the effectiveness of our approach in addressing the model generalization problem.
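To make the merging idea concrete, here is a minimal, self-contained sketch of gating several pre-trained "experts" through an MoE with parameter masking. It is purely illustrative and not the paper's implementation: each expert is reduced to a single linear map, the mask is random, and all names (`experts`, `gate_w`, `moe_forward`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical setup: K pre-trained experts, each reduced to a linear
# map from a shared d-dim graph embedding to c class logits.
K, d, c = 3, 8, 2
experts = [rng.normal(size=(d, c)) for _ in range(K)]

# Gate weights and a binary parameter mask (both fixed here; in the
# paper they would be learned during generalized adaptation).
gate_w = rng.normal(size=(d, K))
mask = (rng.random(size=(d, c)) < 0.9).astype(float)  # keep ~90% of params

def moe_forward(h):
    """Route a graph embedding h through the masked, merged experts."""
    g = softmax(h @ gate_w)                               # gating weights over K experts
    merged = sum(gk * (W * mask) for gk, W in zip(g, experts))
    return h @ merged                                     # combined class logits

h = rng.normal(size=d)      # stand-in for a pooled graph embedding
logits = moe_forward(h)
print(logits.shape)
```

The gate mixes expert parameters per input, while the mask zeroes out a subset of parameters, loosely mirroring the MoE-plus-masking design described above.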
We used Python 3.12.3, PyTorch 2.4.0, and DGL 2.4.0. For the Python packages, please see requirements.txt.
dgl==2.4.0+cu124
matplotlib==3.10.8
networkx==3.4.2
numpy==2.4.2
ogb==1.3.6
pandas==3.0.1
scikit_learn==1.8.0
scipy==1.17.1
torch==2.4.0
tqdm==4.67.1
We used the datasets provided by Deng et al. and Qiao et al.
You can download all the datasets from Google Drive:
https://drive.google.com/drive/folders/1a_SnT9to_ADRXR9338CaNAIaNQT55w_c?usp=sharing
Make sure the data files are in the ./dataset folder.
Run run.sh to run OGMM on the PTC dataset. This directly performs the graph model fusion stage, based on the pre-trained GNNs and the synthesized data.
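The steps above can be sketched end-to-end as follows. The virtual-environment name is an assumption; the rest follows the instructions above.

```shell
# Sketch: set up the environment and launch the fusion stage.
python3 -m venv .venv            # environment name is illustrative
. .venv/bin/activate
pip install -r requirements.txt  # pinned packages listed above
mkdir -p dataset                 # downloaded data files go here
bash run.sh                      # OGMM fusion stage on PTC
```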
