
Released

Preprint

Learning Long Range Dependencies on Graphs via Random Walks

MPS-Authors
Chen, Dexiong (/persons/resource/persons298953)
  Borgwardt, Karsten / Machine Learning and Systems Biology, Max Planck Institute of Biochemistry, Max Planck Society
Schulz, Till (/persons/resource/persons302427)
  Borgwardt, Karsten / Machine Learning and Systems Biology, Max Planck Institute of Biochemistry, Max Planck Society
Borgwardt, Karsten (/persons/resource/persons75313)
  Borgwardt, Karsten / Machine Learning and Systems Biology, Max Planck Institute of Biochemistry, Max Planck Society
Fulltext (public)
2406.03386.pdf (Any fulltext), 666 KB

Citation

Chen, D., Schulz, T., & Borgwardt, K. (2024). Learning Long Range Dependencies on Graphs via Random Walks. arXiv. doi:10.48550/arXiv.2406.03386.


Cite as: https://hdl.handle.net/21.11116/0000-0010-34FC-8
Abstract
Message-passing graph neural networks (GNNs) excel at capturing local relationships but struggle with long-range dependencies in graphs. In contrast, graph transformers (GTs) enable global information exchange but often oversimplify the graph structure by representing graphs as sets of fixed-length vectors. This work introduces a novel architecture that overcomes the shortcomings of both approaches by combining the long-range information of random walks with local message passing. By treating random walks as sequences, our architecture leverages recent advances in sequence models to effectively capture long-range dependencies within these walks. Based on this concept, we propose a framework that offers (1) more expressive graph representations through random walk sequences, (2) the ability to utilize any sequence model for capturing long-range dependencies, and (3) the flexibility to integrate various GNN and GT architectures. Our experimental evaluation demonstrates that our approach achieves significant performance improvements on 19 graph- and node-level benchmark datasets, notably outperforming existing methods by up to 13% on the PascalVOC-SP and COCO-SP datasets. The code is available at https://github.com/BorgwardtLab/NeuralWalker.
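
The core mechanism described in the abstract can be illustrated with a minimal sketch. This is not the authors' NeuralWalker implementation; the function and class names, the dimensions, and the choice of a bidirectional GRU as the sequence model are assumptions made only for illustration. The sketch samples uniform random walks over the graph, treats the visited node features along each walk as a sequence, encodes that sequence with a sequence model, and scatter-averages the per-position outputs back onto the visited nodes, where they would complement local message passing.

# Hedged sketch of the walk-as-sequence idea (illustrative only, plain PyTorch).
import torch
import torch.nn as nn


def sample_random_walks(edge_index, num_nodes, walk_length, walks_per_node=1):
    """Uniform random walks; edge_index is a (2, E) tensor containing both
    directions of each undirected edge (PyG-style convention)."""
    adj = [[] for _ in range(num_nodes)]
    for src, dst in edge_index.t().tolist():
        adj[src].append(dst)
    starts = torch.arange(num_nodes).repeat(walks_per_node)
    walks = [starts]
    current = starts.tolist()
    for _ in range(walk_length - 1):
        # Step to a random neighbor; stay put on isolated nodes.
        nxt = [
            adj[v][torch.randint(len(adj[v]), (1,)).item()] if adj[v] else v
            for v in current
        ]
        current = nxt
        walks.append(torch.tensor(nxt))
    return torch.stack(walks, dim=1)  # (num_walks, walk_length) node indices


class WalkSequenceLayer(nn.Module):
    """Encode node features along random walks with a sequence model (a
    bidirectional GRU here, as an assumed stand-in for any sequence model),
    then aggregate the walk outputs back to the nodes they visited."""

    def __init__(self, dim):
        super().__init__()
        self.seq = nn.GRU(dim, dim, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * dim, dim)

    def forward(self, x, walks):
        # x: (num_nodes, dim); walks: (num_walks, walk_length) of node indices
        walk_feats = x[walks]              # (num_walks, walk_length, dim)
        out, _ = self.seq(walk_feats)      # long-range mixing along each walk
        out = self.proj(out)
        # Scatter-mean the per-position outputs back onto the visited nodes.
        agg = torch.zeros_like(x)
        count = torch.zeros(x.size(0), 1, device=x.device)
        idx = walks.reshape(-1)
        agg.index_add_(0, idx, out.reshape(-1, x.size(1)))
        count.index_add_(0, idx, torch.ones(idx.size(0), 1, device=x.device))
        return x + agg / count.clamp(min=1)  # residual update of node states


if __name__ == "__main__":
    # Toy usage: a 6-node cycle graph with random 16-dimensional features.
    edges = [(i, (i + 1) % 6) for i in range(6)]
    edge_index = torch.tensor(edges + [(j, i) for i, j in edges]).t()
    x = torch.randn(6, 16)
    walks = sample_random_walks(edge_index, num_nodes=6, walk_length=8, walks_per_node=4)
    layer = WalkSequenceLayer(dim=16)
    print(layer(x, walks).shape)  # torch.Size([6, 16])

In the framework described by the abstract, such a walk encoder would be interleaved with standard message-passing or graph-transformer layers; the actual implementation is available at https://github.com/BorgwardtLab/NeuralWalker.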