Graph Neural Networks (GNNs) have gained tremendous popularity for their ability to learn effectively from graph-structured data, which is common in real-world applications. However, most of these models are built on the message-passing paradigm (interaction within a small neighborhood of nodes) and can therefore only capture local interactions within a graph. When such models are forced to use information from far-away nodes, two major issues arise: oversmoothing and oversquashing. Architectures such as graph transformers and diffusion models have been introduced to address this, yet they are rarely tested on large graph datasets containing graphs with large diameters. Although transformers are powerful, they require significant computational resources for both training and inference, which limits their scalability, particularly on graphs with long-range dependencies. Hence, this paper proposes GraphHSCN, a Heterogenized Spectral Cluster Network: a message-passing-based approach specifically designed to capture long-range interactions, i.e., cases where a prediction depends on representations of distant nodes interacting with each other.
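To make the core idea more concrete, here is a minimal sketch of one way spectral clustering can be used to shorten long-range paths: cluster the graph spectrally, then attach a virtual node per cluster through a separate (heterogeneous) edge type so any two nodes in the same cluster are at most two hops apart. This is only an illustration of the idea under our stated assumptions, not the actual GraphHSCN implementation; the function name `add_cluster_nodes`, the `num_clusters` parameter, and the use of scikit-learn's `SpectralClustering` are all illustrative choices.

```python
# Sketch only: spectral clustering + per-cluster virtual nodes with a second
# edge type, as one plausible way to realize cluster-based long-range mixing.
import numpy as np
from sklearn.cluster import SpectralClustering


def add_cluster_nodes(adj: np.ndarray, num_clusters: int = 4):
    """Return cluster labels and node<->cluster-node edges.

    adj: dense symmetric adjacency matrix of shape (n, n).
    """
    n = adj.shape[0]
    # Spectral clustering directly on the graph (adjacency used as affinity).
    labels = SpectralClustering(
        n_clusters=num_clusters, affinity="precomputed", assign_labels="kmeans"
    ).fit_predict(adj)

    # One virtual node per cluster (indices n .. n + num_clusters - 1);
    # connect every member to its cluster node. These edges form a second,
    # heterogeneous edge type alongside the original graph edges.
    cluster_edges = [(i, n + int(labels[i])) for i in range(n)]
    return labels, cluster_edges


if __name__ == "__main__":
    # Toy graph: two triangles joined by a single bridge edge.
    A = np.zeros((6, 6))
    for i in range(3):
        for j in range(3):
            if i != j:
                A[i, j] = A[i + 3, j + 3] = 1.0
    A[2, 3] = A[3, 2] = 1.0

    labels, edges = add_cluster_nodes(A, num_clusters=2)
    print(labels)  # e.g. [0 0 0 1 1 1]
    print(edges)   # node -> cluster-node edges used for long-range mixing
```

A message-passing layer can then alternate between the original edges (local structure) and the cluster-node edges (long-range mixing), which is the intuition behind the "heterogenized" part of the name.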
Our model is composed of the following components:
The models are tested on the following datasets:
They are benchmarked against the following models:
The results of our model are shown below:
As the figures indicate, our model achieves performance comparable to SAN on the peptides datasets, which is better than that of other common message-passing GNNs, while being far more efficient than SAN.
As our model does not perform as well on the resampled citation datasets, we propose the following potential causes:
A great deal of excellent work was introduced around the same time as ours.
@article{xxx,
  author  = {Tao, Sirui and Luo, Zhishang and Dunning, Camille},
  title   = {GraphHSCN: Heterogenized Spectral Cluster Network for Long Range Graph Data},
  journal = {arXiv},
  year    = {2023},
}