Archaeoinformatics - Data Science

MA: Overcoming Oversmoothing in Graph Neural Networks

Author: Christian Beth, M.Sc.

Supervisor: Prof. Dr. Matthias Renz

Graph Attention Network (GAT) Convolution
[Figure: illustration of the Graph Attention Network (GAT) convolution]


Graphs are data structures with a wide variety of applications, ranging from the medical and life sciences to social networks. Recently, graph neural networks (GNNs) have proven to be a powerful tool for solving a multitude of graph-related tasks across these domains, including representation learning, node classification, and link prediction. The success of these approaches is attributed to the Laplacian smoothing of node features that occurs during GNN filtering. This Laplacian smoothing effectively acts as a low-pass filter over the node features, de-noising them and thereby improving feature quality and performance on the learning task. However, state-of-the-art architectures run into the problem of oversmoothing when too many GNN layers are stacked: the filtered features become so similar to each other that they are hard to distinguish for a learning task, which can greatly hurt performance. To overcome the oversmoothing problem, this work proposes a high-pass filter that - similarly to edge detectors in convolutional neural networks (CNNs) - highlights the parts of the signal where large changes occur. Finally, the approach is evaluated against several state-of-the-art architectures on various real-world graphs.
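The low-pass versus high-pass contrast described above can be illustrated on a toy graph. The sketch below is a hypothetical NumPy example (not the thesis' actual architecture): it applies the symmetrically normalized adjacency matrix as a low-pass filter, whose repeated application drives node features together (oversmoothing), and the corresponding normalized Laplacian as a high-pass filter that highlights local differences, analogous to an edge detector.

```python
import numpy as np

# Toy 4-node path graph (hypothetical example, chosen for illustration only).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Symmetric normalization with self-loops: A_hat = D^{-1/2} (A + I) D^{-1/2}.
A_tilde = A + np.eye(4)
d = A_tilde.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt   # low-pass graph filter
L_hat = np.eye(4) - A_hat                   # high-pass graph filter (normalized Laplacian)

# One scalar feature per node.
X = np.array([[1.0], [0.0], [0.0], [1.0]])

# Stacking many low-pass layers smooths the features toward each other,
# shrinking the spread between nodes -- the oversmoothing effect.
low = X.copy()
for _ in range(20):
    low = A_hat @ low
print("low-pass after 20 layers:", np.round(low.ravel(), 3))
print("remaining feature spread:", np.round(low.max() - low.min(), 4))

# A single high-pass layer instead emphasizes where the signal changes.
high = L_hat @ X
print("high-pass after 1 layer :", np.round(high.ravel(), 3))
```

The initial feature spread of 1.0 collapses to a small residual under repeated low-pass filtering, while the high-pass output stays clearly non-uniform, which is the behavior the proposed filter exploits to counteract oversmoothing.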