Paper Title
A Theoretical Comparison of Graph Neural Network Extensions
Paper Authors
Paper Abstract
We study and compare different Graph Neural Network extensions that increase the expressive power of GNNs beyond the Weisfeiler-Leman test. We focus on (i) GNNs based on higher-order WL methods, (ii) GNNs that preprocess small substructures in the graph, (iii) GNNs that preprocess the graph up to a small radius, and (iv) GNNs that slightly perturb the graph to compute an embedding. We begin by presenting a simple improvement for this last extension that strictly increases the expressive power of this GNN variant. Then, as our main result, we compare the expressive power of these extensions to one another through a series of example constructions, each of which can be distinguished by one of the extensions but not by another. We also present negative examples that are particularly challenging for each of the extensions, and we prove several claims about the ability of these extensions to count cliques and cycles in the graph.
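For context, the following is a minimal sketch (ours, not taken from the paper) of the 1-dimensional Weisfeiler-Leman color refinement test that the abstract uses as the expressiveness baseline. The function name wl_test and the adjacency-list representation are illustrative assumptions. The closing example, a 6-cycle versus two disjoint triangles, is the standard pair that 1-WL (and hence plain message-passing GNNs) cannot distinguish, which is the kind of limitation the cycle-counting extensions above address.

from collections import Counter

def wl_test(adj1, adj2, rounds=None):
    """Return True if 1-WL distinguishes the two graphs.

    Colors are refined jointly on the disjoint union so that the
    final color names are comparable across the two graphs.
    """
    n1 = len(adj1)
    # Build the disjoint union, shifting the second graph's indices by n1.
    adj = [list(nbrs) for nbrs in adj1] + [[u + n1 for u in nbrs] for nbrs in adj2]
    colors = [0] * len(adj)  # start from a uniform coloring
    for _ in range(rounds if rounds is not None else len(adj)):
        # New color = own color plus the sorted multiset of neighbor colors.
        sigs = [(colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in range(len(adj))]
        relabel = {s: i for i, s in enumerate(sorted(set(sigs)))}
        new_colors = [relabel[s] for s in sigs]
        if new_colors == colors:  # stable coloring reached
            break
        colors = new_colors
    # Distinguished iff the two parts end with different color multisets.
    return Counter(colors[:n1]) != Counter(colors[n1:])

# Classic failure case: the 6-cycle versus two disjoint triangles.
# Both graphs are 2-regular, so 1-WL gives every node the same color.
c6 = [[1, 5], [0, 2], [1, 3], [2, 4], [3, 5], [4, 0]]
two_triangles = [[1, 2], [0, 2], [0, 1], [4, 5], [3, 5], [3, 4]]
print(wl_test(c6, two_triangles))  # False: 1-WL cannot tell them apart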