Paper Title
Brain Network Transformer
Paper Authors
Paper Abstract
Human brains are commonly modeled as networks of Regions of Interest (ROIs) and their connections for the understanding of brain functions and mental disorders. Recently, Transformer-based models have been studied over different types of data, including graphs, and have been shown to bring broad performance gains. In this work, we study Transformer-based models for brain network analysis. Driven by the unique properties of the data, we model brain networks as graphs with nodes of fixed size and order, which allows us to (1) use connection profiles as node features to provide natural and low-cost positional information and (2) learn pair-wise connection strengths among ROIs with efficient attention weights across individuals that are predictive of downstream analysis tasks. Moreover, we propose an Orthonormal Clustering Readout operation based on self-supervised soft clustering and orthonormal projection. This design accounts for the underlying functional modules that determine similar behaviors among groups of ROIs, leading to distinguishable cluster-aware node embeddings and informative graph embeddings. Finally, we re-standardize the evaluation pipeline on ABIDE, the only publicly available large-scale brain network dataset, to enable meaningful comparison of different models. Experimental results show clear improvements of our proposed Brain Network Transformer on both the public ABIDE dataset and our restricted ABCD dataset. The implementation is available at https://github.com/Wayfear/BrainNetworkTransformer.
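To make the two ideas named in the abstract concrete, below is a minimal PyTorch sketch of (1) using each ROI's connection profile, i.e., its row of the functional connectivity matrix, as the node feature, and (2) a clustering-style readout with an orthonormalized set of learnable cluster centers. This is not the authors' implementation: the class name, dimensions, QR-based orthonormalization, and the stand-in encoder are illustrative assumptions; the actual model and readout details are in the linked repository.

```python
# A minimal, self-contained sketch (assumptions, not the paper's exact code):
# (1) connection profiles as node features, (2) an orthonormal-clustering-style readout.
import torch
import torch.nn as nn
import torch.nn.functional as F


class OrthonormalClusterReadout(nn.Module):
    """Softly assign node embeddings to K learnable cluster centers whose basis is
    orthonormalized, then concatenate the per-cluster pooled embeddings."""

    def __init__(self, dim: int, num_clusters: int):
        super().__init__()
        # Learnable cluster centers, one row per assumed functional module.
        self.centers = nn.Parameter(torch.randn(num_clusters, dim))

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: (batch, num_rois, dim) node embeddings from the Transformer encoder.
        # Orthonormalize the centers; QR is one simple choice, the paper's
        # orthonormal projection may differ in detail.
        q, _ = torch.linalg.qr(self.centers.t())      # (dim, K) orthonormal columns
        basis = q.t()                                  # (K, dim)
        # Soft assignment of each ROI to each cluster center.
        assign = F.softmax(z @ basis.t(), dim=-1)      # (batch, num_rois, K)
        # Cluster-wise pooling, then flatten into one graph embedding.
        pooled = assign.transpose(1, 2) @ z            # (batch, K, dim)
        return pooled.flatten(start_dim=1)             # (batch, K * dim)


if __name__ == "__main__":
    batch, num_rois, dim, num_clusters = 4, 200, 200, 10
    # One functional connectivity matrix per subject; each ROI's row
    # (its "connection profile") serves directly as that node's feature vector.
    conn = torch.randn(batch, num_rois, num_rois)
    node_features = conn
    # Hypothetical stand-in for the Transformer encoder: any module mapping
    # the num_rois-dimensional features to dim-dimensional node embeddings.
    encoder = nn.Linear(num_rois, dim)
    z = encoder(node_features)
    readout = OrthonormalClusterReadout(dim, num_clusters)
    graph_embedding = readout(z)
    print(graph_embedding.shape)                       # torch.Size([4, 2000])
```

Because the connection profile doubles as the node feature, no extra positional encoding is needed: the feature itself encodes which ROI a node is, which is the "natural and low-cost positional information" mentioned above.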