How Much to Aggregate: Learning Adaptive Node-Wise Scales on Graphs for Brain Networks
- Title
- How Much to Aggregate: Learning Adaptive Node-Wise Scales on Graphs for Brain Networks
- Authors
- Choi, Injun; Wu, Guorong; Kim, Won Hwa
- Date Issued
- 2022-09-27
- Publisher
- Society of Medical Image Computing and Computer Assisted Intervention
- Abstract
- Brain connectomes are heavily studied to characterize early symptoms of various neurodegenerative diseases such as Alzheimer’s Disease (AD). As connectomes over different brain regions are naturally represented as graphs, variants of Graph Neural Networks (GNNs) have been developed to identify topological patterns for early disease diagnosis. However, existing GNNs rely heavily on the fixed local structure given by the initial graph, as they aggregate information from the direct neighborhood of each node. Such an approach overlooks useful information from more distant nodes, and multiple node-aggregation layers must be stacked across the entire graph, which leads to over-smoothing. In this regard, we propose a flexible model that learns an adaptive neighborhood scale for each individual node of a graph to incorporate broader information from an appropriate range. Leveraging an adaptive diffusion kernel, the proposed model identifies a desirable scale for each node for feature aggregation, which leads to better prediction of diagnostic labels of brain networks. Empirical results show that our method outperforms well-structured baselines on the Alzheimer’s Disease Neuroimaging Initiative (ADNI) study for classifying various stages toward AD based on the brain connectome and relevant node-wise features from neuroimages.
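The core idea in the abstract, aggregating each node's features over a diffusion kernel with a node-wise scale rather than a fixed one-hop neighborhood, can be illustrated with a small sketch. This is not the paper's implementation: the function name `nodewise_heat_aggregation`, the use of the combinatorial Laplacian, and the fixed `scales` array are all assumptions for illustration; in the actual model the scales are learned parameters.

```python
import numpy as np

def nodewise_heat_aggregation(A, X, scales):
    """Aggregate features X with a heat-diffusion kernel exp(-s_i * L),
    where node i uses its own scale s_i (learned in the paper; fixed here).
    A: (n, n) symmetric adjacency, X: (n, d) features, scales: (n,)."""
    deg = A.sum(axis=1)
    L = np.diag(deg) - A                       # combinatorial graph Laplacian
    lam, U = np.linalg.eigh(L)                 # L = U diag(lam) U^T
    # Row i of the kernel equals row i of expm(-s_i * L):
    # scale each eigenvalue response per node, then project back.
    K = (U * np.exp(-np.outer(scales, lam))) @ U.T
    return K @ X

# Toy path graph 0-1-2: a larger scale diffuses mass farther from the node.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.eye(3)  # one-hot features so rows of the output expose the kernel rows
H = nodewise_heat_aggregation(A, X, scales=np.array([0.1, 1.0, 2.0]))
```

Because `exp(-s_i L)` preserves the constant vector, each row of the aggregated output still sums to one; a small scale keeps most mass on the node itself, while a large scale spreads it toward distant neighbors, which is exactly the "how much to aggregate" trade-off the title refers to.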
- URI
- https://oasis.postech.ac.kr/handle/2014.oak/114390
- Article Type
- Conference
- Citation
- International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI), pp. 376-385, 2022-09-27