
Diagnosis of Glioblastoma Multiforme Progression via Interpretable Structure-Constrained Graph Neural Networks



Glioblastoma multiforme (GBM) is the most common type of brain tumor, with high recurrence and mortality rates. After chemotherapy, GBM patients still show a high rate of pseudoprogression (PsP), which is often confused with true tumor progression (TTP) due to their high phenotypic similarity. It is therefore crucial to construct an automated diagnosis model that differentiates between these two types of glioma progression. However, attaining this goal is impeded by limited data availability and the high demand for interpretability in clinical settings. In this work, we propose an interpretable structure-constrained graph neural network (ISGNN) with enhanced features to automatically discriminate between PsP and TTP. The network employs a metric-based meta-learning strategy that aggregates class-specific graph nodes and focuses on meta-tasks associated with various small graphs, thereby improving classification performance on small-scale datasets. Specifically, a node feature enhancement module accounts for the relative importance of node features and enhances their distinguishability through inductive learning. A graph generation constraint module learns reasonable graph structures, improving the efficiency of information diffusion while avoiding propagation errors. Furthermore, model interpretability is naturally enhanced, since the learned node features and graph structures are closely related to the classification results. Comprehensive experimental evaluation of our method demonstrated excellent interpretable results in the diagnosis of glioma progression. In general, our work provides a novel systematic GNN approach for dealing with data scarcity and enhancing decision interpretability. Our source code will be released at https://github.com/SJTUBME-QianLab/GBM-GNN.
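The abstract does not spell out the ISGNN architecture, but the metric-based meta-learning step it mentions typically works prototypical-network style: embeddings of support examples are aggregated per class, and a query is assigned to the nearest class prototype. The sketch below illustrates that general idea on synthetic 2-D embeddings; the function name, data, and the use of plain Euclidean distance are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def prototype_classify(support_emb, support_lbl, query_emb):
    """Metric-based meta-learning step (prototypical-network style):
    average each class's support embeddings into a class prototype,
    then assign each query to the nearest prototype (Euclidean)."""
    classes = np.unique(support_lbl)
    # one prototype per class: mean of that class's support embeddings
    protos = np.stack([support_emb[support_lbl == c].mean(axis=0)
                       for c in classes])
    # pairwise distances, shape (n_query, n_class)
    dists = np.linalg.norm(query_emb[:, None, :] - protos[None, :, :], axis=-1)
    return classes[dists.argmin(axis=1)]

# Toy two-class episode: cluster 0 (PsP-like) vs cluster 1 (TTP-like).
support = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
labels = np.array([0, 0, 1, 1])
queries = np.array([[0.1, 0.0], [1.0, 0.9]])
print(prototype_classify(support, labels, queries))  # → [0 1]
```

In a graph setting, `support_emb` and `query_emb` would be graph-level embeddings produced by a GNN readout; episodic training over many such small tasks is what makes this strategy suitable for small-scale datasets like the one described here.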

Keywords: interpretable structure; diagnosis; progression; constrained graph; structure constrained; glioblastoma multiforme

Journal Title: IEEE Transactions on Medical Imaging
Year Published: 2022


