PARS-NET: a novel deep learning framework using parallel residual convolutional neural networks for sparse-view CT reconstruction

Sparse-view computed tomography (CT) has recently been proposed as a promising way to speed up data acquisition and reduce the radiation dose delivered to patients. However, traditional reconstruction algorithms are time-consuming and suffer from image degradation when applied to sparse-view data. To address this problem, we propose a new deep learning (DL) framework that can quickly produce high-quality CT images from sparsely sampled projections and is suitable for clinical use. The proposed model combines convolutional and residual neural networks in a parallel arrangement and is named the parallel residual network (PARS-Net). In addition, PARS-Net benefits from a geodesic-distance-based loss that effectively reflects image structures. Experiments were performed on a combination of two large-scale datasets of whole-body patient CT images, for sparse acquisitions of 120, 60, and 30 projection views. Our results show that PARS-Net is 4–5 times faster than state-of-the-art DL-based models, requires less memory, and achieves better objective quality scores and improved visual quality. PARS-Net outperformed the latest methods, demonstrating its feasibility for high-quality CT image reconstruction from sparsely sampled projections.
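The abstract describes two sub-networks, a convolutional branch and a residual branch, run in parallel and fused into one model. Below is a minimal PyTorch sketch of what such a parallel residual block could look like; the channel counts, kernel sizes, 1x1 fusion layer, and the ParallelResidualBlock class itself are illustrative assumptions rather than the paper's actual configuration, and the geodesic-distance loss is omitted here because the abstract does not specify its form.

# Minimal sketch of a "parallel residual" block in the spirit of PARS-Net.
# All layer sizes and the fusion scheme are assumptions for illustration.
import torch
import torch.nn as nn

class ParallelResidualBlock(nn.Module):
    """Runs a plain convolutional branch and a residual branch in
    parallel, then merges their outputs (hypothetical configuration)."""

    def __init__(self, channels: int = 64):
        super().__init__()
        # Plain convolutional branch.
        self.conv_branch = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )
        # Residual branch: same conv stack, used with an identity skip.
        self.res_branch = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )
        # 1x1 convolution to fuse the two parallel outputs.
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        conv_out = self.conv_branch(x)
        res_out = x + self.res_branch(x)  # identity skip connection
        # Concatenate the branches along the channel axis and fuse.
        return self.act(self.fuse(torch.cat([conv_out, res_out], dim=1)))

if __name__ == "__main__":
    # Smoke test on a batch of 64-channel feature maps.
    block = ParallelResidualBlock(channels=64)
    y = block(torch.randn(1, 64, 128, 128))
    print(y.shape)  # torch.Size([1, 64, 128, 128])

In this sketch the residual branch's identity skip lets the block learn a correction to its input, while the 1x1 fusion convolution merges the two parallel branches back to the original channel width.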

Keywords: reconstruction; neural networks; sparse view; pars net; deep learning

Journal Title: Journal of Instrumentation
Year Published: 2022
