Finding sparse solutions of underdetermined linear systems commonly requires solving an $L_{1}$-regularized least-squares minimization problem, also known as basis pursuit denoising (BPDN). This problem is computationally expensive because it cannot be solved analytically. An emerging technique known as deep unrolling provides a good combination of the descriptive ability of neural networks, explainability, and computational efficiency for BPDN. Many unrolled neural networks for BPDN, e.g., the learned iterative shrinkage thresholding algorithm (LISTA) and its variants, employ shrinkage functions to prune elements of small magnitude. Through experiments on synthetic aperture radar tomography (TomoSAR), we discover that the shrinkage step leads to unavoidable information loss in the dynamics of the networks and degrades model performance. We propose a recurrent neural network (RNN) with novel sparse minimal gated units (SMGUs) to resolve this information loss. The proposed RNN architecture with SMGUs benefits from incorporating historical information into the optimization and thus effectively preserves full information in the final output. Taking TomoSAR inversion as an example, extensive simulations demonstrate that the proposed RNN outperforms the state-of-the-art deep learning-based algorithm in terms of super-resolution power and generalization ability. It achieves a 10%–20% higher double-scatterer detection rate and is less sensitive to phase and amplitude ratio differences between scatterers. Tests on real TerraSAR-X spotlight images also show high-quality 3-D reconstruction of the test site.
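
For context, the sketch below illustrates the shrinkage (soft-thresholding) step that ISTA-style solvers and their unrolled counterparts apply to the BPDN objective; it is a minimal illustration using plain ISTA, not the authors' learned network, and the matrix `A`, measurements `y`, and regularization weight `lam` are placeholder values chosen for demonstration only.

```python
import numpy as np

# Minimal sketch (plain ISTA, not the paper's RNN): the shrinkage step used by
# ISTA/LISTA-style methods for BPDN,
#   min_x 0.5 * ||A x - y||_2^2 + lam * ||x||_1.
# Elements whose magnitude falls below the threshold are set to zero, which is
# the pruning behavior the paper associates with information loss.

def soft_threshold(z, theta):
    """Element-wise shrinkage: sign(z) * max(|z| - theta, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - theta, 0.0)

def ista(A, y, lam, n_iter=200):
    """Plain ISTA iterations for the BPDN objective above."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)               # gradient of the data-fidelity term
        x = soft_threshold(x - grad / L, lam / L)  # shrinkage prunes small entries
    return x

# Illustrative example: recover a sparse vector from an underdetermined system.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
x_true = np.zeros(200)
x_true[rng.choice(200, 5, replace=False)] = rng.standard_normal(5)
y = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = ista(A, y, lam=0.1)
```

Unrolled networks such as LISTA replace the fixed matrices and thresholds above with learned parameters per layer, but they retain the same shrinkage nonlinearity; the paper's RNN with SMGUs is proposed precisely to avoid the hard pruning this step performs.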