In this paper, an approximate message passing-based generalized sparse Bayesian learning (AMP-Gr-SBL) algorithm is proposed to reduce the computational complexity of the Gr-SBL algorithm while improving the robustness of the GAMP algorithm to measurement matrices that deviate from the independent and identically distributed (i.i.d.) Gaussian assumption in the generalized linear model (GLM). Using expectation propagation, the original GLM is iteratively decoupled into two sub-modules: a standard linear model (SLM) module and a minimum mean-square-error (MMSE) module. For the SLM module, we apply the SBL algorithm, in which the expectation step is replaced by the AMP algorithm to significantly reduce the computational complexity. Numerical results demonstrate the effectiveness of the proposed AMP-Gr-SBL algorithm.
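To make the SLM module concrete, the following is a minimal illustrative sketch (not the authors' reference implementation) of an SBL expectation-maximization loop in which the expectation step is approximated by (G)AMP iterations. It assumes the EP decoupling has already produced a pseudo standard linear model y = A x + w with Gaussian noise variance sigma2 and a zero-mean Gaussian prior x_i ~ N(0, gamma_i); the function name, initialization, and iteration counts are assumptions for illustration only.

import numpy as np

def amp_sbl_slm(A, y, sigma2, n_em=50, n_amp=20):
    # Illustrative AMP-based E-step inside an SBL (EM) loop for the
    # pseudo standard linear model y = A x + w, w ~ N(0, sigma2 * I),
    # with prior x_i ~ N(0, gamma_i); the M-step updates gamma.
    m, n = A.shape
    A2 = A ** 2                       # elementwise squared entries, used for variance messages
    gamma = np.ones(n)                # SBL hyperparameters (prior variances)
    x_hat, x_var = np.zeros(n), np.ones(n)
    s_hat = np.zeros(m)

    for _ in range(n_em):
        # E-step approximated by AMP iterations
        for _ in range(n_amp):
            # output (measurement) side
            p_var = A2 @ x_var
            p_hat = A @ x_hat - p_var * s_hat      # Onsager correction term
            s_hat = (y - p_hat) / (p_var + sigma2)
            s_var = 1.0 / (p_var + sigma2)
            # input (signal) side
            r_var = 1.0 / (A2.T @ s_var)
            r_hat = x_hat + r_var * (A.T @ s_hat)
            # posterior mean/variance under the Gaussian prior N(0, gamma)
            x_var = gamma * r_var / (gamma + r_var)
            x_hat = gamma * r_hat / (gamma + r_var)
        # M-step: SBL update of the prior variances
        gamma = x_hat ** 2 + x_var

    return x_hat, x_var, gamma

In this sketch the per-iteration cost is dominated by matrix-vector products with A and its elementwise square, which is the source of the complexity reduction relative to an exact E-step requiring a matrix inversion; damping and the EP message exchange with the MMSE module of the full AMP-Gr-SBL algorithm are omitted.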