Compressive sensing (CS)‐based image reconstruction methods have proposed random undersampling schemes that produce incoherent, noise‐like aliasing artifacts, which are easier to remove. The denoising process is critically assisted by imposing sparsity‐enforcing priors. Sparsity is known to be induced if the prior is in the form of the Lp (0 ≤ p ≤ 1) norm. CS methods generally use a convex relaxation of these priors such as the L1 norm, which may not exploit the full power of CS. An efficient, discrete optimization formulation is proposed, which works not only on arbitrary Lp‐norm priors as some non‐convex CS methods do, but also on highly non‐convex truncated penalty functions, resulting in a specific type of edge‐preserving prior. These advanced features make the minimization problem highly non‐convex, and thus call for more sophisticated minimization routines.
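For concreteness, one common way to write such penalties (an illustrative form, not necessarily the exact formulation used in this work; the truncation threshold $\tau$ is an assumed parameter) is

\[
R_p(x) = \sum_i |x_i|^p \quad (0 < p \le 1), \qquad
R_\tau(x) = \sum_i \min\bigl(|x_i|^p,\, \tau\bigr),
\]

where $R_p$ is the sparsity-enforcing $L_p$ penalty and $R_\tau$ is its truncated, edge-preserving variant: the cap $\tau$ stops large coefficients (e.g., edges) from being penalized further, which is precisely what makes the resulting objective highly non-convex.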