
Dynamic feature mapping network for unsupervised low-light image enhancement


Abstract. Low-light image enhancement is a challenging task, and zero-reference deep curve estimation (Zero-DCE) is a popular method. However, because enhancement accumulates across its iterative process, Zero-DCE often over-enhances images. To address this issue, a dynamic feature mapping network (DFMNet) is proposed: an adaptive model that integrates dynamic feature mapping, trainable mapping curves, and color adjustment in the LAB color space. Dynamic feature mapping and color adjustment are applied in each iteration to achieve more accurate image enhancement. In addition, mapping low-light images to obtain reference images enables the use of supervised loss functions, thereby improving model performance. Unlike Zero-DCE, DFMNet uses non-additive operations such as subtraction to avoid excessive enhancement, and it enables a supervised loss function in an unsupervised setting, which improves learning efficiency. DFMNet outperforms mainstream supervised methods and leads unsupervised methods, with a computational complexity of only 1.315 giga floating-point operations (GFLOPs) and only 5125 parameters.
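The over-enhancement problem the abstract attributes to Zero-DCE can be seen in the published Zero-DCE light-enhancement curve itself, LE(x) = x + α·x·(1 − x), which is applied repeatedly; each application adds a positive increment (for α > 0), so iterated values drift toward 1. The sketch below illustrates only this accumulation effect; it is not an implementation of DFMNet, whose dynamic feature mapping and subtractive operations are not specified in the abstract.

```python
def zero_dce_curve(x, alpha):
    """One application of the Zero-DCE curve LE(x) = x + alpha * x * (1 - x).

    x is a pixel intensity normalized to [0, 1]; for alpha in [-1, 1] the
    output also stays in [0, 1].
    """
    return x + alpha * x * (1.0 - x)

def iterate(x, alpha, n=8):
    """Apply the curve n times, as in Zero-DCE's iterative enhancement.

    For alpha > 0 each step adds a positive increment, so the result
    accumulates toward 1 -- the over-enhancement the paper targets.
    """
    for _ in range(n):
        x = zero_dce_curve(x, alpha)
    return x

# A dark pixel (0.1) after 8 iterations with alpha = 0.5 is pushed
# close to saturation, far beyond a single curve application.
print(zero_dce_curve(0.1, 0.5))  # one step
print(iterate(0.1, 0.5, n=8))    # eight steps, much brighter
```

The fixed point of the curve for any α in (0, 1] is x = 1, which is why iterating with a uniformly positive α brightens dark regions but can also wash out already well-exposed ones.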

Keywords: low light; feature mapping; dynamic feature; image enhancement

Journal Title: Journal of Electronic Imaging
Year Published: 2025
