
Infrared and Visible Image Fusion Using a Deep Unsupervised Framework With Perceptual Loss



The fusion of infrared and visible images can utilize the indication characteristics and the textural details of the source images to realize all-weather detection. Deep learning (DL) based fusion solutions can reduce computational cost and complexity compared with traditional methods, since there is no need to design complex feature extraction methods and fusion rules. However, there are no standard reference images, and publicly available infrared and visible image pairs are scarce. Most supervised DL-based solutions therefore have to be pre-trained on other large labeled datasets and may not perform well at test time. The few existing unsupervised fusion methods can hardly obtain fusion images with a good visual impression. In this paper, an infrared and visible image fusion method based on an unsupervised convolutional neural network is proposed. In the network structure, a densely connected convolutional network (DenseNet) is used as the sub-network for feature extraction and reconstruction, so that more information from the source images is retained in the fused images. For the loss function, a perceptual loss is introduced and combined with a structural similarity loss to constrain the updating of the weight parameters during back propagation. The designed perceptual loss effectively improves the visual information fidelity (VIF) of the fused image. Experimental results show that this method obtains fusion images with prominent targets and clear details. Compared with seven other traditional and deep learning methods, the fusion results of this method are better when objective evaluation and visual observation are taken together.
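The abstract does not give implementation details, but the described training objective (a perceptual loss combined with a structural similarity loss against both source images) can be sketched as follows. This is a minimal illustrative PyTorch sketch, not the authors' code: the choice of VGG-16 as the perceptual feature extractor, the layer index, the loss weights `alpha` and `beta`, and the use of the third-party `pytorch_msssim` package for SSIM are all assumptions made for the example.

```python
# Illustrative sketch of an unsupervised fusion loss combining a perceptual
# term with a structural-similarity term, as described in the abstract.
# VGG-16 as feature extractor, layer cutoff, and the weights alpha/beta
# are assumptions, not the paper's exact configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import vgg16
from pytorch_msssim import ssim  # third-party SSIM implementation


class PerceptualLoss(nn.Module):
    """Compare deep feature maps of the fused image and a source image."""

    def __init__(self, layer_index: int = 16):
        super().__init__()
        features = vgg16(weights="IMAGENET1K_V1").features[:layer_index]
        for p in features.parameters():
            p.requires_grad = False  # frozen feature extractor
        self.features = features.eval()

    def forward(self, fused: torch.Tensor, source: torch.Tensor) -> torch.Tensor:
        # VGG expects 3-channel input; repeat single-channel images.
        f = self.features(fused.repeat(1, 3, 1, 1))
        s = self.features(source.repeat(1, 3, 1, 1))
        return F.mse_loss(f, s)


def fusion_loss(fused, ir, vis, perceptual, alpha=1.0, beta=0.1):
    """Combined loss: SSIM against both source images plus a perceptual term."""
    ssim_term = (1 - ssim(fused, ir, data_range=1.0)) + \
                (1 - ssim(fused, vis, data_range=1.0))
    perc_term = perceptual(fused, ir) + perceptual(fused, vis)
    return alpha * ssim_term + beta * perc_term
```

During unsupervised training, the fusion network (e.g. the DenseNet-style encoder/decoder mentioned above) would minimize this loss directly on infrared/visible pairs, with no reference fused image required.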

Keywords: perceptual loss; visible image; fusion; loss; infrared visible

Journal Title: IEEE Access
Year Published: 2020



