Multimodal medical image fusion combines images from the same or different imaging modalities to enrich the visual content of a single image and to support downstream operations such as image segmentation. Biomedical research and medical image analysis rely heavily on image fusion for higher-level analysis, and fused images help medical practitioners visualize internal organs and tissues. In brain imaging, multimodal fusion lets practitioners simultaneously visualize hard structures such as the skull and soft structures such as tissue. Brain tumor segmentation can be performed more accurately on the fused image: combining information from Positron Emission Tomography (PET) and Magnetic Resonance Imaging (MRI) into a single image allows the tumor region to be located precisely, improving diagnostic accuracy and reducing the time needed to diagnose and localize the tumor. PET carries the functional information of the brain, while MRI captures the anatomy of brain tissue, so a robust multimodal fusion model yields both spatial characteristics and functional information in one image. The proposed approach uses a generative adversarial network (GAN) to fuse PET and MRI images into a single image, and the resulting fused images can support further medical analysis, tumor localization, and surgical planning. The performance of the GAN-based model is evaluated using two metrics, structural similarity index (SSIM) and mutual information (MI); the proposed approach achieves an SSIM of 0.8551 and a mutual information of 2.8059.
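
The abstract names a GAN as the fusion mechanism but does not detail the architecture or losses. The sketch below is a minimal FusionGAN-style setup in PyTorch, not the paper's implementation: the layer sizes, the choice of MRI as the discriminator's "real" reference, and the L1 content loss toward PET are all illustrative assumptions, and inputs are assumed normalized to [-1, 1].

```python
# Minimal FusionGAN-style sketch (illustrative only; not the paper's architecture).
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a 2-channel (PET, MRI) stack to a 1-channel fused image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, 3, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 3, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 32, 3, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 1, 3, padding=1), nn.Tanh(),  # output in [-1, 1]
        )

    def forward(self, pet, mri):
        return self.net(torch.cat([pet, mri], dim=1))

class Discriminator(nn.Module):
    """Scores whether an image looks like a real structural (MRI) image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 1),
        )

    def forward(self, x):
        return self.net(x)

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(pet, mri):
    fused = G(pet, mri)
    # Discriminator: real = MRI (structural reference), fake = fused output.
    d_real, d_fake = D(mri), D(fused.detach())
    loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator: fool D while staying close to the functional content of PET.
    d_fake = D(fused)
    loss_g = bce(d_fake, torch.ones_like(d_fake)) + nn.functional.l1_loss(fused, pet)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()

# Smoke test on a dummy batch of co-registered PET/MRI slices in [-1, 1].
pet = torch.rand(4, 1, 128, 128) * 2 - 1
mri = torch.rand(4, 1, 128, 128) * 2 - 1
print(train_step(pet, mri))
```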
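
The two reported metrics can be reproduced in a few lines. The snippet below is a hedged sketch of one common evaluation protocol, since the abstract does not state the exact one: SSIM against each source image averaged, and mutual information against each source summed. The helper names (`mutual_information`, `evaluate_fusion`) are illustrative, and images are assumed to be floats in [0, 1].

```python
import numpy as np
from skimage.metrics import structural_similarity

def mutual_information(a: np.ndarray, b: np.ndarray, bins: int = 64) -> float:
    """MI (in bits) estimated from the joint intensity histogram of two images."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of a
    py = pxy.sum(axis=0, keepdims=True)   # marginal of b
    nz = pxy > 0                          # skip zero cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def evaluate_fusion(fused, pet, mri):
    """SSIM averaged over both sources; MI summed over both sources."""
    ssim = 0.5 * (structural_similarity(fused, pet, data_range=1.0)
                  + structural_similarity(fused, mri, data_range=1.0))
    mi = mutual_information(fused, pet) + mutual_information(fused, mri)
    return ssim, mi

# Example on dummy co-registered slices normalized to [0, 1].
rng = np.random.default_rng(0)
pet, mri = rng.random((128, 128)), rng.random((128, 128))
fused = 0.5 * (pet + mri)  # stand-in for a GAN output
print(evaluate_fusion(fused, pet, mri))
```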