Applying deep neural networks in image-based wavefront sensing allows for the non-iterative regression of the aberrated phase in real time. Given the nonlinear mapping from phase to intensity, it is common to use two focal-plane images in the manner of phase diversity, whereas algorithms based on a single focal-plane image generally yield less accurate estimates. In this paper, we demonstrate that the wavefront can be retrieved with high accuracy from a single image of the pupil-plane intensity pattern. In the context of free-space optical communications (FSOC), a compact dataset containing considerable low-order aberrations is generated to train an EfficientNet, which learns to regress the Zernike polynomial coefficients from the intensity frame. ResNet-50 and Inception-V3 are also tested on the same task, and both are outperformed by EfficientNet by a large margin. To validate the proposed method, the models are fine-tuned and tested on experimental data collected on an adaptive optics platform.
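To illustrate the kind of setup the abstract describes, the sketch below shows a CNN regressing Zernike coefficients from a pupil-plane intensity image. It is a minimal example only: the EfficientNet variant (B0 from torchvision), the number of Zernike modes, the channel replication of the single-channel frame, the MSE loss, and the optimizer settings are all assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
from torchvision import models

# Hypothetical choice: regress the first 20 Zernike coefficients
# (the number of modes used in the paper is not stated here).
N_ZERNIKE = 20

# EfficientNet-B0 backbone with its classifier replaced by a
# linear regression head that outputs the coefficient vector.
model = models.efficientnet_b0(weights=None)
in_features = model.classifier[1].in_features
model.classifier[1] = nn.Linear(in_features, N_ZERNIKE)

# Pupil-plane intensity frames are single-channel; here they are
# simply replicated to three channels to match the backbone input.
def preprocess(intensity):          # intensity: (B, 1, H, W) tensor
    return intensity.repeat(1, 3, 1, 1)

# Training step: mean-squared error between predicted and true
# Zernike coefficient vectors.
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(intensity, coeffs):  # coeffs: (B, N_ZERNIKE)
    optimizer.zero_grad()
    pred = model(preprocess(intensity))
    loss = criterion(pred, coeffs)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Fine-tuning on experimental data, as mentioned in the abstract, would amount to continuing this training loop from simulation-trained weights with a lower learning rate.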