BACKGROUND AND OBJECTIVE The morphology of the human metaphase II (MII) oocyte is an essential indicator of the embryo's potential for developing into a healthy baby in the Intra-Cytoplasmic Sperm Injection (ICSI) process. Characteristics such as oocyte and ooplasm area, zona pellucida (ZP) thickness, and perivitelline space (PVS) width are also linked to the embryo's implantation potential. Moreover, oocyte segmentation methods may be of particular interest in countries with restrictive IVF legislation.

METHODS Because manual examination is impractically time-consuming and subjective, this paper concentrates on designing an automated deep learning framework for the challenging task of segmenting low-resolution microscopic images of MII oocytes. In particular, we have developed a deep learning network based on an improved U-Net model, trained on our unique collection of human MII oocyte images (a new, challenging dataset containing 1,009 images accompanied by manually labeled, pixel-accurate ground truths). Although high-quality ground truth (GT) preparation is a labor-intensive task, we put considerable effort into assessing how different types of GT annotations (binary and multiclass) impact segmentation performance.

RESULTS Experimental results on 250 MII oocyte test images demonstrate that the proposed multiclass segmentation algorithm segments complex and irregular ooplasm, ZP, and PVS structures more accurately than its two-class version. Furthermore, the proposed architecture outperforms two other state-of-the-art deep learning models, U-Net and ENet, on the MII oocyte segmentation task.

CONCLUSIONS The findings of this study provide valuable insight into the automatic and accurate segmentation of human MII oocytes.
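The abstract compares binary and multiclass segmentation of the ooplasm, ZP, and PVS structures but does not state the evaluation metric. A per-class Dice coefficient is a common choice for this kind of comparison; the sketch below is an illustration under that assumption, not the paper's actual evaluation code. The class labels (1 = ooplasm, 2 = ZP, 3 = PVS) and the `dice_per_class` helper are hypothetical.

```python
import numpy as np

def dice_per_class(pred, gt, num_classes):
    """Per-class Dice coefficient between two integer label maps.

    pred, gt: arrays of equal shape with values in [0, num_classes),
    where 0 is background. Returns {class_id: dice_score}.
    """
    scores = {}
    for c in range(1, num_classes):          # skip background class 0
        p = (pred == c)
        g = (gt == c)
        denom = p.sum() + g.sum()
        # Convention: empty-vs-empty counts as a perfect match.
        scores[c] = 2.0 * np.logical_and(p, g).sum() / denom if denom else 1.0
    return scores

# Toy 4x4 label map with three foreground classes
# (hypothetically: 1 = ooplasm, 2 = ZP, 3 = PVS).
gt = np.array([[0, 1, 1, 0],
               [0, 1, 1, 0],
               [2, 2, 3, 3],
               [2, 2, 3, 3]])
pred = gt.copy()
pred[0, 1] = 0  # one mislabeled "ooplasm" pixel

print(dice_per_class(pred, gt, num_classes=4))
# Class 1: 2*3 / (3 + 4) ≈ 0.857; classes 2 and 3 are perfect (1.0).
```

Averaging the per-class scores over a test set (here, the 250 held-out images) would give one number per model, making the multiclass-vs-binary and U-Net/ENet comparisons reported in the results directly quantifiable.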