Exploring the solutions via Retinex enhancements for fruit recognition impacts of outdoor sunlight: a case study of navel oranges

Machine vision-based techniques are among the critical means of realizing intelligent orchard management, and the third wave of artificial intelligence, driven by deep learning, has promoted the application of machine vision to fruit recognition. Current detection models can extract fruits from collected images, yet their accuracy often deviates because sunlight changes with the observation time throughout the day. Exploring a method to address this problem is therefore of great significance for smart orchards. On this basis, this article takes the navel orange as its study object, collecting image data over four observation periods (10:00–11:00, 12:00–13:00, 14:00–15:00, 16:00–17:00) and at two viewing distances of one meter and two meters. Corresponding algorithms are designed for each Retinex processing mode to assist YOLOv5 detection, including single-scale Retinex (SSR), multi-scale Retinex (MSR), multi-scale Retinex with color restoration (MSRCR), multi-scale Retinex with chromaticity preservation (MSRCP), and MSRCR with automatic color gradation adjustment (AutoMSRCR). The experimental results show that the 10:00–11:00 and 14:00–15:00 periods are more conducive to data collection: at the one-meter viewing distance, the MSR-based and MSRCR-based models from these two periods improved mean average precision (mAP) over the original YOLOv5 by 9.28% and 6.32%, respectively, while at the two-meter viewing distance the MSRCR-based and AutoMSRCR-based models achieved mAP gains of 4.92% and 16.91%. The article also provides technical selection schemes and analyzes the sensitivity of each model in typical impact scenarios.
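To make the Retinex preprocessing step concrete, the sketch below shows how single-scale and multi-scale Retinex enhancement of an input image might look before the image is passed to a detector such as YOLOv5. This is a minimal illustration using OpenCV and NumPy with commonly cited default scales; the scales, normalization, and file names are assumptions for demonstration, not the parameters or implementation used in the study.

```python
# Minimal sketch of single-scale and multi-scale Retinex (SSR / MSR) preprocessing.
# Assumes OpenCV (cv2) and NumPy; scales and normalization are common defaults,
# not necessarily those used in the paper.
import cv2
import numpy as np

def single_scale_retinex(img, sigma):
    """SSR: log(I) - log(Gaussian-blurred I), computed per channel."""
    img = img.astype(np.float64) + 1.0           # avoid log(0)
    blur = cv2.GaussianBlur(img, (0, 0), sigma)  # smooth illumination estimate
    return np.log(img) - np.log(blur + 1.0)

def multi_scale_retinex(img, sigmas=(15, 80, 250)):
    """MSR: average of SSR outputs over several Gaussian scales."""
    return np.mean([single_scale_retinex(img, s) for s in sigmas], axis=0)

def to_uint8(retinex):
    """Stretch the Retinex output back to a displayable 0-255 image."""
    lo, hi = retinex.min(), retinex.max()
    return ((retinex - lo) / (hi - lo + 1e-8) * 255).astype(np.uint8)

if __name__ == "__main__":
    # 'orange.jpg' is a placeholder path, not a file from the study.
    bgr = cv2.imread("orange.jpg")
    enhanced = to_uint8(multi_scale_retinex(bgr))
    cv2.imwrite("orange_msr.jpg", enhanced)      # this enhanced image would then be fed to YOLOv5
```

The MSRCR, MSRCP, and AutoMSRCR variants named in the abstract build on this MSR output with additional color-restoration or gain/offset steps to counter the color distortion that plain MSR can introduce.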

Keywords: fruit recognition; meter; scale retinex; retinex; study

Journal Title: Evolutionary Intelligence
Year Published: 2021
