
Assessment of Convolutional Neural Network Pre-Trained Models for Detection and Orientation of Cracks

Failure due to cracks is a major structural safety issue for engineering structures. Human examination is the most common method for detecting crack failure, although it is subjective and time-consuming. Crack detection and categorization are therefore a key component of the inspection of civil engineering structures. Convolutional neural networks (CNNs), a subtype of deep learning (DL), can classify images automatically, and a variety of pre-trained CNN architectures are available for image categorization. This study assesses seven pre-trained networks, namely GoogLeNet, MobileNet-V2, Inception-V3, ResNet18, ResNet50, ResNet101, and ShuffleNet, for crack detection and categorization. Images are classified as diagonal crack (DC), horizontal crack (HC), uncracked (UC), or vertical crack (VC). Each architecture is trained on 32,000 images divided equally among the four classes, and the trained models are then tested on 100 images from each category and the results compared. Inception-V3 outperforms all the other models, with accuracies of 96%, 94%, 92%, and 96% for DC, HC, UC, and VC classification, respectively. ResNet101 has the longest training time at 171 min, while ResNet18 has the shortest at 32 min. This research allows the best CNN architecture for automatic detection and orientation of cracks to be selected based on accuracy and training time.
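The abstract does not state which framework or training configuration the authors used; the following is a minimal sketch of the kind of transfer-learning setup it describes, assuming PyTorch/torchvision, an ImageNet pre-trained Inception-V3, and a hypothetical data/train/<DC|HC|UC|VC> folder layout for the four classes.

    # Minimal transfer-learning sketch (assumed framework: PyTorch/torchvision).
    # Class names follow the abstract: DC, HC, UC, VC.
    import torch
    import torch.nn as nn
    from torchvision import datasets, models, transforms

    NUM_CLASSES = 4  # diagonal crack, horizontal crack, uncracked, vertical crack

    # Load an ImageNet pre-trained Inception-V3 and replace both classifier heads
    # so the network predicts the four crack-orientation classes.
    model = models.inception_v3(weights="IMAGENET1K_V1")
    model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)
    model.AuxLogits.fc = nn.Linear(model.AuxLogits.fc.in_features, NUM_CLASSES)

    # Inception-V3 expects 299x299 inputs; ImageNet normalization is reused.
    preprocess = transforms.Compose([
        transforms.Resize((299, 299)),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    # Hypothetical dataset layout: data/train/<class name>/<image>.jpg
    train_set = datasets.ImageFolder("data/train", transform=preprocess)
    loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

    optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
    criterion = nn.CrossEntropyLoss()

    # One fine-tuning pass; in training mode Inception-V3 returns main and
    # auxiliary logits, which are combined with the usual 0.4 weighting.
    model.train()
    for images, labels in loader:
        optimizer.zero_grad()
        outputs, aux_outputs = model(images)
        loss = criterion(outputs, labels) + 0.4 * criterion(aux_outputs, labels)
        loss.backward()
        optimizer.step()

Any of the other six architectures named in the study could be swapped in the same way by replacing its final fully connected layer; the batch size, optimizer, and learning rate above are illustrative assumptions, not values reported by the paper.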

Keywords: crack; pre-trained models; detection; orientation; convolutional neural network

Journal Title: Materials
Year Published: 2023
