Defects recognition algorithm development from visual UAV inspections

dc.contributor.author: Avdelidis, Nicolas Peter
dc.contributor.author: Tsourdos, Antonios
dc.contributor.author: Lafiosca, Pasquale
dc.contributor.author: Plaster, Richard
dc.contributor.author: Plaster, Anna
dc.contributor.author: Droznika, Mark
dc.date.accessioned: 2022-07-25T11:46:26Z
dc.date.available: 2022-07-25T11:46:26Z
dc.date.issued: 2022-06-21
dc.description.abstract: Aircraft maintenance plays a key role in the safety of air transport. One of its most significant procedures is the visual inspection of the aircraft skin for defects. This is mainly carried out manually by a highly skilled inspector walking around the aircraft; it is time consuming, costly and stressful, and the outcome depends heavily on the skill of the inspector. In this paper, we propose a two-step process for automating defect recognition and classification from visual images. The visual inspection can be carried out with an unmanned aerial vehicle (UAV) carrying an image sensor to fully automate the procedure and eliminate human error. In the first step of our proposed method, we perform the crucial part of recognizing a defect. If a defect is found, the image is fed to an ensemble of classifiers that identifies its type. The classifiers are a combination of different pretrained convolutional neural network (CNN) models, which we retrained to fit our problem. To achieve this, we created our own dataset of defect images captured from aircraft during inspection in TUI's maintenance hangar. The images were preprocessed and used to train different pretrained CNNs using transfer learning. We performed an initial training of 40 different CNN architectures to choose the ones that best fitted our dataset, and then selected the best four for fine-tuning and further testing. For the first step, defect recognition, the DenseNet201 architecture performed best, with an overall accuracy of 81.82%. For the second step, defect classification, an ensemble of different CNN models was used. The results show that even with a very small dataset we can reach an accuracy of around 82% in defect recognition, and up to 100% for the classification of the categories of missing or damaged exterior paint and primer, and dents.
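As a rough illustration of the two-step pipeline described in the abstract, the sketch below fine-tunes a pretrained DenseNet201 for the first step (defect / no-defect recognition) and averages the outputs of several retrained CNNs for the second step (defect-type classification). This is a minimal sketch only: the abstract does not state the framework, preprocessing or ensemble rule used by the authors, so TensorFlow/Keras, the 224x224 input size, the frozen-base training scheme and the probability-averaging ensemble are all assumptions made here for illustration.

import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (224, 224)  # assumed input resolution; DenseNet201's default

def build_recognizer():
    # Step 1: binary defect / no-defect recognizer via transfer learning.
    base = tf.keras.applications.DenseNet201(
        include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
    base.trainable = False  # freeze ImageNet features first; unfreeze later to fine-tune
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

def build_classifier(base_cls, n_classes):
    # Step 2: one member of the defect-type ensemble (e.g. dents, damaged paint/primer).
    base = base_cls(include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
    base.trainable = False
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def ensemble_predict(members, images):
    # Average the class probabilities of the retrained CNNs and take the arg-max.
    probs = sum(m.predict(images, verbose=0) for m in members) / len(members)
    return probs.argmax(axis=1)

In the paper, the ensemble members are drawn from the best four of the 40 screened architectures; here base_cls is purely a placeholder for any such pretrained Keras application model.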
dc.identifier.citation: Avdelidis NP, Tsourdos A, Lafiosca P, et al. (2022) Defects recognition algorithm development from visual UAV inspections. Sensors, Volume 22, Issue 13, July 2022, Article number 4682
dc.identifier.issn: 1424-8220
dc.identifier.uri: https://doi.org/10.3390/s22134682
dc.identifier.uri: https://dspace.lib.cranfield.ac.uk/handle/1826/18219
dc.language.iso: en
dc.publisher: MDPI
dc.rights: Attribution 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: defect recognition
dc.subject: aircraft inspection
dc.subject: deep learning
dc.subject: CNN
dc.subject: UAV
dc.subject: defect classification
dc.subject: AI
dc.title: Defects recognition algorithm development from visual UAV inspections
dc.type: Article

Files

Original bundle
Name: Defects_recognition_algorithm_development-2022.pdf
Size: 730.66 KB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 1.63 KB
Format: Item-specific license agreed upon to submission