Detection of insect-induced defoliation in soybeans with deep learning and object detection

Abstract

This thesis uses a modified Faster Region-based Convolutional Neural Network (Faster R-CNN) framework with a Visual Geometry Group 16 (VGG16) feature extraction network to explore two related but distinct applications. The first study evaluated the practicality and accuracy of detecting and labeling individual soybean leaflets according to their defoliation level in images captured with a smartphone. The model was trained and tested on images of individual soybean leaflets with varying defoliation levels. Using a defoliation analysis application (Bioleaf), each leaflet was categorized as either above or below 30% defoliation. One hundred fifty images from each category (300 images total) were used as training data, and 30 images from each category (60 images total) were used as test data. The model achieved an average precision (AP) of 88.96% and an average recall (AR) of 90.55%, correctly identifying and labeling 49 of the 60 test images. The second study evaluated the practicality and accuracy of detecting and labeling soybean defoliation in canopy-level RGB images collected with an unmanned aerial vehicle (UAV). The model was trained and tested on images of the soybean canopy captured at a flying height of approximately 1 meter, with 200 images used for training and 40 images used for testing. The results produced a precision of 25.16% and a recall of 65.00%.
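As a point of reference for the architecture named above, the sketch below shows how a comparable Faster R-CNN detector with a VGG16 feature extractor can be assembled in PyTorch with torchvision. It is not the thesis' actual implementation; the anchor sizes, RoI pooler settings, class count (background plus the two defoliation categories), and pretrained-weight choice are illustrative assumptions.

```python
import torchvision
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.rpn import AnchorGenerator
from torchvision.ops import MultiScaleRoIAlign

# VGG16 convolutional layers serve as the feature extraction network.
# (weights="DEFAULT" requires torchvision >= 0.13.)
backbone = torchvision.models.vgg16(weights="DEFAULT").features
backbone.out_channels = 512  # VGG16 feature maps have 512 channels

# Anchor sizes/aspect ratios and the RoI pooler are generic defaults,
# not values taken from the thesis.
anchor_generator = AnchorGenerator(
    sizes=((32, 64, 128, 256, 512),),
    aspect_ratios=((0.5, 1.0, 2.0),),
)
roi_pooler = MultiScaleRoIAlign(
    featmap_names=["0"], output_size=7, sampling_ratio=2
)

# Assumed classes: background, leaflet below 30% defoliation,
# and leaflet at or above 30% defoliation.
model = FasterRCNN(
    backbone,
    num_classes=3,
    rpn_anchor_generator=anchor_generator,
    box_roi_pool=roi_pooler,
)
model.eval()  # inference mode: model(images) returns boxes, labels, scores
```

In this setup, calling the model on a list of image tensors returns, per image, predicted bounding boxes with class labels and confidence scores, from which precision and recall against annotated test images can be computed.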

Keywords

Defoliation, Deep learning, Soybeans, Object detection

Graduation Month

May

Degree

Master of Arts

Department

Department of Geography and Geospatial Sciences

Major Professor

Douglas G. Goodin

Date

2021

Type

Thesis
