TY - GEN
T1 - Y-Net
T2 - 2024 IEEE International Instrumentation and Measurement Technology Conference, I2MTC 2024
AU - Kargar, Amin
AU - Zorbas, Dimitrios
AU - Gaffney, Michael
AU - O'Flynn, Brendan
AU - Tedesco, Salvatore
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Insect pests can pose a serious threat to food production and agriculture in general, causing substantial crop damage and economic losses. Monitoring insect pest populations is essential to control and mitigate these losses. Growers and agronomists consider traditional monitoring methods time-consuming and labour-intensive, so the task is often neglected during periods of high activity on farms. This study proposes an automated vision-based insect segmentation and counting approach using deep learning (DL) models developed specifically for embedded systems. An image dataset for our target insect, Halyomorpha halys, was first created from images captured by our IoT-enabled image capture system deployed in a fruit orchard. A Y-Net model inspired by U-Net was then developed, capable of insect counting in addition to segmentation. The model's performance was assessed using a variety of metrics, and the results demonstrated its feasibility and effectiveness in counting and segmenting insects with Edge-AI algorithms capable of running on embedded systems. The proposed Y-Net model achieved a Mean Squared Error (MSE) of 1.9 for the insect counting task, an Intersection over Union (IoU) of 84.5% and a Dice Similarity Coefficient (DSC) of 91.5% for the segmentation task, with an inference time of nearly 0.4 seconds on a smartphone.
AB - Insect pests can pose a serious threat to food production and agriculture in general, causing substantial crop damage and economic losses. Monitoring insect pest populations is essential to control and mitigate these losses. Growers and agronomists consider traditional monitoring methods time-consuming and labour-intensive, so the task is often neglected during periods of high activity on farms. This study proposes an automated vision-based insect segmentation and counting approach using deep learning (DL) models developed specifically for embedded systems. An image dataset for our target insect, Halyomorpha halys, was first created from images captured by our IoT-enabled image capture system deployed in a fruit orchard. A Y-Net model inspired by U-Net was then developed, capable of insect counting in addition to segmentation. The model's performance was assessed using a variety of metrics, and the results demonstrated its feasibility and effectiveness in counting and segmenting insects with Edge-AI algorithms capable of running on embedded systems. The proposed Y-Net model achieved a Mean Squared Error (MSE) of 1.9 for the insect counting task, an Intersection over Union (IoU) of 84.5% and a Dice Similarity Coefficient (DSC) of 91.5% for the segmentation task, with an inference time of nearly 0.4 seconds on a smartphone.
KW - CNN-based architecture
KW - Deep learning
KW - Image segmentation
KW - Insect monitoring
KW - Object counting
KW - Precision agriculture
UR - http://www.scopus.com/inward/record.url?scp=85197815236&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85197815236&partnerID=8YFLogxK
U2 - 10.1109/I2MTC60896.2024.10561007
DO - 10.1109/I2MTC60896.2024.10561007
M3 - Conference contribution
AN - SCOPUS:85197815236
T3 - Conference Record - IEEE Instrumentation and Measurement Technology Conference
BT - I2MTC 2024 - Instrumentation and Measurement Technology Conference
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 20 May 2024 through 23 May 2024
ER -