Vision-Based Moving Obstacle Detection and Tracking in Paddy Field Using Improved Yolov3 and Deep SORT
Abstract
1. Introduction
2. Materials and Methods
2.1. Sensors and System Components
2.2. Dataset
2.3. Image-Processing Methods
2.3.1. Object Detection Method
2.3.2. Object Tracking Method
3. Results and Discussion
3.1. Model Training
3.2. Validation on Internet Media
3.3. Practical Application
4. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
| Camera | | Computer | |
|---|---|---|---|
| Sensitivity | 0.05 lx | Central Processing Unit (CPU) | Intel Core® i7-6700HQ |
| Video formats @ frame rate (maximum) | 1280 × 960 (1.2 MP) RGB32 @ 30 fps; 1280 × 960 (1.2 MP) Y800 @ 30 fps; 1280 × 960 (1.2 MP) Y16 @ 30 fps | Graphics Processing Unit (GPU) | Nvidia GTX 970M |
| Temperature (operating) | −5 °C to 45 °C | Memory | 16 GB 64-bit DDR4 |
| Humidity (operating) | 20% to 80% (non-condensing) | Universal Serial Bus (USB) | 3× USB 3.0, USB 2.0 Micro-B |
| Scale (grid) | 13 × 13 | 26 × 26 |
|---|---|---|
| Clusters (pixels) | 157 × 183; 193 × 92; 331 × 171 | 25 × 29; 69 × 61; 89 × 123 |
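Per-scale anchor sizes like those above are typically obtained by clustering the training-set bounding-box dimensions with k-means under a 1 − IoU distance, the procedure introduced for YOLOv2/v3. A minimal pure-Python sketch of that clustering (the function names and the example box list are illustrative, not taken from the paper):

```python
import random

def iou_wh(a, b):
    """IoU of two (w, h) boxes assumed aligned at the same corner."""
    inter = min(a[0], b[0]) * min(a[1], b[1])
    return inter / (a[0] * a[1] + b[0] * b[1] - inter)

def kmeans_anchors(boxes, k, iters=100, seed=0):
    """Cluster (w, h) pairs using 1 - IoU as the distance metric."""
    rng = random.Random(seed)
    clusters = rng.sample(boxes, k)          # initial centroids from the data
    for _ in range(iters):
        # assign each box to the centroid it overlaps most
        groups = [[] for _ in range(k)]
        for b in boxes:
            i = max(range(k), key=lambda j: iou_wh(b, clusters[j]))
            groups[i].append(b)
        # recompute centroids as per-group means (keep empty clusters as-is)
        new = [
            (sum(b[0] for b in g) / len(g), sum(b[1] for b in g) / len(g))
            if g else clusters[i]
            for i, g in enumerate(groups)
        ]
        if new == clusters:                  # converged
            break
        clusters = new
    return sorted(clusters, key=lambda c: c[0] * c[1])  # small to large
```

Sorting by area lets the smaller anchors be assigned to the finer (26 × 26) grid and the larger ones to the coarser (13 × 13) grid.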
| | mIoU | FPS |
|---|---|---|
| Yolov3 | 0.791 | 11 |
| Improved Yolov3 | 0.779 | 14 |
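The mIoU column is the mean intersection-over-union between predicted and ground-truth bounding boxes. A minimal sketch of the metric, assuming one-to-one matched box pairs in (x1, y1, x2, y2) form (function names are illustrative, not from the paper):

```python
def box_iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))  # intersection width
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))  # intersection height
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def mean_iou(preds, gts):
    """Mean IoU over matched prediction/ground-truth pairs."""
    return sum(box_iou(p, g) for p, g in zip(preds, gts)) / len(preds)
```

On this comparison, the improved network trades a small drop in mIoU (0.791 to 0.779) for a higher frame rate (11 to 14 FPS).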
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Qiu, Z.; Zhao, N.; Zhou, L.; Wang, M.; Yang, L.; Fang, H.; He, Y.; Liu, Y. Vision-Based Moving Obstacle Detection and Tracking in Paddy Field Using Improved Yolov3 and Deep SORT. Sensors 2020, 20, 4082. https://doi.org/10.3390/s20154082