Development of a combined harvester navigation control system based on visual simultaneous localization and mapping-inertial guidance fusion

Published: 8 May 2024

Abstract

At present, existing unmanned systems for combine harvesters mostly adopt satellite navigation schemes and lack real-time observation for harvesting adjustment. To improve the operational efficiency of assisted navigation for combine harvesters, this paper designs a combine harvester navigation control system based on visual simultaneous localization and mapping (SLAM)-inertial guidance fusion. First, the system acquires field image information through a binocular camera and extracts the crop boundary line as the navigation datum. Second, camera and inertial guidance information are fused to obtain the real-time relative position of the combine harvester. Third, constrained optimization of the image and inertial guidance information is achieved through a sliding window method based on tightly coupled nonlinear optimization. Finally, the position of the combine harvester relative to the navigation datum line is obtained, and a signal is output to the steering mechanism to realize intelligent positioning and navigation control of the combine harvester in the field. The system consists of a binocular camera, an inertial measurement unit, a motorized steering wheel, a monitor display, an angle sensor, and a microcontroller. During field testing, the system underwent repetitive harvesting trials over a distance of 25 m. The test machine performed field operations at a speed of 0.9-1.5 m/s, with an average lateral deviation of 2.21-8.62 cm, a standard deviation of 0.13-4.21 cm, and an average cutting rate of 92.2%-96.0%, achieving the expected harvesting effect.
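
The final step described in the abstract amounts to keeping the harvester's lateral offset from the crop boundary line near zero. The abstract does not specify the control law, so the following is only a minimal illustrative sketch in Python: it assumes a fused (x, y, yaw) pose from the visual-inertial estimator and a crop boundary line given as a point plus a unit direction vector, computes the signed lateral deviation and heading error, and maps them to a steering-wheel angle using hypothetical proportional gains.

```python
import numpy as np

def lateral_deviation(pose_xy, heading, line_point, line_dir):
    """Signed lateral offset (m) and heading error (rad) of the harvester
    relative to the crop boundary line used as the navigation datum.

    pose_xy    : fused (x, y) position from the visual-inertial estimator
    heading    : fused yaw angle of the harvester (rad)
    line_point : any point on the crop boundary line, same frame as pose_xy
    line_dir   : unit direction vector of the boundary line
    """
    d = np.asarray(pose_xy, dtype=float) - np.asarray(line_point, dtype=float)
    # z-component of the 2D cross product: signed perpendicular distance to the line
    offset = line_dir[0] * d[1] - line_dir[1] * d[0]
    heading_err = heading - np.arctan2(line_dir[1], line_dir[0])
    # wrap the heading error to (-pi, pi]
    heading_err = (heading_err + np.pi) % (2.0 * np.pi) - np.pi
    return offset, heading_err

def steering_command(offset, heading_err, k_d=0.8, k_psi=1.2,
                     max_angle=np.deg2rad(30)):
    """Proportional steering-wheel angle command (rad); gains and limit are
    placeholders, not values from the paper."""
    angle = -(k_d * offset + k_psi * heading_err)
    return float(np.clip(angle, -max_angle, max_angle))
```

In the actual system these quantities would presumably come from the tightly coupled sliding-window optimizer and the extracted boundary line; the gains, limits, and function names above are illustrative assumptions only.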



How to Cite

Zhu, F. (2024) “Development of a combined harvester navigation control system based on visual simultaneous localization and mapping-inertial guidance fusion”, Journal of Agricultural Engineering, 55(3). doi: 10.4081/jae.2024.1583.
