An indoor positioning method for unmanned aerial vehicles (UAVs) based on an improved multi-state constraint Kalman filter (MSCKF) was proposed to address the drift to which indoor UAV positioning is prone. A detection method with high robustness and low latency was proposed within the MSCKF framework. The UAV pose was computed from the known positions of the marker points in the world coordinate system, and the fusion of inertial measurement unit (IMU) data with monocular vision data and the correction of the UAV pose were then realized. The proposed positioning method was tested, and the simulation results show that its positioning error was within 0.266 m and its positioning accuracy was improved by more than 54.6% compared with OpenVINS and LARVIO.
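The abstract describes a step in which the UAV pose is computed from marker points whose world coordinates are known, before that pose is fused with IMU data in the MSCKF. The sketch below illustrates one common way such a pose-from-markers step can be done, assuming OpenCV's solvePnP; the marker coordinates, pixel detections, and camera intrinsics are illustrative values, not data from the paper, and this is not the authors' implementation.

```python
# Minimal sketch (not the paper's implementation): recover the UAV camera pose
# from marker points whose world coordinates are known.
import numpy as np
import cv2

# Known 3D positions of the marker points in the world frame (assumed values).
world_points = np.array([
    [0.0, 0.0, 0.0],
    [0.5, 0.0, 0.0],
    [0.5, 0.5, 0.0],
    [0.0, 0.5, 0.0],
], dtype=np.float64)

# Corresponding pixel detections from the monocular image (assumed values).
image_points = np.array([
    [320.4, 241.2],
    [402.7, 239.8],
    [404.1, 321.5],
    [321.0, 323.3],
], dtype=np.float64)

# Pinhole intrinsics of the monocular camera (assumed calibration).
K = np.array([[458.0, 0.0, 320.0],
              [0.0, 458.0, 240.0],
              [0.0, 0.0, 1.0]], dtype=np.float64)
dist = np.zeros(5)  # assume an undistorted image

# Solve the PnP problem: world frame -> camera frame transform.
ok, rvec, tvec = cv2.solvePnP(world_points, image_points, K, dist,
                              flags=cv2.SOLVEPNP_ITERATIVE)
assert ok

R_cw, _ = cv2.Rodrigues(rvec)      # rotation: world frame -> camera frame
R_wc = R_cw.T                      # camera (UAV) orientation in the world frame
p_wc = -R_wc @ tvec.reshape(3)     # camera (UAV) position in the world frame

print("UAV position in world frame:", p_wc)
# In an MSCKF-style filter this absolute pose would typically enter as a
# measurement that corrects the IMU-propagated state, rather than being used directly.
```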
Si-peng WANG, Chang-ping DU, Guang-hua SONG, Yao ZHENG. Indoor positioning method of UAV based on improved MSCKF algorithm. Journal of Zhejiang University (Engineering Science), 2022, 56(4): 711-717.
Fig. 3 Performance test environment of positioning algorithm
Fig. 4 Positioning results of trajectory 1 and trajectory 2
Fig. 5 Error comparison of trajectory 1
Algorithm              Error/m (trajectory 1)    Error/m (trajectory 2)
Proposed algorithm     0.226                     0.266
OpenVINS               0.498                     0.644
LARVIO                 0.575                     —
Tab. 1 Positioning error of three algorithms for two trajectories
Fig. 6 Error comparison of trajectory 2
Algorithm                 td/ms
Proposed algorithm        7.92
OpenVINS                  2.67
Algorithm of Ref. [13]    >22.22
Tab. 2 Real-time analysis of positioning algorithms
[1] USENKO V, ENGEL J, STÜCKLER J, et al. Direct visual-inertial odometry with stereo cameras [C]// 2016 IEEE International Conference on Robotics and Automation. Stockholm: IEEE, 2016: 1885-1892.
[2] MUR-ARTAL R, TARDÓS J D. Visual-inertial monocular SLAM with map reuse [J]. IEEE Robotics and Automation Letters, 2017, 2(2): 796-803. doi: 10.1109/LRA.2017.2653359
[3] MUR-ARTAL R, MONTIEL J M M, TARDÓS J D. ORB-SLAM: a versatile and accurate monocular SLAM system [J]. IEEE Transactions on Robotics, 2015, 31(5): 1147-1163. doi: 10.1109/TRO.2015.2463671
[4] MUR-ARTAL R, TARDÓS J D. ORB-SLAM2: an open-source SLAM system for monocular, stereo, and RGB-D cameras [J]. IEEE Transactions on Robotics, 2017, 33(5): 1255-1262. doi: 10.1109/TRO.2017.2705103
[5] CASTELLANOS J A, NEIRA J, TARDÓS J D. Limits to the consistency of EKF-based SLAM [J]. IFAC Proceedings Volumes, 2004, 37(8): 716-721. doi: 10.1016/S1474-6670(17)32063-3
[6] MOURIKIS A I, ROUMELIOTIS S I. A multi-state constraint Kalman filter for vision-aided inertial navigation [C]// Proceedings of 2007 IEEE International Conference on Robotics and Automation. Roma: IEEE, 2007: 3565-3572.
[7] SUN K, MOHTA K, PFROMMER B, et al. Robust stereo visual inertial odometry for fast autonomous flight [J]. IEEE Robotics and Automation Letters, 2018, 3(2): 965-972. doi: 10.1109/LRA.2018.2793349
[8] GENEVA P, ECKENHOFF K, LEE W, et al. OpenVINS: a research platform for visual-inertial estimation [C]// 2020 IEEE International Conference on Robotics and Automation. Paris: IEEE, 2020: 4666-4672.
[9] QIU X, ZHANG H, FU W. Lightweight hybrid visual-inertial odometry with closed-form zero velocity update [J]. Chinese Journal of Aeronautics, 2020, 33(12): 3344-3359. doi: 10.1016/j.cja.2020.03.008
[10] HUAI Z, HUANG G. Robocentric visual-inertial odometry [C]// 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems. Madrid: IEEE, 2018: 6319-6326.
[11] MA F, SHI J, YANG Y, et al. ACK-MSCKF: tightly-coupled Ackermann multi-state constraint Kalman filter for autonomous vehicle localization [J]. Sensors, 2019, 19(21): 4816. doi: 10.3390/s19214816
[12] ZHENG F, TSAI G, ZHANG Z, et al. Trifo-VIO: robust and efficient stereo visual inertial odometry using points and lines [C]// 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems. Madrid: IEEE, 2018: 3686-3693.
[13] BAVLE H, MANTHE S, DE LA PUENTE P, et al. Stereo visual odometry and semantics based localization of aerial robots in indoor environments [C]// 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems. Madrid: IEEE, 2018: 1018-1023.
[14] LEVINSON J, MONTEMERLO M, THRUN S. Map-based precision vehicle localization in urban environments [C]// Robotics: Science and Systems. Atlanta: [s. n.], 2007: 1.
[15] LEVINSON J, THRUN S. Robust vehicle localization in urban environments using probabilistic maps [C]// 2010 IEEE International Conference on Robotics and Automation. Anchorage: IEEE, 2010: 4372-4378.
[16] QIN T, LI P, SHEN S. VINS-Mono: a robust and versatile monocular visual-inertial state estimator [J]. IEEE Transactions on Robotics, 2018, 34(4): 1004-1020. doi: 10.1109/TRO.2018.2853729
[17] QIN T, SHEN S. Online temporal calibration for monocular visual-inertial systems [C]// 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems. Madrid: IEEE, 2018: 3662-3669.
[18] XU Xiao-su, DAI Wei, YANG Bo, et al. Visual-inertial SLAM method based on graph optimization in indoor environments [J]. Journal of Chinese Inertial Technology, 2017, 25(3): 313-319. (in Chinese)
[19] HARRIS C G, STEPHENS M. A combined corner and edge detector [C]// Alvey Vision Conference. Manchester: [s. n.], 1988: 147-151.