J4  2012, Vol. 46 Issue (6): 1021-1026    DOI: 10.3785/j.issn.1008-973X.2012.06.010
    
Decoupled mobile robot motion estimation based on fusion of
visual and inertial measurement unit
LU Dan-hui1, ZHOU Wen-hui2, GONG Xiao-jin1, LIU Ji-lin1
1. Department of Information Science and Electronic Engineering, Zhejiang University, Hangzhou 310027, China;
2. College of Computer Science, Hangzhou Dianzi University, Hangzhou 310018, China

Abstract  

To address the attitude estimation deviation caused by accumulated error in visual odometry (VO), a real-time adaptive extended Kalman filter (EKF) model is presented that decouples attitude estimation in yaw, pitch and roll. An inertial measurement unit (IMU), using the direction of gravity as a vertical reference, corrects the accumulated attitude error. Fuzzy logic adjusts the filter parameters according to the state of motion, realizing adaptive filter estimation. Compared with ground truth from a total station, experimental results show that the algorithm can track motion over various kinds of terrain with an error of less than 0.3 m over a distance of 50 m, effectively making VO more accurate and robust.
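The full EKF formulation is not given in the abstract; the sketch below illustrates only the decoupling idea it describes, with simple per-axis scalar gains standing in for the EKF covariance update and a triangular membership function standing in for the paper's fuzzy logic. All function names and gain values are illustrative assumptions, not the authors' implementation.

```python
import math

G = 9.81  # gravity magnitude, m/s^2

def fuzzy_gain(acc_norm, base_gain=0.1):
    """Down-weight the accelerometer correction when the measured
    acceleration deviates from gravity, i.e. the robot is accelerating.
    A triangular membership stands in for the paper's fuzzy logic."""
    deviation = abs(acc_norm - G) / G
    membership = max(0.0, 1.0 - 5.0 * deviation)  # 1 near rest, 0 when dynamic
    return base_gain * membership

def update_attitude(pitch, roll, gyro, acc, dt):
    """One decoupled filter step: gyro rates propagate pitch and roll
    independently, then the gravity direction measured by the
    accelerometer corrects the accumulated drift. Yaw is left to the
    gyro/VO, since the gravity vector carries no yaw information."""
    # Prediction: integrate gyro rates per axis.
    pitch += gyro[1] * dt
    roll  += gyro[0] * dt
    # Gravity-referenced attitude from the accelerometer.
    ax, ay, az = acc
    acc_norm = math.sqrt(ax * ax + ay * ay + az * az)
    pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll_acc  = math.atan2(ay, az)
    # Adaptive update: correction weakened during dynamic motion.
    k = fuzzy_gain(acc_norm)
    pitch += k * (pitch_acc - pitch)
    roll  += k * (roll_acc - roll)
    return pitch, roll
```

Because each axis is corrected by its own scalar gain, a drifted pitch or roll estimate converges back to the gravity reference at rest, while the fuzzy gain suppresses the correction whenever the accelerometer reading departs from 1 g.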



Published: 24 July 2012
CLC:  TP 242.62  
Cite this article:

LU Dan-hui, ZHOU Wen-hui, GONG Xiao-jin, LIU Ji-lin. Decoupled mobile robot motion estimation based on fusion of
visual and inertial measurement unit. J4, 2012, 46(6): 1021-1026.

URL:

http://www.zjujournals.com/eng/10.3785/j.issn.1008-973X.2012.06.010     OR     http://www.zjujournals.com/eng/Y2012/V46/I6/1021


Decoupled motion estimation for a mobile robot based on fusion of vision and IMU

To address the deviation in motion attitude estimation of visual odometry (VO) caused by accumulated error, a real-time extended Kalman filter attitude estimation model is proposed. An inertial measurement unit (IMU), using the direction of gravitational acceleration as a vertical reference, decouples the VO attitude estimation in the yaw, pitch and roll directions and corrects the accumulated attitude error. Fuzzy logic adjusts the filter parameters according to the motion state, realizing adaptive filter estimation and reducing the influence of acceleration noise. Experiments used a high-precision total station as ground truth over various terrains; the results show that the accumulated tracking error within 50 m is below 0.3 m, effectively improving the localization accuracy and robustness of visual odometry.

