J4  2012, Vol. 46 Issue (6): 1021-1026    DOI: 10.3785/j.issn.1008-973X.2012.06.010
Radio Electronics and Telecommunication Technology
Decoupled mobile robot motion estimation based on fusion of visual and inertial measurement unit
LU Dan-hui1, ZHOU Wen-hui2, GONG Xiao-jin1, LIU Ji-lin1
1. Department of Information Science and Electronic Engineering, Zhejiang University, Hangzhou 310027, China;
2. College of Computer Science, Hangzhou Dianzi University, Hangzhou 310018, China

Abstract:

To address the attitude estimation drift caused by accumulated error in visual odometry (VO), a real-time adaptive extended Kalman filter (EKF) attitude estimation model was proposed. An inertial measurement unit (IMU), with the direction of gravity as a vertical reference, was used to decouple the attitude estimates in yaw, pitch and roll and to correct the accumulated attitude error. Fuzzy logic was applied to adjust the filter parameters according to the motion state, realizing adaptive filtering and reducing the influence of acceleration noise. Experiments used a high-precision total station as ground truth over various terrain types. The results show that the accumulated error over 50 m of tracking stays below 0.3 m, effectively improving the localization accuracy and robustness of visual odometry.
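The core idea of the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the function names, the 0.1 correction gain, and the linear "trust weight" (a crude stand-in for the paper's fuzzy-logic tuning) are all assumptions for illustration. It shows why gravity decouples the estimate: the accelerometer constrains pitch and roll but carries no heading information, so yaw is left untouched.

```python
import math

G = 9.81  # gravity magnitude, m/s^2

def accel_pitch_roll(ax, ay, az):
    """Pitch and roll implied by the measured gravity direction (rad)."""
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    return pitch, roll

def trust_weight(ax, ay, az, width=2.0):
    """Crude stand-in for the paper's fuzzy tuning: trust the
    accelerometer only when |a| is close to g (little linear motion)."""
    dev = abs(math.sqrt(ax * ax + ay * ay + az * az) - G)
    return max(0.0, 1.0 - dev / width)

def correct_attitude(pitch_vo, roll_vo, ax, ay, az):
    """Blend the VO/gyro attitude toward the gravity reference.
    Yaw is not corrected here: gravity gives no heading information,
    which is why pitch/roll decouple from yaw in this scheme."""
    p_ref, r_ref = accel_pitch_roll(ax, ay, az)
    w = 0.1 * trust_weight(ax, ay, az)  # small, motion-dependent gain
    pitch = (1 - w) * pitch_vo + w * p_ref
    roll = (1 - w) * roll_vo + w * r_ref
    return pitch, roll
```

In the paper this correction is carried out inside an EKF with fuzzy-tuned noise parameters rather than a fixed blending gain; the sketch only conveys the decoupling and the motion-dependent trust in the accelerometer.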

Published: 2012-07-24
CLC number: TP 242.62
Foundation item:

Supported by the Major Program of the National Natural Science Foundation of China (6053407) and the National Natural Science Foundation of China (60902077).

Corresponding author: GONG Xiao-jin, female, lecturer. E-mail: gongxj@zju.edu.cn
About the author: LU Dan-hui (b. 1986), male, master's student, whose research covers computer vision and robot navigation. E-mail: danhui.lu@hotmail.com

Cite this article:


LU Dan-hui, ZHOU Wen-hui, GONG Xiao-jin, LIU Ji-lin. Decoupled mobile robot motion estimation based on fusion of visual and inertial measurement unit. J4, 2012, 46(6): 1021-1026.

Link to this article:

http://www.zjujournals.com/eng/CN/10.3785/j.issn.1008-973X.2012.06.010
http://www.zjujournals.com/eng/CN/Y2012/V46/I6/1021

