Journal of Zhejiang University (Engineering Science)
Computer Technology
Mobile robot human tracking using hierarchical features
JIA Songmin, LU Yingbin, WANG Lijia, LI Xiuzhi, XU Tao
College of Electronic Information and Control Engineering, Beijing University of Technology, Beijing 100124, China;
Beijing Key Laboratory of Computational Intelligence and Intelligent System, Beijing 100124, China;
Engineering Research Center of Digital Community, Ministry of Education, Beijing 100124, China
Abstract:

First, coarse outer-layer localization of the pedestrian target was obtained from head-shoulder or color features. Then, improved scale-invariant feature transform (SIFT) feature matching was applied within the coarse localization region to locate the target precisely. The target size was determined from the matched SIFT features, which solved the problem of pedestrian scale change. A feature-retention priority was introduced into the updating mechanism of the SIFT feature template library to handle temporary occlusion and deformation of the target. To make the elliptic kernel of the traditional Cam-Shift algorithm adaptive, the SIFT scale variation was fused with the Epanechnikov function to construct an adaptive-bandwidth kernel, which suppressed interference from the background. Furthermore, the outer-layer coarse localization result limited the detection range of the Harris operator, which improved the real-time performance of SIFT feature matching. Experimental results show that the proposed mobile robot pedestrian tracking algorithm can track pedestrians under target scale variation, temporary occlusion and deformation.
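To make the pipeline concrete, the sketch below illustrates two of the steps described above under assumptions of my own: Harris corners are detected only inside the coarse outer-layer window before SIFT description and matching, and the Epanechnikov kernel bandwidth is rescaled by the scale change estimated from matched SIFT keypoints. It uses OpenCV and NumPy; the function names and parameters are hypothetical, template_desc is assumed to be SIFT descriptors precomputed from the target template, and Lowe's ratio test merely stands in for the paper's improved matching rule, so treat it as an interpretation rather than the authors' implementation.

```python
import cv2
import numpy as np

def match_sift_in_window(gray, window, template_desc, max_corners=150, ratio=0.75):
    """Detect Harris corners only inside the coarse-localization window,
    describe them with SIFT and match against the target template descriptors."""
    x, y, w, h = window
    corners = cv2.goodFeaturesToTrack(gray[y:y + h, x:x + w],
                                      maxCorners=max_corners, qualityLevel=0.01,
                                      minDistance=5, useHarrisDetector=True, k=0.04)
    if corners is None:
        return [], []
    # Shift corner coordinates back to the full image before describing them.
    keypoints = [cv2.KeyPoint(float(cx) + x, float(cy) + y, 7.0)
                 for cx, cy in corners.reshape(-1, 2)]
    keypoints, desc = cv2.SIFT_create().compute(gray, keypoints)
    if desc is None:
        return [], []
    # Lowe's ratio test stands in for the paper's improved matching rule.
    knn = cv2.BFMatcher(cv2.NORM_L2).knnMatch(desc, template_desc, k=2)
    good = [p[0] for p in knn if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return good, keypoints

def epanechnikov_kernel(h, w):
    """Epanechnikov profile max(0, 1 - r^2) over an h x w window, with pixel
    offsets normalised by the window half-sizes (the kernel bandwidth)."""
    ys = np.linspace(-1.0, 1.0, h)[:, None]
    xs = np.linspace(-1.0, 1.0, w)[None, :]
    return np.maximum(0.0, 1.0 - (ys ** 2 + xs ** 2))

def adapt_bandwidth(win_h, win_w, template_sizes, matched_sizes, min_size=16):
    """Rescale the tracking window (and hence the kernel bandwidth) by the median
    size ratio of matched SIFT keypoints, current frame over template."""
    t = np.asarray(template_sizes, dtype=float)
    c = np.asarray(matched_sizes, dtype=float)
    scale = float(np.median(c / t)) if t.size else 1.0
    return (max(min_size, int(round(win_h * scale))),
            max(min_size, int(round(win_w * scale))))

# Example: matched keypoints shrank by roughly 20 %, so the kernel support shrinks too.
new_h, new_w = adapt_bandwidth(64, 48, [12.0, 10.5, 9.8], [9.6, 8.4, 7.8])
weights = epanechnikov_kernel(new_h, new_w)   # per-pixel weights for the weighted histogram
```

In a full tracker the resulting weights would re-weight the color histogram inside the coarse window before each Cam-Shift iteration, so that the kernel support follows the target as it approaches or recedes from the robot.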

Published: 2016-09-22
CLC number: TP 242.6
Foundation items:

Supported by the National Natural Science Foundation of China (No. 61175087) and the intelligent robot "big scientific research" promotion program of Beijing University of Technology.

About the author: JIA Songmin (1964-), female, professor, Ph.D., working on decentralized robot control and vision. ORCID: 0000-0002-6682-8797. E-mail: jsm@bjut.edu.cn

Cite this article:

JIA Songmin, LU Yingbin, WANG Lijia, LI Xiuzhi, XU Tao. Mobile robot human tracking using hierarchical features [J]. Journal of Zhejiang University (Engineering Science), 2016, 50(9). DOI: 10.3785/j.issn.1008-973X.2016.09.06.

Link to this article:

http://www.zjujournals.com/eng/CN/10.3785/j.issn.1008-973X.2016.09.06
http://www.zjujournals.com/eng/CN/Y2016/V50/I9/1677

