To address the problem of target disappearance during mobile robot target following, a robot target following system based on visual tracking and autonomous navigation is proposed. The following problem was divided into two cases: regular following when the target was within the robot's field of view, and autonomous navigation after the target disappeared. In the former case, the target's motion state was predicted with a Kalman filter, appearance features were extracted with a pedestrian re-identification network, and the target was tracked by fusing the motion information and appearance features through data association; servo control was then applied to follow the target. In the latter case, an autonomous navigation algorithm based on the historical relative position between the target and the robot was adopted: the robot moved to the target's last known position and searched for it, so as to increase the success rate of target following. Evaluations were conducted on the OTB100 benchmark dataset and on a target following test dataset collected in robot application scenarios, and experiments were performed on a mobile robot platform. The results showed that the robot could follow the target in environments with different lighting conditions and multiple background pedestrians while meeting the real-time requirement, which verifies the robustness and effectiveness of the proposed algorithm. The research results can serve as a reference for the problem of robot re-following after the target disappears.
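The abstract does not give the exact fusion formula, but tracking-by-detection systems of this kind typically combine a motion distance (to the Kalman-predicted position) and an appearance distance (between re-identification features) into a single weighted association cost. The following is a minimal sketch under that assumption; the function names and dictionary keys are illustrative, and the motion weight 0.005 and association threshold 0.175 are the values listed in Table 1:

```python
import math

def fused_cost(motion_dist, appearance_dist, motion_weight=0.005):
    """Weighted combination of motion and appearance distances.

    cost = w * motion_dist + (1 - w) * appearance_dist
    """
    return motion_weight * motion_dist + (1.0 - motion_weight) * appearance_dist

def cosine_distance(a, b):
    """Appearance distance between two re-identification feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (na * nb)

def associate(tracks, detections, motion_weight=0.005, threshold=0.175):
    """Greedy data association: each track takes the lowest-cost unused
    detection whose fused cost falls below the association threshold."""
    matches, used = [], set()
    for ti, track in enumerate(tracks):
        best, best_cost = None, threshold
        for di, det in enumerate(detections):
            if di in used:
                continue
            md = math.dist(track["pred"], det["pos"])        # distance to Kalman prediction
            ad = cosine_distance(track["feat"], det["feat"])  # re-ID feature distance
            c = fused_cost(md, ad, motion_weight)
            if c < best_cost:
                best, best_cost = di, c
        if best is not None:
            matches.append((ti, best))
            used.add(best)
    return matches
```

With the small motion weight from Table 1, the appearance term dominates the cost, so a detection with a mismatched re-ID feature is rejected even when it lies close to the predicted position.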
Rui ZHANG, Wanyue JIANG. Mobile robot target following system based on visual tracking and autonomous navigation. Chinese Journal of Engineering Design, 2023, 30(6): 687-696.
Fig.2 Framework of mobile robot target following system
Fig.3 Target tracking process
Fig.4 Structure of appearance feature extraction network
Fig.5 Schematic of distance and angle of target measured by depth camera
Fig.6 Transformation between the robot coordinate system and the world coordinate system
Fig.7 Target positioning deviation after robot offset
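Figures 5 to 7 concern measuring the target's distance and bearing with the depth camera and transforming that measurement from the robot frame into the world frame, which is what allows the robot to navigate back to the target's last known position. A minimal sketch of the standard 2-D rigid transform is given below; the function name and argument layout are illustrative, not taken from the paper:

```python
import math

def target_world_position(robot_x, robot_y, robot_theta, dist, angle):
    """Transform a target measured in the robot frame (distance `dist`,
    bearing `angle` relative to the robot heading) into world coordinates,
    given the robot pose (robot_x, robot_y, robot_theta)."""
    # Target position in the robot frame (x forward, y left)
    xr = dist * math.cos(angle)
    yr = dist * math.sin(angle)
    # Rotate by the robot heading, then translate by the robot position
    xw = robot_x + xr * math.cos(robot_theta) - yr * math.sin(robot_theta)
    yw = robot_y + xr * math.sin(robot_theta) + yr * math.cos(robot_theta)
    return xw, yw
```

As Fig.7 suggests, any error in the robot's own pose estimate propagates directly into this world-frame target position, which is why the stored historical position can deviate after the robot moves.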
Table 1  Parameter values for the algorithm in this paper

Parameter set No.  Motion information weight  Association threshold  Appearance feature library update threshold
1                  0.005                      0.175                  0.15
2                  0.005                      0.165                  0.12
Fig.8 Tracking performance of different algorithms in OTB100 dataset
Fig.9 Tracking effect of the proposed algorithm on the target in the data stream
Fig.10 Tracking performance of different algorithms in the collected data stream
Fig.11 Predefined target walking path
Fig.12 Following effect from robot camera's perspective
Fig.13 Mobile robot following effect
Fig.14 Robot looking for the disappearing target