Journal of ZheJiang University (Engineering Science)  2019, Vol. 53 Issue (8): 1488-1495    DOI: 10.3785/j.issn.1008-973X.2019.08.007
Computer and Control Engineering     
Real-time tracking algorithm based on multiple Gaussian-distribution correlation filters
Chang-zhen XIONG1(),Run-ling WANG2,Jian-cheng ZOU2
1. Beijing Key Laboratory of Urban Road Traffic Intelligent Control Technology, Beijing 100144, China
2. School of Sciences, North China University of Technology, Beijing 100144, China

Abstract  

To address the limited real-time performance of the hierarchical convolutional features for visual tracking algorithm and the poor adaptability of a single classifier to changes in target appearance, a real-time visual tracking algorithm based on multiple Gaussian-distribution correlation filters was proposed. Multi-channel convolutional features were extracted from the Pool4 and Conv5-3 layers of the VGG-19 network, and sparse sampling was used to reduce the number of convolution channels and thereby speed up tracking. To prevent the loss of accuracy caused by the reduced features, multiple correlation filters were trained on samples labelled with different Gaussian distributions, and the target positions predicted by all filters were fused with adaptive weights, improving robustness to changes in target posture. A sparse model update strategy was applied to further increase speed and achieve real-time performance. Experimental results on the OTB100 benchmark dataset showed that the proposed algorithm achieved an average distance precision of 86.6%, which was 3.5% higher than that of the original hierarchical convolutional features tracking method, and that it was more robust in complex situations such as occlusion, deformation and similar-background interference. The average tracking speed was 43.7 frames per second, giving better real-time performance.
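The following Python sketch illustrates, under simplifying assumptions, the multi-filter idea described above: several correlation filters are trained on the same feature map with Gaussian regression targets of different bandwidths, and their response maps are fused with adaptive weights. Single-channel features and linear (MOSSE-style) ridge-regression filters are assumed for brevity, and the fusion weights are taken to be proportional to each filter's peak response; the paper's actual tracker uses kernelized filters on sparsely sampled VGG-19 Pool4/Conv5-3 channels, and its exact weighting and sparse-update schemes are not specified in the abstract.

```python
# Minimal sketch: multiple correlation filters with different Gaussian labels,
# fused by adaptive (peak-response) weights. Assumptions are noted above.
import numpy as np

def gaussian_labels(h, w, sigma):
    """2-D Gaussian regression target centred on the patch."""
    ys, xs = np.mgrid[0:h, 0:w]
    dist2 = (ys - h // 2) ** 2 + (xs - w // 2) ** 2
    return np.exp(-0.5 * dist2 / sigma ** 2)

def train_filter(feat, labels, lam=1e-2):
    """Closed-form ridge regression (MOSSE-style) in the Fourier domain."""
    F = np.fft.fft2(feat)
    Y = np.fft.fft2(np.fft.ifftshift(labels))   # put the Gaussian peak at (0, 0)
    return Y * np.conj(F) / (F * np.conj(F) + lam)

def respond(filt, feat):
    """Correlation response map of one filter on a search-region feature map."""
    return np.real(np.fft.ifft2(filt * np.fft.fft2(feat)))

def fuse_responses(responses):
    """Adaptive fusion: weight each response map by its peak value (assumed scheme)."""
    peaks = np.array([r.max() for r in responses])
    weights = peaks / peaks.sum()
    return sum(w * r for w, r in zip(weights, responses)), weights

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    h, w = 64, 64
    feat = rng.standard_normal((h, w))       # stand-in for one convolutional channel
    sigmas = (2.0, 4.0, 8.0)                 # the "different Gaussian distributions"
    filters = [train_filter(feat, gaussian_labels(h, w, s)) for s in sigmas]
    responses = [respond(f, feat) for f in filters]
    fused, weights = fuse_responses(responses)
    dy, dx = np.unravel_index(fused.argmax(), fused.shape)
    print("fusion weights:", np.round(weights, 3), "predicted shift:", (dy, dx))
```

Run on the training patch itself, the fused response peaks at a shift of (0, 0), i.e. the current target position, as expected for correlation filters evaluated on their own training sample.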



Key words: visual tracking; convolutional feature; correlation filter; Gaussian distribution; adaptive fusion
Received: 19 July 2018      Published: 13 August 2019
CLC:  TP 391  
Cite this article:

Chang-zhen XIONG,Run-ling WANG,Jian-cheng ZOU. Real-time tracking algorithm based on multiple Gaussian-distribution correlation filters. Journal of ZheJiang University (Engineering Science), 2019, 53(8): 1488-1495.

URL:

http://www.zjujournals.com/eng/10.3785/j.issn.1008-973X.2019.08.007     OR     http://www.zjujournals.com/eng/Y2019/V53/I8/1488


Fig.1 Framework of multiple Gaussian-distribution correlation filter algorithm
Fig.2 Response maps of convolutional features with different dimensions
Fig.3 Feature response maps with different Gaussian distributions for different video images
Dataset | Metric | Ours | CF2 | MSDAT | HDT | DeepSRDCF | KCF | SRDCF | SAMF | DSST | Staple
OTB2013 | Average DP/% | 89.5 | 89.1 | 86.3 | 88.9 | 84.9 | 74.0 | 83.8 | 78.5 | 74.0 | 79.3
OTB2013 | Average OP/% | 83.8 | 74.0 | 74.1 | 73.7 | 79.5 | 62.3 | 78.1 | 73.2 | 67.0 | 75.4
OTB2013 | Average CLE/pixel | 12.7 | 15.7 | 14.6 | 15.9 | 25.7 | 35.5 | 35.2 | 30.1 | 41.2 | 30.6
OTB2013 | Average speed/(frame·s⁻¹) | 46.2 | 11.0 | 23.7 | 6.3 | 0.2 | 273.0 | 3.6 | 18.6 | 26.0 | 45.0
OTB100 | Average DP/% | 86.6 | 83.7 | 82.1 | 84.8 | 85.1 | 69.6 | 78.9 | 75.1 | 68.0 | 78.4
OTB100 | Average OP/% | 78.0 | 65.5 | 65.5 | 65.7 | 77.3 | 55.1 | 72.8 | 67.4 | 60.1 | 70.9
OTB100 | Average CLE/pixel | 15.3 | 22.8 | 20.5 | 20.1 | 21.4 | 45.0 | 38.6 | 36.5 | 50.4 | 31.5
OTB100 | Average speed/(frame·s⁻¹) | 43.7 | 10.4 | 23.5 | 5.5 | 0.2 | 266.0 | 3.5 | 17.0 | 22.0 | 42.9
Note: Ours, CF2, MSDAT, HDT and DeepSRDCF are deep-feature trackers; KCF, SRDCF, SAMF, DSST and Staple are correlation-filter trackers. The speeds of the nine compared algorithms are those reported in their original papers.
Tab.1 Comparisons of average DP, OP, CLE and speed for different algorithms
Fig.4 Comparison plots of precision and success for OPE
Algorithm | BC | OV | IPR | FM | MB | DEF | OCC | IV | SV | OPR | LR
Ours | 80.7 | 60.2 | 73.8 | 72.3 | 75.0 | 71.4 | 73.0 | 79.3 | 69.1 | 75.0 | 60.2
DeepSRDCF | 74.9 | 65.5 | 71.8 | 75.5 | 78.2 | 69.0 | 73.7 | 74.1 | 73.0 | 73.8 | 71.3
CF2 | 72.1 | 54.0 | 66.2 | 66.8 | 69.8 | 60.3 | 60.6 | 61.6 | 51.9 | 62.9 | 32.7
HDT | 71.3 | 54.7 | 65.7 | 66.4 | 68.9 | 61.8 | 61.1 | 60.8 | 51.4 | 62.7 | 35.4
MSDAT | 72.5 | 56.0 | 67.6 | 63.4 | 65.9 | 60.4 | 59.7 | 63.5 | 50.8 | 63.6 | 35.9
Tab.2 Comparison of success rates of different algorithms on attribute subsets
Fig.5 Comparison of precision and speed for different strategies
Fig.6 Tracking result comparison of four algorithms on several challenging video sequences
[1]   HENRIQUES J F, CASEIRO R, MARTINS P, et al. Exploiting the circulant structure of tracking-by-detection with kernels [C]// Proceedings of European Conference on Computer Vision. Heidelberg: Springer, 2012: 702-715.
[2]   HENRIQUES J F, CASEIRO R, MARTINS P, et al. High-speed tracking with kernelized correlation filters [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37(3): 583-596.
[3]   XIONG Chang-zhen, ZHAO Lu-lu, GUO Fen-hong. Kernelized correlation filters tracking based on adaptive feature fusion [J]. Journal of Computer-Aided Design and Computer Graphics, 2017, 29(6): 1068-1074. (in Chinese)
[4]   DANELLJAN M, KHAN F S, FELSBERG M, et al. Adaptive color attributes for real-time visual tracking [C]// Proceedings of CVPR 2014. Washington D C: IEEE, 2014: 1090-1097.
[5]   MA Xiao-nan, LIU Xiao-li, LI Yin-ya. Fast scale-adaptive correlation tracking [J]. Journal of Computer-Aided Design and Computer Graphics, 2017, 29(3): 450-458. (in Chinese)
doi: 10.3969/j.issn.1003-9775.2017.03.007
[6]   BERTINETTO L, VALMADRE J, GOLODETZ S, et al. Staple: complementary learners for real-time tracking [C]// Proceedings of CVPR 2016. Washington D C: IEEE, 2016: 1401-1409.
[7]   DANELLJAN M, HAGER G, KHAN F, et al. Accurate scale estimation for robust visual tracking [C]// Proceedings of British Machine Vision Conference. Nottingham: BMVA Press, 2014: 1-11.
[8]   LUKEZIC A, VOJIR T, ZAJC L C, et al. Discriminative correlation filter with channel and spatial reliability [C]// Proceedings of CVPR 2017. Washington D C: IEEE, 2017: 4847-4856.
[9]   CHEN Qian-ru, LIU Ri-sheng, FAN Xin, et al. Multi-correlation filters method for robust visual tracking [J]. Journal of Image and Graphics, 2018, 23(2): 269-276. (in Chinese)
[10]   DANELLJAN M, HAGER G, KHAN F S, et al. Learning spatially regularized correlation filters for visual tracking [C]// Proceedings of ICCV 2015. Santiago: IEEE, 2015: 4310-4318.
[11]   CAI Yu-zhu, YANG De-dong, MAO Ning, et al. Visual tracking algorithm based on adaptive convolutional features [J]. Acta Optica Sinica, 2017(3): 262-273. (in Chinese)
[12]   DANELLJAN M, ROBINSON A, KHAN F S. Beyond correlation filters: learning continuous convolution operators for visual tracking [C]// Proceedings of ECCV. Amsterdam: Springer, 2016: 472-488.
[13]   DANELLJAN M, BHAT G, KHAN F, et al. ECO: efficient convolution operators for tracking [C]// Proceedings of CVPR 2017. Washington D C: IEEE, 2017: 6931-6939.
[14]   MA C, HUANG J B, YANG X K, et al. Hierarchical convolutional features for visual tracking [C]// Proceedings of CVPR 2015. Washington D C: IEEE, 2015: 3074-3082.
[15]   WANG X Y, LI H X, LI Y, et al. Robust and real-time deep tracking via multi-scale domain adaptation [C]// Proceedings of IEEE International Conference on Multimedia and Expo. Washington D C: IEEE, 2017: 1338-1343.
[16]   SIMONYAN K, ZISSERMAN A. Very deep convolutional networks for large-scale image recognition [EB/OL]. 2014. arXiv: 1409.1556.
[17]   WANG L J, OUYANG W L, WANG X G, et al. Visual tracking with fully convolutional networks [C]// Proceedings of ICCV 2015. Santiago: IEEE, 2015: 3119-3127.
[18]   WU Y, LIM J, YANG M H. Online object tracking: a benchmark [C]// Proceedings of CVPR 2013. Portland: IEEE, 2013: 2411-2418.
[19]   WU Y, LIM J, YANG M H. Object tracking benchmark [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37(9): 1834-1848.
doi: 10.1109/TPAMI.2014.2388226
[20]   QI Y K, ZHANG S P, QIN L, et al. Hedged deep tracking [C]// Proceedings of CVPR 2016. Las Vegas: IEEE, 2016: 4303-4311.
[21]   DANELLJAN M, HAGER G, KHAN F S, et al. Convolutional features for correlation filter based visual tracking [C]// Proceedings of ICCVW. Santiago: IEEE, 2016: 621-629.