Journal of ZheJiang University (Engineering Science)  2020, Vol. 54 Issue (2): 301-310    DOI: 10.3785/j.issn.1008-973X.2020.02.011
Computer Technology, Information Engineering     
Visual tracking algorithm based on anisotropic Gaussian distribution
Chang-zhen XIONG, Yan LU, Jia-qing YAN
Beijing Key Laboratory of Urban Traffic Intelligent Control Technology, North China University of Technology, Beijing 100144, China

Abstract  

A visual tracking algorithm based on an anisotropic Gaussian distribution was proposed to improve the tracking performance of the efficient convolution operators tracker with hand-crafted features (ECOhc). An anisotropic Gaussian label function with different horizontal and vertical bandwidths was constructed according to the aspect ratio of each object, and the tracker trained with this function predicted the object position with higher accuracy. In addition, color histogram features of the object were extracted to predict a second position, and the two predicted positions were fused with weights at the decision layer, which further improved the tracking accuracy. The algorithm was evaluated on the OTB-100 and VOT2016 datasets. The average distance precision and overlap rate of the proposed algorithm on OTB-100 were 89.6% and 83.7%, which were 4.67% and 6.62% higher than those of ECOhc, respectively. The expected average overlap on VOT2016 was 33.3%, which was 3.42% higher than that of ECOhc. The proposed algorithm effectively improves tracking accuracy and remains robust under interferences such as occlusion, illumination variation and deformation.
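As a sketch of the label construction described above, the code below builds a two-dimensional Gaussian regression target whose vertical and horizontal bandwidths scale with the target's height and width. The function name, the NumPy implementation and the way the bandwidth factors from Tab.1 are applied to the box sides are illustrative assumptions, not the authors' code (the reference ECO/ECOhc implementation is MATLAB-based and may scale the bandwidths differently).

```python
import numpy as np

def anisotropic_gaussian_label(rows, cols, target_h, target_w,
                               sigma_factor_y=1/15, sigma_factor_x=1/10):
    """Build a 2-D Gaussian regression label whose vertical and horizontal
    bandwidths scale with the target's height and width, so an elongated
    target gets an elongated (anisotropic) label rather than a circular one.
    The default factors (1/15, 1/10) echo the best combination in Tab.1,
    but how the paper maps them onto the two axes is an assumption here."""
    sigma_y = sigma_factor_y * target_h   # vertical bandwidth follows height
    sigma_x = sigma_factor_x * target_w   # horizontal bandwidth follows width
    y = np.arange(rows) - rows // 2       # coordinates centred on the label map
    x = np.arange(cols) - cols // 2
    yy, xx = np.meshgrid(y, x, indexing='ij')
    # Separate variances along the two axes produce the anisotropic shape.
    return np.exp(-0.5 * ((yy / sigma_y) ** 2 + (xx / sigma_x) ** 2))
```

With equal factors and a square target box this reduces to the isotropic Gaussian label used by standard correlation-filter trackers.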



Key words: visual tracking; correlation filter; anisotropic Gaussian distribution; color histogram; weighted fusion
Received: 06 January 2019      Published: 10 March 2020
CLC:  TP 491  
Cite this article:

Chang-zhen XIONG,Yan LU,Jia-qing YAN. Visual tracking algorithm based on anisotropic Gaussian distribution. Journal of ZheJiang University (Engineering Science), 2020, 54(2): 301-310.

URL:

http://www.zjujournals.com/eng/10.3785/j.issn.1008-973X.2020.02.011     OR     http://www.zjujournals.com/eng/Y2020/V54/I2/301


Fig.1 Overall framework diagram of proposed algorithm
Fig.2 Comparison of feature distributions for two targets with different aspect ratios
Fig.3 Aspect ratios of tracking objects on dataset OTB-100
Fig.4 Average distance precision under different position fusion factors
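Fig.4 sweeps a position fusion factor, which suggests the decision-layer fusion can be read as a convex combination of the two predicted centres. The sketch below only illustrates that reading: the function name, the coordinate-level combination and the placeholder value of gamma are assumptions, not the authors' implementation.

```python
def fuse_positions(pos_cf, pos_hist, gamma=0.3):
    """Decision-level weighted fusion of two predicted centres (row, col):
    pos_cf from the correlation-filter response and pos_hist from the
    colour-histogram response. gamma plays the role of the position
    fusion factor swept in Fig.4; 0.3 is only a placeholder value."""
    fused_row = (1.0 - gamma) * pos_cf[0] + gamma * pos_hist[0]
    fused_col = (1.0 - gamma) * pos_cf[1] + gamma * pos_hist[1]
    return fused_row, fused_col
```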
Bandwidth factor combination    $\overline{\rm{DP}}$/%    $\overline{\rm{OP}}$/%
(1/15, 1/12)                    88.3                      83.2
(1/15, 1/11)                    88.0                      83.1
(1/15, 1/10)                    89.6                      83.7
(1/14, 1/13)                    87.3                      82.0
Note: the best and second-best results are marked in bold and underline, respectively.
Tab.1 Comparison results of tracking algorithms with different bandwidths on dataset OTB-100
Tracking algorithm    $\overline{\rm{CLE}}$/pixel    $\overline{\rm{DP}}$/%    $\overline{\rm{OP}}$/%    V/(frame·s−1)
ECOhc                 22.7                           85.6                      78.5                      60.0
ECOhc_sig             19.5                           86.7                      79.8                      69.0
ECOhc_his             18.4                           86.9                      82.0                      56.1
Proposed algorithm    15.9                           89.6                      83.7                      42.6
Note: the best and second-best results are marked in bold and underline, respectively.
Tab.2 Comparison results of different algorithms on dataset OTB-100
Fig.5 Precision and success plots of different algorithms for four different videos
Fig.6 Distance precision and success rates of ten algorithms on dataset OTB-100
Tracking algorithm    $\overline{\rm{DP}}$/%    $\overline{\rm{AUC}}$/%
Proposed algorithm    89.6                      66.5
SiamRPN               85.1                      63.7
DaSiamRPN             88.0                      65.8
SiamFC+CIR            85.0                      64.0
SiamRPN+CIR           86.0                      67.0
C-RPN                 —                         66.3
LDES                  76.0                      63.4
DAT                   89.5                      66.8
Note: the best and second-best results are marked in bold and underline, respectively.
Tab.3 Comparison results of different algorithms on dataset OTB-100 in recent years
Fig.7 Tracking results of ten algorithms for typical video sequences
Tracking algorithm    EAO      A       R
Proposed algorithm    0.333    0.53    1.03
ECO                   0.373    0.54    0.72
C-COT                 0.331    0.52    0.85
ECOhc                 0.322    0.53    1.08
CSR-DCF               0.338    0.51    0.85
Staple                0.295    0.54    1.35
D_SRDCF               0.274    0.52    1.23
Note: the best and second-best results are marked in bold and underline, respectively.
Tab.4 Comparison results of different algorithms on dataset VOT2016
[1]   BOLME D S, BEVERIDGE J R, DRAPER B A, et al. Visual object tracking using adaptive correlation filters [C]// Proceedings of 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition. San Francisco: IEEE, 2010: 2544-2550.
[2]   HENRIQUES J F, CASEIRO R, MARTINS P, et al. Exploiting the circulant structure of tracking-by-detection with kernels [C]// Computer Vision-ECCV 2012. Florence: Springer, 2012: 702-715.
[3]   HENRIQUES J F, CASEIRO R, MARTINS P, et al. High-speed tracking with kernelized correlation filters [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37(3): 583-596.
doi: 10.1109/TPAMI.2014.2345390
[4]   DANELLJAN M, KHAN F S, FELSBERG M. Adaptive color attributes for real-time visual tracking [C]// Proceedings of 2014 IEEE Conference on Computer Vision and Pattern Recognition. Columbus: IEEE, 2014: 1090-1097.
[5]   XIONG Chang-zhen, ZHAO Lu-lu, GUO Fen-hong. Kernelized correlation filters tracking based on adaptive feature fusion [J]. Journal of Computer-Aided Design and Computer Graphics, 2017, 29(6): 1068-1074. (in Chinese)
doi: 10.3969/j.issn.1003-9775.2017.06.012
[6]   BERTINETTO L, VALMADRE J, GOLODETZ S, et al. Staple: complementary learners for real-time tracking [C]// Proceedings of 2016 IEEE Conference on Computer Vision and Pattern Recognition. Las Vegas: IEEE, 2016: 1401-1409.
[7]   MA C, HUANG J, YANG X, et al. Hierarchical convolutional features for visual tracking [C]// Proceedings of 2015 IEEE International Conference on Computer Vision. Santiago: IEEE, 2015: 3074-3082.
[8]   MA C, HUANG J B, YANG X, et al. Robust visual tracking via hierarchical convolutional features [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2019, 41(11): 2709-2723.
doi: 10.1109/TPAMI.2018.2865311
[9]   DANELLJAN M, ROBINSON A, KHAN F S, et al. Beyond correlation filters: learning continuous convolution operators for visual tracking [C]// Computer Vision-ECCV 2016. Amsterdam: Springer, 2016: 472-488.
[10]   DANELLJAN M, BHAT G, KHAN F S, et al. ECO: efficient convolution operators for tracking [C]// Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Honolulu: IEEE, 2017: 6931-6939.
[11]   DANELLJAN M, HAGER G, KHAN F S, et al. Discriminative scale space tracking [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017, 39(8): 1561-1575.
doi: 10.1109/TPAMI.2016.2609928
[12]   LI Y, ZHU J. A scale adaptive kernel correlation filter tracker with feature integration [C]// Proceedings of 2014 European Conference on Computer Vision. Zurich: Springer, 2014: 254-265.
[13]   GALOOGAHI H K, FAGG A, LUCEY S. Learning background-aware correlation filters for visual tracking [C]// Proceedings of 2017 IEEE International Conference on Computer Vision. Venice: IEEE, 2017: 1144-1152.
[14]   LUKEZIC A, VOJIR T, ČEHOVIN L, et al. Discriminative correlation filter with channel and spatial reliability [C]// Proceedings of 2017 IEEE Conference on Computer Vision and Pattern Recognition. Honolulu: IEEE, 2017: 4847-4856.
[15]   DANELLJAN M, HAGER G, KHAN F S, et al. Learning spatially regularized correlation filters for visual tracking [C]// Proceedings of 2015 IEEE International Conference on Computer Vision. Santiago: IEEE, 2015: 4310-4318.
[16]   LU Wei, XIANG Zhi-yu, YU Hai-bin, et al. Object compressive tracking based on adaptive multi-feature appearance model [J]. Journal of Zhejiang University: Engineering Science, 2014, 48(12): 2132-2138. (in Chinese)
[17]   WU Y, LIM J, YANG M. Object tracking benchmark [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37(9): 1834-1848.
doi: 10.1109/TPAMI.2014.2388226
[18]   KRISTAN M, MATAS J, LEONARDIS A, et al. The visual object tracking VOT2016 challenge results [C]// Computer Vision-ECCV 2016. Amsterdam: Springer, 2016: 777-823.
[19]   DANELLJAN M, HAGER G, KHAN F S, et al. Convolutional features for correlation filter based visual tracking [C]// Proceedings of the IEEE International Conference on Computer Vision Workshop. Santiago: IEEE, 2015: 621-629.
[20]   BERTINETTO L, VALMADRE J, HENRIQUES J F, et al. Fully-convolutional siamese networks for object tracking [C]// Computer Vision-ECCV 2016. Amsterdam: Springer, 2016: 850-865.
[21]   VALMADRE J, BERTINETTO L, HENRIQUES J F, et al. End-to-end representation learning for correlation filter based tracking [C]// Proceedings of 2017 IEEE Conference on Computer Vision and Pattern Recognition. Honolulu: IEEE, 2017: 5000-5008.
[22]   LI B, YAN J, WU W, et al. High performance visual tracking with siamese region proposal network [C]// Proceedings of 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Salt Lake City: IEEE, 2018: 8971-8980.
[23]   ZHU Z, WANG Q, LI B, et al. Distractor-aware siamese networks for visual object tracking [C]// Computer Vision - ECCV 2018. Munich: Springer, 2018: 103-119.
[24]   ZHANG Z, PENG H, WANG Q, et al. Deeper and wider siamese networks for real-time visual tracking [C]// Proceedings of 2019 IEEE Conference on Computer Vision and Pattern Recognition. Long Beach: IEEE, 2019: 4591-4600.
[25]   FAN H, LING H. Siamese cascaded region proposal networks for real-time visual tracking [C]// Proceedings of 2019 IEEE Conference on Computer Vision and Pattern Recognition. Long Beach: IEEE, 2019.
[26]   LI Y, ZHU J, HOI S, et al. Robust estimation of similarity transformation for visual object tracking [J]. AAAI Technical Track: Vision, 2019, 33(1): 8666-8673.