Journal of Zhejiang University (Engineering Science)  2020, Vol. 54 Issue (2): 301-310    DOI: 10.3785/j.issn.1008-973X.2020.02.011
Computer Technology, Information Engineering
Visual tracking algorithm based on anisotropic Gaussian distribution
Chang-zhen XIONG, Yan LU, Jia-qing YAN
Beijing Key Laboratory of Urban Traffic Intelligent Control Technology, North China University of Technology, Beijing 100144, China
Abstract:

A visual tracking algorithm based on an anisotropic Gaussian distribution was proposed to improve the tracking performance of the efficient convolution operators algorithm with hand-crafted features (ECOhc). An anisotropic Gaussian label function with different horizontal and vertical bandwidths is constructed according to the shape ratio of each object, and this function is used to train the tracker to predict the object position, improving tracking accuracy. Color histogram features of the object are then extracted to predict a second position, and the two predicted positions are fused with decision-level weighting, which further improves the tracking accuracy. The algorithm was evaluated on the OTB-100 and VOT2016 datasets. On OTB-100, the average distance precision and average overlap rate of the proposed algorithm were 89.6% and 83.7%, which were 4.67% and 6.62% higher than those of ECOhc, respectively. On VOT2016, the expected average overlap rate was 33.3%, 3.42% higher than that of ECOhc. The proposed algorithm effectively improves tracking accuracy and remains robust under interference such as occlusion, illumination variation, and deformation.

Key words: visual tracking    correlation filter    anisotropic Gaussian distribution    color histogram    weighted fusion
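The core idea in the abstract, replacing the isotropic Gaussian regression label of ECOhc with one whose vertical and horizontal bandwidths track the object's shape ratio, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the exact bandwidth scaling, and the default factors (mirroring the best combination (1/15, 1/10) from Table 1) are assumptions.

```python
import numpy as np

def anisotropic_gaussian_label(h, w, target_h, target_w, factors=(1 / 15, 1 / 10)):
    """Build a 2-D Gaussian regression label whose vertical and horizontal
    bandwidths follow the target's height and width, instead of one
    isotropic bandwidth. `factors` are the two bandwidth factors; the
    defaults echo Table 1's best combination (assumed scaling)."""
    sigma_y = factors[0] * target_h  # vertical bandwidth tied to target height
    sigma_x = factors[1] * target_w  # horizontal bandwidth tied to target width
    ys = np.arange(h) - h // 2       # grid centered on the label map
    xs = np.arange(w) - w // 2
    Y, X = np.meshgrid(ys, xs, indexing="ij")
    return np.exp(-0.5 * ((Y / sigma_y) ** 2 + (X / sigma_x) ** 2))
```

For a wide, short target the label decays faster vertically than horizontally, so the training signal matches the feature distribution contrast illustrated in Fig. 2.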
Received: 2019-01-06    Published: 2020-03-10
CLC:  TP 491  
Supported by: National Key Research and Development Program of China (2017YFC0821102); Beijing Excellent Talents Training Program (2017000020124G287)
About the author: XIONG Chang-zhen (1979—), male, associate professor, engaged in research on deep learning and video analysis. orcid.org/0000-0001-7645-5181. E-mail: xczkiong@163.com
Cite this article:

Chang-zhen XIONG,Yan LU,Jia-qing YAN. Visual tracking algorithm based on anisotropic Gaussian distribution. Journal of ZheJiang University (Engineering Science), 2020, 54(2): 301-310.

Link to this article:

http://www.zjujournals.com/eng/CN/10.3785/j.issn.1008-973X.2020.02.011        http://www.zjujournals.com/eng/CN/Y2020/V54/I2/301

Fig. 1  Overall framework of the proposed algorithm
Fig. 2  Comparison of feature distributions of two objects with different aspect ratios
Fig. 3  Width-height and height-width ratios of tracked objects in the OTB-100 dataset
Fig. 4  Average distance precision under different position fusion factors
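Fig. 4 tunes the factor that weights the two position estimates at the decision level. A minimal sketch of that fusion step is below; the function name and the example factor value are hypothetical, chosen only to illustrate the weighting, not values from the paper.

```python
def fuse_positions(pos_cf, pos_hist, gamma=0.3):
    """Decision-level weighted fusion of the correlation-filter position
    estimate `pos_cf` and the color-histogram estimate `pos_hist`.
    `gamma` is a hypothetical fusion factor in [0, 1]; gamma = 0 keeps
    only the correlation-filter prediction."""
    return tuple((1 - gamma) * c + gamma * p for c, p in zip(pos_cf, pos_hist))
```

The fused coordinate is a convex combination of the two predictions, so it always lies on the segment between them; Fig. 4 reports how the average distance precision varies as this factor is swept.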
Bandwidth factor combination    $\overline{\rm{DP}} $/%    $\overline{\rm{OP}} $/%
(1/15, 1/12)    88.3    83.2
(1/15, 1/11)    88.0    83.1
(1/15, 1/10)    89.6    83.7
(1/14, 1/13)    87.3    82.0
Note: the best and second-best results are marked in bold and underlined, respectively
Table 1  Comparison of tracking results on the OTB-100 dataset under different bandwidth factors
Tracker    $\overline{\rm{CLE}} $/pixel    $\overline{\rm{DP}} $/%    $\overline{\rm{OP}} $/%    V/(frame·s−1)
ECOhc    22.7    85.6    78.5    60.0
ECOhc_sig    19.5    86.7    79.8    69.0
ECOhc_his    18.4    86.9    82.0    56.1
Proposed    15.9    89.6    83.7    42.6
Note: the best and second-best results are marked in bold and underlined, respectively
Table 2  Comparison of different algorithms on the OTB-100 dataset
Fig. 5  Precision and success rate curves of different algorithms on four video groups
Fig. 6  Distance precision and success rate curves of ten algorithms on the OTB-100 dataset
Tracker    $\overline{\rm{DP}} $/%    $\overline{\rm{AUC}} $/%
Proposed    89.6    66.5
SiamRPN    85.1    63.7
DaSiamRPN    88.0    65.8
SiamFC+CIR    85.0    64.0
SiamRPN+CIR    86.0    67.0
C-RPN    −    66.3
LDES    76.0    63.4
DAT    89.5    66.8
Note: the best and second-best results are marked in bold and underlined, respectively
Table 3  Comparison of recent tracking algorithms on the OTB-100 dataset
Fig. 7  Tracking results of ten algorithms on typical videos
Tracker    EAO    A    R
Proposed    0.333    0.53    1.03
ECO    0.373    0.54    0.72
C-COT    0.331    0.52    0.85
ECOhc    0.322    0.53    1.08
CSR-DCF    0.338    0.51    0.85
Staple    0.295    0.54    1.35
D_SRDCF    0.274    0.52    1.23
Note: the best and second-best results are marked in bold and underlined, respectively
Table 4  Comparison of different algorithms on the VOT2016 dataset
1 BOLME D S, BEVERIDGE J R, DRAPER B A, et al. Visual object tracking using adaptive correlation filters [C]// Proceedings of 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition. San Francisco: IEEE, 2010: 2544-2550.
2 HENRIQUES J F, CASEIRO R, MARTINS P, et al. Exploiting the circulant structure of tracking-by-detection with kernels [C]// Computer Vision-ECCV 2012. Florence: Springer, 2012: 702-715.
3 HENRIQUES J F, CASEIRO R, MARTINS P, et al. High-speed tracking with kernelized correlation filters [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37(3): 583-596
doi: 10.1109/TPAMI.2014.2345390
4 DANELLJAN M, KHAN F S, FELSBERG M. Adaptive color attributes for real-time visual tracking [C]// Proceedings of 2014 IEEE Conference on Computer Vision and Pattern Recognition. Columbus: IEEE, 2014: 1090-1097.
5 XIONG Chang-zhen, ZHAO Lu-lu, GUO Fen-hong. Kernelized correlation filters tracking based on adaptive feature fusion [J]. Journal of Computer-Aided Design and Computer Graphics, 2017, 29(6): 1068-1074
doi: 10.3969/j.issn.1003-9775.2017.06.012
6 BERTINETTO L, VALMADRE J, GOLODETZ S, et al. Staple: complementary learners for real-time tracking [C]// Proceedings of 2016 IEEE Conference on Computer Vision and Pattern Recognition. California: IEEE, 2016: 1401-1409.
7 MA C, HUANG J, YANG X, et al. Hierarchical convolutional features for visual tracking [C]// Proceedings of 2015 IEEE International Conference on Computer Vision. Santiago: IEEE, 2015: 3074-3082.
8 MA C, HUANG J B, YANG X, et al. Robust visual tracking via hierarchical convolutional features [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2019, 41(11): 2709-2723
doi: 10.1109/TPAMI.2018.2865311
9 DANELLJAN M, ROBINSON A, KHAN F S, et al. Beyond correlation filters: learning continuous convolution operators for visual tracking [C]// Computer Vision-ECCV 2016. Amsterdam: Springer, 2016: 472-488.
10 DANELLJAN M, BHAT G, KHAN F S, et al. ECO: efficient convolution operators for tracking [C]// Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Honolulu: IEEE, 2017: 6931-6939.
11 DANELLJAN M, HAGER G, KHAN F S, et al. Discriminative scale space tracking [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017, 39(8): 1561-1575
doi: 10.1109/TPAMI.2016.2609928
12 LI Y, ZHU J. A scale adaptive kernel correlation filter tracker with feature integration [C]// Proceedings of 2014 European Conference on Computer Vision. Zurich: Springer, 2014: 254-265.
13 GALOOGAHI H K, FAGG A, LUCEY S. Learning background-aware correlation filters for visual tracking [C]// Proceedings of 2017 IEEE International Conference on Computer Vision. Venice: IEEE, 2017: 1144-1152.
14 LUKEZIC A, VOJIR T, ČEHOVIN L, et al. Discriminative correlation filter with channel and spatial reliability [C]// Proceedings of 2017 IEEE Conference on Computer Vision and Pattern Recognition. Honolulu: IEEE, 2017: 4847-4856.
15 DANELLJAN M, HAGER G, KHAN F S, et al. Learning spatially regularized correlation filters for visual tracking [C]// Proceedings of 2015 IEEE International Conference on Computer Vision. Santiago: IEEE, 2015: 4310-4318.
16 LU Wei, XIANG Zhi-yu, YU Hai-bin, LIU Ji-lin. Object compressive tracking based on adaptive multi-feature appearance model [J]. Journal of Zhejiang University: Engineering Science, 2014, 48(12): 2132-2138
17 WU Y, LIM J, YANG M. Object tracking benchmark [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37(9): 1834-1848
doi: 10.1109/TPAMI.2014.2388226
18 KRISTAN M, MATAS J, LEONARDIS A, et al. The visual object tracking VOT2016 challenge results [C]// Computer Vision-ECCV 2016. Amsterdam: Springer, 2016: 777-823.
19 DANELLJAN M, HAGER G, KHAN F S, et al. Convolutional features for correlation filter based visual tracking [C]// Proceedings of the IEEE International Conference on Computer Vision Workshop. Santiago: IEEE, 2015: 621-629.
20 BERTINETTO L, VALMADRE J, HENRIQUES J F, et al. Fully-convolutional siamese networks for object tracking [C]// Computer Vision-ECCV 2016. Amsterdam: Springer, 2016: 850-865.
21 VALMADRE J, BERTINETTO L, HENRIQUES J F, et al. End-to-end representation learning for correlation filter based tracking [C]// Proceedings of 2017 IEEE Conference on Computer Vision and Pattern Recognition. Honolulu: IEEE, 2017: 5000-5008.
22 LI B, YAN J, WU W, et al. High performance visual tracking with siamese region proposal network [C]// Proceedings of 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Salt Lake City: IEEE, 2018: 8971-8980.
23 ZHU Z, WANG Q, LI B, et al. Distractor-aware siamese networks for visual object tracking [C]// Computer Vision - ECCV 2018. Munich: Springer, 2018: 103-119.
24 ZHANG Z, PENG H, WANG Q, et al. Deeper and wider siamese networks for real-time visual tracking [C]// Proceedings of 2019 IEEE Conference on Computer Vision and Pattern Recognition. Long Beach: IEEE, 2019: 4591-4600.
25 FAN H, LING H. Siamese cascaded region proposal networks for real-time visual tracking [C]// Proceedings of 2019 IEEE Conference on Computer Vision and Pattern Recognition. Long Beach: IEEE, 2019.
26 LI Y, ZHU J, HOI S, et al. Robust estimation of similarity transformation for visual object tracking [J]. AAAI Technical Track: Vision, 2019, 33(1): 8666-8673