CLC number: TP391
On-line Access: 2020-10-14
Received: 2019-09-20
Revision Accepted: 2020-04-12
Crosschecked: 2020-09-02
Shui-wang Li, Qian-bo Jiang, Qi-jun Zhao, Li Lu, Zi-liang Feng. Asymmetric discriminative correlation filters for visual tracking[J]. Frontiers of Information Technology & Electronic Engineering, in press. https://doi.org/10.1631/FITEE.1900507
Asymmetric discriminative correlation filters for visual tracking
National Key Laboratory of Fundamental Science on Synthetic Vision, Sichuan University, Chengdu 610065, China

Abstract: Discriminative correlation filters (DCFs) are an effective approach to visual tracking and have significantly advanced the field. However, the symmetry of the convolution operator causes computational problems and breaks generalized translation equivariance. Many solutions have been proposed for the former problem, but the latter has received little attention. This paper analyzes the problems caused by the symmetry of circular convolution, proposes an asymmetric convolution operation, and proves that this operation possesses a weak form of generalized translation equivariance. Using the proposed operation, we construct an asymmetric discriminative correlation filter tracker (ADCF). ADCF is more sensitive to target translation, and its asymmetry allows the filter and the input sample to have different spatial sizes, which makes the computational complexity of ADCF more controllable in the sense that the number of filter parameters does not grow with the input sample. Moreover, the normal matrix of ADCF has a two-level block Toeplitz structure, which can be exploited to design a matrix-vector multiplication with O(N log N) time complexity and O(N) space complexity. In addition, unlike DCF-based trackers, ADCF introduces spatial and temporal regularization terms without essentially increasing the computational complexity. Comparative experiments on four public benchmarks (OTB-2013, OTB-2015, VOT-2016, and Temple-Color) and a synthetic dataset show that the proposed method achieves state-of-the-art tracking performance.
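The asymmetry described above can be illustrated with a minimal sketch (not the authors' code): when the filter h is smaller than the sample x, a "valid" cross-correlation keeps only the responses where the filter lies fully inside the sample, so the number of filter parameters is fixed regardless of the sample size, and the circular symmetry of standard DCF convolution is broken. The FFT evaluation below is a standard trick; the function name and shapes are illustrative assumptions.

```python
import numpy as np

def valid_xcorr2(x, h):
    """2-D 'valid' cross-correlation of sample x (n x n) with a
    smaller filter h (m x m), evaluated via the FFT.

    Only responses where h lies fully inside x are kept, so the
    output is (n-m+1) x (n-m+1) and no circular wrap-around occurs.
    """
    n, m = x.shape[0], h.shape[0]
    assert x.shape == (n, n) and h.shape == (m, m) and m <= n
    # Zero-pad the filter to the sample size and apply the DFT
    # correlation theorem: F{corr(x, h)} = X * conj(H).
    H = np.fft.rfft2(h, s=x.shape)
    X = np.fft.rfft2(x)
    full = np.fft.irfft2(X * np.conj(H), s=x.shape)
    # For shifts up to n-m the circular correlation has no wrap-around,
    # so cropping recovers the exact 'valid' responses.
    return full[: n - m + 1, : n - m + 1]
```

Because the output grid differs from the input grid, this operation is only weakly translation-equivariant: shifting the sample shifts the valid responses, but responses near the border leave the valid region, which is consistent with the "weak generalized translation equivariance" the abstract refers to.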
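The O(N log N) matrix-vector product claimed for the two-level block Toeplitz normal matrix rests on a classical circulant-embedding trick, applied once per level. As a hedged one-level sketch (not the authors' implementation), the function below multiplies an n x n Toeplitz matrix by a vector by embedding it in a circulant of size 2n, whose action is diagonalized by the FFT; the two-level case applies the same embedding to each block level.

```python
import numpy as np

def toeplitz_matvec(c, r, v):
    """Multiply the Toeplitz matrix T (first column c, first row r,
    with c[0] == r[0]) by the vector v in O(n log n) time and O(n)
    extra space, via circulant embedding.
    """
    n = len(v)
    # Embed T in a 2n x 2n circulant: its first column stacks the
    # first column of T, a zero, and the reversed first row (minus
    # the shared corner element).
    col = np.concatenate([c, [0.0], r[:0:-1]])
    V = np.concatenate([v, np.zeros(n)])
    # A circulant matrix acts by circular convolution, which the FFT
    # turns into pointwise multiplication.
    prod = np.fft.irfft(np.fft.rfft(col) * np.fft.rfft(V), n=2 * n)
    # The first n entries of the circulant product equal T @ v.
    return prod[:n]
```

The same idea at two levels (blocks that are Toeplitz, arranged in a Toeplitz pattern of blocks) gives the O(N log N) time and O(N) space costs stated in the abstract, since only first rows and columns are ever stored.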