Fast visual tracking via dense spatio-temporal context learning

Kaihua Zhang, Lei Zhang, Qingshan Liu, David Zhang, Ming-Hsuan Yang

Research output: Contribution to journal › Conference article › peer-review

501 Citations (Scopus)

Abstract

In this paper, we present a simple yet fast and robust algorithm that exploits the dense spatio-temporal context for visual tracking. Our approach formulates the spatio-temporal relationships between the object of interest and its locally dense contexts in a Bayesian framework, which models the statistical correlation between simple low-level features (i.e., image intensity and position) from the target and its surrounding regions. The tracking problem is then posed as computing a confidence map that takes into account the prior information of the target location and thereby effectively alleviates target location ambiguity. We further propose a novel explicit scale adaptation scheme, which is able to deal with target scale variations efficiently and effectively. The Fast Fourier Transform (FFT) is adopted for fast learning and detection in this work, which needs only four FFT operations. Implemented in MATLAB without code optimization, the proposed tracker runs at 350 frames per second on an i7 machine. Extensive experimental results show that the proposed algorithm performs favorably against state-of-the-art methods in terms of efficiency, accuracy and robustness.
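The abstract describes learning a spatial context model and locating the target at the peak of a confidence map, with all learning and detection done via the FFT. The Python/NumPy sketch below illustrates that recipe under stated assumptions; it is not the authors' MATLAB implementation, and the windowing function, confidence-map form, and parameter values (alpha, beta, rho, sigma) are illustrative choices, not values taken from this record.

```python
# Minimal sketch (assumptions, not the authors' code) of FFT-based
# spatio-temporal context tracking: learn a spatial context model in the
# frequency domain, update it over time, and detect the target as the
# peak of a confidence map. Learning and detection each use 2 FFTs,
# consistent with the abstract's "four FFT operations".
import numpy as np

def gaussian_weight(shape, center, sigma):
    """Weighted Gaussian window over the local context region."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist2 = (xs - center[1]) ** 2 + (ys - center[0]) ** 2
    win = np.exp(-dist2 / (2.0 * sigma ** 2))
    return win / win.sum()

def confidence_map(shape, center, alpha=2.25, beta=1.0):
    """Target confidence map, peaked at the current target location."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.sqrt((xs - center[1]) ** 2 + (ys - center[0]) ** 2)
    return np.exp(-((dist / alpha) ** beta))

def learn_spatial_context(patch, center, sigma):
    """One learning step: spatial context model in the frequency domain (2 FFTs)."""
    prior = patch * gaussian_weight(patch.shape, center, sigma)  # context prior
    C = np.fft.fft2(confidence_map(patch.shape, center))
    P = np.fft.fft2(prior)
    return C / (P + 1e-6)  # element-wise division; small constant avoids divide-by-zero

def detect(patch, center, H_stc, sigma):
    """One detection step: confidence map and its peak location (2 FFTs)."""
    prior = patch * gaussian_weight(patch.shape, center, sigma)
    conf = np.real(np.fft.ifft2(H_stc * np.fft.fft2(prior)))
    return np.unravel_index(np.argmax(conf), conf.shape), conf

# Toy usage on random data; rho controls the temporal update rate.
rho, sigma = 0.075, 20.0
patch_t = np.random.rand(64, 64)       # grayscale context region at frame t
center = (32, 32)                      # (row, col) of the target
H_stc = learn_spatial_context(patch_t, center, sigma)
patch_t1 = np.random.rand(64, 64)      # context region at frame t+1
new_center, conf = detect(patch_t1, center, H_stc, sigma)
H_stc = (1 - rho) * H_stc + rho * learn_spatial_context(patch_t1, new_center, sigma)
```

Because the model is kept and applied in the frequency domain, each frame costs only a handful of FFTs plus element-wise operations, which is what makes the hundreds-of-frames-per-second speed plausible.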

Original language: English
Pages (from-to): 127-141
Number of pages: 15
Journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 8693 LNCS
Issue number: PART 5
Publication status: Published - 2014
Event: 13th European Conference on Computer Vision, ECCV 2014 - Zurich, Switzerland
Duration: 6 Sept 2014 - 12 Sept 2014

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science (all)
