An experimental comparison of online object-tracking algorithms

Qing Wang, Feng Chen, Wenli Xu, Ming-Hsuan Yang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

69 Citations (Scopus)

Abstract

This paper reviews and evaluates several state-of-the-art online object tracking algorithms. Notwithstanding decades of effort, object tracking remains a challenging problem due to factors such as illumination, pose, scale, deformation, motion blur, noise, and occlusion. To account for appearance change, most recent tracking algorithms focus on robust object representations and effective state prediction. In this paper, we analyze the components of each tracking method and identify their key roles in dealing with specific challenges, thereby shedding light on how to choose and design algorithms for different situations. We compare state-of-the-art online tracking methods, including the IVT [1], VRT [2], FragT [3], BoostT [4], SemiT [5], BeSemiT [6], L1T [7], MILT [8], VTD [9], and TLD [10] algorithms, on numerous challenging sequences, and evaluate them with different performance metrics. The qualitative and quantitative comparative results demonstrate the strengths and weaknesses of these algorithms.
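
The abstract mentions evaluating trackers with different performance metrics. Two metrics commonly used in such comparisons are the center location error and the bounding-box overlap (intersection-over-union) score. The sketch below is illustrative only and is not taken from the paper; the box format (x, y, width, height) and the example numbers are assumptions.

# Illustrative sketch (not from the paper): two common tracking metrics,
# assuming bounding boxes are given as (x, y, width, height) tuples.

def center_error(box_a, box_b):
    """Euclidean distance between the centers of two boxes."""
    ax, ay = box_a[0] + box_a[2] / 2.0, box_a[1] + box_a[3] / 2.0
    bx, by = box_b[0] + box_b[2] / 2.0, box_b[1] + box_b[3] / 2.0
    return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

def overlap_ratio(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[0] + box_a[2], box_b[0] + box_b[2])
    y2 = min(box_a[1] + box_a[3], box_b[1] + box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = box_a[2] * box_a[3] + box_b[2] * box_b[3] - inter
    return inter / union if union > 0 else 0.0

# Hypothetical usage: compare tracker output against ground truth per frame.
tracked = [(10, 12, 40, 60), (14, 15, 41, 59)]
truth = [(11, 12, 40, 62), (20, 18, 40, 61)]
errors = [center_error(t, g) for t, g in zip(tracked, truth)]
overlaps = [overlap_ratio(t, g) for t, g in zip(tracked, truth)]
print(sum(errors) / len(errors), sum(overlaps) / len(overlaps))

A tracker is typically summarized by its average center error, or by the fraction of frames whose overlap exceeds a threshold (a success rate).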

Original language: English
Title of host publication: Wavelets and Sparsity XIV
DOIs
Publication status: Published - 2011
Event: Wavelets and Sparsity XIV - San Diego, CA, United States
Duration: 2011 Aug 21 - 2011 Aug 24

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 8138
ISSN (Print): 0277-786X

Conference

Conference: Wavelets and Sparsity XIV
Country/Territory: United States
City: San Diego, CA
Period: 11/8/21 - 11/8/24

All Science Journal Classification (ASJC) codes

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Computer Science Applications
  • Applied Mathematics
  • Electrical and Electronic Engineering
