Visual Tracking via Coarse and Fine Structural Local Sparse Appearance Models

Xu Jia, Huchuan Lu, Ming-Hsuan Yang

Research output: Contribution to journal › Article › peer-review

48 Citations (Scopus)

Abstract

Sparse representation has been successfully applied to visual tracking by finding the best candidate with minimal reconstruction error using target templates. However, most sparse representation-based tracking methods consider only holistic rather than local appearance to discriminate between target and background regions, and hence may not perform well when target objects are heavily occluded. In this paper, we develop a simple yet robust tracking algorithm based on a coarse and fine structural local sparse appearance model. The proposed method exploits both partial and structural information of a target object via sparse coding over a dictionary composed of patches from multiple target templates. The likelihood obtained by averaging and pooling operations exploits the consistent appearance of object parts, thereby helping not only to locate targets accurately but also to handle partial occlusion. To update templates accurately without introducing occluded regions, we employ an occlusion detection scheme to account for the pixels that belong to the target object. The proposed method is evaluated on a large benchmark data set with three evaluation metrics. Experimental results demonstrate that the proposed tracking algorithm performs favorably against several state-of-the-art methods.
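The abstract compresses the method into a few steps: divide each candidate region into local patches, sparse-code each patch against a dictionary built from same-position patches of multiple target templates, and pool the aligned coefficients into a likelihood. The sketch below is a minimal NumPy illustration of that pipeline, not the authors' implementation: the ISTA solver, the function names (ista_sparse_code, candidate_likelihood), the template-major dictionary layout, and the non-negative pooling rule are assumptions made for this example.

    import numpy as np

    def ista_sparse_code(D, y, lam=0.01, n_iter=100):
        # Solve min_a 0.5*||y - D a||_2^2 + lam*||a||_1 with ISTA.
        # D: (d, K) dictionary whose columns are vectorized template patches.
        # y: (d,) vectorized candidate patch.
        L = np.linalg.norm(D, 2) ** 2 + 1e-12    # Lipschitz constant of the smooth term
        a = np.zeros(D.shape[1])
        for _ in range(n_iter):
            a = a - (D.T @ (D @ a - y)) / L       # gradient step
            a = np.sign(a) * np.maximum(np.abs(a) - lam / L, 0.0)  # soft threshold
        return a

    def candidate_likelihood(patches, D, n_templates, patches_per_template):
        # patches: (n_patches, d) local patches of one candidate region.
        # Dictionary columns are assumed template-major: column t*P + i holds
        # the patch at position i in template t (an illustrative layout).
        scores = []
        for i, y in enumerate(patches):
            a = ista_sparse_code(D, y / (np.linalg.norm(y) + 1e-12))
            idx = [t * patches_per_template + i for t in range(n_templates)]
            scores.append(np.sum(np.maximum(a[idx], 0.0)))  # pool aligned coefficients
        return float(np.mean(scores))             # average the patch responses

    # Toy usage: 5 templates, 9 patches per template, 8x8 patches (d = 64).
    rng = np.random.default_rng(0)
    T, P, d = 5, 9, 64
    D = rng.standard_normal((d, T * P))
    D /= np.linalg.norm(D, axis=0, keepdims=True)  # unit-norm dictionary atoms
    candidate = rng.standard_normal((P, d))
    print(candidate_likelihood(candidate, D, n_templates=T, patches_per_template=P))

Restricting the pooling to coefficients at the same spatial index is what makes the model "structural": a patch that is well reconstructed only by atoms from other locations (e.g., background clutter or an occluder) contributes little to the score, which is consistent with the abstract's claim of robustness to partial occlusion.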

Original language: English
Article number: 7515153
Pages (from-to): 4555-4564
Number of pages: 10
Journal: IEEE Transactions on Image Processing
Volume: 25
Issue number: 10
DOIs
Publication status: Published - Oct 2016

Bibliographical note

Publisher Copyright:
© 2016 IEEE.

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Graphics and Computer-Aided Design
