3D human motion tracking with a coordinated mixture of factor analyzers

Rui Li, Tai-Peng Tian, Stan Sclaroff, Ming-Hsuan Yang

Research output: Contribution to journal › Article › peer-review

55 Citations (Scopus)

Abstract

A major challenge in applying Bayesian tracking methods for tracking 3D human body pose is the high dimensionality of the pose state space. It has been observed that the 3D human body pose parameters typically can be assumed to lie on a low-dimensional manifold embedded in the high-dimensional space. The goal of this work is to approximate the low-dimensional manifold so that a low-dimensional state vector can be obtained for efficient and effective Bayesian tracking. To achieve this goal, a globally coordinated mixture of factor analyzers is learned from motion capture data. Each factor analyzer in the mixture is a "locally linear dimensionality reducer" that approximates a part of the manifold. The global parametrization of the manifold is obtained by aligning these locally linear pieces in a global coordinate system. To enable automatic and optimal selection of the number of factor analyzers and the dimensionality of the manifold, a variational Bayesian formulation of the globally coordinated mixture of factor analyzers is proposed. The advantages of the proposed model are demonstrated in a multiple hypothesis tracker for tracking 3D human body pose. Quantitative comparisons on benchmark datasets show that the proposed method produces more accurate 3D pose estimates over time than those obtained from two previously proposed Bayesian tracking methods.
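The sketch below is a minimal, illustrative rendering of the core idea rather than the authors' implementation: cluster motion capture poses, fit one factor analyzer per cluster as a "locally linear dimensionality reducer", and map a low-dimensional latent vector back to a full pose through the local loading matrix, as a particle-based tracker would when evaluating hypotheses. The k-means clustering step, the scikit-learn classes used, the chosen number of analyzers and latent dimension, and the omission of global coordination and variational Bayesian model selection are all simplifying assumptions.

```python
# Illustrative sketch only: locally linear dimensionality reduction of pose
# data with a mixture of factor analyzers.  Global coordination of the local
# charts and variational selection of K and d (as in the paper) are omitted;
# K, d, and the scikit-learn components used here are assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import FactorAnalysis


def fit_local_factor_analyzers(poses, n_analyzers=5, latent_dim=3, seed=0):
    """Cluster mocap poses and fit one factor analyzer per cluster.

    poses: (N, D) array of joint-angle vectors from motion capture.
    Returns the cluster model and a list of fitted FactorAnalysis objects,
    each acting as a locally linear dimensionality reducer.
    """
    km = KMeans(n_clusters=n_analyzers, n_init=10, random_state=seed).fit(poses)
    analyzers = []
    for k in range(n_analyzers):
        fa = FactorAnalysis(n_components=latent_dim, random_state=seed)
        fa.fit(poses[km.labels_ == k])
        analyzers.append(fa)
    return km, analyzers


def decode(latent, analyzer):
    """Map a low-dimensional latent vector back to a full pose via the
    locally linear model x ≈ Λ z + μ (loading matrix plus cluster mean)."""
    return latent @ analyzer.components_ + analyzer.mean_


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in for motion capture data: high-dimensional poses that
    # actually lie near a low-dimensional nonlinear manifold.
    t = rng.uniform(0, 2 * np.pi, size=2000)
    poses = np.stack([np.sin(t), np.cos(t), np.sin(2 * t), t / np.pi], axis=1)
    poses = poses @ rng.normal(size=(4, 30)) + 0.01 * rng.normal(size=(2000, 30))

    km, analyzers = fit_local_factor_analyzers(poses, n_analyzers=5, latent_dim=2)

    # A tracker would propagate hypotheses in the latent space and lift each
    # one back to the full pose space for image-likelihood evaluation.
    k = km.predict(poses[:1])[0]
    z = analyzers[k].transform(poses[:1])   # encode to local latent coordinates
    x_hat = decode(z, analyzers[k])         # lift back to full pose space
    print("reconstruction error:", float(np.linalg.norm(x_hat - poses[:1])))
```

In the paper the locally linear pieces are additionally aligned in a single global coordinate system, so tracking can use one low-dimensional state vector rather than per-cluster latent coordinates; the sketch above keeps the charts separate for brevity.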

Original language: English
Pages (from-to): 170-190
Number of pages: 21
Journal: International Journal of Computer Vision
Volume: 87
Issue number: 1-2
DOIs
Publication status: Published - Mar 2010

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence
