Robust face detection and tracking for real-life applications

Hyeran Byun, Byoungchul Ko

Research output: Contribution to journal › Letter › peer-review

10 Citations (Scopus)


In this paper, we propose a new face detection and tracking algorithm for real-life telecommunication applications such as video conferencing, cellular phones, and PDAs. We combine a template-based face detection and tracking method with color information to track faces regardless of lighting conditions, complex backgrounds, and race. Based on our experiments, we generate robust face templates from the wavelet-transformed lowpass and two highpass subimages at the second-level low resolution. However, since template matching is generally sensitive to changes in illumination, we propose a new preprocessing method. A tracking method is applied to reduce computation time and to predict the face candidate region precisely even when the movement is not uniform. Facial components are also detected using k-means clustering and their geometrical properties. Finally, from the relative distance between the two eyes, we verify the real face and estimate the size of the facial ellipse. To validate the detection and tracking performance of our algorithm, we test it on six video categories of QCIF size recorded in dynamic environments.
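The abstract does not specify the exact wavelet used, so the following is only an illustrative sketch of the subimage step: a two-level 2-D Haar decomposition that yields the second-level lowpass subimage and two highpass subimages of the kind the templates are built from. All function names here are hypothetical, not from the paper.

```python
import numpy as np

def haar_level(img):
    """One level of 2-D Haar decomposition on an even-sized grayscale
    array; returns the (LL, LH, HL, HH) subbands at half resolution."""
    a = img[0::2, 0::2]  # top-left pixel of each 2x2 block
    b = img[0::2, 1::2]  # top-right
    c = img[1::2, 0::2]  # bottom-left
    d = img[1::2, 1::2]  # bottom-right
    LL = (a + b + c + d) / 4.0  # lowpass (local average)
    LH = (a + b - c - d) / 4.0  # horizontal detail
    HL = (a - b + c - d) / 4.0  # vertical detail
    HH = (a - b - c + d) / 4.0  # diagonal detail
    return LL, LH, HL, HH

def second_level_subbands(img):
    """Apply the decomposition twice and keep the second-level lowpass
    plus two highpass subimages, mirroring the subbands the abstract
    says the face templates are generated from (an assumption here)."""
    LL1, _, _, _ = haar_level(img)
    LL2, LH2, HL2, _ = haar_level(LL1)
    return LL2, LH2, HL2
```

On a 16x16 input each second-level subband is 4x4, i.e. one quarter of the linear resolution, which is what makes template matching at this level cheap enough for QCIF-rate tracking.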

Original language: English
Pages (from-to): 1035-1055
Number of pages: 21
Journal: International Journal of Pattern Recognition and Artificial Intelligence
Issue number: 6
Publication status: Published - September 2003

Bibliographical note

Funding Information:
The authors would like to thank Haejin Song for many useful suggestions that helped to improve the presentation of the paper. This work was supported in part by Biometrics Engineering Research Center (KOSEF).

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence
