Deinterlacing with motion adaptive vertical temporal filtering

Kwon Lee, Jonghwa Lee, Chulhee Lee

Research output: Contribution to journal › Article › peer-review

17 Citations (Scopus)

Abstract

In this paper, we propose a deinterlacing method with motion adaptive vertical temporal filtering that exploits the correlations between adjacent frames. In this method, we first interpolate the missing lines of the current frame and the adjacent frames using an intrafield deinterlacing method. We then compute the pixel differences between the current frame and the adjacent frames. Since these frame differences show similar patterns, we can use them to improve deinterlacing performance; in other words, instead of performing deinterlacing in the frame domain, we perform it in the frame difference domain. Because the method performs well in stationary regions, we apply the vertical temporal filter selectively, and we then apply the proposed method iteratively to further enhance video quality. The proposed method has low computational complexity yet produces superior performance. Experimental results show that it provides noticeable improvements over existing methods in both subjective and objective evaluations.
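
The abstract describes the pipeline only at a high level (intrafield interpolation, frame differencing, motion-gated vertical temporal filtering, iteration). The sketch below is a minimal NumPy illustration of that flow; the line-average interpolator, the motion threshold, the averaging filter applied in the difference domain, the iteration count, and all function names are assumptions made for illustration, not the filter design reported in the paper.

```python
import numpy as np


def intrafield_interpolate(frame, top_field=True):
    """Fill the missing lines of one field by simple line averaging.

    `frame` is a full-height array whose missing lines will be overwritten:
    for a top field the odd lines are treated as missing (an illustrative
    convention; the abstract does not specify one).
    """
    out = frame.astype(np.float64).copy()
    h = out.shape[0]
    start = 1 if top_field else 0            # index of the first missing line
    for y in range(start, h, 2):
        above = out[y - 1] if y - 1 >= 0 else out[y + 1]
        below = out[y + 1] if y + 1 < h else out[y - 1]
        out[y] = 0.5 * (above + below)       # intrafield line average
    return out


def motion_adaptive_vt_deinterlace(prev_f, curr_f, next_f, top_field=True,
                                   motion_threshold=10.0, iterations=2):
    """Motion-adaptive vertical temporal refinement in the difference domain.

    Illustrative only: the threshold, the averaging of the difference signal,
    and the iteration count are assumptions, not the paper's parameters.
    """
    curr = intrafield_interpolate(curr_f, top_field)
    prev = intrafield_interpolate(prev_f, not top_field)
    nxt = intrafield_interpolate(next_f, not top_field)

    h = curr.shape[0]
    missing = range(1 if top_field else 0, h, 2)

    for _ in range(iterations):
        # Differences between the (deinterlaced) current and adjacent frames.
        d_prev = curr - prev
        d_next = curr - nxt
        # Large differences indicate motion; the VT filter is skipped there.
        stationary = np.maximum(np.abs(d_prev), np.abs(d_next)) < motion_threshold

        for y in missing:
            y0, y1 = max(y - 1, 0), min(y + 1, h - 1)
            # Estimate the frame difference at the missing line from the
            # vertically adjacent lines, i.e. filter in the difference domain.
            est_diff = 0.25 * (d_prev[y0] + d_prev[y1] + d_next[y0] + d_next[y1])
            # Add the estimated difference back onto the temporal average of
            # the co-located lines in the adjacent frames.
            vt_value = 0.5 * (prev[y] + nxt[y]) + est_diff
            curr[y] = np.where(stationary[y], vt_value, curr[y])
    return curr
```

In this sketch the output falls back to the line-averaged intrafield result wherever motion is detected, which mirrors the selective application of the vertical temporal filter described in the abstract.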

Original language: English
Pages (from-to): 636-643
Number of pages: 8
Journal: IEEE Transactions on Consumer Electronics
Volume: 55
Issue number: 2
DOIs
Publication status: Published - 2009

Bibliographical note

Funding Information:
This research was supported by the MKE (Ministry of Knowledge Economy), South Korea, under the ITRC (Information Technology Research Center) support program supervised by the IITA (Institute of Information Technology Assessment). (IITA-2008-(C1090-0801-0011)).

All Science Journal Classification (ASJC) codes

  • Media Technology
  • Electrical and Electronic Engineering
