Unpaired image denoising using a generative adversarial network in x-ray CT

Hyoung Suk Park, Jineon Baek, Sun Kyoung You, Jae Kyu Choi, Jin Keun Seo

Research output: Contribution to journal › Article › peer-review

47 Citations (Scopus)

Abstract

This paper proposes a deep learning-based denoising method for noisy low-dose computerized tomography (CT) images in the absence of paired training data. The proposed method uses a fidelity-embedded generative adversarial network (GAN) to learn a denoising function from unpaired training data of low-dose CT (LDCT) and standard-dose CT (SDCT) images, where the denoising function is the optimal generator in the GAN framework. This paper analyzes the f-GAN objective to derive a suitable generator that is optimized by minimizing a weighted sum of two losses: the Kullback-Leibler divergence between an SDCT data distribution and a generated distribution, and the ℓ2 loss between the LDCT image and the corresponding generated image (i.e., the denoised image). The computed generator reflects the prior belief about the SDCT data distribution through training. We observed that the proposed method preserves fine anomalous features while eliminating noise. The experimental results show that the proposed deep-learning method with unpaired datasets performs comparably to a method using paired datasets. A clinical experiment was also performed to show the validity of the proposed method for noise arising in low-dose X-ray CT.
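The abstract describes the generator objective as a weighted sum of a Kullback-Leibler adversarial term, obtained from the f-GAN variational bound, and an ℓ2 fidelity term tying the output to the input LDCT image. The snippet below is a minimal, hypothetical PyTorch sketch of one training iteration under that formulation; the toy networks, the fidelity weight `lam`, and all variable names are illustrative assumptions and do not reproduce the authors' implementation.

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Toy two-layer CNN used only to make the sketch self-contained and runnable."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

G = SmallCNN()   # generator: maps an LDCT image to a denoised image
T = SmallCNN()   # critic T(x) appearing in the f-GAN variational bound
lam = 0.1        # fidelity weight (assumed value)

opt_G = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_T = torch.optim.Adam(T.parameters(), lr=1e-4)

# Unpaired batches: random tensors stand in for LDCT and SDCT patches.
ldct = torch.randn(4, 1, 64, 64)
sdct = torch.randn(4, 1, 64, 64)

def kl_bound(critic, real, fake):
    """Variational lower bound on KL(P_SDCT || P_generated): E_P[T] - E_Q[exp(T - 1)]."""
    return critic(real).mean() - torch.exp(critic(fake) - 1).mean()

# Critic step: tighten the bound (maximize it, i.e. minimize its negative).
fake = G(ldct).detach()
loss_T = -kl_bound(T, sdct, fake)
opt_T.zero_grad(); loss_T.backward(); opt_T.step()

# Generator step: minimize the G-dependent part of the KL bound
# plus lambda times the l2 fidelity to the input LDCT image.
denoised = G(ldct)
adv = -torch.exp(T(denoised) - 1).mean()
fid = ((denoised - ldct) ** 2).mean()
loss_G = adv + lam * fid
opt_G.zero_grad(); loss_G.backward(); opt_G.step()
```

The fidelity term is what the abstract calls "fidelity-embedded": it keeps the denoised output close to the input LDCT image, while the adversarial term pushes the output distribution toward the unpaired SDCT distribution.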

Original language: English
Article number: 2934178
Pages (from-to): 110414-110425
Number of pages: 12
Journal: IEEE Access
Volume: 7
DOIs
Publication status: Published - 2019

All Science Journal Classification (ASJC) codes

  • General Engineering
  • General Materials Science
  • General Computer Science
