Low-Cost and Effective Fault-Tolerance Enhancement Techniques for Emerging Memories-Based Deep Neural Networks

Thai Hoang Nguyen, Muhammad Imran, Jaehyuk Choi, Joon Sung Yang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

Deep Neural Networks (DNNs) have been found to outperform conventional programming approaches in several applications such as computer vision and natural language processing. Efficient hardware architectures for deploying DNNs on edge devices have been actively studied. Emerging memory technologies, with their better scalability, non-volatility, and good read performance, are ideal candidates for DNNs, which are trained once and deployed over many devices. Emerging memories have also been used in DNN accelerators for efficient computation of dot products. However, due to immature manufacturing and limited cell endurance, emerging resistive memories often suffer from reliability issues such as stuck-at faults, which reduce chip yield and pose a challenge to the accuracy of DNNs. Depending on the state, a stuck-at fault may or may not cause an error. The fault tolerance of DNNs can be enhanced by reducing the impact of errors resulting from stuck-at faults. In this work, we introduce simple and lightweight intra-block address remapping and weight encoding techniques to improve the fault tolerance of DNNs. The proposed schemes work effectively at network deployment time while preserving the network organization and the original parameter values. Experimental results on state-of-the-art DNN models indicate that, with a small storage overhead of just 0.98%, the proposed techniques achieve up to 300× stuck-at fault tolerance on the CIFAR-10 dataset and 125× on the ImageNet dataset, compared to baseline DNNs without any fault-tolerance method. By integrating with existing schemes, the proposed techniques can further enhance the fault resilience of DNNs.
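The key observation behind the abstract's remapping idea is that a stuck-at cell corrupts a stored bit only when the intended bit differs from the stuck value, so rearranging which weight lands on which faulty row inside a block can hide many faults. The following Python sketch illustrates that principle only; the block layout, fault-map format, and brute-force permutation search are illustrative assumptions, not the paper's actual algorithm:

```python
# Illustrative toy model (not the paper's scheme): a stuck-at cell only
# corrupts a stored bit when the intended bit differs from the stuck value,
# so remapping weights to rows within a block can avoid many faults.
from itertools import permutations

def mismatches(weights, stuck):
    """Count bits corrupted when weights[r] is stored in row r.
    stuck[r] maps a bit position of row r to its stuck value (0 or 1)."""
    errors = 0
    for row, w in enumerate(weights):
        for bit_pos, stuck_val in stuck[row].items():
            if (w >> bit_pos) & 1 != stuck_val:
                errors += 1
    return errors

def best_intra_block_mapping(weights, stuck):
    """Brute-force the row permutation minimizing corrupted bits.
    Feasible only for tiny blocks; a real scheme would use a cheap
    heuristic and store a few remapping bits per block instead."""
    return list(min(permutations(range(len(weights))),
                    key=lambda p: mismatches([weights[i] for i in p], stuck)))

# A 4-row block of 8-bit weights; row 0 has a cell stuck at 1 in bit 7,
# row 2 has a cell stuck at 0 in bit 0.
weights = [0x12, 0x9F, 0x81, 0x03]
stuck = {0: {7: 1}, 1: {}, 2: {0: 0}, 3: {}}
perm = best_intra_block_mapping(weights, stuck)
remapped = [weights[i] for i in perm]
print(mismatches(weights, stuck), mismatches(remapped, stuck))  # → 2 0
```

In this toy example the original placement hits both faulty cells (two corrupted bits), while a permutation that puts a weight with bit 7 = 1 on row 0 and one with bit 0 = 0 on row 2 eliminates both errors without altering any weight value, matching the abstract's claim that the original parameter values are preserved.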

Original language: English
Title of host publication: 2021 58th ACM/IEEE Design Automation Conference, DAC 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1075-1080
Number of pages: 6
ISBN (Electronic): 9781665432740
DOIs
Publication status: Published - 2021 Dec 5
Event: 58th ACM/IEEE Design Automation Conference, DAC 2021 - San Francisco, United States
Duration: 2021 Dec 5 – 2021 Dec 9

Publication series

Name: Proceedings - Design Automation Conference
Volume: 2021-December
ISSN (Print): 0738-100X

Conference

Conference: 58th ACM/IEEE Design Automation Conference, DAC 2021
Country/Territory: United States
City: San Francisco
Period: 21/12/5 – 21/12/9

Bibliographical note

Publisher Copyright:
© 2021 IEEE.

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Control and Systems Engineering
  • Electrical and Electronic Engineering
  • Modelling and Simulation
