
A ReRAM-Based Convolutional Neural Network Accelerator Using the Analog Layer Normalization Technique

Abstract
This article presents a resistive random access memory (ReRAM)-based convolutional neural network (CNN) accelerator with a new analog layer normalization (ALN) technique. The proposed ALN effectively reduces the effect of conductance variation in ReRAM devices by normalizing the outputs of the vector-matrix multiplication (VMM) in the charge domain. The ALN achieves high energy and hardware efficiency because it processes the VMM outputs directly, without storing their values in memory, and is merged into the neuron circuit of the accelerator. To verify the effect of the ALN experimentally, a VMM accelerator consisting of two 25 × 25 ReRAM arrays and peripheral circuits with ALN is used as a convolution layer, with digital signal processing performed in a field-programmable gate array. The MNIST dataset is used to train and run inference with a CNN employing two VMM accelerators that operate as convolution layers in a pipelined manner. Despite the conductance variation of the ReRAM devices, the ALN successfully stabilizes the output distribution of the convolution layer, which improves the classification accuracy of the network. Final classification accuracies of 96.2% and 83.1% are achieved for the MNIST and Fashion-MNIST datasets, respectively, with an energy efficiency of 9.94 tera-operations per second per watt.
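The core idea in the abstract can be illustrated digitally: a ReRAM crossbar computes a VMM whose effective weights are perturbed by device-to-device conductance variation, and layer normalization of the output vector removes the resulting shifts in mean and scale. The sketch below is a minimal NumPy model, assuming a 25 × 25 array (matching the paper's setup) and a hypothetical log-normal multiplicative variation; it is a software analogue of the charge-domain ALN, not the authors' circuit implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target weight matrix and an input vector (25 x 25, as in the paper's arrays).
W = rng.standard_normal((25, 25))
x = rng.standard_normal(25)

# Assumed model of ReRAM conductance variation: each programmed weight is
# scaled by an independent log-normal factor (sigma is a hypothetical level).
sigma = 0.2
G = W * rng.lognormal(mean=0.0, sigma=sigma, size=W.shape)

y_ideal = W @ x   # VMM with ideal weights
y_noisy = G @ x   # VMM with variation-perturbed conductances


def layer_norm(y, eps=1e-6):
    """Normalize a VMM output vector to zero mean and unit variance."""
    return (y - y.mean()) / np.sqrt(y.var() + eps)


# Regardless of the conductance variation, the normalized output has a
# fixed mean and scale -- the stabilizing effect ALN provides in analog.
y_aln = layer_norm(y_noisy)
print(f"mean = {y_aln.mean():+.2e}, std = {y_aln.std():.4f}")
```

The point of the example is that normalization pins the output distribution of the layer, so downstream layers see statistically consistent activations even when the programmed conductances drift.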
Author(s)
Gi, S.-G.; Lee, H.; Jang, J.; Lee, B.-G.
Issued Date
2023-06
Type
Article
DOI
10.1109/TIE.2022.3190876
URI
https://scholar.gist.ac.kr/handle/local/10155
Publisher
Institute of Electrical and Electronics Engineers Inc.
Citation
IEEE Transactions on Industrial Electronics, v.70, no.6, pp.6442 - 6451
ISSN
0278-0046
Appears in Collections:
Department of Electrical Engineering and Computer Science > 1. Journal Articles
Access and License
  • Access type: Open
File List
  • No associated files exist.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.