A ReRAM-Based Convolutional Neural Network Accelerator Using the Analog Layer Normalization Technique
- Abstract
- This article presents a resistive random access memory (ReRAM)-based convolutional neural network (CNN) accelerator with a new analog layer normalization (ALN) technique. The proposed ALN effectively reduces the effect of conductance variation in ReRAM devices by normalizing the outputs of the vector-matrix multiplication (VMM) in the charge domain. The ALN achieves high energy and hardware efficiency because it processes the VMM outputs directly, without storing their values in memory, and is merged into the neuron circuit of the accelerator. To verify the effect of the ALN experimentally, a VMM accelerator consisting of two 25 × 25 ReRAM arrays and peripheral circuits with ALN is used as a convolution layer, with digital signal processing performed in a field-programmable gate array (FPGA). The MNIST dataset is used to train and run inference on a CNN employing two VMM accelerators that operate as convolution layers in a pipelined manner. Despite the conductance variation of the ReRAM devices, the ALN successfully stabilizes the output distribution of the convolution layer, which improves the classification accuracy of the network. Final classification accuracies of 96.2% and 83.1% are achieved on the MNIST and Fashion-MNIST datasets, respectively, with an energy efficiency of 9.94 tera-operations per second per watt.
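The abstract's core claim, that normalizing VMM outputs suppresses the effect of device conductance variation, can be illustrated with a minimal numerical sketch. The code below is not the paper's circuit: the 25-wide vector size echoes the 25 × 25 array, but the multiplicative log-normal variation model, the 5% sigma, and the function names are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def vmm_with_variation(x, g_ideal, sigma=0.05):
    """VMM whose weight matrix (conductances) is perturbed by a
    multiplicative log-normal device variation (assumed model)."""
    g_actual = g_ideal * rng.lognormal(mean=0.0, sigma=sigma, size=g_ideal.shape)
    return x @ g_actual

def layer_norm(y, eps=1e-6):
    """Normalize a VMM output vector to zero mean and unit variance,
    mimicking in software what ALN does in the charge domain."""
    return (y - y.mean()) / np.sqrt(y.var() + eps)

x = rng.standard_normal(25)           # input vector (25-wide, like one array dimension)
g = rng.standard_normal((25, 25))     # ideal weight/conductance matrix

y_ideal = layer_norm(x @ g)
y_varied = layer_norm(vmm_with_variation(x, g))

# After normalization, both outputs share the same mean/variance statistics,
# so downstream layers see a stable distribution despite the variation.
print(y_ideal.std(), y_varied.std())
```

The point of the sketch is that normalization fixes the first and second moments of the layer output regardless of the perturbed conductances, which is the mechanism the abstract credits for the recovered classification accuracy.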
- Author(s)
- Gi, S.-G.; Lee, H.; Jang, J.; Lee, B.-G.
- Issued Date
- 2023-06
- Type
- Article
- DOI
- 10.1109/TIE.2022.3190876
- URI
- https://scholar.gist.ac.kr/handle/local/10155
- Access &amp; License
-
- File List
-
Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.