Design of a Time-Domain Neuron for RRAM-Based Neuromorphic Hardware with Quantization-Aware Training
- Author(s)
- Jeongmin Lee
- Type
- Thesis
- Degree
- Master
- Department
College of Information and Computing, Department of Electrical Engineering and Computer Science
- Advisor
- Lee, Byung-geun
- Abstract
- This thesis presents a time-domain neuron compatible with quantization-aware training (QAT), specifically Learned Step Size Quantization (LSQ), for resistive random-access memory (RRAM)-based neuromorphic hardware. The proposed neuron maps the learned activation step size to a controllable current source in a voltage-to-time converter (VTC) and performs the ReLU activation function in the time domain using a time-to-digital converter (TDC). A convolutional neural network (CNN) model with 3-bit weights and 4-bit activations, trained via LSQ and implemented using the merged scale factor method, achieves an inference accuracy of 98.73% on the MNIST dataset. Circuit nonidealities are analyzed and quantified through circuit-level simulations, and their impact on system performance is evaluated by incorporating the extracted error sources into a behavioral model of the neuron and performing system-level simulations. Simulation results show that the accuracy degradation remains below 1%, demonstrating the robustness of the proposed neuron design. The proposed neuron is designed in a 0.18-μm CMOS process and verified through circuit-level simulation in Cadence Virtuoso and system-level simulation in Python using the PyTorch framework.
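- Note: As a rough illustration of the LSQ activation quantization described in the abstract, below is a minimal PyTorch sketch assuming unsigned 4-bit activations after ReLU (levels 0 to 15). The class and function names (LsqActQuant, grad_scale, round_ste) are illustrative and not taken from the thesis implementation.

```python
# Minimal LSQ activation quantizer sketch in PyTorch.
# Assumption: unsigned 4-bit activations (0..15), as for ReLU outputs in the abstract.
import torch
import torch.nn as nn

def grad_scale(x, scale):
    # Forward: identity; backward: gradient scaled by `scale` (LSQ step-size gradient scaling).
    return (x - x * scale).detach() + x * scale

def round_ste(x):
    # Rounding with a straight-through estimator (gradient of 1).
    return (x.round() - x).detach() + x

class LsqActQuant(nn.Module):
    def __init__(self, bits=4):
        super().__init__()
        self.q_n, self.q_p = 0, 2 ** bits - 1        # unsigned integer range
        self.step = nn.Parameter(torch.tensor(1.0))  # learned step size s

    def forward(self, x):
        g = 1.0 / (x.numel() * self.q_p) ** 0.5      # LSQ gradient scale factor
        s = grad_scale(self.step, g)
        x = torch.clamp(x / s, self.q_n, self.q_p)   # scale and clip to integer range
        return round_ste(x) * s                      # round, then rescale by s

# Example: quantize a ReLU feature map to 4-bit levels.
quant = LsqActQuant(bits=4)
y = quant(torch.relu(torch.randn(1, 8, 28, 28)))
```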
- URI
- https://scholar.gist.ac.kr/handle/local/31870
- Fulltext
- http://gist.dcollection.net/common/orgView/200000899218
- Access and License
-
- File List
-