
All-Digital Bandwidth Mismatch Calibration of TI-ADCs Based on Optimally Induced Minimization

Abstract
The problem of parameter mismatch in time-interleaved analog-to-digital converters (TI-ADCs) has become a significant concern for guaranteeing output linearity. Several solutions have been presented for offset, gain, time-skew, and bandwidth mismatches, but they can rely on hardware-expensive methods. This article proposes an all-digital calibration algorithm for TI-ADC bandwidth mismatch that detects the optimal correction coefficients for derivative-based digital filters. The analyzed convergence logic further relaxes the hardware requirements. Numerical simulations and experimental results validate the calibration efficiency: a commercial 12-bit 3.6-GS/s two-channel TI-ADC was used to verify the proposed calibration algorithm under real conditions.
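The abstract describes correcting bandwidth mismatch with derivative-based digital filters whose coefficients are optimized to minimize the mismatch spur. The sketch below is only an illustration of that general idea, not the paper's algorithm: it models a two-channel TI-ADC where one channel has a mismatched first-order bandwidth (hypothetical values), applies a single-coefficient derivative correction to that channel, and picks the coefficient by a coarse search that minimizes the interleaving spur near fs/2 − fin.

```python
import numpy as np

n = 4096
t = np.arange(n)
f_in = 0.137                       # input tone, as a fraction of fs
x = np.sin(2 * np.pi * f_in * t)   # ideal sampled sine

def first_order_lp(sig, fc):
    """One-pole low-pass; fc is the bandwidth as a fraction of fs (toy model)."""
    a = np.exp(-2 * np.pi * fc)
    out, acc = np.empty_like(sig), 0.0
    for i, s in enumerate(sig):
        acc = a * acc + (1 - a) * s
        out[i] = acc
    return out

# Channel 1 sees a slightly narrower bandwidth than channel 0 (assumed numbers).
ch0 = first_order_lp(x, 0.40)
ch1 = first_order_lp(x, 0.35)
y = np.where(t % 2 == 0, ch0, ch1)  # interleaved output: even from ch0, odd from ch1

def spur_power(c):
    """Spur magnitude near fs/2 - f_in after correcting odd samples by c * derivative."""
    d = np.gradient(y)               # crude digital derivative
    yc = y.copy()
    yc[1::2] = y[1::2] + c * d[1::2]
    spec = np.abs(np.fft.rfft(yc * np.hanning(n)))
    k = int(round((0.5 - f_in) * n))
    return spec[k - 2:k + 3].max()

# Coarse search for the optimal correction coefficient (the paper derives this
# digitally with a convergence logic; here it is a brute-force stand-in).
cs = np.linspace(-0.5, 0.5, 201)
c_opt = cs[np.argmin([spur_power(c) for c in cs])]
print(f"spur before: {spur_power(0.0):.3f}  after: {spur_power(c_opt):.3f}")
```

Because a single derivative coefficient mostly corrects the phase component of the mismatch, the spur is reduced rather than fully cancelled; the actual method in the article optimizes the full filter coefficients.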
Author(s)
Tavares, Yang Azevedo; Lee, Kang-Yoon; Lee, Minjae
Issued Date
2020-05
Type
Article
DOI
10.1109/TVLSI.2020.2974549
URI
https://scholar.gist.ac.kr/handle/local/12188
Publisher
Institute of Electrical and Electronics Engineers Inc.
Citation
IEEE Transactions on Very Large Scale Integration (VLSI) Systems, v.28, no.5, pp.1175 - 1184
ISSN
1063-8210
Appears in Collections:
Department of Electrical Engineering and Computer Science > 1. Journal Articles
Open Access & License
  • Access: Open
File List
  • No related files exist.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.