OAK

Meta-transfer learning for zero-shot super-resolution

Author(s)
Soh, Jae Woong; Cho, Sunwoo; Cho, Nam Ik
Type
Conference Paper
Citation
2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2020, pp. 3513-3522
Issued Date
2020-06-14
Abstract
Convolutional neural networks (CNNs) have shown dramatic improvements in single image super-resolution (SISR) by using large-scale external samples. Despite their remarkable performance on external datasets, they cannot exploit the internal information within a specific image. Another problem is that they are applicable only to the specific data conditions under which they were supervised; for instance, the low-resolution (LR) image must be a 'bicubic' downsampled, noise-free version of a high-resolution (HR) one. To address both issues, zero-shot super-resolution (ZSSR) has been proposed for flexible internal learning. However, ZSSR requires thousands of gradient updates, i.e., a long inference time. In this paper, we present Meta-Transfer Learning for Zero-Shot Super-Resolution (MZSR), which builds on ZSSR. Specifically, it is based on finding a generic initial parameter that is suitable for internal learning. Thus, we can exploit both external and internal information, where a single gradient update can yield quite considerable results. With our method, the network can quickly adapt to a given image condition. In this respect, our method can be applied to a large spectrum of image conditions with a fast adaptation process. © 2020 IEEE.
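The abstract's core idea — meta-learning a generic initialization from which a single gradient update adapts the network to a new image condition — follows the MAML-style inner/outer-loop recipe. Below is a minimal first-order sketch on a toy family of 1D regression tasks standing in for varying degradation conditions; the model, hyperparameters, and task distribution are illustrative assumptions, not the paper's actual network or training setup:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 0.1, 0.01  # inner/outer learning rates (hypothetical values)

def mse(w, X, y):
    # mean-squared-error loss of a linear model
    return 0.5 * np.mean((X @ w - y) ** 2)

def loss_grad(w, X, y):
    # gradient of the MSE loss w.r.t. the weights
    return X.T @ (X @ w - y) / len(y)

def inner_adapt(w, X, y):
    # the "single gradient update" adaptation the abstract describes
    return w - alpha * loss_grad(w, X, y)

def sample_task():
    # toy stand-in for sampling an image/degradation condition:
    # each task is y = k * x with a task-specific slope k
    k = rng.uniform(0.5, 2.0)
    X = rng.normal(size=(16, 1))
    return X, k * X[:, 0]

# meta-training: learn an initialization w that adapts well in one step
w = np.zeros(1)
for _ in range(2000):
    X, y = sample_task()
    w_adapted = inner_adapt(w, X, y)
    # first-order MAML outer update, evaluated at the adapted weights
    w -= beta * loss_grad(w_adapted, X, y)
```

After meta-training, `w` sits at a point from which one inner step noticeably reduces the loss on an unseen task, mirroring MZSR's fast adaptation at inference time.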
Publisher
IEEE Computer Society
Conference Place
Seattle, US
URI
https://scholar.gist.ac.kr/handle/local/34051
Access and License
  • Access type: Open
File List
  • No associated files exist.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.