
Increasing resolution and accuracy in sub-seasonal forecasting through 3D U-Net: the western US

Author(s)
Ryu, Jihun; Kim, Hisu; Wang, Shih-Yu (Simon); Yoon, Jin-Ho
Type
Article
Citation
GEOSCIENTIFIC MODEL DEVELOPMENT, v.19, no.1, pp.27 - 39
Issued Date
2026-01
Abstract
Sub-seasonal weather forecasting is a major challenge, particularly when high spatial resolution is needed to capture complex patterns and extreme events. Traditional Numerical Weather Prediction (NWP) models struggle with accurate forecasting at finer scales, especially for precipitation. In this study, we investigate the use of a 3D U-Net architecture for post-processing sub-seasonal forecasts to enhance both predictability and spatial resolution, focusing on the western U.S. Using the ECMWF ensemble forecasting system (input) and high-resolution PRISM data (target), we tested different combinations of ensemble members and meteorological variables. Our results demonstrate that the 3D U-Net model significantly improves temperature predictability and consistently outperforms NWP models across multiple metrics. However, challenges remain in accurately forecasting extreme precipitation events, as the model tends to underestimate precipitation in coastal and mountainous regions. While ensemble members contribute to forecast accuracy, their impact is modest compared to the improvements achieved through downscaling. The most efficient model used the ensemble mean and only the target variables; it improved the pattern correlation coefficient for temperature and precipitation by 0.12 and 0.19, respectively, over a 32-day lead time. This study lays the groundwork for further development of neural network-based post-processing methods, showing their potential to enhance weather forecasts at sub-seasonal timescales.
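The abstract reports gains in the pattern correlation coefficient, i.e. the Pearson correlation between a forecast field and the observed field computed over all grid points. A minimal NumPy sketch of that metric (on synthetic fields, not the paper's ECMWF/PRISM data, and the function name is illustrative):

```python
import numpy as np

def pattern_correlation(forecast: np.ndarray, observed: np.ndarray) -> float:
    """Pearson correlation between two spatial fields over all grid points."""
    f = forecast.ravel() - forecast.mean()
    o = observed.ravel() - observed.mean()
    return float(np.dot(f, o) / (np.linalg.norm(f) * np.linalg.norm(o)))

# Toy example on synthetic 2D fields
rng = np.random.default_rng(0)
obs = rng.normal(size=(16, 16))
fc = obs + 0.5 * rng.normal(size=(16, 16))  # noisy "forecast" of the same field
print(pattern_correlation(fc, obs))
```

In the paper's setting this would be evaluated per lead time and averaged over forecast cases; an improvement of 0.12 (temperature) or 0.19 (precipitation) is a change in this coefficient.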
Publisher
COPERNICUS GESELLSCHAFT MBH
ISSN
1991-959X
DOI
10.5194/gmd-19-27-2026
URI
https://scholar.gist.ac.kr/handle/local/33556
Access and License
  • Access type: Open
File List
  • No related files exist.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.