21st AIAI 2025, 26 - 29 June 2025, Limassol, Cyprus

Multi-Model Data Transfer by Knowledge Distillation for Enhancing Precipitation Nowcasting

Wang Qihang, Tomita Tomohiko, Fukui Ken-ichi

Abstract:

  Precipitation nowcasting refers to rapid, high-resolution prediction of rainfall within the next two hours, providing important benefits for areas such as air traffic control and emergency services. Recently, deep learning methods using only radar images have shown promising results for precipitation nowcasting without relying on physical models. However, these methods often overlook additional meteorological information contained in reanalysis data, such as temperature, humidity, and cloud water content, which limits further improvements in prediction accuracy. In this research, we build upon the CNN-based U-Net architecture to integrate radar data with reanalysis data for network training. Since reanalysis data are delayed and cannot be used for real-time forecasts, we apply a knowledge distillation approach to transfer information from a teacher model to a student model that does not require reanalysis data at prediction time. Our experiments show that the distilled student model outperforms the baseline model trained only on radar data in terms of mean squared error (MSE), critical success index (CSI), and power spectral density (PSD), demonstrating the effectiveness of our method in improving forecast accuracy.
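  For illustration only, the teacher-to-student transfer described above can be sketched as a training loss that mixes a ground-truth term with a teacher-matching term. The function name, the weighting factor `alpha`, and the use of plain MSE for both terms are assumptions made for this sketch, not the paper's exact formulation.

```python
import numpy as np

def distillation_loss(student_pred, teacher_pred, target, alpha=0.5):
    # Hypothetical sketch of a distillation objective:
    # "hard" term fits the student to observed radar targets,
    # "soft" term pulls the student toward the reanalysis-informed teacher.
    hard = np.mean((student_pred - target) ** 2)
    soft = np.mean((student_pred - teacher_pred) ** 2)
    # alpha balances the two terms; the paper's actual weighting may differ.
    return alpha * hard + (1.0 - alpha) * soft
```

  At inference time only the student is run, so no reanalysis input is needed; the teacher's influence is baked in during training through the soft term.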
