Journal Article

Kilometer-scale climate models: Prospects and challenges


Leutwyler, David
Precipitating Convection, The Atmosphere in the Earth System, MPI for Meteorology, Max Planck Society

Fulltext (public)

(Publisher version), 5MB


Schär, C., Fuhrer, O., Arteaga, A., Ban, N., Charpilloz, C., Di Girolamo, S., et al. (2020). Kilometer-scale climate models: Prospects and challenges. Bulletin of the American Meteorological Society, 101, E567-E587. doi:10.1175/BAMS-D-18-0167.1.

Cite as: http://hdl.handle.net/21.11116/0000-0006-A293-C
Currently, major efforts are underway toward refining the horizontal resolution (or grid spacing) of climate models to about 1 km, using both global and regional climate models (GCMs and RCMs). Several groups have succeeded in conducting kilometer-scale multiweek GCM simulations and decade-long continental-scale RCM simulations. There is well-founded hope that this increase in resolution represents a quantum jump in climate modeling, as it enables replacing the parameterization of moist convection by an explicit treatment. It is expected that this will improve the simulation of the water cycle and of extreme events, and reduce uncertainties in climate change projections. While kilometer-scale resolution is commonly employed in limited-area numerical weather prediction, enabling it on global scales for extended climate simulations requires a concerted effort. In this paper, we exploit an RCM that runs entirely on graphics processing units (GPUs) and show examples that highlight the prospects of this approach. A particular challenge addressed in this paper relates to the growth in output volumes. It is argued that the data avalanche of high-resolution simulations will make it impractical or impossible to store the data. Rather, repeating the simulation and conducting online analysis will become more efficient. A prototype of this methodology is presented. It makes use of a bit-reproducible model version that ensures reproducible simulations across hardware architectures, in conjunction with a data virtualization layer as a common interface for output analyses. An assessment of the potential of these novel approaches is provided.
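The online-analysis idea sketched in the abstract — compute statistics while the model runs instead of archiving every output field — can be illustrated with a streaming reduction. The sketch below is not from the paper; it uses Welford's algorithm to accumulate a per-gridpoint running mean and variance in constant memory per field, and the generator `simulation_chunks` is a hypothetical stand-in for a model emitting one output field per time step.

```python
import numpy as np

def simulation_chunks(n_steps, shape, seed=0):
    """Hypothetical stand-in for a model run: yields one field per step."""
    rng = np.random.default_rng(seed)
    for _ in range(n_steps):
        yield rng.standard_normal(shape)

def online_mean_var(chunks):
    """Streaming per-gridpoint mean/variance via Welford's algorithm.

    Only the current field plus two accumulators are held in memory,
    so the full time series never needs to be stored.
    """
    count, mean, m2 = 0, None, None
    for field in chunks:
        count += 1
        if mean is None:
            mean = field.astype(float).copy()
            m2 = np.zeros_like(mean)
        else:
            delta = field - mean
            mean += delta / count
            m2 += delta * (field - mean)  # uses the updated mean
    return mean, m2 / (count - 1)  # sample variance

mean, var = online_mean_var(simulation_chunks(100, (4, 4)))
```

Because the reduction commutes with rerunning the model, a bit-reproducible simulation lets any analysis be repeated later against identical fields rather than against archived output.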