Abstract:
Predicting extreme events and variations in weather and climate provides crucial information
for economic, social, and environmental decision-making (Merryfield et al., 2020). However,
quantifying prediction skill for multi-dimensional geospatial model output is computationally
expensive and a difficult coding challenge. The large datasets (order gigabytes to terabytes)
require parallel and out-of-memory computing to be analyzed efficiently. Further, aligning the
many forecast initializations with differing observational products is a conceptually
straightforward but tedious and error-prone exercise for researchers.
To simplify and standardize forecast verification across scales from hourly weather to decadal
climate forecasts, we built climpred: a community-driven Python package for computationally
efficient and methodologically consistent verification of ensemble prediction models. The code
base is maintained through open-source development. It leverages xarray (Hoyer & Hamman,
2017) to anticipate core prediction ensemble dimensions (ensemble member, initialization
date and lead time) and dask (Dask Development Team, 2016; Rocklin, 2015) to perform
out-of-memory and parallelized computations on large datasets.
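As an illustration of these core dimensions, the sketch below builds a small, dask-backed
xarray Dataset laid out along init, lead, and member; the variable name, dimension sizes,
and chunking are illustrative assumptions rather than requirements of climpred.

    # Illustrative sketch: a prediction ensemble along the core dimensions
    # climpred anticipates (init, lead, member), backed by dask chunks so
    # computations can run out of memory. All sizes are arbitrary.
    import numpy as np
    import pandas as pd
    import xarray as xr

    n_init, n_lead, n_member = 20, 10, 5
    hindcast = xr.Dataset(
        {"sst": (("init", "lead", "member"),
                 np.random.rand(n_init, n_lead, n_member))},
        coords={
            "init": pd.date_range("2000-01-01", periods=n_init, freq="YS"),
            "lead": np.arange(1, n_lead + 1),
            "member": np.arange(n_member),
        },
    ).chunk({"init": 10})  # dask-backed, so verification can be parallelized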
climpred aims to offer a comprehensive set of analysis tools for assessing the quality of
dynamical forecasts relative to verification products (e.g., observations, reanalysis products,
control simulations). The package includes a suite of deterministic and probabilistic verifica-
tion metrics that are continually expanded by the community and are largely organized in
our companion package, xskillscore.
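As a hedged sketch of such a verification, the example below computes two deterministic
metrics directly with xskillscore; the forecast and observation arrays and their shared
time dimension are illustrative assumptions, not data shipped with the package.

    # Hedged sketch: deterministic skill metrics computed with xskillscore,
    # the companion package that houses the verification metrics. The
    # forecast and observation arrays here are synthetic placeholders
    # sharing a 'time' dimension.
    import numpy as np
    import xarray as xr
    import xskillscore as xs

    time = np.arange(100)
    forecast = xr.DataArray(np.random.rand(100), coords={"time": time}, dims="time")
    observations = xr.DataArray(np.random.rand(100), coords={"time": time}, dims="time")

    rmse = xs.rmse(forecast, observations, dim="time")       # root-mean-square error
    corr = xs.pearson_r(forecast, observations, dim="time")  # anomaly correlation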