to improve how ensembles are created (experimental design), and to improve how
information in climate model ensembles is combined. The current state-of-the-art for
combining climate model ensembles is based on a Bayesian hierarchical model. This
includes representations of uncertainty that span the sources of uncertainty sampled by
the ensemble (although these ensembles do not constitute a random sample in the sense of
elementary statistics; indeed, the sample space of climate models is often not even well
defined). However, climate model output is complex and highly multivariate, and there
are still many opportunities for research, including spatial and spatial-temporal modeling
of non-stationary processes, theory and methodology for extremes, and, of course,
ensuring such methods are scalable to the size of the datasets that will be generated by
the next-generation climate models (statistical computing).
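
To make the hierarchical idea concrete, the following sketch (in Python) combines a
small, hypothetical ensemble of scalar warming projections using a textbook one-way
hierarchical model and a Gibbs sampler. The ensemble values, the assumed per-model
variances, and the priors are invented for illustration; a realistic analysis would be
spatial, multivariate, and far larger, but the conditional updates have the same basic
structure.

    # A minimal sketch of a Bayesian hierarchical model for combining an ensemble
    # of climate projections.  Everything here is illustrative: the ensemble values,
    # the per-model variances, and the priors are invented, and a real analysis
    # would be spatial and far richer.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical ensemble: model m reports a regional warming estimate y[m] (deg C)
    # with a known internal-variability variance sigma2[m].
    y = np.array([2.1, 2.8, 3.4, 2.5, 3.0])
    sigma2 = np.array([0.30, 0.25, 0.40, 0.20, 0.35])
    M = len(y)

    # Hierarchy: y[m] ~ N(theta[m], sigma2[m]),  theta[m] ~ N(mu, tau2),
    # a flat prior on mu and an inverse-gamma(a0, b0) prior on tau2.
    a0, b0 = 2.0, 1.0

    n_iter, burn = 5000, 1000
    theta, mu, tau2 = y.copy(), y.mean(), y.var()
    mu_draws = []

    for it in range(n_iter):
        # Model-specific "true" signals theta[m]: conjugate normal update.
        prec = 1.0 / sigma2 + 1.0 / tau2
        theta = rng.normal((y / sigma2 + mu / tau2) / prec, np.sqrt(1.0 / prec))

        # Consensus signal mu: normal update under the flat prior.
        mu = rng.normal(theta.mean(), np.sqrt(tau2 / M))

        # Between-model variance tau2: conjugate inverse-gamma update.
        tau2 = 1.0 / rng.gamma(a0 + 0.5 * M,
                               1.0 / (b0 + 0.5 * np.sum((theta - mu) ** 2)))

        if it >= burn:
            mu_draws.append(mu)

    mu_draws = np.array(mu_draws)
    print(f"combined warming estimate: {mu_draws.mean():.2f} deg C, 95% interval "
          f"({np.quantile(mu_draws, 0.025):.2f}, {np.quantile(mu_draws, 0.975):.2f})")

Here the posterior for mu plays the role of the combined projection, while tau2 captures
the between-model spread; in practice both would vary in space and time.
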
Characterizing the uncertainty in climate model ensembles is just the beginning.
Often the spatial and temporal scales that are native to global climate models are too
coarse to use in various applications needed for impact studies. Downscaling refers to
the growing body of work that uses the coarse-scale information in global climate models
to produce regional and local climate information. Dynamic downscaling uses high-
resolution climate models, often by forcing a regional climate model over a limited spatial
domain with boundary conditions provided by a global model. Statistical downscaling
instead relies on empirical relationships between coarse-scale model output and locally
observed variables. So-called stochastic weather generators, which simulate synthetic
local weather sequences with prescribed statistical properties, are yet another
alternative. Each of these approaches has strengths and weaknesses, but there are
opportunities for the mathematical sciences to contribute to new downscaling methods
that also allow for the propagation of uncertainty. These new methods will almost surely
require new tools for spatial and spatial-temporal modeling, data fusion, data
assimilation, and other methods that incorporate deterministic and statistical models.
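
As a simple illustration of the statistical route, the sketch below fits an empirical
(linear) relationship between a synthetic coarse-scale temperature series and a synthetic
local station record, applies it to hypothetical future coarse-scale projections, and uses
a bootstrap with residual resampling to carry some of the uncertainty forward. All series
and parameters are invented; real statistical downscaling would use many predictors, guard
against non-stationarity of the fitted relationship, and treat extremes explicitly.

    # A minimal sketch of regression-based statistical downscaling with a bootstrap
    # to carry uncertainty forward.  The "observations" and "climate model" series
    # are synthetic stand-ins; real work would use many predictors, account for
    # non-stationarity, and treat extremes with care.
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic calibration data: coarse grid-cell temperature vs. local station
    # temperature over an observed period (deg C).
    coarse_hist = rng.normal(15.0, 3.0, size=300)
    station_hist = 0.8 * coarse_hist + 2.0 + rng.normal(0.0, 1.0, size=300)

    # Hypothetical future coarse-scale projections from a global climate model.
    coarse_future = rng.normal(18.0, 3.0, size=100)

    def downscale_once(xc, ys, xf, rng):
        """Fit the empirical relationship on one bootstrap resample and apply it,
        adding back resampled residual noise so local variability is not lost."""
        idx = rng.integers(0, len(xc), size=len(xc))
        slope, intercept = np.polyfit(xc[idx], ys[idx], 1)
        resid = ys[idx] - (slope * xc[idx] + intercept)
        return slope * xf + intercept + rng.choice(resid, size=len(xf), replace=True)

    # Ensemble of downscaled futures: the spread reflects parameter and residual
    # uncertainty (though not, in this toy version, climate-model uncertainty itself).
    draws = np.array([downscale_once(coarse_hist, station_hist, coarse_future, rng)
                      for _ in range(500)])

    means = draws.mean(axis=1)
    print(f"downscaled future mean: {means.mean():.2f} deg C, 95% interval "
          f"({np.quantile(means, 0.025):.2f}, {np.quantile(means, 0.975):.2f})")
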
Understanding how a natural system responds to climate and climate change
typically begins by linking the natural system to weather phenomena (e.g., linking a
public health endpoint to heat stress, linking mosquito life cycles and effective ranges to
temperature and precipitation, linking animal migration to seasonal cycles, or linking the
response of crops, grasslands, and forests to meteorology). While there is a growing
body of mathematical and statistical modeling central to these efforts, there is more work
to be done in expanding current mathematical models and developing new models.
Again, the common theme of uncertainty and characterizing uncertainty in such
mathematical models is crucial, in particular the difficult problem of propagating the
uncertainty in the meteorological inputs (i.e., weather) through these models, especially
when changes to these inputs are informed by climate models. Another emerging area
connected to the response of a natural system to a changing climate is adaptation, and
there are opportunities for new mathematical frameworks or modeling strategies to
better explore whether and how a natural system can adapt.
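
The sketch below illustrates the propagation problem in its simplest Monte Carlo form:
draws from an (assumed) distribution of projected warming are pushed through a toy impact
model, here a growing-degree-day calculation on synthetic daily temperatures, to produce a
distribution of impacts rather than a single number. The impact function, the baseline
climatology, and the warming distribution are all hypothetical placeholders for a
calibrated natural-system model.

    # A minimal sketch of propagating climate-model uncertainty through an impact
    # model by Monte Carlo.  The impact function (growing degree days), the baseline
    # climatology, and the distribution of projected warming are hypothetical
    # placeholders for a calibrated natural-system model.
    import numpy as np

    rng = np.random.default_rng(2)

    def growing_degree_days(daily_tmean, base=10.0):
        """Toy impact model: accumulated degrees above a base temperature."""
        return np.sum(np.maximum(daily_tmean - base, 0.0))

    def simulate_season(warming, n_days=180):
        """Synthetic daily mean temperatures for one growing season, shifted by a
        climate-model-informed warming amount (deg C)."""
        doy = np.arange(n_days)
        seasonal_cycle = 15.0 + 8.0 * np.sin(np.pi * doy / n_days)
        return seasonal_cycle + rng.normal(0.0, 2.5, size=n_days) + warming

    # Uncertain projected warming, e.g. as summarized by an ensemble analysis like
    # the hierarchical model sketched above (mean 2.7, sd 0.6 deg C; illustrative).
    warming_draws = rng.normal(2.7, 0.6, size=2000)

    # Push each draw of the meteorological input through the impact model.
    gdd = np.array([growing_degree_days(simulate_season(w)) for w in warming_draws])
    baseline = np.mean([growing_degree_days(simulate_season(0.0)) for _ in range(2000)])

    change = gdd - baseline
    print(f"change in growing degree days: {change.mean():.0f}, 95% interval "
          f"({np.quantile(change, 0.025):.0f}, {np.quantile(change, 0.975):.0f})")
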
Many of these analyses and modeling efforts that incorporate mathematical
and/or statistical models of a natural system and that system's response to climate and
climate change are being used for decision making at various levels. New tools for
visualizing uncertainty from the analysis of complex systems are required to help inform
policy makers. While decision making under uncertainty has a long history in statistical
science and other fields, this is something of a new era, one in which the choices
themselves are more ambiguous. Decision making under ambiguity or deep uncertainty requires new
mathematical frameworks or even new paradigms, including recognizing the potential of
