Informing BC Stakeholders

Publications Library

  • Source Publication: The Journal of Open Source Software, 3, 22, 360, doi:10.21105/joss.00360 Authors: Hiebert, J., A. Cannon, A. Schoeneberg, S. Sobie and T. Murdock Publication Date: Feb 2018

    The ClimDown R package publishes the routines and techniques of the Pacific Climate Impacts Consortium (PCIC) for downscaling coarse-scale Global Climate Model (GCM) output to fine-scale spatial resolution. PCIC’s overall downscaling algorithm is named Bias-corrected constructed analogues with quantile mapping reordering (BCCAQ) (Cannon, Sobie, and Murdock 2015; Werner and Cannon 2016). BCCAQ is a hybrid downscaling method that combines outputs from Constructed Analogues (CA) (Maurer et al. 2010) and quantile mapping at the fine-scale resolution. First, the CA and Climate Imprint (CI) (Hunter and Meentemeyer 2005) plus quantile delta mapping (QDM) (Cannon, Sobie, and Murdock 2015) algorithms are run independently. BCCAQ then combines outputs from the two by taking the daily QDM outputs at each fine-scale grid point and reordering them within a given month according to the daily CA ranks, i.e., using a form of Empirical Copula Coupling (Schefzik, Thorarinsdottir, and Gneiting 2013). The package exports high-level wrapper functions that perform each of the three downscaling steps: CI, CA, and QDM, as well as one wrapper that runs the entire BCCAQ pipeline.
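
    The rank-reordering step at the heart of BCCAQ can be illustrated with a short sketch (in Python rather than the package's R, and not the ClimDown API itself): within a month at one fine-scale grid point, the sorted daily QDM values supply the marginal distribution, while the CA daily ranks supply the day-to-day sequencing.

```python
import numpy as np

def reorder_by_ranks(qdm_values, ca_values):
    """Empirical-Copula-Coupling-style reordering: the QDM values give
    the marginal distribution, the CA values give the rank structure."""
    qdm_sorted = np.sort(qdm_values)
    ca_ranks = np.argsort(np.argsort(ca_values))  # 0-based rank of each CA day
    return qdm_sorted[ca_ranks]

# toy month of daily values at one fine-scale grid point
qdm = np.array([3.0, 0.1, 7.5, 1.2, 0.0])
ca = np.array([0.5, 2.0, 0.2, 5.0, 1.0])
out = reorder_by_ranks(qdm, ca)
```

    The output preserves the QDM distribution exactly while placing the largest value on the day the CA field ranks highest.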

  • Source Publication: International Journal of Climatology, 38, 2, 1041-1059, doi:10.1002/joc.5235 Authors: Pingree-Shippee, K., F.W. Zwiers and D. Atkinson Publication Date: Feb 2018

    Extratropical cyclones often produce extreme and hazardous weather conditions, such as high winds, heavy precipitation, blizzard conditions, and flooding, all of which have detrimental environmental/physical and socio‐economic impacts. Furthermore, storm interaction with the ocean produces additional hazards, with major local impacts, including inundation and coastal erosion. The North American west coast is influenced by the North Pacific storm track and by ‘atmospheric river’ events while the east coast is particularly influenced by winter storms that track along two favoured routes: the St. Lawrence Valley and the Eastern Seaboard. Reanalysis provides an invaluable tool for studying the characteristics of storm events that are identified as causing the most severe impacts. However, reanalysis products differ substantially in spatial resolution, model physics, assimilation approach, and the data that are assimilated. This study evaluates the representation of storm activity along the mid‐latitude North American coastlines by six global reanalyses: NCEP‐1, NCEP‐2, ERA‐Interim, Modern‐Era Retrospective analysis for Research and Applications (MERRA), Climate Forecast System Reanalysis (CFSR), and Twentieth Century Reanalysis Version 2 (20CR). Storm activity representation is evaluated at annual and seasonal timescales (JFM, AMJ, JAS, OND, and ‘extended winter’ ONDFM) during the 1979–2010 time period through comparison with selected meteorological stations using single‐point surface pressure‐based proxies of extratropical storm activity. Stations are selected on the basis of record length, reporting frequency, coastal proximity, and relatively uniform spatial distribution. Comparisons are made using data extracted from the reanalysis grid box centre that is closest to each selected station. All reanalyses are found to successfully represent most aspects of mid‐latitude North American coastal strong storm activity, annually and seasonally, along both coasts. Nevertheless, ERA‐Interim, MERRA, and CFSR provide better representations of mid‐latitude North American coastal strong storm activity, with ERA‐Interim performing best overall.

  • Source Publication: Journal of Climate, doi:10.1175/JCLI-D-16-0752.1 Authors: Naveau, P., A. Ribes, F.W. Zwiers, A. Hannart, A. Tuel and P. Yiou Publication Date: Jan 2018

    Both climate and statistical models play an essential role in the process of demonstrating that the distribution of some atmospheric variable has changed over time and in establishing the most likely causes for the detected change. One statistical difficulty in the research field of Detection and Attribution resides in defining events that can be easily compared and accurately inferred from reasonable sample sizes. As many impact studies focus on extreme events, the inference of small probabilities and the computation of their associated uncertainties quickly become challenging. In the particular context of event attribution, we address the question of how to compare records between the so-called world as “it might have been without anthropogenic forcings” and the “world that is”. Records are often the most important events in terms of impact and get much media attention. We will show how to efficiently estimate the ratio of two small probabilities of records. The inferential gain is particularly substantial when a simple hypothesis testing procedure is implemented. The theoretical justification of such a proposed scheme can be found in Extreme Value Theory. To illustrate our approach, classical indicators in event attribution studies, like the Risk Ratio or the Fraction of Attributable Risk, are modified and tailored to handle records. We illustrate the advantages of our method through theoretical results, simulation studies, temperature records in Paris and outputs from a numerical climate model.
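
    A useful baseline behind this framing: under stationarity (exchangeable observations), the probability that the next observation breaks the record of the previous n is exactly 1/(n+1), so record probabilities in the factual world can be benchmarked against it. A minimal sketch of that idea (illustrative only, not the authors' estimator):

```python
def stationary_record_prob(n):
    """P(next observation exceeds the running record of n previous
    observations) under exchangeability: each of the n+1 values is
    equally likely to be the largest."""
    return 1.0 / (n + 1)

def record_risk_ratio(p_factual, n):
    """Ratio of the record probability in the 'world that is' to the
    stationary baseline -- a record-tailored analogue of the Risk Ratio."""
    return p_factual / stationary_record_prob(n)
```

    For example, if a record that should occur with probability 1/10 under stationarity is estimated to occur with probability 0.3 in the factual world, the implied ratio is 3.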

  • Source Publication: Nature Scientific Reports, 8, 1007, doi:10.1038/s41598-018-19288- Authors: Li, C., Y. Fang, K. Caldeira, X. Zhang, N.S. Diffenbaugh, and A.M. Michalak Publication Date: Jan 2018

    A critical question for climate mitigation and adaptation is to understand when and where the signal of changes to climate extremes has persistently emerged or will emerge from the background noise of climate variability. Here we show observational evidence that such persistent changes to temperature extremes have already occurred over large parts of the Earth. We further show that climate models forced with natural and anthropogenic historical forcings underestimate these changes. In particular, persistent changes have emerged in observations earlier and over a larger spatial extent than predicted by models. The delayed emergence in the models is linked to a combination of simulated change (‘signal’) that is weaker than observed, and simulated variability (‘noise’) that is greater than observed. Over regions where persistent changes had not occurred by the year 2000, we find that most of the observed signal-to-noise ratios lie within the 16–84% range of those simulated. Examination of simulations with and without anthropogenic forcings provides evidence that the observed changes are more likely to be anthropogenic than natural in origin. Our findings suggest that further changes to temperature extremes over parts of the Earth are likely to occur earlier than projected by the current climate models.

  • Source Publication: Weather and Climate Extremes, 18, 65-74, doi:10.1016/j.wace.2017.10.003 Authors: Sillmann, J., T.L. Thoranisdottir, N. Schaller, L. Alexander, G.C. Hegerl, S.I. Seneviratne, R. Vautard, X. Zhang and F.W. Zwiers Publication Date: Dec 2017

    Weather and climate extremes are identified as major areas necessitating further progress in climate research and have thus been selected as one of the World Climate Research Programme (WCRP) Grand Challenges. Here, we provide an overview of current challenges and opportunities for scientific progress and cross-community collaboration on the topic of understanding, modeling and predicting extreme events based on an expert workshop organized as part of the implementation of the WCRP Grand Challenge on Weather and Climate Extremes. In general, the development of an extreme event depends on a favorable initial state, the presence of large-scale drivers, and positive local feedbacks, as well as stochastic processes. We, therefore, elaborate on the scientific challenges related to large-scale drivers and local-to-regional feedback processes leading to extreme events. A better understanding of the drivers and processes will improve the prediction of extremes and will support process-based evaluation of the representation of weather and climate extremes in climate model simulations. Further, we discuss how to address these challenges by focusing on short-duration (less than three days) and long-duration (weeks to months) extreme events, their underlying mechanisms and approaches for their evaluation and prediction.

  • Source Publication: Geophysical Research Letters, 44, 21, 11012-11020, doi:10.1002/2017GL075016 Authors: Najafi, M.R., F.W. Zwiers and N.P. Gillett Publication Date: Oct 2017

    We study the observed decline in summer streamflow in four key river basins in British Columbia (BC), Canada, using a formal detection and attribution (D&A) analysis procedure. Reconstructed and simulated streamflow is generated using the semidistributed variable infiltration capacity hydrologic model, which is driven by 1/16° gridded observations and downscaled climate model data from the Coupled Model Intercomparison Project phase 5 (CMIP5), respectively. The internal variability of the regional hydrologic components was characterized using ~5100 years of streamflow simulated from CMIP5 preindustrial control runs. Results show that the observed changes in summer streamflow are inconsistent with simulations representing the responses to natural forcing factors alone, while the response to anthropogenic and natural forcing factors combined is detected in these changes. A two‐signal D&A analysis indicates that the effects of anthropogenic (ANT) forcing factors are discernable from natural forcing in BC, albeit with large uncertainties.

  • Source Publication: Climatic Change, 145, 289–303, doi:10.1007/s10584-017-2098-6 Authors: Shrestha, R., A.J. Cannon, M.A. Schnorbus and F.W. Zwiers Publication Date: Oct 2017

    We describe an efficient and flexible statistical modeling framework for projecting nonstationary streamflow extremes for the Fraser River basin in Canada, which is dominated by a nival flow regime. The framework is based on an extreme value analysis technique that allows for nonstationarity in annual extreme streamflow by relating it to antecedent winter and spring precipitation and temperature. We used a representative suite of existing Variable Infiltration Capacity hydrologic model simulations driven by Coupled Model Intercomparison Project Phase 3 (CMIP3) climate simulations to train and evaluate a nonlinear and nonstationary extreme value model of annual extreme streamflow. The model was subsequently used to project changes under CMIP5-based climate change scenarios. Using this combination of process-based and statistical modeling, we project that the moderate (e.g., 2–20-year return period) extreme streamflow events will decrease in intensity. In contrast, projections of high intensity events (e.g., 100–200-year return period), which reflect complex interactions between temperature and precipitation changes, are inconclusive. The results provide a basis for developing a general understanding of future changes in streamflow extremes in nival basins, and, through careful consideration and adoption of appropriate covariates, the methodology could be employed for basins spanning a range of hydro-climatological regimes.
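
    The kind of covariate-dependent extreme value model described above can be sketched as follows (a Gumbel likelihood, i.e. a GEV with zero shape, fit by maximum likelihood for brevity; the covariate x stands in for something like antecedent precipitation, and the data are synthetic):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# toy annual-maximum-like values whose location shifts with a covariate x
x = np.linspace(0.0, 1.0, 200)
y = rng.gumbel(loc=10.0 + 5.0 * x, scale=1.0)

def nll(params):
    """Negative log-likelihood of a Gumbel model with linearly
    covarying location mu(x) = a + b*x (a full GEV adds a shape term)."""
    a, b, log_s = params
    s = np.exp(log_s)
    z = (y - (a + b * x)) / s
    return np.sum(np.log(s) + z + np.exp(-z))

fit = minimize(nll, x0=[np.mean(y), 0.0, 0.0], method="Nelder-Mead")
a_hat, b_hat, log_s_hat = fit.x
```

    A significantly nonzero b_hat is the nonstationary signal; return levels then depend on the covariate value at which they are evaluated.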

  • Source Publication: Earth's Future, doi:10.1002/2017EF000639 Authors: Li, C., X. Zhang, F.W. Zwiers, Y. Fang and A.M. Michalak Publication Date: Oct 2017

    Wet bulb globe temperature (WBGT) accounts for the effect of environmental temperature and humidity on thermal comfort, and can be directly related to the ability of the human body to dissipate excess metabolic heat and thus avoid heat stress. Using WBGT as a measure of environmental conditions conducive to heat stress, we show that anthropogenic influence has very substantially increased the likelihood of extreme high summer mean WBGT in northern hemispheric land areas relative to the climate that would have prevailed in the absence of anthropogenic forcing. We estimate that the likelihood of summer mean WBGT exceeding the observed historical record value has increased by a factor of at least 70 at regional scales due to anthropogenic influence on the climate. We further estimate that, in most northern hemispheric regions, these changes in the likelihood of extreme summer mean WBGT are roughly an order of magnitude larger than the corresponding changes in the likelihood of extreme hot summers as simply measured by surface air temperature. Projections of future summer mean WBGT under the RCP8.5 emissions scenario that are constrained by observations indicate that by the 2030s at least 50% of the summers will have mean WBGT higher than the observed historical record value in all the analyzed regions, and that this frequency of occurrence will increase to 95% by mid‐century.
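
    The abstract does not specify the WBGT formulation used; for orientation, a common simplified outdoor approximation (the Australian Bureau of Meteorology form, with vapour pressure from a Magnus-type formula) shows how humidity enters alongside temperature. This is illustrative only, not necessarily the study's formulation:

```python
import math

def simplified_wbgt(temp_c, rh_pct):
    """Simplified outdoor WBGT approximation: e is water vapour
    pressure in hPa derived from air temperature (deg C) and relative
    humidity (%). Illustrative sketch, not the study's formulation."""
    e = (rh_pct / 100.0) * 6.105 * math.exp(17.27 * temp_c / (237.7 + temp_c))
    return 0.567 * temp_c + 0.393 * e + 3.94
```

    At the same 30 °C air temperature, raising relative humidity from 30% to 70% raises this WBGT estimate by several degrees, which is why WBGT-based extremes shift faster than air-temperature extremes in a moistening climate.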

  • Source Publication: Water Resources Research, 53, 8366–8382, doi:10.1002/2017WR020596 Authors: Bonnet, R., J. Boé, G. Dayon and E. Martin Publication Date: Oct 2017

    Characterizing and understanding the multidecadal variations of the continental hydrological cycle is a challenging issue given the limitation of observed data sets. In this paper, a new approach to derive twentieth century hydrological reconstructions over France with a hydrological model is presented. The method combines the results of long-term atmospheric reanalyses downscaled with a stochastic statistical method and homogenized station observations to derive the meteorological forcing needed for hydrological modeling. Different methodological choices are tested and evaluated. We show that using homogenized observations to constrain the results of statistical downscaling helps to improve the reproduction of precipitation, temperature, and river flow variability. In particular, it corrects some unrealistic long-term trends associated with the atmospheric reanalyses. Observationally constrained reconstructions therefore constitute a valuable data set to study the multidecadal hydrological variations over France. Thanks to these reconstructions, we confirm that the multidecadal variations previously noted in French river flows have mainly a climatic origin. Moreover, we show that multidecadal variations exist in other hydrological variables (evapotranspiration, snow cover, and soil moisture). Depending on the region, the persistence from spring to summer of soil moisture or snow anomalies generated during spring by temperature and precipitation variations may explain river flow variations in summer, when no concomitant climate variations exist.

  • Source Publication: The Cryosphere, doi:10.5194/tc-2017-157 Authors: Kushner, P.J., et al. (F.W. Zwiers 24th co-author) Publication Date: Sep 2017

    This study assesses the ability of the Canadian Seasonal to Interannual Prediction System (CanSIPS) and the Canadian Earth-system Model 2 (CanESM2) to predict and simulate snow and sea ice from seasonal to multi-decadal timescales, with a focus on the Canadian sector. To account for observational uncertainty, model structural uncertainty, and internal climate variability, the analysis uses multi-source observations, multiple Earth-System Models (ESMs) in Phase 5 of the Coupled Model Intercomparison Project (CMIP5) archive, and initial condition ensembles of CanESM2 and other models. It is found that the ability of the CanESM2 simulation to capture snow-related climate parameters, such as cold-region temperature and precipitation, lies within the range of currently available international models. Accounting for the considerable disagreement among satellite-era observational datasets on the distribution of snow water equivalent, CanESM2 has too much springtime snow cover over the Canadian land mass, reflecting a broader Northern Hemisphere positive bias. It also exhibits retreat of springtime snow generally greater than observational estimates, after accounting for observational uncertainty and internal variability. Sea ice is biased low in the Canadian Arctic, which makes it difficult to assess the realism of long-term sea-ice trends there. The strengths and weaknesses of the modeling system need to be understood as a practical tradeoff: the Canadian models are relatively inexpensive computationally because of their moderate resolution, thus enabling their use in operational seasonal prediction and for generating large ensembles of multidecadal simulations. Improvements in climate prediction systems like CanSIPS rely not just on simulation quality but also on using novel observational constraints and the ready transfer of research to an operational setting. Improvements in seasonal forecasting practice arising from recent research include accurate initialization of snow and frozen soil, accounting for observational uncertainty in forecast verification, and sea-ice thickness initialization using statistical predictors available in real time.

  • Source Publication: Hydrology and Earth System Sciences, doi:10.5194/hess-2017-531 Authors: Curry, C.L. and F.W. Zwiers Publication Date: Sep 2017

    The Fraser River basin (FRB) of British Columbia is one of the largest and most important watersheds in Western North America, and is home to a rich diversity of biological species and economic assets that depend implicitly upon its extensive riverine habitats. The hydrology of the FRB is dominated by snow accumulation and melt processes, leading to a prominent annual peak streamflow invariably occurring in June–July. However, while annual peak daily streamflow (APF) during the spring freshet in the FRB is historically well correlated with basin-averaged, April 1 snow water equivalent (SWE), there are numerous occurrences of anomalously large APF in below- or near-normal SWE years, some of which have resulted in damaging floods in the region. An imperfect understanding of which other climatic factors contribute to these anomalously large APFs hinders robust projections of their magnitude and frequency.

    We employ the Variable Infiltration Capacity (VIC) process-based hydrological model driven by gridded observations to investigate the key controlling factors of anomalous APF events in the FRB and four of its subbasins that contribute more than 70 % of the annual flow at Fraser-Hope. The relative influence of a set of predictors characterizing the interannual variability of rainfall, snowfall, snowpack (characterized by the annual maximum value, SWEmax), soil moisture and temperature on simulated APF at Hope (the main outlet of the FRB) and at the subbasin outlets is examined within a regression framework. The influence of large-scale climate modes of variability (the Pacific Decadal Oscillation (PDO) and the El Niño-Southern Oscillation (ENSO)) on APF magnitude is also assessed, and placed in context with these more localized controls. The results indicate that next to SWEmax (which strongly controls the annual maximum of soil moisture), the snowmelt rate, the ENSO and PDO indices, and rate of warming subsequent to the date of SWEmax are the most influential predictors of APF magnitude in the FRB and its subbasins. The identification of these controls on annual peak flows in the region may be of use in the context of seasonal prediction or future projected streamflow behaviour.

  • Source Publication: Climatic Change, 144, 143-150, doi:10.1007/s10584-017-2049-2 Authors: Stott, P.A., D.J. Karoly and F.W. Zwiers Publication Date: Aug 2017

    The science of event attribution meets a mounting demand for reliable and timely information about the links between climate change and individual extreme events. Studies have estimated the contribution of human-induced climate change to the magnitude of an event as well as its likelihood, and many types of event have been investigated including heatwaves, floods, and droughts. Despite this progress, such approaches have been criticised for being unreliable and for being overly conservative. We argue that such criticisms are misplaced. Rather, a false dichotomy has arisen between “conventional” approaches and new alternative framings. We have three points to make about the choice of statistical paradigm for event attribution studies. First, different approaches to event attribution may choose to occupy different places on the conditioning spectrum. Providing this choice of conditioning is communicated clearly, the value of such choices depends ultimately on their utility to the user concerned. Second, event attribution is an estimation problem for which either frequentist or Bayesian paradigms can be used. Third, for hypothesis testing, the choice of null hypothesis is context specific. Thus, the null hypothesis of human influence is not inherently a preferable alternative to the usual null hypothesis of no human influence.

  • Authors: Megan C. Kirchmeier-Young, Francis W. Zwiers, Nathan P. Gillett and Alex J. Cannon Publication Date: Jul 2017

    Canada is expected to see an increase in fire risk under future climate projections. Large fires, such as that near Fort McMurray, Alberta in 2016, can be devastating to the communities affected. Understanding the role of human emissions in the occurrence of such extreme fire events can lend insight into how these events might change in the future. An event attribution framework is used to quantify the influence of anthropogenic forcings on extreme fire risk in the current climate of a western Canada region. Fourteen metrics from the Canadian Forest Fire Danger Rating System are used to define the extreme fire seasons. For the majority of these metrics and during the current decade, the combined effect of anthropogenic and natural forcing is estimated to have made extreme fire risk events in the region 1.5 to 6 times as likely compared to a climate that would have been with natural forcings alone.
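
    The "1.5 to 6 times as likely" statements come from a probability ratio between the all-forcings and natural-only worlds; a minimal sketch of that calculation from ensemble exceedance counts (illustrative, not the authors' code):

```python
def risk_ratio(k_all, n_all, k_nat, n_nat):
    """Probability ratio: fraction of all-forcings ensemble seasons
    exceeding an extreme fire-risk threshold, divided by the same
    fraction in the natural-only ensemble."""
    return (k_all / n_all) / (k_nat / n_nat)

def attributable_fraction(rr):
    """Fraction of Attributable Risk implied by a risk ratio."""
    return 1.0 - 1.0 / rr
```

    With 30 exceedances in 1000 all-forcings seasons against 10 in 1000 natural-only seasons, the event is 3 times as likely; a risk ratio of 4 corresponds to an attributable fraction of 0.75.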

  • Source Publication: 56, 6, 1625–1641, doi:10.1175/JAMC-D-16-0287.1 Authors: Sobie, S.R. and T.Q. Murdock Publication Date: Jun 2017

    High-resolution daily climatological parameters are frequently sought for increasingly local climate change assessments. This research investigates whether applying a simple postprocessing methodology to existing statistically downscaled temperature and precipitation fields can result in improved downscaled simulations useful at the local scale. Initial downscaled daily simulations of temperature and precipitation at 10-km resolution are produced using bias correction constructed analogs with quantile mapping (BCCAQ). Higher-resolution (800 m) values are then generated using the simpler climate imprint technique in conjunction with temperature and precipitation climatologies from the Parameter-Elevation Regression on Independent Slopes Model (PRISM). The potential benefit of additional downscaling to 800 m is evaluated using the “Climdex” set of 27 indices of extremes established by the Expert Team on Climate Change Detection and Indices (ETCCDI). These indices are also calculated from weather station observations recorded at 22 locations within southwestern British Columbia, Canada, to evaluate the performance of both the 10-km and 800-m datasets in replicating the observed quantities. In a 30-yr historical evaluation period, Climdex indices computed from 800-m simulated values display reduced error relative to local station observations compared with those from the 10-km dataset, with the greatest reduction in error occurring at high-elevation sites for precipitation-based indices.
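
    The climate imprint step described above amounts to replicating coarse-grid anomalies onto a high-resolution climatology. A toy sketch (for temperature, with simple block replication standing in for whatever interpolation the study actually uses):

```python
import numpy as np

def climate_imprint(coarse_daily, coarse_clim, fine_clim, zoom):
    """Climate imprint sketch: the daily anomaly of the coarse field
    from its own climatology is replicated onto each fine cell and
    added to the high-resolution (e.g. PRISM) climatology."""
    anomaly = coarse_daily - coarse_clim                     # coarse-grid anomaly
    fine_anomaly = np.kron(anomaly, np.ones((zoom, zoom)))   # nearest-neighbour refine
    return fine_clim + fine_anomaly
```

    The fine-scale spatial detail thus comes entirely from the high-resolution climatology, while the day-to-day variability comes from the coarse field.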

  • Source Publication: The Cryosphere, doi:10.5194/tc-2017-56 Authors: Snauffer, A., W. Hsieh, A. Cannon, and M. Schnorbus Publication Date: Jun 2017

    Estimates of surface snow water equivalent (SWE) in alpine regions with seasonal melts are particularly difficult in areas of high vegetation density, topographic relief and snow accumulations. These three confounding factors dominate much of the province of British Columbia (BC), Canada. An artificial neural network (ANN) was created using as predictors six gridded SWE products previously evaluated for BC: ERA-Interim/Land, GLDAS-2, MERRA, MERRA-Land, GlobSnow and ERA-Interim. Relevant spatiotemporal covariates including survey date, year, latitude, longitude, elevation and grid cell elevation differences were also included as predictors, and observations from manual snow surveys at stations located throughout BC were used as target data. Mean absolute errors (MAEs) and correlations for April surveys were found using cross validation. The ANN using the three best performing SWE products (ANN3) had the lowest mean station MAE across the entire province, improving on the performance of individual products by an average of 53 %. Mean station MAEs and April survey correlations were also found for each of BC’s five physiographic regions. ANN3 outperformed each product as well as product means and multiple linear regression (MLR) models in all regions except for the BC Plains, which has relatively few stations and much lower accumulations than other regions. Subsequent comparisons of the ANN results with predictions generated by the Variable Infiltration Capacity (VIC) hydrologic model found ANN3 to be superior over the entire VIC domain and within most physiographic regions. The superior performance of the ANN over individual products, product means, MLR and VIC was found to be statistically significant across the province.

  • Source Publication: Journal of Climate, 30, 4113-4130, doi:10.1175/JCLI-D-16-0189.1 Authors: Najafi, M.R., F.W. Zwiers and N.P. Gillett Publication Date: May 2017

    A detection and attribution analysis on the multidecadal trend in snow water equivalent (SWE) has been conducted in four river basins located in British Columbia (BC). Monthly output from a suite of 10 general circulation models (GCMs) that participated in phase 5 of the Coupled Model Intercomparison Project (CMIP5) is used, including 40 climate simulations with anthropogenic and natural forcing combined (ALL), 40 simulations with natural forcing alone (NAT), and approximately 4200 yr of preindustrial control simulations (CTL). This output was downscaled to 1/16° spatial resolution and daily temporal resolution to drive the Variable Infiltration Capacity hydrologic model (VIC). Observed (manual snow survey) and VIC-reconstructed SWE, which exhibit declines across BC, are projected onto the multimodel ensemble means of the VIC-simulated SWE based on the responses to different forcings using an optimal fingerprinting approach. Results of the detection and attribution analysis shows that these declines are attributable to the anthropogenic forcing, which is dominated by the effect of increases in greenhouse gas concentration, and that they are not caused by natural forcing due to volcanic activity and solar variability combined. Anthropogenic influence is detected in three of the four basins (Fraser, Columbia, and Campbell Rivers) based on the VIC-reconstructed SWE, and in all basins based on the manual snow survey records. The simulations underestimate the observed snowpack trends in the Columbia River basin, which has the highest mean elevation. Attribution is supported by the detection of human influence on the cold-season temperatures that drive the snowpack reductions. These results are robust to the use of different observed datasets and to the treatment of low-frequency variability effects.
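
    The optimal fingerprinting step can be sketched in its one-signal form (generalised least squares; the numbers below are synthetic and only exercise the formula):

```python
import numpy as np

def scaling_factor(y, x, cov):
    """One-signal optimal fingerprinting sketch: GLS regression of
    observations y onto the model-simulated response pattern x, with
    internal-variability covariance cov. Detection: the confidence
    interval for beta excludes 0; attribution is supported when it
    is also consistent with 1."""
    ci = np.linalg.inv(cov)
    return (x @ ci @ y) / (x @ ci @ x)
```

    With observations that are exactly 0.8 times the response pattern and white noise covariance, the estimated scaling factor is 0.8.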

  • Source Publication: Journal of Advances in Modeling Earth Systems, 9, 2, 1292-1306, doi:10.1002/2016MS000830 Authors: Ouali, D., F. Chebana and T.B.M.J. Ouarda Publication Date: Apr 2017

    The high complexity of hydrological systems has long been recognized. Despite the increasing number of statistical techniques that aim to estimate hydrological quantiles at ungauged sites, few approaches have been designed to account for the possible nonlinear connections between hydrological variables and catchment characteristics. Recently, a number of nonlinear machine‐learning tools have received attention in regional frequency analysis (RFA) applications especially for estimation purposes. In this paper, the aim is to study nonlinearity‐related aspects in the RFA of hydrological variables using statistical and machine‐learning approaches. To this end, a variety of combinations of linear and nonlinear approaches are considered in the main RFA steps (delineation and estimation). Artificial neural networks (ANNs) and generalized additive models (GAMs) are combined with a nonlinear ANN‐based canonical correlation analysis (NLCCA) procedure to ensure an appropriate nonlinear modeling of the complex processes involved. A comparison is carried out between classical linear combinations (CCA combined with a linear regression (LR) model), semilinear combinations (e.g., NLCCA with LR) and fully nonlinear combinations (e.g., NLCCA with GAM). The considered models are applied to three different data sets located in North America. Results indicate that fully nonlinear models (in both RFA steps) are the most appropriate since they provide best performances and a more realistic description of the physical processes involved, even though they are relatively more complex than linear ones. On the other hand, semilinear models which consider nonlinearity either in the delineation or estimation steps showed little improvement over linear models. The linear approaches provided the lowest performances.

  • Source Publication: Nature Geoscience 10, 255–259, doi:10.1038/ngeo2911. Authors: Zhang, X., F.W. Zwiers, G. Li, H. Wan and A.J. Cannon Publication Date: Mar 2017

    Warming of the climate is now unequivocal. The water holding capacity of the atmosphere increases by about 7% per °C of warming, which in turn raises the expectation of more intense extreme rainfall events. Meeting the demand for robust projections for extreme short-duration rainfall is challenging, however, because of our poor understanding of its past and future behaviour. The characterization of past changes is severely limited by the availability of observational data. Climate models, including typical regional climate models, do not directly simulate all extreme rainfall producing processes, such as convection. Recently developed convection-permitting models better simulate extreme precipitation, but simulations are not yet widely available due to their computational cost, and they have their own uncertainties. Attention has thus been focused on precipitation–temperature relationships in the hope of obtaining more robust extreme precipitation projections that exploit higher confidence temperature projections. However, the observed precipitation–temperature scaling relationships have been established almost exclusively by linking precipitation extremes with day-to-day temperature variations. These scaling relationships do not appear to provide a reliable basis for projecting future precipitation extremes. Until better methods are available, the relationship of the atmosphere's water holding capacity with temperature provides better guidance for planners in the mid-latitudes, albeit with large uncertainties.
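
    The ~7% per °C guidance in the closing sentences translates into a simple compounding rule for a planner's design intensity. A back-of-envelope sketch (carrying the large uncertainties the text stresses):

```python
def cc_scaled_intensity(base_mm, delta_t_c, rate=0.07):
    """Clausius-Clapeyron guidance: the atmosphere's water holding
    capacity rises ~7% per deg C of warming, compounded over the
    projected temperature change."""
    return base_mm * (1.0 + rate) ** delta_t_c
```

    For example, a 50 mm design rainfall under 2 °C of warming scales to about 57 mm under this rule.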

  • Source Publication: doi:10.1007/s00382-017-3634-9 Authors: C. Seiler, F. W. Zwiers, K. I. Hodges and J. F. Scinocca Publication Date: Mar 2017

    Explosive extratropical cyclones (EETCs) are rapidly intensifying low pressure systems that generate severe weather along North America’s Atlantic coast. Global climate models (GCMs) tend to simulate too few EETCs, perhaps partly due to their coarse horizontal resolution and poorly resolved moist diabatic processes. This study explores whether dynamical downscaling can reduce EETC frequency biases, and whether this affects future projections of storms along North America’s Atlantic coast. A regional climate model (CanRCM4) is forced with the CanESM2 GCM for the periods 1981 to 2000 and 2081 to 2100. EETCs are tracked from relative vorticity using an objective feature tracking algorithm. CanESM2 simulates 38% fewer EETC tracks compared to reanalysis data, which is consistent with a negative Eady growth rate bias (−0.1 day⁻¹). Downscaling CanESM2 with CanRCM4 increases EETC frequency by one third, which reduces the frequency bias to −22%, and increases maximum EETC precipitation by 22%. Anthropogenic greenhouse gas forcing is projected to decrease EETC frequency (−15%, −18%) and Eady growth rate (−0.2 day⁻¹, −0.2 day⁻¹), and increase maximum EETC precipitation (46%, 52%) in CanESM2 and CanRCM4, respectively. The limited effect of dynamical downscaling on EETC frequency projections is consistent with the lack of impact on the maximum Eady growth rate. The coarse spatial resolution of GCMs presents an important limitation for simulating extreme ETCs, but Eady growth rate biases are likely just as relevant. Further bias reductions could be achieved by addressing processes that lead to an underestimation of lower tropospheric meridional temperature gradients.
