Hansen has an interesting discussion of extreme weather and attribution to human emissions:
Global warming is expected to intensify climate extremes: (1) Warmer air holds more water vapor, and precipitation occurs in more extreme events. ‘100-year floods’ and even ‘500-year floods’ will become more likely. Storms fueled by water vapor (latent heat), including thunderstorms, tornadoes and tropical storms, will have the potential to be stronger. Storm damage will increase because of increased flooding and stronger winds. (2) Where weather patterns create dry conditions, global warming will intensify the drought, because of increased evaporation and evapotranspiration. Thus fires will be more frequent and burn hotter.
Observations confirm that heat waves and regional drought have become more frequent and intense over the past 50 years. Rainfall in the heaviest downpours has increased about 20 percent. The destructive energy in hurricanes has increased (USGCRP, 2009).
Is the Texas drought related to human-made global warming? There is strong reason to believe that it is. Basic theory and models (Held and Soden, 2006) and empirical evidence (Seidel and Randel, 2006) indicate that the global overturning circulation, with air rising in the tropics and subsiding in the subtropics, expands in latitude with global warming. Such expansion tends to make droughts more frequent and severe in the southern United States and the Mediterranean region, for example. Climate simulations, shown in Figure 3 for one of the best climate models, support that expectation.
[JR: I suspect this study underestimates likely drought in the West due to early snow melt and other factors. I’ll have to take a look.]
So the occurrence of unusual Texas heat and drought is consistent with expectations for increasing CO2. But is this year’s event just climate ‘noise’? Scientists need to help the public distinguish climate change caused by global warming from natural climate variability.
I used ‘climate dice’ in conjunction with testimony to Congress in 1988 to try to help the public understand that the human-made climate ‘signal’ must be extracted from the large ‘noise’ of natural climate variability. I believe the public can grasp the concept of natural climate variability and its effect on perceptions of climate change.
In an upcoming post (Climate Variability and Climate Change, Hansen, Sato and Ruedy) we try to clarify this matter via simple maps and graphs that show how the odds have changed, allowing comparison of expectations and reality. We believe this is a truer approach than the frequently suggested alternative of dropping the long-standing ‘global warming’ terminology in favor of anything (‘climate disruption’, ‘global weirding’, etc.) that avoids the need to explain the occurrence of unusually cold conditions.
We show that a ‘signal’ due to global warming is already rising out of the climate ‘noise’, even on regional scales. Figure 4 is an example, showing surface air temperature anomalies in the last four Northern Hemisphere summers relative to the climate of 1951-1980, the time when the ‘baby-boomers’ grew up – it was a time of relatively stable climate, just prior to the rapid global warming of the past three decades.
During 1951-1980 the world had equal areas of blue (cool), white (near average), and red (warm) temperature anomalies. The division 0.43σ, where σ is the local standard deviation about the local 1951-1980 mean, was chosen to yield equal area categories for a normal (‘bell curve’) distribution of temperature anomalies. The other divisions in the figure, 2σ and 3σ, allow us to see the areas that have extreme anomalies relative to climatology. The frequency of an anomaly greater than +2σ is only 2-3 percent in the period of climatology for a normal distribution. The frequency of a +3σ event is normally less than one-half of one percent of the time. The numbers on the upper right corner of each map are the percentages of the global area covered by each of the seven categories of the color bar.
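The percentages quoted above follow directly from the standard normal distribution. A minimal sketch (not from the paper; it simply checks the arithmetic with Python's standard library, using the error-function form of the normal CDF):

```python
from math import erf, sqrt

def normal_cdf(x):
    """Cumulative distribution function of the standard normal, via erf."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Middle 'near average' band between -0.43 sigma and +0.43 sigma:
# chosen so that cool, near-average, and warm each cover ~1/3 of the area.
middle_band = normal_cdf(0.43) - normal_cdf(-0.43)

# Tail frequencies for the extreme categories.
p_above_2sigma = 1.0 - normal_cdf(2.0)  # the '2-3 percent' figure
p_above_3sigma = 1.0 - normal_cdf(3.0)  # 'less than one-half of one percent'

print(f"area within +/-0.43 sigma: {middle_band:.3f}")
print(f"frequency above +2 sigma:  {p_above_2sigma:.4f}")
print(f"frequency above +3 sigma:  {p_above_3sigma:.5f}")
```

Running this gives roughly 0.333 for the middle band, about 0.023 (2.3 percent) above +2σ, and about 0.0013 (0.13 percent) above +3σ, matching the figures in the text.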
Figure 4 reveals that the area with temperature anomaly greater than +2σ covers 20-40 percent of the planet in these recent years, and the area greater than +3σ is almost 10-20 percent. The United States has been relatively ‘lucky’, with the only +2-3σ areas being the Texas region in 2011 and a smaller area in the Southeast in 2010. However, these events are sufficiently fresh in people’s memories that they provide a useful measure of the practical impact of a 3σ anomaly.
There is no good reason to believe that the United States, or any other region, will continue to be so ‘lucky’. On the contrary, as shown in our upcoming post, there is a clear positive trend to increasing areas of +2-3σ anomalies, consistent with expectations for the climate response to increasing greenhouse gases. If business-as-usual (BAU) emissions continue, the area with anomalies of +2-3σ and larger will continue to increase.
The chaotic element in climate variability makes it impossible to say exactly where large anomalies will occur in a given year. However, we can say with assurance that the area and magnitude of the anomalies, and their practical impact, will continue to increase. Clear presentations of the data should help the public appreciate the situation as the global warming signal continues to rise further above the level of natural variability.