


Date: Tuesday, 08 Apr 2014 12:25

Guest commentary from Drew Shindell

There has been a lot of discussion of my recent paper in Nature Climate Change (Shindell, 2014). That study addressed a puzzle, namely that recent studies using the observed changes in Earth’s surface temperature suggested climate sensitivity is likely towards the lower end of the estimated range. However, studies evaluating model performance on key observed processes and paleoclimate evidence suggest that the higher end of sensitivity is more likely, partially conflicting with the studies based on the recent transient observed warming. The new study shows that climate sensitivity to historical changes in the abundance of aerosol particles in the atmosphere is larger than the sensitivity to CO2, primarily because the aerosols are largely located near industrialized areas in the Northern Hemisphere middle and high latitudes where they trigger more rapid land responses and strong snow & ice feedbacks. Therefore studies based on observed warming have underestimated climate sensitivity as they did not account for the greater response to aerosol forcing, and multiple lines of evidence are now consistent in showing that climate sensitivity is in fact very unlikely to be at the low end of the range in recent estimates.

In particular, a criticism of the paper written by Nic Lewis has gotten some attention. Lewis makes a couple of potentially interesting points, chief of which concern the magnitude and uncertainty in the aerosol forcing I used and the time period over which the calculation is done, and I address these issues here. There are also a number of less substantive points in his piece that I will not bother with.

Lewis states that “The extensive adjustments made by Shindell to the data he uses are a source of concern. One of those adjustments is to add +0.3 W/m² to the figures used for model aerosol forcing to bring the estimated model aerosol forcing into line with the AR5 best estimate of -0.9 W/m².” Indeed the estimate of aerosol forcing used in the calculation of transient climate response (TCR) in the paper does not come directly from climate models, but instead incorporates an adjustment to those models so that the forcing better matches the assessed estimates from the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC). An adjustment is necessary because, as climate models are continually evaluated against observations, evidence has emerged that their aerosol-cloud interactions are too strong (i.e. the models’ ‘aerosol indirect effect’ is larger than inferred from observations). There have been numerous papers on this topic, and the issue was thoroughly assessed in IPCC AR5 chapter 7. The assessed best estimate was that the historical negative aerosol forcing (radiation and cloud effects, but not black carbon on snow/ice) was too strong by about 0.3 W/m² in the models that included that effect, a conclusion very much in line with a prior publication on climate sensitivity by Otto et al. (2013). Given the numerous scientific studies on this topic, there is ample support for the conclusion that models overestimate the magnitude of aerosol forcing, though the uncertainty in aerosol forcing (which is incorporated into the analysis in the paper) is large, especially in comparison with CO2 forcing, which can be better constrained by observations.

The second substantive point Lewis raised relates to the time period over which the TCR is evaluated. The IPCC emphasizes forcing estimates relative to 1750 since most of the important anthropogenic impacts are thought to have been small at that time (biomass burning may be an exception, but appears to have a relatively small net forcing). Surface temperature observations become sparser going back further in time, however, and the most widely used datasets only go back to 1880 or 1850. Radiative forcing, especially that due to aerosols, is highly uncertain for the period 1750-1850 as there is little modeling and even less data to constrain those models. The AR5 gives a value for 1850 aerosol forcing (relative to 1750) (Annex II, Table AII.1.2) of -0.178 W/m² for direct+indirect (radiation+clouds). There is also a BC snow forcing of 0.014 W/m², for a total of -0.164 W/m². While these estimates are small, they are nonetheless very poorly constrained.

Hence there are two logical choices for an analysis of TCR. One could assume that there was minimal global mean surface temperature change between 1750 and 1850, as some datasets suggest, and compare the 1850-2000 temperature change with the full 1750-2000 forcing estimate, as in my paper and Otto et al. In this case, aerosol forcing over 1750-2000 is used.

Alternatively, one could assume we can estimate forcing during this early period realistically enough to remove it from the longer 1750-2000 estimates, and so compare forcing and response over 1850-2000. In this case, this must be done for all forcings, not just for the aerosols. The well-mixed greenhouse gas forcing in 1850 is 0.213 W/m². Including solar and stratospheric water, that becomes 0.215 W/m². Land use (LU) and ozone almost exactly cancel one another. So to adjust from 1750-2000 to 1850-2000 forcings, one must remove 0.215 W/m² and also remove the -0.164 W/m² aerosol forcing; multiplying the latter by its impact relative to that of well-mixed greenhouse gases (~1.5) gives about -0.25 W/m².

If this is done consistently, the denominator of the climate sensitivity calculation containing total forcing barely changes and hence the TCR results are essentially the same (a change of only 0.03°C). Lewis’ claim that my TCR results are mistaken because they did not account for 1750-1850 aerosol forcing is incorrect because he fails to use consistent time periods for all forcing agents. The results are in fact quite robust to either analysis option, provided they are done consistently.
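To see why the baseline choice matters so little when it is applied consistently, here is a minimal back-of-envelope sketch in Python. The simple energy-budget form and all forcing values other than the 1750-1850 adjustments quoted above are illustrative assumptions, not the paper’s actual inputs.

```python
# Rough consistency check of the baseline argument above (not the paper's
# actual numbers). Assumed energy-budget form: TCR = F_2x * dT / (F_ghg + E*F_aer).
F_2XCO2 = 3.7        # W/m^2 per CO2 doubling
E       = 1.5        # enhancement of the response to aerosol forcing
dT_obs  = 0.8        # K of observed warming (illustrative)

F_ghg_1750, F_aer_1750 = 2.5, -0.9          # hypothetical 1750-2000 forcings (W/m^2)
F_ghg_1850 = F_ghg_1750 - 0.215             # remove AR5 1750-1850 GHG + solar + strat. water
F_aer_1850 = F_aer_1750 - (-0.164)          # remove AR5 1750-1850 aerosol forcing

def tcr(f_ghg, f_aer):
    return F_2XCO2 * dT_obs / (f_ghg + E * f_aer)

print(f"TCR with 1750 baseline: {tcr(F_ghg_1750, F_aer_1750):.2f} K")
print(f"TCR with 1850 baseline: {tcr(F_ghg_1850, F_aer_1850):.2f} K")
```

With these made-up inputs the two estimates differ by well under a tenth of a degree, the same order as the 0.03°C change quoted above: the 0.215 W/m² removed from the greenhouse-gas side and the ~0.25 W/m² removed from the (aerosol-weighted) negative side nearly cancel.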

Lewis also discusses the uncertainty in aerosol forcing and in the degree to which the response to aerosols is enhanced relative to the response to CO2. Much of this discussion follows a common pattern of looking through the peer-reviewed paper to find all the caveats and discussion points, and then repeating them back as if they undermine the paper’s conclusions rather than reflecting that they are uncertainties that were already taken into account. It is important to realize that the results presented in the paper include both the uncertainty in the aerosol forcing and the uncertainty in the enhancement of the response to aerosol forcing, as explicitly stated. Hence any statement that the uncertainty is underestimated in the results presented in the paper, because the (included) uncertainty in these two components is large, is groundless.

In fact, this is an important issue to keep in mind as Lewis also argues that the climate models do not provide good enough information to determine the value of the enhanced aerosol response (the parameter I call E in the paper, where E is the ratio of the global mean temperature response to aerosol forcing versus the response to the same global mean magnitude of CO2 forcing, so that E=1.5 would be a 50% stronger response to aerosols). While the models indeed are imperfect and have uncertainties, they provide the best available method we have to determine the value of E as this cannot be isolated from observations directly. Furthermore, basic physical understanding supports the modeled value of E being substantially greater than 1, as deep oceans clearly take longer to respond than the land surface, so the Northern Hemisphere, with most of the world’s land, will respond more rapidly than the Southern Hemisphere with more ocean. Quantifying the value of E accurately is difficult, and the variation across the models is substantial, primarily reflecting our incomplete knowledge of aerosol forcing. This leads to a range of E quoted in the paper of 1.18 to 2.43. I used this range, assuming a lognormal distribution, along with the mean value of 1.53, in the calculation for the TCR.

Lewis then argues that the large uncertainty ranges in E and in aerosol forcing make the TCR estimates “worthless”. While “worthless” is a little strong, it is important to fully assess uncertainties in trying to constrain any properties of the real world. It’s worthwhile to note that Lewis co-authored a recent report claiming that TCR could in fact be constrained to be low. That report relies on studies that include the large aerosol forcing uncertainty, so criticizing my paper for that would be inconsistent. However, Lewis’ study assumed that all forcings induce the same response in global mean temperature as CO2. This is equivalent to assuming that E is exactly 1.0 with NO uncertainty whatsoever. This is a reasonable first guess in the absence of evidence to the contrary, but as my paper recently showed, there is evidence to indicate that assumption is biased.

But while Lewis argues that the uncertainty in E is large and climate models do not give the value as accurately as we’d like, that does not justify ignoring that uncertainty entirely. Instead, we need to characterize that uncertainty as best we can and propagate that through the calculation (as can be seen in the figure below). The real question is not whether climate models provide us perfect information (they do not), but rather whether they provide better information than some naïve prior assumption. In this case, it is clear that they do.



Figure shows representative probability distribution functions for TCR using the numbers from Shindell (2014) in a Monte Carlo calculation (Gaussian for Fghg and dTobs, lognormal fits for the skewed distributions for Faerosol+ozone+LU and E). The green line is if you assume exactly no difference between the effects of aerosols and GHGs; Red is if you estimate that difference using climate models; Dashed red is the small difference made by using a different start date (1850 instead of 1750).
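For readers who want to reproduce a figure of this general shape, here is a minimal Monte Carlo sketch. The distribution shapes follow the caption above (Gaussian for Fghg and dTobs, lognormal for the aerosol+ozone+LU forcing and for E), but every numerical value is an assumed placeholder rather than the paper’s inputs, so the output will only be qualitatively similar.

```python
# Monte Carlo sketch in the spirit of the figure above; all numbers are placeholders.
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
F_2XCO2 = 3.7

dT   = rng.normal(0.8, 0.1, N)                # observed warming (K), assumed
Fghg = rng.normal(2.5, 0.2, N)                # well-mixed GHG forcing (W/m^2), assumed
Faer = -rng.lognormal(np.log(0.9), 0.3, N)    # aerosol+ozone+LU forcing, skewed, assumed
E    = rng.lognormal(np.log(1.5), 0.2, N)     # enhanced aerosol response, ~1.2-2.4 range

def tcr_samples(e):
    denom = Fghg + e * Faer
    ok = denom > 0.1                          # drop rare draws with near-zero net forcing
    return F_2XCO2 * dT[ok] / denom[ok]

for label, e in [("E from models (red curve)", E), ("E = 1 (green curve)", 1.0)]:
    s = tcr_samples(e)
    lo, med, hi = np.percentile(s, [5, 50, 95])
    print(f"{label}: median {med:.2f} K, 5-95% range {lo:.2f}-{hi:.2f} K")
```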

This highlights the critical distinction in our reasoning: I fully support the basic methods used in prior work such as Otto et al and have simply quantified an additional physical factor in the existing methodology. I am however confused that Lewis, on the one hand, appears to now object to the basic method used in prior work, in which the authors first adjusted aerosol forcing, second included its uncertainty, and finally quantified estimates of TCR, yet on the other hand not only co-authored the Otto et al paper but released a report praising that study just three days before the publication of my paper.

For completeness, I should acknowledge that Lewis correctly identified a typo in the last row of the first column of Table S2, which has been corrected in the version posted where there is also access to the computer codes used in the calculations. The climate model output itself is already publicly available at the CMIP5 website (also linked at that page).

Finally, I note that the conclusions of the paper send a sobering message. It would be nice if sensitivity was indeed quite low and society could get away with smaller emission cuts to stabilize climate. Unfortunately, several lines of independent evidence now agree that this is not the case.

References

  1. D.T. Shindell, "Inhomogeneous forcing and transient climate sensitivity", Nature Climate Change, vol. 4, pp. 274-277, 2014. http://dx.doi.org/10.1038/nclimate2136
  2. A. Otto, F.E.L. Otto, O. Boucher, J. Church, G. Hegerl, P.M. Forster, N.P. Gillett, J. Gregory, G.C. Johnson, R. Knutti, N. Lewis, U. Lohmann, J. Marotzke, G. Myhre, D. Shindell, B. Stevens, and M.R. Allen, "Energy budget constraints on climate response", Nature Geosci, vol. 6, pp. 415-416, 2013. http://dx.doi.org/10.1038/ngeo1836
Author: "group" Tags: "Climate modelling, Climate Science, Inst..."
Comments Send by mail Print  Save  Delicious 
Date: Sunday, 06 Apr 2014 15:02

More open thread. Unusually, we are keeping the UV Mar 2014 thread open for more Diogenetic conversation and to keep this thread open for more varied fare.

Author: "group" Tags: "Climate Science, Open thread"
Comments Send by mail Print  Save  Delicious 
Date: Friday, 04 Apr 2014 08:41

The second part of the new IPCC Report has been approved – as usual after lengthy debates – by government delegations in Yokohama (Japan) and is now public. Perhaps the biggest news is this: the situation is no less serious than it was at the time of the previous report in 2007. Nonetheless there is progress in many areas, such as a better understanding of observed impacts worldwide and of the specific situation of many developing countries. There is also a new assessment of “smart” options for adaptation to climate change. The report clearly shows that adaptation is an option only if efforts to mitigate greenhouse gas emissions are strengthened substantially. Without mitigation, the impacts of climate change will be devastating.


Guest post by Wolfgang Cramer

 

 

On all continents and across the oceans

Impacts of anthropogenic climatic change are observed worldwide and have been linked to observed climate using rigorous methods. Such impacts have occurred in many ecosystems on land and in the ocean, in glaciers and rivers, and they concern food production and the livelihoods of people in developing countries. Many changes occur in combination with other environmental problems (such as urbanization, air pollution, biodiversity loss), but the role of climate change for them emerges more clearly than before.


Fig. 1 Observed impacts of climate change during the period since publication of the IPCC Fourth Assessment Report 2007

 

During the presentation for approval of this map in Yokohama, many delegates asked why there are not many more impacts on it. This is because the authors only listed those cases where solid scientific analysis allowed attribution. An important implication of this is that the absence of icons from the map may well be due to a lack of data (such as in parts of Africa) – and certainly does not imply an absence of impacts in reality. Compared to the earlier report in 2007, a new element of these documented findings is that impacts on crop yields are now clearly identified in many regions, including in Europe. Improved irrigation and other technological advances have so far helped to avoid shrinking yields in many cases – but the increase normally expected from technological improvements is leveling off rapidly.

 

A future of increasing risks

More than previous IPCC reports, the new report deals with future risks. Among other things, it seeks to identify those situations where adaptation could become unfeasible and damages therefore become inevitable. A general finding is that “high” scenarios of climate change (those where global mean temperature reaches four degrees C or more above preindustrial conditions – a situation that is not at all unlikely according to part one of the report) will likely result in catastrophic impacts on most aspects of human life on the planet.


Fig. 2 Risks for various systems with high (blue) or low (red) efforts in climate change mitigation

 

These risks concern entire ecosystems, notably those of the Arctic and the corals of warm waters around the world (the latter being a crucial resource for fisheries in many developing countries), the global loss of biodiversity, and also the working conditions for many people in agriculture (the report offers many details from various regions). Limiting global warming to 1.5-2.0 degrees C through aggressive emission reductions would not avoid all of these damages, but the risks would be significantly lower. A similar chart has been shown in earlier reports, but the assessment of risks is now, based on the additional scientific knowledge available, more alarming than before – a point expressed most prominently by the deep red color in the first bar.

 

Food security increasingly at risk

In the short term, warming may improve agricultural yields in some cooler regions, but significant reductions are highly likely to dominate in later decades of the present century, particularly for wheat, rice and maize. The illustration is an example of the assessment of numerous studies in the scientific literature, showing that, from 2030 onwards, significant losses are to be expected. This should be seen in the context of already existing malnutrition in many regions, a growing problem also in the absence of climate change, due to growing populations, increasing economic disparities and the continuing shift of diet towards animal protein.


Fig. 3 Studies indicating increased crop yields (blue) or reduced crop yields (brown), accounting for various scenarios of climate change and technical adaptation

 

The situation for global fisheries is similarly bleak. While some regions, such as the North Atlantic, might allow larger catches, a loss of marine productivity is to be expected in nearly all tropical waters, caused by warming and acidification. This affects poor countries in South-East Asia and the Pacific in particular. Many of these countries will also be affected disproportionately by the consequences of sea-level rise for coastal mega-cities.


Fig. 4 Change in maximum fish catch potential 2051-2060 compared to 2001-2010 for the climate change scenario SRES A1B

 

Urban areas in developing countries particularly affected

Nearly all developing countries experience significant growth in their mega-cities – but it is here that higher temperatures and limited potential for technical adaptation have the largest effect on people. Improved urban planning, focusing on the resilience of residential areas and transport systems of the poor, can deliver important contributions to adaptation. This would also have to include better preparation for the regionally rising risks from typhoons, heat waves and floods.

 

Conflicts in a warmer climate

It has been pointed out that no direct evidence is available to connect the occurrence of violent conflict to observed climate change. But recent research has shown that it is likely that dry and hot periods may have been contributing factors. Studies also show that the use of violence increases with high temperatures in some countries. The IPCC therefore concludes that enhanced global warming may significantly increase risks of future violent conflict.

 

Climate change and the economy

Studies estimate the impact of future climate change at around a few percent of global income, but these numbers are considered hugely uncertain. More importantly, any economic losses will be most tangible for countries, regions and social groups that are already disadvantaged compared to others. It is therefore to be expected that the economic impacts of climate change will push large additional numbers of people into poverty and the risk of malnutrition, due to various factors including increases in food prices.

 

Options for adaptation to the impacts of climate change

The report underlines that there is no globally acceptable “one-size-fits-all” concept for adaptation. Instead, one must seek context-specific solutions. Smart solutions can provide opportunities to enhance the quality of life and local economic development in many regions – this would then also reduce vulnerabilities to climate change. It is important that such measures account for cultural diversity and the interests of indigenous people. It also becomes increasingly clear that policies that reduce emissions of greenhouse gases (e.g., by the application of more sustainable agriculture techniques or the avoidance of deforestation) need not be in conflict with adaptation to climate change. Both can improve significantly the livelihoods of people in developing countries, as well as their resilience to climate change.

It is beyond doubt that unabated climate change will exhaust the potential for adaptation in many regions – particularly for the coastal regions in developing countries where sea-level rise and ocean acidification cause major risks.

The summary of the report can be found here. The entire report with all underlying chapters is also online, and there is a nicely crafted background video.

Wolfgang Cramer is scientific director of the Institut Méditerranéen de Biodiversité et d’Ecologie marine et continentale (IMBE) in Aix-en-Provence and one of the authors of the IPCC Working Group 2 report.

This article was translated from the German original at RC’s sister blog KlimaLounge.

 

Weblink

Here is our summary of part 1 of the IPCC report.

Author: "stefan" Tags: "Climate impacts, Climate Science, IPCC"
Comments Send by mail Print  Save  Delicious 
Date: Monday, 31 Mar 2014 01:29

Instead of speculations based on partial drafts and attempts to spin the coverage ahead of time, you can now download the final report of the IPCC WG2: “Climate Change 2014: Impacts, Adaptation, and Vulnerability” directly. The Summary for Policy Makers is here, while the whole report is also downloadable by chapter. Notably there are FAQ for the whole report and for each chapter that give a relatively easy way in to the details. Note too that these are the un-copyedited final versions, and minor edits, corrections and coherent figures will be forthcoming in the final published versions. (For reference, the WG1 report was released in Sept 2013, but only in final published form in Jan 2014). Feel free to link to interesting takes on the report in the comments.

Author: "group" Tags: "Climate impacts, Climate Science, IPCC"
Comments Send by mail Print  Save  Delicious 
Date: Friday, 28 Mar 2014 23:34

This is the mid-month open thread for all discussions, except those related to Diogenes’ comments. People wanting to discuss with commenter Diogenes should stick to the previous UV thread. All such discussion on this thread will be moved over. Thanks.

Author: "gavin" Tags: "Climate Science, Open thread"
Comments Send by mail Print  Save  Delicious 
Date: Tuesday, 25 Mar 2014 16:40

Does global warming make extreme weather events worse? Here is the #1 flawed reasoning you will have seen about this question: it is the classic confusion between absence of evidence and evidence for absence of an effect of global warming on extreme weather events. Sounds complicated? It isn’t. I’ll first explain it in simple terms and then give some real-life examples.

The two most fundamental properties of extreme events are that they are rare (by definition) and highly random. These two aspects (together with limitations in the data we have) make it very hard to demonstrate any significant changes. And they make it very easy to find all sorts of statistics that do not show an effect of global warming – even if it exists and is quite large.

Would you have been fooled by this?

Imagine you’re in a sleazy, smoky pub and a stranger offers you a game of dice, for serious money. You’ve been warned and have reason to suspect they’re using a loaded dice here that rolls a six twice as often as normal. But the stranger says: “Look here, I’ll show you: this is a perfectly normal dice!” And he rolls it a dozen times. There are two sixes in those twelve trials – as you’d expect on average in a normal dice. Are you convinced all is normal?

You shouldn’t be, because this experiment is simply inconclusive. It shows no evidence for the dice being loaded, but neither does it provide real evidence against your prior suspicion that the dice is loaded. There is a good chance for this outcome even if the dice is massively loaded (i.e. with 1 in 3 chance to roll a six). On average you’d expect 4 sixes then, but 2 is not uncommon either. With normal dice, the chance to get exactly two sixes in this experiment is 30%, with the loaded dice it is 13%[i]. From twelve tries you simply don’t have enough data to tell.
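If you want to check those percentages yourself, the binomial formula from footnote [i] does it in a few lines (a sketch in Python, standard library only):

```python
# Quick check of the two probabilities quoted above, using the binomial
# formula from footnote [i]: exactly two sixes in twelve rolls.
from math import comb

def p_exactly(m, k, p_six):
    """Probability of exactly m sixes in k independent rolls."""
    return comb(k, m) * p_six**m * (1 - p_six)**(k - m)

print(f"fair die   (p=1/6): {p_exactly(2, 12, 1/6):.0%}")   # ~30%
print(f"loaded die (p=1/3): {p_exactly(2, 12, 1/3):.0%}")   # ~13%
```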

Hurricanes

In 2005, leading hurricane expert Kerry Emanuel (MIT) published an analysis showing that the power of Atlantic hurricanes has strongly increased over the past decades, in step with temperature. His paper in the journal Nature happened to come out on the 4th of August – just weeks before hurricane Katrina struck. Critics were quick to point out that the power of hurricanes that made landfall in the US had not increased. While at first sight that might appear to be the more relevant statistic, it actually is a case like rolling the dice only twelve times: as Emanuel’s calculations showed, the number of landfalling storms is simply far too small to get a meaningful result, as those data represent “less than a tenth of a percent of the data for global hurricanes over their whole lifetimes”. Emanuel wrote at the time (and later confirmed in a study): “While we can already detect trends in data for global hurricane activity considering the whole life of each storm, we estimate that it would take at least another 50 years to detect any long-term trend in U.S. landfalling hurricane statistics, so powerful is the role of chance in these numbers.” Like with the dice this is not because the effect is small, but because it is masked by a lot of ‘noise’ in the data, spoiling the signal-to-noise ratio.

Heat records

The number of record-breaking hot months (e.g. ‘hottest July in New York’) around the world is now five times as big as it would be in an unchanging climate. This has been shown by simply counting the heat records in 150,000 series of monthly temperature data from around the globe, starting in the year 1880. Five times. For each such record that occurs just by chance, four have been added thanks to global warming.

You may be surprised (like I was at first) that the change is so big after less than 1 °C global warming – but if you do the maths, you find it is exactly as expected. In 2011, in the Proceedings of the National Academy we described a statistical method for calculating the expected number of monthly heat records given the observed gradual changes in climate. It turns out to be five times the number expected in a stationary climate.

Given that this change is so large, that it is just what is expected and that it can be confirmed by simple counting, you’d expect this to be uncontroversial. Not so. Our paper was attacked with astounding vitriol by Roger Pielke Jr., with repeated false allegations about our method (more on this here).


European summer temperatures for 1500–2010. Vertical lines show the temperature deviations from average of individual summers, the five coldest and the five warmest are highlighted. The grey histogram shows the distribution for the 1500–2002 period with a Gaussian fit shown in black. That 2010, 2003, 2002, 2006 and 2007 are the warmest summers on record is clearly not just random but a systematic result of a warming climate. But some invariably will rush to the media to proclaim that the 2010 heat wave was a natural phenomenon not linked to global warming. (Graph from Barriopedro et al., Science 2011.)

 

Heat records can teach us another subtle point. Say in your part of the world the number of new heat records has been constant during the past fifty years. So, has global warming not acted to increase their number? Wrong! In a stationary climate, the number of new heat records declines over time. (After 50 years of data, the chance that this year is the hottest is 1/50. After 100 years, this is reduced to 1/100.)  So if the number has not changed, two opposing effects must have kept it constant: the natural decline, and some warming. In fact, the frequency of daily heat records has declined in most places during the past decades. But due to global warming, they have declined much less than the number of cold records, so that we now observe many more hot records than cold records. This shows how some aspects of extreme events can be increased by global warming at the same time as decreasing over time. A curve with no trend does not demonstrate that something is unaffected by global warming.
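A toy simulation makes these two competing effects visible. The variability and trend below are arbitrary, so this sketch will not reproduce the factor-of-five result for monthly records quoted above, only the qualitative point that a trend offsets the natural 1/n decline in new records.

```python
# Toy illustration: record frequency in a stationary vs. a warming climate.
import numpy as np

rng = np.random.default_rng(1)
n_series, n_years = 20_000, 130
noise = rng.normal(0.0, 1.0, (n_series, n_years))     # interannual variability, sigma = 1
trend = np.linspace(0.0, 1.0, n_years)                # assumed warming of 1 sigma over the record

def mean_records_last_decade(series):
    running_max = np.maximum.accumulate(series, axis=1)
    new_record = series[:, 1:] > running_max[:, :-1]  # this year beats every previous year
    return new_record[:, -10:].sum(axis=1).mean()     # average number of records in final decade

print("records per decade, stationary:", mean_records_last_decade(noise))
print("records per decade, warming:   ", mean_records_last_decade(noise + trend))
```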

Drought

Drought is another area where it is very easy to over-interpret statistics with no significant change, as in this recent New York Times opinion piece on the serious drought in California. The argument here goes that man-made climate change has not played “any appreciable role in the current California drought”, because there is no trend in average precipitation. But that again is a rather weak argument, because drought is far more complex than just being driven by average precipitation. It has a lot to do with water stored in soils, which gets lost faster in a warmer climate due to higher evaporation rates. California has just had its warmest winter on record. And the Palmer Drought Severity Index, a standard measure for drought, does show a significant trend towards more serious drought conditions in California.

The cost of extreme weather events

If an increase in extreme weather events due to global warming is hard to prove by statistics amongst all the noise, how much harder is it to demonstrate an increase in damage cost due to global warming? Very much harder! A number of confounding socio-economic factors clouds this issue which are very hard to quantify and disentangle. Some factors act to increase the damage, like larger property values in harm’s way. Some act to decrease it, like more solid buildings (whether from better building codes or simply as a result of increased wealth) and better early warnings. Thus it is not surprising that the literature on this subject overall gives inconclusive results. Some studies find significant damage trends after adjusting for GDP, some don’t, tempting some pundits to play cite-what-I-like. The fact that the increase in damage cost is about as large as the increase in GDP (as recently argued at FiveThirtyEight) is certainly no strong evidence against an effect of global warming on damage cost. Like the stranger’s dozen rolls of dice in the pub, one simply cannot tell from these data.

The emphasis on questionable dollar-cost estimates distracts from the real issue of global warming’s impact on us. The European heat wave of 2003 may not have destroyed any buildings – but it is well documented that it caused about 70,000 fatalities. This is the type of event for which the probability has increased by a factor of five due to global warming – and is likely to rise to a factor twelve over the next thirty years. Poor countries, whose inhabitants hardly contribute to global greenhouse gas emissions, are struggling to recover from “natural” disasters, like Pakistan from the 2010 floods or the Philippines and Vietnam from tropical storm Haiyan last year. The families who lost their belongings and loved ones in such events hardly register in the global dollar-cost tally.

It’s physics, stupid!

While statistical studies on extremes are plagued by signal-to-noise issues and only give unequivocal results in a few cases with good data (like for temperature extremes), we have another, more useful source of information: physics. For example, basic physics means that rising temperatures will drive sea levels up, as is in fact observed. Higher sea level to start from will clearly make a storm surge (like that of the storms Sandy and Haiyan) run up higher. By adding 1+1 we therefore know that sea-level rise is increasing the damage from storm surges – probably decades before this can be statistically proven with observational data.

There are many more physical linkages like this – reviewed in our recent paper A decade of weather extremes. A warmer atmosphere can hold more moisture, for example, which raises the risk of extreme rainfall events and the flooding they cause. Warmer sea surface temperatures drive up evaporation rates and enhance the moisture supply to tropical storms. And the latent heat of water vapor is a prime source of energy for the atmosphere. Jerry Meehl from NCAR therefore compares the effect of adding greenhouse gases to putting the weather on steroids.

Yesterday the World Meteorological Organisation published its Annual Statement on the Climate, finding that “2013 once again demonstrated the dramatic impact of droughts, heat waves, floods and tropical cyclones on people and property in all parts of the planet” and that “many of the extreme events of 2013 were consistent with what we would expect as a result of human-induced climate change.”

With good physical reasons to expect the dice are loaded, we should not fool ourselves with reassuring-looking but uninformative statistics. Some statistics show significant changes – but many are simply too noisy to show anything. It would be foolish to just play on until the loading of the dice finally becomes evident even in highly noisy statistics. By then we will have paid a high price for our complacency.

 

Postscript (29 March):

The Huffington Post has the story of the letters that Roger Pielke sent to two leading climate scientists, perceived by them as threatening, after they criticised his article: FiveThirtyEight Apologizes On Behalf Of Controversial Climate Science Writer. According to the Huffington Post, Pielke wrote to Kevin Trenberth and his bosses:

Once again, I am formally asking you for a public correction and apology. If that is not forthcoming I will be pursuing this further. More generally, in the future how about we agree to disagree over scientific topics like gentlemen?

Pielke using the word “gentlemen” struck me as particularly ironic.

How gentlemanly is it that on his blog he falsely accused us of cherry-picking the last 100 years of data rather than using the full available 130 years in our PNAS paper Increase of extreme events in a warming world, even though we clearly say in the paper that our conclusion is based on the full data series?

How gentlemanly is it that he falsely claims “Rahmstorf confirms my critique (see the thread), namely, they used 1910-2009 trends as the basis for calculating 1880-2009 exceedence probabilities,” when I have done nothing of the sort?

How gentlemanly is it that to this day, in a second update to his original article, he claims on his website: “The RC11 methodology does not make any use of data prior to 1910 insofar as the results are concerned (despite suggestions to the contrary in the paper).” This is a very serious allegation for a scientist, namely that we mislead or deceive in our paper (some colleagues have interpreted this as an allegation of scientific fraud). This allegation is completely unsubstantiated by Pielke, and of course it is wrong.

We did not respond with a threatening letter – not our style. Rather, we published a simple statistics tutorial together with our data and computer code, hoping that in this way Pielke could understand and replicate our results. But until this day we have not received any apology for his false allegations.

Our paper showed that the climatic warming observed in Moscow particularly since 1980 greatly increased the chances of breaking the previous July temperature record (set in 1938) there. We concluded:

For July temperature in Moscow, we estimate that the local warming trend has increased the number of records expected in the past decade fivefold, which implies an approximate 80% probability that the 2010 July heat record would not have occurred without climate warming.

Pielke apparently did not understand why the temperatures before 1910 hardly affect this conclusion (in fact increasing the probability from 78% to 80%), and that the linear trend from 1880 or 1910 is not a useful predictor for this probability of breaking a record. This is why we decomposed the temperature data into a slow, non-linear trend line (shown here) and a stochastic component – a standard procedure that even makes it onto the cover picture of a data analysis textbook, as well as being described in a climate time series analysis textbook. (Pielke ridicules this method as “unconventional”.)

He gentlemanly writes about our paper:

That some climate scientists are playing games in their research, perhaps to get media attention in the larger battle over climate politics, is no longer a surprise. But when they use such games to try to discredit serious research, then the climate science community has a much, much deeper problem.

His praise of “serious research” by the way refers to a paper that claimed “a primarily natural cause for the Russian heat wave” and “that it is very unlikely that warming attributable to increasing greenhouse gas concentrations contributed substantially to the magnitude of this heat wave.” (See also the graph above.)

 

Update (1 April):

Top hurricane expert Kerry Emanuel has now published a very good response to Pielke at FiveThirtyEight, making a number of the same points as I do above. He uses a better analogy than my dice example though, writing:

Suppose observations showed conclusively that the bear population in a particular forest had recently doubled. What would we think of someone who, knowing this, would nevertheless take no extra precautions in walking in the woods unless and until he saw a significant upward trend in the rate at which his neighbors were being mauled by bears?

The doubling of the bear population refers to the increase in hurricane power in the Atlantic which he showed in his Nature article of 2005 – an updated graph of his data is shown below, from our Nature Climate Change paper A decade of weather extremes.

Emanuel_Atlantic_PDI

 

Related posts:

Extremely hot

On record-breaking extremes

The Moscow warming hole

 


[i] For the math-minded: if a dice has a probability of 1/n to roll a six (a normal dice has n=6) and you roll it k times, the probability p to find m sixes is p = k!/[(k−m)!·m!] × (n−1)^(k−m) / n^k.

Author: "stefan" Tags: "Climate Science, Communicating Climate, ..."
Comments Send by mail Print  Save  Delicious 
Date: Saturday, 22 Mar 2014 17:06

XKCD, the brilliant and hilarious on-line comic, attempts to answer the question

How much CO2 is contained in the world’s stock of bottled fizzy drinks? How much soda would be needed to bring atmospheric CO2 back to preindustrial levels?

The answer is, enough to cover the Earth with 10 layers of soda cans. However, the comic misses a factor of about two, which would arise from the ocean. The oceans have been taking up carbon throughout the industrial era, as have some parts of the land surface biosphere. The ocean contains about half of the carbon we’ve ever released from fossil fuels. We’ve also cut down a lot of trees, which has been more-or-less compensated for by uptake into other parts of the land biosphere. So as a fraction of our total carbon footprint (fuels + trees) the oceans contain about a third.

At any rate, the oceans are acting as a CO2 buffer, meaning that they absorb CO2 in a way that limits the change in atmospheric concentration. If we suddenly pulled atmospheric CO2 back down to 280 ppm (by putting it all in cans of soda perhaps), the oceans would work in the opposite direction, buffering our present-day higher concentration by giving up CO2. The land biosphere is kind of a loose cannon in the carbon cycle; it is hard to predict what it will do.

Ten layers of soda cans covering the whole earth sounds like a lot. But most of a soda can is soda, rather than CO2. Here’s another statistic: If the CO2 in the atmosphere were to freeze out as dry ice depositing on the ground, the dry ice layer would only be about 7 millimeters thick. I guess cans of soda pop might not be the most efficient or economical means of CO2 sequestration. For a better option, look to saline aquifers, which are porous geological formations containing salty water that no one would want to drink or irrigate with anyway. CO2 at high pressure forms a liquid, then ultimately reacts with igneous rocks to form CaCO3.

Further Reading

Tans, Pieter: An accounting of the observed increase in oceanic and atmospheric CO2 and an outlook for the future. Oceanography 22(4), 26-35, 2009.

Carbon dioxide capture and storage IPCC Report, 2005

Author: "david" Tags: "Climate Science"
Comments Send by mail Print  Save  Delicious 
Date: Thursday, 13 Mar 2014 13:57

I’m writing this post to see if our audience can help out with a challenge: Can we collectively produce some coherent, properly referenced, open-source, scalable graphics of global temperature history that will be accessible and clear enough that we can effectively out-compete the myriad inaccurate and misleading pictures that continually do the rounds on social media?

Bad graphs

One of the most common fallacies in climate is the notion that, because the climate was hotter than now in the Eocene or Cretaceous or Devonian periods, we should have no concern for current global warming. Often this is combined with an implication that mainstream scientists are somehow unaware of these warmer periods (despite many of us having written multiple papers on previous warm climates). This is fallacious on multiple grounds, not least because everyone (including IPCC) has been discussing these periods for ages. Additionally, we know that sea levels during those peak warm periods were some 80 meters higher than today, and that impacts of the current global warming are going to be felt by societies and existing ecosystems that are adapted for Holocene climates – not climates 100 million years ago.

In making this point the most common graph that gets used is one originally put online by “Monte Hieb” on this website. Over the years, the graphic has changed slightly


Monte Hieb temperature/CO2 schematics

(versions courtesy of the wayback machine), but the essential points have remained the same. The ‘temperature’ record is a hand-drawn schematic derived from the work of Chris Scotese, and the CO2 graph is from a model that uses tectonic and chemical weathering histories to estimate CO2 levels (Berner 1994; Berner and Kothavala, 2001). In neither case is there an abundance of measured data.

The original Scotese renderings are also available (again, earlier versions via the wayback machine):


Scotese reconstructions

Scotese is an expert in reconstructions of continental positions through time and in creating his ‘temperature reconstruction’ he is basically following an old-fashioned idea (best exemplified by Frakes et al’s 1992 textbook) that the planet has two long-term stable equilibria (‘warm’ or ‘cool’) which it has oscillated between over geologic history. This kind of heuristic reconstruction comes from the qualitative geological record which gives indications of glaciations and hothouses, but is not really adequate for quantitative reconstructions of global mean temperatures. Over the last few decades, much better geochemical proxy compilations with better dating have appeared (for instance, Royer et al (2004)) and the idea that there are only two long-term climate states has long fallen by the wayside.

However, since this graphic has long been a favorite of the climate dismissives, many different versions do the rounds, mostly forwarded by people who have no idea of the provenance of the image or the lack of underlying data, or the updates that have occurred. Indeed, the 2004 version is the most common, having been given a boost by Monckton in 2008 and many others. Most recently, Patrick Moore declared that this was his favorite graph.

Better graphs

While more realistic graphs of temperature and CO2 histories will not prevent the basic fallacy we started discussing from being propagated, I think people should be encouraged to use actual data to make their points so that at least rebuttals of any logical fallacies wouldn’t have to waste time arguing about the underlying data. Plus it is so much better to have figures that don’t need a week to decipher (see some more principles at Betterfigures.org).

Some better examples of long term climate change graphics do exist. This one from Veizer et al (2000) for instance (as rendered by Robert Rohde):



Phanerozoic Climate Change

IPCC AR4 made a collation for the Cenozoic (65 million years ago to present):



IPCC AR4 Fig 6.1

and some editors at Wikipedia have made an attempt to produce a complete record for the Phanerozoic:



Wikipedia multi-period collation

But these collations are imperfect in many ways. On the last figure the time axis is a rather confusing mix of linear segments and logarithmic scaling, there is no calibration during overlap periods, and the scaling and baselining of the individual, differently sourced data is a little ad hoc. Wikipedia has figures for other time periods that have not been updated in years and treatment of uncertainties is haphazard (many originally from GlobalWarmingArt).

I think this could all be done better. However, creating good graphics takes time and some skill, especially when the sources of data are so disparate. So this might be usefully done using some crowd-sourcing – where we collectively gather the data that we can find, process it so that we have clean data, discuss ways to fit it together, and try out different plotting styles. The goal would be to come up with a set of coherent up-to-date (and updatable) figures that could become a new standard for representing the temperature history of the planet. Thus…

The world temperature history challenge

The challenge comes in three parts:

  1. Finding suitable data
  2. Combining different data sets appropriately
  3. Graphically rendering the data

Each part requires work which could be spread widely across the participants. I have made a start on collating links to suitable data sets, and this can both be expanded upon and consolidated.

Period | Reference | Data download
0-600 Mya | Veizer et al (2000), Royer et al (2004) (updated Royer (2014)) | Veizer d18O, Royer04 Temp, Royer14 CO2
0-65 Mya | Zachos et al (2008), Hansen et al (2010) | Zachos/Hansen
0-5.3 Mya | Lisiecki and Raymo (2005) | LR04 Stack
0-800 kya | EPICA Dome C | Temperature Reconstruction
0-125 kya | NGRIP/Antarctic analog? | NGRIP 50yr
0-12 kya | Marcott et al (2013) | MEA12 stack (xls)
0-2 kya | Mann et al (2008), Ljungqvist (2010) | MEA08 EIV, Ljungqvist10
1880-2013 CE | GISTEMP | GISTEMP LOTI
1850-2013 CE | HadCRUT4 | HadCRUT4 Global annual average, Cowtan&Way (infilled)
1850-2013 CE | Berkeley Earth | Land+Ocean annual mean

Combining these data is certainly a challenge, and there are multiple approaches that could be used, ranging from the very simple to the very complex. More subtly, the uncertainties also need to be properly combined. Issues range from temporal and spatial coverage and time-dependent corrections to d18O for long-term geologic processes or ice volume, to dating uncertainty, etc. A minimal sketch of the simplest kind of splice is shown below.
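The sketch below shows only the "very simple" end of the spectrum: re-baselining one series so that it agrees with another over their common interval, which addresses the lack of calibration during overlap periods mentioned earlier. It ignores uncertainties and dating issues entirely, and the example data and names are hypothetical.

```python
# Re-baseline one temperature series against another over their overlap.
import numpy as np

def rebaseline(t_ref, x_ref, t_other, x_other):
    """Shift x_other so its mean matches x_ref over the overlapping time span."""
    lo, hi = max(t_ref.min(), t_other.min()), min(t_ref.max(), t_other.max())
    in_ref   = (t_ref >= lo) & (t_ref <= hi)
    in_other = (t_other >= lo) & (t_other <= hi)
    if not (in_ref.any() and in_other.any()):
        raise ValueError("series do not overlap")
    return x_other + (x_ref[in_ref].mean() - x_other[in_other].mean())

# Hypothetical example: align a decadal-resolution reconstruction to an annual instrumental record
rng = np.random.default_rng(2)
t_recon, x_recon = np.arange(0, 2001, 10), rng.normal(-0.3, 0.2, 201)
t_inst = np.arange(1850, 2014)
x_inst = 0.005 * (t_inst - 1850) + rng.normal(0, 0.1, t_inst.size)

x_recon_aligned = rebaseline(t_inst, x_inst, t_recon, x_recon)
```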

Finally, rendering the graphics calls for additional skills – not least so that the different sources of data are clear, that the views over different timescales are coherent, and that the graphics are in the Wiki-standard SVG format (this site can be used for conversion from pdf or postscript).

Suggestions for other data sets to consider, issues of calibration and uncertainty and trial efforts are all welcome in the comments. If we make some collective progress, I’ll put up a new post describing the finished product(s). Who knows, you folks might even write a paper…

This post was inspired by a twitter conversation with Sou from Bundunga, and some of the initial data links came via Robert Rohde (of Global Warming Art and now Berkeley Earth) and Dana Royer.

References

  1. R.A. Berner, "GEOCARB II; a revised model of atmospheric CO 2 over Phanerozoic time", American Journal of Science, vol. 294, pp. 56-91, 1994. http://dx.doi.org/10.2475/ajs.294.1.56
  2. R.A. Berner, and Z. Kothavala, "GEOCARB III: A revised model of atmospheric CO2 over Phanerozoic time", American Journal of Science, vol. 301, pp. 182-204, 2001. http://dx.doi.org/10.2475/ajs.301.2.182
  3. J. Veizer, Y. Godderis, and L.M. François, "Evidence for decoupling of atmospheric CO2 and global climate during the Phanerozoic eon", Nature, vol. 408, pp. 698-701, 2000. http://dx.doi.org/10.1038/35047044
  4. D. Royer, "Atmospheric CO2 and O2 During the Phanerozoic: Tools, Patterns, and Impacts", Treatise on Geochemistry, pp. 251-267, 2014. http://dx.doi.org/10.1016/B978-0-08-095975-7.01311-5
  5. L.E. Lisiecki, and M.E. Raymo, " A Pliocene-Pleistocene stack of 57 globally distributed benthic δ 18 O records ", Paleoceanography, vol. 20, pp. n/a-n/a, 2005. http://dx.doi.org/10.1029/2004PA001071
  6. S.A. Marcott, J.D. Shakun, P.U. Clark, and A.C. Mix, "A Reconstruction of Regional and Global Temperature for the Past 11,300 Years", Science, vol. 339, pp. 1198-1201, 2013. http://dx.doi.org/10.1126/science.1228026
  7. M.E. Mann, Z. Zhang, M.K. Hughes, R.S. Bradley, S.K. Miller, S. Rutherford, and F. Ni, "Proxy-based reconstructions of hemispheric and global surface temperature variations over the past two millennia", Proceedings of the National Academy of Sciences, vol. 105, pp. 13252-13257, 2008. http://dx.doi.org/10.1073/pnas.0805721105
  8. F.C. Ljungqvist, "A new reconstruction of temperature variability in the extra-tropical Northern Hemisphere during the last two millennia", Geografiska Annaler: Series A, Physical Geography, vol. 92, pp. 339-351, 2010. http://dx.doi.org/10.1111/j.1468-0459.2010.00399.x
Author: "gavin" Tags: "Climate Science, Communicating Climate, ..."
Comments Send by mail Print  Save  Delicious 
Date: Friday, 07 Mar 2014 15:51

I am always interested in non-traditional data sets that can shed some light on climate changes. Ones that I’ve discussed previously are the frequency of closing of the Thames Barrier and the number of vineyards in England. With the exceptional warmth in Alaska last month (which of course was coupled with colder temperatures elsewhere), I was reminded of another one, the Nenana Ice Classic.

For those that don’t know what the ‘Classic’ is, it is a lottery competition that has been running since 1917 to guess the date on which the Nenana river ice breaks up in the spring. The Nenana river is outside of Fairbanks, Alaska and can be relied on to freeze over every year. The locals put up a tripod on the ice, and when the ice breaks up in the spring, the tripod gets swept away. The closest estimate to the exact time this happens wins the lottery, which can have a quite substantial pot.

Due to the cold spring in Alaska last year, the ice break up date was the latest since 1917, consistent with the spring temperature anomaly state-wide being one of the coldest on record (unsurprisingly the Nenana break up date is quite closely correlated to spring Alaskan temperatures). This year is shaping up to be quite warm (though current temperatures in Nenana (as of March 7) are still quite brisk!).

Since there is now an almost century-long record of these break up dates, it makes sense to look at them as potential indicators of climate change (and interannual variability). The paper by Sagarin and Micheli (2001) was perhaps the first such study, and it has been alluded to many times since (for instance, in the Wall Street Journal and Physics Today in 2008).

The figure below shows the break up date in terms of days after a nominal March 21, or more precisely time from the vernal equinox (the small correction is so that the data don’t get confused by non-climatic calendar issues). The long term linear trend (which is negative and has a slope of roughly 6 days per century) indicates that on average the break up dates have been coming earlier in the season. This is clear despite a lot of year-to-year variability:



Figure: Break up dates at Nenana in Julian days (either from a nominal March 21 (JD-80), or specifically tied to the Vernal Equinox). Linear trend in the VE-corrected data is ~6 days/century (1917-2013, ±4 days/century, 95% range).

In the 2008 WSJ article Martin Jeffries, a local geophysicist, said:

The Nenana Ice Classic is a pretty good proxy for climate change in the 20th century.

And indeed it is. The break-up dates are strongly correlated to regional spring temperatures, which have warmed over the century, tracking the Nenana trend. But as with the cool weather January 2014 in parts of the lower 48, or warm weather in Europe or Alaska, the expected very large variability in winter weather can be relied on to produce outliers on a regular basis.

Given that year-to-year variability, it is predictable that whenever the annual result is above trend, it often gets cherry-picked to suggest that climate change is not happening (despite the persistent long-term trend). There are therefore no prizes for guessing which years’ results got a lot of attention from the ‘climate dismissives’*. This is the same phenomenon that happens every winter whenever there is some cold weather or snow somewhere. Indeed, it is so predictable** that it even gets its own xkcd cartoon:



(Climate data sourced from Climate Central).
The fact remains that winters have been getting warmer on average. While scientists are very interested in potential influences on the variability (whether from volcanoes, solar effects, greenhouse gases, or Arctic sea ice loss), it remains the case that this is much harder and more uncertain than attributing trends in the mean, which as should be clear, are much more robust. As yet there is no truly convincing evidence that any change in variance has been detected, though there are a lot of ideas out there, and some very interesting discussions (see for instance, Francis and Vavrus (2012) and Barnes (2013)).

For fun, I calculated some of the odds (Monte-Carlo simulations using observed mean, a distribution of trends based on the linear fit and the standard deviation of the residuals). This suggests that a date as late as May 20 (as in 2013) is very unexpected even without any climate trends (<0.7%) and even more so with (<0.2%), but that the odds of a date before April 29 have more than doubled (from 10% to 22%) with the trend. The most favored date is May 3rd (with no trend it would have been May 6th), but the odds of the break-up happening in that single 24 hour period are only around 1 in 14.
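A sketch of this kind of Monte Carlo is given below. The long-term mean date, trend and residual scatter are stand-in values rather than the actual fitted numbers, so the percentages will only roughly match those quoted above.

```python
# Monte Carlo sketch: odds of early/late break-up dates with and without a trend.
import numpy as np

rng = np.random.default_rng(42)
N = 500_000

mean_doy    = 126.0                             # ~May 6, long-term mean break-up day of year (assumed)
resid_sigma = 6.0                               # residual scatter in days (assumed)
trend       = rng.normal(-6.0, 2.0, N) / 100.0  # days/year, drawn as ~ -6 +/- 2 days per century

years_since_midpoint = 2014 - 1965              # record midpoint taken as roughly 1965
no_trend   = mean_doy + rng.normal(0, resid_sigma, N)
with_trend = mean_doy + trend * years_since_midpoint + rng.normal(0, resid_sigma, N)

may20, apr29 = 140, 119                         # days of year
for label, x in [("no trend", no_trend), ("with trend", with_trend)]:
    print(f"{label}: P(break-up on/after May 20) = {np.mean(x >= may20):.1%}, "
          f"P(before Apr 29) = {np.mean(x < apr29):.1%}")
```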

So, the Nenana Ice Classic – unlike the other two examples I mentioned in the opening paragraph – does appear to be a useful climate metric. That isn’t to say that every year is going to follow the long-term trend (clearly it doesn’t), but you’d probably want to factor that in to (ever so slightly) improve your odds of winning.

* Yup. 2001, after 2008, 2011, and 2013.
** It is so predictable, I am thinking about opening a derivative market on whether this year’s Nenana result will get mentioned.

References

  1. R. Sagarin, "Climate Change in Nontraditional Data Sets", Science, vol. 294, pp. 811-811, 2001. http://dx.doi.org/10.1126/science.1064218
  2. J.A. Francis, and S.J. Vavrus, "Evidence linking Arctic amplification to extreme weather in mid-latitudes", Geophysical Research Letters, vol. 39, pp. n/a-n/a, 2012. http://dx.doi.org/10.1029/2012GL051000
  3. E.A. Barnes, "Revisiting the evidence linking Arctic amplification to extreme weather in midlatitudes", Geophysical Research Letters, pp. n/a-n/a, 2013. http://dx.doi.org/10.1002/grl.50880
Author: "gavin" Tags: "Climate impacts, Climate Science, Instru..."
Comments Send by mail Print  Save  Delicious 
Date: Wednesday, 05 Mar 2014 13:14

Guest commentary from Zeke Hausfather and Robert Rohde

Daily temperature data is an important tool to help measure changes in extremes like heat waves and cold spells. To date, only raw quality controlled (but not homogenized) daily temperature data has been available through GHCN-Daily and similar sources. Using this data is problematic when looking at long-term trends, as localized effects like station moves, time of observation changes, and instrument changes can introduce significant biases.

For example, if you were studying the history of extreme heat in Chicago, you would find a slew of days in the late 1930s and early 1940s where the station currently at the Chicago O’Hare airport reported daily max temperatures above 45 degrees C (113 F). It turns out that, prior to the airport’s construction, the station now associated with the airport was on the top of a black roofed building closer to the city. This is a common occurrence for stations in the U.S., where many stations were moved from city cores to newly constructed airports or wastewater treatment plants in the 1940s. Using the raw data without correcting for these sorts of bias would not be particularly helpful in understanding changes in extremes.




Berkeley Earth has newly released a homogenized daily temperature field, built as a refinement upon our monthly temperature field using similar techniques. In constructing the monthly temperature field, we identified inhomogeneities in station time series caused by biasing events such as station moves and instrument changes, and measured their impact. The daily analysis begins by applying the same set of inhomogeneity breakpoints to the corresponding daily station time series.

Each daily time series is transformed into a series of temperature anomalies by subtracting from each daily observation the corresponding monthly average at the same station. These daily anomaly series are then combined using similar mathematics to our monthly process (e.g. Kriging), but with an empirically determined correlation vs. distance function that falls off more rapidly and accounts for the more localized nature of daily weather fluctuations. For each day, the resulting daily temperature anomaly field is then added to the corresponding monthly temperature field to create an overall daily temperature field. The resulting daily temperature field captures the large day-to-day fluctuations in weather but also preserves the long-term characteristics already present in the monthly field.

As there are substantially more monthly records than daily records in the early periods (e.g. 1880-1950), treating the daily data as a refinement to the monthly data allows us to get maximum utility from the monthly data when determining long-term trends.  Additionally, performing the Kriging step on daily anomaly fields simplifies the computation in a way that makes it much more computationally tractable.
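To make the two-step construction concrete, here is a minimal sketch of the idea under simplified assumptions: daily anomalies are formed relative to each station's monthly mean, interpolated onto a grid, and the monthly field is added back. The interpolation below uses a simple Gaussian-kernel weighting as a stand-in for the actual Kriging with an empirically fitted correlation-vs-distance function, and all array layouts are assumptions for illustration only.

```python
import numpy as np

def daily_anomalies(daily_obs, monthly_means):
    """Daily anomaly = daily observation minus that station's monthly mean.

    daily_obs, monthly_means: arrays of shape (n_days, n_stations), with the
    monthly mean repeated for every day of the corresponding month."""
    return daily_obs - monthly_means

def interpolate_anomalies(anom, stn_lonlat, grid_lonlat, length_scale_km=400.0):
    """Stand-in for the Kriging step: Gaussian-kernel weighting of station
    anomalies onto grid points, with a short length scale reflecting the
    localized nature of daily weather fluctuations.

    stn_lonlat, grid_lonlat: arrays of (lon, lat) pairs in degrees."""
    R = 6371.0  # Earth radius, km
    lat_g = np.radians(grid_lonlat[:, 1])[:, None]
    lat_s = np.radians(stn_lonlat[:, 1])[None, :]
    dlon = np.radians(grid_lonlat[:, 0])[:, None] - np.radians(stn_lonlat[:, 0])[None, :]
    cosang = np.sin(lat_g) * np.sin(lat_s) + np.cos(lat_g) * np.cos(lat_s) * np.cos(dlon)
    dist = R * np.arccos(np.clip(cosang, -1.0, 1.0))   # (n_grid, n_stations), km
    w = np.exp(-(dist / length_scale_km) ** 2)
    w /= w.sum(axis=1, keepdims=True)
    return anom @ w.T                                   # (n_days, n_grid)

# daily field = interpolated daily anomalies + monthly temperature field:
# daily_field = interpolate_anomalies(daily_anomalies(obs, monthly_at_stations),
#                                     stn_lonlat, grid_lonlat) + monthly_field
```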

You can find the homogenized gridded 1-degree latitude by 1-degree longitude daily temperature data here in NetCDF format (though note that a single daily series like TMin is ~4.2 GB). We will be releasing individual homogenized station series in the near future. Videos of daily absolute temperatures are also available.

Climate Data Visualization with Google Maps Engine

Through a partnership with Google we’ve created a number of different interactive climate maps using their new Maps Engine. These include maps of temperature trends from different periods to present (1810, 1860, 1910, 1960, 1990), maps of record high and low temperatures, and other interesting climate aspects like average temperature, daily temperature range, seasonal temperature range, and temperature variability.

These maps utilize a number of neat features of Google’s Maps Engine. They dynamically update as you zoom in, changing both the contour lines shown and the points of interest. For example, in the 1960-present trend map shown above, you can click on a point to show the regional climate summary for each location. As you zoom in, more clickable points appear. You can also enter a specific address in the search bar and see aspects of the climate at that location.

We also have a map of all ~40,000 stations in our database, with markers for each that show the homogenized record for that station when clicked. These highlight all the detected breakpoints, and show how the station record compares to the regional expectation once breakpoints have been corrected. You can also click on the image to get to a web page for each station that shows the raw data as well as various statistics about the station.

New Global Temperature Series



Berkeley Earth has a new global temperature series available. This was created by combining our land record with a Kriged ocean time series using data from HadSST3. The results fit quite well with other published series, and are closest to the new estimates by Cowtan and Way. This is somewhat unsurprising, as both of us use HadSST3 as our ocean series and use Kriging for spatial interpolation.

If we zoom into the period since 1979, both Berkeley and Cowtan and Way have somewhat higher trends than other series over the last decade. This is due mainly to the coverage over the Arctic (note that we use air temperature over ice instead of sea temperature under ice for areas of the world with seasonal sea ice coverage).

Author: "group" Tags: "Climate Science, Instrumental Record"
Date: Tuesday, 04 Mar 2014 15:35

There has been a veritable deluge of new papers this month related to recent trends in surface temperature. There are analyses of the CMIP5 ensemble, new model runs, analyses of complementary observational data, attempts at reconciliation, all the way to commentaries on how the topic has been covered in the media and on Twitter. We will attempt to bring the highlights together here. As background, it is worth reading our previous discussions, along with pieces by Simon Donner and Tamino, to help put in context what is being discussed here.

The papers and commentaries address multiple aspects of recent trends: the climate drivers over recent decades, the internal variability of the system, new analyses and model-observation comparisons – much as we suggested would be the case in any discussions of model-observation mismatches last year. We will take each in turn:

Climate drivers

Two papers (which I was an author on) are focussed mainly on examining the impact of updated forcings on the temperature: Santer et al (2014) ($) looks in detail at the impact of small volcanoes post-2000 on the vertical structure of temperature changes, while the commentary by Schmidt et al (2014) (OA, with registration) also updates the solar, aerosol and GHG forcing to estimate what the CMIP5 ensemble would have looked like if it had used this input data instead of the earlier estimates and initial forecasts (panel b in the figure).

Internal Variability

The contribution of internal variability to decadal trends is the focus of commentaries by Lisa Goddard (OA) and Martin Visbeck (OA). They focus on recent trends in ocean heat uptake, the Pacific Decadal Oscillation and the potential for initialised decadal predictions (such as Keenlyside et al) to capture these variations. These relate to both the earlier Kosaka and Xie and England et al papers, but are mostly reviews.
[Update: See also the Clement and DiNezio perspective in Science.]

New analyses

Following on from Kosaka and Xie, Fyfe and Gillett (2014) ($) show that the trends in the Eastern Pacific (1993-2012) are well outside the spread in the CMIP5 ensemble (see figure on right). This is consistent with the England et al results, though it is unclear to what extent mis-specifications in the forcing might have affected it. For instance, if the updates to volcanic forcings from Vernier et al, as used in the Santer and Schmidt papers, are correct, trends from 1993 (at the maximum post-Pinatubo cooling) will be too large.

As the Nature Geoscience editorial emphasizes, there is more to climate change than the global mean temperature anomaly, and the paper by Seneviratne et al ($) shows that trends in extreme temperatures over land have continued apace throughout this time period.

Media, outreach and editorial response

Finally, there are some commentaries that look at the impact these questions have had on the wider public discussion: Hawkins et al (OA, including Twitter stalwarts Tamsin Edwards and Doug McNeall) discuss the opportunities that interest in recent trends gives scientists to discuss science on social media (also blogged about here), while Max Boykoff (OA) focuses on the framing of the issue in traditional media. Both Nature Clim. Chg. (OA) and Nature Geoscience (OA) have interesting editorials.

Overall, this is a great set of overviews of the issues – observations, comparisons, and modelling – and leads to some very specific directions for future research. There is unlikely to be any pause in that.

References

  1. B.D. Santer, C. Bonfils, J.F. Painter, M.D. Zelinka, C. Mears, S. Solomon, G.A. Schmidt, J.C. Fyfe, J.N.S. Cole, L. Nazarenko, K.E. Taylor, and F.J. Wentz, "Volcanic contribution to decadal changes in tropospheric temperature", Nature Geosci, vol. 7, pp. 185-189, 2014. http://dx.doi.org/10.1038/ngeo2098
  2. G.A. Schmidt, D.T. Shindell, and K. Tsigaridis, "Reconciling warming trends", Nature Geosci, vol. 7, pp. 158-160, 2014. http://dx.doi.org/10.1038/ngeo2105
  3. L. Goddard, "Heat hide and seek", Nature Climate change, vol. 4, pp. 158-161, 2014. http://dx.doi.org/10.1038/nclimate2155
  4. M. Visbeck, "Bumpy path to a warmer world", Nature Geosci, vol. 7, pp. 160-161, 2014. http://dx.doi.org/10.1038/ngeo2104
  5. A. Clement, and P. DiNezio, "The Tropical Pacific Ocean--Back in the Driver's Seat?", Science, vol. 343, pp. 976-978, 2014. http://dx.doi.org/10.1126/science.1248115
  6. J.C. Fyfe, and N.P. Gillett, "Recent observed and simulated warming", Nature Climate change, vol. 4, pp. 150-151, 2014. http://dx.doi.org/10.1038/nclimate2111
  7. "Hiatus in context", Nature Geosci, vol. 7, pp. 157-157, 2014. http://dx.doi.org/10.1038/ngeo2116
  8. S.I. Seneviratne, M.G. Donat, B. Mueller, and L.V. Alexander, "No pause in the increase of hot temperature extremes", Nature Climate change, vol. 4, pp. 161-163, 2014. http://dx.doi.org/10.1038/nclimate2145
  9. E. Hawkins, T. Edwards, and D. McNeall, "Pause for thought", Nature Climate change, vol. 4, pp. 154-156, 2014. http://dx.doi.org/10.1038/nclimate2150
  10. M.T. Boykoff, "Media discourse on the climate slowdown", Nature Climate change, vol. 4, pp. 156-158, 2014. http://dx.doi.org/10.1038/nclimate2156
  11. "Scientist communicators", Nature Climate change, vol. 4, pp. 149-149, 2014. http://dx.doi.org/10.1038/nclimate2167
Author: "gavin" Tags: "Aerosols, Climate modelling, Climate Sci..."
Date: Monday, 03 Mar 2014 14:19

This month’s open thread.

Author: "group" Tags: "Climate Science, Open thread"
Date: Tuesday, 18 Feb 2014 03:03

A new paper in Nature Climate Change out this week by England and others joins a number of other recent papers seeking to understand the climate dynamics that have led to the so-called “slowdown” in global warming. As we and others have pointed out previously (e.g. here), the fact that global average temperatures can deviate for a decade or longer from the long term trend comes as no surprise. Moreover, it’s not even clear that the deviation has been as large as is commonly assumed (as discussed e.g. in the Cowtan and Way study earlier this year), and has little statistical significance in any case. Nevertheless, it’s still interesting, and there is much to be learned about the climate system from studying the details.

Several studies have shown that much of the excess heating of the planet due to the radiative imbalance from ever-increasing greenhouse gases has gone into the ocean, rather than the atmosphere (see e.g. Foster and Rahmstorf and Balmaseda et al.). In their new paper, England et al. show that this increased ocean heat uptake — which has occurred mostly in the tropical Pacific — is associated with an anomalous strengthening of the trade winds. Stronger trade winds push warm surface water towards the west, and bring cold deeper waters to the surface to replace them. This raises the thermocline (the boundary between warm surface water and cold deep water), and increases the amount of heat stored in the upper few hundred meters of the ocean. Indeed, this is what happens every time there is a major La Niña event, which is why it is globally cooler during La Niña years. One could think of the last ~15 years or so as a long term “La-Niña-like” anomaly (punctuated, of course, by actual El Niño events (like the exceptionally warm years 1998 and 2005) and La Niña events (like the relatively cool 2011)).

A very consistent understanding is thus emerging of the coupled ocean and atmosphere dynamics that have caused the recent decadal-scale departure from the longer-term global warming trend. That understanding suggests that the “slowdown” in warming is unlikely to continue, as England explains in his guest post, below. –Eric Steig

Guest commentary by Matthew England (UNSW)

For a long time now climatologists have been tracking the global average air temperature as a measure of planetary climate variability and trends, even though this metric reflects just a tiny fraction of Earth’s net energy or heat content. But it’s used widely because it’s the metric that enjoys the densest array of in situ observations. The problem of course is that this quantity has so many bumps and kinks, pauses and accelerations that predicting its year-to-year path is a big challenge. Over the last century no single forcing agent has a clearer influence than anthropogenic greenhouse gases, yet when zooming in to individual years or decades, modes of variability become the signal, not the noise. Despite these basics of climate physics, any slowdown in the overall temperature trend sees lobby groups falsely claim that global warming is over. Never mind that the globe – our planet – spans the oceans, atmosphere, land and ice systems in their entirety.

This was one of the motivations for our study out this week in Nature Climate Change (England et al., 2014). With the global-average surface air temperature (SAT) more-or-less steady since 2001, scientists have been seeking to explain the climate mechanics of the slowdown in warming seen in the observations during 2001-2013. One simple way to address this is to examine what is different about the recent decade compared to the preceding decade, when the global-mean SAT metric accelerated. This can be quantified via decade-mean differences, or via multi-decadal trends, which are roughly equivalent if the trends are more-or-less linear, or if the focus is on the low frequency changes.
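As a toy illustration of that equivalence, the sketch below compares a decade-mean difference with a 20-year least-squares trend on a synthetic, roughly linear series; the slope and noise level are arbitrary and chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1994, 2014)                                     # two decades: 1994-2003, 2004-2013
t = 0.01 * (years - years[0]) + rng.normal(0, 0.05, years.size)   # synthetic anomalies (degC)

# Decade-mean difference, converted to a rate: the two decade midpoints
# are 10 years apart, so divide by 10.
rate_from_decades = (t[10:].mean() - t[:10].mean()) / 10.0

# Ordinary least-squares trend over the full 20 years.
rate_from_trend = np.polyfit(years, t, 1)[0]

print(f"decade-mean difference: {rate_from_decades:.4f} degC/yr")
print(f"20-yr linear trend    : {rate_from_trend:.4f} degC/yr")
```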

A first look at multi-decadal trends over the past two decades (see below) shows a dramatic signature in the Pacific Ocean, with sea surface cooling over the east and central Pacific and warming in the west, extending into the subtropics. Sea-level records also reveal a massive trend across the Pacific, with the east declining and the west rising well above the global average. Basic physical oceanography immediately suggests a trade wind trend as the cause, since this helps pile warm water up in the west at the expense of the east. And sure enough, that is exactly what had occurred with the Pacific wind field.

A consistent picture has now emerged to explain the slowdown in global average SAT since 2001 compared to the rapid warming of the 1980s and 1990s: this includes the link between hiatus decades and the Interdecadal Pacific Oscillation, the enhanced ocean heat uptake in the Pacific (see previous posts) and the role of East Pacific cooling. All of these factors are consistent with a picture of strengthened trade winds, enhanced heat uptake in the western Pacific thermocline, and cooling in the east – as you can see in this schematic:

As our study set out to reconcile the emerging divide between observations and the multi-model mean across CMIP5 and CMIP3 simulations, we took a slightly different approach, although there are obvious parallels to Kosaka and Xie’s study assessing the impact of a cooler East Pacific. In particular, we incorporated the recent 20-year trend in trade winds into both an ocean and a climate model, to quantify its impact. It turns out that with this single perturbation, much of the ‘hiatus’ can be simulated. The slowdown in warming occurs as a combined result of both increased heat uptake in the Western Pacific Ocean, and increased cooling of the east and central Pacific (the latter leads to atmospheric teleconnections of reduced warming in other locations). We find that the heat content change within the ocean accounts for about half of the slowdown; the remaining half comes from the atmospheric teleconnections from the east Pacific.

Unfortunately, however, the hiatus looks likely to be temporary, with projections suggesting that when the trade winds return to normal strength, warming is set to be rapid (see below). This is because the recent accelerated heat uptake in the Pacific Ocean is by no means permanent; this is consistent with the shallow depths at which the excess heat can now be found, at the 100-300m layer just below the surface mixed layer that interacts with the atmosphere. [Ed: though see also Mike's commentary on this aspect of the paper]

Even if the excess heat fluxed into the ocean were longer-term, burying the heat deep in the ocean would not come without its consequences; ocean thermal expansion translates this directly into sea-level rise, with Western Pacific Island nations already acutely aware of this from the recent trends.

Our study addresses some important topics but also raises several new questions. For example, we find that climate models do not appear to capture the observed scale of multi-decadal variability in the Pacific – for instance, none reproduce the magnitude of the observed Pacific trade wind acceleration – the best the models can do is around half this magnitude. This raises the question of why this is the case: given the positive ocean-atmosphere feedbacks operating to drive these strengthened trade winds, the answer could lie in the ocean, the atmosphere, or both.

The study also discusses the unprecedented nature of the wind trends, and suggests that only around half of the trend can be explained by the IPO. So where does the other half come from?  The Indian Ocean is one possibility, given its recent rapid warming; but models capture this in greenhouse-gas-forced projections. What else might be accelerating the winds in the Pacific beyond what you’d expect to see from the underlying SST fields alone?

The study also points to the length of the wind trend as being crucial to the hiatus; arguing that anything much shorter, like a decadal wind trend, would not have resulted in nearly as much heat uptake by the ocean. This is related to the time-scale for ocean adjustment to wind forcing in the subtropics: in short it takes time to spin-up the ocean circulation response, and then more time to see this circulation inject a significant amount of heat into the ocean thermocline. Given the ocean’s inertia to change, what happens when the trade winds next weaken back to average values?  Does the subducted heat get mixed away before this can resurface, or does the heat find a way to return to the surface when the winds reverse?  Our initial work suggests the latter: when we forced the wind anomalies to abate, warming out of the hiatus was rapid, eventually recovering the warming that had paused during the hiatus. So this suggests that whenever the current wind trends reverse, warming will resume as projected, and in time the present “pause” will be long forgotten by the climate system. [Ed: see again Mike's piece for a discussion of an alternative hypothesis--namely, the possibility that a La Niña-like state is part of the response to anthropogenic forcing itself].

Of course, other factors could have also contributed to part of the recent slowdown in the globally averaged air temperature metric: increased aerosols, a solar minimum, and problems with missing data in the Arctic. Summing up all of the documented contributions to the hiatus, spanning ocean heat uptake, reduced radiation reaching Earth’s surface, and data gaps, climate scientists have probably accounted for the hiatus twice over. Of course each effect is not linearly additive, but even so, many experts are now asking: why hasn’t the past decade been one of considerable cooling in global mean air temperatures?  Or put another way, why isn’t the model-observed gap even wider?  One way to explain this is that the current generation of climate models may be too low in their climate sensitivity – an argument made recently by Sherwood et al in relation to unresolved cloud physics. This is a conclusion that would have seemed completely unexpected when analysts first noticed the model-observed divergence progressing over the past decade.

References

  1. G. Foster, and S. Rahmstorf, "Global temperature evolution 1979–2010", Environ. Res. Lett., vol. 6, pp. 044022, 2011. http://dx.doi.org/10.1088/1748-9326/6/4/044022
  2. M.A. Balmaseda, K.E. Trenberth, and E. Källén, "Distinctive climate signals in reanalysis of global ocean heat content", Geophysical Research Letters, vol. 40, pp. 1754-1759, 2013. http://dx.doi.org/10.1002/grl.50382
  3. M.H. England, S. McGregor, P. Spence, G.A. Meehl, A. Timmermann, W. Cai, A.S. Gupta, M.J. McPhaden, A. Purich, and A. Santoso, "Recent intensification of wind-driven circulation in the Pacific and the ongoing warming hiatus", Nature Climate change, vol. 4, pp. 222-227, 2014. http://dx.doi.org/10.1038/nclimate2106
  4. Y. Kosaka, and S. Xie, "Recent global-warming hiatus tied to equatorial Pacific surface cooling", Nature, vol. 501, pp. 403-407, 2013. http://dx.doi.org/10.1038/nature12534
Author: "group" Tags: "Climate modelling, Climate Science, El N..."
Date: Tuesday, 04 Feb 2014 14:44

A little late starting this month’s open thread – must be the weather…

Author: "group" Tags: "Climate Science, Open thread"
Date: Tuesday, 04 Feb 2014 12:31

Guest commentary by Tim Osborn and Phil Jones

The Climatic Research Unit (CRU) land surface air temperature data set, CRUTEM4, can now be explored using Google Earth. Access is via this portal together with instructions for using it (though it is quite intuitive).

We have published a short paper in Earth System Science Data (Osborn and Jones, 2014) to describe this new approach.

This is part of ongoing efforts to make our climate data as accessible and transparent as possible. The CRUTEM4 dataset is already freely available via the CRU and UK Met Office, including the full underlying database of weather station data. But accessing it through Google Earth will enhance:

  1. Traceability of how we construct the dataset, by showing the weather station data used to make each grid box temperature anomaly.
  2. Accessibility for teaching and research, by extracting grid box and weather station data without the need for programming.
  3. Identifying errors.  With ~6000 station records collected and collated by third parties there are bound to be errors.  If they are identified, they can be corrected. Note that the global temperature record is not greatly affected by changes to the input data (e.g. Figure 5 of Jones et al., 2012).

A walkthrough

The view when the KML file is first opened, with red/green shading to show grid boxes with CRUTEM4 data:

Navigate to a region of interest and click a shaded box to see the grid-box annual temperature anomaly:

Choose from links to view a larger annual image, a seasonal image, the grid-box data values (in CSV format for import into a spreadsheet) or choose “stations” to see the weather stations used:

Click a weather station pin to view the station annual temperature series, with links to larger annual and seasonal images, and to the station data values (again in CSV format for easy import into a spreadsheet):

We encourage you to try it for yourself – and to also read the open-access paper (Osborn and Jones, 2014) which describes the construction of CRUTEM4 in detail.
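If you do want to go beyond the spreadsheet, the downloaded CSV files are straightforward to handle programmatically. The sketch below is illustrative only: the filename and column layout are assumptions, so check the header of the file you actually download from the portal and adjust accordingly.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical filename and column order (year, annual anomaly, ...);
# the real CSV exported from the Google Earth interface may differ.
df = pd.read_csv("crutem4_gridbox_annual.csv")

year_col, anom_col = df.columns[0], df.columns[1]
df.plot(x=year_col, y=anom_col, legend=False)
plt.ylabel("Temperature anomaly (°C)")
plt.title("CRUTEM4 grid-box annual temperature anomaly")
plt.show()
```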

[Editor's note: Other portals for visualizations of station data exist from David Archer, GISTEMP and many National Weather Service websites.]

References

  1. T.J. Osborn, and P.D. Jones, "The CRUTEM4 land-surface air temperature data set: construction, previous versions and dissemination via Google Earth", Earth System Science Data, vol. 6, pp. 61-68, 2014. http://dx.doi.org/10.5194/essd-6-61-2014
  2. P.D. Jones, D.H. Lister, T.J. Osborn, C. Harpham, M. Salmon, and C.P. Morice, "Hemispheric and large-scale land-surface air temperature variations: An extensive revision and an update to 2010", J. Geophys. Res., vol. 117, 2012. http://dx.doi.org/10.1029/2011JD017139
Author: "group" Tags: "Climate Science, Instrumental Record"
Date: Saturday, 01 Feb 2014 17:34

Along with David’s online class, a number of new climate science Massive Open Online Courses (MOOCs) are now coming online.


A new online course from MIT, “Global Warming Science”, introduces the basic science underpinning our knowledge of the climate system, how climate has changed in the past, and how it may change in the future. The course focuses on the fundamental energy balance in the climate system, between incoming solar radiation and outgoing infrared radiation, and how this balance is affected by greenhouse gases. They also discuss physical processes that shape the climate, such as atmospheric and oceanic convection and large-scale circulation, solar variability, orbital mechanics, and aerosols, as well as the evidence for past and present climate change. Climate models of varying degrees of complexity are available for students to run – including a model of a single column of the Earth’s atmosphere, which includes many of the important elements of simulating climate change. Together, this range of topics forms the scientific basis for our understanding of anthropogenic (human-influenced) climate change.

The introduction video gives a flavour of the course, which is presented by Kerry Emanuel, Dan Cziczo and David McGee:

The course is geared toward students with some mathematical and scientific background, but does not require any prior knowledge of climate or atmospheric science. Classes begin on February 19th and run for 12 weeks. Students may simply audit the course, or complete problems sets and a final exam to receive a certificate of completion. The course is free, and one can register for it here.

There are other climate science courses available too:

Good studying!

Author: "gavin" Tags: "Climate modelling, Climate Science, Comm..."
Date: Monday, 27 Jan 2014 18:59

The global temperature data for 2013 are now published. 2010 and 2005 remain the warmest years since records began in the 19th Century. 1998 ranks third in two records, and in the analysis of Cowtan & Way, which interpolates the data-poor region in the Arctic with a better method, 2013 is warmer than 1998 (even though 1998 was a record El Nino year, and 2013 was neutral).

The end of January, when the temperature measurements of the previous year are in, is always the time to take a look at the global temperature trend. (And, as the Guardian aptly noted, also the time when the “climate science denialists feverishly yell [...] that global warming stopped in 1998.”) Here is the ranking of the warmest years in the four available data sets of the global near-surface temperatures (1):

Rank | NASA GISS | NOAA | HadCRUT4 | Cowtan & Way
1    | 2010      | 2010 | 2010     | 2010
2    | 2005      | 2005 | 2005     | 2005
3    | 2007      | 1998 | 1998     | 2007
4    | 2002      | 2013 | 2003     | 2009
5    | 1998      | 2003 | 2006     | 2013

New this year: for the first time there is a careful analysis of geographical data gaps – especially in the Arctic there’s a gaping hole – and their interpolation for the HadCRUT4 data. Thus there are now two surface temperature data sets with global coverage (the GISTEMP data from NASA have always filled gaps by interpolation). In these two data series 2007 is ranked 3rd. Their direct comparison looks like this:

Figure 1 Global temperature (annual values) in the data from NASA GISS (orange) and from Cowtan & Way (blue), i.e. HadCRUT4 with interpolated data gaps.

One can clearly see the extreme year 1998, which (thanks to the record El Niño) stands out above the long-term trend like no other year. But even taking this outlier year as the starting point, the linear trend 1998-2013 in all four data sets is positive. Also clearly visible is 2010 as the warmest year since records began, and the minima in the years 2008 and 2011/2012. But just as the peaks are getting higher, these minima are becoming less and less deep.

In these data curves I cannot see a particularly striking or significant current “warming pause”, even though the warming trend from 1998 is of course less than the long-term trend. Even in Nature, there was recently a (journalistic) contribution that in its introduction strongly overstated this alleged “hiatus”. It makes a good story that perhaps some cannot resist. (“Warming trend is somewhat reduced, but within the usual range of variation” simply does not make a good headline.)

The role of El Niño and La Niña

The recent slower warming is mainly explained by the fact that in recent years the La Niña state in the tropical Pacific prevailed, in which the eastern Pacific is cold and the ocean stores more heat (2). This is due to an increase in the trade winds that push water westward across the tropical Pacific, while in the east cold water from the depths comes to the surface (see last graph here). In addition, radiative forcing has recently increased more slowly (more on this in the analysis of Hansen et al. – definitely worth a read).

NASA shows the following graphic, where you can see that the warmer years tend to be those with an El Niño in the tropical Pacific (red years), while the particularly cool years are those with La Niña (blue years).

Figure 2 The GISS data, with El Niño and La Niña conditions highlighted. Neutral years like 2013 are gray. Source: NASA.

Quality of the interpolation

How good is the interpolation into regions not regularly covered by weather stations? It is, in any case, better than simply ignoring the gaps, as the HadCRUT and NOAA analyses have done so far. The truly global average is important, since only it is directly related to the energy balance of our planet and thus the radiative forcing by greenhouse gases. An average over just part of the globe is not. The Arctic has been warming disproportionately in the last ten to fifteen years.

But we only know how well the interpolation works thanks to the important work of Cowtan and Way. These colleagues have gone to the trouble of carefully validating their method. Although there are no permanent weather stations in the Arctic, there is intermittent data from buoys and from weather model reanalyses with which they could test their method. For the last few decades, Cowtan & Way also make use of satellite data (more on this in our article on underestimated warming). I therefore assume that the data from Cowtan & Way are the methodologically best estimate of the global mean temperature which we currently have. This correction is naturally small (less than a tenth of a degree) and hardly changes the long-term trend of global warming – but if you look at shorter periods of time, it can make a noticeable difference. The comparison with the uncorrected HadCRUT4 data looks like this:

Figure 3: Comparison of interpolated and non-interpolated HadCRUT4 data, as moving averages over 12 months. Source: Kevin Cowtan, University of York.

And here’s a look at the last years in detail:

Figure 4: The interpolated HadCRUT4 data (annual average) from 1970. Source: Kevin Cowtan, University of York.

Following this analysis, 2013 was thus even warmer than the record El Niño year 1998.

Conclusion

  • In all four data series of the global near-surface air temperature, the linear trend even from the extreme El Niño year 1998 is positive, i.e. shows continued warming, despite the choice of a warm outlier as the initial year.
  • In all four data series of the global near-surface air temperature, 2010 was the warmest year on record, followed by 2005.
  • The year 1998 is, at best, rank 3 – in what is currently the best data set, that of Cowtan & Way, 1998 is actually only ranked 7th. Even 2013 – a year without El Niño – is warmer there than 1998.

The German news site Spiegel Online presents these facts under the headline Warming of the air paused for 16 years (my translation). The headline of the NASA news release, NASA Finds 2013 Sustained Long-Term Climate Warming trend, is thus completely turned on its head.

This will not surprise anyone who has followed climate reporting of Der Spiegel in recent years. To the contrary – colleagues express their surprise publicly when a sensible article on the subject appears there. For years, Der Spiegel has acted as a gateway for dubious “climate skeptics” claims into the German media whilst trying to discredit top climate scientists (we’ve covered at least one example here).

Do Spiegel readers know more (as their advertising goes) – more than NASA, NOAA, Hadley Centre and the World Meteorological Organization WMO together? Or are they simply being taken for a ride for political reasons?

Footnotes

(1) In addition to the data of the near-surface temperatures, which are composed of measurements from weather stations and sea surface temperatures, there is also the microwave data from satellites, which can be used to estimate air temperatures in the troposphere at a few kilometers altitude. In the long-term climate trend since the beginning of satellite measurements in 1979, the tropospheric temperatures show a similar warming as the surface temperatures, but the short-term fluctuations in the troposphere are significantly different from those near the surface. For example, the El Niño peak in 1998 is about twice as high in the troposphere as in the surface data; see Foster and Rahmstorf 2011. In their trend from 1998, the two satellite series contradict each other: UAH shows +0.05°C per decade (a bit more than HadCRUT4), RSS shows -0.05°C per decade.

(2) Another graphic to illustrate the change between El Niño and La Niña: the Oceanic Niño Index ONI, the standard index of NOAA to describe the seesaw in the tropical Pacific.

Figure 5 The ONI index. The arrows added by me point to some globally warm or cool years (compare Figure 1 or 4). Source: NOAA.

Weblinks

Kevin Cowtan has a neat online trend calculator for all current global temperature data series.
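For reference, the core of such a trend calculation is just an ordinary least-squares fit; a minimal sketch is below. Note that it reports only a naive 2-sigma uncertainty, whereas a proper treatment (and, as far as I understand, the online calculator) also accounts for autocorrelation in the residuals, which widens the uncertainty range.

```python
import numpy as np

def decadal_trend(years, temps):
    """OLS trend and naive 2-sigma uncertainty, in degC per decade.
    A full analysis would also correct for autocorrelation in the residuals."""
    years = np.asarray(years, float)
    temps = np.asarray(temps, float)
    x = years - years.mean()                 # center the time axis
    slope, intercept = np.polyfit(x, temps, 1)
    resid = temps - (slope * x + intercept)
    se = np.sqrt((resid @ resid) / (len(x) - 2) / np.sum(x ** 2))
    return 10 * slope, 10 * 2 * se           # per decade

# Example with made-up annual anomalies for 1998-2013 (hypothetical input):
# trend, unc = decadal_trend(range(1998, 2014), anomalies)
```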

The global temperature jigsaw (an overview over the “pause” debate)
What ocean heating reveals about global warming

Author: "stefan" Tags: "Climate Science, Instrumental Record"
Date: Friday, 17 Jan 2014 23:17

Gavin provided a thoughtful commentary about the role of scientists as advocates in his RealClimate piece a few weeks ago.

I have weighed in with my own views on the matter in my op-ed today in this Sunday’s New York Times. And, as with Gavin, my own views have been greatly influenced and shaped by our sadly departed friend and colleague, Stephen Schneider. Those who were familiar with Steve will recognize his spirit and legacy in my commentary. A few excerpts are provided below:

THE overwhelming consensus among climate scientists is that human-caused climate change is happening. Yet a fringe minority of our populace clings to an irrational rejection of well-established science. This virulent strain of anti-science infects the halls of Congress, the pages of leading newspapers and what we see on TV, leading to the appearance of a debate where none should exist.

.
.

My colleague Stephen Schneider of Stanford University, who died in 2010, used to say that being a scientist-advocate is not an oxymoron. Just because we are scientists does not mean that we should check our citizenship at the door of a public meeting, he would explain. The New Republic once called him a “scientific pugilist” for advocating a forceful approach to global warming. But fighting for scientific truth and an informed debate is nothing to apologize for.

.
.

Our Department of Homeland Security has urged citizens to report anything dangerous they witness: “If you see something, say something.” We scientists are citizens, too, and, in climate change, we see a clear and present danger. The public is beginning to see the danger, too — Midwestern farmers struggling with drought, more damaging wildfires out West, and withering, record, summer heat across the country, while wondering about possible linkages between rapid Arctic warming and strange weather patterns, like the recent outbreak of Arctic air across much of the United States.

.
.

The piece ends on this note:

How will history judge us if we watch the threat unfold before our eyes, but fail to communicate the urgency of acting to avert potential disaster? How would I explain to the future children of my 8-year-old daughter that their grandfather saw the threat, but didn’t speak up in time?

Those are the stakes.

I would encourage interested readers to read the commentary in full at the New York Times website.

Constructive contributions are welcome in the comment section below :-)

Author: "mike" Tags: "Climate Science"
Date: Tuesday, 14 Jan 2014 12:57

Back in 2007 I wrote a post looking at the closures of the Thames Barrier since construction finished in 1983. Since then there have been another seven years of data*, and given that there was a spate of closures last week due to both river and tidal flooding, it seems a good time to revisit the topic.

The Barrier is in the Thames Estuary in the South-East UK and is raised whenever there is a risk of flooding in London proper. The Thames below Teddington weir is tidal with quite a wide range and so (the very real) risks are greatest at high tide. Risks are elevated for one of two reasons (and occasionally both) – either the river flow is elevated, and so a normal high tide would cause flooding (a ‘fluvial’ risk) or the tide itself is elevated (usually associated with a storm surge) (a ‘tidal’ risk). Both river floods and storm surges have a potential climate change component, and so the frequency of closure and the reasons why might be useful in assessing whether risks have changed in recent years. Countering that though, there have been non-climatic changes to how flooding risks are managed, improved hydrological modelling which gives better predictability, and local subsidence issues which will increase flooding risks but is not climatic in origin. Connecting flooding (or flood risks) to climate change is always complicated – as this recent Carbon Brief posting reminds us.

So how many closures have there been?

[Figure: number of Thames Barrier closures per year, by cause]

Red denotes the closures that are tidal in origin, grey is the total number of closures. Obviously tidal closures are dominant, but closures due to river flooding in 2001, 2003 and this season are very notable. However, for the reasons outlined above and discussed previously, the raw number of closures – despite having increased over the years – is not a clean indicator of climate changes. We can look a little deeper at the data though.
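A hedged sketch of how one might tally those closures by season and cause is below; the file name, column names, and the "reason" coding are assumptions made for illustration, not the actual format of the Environment Agency closure log.

```python
import pandas as pd

# Assumed layout of the closure log: one row per closure, with a date and a
# reason coded as "tidal" or "fluvial". The real EA file may differ.
closures = pd.read_csv("thames_barrier_closures.csv", parse_dates=["date"])

# Closures fall in winter seasons (Sep-Apr), so label each closure by the
# year in which its season started.
season = closures["date"].dt.year.where(closures["date"].dt.month >= 9,
                                        closures["date"].dt.year - 1)

counts = closures.groupby([season, "reason"]).size().unstack(fill_value=0)
print(counts)   # tidal vs fluvial closures per season, as in the chart above
```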

For each closure, there is a record of the high water level at Southend (at the mouth of the Estuary), the river flow at Teddington and how much of the high water level was due to a storm surge. These are plotted below for each of the closures. Note that closures come in clusters; all are during the winter months (Sep-Apr), and in cases where the river flow is high, closures can happen for multiple high tides in a row.

[Figure: high water level at Southend, river flow at Teddington, and storm surge contribution for each closure]

Looking at the changes in these causes is interesting. The highest storm surges were in 2007 and 1993 but there is no apparent trend. The river flow data are more enigmatic, with the peak flows post-2000 standing out, but since there is much better data on the history of the Thames flow, that data should be looked at instead to determine trends. There is also a summary description of the first 100 years of data at Teddington available.

The high water record is perhaps the most interesting and there seems to be an upward trend in the peak levels. However, sea level rise at Southend to 1983 was about 1.2 mm/year and in the 30 years since then, one would expect 3.5 to 5.0 cm more, which is much less than what one would deduce from the peaks in the figure. So, again, a deeper look into the more direct data is probably warranted.

I therefore conclude, as I did in 2007, that:

Thames Barrier closings tell a complicated story which mix climate information with management issues and are probably too erratic to be particularly meaningful – if you want to say something about global sea level, then look at the integrated picture from satellites and tide gauges. But it is a good illustration of adaptive measures that are, and will increasingly be, needed to deal with ongoing climate change.

* Updated data thanks to the UK Environment Agency (via @AlanBarrierEA @EnvAgency) with a helping hand from Richard Betts.

Author: "gavin" Tags: "Climate impacts, Climate Science"
Date: Thursday, 02 Jan 2014 17:48

by Michael E. Mann and Gavin Schmidt

This time last year we gave an overview of what different methods of assessing climate sensitivity were giving in the most recent analyses. We discussed the three general methods that can be used:

The first is to focus on a time in the past when the climate was different and in quasi-equilibrium, and estimate the relationship between the relevant forcings and temperature response (paleo-constraints). The second is to find a metric in the present day climate that we think is coupled to the sensitivity and for which we have some empirical data (climatological constraints). Finally, there are constraints based on changes in forcing and response over the recent past (transient constraints).

All three constraints need to be reconciled to get a robust idea what the sensitivity really is.

A new paper using the second ‘climatological’ approach by Steve Sherwood and colleagues was just published in Nature and, like Fasullo and Trenberth (2012) (discussed here), suggests that models with an equilibrium climate sensitivity (ECS) of less than 3ºC do much worse at fitting the observations than other models.

Sherwood et al focus on a particular process associated with cloud cover which is the degree to which the lower troposphere mixes with the air above. Mixing is associated with reductions in low cloud cover (which give a net cooling effect via their reflectivity), and increases in mid- and high cloud cover (which have net warming effects because of the longwave absorption – like greenhouse gases). Basically, models that have more mixing on average show greater sensitivity to that mixing in warmer conditions, and so are associated with higher cloud feedbacks and larger climate sensitivity.

The CMIP5 ensemble spread of ECS is quite large, ranging from 2.1ºC (GISS E2-R – though see note at the end) to 4.7ºC (MIROC-ESM), with a 90% spread of ±1.3ºC, and most of this spread is directly tied to variations in cloud feedbacks. These feedbacks are uncertain, in part, because they involve processes (cloud microphysics, boundary layer meteorology and convection) that occur on scales considerably smaller than the grid spacing of the climate models, and thus cannot be explicitly resolved. These must be parameterized, and different parameterizations can lead to large differences in how clouds respond to forcings.

Whether clouds end up being an aggravating (positive feedback) or mitigating (negative feedback) factor depends not just on whether there will be more or less clouds in a warming world, but what types of clouds there will be. The net feedback potentially represents a relatively small difference of much larger positive and negative contributions that tend to cancel, and getting that right is a real challenge for climate models.

By looking at two reanalysis datasets (MERRA and ERA-Interim), Sherwood et al then try to assess which models have more realistic representations of the lower tropospheric mixing process, as indicated in the figure:


Figure (derived from Sherwood et al, fig. 5c) showing the relationship between the models’ estimate of Lower Tropospheric Mixing (LTMI) and sensitivity, along with estimates of the same metric from radiosondes and the MERRA and ERA-Interim reanalyses.

From that figure one can conclude that this process is indeed correlated to sensitivity, and that the observationally derived constraints suggest a sensitivity at the higher end of the model spectrum.

There was an interesting talk at AGU from Peter Caldwell (PCMDI) on how simply data mining for correlations between model diagnostics and climate sensitivity is likely to give you many false positives just because of the number of possible options, and the fact that individual models aren’t strictly independent. Sherwood et al get past that by focussing on physical processes that have an a priori connection to sensitivity, and are careful not to infer overly precise probabilistic statements about the real world. However, they do conclude that ‘models with ECS lower than 3ºC’ do not match the observations as well. This is consistent with the Fasullo and Trenberth study linked to above, and also with work by Andrew Dessler suggesting that models with an amplifying net cloud feedback appear most consistent with observations.
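Caldwell's data-mining point is easy to reproduce with a toy Monte-Carlo: given a few dozen models and a large enough pool of candidate diagnostics that are pure noise, some will correlate with ECS by chance. The numbers below are arbitrary and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

n_models, n_diagnostics = 30, 500                    # illustrative sizes
ecs = rng.normal(3.2, 0.7, n_models)                 # made-up spread of model sensitivities
diags = rng.normal(size=(n_diagnostics, n_models))   # diagnostics with no real link to ECS

# Correlation of each noise diagnostic with the (unrelated) sensitivities
r = np.array([np.corrcoef(d, ecs)[0, 1] for d in diags])
print("noise diagnostics with |r| > 0.4:", int(np.sum(np.abs(r) > 0.4)))
```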

There are a number of technical points that should also be added. First, ECS is the long term (multi-century) equilibrium response to a doubling of CO2 in a coupled ocean-atmosphere model. It doesn’t include many feedbacks associated with ‘slow’ processes (such as ice sheets, or vegetation, or the carbon cycle). See our earlier discussion for different definitions. Second, Sherwood et al are using a particular estimate of the ‘effective’ ECS in their analysis of the CMIP5 models. This estimate follows from the method used in Andrews et al (2011), but is subtly different from the ‘true’ ECS, since it uses a linear extrapolation from a relatively short period in the abrupt 4xCO2 experiments. In the case of the GISS models, the effective ECS is about 10% smaller than the ‘true’ value, however this distinction should not really affect their conclusions.
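For concreteness, the 'effective' ECS described above is usually obtained from a Gregory-style regression of the top-of-atmosphere imbalance against the warming in an abrupt 4xCO2 run. A minimal sketch, assuming annual-mean global time series are already in hand:

```python
import numpy as np

def effective_ecs(delta_T, toa_imbalance):
    """Effective climate sensitivity from an abrupt 4xCO2 experiment.

    Regress the net TOA imbalance N on the global-mean warming dT,
        N = F_4x - lambda * dT,
    then take the x-intercept (the equilibrium warming for 4xCO2) and halve
    it, assuming the forcing is logarithmic in CO2."""
    slope, intercept = np.polyfit(delta_T, toa_imbalance, 1)
    f_4x = intercept        # estimated 4xCO2 forcing (W/m2)
    lam = -slope            # climate feedback parameter (W/m2/K)
    return f_4x / (2.0 * lam)

# Example usage with annual means from a ~150-year abrupt 4xCO2 run (hypothetical arrays):
# ecs_eff = effective_ecs(dT_annual, N_annual)
```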

Third, in much of the press coverage of this paper (e.g. Guardian, UPI), the headline was that temperatures would rise by ‘4ºC by 2100’. This unfortunately plays into the widespread confusion between an emergent model property (the ECS) and a projection into the future. These are connected, but the second depends strongly on the scenario of future forcings. Thus the temperature prediction for 2100 is contingent on following the RCP8.5 (business as usual) scenario.

Implications

Last year, the IPCC assessment dropped the lower bound on the expected range of climate sensitivity slightly, going from 2-4.5ºC in AR4 to 1.5-4.5ºC in AR5. One of us (Mike) mildly criticized this at the time.

Other estimates that have come in since AR5 (such as Schurer et al.) support ECS values similar to the CMIP5 mid-range, i.e. ~3ºC, and it has always been hard to reconcile a sensitivity of only 1.5ºC with the paleo-evidence (as we discussed years ago).

However, it remains true that we do not have a precise number for the ECS. Sherwood et al’s results give weight to higher values than some other recent estimates based on transient constraints (e.g. Otto et al. (2013)), but it should be kept in mind that there is a great asymmetry in risk between the high and low end estimates. Uncertainty cuts both ways and is not our friend. If the climate indeed turns out to have the higher-end climate sensitivity suggested here, the impacts of unmitigated climate change are likely to be considerably greater than suggested by current best estimates.

References

  1. S.C. Sherwood, S. Bony, and J. Dufresne, "Spread in model climate sensitivity traced to atmospheric convective mixing", Nature, vol. 505, pp. 37-42, 2014. http://dx.doi.org/10.1038/nature12829
  2. J.T. Fasullo, and K.E. Trenberth, "A Less Cloudy Future: The Role of Subtropical Subsidence in Climate Sensitivity", Science, vol. 338, pp. 792-794, 2012. http://dx.doi.org/10.1126/science.1227465
  3. A.E. Dessler, "A Determination of the Cloud Feedback from Climate Variations over the Past Decade", Science, vol. 330, pp. 1523-1527, 2010. http://dx.doi.org/10.1126/science.1192546
  4. A. Schurer, G. Hegerl, M.E. Mann, S.F.B. Tett, and S.J. Phipps, "Separating forced from chaotic climate variability over the past millennium", Journal of Climate, pp. 130325112547002, 2013. http://dx.doi.org/10.1175/JCLI-D-12-00826.1
  5. A. Otto, F.E.L. Otto, O. Boucher, J. Church, G. Hegerl, P.M. Forster, N.P. Gillett, J. Gregory, G.C. Johnson, R. Knutti, N. Lewis, U. Lohmann, J. Marotzke, G. Myhre, D. Shindell, B. Stevens, and M.R. Allen, "Energy budget constraints on climate response", Nature Geosci, vol. 6, pp. 415-416, 2013. http://dx.doi.org/10.1038/ngeo1836
Author: "mike" Tags: "Climate modelling, Climate Science, Gree..."