6 December 2011
AGU 3: Great Disasters of the 21st Century
Posted by Dave Petley
Yesterday (Monday) morning I also attended a session run by the Natural Hazards division on Great Disasters of the 21st Century. First up was Mark Bove from Munich Re, who talked about two different aspects of natural hazards. The first was to highlight the increasing levels of loss associated with natural disasters – in particular, financial losses in the first decade of the 21st Century were equivalent to those of the 1980s and 1990s combined. Whilst he attributed this primarily to increased vulnerability, he used the Munich Re disaster catalogue to explore whether there might also be an increase in the occurrence of meteorological hazards as a result of an increasingly warm world. He presented two lines of evidence to support this. First, he showed that the occurrence of weather-driven hazards is increasing far faster than that of tectonic hazards. Second, he reported work Munich Re has undertaken to develop a measurement-driven index of thunderstorm potential in the USA, which shows a clear increase with time, suggesting that the climate is changing. Of course this is not necessarily human-induced, but in my view (and that of the majority of scientists) that is by far the most likely cause.
Second was Chris Field, presenting a brief summary of the results of the IPCC study on climate and extreme events. As expected, he was very measured in the way that he presented the outcomes of this work. The core message was that there is evidence that extreme events are becoming more common and more extreme, and that this trend is likely to continue. The message was positive, though, in that he emphasised that it is possible to mitigate many hazards, and that a concerted effort could have a big impact, especially where it involves local actions.
Third up was Gordon McBean of the University of Western Ontario, talking about risk assessment for coastal cities. I thought that this was a disappointing presentation using less than perfect graphics, so I will move on to a highly provocative presentation by Max Wyss of WAPMERR in Switzerland, who gave an extremely challenging talk on the value of the GSHAP seismic hazard map to local officials. His methodology was, I suspect, controversial. He looked at earthquakes that have killed more than 1,000 people since the GSHAP map was produced, and compared the size of each event with the size of the event that a local government official might expect from the GSHAP dataset. The outcomes were stark – for Kashmir, for example, the GSHAP dataset suggests an earthquake of magnitude 6.1, whereas the actual event was magnitude 7.7, which he argued represented an energy release about 1,000 times greater than the model implies. In terms of losses this means that GSHAP implies 3,800 fatalities against an actual toll of 86,000. Wyss suggested that the GSHAP map underestimates every earthquake disaster in this sort of way.
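As a back-of-the-envelope check on the scale of that discrepancy (this is my own arithmetic using the standard Gutenberg–Richter energy–magnitude relation, $\log_{10} E = 1.5\,M + 4.8$, not a figure from the talk), the radiated energy ratio for the two magnitudes quoted above works out as:

$\frac{E_{M=7.7}}{E_{M=6.1}} = 10^{1.5\,(7.7 - 6.1)} = 10^{2.4} \approx 250$

In other words the shortfall is two to three orders of magnitude in energy; the factor of roughly 1,000 quoted in the talk presumably reflects slightly different magnitude values, but the scale of the underestimate is much the same.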
I am not convinced that GSHAP is meant to be used quite like this, but the point about the gap between the way that scientists view risk and the way that people on the ground do is well made. It suggests that we need to find much better ways to present hazard data. One can only hope that the successor to GSHAP, the GEM (Global Earthquake Model) project, is taking this message on board.
The room then rapidly filled (to bursting point) for a talk by Kenji Satake of the University of Tokyo on the Tohoku earthquake and tsunami. This was a fascinating review of the causes of this dreadful event. There were two startling elements to the presentation. First, he suggested that, in effect, the model of the earthquake cycle for this fault was wrong. The conventional earthquake cycle model suggests that stress builds until slip occurs, generating an earthquake; stress then builds again until the next event. Satake suggested that in fact there may be two cycles occurring simultaneously on this fault, with the longer cycle generating very large events about every 700 years. If true, this would have major implications for seismic hazard on other major fault systems.
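To make that idea a little more concrete, here is a minimal toy sketch in Python (entirely my own illustration, not Satake's model; the 100-year short cycle, the loading rate and the amount of stress left behind by ordinary events are invented numbers, and only the roughly 700-year long cycle comes from the talk) of what two superimposed cycles on a single fault might look like as a stress history:

import numpy as np
import matplotlib.pyplot as plt

# Toy stress history on one fault with two superimposed cycles.
# Ordinary earthquakes every ~100 years release most, but not all, of the
# accumulated stress; the small remainder builds up slowly until a giant
# earthquake every ~700 years (the figure quoted in the talk) releases it all.
years = np.arange(0, 2101)           # 2,100 years of synthetic history
loading_rate = 1.0                   # arbitrary stress units per year

stress = np.zeros(len(years))
for i in range(1, len(years)):
    stress[i] = stress[i - 1] + loading_rate
    if years[i] % 700 == 0:          # long-cycle giant event: full stress drop
        stress[i] = 0.0
    elif years[i] % 100 == 0:        # short-cycle ordinary event: partial stress drop
        stress[i] = max(stress[i] - 90.0, 0.0)

plt.plot(years, stress)
plt.xlabel("Years")
plt.ylabel("Accumulated stress (arbitrary units)")
plt.title("Toy sketch: two superimposed earthquake cycles on one fault")
plt.show()

The result is the familiar sawtooth of ordinary events riding on a much slower build-up that is only relieved by the rare giant event.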
Second, he showed that the tsunami had two distinct elements – an initial slow rise of water level, and then a sudden, very rapid pulse. This was, I think, captured in this amazing video of the tsunami, which shows a slow rise for about eight minutes followed by the very rapid increase:
http://www.youtube.com/watch?v=a-iRFUUkcIU
The explanation for this is that the tsunami was in effect generated by two types of slip on the same fault, a shallow slip event and a deep slip event, which produced the two wave patterns. In effect this earthquake was a composite of two known earlier tsunami-generating earthquakes: the 869 AD Jogan earthquake, which is thought to have resulted from slip deep on the fault, and the 1896 M=7.2 earthquake, which generated a big tsunami due to large amounts of shallow slip. In both cases there is geological evidence of the earlier tsunami events.
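Purely as a cartoon of that superposition (again my own illustration, not the actual source model presented in the talk; the shapes, amplitudes and timings below are invented, and only the general character of a slow rise followed by a sudden pulse comes from the talk and the video), the observed waveform can be thought of as the sum of two components:

import numpy as np
import matplotlib.pyplot as plt

# Cartoon tsunami waveform built from two slip components on the same fault:
# deep slip -> broad, slow rise in water level; shallow slip -> short, sharp pulse.
# All numbers are invented and purely illustrative.
t = np.linspace(0, 20, 2000)                          # time in minutes

slow_rise = 2.0 / (1.0 + np.exp(-(t - 8.0)))          # deep slip: gentle ramp
sharp_pulse = 6.0 * np.exp(-((t - 10.0) ** 2) / 0.5)  # shallow slip: narrow spike
observed = slow_rise + sharp_pulse                    # what a coastal gauge records

plt.plot(t, slow_rise, "--", label="deep slip component (slow rise)")
plt.plot(t, sharp_pulse, ":", label="shallow slip component (sharp pulse)")
plt.plot(t, observed, label="combined waveform")
plt.xlabel("Time (minutes)")
plt.ylabel("Water level (arbitrary units)")
plt.legend()
plt.show()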
The final talk was by Satoko Oki of the University of Tokyo, who presented on changed perceptions of risk in tsunami-prone areas. The conclusions were deeply uncomfortable, suggesting that the perception of risk from smaller tsunamis has decreased in the aftermath of this very large event. Put another way, people are less likely to evacuate for a small event now, even though such an event still presents a very real risk. This serves to show just how complex our understanding and perception of risk is, and just how hard it is going to be to continue to reduce disaster impacts.