The catastrophic surge event of 1953 on the eastern UK and northern European coastlines led to widespread agreement on the necessity of a coordinated response to understand the risk of future oceanographic flood events and, so far as possible, to afford protection against such events. One element of this response was better use of historical data and scientific knowledge in assessing flood risk. The timing of the event also coincided roughly with the birth of extreme value theory as a statistical discipline for measuring risks of extreme events, and over the last 50 years, as techniques have been developed and refined, various attempts have been made to improve the precision of flood risk assessment around the UK coastline. In part, this article provides a review of such developments. Our broader aim, however, is to show how modern statistical modelling techniques, allied with the tools of extreme value theory and knowledge of sea-dynamic physics, can lead to further improvements in flood risk assessment. Our long-term goal is a coherent spatial model that exploits spatial smoothness in the surge process characteristics and we outline the details of such a model. The analysis of the present article, however, is restricted to a site-by-site analysis of high-tide surges. Nonetheless, we argue that the Bayesian methodology adopted for such analysis enables a risk-based interpretation of results that is most natural in this setting, and preferable to inferences that are available from more conventional analyses.
One contribution of 14 to a Theme Issue ‘The Big Flood: North Sea storm surge’.
© 2005 The Royal Society