Ross Stein's Slides from Tokyo Workshop II (13 December 2003)
Posted by JAC on 3/22/2011, 7:57 am
http://sicarius.wr.usgs.gov/tokyo/slides/TokyoWS.html

Introduction and Strategic Overview

These two facts bring us together: 140,000 people died in the great Kanto earthquake, and the population of Tokyo is now 6 times larger.


I am here because I believe that a repeat of the 1923 earthquake - or any event that shakes Tokyo equally hard - would be not only a Japanese tragedy but also a global catastrophe. On top of the human toll we all fear:

Such an event would likely undermine confidence in the EQ defenses of the developed world.
Paying claims on the insured losses of a trillion-dollar EQ could consume the reserves of the reinsurance industry. This would drive up the cost - and choke off the availability - of earthquake insurance worldwide. The consequences for heavily-insured California would be severe.
Financial analysts project that to rebuild Tokyo, the Japanese would likely withdraw much of their investment in the United States, causing U.S. capital markets to plunge.
So a great earthquake striking Tokyo would also penetrate the psyche and economy of the U.S., perhaps with an impact approaching a repeat of the 1857 or 1906 shocks on the San Andreas, or the 1811-1812 shocks in Missouri.

Now I am also here because, scientifically, Tokyo is a tantalizing target of study:

The historical record of earthquake shaking is long and rich, thanks to the work of Usami, Utsu, Takemura and JMA.
Tsunami data exist for some of the largest events, such as those of 1498, 1605, 1633, 1703 and 1855.
The geodetic record is unsurpassed.
And as pointed out by Ishibashi, there is speculative evidence for earthquake interaction, such as the 1853 Odawara, 1854 Tokai and 1855 Edo earthquakes; or the 1703 Genroku shock, 1707 Hoei eruption of Mt Fuji, and the 1707 Tokai shock.
For all of these reasons, a small group of USGS, AFRC, NIED, GSI, and Swiss Re scientists began a collaboration earlier this year on the earthquake hazard faced by Japan's capital city, and we have asked you here to share our preliminary findings and, most importantly, to seek your guidance, criticism, and advice. Today we are presenting our approach, not our answers, which have yet to come.

Now, despite its importance, and the quality of its data, Kanto is also a uniquely challenging region because:

It lies near the junction of three tectonic plates, and it is traversed by a volcanic front. This is just a teaser for Serkan Bozkurt's closing GIS presentation this afternoon, based on Ishida's plate model.
And so, unlike the San Francisco Bay area or the Marmara Sea, earthquake depths and focal mechanisms are needed to associate historical earthquakes with the plates or plate boundaries on which they struck.
Trying to overcome these obstacles, while exploiting the region's opportunities, lies at the heart of our efforts. We have three goals for this study, each more ambitious and more difficult than its predecessor:

First, we would like to estimate the time-independent or 'Poisson' probability of an earthquake of a given size striking Tokyo. Such an assessment would be based almost exclusively on the historical earthquake record, and thus rests on the assumption that the record is long enough to encompass the full range of earthquake occurrence.
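To make this first goal concrete, here is a minimal sketch of the time-independent calculation; the rate in the example is purely illustrative, not a result of our study.

```python
import math

def poisson_probability(rate_per_year, window_years):
    """Time-independent ('Poisson') probability of at least one event in the
    forecast window, given a constant long-term rate estimated from the
    historical catalog."""
    return 1.0 - math.exp(-rate_per_year * window_years)

# Purely illustrative numbers: if the catalog implied one damaging Kanto event
# every ~200 years on average, the 30-year Poisson probability would be about 14%.
print(poisson_probability(1.0 / 200.0, 30.0))
```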

We are using new computer-based methods to locate and estimate the size of earthquakes. There are several internal-consistency tests we can perform on such a model:

How well can we estimate the magnitude and location of modern shocks using only their intensity data? Bill Bakun will report some real successes, as well as unsolved problems (a grid-search sketch follows at the end of this subsection).
Does the resulting historical catalog produce a reasonable b-value (the ratio of small to large shocks) and a reasonable rate of seismic moment release?
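For the second test, the standard tools are simple enough to sketch. The snippet below uses Aki's maximum-likelihood b-value estimator (with Utsu's correction for binned magnitudes) and the Hanks-Kanamori moment-magnitude relation; the completeness threshold and bin width are placeholders, not values from our catalog.

```python
import math

def b_value_ml(mags, m_complete, bin_width=0.1):
    """Maximum-likelihood b-value (Aki, 1965) with Utsu's correction for
    magnitudes reported in bins of width `bin_width`, using only events at
    or above the completeness magnitude `m_complete`."""
    m = [x for x in mags if x >= m_complete]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (m_complete - bin_width / 2.0))

def moment_release_rate(mags, catalog_years):
    """Catalog seismic-moment release rate in N*m per year, using the
    Hanks-Kanamori relation log10(M0) = 1.5*Mw + 9.05."""
    return sum(10.0 ** (1.5 * mw + 9.05) for mw in mags) / catalog_years
```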
If we can reliably estimate earthquake locations, magnitudes, and their uncertainties, the Poisson probability would accurately reflect the average likelihood of earthquakes striking Kanto over the long term. But it would not necessarily reflect the likelihood during the next year, next decade, or next 30 years, which could, in our judgment, depart significantly from the long-term average. This brings us to our second goal:
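Returning to the first test above, here is the promised sketch. It is in the spirit of a grid search over trial epicenters, in which each intensity observation is inverted for a magnitude and the preferred source minimizes the scatter among the site-derived magnitudes; the attenuation coefficients below are placeholders for illustration, not the calibrated values Bill will discuss.

```python
import math

# Placeholder intensity-attenuation model: MMI = C0 + C1*M - C2*log10(distance in km).
# These coefficients must be calibrated regionally; the values here are illustrative only.
C0, C1, C2 = 3.7, 1.2, 3.2

def site_magnitude(mmi, dist_km):
    """Magnitude implied by a single intensity observation at the given distance."""
    return (mmi + C2 * math.log10(dist_km) - C0) / C1

def locate_from_intensities(observations, trial_epicenters):
    """Grid search: the trial magnitude is the mean of the site-derived magnitudes,
    and the preferred epicenter minimizes their rms scatter.
    observations: list of (mmi, site_x_km, site_y_km); trial_epicenters: list of (x_km, y_km)."""
    best = None
    for ex, ey in trial_epicenters:
        mags = []
        for mmi, sx, sy in observations:
            dist = max(math.hypot(sx - ex, sy - ey), 1.0)  # avoid log10(0) at the site itself
            mags.append(site_magnitude(mmi, dist))
        m_trial = sum(mags) / len(mags)
        rms = math.sqrt(sum((m - m_trial) ** 2 for m in mags) / len(mags))
        if best is None or rms < best[3]:
            best = (ex, ey, m_trial, rms)
    return best  # (epicenter x, epicenter y, magnitude estimate, rms misfit)
```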

Second, we would like to build a renewal model of earthquake probability, which treats major faults late in their earthquake cycles as more likely to rupture. A renewal model depends on:

being able to associate large historical earthquakes with major faults
inferring earthquake inter-event times and variability on these faults
estimating the extent to which these faults slip seismically
choosing how the probability evolves with time (the probability density function of inter-event times)
Although more demanding than a time-independent assessment, the renewal model is potentially more faithful to the current state of the hazard. For example, whether the 1923 source has a 200- or 400-yr inter-event time, and whether the inter-event time is highly regular or very irregular, can halve or double the current earthquake probability.
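To make that sensitivity concrete, here is a minimal renewal-probability sketch. It assumes a lognormal inter-event time distribution (a Brownian passage time form would serve equally well), and every number in the example, including the 0.5 coefficient of variation, is illustrative rather than a finding of this study.

```python
import math

def lognormal_cdf(t, mean_interval, cov):
    """CDF of a lognormal inter-event time distribution with the given mean
    recurrence interval and coefficient of variation (aperiodicity)."""
    sigma2 = math.log(1.0 + cov ** 2)
    mu = math.log(mean_interval) - sigma2 / 2.0
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / math.sqrt(2.0 * sigma2)))

def conditional_probability(elapsed, window, mean_interval, cov):
    """Renewal (time-dependent) probability of rupture in the next `window` years,
    given that `elapsed` years have passed since the last event."""
    f_now = lognormal_cdf(elapsed, mean_interval, cov)
    f_later = lognormal_cdf(elapsed + window, mean_interval, cov)
    return (f_later - f_now) / (1.0 - f_now)

# Illustrative only: 80 years elapsed, a 30-year window, and two candidate mean
# inter-event times show how strongly the answer depends on that choice.
for mean_t in (200.0, 400.0):
    print(mean_t, conditional_probability(80.0, 30.0, mean_t, 0.5))
```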

At minimum, we need to be able to distinguish between upper crustal and subduction earthquakes, which depends on both the location and the depth of historical events. Tom Parsons will talk about a new method for extracting depth information from historical intensity data.
Marine terrace data may contain the best evidence for earthquake inter-event times and their variability for PHS events. Shinji Toda and Masanobu Shishikura will present new findings on this topic this afternoon.
We also need to identify the major upper crustal earthquake sources. For example, we do not know what fault slipped in the 1855 Edo event. Tom will touch on this.
Our third goal is to produce what we term an 'interaction probability' that explicitly includes the effect of stress transfer to major faults by past earthquakes. In our judgment, aftershocks, earthquake sequences, and seismic quiescence are products of stress transfer.

Global observations support this view: the order-of-magnitude drop in the rate of M 6 San Francisco Bay area earthquakes in the 75 years after the 1906 shock, and the Landers-Big Bear-Hector Mine sequence in southern California.

In the Kanto plain, there is a change in the distribution of seismicity after the 1923 shock, and there also may be a drop in the rate of large events during the century after the 1703 shock. These observations suggest that earthquake interaction will be important to any probability analysis of Kanto.

For the interaction probability, we need to calculate how the 1923 earthquake elastically stressed faults in its environment, triggering aftershocks and subsequent mainshocks; and how the transient stresses excited by viscoelastic rebound continued to redistribute stress on major faults over the next 50 years.
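The heart of that calculation is resolving the stress tensor from a dislocation model of the 1923 rupture onto the geometry and rake of each receiver fault, which is a job for an elastic dislocation code. The final combination, though, is simple enough to sketch; the effective friction value below is an assumed placeholder, not a conclusion of this study.

```python
def coulomb_stress_change(delta_shear, delta_normal, mu_effective=0.4):
    """Coulomb failure stress change on a receiver fault (same units as the inputs,
    e.g. bars or MPa): the shear stress change resolved in the slip direction plus
    an effective friction coefficient times the normal stress change
    (positive = unclamping). Positive values bring the fault closer to failure."""
    return delta_shear + mu_effective * delta_normal
```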

To attempt an interaction probability, we need:

The geometry and slip in the 1923 earthquake, which is used to calculate the Coulomb stress change on surrounding faults. Marleen Nyst will compare the performance of past models against a coseismic geodetic dataset recently augmented by Takuya Nishimura.
A rheological model that reproduces the postseismic geodetic observations.
A model of the fault stressing rates derived from the GEONET surface strain-rate field. Takuya Nishimura will be pursuing this with help from Takeshi Sagiya.
The friction coefficient and aftershock duration on the major faults.
Some of these parameters suffer from large uncertainties, and so we have to balance our desire to produce a physically defensible, realistic model against the risk of one in which too many assumptions are unsupported.

The interaction probability draws heavily on the laboratory-based theory of rate-and-state friction formulated by Jim Dieterich and others. While Tom Parsons, Shinji Toda, Jim Dieterich and I have published studies in support of this approach, we recognize that it is only a hypothesis and subject to dispute. But this reliance inspired us to give you a tour of the USGS rock mechanics laboratories, which will be led by Nick Beeler and Dave Lockner, right after lunch.
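For those unfamiliar with the formulation, the sketch below gives the Dieterich (1994) expression for the change in seismicity rate following a sudden Coulomb stress step; the parameter values in the example are illustrative, not the ones we will adopt for Kanto.

```python
import math

def rate_state_seismicity(t_years, dcfs, a_sigma, t_aftershock, background_rate=1.0):
    """Seismicity rate at time t after a sudden Coulomb stress step `dcfs`,
    following Dieterich's (1994) rate-and-state formulation.
    a_sigma: the constitutive parameter A times normal stress (same units as dcfs);
    t_aftershock: aftershock duration t_a = A*sigma / (fault stressing rate)."""
    gamma = (math.exp(-dcfs / a_sigma) - 1.0) * math.exp(-t_years / t_aftershock) + 1.0
    return background_rate / gamma

# Illustrative only: a 1-bar stress increase with A*sigma = 0.4 bar and a 30-yr
# aftershock duration raises the rate roughly tenfold at first, then decays
# back toward the background rate over several decades.
for t in (1.0, 10.0, 50.0, 100.0):
    print(t, rate_state_seismicity(t, dcfs=1.0, a_sigma=0.4, t_aftershock=30.0))
```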

We will also build earthquake interaction onto the Poisson probability, bypassing the renewal model altogether. In fact, this would be our preference if earthquake inter-event times and coefficients of variation prove to be highly uncertain, while earthquake stress transfer and the fault stressing rates are deemed more reliable. In some respects, adding interaction to a Poisson probability is a more conservative approach.
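A minimal sketch of what that combination might look like: the long-term Poisson rate is modulated by the rate-and-state response to a stress step, the time-varying rate is integrated over the forecast window, and the probability follows from 1 - exp(-N). All parameter values are again illustrative.

```python
import math

def interaction_poisson_probability(rate_per_year, window_years, dcfs, a_sigma, t_a, steps=1000):
    """Probability of at least one event in the window when the long-term Poisson
    rate is modulated by a Dieterich rate-and-state response to a stress step `dcfs`.
    Numerically integrates the time-varying rate, then applies 1 - exp(-N)."""
    dt = window_years / steps
    expected = 0.0
    for i in range(steps):
        t = (i + 0.5) * dt
        gamma = (math.exp(-dcfs / a_sigma) - 1.0) * math.exp(-t / t_a) + 1.0
        expected += (rate_per_year / gamma) * dt
    return 1.0 - math.exp(-expected)

# Illustrative comparison with the plain Poisson value from the first sketch:
print(interaction_poisson_probability(1.0 / 200.0, 30.0, dcfs=1.0, a_sigma=0.4, t_a=30.0))
```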

Let me close with two thoughts, a warning and a wish:

Our probability forecast will be essentially untestable in our lifetimes. It is guaranteed to be 'quantitative,' but it could also be wrong. So the prerequisites for a model deserving consideration are these: it must be able to reproduce the observed pattern of post-1703 and post-1923 seismicity, and the past century of geodetic observations. This alone will be difficult to achieve.

Perhaps most important in this international collaboration is our desire to contribute to - rather than to compete with - the Japanese government studies now underway. We would hope that our interpreted historical earthquake catalog, our revised 1923 and planned 1703 source models, and our three types of probability estimates, when complete, would be of value to the Long-term Evaluations project of the Earthquake Research Committee led by Kunihiko Shimazaki; and to the Special Project for Earthquake Disaster Mitigation in Urban Areas project led by Naoshi Hirata.
