
Earthquake Engineering & Engineering Seismology

Basics of Seismology and Seismic Hazard Assessment

Treatment of Uncertainty in PSHA

Dr Peter J. Stafford
Department of Civil & Environmental Engineering
Imperial College London

Friday 11th December, 2008

Total Uncertainty

The objective of any probabilistic analysis should be to apply the most accurate model possible whilst ensuring that all of the relevant uncertainties are accounted for

The key is to capture the total uncertainty

However, different components of this total uncertainty must be handled in different ways

Types of Uncertainty

When modelling a process there are three generic types of uncertainty:

Aleatory Variability
Arises from the inherent randomness of a process
(comes from the Latin word for dice, alea)

Epistemic Uncertainty
Arises because our analyses are done on models and not on real
systems. Reflects our lack of knowledge of how to properly
model the system (comes from the Greek word for knowledge)

Ontological Uncertainty
Arises from the unknown or the unexpected. Reflects possibilities
that are not considered during the modelling

Elms (2004)

The PSHA Watchdog…

Not everyone thinks that the distinction between aleatory variability and epistemic uncertainty is helpful

However, the various types of uncertainty must be handled differently, and the terminology should help things rather than hinder them

Aleatory Variability

May be modelled using probability distributions

Examples: ground-motion variability, variability in source-scaling relationships, Heisenberg uncertainty principle

Incorporated directly into the PSHA process

One simply integrates over the range of possibilities in order to obtain the expected value
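This integration can be sketched numerically. The following is a minimal illustration, assuming a lognormal ground-motion distribution and a handful of invented magnitude-distance scenarios with invented annual rates; it is not a real hazard model.

```python
import math

def p_exceed(ln_target, ln_median, sigma):
    """Probability that the ground motion exceeds a target level, for a
    lognormal distribution: P = 1 - Phi((ln target - ln median) / sigma)."""
    z = (ln_target - ln_median) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Sum over a discretised set of scenarios, weighting each scenario's
# exceedance probability by its annual rate of occurrence.
# All numbers below are illustrative placeholders, not a real model.
scenarios = [
    # (annual rate, ln of median PGA in g, sigma in ln units)
    (0.050, math.log(0.05), 0.6),
    (0.010, math.log(0.15), 0.6),
    (0.002, math.log(0.40), 0.6),
]

target = math.log(0.2)  # target PGA of 0.2 g
annual_rate = sum(rate * p_exceed(target, mu, s) for rate, mu, s in scenarios)
print(annual_rate)
```

The result is the annual rate of exceeding the target motion, obtained by integrating (here, summing) over the aleatory distribution for every scenario.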

Epistemic Uncertainty

Cannot easily be modelled using probability distributions (if it can be modelled in this way then there will always be uncertainty regarding which distribution should be used, as well as regarding the values of the parameters of the distribution)

In PSHA, usually dealt with through the use of logic trees and through expert elicitation

Examples: different source zonations, use of alternative EGMMs, probability models, gravity, flat earth model…

Ontological Uncertainty

Cannot be modelled by definition

May be reduced through peer review and quality control; also tends to decrease over time (learn from mistakes!)

Not usually addressed in PSHA, or earthquake engineering (despite being particularly important for structural design, e.g., what are the consequences of not identifying a mechanism?)

Many relevant examples for PSHA… particularly for seismic zonation

Hazard maps based upon observed seismicity

Hazard maps in Australia

First zonation map: McEwin et al. (1976)

1979 Update

1990 Update

2004 Update

Brown & Gibson (2004)

Clear Explanation

The clearest explanation of the difference between epistemic uncertainty and ontological uncertainty was provided by former US Secretary of Defense, Donald Rumsfeld

“As we know, there are known knowns. There are things we know we know. We also know there are known unknowns. That is to say we know there are some things we do not know. But there are also unknown unknowns, the ones we don’t know we don’t know.”

Known unknowns = epistemic uncertainty
Unknown unknowns = ontological uncertainty

Aleatory Variability vs Epistemic Uncertainty

Modified from Bommer & Abrahamson (2007)

Aleatory Variability

Aleatory Variability – Modelling and Parametric

For any given model there is a trade-off between the accuracy of predictions and the number of parameters (each of which is uncertain)

SeismoStruct

Aleatory Variability – Modelling and Parametric

EGMMs – model developments require additional parameters that are difficult to estimate

(Figure: modelling uncertainty decreasing with time, parametric uncertainty increasing with time, as the predictor set grows: M, R, Site, F → M, R, VS,30, F → M, R, VS,30, F, ZTOR, Z1.0)

Other PSHA examples: magnitude-frequency distributions, time-dependent probability models

The Influence of Aleatory Variability

Many studies in the past ignored the aleatory variability in the ground motion altogether

Others thought they were being conservative by considering the 84th percentile ground motion for all magnitude-distance scenarios

Bommer & Abrahamson (2006)

The Influence of Aleatory Variability

The greater the standard deviation of the ground motion model, the greater the hazard

This parameter has a very strong impact upon the results of a hazard analysis

Bommer & Abrahamson (2006)

Influence of Sigma

Roughly twice the probability of exceeding the ground motion level indicated by the dashed line for the same median prediction
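The effect of sigma on the exceedance probability can be illustrated with a small sketch, assuming a lognormal distribution; the median, the target level, and the two sigma values below are invented for illustration, so the exact ratio will differ from the figure.

```python
import math

def p_exceed(target, median, sigma_ln):
    """Exceedance probability for a lognormally distributed ground motion."""
    z = (math.log(target) - math.log(median)) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Same median prediction, two different standard deviations (illustrative)
median_pga = 0.1   # g
target_pga = 0.3   # g, well above the median prediction
p_low = p_exceed(target_pga, median_pga, 0.5)
p_high = p_exceed(target_pga, median_pga, 0.7)
print(p_high / p_low)  # larger sigma -> substantially higher exceedance probability
```

The further the target lies above the median, the more strongly the exceedance probability depends on sigma.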

Truncation of the Ground Motion Distribution

As seen yesterday, an upper bound on epsilon may be specified

The influence that this upper bound has depends upon the seismicity of the region being considered

(Figure panel: Higher Seismicity)

Bommer et al. (2004)
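One way to sketch the effect of an epsilon bound, assuming a simple one-sided upper truncation of the standard normal distribution with renormalisation (this particular truncation scheme is an assumption for illustration):

```python
import math

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * math.erfc(-z / math.sqrt(2.0))

def p_exceed_truncated(z, eps_max):
    """Probability of exceeding epsilon = z when the distribution is
    truncated at eps_max standard deviations (upper tail removed and
    the remaining distribution renormalised)."""
    if z >= eps_max:
        return 0.0
    return (phi(eps_max) - phi(z)) / phi(eps_max)

# Effect of the truncation level on the probability of a 2.5-sigma motion
for eps_max in (3.0, 4.0, 6.0):
    print(eps_max, p_exceed_truncated(2.5, eps_max))
```

Tight truncation reduces the probability of large epsilon values, which matters most at the long return periods that probe the tail of the distribution.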

Ground Motion Variability & PEGASOS
PSHA for re-licensing of Swiss nuclear power plants

Commissioned a very comprehensive study to be carried out (only one of two to ever be undertaken at this degree of complexity, SSHAC Level 4)

New results much larger than old

Some stakeholders not happy

(Figure: annual probability of exceedance vs peak ground acceleration [g], comparing the PEGASOS soil (surface) results with the PRA study from the 1980s)

“Academic” Debate

Sparked a lot of discussion (mainly just confusion) in the journal Engineering Geology

Aleatory Ground Motion Variability

PEGASOS study repeated with the ground motion variability turned off (σ = 0.0)

(Figure: annual probability of exceedance vs peak ground acceleration [g]: PEGASOS soil (surface), PRA study from the 1980s, and the PRA study replicated with σ = 0.0)

Bommer & Abrahamson (2006)

Aleatory Ground Motion Variability

PRA study replicated with the ground motion variability turned on (σ = 0.67)

Actually obtain larger ground motions with the old model!

(Figure: annual probability of exceedance vs peak ground acceleration [g]: PEGASOS soil (surface), PRA study from the 1980s, PRA study replicated with σ = 0.0, and PRA study replicated with σ = 0.67)

Bommer & Abrahamson (2006)

Epistemic Uncertainty

Epistemic Uncertainty

Arises in hazard analyses due to the existence of multiple models for each part of the overall process

Different source identification approaches
Different methods for modelling seismicity
Different methods for modelling ground motions
Different methods for combining the above

And for different approaches to actual hazard analysis as well

Probabilistic Seismic Hazard Analysis
Deterministic Seismic Hazard Analysis
Parametric-Historic Hazard Analysis

Working Example – Expert Elicitation

Without any collaboration, or discussion, each of you must make an estimate of:

(1) The height of the ceiling of this room (in units of metres, e.g., 27.4 m), and

(2) The volume of this room (in cubic metres, e.g., 2 m³)


         Height of Room   Volume of Room
Mean          3.93           2676.93
Median        4.00            380.00
Mode          4.50            350.00
LQ            3.50            346.25
UQ            4.50           1181.25
S.D.          0.73           7535.29
CoV           0.19              2.81

Expert_Elicitation.xls
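The summary statistics above can be reproduced with a few lines of Python; the elicited heights below are invented placeholders, not the class data.

```python
import statistics

# Hypothetical elicited estimates of the room height (m) from ten
# "experts"; the values are invented for illustration only.
heights = [3.0, 3.5, 3.5, 4.0, 4.0, 4.0, 4.5, 4.5, 4.5, 5.0]

mean = statistics.mean(heights)
median = statistics.median(heights)
stdev = statistics.stdev(heights)      # sample standard deviation
cov = stdev / mean                     # coefficient of variation
print(mean, median, round(cov, 3))
```

A large coefficient of variation, as seen for the volume estimates, signals that the experts disagree far more than the raw standard deviation alone suggests.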

Working Example – Expert Elicitation

Now, imagine that you are me and that I have sought your
expert opinions on these two very important pieces of
information.

What should I do with the results?

If I needed one value, which value should I use?

As experts, now that you have seen the results from the other
experts, would any of you want to modify your estimates?

Of course, in this particular case, rather than paying you all a lot
of money for your services, I could buy a ruler and measure it
myself

When money can’t buy the “right answer”…

What if I asked you to specify the locations of seismic sources?

Now further investment in research isn’t likely to improve things much over a reasonable time frame

I can’t take an ‘average’ of these source models

What do I do?

Reiter (1990)

LLNL & EPRI Hazard Studies, mid 1980s

These are the 11 hazard curves for PGA associated with the 11 seismic source models

Can I take the ‘average’ of these?

Shall I give all of them an equal weight?

Reiter (1990)

PEGASOS SSHAC Level 4

Multiple groups of experts

Organised by the TFIs (Technical Facilitator/Integrators), the “super-experts”

Abrahamson et al. (2002)

Range of Expert Opinions

Bommer (2004)

Not everyone likes the use of experts

However, I hope you will agree that the use of experts should be preferred to the use of non-experts…

Krinitzsky (2003)

Logic Trees

The common approach in PSHA is to capture epistemic uncertainty through the use of a logic tree

A logic tree is a series of branches that provide alternative paths through a standard PSHA calculation procedure

Whenever there is epistemic uncertainty in some part of the process, a node is placed and alternative branches are grown that reflect the range of scientifically viable options

Each branch is assigned a weight, and the weights assigned to the branches leaving each node must sum to unity
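A minimal sketch of the weighted combination of branch results, with invented branch weights and hazard-curve ordinates; the check that the weights sum to unity mirrors the rule above.

```python
# Weighted combination of hazard curves from logic-tree branches.
# The weights and annual exceedance frequencies below are illustrative
# placeholders, not values from a real study.
branches = [
    # (weight, annual exceedance frequencies at a set of PGA levels)
    (0.2, [1e-2, 1e-3, 1e-4]),
    (0.5, [2e-2, 3e-3, 4e-4]),
    (0.3, [5e-3, 5e-4, 5e-5]),
]

weights = [w for w, _ in branches]
assert abs(sum(weights) - 1.0) < 1e-9, "branch weights must sum to unity"

# Mean hazard curve: weight each branch's curve and sum, ordinate by ordinate
n_levels = len(branches[0][1])
mean_curve = [sum(w * curve[i] for w, curve in branches) for i in range(n_levels)]
print(mean_curve)
```

Each branch here stands for one complete PSHA run; the weighted sum gives the mean hazard curve across the tree.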

Not-so-logic trees…

Krinitzsky (2003)

Logic Tree Examples

Logic Tree Examples

Consideration of focal depth may be incorrect here

It is ok if these are distinct locations of sources, but if they are within a given source then this is aleatory variability rather than epistemic uncertainty

Logic Tree Examples

Sequential Independence

The nodes of the logic tree should be sequentially independent

It is quite common to see things like individual nodes for elements of a seismicity model, i.e., Mmax, a & b values

If conservation of moment release is being considered then these parameters are correlated, and some combinations of branches will be physically unrealistic

Youngs & Coppersmith (1985)

Efficiency of Logic Trees

For every path through the logic tree a complete PSHA must be
carried out

The number of branch combinations may grow very large, very quickly, i.e., 2^n if just two options exist at each of n nodes

One needs to take care that the branches are defined well and
as efficiently as possible

This is not an easy task and the poor specification of logic trees
is a major cause of confusion in the profession
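The growth in the number of complete analyses can be illustrated by enumerating every path through a toy tree; the node names and options below are hypothetical.

```python
import itertools

# Enumerate every path through a small logic tree: each node contributes
# one choice per path, so n nodes with two options each give 2**n paths,
# and each path requires a complete PSHA calculation.
nodes = {
    "source_model": ["zonation A", "zonation B"],
    "mmax":         [7.0, 7.5],
    "b_value":      [0.9, 1.1],
    "gm_model":     ["GMM-1", "GMM-2"],
}

paths = list(itertools.product(*nodes.values()))
print(len(paths))  # 2**4 = 16 complete hazard analyses required
```

Adding a single two-option node doubles the number of analyses, which is why pruning unimportant branches early is worthwhile.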

Efficiency of Logic Trees

Initial sensitivity analyses may be extremely useful for identifying which branches are really driving the overall uncertainty

Four teams of source characterisation experts, yet the uncertainty is almost entirely driven by uncertainty in ground motion models!

Compatibility of Logic Tree Inputs

Many models use different input parameters

All of these input parameters must form a consistent set

Ground-motion models are a particularly difficult set of models for which to ensure consistency, as the models draw from a large range of different definitions

The key ones are:
Magnitude
Distance
Component definition
Site classes

Bommer et al. (2005)

Parameter Conversions

Ground motion models make use of a large range of different predictor variables and provide outputs in different ways (component definitions)

For comparisons the models must be applied in a consistent manner, so parameter conversions must be made

Scherbaum et al. (2005)

Error Propagation

With every conversion the errors in the conversions must be propagated, and the overall uncertainty increases as a result

The large numbers of empirical conversions result in a large inflation of the uncertainty

Bommer et al. (2005) Beyer & Bommer (2006)
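Assuming independent, normally distributed conversion errors, the variance inflation can be sketched by adding sigmas in quadrature; all numerical values below are illustrative placeholders.

```python
import math

# Propagating conversion uncertainty: if each empirical conversion carries
# its own standard deviation, and the errors are independent and normal,
# the variances add in quadrature.
sigma_model = 0.60       # ground-motion model sigma (ln units, illustrative)
sigma_mag_conv = 0.15    # magnitude-conversion sigma (illustrative)
sigma_comp_conv = 0.05   # component-definition conversion sigma (illustrative)

sigma_total = math.sqrt(sigma_model**2 + sigma_mag_conv**2 + sigma_comp_conv**2)
print(round(sigma_total, 4))
```

Even modest conversion sigmas inflate the total, and since hazard is very sensitive to sigma, the inflation feeds straight through to the results.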

What are the weights on the branches?

Some people argue that the weights that are assigned to the
branches are probabilities, while others state that these are
degrees of belief, or confidence, in the applicability of a model

Whether or not these values are probabilities or weights has a major impact upon how we manipulate the results

Logic Tree Assumptions

If the weights are probabilities then, through the total probability theorem, the branches should be mutually exclusive and collectively exhaustive

The nodes should also be independent

For the first two conditions some degree of violation may occur, due to the use of common datasets for model development as well as not considering all available models

Bommer & Scherbaum (2008)

Mean vs Median

At long return periods the mean often tends to cross fractiles, as it is influenced strongly by extreme values

This has led some to propose the use of the median rather than the mean because “it is less sensitive to the extreme branches”

Abrahamson & Bommer (2005)
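The sensitivity of the mean to extreme branches can be seen in a small sketch with invented, equally weighted branch ordinates at a single ground-motion level.

```python
import statistics

# Mean vs median of hazard-curve ordinates across logic-tree branches:
# one extreme branch can pull the mean above most of the distribution.
# The annual frequencies below are illustrative, equally weighted values.
branch_rates = [1e-5, 1.2e-5, 1.5e-5, 2e-5, 5e-4]

mean_rate = statistics.mean(branch_rates)
median_rate = statistics.median(branch_rates)
frac_below_mean = sum(r < mean_rate for r in branch_rates) / len(branch_rates)
print(mean_rate, median_rate, frac_below_mean)
```

Here the single extreme branch places the mean above most of the individual branches, which is exactly why it crosses fractiles at long return periods.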

Should we use the mean hazard curve?

NO! … YES!

Near Denver, Colorado (Frank Scherbaum)

How do we compute the design value?
Mean Hazard or Mean Ground Motion?

(Frank Scherbaum)

Interpretation of Hazard Curve Suites

The suites of hazard curves may be interpreted as representing degrees of confidence and safety

This interpretation is similar to a performance-based earthquake engineering framework

Bommer & Scherbaum (2008)

Epistemic Uncertainty

Although the procedures for handling epistemic uncertainty are by no means agreed upon (more epistemic uncertainty!), it is important that something is done to account for this component of the total uncertainty

As Abrahamson (2006) points out, one issue that needs to be addressed is the under-estimation of epistemic uncertainty:
• more data means more alternative models
• more alternative models means more epistemic uncertainty
• therefore more data means more uncertainty????

Common sense should prevail: the largest epistemic uncertainty should be associated with those processes about which we know very little

Processes with data should act as a guide

Total Uncertainty

The terms aleatory and epistemic (as well as ontological) are really just labels to help keep track of uncertainty

Regardless of what we call these components, the key is to capture the total uncertainty

