Challenges of Building Logic Trees For Probabilistic Seismic Hazard Analysis
INTRODUCTION
Probabilistic seismic hazard analyses (PSHA) conducted to define the seismic loading to
be used in the earthquake-resistant design of safety-critical facilities (such as nuclear power
plants) must demonstrate that epistemic uncertainties have been captured in order to provide
regulatory assurance. The tool ubiquitously used for this purpose is the logic tree, first
introduced to the field of PSHA by Kulkarni et al. (1984). Logic trees are in some ways
a cumbersome tool and it is to be hoped, and expected, that superior devices will be devel-
oped in the future. However, for as long as logic trees are being used it is important that their
application is consistent with their intent, and for this reason it is very useful to question their
meaning (e.g., Bommer and Scherbaum 2008, Scherbaum and Kuehn 2011, Musson 2012).
This Opinion Paper addresses a number of practical issues related to the construction of
logic trees for defining the input to PSHA calculations, including correcting a recommenda-
tion made in an earlier publication and also clarifying the views presented in a previous
Opinion Paper that has been interpreted and applied in ways that were not intended. Another
purpose of this current article is to highlight an aspect of logic tree construction that has not,
in my view, received sufficient attention, namely that the process generally involves creating
new models for seismic sources and also for ground-motion prediction. The population of
logic trees with models that represent both the best estimates on the basis of what is known
and alternative models that represent what could occur in light of our lack of knowledge
requires expertise and experience in both the relevant earth science disciplines and in the
characterization of uncertainty. And as the demand for PSHA studies conducted to the
a) Civil & Environmental Engineering Dept., Imperial College London, South Kensington, SW7 2AZ, U.K.
J. J. Bommer, Earthquake Spectra, Volume 28, No. 4, pages 1723–1735, November 2012; © 2012, Earthquake Engineering Research Institute
highest standards for critical facilities grows, we are facing a potential shortage of suitably
qualified professionals with experience in this field.
Commission guidelines for implementing SSHAC processes (USNRC 2012, p. 36), “the
center, body, and range of technically defensible interpretations” (or the CBR of the
TDI, to satisfy the insatiable desire of this field for more acronyms). These distributions
are defined by the models or parameter values on the branches of the logic tree, together
with the weights assigned to those branches. The key point to make at this stage is that
the discrete distribution represented by the branches must both include the best estimates that
the evaluators can develop for each component of the SSC and GMC models and simulta-
neously capture the range of alternatives (in terms of activity rates, earthquake locations,
maximum magnitudes, ground-motion amplitudes, etc.) that should be contemplated in
view of the limitations of the data and the knowledge currently available.
model being “the best model available” (p. 1293), carefully pointing out that this does not mean
the probability of the model being true. I can fully accept the definition given by Scherbaum
and Kuehn (2011), and I think it is perhaps the clearest definition that has yet been proposed.
I could also live with the definition of Musson (2012) were it not for the implicit assumption
that for populating the branches of logic trees we are limited to the available models. This is
reflected in the assertion by Musson (2012) that if only one model for a particular feature
were available it should be assigned a weight (probability) of 1.0 since, by definition, in such
circumstances that model would be the best one available. To my mind, this somewhat
ignores the purpose of the logic tree, which is to capture what we know (our best estimate)
and what we do not know (the epistemic uncertainty). If there are insufficient models avail-
able to populate a logic tree that captures epistemic uncertainty, then new models must be
developed as part of the PSHA. The onus is on the hazard analyst to determine both the best
models for the SSC and GMC inputs and the range of uncertainty associated with the models,
and this must be done regardless of how many or how few alternatives happen to have been
published in the technical literature. This issue is discussed in more detail later in the paper.
In passing, a minor issue to note is that it may frequently be the case that the two branches
emanating from a single node will be assigned the same weight, which means that we would
view them as equally likely to be the best model (or in other words, we have no way of
distinguishing either of them as superior). Another point to note is that if one wishes to adopt
a truly purist interpretation of the weights as probabilities, then the weights on GMC
branches would need to be the conditional probabilities associated with each prediction
model being the best model given the particular SSC logic tree to which the GMC branches
are being added.
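As an aside, the arithmetic implied here can be sketched in a few lines: the weight of each end branch of a combined SSC–GMC logic tree is the product of the weights along its path. The model names and weights below are invented for illustration; in a strictly probabilistic reading, the GMC weights would be the conditional probabilities given the SSC branch.

```python
from itertools import product

# Hypothetical SSC and GMC branch sets; all names and weights are invented.
ssc_branches = {"source-model-1": 0.6, "source-model-2": 0.4}
gmc_branches = {"GMPE-A": 0.5, "GMPE-B": 0.3, "GMPE-C": 0.2}

# The weight of each end branch is the product of the weights along its path.
paths = {
    (ssc, gmc): w_ssc * w_gmc
    for (ssc, w_ssc), (gmc, w_gmc) in product(ssc_branches.items(), gmc_branches.items())
}

# Because the weights at each level sum to one, the path weights sum to one as well.
total = sum(paths.values())
```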
Although I think that Scherbaum and Kuehn (2011) have put forward an excellent defi-
nition for logic tree branch weights, I think it may be useful to have a definition that relates to
the whole logic tree, considering all the branches and their associated weights. Such a defini-
tion might state that a logic tree defines the probability distribution of potential ground-
motion amplitudes at a given location due to possible future earthquakes in the region.
By virtue of the nature of sigma in ground-motion prediction (e.g., Strasser et al. 2009),
each GMPE defines a distribution of ground-motion amplitudes for a given earthquake sce-
nario; a logic tree therefore defines a composite distribution due to multiple GMPEs and
multiple earthquake scenarios.
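To make this composite-distribution view concrete, the following sketch (with invented medians, sigmas, and weights for a single scenario, and only three hypothetical GMPE branches) computes the weighted probability of exceeding a given ground-motion level as a mixture of lognormal branch distributions.

```python
import math

# Hypothetical GMPE branches for one earthquake scenario: each gives a median
# PGA (in g) and a natural-log standard deviation; the weights sum to one.
branches = [
    {"median": 0.20, "sigma": 0.60, "weight": 0.5},
    {"median": 0.25, "sigma": 0.65, "weight": 0.3},
    {"median": 0.15, "sigma": 0.55, "weight": 0.2},
]

def prob_exceed(pga, median, sigma):
    """P(PGA > pga) under a lognormal ground-motion distribution."""
    z = (math.log(pga) - math.log(median)) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def composite_prob_exceed(pga, branches):
    """Weighted mixture of the branch distributions."""
    return sum(b["weight"] * prob_exceed(pga, b["median"], b["sigma"]) for b in branches)

p = composite_prob_exceed(0.30, branches)  # lies between the branch probabilities
```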
because those models are available, the challenge to capture the full range of epistemic uncer-
tainty must still be addressed.
Scherbaum and Kuehn (2011), on the other hand, state that a branch weight of zero “must
correspond to the case in which the analyst is certain that the corresponding model should not
be used” (p. 1238). If we have already populated the branches of our logic tree and are now
testing these models against data, then we may indeed assign zero weights to those models
which simply appear incompatible with local or regional recordings or observations. In
essence, these zero weights correspond to pruning these branches from the logic tree. How-
ever, in the context of considering available models that might be included, it is perfectly
feasible that several models (for example, published GMPEs) will be effectively assigned
weights of zero not because of a belief that they should not be used but simply because
they are not needed in order to construct the distribution of uncertainty. By way of illustra-
tion, suppose a PSHA is being conducted for a region that is not unambiguously defined as
stable or active, and consequently the logic tree is being populated with GMPEs from both
western and eastern North America. In such a case, the spread of the model predictions may
be so large that, given the relatively small differences among the NGA (Next Generation of
Attenuation) models (Abrahamson et al. 2008), it could be unnecessary to include all
five of those GMPEs.
such complications can generally be avoided, partly because GMPEs are commonly derived
using similar parameter definitions, which obviates the need to apply adjustments. Moreover,
it has become possible to apply quality criteria to the selection of, rather than the
weighting of, models (e.g., Bommer et al. 2010). The grading matrix approach can now
be laid to rest, since weights can be assigned to GMPEs addressing only the key
issue of the analyst’s degree-of-belief in each equation being the most appropriate for
the region and site being studied.
Within this new perspective (for GMC modelers), Scherbaum and Kuehn (2011) recommend
that weights be applied to branches sequentially, starting with the probability the analyst
wishes to assign to the model in which he or she has greatest confidence. There is no
doubt that sequential (as opposed to parallel) weighting facilitates genuinely treating the
weights as probabilities throughout the process. In such an approach, which Scherbaum
and Kuehn (2011) also refer to as the stick-breaking method, one begins by assigning a
weight (P) to the most favored model, then the weights assigned to all the other branches
must be taken from the remainder (1-P). I suspect, however, that an analyst will naturally take
cognizance of how many branches are going to share the (1-P) weights, and P will likely be
set smaller if the remainder is to be spread over a large number of alternative branches (in the
same way that the central value of a three-point approximation of a given distribution will
differ from the central value of a seven-point approximation of the same distribution). If this
is the case, then it might be that it is not possible to avoid some element of parallel weighting.
Scherbaum and Kuehn (2011) acknowledge this to some degree by accepting that the process
might need to be iterative, with several sequences applied (presumably with different starting
weights on the most favored model) to the logic tree until the analyst feels that the distribution
reflects the full degree of epistemic uncertainty intended (in other words, avoiding
unjustifiably narrow distributions). I do not have any clear proposal as to how this should be
achieved, but I would argue that we need to find a way to consider the population of the
branches and the assignment of the weights in unison, to create the intended distribution. Encouragingly, new
ideas for assigning branch weights that follow the rules of probability calculus are already
emerging (Runge and Scherbaum 2012).
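The sequential (stick-breaking) assignment described above can be sketched as follows. The model names, the weight P placed on the favored model, and the split of the remainder are all purely illustrative.

```python
# Minimal sketch of sequential ("stick-breaking") weight assignment: the analyst
# first assigns a probability P to the most favored model, then distributes the
# remainder (1 - P) over the other branches.
def stick_break(models, p_favored, remainder_split):
    """models[0] is the most favored model and receives weight p_favored;
    the remaining (1 - p_favored) is divided over models[1:] according to
    the relative shares in remainder_split (which must sum to one)."""
    assert len(remainder_split) == len(models) - 1
    assert abs(sum(remainder_split) - 1.0) < 1e-9
    weights = {models[0]: p_favored}
    for model, share in zip(models[1:], remainder_split):
        weights[model] = (1.0 - p_favored) * share
    return weights

w = stick_break(["GMPE-A", "GMPE-B", "GMPE-C", "GMPE-D"],
                p_favored=0.4, remainder_split=[0.5, 0.3, 0.2])
# The weights sum to one by construction.
```

Note that, as argued above, the choice of P is unlikely to be made independently of how many branches will share the remainder.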
One of the arguments that Scherbaum and Kuehn (2011) advance in favor of sequential
weighting is that following other approaches (parallel weighting) could render the mean
hazard highly insensitive to the actual weights, as was found by Sabetta et al. (2005)
and Scherbaum et al. (2005). Clearly this insensitivity implies that there is a danger of
the hazard analyst feeling confident about having captured epistemic uncertainty simply
by including many branches populated by several available models. The use of visualization
tools such as those proposed by Scherbaum et al. (2010) could assist here because the insen-
sitivity to branch weights will be heightened if several of the models cluster in ground-motion
space (i.e., predict broadly similar levels of motion). This would help to clarify the often
obscure relationship between the weights on models and the resulting distribution of ground-
motion amplitudes, and ultimately hazard estimates. One way this obscurity could be
circumvented entirely would be to construct the GMC logic tree by adopting or creating
a backbone GMPE as the central branch and then creating other branches carrying other
GMPEs obtained by scaling up and down the predicted values from the first equation
(e.g., Bommer and Scherbaum 2008). This approach, which may appear unorthodox to
some, has already been used in practice, as noted in the next section.
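A minimal sketch of the backbone idea follows; the central GMPE below is an invented placeholder and the scale factors and weights are assumed values, chosen only to show the mechanics of scaling a backbone equation up and down to create branches.

```python
import math

# Invented backbone GMPE, for illustration only.
def backbone_median(magnitude, distance_km):
    """Toy central GMPE: median PGA (g) as a function of magnitude and distance."""
    return math.exp(-1.5 + 0.5 * magnitude - 1.1 * math.log(distance_km + 10.0))

# Upper and lower branches scale the backbone medians up and down to span
# the intended range of epistemic uncertainty (assumed factors and weights).
scale_factors = {"lower": 0.7, "central": 1.0, "upper": 1.4}
branch_weights = {"lower": 0.25, "central": 0.5, "upper": 0.25}

def branch_medians(magnitude, distance_km):
    central = backbone_median(magnitude, distance_km)
    return {name: factor * central for name, factor in scale_factors.items()}

medians = branch_medians(6.5, 20.0)
```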
were used to scale average spectral shapes from the other two models. More recently, in the
BC Hydro SSHAC Level 3 PSHA, for which several GMPEs were available for predicting
spectral accelerations due to subduction earthquakes (e.g., Arango et al. 2012), it would have
been possible to construct a logic tree from published GMPEs without neglecting epistemic
uncertainty. However, the GMC technical integrator (TI) team in the BC Hydro project
concluded from their evaluation that none of these models was suitable because none was up
to date, and proceeded to compile a global database of recordings from subduction earthquakes
to derive a new GMPE. Additional branches for the GMC logic tree were created by model-
ing alternative magnitudes at which the scaling changed for the largest events, and by scaling
the median predictions in proportion to the range of average between-event residuals for
recordings from the Cascadia subduction zone.
New GMC models may be created through regressions on strong-motion data sets
assembled as part of the project, through stochastic simulations using parameters obtained
from weak-motion inversions (e.g., Boore 2003), or by making hybrid-empirical adjustments
(Campbell 2003). Even if a project does not derive new equations in one of these ways, the
GMC component of a PSHA is still likely to involve the creation of new models even if only
through the application of adjustments to make the existing equations more applicable to the
site (e.g., Cotton et al. 2006, Van Houtte et al. 2011) or to create additional versions of the
models to better cover the range of epistemic uncertainty (e.g., Petersen et al. 2008).
The apparent lack of appreciation of the general need to develop new GMC models as
well as new SSC models in PSHA studies for critical facilities may partly have been the result
of the highest profile studies to date—Yucca Mountain (Stepp et al. 2001) and PEGASOS,
which is the acronym for the German name of the project (Abrahamson et al. 2002, Renault
et al. 2010)—being conducted as Level 4 studies. In an SSHAC Level 4 study, the evalua-
tions and integrations are performed by a panel of experts making individual assessments that
are coordinated by a technical facilitator/integrator (TFI). New models may be commissioned
from contractors, as happened in the Yucca Mountain study, or created by virtue of applying
adjustments and extensions to existing equations, as happened in the PEGASOS and
PEGASOS Refinement Projects, but this organizational structure is not conducive to the
members of the ground-motion panel working together to develop new models for the
logic tree branches. In an SSHAC Level 3 process, where the TI teams work to develop
a single consensus SSC and GMC logic tree, new models are likely to be the norm, as
was the case in the BC Hydro PSHA. SSHAC Level 3 PSHA studies underway currently
for the Diablo Canyon plant in California, for the Hanford site in Washington, and for a new-
build site in South Africa are all likely to see the TI teams developing new models.
Musson (2012) effectively states, quite reasonably, that such endeavors to create new
models may be beyond the scope of hazard studies conducted over short schedules and
with limited resources, for those applications in which such hazard assessments are consid-
ered acceptable. However, if SSHAC Level 3 PSHA studies are to be conducted for safety-
critical facilities at multiple sites within a country or region, then the benefits could be spread
to other applications by conducting SSHAC Level 3 studies to develop regional SSC and
GMC models, which can then be refined for site-specific studies (adding any local seismic
sources and modifying the ground motions for the near-surface geo-materials) using a Level
2 study (e.g., Coppersmith and Bommer 2012). This is the rationale behind the recently
completed CEUS SSC (Central and Eastern United States Seismic Source Characterization
for Nuclear Facilities) Project, which will no doubt also be used as input to regional and site-
specific hazard assessments for non-nuclear applications.
technical expertise required in the relevant subjects becomes even more demanding. This
expertise is acquired through both academic study and practical experience, benefits
from some engagement with research, and requires continuous updating through participa-
tion in scientific meetings and contributions (as authors and/or reviewers) to the literature.
The global pool of experts in these fields (as they are applied to PSHA) is relatively small,
and the various projects mentioned earlier have absorbed a significant proportion of those
available for engagement in high-level seismic hazard studies. If the global nuclear renais-
sance continues to advance, and with the expectations and requirements for the rigor afforded
by Level 3 and 4 assessments raised by the impact of the Tohoku earthquake at Fukushima,
then there is going to be a genuine challenge to assemble teams for all of the studies that
might be undertaken. A positive move to address this issue is the involvement of several
young people in active roles in most of the projects that have been mentioned, which provides
training and preparation.
In passing, it is also worth mentioning that there are two absolutely vital roles in an
SSHAC Level 3 study: the evaluator-integrator role covered by the TI team, and the
participatory peer review panel (PPRP), which undertakes continuous review of the process and
the technical bases for all decisions throughout the project. Membership of a PPRP requires
detailed technical knowledge in key subject matters (collectively, the panel must cover all
relevant disciplines), extensive experience in PSHA, and a thorough understanding of the
SSHAC process. There is as much need to expand the pool of experts for this role as
for populating the TI teams on SSHAC Level 3 projects.
However, the expertise of the TI team and PPRP members in earth sciences and PSHA
may not be sufficient by itself, since ultimately the SSHAC Level 3 process is defining logic
trees that represent distributions that will be used as probability distributions. Scherbaum and
Kuehn (2011) raise the issue of the need for those constructing logic trees to possess what
they call “normative expertise,” questioning whether this might be “often outside the range of
competence and/or interest of hazard analysts” (p. 1239). These philosophical aspects have
clearly not been a focus of interest for many seismic hazard practitioners, and it is a useful
contribution of Scherbaum and Kuehn (2011) to bring attention to these often neglected
issues. However, whether this is beyond the competence of those who conduct seismic
hazard analyses really depends on the degree of normative expertise that is required.
I would assert that what is needed for PSHA is a good grasp of the fundamentals of
probability and a working appreciation of their application, in which case, with a modest effort, all
hazard analysts should be able to equip themselves with the required knowledge. Scherbaum
and Kuehn (2011) assert only that normative expertise needs to be present in PSHA projects
and rigorously applied, rather than requiring all hazard analysts to become normative experts.
I would contend that all hazard analysts should become conversant with the axioms of prob-
ability since they will be assigning logic tree branch weights. Scherbaum and Kuehn (2011)
provide a useful definition for what is meant by normative expertise: the ability “to design
unbiased and logically consistent probability estimates on models” (p. 1246). When technical
experts are assembled for an SSHAC Level 3 project, the early stages (and the first sessions
of most workshops) will include ‘training’ on the SSHAC process, the responsibilities and
duties associated with each role, and the perils of cognitive bias. These preparatory sessions
may also include the basics of PSHA, and, taking the message of Scherbaum and Kuehn
(2011) on board, I would argue that fundamentals of probability should also be added to
the core curriculum of the general training on SSHAC procedures and seismic hazard
analysis.
CONCLUDING REMARKS
Building logic trees for PSHA involves populating the branches with models (or para-
meter values) and their associated weights, so that the two together yield the distribution of
the predicted feature (seismicity distributions and rates or ground-motion amplitudes) that
reflects the center, body, and range of technically defensible interpretations. Ultimately,
we need to create logic trees that will capture what we do know in terms of our best estimates
and what we do not know about the parameters being predicted, as discrete approximations to
continuous distributions. High-level PSHA studies for critical facilities involve exciting tech-
nical challenges, but there is now also an urgent need to expand the pool of experts available
to conduct such studies. The experts need in the first instance to be established in one or more
of the disciplines required to build SSC or GMC models for PSHA, and then to acquire
experience in PSHA, specifically including participation in SSHAC Level 3 or 4 projects.
An understanding of cognitive bias, the fundamentals of probability theory, the engineering
application of earthquake ground motion in design and fragility analysis, and nuclear
regulation also needs to be acquired. Clear guidelines on how to implement SSHAC Level 3
processes are now available (USNRC 2012) and these will no doubt be refined and updated
in the light of experience; the urgent challenge now is to develop a much broader pool of
experts to make the implementation possible. Bringing young and enthusiastic scientists and
engineers into major PSHA projects, to acquire the necessary experience, is essential if the
demands of future projects are to be met.
This is the seventh Opinion Paper related to the use of logic trees in PSHA to appear in
Earthquake Spectra since 2005. To my mind, this confirms that while logic trees have been
widely and effectively deployed in seismic hazard analyses, some of the more subtle aspects
of their construction and application have not been considered in sufficient depth, and con-
sequently, the state-of-the-practice is still being defined. I hope that others will feel inspired—
or provoked—to enter the debate and bring alternative perspectives and ideas to these
exchanges in order to enrich and broaden the conclusions that will eventually emerge.
Another contribution that would be of great benefit would be for the innovations and
ideas developed for the construction of logic trees in SSHAC Level 3 and 4 PSHA projects
to be published in the mainstream literature (to the degree that confidentiality requirements
allow) in order to share experience and developments beyond the teams of participants in each
of these projects.
ACKNOWLEDGMENTS
I am grateful to all those with whom I have enjoyed stimulating discussions of these
topics and whose ideas and questions have challenged my own views and therefore contrib-
uted to their evolution, especially Norm Abrahamson and the experts in the PEGASOS and
PEGASOS Refinement Projects. In this regard, special mention is due to Frank Scherbaum,
with whom I am in violent agreement on many issues and with whom I truly enjoy debating
our minor differences of opinion (and invariably learn a great deal in the process). My sincere
thanks are also due to Roger Musson for providing a preprint of his paper and for continuing
these discussions in a friendly and constructive spirit. I am very grateful to Peter Stafford for
constructive feedback on the first draft of this paper and for excellent insights and suggestions
regarding aspects of probability, and to Frank Scherbaum and Nico Kuehn for stimulating
discussions triggered by an early version of this manuscript. Frank Scherbaum, Roger
Musson, and an anonymous reviewer all provided constructive and encouraging reviews
of the original submission, which are gratefully acknowledged.
REFERENCES
Abrahamson, N., Atkinson, G., Boore, D., Bozorgnia, Y., Campbell, K., Chiou, B., Idriss, I.M.,
Silva, W., and Youngs, R., 2008. Comparison of the NGA ground-motion relations. Earth-
quake Spectra 24, 45–66.
Abrahamson, N. A., Birkhauser, P., Koller, M., Mayer-Rosa, D., Smit, P., Sprecher, C., Tinic, S.,
and Graf, R., 2002. PEGASOS: A comprehensive probabilistic seismic hazard assessment for
nuclear power plants in Switzerland, Paper No. 633, Twelfth European Conference on Earth-
quake Engineering, 9–13 September 2002, London, England.
Abrahamson, N. A., and Bommer, J. J., 2005. Probability and uncertainty in seismic hazard ana-
lysis, Earthquake Spectra 21, 603–607.
Arango, M. C., Strasser, F. O., Bommer, J. J., Cepeda, J. M., Boroschek, R., Hernandez, D. A.,
and Tavera, H., 2012. An evaluation of the applicability of current ground-motion models to
the South and Central American subduction zones, Bulletin of the Seismological Society of
America 102, 143–168.
Bommer, J. J., and Akkar, S., 2012. Consistent source-to-site distance metrics in ground-motion
prediction equations and seismic source models for PSHA, Earthquake Spectra 28, 1–15.
Bommer, J. J., Douglas, J., Scherbaum, F., Cotton, F., Bungum, H., and Fäh, D., 2010. On the
selection of ground-motion prediction equations for seismic hazard analysis, Seismological
Research Letters 81, 794–801.
Bommer, J. J., and Scherbaum, F., 2008. The use and misuse of logic-trees in probabilistic seis-
mic hazard analysis, Earthquake Spectra 24, 997–1009.
Bommer, J. J., Scherbaum, F., Bungum, H., Cotton, F., Sabetta, F., and Abrahamson, N. A., 2005.
On the use of logic trees for ground-motion prediction equations in seismic-hazard analysis,
Bulletin of the Seismological Society of America 95, 377–389.
Boore, D. M., 2003. Simulation of ground motion using the stochastic method, Pure and Applied
Geophysics 160, 635–676.
Budnitz, R. J., Apostolakis, G., Boore, D. M., Cluff, L. S., Coppersmith, K. J., Cornell, C. A., and
Morris, P. A., 1997. Recommendations for Probabilistic Seismic Hazard Analysis: Guidance
on uncertainty and use of experts, NUREG/CR-6372, two volumes, U.S. Nuclear Regulatory
Commission, Washington D.C.
Campbell, K. W., 2003. Prediction of strong ground motion using the hybrid empirical method
and its use in the development of ground motion (attenuation) relations in eastern North
America, Bulletin of the Seismological Society of America 93, 1012–1033.
Coppersmith, K. J., and Bommer, J. J., 2012. Use of the SSHAC methodology within regulated
environments: Cost-effective application for seismic characterization at multiple sites, Nuclear
Engineering and Design 245, 233–240.
Cotton, F., Scherbaum, F., Bommer, J. J., and Bungum, H., 2006. Criteria for selecting and
adjusting ground-motion models for specific target regions: applications to Central Europe
and rock sites, Journal of Seismology 10, 137–156.
Crouse, C. B., 1991. Ground-motion attenuation equations for earthquakes on the Cascadia sub-
duction zone, Earthquake Spectra 7, 210–236.
Kulkarni, R. B., Youngs, R. R., and Coppersmith, K. J., 1984. Assessment of confidence intervals
for results of seismic hazard analysis, in Proceedings, Eighth World Conference on Earth-
quake Engineering, vol. 1, International Association for Earthquake Engineering, Tokyo,
Japan, 263–270.
McGuire, R. K., Cornell, C. A., and Toro, G. R., 2005. The case for the mean hazard curve,
Earthquake Spectra 21, 879–886.
Musson, R. M. W., 2005. Against fractiles, Earthquake Spectra 21, 887–891.
Musson, R., 2012. On the nature of logic trees in probabilistic seismic hazard assessment,
Earthquake Spectra 28, 1291–1296.
Petersen, M. D., Frankel, A. D., Harmsen, S. C., Mueller, C. S., Haller, K. M., Wheeler, R. L., Wesson,
R. L., Zeng, Y., Boyd, O. S., Perkins, D. M., Luco, N., Field, E. H., Wills, C. J., and Rukstales, K. S.,
2008. Documentation for the 2008 Update of the United States National Seismic Hazard Maps, U.S.
Geological Survey Open-File Report 2008-1128, USGS, Reston, VA.
Renault, P., Heuberger, S., and Abrahamson, N. A., 2010. PEGASOS Refinement Project: An
improved PSHA for Swiss nuclear power plants, Paper No. ST4-0991, 14th European Con-
ference on Earthquake Engineering, 30 August–3 September 2010, Ohrid, Macedonia.
Runge, A., and Scherbaum, F., 2012. The quantification of consistent logic tree branch weights
for PSHA (abstract), Seismological Research Letters 83, DOI: 10.1785/gssrl.83.2.316.
Sabetta, F., Lucantoni, A., Bungum, H., and Bommer, J. J., 2005. Sensitivity of PSHA results to
ground motion prediction relations and logic-tree weights, Soil Dynamics and Earthquake
Engineering 25, 317–329.
Scherbaum, F., Bommer, J. J., Bungum, H., Cotton, F., and Abrahamson, N. A., 2005. Composite
ground-motion models and logic-trees: methodology, sensitivities and uncertainties, Bulletin of
the Seismological Society of America 95, 1575–1593.
Scherbaum, F., and Kuehn, N. M., 2011. Logic tree branch weights and probabilities: Summing
up to one is not enough, Earthquake Spectra 27, 1237–1251.
Scherbaum, F., Kuehn, N. M., Ohrnberger, M., and Koehler, A., 2010. Exploring the proximity of
ground-motion models using high-dimensional visualization techniques, Earthquake Spectra
26, 1117–1138.
Stepp, J. C., Wong, I., Whitney, J., Quittemeyer, R., Abrahamson, N., Toro, G., Youngs, R.,
Coppersmith, K., Savy, J., Sullivan, T., and Yucca Mountain PSHA Project Members,
2001. Probabilistic seismic hazard analyses for ground motions and fault displacements at
Yucca Mountain, Nevada, Earthquake Spectra 17, 113–151.
Strasser, F. O., Abrahamson, N. A., and Bommer, J. J., 2009. Sigma: Issues, insights, and chal-
lenges, Seismological Research Letters 80, 40–56.
U.S. Nuclear Regulatory Commission (USNRC), 2012. Practical Implementation Guidelines for
SSHAC Level 3 and 4 Hazard Studies, NUREG 2117, Washington D.C.
Van Houtte, C., Drouet, S., and Cotton, F., 2011. Analysis of the origins of κ (kappa) to compute
hard rock to rock adjustment factors for GMPEs, Bulletin of the Seismological Society of
America 101, 2926–2941.
Youngs, R., Chiou, S. -J., Silva, W. J., and Humphrey, J. R., 1997. Strong ground motion attenua-
tion relationships for subduction zone earthquakes, Seismological Research Letters 68, 58–73.
(Received 3 March 2012; accepted 17 May 2012)