
Operational Risk Modelling

Aniruddho “Ani” Sanyal, PhD


Senior Consultant
Wolters Kluwer Financial Services

April 9, 2014, New York


Agenda

• Operational Risk management – latest trends


• Economic Rationale of the methodology – avoiding
certain pitfalls.
• Getting an AMA-compliant measure of Op Risk capital
that ties in the requisite components –
– The Change of Measure (COM) approach
• Implementing the COM approach
– in the MS Excel© environment and a proprietary Add-In
file built by WKFS using Matlab Excel Builder
– Heavy-lifting only possible because of the MCR

2
Agenda

• Operational Risk management – latest trends


• Economic Rationale of the methodology – avoiding the
pitfalls.
• Getting an AMA-compliant measure of Op Risk capital
that ties in the requisite components –
– The Change of Measure (COM) approach
• Implementing the COM approach
– in the MS Excel© environment and a proprietary Add-In
file built by WKFS
– Heavy-lifting only possible because of the MCR

3
ORM – Latest trends

• Inadequate controls and procedures leading to events such as JP Morgan’s “London Whale” or rogue trader Kweku Adoboli at UBS – an image problem
• Shareholders have joined regulators in seeking transparency
• No longer a siloed part of risk but an important toolkit for running the bank: AML and Compliance overlap with ORM
• Capital calculation directive remains the same as under Basel II, but capital got more expensive under Basel III.
– BIA and STA (Basic Indicator and Standardised Approaches) – punitive, unsuitable for stress-testing.

4
ORM – Latest trends

• The cross-over between Pillars I and II is getting business managers involved in op risk quantification, rather than just quant specialists. A weak Pillar I leads to a weak Pillar II and additional charges

5
Agenda

• Operational Risk management – latest trends


• Economic Rationale of the methodology – avoiding the
pitfalls.
• Getting an AMA-compliant measure of Op Risk capital
that ties in the requisite components –
– The Change of Measure (COM) approach
• Implementing the COM approach
– in the MS Excel© environment and a proprietary Add-In
file built by WKFS
– Heavy-lifting only possible because of the MCR

6
Economic Rationale

• Collaborated with Dr. Kabir Dutta – a recognized OpRisk expert from Wharton Business School and the Federal Reserve Bank of Boston.
• His paper with Jason Perry, “A Tale of Tails”, is one of the most downloaded in the Op Risk literature.
• His Change of Measure approach to incorporating scenarios is gaining a lot of attention in the US, the UK, the Middle East, and elsewhere.
• A way to layer scenario analysis on ILD, and a more intuitive and pragmatic alternative to Bayesian Belief Networks (BBN).

7
Economic Rationale
[Diagram: Capital Calculation at the center, fed by Internal Loss Data, External Loss Data (?), Scenario Analysis, and BEICF/RCSA, framed by Stress Testing and Regulatory Guidance]

8
Economic Rationale

• Dutta-Perry standards in AMA modeling


– Good Fit – Statistically, how well does the method fit the data?
– Realistic – If a method fits well in a statistical sense, does it generate a loss distribution with a realistic capital estimate?
– Well-Specified – Are the characteristics of the fitted data similar to the loss data and logically consistent?
– Flexible – How well is the method able to reasonably accommodate a wide variety of empirical loss data?
– Simple – Is the method easy to apply in practice?

9
Economic Rationale

• “My view, when I was a regulator, was that coupling scenario analysis with risk modeling was the most effective way to go. No model will capture all types of risk, but models allow for systematic data analysis, which should be an important part of, but not the whole of, op risk management. Scenario analysis allows you to consider risks not well captured in the model and, thereby, increase awareness of – and monitoring of – those risks. The supervisory process should consider current and prospective risks, which may not be fully reflected in historical data, and how to mitigate them.”
• Randall Kroszner, professor of economics at the University of Chicago Booth School of Business and a former governor of the Federal Reserve System, 2006 to 2009.
• COM is, to date, the most robust way of achieving this.

10
Economic Rationale

• Humans are not “Econs”


– Systems 1 and 2
– Remembering vs. experiencing selves
– Cognitive illusions
• Biases
• Priming
• Anchoring
– Framing
• “Sausages are 90% fat free” sounds healthier than “containing 10% fat”!
– Heuristics
• Mental shortcuts and rules of thumb
• Dependence on readily available data rather than due diligence – the “availability heuristic”

11
Economic Rationale

• Immediate implications
– Gather data in the range-frequency format (see the sketch after this list)
• At the level of factors rather than processes.
• Frequency as “once in n years” – not “probabilities”
– Deprecate “median and 95th percentile loss” kind of thinking!
• Can be very misleading due to anchoring effects
• Cannot accommodate multiple overlapping perceptions
• Too often GIGO (garbage in, garbage out)!
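A toy Python illustration of the range-frequency format: each scenario row carries a loss range [a, b] and a “once in n years” frequency, which maps to an implied annual rate of 1/n rather than a probability. All figures and field names here are made up for illustration.

```python
# Hypothetical scenario rows in range-frequency format
scenarios = [
    {"range": (1_000_000, 5_000_000),  "once_in_years": 5},
    {"range": (5_000_000, 20_000_000), "once_in_years": 25},
]

for s in scenarios:
    a, b = s["range"]
    lam = 1.0 / s["once_in_years"]   # implied annual frequency, not a probability
    print(f"[{a:>12,} , {b:>12,}]  ->  {lam:.2f} events/year")
```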

12
Agenda

• Operational Risk management – latest trends


• Economic Rationale of the methodology – avoiding the
pitfalls.
• Getting an AMA-compliant measure of Op Risk capital
that ties in the requisite components –
– The Change of Measure (COM) approach
• Implementing the COM approach
– in the MS Excel© environment and a proprietary Add-In
file built by WKFS
– Heavy-lifting only possible because of the MCR

13
AMA-compliant measure of Op Risk capital

• Relating to operational risk


• Applying the Dutta-Babbel (2012) COM approach
– Methodology applied in 12 major US/Canadian institutions
– WKFS: first packaged implementation of the above model
– The only approach that can combine internal loss data with scenario analysis to get to a single capital number.
– Uses the familiar Excel front-end with a compiled Matlab® engine in the background
– Follows the paper closely, including the choice of key severity distributions (more can be added!)

14
AMA-compliant measure of Op Risk capital
Essence of the methodology – the “residual” approach (a sketch in code follows the steps)
1. m events are supposed to occur in t years
2. t sets of observations are taken, each of length n_i
3. Of the nTot observations, k occurrences are actually found in [a,b], while m is predicted by the scenario
4. If m > k, m - k additional samples are taken from [a,b] and added to nTot
5. Parameters p1, p2, and the frequency are re-estimated, and VaR is calculated using the SLA
6. Steps 2 through 5 are repeated 10,000 times
7. The median of the 10,000 VaR observations is taken
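A minimal Python sketch of these steps, with loudly flagged assumptions: a lognormal severity fitted by log-moments, residual draws taken uniformly from [a, b], and the standard single-loss approximation for VaR. Names and the residual-draw rule are illustrative – this is not the WKFS/Dutta-Babbel implementation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def com_capital(losses, t_years, a, b, m, alpha=0.999, n_sims=10_000):
    """Residual-approach capital: median VaR over n_sims replications.

    losses : internal loss amounts observed over t_years
    [a, b] : scenario loss range; m : events predicted in t_years
    """
    losses = np.asarray(losses, dtype=float)
    var_sims = np.empty(n_sims)
    for i in range(n_sims):
        # Steps 2-3: resample the internal data, count hits k in [a, b]
        sample = rng.choice(losses, size=losses.size, replace=True)
        k = np.count_nonzero((sample >= a) & (sample <= b))
        # Step 4: if the scenario predicts more events than observed,
        # add m - k residual draws from [a, b] (uniform here -- an assumption)
        if m > k:
            sample = np.concatenate([sample, rng.uniform(a, b, size=m - k)])
        # Step 5: re-estimate severity parameters (p1, p2) and frequency
        p1, p2 = np.log(sample).mean(), np.log(sample).std(ddof=1)
        lam = sample.size / t_years
        # SLA: VaR_alpha ~= F^{-1}(1 - (1 - alpha) / lam)
        var_sims[i] = stats.lognorm.ppf(1 - (1 - alpha) / lam,
                                        s=p2, scale=np.exp(p1))
    # Step 7: take the median of the simulated VaRs
    return np.median(var_sims)

# Toy usage with synthetic internal data and one scenario
ild = rng.lognormal(mean=10.0, sigma=1.5, size=100)   # 100 losses over 10 years
print(f"Capital: ${com_capital(ild, 10, 5e5, 5e6, m=8):,.0f}")
```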

15
AMA-compliant measure of Op Risk capital

[Process flow:]
1. Summary statistics using internal data (generate data if internal data does not exist)
2. Parameter estimation (Lognormal, Loglogistic, Loggamma, and Weibull), plus the empirical distribution
3. Change of measure on some or all distributions, using GOF tests (AD, KS, Chi-Square, QQ-plots); re-estimate parameters using MLE
4. Estimate capital using the new distribution: the SLA (see the formula below) is used for most distributions; Monte Carlo with 1,000,000 trials is used for scenarios (10,000 trials)
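For reference, the single-loss approximation (SLA) invoked above is usually stated, for a subexponential severity cdf F and annual loss frequency λ, in the standard textbook form below; the deck does not say which SLA variant the engine uses, so take this as the generic one.

```latex
\mathrm{VaR}_{\alpha} \;\approx\; F^{-1}\!\left(1 - \frac{1-\alpha}{\lambda}\right),
\qquad \alpha \to 1 .
```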

16
Agenda

• Operational Risk management – latest trends


• Economic Rationale of the methodology – avoiding the
pitfalls.
• Getting an AMA-compliant measure of Op Risk capital
that ties in the requisite components –
– The Change of Measure (COM) approach
• Implementing the COM approach
– in the MS Excel© environment and a proprietary Add-In
file built by WKFS
– Heavy-lifting only possible because of the MCR

17
Implementing COM – Demonstration!

Can accommodate varied perceptions – there is no need to build artificial consensus.

18
Implementing the COM approach

19
Implementing the COM approach

• Industry-standard Goodness-of-Fit tests
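A hedged sketch of two of the GOF tests named here, using scipy on synthetic data. Since the candidate severities are log-family distributions, a lognormal fit is tested on the log scale; all figures are illustrative, not from the tool.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
losses = rng.lognormal(mean=10.0, sigma=1.5, size=200)  # stand-in for ILD
log_losses = np.log(losses)

# Fit a lognormal by estimating normal parameters on the log scale
mu, sigma = log_losses.mean(), log_losses.std(ddof=1)

# Kolmogorov-Smirnov against the fitted normal (note: using estimated
# parameters makes the nominal p-value optimistic)
ks_stat, ks_p = stats.kstest(log_losses, "norm", args=(mu, sigma))
print(f"KS: stat={ks_stat:.4f}, p={ks_p:.4f}")

# Anderson-Darling test for normality of the log-losses
ad = stats.anderson(log_losses, dist="norm")
print(f"AD: stat={ad.statistic:.4f}, 5% critical value={ad.critical_values[2]:.4f}")
```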

20
Implementing the COM approach

[QQ-plot: observed loss quantiles against fitted quantiles for the Lognormal, Loglogistic, Loggamma, and Weibull distributions, with a linear fit through the observed points; both axes run from $248,933 to $4,248,933]

21
Implementing the COM approach
Chosen percentile (e.g., 99, 99.9, 99.95) reflects risk appetite. Based on the selected risk-appetite percentile we get a SINGLE number that is inclusive of own loss experience and scenario data.

OpVaR based on ILD
Distribution   Param 1       Param 2      Frequency    99             99.9            99.95
Lognormal      9.9625555     1.4957571    10           $2,742,665     $6,112,252      $7,729,540
Loglogistic    9.96657069    0.8482016    10           $8,571,680     $53,738,478     $95,856,331
Loggamma       43.1285805    0.2309966    10           $5,225,300     $17,154,322     $24,409,400
Weibull        44924.1393    0.6480781    10           $1,440,566     $1,935,826      $2,099,504
Empirical      -             -            10           $4,983,625     $6,333,945      $6,851,424

OpVaR based on ILD and Scenario Data
Distribution   Param 1       Param 2      Frequency    99             99.9            99.95
Lognormal      10.0157787    1.5773649    9.8225806    $3,589,584     $8,526,057      $10,965,963
Loglogistic    9.89408297    0.8871885    9.5483871    $10,075,948    $68,635,229     $125,795,686
Loggamma       39.4140262    0.2518867    10.306452    $6,574,492     $23,316,968     $33,914,233
Weibull        56200.3688    0.5733778    10.193548    $2,470,153     $3,537,540      $3,902,428
Empirical      -             -            10.080645    $9,382,157     $12,735,458     $13,393,003

Percentiles of the UOM loss data represented here:
Percentile   Loss
50           $31,867
75           $67,142
85           $102,301
90           $133,841
95           $251,213
97           $361,347
99           $971,416
99.9         $3,941,731
99.97        $4,731,303

Summary statistics: Count 800; Aggregate loss amount $64,152,727; Min $10,173; Max $4,731,303; Mean $80,191; Std $231,942.02; Skewness 12.57; Kurtosis 218.97.

(A sketch of how such OpVaR figures are produced follows.)
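A minimal Python sketch of how OpVaR figures like those above can be produced: Poisson frequency plus lognormal severity, aggregate annual losses by Monte Carlo, then the 99/99.9/99.95 percentiles. The parameters are read from the Lognormal row of the ILD table; the trial count is reduced for speed, so the output will only be roughly in line with the table, and this is a generic LDA recipe rather than the tool’s exact procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, p1, p2 = 10.0, 9.9625555, 1.4957571   # Frequency, Param 1, Param 2
n_trials = 100_000                          # the deck cites up to 1,000,000

# Aggregate annual loss = sum of N ~ Poisson(lam) lognormal severities
counts = rng.poisson(lam, size=n_trials)
annual = np.array([rng.lognormal(p1, p2, size=n).sum() for n in counts])

for q in (99, 99.9, 99.95):
    print(f"{q}% OpVaR: ${np.percentile(annual, q):,.0f}")
```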

22
Implementing the COM approach
• Why we chose the Excel Builder
[Screenshot: interactive debugging from the Function Wizard alongside standard Excel VBA coding and debugging]

23
Implementing the COM approach

• Stand-alone – can be used with other platforms
• Runs 50K simulations in 20 minutes and 10 billion simulations in 12 hours (Windows 7, 32-bit Excel)
– Faster with 64-bit Excel
– Can be sped up further with Matlab Production Server
• Matlab is the pre-eminent platform for AMA modeling – distantly followed by open-source alternatives
• Great support
• Excel is the center of the workflow (can be replaced with an in-house platform by plugging in the C++ component)

24
Implementing the COM approach

• Combines multiple elements of AMA – ILD and scenarios
• Cross-checks scenarios for redundancy
• External data can be included to inform scenarios and/or choose starter distributions when there is no internal data
• Scenario selection can be made using RCSA, BEICF, and other qualitative information on self-assessments
– Scenarios should be prioritized based on the riskiness of the business as evaluated by the qualitative assessments
• More stable than conventional AMA
– Finite capital estimates (instability was the scourge of older models)
• Adopting this methodology will be simpler and more intuitive for bankers – the analyst/risk manager collects data and generates reports

25
Implementing the COM approach

• Can accommodate LOBs/event types with little or no loss data.
• Can make the model more heavily weighted toward scenarios – using the empirical distribution approach
– Takes much longer to run – but not years to gather data!
– There are always Op Risk losses that can be gathered from the general ledger
• Works better than the BBN
– Easier for bankers to understand.
– No need to maintain/transfer complex NPTs (node probability tables) when key people move on.

26
Implementing the COM approach
Used ReportBuilder and SSIS to integrate with the platform

27
Q&A

28
