
Soft Comput

DOI 10.1007/s00500-015-1847-6

METHODOLOGIES AND APPLICATION

Relevant applications of Monte Carlo simulation in Solvency II


Giuseppe Casarano1 · Gilberto Castellani2 · Luca Passalacqua2 ·
Francesca Perla3 · Paolo Zanetti3

© Springer-Verlag Berlin Heidelberg 2015

Abstract The definition of solvency for insurance companies, within the European Union, is currently being revised as part of the Solvency II Directive. The new definition induces revolutionary changes in the logic of control and expands the responsibilities in business management. The rationale of the fundamental measures of the Directive cannot be understood without reference to probability distribution functions. Many insurers are struggling with the realisation of a so-called "internal model" to assess risks and determine the overall solvency needs, as requested by the Directive. The quantitative assessment of the solvency position of an insurer relies on Monte Carlo simulation, in particular on nested Monte Carlo simulation, which produces very hard computational and technological problems to deal with. In this paper, we address methodological and computational issues of an "internal model", designing a tractable formulation of the very complex expectations resulting from the "market-consistent" valuation of fundamental measures, such as Technical Provisions, Solvency Capital Requirement and Probability Distribution Forecast, in the solvency assessment of life insurance companies. We illustrate the software and technological solutions adopted to integrate the Disar system—an asset–liability computational system for monitoring life insurance policies—in advanced computing environments, thus meeting the demand for high computing performance that makes feasible the calculation process of the solvency measures covered by the Directive.

Keywords Life insurance policies · Monte Carlo simulation · Nested simulation · Modelling uncertainty · Stochastic models · Risk assessment · Asset–liability management

Communicated by V. Loia.
Corresponding author: Paolo Zanetti
paolo.zanetti@uniparthenope.it

Giuseppe Casarano
giuseppe.casarano@alef.it

Gilberto Castellani
gilberto.castellani@uniroma1.it

Luca Passalacqua
luca.passalacqua@uniroma1.it

Francesca Perla
francesca.perla@uniparthenope.it

1 Alef Advanced Laboratory Economics and Finance, Rome, Italy
2 Department of Statistical Sciences, Sapienza University of Rome, Rome, Italy
3 Department of Management and Quantitative Studies, University of Naples "Parthenope", Naples, Italy

1 Introduction

With the European Directive 2009/138—the Solvency II Directive (2009) "on the taking-up and pursuit of the business of Insurance and Reinsurance"—probability distribution functions enter significantly the balance sheets of European insurance companies. The Directive changes the management style of insurance undertakings, changes the logic of the evaluation process of the fundamental measures (reserves, solvency margin) and requires insurance undertakings to evaluate values and risks in a "market-consistent way"1, thus giving a prominent role to the "evaluation model under conditions of uncertainty". The Directive hence introduces new valuation criteria ("mark to market" and "mark to model") and neologisms in the glossary of insurance companies—Technical Provisions (TP) in place of mathematical reserve, Solvency Capital Requirement (SCR) in place of solvency margin, Probability Distribution Forecast (PDF)—; it requires to evaluate the Technical Provisions in a market-consistent way and to measure the Solvency Capital Requirement with the Value-at-Risk (VaR) approach (confidence level = 99.5 %, unwinding period = 1 year). The application of the Solvency II principles gives rise to very hard theoretical and computational problems; the market-consistent valuation, the estimation of the SCR and the elicitation of distribution functions are very computationally demanding and "time-consuming", because of the complexity of the contracts and the great number of contracts in each portfolio. For portfolios usually held by insurance companies, closed-form solutions are not available, and evaluation models rely on Monte Carlo simulation. However, the estimation of some fundamental measures in Solvency II, like the SCR, requires the simulation of a very large number of scenarios to describe the tails of the distribution, thus resulting in a "compute-intensive" process. To make the evaluation model of "effective" use, therefore, the computational performance of the simulation process plays a crucial role.

1 Whenever the term "market-consistent" valuation is referred to in this paper, it is to be construed as follows: if the contracts are hedgeable (and then a market valuation is available) the market-consistent value is given by the market price, that is to say a "marked to market" valuation; if the contracts are non hedgeable (and then a market valuation is unavailable) the market consistency must be guaranteed by an evaluation model, that is to say a "marked to model" valuation.

The implications and requirements introduced by the Directive become particularly compelling when the companies calculate TP and SCR using an "internal model" (Directive 2009, art. 112). The "internal model" is a system used by the undertakings to assess risks and determine the overall solvency needs, ensuring the quality standards indicated by the Directive and subject to the approval of the national supervisory authority (Directive 2009, art. 112–127). Nevertheless, the "internal model" must allow to obtain responses in a suitable turnaround time; it is then a challenging matter to focus on numerical simulation, with the aim of obtaining adaptive solution processes, that is, processes capable of being properly scaled to balance accuracy and computational efficiency on demand, depending on the evaluation context.

There is further an issue of timeliness, beside methods and culture; the first reporting on the Forward Looking Assessment of Own Risks (Directive 2009, art. 45; EIOPA 2013), based on the Own Risk and Solvency Assessment principles (Directive 2009, art. 45; EIOPA 2012), had to be delivered to the national regulator by 31 October 2014; the first annual reporting within twenty-two weeks from the end of 2014; the first quarterly reporting within eight weeks from the end of the third quarter of 2015.

For all this it is mandatory for firms to quickly adopt a complex simulation system, able to provide market-consistent evaluation of values and risks and to perform timely measurements, as required by regulations, to carry out continuous verifications. The development of such a system requires a strong synergy between high-level theory and high-level technology, that is a synergy between models and techniques of quantitative finance, computational schemes and data management. The appropriateness of data quality and models, as well as the accuracy and efficiency of computation and the adequacy of the IT infrastructure, are more and more preconditions for an efficient governance of insurance companies.

The contributions in this study are manifold: first, a tractable formulation of the problem of Risk Margin assessment in the context of policies with profit is given. Second, a nested simulation environment is set up that is capable of handling the problem in its entirety, in contrast with previous works that only concentrated on partial aspects. Third, strategies to alleviate the computational burden of the nested Monte Carlo simulation are analysed and evaluated. Fourth, a complete architecture for the simulation environment is presented. The findings of this study can be interesting for other researchers working in all areas where nested simulation is applied.

Section 2 gives some technical background, describes the framework for assessing solvency in an insurance context and sets out a roadmap to the probability distribution forecast. The following section sketches the main ideas of nested Monte Carlo simulation, emphasizing the computational issues to be coped with. Section 4 describes the architecture of DISAR (Dynamic Investment Strategy with Accounting Rules), a simulation system designed for computing the Technical Provisions and the Solvency Capital Requirement—also in compliance with the Own Risk and Solvency Assessment, the ORSA—as requested by the Directive to the "internal model" (Castellani and Passalacqua 2011). In Sect. 5, our approach to the nested simulation is detailed. Section 6 illustrates the computational environment and some results of numerical simulations and performance evaluation. Section 7 concludes the paper.

2 General scheme for solvency assessment

2.1 Basic definitions: technical provisions, net asset value, solvency capital requirement

Solvency can be defined as the ability of an insurance undertaking to discharge its indebtedness. The estimation problem of the Technical Provisions (TP), the Net Asset Value (NAV) and the Solvency Capital Requirement (SCR), introduced by the Directive Solvency II, involves the market-consistent valuation of all components of the asset–liability insurance portfolio. The expected payments and their associated expenses are the liabilities of an insurance contract. The SCR determines the amount of capital that ensures that an undertaking will be able to meet its obligations over 1 year with a probability of at least 99.5 % (Directive 2009, art. 100), which limits the chance of falling into financial ruin to less than once in 200 cases.

In art. 77 of the Directive the TP is defined as the sum of the Best Estimate (BE) and the Risk Margin (RM)2. Art. 77.4 states that the Technical Provisions can be computed "as a whole" "where future cash flows [...] can be replicated reliably using financial instruments for which a reliable market value is observable; the value of technical provisions associated with those future cash flows shall be determined on the basis of the market value of those financial instruments. In this case, separate calculations of the best estimate and the risk margin shall not be required"3. It is relevant to note that the replication must be interpreted as referred to risk components (hedgeable or non hedgeable) and not to cash flows.

2 The BE "shall correspond to the probability-weighted average of future cash-flows, taking account of the time value of money (expected present value of future cash-flows), using the relevant risk-free interest rate term structure, [...] up-to-date and credible information and realistic assumptions and be performed using adequate, applicable and relevant actuarial and statistical methods"; the RM "shall be such as to ensure that the value of the technical provisions is equivalent to the amount that insurance [...] undertakings would be expected to require in order to take over and meet [...] obligations" (Directive 2009, art. 77).

3 Otherwise, the risk margin is calculated using a "Cost-of-Capital" approach (Salzmann and Wüthrich 2010).

Let V(t; X) be the market-consistent value of the assets X and V(t; Y) the market-consistent value of the liabilities Y at time t, where

V(t; X) = ∑_{k=1}^{n_x} V_k^x(t),    V(t; Y) = ∑_{k=1}^{n_y} V_k^y(t),    (1)

with n_x (n_y) the number of assets (liabilities) and with V_k^x(t) (V_k^y(t)) the value of the kth asset (liability) contract4.

4 Here and in the following it is supposed that the RM for the non hedgeable risk components is zero.

With our assumption, the TP is the market-consistent value of the liabilities, that is V(t; Y). The NAV is defined as the difference between the value of assets and liabilities:

NAV(t) = V(t; X) − V(t; Y).    (2)

The SCR estimation, consistent with the definition of discounted VaR as referred to in the Directive for "internal models", involves the evaluation of the expected value of NAV (denoted with E[NAV(T)]) and of a percentile, at a prefixed confidence level α (= 0.005), of the NAV distribution at a future time T > 0 (NAV_α(T), with T = 1 year). Then the SCR, evaluated in t = 0 for the future time T > 0, is given by

SCR(0, T) = [E[NAV(T)] − NAV_α(T)] v(0, T),    (3)

where v(0, T) is the appropriate risk-free discount factor prevailing in the financial market at time zero for the maturity T.

Fig. 1 Fundamental measures in Solvency II

Figure 1 gives a graphical illustration of the "fundamental" measures of Solvency II. In the representation, assets and liabilities are put beside each other to emphasize the need to adopt a unified set of valuation criteria able to measure all elements ("assets" vs "technical provisions" + SCR) in the logic of the "total balance sheet", implementing a consistent plan of "asset–liability management" (ALM)5.

5 The asset–liability management, in the Professional Actuarial Specialty Guide (Luckner et al. 2002), is defined as "the practice of managing a business so that decisions on assets and liabilities are coordinated; it can be defined as the ongoing process of formulating, implementing, monitoring and revising strategies related to assets and liabilities in an attempt to achieve financial objectives for a given set of risk tolerances and constraints".

2.2 Evaluation principles

The PDF is defined in art. 13 of the Directive Solvency II: "probability distribution forecast means a mathematical function that assigns to an exhaustive set of mutually exclusive future events a probability of realisation"; it is considered a fundamental component of the "internal model". Art. 121 says that the calculation of the probability distribution forecast has to "be consistent with the methods used to calculate technical provisions", and art. 122 says that "where practicable" the SCR should be evaluated "directly from the probability distribution forecast generated by the internal model" using the Value-at-Risk approach at a confidence level of 99.5 %, over a one-year period.
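Once an empirical NAV(T) distribution is available, eq. (3) reduces to a percentile and a mean of a sample. A minimal sketch of this final step is given below (in Python, with an illustrative placeholder distribution; the sample, the confidence level and the discount factor v(0, T) are the only inputs):

```python
import numpy as np

def scr(nav_samples, alpha=0.005, v0T=1.0):
    """SCR(0, T) as in eq. (3): [E[NAV(T)] - NAV_alpha(T)] * v(0, T)."""
    nav_samples = np.asarray(nav_samples)
    nav_alpha = np.quantile(nav_samples, alpha)   # empirical alpha-percentile
    return (nav_samples.mean() - nav_alpha) * v0T

# Illustrative use with a placeholder lognormal NAV(T) distribution
rng = np.random.default_rng(0)
nav_T = rng.lognormal(mean=13.0, sigma=0.05, size=100_000)
print(scr(nav_T, alpha=0.005, v0T=0.98))
```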
An evaluation principle must be introduced since, for estimating TP, NAV and SCR, the valuation of both assets and liabilities is required. The valuation is performed assuming that all the random variables concerning the valuation problem are defined on a probability space (Ω, F, P), where Ω denotes the space of events, F the σ-algebra of the measurable events and P is the physical probability measure (also known as the real-world measure).

The most relevant technical issues in the evaluation can be illustrated—without loss of generality—taking into account only the "risk drivers" of financial risk. The risk drivers (interest rate, inflation, stock price, exchange, credit, contract-specific risk sources) are modelled by a multivariate stochastic process, possibly a Markov one, Z(t).

The value in 0 ≤ t ≤ H, V(t), of a generic contract (asset or liability) with term H, under the necessary assumptions, is given by

V(t) = E_P[ ξ(H) V(H) | F_t ],    (4)

where ξ(t) is a suitable "state-price deflator". In this market, we assume the existence of a suitable equivalent martingale measure M (risk-adjusted, forward or others) under which

V(t) = N(t) E_M[ V(H)/N(H) | F_t ],    (5)

where N(t) is the corresponding numéraire such that the process Y(t) = V(t)/N(t) is an M-martingale; F_t is the filtration at time t.

The interested reader may refer to Glasserman (2004) for an introductory treatment of the mathematical background involved. If the time t = 0 is the current time, V(0) is known with certainty, while V(T), 0 < T ≤ H, is a random variable depending on the Z(t) trajectory in [0, T].

In many cases of practical interest, a closed-form solution (or at least accurate approximations) of the assets' value is available, while the liabilities' value, due to the complexity of the payoff, cannot be computed analytically; thus a viable approach is to rely on numerical simulations. The numerical approach for evaluating liabilities, shown below, can also be applied to assets when no closed-form solution on the asset side is accessible.

V(t) is the expected value of a multivariate distribution that, as is the case of insurance contracts, is defined on a very complex domain and with a cumulative distribution function not available in closed form. This leads to having to necessarily use Monte Carlo simulation for computing the integral in (5), possibly in combination with other techniques for the management of complex payoffs.

The SCR estimation requires the evaluation of the expected value E[NAV(T)] and the percentile NAV_α(T). In general, the NAV distribution is not available even in cases in which the joint distribution of the risk drivers is known and a closed-form valuation of NAV is available [e.g. a straddle on stock (Hull 2012)]. So also in this case it is necessary to use simulation methods to numerically calculate approximate values of the SCR, Monte Carlo simulation being the most suitable one to elicit an empirical probability distribution of NAV.

In summary, considering the typical composition of an insurance company portfolio, in the evaluation process of NAV the liability value V(t; Y) has to be evaluated using Monte Carlo simulation, either for t = 0 or for t = T, while in general the asset value V(t; X) can be calculated in closed form, otherwise using Monte Carlo simulation in the same way. It is relevant to note that, in accordance with the ORSA requirements, NAV(T), E[NAV(T)] and SCR(T, s) must be calculated with 0 < T < s < H.

2.3 Life insurance policies "with profit"

Participating life insurance contracts are characterized by an interest rate guarantee and some "profit sharing rules", which provide the possibility for the policyholder to participate in the earnings of the insurance company. These contracts then usually contain "embedded options", typically "cliquet" options (De Felice and Moriconi 2004). The discussion of the valuation of such policies provides the opportunity to elucidate most of the theoretical and computational issues in the application of Solvency II principles, as well as the computing methods and technologies needed to achieve timely and reliable risk estimates.

In the following, the general methodological approach of a Solvency II compliant simulation scheme is applied to profit sharing life insurance policies (PS policies) with minimum guarantees6. In these contracts—widely diffused in Italy7—the benefits which are credited to the policyholder are indexed to the annual return of a specified investment portfolio, called the segregated fund (in Italian gestione separata). A profit sharing policy is a "complex" structured contract, with underlying the segregated fund return. Since the policies are "non hedgeable" contracts, the market-consistent valuation requires a "mark to model" approach based on a suitable stochastic model calibrated on market data, where uncertainties are of actuarial and financial type.

6 The extension to unit-linked and index-linked policies is straightforward in more usual cases.

7 At the end of year 2011 the Italian Supervisory Authority listed 386 segregated funds, belonging to 70 insurance companies, with the overall amount of statutory reserves summing up to about 305 billion euros.


2.4 Contractual structure of profit sharing life insurance policies

The asset–liability framework for the evaluation of "Italian style" PS policies with minimum guarantees can be described considering a single-premium pure endowment insurance contract, written at time 0 for a life of age x and initial sum insured C_0. Following a typical interest crediting mechanism, the benefits are readjusted at the end of the year T according to the revaluation rule

C_T = C_0 Φ_T,    (6)

where the readjustment factor Φ_T is defined as

Φ_T = max { ∏_{k=1}^{T} (1 + ρ_k), (1 + γ_k)^T },    (7)

where γ_k is the minimum guaranteed annual rate at the year k (beyond the technical rate). The readjustment rate ρ_k is defined as

ρ_k = max { min{β_k I_k, I_k − η_k} − c_k − i_k, δ_k } / (1 + i_k),    (8)

where I_k is the annual rate of return of the segregated fund in the year [k − 1, k], δ_k is the minimum guaranteed annual cliquet rate, β_k ∈ (0, 1] is the participation coefficient, c_k is the management fee, η_k is the minimum annual rate retained by the insurance undertaking, and i_k is the technical rate. I_k is a random variable; it is computed using "accounting rules" and it depends on the management strategy of the fund. All the other quantities are contractually defined and in many cases are constant values.
cases are constant values. Regulation (EU) (OJ 2015, art. 15): [...] when calculating
If we denote by ε(x, T ) the event that outcomes to pay- technical provisions, insurance and reinsurance undertakings
ment (e.g. “the aged x insured is alive at time T ”), then the shall determine separately the value of future discretionary
benefit for the policyholder—the liability of the company— benefits.
in T is given by The liability side of the policy then contains complex
options which are hard to value.
YT = C0 T Iε(x,T ) , Further, the return of the segregated fund—which repre-
sents the “underlying” of the policy—is strongly influenced
where Iε(x,T ) is the indicator function of ε(x, T ), expressing by the “management actions” of the insurer (Castellani and
“technical” (actuarial) uncertainty. YT is then affected by Passalacqua 2011). The complexity of profit sharing rule and
financial and actuarial uncertainty. the management actions involved entails to use numerical
simulation for valuating the liability; the risk-neutral expec-
2.5 On the valuation of financial component tation V (t; YT ) in (9) is then computed by Monte Carlo
simulations on fine grained grid time 9 .
In a market-consistent valuation framework of the policy,
8
the value of liability at time t, V (t; YT ), can be expressed A call option gives right to buy, whereas a put option means the right
to sell, an asset—the underlying—at a predetermined price.
as the expected value of the payoff at time t = T , weighted 9 For an exhaustive analysis of the basic principles and methodological
by a suitable state-price deflator (see Sect. 2.2). Assuming
approach for a valuation system of profit sharing policies with mini-
independence between actuarial and financial uncertainty, mum guarantees we address to Castellani et al. (2004); De Felice and
the expectation can be factorised. Finally, the expectations Moriconi (2004, 2005).


3 Nested Monte Carlo simulation

The nested Monte Carlo simulation is a technique widely used for valuating risks with fixed time horizons. A well-known application of the nested simulation approach in the finance literature was originally proposed for pricing American options by Broadie and Glasserman (1997). An exposition of the application of the nested simulation approach to portfolio risk measurement is given in Glasserman (2004) and McNeil et al. (2006). Now the nested simulation is beginning to find application in the calculation of insurers' capital requirements (Bauer et al. 2012).

The nested Monte Carlo simulation is at present, for insurance undertakings, the most suitable approach to measure the Solvency Capital Requirement with the Value-at-Risk approach as required by the Directive Solvency II, since it allows to elicit an empirical probability distribution function of the contract values at a future time and then the corresponding moments and percentiles.

Considering the evaluation of an insurance liability V(T) with term H, in 0 ≤ T ≤ H (for example the liability value V(T; Y_H) of a policy with profit in Sect. 2.3), the nested Monte Carlo technique requires the following:

1. the simulation of n_P sample paths (Z(t)^(i)), i = 1, ..., n_P, from t = 0 to t = T under the real-world measure P, conditional to F_0 (in particular conditional to Z(0) if Z(t) is a Markov process);
2. for each of the n_P sample paths (Z(t)^(i)) from t = 0 to t = T, the simulation of n_M sample paths (Z(t)^(i,j)), j = 1, ..., n_M, from t = T to t = H under the equivalent martingale measure M (for example the risk-neutral probability Q), conditional to F_T (in particular conditional to Z(T) if Z(t) is a Markov process).
The number of simulations in each of the two levels, that is the values of n_P and n_M, must be sufficient to ensure that the selected risk measure can be calculated with a prefixed approximation error and in a pre-established computing time. It has been shown that the value of n_M—the number of "inner" simulations—influences the accuracy of the V(T) estimation, while the value of n_P—the number of "outer" simulations—affects the numerical precision in the valuation of the percentile of V(T). In more detail, it has been demonstrated that, in the evaluation of the percentile, the asymptotic variance of the estimator is determined by the number of real-world scenarios, while the asymptotic bias of the estimator is influenced by the number of risk-neutral scenarios used in each portfolio re-valuation (Broadie et al. 2011; Gordy and Juneja 2010).
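The two-level procedure described above can be sketched as follows; `outer_step` and `inner_valuation` are hypothetical stand-ins for the real-world projection and the market-consistent valuation models, and the toy dynamics serve only to show where n_P and n_M enter.

```python
import numpy as np

def nested_mc_nav(outer_step, inner_valuation, n_P, n_M, rng):
    """Two-level nested Monte Carlo: n_P real-world (P) scenarios up to T;
    for each, n_M risk-neutral (M) valuations from T to H, whose average
    estimates V(T) in that scenario."""
    nav = np.empty(n_P)
    for i in range(n_P):
        z_T = outer_step(rng)                          # Z(0) -> Z(T) under P
        inner = [inner_valuation(z_T, rng) for _ in range(n_M)]
        nav[i] = np.mean(inner)                        # V(T) in scenario i
    return nav

# Toy one-driver stand-ins for the projection and valuation models
rng = np.random.default_rng(2)
outer = lambda g: g.normal(0.03, 0.10)
inner = lambda z, g: 1_000.0 * np.exp(z + g.normal(-0.01, 0.05))
nav_T = nested_mc_nav(outer, inner, n_P=2_000, n_M=50, rng=rng)
print("NAV_0.5%(T) =", np.quantile(nav_T, 0.005))
```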
Nested Monte Carlo simulation sounds simple in principle, but its implementation in practice is a computational challenge; it results in a nested "stochastic" simulation with a large number of "inner" simulations for each "outer" scenario for the risk drivers valuation. The total number of simulations—"inner" and "outer"—may be very high in view of the typical values of the number of simulations used by insurers to valuate future liability cash-flows and of the number of "outer" simulations that may be needed to obtain reliable estimates. Further, the amount of computational effort depends not only on the total number of simulations but also on the computational cost required to valuate the liability cash-flows in each "outer" scenario.

The nested Monte Carlo simulation can also be applied in the calculation of the PDF, and the "parallel" implementation can drastically reduce the computational effort required, as the experimental results in Sect. 6 show.

Recently, different techniques have been proposed in the literature with the aim to reduce the computational cost of "full" nested simulation, the Least-squares Monte Carlo (Bauer et al. 2010a, b) and Replicating portfolio (Lesnevski et al. 2008) techniques among the others.

The Replicating portfolio approach requires to find a portfolio of relatively simple assets that in some sense replicates the behaviour of the liability cash flows, so that this portfolio can be used as a proxy to value the liabilities. However, in practice it is unlikely to find a replicating portfolio with these properties. Capturing complex liabilities in an insurance portfolio—containing embedded options and reflecting the management actions of the insurance business—with a portfolio of simple assets can give rise to significant errors in the evaluation of the capital requirement.

The Least-squares Monte Carlo technique (Longstaff and Schwartz 2001) allows to reduce the number of "inner" scenarios: the value of an insurance liability V(T) is approximated by a finite linear combination of "basis" functions, generally polynomials. Nested Monte Carlo simulation and least-squares regression are employed to determine the coefficients of the linear combination. The choice of which economic variables will act as state variables in the regression and of the basis functions in the linear combination is crucial to have good estimations of future liabilities.
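A minimal sketch of the regression step is given below, with a univariate outer state and a plain polynomial basis; the state variable, the basis and the noise model are illustrative choices, not those of any specific internal model.

```python
import numpy as np

def lsmc_proxy(z_T, v_hat, degree=3):
    """Least-squares Monte Carlo proxy: regress noisy inner estimates v_hat
    (obtained with few inner paths) on a polynomial basis of the outer
    state z_T, returning the fitted liability-value function."""
    coeffs = np.polynomial.polynomial.polyfit(z_T, v_hat, degree)
    return lambda z: np.polynomial.polynomial.polyval(z, coeffs)

# Illustrative use: z_T are outer-scenario states, v_hat crude inner averages
rng = np.random.default_rng(3)
z_T = rng.normal(0.03, 0.10, size=10_000)
true_value = 1_000.0 * np.exp(z_T)                          # hypothetical "exact" V(T)
v_hat = true_value + rng.normal(0.0, 50.0, size=z_T.size)   # few-inner-path noise
V_of_z = lsmc_proxy(z_T, v_hat, degree=3)
print("proxy 0.5% percentile:", np.quantile(V_of_z(z_T), 0.005))
```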


At present, the Least-squares Monte Carlo seems the most promising approach; nevertheless, the investigation field is still open and several solutions can be explored.

Soft computing techniques have a long account of applications to insurance solvency. In his survey, Shapiro reported attempts dating back to the early nineties at using Neural Networks (NN) to predict bankruptcy of insurance undertakings, with a major focus on early-warning systems leveraging the classification power of NNs when analysing balance sheets looking for factors that are predictive of bankruptcy (Shapiro 2002). The same author surveyed the literature on the application of fuzzy logic to insurance-related problems in Shapiro (2004). The different effects of stochastic uncertainty (risk) and imprecision uncertainty (vagueness) in the Value-at-Risk methodology were discussed in Zmeškal (2005), leading to the fuzzy-stochastic approach. The fuzzy representation of financial asset returns has been used by many authors including, e.g. Yoshida (2009). More recent approaches have been proposed for the quantification of claim reserving under ambiguity (de Andrés-Sánchez 2012).

4 The DISAR system

Disar (Dynamic Investment Strategy with Accounting Rules) is a simulation system for the monitoring of portfolios of profit sharing Italian life insurance policies with minimum guarantees, linked to "segregated funds". Disar works in an ALM framework and embodies the set of accounting rules and the ALM strategy of the assets of the segregated fund, which in turn determines the rate of return underlying the policy10. Disar is composed of a Database Management System and a set of computing engines. The simulation of the evolution of the market risk drivers is based on a stochastic model; Monte Carlo simulation is used11. As required by the Solvency II Directive, Disar performs computations in a market-consistent way at individual and aggregated level and meets the requirements needed for the approval of internal models.

10 The methodological asset–liability management (ALM) framework in which DISAR has been designed is detailed in Castellani et al. (2004).

11 A more detailed description of the Disar system is given in Castellani and Passalacqua (2011).

4.1 The computing process

The computing process in Disar consists of two phases:

A—Actuarial valuation, that is the calculation of the actuarially expected cash-flows generated by the policies;
B—ALM valuation, that is the evaluation of the market-consistent values of the policies.

The cash-flows produced by A are an input to B.

The following definitions are useful to describe the computing process of Disar:

• Elementary Elaboration Block (eeb): one of the two phases A or B corresponding to a given segregated fund, for a given set of parameters. An eeb is a computing "atomic unit".
• Collective Elaboration (ce): one of the two phases A or B corresponding to a set of given segregated funds. A ce is a set of eebs.

4.2 The Disar architecture

Fig. 2 Disar system architecture

The components of the system, shown in Fig. 2, are the following:

1. A Database Server, hosting a Relational DataBase Management System;
2. A Master Server, hosting the Disar Master Service (DiMaS), which receives the primary requests from the Clients, defines the elementary elaboration blocks, estimates the complexity of the elaborations, establishes the elaboration schedule, distributes the elementary requests to the processing units and monitors the process.
3. A set of Computing Units: each unit hosts the Disar Engine Service (DiEngS) that manages the Disar Actuarial Engine (DiActEng) and the Disar ALM Engine (DiAlmEng).
   – The DiActEng is in charge of phase A, that is, it performs an eeb of type A;
   – The DiAlmEng is in charge of phase B, that is, it performs an eeb of type B;

   – The DiEngS executes a single eeb with either one of the two engines; it may write the results directly to the Database or return them to the Master Server.
4. A set of Clients, each hosting the Disar Interface (DiInt), which allows to set computational parameters and to monitor the progress of the elaborations.

4.3 Sources of uncertainty

Disar considers different sources of uncertainty, both financial risks—including interest rate risk, equity risk, inflation risk, currency risk, credit risk and contract-specific risk sources—and actuarial risks, such as longevity/mortality risk, and thus involves a large number of random variables. Actuarial risks are assumed to be independent of each other, while financial risks are possibly correlated. A list—not exhaustive—of the models used for the evaluation of the main risk drivers includes, among the others, the Cox–Ingersoll–Ross (CIR) model for the interest rate risk (Cox et al. 1985), the Black and Scholes (BS) model (Black and Scholes 1973) for the equity benchmark and the CAPM for the equity prices, the lognormal model for the consumer prices and a deterministic model for the expected inflation, the Duffie and Singleton (1999) model for the credit spreads and the lognormal model for the exchange rates12.

12 See Table 1 in Castellani and Passalacqua (2011) for the list of the main risk drivers with the corresponding model used in Disar for the valuation.

5 Simulation process of Solvency II fundamental measures in Disar

There is no doubt that the elicitation of the PDF of an insurance firm and the Forward Looking Assessment of Own Risks, in compliance with the ORSA, raise severe theoretical problems on how to aggregate individual "random components" and technical problems in relation to the computational efficiency. The methodological approach that insurance companies must put in place requires the invoking of not only sophisticated valuation models—that is, high-level theory—but also efficient computation processes and high computational capabilities—that is, high-level technology.

Disar is able to provide (inter alia) the market value of the policies, the NAV of the ALM portfolio and the corresponding components (base, call, put and guaranteed); the Value of Business In Force (VBIF); the overall SCR and its components (interest rate, equity, mortality, etc.).

With the aim to improve the efficiency of the simulation processes, a reduction of the computational complexity of the valuations is implemented in Disar by decomposing the overall evaluation into the two phases A (actuarial) and B (ALM) (see Sect. 4.1); by performing scenario generation separately from the actuarial and ALM valuations—this decoupling allows the use of exogenous reference trajectories and, moreover, the simulation of the trajectories can be parallelised—; by performing on the liability side a decomposition of the contracts, followed by an aggregation of elementary contracts; and by sub-dividing the evaluations to be performed by processing and assembling eebs.

The above operations allow the implementation of Disar in a distributed and parallel computing environment.

Disar is a high complexity simulation system; it is "data intensive" and "cpu intensive". To meet the needs of the calculations and to preserve the data quality as well as the accuracy and efficiency of computation, as required by the Directive in the development of "internal models", a grid computing architecture of Disar has been designed to parallelize the computing processes on a grid of conventional computers with "reentrant" code. A strong reduction of the computing time has been obtained already with a small number of non-specialised nodes (Castellani and Passalacqua 2011).

Since the most time-consuming jobs are those processed by the ALM engine (DiAlmEng)—for evaluating (inter alia) TP, NAV and SCR—which involve Monte Carlo calculations, a further improvement of the system is achievable by parallelising the simulations13. The parallelization strategy is based on the distribution of the simulations among processors; processors work concurrently to compute the "local" averages that are afterwards combined to obtain the "global" final sample values.

13 The first experiences of parallelization of the algorithm implemented in DiAlmEng are reported in Corsaro et al. (2009), presented at the 18th International AFIR Colloquium (2008). A complementary approach to the strategy described in Castellani and Passalacqua (2011), based on the parallelization of the simulations on multicore architectures, has been developed in Corsaro et al. (2011) and De Angelis et al. (2013).

In the parallel simulation of TP—that is, of V(t; Y)—let P be the number of "parallel processes"; each process, concurrently with the others, simulates n_M/P trajectories under the probability measure M and computes "local" average values; it then concurs with the other processes to perform a suitable combination—generally an addition operation—of the partial results. (Communication among processes is limited to the final phase.)
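A minimal sketch of this "local averages plus final combination" pattern, assuming an MPI environment via mpi4py (the paper's implementation uses the MPI library directly and is not reproduced here):

```python
# Each of the P processes simulates n_M/P trajectories; only the final
# combination of the partial sums requires communication.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
P, rank = comm.Get_size(), comm.Get_rank()

n_M = 100_000                       # total number of inner trajectories
n_local = n_M // P                  # block assigned to this process
rng = np.random.default_rng(rank)   # placeholder seeding; the block-splitting
                                    # strategy is discussed below

# Placeholder discounted payoff per trajectory (stand-in for the ALM engine)
local_sum = float(np.sum(np.exp(rng.normal(-0.02, 0.1, size=n_local))))

# The "addition operation" combining partial results on the root process
total = comm.reduce(local_sum, op=MPI.SUM, root=0)
if rank == 0:
    print("V estimate:", total / (n_local * P))
```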
In Disar the evaluation of the empirical probability distribution of NAV(T) with T = 1 year (see (2)), which is needed to calculate NAV_α(T) and SCR(0, T) in (3), is performed by a "parallel" nested Monte Carlo simulation. The simulations of the trajectories under the real-world measure P are now assigned to concurrent processes; each process performs, independently from the others, n_P/P "outer" simulations and for each of them performs the n_M "inner" simulations. (Communication among the processes requires only the "collection" of the results.)


The coherence between the results of the sequential and parallel simulations is guaranteed by the application of suitable techniques of implementation of random number generators; more precisely, the generation of independent streams is realized using the block-splitting (also called skip-ahead) technique (Haromoto et al. 2008), which consists of splitting the original sequence into k non-overlapping blocks. Each of the k blocks provides an independent sequence. If M is the total number of random numbers generated in the sequential simulation with a given initial seed, in the parallel simulation, fixing the same seed, each of the P processes generates random numbers only from the block of length M/P that is assigned to it.
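The block-splitting technique can be illustrated with numpy's Mersenne Twister bit generator, whose jumped() method plays the role of the skip-ahead function; this is a stand-in for, not a reproduction of, the Disar implementation.

```python
import numpy as np

def split_streams(seed, P):
    """Block-splitting via jump-ahead: every process receives a disjoint
    block of one MT19937 sequence generated from a single seed. numpy's
    jumped(p) advances the state by p * 2**128 draws, so for any practical
    simulation size the blocks cannot overlap."""
    base = np.random.MT19937(seed)
    bit_generators = [base] + [base.jumped(p) for p in range(1, P)]
    return [np.random.Generator(bg) for bg in bit_generators]

# With the same seed, the blocks are carved out of a single underlying
# sequence, which keeps sequential and parallel results coherent.
streams = split_streams(seed=12345, P=8)
print([g.standard_normal() for g in streams[:3]])
```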
The parallel approach in nested Monte Carlo simulation can be used also to evaluate NAV(T), E[NAV(T)] and SCR(T, s) with 0 < T < s < H, in accordance with the ORSA.

Bearing in mind the put or call decomposition in (10) of the expected liability value of a policy with profit, the parallel nested Monte Carlo simulation implemented in DiAlmEng produces the empirical distributions of all components: the base and put components in the put decomposition, and the guaranteed and call components—that is, the future discretionary benefits—in the call decomposition.

6 Computational environment and numerical simulations

The performance of the Disar system was investigated on multicore computing systems—standard "low cost" high-performance architectures. The numerical experiments have been performed on a multicore system installed at the University of Naples "Parthenope": a Blade Server HP ProLiant BL460c G6 with eight blades equipped with two Quad-Core Intel Xeon E5540. The development software environment used ensures portability and efficiency of the system. For handling parallel processes the MPI library, the de facto standard library for message passing parallel programming, is used; in particular, the Intel version is used, which is compliant with the "MPICH ABI Compatibility Initiative" (Oyanagi 2014), ensuring portability on a wide range of hardware and software platforms.

The random number generators used are the MT2203 (Matsumoto and Nishimura 2000) and the MT19937 (Matsumoto and Nishimura 1998), based on the Mersenne Twister algorithm for large-scale Monte Carlo simulation in distributed computing environments—respectively, a 64-bit generator with period length 2^2203 − 1 and a 128-bit generator with period length 2^19937 − 1. These random generators provide mutual independence of the sequences and allow the coherence between the results of the sequential and parallel simulations by using the block-splitting method implemented by the skip-ahead function. An input parameter to Disar determines the generator to be used.

The analysis has been carried out on a single eeb of a segregated fund of an Italian insurance company with seven financial sources of uncertainty, including interest rate risk, equity risk, inflation risk and currency risk.

6.1 Numerical results

The results reported below have been obtained by fixing the computational budget n_P × n_M = 10^7, since an analysis carried out using different computational budgets has shown that it is a suitable value for testing from the accuracy and computational point of view. Further, a simulation performed with n_P = 10^5 and n_M = 10^4 is considered as a "full nested simulation" for a comparison of the results.

Figures 3 and 4 show, respectively, the empirical density function and the empirical cumulative distribution function, for values of the percentile at a confidence level between 0.1 and 0.9 %, of NAV(T), with T = 1 year, of the considered eeb at the evaluation date 31/12/2014, with different combinations of values of real-world and risk-neutral trajectories.

Fig. 3 Empirical density function of NAV(T) with T = 1 year for different values of n_P × n_M

As expected, Fig. 3 shows that the more the value of n_M decreases, the more the distribution is dispersed, having a negative effect on the estimation of a percentile in the tail. Analysing in more detail the behaviour of the distributions in the tail (Fig. 4), in comparison with that of the "full nested" distribution, it can be noted that the value of the percentiles decreases when n_M reduces, resulting in an underestimation of the worst case value. On the other hand, low values of n_P produce non-smooth distributions and overestimation of percentiles. In Table 1 are also shown the values of the
ratio between the estimated SCR(0, T) with T = 1 year, for each empirical distribution, and the Best Estimate; the result, in per cent, provides guidance on the riskiness of the policy. Further, the relative error in the estimation of SCR(0, T), assuming the "full nested" SCR as the exact value, is reported. Now, the estimated ratios increase as n_M decreases. A reduction of the bias, and then an improvement of the estimation of a percentile in the tail, requires many inner samples in each real-world scenario, but at the same time a not small number of "outer" scenarios. Referring to the performed experiments, the better approximation of the "full nested" SCR, for the considered segregated fund, is the combination 10^4 × 10^3. In general, as already mentioned, the selection of the values of the "inner" and "outer" scenarios depends on the required approximation error and on a pre-established acceptable computing time. The execution times, however, depend not only on the number of simulations (n_P and n_M) but also on the portfolio size; as the portfolio size increases, the computational effort to generate an inner step simulation sample is large compared to the effort for the generation of an outer sample (see Sect. 6.2). In Broadie et al. (2011) it is demonstrated that the variance asymptotically dominates the bias squared, and then that it is better to use fewer inner trajectories and to increase the number of real-world scenarios; in the experience of the authors, for example, insurance companies apply 25,000 natural scenarios and a number of risk-neutral simulations smaller than 100, in conjunction with the Least-squares Monte Carlo technique or equivalent methods.

Fig. 4 Empirical cumulative distribution function of NAV(T) with T = 1 year for different values of n_P × n_M

6.2 Performance evaluation

An analysis of the computational effort required to perform the previous simulations is now described. The performance of DiAlmEng is illustrated in Tables 2 and 3. Execution times and the values of speed-up (the ratio of the sequential execution time to the parallel one) and efficiency (the ratio of the speed-up to the number of cores) are reported for the values of "inner" and "outer" simulations considered in Sect. 6.1, varying the number of cores.

The execution times, for a fixed number of cores, are very similar. On one core, a slight increment of the execution times is observed for higher values of "inner" trajectories.

The gain in terms of execution times is evident; "parallel" nested Monte Carlo simulation consistently speeds up the simulation process. The speed-up increases almost linearly with the number of cores and the efficiency values are very high (quite close to the ideal value of 1) for all numbers of cores and for all combinations of n_P and n_M; lower efficiency values are obtained for the highest value of "inner" or "outer" simulations. On several cores the computation with the highest number of "outer" scenarios has a lower value of efficiency due to the increment of the number of results to be collected (and then to the increment of communication among the cores).

Table 1 SCR(0,1)/BE of the empirical distributions

n_P × n_M                          5000 × 2000        10,000 × 1000      25,000 × 400       100,000 × 100     100,000 × 10,000
SCR(0,1)/BE (SCR relative error)   0.551 % (0.03349)  0.587 % (0.02829)  0.597 % (0.04574)  0.663 % (0.1618)  0.571 %

Table 2 Execution time (hh:mm:ss) for the eeb at valuation date 31/12/2014

# cores   5000 × 2000   10,000 × 1000   25,000 × 400   100,000 × 100
1         5:43:10       5:37:20         5:36:21        5:33:10
4         1:27:04       1:24:47         1:24:30        1:25:30
8         0:43:51       0:43:02         0:42:14        0:42:50
16        0:23:56       0:23:13         0:23:05        0:23:08
32        0:13:04       0:12:36         0:12:02        0:12:14
64        0:06:07       0:06:06         0:06:10        0:06:13


Table 3 Speed-up (efficiency)

# cores   5000 × 2000     10,000 × 1000   25,000 × 400    100,000 × 100
4         3.94 (0.985)    3.98 (0.995)    3.98 (0.995)    3.90 (0.974)
8         7.83 (0.978)    7.84 (0.980)    7.96 (0.996)    7.78 (0.972)
16        14.34 (0.896)   14.53 (0.910)   14.57 (0.911)   14.40 (0.900)
32        26.26 (0.821)   26.77 (0.837)   27.95 (0.873)   27.23 (0.851)
64        56.10 (0.877)   55.30 (0.864)   54.54 (0.852)   53.59 (0.837)
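The Table 3 figures can be reproduced directly from the definitions above and the times in Table 2; for example, for the 5000 × 2000 column:

```python
# Speed-up and efficiency for the 5000 x 2000 column of Table 2
# (times converted to seconds; the 1-core time is 5:43:10 = 20590 s)
t_seq = 5 * 3600 + 43 * 60 + 10
parallel = {4: "1:27:04", 8: "0:43:51", 16: "0:23:56", 32: "0:13:04", 64: "0:06:07"}
for cores, hms in parallel.items():
    h, m, s = map(int, hms.split(":"))
    t_par = h * 3600 + m * 60 + s
    speedup = t_seq / t_par
    print(cores, round(speedup, 2), round(speedup / cores, 3))
# Output matches Table 3: 3.94 (0.985), 7.83 (0.978), 14.34 (0.896), ...
```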

Similar performance results are obtained for a Collective Elaboration, that is, for the ALM valuation of a set of eebs.

Table 4 Performance on Collective Elaboration at valuation date 31/12/2014 (n_P = 100,000, n_M = 100)

# cores   Execution time (dd:hh:mm)   Speed-up (efficiency)
1         100:15:45 (estimated)
64        2:11:19                     40.72 (0.64)

In Table 4 the performance results, on 1 and 64 cores, of a ce composed of six eebs for three segregated funds with both financial and actuarial risks are reported. The combination n_P = 100,000, n_M = 100 has been tested since, as shown, it produces the lowest value of efficiency. High values of the speed-up can be obtained by scheduling the assignment to the cores of the evaluation in sequence of the eebs in the ce; this is because the number of real-world simulations is generally very high and this ensures the full and optimal use of hardware resources. In any case, when it is necessary, a different "scheduling" procedure can be implemented using a "score" that Disar calculates and assigns to each eeb on the basis of an estimation of its computational complexity.
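The published description does not detail the scheduling policy; the sketch below shows one standard heuristic consistent with it—a greedy longest-processing-time assignment driven by the complexity score of each eeb. The eeb names and scores are hypothetical.

```python
import heapq

def schedule_eebs(scores, n_cores):
    """Greedy longest-processing-time assignment: visit the eebs in
    decreasing complexity score and always give the next one to the
    least-loaded core. A standard heuristic, not the actual Disar policy."""
    heap = [(0.0, core) for core in range(n_cores)]      # (load, core id)
    heapq.heapify(heap)
    assignment = {core: [] for core in range(n_cores)}
    for eeb, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        load, core = heapq.heappop(heap)
        assignment[core].append(eeb)
        heapq.heappush(heap, (load + score, core))
    return assignment

# Six eebs (two phases x three segregated funds) with hypothetical scores
print(schedule_eebs({"A1": 3.0, "B1": 9.5, "A2": 2.5,
                     "B2": 8.0, "A3": 4.0, "B3": 11.0}, n_cores=4))
```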
The values in Table 4 attest that the evaluation of the overall solvency needs of an insurer is a strongly "time-consuming" process.

Overall, the analysis makes evident that a Solvency II "compliant" simulation system that allows a computationally feasible and reasonably accurate assessment of an insurer's solvency position requires an adequate IT infrastructure and simulation procedures able to exploit the whole computing power.

7 Conclusions

Monte Carlo—and nested Monte Carlo—simulation is in any case the computational kernel of the simulation process in the "internal model" of insurance undertakings, and it is a "compute-intensive" process when applied to the evaluation of future capital requirements. The theoretical framework and the "high technological" infrastructure of a simulation system, like Disar, are then needful to meet the requisites of the Directive Solvency II.

In the end it is important to observe that the overall SCR, and the PDF, must take into account both the life and non-life business of insurance undertakings. The evaluation framework described for the life insurance contracts can be applied to non-life ones; Monte Carlo simulation is used. The simulation process of values and risks of the non-life business is, however, a less time-consuming process. Suitable techniques have to be used for the calculation of the aggregated SCR—that is, the overall solvency needs (Directive 2009, art. 45).

Compliance with ethical standards

Conflict of interest The authors declare that they have no conflict of interest.

References

Bauer D, Bergmann D, Kiesel R (2010a) On the risk-neutral valuation of life insurance contracts with numerical methods in view. Astin Bull 40(1):65–95
Bauer D, Bergmann D, Reuss A (2010b) Solvency II and nested simulations—a least-squares Monte Carlo approach. Working paper, Georgia State University and Ulm University
Bauer D, Reuss A, Singer D (2012) On the calculation of the Solvency II capital requirement based on nested simulations. Astin Bull 42(2):453–501
Black F, Scholes M (1973) The pricing of options and corporate liabilities. J Political Econ 81(3):637–654
Broadie M, Du Y, Moallemi CC (2011) Efficient risk estimation via nested sequential simulation. Manag Sci 57:1172–1194
Broadie M, Glasserman P (1997) Pricing American-style securities using simulation. J Econ Dyn Control 21:1323–1352
Castellani G, De Felice M, Moriconi F, Pacati C (2005) Embedded value in life insurance. Working paper
Castellani G, Passalacqua L (2011) Applications of distributed and parallel computing in the Solvency II framework: the DISAR system. In: Guarracino MR et al (eds) Euro-Par 2010 Parallel Processing Workshops. Lect Notes Comput Sci 6586. Springer-Verlag, Berlin, pp 413–421
Corsaro S, De Angelis PL, Marino Z, Perla F, Zanetti P (2009) Computational issues in internal models: the case of profit-sharing life insurance policies. G dell'Istituto Ital degli Attuari LXXII:237–256
Corsaro S, Marino Z, Perla F, Zanetti P (2011) Measuring default risk in a parallel ALM software for life insurance portfolios. In: Guarracino MR et al (eds) Euro-Par 2010 Parallel Processing Workshops. Lect Notes Comput Sci 6586. Springer-Verlag, Berlin, pp 471–478
Cox JC, Ingersoll JE, Ross SA (1985) A theory of the term structure of interest rates. Econometrica 53:385–407
de Andrés-Sánchez J (2012) Claim reserving with fuzzy regression and the two ways of ANOVA. Appl Soft Comput 12(8):2435–2441
De Angelis PL, Perla F, Zanetti P (2013) Hybrid MPI/OpenMP application on multicore architectures: the case of profit-sharing life insurance policies valuation. Appl Math Sci 7(102):5051–5070
De Felice M, Moriconi F (2004) Market consistent valuation in life insurance. Measuring fair value and embedded options. G dell'Istituto Ital degli Attuari LXVII:95–117
De Felice M, Moriconi F (2005) Market based tools for managing the life insurance company. Astin Bull 35(1):79
Directive 2009/138/EC of the European Parliament and of the Council of 25 November 2009 on the taking-up and pursuit of the business of Insurance and Reinsurance (Solvency II). Official Journal of the European Union, L335/1, 17.12.2009
Duffie D, Singleton K (1999) Modeling term structures of defaultable bonds. Rev Financ Stud 12(4):687–720
EIOPA (2012) Final report on public consultation No. 11/008. On the proposal for guidelines on ORSA, 9 July 2012
EIOPA (2013) Final report on public consultation No. 13/009. On the proposal for guidelines on forward looking assessment of own risks (based on the ORSA principles), 23 Sept 2013
Glasserman P (2004) Monte Carlo methods in financial engineering. Springer, New York
Gordy MB, Juneja S (2010) Nested simulation in portfolio risk management. Manag Sci 56(10):1833–1848
Haromoto H, Matsumoto M, Nishimura T, Panneton F, L'Ecuyer P (2008) Efficient jump ahead for F2-linear random number generators. INFORMS J Comput 20(3):385–390
Hull JC (2012) Options, futures, and other derivatives, 8th edn. Prentice Hall, USA
Lesnevski V, Nelson BL, Staum J (2008) An adaptive procedure for simulating coherent risk measures based on generalized scenarios. J Comput Finance 11:1–31
Longstaff FA, Schwartz ES (2001) Valuing American options by simulation: a simple least-squares approach. Rev Financ Stud 14:113–147
Luckner WR, Abbott MC, Backus JE, Benedetti S, Bergman D, Cox SH, Feldblum S, Gilbert CL, Liu XL, Lui VY, Mohrenweiser JA, Overgard WH, Pedersen HW, Rudolph MJ, Shiu ES, Smith PL (2002) Professional actuarial specialty guide—asset–liability management. Society of Actuaries
Matsumoto M, Nishimura T (1998) Mersenne Twister: a 623-dimensionally equidistributed uniform pseudorandom number generator. ACM Trans Model Comput Simul 8(1):3–30
Matsumoto M, Nishimura T (2000) Dynamic creation of pseudorandom number generators. In: Niederreiter H, Spanier J (eds) Monte Carlo and Quasi-Monte Carlo methods. Springer, Berlin, pp 56–69
McNeil A, Frey R, Embrechts P (2006) Quantitative risk management: concepts, techniques, and tools. Princeton University Press, Princeton, New Jersey
Official Journal of the European Union, Commission Delegated Regulation (EU) 2015/35 of 10 October 2014 supplementing Directive 2009/138/EC of the European Parliament and of the Council on the taking-up and pursuit of the business of Insurance and Reinsurance (Solvency II). Text with EEA relevance, 17.1.2015
Oyanagi S (2014) MPICH ABI compatibility status. CRAYDOC S-2544-70, Jun 2014
Salzmann R, Wüthrich MW (2010) Cost-of-capital margin for a general insurance liability runoff. Astin Bull 40(2):415–451
Shapiro AF (2002) The merging of neural networks, fuzzy logic, and genetic algorithms. Insur Math Econ 31(1):115–131
Shapiro AF (2004) Fuzzy logic in insurance. Insur Math Econ 35(2):399–424
Yoshida Y (2009) An estimation model of value-at-risk portfolio under uncertainty. Fuzzy Sets Syst 160(22):3250–3262
Zmeškal Z (2005) Value at risk methodology under soft conditions approach (fuzzy-stochastic approach). Eur J Oper Res 161(2):337–347
