CQF

Different types of financial analysis

1. Applied Stochastic
2. Notes on Brownian motion and related phenomena: the Wiener process, a mathematical model of randomness
3. Random Walk 2
4. Random Walk: equities, currencies, commodities and indices

PDEs and Transition Density Functions
Taylor series
Moment Generating Function
Stochastic Integration
Extensions of Itô's Lemma
Binomial Model
A simple model for an asset price random walk
Delta hedging
No arbitrage
Risk neutrality
Discrete Martingales
Binomial Model extended
Continuous Martingales
What is a continuous time martingale?
The martingale zoology in a hurry: martingale, local martingale, supermartingale, submartingale, semimartingale

On average, how far is the walker from the starting point?

The displacement x in the direction of the X-axis that a particle experiences on average, or, more exactly, the square root of the arithmetic mean of the squared displacements in the direction of the X-axis, is

x = sqrt(2Dt),

where t is time and D is the diffusion coefficient,

D = (1/tau) * Integral from -infinity to +infinity of (Delta^2 / 2) * (probability density of Delta) d(Delta),

integrated across space.

What is the probability that at a particular time the walker is at the origin?

The probability that a (one-dimensional) simple random walker returns to the origin infinitely often is one.

More generally, what is the probability distribution for the position of the walker?

Does the random walker keep returning to the origin, or does the walker eventually leave forever?

A return to the origin, often referred to as an equalization, occurs when Sn equals 0 for some n greater than 0. If an infinite number of equalizations occur, the walk is called recurrent. If only a finite number of equalizations occur, the walk is called transient.
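The square-root growth of the average displacement can be checked with a short simulation. This is an illustrative sketch (the function name and sample sizes are my own choices); with unit steps the RMS displacement should be close to sqrt(n):

```python
import random

def rms_displacement(n_steps, n_walks, seed=42):
    """Root-mean-square displacement of a simple 1-D random walk."""
    rng = random.Random(seed)
    total_sq = 0
    for _ in range(n_walks):
        pos = 0
        for _ in range(n_steps):
            pos += rng.choice((-1, 1))   # unit step left or right
        total_sq += pos * pos
    return (total_sq / n_walks) ** 0.5

# For unit steps, RMS displacement grows like sqrt(n_steps):
# rms_displacement(100, 2000) is close to sqrt(100) = 10.
```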

Module 2: Quantitative Risk & Return

Portfolio Management
Measuring risk and return
Benefits of diversification

Modern Portfolio Theory and the Capital Asset Pricing Model
The efficient frontier
How to analyse portfolio performance
Alphas and Betas

Fundamentals of Optimization and Application to Portfolio Selection
Fundamentals of portfolio optimization
Formulation of optimization problems
Solving unconstrained problems using calculus
Kuhn-Tucker conditions
Derivation of CAPM
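The benefit of diversification can be made concrete with the two-asset Markowitz algebra: portfolio variance contains a correlation cross-term, and the minimum-variance weight has a closed form. A minimal sketch (function names and the example numbers are illustrative, not from the syllabus):

```python
def portfolio_stats(w, mu, sigma, rho):
    """Return and volatility of a two-asset portfolio with weight w in asset 1."""
    r = w * mu[0] + (1 - w) * mu[1]
    var = (w * sigma[0]) ** 2 + ((1 - w) * sigma[1]) ** 2 \
          + 2 * w * (1 - w) * rho * sigma[0] * sigma[1]
    return r, var ** 0.5

def min_variance_weight(sigma, rho):
    """Closed-form minimum-variance weight for two assets."""
    cov = rho * sigma[0] * sigma[1]
    return (sigma[1] ** 2 - cov) / (sigma[0] ** 2 + sigma[1] ** 2 - 2 * cov)

mu, sigma, rho = (0.08, 0.12), (0.15, 0.25), 0.3
w_mv = min_variance_weight(sigma, rho)
```

For these numbers the minimum-variance portfolio has lower volatility than either asset alone, which is the diversification effect in one line of algebra.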

Basel III
Definition of capital
Evolution of Basel
Key provisions

Expected Shortfall
Measuring Risk
Expected Shortfall and Liquidity Horizons
Correlation Everywhere
Frontiers: Extreme Value Theory
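On historical data, expected shortfall reduces to averaging the losses beyond the VaR quantile. A minimal sketch, assuming equally weighted historical scenarios and a simple sorted-tail estimator (the function name is mine):

```python
def expected_shortfall(returns, alpha=0.95):
    """Historical ES: average loss in the worst (1 - alpha) fraction of scenarios."""
    losses = sorted((-r for r in returns), reverse=True)   # largest losses first
    n_tail = max(1, int(len(returns) * (1 - alpha)))
    return sum(losses[:n_tail]) / n_tail
```

With 100 scenarios and alpha = 0.95, this averages the 5 worst losses; ES is always at least as large as the corresponding VaR.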

Liquidity Asset Liability Management
Gap analysis of assets, liabilities and contingencies, both static and dynamic
The role of derivative and non-derivative instruments in liquidity
Liquidity Coverage Ratio (LCR)
Net Stable Funding Ratio (NSFR)

Collateral and Margins
Expected Exposure (EE) profiles for various types of instruments
Types of collateral
Calculation of initial and variation margins
Minimum transfer amount (MTA)
ISDA / CSA documentation

Module 3: Equities & Currencies

Black-Scholes Model
The assumptions that go into the Black-Scholes equation
Foundations of options theory: delta hedging and no arbitrage
The Black-Scholes partial differential equation
Modifying the equation for commodity and currency options
The Black-Scholes formulae for calls, puts and simple digitals
The meaning and importance of the Greeks: delta, gamma, theta, vega and rho
American options and early exercise
Relationship between option values and expectations
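The Black-Scholes formulae for calls and puts can be written down in a few lines using only the standard library (math.erf gives the normal CDF). A sketch with conventional parameter names:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes(S, K, T, r, sigma, kind="call"):
    """Black-Scholes price of a European call or put (no dividends)."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    if kind == "call":
        return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)
    return K * exp(-r * T) * norm_cdf(-d2) - S * norm_cdf(-d1)
```

For S = K = 100, T = 1, r = 5%, sigma = 20%, the call is worth about 10.45, and call minus put satisfies put-call parity.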

Martingale Theory – Applications to Option Pricing
The Greeks in detail
Higher-order Greeks: which, when and why
Computing the price of a derivative as an expectation
Girsanov's theorem and change of measures
The fundamental asset pricing formula
The Black-Scholes formula
Extensions to Black-Scholes: dividends and time-dependent parameters
Black's formula for options on futures

Understanding Volatility
The many types of volatility
What the market prices of options tell us about volatility
The term structure of volatility
Volatility arbitrage: should you hedge using implied or actual volatility?

Introduction to Numerical Methods
The justification for pricing by Monte Carlo simulation
Grids and discretization of derivatives
Implicit finite-difference methods, including Crank-Nicolson schemes
Douglas schemes
Richardson extrapolation
American-style exercise
Explicit finite-difference method for two-factor models
ADI and Hopscotch methods
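Pricing by Monte Carlo simulation, as listed above, amounts to discounting the average payoff over simulated risk-neutral paths. A sketch for a European call under geometric Brownian motion (sample size and seed are arbitrary choices; for this payoff a single terminal draw per path suffices):

```python
import random
from math import exp, sqrt

def mc_call(S0, K, T, r, sigma, n_paths=200_000, seed=1):
    """Monte Carlo price of a European call under risk-neutral GBM."""
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        ST = S0 * exp((r - 0.5 * sigma ** 2) * T + sigma * sqrt(T) * z)
        payoff_sum += max(ST - K, 0.0)
    return exp(-r * T) * payoff_sum / n_paths
```

For S0 = K = 100, T = 1, r = 5%, sigma = 20%, the estimate should land near the Black-Scholes value of roughly 10.45, with a standard error shrinking like 1/sqrt(n_paths).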

Exotic Options
Characterisation of exotic options
Time dependence (Bermudan options)
Path dependence and embedded decisions
Asian options
Delta hedging
Bates jump-diffusion
Advanced Greeks
The names and contract details for basic types of exotic options
How to classify exotic options according to important features
How to compare and contrast different contracts
Pricing exotics using Monte Carlo simulation
Pricing exotics via partial differential equations and then finite-difference methods
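An arithmetic Asian call is a natural first exotic to price by simulation, since the path average has no simple closed form. A self-contained sketch under risk-neutral GBM; the number of averaging dates, path count and seed are illustrative choices:

```python
import random
from math import exp, sqrt

def mc_asian_call(S0, K, T, r, sigma, n_steps=50, n_paths=10_000, seed=7):
    """Monte Carlo price of an arithmetic-average Asian call under GBM."""
    rng = random.Random(seed)
    dt = T / n_steps
    drift = (r - 0.5 * sigma ** 2) * dt
    vol = sigma * sqrt(dt)
    total = 0.0
    for _ in range(n_paths):
        S, running = S0, 0.0
        for _ in range(n_steps):
            S *= exp(drift + vol * rng.gauss(0.0, 1.0))
            running += S
        total += max(running / n_steps - K, 0.0)   # payoff on the path average
    return exp(-r * T) * total / n_paths
```

Averaging reduces effective volatility, so the Asian call should price below the otherwise-identical European call (about 10.45 for these inputs).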

Advanced Volatility Modelling in Complete Markets
The relationship between implied volatility and actual volatility in a deterministic world
The difference between 'random' and 'uncertain'
How to price contracts when volatility, interest rate and dividend are uncertain
Non-linear pricing equations
Optimal static hedging with traded options
How non-linear equations make a mockery of calibration

Market-Based Valuation of Equity Index Options
Stylized facts of equity and options markets
Numerically efficient valuation of equity index options
Calibration of option pricing models to market data
Simulation of option pricing models for European & American options
Canonical example: Euro Stoxx 50 index and options
Python implementation

Module 4: Fixed Income

Fixed Income Products and Analysis
Names and properties of the basic and most important fixed-income products
Features commonly found in fixed-income products
Simple ways to analyze the market value of the instruments: yield, duration and convexity
How to construct yield curves and forward rates

Swaps
The relationship between swaps and zero-coupon bonds
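The simple analytics listed above (price, duration) follow directly from the cash-flow definitions. A sketch for a fixed-coupon bond with the Macaulay duration, assuming annual compounding by default (function names are mine):

```python
def bond_price(face, coupon_rate, years, y, freq=1):
    """Price of a fixed-coupon bond at yield y, compounded freq times a year."""
    c = face * coupon_rate / freq
    n = years * freq
    per = y / freq
    pv_coupons = sum(c / (1 + per) ** t for t in range(1, n + 1))
    return pv_coupons + face / (1 + per) ** n

def macaulay_duration(face, coupon_rate, years, y, freq=1):
    """Macaulay duration: PV-weighted average time (in years) of the cash flows."""
    c = face * coupon_rate / freq
    n = years * freq
    per = y / freq
    price = bond_price(face, coupon_rate, years, y, freq)
    wsum = sum((t / freq) * c / (1 + per) ** t for t in range(1, n + 1))
    wsum += years * face / (1 + per) ** n
    return wsum / price
```

Two standard sanity checks: a par bond (coupon equal to yield) prices at face, and a zero-coupon bond's duration equals its maturity.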

Stochastic Interest Rate Modeling
Stochastic models for interest rates for many fixed-income products
One-factor interest rate models
Multi-factor interest rate modeling

Analysis
How to choose time-dependent parameters in one-factor models so that today's yield curve is an output of the model
The advantages and disadvantages of yield curve fitting
How to analyze short-term interest rates to determine the best model for the volatility and the real drift
How to analyze the slope of the yield curve to get information about the market price of risk

The pricing of interest rate products in a probabilistic setting
The equivalent martingale measures
The fundamental asset pricing formula for bonds
Application to popular interest rate models
The dynamics of bond prices
The fundamental asset pricing formula for derivatives on bonds
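As a concrete instance of the bond pricing formula in a one-factor model, the Vasicek zero-coupon bond price has the closed affine form P(0, tau) = A(tau) * exp(-B(tau) * r0). A sketch of that formula (parameter names are conventional; the sigma = 0 check below collapses to deterministic discounting):

```python
from math import exp

def vasicek_zcb(r0, kappa, theta, sigma, tau):
    """Vasicek ZCB price for dr = kappa*(theta - r) dt + sigma dW."""
    B = (1.0 - exp(-kappa * tau)) / kappa
    A = exp((theta - sigma ** 2 / (2 * kappa ** 2)) * (B - tau)
            - sigma ** 2 * B ** 2 / (4 * kappa))
    return A * exp(-B * r0)
```

With sigma = 0 and r0 = theta the price is exactly exp(-theta * tau); with positive rates, longer bonds are cheaper.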

Heath, Jarrow and Morton Model
The Heath, Jarrow & Morton (HJM) forward rate model
The relationship between HJM and spot rate models
The advantages and disadvantages of the HJM approach
How to decompose the random movements of the forward rate curve into its principal components

Fixed Income Market Practices
Basics: discount factors, FRAs, swaps, and other delta products
Basic curve stripping, bucket deltas, and managing IR risks
Interpolation methods
Risk bleeding
Scenario-based risks and hedging (wave method)
Current market practices
Advanced stripping
SABR Model
Vanilla options: European swaptions, caps, and floors
Arbitrage-free SABR

The Libor Market Model
Standard Libor market model dynamics
Numéraire and measure
The drift
Factor reduction

The connection to statistics
The basic Monte Carlo algorithm, standard error and uniform variates
Non-uniform variates, efficiency ratio and yield
Co-dependence in multiple dimensions
Wiener path construction; Poisson path construction
Numerical integration for solving SDEs
Variance reduction techniques
Sensitivity calculations
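Antithetic variates are the simplest of the variance reduction techniques listed: each normal draw z is paired with -z, so much of the sampling noise cancels while the estimator stays unbiased. A sketch on a European call (sizes and seed are arbitrary choices):

```python
import random
from math import exp, sqrt

def mc_call_antithetic(S0, K, T, r, sigma, n_pairs=50_000, seed=3):
    """European call by Monte Carlo with antithetic variates."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_pairs):
        z = rng.gauss(0.0, 1.0)
        for zz in (z, -z):                      # antithetic pair
            ST = S0 * exp((r - 0.5 * sigma ** 2) * T + sigma * sqrt(T) * zz)
            total += max(ST - K, 0.0)
    return exp(-r * T) * total / (2 * n_pairs)
```

For a monotone payoff such as this one, the paired estimator has a noticeably smaller standard error than two independent draws of the same total size.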

Energy Derivatives
Participants' risk exposure in electricity markets
Price volatility
Fluctuation of electricity production costs
Hedging with electricity futures for generators, marketers and end-users
Risks of hedging with electricity futures
"Stack and roll" hedging for long-term periods
Hedging with electricity options and crack spreads
Speculation (view taking) using electricity futures
Politics of speculation: the Dodd-Frank Act
High-frequency commodity trading
Spreads

Module 5: Credit Products and Risk

Introduction to Credit Derivatives & Structural Models
Introduction to credit risk

Modelling credit risk

1. Structural models model credit risk by assuming a stochastic process for the value of the firm and for the term structure of interest rates. Clearly the problem is to determine the value and volatility of the firm's assets, and to model the stochastic process driving the value of the firm adequately.
(Reference: Credit Risk: Rating Based Modeling of Credit Risk, Theory and Application of Migration Matrices)

Basic structural models: Merton model, Black and Cox model
Advanced structural models

Intensity Models
Modelling default by Poisson process
Relationship between intensity and arrival time of default
Risky bond pricing: constant vs. stochastic hazard rate
Bond pricing with recovery
Affine intensity models and use of Feynman-Kac
Two-factor affine intensity model example: Vasicek

Credit Default Swaps
An introduction to CDS
Default modelling toolkit: inhomogeneous Poisson process
CDS pricing: basic and advanced models
Bootstrapping intensity from CDS market quotes
Accruals and upfront premium in CDS pricing
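A constant-hazard-rate toolkit is enough to connect intensity, survival probability and risky bond prices, including the "credit triangle" approximation lambda ≈ s / (1 - R) used to back intensity out of a CDS spread. A sketch with illustrative conventions (recovery R paid at maturity; continuous compounding):

```python
from math import exp

def survival_prob(lam, t):
    """Survival probability under a constant hazard rate lam."""
    return exp(-lam * t)

def risky_zcb(r, lam, R, t):
    """Risky zero-coupon bond: survive and get 1, or default and recover R at t."""
    q = survival_prob(lam, t)
    return exp(-r * t) * (q + R * (1.0 - q))

def implied_hazard(spread, R):
    """Credit-triangle approximation: lambda ~ spread / (1 - recovery)."""
    return spread / (1.0 - R)
```

A 120 bp spread with 40% recovery implies a hazard rate of about 2% per year; with lam = 0 the risky bond collapses to the riskless discount factor.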

X-Valuation Adjustment (CVA, DVA, FVA, MVA): Theory
Historical development of OTC derivatives and xVA
Credit and debt value adjustments (CVA and DVA)
Funding value adjustment (FVA)
Margin and capital value adjustments (MVA and KVA)
Current market practice and application

X-Valuation Adjustment (CVA, DVA, FVA, MVA): Implementation
Implementation of counterparty credit valuation adjustment (CVA) methodologies currently used to quantify CVA in terms of exposure, Monte Carlo simulation and the Libor Market Model
Illustration of this methodology, as well as DVA, FVA and others

CDO & Correlation Sense
CDO market pricing and risk management
Loss function and CDO pricing equation
Motivation from loss distribution
Classification of copula functions
Simulating via Gaussian copula
Intuition and timescale
Rank correlation
Uncertain correlation model for mezzanine tranche
Compound (implied) correlation in loss distribution
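Simulating via the one-factor Gaussian copula works as follows: each name's latent variable mixes a common factor with an idiosyncratic one, and a name defaults when its latent variable falls below Phi^{-1}(PD). A self-contained sketch (the inverse normal CDF is obtained here by bisection to stay standard-library only; all parameter values are illustrative):

```python
import random
from math import sqrt, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def simulate_portfolio_defaults(n_names, pd_1y, rho, n_sims, seed=11):
    """One-factor Gaussian copula: default count per simulated scenario."""
    rng = random.Random(seed)
    # Default threshold c = Phi^{-1}(pd_1y), found by bisection.
    lo, hi = -8.0, 8.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid) < pd_1y:
            lo = mid
        else:
            hi = mid
    c = 0.5 * (lo + hi)
    a = sqrt(rho)
    counts = []
    for _ in range(n_sims):
        m = rng.gauss(0.0, 1.0)                          # common factor
        n_def = 0
        for _ in range(n_names):
            x = a * m + sqrt(1.0 - rho) * rng.gauss(0.0, 1.0)
            n_def += x < c                               # latent variable below threshold
        counts.append(n_def)
    return counts

counts = simulate_portfolio_defaults(100, 0.02, 0.3, 2000)
mean_defaults = sum(counts) / len(counts)
```

The mean default count matches n * PD, but correlation fattens the tail of the loss distribution: many scenarios with zero defaults, a few with large clusters, which is exactly what drives tranche pricing.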

Statistical Methods in Estimating Default Probability
Sources of default probability information
Capital structure arbitrage
Generalized Linear Models (GLM): theory, estimation and inference
Link functions
Estimation of default probability for an enterprise with a logit model and the ordered probit model

Other Topics

Reduced Form Models

1. Reduced form models are more general than structural models; they assume that an exogenous random variable drives default and that the probability of default (PD) over any time interval is non-zero. An important input to determine the default probability and the price of a bond is the rating of the company. Thus, to determine the risk of a credit portfolio of rated issuers, one generally has to consider historical average defaults and transition probabilities for current rating classes. Quite often in reduced form approaches the migration from one rating state to another is modeled using a Markov chain model, with a migration matrix governing the changes from one rating state to another. Besides the fact that they allow for realistic short-term credit spreads, reduced form models also give great flexibility in specifying the source of default.

2. The rating process

Analysts expect financial information about the company consisting of five years of audited annual financial statements, the last several interim financial statements, and narrative descriptions of operations and products. The meeting with corporate management can be considered an important part of an agency's rating process. The purpose is to review in detail the company's key operating and financing plans, management policies, and other credit factors that have an impact on the rating.

A: Although credit rating and credit score may be used interchangeably in some cases, there is a distinction between these two phrases. A credit rating, often expressed as a letter grade, conveys the creditworthiness of a business or government. A credit score is also an expression of creditworthiness, but it is expressed in numerical form and only used for individuals. Both ratings and scores are designed to show creditors a borrower's likelihood of repaying a debt.

Rating factors: Business Risk || Financial Risk
Industry Characteristics || Financial Characteristics
Competitive Position || Financial Policy
Marketing || Profitability
Technology || Capital Structure
Efficiency || Cash Flow Protection
Regulation || Financial Flexibility
Management ||
Source: S&P's Corporate Ratings Criteria (2000)

In the world of emerging markets, rating agencies usually also incorporate country and sovereign risk into their rating analysis: both business risk factors, such as macroeconomic volatility, exchange-rate risk, government regulation, taxes, legal issues, etc., and financial risk factors, such as accounting standards, potential price controls, inflation, and access.

3. Borrowers who share a similar risk profile are assigned to the same rating grade. Afterwards a probability of default (PD) is assigned. Very often the same PD is assigned to all borrowers of the same rating grade. For such a rating methodology the PDs do not discriminate between better and lower creditworthiness inside one rating grade. Consequently, the probability of migrating to a certain other rating grade is the same for all borrowers having the same rating grade.

Books:
1. Credit Risk: Rating Based Modeling of Credit Risk, Theory and Application of Migration Matrices
2. The Credit Scoring Toolkit (R. Anderson)
3. Credit Risk Scorecards: Developing and Implementing Intelligent Credit Scoring
4. Consumer Credit Models: Pricing, Profit and Portfolios (2009)
5. Model Risk
6. rating_models_tcm16-22933

1. A point-in-time (PIT) estimate refers to immediate probabilities, typically one-year, that will fluctuate up and down over the course of an economic cycle. In contrast, a through-the-cycle (TTC) estimate is one that approximates a stressed bottom-of-the-cycle scenario, with a horizon of five years or more.

2. According to Aguais (2005), no risk estimate will ever be purely PIT or TTC, but will always be a combination of the two. For example, default estimates based upon account performance or the value of traded securities tend towards PIT, while rating agency grades tend towards TTC. All of them will vary over an economic cycle, some more than others. Companies' own internal grades are made up of a combination of 'subjective assessments, statistical models, market information, and agency ratings' that have a mixture of different time horizons. While it may be ideal to provide separate PIT and TTC estimates for each obligor, this is beyond the capabilities of today's banks, and has not been required by Basel II.

3. Stages of Scorecard Development

Feasibility study
Data—Will there be sufficient data available to develop a model?
Resources—Will there be money and people available?
Technology—Is the technology available to support us?

Player identification
Project manager—Reports to the champion, and must advise of any resource requirements or shortfalls.
Scorecard developer—Develops the actual scorecard.
Internal analysts—Assist in assembling and understanding data.
Functional experts—Assist in understanding the business and affected areas, and will be key when deciding upon the strategies to be employed.
Technical resources—Responsible for final implementation of the scorecard.

Data preparation
Project scope—Which cases are to be included?
Good/bad definition—What is to be predicted?
Sample windows—What will be the observation and outcome periods?
Exclusions

Objectives of scorecard development:
• Reduction in bad debt/bankruptcy/claims/fraud
• Increase in approval rates or market share in areas such as secured loans, where low delinquency presents expansion opportunities
• Increased profitability
• Increased operational efficiency (e.g., to better manage workflow in an adjudication environment)
• Cost savings or faster turnaround through automation of adjudication using scorecards
• Better predictive power (compared to existing custom or bureau scorecards)

Data: The following data items are usually collected for applications from the previous two to five years, or from a large enough sample:
• Account/identification number
• Date opened or applied
• Arrears/claims history over the life of the account
• Accept/reject indicator
• Product/channel and other segment identifiers
• Current account status (e.g., inactive, closed, lost, stolen, fraud, etc.)

As a rule of thumb, for application scorecard development there should be approximately 2,000 "bad" accounts and 2,000 "good" accounts that can be randomly selected for each proposed scorecard, from a group of approved accounts opened within a defined time frame. For behavior scorecards, these would be from a group of accounts that were current at a given point in time, or at a certain delinquency status for collections scoring. A further 2,000 declined applications may also be required for application scorecards where reject inference is to be performed.

Segmentation: In some cases, using several scorecards for a portfolio provides better risk differentiation than using one scorecard on everyone. This is usually the case where a population is made up of distinct subpopulations, and where one scorecard will not work efficiently for all of them (i.e., we assume that different characteristics are required to predict risk for the different subpopulations in our portfolio).
1. Generating segmentation ideas based on experience and industry knowledge, and then validating these ideas using analytics
2. Generating unique segments using statistical techniques such as clustering or decision trees

Typical segmentation areas used in the industry include those based on:
• Demographics. Regional (province/state, internal definition, urban/rural, postal-code based, neighborhood), age, lifestyle code, time at bureau, tenure at bank
• Product type. Gold/platinum cards, length of mortgage, insurance type, secured/unsecured, new versus used leases for auto, size of loan
• Sources of business (channel). Store-front, take one, branch, internet, dealers, brokers

Statistically-based segmentation: Clustering

Clustering is a widely used technique to identify groups that are similar to each other with respect to the input variables. Clustering, which can be used to segment databases, places objects into groups, or "clusters," suggested

The first step is to measure the improvement in predictive power through segmentation. This can be done using a number of statistics such as the Kolmogorov-Smirnov (KS) statistic, the c-statistic, and so on. Exhibit 4.13 shows an example of this analysis using the c-statistic (details on the c-statistic are covered later, in Chapter 6).

We assume that a proper or sufficient score s(x) captures as much information for predicting the probability of a performance outcome, say good/bad, as does the original data vector, x.

Now we consider three approaches to modelling the credit risk of portfolios of consumer loans, all of which are based on the behavioural scores of the individual borrowers who make up the portfolio. The three models have analogies with the three main approaches to corporate credit-risk modelling: a structural approach, a reduced-form default-mode approach and a ratings-based reduced-form approach.

Behavioural scores estimate the risk that the borrower will default in the next 12 months. They are obtained by taking a sample of previous borrowers and relating their characteristics, including their repayment, arrears and usage during a performance period, with their default status 12 months after the end of that performance period. Some of these characteristics indicate whether or not the borrower can afford to repay the loan, but the most important characteristics are usually those from the credit bureau and information on the arrears status of the borrower.

A borrower is usually assumed to have defaulted if their payments on the loan are more than 90 days overdue. If we define those who have defaulted as "bad" (B) and those who have not defaulted as "good" (G), then the behavioural score is essentially a sufficient statistic of the probability of the borrower being good. Thus, if x are the characteristics of the borrower, a score s(x) has the property that P(G | x) = P(G | s(x)).

The use of nonlinear model functions, as well as the maximum likelihood method to optimize those functions, means that regression models also make it possible to calculate membership probabilities and thus to determine default probabilities directly from the model function. The curves of the model functions and their mathematical representation are shown in chart 17. In this chart, the function denotes the cumulative standard normal distribution, and the term P stands for a linear combination of the factors input into the rating model; this combination can also contain a constant term. By rescaling the linear term, both model functions can be adjusted to yield almost identical results. The results of the two model types are therefore not substantially different.

Due to their relative ease of mathematical representation, logit models are used more frequently for rating modeling in practice. The general manner in which regression models work is therefore only discussed here using the logistic regression model (logit model) as an example:

P = 1 / (1 + exp(-(b0 + b1*K1 + b2*K2 + ... + bn*Kn)))

In this formula, n refers to the number of financial indicators included in the scoring function, Ki refers to the specific value of the creditworthiness criterion, and bi stands for each indicator's coefficient within the scoring function (for i = 1, ..., n). The constant b0 has a decisive impact on the value of P (i.e. the probability of membership). Selecting an S-shaped logistic function curve ensures that the P values fall between 0 and 1 and can thus be interpreted as actual probabilities. The typical curve of a logit function is shown, in relation to the result of the exponential function (score), in chart 18.

Example scorecard characteristics and attribute bins:
Years @ address: <3 years | 3–6 years | >6 years | Blank
Years @ employer: <2 years | 2–8 years | 9–20 years | >20 years | Blank
Home phone: Y/N
Accom. status: Own | Rent | Parents | Other
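The logit formula above translates directly into code. A minimal sketch, with b0 as the constant term and Ki the indicator values (names follow the formula, not any particular library):

```python
from math import exp

def logit_pd(coeffs, indicators):
    """P = 1 / (1 + exp(-(b0 + b1*K1 + ... + bn*Kn))), coeffs = [b0, b1, ..., bn]."""
    b0, rest = coeffs[0], coeffs[1:]
    z = b0 + sum(b * k for b, k in zip(rest, indicators))
    return 1.0 / (1.0 + exp(-z))
```

The S-shape guarantees outputs strictly between 0 and 1, and the output rises monotonically with the linear score z.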

Application score—Used for new business origination; combines data from the customer, past dealings, and the credit bureaux.
Behavioural score—Used for account management (limit setting, over-limit management, authorisations); usually focuses upon the behaviour of an individual account.
Collections score—Used as part of the collections process, usually to drive predictive diallers in outbound call centres; incorporates behavioural, collections, and bureau data.
Customer score—Combines behaviour on many accounts; used for both account management and cross-sales to existing customers.
Bureau score—A score provided by the credit bureau, usually a delinquency or bankruptcy predictor that summarises the data held by them.

Markets where credit scoring is used today include, but are not limited to:
Unsecured—Credit cards, personal loans, overdrafts.
Secured—Home loan mortgages, motor vehicle finance.
Store credit—Clothing, furniture, mail order.
Service provision—Phone contracts, municipal accounts, short-term insurance.
Enterprise lending—Working-capital loans, trade credit.

Scoring triggers:
Request -> Application
Time -> Behavioural
Entry -> Recovery
Transaction -> Fraud
Event warning -> Behavioural/fraud
Campaign -> Response

Request—Customer application for a specific product, or increased facilities.
Time—Regular recalculation, such as monthly.
Entry—Calculated on first entry into a specific stage of the CRMC, and the resultant score is retained for future use.
Transaction—Customer already has the product, and scores are calculated each time

In retail credit, lenders strive to automate as many decisions as possible. The motivation for human input arises only where the models are known to be weak, the value at risk is large, and/or the potential profit is high.

Instances where judgmental assessments dominate are in sovereign, corporate, and project-finance lending, where the borrowers' financial situation is complex, the information is not standard and/or is difficult to interpret, and volumes are extremely low. If there is a scoring model, it will only be used for guidance, and the underwriter will assess other information that has not been incorporated in the score.

The internal versus external rating issue is primarily the bespoke versus generic debate: most internal ratings are provided by bespoke models developed specifically for a lender, while external ratings are generics based upon the experience of many lenders, and are also available to competitors. For retail credit, the latter may be provided by credit bureaux or co-operatives. Generics are often used by lenders that: (i) are small, and cannot provide sufficient data for a bespoke development; (ii) are looking to enter new markets, where they have no experience; or (iii) lack the technological sophistication to develop and implement a bespoke system.

Application scoring traditionally has assessed a very specific default risk. The most common risk considered is the chance that an applicant will go 90 days overdue on their payments in the next 12 months. What would happen over other time periods, and whether the customer is proving profitable to the lender, are aspects not considered in this assessment.

A score, s(x), is a function of the characteristics' attributes x of a potential borrower which can be translated into the probability estimate that the borrower will be good. The critical assumption in credit scoring is that the score is all that is required for predicting the probability of the applicant being good. In some ways the score is like a sufficient statistic. It is also usual to assume that the score has a monotonic increasing relationship with the probability of being good – in that case the score is called a monotonic score.

Module 6: Big Data and Machine Learning

Big Data in Finance
What is data science?
Introduction to classification
Bayesian models and inference using Markov chain Monte Carlo
Introduction to graphical models: Bayesian networks, Markov networks, inference in graphical models
Optimisation techniques
Examples: predictive analytics/trading & pricing

Classification, Clustering and Filtering
Classification: K-nearest neighbours, optimal Bayes classifier, naïve Bayes, LDA and QDA, reduced-rank LDA, logistic regression, Support Vector Machines
Clustering: K-means, Expectation-Maximization, DBSCAN, OPTICS and Mean-shift
Kalman filtering

Analytics
Regression: linear regression, bias-variance decomposition, subset selection, shrinkage methods, regression in high dimensions, and regression using SVMs and kernel methods
Dimension reduction: Principal Component Analysis (PCA), kernel PCA, non-negative matrix decomposition, PageRank
Examples (2 worked examples)
Data analytics sandbox
Examples

Stylised Facts
Volatility clustering: the concept and the evidence
Properties of daily asset returns
The ARCH framework
Why ARCH models are popular

Econometric Methods
Co-integration using R
Multivariate time series analysis
Financial time series: stationarity and unit roots
Engle-Granger procedure
Estimation of reduced-rank regression: Johansen procedure
Stochastic modelling of equilibrium: Ornstein-Uhlenbeck process
Statistical arbitrage using mean reversion
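Statistical arbitrage based on mean reversion typically starts by fitting an Ornstein-Uhlenbeck process to a spread; the mean-reversion speed can be estimated from the AR(1) slope of the sampled series. A self-contained sketch (Euler simulation plus a regression-based estimator; all parameter values are illustrative):

```python
import random
from math import sqrt, log

def simulate_ou(theta, mu, sigma, x0, dt, n, seed=5):
    """Euler simulation of dX = theta*(mu - X) dt + sigma dW."""
    rng = random.Random(seed)
    xs = [x0]
    for _ in range(n):
        x = xs[-1]
        xs.append(x + theta * (mu - x) * dt + sigma * sqrt(dt) * rng.gauss(0.0, 1.0))
    return xs

def fit_ou_speed(xs, dt):
    """Estimate theta from the AR(1) slope: X_{t+dt} ~ a + b*X_t, b ~ exp(-theta*dt)."""
    x, y = xs[:-1], xs[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    b = cov / var
    return -log(b) / dt
```

Simulating with a known theta and fitting it back is a useful self-check before applying the estimator to a real spread.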

Additional Content

Time series distributions
Poisson distribution

Scorecard scaling notation:
m - number of characteristics included in the model
factor - scaling parameter based on the formula presented previously
offset - scaling parameter based on the formula presented previously

Neutral score = sum of (score_i * distr_i) for i = 1 to k, where
k - number of bins
score_i - score assigned to the ith bin
distr_i - percentage distribution of the total cases in the ith bin

Gini coefficient notation, for i = 1 to k, where
k - number of categories of the analyzed predictor
Gx(i) - cumulative distribution of "good" cases in the ith category
Bx(i) - cumulative distribution of "bad" cases in the ith category
Gx(0) = Bx(0) = 0

Gini Coefficient = 2*AUC - 1

System stability/population stability report between recent applicants and expected (from the development sample):

index = sum of ((% Actual - % Expected) * ln (% Actual / % Expected))

An index of less than 0.10 shows no significant change, 0.10–0.25 denotes a small change that needs to be investigated, and an index greater than 0.25 points to a significant shift in the applicant population.

Characteristic Analysis Report:
"Expected %" and "Actual %" again refer to the distributions of the development and recent samples, respectively. The index here is calculated simply by:

index = sum of ((% Actual - % Expected) * Points)
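The stability index formula above can be computed in one line from the two distributions, expressed as proportions summing to one. A minimal sketch (the function name is mine):

```python
from math import log

def stability_index(actual, expected):
    """Population stability index between two proportion distributions."""
    return sum((a - e) * log(a / e) for a, e in zip(actual, expected))
```

Identical distributions give an index of exactly zero; a visible shift in the bin proportions pushes the index into the 0.10–0.25 "investigate" band or beyond.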

Now it is time to identify the drivers of LGD modelling. These fall under four categories:

1. Obligor characteristics: In the case of retail obligors, you can also include socio-demographic variables, such as marital status, gender (if allowed by the regulation), salary, time at address, time at job, etc. Also, variables measuring the intensity of the bank relationship, such as number of years as a client and number of products with the bank, can be considered. In the case of corporates, you can include industry sector, sector indicators, size of the company, legal form of the company, age of the company, and balance sheet information such as revenues, total assets, solvency, profitability, liquidity ratios, etc.

2. Loan characteristics: Popular examples here are real estate, cash, inventories, guarantees, etc. It is highly advised to qualify the collateral as precisely as possible. So, in the case of real estate, you can further categorize the collateral into, for example, flat, apartment, villa, detached/semi-detached house.

3. Country-specific regulations

4. Macroeconomic factors: Here you can think of gross domestic product (GDP), which reflects economic growth, aggregated default rates, inflation, unemployment rate, and the interest rates, which obviously will have an impact on the discount factor.

Modelling approach:

There are various approaches for modelling, depending upon the distributions of the dependent and independent variables. It could be a one-stage or a two-stage model:

1. Linear regression

2. Linear regression with Beta transformation

3. Logistic regression

4. Decision tree
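For approach 2, the LGD (which lives in (0, 1)) is typically mapped through a fitted Beta distribution before running the linear regression. A minimal sketch of the first step, estimating the Beta parameters by the method of moments, is below; the LGD values are hypothetical, and in practice the fitted Beta CDF is then used to transform the target before regression:

```python
import statistics

def beta_mom(lgds):
    """Method-of-moments estimates of Beta(a, b) parameters for LGD in (0, 1).

    a = m * c, b = (1 - m) * c with c = m(1 - m)/v - 1,
    where m and v are the sample mean and variance.
    """
    m = statistics.mean(lgds)
    v = statistics.variance(lgds)
    c = m * (1 - m) / v - 1
    return m * c, (1 - m) * c

# Hypothetical observed LGDs for a small pool.
lgds = [0.1, 0.2, 0.15, 0.4, 0.35, 0.05, 0.5, 0.25]
a, b = beta_mom(lgds)
# The regression is then run on Beta_cdf(lgd; a, b) (or its normal quantile),
# and predictions are mapped back through the inverse transformation.
```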

Downturn LGD: To calculate the downturn LGD, we take the LGDs of the last five years and compute the average for each year. The highest yearly average is taken as the downturn LGD. This LGD is used in the calculation of expected losses for each pool.
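The selection rule above is just a maximum over yearly averages; a one-line sketch with hypothetical yearly figures:

```python
# Hypothetical yearly average LGDs for the last five years.
yearly_avg_lgd = {2019: 0.22, 2020: 0.31, 2021: 0.27, 2022: 0.24, 2023: 0.25}

# Downturn LGD = the highest yearly average, as described above.
downturn_lgd = max(yearly_avg_lgd.values())
```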

a. Impute the missing values in the training dataset with the median using proc stdize.

b. Nominal variables may need to be collapsed using proc cluster and proc tree, which merge levels using chi-square statistics; proc means generates level and event-proportion statistics.

c. Remove redundancy: proc varclus creates variable clusters based on the second eigenvalue; the default threshold is 1.00, but you could specify 0.7.

d. Pick variables based on R-square ratios from the clusters. The R-square ratio is (1 - R-square with own cluster) / (1 - R-square with the nearest cluster).

e. Run proc reg with the VIF option to detect collinearity, and remove variables with VIF greater than 5.

f. Now run proc corr and create a table of Spearman and Hoeffding values; remove variables with a low Spearman rank and a high Hoeffding rank, as they indicate nonlinearity. This is called nonlinearity screening.

g. The next step is to adjust outliers using percentile treatment.
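Steps a and e can be sketched outside SAS as well. Below is a minimal Python illustration of median imputation and a two-variable VIF check (for exactly two predictors, VIF reduces to 1/(1 - r²) with r the Pearson correlation); all data is hypothetical:

```python
import math
import statistics

# Step a: median imputation of missing values.
vals = [3.0, None, 5.0, 7.0, None]
med = statistics.median(v for v in vals if v is not None)
imputed = [med if v is None else v for v in vals]

def pearson(x, y):
    """Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def vif_two(x, y):
    """For a pair of predictors, VIF = 1 / (1 - r^2)."""
    r = pearson(x, y)
    return 1.0 / (1.0 - r * r)

# Step e: two nearly collinear predictors trip the VIF > 5 rule.
x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [1.1, 1.9, 3.2, 3.8, 5.1]
vif = vif_two(x1, x2)   # far above 5, so one of the pair would be dropped
```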

3. Now, for each numeric variable create intervals, and for each categorical variable with different levels calculate WOEs and IVs. Each bucket/bin should contain at least 5% of the observations. Variables with IV less than 0.1 or more than 0.5 should be kept out of the analysis; they are under-predictive or over-predictive.
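The WOE and IV calculations per bin follow directly from the good/bad distributions. A minimal sketch with hypothetical bin counts:

```python
import math

def woe_iv(goods, bads):
    """Per-bin weight of evidence and total information value.

    WOE_i = ln(dist_good_i / dist_bad_i)
    IV    = sum((dist_good_i - dist_bad_i) * WOE_i)
    """
    tg, tb = sum(goods), sum(bads)
    woes, iv = [], 0.0
    for g, b in zip(goods, bads):
        dg, db = g / tg, b / tb
        w = math.log(dg / db)
        woes.append(w)
        iv += (dg - db) * w
    return woes, iv

# Hypothetical counts of good and bad cases per bin of one characteristic.
goods = [400, 300, 300]
bads = [100, 100, 300]
woes, iv = woe_iv(goods, bads)
# iv of about 0.39 lies inside the 0.1-0.5 band, so the variable is kept.
```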

Variable selection can be done using stepwise logistic regression (forward selection, backward selection and stepwise selection) and subset selection (where selection = score). Select the model which is most predictive and explanatory. The cutoff is calculated using the ROC curve. If you have done oversampling, correct the sample bias by using the PRIOREVENT= option in the logistic regression step. Use the ROC, KS statistic and Gini coefficient for judging model performance. The KS should be nearly 0.5; for a good behavioural scorecard, it should be above 0.65.
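The KS statistic mentioned above is the maximum gap between the empirical score distributions of goods and bads. A minimal sketch with hypothetical scores:

```python
def ks_statistic(scores_good, scores_bad):
    """KS = max over cutoffs of |F_good(c) - F_bad(c)|, the largest gap
    between the two empirical cumulative distribution functions."""
    cuts = sorted(set(scores_good) | set(scores_bad))
    ks = 0.0
    for c in cuts:
        f_g = sum(s <= c for s in scores_good) / len(scores_good)
        f_b = sum(s <= c for s in scores_bad) / len(scores_bad)
        ks = max(ks, abs(f_g - f_b))
    return ks

# Hypothetical scores for goods and bads.
good = [600, 650, 700, 720, 740]
bad = [500, 540, 580, 620, 660]
ks = ks_statistic(good, bad)   # 0.6: above the 0.5 benchmark quoted above
```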

2. Customers with 30 DPD will go to stage 2.

3. Customers whose ratio of current PD to PD at origination is higher than a predefined limit, depending upon certain tests as specified by IFRS 9, will go to stage 2.

4. All customers with forbearance status will go to stage 2.

5. The rest of the customers, who are in neither stage 3 nor stage 2, will go to stage 1.
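The staging rules above can be sketched as a single function. Note that the stage 3 rule is not listed in this excerpt, so the 90 DPD default trigger here is an assumption, as is the PD-ratio limit of 2.0:

```python
def ifrs9_stage(dpd, pd_now, pd_orig, forbearance, pd_ratio_limit=2.0):
    """Sketch of the IFRS 9 staging rules above.

    dpd            : days past due
    pd_ratio_limit : hypothetical threshold for the current-PD /
                     origination-PD ratio test
    """
    if dpd >= 90:                 # assumed default definition for stage 3
        return 3
    if (dpd >= 30                 # rule 2: 30 DPD
            or forbearance        # rule 4: forbearance status
            or pd_now / pd_orig > pd_ratio_limit):  # rule 3: PD ratio test
        return 2
    return 1                      # rule 5: everything else
```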

The PD, LGD and EAD models are initially built for stage 1; based on PD, we decide for which customers we should build a lifetime model. Lifetime means the remaining time in the contract period.

The description of the LGD and EAD models below is based on the simplest approach to LGD and EAD calculation.

EAD Model:

In the case of mortgages, there are mainly three kinds of products:

- Amortization type, where the product is based on an annuity
- Linear
- Interest-only mortgage

Depending upon the product type, EAD is calculated as the remaining balance at the end of each year. Two adjustments are applied each year:

- The prepayment rate at the end of each year
- The drop in customer balance due to an interest reset at the interest reset period, which could be 5, 10 or 15 years

Amortization can be based on a monthly or yearly interest rate. At the end of the contract duration, EAD is set to zero.
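For the annuity product with a yearly rate, the year-end EAD schedule can be sketched as below (the balance, rate and prepayment figures are hypothetical; the interest-reset adjustment is omitted for brevity):

```python
def annuity_ead_schedule(balance, rate, years, prepay_rate=0.0):
    """Year-end EADs for an annuity mortgage with yearly compounding.

    Each year: accrue interest, subtract the fixed annuity payment,
    then apply the prepayment rate. EAD is zero at contract end.
    """
    # Standard annuity payment formula.
    pmt = balance * rate / (1 - (1 + rate) ** -years)
    eads = []
    for _ in range(years):
        balance = balance * (1 + rate) - pmt   # scheduled amortization
        balance *= (1 - prepay_rate)           # prepayment adjustment
        eads.append(max(balance, 0.0))
    eads[-1] = 0.0                             # EAD set to zero at contract end
    return eads

# Hypothetical 2-year annuity mortgage of 100,000 at 4%.
eads = annuity_ead_schedule(100_000, 0.04, 2)
```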

LGD Model:

Once EAD is calculated, LGD can be computed easily. A mortgage is backed by collateral. For the best, base and worst scenarios, the collateral value is projected for the remaining years of the product life using forward-looking information. The following adjustments are then applied:

- A haircut is applied to the collateral
- A cure rate is applied
- Interest and costs of sale are added to the EAD

LGD is calculated as follows:

LGD = min(1, max(EAD - Collateral, 0) / EAD × CureRate)

The performance measures considered here are the R-square value, Mean Squared Error (MSE) and Mean Absolute Error (MAE). Example model parameters are the loan-to-value (LTV) ratio at the time of loan application (start of loan), a binary indicator for whether the account has had a previous default, time on book in years, the ratio of the property valuation to the regional average (binned), the type of security (detached, semi-detached, terraced, flat or other), the age group of the property, and the region.

LGD calculations for one-year datasets: We calculate the total loss by summing the NPV of pending payments and future payments. We calculate the total exposure by summing on-balance and off-balance amounts. LGD is calculated by dividing total losses by total exposure. For each observation, calculate the LGD as below:

Collateral = market value × predicted haircut

Outstanding Balance = NPV(payments due since default) + NPV(sum of future cash flows)

Repossession LGD = (Outstanding Balance at default - Collateral) / Total Exposure

Non-repossession LGD = Outstanding Balance / Total Exposure

Sum and average the LGDs to calculate the LGD for the portfolio.
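The per-observation formulas above can be sketched directly (flooring the repossession loss at zero is an assumption, since the text does not say what happens when collateral exceeds the outstanding balance; all figures are hypothetical):

```python
def lgd_observation(market_value, haircut, outstanding, total_exposure,
                    repossessed):
    """Per-observation LGD following the formulas above.

    market_value, haircut : collateral valuation and predicted haircut
    outstanding           : NPV of past-due plus future cash flows
    repossessed           : whether the collateral was repossessed
    """
    collateral = market_value * haircut
    if repossessed:
        # Loss floored at zero (assumption) when collateral covers the balance.
        loss = max(outstanding - collateral, 0.0)
        return loss / total_exposure
    return outstanding / total_exposure

# Hypothetical repossession case: 150,000 outstanding, 100,000 property
# at an 80% haircut-adjusted value.
lgd = lgd_observation(100_000, 0.8, 150_000, 150_000, repossessed=True)
```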

Downturn LGD

In recognition that banks may be unable to derive their own downturn LGD estimates, the Board of Governors of the Federal Reserve System proposed the following generic formula for deriving downturn LGD from long-term-average or through-the-cycle LGD estimates:

LGD_downturn = 0.08 + 0.92 × LGD

where LGD equals the long-term average LGD and LGD_downturn equals the expected downturn LGD. Under this rule, downturn LGD can be anywhere from zero to eight percentage points higher than the long-term average LGD. For example, a debt with a long-term average LGD of 100% would have a corresponding downturn LGD also equal to 100%, while an instrument with an LGD of 0% would have a corresponding downturn LGD equal to 8%. For a debt with an LGD equal to 50%, the corresponding downturn LGD equals 54%.
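The supervisory mapping is a one-line linear function, and the three worked examples above follow directly from it:

```python
def downturn_lgd(long_run_lgd):
    """Federal Reserve supervisory mapping: LGD_downturn = 0.08 + 0.92 * LGD."""
    return 0.08 + 0.92 * long_run_lgd

# The three examples from the text: 0% -> 8%, 50% -> 54%, 100% -> 100%.
examples = [downturn_lgd(x) for x in (0.0, 0.5, 1.0)]
```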

very low in boom times. The mortgage industry is a trillion-pound industry; capital requirements rising in a downturn would threaten economic stability. With TTC, the main issue is that its drivers are very static in nature and might not reflect the actual economic cycle. So the PRA in the UK has proposed that organizations should use neither the PIT approach nor the TTC approach, but rather a hybrid approach.

With the hybrid approach, the PRA has introduced the concept of cyclicality. The PRA defines cyclicality as a measure of the "PITness" of the model, so it is very model-specific. Additionally, the PRA has put a cap on cyclicality: at no point should model cyclicality exceed 30%. The PRA has proposed two different formulas to calculate cyclicality:

Formula (1): cyclicality = (PD_t - CT) / (DR_t - CT)

Formula (2): cyclicality = (PD_t - PD_(t-1)) / (DR_t - DR_(t-1))

Both formulas have their own issues. Formula (1) gives a very high cyclicality when the default rate gets close to CT. Formula (2) gives a very high cyclicality when the default rates of two consecutive years are close to each other. To deal with these problems, I propose a new formula:

Formula (3): cyclicality = SD(ΔPD / ΔDR)

Now, the question is why this formula is appropriate for calculating cyclicality: when we are looking to capture the changes of a series, their standard deviation should capture them. The next question is why we need model cyclicality: CP2916 says that we should uplift the model's long-run average PDs in proportion to cyclicality. So the first step is to calculate model cyclicality, and then uplift the model's long-run PD in proportion to it. That completes the CP2916 calibration.
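Formula (3) can be sketched in a few lines; the PD and default-rate series below are hypothetical:

```python
import statistics

def cyclicality_sd(pds, drs):
    """Formula (3): standard deviation of the year-on-year dPD/dDR ratios."""
    ratios = [(pds[i] - pds[i - 1]) / (drs[i] - drs[i - 1])
              for i in range(1, len(pds))]
    return statistics.stdev(ratios)

# Hypothetical model PDs and observed default rates by year.
pds = [0.020, 0.025, 0.022, 0.030]
drs = [0.018, 0.028, 0.020, 0.035]
c = cyclicality_sd(pds, drs)   # well below the 30% cap in this example
```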

For more information on CP2916 implementation,

please contact me by email

In credit scoring, one can analyse historical data on past borrowers and use it to predict the likely future behaviour of prospective borrowers. To do this, one assembles relevant data on the past borrowers from application forms, credit bureaus, accounting and financial records, as well as marketing information.

There are three interrelated ways of describing the chance of a borrower being good. The first is the familiar one of estimating the probability that an event occurs; the second is to consider the odds of its occurrence; and the third is to prescribe a score or index which contains all the information needed to estimate these odds. We want to understand how probabilities, odds, and scores are related to one another. When dealing with probabilities it is often easier in practice to consider the odds of the event: the chance of the event happening divided by the chance of it not happening. Horse racing is a good example of where this occurs. Similarly, with default risk one can assess the chance of a good or bad outcome by the odds of a good or a bad, where we define the odds as the ratio of the probability of a good (bad) outcome to that of a bad (good) outcome.

Information odds:

O(G | X) = p(X) p(G | X) / (p(X) p(B | X)) = p(G | X) / p(B | X)

Naïve Bayes' scorecard building:

Suppose we now have two predictions for the odds of a good: one based on age and the other based on residential status. They are very different from one another. How do we combine these different results to obtain a prediction which makes use of the information in both age and own/rent status? The simplest approach is to use Bayes' rule in a very naïve way.
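Under the naïve (independence) assumption, two single-characteristic odds predictions combine via Bayes' rule as O(G | x1, x2) = O(G | x1) × O(G | x2) / O(G). A minimal sketch with hypothetical odds:

```python
def naive_bayes_odds(pop_odds, cond_odds):
    """Combine per-characteristic odds predictions under the naive
    independence assumption:

        O(G | x1, ..., xn) = O(G) * prod(O(G | xi) / O(G))
    """
    combined = pop_odds
    for o in cond_odds:
        combined *= o / pop_odds
    return combined

# Hypothetical: population odds 4:1; the age-based prediction says 6:1,
# the residential-status prediction says 3.2:1.
odds = naive_bayes_odds(4.0, [6.0, 3.2])   # combined prediction of good odds
```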
