Value at Risk

By A V Vedpuriswar

February 8, 2009


VAR summarizes the worst loss over a target horizon that will not be exceeded at a given level of confidence.


For example, "under normal market conditions, the most the portfolio can lose over a month is about $3.6 billion at the 99% confidence level."


The main idea behind VAR is to consider the total portfolio risk at the highest level of the institution. Initially applied to market risk, it is now used to measure credit risk, operational risk and enterprise-wide risk. Many banks can now use their own VAR models as the basis for their required capital for market risk.


VAR can be calculated using two broad approaches:

- Non-parametric method: the most general approach, which makes no assumption about the shape of the distribution of returns.
- Parametric method: VAR computation becomes much easier if a distribution, such as the normal, is assumed.

Illustration

- Average revenue = $5.2 million per day
- Std dev = $9.1 million per day
- Total no. of observations = 254
- Confidence level = 95%
- No. of observations < -$10 million = 11
- No. of observations < -$9 million = 15

Non-parametric approach: find the point such that the number of observations to its left = (254)(.05) = 12.7. Interpolating between the two cut-offs, (12.7 - 11) / (15 - 11) = 1.7 / 4 = 0.4 (approx.), so the required point is about -10 + 0.4 = -$9.6 million. VAR = E(W) - (-9.6) = 5.2 + 9.6 = $14.8 million (approx.).

If we assume a normal distribution, the one-tailed z-value at the 95% confidence level is 1.645, so VAR = (1.645)(9.1) = $15.0 million (approx.).
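A minimal Python sketch of both calculations. The revenue series is a simulated stand-in with roughly the mean and standard deviation quoted above, so the printed figures will only approximately match the $14.8 million and $15.0 million worked out on the slide.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical daily revenues: 254 observations with mean ~$5.2m and
# std dev ~$9.1m, standing in for the data behind the illustration.
rng = np.random.default_rng(42)
revenues = rng.normal(5.2, 9.1, 254)

confidence = 0.95

# Non-parametric: the 5% quantile of the empirical distribution.
# np.percentile interpolates between observations, as in the slide.
cutoff = np.percentile(revenues, (1 - confidence) * 100)
var_nonparametric = revenues.mean() - cutoff

# Parametric: assume normality and scale the standard deviation
# by the one-tailed 95% z-value (about 1.645).
z = norm.ppf(confidence)
var_parametric = z * revenues.std(ddof=1)

print(f"Non-parametric VAR: ${var_nonparametric:.1f} million")
print(f"Parametric VAR:     ${var_parametric:.1f} million")
```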

VAR as a benchmark measure

- VAR can be used as a company-wide yardstick to compare risks across different markets.
- VAR can also be used to understand whether risk has increased over time.
- VAR can be used to drill down into risk reports to understand whether the higher risk is due to increased volatility or bigger bets.

VAR as a potential loss measure

- VAR can also give a broad idea of the worst loss an institution can incur.
- The choice of time horizon must correspond to the time required for corrective action as losses start to develop.
- Corrective action may include reducing the risk profile of the institution or raising new capital.
- Banks may use daily VAR because of the liquidity and rapid turnover in their portfolios.
- In contrast, pension funds generally invest in less liquid portfolios and adjust their risk exposures only slowly, so a one-month horizon makes more sense.

VAR as equity capital

- The VAR measure should adequately capture all the risks facing the institution, so it must encompass market risk, credit risk, operational risk and other risks.
- The higher the degree of risk aversion of the company, the higher the confidence level chosen.
- If the bank determines its risk profile by targeting a particular credit rating, the expected default rate can be converted directly into a confidence level (see the sketch after this list). Higher credit ratings should lead to a higher confidence level.
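A small sketch of that last point, assuming a hypothetical annual default probability for the targeted rating (the 0.01% figure is illustrative, not from the text):

```python
from scipy.stats import norm

default_rate = 0.0001          # assumed annual default probability for the target rating
confidence = 1 - default_rate  # VAR confidence level implied by the target rating
z = norm.ppf(confidence)       # normal quantile used to size the capital cushion

print(f"Confidence level: {confidence:.2%}, z-value: {z:.2f}")
```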

Delta Gamma Method

- In linear models, daily VAR is adjusted to other periods by scaling by a square-root-of-time factor.
- This adjustment assumes that the position is fixed and that daily returns are independent and identically distributed.
- The adjustment is not appropriate for options, because the option delta changes dynamically over time.
- The delta-gamma method provides an analytical second-order correction to the delta-normal VAR (see the sketch after this list).
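A sketch of both adjustments. The position parameters (spot, daily volatility, delta and gamma) are illustrative assumptions, not figures from the text.

```python
import math
from scipy.stats import norm

# Square-root-of-time scaling for a linear position:
daily_var = 1.0                             # $ million, assumed
ten_day_var = daily_var * math.sqrt(10)

# Delta-gamma correction for an option position.
# dV ~ delta*dS + 0.5*gamma*dS^2, so for positive gamma the second-order
# term reduces the loss implied by delta alone.
S, sigma, z = 100.0, 0.02, norm.ppf(0.99)   # spot, daily vol, 99% quantile (assumed)
delta, gamma = 0.6, 0.05                    # assumed option sensitivities

dS = z * sigma * S                          # adverse move in the underlying
delta_normal_var = delta * dS
delta_gamma_var = delta * dS - 0.5 * gamma * dS**2

print(f"10-day VAR (scaled):  {ten_day_var:.2f}")
print(f"Delta-normal VAR:     {delta_normal_var:.2f}")
print(f"Delta-gamma VAR:      {delta_gamma_var:.2f}")
```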

Historical simulation method

- The historical simulation method consists of going back in time and applying current weights to a time series of historical asset returns (see the sketch after this list).
- The method makes no specific assumption about the return distribution, other than relying on historical data.
- This is an improvement over the normal distribution because historical data typically contain fat tails.
- The main drawback of this method is its reliance on a short historical moving window to infer movements in market prices.
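A minimal sketch of the procedure, assuming a hypothetical matrix of historical asset returns and today's portfolio weights:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: 500 days of returns for 3 assets, and today's weights.
returns = rng.normal(0, 0.01, size=(500, 3))   # stand-in for historical asset returns
weights = np.array([0.5, 0.3, 0.2])            # current portfolio weights
portfolio_value = 100.0                        # $ million

# Apply current weights to each historical day to get hypothetical P&L,
# then read the VAR off the empirical distribution.
pnl = portfolio_value * returns @ weights
var_95 = -np.percentile(pnl, 5)                # loss not exceeded 95% of the time

print(f"1-day 95% historical-simulation VAR: ${var_95:.2f} million")
```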

- The sampling variation of historical simulation VAR is greater than for a parametric method.
- Longer sample paths are required to obtain meaningful quantiles.
- The dilemma is that this may involve observations that are no longer relevant.
- Banks use periods between 250 and 750 days. Many institutions are now using historical simulation over a window of 1-4 years, duly supplemented by stress tests.
- This is taken as a reasonable trade-off between precision and non-stationarity.

Monte Carlo Simulation Method

- The Monte Carlo simulation method is similar to historical simulation, except that movements in risk factors are generated by drawings from some pre-specified distribution.
- This method uses computer simulations to generate random price paths.
- The risk manager samples pseudo-random numbers from this distribution and then generates pseudo dollar returns as before. Finally, the returns are sorted to produce the desired VAR (see the sketch after this list).
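A sketch of the simulation loop, assuming normally distributed risk-factor returns and a linear portfolio; a real implementation would substitute the chosen stochastic process and full repricing.

```python
import numpy as np

rng = np.random.default_rng(7)

n_sims = 100_000
portfolio_value = 100.0                       # $ million
mu, sigma = 0.0, 0.01                         # assumed daily drift and volatility of the risk factor

# Draw pseudo-random factor returns from the pre-specified distribution,
# translate them into dollar P&L, then sort and pick the desired quantile.
factor_returns = rng.normal(mu, sigma, n_sims)
pnl = portfolio_value * factor_returns
pnl.sort()

var_99 = -pnl[int(0.01 * n_sims)]             # 99% VAR from the sorted losses
print(f"Monte Carlo 99% VAR: ${var_99:.2f} million")
```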

- Monte Carlo methods are by far the most powerful approach to VAR.
- They can account for a wide range of risks, including price risk, volatility risk, fat tails, extreme scenarios and complex interactions.
- Non-linear exposures and complex pricing patterns can also be handled.
- Monte Carlo analysis can deal with time decay of options, daily settlements and associated cash flows, and the effect of pre-specified trading or hedging strategies.

- The Monte Carlo approach requires users to make assumptions about the stochastic process and to understand the sensitivity of the results to these assumptions.
- Different random numbers will lead to different results, and a large number of iterations may be needed to converge to a stable VAR measure.
- When all the risk factors have a normal distribution and exposures are linear, the method should converge to the VAR produced by the delta-normal approach.

- The Monte Carlo approach is computationally quite demanding. It requires marking to market the whole portfolio over a large number of realisations of the underlying random variables.
- To speed up the process, methods have been devised to break the link between the number of Monte Carlo draws and the number of times the portfolio is repriced.
- In the grid Monte Carlo approach, the portfolio is exactly valued over a limited number of grid points. For each simulation, the portfolio is valued using a linear interpolation from the exact values at adjoining grid points (see the sketch after this list).
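A toy sketch of the grid idea: price the portfolio exactly on a few grid points, then value each simulated scenario by linear interpolation. The quadratic "exact" pricer and all parameters are purely illustrative.

```python
import numpy as np

def exact_price(s):
    """Stand-in for an expensive full repricing of the portfolio."""
    return 100.0 + 0.6 * (s - 100.0) - 0.02 * (s - 100.0) ** 2

rng = np.random.default_rng(1)

# Exact valuation only on a small grid of underlying levels...
grid = np.linspace(80.0, 120.0, 9)
grid_values = np.array([exact_price(s) for s in grid])

# ...then each Monte Carlo scenario is valued by interpolating between
# adjoining grid points instead of calling the pricer again.
scenarios = rng.normal(100.0, 5.0, 50_000)
approx_values = np.interp(scenarios, grid, grid_values)

pnl = approx_values - exact_price(100.0)
var_99 = -np.percentile(pnl, 1)
print(f"Grid Monte Carlo 99% VAR: {var_99:.2f}")
```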

Backtesting

- Backtesting is done to check the accuracy of the model. It should be done in such a way that the likelihood of catching biases in VAR forecasts is maximized.
- Too high a confidence level reduces the expected number of observations in the tail and thus the power of the tests.
- A longer horizon reduces the number of independent observations and thus the power of the tests.
- For the internal models approach, the Basle Committee recommends a 99% confidence level over a 10-business-day horizon. The resulting VAR is multiplied by a safety factor of 3 to arrive at the minimum regulatory capital.

- As the confidence level increases, the number of occurrences below VAR shrinks, leading to poor measures of high quantiles.
- There is no simple way to estimate a 99.99% VAR from the sample because it has too few observations.
- Shorter time intervals create more data points and facilitate more effective backtesting (see the sketch after this list).
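A sketch of a simple exception-count backtest: compare the number of days on which losses exceeded VAR with the number expected at the chosen confidence level. The P&L series and the VAR forecast are simulated stand-ins.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(3)

confidence = 0.99
n_days = 250
pnl = rng.normal(0, 1.0, n_days)             # hypothetical daily P&L
var_forecast = 2.33                          # hypothetical constant 99% VAR forecast

exceptions = int(np.sum(pnl < -var_forecast))
expected = n_days * (1 - confidence)

# Probability of seeing at least this many exceptions if the model is correct.
p_value = 1 - binom.cdf(exceptions - 1, n_days, 1 - confidence)

print(f"Exceptions: {exceptions} (expected about {expected:.1f})")
print(f"P(at least this many | model correct) = {p_value:.3f}")
```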

Choosing the method

- Simulation methods are quite flexible. They can either postulate a stochastic process or resample from historical data.
- They allow full valuation on the target date.
- But they are prone to model risk and sampling variation.
- Greater precision can be achieved by increasing the number of replications, but this may slow the process down.

- For large portfolios where optionality is not a dominant factor, the delta-normal method provides a fast and efficient way of measuring VAR.
- For fast approximations of option values, delta-gamma is efficient.
- For portfolios with substantial option components, or longer horizons, a full valuation method may be required.

- If the stochastic process chosen for the price is unrealistic, so will be the estimate of VAR.
- For example, the geometric Brownian motion model adequately describes the behaviour of stock prices and exchange rates, but not that of fixed income securities (see the sketch after this list).
- In Brownian motion models, price shocks are never reversed and prices move as a random walk.
- This cannot be the price process for default-free bond prices, which must converge to their face value at expiration.
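A sketch of simulating geometric Brownian motion paths for an equity price (drift, volatility and horizon are illustrative assumptions). The same process would be inappropriate for a bond, whose price must pull to par at maturity.

```python
import numpy as np

rng = np.random.default_rng(11)

s0, mu, sigma = 100.0, 0.05, 0.20       # assumed spot, annual drift and volatility
n_paths, n_steps, dt = 10_000, 252, 1.0 / 252

# Each step applies a lognormal shock; shocks are never reversed,
# so the price follows a random walk with drift.
shocks = rng.normal((mu - 0.5 * sigma**2) * dt, sigma * np.sqrt(dt), (n_paths, n_steps))
paths = s0 * np.exp(np.cumsum(shocks, axis=1))

horizon_prices = paths[:, -1]
var_99 = s0 - np.percentile(horizon_prices, 1)
print(f"1-year 99% VAR per share: {var_99:.2f}")
```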

VAR Applications

- Passive (reporting risk): disclosure to shareholders, management reports, regulatory requirements.
- Defensive (controlling risks): setting risk limits.
- Active (allocating risk): performance evaluation, capital allocation, strategic business decisions.

- VAR methods represent the culmination of a trend towards centralized risk management.
- Many institutions have started to measure market risk on a global basis because the sources of risk have multiplied and volatility has increased.
- A portfolio approach gives a better picture of risk than looking at different instruments in isolation.

- Centralization makes sense for credit risk management too.
- A financial institution may have myriad transactions with the same counterparty, coming from various desks such as currencies, fixed income, commodities and so on.
- Even though each desk may have a reasonable exposure when considered on an individual basis, these exposures may add up to an unacceptable risk.
- Also, with netting agreements, the total exposure depends on the net current value of contracts covered by the agreements.
- None of these steps is possible in the absence of a global measurement system.

- Institutions which will benefit most from a global risk management system are those which are exposed to:
  - diverse risks
  - active position taking / proprietary trading
  - complex instruments

- VAR is a useful information reporting tool.
- Disclosure of information is an effective means of market discipline.
- Ideally, institutions should provide summary VAR figures on a daily, weekly or monthly basis.
- Banks can disclose their aggregated risk without revealing their individual positions.

- VAR is also a useful risk control tool. Position limits alone do not give a complete picture: the same limit on a 30-year treasury (compared to a 5-year treasury) may be more risky.
- VAR acts as a common denominator for comparing various risky activities.
- VAR limits can supplement position limits.
- In volatile environments, VAR can be used as the basis for scaling down positions.

- VAR can be viewed as a measure of risk capital, or economic capital, required to support a financial activity.
- The economic capital is the aggregate capital required as a cushion against unexpected losses.
- Without controlling for risk, traders may become reckless: if a trader makes a large profit, he receives a large bonus, while if he makes a loss, the worst that can happen is that he will get fired.
- VAR helps in measuring risk-adjusted return.

- The application of VAR in performance measurement depends on its intended purpose.
- Internal performance measurement aims at rewarding people for actions they have full control over. The individual/undiversified VAR seems the appropriate choice.
- External performance measurement aims at allocation of existing or new capital to existing or new business units. Such decisions should be based on marginal and diversified VAR measures.

- VAR can also be used at the strategic level to identify where shareholder value is being added throughout the corporation.
- VAR can help management take decisions about which business lines to expand, maintain or reduce, and also about the appropriate level of capital to hold.

- A strong capital allocation process produces substantial benefits. The process almost always leads to improvements.
- Finance executives are forced to examine prospects for revenues, costs and risks in all their business activities.
- Managers start to learn things about their business they did not know.

Extreme Value Theory (EVT)

- EVT extends the central limit theorem, which deals with the distribution of the average of identically and independently distributed variables drawn from an unknown distribution, to the distribution of their tails.
- The EVT approach is useful for estimating the tail probabilities of extreme events.
- For very high confidence levels (>99%), the normal distribution generally underestimates potential losses.

- Empirical distributions suffer from a lack of data in the tails. This makes it difficult to estimate VAR reliably.
- EVT helps us to draw smooth curves through the extreme tails of the distribution based on powerful statistical theory.
- In many cases the t distribution with 4-6 degrees of freedom is adequate to describe the tails of financial data (see the comparison after this list).
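A quick comparison of tail quantiles under the standard normal and a Student-t with 5 degrees of freedom (rescaled to unit variance), showing how the fat-tailed quantiles move further out at very high confidence levels:

```python
import numpy as np
from scipy.stats import norm, t

df = 5                                   # degrees of freedom in the 4-6 range cited above
scale = np.sqrt((df - 2) / df)           # rescale so the t has unit variance, like the standard normal

for conf in (0.95, 0.99, 0.999):
    z_normal = norm.ppf(conf)
    z_t = t.ppf(conf, df) * scale
    print(f"{conf:.1%}: normal {z_normal:.2f}  vs  t(5) {z_t:.2f}")
```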

- EVT applies to the tails; it is not appropriate for the centre of the distribution.
- It is also called a semi-parametric approach.
- The EVT theorem was proved by Gnedenko in 1943.
- EVT helps us to draw smooth curves through the tails of the distribution.

¼0 ¼=0 ß>0 Normal distribution corresponds to ¼ = 0 Tails disappear at exponential speed 46 46 .EVT Theorem F (y) F (y) y = = = 1 ± (1+¼ y).µ) / ß.1/¼ 1 ± e-y (x .

[Figure: EVT estimators vs the normal distribution in the tail]

- Fitting EVT functions to recent historical data is fraught with the same pitfalls as VAR. Once-in-a-lifetime events cannot be taken into account even by powerful statistical tools. So they need to be complemented by stress testing.
- The goal of stress testing is to identify unusual scenarios that would not occur under standard VAR models.
- Stress tests can simulate shocks that have never occurred or have been considered highly unlikely.
- Stress tests can also simulate shocks that reflect permanent structural breaks or temporarily changed statistical patterns.

- Stress testing should be enforced, but the problem is that the stress needs to be pertinent to the type of risk the institution has.
- It would be difficult to enforce a limited number of relevant stress tests.
- The complex portfolio models banks generally employ give the illusion of accurate simulation at the expense of substance.

How effective are VAR models? VAR and sub-prime

- The tendency of risk managers and other executives to describe events in terms of 'sigma' tells us a lot.
- Whenever there is talk about sigma, it implies a normal distribution. Real-life distributions have fat tails.
- Goldman Sachs' chief financial officer David Viniar once described the credit crunch as "a 25-sigma event".

- The credit crisis of late 2007 was largely a failure of risk management. Risk models of many banks were unable to predict the likelihood, speed or severity of the crisis.
- Attention turned particularly to the use of value-at-risk as a measure of the risk involved in a portfolio.
- While a few VAR exceptions are expected (at 99%, a properly working model would still produce two to three exceptions a year), the existence of clusters of exceptions indicates that something is wrong.

- Credit Suisse reported 11 exceptions at the 99% confidence level in the third quarter, Goldman Sachs five at 95%, Lehman Brothers three at 95%, Morgan Stanley six at 95%, Bear Stearns 10 at 99% and UBS 16 at 99%.
- Clearly, VAR is a tool for normal markets and is not designed for stress situations.

What window?

- It would have been difficult for VAR models to have captured all the recent market events, especially as the environment was emerging from a period of relatively benign volatility.
- A two-year window won't capture the extremes, so the VAR it produces will be too low.
- A longer window is a partial solution at best. It will improve matters a little, but it also swamps recent events.

Is a shorter window a better thing?

- A longer observation period may pick up a wider variety of market conditions, but it would not necessarily allow VAR models to react quickly to an extreme event.
- If the problem is that models are not reacting fast enough, some believe the answer would in fact be to use shorter windows.
- These models would be surprised by the first outbreak of volatility, but would rapidly adapt.

What models work best?

- The best VAR models are those that are quicker to react to a step-change in volatility.
- With the benefit of hindsight, the type of VAR model that would actually have worked best in the second half of 2007 would most likely have been one driven by a frequently updated short data history, or any frequently updated data history that weights more recent observations more heavily than more distant observations (see the sketch after this list).
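A sketch of an exponentially weighted (RiskMetrics-style) volatility update, which weights recent observations more heavily. The decay factor of 0.94 is the conventional daily choice, and the return series is a simulated stand-in.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(9)
returns = rng.normal(0, 0.01, 500)     # hypothetical daily returns

lam = 0.94                             # decay factor: recent observations dominate
var_ewma = returns[0] ** 2             # initialise the variance estimate
for r in returns[1:]:
    var_ewma = lam * var_ewma + (1 - lam) * r ** 2

sigma_today = np.sqrt(var_ewma)
var_99 = norm.ppf(0.99) * sigma_today  # 1-day 99% VAR as a fraction of portfolio value
print(f"EWMA volatility: {sigma_today:.4f}, 99% VAR: {var_99:.4f}")
```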

- In an environment like the third quarter of 2007, a long data series will include an extensive period of low volatility, which will mute the model's reaction to a sudden increase in volatility.
- Although it will include episodes of volatility from several years ago, these will be outweighed by the intervening period of calm.

The importance of updating

- In the wake of the recent credit crisis, an unarguable improvement seems to be increasing the frequency of updating.
- Monthly or even quarterly updating of the data series is the norm. Shifting to weekly or even daily updating would improve the responsiveness of the model to a sudden change of conditions.