Author
Kishor Chandra Das
(Student ID: A4025837)
London School of
Business & Finance
Supervisor
Dr. Richard Osborne
Academic Year
Oct’2010 – Jun’2012
Dissertation Paper
Acknowledgements
I would first like to thank the management team at LSBF for accepting my application for
the MBA programme and providing a cohesive learning environment at the college. I
am extremely grateful to Dr. Richard Osborne, who supervised my dissertation,
advised me on narrowing down my research topic and shared his valuable feedback
through the months of this dissertation work. It has been a great pleasure to work
under him as a research student, and I have learned a tremendous amount about
selecting and carrying out different types of research. I sincerely appreciate his
personal and academic guidance.
My friends and co-students on this MBA programme have been very helpful in
sharing their opinions and experiences, which helped me give final shape to
my dissertation. The richness of experience and the competitive environment within
the group inspired me to take up this challenging topic for my dissertation. My friend
Jana Novosad has been a constant source of inspiration, always helping me keep my
focus on the work and complete it on time. I would like to thank Tony
Brown, Ligia Yamasaki, Chris Hilton, Irfan Akhai, David Ohana, Kayne Manning,
Gourav Shome, Tom Greenshields, Harshal Nikam and all my friends on this MBA
programme for their continued support and intellectual discussions through these
years.
Special thanks to my wife, Mousumi, for putting up with me during the writing of
this dissertation. I am deeply grateful to my mother, Manorama, for her blessings.
I dedicate this work to the memory of my late father, Rama Chandra Das, who
always wanted to see me complete my Master's degree.
Abstract
This research set out to investigate whether the liquidity risk profile of UK-based
financial brokerage firms can be accurately assessed from the publicly available
information published by the firms. Among the models for predicting financial
distress published between 1965 and 2010, this research considered five to
predict the financial distress of the sample UK-based brokerage firms. The
research took five recently failed financial services firms to assess the accuracy of
the models' predictions and then applied the models to five out-of-sample
financial brokerage firms based in the UK. The findings revealed that accounting-
based models such as the Z-score and O-score are not effective in predicting financial
distress for financial services firms, but the Merton DD model was found to be effective in
predicting the failure cases. The Lemke CFI model was also effective in determining
liquidity risk under the liquidity risk profiling framework proposed in this
paper. Although the results from the Merton DD probability and the Lemke CFI profiling
model were consistent for the recently failed firms, the results diverged for
the UK brokerage firms: Merton DD predicts a 0% probability of failure for
the UK brokerage firms, whereas the CFI profiling model predicts financial distress for 2
of the 5 UK brokerage firms.
Table of Contents
1 Introduction
1.1 Background
1.2 Aims & Objectives
1.3 Hypothesis
1.4 Structure of the Dissertation
2 Literature Review
2.1 Discriminant Analysis
2.2 Logit Analysis
2.3 Proportional Hazard Analysis
2.4 Neural Network Models
2.5 Critical Appraisal of Gaps in Literature
3 Research Methodology
3.1 Research Philosophy
3.2 Research Approach
3.3 Research Strategy
3.4 Time Horizon
3.5 Data Collection
4 Model Setup & Results Analysis
4.1 Altman's Z-Score
4.2 Ohlson's O-Score
4.3 Merton DD
4.4 CHS Score
4.5 Lemke CFI
4.6 Liquidity Risk Profile
4.6.1 Liquidity Profile
4.6.2 Desired Value
4.6.3 Tolerance Limits
5 Conclusion
6 References
7 Appendices
1 Introduction
London is at the centre of the world's financial activity. The London Stock
Exchange's history goes back more than 300 years, and over 400 firms, mainly
investment banks and stockbrokers, are members of the London Stock Exchange
(LSE, 2011). London's stockbroking sector is shrinking due to the recent financial crisis
and Europe's sovereign debt crisis. London's financial brokerage firms were taken
as the subject of this research in order to understand the robustness of
London's dominance in this industry as a long-term competitor in global financial
activity. The sector is already experiencing a permanent decline in overall activity
after years of expansion, which is triggering significant job losses. Over the past decade,
banks' vulnerability to liquidity risk has increased as the rapid growth of structured
products dispersed risk throughout the financial system but increased the system's
interconnectedness and banks' reliance on wholesale markets as a source of funding.
In particular, as the macroeconomic and financial market developments of the past
few years have increased financial firms' overall vulnerability to liquidity risk, this
research re-emphasises the importance of maintaining a resilient liquidity
position to sail through crisis periods. The knowledge established by this research
can help in understanding the safe liquidity risk profile within which brokerage firms
should operate for long-term stability.
1.1 Background
The term "liquidity" has different meanings in different contexts, and it is often counter-
productive to use it without further and closer definition (Goodhart, 2008). The three
main notions of liquidity are central bank liquidity, market liquidity and funding liquidity.
Any discussion of liquidity in the context of financial institutions refers to the
notion of funding liquidity. The Basel Committee on Banking Supervision defines
liquidity as the ability of financial institutions to meet their liabilities and unwind or settle
their positions as they come due (BIS, 2008). Similarly, the IMF defines
funding liquidity as the ability of solvent institutions to make agreed-upon
payments in a timely fashion. Liquidity represents the capacity to fulfil all payment
obligations as and when they fall due, to their full extent and in the currency required
(Duttweiler, 2009). All of the above definitions are compatible. The terms "financial
distress risk" and "liquidity risk" are sometimes used interchangeably. Outecheva
(2007) defines distress risk as the inability of a firm to pay its financial obligations as
they mature. Financial distress is a series of subsequent stages characterised by a
particular set of adverse financial events (Turetsky & McEven, 2001). Outecheva
(2007) grouped liquidity risk (or financial distress) problems into three main
categories: (a) event-oriented, (b) process-oriented and (c) technical. Event-oriented
problems are default, overdrawing, bankruptcy and non-payment of preferred stock
dividends. These events, however, occur only after a series of states of distress between
good financial health and the events themselves. This process-oriented explanation of
liquidity problems provides an understanding of this complex phenomenon. The complex
nature of financial failures deserves a behaviourally more appealing and robust
model form that recognises the substantial amount of heterogeneity that can exist
across and within all firms in terms of the role that attributes play in influencing the
outcome of the process (Jones & Hensher, 2007). A company can face liquidity
problems without any of the above events occurring, but these events cannot
occur without a preceding period of liquidity issues. A company is
technically categorised as having liquidity problems through the identification of certain
technical indicators. Accounting-based indicators remain very popular among
practitioners and researchers for determining the technical state of financial distress or
the liquidity situation. Market-based and accounting-based determinants are very often
combined to model the liquidity situation.
Liquidity risk represents the danger of not being able to fulfil payment obligations,
whereby the failure to perform is followed by undesirable consequences. Liquidity
risk management is of paramount importance because a liquidity shortfall at a single
institution can have system-wide repercussions. Financial market developments in
the past decade have increased the complexity of liquidity risk and its management.
The market turmoil that began in mid-2007 re-emphasised the importance of liquidity
to the functioning of financial markets. The recent financial crisis highlighted the lack
of sound liquidity risk management in financial institutions and the lack of regulatory
control in determining resilience to liquidity stress in the financial system. The
financial system on which the great masses of money are manipulated must be
examined to assure the public that it is safe and right (Bagehot, 1873). At least
since 1873, then, it has been known that financial institutions are subject to funding
liquidity risk, but its importance and urgency were only felt after the global financial
crisis of 2007. Basel
Brokerage firms (indeed, any financial institution) should make public disclosures of their
liquidity position. Public disclosure improves transparency, facilitates valuation,
reduces uncertainty in the markets and strengthens market discipline. Brokerage
firms should disclose sufficient information regarding their liquidity risk management to
enable market participants to make an informed judgement about the ability of the
firm to meet its liquidity needs. Public disclosure of liquidity information is included
as one of the principles of sound liquidity risk management and supervision by the Basel
Committee on Banking Supervision. Market participants are concerned that the
complexity of on- and off-balance-sheet arrangements does not permit them to fully
understand the liquidity position of financial firms. A liquidity risk profile
measurement tool is useful not only for investors, regulators and partners but
also for internal controllers and management. The liquidity and solvency level of a
company determines its ability to continue in business. Companies
face financial trouble and go bankrupt if their short-term financial needs cannot be met
by liquid assets or by funding liquidity in the market. Funding liquidity depends
on many micro- and macro-environmental factors, but the primary factor considered in
this research is the ability of the borrower to repay dues on their due dates. Lenders,
rating agencies, auditors and investors apply different models to predict the
likelihood of a company's bankruptcy and to assess the risk associated with it. While
several models are in use for estimating this risk, the liquidity risk profile can be
one of the most appropriate tools for assessing the risk in current market conditions.
changes with the micro and macro factors of the business, the liquidity required also
changes with the risk profile.
When entering a liquidity issue situation, firms are forced to renegotiate their debt
repayment terms with creditors or to raise capital to fund their turnaround
strategy. Across the different phases of financial distress, the turnaround strategy can
potentially lead to recovery from the situation, or it can lead to complete failure
of the strategy. The central question of this research is: "Can the liquidity risk profile
of UK-based financial brokerage firms be accurately predicted in different phases
of adverse situations, based on publicly available information?". In order to find
an answer to this central question, this research aims:
1. To validate the established models on the financial profiles of banks that
have failed in recent years.
2. If the established models are not accurate enough, to establish a new
model that is as current as possible for predicting financial distress and
the liquidity risk profile.
The objectives of this research are:
1. To assess the liquidity risk profile, with an accuracy of more than 75%,
indicating financial distress well in advance, as early as 3 years prior to a
bankruptcy situation.
2. To understand the validity of the old models in the current context and the
underlying reasons that make the models invalid.
3. To understand which factors are most important in assessing the liquidity risk
profile of financial brokerage firms in the current context.
1.3 Hypothesis
Companies don't collapse in one day, one month or one quarter. Generally there
are warning signs long before financial problems surface in public. The
warning signs can be noticed by analysing a company's performance and position from
its balance sheet. Detecting warning signs can sometimes be difficult, as companies
may follow "window dressing" to make their balance sheets look prettier on
reporting dates (Norris, 2010). The first hypothesis of this research is that if companies
engage in any kind of window dressing of their financial position, they may be
leaving signs of this unethical practice in their balance sheets. Analysing the cases
of failed financial companies, during their stable periods of operation and the period
before their financial instability, would shed some light on the warning signs that went
unnoticed. If there is any similarity in the mistakes made by failing banks, e.g.
Lehman Brothers, MF Global, Northern Rock, Merrill Lynch and Royal Bank of
Scotland, then the lessons from these cases may be applied across other financial
firms to prevent their failure. The second hypothesis of this research is that every
company, depending on its industry sector and market, may need to operate within
a liquidity tolerance level. If a company operates beyond the tolerance limits of its
liquidity profile, then the company is more likely to face financial trouble. So, if the safe
operation of a brokerage firm is linked to its liquidity risk profile, then bankruptcy
and financial troubles of the firm may be prevented by operating within the liquidity
tolerance limits.
Although the above two hypotheses are applicable to any company in any industry
sector, for testing these hypotheses this research applies them to
financial brokerage companies. The emphasis of this research is on financial
brokerage firms because liquidity risk has been one of the most debated topics in the
financial sector since the financial crisis of 2008. Brokerage firms are a subject of this
research because depository guarantee insurance schemes do not apply to
customers' savings and investments made through these brokerage firms; these
investments are therefore at higher risk than savings held through retail banks. Later
research may extend the testing of these hypotheses to other industry
sectors.
different theories and functions are discussed. This chapter presents a critical review
of existing theories and models of financial distress, a discussion of the key terms and
limitations, and the conceptual framework in which financial distress is seen as a
dynamic process.
Chapter 4 explains the setup of the different models taken here for comparative
analysis. The results obtained from feeding the primary and secondary data sets into
these models are presented and analysed in that chapter, which ends with a
discussion of the conceptual framework for calculating and presenting the liquidity
risk profile. Empirical results from this framework are also discussed and presented
there.
2 Literature Review
A simple definition of liquidity risk is the inability of a company to meet its current
financial obligations. These financial obligations are not limited to debt financiers
and equity financiers but also include other stakeholders such as service
providers, vendors and employees. Most studies are limited to
the financial obligations towards debt financiers, as they are the first to be
affected in a liquidity crisis. This study is not restricted to analysing the
liquidity situation from a debt provider's point of view but analyses it from the overall
operational point of view of the business. Bankruptcy and failures of financial
brokerage firms have been a very expensive and serious concern to the industry, as the
compensation paid under investor protection schemes results in millions of
dollars of losses to the regulatory trustees (Altman & Loris, 1976). In their research,
Altman and Loris (1976) observed that almost 73% of the failed brokerage firms
had been in business for less than 5 years, 20% of the firms had an asset size of more than
1 billion USD and 17% of the firms had a successful track record of more than 10
years in business. Numerous such studies have been published over the past 40
years on financial distress modelling, liquidity risk profiling, predicting bankruptcy
and counterparty risk. Most of these models are based on (i) discriminant analysis,
(ii) logistic and probabilistic analysis, (iii) proportional hazard analysis and (iv)
neural network analysis. The factors found to be most common in these studies
are the net income to total assets ratio, current ratio, working capital to total assets ratio,
retained earnings to total assets ratio and profit from operations to total assets ratio
(Bellovary et al., 2007).
discriminate the stable and bankrupt companies with an accuracy of 90%. The
current ratio has been almost venerated by accountants and other financial decision
makers as a prime criterion for measuring liquidity (Lemke, 1970). Even today, the
current ratio is considered a key indicator in assessing creditworthiness and
eligibility for loans. The fundamental problem with the current ratio is that it is static
information at a given point in time; it gives no indication of the rate at
which the obligations fall due or the rate at which the receivables come in.
Maintenance of liquidity requires that cash outflows be matched by cash inflows (or
the difference made up from liquid holdings) on a day-to-day basis, and the
necessity for rather detailed planning is imposed by seasonal and non-routine
fluctuations in cash flows (Lemke, 1970). Individual variables such as the current ratio
are good quick indicators but not robust, reliable representations of financial stability.
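Lemke's point about the static nature of the current ratio can be illustrated with a small sketch. The figures below are hypothetical and purely illustrative: two firms report identical balance-sheet snapshots, and hence identical current ratios, yet face very different day-to-day liquidity outcomes depending on when inflows and outflows fall due.

```python
# Illustrative only: two firms with identical current ratios but different
# cash-flow timing, showing why the ratio alone cannot capture liquidity risk.

def current_ratio(current_assets, current_liabilities):
    return current_assets / current_liabilities

# Both firms report the same balance-sheet snapshot.
assets, liabilities = 150.0, 100.0
print(current_ratio(assets, liabilities))  # 1.5 for both firms

def survives(cash, inflows, outflows):
    """Check period by period whether cumulative cash ever goes negative."""
    for inflow, outflow in zip(inflows, outflows):
        cash += inflow - outflow
        if cash < 0:
            return False
    return True

# Firm A collects receivables before its payables fall due; Firm B does not.
print(survives(10, inflows=[80, 0, 60], outflows=[20, 40, 40]))   # Firm A
print(survives(10, inflows=[0, 0, 140], outflows=[20, 40, 40]))   # Firm B
```

Firm A meets every payment as it falls due, while Firm B, with the same total inflows and the same current ratio, runs out of cash in the first period: exactly the timing information the static ratio discards.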
Beaver's research was primarily based on individual variables discriminating between
successful and failing companies. Such a univariate model is susceptible to faulty
interpretation and can potentially be confusing. Beaver recommended
research based on multivariate analysis to increase the accuracy of models for
predicting failures. Further research by Edward I. Altman, Robert G. Haldeman
and P. Narayanan (1977) and Joseph Aharony, Charles P. Jones and Itzhak Swary
(1980) is based on multivariate discriminant analysis, which establishes that
failure warnings can be detected as early as 5 years prior to the collapse of the
company with an accuracy level of 70%, rising to 90% within 3
years of bankruptcy. Altman's multivariate discriminant model was based on 7
ratios: return on assets, stability of earnings, debt service, cumulative profitability,
liquidity, capitalisation and size.
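As context for the Z-score models discussed here, Altman's earlier (1968) formulation, which precedes the seven-ratio model above, combines five accounting ratios with fixed weights. A minimal sketch follows; the firm figures fed in are hypothetical and purely illustrative, while the weights and the conventional 1.81/2.99 cut-offs are those of the published 1968 model.

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_value_equity, sales, total_assets, total_liabilities):
    """Altman's (1968) five-ratio Z-score for public manufacturing firms."""
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = market_value_equity / total_liabilities
    x5 = sales / total_assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

def z_zone(z):
    """Conventional cut-offs: distress below 1.81, safe above 2.99."""
    if z < 1.81:
        return "distress"
    if z > 2.99:
        return "safe"
    return "grey"

# Hypothetical firm figures (in millions), purely for illustration.
z = altman_z(working_capital=40, retained_earnings=60, ebit=25,
             market_value_equity=120, sales=300, total_assets=200,
             total_liabilities=90)
print(round(z, 2), z_zone(z))
```

The resulting score is a single number that must still be read against the cut-off zones, which is exactly the interpretation problem the critics discussed below raise.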
Models based on univariate analysis and multivariate analysis determine a score,
commonly referred to as the Z-score. The resulting score is very much subject to
individual interpretation. Critics of the Z-score models argue that the score does not
suggest any probability of failure or any estimated time before collapse, so its
practical use is very limited. Results of new statistical techniques for ratio-based
analysis are as good as MDA but rely on fewer assumptions (Hossari, 2007).
The underlying assumptions of MDA, the distribution of the variables, the dispersion
matrices of the sample groups, the interpretation of the significance of individual
variables, dimension reduction, the definition of groups, the selection of prior probabilities
and samples, and the assessment of classification error rates, may not hold true in
practical scenarios and can lead to misclassifications. Discriminant analysis on
accounting-ratio-based models can be less informative than market-based
models with respect to predicting the financial distress of firms (Charalambakis et al.,
2009). Charalambakis et al (2009) argue the drawbacks of accounting-ratio-
based models on the grounds that (a) accounting-based models use information
from the financial statements, which present the past performance of a firm and may
not convey information about its future status, (b) the book values of assets and
liabilities are understated and (c) accounting data provide a snapshot of the value of the
company at a specific point in time, while market data dynamically reflect the value of
the company. These arguments on accounting ratios versus market data by
Charalambakis et al (2009) are quite valid. Liquidity difficulties can be caused by
exogenous or endogenous risk factors (Outecheva, 2007). Outecheva defines
endogenous risk factors as the internal problems of a company, which negatively affect
only a particular firm or a small number of firms within the same network. Exogenous
risk factors are pervasive; they can affect all companies in the market or in the same
industry segment. Despite the strong economic intuition suggesting that industry
effects should be an important component in bankruptcy prediction, not much
attention has been paid to industry effects in the extant literature, most likely due to the
limited number of bankruptcies in each industry (Mansi et al., 2010). The counter-
argument can be that market dynamics and industry effects are also reflected in a
firm's accounting ratios. The accounting ratios incorporate future cash flows and short-
term and long-term assets and liabilities, providing forward-looking
information specific to the firm that market data cannot provide. But most
accounting ratios are available only on an annual basis, whereas market data are
available on a daily or monthly basis. Accounting ratios are more accurate
and appropriate if the financial distress analysis is done annually, while
market data are useful if the analysis is being done in interim periods.
Despite the critique that discriminant analysis on financial
ratios is past-oriented and cannot capture the future dynamics and prospects of a
company as a going concern, such models perform well in predicting financial
distress and probability of default (Outecheva, 2007).
the variables than the discriminant analysis models. Probit analysis is considered
a variant of logit analysis, as the only difference lies in the assumption about the
distribution of the random term. The predicted probabilities from probit and logit
models are quite similar unless the sample is large and enriched with observations
at the tails (Hossari, 2007).
Jones & Hensher (2004) presented the view that the literature on predicting
corporate financial distress has largely been confined to simple multiple discriminant
analysis, binary logistic or probit analysis, or rudimentary multinomial logit models,
and has failed to keep abreast of the important methodological developments in
discrete choice modelling. They were of the opinion that the mixed logit model fulfils
this purpose and provides a superior framework for prediction and for explaining the
significance of each of the variables. A simple representation of the mixed logit
model is

P_ni = ∫ L_ni(β) f(β) dβ

where the mixed logit probability P_ni is a weighted average of logit probabilities L_ni(β)
evaluated at different values of the logit model parameters β, with f(β) as the mixing
(or probability-weighting) distribution. The mixed logit model extends the standard
logit model for discrete outcomes and overcomes three of its limitations: it allows for
random taste variation, it allows for unrestricted substitution patterns and it accounts
for correlation in unobserved factors over time. The random parameters β in the mixed
logit model can take any form of distribution, such as normal, lognormal, triangular or
uniform, and provide better predictive performance on a hold-out sample. The
challenges in applying mixed logit stem from not knowing the form of the mixing
distribution f(β), and the need to evaluate the integral numerically and to estimate the
parameters of f(β) by simulated maximum likelihood (Kalotay, 2007).
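The simulation idea behind this rests on approximating the mixing integral by averaging logit probabilities over random draws of β. A minimal sketch, with hypothetical coefficients and independent normal mixing distributions (the means, standard deviations and covariate values below are invented for illustration only):

```python
import math
import random

def logit_prob(beta, x):
    """Standard binary logit probability L(β) = 1 / (1 + exp(-β·x))."""
    u = sum(b * xi for b, xi in zip(beta, x))
    return 1.0 / (1.0 + math.exp(-u))

def mixed_logit_prob(x, means, sds, draws=20000, seed=7):
    """Approximate P = ∫ L(β) f(β) dβ by averaging logit probabilities
    over random draws of β from normal mixing distributions."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(draws):
        beta = [rng.gauss(m, s) for m, s in zip(means, sds)]
        total += logit_prob(beta, x)
    return total / draws

# Hypothetical two-variable example: with zero-variance mixing the result
# collapses to the plain logit probability at the mean coefficients.
x = [1.0, 0.5]
fixed = mixed_logit_prob(x, means=[0.8, -1.2], sds=[0.0, 0.0], draws=100)
plain = logit_prob([0.8, -1.2], x)
random_taste = mixed_logit_prob(x, means=[0.8, -1.2], sds=[1.0, 0.5])
```

Setting the mixing variances to zero recovers the plain logit, which makes the nesting explicit; a non-degenerate f(β) is what delivers the random taste variation described above. Estimating the parameters of f(β) itself would wrap this simulated probability in a maximum likelihood search, which is where the computational burden noted by Kalotay (2007) arises.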
effects of market capitalisation and volatility become stronger, while the effects of
losses, leverage and recent past returns become slightly weaker. In a recent study,
Sattar et al (2010) find that the CHS score outperforms Altman's (1968) Z-
score, Ohlson's (1980) O-score and Merton's (1974) distance-to-default model. Their
study assumed that bond spreads reflect the market's perception of risk and
assessed the relative performance of the CHS score, Merton DD, Z-score and O-score
in predicting financial distress. The CHS score model has certain qualitative
aspects that improve upon the earlier models: (a) the problem of
using understated book values, as argued by Charalambakis et al (2009), is
addressed in the CHS model by considering the market value of the equity component
in determining total asset value, (b) the market-to-book ratio variable brings out
the difference between market perception and the firm's own estimates and (c) the
firm's excess stock returns in comparison to the S&P 500 index set the firm's
performance against the market's performance.
Specific studies on predicting bank failures with proportional hazards models have
shown that the results are as accurate as multiple discriminant analysis models.
Proportional hazards models indicate both the probability of failure and the most
probable time to failure. Merton's (1974) DD model estimates the probability of default
at any given point in time. The basic function of the Merton DD model is simple: it
considers the face value of the debt, the market value of equity and the volatility of the
firm's value over the time horizon. In simple form, the Merton DD model can be
represented as

DD = [ln(V/F) + (μ − σ_V²/2)T] / (σ_V √T),    π = N(−DD)

where V is the market value of the firm, F the face value of its debt, μ the expected
return on firm value, σ_V the volatility of firm value, T the time horizon and N(·) the
standard normal cumulative distribution, so that π is the implied probability of default.
The value of the firm and its volatility are not directly observable. The market value of
the firm is the sum of the market value of its equity and the market value of its debt.
While the market value of equity can easily be determined for publicly listed firms, it is
assumed that the market value of debt and the volatility of firm value can be determined
from other observable variables. Several researchers have examined Merton's DD
model against other simplified models applying the same functions of the Merton DD
model. In a notable piece of research, Bharath and Shumway (2008) conclude that while
the Merton DD model does not produce a sufficient statistic for the probability of default,
its functional form is useful for forecasting defaults. Bharath and Shumway
constructed a naive model, based on the functions of Merton's DD model, and
observed that results from the naive model are as good as those of Merton's DD model.
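A sketch of such a naive alternative is given below, in the spirit of Bharath and Shumway's approximations: debt volatility proxied as 0.05 plus a quarter of equity volatility, firm value taken as equity plus the face value of debt, and the firm's expected return proxied by its past stock return. The numerical inputs are hypothetical and purely illustrative.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def naive_dd(equity, face_debt, sigma_e, past_return, horizon=1.0):
    """Naive distance-to-default in the spirit of Bharath & Shumway (2008):
    avoids solving for unobservable firm value and volatility by using
    simple proxies built from observable quantities."""
    sigma_d = 0.05 + 0.25 * sigma_e                  # proxy debt volatility
    firm_value = equity + face_debt                  # proxy firm value
    sigma_v = (equity / firm_value) * sigma_e \
        + (face_debt / firm_value) * sigma_d         # value-weighted volatility
    dd = (math.log(firm_value / face_debt)
          + (past_return - 0.5 * sigma_v ** 2) * horizon) \
        / (sigma_v * math.sqrt(horizon))
    return dd, norm_cdf(-dd)

# Hypothetical inputs: equity 80, face value of debt 100, 40% equity
# volatility, a -10% trailing return, one-year horizon.
dd, prob_default = naive_dd(equity=80.0, face_debt=100.0, sigma_e=0.40,
                            past_return=-0.10, horizon=1.0)
```

The appeal of the naive form is exactly what Bharath and Shumway argue: it keeps the functional shape of Merton DD (leverage, volatility and drift combined into a distance, mapped through the normal CDF) while replacing the unobservable inputs with rough observable proxies.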
Henebry (1996) tested proportional hazards models to study the predictive accuracy
of bank failures with cash flow information in the predictive models. The proportional
hazard model assumes that the effect on survival of a particular value for a variable
will be proportionally the same over time. The advantage of proportional hazards
model over the MDA and logit models is that there are no distributional assumptions
for the hazards model or for the estimation of coefficients. Results from Henebry’s
(1996) research suggest that cash flow variables improve the accuracy in long run
predictions except in one year model. Henebry himself casted doubt on the
appropriateness of the technique as most of the significant variables violated the
model assumption for more than one year time horizons. To draw more definitive
conclusions he proposed further researches on modified proportional hazards model
to allow variables to change over time. Shumway (2001) refers the single-period
bankruptcy models as static models. Shumway’s (2001) argument is that firms
change through time and the static models produce biased and inconsistent
probabilities by ignoring this fact. Estimating hazard models with accounting
variables reveals that half of the accounting variables are statistically unrelated to
bankruptcy probability (Shumway, 2001). Shumway argues that his simple hazard model outperforms the alternative models in out-of-sample forecasts by considering three market-driven variables: market size, historical stock returns and the standard deviation of stock returns. The three strong arguments for using the hazards model are that (a) it considers a firm's period at risk, which is missing in the static models, (b) it incorporates time-varying covariates that reveal the firm's changing health and (c) it uses much more time-series data to produce more accurate out-of-sample forecasts. Charalambakis et al. (2009)
investigated and compared the effectiveness of two Z-score models with that of
Shumway’s simple hazard model for UK based firms.
Some of the key findings presented by Charalambakis et al. (2009) are as follows. One, with respect to the discrete hazard model using the components of Taffler's (1983) Z-score model, they found that half of the accounting ratios on which the Taffler Z-score is based are not related to forecasting corporate failure. Two, Shumway's (2001) discrete hazard model based on accounting and market variables contains significantly more information about the probability of financial distress than hazard models based on the Z-score and its components. Three, out-of-sample forecasts clearly demonstrated that Shumway's (2001) hazard model based on accounting and market variables outperformed the hazard models based on the Z-score and Z-score components; it even outperformed the hazard model based purely on market variables. Their study suggests that the Z-score is not a powerful predictor of corporate financial distress, as it lacks statistical power. Their data sample consisted of 3,459 firms and 32,257 firm-year observations. A sample this large helps remove any bias arising from smaller samples, but data spanning 1980 to 2006 is also likely to average out and hide the changing patterns of the recent years. Other weaknesses of Charalambakis et al.'s (2009) research are that they excluded financial and utility firms from their data sample, and that their study was restricted to bankruptcy cases, excluding other distressed conditions such as default, mergers and acquisitions.
method (Zhang et al., 1998). The number of output nodes in the network depends on the type of problem at hand. For a dichotomous output, a single-node output layer is required.
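As an illustration of the topology just described, a single-hidden-layer network with one sigmoid output node for a dichotomous bankrupt/non-bankrupt decision might look as follows (hypothetical, untrained weights):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w_hidden, b_hidden, w_out, b_out):
    """One hidden layer; the single output node yields P(distress) for a dichotomous outcome."""
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(w_hidden, b_hidden)]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + b_out)

# Hypothetical topology: 3 input ratios, 2 hidden neurons, 1 output node.
x = [0.10, -0.30, 0.50]                       # e.g. three accounting ratios for one firm-year
w_hidden = [[0.4, -1.2, 0.9], [-0.7, 0.5, 1.1]]
p = forward(x, w_hidden, b_hidden=[0.0, 0.1], w_out=[1.5, -2.0], b_out=-0.2)
print(round(p, 3))                            # a single probability between 0 and 1
```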
A neural network based corporate collapse prediction model performed better than the MDA-based model the farther the data was from 'y0' ('y0' denotes the year in which collapse occurred), and it produced fewer Type I errors than MDA (Coats & Fant, 1993). Tam & Kiang (1992) argue in favour of the neural network model because it does not assume a distribution pattern for the variables, as the statistical models do. The other argument in its favour is its capability to adapt its weights to new real-life examples. Though the neural network model is a good comparative alternative to other financial distress classification techniques, its application is still limited. The difficulties in implementing a neural network model are (a) defining an effective network topology, with an optimal number of layers and number of neurons in each layer, (b) the difficulty of explaining the relative significance of each input variable, termed the "black-box" syndrome in dealing with qualitative information (Xiong, 2009), and (c) its demand for more computational power and time to train the neural net.
Xiong (2009) proposed a new model, named the Adaptive Genetic Fuzzy Neural Network (AGFNN), which combines a genetic algorithm with a neural network and can overcome the "black-box" syndrome of neural network models. His proposed architecture for the AGFNN model consisted of a simple four-layer neural network. The numbers of nodes in the layers were determined by the number of variables and the number of fuzzy sets. The results from his research indicate that the performance of AGFNN is much better than that of the neural network in financial distress prediction.
Modelling the liquidity risk profile consists of at least three stages. The first stage is
evaluating and determining the potential of the observed variables. The second
stage is establishing a mathematical or statistical model that can explain the
relationship between the variables and the results. The third stage is validating the
model through empirical studies and determining the effectiveness of the model. Univariate discriminant analysis is useful for identifying the potential variables, which is the first stage of the modelling process. Critics of univariate discriminant analysis should not forget that it is the foundation of all later research and findings in financial distress or liquidity risk modelling. Research on the linear regression model as used in MDA, the logistic regression as in logit or probit analysis and the neural network models belongs to the second stage of the modelling process. Among all the models
developed for predicting financial distress and bankruptcy situations between 1968
and 2004, Hossari (2007) found MDA to be the most prominent. His findings suggest that although new methodological approaches have come into use recently, most of them use MDA as the benchmark methodology. The specification test conducted by Lo (1985) concludes that the null hypothesis of that research, that discriminant analysis and logit analysis are equivalent, may not be rejected. It is in the third stage, validating the model through empirical studies, that I find most of the gaps in the past literature. Most of the literature emphasises the theoretical concepts of the mathematical or statistical models but does not always fully explain their practical usefulness and implications. Real-world firms do not obey the constraints imposed by our assumptions (Lemke, 1970). The more we move into the real world and away from the restrictive assumptions of theoretical analysis, the greater is the possible variation of the observed variables, and the interpretation of changes becomes more and more complex. The circumstantial evidence contained in company annual reports does not permit dogmatic conclusions (Lemke, 1970). My findings suggest that the literature is mostly targeted at an academic audience. Most of the mathematical and statistical functions used in the models are complex enough that only a mathematician or a statistician can easily understand them. To bring the research findings to wide acceptance and common use, they need to be presented for common investors and stakeholders, who do not necessarily have a deep mathematical or statistical background.
Among all the ratios that have been studied to analyse and predict financial distress in a firm, the top five are net income to total assets (nita), current liabilities to current assets (current ratio), working capital to total assets (wcta), retained earnings to total assets (reta), and earnings before interest and taxes to total assets (ebitta). The statement of cash flows, which illustrates a company's capacity to transform its results into cash, has been virtually ignored by analysts, who tend to focus instead on the balance sheet and the income statement (Maux & Morin, 2011). As liquidity is all about successfully clearing financial liabilities as they become due, there are three ways to meet these liabilities. The first is to generate sufficient and timely cash flows from operations and investing activities to pay for the liabilities as they fall due.
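For concreteness, the five ratios listed above can be computed directly from statement items; a minimal sketch with hypothetical figures (the second ratio is computed as current liabilities over current assets, as the text defines it):

```python
# Hypothetical statement items (in £ thousands) for a single firm-year.
net_income, ebit, retained_earnings = 120.0, 210.0, 800.0
current_assets, current_liabilities = 950.0, 700.0
total_assets = 2400.0
working_capital = current_assets - current_liabilities

ratios = {
    "nita":   net_income / total_assets,
    "clca":   current_liabilities / current_assets,  # as the ratio is defined in the text
    "wcta":   working_capital / total_assets,
    "reta":   retained_earnings / total_assets,
    "ebitta": ebit / total_assets,
}
print({k: round(v, 3) for k, v in ratios.items()})
```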
The second option is to use reserve funds or the sale of assets to raise money and fulfil the financial obligations. The third option is to raise funds from the debt or equity markets. The second and third options are perceived as distressed actions; the healthy option for maintaining liquidity is to generate sufficient cash flows from operations and investing activities. The current ratio is not a very appropriate measure of liquidity, as it cannot differentiate and present the dynamic nature of liquidity: it does not capture the rate at which liabilities fall due or the rate at which cash inflows are expected. Lemke (1970) proposed the projected liquidity flow index and the retrospective liquidity flow index for further study and analysis. For assessing the future of a firm, the projected liquidity flow index is a more desirable metric than the retrospective one. If a projected index is computed for internal purposes, it would seem a simple matter to include it, or the information requisite for its computation, in company annual reports. But firms usually do not report this information, and even if forecast information were reported, its reliability would be questionable, as it would be seriously open to management manipulation and to management's optimistic or pessimistic bias. Historical data is more reliable and accurate for computing a retrospective liquidity flow index, which, in the absence of forecast information, might be taken as an indication of a firm's short-term debt-paying capacity based on the latest accessible data.
Cash flow information can be very useful in (a) assessing the firm's ability to generate positive future net cash flows, (b) assessing the entity's ability to meet its liabilities and its needs for external financing, (c) assessing the reasons for differences between cash flows and income and (d) assessing the effects of investing and financing activities on the firm's financial position. While analysing the signals of failure in the Lehman Brothers bankruptcy case, Maux & Morin (2011) observed that Lehman Brothers had $7.286 billion in cash and cash equivalents on November 30, 2007. The analysis of its statement of cash flows revealed signals of major dysfunction in working capital management. Over a three-year period, Lehman Brothers generated net negative cash flows of $161.657 billion. The systematic payment of dividends despite sizeable cash deficits in operating activities, not to mention the financing of dividends through long-term loans, also points to dysfunctional cash management. Maux & Morin (2011) concluded in their paper that statements of
cash flows are highly informative and are thus of great value to investors and
analysts. I find that the arguments in the literature for using cash flow information have strong relevance in representing the liquidity or liquidity risk profile of a firm. This paper extends the study and recommendations proposed by Lemke (1970). The liquidity risk profile is about identifying the changing pattern of the liquidity flow index and assessing the risk linked to the changes; it represents the optimal operating limits for the liquidity position of the firm.
In the past, little research has been conducted on the time-series variability of the determinant variables. Ohlson (1980), Shumway (2001) and Hossari (2007) discussed multi-firm-year data analysis and observed changes in the determinant variables, but further research has not been carried out to test this empirically and to develop simpler models. Among all the models and research discussed above, the MDA model is one of the simplest to understand, apply and explain. Although more advanced models have been proposed by many others, they are more complex than MDA while their results are only comparable. If the MDA model can be enhanced with fewer assumptions and with determinants that improve the accuracy of results, while its simplicity is still maintained, it will be of more practical use. This research will maintain the simplicity of the MDA model but, rather than limiting the model to one-year accounting ratios, will apply some of the other determinants argued for in other models, such as market data and firm-year observations.
3 Research Methodology
To find the answers to the research questions posed here, the proposed method is objective in style. The twofold method of this research is first to find answers to the research questions from a sample of five failed banks and brokerage firms, and second to apply the observations and answers to another set of five UK based brokerage firms to validate the hypothesis and make predictions. This section explains the five layers of this research method: Research Philosophy, Research Approach, Research Strategies, Time Horizons and Data Collection.
The three views of research philosophy are positivism, interpretivism and realism (Saunders et al., 2003). The research method for this paper reflects the philosophies of positivism and interpretivism. In this research I assume that predictions can be made on the basis of previously observed and explained realities and their inter-relationships, which is a reflection of the positivist philosophy. This research method also assumes that business environments are complex in nature: generalisation of the formula for determining the liquidity risk profile for all companies may not be possible, and the observations made from the sample of companies taken for this research may not be applicable to all companies.
This research is primarily based on a deductive approach to study and test the hypotheses. It searches for an explanation of the causal relationship between the liquidity position of the banks and their failure. In the deductive approach, observations may provide the basis of explanation, permit the anticipation of phenomena, predict their occurrence and therefore allow them to be controlled (Hussey and Hussey, 1997). The research also adopts the inductive approach, though not as its primary approach, to understand the alternative theories or hypotheses that may be suggested after analysing the sample data.
The research strategy described here proposes the general plan of how to find the answers to the questions in section 1.2. Saunders et al. (2003) have defined eight different research strategies. The research objectives cannot be achieved by adopting just one of these eight strategies; at the same time, adopting all eight may not be beneficial. So, depending on the scope and time constraints of the research, a multi-method strategy needs to be followed, considering the appropriateness of each strategy for the research. This research follows the case study, longitudinal time horizon, and exploratory, descriptive and explanatory study strategies. Surveys are not expected to benefit this research, as the accuracy and quality of responses from banks and brokerage firms will not be reliable in this financial climate, given the risk of exposing them to regulatory supervision. Action research and experimental strategies are not adopted, due to the time constraints and limited access to the resources required for those kinds of strategies. The case study strategy is of particular interest, as this research involves the empirical investigation of the liquidity positions of failed banks and brokerage companies in a real-life context, using evidence from multiple sources. The longitudinal time horizon strategy will give insights into the warning signs of liquidity positions. A cross-sectional strategy cannot be achieved within the time limits of this research to cover the issue of warning signs of liquidity position across organisations and industries. The proposed strategy here is to analyse ten years of data for the five banking and brokerage firms. Enquiries to find answers to the research questions can be classified into exploratory, descriptive and explanatory studies. The exploratory study of similar research and literature is a valuable means of assessing what is happening. The descriptive study can help in determining the liquidity risk profiles of the five sampled UK based brokerage firms. The time horizon for this descriptive study is also ten years from December 2011.
The twofold research method requires two sets of samples. The first is a set of failed banks and brokerage firms: Lehman Brothers, Merrill Lynch, Northern Rock, Royal Bank of Scotland and MF Global. This sample mixes UK and US firms because much research has already been conducted on these cases, which will help in this research. The sample could have included more European and Asian firms as well, but that would have required a longer time to gather and analyse the data. The data considered for this research are all secondary data available to the public through the firms' annual reports. The analysis will be based on ten years of data prior to the declared bankruptcy or takeover of each company.
The second sample is a set of five brokerage firms based in the UK: Jarvis Securities Plc, Fiske plc, Collins Stewart Plc, Brewin Dolphin plc and Walker Crips plc. The prediction analysis for these firms considers five to ten years of financial reports. Only publicly listed firms are sampled, for reasons of availability and accessibility of the secondary data needed for the statistical analysis.
A limitation of this data sampling and collection is the frequency at which the data is published. Although the COMPUSTAT data tape would have been a more appropriate source of data for this research, that option was excluded due to the lack of access to the data at college and the high cost of privately sourcing it from Standard & Poor's. Since some of the firms in the sample went bankrupt in the past and some became public companies only in recent years, some of their historical data are not available. In such cases mean values are taken to fill the data gaps.
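The gap-filling step described above amounts to per-variable mean imputation; a minimal sketch (hypothetical series; the actual dataset layout is not reproduced here):

```python
def fill_gaps_with_mean(series):
    """Replace missing observations (None) with the mean of the available ones."""
    available = [v for v in series if v is not None]
    mean = sum(available) / len(available)
    return [mean if v is None else v for v in series]

# Hypothetical: a ratio observed for 2004-2008 with the 2006 value missing.
print(fill_gaps_with_mean([0.12, 0.15, None, 0.18, 0.11]))
```

Mean imputation keeps the series complete but flattens any trend in the missing years, a caveat worth keeping in mind when interpreting the results.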
To assess whether the model is still valid in the current context, 44 years after it was developed, I applied it to a secondary sample of three non-financial UK companies: ARM Holdings, Tesco Plc and Woolworths Group Plc. The results were consistent with expectations. ARM Holdings shows consistent improvement in its Z-score over the past five years. The Z-score for Tesco Plc deteriorated from 2007 to 2009 and then improved from 2009 to 2011. Woolworths Group Plc went into liquidation in 2008. So the Z-score indicator is consistent with the known facts about these companies.
Table: 1
These results suggest that the Z-score model is still valid and applicable for manufacturing and retail companies. When the model was applied to the primary sample in this study, however, the results were inconclusive: the Z-score was less than 0.50 for Lehman Brothers, Merrill Lynch and MF Global for the past ten years (Table 2).
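For reference, the calculations in this section rest on Altman's original five-ratio Z-score; a minimal sketch using the commonly quoted 1968 coefficients (illustrative inputs only, not figures from the sample):

```python
def altman_z(wc_ta, re_ta, ebit_ta, mve_tl, sales_ta):
    """Altman Z-score with the commonly quoted 1968 coefficients.
    Below 1.81 signals distress; above 2.99 signals the safe zone."""
    return 1.2 * wc_ta + 1.4 * re_ta + 3.3 * ebit_ta + 0.6 * mve_tl + 1.0 * sales_ta

# Hypothetical healthy retailer versus a hypothetical distressed firm.
print(round(altman_z(0.20, 0.30, 0.15, 1.50, 1.80), 2))   # comfortably above 2.99
print(round(altman_z(0.02, 0.01, -0.05, 0.07, 0.09), 2))  # well below 1.81
```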
Applying Altman's Z-score model, one can observe that Lehman had clearly been sending signals of financial distress since 2005, as its Z-scores were well below 1.81 (Altman's threshold of financial distress) for the three years preceding bankruptcy (Maux & Morin, 2011). Maux & Morin's conclusion was based on Z-score calculations for three years, 2005 to 2007, but they did not compare Lehman Brothers' Z-scores with those of the years prior to 2005, nor did they validate the applicability of the Z-score model for companies such as Lehman Brothers. As my results show, the Z-scores were below the 1.81 threshold even in the years before 2005. The conclusion from this would be that Lehman Brothers and Merrill Lynch were under financial distress and showing signs of failure throughout the past ten years. But that is not the case: during the late 1990s and early 2000s the performance of these banks was very stable. A comparison of the mean observations of the independent variables in Altman's (2000) research with my findings is presented in Table 3.
Variables X4 and X5 are observed to be very different for Lehman Brothers and Merrill Lynch from the means observed in Altman's (2000) study. Variable X4 measures how far a company's asset value can drop before its liabilities exceed its assets. The mean observations of X4 for Lehman and Merrill were 0.069 and 0.075; for investment banks, the market value of equity is insignificant compared to the total liabilities. Variable X5 shows the asset turnover ratio, or the sales-generating capability of a company's assets, and can vary very widely across industries.
Altman (2000) proposed an alternate Z-score model without the X5 variable for non-manufacturing and service industries, to avoid the industry effect of X5. The alternate Z1-score model is
Year -1          X1      X2       X3      X4      X5
Lehman Brothers  0.2339  0.0285   0.0087  0.0532  0.0854
Merrill Lynch    0.1827  0.0233  -0.0126  0.0434  0.0614
MF Global        0.0360  0.0001  -0.0019  0.0328  0.0551

Table: 3
A Z1 score of less than 1.21 indicates financial distress and a Z1 score of greater than 2.90 indicates stable financial performance; the grey area lies between 1.21 and 2.90. When this alternate Z1-score model was applied to the primary sample, the results (Table 4) were still inconclusive. The observed results suggest that neither the original Z-score model nor the alternate model is appropriate for assessing the financial distress condition of financial firms.
X1 through X9 are the nine explanatory variables of Ohlson's (1980) model: the log of total assets deflated by a price-level index; total liabilities to total assets; working capital to total assets; current liabilities to current assets; a dummy for total liabilities exceeding total assets; net income to total assets; funds provided by operations to total liabilities; a dummy for negative net income in the last two years; and the scaled change in net income.
The higher the O-score, the greater the probability of failure and financial distress. Ohlson observed that a cut-off value of 0.038 minimises the combined type I and type II errors. When I compared the O-score results for the primary sample companies, I observed contrary and inconclusive results (Table 5). All the observed O-scores are greater than the cut-off value of 0.038. Does that mean all these companies were on the verge of bankruptcy all these years?
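One point worth making explicit: Ohlson's model is a logit, so an O-score maps to a failure probability through the logistic function, and the 0.038 cut-off is usually stated on the probability scale. A small sketch of that mapping:

```python
import math

def o_score_probability(o_score):
    """Logistic transform of an O-score into a probability of failure."""
    return 1.0 / (1.0 + math.exp(-o_score))

# The 0.038 probability cut-off corresponds to an O-score of roughly -3.23.
cutoff_score = math.log(0.038 / (1.0 - 0.038))
print(round(cutoff_score, 2), round(o_score_probability(cutoff_score), 3))
```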
Financial services companies have very different financial structures from other industry segments and operate in a different liquidity environment. Appropriate data for bankruptcy prediction are, in some cases, difficult to obtain from publicly available sources (Ohlson, 1980); that is one of the reasons why Ohlson excluded utilities and financial services companies from his sample set. The model may be appropriate when applied to companies like those in the original sample set, but in my test on the out-of-sample companies it failed to show any predictive capability in detecting financial distress. Ohlson (1980), in his original research paper, mentioned that if adequate and accurate data could be gathered, then non-accounting data, such as information based on equity prices and changes in prices, might prove to be most useful.
4.3 Merton DD
The Merton DD model translates the value and volatility of a firm’s equity into an
implied probability of default by using nonlinear equations.
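In standard statements of the model (a hedged restatement; the dissertation's original equations are not reproduced in this extract), the distance to default and the implied default probability are:

```latex
DD = \frac{\ln(V/F) + \left(\mu - \tfrac{1}{2}\sigma_V^2\right)T}{\sigma_V \sqrt{T}},
\qquad
\pi_{\mathrm{Merton}} = N(-DD)
```

where N(.) is the cumulative standard normal distribution.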
V is the total value of the company; F is the face value of total debt; μ is the expected compounded return on V; σV is the volatility of the firm value; and T is the time horizon. Bharath & Shumway (2008) proposed a naive model retaining the functional form of Merton's original DD model; it was further simplified by Bharath & Shumway (2008) and Mansi et al. (2010), resulting in a simplified function as
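A hedged restatement of that simplified function, as published by Bharath & Shumway (2008), with E the market value of equity:

```latex
\text{naive } \sigma_V = \frac{E}{E+F}\,\sigma_E + \frac{F}{E+F}\left(0.05 + 0.25\,\sigma_E\right),
\qquad
\text{naive } DD = \frac{\ln\!\left[(E+F)/F\right] + \left(r_{it-1} - \tfrac{1}{2}\,\text{naive }\sigma_V^2\right)T}{\text{naive }\sigma_V \sqrt{T}}
```

where r_{it-1} is the firm's stock return over the previous year.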
σE is the equity volatility in percentage terms, calculated from the daily close prices of the stock for the respective financial year. Bharath & Shumway (2008) included the 25% of equity volatility to allow for the volatility associated with default risk, and added five percentage points to represent the term structure of volatility. The distance to default is measured in standard deviations. Assuming that changes in the firm's value are normally distributed, the probability of default is determined from the distance to default. This naive model, which retains the structure of the Merton DD model, is easy to compute; Bharath & Shumway (2008) argue that it captures approximately the same quantity of information as the Merton DD model. The probability estimate with this model is
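The estimate takes the form π_naive = N(-naive DD). A self-contained sketch of the whole naive calculation (hypothetical input figures; the helper names are mine):

```python
import math

def naive_dd(E, F, sigma_E, r_prev, T=1.0):
    """Naive distance to default in the spirit of Bharath & Shumway (2008)."""
    sigma_D = 0.05 + 0.25 * sigma_E                            # 5 points plus 25% of equity volatility
    sigma_V = E / (E + F) * sigma_E + F / (E + F) * sigma_D    # value-weighted firm volatility
    return (math.log((E + F) / F) + (r_prev - 0.5 * sigma_V ** 2) * T) / (sigma_V * math.sqrt(T))

def default_probability(dd):
    """pi = N(-DD), with N the standard normal CDF (via the error function)."""
    return 0.5 * (1.0 + math.erf(-dd / math.sqrt(2.0)))

# Hypothetical firm: equity worth 40, debt face value 60, 40% equity vol, 5% past-year return.
dd = naive_dd(E=40.0, F=60.0, sigma_E=0.40, r_prev=0.05)
print(round(dd, 3), round(default_probability(dd), 3))
```

The attraction of the naive form is visible here: every input is either a balance sheet item or a market observable, so no iterative solution of the Merton equations is needed.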
The fundamental assumption of the model is that if the firm's asset value falls below its total liabilities, the firm will default. The total asset value is the sum of the equity value at market price (market capitalisation) and the total liabilities. In my dataset, market capitalisation figures for RBS and Northern Rock were missing, so the total asset values for these firms were taken from the balance sheet. Total liabilities are also taken from the annual balance sheet. The natural logarithm of asset value over liabilities shows the rate of continuous fall in asset value that would lead to default. Since the firm's assets grow or fall over time, and the volatility of the value affects the expected growth or fall, this needs to be factored by the risk-free rate and the price volatility for the time horizon. Price volatility is calculated from 25 close prices randomly taken from the historical stock prices for the given financial year (note 1). No lag has been considered in this analysis for the model function, as was considered by Mansi et al. (2010), Bharath & Shumway (2008) and Outecheva (2007). The other variation taken in this analysis, compared to the standard Black-Scholes-Merton (BSM) model, is that the asset value is taken as the book value of debt plus the market value of equity. (Note 1: historical stock prices were sourced from the Yahoo Finance web site.) The
standard BSM model calculates the market value of assets based on the BSM European call option pricing model. The BSM model treats financial distress as a situation in which cash flow is not sufficient to meet financial obligations. The Merton DD model addresses distress events such as (a) a missed or delayed payment of interest or principal, (b) a company filing for bankruptcy, or (c) a distressed agreement with creditors. It therefore captures not only companies which default on their financial obligations and file for bankruptcy, but also cases where financially distressed companies avoid bankruptcy by negotiating with creditors out of court, and firms which are delisted from the stock exchange because of poor performance resulting from financial distress (Outecheva, 2007).
In my study I calculated the distance to default for the primary dataset firms, which were 1.932, 0.435, 0.160, -0.249 and 0.255 standard deviations away from the centre for Lehman Brothers, Merrill Lynch, MF Global, RBS and Northern Rock respectively (Table 6). The probabilities of default within a time horizon of one year were 3%, 33%, 44%, 60% and 40% respectively. Although this result accords with the conclusions drawn by Bharath & Shumway (2008), Mansi et al. (2010) and Merton (1974), in the real world, out of the 10,000 companies that trade on exchanges, only 600 go bankrupt - this is a 6% failure rate, so any model should do better than 94% accuracy (Haber, 2005). Only in the case of RBS was the estimated probability more than 50% (at 60%); for all other cases it was less than 50%. In the case of Lehman Brothers the distance to default was 1.932 standard deviations (a probability of 3%), in sharp contrast to the fact that Lehman Brothers filed for bankruptcy within one year of the point at which this dataset was taken.
I extended the distance to default calculation to the out-of-sample UK brokerage firms. All the UK brokerage firms in the sample were found to have distances to default
between 4.7σ and 21σ (Table 7). The inference is that all these firms have a virtually zero probability of default in the next year. One unusual observation was in the case of Jarvis Securities. At the end of financial year 2011, i.e. on 31 March 2011, the total number of diluted shares outstanding in the market was 10,764,606 and the stock was trading at 158p per share. The total market capitalisation as at 31 March 2011 was £17,008 thousand, whereas the total asset value on the balance sheet was £6,588 thousand. The equity value was much higher than the total asset value of the company; the market-to-book ratio was approximately 8.7 for shares, and the market-to-book ratio for enterprise value was 3.4 (note 2). At first glance I assumed the numbers were incorrect, but even after cross-verification I found the same results. (Note 2: outliers are ignored in calculating this ratio.)
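The market capitalisation figure can be cross-checked from the share count and price quoted above (a quick arithmetic check, not new data):

```python
shares_diluted = 10_764_606
price_gbp = 1.58                              # 158p per share on 31 March 2011
market_cap_thousands = shares_diluted * price_gbp / 1000.0
print(round(market_cap_thousands))            # matches the reported £17,008 thousand
```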
have not been put into common practical use because of their complexity and the difficulty of collecting the data. The simple naive model discussed in this section looks promising compared to the other models. The results presented above, and the results published by Bharath & Shumway (2008), show that the usefulness of the Merton DD probability is due to the functional form suggested by the Merton model: the iterative procedure used to solve the Merton model for default probability does not appear to add anything beyond the simple naive calculation. The results presented here cover only a small sample of firms, and I would recommend further studies covering a larger sample of firm-year, firm-quarter and firm-month data for UK brokerage firms to analyse extensively the effectiveness and accuracy of this naive Merton DD model.
The CHS score of Campbell et al. (2010) is computed as a linear combination of the variables described below.
Net income, book value of equity, total liabilities and cash and cash equivalents data were taken from the respective annual statements of the firms. All other data, on stock price, volatility and excess return, were sourced from Yahoo Finance. Standard deviations for the stock prices were calculated (Appendix A) from the daily close prices of the firms for the last quarter, taken from Yahoo Finance historical prices. Wherever historical prices were not available, the standard deviation was calculated from the high and low stock prices for the period.
Firm                    Year   NIMTA   TLMTA   EXRET  CASHMTA  SIGMA     MB   PRICE  CHS Score
Lehman Brothers         2007   0.006   0.949   0.129    0.010  0.450  1.588   4.137     -8.360
Merrill Lynch           2007  -0.008   0.958  -0.053    0.040  0.120  1.760   3.944     -7.261
MF Global               2011  -0.002   0.968  -0.020    0.000  0.040  0.931   2.114     -7.597
Royal Bank of Scotland  2008  -0.005   0.975   0.446    0.043  0.448  1.000  -0.705   -10.209
Northern Rock           2007  -0.002   0.975   0.125    0.002  0.240  1.624   4.787     -8.449

Campbell et al.'s (2010) published results:
Complete Sample Mean           0.000   0.445  -0.011    0.084  0.562  2.041   2.019
Bankrupt Group Mean           -0.040   0.763  -0.115    0.044  1.061  2.430   0.432
Failed Group Mean             -0.044   0.731  -0.105    0.072  1.167  2.104   0.277

Table 8
A comparison of the calculated values for the individual variables with Campbell et al.'s (2010) published results is given in Table 8. The mean CHS score published by Mansi et al. (2010) was -7.702, which indicated a low probability of default. These firms, with CHS scores between -7.2 and -10.2, fall into
the category of low probability of default when compared with the mean CHS score from Mansi et al.'s (2010) results. This categorisation is clearly incorrect: all the above firms went through high levels of financial distress in those years, and some of them collapsed into bankruptcy. When five years of data for Lehman Brothers are compared (Table 9), it emerges that the only variables changing were the market-linked ones. The cash and cash equivalents position deteriorated from 2004, i.e. four years prior to the bankruptcy. Volatility showed a sharp increase, from 4.9% to 45%, between 2006 and 2007.
continued history of losses or a consistent decline in market value would be a better
predictor of bankruptcy than one loss in a quarter or a financial year; or a sudden
stock price decline in a short period. In case of MF Global, there was continued loss
for last 4 years but the case of Lehman Brothers was different. There was continued
growth in equity value, asset value and profitability. At the time of bankruptcy filing
Lehman Brothers’ total asset value was higher than the total liabilities. The
bankruptcy filing was primarily due to shortage of liquidity or mismanagement of
cash flows. This issue of cash flow mismanagement is not captured in CHS scoring.
The ratio of cash and cash equivalents to market value of total assets is supposed to
capture the liquidity variable but in this case it does not sufficiently capture the
liquidity issues arising due to cash flow mismanagement.
Lehman Brothers   Year  NIMTA  TLMTA   EXRET   CASHMTA   SIGMA    MB   PRICE  CHS Score
                  2007   0.01   0.95  12.89%    0.01    45.00%  1.59   4.14      -8.36
                  2006   0.01   0.92   6.37%    0.01     4.90%  2.18   4.30      -8.50
                  2005   0.01   0.91  14.76%    0.01     5.86%  2.19   4.14      -9.09
                  2004   0.01   0.93   6.14%    0.01     3.52%  1.70   3.74      -8.47
                  2003   0.01   0.94  -8.50%    0.03     3.11%  1.63   3.59      -7.42
Table 9
As liquidity is all about meeting financial obligations as and when they come due, cash reserves and cash flows have direct relevance to maintaining liquidity. The current ratio has been almost venerated by accountants and other financial decision-makers as a prime criterion of liquidity (Lemke, 1970). The limitations of the current ratio have long been recognised, and Lemke's proposed cash flow index is an attempt to replace the current ratio as a measure for assessing liquidity. The desired liquidity position of a firm in real-world situations depends on internal process efficiencies, its financial policy, expansion and diversification plans, the risks associated with business operations and projects, and many other factors. The rates of cash flow from operating, investing and financing activities need to be managed and controlled to maintain the desired liquidity position. The variables that determine the rate of cash flow are:
8. Unexpended long-term cash and securities at beginning (ulcb): cash and securities balance at the beginning of the period that was held for capital or long-term debt payment but remained unpaid.
9. Cash for long-term expenditure committed at beginning (cle): total cash and securities held at the beginning of the period for contractual commitments.
10. Cash receipts (cr): total cash receipts for the period.
11. Cash receipts for long-term liabilities (crll): cash receipts during the period that were raised for capital projects or the discharge of long-term liabilities.
12. Cash and securities balance at end of period (cse): cash and securities balance required at the end of the period for the smooth operation of future business.
Some of these data points are not explicitly reported in company financial reports, and calculating them can be impractical for investors and creditors, who do not have access to companies' internal records. In order to simplify the data collection and calculation, I have taken simplified proxies for these variables, as described below.
In simple terms, the required rate of cash flow is the sum of the contractual commitments outstanding at the beginning of the period and the operational expenses incurred during the period. Operational expenses are taken as gross revenue from continuing and discontinued operations minus net profit after taxes. Here I have assumed that a financially healthy firm must clear all payables that are due at the beginning of the period and all payables that fall due within the period, and that payables falling due within the period are the same as the total operating cost for the period.
The practical rate of cash flow is the sum of the free cash available at the beginning of the period, the cash segregated for contractual commitments during the period, and the cash received from operations during the period. Cash inflow from operations is calculated somewhat differently here.
The numerator (the required rate of cash outflow) and the denominator (the practical rate of cash inflow) are both expected to be positive, so the cash flow index can range from 0 to ∞. A value close to 0 may indicate discontinued operations, and a value higher than 1 indicates a need for external financing to maintain the required liquidity level. A cash flow index above 1 can arise because:
a) cash flow from operations is not sufficient to meet the ongoing cost of business operations; or
b) the company is making capital expenditures for future growth and expansion, which require external sources of funding.
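A minimal sketch of the cash flow index computation described above. The variable names are illustrative simplifications; the text's full variable set (ulcb through cse) is richer than this, and the 0.1 cut-off for "close to 0" is mine.

```python
def cash_flow_index(commitments_begin, gross_revenue, net_profit_after_tax,
                    free_cash_begin, segregated_cash, cash_from_operations):
    """Simplified Lemke-style cash flow index as described in the text.
    Required outflow = contractual commitments at the beginning of the
    period + operating expenses, where operating expenses are assumed
    to equal gross revenue minus net profit after taxes."""
    operating_expenses = gross_revenue - net_profit_after_tax
    required_outflow = commitments_begin + operating_expenses
    practical_inflow = free_cash_begin + segregated_cash + cash_from_operations
    return required_outflow / practical_inflow

def interpret(cfi):
    """Reading of the index per the text: ~0 suggests discontinued
    operations, >1 suggests a need for external financing."""
    if cfi > 1:
        return "external financing needed to maintain liquidity"
    if cfi < 0.1:  # illustrative threshold for "close to 0"
        return "possible discontinued operation"
    return "operations broadly self-funding"
```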
Both of these situations mean additional risk to the business. When these risks go out of control, they create a liquidity shortage and lead to financial distress. Table 10 below shows the calculated index for Lehman Brothers for the eight years prior to its bankruptcy. The observations show a significant change in the cash flow index value from 2000 to 2007, and a very similar trend was observed for Merrill Lynch. Although these observations look very promising in suggesting that the trend of the cash flow index represents the liquidity risk profile and can be useful in predicting financial distress, proving its statistical significance would require more firm-year observations across banks and brokerage firms of different sizes. This analysis is limited to the primary data set taken for the study, and I recommend further studies with more extensive data.
The cash flow index for the secondary sample, the UK financial brokerage firms, suggests that many of the firms are at higher liquidity risk (Table 11). Among the comparable observations for the UK brokerage firms taken for study, Walker Crips Group's liquidity level is at the highest risk: although the cash flow index of Fiske plc is higher than that of Walker Crips, the standard deviation of Walker Crips's cash flow index is higher than that of Fiske plc. Jarvis Securities is the most stable among the sample firms from a liquidity risk perspective. The index for Fiske for 2007 is negative, which violates an assumption of this index model (the expected index value lies between 0 and ∞). Closer analysis of the cash flow statement showed a sharp increase in trade receivables from £6,518k to £22,123k, without any explanation of why they increased so much. A similar observation was made for Walker Crips for 2005.
Table 11
Maintaining the desired level of liquidity requires that cash outflows be matched by cash inflows on a day-by-day basis, over periods short enough to assure cash availability; any difference between outflows and inflows needs to be covered from liquid holdings. As this paper is based on the analysis of data taken from annual reports, daily, monthly and quarterly indexing has not been possible. A further study of cash flow indexing with daily, monthly, quarterly and annual cash flow data would be useful in extending knowledge of the effectiveness of this index.
Liquidity risk profiling of a firm has to be done with the firm's individual financial characteristics taken into consideration. Capturing those characteristics effectively requires discriminating firm-day and firm-month observations in order to gather sufficient data for accurate profiling.
The liquidity tolerances applicable to one firm may not be applicable to another firm in the same industry segment. In this section I discuss and explain four alternative functions for profiling and assessing liquidity risk. My proposed approach to liquidity risk profiling consists of three steps. The first step is determining the liquidity profile of the firm based on historical data; this historical data can be any ratio or score that has a statistically significant relationship with the liquidity position of the firm. For this paper I have taken the Lemke cfi as the predictive variable for profiling, but I recommend further research on the effectiveness of the Z-score, O-score, Merton DD and CHS score under this proposed liquidity profiling framework. The second step is determining the desired value of the predictive variable based on the profile determined in the first step. The third step is determining the tolerance limits for the firm based on the desired value of the predictive variable and the profiling. The liquidity risk profile of a company can be represented as
L = (V, m, l, h)

where L denotes the liquidity risk profile; V is the value representing the current position of the firm; m is the desired value of the predictive variable; and l and h represent the lower and upper levels of the tolerance limit.
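The representation L = (V, m, l, h) can be captured directly in code. The field names and the alert rule below follow the definitions in the text; the class itself is my own illustration.

```python
from dataclasses import dataclass

@dataclass
class LiquidityProfile:
    """L = (V, m, l, h); field names follow the text's definitions."""
    V: float  # value representing the firm's current position (e.g. cfi)
    m: float  # desired value of the predictive variable (barycentre)
    l: float  # lower tolerance limit
    h: float  # upper tolerance limit

    def alert(self) -> bool:
        """True when V breaches the tolerance band, which per the text
        should trigger a root-cause investigation."""
        return self.V < self.l or self.V > self.h
```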
Liquidity risk profiling is meant for easy and quick diagnosis of the liquidity position; it will not reveal the underlying reasons behind liquidity problems. Understanding the root cause of the risk symptoms may require a deep-dive investigation of the company's internal factors, industry factors and market environment factors. When the risk index moves above or below the upper or lower limits of its risk profile, this should trigger an alert for investigating the root cause of the problem.
Diagram 1. Scatter diagram (left) and fitted geometric shape (right) of Merrill Lynch's cfi, 1996-2008.
Recognising the right pattern in these data points is essential to profiling the firm accurately. In the context of liquidity profiling, determining the time window for the data points is also very important. To explain the concept in simple terms, and for the purposes of this paper, I have taken the time window as the five years from t − 1 to t − 5, where t is the year for which the desired index value is being calculated. The reasoning behind a five-year window is that, since the firm has not collapsed or gone bankrupt, the most recent five years of data represent the survival characteristics of the firm. The profile can also change from year to year; the firm's liquidity profile is not necessarily expected to follow the same geometric shape. Diagram 2, as an example, shows the changing pattern of Merrill Lynch's cfi over recent years.
Again, for the purposes of this paper and to keep the explanations simple, I have taken simple geometric shapes (linear, triangle, circle and polygon) as the liquidity profiles. Determining the linear profile from the scatter plot is simple once the mean and the two end points are known. The triangle profile is formed by taking the three extreme data points on the scatter plot covering the maximum area, and the circular profile is similarly formed from three extreme data points. The polygon profile is formed by simply connecting the data points. Diagram 3 shows the formation of these profiles.
Diagram 3. Formation of the liquidity profiles (2007).
The desired value for the liquidity index, or what I call the "liquidity barycentre", is determined from the shape of the profile. Just as, in physical objects, the equilibrium of the gravitational forces lies at the centre of gravity, the equilibrium of the liquidity profile lies at its liquidity barycentre. Assuming that the liquidity barycentre is the same as the centre of gravity of the geometric shape representing the liquidity profile, we can calculate the desired value as the centroid of that shape.
The liquidity barycentres for the different profiles are given in Table 12 below: for the triangle profile (B) the barycentre is the centroid of its three vertices, and for the polygon profile (C) it is a weighted average of its vertices.3 Considering cfi as the liquidity measure, the liquidity barycentre is calculated from the (year, cfi) data points of the chosen profile shape.

Table 12
The desired cfi values for Merrill Lynch and Lehman Brothers for these three profiles are shown in Table 13. The desired Lemke cfi values for all the sample firms are given in Appendix B.
3 Formulae in this table are presented for the calculation of the liquidity barycentre and may not exactly represent the centre of mass of the shape.
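Treating the barycentre as the plain centroid of the profile shape, the calculation can be sketched as follows. This is a simplification: Table 12's weighted-average formula for the polygon is not reproduced here, so the polygon function below uses an unweighted vertex average.

```python
def triangle_barycentre(p1, p2, p3):
    """Centroid of the three extreme (year, cfi) points of a triangle profile."""
    xs, ys = zip(p1, p2, p3)
    return (sum(xs) / 3, sum(ys) / 3)

def polygon_barycentre(points):
    """Plain vertex average for a polygon profile (the dissertation's
    weighted-average variant would attach weights to the vertices)."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)
```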
Lehman Brothers' observed cfi was higher than the desired cfi for all three years for which the desired value could be calculated. The case of Merrill Lynch was very similar: the observed cfi was higher than the desired cfi except in 2005. From these observed values it could be concluded that the liquidity positions of Lehman Brothers and Merrill Lynch were at risk in these years. It would be inappropriate, however, to conclude that any observed cfi value above the desired value means high risk, or that any value below it means low risk. To identify the high-risk zones, tolerance limits need to be added to the desired value; the next section explains how these are determined.
The concept of tolerance limits originated in the quality assurance processes of manufacturing shop floors: when a product or service crosses its tolerance limit, it may no longer be effective for its intended purpose. The same applies to liquidity management, whose purpose is to ensure the smooth execution of day-to-day business transactions. Under-liquidity, i.e. liquidity beyond the tolerance limits, is a condition of financial distress and undesirable for businesses, whereas over-liquidity is not an undesirable state from a liquidity risk management standpoint. The tolerance limit for under-liquidity is therefore more important than the tolerance limit for over-liquidity. The less the liquidity level varies from the desired state, the more financially stable the firm; the common standard for measuring such variation is the σ level. In order to test empirically for the most appropriate tolerance limit, I applied a trial-and-error method with different sigma levels on the failed firms. The trial-and-error results (Appendix C) suggest that the half-σ level is the most appropriate value for determining the tolerance limit. The profiling results (Appendix D) show that 2 out of 5 sampled UK brokerage firms have a higher level of risk in their liquidity positions.
Walker Crips has continued to be at higher risk for the past couple of years, showing trends similar to those of Lehman Brothers. This finding of liquidity issues at Walker Crips could be related to the sale of Walker Crips Asset Managers in April 2012; the news of the £12.3m sale is one more logical reason to trust the results of this analysis. The other firm showing indications of high risk is Fiske plc. Brewin Dolphin has improved its liquidity position since 2007, and there is no liquidity risk indication for Brewin Dolphin as of its last published financial reports. Collins Stewart's liquidity position could be profiled for only one year, as historical financial reports beyond 2005 could not be sourced; there is no risk indication for Collins Stewart in the results.
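The half-σ tolerance rule described above can be sketched as follows. This is a simplified reading: σ is taken as the sample standard deviation of the firm's historical cfi values, and the band is centred on the desired value.

```python
import math

def tolerance_band(desired, history, k=0.5):
    """Tolerance limits at desired +/- k*sigma, with k = 1/2 following
    the trial-and-error result reported in the text; sigma is the sample
    standard deviation of the firm's historical cfi values."""
    mean = sum(history) / len(history)
    sigma = math.sqrt(sum((x - mean) ** 2 for x in history) / (len(history) - 1))
    return desired - k * sigma, desired + k * sigma

def at_risk(observed, desired, history, k=0.5):
    """Flag an observation that falls outside the tolerance band."""
    lo, hi = tolerance_band(desired, history, k)
    return observed < lo or observed > hi
```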
5 Conclusion
The results indicate that the functional structure of the Merton DD model provides useful guidance for formulating default forecasting models. My findings confirm the conclusion of Mansi et al. (2010) that Altman's (1968) Z-score and Ohlson's (1980) O-score are highly ineffective in predicting financial distress; they cautioned researchers against using accounting-ratio-based models and recommended the CHS model for measuring distress risk. In my analysis, Z-scores and O-scores did not predict failure for the 5 primary sample firms that are known to have failed. Although Mansi et al. (2010) recommended the CHS model, the CHS score also did not predict the financial distress of the sampled firms in this study. Among all the models, variables and indexes tested here, I conclude that the Merton DD probability is a useful variable for forecasting default: it is the only model that predicted some probability of failure (3% to 60%) for all the firms known to have failed. This result supports the hypothesis that firm failures can be predicted from the firms' past performance records and behaviours. Although prediction based on the Merton DD probability was accurate for every sampled failure, I recommend further research covering a larger sample of firm-year, firm-quarter and firm-month data for UK brokerage firms to analyse the effectiveness and accuracy of the Merton DD model more extensively.
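For reference, the Merton DD probability referred to above is conventionally computed as follows. This is a textbook restatement after Merton (1974) and Bharath and Shumway (2008), not the exact estimation procedure used in this study, and the inputs shown in the test are illustrative.

```python
import math

def merton_pd(asset_value, debt_face, mu, sigma, horizon=1.0):
    """Merton distance to default (DD) and default probability N(-DD).
    asset_value: market value of the firm's assets; debt_face: face value
    of debt due at the horizon; mu: expected asset return; sigma: asset
    volatility; horizon: forecast horizon in years."""
    dd = (math.log(asset_value / debt_face)
          + (mu - 0.5 * sigma ** 2) * horizon) / (sigma * math.sqrt(horizon))
    pd = 0.5 * (1.0 - math.erf(dd / math.sqrt(2.0)))  # standard normal N(-dd)
    return dd, pd
```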
Within the framework of liquidity risk profiling, the Lemke cfi model fared well. As liquidity is about meeting financial obligations as and when they fall due, the rate of cash flow is a more appropriate measure of liquidity than traditional measures such as the current ratio, and a simple model based on Lemke's (1970) cash flow index principles proved more effective in measuring liquidity risk. The results also provide evidence that financially stable companies operate within an acceptable tolerance of the desired liquidity level. The liquidity profile and the tolerance levels are dynamic in nature and can be determined from the firm's past performance. This study included a very limited number of firms and firm-year observations for validating the model and predicting the outcome, but this limitation can be overcome in future research using COMPUSTAT data tapes. Although the original Z-score and O-score models did not prove effective in this study, it would be useful to study the effectiveness of the Z-score, O-score, Merton DD and CHS score under the suggested profiling framework.
6 References
1. AccountingWeb, 2010. SEC wants public companies to disclose more info about short-
term borrowing. [Online] Accounting Web Available at:
http://www.accountingweb.com/topic/accounting-auditing/sec-wants-public-companies-
disclose-more-info-about-short-term-borrowing [Accessed 08 December 2011].
2. Aharony, J., Jones, C.P. & Swary, I., 1980. An Analysis of Risk and Return
Characteristics of Corporate Bankruptcy Using Capital Market Data. The Journal of
Finance, XXXV(4), pp.1001-16.
3. Altman, E.I., 1968. Financial Ratios and Discriminant Analysis. The Journal of Finance,
XXIII(4), pp.589-609.
4. Altman, E.I., Haldeman, R.G. & Narayanan, P., 1977. ZETA Analysis: A new model to
identify bankruptcy risk of corporations. Journal of Banking and Finance, (1), pp.29-54.
5. Altman, E.I. & Loris, B., 1976. A Financial Early Warning System for Over-The-
Counter Broker-Dealers. Journal of Finance, XXXI, pp.1201-27.
6. Anon., 2002. The Economic Effects of 9/11: A Retrospective Assessment. Report for
Congress. Congressional Research Service.
7. Bagehot, W., 1873. Lombard Street: A Description of the Money Market. London.
8. Beaver, W., 1966. Financial Ratios as Predictors of Failure. Journal of Accounting
Research, Supplement to Volume 4.
9. Bellovary, J., Giacomino, D. & Akers, M., 2007. A Review of Bankruptcy Prediction
Studies: 1930 to Present. Journal of Financial Education, 33(Winter).
10. Bernhardsen, E., 2001. A Model of Bankruptcy Prediction. Working Paper. Oslo: Norges
Bank.
11. Bharath, S.T. & Shumway, T., 2008. Forecasting Default with the Merton Distance to
Default Model. Review of Financial Studies, 21(3), pp.1339-69.
12. BionicTurtle, n.d. Expected default frequency (EDF, PD) with Merton Model. [Online]
Available at:
http://www.youtube.com/watch?feature=endscreen&v=E4A0SOiasAM&NR=1
[Accessed 11 Jun 2012].
13. BIS, 2008. Principles for Sound Liquidity Risk Management and Supervision. Basel:
BIS Bank for International Settlements.
14. Campbell, J.Y., Hilscher, J. & Szilagyi, J., 2010. In Search of Distress Risk. Journal of
Finance, 63, pp.2899-939.
15. Charalambakis, E.C., Espenlaub, S.K. & Garrett, I., 2009. Assessing the probability of
financial distress of UK firms. In European Financial Management. Milan, 2009. The
European Financial Management Association.
16. Coats, P. & Fant, L., 1993. Recognizing Financial Distress Patterns Using a Neural Network Tool. Financial Management, 22(3), pp.142-56.
17. Cohen, H.R., 2010. SEC Shapes New Disclosure Requirements. [Online] Harvard
Education Available at: http://blogs.law.harvard.edu/corpgov/2010/11/23/sec-shapes-
new-disclosure-requirements/ [Accessed 08 December 2011].
18. Cox, D. & Cox, M., 2006. The Mathematics of Banking and Finance. John Wiley &
Sons Ltd.
19. Damodaran, A., 2012. Annual Returns on Stock, T.Bonds and T.Bills: 1928 - Current.
[Online] Available at:
http://pages.stern.nyu.edu/~adamodar/New_Home_Page/datafile/histret.html [Accessed
11 Jun 2012].
20. DeYoung, R., Flannery, M.J., Lang, W.W. & Sorescu, S.M., 2001. The Information Content of Bank Exam Ratings and Subordinated Debt Prices. Journal of Money, Credit & Banking, 33(4), pp.900-25.
21. Duttweiler, R., 2009. Managing Liquidity in Banks. London: John Wiley & Sons.
22. Yahoo Finance, 2012. Historical Stock Prices. [Online] Available at: http://uk.finance.yahoo.com/q/hp?s=RBS.L [Accessed 11 Jun 2012].
23. Gibson, B.N., 1998. Bankruptcy Prediction: The Hidden Impact of Derivatives. [Online]
Available at: http://www.trinity.edu/rjensen/acct5341/1998sp/gibson/bankrupt.htm
[Accessed 27 December 2011].
24. Goodhart, C., 2008. Liquidity and Money Market Operations. London: London School
of Economics.
25. Goodhart, C., 2008. Liquidity Risk Management. London School of Economics.
26. Haber, J.R., 2005. Assessing How Bankruptcy Prediction Models are Evaluated. Journal
of Business & Economics Research, 3(1), pp.87-92.
27. Haykin, S., 1999. Neural Networks: A Comprehensive Foundation. 2nd ed. New Jersey:
Prentice-Hall.
28. Henebry, K.L., 1996. Do Cash Flow Variables Improve the Predictive Accuracy of Cox
Proportional Hazards Model for Bank Failure? The Quarterly Review of Economics &
Finance, 36(3), pp.395-409.
29. Hossari, G., 2007. Benchmarking New Statistical Techniques in Ratio-Based Modelling
of Corporate Collapse. International Review of Business Research Papers, 3(3), pp.141-
61.
30. Jones, S. & Hensher, D.A., 2004. Predicting Firm Financial Distress: A Mixed Logit
Model. The Accounting Review, 79(4), pp.1011-38.
31. Jones, S. & Hensher, D.A., 2007. Optimizing the Performance of Mixed Logit Model.
Abacus, 43(3), pp.241-64.
32. Kalotay, E., 2007. Discussion of Hansher and Jones. Abacus, 43(3), pp.265-70.
33. Karacabey, A.A., 2007. Bank Failure Prediction Using Modified Minimum Deviation
Model. International Research Journal of Finance and Economics, (12), pp.147-59.
34. Lemke, K.W., 1970. The Evaluation of Liquidity: An Analytical Study. Journal of
Accounting Research, (Spring), pp.47-77.
35. Lo, A.W., 1985. Logit Versus Discriminant Analysis: A Specification Test. Dissertation.
University of Pennsylvania.
36. LSE, 2011. London Stock Exchange. [Online] Available at:
http://www.londonstockexchange.com [Accessed 02 Jan 2012].
37. Mansi, S.A., Maxwell, W.F. & Zhang, A., 2010. Bankruptcy Prediction Models and the
Cost of Debt. [Online] Available at: http://ssrn.com/abstract=1622407 [Accessed 27
December 2011].
38. Le Maux, J. & Morin, D., 2011. Lehman Brothers' inevitable bankruptcy splashed across its financial statements. International Journal of Business and Social Science, 2(20), pp.39-65.
39. Merton, R.C., 1974. On the Pricing of Corporate Debt: The Risk Structure of Interest
Rates. Journal of Finance, 29, pp.449-70.
40. Meyer, A. & Pifer, H.W., 1970. Prediction of Bank Failures. Journal of Finance,
(September).
41. Nikolaou, K., 2009. Liquidity (Risk) Concepts. ECB Working Papers, 1008.
42. Norris, F., 2010. Economix. [Online] Available at:
http://economix.blogs.nytimes.com/2010/09/17/window-dressing-24-7/ [Accessed 17
December 2011].
43. Ohlson, J.A., 1980. Financial Ratios and the Probabilistic Prediction of Bankruptcy.
Journal of Accounting Research, 18(1), pp.109-31.
44. Outecheva, N., 2007. Corporate Financial Distress: An Empirical Analysis of Distress Risk. PhD Dissertation. Bamberg: University of St Gallen.
45. Saunders, M., Lewis, P. & Thornhill, A., 2003. Research Methods for Business Students.
3rd ed. Harlow: Pearson Education.
46. Shumway, T., 2001. Forecasting bankruptcy more accurately: A simple hazard model.
Journal of Business, 74, pp.101-24.
47. Tam, K.Y. & Kiang, M.Y., 1992. Managerial Applications of Neural Networks: The Case
of Bank Failure Predictions. Management Science, 38(7), pp.926-47.
48. Turetsky, H. & McEven, R., 2001. An Empirical investigation of Firm Longevity.
Review of Quantitative Finance and Accounting, 16, pp.323-43.
49. Xiong, Z., 2009. Research on Financial Distress Prediction with Adaptive Genetic Fuzzy
Neural Networks on Listed Corporations of China. Network and System Sciences, 5,
pp.385-91.
50. Zavgren, C.V., 1985. Assessing the Vulnerability of Failure of American Industrial
Firms: A Logistic Analysis. Journal of Business Finance & Accounting, (Spring), pp.19-
45.
51. Zhang, G., Patuwo, B. & Hu, M., 1998. Forecasting with artificial neural networks: The state of the art. International Journal of Forecasting, 14, pp.35-62.
7 Appendices
Appendix A
The volatility of daily close prices, used for calculating the volatility in firm value.
Appendix B
Observed cfi values for the sample firms.4 Red-coloured cells indicate where the desired cfi is less than the observed cfi.
The results indicate a continued level of risk for Merrill Lynch and Lehman Brothers in recent years, whereas the risk assessment for the UK brokerage firms, based on the cfi values, does not indicate high levels of risk. Walker Crips is the only firm with a continued level of risk in recent years.
4 RBS and Northern Rock cfi values are not included here as their annual financial reports did not provide enough detail to calculate the cfi values.
Appendix C
A tolerance limit of σ/2 is effective in detecting the liquidity risks. For more conservative predictions, a tolerance limit of σ/3 can be taken.
Cells with a red background identify the cfi values (with tolerance limits) that are below the observed cfi level.
Appendix D
Liquidity risk profiles of the sampled UK brokerage firms.
The liquidity positions of Fiske and Walker Crips show higher risk compared with the other sampled brokerage firms.