Efficient Markets?
EMH expounds the view that all relevant information is already priced into a security
today and that the information from yesterday’s price has no impact on today’s price. This
means that prices in a time series are independent of each other and that they only change
with respect to changes in underlying fundamentals. Moreover, it asserts that it is not
possible to predict future returns, and that returns, both positive and negative (losses), are
normally distributed. These ideas have traditionally been contested by market participants
who felt that EMH contradicted the fact that they were smart enough to beat the market.
Despite this, EMH still became a central plank of derivative pricing. Other opponents have
included a small band of behavioural economists, who in particular have cautioned against
some of EMH’s wilder claims and the fact that it ignores human behaviour. For instance,
in October 1987 when the stock market crashed, an economist posed the question as to
whether ‘the present value of the US economy really fell more than 20 per cent on that
Monday and then rose over 9 per cent on Wednesday?’i Robert Shiller in his excellent book
Irrational Exuberance also attacked the ideas around efficient markets, arguing that
‘investors, their confidence and expectations buoyed by past price increases, bid up stock
prices further, thereby enticing more investors to do the same, so that the cycle repeats
again.’ii
Since the onset of the recent financial crisis, criticism of EMH has risen several
notches, particularly given that the dramatic changes in asset prices in 2008 suggest that
there was indeed a period of mispricing before the crisis rather than a sudden jump in
fundamentals within such a short space of time. Moreover, the idea that there was a credit
bubble leading up to the crisis is widely accepted, which is of course contrary to EMH and
rational expectations, which argue that asset bubbles cannot exist. Finally, the assumption
that returns are normally distributed meant that investors remained generally unconcerned about
Thomas Aubrey 2012
large unexpected events, since from a probabilistic perspective such events were so remote
they could be discounted. To a large extent, most of the criticisms of EMH have been made
against the simplest model of EMH, which assumes independence of observations and
normal distributions. However, it is worth pointing out that most recent testing of EMH has
been done on weaker versions of the random walk, in which the conditions of price
independence and identical distributions are relaxed.iii
The results are in general mixed, with no real consensus developing, with perhaps one
exception, as highlighted by the great economist Paul Samuelson. Samuelson believed that
markets are generally micro efficient but certainly not macro efficient.iv At the micro level,
the price of an individual stock broadly reflects the information relevant to that company. It
is only when insider information is disclosed that prices move, and they can move in either
direction. The fact that it is very hard for stock pickers to consistently beat the market
suggests that this is potentially a good proxy. However, the idea that EMH is useful at the
market level is clearly counter-intuitive given that there is a great deal of evidence that asset
bubbles do exist. Moreover, the investors who have been able to consistently beat the
market have been global macro investors, not stock pickers.
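The weak-form tests referred to above can be illustrated with a simple variance-ratio check in the spirit of Lo and MacKinlay: under a random walk, the variance of q-period returns should be q times the variance of one-period returns, so the ratio should hover near one. A minimal sketch on simulated data (the return series and parameter choices below are illustrative assumptions, not tests on real prices):

```python
import random
import statistics

def variance_ratio(returns, q):
    """Variance of q-period returns divided by q times the variance of
    one-period returns; close to 1 under a random walk."""
    q_returns = [sum(returns[i:i + q]) for i in range(len(returns) - q + 1)]
    return statistics.pvariance(q_returns) / (q * statistics.pvariance(returns))

random.seed(42)
# independent increments: the simplest random-walk model
rw_returns = [random.gauss(0, 1) for _ in range(5000)]
# positively autocorrelated (trending) returns for contrast
trend_returns = [random.gauss(0, 1)]
for _ in range(4999):
    trend_returns.append(0.6 * trend_returns[-1] + random.gauss(0, 1))

print(round(variance_ratio(rw_returns, 5), 2))     # near 1: consistent with a random walk
print(round(variance_ratio(trend_returns, 5), 2))  # well above 1: persistent series
```

A ratio well above one flags persistence of exactly the kind the weak-form random walk rules out.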
The problem with EMH at the macro level is that it makes the huge assumption that
investors are able to digest all the relevant information available to make sense of the
trajectory of an economy. Given the complexity of the system this is an almost impossible
task; hence most investors have been using assumptions filtered through the prevailing
intellectual consensus on the relationship between output and inflation and what it can tell
us about the future direction of these variables. If the relationship between output and
inflation did hold through time as Fisher,
Friedman and the new neoclassical synthesis have led us to believe, then EMH at the macro
level might be expected to hold, since investors would know what information they ought to
be looking at. If that information has been sending misleading signals to investors, then we
should expect to have seen periodic large unexpected losses, which we clearly did in 2000
and 2008. This also suggests that at the macro level data series are less likely to be
independent of each other, with yesterday’s price having more in common with today’s price
and tomorrow’s price; a process described as long term memory. This is in keeping with
Shiller’s analysis of human behaviour and of course explains asset bubbles too. Such an
approach to analysing macro data also suggests that the behaviour of markets changes
through time. It is perfectly possible for markets to display linear and random behaviour for a
time, generating a stream of data validating EMH; however, there are also likely to be
periods when markets display a non-linear and deterministic trajectory which refutes EMH.
Indeed, such a trajectory is more in line with the nature of complex, dynamical systems.
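One way to put a number on the long term memory described here is the Hurst exponent, which Mandelbrot himself popularised: values near 0.5 indicate no memory, values above 0.5 indicate persistence. A rough rescaled-range (R/S) sketch on simulated series (the data and window sizes are illustrative assumptions; R/S estimates are known to be biased upward in small samples):

```python
import math
import random

def rescaled_range(series):
    """R/S statistic for one window: range of the cumulative
    mean-adjusted series divided by its standard deviation."""
    n = len(series)
    mean = sum(series) / n
    cum, dev = 0.0, []
    for x in series:
        cum += x - mean
        dev.append(cum)
    s = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    return (max(dev) - min(dev)) / s

def hurst(series, window_sizes=(16, 32, 64, 128, 256)):
    """Estimate H as the log-log slope of average R/S against window size."""
    xs, ys = [], []
    for w in window_sizes:
        rs = [rescaled_range(series[i:i + w])
              for i in range(0, len(series) - w + 1, w)]  # non-overlapping windows
        xs.append(math.log(w))
        ys.append(math.log(sum(rs) / len(rs)))
    # ordinary least-squares slope
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)

random.seed(0)
iid = [random.gauss(0, 1) for _ in range(4096)]       # memoryless
persistent = [random.gauss(0, 1)]
for _ in range(4095):                                  # persistent series
    persistent.append(0.7 * persistent[-1] + random.gauss(0, 1))

print(round(hurst(iid), 2))         # lower value: little memory
print(round(hurst(persistent), 2))  # higher value: persistence
```

A persistent series scores visibly higher than a memoryless one, which is the statistical footprint of the trending behaviour discussed above.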
The implication that the streams of data through time at the macro level are constantly
changing has a bearing on the distribution of returns. When the streams of data are generated
from a simple, linear and random process then a normal distribution (Gaussian) is
appropriate. However, if the process is non-linear and displaying long term memory, then a
different distribution of returns can be expected. Any process with an extended long term
memory is likely to have an increasing probability of extreme values. Indeed, the idea that the
probability of heavy losses can increase and decrease through time depending on the extent of
long term memory is critical for investors to take note of. These ideas are, however, not new;
they were introduced into financial theory decades ago by the mathematician Benoit
Mandelbrot.
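The practical difference the distribution makes is easy to see by counting tail events. Under a normal distribution a four-standard-deviation daily loss is vanishingly rare; under a heavy-tailed alternative, here a Student-t with three degrees of freedom standing in purely as an illustrative assumption for a long-memory process, it is orders of magnitude more common:

```python
import math
import random

random.seed(7)
N = 1_000_000
DF = 3                                  # low degrees of freedom: heavy tails
SCALE = math.sqrt(DF / (DF - 2))        # rescale the t draws to unit variance

def student_t(df):
    """One Student-t draw: standard normal over sqrt(chi-squared / df)."""
    chi2 = sum(random.gauss(0, 1) ** 2 for _ in range(df))
    return random.gauss(0, 1) / math.sqrt(chi2 / df)

# count "4-sigma" losses in a million simulated daily returns
normal_hits = sum(1 for _ in range(N) if random.gauss(0, 1) < -4)
t_hits = sum(1 for _ in range(N) if student_t(DF) / SCALE < -4)

print(normal_hits)  # a handful per million under the Gaussian
print(t_hits)       # orders of magnitude more under heavy tails
```

An investor pricing risk off the first count while the second is the true data-generating process will treat a recurring hazard as a once-in-a-lifetime fluke.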
In the early 1960s Mandelbrot, in an analysis of cotton prices, found that values might
persist for some time and then suddenly change, and that there were also periods in which
sudden discontinuous changes occurred. This insight can be broadly applied to financial data,
whereby in an asset boom the way in which markets develop owes more to reflexive or
trend-following behaviour, in which tomorrow’s value is more related to today’s value. Given
that asset price bubbles are unsustainable, at some stage this leads to a sudden jump or fall.
This in turn induces a period of volatility where prices jump around violently. Mandelbrot
labeled these two effects Noah and Joseph.v The Joseph effect describes the upward
movement of an asset bubble or general trend, akin to the Biblical story of seven continuous
years of abundant harvests. The Noah effect describes sudden discontinuous changes, which
generally characterise a highly volatile period with a great deal of uncertainty, akin to God’s
attempt in Genesis to purify the world with the flood.
For an asset bubble to exist, the market would by definition be increasing along a
predictable trend line, which implies stored long term memory. This of course means that the
signals emanating from current monetary policy as deployed by central banks are not
efficient and therefore investors should stop using them. Interestingly, Jung and Shiller have
tested Samuelson’s dictum and found evidence to support it.vi Such an approach also means
that non-linear trajectories are far more likely to
explain an economy than a linear one. As John Kay argued in the FT, ‘economic systems are
typically dynamic and non-linear. This means that outcomes are likely to be very sensitive to
small changes in the parameters that determine their evolution. These systems are also
reflexive in the sense that beliefs about what will happen influence what does happen.’vii
Buiter also highlighted the reality of the non-linear relationships in an economy, which are
complex to model, and which resulted in the new neoclassical contingent stripping their
models of these non-linearities.viii
Modeling non-linear systems is hard in the natural sciences as well, since such systems
generally have multiple solutions, many of which are impossible to calculate. Linearising any
non-linearities eliminates any understanding of the dynamism of the non-linear system. Hence it is
interesting to note that the natural sciences have been moving away from attempting to model
complex systems towards looking at streams of data as being the most effective way of
understanding the nature of a complex, dynamical system. For example, the way in which a
drop of water drips from a tap highlights not only the challenge of modeling non-linear
systems but also the way scientists have fundamentally changed how they now approach
these kinds of problems. Science has increasingly moved away from attempting to construct
complex models towards analysing streams of data. This change came about in the 1970s
and was the result of the development of chaos theory. James Gleick’s book Chaos recounts
research by Robert Shaw, one of the pioneers of chaos theory, which is worth repeating. The
ability to predict when the drop might actually drip would require a very complicated model
of the physical processes. After identifying the main variables including the flow of water,
the viscosity of the fluid and the surface tension, the challenge comes with predicting the
behaviour of the process. As the drop fills with water oscillating in multiple directions, it
eventually reaches a critical point and snaps off. As Gleick describes it, a physicist trying to
model the drip problem completely – writing down sets of coupled non-linear partial
differential equations with appropriate boundary conditions and then trying to solve them –
would face an intractable task.ix
The solution was of course to abandon the modeling and analyse the output of streams
of data instead. This is no different to the handful of investors who were able to make sense
of the economy not by modeling it but by analysing streams of data instead. So how can the
collected and on-going data be used to
understand the behaviour of a system? Shaw in the late 1970s used his data from the
dripping tap to plot a graph of the relationship between each drop and its subsequent drop.
If the dripping was regular and predictable, the points would always fall on the same spot. If
it was random, dots would be scattered all over the graph. What Shaw found was that there
was a deterministic pattern to the dripping that resembled a chaotic attractor, including
elements of unpredictability. The challenge was then how to generate simple equations that
would reproduce this behaviour.
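Shaw’s graph is what is now called a return map: each value in a series plotted against the next. The construction can be sketched numerically; here the logistic map stands in for the drip data (an illustrative assumption, since the original measurements are not to hand), and a shuffled copy of the same values shows what genuine randomness would look like:

```python
import random

def logistic_series(n, r=3.9, x0=0.4):
    """Deterministic chaotic series from the logistic map."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

def return_map_scatter(series):
    """Plot-free version of Shaw's test: form the pairs (x_n, x_n+1),
    sort them by x_n, and measure how much x_n+1 jumps between
    neighbouring pairs.  A deterministic map traces a curve (small
    jumps); random data scatters across the whole square (large jumps)."""
    pairs = sorted(zip(series[:-1], series[1:]))
    jumps = [abs(pairs[i + 1][1] - pairs[i][1]) for i in range(len(pairs) - 1)]
    return sum(jumps) / len(jumps)

random.seed(1)
chaotic = logistic_series(2000)
shuffled = chaotic[:]          # same values, temporal order destroyed
random.shuffle(shuffled)

print(round(return_map_scatter(chaotic), 3))   # small: points lie on a curve
print(round(return_map_scatter(shuffled), 3))  # large: no structure
```

The same values produce a tight curve when their order is kept and a cloud when it is destroyed, which is precisely the structure Shaw was looking for in the drip intervals.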
Chaotic attractors were first discovered by the meteorologist Edward Lorenz back in
the early 1960s. Lorenz found that feeding what appeared to be the same data into a weather
prediction computer programme at different stages of a run led to wildly different outcomes.
This formed the
basis of the now well-known concept in chaos theory of the sensitivity to initial conditions or
the butterfly effect. The flapping wing represents a small change in the initial condition of the
system, which then causes a chain of events leading to large-scale phenomena. Had the
butterfly not flapped its wings the trajectory of the system might have been significantly
different. Such an approach however rejects the notion of any natural tendency towards
equilibrium. Indeed, as biologists have begun to study the brain in more detail, they have
found it is one of the most chaotic organs in existence. This has led the scientific community
away from the traditional view of equilibrium as being a natural state. According to one
leading neuroscientist, ‘when you reach an equilibrium in biology, you’re dead.’x Given that
an economy is a function of the decisions that millions of chaotic brains make, it should not
be surprising that an economy does not naturally tend towards an equilibrium state.
Moreover, the idea that an economy might follow a deterministic but unpredictable trajectory
is also highly plausible given the arguments of Shiller and the behavioural psychology
school.
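Lorenz’s sensitivity to initial conditions is straightforward to reproduce. The sketch below integrates his three equations, with the conventional parameters, from two starting points that differ by one part in a million; the crude Euler scheme and step size are illustrative choices, not a serious integrator:

```python
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One crude Euler step of the Lorenz equations."""
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def trajectory(start, steps):
    x, y, z = start
    path = [(x, y, z)]
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
        path.append((x, y, z))
    return path

def dist(p, q):
    return sum((u - v) ** 2 for u, v in zip(p, q)) ** 0.5

a = trajectory((1.0, 1.0, 1.0), 3000)
b = trajectory((1.000001, 1.0, 1.0), 3000)   # the 'butterfly': a one-in-a-million nudge

print(round(dist(a[100], b[100]), 8))                   # still tiny early on
print(round(max(dist(p, q) for p, q in zip(a, b)), 1))  # grows to the scale of the attractor
```

The two runs are indistinguishable at first and then diverge completely, which is why short term forecasts of such a system can work while medium term ones cannot.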
This begins to resemble our problem with economic modeling and to an extent
validates the Fed’s paper on forecasting and Lucas’ views on EMH. Given that there is
determinism in the system, short term forecasts in economics are still valid; however, any
medium term predictions become redundant due to the sensitivity of the system to any
changes in initial conditions. This suggests that investors need to stop modeling and focus on
the analysis of streams of data if they are to generate returns for their clients. This approach is
supported by Mandelbrot who stated that, ‘we cannot know everything. Physicists abandoned
that pipedream during the twentieth century after quantum theory and in a different way after
chaos theory. Instead they learned to think of the world in the second way as a black box. We
can see what goes into the box and what comes out but not what is happening inside.’xi
An economy does not function in a random way at all, but is rather the outcome of the
individual decisions made by the hundreds of millions of economic agents across the globe.
Research has highlighted that deterministic non-linear systems can generate apparently
random behaviour, which is why empirical studies of the random walk even at the macro
level can appear to hold,xii supporting the notion that observing inputs and outcomes is more
likely to be of use than trying to understand what is going on inside the system. In a recent
paper two economists highlight much of this thinking, posing the question that if ‘business
cycles are caused by non-linear dynamics, how appropriate a scientific approach is to base
our analysis on linear considerations?’xiii
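The point that a deterministic non-linear system can pass a linear test for randomness can be made concrete with the fully chaotic logistic map (an illustrative stand-in, not a model of any economy): its sample autocorrelations are indistinguishable from white noise, yet every value is an exact function of the one before:

```python
def autocorr(xs, lag):
    """Sample autocorrelation at a given lag."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    cov = sum((xs[i] - mean) * (xs[i + lag] - mean) for i in range(n - lag)) / n
    return cov / var

# fully chaotic logistic map: x_{n+1} = 4 x_n (1 - x_n)
xs = [0.3]
for _ in range(9999):
    xs.append(4.0 * xs[-1] * (1 - xs[-1]))

print([round(autocorr(xs, k), 3) for k in (1, 2, 3)])  # all near zero, like white noise

# and yet each value is an exact function of its predecessor
err = max(abs(xs[i + 1] - 4.0 * xs[i] * (1 - xs[i])) for i in range(9999))
print(err)  # 0.0: fully deterministic
```

A linear, correlation-based test would classify this series as pure noise, even though a one-step forecast of it can be made with zero error.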
Appendix A
The chart shows the relationship between the Wicksellian Differential, which
measures the distance from equilibrium in an economy, and the volatility of the US equity
market as measured by the VIX. The distance from equilibrium describes an increase in the
amount of credit being pumped into an economy, which in turn destabilises the economy.
The extent of the cumulative process of credit creation is also related to the subsequent
level of volatility.
The chart demonstrates how investors get misled by market signals, with volatility
jumping only when investors realise that the expansion has not been sustainable. Using the
Wicksellian Differential, or distance from equilibrium, and the extent of the cumulative
process, distributions of expected returns can be estimated every year. Thus in January 2007,
when volatility was at all-time lows, a credit-based disequilibrium approach would have
highlighted distributions with fat tails and extreme values, due to the cumulative process
from 2002 and the fact that the economy was at a
high level of disequilibrium. The jump in volatility that came in 2008 is purely a function of
this accumulated disequilibrium.
i R. Thaler (1998) ‘Giving Markets a Human Dimension’ in Mastering Finance (Financial Times) p.193
ii R. Shiller (2000) Irrational Exuberance (Princeton University Press) p.44
iii J. Campbell, A. Lo, C. MacKinlay (1997) The Econometrics of Financial Markets
iv www.bostonfed.org
v B. Mandelbrot and R. Hudson (2004) The (Mis)behaviour of Markets (Basic Books) pp.200-202
vi J. Jung and R. Shiller (2006) ‘Samuelson’s Dictum and the Stock Market’ www.econ.yale.edu
vii J. Kay (2011) ‘Economics: Rituals of Rigour’ www.ft.com
viii W. Buiter (2009) ‘The unfortunate uselessness of most state of the art academic monetary economics’ www.ft.com
ix J. Gleick (1988) Chaos: Making a New Science (Viking) p.263
x Gleick, Chaos, p.298, quoting Arnold Mandell
xi Mandelbrot and Hudson, The (Mis)behaviour of Markets, p.28
xii W. Barnett, R. Gallant, M. Hinich, M. Jensen, J. Jungeilges (1996) ‘Comparisons of the available tests for non-linearity and chaos’ in W. Barnett, G. Gandolfo, C. Hillinger (eds.)