On the alleged fractal nature of markets

Empirical evidence for the fractal model of markets offers less explanatory power than the linear models it is supposed to replace.

Published by Cha-am Jamal on May 21, 2010.

On the Fractal Nature of Stock Markets
Jamal Munshi, Sonoma State University, 1992
All rights reserved
Stock market data have thwarted decades of effort by mathematicians and statisticians to discover their hidden pattern. Simple time series analyses including AR, MA, ARMA, and ARIMA models were eventually replaced with more sophisticated instruments of torture such as spectral analysis. But the data refused to confess.

The failure to discover structure in price movements convinced many researchers that the movements were random. The so-called random walk hypothesis (RWH) of Osborne and others was developed into the efficient market hypothesis (EMH) by Eugene Fama. The weak form of the EMH says that movements in stock returns are random events independent of historical values. The rationale is that if patterns did exist, arbitrageurs would exploit them and thereby quickly eliminate them.

Both the RWH and the EMH came under immediate attack from market analysts, and the attack continues to this day, partly because the statistics used in tests of the EMH are controversial. The null hypothesis states that the market is efficient; the test then consists of presenting convincing evidence that it is not. The tests usually fail. Many argue that these failures represent a Type II error, that is, a failure to detect a real effect because of the low power of the statistical test employed.

Besides, the methods of analysis assume a normal and linear world that is difficult to defend. All residuals are assumed to be independent and normally distributed, all relationships are assumed to be linear, and all effects are assumed to be linearly additive with no interactions. At each point in time the data are assumed to be drawn from identically distributed independent populations of numbers whose other members are unobservable. Econometric models such as ARIMA assume that all dependencies in time are linear.

It is therefore logical to conjecture that the failure of statistics to reject the EMH is due not to the strength of the theory but to the weakness of the statistics.
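The weak-form claim that these tests target can be illustrated with a small simulation (a sketch, not part of the paper; the sample size and volatility are arbitrary choices): for independent returns, the lag-1 autocorrelation should sit near zero, and a persistent departure from zero is exactly what the linear tests try to detect.

```python
import random

# A sketch (not from the paper): under the weak-form EMH, returns are
# independent of their own history, so the lag-1 autocorrelation of a
# simulated i.i.d. return series should be close to zero.
random.seed(42)
returns = [random.gauss(0.0, 0.01) for _ in range(5000)]

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a sequence."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1))
    den = sum((v - mean) ** 2 for v in x)
    return num / den

rho = lag1_autocorr(returns)
print(f"lag-1 autocorrelation of i.i.d. returns: {rho:.4f}")
```

With 5,000 observations the standard error of this estimate is roughly 1/sqrt(5000), about 0.014, which illustrates the power problem discussed above: a true dependence smaller than that would routinely go undetected, a Type II error.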
Many hold that a different and more powerful mathematical device, one that allows for non-linearities, might be more successful in discovering the hidden structure of stock prices.

In the early seventies it appeared that Catastrophe Theory was just such a device. It had a seductive ability to model long bull market periods followed by catastrophic crashes. But it proved to be a mathematical artifact whose properties could not be generalized, and it yielded no secret structure or patterns in stock prices. The results of other non-EMH models such as the Rational Bubble theory and the Fads theory are equally unimpressive.

Many economists feel that the mathematics of time series implied by Chaos Theory is a promising alternative. If time series data are allowed to be non-linearly dependent, rather than independent as the EMH requires, or linearly dependent as the AR models require, then much of what appears to be erratic random behavior or "white noise" may be part of the deterministic response of the system. Certain non-linear dynamical systems of equations can generate time series data that appear remarkably similar to observed stock market data.

By using new mathematical techniques, hidden structures can be discovered in what appears to be a random time series. One technique, attributed to Lorenz, plots the data in phase space to detect patterns called strange attractors. Another method, proposed by Takens, uses an algorithm to determine the `correlation dimension' of the data. A low correlation dimension indicates a deterministic system; a high correlation dimension is indicative of randomness.

The correlation dimension technique has yielded mixed results with stock data. Halbert, Brock, and others, working with daily returns of IBM, concluded that the correlation dimension was sufficiently high to regard the time series as white noise. However, Scheinkman et al. claim that weekly IBM returns have a significant deterministic component. These structures may not be inconsistent with the EMH if the discovery of the structure, though providing insight to economic theorists, does not provide arbitrage opportunities.

A third technique for discovering structure in time series data has been described by Mandelbrot, Hurst, Feder, and most recently by Peters. Called `rescaled range analysis', or R/S, it is a test for randomness of a series not unlike the runs test.
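The correlation-dimension test mentioned above can be sketched in a few lines (a minimal Grassberger–Procaccia-style estimate; the two radii, the embedding dimension, and the logistic-map test series are illustrative choices, not from the paper):

```python
import math
import random

def delay_embed(x, m, tau=1):
    """Embed a scalar series as m-dimensional delay vectors (Takens)."""
    n = len(x) - (m - 1) * tau
    return [tuple(x[i + j * tau] for j in range(m)) for i in range(n)]

def correlation_integral(vectors, r):
    """Fraction of distinct pairs of embedded points closer than radius r."""
    n = len(vectors)
    close = sum(
        1
        for i in range(n)
        for j in range(i + 1, n)
        if math.dist(vectors[i], vectors[j]) < r
    )
    return 2.0 * close / (n * (n - 1))

def correlation_dimension(series, m, r_small=0.05, r_large=0.2):
    """Local slope of log C(r) versus log r approximates the dimension."""
    v = delay_embed(series, m)
    c1 = correlation_integral(v, r_small)
    c2 = correlation_integral(v, r_large)
    return (math.log(c2) - math.log(c1)) / (math.log(r_large) - math.log(r_small))

# A deterministic chaotic series (logistic map) versus i.i.d. noise.
x, logistic = 0.3, []
for _ in range(600):
    x = 4.0 * x * (1.0 - x)
    logistic.append(x)
logistic = logistic[100:]  # drop the transient

random.seed(1)
noise = [random.random() for _ in range(500)]

d_det = correlation_dimension(logistic, m=3)
d_rnd = correlation_dimension(noise, m=3)
print(f"logistic map: {d_det:.2f}, white noise: {d_rnd:.2f}")
```

A deterministic chaotic series collapses onto a low-dimensional attractor and scores near 1, while i.i.d. noise fills the embedding space and scores near the embedding dimension, mirroring the low-versus-high reading described above.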
The test rests on the relationship that in a truly random series, a serial selection of sub-samples without replacement should produce a random sampling distribution whose standard deviation is given by

sigmaXbar = (sigma / n^0.5) * [(N - n) / (N - 1)]^0.5

Here sigmaXbar is the standard deviation of the distribution of sample means obtained by drawing samples of size n without replacement from a population of size N, and sigma is the standard deviation of the population, i.e., when n = 1. However, when the time series has runs, it can be shown that the exponent of n in the term n^0.5 will differ from 0.5. The paper by Peters describes the following relationship:

R/S = N^H (Peters' equation 4)

where R is the range of the sub-sample sums, S is the standard deviation of the large sample, and N is the size of the sub-samples. The exponent H is called the Hurst constant and is equal to 0.5 if no runs exist and the data are sequenced randomly. If there is a tendency toward positive runs, that is, increases are more likely to be followed by increases and decreases by decreases, then H will be greater than 0.5 but less than 1.0. Values of H between 0 and 0.5 indicate negative runs, that is, increases are more likely to be followed by decreases and vice versa. Hurst and Mandelbrot have found that many natural phenomena previously thought to be random have H values around 0.7. Such values are indicative of serious departures from independence.

Once H is determined for a time series, the autocorrelation in the time series is computed as follows:

CN = 2^(2H - 1) - 1

CN is the correlation coefficient, and its magnitude indicates the degree to which the elements of the time series depend on historical values. The interpretation of this coefficient used by Peters to challenge the EMH is that it represents the percentage of the variation in the time series that can be explained by historical data. The weak form of the EMH would require this correlation to be zero, i.e., the observations are independent of each other. Therefore, any evidence of such a correlation can be interpreted to mean that the weak form does not hold.

Peters studied 463 monthly returns of the S&P 500 index, 30-year government T-bond returns, and the excess of stock returns over bond returns. Using R/S analysis, he found that these time series were not random but contained runs, or persistence, as evidenced by values of CN ranging from 16.8% to 24.5%. The correlation estimates indicate that a significant portion of the returns is determined by past returns. This finding appears to present a serious challenge to the efficient market hypothesis.

Peters obtained sequential sub-samples for eleven different values of N and computed R/S for each N. To estimate H he converted his equation 4 to linear form by taking logarithms,

log(R/S) = H * log(N)

and then used OLS linear regression of log(R/S) on log(N).
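The relation between H and CN can be checked directly; a minimal sketch (the H values are those Peters reports, 0.611 for stocks and 0.641 for bonds, and the small discrepancies from his published CN figures are rounding in H):

```python
def serial_correlation(h):
    """Peters' serial-correlation measure CN = 2^(2H - 1) - 1."""
    return 2 ** (2 * h - 1) - 1

print(serial_correlation(0.5))              # a true random walk: 0.0
print(round(serial_correlation(0.611), 3))  # near the 16.8% Peters reports for stocks
print(round(serial_correlation(0.641), 3))  # near the 21.5% Peters reports for bonds
```

Note that H = 0.5 maps exactly to CN = 0, the weak-form benchmark, so any estimate of H away from 0.5 translates directly into the nonzero correlation the EMH forbids.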
The slope of the regression is taken to be an unbiased estimate of H. The results are summarized in Table 1.

TABLE 1: Summary of Results Using Logarithmic Transformations

Returns    Regression Constant    H        Serial Correlation CN
Stocks     -0.103                 0.611    0.168
Bonds      -0.151                 0.641    0.215
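The full R/S procedure, sub-sample ranges followed by an OLS fit of log(R/S) against log(n), can be sketched as follows. This is an illustrative variant, not Peters' exact method: it uses contiguous sub-samples, rescales each range by its own sub-sample standard deviation rather than the large-sample S, and runs on a simulated i.i.d. Gaussian series rather than market returns.

```python
import math
import random

def rescaled_range(x):
    """R/S for one sub-sample: range of mean-adjusted cumulative sums over std dev."""
    n = len(x)
    mean = sum(x) / n
    cum, running = [], 0.0
    for v in x:
        running += v - mean
        cum.append(running)
    r = max(cum) - min(cum)
    s = math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    return r / s

def hurst_exponent(series, sizes):
    """OLS slope of log(R/S) on log(n), averaging R/S over contiguous sub-samples."""
    pts = []
    for n in sizes:
        k = len(series) // n
        rs = sum(rescaled_range(series[i * n:(i + 1) * n]) for i in range(k)) / k
        pts.append((math.log(n), math.log(rs)))
    mx = sum(px for px, _ in pts) / len(pts)
    my = sum(py for _, py in pts) / len(pts)
    num = sum((px - mx) * (py - my) for px, py in pts)
    den = sum((px - mx) ** 2 for px, _ in pts)
    return num / den

random.seed(7)
iid = [random.gauss(0, 1) for _ in range(4096)]
h = hurst_exponent(iid, [16, 32, 64, 128, 256, 512])
print(f"H for i.i.d. noise: {h:.3f}")  # near 0.5; small-sample bias pushes it slightly above
```

One caveat worth noting against the "unbiased estimate" claim above: for short sub-samples the expected R/S of even a purely random series exceeds N^0.5, so the fitted slope for i.i.d. data tends to land somewhat above 0.5, which is one reason finite-sample benchmarks matter when reading H values such as 0.611 as evidence of persistence.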
