
Vol. 28, No. 4
Control Theory & Applications
Apr. 2011

Article ID: 1000-8152(2011)04-0601-04

CLC number: TP273      Document code: A

Improvement of echo state network accuracy with Adaboost


HAN Min, MU Da-yun
(Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian Liaoning 116023, China)

Abstract: Modifying the prediction model of an individual echo state network (ESN) improves the overall prediction accuracy only to a limited extent. To overcome this limitation, we consider an ensemble of ESNs: the performance and prediction accuracy of each individual ESN are boosted with the Adaboost algorithm, and on this basis we develop an ensemble ESN predictor (ABESN). In this predictor the weights of the training samples are adjusted iteratively according to the fitting error; the larger the fitting error, the heavier the weight assigned to the sample, so that in the next iteration the ESN concentrates on the samples that are hard to learn. The prediction models of the individual ESNs are then combined by a weighted sum to form the final ensemble predictor. The model is tested on the benchmark Mackey-Glass time series and on the sunspot time series; simulation results demonstrate its high prediction accuracy and effectiveness.
Key words: ESN; Adaboost.RT algorithm; nonlinear time series; prediction

1 Introduction

Time-series prediction plays an important role in many engineering applications [1]. Neural networks are universal approximators [2] and have been widely used to model nonlinear time series; recurrent networks and RBF networks in particular have been applied to time-series prediction [3, 4]. In 2004, Herbert Jaeger et al. proposed the echo state network (ESN) [5], in which a large, sparsely and randomly connected reservoir replaces the hidden layer of a conventional recurrent network and only the output weights are trained, by simple linear regression. ESNs have since been applied to a variety of prediction problems [6-8].

However, modifying the prediction model of a single ESN improves the overall prediction accuracy only to a limited extent. In this paper the Adaboost algorithm is therefore combined with the ESN: the Adaboost.RT algorithm is used to boost the performance and prediction accuracy of each individual ESN, and the weighted combination of the individual prediction models forms the ABESN predictor. The model is verified on the benchmark Mackey-Glass time series and on the sunspot series, and its accuracy is compared with that of a single ESN.

Received 18 January 2010; revised 4 May 2010.
Supported by the National Natural Science Foundation of China (60674073), the National 863 Program (2007AA04Z158) and grant 2006BAB14B05.
2 Prediction model of echo state networks

The structure of the ESN is shown in Fig. 1. Its state and output equations are

  x(t + 1) = tanh(Wx x(t) + Win u(t)),
  y(t + 1) = W^T x(t),                                   (1)

where x(t), u(t) and y(t) are the internal state, the input and the output of the ESN at time t; Wx and Win are the internal (reservoir) weight matrix and the input weight matrix, which are generated randomly and stay fixed; W is the output weight matrix and is the only part that is trained. The activation function tanh is of sigmoid type.

Training an ESN consists of three steps:

Step 1  Generate a reservoir of N neurons with a sparse random internal weight matrix Wx, and rescale Wx so that its spectral radius is smaller than 1, which guarantees the echo state property.

Step 2  Drive the reservoir with the training input sequence and collect the internal states x(t).

Step 3  Compute the output weight matrix W by solving the linear least-squares problem

  min_W || A W - Yd ||,                                  (2)

with

  A  = [x(Init), x(Init + 1), ..., x(Trn)]^T,
  Yd = [yd(Init), yd(Init + 1), ..., yd(Trn)]^T,

where Init is the index at which state collection starts (the first Init - 1 states are discarded as washout), Trn is the index of the last training sample, and the number of samples actually used is Tn = Trn - Init + 1.

Fig. 1  Structure of ESN
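To make the model of Eqs.(1)-(2) concrete, the following minimal Python sketch builds a sparse random reservoir, rescales its spectral radius and trains the readout by least squares. It is an illustration only: the class name, the default reservoir parameters (100 neurons, spectral radius 0.8, about 5% connectivity) and the optional per-sample weights are assumptions of this sketch rather than values taken from the paper; the sample_weight argument simply anticipates the weighted training used by ABESN in Section 3.

```python
import numpy as np

class ESN:
    """Minimal echo state network: state update of Eq.(1), readout trained via Eq.(2)."""

    def __init__(self, n_in, n_res=100, spectral_radius=0.8, seed=0):
        rng = np.random.default_rng(seed)
        self.Win = rng.uniform(-0.1, 0.1, size=(n_res, n_in))   # input weights (fixed)
        Wx = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
        Wx[rng.random((n_res, n_res)) > 0.05] = 0.0              # keep ~5% of connections
        rho = max(abs(np.linalg.eigvals(Wx)))
        self.Wx = Wx * (spectral_radius / rho)                   # spectral radius < 1
        self.W = None                                            # output weights (trained)
        self.n_res = n_res

    def _states(self, U):
        """Run Eq.(1): x(t+1) = tanh(Wx x(t) + Win u(t)) over an input sequence U of shape (T, n_in)."""
        x = np.zeros(self.n_res)
        X = np.zeros((len(U), self.n_res))
        for t, u in enumerate(np.asarray(U)):
            x = np.tanh(self.Wx @ x + self.Win @ u)
            X[t] = x
        return X

    def fit(self, U, yd, init=50, sample_weight=None):
        """Solve min ||A W - Yd|| (Eq.(2)) by (optionally weighted) least squares,
        discarding the first `init` states as washout."""
        A, Yd = self._states(U)[init:], np.asarray(yd)[init:]
        if sample_weight is not None:
            w = np.sqrt(np.asarray(sample_weight)[init:])
            A, Yd = A * w[:, None], Yd * w
        self.W, *_ = np.linalg.lstsq(A, Yd, rcond=None)
        return self

    def predict(self, U):
        """Readout of Eq.(1): y = W^T x."""
        return self._states(U) @ self.W
```

The washout argument plays the role of Init in Eq.(2): the first states are discarded before the readout is solved.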
3 The ABESN prediction model

Adaboost was proposed by Freund and Schapire [9] for classification problems; Adaboost.RT [10] extends the boosting idea to regression by introducing a relative-error threshold φ. Combining several differently trained predictors also increases the diversity of the ensemble, which is known to improve generalization [12]. In this paper Adaboost.RT is used to build an ensemble of ESNs, the Adaboost-ESN predictor (ABESN).

3.1 Adaboost.RT algorithm

The Adaboost.RT algorithm proceeds as follows:

1) Input: m training samples (x1, y1), ..., (xm, ym) with y ∈ R, and a relative-error threshold φ (0 < φ < 1).

2) Initialization: set t = 1, Dt(i) = 1/m for i = 1, ..., m, and the error rate εt = 0.

3) Iteration: train a weak learner on the samples drawn according to the distribution Dt and obtain the regression model ft(x); compute for every sample the absolute relative error

  AREt(i) = | (ft(xi) - yi) / yi |,                      (3)

the error rate of ft(x)

  εt = Σ_{i: AREt(i) > φ} Dt(i),                         (4)

and βt = εt^n (n = 1, 2, 3). Update the distribution

  Dt+1(i) = Dt(i)/Zt × βt,   if AREt(i) ≤ φ,
  Dt+1(i) = Dt(i)/Zt,        otherwise,                  (5)

where Zt is a normalization factor, and set t = t + 1.

4) Output: after T iterations the final regression model is the weighted combination

  ffin(x) = Σ_t [lg(1/βt)] ft(x) / Σ_t lg(1/βt).         (6)
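The iteration of Eqs.(3)-(6) can be sketched in Python as follows for any weak regression learner exposing a fit/predict interface. This is only a sketch: the handling of the degenerate case εt = 0 and the default values (φ = 0.3, T = 20, n = 2) are assumptions, not prescriptions of the Adaboost.RT papers [10, 11], and Eq.(3) presumes the targets are non-zero.

```python
import numpy as np

def adaboost_rt(make_learner, X, y, phi=0.3, T=20, n=2):
    """Adaboost.RT, Eqs.(3)-(6): returns the combined regressor f_fin."""
    m = len(y)
    D = np.full(m, 1.0 / m)                      # initial distribution D_1(i) = 1/m
    learners, betas = [], []
    for _ in range(T):
        f = make_learner().fit(X, y, sample_weight=D)
        are = np.abs((f.predict(X) - y) / y)     # absolute relative error, Eq.(3)
        eps = D[are > phi].sum()                 # error rate, Eq.(4)
        if eps == 0:                             # every sample within phi: keep f and stop
            learners.append(f); betas.append(1e-10)
            break
        beta = eps ** n                          # beta_t = eps_t^n, n in {1, 2, 3}
        D = np.where(are <= phi, D * beta, D)    # down-weight well-fitted samples, Eq.(5)
        D /= D.sum()                             # normalization factor Z_t
        learners.append(f); betas.append(beta)

    w = np.log10(1.0 / np.array(betas))          # lg(1 / beta_t)
    w /= w.sum()

    def f_fin(Xq):                               # Eq.(6): weighted sum of weak models
        return sum(wi * f.predict(Xq) for wi, f in zip(w, learners))
    return f_fin
```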
3.2 Adaboost ESN prediction model (ABESN)

In ABESN the ESN of Section 2 serves as the weak learner of Adaboost.RT. In each iteration an individual ESN is trained on the training samples weighted by the current distribution Dt; its absolute relative errors AREt(i) are computed and the distribution is updated by Eq.(5), so that samples with large fitting errors receive heavier weights and the next ESN concentrates on the samples that are hard to learn. The individual ESN prediction models are finally weighted and summed to form the ensemble predictor ABESN. The procedure is summarized as follows:

Input: training samples (x1, y1), ..., (xm, ym), y ∈ R.
Initialization: Dt(i) = 1/m.
For t = 1, ..., T:
  train an ESN on the samples weighted by Dt(i);
  obtain the ESN prediction model ft(x);
  compute AREt(i) = |(ft(xi) - yi)/yi|;
  compute the error rate of ft(x): εt = Σ_{i: AREt(i) > φ} Dt(i);
  set βt = εt^n;
  update Dt+1(i) according to Eq.(5);
End loop;
combine the models by Eq.(6) to obtain the final ABESN predictor ffin(x).

The prediction accuracy is evaluated by the root-mean-square error

  Ermse = [ 1/(N - 1) Σ_{k=1}^{N} ( ŷ(k) - y(k) )^2 ]^{1/2},        (7)

where y(k) is the true value, ŷ(k) the predicted value and N the number of test samples.
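As a rough illustration of how the pieces fit together, the sketch below feeds the ESN of Section 2 into the boosting loop above and scores the result with the Ermse of Eq.(7). The one-step-ahead windowing, the helper names and every numeric value are assumptions of this sketch; ESN and adaboost_rt refer to the earlier code sketches.

```python
import numpy as np

def rmse(y_pred, y_true):
    """Root-mean-square error of Eq.(7)."""
    y_pred, y_true = np.asarray(y_pred), np.asarray(y_true)
    return np.sqrt(np.sum((y_pred - y_true) ** 2) / (len(y_true) - 1))

def make_pairs(series):
    """One-step-ahead pairs: the input u(t) is the current value, the target is the next one."""
    s = np.asarray(series, dtype=float)
    return s[:-1].reshape(-1, 1), s[1:]

# Illustrative use (train_series / test_series are placeholders):
# U, yd = make_pairs(train_series)
# abesn = adaboost_rt(lambda: ESN(n_in=1, n_res=100), U, yd, phi=0.4, T=20)
# Ut, yt = make_pairs(test_series)
# print("Ermse =", rmse(abesn(Ut), yt))
```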
4 Simulations

The ABESN model is applied to two nonlinear time-series prediction problems, the annual sunspot series and the Mackey-Glass chaotic series, and its accuracy is compared with that of a single ESN and of other published models.

4.1 Simulation of sunspot data

ABESN is first applied to the annual sunspot numbers from 1700 to 2003. A reservoir of 100 neurons is used, the Adaboost.RT threshold is set to φ = 0.4, and 20 boosting iterations are performed; 250 samples are predicted. The prediction results are shown in Fig. 2.

Fig. 2  Sunspot predictive results using ABESN

Table 1 compares the prediction error of ABESN with that of a single ESN [5] and of the support vector echo state machine (SVESM) [8]. The Ermse of the single ESN is 18.5573, which ABESN reduces to 12.0353, smaller also than the 15.4040 of SVESM; ABESN therefore gives the highest accuracy of the three models.

Table 1  Comparison of different predictive models

  Model    ABESN      ESN [5]    SVESM [8]
  Ermse    12.0353    18.5573    15.4040

4.2 Prediction of Mackey-Glass time series

ABESN is then applied to the benchmark Mackey-Glass chaotic time series. A series of 1000 samples is generated, of which 800 are used for training; the Adaboost.RT threshold is set to φ = 0.3, and 20 boosting iterations are performed. Fig. 3 shows the ABESN prediction of the Mackey-Glass series.

Fig. 3  Mackey-Glass predictive results using ABESN
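For readers who want to reproduce the benchmark, the Mackey-Glass series is commonly generated from the delay differential equation dx/dt = a x(t - τ) / (1 + x(t - τ)^10) - b x(t). The paper does not list the parameters it used, so the usual choices in the sketch below (a = 0.2, b = 0.1, τ = 17, Euler integration with step 0.1, constant initial history 1.2) are assumptions.

```python
import numpy as np

def mackey_glass(n_samples=1000, tau=17, a=0.2, b=0.1, dt=0.1, x0=1.2):
    """Euler integration of the Mackey-Glass equation, sampled once per unit time."""
    per = int(round(1.0 / dt))                   # integration steps per output sample
    delay = int(round(tau / dt))
    n_steps = n_samples * per + delay
    x = np.empty(n_steps)
    x[:delay] = x0                               # constant initial history
    for k in range(delay, n_steps):
        xd = x[k - delay]
        x[k] = x[k - 1] + dt * (a * xd / (1.0 + xd ** 10) - b * x[k - 1])
    return x[delay::per][:n_samples]

# e.g. series = mackey_glass(1000); the first 800 samples for training, the rest for testing
```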
Table 2 compares the prediction errors of different models on the Mackey-Glass series. The ESNs in [5] reach a smaller error (0.00063) but use considerably more data (1000 and 3000 samples), whereas ABESN needs only 300 and 800 samples to reach an Ermse of 0.0022. The SVESM of [8], with 1200 and 2200 samples, reaches 0.0085; a SOM-based model reaches 0.035 with 1001 samples, and MLPs reach 0.060 with 500 samples. ABESN is therefore clearly more accurate than SVESM, the SOM-based model and the MLPs.

Table 2  Error comparison of different models

  Model        Samples        Ermse
  ABESN        300, 800       0.00220
  ESNs [5]     1000, 3000     0.00063
  SVESM [8]    1200, 2200     0.00850
  SOM          1001           0.03500
  MLPs         500            0.06000

5 Conclusions

This paper has presented the ABESN prediction model, which combines the Adaboost.RT algorithm with the echo state network for nonlinear time-series prediction, and has verified it on the Mackey-Glass and sunspot time series. The main conclusions are:
1) By adjusting the weights of the training samples according to the fitting error, Adaboost.RT makes each successive ESN concentrate on the samples that are hard to learn, which improves the performance of the individual ESNs;
2) The weighted combination of the individual ESN models forms an ensemble predictor that, in the simulations on the Mackey-Glass and sunspot series, achieves higher prediction accuracy than a single ESN and is effective for nonlinear time-series prediction.

References:
[1] YANG Shuzi, WU Ya. Time Series Analysis in Engineering Application [M]. Wuhan: Huazhong University of Science and Technology Press, 1994.
[2] HORNIK K, STINCHCOMBE M, WHITE H. Multilayer feedforward networks are universal approximators [J]. Neural Networks, 1989, 2(3): 359 - 366.
[3] CAI X D. Time series prediction with recurrent neural networks trained by a hybrid PSO-EA algorithm [J]. Neurocomputing, 2007, 70(13/15): 2342 - 2353.
[4] ZHANG Dongqing, NING Xuanxi, LIU Xueni. On-line prediction of nonlinear time series using RBF neural networks [J]. Control Theory & Applications, 2009, 26(2): 151 - 155.
[5] JAEGER H, HAAS H. Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication [J]. Science, 2004, 304(5667): 78 - 80.
[6] LIN X W, YANG Z H, SONG Y X. Short-term stock price prediction based on echo state networks [J]. Expert Systems with Applications, 2009, 36(3): 7313 - 7317.
[7] GE Q, WEI C J. Multiresolution-based echo state network and its application to the long-term prediction of network traffic [C] // 2008 International Symposium on Computational Intelligence and Design. Piscataway, NJ: IEEE, 2008, 1: 469 - 472.
[8] SHI Z W, HAN M. Support vector echo state machine for chaotic time series prediction [J]. IEEE Transactions on Neural Networks, 2007, 18(2): 359 - 372.
[9] FREUND Y, SCHAPIRE R E. A decision-theoretic generalization of on-line learning and an application to boosting [J]. Journal of Computer and System Sciences, 1997, 55(1): 119 - 139.
[10] SOLOMATINE D P, SHRESTHA D L. AdaBoost.RT: a boosting algorithm for regression problems [C] // 2004 IEEE International Joint Conference on Neural Networks. New York: IEEE, 2004, 2: 1163 - 1168.
[11] SHRESTHA D L, SOLOMATINE D P. Experiments with AdaBoost.RT, an improved boosting scheme for regression [J]. Neural Computation, 2006, 18(7): 1678 - 1710.
[12] BROWN G, WYATT J, HARRIS R, et al. Diversity creation methods: a survey and categorization [J]. Information Fusion, 2005, 6(1): 5 - 20.

About the authors:
HAN Min (1959-), professor and doctoral supervisor; her research interests include neural networks and 3S technologies. E-mail: minhan@dlut.edu.cn;
MU Da-yun (1985-). E-mail: dymu@mail.dlut.edu.cn.
