frequently used back propagation model [15-16]) have a set of problems:
a) Dependence on initial parameters.
b) Long training time.

Fig. 1. Hourly load curves for a sample week (recovered legend: Monday-Friday; axes: Load (MW) vs. time).
local utility. The simulation results show that the proposed CGA-based method provides a greater degree of accuracy in many cases for the STLF problem compared to the BP method. Moreover, it gives a lower Mean Absolute Percentage Error (MAPE). Thus, it can be applied to automatically design an optimal load forecaster based on historical data.

II. LOAD DATA ANALYSIS

The available data for this research are the total hourly actual loads of a local utility in Iran for the years 2000 to 2003. In order to use these data in a meaningful and logical manner, first of all they should be closely analyzed and their dynamics clearly understood. Then they can be clustered into smaller sets according to some common characteristics, and separate models can be built for each cluster. This is necessary because it has always been emphasized in the literature that it is impossible to reflect every different type of load behavior with a single model. Fig. 1 shows hourly load curves for a sample week. It is seen that the four working days (Sunday-Wednesday) have very similar patterns and that Saturday (the first working day of the week in the Iranian calendar) is slightly lower.

The correlation of two load series is computed with sums of squares such as

S_{yy} = \sum_{i=1}^{n} (y_i - \bar{y})^2

where x and y represent the data pairs, \bar{x} and \bar{y} are the mean values calculated over the samples, and n is the number of samples.

Table 1 summarizes the correlations of the daily electric load consumptions in year 2003. Column (D+i) represents the ith day after the day D given row-wise.

TABLE 1
DAILY LOAD CORRELATIONS IN YEAR 2003

Day       | D+1    | D+2    | D+3    | D+4    | D+5    | D+6    | D+7    | D+14
Saturday  | 0.9879 | 0.9904 | 0.9821 | 0.9750 | 0.9636 | 0.9411 | 0.9905 | 0.9908
Sunday    | 0.9937 | 0.9895 | 0.9858 | 0.9802 | 0.9633 | 0.9879 | 0.9937 | -
Monday    | 0.9953 | 0.9930 | 0.9845 | 0.9612 | 0.9904 | 0.9937 | 0.9954 | -
Tuesday   | 0.9950 | 0.9876 | 0.9620 | 0.9821 | 0.9895 | 0.9953 | 0.9902 | -
Wednesday | 0.9852 | 0.9652 | 0.9750 | 0.9858 | 0.9930 | 0.9950 | 0.9900 | -
Thursday  | 0.9793 | 0.9636 | 0.9802 | 0.9845 | 0.9876 | 0.9852 | 0.9893 | 0.9872
Friday    | 0.9411 | 0.9633 | 0.9612 | 0.9620 | 0.9652 | 0.9793 | 0.9773 | 0.9575

From Table 1 it can be seen that the weekdays are highly correlated with each other, but Thursday and Friday have lower correlations with each other and with the weekdays.
International Journal of Intelligent Systems and Technologies 4:1 2009
Saturday is the day which has the lowest correlations with the other weekdays [1].

Fig. 2. Daily load curves for a typical day for each season (recovered legend: Spring, Summer, Fall, Winter; axes: Load (MW) vs. Time (Hour)).

It should be noted that it would be better to distinguish between the seasons by using different ANN modules. Accordingly, the training would be easier and there is a chance of obtaining better results. Thus, four ANN modules, for summer, winter, spring and fall, are used in the STLF problem. In the training process, every network is supplied only with data from that particular season.

The four seasonal networks have the same architecture, which is a three-layer feed-forward neural network. For every season, the number of neurons in the input and output layers is already fixed, based on the input and output data chosen, but the number of neurons in the hidden layer is different and is obtained here using the GA-based method. Two weather factors are used in this architecture, since the area of the forecasted load is a normal climate area. A block diagram of the proposed ANN architecture is shown in Fig. 3.

Fig. 3. Block diagram of the proposed ANN architecture (recovered labels: inputs (120), input layer, hidden layer, output layer, load forecast (25)).

There are a total of 120 neurons in the input layer. The details of the ANN input variables for the groups Sunday through
Wednesday, Thursday, Friday and Saturday are given as follows. The variables are selected according to the discussions in Sec. II and by trial and error.

A. The ANN input variables for the group Sunday through Wednesday:
1. The first 24 input neurons represent hourly scaled load values of the previous day.
2. The next 48 neurons are used to capture the effect of temperature: hourly temperature data from the previous day and hourly temperature data for the day of forecast.
3. The next 24 input neurons represent hourly scaled load values of two days before.
4. The next 24 input neurons represent hourly scaled load values of the same day in the previous week.

B. The ANN input variables for the groups Thursday, Friday and Saturday:
1. The first 24 input neurons represent hourly scaled load values of the same day in the previous week.
2. The next 48 neurons are used to define the effect of temperature: the first 24 neurons represent hourly scaled temperature data of the same day in the previous week, and another 24 neurons represent hourly temperature data of the next day (the day of forecast).
3. The next 48 neurons represent hourly scaled load values of the same day in the two and three previous weeks.

If the number of neurons in the hidden layer is too small, the network may not adequately map the relationship between input and output and may have difficulty in convergence during training. If the number is too large, the training process takes longer and could harm the capability of the ANN. The appropriate number varies for different applications and usually depends on the size of the training set and the number of input variables. The hidden layer size and its number of neurons are commonly selected either arbitrarily or by a trial-and-error approach. In this research, however, a heuristic method is proposed to find the optimal number of neurons in the hidden layer: using the continuous genetic algorithm, the number of neurons in the hidden layer is determined according to the flowchart shown in Fig. 4. Fig. 5 shows the optimal number of neurons in the hidden layer, based on the mean absolute percentage error (MAPE) performance index, for each of the four day-classes in the summer.

Fig. 4. Flowchart of the GA-based selection of the hidden-layer size: Start → generate the initial number of neurons in the hidden layer (nh) → train the ANN by using GA → evaluate the ANN performance by using testing data → calculate and save the value of MAPE → nh = nh + 1 and repeat.
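The 120 inputs enumerated for the Sunday-Wednesday group amount to a concatenation of five hourly series. A sketch of that assembly; the function name and the all-constant demo data are hypothetical:

```python
def build_group_a_inputs(load_prev_day, temp_prev_day, temp_forecast_day,
                         load_two_days_ago, load_same_day_last_week):
    """Concatenate the five hourly (24-point) series into the 120-neuron
    input vector: 24 loads + 48 temperatures + 24 loads + 24 loads."""
    parts = (load_prev_day, temp_prev_day, temp_forecast_day,
             load_two_days_ago, load_same_day_last_week)
    for p in parts:
        if len(p) != 24:
            raise ValueError("each series must have 24 hourly values")
    vec = []
    for p in parts:
        vec.extend(p)  # values assumed already scaled, e.g. to [0, 1]
    return vec

x = build_group_a_inputs([0.5] * 24, [0.3] * 24, [0.4] * 24,
                         [0.6] * 24, [0.55] * 24)
print(len(x))  # 120
```

The analogous builder for the Thursday/Friday/Saturday group would swap in the previous-week load and temperature series listed under B.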
Fig. 5. The optimal number of neurons based on the MAPE performance index for each of the four day-classes: a) Saturday b) Sunday-Wednesday c) Thursday d) Friday (axes: MAPE vs. number of hidden neurons).
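The procedure of Fig. 4, paired with the MAPE index plotted in Fig. 5, reduces to an outer loop over candidate hidden-layer sizes. A schematic sketch: `train_with_ga` and `test_mape` are caller-supplied placeholders, not the paper's actual GA trainer:

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    n = len(actual)
    return 100.0 / n * sum(abs((a - f) / a) for a, f in zip(actual, forecast))

def select_hidden_size(train_with_ga, test_mape, nh_start=1, nh_max=20):
    """Fig. 4 loop: for each candidate nh, train the ANN (via GA) and save
    the test-set MAPE; return the nh with the lowest MAPE."""
    history = {}
    for nh in range(nh_start, nh_max + 1):
        model = train_with_ga(nh)       # "Training the ANN by using GA"
        history[nh] = test_mape(model)  # "Evaluating ... by using testing data"
    best_nh = min(history, key=history.get)
    return best_nh, history

# Toy stand-ins: the "model" is just nh itself, and the test MAPE happens
# to be minimized at nh = 7 (compare the U-shaped curves of Fig. 5).
best_nh, history = select_hidden_size(lambda nh: nh,
                                      lambda nh: 1.3 + 0.05 * abs(nh - 7))
print(best_nh)  # 7
```

In the paper the inner training step is itself the CGA; here it is abstracted away so only the size-selection loop is shown.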
A.4 Training of the designed neural networks
Training a neural network is the process of determining the network parameters (weights) so as to achieve the desired objective, based on a set of examples called the training set. The use of ANNs for the solution of the STLF problem can be broken down into two groups based on the learning strategy: supervised and unsupervised learning. Supervised learning is based on a direct comparison between the actual outputs and the desired outputs. It is usually formulated as the minimization of an error function, such as the total mean square error between the actual output and the desired output, summed over all available data. Unsupervised learning is based solely on the correlations among the input data; no information on the "correct output" is available for learning [18, 20]. Most applications use a supervised learning ANN.
The most commonly used training method is the back propagation algorithm [15-16, 22]. This algorithm is an iterative procedure with some drawbacks, among them the slow learning speed, the possibility of falling into a local minimum, and the necessity of adjusting a learning constant in every application [18]. In order to overcome these drawbacks, genetic-based design of neural networks has been proposed.

A.5 Validation of the trained neural network
Since acceptable training errors do not always guarantee similar network performance on a different set of data (for example, due to a lack of representativeness of the training set or an improperly selected network size), it is necessary to validate the network performance after training. This is usually done by randomly selecting 10-20% of the total training data and setting it aside for testing. Based on the above discussions, test and validation data are randomly extracted by selecting 20% and 10% of the entire data, respectively, and the rest (about 70%) is used for training the networks [3].
If the testing errors are unacceptable, possible causes should be identified and corrected, and the network should be retrained. The old test set should be included in the training set; if new relevant data are available, a new test set should be collected, otherwise a new test set is again randomly extracted from the entire training data.
IV. NEURAL NETWORK TRAINING ALGORITHMS
The ANN uses pre-determined training algorithms because, when first created, the network is purposeless; over time, the training algorithm changes the behavioral condition of all the neurons of the ANN, or in other words, corrects its connection weights, so that the network responds suitably and achieves the desired objective. In this section, the proposed neural network is employed to learn the input-output relationship of an application using the BP and the CGA.

A. Back propagation algorithm
• Slow learning speed: stable convergence can require a small learning rate (η), which, however, increases the training time.
• Trapping at local minima: back propagation adjusts the weights to reach the minima of the error function of the weights (see Fig. 6). However, the network can become trapped in a local minimum. This problem can be solved by adding momentum to the training rule or by statistical training methods applied over the back propagation algorithm.

Fig. 6. Error function of the weights.
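The momentum remedy for local minima adds a fraction of the previous weight change to each update, helping the descent coast over shallow minima. A generic one-weight sketch of this standard textbook rule (not the paper's exact trainer; the η and α values are illustrative):

```python
def descend_with_momentum(grad, w0, eta=0.1, alpha=0.9, steps=300):
    """Gradient descent with momentum:
    delta_w(t) = -eta * dE/dw + alpha * delta_w(t-1)."""
    w, delta = w0, 0.0
    for _ in range(steps):
        delta = -eta * grad(w) + alpha * delta
        w += delta
    return w

# Toy error surface E(w) = (w - 3)^2, so dE/dw = 2 * (w - 3); minimum at w = 3.
w_star = descend_with_momentum(lambda w: 2.0 * (w - 3.0), w0=0.0)
print(round(w_star, 4))
```

With α = 0 this reduces to plain back-propagation-style descent; the α-weighted history term is what supplies the "inertia" the text refers to.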
[Figure residue: load curves, Load (MW) vs. time; recovered legend: Last Year, Last Friday, Last Day, Forecast Day]

TABLE III
MAPE BY GA AND BP ALGORITHMS FOR DIFFERENT DAY CLASSES
Forecast Day | Day-classes | MAPE (BP) | MAPE (GA)
[data rows not recovered]
Fig. 9. ANN forecasted loads for sample four-day classes: (a) Saturday (b) Remaining working days (c) Thursday (d) Friday (curves: BP, GA, real load; axes: Load (MW) vs. Time (hour)).
Fig. 10. Forecasted absolute percentage errors for sample four-day classes: (a) Saturday (b) Remaining working days (c) Thursday (d) Friday (curves: GA, BP; axes: APE vs. Time (hour)).
Fig. 11. ANN forecasted loads for a sample of two special days: (a) Gorban festival (b) 22 Bahman (curves: BP, GA, real load; axes: Load (MW) vs. Time (Hour)).
Fig. 12. APE of two typical special days: (a) Gorban festival (b) 22 Bahman (curves: GA, BP; axes: APE vs. Time (Hour)).
TABLE V
MAPES OF TWO SPECIAL DAYS WITH GA AND BP ALGORITHMS

Special day (day of forecast)       | MAPE (BP) | MAPE (GA)
Gorban festival (religious holiday) | 2.3949    | 1.9709
22 Bahman (national holiday)        | 1.713     | 1.4656

X. CONCLUSIONS
This paper describes the process of developing three-layer feed-forward large neural networks for short-term load forecasting based on the CGA technique. The approach rests on clustering data analysis and correlation measures. The data clustering considers the load consumption profile of a local utility in Iran, and the proposed solution covers all day types, including special days. Clustering is performed after a detailed data analysis based on correlation measures, daily and seasonal variations, holiday behaviors, etc. Separate large neural network models are then constructed for each cluster. At present there is no systematic methodology for the optimal design and training of an artificial neural network for short-term load forecasting; for this reason, a continuous genetic algorithm is employed in this paper to find the optimum large neural network structure and connection weights for the one-day-ahead electric load forecasting problem. This study also presented the research work conducted to improve short-term load forecasting for special days in anomalous load conditions, which is a difficult task using conventional methods.
The test results show that the proposed forecasting method provides a considerable improvement in forecasting accuracy for both regular and special days. The proposed ANN-based model is not only effective in producing a proper load forecast, but can also be applied to the automatic design of an optimal forecaster based on the available historical data. It is also easy to implement and performs well. The model performance evaluation in terms of the MAPE and APE indices reveals that the proposed ANN-CGA based model produces lower prediction errors and is superior to the ANN-BP method; it is therefore recommended as a promising approach for the solution of the STLF problem.

REFERENCES

[1] A. K. Topalli, I. Erkmen, I. Topalli, Intelligent short-term load forecasting in Turkey, Electrical Power and Energy Systems, Vol. 28, 2006, pp. 437-447.
[2] K.-H. Kim, H.-S. Youn and Y.-C. Kang, Short-term load forecasting for special days in anomalous load conditions using neural networks and fuzzy inference method, IEEE Trans. on Power Systems, Vol. 15, No. 2, 2000, pp. 559-565.
[3] H. Shayeghi, H. A. Shayanfar, A. Jalili, M. Porabasi, A neural network based short-term load forecasting, Proc. of the 2007 International Conf. on Artificial Intelligence (ICAI 07), Las Vegas, U.S.A., June 2007.
[4] G. Box, G. M. Jenkins, Time series analysis, forecasting and control, San Francisco: Holden-Day, 1970.
[5] J. H. Park, Y. M. Park and K. Y. Lee, Composite modeling for adaptive short-term load forecasting, IEEE Trans. on Power Systems, Vol. 6, 1991, pp. 450-457.
[6] A. D. Papalexopoulos, T. C. Hesterberg, A regression-based approach to short-term system load forecasting, IEEE Trans. on Power Systems, Vol. 4, No. 4, 1990, pp. 1535-1547.
[7] J. W. Taylor, R. Buizza, Using weather ensemble predictions in electricity demand forecasting, International Journal of Forecasting, Vol. 19, 2003, pp. 57-70.
[8] A. D. Trudnowski et al., Real-time very short-term load prediction for power system automatic control, IEEE Trans. Control Systems Technology, Vol. 9, No. 2, 2001, pp. 254-260.
[9] M. S. S. Rao, S. A. Soman, B. L. Menezes, P. Chawande, P. Dipti, T. Ghanshyam, An expert system approach to short-term load forecasting for Reliance Energy Limited, IEEE PES Meeting, Mumbai, 2006.
[10] S. Rahman and R. Bhatnagar, An expert system based algorithm for short-term load forecasting, IEEE Trans. on Power Systems, Vol. 3, No. 2, 1988, pp. 392-399.
[11] T. P. Tsao, G. C. L. Iao, S. H. Chen, Short-term load forecasting using neural networks and evolutionary programming, Proc. of Fifth International Power Engineering Conference, Singapore, 2001, pp. 743-748.
[12] S. H. Ling, F. H. F. Leung, H. K. Lam, Y.-S. Lee, P. K. S. Tam, A novel genetic-algorithm-based neural network for short-term load forecasting, IEEE Trans. on Industrial Electronics, Vol. 50, No. 4, 2003, pp. 793-799.
[13] S. Sheng, C. Wang, Integrating radial basis function neural network with fuzzy control for load forecasting in power system, IEEE/PES Transmission and Distribution Conference & Exhibition: Asia and Pacific, Dalian, China, 2005, pp. 1-5.
[14] D. Srinivasan, Evolving artificial neural networks for short term load forecasting, Neurocomputing, Vol. 23, 1998, pp. 265-276.
[15] J. A. Momoh, Y. Wang, M. Elfayoumy, Artificial neural network based load forecasting, IEEE International Conference on Systems, Man and Cybernetics, Computational Cybernetics and Simulation, 1997, pp. 3443-3451.
[16] W. Charytoniuk, M. S. Chen, Neural network design for short-term load forecasting, Proc. of IEEE International Conference on Electric Utility Deregulation and Restructuring and Power Technologies, City University, London, 2000, pp. 554-561.
[17] C. Sun, D. Gong, Support vector machines with PSO algorithm for short-term load forecasting, Proc. of IEEE International Conference on Networking, Sensing and Control, 2006, pp. 676-680.
[18] N. O. Attoh-Okine, Application of genetic-based neural network to lateritic soil strength modeling, Construction and Building Materials, Vol. 18, 2004, pp. 619-623.
[19] H. Shayeghi, A. Jalili and H. A. Shayanfar, Robust modified GA based multi-stage fuzzy LFC, Energy Conversion and Management, Vol. 48, 2007, pp. 1656-1670.
[20] M. Shahidehpour, H. Yamin, Z. Li, Market operations in electric power systems: forecasting, scheduling, and risk management, Wiley-Interscience, 2002.
[21] H. S. Hippert, C. E. Pedreira, R. C. Souza, Neural networks for short-term load forecasting: A review and evaluation, IEEE Trans. on Power Systems, Vol. 16, No. 1, 2001, pp. 44-55.
[22] H. S. Hippert, D. W. Bunn, R. C. Souza, Large neural networks for electricity load forecasting: are they overfitted?, International Journal of Forecasting, Vol. 21, 2005, pp. 425-434.
[23] J.-R. Zhang, J. Zhang, T.-M. Lok, M. R. Lyu, A hybrid particle swarm optimization-back propagation algorithm for feed-forward neural network training, Applied Mathematics and Computation, Vol. 185, 2007, pp. 1026-1037.
[24] A. Konar, Artificial intelligence and soft computing: behavioral and cognitive modeling of the human brain, CRC Press, 1999.
[25] D. E. Goldberg, Genetic algorithms in search, optimization and machine learning, Reading, MA: Addison-Wesley, 1989.
[26] D. T. Pham and D. Karaboga, Intelligent optimization techniques: genetic algorithms, tabu search, simulated annealing and neural networks, Berlin, Germany: Springer, 2000.
[27] S. Amin, J. L. Fernandez-Villacanas, Dynamic local search, Proc. of the 2nd International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications, 1997, pp. 129-132.
[28] R. L. Haupt, S. E. Haupt, Practical genetic algorithms, Wiley-Interscience, 2004.