
IMPLEMENTATION OF SHORT-TERM LOAD FORECASTING USING NEURAL NETWORKS AND ANFIS

By

P.Uday Kiran

M.Ajay Kumar

M.Suresh

4/4 B.Tech(EEE)

What is Load Forecasting?

• Predicting the load
• It is important for maintaining the power plant
• The forecaster ascertains the estimated load for the required hour

Importance of Load Forecasting
• Load forecasting has always been an essential part of efficient power system planning and operation.
• Several electric power companies now forecast load power using conventional methods.
• However, since the relationship between load power and the factors influencing it is non-linear, it is difficult to identify this non-linearity using conventional methods.

Types of Load Forecasting

• Short-term load forecasting: the load prediction period may be a week or shorter.
• Medium-term load forecasting: the load prediction period may be a few months.
• Long-term load forecasting: the load prediction period may be more than a year.

Factors Affecting Load Forecasting

• Seasonal changes
• Daily changes
• Temperature
• Humidity
• Clouds
• Random events
• Any economic or environmental change

Need for Forecasting the Load

• Planning of power generation
• Scheduling of fuel supplies and maintenance
• Minimizing the operation costs
• Important for the supplier: with the forecasted load, the number of generating units in operation can be controlled.

Artificial Neural Network

• A neural network is a massively parallel distributed processor made up of simple processing units, known as neurons.
• It resembles the brain in two respects:
  1. Knowledge is acquired by the network from its environment through a learning process.
  2. Inter-neuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.
• The procedure used to set the connection strengths is called learning.

Basic Elements in Neural Network Structure

• An ANN performs fundamentally like a human brain; the cell body of a human neuron receives incoming impulses via dendrites.
• Neurons of an ANN consist of three main components: the weights connecting the nodes, the summation function within the node, and the transfer function.

[Figure: Components of a neuron and the neuron model]

• After the training process, the ANN generalizes. Generalization refers to the neural network producing reasonable outputs for inputs not encountered during training (learning).
• These two information-processing capabilities (learning and generalization) make it possible for neural networks to solve complex problems.

[Figure: Model of an artificial neuron]
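The neuron model in the figure computes a weighted sum of its inputs plus a bias and passes the result through a transfer function. The MATLAB sketch below illustrates this with made-up weights and inputs (not values from this work):

```matlab
% Single artificial neuron: weighted sum of inputs plus bias, followed by a
% transfer function. All numbers here are illustrative only.
x = [0.5; 0.1; 0.9];          % inputs
w = [0.2; -0.4; 0.7];         % synaptic weights
b = 0.1;                      % bias

net_in = w' * x + b;          % summation function
y = 1/(1 + exp(-net_in));     % sigmoid transfer function -> neuron output
```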

Properties of Neural Networks

• Non-linearity
• Input-output mapping
• Adaptivity
• Fault tolerance

These are the properties that are most desirable for solving the problems at hand.

Taxonomy of Neural Networks

• Feed-forward networks
• Feedback networks

[Figure: Feed-forward neural network]

Back Propagation Network

• Back propagation is a systematic method for training multi-layer artificial neural networks using the back propagation of errors rule.
• The aim of this network is to train the net to achieve a balance between the ability to respond correctly to the input patterns used for training and the ability to give good responses to similar inputs.

Back Propagation Neural Network

• The BP network consists of one input layer, one or more hidden layers, and one output layer.
• The learning process includes two passes: the input information transmitted in the forward direction, and the error transmitted in the backward direction.

Training Algorithm

The training algorithm of back propagation involves four stages:
• Initialization of weights
• Feed-forward
• Back propagation of errors
• Updating of weights and biases
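A minimal MATLAB sketch of one training iteration, showing the four stages for a single-hidden-layer network (dummy data and an assumed learning rate; not the exact implementation used in this work):

```matlab
% One back propagation iteration for a 1-hidden-layer network (illustrative).
nIn = 30; nHid = 50; nOut = 1;            % layer sizes used later in this work
x = rand(nIn,1);  t = rand(nOut,1);       % one training pattern (dummy data)
eta = 0.01;                               % learning rate (assumed value)

% 1) Initialization of weights and biases
W1 = 0.1*randn(nHid,nIn);  b1 = zeros(nHid,1);
W2 = 0.1*randn(nOut,nHid); b2 = zeros(nOut,1);

% 2) Feed-forward
h = 1./(1 + exp(-(W1*x + b1)));           % hidden layer, sigmoid transfer
y = W2*h + b2;                            % linear output layer

% 3) Back propagation of errors
e      = t - y;                           % output error
delta2 = e;                               % output delta (linear output layer)
delta1 = (W2'*delta2) .* h .* (1 - h);    % hidden delta via sigmoid derivative

% 4) Updating of weights and biases (gradient descent step)
W2 = W2 + eta*delta2*h';   b2 = b2 + eta*delta2;
W1 = W1 + eta*delta1*x';   b1 = b1 + eta*delta1;
```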

APPLICATION OF ANNs

• The application of ANNs to short-term load forecasting has gained a lot of attention recently.
• The availability of historical data is most important for applying ANNs to this field.

MATLAB

• An interactive system whose basic element is an array that does not require dimensioning.
• It allows graphics, computation, and external interfacing.
• It has built-in toolboxes, which can also be extended by user programming (SIMULINK).

NEURAL NETWORK TOOLBOX: STEPS INVOLVED

• Assemble the training data
• Create the network object
• Train the network
• Simulate the network response to new inputs
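These four steps can be sketched with the older Neural Network Toolbox functions newff, train, and sim; the data below are dummy placeholders rather than the NPDCL load data:

```matlab
% Sketch of the four toolbox steps (older newff/train/sim interface assumed).
P = rand(30,10);                      % training inputs:  30 features x 10 samples
T = rand(1,10);                       % training targets: 1 output   x 10 samples

% Create a 2-layer feed-forward back propagation network:
% 50 sigmoid hidden neurons, 1 linear output neuron.
net = newff(minmax(P), [50 1], {'tansig','purelin'}, 'traingd');

% Train the network.
net.trainParam.epochs = 1000;
net.trainParam.goal   = 1e-3;
net = train(net, P, T);

% Simulate the network response to new inputs (5 test samples).
Ptest = rand(30,5);
Y = sim(net, Ptest);
```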

Preprocessing and Post-processing

• It is useful to scale the inputs and outputs so that they fall in a specified range.
• The NN output therefore needs de-scaling to generate the forecasted loads.
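A minimal sketch of min-max scaling to [-1, 1] and the corresponding de-scaling (an assumed scaling scheme; the toolbox also offers functions such as premnmx/postmnmx for this):

```matlab
% Min-max scaling of load values to [-1, 1] and de-scaling back (illustrative).
L    = [1256 1307 1153 992 1028];        % example load values
Lmin = min(L);  Lmax = max(L);

Ln = 2*(L - Lmin)/(Lmax - Lmin) - 1;     % scaled values in [-1, 1]

% ... network is trained and simulated on the scaled data ...
Yn = Ln(1);                              % dummy scaled network output

Y  = (Yn + 1)*(Lmax - Lmin)/2 + Lmin;    % de-scaled forecasted load
```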

LOAD FORECASTING USING ANN

• In this work, NPDCL Warangal distribution system load data is considered for forecasting.
• Temperature data was taken from the NITW weather station.
• A feed-forward back propagation 2-layered network structure with a non-linear sigmoid transfer function is chosen.

Network Properties

• The two layers are the hidden layer and the output layer.
• The connection weights can be real numbers or integers. They are adjustable during training, but some can be fixed deliberately. When training is completed, all of them should be fixed.

Load Forecasting with Only Loads as Inputs

Input Variables
• Different sets of lagged loads have been proposed as input features for load prediction in the electricity market.
• Bearing in mind the daily and weekly periodicity and trend of the load signal, the set {L(h-1), L(h-2), L(h-3), L(h-4), L(h-5), L(h-6), L(h-24), L(h-25), L(h-26), L(h-48), L(h-49), L(h-50), L(h-72), L(h-73), L(h-74), L(h-96), L(h-97), L(h-98), L(h-120), L(h-121), L(h-122), L(h-144), L(h-145), L(h-146), L(h-168), L(h-169), L(h-170), L(h-192), L(h-193), L(h-194)}, a total of 30 inputs, is used at the input layer.
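A short MATLAB sketch of how this 30-element lagged-load input vector could be assembled from an hourly load series (the variable names and dummy data are assumptions, not taken from the original code):

```matlab
% Build the 30 lagged-load inputs for forecast hour h from an hourly series.
lags = [1:6, 24:26, 48:50, 72:74, 96:98, 120:122, ...
        144:146, 168:170, 192:194];          % the 30 lags listed above
load_hist = 900 + 500*rand(300,1);           % dummy hourly load history
h = 200;                                     % forecast hour (> largest lag)

x = load_hist(h - lags)';                    % 30x1 input vector for hour h
```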

[Figure: Topology of the ANN. Inputs: lagged loads L(h-1) … L(h-194) from the current day (D) back to day (D-8); output: the load L(h) at interval h.]

Training:
1) The back propagation NN algorithm is used here for learning the neural network.
2) The implementation of back propagation involves a forward pass through the network to estimate the error, and then a backward pass modifying the synapses (weights) to decrease the error.

Simulation:
Using the trained neural network, the forecasting output is simulated using the test input patterns.

Simulation Results

• Number of input nodes = 30
• Number of hidden nodes = 50
• Number of output nodes = 1
• Number of training samples = 10
• Number of testing samples = 5

ACTUAL LOAD    FORECASTED LOAD    % ERROR
1256           1417.73            12.87
1307           1390               6.1925
1153           1270.42            10.7
992            920.68             …
1028           910.8742           …
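The % error column appears to follow the usual absolute percentage error definition; a short MATLAB check using the (partly reconstructed) values from the table above:

```matlab
% Absolute percentage error of the forecast (values copied from the table).
actual   = [1256 1307 1153 992 1028];
forecast = [1417.73 1390 1270.42 920.68 910.8742];
pct_err  = abs(forecast - actual) ./ actual * 100;   % first entry is about 12.9
```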


Load Forecasting by Considering the Temperature Effect

INPUT VARIABLES:
• Hourly load data for the month of January 2007 was collected from NPDCL (Northern Power Distribution Corporation Limited).
• Hourly temperature values were taken from the NITW weather station for the month of January 2007.
• We used this data to train the network and test its performance.

[Figure: Topology of the ANN with temperature inputs. Inputs: lagged loads and temperatures L(h-1), T(h-1) … L(h-74), T(h-74) from day (D) back to day (D-3), plus the temperature T(h); output: the load to be forecasted at hour h.]
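A sketch of how the combined load and temperature input vector for hour h could be assembled (the lag set is inferred from the topology figure, and the variable names and dummy data are assumptions):

```matlab
% Combined lagged load and temperature inputs for forecast hour h (illustrative).
lags = [1:6, 24:26, 48:50, 72:74];            % lagged hours used with temperature
load_hist = 900 + 500*rand(300,1);            % dummy hourly loads
temp_hist = 15  + 15*rand(300,1);             % dummy hourly temperatures (deg C)
h = 100;

x = [load_hist(h - lags)'; temp_hist(h - lags)'; temp_hist(h)];   % input vector
```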

Simulation Results

• Number of input nodes = 32
• Number of hidden nodes = 50
• Number of output nodes = 1
• Number of training samples = 10
• Number of testing samples = 5

ACTUAL LOAD    FORECASTED LOAD    % ERROR
1028           1059.5             3.06
1200           1192.5             …
1185           1164.5             …
1159           1208               4.227
1174           1128.9             3.841


Adaptive Neuro-Fuzzy Inference System

• In ANFIS, the membership function (MF) parameters are chosen so as to tailor them to a set of input-output data.
• It constructs a set of fuzzy if-then rules with appropriate membership functions to generate the stipulated input-output pairs.
• The parameters of the MFs and the rules change through the learning process.
• A first-order Sugeno fuzzy model with a hybrid learning algorithm is used.

Fuzzy Inference System

1. Fuzzification interface: transforms input crisp values into fuzzy values.
2. Knowledge base: a combination of the rule base and the data base.
   (i) The rule base contains a number of fuzzy if-then rules.
   (ii) The data base defines the membership functions of the fuzzy sets used in the fuzzy rules.
3. Decision-making logic: performs inference for fuzzy control actions.
4. Defuzzification interface: transforms fuzzy values into crisp values.

[Figure: Membership functions for seven linguistic variables (NB, NM, NS, PS, PM, PB) over the input range Xmin to Xmax]

ANFIS Architecture

• Rule 1: If x is A1 and y is B1, then f1 = p1x + q1y + r1
• Rule 2: If x is A2 and y is B2, then f2 = p2x + q2y + r2

Layer 1: Every node i in this layer is an adaptive node with a node function O1,i = μAi(x) for i = 1, 2, or O1,i = μBi-2(y) for i = 3, 4, where x (or y) is the input to node i and Ai (or Bi-2) is a linguistic label ("small" or "large") associated with the node. A Gaussian membership function can be used here.

Layer 2: Every node in this layer is a fixed node labeled Π, whose output is the product of all the incoming signals.

Layer 3: Here, the ith node calculates the ratio of the ith rule's firing strength to the sum of all rules' firing strengths.

Layer 4: Every node i in this layer is an adaptive node with a node function, where wi is the normalized firing strength from layer 3 and {pi, qi, ri} is the parameter set of the node. These parameters are referred to as consequent parameters.

Layer 5: The single node in this layer is a fixed node labeled Σ, which computes the overall output as the summation of all incoming signals.
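For reference, the node functions of the five layers for the two-rule, first-order Sugeno model described above are commonly written as follows (standard ANFIS equations, with a Gaussian membership function as mentioned in Layer 1; not reproduced verbatim from the slides):

```latex
\begin{align*}
\text{Layer 1:}\quad & O_{1,i} = \mu_{A_i}(x),\ i=1,2; \qquad O_{1,i} = \mu_{B_{i-2}}(y),\ i=3,4;
  \qquad \mu_A(x) = \exp\!\left(-\frac{(x-c)^2}{2\sigma^2}\right) \\
\text{Layer 2:}\quad & O_{2,i} = w_i = \mu_{A_i}(x)\,\mu_{B_i}(y), \quad i=1,2 \\
\text{Layer 3:}\quad & O_{3,i} = \bar{w}_i = \frac{w_i}{w_1 + w_2} \\
\text{Layer 4:}\quad & O_{4,i} = \bar{w}_i f_i = \bar{w}_i\,(p_i x + q_i y + r_i) \\
\text{Layer 5:}\quad & O_5 = \sum_i \bar{w}_i f_i = \frac{\sum_i w_i f_i}{\sum_i w_i}
\end{align*}
```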

Load Forecasting Using ANFIS

INPUT VARIABLES:
• Hourly load data for the month of January 2007 was collected from NPDCL (Northern Power Distribution Corporation Limited).
• Hourly temperature values were taken from the NITW weather station for the month of January 2007.
• We used this data to train the ANFIS and test its performance. Our focus is on a normal weekday.

[Figure: Topology of the ANFIS. Inputs: the load and temperature at hours (h-2) and (h-1), i.e. L(h-2), T(h-2), L(h-1), T(h-1), plus the temperature T(h); output: the load L(h) to be forecasted at hour h.]

ANFIS editor: anfisedit
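Besides the anfisedit GUI, the same workflow can be sketched at the command line with the Fuzzy Logic Toolbox functions genfis1, anfis, and evalfis (a hedged sketch with dummy data and two inputs for brevity, whereas this work used six inputs with three membership functions each):

```matlab
% Command-line ANFIS training sketch (Fuzzy Logic Toolbox; illustrative only).
X = rand(38,2);                          % training inputs (38 samples, 2 inputs)
y = sum(X,2) + 0.05*randn(38,1);         % dummy training target
trnData = [X y];                         % ANFIS expects [inputs target]

inFis  = genfis1(trnData, 3, 'gaussmf'); % grid partition: 3 Gaussian MFs per input
outFis = anfis(trnData, inFis, 50);      % hybrid learning for 50 epochs

Xtest = rand(5,2);                       % 5 test samples
yhat  = evalfis(Xtest, outFis);          % forecasted outputs
```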

Simulation Results

• Number of input nodes = 6
• Number of membership functions for each input = 3
• Number of membership functions for the output = 1
• Number of output nodes = 1
• Number of training samples = 38
• Number of testing samples = 5

[Figure: Simulation block diagram]

ACTUAL LOAD    FORECASTED LOAD    % ERROR
1028           1035.5             0.727
1200           1192.4             0.633
1185           1179.5             0.464
1159           1167               0.69
1174           1169.9             0.349

Conclusions

From the simulation results obtained, we observed that the error in the load forecast decreased to a great extent when the temperature effect was considered. The error was further decreased when the forecast was simulated with ANFIS.

" PrQceedings of IEEE.Galiana. 392-399 • [3] D. R. 1987. on Power Systems.442.449 .6. "Short-Term Load Forecasting. F. Dec.2." IEEE Trans. M. vo1.D. on Power Systems. "An Expert System Based Algorithm for Short Term Load Forecast. vo1.El-Sharkawi. pp 1558-1 573.Gross.Bhatnagar. May1991.2. no. • [2] S. pp.Rahman. no." IEEE Trans. pp.Park.3.Mark 11.References • [1] G.A.75.12.J.C. "Electric Load Forecasting Using An Artificial Neural Network. vo1. May 1988. R. no.

Yu.Park. USA.3.C. "An Expert System for Short Term Load Forecasting by Fuzzy Decision. 1992.H.T.7.T.244-250 .Term Load Forecasting Using An Artificial Neural Network. "Short . J.Park.Chen. pp.[4] K. 124132 [5] S.K. Washington.1098-1105 [6] Y. 1992. no. on Expert System Application to Power Systems." IEEE Trans.Park. A. no.T. on Power Systems. pp.R. Feb." Proc." IEEE Trans. pp.Cha. "Weather Sensitive Short-Term Load Forecasting Using Nonfully Connected Artificial Neural Network. J. vo1. of 2nd Symp. Aug. D.Moghaddamjo. Y.7. 1. on Power Systems.Y. July 1989.Lee. vo1.

THANK YOU