# ELECTRIC LOAD FORECASTING USING ANN


1 Abhishek Gupta, 2 Ravi Malik

1,2 A.P., Geeta Engineering College, Panipat

1 earthat2005@gmail.com, 2 ravimalikap@gmail.com

Index Terms: Neural Networks, Back Propagation.

1. INTRODUCTION

Energy is among the most widely used resources in today's world. We use energy in various forms in our day-to-day lives: electricity, refined oils, LPG, solar energy, wind energy, chemical energy in the form of batteries, and many others. Sometimes we are extravagant and sometimes careful, but the aim is to provide an uninterrupted supply to the users of electricity, and to achieve that aim there must be a proper evaluation of present and future power demand. We therefore need a technique that tells us the demand of the consumers and the exact capacity required to generate the power: this is LOAD FORECASTING. It is used by power companies to estimate the amount of power needed to meet demand, and it describes both present and future load demand. It has many applications, including energy purchasing and generation, load switching, contract evaluation, and infrastructure development. Load forecasting has in recent years become one of the major areas of research in electrical engineering. It is, however, a difficult task: first, because the load series is complex and exhibits several levels of seasonality; second, because the load at a given hour depends not only on the load at the previous hour but also on the load at the same hour on the previous day and in the previous week; and third, because there are many important exogenous variables that must be considered.

2. ARTIFICIAL NEURAL NETWORK

A neuron with a single input vector containing R elements is shown in Figure 1. P1, P2, ..., PR are the input elements, and w1,1, w1,2, ..., w1,R are the corresponding weights for the individual input elements. The products of the input elements and their corresponding weights are fed to the summing junction of the neuron. A single bias b is then added at the summing junction to form n, which is fed as the input to the transfer function f. A mathematical expression for n is shown in Eq. (1).

n = w1,1 P1 + w1,2 P2 + ... + w1,R PR + b        (1)

Figure 1. A mathematical Neuron
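The summing junction of Eq. (1) can be illustrated numerically. The following is a minimal Python sketch (the paper's own experiments use MATLAB; the function name and values here are illustrative):

```python
# Summing junction of Eq. (1): n = w11*P1 + w12*P2 + ... + w1R*PR + b
def neuron_sum(weights, inputs, bias):
    """Weighted sum of the input elements plus a single bias term."""
    return sum(w * p for w, p in zip(weights, inputs)) + bias

# Example with R = 3 input elements
n = neuron_sum([0.5, -0.25, 1.0], [2.0, 4.0, 1.0], 0.1)
print(n)  # 0.5*2 - 0.25*4 + 1.0*1 + 0.1 = 1.1
```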

Finally, an output a is calculated through the transfer function. Some common transfer functions utilized in neural networks are shown in Figure 2. A transfer function acts as a squashing function, such that the output of a neural network lies between certain values (usually between 0 and 1, or between -1 and 1). Generally speaking, there are three major transfer functions. The first is the Threshold Function, whose output is 0 if the summed input is less than a certain threshold value and 1 when the summed input is greater than or equal to that threshold. Let φ be the transfer function and v be the input. The threshold (piecewise-linear) function is shown in Eq. (2):

φ(v) = 1,  if v ≥ 1/2
φ(v) = v,  if 1/2 > v > -1/2        (2)
φ(v) = 0,  if v ≤ -1/2

The last transfer function is the Sigmoid Function, whose range is from 0 to 1, shown in Eq. (3):

φ(v) = 1 / (1 + exp(-av))        (3)

where a is a slope parameter of the sigmoid function. The modified sigmoid function, also referred to as the Hyperbolic Tangent Function, has a range from -1 to 1, shown in Eq. (4):

φ(v) = tanh(v/2) = (1 - exp(-v)) / (1 + exp(-v))        (4)

Figure 2. Common Non-Linear Transfer Functions

3. WORKING PRINCIPLE OF ARTIFICIAL NEURAL NETWORKS

The working principle of an artificial neural network is straightforward. Consider a three-layer feed-forward neural network as shown in Figure 3. From left to right, starting from the input layer, each input neuron is connected to every hidden neuron in the hidden layer, and each hidden neuron is in turn connected to every output neuron in the output layer. Signals pass through the input layer and are multiplied by the corresponding synaptic weights; the multiplied signals are then summed at the hidden layer and activated by a transfer function. Let i denote the input layer, j denote the hidden layer, k denote the output layer, yi denote an input signal, wji denote a synaptic weight between the input and hidden layers, vj denote the summed signal at a hidden neuron, and φj(.) denote the transfer function at the hidden layer. We can write the summing equation at the hidden layer as

vj = Σ (i = 1 to n) wji yi        (5)

where n is the number of input signals. The activated sum through the transfer function can be written as

yj = φj(vj) + bj        (6)

where bj is a threshold value at the hidden layer. The transfer function can be any one of the step, piecewise-linear, sigmoid, or hyperbolic tangent functions.
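As a concrete illustration of the three transfer functions of Eqs. (2)-(4), they can be written out in a few lines. This is an illustrative Python sketch, not the paper's code (the experiments themselves use MATLAB):

```python
import math

def piecewise_linear(v):
    """Eq. (2): saturates at 1 above +1/2, at 0 below -1/2, linear in between."""
    if v >= 0.5:
        return 1.0
    if v <= -0.5:
        return 0.0
    return v

def sigmoid(v, a=1.0):
    """Eq. (3): logistic function with slope parameter a; range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-a * v))

def hyperbolic_tangent(v):
    """Eq. (4): tanh(v/2) = (1 - exp(-v)) / (1 + exp(-v)); range (-1, 1)."""
    return (1.0 - math.exp(-v)) / (1.0 + math.exp(-v))

print(piecewise_linear(0.75), sigmoid(0.0), hyperbolic_tangent(0.0))  # 1.0 0.5 0.0
```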

Figure 3. A Multi-Layer Fully Connected Feed forward Network

The activated signal yj in the hidden layer is multiplied by the synaptic weight wkj between the hidden layer and the output layer and summed at the output layer. Those summed intermediate signals are then activated by the transfer function at the output layer. Let vk denote the summed signal at the output layer, φk(.) denote the transfer function, and yk denote the output signal. We can write the summing equation at the output layer as

vk = Σ (j = 1 to m) wkj yj        (7)

where m is the number of hidden neurons. The activated sum through the transfer function can be written as

yk = φk(vk) + bk        (8)

where bk is a threshold value at the output layer. Again, the transfer function can be any one of the step, piecewise-linear, sigmoid, or hyperbolic tangent functions. Usually a piecewise-linear function is utilized as the transfer function at the output layer.
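Putting Eqs. (5)-(8) together, a forward pass through the three-layer network can be sketched as follows. This is an illustrative Python version under the paper's conventions (bias added after activation, as in Eqs. (6) and (8)); the weights and layer sizes are arbitrary examples, not values from the paper:

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def layer_forward(weights, inputs, biases, phi):
    """Eqs. (5)/(7): v = sum_i w_ji * y_i; then Eqs. (6)/(8): y = phi(v) + b."""
    outputs = []
    for w_row, b in zip(weights, biases):
        v = sum(w * y for w, y in zip(w_row, inputs))
        outputs.append(phi(v) + b)
    return outputs

# Illustrative 2-input, 2-hidden, 1-output network
w_hidden = [[0.4, -0.6], [0.7, 0.2]]   # w_ji, input -> hidden
w_output = [[1.0, -1.0]]               # w_kj, hidden -> output
y_input = [1.0, 0.5]

y_hidden = layer_forward(w_hidden, y_input, [0.0, 0.0], sigmoid)
y_out = layer_forward(w_output, y_hidden, [0.0], lambda v: v)  # linear output layer
print(y_out)
```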

4. PROPOSED WORK

A broad spectrum of factors affects the system's load level, such as trend effects, cyclic-time effects, weather effects, and random effects like human activities, load management, and thunderstorms. Thus the load profile is dynamic in nature, with temporal, seasonal, and annual variations. In our project we developed a system that predicts the load demand 24 hours at a time. As inputs we took the past 24 hours of load and the day of the week. A back propagation neural network is designed to train the input data against the target data. Before training, some pre-processing is performed: the mean of each row of input is mapped to 0 and its standard deviation to 1, and to make the rows highly uncorrelated a principal component analysis of the columns of the matrices is done. The inputs are then fed into our Artificial Neural Network (ANN) and, after sufficient training, used to predict the load demand for the next week. A schematic flow chart of our system is shown in Figure 4. A Simulink model of the neural network is produced after running the script in MATLAB, which trains the input data against the target values and also reports how much of the data is actually trained accurately; regression fulfils this purpose. For a good fit of the input values to the outputs, the regression value should lie between 0.998 and 1.00. Although the training parameters in the MATLAB neural network toolbox are predefined, they can be changed for better performance. In our work, the number of epochs is set to 700 and the learning rate to 0.3.

Figure 4. Schematic flow chart of the system (Start → pre-process data by mapping each row's mean to 0 and deviation to 1 → principal component analysis of data → create feed-forward back propagation network with training function trainlm → set network parameters → train the network → simulate the model → plot graphs → End)
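The two pre-processing steps above (mapping each row to zero mean and unit deviation, then decorrelating with principal component analysis) can be sketched as below. This is an illustrative NumPy version of what the MATLAB toolbox pre-processing functions do; it is not the paper's code, and the data here is random placeholder data:

```python
import numpy as np

def mapstd_rows(x):
    """Map each row of x to mean 0 and standard deviation 1."""
    mean = x.mean(axis=1, keepdims=True)
    std = x.std(axis=1, keepdims=True)
    return (x - mean) / std

def pca_decorrelate(x):
    """Project rows onto principal components so that they become uncorrelated."""
    cov = np.cov(x)
    _, eigvecs = np.linalg.eigh(cov)  # orthogonal eigenvectors of the covariance
    return eigvecs.T @ x

rng = np.random.default_rng(0)
data = rng.normal(size=(3, 24))        # e.g. 3 input rows over 24 hourly loads
z = mapstd_rows(data)
decorr = pca_decorrelate(z)
print(np.allclose(z.mean(axis=1), 0))  # rows now have zero mean
```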

5. RESULTS

While training the model, the difference between the target values and the network outputs should be minimal; it is measured as the mean square error (MSE). For our network the MSE graph is shown in Figure 5.

Figure 5. Mean Square Error Graph

The value of the MSE decreases continuously at first and then becomes constant, which shows that the designed neural network model fits the target data well. One more performance parameter, the regression graph, is shown in Figure 6.

Figure 6. Regression Graph- A performance evaluation
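Both performance measures, the mean square error and the regression (correlation) value, are standard quantities and easy to compute directly. A small Python sketch with made-up target and output values for illustration:

```python
def mse(targets, outputs):
    """Mean square error between target and predicted values."""
    return sum((t - o) ** 2 for t, o in zip(targets, outputs)) / len(targets)

def regression_r(targets, outputs):
    """Pearson correlation between targets and outputs (the toolbox 'R' value)."""
    n = len(targets)
    mt = sum(targets) / n
    mo = sum(outputs) / n
    cov = sum((t - mt) * (o - mo) for t, o in zip(targets, outputs))
    st = sum((t - mt) ** 2 for t in targets) ** 0.5
    so = sum((o - mo) ** 2 for o in outputs) ** 0.5
    return cov / (st * so)

t = [1.0, 2.0, 3.0, 4.0]   # illustrative targets
p = [1.1, 1.9, 3.2, 3.9]   # illustrative network outputs
print(round(mse(t, p), 4), round(regression_r(t, p), 4))
```

A regression value near 1, as required here (0.998-1.00), means the outputs lie almost exactly on the ideal fit line.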

In the above graph, the black dotted line represents exact training of the input values: ideally, all the data should lie on this line. The black circles represent the data. After inputting the data into the neural network and training it, graphs are finally plotted of the load values over the 24 hours of each day of the week, as shown in figure 7. These graphs show how closely the output values match the inputs and how much each day deviates from the input values.

6. CONCLUSION

The results of the BPP network model used for one-day-ahead short-term load forecasting show that the BPP network performs well, and reasonable prediction accuracy was achieved with this model. Its forecasting reliability was evaluated by computing the mean absolute error between the exact and predicted values, and a low mean absolute error was obtained. The results suggest that an ANN model with the developed structure can predict with small error, and that this neural network could be an important tool for short-term load forecasting.
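The mean absolute error used above to evaluate forecasting reliability is simply the average of |actual − predicted| over the forecast horizon. A one-function Python sketch with illustrative load values (not the paper's data):

```python
def mean_absolute_error(actual, predicted):
    """Average absolute deviation between exact and predicted loads."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

actual = [100.0, 105.0, 98.0]      # illustrative exact loads
predicted = [102.0, 104.0, 99.0]   # illustrative forecasts
print(mean_absolute_error(actual, predicted))  # (2 + 1 + 1) / 3
```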

Figure 7. Daily Load Forecasting Graph (one panel per day, Sunday through Saturday; each panel plots the day value over 24 hours)