# Performance Analysis Of Ga And Pso Over Economic Load Dispatch Problem

DOI : 10.17577/IJERTV2IS50068


Sakshi Rajpoot

S.G.S.I.T.S. Indore

ABSTRACT

The economic load dispatch problem is one of the most popular concerns in power system engineering, and many methods have been proposed in the past to solve it. The genetic algorithm and particle swarm optimization are among the most popular optimization algorithms. This paper presents an implementation of GA and PSO for the Economic Load Dispatch problem. A comparison of both algorithms on a standard example is shown, considering both loss and no-loss conditions.

Keywords: Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Economic Load Dispatch (ELD)

1. INTRODUCTION

Power systems are designed and operated to meet the continuous variation of power demand. The power demand is shared among the generating units, and economy of operation is the main consideration in assigning the power to be generated by each unit. Economic Load Dispatch (ELD) is therefore implemented to ensure economic operation of a power system. The Economic Dispatch problem is an optimization problem that determines the optimal output of online generating units so as to meet the load demand while minimizing the total generation cost. ELD pertains to optimum generation scheduling of the available generators in an interconnected power system to minimize the cost of generation subject to relevant system constraints. Cost equations are obtained from the heat rate characteristics of the generating machine. Smooth cost functions are continuous, differentiable, and convex. The most simplified cost function of each generator can be represented as a quadratic function, whose solution can be obtained by conventional methods:

C = Σ_{j=1}^{n} F_j(P_j)   (1)

where

F_j(P_j) = a_j + b_j P_j + c_j P_j²   (2)

C = total generation cost
F_j = cost function of generator j
a_j, b_j, c_j = cost coefficients of generator j
n = total number of generators

While minimizing the total generation cost, the total generation should be equal to the total system demand plus the transmission network loss. The transmission loss is given by the equation

P_L = Σ_{j=1}^{n} B_{0j} P_j   (3)

where B_{0j} is the loss coefficient matrix. The equality constraint for the ELD problem can be given by

Σ_{j=1}^{n} P_j = D + P_L   (4)

where D is the total demand needed by the load or consumer. The generation output of each unit should lie between its minimum and maximum limits; that is, the following inequality constraint should be satisfied for each generator:

P_j^min ≤ P_j ≤ P_j^max   (5)

where P_j^min and P_j^max are the minimum and maximum output of the individual generator, respectively.

2. PROPOSED METHODOLOGY

Various mathematical methods and optimization techniques have been employed to solve ELD problems. Here we analyze the performance of two of the most popular algorithms from the optimization family.

2.1 GENETIC ALGORITHM

Genetic algorithms are a family of computational models inspired by evolution. These algorithms encode a potential solution to a specific problem on a simple chromosome-like data structure and apply recombination operators to these structures so as to preserve critical information. Genetic algorithms are often viewed as function optimizers, although the range of problems to which they have been applied is quite broad. Genetic algorithms are search algorithms based on concepts of natural selection and natural genetics. The genetic algorithm was developed to simulate some of the processes observed in natural evolution, a process that operates on chromosomes (organic devices for encoding the structure of living beings).

The genetic algorithm differs from other search methods in that it searches among a population of points and works with a coding of the parameter set, rather than the parameter values themselves. It also uses objective function information without any gradient information. The transition scheme of the genetic algorithm is probabilistic, whereas traditional methods use gradient information. Because of these features, genetic algorithms are used as general-purpose optimization algorithms. They also provide a means to search irregular spaces and hence are applied to a variety of function optimization, parameter estimation, and machine learning applications. GAs start by selecting an initial population, then iteratively apply operators to reproduce new populations, evaluate these populations, and decide whether the algorithm should continue to execute.

GAs differ from classical optimization algorithms mainly in that GAs operate on a population of individuals instead of on the parameters themselves. Each individual in the population is encoded into a chromosome that represents a candidate solution. A chromosome is composed of genes that are usually of binary form. The evaluation of an individual is determined by the fitness function value corresponding to the objective function value.

A typical GA includes the following steps:

1. Generate an initial random population of chromosomes.

2. Evaluate the population of chromosomes using an appropriate fitness function.

3. Select the subset of chromosomes with better fitness values as parents.

4. Cross over pairs of parents with a given probability (Pc) to produce offspring.

5. Mutate the chromosomes of the offspring with probability (Pm) to avoid early trapping in local solutions.

6. Re-evaluate the fitness values of the offspring.

7. Terminate the algorithm if the stopping criterion is satisfied.
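The steps above can be sketched as a minimal real-coded GA in Python. The two-generator cost data, penalty weight, and GA parameters below are illustrative assumptions for a toy dispatch, not the paper's MATLAB toolbox settings:

```python
import random

# Illustrative 2-generator data (assumed for this sketch, not the paper's 6-unit system)
A = [0.008, 0.009]    # quadratic cost coefficients c_j
B = [7.0, 6.3]        # linear cost coefficients b_j
C = [200.0, 180.0]    # constant cost coefficients a_j
PMIN, PMAX = [10.0, 10.0], [125.0, 150.0]
DEMAND = 200.0        # MW, losses neglected

def cost(p):
    """Total fuel cost plus a penalty enforcing the demand balance of eq. (4)."""
    fuel = sum(cq * x * x + b * x + a for cq, b, a, x in zip(A, B, C, p))
    return fuel + 1000.0 * abs(sum(p) - DEMAND)

def random_unit(j):
    return random.uniform(PMIN[j], PMAX[j])

def ga(pop_size=40, generations=200, pc=0.8, pm=0.1):
    # Step 1: initial random population
    pop = [[random_unit(j) for j in range(2)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)                    # steps 2-3: evaluate and rank
        parents = pop[:pop_size // 2]         # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            m, f = random.sample(parents, 2)
            if random.random() < pc:          # step 4: arithmetic crossover
                w = random.random()
                child = [w * x + (1 - w) * y for x, y in zip(m, f)]
            else:
                child = m[:]
            for j in range(2):                # step 5: mutation
                if random.random() < pm:
                    child[j] = random_unit(j)
            children.append(child)
        pop = parents + children              # step 6 happens on the next sort
    return min(pop, key=cost)                 # step 7: stop after fixed generations

best = ga()
print(best, cost(best))
```

The penalty term is one common way to fold the equality constraint into the fitness function; toolbox implementations typically expose it as a constraint instead.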

Figure 1 Flow chart of the genetic algorithm: create a population of chromosomes → determine the fitness of each individual → perform reproduction using crossover → perform mutation → display results.

2.2 PARTICLE SWARM OPTIMIZATION

Particle Swarm Optimization (PSO) is a technique used to explore the search space of a given problem to find the settings or parameters required to maximize or minimize a particular objective.

This technique, first described by James Kennedy and Russell C. Eberhart in 1995, originates from two separate concepts: the idea of swarm intelligence, based on the observation of the swarming habits of certain animals (such as birds and fish), and the field of evolutionary computation. The PSO algorithm works by simultaneously maintaining several candidate solutions in the search space. During each iteration of the algorithm, each candidate solution can be thought of as a particle flying through the fitness landscape toward the maximum or minimum of the objective function. Initially, the PSO algorithm chooses candidate solutions randomly within the search space. It should be noted that the PSO algorithm has no knowledge of the underlying objective function, and thus has no way of knowing whether any of the candidate solutions are near to or far from a local or global maximum or minimum.

The PSO algorithm simply uses the objective function to evaluate its candidate solutions, and operates upon the resultant fitness values. Each particle maintains its position, composed of the candidate solution and its evaluated fitness, and its velocity. Additionally, it remembers the best fitness value it has achieved thus far, referred to as the individual best fitness, and the candidate solution that achieved it, called the individual best position. The swarm also tracks the best fitness achieved by any particle, the global best fitness, and the corresponding global best position or global best candidate solution. The PSO algorithm consists of just three steps, which are repeated until some stopping condition is met:

1. Evaluate the fitness of each particle.

2. Update individual and global best fitness values and positions.

3. Update velocity and position of each particle.

The first two steps are fairly trivial. Fitness evaluation is conducted by supplying the candidate solution to the objective function. Individual and global best fitness values and positions are updated by comparing the newly evaluated fitness against the previous individual and global bests, and replacing them as necessary. The velocity and position update step is responsible for the optimization ability of the PSO algorithm. The velocity of each particle in the swarm is updated using the following equation:

vi(t+1) = w vi(t) + c1 r1 [pi(t) - xi(t)] + c2 r2 [g(t) - xi(t)]   (6)

where pi(t) is the individual best position of particle i, g(t) is the global best position, w is the inertia weight, c1 and c2 are acceleration coefficients, and r1, r2 are uniform random numbers in [0, 1]. Each particle then moves to its new position xi(t+1) = xi(t) + vi(t+1).
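The three steps can be sketched in Python as follows. The inertia weight w = 0.7 and acceleration coefficients c1 = c2 = 1.5 are common textbook choices rather than values taken from the paper, and the sphere function is only a stand-in objective:

```python
import random

def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimize `objective` over a box, using the velocity update of eq. (6)."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # individual best positions
    pbest_f = [objective(p) for p in pos]        # individual best fitness values
    gi = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[gi][:], pbest_f[gi]   # global best position / fitness
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):                 # step 3: velocity + position update
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])  # cognitive pull
                             + c2 * r2 * (gbest[d] - pos[i][d]))    # social pull
                pos[i][d] += vel[i][d]
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d], lo), hi)             # clamp to limits
            f = objective(pos[i])                # step 1: evaluate fitness
            if f < pbest_f[i]:                   # step 2: update bests
                pbest_f[i], pbest[i] = f, pos[i][:]
                if f < gbest_f:
                    gbest_f, gbest = f, pos[i][:]
    return gbest

# Usage: minimize the 2-D sphere function; the optimum is the origin.
best = pso(lambda x: sum(v * v for v in x), [(-5.0, 5.0), (-5.0, 5.0)])
```

Clamping positions to the box is one simple way to honor the generator limits of eq. (5); toolbox implementations may handle bounds differently.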

Figure 2 Flow chart of particle swarm optimization: start → generate initial searching points for each agent → evaluate the searching points of each agent → modify each searching point by the state equations → repeat until the maximum iteration is reached → stop.

Each of the three terms of the velocity update equation plays a different role in the PSO algorithm. This process is repeated until some stopping condition is met. Common stopping conditions include a preset number of iterations of the PSO algorithm, a number of iterations since the last update of the global best candidate solution, or a predefined target fitness value.

3. SIMULATION AND RESULTS

We developed a model for solving the Economic Load Dispatch problem in MATLAB R2009b, using the statements of Section 1. We used the genetic algorithm and particle swarm optimization toolboxes with the objective functions developed for ELD. We simulated many problems, one of which is as follows.

We considered a standard problem for a six-generator system. The cost characteristic equations for all six units are given below:

UNIT 1: F1 = 0.15420 * P1² + 38.53973 * P1 + 756.79886 Rs/Hr, 10 ≤ P1 ≤ 125 MW

UNIT 2: F2 = 0.10587 * P2² + 46.15916 * P2 + 451.32513 Rs/Hr, 10 ≤ P2 ≤ 150 MW

UNIT 3: F3 = 0.02803 * P3² + 40.39655 * P3 + 1049.9977 Rs/Hr, 35 ≤ P3 ≤ 225 MW

UNIT 4: F4 = 0.03546 * P4² + 38.30553 * P4 + 1243.5311 Rs/Hr, 35 ≤ P4 ≤ 210 MW

UNIT 5: F5 = 0.02111 * P5² + 36.32782 * P5 + 1658.5596 Rs/Hr, 130 ≤ P5 ≤ 325 MW

UNIT 6: F6 = 0.01799 * P6² + 38.27041 * P6 + 1356.6592 Rs/Hr, 125 ≤ P6 ≤ 315 MW

The transmission loss coefficient matrix Bmn for the above system is as follows:

Bmn = [0.000140 0.000017 0.000015 0.000019 0.000026 0.000022;
       0.000017 0.000060 0.000013 0.000016 0.000015 0.000020;
       0.000015 0.000013 0.000065 0.000017 0.000024 0.000019;
       0.000019 0.000016 0.000017 0.000071 0.000030 0.000025;
       0.000026 0.000015 0.000024 0.000030 0.000069 0.000032;
       0.000022 0.000020 0.000019 0.000025 0.000032 0.000085];
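For reference, the unit data and Bmn matrix can be evaluated with a short script (Python here rather than the authors' MATLAB model). The loss is computed with the usual quadratic form P'·Bmn·P, and the dispatch checked below is the PSO result reported for Scenario 1 in this section:

```python
# Six-unit cost coefficients (quadratic, linear, constant), from the data above
COEF = [
    (0.15420, 38.53973, 756.79886),
    (0.10587, 46.15916, 451.32513),
    (0.02803, 40.39655, 1049.9977),
    (0.03546, 38.30553, 1243.5311),
    (0.02111, 36.32782, 1658.5596),
    (0.01799, 38.27041, 1356.6592),
]
# Loss coefficient matrix Bmn (symmetric, 6 x 6)
Bmn = [
    [0.000140, 0.000017, 0.000015, 0.000019, 0.000026, 0.000022],
    [0.000017, 0.000060, 0.000013, 0.000016, 0.000015, 0.000020],
    [0.000015, 0.000013, 0.000065, 0.000017, 0.000024, 0.000019],
    [0.000019, 0.000016, 0.000017, 0.000071, 0.000030, 0.000025],
    [0.000026, 0.000015, 0.000024, 0.000030, 0.000069, 0.000032],
    [0.000022, 0.000020, 0.000019, 0.000025, 0.000032, 0.000085],
]

def total_cost(p):
    """Sum of the quadratic unit cost functions, eq. (1)-(2)."""
    return sum(c * x * x + b * x + a for (c, b, a), x in zip(COEF, p))

def loss(p):
    """Transmission loss as the quadratic form p' * Bmn * p."""
    return sum(p[m] * Bmn[m][n] * p[n] for m in range(6) for n in range(6))

# PSO dispatch for Scenario 1 as reported in the results table
pso_dispatch = [28.3027, 10.0, 118.9557, 118.6726, 230.7595, 212.7411]
print(round(total_cost(pso_dispatch), 2))       # close to the reported 36912.15 Rs/Hr
print(round(sum(pso_dispatch) - loss(pso_dispatch), 2))   # close to the 700 MW demand
```

The generation minus loss reproducing the 700 MW demand confirms that the reported dispatch satisfies the equality constraint of eq. (4).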

The system load is 700 MW.

Scenario 1: Considering System Loss

On simulating our program, we obtain the following results:

| Method | GA | PSO |
| --- | --- | --- |
| P1 (MW) | 13.9889 | 28.3027 |
| P2 (MW) | 10.0747 | 10 |
| P3 (MW) | 100.83 | 118.9557 |
| P4 (MW) | 122.9943 | 118.6726 |
| P5 (MW) | 225.265 | 230.7595 |
| P6 (MW) | 245.9769 | 212.7411 |
| Cost (Rs/Hr) | 36924.2717 | 36912.1478 |

Scenario 2: Neglecting System Loss

In this case we set B = 0.

On simulating our program, we obtain the following results:

| Method | GA | PSO |
| --- | --- | --- |
| P1 (MW) | 15.0034 | 24.9737 |
| P2 (MW) | 10.1551 | 10 |
| P3 (MW) | 102.0484 | 102.6617 |
| P4 (MW) | 107.8269 | 110.6343 |
| P5 (MW) | 235.549 | 232.6834 |
| P6 (MW) | 229.4174 | 219.0468 |
| Cost (Rs/Hr) | 36021.0081 | 36003.1282 |
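As a quick sanity check: with B = 0, eq. (4) reduces to ΣPj = D, and both no-loss dispatches above indeed sum to the 700 MW demand (values copied from the table):

```python
DEMAND = 700.0  # MW
ga_dispatch = [15.0034, 10.1551, 102.0484, 107.8269, 235.549, 229.4174]
pso_dispatch = [24.9737, 10.0, 102.6617, 110.6343, 232.6834, 219.0468]

for name, p in [("GA", ga_dispatch), ("PSO", pso_dispatch)]:
    print(name, round(sum(p), 3))   # both print 700.0
```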
4. CONCLUSION

In this paper, both conventional GA- and PSO-based economic dispatch for generation cost reduction were comparatively investigated on two sample cases (a six-generator system with and without loss). The results obtained were satisfactory for both approaches, but PSO performed better than GA from the economic viewpoint, owing to its better convergence behaviour and efficient population generation. As future work, GA and PSO for ELD problems could adopt new, efficient operators to control and enhance the instantaneous population for better and faster convergence.

REFERENCES

1. R. C. Eberhart and J. Kennedy, "A new optimizer using particle swarm theory," in Proceedings of the 6th International Symposium on Micro Machine and Human Science, Nagoya, Japan, IEEE Service Center, Piscataway, NJ, pp. 39-43, 1995.

2. R. Chakrabarti, P. K. Chattopadhyay, M. Basu, and C. K. Panigrahi, "Particle swarm optimization technique for dynamic economic dispatch," IE(I) Journal-EL, vol. 87, pp. 48-54, 2006.

3. B. N. S. Rahimullah, E. I. Ramlan and T. K. A. Rahman, "Evolutionary Approach for Solving Economic Dispatch in Power System," in Proceedings of the IEEE/PES National Power Engineering Conference, vol. 1, pp. 32-36, Dec. 2003.

4. M. Younes, S. Hadjeri, S. Zidi, S. Houari and M. Laarioua, "Economic Power Dispatch using an Ant Colony Optimization Method," 10th International Conference on Sciences and Techniques of Automatic Control & Computer Engineering, Hammamet, Tunisia, pp. 785-794, December 20-22, 2009.

5. K. A. De Jong, W. M. Spears and D. F. Gordon, "Using Markov chains to analyze GAFOs," in D. Whitley and M. Vose (eds.), Foundations of Genetic Algorithms 3, Morgan Kaufmann, San Mateo, CA, pp. 115-137, 1995.

6. J. Rees and G. J. Koehler, "An investigation of GA performance results for different cardinality alphabets," in L. D. Davis, K. De Jong, M. D. Vose and L. D. Whitley (eds.), Evolutionary Algorithms: IMA Volumes in Mathematics and its Applications, vol. 111, Springer-Verlag, New York, pp. 191-206, 1999.

7. J. L. Shapiro, A. Prügel-Bennett and M. Rattray, "A statistical mechanics formulation of the dynamics of genetic algorithms," Lecture Notes in Computer Science, vol. 865, Springer-Verlag, Berlin, pp. 17-27, 1994.

8. J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. IV, pp. 1942-1948, Piscataway, NJ, 1995.

9. James Kennedy, Russell Eberhart, and Yuhui Shi. Swarm Intelligence. Morgan Kaufmann, 2001.

10. Y. Shi and R. Eberhart, "A modified particle swarm optimizer," in Proceedings of the IEEE International Conference on Evolutionary Computation, pp. 69-73, 1998.