
Successive Quadratic Programming

Successive quadratic programming (SQP) is one of the most recently developed and perhaps one of the best methods of optimization. The method has a theoretical basis related to:

1. The solution of a set of nonlinear equations using Newton's method.

2. The derivation of simultaneous nonlinear equations by applying the Kuhn-Tucker conditions to the Lagrangian of the constrained optimization problem.

Consider a nonlinear optimization problem with only equality constraints:

Minimize f(X) subject to hk(X) = 0, k = 1, 2, ..., p.

The extension to include inequality constraints will be considered at a later stage. The Lagrange function L(X, λ) corresponding to this problem is given by

L(X, λ) = f(X) + Σk λk hk(X),

where λk is the Lagrange multiplier for the kth equality constraint.
The Kuhn-Tucker necessary conditions can be stated as

∇L = ∇f + [A]λ = 0 (n equations), hk(X) = 0, k = 1, 2, ..., p,

where [A] is an n x p matrix whose kth column denotes the gradient of the function hk. The above equations represent a set of n + p nonlinear equations in the n + p unknowns xi (i = 1, ..., n) and λk (k = 1, ..., p). These nonlinear equations can be solved using Newton's method. For convenience, we rewrite them as

F(Y) = 0, where Y = {X; λ} is the vector of the n + p unknowns.

According to Newton's method, an improved solution can be found from

[∇F]jᵀ ∆Yj = −F(Yj), Yj+1 = Yj + ∆Yj,

where Yj is the solution at the start of the jth iteration, ∆Yj is the change in Yj necessary to generate the improved solution Yj+1, and [∇F]j = [∇F(Yj)] is the (n + p) x (n + p) Jacobian matrix of the nonlinear equations, whose ith column denotes the gradient of the function Fi(Y) with respect to the vector Y.
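To make the Newton step concrete, here is a minimal sketch of Newton's method for a general system F(Y) = 0. The two-equation toy system and all names (`newton_system`, `F`, `J`) are illustrative assumptions, not the lecture's equations.

```python
import numpy as np

def newton_system(F, jac, y0, tol=1e-10, max_iter=50):
    """Solve F(Y) = 0: each iteration solves jac(Yj) dY = -F(Yj), sets Yj+1 = Yj + dY."""
    y = np.asarray(y0, dtype=float)
    for _ in range(max_iter):
        fy = F(y)
        if np.linalg.norm(fy) < tol:
            break
        y = y + np.linalg.solve(jac(y), -fy)
    return y

# Toy system: x^2 + y^2 - 1 = 0 and x - y = 0, with solution (1/sqrt(2), 1/sqrt(2)).
F = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[0] - v[1]])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [1.0, -1.0]])
sol = newton_system(F, J, [1.0, 0.5])
```

Note that each iteration solves one linear system with the Jacobian; in SQP the analogous linear system is exactly the Kuhn-Tucker system assembled below.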

By substituting F(Y) = {∇L; h} and its Jacobian into the Newton step, we obtain

[ [∇²L]j  [A]j ; [A]jᵀ  [0] ] {∆Xj; ∆λj} = −{∇Lj; hj},

where [∇²L] denotes the Hessian matrix of the Lagrange function. The first set of n equations can be written separately as

[∇²L]j ∆Xj + [A]j ∆λj = −∇Lj.

Since ∇Lj = ∇fj + [A]j λj, setting λj+1 = λj + ∆λj gives

[∇²L]j ∆Xj + [A]j λj+1 = −∇fj.

This equation and the second set of p equations, hj + [A]jᵀ ∆Xj = 0, can now be combined as

[ [∇²L]j  [A]j ; [A]jᵀ  [0] ] {∆Xj; λj+1} = −{∇fj; hj}.

This system can be solved to find the change in the design vector, ∆Xj, and the new values of the Lagrange multipliers, λj+1. The iterative process can be continued until convergence is achieved.

Now consider the following quadratic programming problem: find ∆X that minimizes the quadratic objective function

Q(∆X) = ∇fᵀ ∆X + ½ ∆Xᵀ [∇²L] ∆X

subject to the linear equality constraints

hk + ∇hkᵀ ∆X = 0, k = 1, 2, ..., p.

The Lagrange function corresponding to this problem is given by

L̃ = Q(∆X) + Σk λk (hk + ∇hkᵀ ∆X),

where λk is the Lagrange multiplier associated with the kth equality constraint.

The Kuhn-Tucker necessary conditions of this quadratic program are

∇f + [∇²L] ∆X + [A]λ = 0, hk + ∇hkᵀ ∆X = 0, k = 1, 2, ..., p,

which can be identified to be the same as the matrix form derived above.
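The equality-constrained quadratic subproblem can therefore be solved in one shot from its Kuhn-Tucker conditions, which form a single linear system. A minimal sketch (the helper name `eq_qp` and the tiny numeric example are assumptions):

```python
import numpy as np

def eq_qp(H, grad_f, A, h):
    """Minimize grad_f^T dX + 0.5 dX^T H dX  s.t.  h + A^T dX = 0,
    by solving the KKT system [[H, A], [A^T, 0]] {dX; lam} = {-grad_f; -h}."""
    n, p = A.shape
    K = np.block([[H, A], [A.T, np.zeros((p, p))]])
    sol = np.linalg.solve(K, np.concatenate([-grad_f, -h]))
    return sol[:n], sol[n:]          # dX and the Lagrange multipliers

# Example: H = I, grad_f = (1, 0), one constraint dX1 + dX2 = 1 (i.e. h = -1).
dX, lam = eq_qp(np.eye(2), np.array([1.0, 0.0]),
                np.array([[1.0], [1.0]]), np.array([-1.0]))
```

Solving the KKT system directly is exactly one Newton step on the original Lagrangian conditions, which is the central observation of the method.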

This shows that the original problem can be solved iteratively by solving the quadratic programming problem defined above. In fact, when inequality constraints gj(X) ≤ 0, j = 1, ..., m, are added to the original problem, the quadratic programming subproblem becomes: find ∆X that minimizes Q(∆X) subject to

gj + ∇gjᵀ ∆X ≤ 0, j = 1, ..., m, and hk + ∇hkᵀ ∆X = 0, k = 1, ..., p,

with the Lagrange function now including multipliers for both the inequality and the equality constraints. Since the minimum of the augmented Lagrange function is involved, the successive quadratic programming method is also known as the projected Lagrangian method.

Solution Procedure

As in the case of Newton's method of unconstrained minimization, the solution vector ∆X is treated as the search direction S, and the quadratic programming subproblem (in terms of the design vector S) is restated as: find S that minimizes

Q(S) = ∇fᵀ S + ½ Sᵀ [H] S

subject to

βj gj + ∇gjᵀ S ≤ 0, j = 1, ..., m, and β̄ hk + ∇hkᵀ S = 0, k = 1, ..., p,

where [H] is a positive definite matrix that is taken initially as the identity matrix and is updated in subsequent iterations so as to converge to the Hessian matrix of the Lagrange function, and βj and β̄ are constants used to ensure that the linearized constraints do not cut off the feasible space completely. Typical values are β̄ = 0.9, with βj = 1 if gj(X) < 0 and βj = β̄ if gj(X) ≥ 0.

The subproblem above is a quadratic programming problem, and hence methods for minimizing quadratic functions can be used for its solution. Alternatively, the problem can be solved by any of the methods described in this lecture, since the gradients of the functions involved can be evaluated easily. The Lagrange multipliers associated with the solution of the subproblem are also needed; they can be evaluated from the Kuhn-Tucker conditions of the subproblem.


Once the search direction S is found by solving the subproblem, the design vector is updated as

Xj+1 = Xj + α* S,

where α* is the optimal step length along the direction S, found by minimizing a merit function φ (using an exterior penalty function approach).

The one-dimensional step length α* can be found by any of the methods discussed before for one-dimensional minimization. Once Xj+1 is found, the Hessian matrix [H] is updated to improve the quadratic approximation for the next iteration.
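A simple way to pick α* is to sample an exterior-penalty merit function along the ray Xj + αS. The quadratic penalty form, the penalty weight, and the coarse grid below are assumptions for illustration; the lecture's exact merit function is not reproduced here.

```python
import numpy as np

def merit(f, gs, x, r=100.0):
    """Exterior penalty merit: f(x) + r * sum(max(0, g_j(x))^2) (an assumed form)."""
    return f(x) + r * sum(max(0.0, g(x))**2 for g in gs)

def step_length(f, gs, x, s):
    """Coarse 1-D search for the alpha minimizing the merit along x + alpha*s."""
    alphas = np.linspace(0.0, 2.0, 201)
    vals = [merit(f, gs, x + a * s) for a in alphas]
    return alphas[int(np.argmin(vals))]

# f = (x1 - 1)^2 with constraint x1 <= 0.5: the penalty holds alpha near 0.5.
f = lambda x: (x[0] - 1.0)**2
gs = [lambda x: x[0] - 0.5]            # g(x) <= 0 means x1 <= 0.5
alpha = step_length(f, gs, np.array([0.0, 0.0]), np.array([1.0, 0.0]))
```

In practice the coarse grid would be replaced by quadratic interpolation or another of the one-dimensional methods mentioned above.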

Usually, a modified BFGS formula is used for this purpose:

[H]j+1 = [H]j − ([H]j Pj Pjᵀ [H]j) / (Pjᵀ [H]j Pj) + (γj γjᵀ) / (Pjᵀ γj),

where Pj = Xj+1 − Xj, Qj = ∇L(Xj+1) − ∇L(Xj), and

γj = θ Qj + (1 − θ) [H]j Pj, with θ = 1 if Pjᵀ Qj ≥ 0.2 Pjᵀ [H]j Pj, and θ = (0.8 Pjᵀ [H]j Pj) / (Pjᵀ [H]j Pj − Pjᵀ Qj) otherwise.

The constants 0.2 and 0.8 in this formula can be changed based on numerical experience.
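The modified BFGS rule above can be sketched directly, with the 0.2 and 0.8 constants exposed as parameters. The function name is an assumption; the damping of γ is what keeps [H] positive definite even when curvature along the step is negative.

```python
import numpy as np

def modified_bfgs(H, P, Q, c1=0.2, c2=0.8):
    """Modified BFGS update of [H] that preserves positive definiteness.
    P = X_{j+1} - X_j,  Q = gradL(X_{j+1}) - gradL(X_j)."""
    PHP = P @ H @ P
    theta = 1.0 if P @ Q >= c1 * PHP else c2 * PHP / (PHP - P @ Q)
    gamma = theta * Q + (1.0 - theta) * (H @ P)   # damped secant vector
    HP = H @ P
    return H - np.outer(HP, HP) / PHP + np.outer(gamma, gamma) / (P @ gamma)

H2 = modified_bfgs(np.eye(2), np.array([1.0, 0.0]), np.array([0.5, 0.0]))
```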

Example 1: Find the solution of the problem using the successive quadratic programming technique.

Solution: Let the starting point be X1, at which g1(X1) = g3(X1) = 0, g2(X1) = −5.8765, and f(X1) = 1.5917. The gradients of the objective and constraint functions are evaluated at X1.

We assume the matrix [H1] to be the identity matrix, and hence the objective function of the subproblem becomes Q(S) = ∇fᵀ S + ½ Sᵀ S.

Since g1 = g3 = 0, the products β1 g1 = β3 g3 = 0, and β2 = 1.0 since g2 < 0; hence the constraints of the subproblem can be expressed in terms of S. We solve this quadratic programming problem directly with the use of the Kuhn-Tucker conditions.

The Kuhn-Tucker equations can be expressed in this case as a system in S and the multipliers. By considering all possibilities of active constraints, we find the optimum solution of the quadratic programming subproblem.

The new design vector can be expressed as X2 = X1 + α* S, where α* is found by minimizing the merit function φ.

By using the quadratic interpolation technique (an unrestricted search method can also be used for simplicity), we find that φ attains its minimum value of 1.48 at α* = 64.93, which corresponds to the new design vector X2 with f(X2) = 1.38874 and g1(X2) = 0.0074932 (violated slightly).

Next we update the matrix [H] using the modified BFGS formula, with P1 = X2 − X1 and Q1 = ∇L(X2) − ∇L(X1).


We can now start another iteration by defining a new quadratic programming subproblem at X2, and continue the procedure until the optimum solution is found. Note that the objective function reduced from a value of 1.5917 to 1.38874 in one iteration as X changed from X1 to X2.
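Since the example's problem data (the objective, constraints, and starting vector) were carried in images that did not survive, here is an end-to-end run of an SQP-type solver on a small illustrative problem using SciPy's SLSQP method (a sequential least-squares QP variant of the approach described above). The problem itself is an assumption, chosen so the solution (1, 1) can be verified by hand.

```python
import numpy as np
from scipy.optimize import minimize

# minimize (x1 - 2)^2 + (x2 - 1)^2
# subject to x1^2 - x2 <= 0 and x1 + x2 <= 2; both are active at x* = (1, 1).
obj = lambda x: (x[0] - 2.0)**2 + (x[1] - 1.0)**2
cons = [{"type": "ineq", "fun": lambda x: x[1] - x[0]**2},     # x1^2 - x2 <= 0
        {"type": "ineq", "fun": lambda x: 2.0 - x[0] - x[1]}]  # x1 + x2 <= 2
res = minimize(obj, x0=[0.0, 0.0], method="SLSQP", constraints=cons)
```

At the optimum both linearized constraints are active, which is exactly the "all possibilities of active constraints" situation examined in the example.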
