Successive Linear Programming (SLP), also known as Sequential Linear Programming, is an optimization technique for approximately solving nonlinear optimization problems.[1]
Starting from an estimate of the optimal solution, the method solves a sequence of first-order approximations (i.e., linearizations) of the model. Each linearization is a linear programming problem, which can be solved efficiently. Because the linearized subproblems need not be bounded, trust regions or similar techniques are needed to guarantee convergence in theory.[2]
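The loop described above can be sketched in a few dozen lines of Python. The following is a minimal illustrative example, not a production implementation: it minimizes f(x) = x₁² + x₂² subject to the nonlinear constraint x₁x₂ ≥ 1, linearizing both at each iterate, solving the resulting LP over a box-shaped trust region |dᵢ| ≤ Δ, and accepting or rejecting steps with an exact-penalty merit function. The example problem, the tiny vertex-enumeration LP solver, the penalty weight, and the trust-region update rule are all assumptions chosen for brevity.

```python
import itertools

def solve_lp_2d(c, A, b):
    """Minimize c.d subject to A d <= b in two variables by enumerating
    vertices (intersections of constraint pairs). The trust-region box
    keeps the feasible region bounded, so a vertex optimum exists."""
    best, best_val = None, float("inf")
    for (a1, b1), (a2, b2) in itertools.combinations(list(zip(A, b)), 2):
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < 1e-12:
            continue  # parallel rows: no intersection vertex
        d = ((b1 * a2[1] - b2 * a1[1]) / det,
             (a1[0] * b2 - a2[0] * b1) / det)
        if all(ai[0] * d[0] + ai[1] * d[1] <= bi + 1e-9 for ai, bi in zip(A, b)):
            val = c[0] * d[0] + c[1] * d[1]
            if val < best_val:
                best, best_val = d, val
    return best

def merit(x, penalty=10.0):
    """Exact l1 penalty: objective plus weighted constraint violation."""
    return x[0] ** 2 + x[1] ** 2 + penalty * max(0.0, 1.0 - x[0] * x[1])

def slp(x, delta=1.0, max_iter=50):
    """Minimize f(x) = x1^2 + x2^2 s.t. x1*x2 >= 1 by SLP (illustrative)."""
    for _ in range(max_iter):
        grad_f = (2.0 * x[0], 2.0 * x[1])   # gradient of the objective
        g = 1.0 - x[0] * x[1]               # constraint written as g(x) <= 0
        grad_g = (-x[1], -x[0])             # gradient of g
        # LP rows: linearized constraint g + grad_g.d <= 0, plus |d_i| <= delta
        A = [grad_g, (1, 0), (-1, 0), (0, 1), (0, -1)]
        b = [-g, delta, delta, delta, delta]
        d = solve_lp_2d(grad_f, A, b)
        if d is None:
            delta *= 0.5                    # LP infeasible: shrink trust region
            continue
        x_new = (x[0] + d[0], x[1] + d[1])
        if merit(x_new) < merit(x):
            x = x_new                       # step reduces the merit: accept it
        else:
            delta *= 0.5                    # otherwise shrink the trust region
        if delta < 1e-8:
            break
    return x
```

Starting from x = (2, 2), the iterates converge to the known optimum x* = (1, 1) with f(x*) = 2. The penalty weight (10) is chosen to exceed the optimal Lagrange multiplier (here λ = 2), so the merit function is exact; rejected steps shrink the trust region, which is what forces convergence despite the LP always stepping to a vertex of the linearized region.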
SLP has been used widely in the petrochemical industry since the 1970s.[3]
See also
Sequential quadratic programming
Sequential linear-quadratic programming
Augmented Lagrangian method
References
(Nocedal & Wright 2006, p. 551)
(Bazaraa, Sherali & Shetty 1993, p. 432)
(Palacios-Gomez, Lasdon & Enquist 1982)
Sources
Nocedal, Jorge; Wright, Stephen J. (2006). Numerical Optimization (2nd ed.). Berlin, New York: Springer-Verlag. ISBN 978-0-387-30303-1.
Bazaraa, Mokhtar S.; Sherali, Hanif D.; Shetty, C.M. (1993). Nonlinear Programming, Theory and Applications (2nd ed.). John Wiley & Sons. ISBN 0-471-55793-5.
Palacios-Gomez, F.; Lasdon, L.; Enquist, M. (October 1982). "Nonlinear Optimization by Successive Linear Programming". Management Science. 28 (10): 1106–1120. doi:10.1287/mnsc.28.10.1106.