Constrained Optimization Problem with Rosen's

Rosen's gradient projection method: another direct method for solving constrained optimization problems.
216 Downloads
Updated 29 Dec 2020


Algorithm of Rosen's Gradient Projection Method
The procedure involved in the application of the gradient projection method can be described by the following steps:
1. Start with an initial point X1. The point X1 has to be feasible, that is,
gj(X1) ≤ 0, j = 1, 2, . . . ,m
2. Set the iteration number as i = 1.
3. If Xi is an interior feasible point (i.e., if gj(Xi) < 0 for j = 1, 2, . . . , m), set the direction of search as Si = −∇f(Xi), normalize it as
Si = −∇f(Xi) / ‖∇f(Xi)‖
and go to step 5. However, if gj(Xi) = 0 for j = j1, j2, . . . , jp, go to step 4.
4. Calculate the projection matrix Pi as
Pi = I − Np (Npᵀ Np)⁻¹ Npᵀ
where
Np = [∇gj1(Xi) ∇gj2(Xi) . . . ∇gjp(Xi)]
and find the normalized search direction Si as
Si = −Pi∇f(Xi) / ‖Pi∇f(Xi)‖
5. Test whether or not Si = 0. If Si ≠ 0, go to step 6. If Si = 0, compute the vector 𝝀 at Xi as
𝝀 = −(Npᵀ Np)⁻¹ Npᵀ ∇f(Xi)
If all the components of the vector 𝝀 are nonnegative, take Xopt = Xi and stop the iterative procedure. If some of the components of 𝝀 are negative, find the component 𝜆q that has the most negative value, form the new matrix Np by dropping the corresponding column,
Np = [∇gj1 ∇gj2 ⋯ ∇gjq−1 ∇gjq+1 ⋯ ∇gjp]
and go to step 3.
6. If Si ≠ 0, find the maximum step length 𝜆M that is permissible without violating any of the constraints as 𝜆M = min(𝜆k), 𝜆k > 0, where k is any integer among 1 to m other than j1, j2, . . . , jp. Also find the value of
df/d𝜆(𝜆M) = Siᵀ ∇f(Xi + 𝜆M Si)
If df/d𝜆(𝜆M) is zero or negative, take the step length as 𝜆i = 𝜆M. On the other hand, if df/d𝜆(𝜆M) is positive, find the minimizing step length 𝜆i* either by interpolation or by any one-dimensional minimization method, and take 𝜆i = 𝜆i*.
7. Find the new approximation to the minimum as
Xi+1 = Xi + 𝜆i Si
If 𝜆i = 𝜆M (that is, if 𝜆M ≤ 𝜆i*), some new constraints (one or more) become active at Xi+1; generate the new matrix Np to include the gradients of all active constraints evaluated at Xi+1, set the new iteration number as i = i + 1, and go to step 4. If 𝜆i = 𝜆i* and 𝜆i* < 𝜆M, no new constraint will be active at Xi+1 and hence the matrix Np remains unaltered; set the new value of i as i = i + 1, and go to step 3.
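The projection and multiplier computations in steps 4 and 5 are easiest to see in the special case of a single active linear constraint, where Np is one column n and Npᵀ Np is a scalar, so no matrix inverse is needed. The sketch below is illustrative Python (not the submitted MATLAB code), and the function name is made up:

```python
# Illustrative sketch of steps 4-5 for ONE active constraint in two
# variables (so N^T N is a scalar and no matrix inverse is needed).
def project_and_multiplier(grad_f, n):
    """grad_f: gradient of f at X_i; n: gradient of the active constraint.
    Returns (S, lam) where S = -P*grad_f with P = I - n n^T/(n^T n),
    and lam = -(n^T grad_f)/(n^T n) is the step-5 multiplier."""
    ntn = n[0]**2 + n[1]**2                 # scalar N^T N
    ntg = n[0]*grad_f[0] + n[1]*grad_f[1]   # scalar N^T grad f
    lam = -ntg / ntn                        # Lagrange multiplier estimate
    # P*grad_f = grad_f - n*(n^T grad_f)/(n^T n)
    return [-(grad_f[k] - n[k]*ntg/ntn) for k in range(2)], lam
```

With several active constraints the same formulas apply with Np a matrix and an actual (Npᵀ Np)⁻¹. By construction the returned direction S is orthogonal to the active-constraint gradient, so for a linear constraint a step along S stays on the active surface.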

The following example can also be solved with this code:
Minimize f (x1, x2) = x1^2 + x2^2 − 2x1 − 4x2
subject to
g1(x1, x2) = x1 + 4x2 − 5 ≤ 0
g2(x1, x2) = 2x1 + 3x2 − 6 ≤ 0
g3(x1, x2) = −x1 ≤ 0
g4(x1, x2) = −x2 ≤ 0
starting from the point X1 = (1.0, 1.0)ᵀ.
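One iteration of the method on this example can be traced by hand: at X1 = (1, 1) only g1 = x1 + 4x2 − 5 is active, so the step projects −∇f onto the line x1 + 4x2 = 5 and does an exact line search (f is quadratic with Hessian 2I, so the minimizing step length has a closed form). A minimal Python sketch under that single-active-constraint assumption (not the submitted MATLAB code; names are illustrative):

```python
def grad_f(x):                          # f = x1^2 + x2^2 - 2*x1 - 4*x2
    return [2*x[0] - 2, 2*x[1] - 4]

def rosen_step(x, n):
    """One iteration with a single active linear constraint of normal n."""
    g = grad_f(x)
    ntn = n[0]**2 + n[1]**2
    ntg = n[0]*g[0] + n[1]*g[1]
    lam = -ntg / ntn                                 # step-5 multiplier
    s = [-(g[k] - n[k]*ntg/ntn) for k in range(2)]   # S = -P grad f (unnormalized)
    # Exact line search: since the Hessian of f is 2I, f(x + t*s) is
    # quadratic in t with minimizer t* = -(s^T g) / (2 s^T s).
    sts = s[0]**2 + s[1]**2
    stg = s[0]*g[0] + s[1]*g[1]
    t = -stg / (2*sts) if sts > 1e-12 else 0.0
    return [x[0] + t*s[0], x[1] + t*s[1]], lam

x_new, lam = rosen_step([1.0, 1.0], [1.0, 4.0])      # g1 is active at (1, 1)
```

This lands on X2 = (13/17, 18/17) ≈ (0.7647, 1.0588) without leaving the feasible region (the step is well below the maximum step length 𝜆M of step 6), and the multiplier 𝝀 = 8/17 is nonnegative, so step 5 terminates with Xopt = X2 and f(Xopt) = −69/17 ≈ −4.0588.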

Cite As

Narayan Das Ahirwar (2026). Constrained Optimization Problem with Rosen's (https://www.mathworks.com/matlabcentral/fileexchange/84923-constrained-optimization-problem-with-rosen-s), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2020b
Compatible with any release
Platform Compatibility
Windows macOS Linux
Version 1.0.0