Optimization Technique

108_Second-order Conditions

elif 2024. 3. 17. 22:59

In this post, I'll study the second-order necessary and sufficient conditions for optimization problems with constraints.

Similar to the unconstrained case, the second-order term of the Taylor expansion of the function at ${x^*}$ is used to determine whether that point is a local minimum. In the presence of constraints, the active constraints at ${x^*}$ must also be considered to determine a feasible change $d$.

 

Any feasible $d$ must keep the active constraints satisfied and therefore lie in their tangent hyperplane. Such $d$ must be orthogonal to the gradients of the active constraints, since the gradient of a constraint is perpendicular to that constraint's tangent hyperplane. To derive the second-order condition, we perform a Taylor expansion of the Lagrangian function and consider $d$ that satisfies the above conditions. If at ${x^*}$, for all $d$ in the tangent hyperplane of the constraints, the second-order term of the Taylor expansion is positive, this serves as a sufficient condition for a local minimum. The necessary condition is that the second-order term must be non-negative. This can be summarized in the formulas as follows.

 

If ${x^*}$ satisfies the first-order KKT necessary conditions for the general optimization problem, the Hessian of the Lagrangian function $L$ at ${x^*}$ can be defined as follows, where the ${h_i}$ are the equality constraints with multipliers $v_i^*$ and the ${g_j}$ are the inequality constraints with multipliers $u_j^*$.

$${\nabla ^2}L({x^*}) = {\nabla ^2}f({x^*}) + \sum\limits_i {v_i^*{\nabla ^2}{h_i}({x^*})} + \sum\limits_j {u_j^*{\nabla ^2}{g_j}({x^*})}$$

Assume there exist nonzero feasible directions $d \ne 0$ at ${x^*}$ that satisfy the following linear systems, the second of which runs over the active inequality constraints, i.e. those with ${g_j}({x^*}) = 0$.

$$\nabla {h_i}{({x^*})^T}d = 0,\quad i = 1, \ldots ,p$$

$$\nabla {g_j}{({x^*})^T}d = 0\quad {\text{for all active }}j$$

If ${x^*}$ is a local minimum point for the optimization problem, then the following second-order necessary condition holds for all such $d$.

$${d^T}{\nabla ^2}L({x^*})d \geqslant 0$$

Conversely, if ${d^T}{\nabla ^2}L({x^*})d > 0$ for every such $d \ne 0$, this second-order sufficient condition guarantees that ${x^*}$ is a strict local minimum.

If the second-order necessary condition is not satisfied, ${x^*}$ cannot be a local minimum point.
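As an illustrative sketch, these conditions can be checked numerically: stack the gradients of the active constraints, compute an orthonormal basis $Z$ of their null space (the tangent hyperplane), and inspect the eigenvalues of the reduced Hessian ${Z^T}{\nabla ^2}L\,Z$. The matrices in the example call below are hypothetical placeholders, not taken from this post.

```python
import numpy as np

def second_order_check(hess_L, active_grads, tol=1e-10):
    """Classify x* via the reduced Hessian Z^T (hess_L) Z, where the
    columns of Z span the null space of the active constraint gradients
    (i.e. the tangent hyperplane of the active constraints)."""
    A = np.atleast_2d(np.asarray(active_grads, dtype=float))
    _, s, vt = np.linalg.svd(A)
    Z = vt[int(np.sum(s > tol)):].T          # null-space basis of A
    if Z.shape[1] == 0:                      # no feasible direction d != 0
        return "vacuous"
    eigs = np.linalg.eigvalsh(Z.T @ np.asarray(hess_L, dtype=float) @ Z)
    if np.all(eigs > tol):
        return "sufficient"                  # strict local minimum
    if np.all(eigs >= -tol):
        return "necessary"                   # necessary condition holds
    return "fails"                           # cannot be a local minimum

# Hypothetical data: hess_L is not positive definite on the whole space,
# but its restriction to the tangent hyperplane {d : d1 + d2 = 0} is.
print(second_order_check([[3.0, -3.0], [-3.0, 3.0]], [[1.0, 1.0]]))
```

Working in the reduced space avoids having to enumerate tangent directions by hand; any $d$ in the tangent hyperplane is $Zc$ for some coefficient vector $c$.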

Let's solve an example.

 

 

Check the sufficient conditions at the following points, which satisfy the KKT necessary conditions.

 

 

The Hessians of the cost and constraint functions are as follows.

 

 

The eigenvalues of ${\nabla ^2}g$ are ${\lambda _1} = 2$ and ${\lambda _2} = 2$, and since both eigenvalues are positive, the function $g$ is convex. Therefore, the feasible set defined by $g(x) \leqslant 0$ is convex. However, the eigenvalues of ${\nabla ^2}f$ are $-1$ and $5$, so $f$ is not convex. The Hessian of the Lagrangian function is as follows.
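This eigenvalue test is easy to reproduce numerically. The matrices below are hypothetical stand-ins chosen only to have the eigenvalues quoted above ($2, 2$ for ${\nabla ^2}g$ and $-1, 5$ for ${\nabla ^2}f$); they are not taken from this post.

```python
import numpy as np

# Hypothetical symmetric Hessians, chosen only to reproduce the
# eigenvalues discussed in the text (not taken from this post).
hess_g = np.array([[2.0, 0.0], [0.0, 2.0]])    # eigenvalues 2, 2
hess_f = np.array([[2.0, -3.0], [-3.0, 2.0]])  # eigenvalues -1, 5

# A twice-differentiable function is convex iff its Hessian is positive
# semidefinite everywhere; eigvalsh returns the eigenvalues of a
# symmetric matrix in ascending order.
print(np.linalg.eigvalsh(hess_g))  # all positive -> g is convex
print(np.linalg.eigvalsh(hess_f))  # one negative -> f is not convex
```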

 

 

In the case of the first point, ${\nabla ^2}L = {\nabla ^2}f$, because the constraint is inactive there and its multiplier is zero. In such a case, the test is the same as for an unconstrained problem: for all $d$, ${d^T}{\nabla ^2}f({x^*})d > 0$, i.e. ${\nabla ^2}f$ must be positive definite at ${x^*}$. Since the eigenvalues of ${\nabla ^2}f$ are not all positive, this condition is not satisfied, so the first point fails the second-order sufficient condition. In fact, because one eigenvalue is negative, the quadratic form is negative along the corresponding eigenvector, so the necessary condition fails as well and the first point cannot be a local minimum.
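A quick way to test positive definiteness in code is to attempt a Cholesky factorization, which exists exactly for symmetric positive definite matrices. The matrix below is again a hypothetical stand-in with eigenvalues $-1$ and $5$, not the post's actual Hessian.

```python
import numpy as np

def is_positive_definite(H):
    """Cholesky factorization succeeds iff H is symmetric positive definite."""
    try:
        np.linalg.cholesky(np.asarray(H, dtype=float))
        return True
    except np.linalg.LinAlgError:
        return False

# Hypothetical stand-in for the Hessian of f, with eigenvalues -1 and 5.
hess_f = [[2.0, -3.0], [-3.0, 2.0]]
print(is_positive_definite(hess_f))  # False -> the unconstrained test fails
```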

 

At the second and third points, the following condition is satisfied.

 

 

The Hessian of the Lagrangian is not positive definite at either point. If we define $d = ({d_1}, {d_2})$, then $\nabla {g^T}d = 0$ can be represented as follows.

 

 

Therefore, ${d_1} = -{d_2} = c$ with $c \ne 0$. A nonzero $d$ that satisfies $\nabla {g^T}d = 0$ is $d = c(1, -1)$. Thus, the sufficient condition is as follows.

 

 

Therefore, the second and third points satisfy the sufficient conditions and are local minima. Although ${\nabla ^2}L$ is not positive definite at ${x^*}$, this is enough, because the sufficient condition only needs to hold for $d$ in the tangent hyperplane. Moreover, since $f$ is continuous and the feasible set is closed and bounded, a global minimum is guaranteed to exist, and it must be attained at one of the KKT points.
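As a final sanity check of the sufficient condition at these points, one can evaluate ${d^T}{\nabla ^2}L\,d$ directly along the tangent direction $d = c(1, -1)$. The Lagrangian Hessian below is a hypothetical stand-in that is positive semidefinite but not positive definite; it is not taken from this post.

```python
import numpy as np

# Hypothetical stand-in for the Hessian of the Lagrangian: it is not
# positive definite (one zero eigenvalue), yet the quadratic form is
# strictly positive along the tangent direction d = (1, -1).
hess_L = np.array([[3.0, -3.0], [-3.0, 3.0]])
d = np.array([1.0, -1.0])  # tangent direction, taking c = 1

quad = d @ hess_L @ d      # d^T (hess_L) d
print(quad > 0)            # strict positivity -> sufficient condition holds
```

Because the quadratic form scales as $c^2$, checking $c = 1$ covers every nonzero $c$.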

Hence, the second and third points are global minimum points.
