Optimization Technique

118_Conjugate Gradient Algorithm


The Conjugate Gradient (CG) algorithm is a method for solving systems of linear equations and for optimizing multivariable functions. It is particularly useful for large, sparse systems where direct methods are computationally inefficient. The algorithm iteratively improves an initial estimate of the solution until a convergence criterion is met.
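
As a quick usage illustration, SciPy exposes a nonlinear conjugate gradient solver through `scipy.optimize.minimize` with `method='CG'`. The sketch below is a minimal example under assumed choices: the Rosenbrock test function, its gradient, and the starting point are stand-ins, not part of the original post.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed test problem: the Rosenbrock function and its analytic gradient
def f(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def grad_f(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
        200 * (x[1] - x[0] ** 2),
    ])

x0 = np.array([-1.2, 1.0])  # starting point x^(0)
res = minimize(f, x0, jac=grad_f, method='CG', options={'gtol': 1e-6})
print(res.x)  # converges toward the known minimum at [1, 1]
```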


First, set $x^{(0)}$ as the starting point and the iteration counter $k = 0$.

$$c^{(0)} = \nabla f\left( x^{(0)} \right), \qquad d^{(0)} = -c^{(0)}$$

Here, $c^{(0)} = \nabla f(x^{(0)})$ represents the gradient of the cost function at the starting point, and the initial search direction is its negative. With the convergence parameter set to $\varepsilon$, if $\left\| c^{(0)} \right\| < \varepsilon$ the process stops; otherwise, the step size ${\alpha _k} = \alpha$ is calculated.

$$\alpha_k = \mathop{\arg\min}_{\alpha \ge 0} \, f\left( x^{(k)} + \alpha \, d^{(k)} \right)$$

Subsequently, the design variables are updated as follows.

$$x^{(k+1)} = x^{(k)} + \alpha_k \, d^{(k)}$$

Based on the updated design variables, the gradient of the cost function is recalculated.

$$c^{(k+1)} = \nabla f\left( x^{(k+1)} \right)$$

Here, $\left\| c^{(k+1)} \right\|$ is evaluated. If $\left\| c^{(k+1)} \right\| < \varepsilon$, the process stops; otherwise, it continues.

The new conjugate direction is calculated according to the following formula (the Fletcher–Reeves update).

$$d^{(k+1)} = -c^{(k+1)} + \beta_k \, d^{(k)}, \qquad \beta_k = \left( \frac{\left\| c^{(k+1)} \right\|}{\left\| c^{(k)} \right\|} \right)^2$$

The step size is then recalculated along the new direction.

$$\alpha_{k+1} = \mathop{\arg\min}_{\alpha \ge 0} \, f\left( x^{(k+1)} + \alpha \, d^{(k+1)} \right)$$

Set $k = k + 1$ and repeat this process until the convergence criterion is met.
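
Putting the steps above together, here is a minimal NumPy/SciPy sketch of the iteration as described. It is a sketch under stated assumptions: the one-dimensional step-size subproblem is solved numerically with `scipy.optimize.minimize_scalar`, and the convex quadratic test function, starting point, and tolerance are illustrative choices.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def conjugate_gradient(f, grad, x0, eps=1e-6, max_iter=200):
    """Fletcher-Reeves conjugate gradient, following the steps above."""
    x = np.asarray(x0, dtype=float)
    c = grad(x)                      # c^(0): gradient at the starting point
    d = -c                           # initial direction is steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(c) < eps:  # stopping test: ||c|| < epsilon
            break
        # Step size: minimize f(x + alpha d) along the current direction
        alpha = minimize_scalar(lambda a: f(x + a * d)).x
        x = x + alpha * d            # update the design variables
        c_new = grad(x)              # recompute the gradient
        # Fletcher-Reeves coefficient and the new conjugate direction
        beta = (np.linalg.norm(c_new) / np.linalg.norm(c)) ** 2
        d = -c_new + beta * d
        c = c_new
    return x

# Illustrative convex quadratic test problem (an assumption, not from the post)
f = lambda x: x[0] ** 2 + 2 * x[1] ** 2 + 2 * x[0] * x[1]
grad = lambda x: np.array([2 * x[0] + 2 * x[1], 2 * x[0] + 4 * x[1]])
print(conjugate_gradient(f, grad, x0=[3.0, -2.0]))  # approaches [0, 0]
```

A quadratic makes a convenient sanity check because, with exact line searches and exact arithmetic, CG minimizes an $n$-variable convex quadratic in at most $n$ iterations.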

 

The major difference between the Conjugate Gradient method and steepest descent lies in how the search direction is computed. Instead of simply following the negative gradient, the Conjugate Gradient method adds a scaled contribution of the previous direction to the current negative gradient, which suppresses the zigzagging behavior of steepest descent and allows more efficient convergence in high-dimensional optimization problems.
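
To isolate that difference, the small helper below contrasts the two direction updates; the sample gradient and direction vectors are purely illustrative, and forcing the Fletcher–Reeves coefficient to zero recovers the steepest-descent direction.

```python
import numpy as np

def next_direction(c_new, c_prev, d_prev, conjugate=True):
    """Direction update: steepest descent vs. Fletcher-Reeves CG."""
    if not conjugate:
        return -c_new  # steepest descent keeps no memory of past directions
    beta = (np.linalg.norm(c_new) / np.linalg.norm(c_prev)) ** 2
    return -c_new + beta * d_prev  # CG deflects the negative gradient

# Illustrative vectors: same current gradient, with and without history
c_prev = np.array([2.0, 0.0])
c_new = np.array([0.0, 1.0])
d_prev = np.array([-2.0, 0.0])
print(next_direction(c_new, c_prev, d_prev, conjugate=False))  # pure -c_new
print(next_direction(c_new, c_prev, d_prev, conjugate=True))   # -c_new + 0.25 * d_prev
```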
