Optimization Technique

123_BFGS Method Example

elif 2024. 4. 1. 16:14

In this post, I'll solve an example using the BFGS algorithm explained in the previous post (122_BFGS Method).

 

 

Set ${x^{(0)}} = (1,2)$, ${{\text{H}}^{(0)}} = {\text{I}}$, $\varepsilon = 0.001$, and $k=0$. Setting the initial Hessian approximation to the identity matrix makes the first step equivalent to a steepest-descent step. First, calculate the gradient vector ${\text{c}}$.
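The equation images from the original post did not survive, so the worked numbers below are a reconstruction under an assumption: the cost function is taken to be the standard textbook quadratic $f(x) = 5x_1^2 + 2{x_1}{x_2} + x_2^2 + 7$. Under that assumption, the gradient is

$$ {\text{c}} = \nabla f(x) = \begin{bmatrix} 10{x_1} + 2{x_2} \\ 2{x_1} + 2{x_2} \end{bmatrix}, \qquad {{\text{c}}^{(0)}} = \nabla f(1,2) = \begin{bmatrix} 14 \\ 6 \end{bmatrix} $$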

 

 

Calculate the norm of the gradient and check the stopping criterion.
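Under the assumed cost function, the check works out as

$$ \left\| {{\text{c}}^{(0)}} \right\| = \sqrt{14^2 + 6^2} = \sqrt{232} \approx 15.23 > \varepsilon $$

so the iteration continues.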

 

 

Since the norm is greater than $\varepsilon$, calculate the direction of design change and express the updated design variables in terms of the step size.
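With ${{\text{H}}^{(0)}} = {\text{I}}$, the search direction is simply the negative gradient; continuing with the assumed numbers:

$$ {d^{(0)}} = -{\left( {{\text{H}}^{(0)}} \right)^{-1}}{{\text{c}}^{(0)}} = \begin{bmatrix} -14 \\ -6 \end{bmatrix}, \qquad x(\alpha) = {x^{(0)}} + \alpha\,{d^{(0)}} = \begin{bmatrix} 1 - 14\alpha \\ 2 - 6\alpha \end{bmatrix} $$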

 

 

Substitute the updated design variables, written in terms of the step size $\alpha$, into the cost function and minimize with respect to $\alpha$ to find the step size.
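With the assumed cost function this becomes a quadratic in $\alpha$, so the exact step size follows from setting its derivative to zero:

$$ f(\alpha) = 1184{\alpha^2} - 232\alpha + 20, \qquad \frac{df}{d\alpha} = 2368\alpha - 232 = 0 \;\Rightarrow\; {\alpha_0} = \frac{232}{2368} \approx 0.098 $$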

 

 

Insert the calculated step size into the update formula to finally obtain the new design variables.
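Under the same assumption:

$$ {x^{(1)}} = {x^{(0)}} + {\alpha_0}{d^{(0)}} \approx \begin{bmatrix} 1 - 14(0.098) \\ 2 - 6(0.098) \end{bmatrix} = \begin{bmatrix} -0.372 \\ 1.412 \end{bmatrix} $$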

 

 

Calculate the correction matrices ${{\text{D}}^{(0)}}$ and ${{\text{E}}^{(0)}}$ using the quasi-Newton condition.
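For reference, the direct Hessian update consistent with the ${\text{D}}$/${\text{E}}$ notation used in this series is

$$ {{\text{H}}^{(k+1)}} = {{\text{H}}^{(k)}} + {{\text{D}}^{(k)}} + {{\text{E}}^{(k)}}, \qquad {{\text{D}}^{(k)}} = \frac{{y^{(k)}}{y^{(k)T}}}{{y^{(k)}} \cdot {s^{(k)}}}, \qquad {{\text{E}}^{(k)}} = \frac{{{\text{c}}^{(k)}}{{\text{c}}^{(k)T}}}{{{\text{c}}^{(k)}} \cdot {d^{(k)}}} $$

where ${s^{(k)}} = {\alpha_k}{d^{(k)}}$ is the change in design and ${y^{(k)}} = {{\text{c}}^{(k+1)}} - {{\text{c}}^{(k)}}$ is the change in gradient. Carrying the assumed numbers through, ${{\text{c}}^{(1)}} \approx (-0.892,\ 2.081)$ and

$$ {{\text{H}}^{(1)}} \approx \begin{bmatrix} 9.91 & 2.21 \\ 2.21 & 1.52 \end{bmatrix} $$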

 

 

For the start of the second iteration, recalculate the norm of the gradient vector to check the criterion again.
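With the assumed numbers:

$$ \left\| {{\text{c}}^{(1)}} \right\| = \sqrt{(-0.892)^2 + (2.081)^2} \approx 2.26 > \varepsilon $$

so a second iteration is needed.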

 

 

Calculate the search direction.
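This time the updated ${{\text{H}}^{(1)}}$ enters the calculation; with the assumed numbers:

$$ {d^{(1)}} = -{\left( {{\text{H}}^{(1)}} \right)^{-1}}{{\text{c}}^{(1)}} \approx \begin{bmatrix} 0.583 \\ -2.214 \end{bmatrix} $$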

 

 

Calculate the step size and use it, together with the search direction, to update $x$.
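Minimizing $f({x^{(1)}} + \alpha\,{d^{(1)}})$ over $\alpha$ as before gives, with the assumed numbers, ${\alpha_1} \approx 0.638$ and

$$ {x^{(2)}} = {x^{(1)}} + {\alpha_1}{d^{(1)}} \approx \begin{bmatrix} 0 \\ 0 \end{bmatrix} $$

which is exactly the minimizer of the assumed quadratic; on a two-variable quadratic, BFGS with exact line search terminates in two iterations.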

 

 

Based on the calculated values, compute the correction matrices.
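Applying the same ${\text{D}}$/${\text{E}}$ formulas with ${s^{(1)}} = {\alpha_1}{d^{(1)}}$ and ${y^{(1)}} = {{\text{c}}^{(2)}} - {{\text{c}}^{(1)}}$ gives, under the assumption,

$$ {{\text{H}}^{(2)}} = {{\text{H}}^{(1)}} + {{\text{D}}^{(1)}} + {{\text{E}}^{(1)}} \approx \begin{bmatrix} 10.0 & 2.0 \\ 2.0 & 2.0 \end{bmatrix} $$

while the true Hessian of the assumed cost function is $\begin{bmatrix} 10 & 2 \\ 2 & 2 \end{bmatrix}$.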

 

 

This shows that ${{\text{H}}^{(2)}}$ is very close to the Hessian of the given cost function. By repeating this process, the Hessian approximation improves with each iteration and the optimal solution is obtained.
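As a compact check of the whole procedure, here is a minimal Python sketch of this BFGS variant (direct Hessian updating with an exact line search). The cost function, its Hessian, and the starting point are the assumptions stated above, not recovered from the original images.

```python
import numpy as np

# Assumed quadratic cost function: f(x) = 5*x1^2 + 2*x1*x2 + x2^2 + 7
A = np.array([[10.0, 2.0], [2.0, 2.0]])  # its (constant) Hessian

def grad(x):
    # Gradient c = (10*x1 + 2*x2, 2*x1 + 2*x2)
    return np.array([10 * x[0] + 2 * x[1], 2 * x[0] + 2 * x[1]])

def bfgs(x, eps=1e-3, max_iter=50):
    H = np.eye(len(x))                   # H^(0) = I: first step is steepest descent
    c = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(c) <= eps:     # stopping criterion ||c|| <= eps
            break
        d = -np.linalg.solve(H, c)       # search direction from H d = -c
        # Exact line search: for a quadratic, alpha = -(c . d) / (d . A d)
        alpha = -(c @ d) / (d @ A @ d)
        s = alpha * d                    # change in design
        x = x + s
        c_new = grad(x)
        y = c_new - c                    # change in gradient
        D = np.outer(y, y) / (y @ s)     # correction matrix D^(k)
        E = np.outer(c, c) / (c @ d)     # correction matrix E^(k)
        H = H + D + E                    # direct Hessian update
        c = c_new
    return x, H

x_opt, H_final = bfgs(np.array([1.0, 2.0]))
print(x_opt)    # ~ [0, 0]
print(H_final)  # ~ [[10, 2], [2, 2]], close to the true Hessian
```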
