The concept of Taylor's expansion is essential in optimal design and numerical methods. Taylor's expansion is a mathematical formula that approximates the value of a function near a specific point using the derivatives of the function at that point. In other words, it generalizes linear approximation to a polynomial approximation. It is very useful when an approximation can replace the accurate evaluation of a difficult function.
For a function $f(x)$ of one variable, as an example, the Taylor expansion of $f(x)$ around the point ${x^*}$ is as follows.

$$f(x) = f({x^*}) + f'({x^*})(x - {x^*}) + \frac{1}{2}f''({x^*}){(x - {x^*})^2} + R$$
Here, $R$ represents the remainder term, which is small in magnitude compared to the preceding terms if ${x^*}$ is sufficiently close to $x$. If we set $x - {x^*} = d$, then the Taylor expansion can be expressed as a quadratic polynomial in terms of $d$.

$$f({x^*} + d) \approx f({x^*}) + f'({x^*})\,d + \frac{1}{2}f''({x^*})\,{d^2}$$
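The one-variable case can be checked numerically. The sketch below (the choice of $f(x)=e^x$, the expansion point ${x^*}=0$, and the step $d$ are illustrative, not from the post) evaluates the quadratic Taylor polynomial and compares it with the exact value.

```python
import math

def taylor2(f, df, d2f, x_star, d):
    """Quadratic Taylor polynomial: f(x*) + f'(x*) d + (1/2) f''(x*) d^2."""
    return f(x_star) + df(x_star) * d + 0.5 * d2f(x_star) * d ** 2

# Illustrative example: f(x) = e^x, whose derivatives are all e^x.
f = math.exp    # f(x)
df = math.exp   # f'(x)
d2f = math.exp  # f''(x)

d = 0.1
approx = taylor2(f, df, d2f, 0.0, d)  # 1 + 0.1 + 0.005 = 1.105
exact = math.exp(d)                   # ~ 1.10517
# The remainder |exact - approx| shrinks like d^3 as d -> 0.
```

For $d = 0.1$ the quadratic approximation already agrees with the exact value to about three decimal places, and halving $d$ shrinks the error by roughly a factor of eight.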
For a function of two variables, $f({x_1},{x_2})$, the Taylor expansion around the point $\left( {x_1^*,x_2^*} \right)$ is as follows.

$$f({x_1},{x_2}) \approx f(x_1^*,x_2^*) + \frac{{\partial f}}{{\partial {x_1}}}{d_1} + \frac{{\partial f}}{{\partial {x_2}}}{d_2} + \frac{1}{2}\left( {\frac{{{\partial ^2}f}}{{\partial x_1^2}}d_1^2 + 2\frac{{{\partial ^2}f}}{{\partial {x_1}\partial {x_2}}}{d_1}{d_2} + \frac{{{\partial ^2}f}}{{\partial x_2^2}}d_2^2} \right)$$
Here, ${d_1} = {x_1} - x_1^*$ and ${d_2} = {x_2} - x_2^*$, all partial derivatives are evaluated at the given point, and $R$ is omitted. Using sigma notation for the sums, the above expression can be represented as follows.

$$f({\bf{x}}) \approx f({{\bf{x}}^*}) + \sum\limits_{i = 1}^2 {\frac{{\partial f}}{{\partial {x_i}}}{d_i}}  + \frac{1}{2}\sum\limits_{i = 1}^2 {\sum\limits_{j = 1}^2 {\frac{{{\partial ^2}f}}{{\partial {x_i}\partial {x_j}}}{d_i}{d_j}} }$$
The two expressions are identical. Here $\partial f/\partial {x_i}$ evaluated at the given point ${{\bf{x}}^*}$ forms the gradient of the function, and ${\partial ^2}f/\partial {x_i}\partial {x_j}$ evaluated at the same point forms the Hessian. Therefore, the Taylor expansion can be represented in the following matrix notation.

$$f({\bf{x}}) \approx f({{\bf{x}}^*}) + \nabla f{({{\bf{x}}^*})^T}{\bf{d}} + \frac{1}{2}{{\bf{d}}^T}{\bf{H}}({{\bf{x}}^*})\,{\bf{d}}$$
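The matrix form can be verified with a concrete two-variable function. In the sketch below, $f({x_1},{x_2}) = x_1^2 + 3{x_1}{x_2} + 2x_2^2$ and the point ${{\bf{x}}^*} = (1, 2)$ are illustrative choices, not taken from the post; since this $f$ is itself quadratic, its second-order Taylor expansion reproduces it exactly.

```python
import numpy as np

def f(x):
    # Illustrative quadratic test function: x1^2 + 3 x1 x2 + 2 x2^2
    return x[0] ** 2 + 3 * x[0] * x[1] + 2 * x[1] ** 2

x_star = np.array([1.0, 2.0])

# Gradient and Hessian evaluated analytically at x*.
grad = np.array([2 * x_star[0] + 3 * x_star[1],   # df/dx1 = 2 x1 + 3 x2
                 3 * x_star[0] + 4 * x_star[1]])  # df/dx2 = 3 x1 + 4 x2
H = np.array([[2.0, 3.0],
              [3.0, 4.0]])                        # d2f/dxi dxj (constant here)

d = np.array([0.05, -0.03])                       # small step d = x - x*

# Matrix form of the quadratic Taylor expansion.
approx = f(x_star) + grad @ d + 0.5 * d @ H @ d
exact = f(x_star + d)
# Because f is quadratic, approx equals exact up to rounding.
```

For a non-quadratic $f$ the same code gives an approximation whose error is of third order in $\left\| {\bf{d}} \right\|$.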
Here, ${\bf{x}} = ({x_1},{x_2})$, ${{\bf{x}}^*} = (x_1^*,x_2^*)$, and ${\bf{d}} = {\bf{x}} - {{\bf{x}}^*}$. In matrix notation, the Taylor expansion generalizes to a function of $n$ variables, where ${\bf{x}}$, ${{\bf{x}}^*}$, and $\nabla f$ are $n$-dimensional vectors and ${\bf{H}}$ is the $n \times n$ Hessian matrix. If we define $\delta f = f({\bf{x}}) - f({{\bf{x}}^*})$, the expansion can be represented as follows.

$$\delta f \approx \nabla f{({{\bf{x}}^*})^T}{\bf{d}} + \frac{1}{2}{{\bf{d}}^T}{\bf{H}}\,{\bf{d}}$$
In the above equation, by retaining only the first term, we obtain the first-order change $\delta f$ of $f({\bf{x}})$ at ${{\bf{x}}^*}$.

$$\delta f \approx \nabla f{({{\bf{x}}^*})^T}{\bf{d}}$$
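The first-order change can also be checked numerically. In the sketch below, the function $f$ and the point ${{\bf{x}}^*}$ are illustrative choices, not from the post; for a small step ${\bf{d}}$, the linear estimate $\nabla f{({{\bf{x}}^*})^T}{\bf{d}}$ should match the true change to within a term of order ${\left\| {\bf{d}} \right\|^2}$.

```python
import numpy as np

def f(x):
    # Illustrative non-quadratic test function: sin(x1) + x1 x2
    return np.sin(x[0]) + x[0] * x[1]

x_star = np.array([0.5, 1.0])
grad = np.array([np.cos(x_star[0]) + x_star[1],  # df/dx1 = cos(x1) + x2
                 x_star[0]])                     # df/dx2 = x1

d = np.array([1e-3, -2e-3])                      # small step d = x - x*

delta_f_linear = grad @ d                        # first-order change
delta_f_true = f(x_star + d) - f(x_star)         # actual change
# The gap between the two is O(||d||^2) for small d.
```

This first-order change is exactly the quantity used in gradient-based optimization: choosing ${\bf{d}}$ opposite to $\nabla f$ makes $\delta f$ negative, which is the basis of descent directions.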