
Algorithms for Convex Problems



Method of Steepest Descent

This method is also called the Gradient method or Cauchy's method. It is based on the following update −

$$x_{k+1}=x_k+\alpha_k d_k$$

$d_k=-\bigtriangledown f\left ( x_k \right )$ or $d_k=-\frac{\bigtriangledown f\left ( x_k \right )}{\left \| \bigtriangledown f\left ( x_k \right ) \right \|}$

Let $\phi \left ( \alpha \right )=f\left ( x_k+\alpha d_k \right )$

By differentiating $\phi$ with respect to $\alpha$ and equating it to zero, we obtain $\alpha_k$.
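For example, if $f$ is the quadratic $f\left ( x \right )=\frac{1}{2}x^TQx-b^Tx$ (the form used for the conjugate gradient method below), the exact step length has a closed form, since $\phi'\left ( \alpha \right )=d_k^T\bigtriangledown f\left ( x_k+\alpha d_k \right )=d_k^T\left ( g_k+\alpha Qd_k \right )=0$ gives

$$\alpha_k=-\frac{d_{k}^{T}g_k}{d_{k}^{T}Qd_k}=\frac{g_{k}^{T}g_k}{g_{k}^{T}Qg_k} \; \text{when } d_k=-g_k, \text{ where } g_k=\bigtriangledown f\left ( x_k \right )$$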

So the algorithm goes as follows −

Step 1 − Initialize $x_0$, $\varepsilon_1$, $\varepsilon_2$ and set $k=0$.

Step 2 − Set $d_k=-\bigtriangledown f\left ( x_k \right )$ or $d_k=-\frac{\bigtriangledown f\left ( x_k \right )}{\left \| \bigtriangledown f\left ( x_k \right ) \right \|}$.

Step 3 − Find $\alpha_k$ such that it minimizes $\phi\left ( \alpha \right )=f\left ( x_k+\alpha d_k \right )$.

Step 4 − Set $x_{k+1}=x_k+\alpha_k d_k$.

Step 5 − If $\left \| x_{k+1}-x_k \right \| < \varepsilon_1$ or $\left \| \bigtriangledown f\left ( x_{k+1} \right ) \right \| \leq \varepsilon_2$, go to step 6, otherwise set $k=k+1$ and go to step 2.

Step 6 − The optimal solution is $\hat{x}=x_{k+1}$.
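A minimal NumPy sketch of these six steps, assuming the caller supplies the objective f and its gradient grad_f as Python callables; scipy.optimize.minimize_scalar performs the exact line search of step 3, and the quadratic test function at the end is only a hypothetical illustration.

import numpy as np
from scipy.optimize import minimize_scalar

def steepest_descent(f, grad_f, x0, eps1=1e-6, eps2=1e-6, max_iter=1000):
    """Steepest descent with an exact line search, following steps 1-6 above."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        d = -grad_f(x)                          # step 2: descent direction
        # step 3: alpha_k minimizes phi(alpha) = f(x_k + alpha * d_k)
        alpha = minimize_scalar(lambda a: f(x + a * d)).x
        x_new = x + alpha * d                   # step 4
        # step 5: stopping tests on step size and gradient norm
        if np.linalg.norm(x_new - x) < eps1 or np.linalg.norm(grad_f(x_new)) <= eps2:
            return x_new                        # step 6: optimal solution
        x = x_new
    return x

# Hypothetical illustration on f(x) = x1^2 + 4*x2^2, minimized at the origin
f = lambda x: x[0]**2 + 4 * x[1]**2
grad_f = lambda x: np.array([2 * x[0], 8 * x[1]])
print(steepest_descent(f, grad_f, np.array([3.0, 2.0])))   # approximately [0, 0]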

Newton Method

Newton Method works on the following principle −

$f\left ( x \right )\approx y\left ( x \right )=f\left ( x_k \right )+\left ( x-x_k \right )^T \bigtriangledown f\left ( x_k \right )+\frac{1}{2}\left ( x-x_k \right )^T H\left ( x_k \right )\left ( x-x_k \right )$

$\bigtriangledown y\left ( x \right )=\bigtriangledown f\left ( x_k \right )+H\left ( x_k \right )\left ( x-x_k \right )$

At $x_{k+1}, \bigtriangledown y\left ( x_{k+1} \right )=\bigtriangledown f\left ( x_k \right )+H\left ( x_k \right )\left ( x_{k+1}-x_k \right )$

For $x_{k+1}$ to be an optimal solution, $\bigtriangledown y\left ( x_{k+1} \right )=0$

Thus, $x_{k+1}=x_k-H\left ( x_k \right )^{-1} \bigtriangledown f\left ( x_k \right )$
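As a quick check, for the quadratic $f\left ( x \right )=\frac{1}{2}x^TQx-b^Tx$ with $Q$ positive definite, we have $H\left ( x_k \right )=Q$ and $\bigtriangledown f\left ( x_k \right )=Qx_k-b$, so a single update gives $x_{k+1}=x_k-Q^{-1}\left ( Qx_k-b \right )=Q^{-1}b$, the exact minimizer.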

Here $H\left ( x_k \right )$ should be non-singular.

Hence the algorithm goes as follows −

Step 1 − Initialize $x_0$, $\varepsilon$ and set $k=0$.

Step 2 − Compute $H\left ( x_k \right )$ and $\bigtriangledown f\left ( x_k \right )$.

Step 3 − Solve the linear system $H\left ( x_k \right )h\left ( x_k \right )=\bigtriangledown f\left ( x_k \right )$ for $h\left ( x_k \right )$.

Step 4 − Compute $x_{k+1}=x_k-h\left ( x_k \right )$.

Step 5 − If $\left \| x_{k+1}-x_k \right \| < \varepsilon$ or $\left \| \bigtriangledown f\left ( x_k \right ) \right \| \leq \varepsilon$ then go to step 6, else set $k=k+1$ and go to step 2.

Step 6 − The optimal solution is $\hat{x}=x_{k+1}$.
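A minimal NumPy sketch of steps 1 to 6, assuming the caller supplies callables grad_f and hess_f for the gradient and Hessian; numpy.linalg.solve carries out step 3 so that $H\left ( x_k \right )$ is never inverted explicitly, and the quadratic at the end is only a hypothetical illustration.

import numpy as np

def newton_method(grad_f, hess_f, x0, eps=1e-8, max_iter=100):
    """Newton method following steps 1-6 above."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)                        # step 2: gradient at x_k
        H = hess_f(x)                        #         Hessian at x_k
        h = np.linalg.solve(H, g)            # step 3: solve H(x_k) h(x_k) = grad f(x_k)
        x_new = x - h                        # step 4: x_{k+1} = x_k - h(x_k)
        # step 5: stopping tests on step size and gradient norm
        if np.linalg.norm(x_new - x) < eps or np.linalg.norm(g) <= eps:
            return x_new                     # step 6: optimal solution
        x = x_new
    return x

# Hypothetical illustration on the quadratic f(x) = 1/2 x^T Q x - b^T x
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
grad_f = lambda x: Q @ x - b
hess_f = lambda x: Q
print(newton_method(grad_f, hess_f, np.zeros(2)))   # matches np.linalg.solve(Q, b)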

Conjugate Gradient Method

This method is used for solving problems of the following types −

$\min f\left ( x \right )=\frac{1}{2}x^T Qx-b^Tx$

where $Q$ is a positive definite $n \times n$ matrix and $b$ is a constant vector.

Given $x_0$ and $\varepsilon$, compute the initial gradient $g_0=\bigtriangledown f\left ( x_0 \right )=Qx_0-b$

Set $d_0=-g_0$. For $k=0,1,2,...$

Compute $\alpha_k=\frac{g_{k}^{T}g_k}{d_{k}^{T}Qd_k}$

Compute $x_{k+1}=x_k+\alpha_kd_k$

Set $g_{k+1}=g_k+\alpha_kQd_k$

Compute $\beta_k=\frac{g_{k+1}^{T}g_{k+1}}{g_{k}^{T}g_k}$

Set $d_{k+1}=-g_{k+1}+\beta_kd_k$.
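A minimal NumPy sketch of this recursion, assuming Q is a positive definite NumPy array and b a vector; the small 2x2 system at the end is only a hypothetical illustration. In exact arithmetic the method reaches the minimizer in at most n iterations.

import numpy as np

def conjugate_gradient(Q, b, x0, eps=1e-10, max_iter=None):
    """Conjugate gradient method for min 1/2 x^T Q x - b^T x with Q positive definite."""
    x = np.asarray(x0, dtype=float)
    g = Q @ x - b                         # g_0 = Q x_0 - b
    d = -g                                # d_0 = -g_0
    max_iter = max_iter or len(b)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= eps:      # stop once the gradient vanishes
            break
        Qd = Q @ d
        alpha = (g @ g) / (d @ Qd)        # alpha_k = g_k^T g_k / d_k^T Q d_k
        x = x + alpha * d                 # x_{k+1} = x_k + alpha_k d_k
        g_new = g + alpha * Qd            # g_{k+1} = g_k + alpha_k Q d_k
        beta = (g_new @ g_new) / (g @ g)  # beta_k = g_{k+1}^T g_{k+1} / g_k^T g_k
        d = -g_new + beta * d             # d_{k+1} = -g_{k+1} + beta_k d_k
        g = g_new
    return x

# Hypothetical illustration on a 2x2 positive definite system
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(Q, b, np.zeros(2)))   # matches np.linalg.solve(Q, b)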
