Convex Optimization - Programming Problem
There are four types of convex programming problems −

Type 1 − $\min \: f\left ( x \right )$, where $x \in S$, S is a non-empty convex set in $\mathbb{R}^n$ and $f\left ( x \right )$ is a convex function.

Type 2 − $\min \: f\left ( x \right ), x \in \mathbb{R}^n$ subject to

$g_i\left ( x \right ) \geq 0, 1 \leq i \leq m_1$ and $g_i\left ( x \right )$ is a concave function.

$g_i\left ( x \right ) \leq 0, m_1+1 \leq i \leq m_2$ and $g_i\left ( x \right )$ is a convex function.

$g_i\left ( x \right ) = 0, m_2+1 \leq i \leq m$ and $g_i\left ( x \right )$ is a linear function.

where $f\left ( x \right )$ is a convex function.

Type 3 − $\max \: f\left ( x \right )$, where $x \in S$, S is a non-empty convex set in $\mathbb{R}^n$ and $f\left ( x \right )$ is a concave function.

Type 4 − $\max \: f\left ( x \right )$, where $x \in \mathbb{R}^n$ subject to

$g_i\left ( x \right ) \geq 0, 1 \leq i \leq m_1$ and $g_i\left ( x \right )$ is a concave function.

$g_i\left ( x \right ) \leq 0, m_1+1 \leq i \leq m_2$ and $g_i\left ( x \right )$ is a convex function.

$g_i\left ( x \right ) = 0, m_2+1 \leq i \leq m$ and $g_i\left ( x \right )$ is a linear function.

where $f\left ( x \right )$ is a concave function.
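As a concrete numerical check of Type 1 (a minimal sketch; the objective $f(x)=x^2$ and the set $S=[1,3]$ are illustrative choices, not from the text), the defining inequality $f\left ( \lambda x+\left ( 1-\lambda \right )y \right ) \leq \lambda f\left ( x \right )+\left ( 1-\lambda \right )f\left ( y \right )$ can be verified on sample points:

```python
# Numerical check of the convexity inequality for an illustrative
# objective f(x) = x^2 minimised over the convex set S = [1, 3].
def f(x):
    return x * x

def is_convex_on_samples(f, xs, lams):
    """Check f(l*x + (1-l)*y) <= l*f(x) + (1-l)*f(y) on sample pairs."""
    for x in xs:
        for y in xs:
            for l in lams:
                lhs = f(l * x + (1 - l) * y)
                rhs = l * f(x) + (1 - l) * f(y)
                if lhs > rhs + 1e-12:
                    return False
    return True

xs = [1.0, 1.5, 2.0, 2.5, 3.0]
lams = [0.0, 0.25, 0.5, 0.75, 1.0]
print(is_convex_on_samples(f, xs, lams))   # True: f is convex on S
# Since f is convex and S is convex, the minimum over S is attained at
# x = 1, the point of S closest to the unconstrained minimiser 0.
print(min(xs, key=f))                      # 1.0
```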
Cone of Feasible Directions
Let S be a non-empty set in $\mathbb{R}^n$ and let $\hat{x} \in \:Closure\left ( S \right )$, then the cone of feasible directions of S at $\hat{x}$, denoted by D, is defined as $D=\left \{ d:d\neq 0,\hat{x}+\lambda d \in S, \forall \lambda \in \left ( 0, \delta \right ), \delta > 0 \right \}$
Each non-zero vector $d \in D$ is called a feasible direction.
For a given function $f:\mathbb{R}^n \rightarrow \mathbb{R}$, the cone of improving directions at $\hat{x}$ is denoted by F and is given by
$$F=\left \{ d:f\left ( \hat{x}+\lambda d \right )< f\left ( \hat{x} \right ),\forall \lambda \in \left ( 0,\delta \right ), \delta >0 \right \}$$
Each direction $d \in F$ is called an improving direction or descent direction of f at $\hat{x}$.
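A direction can be tested against this definition numerically (a sketch under illustrative assumptions: the function $f$ and the point $\hat{x}$ below are examples, not from the text) by sampling $f\left ( \hat{x}+\lambda d \right )$ for small $\lambda > 0$:

```python
# Sketch: test numerically whether d is an improving (descent) direction
# of f at x_hat, i.e. f(x_hat + lam*d) < f(x_hat) for all small lam > 0.
def f(x):
    return (x[0] - 1) ** 2 + (x[1] + 2) ** 2

def is_improving(f, x_hat, d, delta=1e-3, samples=20):
    f0 = f(x_hat)
    for k in range(1, samples + 1):
        lam = delta * k / samples
        x = [xi + lam * di for xi, di in zip(x_hat, d)]
        if f(x) >= f0:
            return False
    return True

x_hat = (0.0, 0.0)
print(is_improving(f, x_hat, (1.0, -1.0)))   # True: points toward the minimiser (1, -2)
print(is_improving(f, x_hat, (-1.0, 1.0)))   # False: points away from it
```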
Theorem
Necessary Condition
Consider the problem $\min f\left ( x \right )$ such that $x \in S$, where S is a non-empty set in $\mathbb{R}^n$. Suppose f is differentiable at a point $\hat{x} \in S$. If $\hat{x}$ is a local optimal solution, then $F_0 \cap D= \phi$ where $F_0=\left \{ d:\bigtriangledown f\left ( \hat{x} \right )^T d < 0 \right \}$ and D is the cone of feasible directions.
Sufficient Condition
If $F_0 \cap D= \phi$, f is a pseudoconvex function at $\hat{x}$ and there exists a neighbourhood of $\hat{x}$, $N_\varepsilon \left ( \hat{x} \right ), \varepsilon > 0$, such that $d=x-\hat{x}\in D$ for any $x \in S \cap N_\varepsilon \left ( \hat{x} \right )$, then $\hat{x}$ is a local optimal solution.
Proof
Necessary Condition
Suppose $F_0 \cap D \neq \phi$, i.e., there exists a $d \in F_0 \cap D$ such that $d \in F_0$ and $d\in D$.
Since $d \in D$, there exists $\delta_1 >0$ such that $\hat{x}+\lambda d \in S, \forall \lambda \in \left ( 0,\delta_1 \right )$.
Since $d \in F_0$, $\bigtriangledown f \left ( \hat{x} \right )^T d <0$.
Thus, there exists $\delta_2>0$ such that $f\left ( \hat{x}+\lambda d \right )< f\left ( \hat{x} \right ),\forall \lambda \in \left ( 0,\delta_2 \right )$.
Let $\delta=\min \left \{\delta_1,\delta_2 \right \}$.
Then $\hat{x}+\lambda d \in S$ and $f\left (\hat{x}+\lambda d \right ) < f\left ( \hat{x} \right ),\forall \lambda \in \left ( 0,\delta \right )$,
which contradicts the assumption that $\hat{x}$ is a local optimal solution.
Thus $F_0\cap D=\phi$.
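The necessary condition can be seen numerically on a small example (a sketch; the problem $\min \left ( x+1 \right )^2$ subject to $x \geq 0$ is an illustrative choice, not from the text). At the local optimum $\hat{x}=0$ the feasible directions are $d>0$ while every improving direction satisfies $f'\left ( 0 \right )d<0$, so the two sets of sample directions do not intersect:

```python
# Sketch illustrating the necessary condition F_0 ∩ D = φ at a local
# optimum, for the illustrative problem min (x+1)^2 subject to x >= 0.
def grad_f(x):
    return 2.0 * (x + 1.0)

x_hat = 0.0
directions = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
F0 = [d for d in directions if grad_f(x_hat) * d < 0]  # improving directions
D  = [d for d in directions if d > 0]                  # feasible directions at 0
print(sorted(set(F0) & set(D)))   # []: no feasible improving direction
```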
Sufficient Condition
Suppose $F_0 \cap D = \phi$, let f be a pseudoconvex function at $\hat{x}$, and
let there exist a neighbourhood of $\hat{x}$, $N_{\varepsilon}\left ( \hat{x} \right )$, such that $d=x-\hat{x} \in D, \forall x \in S \cap N_\varepsilon\left ( \hat{x} \right )$.
Suppose $\hat{x}$ is not a local optimal solution.
Then there exists $\bar{x} \in S \cap N_\varepsilon \left ( \hat{x} \right )$ such that $f \left ( \bar{x} \right )< f \left ( \hat{x} \right )$.
By the assumption on $S \cap N_\varepsilon \left ( \hat{x} \right )$, $d=\left ( \bar{x}-\hat{x} \right )\in D$.
By pseudoconvexity of f,
$$f\left ( \hat{x} \right )>f\left ( \bar{x} \right )\Rightarrow \bigtriangledown f\left ( \hat{x} \right )^T\left ( \bar{x}-\hat{x} \right )<0$$
$\Rightarrow \bigtriangledown f\left ( \hat{x} \right) ^T d <0$
$\Rightarrow d \in F_0$
Hence $F_0\cap D \neq \phi$,
which is a contradiction.
Hence, $\hat{x}$ is a local optimal solution.
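The pseudoconvexity step used in the proof, $f\left ( \bar{x} \right )< f\left ( \hat{x} \right ) \Rightarrow \bigtriangledown f\left ( \hat{x} \right )^T\left ( \bar{x}-\hat{x} \right )<0$, can be checked numerically on sample points (a sketch; the pseudoconvex function $f\left ( x \right )=x+x^3$, whose derivative $1+3x^2$ is always positive, is an illustrative choice):

```python
# Sketch: numerical check of the pseudoconvexity implication
#   f(x_bar) < f(x_hat)  =>  grad f(x_hat)^T (x_bar - x_hat) < 0,
# for the illustrative pseudoconvex function f(x) = x + x^3.
def f(x):
    return x + x ** 3

def grad_f(x):
    return 1.0 + 3.0 * x ** 2   # always positive, so f is pseudoconvex on R

pts = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]
ok = all(grad_f(xh) * (xb - xh) < 0
         for xh in pts for xb in pts
         if f(xb) < f(xh))
print(ok)   # True: the implication holds on every sampled pair
```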
Consider the following problem: $\min \: f\left ( x \right )$ where $x \in X$ such that $g_i\left ( x \right ) \leq 0, i=1,2,...,m$,
where $f:X \rightarrow \mathbb{R}$, $g_i:X \rightarrow \mathbb{R}$ and X is an open set in $\mathbb{R}^n$.
Let $S=\left \{x:g_i\left ( x \right )\leq 0,\forall i \right \}$.
Let $\hat{x} \in S$ and let $M=\left \{1,2,...,m \right \}$.
Let $I=\left \{i:g_i\left ( \hat{x} \right )=0, i \in M \right \}$, where I is called the index set of all active constraints at $\hat{x}$.
Let $J=\left \{i:g_i\left ( \hat{x} \right )<0,i \in M \right \}$, where J is called the index set of all inactive constraints at $\hat{x}$.
Let $F_0=\left \{ d \in \mathbb{R}^n:\bigtriangledown f\left ( \hat{x} \right )^T d <0 \right \}$
Let $G_0=\left \{ d \in \mathbb{R}^n:\bigtriangledown g_i\left ( \hat{x} \right )^T d <0, \forall i \in I \right \}$
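The index sets I and J, and membership of a direction in $G_0$, can be computed directly (a minimal sketch; the constraints $g_1, g_2$ and the point $\hat{x}$ below are illustrative, not from the text):

```python
# Sketch: compute the active (I) and inactive (J) constraint index sets
# at a feasible point, and test membership of a direction in G_0.
def g1(x):  # g1(x) = x1 + x2 - 1 <= 0   (active at x_hat below)
    return x[0] + x[1] - 1.0

def g2(x):  # g2(x) = -x1 <= 0           (inactive at x_hat below)
    return -x[0]

def grad_g1(x):
    return (1.0, 1.0)

constraints = {1: g1, 2: g2}
x_hat = (0.5, 0.5)

I = [i for i, g in constraints.items() if abs(g(x_hat)) < 1e-12]  # active
J = [i for i, g in constraints.items() if g(x_hat) < -1e-12]      # inactive
print(I, J)   # [1] [2]

# d is in G_0 iff grad g_i(x_hat)^T d < 0 for every active i (here only g1).
def in_G0(d):
    gx, gy = grad_g1(x_hat)
    return gx * d[0] + gy * d[1] < 0

print(in_G0((-1.0, 0.0)))   # True: moves strictly into the feasible region
print(in_G0((1.0, 1.0)))    # False: violates the active constraint
```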
Lemma
Let $S=\left \{ x \in X:g_i\left ( x \right ) \leq 0, \forall i \in M \right \}$, where X is a non-empty open set in $\mathbb{R}^n$. Let $\hat{x}\in S$, let $g_i, i \in I$, be differentiable at $\hat{x}$ and let $g_i, i \in J$, be continuous at $\hat{x}$. Then $G_0 \subseteq D$.
Proof
Let $d \in G_0$.
Since $\hat{x} \in X$ and X is an open set, there exists $\delta_1 >0$ such that $\hat{x}+\lambda d \in X$ for $\lambda \in \left ( 0, \delta_1 \right )$.
Also, since $g_i\left ( \hat{x} \right )<0$ and $g_i$ are continuous at $\hat{x}, \forall i \in J$, there exists $\delta_2>0$ such that $g_i\left ( \hat{x}+\lambda d \right )<0$ for $\lambda \in \left ( 0, \delta_2 \right ), i \in J$.
Since $d \in G_0$, $\bigtriangledown g_i\left ( \hat{x} \right )^T d <0, \forall i \in I$, thus there exists $\delta_3 >0$ such that $g_i\left ( \hat{x}+\lambda d \right )< g_i\left ( \hat{x} \right )=0$ for $\lambda \in \left ( 0, \delta_3 \right ), i \in I$.
Let $\delta=\min\left \{ \delta_1, \delta_2, \delta_3 \right \}$.
Therefore, $\hat{x}+\lambda d \in X$ and $g_i\left ( \hat{x}+\lambda d \right )< 0, \forall i \in M, \lambda \in \left ( 0, \delta \right )$
$\Rightarrow \hat{x}+\lambda d \in S$
$\Rightarrow d \in D$
$\Rightarrow G_0 \subseteq D$
Hence proved.
Theorem
Necessary Condition
Let $f$ and $g_i, i \in I$, be differentiable at $\hat{x} \in S$, and let $g_i, i \in J$, be continuous at $\hat{x} \in S$. If $\hat{x}$ is a local minimum over $S$, then $F_0 \cap G_0=\phi$.
Sufficient Condition
If $F_0 \cap G_0= \phi$, f is a pseudoconvex function at $\hat{x}$ and $g_i, i \in I$, are strictly pseudoconvex functions over some $\varepsilon$-neighbourhood of $\hat{x}$, then $\hat{x}$ is a local optimal solution.
Remarks
Let $\hat{x}$ be a feasible point such that $\bigtriangledown f\left(\hat{x} \right)=0$; then $F_0 = \phi$. Thus, $F_0 \cap G_0= \phi$ even though $\hat{x}$ need not be an optimal solution.
Similarly, if $\bigtriangledown g_i\left(\hat{x} \right)=0$ for some $i \in I$, then $G_0=\phi$, and thus $F_0 \cap G_0= \phi$.
Consider the problem: $\min f\left(x \right)$ such that $g\left(x \right)=0$.
Since $g\left(x \right)=0$, we can write $g_1\left(x \right)=g\left(x \right)\leq 0$ and $g_2\left(x \right)=-g\left(x \right) \leq 0$.
Let $\hat{x} \in S$; then $g_1\left(\hat{x} \right)=0$ and $g_2\left(\hat{x} \right)=0$.
But $\bigtriangledown g_1\left(\hat{x} \right)= - \bigtriangledown g_2\left(\hat{x} \right)$.
Thus, $G_0= \phi$ and $F_0 \cap G_0= \phi$.
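This remark can be verified numerically (a sketch; the linear constraint $g\left(x \right)=x_1-x_2$ is an illustrative choice, not from the text): since $\bigtriangledown g_1 = -\bigtriangledown g_2$ and both constraints are active at every feasible point, the two strict inequalities defining $G_0$ can never hold simultaneously:

```python
# Sketch of the remark: splitting an equality constraint g(x) = 0 into
# g1 = g <= 0 and g2 = -g <= 0 makes G_0 empty, because
# grad g1 = -grad g2 and both constraints are active at any feasible point.
def grad_g1(x):
    return (1.0, -1.0)    # gradient of g(x) = x1 - x2

def grad_g2(x):
    return (-1.0, 1.0)    # gradient of -g

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

x_hat = (1.0, 1.0)        # feasible: g(x_hat) = 0, so both are active
directions = [(1, 0), (0, 1), (-1, 0), (0, -1), (1, -1), (-1, 1)]

G0 = [d for d in directions
      if dot(grad_g1(x_hat), d) < 0 and dot(grad_g2(x_hat), d) < 0]
print(G0)   # []: the two strict inequalities cannot hold at once
```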