Caratheodory Theorem
Let $S$ be an arbitrary set in $\mathbb{R}^n$. If $x \in Co\left ( S \right )$, then $x \in Co\left ( x_1,x_2, \ldots, x_n,x_{n+1} \right )$ for some points $x_1, x_2, \ldots, x_{n+1} \in S$, i.e., $x$ can be written as a convex combination of at most $\left ( n+1 \right )$ points of $S$.
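For illustration, take $n=2$: the theorem says every point of a convex hull in $\mathbb{R}^2$ needs at most $3$ points. The centre of the unit square is a convex combination of all four vertices,
$\left ( 0.5,0.5 \right )=\frac{1}{4}\left ( 0,0 \right )+\frac{1}{4}\left ( 1,0 \right )+\frac{1}{4}\left ( 0,1 \right )+\frac{1}{4}\left ( 1,1 \right )$
but the reduction carried out in the proof below shrinks this to just two points:
$\left ( 0.5,0.5 \right )=\frac{1}{2}\left ( 1,0 \right )+\frac{1}{2}\left ( 0,1 \right )$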
Proof
Since $x \in Co\left ( S \right )$, $x$ can be represented as a convex combination of a finite number of points in $S$, i.e.,
$x=\displaystyle\sum\limits_{j=1}^k \lambda_j x_j$, $\displaystyle\sum\limits_{j=1}^k \lambda_j=1$, $\lambda_j \geq 0$ and $x_j \in S, \forall j=1,2, \ldots, k$
If $k \leq n+1$, the result is immediate.
If $k > n+1$, then $\left ( x_2-x_1 \right ), \left ( x_3-x_1 \right ), \ldots, \left ( x_k-x_1 \right )$ are $k-1>n$ vectors in $\mathbb{R}^n$ and hence linearly dependent.
$\Rightarrow \exists \: \mu_j \in \mathbb{R}, 2\leq j\leq k$ (not all zero) such that $\displaystyle\sum\limits_{j=2}^k \mu_j\left ( x_j-x_1 \right )=0$
Define $\mu_1=-\displaystyle\sum\limits_{j=2}^k \mu_j$, then $\displaystyle\sum\limits_{j=1}^k \mu_j x_j=0, \displaystyle\sum\limits_{j=1}^k \mu_j=0$,
where not all $\mu_j$'s are equal to zero. Since $\displaystyle\sum\limits_{j=1}^k \mu_j=0$ and not all of them vanish, at least one $\mu_j > 0, 1 \leq j \leq k$.
Then, for any $\alpha \geq 0$, since $\displaystyle\sum\limits_{j=1}^k \mu_j x_j=0$,
$x=\displaystyle\sum\limits_{j=1}^k \lambda_j x_j-\alpha \displaystyle\sum\limits_{j=1}^k \mu_j x_j=\displaystyle\sum\limits_{j=1}^k\left ( \lambda_j-\alpha\mu_j \right )x_j$
Choose $\alpha$ such that $\alpha=\min\left \{ \frac{\lambda_j}{\mu_j}:\mu_j> 0 \right \}=\frac{\lambda_i}{\mu_i}$ for some $i \in \left \{ 1,2,...,k \right \}$. Note that $\alpha \geq 0$ because every $\lambda_j \geq 0$.
If $\mu_j\leq 0$, then $\lambda_j-\alpha \mu_j\geq \lambda_j\geq 0$.
If $\mu_j> 0$, then $\frac{\lambda_j}{\mu_j}\geq \frac{\lambda_i}{\mu_i}=\alpha \Rightarrow \lambda_j-\alpha \mu_j\geq 0, j=1,2,...,k$
In particular, $\lambda_i-\alpha \mu_i=0$, by the definition of $\alpha$.
$x=\displaystyle\sum\limits_{j=1}^k \left ( \lambda_j-\alpha\mu_j \right )x_j$, where
$\lambda_j-\alpha\mu_j\geq 0$, $\displaystyle\sum\limits_{j=1}^k\left ( \lambda_j-\alpha\mu_j \right )=\displaystyle\sum\limits_{j=1}^k\lambda_j-\alpha\displaystyle\sum\limits_{j=1}^k\mu_j=1-0=1$, and $\lambda_i-\alpha\mu_i=0$.
Thus, $x$ can be represented as a convex combination of at most $\left ( k-1 \right )$ points.
This reduction can be repeated until $x$ is represented as a convex combination of at most $\left ( n+1 \right )$ points of $S$.
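The proof is constructive, so the reduction step can be carried out numerically. Below is a minimal Python sketch of this process, assuming NumPy; the function name caratheodory_reduce is our own, and the SVD merely supplies the nontrivial solution $\mu$ of $\displaystyle\sum\limits_{j=2}^k \mu_j\left ( x_j-x_1 \right )=0$ that the proof obtains from linear dependence.

```python
import numpy as np

def caratheodory_reduce(points, weights, tol=1e-9):
    """Reduce a convex combination x = sum_j weights[j] * points[j] so that
    at most n+1 points carry positive weight (points are rows in R^n).
    Mirrors the proof: find mu with sum_j mu_j x_j = 0 and sum_j mu_j = 0,
    then subtract alpha * mu to drive one coefficient to zero."""
    points = np.asarray(points, dtype=float)
    weights = np.asarray(weights, dtype=float)
    n = points.shape[1]
    while len(weights) > n + 1:
        # (x_2 - x_1), ..., (x_k - x_1) are k-1 > n vectors in R^n,
        # hence linearly dependent: take mu_2, ..., mu_k from the null space.
        diffs = (points[1:] - points[0]).T                # shape (n, k-1)
        _, _, vh = np.linalg.svd(diffs)
        mu_rest = vh[-1]                                  # null-space direction
        mu = np.concatenate(([-mu_rest.sum()], mu_rest))  # mu_1 = -sum(mu_2..mu_k)
        # alpha = min over {mu_j > 0} of lambda_j / mu_j, as in the proof
        pos = mu > tol
        alpha = np.min(weights[pos] / mu[pos])
        weights = weights - alpha * mu                    # new coefficients >= 0
        keep = weights > tol                              # drop the zeroed point(s)
        points, weights = points[keep], weights[keep]
        weights = weights / weights.sum()                 # renormalise round-off
    return points, weights

# Usage: a point in the hull of 6 points in R^2 needs at most 3 of them.
pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.5, 0.2], [0.3, 0.8]])
w = np.full(6, 1 / 6)
x = w @ pts
pts_r, w_r = caratheodory_reduce(pts, w)
assert len(w_r) <= 3 and np.allclose(w_r @ pts_r, x)
```

Each pass zeroes at least one coefficient, so the loop terminates after at most $k-\left ( n+1 \right )$ iterations, mirroring the repeated reduction described above.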