Questions tagged [subgradient]

This tag is for questions relating to subgradients and the subgradient method, an iterative method for solving convex minimization problems, used predominantly in nondifferentiable optimization for functions that are convex but nondifferentiable. The subgradient method is a very simple algorithm for minimizing convex nondifferentiable functions in settings where Newton's method and simple linear programming will not work.

The subgradient (related to the subderivative and the subdifferential) generalizes the derivative of a convex function to points where the function is not differentiable.

Definition: A vector $g \in \mathbb{R}^n$ is a subgradient of $f : \mathbb{R}^n \to \mathbb{R}$ at $x \in \text{dom}\, f$ if for all $z \in \text{dom}\, f$, $$f(z) \ge f(x) + g^T(z - x).$$

Note: If $f$ is convex and differentiable, then its gradient at $x$ is a subgradient. But a subgradient can exist even when $f$ is not differentiable at $x$.
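For a concrete illustration of the definition (a standard example, stated here for concreteness), take $f(x) = |x|$ on $\mathbb{R}$: it is not differentiable at $x = 0$, yet the defining inequality $|z| \ge g\,z$ holds for all $z$ precisely when $|g| \le 1$, so
$$\partial |x| = \begin{cases} \{1\}, & x > 0,\\ [-1,\,1], & x = 0,\\ \{-1\}, & x < 0. \end{cases}$$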

Subgradient methods converge (under suitable step-size rules) even when applied to a nondifferentiable objective function. When the objective function is differentiable, subgradient methods for unconstrained problems use the same search direction as the method of steepest descent. Subgradient methods are slower than Newton's method when applied to minimize twice continuously differentiable convex functions, but Newton's method fails to converge on problems that have nondifferentiable kinks. A minimal sketch of the basic iteration is given below.
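To make this concrete, here is a minimal Python sketch of the basic iteration $x^{k+1} = x^k - t_k g^k$ applied to the toy objective $f(x) = \|x\|_1$. The function names, the diminishing step-size rule $t_k = 1/(k+1)$, and the iteration count are illustrative choices, not anything prescribed by the references below.

```python
import numpy as np

def subgradient_method(f, sub_f, x0, steps=2000):
    """Minimize a convex (possibly nondifferentiable) f via x_{k+1} = x_k - t_k g_k.

    sub_f(x) must return any element of the subdifferential of f at x.
    Uses the diminishing step size t_k = 1/(k+1). The method is not a
    descent method, so we track the best iterate seen so far.
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(steps):
        g = sub_f(x)
        x = x - g / (k + 1)        # diminishing, non-summable step sizes
        if f(x) < best_f:          # f(x_k) may increase between iterations
            best_x, best_f = x.copy(), f(x)
    return best_x, best_f

# Toy objective: f(x) = ||x||_1. Its subdifferential per coordinate is
# {sign(x_i)} for x_i != 0 and the interval [-1, 1] at x_i = 0; np.sign
# returns 0 there, which is a valid subgradient choice.
f = lambda x: np.abs(x).sum()
sub_f = lambda x: np.sign(x)

x_best, f_best = subgradient_method(f, sub_f, [3.0, -2.0])
print(x_best, f_best)  # approaches the minimizer x = 0
```

Note that the iterates are not monotone in $f$, which is why the sketch keeps the best point found so far rather than returning the final iterate.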

For more details, see the following references:

https://see.stanford.edu/materials/lsocoee364b/01-subgradients_notes.pdf

https://people.csail.mit.edu/dsontag/courses/ml16/slides/notes_convexity16.pdf

https://optimization.mccormick.northwestern.edu/index.php/Subgradient_optimization

https://en.wikipedia.org/wiki/Subgradient_method

259 questions
2 votes, 0 answers

Why is the subgradient of a differentiable convex function unique?

The subdifferential is defined as $$ \partial f(x) = \{g : f(y) \ge f(x) + g^T(y-x), \ \forall y \in \text{dom}(f) \}. $$ My question is: when the function $f$ is convex and differentiable, why is $\nabla f(x)$ the unique…
Li haonan • 209
1 vote, 0 answers

Subgradient of a quadratic plus 1-norm

I need to find an element of the subdifferential (at a certain point $x^k$) of the function $$ f(x) = \frac{1}{2}\|y-Ax\|^2 + \|x\|_1 $$ for some kind of iterative method, i.e. I need to find $g \in \partial f(x^k)$. Here $y$ and $A$ are a constant vector and…
karlabos • 1,257
1 vote, 0 answers

Find subgradient

Calculate the subgradient of the function $f(x)=|x-3|+|x+1|$ at the points $x=-1$ and $x=3$. At $x=-1$, for all $y\in \mathbb{R}$ the following should be satisfied: $|y-3|+|y+1|\ge 4+k(y+1)$, where $k$ is a subgradient. But how do I solve that? How do I find $k$? Same for…
0 votes, 1 answer

Subdifferential of a linear function with a non-negative domain

I am trying to find the subderivative of the function $f(x) = \lambda x,\ x\ge0$. I am familiar with the subderivative of the absolute value, but I am not sure how to find the subderivative of this function at $x=0$. Applying the same logic as the…
rando • 313
0 votes, 0 answers

Subdifferential exercise

Let $D$ be the unit disc in $\mathbb{R}^2$ and let $f : \mathbb{R}^2 \to \mathbb{R}$ be the function $f(x) = d(x, D)$, the distance from $x$ to $D$. We are looking for the subdifferential of $f$ at $\bar x$; that is, we are looking for $\xi \in…
0 votes, 0 answers

How to find the subgradients of $f(x)=\|x\|^2-1$ if $\|x\|\geq 1$, $f(x)=0$ if $\|x\|\leq 1$, $x\in\mathbb{R}^2$.

How to find the subgradients of $f(x)=\|x\|^2-1$ if $\|x\|\geq 1$, $f(x)=0$ if $\|x\|\leq 1$, $x\in\mathbb{R}^2$. By definition a subgradient $a$ must satisfy $f(x+y)\geq f(x)+a\cdot y$. I just have problems in the case $\|x\|=1$, when $\|x+y\|\leq…
Guadalupe • 103