
I am a bit confused why the 0-1 loss function is not convex. What's wrong with it?

user34790
  • Could you provide a little more background? I've taken classes in nonlinear optimization, and I have no idea what a 0-1 loss function is. – Geoff Oxberry Mar 22 '13 at 05:24
  • Perhaps you're referring to {0,1}-valued indicator functions? If so, Geoff's answer below still applies. – Michael Grant Mar 22 '13 at 13:10

1 Answer


I'm not sure this is what you're looking for, but here goes:

A zero-one loss function $L: \mathbb{R} \rightarrow \{0,1\}$ is defined as:

\begin{align} L(x) = \left\{\begin{array}{ll} 0, & \textrm{if $x \geq 0$}, \\ 1, & \textrm{if $x < 0.$}\end{array}\right. \end{align}

$L$ is convex if, for all $x_1, x_2 \in \mathbb{R}$, and all $\lambda \in [0,1]$,

\begin{align} L(\lambda x_1 + (1 - \lambda)x_2) \leq \lambda L(x_1) + (1 - \lambda) L(x_2). \end{align}

A counterexample is: $(x_1, x_2, \lambda) = (-1, 1/2, 1/2)$.

Then:

\begin{align} L(\lambda x_1 + (1 - \lambda)x_2) &= L(-1/4) = 1, \\ L(x_1) &= 1, \\ L(x_2) &= 0, \end{align}

so the convexity inequality $L(\lambda x_1 + (1 - \lambda)x_2) \leq \lambda L(x_1) + (1 - \lambda) L(x_2)$ would require $1 \leq \frac{1}{2} \cdot 1 + \frac{1}{2} \cdot 0 = 1/2$, which is false. Hence $L$ is not convex.
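You can verify the counterexample numerically; here is a minimal sketch (the function name `zero_one_loss` is just for illustration, not from any library):

```python
def zero_one_loss(x):
    """Zero-one loss: 0 if x >= 0, else 1."""
    return 0 if x >= 0 else 1

# The counterexample from the answer: (x1, x2, lambda) = (-1, 1/2, 1/2).
x1, x2, lam = -1.0, 0.5, 0.5

# Left-hand side of the convexity inequality: L(lam*x1 + (1-lam)*x2) = L(-1/4).
lhs = zero_one_loss(lam * x1 + (1 - lam) * x2)

# Right-hand side: lam*L(x1) + (1-lam)*L(x2).
rhs = lam * zero_one_loss(x1) + (1 - lam) * zero_one_loss(x2)

print(lhs, rhs, lhs <= rhs)  # 1 0.5 False -- the inequality is violated
```

The jump at $x = 0$ is what breaks convexity: the chord from $(x_1, 1)$ to $(x_2, 0)$ passes below the function at the midpoint.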

Geoff Oxberry