
Suppose $f$ is continuous and nonnegative on $[a, b]$, and $\int_{a}^{b}f\,dx=0$. Show that $f=0$ on $[a,b]$.

My idea:

  1. The hypotheses, continuity and nonnegativity, remind me of the theorem that a continuous function on $[a,b]$ is Riemann integrable. My questions:

    1. It seems that we do not need to show $f$ is Riemann integrable, since the value of the integral is already given.
    2. It seems that showing $f$ is Riemann integrable would not help anyway.
  2. I can come up with two approaches, but I am not sure whether they are right.

    1. Suppose $f$ is larger than $0$ at some point; then that point must fall into one of the subintervals of any partition, so the integral cannot be $0$.
    2. Take the trivial partition $P=\{a,b\}$ of $[a,b]$; then the difference $U(P,f)-L(P,f)$ must be less than any arbitrarily small positive number, hence $\sup f$ and $\inf f$ must both be $0$.
Jill Clover

1 Answer


Assume that $f$ is not identically $0$ on $[a,b]$. Then, there exists $x_{0} \in [a,b]$ with $f(x_{0})>0$. Since $f$ is continuous and nonnegative on $[a,b]$ we know that in some neighborhood of $x_{0}$ in $[a,b]$ we have $f(x)>\frac{1}{2}f(x_{0})$. Split your integral up into three parts and you'll see that it must be positive.

Edit: Sorry this is rather late, but to address the comment

$\int_{a}^{b}f~dx = \int_{a}^{x_{0}-\delta}f~dx + \int_{x_{0}-\delta}^{x_{0}+\delta}f~dx + \int_{x_{0}+\delta}^{b}f~dx \geq 0 + \delta f(x_{0}) + 0 > 0$

where $\delta>0$ is chosen so that $f(x)>\frac{1}{2}f(x_{0})$ for $x \in (x_{0}-\delta,x_{0}+\delta)$. (If $x_{0}$ is an endpoint of $[a,b]$, use a one-sided interval of length $\delta$ instead; the same estimate goes through with $\frac{1}{2}\delta f(x_{0})$ in place of $\delta f(x_{0})$.)
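For completeness, here is the step giving the middle term, using only the bound on $f$ from the choice of $\delta$:

```latex
\int_{x_{0}-\delta}^{x_{0}+\delta} f\,dx
  \;\geq\; \frac{1}{2}f(x_{0})\int_{x_{0}-\delta}^{x_{0}+\delta} dx
  \;=\; \frac{1}{2}f(x_{0})\cdot 2\delta
  \;=\; \delta f(x_{0}) \;>\; 0.
```

The two outer integrals are $\geq 0$ simply because $f \geq 0$ on $[a,b]$.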

Edit (addressing second comment): We need a positive lower bound for $f$ on the interval so that we can use the following fact about integrals:

$\int_{a}^{b}f~dx \geq m\int_{a}^{b}dx$

where $m=\inf_{[a,b]}f$.
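Not part of the proof, but the three-way split can be checked numerically. The function, the point $x_{0}$, and the value of $\delta$ below are my own illustrative choices, not from the answer: $f(x)=\max(0,\,0.25-(x-0.5)^2)$ is continuous and nonnegative on $[0,1]$, positive at $x_{0}=0.5$, and satisfies $f(x)>\frac{1}{2}f(x_{0})$ for $|x-x_{0}|<0.35$.

```python
def f(x):
    # Illustrative choice: continuous, nonnegative on [0, 1], with f(0.5) = 0.25 > 0.
    return max(0.0, 0.25 - (x - 0.5) ** 2)

def riemann_sum(g, a, b, n=10_000):
    # Midpoint Riemann sum approximating the integral of g over [a, b].
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

x0 = 0.5
# delta chosen so that f(x) > f(x0)/2 = 0.125 whenever |x - x0| < delta
delta = 0.35

total = riemann_sum(f, 0.0, 1.0)
lower_bound = delta * f(x0)  # the 0 + delta*f(x0) + 0 bound from the answer

print(total, lower_bound)
assert total >= lower_bound > 0
```

The assertion mirrors the inequality in the answer: however small $\delta f(x_{0})$ is, it is strictly positive, so the integral cannot be $0$.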

GAM