
I know this can also be done with integrals, but I'm doing it from the definitions as an exercise. Say we have a collection of pairwise disjoint open intervals $\{(a_{k}, b_{k})\}_{k=1}^{n}$ in the interval $(a, b)$. Then for every $\epsilon > 0$ we seek a $\delta > 0$ such that $\sum_{k=1}^{n} (b_{k} - a_{k}) < \delta$ will yield $\sum_{k=1}^{n} TV(f|_{[a_{k}, b_{k}]}) < \epsilon$. To do that, we first want to get the sum of the variation sums over arbitrary partitions strictly below, say, $\epsilon/2$, so that the desired sum of total variations is still less than $\epsilon$ when we take the sup.
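For reference, the definitions I'm working from (writing $V(f|_{[c, d]}, P)$ for the variation sum of $f$ over a partition $P$ of $[c, d]$):

$$V(f|_{[c, d]}, P) = \sum_{i=1}^{m} |f(x_{i}) - f(x_{i-1})| \quad \text{for } P: c = x_{0} < x_{1} < \dots < x_{m} = d, \qquad TV(f|_{[c, d]}) = \sup_{P} V(f|_{[c, d]}, P),$$

and $f$ is absolutely continuous on $[a, b]$ if for every $\epsilon > 0$ there is a $\delta > 0$ such that every finite collection of pairwise disjoint intervals $(c_{k}, d_{k}) \subseteq [a, b]$ with $\sum_{k} (d_{k} - c_{k}) < \delta$ satisfies $\sum_{k} |f(d_{k}) - f(c_{k})| < \epsilon$.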

Here I'm wondering if the right way to view this is by "zooming in" on just one of the intervals $(a_{k}, b_{k})$ and chopping it up with a partition $P_{k}$ into subintervals $[x_{i-1}, x_{i}]$, then using the absolute continuity of $f$ and choosing $\delta$ to respond to the $\epsilon/2n$ challenge on that interval, which would give us the double sum

$\sum_{k=1}^{n} V(f|_{[a_{k}, b_{k}]}, P_{k}) = \sum_{k=1}^{n} \sum_{i=1}^{m_{k}} |f(x_{i}) - f(x_{i-1})| < \sum_{k=1}^{n} \epsilon/2n = \epsilon/2,$

where $m_{k}$ is the number of subintervals of $P_{k}$.

But now I'm confused, because we were trying to find a $\delta$ bounding the total sum of all interval lengths, but this $\delta$ corresponds to a partition of just one of the intervals.

BMac

1 Answer


Hint: For any collection of intervals $\{ [a_k, b_k] \}_{k=1}^n$, we can choose $a_k', b_k' \in [a_k, b_k]$ such that $|f(a_k') - f(b_k')| \ge \frac{1}{2} TV(f |_{[a_k, b_k]})$. It follows that $\sum_{k=1}^n TV(f |_{[a_k, b_k]}) \le 2 \sum_{k=1}^n |f(a_k') - f(b_k')|$.
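Granting the hint's inequality, a sketch of how the pieces combine: the $\delta$ is chosen once, for $f$ on all of $[a, b]$, responding to the $\epsilon/2$ challenge in the definition of absolute continuity. Writing $c_k = \min(a_k', b_k')$ and $d_k = \max(a_k', b_k')$ (my notation), the intervals $(c_k, d_k)$ are pairwise disjoint, since each sits inside $[a_k, b_k]$, and $\sum_{k=1}^n (d_k - c_k) \le \sum_{k=1}^n (b_k - a_k)$. So if $\sum_{k=1}^n (b_k - a_k) < \delta$, then

$$\sum_{k=1}^{n} TV(f|_{[a_k, b_k]}) \le 2 \sum_{k=1}^{n} |f(a_k') - f(b_k')| = 2 \sum_{k=1}^{n} |f(d_k) - f(c_k)| < 2 \cdot \frac{\epsilon}{2} = \epsilon.$$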

  • But my confusion still remains: for which set of intervals are we now picking a $\delta$, the original set or the subset? – BMac Aug 14 '18 at 18:42