I know this can also be done with integrals, but I'm doing it from definitions as an exercise. Say we have a collection of open intervals $\{(a_{k}, b_{k})\}_{k = 1}^{n}$ in the interval $(a, b)$. Then for every $\epsilon > 0$, we seek a $\delta > 0$ such that $\sum_{k = 1}^{n} (b_{k} - a_{k}) < \delta$ yields $\sum_{k = 1}^{n} TV(f_{[a_{k}, b_{k}]}) < \epsilon$. To do that, we first want to get the sum of the variations strictly below, say, $\epsilon/2$, so that the desired sum is still at most $\epsilon/2 < \epsilon$ after we take the sup over partitions.
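For reference, spelling out the quantifiers (here I'm assuming, as in the definition of absolute continuity, that the intervals are pairwise disjoint, and writing $TV$ as a sup over partitions $P_{k}$ of $[a_{k}, b_{k}]$, which is just my notation):

$\forall \epsilon > 0 \ \exists \delta > 0: \quad \sum_{k = 1}^{n} (b_{k} - a_{k}) < \delta \implies \sum_{k = 1}^{n} TV(f_{[a_{k}, b_{k}]}) < \epsilon, \qquad \text{where } TV(f_{[a_{k}, b_{k}]}) = \sup_{P_{k}} V(f_{[a_{k}, b_{k}]}, P_{k}).$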
Here I'm wondering if the right way to view this is by "zooming in" on just one of the intervals $(a_{k}, b_{k})$ and chopping it up with a partition $P_{k}$ into subintervals $[x_{i-1}, x_{i}]$, then using the absolute continuity of $f$ and choosing $\delta$ to respond to the $\epsilon/2n$ challenge for $f$, which would give us the double sum
$\sum_{k = 1}^{n} V(f_{[a_{k}, b_{k}]}, P_{k}) = \sum_{k = 1}^{n} \sum_{i = 1}^{m_{k}} |f(x_{i}) - f(x_{i-1})| < \sum_{k = 1}^{n} \epsilon/2n = \epsilon/2$, where the inner sum runs over the $m_{k}$ subintervals of $P_{k}$.
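By "the $\epsilon/2n$ challenge" I mean the definition of absolute continuity of $f$ applied with $\epsilon/2n$ in place of $\epsilon$ (the names $\delta_{0}$, $m$, and $(c_{j}, d_{j})$ below are just my notation): there is a $\delta_{0} > 0$ such that

$\sum_{j = 1}^{m} (d_{j} - c_{j}) < \delta_{0} \implies \sum_{j = 1}^{m} |f(d_{j}) - f(c_{j})| < \epsilon/2n$

for every finite disjoint collection $\{(c_{j}, d_{j})\}_{j = 1}^{m}$ of intervals in $(a, b)$. In the display above this is applied with the subintervals $[x_{i-1}, x_{i}]$ of $P_{k}$ playing the role of the $(c_{j}, d_{j})$, which needs their total length $b_{k} - a_{k}$ to be less than $\delta_{0}$.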
But now I'm confused: we were trying to find a $\delta$ that bounds the total sum of all the interval lengths, $\sum_{k = 1}^{n} (b_{k} - a_{k})$, yet this $\delta$ corresponds to a partition of just one of the intervals.