It's not hard to prove that if $X$ and $Y$ are independent random variables, then $f(X)$ and $g(Y)$ are independent, where $f,g: \mathbb{R} \rightarrow \mathbb{R}$ are measurable maps (see here), and it's also not hard to extend this result to random vectors in $\mathbb{R}^n$.
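For reference, here is the vector version I have in mind (my own statement, which I believe follows by the same preimage argument as the scalar case): if $X$ and $Y$ are independent random vectors in $\mathbb{R}^n$ and $g,h:\mathbb{R}^n \rightarrow \mathbb{R}$ are Borel measurable, then for any Borel sets $A,B\subseteq\mathbb{R}$,
$$P\big(g(X)\in A,\ h(Y)\in B\big)=P\big(X\in g^{-1}(A),\ Y\in h^{-1}(B)\big)=P\big(X\in g^{-1}(A)\big)\,P\big(Y\in h^{-1}(B)\big)=P\big(g(X)\in A\big)\,P\big(h(Y)\in B\big),$$
so $g(X)$ and $h(Y)$ are independent.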
Now for the question: suppose we have iid random variables $X_1,\dots,X_n$ and iid random variables $Y_1,\dots,Y_n$, and suppose the whole collection $X_1,\dots,X_n,Y_1,\dots,Y_n$ is mutually independent. Let $X=(X_1,\dots,X_n)$ and $Y=(Y_1,\dots,Y_n)$ be the corresponding random vectors. Since these vectors are independent, can we say that $g(X)=\sum_{i=1}^{n}X_i$ and $g(Y)=\sum_{i=1}^{n}Y_i$ are independent? This seems like a pretty straightforward application of the result above, but I have not seen an example like this in a textbook, and it seems like a particularly useful fact, so I want to make sure I'm not misunderstanding something.
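For concreteness, here is the chain of equalities I have in mind (assuming the vector version sketched above), checking independence through joint CDFs with the specific map $g(x_1,\dots,x_n)=\sum_{i=1}^{n}x_i$, which is continuous and hence Borel measurable: for any $s,t\in\mathbb{R}$,
$$P\!\left(\sum_{i=1}^{n}X_i\le s,\ \sum_{i=1}^{n}Y_i\le t\right)=P\big(X\in A_s,\ Y\in A_t\big)=P(X\in A_s)\,P(Y\in A_t)=P\!\left(\sum_{i=1}^{n}X_i\le s\right)P\!\left(\sum_{i=1}^{n}Y_i\le t\right),$$
where $A_s=g^{-1}\big((-\infty,s]\big)=\{x\in\mathbb{R}^n:\sum_i x_i\le s\}$ is a Borel set because $g$ is continuous, and the middle equality uses independence of the vectors $X$ and $Y$. Is this reasoning sound?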