gradient23's proof is great, in my opinion, but I would like to show another proof that seems more intuitive to me, though much less rigorous.
The proof is based on a verbal definition of independence from Wikipedia:

> two events are independent [...] if the occurrence of one does not affect the probability of occurrence of the other
In addition, we use the fact that independence is symmetric.
The (non-rigorous) proof:
- We assume that $A$ and $B$ are independent.
- By definition, the occurrence of $A$ doesn't affect the probability of $B$.
- Thus, the occurrence of $A$ also doesn't affect the probability of $B^C$.
- So by definition, $A$ and $B^C$ are also independent, which by definition again means that the occurrence of $B^C$ doesn't affect the probability of $A$.
(This is the step where we use the symmetry of independence.)
- Therefore, the occurrence of $B^C$ also doesn't affect the probability of $A^C$.
- So by definition, $B^C$ and $A^C$ are also independent.
One could translate the proof into the language of mathematics:
$$\begin{gathered}A\text{ and }B\text{ are independent}\\
\downarrow\\
P\left(B\mid A\right)=P\left(B\right)\\
\downarrow\\
1-P\left(B^{C}\mid A\right)=1-P\left(B^{C}\right)\\
\downarrow\\
P\left(B^{C}\mid A\right)=P\left(B^{C}\right)\\
\downarrow\\
A\text{ and }B^{C}\text{ are independent}\\
\downarrow\;\text{(by symmetry)}\\
P\left(A\mid B^{C}\right)=P\left(A\right)\\
\downarrow\\
1-P\left(A^{C}\mid B^{C}\right)=1-P\left(A^{C}\right)\\
\downarrow\\
P\left(A^{C}\mid B^{C}\right)=P\left(A^{C}\right)\\
\downarrow\\
B^{C}\text{ and }A^{C}\text{ are independent}
\end{gathered}$$
But now we have used conditional probabilities, which can be a problem when $P(A)=0$ or $P(B^{C})=0$ (here is a discussion about this problem).
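For completeness, here is a sketch of how the conditional probabilities can be avoided altogether (this is the standard textbook computation, so it may well overlap with gradient23's proof): starting from the product definition $P(A\cap B)=P(A)P(B)$ and using De Morgan's law together with inclusion-exclusion,

$$\begin{aligned}P\left(A^{C}\cap B^{C}\right)&=1-P\left(A\cup B\right)\\
&=1-P\left(A\right)-P\left(B\right)+P\left(A\cap B\right)\\
&=1-P\left(A\right)-P\left(B\right)+P\left(A\right)P\left(B\right)\\
&=\left(1-P\left(A\right)\right)\left(1-P\left(B\right)\right)\\
&=P\left(A^{C}\right)P\left(B^{C}\right).\end{aligned}$$

Since no division is involved, this version also covers the edge cases $P(A)=0$ and $P(B^{C})=0$.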