
Not a duplicate of: *Suppose $F$ and $G$ are families of sets. Prove that $(\bigcup F) \setminus (\bigcup G) \subseteq \bigcup (F \setminus G)$.*

This is exercise 3.4.20(a) from the book How to Prove It by Velleman (2nd edition):

Suppose $\mathcal F$ and $\mathcal G$ are families of sets. Prove that $(\bigcup\mathcal F)\setminus(\bigcup\mathcal G)\subseteq\bigcup(\mathcal F\setminus\mathcal G).$

Here is my proof:

Let $x$ be an arbitrary element of $(\bigcup\mathcal F)\setminus(\bigcup\mathcal G)$. This means $x\in\bigcup\mathcal F$ and $x\notin\bigcup\mathcal G$. Since $x\in\bigcup\mathcal F$, we can choose some $A_0$ such that $A_0\in \mathcal F$ and $x\in A_0$. The statement $x\notin\bigcup\mathcal G$ is equivalent to $\forall B(B\in\mathcal G\rightarrow x\notin B)$, and in particular $A_0\in\mathcal G\rightarrow x\notin A_0$. From $A_0\in\mathcal G\rightarrow x\notin A_0$ and $x\in A_0$, $A_0\notin\mathcal G$. From $A_0\in\mathcal F$ and $A_0\notin\mathcal G$, $A_0\in\mathcal F\setminus\mathcal G$. From $A_0\in\mathcal F\setminus\mathcal G$ and $x\in A_0$, $x\in\bigcup(\mathcal F\setminus\mathcal G)$. Therefore if $x\in(\bigcup\mathcal F)\setminus(\bigcup\mathcal G)$ then $x\in\bigcup(\mathcal F\setminus\mathcal G)$. Since $x$ is arbitrary, $\forall x\Bigl(x\in(\bigcup\mathcal F)\setminus(\bigcup\mathcal G)\rightarrow x\in\bigcup(\mathcal F\setminus\mathcal G)\Bigr)$, and so $(\bigcup\mathcal F)\setminus(\bigcup\mathcal G)\subseteq\bigcup(\mathcal F\setminus\mathcal G)$. Q.E.D.
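For readers who want to machine-check the reasoning: the argument above translates almost line by line into Lean 4 with Mathlib. The following is a sketch, not guaranteed to compile verbatim; it assumes Mathlib's `Set.sUnion`, written `⋃₀`, for the union of a family.

```lean
import Mathlib.Data.Set.Lattice

variable {α : Type*}

example (F G : Set (Set α)) : (⋃₀ F) \ (⋃₀ G) ⊆ ⋃₀ (F \ G) := by
  -- Let x be arbitrary with x ∈ ⋃₀F and x ∉ ⋃₀G.
  rintro x ⟨hxF, hxG⟩
  -- Choose A₀ with A₀ ∈ F and x ∈ A₀.
  obtain ⟨A₀, hA₀F, hxA₀⟩ := Set.mem_sUnion.1 hxF
  -- If A₀ were in G, then x would lie in a member of G; hence A₀ ∉ G.
  have hA₀G : A₀ ∉ G := fun h => hxG (Set.mem_sUnion.2 ⟨A₀, h, hxA₀⟩)
  -- From A₀ ∈ F, A₀ ∉ G, and x ∈ A₀, conclude x ∈ ⋃₀(F \ G).
  exact Set.mem_sUnion.2 ⟨A₀, ⟨hA₀F, hA₀G⟩, hxA₀⟩
```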

Is my proof valid?

Thanks for your attention.

  • Too many symbols for my taste. Try to avoid quantifiers, and especially the symbol $\to$, in proofs. – J. De Ro Jul 13 '20 at 11:47
  • I second @ε-δ. The use of the symbol $\to$ here is, in my opinion, non-standard and confusing. Is it meant to be $\Rightarrow$? Or does it have some other meaning? From the context my guess would be the former, but the doubt makes an otherwise straightforward proof difficult to read. – SeraPhim Jul 13 '20 at 11:51
  • @SeraPhim In the above book by Velleman that I am using to self-study, the author uses $\rightarrow$ instead of $\Rightarrow$ throughout the book. – Khashayar Baghizadeh Jul 13 '20 at 11:53
  • Ah, I see; that's unfortunate. Sometimes authors like to be cool and hip and use non-standard notation. I personally think it's counter-productive to do this, and it just causes more problems than it solves (if it solves any at all). Especially when the standard $\Rightarrow$ is so ubiquitous. – SeraPhim Jul 13 '20 at 11:56
  • @SeraPhim I also found the following link: https://math.stackexchange.com/questions/2336918/what-is-the-difference-between-implication-symbols-rightarrow-and-rightarr – Khashayar Baghizadeh Jul 13 '20 at 12:05
  • I'm not saying it isn't used. I'm saying it's not standard for set-theory proofs. – SeraPhim Jul 13 '20 at 12:18
  • What does $F\setminus G$ mean when $F$ and $G$ are families of sets? Is it $\{f\setminus g \mid f\in F \text{ and } g\in G\}$? – snulty Jul 13 '20 at 12:44
  • @snulty Well, $F$ and $G$ are sets in their own right, so it is the standard set-minus operation. – SeraPhim Jul 13 '20 at 12:50
  • @SeraPhim Ah yeah, that one makes a lot more sense. I find this union without any indexing a bit confusing; it's probably more general (not relying on indexing), but I'm not as used to it. – snulty Jul 13 '20 at 12:52
  • @snulty See the discussion under my answer concerning the indexing. Not using indexing seems to just be a choice by the author to make things look tidier, but it isn't actually any more general. – SeraPhim Jul 13 '20 at 12:58

2 Answers


I think your way of arguing is fine, but one can write it down more clearly. Here is how I would write down your argument:

(I assume that $\bigcup \mathcal{F}$ stands for the union of all sets in $\mathcal{F}$.)

Let $x \in (\bigcup \mathcal{F}) \setminus (\bigcup \mathcal{G})$. That is equivalent to the condition that both $x \in \bigcup \mathcal{F}$ and $x \notin \bigcup \mathcal{G}$ hold.

Hence, there must be a set $A_0$ in the family $\mathcal{F}$ containing $x$ (because $x \in \bigcup \mathcal{F}$ holds) and no set in $\mathcal{G}$ can contain $x$ (because $x \notin \bigcup \mathcal{G}$ holds).

Since $x \in A_0$ but no set in $\mathcal{G}$ contains $x$, the set $A_0$ in particular cannot be in the family $\mathcal{G}$: $A_0 \notin \mathcal{G}$.

But $A_0 \in \mathcal{F}$ and $A_0 \notin \mathcal{G}$ together imply $A_0 \in \mathcal{F}\setminus\mathcal{G}$, so altogether

\begin{equation} x \in A_0 \subseteq \bigcup (\mathcal{F}\setminus\mathcal{G}). \end{equation}
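The last containment uses a small general fact worth isolating: each member of a family is a subset of the union of that family, here applied to the family $\mathcal{F}\setminus\mathcal{G}$. In Lean/Mathlib terms this is, if I have the name right, `Set.subset_sUnion_of_mem`; a sketch:

```lean
import Mathlib.Data.Set.Lattice

variable {α : Type*}

-- The step x ∈ A₀ ⊆ ⋃₀(F \ G): a member of a family is contained
-- in the union of the family, applied to the family F \ G.
example (F G : Set (Set α)) (A₀ : Set α) (h : A₀ ∈ F \ G) : A₀ ⊆ ⋃₀ (F \ G) :=
  Set.subset_sUnion_of_mem h
```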


Your proof seems fine, but as mentioned in the comments it could do with some tidying.

Let $\mathcal{F}=\{F_i\}_{i\in I}$ and $\mathcal{G} = \{G_j\}_{j\in J}$ be families of sets indexed by $I$ and $J$ respectively, and let $x\in (\bigcup_{i} F_i)\setminus(\bigcup_{j} G_j)$. Then $x\in\bigcup_{i} F_i$ and $x\notin\bigcup_{j} G_j$. Therefore there is some $F_k\in\mathcal{F}$ such that $x\in F_k$, while $x\notin G_j$ for all $G_j\in\mathcal{G}$. In particular $F_k$ cannot equal any $G_j$ (it contains $x$ and they do not), so $F_k \in\mathcal{F}\setminus\mathcal{G}$, and hence $x\in\bigcup_{t\in T} H_t$, where $\{H_t\}_{t\in T}$ is an indexing of $\mathcal{F}\setminus\mathcal{G}$.

SeraPhim
  • Why did you assume that $\mathcal F$ and $\mathcal G$ are indexed families? – Khashayar Baghizadeh Jul 13 '20 at 12:20
  • I just prefer to give families of sets an indexing set. It's not really an important part of the proof. – SeraPhim Jul 13 '20 at 12:22
  • Although now that you mention it, I'm not actually sure if all families of sets can be indexed... I was under the impression that they could be, but now you've made me doubt myself. Stand by while I do some digging. – SeraPhim Jul 13 '20 at 12:26
  • I think any family of sets is just indexed by itself: let $I = \mathcal F$; then $\mathcal F = \{F_i\}_{i \in I}$, where $F_i = i$. But this does kind of expose that the indexing isn't really needed; you could just write "there is some $F \in \mathcal F$" instead. – Izaak van Dongen Jul 13 '20 at 12:32
  • @IzaakvanDongen Good point! You're right that the use of indexes isn't really necessary. It's more a personal preference. I find it helps me think about families of sets a little better. – SeraPhim Jul 13 '20 at 12:36
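
The indexing point raised at the end of this thread can also be checked formally: indexing a family by itself (take $I=\mathcal F$ and $F_i=i$) recovers the plain union, so the indexed statement of the exercise is no more general than the un-indexed one. A Lean 4 sketch, again assuming Mathlib, where the bounded union `⋃ A ∈ F, A` plays the role of $\bigcup_{i\in I}F_i$:

```lean
import Mathlib.Data.Set.Lattice

variable {α : Type*}

-- A family is indexed by itself: the union over the index set I = F
-- with Fᵢ = i coincides with the plain union ⋃₀ F.
example (F : Set (Set α)) : ⋃₀ F = ⋃ A ∈ F, A := by
  ext x
  simp  -- both sides unfold to: ∃ A ∈ F, x ∈ A
```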