19

Prove that the union of three subspaces of V is a subspace iff one of the subspaces contains the other two.

I can do this problem when I am working in only two subspaces of $V$ but I don't know how to do it with three.

What I tried: if one of the subspaces contains the other two, then their union is obviously a subspace, because it equals the containing subspace. (Is this sufficient?)

If the union of three subspaces is a subspace, how do I prove from there that one of the subspaces must contain the other two?

*When proving this for two subspaces, I assumed there is an element of one subspace that is not in the other and showed by contradiction that one of the subspaces must be contained in the other. How would I do this for three?

Soaps

6 Answers

33

The statement is false. Consider the following counterexample:

Consider the vector space $V=(\mathbb{Z}/2\mathbb{Z})^{2}$ over the field $F=\mathbb{Z}/2\mathbb{Z}$. Let $V_{1}$ be spanned by $(1,0)$, let $V_{2}$ be spanned by $(0,1)$, and let $V_{3}$ be spanned by $(1,1)$. Then $V=V_{1}\cup V_{2}\cup V_{3}$, so the union is a subspace, but none of $V_{1},V_{2},V_{3}$ is a subspace of another.
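The counterexample is small enough to check mechanically. A minimal sketch in Python (my own representation, encoding each subspace as the set of its vectors over $\mathbb{Z}/2\mathbb{Z}$):

```python
from itertools import product

# All vectors of V = (Z/2Z)^2.
V = set(product(range(2), repeat=2))

def span(v):
    """The subspace of (Z/2Z)^2 spanned by the single vector v."""
    return {tuple((c * x) % 2 for x in v) for c in range(2)}

V1, V2, V3 = span((1, 0)), span((0, 1)), span((1, 1))

# The union of the three lines is the whole space, hence a subspace...
assert V1 | V2 | V3 == V

# ...yet none of the three contains another.
assert not any(A <= B for A in (V1, V2, V3)
               for B in (V1, V2, V3) if A is not B)
print("counterexample verified")
```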

You can usually count on fields of characteristic $2$ to give you counterexamples, and there are many similar counterexamples. In finite dimensions, I think all counterexamples can be constructed this way. My intuition tells me that there are infinite-dimensional counterexamples of other forms, but I have not checked carefully.

EDIT. Here is a proof of the statement with the restriction $F\not=\mathbb{Z}/2\mathbb{Z}$:

Without loss of generality, we may assume the whole space $V$ is in fact $V_{1}+V_{2}+V_{3}$ (otherwise replace $V$ by the subspace $V_{1}+V_{2}+V_{3}$; nothing in the statement changes). Then, since $V_{1}\cup V_{2}\cup V_{3}$ is assumed to be a subspace containing each $V_{i}$, it contains $V_{1}+V_{2}+V_{3}=V$, so in fact $V=V_{1}\cup V_{2}\cup V_{3}$.

There exist $a,b\in F$ such that $a,b\not=0$ and $a-b=1$ (take $a$ to be anything except $0,1$, and take $b=a-1$).

Assume that neither of $V_{1}$ and $V_{2}$ contains the other (otherwise this reduces to the 2-subspace case). For any $u\in V_{1}\setminus(V_{1}\cap V_{2})$ take an arbitrary $w\in V_{2}\setminus(V_{1}\cap V_{2})$ (it exists because neither of $V_{1}$, $V_{2}$ contains the other). Then $au+w$ is in neither $V_{1}$ nor $V_{2}$: if it were in $V_{1}$, then since $au\in V_{1}$ we would have $w\in V_{1}$, so $w\in V_{1}\cap V_{2}$, a contradiction; if it were in $V_{2}$, then since $w\in V_{2}$ we would have $au\in V_{2}$, and since $a\not=0$ this gives $u\in V_{2}$, again a contradiction. Hence $au+w\in V_{3}$. The same argument (now using $b\not=0$) shows $bu+w\in V_{3}$. Hence $u=(au+w)-(bu+w)\in V_{3}$, since $a-b=1$. Thus $V_{1}\setminus(V_{1}\cap V_{2})\subset V_{3}$, and by symmetry $V_{2}\setminus(V_{1}\cap V_{2})\subset V_{3}$.

Now for any $v\in V_{1}\cap V_{2}$ pick a $w\in V_{2}\setminus(V_{1}\cap V_{2})\subset V_{3}$. Then $w+v\notin V_{1}\cap V_{2}$ (otherwise $w=(w+v)-v\in V_{1}\cap V_{2}$). But $w+v\in V_{2}$, hence $w+v\in V_{2}\setminus(V_{1}\cap V_{2})\subset V_{3}$. Thus $v=(w+v)-w\in V_{3}$, so $V_{1}\cap V_{2}\subset V_{3}$. Since $V_{1}=(V_{1}\setminus(V_{1}\cap V_{2}))\cup(V_{1}\cap V_{2})$, and likewise for $V_{2}$, we conclude $V_{1},V_{2}\subset V_{3}$.

Gina
    Why can we assume that the whole space $V$ is in fact $V_{1}+V_{2}+V_{3}$? – ubadub Mar 04 '19 at 01:51
  • Why does $V_1\cap V_2\subset V_3$ imply $V_1\subset V_3$ and $V_2\subset V_3$? – perimasu Mar 09 '21 at 19:28
  • Gina showed that $(V_{1} \setminus (V_{1} \cap V_{2})) \subset V_{3}$ and $(V_{1} \cap V_{2}) \subset V_{3}$. Since $V_1 = (V_{1} \setminus (V_{1} \cap V_{2})) \cup (V_{1} \cap V_{2})$, then $V_{1} \subset V_{3}$. Because the argument for $V_{2}$ is essentially identical, they didn't bother showing that part. – GhostyOcean Dec 31 '22 at 15:59
15

Gina's answer is great, but I think we can clean it up a bit.

Let $U_1,U_2,U_3$ be subspaces of $V$ over a field $k\neq \mathbb{F}_2$.

$(\Leftarrow)$ Suppose that one of the subspaces contains the other two. Without loss of generality, assume $U_1\subset U_3$ and $U_2\subset U_3$. Then $U_1\cup U_2\cup U_3 = U_3$, and so $U_1\cup U_2\cup U_3$ is indeed a subspace of $V$.

$(\Rightarrow)$ Now suppose $U_1\cup U_2\cup U_3$ is a subspace. If $U_2$ contains $U_3$ (or conversely), let $W = U_2 \cup U_3$. Then applying the case of the union of two subspaces (you need to prove this case first) to the union $U_1\cup W$, we have that either $U_1$ contains $W$ or $W$ contains $U_1$, showing that one of the three subspaces contains the other two, as desired. So assume $U_2$ and $U_3$ are such that neither contains the other. Let \begin{equation*} x\in U_2\setminus U_3 ~~~ \text{and} ~~~ y\in U_3\setminus U_2, \end{equation*} and choose nonzero $a,b\in k$ such that $a-b = 1$ (such $a,b$ exist since we assume $k$ is not $\mathbb{F}_2$).

We claim that $ax + y$ and $bx + y$ are both in $U_1$. To see that $ax + y\in U_1$, suppose not. Then either $ax + y\in U_2$ or $ax + y\in U_3$. If $ax + y\in U_2$, then we have $(ax + y) - ax = y\in U_2$, a contradiction. And if $ax +y \in U_3$, we have $(ax + y) - y = ax \in U_3$, another contradiction, and so $ax+y\in U_1$. Similarly for $bx + y$, suppose $bx + y\in U_2$. Then $(bx + y) - bx = y \in U_2$, a contradiction. And if $bx + y\in U_3$, then $(bx + y) - y = bx \in U_3$, also a contradiction. Thus $bx + y\in U_1$ as well. Therefore \begin{equation*} (ax + y) - (bx + y) = (a-b)x = x \in U_1. \end{equation*} Now, since $x\in U_2\setminus U_3$ implies $x \in U_1$, we have $U_2\setminus U_3\subset U_1$. A similar argument shows that $x + ay$ and $x + by$ must be in $U_1$ as well, and hence \begin{equation*} (x + ay) - (x + by) = (a - b)y = y \in U_1, \end{equation*} and therefore $U_3\setminus U_2\subset U_1$. If $U_2\cap U_3=\{0\}$, we're done, so assume otherwise.

Now for any $u\in U_2\cap U_3$, choose $v \in U_3\setminus U_2\subset U_1$. Then $u+v\not\in U_2\cap U_3$, for otherwise $(u+v)-u=v\in U_2$, a contradiction. But this implies $u+v$ must be in $U_1$, and hence so is $(u+v) - v = u$. In other words, if $u\in U_2\cap U_3$, then $u\in U_1$, and hence $U_2\cap U_3\subset U_1$. Combining, $U_2=(U_2\setminus U_3)\cup(U_2\cap U_3)\subset U_1$ and likewise $U_3\subset U_1$, so $U_1$ contains the other two, as was to be shown. $\tag*{$\square$}$
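As a sanity check on the full equivalence over a field other than $\mathbb{F}_2$, one can enumerate every triple of subspaces of the small space $(\mathbb{Z}/3\mathbb{Z})^{2}$ and test both directions directly. A brute-force sketch in Python (my own encoding: a subspace is the frozenset of its vectors):

```python
from itertools import combinations, product

P = 3  # the field F_3; the statement should hold over any field other than F_2
vectors = list(product(range(P), repeat=2))

def add(u, v):
    return tuple((a + b) % P for a, b in zip(u, v))

def scale(c, v):
    return tuple((c * a) % P for a in v)

def is_subspace(S):
    """A finite set is a subspace iff it contains 0 and is closed under + and scalars."""
    return ((0, 0) in S
            and all(add(u, v) in S for u in S for v in S)
            and all(scale(c, v) in S for c in range(P) for v in S))

# Enumerate every subspace of (F_3)^2 by brute force over all subsets:
# we expect {0}, the four lines through the origin, and the whole plane.
subspaces = [frozenset(S)
             for r in range(1, len(vectors) + 1)
             for S in combinations(vectors, r)
             if is_subspace(frozenset(S))]
assert len(subspaces) == 6

# The union of three subspaces is a subspace iff one contains the other two.
for A, B, C in product(subspaces, repeat=3):
    union_is_subspace = is_subspace(A | B | C)
    one_contains_rest = any(X >= Y and X >= Z
                            for X, Y, Z in ((A, B, C), (B, A, C), (C, A, B)))
    assert union_is_subspace == one_contains_rest
print("equivalence holds for all", len(subspaces) ** 3, "triples")
```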

This problem appears in the first chapter of Linear Algebra Done Right, by Axler. I personally think it's pretty challenging for so early in an introductory linear algebra book, but it's a great exercise. Lots of details to keep straight.

JeffW89
3

Gina gave an excellent answer. In fact, we have the following generalization: let $V$ be a vector space over the field $F$ and let $\{U_1,U_2,U_3,\cdots,U_n\}$ be a finite collection of subspaces of $V$. If $n$ is not more than the cardinality of $F$ when $F$ is finite, or $F$ is infinite, then the union $U_1\cup U_2\cup\cdots\cup U_n$ is a subspace of $V$ if and only if one of the subspaces $U_1,U_2,\cdots,U_n$ contains all the others. The proof is similar to the way one proves that a vector space over an infinite field cannot be a finite union of its own proper subspaces, arguing by contradiction and using the pigeonhole principle to deduce an absurdity (imagine the elements of $F$ "flying" into the subspaces $U_1,U_2,\cdots,U_n$).
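To see why some bound relating $n$ to the cardinality of $F$ is needed, note that $(\mathbb{Z}/3\mathbb{Z})^{2}$ is the union of its four lines through the origin: $n=4$ subspaces over a field with $3$ elements, none containing another. A quick check of this in Python (my own encoding of subspaces as sets of vectors):

```python
from itertools import product

P = 3
plane = set(product(range(P), repeat=2))

def span(v):
    """The line through the origin in (F_3)^2 spanned by v."""
    return {tuple((c * x) % P for x in v) for c in range(P)}

# The four distinct lines through the origin in (F_3)^2.
lines = [span(v) for v in [(1, 0), (0, 1), (1, 1), (1, 2)]]

# Their union is the whole plane (a subspace), yet no line contains another.
assert set().union(*lines) == plane
assert not any(A < B for A in lines for B in lines)
print("4 lines cover (F_3)^2")
```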

W.Leywon
    Readers who are interested in this may also be interested in the proof of the 'prime avoidance lemma'. – W.Leywon Feb 18 '17 at 14:58
  • I can prove it for the case where $|F| > n$, but is it also true for the case where $|F| = n$? Because in this case, I don't think we have enough elements in $F^{\times}$ to apply the pigeonhole principle to gather the fact that some $U_j$ not equal to $U_{1}$ contains at least two elements of the form $x + \alpha y$, where $x \in U_1$ and $y \in \bigcup_{i = 1}^n U_i \setminus U_1$, and $\alpha \in F^{\times}$. – jsmith Aug 07 '22 at 01:38
2

With regard to Gina's and Jeff's answers, I believe the reduction at the start can be made simpler (at least in my opinion).

Suppose that subspaces $U_1,U_2,U_3$ union to form a subspace. We have 2 cases.

Case 1: Suppose $U_i\subseteq \cup_{j\neq i}U_j$ for some $i\in \{1,2,3\}$. WLOG, let $i=1$. Then $U_1\cup U_2\cup U_3=U_2\cup U_3$ is a subspace, so the problem is reduced to the case of the two subspaces $U_2,U_3$. WLOG again, we have $U_2\subseteq U_3$, and then $U_1\subseteq U_2 \cup U_3 = U_3$. Hence $U_3$ contains the other two subspaces.

Case 2: $\forall i\in \{1,2,3\}$, $U_i\not\subseteq \cup_{j\neq i}U_j$. Then for each $i$ there exists $\mathbf{u}_i\in U_i$ such that $\mathbf{u}_i\notin \cup_{j\neq i}U_j$, i.e. $\mathbf{u}_i\notin U_j$ for $j\neq i$.

After this, we can make a similar argument to that of Gina's. For simplicity, I assume the field in concern to be the real/complex field like in Axler's book.

The vectors $\mathbf{u}_1 + \mathbf{u}_2$, $2\mathbf{u}_1 + \mathbf{u}_2$, $\mathbf{u}_1 + 2\mathbf{u}_2$ must all be in $U_3\setminus (U_1 \cup U_2)$: each lies in $\cup_{1\leq i\leq 3} U_i$, and membership in $U_1$ or $U_2$ would force $\mathbf{u}_2\in U_1$ or $\mathbf{u}_1\in U_2$ by subtraction. In particular, these vectors lie in $U_3$. Subtracting, $\mathbf{u}_1=(2\mathbf{u}_1+\mathbf{u}_2)-(\mathbf{u}_1+\mathbf{u}_2)\in U_3$ and $\mathbf{u}_2=(\mathbf{u}_1+2\mathbf{u}_2)-(\mathbf{u}_1+\mathbf{u}_2)\in U_3$, a contradiction.

Hence only case 1 is possible and we are done.

0

I had some trouble trying to prove this statement, because neither contradiction nor the logical equivalence $p \rightarrow (q \vee r) \equiv (p \wedge \neg q) \rightarrow r$ seems to provide a strong enough hypothesis to reach the conclusion of the hard conditional (both work just fine for proving that if the union of two subspaces is a subspace, then one of the subspaces contains the other). After reading the first few lines of JeffW89's proof, I took a hint and found my way to a complete proof of the claim. I post my proof of the difficult implication because it is somewhat different from the others posted; it rests on the following proposition.

Proposition. Let $V$ be a vector space over a field $F$ and let $u \in V$. Writing $V+u=\left\{v+u:v \in V\right\}$, we have $V+u=V$.

Proof. It is clear that $V+u \subseteq V$ by additive closure. If $v \in V$, then $v-u \in V$ also, so that $(v-u)+u=v \in V+u$, thus $V+u \supseteq V$, that is, $V=V+u$. QED.
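The proposition is easy to illustrate concretely; for instance, translating a line in $(\mathbb{Z}/3\mathbb{Z})^{2}$ by any of its own elements returns the same line. A small Python check (my own encoding of the line as a set of vectors):

```python
P = 3
# A subspace of (F_3)^2: the line spanned by (1, 2).
V = {tuple((c * x) % P for x in (1, 2)) for c in range(P)}

for u in V:
    # The translate V + u = {v + u : v in V} equals V whenever u lies in V.
    translate = {tuple((a + b) % P for a, b in zip(v, u)) for v in V}
    assert translate == V
print("V + u == V for every u in V")
```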

Claim. Let $V$ be a vector space over a field $F$ with more than two elements, and suppose $U_{1}, U_{2}, U_{3}$ are subspaces of $V$. If $U_{1} \cup U_{2} \cup U_{3}$ is a subspace of $V$, then there exist distinct indices $i,j,k$ such that $U_{i}, U_{j} \subseteq U_{k}$.

Proof. (We omit the case $U_{i} \subseteq U_{j}$ for $i \neq j$ because, at this point, it is easy to show that it implies either $U_{k} \subseteq U_{j}$ or $U_{j} \subseteq U_{k}$ for $k \neq i,j$.) Without loss of generality, suppose that $U_{1} \not\subseteq U_{2}$ and $U_{2} \not\subseteq U_{1}$; then there exist $u_{1} \in U_{1}$ and $u_{2} \in U_{2}$ such that $u_{1} \notin U_{2}$ and $u_{2} \notin U_{1}$. Since $u_{1} + u_{2} \in U_{1} \cup U_{2} \cup U_{3}$, there exists $i$ such that $u_{1}+u_{2} \in U_{i}$. If $u_{1}+u_{2} \in U_{1}$, then $u_{2} \in U_{1}$ by additive closure on $U_{1}$, a contradiction; likewise, if $u_{1}+u_{2} \in U_{2}$, then $u_{1} \in U_{2}$. Hence $u_{1}+u_{2} \in U_{3}\setminus(U_{1}\cup U_{2})$.

Let $w_{1} \in U_{1}$ and $w_{2} \in U_{2}$ be given. If $w_{1}+u_{1}+u_{2} \in U_{1}$, then $u_{1}+u_{2} \in U_{1}$, which is impossible. If $w_{1}+u_{1}+u_{2} \in U_{2}$, then $w_{1}+u_{1} \in U_{2}$; since the proposition gives $U_{1}+u_{1}=U_{1}$, this case would yield $U_{1} \subseteq U_{2}$, a contradiction. Thus $w_{1}+u_{1}+u_{2} \in U_{3}$, and hence $w_{1} \in U_{3}$, since $u_{1}+u_{2}\in U_{3}$. We can show that $w_{2} \in U_{3}$ by a similar argument. Finally, we conclude that $U_{1}, U_{2} \subseteq U_{3}$. QED.

0

This is similar to the above solutions but I'm writing it here for clarity:

$\Rightarrow$ Denote the subspaces by $W_1, W_2, W_3$. Assume for contradiction that $(W_1 \cup W_2) \not\subset W_3$ and similarly for the other pairs. Take $x_1 \in W_1\setminus(W_2 \cup W_3)$ and similarly $x_2$ and $x_3$ (such elements exist: if some $W_i\subseteq\cup_{j\neq i}W_j$, the union reduces to a union of two subspaces, and the two-subspace case would make one of the assumed non-inclusions fail).

Assuming $\mathbb{F} \ne \mathbb{F}_2$, take $\lambda \in \mathbb{F}$ such that $\lambda \ne 0$ and $\lambda^2 \ne 1$ (over $\mathbb{R}$ or $\mathbb{C}$, as in Axler, $\lambda=2$ works; such a $\lambda$ exists whenever $|\mathbb{F}|>3$, while for $\mathbb{F}_3$ one can instead use the $a-b=1$ trick from the answers above). Then consider the set $$\{\lambda{x_1}+x_2,\ x_1+\lambda{x_2},\ x_1+x_2,\ x_1\}.$$ By pigeonhole, at least two of these four elements lie in the same subspace. If two are in $W_1$, it is easily deduced that $x_2 \in W_1$, a contradiction. If two are in $W_2$, then since the last element cannot be in $W_2$, one of the first three is in $W_2$, and subtraction again gives $x_1 \in W_2$, a contradiction. If two are in $W_3$ (the last element cannot be in $W_3$), we get either $$\lambda{x_1}+x_2-\lambda(x_1+\lambda{x_2})=(1-\lambda^2)x_2 \in W_3 \implies x_2 \in W_3$$ or $$\lambda{x_1}+x_2 - \lambda(x_1+x_2)=(1-\lambda)x_2 \in W_3 \implies x_2 \in W_3,$$ with the final case symmetric. In all cases we get a contradiction, so in fact one of the inclusions holds.

$\Leftarrow$ WLOG $W_1$ contains the other two; then $W_1\cup{W_2}\cup{W_3}=W_1$, which is a subspace. $\blacksquare$