
I read that the vector space of abstract "tuples" is isomorphic to the vector space of $n \times 1$ or $1 \times n$ matrices.

Where can I find a good explanation of this or can someone explain it here?

LearningMath
  • 1,201

2 Answers


What an $n$-vector looks like: $(a_1, \dots, a_n)$.

What a $1 \times n$ matrix looks like: $[a_1, \dots, a_n]$.

What an $n \times 1$ matrix looks like: $[a_1, \dots, a_n]^T$.

Try to guess a map from one of these spaces to the other based on this. Can you show that your guess is linear? If so you're basically done.
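If it helps to experiment, here is a minimal sketch of the guess in Python with NumPy (NumPy and the name `T` are my own choices, not part of the answer): the map sends an $n$-tuple to a $1 \times n$ row matrix, and we can spot-check linearity numerically.

```python
import numpy as np

def T(t):
    """Candidate map: send an n-tuple to a 1 x n row matrix."""
    return np.array(t).reshape(1, -1)

u, v = (1.0, 2.0, 3.0), (4.0, 5.0, 6.0)
a, b = 2.0, -1.0

# Spot-check linearity: T(a*u + b*v) should equal a*T(u) + b*T(v)
au_bv = tuple(a * x + b * y for x, y in zip(u, v))
assert np.allclose(T(au_bv), a * T(u) + b * T(v))
```

A numerical check like this is not a proof, of course, but it can make the right guess easier to trust before writing the proof down.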

syusim
  • 2,195

The Vector Spaces

I will only consider real-valued $n$-tuples, row matrices, and column matrices in this answer.

An $n$-tuple is an element of the space $\Bbb R^n = \overbrace{\Bbb R \times \cdots \times \Bbb R}^{n\text{ times}}$. A typical element of $\Bbb R^n$ has the form $(a_1, a_2, \cdots, a_n)$, where $a_1, a_2, \cdots, a_n \in \Bbb R$. Addition and scalar multiplication are defined by

  • $(u_1, \cdots, u_n) + (v_1, \cdots, v_n) = (u_1 + v_1, \cdots, u_n + v_n)$
  • $\forall k \in \Bbb R$, $k(u_1, \cdots, u_n) = (ku_1, \cdots, ku_n)$
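These componentwise operations can be sketched directly in plain Python (a sketch of my own, not part of the answer):

```python
# Componentwise operations on n-tuples, modelled as Python tuples
u = (1.0, 2.0, 3.0)
v = (4.0, 5.0, 6.0)
k = 2.0

u_plus_v = tuple(x + y for x, y in zip(u, v))  # (u_1 + v_1, ..., u_n + v_n)
k_u = tuple(k * x for x in u)                  # (k*u_1, ..., k*u_n)

print(u_plus_v)  # (5.0, 7.0, 9.0)
print(k_u)       # (2.0, 4.0, 6.0)
```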

A row matrix is an element of the space $M_{1\times n}(\Bbb R)$. A typical element looks like $\begin{bmatrix} a_1 & a_2 & \cdots & a_n \end{bmatrix}$. Addition and scalar multiplication are defined as usual for matrices.

A column matrix is an element of the space $M_{n\times 1}(\Bbb R)$. A typical element has the form $\begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n\end{bmatrix}$. Addition and scalar multiplication are defined as usual for matrices.
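In NumPy terms (my own illustration, with $n = 3$), the three representations carry the same entries and differ only in shape:

```python
import numpy as np

t = (1.0, 2.0, 3.0)                # an n-tuple in R^3
row = np.array(t).reshape(1, 3)    # an element of M_{1x3}(R)
col = np.array(t).reshape(3, 1)    # an element of M_{3x1}(R)

print(row.shape)  # (1, 3)
print(col.shape)  # (3, 1)
```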


What Does It Mean for Spaces to Be Isomorphic?

An isomorphism is a bijective homomorphism. If at least one bijective homomorphism exists between two spaces, they are called isomorphic.

A bijection is a function that is injective and surjective.

A homomorphism is a structure-preserving function. In the case of vector spaces, homomorphisms are called linear transformations. The defining property of linear transformations is that of linearity, i.e. if $f$ is linear, then $f(a\vec u + b\vec v) = af(\vec u) + bf(\vec v)$.

Our goal is then to find a bijective homomorphism between each pair of the three spaces above.


The Case of $\Bbb R^n$ and $M_{1\times n}(\Bbb R)$

We'll begin by considering the case of $\Bbb R^n$ and $M_{1\times n}(\Bbb R)$. Let $T$ be defined by $$T(a_1, a_2, \cdots, a_n) = \begin{bmatrix} a_1 & a_2 & \cdots & a_n\end{bmatrix}$$

I claim that $T$ is an isomorphism. Let's see if I'm right.


Proof of Injectivity:

I'll prove the contrapositive of the usual statement. Assume $(a_1, a_2, \cdots, a_n) \ne (b_1, b_2, \cdots, b_n)$; then $a_i \ne b_i$ for at least one $i$.

$T(a_1, a_2, \cdots, a_n) = \begin{bmatrix} a_1 & a_2 & \cdots & a_n\end{bmatrix}$ and $T(b_1, b_2, \cdots, b_n) = \begin{bmatrix} b_1 & b_2 & \cdots & b_n\end{bmatrix}$. If $\begin{bmatrix} a_1 & a_2 & \cdots & a_n\end{bmatrix}$ were equal to $\begin{bmatrix} b_1 & b_2 & \cdots & b_n\end{bmatrix}$, then by definition $a_i = b_i$ for all $i$. But we already said that's not the case. Therefore $$(a_1, a_2, \cdots, a_n) \ne (b_1, b_2, \cdots, b_n) \implies T(a_1, a_2, \cdots, a_n) \ne T(b_1, b_2, \cdots, b_n)$$

Thus $T$ is injective.$\ \ \ \ \square$


Proof of Surjectivity:

Consider the arbitrary element of $M_{1\times n}(\Bbb R)$ given by $\begin{bmatrix} a_1 & a_2 & \cdots & a_n \end{bmatrix}$. What is this matrix the image of under $T$? It's clearly the case that $$T(a_1, \cdots, a_n) = \begin{bmatrix} a_1 & \cdots & a_n\end{bmatrix}$$

But $(a_1, \cdots, a_n) \in \Bbb R^n$ because $\Bbb R^n$ contains all such real-valued $n$-tuples. Thus, because an arbitrary element of $M_{1\times n}(\Bbb R)$ is the image of an element of $\Bbb R^n$, every element of $M_{1\times n}(\Bbb R)$ must be the image of some element of $\Bbb R^n$.$\ \ \ \ \square$

NOTE: I'm not entirely happy with this proof, but proving obvious things is sometimes weirdly difficult. Oh well.
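Numerically, the surjectivity argument amounts to reading the entries of an arbitrary row matrix back off as a tuple. A NumPy sketch (the names are mine, not from the answer):

```python
import numpy as np

def T(t):
    """Send an n-tuple to a 1 x n row matrix."""
    return np.array(t).reshape(1, -1)

M = np.array([[7.0, 8.0, 9.0]])   # arbitrary element of M_{1x3}(R)
preimage = tuple(M[0])            # the tuple (a_1, ..., a_n) of its entries
assert np.array_equal(T(preimage), M)
```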


Proof of Linearity:

Let $a, b \in \Bbb R$ and $\vec u = (u_1, \cdots, u_n), \vec v = (v_1, \cdots, v_n) \in \Bbb R^n$. Then $$\begin{align}T(a\vec u + b\vec v) &= T(au_1 + bv_1, \cdots, au_n + bv_n) \\ &= \begin{bmatrix} au_1 + bv_1 & \cdots & au_n + bv_n\end{bmatrix} \\ &= \begin{bmatrix} au_1 & \cdots & au_n\end{bmatrix} + \begin{bmatrix} bv_1 & \cdots & bv_n\end{bmatrix} \\ &= a\begin{bmatrix} u_1 & \cdots & u_n\end{bmatrix} + b\begin{bmatrix} v_1 & \cdots & v_n\end{bmatrix} \\ &= aT(\vec u) + bT(\vec v)\end{align}$$

Thus $T$ is linear which means that $T$ is a homomorphism between $\Bbb R^n$ and $M_{1\times n}(\Bbb R)$.$\ \ \ \ \square$


This proves that $T$ is an isomorphism between $\Bbb R^n$ and $M_{1\times n}(\Bbb R)$. Because an isomorphism exists, these two spaces are isomorphic.

There are likewise canonical isomorphisms between $\Bbb R^n$ and $M_{n\times 1}(\Bbb R)$ and between $M_{1\times n}(\Bbb R)$ and $M_{n\times 1}(\Bbb R)$.
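These canonical isomorphisms can be sketched in NumPy as well (the names `T_row` and `T_col` are my own); composing the tuple-to-column map with the inverse of the tuple-to-row map acts on row matrices as the transpose:

```python
import numpy as np

def T_row(t):
    """R^n -> M_{1xn}(R)"""
    return np.array(t).reshape(1, -1)

def T_col(t):
    """R^n -> M_{nx1}(R)"""
    return np.array(t).reshape(-1, 1)

t = (1.0, 2.0, 3.0)
# (T_col o T_row^{-1}) applied to a row matrix is its transpose:
assert np.array_equal(T_row(t).T, T_col(t))
```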


Now what does this mean? It means that as long as we stick to using their properties as vector spaces (and not as sets or anything else), we can view these three spaces as being exactly the same. We just think of elements of one as being nothing more than the relabelled elements of either of the others. Thus we can freely switch between these different "labellings" in most problems.

  • You say that there is an obvious bijection. Is there a way for all of this to be shown explicitly and not just be obvious? Again, thanks a lot for your effort, because I didn't find any resource talking about this and I find this very confusing. – LearningMath May 17 '15 at 00:19
  • You also have to show that the bijection is a homomorphism. It is not enough that a bijection exists and a homomorphism exists -- consider $\mathbb R$ and $\mathbb R^2$. –  May 17 '15 at 00:32
  • The point is not whether the proof is hard or easy, the point is that your answer makes it sound like the proof is not necessary, which is misleading. Even in the comments you implied that any bijection between vector spaces is an isomorphism, which is not true. –  May 17 '15 at 00:54