
Let $U \subseteq \mathbb R^2$ be an open, bounded, connected subset. Let $A \in \text{SL}_2$ (that is, $A$ is a real $2 \times 2$ matrix with determinant $1$) and suppose that $AU = U$.

Must $A$ be either orthogonal or of finite order?

If we assume that $0 \in U$, then I can prove that $A$ is diagonalizable (over $\mathbb C$), with all eigenvalues of modulus $1$. I am not sure if this helps though.

Indeed, we may choose $0 < r < R$ with $B_r(0) \subseteq U \subseteq B_{R}(0)$. Thus for any $x \in B_r(0)$ and any $k \in \mathbb{Z}$, since $A^k U \subseteq U$, we have $|A^k x| \le R$. Since every $x \in \mathbb{R}^2$ is a scalar multiple of a vector in $B_r(0)$, this implies that all orbits of $A$ are bounded, i.e. $\sup_{k\in\mathbb{Z}}\|A^k x\|<+\infty$ for every $x\in \mathbb{R}^2$, which implies the required assertion about diagonalizability.
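For what it's worth, here is a small numerical sketch (not part of the argument) illustrating the dichotomy behind this observation; the two matrices below are just my own illustrative choices of a determinant-$1$ matrix with an eigenvalue off the unit circle versus a conjugated rotation:

```python
import numpy as np

# Illustrative choices only: a hyperbolic det-1 matrix (eigenvalues 2, 1/2)
# versus a conjugated rotation (eigenvalues e^{+-i}, both of modulus 1).
hyperbolic = np.array([[2.0, 0.0], [0.0, 0.5]])
theta = 1.0
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
C = np.array([[1.0, 1.0], [0.0, 1.0]])
elliptic = C @ rotation @ np.linalg.inv(C)

x = np.array([0.1, 0.1])
for name, A in [("hyperbolic", hyperbolic), ("elliptic", elliptic)]:
    y = x.copy()
    sup = 0.0
    for _ in range(200):
        y = A @ y
        sup = max(sup, np.linalg.norm(y))
    print(name, sup)   # the hyperbolic orbit blows up, the elliptic one stays bounded
```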

Asaf Shachar
    Do you mean $AU = U$? Otherwise just take the unit disk and a contraction by $\lambda < 1$. – Klaus Jul 27 '20 at 11:22
  • Thanks, you are right. I have updated the question accordingly. – Asaf Shachar Jul 27 '20 at 11:42
  • @Klaus: A contraction by $\lambda<1$ would not have determinant $1$. – celtschk Jul 27 '20 at 15:05
  • @celtschk That was edited in after I commented. :-) – Klaus Jul 27 '20 at 15:07
  • @Klaus: I see, I should have checked the edit. I assumed that the only change was updating the equation to $AU=U$ (which the post now says, and which I could infer from your comment that it didn't say before). Also, the following comment by Asaf Shachar seemed to indicate that this was the change that was made. – celtschk Jul 27 '20 at 15:12

3 Answers


You don't need to assume that $0\in U$.

Since $AU=U$, the orbit of every vector in $U$ lies entirely in $U$. Since $U$ is bounded, no vector in $U$ can have an unbounded orbit (an unbounded orbit could not lie in $U$). Since $U$ is open, it spans all of $\mathbb R^2$; that is, every vector in $\mathbb R^2$ can be written as a linear combination of vectors $u_i$ in $U$. But then for a general vector $v\in\mathbb R^2$, we have $$\|A^kv\| = \|A^k(\sum_i \alpha_i u_i)\| = \|\sum_i \alpha_i (A^k u_i)\| \le \sum_i \|\alpha_i (A^k u_i)\| = \sum_i |\alpha_i| \|A^k u_i\|,$$ and therefore the orbit of $v$ is bounded because all the orbits of the $u_i$ are.
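For anyone who wants to see the bound numerically, here is a rough sketch; the matrix $A$ and the vectors standing in for $u_1, u_2 \in U$ are made-up illustrative choices, not part of the argument:

```python
import numpy as np

# A made-up elliptic matrix of determinant 1 (a conjugated rotation).
theta = 1.0
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
C = np.array([[1.0, 2.0], [0.0, 1.0]])
A = C @ rot @ np.linalg.inv(C)

# Two vectors playing the role of u_1, u_2 in U; they span R^2.
u = np.array([[0.3, 0.1], [0.1, 0.4]])           # rows are u_1, u_2
v = np.array([5.0, -7.0])                        # an arbitrary vector
alpha = np.linalg.solve(u.T, v)                  # v = alpha_1 u_1 + alpha_2 u_2

Ak = np.linalg.matrix_power(A, 37)               # some power A^k
lhs = np.linalg.norm(Ak @ v)
rhs = sum(abs(a) * np.linalg.norm(Ak @ ui) for a, ui in zip(alpha, u))
print(lhs <= rhs + 1e-9)                         # the triangle-inequality bound above
```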

celtschk

This is false. That is, there is an element $A\in SL_2$ which preserves a bounded, connected open set $U$ but which is neither orthogonal nor of finite order. In fact, there are many such $A$.

Let's construct some. To begin with, let $B =\begin{bmatrix} \cos \theta & -\sin \theta \\ \sin\theta & \cos\theta\end{bmatrix}$ be a rotation matrix, where $\theta$ is an irrational multiple of $\pi$. In particular, $B$ has infinite order. In fact, $\langle B\rangle$, the subgroup of $SO(2)$ generated by $B$, is dense in $SO(2)$.
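As a purely numerical illustration, assuming the concrete choice $\theta = 1$ radian (an irrational multiple of $\pi$): the powers $B^n$ keep returning arbitrarily close to the identity, which is what density predicts. The sketch below is only suggestive, of course, not a proof.

```python
import numpy as np

theta = 1.0          # 1 radian is an irrational multiple of pi, so B has infinite order
two_pi = 2 * np.pi
best = np.inf
for n in range(1, 100000):
    # angular distance of B^n from the identity, i.e. of n*theta from 0 (mod 2*pi)
    d = abs((n * theta + np.pi) % two_pi - np.pi)
    if d < best:
        best = d
        print(n, best)   # the record distances keep shrinking toward 0
```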

Now, $SO(2)$ is not normal in $SL_2$. This implies that there is a matrix $C\in SL_2$ for which $CBC^{-1}\notin SO(2)$. (For, if $CBC^{-1}\in SO(2)$ for every $C\in SL_2$, then $C\langle B\rangle C^{-1}\subseteq SO(2)$ for every such $C$. Then, since $SO(2)$ is closed, $\overline{ C\langle B\rangle C^{-1}}\subseteq SO(2)$. But $\overline{C \langle B\rangle C^{-1}} = C\overline{\langle B\rangle}C^{-1} = C\, SO(2)\, C^{-1}$. This shows that $SO(2)$ is normal in $SL_2$, giving a contradiction.)

Choosing a $C$ with $CBC^{-1}\notin SO(2)$, I claim that $A = CBC^{-1}$ meets all your guidelines. It's not orthogonal by construction, and since $CBC^{-1}$ and $B$ have the same order, it's not of finite order.

So, we need only construct $U$. To that end, note that $B$ preserves the open ball of radius $1$ centered at the origin, because $B$ is orthogonal. Calling this ball $V$, this just means $BV = V$. Let $U = CV$.

Note that $$AU = CBC^{-1} CV = CBV = CV = U,$$ so $U$ is preserved. Since multiplication by $C$ is a linear isomorphism, it is, in particular, a homeomorphism, so $U$ is connected and open. Lastly, $U$ is bounded because $C$, being linear, has finite operator norm.
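For concreteness, here is a hedged numerical check of the construction, with the illustrative choices $\theta = 1$ and $C = \begin{bmatrix} 1 & 1\\ 0 & 1\end{bmatrix}$ (the same ones used in the comment below); any other valid choices would do.

```python
import numpy as np

theta = 1.0
B = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
C = np.array([[1.0, 1.0], [0.0, 1.0]])         # det C = 1, and CBC^{-1} is not orthogonal
A = C @ B @ np.linalg.inv(C)

print(np.allclose(A.T @ A, np.eye(2)))          # False: A is not orthogonal
print(np.linalg.eigvals(A))                     # e^{+-i}, modulus 1, so A has infinite order

# U = CV with V the open unit disk.  For any v with |v| < 1, the point Cv lies in U,
# and A(Cv) = C(Bv) with |Bv| = |v| < 1, so A maps U into U (and likewise A^{-1}).
v = np.array([0.6, -0.5])
print(np.linalg.norm(np.linalg.inv(C) @ (A @ (C @ v))))   # equals |v| < 1
```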

Edit

To answer the question in the comments...

Proposition: Suppose $A\in SL_2$ preserves a nonempty, bounded open set $U$. Then $A$ is conjugate to an element of $SO(2)$.

Proof: From Celtschk's answer together with the OP's observation, the hypothesis that $A$ preserves the bounded open set $U$ implies that $A$ is diagonalizable. Let's assume first that the eigenvalues $r$ and $\frac{1}{r}$ have different magnitudes; this, in particular, implies that they are real numbers. By swapping $r$ and $\frac{1}{r}$, we may thus assume that $|r| > 1$. Let $v_1$ be an eigenvector for $r$ and $v_2$ an eigenvector for $\frac{1}{r}$. Since $U$ is open and nonempty (it spans $\mathbb{R}^2$, as in Celtschk's answer), it contains a vector $v = a_1 v_1 + a_2 v_2$ with $a_1\neq 0$. Then $A^n v = r^n a_1 v_1 + r^{-n} a_2 v_2 \in U$ for all $n = 1,2,3,\dots$, since $A^n U = U$. Note that $\|A^n v\| \geq |r|^n |a_1|\,\|v_1\| - |r|^{-n} |a_2|\,\|v_2\| \rightarrow \infty$ as $n\rightarrow \infty$. So we have contradicted the fact that $U$ is bounded. This rules out the case where the eigenvalues have different magnitudes.

So, we may assume that the eigenvalues $r$ and $\frac{1}{r}$ have the same magnitude, which must therefore be $1$. We will show that the closure of the subgroup generated by $A$, $\overline{\langle A\rangle}$, is compact. Once we've done this, the Cartan-Iwasawa-Malcev theorem tells us that $\overline{\langle A\rangle}$ is conjugate to a subgroup of $SO(2)$, so $A$ is conjugate to an element of $SO(2)$.

Now, let's show that $\overline{\langle A\rangle}$ is compact. Once and for all, choose a (possibly complex) matrix $C$ which diagonalizes $A$, so $CAC^{-1}$ is diagonal.

For a (possibly complex) matrix $D = (d_{ij})$, let $\|D\| = \max_{i,j}|d_{ij}|$. Note that for $2\times 2$ matrices the inequality $\|DE\| \leq 2\|D\| \|E\|$ holds, since any entry of $DE$ is of the form $d_1 e_1 + d_2 e_2$, where the $d_i$ are entries of $D$ and the $e_j$ are entries of $E$.

Note also that $\|(CAC^{-1})^n\| = 1$, since $(CAC^{-1})^n$ is diagonal with diagonal entries $\lambda^n$ and $\lambda^{-n}$, where $\lambda$ and $\lambda^{-1}$ are the eigenvalues of $A$, both of modulus $1$.

Thus, we see that $\|A^n\| = \| C^{-1}(CAC^{-1})^n C\| \leq 4\|C^{-1}\| \| (CAC^{-1})^n\| \|C\| = 4\|C^{-1}\| \|C\|$. That is, every entry of $A^n$ is bounded, independent of $n$.
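If a numerical sanity check is wanted, the sketch below applies this bound to the explicit matrix $CBC^{-1}$ constructed above; the variable names are mine, and the check is of course not part of the proof.

```python
import numpy as np

theta = 1.0
B = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
conj = np.array([[1.0, 1.0], [0.0, 1.0]])
A = conj @ B @ np.linalg.inv(conj)             # the infinite-order, non-orthogonal example

def max_norm(D):                               # ||D|| = max_{i,j} |d_{ij}|, as in the proof
    return np.abs(D).max()

# P has the (complex) eigenvectors of A as columns, so P^{-1} A P is diagonal with
# unimodular entries; P^{-1} plays the role of the matrix C in the proof.
eigenvalues, P = np.linalg.eig(A)
bound = 4 * max_norm(P) * max_norm(np.linalg.inv(P))   # the constant 4 ||C^{-1}|| ||C||

worst = max(max_norm(np.linalg.matrix_power(A, n)) for n in range(1, 2000))
print(worst, bound, worst <= bound + 1e-9)     # the powers stay uniformly bounded
```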

Thus, the cyclic subgroup generated by $A$ is bounded, and so, therefore, is its closure. That is, $\overline{\langle A\rangle}$ is a closed, bounded subset of $SL_2$. Since $SL_2$ is closed in $M_2(\mathbb{R})$ (because it is the inverse image of a point under the continuous $\det$ map), $\overline{\langle A\rangle}$ is closed and bounded in $M_2(\mathbb{R})\cong \mathbb{R}^4$. Thus, it is compact. $\square$

Jason DeVito - on hiatus

  • For a concrete example, $\begin{bmatrix} 1 & 1\\ 0 & 1\end{bmatrix} \begin{bmatrix} \cos 1 & -\sin 1\\ \sin 1 & \cos 1\end{bmatrix} \begin{bmatrix} 1 & 1 \\ 0 & 1\end{bmatrix}^{-1} \approx \begin{bmatrix} 1.38177 & -1.682941\\ .841471 & -.301169\end{bmatrix}$ is an explicit counterexample. – Jason DeVito - on hiatus Jul 28 '20 at 03:42
  • Thanks for the compliment. I find most of your questions too hard ;-). As to your question about whether or not this procedure generates all examples. I'm not sure. Right now, I'm busy with some other stuff, but I'll think about it later. – Jason DeVito - on hiatus Jul 28 '20 at 15:43
  • @Asaf: I've proven the answer to your follow up question is yes. – Jason DeVito - on hiatus Jul 28 '20 at 22:38
  • Thanks, this is amazing, really! By the way if I am not mistaken you actually proved that any diagonalizable matrix in $SL_2$ is conjugate to an element of $SO(2)$, right? (I think that you only used the assumption $AU=U$ for implying diagonalizability). – Asaf Shachar Jul 29 '20 at 08:38
  • @Asaf: I think I proved any diagonalizble matrix with unit length eigenvalues is conjugate to an element in $SO(2)$. – Jason DeVito - on hiatus Jul 29 '20 at 13:17
  • Thanks, you are right of course! Just one last question: is any diagonalizable matrix with unit-length eigenvalues similar to an orthogonal matrix over the reals? (i.e. can we always find a real conjugating matrix?) – Asaf Shachar Jul 29 '20 at 16:48
  • Actually, I now think that the fact that any diagonalizable matrix with unimodular eigenvalues is similar (over $\mathbb R$) to an orthogonal matrix follows directly from the theorem on the real Jordan normal form, so you don't actually need to go through the argument on maximal compact groups, right? Or am I mistaken? – Asaf Shachar Jul 29 '20 at 17:00
  • @Asaf: For your first question, regarding conjugating over the reals, yes. That comes from the Cartan-Iwasawa-Malcev theorem: any compact subgroup of $SL_2$ is conjugated by an element of $SL_2$ into $SO(2)$. For your second, I'm not sure I know what a real Jordan form is, but I wouldn't be surprised if there is some kind of machinery that could give the result I proved. – Jason DeVito - on hiatus Jul 29 '20 at 17:20
  • @Asaf: Nevermind, I see what you're saying. One can use, e.g., https://math.stackexchange.com/questions/556257/show-that-two-matrices-with-the-same-eigenvalues-are-similar. Yes, I worked too hard in my proof ;-) – Jason DeVito - on hiatus Jul 29 '20 at 17:23

I'm probably (still) missing something, but...

We know $A$ is a real $2\times 2$ matrix of determinant $1$, and all eigenvalues have modulus $1$, from @celtschk's answer. From this, we know that the eigenvalues are either $\exp(it), \exp(-it)$ or $r, 1/r$, where $t$ and $r$ are real. In the first case, $A$ is a rotation by angle $t$, hence orthogonal. In the second case, the fact that the eigenvalues have modulus $1$ means that they are both $+1$ or both $-1$, so $A$ is either the identity or $-I$.

Probably @celtschk figured all this was trivial and didn't bother writing it out, but I've done so for clueless folks like me who come upon this question later.

Post-comment addition Let $z = \exp(it) = c + i s$, where $c = \cos t, s = \sin t$, so the other eigenvalue is $\bar{z}$. Let $v_1$ be a (complex) eigenvector for $z$, so $$ Av_1 = zv_1. $$ Conjugating both sides gives us $$ A \bar{v}_1 = \bar{z} \bar{v}_1 $$ (because $A$ is real), so we have an eigenvector $v_2 = \bar{v}_1$ for the other eigenvalue, too.

I believe that if you let $$ w_1 = \Re(v_1); w_2 = \Im(v_1) $$ then you find that $w_1$ and $w_2$ are orthogonal vectors in $\Bbb R^2$, and that $$ Aw_1 = \cos(t) w_1 + \sin(t) w_2 \\ Aw_2 = \sin(t) w_1 - \cos(t) w_2 $$ so that in the $w$ basis, $A$ is a rotation by $t$...but then it's a rotation by $t$ in any basis.

As I said, I believe this, but I can't actually get the algebra right, and I've got a deadline in a couple of hours, so I have to stop; I hope that this gets you going in the right direction (or lets you see why I'm completely wrong!).

John Hughes
  • Thank you. Perhaps I am a bit stupid today, but can you please help me see why a diagonalizable $2 \times 2$ matrix with eigenvalues $e^{it},e^{-it}$ must be a rotation by angle $t$? Any elaboration would be welcomed. – Asaf Shachar Jul 27 '20 at 15:36
  • Hunh. I used to know why that was true. (Or at least I used to know why I used to think that was true!). Let me see if I can reconstruct... – John Hughes Jul 27 '20 at 16:56
  • BTW, if you can reconstruct the argument that I've halfway made, please feel free to edit it into my answer. – John Hughes Jul 27 '20 at 17:32
  • @Asaf: In general, a diagonalizable (over $\mathbb{C}$) $2\times 2$ matrix with eigenvalues $e^{\pm it}$ need not be a rotation. For example, if you conjugate a rotation matrix by an element in $GL_2$, then, in general, you don't get an orthogonal matrix, but the eigenvalues still match those of an orthogonal matrix. Specifically, $\begin{bmatrix} 1 & 1\\ 0 & 1\end{bmatrix}\begin{bmatrix} 0 & -1\\ 1 & 0\end{bmatrix} \begin{bmatrix}1 & 1\\ 0 & 1\end{bmatrix}^{-1} = \begin{bmatrix} 1 & -2\\ 1 & -1\end{bmatrix}$ is not orthogonal. – Jason DeVito - on hiatus Jul 27 '20 at 20:27
  • Thanks @JasonDeVito -- I knew that there was some other assumption hidden in there. Of course, I can't recall what it was. Maybe I was thinking of ... no... I got nuthin'. Darn...and I was so close! – John Hughes Jul 28 '20 at 01:28
  • @JohnHughes: Ha! Yeah, that happens to me some times. Incidentally, it turns out you can turn my matrix counterexample above into a full fledged counterexample to whole problem with just a slight change. I just posted a solution to that effect. – Jason DeVito - on hiatus Jul 28 '20 at 03:39
  • @JohnHughes Hi, unfortunately I had to unaccept your answer due to Jason DeVito's comments. I appreciate the effort and kindness though (I also seem to vaguely remember that you have answered questions of mine and helped me in the past). – Asaf Shachar Jul 28 '20 at 04:45
  • Not a problem, Asaf --- I was going to suggest that celtschk really deserved the credit, since that seemed like the hard part to me! I think we've crossed paths in the past, but I would have said that mostly I learned things from YOU --- analysis isn't really my strong suit. Glad to see that Jason has resolved the question --- it was a really fun one! – John Hughes Jul 28 '20 at 04:55