
Let $A$ and $B$ be Hermitian matrices.

  • If $AB=BA$, we know that $e^{A+B} = e^A e^B$.
  • In this paper, the author showed that $\text{Tr } e^{A+B} = \text{Tr } e^A e^B$ if and only if $AB=BA$.

As such, $e^{A+B} = e^A e^B$ is equivalent to $\text{Tr } e^{A+B} = \text{Tr } e^A e^B$ in the context of Hermitian matrices.
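As a quick numerical sanity check of this equivalence, here is a sketch using NumPy and SciPy (the matrices are arbitrary examples, not taken from the paper):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

def random_hermitian(n):
    """Return an arbitrary n x n Hermitian matrix."""
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (M + M.conj().T) / 2

# Commuting pair: H and H^2 are Hermitian and commute.
H = random_hermitian(4)
A, B = H, H @ H
print(np.allclose(expm(A + B), expm(A) @ expm(B)))                     # True
print(np.isclose(np.trace(expm(A + B)), np.trace(expm(A) @ expm(B)))) # True

# Generic non-commuting pair: both equalities fail.
A, B = random_hermitian(4), random_hermitian(4)
print(np.allclose(expm(A + B), expm(A) @ expm(B)))                     # False
print(np.isclose(np.trace(expm(A + B)), np.trace(expm(A) @ expm(B)))) # False
```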

My question is how we can derive the commutation relation between $A$ and $B$ directly from $e^{A+B}=e^A e^B$, without bringing in the Golden–Thompson inequality (as in the paper I linked). Since the condition $e^{A+B} = e^A e^B$ has a simpler form than the one involving the trace, I think there should be a more direct route.

Edit: rephrased the question.

1 Answer


The idea here is that $A+B$ is Hermitian and that the exponential map preserves Hermiticity. Taking the conjugate transpose of each side, we have
$$e^Ae^B = e^{A+B} = \big(e^{A+B}\big)^* = \big(e^Ae^B\big)^* = \big(e^B\big)^*\big(e^A\big)^* = e^Be^A,$$
so $e^A$ and $e^B$ commute.
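The facts used here, that the exponential of a Hermitian matrix is Hermitian and hence that $(e^Ae^B)^* = e^Be^A$, are easy to confirm numerically; a minimal sketch with arbitrary example matrices:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)

def random_hermitian(n):
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (M + M.conj().T) / 2

A, B = random_hermitian(4), random_hermitian(4)

# expm preserves Hermiticity, so (e^A e^B)^* = (e^B)^* (e^A)^* = e^B e^A.
print(np.allclose(expm(A).conj().T, expm(A)))                        # True
print(np.allclose((expm(A) @ expm(B)).conj().T, expm(B) @ expm(A)))  # True
```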

Now call on a lemma twice: for Hermitian $X, Y$,
$$e^X Y = Y e^X \iff XY = YX.$$
Proof sketch: the same unitary matrix $U$ that simultaneously diagonalizes $e^X$ and $Y$ must diagonalize $X$ as well, since all three are Hermitian, and the same argument also runs backwards. (Underlying idea: the exponential map is injective on the reals, and Hermitian matrices are diagonalizable with real spectrum, so $e^X \mathbf v = \sigma \cdot \mathbf v \implies X\mathbf v = \log(\sigma)\cdot \mathbf v$, and of course $X \mathbf v = \lambda \cdot \mathbf v \implies e^X\mathbf v = e^{\lambda}\cdot \mathbf v$.)
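Both directions of the lemma can also be checked numerically; a minimal sketch, again with arbitrary example matrices:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)

def random_hermitian(n):
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (M + M.conj().T) / 2

X = random_hermitian(4)

# Commuting case: a real polynomial in X is Hermitian and commutes with X.
Y = X @ X + 3 * X
print(np.allclose(X @ Y, Y @ X), np.allclose(expm(X) @ Y, Y @ expm(X)))  # True True

# Non-commuting case: both equivalent conditions fail for a generic Hermitian Y.
Y = random_hermitian(4)
print(np.allclose(X @ Y, Y @ X), np.allclose(expm(X) @ Y, Y @ expm(X)))  # False False
```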

Applying the lemma once, with $X := A$ and $Y := e^B$ (which is Hermitian because $B$ is), we learn that $Ae^B = e^BA$; a second application, with $X := B$ and $Y := A$, tells us $AB = BA$.
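In summary, the whole chain of implications is
$$e^{A+B}=e^Ae^B \;\Longrightarrow\; e^Ae^B=e^Be^A \;\Longrightarrow\; Ae^B=e^BA \;\Longrightarrow\; AB=BA,$$
where the first implication comes from taking conjugate transposes and the last two from the lemma, applied with $(X,Y)=(A,e^B)$ and then $(X,Y)=(B,A)$.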

user8675309
  • Could you elaborate more on $e^X \mathbf v = \sigma \cdot \mathbf v\implies X\mathbf v = \log(\sigma)\cdot \mathbf v$? It's not clear to me. – Minh Nguyen Apr 11 '21 at 02:18
  • $X$ is Hermitian, so it is diagonalizable with $d$ distinct eigenvalues; thus $\mathbb C^n = W_1 \oplus \dots \oplus W_d$, i.e. the vector space is a direct sum of $X$'s eigenspaces. With $\mathbf w_i \in W_i$, compute the same thing two different ways: $\mathbf 0 \neq \mathbf v = \sum_{i=1}^d \alpha_i \mathbf w_i$, and $\sigma \mathbf v = e^X\mathbf v = e^X\sum_{i=1}^d \alpha_i \mathbf w_i = \sum_{i=1}^d e^{\lambda_i} \alpha_i \mathbf w_i \implies \mathbf 0 = \sum_{i=1}^d (\sigma - e^{\lambda_i}) \alpha_i \mathbf w_i \implies (\sigma - e^{\lambda_i}) \alpha_i = 0$ for all $i$ by linear independence. – user8675309 Apr 11 '21 at 04:45
  • But at most one $(\sigma - e^{\lambda_i}) = 0$, since the $\lambda_i$ are distinct and real and the exponential function is injective on the reals. Thus at least $d-1$ of the $\alpha_i = 0$, and at most $d-1$ of the $\alpha_i = 0$ because $\mathbf v \neq \mathbf 0$. $\implies \mathbf v \propto \mathbf w_k$ for some $k \in \{1,2,\dots,d\}$ (and of course $\lambda_k = \log(\sigma)$). – user8675309 Apr 11 '21 at 04:45
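The eigenvector claim $e^X\mathbf v = \sigma\,\mathbf v \implies X\mathbf v = \log(\sigma)\,\mathbf v$ from the comments above can likewise be confirmed numerically; a minimal sketch with an arbitrary Hermitian example matrix:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
X = (M + M.conj().T) / 2  # arbitrary Hermitian example

# e^X is Hermitian positive definite, so eigh applies and every sigma > 0.
sigmas, vecs = np.linalg.eigh(expm(X))
for sigma, v in zip(sigmas, vecs.T):
    # Each eigenvector of e^X is an eigenvector of X with eigenvalue log(sigma).
    print(np.allclose(X @ v, np.log(sigma) * v))  # True for each eigenpair
```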