
Edit: if we replace rotations with "add isotropic noise", this relation can be proven using Chebyshev's inequality, as shown here. The $\pi/4$ angle seems to be connected to forgetting the starting point. In high dimensions, random rotations seem to keep the iterates $u_1, u_2, \ldots$ roughly along the same line (hence the triangle inequality for cosines becomes an equality), until a $\pi/2$ angle is reached, at which point the process becomes ergodic.


Suppose I start with a vector $u_1$ in $d$ dimensions and obtain $u_{i+1}$ by performing a sequence of $i$ small rotations in $d$ dimensions. For $d=100$, the following gives a good approximation, within 0.1% of the true value in expectation:

$$\cos(u_1,u_4)=\cos(u_1,u_2)\cos(u_2,u_3)\cos(u_3,u_4)$$

where

$$\cos(x,y)=\frac{\langle x, y\rangle}{\|x\| \|y\|}$$

A "small rotation" of $v$ is done by sampling a vector $z$ with standard normal entries and rotating $v$ by $\theta$ radians in the plane spanned by $v$ and $z$. This identity works for $\theta_i\le\pi/4$ and breaks down for $\theta$ slightly above $\pi/4$.
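For concreteness, here is a minimal NumPy sketch of one reading of this setup (the `small_rotation` helper, the seed, and the choices $d=100$, $\theta=\pi/8$, 1000 trials are illustrative, not part of the question):

```python
import numpy as np

def small_rotation(v, theta, rng):
    """Rotate unit vector v by theta radians in the plane spanned by
    v and a random Gaussian direction z (one reading of the question's
    construction)."""
    z = rng.standard_normal(v.shape)
    w = z - np.dot(z, v) * v          # component of z orthogonal to v
    w /= np.linalg.norm(w)            # unit vector spanning the rotation plane with v
    return np.cos(theta) * v + np.sin(theta) * w

def cos_angle(x, y):
    return np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))

d, theta = 100, np.pi / 8             # illustrative choices
rng = np.random.default_rng(0)
lhs_vals, rhs_vals = [], []
for _ in range(1000):
    u1 = rng.standard_normal(d)
    u1 /= np.linalg.norm(u1)
    u2 = small_rotation(u1, theta, rng)
    u3 = small_rotation(u2, theta, rng)
    u4 = small_rotation(u3, theta, rng)
    lhs_vals.append(cos_angle(u1, u4))
    rhs_vals.append(cos_angle(u1, u2) * cos_angle(u2, u3) * cos_angle(u3, u4))
lhs_mean, rhs_mean = np.mean(lhs_vals), np.mean(rhs_vals)
print(lhs_mean, rhs_mean)             # the two means agree closely for this theta
```

Note that with this construction each consecutive cosine equals $\cos\theta$ exactly, so the right-hand side is $\cos^3\theta$ per sample; the interesting content is that the left-hand side matches it on average.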

  1. How can this be justified?
  2. Why is $\pi/4$ special?
  • I don't know the Wolfram Language well enough to follow what you are doing (in particular, I do wonder if your random choice of vector is indeed sampled uniformly). That said, if you were to fix some $v_1$ on the unit sphere (I assume it's the unit sphere?) and consider it the north pole, a greater and greater proportion of the surface area will be located around the equator. So, I would expect all three inner products to approach $0$ on average; it will be increasingly rare to randomly choose vectors that aren't approximately orthogonal. – Theo Bendit Sep 07 '22 at 22:07
  • Is $\langle a, b \rangle$ the ordinary dot product? – Brian Tung Sep 07 '22 at 22:13
  • @BrianTung yes it is – Yaroslav Bulatov Sep 07 '22 at 22:13
  • Then isn't it the case that by symmetry, we also have $\langle v_1, v_2 \rangle \approx \langle v_1, v_3 \rangle \cdot \langle v_3, v_2 \rangle$? And then dividing both sides by $\langle v_3, v_2 \rangle$, we get $\langle v_1, v_3 \rangle \approx \langle v_1, v_2 \rangle / \langle v_3, v_2 \rangle$? We know that they should all be close to $0$; is that the sense in which these sides are approximately equal to each other? – Brian Tung Sep 07 '22 at 22:16
  • Side note: Each of these dot products is a zero-mean, nearly normally distributed random variable with variance $\sim 1/d$ for large $d$. – Brian Tung Sep 07 '22 at 22:23
  • @BrianTung not completely sure it's the same sense. The setup here is that I'm fixing $v_1, v_2$ on the sphere, then taking the uniform (isotropic) measure over the sphere with the restriction that $\langle v_2, x\rangle = C$. Then I'm asking for the value of $E[\langle v_1, x\rangle]$ under this measure, which seems to be $\langle v_1, v_2\rangle \cdot C$ – Yaroslav Bulatov Sep 07 '22 at 22:49
  • High dimensions? The approximation seems to work best when $d = 1$. – Dan Sep 07 '22 at 22:49
  • Clarified the question to better reflect how sampling was done – Yaroslav Bulatov Sep 07 '22 at 23:06
  • As the vectors all lie on the unit sphere, isn't this really a question about why $$\cos (\theta_{3}) \approx \frac{1}{2} \bigg( \cos(\theta_{1} + \theta_{2}) + \cos(\theta_{1} - \theta_{2}) \bigg)$$ in higher dimensions, where $\theta_{i}$ is the angle made between the vectors $v_{i}$ and $v_{j}$? – Matthew Cassell Sep 08 '22 at 00:21
  • @mattos good observation, yes, that seems to be equivalent, where $\theta_1, \theta_2, \theta_3$ are the angles formed by the pairs $(v_1,v_2)$, $(v_2,v_3)$, $(v_1,v_3)$ respectively – Yaroslav Bulatov Sep 08 '22 at 00:48
  • Then perhaps the arguments made here might be useful (though they may not be as well, I haven't had the time to fully go through the answers). – Matthew Cassell Sep 09 '22 at 00:13
  • Yes, I suspect it's some kind of independence/orthogonality phenomenon, since the projection-factor shrinkage is multiplicative: $\cos(a,c)=\cos(a,b)\cos(b,c)$ – Yaroslav Bulatov Sep 09 '22 at 00:30
  • You should pick appropriate and relevant tags. The two you have here are not that. – Ted Shifrin Sep 10 '22 at 18:25
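For reference, the reformulation proposed in the comments is just the product-to-sum identity:

$$\cos\theta_{1}\cos\theta_{2} = \frac{1}{2}\bigl(\cos(\theta_{1}+\theta_{2}) + \cos(\theta_{1}-\theta_{2})\bigr),$$

so asking why $\cos\theta_3 \approx \frac{1}{2}\bigl(\cos(\theta_1+\theta_2)+\cos(\theta_1-\theta_2)\bigr)$ is exactly asking why $\cos\theta_3 \approx \cos\theta_1\cos\theta_2$, the multiplicative form in the question.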

1 Answer


EDIT: It turns out I was misunderstanding. This is not a solution to the problem.

Unless I’m misunderstanding, I think this is a corollary of the following fact:

"Theorem": Given two uniform random unit vectors $u,v \in \mathbb{R}^d$, for large values of $d$ we usually see that $\langle u, v \rangle \approx 0$.

I put theorem in quotes because I haven't defined "large $d$" and I haven't defined "usually". Another way of stating this theorem is that in high dimensions, almost all vectors are almost orthogonal.

The argument for this "theorem" is pretty intuitive. Given two such unit vectors chosen uniformly at random, the central limit theorem gives, approximately, $$\langle u, v \rangle \sim \mathcal{N}\left(0, \frac{1}{d}\right).$$ That is, the dot product is approximately a normal random variable with zero mean and variance $1/d$.

As you can see, as $d \rightarrow \infty$, the variance $\sigma^2 \rightarrow 0$. Hence we “usually” see dot products close to zero.
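This concentration is easy to check numerically; a quick Monte Carlo sketch (the seed and sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
d, n = 100, 20000                     # illustrative dimension and sample count
# Sample n pairs of uniform random unit vectors in R^d
# (normalized Gaussians are uniform on the sphere).
u = rng.standard_normal((n, d))
v = rng.standard_normal((n, d))
u /= np.linalg.norm(u, axis=1, keepdims=True)
v /= np.linalg.norm(v, axis=1, keepdims=True)
dots = np.sum(u * v, axis=1)          # n dot products
print(dots.mean(), dots.var())        # mean near 0, variance near 1/d = 0.01
```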

It immediately follows that both the right hand side and the left hand side of your expression should on average be very close to zero for large $d$. As Brian Tung points out, the ratio may blow up though.

Joe
  • Actually, I rather suspect that in most cases, the product of any two dot products will be much smaller than the third, relatively speaking. It's just that they're all close to $0$. – Brian Tung Sep 07 '22 at 22:33
  • Oops, I misunderstood your comment. I'll edit that. – Joe Sep 07 '22 at 22:34
  • What I was pointing out was that if his assertion were true in a relative sense (that is, the LHS and RHS were roughly one-to-one), it would imply that the dot products were close to $1$. Since we know that isn't actually true, I suggested that it was simply that both LHS and RHS were close to $0$. – Brian Tung Sep 07 '22 at 22:36
  • I now realize the question was ambiguous; I have rephrased it to be more precise – Yaroslav Bulatov Sep 07 '22 at 23:08
  • Ahh yes it seems I completely misunderstood your question. My bad! Sorry! – Joe Sep 07 '22 at 23:51