So, I'm seeking to prove the identity below, for $A,B$ vector fields in $\mathbb{R}^3$:
$$A \times (\nabla \times B) + B \times (\nabla \times A) = \nabla (A \cdot B) - (A \cdot \nabla)B - (B \cdot \nabla)A \newcommand{\vp}{\varphi} \newcommand{\on}{\operatorname} \newcommand{\ve}{\varepsilon} \newcommand{\N}{\nabla} \newcommand{\ip}[2]{\left \langle #1, #2 \right \rangle} \newcommand{\para}[1]{\left( #1 \right)} \newcommand{\d}{\delta} \newcommand{\e}{\mathbf{\hat{e}}} \newcommand{\p}{\partial}$$
Specifically, I would like to do so with Einstein's summation notation and the Levi-Civita symbol, since I am new to those topics.
Known Identities: Let the following hold:
- $\{\e_i\}_{i=1}^3$ is the standard Cartesian basis of $\mathbb{R}^3$
- $U := (U_i)_{i=1}^3$ and $V := (V_i)_{i=1}^3$ are vector fields
- $\varphi$ is a scalar field
- $\nabla := (\partial_i)_{i=1}^3$ as usual
- $\delta_{i,j}$ is the usual Kronecker $\d$ ($0$ if $i \ne j$ and $1$ otherwise)
- $\ve_{i,j,k}$ is the Levi-Civita symbol (which equals $\on{sign}(ijk)$ when $(i,j,k)$ is a permutation in $S_3$, and $0$ if any two indices coincide)
So, some identities I'm already familiar with, or am given (in the Einstein convention):
$$\begin{alignat*}{99} \on{grad}(\vp) &=\,& \N \vp &= \p_i \vp \; \e_i \\ \on{div}\para{V } &=\,& \N \cdot V &= \p_i V_i \\ \on{curl} \para{ V } &=\,& \N \times V &= \ve_{i,j,k} \cdot \p_j V_k \; \e_i \\ \ip U V &=\,& U \cdot V &= U_i V_i \\ &\,& U \times V &= \ve_{i,j,k} U_j V_k \; \e_i \\ &\,& \para{ U \cdot \N } V &= U_i (\p_i V_j) \e_j \end{alignat*}$$
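(As a quick check that I'm reading the summation convention correctly, unpacking the first component of the curl and cross-product identities by hand gives
$$ \para{ \N \times V }_1 = \ve_{1,j,k} \, \p_j V_k = \ve_{1,2,3} \, \p_2 V_3 + \ve_{1,3,2} \, \p_3 V_2 = \p_2 V_3 - \p_3 V_2, \qquad \para{ U \times V }_1 = U_2 V_3 - U_3 V_2, $$
which match the usual componentwise formulas, so I believe I at least have these conventions straight.)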
Edit: Fixed a minor issue where I had the dot product running over two indices; clearly it should just be the one, not sure how I ended up with that. Fixed its use in the subsequent calculation.
Work So Far: So, to this end, I've derived the following:
$$\begin{align*} A \times \para{ \N \times B } &= A \times \Big( \ve_{i,j,k} \p_j B_k \e_i \Big) \tag{curl identity} \\ &= A \times \underbrace{\Big( \ve_{k,j,i} \p_j B_i \e_k \Big)}_{\substack{\text{$k$th component} \\ \text{of $\N \times B$}}} \tag{swap $k$ and $i$} \\ &= \ve_{i,j,k} \ve_{k,j,i} A_j \p_j B_i \e_i \tag{cross product identity} \\ &= -\ve_{i,j,k}^2 A_j \p_j B_i \e_i \tag{$\ve_{i,j,k} = -\ve_{k,j,i}$} \\ B \times \para{ \N \times A } &=-\ve_{i,j,k}^2 B_j \p_j A_i \e_i \tag{same reasoning} \\ \N \para{ A \cdot B } &= \N (A_j B_j) \tag{take the dot product} \\ &= \p_i (A_j B_j) \e_i \tag{gradient identity} \\ &= B_j \p_i A_j \e_i + A_j \p_i B_j \e_i \tag{product rule} \\ \para{ A \cdot \N } B &= A_i \p_i B_j \e_j \tag{given identity} \\ &= A_j \p_j B_i \e_i \tag{swap $i$ and $j$} \\ \para{ B \cdot \N} A &= B_i \p_i A_j \e_j \tag{given identity} \\ &= B_j \p_j A_i \e_i \tag{swap $i$ and $j$} \end{align*}$$ Now observe: $$\begin{align*} &\N \para{ A \cdot B } - \para{ A \cdot \N } B - \para{ B \cdot \N} A \\ &= \Big( B_j \p_i A_j + A_j \p_i B_j - A_j \p_j B_i - B_j \p_j A_i \Big) \e_i \\ &A \times \para{ \N \times B } + B \times \para{ \N \times A } \\ &= -\ve_{i,j,k}^2\Big( A_j \p_j B_i + B_j \p_j A_i \Big) \e_i \end{align*}$$
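One sanity check I can offer on the target identity itself (as opposed to my index bookkeeping): for the arbitrarily chosen fields $A = (y, 0, 0)$ and $B = (0, x, 0)$, direct computation gives
$$ \begin{align*} \N \times B &= (0,0,1), & A \times \para{ \N \times B } &= (0,-y,0), \\ \N \times A &= (0,0,-1), & B \times \para{ \N \times A } &= (-x,0,0), \\ \N \para{ A \cdot B } &= \N(0) = 0, & \para{ A \cdot \N } B &= y \, \p_1 B = (0,y,0), \end{align*} $$
and $\para{ B \cdot \N } A = x \, \p_2 A = (x, 0, 0)$, so both sides come out to $(-x, -y, 0)$. The statement itself therefore seems fine; it's the Einstein-notation derivation I'm unsure about.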
From here, I'm not entirely sure how to adequately handle everything. I have some questions:
- Firstly, and most naturally, are my derivations so far actually correct to begin with? (My next couple of questions address specific doubts.) What would be the natural way to proceed from here, beyond sheer brute force?
- Secondly, in my very final line above, the Levi-Civita symbol still carries a $k$, while the line before it has no $k$ at all. That in particular hints to me that I've done something wrong. If it isn't wrong, how does one deal with the leftover index?
- Thirdly, in finding $\N(A \cdot B)$, I'm not really sure how to use the gradient identity I am given. The identity $\N \vp = \p_i(\vp) \e_i$ hints at a summation over all coordinates and introduces a new index $i$. Was it right, for $\N (A_j B_j)$, to still call that new index $i$? Or should I have used $k$?
I also have a couple of questions about manipulating the Levi-Civita symbol in the midst of Einstein summation. They might not be strictly necessary for the arguments above, but answers to them would help clear some things up, I hope.
- (Answered, see addendum below.) Fourthly, my textbook gives an identity for the product of two Levi-Civita symbols, but I can't make sense of it as printed. In my copy it reads as $\ve_{i,j,k} \ve_{k,i,m} = \d_{i,i} \d_{j,m} - \d_{j,i} \d_{i,m}$. Since $i$ and $k$ are repeated here, shouldn't the summation be over $i,k \in \{1,2,3\}$, meaning $i$ in particular should not show up on the right-hand side? It certainly doesn't mirror other identities I've seen, and the use of an unsimplified $\d_{i,i} \equiv 1$ is somewhat suspect.
(Addendum: I found a different scan of the book, which was a bit cleaner. The identity stated was $\ve_{i,j,k} \cdot \ve_{k,\ell,m} = \d_{i,\ell} \cdot \d_{j,m} - \d_{j,\ell} \cdot \d_{i,m}$.)
- I have also seen others cite the similar identity $\ve_{i,j,k} \cdot \ve_{\ell,m,k} = \d_{i,\ell} \d_{j,m} - \d_{i,m} \d_{j,\ell}$. Taking this at face value, if I understand correctly, since only the third index $k$ is repeated, we are implicitly summing over $k \in \{1,2,3\}$, right? That is, the claim is equivalent to:
$$ \forall i,j,m,\ell \in \{1,2,3\} \text{ we have } \sum_{k=1}^3 \ve_{i,j,k} \cdot \ve_{\ell,m,k} = \d_{i,\ell} \d_{j,m} - \d_{i,m} \d_{j,\ell} $$ (I take it there is no sum over $k$ on the right-hand side, since $k$ doesn't appear there?) Also, if I wanted to manipulate this so that, say, the second index were fixed, how would I do that? It's clearly not as simple as swapping $k$ with $j$ (or with $m$); I keep getting things that look "obviously wrong". (To make sure I'm at least reading the stated identity correctly, I work out one concrete instance just below.)
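With $i = \ell = 1$ and $j = m = 2$, for instance, only $k = 3$ contributes on the left, and I get
$$ \sum_{k=1}^3 \ve_{1,2,k} \cdot \ve_{1,2,k} = \ve_{1,2,3} \cdot \ve_{1,2,3} = 1 = \d_{1,1} \d_{2,2} - \d_{1,2} \d_{2,1}, $$
while with $i = 1, j = 2, \ell = 2, m = 1$,
$$ \sum_{k=1}^3 \ve_{1,2,k} \cdot \ve_{2,1,k} = \ve_{1,2,3} \cdot \ve_{2,1,3} = -1 = \d_{1,2} \d_{2,1} - \d_{1,1} \d_{2,2}. $$
Both instances check out, so I hope the way I'm reading the convention here is the intended one.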
Sorry if some of these are incredibly basic; I'm very new to this approach to vector calculus (and wasn't good at it to begin with), and scouring MSE and Google for the exact sort of answers I need has not proved useful. Thanks for any help you can provide.
