2

Can Mathematica generate symbolic expressions for gradients?

For example, if $x_1$ and $x_2$ are two points, could I get Mathematica to generate expressions similar to the following?

$\frac{\partial \left|x_1 - x_2\right|}{\partial x_1} = \frac{x_1 - x_2}{\left|x_1 - x_2\right|}$

$\frac{\partial \left|x_1 - x_2\right|}{\partial x_2} = \frac{x_2 - x_1}{\left|x_1 - x_2\right|}$

My experience with Mathematica is limited. I know how to get derivatives w.r.t. scalars.
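To illustrate the scalar case I already know (a trivial example, just to show what I mean):

```
(* derivative with respect to a scalar variable works directly *)
D[Sin[x] Cos[y], x]
(* ==> Cos[x] Cos[y] *)
```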

Elsewhere on this site I found this question, and from the answers it looks like Mathematica has recently acquired the ability to do some amount of symbolic linear algebra.

J. M.'s missing motivation
Praxeolitic
  • D[Sqrt[(x1 - x2) (x1 - x2)], x1]? If x1 and x2 are supposed to be vectors, then I think your expressions are wrong, because the left-hand-side is a scalar, and the right-hand side is a vector. – march Jun 20 '16 at 02:23
  • @march Both sides are a vector. $\frac{\partial}{\partial x_1}$ refers to the vector of derivatives with respect to each component of $x_1$. – Praxeolitic Jun 20 '16 at 02:38
  • So, you mean the gradient? – march Jun 20 '16 at 02:39
  • @march Mostly yes, but split it into portions due to $x_1$ and $x_2$. I'm not sure if there's a better way to refer to that. – Praxeolitic Jun 20 '16 at 02:45

2 Answers

7

Everything you want in the question can be done by defining the derivative of the Norm:

Derivative[1][Norm][z_] := z/Norm[z]

D[Norm[x - y], {x}]
(* ==> (x - y)/Norm[x - y] *)

Simplify[D[Norm[x - y], {y}]]
(* ==> (-x + y)/Norm[x - y] *)

Here, the syntax I used for the derivatives is such that it would remain valid if x or y were replaced by vectors (i.e., Lists).
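One way to sanity-check this is to compare against the componentwise gradient of the explicit norm, writing x and y out as 3-vectors (the names xv, yv, x1, ..., y3 below are just illustrative):

```
Derivative[1][Norm][z_] := z/Norm[z]

xv = {x1, x2, x3}; yv = {y1, y2, y3};
(* componentwise gradient of the explicitly written-out norm *)
grad = D[Sqrt[(xv - yv).(xv - yv)], {xv}] // Simplify;
(* compare with the symbolic result (x - y)/Norm[x - y], evaluated on vectors *)
Simplify[(xv - yv)/Sqrt[(xv - yv).(xv - yv)] - grad]
```

The difference should simplify to the zero vector, confirming that the symbolic rule agrees with the component calculation.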

Jens
  • Is there a way to load definitions for derivatives of common linear algebra expressions "out of the box"? – Praxeolitic Jun 20 '16 at 03:54
  • 2
    Not really. For most purposes, it's still necessary to either use explicit component notation (i.e., replace x by {xa, xb, xc} etc.), or to add your own definitions as I did here, or to use $Assumptions in combination with TensorReduce, because by default symbols like x and y are interpreted as generic complex scalars. It's especially difficult to "simplify" vector calculus expressions automatically, because it's not as easy to characterize what constitutes a "simpler" result. – Jens Jun 20 '16 at 04:06
2

To some extent (and with some care) this can be done with FeynCalc. At least I have used it several times when I needed to compute gradients and divergences of Cartesian vectors. The trick is to work with D-dimensional 4-vectors and take the limit $D \to 3$ at the end. Since FeynCalc doesn't distinguish between upper and lower indices, the results are the same as if one were working with Cartesian vectors. For example, computing $\partial x_1^i/ \partial x_1^j$ via

FourDivergence[FVD[x1, i], FVD[x1, j]]

returns $g^{i j}$ which one should interpret as a Kronecker delta. For the examples from the OP's question we can use

FourDivergence[Sqrt[SPD[x1 - x2]], FVD[x1, i]] // Together

to get

$\frac{\text{x1}^i-\text{x2}^i}{\sqrt{\text{x1}^2-2 \left(\text{x1} \cdot \text{x2}\right)+\text{x2}^2}}$

and

FourDivergence[Sqrt[SPD[x1 - x2]], FVD[x2, i]] // Together

to obtain

$\frac{\text{x2}^i-\text{x1}^i}{\sqrt{\text{x1}^2-2 \left(\text{x1} \cdot \text{x2}\right)+\text{x2}^2}}$

This is of course not a general solution, but I find it quite useful from time to time, so I thought I'd share it here. If one wants something more general, then the xTensor package might be worth a try.
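Without FeynCalc, the same two gradients can be checked in plain Mathematica by writing the vectors out in explicit Cartesian components (the component names below are just placeholders):

```
xv = {x1a, x1b, x1c}; yv = {x2a, x2b, x2c};
r = Sqrt[(xv - yv).(xv - yv)];   (* |x1 - x2| written out in components *)

D[r, {xv}] // Simplify   (* componentwise form of (x1 - x2)/|x1 - x2| *)
D[r, {yv}] // Simplify   (* componentwise form of (x2 - x1)/|x1 - x2| *)
```

This reproduces, component by component, the index expressions returned by FourDivergence above.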

vsht