
I'm trying to set the derivative of a function to zero at a specified point. This will act as an initial condition for an algorithm I'm writing: I'm specifying the value of the function at x = 0 and the value of its slope at x = 0. The function is u[n, x] and is unknown. The algorithm, based on the Adomian decomposition method, will solve a nonlinear differential equation for this u-function.

Clear[u]
u /: D[u[n_, x_], x_] := 0   (* UpValue stored on u: D[u[a, v], v] rewrites to 0 for any a and v *)
UpValues[u];
D[u[n, x], x] == 0
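
Evaluated without the trailing semicolon, UpValues[u] should show the stored rule, roughly:

UpValues[u]
(* {HoldPattern[D[u[n_, x_], x_]] :> 0} *)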

It gives the right output -- the last line evaluates to True.

The problem I'm having is that other queries also evaluate to True when they shouldn't:

D[u[n, x], 0] == 0   (* True *)
D[u[n, 0], 0] == 0   (* True *)
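
At least the second of these follows directly from how the pattern is written: nothing in D[u[n_, x_], x_] restricts x_ to a symbolic variable, so it binds to 0 whenever the second argument of u and the differentiation slot agree. A quick check of the stored pattern (a sketch, assuming the definition above is the only one attached to u) illustrates this:

MatchQ[Unevaluated[D[u[n, 0], 0]], HoldPattern[D[u[n_, x_], x_]]]
(* True: x_ binds to 0 in both slots, so the UpValue fires and the expression becomes 0 *)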

This is somewhat of a duplicate of a question asked here in a broader sense: How to set the derivative of a function to zero?

Ideally, the behavior I want would come from modifying the definition along these lines:

u /: (D[u[n_, x_], x_] := 0) /. x -> 0

The result would be:

(D[u[n, x], x] == 0) /. x -> 0   (* True *)
(D[u[n, x], x] == 0) /. x -> 5   (* False or indeterminate *)
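
For comparison, here is a minimal sketch of one possible way to get roughly that behavior, attaching the conditions to Derivative at the specific point x = 0 rather than to D with a general pattern (u0 is a hypothetical placeholder for whatever value the function is supposed to take at x = 0):

Clear[u, u0]
u[n_, 0] = u0;                    (* prescribed value of the function at x = 0 *)
Derivative[0, 1][u][n_, 0] = 0;   (* prescribed slope in x at x = 0 *)

(D[u[n, x], x] == 0) /. x -> 0    (* True: the derivative rule fires only at x = 0 *)
(D[u[n, x], x] == 0) /. x -> 5    (* stays unevaluated, since u is unknown away from x = 0 *)

Because D is computed before the replacement, x -> 0 is applied to Derivative[0, 1][u][n, x] rather than to u[n, x] itself, so the value rule and the slope rule don't interfere with each other.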

Thank you for any tips.

Buddhapus
