I have a quick question. In a previous question, Solving an ODE with boundary conditions at infinity and 0, I asked for help with a differential equation; however, the whole solving routine turned out to be very unstable at large distances. Now I wonder whether it is possible to implement a boundary condition that keeps the gradient of the function negative while minimizing its value at a certain point. I solved my differential equation via:
start = 0.01;   (* inner boundary *)
inf = 5.;       (* outer boundary, stands in for infinity *)
e = 4.84;
beta = 0.263;
Du = beta^2/4.;
deqn = {(0.25*x^2 + 2*Sin[F[x]]^2)*F''[x] + 0.5*x*F'[x] +
     Sin[2*F[x]]*F'[x]^2 - 0.25*Sin[2*F[x]] -
     Sin[2*F[x]]*Sin[F[x]]^2*x^(-2) - Du*x^2*Sin[F[x]] == 0,
   F[start] == Pi, F'[start] == dy0};
ydysol1 = ParametricNDSolve[deqn, F, {x, start, inf}, dy0][[1]]
and now I am trying to implement:
dysol1 = NMinimize[{((F[dy0] /. ydysol1)[inf])^2,
   (F[dy0] /. ydysol1)'[y] < 0, -10. < dy0 < -1., start < y < inf},
  {dy0, y}]
This runs forever, even on a very small interval where I know the gradient is negative (i.e., 0.01 to 5). I would highly appreciate suggestions. I should also stress that I am no expert in solving differential equations and have hardly any experience with Mathematica. Thank you so much in advance.
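One idea I have considered (a sketch only, not a working solution): since the constraint `F' < 0` should presumably hold on the whole interval rather than at a single free point `y`, the extra search variable could be dropped and the constraint enforced approximately as a penalty sampled on a grid. The penalty weight `10.^4` and the grid step `0.1` below are arbitrary choices on my part, and `NumericQ` guards the objective so `NMinimize` only calls it with numeric `dy0`:

```
(* Sketch: minimize F[inf]^2 over dy0 alone; any positive gradient
   sampled on the grid start, start+0.1, ..., inf is penalized. *)
obj[dy0p_?NumericQ] := Module[{f = F[dy0p] /. ydysol1, pen},
  pen = Total[Max[0., f'[#]] & /@ Range[start, inf, 0.1]];
  f[inf]^2 + 10.^4 pen]

dysol1 = NMinimize[{obj[dy0], -10. < dy0 < -1.}, dy0]
```

This reduces the problem to a one-dimensional search over the shooting parameter `dy0`, but I do not know whether it is the idiomatic way to handle such a constraint.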