
I am trying to find the transmission coefficient of a potential using a direct integration technique: I start the integration to the left of the barrier and end to the right of it. The transmission coefficient can then be calculated by comparing the value of the solution on the right-hand side to its initial value.

I have a lot of parameters for the potential. Call them, say, $a, b, c, d, e, f$ and $g$. For a large set of values of $a-f$, I want to find the value of $g$ which maximizes the transmission and the value of that transmission.

The speed is acceptable if I fix $g$, but computing many values of $g$ to search manually for this maximum is incredibly slow. I therefore want to use a numerical optimization method to minimize the number of $g$ values that need to be evaluated; however, the obvious candidates FindMaximum and NMaximize do not work for me at all.

Here is a minimal (non-)working example, which effectively uses NDSolve as a very silly way to square a number and then tries to find the minimum of (x^2 - 1)^2. This feature, and the use of multiple NDSolve calls to avoid indeterminate points, are key parts of the real problem though:

Calcλsquared[λ_, ϵ_] := (temp = 
   f[x] /. NDSolve[{f'[x] == 2 f[x]/x, f[-λ] == λ^2}, 
                    f[x], {x, -λ, -ϵ}][[1]];
   temp = f[x] /. 
     NDSolve[{f'[x] == 2 f[x]/x, 
              f[ϵ] == (temp /. x -> -ϵ)}, 
              f[x], {x, ϵ, λ}][[1]];
   temp /. x -> λ)

Followed by either

FindMinimum[(Calcλsquared[λ, 10^(-3)] - 1)^2, {λ, 0.99}]

or

NMinimize[(Calcλsquared[λ, 10^(-3)] - 1)^2, λ]

Edit: I have made some progress using the ?NumericQ pattern test mentioned here: optimization problem with NDSolve. However, this does not completely solve my problem. I have found a new minimal example: in the real case the function returns a list, and this breaks the approach:

Calcλsquared[λ_?NumericQ, ϵ_?NumericQ] := (temp = 
  f[x] /. NDSolve[{f'[x] == 2 f[x]/x, f[-λ] == λ^2}, 
                  f[x], {x, -λ, -ϵ}][[1]];
 temp = f[x] /. NDSolve[{f'[x] == 2 f[x]/x, 
                         f[ϵ] == (temp /. x -> -ϵ)}, 
                         f[x], {x, ϵ, λ}][[1]];
  {λ, temp /. x -> λ})

and

FindMinimum[(Calcλsquared[λ, 10^(-3)][[2]] - 1)^2, {λ, 0.99}]
asked by user73121, edited by MarcoB
  • Using WhenEvent[] to detect when the derivative is zero is great for this, but you'll have to postprocess to filter out anything that isn't a maximum. – J. M.'s missing motivation Jun 04 '20 at 11:56
  • I didn't manage to find them with search, and they didn't appear while typing out the title, but the ?NumericQ approach which showed up on the side after I posted (here: https://mathematica.stackexchange.com/questions/6751/optimization-problem-with-ndsolve?rq=1) does fix the minimal non-working example – user73121 Jun 04 '20 at 12:01
  • Are there any restrictions on the parameters? – Alex Trounev Jun 04 '20 at 16:35
  • Yes, a lot, but what is the relationship between that and the failure of the FindMinimum method if the function returns a list? – user73121 Jun 04 '20 at 16:51
  • Are you trying to solve a real problem, or do you just want to test FindMinimum? – Alex Trounev Jun 04 '20 at 17:51
  • Yes, I have a real problem. The problem is described in the question (finding transmission coefficients; I need to find a maximum, so in the real example it is FindMaximum, but the behaviour is the same). – user73121 Jun 04 '20 at 17:55
  • If there are restrictions then you can try MinimalBy. For example, lst = Table[{la, (Calc\[Lambda]squared[la, 10^-3][[2]] - 1)^2}, {la, .01, 1.5, .01}]; MinimalBy[lst, Last] gives {{0.99, 0.0000650146}} – Alex Trounev Jun 04 '20 at 18:09
  • As I tried to explain, there are so many parameters that doing it this way would take an extremely long time. That is why I wanted to use FindMaximum or NMaximize: so that only the necessary evaluations are performed, rather than multiplying the number of calculations by 20 or so (they already take hours to days depending on how I sample the parameters). – user73121 Jun 04 '20 at 18:14
  • @user73121 Then maybe it is better to use a more sophisticated function for optimization? – Alex Trounev Jun 04 '20 at 21:00

1 Answer


One way to make this function usable by the optimizer is to localise the arguments inside a Module, like this:

Calcλsquared[l_?NumericQ, e_?NumericQ] :=
 Module[{λ = l, ϵ = e, temp},
  temp = f[x] /.
    NDSolve[{f'[x] == 2 f[x]/x, f[-λ] == λ^2},
      f[x], {x, -λ, -ϵ}][[1]];
  temp = f[x] /.
    NDSolve[{f'[x] == 2 f[x]/x,
       f[ϵ] == (temp /. x -> -ϵ)},
      f[x], {x, ϵ, λ}][[1]];
  temp /. x -> λ]

NMinimize can then handle this function in about 2 seconds:

NMinimize[(Calcλsquared[λ, 10^(-3)] - 1)^2, λ] // AbsoluteTiming



(* {2.12628, {3.1173*10^-15, {λ -> 0.994015}}} *)
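
The list-returning case from the question's edit can be handled the same way: keep the Part extraction inside a function restricted to numeric arguments, so that FindMinimum never tries to take [[2]] of an unevaluated symbolic expression. A minimal sketch, assuming the list-returning Calcλsquared from the question's edit is already defined (the wrapper name objective is hypothetical, not from the original):

objective[λ_?NumericQ] := (Calcλsquared[λ, 10^(-3)][[2]] - 1)^2;
FindMinimum[objective[λ], {λ, 0.99}]

Because objective only evaluates for explicit numbers, the optimizer receives a plain scalar residual, and the fact that Calcλsquared returns a list internally no longer matters.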
answered by Alex Trounev