 $Version
 "13.0.1 for Linux x86 (64-bit) (January 29, 2022)"

I have $28$ GB of RAM.

I need to calculate two million consecutive eigenvalues for a given billiard (e.g. a circular billiard).

\[ScriptCapitalR] = ImplicitRegion[x^2 + y^2 <= 1, {x, y}];
Region[\[ScriptCapitalR], GridLines -> Automatic, 
 GridLinesStyle -> Directive[Red, Dashed]]

[plot of the unit-disk region \[ScriptCapitalR] with red dashed grid lines]

vals = NDEigenvalues[{-1/2 Laplacian[u[x, y], {x, y}],
     DirichletCondition[u[x, y] == 0, True]},
    u[x, y], {x, y} \[Element] Region[\[ScriptCapitalR]], 2000000,
    Method -> {"PDEDiscretization" -> {"FiniteElement",
        {"MeshOptions" -> {"MaxCellMeasure" -> 0.00001}}},
      "Eigensystem" -> {"Arnoldi",
        "MaxIterations" -> Infinity}}]; // AbsoluteTiming

The above code works fine, but it takes an immensely long time and consumes all of my RAM, which either freezes my PC or forces it to restart.
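
For scale, here is a rough estimate based on the leading Weyl term (just a sketch, so the constants are approximate): with Dirichlet conditions, the counting function for the operator above is roughly $N(E) \approx A E/(2\pi)$ for a domain of area $A$, so two million levels on the unit disk reach $E \approx 4 \times 10^6$.

(* Rough Weyl-law sketch: N(E) ~ A E/(2 Pi) for -1/2 Laplacian with Dirichlet
   conditions, so the energy needed for n levels is about 2 Pi n/A; the
   corresponding wavelength sets the resolution the mesh has to provide. *)
With[{area = Pi, n = 2*10^6},
 With[{eMax = 2 Pi n/area},
  {"eMax" -> N[eMax], "wavenumber" -> N[Sqrt[2 eMax]],
   "wavelength" -> N[2 Pi/Sqrt[2 eMax]]}]]

With a wavelength of about $2 \times 10^{-3}$ and several nodes per wavelength, the mesh needs on the order of $10^7$ nodes, which is consistent with the RAM blow-up.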

In the above code, choosing a larger "MaxCellMeasure" (a coarser mesh) reduces the time and memory consumption, but it also limits the maximum number of eigenvalues that can be computed from the discretized system.
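
For instance, counting the mesh nodes for a given "MaxCellMeasure" bounds how many eigenvalues the discretized operator can supply at all. A minimal sketch (the exact bound also depends on the interpolation order NDEigenvalues uses):

Needs["NDSolve`FEM`"]
(* Sketch: the discretized operator has only as many eigenvalues as degrees of
   freedom, so the node count bounds what NDEigenvalues can possibly return. *)
mesh = ToElementMesh[\[ScriptCapitalR], "MaxCellMeasure" -> 0.00001];
Length[mesh["Coordinates"]]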

Is there any other way to speed up the evaluation? Or is there any alternative way to find two million consecutive eigenvalues?

Thank you.

user444
  • Region is not defined. – Alex Trounev Mar 14 '24 at 03:11
  • Dr. Evil vibes: https://imgflip.com/i/8j6rc8 – Chris K Mar 14 '24 at 03:43
  • @AlexTrounev, I've updated the question. – user444 Mar 14 '24 at 04:19
  • @ChrisK LOL! People are always being evil. For reference: https://doi.org/10.1103/PhysRevResearch.4.013138 – user444 Mar 14 '24 at 04:23
  • The eigenvalues of the Laplacian on a disk are known analytically (see the BesselJZero sketch after these comments). – Ghoster Mar 14 '24 at 06:27
  • Have you tried to estimate the memory and time with the current approach? You can see how NDEigensystem is implemented here. Most likely the Arnoldi method of Eigensystem will be the bottleneck. – user21 Mar 14 '24 at 06:47
  • @Ghoster, the problem is not exactly about the disk; the problem is how to speed up the process. – user444 Mar 14 '24 at 07:53
  • @user21, How do I estimate the memory and time with the current approach without running the actual calculation? Will the answer at https://mathematica.stackexchange.com/a/218870/84456 work? – user444 Mar 14 '24 at 08:00
  • @user444, start with a larger MaxCellMeasure and fewer eigenvalues, then gradually increase them and measure the timing and memory used. Afterwards, extrapolate to the desired number of eigenvalues and MaxCellMeasure (a sketch of this appears after these comments). – Domen Mar 14 '24 at 09:38
  • Also, you can see that the authors used some specialized algorithms for their calculations: "Using the very efficient scaling method of Vergini and Saraceno [20,21] with a corner adapted Fourier-Bessel basis [22] (the implementation is available as part of [23]) we computed more than $2 \times 10^6$ levels for each triangle" – Domen Mar 14 '24 at 09:43
  • Actually we can use \[ScriptCapitalL] = -Laplacian[u[x, y], {x, y}]; \[ScriptCapitalB] = DirichletCondition[u[x, y] == 0, True]; and DEigenvalues[{\[ScriptCapitalL], \[ScriptCapitalB]}, u[x, y], {x, y} \[Element] Disk[], 2 10^6] but it runs forever. – Alex Trounev Mar 14 '24 at 19:28
  • @AlexTrounev, It does give numerical values with //N, but in a test with $50$ eigenvalues, NDEigenvalues takes $0.564904$ seconds whereas DEigenvalues takes $61.6252$ seconds, and the 50th eigenvalue from the two methods is entirely different. Why is this happening? And isn't it supposed to be faster? – user444 Mar 14 '24 at 23:52
  • @user444 It is not clear why you need 2 10^6 eigenvalues for the Laplacian on a disk. Would you like to repeat the research paper linked above with an elliptical region? In my opinion, they did not take into account that a triangular region could be mapped onto a disk using a conformal map. – Alex Trounev Mar 15 '24 at 14:02
  • @AlexTrounev, in the paper, the authors diagnosed quantum chaos using the random matrix theory (RMT) approach and computed the nearest-neighbour level-spacing distribution (NLSD). For that reason they needed 2*10^6 eigenvalues, as it is a statistical method. I am getting similar results with 5k eigenvalues for the billiard I am working with (not a disk). However, compared with 2M, a sample of 5k is not good enough. What should I do now? If you have a better suggestion, I am open to it. – user444 Mar 15 '24 at 15:00
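
Regarding Ghoster's comment: for the disk specifically the spectrum is known in closed form, since the Dirichlet eigenvalues of -1/2 Laplacian on the unit disk are $j_{m,n}^2/2$, where $j_{m,n}$ is the $n$-th positive zero of BesselJ[m, x] (doubly degenerate for $m > 0$). A minimal sketch, useful only for the disk and not for a general billiard:

(* Sketch: collect all Bessel zeros j(m,n) <= kMax; each gives a Dirichlet
   eigenvalue j(m,n)^2/2 of -1/2 Laplacian on the unit disk, with multiplicity
   2 for m > 0. The m and n ranges are generous upper bounds, since the first
   zero of BesselJ[m, x] already exceeds m. *)
diskEigenvalues[kMax_] := Sort@Flatten@Table[
    With[{j = N@BesselJZero[m, n]},
     If[j <= kMax, ConstantArray[j^2/2, If[m == 0, 1, 2]], Nothing]],
    {m, 0, Ceiling[kMax]}, {n, 1, Ceiling[kMax]}]

diskEigenvalues[10.]

By the Weyl estimate above, collecting about two million disk levels would need kMax near 2830, which is far cheaper than the FEM route; as user444 points out, though, this shortcut does not carry over to other billiard shapes.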

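A sketch of the scaling experiment Domen suggests, timing NDEigenvalues and recording the peak memory for a few coarse settings before extrapolating to the target (the parameter grids below are illustrative placeholders):

(* Sketch: measure wall time and peak memory over a grid of (MaxCellMeasure,
   number of eigenvalues) settings, then extrapolate to the desired values. *)
scalingData = Flatten[Table[
    Module[{time, mem},
     mem = MaxMemoryUsed[
       time = First@AbsoluteTiming[
          NDEigenvalues[{-1/2 Laplacian[u[x, y], {x, y}],
            DirichletCondition[u[x, y] == 0, True]},
           u[x, y], {x, y} \[Element] \[ScriptCapitalR], n,
           Method -> {"PDEDiscretization" -> {"FiniteElement",
               {"MeshOptions" -> {"MaxCellMeasure" -> cell}}}}];]];
     {cell, n, time, mem}],
    {cell, {0.01, 0.001, 0.0001}}, {n, {100, 500, 2000}}], 1];

TableForm[scalingData,
 TableHeadings -> {None, {"MaxCellMeasure", "n", "time (s)", "bytes"}}]

Plotting time and memory against n and against 1/"MaxCellMeasure" (for example with ListLogLogPlot) then makes the extrapolation to 2*10^6 eigenvalues explicit.
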
0 Answers