I have seen it said that diffraction causes laser beam divergence, that a laser beam will always diverge due to diffraction, or some variation of these statements. I understand diffraction in general, and I understand that the phenomenon applies to all waves, so I understand that it would also apply to laser beams; but it is not clear to me how it causes laser beam divergence, or why a laser beam will always diverge because of it. When trying to research this, I can't find anything that directly and clearly explains it – most results either just mention diffraction in the context of lasers without providing an explanation, or mention 'diffraction-limited beams', which I think is something different from what I'm asking. So how does diffraction cause laser beam divergence, and why will a laser beam always diverge due to diffraction?
-
I would suggest that you look up Huygens principle and play with it to see how it would apply to a laser beam of various widths. – S. McGrew Apr 02 '21 at 01:15
-
It's not true that all laser beams diverge due to diffraction. What's true is that all laser beams with finite spatial extent diverge due to diffraction. – The Photon Apr 02 '21 at 05:25
-
@ThePhoton Assuming "finite spatial extent" is the same as 'spatial confinement', don't all laser beams have "finite spatial extent"? Isn't this the assumption we use when solving Maxwell's equations, which then results in the 'Gaussian beam'? Or am I misunderstanding something? – The Pointer Apr 02 '21 at 05:50
-
Yes real laser beams all have finite extent. But we don't always use that assumption when solving Maxwell's equations. For example, when we obtain plane wave solutions (which, take note, is a solution that doesn't diverge due to diffraction). – The Photon Apr 02 '21 at 05:54
-
@ThePhoton oh, right. But, from what I remember, the plane wave solutions don't produce a Gaussian beam. – The Pointer Apr 02 '21 at 05:55
-
Yes, but are you askiing if all Gaussian beams diverge, or if all possible beams diverge? – The Photon Apr 02 '21 at 05:56
-
@ThePhoton Eh, that's a good question. I honestly wasn't even considering the specific type of beam. My question is moreso how diffraction causes laser beam divergence, in general, since this is the general claim that I encounter during my research/studies. I mean, I'm assuming that all types of beams diverge due to diffraction, no? – The Pointer Apr 02 '21 at 06:02
-
Now go back to my first comment. – The Photon Apr 02 '21 at 14:38
-
@ThePhoton Ahh, ok, now it makes sense. Well then, I guess my question is only with regards to those beams that diverge due to diffraction (specifically, Gaussian beams). – The Pointer Apr 02 '21 at 14:41
-
See this question and the associated answers: https://physics.stackexchange.com/questions/444894/why-exactly-does-diffraction-occur – Andrew Steane Apr 04 '21 at 20:35
-
@ThePhoton 1. There are no lasers with infinite spatial extent. 2. Therefore all laser beams have finite spatial extent at the laser. 3 Therefore all laser beams diffract. – Andrew Steane Apr 04 '21 at 20:39
5 Answers
The key point is that a laser beam is a wave, which propagates according to Huygens' principle. Once you accept this fact, the divergence follows naturally.
Huygens' principle states that the propagation is due to the generation of spherical waves, which in turn generate spherical waves in the next step of the propagation. [Picture taken from wiki]

In the image we see that the center of the "hole" generates a "flat" wave. The diffraction is evident only at the edges.
In order to capture the behaviour of the "central part" of a wavefront we use an approximation and omit the edges to a certain extent. In the upper picture we might describe the central part as a plane wave. If instead we use spherical mirrors to generate a propagating wave, we end up with the Gaussian beam $$ E \propto \exp\left( - \frac{r^2}{w_0^2 (1 + (z/z_R)^2)} \right) $$ If we include the quadratic phase correction for the wavefront and the Gouy phase, the approximation improves. However, the Gaussian beam is always an approximation obtained by omitting the edges of the wave (in deriving it, we use the paraxial Helmholtz equation).
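As a quick numerical illustration of the formula above (not part of the original answer; the numbers are assumed for illustration: a 633 nm HeNe laser with a 0.5 mm waist), the Rayleigh range $z_R = \pi w_0^2/\lambda$ and the far-field divergence half-angle $\theta = \lambda/(\pi w_0)$ of a Gaussian beam can be computed directly:

```python
import math

# Illustrative, assumed values: a HeNe laser (633 nm) with a 0.5 mm waist.
wavelength = 633e-9   # m
w0 = 0.5e-3           # beam waist radius, m

z_R = math.pi * w0**2 / wavelength    # Rayleigh range
theta = wavelength / (math.pi * w0)   # far-field divergence half-angle

def w(z):
    """Beam radius at distance z from the waist."""
    return w0 * math.sqrt(1 + (z / z_R)**2)

print(f"Rayleigh range: {z_R:.3f} m")              # ~1.24 m
print(f"Divergence half-angle: {theta*1e3:.3f} mrad")
print(f"Beam radius at 10 m: {w(10)*1e3:.2f} mm")
```

Note that $\theta \propto 1/w_0$: halving the waist doubles the divergence, which is the diffraction statement in quantitative form.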
-
Thanks for the answer. So, if I'm understanding this correctly, the reasoning is that, since the presence of any aperture/opening will cause diffraction, it is the presence of an aperture in lasers that causes the diffraction? Furthermore, is it necessary to use spherical mirrors for lasers to work? – The Pointer Apr 04 '21 at 16:37
-
I'm sorry, but no. The reasoning is: (1) A laser beam behaves like a wave. (2) Huygens' principle describes its propagation. (3) Huygens' principle uses the superposition of spherical waves. (4) If we have a superposition of spherical waves, diffraction is inevitable and expected/understandable. The finite size of the laser beam is only important because diffraction is detectable only in the "wings" of the beam. After all, diffraction is the deviation from linear propagation -- see Sommerfeld. – Semoi Apr 04 '21 at 18:15
-
https://en.wikipedia.org/wiki/Huygens–Fresnel_principle "It states that every point on a wavefront is itself the source of spherical wavelets, and the secondary wavelets emanating from different points mutually interfere." So are the gaps that we see between the waves due to destructive interference? – The Pointer Apr 04 '21 at 19:15
-
No, they are not. These are "cosine waves", but in 3D. Since we are unable to draw a cosine in 3D, we only draw the points, where the phase is zero, $\phi = 0$. These are the lines in my picture. – Semoi Apr 04 '21 at 19:43
-
The simplest description of a laser beam uses ray optics. Often it is a good approximation. In it, light is a ray that follows a straight line. According to this description, there need be no divergence. But this description is too simple.
A better description is light as a wave. To get the true beam, you must solve the classical Maxwell's equations with a boundary condition. The optical cavity of a laser must have curved mirrors to be stable. The wave solution for a cavity bounded by spherical mirrors is a Gaussian Beam. Wavefronts are spherical. "Rays" are not quite straight, but follow hyperbolic paths. The beam cross section is Gaussian. The intensity is maximum at the beam axis and falls off smoothly away from the axis.
Image from https://www.rp-photonics.com/gaussian_beams.html
There is also a quantum mechanical explanation. The simplest quantum mechanical explanation invokes the Uncertainty Principle.
Imagine a beam with a uniform amplitude across its cross section. The beam consists of photons. The photons pass through a circular aperture, which confines the beam cross section to a limited $\Delta x$. Because $\Delta x \, \Delta p \ge \hbar/2$, the photons must have a non-zero spread of momentum in the direction perpendicular to the beam. The beam cannot be perfectly collimated.
In practice, the solution to Maxwell's equations has a Gaussian cross section. Apertures are carefully chosen large enough not to significantly distort the beam by truncating its edge. Even though not physically confined by an aperture, the beam cross section is still confined because of the Gaussian profile. The beam cannot be perfectly collimated, because of the Uncertainty Principle.
This is enough to tell you that a small diameter beam will have a large divergence. If you focus a beam to a small spot, it will have a very small waist. Therefore it must have a large divergence angle.
The image is from optique-ingenieur
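The uncertainty-principle estimate above can be sketched numerically. The numbers here (633 nm light confined to a 0.5 mm half-width) are assumptions for illustration, not taken from the answer:

```python
import math

# Order-of-magnitude sketch of the uncertainty-principle argument:
# confine photons to a transverse half-width dx and estimate the
# minimum divergence. All numbers are assumed/illustrative.
hbar = 1.054571817e-34   # J*s
h = 2 * math.pi * hbar

wavelength = 633e-9      # m (assumed HeNe laser)
dx = 0.5e-3              # transverse confinement, m

p = h / wavelength       # photon momentum along the beam
dp = hbar / (2 * dx)     # minimum transverse momentum spread
theta = dp / p           # resulting divergence half-angle
# Algebraically, theta = lambda / (4*pi*dx): the same
# wavelength-over-width scaling that diffraction gives.
print(f"theta ~ {theta:.3e} rad")
```

The exact numerical prefactor depends on how the widths are defined, but the scaling $\theta \sim \lambda / \Delta x$ is the same one the Gaussian-beam formula gives.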
A better quantum mechanical explanation shows the classical explanation is the same thing in disguise. See Interesting relationship between diffraction and Heisenberg’s uncertainty principle?
A photon has a wave function that is a solution of the Schrödinger equation. Like Maxwell's equations, this is a wave equation. A photon in a cavity has the same boundary conditions as the electromagnetic wave in the same cavity. The photon's wave function is also a radially symmetric function with spherical wavefronts and a Gaussian profile.
The wave function is in the position basis. You take the Fourier Transform to convert to the momentum basis. The Fourier Transform of the Gaussian cross section is a Gaussian cross section. The transverse momentum of the beam is a superposition of non-zero momentum states. The beam cannot be perfectly collimated. It has the same divergence as the electromagnetic wave.
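A minimal numerical check of this Fourier-transform argument (with purely illustrative widths and grid sizes): the transform of a Gaussian is again a Gaussian, and the product of the rms position and wavenumber widths saturates the minimum value of 1/2, so a narrower beam necessarily has a wider spread of transverse momentum.

```python
import numpy as np

# Illustrative grid: sample a Gaussian amplitude profile and compare
# the rms widths of its intensity in position and wavenumber space.
N = 4096
x = np.linspace(-50.0, 50.0, N)
dx = x[1] - x[0]
field = np.exp(-x**2 / (2 * 2.0**2))   # Gaussian amplitude, width 2.0

intensity = np.abs(field)**2
sigma_x = np.sqrt(np.sum(x**2 * intensity) / np.sum(intensity))

# Discrete Fourier transform -> distribution over transverse wavenumber k
k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))
spectrum = np.abs(np.fft.fftshift(np.fft.fft(field)))**2
sigma_k = np.sqrt(np.sum(k**2 * spectrum) / np.sum(spectrum))

print(f"sigma_x * sigma_k = {sigma_x * sigma_k:.3f}")  # -> 0.500
```

A Gaussian is the minimum-uncertainty profile; any other profile of the same width would give a product larger than 1/2 and hence more divergence.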
-
Thanks for the answer. But how does this answer my question about diffraction causing beam divergence? This isn't clear to me. The only time I see the word 'diffraction' used is when you link to the other question regarding Heisenberg's uncertainty principle. – The Pointer Apr 02 '21 at 02:23
-
Good point. Loosely, diffraction is the difference between ray optics and wave descriptions of light. Diffraction is the bending of light caused by its wave nature. So if you include diffraction in your description of how light propagates by choosing a wave description, you get beam divergence. – mmesser314 Apr 02 '21 at 02:32
-
Historically, lens design was done by ray tracing. The goal is to get as close to the perfect lens as possible by minimizing lens aberrations. Spherical surfaces are not the ideal shape, but they are much easier to manufacture. They do not direct rays exactly where you want. But you can choose radii and separations between elements to minimize aberrations. If you do a very good job, you get a lens so good that the biggest error is diffraction. Such a lens is diffraction limited. Diffraction is not predicted by ray tracing, but it is easy to calculate from the size of lens apertures. – mmesser314 Apr 02 '21 at 02:34
-
Your answer is interesting and informative, but all of this dances around my question. You're repeating that diffraction causes beam divergence, which I understand, but there is no explanation here for how it does so, and why a beam will always diverge, due to diffraction. The point of my question was to get a direct and clear answer to this. – The Pointer Apr 02 '21 at 02:38
-
Diffraction isn't exactly the cause of beam divergence. The Uncertainty Principle, or quantum mechanics, is the cause of both beam divergence and diffraction. That is, the Uncertainty Principle is the cause of light not traveling in straight lines as predicted by ray tracing. – mmesser314 Apr 02 '21 at 02:43
-
With regards to divergence being due to the HUP, that seems to agree with what I read here https://physics.stackexchange.com/a/79469/141502. However, this https://physics.stackexchange.com/a/114300/141502 answer says "strictly speaking, the physics of diffraction cannot be explained as the HUP (i.e. as arising from the canonical commutation relationships) because there is no position observable $\hat{X}$ for the photon, so you can't think of $\Delta x, \Delta p$." This seems to disagree with what you said about the HUP being the cause of both divergence and diffraction, no? – The Pointer Apr 02 '21 at 02:57
-
Ok, Andrew Steane's comment to his answer here https://physics.stackexchange.com/a/626372/141502 states that the HUP follows from diffraction, so I think that clarifies that. – The Pointer Apr 02 '21 at 16:16
-
People think in different ways. This makes them talk in different ways. A physicist thinks about the behavior of light. Classical physics can be applied in many cases, so he thinks about Maxwell's equations and electric fields. If how light interacts with individual atoms is important, he thinks about photons and quantum mechanics. This leads to two completely different ways of talking about light. They would say wave properties or the Uncertainty Principle cause light to bend, which means they cause diffraction. – mmesser314 Apr 03 '21 at 00:44
-
A lens designer thinks about rays, lens surfaces, and aberrations. He thinks about why rays do not all meet at a point. This aberration causes rays to focus to a circle of this size. That aberration adds so much to it. He works for a while and gets the aberrations small. Light still doesn't focus perfectly. He says diffraction makes light bend. To him, diffraction is just another aberration that prevents light from focusing perfectly. This is a reasonable way for him to think for his purposes. – mmesser314 Apr 03 '21 at 00:50
-
There are many examples of people thinking in different ways. Standing at the side of the road, I see a car moving by. The driver thinks the car is sitting still. The seat is right under him. A while later, it is still right under him. The world is moving around him. He steps on the gas, and the world accelerates. You can do correct physics from this point of view. Fictitious forces were invented for this purpose. It can be confusing, even when it is correct. – mmesser314 Apr 03 '21 at 00:55
-
The physicist can explain why light bends. The lens designer would have a hard time describing exactly what it means to say that diffraction causes light to bend. It is a little backwards. – mmesser314 Apr 03 '21 at 01:03
-
Physicists are careful about such things when they need to be. They care about measurements and how the world behaves. Sometimes they can be sloppy and still have an accurate description. Mathematicians are much more likely to be sticklers for correctness in picky details. They deal with pure ideas. Their only way to show an idea is right is proof. Proofs must be airtight because one false idea can be used to prove other false ideas. – mmesser314 Apr 03 '21 at 01:03
-
"The optical cavity of a laser must have curved mirrors to be stable..." — this doesn't mean that lasers always have stable resonators. Unstable ones have some useful properties and are also used in lasers. Still, diffractive divergence is still a thing there, since the aperture is finite anyway. – Ruslan Apr 04 '21 at 21:31
I will answer this only in terms of diffraction, since that is the fundamental limit of laser beam divergence. Diffraction is the spreading of the beam because of the finite width of the beam. It is a fundamental fact of physics; even the uncertainty principle is an aspect of the same thing, i.e., the more you confine the location of a particle, the less you know about its direction. In a way, this all comes down to wave mechanics.
Consider that each point in a wave propagates out in a circular fashion from that point (like a water ripple spreading out from where a stone falls into the water). If a point next to that point also propagates out with the same phase (the peaks and valleys oscillate together), the two circular waves will add together to give a composite wave. As the line of "emitters" grows longer, the wave starts to look like a planar wave, but the edges will still propagate outwards. As the "beam" of the wave gets wider and wider, the net effect is that the spread becomes less and less. It doesn't matter if this is a light wave from a laser, a water wave, or a slit in a quantum experiment; the result is the same.
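The emitter picture described above can be sketched numerically. This toy model (all parameters are illustrative, and the function name is my own) sums Huygens wavelets from a line of point emitters in the far field and shows that a wider beam spreads less:

```python
import numpy as np

# Toy Huygens model: sum wavelets from a line of point emitters and
# measure the angular half-width of the far-field intensity pattern.
# Wavelength is set to 1; all other numbers are illustrative.
wavelength = 1.0
wavenumber = 2 * np.pi / wavelength

def farfield_halfwidth(aperture_width, n_emitters=400):
    """Half-width (rad) where far-field intensity falls to half maximum."""
    xs = np.linspace(-aperture_width / 2, aperture_width / 2, n_emitters)
    angles = np.linspace(-0.5, 0.5, 2001)
    # In the far field, the wavelets from emitter x arrive at angle theta
    # with relative phase k * x * sin(theta); sum them coherently.
    amp = np.exp(1j * wavenumber * np.outer(np.sin(angles), xs)).sum(axis=1)
    inten = np.abs(amp)**2
    return angles[inten >= inten.max() / 2].max()

narrow = farfield_halfwidth(5 * wavelength)
wide = farfield_halfwidth(50 * wavelength)
print(f"half-width, 5-wavelength beam:  {narrow:.4f} rad")
print(f"half-width, 50-wavelength beam: {wide:.4f} rad")
```

The 50-wavelength "beam" spreads roughly ten times less than the 5-wavelength one, matching the answer's point that a wider superposition of wavelets diverges less.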
So, we can say a laser beam diverges because of fundamental physics and the nature of the spatial superposition of the coherently emitted photons from a laser.
Here are some resources where you could read further:
-
oh and do read the answer to a similar question on physics stack exchange: https://physics.stackexchange.com/a/79469/290525 – Raghavendra Singh Apr 04 '21 at 13:50
I am no expert on this particular topic. I could be wrong.
To my knowledge, all electromagnetic waves diffract. Since a laser is a highly coherent monochromatic light source created from stimulated emission, the wavelengths are all the same and the troughs overlap with the troughs and the crests with the crests. This means that it would follow a perfect path of diffraction through a gap without the waves themselves cancelling each other out.
The laser beam source is always of finite dimensions, thus it cannot provide a truly parallel beam. The laser beam is slightly divergent from its origin.

