Followup: setting Fs = bandWidth in the accepted answer's code yields the "correct" frequency response of an LFM chirp: constant magnitude and parabolic phase. Setting it to anything else, including Fs >> bandWidth, yields distortions, and the magnitude converges to a square shape in the limit Fs / bandWidth -> inf.
Why is this the case? Shouldn't a greater Fs be better - less aliasing, greater resolution, etc.?
fs = bandWidth: this isn't about "zoom", and a real-valued chirp doesn't exhibit this either. I presume it's some convenient coincidence, but I won't dig further into this - already chirped out. – OverLordGoldDragon Sep 08 '20 at 12:41
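The "coincidence" can be reproduced numerically. A minimal sketch (the accepted answer's code isn't shown here, so this assumes a complex analytic chirp sweeping 0 to bandWidth over 1 s; `lfm_chirp_spectrum` and `in_band_ripple` are hypothetical helper names): at fs = bandWidth with an integer time-bandwidth product N, the samples reduce to exp(i*pi*n^2/N), whose DFT magnitude is exactly flat for even N - a discrete quadratic-phase (Gauss-sum) identity, not a property of the continuous chirp. Oversampling breaks that identity and the Fresnel ripples reappear:

```python
import numpy as np

BAND_WIDTH = 100.0  # Hz, hypothetical value standing in for bandWidth
DURATION = 1.0      # s

def lfm_chirp_spectrum(fs, band_width=BAND_WIDTH, duration=DURATION):
    """|DFT| of a complex LFM chirp sweeping 0 -> band_width over `duration`."""
    t = np.arange(0.0, duration, 1.0 / fs)
    phase = np.pi * (band_width / duration) * t**2  # inst. freq = (B/T) * t
    return np.abs(np.fft.fft(np.exp(1j * phase)))

def in_band_ripple(mag, fs, band_width=BAND_WIDTH):
    """Relative std of |DFT| over the middle 80% of the swept band."""
    n = len(mag)
    freqs = np.arange(n) * fs / n
    band = mag[(freqs > 0.1 * band_width) & (freqs < 0.9 * band_width)]
    return band.std() / band.mean()

# Critical sampling, fs = bandWidth: N = 100 (even), magnitude flat
# to machine precision -- exp(1j*pi*n**2/N) has constant |DFT|.
mag_eq = lfm_chirp_spectrum(fs=100.0)
# Heavy oversampling, fs >> bandWidth: in-band Fresnel ripples return.
mag_hi = lfm_chirp_spectrum(fs=1600.0)

print("ripple at fs = B:   ", in_band_ripple(mag_eq, 100.0))
print("ripple at fs = 16B: ", in_band_ripple(mag_hi, 1600.0))
```

So a greater Fs is still "better" in the usual senses; it just no longer satisfies the exact discrete identity that makes the critically sampled chirp's spectrum perfectly flat.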