I am a newcomer to signal processing. I saw that the $L^2$-norm of a signal is also used as its energy! How is this concept explained to those who work in pure math?
I lack time for this but if this really interests you, don't let anyone stop you at circuit analogies or the current answer - there is a satisfactory explanation. – OverLordGoldDragon Oct 25 '20 at 08:11
Hold on! $L^2$ norm is the square root of the sum of the squares of the samples. For energy, you'd use the square of the $L^2$ norm. – Olli Niemitalo Oct 25 '20 at 09:03
@OlliNiemitalo Correct. I'd say I ignored this but actually didn't notice. – OverLordGoldDragon Oct 25 '20 at 09:18
It would be good to clarify how the tag continuous-signals is associated with the question. – Olli Niemitalo Oct 25 '20 at 09:59
4 Answers
Yes, the square of the $L_2$ norm of a signal is also by definition its energy $\mathcal{E}_x$.
The concept of signal energy:
$$ \mathcal{E}_x = \int_{-\infty}^{ \infty } x(t)^2 dt\tag{1} $$
is fundamentally based on the concept of energy (or work) in physics, such as the kinetic energy of a particle with mass $m$ and velocity $v$, given by
$$ K = \frac{1}{2} m v^2 \tag{2}$$
There is also the concept of power, defined as the time rate of work $W(t)$:
$$ p(t) = \frac{dW(t)}{dt} \tag{3} $$
The relation between instantaneous power $p(t)$ and the energy is:
$$ \mathcal{E} = \int_{-\infty}^{\infty} p(t) dt \tag{4} $$
Electrical engineers ignore the mechanical roots, and rely on an electrical analog of energy as heat loss in an Ohmic resistor defined to be:
$$ \mathcal{E} = \int_{-\infty}^{\infty} p(t) dt \tag{5} $$
where $p(t)$ is the instantaneous electric power associated with a current $i(t)$ passing through a linear time-invariant resistor $R$, and is given by:
$$ p(t) = R \cdot i^2(t) \tag{6} $$
( $p(t) = v^2(t)/R $ is also an equivalent expression, based on Ohm's law $v(t) = R i(t)$)
Then the energy of the current, passing through a linear time-invariant system denoted by an Ohmic resistor $R$, is given by:
$$ \mathcal{E} = \int_{-\infty}^{\infty} R \cdot i^2(t) dt \tag{7}$$
Ignoring the resistor $R$ (or setting it to $R=1$), and replacing the current variable $i(t)$ with a general unitless signal $x(t)$, we arrive at the mathematical definition of signal energy:
$$ \mathcal{E} = \int_{-\infty}^{\infty} x^2(t) dt \tag{8}$$
That being said, in a parallel course, the study of normed linear (Hilbert) spaces also considers the $p$-th norm of a complex-valued signal:
$$ \|x\|_p = \left( \int_{-\infty}^{\infty} |x(t)|^p \, dt \right)^{1/p} \tag{9}$$
And you can see that the square of this norm for the case $p=2$ corresponds to the signal energy as defined in Eq. (8).
All of these can similarly be transferred to discrete-time domain.
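As a minimal numerical sketch of this correspondence in discrete time (my addition, using NumPy; the sample values are arbitrary), the energy is the sum of squared samples, i.e. the square of the $\ell_2$ norm:

```python
import numpy as np

# Discrete-time analogue of Eq. (8): energy = sum of squared samples.
# This equals the square of the l2 norm, the p = 2 case of Eq. (9).
x = np.array([1.0, -2.0, 3.0, 0.5])

energy = np.sum(np.abs(x) ** 2)      # discrete analogue of Eq. (8)
l2_norm = np.linalg.norm(x, ord=2)   # case p = 2 of Eq. (9)

print(energy)                            # 14.25
print(np.isclose(energy, l2_norm ** 2))  # True
```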
@LaurentDuval I stated it for a real signal but why not include the complex case ;-) – Fat32 Oct 28 '20 at 11:18
@Fat32 Since this question was asked, which is roughly when I first started with this network and signal processing. Hoped to finish a thing a biiit sooner. – OverLordGoldDragon Apr 05 '22 at 20:22
It's worth mentioning that energy and signal energy are not the same even though their formulas are similar; this can be confusing. Price variations have energy in $\small \$^2$ units, but no energy in joules. The "energy" of signals describing a physical quantity may be converted to J by an operation depending on the nature of the signal. E.g. the "energy" of DC current samples in $\small A^2$ is converted to joules by a multiplication by impedance Z, but the "energy" of a voltage signal in $\small V^2$ requires a division by Z. – mins May 09 '23 at 14:14
From physics, energy is a term often used as a quantitative property. In other words, energy is a quantity that is preserved under some actions, transformations, etc. In signal processing (where the physics vanishes), this often takes the shape of a sum or an integral of a squared quantity for real data, or of its squared modulus for complex data. We can write it symbolically for discrete or continuous time ($\cdot^H$ denotes the complex conjugate) as $\sum x[n]x^H[n]$ or $\int x(t)x^H(t)\,dt$. When they are well-defined (convergence, etc.), such quantities are mostly proportional to the square of some $L^2$ or $\ell^2$ norm. As said in other answers, energy and squared $L^2$ or $\ell^2$ norms are related by definition; they are at the center of complex Hilbert spaces.
Now, why are these concepts so important in signal processing? Because the linearity of systems is strongly linked to energy: minimizing an energy often results in linear equations, from simple averaging to generic convolution, with a special connection with Gaussian noises.
The crux of the squared norm's use in DSP is related to orthogonality and unitarity: in signal and image processing, we pretend that some representations can preserve the energy (exactly, up to a factor, or approximately), and be way more efficient for some processing methods: smoothing, adaptive filtering, separation, inversion, restoration, reconstruction, etc. Fourier, short-time Fourier, spectrograms, wavelets and others perform this energy conservation.
Lastly, energy preservation also plays a role in algorithmic stability.
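As a small illustration of the energy-preservation point above (my own sketch, assuming NumPy; the random signal is arbitrary), a unitary DFT conserves the squared $\ell^2$ norm:

```python
import numpy as np

# With orthonormal ("ortho") scaling, the DFT is a unitary operator,
# so the energy (squared l2 norm) is the same in both domains.
rng = np.random.default_rng(0)
x = rng.standard_normal(64)

X = np.fft.fft(x, norm="ortho")  # unitary DFT: scaled by 1/sqrt(N)

time_energy = np.sum(np.abs(x) ** 2)
freq_energy = np.sum(np.abs(X) ** 2)

print(np.isclose(time_energy, freq_energy))  # True
```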
Indeed I am looking for two points concerning this issue: 1) Why does taking an integral make sense to compute the energy? 2) Why the $L^2$-norm? I mean, why aren't others like the $L^p$-norms used?! – ABB Oct 26 '20 at 14:43
One usually considers the energy of the whole data, hence the sum or the integral. The latter is a kind of continuous sum (the integral sign looks like a big elongated S, for sum). The $L^2$ norm is quite common for many possible reasons; one prominent reason is that it is relatively easy to "minimize" or "bound". Yet other $L^p$ norms or quasinorms are increasingly used, because they better model some properties, or are now equipped with efficient algorithms, esp. in optimization. – Laurent Duval Oct 26 '20 at 15:29
How is this concept illustrated for those ones who are working in pure math.
I've never seen a pure mathematician need an illustration for a definition!
Really, the energy is defined as the sum of the squared samples (discrete time) or the integral of the squared signal (continuous time).
At that point, it's not a concept you have to apply, just a definition.
When leaving the math aspect of this and starting to care about the physicality:
This is compatible with the notion of power transported through a physical amplitude-changing phenomenon (like, say, a pressure wave in air, an electric voltage or current on a wire, an electric or magnetic field intensity, gravitational waves…): instantaneous power is proportional to the square of the amplitude, and energy is the integral of power over time.
Hence, that definition bridges the physical meaning of energy into signal processing!
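The power-to-energy bridge can be checked numerically; here is a hedged sketch (my addition, with an arbitrary test sinusoid): for $A\sin(2\pi f t)$, squaring gives the instantaneous power, and integrating over one period $T = 1/f$ gives the energy $A^2 T/2$:

```python
import numpy as np

# Instantaneous power is the squared amplitude; energy is its integral
# over time. For A*sin(2*pi*f*t) over one period T = 1/f, the exact
# energy is A**2 * T / 2. Amplitude and frequency are arbitrary choices.
A, f = 2.0, 5.0
T = 1.0 / f
t = np.linspace(0.0, T, 100_001)
x = A * np.sin(2 * np.pi * f * t)

p = x ** 2                                           # instantaneous power (unit load)
energy = np.sum((p[:-1] + p[1:]) / 2 * np.diff(t))   # trapezoidal integral

print(round(energy, 6))   # 0.4  (= A**2 * T / 2)
```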
It's not "just a definition", it makes perfect sense when explained properly, which I'm afraid your answer not only suggests can't be done, but falsely claims it to be axiomatic rather than emergent. – OverLordGoldDragon Oct 25 '20 at 08:08
@OverLordGoldDragon Well, I agree it makes sense and emerged from the physical reality, but then we just removed all the constant factors that it brings. Hence my compatibility and proportionality. But really, what would you expect me to do? I could introduce e.g. power over a resistor exposed to a voltage signal – but you're ignoring we're targeting pure mathematicians here. There's no physical pre-knowledge we could assume at this point. This is meant as a motivation, not a derivation, and the derivation becomes Really ugly once you start realizing that our signal processing – Marcus Müller Oct 25 '20 at 08:17
"math" model that we teach to undergrads assumes a lot of things that will be hard to put into an answer, but would probably hurt someone with more functional analysis in their blood than I have in their tummy. Let's start with integrability: we teach students that in an integral over an interval, removing or adding countably many points doesn't change the integral value. Cool, so how do we apply this to things that aren't functions, which we still use all the time to describe e.g. autocorrelation or frequency domain behaviour, namely tempered distributions? Um, how does something that – Marcus Müller Oct 25 '20 at 08:23
isn't wide-sense stationary and hence doesn't have a PSD still have a power / energy in frequency domain, then? And how in time domain? Because that's what applying a unitary operator like a properly scaled Fourier Transform to a Banach space like L²... I didn't feel confident writing all that down, and it really wouldn't serve as a motivation! $$,$$On the contrary, math books are typically following a structure that would suggest "Def: Energy is L² norm; math around a bit; Corollary: Energy as defined above is compatible with physical def of power; Proof." – Marcus Müller Oct 25 '20 at 08:28
That's the thing, it's not limited to circuits or other cliche examples, it's fundamental to sinusoids, and I've seen no satisfactory explanation on this wherever I looked so I figured it out myself. This explanation is so long (but necessary) that there's no way I'd do it just for sake of answering a random SE question, so I don't blame you for being brief, but I sure as hell wouldn't claim "this is all there's to it", as that's blatantly false and teaches complacency with shallow understanding. I'd make the answer's shortcomings very explicit. – OverLordGoldDragon Oct 25 '20 at 08:33
There's but one grain of truth to your "it's defined" statement, in that if the source isn't a sum of sinusoids to begin with (details), then a reinterpretation is needed, and sometimes it's rather meaningless but we just "treat as if" - in that sense it's a definition. But this is something one'd append as a footnote, rather than present as THE response (and suggest there's nothing more). If you just added something like "one way to think of it is" and "but there's more" (you get the idea), I wouldn't downvote. – OverLordGoldDragon Oct 25 '20 at 08:35
Thanks for the kind words! But the question really becomes, if I say "one way to think about it", does it still serve as motivation? I think your sinusoid aspect is valid, and you could write an answer saying "for signals composed of..", and I'd frankly upvote that, most likely. – Marcus Müller Oct 25 '20 at 08:43
I think it's very important to not suggest "there's not much more to it than what I wrote in this answer", and a brief clarification like that works, but in this specific case I'd explicitly say something along the lines of "there's much more to it". -- Also, I don't mean to batter anyone over a 'snack' question, but the other responses I've seen are of similar nature and discouraged my own early pondering, and I'm not about to rant on all the years-old Q&A's so I focus on the present. – OverLordGoldDragon Oct 25 '20 at 08:52
Possibly off-topic but in order to provide context, i.e. Parseval's identity:
I think a more general outlook should be pointed out. It's applicable in "reality" because we believe that energy is conserved irrespective of description, and there are equivalent relations for any of the linear transforms/representations: Laplace, Mellin, Fourier, discrete, etc.
The use of the $L_2$ norm is a reflection of this. Basically, these are weighted integrals/sums of coefficients/functions, so we need $L_2$ convergence/formulations to reach this conservation.
"More generally, Parseval's identity holds in any inner-product space, "
https://en.wikipedia.org/wiki/Parseval%27s_identity
A little sketchy and abstract but somewhat informative.
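To make the quoted statement about general inner-product spaces concrete, here is a small sketch (my addition; the basis is a random orthonormal one, not any specific transform): the sum of squared coefficients with respect to any orthonormal basis equals the squared norm of the vector.

```python
import numpy as np

# Parseval's identity in a finite-dimensional inner-product space:
# for ANY orthonormal basis {q_k}, sum_k <x, q_k>^2 = ||x||^2.
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))  # columns: orthonormal basis
x = rng.standard_normal(8)

coeffs = Q.T @ x  # inner products <x, q_k>

print(np.isclose(np.sum(coeffs ** 2), np.sum(x ** 2)))  # True
```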