What is additive noise? What does "additive" actually mean?
I have tried to search on the internet, but the only answer I get is that the noise gets added, therefore it is additive. That makes me think that noise is not destructive in nature. Am I right?
> noises gets added therefore it is additive

correct!

> which make me think that noises are not destructive

incorrect :(
A simple thought experiment: you flip a fair coin $X$ (Heads = $-1$, Tails = $+1$) and tell me the result. The entropy here is 1 bit, i.e., the (expected) information $I(X=\xi) = -\log_2 \left[P(X=\xi)\right]$ of each outcome is 1 bit.
Then there's additive noise $N$ that takes one of the values $\{-2,0,+2\}$ with equal probability.
When you receive a -1, you can't know whether the coin was Head and there was 0 noise, or the coin was Tail and there was -2 noise. Both are equally likely!¹
So, your additive noise is absolutely able to destroy information and hence is very destructive to your signal.
If you're more coming from a wireless communications background: Your $X\in\{-1,+1\}$ can be interpreted as BPSK. Now you see how even benign Gaussian noise destroys your reception when its sign is the opposite of your transmit symbol!
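A quick Monte-Carlo sketch of that BPSK picture (the symbol count and the two noise levels below are illustrative choices, not from the answer): a hard decision errs exactly when the Gaussian noise flips the sign of the transmitted symbol, so the error rate grows with the noise standard deviation.

```python
import random

random.seed(0)

def bpsk_error_rate(sigma, n_symbols=100_000):
    """Simulate BPSK over an additive Gaussian noise channel."""
    errors = 0
    for _ in range(n_symbols):
        x = random.choice((-1, 1))         # transmit symbol
        y = x + random.gauss(0.0, sigma)   # additive Gaussian noise
        x_hat = 1 if y >= 0 else -1        # hard decision at the receiver
        errors += (x_hat != x)
    return errors / n_symbols

low = bpsk_error_rate(sigma=0.5)   # mild noise: sign flips are rare
high = bpsk_error_rate(sigma=2.0)  # strong noise: sign flips are common
print(low, high)
```

With mild noise the error rate stays in the low percent range; with $\sigma = 2$ roughly a third of the symbols are flipped, matching the intuition that noise with the "wrong" sign destroys the reception.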
¹ We can even formalize that. Since $X$ (2 options) and $N$ (3 options) are independent, and each of them is uniformly distributed, there are six possible combinations, each of them equally likely:
X | N | Y = X+N
------------------
-1 | -2 | -3
-1 | 0 | -1
-1 | +2 | +1
+1 | -2 | -1
+1 | 0 | +1
+1 | +2 | +3
Thus, there are four possible outcomes for the sum of signal and additive noise: $-3$, $-1$, $+1$ and $+3$. The outcomes $\pm 3$ (probability 1/3 in total) identify the coin unambiguously, while the outcomes $\pm 1$ (probability 2/3) tell us nothing about it. The expected information we get out of this channel is therefore 1/3 · 1 bit + 2/3 · 0 bit = 1/3 bit, whereas we put in a full 1 bit! That's a very destructive additive noise channel.
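The six-row table can be turned into a quick mutual-information check (a sketch in Python; the enumeration mirrors the six equally likely $(X, N)$ combinations):

```python
from fractions import Fraction
from itertools import product
from math import log2

xs = (-1, 1)        # coin outcomes
ns = (-2, 0, 2)     # additive noise values
p = Fraction(1, 6)  # each (X, N) combination equally likely

# Joint distribution of (X, Y) with Y = X + N
joint = {}
for x, n in product(xs, ns):
    joint[(x, x + n)] = joint.get((x, x + n), Fraction(0)) + p

px = {x: Fraction(1, 2) for x in xs}       # marginal of X
py = {}                                    # marginal of Y
for (x, y), pr in joint.items():
    py[y] = py.get(y, Fraction(0)) + pr

# Mutual information I(X; Y) = sum p(x,y) log2( p(x,y) / (p(x) p(y)) )
mutual_info = sum(
    float(pr) * log2(pr / (px[x] * py[y]))
    for (x, y), pr in joint.items()
)
print(mutual_info)  # ≈ 0.3333, i.e. 1/3 bit
```

Only the $(x, y)$ pairs with $y = \pm 3$ contribute, each adding $\tfrac16 \cdot 1$ bit, which reproduces the 1/3 bit computed above.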
To complement Marcus' answer:
Say you have a resistor (nothing is connected to it) at a certain temperature above absolute zero. The heat causes electrons to move around at random, creating a random current. This current through the resistor creates a random voltage.
If you connect a sensitive-enough voltmeter to the resistor, you can detect this voltage -- but, in practice, you have to be careful not to measure the random currents inside the voltmeter itself!
Now, imagine you connect a signal source to one end of the resistor, and you ground the other end. The source may be, for example, an antenna. The signal source will create a voltage across the resistor.
Now this is the key part: the voltage created by the source adds to the random voltage caused by heat. This happens because a resistor is linear, in the sense that the contributions of all currents applied to it simply add (superposition). That's just the way a resistor works (I don't know if there is a more fundamental explanation for this).
In brief, if the random noise is called $n(t)$, and the signal source is called $v(t)$, then the voltage across the resistor is $v(t)+n(t)$ -- and that is why $n(t)$ is called additive noise.
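For a rough sense of scale (not stated in the answer above, but standard physics): the open-circuit thermal noise voltage of a resistor follows the Johnson-Nyquist relation $v_\mathrm{rms} = \sqrt{4 k_B T R \,\Delta f}$. The component values below are illustrative assumptions:

```python
import math

# Johnson-Nyquist thermal noise of an open-circuit resistor:
# v_rms = sqrt(4 * k_B * T * R * bandwidth)
# The values below are illustrative, not from the answer above.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K
R = 50.0             # resistance, ohm
bandwidth = 1e6      # measurement bandwidth, Hz (1 MHz)

v_rms = math.sqrt(4 * k_B * T * R * bandwidth)
print(f"{v_rms * 1e6:.3f} uV rms")  # on the order of 1 uV
```

This is why the voltmeter in the example has to be so sensitive: for a 50 Ω resistor at room temperature measured over 1 MHz, the noise voltage is below a microvolt.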
Notes:
1. This example is about thermal noise; there are other kinds of noise, most of them also additive.
2. Since the noise $n(t)$ is the cumulative effect of billions of electrons moving at random, the central limit theorem applies, and the probability density function of the noise will be Gaussian.
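That central-limit argument can be sketched numerically. The "electron push" model below is a toy assumption (uniform, independent micro-contributions), not physics from the answer, but the sums it produces already look Gaussian:

```python
import random
import statistics

random.seed(1)

def noise_sample(n_electrons=1000):
    """One sample of n(t): the sum of many tiny independent random pushes."""
    return sum(random.uniform(-1, 1) for _ in range(n_electrons))

samples = [noise_sample() for _ in range(5000)]

mean = statistics.mean(samples)
std = statistics.stdev(samples)
# For a Gaussian, about 68% of samples fall within one standard
# deviation of the mean.
within_1sigma = sum(abs(s - mean) < std for s in samples) / len(samples)
print(mean, std, within_1sigma)
```

Each uniform push has variance 1/3, so 1000 of them give a standard deviation near $\sqrt{1000/3} \approx 18.3$, and the one-sigma fraction lands near the Gaussian value of 0.68.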