
Could somebody please explain why we step up the voltage when transmitting electricity over long distances? I have read it is to reduce energy losses. Why does a high voltage not result in high energy losses? Could you also show it using the $P = v^2/R$ equation, as that's where my difficulties lie? If the voltage is really high, would the power losses not be bigger?

Jake
  • 503
  • related: https://physics.stackexchange.com/q/65147/ – CDCM Aug 24 '17 at 00:51
  • I agree with @CDCM ... several people have already answered this question, including myself. – David White Aug 24 '17 at 00:56
  • Sorry guys. They don't really answer it or show how I am using the $P = v^2/R$ equation wrong, which is the real flaw I'm having. – Jake Aug 24 '17 at 01:08
  • The voltage isn't the total voltage between the transmission line and ground, but the voltage drop from the start to the end of the line. That's $V_L$ in the duplicate question. – Nick T Aug 24 '17 at 03:01

1 Answer


In your expression, $v$ is the voltage drop across the cables, which will decrease as the resistance of the cables $R$ is decreased.

In your expression for the power loss the voltage term is squared, so the numerator $v^2$ decreases faster than the denominator $R$ as the resistance decreases; thus the power loss in the cables decreases as the resistance of the cables decreases.

If the current is $I$ then the voltage drop across the cables is $v=RI$, and so the power loss is $\dfrac{\left ( RI\right )^2}{R}=I^2R$, which is perhaps a more familiar form?
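To make this concrete, here is a small numerical sketch (the delivered power, line resistance, and transmission voltages are made-up illustrative values, not from the question). It fixes the power delivered to the load and compares the line loss, written both as $v^2/R$ with $v = IR$ and as $I^2R$, at a low and a high transmission voltage.

```python
# Illustrative sketch: line loss for a fixed delivered power at two
# transmission voltages (all numbers are hypothetical).

P_delivered = 1e6   # 1 MW sent to the load
R_line = 10.0       # total resistance of the transmission cables, in ohms

for V_transmission in (10e3, 100e3):          # 10 kV vs 100 kV
    I = P_delivered / V_transmission          # current needed for the same power
    v_drop = I * R_line                       # voltage drop across the cables
    loss = v_drop**2 / R_line                 # the questioner's v^2/R ...
    assert abs(loss - I**2 * R_line) < 1e-6   # ... is the same as I^2 R
    print(f"{V_transmission/1e3:.0f} kV: I = {I:.0f} A, loss = {loss/1e3:.1f} kW")

# At 10 kV the loss is 100 kW; at 100 kV it is only 1 kW. Raising the
# transmission voltage tenfold cuts the current tenfold and the I^2 R
# loss a hundredfold.
```

The point to notice is that the $v$ in $v^2/R$ is the small drop along the cables (1000 V or 100 V in this sketch), not the full transmission voltage, which is why a higher transmission voltage gives a smaller, not larger, loss.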

Farcher
  • 95,680