
Joule's law of heating states that for a conductor the heat loss is given by $H = i^2 R t$. Couldn't we substitute $V = iR$ and show that $H = (V^2/R)\,t$?

If this is so, then when transmitting power across large distances we should keep the voltage low, which seems to contradict reality. Where am I going wrong?

JjJot
    Related, possible duplicate: https://physics.stackexchange.com/q/145301/123208 – PM 2Ring Apr 24 '20 at 15:59
  • @PM2Ring Yep that's my problem, didn't see it in my related questions feed... Thanks for pointing out – JjJot Apr 25 '20 at 13:19
  • No worries. It took me a while to find it, so it doesn't surprise me that it didn't turn up in the automatic search. – PM 2Ring Apr 25 '20 at 13:21

1 Answer


Notice that the equation you derived is essentially power times time, giving the energy lost as heat, $H$. For simplicity, assume the circuit is DC; the power dissipated is $P = V^2/R$ or $P = I^2 R$.
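Written out explicitly (with $V_R$ introduced here purely as a label for the voltage drop across the resistance $R$ itself, to keep it distinct from the transmission voltage):

$$H = Pt, \qquad P = I^2 R = \frac{V_R^2}{R}, \qquad V_R = IR.$$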

A transmission line has a substantial resistance. The subtlety is that the $V$ in the first equation is the voltage drop across the line itself, not the voltage at which the power is transmitted, so the second equation, $P = I^2 R$, is the one to apply here. For a fixed amount of power delivered, transmitting at a higher voltage means a smaller current flows through the line, and the $I^2 R$ heating loss falls with the square of that current. If the line instead carried a high current (i.e., transmitted at a low voltage), far more power would be converted to heat.
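As a quick numerical sketch of this comparison (the numbers below, a 1 MW load, a 0.5 Ω line, and transmission at 1 kV versus 100 kV, are made-up illustrative values, and the current is approximated as simply $P/V$):

```python
# Illustrative comparison of line losses when delivering the same power
# at a low vs. a high transmission voltage. All numbers are assumptions
# chosen for the example; the current is approximated as I = P / V.

P_delivered = 1e6   # power to deliver, in watts (1 MW)
R_line = 0.5        # resistance of the transmission line, in ohms

for V_transmission in (1e3, 100e3):      # transmit at 1 kV vs. 100 kV
    I = P_delivered / V_transmission     # current drawn through the line
    P_loss = I**2 * R_line               # Joule heating in the line, I^2 R
    print(f"{V_transmission / 1e3:6.0f} kV: I = {I:7.1f} A, "
          f"line loss = {P_loss:,.0f} W")
```

With these assumed numbers the 100 kV case loses only about 50 W in the line, while the 1 kV case loses about 500 kW, which is why real grids step the voltage up before long-distance transmission.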

Philip.P