
First law: Every body remains in a state of rest or uniform motion (constant velocity) unless it is acted upon by an external unbalanced force. This means that in the absence of a non-zero net force, the center of mass of a body either remains at rest, or moves at a constant speed in a straight line.
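In symbols (writing $\vec v_{\rm cm}$ for the velocity of the centre of mass, a notation not used in the quoted statement), this says

$$\vec F_{\rm net} = 0 \quad\Longrightarrow\quad \frac{d\vec v_{\rm cm}}{dt} = 0, \qquad \vec v_{\rm cm} = \text{constant}.$$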

Doesn't the law of increasing entropy affect all objects though, since they are all in the closed system of the universe at large, and therefore they are all subject to slowing down, regardless of the containing medium, given enough time?

I guess what I'm curious about is, can there ever be a body that will remain at uniform motion or uniform rest given that entropy must increase?

3 Answers


Yes, it will (in the classical picture) continue forever, even with its entropy increasing. The entropy increase just means that some of the potential energy within the body will turn into heat (=kinetic energy); however, the center-of-mass motion is unaffected. Entropy increase says nothing about slowing down; rather the opposite.
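A minimal numerical sketch of this point, assuming two equal point masses joined by an ideal spring with no external force (masses, spring constant and initial stretch are arbitrary choices): internal potential energy turns into internal kinetic energy and back, while the centre-of-mass velocity never changes.

```python
import numpy as np

# Two equal masses joined by a spring; no external force acts on the pair.
m, k, L0, dt = 1.0, 10.0, 1.0, 1e-3
x = np.array([0.0, 1.5])   # spring stretched beyond its rest length: stored potential energy
v = np.array([1.0, 1.0])   # the whole body drifting at velocity 1

for _ in range(50_000):
    stretch = x[1] - x[0] - L0
    a = np.array([k * stretch / m, -k * stretch / m])  # internal forces only; they sum to zero
    v += a * dt        # potential energy <-> kinetic energy of internal vibration
    x += v * dt

v_cm = v.mean()        # equal masses, so the centre-of-mass velocity is just the mean
print(f"centre-of-mass velocity after the run: {v_cm:.6f}")   # still 1.000000
```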

TROLLHUNTER
  • @kakemonsteret, "the entropy increase just means that some of the potential energy within the body will turn into heat (=kinetic energy)" ??? totally wrong! –  Feb 01 '11 at 16:27
  • @sb1 WHAT is wrong. – TROLLHUNTER Feb 01 '11 at 16:28
  • conversion of potential energy to kinetic energy within the system has nothing to do with entropy increase! –  Feb 01 '11 at 16:30
  • An ideal gas does not have any potential energy, yet its entropy can and does increase. –  Feb 01 '11 at 16:31
  • Yes it does, "Entropy is a thermodynamic property that is a measure of the energy not available for useful work in a thermodynamic process," -wik – TROLLHUNTER Feb 01 '11 at 16:31
  • Any gas of interacting particles has potential energy. – TROLLHUNTER Feb 01 '11 at 16:32
  • exactly! it is the energy which is unavailable for extraction. nothing to do with potential energy. –  Feb 01 '11 at 16:33
  • Potential energy is available for extraction? – TROLLHUNTER Feb 01 '11 at 16:34
  • If you have a ball and drop it, some of that potential energy can be converted into useful energy – TROLLHUNTER Feb 01 '11 at 16:35
  • An ideal gas does not have any potential energy. For ordinary real gases, there are small van der Waals forces between particles, so real gases have a little potential energy. But again, that has nothing to do with the increase in entropy. –  Feb 01 '11 at 16:35
  • Sorry, you are too confused. Let's stop the discussion. –  Feb 01 '11 at 16:36
  • Why not? Potential energy = energy. – TROLLHUNTER Feb 01 '11 at 16:37
  • @sb1 I am sorry that you failed to produce a counterargument, and chose to run off – TROLLHUNTER Feb 01 '11 at 16:39
  • @kakemonsteret I am sorry but I stopped the discussion because I found that you are hopelessly confused! –  Feb 01 '11 at 16:41
  • @kakemonsteret I don't think that the language you're using is likely to prompt a further useful response from @sb1 ... I would advise against that train of thought. Even tho I find the exchange interesting. – jcolebrand Feb 01 '11 at 16:42
  • I have no intent to discuss with someone who does not even know the basics to discuss anything about physics. –  Feb 01 '11 at 16:43
  • @sb1 No, you are confused; ideal gases don't exist. – TROLLHUNTER Feb 01 '11 at 16:43
  • @kakemonsteret you seem to have a misunderstanding about the concept of entropy. @sb1 is correct, especially with the example of a non-interacting ideal gas, where there is no potential energy to convert into kinetic. Entropy is a measure of the redundancy in any possible description of a system. For example, if I have a bus with 20 (or $x$) seats and 5 ($y$) passengers, then I have $N = {}^x C_y$ possible different ways (or "states") to seat them (for simplicity assuming that passengers are indistinguishable). Assuming each state is equally likely to be occupied (i.e. with probability $1/N$) ... –  Feb 01 '11 at 16:45
  • ... we get entropy $S = -\sum_{k=1}^{N} \frac{1}{N} \ln(1/N) = \ln(N)$. More generally, if the probability of a state $k$ being occupied is $p_k$, then the entropy is $S = -\sum_k p_k \ln(p_k)$, where the sum is over all state labels $k$. Note that one needs no notion of potential energy at this juncture. (A quick numeric check of this seating example appears after this comment thread.) –  Feb 01 '11 at 16:49
  • How exactly do you want to increase the entropy of anything without converting potential energy into kinetic? That is impossible. – TROLLHUNTER Feb 01 '11 at 16:52
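A quick numeric check of the seating example from the comments above, taking the 20-seat, 5-passenger numbers as given:

```python
from math import comb, log

seats, passengers = 20, 5                      # the commenter's x and y
N = comb(seats, passengers)                    # number of equally likely seating "states"
S_uniform = log(N)                             # S = ln(N) for a uniform distribution

# General form S = -sum_k p_k ln(p_k), evaluated here with uniform p_k = 1/N
S_general = -sum((1 / N) * log(1 / N) for _ in range(N))

print(N, S_uniform, S_general)                 # 15504, and about 9.649 twice
```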

This question seems to me harder and more interesting than either of the other answers takes into account. A) GR has an analogue of Newton's First Law: motion along geodesics. B) The OP doesn't wish to consider the Universe as a whole. @Jerry reads the OP in reverse. The OP's point is: since none of the usual bodies we study is the whole Universe, none of them is a closed system, so friction and dissipation effects exist; does this imply that the Second Law of Thermodynamics trumps Newton's First Law, considering either of them as approximations governing the evolution of the body in question over a very long time-scale?

Because the time-scale is very long (the phrase used in the OP is «given enough time»), it is not clear to me that we can suppress the interactions of the body in question with the rest of the Universe sufficiently. The numbers must be crunched at this point...

Okay, here is the answer, although not the complete answer: for a fixed body that we wish to study, no matter how well-insulated and isolated it is (in reality, therefore, it cannot be perfectly insulated and isolated), eventually the effects of dissipation into the environment will make Newton's First Law a poor approximation for the behaviour of that body, because its hypotheses will not be fulfilled. The crucial hypothesis is the absence of an external force. But dissipation is in fact due to fluctuating external forces. As the Universe as a whole approaches statistical equilibrium, every body in it will behave like a Brownian particle, with a random, non-differentiable motion. In fact, even worse, the «body» will cease to be a separate body before this point is reached... so the unspoken but implicit hypotheses of Newton's Laws will also break down.
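A toy illustration of «fluctuating external forces», assuming a Langevin model $dv = -\gamma v\,dt + \sigma\,dW$ with arbitrary coefficients: an isolated particle keeps its velocity exactly, while one coupled to a fluctuating environment forgets it.

```python
import numpy as np

rng = np.random.default_rng(0)
gamma, sigma, dt, n_steps = 0.5, 1.0, 1e-3, 100_000   # arbitrary damping, noise, step, duration

v_free = 1.0     # isolated body: no external force, velocity never changes
v_brown = 1.0    # same body coupled to a fluctuating environment

for _ in range(n_steps):
    # Euler-Maruyama step of the Langevin equation dv = -gamma*v*dt + sigma*dW
    v_brown += -gamma * v_brown * dt + sigma * np.sqrt(dt) * rng.standard_normal()

print(f"free particle velocity:       {v_free}")
print(f"'Brownian' particle velocity: {v_brown:.3f}  (memory of the initial value is lost)")
```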

At the moment, the Universe is far from statistical equilibrium, and it is unlikely to ever reach it if theories of the «big crunch» are correct, but what I sketched above might hold good for local parts, given enough time. So the complete answer would depend on: how big is this body? Will it disintegrate before the end of the Universe or not? Is there a local region in which it will remain for long enough for statistical equilibrium to be reached? Or if not that, something else?

I.e., the question is about the old idea of the «heat death» of the Universe, but in light of present-day cosmology, the old Boltzmannian-Nietzschean conclusions are not well established. And the question is about partial «heat degenerations» of a local part of the Universe: degenerations, not really deaths, but bad enough to make Newton's Laws imperfect tools for studying the body in question. Cf. Feynman's discussion of the «ratchet and pawl» in his Physics Lectures, and also recent work on Szilard engines.

  • I probably should have in some way clarified "external factor". I now feel like maybe this is a vague question. I will have to revisit later today. – jcolebrand Feb 17 '14 at 16:04

If you ignore the microscopic explanation of entropy, entropy is just an internal state variable of a system, on par with the system's volume or the number of particles in the system. If you have a gas with a fixed entropy (let's store it in a vacuum tube so it doesn't escape, and let's give the tube infinite insulation so it doesn't leak any heat) out in deep space, and you throw it, it will just happily trail off in a straight line forever with constant entropy, volume, and number of particles.

You get changes in entropy and whatnot only when the system in question interacts with another system or with its environment. But it really is best to think of these things, at least on a macroscopic scale, as internal degrees of freedom of the system.
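A sketch of that "internal state variable" view, assuming the gas in the tube is a monatomic ideal gas so the standard Sackur-Tetrode expression $S(U, V, N)$ applies (the argon-like mass, litre volume and ~300 K energy are arbitrary choices): the centre-of-mass velocity of the tube does not appear anywhere, so throwing the tube leaves $S$ unchanged.

```python
import numpy as np

k_B, h, m = 1.380649e-23, 6.62607015e-34, 6.63e-26   # J/K, J s, kg (roughly one argon atom)

def sackur_tetrode(U, V, N):
    """Entropy of a monatomic ideal gas as a function of internal energy, volume, particle number."""
    return N * k_B * (np.log((V / N) * (4 * np.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

N = 1e22
U = 1.5 * N * k_B * 300.0      # internal (thermal) energy at about 300 K
V = 1e-3                       # a one-litre tube

# S depends only on (U, V, N); throwing the tube changes the centre-of-mass kinetic
# energy, not the internal energy, so the entropy is the same before and after.
print(f"S = {sackur_tetrode(U, V, N):.4e} J/K")
```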

Zo the Relativist
  • ??? Entropy of a closed system always increases – TROLLHUNTER Feb 01 '11 at 15:58
  • @kakemonsteret: No: the entropy of a closed system must always either increase or remain constant. The Second Law reads $\Delta S \geq 0$, not $\Delta S > 0$. In particular, a system in equilibrium maintains constant entropy. – Zo the Relativist Feb 01 '11 at 16:11
  • barring fluctuations... – Marek Feb 01 '11 at 18:34
  • @Marek: sure, but at that point, the second law isn't being obeyed anyway. That is also why I say to ignore the microscopic explanation of entropy. – Zo the Relativist Feb 01 '11 at 21:36
  • @JerrySchirmer This answer is correct physically but neglects the point of the question. The OP takes it as given that nothing is precisely a closed system except the Universe as a whole. «all objects though, since they are all in the closed system of the universe at large, and therefore they are all subject to slowing down...» so instead of fighting the hypo we should address the physical situation envisaged in the OP: non-closed systems. Imperfect vacua... – joseph f. johnson Feb 14 '14 at 15:38
  • @josephf.johnson: before you even get close to the situation where you're concerned about the whole universe, you have to abandon the notion of absolute velocity, so I don't see a way in which the question makes sense in the absolute sense. – Zo the Relativist Feb 14 '14 at 18:30
  • @JerrySchirmer The OP is asking about the validity of the Law for non-closed systems. Look at the quote. The OP wants to know if «all objects» really are «slowing down» because of entropy increase. (Because they are not the whole Universe and so are not closed systems). The question is hard because neither Newton's Laws nor the Second Law of Thermodynamics apply except to closed systems, but the OP is asking about open systems. – joseph f. johnson Feb 15 '14 at 19:03
  • @josephf.johnson: there are certainly cases where you can turn off the interactions to sufficient precision to be far beyond measurability. And Newton's laws give way to general relativity far before you start considering the universe as a whole. – Zo the Relativist Feb 15 '14 at 19:06