3

In "Concepts in Thermal Physics" (second edition) by Blundell and Blundell, temperature is defined using the following relation: $\frac{1}{k_BT}=\frac{\mathrm{d}\ln(\Omega)}{\mathrm{d}E}$. I am wondering how this relationship between T, E, and $\Omega$ came to be and how it fits in with the other definitions of temperature.

  • Can you clarify what other definitions of temperature you are referring to? – user1379857 Apr 15 '21 at 20:39
  • see: https://physics.stackexchange.com/questions/269538/proving-that-the-boltzmann-entropy-is-equal-to-the-thermodynamic-entropy/625248#625248 – ratsalad Apr 16 '21 at 21:42

3 Answers

1

It's the usual definition. The Boltzmann entropy, valid when the microstates are equiprobable, is $S = k_B \ln\Omega$. From thermodynamics we know that $\frac{1}{T} = \frac{dS}{dE}$; substituting the Boltzmann entropy gives $\frac{1}{k_B T} = \frac{d\ln\Omega}{dE}$.
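As a quick numerical sanity check (my own illustration, not part of the original answer), this definition can be tested on an Einstein solid, a toy model whose multiplicity $\Omega(N, q) = \binom{q+N-1}{q}$ for $N$ oscillators sharing $q$ energy quanta is known in closed form. In units where $k_B = 1$ and the energy quantum is 1, a finite difference of $\ln\Omega$ should reproduce the Stirling-approximation result $1/T = \ln(1 + N/q)$:

```python
from math import comb, log

# Einstein solid: N oscillators sharing q energy quanta (E = q * eps).
# Multiplicity: Omega(N, q) = C(q + N - 1, q).
# Units: k_B = 1 and eps = 1, so 1/T = d(ln Omega)/dE = d(ln Omega)/dq.

def ln_omega(N, q):
    return log(comb(q + N - 1, q))

N, q = 300, 1000

# Central finite difference for d(ln Omega)/dE at E = q:
beta_numeric = (ln_omega(N, q + 1) - ln_omega(N, q - 1)) / 2

# Stirling-approximation result for comparison: 1/T = ln(1 + N/q)
beta_analytic = log(1 + N / q)

print(beta_numeric, beta_analytic)  # the two values closely agree
```

For large $N$ and $q$ the discrete derivative and the analytic estimate match to a few parts in a thousand, which is all the hedge-free content of the definition: temperature measures how fast the number of accessible microstates grows with energy.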

DanielC
  • 4,333
Mark_Bell
  • 896
  • Technically, I believe the expression from continuum thermodynamics is: $$\frac{1}{T}=\frac{\partial S}{\partial E}\biggr )_{N,V}$$ where N is the number of particles and V is the volume. See http://www.hyperphysics.de/hyperphysics/hbase/thermo/temper2.html#c1 – Bob D Apr 15 '21 at 21:08
  • Yes, if the system is isolated, then $N$ and $V$ are constant. The relation I wrote follows from the dependence $E(S, V, N)$. – Mark_Bell Apr 15 '21 at 21:17
  • So you are saying $$\frac{1}{T}=\frac{dS}{dE}$$ is for a non-isolated system? – Bob D Apr 15 '21 at 21:22
  • I meant that I omitted in my notation that the derivative is computed fixing N and V. $ dE = \frac{\partial E}{\partial S} dS + \frac{\partial E}{ \partial V} dV + \frac{\partial E}{\partial N} dN$. Or, writing $dS = \frac{ \delta Q_{rev}}{T}$ and with $dE = \delta Q - \delta W = TdS - \delta W$, and without work: $dE = TdS$ – Mark_Bell Apr 16 '21 at 00:19
  • So then it should be the partial derivative of S with respect to E holding N and V constant, correct? – Bob D Apr 16 '21 at 23:24
1

This connection follows from both thermodynamics and the statistical microcanonical ensemble.

From thermodynamics, the entropy is defined by $dS =\frac{dQ}{T}$. Using the first law of thermodynamics: $$ \tag{1} dS = \frac{dQ}{T} = \frac{1}{T} \left[ dU + PdV -\mu dN \right] $$ Then, treating entropy as a state function $S(U,V,N)$, the chain rule of differentiation gives: $$\tag{2} dS = \frac{\partial S}{\partial U}\Big\vert_{N,V} dU + \frac{\partial S}{\partial V}\Big\vert_{U,N}dV +\frac{\partial S}{\partial N}\Big\vert_{U,V} dN. $$

Comparing Eq. (1) and Eq. (2), we can read off the partial derivatives of the entropy: \begin{align} \frac{1}{T} &= \frac{\partial S}{\partial U}\Big\vert_{N,V}\tag{3}\\ \frac{P}{T} &=\frac{\partial S}{\partial V}\Big\vert_{U,N} \\ \frac{\mu}{T} &= -\frac{\partial S}{\partial N}\Big\vert_{U,V} \end{align}
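As a concrete check of Eq. (3) (an illustration added here, not part of the original answer), one can apply it to the Sackur–Tetrode entropy of a monatomic ideal gas:

$$ S(U,V,N) = N k_B \left[\ln\!\left(\frac{V}{N}\left(\frac{4\pi m U}{3 N h^2}\right)^{3/2}\right) + \frac{5}{2}\right] \quad\Longrightarrow\quad \frac{1}{T} = \frac{\partial S}{\partial U}\Big\vert_{N,V} = \frac{3 N k_B}{2U}, $$

since only the $\frac{3}{2} N k_B \ln U$ term depends on $U$. This recovers the familiar $U = \frac{3}{2} N k_B T$ of the monatomic ideal gas.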


Then, from the statistical microcanonical ensemble scheme, the total number of micro-states $\Omega$ is calculated as function of $\Omega(E, V, N)$, where $E = \sum_i^N \epsilon_i$. The equilibrium condition is assume by maximum $\Omega(E, V, N)$ with a constrain that $E = \sum_i^N \epsilon_i = U$, where $U$ is the given internal energy of the thermal dynamics.

Under the equilibrium condition, the entropy is defined by the Boltzmann formula: $$ \tag{4} S = S(U, V, N) = k_B \ln \Omega(E=U, V, N) $$

Therefore, we maximize $\ln \Omega(E, V, N)$ with the constraint $E = U$, using a Lagrange multiplier $\lambda$: \begin{align} \frac{\partial }{\partial E} &\left\{ \ln\Omega(E, V, N) -\lambda (E-U) \right\}=0\\ \Longrightarrow &\,\,\,\frac{\partial \ln\Omega(E, V, N) }{\partial E}\Big\vert_{E=U} - \lambda = 0 \tag{5} \end{align}

In Eq. (5) the system is in thermal equilibrium ($E$ is now replaced by $U$) and $S = k_B \ln \Omega$, so comparing Eq. (5) with the thermodynamic relation Eq. (3) identifies the multiplier:

$$ \lambda = \frac{1}{k_B T}. $$

We can then rewrite Eq. (5) as: $$ \frac{\partial \ln\Omega(E, V, N) }{\partial E}\Big\vert_{E=U}=\frac{1}{k_B} \frac{\partial S }{\partial U} = \frac{1}{k_B T} $$

ytlu
  • 4,196
0

Here is the argument for why the statistical entropy has this form. In thermodynamics, temperature is defined as $\frac{1}{T}=\frac{\partial S}{\partial E}$, where $E$ is the "energy" of the system and $S$ is the "entropy". The partial derivative is taken under the condition that all other extensive quantities, like volume and the number of particles, are held constant. The "entropy" must reach a maximum at the equilibrium state of the system (given that energy, volume, and the number of particles are held constant).

Statistical mechanics tries to fit into the definitions of thermodynamics, and one can indeed show that a classical system, described by a probability distribution over its microscopic states, will, in the limit of vanishingly small (and energy-preserving) external noise, distribute uniformly over all available states with the same energy.

For something to be a thermodynamic system, entropy must reach its maximum on the equilibrium distribution, and since we know that the equilibrium distribution is the "maximally spread" one (i.e. uniform over all of the available phase space), entropy should be "something that increases as the probability distribution takes up a larger part of the phase space". What fixes $S=k_B\ln\Omega$ (the logarithm, up to the conventional constant $k_B$) is the requirement that the total entropy of non-interacting systems be additive (for two non-interacting systems $\Omega_{total}=\Omega_1\Omega_2$, and therefore $S_{tot}=S_1+S_2$). Plugging this entropy into the definition of temperature, we obtain the desired connection between the statistical mechanics of classical systems and thermodynamics: $$ \frac{1}{k_B T}=\frac{\partial \ln{\Omega}}{\partial E} $$
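The additivity and maximum-entropy claims above are easy to see numerically. Below is a minimal sketch (my own, using the Einstein-solid multiplicity $\Omega(N,q)=\binom{q+N-1}{q}$ as an assumed toy model): two non-interacting subsystems share a fixed number of energy quanta, the most probable split maximizes $\ln\Omega_{total} = \ln\Omega_1 + \ln\Omega_2$, and at that split the two finite-difference values of $\partial\ln\Omega/\partial E$, i.e. the two temperatures, coincide:

```python
from math import comb, log

# Two non-interacting Einstein solids share Q quanta in total:
# Omega_total(q1) = Omega_1(q1) * Omega_2(Q - q1), so
# ln Omega_total = ln Omega_1 + ln Omega_2  (additivity of S).

def ln_omega(N, q):
    return log(comb(q + N - 1, q))

N1, N2, Q = 60, 140, 500

# Find the split q1 that maximizes the total log-multiplicity (entropy)
q1_star = max(range(1, Q),
              key=lambda q1: ln_omega(N1, q1) + ln_omega(N2, Q - q1))

# Finite-difference "inverse temperatures" of each subsystem at the optimum
beta1 = (ln_omega(N1, q1_star + 1) - ln_omega(N1, q1_star - 1)) / 2
beta2 = (ln_omega(N2, Q - q1_star + 1) - ln_omega(N2, Q - q1_star - 1)) / 2

print(q1_star, beta1, beta2)  # beta1 and beta2 nearly coincide
```

The entropy maximum lands where the two $\beta$'s are equal, which is exactly the zeroth-law content of the definition $\frac{1}{T}=\frac{\partial S}{\partial E}$: at equilibrium the subsystems share a common temperature.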

Pavlo. B.
  • 2,605
  • In thermodynamics, temperature is defined as the common parameter of two systems in thermal equilibrium (the zeroth law of thermodynamics). The relation $\frac{1}{T} = \frac{\partial S}{\partial U}$ is then a derived relation, not a definition. – ytlu Apr 17 '21 at 04:04
  • Well, we probably used different textbooks. In Chandler's "Introduction to Modern Statistical Mechanics" temperature is defined as $\frac{1}{T}\equiv\frac{\partial S}{\partial E}$. I am familiar with your approach too, but I like the one in Chandler better, because it is based on axioms, and not "laws", which makes his approach more general (it is still pretty physical though). – Pavlo. B. Apr 17 '21 at 04:15
  • In the microcanonical ensemble, the temperature is given by $\frac{1}{T} = \frac{\partial S}{\partial E}$ via the relation derived in thermodynamics. In thermodynamics, the temperature is assumed to be an equilibrium parameter without a precise definition. – ytlu Apr 17 '21 at 04:35
  • 1
    "Thermodynamics" is a real field... In Chandler (check this guy out, really good and small textbook), a system is characterized by extensive (i.e. additive) macroscopic variables $(E, V, N ...)$, and there exists a function $S(q)$ ($q$ are microscopic variables) which reaches its maximum on the equilibrium probability distribution over $q$ for fixed $(E, V, N ...)$. All of thermodynamics is basically derived from this definition and some stability arguments. Temperature is just defined through the partial derivative of entropy over energy – Pavlo. B. Apr 17 '21 at 04:50
  • Sorry, I meant $S[P(q)]$, where $P(q)$ is the probability distribution over the microscopic variables $q$ – Pavlo. B. Apr 17 '21 at 05:00
  • I see. I am looking forward to having a look at the book. – ytlu Apr 17 '21 at 05:03