
Thermal physics

1. Review
2. Entropy
Definition and properties
Entropy of the ideal gas; of mixing
Relation between entropy and
temperature
Entropy and heat capacities
Prof. Massimo PICA CIAMARRA
Office: SPMS-PAP-03-14
Office hours: Tuesday, 10:30-11:30
contact: massimo@ntu.edu.sg

1. Review

2. Entropy
Definition and properties

Equilibrium: maximum multiplicity


The probability to find a system in a given macrostate is proportional to its
multiplicity. We have clarified that, since the multiplicity is strongly peaked, there is a
macrostate that is much more probable than the others. This is the macrostate
corresponding to the equilibrium condition.
If a system is not in such a macrostate, as time evolves it will change macrostates until
reaching the one maximizing the multiplicity.
We can thus rephrase the second law of thermodynamics with the following statement:

Multiplicity tends to increase

Entropy
Instead of dealing with multiplicities, it is convenient to deal with their logarithm.
This is the entropy:

S = k ln Ω

The units are those of the Boltzmann constant k: [Joule/Kelvin].

Remember that the logarithm is a monotonic function.
Accordingly, if Ω increases, then S increases.
Thus the second law of thermodynamics can be stated as:

Entropy tends to increase

Entropy
The relation S = k log W is fundamental in Statistical Mechanics.
It was introduced by Boltzmann, the great father of statistical mechanics.
The equation is engraved on Boltzmann's tombstone in Vienna.

Entropy: an example
We have computed the multiplicity of an Einstein solid with N oscillators and
q quanta of energy (in the high-temperature limit q ≫ N):

Ω(N, q) ≈ (eq/N)^N

The corresponding entropy is:

S = k ln Ω ≈ Nk [ln(q/N) + 1]

Evaluating this for given numbers of oscillators N and quanta q yields the entropy in J/K.

Note that the entropy increases:
- by increasing N
- by increasing q
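The entropy formula above can be checked numerically. A minimal sketch; the values of N and q below are illustrative, not the ones from the slide:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_einstein(N, q):
    """Exact entropy S = k ln Omega of an Einstein solid, with
    Omega = (q + N - 1)! / (q! (N - 1)!), computed via log-gamma
    to avoid overflow for large N and q."""
    ln_omega = math.lgamma(q + N) - math.lgamma(q + 1) - math.lgamma(N)
    return k_B * ln_omega

# The entropy grows both with N (more oscillators) and with q (more energy)
S1 = entropy_einstein(100, 1000)
S2 = entropy_einstein(200, 1000)   # more oscillators
S3 = entropy_einstein(100, 2000)   # more energy quanta
print(S1, S2, S3)
```

For q ≫ N the exact value approaches the high-temperature approximation Nk[ln(q/N) + 1] used in the slide.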

Entropy & disorder

You will frequently find the term entropy associated with that of disorder. Indeed, in
general the disorder of a system increases with its entropy, even though to make this
statement precise one would need to specify what disorder is (how to measure it).
However, entropy and disorder are not always simply related.
An important counterexample is the crystallization of hard spheres.
At high density, hard spheres crystallize because the crystal is the state of higher entropy.
Thus, the entropy of an ordered system can be larger than that of a disordered one.

[Figure: hard spheres at high density; the disordered arrangement has the smaller
entropy, the ordered crystalline one the larger entropy.]

Entropy is additive: since the multiplicity of a composite system is the product
Ω_total = Ω_A · Ω_B, its entropy is the sum S_total = S_A + S_B.

2. Entropy
Entropy of the ideal gas
Entropy of mixing

Entropy of the ideal gas

We have previously calculated the multiplicity of the ideal gas.
We can thus determine the entropy. Using Stirling's approximation, we find:

S = Nk [ ln( (V/N) (4πmU / (3Nh²))^(3/2) ) + 5/2 ]

which is known as the Sackur-Tetrode equation.
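The Sackur-Tetrode equation is easy to evaluate numerically. A minimal sketch; the helium-like mass, volume, and temperature below are illustrative assumptions:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J s

def sackur_tetrode(N, U, V, m):
    """Sackur-Tetrode entropy of a monoatomic ideal gas:
    S = N k [ ln( (V/N) (4 pi m U / (3 N h^2))^(3/2) ) + 5/2 ]."""
    arg = (V / N) * (4 * math.pi * m * U / (3 * N * h**2)) ** 1.5
    return N * k_B * (math.log(arg) + 2.5)

# One mole of a helium-like gas (m ~ 6.6e-27 kg) near room conditions
N = 6.022e23
U = 1.5 * N * k_B * 300.0   # U = (3/2) N k T at T = 300 K
V = 0.025                   # m^3, roughly one mole at ~1 atm
m = 6.6e-27

S = sackur_tetrode(N, U, V, m)   # ~126 J/K
# Entropy is extensive: doubling N, U, V doubles S
S2 = sackur_tetrode(2 * N, 2 * U, 2 * V, m)
print(S, S2)
```

The extensivity check (S doubles when N, U, V all double) is a quick way to confirm the N inside the logarithm is in the right place.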

Example
The entropy of the ideal gas depends on its volume, energy, and number of particles.

We can thus compute how this quantity changes during a thermodynamic transformation.
For instance, suppose we change the volume from Vi to Vf, at constant temperature.
For an ideal gas, constant temperature means constant U. The entropy change is:

ΔS = Nk ln(Vf / Vi)

Next week we will see how entropy is connected to the temperature, as well as to the
pressure and to the so-called chemical potential.
In the second part of this course, we will consider how entropy changes during
different thermodynamic transformations
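The isothermal entropy change can be sketched in a few lines; the mole size and the factor-of-two expansion below are illustrative choices:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N = 6.022e23         # one mole of particles

# Isothermal expansion: U is fixed, so only the volume term of the
# Sackur-Tetrode equation changes, giving dS = N k ln(Vf / Vi).
Vi, Vf = 0.01, 0.02  # m^3, doubling the volume (illustrative values)
dS = N * k_B * math.log(Vf / Vi)
print(dS)  # ~5.76 J/K, i.e. R ln 2 for one mole
```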

Entropy of mixing
Two different gases, gas A and gas B, with the same energy, volume, and number of
particles, are separated by a partition. If we remove the partition, the gases mix.
What is the entropy change?
We consider each gas as a separate system, as they do not interact. Each gas expands
from V to 2V at constant energy, so:

ΔS_A = Nk ln 2,  ΔS_B = Nk ln 2,  ΔS_total = ΔS_A + ΔS_B = 2Nk ln 2

Entropy of mixing
Consider the total entropy of the system.
If the two gases are different, the total entropy is the sum of the entropies of the A and of
the B systems. If the gases have equal number of particles (N), mass (m), and energy (U), then

S_total(A ≠ B) = S_A + S_B = 2 S(N, U, 2V)

If the two gases are the same gas, then we have a single system containing twice as many
particles, and twice the energy. Thus the entropy is

S_total(A = B) = S(2N, 2U, 2V)

With respect to the A = B case, the A ≠ B case leads to an increment of entropy.
The difference is the entropy of mixing:

ΔS_mixing = S_total(A ≠ B) − S_total(A = B) = 2Nk ln 2
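The two cases can be compared directly with the Sackur-Tetrode formula. A sketch assuming, for illustration, one mole of a helium-like gas on each side:

```python
import math

k_B = 1.380649e-23   # J/K
h = 6.62607015e-34   # J s

def sackur_tetrode(N, U, V, m):
    """Sackur-Tetrode entropy of a monoatomic ideal gas."""
    arg = (V / N) * (4 * math.pi * m * U / (3 * N * h**2)) ** 1.5
    return N * k_B * (math.log(arg) + 2.5)

N = 6.022e23
U = 1.5 * N * k_B * 300.0
V = 0.025            # m^3 on each side of the partition
m = 6.6e-27          # kg, illustrative (helium-like) mass

# Different gases: after mixing, each gas occupies 2V with unchanged N, U
S_diff = 2 * sackur_tetrode(N, U, 2 * V, m)
# Same gas: one system with 2N particles, 2U energy, volume 2V
S_same = sackur_tetrode(2 * N, 2 * U, 2 * V, m)

dS_mixing = S_diff - S_same
print(dS_mixing, 2 * N * k_B * math.log(2))  # both ~11.5 J/K
```

Note that for the same gas the mixing entropy vanishes: removing the partition changes nothing.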

2. Entropy
Relation between entropy and temperature

Equilibrium, entropy, temperature


In the first part of this course, we have clarified that when two objects are placed in thermal
contact, they reach a condition of thermal equilibrium. Heat flows from the hot to the cold
object.
In thermal equilibrium, the temperature of the two objects is the same. We measure
temperature with a thermometer.
In the second part, we have clarified that when two objects are in thermal contact, they
exchange energy until the entropy is maximized. The macrostate with the maximum
entropy is the most probable.
This suggests a relation between temperature and the change of entropy: at equilibrium,
where the temperatures are equal, the entropy no longer changes as energy is exchanged.

Examples: two Einstein solids

Ω(N, q) = (q + N − 1)! / [ q! (N − 1)! ]

S = k log Ω(N, q)

S_total = S_A + S_B

Examples: two Einstein solids

We already know that the equilibrium condition (q_A = 60 in this example) is the one that
maximizes the entropy.

Examples: two Einstein solids

Let's maximize the entropy, under the condition of constant energy:
Total entropy: S_total = S_A + S_B
Total (constant) number of quanta: q_total = q_A + q_B

To maximize the entropy with respect to q_A, we fix its derivative wrt q_A to zero:

dS_total/dq_A = dS_A/dq_A + dS_B/dq_A = 0

(remember q_B = q_total − q_A, so dS_B/dq_A = −dS_B/dq_B)

dS_A/dq_A − dS_B/dq_B = 0   →   dS_A/dq_A = dS_B/dq_B

Since the number of quanta is proportional to the energy, at equilibrium we have:

∂S_A/∂U_A = ∂S_B/∂U_B

Two systems in thermal equilibrium have equal derivative of the entropy with respect to
their internal energy.

Examples: two Einstein solids

In thermal equilibrium, ∂S_A/∂U_A = ∂S_B/∂U_B.
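The equilibrium condition can be verified by brute force. The sizes N_A = 300 and N_B = 200 with q_total = 100 are an assumption, chosen so that the maximum falls at q_A = 60, consistent with the example above:

```python
import math

def ln_multiplicity(N, q):
    """ln Omega for an Einstein solid with N oscillators and q quanta,
    Omega = (q + N - 1)! / (q! (N - 1)!)."""
    return math.lgamma(q + N) - math.lgamma(q + 1) - math.lgamma(N)

# Assumed sizes (not stated on the slide): N_A = 300, N_B = 200, 100 quanta
N_A, N_B, q_total = 300, 200, 100

# Total entropy (in units of k) for every possible split of the quanta
S_tot = [ln_multiplicity(N_A, qA) + ln_multiplicity(N_B, q_total - qA)
         for qA in range(q_total + 1)]

# The most probable macrostate maximizes the total entropy
qA_eq = max(range(q_total + 1), key=lambda qA: S_tot[qA])
print(qA_eq)  # 60: equal energy per oscillator, q_A/N_A = q_B/N_B
```

At the maximum, the energy per oscillator is the same in both solids (60/300 = 40/200), which is exactly the equal-slope condition dS_A/dq_A = dS_B/dq_B.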

Entropy and temperature

In thermal equilibrium,
- experiments tell us T_A = T_B,
- the statistical approach tells us ∂S_A/∂U_A = ∂S_B/∂U_B.
This suggests to relate T and ∂S/∂U. Dimensional analysis fixes:

1/T = (∂S/∂U) at constant N, V

or equivalently

T = (∂U/∂S) at constant N, V

Examples: Einstein solid

For an Einstein solid, we have calculated:

S = Nk [ln(q/N) + 1] = Nk [ln(U/(Nε)) + 1]

where ε is the energy quantum, so that U = qε. Thus the temperature is:

1/T = ∂S/∂U = Nk/U

From which we derive

U = NkT

This is what is expected from the equipartition theorem, U = f (kT/2), where f is the number
of degrees of freedom. Indeed, an Einstein solid with N oscillators has f = 2N degrees of
freedom. Why 2N? N for the positions, N for the momenta.
For an ideal gas, U does not depend on position, so f = Nd (in d dimensions), where N is the
number of atoms.
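The identification 1/T = ∂S/∂U can be tested numerically against U = NkT. A sketch; the oscillator count, energy, and energy quantum ε below are illustrative assumptions:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
eps = 1.0e-21        # J, illustrative energy quantum hf (an assumption)

def S(U, N):
    """High-temperature Einstein-solid entropy S = N k [ln(U/(N eps)) + 1]."""
    return N * k_B * (math.log(U / (N * eps)) + 1.0)

N = 1e22   # oscillators (illustrative)
U = 50.0   # J, internal energy (illustrative, with U >> N*eps)

# Estimate 1/T = dS/dU with a central finite difference, then invert
dU = 1e-6 * U
T = 1.0 / ((S(U + dU, N) - S(U - dU, N)) / (2 * dU))
print(T, U / (N * k_B))  # both ~362 K: the numerics recover U = N k T
```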

Examples: monoatomic gas

For a monoatomic gas, we have found:

S = Nk [ ln( (V/N) (4πmU / (3Nh²))^(3/2) ) + 5/2 ]

Therefore the temperature is

1/T = ∂S/∂U = (3/2) Nk/U

From which we derive

U = (3/2) NkT

which is what we expect from the equipartition theorem (f = 3N).

2. Entropy
Entropy and Heat Capacities

Heat capacity
The heat capacity at constant volume is

C_V = (∂U/∂T) at constant N, V

In order to compute C_V from first principles, we need to determine how U depends on T.

The strategy is as follows:

1) Calculate the multiplicity Ω(N, V, U) and the entropy S = k ln Ω
2) Determine how T depends on U: 1/T = ∂S/∂U
3) Invert this relation to determine U(T), and finally compute C_V = ∂U/∂T

Examples
For the Einstein solid and for a monoatomic ideal gas, we have computed multiplicity,
entropy, and temperature, and we have also determined the dependence of the internal
energy on the temperature.

Einstein solid: U = NkT, so C_V = ∂U/∂T = Nk, where N is the number of oscillators

Monoatomic gas: U = (3/2)NkT, so C_V = ∂U/∂T = (3/2)Nk, where N is the number of atoms

Measuring the entropy

In general, this strategy (calculate the entropy, then determine C_V) is too complicated, as
it is difficult to calculate the entropy from first principles. Indeed, the heat capacity of just
a few systems has been theoretically derived.
Since it is difficult to calculate the entropy, one uses its relation with the heat capacity to
estimate the entropy instead: measure C_V, then estimate the entropy. Indeed, it is quite
easy to measure the heat capacity, as one only needs to measure how the temperature
changes when work/heat is done on a system (remember Joule's experiment).

Measuring entropy changes

Since the temperature is defined as 1/T = ∂S/∂U, we also have:

dS = dU/T = δQ/T at constant volume

where we have used the first law, dU = δQ + δW, with δW = 0 at constant volume.
Since the heat capacity is defined as C_V = δQ/dT, we find

dS = C_V dT / T

Measuring entropy changes

If the temperature changes from Ti to Tf, the entropy change is:

ΔS = ∫ (from Ti to Tf) C_V dT / T

Example: m = 20 g of water from 20 to 100 °C.
The specific heat capacity of water is the mechanical equivalent of heat, and
Joule determined c ≈ 4.186 J g⁻¹ °C⁻¹. It does not depend much on temperature in
the considered range. Assuming C = mc to be constant, we get:

ΔS = mc ln(Tf/Ti) ≈ 20 J/K

The corresponding change in multiplicity is HUGE:

Ω_f / Ω_i = e^(ΔS/k), with an exponent of order 10²⁴
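The water example above can be worked through directly, assuming a constant heat capacity over the range:

```python
import math

m = 20.0     # g of water (from the example)
c = 4.186    # J g^-1 K^-1, Joule's mechanical equivalent of heat
Ti, Tf = 293.15, 373.15   # 20 C and 100 C, in kelvin

C = m * c                    # total heat capacity, assumed T-independent
dS = C * math.log(Tf / Ti)   # entropy change, ~20 J/K

# The multiplicity grows by exp(dS/k): an exponent of order 10^24
exponent = dS / 1.380649e-23
print(dS, exponent)
```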

Absolute value of the entropy

If one knows the specific heat capacity at all temperatures, then one calculates

S(T) = S(0) + ∫ (from 0 to T) C_V(T') dT' / T'

and determines S(T) given S(0). What is the value of S(0)?

At zero temperature the system settles in the state of smallest energy. In general, there is
only one such state, so that Ω = 1 and S(0) = k ln Ω = 0.
However, for a variety of systems this is not the case, e.g. because of different orientations
of the molecules, because of isotopes, or because the system is amorphous.
If this is the case, then Ω > 1 and S(0) > 0.
In these cases, S(0) is known as residual entropy.

Third law of thermodynamics

Consider the relation

ΔS = ∫ (from T0 to T) C_V dT' / T'

If C_V is constant, then we can bring it out of the integral. The remaining integral of 1/T',
which is log(T'), diverges for T0 → 0.
This implies that one must have:

C_V → 0 as T → 0

which is known as the third law of thermodynamics.
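The logarithmic divergence is easy to see numerically; the constant C_V and the temperatures below are arbitrary illustrative values:

```python
import math

# With constant C_V, S(Tf) - S(T0) = C_V ln(Tf/T0): as T0 -> 0 the
# entropy change grows without bound, so C_V must vanish at T = 0.
C_V = 1.0    # arbitrary constant heat capacity (illustrative)
Tf = 300.0
T0_values = [1.0, 1e-3, 1e-6, 1e-9]
dS_values = [C_V * math.log(Tf / T0) for T0 in T0_values]
print(dS_values)  # each entry larger than the last, without bound
```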

Einstein solid
For an Einstein solid, we have determined:

U = NkT, so C_V = ∂U/∂T = Nk, where N is the number of oscillators

So C_V is constant. This violates the third law of thermodynamics, which means that this
result is not correct.
Indeed, to determine how the internal energy depends on T, we started from our
calculation of the entropy of the solid, S(N, U, V).
To calculate the entropy S, we used the high-temperature assumption, and considered
q ≫ N, where q is the total number of quanta.
Next tutorial: work out how C_V looks in the low-temperature limit (Problem 3.8)

Important achievement
We have closed the circle and reached an important achievement.
A) We started the course by reporting an experimental observation:
two objects in thermal contact reach thermal equilibrium.
In equilibrium the objects have equal temperature, as measured with a thermometer.
We used an operative definition: we knew how to measure temperature, but we did not
know what temperature is.
B) We then introduced the ideas of multiplicity and entropy. We assumed that a system
randomly explores all of its accessible microstates.
C) We clarified that the temperature is related to the derivative of the entropy with respect
to the internal energy: 1/T = ∂S/∂U.

We have obtained a microscopic interpretation of temperature
