Lecture 2: Review and preview: Statistical Physics- Notes, Engg, Sem Notes | EduRev

Lecture 2: Review and preview
Let us begin with a review of undergraduate statistical physics, looking back at what you have already learnt in your first course on statistical physics.
Both Newtonian mechanics and its quantum counterpart, Heisenberg-Schrödinger wave mechanics, are inherently deterministic theories. Consider, for instance, the Hamiltonian formulation of Newton's laws:
\[
\frac{dp}{dt} = -\frac{\partial H(p,q)}{\partial q}, \qquad \frac{dq}{dt} = \frac{\partial H(p,q)}{\partial p} \tag{1}
\]
In these equations, knowing the initial condition (p(t=0), q(t=0)) fixes the behaviour for all t > 0. The situation is similar in quantum mechanics.
Consider Dirac's formulation of Schrödinger's equation:
\[
i\hbar \frac{d|\psi(t)\rangle}{dt} = H|\psi(t)\rangle \tag{2}
\]
Knowing the initial state |ψ(t=0)⟩ fixes the behaviour for all t > 0, given the system Hamiltonian H.
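The determinism of these equations can be made concrete numerically. Here is a minimal sketch (not part of the lecture; the harmonic Hamiltonian H = p²/2m + kq²/2 and the leapfrog integrator are illustrative choices) showing that the initial condition alone fixes the entire trajectory:

```python
# A minimal sketch (not from the lecture): the harmonic Hamiltonian
# H = p^2/(2m) + k q^2/2 is chosen purely for illustration.  Integrating
# Hamilton's equations dp/dt = -dH/dq, dq/dt = dH/dp with a leapfrog step
# shows that the initial condition (q0, p0) alone fixes the trajectory.

def trajectory(q0, p0, m=1.0, k=1.0, dt=1e-3, steps=10000):
    q, p = q0, p0
    for _ in range(steps):
        p -= 0.5 * dt * k * q   # half kick:  dp/dt = -dH/dq = -k q
        q += dt * p / m         # full drift: dq/dt =  dH/dp = p/m
        p -= 0.5 * dt * k * q   # half kick
    return q, p

# Determinism: two runs from the same initial data agree exactly,
# and the energy H is conserved to O(dt^2) along the way.
q, p = trajectory(1.0, 0.0)
assert (q, p) == trajectory(1.0, 0.0)
assert abs(0.5 * (p * p + q * q) - 0.5) < 1e-3
```

The leapfrog (symplectic) step is used here because it respects the Hamiltonian structure; any standard integrator would make the same point about determinism.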
Now, given the success of Newtonian mechanics (or its relativistic generalizations) in treating the behaviour of classical few-body systems, and the corresponding success of the Schrödinger equation in the quantum realm, it might seem natural to implicitly assume that the behaviour of large macroscopic bodies (their macroscopic properties, internal structure and dynamics) could all be understood, at least in principle, by an application of these deterministic laws to all the 10^23 atoms that make up the macroscopic body.
However, the great insight which forms the foundation of Statistical Physics is that such an approach is neither feasible nor relevant! The first point is that it is simply not feasible to follow the trajectories of 10^23 electrons in a crystal or 10^23 atoms in a gas. The second, more fundamental point is that this information on the trajectories of all the particles, or the time evolution of the many-body wavefunction, does not help us understand the microscopic significance of macroscopic notions like "temperature", "hotness" vs "coldness", etc., which are key ingredients in our description of macroscopic systems.
These key ingredients are assembled to form the science of thermodynamics: as we know, thermodynamics starts with operational definitions for a
few key quantities, the so-called "thermodynamic variables" that characterize the state of a macroscopic body. These include the degree of "disorder" and quantity of heat, quantified by the thermodynamic entropy S; the hotness or coldness of macroscopic bodies, characterized by the absolute temperature T; the quantity of "available" energy, characterized by the free energy F; the internal energy U; and so on.
The predictive power of thermodynamics derives from a few simple properties these thermodynamic variables are postulated to satisfy. For instance, heat always flows from body A to body B when they are in contact if T_A > T_B; S either increases or remains constant with time; S goes to zero as T → 0; and so on.
Now, as remarked earlier, it is by no means obvious where these properties like entropy and absolute temperature are "hiding" in the trajectories of 10^23 atoms in a gas, or in the evolution of the many-body wavefunction in a Hilbert space of dimension 10^23. The central insight of statistical physics is the realization that these thermodynamic properties are emergent and statistical in nature.
The emergent aspect has to do with the fact that it makes no sense to say a single atom is "hot" or "cold", or to ascribe a temperature T to it. But one mole of the corresponding gas can be described by thermodynamics, and does have a well-defined temperature, at least in equilibrium. Likewise, there is no precise sense in which entropy is sensibly defined for a system of a few atoms: entropy, and the Third Law of thermodynamics, both reflect the properties and behaviour of macroscopically large collections of atoms. Similarly, there is no sense in which a few atoms of helium are in a superfluid state. Superfluidity (and we will have much more to say about it later) is a property of a macroscopically large collection of helium atoms.
The statistical aspect is another facet which again reflects the key role played by the "thermodynamic limit", i.e. the limit of macroscopically large system sizes (formally defined by keeping the density fixed and finite, but sending the volume to infinity). It has to do with the fact that thermodynamic laws can be violated by rare fluctuations in small systems. Thus, if you insist on using the operational definitions of thermodynamics to measure the entropy of very tiny systems, say a few dozen molecules bound together to form a polymer chain, you will find that the third law of thermodynamics can be disobeyed by rare fluctuations in the behaviour of the system.
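The 1/√N sharpening of thermodynamic behaviour behind this statement can be illustrated with a toy calculation (the collection of two-level systems and all parameter values below are invented for illustration, not taken from the lecture):

```python
import math

# A hedged toy calculation (not from the lecture): N independent two-level
# systems with splitting eps at temperature T.  The canonical ensemble gives
# <E> = N*eps*p and Var(E) = N*eps^2*p*(1-p), with p the upper-level
# occupation, so relative energy fluctuations fall off as 1/sqrt(N): sharp
# thermodynamic behaviour emerges only for macroscopically large N.

def relative_fluctuation(N, eps=1.0, kT=1.0):
    p = 1.0 / (1.0 + math.exp(eps / kT))    # upper-level occupation
    mean = N * eps * p                      # <E>
    var = N * eps ** 2 * p * (1.0 - p)      # Var(E)
    return math.sqrt(var) / mean            # Delta E / <E>

for N in (10, 10 ** 4, 10 ** 8):
    print(N, relative_fluctuation(N))
# the ratio shrinks tenfold for every hundredfold increase in N
```

For N of order 10^23 the relative fluctuations are utterly negligible, which is why macroscopic measurements look deterministic; for a few dozen molecules they are not.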
Consequently, our microscopic understanding of thermodynamic properties has a statistical flavour that you are already familiar with. For the sake
of completeness, we provide a quick review. The basic idea is to start with the Gibbs distribution function, which postulates that a macroscopic system of N particles in a fixed volume V is in eigenstate |m⟩ with probability
\[
P_m = \frac{1}{Z}\, e^{-E_m(V,N)/k_B T}, \quad \text{where} \quad Z = \sum_m e^{-E_m(V,N)/k_B T} \tag{3}
\]
Here, Z is the canonical partition function of the system. An interesting aspect of this statistical description is that T, an emergent property of a macroscopic system, enters into the relative probabilities of the various states m.
With this starting point, one defines
\[
U = \sum_m E_m P_m \tag{4}
\]
\[
F \equiv U - TS = -k_B T \log Z \tag{5}
\]
where F is the Helmholtz free energy and U the internal energy. The temperature T, the internal energy U, the entropy S (defined implicitly above by subtracting the first equation from the second) and the free energy F defined in this manner are then argued to possess all the properties one expects of the corresponding quantities defined operationally in thermodynamics. This provides an a posteriori justification of the Gibbs distribution function.
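For a quick numerical sanity check of this construction (a sketch, not part of the lecture; the three-level spectrum is invented, and the Gibbs entropy S = −k_B Σ_m P_m log P_m is used for S), one can build the weights for a small spectrum and verify the identity F = U − TS:

```python
import math

# A quick numerical check of Eqs. (3)-(5); the three-level spectrum below
# is invented for illustration.  Build the Gibbs weights P_m, then compute
# U = sum_m E_m P_m, F = -kT log Z, and the Gibbs entropy
# S = -sum_m P_m log P_m (in units of k_B), and verify F = U - T S.

def canonical(levels, kT):
    weights = [math.exp(-E / kT) for E in levels]
    Z = sum(weights)
    P = [w / Z for w in weights]                 # Gibbs probabilities, Eq. (3)
    U = sum(E * p for E, p in zip(levels, P))    # internal energy, Eq. (4)
    F = -kT * math.log(Z)                        # free energy, Eq. (5)
    TS = kT * -sum(p * math.log(p) for p in P)   # T*S, with S in k_B units
    return U, F, TS

U, F, TS = canonical([0.0, 1.0, 2.5], kT=1.0)    # hypothetical spectrum
assert abs(F - (U - TS)) < 1e-12                 # the identity F = U - TS
```

The identity holds for any spectrum and temperature, since log P_m = −E_m/k_B T − log Z term by term.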
This framework generalizes readily if one wants a more general prescription that allows for number fluctuations. One starts with a larger space of states which considers all possible values of the total number N, and postulates the grand-canonical distribution function
\[
P_{m,N} = \frac{1}{Z_{GC}}\, e^{-(E_m(V,N) - \mu N)/k_B T}, \qquad Z_{GC} = \sum_{m,N} e^{-(E_m(V,N) - \mu N)/k_B T} \tag{6}
\]
Z_{GC}, the grand-canonical partition function, depends on the chemical potential μ, which can be thought of as the energy cost of adding a particle. Note that μ is an "intensive" variable, which can be thought of as a "Lagrange multiplier" that fixes the mean number of particles to equal what we expect
for a system of that average density. From the grand-canonical partition sum, one obtains another thermodynamic potential,
\[
\Omega = -k_B T \log Z_{GC} \tag{7}
\]
known as the grand potential.
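The "Lagrange multiplier" role of μ can be seen in a minimal numerical example (the single fermionic orbital and all parameter values are invented for illustration, not taken from the lecture):

```python
import math

# A minimal illustration of Eq. (6); the single fermionic orbital and its
# parameters are invented, not from the lecture.  For one orbital of energy
# eps with occupation N in {0, 1}, Z_GC = 1 + exp(-(eps - mu)/kT), and <N>
# is the familiar Fermi factor.  Treating mu as a Lagrange multiplier,
# bisection tunes it until <N> hits a target density.

def mean_N(mu, eps=1.0, kT=0.5):
    w = math.exp(-(eps - mu) / kT)   # grand-canonical weight of N = 1
    return w / (1.0 + w)             # <N> = sum_N N P_{m,N}

def solve_mu(target, lo=-20.0, hi=20.0):
    # <N> increases monotonically with mu, so bisection converges
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if mean_N(mid) < target else (lo, mid)
    return 0.5 * (lo + hi)

mu = solve_mu(0.25)
assert abs(mean_N(mu) - 0.25) < 1e-9   # mu now fixes the mean density
```

Bisection suffices here because ⟨N⟩ is monotonic in μ; in a many-orbital system one would solve the same scalar equation for the total mean number.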
If appropriate for the experimental situation at hand, one can also work with a distribution function that allows for a variable volume:
\[
P_{m,V} = \frac{1}{Z_P}\, e^{-(E_m(V,N) + PV)/k_B T}, \qquad Z_P = \sum_m \int dV\, e^{-(E_m(V,N) + PV)/k_B T} \tag{8}
\]
Here, Z_P is the partition function at fixed pressure P. Again, the intensive pressure variable can be thought of as a "Lagrange multiplier" that fixes the mean volume, and the corresponding thermodynamic potential,
\[
G = -k_B T \log Z_P \tag{9}
\]
is the Gibbs free energy.
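The analogous "Lagrange multiplier" role of the pressure can be checked with a crude numerical sketch (the one-state energy E(V) = a/V is invented for illustration, not taken from the lecture):

```python
import math

# A crude numerical sketch of Eq. (8); the one-state energy E(V) = a/V is
# invented for illustration.  Discretizing the V-integral shows the pressure
# acting as a Lagrange multiplier on the volume: larger P means smaller <V>.

def mean_volume(P, a=1.0, kT=1.0, Vmax=50.0, n=100000):
    dV = Vmax / n
    num = den = 0.0
    for i in range(1, n + 1):
        V = i * dV
        w = math.exp(-(a / V + P * V) / kT)   # fixed-pressure Gibbs weight
        num += V * w * dV                     # numerator of <V>
        den += w * dV                         # Z_P for this toy spectrum
    return num / den

# squeezing harder shrinks the mean volume
assert mean_volume(2.0) < mean_volume(1.0) < mean_volume(0.5)
```

The cutoff Vmax and the grid size are chosen so the weight has decayed to negligible values at the boundary for the pressures used here.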
Thus, the Gibbs distribution provides a statistical way of "understanding" the underlying rationale for the macroscopic laws of thermodynamics, and provides a clear calculational prescription for macroscopic concepts like temperature, free or available energy, entropy, etc. Undergraduate treatments of statistical physics thus end on the following triumphant note:
Macroscopic phenomena are governed by thermodynamics. Since statistical mechanics provides the rationale for thermodynamics, all these phenomena can in principle be derived from statistical mechanics. At this point in our discussion, it is therefore worth asking: Really? Is this really true?
More precisely, let us list some phenomenological facts, drawn from everyday experience and a study of undergraduate physics, and ask: where is all this lurking inside the Gibbs distribution function?
- Matter exists in several different phases: crystalline solid, liquid and gaseous phases of H2O; ferromagnetic and paramagnetic metals; insulators... These phases are separated by phase transitions, accessed by changing pressure, temperature, magnetic field...
- Some of these phases are "distinctly" different from other phases: atoms in a crystal have a very "ordered" arrangement; not so in a liquid.
- Some phase transitions are "first order", with a latent heat of phase change.
- Other transitions are accompanied by large-scale fluctuations, e.g. as evidenced by the phenomenon of critical opalescence at the liquid-gas critical point.
None of these follow in any automatic way from the basic prescription of Gibbs. They are all emergent properties of macroscopic aggregates of constituent particles, and require new ways of thinking to understand them well. One actually needs another layer of new concepts to think "efficiently" about these macroscopic phenomena. Here, we list them by way of preview, and discuss them in some detail in the next lecture:
- Spontaneous breaking of symmetry.
- Phases distinguished by long-range order.
- Order parameters.
- Rigidity.
- Broken ergodicity.
- Gapless elementary excitations related to the underlying rigidity.
More on this in the next lecture...