Entropy
Keith Andrew
Physics Department, University of Arkansas, Fayetteville, Arkansas 72701
(Received 28 February 1983; accepted for publication 17 September 1983)
Several different conceptualizations of entropy are examined: thermal, probabilistic, quantum
mechanical, information theoretic, black-hole, and chaotic. The unifying nature of entropy in all
physical processes and recent cosmological advances are discussed.
I. INTRODUCTION

Entropy offers a means of perceiving unity in physics. It is a common denominator in the evolution of all physical processes and thus serves as a fundamental unifying feature. Yet as fundamental as entropy is, it does not appear in any of the basic equations of motion. Classically it is treated as a phenomenological property of all systems. Systems evolve in the direction of increasing entropy. This is a dynamic condition subject to the constraint that there exists an initial low-entropy state; this state serves as a boundary condition. Classically entropy is clearly expressed in terms of phase space, partition functions, and probability distributions. Quantum mechanically, entropy is also a clear and well-defined concept. The normal tools of quantum mechanics (Hilbert space, wave functions, observables, and the density matrix) suffice to yield a totally unambiguous concept of entropy.1-4

Ultimately one hopes to understand entropy well enough to perceive the unity it provides to diverse areas of physics. An understanding of entropy is founded upon concepts such as probability, reversibility, information content,5 black holes,6 chaos,7 geometry,8 and time asymmetry.9,10 Some of these concepts strike deeply into basic problems with our present understanding of nature: the meaning of having "knowledge" of a system, a generalized second law of thermodynamics, a possible boundary condition on the cosmological evolution of the universe, the arrow of time, and the CPT theorem. Here we investigate each area in turn to help illuminate the underlying unity provided by entropy.
II. ENTROPY

A. Classical macroscopic entropy

We are concerned with finding an extensive state variable which is capable of telling whether an isolated system, which has undergone a change to a new equilibrium configuration, has done so reversibly or not. This is to be done without any reference to the possible microstates of the system. This quantity provides a means for separating processes that can occur from those that cannot even though they satisfy the first law of thermodynamics. Such a variable is the entropy S, where

ΔS = S_f − S_i ≥ 0   (1)

for any thermally isolated system, and

dS = đQ/T   (2)

for any quasistatic infinitesimal process which absorbs heat đQ. The equality in Eq. (1) only holds for isentropic processes, these being reversible. As such the second law of thermodynamics states that the entropy of an isolated system never decreases. The power in this law is that it provides a key to understanding the motion of all isolated classical macroscopic systems. It does not, however, describe any mechanism responsible for the creation of the initial low-entropy state.
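A minimal numerical sketch of Eqs. (1) and (2), assuming an illustrative irreversible heat flow between two large reservoirs (the heat and temperatures below are arbitrary illustrative values, not taken from the text): applying dS = đQ/T to each reservoir and summing shows ΔS ≥ 0 for the composite, thermally isolated system.

# Entropy bookkeeping for an irreversible heat flow, using dS = dQ/T for each
# reservoir; each reservoir is assumed large enough that its temperature stays fixed.
Q = 100.0       # heat transferred, in joules (illustrative value)
T_hot = 400.0   # kelvin (illustrative value)
T_cold = 300.0  # kelvin (illustrative value)

dS_hot = -Q / T_hot    # the hot reservoir gives up heat Q
dS_cold = Q / T_cold   # the cold reservoir absorbs heat Q
dS_total = dS_hot + dS_cold

print(f"dS_hot   = {dS_hot:+.3f} J/K")
print(f"dS_cold  = {dS_cold:+.3f} J/K")
print(f"dS_total = {dS_total:+.3f} J/K   (nonnegative, as Eq. (1) requires)")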
B. Microscopic entropy: Accessible states

We may directly relate our microscopic knowledge of the system to the entropy of the system by the relation

S = k ln W,   (3)

where k is Boltzmann's constant and W = W(E, V, m, ...), which is the number of quantum states which have the same values of all macroscopic extensive variables: energy, volume, magnetization, etc. Knowing the nature of the particles making up the system and the interactions between them one can in principle calculate W. The difference in entropy after completing a process is

ΔS = S_f − S_i = k ln(W_f/W_i).   (4)

The condition for reversibility is now W_f = W_i. The number of quantum states accessible to the system remains unchanged by the process. From the point of view of accessible states all physical quantities of interest can be expressed in terms of the fundamental quantity ln W. This represents the microcanonical approach; the system is distributed with equal probability over all accessible states of fixed extensive macroscopic variables: energy E and particle number N.
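Equation (4) is easy to evaluate for a standard textbook scenario, the free expansion of an ideal gas into twice its original volume (an illustrative case, not one discussed in the text): each particle's number of accessible states doubles, so W_f/W_i = 2^N and ΔS = Nk ln 2. A short sketch:

import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def delta_S_volume_doubling(N):
    """Eq. (4) for an ideal gas of N particles whose accessible volume doubles:
    W_f / W_i = 2**N, so Delta S = k ln(W_f/W_i) = N k ln 2."""
    return N * k_B * math.log(2.0)

N_avogadro = 6.022e23
print(f"Delta S for one mole = {delta_S_volume_doubling(N_avogadro):.2f} J/K")  # ~5.76 J/K = R ln 2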
C. Microscopic entropy: Statistics

Instead of treating ln W as the fundamental quantity we may treat the partition function,

Z = Σ_r e^{−E_r/kT},   (5)

where k is Boltzmann's constant and E_r is the energy of state r, as such a quantity. Thus all physical quantities may be expressed in terms of ln Z. The entropy of a system is then

S = k (ln Z + Ē/kT),   (6)

where Ē is the average energy of the system. With the aid of a probability argument one can now show11 that this is equivalent to S = k ln W. Here one says that the system remains in the state of maximum entropy. This represents the canonical approach: the system is distributed over all states with a fixed number of particles but with different energies.
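The content of Eqs. (5) and (6) can be checked numerically for a two-level system (an illustrative choice of spectrum, not taken from the text): the entropy k(ln Z + Ē/kT) agrees with the equivalent probabilistic form −k Σ_r p_r ln p_r, where p_r = e^{−E_r/kT}/Z.

import math

k_B = 1.380649e-23  # J/K

def canonical_entropy(energies, T):
    """Return (S from Eq. (6), S from -k sum_r p_r ln p_r) for discrete levels."""
    beta = 1.0 / (k_B * T)
    weights = [math.exp(-beta * E) for E in energies]
    Z = sum(weights)                                   # Eq. (5)
    probs = [w / Z for w in weights]
    E_avg = sum(p * E for p, E in zip(probs, energies))
    S_eq6 = k_B * (math.log(Z) + E_avg / (k_B * T))    # Eq. (6)
    S_prob = -k_B * sum(p * math.log(p) for p in probs)
    return S_eq6, S_prob

# Two levels separated by 0.01 eV at room temperature (illustrative numbers).
levels = [0.0, 0.01 * 1.602e-19]
S1, S2 = canonical_entropy(levels, 300.0)
print(S1, S2)  # the two expressions agree to machine precision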
The statistical approach reproduces all the results of thermodynamics and goes beyond. The theory is adept at dealing with systems for which quantum effects are negligible and the information is less than complete. Such incomplete states are known as mixed; otherwise one has a pure state.12 In terms of the canonical probability distribution ρ the entropy is

S(E,N) = −k ∫ ρ(x) ln[a ρ(x)] dx,   (7)

where ρ(x) = Z^{−1} e^{−H/kT}; H = H(q_1,...,p_N) is the Hamiltonian of the system; a = h^{3N} N_1!···N_k!; dx = d³q_1···d³p_N; x is the phase space point of the system (q_1,...,p_N); and h^{3N} is the phase-space volume.

The time development of ρ is given by Liouville's equation,

∂ρ/∂t + {ρ,H} = 0,   (8)

where

{ρ,H} = Σ_i (∂ρ/∂q_i ∂H/∂p_i − ∂H/∂q_i ∂ρ/∂p_i)

is the Poisson bracket.

Thus far several points are worth noting. First, we may use Eq. (7) to express entropy as a function of ρ: S = S(ρ). For the time rate of change of entropy we have

dS/dt = (∂S/∂ρ)(dρ/dt) = 0,   (9)

since dρ/dt = 0 by Liouville's equation. Thus one concludes that S(ρ(t)) = S(ρ(0)); the entropy of the system is time independent. This result has a corresponding quantum analog to be taken up in Sec. II D.

Next we shall consider the H theorem first proposed by Boltzmann in 1872. Given a well-defined quantity H, which is a function of the system's probability distribution, the H theorem states that the time rate of change of this quantity is such that

dH/dt ≤ 0,   (10)

where the condition dH/dt = 0 only occurs at equilibrium. Proofs of the theorem for a gas vary but they require some assumptions concerning randomness and lack of correlations between collisions. One immediately observes that the H theorem is the negative of the second law of thermodynamics; as entropy increases H decreases.

Finally we mention the Poincaré recurrence theorem with regard to fluctuations and the statement "entropy never decreases." Poincaré's recurrence theorem states that in an isolated system any state the system has gone through will be revisited to arbitrary closeness an infinite number of times. This seems to be in direct contradiction to the H theorem and the second law statement "entropy never decreases." However, the entropy of the system, even if in equilibrium, is constantly undergoing fluctuations. Some of these fluctuations are such that they reduce the system's entropy to the original low-entropy state in accordance with the theorem of Poincaré. However, the recurrence times for such states of a macroscopic system are of such a large order of magnitude that experimentally they never occur.

It is important to notice that changes in thermodynamic entropy can be measured in the laboratory just as energy change and position change can be measured. The abstract analogies to be introduced in later sections do not have this property. They merely exploit the mathematical characteristics of the Boltzmann definition in terms of symmetry, convexity, etc. With this in mind we now make the natural extension of this formalism to the quantum regime, enabling one to understand entropy from the more fundamental rules of quantum mechanics.

D. Quantum mechanics and entropy

Recall that in quantum mechanics observables correspond to self-adjoint operators in Hilbert space while states, mixed and pure,12 can be characterized by a density matrix ρ = Σ_i c_i |i⟩⟨i|. The expectation value of an observable A in state ρ is ⟨A⟩ = Tr(ρA). Entropy is not an observable; it is a function of a mixed state defined as13

S(ρ) = −k Tr(ρ ln ρ).   (11)

Notice that for any pure state S(ρ) = 0. The time evolution of ρ is given by the quantum Liouville equation

dρ/dt − (i/ħ)[ρ,H] = 0,   (12)

where [ρ,H] = ρH − Hρ.

A problem now arises as to the application of the Schrödinger equation and the second law of thermodynamics as expressed in Sec. II A: entropy must never decrease. The entropy of a system obeying the Schrödinger equation, with a time-independent Hamiltonian, always remains constant. The time evolution of the density matrix is given by

ρ(t) = e^{−iHt} ρ(0) e^{iHt}.   (13)

Since e^{iHt} is a unitary operator, the eigenvalues of ρ(t) are the same as those of ρ(0); therefore S(ρ(t)) = S(ρ(0)), even for irreversible processes.14 One way to avoid this difficulty is to abandon the Schrödinger equation in statistical mechanics and use instead the master equation, Boltzmann equation, etc. These equations are not time invariant and it is a great challenge to show that all time invariant equations follow from them. Another means of reconciling this discrepancy between macroscopic and microscopic concepts is to retain the Schrödinger equation and use an information theoretic approach to calculate the entropy; this is the subject of Sec. II E.
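A short numerical sketch of Eqs. (11)-(13) for a single spin, in units with k = ħ = 1 (the Hamiltonian and the mixed-state populations are illustrative assumptions): a pure state has S(ρ) = 0, a mixed state has S(ρ) > 0, and unitary evolution leaves S(ρ) unchanged.

import numpy as np

def von_neumann_entropy(rho):
    """Eq. (11) with k = 1: S(rho) = -Tr(rho ln rho); 0 ln 0 is taken as 0."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])    # a pure state |0><0|
mixed = np.array([[0.7, 0.0], [0.0, 0.3]])   # an illustrative mixed state
print(von_neumann_entropy(pure))    # 0, as noted below Eq. (11)
print(von_neumann_entropy(mixed))   # -0.7 ln 0.7 - 0.3 ln 0.3 ~ 0.611

# Unitary evolution rho(t) = exp(-iHt) rho(0) exp(+iHt), Eq. (13), with hbar = 1.
H = np.array([[0.0, 1.0], [1.0, 0.0]])       # illustrative Hamiltonian
t = 1.7
e, v = np.linalg.eigh(H)
U = v @ np.diag(np.exp(-1j * e * t)) @ v.conj().T
rho_t = U @ mixed @ U.conj().T
print(von_neumann_entropy(rho_t))   # unchanged: S(rho(t)) = S(rho(0))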
E. Entropy and information
The connection between entropy and information is well known.5,15,16 The entropy of a system is a measure of one's lack of information about the internal configuration of the system. To see how this comes about let us review the basics of information theory. Any message can be represented by a sequence of binary digits, which we shall consider to be finite in length. If the length of a "word" is n then one needs n such digits. The set E_n of all words of length n contains 2^n elements; thus the amount of information needed to characterize one element is log₂ (the number of elements of E_n) = log₂ N = n, where N = 2^n. Let E = ∪ E_i be a union of pairwise disjoint sets, N_i the number of elements in each E_i, p_i = N_i/N, and N = Σ N_i. Thus the information required to determine a word is which E_i it is in and log₂ N_i bits to characterize the word completely.
The average amount of information needed to determine an element, given which E_i it is in, is

Σ_i p_i log₂ N_i.

Whereas log₂ N is the information needed if one does not know to which E_i a given element belongs, the corresponding lack of information is

−Σ_i p_i log₂ p_i.   (14)

This is Shannon's formula.16 Physically, the set E is a set of N measurements and the p_i are the probabilities of finding the system in the pure state |i⟩. Then, to within a constant, Shannon's formula gives an expression for the entropy of the system. This is a classical expression and great care must be exercised in arriving at a quantum-mechanical expression due to commutation rules.17

Whenever new information becomes available this imposes constraints on some of the p_i, which decreases the entropy,18

ΔI = −ΔS.   (15)

A thermodynamic system not in equilibrium increases its entropy since information about the internal configuration of the system is lost during its time evolution.19 This is the basic significance of the second law of thermodynamics.
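A small sketch of Eq. (14), using illustrative probabilities: the missing information, in bits, is largest for a uniform distribution over the E_i and drops once new information sharpens the p_i, in the spirit of Eq. (15).

import math

def missing_information(probs):
    """Eq. (14): -sum_i p_i log2 p_i, in bits (terms with p_i = 0 contribute 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # no knowledge of which E_i the element is in
peaked  = [0.85, 0.05, 0.05, 0.05]   # new information has constrained the p_i
print(missing_information(uniform))  # 2.0 bits = log2(4)
print(missing_information(peaked))   # ~0.85 bits: less missing information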
Herein we may use the Schrödinger equation to analyze the time evolution of entropy S(t).20,21 Assume the system's macroscopic variables are energy, ⟨H⟩, and magnetization, ⟨m⟩. Given ⟨H⟩_{t=0} and ⟨m⟩_{t=0} one can find S(t = 0) by maximizing −k Tr ρ(0) ln ρ(0) subject to the constraints due to ⟨H⟩_0 and ⟨m⟩_0. Now use the Schrödinger equation to find ⟨H⟩_t and ⟨m⟩_t, i.e., ρ(t) = e^{−iHt} ρ(0) e^{iHt}. Now maximize −k Tr(ρ̃_t ln ρ̃_t) subject to the predicted values of ⟨H⟩_t and ⟨m⟩_t. Notice that this procedure is different from calculating ρ(t) from the Liouville equation. Thus the macroscopic entropy at time t is the maximum value

S(t) = −k Tr(ρ̃_t ln ρ̃_t) = S(⟨H⟩_t, ⟨m⟩_t).   (16)

This entropy is the maximum uncertainty about the microscopic state, given the macroscopic observables ⟨H⟩_t and ⟨m⟩_t as predicted by the Schrödinger equation. This technique provides the most natural means of overcoming the discrepancy between microscopic and macroscopic conceptualizations. It has the further advantage of direct interpretation as information loss of the microscopic state of the system. Such connections between microscopic and macroscopic domains are powerful guides to understanding nature.
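The two-step procedure leading to Eq. (16) can be sketched in its simplest setting, a single spin-1/2 constrained only by a predicted magnetization ⟨m⟩_t (a deliberately reduced, illustrative version of the calculation, with k = 1 and an assumed decay law for ⟨m⟩_t): the entropy-maximizing density matrix compatible with ⟨σ_z⟩ = m is diagonal with populations (1 ± m)/2, so S(⟨m⟩_t) can be followed in time.

import numpy as np

def maxent_entropy_given_m(m):
    """Maximum of -Tr(rho ln rho) over single-spin density matrices with
    <sigma_z> = m: the maximizer is diagonal with populations (1+m)/2, (1-m)/2."""
    p = np.array([(1.0 + m) / 2.0, (1.0 - m) / 2.0])
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# An assumed, purely illustrative prediction <m>_t = exp(-t): the macroscopic
# entropy S(<m>_t) rises from 0 (pure state) toward ln 2 (fully mixed).
for t in [0.0, 0.5, 1.0, 2.0, 4.0]:
    m_t = np.exp(-t)
    print(f"t = {t:3.1f}   S = {maxent_entropy_given_m(m_t):.4f}")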
F. Black holes and entropy

Recently entropy and thermodynamics have led to surprising discoveries in understanding gravitation, thereby uniting general relativity to thermodynamics. The detailed studies of black-hole dynamics have led to an interesting law: changes in a black hole generally take place in the direction of increasing its horizon surface area.22,23 The surface area of a black hole is interpreted as a measure of the inaccessibility of information about its internal configuration. This directly yields an expression for black-hole entropy6:

S = kc³A/4ħG,   (17)

where k is Boltzmann's constant, c is the speed of light, ħ is Planck's constant, G is the gravitational constant, and A is the surface area of the horizon. For a system of several black holes the black-hole entropy is additive. Black-hole entropy is not the thermodynamic entropy of the system inside the event horizon. To get a feeling for how irreversible the process of black-hole formation is, notice that the entropy of a one solar mass black hole is 10^60 ergs/K; in contrast the entropy of the sun is 10^42 ergs/K.

Another means of understanding the motivation for Eq. (17) is by direct analogy to classical thermodynamics. Two neighboring black-hole equilibrium states are related by the first law of black-hole dynamics:

dm = (g/8πG) dA + (Ω/c²) dJ,   (18)

where g is the surface gravity of the black hole, A is the surface area of the black hole, Ω is the angular velocity of the black hole, J is the angular momentum of the black hole, G is the gravitational constant, and m is the mass of the black hole. Comparing this to the first law of thermodynamics,

dU = T dS − p dV,   (19)

one sees that if some multiple of A is regarded as entropy, then some multiple of g is analogous to temperature. Therefore not only is there an entropy associated with black holes but also a finite temperature,

T = ħg/2πkc.   (20)

One result of black-hole entropy is a generalized second law of thermodynamics: the common entropy in the black-hole exterior plus the black-hole entropy never decreases. Black-hole entropy is regarded as a genuine contribution to the entropy of the universe.

Black holes also represent limits to the applicability of physical laws. Thus Eq. (17) can be used to put restrictions on several "everyday" quantities, i.e., the ideal digital computer cannot perform more than 10^15 elementary operations per second and there exists an upper bound on the number of quarks.24
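Equations (17) and (20) are straightforward to evaluate; a short check with standard SI constants and one solar mass (the specific constants and mass are the only inputs assumed here) shows how enormous the black-hole entropy is compared with the sun's roughly 10^42 ergs/K, and how small the associated temperature is.

import math

k_B   = 1.380649e-23   # Boltzmann's constant, J/K
c     = 2.998e8        # speed of light, m/s
G     = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
hbar  = 1.055e-34      # reduced Planck constant, J s
M_sun = 1.989e30       # one solar mass, kg

r_s = 2 * G * M_sun / c**2           # Schwarzschild radius
A   = 4 * math.pi * r_s**2           # horizon surface area
g   = G * M_sun / r_s**2             # surface gravity at the horizon

S = k_B * c**3 * A / (4 * hbar * G)      # Eq. (17)
T = hbar * g / (2 * math.pi * k_B * c)   # Eq. (20)

print(f"S ~ {S * 1e7:.1e} ergs/K")   # ~1e61 ergs/K, vastly larger than the sun's ~1e42 ergs/K
print(f"T ~ {T:.1e} K")              # ~6e-8 K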
G. Dynamical entropy and chaos

It has been shown that very simple dynamical systems with regular data exhibit completely unpredictable behavior.25 As an example, consider θ_{n+1} = 2θ_n (mod 2π) on the unit circle. Then δθ_n = 2^n δθ_0 for some initial θ_0. Any small uncertainty in θ_n will eventually fill the entire phase space available to it, the total circumference. Although the equation is completely deterministic, after a sufficient number of iterations all knowledge of the initial datum is lost, as well as predictive power, provided there exists an uncertainty in the initial datum.
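A few lines of code make the loss of predictive power explicit (the starting angle and the 10^-10 uncertainty are illustrative choices): two nearby initial conditions of θ_{n+1} = 2θ_n (mod 2π) separate roughly as 2^n δθ_0 until the separation saturates at the size of the circle.

import math

def doubling_map(theta):
    """One step of theta_{n+1} = 2 theta_n (mod 2 pi)."""
    return (2.0 * theta) % (2.0 * math.pi)

theta_a = 1.0            # illustrative initial angle
theta_b = 1.0 + 1e-10    # the same angle with a small uncertainty

for n in range(0, 45, 5):
    print(f"n = {n:2d}   separation ~ {abs(theta_a - theta_b):.3e}")
    for _ in range(5):
        theta_a = doubling_map(theta_a)
        theta_b = doubling_map(theta_b)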
We are led to a quantity which indicates a measure of the degree of disorder in the phase-space trajectories traced out by the solutions of the system. Such a quantity is the metric entropy (or Kolmogorov entropy).26,27

Consider iterative mappings of the type

x_{n+1} = T(x_n).   (21)

We wish to know if the solutions exhibit any stationary behavior as n → ∞.
If there exists a sufficiently sensitive dependence on initial conditions, then neighboring solutions diverge exponentially under iteration. Thus the knowledge of the initial conditions is lost at an exponential rate. Neighboring solutions may diverge less than exponentially; then initial knowledge is still lost, but the metric entropy is zero and the system is not truly random. Where the trajectories diverge as e^{nS}, the metric entropy is defined, for one-dimensional maps,8 as

S = ⟨log|dT(x)/dx|⟩ = ∫ log|dT(x)/dx| μ(x) dx,   (22)

where μ(x) dx is the relative number of iterates between x and x + dx.

Formal chaos will exist for S > 0. Such systems are macroscopically deterministic but not predictive. The knowledge of initial conditions is lost after ≈ S^{−1} iterations. The system will be predictive only if S = 0.
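Equation (22) can be estimated numerically by replacing the μ-weighted integral with an average of log|dT/dx| along a long orbit. The sketch below uses the logistic map T(x) = 4x(1 − x) as an illustrative one-dimensional map (it is not one of the article's examples); the estimate converges to ln 2 ≈ 0.693, so S > 0 and the map is formally chaotic.

import math

def metric_entropy_estimate(T, dT, x0, n_iter=100_000, n_skip=1_000):
    """Estimate Eq. (22), S = <log|dT/dx|>, by averaging log|dT/dx| along an
    orbit; the long-time orbit average stands in for the measure mu(x)dx."""
    x = x0
    for _ in range(n_skip):        # discard transients
        x = T(x)
    total = 0.0
    for _ in range(n_iter):
        total += math.log(abs(dT(x)))
        x = T(x)
    return total / n_iter

def T(x):
    return 4.0 * x * (1.0 - x)     # illustrative one-dimensional map

def dT(x):
    return 4.0 - 8.0 * x           # its derivative

print(metric_entropy_estimate(T, dT, x0=0.2), math.log(2.0))  # both ~0.693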
A crucial feature of metric entropy is its stable character. It is a slowly varying function in chaotic dynamical systems and thus a preferred means of stable description. It is the metric entropy for solutions to the Einstein field equations that leads to cosmological implications.

III. DISCUSSION

There is nothing more suggestive in the classical descriptions of entropy than the existence of a connection between entropy and the "arrow of time." Time appears to be asymmetric; one can recall the past but not the future. In the classical description this asymmetry manifests itself as irreversibility: heat does not flow from cold to hot bodies. This asymmetry is revealed in the boundary conditions of the system and not in the dynamical equations describing the evolution of the system.

Turning to the more fundamental rules of quantum mechanics to formulate a microscopic description of entropy is helpful. Using the information theoretic approach, entropy increase is understood as the loss of information about the internal configuration of the system during its time evolution. One associates this loss of information with an increased randomness of the system, i.e., irreversibility. Although there does not exist a satisfactory microscopic time asymmetric equation of motion, this approach provides a consistent means of bridging the gap between microscopic and macroscopic phenomena. Until recently this would have appeared sufficient. However, a dynamic time asymmetric event has been discovered: the CP violating K⁰ decay. By virtue of the CPT theorem this implies a T invariance violation, a breakdown of microscopic reversibility. Any time asymmetric microscopic equation of motion must be capable of describing the K⁰ decay. If such an equation is to be successful one expects the time invariant equations to be derivable from it.

The concept of black-hole entropy adds another insight to understanding reversibility. It is difficult to imagine an event more graphically irreversible than gravitational collapse. Yet Hawking has proposed a mechanism capable of dismantling a black hole: quantum-mechanical evaporation.28 This mechanism describes the black hole as emitting particles until the final naked singularity state violently explodes. As a result, black-hole evaporation provides a means of producing a high entropy per baryon ratio for the universe. Herein lies a problem.

The theory of black-hole evaporation weaves together aspects of general relativity and quantum mechanics; it is an essential ingredient on the path to a quantum theory of gravitation. However, it is perfectly time symmetric and leads to no distinguishable difference between a black hole and its time reversed counterpart, a white hole. Furthermore, it leads to an estimated value of the entropy per baryon ratio many orders of magnitude above the observed value. In addition, the theory incorporates an observer dependent space-time geometry as a quantum aspect.

There is a somewhat more radical approach to this problem due to Penrose.29,30 One maintains a difference between black and white holes by invoking the Weyl curvature hypothesis as the necessary constraint on singularities. This highly time asymmetric hypothesis states that the Weyl conformal curvature vanishes as the singularity is approached. This leads to a search for new laws of physics which are time asymmetric, the possibility that quantum linear superposition fails when space-time geometry is involved, and a search for a time asymmetric theory of gravity. Both views provide clear evidence of the need for further work in understanding entropy and the second law.

There are other considerations for the description of microscopic phenomena in the presence of a gravitational field. If one follows the general rule that the weaker the interaction the more symmetry conditions it violates, then one may abandon CPT invariance. Furthermore, space-time may not be smooth on a microscopic scale. The structure may be more reminiscent of a fractal geometry.31,32 These kinds of ideas are difficult to handle and are in need of further investigation. The concept of metric entropy goes much farther than just chaos. It is responsible for introducing the concept of gravitational turbulence, in analogy to hydrodynamic turbulence, as a new avenue of development.8 Perhaps more interesting is the idea of gravitational entropy of a cosmological space-time.8,29 This manifests itself as a function of the Weyl conformal curvature; thus it is a purely general relativistic effect having no Newtonian analog. As a result, the evolution of the universe is endowed with a significant time asymmetry. Analyzing the features of gravitation and entropy has led Davies to conclude33:

    The origin of all thermodynamic irreversibility in the real universe depends ultimately on gravitation. Any gravitating universe that can exist and contains more than one type of interacting material must be asymmetric in time, both globally in its motion, and locally in its thermodynamics.

The concept of entropy takes one from macroscopic irreversibility through statistical and quantum-mechanical microstates to quantum gravity and time asymmetry. Such pervasiveness speaks for the power and unity of entropy as a means of understanding all physical phenomena.

ACKNOWLEDGMENTS

I wish to thank A. Hobson, M. Lieber, and O. Zinke for useful discussions and criticisms.
1. A. Hobson, Concepts of Statistical Mechanics (Gordon and Breach, New York, 1971).
2. F. Reif, Fundamentals of Statistical and Thermal Physics (McGraw-Hill, New York, 1965).
3. R. P. Feynman, Statistical Mechanics: A Set of Lectures (Benjamin/Cummings, Reading, MA, 1981).
4. A. Wehrl, Rev. Mod. Phys. 50, 221 (1978).
5. L. Brillouin, Science and Information Theory (Academic, New York, 1962).
6. J. D. Bekenstein, Phys. Rev. D 7, 2333 (1973).
7. E. Ott, Rev. Mod. Phys. 53, 655 (1981).
8. J. D. Barrow, Phys. Rep. 85, 1 (1982).
9. Time in Science and Philosophy, edited by J. Zeman (Elsevier, New York, 1971).
10. P. C. W. Davies, The Physics of Time Asymmetry (University of California, Berkeley, 1974).
11. Reference 2, p. 215.
12. A pure state corresponds to a complete description of the system. A mixed state is an incomplete description. Classically, a pure state is given by the phase space point x = (q_1,...,q_f, p_1,...,p_f) for all q_i and p_i available to a system of f degrees of freedom. Quantum mechanically a state is described by ψ(q_1,...,q_f) or by the density matrix ρ. A necessary and sufficient condition for this to be a pure state is ρ = ρ². The uncertainty in a mixed quantum state is not the quantum uncertainty of the state.
13. Formula first given by J. von Neumann, Gött. Nachr., 273 (1927).
14. Reference 4, p. 227.
15. The first connection between entropy and information can be found in L. Szilard, "On the Decrease of Entropy in a Thermodynamic System by the Intervention of Intelligent Beings" (1929). The original German and an English translation may be found in The Collected Works of Leo Szilard: Scientific Papers, edited by B. T. Feld and G. W. Szilard (MIT, Cambridge, MA, 1972).
16. C. Shannon and W. Weaver, The Mathematical Theory of Communication (University of Illinois, Urbana, 1949).
17. Variations of this formula will be found in the literature, especially when considering nondiscrete cases.
18. The quantity −ΔS is often referred to as the negentropy.
19. Problems of the EPR type may arise when considering information transfer; see Ref. 9, p. 17.
20. E. T. Jaynes, Phys. Rev. 106, 620 (1957).
21. E. T. Jaynes, in Statistical Physics, Brandeis Summer Institute 1962, edited by K. W. Ford (Benjamin, New York, 1963).
22. R. Penrose and R. M. Floyd, Nature 229, 177 (1971).
23. S. W. Hawking, Phys. Rev. Lett. 26, 1344 (1971).
24. J. D. Bekenstein, Gen. Relativ. Gravitation 14, 355 (1982).
25. P. Collet and J. P. Eckmann, Iterated Maps on the Interval as Dynamical Systems (Birkhauser, Boston, 1980).
26. A. N. Kolmogorov, Dokl. Akad. Nauk SSSR 124, 754 (1959).
27. Other quantities, such as topological entropy, will be found in the literature in characterizing chaotic systems.
28. S. W. Hawking, Commun. Math. Phys. 43, 199 (1975).
29. R. Penrose, in General Relativity: An Einstein Centenary Survey, edited by S. W. Hawking and W. Israel (Cambridge University, New York, 1979).
30. R. Penrose, in Quantum Gravity 2: A Second Oxford Symposium, edited by C. J. Isham, R. Penrose, and D. W. Sciama (Clarendon, Oxford, 1981).
31. P. C. W. Davies, in Ref. 30.
32. B. B. Mandelbrot, Fractals (Freeman, London, 1977).
33. Reference 10, p. 109.
Stirling's cycle and the second law of thermodynamics
Raul A. Simon
Universidad de Magallanes, Casilla 113-D, Punta Arenas, Chile
(Received 6 October 1982; accepted for publication 1 September 1983)
In order to prove the general validity of the second law of thermodynamics, several instances of
the Stirling cycle are studied, especially in relation to their efficiency.
I. INTRODUCTION

Some time ago, the Sociedad Cientifica de Chile sponsored a series of conferences at the Biblioteca Nacional in Santiago, with the engineer, Froimovich, about the second law of thermodynamics. At the first of these lectures, the author of the present paper was given the mission of defending the truth of the second law against the unorthodox views of Froimovich. The whole occurrence was rather unsuccessful because, while the debate took place, a very noisy jazz concert was going on across the hallway. Apparently, no one thought that the two events might interfere!

The argument presented by the author at the time will be repeated here, with some improvements, in Sec. II. In Sec. III we perform a similar calculation for a more general type of gas, i.e., one obeying a virial-type equation of state. In Sec. IV we study a Stirling cycle in the vicinity of a change of phase (e.g., between liquid and vapor). In Sec. V, we study the same kind of cycle for a magnetic substance. Finally, in Sec. VI, some comments of a general nature are advanced.

II. THE STIRLING CYCLE

The Stirling cycle, nowadays rather forgotten, is depicted in Fig. 1. It is composed of two isothermals, ab and cd, and two isochores, bc and da. In Ref. 1 Froimovich considers such a cycle. Starting from the correct formula

η = 1 − p_a/p_d,   (1)

he states that if p_a be kept fixed and p_d lowered by using a substance with a different "expansion coefficient," then the efficiency of the cycle can be brought infinitesimally close to unity.

Before deriving Eq. (1), and seeing that η ≤ 1 − (T_1/T_2), we must point out that the expansion coefficient mentioned by Froimovich is irrelevant in this context; what he probably means is (∂p/∂T)_V.

The substance considered by Froimovich as undergoing a Stirling cycle is a so-called "harmonic gas," i.e., one that satisfies the following conditions:
(i) Its internal energy depends only on temperature:

u = u(T).   (2)