
Learning Objectives

You are expected to be able to define and explain the significance of terms identified in bold.


A reversible process is one carried out in infinitesimal steps after which, when undone, both the system and surroundings (that is, the world) remain unchanged (see the example of gas expansion-compression below). Although true reversible change cannot be realized in practice, it can always be approximated. As a process is carried out in a more reversible manner, the value of w approaches its maximum possible value, and q approaches its minimum possible value. Although q is not a state function, the quotient qrev/T is, and is known as the entropy. Entropy is a measure of the degree of spreading and sharing of thermal energy within a system. The entropy of a substance increases with its molecular weight and complexity and with temperature. The entropy also increases as the pressure or concentration becomes smaller. Entropies of gases are much larger than those of condensed phases. The absolute entropy of a pure substance at a given temperature is the sum of all the entropy it would acquire on warming from absolute zero (where S = 0) to that temperature.

Entropy is one of the most fundamental concepts of physical science, with far-reaching consequences ranging from cosmology to chemistry. It is also widely misrepresented as a measure of "disorder", as we discuss below. The German physicist Rudolf Clausius originated the concept as "energy gone to waste" in the early 1850s, and it went through a number of more precise definitions over the next 15 years.

Previously, we explained how the tendency of thermal energy to disperse as widely as possible is what drives all spontaneous processes, including, of course, chemical reactions. We now need to understand how the direction and extent of the spreading and sharing of energy can be related to measurable thermodynamic properties of substances, that is, of reactants and products.

You will recall that when a quantity of heat q flows from a warmer body to a cooler one, permitting the available thermal energy to spread into and populate more microstates, the ratio q/T measures the extent of this energy spreading. It turns out that we can generalize this to other processes as well, but there is a difficulty with using q because it is not a state function; that is, its value is dependent on the pathway or manner in which a process is carried out. This means, of course, that the quotient q/T cannot be a state function either, so we are unable to use it to get differences between reactants and products as we do with the other state functions. The way around this is to restrict our consideration to a special class of pathways that are described as reversible.


Reversible and irreversible changes

A change is said to occur reversibly when it can be carried out in a series of infinitesimal steps, each one of which can be undone by making a similarly minute change to the conditions that bring the change about. For example, the reversible expansion of a gas can be achieved by reducing the external pressure in a series of infinitesimal steps; reversing any step will restore the system and the surroundings to their previous state. Similarly, heat can be transferred reversibly between two bodies by changing the temperature difference between them in infinitesimal steps, each of which can be undone by reversing the temperature difference.

The most widely cited example of an irreversible change is the free expansion of a gas into a vacuum. Although the system can always be restored to its original state by recompressing the gas, this would require that the surroundings perform work on the gas. Since the gas does no work on the surroundings in a free expansion (the external pressure is zero, so PΔV = 0), there will be a permanent change in the surroundings. Another example of irreversible change is the conversion of mechanical work into frictional heat; there is no way, by reversing the motion of a weight along a surface, that the heat released due to friction can be restored to the system.

Figure (PageIndex1): Reversible vs. Irreversible Expansions and Compressions

These diagrams show the same expansion and compression ±ΔV carried out in different numbers of steps ranging from a single step at the top to an "infinite" number of steps at the bottom. As the number of steps increases, the processes become less irreversible; that is, the difference between the work done in expansion and that required to re-compress the gas diminishes. In the limit of an "infinite" number of steps (bottom), these work terms are identical, and both the system and surroundings (the "world") are unchanged by the expansion-compression cycle. In all other cases the system (the gas) is restored to its initial state, but the surroundings are forever changed.
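To see this convergence numerically, here is a minimal sketch (not part of the original figure) that computes the work recovered when one mole of an ideal gas expands isothermally from V1 to V2 in N pressure-reduction steps; the volume and temperature values are arbitrary illustrative choices.

```python
# Illustration of Figure 1: isothermal expansion of 1 mol of ideal gas
# carried out in N steps.  As N grows, the work extracted approaches
# the reversible limit nRT ln(V2/V1).
import numpy as np

R, T, n = 8.314, 298.0, 1.0   # J/(K mol), K, mol
V1, V2 = 0.010, 0.020         # m^3 (illustrative values, not from the text)

def expansion_work(N):
    """Work done BY the gas when the external pressure is dropped in N equal-volume steps."""
    V = np.linspace(V1, V2, N + 1)
    # in each step the gas pushes against the constant final pressure of that step
    P_ext = n * R * T / V[1:]
    return np.sum(P_ext * np.diff(V))

w_rev = n * R * T * np.log(V2 / V1)   # reversible (maximum) work
for N in (1, 10, 100, 1000):
    print(f"{N:5d} steps: w = {expansion_work(N):7.1f} J (reversible limit {w_rev:.1f} J)")
```

A single step recovers only about 1239 J here, while 1000 steps comes within a joule of the 1717 J reversible limit.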


Definition: Reversible Changes

A reversible change is one carried out in such a way that, when undone, both the system and surroundings (that is, the world) remain unchanged.


Thus when a process is carried out reversibly, the w-term in the First Law expression has its greatest possible value, and the q-term is at its smallest. These special quantities wmax and qmin (the latter denoted qrev and pronounced "q-reversible") have unique values for any given process and are therefore state functions.


Work and reversibility

For a process that reversibly exchanges a quantity of heat qrev with the surroundings, the entropy change is defined as

\[ \Delta S = \dfrac{q_{rev}}{T} \label{23.2.1} \]

This is the basic way of evaluating ΔS for constant-temperature processes such as phase changes, or the isothermal expansion of a gas. For processes in which the temperature is not constant such as heating or cooling of a substance, the equation must be integrated over the required temperature range, as discussed below.
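As a worked example of this equation, consider the melting of ice at its normal melting point; the numbers below are standard handbook values, not data from this text.

```python
# Entropy of fusion of water from dS = q_rev / T at constant temperature.
dH_fus = 6010.0        # J/mol, enthalpy of fusion of ice (handbook value)
T_m    = 273.15        # K, normal melting point

dS_fus = dH_fus / T_m  # q_rev for melting at constant T is just dH_fus
print(f"dS(fusion, H2O) = {dS_fus:.1f} J/(K mol)")   # ~22.0 J/(K mol)
```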

Figure (PageIndex2): Note that the reversible condition implies wmax and qmin. The impossibility of extracting all of the internal energy as work is essentially a statement of the Second Law.

If no real process can take place reversibly, what use is an expression involving qrev? This is a rather fine point that you should understand: although transfer of heat between the system and surroundings is impossible to achieve in a truly reversible manner, this idealized pathway is only crucial for the definition of ΔS; by virtue of its being a state function, the same value of ΔS will apply when the system undergoes the same net change via any pathway. For example, the entropy change a gas undergoes when its volume is doubled at constant temperature will be the same regardless of whether the expansion is carried out in 1000 tiny steps (as reversible as patience is likely to allow) or by a single-step (as irreversible a pathway as you can get!) expansion into a vacuum.


The physical meaning of entropy

Entropy is a measure of the degree of spreading and sharing of thermal energy within a system. This “spreading and sharing” can be spreading of the thermal energy into a larger volume of space or its sharing amongst previously inaccessible microstates of the system. The following table shows how this concept applies to a number of common processes.

Table (PageIndex1): Observations and Explanations in terms of Entropy Changes

| system and process | source of entropy increase of system |
|---|---|
| A deck of cards is shuffled, or 100 coins, initially heads up, are randomly tossed. | This has nothing to do with entropy, because macro objects are unable to exchange thermal energy with the surroundings within the time scale of the process. |
| Two identical blocks of copper, one at 20°C and the other at 40°C, are placed in contact. | The cooler block contains more unoccupied microstates, so heat flows from the warmer block until equal numbers of microstates are populated in the two blocks. |
| A gas expands isothermally to twice its initial volume. | A constant amount of thermal energy spreads over a larger volume of space. |
| 1 mole of water is heated by 1 C°. | The increased thermal energy makes additional microstates accessible. (The increase is by a factor of about 10^(20,000,000,000,000,000,000,000).) |
| Equal volumes of two gases are allowed to mix. | The effect is the same as allowing each gas to expand to twice its volume; the thermal energy in each is now spread over a larger volume. |
| One mole of dihydrogen, H2, is placed in a container and heated to 3000 K. | Some of the H2 dissociates to H because at this temperature there are more thermally accessible microstates in the 2 moles of H. |
| The above reaction mixture is cooled to 300 K. | The composition shifts back to virtually all H2 because this molecule contains more thermally accessible microstates at low temperatures. |

Entropy is an extensive quantity; that is, it is proportional to the quantity of matter in a system; thus 100 g of metallic copper has twice the entropy of 50 g at the same temperature. This makes sense because the larger piece of copper contains twice as many quantized energy levels able to contain the thermal energy.

Entropy and "disorder"

Entropy is still described, particularly in older textbooks, as a measure of disorder. In a narrow technical sense this is correct, since the spreading and sharing of thermal energy does have the effect of randomizing the disposition of thermal energy within a system. But to simply equate entropy with "disorder" without further qualification is extremely misleading because it is far too easy to forget that entropy (and thermodynamics in general) applies only to molecular-level systems capable of exchanging thermal energy with the surroundings. Carrying these concepts over to macro systems may yield compelling analogies, but it is no longer science. It is far better to avoid the term "disorder" altogether in discussing entropy.


The relation between the entropy of a system and the number of microstates it can occupy is given by the Boltzmann equation S = k ln Ω, in which k is the Boltzmann constant (1.381 × 10^–23 J K^–1) and Ω (omega) is the number of microstates that correspond to a given macrostate of the system. The more such microstates, the greater is the probability of the system being in the corresponding macrostate. For any physically realizable macrostate, the quantity Ω is an unimaginably large number, typically around 10^(10^25) for one mole. By comparison, the number of atoms that make up the earth is about 10^50. But even though it is beyond human comprehension to compare numbers that seem to verge on infinity, the thermal energy contained in actual physical systems manages to discover the largest of these quantities with no difficulty at all, quickly settling in to the most probable macrostate for a given set of conditions.
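To see that such astronomical values of Ω nevertheless yield ordinary-sized entropies, one can evaluate S = k ln Ω by working entirely with logarithms, since Ω itself is far too large to represent directly. A quick sketch:

```python
# Evaluate S = k ln(Omega) for Omega = 10^(10^25), the figure quoted above.
import math

k = 1.380649e-23              # J/K, Boltzmann constant
log10_Omega = 1e25            # Omega = 10**(10**25), so log10(Omega) = 10**25
ln_Omega = log10_Omega * math.log(10.0)

S = k * ln_Omega
print(f"S = {S:.0f} J/K")     # ~318 J/K, the magnitude of typical molar entropies
```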

The reason S depends on the logarithm of Ω is easy to understand. Suppose we have two systems (containers of gas, say) with S1, Ω1 and S2, Ω2. If we now redefine this as a single system (without actually mixing the two gases), then the entropy of the new system will be

\[ S = S_1 + S_2 \]

but the number of microstates will be the product Ω1Ω2, because for each state of system 1, system 2 can be in any of Ω2 states. Because

\[ \ln(\Omega_1 \Omega_2) = \ln \Omega_1 + \ln \Omega_2 \]

it follows that

\[ S = k \ln(\Omega_1 \Omega_2) = k \ln \Omega_1 + k \ln \Omega_2 = S_1 + S_2 \]

Hence, the additivity of the entropy is preserved.

If someone could make a movie showing the motions of individual atoms of a gas or of a chemical reaction system in its equilibrium state, there is no way you could determine, on watching it, whether the film is playing in the forward or reverse direction. Physicists describe this by saying that such systems possess time-reversal symmetry; neither classical nor quantum mechanics offers any clue to the direction of time.

However, when a movie showing changes at the macroscopic level is being played backward, the weirdness is starkly apparent to anyone; if you see books flying off of a table top or tea being sucked back up into a tea bag (or a chemical reaction running in reverse), you will immediately know that something is wrong. At this level, time clearly has a direction, and it is often noted that because the entropy of the world as a whole always increases and never decreases, it is entropy that gives time its direction. It is for this reason that entropy is sometimes referred to as "time's arrow".

But there is a problem here: conventional thermodynamics is able to define entropy change only for reversible processes which, as we know, take infinitely long to perform. So we are faced with the apparent paradox that thermodynamics, which deals only with differences between states and not the journeys between them, is unable to describe the very process of change by which we are aware of the flow of time.

The direction of time is revealed to the chemist by the progress of a reaction toward its state of equilibrium; once equilibrium is reached, the net change that leads to it ceases, and from the standpoint of that particular system, the flow of time stops. If we extend the same idea to the much larger system of the world as a whole, this leads to the concept of the "heat death of the universe" that was mentioned briefly in the previous lesson.


Absolute Entropies

Energy values, as you know, are all relative, and must be defined on a scale that is completely arbitrary; there is no such thing as the absolute energy of a substance, so we can arbitrarily define the enthalpy or internal energy of an element in its most stable form at 298 K and 1 atm pressure as zero. The same is not true of the entropy; since entropy is a measure of the "dilution" of thermal energy, it follows that the less thermal energy available to spread through a system (that is, the lower the temperature), the smaller will be its entropy. In other words, as the absolute temperature of a substance approaches zero, so does its entropy. This principle is the basis of the Third law of thermodynamics, which states that the entropy of a perfectly-ordered solid at 0 K is zero.

Third law of thermodynamics

The entropy of a perfectly-ordered solid at 0 K is zero.

The absolute entropy of a substance at any temperature above 0 K must be determined by calculating the increments of heat q required to bring the substance from 0 K to the temperature of interest, and then summing the ratios q/T. Two kinds of experimental measurements are needed:

- The enthalpies associated with any phase changes the substance may undergo within the temperature range of interest. Melting of a solid and vaporization of a liquid correspond to sizeable increases in the number of microstates available to accept thermal energy, so as these processes occur, energy will flow into a system, filling these new microstates to the extent required to maintain a constant temperature (the freezing or boiling point); these inflows of thermal energy correspond to the heats of fusion and vaporization. The entropy increase associated with melting, for example, is just ΔHfusion/Tm.
- The heat capacity C of a phase, which expresses the quantity of heat required to change the temperature by a small amount ΔT, or more precisely, by an infinitesimal amount dT. Thus the entropy increase brought about by warming a substance over a range of temperatures that does not encompass a phase transition is given by the sum of the quantities C dT/T for each increment of temperature dT. This is of course just the integral

\[ S_{0 \rightarrow T} = \int_{0}^{T} \dfrac{C_p}{T}\, dT \]

Because the heat capacity is itself slightly temperature dependent, the most precise determinations of absolute entropies require that the functional dependence of C on T be used in the above integral in place of a constant C.

\[ S_{0 \rightarrow T} = \int_{0}^{T} \dfrac{C_p(T)}{T}\, dT \]

When this is not known, one can take a series of heat capacity measurements over narrow temperature increments ΔT and measure the area under each section of the curve in Figure (PageIndex3).
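This piecewise procedure is easy to carry out numerically. The sketch below uses invented (Cp, T) pairs standing in for real calorimetric data, approximates the area under the Cp/T curve with the trapezoid rule, and handles the unmeasured region near 0 K with the common Debye T³ extrapolation (under which S at the lowest measured point is simply Cp/3); it assumes no phase change occurs within the range.

```python
# Absolute entropy from tabulated heat-capacity measurements.
import numpy as np

# hypothetical calorimetric data for a solid (NOT real measurements)
T  = np.array([ 20.,  50., 100., 150., 200., 250., 298.15])  # K
Cp = np.array([ 2.0, 10.0, 20.0, 26.0, 30.0, 33.0, 35.0])    # J/(K mol)

# area under the Cp/T curve between 20 K and 298 K
S_heating = np.trapz(Cp / T, T)

# below the first data point, assume the Debye law Cp = a*T^3,
# which integrates to S(T0) = Cp(T0)/3
S_low = Cp[0] / 3.0

print(f"S(298 K) = {S_low + S_heating:.1f} J/(K mol)")
```

If the substance melted or boiled within the range, the corresponding ΔH/T terms would be added at the transition temperatures, as described above.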


Figure (PageIndex3): Heat capacity/temperature as a function of temperature

The area under each section of the plot represents the entropy change associated with heating the substance through an interval ΔT. To this must be added the enthalpies of melting, vaporization, and of any solid-solid phase changes. Values of Cp for temperatures near zero are not measured directly, but can be estimated from quantum theory.

Figure (PageIndex4): Molar entropy as a function of temperature. The phase-change increments ΔHfusion/Tm and ΔHvap/Tb are added to obtain the absolute entropy at temperature T.

As shown in Figure (PageIndex4) above, the entropy of a substance increases with temperature, and it does so for two reasons:

- As the temperature rises, more microstates become accessible, allowing thermal energy to be more widely dispersed. This is reflected in the gradual increase of entropy with temperature.
- The molecules of solids, liquids, and gases have increasingly greater freedom to move around, facilitating the spreading and sharing of thermal energy. Phase changes are therefore accompanied by massive and discontinuous increases in the entropy.

Standard Entropies of substances


Table (PageIndex2): Standard entropies of some gases at 298 K, J K–1 mol–1

| He | 126 | H2  | 131 | CH4     | 186 |
| Ne | 146 | N2  | 192 | H2O(g)  | 187 |
| Ar | 155 | CO  | 197 | CO2     | 213 |
| Kr | 164 | F2  | 203 | C2H6    | 229 |
| Xe | 170 | O2  | 205 | n-C3H8  | 270 |
|    |     | Cl2 | 223 | n-C4H10 | 310 |

Table (PageIndex3): Standard entropies of some solid elements at 298 K, J K–1 mol–1

| C(diamond) | C(graphite) | Fe   | Pb   | Na   | S(rhombic) | Si   | W    |
| 2.5        | 5.7         | 27.1 | 51.0 | 64.9 | 32.0       | 18.9 | 33.5 |

Table (PageIndex4): Standard entropy of water at 298 K, J K–1 mol–1

| solid | liquid | gas |
| 41    | 70     | 186 |


For the isothermal expansion of one mole of an ideal gas from volume V1 to volume V2, the entropy change is

\[ \Delta S = R \ln \left( \dfrac{V_2}{V_1} \right) \label{23.2.4} \]
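As a quick worked example of this formula, doubling the volume of one mole of an ideal gas at constant temperature increases its entropy by R ln 2:

```python
# Entropy of isothermal expansion: dS = R ln(V2/V1) per mole.
import math

R = 8.314                          # J/(K mol)
dS = R * math.log(2.0)             # V2/V1 = 2
print(f"dS = {dS:.2f} J/(K mol)")  # ~5.76 J/(K mol)
```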



How thermal energy is stored in molecules

Thermal energy is the portion of a molecule's energy that is proportional to its temperature, and thus relates to motion at the molecular scale. What kinds of molecular motions are possible? For monatomic molecules, there is only one: actual movement from one location to another, which we call translation. Since there are three directions in space, all molecules possess three modes of translational motion.

For polyatomic molecules, two additional kinds of motions are possible. One of these is rotation; a linear molecule such as CO2 in which the atoms are all laid out along the x-axis can rotate along the y- and z-axes, while molecules having less symmetry can rotate about all three axes. Thus linear molecules possess two modes of rotational motion, while non-linear ones have three rotational modes. Finally, molecules consisting of two or more atoms can undergo internal vibrations. For freely moving molecules in a gas, the number of vibrational modes or patterns depends on both the number of atoms and the shape of the molecule, and it increases rapidly as the molecule becomes more complicated.
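For a molecule of N atoms there are 3N motional degrees of freedom in all, leaving 3N − 5 vibrational modes for a linear molecule and 3N − 6 for a nonlinear one. A small helper (an illustration, not from the original text) that applies this standard counting:

```python
# Count the translational, rotational, and vibrational modes of a
# polyatomic molecule from its atom count and shape.
def mode_count(n_atoms: int, linear: bool) -> dict:
    rotations = 2 if linear else 3          # linear molecules: 2 rotational axes
    vibrations = 3 * n_atoms - 3 - rotations  # i.e. 3N-5 (linear) or 3N-6
    return {"translational": 3, "rotational": rotations, "vibrational": vibrations}

print(mode_count(2, linear=True))    # H2 : 3 trans, 2 rot, 1 vib
print(mode_count(3, linear=True))    # CO2: 3 trans, 2 rot, 4 vib
print(mode_count(3, linear=False))   # H2O: 3 trans, 3 rot, 3 vib
```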

Figure (PageIndex5)

The relative populations of the quantized translational, rotational and vibrational energy states of a typical diatomic molecule are depicted by the thickness of the lines in this schematic (not-to-scale!) diagram. The colored shading indicates the total thermal energy available at a given temperature. The numbers at the top show order-of-magnitude spacings between adjacent levels. It is readily apparent that virtually all the thermal energy resides in translational states.

Notice the greatly different spacing of the three kinds of energy levels. This is extremely important because it determines the number of energy quanta that a molecule can accept, and, as the following illustration shows, the number of different ways this energy can be distributed amongst the molecules.

Figure (PageIndex6)

Each of these ten possibilities represents a distinct microstate that will describe the system at any instant in time. Those microstates that possess identical distributions of energy among the accessible quantum levels (and differ only in which particular molecules occupy the levels) are known as configurations. Because all microstates are equally probable, the probability of any one configuration is proportional to the number of microstates that can produce it. Thus in the system shown above, the configuration labeled ii will be observed 60% of the time, while iii will occur only 10% of the time.

As the number of molecules and the number of quanta increases, the number of accessible microstates grows explosively; if 1000 quanta of energy are shared by 1000 molecules, the number of available microstates will be around 10^600, a number that greatly exceeds the number of atoms in the observable universe! The number of possible configurations (as defined above) also increases, but in such a way as to greatly reduce the probability of all but the most probable configurations. Thus for a sample of a gas large enough to be observable under normal conditions, only a single configuration (energy distribution amongst the quantum states) need be considered; even the second-most-probable configuration can be neglected.
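This counting is easy to reproduce. Assuming the figure above shows three quanta shared among three molecules (which reproduces both the ten microstates and the 60%/10% configuration probabilities quoted), the sketch below enumerates the microstates, groups them into configurations, and then applies the same counting to 1000 quanta among 1000 molecules, where the number of microstates is the binomial coefficient C(1999, 999):

```python
# Microstates and configurations for quanta shared among molecules.
from itertools import product
from collections import Counter
from math import lgamma, log

# every way 3 quanta can sit on 3 distinguishable molecules
states = [s for s in product(range(4), repeat=3) if sum(s) == 3]
print(len(states), "microstates")                    # 10

# configurations = microstates with the same occupancy pattern
configs = Counter(tuple(sorted(s, reverse=True)) for s in states)
for pattern, count in sorted(configs.items()):
    print(pattern, f"{100 * count // len(states)}%") # (1,1,1) 10%, (2,1,0) 60%, (3,0,0) 30%

# 1000 quanta among 1000 molecules: C(1999, 999) microstates (stars and bars)
log10_W = (lgamma(2000) - lgamma(1000) - lgamma(1001)) / log(10)
print(f"~10^{log10_W:.0f} microstates")              # ~10^600
```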

The bottom line: any collection of molecules large enough in numbers to have chemical significance will have its thermal energy distributed over an unimaginably large number of microstates. The number of microstates increases exponentially as more energy states ("configurations" as defined above) become accessible owing to:

- addition of energy quanta (higher temperature),
- increase in the number of molecules (resulting from dissociation, for example),
- increase in the volume of the system (which decreases the spacing between energy states, allowing more of them to be populated at a given temperature).

Why do gases tend to expand, but never contract?

Everybody knows that a gas, if left to itself, will tend to expand, filling the volume within which it is confined completely and uniformly. What "drives" this expansion? At the simplest level it is clear that with more space available, random motions of the individual molecules will inevitably disperse them throughout the space. But as we mentioned above, the allowed energy states that molecules can occupy are spaced more closely in a larger volume than in a smaller one. The larger the volume available to the gas, the greater the number of microstates its thermal energy can occupy. Since all such states within the thermally accessible range of energies are equally probable, the expansion of the gas can be viewed as a consequence of the tendency of thermal energy to be spread and shared as widely as possible. Once this has happened, the probability that this sharing of energy will reverse itself (that is, that the gas will spontaneously contract) is so minute as to be unthinkable.
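To put "so minute as to be unthinkable" into numbers, here is an illustrative estimate (not from the original text): the probability that random motion leaves all N independently-moving molecules in one half of the container is (1/2)^N.

```python
# Probability that all N molecules happen to occupy one half of the box.
from math import log10

for N in (10, 100, 6.022e23):    # ten molecules up to one mole
    print(f"N = {N:.3g}: P = 10^({-N * log10(2):.4g})")
```

Even for only 100 molecules the probability is about 10^(-30); for a mole it is 10 raised to a negative power of order 10^23.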

Figure (PageIndex7)

Imagine a gas initially confined to one half of a box (Figure (PageIndex7)). The barrier is then removed so that it can expand into the full volume of the container. We know that the entropy of the gas will increase as the thermal energy of its molecules spreads into the enlarged space. In terms of the spreading of thermal energy, Figure (PageIndex8) may be helpful. The tendency of a gas to expand is due to the more closely-spaced thermal energy states in the larger volume.

Figure (PageIndex8): The pink shading represents the thermally-accessible range of microstates for a given temperature.



Entropy of mixing and dilution

Mixing and dilution really amount to the same thing, especially for idea gases. Replace the pair of containers shown above with one containing two kinds of molecules in the separate sections (Figure (PageIndex9)). When we remove the barrier, the "red" and "blue" molecules will each expand into the space of the other. (Recall Dalton"s Law that "each gas is a vacuum to the other gas".) However, notice that although each gas underwent an expansion, the overall process amounts to what we call "mixing".