**BIOCHEMISTRY TOPICS**

### Entropy

Entropy (*S*), a state function definable in classical and statistical thermodynamics. Qualitative assessment of entropy changes. Δ*S* for isothermal expansion of an ideal gas. Engines and entropy.

In order to account for the spontaneity or directionality of processes, the concept of **entropy**
is defined and incorporated into what is known as the
**second law of thermodynamics**.
Roughly speaking, entropy (symbolized *S*) is a quantitative measure of the number
of ways that energy can be distributed within a system. Entropy can be defined and measured as a
thermodynamic quantity. Furthermore, it can be shown that entropy is a state function.
These features of entropy are essential to the formulation of the second law.
Spontaneous processes such as the attainment of thermal equilibrium or the mixing
of gases or liquids can be shown to result in an increase in total
entropy, meaning the entropy of the system plus the entropy of the
surroundings (*S*_{univ}). Indeed, the statement that in any spontaneous process,
Δ*S*_{univ} > 0 can be taken as a statement of the second law of thermodynamics.

Entropy was originally defined in classical thermodynamics, which made no assumptions about the detailed nature of matter. The advent and establishment of atomic theory led to the treatment of the macroscopic systems of classical thermodynamics as enormous collections of nanoscale particles by the method of statistical mechanics. The explicit incorporation of the atomic and molecular composition of matter into statistical thermodynamics yields the classical quantity entropy in a quite different, yet ultimately equivalent way. In either case, entropy is a state function, just like energy, enthalpy, or any state variable. Both definitions are shown in the figure below.

The classical thermodynamic formulation for entropy was originally due to Clausius (1865). This was followed by Boltzmann's statistical thermodynamic approach in 1877.
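Stated compactly, the two definitions are as follows, where *q*_{rev} is the heat transferred along a reversible path, *W* is the number of microstates consistent with the macroscopic state, and *k*_{B} is Boltzmann's constant:

```latex
\Delta S = \frac{q_{\mathrm{rev}}}{T} \quad \text{(Clausius, classical)}
\qquad\qquad
S = k_{\mathrm{B}} \ln W \quad \text{(Boltzmann, statistical)}
```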

#### Understanding entropy

What is entropy? Is entropy "a measure of disorder in a system"? Is the seeming tendency of your room to get messy an example of the inexorable increase in entropy? What is a microstate? Find out the answers to these important questions and many more at this highly recommended Entropy website by Professor Frank Lambert of Occidental College.

#### Qualitative assessment of entropy changes

We can develop some general rules of thumb for qualitative assessment of entropy changes for various
types of processes. First of all, any system that undergoes a process that increases its temperature
also increases in entropy; *i.e.* Δ*S*_{sys} > 0.
This is evident from the classical definition of entropy, since
*q* is positive for any real system undergoing a process in which
Δ*T*_{sys} > 0. Where
*T* is not constant, the classical definition becomes
*dS* = δ*q*_{rev}/*T*, which must be integrated
in order to obtain Δ*S*_{sys} for the process.
The integral is always non-negative for δ*q* positive, *i.e*.
heat passing into the system from the surroundings.
The quantities *dS* and δ*q*_{rev} are called
**differentials**, and represent an infinitesimal change in entropy and an infinitesimal
amount of heat transfer, respectively.
(We note in passing for now that although *dS* and δ*q*_{rev} are both differentials,
there is a crucial distinction between them, which corresponds mathematically to the fact that
*S* is a state function, while *q* is not. This is described further below.)
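As a sketch of how this integration works in practice, consider heating at constant pressure with a temperature-independent molar heat capacity *C*_{p} (an idealization; the value for liquid water used here is illustrative). Then δ*q*_{rev} = *nC*_{p}*dT*, and integrating *dS* = δ*q*_{rev}/*T* gives Δ*S* = *nC*_{p} ln(*T*_{2}/*T*_{1}):

```python
import math

def delta_S_heating(n_mol, cp_molar, T1, T2):
    """Entropy change (J/K) for heating n mol from T1 to T2 (kelvin),
    assuming a constant molar heat capacity cp_molar (J mol^-1 K^-1)."""
    return n_mol * cp_molar * math.log(T2 / T1)

# Example: heating 1 mol of liquid water (Cp ~ 75.3 J mol^-1 K^-1)
# from 298.15 K to 373.15 K; the result is positive, as expected.
dS = delta_S_heating(1.0, 75.3, 298.15, 373.15)
print(f"dS = {dS:.2f} J/K")
```

Note that the result depends only on the ratio *T*_{2}/*T*_{1}, consistent with *S* being a state function.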

For a phase change, such as the melting of ice to liquid water, or evaporation of water to water vapor,
the change from the more to less condensed phase is always associated with Δ*S* > 0.
For a given substance, the relative change in entropy for the evaporation of a liquid is much greater
than that for the
melting of the solid. In chemical reactions, the change from more
numerous simpler, smaller molecules to fewer, larger and more complex
molecules generally represents a *decrease* in entropy.
(Note that this is in spite of the fact that a larger molecule
has greater entropy than a small one. The kinetic energy in a system
of molecules can be more effectively dispersed by an increase in the
number of particles with translational freedom than it can by
the increased number of vibrational modes within a coherent collection
of the same particles linked by chemical bonds.) In reactions where all
reactants and products are gases, the entropy change will be positive in the
direction of the reaction that produces more total moles of gas. In
reactions occurring in solution, or in reactions of heterogeneous systems,
production of a gas will make a significant positive contribution
to the overall entropy change for the reaction. If a precipitation
reaction occurs, this generally makes a negative contribution
to the entropy of reaction, although in this case we need to be
careful to consider the entropic contribution of water molecules
released when precipitating ions are desolvated.

#### Trends in *S°*_{T, m} values

Looking at trends in standard molar entropies (*S°*_{T, m} values) confirms a number of
qualitative predictions and augments our practical understanding of entropy and the change in entropy associated
with various processes.

**Phases**: solid < liquid < gas

**Molar mass**: *S*° values increase with increasing molar mass

**Molecular complexity**: *S*° values increase with increasing molecular complexity

**Allotropes** have distinct *S*° values

**Dissolution**: Dispersal of a pure substance into a solvent increases its *S*° value

Notes on these trends: Allotropes are different forms of elements, *e.g*.
diamond and graphite are carbon allotropes.
In considering the dissolution of a solute, the Δ*S* of the solvent must also be accounted for.

#### Entropy change for isothermal expansion of an ideal gas

We previously used free expansion of an ideal gas as an example of a spontaneous process. The entropy change for such a process can be calculated, provided we are careful about specifying the path. When 1 mol ideal gas expands into a vacuum to double its original volume, it does so against no external pressure, so the system performs no work. Let us further specify that the process begins and ends at the same temperature.

Since the energy of an ideal gas depends only on its temperature, Δ*U* = 0 for this process.
Therefore, *q* = 0 for the free isothermal expansion of an ideal gas. Does this mean Δ*S* = 0
for this process? Well, no. The process is irreversible, and *q*_{rev} ≠ *q* for this process.
What we need to do is consider a reversible path for the isothermal expansion and use the formula
Δ*S* = *q*_{rev}/*T* to compute Δ*S* for the reversible process.
This will be the same as Δ*S* for free expansion, since *S* is a state function, and in the
reversible and irreversible processes the system has the same beginning and ending states.
The process can be carried out reversibly (*i.e*. along a reversible path) as follows.
The boundary of the system is envisioned as a closed cylinder with a movable piston, so that the volume of the system
can change continuously. The surroundings, in full thermal contact with the system, are held at a constant temperature
*T*, which maintains the system temperature at *T* as well.

The initial state of the system is one mol ideal gas at *P*_{1}, *V*_{1}, *T*, while the final state is described as 1 mol ideal gas at *P*_{2}, *V*_{2}, *T*, with *P*_{1}*V*_{1} = *P*_{2}*V*_{2}, as required by the ideal gas law. To initiate the process, the external pressure exerted by the piston, *P*_{ext}, is allowed to decrease infinitesimally below *P*_{1} for the system. The system adjusts smoothly to this incremental decrease in *P*_{ext} , never varying from an equilibrium state, by an infinitesimal increase in volume to allow the pressure to match *P*_{ext}. The process is reversible, since the direction can be reversed - in this case to compression - by an infinitesimal *increase* in *P*_{ext}. We imagine the process moving in the forward direction as this lowering of external pressure continues, and the volume of the system increases, finally reaching *V*_{2}.

Since *V* = *RT*/*P* for 1 mol ideal gas, we can calculate the work done by the system as an
integral over the volume change:

q_{rev} = −w = ∫ P(V) dV

where *P*(*V*)*dV* = (*RT*/*V*)*dV*.
Using this equation, and the fact that *T* is constant throughout, we can write

ΔS = q_{rev}/T = (1/T) ∫ (RT/V) dV = (RT/T) ∫ (1/V) dV = R ln(V_{2}/V_{1})

where the last equality follows by carrying out the integration from *V*_{1} to
*V*_{2}.
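As a quick numerical check of this result (a sketch, using the molar gas constant *R*):

```python
import math

R = 8.314  # molar gas constant, J mol^-1 K^-1

def delta_S_isothermal(V1, V2, n_mol=1.0):
    """Entropy change (J/K) for isothermal expansion of n mol ideal gas,
    computed along the reversible path: dS = nR ln(V2/V1)."""
    return n_mol * R * math.log(V2 / V1)

# Free expansion to double the original volume, as in the text:
dS = delta_S_isothermal(1.0, 2.0)
print(f"dS = {dS:.3f} J/K")  # R ln 2, positive for the spontaneous expansion
```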

#### Entropy changes for chemical reactions

In cases for which we want to calculate the entropy change Δ*S* for a given chemical reaction,
we follow a procedure similar to that used in calculation of
Δ*H* from standard enthalpies of formation. In this case, we rely on tabulated values of
standard entropies (S° values) for all reactant and product species.

*Example*: Calculate the thermodynamic value Δ*S°* for
the following reaction at 298.15 K (25.00 °C):

N_{2}(g) + 3 H_{2}(g) → 2 NH_{3}(g)

Use the standard entropies given in thermodynamic tables (these are usually found in an appendix of a general chemistry textbook).

*Solution*: We need the *S*° values
for NH_{3}(*g*), N_{2}(*g*), and H_{2}(*g*).
Note that in contrast to the case for standard enthalpies of formation, the *S*° values
for molecular nitrogen gas or diatomic hydrogen gas are not zero.
The values are 192.34 J mol^{−1} K^{−1} {NH_{3}(*g*)},
191.50 J mol^{−1} K^{−1} {N_{2}(*g*)},
and 130.57 J mol^{−1} K^{−1} {H_{2}(*g*)}.
To get the standard entropy change for the reaction, multiply the *S*°
values by their stoichiometric coefficients in the chemical equation above
(along with the unit "mol"), but *using a minus sign for reactants*,
and then add up all the terms, as shown below:

ΔS°_{298} = 2 mol · S°_{298}{NH_{3}(g)} − 1 mol · S°_{298}{N_{2}(g)} − 3 mol · S°_{298}{H_{2}(g)}

ΔS°_{298} = 2 mol · 192.34 J mol^{−1} K^{−1} − 1 mol · 191.50 J mol^{−1} K^{−1} − 3 mol · 130.57 J mol^{−1} K^{−1}

ΔS°_{298} = −198.53 J/K

The resulting thermochemical equation is then

N_{2}(g) + 3 H_{2}(g) → 2 NH_{3}(g)   ΔS° = −198.53 J/K
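The bookkeeping in this example can be sketched programmatically; the *S*° values are those quoted above:

```python
# Standard molar entropies at 298.15 K, J mol^-1 K^-1 (values from the example)
S_std = {"NH3(g)": 192.34, "N2(g)": 191.50, "H2(g)": 130.57}

# Stoichiometric coefficients: negative for reactants, positive for products
coeffs = {"N2(g)": -1, "H2(g)": -3, "NH3(g)": 2}

# Sum coefficient-weighted S° values to get the reaction entropy change (J/K)
dS_rxn = sum(n * S_std[sp] for sp, n in coeffs.items())
print(f"dS_rxn = {dS_rxn:.2f} J/K")  # negative: 4 mol gas -> 2 mol gas
```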

#### Engines and entropy

The history of the development of the steam engine is not only
part of the story about the dawn of the industrial age, but is also
intimately associated with the development of classical thermodynamics.
The theoretical study of engines led to the definition and measurement
of entropy in terms of heat and temperature. The early steam engines
were quite inefficient, and there were many attempts to improve
them. By the latter half of the 1700s, the Scottish inventor James
Watt had made much progress, but even Watt's best engines had an
efficiency of only about 5%. Efficiency is defined for an engine
as the ratio of the work output to the input energy - here, heat
from a fire. For a perfect engine, that ratio would be one,
or an efficiency of 100%. Now, the first law of thermodynamics says
you can't get something for nothing - no engine could ever *exceed*
100% efficiency - but might someday an engine approaching 100% efficiency
be built? In 1824, a Frenchman, Sadi Carnot, published his opus,
*Reflections on the Motive Power of Fire*, that answered
this question with a resounding no. Carnot brilliantly showed that
the efficiency of even an idealized engine could be no greater than
the ratio (*T*_{h} – *T*_{c}) / *T*_{h},
where *T*_{h} is the temperature of the hot reservoir, and
*T*_{c} the temperature of the cold reservoir.
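Carnot's bound is easy to evaluate; the reservoir temperatures below (boiling water and ice water) are illustrative:

```python
def carnot_efficiency(T_hot, T_cold):
    """Maximum efficiency of a heat engine operating between two
    reservoirs at absolute temperatures T_hot and T_cold (kelvin)."""
    return (T_hot - T_cold) / T_hot

# Boiling water as the hot reservoir, ice water as the cold one:
e = carnot_efficiency(373.0, 273.0)
print(f"e = {e:.1%}")  # roughly 27%: far above Watt's ~5%, well below 100%
```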

A useful engine must be a device that works in a cyclic mode. Rather than a steam engine,
Carnot considered an ideal engine whose working substance was an ideal gas.
The high temperature reservoir at temperature *T*_{h} could theoretically differ from
the 373 K of boiling water; similarly, *T*_{c} could be other than the lower limit of
273 K for liquid water. The engine operates by the cyclic path shown at left. The paths are
**reversible** paths, meaning they are idealized processes that occur by an infinitesimal
displacement from equilibrium, and whose direction of spontaneity can be reversed
at any point by an infinitesimal change δ*q* or δ*w* along the path.
This idealized heat engine is referred to as the **Carnot cycle**.
The cycle is composed of two **isothermal** paths, in which *T* is constant,
and two **adiabatic** paths, for which *q* = 0, by definition.

Under these conditions, the efficiency can be computed according to the first law:

ΔU = q_{tot} + w_{tot} = q_{h} + q_{c} + w_{tot} = 0, or: −w_{tot} = q_{h} + q_{c}

This says that the total work that can be done by the engine is equal to the heat absorbed from the
hot reservoir minus that exhausted to the cold reservoir. This means that efficiency, *e*, is given by

e = −w_{tot}/q_{h} = (q_{h} + q_{c})/q_{h}

Since the two isotherms are joined by two adiabats, it can be shown that

(V_{D}/V_{C}) = (V_{A}/V_{B}), so that

−w_{tot} = q_{h} + q_{c} = RT_{h} ln(V_{B}/V_{A}) − RT_{c} ln(V_{B}/V_{A}) = R ln(V_{B}/V_{A})·(T_{h} − T_{c})

Thus, efficiency can be expressed as

e = −w_{tot}/q_{h} = {R ln(V_{B}/V_{A})·(T_{h} − T_{c})} / {RT_{h} ln(V_{B}/V_{A})} = (T_{h} − T_{c})/T_{h}
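The step asserting (*V*_{D}/*V*_{C}) = (*V*_{A}/*V*_{B}) can be filled in using the reversible-adiabat relation for an ideal gas, *TV*^{γ−1} = constant (with γ the heat-capacity ratio). Applying it to the two adiabats B→C and D→A and combining:

```latex
T_h V_B^{\gamma-1} = T_c V_C^{\gamma-1}, \qquad
T_c V_D^{\gamma-1} = T_h V_A^{\gamma-1}
\;\Longrightarrow\;
\left(\frac{V_B}{V_A}\right)^{\gamma-1} = \left(\frac{V_C}{V_D}\right)^{\gamma-1}
\;\Longrightarrow\;
\frac{V_D}{V_C} = \frac{V_A}{V_B}
```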

**Exercise**: Show that equating the two above expressions for efficiency *e*
of a heat engine leads to the relation

q_{h}/T_{h} + q_{c}/T_{c} = 0

for the Carnot cycle, suggesting that the quantity *q*_{rev}/*T* provides the basis
for the definition of a state function (where *q*_{rev} is the heat transferred to the system
along a reversible path).
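A numerical spot check of this relation (not a substitute for the algebra the exercise asks for), using the isothermal heat expressions from the cycle above with illustrative reservoir temperatures and volumes:

```python
import math

R = 8.314                # molar gas constant, J mol^-1 K^-1
T_h, T_c = 500.0, 300.0  # illustrative reservoir temperatures, K
V_A, V_B = 1.0, 3.0      # volumes bounding the hot isotherm (illustrative)

# Heat absorbed on the hot isotherm, q = RT ln(V_final/V_initial), and heat
# rejected on the cold isotherm, where (V_D/V_C) = (V_A/V_B):
q_h = R * T_h * math.log(V_B / V_A)
q_c = -R * T_c * math.log(V_B / V_A)

total = q_h / T_h + q_c / T_c
print(total)  # zero to within floating-point rounding
```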

Clausius was the first to define entropy in the classical sense, which he did after realizing what we showed
in the exercise above, and he defined entropy as Δ*S*
= *q*_{rev}/*T* (or the integral of d*q*_{rev}/*T*
for a process in which temperature is changing). This definition is really a way to compute the
entropy change for an isothermal process, since *q*_{rev}/*T* - the heat transferred to the system in a
*reversible* process divided by the temperature at which it occurs - is a state function,
whereas *q* is not. But no matter how we carry out a change in state, Δ*S*
is the same for the same change in state, or (equivalently) Δ*S* is zero for a cyclic process.
In all irreversible (= not reversible = spontaneous) processes, *q* is not a state function,
and is path-dependent, and we can show that Δ*S*_{univ} > 0.
Mathematically, this means that the integral of a differential quantity of energy transfer as heat,
d*q*, is not uniquely defined as Δ*q*; indeed, we never write Δ*q* or
d*q* for just this reason. The integral of the heat differential is path dependent,
and mathematicians have a more succinct way of describing this by saying it is not an
**exact differential** (the notation δ*q*, rather than d*q*, is used to signify this,
whereas *dU*, *dH*, and *dS* are all exact differentials).
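To make the path dependence concrete, compare the same change of state (isothermal doubling of the volume of 1 mol ideal gas) along two paths; *q* differs between paths, but Δ*S* does not (a sketch with illustrative values):

```python
import math

R, T = 8.314, 298.15  # J mol^-1 K^-1; illustrative temperature, K
V1, V2 = 1.0, 2.0

# Path 1: reversible isothermal expansion; q_rev = RT ln(V2/V1)
q_path1 = R * T * math.log(V2 / V1)

# Path 2: free expansion into vacuum; w = 0 and dU = 0, so q = 0
q_path2 = 0.0

# Entropy is a state function: computed along the reversible path,
# dS applies to both paths, since the end states are identical.
dS = R * math.log(V2 / V1)
print(q_path1, q_path2, dS)
```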
