Entropy captures a broad range of properties of a thermodynamic system. It relates to the number Ω of microscopic configurations, also known as microstates, that are consistent with the macroscopic quantities characterizing the system, i.e. volume, pressure, and temperature.
Lazare Carnot, a French mathematician, suggested in his 1803 paper Fundamental Principles of Equilibrium and Movement that in any machine, momentum losses arise through the acceleration and shocks of the moving parts. Building on this work, in 1824 Lazare's son Sadi Carnot published Reflections on the Motive Power of Fire, which argued that in every heat engine, whenever heat (caloric) falls through a temperature difference, i.e. from a hot to a cold body, motive power or work can be produced. He drew an analogy with a water wheel driven by falling water. This gave an initial idea of the second law of thermodynamics. In the early 19th century, Carnot held, following the Newtonian hypothesis, that both heat and light were indestructible forms of matter that are attracted and repelled by other matter.
In 1843, James Joule, from his experiments on heat and friction, arrived at the first law of thermodynamics, which expresses the concept of energy and its conservation in all processes. However, it says nothing about the effects of friction and dissipation.
During the 1850s and 1860s, the German physicist Rudolf Clausius objected to the supposition that no change occurs in the working body, and gave this change a mathematical expression by interrogating the nature of the inherent loss of usable heat when work is done, for example the heat produced by friction. Clausius described entropy as the transformation content of a body, in contrast to an earlier view, based on the theories of Newton, that heat was an indestructible particle having mass.
Later, entropy was given a statistical basis by Ludwig Boltzmann, Josiah Willard Gibbs, and James Clerk Maxwell. In 1877 Boltzmann showed how to measure the entropy of a system of ideal gas particles, defining entropy as proportional to the natural logarithm of the number of microstates the gas can occupy. Hence, a crucial problem in statistical thermodynamics has been to describe the distribution of a given amount of energy E over N identical systems.
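Boltzmann's proportionality between entropy and the logarithm of the number of microstates, S = k_B ln Ω, can be sketched in a few lines of Python (a minimal illustration; the function name and example microstate counts are my own, not from the text):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (2019 SI exact value)

def boltzmann_entropy(n_microstates: int) -> float:
    """Entropy from Boltzmann's formula S = k_B * ln(Omega)."""
    return k_B * math.log(n_microstates)

# A single microstate means perfect order: S = 0.
s_ordered = boltzmann_entropy(1)

# Doubling the number of accessible microstates always adds k_B * ln(2),
# regardless of how many microstates there were to begin with.
delta = boltzmann_entropy(2_000_000) - boltzmann_entropy(1_000_000)
```

The logarithm is what makes entropy additive: combining two independent systems multiplies their microstate counts, which adds their entropies.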
Entropy is a measure of the extent to which energy is dispersed.
Entropy has two equivalent definitions:
- The classical thermodynamic definition
- The statistical mechanics definition
The classical thermodynamic definition was developed first. From the classical point of view, the microscopic features of a system are not considered; instead, the behavior of a system is described in terms of empirically defined thermodynamic variables such as entropy, temperature, heat capacity, and pressure. The classical description applies to states of equilibrium, although attempts have been made to develop a suitable definition of entropy for non-equilibrium systems. Hence, the macroscopic approach to thermodynamics that requires no knowledge of the behavior of individual particles is known as classical thermodynamics.
The statistical definition of entropy and the other thermodynamic properties was developed later. Here the thermodynamic properties are defined in terms of the statistics of the motions of the microscopic constituents of a system. In short, statistical thermodynamics is an elaborated approach based on the average behavior of large groups of individual particles.
Entropy can be calculated using different equations:
- If the process is at a constant temperature, then the equation is:
ΔS = q_rev / T
where:
ΔS indicates the change in entropy
q_rev is the heat transferred reversibly
T is the Kelvin temperature
- If the reaction is known, then ΔS_rxn can be determined using a table of standard entropy values.
- Gibbs free energy (ΔG) and enthalpy (ΔH) can also be used to determine ΔS. Rearranging the Gibbs free-energy relation ΔG = ΔH − TΔS gives, for the entropy of a system:
ΔS = (ΔH − ΔG) / T
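The constant-temperature route and the Gibbs-energy route to ΔS can be sketched as follows (a minimal illustration; the function names and example values are hypothetical):

```python
def entropy_change_isothermal(q_rev: float, T: float) -> float:
    """Delta-S = q_rev / T: reversible heat divided by absolute temperature."""
    return q_rev / T

def entropy_change_from_gibbs(dH: float, dG: float, T: float) -> float:
    """Rearranging Delta-G = Delta-H - T*Delta-S gives Delta-S = (dH - dG) / T."""
    return (dH - dG) / T

# Example: 1000 J absorbed reversibly at 298.15 K gives about 3.35 J/K.
dS = entropy_change_isothermal(1000.0, 298.15)
```

Both functions return J/K when the inputs are in joules and kelvin; mixing units (e.g. kJ for ΔH but J for ΔG) is the classic mistake here.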
Entropy is a fundamental function of state. It arises directly from the Carnot cycle, and can also be described as the reversible heat divided by temperature.
In a thermodynamic system, physical quantities such as pressure, density, and temperature tend to become uniform over time, because the equilibrium state has a higher probability than any other state.
For example, consider a glass of ice water placed in a room at room temperature. The temperature difference between the ice water and the warm room starts to equalize as portions of the thermal energy spread from the warm surroundings to the cooler ice water. Eventually the temperature of the room and of the glass contents becomes equal. The entropy of the room has decreased as some of its energy has been dispersed to the glass contents.
In this isolated system, however, the entropy of the glass contents has increased more than the entropy of the surrounding room has decreased. In an isolated system, the diffusion of energy from warm to cool always results in a net increase in entropy. Hence, when the "universe" of the room-and-glass system has reached an equilibrium temperature, the entropy change from the initial state is at a maximum. The entropy of a thermodynamic system is thus a measure of how far this equalization has progressed.
Entropy never decreases for an isolated system. An increase in entropy corresponds to an irreversible change in a system, because some of the energy is expended as waste heat, limiting the amount of work the system can perform. Entropy can only be calculated; it can never be observed directly. For a given substance, entropy can be calculated as a standard molar entropy from absolute zero (also known as absolute entropy), or as a difference in entropy from some reference state defined to have zero entropy.
Entropy has the dimension of energy divided by temperature, so its SI unit is the joule per kelvin (J/K). These are the same units as heat capacity, but of course the two concepts are distinct.
Entropy is not a conserved quantity: for example, in an isolated system with non-uniform temperature, heat may flow irreversibly and the temperature become uniform, such that the entropy increases. The second law of thermodynamics states that the entropy of an isolated system remains constant or increases. Chemical reactions cause changes in entropy, and entropy plays an important role in determining the direction in which a chemical reaction proceeds spontaneously.
One dictionary defines entropy as "a measure of thermal energy per unit temperature that is not available for useful work". At a uniform temperature, a substance is at maximum entropy and cannot drive a heat engine. At a non-uniform temperature, entropy is lower and some of the thermal energy can drive a heat engine.
A special case arises when two or more different substances are mixed: mixing is accompanied by an increase in entropy, the entropy of mixing. If the substances are at the same temperature and pressure, there is no net exchange of heat or work; the entropy change comes entirely from the mixing of different substances. At the statistical mechanical level, it results from the change in available volume per particle upon mixing.
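For ideal gases, the entropy of mixing is usually computed with the standard formula ΔS_mix = −n_total R Σ x_i ln x_i (this formula is an addition of mine; the text does not state it explicitly). A minimal sketch:

```python
import math

R = 8.314  # ideal gas constant, J/(mol*K)

def entropy_of_mixing(moles):
    """Ideal entropy of mixing: -n_total * R * sum(x_i * ln(x_i)),
    where x_i are mole fractions. Assumes ideal gases at equal T and p."""
    n_total = sum(moles)
    return -n_total * R * sum(
        (n / n_total) * math.log(n / n_total) for n in moles if n > 0
    )

# Mixing 1 mol each of two different gases: Delta-S = 2 * R * ln(2), about 11.5 J/K.
dS_mix = entropy_of_mixing([1.0, 1.0])
```

Note that the formula gives zero if the "mixture" is a single substance, consistent with the fact that mixing identical gases changes nothing.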
Experimental measurement of entropy:
The entropy of a substance can be measured indirectly. The measurement uses the definition of temperature in terms of entropy, while limiting the energy exchange to heat (dU → δQ):
dS = δQ / T
This relation shows how the entropy changes by dS when a small amount of energy δQ is introduced into a system at a given temperature T.
The measurement proceeds as follows. First, the sample is cooled as close to absolute zero as possible; at such temperatures the entropy approaches zero, because of the definition of temperature. Then small amounts of heat are introduced into the sample and the temperature change is measured, until the desired temperature is reached, e.g. 25 °C. The data are substituted into the above relation, yielding the absolute value of the entropy of the sample at the final temperature. This value is called the calorimetric entropy.
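In practice this procedure amounts to numerically integrating dS = C(T)/T dT over the measured heat-capacity data. A minimal sketch using the trapezoidal rule (the function name and the synthetic data are hypothetical):

```python
def calorimetric_entropy(temps, heat_caps):
    """Approximate S(T_final) = integral of C(T)/T dT by the trapezoidal
    rule over measured (temperature, heat capacity) pairs, starting near
    absolute zero, where the entropy is taken to be zero."""
    s = 0.0
    for i in range(1, len(temps)):
        f0 = heat_caps[i - 1] / temps[i - 1]
        f1 = heat_caps[i] / temps[i]
        s += 0.5 * (f0 + f1) * (temps[i] - temps[i - 1])
    return s

# Synthetic check: with C(T) = 0.5*T the integrand C/T is constant,
# so the trapezoidal rule is exact and S = 0.5 * (300 - 2) = 149.0 J/K.
temps = [float(t) for t in range(2, 301)]
caps = [0.5 * t for t in temps]
S = calorimetric_entropy(temps, caps)
```

Real measurements also add ΔH/T terms at each phase transition crossed on the way up, which this sketch omits.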
The second law of thermodynamics:
The second law of thermodynamics requires that the total entropy of a system cannot decrease except by increasing the entropy of some other system. Hence the entropy of an isolated system tends not to decrease. It follows that heat cannot be transferred from a cold body to a hot body without the application of work. It is likewise impossible for any device operating in a cycle to produce net work from a single temperature reservoir; producing net work requires a flow of heat from a hot reservoir to a cold reservoir, or a single expanding reservoir undergoing adiabatic cooling, which performs adiabatic work. As a result, a perpetual motion machine is impossible.
- The fundamental thermodynamic relation:
The entropy of a system depends on its internal energy and its external parameters, e.g. its volume. In the thermodynamic limit, this fact leads to an equation relating the change in the internal energy U to changes in the entropy and the external parameters. This relation is called the fundamental thermodynamic relation. If the external pressure p bears on the volume V as the only external parameter, the relation is:
dU = T dS − p dV
The fundamental thermodynamic relation implies many thermodynamic identities that are valid independently of the microscopic details of the system, for example the Maxwell relations and the relationships between heat capacities.
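The relation can be checked numerically for a monatomic ideal gas, for which U = (3/2)nRT, S(T,V) = nCv ln T + nR ln V + const, and p = nRT/V are standard results (a sketch under the assumption n = 1 mol; the constant in S drops out of differences):

```python
import math

R = 8.314          # J/(mol*K)
n = 1.0            # mol
Cv = 1.5 * R       # monatomic ideal gas

def U(T):
    """Internal energy of a monatomic ideal gas."""
    return n * Cv * T

def S(T, V):
    """Entropy up to an additive constant (the constant cancels in dS)."""
    return n * Cv * math.log(T) + n * R * math.log(V)

def p(T, V):
    """Ideal gas pressure."""
    return n * R * T / V

# Small simultaneous changes in T and V around a reference state:
T0, V0 = 300.0, 0.02
dT, dV = 1e-6, 1e-9

dU = U(T0 + dT) - U(T0)
dS = S(T0 + dT, V0 + dV) - S(T0, V0)

# dU should match T*dS - p*dV up to second-order terms in the increments.
lhs = dU
rhs = T0 * dS - p(T0, V0) * dV
```

The agreement holds for any small dT, dV, which is exactly what "fundamental relation" means: it constrains all processes, not one particular path.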
- Entropy in chemical thermodynamics:
Thermodynamic entropy plays a central role in chemical thermodynamics, enabling changes to be quantified and the outcomes of reactions to be predicted. The second law of thermodynamics states that the entropy of an isolated system, that is, the combination of a subsystem under study and its surroundings, increases during all spontaneous chemical and physical processes. The Clausius equation introduces the measurement of entropy change, which states the direction and quantifies the magnitude of small changes, e.g. heat flowing spontaneously from a hot body to a cold one.
Entropy change equations for simple processes:
- Isothermal expansion or compression of an ideal gas:
For the expansion or compression of an ideal gas at constant temperature from an initial volume V0 and pressure P0 to a final volume V and pressure P, the change in entropy is:
ΔS = n R ln(V / V0) = −n R ln(P / P0)
where:
n indicates the number of moles of gas
R is the ideal gas constant
These equations also apply for expansion into a finite vacuum and for a throttling process, where the temperature, internal energy, and enthalpy of an ideal gas remain constant.
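The isothermal formula is a one-liner in Python (a minimal sketch; the function name and example volumes are illustrative):

```python
import math

R = 8.314  # ideal gas constant, J/(mol*K)

def dS_isothermal(n, V0, V):
    """Delta-S = n * R * ln(V / V0) for isothermal ideal-gas
    expansion (V > V0) or compression (V < V0)."""
    return n * R * math.log(V / V0)

# Doubling the volume of 1 mol of ideal gas: Delta-S = R * ln(2), about 5.76 J/K.
dS = dS_isothermal(1.0, 0.010, 0.020)
```

Compression gives the same magnitude with the opposite sign, since ln(V/V0) flips sign when the ratio is inverted.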
- Cooling and heating:
For heating or cooling of any system at constant pressure from an initial temperature T0 to a final temperature T, the entropy change is:
ΔS = n Cp ln(T / T0)
where Cp is the constant-pressure molar heat capacity, provided no phase change occurs in the temperature interval.
Similarly, at constant volume, the entropy change is:
ΔS = n Cv ln(T / T0)
where Cv is the constant-volume molar heat capacity, again with no phase change.
At low temperatures near absolute zero, the heat capacities of solids quickly drop to nearly zero, so the assumption of constant heat capacity no longer applies.
Since entropy is a state function, the entropy change of any process in which both temperature and volume vary is the same as for a path divided into two steps: heating at constant volume and expansion at constant temperature. For an ideal gas, the total entropy change is:
ΔS = n Cv ln(T / T0) + n R ln(V / V0)
Similarly, when both the temperature and pressure of an ideal gas vary:
ΔS = n Cp ln(T / T0) − n R ln(P / P0)
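The heating formula and the combined temperature and volume formula can be sketched together (function names and the water example are my own; Cp of liquid water, about 75.3 J/(mol·K), is assumed roughly constant over the interval):

```python
import math

R = 8.314  # ideal gas constant, J/(mol*K)

def dS_isobaric_heating(n, Cp, T0, T):
    """Delta-S = n * Cp * ln(T / T0): constant pressure, no phase change,
    Cp assumed constant over the interval."""
    return n * Cp * math.log(T / T0)

def dS_T_and_V(n, Cv, T0, T, V0, V):
    """Delta-S = n * Cv * ln(T/T0) + n * R * ln(V/V0): a two-step path of
    constant-volume heating followed by isothermal expansion. Because
    entropy is a state function, any path between the states gives this."""
    return n * Cv * math.log(T / T0) + n * R * math.log(V / V0)

# Heating 1 mol of liquid water from 298 K to 350 K at constant pressure,
# taking Cp of about 75.3 J/(mol*K): roughly 12.1 J/K.
dS_water = dS_isobaric_heating(1.0, 75.3, 298.0, 350.0)
```

If the temperature is unchanged (T = T0), dS_T_and_V collapses to the isothermal expansion formula, as it should.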
- Phase transitions:
A reversible phase transition occurs at constant temperature and pressure. The reversible heat is the enthalpy change for the transition, and the entropy change is the enthalpy change divided by the thermodynamic temperature. The entropy of fusion for melting a solid is:
ΔS_fus = ΔH_fus / T_m
Similarly, the entropy of vaporization for the vaporization of a liquid to a gas is:
ΔS_vap = ΔH_vap / T_b
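Both phase-transition formulas are the same division; a minimal sketch using textbook values for water (ΔH_fus about 6010 J/mol at 273.15 K, ΔH_vap about 40700 J/mol at 373.15 K):

```python
def dS_transition(dH, T):
    """Delta-S = Delta-H / T for a reversible phase change
    at constant temperature and pressure."""
    return dH / T

# Melting ice: roughly 22 J/(K*mol).
dS_fus = dS_transition(6010.0, 273.15)

# Boiling water: roughly 109 J/(K*mol), much larger because vapor
# has far more accessible microstates than liquid.
dS_vap = dS_transition(40700.0, 373.15)
```

The large gap between the two values is a standard illustration of Trouton's-rule-scale vaporization entropies versus the modest entropy gain on melting.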
What are the units of the entropy equation?
The units of entropy are J/K. The temperature in this equation must be measured on the absolute, or Kelvin, temperature scale. On this scale, zero is the theoretically lowest possible temperature that any substance can reach. At absolute zero (0 K), all atomic motion ceases and the disorder in a substance is zero.

What is the unit of molar entropy?
The units of molar entropy are J K⁻¹ mol⁻¹, that is, joules of energy per kelvin per mole.

What is entropy? Explain with an example.
Entropy is a measure of how much the atoms in a substance are free to spread out, move around, and arrange themselves in random ways. For instance, when a substance changes from a solid to a liquid, such as ice to water, the atoms in the substance get more freedom to move around.

What is the summary of entropy of reaction?
A reaction is favoured if the enthalpy of the system decreases over the reaction. Entropy refers to the measure of the level of disorder in a thermodynamic system. For any process to occur spontaneously, it is a necessary condition that the entropy of the system undergoing the process should increase.