Entropy
Common symbols: S
SI unit: joules per kelvin (J⋅K−1)
In SI base units: kg⋅m2⋅s−2⋅K−1
Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory.
The thermodynamic concept was referred to by Scottish scientist and engineer William Rankine in 1850 with the names thermodynamic function and heat-potential.[2] In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.[3]
A consequence of entropy is that certain processes are irreversible or impossible, aside from the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of isolated systems left to spontaneous evolution cannot decrease with time, as they always arrive at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of measuring microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. Upon John von Neumann's suggestion, Shannon named this entity of missing information, in a manner analogous to its use in statistical mechanics, entropy, and gave birth to the field of information theory. This description has been identified as a universal definition of the concept of entropy.[4]
History
In his 1803 paper, Fundamental Principles of Equilibrium and Movement, the French mathematician Lazare Carnot proposed that in any machine, the accelerations and shocks of the moving parts represent losses of moment of activity; in any natural process there exists an inherent tendency towards the dissipation of useful energy.
The first law of thermodynamics, deduced from the heat-friction experiments of James Joule in 1843, expresses the concept of energy and its conservation in all processes; the first law, however, is unsuitable to separately quantify the effects of friction and dissipation.
In the 1850s and 1860s, German physicist Rudolf Clausius objected to the supposition that no change occurs in the working body of a heat engine, and gave that change a mathematical interpretation, describing the dissipative use of energy as a transformation-content (Verwandlungsinhalt) of a thermodynamic system or working body of chemical species during a change of state; in 1865 he coined the name entropy for that property.
Later, scientists such as Ludwig Boltzmann, Josiah Willard Gibbs, and James Clerk Maxwell gave entropy a statistical basis.
Etymology
In 1865, Clausius named the concept of "the differential of a quantity which depends on the configuration of the system," entropy (Entropie) after the Greek word for 'transformation'.[10] He gave "transformational content" (Verwandlungsinhalt) as a synonym, paralleling his "thermal and ergonal content" (Wärme- und Werkinhalt) as the name of U, but preferring the term entropy as a close parallel of the word energy, as he found the concepts nearly "analogous in their physical significance".[10] This term was formed by replacing the root of ἔργον ('ergon', 'work') by that of τροπή ('tropy', 'transformation').[9]
In more detail, Clausius explained his choice of "entropy" as a name as follows:[11]
I prefer going to the ancient languages for the names of important scientific quantities, so that they may mean the same thing in all living tongues. I propose, therefore, to call S the entropy of a body, after the Greek word "transformation". I have designedly coined the word entropy to be similar to energy, for these two quantities are so analogous in their physical significance, that an analogy of denominations seems to me helpful.
Leon Cooper added that in this way "he succeeded in coining a word that meant the same thing to everybody: nothing."[11]
Definitions and descriptions
Any method involving the notion of entropy, the very existence of which depends on the second law of thermodynamics, will doubtless seem to many far-fetched, and may repel beginners as obscure and difficult of comprehension.
The concept of entropy is described by two principal approaches: the macroscopic perspective of classical thermodynamics, and the microscopic description central to statistical mechanics.
State variables and functions of state
Many thermodynamic properties are defined by physical variables that define a state of thermodynamic equilibrium; these are state variables. State variables depend only on the equilibrium condition, not on the path taken to reach it. A state variable can be a function of state, meaning it is a mathematical function of other state variables. Entropy is such a function of state: in the Carnot cycle, for instance, the working fluid returns to the state it had at the start of the cycle, so the change (the line integral) of any state function, such as entropy, over the cycle is zero.
Reversible process
Total entropy may be conserved during a reversible process. The entropy change of the system (not including the surroundings) is well-defined as the heat Q transferred to the system divided by the system temperature T, ΔS = Q/T. A reversible process is a quasistatic one that deviates only infinitesimally from thermodynamic equilibrium and avoids friction or other dissipation. Any process that happens quickly enough to deviate from thermal equilibrium cannot be reversible; total entropy increases, and the potential for maximum work to be done in the process is lost.[14] For example, in the Carnot cycle, while the heat flow from the hot reservoir to the cold reservoir represents an increase in entropy, the work output, if reversibly and perfectly stored in some energy storage mechanism, represents a decrease in entropy that could be used to operate the heat engine in reverse and return to the previous state; thus the total entropy change may still be zero at all times if the entire process is reversible. An irreversible process increases the total entropy of system and surroundings.[15]
Carnot cycle
The concept of entropy arose from Rudolf Clausius's study of the Carnot cycle, a thermodynamic cycle performed by a Carnot heat engine as a reversible heat engine.[16] In a Carnot cycle, heat QH is absorbed isothermally at temperature TH from a 'hot' reservoir (in the isothermal expansion stage) and given up isothermally as heat QC to a 'cold' reservoir at TC (in the isothermal compression stage). According to Carnot's principle or theorem, work from a heat engine with two thermal reservoirs can be produced only when there is a temperature difference between these reservoirs, and for reversible engines, which are maximally and equally efficient among all heat engines for a given thermal reservoir pair, the work is a function of the reservoir temperatures and the heat absorbed by the engine QH (heat engine work output = heat engine efficiency × heat to the engine, where the efficiency is a function of the reservoir temperatures for reversible heat engines). Carnot did not distinguish between QH and QC, since he was using the incorrect hypothesis that caloric theory was valid, and hence that heat was conserved (the incorrect assumption that QH and QC were equal in magnitude) when, in fact, QH is greater than QC in magnitude.[17][18] Through the efforts of Clausius and Kelvin, it is now known that the work done by a reversible heat engine is the product of the Carnot efficiency (the efficiency of all reversible heat engines with the same thermal reservoir pair, according to Carnot's theorem) and the heat absorbed from the hot reservoir:
W = (1 − TC/TH) QH        (1)
Here W is the work done by the Carnot heat engine, QH is the heat to the engine from the hot reservoir, and QC is the heat to the cold reservoir from the engine. To derive the Carnot efficiency, which is 1 − TC/TH (a number less than one), Kelvin had to evaluate the ratio of the work output to the heat absorbed during the isothermal expansion with the help of the Carnot–Clapeyron equation, which contained an unknown function called the Carnot function. The possibility that the Carnot function could be the temperature as measured from a zero point of temperature was suggested by Joule in a letter to Kelvin. This allowed Kelvin to establish his absolute temperature scale.[19] It is also known that the net work W produced by the system in one cycle is the net heat absorbed, which is the sum (or difference of the magnitudes) of the heat QH > 0 absorbed from the hot reservoir and the waste heat QC < 0 given off to the cold reservoir:[20]
W = QH + QC        (2)
Since the latter is valid over the entire cycle, this gave Clausius the hint that at each stage of the cycle, work and heat would not be equal, but rather their difference would be the change of a state function that would vanish upon completion of the cycle. The state function was called the internal energy, that is central to the first law of thermodynamics.[21]
Now equating (1) and (2) gives, for the engine per Carnot cycle,[22][20]
QH/TH + QC/TC = 0        (3)
This implies that there is a function of state whose change is Q/T and that this state function is conserved over a complete Carnot cycle, like other state functions such as the internal energy. Clausius called this state function entropy. One can see that entropy was discovered through mathematics rather than through laboratory experimental results.[citation needed] It is a mathematical construct and has no easy physical analogy.[citation needed] This makes the concept somewhat obscure or abstract, akin to how the concept of energy arose.
This equation shows that the entropy change of the engine per Carnot cycle is zero. In fact, the entropy change of the two thermal reservoirs per Carnot cycle is also zero, since that change is simply expressed by reverting the sign of each term in equation (3), reflecting the fact that, for example, in heat transfer from the hot reservoir to the engine, the engine receives the heat while the hot reservoir loses the same amount of heat:
ΔSr,H + ΔSr,C = −QH/TH − QC/TC = 0        (4)
where we denote the entropy change of a thermal reservoir by ΔSr,i = −Qi/Ti, for i either H (hot reservoir) or C (cold reservoir), using the sign convention of heat for the engine mentioned above.
Clausius then asked what would happen if the system produced less work than that predicted by Carnot's principle for the same thermal reservoir pair and the same heat transfer from the hot reservoir to the engine QH. In this case, the right-hand side of equation (1) would be the upper bound of the work output of the system, and the equation would become an inequality
W < (1 − TC/TH) QH, which implies QH/TH + QC/TC < 0        (5)
indicating that the magnitude of the entropy gained by the cold reservoir is greater than the magnitude of the entropy lost by the hot reservoir. The net entropy change in the engine per its thermodynamic cycle is zero, so the net entropy change of the engine and both thermal reservoirs per cycle is positive if the work produced by the engine is less than the work achieved by a Carnot engine in equation (1).
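A small numerical check of this reasoning follows (a sketch added here; the reservoir temperatures and heats are assumed values, not from the article): for a reversible engine the reservoir entropy changes cancel as in equation (4), while an engine producing less work leaves a net entropy increase as in inequality (5).

```python
def reservoir_entropy_change(Q_H, Q_C, T_H, T_C):
    """Entropy change of the two reservoirs, dS_r,i = -Q_i/T_i, with heats signed for the engine."""
    return -Q_H / T_H - Q_C / T_C

T_H, T_C = 500.0, 300.0           # K (assumed reservoir temperatures)
Q_H = 1000.0                      # J absorbed by the engine from the hot reservoir

# Reversible (Carnot) engine: W = (1 - T_C/T_H) Q_H, so Q_C = -(Q_H - W)
W_carnot = (1 - T_C / T_H) * Q_H
Q_C_rev = -(Q_H - W_carnot)
print(reservoir_entropy_change(Q_H, Q_C_rev, T_H, T_C))    # 0.0, as in equation (4)

# Less efficient engine with the same Q_H: more heat dumped to the cold reservoir
W_irrev = 0.5 * W_carnot
Q_C_irrev = -(Q_H - W_irrev)
print(reservoir_entropy_change(Q_H, Q_C_irrev, T_H, T_C))  # > 0, as in inequality (5)
```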
The Carnot cycle and Carnot efficiency as shown in the equation (1) are useful because they define the upper bound of the possible work output and the efficiency of any classical thermodynamic heat engine. Other cycles, such as the Otto cycle, Diesel cycle and Brayton cycle, can be analyzed from the standpoint of the Carnot cycle. Any machine or cyclic process that converts heat to work and is claimed to produce an efficiency greater than the Carnot efficiency is not viable because it violates the second law of thermodynamics.
For very small numbers of particles in the system, statistical thermodynamics must be used. The efficiency of devices such as photovoltaic cells requires an analysis from the standpoint of quantum mechanics.
Classical thermodynamics
The thermodynamic definition of entropy was developed in the early 1850s by Rudolf Clausius and essentially describes how to measure the entropy of an isolated system in thermodynamic equilibrium with its parts.
While Clausius based his definition on a reversible process, there are also irreversible processes that change entropy. Following the second law of thermodynamics, entropy of an isolated system always increases for irreversible processes. The difference between an isolated system and closed system is that energy may not flow to and from an isolated system, but energy flow to and from a closed system is possible. Nevertheless, for both closed and isolated systems, and indeed, also in open systems, irreversible thermodynamics processes may occur.
According to the Clausius equality, for a reversible cyclic process: ∮ δQrev/T = 0. This means the line integral ∫ δQrev/T is path-independent.
So we can define a state function S, called entropy, which satisfies dS = δQrev/T.
To find the entropy difference between any two states of a system, the integral must be evaluated for some reversible path between the initial and final states.[23] Since entropy is a state function, the entropy change of the system for an irreversible path is the same as for a reversible path between the same two states.[24] However, the heat transferred to or from, and the entropy change of, the surroundings is different.
We can only obtain the change of entropy by integrating the above formula. To obtain the absolute value of the entropy, we need the third law of thermodynamics, which states that S = 0 at absolute zero for perfect crystals.
From a macroscopic perspective, in classical thermodynamics the entropy is interpreted as a state function of a thermodynamic system: that is, a property depending only on the current state of the system, independent of how that state came to be achieved.
Statistical mechanics
The statistical definition was developed by Ludwig Boltzmann in the 1870s by analyzing the statistical behavior of the microscopic components of the system. Boltzmann showed that this definition of entropy was equivalent to the thermodynamic entropy to within a constant factor—known as the Boltzmann constant. In short, the thermodynamic definition of entropy provides the experimental verification of entropy, while the statistical definition of entropy extends the concept, providing an explanation and a deeper understanding of its nature.
The interpretation of entropy in statistical mechanics is the measure of uncertainty, disorder, or mixedupness in the phrase of Gibbs, which remains about a system after its observable macroscopic properties, such as temperature, pressure and volume, have been taken into account. For a given set of macroscopic variables, the entropy measures the degree to which the probability of the system is spread out over different possible microstates. In contrast to the macrostate, which characterizes plainly observable average quantities, a microstate specifies all molecular details about the system, including the position and velocity of every molecule. The more such states are available to the system with appreciable probability, the greater the entropy.
The Boltzmann constant, and therefore entropy, have dimensions of energy divided by temperature, which has a unit of joules per kelvin (J⋅K−1) in the International System of Units (or kg⋅m2⋅s−2⋅K−1 in terms of base units).
Specifically, entropy is a logarithmic measure of the number of system states with significant probability of being occupied:

S = −kB Σi pi ln pi
(pi is the probability that the system is in the ith state, usually given by the Boltzmann distribution; if states are defined in a continuous manner, the summation is replaced by an integral over all possible states) or, equivalently, the expected value of the logarithm of the probability that a microstate is occupied:

S = −kB ⟨ln p⟩
where kB is the Boltzmann constant, equal to 1.38065×10−23 J/K. The summation is over all the possible microstates of the system, and pi is the probability that the system is in the i-th microstate.[28] This definition assumes that the basis set of states has been picked so that there is no information on their relative phases. In a different basis set, the more general expression is

S = −kB Tr(ρ̂ ln ρ̂)

where ρ̂ is the density matrix, Tr is the trace and ln is the matrix logarithm.
In what has been called the fundamental assumption of statistical thermodynamics or the fundamental postulate in statistical mechanics, among system microstates of the same energy (degenerate microstates) each microstate is assumed to be populated with equal probability; this assumption is usually justified for an isolated system in equilibrium. Then for an isolated system pi = 1/Ω, where Ω is the number of microstates whose energy equals the system's energy, and the previous equation reduces to

S = kB ln Ω
In thermodynamics, such a system is one in which the volume, number of molecules, and internal energy are fixed (the microcanonical ensemble).
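The formulas above can be illustrated with a short numerical sketch (added here for illustration; the two-level energies and temperature are assumed values): the Gibbs entropy of a Boltzmann distribution, which for equal probabilities reduces to S = kB ln Ω.

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(p):
    """S = -k_B * sum(p_i ln p_i), ignoring zero-probability states."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -k_B * np.sum(p * np.log(p))

# Boltzmann distribution for two hypothetical energy levels at T = 300 K
E = np.array([0.0, 1.0e-21])            # J (assumed level spacing)
T = 300.0                               # K
w = np.exp(-E / (k_B * T))
p = w / w.sum()
print(gibbs_entropy(p))                 # entropy of the two-level system, J/K

# Equal probabilities over Omega microstates reduce to S = k_B ln(Omega)
Omega = 4
print(gibbs_entropy(np.full(Omega, 1 / Omega)), k_B * np.log(Omega))
```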
For a given thermodynamic system, the excess entropy is defined as the entropy minus that of an ideal gas at the same density and temperature, a quantity that is always negative because an ideal gas is maximally disordered.[30] This concept plays an important role in liquid-state theory. For instance, Rosenfeld's excess-entropy scaling principle[31][32] states that reduced transport coefficients throughout the two-dimensional phase diagram are functions uniquely determined by the excess entropy.[33][34]
The most general interpretation of entropy is as a measure of the extent of uncertainty about a system. The equilibrium state of a system maximizes the entropy because it does not reflect all information about the initial conditions, except for the conserved variables.
The interpretative model has a central role in determining entropy. The qualifier "for a given set of macroscopic variables" above has deep implications: if two observers use different sets of macroscopic variables, they see different entropies. For example, if observer A uses the variables U, V and W, and observer B uses U, V, W, X, then, by changing X, observer B can cause an effect that looks like a violation of the second law of thermodynamics to observer A. In other words: the set of macroscopic variables one chooses must include everything that may change in the experiment, otherwise one might see decreasing entropy.[36]
Entropy can be defined for any Markov process with reversible dynamics and the detailed balance property.
In Boltzmann's 1896 Lectures on Gas Theory, he showed that this expression gives a measure of entropy for systems of atoms and molecules in the gas phase, thus providing a measure for the entropy of classical thermodynamics.
Entropy of a system
Entropy arises directly from the Carnot cycle. It can also be described as the reversible heat divided by temperature. Entropy is a fundamental function of state.
In a thermodynamic system, pressure and temperature tend to become uniform over time because the equilibrium state has higher probability (more possible combinations of microstates) than any other state. As an example, for a glass of ice water in air at room temperature, the difference in temperature between the warm room (the surroundings) and the cold glass of ice and water (the system and not part of the room) decreases as portions of the thermal energy from the warm surroundings spread to the cooler system of ice and water. Over time the temperature of the glass and its contents and the temperature of the room become equal. In other words, the entropy of the room has decreased as some of its energy has been dispersed to the ice and water, of which the entropy has increased.
However, as calculated in the example, the entropy of the system of ice and water has increased more than the entropy of the surrounding room has decreased. In an isolated system such as the room and ice water taken together, the dispersal of energy from warmer to cooler always results in a net increase in entropy. Thus, when the "universe" of the room and ice water system has reached a temperature equilibrium, the entropy change from the initial state is at a maximum. The entropy of the thermodynamic system is a measure of how far the equalization has progressed.
Thermodynamic entropy is a non-conserved state function that is of great importance in physics and chemistry.[41]
Unlike many other functions of state, entropy cannot be directly observed but must be calculated. Absolute standard molar entropy of a substance can be calculated from the measured temperature dependence of its heat capacity. The molar entropy of ions is obtained as a difference in entropy from a reference state defined as zero entropy. The second law of thermodynamics states that the entropy of an isolated system must increase or remain constant. Therefore, entropy is not a conserved quantity: for example, in an isolated system with non-uniform temperature, heat might irreversibly flow and the temperature become more uniform such that entropy increases.[42] Chemical reactions cause changes in entropy, and system entropy, in conjunction with enthalpy, plays an important role in determining in which direction a chemical reaction spontaneously proceeds.
One dictionary definition of entropy is that it is "a measure of thermal energy per unit temperature that is not available for useful work" in a cyclic process. For instance, a substance at uniform temperature is at maximum entropy and cannot drive a heat engine. A substance at non-uniform temperature is at a lower entropy (than if the heat distribution is allowed to even out) and some of the thermal energy can drive a heat engine.
A special case of entropy increase, the entropy of mixing, occurs when two or more different substances are mixed. If the substances are at the same temperature and pressure, there is no net exchange of heat or work – the entropy change is entirely due to the mixing of the different substances. At a statistical mechanical level, this results due to the change in available volume per particle with mixing.[43]
Equivalence of definitions
Proofs of equivalence between the definition of entropy in statistical mechanics (the Gibbs entropy formula S = −kB Σi pi ln pi) and in classical thermodynamics (dS = δQrev/T together with the fundamental thermodynamic relation) are known for the microcanonical ensemble, the canonical ensemble, the grand canonical ensemble, and the isothermal–isobaric ensemble. These proofs are based on the probability density of microstates of the generalized Boltzmann distribution and the identification of the thermodynamic internal energy as the ensemble average ⟨E⟩.
Furthermore, it has been shown that the definition of entropy in statistical mechanics is the only entropy that is equivalent to the classical thermodynamic entropy under the following postulates:[46]
- The probability density function is proportional to some function of the ensemble parameters and random variables.
- Thermodynamic state functions are described by ensemble averages of random variables.
- At infinite temperature, all the microstates have the same probability.
Second law of thermodynamics
The second law of thermodynamics requires that, in general, the total entropy of any system does not decrease other than by increasing the entropy of some other system. Hence, in a system isolated from its environment, the entropy of that system tends not to decrease. It follows that heat cannot flow from a colder body to a hotter body without the application of work to the colder body.
It follows from the second law of thermodynamics that the entropy of a system that is not isolated may decrease. An air conditioner, for example, may cool the air in a room, thus reducing the entropy of the air of that system. The heat expelled from the room (the system), which the air conditioner transports and discharges to the outside air, always makes a bigger contribution to the entropy of the environment than the decrease of the entropy of the air of that system. Thus, the total entropy of the room plus the entropy of the environment increases, in agreement with the second law of thermodynamics.
In mechanics, the second law in conjunction with the fundamental thermodynamic relation places limits on a system's ability to do useful work.[47] The entropy change of a system at temperature T absorbing an infinitesimal amount of heat δq in a reversible way is given by δq/T. More explicitly, an amount of energy TR S is not available to do useful work, where TR is the temperature of the coldest accessible reservoir or heat sink external to the system. For further discussion, see Exergy.
Statistical mechanics demonstrates that entropy is governed by probability, thus allowing for a decrease in disorder even in an isolated system. Although this is possible, such an event has a small probability of occurring, making it unlikely.[48]
The applicability of a second law of thermodynamics is limited to systems in or sufficiently near an equilibrium state, so that they have a defined entropy.[49] Some inhomogeneous systems out of thermodynamic equilibrium still satisfy the hypothesis of local thermodynamic equilibrium, so that the entropy density is locally defined as an intensive quantity. For such systems, a principle of maximum time rate of entropy production may apply.[50][51] It states that such a system may evolve to a steady state that maximizes its time rate of entropy production. This does not mean that such a system is necessarily always in a condition of maximum time rate of entropy production; it means that it may evolve to such a steady state.[52][53]
Applications
The fundamental thermodynamic relation
The entropy of a system depends on its internal energy and its external parameters, such as its volume. In the thermodynamic limit, this fact leads to an equation relating the change in the internal energy to changes in the entropy and the external parameters. This relation is known as the fundamental thermodynamic relation. If external pressure P bears on the volume V as the only external parameter, this relation is:

dU = T dS − P dV
Since both internal energy and entropy are monotonic functions of temperature T, implying that the internal energy is fixed when one specifies the entropy and the volume, this relation is valid even if the change from one state of thermal equilibrium to another with infinitesimally larger entropy and volume happens in a non-quasistatic way (so during this change the system may be very far out of thermal equilibrium and then the whole-system entropy, pressure, and temperature may not exist).
The fundamental thermodynamic relation implies many thermodynamic identities that are valid in general, independent of the microscopic details of the system. Important examples are the Maxwell relations and the relations between heat capacities.
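As a brief worked example (a sketch added here under the relation stated above, not text from the article), one Maxwell relation follows from dU = T dS − P dV by equating mixed second derivatives of U(S, V):

```latex
% Reading off the first derivatives from dU = T\,dS - P\,dV:
\left(\frac{\partial U}{\partial S}\right)_V = T,
\qquad
\left(\frac{\partial U}{\partial V}\right)_S = -P .
% Equality of the mixed second derivatives of U(S, V) then gives a Maxwell relation:
\frac{\partial^2 U}{\partial V\,\partial S}
  = \left(\frac{\partial T}{\partial V}\right)_S
  = -\left(\frac{\partial P}{\partial S}\right)_V .
```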
Entropy in chemical thermodynamics
Thermodynamic entropy is central in chemical thermodynamics, enabling changes to be quantified and the outcome of reactions predicted. The second law of thermodynamics states that entropy in an isolated system – the combination of a subsystem under study and its surroundings – increases during all spontaneous chemical and physical processes. The Clausius equation δqrev/T = ΔS introduces the measurement of entropy change, ΔS. Entropy change describes the direction and quantifies the magnitude of simple changes such as heat transfer between systems – always from hotter to cooler spontaneously.
The thermodynamic entropy therefore has the dimension of energy divided by temperature, and the unit joule per kelvin (J/K) in the International System of Units (SI).
Thermodynamic entropy is an extensive property, meaning that it scales with the size or extent of a system. In many processes it is useful to specify the entropy as an intensive property independent of the size, as a specific entropy characteristic of the type of system studied. Specific entropy may be expressed relative to a unit of mass, typically the kilogram (unit: J⋅kg−1⋅K−1). Alternatively, in chemistry, it is also referred to one mole of substance, in which case it is called the molar entropy with a unit of J⋅mol−1⋅K−1.
Thus, when one mole of substance at about 0 K is warmed by its surroundings to 298 K, the sum of the incremental values of qrev/T constitute each element's or compound's standard molar entropy, an indicator of the amount of energy stored by a substance at 298 K.[54][55] Entropy change also measures the mixing of substances as a summation of their relative quantities in the final mixture.[56]
Entropy is equally essential in predicting the extent and direction of complex chemical reactions. For such applications, ΔS must be incorporated in an expression that includes both the system and its surroundings, ΔSuniverse = ΔSsurroundings + ΔSsystem. This expression becomes, via some steps, the Gibbs free energy equation for reactants and products in the system: ΔG [the Gibbs free energy change of the system] = ΔH [the enthalpy change] − T ΔS [the entropy change].[54]
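A minimal numerical sketch of this relation (the reaction values below are illustrative assumptions, not data from the article): a process at constant temperature and pressure is spontaneous when ΔG < 0.

```python
def gibbs_free_energy_change(delta_H, delta_S, T):
    """Return dG = dH - T*dS (J/mol) for a process at constant T and P."""
    return delta_H - T * delta_S

# Hypothetical example: dH = -92 kJ/mol, dS = -199 J/(mol K) (values assumed for illustration)
delta_H = -92.0e3      # J/mol
delta_S = -199.0       # J/(mol K)
for T in (298.0, 500.0, 1000.0):
    dG = gibbs_free_energy_change(delta_H, delta_S, T)
    verdict = "spontaneous" if dG < 0 else "non-spontaneous"
    print(f"T = {T:6.1f} K: dG = {dG/1e3:8.1f} kJ/mol -> {verdict}")
```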
World's technological capacity to store and communicate entropic information
A 2011 study in Science estimated the world's technological capacity to store and communicate optimally compressed information normalized on the most effective compression algorithms available in the year 2007, therefore estimating the entropy of the technologically available sources.[57] The authors estimate that humankind's technological capacity to store information grew from 2.6 (entropically compressed) exabytes in 1986 to 295 (entropically compressed) exabytes in 2007. The world's technological capacity to receive information through one-way broadcast networks was 432 exabytes of (entropically compressed) information in 1986, to 1.9 zettabytes in 2007. The world's effective capacity to exchange information through two-way telecommunication networks was 281 petabytes of (entropically compressed) information in 1986, to 65 (entropically compressed) exabytes in 2007.[57]
Entropy balance equation for open systems
In chemical engineering, the principles of thermodynamics are commonly applied to "open systems", i.e. those in which heat, work, and mass flow across the system boundary. Flows of both heat (Q̇) and work, i.e. ẆS (shaft work) and P(dV/dt) (pressure–volume work), across the system boundaries in general cause changes in the entropy of the system. Transfer as heat entails entropy transfer Q̇/T, where T is the absolute thermodynamic temperature of the system at the point of the heat flow. If there are mass flows across the system boundaries, they also influence the total entropy of the system. This account, in terms of heat and work, is valid only for cases in which the work and heat transfers are by paths physically distinct from the paths of entry and exit of matter from the system.[58][59]
To derive a generalized entropy balance equation, we start with the general balance equation for the change in any extensive quantity Θ in a thermodynamic system, a quantity that may be either conserved, such as energy, or non-conserved, such as entropy. The basic generic balance expression states that dΘ/dt, i.e. the rate of change of Θ in the system, equals the rate at which Θ enters the system at the boundaries, minus the rate at which Θ leaves the system across the system boundaries, plus the rate at which Θ is generated within the system. Using this generic balance equation, with respect to the rate of change with time t of the extensive quantity entropy S, the entropy balance equation for an open thermodynamic system is:[60][61]

dS/dt = Σk Ṁk Ŝk + Q̇/T + Ṡgen
where
- Σk Ṁk Ŝk is the net rate of entropy flow due to the flows of mass into and out of the system (where Ŝ is entropy per unit mass).
- Q̇/T is the rate of entropy flow due to the flow of heat across the system boundary.
- Ṡgen is the rate of entropy production within the system. This entropy production arises from processes within the system, including chemical reactions, internal matter diffusion, internal heat transfer, and frictional effects such as viscosity occurring within the system from mechanical work transfer to or from the system.
If there are multiple heat flows, the term Q̇/T is replaced by Σj Q̇j/Tj, where Q̇j is the heat flow through the jth port and Tj is the temperature at the jth heat flow port into the system.
Note that the nomenclature "entropy balance" is misleading and often deemed inappropriate because entropy is not a conserved quantity. In other words, the term Ṡgen is never a known quantity but always a derived one based on the expression above. Therefore, the open system version of the second law is more appropriately described as the "entropy generation equation" since it specifies that Ṡgen ≥ 0, with zero for reversible processes and greater than zero for irreversible ones.
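A minimal numerical sketch of the balance above (all stream data are assumed for illustration): solving the steady-state entropy balance for the production term Ṡgen, which must come out non-negative for a physically possible process.

```python
def entropy_generation_rate(m_dot_in, s_in, m_dot_out, s_out, Q_dot, T_boundary, dS_dt=0.0):
    """Solve the entropy balance  dS/dt = sum(M_in*s_in) - sum(M_out*s_out) + Q/T + S_gen
    for the entropy production rate S_gen (W/K)."""
    flow_term = sum(m * s for m, s in zip(m_dot_in, s_in)) - sum(m * s for m, s in zip(m_dot_out, s_out))
    return dS_dt - flow_term - Q_dot / T_boundary

# Hypothetical steady-state device with one inlet and one outlet stream
S_gen = entropy_generation_rate(
    m_dot_in=[1.0], s_in=[1.2e3],      # 1 kg/s entering at 1.2 kJ/(kg K) = 1200 J/(kg K)
    m_dot_out=[1.0], s_out=[1.5e3],    # 1 kg/s leaving at 1.5 kJ/(kg K)
    Q_dot=80e3, T_boundary=350.0,      # 80 kW of heat supplied across a 350 K boundary
)
print(f"S_gen = {S_gen:.1f} W/K")      # must be >= 0 for a physically possible process
```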
Entropy change formulas for simple processes
For certain simple transformations in systems of constant composition, the entropy changes are given by simple formulas.[62]
Isothermal expansion or compression of an ideal gas
For the expansion (or compression) of an ideal gas from an initial volume V0 and pressure P0 to a final volume V and pressure P at any constant temperature, the change in entropy is given by:

ΔS = nR ln(V/V0) = −nR ln(P/P0)
Here n is the amount of gas (in moles) and R is the ideal gas constant.
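A quick numerical check of this formula (illustrative values, not from the article): one mole of ideal gas doubling its volume isothermally.

```python
import math

R = 8.314  # ideal gas constant, J/(mol K)

def delta_S_isothermal(n, V_initial, V_final):
    """Entropy change for isothermal expansion or compression of an ideal gas: dS = n R ln(V2/V1)."""
    return n * R * math.log(V_final / V_initial)

# One mole doubling its volume at constant temperature
print(delta_S_isothermal(n=1.0, V_initial=1.0, V_final=2.0))  # ~ +5.76 J/K
```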
Cooling and heating
For pure heating or cooling of any system (gas, liquid or solid) at constant pressure from an initial temperature T0 to a final temperature T, the entropy change is

ΔS = nCP ln(T/T0)
provided that the constant-pressure molar heat capacity (or specific heat) CP is constant and that no phase transition occurs in this temperature interval.
Similarly at constant volume, the entropy change is

ΔS = nCv ln(T/T0)
where the constant-volume molar heat capacity Cv is constant and there is no phase change.
At low temperatures near absolute zero, heat capacities of solids quickly drop off to near zero, so the assumption of constant heat capacity does not apply.
Since entropy is a state function, the entropy change of any process in which temperature and volume both vary is the same as for a path divided into two steps – heating at constant volume and expansion at constant temperature. For an ideal gas, the total entropy change is

ΔS = nCv ln(T/T0) + nR ln(V/V0)
Similarly, if the temperature and pressure of an ideal gas both vary,

ΔS = nCP ln(T/T0) − nR ln(P/P0)
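A short numerical sketch of the last formula (the gas, amount, and state changes below are assumptions for illustration):

```python
import math

R = 8.314  # J/(mol K)

def delta_S_ideal_gas(n, Cp, T1, T2, P1, P2):
    """dS = n Cp ln(T2/T1) - n R ln(P2/P1) for an ideal gas with constant Cp."""
    return n * Cp * math.log(T2 / T1) - n * R * math.log(P2 / P1)

# Hypothetical case: 2 mol of a diatomic ideal gas (Cp ~ 7R/2) heated from 300 K to 600 K
# while the pressure rises from 1 bar to 2 bar
print(delta_S_ideal_gas(n=2.0, Cp=3.5 * R, T1=300.0, T2=600.0, P1=1.0e5, P2=2.0e5))  # ~ +28.8 J/K
```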
Phase transitions
Reversible phase transitions occur at constant temperature and pressure. The reversible heat is the enthalpy change for the transition, and the entropy change is the enthalpy change divided by the thermodynamic temperature.[65] For fusion (melting) of a solid to a liquid at the melting point Tm, the entropy of fusion is

ΔSfus = ΔHfus / Tm
Similarly, for vaporization of a liquid to a gas at the boiling point Tb, the entropy of vaporization is

ΔSvap = ΔHvap / Tb
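As a numerical illustration (the enthalpy of vaporization used below is a commonly quoted value for water, treated here as an assumption):

```python
def entropy_of_transition(delta_H, T_transition):
    """dS = dH / T for a reversible phase transition at constant T and P."""
    return delta_H / T_transition

# Water at its normal boiling point, with an assumed enthalpy of vaporization of ~40.7 kJ/mol
dS_vap = entropy_of_transition(delta_H=40.7e3, T_transition=373.15)
print(f"{dS_vap:.1f} J/(mol K)")   # ~109 J/(mol K)
```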
Approaches to understanding entropy
As a fundamental aspect of thermodynamics and physics, several different approaches to entropy beyond that of Clausius and Boltzmann are valid.
Standard textbook definitions
The following is a list of additional definitions of entropy from a collection of textbooks:
- a measure of energy dispersal at a specific temperature.
- a measure of disorder in the universe or of the availability of the energy in a system to do work.[66]
- a measure of a system's thermal energy per unit temperature that is unavailable for doing useful work.[67]
In Boltzmann's analysis in terms of constituent particles, entropy is a measure of the number of possible microscopic states (or microstates) of a system in thermodynamic equilibrium.
Order and disorder
Entropy is often loosely associated with the amount of order or disorder, or of chaos, in a thermodynamic system. The traditional qualitative description of entropy is that it refers to changes in the status quo of the system and is a measure of "molecular disorder" and the amount of wasted energy in a dynamical energy transformation from one state or form to another. In this direction, several recent authors have derived exact entropy formulas to account for and measure disorder and order in atomic and molecular assemblies.[68][69][70] One of the simpler entropy order/disorder formulas is that derived in 1984 by thermodynamic physicist Peter Landsberg, based on a combination of thermodynamics and information theory arguments. He argues that when constraints operate on a system, such that it is prevented from entering one or more of its possible or permitted states, as contrasted with its forbidden states, the measure of the total amount of "disorder" in the system is given by:[69][70]

Disorder = CD / CI
Similarly, the total amount of "order" in the system is given by:

Order = 1 − CO / CI
In which CD is the "disorder" capacity of the system, which is the entropy of the parts contained in the permitted ensemble, CI is the "information" capacity of the system, an expression similar to Shannon's channel capacity, and CO is the "order" capacity of the system.[68]
Energy dispersal
The concept of entropy can be described qualitatively as a measure of energy dispersal at a specific temperature.
Ambiguities in the terms disorder and chaos, which usually have meanings directly opposed to equilibrium, contribute to widespread confusion and hamper comprehension of entropy for most students.[72] As the second law of thermodynamics shows, in an isolated system internal portions at different temperatures tend to adjust to a single uniform temperature and thus produce equilibrium. A recently developed educational approach avoids ambiguous terms and describes such spreading out of energy as dispersal, which leads to loss of the differentials required for work even though the total energy remains constant in accordance with the first law of thermodynamics[73] (compare discussion in next section). Physical chemist Peter Atkins, in his textbook Physical Chemistry, introduces entropy with the statement that "spontaneous changes are always accompanied by a dispersal of energy or matter and often both".[74]
Relating entropy to energy usefulness
It is possible (in a thermal context) to regard lower entropy as a measure of the effectiveness or usefulness of a particular quantity of energy.[75] Energy supplied at a higher temperature (i.e. with low entropy) tends to be more useful than the same amount of energy available at a lower temperature. Mixing a hot parcel of a fluid with a cold one produces a parcel of intermediate temperature, in which the overall increase in entropy represents a "loss" that can never be replaced.
As the entropy of the universe is steadily increasing, its total energy is becoming less useful. Eventually, this leads to the heat death of the universe.[76]
Entropy and adiabatic accessibility
A definition of entropy based entirely on the relation of adiabatic accessibility between equilibrium states was given by E. H. Lieb and J. Yngvason in 1999.[77] This approach has several predecessors, including the pioneering work of Constantin Carathéodory from 1909[78] and the monograph by R. Giles.[79] In the setting of Lieb and Yngvason, one starts by picking, for a unit amount of the substance under consideration, two reference states X0 and X1 such that the latter is adiabatically accessible from the former but not vice versa. Defining the entropies of the reference states to be 0 and 1 respectively, the entropy of a state X is defined as the largest number λ such that X is adiabatically accessible from a composite state consisting of an amount λ in the state X1 and a complementary amount, (1 − λ), in the state X0. A simple but important result within this setting is that entropy is uniquely determined, apart from a choice of unit and an additive constant for each chemical element, by the following properties: it is monotonic with respect to the relation of adiabatic accessibility, additive on composite systems, and extensive under scaling.
Entropy in quantum mechanics
In quantum statistical mechanics, the concept of entropy was developed by John von Neumann and is generally referred to as "von Neumann entropy":

S = −kB Tr(ρ ln ρ)

where ρ is the density matrix and Tr is the trace operator.
This upholds the correspondence principle, because in the classical limit, when the phases between the basis states used for the classical probabilities are purely random, this expression is equivalent to the familiar classical definition of entropy,

S = −kB Σi pi ln pi,
i.e. in such a basis the density matrix is diagonal.
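A small numerical sketch of the von Neumann entropy (added here for illustration; the example density matrices are assumptions): computed from the eigenvalues of ρ, it vanishes for a pure state and equals kB ln 2 for a maximally mixed qubit.

```python
import numpy as np

k_B = 1.380649e-23  # J/K; set to 1 for entropy in nats

def von_neumann_entropy(rho):
    """S = -k_B Tr(rho ln rho), computed from the eigenvalues of the density matrix."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-15]          # ignore numerically zero eigenvalues
    return -k_B * np.sum(eigvals * np.log(eigvals))

# Maximally mixed qubit: rho = I/2, entropy = k_B ln 2
rho_mixed = np.eye(2) / 2
print(von_neumann_entropy(rho_mixed), k_B * np.log(2))

# Pure state |0><0|: entropy = 0
rho_pure = np.array([[1.0, 0.0], [0.0, 0.0]])
print(von_neumann_entropy(rho_pure))
```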
Von Neumann established a rigorous mathematical framework for quantum mechanics with his work Mathematische Grundlagen der Quantenmechanik. He provided in this work a theory of measurement, where the usual notion of wave function collapse is described as an irreversible process (the so-called von Neumann or projective measurement). Using this concept, in conjunction with the density matrix he extended the classical concept of entropy into the quantum domain.
Information theory
I thought of calling it "information", but the word was overly used, so I decided to call it "uncertainty". [...] Von Neumann told me, "You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage."
Conversation between Claude Shannon and John von Neumann regarding what name to give to the attenuation in phone-line signals[80]
When viewed in terms of information theory, the entropy state function is the amount of information in the system that is needed to fully specify the microstate of the system. Entropy is the measure of the amount of missing information before reception.[81] Often called Shannon entropy, it was originally devised by Claude Shannon in 1948 to study the size of information of a transmitted message. The definition of information entropy is expressed in terms of a discrete set of probabilities pi so that

H = −Σi pi log pi
In the case of transmitted messages, these probabilities were the probabilities that a particular message was actually transmitted, and the entropy of the message system was a measure of the average size of information of a message. For the case of equal probabilities (i.e. each message is equally probable), the Shannon entropy (in bits) is just the number of binary questions needed to determine the content of the message.[28]
Most researchers consider information entropy and thermodynamic entropy directly linked to the same concept,[82][83][84][85][86] while others argue that they are distinct.[87] Both expressions are mathematically similar. If W is the number of microstates that can yield a given macrostate, and each microstate has the same a priori probability, then that probability is p = 1/W. The Shannon entropy (in nats) is

H = −Σi (1/W) ln(1/W) = ln W

and, if entropy is measured in units of k per nat, the entropy is given by

H = k ln W

which is the Boltzmann entropy formula, where k is the Boltzmann constant, which may be interpreted as the thermodynamic entropy per nat. Some authors argue for dropping the word entropy for the H function of information theory and using Shannon's other term, "uncertainty", instead.[88]
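A short numerical sketch of the Shannon entropy (the message probabilities below are assumed for illustration): four equally likely messages need exactly two binary questions, while a biased source carries less information per message.

```python
import math

def shannon_entropy(probs, base=2):
    """H = -sum(p_i log p_i); base 2 gives bits, base e gives nats."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Four equally likely messages: exactly 2 binary questions (bits) are needed
print(shannon_entropy([0.25] * 4))            # 2.0 bits

# A biased source carries less information per message
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))  # ~1.36 bits
```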
Measurement
The entropy of a substance can be measured, although in an indirect way. The measurement, known as entropymetry,[89] is done on a closed system (with particle number N and volume V being constants) and uses the definition of temperature[90] in terms of entropy, while limiting energy exchange to heat (dU → dQ):

T := (∂U/∂S)V,N ⇒ dS = dQ/T
The resulting relation describes how the entropy changes dS when a small amount of energy dQ is introduced into the system at a certain temperature T.
The process of measurement goes as follows. First, a sample of the substance is cooled as close to absolute zero as possible. At such temperatures, the entropy approaches zero – due to the definition of temperature. Then, small amounts of heat are introduced into the sample and the change in temperature is recorded, until the temperature reaches a desired value (usually 25 °C). The obtained data allows the user to integrate the equation above, yielding the absolute value of entropy of the substance at the final temperature. This value of entropy is called calorimetric entropy.[91]
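A minimal computational sketch of this procedure (the heat-capacity model below is an assumed, Debye-like form for illustration, not measured data): integrating Cp(T)/T from near absolute zero to the final temperature gives the calorimetric entropy.

```python
import numpy as np

def calorimetric_entropy(T_grid, Cp_grid):
    """Integrate dS = Cp(T)/T dT over the measured temperature range (trapezoidal rule)."""
    integrand = Cp_grid / T_grid
    return np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(T_grid))

# Hypothetical solid: Cp rises like T^3 at low T and levels off near 3R at high T (assumed model)
T = np.linspace(1.0, 298.15, 2000)              # K, starting just above absolute zero
Cp = 3 * 8.314 * (T**3 / (T**3 + 150.0**3))     # J/(mol K)
print(f"S(298 K) ~ {calorimetric_entropy(T, Cp):.1f} J/(mol K)")
```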
Interdisciplinary applications
Although the concept of entropy was originally a thermodynamic concept, it has been adapted in other fields of study,[60] including information theory, psychodynamics, thermoeconomics/ecological economics, and evolution.[68][92][93][94][95]
Philosophy and theoretical physics
Entropy is the only quantity in the physical sciences that seems to imply a particular direction of progress, sometimes called an arrow of time. As time progresses, the second law of thermodynamics states that the entropy of an isolated system never decreases in large systems over significant periods of time. Hence, from this perspective, entropy measurement is thought of as a clock in these conditions[citation needed].
Biology
Chiavazzo et al. proposed that where cave spiders choose to lay their eggs can be explained through entropy minimization.[96]
Entropy has been proven useful in the analysis of base pair sequences in DNA. Many entropy-based measures have been shown to distinguish between different structural regions of the genome, differentiate between coding and non-coding regions of DNA, and can also be applied for the recreation of evolutionary trees by determining the evolutionary distance between different species.[97]
Cosmology
Assuming that a finite universe is an isolated system, the second law of thermodynamics states that its total entropy is continually increasing. It has been speculated, since the 19th century, that the universe is fated to a heat death in which all the energy ends up as a homogeneous distribution of thermal energy so that no more work can be extracted from any source.
If the universe can be considered to have generally increasing entropy, then – as Roger Penrose has pointed out – gravity plays an important role in the increase, because gravity causes dispersed matter to accumulate into stars, which eventually collapse into black holes. The entropy of a black hole is proportional to the surface area of the black hole's event horizon.
The role of entropy in cosmology remains a controversial subject since the time of Ludwig Boltzmann. Recent work has cast some doubt on the heat death hypothesis and the applicability of any simple thermodynamic model to the universe in general. Although entropy does increase in the model of an expanding universe, the maximum possible entropy rises much more rapidly, moving the universe further from the heat death with time, not closer.[102][103][104] This results in an "entropy gap" pushing the system further away from the posited heat death equilibrium.[105] Other complicating factors, such as the energy density of the vacuum and macroscopic quantum effects, are difficult to reconcile with thermodynamical models, making any predictions of large-scale thermodynamics extremely difficult.[106]
Current theories suggest the entropy gap to have been originally opened up by the early rapid exponential expansion of the universe.[107]
Economics
In economics, Georgescu-Roegen's work has generated the term 'entropy pessimism'.[111]: 116 Since the 1990s, leading ecological economist and steady-state theorist Herman Daly – a student of Georgescu-Roegen – has been the economics profession's most influential proponent of the entropy pessimism position.[112]: 545f [113]
See also
- Boltzmann entropy
- Brownian ratchet
- Configuration entropy
- Conformational entropy
- Entropic explosion
- Entropic force
- Entropy unit
- Entropic value at risk
- Entropy and life
- Free entropy
- Harmonic entropy
- Info-metrics
- Negentropy (negative entropy)
- Phase space
- Principle of maximum entropy
- Residual entropy
- Thermodynamic potential
Notes
- ^ The overdots represent derivatives of the quantities with respect to time.
References
- .
- ISBN 0387904034 – via Internet Archive.
- ISBN 0-444-87009-1, pp. 576–577.
- ISBN 9789812707062.
- ^ "Carnot, Sadi (1796–1832)". Wolfram Research. 2007. Retrieved 24 February 2010.
- ^ McCulloch, Richard, S. (1876). Treatise on the Mechanical Theory of Heat and its Applications to the Steam-Engine, etc. D. Van Nostrand.
- ^ [On the Motive Power of Heat, and on the Laws which can be deduced from it for the Theory of Heat]: Poggendorff's Annalen der Physik und Chemie
- ISBN 0198642148, pp. 1826–1827.
- ^ . "Sucht man für S einen bezeichnenden Namen, so könnte man, ähnlich wie von der Gröſse U gesagt ist, sie sey der Wärme- und Werkinhalt des Körpers, von der Gröſse S sagen, sie sey der Verwandlungsinhalt des Körpers. Da ich es aber für besser halte, die Namen derartiger für die Wissenschaft wichtiger Gröſsen aus den alten Sprachen zu entnehmen, damit sie unverändert in allen neuen Sprachen angewandt werden können, so schlage ich vor, die Gröſse S nach dem griechischen Worte ἡ τροπή, die Verwandlung, die Entropie des Körpers zu nennen. Das Wort Entropie habei ich absichtlich dem Worte Energie möglichst ähnlich gebildet, denn die beiden Gröſsen, welche durch diese Worte benannt werden sollen, sind ihren physikalischen Bedeutungen nach einander so nahe verwandt, daſs eine gewisse Gleichartigkeit in der Benennung mir zweckmäſsig zu seyn scheint." (p. 390).
- ^ ISBN 0-691-02350-6. "Clausius coined the word entropy for : ″I prefer going to the ancient languages for the names of important scientific quantities, so that they may mean the same thing in all living tongues. I propose, accordingly, to call the entropy of a body, after the Greek word 'transformation.' I have designedly coined the word entropy to be similar to 'energy,' for these two quantities are so analogous in their physical significance, that an analogy of denomination seemed to me helpful.″"
- ^ a b Cooper, Leon N. (1968). An Introduction to the Meaning and Structure of Physics. Harper. p. 331.
- ^ The scientific papers of J. Willard Gibbs in Two Volumes. Vol. 1. Longmans, Green, and Co. 1906. p. 11. Retrieved 26 February 2011.
- ^ J. A. McGovern,"2.5 Entropy". Archived from the original on 23 September 2012. Retrieved 5 February 2013.
- ^ "6.5 Irreversibility, Entropy Changes, and Lost Work". web.mit.edu. Retrieved 21 May 2016.
- ^ Lower, Stephen. "What is entropy?". www.chem1.com. Retrieved 21 May 2016.
- ISBN 978-1-4419-1430-9.
- ISBN 978-0-936508-16-0.
- ISBN 978-0-387-90403-0.
- ISBN 978-0-486-41735-6.
- ^ a b Planck, M. (1945). Treatise on Thermodynamics. Dover Publications. p. §90 & §137.
eqs.(39), (40), & (65)
. - ISBN 978-1-4981-6733-8.
- ^ Fermi, E. (1956). Thermodynamics. Dover Publications (still in print). p. 48.
eq.(64)
. - ISBN 978-0-19-870072-2.
- ISBN 978-0-8053-3842-3.
- ^ ISBN 978-0-07-143953-4.
- ^ ISBN 978-0-19-856677-9.
- ISBN 978-0-7607-4616-5.
- ^ a b Frigg, R. and Werndl, C. "Entropy – A Guide for the Perplexed". In Probabilities in Physics; Beisbart C. and Hartmann, S. Eds; Oxford University Press, Oxford, 2010
- ISBN 978-0-201-38027-9.
- ISBN 9780198803195.
- ISSN 0556-2791.
- PMID 30525736.
- PMID 30770449.
- PMID 32855393.
- ISSN 0002-9505.
- ^ Jaynes, E.T. (1992). Smith, C.R; Erickson, G.J; Neudorfer, P.O. (eds.). 'The Gibbs Paradox,' [in] Maximum Entropy and Bayesian Methods (PDF). Kluwer Academic: Dordrecht. pp. 1–22. Retrieved 17 August 2012.
- ^ ISBN 978-0-471-66174-0.
- ISBN 978-0-935702-99-6.)
- ISBN 978-0-521-79165-6.
- ISBN 978-0-19-280641-3.
- ISBN 978-0-06-011029-1.
- ^ McGovern, J. A. "Heat Capacities". Archived from the original on 19 August 2012. Retrieved 27 January 2013.
- doi:10.3390/e9030133.
- ISBN 978-0-471-86256-7.
- S2CID 118981017.
- S2CID 221978379.
- ISBN 978-0-19-280628-4.
- S2CID 22204063.
- S2CID 119224112.
- ^ Ziegler, H. (1983). An Introduction to Thermomechanics. North Holland, Amsterdam.
- .
- ^ Kleidon, A.; et., al. (2005). Non-equilibrium Thermodynamics and the Production of Entropy. Heidelberg: Springer.
- PMID 25662746.
- ^ ISBN 978-0-534-42201-1.
- S2CID 18081336.
- ISBN 978-0-07-231808-1.
- ^ )
- ISBN 978-1-298-49740-6.
- ISBN 978-0-12-245601-5.
- ^ Bibcode:2020tcsp.book.....P.
- ISBN 978-0-471-83050-4.)
- ^ "GRC.nasa.gov". GRC.nasa.gov. 27 March 2000. Archived from the original on 21 August 2011. Retrieved 17 August 2012.
- ^ Franzen, Stefan. "Third Law" (PDF). ncsu.edu. Archived from the original (PDF) on 9 July 2017.
- ^ "GRC.nasa.gov". GRC.nasa.gov. 11 July 2008. Retrieved 17 August 2012.
- ISBN 978-1489983671. Retrieved 5 September 2019.
- ISBN 978-0-684-85578-3.
- ^ "Entropy: Definition and Equation". Encyclopædia Britannica. Retrieved 22 May 2016.
- ^ ISBN 978-0-226-07574-7.
- ^ S2CID 122424225.
- ^ .
- ^ Lambert, Frank L. "A Student's Approach to the Second Law and Entropy". entropysite.oxy.edu. Archived from the original on 17 July 2009. Retrieved 22 May 2016.
- ISSN 1369-5614.
- S2CID 97102995.
- ISBN 978-0-19-882336-0.
- ^ Sandra Saary (23 February 1993). "Book Review of "A Science Miscellany"". Khaleej Times. UAE: Galadari Press: xi.
- ISBN 978-0-521-26173-9.
- S2CID 119620408.
- S2CID 118230148.
- ISBN 978-1-4831-8491-3.
- ^ Tribus, M.; McIrvine, E.C. (1971). "Energy and information". Scientific American. 224: 178–184.
- ISBN 978-3-7643-7116-6.
- ISBN 978-0-486-43918-1.
- ^ ISBN 978-0-674-25781-8.
- ISBN 978-981-256-323-1.
- S2CID 118726162.
- ISBN 9789812832269.
- ISBN 978-3642401534. Retrieved 31 August 2019.].
Inked page forms pattern w/ information → page entropy < diff page w/ randomized letters. Reduced entropy trivial compared to inked paper entropy. If the paper is burned, it hardly matters in a thermodynamic context if the text contains the meaning of life or only jibberish [sic
- ^ Schneider, Tom, DELILA system (Deoxyribonucleic acid Library Language), (Information Theory Analysis of binding sites), Laboratory of Mathematical Biology, National Cancer Institute, Frederick, MD
- S2CID 212779004.
- ISBN 978-0-201-38027-9.
- ^ "Measuring Entropy". www.chem.wisc.edu.
- ISBN 978-981-238-399-0.
- ISBN 978-0-521-80293-2.)
- S2CID 12418973.
- ISBN 978-1-4939-3464-5.
- PMID 25556697.
- ISSN 0378-4371.
- ISBN 978-0-674-01387-2.)
- S2CID 9329564.
- S2CID 7115890.
- ISBN 978-0470190401. Retrieved 31 August 2019.
- ^ Layzer, David (1988). Growth of Order in the Universe. MIT Press.
- ISBN 978-0-674-00342-2.
- ISBN 978-1-107-02725-1.
- ISBN 978-1-59102-481-1.
- ISBN 978-0-387-96526-0.
- Bibcode:2002astro.ph.10527A. Retrieved 28 June 2017. In honor of John Wheeler's 90th birthday.
- .
- ISBN 978-1-59726-681-9.
- ISBN 978-0-8155-1537-1.
- S2CID 154728333.
- .
- ^
S2CID 13441670. Retrieved 23 November 2016.
Further reading
- Adam, Gerhard; ISBN 978-3-528-33311-9.
- Atkins, Peter; Julio De Paula (2006). Physical Chemistry (8th ed.). Oxford University Press. ISBN 978-0-19-870072-2.
- Baierlein, Ralph (2003). Thermal Physics. Cambridge University Press. ISBN 978-0-521-65838-6.
- ISBN 978-981-270-055-1.
- Callen, Herbert, B (2001). Thermodynamics and an Introduction to Thermostatistics (2nd ed.). John Wiley and Sons. ISBN 978-0-471-86256-7.)
- Chang, Raymond (1998). Chemistry (6th ed.). New York: McGraw Hill. ISBN 978-0-07-115221-1.
- Cutnell, John, D.; Johnson, Kenneth, J. (1998). Physics (4th ed.). John Wiley and Sons, Inc. ISBN 978-0-471-19113-1.)
- Dugdale, J. S. (1996). Entropy and its Physical Meaning (2nd ed.). Taylor and Francis (UK); CRC (US). ISBN 978-0-7484-0569-5.
- ISBN 978-0-486-60361-2.
- Goldstein, Martin; Inge, F (1993). The Refrigerator and the Universe. Harvard University Press. ISBN 978-0-674-75325-9.
- Gyftopoulos, E.P.; G.P. Beretta (2010). Thermodynamics. Foundations and Applications. Dover. ISBN 978-0-486-43932-7.
- Haddad, Wassim M.; Chellaboina, VijaySekhar; Nersesov, Sergey G. (2005). Thermodynamics – A Dynamical Systems Approach. ISBN 978-0-691-12327-1.
- Johnson, Eric (2018). Anxiety and the Equation: Understanding Boltzmann's Entropy. The MIT Press. ISBN 978-0-262-03861-4.
- Kroemer, Herbert; Charles Kittel (1980). Thermal Physics (2nd ed.). W. H. Freeman Company. ISBN 978-0-7167-1088-2.
- Lambert, Frank L.; entropysite.oxy.edu
- ISBN 978-981-4449-53-3.
- ISBN 978-0-679-45443-4.
- Reif, F. (1965). Fundamentals of statistical and thermal physics. McGraw-Hill. ISBN 978-0-07-051800-1.
- Schroeder, Daniel V. (2000). Introduction to Thermal Physics. New York: Addison Wesley Longman. ISBN 978-0-201-38027-9.
- Serway, Raymond, A. (1992). Physics for Scientists and Engineers. Saunders Golden Subburst Series. ISBN 978-0-03-096026-0.)
- Sharp, Kim (2019). Entropy and the Tao of Counting: A Brief Introduction to Statistical Mechanics and the Second Law of Thermodynamics (SpringerBriefs in Physics). Springer Nature. ISBN 978-3030354596.
- Spirax-Sarco Limited, Entropy – A Basic Understanding A primer on entropy tables for steam engineering
- vonBaeyer; Hans Christian (1998). ISBN 978-0-679-43342-2.
External links
- Entropy and the Second Law of Thermodynamics – an A-level physics lecture with 'derivation' of entropy based on Carnot cycle
- Khan Academy: entropy lectures, part of Chemistry playlist
- The Second Law of Thermodynamics and Entropy – Yale OYC lecture, part of Fundamentals of Physics I (PHYS 200)
- Entropy and the Clausius inequality MIT OCW lecture, part of 5.60 Thermodynamics & Kinetics, Spring 2008
- The Discovery of Entropy by Adam Shulman. Hour-long video, January 2013.
- Moriarty, Philip; Merrifield, Michael (2009). "S Entropy". Sixty Symbols. Brady Haran for the University of Nottingham.
- "Entropy" at Scholarpedia
- ^ David, Kover. "Entropia – fyzikálna veličina vesmíru a nášho života". www.stejfree.sk. Retrieved 13 April 2022.