Entropy and life
Research concerning the relationship between the thermodynamic quantity entropy and the origin and evolution of life began around the turn of the 20th century.
The 1944 book What is Life? by Nobel laureate physicist Erwin Schrödinger stimulated further research in the field.
Ideas about the relationship between entropy and living organisms have inspired hypotheses and speculations in many contexts, including psychology, information theory, the origin of life, and the possibility of extraterrestrial life.
Early views
In 1863 Rudolf Clausius outlined a preliminary relationship between heat and living processes; Ludwig Boltzmann later framed the struggle for existence in explicitly thermodynamic terms:
The general struggle for existence of animate beings is not a struggle for raw materials – these, for organisms, are air, water and soil, all abundantly available – nor for energy which exists in plenty in any body in the form of heat, but a struggle for [negative] entropy, which becomes available through the transition of energy from the hot sun to the cold earth.[4]
In 1876 American civil engineer Richard Sears McCulloh, in his Treatise on the Mechanical Theory of Heat and its Application to the Steam-Engine, which was an early thermodynamics textbook, states, after speaking about the laws of the physical world, that "there are none that are established on a firmer basis than the two general propositions of Joule and Carnot; which constitute the fundamental laws of our subject." McCulloh then goes on to show that these two laws may be combined in a single expression as follows:
$S = \int \frac{dQ}{\tau}$

where

- $S$ – entropy
- $dQ$ – a differential amount of heat passed into a thermodynamic system
- $\tau$ – absolute temperature
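The Clausius relation that this expression rests on can be illustrated numerically. The figures below (Sun and Earth temperatures, the amount of heat transferred) are round illustrative values, not taken from the text:

```python
# Numerical sketch (not from the source): entropy change of a reservoir
# exchanging heat Q at constant absolute temperature T, using the
# Clausius relation dS = dQ/T underlying McCulloh's combined expression.

def entropy_change(q_joules, t_kelvin):
    """Entropy change for heat q transferred at constant temperature t."""
    return q_joules / t_kelvin

# Boltzmann's "hot sun to cold earth": the same 1000 J leaves the Sun's
# photosphere (~5800 K) and is absorbed by Earth (~290 K).
dS_sun = entropy_change(-1000.0, 5800.0)   # entropy lost by the Sun
dS_earth = entropy_change(1000.0, 290.0)   # entropy gained by Earth

print(round(dS_sun, 3))       # -0.172 (J/K)
print(round(dS_earth, 3))     # 3.448 (J/K)
print(dS_sun + dS_earth > 0)  # True: net entropy increases, as the second law requires
```

The asymmetry between the two terms is what makes the "transition of energy from the hot sun to the cold earth" a source of negative entropy for organisms.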
McCulloh then declares that the applications of these two laws, i.e. what are currently known as the first law of thermodynamics and the second law of thermodynamics, are innumerable:
When we reflect how generally physical phenomena are connected with thermal changes and relations, it at once becomes obvious that there are few, if any, branches of [natural science that stand apart from them. Wherever the] mechanical theory of heat has been freely adopted, whole branches of physical science have been revolutionized by it.[5]: p. 267
McCulloh gives a few of what he calls the "more interesting examples" of the application of these laws in extent and utility. His first example is physiology, in particular the long-debated question of the source of animal heat.
To answer this question he turns to the mechanical theory of heat and goes on to loosely outline how the heart is what he calls a "force-pump", which receives blood and sends it to every part of the body, as discovered by William Harvey, and which "acts like the piston of an engine and is dependent upon and consequently due to the cycle of nutrition and excretion which sustains physical or organic life". It is likely that McCulloh modeled parts of this argument on the famous Carnot cycle. In conclusion, he summarizes his first and second law argument as follows:
Everything physical being subject to the law of conservation of energy, it follows that no physiological action can take place except with expenditure of energy derived from food; also, that an animal performing mechanical work must from the same quantity of food generate less heat than one abstaining from exertion, the difference being precisely the heat equivalent of that of work.[5]: p. 270
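McCulloh's conclusion is a piece of first-law bookkeeping that can be made concrete with a toy calculation; all numbers here are hypothetical:

```python
# Illustrative first-law bookkeeping (values invented): the chemical energy
# of a fixed ration of food is split between heat and mechanical work, so an
# animal doing work must release correspondingly less heat.

food_energy = 10_000.0  # J from the same quantity of food (assumed value)
work_done = 1_500.0     # J of mechanical work performed

heat_resting = food_energy              # abstaining from exertion: all food energy -> heat
heat_working = food_energy - work_done  # performing work: the remainder -> heat

print(heat_resting - heat_working)  # 1500.0, exactly the heat equivalent of the work
```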
Negative entropy
In the 1944 book What is Life?, Austrian physicist Erwin Schrödinger theorized that life, contrary to the general tendency dictated by the second law of thermodynamics, decreases or maintains its entropy by feeding on "negative entropy". Anticipating objections from physicists to this choice of term, he wrote:
Let me say first, that if I had been catering for them [physicists] alone I should have let the discussion turn on free energy instead. It is the more familiar notion in this context. But this highly technical term seemed linguistically too near to energy for making the average reader alive to the contrast between the two things.
This, Schrödinger argues, is what differentiates life from other forms of the organization of matter. Although life's dynamics may appear to run against the tendency of the second law, life does not in any way conflict with or invalidate this law. The principle that entropy can only increase or remain constant applies only to a closed system which is adiabatically isolated, meaning no heat can enter or leave, and the physical and chemical processes that make life possible do not occur in adiabatic isolation: living systems are open systems. Whenever a system can exchange either heat or matter with its environment, an entropy decrease of that system is entirely compatible with the second law.[8]
Schrödinger asked the question: "How does the living organism avoid decay?" The obvious answer is: "By eating, drinking, breathing and (in the case of plants) assimilating." While energy from nutrients is necessary to sustain an organism's order, Schrödinger also presciently postulated the existence of other molecules equally necessary for creating the order observed in living organisms: "An organism's astonishing gift of concentrating a stream of order on itself and thus escaping the decay into atomic chaos – of drinking orderliness from a suitable environment – seems to be connected with the presence of the aperiodic solids..." We now know that this "aperiodic" crystal is DNA, and that its irregular arrangement is a form of information. "The DNA in the cell nucleus contains the master copy of the software, in duplicate. This software seems to control by specifying an algorithm, or set of instructions, for creating and maintaining the entire organism containing the cell."[9]
DNA and other macromolecules determine an organism's life cycle: birth, growth, maturity, decline, and death. Nutrition is necessary but not sufficient to account for growth in size, as genetics is the governing factor. At some point, virtually all organisms normally decline and die even while remaining in environments that contain sufficient nutrients to sustain life. The controlling factor must be internal, not nutrients or sunlight acting as causal exogenous variables. Organisms inherit the ability to create unique and complex biological structures; it is unlikely for those capabilities to be reinvented or taught to each generation. Therefore, DNA must be operative as the prime cause in this characteristic as well. Applying Boltzmann's perspective of the second law, the change of state from a more probable, less ordered, higher entropy arrangement to one of less probability, more order, and lower entropy (as is seen in biological ordering) calls for a function like that known to be performed by DNA. DNA's apparent information-processing function provides a resolution of the Schrödinger paradox posed by life and the entropy requirement of the second law.[10]
Gibbs free energy and biological evolution
In recent years the thermodynamic interpretation of evolution in relation to entropy has begun to use the concept of the Gibbs free energy, rather than entropy.[11][12] This is because biological processes on Earth take place at roughly constant temperature and pressure, a situation in which the Gibbs free energy is an especially useful way to express the second law of thermodynamics. The Gibbs free energy is given by:
$\Delta G = \Delta H - T \Delta S$

where

- $G$ – Gibbs free energy
- $H$ – enthalpy passed into the thermodynamic system
- $T$ – absolute temperature of the system
- $S$ – entropy
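A minimal numeric sketch of the Gibbs relation (values invented for illustration) shows how a process that decreases entropy can still be spontaneous at constant temperature and pressure, provided it releases enough heat:

```python
# Sketch of the Gibbs relation dG = dH - T*dS at constant T and p.
# All numerical values are illustrative, not taken from the article.

def gibbs_free_energy_change(dH, T, dS):
    """Return dG in J/mol, with dH in J/mol, T in K, dS in J/(mol*K)."""
    return dH - T * dS

# An ordering process (dS < 0) can still be spontaneous (dG < 0)
# if it releases enough heat (dH sufficiently negative):
dG = gibbs_free_energy_change(dH=-60_000.0, T=298.15, dS=-100.0)
print(round(dG, 1))  # -30185.0 J/mol: spontaneous despite the entropy decrease
```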
In his 2003 book Information Theory and Evolution, the chemist John Avery presents a framework in which the phenomenon of life, including its origin and evolution, as well as human cultural evolution, has its basis in thermodynamics, statistical mechanics, and information theory. The (apparent) paradox between the second law of thermodynamics and the high degree of order and complexity produced by living systems, according to Avery, has its resolution "in the information content of the Gibbs free energy that enters the biosphere from outside sources."[14] Assuming evolution drives organisms towards higher information content, Gregory Chaitin postulates that life has properties of high mutual information,[15] and Tamvakis that life can be quantified using mutual information density metrics, a generalisation of the concept of biodiversity.[16]
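The mutual information invoked by Chaitin and Tamvakis can be computed directly from a joint probability distribution; the tiny two-variable distribution below is invented for illustration:

```python
# Hedged sketch: mutual information I(X;Y) = sum p(x,y)*log2(p(x,y)/(p(x)*p(y))).
# The 2x2 joint distribution here is an invented example, not biological data.
from math import log2

joint = {("a", "0"): 0.4, ("a", "1"): 0.1,
         ("b", "0"): 0.1, ("b", "1"): 0.4}

# Marginal distributions of X and Y.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

mi = sum(p * log2(p / (px[x] * py[y])) for (x, y), p in joint.items())
print(round(mi, 4))  # 0.2781 bits: the two variables share information
```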
In a study titled "Natural selection for least action" published in the Proceedings of the Royal Society A, Ville Kaila and Arto Annila of the University of Helsinki describe how the process of natural selection responsible for local increases in order may be mathematically derived directly from the expression of the second law equation for connected non-equilibrium open systems. The second law of thermodynamics can be written as an equation of motion to describe evolution, showing how natural selection and the principle of least action can be connected by expressing natural selection in terms of chemical thermodynamics. In this view, evolution explores possible paths to level differences in energy densities and so increase entropy most rapidly. Thus, an organism serves as an energy transfer mechanism, and beneficial mutations allow successive organisms to transfer more energy within their environment.[17][18]
Counteracting the second law tendency
Second-law analysis is valuable in scientific and engineering work because it provides a number of benefits over energy analysis alone, including a basis for determining energy quality (or exergy content[19][20][21]), understanding fundamental physical phenomena, improving performance evaluation and optimization, and furthering our understanding of living systems.
The second law describes a universal tendency towards disorder and uniformity, that is, towards internal and external equilibrium. This means that real, non-ideal processes produce entropy. Entropy can also be transferred to or from a system by the flow of matter and energy, so entropy production does not necessarily cause the entropy of the system itself to increase. In fact, the entropy or disorder in a system can spontaneously decrease, as when an aircraft gas turbine engine cools down after shutdown, or when water in a cup left outside in sub-freezing winter temperatures freezes. In the latter case, a relatively disordered liquid cools and spontaneously freezes into a crystallized structure of reduced disorder as the molecules 'stick' together. Although the entropy of the system decreases, the system approaches uniformity with, or becomes more thermodynamically similar to, its surroundings.[22] This is a category III process, one of the four combinations of entropy (S) increasing or decreasing, and uniformity (Y) between the system and its environment increasing or decreasing.
The second law can be conceptually stated[22] as follows: Matter and energy have the tendency to reach a state of uniformity or internal and external equilibrium, a state of maximum disorder (entropy). Real non-equilibrium processes always produce entropy, causing increased disorder in the universe, while idealized reversible processes produce no entropy, and no process is known to exist that destroys entropy. The tendency of a system to approach uniformity may be counteracted, and the system may become more ordered or complex, by the combination of two things: a work or exergy source and some form of instruction or intelligence. Here 'exergy' means the thermal, mechanical, electric or chemical work potential of an energy source or flow, and 'instruction or intelligence' is understood in the context of, or characterized by, the set of processes within category IV.
Consider as an example of a category IV process the robotic manufacturing and assembly of vehicles in a factory. The robotic machinery requires electrical work input and instructions, but when completed, the manufactured products have less uniformity with their surroundings, or more complexity (higher order), relative to the raw materials they were made from. Thus, system entropy or disorder decreases while the tendency towards uniformity between the system and its environment is counteracted. In this example, the instructions, as well as the source of work, may be internal or external to the system, and they may or may not cross the system boundary. To illustrate, the instructions may be pre-coded and the electrical work may be stored in an energy storage system on-site. Alternatively, the control of the machinery may be by remote operation over a communications network, while the electric work is supplied to the factory from the local electric grid. In addition, humans may directly play, in whole or in part, the role that the robotic machinery plays in manufacturing. In this case, instructions may be involved, but intelligence is directly or indirectly responsible for the direction or application of work in such a way as to counteract the tendency towards disorder and uniformity.
As another example, consider the refrigeration of water in a warm environment. Due to refrigeration, heat is extracted or forced to flow from the water. As a result, the temperature and entropy of the water decreases, and the system moves further away from uniformity with its warm surroundings. The important point is that refrigeration not only requires a source of work, it requires designed equipment, as well as pre-coded or direct operational intelligence or instructions to achieve the desired refrigeration effect.
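The entropy bookkeeping of this refrigeration example can be sketched as follows, with invented but physically consistent numbers:

```python
# Entropy bookkeeping for the refrigeration example (illustrative numbers):
# heat Q is pumped out of cool water into a warmer room, which requires work.
# The water's entropy falls, but total entropy still rises.

Q_cold = 2000.0  # J extracted from the water
T_cold = 278.0   # K, water held near 5 degrees C
T_hot = 300.0    # K, warm surroundings
W = 500.0        # J of work input (assumed; more than the Carnot minimum)

dS_water = -Q_cold / T_cold     # water loses entropy
dS_room = (Q_cold + W) / T_hot  # room receives the extracted heat plus the work
total = dS_water + dS_room

print(round(dS_water, 3))  # -7.194 J/K: the system moves away from uniformity
print(round(total, 3))     # 1.139 J/K: net entropy production is still positive
```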
Observation is the basis for the understanding that category IV processes require both a source of exergy and a source or form of intelligence or instruction. With respect to living systems, sunlight provides the source of exergy for virtually all life on Earth: directly for flora, and indirectly in food for fauna. The work potential or exergy of sunlight, with a certain spectral and directional distribution, has a specific value[19][20][21] that can be expressed as a percentage of the energy flow, its exergy content. Like the Earth as a whole, living things use this energy, converting it to other forms (the first law) while producing entropy (the second law), thereby degrading the exergy or quality of the energy. Sustaining life, or the growth of a seed, for example, requires the continual arranging of atoms and molecules into the elaborate assemblies required to duplicate living cells. This assembly decreases uniformity and disorder, counteracting the universal tendency towards disorder and uniformity described by the second law. In addition to a high quality energy source, counteracting this tendency requires a form of instruction or intelligence, which is contained primarily in DNA/RNA.
In the absence of instruction or intelligence, high quality energy is not enough on its own to produce complex assemblies, such as a house. As an example of category I in contrast to category IV: despite having a great deal of energy or exergy, a second tornado will never re-construct a town destroyed by a previous tornado; instead it increases disorder and uniformity (category I), the very tendency described by the second law. A related question is whether, however improbable, over billions of years and trillions of chances, life could have come about undirected from non-living matter in the absence of any intelligence. Related questions include: can humans with a supply of food (exergy) live without DNA/RNA; can a house supplied with electricity be built in the forest without humans or a source of instruction or programming; can a fridge run on electricity without its functioning computer control boards?
The second law guarantees that if we build a house it will, over time, tend to fall apart, towards a state of disorder. On the other hand, if while walking through a forest we discover a house, we likely conclude that somebody built it rather than that the order came about randomly. We know that living systems, such as the structure and function of a living cell, or the process of protein assembly and folding, are exceedingly complex. Could life have come about without being directed by a source of intelligence – consequently, over time, resulting in such things as the human brain and its intelligence, computers, cities, the quality of love and the creation of music or fine art? The second law tendency towards disorder and uniformity, and the distinction of category IV processes as counteracting this natural tendency,[22] offers valuable insight to consider in seeking to answer these questions.
Entropy of individual cells
Entropy balancing
An entropy balance for an open system, or the change in entropy over time for a system at steady state, can be written as:

$\frac{dS}{dt} = \sum_i \frac{\dot{Q}_i}{T_i} + \sum_{in} \dot{n}\bar{s} - \sum_{out} \dot{n}\bar{s} + \dot{S}_{prod}$

Assuming a steady state system, roughly stable pressure-temperature conditions, and exchange through cell surfaces only,[23] this expression can be rewritten to express the entropy balance for an individual cell as:

$0 = \frac{\dot{Q}}{T} + \sum_B \bar{s}_B \dot{n}_B + \bar{s}_X \dot{n}_X + \dot{S}_{prod}$

where

- $\dot{Q}$ – heat exchange with the environment
- $\bar{s}_B$ – partial molar entropy of metabolite B
- $\bar{s}_X$ – partial molar entropy of structures resulting from growth
- $\dot{S}_{prod}$ – rate of entropy production

and the $\dot{n}$ terms indicate rates of exchange with the environment in units of moles per unit time.
This equation can be adapted to describe the entropy balance of a cell, which is useful in reconciling the spontaneity of cell growth with the intuition that the development of complex structures must overall decrease entropy within the cell. From the second law, the rate of entropy production must be positive; because of the internal organization that results from growth, the entropy carried by the new biomass structures is small. Metabolic processes therefore force the sum of the remaining two terms, heat exchange and metabolite exchange, to be less than zero, through either a large rate of heat transfer or the export of high-entropy waste products.[3] Both mechanisms prevent excess entropy from building up inside the growing cell; the latter is what Schrödinger described as feeding on negative entropy, or "negentropy".[24]
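A toy steady-state balance (all rates invented) illustrates this bookkeeping: positive internal entropy production must be offset by heat export and by entropy carried away with waste products.

```python
# Illustrative steady-state entropy bookkeeping for a growing cell
# (all numbers invented): internal entropy production must be offset by
# heat export and/or export of high-entropy waste.

S_prod = 5.0    # J/(K*h), internal entropy production (>= 0 by the second law)
S_growth = 0.5  # J/(K*h), small entropy carried by new biomass structures
S_waste = -3.0  # J/(K*h), net entropy exported with waste metabolites

# Balance: 0 = Q/T + S_waste + S_growth + S_prod  ->  solve for the heat term
Q_over_T = -(S_waste + S_growth + S_prod)
print(Q_over_T)  # -2.5: heat must flow out (negative) to close the balance
```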
Implications for metabolism
In fact it is possible for this "negentropy" contribution to be large enough that growth is fully endothermic, so that it actually removes heat from the environment. This type of metabolism, in which acetate, methanol, or a number of other hydrocarbon compounds are converted to methane (a high entropy gas),[25] is known as acetoclastic methanogenesis; one example is the metabolism of the anaerobic archaeon Methanosarcina barkeri.[26][27] At the opposite extreme is the metabolism of the anaerobic thermophilic archaeon Methanobacterium thermoautotrophicum,[28] for which the heat exported into the environment through fixation[29] is high (~3730 kJ/C-mol).[27]
Generally, in metabolic processes, spontaneous catabolic processes that break down biomolecules provide the energy to drive non-spontaneous anabolic reactions that build organized biomass from high entropy reactants.[30] Therefore, biomass yield is determined by the balance between coupled catabolic and anabolic processes, where the relationship between these processes can be described by:

$\Delta G_{tot} = \frac{\Delta G_{cat}}{Y_{X/S}} + \Delta G_{an}$

where

- $\Delta G_{tot}$ – total reaction driving force / overall molar Gibbs energy
- $Y_{X/S}$ – biomass produced per unit of substrate consumed
- $\Delta G_{cat}$ – Gibbs energy of catabolic reactions (negative)
- $\Delta G_{an}$ – Gibbs energy of anabolic reactions (positive)
Organisms must maintain some optimal balance between the catabolic and anabolic contributions to avoid both thermodynamic equilibrium, where the overall driving force vanishes and biomass production would be theoretically maximized but metabolism would proceed at an infinitely slow rate, and the opposite limiting case, where growth is highly favorable (a strongly negative overall driving force) but biomass yields are prohibitively low. This relationship is best described in general terms and will vary widely from organism to organism. Because the catabolic and anabolic contributions are roughly balanced in the equilibrium scenario, that case represents the maximum amount of organized matter that can be produced in accordance with the second law of thermodynamics for a very generalized metabolic system.[23]
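The trade-off can be sketched numerically. The relation ΔG_tot = ΔG_cat/Y + ΔG_an per unit of biomass formed is a common form in growth thermodynamics, and all numerical values below are invented:

```python
# Sketch of the yield/driving-force trade-off. The relation
# dG_tot = dG_cat / Y + dG_an (per C-mol of biomass formed) is a common
# form in growth thermodynamics; all numerical values here are invented.

dG_cat = -500.0  # kJ per C-mol, catabolic reaction (negative, energy-releasing)
dG_an = 100.0    # kJ per C-mol, anabolic reaction (positive, energy-consuming)

results = {}
for Y in (0.1, 0.5, 0.9):  # biomass yield, C-mol biomass per C-mol substrate
    results[Y] = dG_cat / Y + dG_an
    print(Y, results[Y])

# Low yield gives a strongly negative driving force (fast, wasteful growth);
# as yield rises, the driving force approaches zero and growth slows.
```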
Entropy and the origin of life
The second law of thermodynamics applied to the origin of life is a far more complicated issue than the further development of life, since there is no scientific consensus on how the first self-replicating lifeforms emerged, only a number of competing hypotheses.
Relationship to prebiotic chemistry
In 1924 Alexander Oparin suggested that sufficient energy for generating early life forms from non-living molecules was provided in a "primordial soup".[31] The laws of thermodynamics impose some constraints on the earliest life-sustaining reactions that would have emerged and evolved from such a mixture. Essentially, to remain consistent with the second law of thermodynamics, self-organizing systems that are characterized by lower entropy values than equilibrium must dissipate energy so as to increase entropy in the external environment.[32] One consequence of this is that low-entropy or high-chemical-potential intermediates cannot build up to very high levels if the reaction leading to their formation is not coupled to another chemical reaction that releases energy. These reactions often take the form of redox couples, which must have been provided by the environment at the time of the origin of life.[33] In today's biology, many of these reactions require catalysts (or enzymes) to proceed, and these frequently contain transition metals. This means that identifying both redox couples and metals that are readily available in a given candidate environment for abiogenesis is an important aspect of prebiotic chemistry.
The idea that processes that can occur naturally in the environment and act to locally decrease entropy must be identified has been applied in examinations of phosphate's role in the origin of life, where the relevant setting for abiogenesis is an early Earth lake environment. One such process is the ability of phosphate to concentrate reactants selectively due to its localized negative charge.[34]
In the context of the alkaline hydrothermal vent (AHV) hypothesis for the origin of life, lifeforms have been framed as "entropy generators" in an attempt to develop a framework for abiogenesis under alkaline deep sea conditions. Assuming life develops rapidly under certain conditions, experiments may be able to recreate the first metabolic pathway, as it would be the most energetically favorable and therefore the most likely to occur. In this case, iron sulfide compounds may have acted as the first catalysts.[35] Within the larger framing of life as a free energy converter, it would eventually be useful to quantitatively characterize entropy production and proton gradient dissipation rates for systems relevant to the origin of life (particularly AHVs).[36]
Other theories
The evolution of order, manifested as biological complexity, in living systems, and the generation of order in certain non-living systems, was proposed to obey a common fundamental principle called "the Darwinian dynamic".[37] The Darwinian dynamic was formulated by first considering how microscopic order is generated in relatively simple non-biological systems that are far from thermodynamic equilibrium (e.g. tornadoes, hurricanes). Consideration was then extended to short, replicating RNA molecules assumed to be similar to the earliest forms of life in the RNA world. It was shown that the underlying order-generating processes in the non-biological systems and in replicating RNA are basically similar. This approach helps clarify the relationship of thermodynamics to evolution, as well as the empirical content of Darwin's theory.
In 2009 physicist Karo Michaelian published a thermodynamic dissipation theory for the origin of life[38][39] in which the fundamental molecules of life (nucleic acids, amino acids, carbohydrates (sugars), and lipids) are considered to have been originally produced, through Prigogine's dissipative structuring,[40] as microscopic dissipative structures: pigments at the ocean surface that absorbed and dissipated into heat the UVC flux of solar light arriving at Earth's surface during the Archean, just as organic pigments do in the visible region today. These UVC pigments were formed through photochemical dissipative structuring from more common and simpler precursor molecules like HCN and H2O under the UVC flux of solar light.[38][39][41] The thermodynamic function of the original pigments was to increase the entropy production of the incipient biosphere under the solar photon flux. This remains the most important thermodynamic function of the biosphere today, though now mainly in the visible region, where photon intensities are higher and biosynthetic pathways are more complex, allowing pigments to be synthesized from lower-energy visible light instead of the UVC light that no longer reaches Earth's surface.
Jeremy England developed a hypothesis of the physics of the origins of life, that he calls 'dissipation-driven adaptation'.[42][43] The hypothesis holds that random groups of molecules can self-organize to more efficiently absorb and dissipate heat from the environment. His hypothesis states that such self-organizing systems are an inherent part of the physical world.[44]
Other types of entropy and their use in defining life
Like a thermodynamic system, an information system has an analogous concept to entropy called information entropy, introduced by Claude Shannon as a measure of the uncertainty associated with a message or probability distribution.
In 1984 Brooks and Wiley introduced the concept of species entropy as a measure of the sum of entropy reduction within species populations in relation to free energy in the environment.[46] Brooks-Wiley entropy looks at three categories of entropy changes: information, cohesion and metabolism. Information entropy here measures the efficiency of the genetic information in recording all the potential combinations of heredity which are present. Cohesion entropy looks at the sexual linkages within a population. Metabolic entropy is the familiar chemical entropy used to compare the population to its ecosystem. The sum of these three is a measure of nonequilibrium entropy that drives evolution at the population level.
A 2022 article by Helman in Acta Biotheoretica suggests identifying a divergence measure of these three types of entropies: thermodynamic entropy, information entropy and species entropy.[47] Where these three are overdetermined, there will be a formal freedom that arises similar to how chirality arises from a minimum number of dimensions. Once there are at least four points for atoms, for example, in a molecule that has a central atom, left and right enantiomers are possible. By analogy, once a threshold of overdetermination in entropy is reached in living systems, there will be an internal state space that allows for ordering of systems operations. That internal ordering process is a threshold for distinguishing living from nonliving systems.
Entropy and the search for extraterrestrial life
In 1964 James Lovelock was among a group of scientists asked by NASA to design a theoretical life-detection system for the upcoming missions to Mars. He wondered how one could be sure that Martian life, if any, would reveal itself to tests based on life as it exists on Earth.
These ideas conflicted with more traditional approaches, which assume that biological signatures on other planets would look much like they do on Earth. When discussing the issue with some of his colleagues at the Jet Propulsion Laboratory, Lovelock was asked what he would do to look for life on Mars instead. He replied: "I'd look for an entropy reduction, since this must be a general characteristic of life." This idea was perhaps better phrased as a search for sustained chemical disequilibria associated with low entropy states resulting from biological processes, and through further collaboration it developed into the hypothesis that biosignatures would be detectable by examining atmospheric compositions. By studying the atmosphere of Earth, Lovelock determined that this metric would indeed have the potential to reveal the presence of life. A consequence was the indication that Mars is most likely lifeless, as its atmosphere lacks any such anomalous signature.[48]
This work has been extended recently as a basis for biosignature detection in exoplanetary atmospheres. Essentially, the detection of multiple gases that are not typically in stable equilibrium with one another in a planetary atmosphere may indicate biotic production of one or more of them, in a way that does not require assumptions about the exact biochemical reactions extraterrestrial life might use or the specific products that would result. A terrestrial example is the coexistence of methane and oxygen, both of which would eventually deplete if not for continuous biogenic production. The amount of disequilibrium can be described by differencing the Gibbs free energies of the observed atmospheric state and its chemical equilibrium state.
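A hedged sketch of this metric, with invented mole fractions and chemical potentials, differences the molar Gibbs energy of an ideal-gas mixture between its observed and equilibrium compositions:

```python
# Hedged sketch of the disequilibrium metric: difference the molar Gibbs
# energies of the observed atmosphere and its chemical-equilibrium state.
# Mole fractions and standard chemical potentials below are invented
# placeholders, not measured values.
from math import log

R, T = 8.314, 288.0  # gas constant J/(mol*K), surface temperature K

def mixture_gibbs(mole_fractions, mu0):
    """Molar Gibbs energy of an ideal-gas mixture: sum x_i*(mu0_i + R*T*ln x_i)."""
    return sum(x * (mu0[i] + R * T * log(x))
               for i, x in mole_fractions.items() if x > 0)

mu0 = {"CH4": -50_000.0, "O2": 0.0, "CO2": -394_000.0, "H2O": -229_000.0}  # J/mol
observed = {"CH4": 0.02, "O2": 0.20, "CO2": 0.78}
# Equilibrium state after CH4 + 2 O2 -> CO2 + 2 H2O runs to completion:
equilibrium = {"O2": 0.16, "CO2": 0.80, "H2O": 0.04}

phi = mixture_gibbs(observed, mu0) - mixture_gibbs(equilibrium, mu0)
print(phi > 0)  # True: coexisting CH4 and O2 store free energy relative to equilibrium
```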
But there is a caveat related to the potential for chemical disequilibria to serve as an anti-biosignature depending on the context. In fact, there was probably a strong chemical disequilibrium present on early Earth before the origin of life due to a combination of the products of sustained volcanic outgassing and oceanic water vapor. In this case, the disequilibrium was the result of a lack of organisms present to metabolize the resulting compounds. This imbalance would actually be decreased by the presence of chemotrophic life, which would remove these atmospheric gases and create more thermodynamic equilibrium prior to the advent of photosynthetic ecosystems.[51]
In 2013 Azua-Bustos and Vega argued that, disregarding the types of lifeforms that might be envisioned both on Earth and elsewhere in the Universe, all should share in common the attribute of decreasing their internal entropy at the expense of free energy obtained from their surroundings. As entropy allows the quantification of the degree of disorder in a system, any envisioned lifeform must have a higher degree of order than its immediate supporting environment. These authors showed that by using fractal mathematics analysis alone, they could readily quantify the degree of structural complexity difference (and thus entropy) of living processes as distinct entities separate from their similar abiotic surroundings. This approach may allow the future detection of unknown forms of life both in the Solar System and on recently discovered exoplanets based on nothing more than entropy differentials of complementary datasets (morphology, coloration, temperature, pH, isotopic composition, etc.).[52]
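As a rough illustration of quantifying structural complexity (the specific method here, box counting, is an assumption for illustration, not necessarily the authors' procedure):

```python
# Sketch of the fractal-analysis idea (method assumed; not the authors' code):
# estimate the box-counting dimension of a binary pattern, a simple proxy for
# the structural-complexity differences the approach relies on.
from math import log

def box_count(grid, box):
    """Number of box*box cells that contain at least one occupied site."""
    n = len(grid)
    count = 0
    for i in range(0, n, box):
        for j in range(0, n, box):
            if any(grid[a][b] for a in range(i, min(i + box, n))
                   for b in range(j, min(j + box, n))):
                count += 1
    return count

def dimension(grid, boxes=(1, 2, 4, 8, 16)):
    """Least-squares slope of log N(box) against log(1/box)."""
    pts = [(log(1.0 / b), log(box_count(grid, b))) for b in boxes]
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    return (sum((x - mx) * (y - my) for x, y in pts)
            / sum((x - mx) ** 2 for x, _ in pts))

N = 64
line = [[1 if i == j else 0 for j in range(N)] for i in range(N)]  # a simple curve
square = [[1] * N for _ in range(N)]                               # a filled region

print(round(dimension(line), 2))    # 1.0
print(round(dimension(square), 2))  # 2.0
```

Patterns of intermediate, non-integer dimension would fall between these two reference cases, which is the kind of differential such an analysis exploits.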
Entropy in psychology
The notion of entropy as disorder was transferred from thermodynamics to psychology by the Polish psychiatrist Antoni Kępiński, whose theory of information metabolism holds that living organisms maintain their internal order by taking in and processing information from the environment.
The idea was continued by Struzik, who proposed that Kępiński's information metabolism theory may be seen as an extension of Léon Brillouin's negentropy principle of information.[57] In 2011, the notion of "psychological entropy" was reintroduced to psychologists by Hirsh et al.[58] Similarly to Kępiński, these authors noted that uncertainty management is a critical ability for any organism. Uncertainty, arising due to the conflict between competing perceptual and behavioral affordances, is experienced subjectively as anxiety. Hirsh and his collaborators proposed that both the perceptual and behavioral domains may be conceptualized as probability distributions and that the amount of uncertainty associated with a given perceptual or behavioral experience can be quantified in terms of Claude Shannon's entropy formula.
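Shannon's formula, as applied by Hirsh et al. to competing behavioral options, can be sketched as follows (the probability distributions are invented):

```python
# Sketch of Hirsh et al.'s quantification: the uncertainty of a set of
# competing behavioral options as Shannon entropy H = -sum p*log2(p).
# The probability distributions below are invented for illustration.
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return sum(-p * log2(p) for p in probs if p > 0)

certain = [1.0, 0.0, 0.0, 0.0]         # one clearly dominant option
conflicted = [0.25, 0.25, 0.25, 0.25]  # four equally attractive options

print(shannon_entropy(certain))     # 0.0 bits: no uncertainty
print(shannon_entropy(conflicted))  # 2.0 bits: maximal conflict among 4 options
```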
Objections
Entropy is well defined for equilibrium systems, so objections to the extension of the second law and of entropy to biological systems, especially as it pertains to its use to support or discredit the theory of evolution, have been stated.[59][60] Living systems and indeed many other systems and processes in the universe operate far from equilibrium.
However, entropy is well defined much more broadly based on the probabilities of a system's states, whether or not the system is a dynamic one (for which equilibrium could be relevant). Even in those physical systems where equilibrium could be relevant, (1) living systems cannot persist in isolation, and (2) the second principle of thermodynamics does not require that free energy be transformed into entropy along the shortest path: living organisms absorb energy from sunlight or from energy-rich chemical compounds and finally return part of such energy to the environment as entropy (generally in the form of heat and low free-energy compounds such as water and carbon dioxide).
The Belgian scientist Ilya Prigogine contributed throughout his research to this line of study and attempted to resolve these conceptual limits, winning the Nobel Prize in Chemistry in 1977. One of his major contributions was the concept of the dissipative system, which describes the thermodynamics of open systems in non-equilibrium states.[61]
See also
- Abiogenesis
- Adaptive system
- Complex systems
- Dissipative system
- Ecological entropy – a measure of biodiversity in the study of biological ecology
- Ectropy – a measure of the tendency of a dynamical system to do useful work and grow more organized[62]
- Entropy (order and disorder)
- Extropy – a metaphorical term defining the extent of a living or organizational system's intelligence, functional order, vitality, energy, life, experience, and capacity and drive for improvement and growth
- Negentropy – a shorthand colloquial phrase for negative entropy[63]
- Self-organization – in non-equilibrium thermodynamics, entropy and dissipative structures are connected to self-organization phenomena (patterning, orderliness). Living systems and their subsystems are dissipative structures with some degree of self-organization.
References
- ^ Adams, Henry (1986). History of the United States of America During the Administration of Thomas Jefferson. Library of America. p. 1299.
- ^ Adams, Henry. (1910). A Letter to American Teachers of History. Google Books, Scanned PDF. Washington.
- ^ ISBN 978-0-429-32952-4.
- ISBN 978-90-277-0250-0.
- ^ a b McCulloch, Richard Sears (1876). Treatise on the mechanical theory of heat and its applications to the steam-engine, etc. New York: D. Van Nostrand.
- ISBN 978-0-521-42708-1.
- ^ Schneider, Eric D.; Sagan, Dorion (2005). Into the Cool: Energy Flow Thermodynamics and Life. Chicago, United States: The University of Chicago Press. p. 15.
- ^ The common justification for this argument, for example, according to renowned chemical engineer Kenneth Denbigh in his 1955 book The Principles of Chemical Equilibrium, is that "living organisms are open to their environment and can build up at the expense of foodstuffs which they take in and degrade."
- ISBN 0-7167-4372-8
- ^ Peterson, Jacob. "Understanding the Thermodynamics of Biological Order". The American Biology Teacher, 74, Number 1, January 2012, pp. 22–24.
- ISBN 978-0-12-385187-1.
- ^ Higgs, P. G., & Pudritz, R. E. (2009). "A thermodynamic basis for prebiotic amino acid synthesis and the nature of the first genetic code" Accepted for publication in Astrobiology
- ISBN 978-0-87901-711-8.
- ISBN 978-981-238-399-0.
- ^ Chaitin, Gregory (1979). "Towards a mathematical definition of Life" (PDF). MIT press. pp. 477–498.
- ^ Tamvakis, Ioannis (2018). "Quantifying life".
- ^ Lisa Zyga (11 August 2008). "Evolution as Described by the Second Law of Thermodynamics". Physorg.com. Retrieved 14 August 2008.
- ^ ISSN 1164-0235.
- ^ ISSN 0199-6231.
- ^ ISSN 0020-7225.
- ^ S2CID 201057207.
- ISBN 978-0-521-42708-1.
- PMID 23420771.
- S2CID 53211370.
- ^ PMID 10482783.
- PMID 3422229.
- PMID 10397841.
- ^ Molnar, C.; Gair, J. (2015). Concepts of Biology – 1st Canadian Edition. BCcampus.
- ^ Translation (by Ann Synge) of The Origin of Life on the Earth by A.I. Oparin, 1958. 1958–1960.
- ISBN 978-3-642-31729-3, retrieved 5 December 2023
- Bibcode:2021pcol.book...63S.
- PMID 9299295.
- S2CID 610254.
- PMID 34440521.
- ^ Bernstein H, Byerly HC, Hopf FA, Michod RA, Vemulapalli GK. (1983) The Darwinian Dynamic. Quarterly Review of Biology 58, 185–207. JSTOR 2828805
- ^ S2CID 14574109.
- OCLC 1171126768.
- bioRxiv 10.1101/179382.
- ^ Wolchover, Natalie (28 January 2014). "A New Physics Theory of Life". Scientific American. Retrieved 11 December 2014.
- ^ Jones, Orion (9 December 2014). "MIT Physicist Proposes New "Meaning of Life"". Big Think. Retrieved 11 December 2014.
- Simona Weinglass, The Times of Israel, October 29, 2015.
- ISBN 978-0471900269.
- S2CID 248394522.
- ISBN 978-0-19-286218-1.
- PMID 29387792.
- S2CID 26959254.
- ISSN 0004-637X.
- S2CID 122793675.
- ^ Kępiński, Antoni (1972). Rhythm of life (in Polish). Kraków: Wydawnictwo Literackie.
- S2CID 34672774.
- PMID 27251694.
- PMID 3654085.
- PMID 22250757.
- ^ Callen, Herbert B (1985). Thermodynamics and an Introduction to Statistical Thermodynamics. John Wiley and Sons.
- ^ Ben-Naim, Arieh (2012). Entropy and the Second Law. World Scientific Publishing.
- ^ Özilgen, M.; Sorgüven, E. (2017). Biothermodynamics: Principles and Applications. Routledge & CRC Press. pp. 285–287.
- ISBN 978-0-691-12327-1.
- ISBN 978-0-521-42708-1.
Further reading
- Schneider, E. and Sagan, D. (2005). Into the Cool: Energy Flow, Thermodynamics and Life. The University of Chicago Press. ISBN 9780226739366
- Kapusta, A (2007). "Life circle, time and the self in Antoni Kępiński's conception of information metabolism". Filosofija. Sociologija. 18 (1): 46–51.
- La Cerra, P. (2003). The First Law of Psychology is the Second Law of Thermodynamics: The Energetic Evolutionary Model of the Mind and the Generation of Human Psychological Phenomena, Human Nature Review 3: 440–447.
- Moroz, A. (2011). The Common Extremalities in Biology and Physics. Elsevier Insights, NY. ISBN 978-0-12-385187-1
- Woodward, John R. (2010). "Artificial life, the second law of thermodynamics, and Kolmogorov Complexity". 2010 IEEE International Conference on Progress in Informatics and Computing, Vol. 2, pp. 1266–1269. IEEE.
External links
- Thermodynamic Evolution of the Universe pi.physik.uni-bonn.de/~cristinz