One of the best established facts in thermodynamics is that it is impossible in a system enclosed in an envelope which permits neither change of volume nor passage of heat, and in which both the temperature and the pressure are everywhere the same, to produce any inequality of temperature or of pressure without the expenditure of work. This is the second law of thermodynamics, and it is undoubtedly true as long as we can deal with bodies only in mass, and have no power of perceiving or handling the separate molecules of which they are made up. But if we conceive a being whose faculties are so sharpened that he can follow every molecule in its course, such a being, whose attributes are still as essentially finite as our own, would be able to do what is at present impossible to us. For we have seen that the molecules in a vessel full of air at uniform temperature are moving with velocities by no means uniform, though the mean velocity of any great number of them, arbitrarily selected, is almost exactly uniform. Now let us suppose that such a vessel is divided into two portions, A and B, by a division in which there is a small hole, and that a being, who can see the individual molecules, opens and closes this hole, so as to allow only the swifter molecules to pass from A to B, and only the slower ones to pass from B to A. He will thus, without expenditure of work, raise the temperature of B and lower that of A, in contradiction to the second law of thermodynamics.

Theory of Heat
James Clerk MAXWELL




Information and Energy

Information is a currency of reality, on a par with mass, energy, space and time. This has been demonstrated experimentally by scientists who succeeded in converting information into energy. Among the various thought experiments that led to the present observations, the idea of a mechanical device invented by James Clerk Maxwell (see the citation quoted above) plays a central role. Analysis of the minimal set of functions that need to be implemented to design a genome driving the life of a minimal cell has revealed that several dozen Maxwell's demons are necessary to animate the cell, enabling it to run a functional assembly line. For a general discussion see myopic selection drives evolution, information of the chassis and information of the program in synthetic cells, bacteria as computers making computers, and life's demons.

In the course of discussions with students, and through a variety of lectures where I spoke about Maxwell's demon, thinking that this thought experiment was familiar to my audience, I discovered that most attendants did not know about this little being. Here (page created in January 2011) is a short summary of his lively history, far from finished (see Information Processing and Thermodynamic Entropy). The general role of biology-specific information is discussed in my book La Barque de Delphes (Odile Jacob, 1998, translated as The Delphic Boat, Harvard University Press, 2003).

While, from Antiquity to the Middle Ages, Nature was described using ten basic currencies: οὐσία, ποσότης, ποιότης, πρός τι, κεῖσθαι, ἕξις, τόπος, χρόνος, πράττειν, παθεῖν (in Latin: essentia, quantitas, qualitas, ad aliquid, situs, habitus, locus, tempus, agere, pati), an essential step in understanding Nature required the construction of some entanglement of these categories, a process which progressively reduced them to four: mass, space, time and, subsequently, energy. A remarkable achievement was reached when, following others, Einstein combined them in a surprisingly concise equation, E = mc². Yet it was obvious that these universal categories do not account for many phenomena: no one has been able, for example, to derive the crystal lattice of a mineral as simple as sodium chloride from the equations of microscopic physics.

Several features of the ancient categories are not straightforward: qualitas (quality), ad aliquid (relationships), situs and habitus (positioning in space-time), in particular, are not immediate consequences of mass, energy, space or time.

The first three of these are fairly easy to grasp intuitively. Energy is more complicated. Indeed, it is associated with a wide variety of connotations, often with psychological overtones. Typically one understands energy as producing work: a car engine uses energy. During the whole of the XIXth century, at the birth of the industrial world, scientists and engineers wondered about the way energy could be made available to produce work. Sadi Carnot first showed that, when constructing a steam engine, energy had to be split into two parts: usable energy, which produced work, and another part, depending on the temperature of the system, which could not be used to produce work [Carnot, 1824]. Indeed, steam engines required the presence of two temperature sources, and work was produced when a fluid (water vapour) flowed from the hot part of the machine to its cold part.

In 1850, Rudolf Clausius revisited Carnot's view and began to formalize it, and in 1865 he proposed to name Entropie the temperature-associated part of energy that cannot be transformed into work [Clausius, 1850, 1865]. Created from the Greek, as all correct neologisms in science, entropy expresses the idea of an internal metamorphosis (ἐν: within, and τροπή: alteration, change, conversion, transformation), or Verwandlung in German.

Later on, James Clerk Maxwell, in his Theory of Heat, analyzed the process and related it to the second law of thermodynamics, which states that in a closed material system temperature tends to become uniform [Maxwell, 1871, 1891]. For this, he had to introduce the idea of the "molecular theory of matter", where movement is central: "The opinion that the observed properties of visible bodies apparently at rest are due to the action of invisible molecules in rapid motion is to be found in Lucretius. Daniel Bernoulli was the first to suggest that the pressure of air is due to the impact of its particles on the sides of the vessel containing it; but he made very little progress in the theory which he suggested. Lesage and Prevost of Geneva, and afterwards Herapath in his 'Mathematical Physics', made several important applications of the theory. Krönig also directed attention to this explanation of the phenomena of gases. It is to Professor Clausius, however, that we owe the recent development of the dynamical theory of gases." In gases this means that if one starts with an unsymmetrical distribution, with hot gas molecules in one compartment and cold gas molecules in a contiguous one, the system will evolve so that the temperature is averaged after some time has elapsed. Temperature here measures the degree of agitation of the gas molecules: fast when hot, slow when cold. This shift from a continuous description of matter to a discontinuous, atomist view was later extended to biology with the birth of molecular biology. It is interesting to note that this took about one century, and that the present situation, where "information" is slowly gaining ground, follows a similarly slow path.

Creating a link between information and entropy, Maxwell introduced the idea of a hypothetical being, a 'demon', that uses an in-built information-processing ability to reduce the entropy of a homogeneous gas (at a given temperature). Briefly, the demon is able to measure the speed of gas molecules and to open or close a door between two compartments as a function of each molecule's speed, keeping the fast ones on one side and the slow ones on the other. This builds up two compartments, one hot and one cold, reversing time and acting apparently against the second law of thermodynamics.

Much work has been devoted to the question since this first view, and the idea that information creation requires energy was put forward by Leo Szilard to account for the way Maxwell's demon could act [Szilard, 1929].


The role of thermodynamics in computation has been examined repeatedly over the past half-century. The physics of information-processing gave rise to a considerable variety of attempts to understand how Maxwell's demon could function. One of the most important contributions to this work was the account provided by Marian Smoluchowski, professor at the Jagiellonian University in Kraków. At a lecture in Göttingen attended by the most creative physicists and mathematicians of the time, Smoluchowski gave details of the way Maxwell's demon could be implemented as a trap door, permitting information to be coupled to the availability of energy and to the material states of molecules in the environment [Smoluchowski, 1914].

Later on, Szilard proposed, in a loose way, to account for the relationship between information and entropy [Szilard, 1929], and von Neumann in the 1950s followed suit, stating that each logical operation performed in a computer at temperature T must use an energy of kT ln2, thereby increasing entropy by k ln2 [see von Neumann, 1966]. This remained the accepted intuition until the IBM company, concerned by the limits this would impose on computation, asked its engineers to explore the situation and possibly propose remedies.

Fortunately for computer science (you could not be working on the machine you are using at this very moment if this intuition had reflected reality), it proved to be wrong. Working at IBM on the limits of physical computation (limits that would have been reached rapidly had the Szilard-von Neumann intuition been valid), Rolf Landauer demonstrated, fifty years ago, that computation could be made reversible, hence need not consume any energy [Landauer, 1961].

To understand the meaning of this statement, let us summarize the bases of all computations. Three core boolean operations, AND, NOT and REPLICATE, are enough to permit all kinds of logical operations. The operation AND is boolean intersection (multiplication), as we learnt in our first years at school: it takes two binary inputs X and Y and returns the output 1 if and only if both X and Y are 1; otherwise it returns the output 0. Similarly, NOT takes a single binary input X and returns the output 1 if X = 0 and 0 if X = 1. REPLICATE takes a single binary input X and returns two binary outputs, each equal to X. Any boolean function can be constructed by repeated combination of AND, NOT and REPLICATE. Another operation, which can be derived from these, ERASE, is essential to our topic. ERASE is a one-bit logical operation that takes a bit, 0 or 1, and restores it to 0.
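
To make these definitions concrete, here is a minimal sketch in Python (an illustrative choice of language; bits are represented as the integers 0 and 1), which also checks that a further gate, OR, follows by combination, as stated above.

    # Minimal sketch of the three core boolean operations and ERASE.
    def AND(x, y):
        """Two-to-one: returns 1 if and only if both inputs are 1 (boolean multiplication)."""
        return x & y

    def NOT(x):
        """One-to-one: inverts the input bit; this operation is reversible."""
        return 1 - x

    def REPLICATE(x):
        """One-to-two: returns two copies of the input bit."""
        return x, x

    def ERASE(x):
        """Many-to-one: resets any bit to 0, derived here as AND(x, 0).
        Inputs 0 and 1 map to the same output, so the operation is irreversible."""
        return AND(x, 0)

    # Any boolean function follows by combination; e.g. OR via De Morgan's law:
    def OR(x, y):
        return NOT(AND(NOT(x), NOT(y)))

    assert [OR(x, y) for x, y in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 1]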

Concretely, these operations are implemented as 'logic gates'. A logic gate is a physical device that performs a logical operation. Microprocessors combine millions, even billions, of logic gates to perform the complex logical operations found in computers such as the one you are using to read this text.

In his conceptual work, Landauer showed that reversible, one-to-one logical operations such as NOT can be performed without consuming energy. He also showed that irreversible, many-to-one operations such as ERASE require consuming at least kT ln2 of energy for each bit of information lost. The core of the argument behind Landauer's theorem can be readily understood. Briefly, when a bit is erased, the information it contains must go somewhere. There are only two possibilities: either it moves to a place in the computer (or in the cell, if we consider cells as computers) corresponding to an observable degree of freedom, such as another place with a known bit in its memory, in which case it has obviously not been erased but merely moved; or it goes into places with unobservable degrees of freedom, such as the microscopic motion of molecules, and this results in an increase of entropy of at least k ln2. Landauer had a seminal role at IBM in implementing the CMOS technology that was at the root of the construction of dense microprocessors. Landauer was also the father of electronic circuits based on reversible logic, which exhibit considerable reductions in energy waste over conventional irreversible circuits.
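
The order of magnitude of this cost is easy to evaluate. The short sketch below (Python, using the exact SI value of Boltzmann's constant) computes the kT ln2 bound; the choice of 300 K as temperature is illustrative.

    import math

    k_B = 1.380649e-23  # Boltzmann's constant in J/K (exact in the 2019 SI)

    def landauer_bound(T):
        """Minimum energy, in joules, dissipated when one bit is erased at temperature T."""
        return k_B * T * math.log(2)

    # At T = 300 K, roughly the temperature at which life operates:
    print(f"{landauer_bound(300.0):.3g} J per erased bit")  # ~2.87e-21 J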

In 1973, Bennett extended Landauer's theorem, showing that all computations could be performed using only reversible logical operations, that is, without consuming energy [Bennett, 1973, 1988]. But where does the energy come from? To perform a logical operation, it is commonly extracted from a store of free energy, used in the processor that performs the operation, and finally returned to the initial store once the operation has been performed. We note here that in usual computers the store is a battery or an outside electric supply, whereas in cells energy is distributed throughout the matter of the cell. This may have considerable consequences for the computing power of cells (not discussed here). The property of reversibility has been implemented in real computers under the name "adiabatic logic", and real circuits have been described in detail to explain how this works [Younis and Knight, 1994]. In the domain of Synthetic Biology, it is interesting to note that Tom Knight, one of the founders of iGEM at MIT, was seminal in bringing this work to fruition. Hence, the connection between information theory, computer science and biology is much deeper than laypersons (and many biologists) would like to think.
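
Reversible computing in Bennett's sense is commonly illustrated with the Toffoli (controlled-controlled-NOT) gate, a universal reversible gate not named in the text above but useful to make the point concrete. The sketch below checks its two key properties: it is its own inverse, and it embeds AND without discarding any bit.

    def toffoli(a, b, c):
        """Controlled-controlled-NOT: flips c if and only if a and b are both 1.
        Three bits in, three bits out: the map is one-to-one, hence reversible."""
        return a, b, c ^ (a & b)

    # Applying the gate twice restores the input: it is its own inverse.
    for bits in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
        assert toffoli(*toffoli(*bits)) == bits

    # AND is embedded in it: with c initialised to 0, the third output is a AND b,
    # and the inputs are carried along instead of being erased.
    _, _, a_and_b = toffoli(1, 1, 0)
    assert a_and_b == 1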

Back to Maxwell's demon: in a real computation errors occur, and getting rid of them requires an irreversible operation, erasure of the wrong information and its replacement by the correct one. Hence, restoring the errorless situation consumes energy. If energy were not consumed, the system would be able to go backwards in time, and we would have created perpetual motion. How does this work in reality? The situation is similar to that proposed for the action of Maxwell's demon: measure, store the information, use it via replication of the measurement to re-establish the initial state, and then erase the memory to reset the initial state of the demon. Central to this action are two logical processes, REPLICATE and ERASE.
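
A toy simulation may help picture the cycle. The sketch below (Python; the speed distribution and threshold are illustrative stand-ins, not a true Maxwell-Boltzmann sampler) sorts molecules by speed and tallies the measurement records the demon must eventually erase, which is where Landauer's cost re-enters the balance.

    import math
    import random

    k_B = 1.380649e-23  # J/K
    T = 300.0           # K

    random.seed(0)
    # Toy gas: speeds drawn from an exponential distribution with mean 400 m/s.
    speeds = [random.expovariate(1 / 400.0) for _ in range(10000)]
    threshold = 400.0  # illustrative cut between "fast" and "slow"

    # MEASURE + REPLICATE: the demon records one bit per molecule and sorts accordingly.
    records = [1 if v > threshold else 0 for v in speeds]
    side_B = sum(records)            # fast molecules sent to compartment B
    side_A = len(records) - side_B   # slow molecules kept in compartment A

    # ERASE: to operate cyclically, the demon must reset its memory; by Landauer's
    # theorem this dissipates at least kT ln2 per recorded bit, repaying the entropy drop.
    erasure_cost = len(records) * k_B * T * math.log(2)
    print(f"A: {side_A} slow, B: {side_B} fast; minimal erasure cost {erasure_cost:.3g} J")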

If the error rate is x bits per second, for example, then error-correcting processes can be used to detect those errors and reject them to the environment at an energy cost of x kT ln2 J s⁻¹, where T is the temperature of the environment. In fact, biological processes, even at the microscopic level, do not proceed bit by bit but, rather, are highly redundant and change a fairly large number of bits simultaneously. This is because at 300 K, the typical temperature of living environments, thermal noise is fairly large, so that redundancy is necessary to increase the signal-to-noise ratio. And the usual "quantum" of energy used is that of hydrolysis of an "energy-rich" phosphate bond, typically hydrolysis of ATP to ADP or of GTP to GDP.
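
The arithmetic is straightforward; the sketch below computes the minimal error-rejection power for a hypothetical error rate, and compares the kT ln2 cost per bit with the free energy of ATP hydrolysis, taken here as roughly 50 kJ/mol under cellular conditions (an approximate textbook figure, not from the text).

    import math

    k_B = 1.380649e-23   # Boltzmann's constant, J/K
    N_A = 6.02214076e23  # Avogadro's number, 1/mol
    T = 300.0            # K

    def correction_power(error_rate):
        """Minimum power (W) needed to reject errors at the given rate in bits per second."""
        return error_rate * k_B * T * math.log(2)

    x = 1e6  # hypothetical error rate, bits per second
    print(f"{correction_power(x):.3g} W")  # ~2.9e-15 W

    # Free energy of ATP hydrolysis, ~50 kJ/mol under cellular conditions (approximate):
    atp_joules = 50e3 / N_A
    print(f"one ATP is worth ~{atp_joules / (k_B * T * math.log(2)):.0f} bits")  # ~29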

While these types of processes have not usually been presented as concrete illustrations of Maxwell's demon, we have a wealth of examples illustrating behaviours of that type. John Hopfield suggested that, in order to identify proofreading functions, we should be exploring "known reactions which otherwise appear to be useless or deleterious complications". And, indeed, in the protein translation process a proofreading step, using protein EF-Tu bound to charged transfer RNA, tests whether the incoming tRNA correctly reads the codon immediately adjacent to the tRNA carrying the growing polypeptide, and hydrolyzes a GTP molecule when the correct association has been found, thus acting as a Maxwell's demon [Hopfield, 1974]. We can note here that this is why it is so important for cells to carry energy supports (present in the covalent links making the backbones of macromolecules, in thioesters and in phosphate bonds), which makes it of course impossible for arsenic to belong to the backbone of energy-rich bonds, contrary to a recent mass-media hype.
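
The quantitative core of Hopfield's argument can be sketched in a few lines: if a single recognition step lets through a fraction f of wrong substrates, adding an energy-consuming proofreading step of the same selectivity lowers the error rate to roughly f squared. The value of f below is illustrative, not a measured figure.

    def error_rate(f, proofreading_steps=0):
        """Hopfield's limit: each independent discrimination step multiplies the error
        fraction by f, at the price of one GTP (or ATP) hydrolysis per extra step."""
        return f ** (1 + proofreading_steps)

    f = 1e-2  # illustrative single-step discrimination (wrong/right acceptance ratio)
    print(error_rate(f))      # 1e-2: selection on codon-anticodon binding alone
    print(error_rate(f, 1))   # 1e-4: with one EF-Tu-mediated proofreading step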

Such error-correcting routines are the norm in biological processes, and function as working analogues of Maxwell’s demon, getting information and using it to reduce entropy at an exchange rate of kT ln2 joules per bit, rejecting errors to the environment at a high rate to maintain reliable operations. This reflection is therefore at the core of what should be a renewed view of the process of ageing.

References

Bennett, C (1973) Logical reversibility of computation. IBM Journal of Research and Development 17, 525-532.

Bennett, C (1988) Notes on the history of reversible computation. IBM Journal of Research and Development 44, 270-277.

Carnot, S (1824) Réflexions sur la puissance motrice du feu et sur les machines propres à développer cette puissance (Bachelier, Paris).

Clausius, R (1850) Über die bewegende Kraft der Wärme und die Gesetze, welche sich daraus für die Wärmelehre selbst ableiten lassen. Annalen der Physik 155, 368-397.

Clausius, R (1865) Über verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie. Annalen der Physik 201, 353-400.

Hopfield, JJ (1974) Kinetic proofreading: a new mechanism for reducing errors in biosynthetic processes requiring high specificity. Proc Natl Acad Sci U S A 71, 4135-4139.

Landauer, R (1961) Irreversibility and heat generation in the computing process. IBM Journal of Research and Development 5, 183-191.

Maxwell, JC (1871, reprinted 1902) Theory of Heat (Longmans, Green and Co, London).

Smoluchowski, M (1914) Vorträge über die kinetische Theorie der Materie und der Elektrizität. Account of a lecture given at a conference held in Göttingen at the invitation of the Wolfskehl Foundation (Teubner, Leipzig). The conference also heard M. Planck, P. Debye, W. Nernst, A. Sommerfeld and H. A. Lorentz, and was introduced by David Hilbert and H. Kamerlingh Onnes.

Szilard, L (1929) Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen. Zeitschrift für Physik 53, 840-856.

von Neumann, J (posthumous, 1966) Theory of Self-Reproducing Automata (University of Illinois Press, Urbana).

Younis, SG, Knight, T (1994) Asymptotically zero energy computing using split-level charge recovery logic. Technical Report AITR-1500, MIT AI Laboratory.

 
