Cellular Thermodynamics


Wolfe, J. (2002), Encyclopedia of Life Sciences.

Energy of various types is shared among molecules according to Boltzmann’s distribution. This simple law is used to explain aspects of the function and structure of cells, including reaction rates, electrochemical equilibria, osmosis, phase equilibria, molecular aggregation and the stability of membranes.

Introduction
Thermodynamics and thermal physics deal with many quantities and phenomena that are literally vital to the structure, function, operation and health of cells. These include temperature, energy, entropy, concentrations, molecular motions, electrochemistry, osmotic pressure, reaction rates, changes of phase, molecular aggregation and much else. This article explains these ideas and serves as an introduction to the material of several other articles. The approach is to introduce the important and global ideas with biological examples and then to concentrate on explanations at the molecular level, rather than to follow the traditional development using macroscopic variables and the ideal gas, although the two equivalent pictures are related via the chemical potential. The mathematical and formal rigour that is thus bypassed may be readily found in much longer accounts in classic texts in thermal physics (e.g. Landau and Lifschitz, 1969).

Equilibrium
By definition, equilibrium is achieved when the relevant parameters cease to vary with time. If a mammal dies in a cool, dry environment, its body, initially warmer than its surroundings, gradually comes to thermal equilibrium with them. Over a much longer time scale, its water content falls to several percent: it is approaching hydraulic equilibrium. Complete, global chemical equilibrium is rarely of direct interest to biologists: even seeds are not completely equilibrated with their environment. In many cases, however, local equilibria are achieved, to an excellent approximation. Two adjacent cells are usually extremely close to both thermal equilibrium and hydraulic equilibrium because of their proximity: if there were a temperature difference over the small distance between them, heat would usually flow rapidly from the hotter to the colder. Similarly, water can usually leave and enter cells relatively rapidly, because of the permeability of membranes to water. Therefore, if the cells are close, diffusion of water, in the liquid or vapour phase, can allow rapid approach to hydraulic equilibrium.

Steady State
In a system in steady state, energy and/or matter enter the system and leave the system at the same rate. The energy may leave in a different form or at a different temperature. For example, a portion of human skin approximates steady state with respect to some variables: it is losing heat via radiation and evaporation but is receiving heat via conduction from subcutaneous tissue and the flow of blood; it is losing water and some volatiles by evaporation but receiving them via bulk flow, diffusion and permeation. In steady state, most of the parameters of the system do not change with time. Steady state is a more obvious approximation to the behaviours that interest biologists, although it does not include such important features as growth.

Heat, Light, Photons and ‘Useful Energy’
The illustrations of steady state in Figure 1 introduce some important ideas about the transmission of heat, and other forms of energy, in systems approximating steady state. The temperatures are given in the thermodynamic or absolute scale (in kelvins, K); to convert to Celsius (°C), subtract 273.15 from the temperature in K.

In one sense, ultraviolet (UV) and visible light and infrared (IR) radiation (radiant heat) differ only quantitatively: they are all electromagnetic waves but have different wavelengths. In biochemistry, the difference is qualitative: visible and ultraviolet light are either useful energy for photosynthesis or photoreception, or potentially dangerous radiation for photolabile molecules, while the heat radiated by an organism at ~ 300 K to its surroundings (or from the surroundings to the organism) is not directly useful in biochemistry. The difference is due to the quantization of energy.

Like matter, light is not infinitely divisible. Light of a given colour (and thus, given frequency and wavelength) has a minimum quantity or quantum of energy. This is conveyed by a photon. A photon may be imagined as a minimum ‘packet of energy’, while in transit as light. The energy of a photon is hc/λ, where h is the Planck constant, c is the velocity of light and λ is the wavelength. For the sun, radiating at a temperature of 6000 K, most of the energy is carried by photons with wavelengths of typically 0.5 μm (visible light) and energies of about 4 x 10^-19 J, which gives about 250 kJ per ‘mole of photons’. For living things, radiating at ~ 300 K, the corresponding values are 10 μm (far infrared), 2 x 10^-20 J per photon or about 10 kJ per ‘mole of photons’. The quantum of energy of visible (and especially ultraviolet) light is comparable with the separation of the quantized energy levels of electrons in atoms and molecules. Photons with this energy can activate many chemical reactions directly, as in photosynthesis and vision. The quantum energy of infrared radiation from a body at 300 K (~ 10 kJ per mole) is comparable with the typical kinetic energies of molecules due to their thermal motion. This energy can usually only activate a reaction indirectly: reactions that occur at temperatures of ~ 300 K are indeed activated by the thermal motion of molecules, but, as we shall see, it is usually only a small proportion of molecules, having rather more than the average energy, that produce the reaction.
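These figures follow directly from E = hc/λ. The short calculation below reproduces them; it is a minimal sketch in Python, using standard values of the physical constants and the two representative wavelengths (0.5 μm and 10 μm) quoted above:

    # Photon energy E = h*c/wavelength, and the energy of a 'mole of photons'.
    h = 6.626e-34    # Planck constant (J s)
    c = 2.998e8      # speed of light (m s^-1)
    N_A = 6.022e23   # Avogadro constant (mol^-1)

    for label, wavelength in [("solar (visible)", 0.5e-6), ("organism (far IR)", 10e-6)]:
        E_photon = h * c / wavelength        # energy of one photon (J)
        E_mole_kJ = E_photon * N_A / 1000.0  # energy per mole of photons (kJ)
        print(f"{label}: {E_photon:.1e} J per photon, {E_mole_kJ:.0f} kJ per mole")

    # Approximate output:
    #   solar (visible): 4.0e-19 J per photon, 239 kJ per mole
    #   organism (far IR): 2.0e-20 J per photon, 12 kJ per mole

Rounded to one significant figure, these are the values quoted above.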

Entropy
This difference between the chemically ‘useful’ energy (light or UV) and the ‘waste’ energy (infrared heat) can be quantified using the concept of entropy. When an amount of heat ΔQ is transferred at temperature T, the system gaining the heat has its entropy increased by ΔQ/T. Thus sunlight (which has, roughly speaking, a radiation temperature of ~ 6000 K) is radiation with low entropy. A joule of thermal radiation from the surface of an organism at ~ 300 K (~ 20 times cooler) transmits 20 times as much entropy as a joule of solar radiation. This theme is a common one in biology: energy is acquired from low-entropy sources, and is used in a variety of different biochemical, physiological and ecological mechanisms. Ultimately, the energy is lost to a ‘sink’ for energy at low temperature and correspondingly high entropy. Traditional thermodynamics discusses heat engines (such as the steam turbines at thermal power stations) taking heat from the high-temperature source, converting part of it into work, rejecting the rest as heat into the low-temperature sink (the cooling tower) and generating entropy in so doing. The biology of (most of) the biosphere thus depends not only upon the energy from the sun (a source at T ~ 6000 K) but also upon the ultimate sink of energy, the night sky (at ~ 3 K).
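The factor of 20 follows directly from ΔS = ΔQ/T. The sketch below (Python, using the nominal radiation temperatures quoted above) makes the comparison explicit:

    # Entropy transferred per joule of radiation: Delta_S = Delta_Q / T.
    Q = 1.0          # one joule of radiant energy (J)
    T_sun = 6000.0   # nominal radiation temperature of sunlight (K)
    T_body = 300.0   # surface temperature of an organism (K)

    S_sun = Q / T_sun     # entropy carried by 1 J of sunlight (J K^-1)
    S_body = Q / T_body   # entropy carried by 1 J of thermal radiation at 300 K (J K^-1)

    print(f"sunlight: {S_sun:.2e} J/K per joule")
    print(f"organism: {S_body:.2e} J/K per joule")
    print(f"ratio:    {S_body / S_sun:.0f}")   # ~20, as stated above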

Macroscopic and Microscopic Views
A cell is, for our purposes, a macroscopic object. It contains perhaps 10^14 molecules, and we can define and measure such parameters as its temperature, the pressure inside a vacuole, the tension in a membrane and so on. This is the domain of thermodynamics. The power of this discipline derives from the fact that one can often apply thermodynamics to the overall behaviour of systems without knowing the molecular details.

For a single small molecule, on the other hand, temperature and pressure are neither well-defined nor measurable. The complementary discipline of statistical mechanics applies the laws of physics to individual molecules, atoms and photons and, by considering the statistical behaviour of large numbers, deduces the behaviour of macroscopic systems. It thus relates thermodynamics to Newtonian or quantum-mechanical laws at the molecular level.

From cells to ultrastructure to molecules, we move from the macroscopic to microscopic points of view. For instance, work, kinetic energy and heat are forms of energy and, in the macroscopic world, they are clearly distinguished. This difference almost disappears at the molecular level.

The laws of thermodynamics apply to macroscopic systems. The Zeroth Law states that if one system is in thermal equilibrium with a second, and the second is in thermal equilibrium with a third, the first and the third are in thermal equilibrium. This law makes temperature a useful concept: temperature is defined as that quantity which is equal in two systems at thermal equilibrium. The First Law is the statement of conservation of energy at the macroscopic, thermal level. It states that the internal energy U of a system is increased by the net amount of heat ΔQ added to it and decreased by the amount of work ΔW that the system does on its surroundings: ΔU = ΔQ − ΔW, and that the internal energy of a system is a function of its physical parameters (i.e. it is a state function). The Second Law states that the total entropy generated by a process is greater than or equal to zero. It is possible for the entropy of a system to decrease (e.g. the contents of a refrigerator during cooling), but in doing so the entropy of the surroundings is increased (the refrigerator heats the air in the kitchen). The Third Law states that no finite series of processes can attain zero absolute temperature. (Different but equivalent statements of the Second and Third Laws are possible.)
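The sign convention of the First Law is easily confused, so the following minimal sketch (Python, with arbitrary illustrative numbers rather than values from this article) spells it out:

    # First Law: Delta_U = Delta_Q - Delta_W, where Delta_Q is the heat added TO
    # the system and Delta_W is the work done BY the system on its surroundings.
    def internal_energy_change(heat_added_J, work_done_by_system_J):
        return heat_added_J - work_done_by_system_J

    # A system that absorbs 100 J of heat while doing 30 J of work on its
    # surroundings gains 70 J of internal energy.
    print(internal_energy_change(100.0, 30.0))   # 70.0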

These laws apply to macroscopic systems, where entropy and temperature have a clear meaning, and where heat is clearly distinguished from other forms of energy. At the microscopic level, they may be applied to a large number of molecules in a statistical way, but not to individual molecules. For example, the kinetic energy of a small molecule is heat. If it is accelerated by interaction with another molecule, has work been done on it, or has heat been given to it? For a large collection of molecules, however, collective organized motion (say a current of air) can do work, whereas random individual motion (in a hot, stationary gas) can supply heat. The Second Law can be stated thus in molecular terms: on average, systems go from improbable states (states having a small number of possible configurations) to probable states (having many possible configurations). For very small numbers of molecules, however, the improbable is possible. If you toss 10 coins, you will occasionally (0.1% of the time) get 10 heads. If you could toss 6 x 10^23 coins, you would never toss 6 x 10^23 heads.
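The coin-tossing figures can be checked directly. The following sketch (Python, assuming fair coins) computes the probability of all heads for 10 coins and, since the corresponding number for 6 x 10^23 coins is far too small to represent, the base-10 logarithm of that probability:

    import math

    # Probability that n tosses of a fair coin are all heads: (1/2)**n.
    p_10 = 0.5 ** 10
    print(f"10 coins, all heads: {p_10:.4f}")   # 0.0010, i.e. ~0.1% of attempts

    # For 6 x 10^23 coins the probability is 2**(-6e23); only its logarithm is
    # representable: log10(p) = -n * log10(2) ~ -1.8e23.
    n = 6e23
    print(f"log10(probability of all heads): {-n * math.log10(2.0):.2e}")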

Conclusion
The ubiquitous Boltzmann distribution explains, at a molecular level, a range of effects and processes important to biochemistry and cell biology. This article may also be considered an introduction to further reading in other articles, including those dealing with thermodynamics in biochemistry, cell biophysics, membranes, macromolecules, water, the hydrophobic effect and plant stress, and in other works in these fields.


