The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life

by Werner R. Loewenstein
Overview

No one can escape a sense of wonder when looking at an organism from within. From the humblest amoeba to man, from the smallest cell organelle to the amazing human brain, life presents us with example after example of highly ordered cellular matter, precisely organized and shaped to perform coordinated functions. But where does this order spring from? How does a living organism manage to do what nonliving things cannot do--bring forth and maintain all that order against the unrelenting, disordering pressures of the universe? In The Touchstone of Life, world-renowned biophysicist Werner Loewenstein seeks answers to these ancient riddles by applying information theory to recent discoveries in molecular biology. Taking us into a fascinating microscopic world, he lays bare an all-pervading communication network inside and between our cells--a web of extraordinary beauty, where molecular information flows in gracefully interlaced circles. Loewenstein then takes us on an exhilarating journey along that web and we meet its leading actors, the macromolecules, and see how they extract order out of the erratic quantum world; and through the powerful lens of information theory, we are let in on their trick, the most dazzling of magician's acts, whereby they steal form out of formlessness. The Touchstone of Life flashes with fresh insights into the mystery of life. Boldly straddling the line between biology and physics, the book offers a breathtaking view of that hidden world where molecular information turns the wheels of life. Loewenstein makes these complex scientific subjects lucid and fascinating, as he sheds light on the most fundamental aspects of our existence.

Product Details

ISBN-13: 9780190283629
Publisher: Oxford University Press
Publication date: 01/07/1999
Sold by: Barnes & Noble
Format: eBook
File size: 5 MB

About the Author

Werner R. Loewenstein was Professor of Physiology and Director of the Cell Physics Laboratory at Columbia University and is presently Director of the Laboratory of Cell Communication at the Marine Biological Laboratory, Woods Hole, Massachusetts. He lives in Woods Hole and Key Biscayne, Florida.

Read an Excerpt

Chapter One

    Information and Organisms


    Maxwell's Demon

No one can escape a sense of wonder when looking at a living being from within. From the humblest unicellular organism to man, from the smallest cell organelle to the stupendous human brain, living things present us with example after example of highly ordered matter, precisely shaped and organized to perform coordinated functions. Where does this order spring from? How does a living organism manage to do what nonliving matter cannot do — bring forth and maintain order against the ever-present disordering pressures in the universe?

    The order prevailing in living beings strikes the eye at every level of their organization, even at the level of their molecular building blocks. These macromolecules are made of the same kind of atoms the molecules of nonliving things are made of and are held together by the same kind of forces, but they are more complex. Take a hemoglobin molecule, for example, the major protein of our blood, and set it against a sodium chloride molecule, the common salt from the inanimate mineral world. Both are orderly structures. The salt is an array of sodium and chloride atoms — always the same two atoms in an endless crystal repeat. The hemoglobin is an array of 574 amino acids constituting four chains that twist about each other, forming an intricate coil; the overall structure is essentially the same in zillions of hemoglobin molecules, and there normally is not an amino acid out of place in the linear sequence of the chains. Such a structure clearly embodies a higher level of organization than the monotonous salt repeat. It is also less likely to form spontaneously. While the salt crystal readily assembles itself from its atoms in solution — we can watch this day in and day out when seawater evaporates — it is highly improbable that a hemoglobin molecule would ever do so spontaneously.

    Is there some special organizing principle in living beings, some entity that manages to create order in the liquid pandemonium of atoms and molecules? About 150 years ago, the physicist Maxwell, while musing about the laws that rule the tumultuous molecular world, came up with an intriguing idea. He conceived an experiment, a thought experiment, in which a little demon who picks out molecules seemed to escape what was thought to be the supreme law, the second law of thermodynamics. This demon comes as close as anything to the organizing biological principle we are angling for. So, let's take a look at that gedankenexperiment.

    Maxwell chose a system made of just two sorts of molecules, a mix of hot and cold gas molecules moving randomly in two chambers, A and B, connected by a friction-free trap door (Fig. 1.1). Suppose, said Maxwell, we have a demon at that door, who can see individual molecules and distinguish a hot molecule from a cold one. When he sees a hot one approaching from B to A, he opens the door and lets it through. As he repeats that operation over and over, we would wind up with an organization in which the two kinds of molecules are neatly sorted out into a hot and a cold compartment — an organization which would cost nothing because the demon only selects molecules but does no work.

    One literally seemed to get something for nothing here. The assumption of a friction-free operation is valid in a thought experiment, because there is no intrinsic lower limit to performance. So, to all appearances, this cunning organizing entity of Maxwell's seemed to flout the Second Law. And adding insult to injury, it would be able to drive a machine without doing work — the old dream of the perpetual motion machine come true.
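    A minimal simulation of that sorting game, in Python (an illustration only; the molecule count and the chance of a hot molecule reaching the door are arbitrary assumptions), shows how pure selection, with no work done, un-mixes the two chambers:

```python
import random

random.seed(0)

# Hot and cold molecules mixed at random between two chambers, A and B.
molecules = [{"chamber": random.choice("AB"), "hot": random.random() < 0.5}
             for _ in range(10_000)]

def demon_step(mols):
    """One pass of the demon at the trap door: whenever a hot molecule in B
    happens to approach the door, let it through into A (selection, no work)."""
    for m in mols:
        if m["chamber"] == "B" and m["hot"] and random.random() < 0.1:
            m["chamber"] = "A"

def hot_fraction(mols, chamber):
    pool = [m for m in mols if m["chamber"] == chamber]
    return sum(m["hot"] for m in pool) / len(pool)

for _ in range(60):
    demon_step(molecules)

print(round(hot_fraction(molecules, "A"), 2))  # chamber A is now enriched in hot molecules
print(round(hot_fraction(molecules, "B"), 2))  # chamber B is left almost entirely cold
```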

    This paradox kept physicists on tenterhooks for half a century. Two generations of scientists tried their hands at solving it, though to no avail. The light at the end of the tunnel finally came when Leo Szilard showed that the demon's stunt really isn't free of charge. Though he spends no energy, he puts up a precious commodity called information. The unit of this commodity is the bit, and each time the demon chooses between a hot and a cold molecule, he shells out one bit of information for this cognitive act, precisely balancing the thermodynamics accounts.

    Thus, the hocus-pocus here has an exact information price. As has happened often in physics, the encounter with a deep paradox led to a higher level of understanding. Szilard's introduction of the concept of information as a counterweight to thermodynamic disorder (entropy) opened a whole new perspective on the organization of energy and matter. It marked the beginning of information theory. This theory has already shown its pizzazz in a number of fields in physics and engineering; the present-day communication and computer revolution provides ample proof of that. But perhaps its greatest power lies in biology, for the organizing entities in living beings — the proteins and certain RNAs — are but Maxwell demons. They are the most cunning ones, as we shall see; their ability to juggle information is so superlative that it boggles the mind.

    In this and some of the following chapters, we will try to get to the bottom of their tricks. We will see how they finagle information from their molecular neighbors — how they extract it, transmute it, and use it to organize their own world. But first of all we need a clear picture of what Information is. I capitalize the word here to signal that it has a somewhat different meaning in science than in daily usage. This meaning is defined by two formulas, the only occasion we resort to mathematics in this book. The concepts embodied in these formulas are straightforward and will be explained as we go along.


    What Is Information?

Information, in its connotation in physics, is a measure of order — a universal measure applicable to any structure, any system. It quantifies the instructions that are needed to produce a certain organization. This sense of the word is not too far from the one it once had in old Latin. Informare meant to "form," to "shape," to "organize." There are several ways one could quantify here, but an especially convenient one is in terms of binary choices. So, in the case of Maxwell's demon we would ask how many "yes" or "no" choices he makes to achieve a particular molecular organization, and we would get the answer here directly in bits. In general, then, we compute the information inherent in any given arrangement of matter (or energy) from the number of choices we must make to arrive at that particular arrangement among all equally possible ones. We see intuitively that the more arrangements are possible, the more information is needed to get to that particular arrangement.

    Consider a simple linear array, like the deck of eight cards or the chain of eight amino acids in Figure 1.2. How much information would be needed to produce such an array with a perfectly shuffled set? With the cards, the problem is like finding out their order, in a binary guessing game, when they are turned face down. Let's start, for example, with the Ace of Hearts. We are allowed in such a binary game to ask someone who knows where the Ace is a series of questions that can be answered "yes" or "no" — a game where the question "Is it here?" is repeated as often as needed. Guessing randomly would eventually get us there; there are eight possibilities for drawing a first card. But, on the average, we would do better if we asked the question for successive subdivisions of the deck — first for subsets of four cards, then of two, and finally of one. This way, we would hit upon the Ace in three steps, regardless of where it happens to be.

    Thus, 3 is the minimum number of correct binary choices or, by our definition above, the amount of information needed to locate a card in this particular arrangement. In effect, what we have been doing here is taking the binary logarithm of the number of possibilities (N); log2 8 = 3. In other words, the information required to determine the location in a deck of 8 cards is 3 bits. If instead of 8 cards our deck contained 16, the information would be 4 bits; or if it contained 32 cards, it would be 5 bits; and so on.
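    To make the counting concrete, here is a small Python sketch (an illustration, not part of the text) that plays the halving game on face-down decks of 8, 16, and 32 cards and confirms that the number of questions is log2 N, wherever the card happens to lie:

```python
import math

def questions_to_locate(deck_size: int, target_position: int) -> int:
    """Count the yes/no questions needed to find a card by repeatedly
    halving the range of candidates, as in the guessing game above."""
    low, high = 0, deck_size          # the card lies somewhere in [low, high)
    questions = 0
    while high - low > 1:
        mid = (low + high) // 2
        questions += 1                # one binary question: "is it in the lower half?"
        if target_position < mid:
            high = mid
        else:
            low = mid
    return questions

for n in (8, 16, 32):
    counts = {questions_to_locate(n, pos) for pos in range(n)}
    print(n, counts, math.log2(n))    # e.g. 8 {3} 3.0 (three questions, wherever the Ace is)
```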

    In general, for any number of possibilities (N), the information (I) for specifying a member in such a linear array, thus, is given by

$$ I = \log_2 \frac{1}{N} $$

I here has a negative sign for any N larger than 1, denoting that it is information that has to be acquired in order to make the correct choice.

    Leaving now this simple example, we consider the case where the N choices are subdivided into subsets (i) of uniform size (ni), like the four suits in a complete deck of cards. Then, the information needed to specify the membership of a subset again is given by the foregoing equation, where N now is replaced by N / ni, the number of subsets

$$ I = \log_2 \frac{n_i}{N} $$

If we now identify ni / N with the proportion of the subsets (pi), we have

$$ I_i = \log_2 p_i $$

As we take this one step further, to the general case where the subsets are nonuniform in size, that information will no longer be the same for all subsets. But we can specify a mean information which is given by

$$ I = \sum_i p_i \log_2 p_i \qquad (1) $$

This is the equation that Claude Shannon set forth in a theorem in the 1940s, a classic in information theory.
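    As a quick numerical illustration (a Python sketch, keeping the chapter's sign convention, in which I comes out negative): for the four equal suits of a full deck each pi is 1/4 and the formula gives -2 bits, and for unequal subsets it gives the weighted mean:

```python
import math

def mean_information(proportions) -> float:
    """Mean information I = sum_i p_i * log2(p_i), in the sign convention
    used in this chapter (the result is negative for any real uncertainty)."""
    return sum(p * math.log2(p) for p in proportions if p > 0)

# Four equal suits in a full deck: p_i = 1/4 each
print(mean_information([0.25, 0.25, 0.25, 0.25]))   # -2.0 bits

# Nonuniform subsets, e.g. proportions 1/2, 1/4, 1/8, 1/8
print(mean_information([0.5, 0.25, 0.125, 0.125]))  # -1.75 bits
```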

    Thus defined, information is a universal measure that can be applied equally well to a row of cards, a sequence of amino acids, a score of music, an arrangement of flowers, a cluster of cells, or a configuration of stars.

    Information is a dimensionless entity. There is, however, a partner entity in physics which has a dimension: entropy. The entropy concept pertains to phenomena that have a developmental history, events that unfold and cannot be undone, like the turning of red hot cinders into cold ashes, the crumbling of a mountain into dust, or the decay of an organism. Such events always have one direction, due to the spontaneous dispersal of the total energy, and tell us which way time's arrow is pointing. They belong to the world of atoms and molecules, the world whose statistical effects we experience through our senses and perceive as never turning back in time — our everyday world of which we say "time goes on."

    Such an arrow of time is absent inside the atoms, the sphere of the elementary particles. In that world entropy has no place; there is complete symmetry of time (at least in ordinary matter) — the constraints prevailing at the level of atomic interactions are lacking, and the time course of the particle interactions can be reversed. Of this world we have no sensory experience and no intuitive feeling. We will concern ourselves with it only in passing in this book.

    Entropy, in its deep sense brought to light by Ludwig Boltzmann, is a measure of the dispersal of energy — in a sense, a measure of disorder, just as information is a measure of order. Any thing that is so thoroughly defined that it can be put together only in one or a few ways is perceived by our minds as orderly. Any thing that can be reproduced in thousands or millions of different but entirely equivalent ways is perceived as disorderly. The degree of disorder of a given system can be gauged by the number of equivalent ways it can be constructed, and the entropy of the system is proportional to the logarithm of this number. When the number is 1, the entropy is zero (log 1 = 0). This doesn't happen very often; it is the case of a perfect crystal at absolute zero temperature. At that temperature there is then only one way of assembling the lattice structure with molecules that are all indistinguishable from one another. As the crystal is warmed, its entropy rises above zero. Its molecules vibrate in various ways about their equilibrium positions, and there are then several ways of assembling what is perceived as one and the same structure. In liquids, where we can have atoms of very many different kinds in mixture and in huge amounts, the number of equivalent ways for assembly gets to be astronomical. Enormous changes of entropy, therefore, are necessary to organize the large protein and nucleic acid molecules that are characteristic of living beings. The probabilities for spontaneous assembly of such molecules are extremely low; probability numbers of the order of 10⁻⁵⁰ are not uncommon. Thus, it is practical to express entropy quantities (like information in equation 1) logarithmically.

    Boltzmann's entropy concept has the same mathematical roots as the information concept: the computing of the probabilities of sorting objects into bins — a set of N into subsets of sizes ni. By computing how many ways there are to assemble a particular arrangement of matter and energy in a physical system, he arrived at the expression of entropy (S), the statistical mechanical expression of the thermodynamic concept

$$ S = -k \sum_i p_i \ln p_i \qquad (2) $$

where k is Boltzmann's constant (3.2983 × 10⁻²⁴ calories/°C).

    Shannon's and Boltzmann's equations are formally similar. S and I have opposite signs, but otherwise differ only by their scaling factors; they convert to one another by the simple formula S = -(k ln 2) I. Thus, one bit of information is equivalent to k ln 2 entropy units.
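    A tiny numerical check of that conversion (an illustration, using the value of k in calories per degree quoted above):

```python
import math

k = 3.2983e-24          # Boltzmann's constant, calories per degree (value quoted above)

def entropy_from_bits(information_bits: float) -> float:
    """Convert information to thermodynamic entropy via S = -(k ln 2) I."""
    return -(k * math.log(2)) * information_bits

print(k * math.log(2))         # entropy equivalent of one bit: ~2.3e-24 cal/deg
print(entropy_from_bits(-3))   # the eight-card example (I = -3 bits): ~6.9e-24 cal/deg
```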


    The Information-Entropy Trade-Off

Information thus becomes a concept equivalent to entropy, and any system can be described in terms of one or the other. An increase of entropy implies a decrease of information, and vice versa. In this sense, we may regard the two entities as related by a simple conservation law: the sum of (macroscopic) information change and entropy change in a given system is zero. This is the law which every system in the universe, including the most cunning biological Maxwell demons, must obey. So, when a system's state is fully determinate, as in our two eight-state examples in Figure 1.2, the entropy or uncertainty associated with the description is evidently zero. At the other extreme, in the completely naive situation where each of the eight states is equally probable (when all cards are face down), the information is zero and the thermodynamic entropy is equivalent to 3 bits. In general, for a system with 2ⁿ possible states, the maximum attainable information and its equivalent maximum attainable thermodynamic entropy equal n bits.

    With this we now can go beyond our earlier illustrative statement of information in terms of choices made by entities endowed with brains (the appropriate statement for human signals and communication) to a definition in the deeper sense of the two equations, in terms of the probabilities of the system's component states, that is, in terms of how far away the system is from total disorder or maximum entropy — how improbable it is. This is the sense of information that will be immediately useful when we concern ourselves with the self-organizing molecular systems of living beings, which pull themselves up from disorder.

    To illustrate the information-entropy liaison, let us take a look at the evolution of a puff of smoke in a closed room, one of the simplest molecular systems one might think of. Early on, when the puff has just been delivered, most of the smoke molecules are close to the source. At that instant, the system has significant order and information. With time, the smoke molecules spread through the room, distributing themselves more and more evenly; the system evolves toward more probable states, states of less order and less information (Fig. 1.3:1-3). Finally, the distribution of the molecules is completely even in the statistical sense (Fig. 1.3:4). The system then lacks a definable structure — its molecules merely move about at random; order and information have decayed to zero. This is the state of thermodynamic equilibrium, the condition of highest probability, which all systems, abandoned to themselves, sooner or later will run down to — once a system reaches that state, it is highly unlikely that it will go back to a state with more order.
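    The decay of order can also be watched numerically. In the rough Python sketch below (an illustration; the bin count, particle number, and random-walk rule are arbitrary assumptions), the particles of the puff all start in one corner, and the Shannon measure of their spread climbs from zero toward its maximum, the even, equilibrium distribution:

```python
import math
import random

random.seed(1)
BINS = 16                              # a one-dimensional "room" divided into 16 bins
particles = [0] * 1000                 # all smoke particles start in bin 0

def spread_bits(positions) -> float:
    """Shannon measure (in bits) of the particle distribution over the bins:
    0 when all particles share one bin, log2(16) = 4 when perfectly even."""
    total = len(positions)
    counts = (positions.count(b) for b in range(BINS))
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

for step in range(2001):
    if step % 500 == 0:
        print(step, round(spread_bits(particles), 2))
    # each particle takes a random step left or right, clamped at the walls
    particles = [min(BINS - 1, max(0, p + random.choice((-1, 1)))) for p in particles]
```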

    Thus, the molecular system loses its information with time; or, to use the other side of the coin, its entropy increases — which puts this in the traditional terms of thermodynamics, its Second Law. The information escapes, as it were, and no wall on earth could stop that; even the thickest wall won't stop the transfer of heat or shield off the gravitational force. Because we can see and feel this loss of information with the passage of time in everything around us, no one will feel like a stranger to the Second Law — it's time's arrow inside us. Indeed, the basic notions of that law are deeply ingrained in our folklore in phrases like "that's the way the cookie crumbles," "it does no good to cry over spilled milk," "you can't have your cake and eat it too." But, perhaps, nothing says it better than that old nursery rhyme at the end of the chapter (Figure 1.4).

    The time's arrow of thermodynamic events holds for nonliving as well as living matter. Living beings continuously lose information and would sink to thermodynamic equilibrium just as surely as nonliving systems do. There is only one way to keep a system from sinking to equilibrium: to infuse new information. Organisms manage to stay the sinking a little by linking their molecular building blocks with bonds that don't come easily apart. However, this only delays their decay, but doesn't stop it. No chemical bond can resist the continuous thermal jostling of molecules forever. Thus, to maintain its high order, an organism must continuously pump in information.

    Now, this is precisely what the protein demons do inside an organism. They take information from the environment and funnel it into the organism. By virtue of the conservation law, this means that the environment must undergo an equivalent increase in thermodynamic entropy; for every bit of information the organism gains, the entropy in the demon's environment must rise by a certain amount. There is thus a trade-off here, an information-for-entropy barter; and it is this curious trade which the protein demons ply. Indeed, they know it from the ground up and have honed it to perfection. Bartering nonstop, they draw in huge information amounts, and so manage to maintain the organism above equilibrium and locally to turn the thermodynamic arrow around.


    The Act of a Molecular Demon

The information-for-entropy barter brings us a little nearer to what makes living beings tick. So, it will pay to keep an eye on those demons as they juggle the bits and entropies. But let me say from the start that the juggling is entirely above-board; protein demons always obey the laws of thermodynamics. Though, to get to the bottom of their tricks, we may have to crane our necks, for they bend and twist, going through quite a few contortions. But we will see that much of this is only sleight of hand. What these proteins really do is shift their stance — they switch between two positions, two stable molecular configurations. They are like digital computers in this regard. Indeed, in a sense, they are minicomputers and, at bottom, their switching is cast in the same information mold as the switching of the logic gates of a computer memory register.

    Let me explain. Every protein is endowed with a certain amount of information. That information is inherent in the orderly structure of the molecule, just as the information in our deck of cards was inherent in its orderly sequence. The information arises as the protein molecule is assembled from its components in a unique configuration in the organism — the molecule then gets charged with information, as it were.

    With this information, its native intelligence, the protein now shows its demon side: it recognizes other molecules and selects them out of many choices. Though it may not see the molecules as the demon in Maxwell's thought experiment does (it needn't sense photons), it recognizes them just as surely at short range by direct molecule-to-molecule interaction; it fingers them, feeling their shape, so to speak. If a molecule fits the protein's own configuration, it will stick to it by dint of the intermolecular attractive forces. The protein, thus, can pick a particular amino acid molecule or a particular peptide molecule out of a myriad of choices.

    Such are the first cognitive steps from which biological order emerges, but the act still isn't over. Next, upon interaction with its chosen prey, the protein undergoes a change in configuration. This change doesn't necessarily involve the entire bulk of the protein molecule, but it is enough for it to disengage. It is then available for another round of cognition, but not before it is switched back to its original state of information; its configuration must be reset for each cognitive cycle.

    The kinship with the computer memory now begins to shine through. In a computer each of the possible logical states is represented by a certain configuration in the hardware — each logical state has its physical representation in the hardware. And when the computer memory of n bits is cleared, the number of 2ⁿ possible states is reduced to 1 — an increase in information which must be balanced by an increase in entropy in the computer's environment, in compliance with the conservation law.

    It is thus the forgetting that is thermodynamically costly. And that is why a computer, indeed, any cognitive entity, cannot skirt the Second Law. This solution to Maxwell's perennial riddle finally came in recent years through a penetrating inquiry into the energy requirements of digital computers by the physicists Rolf Landauer and Charles Bennett. It is in the resetting of the memory register that the unavoidable thermodynamic price is paid.
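    For a feel of the numbers, here is a back-of-the-envelope Python sketch (an illustration, assuming the standard Landauer bound of k·T·ln 2 of heat per erased bit at room temperature, a figure from physics rather than from the text):

```python
import math

K_B = 1.380649e-23      # Boltzmann's constant, joules per kelvin
T = 300.0               # roughly room temperature, kelvin

def min_erasure_heat_joules(bits: float) -> float:
    """Minimum heat dissipated into the environment when `bits` of memory
    are reset, taking the Landauer bound of k_B * T * ln 2 per bit."""
    return bits * K_B * T * math.log(2)

print(min_erasure_heat_joules(1))      # one bit: ~2.9e-21 joules
print(min_erasure_heat_joules(8e9))    # a gigabyte register: still only ~2.3e-11 joules
```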

    The same holds for the biological demon. For each cognition cycle, the demon's memory must be reset; after each molecular selection, he must forget the results of that selection in order to select a molecule again. There may be, in addition, some thermodynamic cost owing to the thermal jitter in the biological demon's molecular structure (as there may be in the case of Maxwell's original demon owing to such jitter in the trap door), but it is in the forgetting where the irreducible energy cost of the demon's cognitive work is paid — where the thermodynamics accounts are settled by an increase in entropy exactly equal to the information gain.


    How the Molecular Demons in Organisms Gather Information

This tells us more about the act of a protein or RNA demon than a thousand chemical analyses. But where do the entropies come from in the memory resetting, and how do they flow? The entropy-information to-and-fro can get quite tangled in demons that do their thing in conjunction with other molecules, as most protein and RNA demons of our body do. So to see through their machinations, to see how they balance the thermodynamic accounts, we return to Maxwell's original thought experiment where the complexities of the entropy flow of the cognitive act are reduced to the minimum, because the demon has no molecular partner. There, the demon sees his cognitive target — he relies on photons bouncing off the target — and so he can do the balancing with entropy derived directly from photons. This keeps things relatively simple.

    Recall that Maxwell's demon starts with a molecular system in total disarray, a system at thermodynamic equilibrium. Everything is pitch-black then by definition. Thus, to see, the demon needs a source of light, say, a flashlight (in principle, any source of photons that is not at equilibrium will do). A source like that allows us to readily follow how the entropy comes and goes in the demon's world. So let us take advantage of this to pry into his well-guarded secret.

    Consider the flashlight first. That consists of a battery and a light bulb with a filament. The battery contains the energy but produces no entropy; the filament radiates the energy and produces entropy. Now, one of two things can happen, depending on what the demon does. When he does nothing, the radiated energy is absorbed by the molecules in the system, producing an overall increase in entropy — energy which goes to waste. When he intervenes, there is an additional increase of entropy as a quantum of light is scattered by a molecule and strikes his eye. But, in this case, not all of the energy is wasted; part of it is used as information.

    And this takes us to the essence of the cognitive act: the demon uses part of the entropy and converts it to information; he extracts from the flashlight system negative entropy, so to speak, to organize a system of his own.

    There is, then, nothing otherworldly in the demon's act. His operation is perfectly legal; in selecting the molecule he decreases the entropy of his own molecular system, decreasing the probability of the system (pi, equation 1), but the entropy generated at the light source pays for that decrease, exactly satisfying the second law of thermodynamics. All the demon does is redeem part of the entropy as information to organize his system. His stunt is to reduce the degradation of energy infused into the system from without.

    This analytically simple system affords us a glimpse into the demon's secret. The demon's act here is not very different from that of some of the molecular demons populating our body. Drawing negative entropy from a battery or a light source for molecular organization is not just a figment of a thought experiment, but is precisely what certain proteins do. In our own organism, only a few proteins (like those in the receptors of our eyes) can use a light source; most of our protein demons get their thermodynamic pound of flesh from high-energy phosphates in their surrounds, and some get it from the 0.1-volt battery of the cell membrane, as we shall see. But in plants and bacteria there are proteins that pump in information directly from the sunlight and, in a wider sense, this pumping is what keeps all organisms on earth alive.

    It may seem odd at first to think of energy as an information source, but energy, indeed, can hold much information; it can be as ordered as matter. Just as a gas will spread into a volume, energy will spread over the available modes as it interacts with matter. This simply reflects the tendency of the energy quantum to partition into various modes; the probability is higher for the entire quantum to exist in several modes than in only one particular one. Thus, in the sense of equation 2, the unpartitioned quantum, the less probable state, embodies more information.

    Consider now what happens when a photon of sunlight gets absorbed by matter. It ends up in many modes of smaller quantum — a state with less information. By degrading the energy of one photon into smaller quanta of thermal energy, the information content thus gets reduced by a certain amount. This way, with one kilocalorie of energy at ordinary room temperature, there are about 10²³ bits of information to be had. Now, a plant protein demon, who specializes in photons, manages to recoup a good part of that degraded energy in information. Thus, he can build quite a nest egg over a lifetime.

    The amounts that animal protein demons gather by extracting information from phosphate and other molecules are no less impressive. Every interatomic bond in a molecule represents a certain energy; a covalent bond, for instance, amounts to the order of 100 kilocalories per mole (a mole contains 6.022 × 10²³ molecules). The average protein demon in our organism can extract nearly one-half of that in information — about 5 × 10²⁴ bits per mole (the rest heats the environment). A few moles of covalent bonds in the hands of a competent demon, thus, go a long way informationwise.
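    The arithmetic behind such estimates can be sketched as follows (an illustration, assuming the same k·T·ln 2-per-bit limit as above; read the outputs only as order-of-magnitude figures, since the chapter's values are rounded):

```python
import math

K_B_CAL = 3.2983e-24     # Boltzmann's constant, calories per degree (value quoted earlier)
T = 300.0                # ordinary room temperature, kelvin

def max_bits(kilocalories: float) -> float:
    """Rough ceiling on the information obtainable from a given amount of energy,
    assuming each bit must be paid for with at least k * T * ln 2 of energy."""
    return kilocalories * 1000.0 / (K_B_CAL * T * math.log(2))

print(f"{max_bits(1):.1e} bits")                      # one kilocalorie at room temperature
print(f"{max_bits(50):.1e} bits per mole of bonds")   # half of a ~100 kcal/mole covalent bond
```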


    Organisms Are Densely Packed with Information

Organisms have whole legions of protein demons at their beck and call, hauling in information from their surrounds. Virtually all of this information ends up being compressed into the microscopic spaces of the cells (spaces often no more than 10⁻¹⁵ cubic meters). Order also exists elsewhere, of course. The beautiful world of mineral crystals, the splendid patterns of nonequilibrium hydrodynamic and chemical structures, the constellations in the night sky, all provide examples of information in the universe. But nowhere does there seem to be so much information for a given outlay of space and time as there is in organisms — and if we came across an unfamiliar thing from outer space as cram-full with information, we would take it for a live one.

    The large molecules in living beings, the macromolecules, offer the most striking example of information density. Take, for instance, our DNA. Here on a 1-nanometer-thick strand are strung together a billion molecular units in an overall nonrecurring, yet perfectly determined sequence; all the DNA molecules of an individual display that same sequence. We are not yet up to the task of precisely calculating the information inherent in this structure, but we need no precision to see that its potential for storing information is huge. The possible positions in the linear order of units, the four types of DNA bases, represent the elements of stored information. So, with 10⁹ bases and four possible choices for each base, there are 4^(10⁹) possible base sequences — 4^(10⁹) states that are in principle possible, yet they occur always in the same sequence. Even expressed in the logarithmic units of information, this is an impressive number. The number of possibilities is greater than the estimated number of particles in the universe!

    This gives us a feel for the immense potential for storing information in macromolecules and prepares us to be at home with the notion — as far as one can ever be with immense numbers — that the basic information for constructing an organism would fit on such a molecular string, which in humans, taking all 46 chromosomes together, is over a meter long.
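    In round numbers (an illustrative calculation, taking the chapter's figure of a billion bases at face value), each position holds one of four bases (two bits), so the raw storage capacity and the size of the sequence space follow directly:

```python
import math

BASES = 4                 # the four types of DNA bases
N_POSITIONS = 10**9       # the chapter's round figure of a billion molecular units

bits_per_position = math.log2(BASES)            # 2 bits per base
total_bits = N_POSITIONS * bits_per_position    # raw storage capacity: 2 x 10^9 bits

# The number of possible sequences is 4**N_POSITIONS; its base-10 logarithm
# shows how far it dwarfs the roughly 10^80 particles estimated in the universe.
log10_sequences = N_POSITIONS * math.log10(BASES)

print(f"{total_bits:.0e} bits of raw capacity")
print(f"number of possible sequences ~ 10^{log10_sequences:.0f}")
```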

    There is less storage capacity in the shorter DNA strings of lower organisms, though it is still quite impressive. For some single-celled organisms whose DNA nucleotide strings have only a fraction of the length of those in our own cells, we can make a crude estimate of the information content. The DNA of an amoeba (a nonsocial one), for example, holds on the order of 10⁹ bits. In other words, one billion yes/no instructions are written down in that four-letter script — enough to make another amoeba. This script contains everything an amoeba ever needs to know — how to make its enzymes, how to make its cell membrane, how to slink about, how to digest the foodstuffs, how to react when it gets too dry or too hot, how to reproduce itself. And all that information is enscrolled into a space so small you would need a good microscope to make it out. If you wanted to give all these instructions in the English language, they would fill some 300 volumes of the size of this book (the information content of an average printed page in English is roughly 10,000 bits).
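    The book-volume comparison is straightforward arithmetic; a minimal check (the pages-per-volume figure is an assumption):

```python
GENOME_BITS = 10**9        # the chapter's estimate for the DNA of an amoeba
BITS_PER_PAGE = 10_000     # rough information content of a printed English page, as stated
PAGES_PER_VOLUME = 330     # assumed length of a book of this size

pages = GENOME_BITS / BITS_PER_PAGE        # 100,000 pages
volumes = pages / PAGES_PER_VOLUME         # roughly 300 volumes

print(f"{pages:,.0f} pages, about {volumes:.0f} volumes")
```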

    As for the script of our own body cells, that would fill a whole library — a library with more volumes than in any library now in our civilized world. This immense information amount, too, is compressed into a microscopic space — it all fits into the nucleus of a cell. And nowhere else on this Planet is there such a high concentration of information.


    The Law that Rules It All

Immense as they are, those amounts of information in cells are always finite. This also is true for the amounts in any conglomerate of cells, even a conglomerate as large as our own organism, for it can be shown from first principles that, however large, the number of possible states in any physical system cannot be infinite. This is a good thing to know from the start, as we will shortly embark on a journey along the organismic information stream, where the amounts and circularities of the flow might easily deceive one into thinking that the information is boundless.

    It is also good to know that organisms come into possession of all that information by legal means. The molecular demons who enable them to gather the information may be crafty, but they are not crooked; they always scrupulously settle their entropy accounts. So, what the living system gains in order comes at the expense of its surrounds, exactly in compliance with the second law of thermodynamics. And when the time comes and the system can no longer haul in enough information to balance the accounts, the game is up. The system falls apart, and all the king's horses and all the king's men couldn't put it together again.

    Thus, the living system gets no free rides to organization. Like anything else in the universe, it is subject to the laws of thermodynamics. There have been claims to the contrary — such claims crop up in one disguise or another from time to time. But it has invariably proven a safe bet — and Maxwell's stubborn demon turned out to be no exception — not to fall for explanations that violated these laws. Of all our mental constructs of the world, those of thermodynamics are the most solid ones. Albert Einstein, once musing over the rankings of scientific theories, expressed his conviction that classic thermodynamics is "the only physical theory of universal content which ..., within the framework of its basic notions, will never be toppled."

Table of Contents

Acknowledgments
Introduction
PART ONE: INFORMATION AND THE ORIGINS OF LIFE
1. Information and Organisms
2. The Cosmic Origins of Information
3. The Growth of the Cosmic Inheritance on Earth
4. From Energy to Bonds to Molecular Information/Entropy Balancing Act
5. The Three-Dimensional Carriers of Biological Information
6. The Growth of Biological Information
PART TWO: INFORMATION FLOW INSIDE CELLS
7. The Two Information Realms
8. The Game of Seek and Find
PART THREE: INFORMATION FLOW BETWEEN CELLS
9. Rules of the Intercellular Communication Game
10. The Broadcasting of Cellular Information
11. Intercellular Communication Cheek-by-Jowl
12. Intercellular Communication Via Channels in the Cell Membrane
13. Neuronal Communication, Neuronal Computation, and Conscious Thought
PART FOUR: THE SCIENCE OF THE PECULIAR
14. How Can You Explain So Wise an Old Bird in a Few Words?
Epilogue
Appendix: A Brief Chronicle of "Information"
Recommended Reading
References
Index

What People are Saying About This

Jared M. Diamond

Loewenstein, the man who opened the field of cell-to-cell communication, now summarizes the whole field and its implications for cancer, thought, and much else.
