Boltzmann's Atom: The Great Debate That Launched a Revolution in Physics

by David Lindley

Overview

In 1900 many eminent scientists did not believe atoms existed, yet within just a few years the atomic century launched into history with an astonishing string of breakthroughs in physics that began with Albert Einstein and continues to this day. Before this explosive growth into the modern age took place, an all-but-forgotten genius strove for forty years to win acceptance for the atomic theory of matter and an altogether new way of doing physics. Ludwig Boltzmann battled with philosophers, the scientific establishment, and his own potent demons. His victory led the way to the greatest scientific achievements of the twentieth century.

Now acclaimed science writer David Lindley portrays the dramatic story of Boltzmann and his embrace of the atom, while providing a window on the civilized world that gave birth to our scientific era. Boltzmann emerges as an endearingly quixotic character, passionately inspired by Beethoven, who muddled through the practical matters of life in a European gilded age.

Boltzmann's story reaches from fin de siècle Vienna, across Germany and Britain, to America. As the Habsburg Empire was crumbling, Germany's intellectual might was growing; Edinburgh in Scotland was one of the most intellectually fertile places on earth; and, in America, brilliant independent minds were beginning to draw on the best ideas of the bureaucratized old world.

Boltzmann's nemesis in the field of theoretical physics at home in Austria was Ernst Mach, noted today in the term Mach 1, the speed of sound. Mach believed physics should address only that which could be directly observed. How could we know that frisky atoms jiggling about corresponded to heat if we couldn't see them? Why should we bother with theories that only told us what would probably happen, rather than making an absolute prediction? Mach and Boltzmann both believed in the power of science, but their approaches to physics could not have been more opposed. Boltzmann sought to explain the real world, and cast aside any philosophical criteria. Mach, along with many nineteenth-century scientists, wanted to construct an empirical edifice of absolute truths that obeyed strict philosophical rules. Boltzmann did not get on well with authority in any form, and he did his best work at arm's length from it. When at the end of his career he engaged with the philosophical authorities in the Viennese academy, the results were personally disastrous and tragic. Yet Boltzmann's enduring legacy lives on in the new physics and technology of our wired world.

Lindley's elegant telling of this tale combines the detailed breadth of the best history, the beauty of theoretical physics, and the psychological insight belonging to the finest of novels.

Product Details

ISBN-13: 9781501142673
Publisher: Simon & Schuster
Publication date: 02/13/2024
Sold by: Barnes & Noble
Format: eBook
Pages: 274
Sales rank: 131,263
File size: 971 KB

About the Author

David Lindley holds a Ph.D. in astrophysics and has been an editor at Nature, Science, and Science News. He is the author of The End of Physics, Degrees Kelvin, Where Does the Weirdness Go?, and Boltzmann's Atom. He lives in Alexandria, Virginia.

Read an Excerpt

Introduction

"I DON'T BELIEVE THAT ATOMS EXIST!"

As recently as the waning years of the 19th century, a respected physicist and philosopher could make that statement before an audience of his colleagues and expect to be met not with derision or ridicule but with thoughtful consideration. Many scientists at the time subscribed to the idea that all matter was made up of tiny, elemental particles called atoms, but their arguments were as yet circumstantial. No one could say exactly what an atom was. For such reasons, critics maintained that atoms were no more than clever speculation, unworthy of scientific consideration.

This blunt declaration of disbelief came, in fact, in January 1897 at a meeting of the Imperial Academy of Sciences in Vienna. The skeptic was Ernst Mach, not quite 60 years old, who had been for many years a professor of physics at the University of Prague and who was now a professor of the history and philosophy of science in Vienna. He pronounced his uncompromising opinion in the discussion following a lecture delivered by Ludwig Boltzmann, a theoretical physicist. Boltzmann, a few years younger than Mach, had likewise recently returned to Vienna after many years at other universities in Austria and Germany. He was an unabashed believer in the atomic hypothesis — indeed, his life's work had centered on that single theme.

Nowadays atoms are uncontroversial. Scientists have proved not only that they exist, but that they are made of still smaller objects. An atom is a cloud of electrons swirling around a dense nucleus; the nucleus contains protons and neutrons; inside them are quarks. Quarks, in all likelihood, are not truly fundamental particles but manifestations of some still more elementary theoretical structure.

It seems today unremarkable, indeed a matter of course, that theoretical physicists striving to understand the fundamental nature of the world should deal in esoteric ideas and strange objects far removed from the world around us. But it was not always so. Well into the latter half of the 19th century, most scientists saw their essential task as the measurement and codification of phenomena they could investigate directly: the passage of sound waves through air, the expansion of gas when heated, the conversion of heat to motive power in a steam engine. A scientific law was a quantitative relationship between one observable phenomenon and another.

But there came a time when, to understand more deeply, a few scientists found themselves impelled to dig deeper, to probe beyond surface appearances. Ludwig Boltzmann was such a pioneer. He understood before most of his contemporaries that if he pictured a gas as a lively collection of atoms, he could explain many of its properties. The atoms' incessant motion would produce the properties recognized as temperature and pressure. Instead of merely observing and recording that a heated gas expands, he could say why it would expand, and by how much. By understanding the behavior of atoms, he could understand the ability of hot gas to push on a piston, as in a steam engine, and turn its energy into mechanical work.
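
In modern notation, the quantitative claim looks like this (a standard kinetic-theory result, stated here for illustration rather than quoted from the book): for $N$ molecules of mass $m$ with mean squared speed $\langle v^2 \rangle$ confined to a volume $V$, the pressure from molecular impacts is

$$ P V = \tfrac{1}{3} N m \langle v^2 \rangle, \qquad \tfrac{1}{2} m \langle v^2 \rangle = \tfrac{3}{2} k_B T, $$

so pressure and temperature both trace back to molecular motion, and comparing with the empirical gas law $PV = N k_B T$ says exactly by how much a heated gas must expand.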

In his lifelong pursuit of this new atomic perspective, moreover, Boltzmann introduced wholly new theoretical concepts into physics. Because atoms are so numerous, and their motion so varied, he had to use techniques of statistics and probability to depict their collective activities. Although atoms move in fundamentally random ways, Boltzmann found he could nonetheless make accurate predictions of their collective effects; he proved that the disorderly actions of individual atoms could give rise to orderly behavior in bulk. He showed that laws of physics could be built on a foundation of probability, and yet still be reliable. To an audience of physicists raised in the belief that scientific laws ought to encapsulate absolute certainties and unerring rules, these were profound and disturbing changes.
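
A minimal numerical sketch in Python (added here for illustration; it is not from the book, and the exponential speed distribution merely stands in for the real one) shows how disorderly individual motions yield an orderly bulk average:

import random

random.seed(1)  # fixed seed so the run is repeatable

def mean_speed(n_molecules):
    # Give each "molecule" an independent random speed (arbitrary units);
    # an exponential distribution stands in, purely for illustration.
    speeds = [random.expovariate(1.0) for _ in range(n_molecules)]
    return sum(speeds) / n_molecules

for n in (10, 1000, 100000):
    print(f"{n:>6} molecules: mean speed = {mean_speed(n):.3f}")

# Individual speeds scatter wildly, but the sample mean settles toward 1.0
# as n grows: disorderly atoms, orderly bulk behavior.

The larger the crowd of molecules, the smaller the fluctuations in any bulk average, which is why laws built on probability can still be effectively exact.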

In 19th-century Europe, not every scientist saw such novelties as developments to be cheered. Boltzmann encountered opposition. Many of his fellow physicists did not believe that his goals were worthwhile, or even qualified as science. They had measured the expansion of gases and could write down a simple law relating temperature, pressure, and volume. Boltzmann's alleged atoms, by contrast, were invisible, intangible, and imperceptible. What was the point of explaining a straightforward law, derived directly from experiment, in terms of hypothetical entities that could not be seen and might never be seen? This was why Ernst Mach proclaimed that he didn't believe that atoms existed.

Belief, in the context of a scientific debate, may seem an odd word. Is not science a matter of proof and reason, logic and fact? Atoms, surely, either exist or they do not. Where does belief come into it?

Nevertheless, Mach used that word particularly, and he meant it. Scientific certainty is achieved only gradually — some would say never. When ideas are new and theories tentative, scientists do not and cannot have proof that they are on the right track. They generate hypotheses (a fancy word for guesses) and try to follow where their insight and imagination lead them. But rarely does a scientific hypothesis — at least, not a useful or an interesting one — admit of a straightforward up-or-down, yes-or-no verdict. Valuable hypotheses survive the test of time in countless engagements with reality. The war against ignorance is a war of attrition.

In the meantime, scientists have to keep the faith. They pursue their chosen hypotheses because they believe they are moving forward.

Boltzmann's pursuit of the atomic hypothesis sprang from just such a belief. By explaining a wide variety of the properties of gases from a single starting point, he believed he was providing a new and powerful form of understanding. Mach thought otherwise. He did not doubt Boltzmann's acuity of mind or ingenuity with theories. He just didn't see the point of all that theorizing. And he evolved a philosophy to bolster his beliefs. Science ought to stick to what it can measure directly, and theories ought to restrict themselves to specifying exact relationships between those measured phenomena. Put some energy into a gas, heat it up, and it expands. The rules for such changes had been found out and established years before. Nothing further need be said.

The debate between Boltzmann and Mach was, therefore, less about atoms themselves and more about the purpose of doing physics, and about the nature of the understanding or explaining that physicists sought to achieve. Mach argued for sticking to simple equations linking tangible quantities. Boltzmann believed that more elaborate explanations, dependent on larger assumptions or hypotheses, nevertheless provided a more complete, more satisfying view of the physical world. At the cost of introducing theoretical ideas, Boltzmann claimed, he could generate a more valuable understanding of the way the world worked.

But value, like belief, is another word that doesn't quite sound scientific. Mach, through the course of his long and productive career, steered from physics into philosophy precisely because he grew fascinated by the question of value. What is the worth of a scientific explanation? What kinds of explanations should scientists aim for? By the time he returned to Vienna, his reputation rested on his philosophical writings more than on his scientific achievements (which were diverse and useful, but none of them truly remarkable. If his name is known to nonscientists today, it is because of the Mach number, the velocity of a projectile as a multiple of the speed of sound. A convenient notion, but hardly the product of genius).

Boltzmann, on the other hand, had the impatience with philosophical quibbling typical of most scientists. As his theories grew in power and scope, he knew he was making progress. He understood things better. He didn't worry about what he meant by saying that he understood things better.

When, in 1897, Mach stood up in the Viennese Academy of Sciences and declared flatly that he didn't believe in the existence of atoms, his words, as Boltzmann recalled later, "ran around in my head." Mach's objection set him thinking — thinking, specifically, that as a theoretical physicist of undoubted prowess he ought to be able to summon up some sort of argument, of a philosophical nature, that would dent Mach's stubborn skepticism. He had never been very interested in philosophy, but he would learn something about it and refute his critics.

It was to be an unhappy impulse.

Copyright © 2001 by David Lindley

Chapter 1: A Letter from Bombay

Lessons in Obscurity

On December 11, 1845, a lengthy manuscript arrived in the London offices of the Royal Society, the highest scientific association in Great Britain. The author of this work hoped his essay might be published in the Society's august Philosophical Transactions, and the manuscript, by standard practice, was duly sent to a couple of experts for evaluation of its worth. "Nothing but nonsense" was the verdict of one of these eminent reviewers. The other allowed that the paper demonstrated "much skill and many remarkable accordances with the general facts," but concluded nevertheless that the ideas were "entirely hypothetical" and, in the end, "very difficult to admit."

On these recommendations, the manuscript was never published. Worse still, the author, one John James Waterston, never found out what had happened. Waterston was living at the time in Bombay, teaching navigation and gunnery to naval cadets employed by the East India Company. Born and educated in Edinburgh, he spent his life working as a civil engineer and teacher, retiring from his position in India in 1857 to return to Scotland, where he lived modestly on his savings and continued to dabble in science: astronomy, chemistry, and physics. He was known during his lifetime, if at all, as one of the numerous amateurs of Victorian science, working in isolation, contributing from time to time ideas that were more or less sound but of no great consequence.

His rejected manuscript of 1845 embodied Waterston's one truly innovative and profound piece of work, but it was ahead of its time. Only by a few years, admittedly, but that was enough to ensure its unhappy reception by the experts of the Royal Society. Waterston proposed that any gas consisted of numerous tiny particles — he called them molecules — bouncing around and colliding with each other. He showed that the energy of motion in these particles corresponded to the temperature of the gas, and that the incessant impacts of the particles on the walls of the container gave rise to the effect commonly known as pressure. There was more: Waterston calculated the "elasticity" of gases (their ability to flow, roughly speaking) from his model, and he made the subtle observation that in a mixture of different gases all the tiny particles would, on average, have the same energy, so that heavier molecules would move more slowly than lighter ones. He was not right in every detail, but his general arguments and suppositions have survived the test of time. Waterston's fundamental idea, that a gas is made of tiny, colliding particles whose microscopic behavior produces the measurable properties of the gas as a whole, was exactly right.
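
In today's notation, Waterston's mixture rule (rendered here as a worked illustration, not in his own symbols) reads: equal average kinetic energies, $\tfrac{1}{2} m_1 \langle v_1^2 \rangle = \tfrac{1}{2} m_2 \langle v_2^2 \rangle$, imply

$$ \frac{v_{1,\mathrm{rms}}}{v_{2,\mathrm{rms}}} = \sqrt{\frac{m_2}{m_1}}. $$

An oxygen molecule, sixteen times as massive as a hydrogen molecule, therefore moves on average four times more slowly.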

Waterston's calculations were somewhat rough and ready, and his proofs were not quite solid. It may have been these deficiencies that led to the rejection of his paper — that, and the fact that his name was unknown. It was certainly not revolutionary, in the middle of the 19th century, to suggest that gases consisted of tiny particles. The terms atom and molecule were known in scientific circles, although they designated objects whose true nature was unclear. Even the idea that the motion and collision of these particles had something to do with temperature and pressure was not altogether new. The Royal Society, admirably consistent, had in fact rejected a very similar proposal some 25 years earlier. The author of this earlier attempt was John Herapath, another unsung amateur of Victorian science and engineering. His work was by no means as sophisticated as Waterston's, but he had the right general idea: heat equals the motion of atoms or molecules. He wrote up his ideas in 1820 and sent them to the Royal Society. The chemist Humphry Davy, then president of the society, declined to publish the paper. Though he was not unsympathetic to atomic thinking, Davy found Herapath's calculations unconvincing, and in truth, Herapath was confused about the mechanics of atomic collisions and came up with an incorrect formula for the temperature of a gas. Still, Herapath succeeded in getting accounts of his work published in other scientific journals, where they were roundly ignored by the scientific community of that age.

Waterston knew of Herapath's work, and of his erroneous formula for temperature, but neither of these two men, it appears, was aware that the atomic picture of a gas was close to a century old by the time they came to it. In 1738 Daniel Bernoulli, one of an extended Swiss clan of Bernoullis that made notable contributions to both mathematics and physics, succeeded in deriving theoretically a relationship between the pressure exerted by a gas and the energy of vibration of the supposed atoms within it. His theory attracted little attention, and was soon forgotten.

Bernoulli's was the first modern atomic or molecular model of a gas. He explained pressure in terms of atomic motion, but not temperature, largely because the nature of heat itself was quite mysterious in Bernoulli's day. Even so, neither he nor Herapath nor Waterston can take any credit for the idea of atoms themselves. They were the inheritors of a centuries-old tradition in natural philosophy according to which everything in the universe is composed fundamentally of minute, indivisible objects. The word atom is of Greek origin, meaning "uncuttable," and it is from ancient Greece that the idea itself descends.

Knowledge of the atomic hypothesis from ancient times is owed largely to the survival of a long poem called De Rerum Natura (On the Nature of Things) by the Roman writer Lucretius. The names of both this poem and its author had faded into oblivion in the centuries after the fall of Rome, but a church official traveling around the monasteries of France and Germany in the early 15th century happened across a copy (not an original) and brought it back to the Vatican in 1417. Manuscripts dating back to the 9th or 10th century were subsequently rediscovered and found to be substantially the same as the Vatican copy. From these versions descend all modern editions of De Rerum Natura. Its author, Titus Lucretius Carus, lived from about 95 to 55 B.C. The six books of his great opus lay out a philosophical reflection on life as well as an exposition of a scientific hypothesis. It is fiercely atheistic. It enjoyed a good deal of renown in its time, but was later attacked by the Emperor Augustus in his attempt to restore some of the faded glory of the declining Roman world by reviving the ancient pre-Christian religion.

Lucretius derived his atheism from his adherence to what can be called, with the benefit of two millennia of hindsight, an atomic theory of the natural world. For example:

clothes hung above a surf-swept shore

grow damp; spread in the sun they dry again.

Yet it is not apparent to us how

the moisture clings to the cloth, or flees the heat.

Water, then, is dispersed in particles,

atoms too small to be observable.

In other words, a wet garment has atoms (we would now say molecules) of water clinging to its fabric; heat drives the atoms off, and thus dries the material. An atomic theory of clothes drying seems to be some way from disproving the existence of deities, but Lucretius goes on to observe that the atoms have no volition, and instead move willy-nilly:

For surely the atoms did not hold council, assigning

order to each, flexing their keen minds with

questions of place and motion and who goes where.

But shuffled and jumbled in many ways, in the course

of endless time they are buffeted, driven along,

chancing upon all motions, combinations.

At last they fall into such an arrangement

as would create this universe...

Examined closely, Lucretius says, the range and variety of all the familiar phenomena of the world about us arise from invisible atoms zipping aimlessly this way and that. No need for gods to direct events, or inspire actions and consequences. On the other hand, Lucretius's vision seems to leave little room for human decision or free will either. If the universe takes its course because atoms are following their random paths, then neither gods nor human beings have any control over their destinies; what will happen, will happen, and there is nothing anyone can do to change it.

This is a bleak form of atheism, implying what is nowadays called determinism, meaning that what happens in the future is wholly determined by what has happened in the past. To Lucretius and his followers this view was nevertheless a liberation. In their day the gods were fickle, cruel, and capricious, more inclined to pranks and practical jokes than to love or compassion. The citizens of Rome decidedly did not wish for a god to enter their lives. To believe, as Lucretius insisted, that there were no gods, and that the world proceeded for good or ill quite indifferent to human desires, was by contrast to achieve a measure of repose through calm acceptance. Even death was not to be feared: when the atoms of one's soul and body were forever dispersed, there could be no sensation, no pain. Compared to being taunted or tortured for all eternity by frivolous, merciless gods, that was indeed a blessing.

In his philosophy, based on atomism, Lucretius found a reason to give up the struggle against blind fate, and to live instead with equanimity in the world as it was. He lived in the time of Julius Caesar, when the Roman republic was failing. Tyrants, wayward generals, and corrupt politicians would thereafter take over. Peace was to be found in withdrawing as far as possible from the vicissitudes of life. Whether Lucretius was able to live according to his own recommendation is doubtful. He suffered periods of insanity or mental disturbance, and killed himself when he was about 40 years old. In a story handed down by St. Jerome, Lucretius was so much and so often wrapped in thought that his wife grew resentful, and to restore marital relations secretly gave him a love potion. Unfortunately, the potion was stronger than necessary, drove him mad, and thus impelled him to suicide. Tennyson wrote a poem about the poet and described the reasons for his wife's unhappiness:

Yet often when the woman heard his foot

return from pacings in the field, and ran

to greet him with a kiss, the master took

small notice, or austerely, for — his mind

half-buried in some weightier argument,

or fancy-borne perhaps on the rise

and long roll of the hexameter — he past

to turn and ponder those three hundred scrolls

left by the Teacher, whom he held divine.

This Teacher, the man by the contemplation of whose scrolls Lucretius earned his wife's displeasure, was the philosopher Epicurus, whose name survives in the notion of putting pleasure foremost among one's goals in life. A contemporary critic sniped that the ideal Epicurean way of life consisted of "eating, drinking, copulation, evacuation, and snoring," but there was more to it than that. Epicurus aimed for what might better be called contentedness, which meant freedom from pain and satiation of one's desires rather than any sort of unbridled hedonistic pleasure seeking.

To Epicurus, the greatest fear in life was the fear of death, or rather the fear of an unendurable afterlife that nevertheless had to be endured. As Lucretius reports, Epicurus employed the notion of atoms to argue that death was the final release from suffering, to be regretted, perhaps, but not feared. Lucretius differed from his teacher in one significant way: he went from atomism to atheism, but Epicurus still believed in the gods, and found the determinism of the atomic philosophy not to his taste. For that reason he introduced what seems now a rather odd idea:

When the atoms are carried straight down through the void

by their own weight, at an utterly random time

and a random point in space they swerve a little,

only enough to call it a tilt in motion.

Lucretius goes on to indicate that these "swerves" in the motion of atoms are what cause the atoms to cluster together or collide or otherwise interact in ways that can produce natural phenomena. The main point, however, was apparently to get around strict determinism by allowing atoms to alter their trajectories spontaneously, without any immediate cause. Perhaps this restores free will, or the ability of the gods to meddle, but it strikes the modern reader as an "unscientific" addition to the theory.

It was, indeed, Epicurus's own ill-considered addition. He did not dream up the notion of atoms, but got them from a still earlier source, in the writings of the Greek philosopher Democritus, and his teacher, Leucippus.

Of Leucippus little is known except that he flourished and taught in the years following 440 B.C. in what is now Turkey. His pupil, Democritus, lived from about that time until 371 B.C., mostly in northern Greece, and whether the beginnings of atomism should properly be credited to him or to Leucippus is impossible to say, since the latter's teaching is preserved only in the former's writings. Nevertheless, between the two of them, they put together what we can easily — perhaps too easily — see as the first intimation of a recognizably modern theory of atoms. They proposed that there exists a void, and in this void atoms move about, always in motion. Atom and void are all there is. The atoms come in a variety of distinct types and are indivisible; they band together in different ways to create the tangible and visible ingredients of the world.

To Democritus it was evident that there could be no up or down in an infinite void, and he therefore proposed that atoms move endlessly in all directions, changing course only when they ran into each other. But this implies determinism: once the atoms are off and running, their courses are fixed. There is still room for a deity at the beginning — a prime mover, an uncaused cause, or some other extraphysical influence that sets the atoms up and pushes them off in certain directions — but once that's done, determinism takes over. Does this mean there is no free will or volition? That the future is completely determined by the past? That question has haunted atomic theory, indeed physics in general, since the time of Democritus, and haunts us still today.

What distinguished Leucippus and Democritus from most of their contemporaries, and from almost all of the thinkers who followed them over the next two millennia, was that they were mainly interested in trying to understand how the world worked. Other philosophers began to focus their attention not so much on the universe as on the position of human beings in the universe, the extent to which human beings could know or understand the world around them, and how humans ought to behave. Thus arose the numerous brands of philosophy that have concerned themselves with the nature of knowledge and thought, and with the ethics and morality of human behavior. Religious philosophers took for granted that the universe has a purpose, and that humans have a purpose within it, which they may aspire to or fall away from. Leucippus and Democritus were, by contrast, scientists, aiming to understand as dispassionately as possible what is out there. Since their time, science and philosophy have become separate and frequently combative disciplines.

Atomic theory, with its implicit atheism and determinism, lost the favor of philosophical thinkers for a long period. But it crops up from time to time, for example in the writings of Isaac Newton:

It seems probable to me that God in the beginning form'd matter in solid, massy, hard, impenetrable, movable particles, of such sizes and figures and with such other properties, and in such proportion to space, as most conduced to the end for which he form'd them.

Whether from personal belief or caution, Newton is careful to cede to God the responsibility of creating atoms in the first place. But how, if at all, is this statement an advance on anything that Democritus (through Epicurus and then Lucretius) had said two thousand years earlier? Newton lists the attributes that atoms must or might have, but then concludes, quite circularly, that the properties and behavior are such "as most conduce" to the effects they need to generate. What atoms do, in other words, is whatever they need do in order to produce the phenomena of the natural world. Neither Democritus nor Newton is able to say how, in any specific sense, atoms behave so as to generate physical effects. In the absence of any such elaboration, atomism was bound to remain an appealing but speculative picture rather than a truly scientific theory.

By contrast there were, from the earliest times, plausibly scientific criticisms of the atomic philosophy. One objection that arose in Democritus's time was later taken up with enthusiasm by Aristotle: how could atoms move constantly, without let-up, for all time? In Aristotelian mechanics, inferred from direct observation, moving objects came to a halt unless something intervened to keep them moving. You had to keep kicking a rock to keep it rolling. What, therefore, kept atoms moving?

Once Newton came along with his laws of motion, this argument lost much of its force. Newton directly contradicted Aristotle: objects keep moving, in straight lines, until something stops them. The kicked rock rumbles to a halt because the impacts it suffers sap its energy.

The other knock against atomism was that the atoms moved around in empty space, a void, and many philosophers had satisfied themselves that a void was impossible. Their reasoning, briefly, was that for anything to exist, it must have a name that referred to something rather than nothing, and since nothing by definition could not have such a name, it could not therefore exist. This argument, we would now say, is the result of a philosophical confusion between the name of a thing and the thing itself, but it took philosophers a long time to sort that one out. If indeed they have, even now.

Democritus answered these objections in essence by refusing to answer them. He simply asserted that atoms exist and that they move incessantly in the void. He didn't attempt to provide any proof of these statements, but regarded them instead as assumptions from which he and the other atomists sought to explain what they saw in the world about them.

This attitude is strikingly modern and scientific. As Democritus saw it, you have to start somewhere. You make an assumption and explore the consequences. This is exactly what scientists continue to do today, and the fact that a certain assumption leads to all kinds of highly successful predictions and explanations does not, strictly speaking, prove that the original assumption is correct. To jump abruptly to the present day, many theoretical physicists now believe that the elementary particles of the universe are creatures called superstrings — literally, lines or loops that wiggle around in multidimensional space, creating, by wiggling in different ways, electrons and quarks and photons. (More recently still, these superstrings have been subsumed into more complicated multidimensional structures called branes.) Enthusiasts for superstring theory and its variants maintain that they have hit on a fundamentally simple explanation for everything in the physical world, although working out the observable consequences of that explanation is admittedly a complicated and perhaps inconclusive business. Critics point out that whether superstring theory can be tested satisfactorily depends crucially on whether working out the details can be done, even in principle. Neither side expects that anyone will ever see a superstring in its native form.

The modern debate over superstrings is philosophically not very different from the ancient debate over atoms. To Democritus it was self-evidently a step forward to be able to explain the wildly varying and seemingly unpredictable phenomena of the natural world in terms of unchanging and eternal atoms. But even this idea had detractors. Heraclitus — famous for his observation that "you can't step into the same river twice, for fresh waters are always flowing in upon you" — believed that change, not permanency, was the essential nature of the world.

How much credit should we give Democritus and the few atomists of his era for correctly anticipating what we now know to be true? The universe is either constant in its fundamental nature, or ever-changing; matter is either continuous and infinitely divisible, or else made of a finite number of indivisible parts. There seem to be no other possibilities. On both questions, Democritus happened to choose the right side.

Then again, the early atomists were far from right about everything. They believed that the soul was composed of especially subtle atoms. Lucretius had a theory that sweet and bitter tastes arise when the tongue encounters smooth or jagged atoms. With hindsight, we tend to dismiss these errors as the products of overenthusiasm and seize on the points where the atomists got it more or less right. As Bertrand Russell put it, "By good luck, the atomists hit on a hypothesis for which, more than two thousand years later, some evidence was found, but their belief, in their day, was nonetheless destitute of any solid foundation."

What's most important about Democritus is his insistence that explanation, if it is to have lasting value, must itself rest on permanent foundations — a requirement that seems today almost a definition of what science must aim for. The Heraclitean idea that all is change and flux, on the other hand, seems to lead nowhere. Democritus, in his style of thinking, was more like a modern scientist than any other ancient philosopher. He argued that we ought to understand the universe first and worry about our place in it afterward, not adjust our view of the universe for the sake of our own peace of mind. He believed that the complexity of the world at large could, in principle, be explained by means of a simple underlying hypothesis. He believed it was not foolish to imagine the world was made of tiny components, even if those components were too tiny ever to be seen. These self-same principles, and the controversy they engendered, rose up once again almost two thousand years after Democritus, when the modern version of atomic theory began its ascent.

In that interim, atomic theory languished, never quite forgotten but not much amplified either. A taint of atheism hung over it, and natural philosophy in the post-Roman, prescientific world was powerfully religious, or at least mystical. Philosophers of the middle ages set as their most important undertaking the task of proving that God existed. The alchemists, meanwhile, tried vainly to find secret recipes that would transform, in mysterious ways, one substance into another, and in particular, base metals into gold. The towering but enigmatic Isaac Newton was in many ways both the first modern scientist and also, in Keynes's phrase, the last alchemist. When he wasn't propounding mechanical laws of motion or inventing the differential and integral calculus, Newton pored over the Bible and other ancient texts, trying out bizarre numerological schemes in the pursuit of arcane knowledge.

Nevertheless, modern science gradually emerged. The alchemists — mystics and sorcerers — changed almost imperceptibly, as pupil outgrew teacher, into chemists. Both were looking to unlock the nature of the physical world and the transformations within it, but where alchemists stumbled blindly, hoping to come across secret recipes, chemists slowly adopted a more purposeful strategy, hoping to control chemical transformations by first understanding the rules that governed them.

Atomic theory began to rise again. The rules that chemists learned imposed some restrictions that would have gravely disappointed their alchemical predecessors. Metals such as iron, copper, and gold were elemental quantities, they found, that could under no circumstances be forcibly converted one into another. Fire, on the other hand, which to alchemists had always been the supernatural agent of transformation, turned out to be just such a transformation in its own right: a chemical reaction.

Chemists grasped the idea of elements and of chemical reactions as combinations of elements changing partners according to strict rules, as in a country dance. Water, for example, was a compound of two parts of hydrogen to one of oxygen. From there it was not so big a leap to think of "atoms" of these gases combining, two of hydrogen with one of oxygen, to create an "atom" of water. (The modern distinction between atoms and molecules, which consist of several atoms bonded together, did not become fully clear until chemists had sorted out what were elements and what were compounds of those elements. In the meantime, scientists used the terms atom and molecule somewhat interchangeably.)

Still, the chemists didn't care very much (because they didn't need to) what the atoms looked like, how they behaved, how they congregated or dispersed. Whether they were tiny, hard things flying about in empty space or fat, squishy things packed closely together like oranges in a carton didn't matter much. And it wasn't at all clear whether the atoms of hydrogen and oxygen were genuine, indivisible entities, or whether the two-plus-one formula for combining them into water was simply a handy accounting method. As had been the case for Democritus and Lucretius, atoms seemed like a nice idea, at least to those disposed to think that way, but still there didn't yet seem to be anything necessary or compelling about them.

What seems surprising in retrospect, perhaps, is that it took so long for physicists to combine Newton's laws of motion, so well established as the foundation of physics in the 17th and 18th centuries, with the resurgent atomic hypothesis — to think, in other words, of atoms as little objects moving, colliding, and bouncing off each other in accordance with standard Newtonian mechanics. This is what Daniel Bernoulli first tried, in 1738, with his argument deriving pressure from a consideration of atomic motion. But even after that, in 1763, Roger Boscovich wrote an exposition called Theoria Philosophiae Naturalis in which he offered an atomic theory that relied on essentially stationary atoms. Boscovich, a peripatetic philosopher-priest of Serbo-Croatian origins, argued that at very short range, atoms attracted each other: that was why a piece of cloth soaked up water. At somewhat longer range, however, atoms pushed each other away: that was why a gas exerted pressure.

Boscovich's account, though it has some modern elements, also illustrates why atomic theory was not taken seriously by many scientists for such a long time. Rather than imagining atoms as having certain properties, and seeking to draw conclusions about their behavior, he instead gave the atoms whatever properties he needed in order to explain the phenomena he addressed. This put into practical terms Newton's suggestion that atoms must "conduce themselves" so as to produce the behavior we see. It is easy to criticize this thinking as wholly speculative and unscientific. First you imagine that atoms exist, and then you imagine that they have whatever properties they need in order to account for the phenomena you want to explain.

These philosophical considerations aside, the other great barrier against the acceptance of atomism, especially as it applied to gases, was ignorance of the true nature of heat. At the beginning of the 19th century, opinion was divided. Some scientists thought that heat was a mechanical property of some sort, related to energy and other Newtonian concepts, but others, perhaps a majority, subscribed to the notion that heat was a kind of vaporous fluid or tenuous substance that went by the name caloric. This caloric was supposed to be an entity in its own right, not something composed or built from other components, and it could somehow soak into or pervade material objects, bestowing on them the property we recognize as heat. When a warm object lost heat to a colder object in contact with it, that was because caloric dribbled out of one and seeped into the other.

An argument against the caloric theory came from the Massachusetts-born scientist and inventor Benjamin Thompson, who spied for Britain in the years preceding the Revolutionary War, fled to London in 1775, returned briefly to America while the war was still going on, and after the newly independent United States had won, returned to Britain as a refugee. The appreciation shown to him there fell short of his expectations, and through political connections he obtained an appointment to the royal court of Bavaria, where he served mainly as a military adviser but succeeded in making himself indispensable in a variety of ways. He laid out the English Gardens in Munich, concocted a recipe for soup (along with specific chewing and swallowing instructions) that was meant to keep soldiers well nourished, and designed a portable coffeemaker. For these and other services he was made, in 1792, Count Rumford of the Holy Roman Empire — a name familiar to many American home renovators today in connection with the Rumford fireplace, an efficient hearth he designed in order to keep smokiness to a minimum.

Besides all this, Rumford also showed a genuine aptitude for scientific insight, and he made a number of useful observations concerning the nature of heat and energy. In his capacity as a military engineer in Bavaria he oversaw the boring out of cannons, and noticed that a dull bit would grind endlessly into a chunk of metal, achieving little except the generation of heat. He concluded that the amount of heat obtainable was essentially limitless, as long as the drill bit kept boring away. That was hard to understand if heat represented caloric being drawn out of the drilled metal; surely the original supply of caloric would run out after a while. Rumford saw instead that heat generation had something to do with the physical work of grinding the bit on the metal.

The caloric theory of heat lingered on into the first decades of the 19th century, despite observations such as Rumford's and despite the fact that no one could really say what sort of a substance caloric was supposed to be. In that respect, however, atoms — invisible particles with unknown properties — had no firmer standing. But physicists were at least familiar with gases and fluids in a general way, and if caloric was a peculiar kind of fluid, that was because heat was a peculiar kind of quantity. Atoms, on the other hand, were a complete unknown, and to explain something familiar yet enigmatic, such as heat, in terms of tiny, hard masses must have struck scientists of the early 19th century as too great a leap of imagination for them to follow.

Accustomed as we are nowadays to the idea of explaining all manner of observable or detectable phenomena in terms of remote, invisible entities — quarks and photons, electromagnetic fields, curved space, and the like — scientists of two hundred years ago were still essentially rooted in what they could see and measure directly. Heat could be detected at the fingertips; it was an undoubted physical phenomenon. The pressure of a gas could likewise be felt in the tautness of an inflated balloon or the powerful stroke of a piston in a steam engine. What did

Table of Contents

Introduction

Chapter 1: A Letter from Bombay

Lessons in Obscurity

Chapter 2: Invisible World

The Kind of Motion We Call Heat

Chapter 3: Dr. Boltzmann of Vienna

The Precocious Genius

Chapter 4: Irreversible Changes

The Enigma of Entropy

Chapter 5: "You Will Not Fit In"

The Daunting Prussians

Chapter 6: The British Engagement

Parsons, Lawyers, and Physicists

Chapter 7: "It's Easy to Mistake a Great Stupidity for a Great Discovery"

Philosophy Seduces Physics

Chapter 8: American Innovations

New World, New Ideas

Chapter 9: The Shock of the New

The Arrival of the Atomic Century

Chapter 10: Beethoven in Heaven

Shadows of the Mind

Chapter 11: Annus Mirabilis, Annus Mortis

Einstein Rises, and a Man Falls

Postscript

Acknowledgments

Bibliography and Notes

Index
