The New Physics of Information: From "Cogito ergo sum" to "It from bit"
July 10, 2001
The Bit and the Pendulum: From Quantum Computing to M Theory---The New Physics of Information. By Tom Siegfried, John Wiley, New York, 2000, 281 pages, $27.95 (cloth), $15.95 (paper).
Tom Siegfried is the science editor of the Dallas Morning News. He has received, among other honors, the Westinghouse Award for science journalism from the American Association for the Advancement of Science. In the course of his work, he interviews leading experts in fields that range from physics to food science. In the book under review---his first---he explores a topic that has begun to arise with considerable frequency in these interviews. He and others describe it as "the new physics of information." He makes no pretense of being a physicist himself, claiming only to have puzzled over countless nontechnical descriptions of recent discoveries, in which he finds a significant pattern. His title is a variation on "The Pit and the Pendulum," the Edgar Allan Poe story about a condemned man groping in the dark, much as Siegfried perceives science to be groping, toward an understanding of his surroundings.
Siegfried suggests, first of all, that machines and technology are more fruitful objects of study than nature itself, since particular machines and branches of technology tend to work on only a few scientific principles, which they accordingly isolate from other such principles in ways that nature seldom does. He focuses on three machines that stand out as what he calls "superparadigms" in the history of science: the clock, the steam engine, and the electronic computer. Each was, in its time, the "mother of all paradigms," in the sense that other paradigms were patterned after it. His observation that paradigms have genealogies brings to mind the Reagan cabinet member---the name escapes me---who used to tease youthful technocrats by singing a few bars of "brother, can you paradigm?"
The Three "Superparadigms"
Weight-driven mechanical clocks began to appear in European village squares about seven hundred years ago. The most intricate devices of their day, they reached a plateau of development after Galileo observed that the period of a pendulum's swing is (to an excellent first approximation) independent of the amplitude of that swing. Yet long before Galileo, others had begun to compare the turning of a clock's internal wheels with the motions of the heavenly bodies.
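Galileo's observation is easy to check numerically. For a simple pendulum of length L, the small-angle period is T = 2π√(L/g), and the leading correction for a finite swing amplitude θ₀ multiplies this by roughly (1 + θ₀²/16). The sketch below (the figures are illustrative, not from the book) shows how weakly the period depends on amplitude:

```python
import math

def pendulum_period(length_m, amplitude_rad, g=9.81):
    """Period of a simple pendulum, including the leading
    amplitude correction T ~ T0 * (1 + theta0**2 / 16)."""
    t0 = 2 * math.pi * math.sqrt(length_m / g)
    return t0 * (1 + amplitude_rad**2 / 16)

# A one-metre pendulum swung through 5 degrees and through 20 degrees:
t_small = pendulum_period(1.0, math.radians(5))
t_large = pendulum_period(1.0, math.radians(20))
print(round(t_small, 4), round(t_large, 4))
```

Quadrupling the amplitude changes the period by well under one percent, which is why the pendulum made such a reliable regulator for clockwork.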
They also began---however incompletely---to appreciate the manner in which the force exerted by a hanging weight could be transmitted by gears and levers to other parts of the machine, an appreciation that helped prepare them to accept Newton's explanation of the movements of the heavenly bodies in terms of "gravitational" forces. Only later was it realized that clocks could be miniaturized and made portable by substituting the force of a coiled spring for that of a hanging weight, leading to the 18th-century invention of workable pocket watches and ship's chronometers.
Siegfried revives the term "clockwork universe" to describe Newtonian cosmology. He even traces the origin of the analogy to a book by one Nicole Oresme, a scholastic born in or about the year 1320, somewhere in northern France, who compared the heavenly design to "a man making a clock and letting it run and continue in its own motion by itself."
The tale of the steam engine actually begins with a British patent, issued to one Thomas Savery in 1698, for a device to pump water out of coal mines. Thomas Newcomen, who later improved Savery's design, may have sold as many as a thousand "Newcomen engines" by the end of the 18th century. But the big breakthrough in steam technology came in 1765, when James Watt---an instrument maker at the University of Glasgow---hit on a design that would transform much more of the engine's available energy into useful work. It took him several years to patent his revolutionary design, and many more to refine it into the power source that would drive the Industrial Revolution. He did it not for the betterment of science, but for the wealth he hoped would be his.
In 1812, at the age of 16, Sadi Carnot entered the French Ecole Polytechnique, where he studied geometry, mechanics, and chemistry in preparation for a career as a military engineer. While in the military, he became aware that France lagged far behind England in steam power and industrial development. Accordingly, in 1824, soon after leaving the military, he published a treatise on steam power designed to permit an accurate (theoretical) understanding of what he called "heat engines," and to encourage their development in his native land. The book explained how to calculate, among other things, the maximum amount of work that a given engine could produce, and thus the "thermal efficiency" of any actual machine.
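Carnot's central result can be stated in a line: no heat engine operating between a hot reservoir at temperature T_h and a cold one at T_c (both in kelvin) can convert more than the fraction 1 − T_c/T_h of the heat it absorbs into work. A minimal sketch, with illustrative temperatures:

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum fraction of absorbed heat that any engine operating
    between two reservoir temperatures (kelvin) can turn into work."""
    return 1 - t_cold_k / t_hot_k

# A boiler at 450 K exhausting to 300 K air (illustrative figures):
eta = carnot_efficiency(450, 300)
print(f"{eta:.0%}")
```

An actual engine's "thermal efficiency," in Carnot's sense, is its measured work output divided by this theoretical maximum; the bound holds regardless of the technology employed.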
Carnot's work aroused little interest during his lifetime, which ended prematurely during a cholera epidemic in 1832. Only later did it attract the attention of Emile Clapeyron, a French engineer who publicized the work through his own writings and designs, and of the British physicist William Thomson (later Lord Kelvin), who developed Carnot's ideas into the science of thermodynamics. It was this new science that caused the physics community to shift its focus from force to energy.
Although the history of computers is relatively familiar to modern readers, Siegfried traces it from the crude mechanical designs of Pascal and Leibniz, through the Analytical Engine of Charles Babbage, to the nascent science of quantum computing. Though a great admirer of Alan Turing, and of the universal Turing machine, Siegfried confines his list of candidates for the inventor of the modern electronic computer to participants in the 1973 trial in which John Atanasoff contested the claim of ENIAC builders J. Presper Eckert and John Mauchly to be the inventors of the beast. Although the court found in favor of Atanasoff, Siegfried describes the result as a miscarriage of justice made possible only by Eckert and Mauchly's inferior patent attorney.
(Siegfried makes no mention of Colossus, the machine designed by Tommy Flowers and built at Bletchley Park during World War II for deciphering German military ciphers. According to those who, using the original blueprints, constructed the working replica now on display at Bletchley Park outside London, Colossus was a genuine programmable electronic digital computer, though a special-purpose one rather than a stored-program machine in the modern sense.)
Shift from Energy to Information
The connection between physics and information goes back at least to the late Claude Shannon's 1948 explanation of the advantages of measuring information in "bits"---defined as the quantity of information needed to decide between two otherwise equiprobable messages---and of associating with every message source a quantity he called "entropy" in order to emphasize its similarity to an important thermodynamic quantity. Although the analogy was and is a purely formal one, speculation continues that the entropy of physics and the entropy of information are but the two sides of a single coin. Shannon himself said in 1979 that he expected the connection between the two to "hold up in the long run," but added that "it has not been fully explored or understood. There is more there than we know at present."
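Shannon's definitions are concrete enough to compute with: a choice between two equiprobable messages carries exactly one bit, and the entropy of a source emitting symbols with probabilities p_i is H = −Σ p_i log₂ p_i bits per symbol. A minimal sketch:

```python
import math

def entropy_bits(probs):
    """Shannon entropy of a message source, in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two equiprobable messages carry exactly one bit ...
print(entropy_bits([0.5, 0.5]))
# ... while a heavily biased source carries considerably less:
print(entropy_bits([0.9, 0.1]))
```

The formal resemblance to thermodynamic entropy, −k Σ p_i ln p_i, is exactly the similarity Shannon meant to emphasize by his choice of name.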
No one doubts the beauty and utility of information theory, but no one knows whether the centrality of an entropy functional in it points to the existence of a still undiscovered law of nature. Siegfried detects the gradual formation of an affirmative consensus, which may or may not presage a shift of emphasis within the physics community from energy to information. Another prophet of a possible shift from energy to information was Rolf Landauer of IBM, who during the 1950s began to wonder what, if any, physical limits might apply to the computational process. He knew of course that both Shannon and Carnot had used their personal versions of entropy to deduce firm limits on what could be accomplished with heat engines and communication channels---limits that were independent of the technology employed. He speculated that, perhaps by similar methods, such limits could be imposed on computers as well.
The most enduring fruit of Landauer's speculation is the (seemingly counter-intuitive) conclusion that the only part of the computing process that necessarily consumes energy is the erasure of information. There seems to be no positive lower bound on the amount of energy required to add or compare two 32-bit numbers, although the energy required to erase such a number does seem to be bounded away from zero. Although everyone is familiar with the energy needed to rub out information written with pencil on paper, most are surprised to learn that a similar requirement seems to apply to information recorded electronically. Siegfried is among those who impute great significance to this (now) firmly established fact and refer to it as "Landauer's principle."
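Landauer's principle makes the bound quantitative: erasing one bit at absolute temperature T must dissipate at least kT ln 2, where k is Boltzmann's constant. The figure for a 32-bit number at room temperature is easily computed (the 300 K temperature is an illustrative choice):

```python
import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann's constant, joules per kelvin

def landauer_erasure_cost(bits, temperature_k=300):
    """Minimum energy (joules) that Landauer's principle assigns
    to erasing the given number of bits at the given temperature."""
    return bits * K_BOLTZMANN * temperature_k * math.log(2)

# Erasing one 32-bit number at room temperature:
print(landauer_erasure_cost(32))  # a tiny figure, but strictly positive
```

The cost is some twenty orders of magnitude below what real chips dissipate per operation, which is why the principle matters as a limit rather than as an engineering constraint today.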
Landauer's IBM colleague Charles Bennett realized as early as 1973 that Landauer's principle might eventually be of immense value to the computer industry. Accordingly, he published, in that year, a scheme of "reversible logic" whereby all of the intermediate quantities obtained, used, and then discarded in the course of a lengthy calculation might be recovered from the answer alone, after the computation itself was complete. Because recoverable information is not truly lost, even if it has been overwritten in main memory, no energy need be expended in performing such calculations. Bennett's reversible computing scheme avoids, at least in theory, the energy requirements seemingly imposed by Landauer's principle.
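The idea behind reversible logic can be illustrated with a standard example (not Bennett's own construction): the Toffoli gate, which maps three bits to three bits invertibly, yet can embed an ordinary AND, so nothing need ever be erased mid-computation:

```python
def toffoli(a, b, c):
    """Toffoli (controlled-controlled-NOT) gate: flips c exactly
    when both a and b are 1. Three bits in, three bits out."""
    return a, b, c ^ (a & b)

# The gate is its own inverse, so running it twice restores the
# inputs---no information is ever discarded:
for bits in [(0, 1, 1), (1, 1, 0), (1, 0, 1)]:
    assert toffoli(*toffoli(*bits)) == bits

# With c initialized to 0, the third output is simply a AND b,
# so irreversible logic embeds in reversible logic:
print(toffoli(1, 1, 0)[2])  # 1
```

Bennett's scheme amounts to running such a circuit forward, copying out the answer, and then running it backward to "uncompute" every intermediate result.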
Ralph Merkle, a computer scientist at Xerox PARC, concedes that the goal of complete reversibility is unachievable, but insists that that's beside the point. A substantial portion---currently between 5 and 10%---of the electrical power now produced in the U.S. is consumed by computers. Hence, a considerable saving awaits the design of a more energy-efficient computer. And appropriate hardware can produce such a saving, by making most of the "flops" reversible. Merkle claims that a demonstration model could even now be built from off-the-shelf parts. The possibility isn't particularly important at present, because fossil fuels are still plentiful, electric power is still cheap (at least to the end-user) and Moore's law still reigns supreme. But, because those conditions are unlikely to persist for more than a few decades, Merkle expects that "Reversible logic will dominate computing in the twenty-first century."
Because the possibility of quantum computing seems to offer escape from the inevitable failure of Moore's law, Siegfried discusses it at length. He also investigates the related notions of quantum information and quantum communication. The latter is of particular interest because (in contrast with quantum computing) the necessary hardware already exists, at least in the lab. There is even a wireless version! The intriguing feature of quantum communication is that users can detect the presence of an eavesdropper: Because some quantum information is necessarily erased when it is measured, eavesdropping automatically increases channel noise. Siegfried speaks somewhat skeptically of quantum computing, in part because (according to Landauer) quantum computers seem unable to correct incidental errors as effectively as ordinary computers do. They seem to have to work perfectly in order to work at all. "And nature," asserts Landauer, "abhors perfection."
"It from Bit"
Of all the experts quoted by Siegfried---and he seems to have interviewed every important physicist or computer scientist in the world at least once---he appeals most often to John Archibald Wheeler. In part, that is because, while a student of journalism at the University of Texas, Siegfried took a "physics for poets" class that Wheeler used to teach there. And in part it is because Wheeler was among the first to conclude that information is at least as basic a physical quantity as time, space, mass, or (electrical) charge. Wheeler even coined the phrase "it from bit" to serve as a modern alternative to the Cartesian epigram "Cogito ergo sum." It is evidently intended to apply to both animate and inanimate objects and to mean that "Because it contains information, it must exist." Not all theoretical physicists yet see the need for such a hypothesis, but most seem ready to concede that the notion deserves serious consideration.
Wheeler is of course famous for having given the name "black hole" to any "gravitationally completely collapsed object." The catchy name was actually suggested by a member of the audience at the first public lecture Wheeler ever gave on the subject, while he was explaining that nothing---not even light---can escape from so dense an object, because of the strength of its gravitational field. Anything that falls into a black hole is there to stay. And since even the simplest objects contain some information---about their own internal structure, for instance---which is obliterated on their assimilation into a black hole, information is lost whenever a black hole gains weight. Wheeler concluded that the quantity of information a black hole has swallowed grows with its size---in fact, with the surface area of its event horizon. To emphasize the information-theoretic aspect of his theory of black holes, which follows from Einstein's gravitational equations, Wheeler even drew the illustration shown on this page. (From a 1990 preprint of a paper by Wheeler, and reproduced in The Bit and the Pendulum, the illustration shows a black hole covered by "bits.")
Wheeler's conclusion that the space-time and mass of a black hole are intimately connected to the quantity of information it contains, and that entering information is automatically destroyed, brings his (gravitational) theory into direct conflict with quantum mechanics, which seemingly requires that information about the past is permanently recorded.
It was Stephen Hawking who pointed the way to the possible resolution of the seeming paradox, by suggesting that black holes, because they can and do radiate some energy, are really only charcoal gray. The radiation happens when "virtual" particles chance to become real ones precisely on the "event horizon" that forms the boundary of a black hole. Such "realizations" occur constantly, throughout space, and do not defy the laws of physics (quantum mechanics) as long as they occur in particle-antiparticle pairs that survive only briefly. Hawking saw that such originations are particularly significant when they occur precisely on an event horizon, because one member of the newly created pair might then enter the neighboring black hole, while the other moves away. The latter is then free to be detected, as radiant energy, while the former seems lost forever to mankind. But, according to Hawking, if radiation is constantly escaping from the immediate neighborhood, the mass therein must be steadily decreasing, causing the neighborhood to be evacuated in the fullness of time. If so, the information within will eventually be recovered. Its loss is only temporary, and does not contradict the laws of quantum mechanics.
The very fact that Siegfried covers more topics---including biological computation, string theory, and M-theory---than I am able to summarize here suggests that his book is worth reading. For those who find his discussions too sketchy, he includes a six-page guide to "further reading."
James Case writes from Baltimore, Maryland.