This book is the story of the universe and the bit. The universe is the biggest thing there is and the bit is the smallest possible chunk of information. The universe is made of bits. Every molecule, atom, and elementary particle registers bits of information. Every interaction between those pieces of the universe processes that information by altering those bits. That is, the universe computes, and because the universe is governed by the laws of quantum mechanics, it computes in an intrinsically quantum-mechanical fashion; its bits are quantum bits. The history of the universe is, in effect, a huge and ongoing quantum computation. The universe is a quantum computer.
This raises the question: What does the universe compute? It computes itself. The universe computes its own behavior. As soon as the universe began, it began computing. At first, the patterns it produced were simple, comprising elementary particles and establishing the fundamental laws of physics. In time, as it processed more and more information, the universe spun out ever more intricate and complex patterns, including galaxies, stars, and planets. Life, language, human beings, society, culture: all owe their existence to the intrinsic ability of matter and energy to process information. The computational capability of the universe explains one of the great mysteries of nature: how complex systems such as living creatures can arise from fundamentally simple physical laws. These laws allow us to predict the future, but only as a matter of probability, and only on a large scale. The quantum-computational nature of the universe dictates that the details of the future are intrinsically unpredictable. They can be computed only by a computer the size of the universe itself. Otherwise, the only way to discover the future is to wait and see what happens.
Allow me to introduce myself. The first thing I remember is living in a chicken house. My father was apprenticed to a furniture maker in Lincoln, Massachusetts, and the chicken house was in back of her barn. My father turned the place into a two-room apartment; the space where the chickens had roosted became bunks for my older brother and me. (My younger brother was allowed a cradle.) At night, my mother would sing to us, tuck us in, and close the wooden doors to the roosts, leaving us to lie snug and stare out the windows at the world outside.
My first memory is of seeing a fire leap up in a wire trash basket with an overlapping diamond pattern. Then I remember holding tight to my mother's blue-jeaned leg just above the knee and my father flying a Japanese fighter kite. After that, memories crowd on thick and fast. Each living being's perception of the world is unique and crowded with detail and structure. Yet we all inhabit the same space and are governed by the same physical laws. In school, I learned that the physical laws governing the universe are surprisingly simple. How could it be, I wondered, that the intricacy and complexity I saw outside my bedroom window were the result of these simple physical laws? I decided to study this question and spent years learning about the laws of nature.
My doctoral advisor, Heinz Pagels, who died tragically in a mountaineering accident in Colorado in the summer of 1988, was a brilliant and unconventional thinker who believed in transgressing the conventional boundaries of science. He encouraged me to develop physically precise techniques for characterizing and measuring complexity. Later, under the guidance of Murray Gell-Mann at Caltech, I learned how the laws of quantum mechanics and elementary-particle physics effectively "program" the universe, planting the seeds of complexity.
These days, I am a professor of mechanical engineering at the Massachusetts Institute of Technology. Or, because I have no formal training in mechanical engineering, it might be more accurate to call me a professor of quantum-mechanical engineering. Quantum mechanics is the branch of physics that deals with matter and energy at the smallest scales. Quantum mechanics is to atoms what classical mechanics is to engines. In essence: I engineer atoms.
In 1993, I discovered a way to build a quantum computer. Quantum computers are devices that harness the information-processing ability of individual atoms, photons, and other elementary particles. They compute in ways that classical computers, such as a Macintosh or a PC, cannot. In the process of learning how to make atoms and molecules (the smallest pieces of the universe) compute, I grew to appreciate the intrinsic information-processing ability of the universe as a whole. The complex world we see around us is the manifestation of the universe's underlying quantum computation.
The digital revolution under way today is merely the latest in a long line of information-processing revolutions stretching back through the development of language, the evolution of sex, and the creation of life, to the beginning of the universe itself. Each revolution has laid the groundwork for the next, and all information-processing revolutions since the Big Bang stem from the intrinsic information-processing ability of the universe. The computational universe necessarily generates complexity. Life, sex, the brain, and human civilization did not come about by mere accident.
The Quantum Computer
Quantum mechanics is famously weird. Waves act like particles, and particles act like waves. Things can be in two places at once. It is perhaps not surprising that, at small scales, things behave in strange and counterintuitive ways; after all, our intuitions have developed for dealing with objects much larger than individual atoms. Quantum weirdness is still disconcerting, though. Niels Bohr, the father of quantum mechanics, once said that anyone who thinks he can contemplate quantum mechanics without getting dizzy hasn't properly understood it.
Quantum computers exploit "quantum weirdness" to perform tasks too complex for classical computers. Because a quantum bit, or "qubit," can register both 0 and 1 at the same time (a classical bit can register only one or the other), a quantum computer can perform millions of computations simultaneously.
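The superposition behind that claim can be made concrete in a few lines. The sketch below is my own illustration, not anything from the text: it models qubit states as vectors of complex amplitudes using NumPy and shows that a register of n qubits in equal superposition carries 2^n amplitudes at once.

```python
import numpy as np

# A classical bit holds one value; a qubit's state is a vector of two
# complex amplitudes: |0> = (1, 0), |1> = (0, 1).
zero = np.array([1, 0], dtype=complex)

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
plus = H @ zero  # amplitudes (1/sqrt(2), 1/sqrt(2))

# A register of n such qubits is the tensor (Kronecker) product of the
# individual states, so it carries 2**n amplitudes simultaneously --
# the source of the "millions of computations at once" picture.
n = 10
register = plus
for _ in range(n - 1):
    register = np.kron(register, plus)

print(len(register))                   # 1024 amplitudes for 10 qubits
print(round(abs(register[0]) ** 2, 6)) # each outcome has probability 1/1024
```

On measurement, of course, only one of those 2^n outcomes is observed; the art of quantum algorithm design is arranging for the amplitudes to interfere so that useful answers are the likely ones.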
Quantum computers process the information stored on individual atoms, electrons, and photons. A quantum computer is a democracy of information: every atom, electron, and photon participates equally in registering and processing information. And this fundamental democracy of information is not confined to quantum computers. All physical systems are at bottom quantum-mechanical, and all physical systems register and process information. The world is composed of elementary particles (electrons, photons, quarks), and each elementary piece of a physical system registers a chunk of information: one particle, one bit. When these pieces interact, they transform and process that information, bit by bit. Each collision between elementary particles acts as a simple logical operation, or "op."
To understand any physical system in terms of its bits, we need to understand in detail the mechanism by which each and every piece of that system registers and processes information. If we can understand how a quantum computer does this, then we can understand how any physical system does.
The idea of such a computer was proposed in the early 1980s by Paul Benioff, Richard Feynman, David Deutsch, and others. When they were first discussed, quantum computers were a wholly abstract concept: Nobody had a clue how to build them. In the early 1990s, I showed how they could be built using existing experimental techniques. Over the past ten years, I have worked with some of the world's greatest scientists and engineers to design, build, and operate quantum computers.
There are a number of good reasons to build quantum computers. The first is that we can. Quantum technologies (technologies for manipulating matter at the atomic scale) have undergone remarkable advances in recent years. We now possess lasers stable enough, fabrication techniques accurate enough, and electronics fast enough to perform computation at the atomic scale.
The second reason is that we have to, at least if we want to keep building ever faster and more powerful computers. Over the past half century, the power of computers has doubled every year and a half. This explosion of computer power is known as "Moore's law," after Gordon Moore, subsequently the chief executive of Intel, who noted its exponential advance in the 1960s. Moore's law is a law not of nature, but of human ingenuity. Computers have gotten twice as fast every eighteen months because every eighteen months engineers have figured out how to halve the size of the wires and logic gates from which they are constructed. Every time the size of the basic components of a computer goes down by a factor of two, twice as many of them will fit on the same size chip. The resulting computer is twice as powerful as its predecessor of a year and a half earlier.
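The doubling arithmetic described above can be sketched in a couple of lines. This is an illustration of the trend only; the function name and the sample figures are my own:

```python
def moores_law(initial, years, doubling_period_years=1.5):
    """Capacity after `years`, if it doubles every `doubling_period_years`."""
    return initial * 2 ** (years / doubling_period_years)

# Fifteen years contain ten eighteen-month doubling periods,
# so component counts grow by a factor of 2**10 = 1024.
print(moores_law(1, 15))  # 1024.0
```

Run in reverse, the same exponential tells you how quickly halving feature sizes eats through the remaining orders of magnitude between today's wires and single atoms.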
If you project Moore's law into the future, you find that the size of the wires and logic gates from which computers are constructed should reach the atomic scale in about forty years; thus, if Moore's law is to be sustained, we must learn to build computers that operate at the quantum scale. Quantum computers represent the ultimate level of miniaturization.
The quantum computers my colleagues and I have constructed already attain this goal: each atom registers a bit. But the quantum computers we can build today are small, not only in size but also in power. The largest general-purpose quantum computers available at the time of this writing have seven to ten quantum bits and can perform thousands of quantum logic operations per second. (By contrast, a conventional desktop computer can register trillions of bits and can perform billions of conventional, classical logic operations per second.) We're already good at making computers with atomic-scale components; we're just not good at making big computers with atomic-scale components. Since the first quantum computers were constructed a decade ago, however, the number of bits they register has doubled almost every two years. Even if this exponential rate of progress can be sustained, it will still take forty years before quantum computers can match the number of bits registered by today's classical computers. Quantum computers are a long way from the desktop.
The third reason to build quantum computers is that they allow us to understand the way in which the universe registers and processes information. One of the best ways to understand a law of nature is to build and operate a machine that illustrates that law. Often, we build the machine first and the law comes later. The wheel and the top had existed for millennia before the establishment of the law of conservation of angular momentum. The thrown rock preceded Galileo's laws of motion; the prism and the telescope came before Newton's optics; the steam engine preceded James Watt's governor and Sadi Carnot's second law of thermodynamics. Since quantum mechanics is so hard to grasp, wouldn't it be nice to build a machine that embodies the laws of quantum mechanics? By playing with that machine, one could acquire a working understanding of quantum mechanics, just as a baby who plays with a top grasps the principles of angular momentum embodied by the toy. Without direct experience of how atoms actually behave, our understanding remains shallow. The "toy" quantum computers we build today are machines that will allow us to learn more and more about how physical systems register and process information at the quantum-mechanical level.
The final reason to build quantum computers is that it's fun. In the pages to come, you'll meet some of the world's foremost scientists and engineers: Jeff Kimble of Caltech, constructor of the world's first photonic quantum logic gate; Dave Wineland of the National Institute of Standards and Technology, who built the first simple quantum computer; Hans Mooij of the Delft University of Technology, whose group gave some of the earliest demonstrations of quantum bits in superconducting circuits; David Cory of MIT, who built the first molecular quantum computer, and whose quantum analog computers can perform computations that would require a classical computer larger than the universe itself. Once we have seen how quantum computers work, we will be able to put bounds on the computational capacity of the universe.
The Language of Nature
As it computes, the universe effortlessly spins out intricate and complex structures. To understand how the universe computes-and thus to understand better those complex structures-we must learn how it registers and processes information. That is, we must learn the underlying language of nature.
Think of me as a kind of atomic masseur. As a professor of quantum-mechanical engineering at MIT, my job is to massage electrons, photons, atoms, and molecules into those special states in which they become quantum computers and quantum communication systems. Atoms are tiny but strong, resilient but sensitive. They are easy to talk to (just hit the table and you've talked to billions upon billions of them) but hard to listen to (I bet you can't tell me what the table had to say beyond "thump"). They don't care about you, and they go about their business doing what they have always done. But if you massage them in just the right way, you can charm them. They will compute for you.
Atoms are not alone in their ability to process information. Photons (particles of light), phonons (particles of sound), quantum dots (artificial atoms), superconducting circuits: all these microscopic systems can register information. And if you speak their language and ask them nicely, they will process that information for you. What language do such systems speak? Like all physical systems, they respond to energy, force, and momentum, to light and sound, to electricity and gravity. Physical systems speak a language whose grammar consists of the laws of physics. Over the last ten years, we have learned this language well enough to talk to atoms, to convince them to perform computations and report the results.
How hard is it to "speak Atom"? To learn to converse fluently takes a lifetime. I myself am a poor atomic conversationalist, compared with other scientists and quantum-mechanical engineers you will meet in this book. To learn enough to carry on a simple conversation, however, is not hard.
Like all languages, Atom is easier to learn when you're younger. With Paul Penfield, I co-teach a freshman course at MIT called Information and Entropy. The goal of this course, like the goal of this book, is to reveal the fundamental role that information plays in the universe. Fifty years ago, MIT freshmen used to arrive full of knowledge about internal-combustion engines, gears and levers, drivetrains and pulleys. Twenty-five years ago, they arrived full of knowledge of vacuum tubes, transistors, ham radios, and electronic circuits. Now they arrive chock-a-block full of knowledge about computers, disk drives, fiber optics, bandwidth, and music- and image-compression codes. Their predecessors lived in worlds dominated by mechanical and electrical technologies; today's freshmen come from a world dominated by information. Their predecessors already knew lots about force and energy, voltage and charge; today's freshmen already know lots about bits and bytes. The freshmen in our course already know so much about information technology that we can teach them subjects, including quantum computation, that previously could be taught only to graduate students. (My senior colleagues in the Mechanical Engineering Department complain that the incoming freshmen have never used a screwdriver. This is untrue. Fully half of them have used a screwdriver to install more memory in their computers.)
Excerpted from Programming the Universe by Seth Lloyd. Copyright © 2006 by Seth Lloyd. Excerpted by permission of Knopf, a division of Random House, Inc. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.