Turing's Cathedral

The Origins of the Digital Universe


About

In this revealing account of how the digital universe exploded in the aftermath of World War II, George Dyson illuminates the nature of digital computers, the lives of those who brought them into existence, and how code took over the world.

In the 1940s and ’50s, a small group of men and women—led by John von Neumann—gathered in Princeton, New Jersey, to begin building one of the first computers to realize Alan Turing’s vision of a Universal Machine. The codes unleashed within this embryonic, 5-kilobyte universe—less memory than is allocated to displaying a single icon on a computer screen today—broke the distinction between numbers that mean things and numbers that do things, and our universe would never be the same. Turing’s Cathedral is the story of how the most constructive and most destructive of twentieth-century inventions—the digital computer and the hydrogen bomb—emerged at the same time.

Author


George Dyson is a science historian as well as a boat designer and builder. He is also the author of Baidarka, Project Orion, and Darwin Among the Machines.


Excerpt

Preface
 
POINT SOURCE SOLUTION
 
I am thinking about something much more important than bombs. I am thinking about computers.
—John von Neumann, 1946
 
 
There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky.
 
In late 1945, at the Institute for Advanced Study in Princeton, New Jersey, Hungarian American mathematician John von Neumann gathered a small group of engineers to begin designing, building, and programming an electronic digital computer, with five kilobytes of storage, whose attention could be switched in 24 microseconds from one memory location to the next. The entire digital universe can be traced directly to this 32-by-32-by-40-bit nucleus: less memory than is allocated to displaying a single icon on a computer screen today.
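 
As a quick check on those figures (my own arithmetic, not a calculation from the preface), the 32-by-32-by-40-bit matrix works out to exactly five kilobytes, and a 24-microsecond access time corresponds to roughly forty thousand memory accesses per second:

```python
# Back-of-the-envelope check of the memory figures quoted above.
# The dimensions (32 x 32 words of 40 bits, 24-microsecond access)
# come from the preface; the arithmetic is mine.

words = 32 * 32                # 1,024 words of storage
bits_per_word = 40             # each word was 40 bits wide
total_bits = words * bits_per_word

print(total_bits)              # 40,960 bits
print(total_bits // 8)         # 5,120 bytes, i.e. 5 kilobytes

access_time = 24e-6            # seconds per memory access
print(round(1 / access_time))  # about 41,667 accesses per second
```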
 
Von Neumann’s project was the physical realization of Alan Turing’s Universal Machine, a theoretical construct invented in 1936. It was not the first computer. It was not even the second or third computer. It was, however, among the first computers to make full use of a high-speed random-access storage matrix, and became the machine whose coding was most widely replicated and whose logical architecture was most widely reproduced. The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same.
 
Working outside the bounds of industry, breaking the rules of academia, and relying largely on the U.S. government for support, a dozen engineers in their twenties and thirties designed and built von Neumann’s computer for less than $1 million in under five years. “He was in the right place at the right time with the right connections with the right idea,” remembers Willis Ware, fourth to be hired to join the engineering team, “setting aside the hassle that will probably never be resolved as to whose ideas they really were.”
 
As World War II drew to a close, the scientists who had built the atomic bomb at Los Alamos wondered, “What’s next?” Some, including Richard Feynman, vowed never to have anything to do with nuclear weapons or military secrecy again. Others, including Edward Teller and John von Neumann, were eager to develop more advanced nuclear weapons, especially the “Super,” or hydrogen bomb. Just before dawn on the morning of July 16, 1945, the New Mexico desert was illuminated by an explosion “brighter than a thousand suns.” Eight and a half years later, an explosion one thousand times more powerful illuminated the skies over Bikini Atoll. The race to build the hydrogen bomb was accelerated by von Neumann’s desire to build a computer, and the push to build von Neumann’s computer was accelerated by the race to build a hydrogen bomb.
 
Computers were essential to the initiation of nuclear explosions, and to understanding what happens next. In “Point Source Solution,” a 1947 Los Alamos report on the shock waves produced by nuclear explosions, von Neumann explained that “for very violent explosions . . . it may be justified to treat the original, central, high pressure area as a point.” This approximated the physical reality of a nuclear explosion closely enough to enable some of the first useful predictions of weapons effects.
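 
The point-source idealization von Neumann describes is closely related to the blast-wave similarity scaling usually credited jointly to von Neumann, G. I. Taylor, and L. I. Sedov: the shock radius grows as the one-fifth power of energy times time squared over ambient density. The sketch below illustrates that scaling with an order-unity constant and example inputs of my own choosing, not numbers taken from the book:

```python
# Point-source blast-wave scaling (von Neumann / Taylor / Sedov):
# treating the explosion as an instantaneous point release of energy E
# into a uniform medium of density rho, dimensional analysis gives
#     R(t) ~ C * (E * t**2 / rho) ** (1/5)
# for the shock radius R at time t. C is a dimensionless constant of
# order one; the value below is an assumption for illustration only.

def shock_radius(E_joules, t_seconds, rho=1.2, C=1.0):
    """Approximate shock-front radius in meters for a point explosion."""
    return C * (E_joules * t_seconds**2 / rho) ** 0.2

# Example: a release on the scale of 20 kilotons of TNT (~8.4e13 J),
# ten milliseconds after detonation, in air (rho ~ 1.2 kg/m^3).
print(shock_radius(8.4e13, 0.01))   # on the order of a hundred meters
```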
 
Numerical simulation of chain reactions within computers initiated a chain reaction among computers, with machines and codes proliferating as explosively as the phenomena they were designed to help us understand. It is no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time. Only the collective intelligence of computers could save us from the destructive powers of the weapons they had allowed us to invent.
 
Turing’s model of universal computation was one-dimensional: a string of symbols encoded on a tape. Von Neumann’s implementation of Turing’s model was two-dimensional: the address matrix underlying all computers in use today. The landscape is now three-dimensional, yet the entire Internet can still be viewed as a common tape shared by a multitude of Turing’s Universal Machines.
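 
To make the one-dimensional versus two-dimensional contrast concrete, here is a toy sketch of my own (not an example from the book): the same store of bits treated first as a tape that a head must traverse cell by cell, then as a matrix of words reachable directly by address:

```python
# Toy contrast between a one-dimensional tape and a two-dimensional,
# randomly addressable store (illustrative only).

tape = [0] * 40960                  # one long row of cells

def tape_seek(head, target):
    """A tape head reaches a cell by moving one step at a time."""
    steps = 0
    while head != target:
        head += 1 if target > head else -1
        steps += 1
    return tape[target], steps      # value read, plus moves required

matrix = [[0] * 40 for _ in range(1024)]   # 1,024 words of 40 bits

def matrix_read(word, bit):
    """Random access: any word is selected directly by its address."""
    return matrix[word][bit]

value, moves = tape_seek(0, 12345)
print(moves)                                   # 12,345 head movements
print(matrix_read(12345 // 40, 12345 % 40))    # one addressed lookup
```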
 
Where does time fit in? Time in the digital universe and time in our universe are governed by entirely different clocks. In our universe, time is a continuum. In a digital universe, time (T) is a countable number of discrete, sequential steps. A digital universe is bounded at the beginning, when T = 0, and at the end, if T comes to a stop. Even in a perfectly deterministic universe, there is no consistent method to predict the ending in advance. To an observer in our universe, the digital universe appears to be speeding up. To an observer in the digital universe, our universe appears to be slowing down.
 
Universal codes and universal machines, introduced by Alan Turing in his “On Computable Numbers, with an Application to the Entscheidungsproblem” of 1936, have prospered to such an extent that Turing’s underlying interest in the “decision problem” is easily overlooked. In answering the Entscheidungsproblem, Turing proved that there is no systematic way to tell, by looking at a code, what that code will do. That’s what makes the digital universe so interesting, and that’s what brings us here.
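 
Turing’s result that there is no systematic way to tell, by looking at a code, what that code will do is usually illustrated with the halting problem. The following is a minimal sketch of the standard contradiction, built around a hypothetical halts() that, by Turing’s argument, cannot actually be written:

```python
# Sketch of the standard halting-problem contradiction (illustrative only).
# `halts` is hypothetical: the point of Turing's argument is that no
# correct implementation of it can exist.

def halts(program, argument):
    """Hypothetically returns True if program(argument) eventually stops."""
    raise NotImplementedError("no such decider can exist")

def contrarian(program):
    # Do the opposite of whatever `halts` predicts about running
    # `program` on its own source.
    if halts(program, program):
        while True:          # predicted to halt, so loop forever
            pass
    else:
        return               # predicted to loop, so halt at once

# Feeding `contrarian` to itself forces `halts` to be wrong either way:
# if halts(contrarian, contrarian) were True, contrarian(contrarian)
# would loop forever; if it were False, it would halt. Hence no such
# `halts` can exist.
```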
 
It is impossible to predict where the digital universe is going, but it is possible to understand how it began. The origin of the first fully electronic random-access storage matrix, and the propagation of the codes that it engendered, is as close to a point source as any approximation can get.

Awards

  • FINALIST | 2012
    L.A. Times Book Prize (Science and Tech)

Praise

“The best book I’ve read on the origins of the computer. . . . Not only learned, but brilliantly and surprisingly idiosyncratic and strange.”
The Boston Globe
 
“A groundbreaking history. . . . The book brims with unexpected detail.”
The New York Times Book Review
 
“A technical, philosophical and sometimes personal account. . . . Wide-ranging and lyrical.”
The Economist
 
“The story of the [von Neumann] computer project and how it begat today’s digital universe has been told before, but no one has told it with such precision and narrative sweep.”
The New York Review of Books

“A fascinating combination of the technical and human stories behind the computing breakthroughs of the 1940s and ’50s. . . . An important work.”
The Philadelphia Inquirer
 
“Vivid. . . . [A] detailed yet readable chronicle of the birth of modern computing. . . . Dyson’s book is one small step toward reminding us that behind all the touch screens, artificial intelligences and cerebellum implants lies not sorcery but a machine from the middle of New Jersey.”
The Oregonian
 
“Well-told. . . . Dyson tells his story as a sort of intellectual caper film. He gathers his cast of characters . . . and tracks their journey to Princeton. When they converge, it’s great fun, despite postwar food rationing and housing shortages. . . . Dyson is rightly as concerned with the machine’s inventors as with the technology itself.”
The Wall Street Journal
 
“Charming. . . . Creation stories are always worth telling, especially when they center on the birth of world-changing powers. . . . Dyson creatively recounts the curious Faustian bargain that permitted mathematicians to experiment with building more powerful computers, which in turn helped others build more destructive bombs.”
San Francisco Chronicle
 
“The story of the invention of computers has been told many times, from many different points of view, but seldom as authoritatively and with as much detail as George Dyson has done. . . . Turing’s Cathedral will enthrall computer enthusiasts. . . . Employing letters, memoirs, oral histories and personal interviews, Dyson organizes his book around the personalities of the men (and occasional woman) behind the computer, and does a splendid job in bringing them to life.”
The Seattle Times
 
“A powerful story of the ethical dimension of scientific research, a story whose lessons apply as much today in an era of expanded military R&D as they did in the ENIAC and MANIAC era . . . Dyson closes the book with three absolutely, hair-on-neck-standing-up inspiring chapters on the present and future, a bracing reminder of the distance we have come on some of the paths envisioned by von Neumann, Turing, et al.”
Cory Doctorow, Boing Boing
 
“No other book about the beginnings of the digital age . . . makes the connections this one does between the lessons of the computer’s origin and the possible paths of its future.”
The Guardian
 
“If you want to be mentally prepared for the next revolution in computing, Dyson’s book is a must read. But it is also a must read if you just want a ripping yarn about the way real scientists (at least, some real scientists) work and think.”
Literary Review
 
“More than just a great book about science. It’s a great book, period.”
The Globe and Mail
