Available:
Library | Call Number | Status |
---|---|---|
R.H. Stafford Library (Woodbury) | 004.09 DYS | Unknown |
Stillwater Public Library | 004.09 DYS | Unknown |
Summary
"It is possible to invent a single machine which can be used to compute any computable sequence," twenty-four-year-old Alan Turing announced in 1936. In Turing's Cathedral , George Dyson focuses on a small group of men and women, led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey, who built one of the first computers to realize Alan Turing's vision of a Universal Machine. Their work would break the distinction between numbers that mean things and numbers that do things--and our universe would never be the same.
Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars.
Dyson's account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It's no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time.
How did code take over the world? In retracing how Alan Turing's one-dimensional model became John von Neumann's two-dimensional implementation, Turing's Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.
Author Notes
George Dyson, the son of distinguished physicist Freeman Dyson, grew up immersed in the world of groundbreaking science. Dyson lives in Washington State.
Reviews (7)
Publisher's Weekly Review
An overstuffed meditation on all things digital sprouts from this engrossing study of how engineers at Princeton's Institute for Advanced Study, under charismatic mathematician John von Neumann (the book should really be titled Von Neumann's Cathedral), built a pioneering computer (called MANIAC) in the years after WWII. To readers used to thinking of computers as magical black boxes, historian Dyson (Darwin Among the Machines) gives an arresting view of old-school mechanics hammering the first ones together from vacuum tubes, bicycle wheels, and punch-cards. Unfortunately, his account of technological innovations is too sketchy for laypeople to quite follow. The narrative frames a meandering tour of the breakthroughs enabled by early computers, from hydrogen bombs to weather forecasting, and grandiose musings on the digital worldview of MANIAC's creators, in which the author loosely connects the Internet, DNA, and the possibility of extraterrestrial invasion via interstellar radio signals. Dyson's portrait of the subculture of Von Neumann and other European émigré scientists who midwifed America's postwar technological order is lively and piquant. But the book bites off more science than it can chew, and its expositions of hard-to-digest concepts from Gödel's theorem to the Turing machine are too hasty and undeveloped to sink in. (Mar.) (c) Copyright PWxyz, LLC. All rights reserved.
Booklist Review
Many sweeping histories of the computer revolution have already been written, tracing the origins of today's digital landscape back to the ancient Sumerian abacus, yet few are as thorough as this fascinating account from science historian Dyson. Prior to the 1940s, mechanical devices like slide rules could solve equations or yield simple yes or no answers. It wasn't until a team of mathematicians and engineers led by John von Neumann convened in Princeton in 1945 that the first primitive random-access-memory computer, known as MANIAC, was born. Dyson draws on a wealth of long-hidden archival material to tell the full story of this breakthrough and its eccentric masterminds, including now-legendary figures such as Kurt Gödel and Richard Feynman. The most eye-opening facet of MANIAC's creation is just how dependent it was on the same government program that funded the hydrogen bomb. Despite a plethora of technical explanations, Dyson's prose is never tedious as he illuminates the genesis and evolution of our ubiquitously computerized world.--Hays, Carl Copyright 2010 Booklist
New York Review of Books Review
IT'S anyone's guess whether our digital world ends with a bang, a whimper or a singularity. One thing's for sure: It began with a double entendre. The digital age can be traced to a machine built circa 1951 in Princeton, N.J. That machine was given the bureaucratic-sounding name the Mathematical and Numerical Integrator and Computer, and was known by the acronym Maniac, meaning something wild and uncontrollable - which it proved to be. But the crucial double entendre was contained in the Computer's memory. For the first time, numbers could mean numbers or instructions. Data could be a noun or a verb. That turned out to be incredibly important, as George Dyson makes clear in his latest book, "Turing's Cathedral," a groundbreaking history of the Princeton computer. Though the English mathematician Alan Turing gets title billing, Dyson's true protagonist is the Hungarian-American John von Neumann, presented here as the Steve Jobs of early computers - a man who invented almost nothing, yet whose vision changed the world. Von Neumann was no stereotypical mathematician. He was urbane, witty, wealthy and (literally) entitled. At his 1926 doctoral exam, the mathematician David Hilbert is said to have asked but one question: "Pray, who is the candidate's tailor?" He had never seen such beautiful evening clothes. Already one of the century's great mathematicians, von Neumann pursued a career in academia before turning to consult on the building of bombs (and computers) during World War II. At the time, the Army had begun work on a "digital electronic computer" known as the Eniac that was programmed, via switches and cables, by hand. After Nagasaki, von Neumann sold the United States military on a more powerful "stored program" computer, one that could read coded sequences from high-speed memory and thus more rapidly, and automatically, run numerical simulations essential to the design of nuclear weapons. Von Neumann also sold his employer, the Institute for Advanced Study, on building the Faustian device in Princeton. Another institute scholar, the logician Kurt Gödel, is also a vital figure in Dyson's story. Gödel is known for his "incompleteness theorem," which demonstrated the existence of true statements that cannot be proved in any mathematically rigorous way. But it's not Gödel's conclusion that matters here so much as the trick he used to achieve it: He invented the mathematical double entendre. In his famous proof, numbers carry two meanings - the familiar one designating a quantity, and an encoded one designating a logical proposition (e.g., "No number can be multiplied by 0 to produce 7"). In this way Gödel commingled the data and "code" of arithmetic. Turing took this notion and in 1936 applied it to a hypothetical universe of "automata." In his conception, a simple robot (a "Turing machine") is supplied with a paper tape. The tape is a crude form of memory, its contents doubling as data and code. Turing proved that this minimalist design is a "universal" computer, capable of performing any calculation. The basic ideas of stored-program computers were therefore in place before von Neumann got to work. Yet it was he who had the prestige and the connections to turn the Turing machine into reality. Because city-destroying bombs couldn't be built by trial and error, computers were required to simulate the physics of detonation and blast waves. A computer helped build the bomb, and the bomb necessitated ever more advanced computers.
Von Neumann and two colleagues codified their machine's architecture in a report issued in 1946. They could be called the fathers of the open-source movement, as they ultimately declined to seek any patents. Within a few years of the plans' being shared, over a dozen siblings to the Princeton machine existed across the globe. Indeed, the processors in every cellphone, tablet and laptop still hew closely to von Neumann's architecture. Not all the ideas von Neumann donated to the public domain were exclusively his. "Johnny was rephrasing our logic, but it was still the SAME logic," John W. Mauchly, a creator of the Eniac, complained. Mauchly's colleague John Eckert added: "He grasped what we were doing quite quickly. I didn't know he was going to go out and more or less claim it as his own." The Maniac was first tested in the summer of 1951, "with a thermonuclear calculation that ran for 60 days nonstop." About 6 by 2 by 8 feet and weighing a trim half-ton, it was much smaller than the room-size Eniac. But it inherited some of its predecessor's reliability issues. Dyson quotes engineers' exasperated entries from the Princeton machine's logbook. May 7, 1953: "What's the use? GOOD NIGHT." June 14, 1953: "Damnit - I can be just as stubborn as this thing." June 17, 1956: "THE HELL WITH IT." Like the nuclear physicist Edward Teller, von Neumann was an apparently unconflicted proponent of the bomb. At the Institute for Advanced Study, his hawkishness clashed with Einstein's pacifism, and Einstein opposed building his computer there. Virginia Davis, wife of the logician Martin Davis, remembers writing "Stop the Bomb" in the dust on von Neumann's car. But von Neumann's second wife, Klári, recalled him being shaken by what his computer might wreak. One night in 1945, John announced, "What we are creating now is a monster whose influence is going to change history, provided there is any history left." His biggest worry wasn't the bomb, however, but, as Dyson writes, "the growing powers of machines." Klári recalled prescribing "a couple of sleeping pills and a very strong drink." "Turing's Cathedral," incorporating original research and reporting - Dyson interviewed several people present at the institute during von Neumann's tenure there, including his own father, the physicist Freeman Dyson - is an expansive narrative wherein every character, place and idea rates a digression. A brief history of Olden Farm, the site of the Princeton computer, begins with the Lenni Lenape Indians and carries on through William Penn, George Washington and so forth. One of Dyson's running jokes is the supposedly abominable climate: Princeton "in summer has been described as 'like the inside of a dog's mouth.'" Humidity caused the Maniac's air-conditioning units to freeze solid with ice. The book brims with unexpected detail. Maybe the bomb (or the specter of the machines) affected everyone. Gödel believed his food was poisoned and starved himself to death. Turing, persecuted for his homosexuality, actually did die of poisoning, perhaps by biting into a cyanide-laced apple. Less well known is the tragic end of Klári von Neumann, a depressive Jewish socialite who became one of the world's first machine-language programmers and enacted the grandest suicide of the lot, downing cocktails before walking into the Pacific surf in a black dress with fur cuffs. Dyson's well-made sentences are worthy of these operatic contradictions.
One example: "'God does not play dice with the Universe,' Albert Einstein advised physicist Max Born (Olivia Newton-John's grandfather) in 1936." Unlike many historians, Dyson has no need to reach for contemporary relevance. He quotes Julian Bigelow, the Maniac's chief engineer, in a passage that could serve as the book's précis: "What von Neumann contributed" was "this unshakable confidence that said: 'Go ahead, nothing else matters, get it running at this speed and this capability, and the rest of it is just a lot of nonsense.' . . . People ordinarily of modest aspirations, we all worked so hard and selflessly because we believed - we knew - it was happening here and at a few other places right then, and we were lucky to be in on it. . . . A tidal wave of computational power was about to break and inundate everything in science and much elsewhere, and things would never be the same." William Poundstone's latest book is "Are You Smart Enough to Work at Google?"
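The "double entendre" Poundstone describes - a single memory whose words serve as both instructions and data - is concrete enough to sketch in a few lines of code. The toy machine below is illustrative only: the opcodes and memory layout are invented for the example and are not the IAS machine's actual order code.

```python
# A toy stored-program machine: one memory whose words can be read
# either as instructions or as data. Opcodes are invented for this
# illustration; they are not the IAS machine's actual order code.
LOAD, ADD, STORE, JUMP, HALT = range(5)

def run(memory):
    acc = 0   # accumulator register
    pc = 0    # program counter: just an address into the same memory
    while True:
        op, arg = memory[pc]       # fetch: this word is treated as code
        pc += 1
        if op == LOAD:
            acc = memory[arg][1]   # the word at `arg` is treated as data
        elif op == ADD:
            acc += memory[arg][1]
        elif op == STORE:
            memory[arg] = (0, acc)
        elif op == JUMP:
            pc = arg
        elif op == HALT:
            return acc

# Cells 0-3 happen to hold the program and cells 4-6 the numbers,
# but nothing in the memory itself enforces that distinction.
memory = [
    (LOAD, 4),   # 0: acc <- mem[4]
    (ADD, 5),    # 1: acc <- acc + mem[5]
    (STORE, 6),  # 2: mem[6] <- acc
    (HALT, 0),   # 3: stop
    (0, 2),      # 4: the number 2
    (0, 3),      # 5: the number 3
    (0, 0),      # 6: result lands here
]
print(run(memory))  # prints 5
```

Because the program lives in the same addressable memory as its data, a program can in principle rewrite its own instructions - the flexibility that set stored-program machines apart from the switch-and-cable Eniac.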
Choice Review
Science and technology historian Dyson writes a superb history of early computing in the US. In 1945, John von Neumann began a secret project at the Institute for Advanced Study (IAS) in Princeton, New Jersey, to build a Turing universal machine known as the MANIAC (Mathematical and Numerical Integrator and Computer). The MANIAC was "among the first computers to make full use of a high-speed random-access storage matrix, and became the machine whose coding was most widely replicated and whose logical architecture was most widely reproduced." The endeavor would last until July 15, 1958. Dyson includes a variety of wonderful departures from the MANIAC story--the founding and evolution of the IAS, participants' backstories and their post-project lives, Monte Carlo methods as "emergency first aid," early weather forecasting, and episodes from the atomic and thermonuclear bomb programs. Resonating themes in the MANIAC story include the important role played by engineers, their impact on the IAS, and how much modern computing is indebted to the MANIAC. The author made extensive use of primary sources, including over 10,000 pages from the IAS's Electronic Computer Project and hours of interviews. Dyson's Darwin Among the Machines (CH, Nov'97, 35-1572) is an earlier related work. Summing Up: Highly recommended. All levels/libraries. M. Mounts, Dartmouth College
Guardian Review
At first sight - and it's a long first sight, lasting a good 200 of the book's 340 brilliant and frustrating pages of text - Turing's Cathedral appears to be a project for which George Dyson has failed to find a form. Ostensibly the story of the building of one of the earliest computers at Princeton in the late 1940s and early 50s, it keeps digressing wildly. The Institute for Advanced Study's MANIAC gets under construction over and over, in chapter after chapter, only for Dyson to veer off again into the biographical backstories of the constructors, and a myriad of alternative intellectual hinterlands, from hydrogen bomb design to game theory to weather prediction, by way of the cafe society of interwar Budapest. It's not that these aren't relevant. They are; but they aren't introduced in the cumulative, surreptitiously spoon-feeding way in which good pop-sci writing usually coaxes a linear narrative out of complex material. If this is a cathedral, it doesn't have anything as geometrical as a nave. It's a mass of separate structures joined by spiders' webs of coloured string. But it isn't a failure. It isn't one thing at all. It's three successes: three separate and different and differently impressive books Dyson might have written, all bizarrely shredded and mixed into a heap whose sorting is left as an exercise for the reader. Some of it is a painstaking oral history of MANIAC, built on an archivist's certainty that everything is worth rescuing from entropy that can possibly be known about the dawn of the digital computer. Truly everything, from interviews with as many of the surviving engineers as possible in the 1990s, to the institute's cafeteria manager's unexpected history testing Blériot monoplanes in 1912, and the director's complaint in 1946 that the engineers were putting too much sugar in their tea. This part of the book is a monument (or rather a bit-stream of a monument). Some of it is an intellectual biography of MANIAC's chief architect John Von Neumann and the circle around him, determined to do justice to the polymathic range of his genius, and therefore dipping into everything he contributed to, from bomb design to game theory to robotics. Alan Turing, after whom the book is misnamed - it should really be called "Johnny's Web" - only comes into the picture seriously on page 242. He is merely the collaborator of Von Neumann who happened to stand along the particular out-raying string of his interest that happened to lead to the intellectual foundations of the digital age. But since Dyson himself is passionately interested in those, in comes the third separate thing the book is, a speculative, even visionary account of the philosophy of programming. This last, marvellous element dominates the end of the book; and having reached it, and begun to be able to make sense in retrospect of the digressive tangle that came before, you ask yourself whether its design might possibly have been consciously, artfully non-linear. A kind of literary equivalent to the whole-genome shotgun method, maybe, with the shredding of multiple projects handing over to us the job of sequencing and unification. But it feels less willed than that, more the interference pattern of three different ambitions, none of which the author was ready to relinquish. And it does, no denying, take persistence. Is it worth persisting? Absolutely. Let me give you, appropriately enough, three reasons why.
One: no other book about the beginnings of the digital age brings to life anything like so vividly or appreciatively the immense engineering difficulty of creating electronic logic for the first time; of creating originally, and without a template, the pattern of organisation which has since become absolutely routine, and been etched on silicon at ever smaller micron-distances in chip foundries. The very word "foundry" insists that logic is a commodity, a material, the steel of the information age. But it didn't start like that. It started as an elaborate, just-possible accomplishment, requiring both conceptual brilliance and ingenious hands-on tinkering. It had to be built from scratch at the macro level, as an assemblage of valves and hand-wired circuits and cathode-ray tubes, fed by power at many different voltages, and protected from hazards ranging from roofing-tar to thunderstorms to the magnetic fields of passing trams. When Dyson describes the MANIAC being designed into its casing "like the folding of a cerebral cortex", you know he means specifically that - like a brain. He has read the error logs in which the baffled pioneers tried to work out which of a hundred causes produced each failure, from a simple error in coding logic to the finicky failure of adjacent phosphor spots to stay distinctly charged. "I know when I'm licked." "This now is the 3rd different output." "To hell with it!" Two: no other book has engaged so intelligently and disconcertingly with the digital age's relationship to nuclear weapons research, not just as a moral quandary to do with funding, but as an indispensable developmental influence, producing the conceptual tools that would unlock the intellectual power of the computer. The "Monte Carlo" method (Von Neumann and Stanislaw Ulam) was born as a means to track the probability of a thermonuclear reaction staying supercritical in a hydrogen bomb. If there had been no branching paths of scattering, splitting, absorbing or escaping neutrons to be modelled, there might well have been no algorithms to simulate the probabilistic paths of evolution, finance, climate. Conversely, if there had been no Monte Carlo algorithm running at electronic speed on Maniac itself, there would have been no American H-bomb in 1952, vaporising 80m tons of Enewetak Atoll in a red cloud boiling half the sky. Three: no other book - this is where we get visionary - makes the connections this one does between the lessons of the computer's origin and the possible paths of its future. Dyson takes his cue from Turing and Von Neumann's ability to see all the way to the limits of the digital architecture they were themselves proposing and struggling to substantiate for the first time. In the late 1940s they were already thinking about the essential rigidity and (from one point of view) logical inefficiency of machines which, unlike living information processors, can only do one thing at a time, leaving the whole elaborate structure of the rest idle. As Dyson puts it: "There is a thin veneer of instructions, and then there is a dark empty 99.9%." Yet the "Von Neumann architecture" of a memory passing individual bits to a processor, each with its own unique memory address, is not the only possible one, and not the only one considered by Von Neumann, for that matter. Dyson believes that the birth of other architectures atop the reliable substrate of the digital-as-we-know-it is now imminent.
Some of his suggestions may be, let's say, in advance of the evidence, like the idea that Google represents a first sketch of what Turing called "an oracle machine", supplementing its own deterministic states with the non-deterministic input of human queries. But then so were many of Turing's and Von Neumann's ideas a little previous, to say the least. Most of us should persist in reading this for the scrambled richness of its history. But I suspect that one of its afterlives is going to be as a source of koans for coders, troublingly simple questions to be copied out, and sellotaped to workstations, and stared at until - eureka! - something new happens in a human mind, and shortly thereafter in one of its electric surrogates. Francis Spufford's Red Plenty is published by Faber. - Francis Spufford
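The Monte Carlo idea Spufford sketches - following many randomized branching chains of neutrons and counting how they end - reduces to a few lines of code. The sketch below is a minimal illustration: the fission probability and the two-neutrons-per-fission rule are invented for the example, whereas the real Los Alamos calculations tracked measured cross-sections, energies, and geometry for weeks.

```python
import random

# Toy Monte Carlo branching-process model of a neutron chain reaction.
# All numbers here are invented for illustration.

def chain_runs_away(p_fission=0.55, cap=500):
    """Follow one chain: each neutron either fissions (two offspring)
    or is absorbed/escapes (no offspring). Return True if the
    population ever reaches `cap`, False if the chain dies out."""
    neutrons = 1
    while 0 < neutrons < cap:
        next_gen = 0
        for _ in range(neutrons):
            if random.random() < p_fission:
                next_gen += 2   # fission: two new neutrons
        neutrons = next_gen
    return neutrons >= cap

trials = 5000
runaways = sum(chain_runs_away() for _ in range(trials))
print(f"estimated P(runaway chain) = {runaways / trials:.3f}")
```

With two offspring per fission, the expected yield per neutron is 2 * p_fission, so the toy chain is supercritical exactly when p_fission exceeds 0.5; at 0.55 the branching-process arithmetic gives a runaway probability of about 0.18, which the simulation should approximate.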
Kirkus Review
Dyson (Project Orion: The Atomic Spaceship 1957-1965, 2002, etc.) establishes late 1945 as the birth date of the first stored-program machine, built at the Institute for Advanced Study, established in Princeton in 1932 as a haven for theoreticians. It happened under the watch of the brilliant mathematician John von Neumann, fresh from commutes to Los Alamos, where the atom bomb had been built and the hydrogen bomb was only a gleam in Edward Teller's eye. Dyson makes clear that the motivation for some of the world's greatest technological advances has always been to perfect instruments of war. Indeed, von Neumann's colleagues included some who had been at Aberdeen Proving Ground, where a dedicated-purpose computer, ENIAC, had been built to calculate firing tables for antiaircraft artillery. The IAS computer, MANIAC, was used to determine the parameters governing the fission of an atomic device inside an H-bomb that would then ignite the fusion reaction. But for von Neumann and others, the MANIAC was also the embodiment of Alan Turing's universal machine, an abstract machine invented in the '30s by the mathematician who would go on to crack the Nazis' infamous Enigma code in World War II. In addition to these stories, Dyson discusses climate and genetic-modeling projects programmed on the MANIAC. The use of wonderful quotes and pithy sketches of the brilliant cast of characters further enriches the text. Who knew that eccentric mathematician-logician Kurt Gödel had married a Viennese cabaret dancer? Meticulously researched and packed with not just technological details, but sociopolitical and cultural details as well--the definitive history of the computer. Copyright Kirkus Reviews, used with permission.
Library Journal Review
Dyson's (Project Orion: The True Story of the Atomic Spaceship) history of the first computer is a compelling and readable narrative. Under the leadership of John von Neumann, researchers at the Institute for Advanced Study in New Jersey built one of the first working computers. The book details each of the principal scientists and their part in this grand scheme. Chapter by chapter, readers are introduced to more than 70 individuals, each of whom played a unique role in the project. Even Princeton University gets its own chapter. The novelistic structure of the book makes it more entertaining than a typical, chronological history text, though at times also more difficult to follow. Dyson often has newly introduced persons interact with other figures who do not appear until later chapters, which will make reading more difficult for those who are not already familiar with this topic. Verdict Recommended for readers interested in the history of computers, history of science during World War II, and modern American history.-Dawn Lowe-Wincentsen, Oregon Inst. of Technology, Portland (c) Copyright 2012. Library Journals LLC, a wholly owned subsidiary of Media Source, Inc. No redistribution permitted.
Excerpts
Preface
POINT SOURCE SOLUTION

I am thinking about something much more important than bombs. I am thinking about computers. --John von Neumann, 1946

There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky.

In late 1945, at the Institute for Advanced Study in Princeton, New Jersey, Hungarian American mathematician John von Neumann gathered a small group of engineers to begin designing, building, and programming an electronic digital computer, with five kilobytes of storage, whose attention could be switched in 24 microseconds from one memory location to the next. The entire digital universe can be traced directly to this 32-by-32-by-40-bit nucleus: less memory than is allocated to displaying a single icon on a computer screen today.

Von Neumann's project was the physical realization of Alan Turing's Universal Machine, a theoretical construct invented in 1936. It was not the first computer. It was not even the second or third computer. It was, however, among the first computers to make full use of a high-speed random-access storage matrix, and became the machine whose coding was most widely replicated and whose logical architecture was most widely reproduced. The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same.

Working outside the bounds of industry, breaking the rules of academia, and relying largely on the U.S. government for support, a dozen engineers in their twenties and thirties designed and built von Neumann's computer for less than $1 million in under five years. "He was in the right place at the right time with the right connections with the right idea," remembers Willis Ware, fourth to be hired to join the engineering team, "setting aside the hassle that will probably never be resolved as to whose ideas they really were."

As World War II drew to a close, the scientists who had built the atomic bomb at Los Alamos wondered, "What's next?" Some, including Richard Feynman, vowed never to have anything to do with nuclear weapons or military secrecy again. Others, including Edward Teller and John von Neumann, were eager to develop more advanced nuclear weapons, especially the "Super," or hydrogen bomb. Just before dawn on the morning of July 16, 1945, the New Mexico desert was illuminated by an explosion "brighter than a thousand suns." Eight and a half years later, an explosion one thousand times more powerful illuminated the skies over Bikini Atoll.

The race to build the hydrogen bomb was accelerated by von Neumann's desire to build a computer, and the push to build von Neumann's computer was accelerated by the race to build a hydrogen bomb. Computers were essential to the initiation of nuclear explosions, and to understanding what happens next. In "Point Source Solution," a 1947 Los Alamos report on the shock waves produced by nuclear explosions, von Neumann explained that "for very violent explosions . . . it may be justified to treat the original, central, high pressure area as a point." This approximated the physical reality of a nuclear explosion closely enough to enable some of the first useful predictions of weapons effects.
Numerical simulation of chain reactions within computers initiated a chain reaction among computers, with machines and codes proliferating as explosively as the phenomena they were designed to help us understand. It is no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time. Only the collective intelligence of computers could save us from the destructive powers of the weapons they had allowed us to invent.

Turing's model of universal computation was one-dimensional: a string of symbols encoded on a tape. Von Neumann's implementation of Turing's model was two-dimensional: the address matrix underlying all computers in use today. The landscape is now three-dimensional, yet the entire Internet can still be viewed as a common tape shared by a multitude of Turing's Universal Machines.

Where does time fit in? Time in the digital universe and time in our universe are governed by entirely different clocks. In our universe, time is a continuum. In a digital universe, time (T) is a countable number of discrete, sequential steps. A digital universe is bounded at the beginning, when T = 0, and at the end, if T comes to a stop. Even in a perfectly deterministic universe, there is no consistent method to predict the ending in advance. To an observer in our universe, the digital universe appears to be speeding up. To an observer in the digital universe, our universe appears to be slowing down.

Universal codes and universal machines, introduced by Alan Turing in his "On Computable Numbers, with an Application to the Entscheidungsproblem" of 1936, have prospered to such an extent that Turing's underlying interest in the "decision problem" is easily overlooked. In answering the Entscheidungsproblem, Turing proved that there is no systematic way to tell, by looking at a code, what that code will do. That's what makes the digital universe so interesting, and that's what brings us here.

It is impossible to predict where the digital universe is going, but it is possible to understand how it began. The origin of the first fully electronic random-access storage matrix, and the propagation of the codes that it engendered, is as close to a point source as any approximation can get.

Excerpted from Turing's Cathedral: The Origins of the Digital Universe by George Dyson. All rights reserved by the original copyright owners. Excerpts are provided for display purposes only and may not be reproduced, reprinted or distributed without the written permission of the publisher.
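A closing note on the excerpt's numbers: the IAS machine's storage was 1,024 words of 40 bits each, held as a 32-by-32 array of charge spots on each of 40 cathode-ray storage tubes, which is where the "32-by-32-by-40-bit" figure comes from. The arithmetic below simply spells out how that yields Dyson's "five kilobytes"; the access-rate line is a back-of-the-envelope implication of the quoted 24-microsecond switching time, not a figure from the book.

```python
# The arithmetic behind the "32-by-32-by-40-bit nucleus":
words = 32 * 32              # 1,024 addressable words (a 32 x 32 array)
bits_per_word = 40           # one bit per word on each of 40 tubes
total_bits = words * bits_per_word
print(total_bits)            # 40960 bits
print(total_bits // 8)       # 5120 bytes -- Dyson's "five kilobytes"

# A 24-microsecond switch between memory locations implies roughly
# 1 / 24e-6, i.e. about 41,700 random accesses per second.
print(round(1 / 24e-6))      # 41667
```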