
Alan Jacobs


The Man Who Delivered the Computer

John von Neumann.

The novelist Jane Smiley has written an interesting and informative book called The Man Who Invented the Computer, which is marred chiefly by its title. Smiley herself clearly has mixed feelings about it. At times she is bold and straightforward in her claims:

The inventor of the computer was a thirty-four-year-old associate professor of physics at Iowa State College named John Vincent Atanasoff. There is no doubt that he invented the computer (his claim was affirmed in court in 1978) and there is no doubt that the computer was the most important (though not the most deadly) invention of the twentieth century …. Where and when did Atanasoff invent the computer? In a roadhouse in Rock Island, Illinois, while having a drink. He jotted his notes on a cocktail napkin.

(This happened in December of 1937.) But at other points in the book, Smiley is more nuanced and even evasive; she writes many sentences that freely acknowledge multiple progenitors of the computer: "All of the twentieth-century computer inventors were aware of [Charles] Babbage's work," one of them begins, and here is another: "There was no inventor of the computer who was not a vivid personality, and no two are alike." It's hard not to think that the bold, even grandiose, claim of Smiley's title stems from her determination to give credit to a forgotten genius, who also happens to have taught at the same university where Jane Smiley has been a faculty member for many years. But it surely also stems from a general public desire to know who came up with the machine that has done so much to transform our daily lives and the shape of our social order. The problem is that, however we might like to designate someone as The Inventor of the Computer, we just can't. No matter what a court might affirm.

Intrinsic to that problem is this: the question "Who invented the computer?" is hopelessly vague. What do you mean by "the computer"? Mechanical calculating machines go back at least to the 17th century, with Pascal. The idea of a machine that could be programmed to do the job of almost any other kind of machine was Alan Turing's in 1936, though he was dependent on George Boole's logical system, formulated in the 19th century, and on questions raised and addressed in the early 20th century by the mathematicians David Hilbert and Kurt Gödel. Computers, as we understand them, would end up recording and manipulating information, and the nature of information would be conceptually isolated by the American mathematician Claude Shannon in 1948, though some of the basic principles had been articulated by Francis Bacon 350 years earlier. The first electrical computers could only work because people had invented the vacuum tube, with the key contribution being that of an American inventor named Lee de Forest in 1907; later ones could be made small enough to carry around only because scientists at Bell Labs—where Claude Shannon also worked—invented the transistor in 1947. (Seven years later, scientists at Texas Instruments developed the first silicon transistor.) The best response to the question "Who invented the computer?" is to say that the question just doesn't make much sense. A far better question is: "Who and what made possible the current social and cultural and economic and political dominance of digital computing?"

Years ago I heard a lecture by Robert Kaplan, author of Balkan Ghosts, in which he described a visit he had made to Serbia during which the local newspapers and TV stations were full of news about a series of kidnappings of priests. Kaplan asked a Serbian acquaintance why these kidnappings were happening. Well, the man replied, you need to go back to the 16th century …. Writing about the history of computers is like that. It's very hard to know where to begin.

James Gleick begins The Information: A History, A Theory, A Flood with some 19th-century Englishmen discovering that people along the Niger River used "talking drums" to communicate across distances. The "language" of the talking drums had two major components: difference (in this case, variations in speed, rhythm, and tone) and repetition. A good deal of repetition was needed for users of talking drums, because the available variations were quite meager—too meager to serve as a fully adequate replacement for the manifold variations human speech is capable of—and because sounds are limited by distance and often overridden by other sounds. When listening to talking drums, one must deal with the presence of noise, noise that can make it hard to discern the signal. But this is always true when people strive to communicate, even if sometimes the noise is more metaphorical than literal. The components needed to transfer information via talking drum are the same as in every other informational environment: difference and repetition effectively are information.

Claude Shannon was the first person to grasp this point fully, and to offer a mathematical account of it. When asked later in life to give talks on the subject, he would quote Jesus' Sermon on the Mount from the King James Version: "Let your communication be Yea, yea; Nay, nay; for whatsoever is more than these cometh of evil" (Matt. 5:37). It was a shrewd citation, encompassing as it does both binary encoding, in the simple distinction between "Yea" and "Nay," and the need for redundancy, in the doubling of each term. All digital computers are built on this understanding.
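Shannon's point can be made concrete with a small sketch. The following Python fragment (an illustration of my own, not anything drawn from Shannon's paper) reduces a message to bare binary differences, repeats each bit to supply redundancy, lets a noisy channel flip some of the transmitted bits, and then recovers the original by majority vote: the "Yea, yea" doubling, pushed one step further.

    import random

    def encode(bits, r=3):
        # Repetition code: send each bit r times (redundancy).
        return [b for b in bits for _ in range(r)]

    def noisy_channel(bits, flip_prob=0.05):
        # Flip each transmitted bit with a small probability (noise).
        return [b ^ 1 if random.random() < flip_prob else b for b in bits]

    def decode(received, r=3):
        # Majority vote over each group of r repeated bits.
        return [1 if sum(received[i:i + r]) > r // 2 else 0
                for i in range(0, len(received), r)]

    message = [0, 1, 1, 0, 1, 0, 0, 1]   # difference: two distinguishable symbols
    sent = encode(message)               # repetition: redundancy against noise
    received = noisy_channel(sent)
    print(decode(received) == message)   # usually True, despite the noise

More repetition buys more protection at the cost of a longer transmission; making that trade-off precise is exactly what Shannon's 1948 paper did.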

But a decade before Shannon's paper, John Atanasoff—the hero of Jane Smiley's biography—was already putting together an elementary working computer. And at that same time Alan Turing was laying mathematical and logical foundations that Shannon would build on. (The two men met when Turing came to the U.S. during World War II.) The first famous computers came to public attention in the 1950s, but years earlier a number of people already knew how to build them. The problem was, simply, money. No one knew how to build small computers: to build just one you needed a great many highly expensive and fragile parts—registers, vacuum tubes, relays—along with a very large room in which to house the machine, a great deal of electricity not just to run it but also to cool it, and a team of engineers, mechanics, mathematicians, and programmers to keep it functioning. The modern computer would have been created in any case, perhaps had already been created in primitive form by people like Atanasoff, but could only reach its full world-changing potential when a person turned up who somehow managed to combine mathematical and scientific prowess with managerial genius, a far-ranging network of contacts in the academic and governmental spheres, and extraordinary gifts of salesmanship. And how likely was that to happen?

Future generations looking back on the 20th century may well decide that its most important year was 1953. As George Dyson notes in Turing's Cathedral: The Origins of the Digital Universe, "three technological revolutions dawned" in that year: "thermonuclear weapons, stored-program computers, and the elucidation of how life stores its own instructions as strings of DNA." These three revolutions are entangled in curious ways. That first stored-program computer—called MANIAC (Mathematical and Numerical Integrator and Computer), and built at Princeton's Institute for Advanced Study—was used to run the complex mathematical calculations that let scientists know whether the thermonuclear reactions they were trying to create would indeed occur as planned; this computer worked according to the same basic principles of information processing that underlie the transmission of genetic code in living organisms. Information theory—or what Shannon, in his massively influential paper of 1948, called "A Mathematical Theory of Communication"—forms the link that joins the biological discoveries of Watson and Crick, the functioning of digital computers, and the capacity for nuclear destruction that kept much of the world in an underlying state of fear for the next several decades. As Dyson comments, "It is no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time."

In Turing's Cathedral, Dyson tells the story of the building of MANIAC, with an eye always to the unanticipated but vast consequences of the project. Both Dyson and Gleick depend on previous researchers and historians—though Dyson's work is also underpinned by a multitude of interviews—and for those who know the work that has been done in these areas, two figures have come, increasingly, to stand out as makers of the 20th century. One is Shannon. The second is Turing, the English mathematician and codebreaker whose 1936 paper "On Computable Numbers" presented for the first time the idea of a digital "universal machine," that is, a machine capable of implementing the functions of an indefinite number of other machines. But from reading Dyson's and Gleick's books, one might well draw the conclusion that another figure equals Shannon and Turing in importance—perhaps, when all is said and done, exceeds them; perhaps was the figure with a greater impact on the warp and woof of our everyday world in the past half-century than any other single person. His name was John von Neumann, and through an encounter with him we can come to see why the histories told by Dyson and Gleick are so vital.

Born Neumann János Lajos in Budapest in 1903, John von Neumann established a precocious career as a mathematician and physicist in Hungary and Germany before coming to Princeton in 1930, first as a member of the university's mathematics department and then, in 1933, as a participant in a brand-new endeavor called the Institute for Advanced Study. (Two of the other early members were Albert Einstein and Kurt Gödel; George Dyson's father Freeman has been a fellow there since 1953.) Von Neumann would remain at the IAS until the last year of his life; cancer killed him in 1957, when he was only fifty-three. Soon after his death, Hans Bethe, who would later win the Nobel Prize in Physics, told Life magazine, "I have sometimes wondered whether a brain like von Neumann's does not indicate a species superior to that of man."

The encomium to von Neumann in Life—its full title is "Passing of a Great Mind: John von Neumann, a Brilliant, Jovial Mathematician, Was a Prodigious Servant of Science and His Country"—acknowledges his many contributions to theoretical physics, applied physics, and mathematics, but places its emphasis elsewhere: "The foremost authority on computing machines in the U.S., von Neumann was more than anyone else responsible for the increased use of electronic 'brains' in government and industry." And this emphasis now seems to be the correct one. Despite his immense individual achievements, von Neumann's greatness is best marked by his ability to encourage, initiate, and manage collective endeavors that required the participation of many immensely gifted people covering a wide range of intellectual territories. (Imagine a Steve Jobs whose technical mastery of mathematics and engineering matched or exceeded that of his employees.) It is hard to see how anyone other than von Neumann could have pulled it all off.

One key to the puzzle of how he did it may be found in his name. When he moved to Germany, Neumann János Lajos became Johann von Neumann—his father, a prominent Jewish banker, had been elevated to the Austro-Hungarian nobility in 1913—but upon arrival in America "Johann" became "John," and he was universally known as "Johnny." This distinctive combination of Old World dignity and American convivial ease served von Neumann exceptionally well: he could be commanding or courtly, severe or gregarious, at need. He seems almost always to have gotten people to do what he wanted them to do. And the changeableness demanded by von Neumann's role as scientific impresario seems to have suited his personality: he was famously impatient, flitting from idea to idea, topic to topic. He bought a new car each year—always a Cadillac, because, he said, "no one would sell me a tank"—and drove well over the speed limit, humming and whistling and swaying the car back and forth to the tune. He wrecked several of those cars and received speeding tickets on a regular basis, but he charmed some police authority into fixing those for him; he never paid.

Those who worked at the IAS sometimes commented that he and Einstein, the two dominant figures in the institute's first decades, could scarcely have been more different. Einstein thought slowly and with endless patience, whereas von Neumann would either arrive at the solution to a problem instantly—when that happened, he would leave whatever company he was in, sometimes without explanation, to write down his thoughts—or grow bored with it and move on to something else. He was made nervous by silence and loved to work in noisy public spaces: he would have been a natural for the coffeehouse workplace culture that some have dubbed Laptopistan. Fittingly, then, it was a fidgety, impatient, endlessly distractible man who fathered the computers that fill our days with distractions.

But he could not have achieved what he did without his endlessly gregarious flitting from person to person, office to office, problem to problem. In an exceptionally apt metaphor, Dyson comments that "von Neumann served the role of messenger RNA, helping to convey the best of the ideas" of that time and place. Maybe the scientific and technical revolutions that have since come about would have happened anyway if von Neumann had not served as the Hermes of his world; but they would have happened more slowly, more haltingly, with more fits, starts, and dead ends. Whether von Neumann's incisive interventions have, overall, made the world better or worse I leave as an exercise for the reader.

But what, precisely, did von Neumann, and the groups he worked with, accomplish? And how did they do it? The chief thing they did was to build big machines, and in those days it was the hardware that mattered, not the software. Programming was so little considered at the time that it was often left to women—some of the same women who not long before had taken jobs as mathematical calculators or, as they were called from the 19th century to the mid-20th, "computers." To some degree programming was neglected because it was a far less complex task than it is today, given that those early machines were asked to do just a few number-crunching things, but to a greater degree it arose from the familiar belief that Big Toys make Big Men.

(Steve Lohr's 2001 book Go To does an excellent job of recovering those early days of programming, and especially of demonstrating the key role that a handful of women played in creating modern computer programming. Curiously, at our current moment coders get the lion's share of attention and the material aspect of our online lives is often neglected, except perhaps for the regular slavering over the design of Apple's products. Thus, if Lohr's book was a corrective in one direction—celebrating the dignity and importance of coding at the end of an era when hardware reigned supreme—Andrew Blum's new book Tubes: A Journey to the Center of the Internet redresses the balance in the other direction by reminding us of the stubbornly physical character of the Internet. We may believe that we're living in a wireless world, but our wireless routers are connected to wires that are connected to network exchanges that are connected to other continents by undersea fiberoptic cables—cables laid along the same paths that were established 150 years ago when the first transoceanic telegraph cables were set down.)

So, on the crassest level, von Neumann used his government contacts to procure funding for building enormous metal machines. It must be acknowledged that this was something close to a zero-sum game for the early computer-builders, since U.S. government policy at the time was to concentrate its funding on projects believed most likely to bear fruit. Atanasoff had created a reasonably functional computer by 1941 but had set that work aside when called upon to help the war effort in other ways; when he resumed his labors in 1946, he soon had his funding cut off because von Neumann convinced the officials responsible to throw all their money to the Institute for Advanced Study's MANIAC project.

Anyone who thinks that the rise of the computer is a good thing will also think that the government decided wisely. MANIAC was a stored-program computer, which is to say it contained both the work it did and the instructions for doing it. As Dyson puts it, "Bits that are embodied as structure (varying in space, invariant across time) we perceive as memory, and bits that are embodied as sequence (varying in time, invariant across space) we perceive as code." It was Turing who first realized that computations of anything can be broken down into such basic binary alternatives; it was Shannon who theorized the "bit" as the basic unit of information; and it was von Neumann who figured out how a machine meant to do these calculations should be put together. This "von Neumann architecture"—which he first laid out in a classified government paper in 1945—became the structural foundation of modern computing because it simply worked better than the alternatives, especially in enabling quick and easy re-programming, which in turn allowed the computer to perform any task conceivable in what Turing called "computable numbers." (Atanasoff had come up with a lot, but not this; to re-program his computer he effectively had to rebuild it.) Dyson again: "The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same."
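A toy example may help make the idea of the stored program concrete. The sketch below (a made-up miniature machine in Python, not the actual IAS instruction set) keeps instructions and data in one shared memory; whether a given number "means" something or "does" something depends only on whether the machine happens to fetch it as an instruction or as an operand.

    def run(memory):
        # Execute a program laid out in memory as [opcode, operand, ...] pairs.
        acc, pc = 0, 0
        while True:
            op, arg = memory[pc], memory[pc + 1]
            if op == 0:        # HALT
                return memory
            elif op == 1:      # LOAD addr: acc = memory[addr]
                acc = memory[arg]
            elif op == 2:      # ADD addr: acc = acc + memory[addr]
                acc += memory[arg]
            elif op == 3:      # STORE addr: memory[addr] = acc
                memory[arg] = acc
            pc += 2

    # Program and data share the same memory cells.
    memory = [1, 10,   # LOAD cell 10
              2, 11,   # ADD cell 11
              3, 12,   # STORE into cell 12
              0, 0,    # HALT
              0, 0,    # (unused)
              7, 4,    # data in cells 10 and 11
              0]       # cell 12 will hold the result
    run(memory)
    print(memory[12])  # 11

Re-programming here means nothing more than writing different numbers into memory, which is precisely the flexibility the von Neumann architecture secured.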

Late in her book, Jane Smiley reflects on the complicated story she has told—far more complicated than its title would suggest. Here she acknowledges that though the "seed" of the modern personal computer "was planted and its shoot cultivated by John Vincent Atanasoff and [his assistant] Clifford Berry," many other people had to intervene in many different ways for our current environment to emerge. Thinking on the pioneers who built, programmed, and ran the first computers, she muses especially on one:

Perhaps our most problematic character is John von Neumann. Scott McCartney considers him a thief, Norman Macrae and Kati Marton consider him a visionary. Everyone considers him a genius. As for me, von Neumann is the man whose memoirs I would have liked to read, the man at the center of everything, the man of Budapest and the man of Washington, D.C. I would like to know who he thought had invented the computer.

It is indeed tempting to speculate on the story von Neumann would have told had he lived to write those memoirs. But cancer came upon him early. (Some have wondered whether the disease stemmed from his exposure to tests of nuclear weapons at Bikini Atoll in 1946.) He had just left the IAS for a full-time role at the Atomic Energy Commission, for which he had already worked for some years, when he received his diagnosis. He spent the last year of his life in Walter Reed Hospital in Washington, D.C., where he surprised and dismayed friends and family by seeking out a priest for counsel. His brother explains this away as merely an effort to find intellectual companionship; his daughter Marina affirms, to the contrary, that von Neumann was taking up Pascal's Wager—betting on the existence of God, since he couldn't lose by being wrong: "My father told me, in so many words, once, that Catholicism was a very tough religion to live in but it was the only one to die in."

Whatever his spiritual state may have been, we know that he was distressed by the disease's effect on his mind. An old joke among mathematicians says, "Other mathematicians prove what they can; von Neumann proves what he wants," but in the last weeks of his life he felt his mind rapidly deteriorating. What had always come to him with almost supernatural ease became difficult and then impossible. Marina von Neumann told George Dyson that near the end her father asked her "to test him on really simple arithmetic problems, like seven plus four, and I did this for a few minutes, and then I couldn't take it anymore." Even as von Neumann stumbled over the most elementary sums, he never lost his awareness that mathematics was important and that he had been marked from childhood by the astonishing fluency with which he could do it. Marina von Neumann fled the hospital room: in George Dyson's words, she could not bear to watch her father, one of the greatest minds of the 20th or any other century, "recognizing that that by which he defined himself had slipped away."

The story Dyson and Gleick tell, and that I have retold selectively and with minor modifications, is highly contested. Many would say that it gives von Neumann too much credit. Certainly partisans of Atanasoff contend that he did most of the heavy lifting and that von Neumann came in late in the game, added some polish, and ended up carrying away all the prizes. Gleick himself shows, though he does not make this point explicit, how von Neumann was stimulated by conversations with scientists and engineers of all stripes, including the extraordinary polymath Norbert Wiener of MIT, whose understanding of what he called "cybernetics"—"the scientific study of control and communication in the animal and the machine"—was nearly as influential in early computer design as the work of Shannon and Turing. And none of the existing accounts, in my judgment, give sufficient credit to the engineers who had to figure out ways to build and maintain the machines that others theorized: Atanasoff depended on the many skills of his assistant Clifford Berry, and von Neumann depended radically on an engineer of true genius named Julian Bigelow. All that said, I do not think that Dyson rates von Neumann too highly. When he says that the stored-program computer was "conceived" by Turing and "delivered" by von Neumann, he is precisely right: perhaps other people could have gotten the world-transforming machine from conception to delivery, but only von Neumann did. (Again one thinks of Steve Jobs, who famously said, "Real artists ship.")

Moreover, he achieved this by linking the development of the stored-program computer with the military goals of the United States. He could not have "delivered" the computer had he not convinced leading government officials that his machine would help to keep the United States ahead of the Soviet Union in the struggle for political superiority, and ultimately to defeat Communism once and for all. In other words, von Neumann was a primary architect not just of modern computing but also of modern warfare, and of what President Eisenhower, in his farewell address given in January 1961, called "the military-industrial complex." Eisenhower had projects like MANIAC in mind when he coined that phrase, and when he uttered words of warning that should be read in conjunction with encomia to von Neumann like Dyson's and like the celebration in Life magazine cited earlier. Here is the key passage from Eisenhower's speech:

This conjunction of an immense military establishment and a large arms industry is new in the American experience. The total influence—economic, political, even spiritual—is felt in every city, every statehouse, every office of the federal government. We recognize the imperative need for this development. Yet we must not fail to comprehend its grave implications. Our toil, resources and livelihood are all involved; so is the very structure of our society. In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist.

I quoted earlier Dyson's comment that it is "no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time." Indeed, the two projects propped each other up: the weapon could not have been made to work properly without the calculations performed by the computer, and the computer would not have been funded in the absence of the perceived need for such a weapon.

Similarly, the Internet was created by the Defense Advanced Research Projects Agency (DARPA), an office of the U.S. Department of Defense charged with developing advanced military technologies. In the minds of the more optimistic observers of the technological scene, the geeks of the world have gone a long way towards reprogramming governmental swords into ploughshares meant for the masses; more skeptical critics, like Evgeny Morozov in his book The Net Delusion, discern authoritarian governments finding ways to convert those ploughshares right back into dictatorial swords. We can hope that the optimists are right, but we must acknowledge that the military-industrial complex continues to be just that, a complex, a set of mutually reinforcing structures that continually generates instruments of extraordinary power—for good and for ill. No one ever negotiated the interstices of those structures as skillfully as John von Neumann. Probably no one ever will again. And that just may be a mercy.

Alan Jacobs is professor of English at Wheaton College. His edition of Auden's The Age of Anxiety was published last year by Princeton University Press. He is the author most recently of The Pleasure of Reading in an Age of Distraction (Oxford Univ. Press) and a brief sequel to that book, published as a Kindle Single: Reverting to Type: A Reader's Story.

Books mentioned in this essay:

George Dyson, Turing's Cathedral: The Origins of the Digital Universe (Pantheon Books, 2012).

James Gleick, The Information: A History, A Theory, A Flood (Pantheon Books, 2011).

Jane Smiley, The Man Who Invented the Computer: The Biography of John Atanasoff, Digital Pioneer (Doubleday, 2010).

Steve Lohr, Go To: The Story of the Math Majors, Bridge Players, Engineers, Chess Wizards, Maverick Scientists and Iconoclasts—The Programmers Who Created the Software Revolution (Basic Books, 2001).

Andrew Blum, Tubes: A Journey to the Center of the Internet (Ecco Press, 2012).

Evgeny Morozov, The Net Delusion: The Dark Side of Internet Freedom (Perseus Books, 2011).
