
Timothy Larsen

God & Math

Oil and water, or gin and tonic?

Editor's Note: This article first appeared in the September/October 2009 issue of Books & Culture. If you find it delightful, as I did, you might also enjoy a book published last year by Cambridge University Press, Daniel Brown's The Poetry of Victorian Scientists.

It's not hard to predict how eagerly the new atheists would pounce if an orthodox Christian theologian were to concede that the notion that God is three-in-one could be labeled "irrational." Or that the doctrine of the Trinity is so far beyond our normal ways of thinking that one might refer to the three persons of the Godhead as "imaginary." Yet mathematicians quite unapologetically speak of "imaginary" and "irrational" numbers. Moreover, they are content to assume that if others think they have thereby lapsed into nonsense, so much the worse for them.

At the heart of Daniel J. Cohen's welcome and informative book, Equations from God: Pure Mathematics and Victorian Faith, is an account of how in the middle of the 19th century, leading Anglo-American mathematicians were convinced that their discipline bolstered belief in God. One such figure was Benjamin Peirce, the Harvard professor who is sometimes referred to as the father of pure mathematics in America. (He was also the father of Charles Sanders Peirce, the eminent philosopher and polymath.) In a charming recounting of his life and thought, Cohen observes that "a fascination with imaginary quantities seized Peirce as a child." (Peirce's son recollected that his father had a "superstitious reverence for the square root of minus one.")

Peirce was thoroughly imbued with the conviction that his discipline revealed the Almighty and his thoughts. He observed that mathematics should be seen as akin to the burning bush: a source of divine revelation that was continuous rather than consumed. In his lectures, after emerging triumphantly from an involved mathematical demonstration, Professor Peirce was known suddenly to pronounce, "Gentlemen, there must be a God!" Did I mention he was in dead earnest?

One of Peirce's friends was Thomas Hill, president of Harvard for most of the 1860s. Hill's conversion to this way of thinking came while meditating on Euclid one lazy summer day: "Pondering geometrical postulates, the empiricist philosophy of Locke suddenly seemed faulty." Hill went on to write Geometry and Faith (1849), and his exuberance for the powers of spiritual uplift inherent in this discipline seemingly knew no bounds. He traced the decline and fall of the Roman Empire to its failure to appreciate the idealist perspective on geometry. Likewise, if a research university had been established in antebellum America to nurture original work in mathematics, the Civil War might have been averted. It was absurd to imagine that one could keep the teaching of religion out of state schools, he averred, because that would necessitate banning geometry. For good measure, Peirce fed Hill's monomania with the observation that, although Darwinism was true, natural theology would still be impregnable as long as there was geometry.

Figures such as Peirce need to be treated with respect, but Cohen has some fun with crasser attempts to marshal math for apologetic purposes. The clergyman Tresham Gregg, for example, wrote a book subtitled, Thoughts on the Nature of the Differential Calculus, and on the Application of Its Principles to Metaphysics, with a View to the Attainment of Demonstration and Certainty in Moral, Political, and Ecclesiastical Affairs. In it, he asked such probing questions as: "What moral truth does the truth, that a number multiplied by its inverse is always equal to unity, illustrate?" Oliver Byrne, professor of mathematics at the College of Civil Engineers, London, penned The Creed of Saint Athanasius Proved by a Mathematical Parallel. This historic creed, the reader may recall, affirms that the Father is God, the Son is God, and the Holy Spirit is God, and yet there are not three Gods but one God. The burden of Byrne's mathematical proof is that infinity + infinity + infinity = infinity.
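Byrne's "parallel" rests on the peculiar arithmetic of infinity, where adding infinities yields the same infinity. As a modern aside of my own (not Byrne's, who of course predates computers), even IEEE floating-point arithmetic encodes this convention:

```python
# Byrne's arithmetic of infinity, as echoed in IEEE-754 floating point:
# adding infinities yields the same single infinity.
inf = float("inf")

print(inf + inf + inf == inf)  # True: "three infinities" are still one infinity
```

Whether this settles anything about the Trinity is, naturally, another matter.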

Alas, if not guilty of monomania, Cohen himself seems to be in the grip of a tidy interpretive scheme that ignores inconvenient counter-evidence. So, for example, although John William Colenso, bishop of Natal, makes it into Equations from God as a beloved liberal theologian, Cohen fails to mention that Colenso was also a mathematician. Before he was a bishop, Colenso was a fellow of St John's College, Cambridge, mathematics tutor at Harrow, and the author of popular textbooks on arithmetic and algebra. More to the point, Colenso's attempts to use mathematics in order to debunk traditional Christian beliefs were no less risible than Byrne's efforts in the opposite direction. Including such evidence would have complicated Cohen's tale.

There are other oddities. Cohen writes, "Students at his school in the late 1830s recalled that [George] Boole had drifted from the Church of England, reading from the Greek Bible rather than the authorized Church of England version." This piece of information is so incomprehensible it must be garbled. The only sense in which a translation was authorized was for the public reading of Scripture during an Anglican church service, and—so far from competing with the Greek original—the whole point of authorizing a translation was to ensure that it accurately reflected it.

Tellingly, Cohen insists in an endnote that the Bible teaches that the value of π is precisely 3, despite conceding that not a single person committed to the truthfulness of Scripture has ever been troubled by this alleged fact: "Biblical literalists never objected to the advanced mathematical conclusions about π."

Cohen accurately and usefully reveals the way that professional mathematicians drove out the amateurs and secured the discipline as their own exclusive preserve. They banished puzzle-solving, relocated discussions to dry and technical journals, and generally attempted to make mathematics appear as boring as possible to the wider public. In this endeavor we must credit them with some enduring success.

Bloody-minded amateurs get the beating they deserve in Equations from God. A group of them refused to accept that π is transcendental and thus that there is no circle-squaring solution. James Smith, for example, "cut pieces of cardboard and copper into circles and squares of different sizes, weighed them, and compared the sums." He announced that π was exactly 3 1/16. Again, in order to make his narrative neater, Cohen asserts that these amateurs were motivated by a conservative religious perspective. The evidence given in his own account, however, indicates the opposite: from what one reads here, it seems clear that for figures such as James Smith, mathematics actually served as a substitute religion. Their tone-deaf, common-sense instincts (for example, John A. Parker's insistence that "infinite" is a woolly-headed concept) are no less damning for traditional theological claims.
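It is worth pausing to note just how wrong Smith's cardboard-and-copper value was. A few lines of arithmetic (my own illustration, not Cohen's) show that the discrepancy is far too large to blame on measurement error:

```python
import math
from fractions import Fraction

# James Smith's claimed value of pi, as reported in the review,
# versus the true value.
smith = Fraction(3) + Fraction(1, 16)   # 3 1/16 = 3.0625

error = abs(float(smith) - math.pi)
print(f"Smith: {float(smith)}, pi: {math.pi:.5f}, error: {error:.5f}")
# An error of roughly 0.08 -- about 2.5 percent -- easily detectable
# even with the crude instruments of Smith's day.
```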

Following a standard Victorian loss-of-faith narrative arc, Cohen's book ends with the complete defeat of an idealist or Platonic notion of mathematics. In keeping with this version of events, the last word is given to the atheist philosopher and mathematician Bertrand Russell. It is therefore rather startling to pick up Mario Livio's Is God A Mathematician? Livio, an astrophysicist working for the Hubble Space Telescope Science Institute, makes it clear that the Platonic view is very much a live one among contemporary mathematicians. His book explores the question of whether math is discovered (and thus reveals, as it were, divine thoughts—the Platonic view) or is merely a human construct (the formalist view), while at the same time offering a lively account of the history of the discipline that explains mathematical concepts clearly and cogently for non-specialists.

Here is Livio on the beginning of thinking about what we now call irrational numbers: "One of the Pythagoreans … managed to prove that the square root of two cannot be expressed as a ratio of any two whole numbers. In other words, even though we have an infinity of whole numbers to choose from, the search for two of them that give a ratio of √2 is doomed from the start." It could be the title of a new James Bond film: Infinity Is Not Enough. Or, as Buzz Lightyear would say: "To infinity and beyond!" (These cracks are my own, but if you do not like them, then you will probably find some of Livio's attempts to write for a popular audience irritating: he says our response to Newton's Principia should be "Wow!", illustrates the utility of probabilistic thinking with a reference to "gold diggers who marry for money," recurringly quotes Woody Allen, and much more.)
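The doomed search Livio describes can be illustrated, though of course not proved, by brute force. The following sketch (my own, under the assumption that a computer search is a fair stand-in for the Pythagoreans' patience) checks every denominator q up to a limit for a numerator p with p² = 2q², which is equivalent to p/q = √2:

```python
from math import isqrt

# A brute-force illustration (not a proof) of the Pythagorean discovery:
# no ratio p/q of whole numbers squares to exactly 2. Equivalently,
# p^2 = 2*q^2 has no solution in positive integers.
def has_rational_sqrt2(limit: int) -> bool:
    """Search all denominators q up to `limit` for a p with p*p == 2*q*q."""
    for q in range(1, limit + 1):
        p = isqrt(2 * q * q)          # the only candidate numerator worth checking
        if p * p == 2 * q * q:
            return True
    return False

print(has_rational_sqrt2(100_000))    # False: the search is doomed from the start
```

The actual proof, of course, is a two-line argument about even and odd numbers, not a search; the point of the sketch is only that "an infinity of whole numbers to choose from" does not help.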

Livio has a flair for enlivening and humanizing his narrative through personal anecdotes about great mathematicians. René Descartes, for example, was prompted into his intellectual journey through mystical dreams in which, among other scenes, "An old man appeared and attempted to present him with a melon from a foreign land." (Although psychoanalytical interpretation now reveals this to have been sexual in nature, Descartes innocently mistook it for a summons to the pursuit of knowledge through reason.) Readers may also be reassured to learn that tossing a coin is indeed a fair way to determine which team gets to choose whether or not it will kick off at the start of a football game. In an unhurried quest for empirical knowledge, the statistician Karl Pearson tossed one coin 24,000 times in a row. (It came up heads 12,012 times.)
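How reassuring is Pearson's result, exactly? A quick normal-approximation check (my gloss, not Livio's) shows that 12,012 heads in 24,000 tosses sits well within a single standard deviation of the expected 12,000:

```python
import math

# How surprising is Pearson's result -- 12,012 heads in 24,000 tosses
# of a presumably fair coin?
n, heads, p = 24_000, 12_012, 0.5

mean = n * p                        # expected heads: 12,000
sd = math.sqrt(n * p * (1 - p))     # standard deviation: about 77.5
z = (heads - mean) / sd             # deviation measured in standard deviations

print(f"z = {z:.3f}")               # roughly 0.15: comfortably fair
```

A deviation of a mere 12 heads against a standard deviation of about 77 is exactly what a fair coin should produce; the football captains may flip in peace.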

Although Cohen is certainly right that amateurs were deliberately and successfully squeezed out of the discipline, even this generalization has at least one exception. For most of the 20th century, mathematicians had accepted a table of 43 non-alternating knots of ten crossings. Then, in 1974, a New York lawyer, Kenneth Perko, discovered that two of them were identical while experimenting with ropes on his living room floor.

As to the question energizing the book, Livio quotes numerous 20th- and 21st-century mathematicians who believe in the Platonic view. In the folksy articulation of contemporary mathematician Martin Gardner: "If two dinosaurs joined two other dinosaurs in a clearing, there would be four there, even though no humans were around to observe it, and the beasts were too stupid to know it." Bolstering this conviction is what Nobel laureate physicist Eugene Wigner famously called "the unreasonable effectiveness of mathematics." If mathematics is merely a human construct, why do scientists keep discovering that nature conforms to these invented games? This is most strikingly brought home through the predictive powers of mathematics.

Livio is at his best demonstrating in breathless wonder how again and again, work in pure mathematics has subsequently found a practical application. In what was almost a self-parody of an academic, the mathematician G. H. Hardy boasted: "No discovery of mine has made, or is likely to make, directly or indirectly, for good or ill, the least difference to the amenity of the world." He spoke too soon: the Hardy-Weinberg law is now used by geneticists to study the evolution of populations.

At the end of the book, Mario Livio weighs in on the dispute to pronounce magisterially: "Our mathematics is a combination of inventions and discoveries." This rather pedestrian sic et non does not seem to warrant its excited italics. Nevertheless, Livio himself humbly and astutely observes that perhaps mathematicians are not the people best equipped to address the question. Moreover, whatever one thinks of Livio's own answer, his book is of considerable worth as a history of mathematics and as a clear introduction to mathematical ideas and philosophical perspectives on the discipline.

All that remains is for me anti-climactically to pronounce my own view: If two dinosaurs joined two other dinosaurs in a clearing, even if there were no humans to generate theological ideas, and the beasts were too stupid to discern it, there would still be the mind of God.

Timothy Larsen is McManis Professor of Christian Thought at Wheaton College. He is the author most recently of Crisis of Doubt: Honest Faith in Nineteenth Century England (Oxford Univ. Press), and is at work on a book about the Bible in the 19th century.
