Article

David N. Livingstone


Machines and Us

It is 5:30 a.m. I am struggling to make it to the airport for a 7:00 a.m. flight to London. The first news program of the day accompanies my last-minute preparations. There is a special feature on a new plague that is threatening much of the fabric of Western life. It's not a medical problem, however; it's a computer menace: the Millennium Bug. Many of the computers on which our banking, insurance, health, and other vital institutions depend are not prepared to cope with New Year's Day 2000. They don't know how to shift the date beyond 1999; they were not programmed to deal with all those zeros, and if they can't, system failure of massive proportions will ensue. The hunt for a solution is on, and time is running out.
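The failure the newscast describes can be sketched in a few lines. The snippet below is purely illustrative, not drawn from any actual banking system: it assumes the classic two-digit-year storage that left many programs unable to carry dates past 1999.

```python
# A miniature Millennium Bug: legacy systems often stored years as two
# digits, so "00" could not be told apart from 1900's "00", and simple
# date arithmetic broke at the rollover.
def age_two_digit(birth_yy, current_yy):
    """Age computed from two-digit years, as many legacy programs did."""
    return current_yy - birth_yy

# Through 1999 the shortcut works: born in '65, checked in '99.
assert age_two_digit(65, 99) == 34

# On New Year's Day 2000 the current year rolls over to "00",
# and the same arithmetic yields a nonsensical negative age.
assert age_two_digit(65, 0) == -65
```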

This bit of apocalyptic whimsy reminds me how pervasive is technology's presence in our world and--more to the point--that the particular bits of machinery that should soon keep me several miles aloft are not fail-safe. It reminds me too of colleagues who lost their lives on January 8, 1989, returning from the same conference I had attended, when a British Midland jet came down near Kegworth on a flight from Heathrow to Belfast. The interface between people and machines, between the mortal and the mechanical, can be all fun and frivolity; it can also be lethal and lamentable.

The essays that Donald MacKenzie has drawn together in his most recent book, Knowing Machines, are animated by a concern to take stock of the diverse ways in which human beings and machines interact with one another. These essays continue the project on the social construction of scientific knowledge with which MacKenzie has been engaged for some two decades. (His earlier work includes an influential sociological reading of British statistics, connecting its modern history with eugenics, and a study of accuracy in the nuclear-missile industry.) He has drawn inspiration from those like Barry Barnes, David Bloor, Harry Collins, Bruno Latour, Steven Shapin, and Steve Woolgar, who have pressed sociological analysis into the very cognitive content of scientific claims. Mapping the putative boundary line between science and ideology is thus a cartographic venture on which MacKenzie remains unwilling to embark.

Given that MacKenzie is a sociologist at the Science Studies Unit of the University of Edinburgh, it is perhaps understandable that his inaugural cut at the problem of technology is via a re-examination of Karl Marx's take on the subject. Despite the collapse of nearly all the state regimes of Marxist inspiration, MacKenzie maintains that Marx's credo that "capital is not a thing, but a social relation between persons which is mediated through things" still has much to commend it. His account of "Marx and the Machine" predictably contains a certain amount of ritual intramural jousting with other commentators on whether Marx was a technological determinist, on social organization as the engine power of technical change, and on class conflict within capitalism as fundamentally a struggle between worker and machine. Whatever sense of irresolution may linger on these issues, however, he remains sure that Marx's insistence on the essentially political nature of technological history is a suspicion worthy of retrieval.

In a second preliminary essay before the case studies that make up the bulk of the book, MacKenzie reflects on the tensions between economic and sociological accounts of technological change. Explanations relying on neoclassical economics he finds unpersuasive, since the profit-maximization thesis presumed to propel production technology is empirically faulty. Interpretations assuming a natural trajectory in technological history--frequently drawing conceptual sustenance from biological analogies--are no more convincing. A more reliable guide to making sense of the persistent patterns of technical change, and one that has the added attraction of mediating between the economic and the sociological, is what MacKenzie calls "ethnoaccountancy."

Ethnoaccountancy is a concept analogous to ethnomusicology or ethnobotany, terms that refer to the actual ways in which societies produce music or classify plants. By analogy, ethnoaccountancy would attend to how accounting really gets done, not to how it should be practiced or to what its theoretical principles lay down. MacKenzie mobilizes this idea to good effect in its application to technological decision-making; financial assessments of new technology turn out to be different in the United Kingdom, the United States, and Japan for the simple reason that what passes as "profit" is geographically variant. And profit is different because differing accounting practices are operative.

There is, I think, something altogether right-headed about this move; and surely there are implications of wider dimensions. Maybe our seminaries should devote a little more time to the study of ethnoecclesiology, ethnoethics, and ethnotheology. It's fine to elaborate ideal structures, moral principles, and conceptual frameworks. But there could be some advantage in taking a good hard look at, for example, how middle-class churches really resolve conflicts, or how tough moral choices are made in hospital wards, or the way theologies actually get made in sectarian societies. In my view, the "ethno" turn has much to commend it.

The chapters that follow illustrate something of the lineaments of such an expansive program. Not that we find here any simple coherence, sure-shot historical method, or universal model. To the contrary, the argument is elaborated on a case-by-case basis--messy, to be sure, but cumulatively impressive for that very reason. For after all, MacKenzie is convinced of the fundamentally contingent nature of technological change, and its history is therefore bound to subvert neat, unilinear schematizations. Besides this, to a lay reader like myself, the chapters look technically impressive, showing an intimate grasp of key mathematical concepts, engineering practices, computerspeak, and the ins and outs of laboratory arenas.

MacKenzie's scrutiny of the adoption of the laser gyroscope in aircraft navigation, for example, reveals that its establishment as the dominant technology in the industry was in large measure the outcome of a self-fulfilling prophecy rather than any intrinsic superiority over competing systems. With impressive technical detail and conceptual mastery, MacKenzie rehearses the history of the laser gyroscope's evolution, from nineteenth-century experiments to establish the existence of ether to its incorporation in the Boeing 757, but such matters are not presented in isolation from market forces, the personal prejudices of corporation vice presidents, and PR strategies. Indeed, Boeing's key role in the laser-gyro revolution, according to MacKenzie, had as much to do with the image of high-tech glamor it wanted to project and the vicissitudes of the civilian and military aviation markets as with technical matters of high accuracy or reliability. Technological revolutions are never simply about technological superiority. In sum, the story exemplifies what Bruno Latour dubs "technoscience": a networked amalgam of science, technology, social process, and social interest.

The same is true of the evolution of supercomputing, which had its origins in the imperatives of World War II and came to maturity during the Cold War. The narrative that MacKenzie weaves here takes him to the nuclear-weapons laboratories at Los Alamos and Livermore and is intended to show how "powerful organizations have shaped the technology of supercomputing as well as being its primary market." The reciprocal connections between the technological and the social are thereby again laid bare.

The "social shaping of technology" is revealed from yet another angle in the case of "The Charismatic Engineer," Seymour Cray, whose career MacKenzie (along with Boelie Elzen) considers in a separate chapter. As depicted in a 1990 Business Week cover story titled simply "The Genius," Cray, whose name became synonymous with "supercomputer," would seem to be the quintessential "rugged American individualist." In their account of his career, MacKenzie and Elzen pay full tribute to Cray's charisma. They note how his secrecy ("his very occasional 'public' appearances" were restricted "to carefully selected audiences, usually made up largely of technical specialists from current or potential customers") has fostered a mystique:

Around the privacy, anecdotes proliferate. Cray has become a legend, a myth, a symbol. Tales (many no doubt apocryphal) of his doings and sayings are told and retold. Display boards of these sayings accompany the exhibition devoted to Cray at Boston's Computer Museum. Rigorously rationed as they are, Cray's pronouncements take on an exceptional significance.

As MacKenzie and Elzen show, the image of the rugged individualist is finally misleading, for the "charisma of Cray was the product of a network of relationships that stretched far beyond the man and his brain." Indeed, the story of Cray, who repeatedly cut loose from the successful organizations he created in order to begin afresh, perfectly exemplifies what MacKenzie and Elzen call "the dialectic of charisma":

If a network is to grow and survive (more machines, more sales, a growing firm; an empire; a church), its links must multiply, expand, and solidify. Not only are more actors involved, but also many more specialist functions, often far removed from the skills of the leader. However entrenched the image of the charismatic leader's authorship of everything, strains in the opposite direction develop.

Even the seemingly abstract and disengaged world of mathematical proof, according to MacKenzie, is not insulated from social forces. When computer systems fail with catastrophic results, for example, questions about design accuracy suddenly come to the fore. Given the litigious environment of modern society, computer scientists increasingly need to come to terms with what counts as a formal proof that a system is correct. One such case involved the microprocessor called VIPER (verifiable integrated processor for enhanced reliability), developed by the U.K. Ministry of Defence. Its design was marketed as enjoying the status of mathematical proof. The claim ended up heading for the High Court, with the Ministry of Defence contesting the charge that its talk of mathematical proof was a misrepresentation. The case never got to court for one reason or another, but MacKenzie reflects on just what would happen if lawyers and judges were asked to rule on what constitutes a mathematical proof, for it is clear that the different sides in the dispute operated with quite different expectations of what a proof amounts to. MacKenzie thus probes and problematizes the seeming innocence of mathematical proof and displays the negotiated character of what passes for it. He draws the inevitable conclusion: we need a far better understanding of "the sociology of proof."

In the relationship between humans and machines, the litigational is one thing, the lethal quite another. Consider in this connection MacKenzie's account of computer-related accidental death. In an altogether fascinating exploration, he reviews--insofar as methodological quandary and data availability allow--the issue of fatal accidents involving computer systems, and provides an unparalleled inventory of some 1,100 cases. These arise from a variety of causes--software error, design failure, and operator misjudgment--and range from medical accidents and military incidents to air disasters and deaths from automated industrial equipment. All these press him to the conclusion that, in efforts to address matters of safety, it is necessary to move beyond issues of technical accuracy to the organizational aspects of real-world operation. In the middle of this investigation, moreover, he makes the provocative observation--merely as an aside--that compared with such professions as medicine and law, the computer world has been far more diligent in self-monitoring and publicizing the failures of its own profession. That comment hints at something profound in the moral economy of the professions.

The nuclear industry is MacKenzie's final port of call in this sustained sociological interrogation of modern technology. In a chapter coauthored with Graham Spinardi and enticingly entitled "Tacit Knowledge and the Uninvention of Nuclear Weapons," the tensions between the explicit and the tacit are exploited to the full. The whole argument is premised on the conviction that, unlike explicit knowledge, which is recorded in a form that can readily be retrieved, tacit knowledge can be lost.

It's a bit like thatching, I guess: the craft almost died out some years ago because there were virtually no master thatchers left to pass on the art to succeeding generations. Now, according to our authors, the same sort of thing just could happen to the nuclear-weapons industry--which, it turns out, is far more dependent on tacit knowledge than we might imagine.

If specific, local knowledge was indeed crucial to the design and production of nuclear weapons, then their "uninvention"--by accident or design--would be at least a theoretical possibility. MacKenzie and Spinardi pursue this line of investigation with much vigor, reviewing the history of the industry, interviewing key players in the drama, and surveying the difficulties that various states experienced in their attempt to go nuclear. In the latter case, the authors reckon, explicit descriptions of the construction of earlier atomic bombs were never enough to guarantee success, and so the attainment of atomic competency would be better described as independent re-invention rather than mere reproduction. Copy-catting, it is clear, was never the simple task it seemed. In every case, such tacit factors as expert judgment, lengthy apprenticeship, pragmatic inventiveness, hands-on experience, and a feel for the job were decisive. As one official put it, having "a cookbook design doesn't mean you can make a cake on the first try."

Whether or not MacKenzie and Spinardi's account--not to mention their utopian hope--is persuasive, there are important implications to be drawn from the general tenor of their analysis. If how-to books don't really do the job in the high-tech world of the nuclear-weapons industry, where the meticulous following of regulations is supposed to produce the goods, then we can hardly expect objectivist do-it-yourself routes to the contemplative life, quick-fix guides to psychotherapy, and five-point plans to spirituality to be very effective. Ironically, if MacKenzie is to be believed, it is in the realm of technology that there is a growing realization of the central pedagogic significance of an old craft--apprenticeship. Here, working with a master, as Leonardo da Vinci himself recalled, one learns far more than the teacher ever knows he or she is conveying. Elsewhere it is called discipleship. Maybe that's why Jesus left no guidebook or systematic theology; instead, he apprenticed followers.

It is doubtless often true that mechanization is intended to reduce labor time and thereby increase capitalist accumulation. No doubt it is also true that "from the viewpoint of the worker, the machine is thus a direct threat" since it is "capital's material mode of existence." And it can further be conceded that a machine culture imposes new disciplinary regimes on the labor force that can be manipulative, managerial, and--indeed--mechanistic. Such may well be the case. But this can hardly be the whole story. For one thing, the absence of latter-day Luddites is surely noteworthy. Perhaps the sheer entertainment value of the ever-smaller screen has bred a society of technophiles that--even if careless of contemporary philosophy--instantiates Richard Rorty's postmodern vision of playfulness as the highest value. For another, economically reductionist accounts of technology tout court are bound to founder on the rock of hard particularity--such as the forgotten fact that the washing machine, as a labor-saving device, was invented by the Shakers to leave more time for prayer!

David N. Livingstone is professor of geography at the Queen's University of Belfast. He is the author of several books, including Nathaniel Southgate Shaler and the Culture of American Science, Darwin's Forgotten Defenders, and The Geographical Tradition.

Copyright(c) 1997 by the author or Christianity Today, Inc./Books & Culture Magazine. For reprint information call 630-260-6200 or e-mail BCedit@aol.com.

Sep/Oct 1997, Vol. 3, No. 5, Page 27

