By Langdon Winner
WE ARE TOLD that "it" looms before us as an irresistible force, a world-transforming dynamism that will eliminate our jobs, educate our children, revolutionize our families, erode our privacy and modify our genes. Faced with "it," there is no alternative, nothing left but to accept the inevitable and celebrate its coming. From now on "it" will determine what the future brings.
The "it" is, of course, technology. As the new millennium beckons, a dazzling array of books, news stories, advertisements and television specials boldly proclaims: Technology holds the key to human destiny.
Typical of this barrage is a recent issue of The New York Times Magazine on the theme "What Technology Is Doing to Us." The magazine assembled a dozen well-known writers to explain enthusiastically how technology accelerates the pulse of activity, dominates our personal habits, reshapes the social order and fosters exotic dreams of transcendence.
The Times' musings - which led off with an essay glibly proclaiming that "Technology Is Making Us Better" - merely update what is now a cottage industry in cheerful soothsaying. It's a tradition as old as the internal-combustion engine. During the past two centuries, students of the social "impacts" of technological change have been drawn to ideas of determinism and fatalism. But never, in my experience, has the pungency of such beliefs been as strong as we see today. From the tacky neon pages of Wired magazine to fawning features in the Sunday papers, we find wholehearted embrace of the notion that a technology-driven universe is at hand and that any hope for reasonable human intervention is beside the point. This willful fatalism even permeates examinations of technology's darker side.
Consider the growing stream of reports about threats to personal privacy posed by emerging systems of digital information. America's leading news magazines have been warning readers for some time about the ways that workplace surveillance, on-line monitoring and the built-in tracking abilities of electronic networks generate data trails that provide distant organizations access to the most intimate details of everybody's life history. While these stories sometimes offer advice on how clever consumers can achieve partial shelter from the onslaught, they typically assume that privacy-wrecking electronics are now so deeply entrenched that systematic remedies are unthinkable.
In "No Place to Hide," a chilling sketch of surveillance techniques in Forbes, reporter Ann Marsh worries that information networks "will bring on Orwell's 1984, making us all slaves of the state." Does this mean we need new laws and vigorous citizen action to resist the spreading menace? Heavens, no. Marsh concludes that "the damned thing is practically here. Let the chips fall where they may."
The belief that technological innovation renders human choice irrelevant is directly contradicted by those who study the history and sociology of technology. When scholars open the black box of technological innovation, they find social, cultural and political choices through and through. If one looks closely enough, the creation of hardware, software and large-scale technical systems is never simply a matter of invention and application, but of complex negotiations and sometimes fierce conflicts among competing groups. Choices that affect the distribution of wealth and power in society are intricately woven into the very substance of technical design, right down to the last pipe fitting, circuit breaker and computer chip. In the early decades of this century, for example, urban trolleys abruptly vanished as Detroit companies bought and dismantled the lines, making way for the empire of the automobile.
On occasion, battles over these choices erupt in public view, as for example in the Justice Department suit against Microsoft for the company's ingenious scheme to impose its Internet Explorer on all computer users. More often, however, the politics of technology remains concealed in the details of the blueprints, programs and corporate plans for devices people come to regard as "neutral" and "inevitable."
But recognizing that technological development is a vast arena of social choice is no guarantee that important choices are readily available. In fact, technology developers (corporations for the most part) invest a great deal of effort in advertising, public relations and organizational arm-twisting to prevent seemingly insignificant technical decisions from exploding as social controversies. Those who tell us that the future is foreordained are, in effect, asking that we give up our role in what could be some lively debates, that we take no part in decisions that blend society and technology together in new patterns. No, don't fret about the giveaway of a public treasure, the telecommunications bandwidth, to narrow self-interested firms. Suggested instead is the Rip Van Winkle approach: just go to sleep and we'll wake you when it's over.
At present the energetic sales pitch for Van Winkle-ism appears to be working just fine. Large segments of our population have been lulled into thinking that technology simply wafts down to us out of some celestial fount. We need only disturb our slumber long enough to acquire the appropriate hardware and to master new forms of social deference to the grand inevitability of it all.
One does not have to look far to see this somnambulism at work. Many parents, for example, approach computers in the schools with an almost dreamy passivity. They are told computers and the Internet are necessary for their children's well-being and must be inserted into every classroom. But few bother to ask an obvious question: Exactly how and when are computers distinctly useful as compared to other tools of learning? If parents looked back at earlier, ill-fated "technological revolutions" in the schools (including the great leap forward with computers in the 1980s) they might temper their enthusiasm and begin making intelligent demands to counter what computer vendors want to sell. They might open the broom closets at our elementary schools, filled with the wreckage of Apple IIe's and Logo programs, and ask teachers, "If these devices were as marvelous as advertised, why is there still a crisis in education?"
There are many other ways in which misplaced fascination with "the inevitable next step" in technical change distracts us from squarely facing what the future holds. Today's neatly packaged innovations in computer networking, office automation, factory production, telecommunications, reproductive technology and genetic engineering contain all the moral dilemmas and political choices that have ever engaged philosophers, statesmen and ordinary working people. Will the world we are making be better than the one we have known before? Will it secure our freedom or curtail it? Will it enlarge social justice or limit it? Will it protect the biosphere or further assault it?
Clearly, questions of this magnitude suggest the need for a new vision of the human prospect and for voices able to articulate it. Alas, if one listens to what our leaders in the White House, Congress, business, academia and the media are saying about technology and humanity, there is no compelling, positive vision whatsoever. No one seems willing to imagine technologies that might strengthen local communities, revitalize democratic politics, eradicate chronic urban poverty and encourage environmentally sound means of production around the globe. Instead, our policy elites peddle lists of gadgets, gizmos and trends - along with trivial exhortations about how people will "have to change." From President Bill Clinton and House Speaker Newt Gingrich all the way to your local TV anchorperson, technomania has become the dominant fin-de-siècle myth. So what we call "technology" may, in an oddly unintended way, indeed become our destiny, by weakening public imagination and our desire to make choices at all.
Copyright 1997, Newsday Inc.
CULTURE WATCH / How Technomania Is Overtaking the Millennium, 11-23-1997