A Cultural Ecology of Nanotechnology
Bonnie A. Nardi
Agilent Laboratories
Agilent Technologies
Palo Alto, California
bonnie_nardi@agilent.com

Radically new technologies imply radically new social issues and opportunities. I propose a cultural ecology of nanotechnology in which we find ways to infuse technological development with deeper, more thoughtful, and wide-ranging discussions of the social purposes of technology. I chose the ecology metaphor to signify the integration of science and society and to draw attention to the interdependencies characteristic of ecologies. (See Nardi and O'Day, 1999, on information ecologies.)

As part of a nanotechnology initiative, I would like to see a new science of cost-benefit analysis in which issues of ethics and social responsibility as they relate to technology can be rethought in radically new ways. Perhaps the term “cost-benefit analysis” should be replaced, as it connotes narrowly economic considerations to many. We need to channel energy into the invention of a holistic process of technological development within which we can entertain questions of the human purposes and benefits of technology. The thrust of such an effort would not be prediction or a simplistic notion of managed change, but the development of a new way of approaching our relationship to technology.
 
In a cultural ecology of nanotechnology, we would take seriously the promises of nanotechnology, such as cleaner manufacturing, decreased waste, and marvelous medical devices. We would put socially beneficial technologies at the top of the research list. We would find new ways to distribute technologies such as medical devices equitably, and we would encourage (somehow) companies to use safer technologies, such as, say, nanotech tires that don't fray. Whatever our social desiderata, we would find ways to fuse them to the development and deployment of new technology.

For me, such a process of designing ecologies of technology is desirable because I am not as optimistic as others that technological development always comes out for the best in the long run. Sometimes it does and sometimes it does not. I feel we are currently paying too high a price in pollution, noise, traffic congestion, loss of nature, and lack of safety in our technologies. These poor outcomes are a result of the characteristics of specific technologies coupled with the ways we use the technologies. Traffic gridlock could probably never have happened without the electric self-starter; it is important to remember that specific technologies do matter. On the use side, we needn't have had gridlock had we planned our transportation system differently, insisting on a diverse system of public and private vehicles. If we had gone with the electric vehicles that were coming to market in the early 20th century, air pollution would be much less of a problem.

In a cultural ecology of technology, the relationships between attributes of specific technologies and the ways we actually use the technologies are a key focus of interest. In a cultural ecology of nanotechnology, I'd like to see discussion of how to embed the new technologies into society in radically new, socially beneficial ways. We need to have discussions, for example, about the social contradictions inherent in some of the nanotech promises. How do longer-lasting, more durable products square with our current economic system, which is based on not-so-durable products that must be replaced often to increase profits for manufacturers? Nanotechnology promises tires that don't fray, but we have technologies for safer vehicles now that corporations have sometimes chosen not to use. We know how to manufacture less wastefully, but often we don't do it because it reduces profit. In a cultural ecology of technology, such concerns would be a serious topic of discussion and focus of creativity.

The government documents promoting nanotechnology that I have read make no mention of the risks of nanotechnology. Are there none? What if we had had cultural ecologies of technology a hundred years ago, and had thought through the implications of having millions of internal combustion engines on the road? Kettering invented the electric self-starter in 1911. That would have been a great time to undertake a serious envisioning task in which we would have imagined everyone having a car, however fantastic it might have seemed at the time.

Techniques of Envisionment
A key activity of cultural ecologies of technology is envisionment. Techniques for examining multiple possible scenarios exist now and could be expanded and developed. Could we have envisioned millennial Los Angeles in 1911? Probably not. We didn't know how. We still don't. But we should learn how.

Actually, we are already envisioning, but doing a poor job of it. We do not hesitate to conjure wondrous benefits of technology, or, less often, to forecast dystopian visions of technological annihilation. Neither is usually realistic. We need to create new processes to envision both benefits and risks of technology and the relations between them. This endeavor in itself is an area for technical creativity. That we have been bad at predicting the future in the past is no reason to avoid this critical task now. If we can talk about creating self-replicating machines out of atoms, we can talk about new techniques for envisioning the consequences of technology. There is no reason we cannot apply our sociological and scientific imaginations to assessing the benefits and risks of technology.

I have noticed that proponents of new technologies often follow a two-part logic in advocating for the development of their technology, however risky it might seem to others. The first part of the argument is the confident prediction of great new benefits. For nanotechnology, we have faster computer chips, very high resolution printers, compact high-volume data storage devices, and new medical technologies such as tiny probes, sensors, drug delivery devices, and ways to regenerate bone and tissue. The second part of the argument, should we pose questions about the potential risks of the technology, is that we don't know how to predict where technology is going. When we stop to ponder potential, even likely, risks of technologies, derisive stories of our poor record of prediction in the past are trotted out.

This duality--prediction of fantastic benefits coupled with an assertion that we cannot predict outcomes--is an unhealthy, illogical combination. There's something wrong when the prediction can only be on one side, when we are promised benefits but not allowed to assess risks. Lured by the promises, in which "objective scientific facts" are often invoked as part of the rhetoric of prediction, we go forward, leaving ourselves powerless to envision and prevent negative consequences.

The Russian psychologist Vladimir Zinchenko posits something he calls "the ideal form" as a crucial aspect of human social and psychological development (Zinchenko, 1996). The ideal form is where we want to be. Techniques of envisionment are not simply simulations of predicted outcomes, because they contain an element of social purpose. A cultural ecology of technology envisions ideal forms grounded in realistic assumptions, and suggests desirable paths where choices can be made.

Design for Co-evolution
A second thing to work out in a cultural ecology is design for co-evolution. To do this, we would give ourselves ample time for discussing and designing how a social and a technical process could co-evolve gracefully and proactively. Possible venues for such discussions are workshops and programs sponsored by agencies such as the National Science Foundation.

The notion of design for co-evolution resonates with the idea of co-evolution in Brown and Duguid's response to Bill Joy's Wired article on the dangers of nanotechnology. Brown and Duguid point out the strong social influences on technological development. However, I would like to propose that we design and implement a socio-scientific cycle quite different from Brown and Duguid's, which is largely reactive. Our current model of technological development is full speed ahead, and then slam on the brakes when we get scared. Brown and Duguid provide examples of the application of post hoc corrective measures with technologies such as nuclear power. As they point out, it took “the environmental movement, anti-nuclear protests, concerned scientists, worried neighbors of Chernobyl and Three Mile Island, and NIMBY corporate shareholder rebellions to slow the nuclear juggernaut to a manageable crawl.”

Good grief, do we have to call out the scientists, investors, tree huggers, and little old ladies in tennis shoes every time? With nanotechnology, genetically modified foods, cloning, and other technologies with global implications looming, we need a better process. In a cultural ecology of technology, we would be proactive about technological development, not reactive. We would shape technology for our own collectively defined purposes and not confine ourselves to mobilizing to slow dangerous or undesirable juggernauts. And while nuclear power is a hopeful example of co-evolution, it also required the sacrifice of thousands of lives and continuing ill health for many thousands more.

While we have successfully held some technologies at bay, others, such as the automobile, are out of control. We don't have a healthy ecology for the automobile. Breathing exhaust and spending one's short existence crawling along the freeway is hardly a gift from the gods. It's difficult to say what will happen with genetically modified foods. Despite protests, the brake has not been applied. Half the soybeans in the U.S. and a third of the corn are grown from genetically modified seed. We do not know what ecological or economic effects this will have in the medium to long run.

Less threatening, but still a kind of daily water torture, are technologies such as phone menus that leave people feeling frustrated and diminished. The automated voice response system my bank uses employs poor concatenation technology, stringing together individually recorded numbers to produce a singsong, barely comprehensible voice response to simple requests for account information. These voice systems are especially difficult for people who are hard of hearing or not native speakers of English. Our ecologies often have a monocultural character, serving the single need of profit or so-called efficiency. (I sometimes wonder whose efficiency is served as I fight my way through a maze of key presses in a phone menu.)

Zinchenko (1996) suggests that two of our most human attributes are creativity and the ability to resist. Brown and Duguid are betting on resistance. This is a time-tested strategy, and one advocated by some of the best minds of our time, such as Michel Foucault and Jacques Ellul. But with the rapid pace of technological change, resistance may no longer be sufficient. It's looking to me like Star Trek was right: "Resistance is futile." By the time we mobilize to resist, a lot of damage may have been done. We need to anticipate and plan. My suggestion is to apply human creativity to the problem of designing our technologies in a process that marries the social and the scientific, that treats technology systematically, ecologically. I believe we can draw on deep wells of creativity that we have not yet tapped to do this.

Brown and Duguid suggest that Bill Joy's concerns about nanotechnology are distorted by technological tunnel vision. But Joy is far from oblivious to the social. He situates the development of nanotechnology very realistically as a product of "global capitalism and its manifold financial incentives and competitive pressures." Capitalism is a social force more powerful than the "government, courts, formal and informal organizations, social movements, professional networks, local communities, and market institutions" enumerated by Brown and Duguid. Indeed, many of these social forms are deeply implicated in capitalism, not outside of it. Forces that can "redirect the raw power of technologies," as Brown and Duguid say, come up against the "manifold financial and competitive pressures" of which Joy speaks. Some parts of government and certain social movements do exist as reactive forces trying to slow and restrain rampant capitalism. However, I believe such forces should be primary generative stimuli of planned societal progress, not catch-up rearguard actions. Joy's fear of "undirected growth" is one to take seriously.

Politically experienced people probably find the idea of creating a new social process that intimately links society and science naive and unwieldy. But if we can manufacture devices where a billionth of a meter is a meaningful measure, there is the possibility that we can shape our social processes in just as radical a way.

References
Brown, J. S. and Duguid, P. This volume.

Ellul, J. 1964. The Technological Society. New York: Alfred A. Knopf. (First published 1954.)

Foucault, M. 1980. Power/Knowledge: Selected Interviews and Other Writings 1972-1977. New York: Pantheon Books.

Joy, W. 2000. Why the Future Doesn't Need Us. Wired, April.

Nardi, B. and O'Day, V. 1999. Information Ecologies: Using Technology with Heart. Cambridge, MA: MIT Press.

Zinchenko, V. 1996. Developing Activity Theory. In B. Nardi, ed., Context and Consciousness: Activity Theory and Human-Computer Interaction. Cambridge, MA: MIT Press.