Those of us who raise the question of the social responsibility of science are in an uncomfortable position. It’s considered quite bad form in our society to blow the whistle on any activity for which money can be paid. Nevertheless some of us feel even more uncomfortable at the thought that the scientific work which is central to our lives may sometimes do more harm than good.
One kind of objection we encounter is: ‘What are you, anti-science? anti-progress? Science is knowledge, knowledge is power. How could you be against knowing more?’
Another kind of objection is: ‘You can’t reverse progress. It might have been better not to discover nuclear fission, but you can’t undiscover it.’
A third objection takes the form: ‘There’s no such thing as responsibility in scientific research, because when you do the research you can’t foresee what its uses may be.’
Then there is the sort that goes: ‘You have no right to censor science anyway. What are we scientists? Just the employees who do the work. Decisions on what to do with the results of scientific work are made by society at large; it would be arrogance for us to claim exclusive rights to them just because we have this special role of generating the ideas.’
Finally, though not often stated openly, there is this — which may be the most important of all: ‘Those of you who don’t think you can conscientiously do scientific work certainly have a right to freedom of conscience. No need to mount campaigns, just vote with your feet. You can just change your field of science, or even change to non-technical employment.’
Indeed, in response to this last type of objection, many of my colleagues and students have switched from one ‘normal’ position to another to reduce their involvement with destructive technology or increase the constructive utility of their work. And yet I wouldn’t accept the notion that responsibility in science should mean only appeals to individuals to opt out. That would be an artificial limitation.
As an alternative approach (see also the Bulletin of November 1989), an organization called the Committee for Responsible Genetics, led by the MIT biologist Jonathan King among others, has been circulating this statement internationally:
We, the undersigned biologists and chemists, oppose the use of our research for military purposes. Rapid advances in biotechnology have catalyzed a growing interest by the military in many countries in chemical and biological weapons and in the possible development of new and novel chemical and biological warfare agents. We are concerned that this may lead to another arms race. We believe that biomedical research should support rather than threaten life. Therefore, WE PLEDGE not to engage knowingly in research and teaching that will further the development of chemical and biological warfare agents.
Bearing in mind the Hippocratic Oath traditionally taken by medical doctors, we might put such statements in broader terms. If physicians state their obligation to use their specialty only for the good of humanity, why not other professions?
Here’s a simpler one, offered as a graduation pledge originally at Humboldt State in California in 1987, and subscribed to since then by large contingents at graduations there and at many other universities.
I, … pledge to thoroughly investigate and take into account the social and environmental consequences of any job opportunity I consider.
What is really meant by such pledges? Certainly, it may not be enough for those who feel that way to move to a different job. The reason we raise the issue in scientific societies, many members of which may already be doing clearly constructive work, and in graduating classes of students, and in general audiences, is that science and technology are social products. The technology of Zyklon-B for the Nazis’ gas chambers, or of binary nerve gas for today’s weapons, is a product of scientific lore built up by an intellectual community. The social responsibility of the biological scientist is not merely to see that somebody else’s name, rather than one’s own, is attached to the job! Just as the Hippocratic Oath should make each doctor repudiate Nazi-style experimentation on human subjects by all doctors, biological responsibility should mean that each biologist refrains from misuse of the science and gets others to refrain too. Responsibility should be applied collectively.
Unrealistic — sure. The level of mutuality I’m imagining here is unattainable now. The vision is of a process of deepening a community code of ethics over many stages. As the need is felt more widely, it can happen. Right now we see medical ethics being reworked, with great attention from thousands of specialists. Scientific and engineering ethics can be developed the same way: publicly, and world-wide. Only it is lagging way behind.
To this point, I’ve been speaking as if it were typically easy to see the difference between healthy and noxious science. As if the only thing lacking were good will and honesty. No, the big problems are really problematic. Answers aren’t clear. And even when hard work makes them clear, there may still be battles to get the ethical thing done.
Sometimes those of us who are sounding wake-up calls, following the example of Rachel Carson, give the impression that once we all wake up the way will be plain. We do that by emphasizing a glaring incongruity, focusing on it so everyone will see it is serious, at the cost of making it simple — whereas really its complexity is part of what makes it so serious.
One way to bring order into a confusingly complex problem is ‘cost-benefit analysis.’ Analysts try to weigh the power to be drawn from the Aswan High Dam or the proposed Sardar Sarovar Dam against the damage caused by flooding of farmland upstream, loss of silting downstream, and destruction of river and sea ecosystems. In the case of British Columbia, they weigh the value of the aluminum smelted with the hydroelectric power against the value of the salmon fisheries destroyed. Dollar values on each item, and add up the balance. More ambitiously, one may calculate the dollar cost of revising power generation methods world-wide so as to restore the carbon dioxide balance (the total cost is in the trillions). Such computations have great potential, but keep in mind their limitations.
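To fix ideas, here is a minimal sketch of the bookkeeping such an analysis reduces to. Every dollar figure below is a hypothetical placeholder, invented for illustration; in a real study nearly all of the labor goes into establishing such numbers, not into adding them.

# A minimal sketch of cost-benefit bookkeeping for a hypothetical
# dam-and-smelter project. All dollar figures are invented
# placeholders, not estimates for any real project.

ledger = {
    "hydroelectric power sold":      +120_000_000,  # benefit, $/year
    "aluminum smelted":               +80_000_000,  # benefit
    "farmland flooded upstream":      -30_000_000,  # cost
    "downstream silt and fertility":  -25_000_000,  # cost
    "salmon fishery destroyed":       -60_000_000,  # cost
}

# Itemize from largest cost to largest benefit, then total up.
for item, dollars in sorted(ledger.items(), key=lambda kv: kv[1]):
    print(f"  {item:32s} {dollars:+15,d}")
print(f"Net balance: {sum(ledger.values()):+,} dollars per year")

The arithmetic is trivial; everything rides on where the input figures come from, which is exactly where the limitations begin.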
First, they are no more precise than their inputs, and some of the numbers going in are very hard to know. I’ve never undertaken such a labor of marshalling quantitative data to be synthesized, and I respect the audacity of those who do; my skepticism is not ungenerous to them, I hope. But every user of such analyses can see that skepticism is a necessary part of using them sensibly.
Second, I sometimes insist on asking: whose dollars? The aluminum company owns its refinery and makes a profit. If the analysis shows that the costs outweighed the benefits, does that mean the company owes the salmon-fishing coastal Indians damages for the fish they don’t have any more? If not, why not? If so, then the refinery was a bad investment — or the aluminum was underpriced. (There’s a can of worms! If the economic realities are different from what the market saw at the time, then the dollar figures have to be revised throughout.) Similarly, we hear talk of whether ‘industry’ can ‘afford’ to eliminate chlorofluorocarbons. Come now! If the physics of the ozone hole is as now believed, then cost-benefit analysis will show on the contrary that ‘industry’ — that is, the owners of Hoechst and the other producers of CFCs — can’t afford to produce another gram of them. Just let all the billions of people who will lose if the ozone depletion is allowed to continue claim damages, and the alleged profitability of Freon refrigerant is sharply reversed. I’d better make it clear I’m not offering such a lawsuit as a practical course. The suit brought by victims of the Bhopal gas leak showed that the courts are not likely to be the agency for correcting this kind of abuse. I just say that the dollars can be added up with a view to exposing it as an abuse. If a corporation appropriated my land to build its factory, this would be recognized as theft in our culture, and its profits would not be sacred but could be attached to repay me for my property. If the corporation takes away a people’s livelihood or its air, this should be recognized as a crime in a new, higher concept of economic justice.
Try applying these elementary notions to all instances of toxic wastes. Mines, chemical firms, and nuclear plants operate at a profit and pay dividends, without accounting for the great lakes of poison they spread around them. The wastes are costs of the original production, but they aren’t charged to those who profited while failing to count them as costs. Instead, when citizens demand they be cleaned up, government taxes the citizens to pay for a clean-up operation — on which the original polluter makes a profit.
A third reservation about the cost-benefit analyses is that some things don’t have dollar values. You may have seen the claim by René Dumont to have calculated that the present excess of greenhouse gases from modern industrial practices is causing deaths in the tropics, via drought caused by climatic change, at the rate of a million deaths a year. Now he would not claim much precision in his conclusion, and the chain of inference leading to it is rather long, involving subtle and recent atmospheric physics. His attempted calculation is not absurd, however, and its relevance is evident. My point is just that he was right to present his conclusion as he did, and not as a cost-benefit analysis. If the reason for deeming our energy usage destructive is measured in human lives, then by all means let us speak not just of dollars but of human lives.
So often we see scenarios of the same form: a way of life is built around some economic activity, and then unsuspected damage comes to light. Why are we so often caught unawares? If you have the impression there’s a pattern here, I think you’re right. Greed and opportunism, to be sure; the successful exploiter of resources can defend himself with the riches and the influence gained from the very exploitation. But there’s another common thread in many of these cases: complexity. The science of the initial technology is less complex than the counting of its consequences.
The computation of yield from an ore, or of energy required to raise it and smelt it, is an easier kind of computation than the prediction of the ecological effect of the tailings fifty years later. The interaction of a hundred species at the edge of the desert may determine whether the desert advances into fertile land. Each species can be studied by ‘clean’ science, but the study of their interactions is a ‘messy’ science, ecology. Messy sciences like geochemistry and sociology tended to be shoved aside in the first centuries of the scientific revolution; precedence was given to clean sciences because they worked. These days, messy sciences are much studied, and seeing this, you may get the impression that great advances are being made.
Now it is true that some big models of complex systems are being run on very fast computers. Some of them even work pretty well; for example, predicting the weather a week ahead is a fairly messy problem which seemed thirty years ago to be intractable but is now fairly successfully handled. Don’t confuse this sort of success with understanding. All messy sciences today are poorly understood, some of them much more poorly than meteorology. It’s good that some serious and resourceful people like working on them, because they are so important. But if those people are honest, they can hack it in these areas of study only by having great talent for getting satisfaction out of partial results. ‘For small blessings give thanks’ might be the motto of the worker in messy sciences. You probably remember that the very valuable projections of nuclear winter, which were rightly taken into account by policy makers (both the powerful and us ordinary citizens), left out of account most of the known complexities of atmospheric circulation. To say nothing of unknown mechanisms.
Ecology, however, will be a messy science for some time to come. Yet I confess to a bit of unprovable optimism. We may not always be as helpless before messy situations as we are now. Looking at the past fuels my optimism. Three hundred years ago, Newtonian mechanics gave philosophers the feeling that the future could be predicted, but only to the extent that the present was known. It seemed the universe would be understood only by grasping at well-determined causes. Yet probability, which came upon the scene at about the same time, allowed undetermined causes to be part of understanding too. By the 19th century, probabilistic methods explained thermodynamics as neatly as anything in the deterministic realm, and statistical physics is going on to new triumphs today. In the same way, the physics of matter first concentrated on pure crystals, because they were neat enough that you could get somewhere with them, and on gases, because they were simple and (with the aid of probability) you could get somewhere with them; yet later, glasses and liquids also became manageable. I venture to hope that we will find new ways of thinking about today’s messy models, as different from the deterministic way and the probabilistic way as they are from each other. Am I referring to ideas of holistic science now being developed by followers of Prigogine? I don’t know; I’m ignorant on the subject; but I don’t think so. I think what I’m hoping for is something not yet clearly in sight.
Here’s one more weak point to watch for, as important as any of the others: uncertainty. Criticism of science and technology often hinges on risk. The critic declares the risk unacceptable, the defender insists that the critic is impeding progress. A spectacular instance was — and still is — the guidelines for containing genetically engineered organisms. I’m going to use a less prominent example.
The space probe Galileo was launched by a space shuttle. Aboard Galileo was a small plutonium power source, a pair of radioisotope thermoelectric generators. You may have seen the criticism of this plan by Karl Grossman and others. They pointed out that the shuttle launch isn’t perfectly safe (as sane astronauts like John Glenn knew even before the Challenger exploded). The risks are exceedingly small but very real: had Galileo’s plutonium power source been shattered during launch, it could have sprayed into the atmosphere a quantity of plutonium sufficient to poison millions of people (see also ‘The Galileo Mission,’ p.16, Science for Peace Bulletin, Vol. 9, No. 3, November 1989). The planetologists, almost all of them, stuck by the plan. The launch took place, and the space-ship Galileo went safely on its way; but the issue is still current because more nuclear-powered research vessels are planned, and because of Galileo’s planned route to Jupiter.

It’s not going outward all the way. It’s going step-by-step, picking up a little additional energy in each of a sequence of near encounters with planets. Do you know this trick? It wasn’t in my mathematical astronomy course at university. A small object coming in at a planet may brush by either side, getting deflected — or, of course, it may come in between and crash on the planet. If it brushes by a large planet all alone in space, it will leave with as much energy as it came in with; only its direction will change. On the other hand, in the complicated system of planet and sun, a fly-by can send the small object off with a little more energy. Galileo is to get such boosts at each of its stepping-stones, and two of these boosts are from this planet, Earth. That’s right: this little nuclear ship that Karl Grossman was trying to get us to worry about will come heading just about straight at us (remember, these fly-bys have to be pretty near misses if the helpful change in orbit is to take place).

Suppose there were a miscalculation? Some miscalculations will just put it onto a course which will spoil its mission, but some miscalculations will make it a meteorite. Actually the issue is not primarily miscalculation but loss of control. The steering can’t be corrected if communication with space scientists on Earth is lost, and we know that can happen because it did happen with the Soviet mission to Phobos this year.
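To make the fly-by arithmetic concrete, here is a toy one-dimensional sketch, with illustrative speeds rather than mission data. It assumes the idealized limiting geometry: probe and planet approaching head-on, with the encounter turning the probe fully around so that it departs in the planet’s direction of travel. Real fly-bys, at an angle, gain less.

# In the planet's rest frame a fly-by is elastic: the probe's speed
# relative to the planet is unchanged, only its direction turns.
# The energy gain appears when we transform back to the sun's frame.
# All numbers are illustrative, not Galileo mission data.

v_planet = 13.0   # planet's orbital speed in the sun's frame, km/s
u_in     = 9.0    # probe's speed in the sun's frame, approaching head-on

u_rel = u_in + v_planet   # speed relative to the planet on approach

# The turn-around fly-by reverses the relative velocity without
# changing its magnitude; the probe departs in the planet's direction
# of travel, so in the sun's frame its speed is:
u_out = u_rel + v_planet  # = u_in + 2 * v_planet

print(f"in: {u_in:.1f} km/s  out: {u_out:.1f} km/s  "
      f"(gain {u_out - u_in:.1f} km/s)")

The probe, in effect, borrows a minute fraction of the planet’s orbital motion around the sun. That is also why the encounters must be near misses: a distant pass hardly bends the orbit at all.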
I was saying this to some friends and they thought I was arguing against the Galileo mission. Not necessarily. The risk of Galileo crashing into Earth is tiny indeed; and maybe the next nuclear-powered space ship will be a little safer and be launched by a safer booster. I’m advancing this example merely as an instance of difficult risk evaluation.
Some important scientific experiments — interplanetary probes, genetic engineering — entail small risks of significant damage. How much do we have to want to know something in order to take such risks?
We really do need an analysis of the risks, even when they’re small. The basic decision to give Galileo nuclear power was made by the US military, which wants reactors in space because it wants reactors in space. The US military let Ronald Reagan lie on its behalf about what its satellites were going to do up there; it wouldn’t be above lying about this. I certainly don’t entrust my risk analysis to these people, who have been playing their game of Mutual Assured Destruction for almost forty years and would be playing it still if they hadn’t found ways to put the survival of the world at even greater risk from their weapons. Safe enough for the generals does not mean safe enough for responsible people.
But suppose we do get together a trustworthy team to do a serious risk analysis, say with the participation of the Natural Resources Defense Council. What should it consist of? A probability computation? But probability theory is regarded as applicable to situations where many repetitions of a random process are made or could be made, and the only way decision theory is non-controversial is for the gains and the losses to be subject to addition and subtraction. Here we deal with small but unknown probabilities and unknown (perhaps large) penalties. We are in different conceptual territory: that of risk analysis, decision theory, and statistics with small non-random samples. This area is like the area of messy sciences: many people are working at it these days, and its importance has received well-deserved recognition, but I have to tell you that things are not coming clear. An earth-bound example will do; it’s been with us for years. If your friendly nuclear power plant next door has a chance of 1 in a million of blowing up within a year, causing hundreds of immediate casualties plus fall-out, is that sufficient reason in itself for closing it down? One in a billion? If you have trouble answering such a question, this does not prove that mathematical education is foundering and you are a generation of innumerates. Nobody can answer such a question in a clear-cut way. I am not speaking against the study of decision theory; I am reporting that its present status is pretty primitive. Really, if anything I am speaking for studying it. Just don’t hold your breath waiting for definitive answers.
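To see where the clean arithmetic runs out, consider the expected-value computation that classical decision theory would prescribe for the power-plant question. The casualty figure below is a hypothetical stand-in for ‘hundreds of immediate casualties.’

# Expected-value arithmetic for the power-plant question above.
# The casualty figure is a hypothetical illustration.

casualties_if_accident = 300   # stand-in for "hundreds of casualties"

for p_per_year in (1e-6, 1e-9):   # 1 in a million, 1 in a billion
    expected = p_per_year * casualties_if_accident
    print(f"p = {p_per_year:.0e}/yr -> "
          f"expected {expected:.1e} casualties per year")

# The two expectations differ by a factor of a thousand, yet both are
# tiny; neither number, by itself, settles whether to close the plant.
# Expected value presumes many repetitions and losses that add and
# subtract; a one-shot catastrophe with fall-out offers neither.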
In short — I’m calling for fellow scientists to accept their responsibility for the future. I’m calling for those who aren’t scientists by job description to join in the effort. Science and technology are central to the problems I’ve been thinking about, but there’s no limitation on who can help solve them. I’m trying to communicate my feeling that seeking overall solutions — solutions that will stick — is even harder than finding the case-by-case solutions we usually think about.
We won’t make great improvements in a few years, perhaps. We may have to rely on theories and approaches not yet developed and on co-workers not yet born. That’s all right. The future has a right to a share of the action. But only some problems can be left for the future. Any species we allow to die off this decade will not regrow a decade later. The minimum we have to insist on is to leave the next generations a world they can live on. To give the future a chance.