Physics is Fun: Memoirs of Richard Wilson. Version of September 25th, 2009




Risk Analysis
When I decided in 1971 to spend time discussing nuclear power, I concluded that one should not discuss one energy source in isolation but should compare sources across the board, in terms of health hazard, cost, and environmental effects, taking the complete fuel chain into account. Thinking about how to influence the choices led me to write about pollution charges (37, 141, 144, 146, 148, 156, 160, 161, 167, 171, 174). But a brief letter, “Kilowatt Deaths” (145), that appeared in Physics Today in 1972 seems to have had the most influence. In a simple table I gave my best estimate of the number of people who would die from using electricity generated from each of a number of sources. Leonard Hamilton, MD, of BNL told me that the “Kilowatt Deaths” letter stimulated his more systematic studies at BNL. Others also found these early papers stimulating.
I was slowly moving into an intellectual vacuum now filled by the field of Risk Analysis. Since 1973 I have written extensively in the field (174, 227); I was a founding member of the Society for Risk Analysis and was given its Distinguished Achievement Award. In this field I have been a pioneer in the use of risk comparisons to aid understanding, and a co-author, with Crouch, of a seminal book in the field which went into a second edition (227, 406). I have lectured on this subject in over 25 countries. (206, 208, 209, 210, 212, 218, 220, 240, 256, 272, 305, 306, 307, 309, 311, 317, 318, 327, 343, 345, 365, 378, 380, 384, 391, 395, 408, 533, 544, 563, 571, 627, 628, 635, 696, 746, 807, 847, 887)
I have worked in particular on the possible effects of low doses of anything on public health. It started with the effects of low doses of radiation (457, 458, 470, 471, 695, 784, 787, 869) and then moved on to low doses of chemicals. This is a very fundamental subject on which many scientists are confused and therefore talk nonsense. Sometimes we know that a substance or action causes adverse effects on health when given at high doses. It might, for example, cause adverse reactions in 30% of all people. But what happens if the substance is given at a low dose, perhaps 100 times smaller? Is there no effect? Or does it cause adverse reactions in 30%/100, or 0.3%, of all people? This last idea, called “low dose linearity”, was first raised for effects of ionizing radiation. But it may well apply, and I think it does, to many other substances. I am, therefore, particularly concerned when anti-nuclear people consider radiation to be uniquely dangerous. They call themselves environmentalists. I hate to give them that honor. I believe, as the late Senator Tsongas publicly declared, that anyone who prefers burning coal to nuclear power cannot legitimately call him/herself an environmentalist. I suspect that the intellectual concept and meaning of the phrase “low dose linearity” started with radiation issues because physicists were much involved, and to someone with training in the physical sciences the idea of low dose linearity comes naturally. Physicists are also well used to discussing “negative” results in terms of quantitative upper limits on a possible effect. The issue is the extrapolation of risk from high doses in major events, such as the effects of the Hiroshima and Nagasaki bombs, to the low doses normally encountered by humans. It is crucial, and it makes the study of all factors that might influence the dose-response relationship all-important.
I trace the concept to Professor Geoffrey Crowther at Reading University in the UK, who in 1924 or so suggested that radiation cancers come from the ionization of a cell by radiation. All the concepts of stochastic processes that a physicist learns then apply, and low dose linearity follows. But hundreds of cells get ionized each minute, yet on average less than one cancer appears in a lifetime, so this picture needs a major modification. Nonetheless it influenced thinking. In 1927 the newly formed International Commission on Radiological Protection (ICRP) adopted low dose linearity, guided in part, so some stories go, by recent studies showing that the effect of radiation on fruit flies seemed to be linear. But the idea became more general many years later, and it applies not only to radiation but also to chemicals, and not only to cancer but to other medical end points. The crucial point is that radiation increases the incidence of cancers that otherwise occur naturally but does not produce cancers that do not otherwise occur.
If the “radiation cancers” are indistinguishable from “natural cancers”, and no pathologist can tell the difference unless he knows the exposure, then it seems reasonable to suppose that radiation and natural background, whatever that is, act similarly at some stage in the development of the cancer. Then Taylor’s theorem in mathematics applies, and at low doses there must be linearity. This is a contentious issue not only in radiation risk assessment but also in studies of the risks of chemicals at low doses. The Taylor’s theorem argument clearly also applies to chemicals, whether they are “initiators” or “promoters”, or alternatively whether they are “early stage carcinogens” or “late stage carcinogens”. Guess, Crump and Peto emphasized this theoretically, and their argument was used by the US EPA in its early days to justify a default of low dose linearity, a fact which most people have forgotten. Martha Crawford (now Martha Heitzman) and I argued that the same argument applies to chemical end points other than cancer (568). In particular it applies to lung cancers and other respiratory ailments caused by air pollution or cigarette smoking. The argument makes no mention of the animal toxicologists’ distinction between initiators and promoters, or early stage and late stage carcinogens. Interestingly, the European Union in its ExternE study accepted low dose linearity for air pollutants, correctly in our view, but the US EPA does not.
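The argument can be put into a formula as well as words. The notation here is a sketch added for clarity and is not taken from any of the cited papers. Let P(D) be the lifetime probability of the cancer as a smooth function of the total effective dose D, and let d_0 stand for the natural background, whatever it is, that drives the naturally occurring cancers. For a small added dose d, Taylor's theorem gives

\[ P(d_0 + d) = P(d_0) + d\,P'(d_0) + O(d^2), \]

so the excess risk P(d_0 + d) - P(d_0) \approx d\,P'(d_0) is linear in the added dose unless P'(d_0) = 0. A claim of a threshold is therefore a claim about the slope of the dose-response curve at the naturally occurring background, which is exactly why a statement of non-linearity needs to be accompanied by an explanation of why the naturally occurring cancers appear at all.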

I always insist to my students that an idea must be presented in three ways. In a formula. In a graph. And in words. And the three must agree in as many details as possible. When talking in 1979 to an audience of toxicologists and epidemiologists at a meeting of the Toxicology Forum in New York City, I used a graph to explain why low dose linearity is a probable scientific reality. Professor Arthur Upton, then at NCI and shortly to return to NYU, came up and said that he at last understood the point.


Toxicologists, whether working for industry or for regulators, tended to assume a non-linear response at low doses. In a review in 1987 (379), Lauren Zeise, Edmund Crouch and I found that the direct evidence for or against a linear, or near linear, dose-response relationship for chemical exposures in the workplace and environment, at the low doses of present interest, is weak or non-existent. The “megamouse” study carried out at the National Toxicology Laboratory in Arkansas seemed to show that the carcinogen 2-AAF has a threshold response for bladder tumors, although of course one cannot rule out a linear effect at low doses much smaller than the direct extrapolation from high doses would suggest. But the bioassay results are consistent with a linear response for liver tumors. This suggests that some carcinogens at some sites might exhibit a linear dose-response relationship and others might exhibit a non-linear dose-response relationship characteristic of acute toxicity. If the high dose carcinogenicity is secondary to an acute toxic effect, the dose-response relationship will be that of the acute toxic effect, which is often expected to be non-linear, rather than the linear relationship usually assumed for carcinogens. This might occur for asbestos, where Dr Merewether, the Chief Inspector of Factories in the UK, asked just this question in 1938, and for benzene, where it may be that cancer is always preceded by pancytopenia. Perhaps it is true of arsenic also. Various scientists are pursuing methods of explicitly incorporating the effects of high dose toxicity on cell proliferation into general dose-response models for risk assessment, although empirical support is lacking. If chemicals likely to have non-linear dose-response relationships could be distinguished from those for which all we have is the linear default from Taylor’s theorem, this characteristic could be included in the risk assessment and different substances could be treated differently. If this could be achieved, it would be most useful for carcinogen risk assessment. I believe that one of the factors most likely to lead to strong non-linearities in the dose-response relationship is toxicity to the target organ. Now that DNA analysis can be done routinely I have suggested experiments along these lines. For example, persons exposed to fallout from Chernobyl, such as people near Gomel, Belarus, are at a higher risk of leukemia than those not so exposed, such as residents of Minsk. Do the leukemias in the two groups have identical or similar DNA structure? If they are different, Taylor’s theorem would not seem the proper default. A farsighted Belarusian blood specialist, Dr Eugene Ivanov, had frozen many samples of blood for further analysis. When I suggested such a study, and tried to get funding from the US DOE for the experiment, he was fired from his job for discussing these matters with an American. He was later reinstated, but I could not get anyone to do the studies. I have suggested similar studies for lung cancer caused by arsenic and cigarette smoking, but have found no takers.
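A minimal numerical sketch of why the choice of default matters so much at low doses is given below. The numbers are invented for illustration; they are not taken from the 1987 review or from the megamouse bioassay. The point is only that the two defaults, anchored to the same high-dose observation, differ enormously at environmental doses.

# Illustrative sketch only: compare the excess lifetime risk predicted at a low dose
# by (a) the low-dose-linear default and (b) a threshold model, both anchored to the
# same hypothetical high-dose bioassay point.

HIGH_DOSE = 100.0            # mg/kg-day at which an effect was observed (hypothetical)
EXCESS_AT_HIGH_DOSE = 0.30   # 30% excess tumor incidence at that dose (hypothetical)
THRESHOLD = 10.0             # mg/kg-day, hypothetical threshold for the toxic effect
LOW_DOSE = 0.1               # mg/kg-day, an environmental exposure of interest

def linear_default(dose):
    """Low dose linearity: excess risk proportional to dose."""
    return EXCESS_AT_HIGH_DOSE * dose / HIGH_DOSE

def threshold_model(dose):
    """No excess risk below the threshold, linear in (dose - threshold) above it."""
    if dose <= THRESHOLD:
        return 0.0
    return EXCESS_AT_HIGH_DOSE * (dose - THRESHOLD) / (HIGH_DOSE - THRESHOLD)

print("linear default :", linear_default(LOW_DOSE))    # about 3e-4 excess risk
print("threshold model:", threshold_model(LOW_DOSE))   # exactly zero

For a common disease with a background incidence of a few per cent, neither prediction could be distinguished from the other by direct epidemiology, which is why the choice of default carries so much weight.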

This argument that low dose linearity is the proper default for almost all pollutants has to be argued again and again, and I have referred to it many times (600, 600a, 635, 638, 640, 644, 695, 717, 787, 869). As of June 2008 I found at a meeting of the Society for Risk Analysis that toxicologists are still struggling to separate linear and non-linear mutagens and carcinogens. I still insist that statements of threshold and non-linearity at low doses must be accompanied by an explanation of why the naturally occurring adverse events take place.


An important issue is how to describe any technical situation. If it is not understood by the decision makers and the people who elect them, it will be ignored. A physicist will naturally talk about the probability of an individual getting a cancer, and is understood by most other physicists, but often not by physicians and toxicologists. A professor at Harvard Medical School once said “one should only regulate on the basis of those risks that are certain”, meaning by certain those directly verified by epidemiology and not merely based on “theory”, however well based. But that is confusing terminology. Toxicologists have persuaded the US EPA that while low dose linearity may apply to carcinogens, other toxic effects cannot be regulated on the basis of low dose linearity. The issue is also closely related to the question of what one does when one has little information. It is crucially connected with the idea that a substance is deemed dangerous only when a risk ratio exceeds 2 in some observed situation. For common ailments a risk ratio of 2 corresponds to a measured risk of a few per cent, and society wants to reduce risks to far below this level. A consistent approach seems necessary.
Consistency becomes of major practical importance in two major situations. Some anti-nuclear activists would argue that radiation is uniquely dangerous at low doses and would ban nuclear power plants for this reason. I do not believe that nuclear power is uniquely dangerous. Pro-nuclear engineers counter that there really is a threshold for harm, and that below this threshold radiation is good for you. But I see no good data for this. I am often in the middle and try to persuade the pro-nuclear engineers to drop that argument: “Do you want to get into an esoteric argument at a public meeting on this point, or do you want my support, and the support of many other physicists, that the environmental and public health hazards of nuclear power are far exceeded by those of the alternatives?” Captains of the chemical industry like to argue either that there is a threshold, at least for their particular chemical, or that no one has produced cancer with that chemical either in man or in laboratory animals, and therefore that the risk is zero and society should stop bothering them. This led me naturally into a major discussion of how one knows what one does know about chemical carcinogenesis. Sometimes I think that I am winning. Not often. But in the last year, 2006/7, some people understand.
For the work on risk assessment I have been given several awards:

A medal as "Chernobyl Liquidator" in the USSR, 1987

Forum Award, American Physical Society Forum on Physics and Society, 1990

"For his outstanding research and promotion of public understanding of a broad spectrum of issues dealing with physics, the environment, and public health, including his work on reactor safety, estimation of risks posed by environmental pollution and pioneering use of comparative risk analysis."

Society for Risk Analysis, Distinguished Achievement Award 1993

Honorary Doctorate, International Sakharov Environmental University (ISEU), Minsk, 2001

2005 Erice (Ettore Majorana) Prize for Science and Peace, Erice 2006

"For his long lasting involvement in "The Spirit of Erice" and its promotion to people of different culture and various civilizations with remarkable success, thus allowing the new generation to envision the future with hope and confidence. Confidence and hope rooted in Scientific Culture of which Professor Wilson has been an illustrious contributor"

2007 Dixy Lee Ray Award, American Society of Mechanical Engineers

"For significant contributions to the science and engineering of environmental protection, particularly methodology of risk assessments, risk assessment of specific pollutants, riska assessment of nuclear power including nuclear waste, and ethics in environmental science and engineering.""

Presidential Citation, American Nuclear Society, (2008)

"For mentoring students for over 50 years in nuclear science, engineering and technology and his tireless efforts promoting peaceful application of nuclear power in support of “Getting the Word Out.” Through over 900 papers and publications, and myriad lectures, he has provided invaluable insight and wisdom giving the nuclear community a profound legacy from which to draw knowledge. Professor Wilson’s distinguished career is an inspiration."


It was, and is, very gratifying to receive these awards. In particular I liked Dixy Lee Ray and her straightforwardness and honesty. Although I did not always agree with her, we became good friends. I would have preferred to be rewarded for my experimental particle physics. I was elected to the American Academy of Arts and Sciences about 1958, at a much younger age than is now usual. In about 1967 Denys Wilkinson proposed me for the Royal Society, since I am still a British subject as well as a US citizen. I was never elected. I was not supposed to know, but I was informed that I was nominated at least once for membership in the National Academy of Sciences, in about 1969 and 1970. About once a year someone assumes, incorrectly, that I am a member of the NAS. That is embarrassing. Jack Ruina (professor of electrical engineering emeritus at MIT) does this, and Andree comments that it is because he thinks I ought to be a member. But I certainly made the right decision in 1971 to become involved in environmental and public affairs when I was blocked in my preferred pursuit of colliding electrons and positrons. Curiously, the National Academy of Sciences was started by President Lincoln to give advice on national affairs. Most members of the National Academy don’t involve themselves in national affairs as much as I do. If I had been elected a member I might have had greater opportunities to help. When, for example, I criticized the membership of a number of critical NAS/NRC committees, I drafted a letter to the President of the National Academy of Sciences. I could not send it myself, but Allan Bromley kindly took it and forwarded it as his own. Similarly I wrote a letter to the NY Times in 1980 supporting nuclear power, which Hans Bethe and Glenn Seaborg co-signed. But the NY Times did not at that time accept more than two signatures, so although I was the author of the letter, I removed my name.
Sabotage and Terrorism
My thoughts about terrorism arose in the 1970s. At that time we talked about sabotage. A disgruntled employee of the Consolidated Edison Company of New York had recently blown up a substation to cause trouble. What could such a person do with nuclear power? I discussed this with Norman Rasmussen as early as 1973. We agreed that a full risk analysis should include sabotage as an accident initiator. I have a copy of a report from Sandia Corporation in 1976 doing just that. I was insistent that power companies should not be allowed to forget this and hide it under the rug merely because we did not know the frequency of the first step in the accident chain - the existence of a saboteur. In describing the AEC’s defense in depth philosophy and its numerical analysis in Rasmussen’s Reactor Safety Study, I pointed out as early as 1973 that there is a chain of four events whose probabilities must be multiplied to obtain the accident probability: the probability of the initiator; the probability that the Emergency Core Cooling System (ECCS) fails; the probability that the containment fails; and the probability that the wind blows from the reactor toward a populated city. The overall probability becomes small. But a saboteur who is knowledgeable could start an emergency and then set off two small bombs, one to disable the ECCS and the second to make a hole in the containment - all to be done when the wind is blowing toward the city! Moreover a terrorist, such as a member of one of the twenty or so small groups in fractured Lebanon, or a PLO fighter, could be a “sleeper”. He could enroll at MIT and take Rasmussen’s reactor safety course. Then he could take a job with a nuclear power utility, learn the weak points of his reactor, and finally start on his nefarious pursuit. This was my thought in 1975, at a time when it was already difficult to persuade people to take extreme events, sabotage, or terrorism seriously. But I thought, incorrectly, that 20 terrorists acting in concert, as they did on September 11th, 2001, was unlikely. That changes the picture completely.
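In formula form, and with purely illustrative numbers that are not taken from the Reactor Safety Study, the defense-in-depth argument is

\[ P(\text{major release}) = P(\text{initiator}) \times P(\text{ECCS fails}) \times P(\text{containment fails}) \times P(\text{wind toward city}), \]

so that, for example, 10^{-2} per year \times 10^{-2} \times 10^{-1} \times 0.2 = 2 \times 10^{-6} per year. The saboteur's trick is to set the first three factors to one, by choosing his moment and placing his bombs, and to choose the last factor as well, so that the product collapses toward unity.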
But when I looked at alternatives I realized that natural gas posed greater opportunities for an imaginative terrorist. I wrote a little paper using the words of a TV advertisement, “Natural Gas is a Beautiful Thing?” (156), merely adding the question mark. The total energy content of a 17 million gallon tank of LNG, two of which sit in Everett on Boston Harbor, was equivalent to 20 Hiroshima bombs. Can one set them off? It is not easy, but it is possible. One of the possible initiators was a liquid-liquid superheat explosion, similar to the explosion when molten steel is poured onto a wet floor. A couple of Argonne National Laboratory scientists were studying, both theoretically and experimentally, the criteria for that to happen, and it seemed to be possible. Ed Purcell introduced me to Stirling Colgate of Los Alamos, who was also concerned. I was asked, in 1977, to be on a committee of the General Accounting Office on Safety of Liquefied Petroleum Gases. After the final dinner meeting we played a game: who can find the worst way of sabotaging an LNG facility? There were a variety of good, or perhaps I should say bad, ideas. The organizer then said he would collect the notes and destroy them. This is because one does not want to give ideas to a terrorist, who, it was assumed, was of lesser intelligence. This raises the whole question of what one should keep secret and what not. Borrowing a terminology from the military, I believe that in general short term issues, tactics, should be secret, but long term issues, strategies, cannot be kept secret for long and it is best to make them widely available so that countermeasures may be prepared. I like to refer to the case of the Canadian heroine Laura Secord, a farmer’s teenaged daughter, who drove her cows across the lines of the invading US forces in 1812, found out the order of battle for the next day, and thereby foiled the US advance. My own suggestion, details of which must remain unstated, was to add a little liquid oxygen into the tank. The way I then proposed cannot now be done.
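As a rough order-of-magnitude check of that comparison, using standard handbook values rather than any figure from the paper: 17 million US gallons is about 6.4 \times 10^{7} litres, and LNG carries roughly 2.2 \times 10^{7} joules per litre, so one tank holds about

\[ 6.4 \times 10^{7}\ \mathrm{L} \times 2.2 \times 10^{7}\ \mathrm{J/L} \approx 1.4 \times 10^{15}\ \mathrm{J}. \]

Taking the Hiroshima bomb as about 15 kilotons of TNT, or 6.3 \times 10^{13} J, gives a ratio of roughly 22, consistent with the factor of about 20 quoted above. The comparison is of stored energy only; it says nothing about whether, or how fast, that energy could actually be released.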

Of course LNG accidents are only one type of severe accident that could be created by a saboteur. I had already thought about dam failures, and a crude failure probability was already listed in my letter “Kilowatt Deaths”. But I had a more sinister thought. The Connecticut River between New Hampshire and Vermont is almost entirely owned by the utility companies, which have installed hydroelectric dams along almost its whole length. Near Littleton is the Moore Reservoir, with a 140 MWe hydroelectric plant. Suppose that I were to blow up this dam at a time of flood. The cascading waters would cause each dam downstream to fail in turn, all the way down to the sea at Old Saybrook. I discussed this with Professor Arthur Casagrande, Harvard’s soil mechanics expert, who had designed half the earth-filled dams in North America. He agreed that it was possible and went one better: what about a dam failure in the upper Missouri, with the resultant cascading flood taking out dams all the way down the Mississippi? He added that he had designed a dam near St Louis which was large enough and sturdy enough to hold in the event of such a catastrophe upstream. It was about this time that I was at the energy and environment meeting in Boulder in 1974. In discussions with Eugene Wigner, we, or more likely he alone, formulated the simple concept that when there is a lot of easily available stored energy, such as water in a reservoir, easily burnable fuel (or indeed carbon atoms knocked off the lattice by fast neutrons), in one place, and a lot of people in a nearby location, there is potential for disaster. We agreed that all energy systems should be analyzed for such possible disasters, technically called high-consequence, low-probability accidents. But of course “stored energy” is a slightly vague concept. The rate at which a chemical reaction can take place is crucial also. Should we worry that iron, in the presence of air, will oxidize? We do not call it burning, but rusting, because it happens so slowly. Nonetheless the concept is useful and instructive.


In the 1970s the concept of event tree analysis was being developed for nuclear power, although old-line engineers were loath to accept it. The Nuclear Regulatory Commission backed away from it, largely as a result of public pressure from Henry Kendall and the Union of Concerned Scientists. I tried then to get it accepted (239, 264). I counted five ways in which the use of Rasmussen’s event tree analysis by industry and the NRC would have prevented the TMI accident. Others saw this too, and after Three Mile Island the NRC officially embraced event tree analysis, but it was not completely accepted until the mid-nineties. The LNG and chemical industries accepted it about 1980. NASA was urged to accept event tree analysis after the Challenger accident but only did so about 1990. I had talked about it in many meetings. The 9/11 disaster at the World Trade Center should have persuaded the building industry to accept it, but it seems that buildings will be rebuilt at the site without any such careful reanalysis. I have brought this up several times, but although I have had superficial agreement, I have had no success.
When 9/11 came around I was therefore well equipped to note that paying attention to the high-consequence, low-probability accident was the most important step in making society less vulnerable to the consequences of terrorism. I gave talks on this and other aspects: for the Global Foundation in London in 2001 (811) and in Florida in 2002; at the PSAM conference in Puerto Rico; at the World Federation of Scientists seminar on planetary emergencies in Erice in 2002; and at the Biological Terrorism meeting (BTR3) in Albuquerque in 2003 (863). I sent a copy to my former colleague John D. Graham, who had started the Harvard Center for Risk Analysis and was then administrator of the Office of Information and Regulatory Affairs in the Office of Management and Budget. He sent it around to the heads of all relevant government offices, and it can still be located through their website. It is surprising to me how new this concept seemed to almost everyone at these meetings.
As noted earlier in these memoirs, I had, as a graduate student in Oxford, wondered how much havoc I could create with the silver cyanide plating solution in the cupboard by putting it into the water supply. I wondered about epidemics, and Crouch and I discussed them a little in our book on Risk-Benefit Analysis. I postulated that the same concept applies: serious attention to extreme natural disasters might be an important first step in coping with potential terrorist acts. I was discouraged in this by the biological warfare people at an Erice meeting, but when I gave my general terrorism talk in Albuquerque in 2003 it resonated with General Annette Sobel, a physician in the Air Force who was the organizer of the meeting and in charge of homeland security in the area. She was instrumental in ensuring that some of the funds from the US Department of Homeland Security went into preventing natural outbreaks of diseases. At the 2004 and 2005 BTR meetings several people were saying the same thing. The natural epidemic of SARS also helped to change people’s attitudes. The Dean of the Harvard School of Public Health, Dr Barry Bloom, said this also, so I became sure that my postulate is correct for biological terrorism as well as for terrorism involving physical and engineering projects.
Although the postulate is widely accepted by experts, very little preventive action seems to be based thereon. The Permanent Monitoring Panel on Terrorism of the World Federation of Scientists has picked this up and issued recommendations: in particular, how to reduce the average number of persons whom a sufferer from a disease infects, and how to bring this number below one by urging the populace to take simple precautions if the early stages of a pandemic become apparent. The World Health Organization has known this for years, and my friend, Dr Kazem Behbehani, uses it in his program to eliminate elephantiasis, country by country. But where is the elementary instruction on coping with epidemics in schools? Why are kits not available and advertised at the major drugstore chains? Why is the Massachusetts Emergency Planning group so laid back? Why does not every school of public health in the university world highlight this as an important issue? Barry Bloom talked about this, but as Dean of the Harvard School of Public Health he could have taken action. As I contemplate this I realize that I and others still have interesting work to do. This is comforting to me on a personal level, but when I worry about the future of my grandchildren and the human race generally I am not encouraged.
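The arithmetic behind “reduce this number below one” is simple and worth stating; the numbers here are illustrative and are not taken from the Panel’s recommendations. If each infected person would, on average, infect R_0 others, and simple precautions cut transmission by a fraction c, the effective number becomes

\[ R = R_0\,(1 - c), \]

and an outbreak dies out rather than grows when R < 1, that is when c > 1 - 1/R_0. For an illustrative R_0 = 3, precautions must remove a little more than two-thirds of transmissions; for R_0 = 1.5, one-third is enough.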
