Rachel's Precaution Reporter #103, August 15, 2007


[Rachel's introduction: In response to Adam Finkel's review of Cass Sunstein's book, Peter Montague wrote this letter explaining why President Bush's invasion of Iraq, and the nation's asbestos-removal program for schools, are not examples of precaution -- and pointing out some major problems with decisions based narrowly on quantitative risk assessment.]

Dear Adam,

It was really good to hear from you. I'm delighted that you're now working in New Jersey, with joint appointments at Princeton University and at the School of Public Health at the University of Medicine and Dentistry of New Jersey (UMDNJ). Everyone in New Jersey is taking a bath in low levels of toxic chemicals, so we need all the help we can get, and you're exactly the kind of person we most need help from -- honest, knowledgeable, committed, and principled. Your knowledge of toxicology and risk assessment -- and the courage you demonstrated as a government whistle-blower within the Occupational Safety and Health Administration -- are sorely needed in the Garden State, as they are everywhere in America.

I read with real interest your review of Cass Sunstein's book, Risk and Reason. I thought you did an excellent job of showing that Sunstein "has managed to sketch out a brand of QRA [quantitative risk assessment] that may actually be less scientific and more divisive, than no analysis at all." To me, it seems that Sunstein is more interested in picking a fight with his fellow liberals than in helping people make better decisions.

What I want to discuss in this note to you is your attack on the precautionary principle. In the second paragraph of your review, you wrote, "I believe that at its best, QRA [quantitative risk assessment] can serve us better than a 'precautionary principle' that eschews analysis in favor of crusades against particular hazards that we somehow know are needlessly harmful and can be eliminated at little or no economic or human cost. After all, this orientation has brought us increased asbestos exposure for schoolchildren and remediation workers in the name of prevention, and also justified an ongoing war with as pure a statement of the precautionary principle as we are likely to find ("we have every reason to assume the worst, and we have an urgent duty to prevent the worst from occurring," said President Bush in October 2002 about weapons of mass destruction in Iraq)."

The two examples you give -- asbestos removal, and Mr. Bush's pre-emptive war in Iraq -- really aren't good examples of the precautionary principle. Let's look at the details.

All versions of the precautionary principle have three basic parts:

1) If you have reasonable suspicion of harm

2) and you have scientific uncertainty

3) then you have a duty to take action to avert harm (though the kind of action to take is not spelled out in the precautionary principle).

Those are the three basic elements of the precautionary principle, found in every definition of the principle.

And here is a slightly more verbose version of the same thing:

1. Pay attention and heed early warnings. In a highly technological society, we need to keep paying attention because so many things can go wrong, with serious consequences.

2. When we develop reasonable suspicion that harm is occurring or is about to occur, we have a duty to take action to avert harm. The action to take is not spelled out by the precautionary principle but proponents of the principle have come up with some suggestions for action:

3. We can set goals -- and in doing this, we can make serious efforts to engage the people who will be affected by the decisions.

4. We can examine all reasonable alternatives for achieving the goal(s), again REALLY engaging the people who will be affected.

5. We can give the benefit of the doubt to nature and to public health.

6. After the decision, we can monitor, to see what happens, and again heed early warnings.

7. And we can favor decisions that are reversible, in case our monitoring reveals that things are going badly.

Now, let's look at the examples you gave to see if they really represent failures of the precautionary principle.

The asbestos removal craze (1985-2000) was essentially started by one man, James J. Florio, former Congressman from New Jersey (and later Governor of New Jersey).

As we learn from this 1985 article from the Philadelphia Inquirer, gubernatorial candidate Florio was fond of bashing his opponent for ignoring the "asbestos crisis" and the "garbage crisis." In Mr. Florio's campaign for governor of New Jersey, the "asbestos crisis" was a political ploy -- and it worked. He got elected by promising to fix the asbestos crisis and the garbage crisis.

As governor, Mr. Florio's approach to the "garbage crisis" was to site a new garbage incinerator in each of New Jersey's 21 counties. At $300 to $600 million per machine, this would have set loose an unprecedented quantity of public monies sloshing around in the political system. Later it turned out that Mr. Florio's chief of staff had close ties to the incinerator industry.

I don't know whether Mr. Florio or his political cronies profited directly from the asbestos removal industry that he created almost single-handedly, but the decision-making for asbestos was similar to his approach to the "garbage crisis." He did not engage the affected public in setting goals and then examine all reasonable alternatives for achieving those goals. He did not take a precautionary approach.

In the case of garbage, Mr. Florio and his cronies decided behind closed doors that New Jersey needed to build 21 incinerators. He and his cronies then justified those incinerators using quantitative risk assessments.

Mr. Florio's approach to asbestos was the same: without asking the affected public, he decided that removal of asbestos from 100,000 of the nation's schools was the correct goal, and that creating a new "asbestos removal" industry was the only reasonable alternative. You can read about Mr. Florio's "Asbestos Hazard Emergency Response Act of 1986" here.

So, the goal ("asbestos removal") was not set through consultation with affected parties. Perhaps if the goal had been to "Eliminate exposures of students and staff to asbestos in schools," a different set of alternatives would have seemed reasonable. Asbestos removal might not have been judged the least-harmful approach.

Even after the questionable decision was made to remove all asbestos from 100,000 school buildings nationwide, systematic monitoring was not done, or not done properly. Furthermore, the decision to remove asbestos was not easily reversible once removal had been undertaken and new hazards had been created.

So Mr. Florio's law created a new asbestos removal industry overnight. Companies without asbestos removal experience rushed to make a killing on public contracts, worker training was in some cases poor, removals were carried out sloppily, monitoring was casual or non-existent, and so new hazards were created.

This was not an example of a precautionary approach. It was missing almost all the elements of a precautionary approach -- from goal setting to alternatives assessment, to monitoring, heeding early warnings, and making decisions that are reversible.

Now let's examine President Bush's pre-emptive strike against Iraq.

True enough, Mr. Bush claimed that he had strong evidence of an imminent attack on the U.S. -- perhaps even a nuclear attack. He claimed to know that Saddam Hussein was within a year of having a nuclear weapon. But we now know that this was all bogus. This was not an example of heeding an early warning, because no threat existed -- it was an example of manufacturing an excuse to carry out plans that had been made even before 9/11.

Here is the lead paragraph from a front-page story in the Washington Post April 28, 2007:

"White House and Pentagon officials, and particularly Vice President Cheney, were determined to attack Iraq from the first days of the Bush administration, long before the Sept. 11, 2001, attacks, and repeatedly stretched available intelligence to build support for the war, according to a new book by former CIA director George J. Tenet."

Fully a year before the war began, Time magazine reported on March 24, 2003, President Bush stuck his head in the door of a White House meeting between National Security Advisor Condoleezza Rice and three U.S. Senators who were discussing how to deal with Saddam Hussein through the United Nations, or perhaps in a coalition with the U.S.'s Middle East partners. "Bush wasn't interested," reported Time. "He waved his hand dismissively, recalls a participant, and neatly summed up his Iraq policy: 'F--- Saddam. We're taking him out.'" This President was not weighing alternatives, seeking the least harmful.

The CIA, the State Department, members of Congress, and the National Security staff wanted to examine alternatives to war, but Mr. Bush was not interested. With Mr. Cheney, he had set the goal of war, without wide consultation. He then manufactured evidence to support his decision to "take out" Saddam without much thought for the consequences. Later he revealed that God had told him to strike Saddam. Mr. Bush believed he was a man on a mission from God. Assessing alternatives was not part of God's plan, as Mr. Bush saw it.

This was not an example of the precautionary approach. Goals were not set through a democratic process. Early warnings were not heeded -- instead, fraudulent scenarios were manufactured to justify a policy previously set (if you can call "God told me to f--- Saddam" a policy). As George Tenet's book makes clear again and again, alternatives were not thoroughly examined with the aim of selecting the least harmful. For a long time, no monitoring was done (for example, no one has been systematically counting Iraqi civilian casualties), and the decision was not reversible, as is now so painfully clear.

This was definitely not an example of the precautionary approach.


So I believe your gratuitous attack on the precautionary principle is misplaced because you bring forward examples that don't have anything to do with precautionary decision-making.

Now I want to acknowledge that the precautionary principle can lead to mistakes being made -- it is not a silver bullet that can minimize all harms. However, if we take to heart its advice to monitor and heed early warnings, combined with favoring decisions that are reversible, it gains a self-correcting aspect that can definitely reduce harms as time passes.

I am even more worried by the next paragraph of your review of Sunstein's book. Here you seem to disparage the central goal of public health, which is primary prevention. You write,

"More attention to benefits and costs might occasionally dampen the consistent enthusiasm of the environmental movement for prevention, and might even moderate the on-again, off-again role precaution plays in current U.S. economic and foreign policies." I'm not sure what you mean by "the on-again, off-again role precaution plays in current U.S. economic and foreign policies." I presume you're referring here to asbestos removal and pre-emptive war in Iraq, but I believe I've shown that neither of these was an example of precautionary decision-making.

Surely you don't REALLY want environmentalists or public health practitioners to "dampen their consistent enthusiasm for prevention." Primary prevention should always be our goal, shouldn't it? But we must ask: what are we trying to prevent, and how best to achieve the goal(s)? These are always the central questions, and here I would agree with you: examining the pros and cons of every reasonable approach is the best way to go. (I don't want to use your phrase "costs and benefits" because in common parlance this phrase implies quantitative assessment of costs and benefits, usually in dollar terms. I am interested in a richer and fuller discussion of pros and cons, which can include elements and considerations that are not strictly quantifiable but are nevertheless important human considerations, like local culture, history, fairness, justice, community resilience and beauty.)

So this brings me to my fundamental criticisms of quantitative risk assessment, which I prefer to call "numerical risk assessment."

1. We are all exposed to multiple stressors all of the time and numerical risk assessment has no consistent way to evaluate this complexity. In actual practice, risk assessors assign a value of zero to most of the real-world stresses, and thus create a mathematical model of an artificial world that does not exist. They then use that make-believe model to drive decisions about the real world that does exist.
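To make this point concrete, here is a toy sketch in Python. Every number in it is made up for illustration -- these are not real potencies or doses from any actual assessment. It simply shows how a model that zeroes out co-exposures can report a much smaller risk than even a crude additive treatment of the same exposures would (and real interactions among stressors could make the gap larger or smaller still):

```python
# Toy illustration (all numbers hypothetical): a "single-stressor" risk
# model ignores co-exposures by effectively setting them to zero.

# Hypothetical lifetime excess risk per unit dose, for three stressors
potency = {"chemical_A": 1e-6, "chemical_B": 2e-6, "chemical_C": 5e-7}

# Hypothetical real-world doses a person actually receives
dose = {"chemical_A": 3.0, "chemical_B": 1.5, "chemical_C": 4.0}

# A typical assessment models one chemical and zeroes out the rest:
modeled_risk = potency["chemical_A"] * dose["chemical_A"]

# Even a simple additive model of the same exposures gives a larger answer:
additive_risk = sum(potency[c] * dose[c] for c in potency)

print(f"modeled (single stressor): {modeled_risk:.1e}")
print(f"additive (all stressors):  {additive_risk:.1e}")
```

The point is not that addition is the right model -- nobody knows the right model, which is the point -- but that setting most stressors to zero builds a world into the mathematics that does not exist.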

2. The timing of exposures can be critical. Indeed, a group of 200 physicians and scientists recently said they believe the main adage of toxicology -- "the dose makes the poison" -- needs to be changed to, "The timing makes the poison." Numerical risk assessment is, today, poorly prepared to deal with the timing of exposures.

3. By definition, numerical risk assessment only takes into consideration things that can be assigned a number. This means that many perspectives that people care about must be omitted from decisions based on numerical risk assessments -- things like historical knowledge, local preferences, ethical perspectives of right and wrong, and justice or injustice.

4. Numerical risk assessment is difficult for many (probably most) people to understand. Such obscure decision-making techniques run counter to the principles of an open society.

5. Politics can and does enter into numerical risk assessments. William Ruckelshaus, the first administrator of the U.S. Environmental Protection Agency, said in 1984, "We should remember that risk assessment data can be like the captured spy: If you torture it long enough, it will tell you anything you want to know."

6. The results of numerical risk assessment are not reproducible from laboratory to laboratory, so this decision-technique does not meet the basic criterion for being considered "science" or "scientific."

As the National Academy of Sciences said in 1991, "Risk assessment techniques are highly speculative, and almost all rely on multiple assumptions of fact -- some of which are entirely untestable." (Quoted in Anthony B. Miller and others, Environmental Epidemiology, Volume 1: Public Health and Hazardous Wastes [Washington, DC: National Academy of Sciences, 1991], pg. 45.)

7. By focusing attention on the "most exposed individual," numerical risk assessments have given a green light to hundreds of thousands or millions of "safe" or "acceptable" or "insignificant" discharges that have had the cumulative effect of contaminating the entire planet with industrial poisons. See Travis and Hester, 1991 and Rachel's News #831.
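A back-of-the-envelope sketch shows how this arithmetic works. All the numbers below are hypothetical -- I am not citing actual permit counts or risk benchmarks -- but they illustrate how discharges that each pass a "most exposed individual" test can still add up to a large expected burden once you multiply across many sources and many exposed people:

```python
# Toy arithmetic (all numbers hypothetical): permits judged individually
# "insignificant" can accumulate into a large collective burden.

individual_risk_per_source = 1e-7   # hypothetical "acceptable" individual risk
permitted_sources = 500_000         # hypothetical number of permitted discharges
exposed_per_source = 10_000         # hypothetical people exposed near each source

# Risk-assessment lens: each permit passes on its own, since 1e-7 is "tiny."
# Cumulative lens: expected cases across all sources and exposed populations.
expected_cases = (individual_risk_per_source
                  * permitted_sources
                  * exposed_per_source)

print(f"expected cases across all sources: {expected_cases:,.0f}")
```

This toy model ignores overlaps (the same person living near several sources) and many other complications, but the basic multiplication is the mechanism by which "insignificant" discharges, issued by the hundreds of thousands, become a planet-wide exposure.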

All this is not to say that numerical risk assessment has no place in decision-making. Using a precautionary approach, as we set goals and, later, as we examine the full range of alternatives, numerical risk assessments might be one technique for generating information to be used by interested parties in their deliberations. Other techniques for generating useful information might be citizen juries, a Delphi approach, or consensus conferences. (I've discussed these techniques briefly elsewhere.)

It isn't so much numerical risk assessment itself that creates problems -- it's reliance almost solely on numerical risk assessment as the basis for decisions that has gotten us into the mess we're in.

Used within a precautionary framework for decision-making, numerical risk assessment of the available alternatives in many cases may be able to give us useful new information that can contribute to better decisions. And that of course is the goal: better decisions producing fewer harms.