Rachel's Precaution Reporter #25

"Foresight and Precaution, in the News and in the World"

Wednesday, February 15, 2006
www.rachel.org

Table of Contents...

Op-Ed: Curing Our Public Health System
  For 100 years, the cornerstone of public health theory and practice
  has been "primary prevention" -- preventing disease instead of having
  to cure it. Really, the precautionary principle is nothing more than a
  traditional public health approach. If they ever got together,
  citizens who favor the precautionary approach and public health
  specialists in every county and municipality might discover that they
  are natural allies.
Bush Administration Proposes New Risk Assessment Methods
  New rules proposed for government "risk assessments" may stifle
  regulation. Toxicologist Jennifer Sass of the Natural Resources
  Defense Council suggests that scientists won't be able to meet the
  standards for risks for which there is little underlying data. "I'm
  concerned that regulations will die at OMB [Office of Management and
  Budget]" as a result, she says.
EPA Scientists Criticize OMB's Call for Alternative Risk Models
  The Office of Management and Budget (OMB) has proposed new rules
  for risk assessments conducted by federal agencies. Scientists within
  the U.S. Environmental Protection Agency are not enthusiastic.
Studies of Chemicals in Humans Are Driving a Prevention Agenda
  "Instead of waiting for conclusive evidence, many activists say
  there is enough information to start banning chemicals. They argue
  that manufacturers should have to prove new chemicals are safe before
  they are approved for use. This requires embracing the precautionary
  principle, which argues that in the absence of conclusive evidence in
  face of a serious threat, we must still take action."
Income Inequality Grew Across the Country Over the Past Two Decades
  A prevention philosophy applied to the health of U.S. citizens
  would focus on growing inequalities of income and wealth. Numerous
  studies over the past 30 years have shown that inequalities, social
  alienation, and a pyramid of hierarchies make people especially
  susceptible to many diseases including cancer, diabetes, arthritis and
  heart disease.


From: Boston Globe, Feb. 11, 2006


By Madeline Drexler

Last week, in his State of the Union address, President Bush bemoaned
spiraling medical costs -- and rightly so. What he didn't say was that
99 percent of US healthcare dollars are spent on treating and curing
disease, and only 1 percent on preventing disease. That logic is
backward -- and the president's proposed 2007 budget makes matters
even worse.

Health savings accounts, medical liability reform, and token infusions
of cash are the wrong medicine for what ails Americans. If the
president were sincere about nurturing a "compassionate, decent,
hopeful society," he would reinvigorate our public health system.
Public health, after all, is both morally enlightened and economically
prudent. It rests on the idea that promoting health and averting
disease saves more lives more cheaply than does high-tech medicine.

It's a concept Bush has consistently undermined. "This is probably the
worst administration ever for public health," said Dr. Walter Tsou,
immediate past president of the American Public Health Association, at
the group's annual meeting in December. "They're constantly cutting
back money -- with the exception of things that actually scare them,
like bioterrorism and pandemic flu." What's killing us now, as opposed
to what we fear will kill us, is cancer, heart disease, tobacco-
related afflictions, complications of obesity, drug-resistant
infections, and other ills, both chronic and acute.

In fiscal 2006, the Centers for Disease Control and Prevention's core
programs suffered a 4 percent funding cut, compared with the previous
year; Bush's proposed 2007 budget lops off another 4 percent. These
core programs -- that is, nonterrorism-related activities -- are the
bread and butter of public health. Among the programmatic victims are
chronic disease prevention and health promotion, occupational safety
and health, environmental health, and health services block grants to
states, which cover everything from cancer screening to flu shots.

These aren't just meaningless line items; they're people's lives. In
2004, Dr. Julie Gerberding, the CDC director, stated that "robust"
funding of disease prevention programs could each year save diabetics
from 43,000 amputations, 165,000 kidney failures, and more than 10,000
cases of eye disease; reduce by half 40,000 new HIV infections; and
forestall two-thirds of alcohol-exposed pregnancies.

Healthcare coverage is another foundation stone of public health. In
his enthusiasm about insurance portability, Bush forgets to mention
that 46 million Americans don't have health insurance to haul around.
That's a death sentence. According to a 2004 Institute of Medicine
report, 18,000 adults die unnecessarily each year because they lack
health insurance.

So what's the alternative? How can the administration truly improve
the state of the union's health?

First, it must follow its own advice. Every decade, the US Department
of Health and Human Services publishes a document that sets national
objectives for curbing disease and improving health. Its "Healthy
People 2010" report calls for reducing obesity levels to 15 percent of
the adult population and 5 percent of children and adolescents;
cutting tobacco use to 12 percent of adults and 16 percent of
adolescents; and eliminating exposure to hazardous ozone levels. These
official goals are so far out of reach as to be cynical. Achieving
them requires action -- not 11 brief sentences of prime-time
rhetoric.

Second, the administration must think as globally about health as Bush
is fond of boasting he does about the economy. Many nations, both rich
and poor, have a keener sense of the value of comprehensive health
measures than does the United States -- and we should learn from them.
In Sweden, for example, the national agenda is to "Create social
conditions to ensure good health, on equal terms, for the entire
population." That includes not just wholesome foods and local parks in
which to exercise, but also jobs and a good education. In the United
Kingdom, far-flung government authorities are required to collaborate
on ambitious health targets.

Finally, our history-loving president might borrow a page from the
annals of public health. During the late 19th and early 20th centuries
-- the profession's golden age -- its leaders thought big. They didn't
dole out scraps of rhetoric; the language of social reform came
naturally. In 1905, Hermann Biggs, New York City's legendary health
commissioner, famously asserted: "Public health is purchasable. Within
natural limitations a community can determine its own death rate."

And no, he wasn't talking about tax-deductible health savings
accounts.

Madeline Drexler is a Boston-based journalist and author of "Secret
Agents: The Menace of Emerging Infections." She has a visiting
appointment at the Harvard School of Public Health.

Copyright 2006 The New York Times Company



From: Science Magazine (pg. 161), Jan. 13, 2006


By Jocelyn Kaiser

The Bush Administration this week proposed new federal standards for
analyzing the health and environmental risks that underlie
regulations; the standards ask for more detail on the evidence that a
pollutant causes harm.

Experts agree that the changes should improve the quality of
assessments, although one critic worries that the bar would be set so
high that it could also slow the pace of new regulations.

The draft bulletin "provides clear, minimum standards for the
scientific quality of federal agency risk assessments," says John
Graham, the outgoing director of the Office of Management and Budget's
(OMB's) Office of Information and Regulatory Affairs. Graham, a former
Harvard University professor who in the past 5 years has bolstered the
office's influence on agency rulemaking, says the standards should
help risk assessments pass scientific review more quickly.

The proposed rules include steps that aren't always routine, such as
requiring that agencies weigh both positive and negative studies. The
document also asks agencies preparing assessments that could have a
major economic or policy impact to look more closely at the
uncertainties, including variability in the population and both middle
estimates and the range of risks. Some agencies tend to emphasize the
high end of risk, says an OMB official. "This is a big change in
practice, especially for parts of EPA [the Environmental Protection
Agency]," explains the official.

Kimberly Thompson, a risk expert at the Massachusetts Institute of
Technology in Cambridge and president-elect of the Society for Risk
Analysis, applauds the greater emphasis on quantitative tools. "This
basically outlines things agencies should have been doing all along,"
agrees Granger Morgan of Carnegie Mellon University in Pittsburgh,
Pennsylvania, who chairs EPA's scientific advisory board. But
toxicologist Jennifer Sass of the Natural Resources Defense Council in
Washington, D.C., suggests that scientists won't be able to meet the
standards for risks for which there is little underlying data. "I'm
concerned that regulations will die at OMB" as a result, she says.

Graham leaves next month to head the Pardee RAND Graduate School in
Santa Monica, California. The comment period closes on 15 June, and
the proposed bulletin will also be reviewed by the National Academies.

Copyright 2006 American Association for the Advancement of Science



From: Risk Policy Report, Feb. 7, 2006


EPA staff scientists are strongly criticizing recent calls by the
White House Office of Management & Budget (OMB) to evaluate
alternative "models" for how chemicals affect health, saying the
mandate may force the agency to accept "junk science" and
unjustifiably lower environmental standards.

The criticism could lead to a rift with EPA science chief George Gray
over crafting agency comments on the draft guidance, which are due to
be submitted to OMB later this year.

Gray said in a recent interview that considering alternative models
submitted to the agency in support of risk assessments about
chemicals' biological impact will help improve decisions by detailing
the uncertainty surrounding those estimates.

Gray has made expanding uncertainty analysis in agency risk reviews a
key element of his emerging agenda (Risk Policy Report, Jan. 31, p1).

One senior EPA science manager said at a Jan. 23 meeting of
the agency's high-level Science Policy Council (SPC) that the bulletin
is "dreadful" and reflects a "1980s view of risk assessment" that does
not reflect currently accepted practices, according to sources who
attended the meeting.

OMB's recently proposed risk guidance calls on agencies to consider
all "plausible" models of the biochemical impacts of a contaminant at
low levels of exposure. "Where a risk can be plausibly characterized
by alternative models, the difference between the results of the
alternative models is model uncertainty... When model uncertainty is
substantial, the central or expected estimate may be a weighted
average of the results from alternative models," according to the
draft guidance.
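
The "weighted average" language in the draft guidance can be
illustrated with a toy calculation. Everything below is a hypothetical
sketch: the model names, risk estimates and weights are invented, and
the draft bulletin does not prescribe any particular weighting scheme.

```python
# Toy version of the guidance's "weighted average" of alternative
# models. Model names, risk estimates and weights are all invented.

def combined_risk(estimates, weights):
    """Weighted average of per-model risk estimates.

    estimates: model name -> estimated excess lifetime risk
    weights:   model name -> relative credence (normalized here)
    """
    total = sum(weights.values())
    return sum(estimates[m] * weights[m] / total for m in estimates)

# Hypothetical low-dose risk estimates from three "plausible" models
estimates = {"linear": 1e-4, "threshold": 0.0, "sublinear": 2e-5}
weights = {"linear": 0.5, "threshold": 0.2, "sublinear": 0.3}

central = combined_risk(estimates, weights)  # roughly 5.6e-05
spread = max(estimates.values()) - min(estimates.values())
# The spread across models is one crude measure of model uncertainty.
print(central, spread)
```

Note how the choice of weights, not just the data, drives the central
estimate -- which is precisely why critics worry about who gets to
pick the "plausible" models.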

But some EPA scientists and observers say this provision will result
in "model-shopping" among industry scientists. "This may open the door
to everyone coming in with their own model." Simply because they can
write a computer program to "compute results they prefer, doesn't mean
that a model is valid," according to one science policy expert.
"Accepting all 'plausible' models sets a pretty low bar. These models
should require strong biological support before they are considered,"
an agency source says.

Agency critics also say the approach is flawed because the strength of
the biological support should determine which models the agency
accepts for inclusion in chemical risk reviews. They also say it is
hard to find chemicals for which there is sufficient biological
information to conduct fully informed modeling exercises mandated by
OMB. "There are only a few dioxins, arsenics and mercury's out there
with substantial enough databases," according to another agency

But administration officials are strongly defending their approach.
During a Jan. 19 interview, then-OMB regulatory chief John Graham said
"if there's no evidence for looking at an alternative model, then
there's no value in considering it. Statistical backing is evidence,
although if the support is biological as well as statistical, then
that's stronger than statistical evidence alone."

And, during a Jan. 25 interview with Risk Policy Report, Gray agreed
with EPA critics that biological evidence is important, but argued
that multiple models are essential to understanding the uncertainty of
chemicals' impacts. "I don't think anyone would suggest that
statistical goodness-of-fit would determine [the quality of a model].
It's biological evidence that counts." But Gray said that examining
different models with "biological plausibility" is a useful way to
understand the uncertainty that surrounds chemical potency estimates.

Gray also said during his interview, "Models are just ways for us to
implement different biological theories about what may be going on.
What I think is important is when we don't know about different
biological pathways, considering multiple models is probably a good
idea."

Asked whether the call in the OMB bulletin would open the agency to
alternatives to the precautionary "linear model" currently used for
evaluating carcinogens, Gray said, "The linear model has certain
biological principles it is based on and what you can do is look at
how well the data fit that model. Other models have different
biological data that support them. That process helps you think about
the data and the model helps you quantify the uncertainty around
those estimates."

But industry analysts say EPA and academic scientists also manipulate
statistics to buttress their arguments about the health impacts of
environmental contaminants without strong biological support in agency
risk-based drinking water and air standards. Industry observers say
the linear model does not describe the impact of many potentially
carcinogenic chemicals at low doses and point to the agency's 1998
decision to use the linear model for chloroform, which was rejected
both by agency science advisers and in a March 2000 U.S. Court of
Appeals for the District of Columbia Circuit decision as untenable
(Risk Policy Report, April 18, 2000, p5).
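
The low-dose dispute is easy to see in a toy comparison of a linear
no-threshold model against a threshold model. The slope and threshold
values below are invented for illustration and are not any agency's
actual potency estimates.

```python
# Hypothetical dose-response sketch: linear no-threshold vs. threshold
# model. All numbers are invented, purely to show the divergence at
# low doses that the model-uncertainty debate turns on.

def linear_risk(dose, slope):
    """Linear no-threshold: excess risk proportional to dose."""
    return slope * dose

def threshold_risk(dose, slope, threshold):
    """No excess risk below the threshold; linear above it."""
    return slope * max(0.0, dose - threshold)

slope = 1e-3      # assumed excess risk per unit dose (illustrative)
threshold = 0.5   # assumed no-effect dose (illustrative)

# At high doses the two models nearly agree; at low doses the linear
# model predicts real risk where the threshold model predicts none.
for dose in (0.1, 1.0, 10.0):
    print(dose, linear_risk(dose, slope),
          threshold_risk(dose, slope, threshold))
```

Because regulatory exposures are usually in the low-dose region, the
choice between such models can change an estimated risk from
something to essentially zero.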

But EPA risk assessors say the OMB approach may recreate the confusion
sparked by open calls for a variety of risk models accepted by EPA
programs in the 1970s and 1980s. At the time, industry and other
outside experts could choose among the linear, one-hit, multi-hit,
Weibull, log-probit and other models. Eventually EPA and other federal
agencies
realized that interested parties would 'model shop' and use the model
that gave them the answers they preferred," according to the science
policy observer.

This prompted the agency to formally adopt the linear model, to
provide some comparability across risk estimates, in its 1986 cancer
risk guidelines and in a set of interagency principles adopted by the
White House Office of Science and Technology Policy.

But industry officials say risk modeling has grown more sophisticated
since that time and companies know convincing and detailed data on a
chemical's biochemical impacts are now required. "With the publication
of the 2005 updated cancer guidelines, companies know that significant
biological evidence is required to overcome protective assumptions
like the linear model EPA uses," according to one industry source.

Agency staff and environmentalists say they are concerned key
protections will be eroded if the bulletin is finalized in its current
form.

"This [OMB guidance] is worrisome because models can be constructed to
give any kind of low-dose answer you want, and the practical
alternatives" are limited, an EPA source says. And an environmental
scientist says, "This is an assault on protective assumptions with
far-reaching consequences." A science policy observer agrees, saying,
"Concerns about an assault on the linear model are accurate and well

But conservative science policy analysts also argue that academic and
EPA scientists can rely too heavily on statistically-backed but
biologically deficient arguments to support their views.

In a Jan. 17 critique of EPA air pollution standards for particulate
matter, American Enterprise Institute visiting fellow Joel Schwartz
says, "There are literally millions of plausible statistical models
relating to air pollution health outcomes, and no objective way to
choose among them. Under these circumstances, researchers have a
tendency to select those models that give the largest or most
statistically significant effects."



From: Ottawa (Canada) Citizen, Feb. 14, 2006


By Shelley Page and Susan Allan, Ottawa Citizen


- During the past 50 years, breast cancer incidence has climbed 90 per
cent

- Lung cancer rates have jumped more than 600 per cent

- Non-Hodgkins lymphoma, a cancer tied to a weakened immune system,
has increased 250 per cent

- Asthma affects more kids than ever before

Biomonitoring tests reveal toxic chemicals are in all of us; on that,
everyone seems to agree. So what does it mean for human health?
Scientists are only beginning to explore the links.

Infants start life polluted with PCBs, pesticides and 200 industrial
chemicals. Rocket fuel laces their mother's breast milk. Carcinogens,
hormone disruptors and the chemicals used to make GORE-TEX and Teflon
infuse everyone's blood.

New technology is making it possible to find ever smaller
concentrations of chemicals in people's blood.

Suddenly, TV stars in Britain and celebrities in California are
rolling up shirt sleeves to have their blood tested to draw attention
to the issue. When almost 100 journalists in Pittsburgh handed over
hair samples for testing, close to one-third showed elevated levels of
mercury. Last year, the Environmental Working Group in Washington,
D.C., released Body Burden: The Pollution in People, a study that
examined the levels of 210 chemicals in nine people. In April, the
World Wildlife Fund tested 39 members of the European
Parliament for 101 compounds.

Until recently, Canadians stayed out of the bloodletting. Then last
week the Toronto-based group Environmental Defence announced it had
tested 11 Canadians for 88 chemicals believed to be either
carcinogenic or to disrupt reproduction, hormonal function and/or
interfere with fetal development. The study found, on average,
participants had 44 chemicals in their bodies.

Renowned artist and environmentalist Robert Bateman was one of the
test subjects. Despite eating organic food, using environmentally
friendly cleaners and living on an idyllic West Coast island, his body
was revealed to be a repository for 48 toxins: heavy metals; PCBs
(polychlorinated biphenyls used in electrical transformers and now
banned); PBDEs (polybrominated diphenyl ethers used as fire
retardants); PFOs (perfluorinated chemicals used in stain repellents,
non-stick cookware and food packaging), pesticides and insecticides.

"I had no idea when they were taking those samples out of my arm that
there was a possibility that all those chemicals could be in there," a
bewildered Bateman told a journalist.

No one emerges unscathed.

But so what?

The Canadian report raised more questions than it answered.

Does the fact our bodies are laden with chemicals mean anything? Are
the increasing number of cancers, fertility problems, auto-immune
diseases, autism and other horrors related to these chemicals in our
bodies? Can we trace learning difficulties, mental illness and brain
fog to the chemicals our bodies absorbed as far back as the womb? Were
the harsh pollutants found in Mr. Bateman and others in concentrations
considered dangerous? If the majority of the studies' participants
were healthy, as it appears, does this mean the toxic findings are
insignificant?

For half a century, advancing societies have pumped the global
environment full of synthetic chemicals, while realizing previously
unimagined benefits. But during these five decades, our bodies,
particularly our fat cells, have become storage tanks for the
byproducts of vastly improved lifestyles.

Environmental activists call this "chemical body burden" -- the
price of technological advance.

A few years ago, scientists could not detect these chemicals. Now that
they can, they want to know if and how they affect human health.
Researchers have yet to draw a conclusive line between body burden and
health problems. Still, there are rising incidences of illness to
consider:

- During the past 50 years, breast cancer incidence has climbed 90 per
cent. The incidence of non-Hodgkins lymphoma, a cancer tied to a
weakened immune system, increased 250 per cent in the same period.

- During the past decade, several studies have linked weak or
defective sperm to employment in occupations with exposure to
chemicals and pesticides. In 1938, only one-half of one per cent of
men were functionally sterile. Today, that number is between eight and
12 per cent. Meanwhile, eight per cent of women of child-bearing age
suffer fertility problems.

- Asthma affects more kids than ever before -- about 12 per cent of
children under age 14.

Are these increases in health problems a result of the increase in
chemicals in our environment?

Sometimes yes. Lung cancer rates jumped more than 600 per cent in the
past 50 years, not because of industrial chemicals but because of
tobacco smoke: that span is when large numbers of women started
smoking.

As for the other diseases, no one is certain environmental toxins play
a role. Biomonitoring, the latest trend in public health, may provide
answers.

By examining chemicals in the blood and urine, it's sometimes easy to
see how public policy makes a difference to people's health.

By measuring cotinine -- a metabolite of nicotine -- U.S. researchers
showed that blood levels of secondhand tobacco smoke decreased 75 per
cent in adults during the 1990s, simply because smoking was banned at
work and in public places. Levels in children remain twice as high as
adults, showing that they continue to be exposed to second-hand smoke
at home, out of the reach of government policy.

It is hoped biomonitoring will reveal much more about the presence and
perhaps effects of toxins. The largest effort is the National Report
on Human Exposure to Environmental Chemicals, an ongoing $6.5-million
survey by the Centers for Disease Control and Prevention that measures
about 145 chemicals in 2,500 people across the United States every two
years. All year long, teams from the CDC dispatch four tractor-
trailers to neighbourhoods in 30 locations across the country,
interviewing residents, performing exams and sampling blood and urine.

Last July, it heralded its third nationwide report as the "largest and
most comprehensive of its kind ever released anywhere by anyone... a
giant step forward in our ability to understand the relationship
between exposures to various chemicals and the potential human health
effects."

Among the study's findings:

- Lead levels among children in the U.S. have dropped significantly
during the past few years. Only 1.6 per cent of children in the study
between the ages of one and five had elevated blood levels, down from
4.4 per cent in the early 1990s.

Lead exposure in children has been found to damage the brain and
nervous system; cause behavioural and learning problems, such as
hyperactivity; slow growth; and cause hearing problems and headaches.

"We continue to strive that all children are free from lead exposure
in their home, in their environment," said CDC director Dr. Julie
Gerberding, "but nevertheless, this is an astonishing public health
achievement and I think really speaks to the removal of lead from
gasoline."

Lead is one of the early successes of biomonitoring. Early concerns
about lead were met with skepticism by some. Today no level is
considered safe. According to a recent report in Science magazine,
when the U.S. and other countries set out to reduce automobile
emissions, models suggested lead levels in children would decrease
slightly as gas lead levels declined.

In 1976, the CDC began to check lead levels in children and adults.
Tests revealed blood lead levels declined about tenfold more than
expected between 1976 and 1980. These numbers prompted the
Environmental Protection Agency to remove lead from gasoline more
quickly.

- The 2005 CDC report also showed that organochlorine pesticides
eliminated from use in the 1980s -- the study names Aldrin, Endrin and
Dieldrin -- did not show up in the humans studied.

"Since these chemicals have no longer been used as pesticides, we have
virtually eliminated them from the human population," said Dr.

- Phthalates, plasticizers found in everything from plastics to vinyl
to hairspray, were found at levels that demand further investigation.

The CDC warned that just because an environmental chemical appears in
blood or urine does not mean it causes disease. "The toxicity of a
chemical is related to its dose or concentration in addition to a
person's susceptibility. Small amounts may be of no health
consequence, whereas larger amounts may cause adverse health effects."

Not surprisingly, this caution was echoed by the American Chemistry
Council, a trade group that represents 135 leading manufacturers in
the chemical industry, a $450-billion enterprise in the U.S.

"The benefits of chemistry have helped all of us live longer and
healthier lives," it said in response to the CDC report. "These
advances should not be underestimated or undermined by unnecessarily
alarming people about products that protect our health, keep us from
harm and contribute to our well being."

The World Wildlife Fund, meanwhile, said the report addressed only
"the tip of the iceberg" because it did not measure a range of
chemicals including fire retardants.

The CDC studies are used to determine what chemicals are getting into
people at what levels; to assess efforts to reduce certain exposures;
and to set research priorities.

"Now that we can accurately measure these exposures in humans,"

Dr. Gerberding said, "it sets the stage for us to get the kind of
information we really need: What does this mean for people? What does
it mean for me, that this is present or absent?"

In some countries, biomonitoring studies prompted a phase-out of
certain substances. Sweden, for example, banned fire retardants six
years ago after breast milk monitoring found that levels were doubling
every two to five years. Since that time, the corresponding curve of
concentration in breast milk has gone down.

As early as this Thursday, the European Union is expected to vote on
legislation that would overhaul chemical regulations in the EU and
force businesses to prove their products are safe. The Registration,
Evaluation and Authorization of Chemicals (REACH) legislation would
phase out chemicals found to be carcinogenic.

Last Tuesday in Brussels, a group representing two million physicians
urged members of the European Parliament to pass the legislation.

"We are in a serious situation," Dr. Dominique Belpomme told the
legislators. "Some 75 per cent of cancers are due to mutations induced
by environmental factors, mainly chemicals."

With all of this biomonitoring data, scientists are working to
determine typical exposures within the general population. Once they
establish "reference ranges," they can investigate for incidences of
sickness when they find atypically high exposure rates.
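
The reference-range approach can be sketched as: estimate an upper
percentile from measurements in the general population, then flag
exposures above it for follow-up. The data, units and cutoff below are
invented for illustration; real reference ranges come from large
surveys like the CDC's.

```python
# Hypothetical sketch of a biomonitoring "reference range": take an
# upper percentile of a reference population as the top of the typical
# range, then flag atypically high exposures. All data are invented.

def percentile(values, pct):
    """Crude percentile: value at rank floor(n * pct / 100), capped."""
    ordered = sorted(values)
    k = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return ordered[k]

# Invented blood concentrations (ug/L) from a reference population
population = [0.2, 0.3, 0.3, 0.4, 0.5, 0.5, 0.6, 0.7, 0.8, 1.1]
upper = percentile(population, 95)  # upper bound of the typical range

def atypical(measurement):
    """True when a measurement exceeds the reference range."""
    return measurement > upper

print(upper)           # 1.1
print(atypical(2.4))   # True: worth investigating
print(atypical(0.4))   # False: within the typical range
```

An exposure above the range is a signal for investigation, not a
diagnosis -- as the Fallon, Nevada example below shows, the follow-up
work is what links an exposure to illness.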

Science magazine notes a 2001 case in which the state of Nevada
asked CDC to study higher than normal leukemia rates in Fallon,
Nevada. Of the 110 chemicals measured, tungsten and arsenic were found
in much higher concentrations among all residents than in the rest of
the population. The government has put tungsten on its priority list
to determine if the metal increases cancer rates in animals.

CDC data also showed that eight per cent of women of childbearing age
-- higher than anticipated -- have levels of mercury, a potent
neurotoxin, above the level the government considers safe. They are
now trying to determine how much mercury comes from fish, drinking
water and other sources.

Mercury is known to cause health problems. So, too, is lead. But other
toxins found during biomonitoring are more controversial.

When the CDC reported that levels of several phthalates -- ingredients
used in nail polish, cosmetics and fragrances -- were higher in women
aged 20 to 40 than in other groups, the Environmental Working Group
launched a campaign to remove these compounds from cosmetics. Industry
groups accused the EWG of needlessly scaring women. The CDC revised
its findings when a second report with a much larger sample size
didn't find elevated levels among women of child-bearing age. In its
third report, released in July, it suggests the compounds warrant
further investigation.

It's also not known if PBDEs -- or polybrominated diphenyl ethers --
are toxic to humans. These compounds are widely used as flame
retardants in mattresses, electrical equipment and other household
products. Researchers found very high levels in mothers' breast milk.

As a result, the European Union and California banned the compounds.
The manufacturer is voluntarily phasing them out of products, but it's
still not known if they make anyone sick.

Others are trying to identify the source of such toxins.

The Harvard University School of Public Health is giving people
backpacks that catch air particles as they move about their daily
lives. They want to gauge where the greatest exposures are to
chemicals. Their studies have shown that people spend 65 per cent of
their time in their residences, 25 per cent in some other indoor
environment, five to seven per cent in transit, and usually less than
five per cent of their time outdoors.

Harvard is especially interested in indoor pollutants (fungi, dust
mites, nitrogen dioxide, tobacco smoke, lead, asbestos, volatile
organic compounds, formaldehyde, radon) because concentrations are
many times greater than outdoor levels.

Instead of waiting for conclusive evidence, many activists say there
is enough information to start banning chemicals. They argue that
manufacturers should have to prove new chemicals are safe before they
are approved for use. This requires embracing the precautionary
principle, which argues that in the absence of conclusive evidence in
face of a serious threat, we must still take action.

Charlotte Brody, of the California-based environmental group
Commonweal, took part in the EWG body burden survey.

"The 11 Canadian volunteers join the 12 notable Californians that were
biomonitored earlier this year and the 32 other people around the
world who now know their toxic chemical levels," she said this week in
an interview.

"I am one of those 32 people who were shocked to learn about the
levels of mercury, phthalates, dioxins and other chemicals in me."

Even more shocking, she says, "no government and no polluting industry
knows how these chemicals in people are connected to the growing
number of people with cancer, learning disabilities, infertility, and
other illnesses."

Brody's colleague, Davis Baltz, also took part in the biomonitoring
survey. He says the Canadian report released last week, and others
like it, shows "environmental contaminants have penetrated into every
nook and cranny of our lives." He said people can try to reduce
personal exposure, but it's up to governments to protect the people.

He called the REACH proposals in Europe, which would overhaul the
chemical industry, "the most significant environmental legislation in
30 years."

REACH would apply not only to chemicals manufactured in Europe but
also to chemicals imported into Europe, which means chemical makers in
the U.S., Canada and elsewhere would have to abide by its provisions.
The centerpiece of REACH is the requirement that chemicals be tested
before they are allowed into commerce, and thus into the wider
environment and, as tests increasingly show, into our bodies.

"This shifting of responsibility is a sea change in how chemicals are
regulated and will create momentous market shifts encouraging safer
alternatives and driving bad actors out," Mr. Baltz said this week.

He added that in light of Health Canada's recent decision to ban an
increasing number of chemicals in cosmetics, it would be both
appropriate and very helpful if Health Canada publicly signaled its
support for a strong REACH program. A vote on the legislation is
expected as early as this Thursday in the European Parliament.

Rick Smith, of Environmental Defence, also pointed to REACH and to the
U.S.'s proposed Child, Worker, and Consumer-Safe Chemicals Act,
which he said has the potential to follow the European lead.

"When these frameworks become law, citizens of the EU and the U.S.
will be granted a higher level of protection for their health and
safety than Canadians," he said. He called on Canada to become a world
leader, noting it is the third worst polluter in the industrialized
world.

He said the opportunity exists to bring the regulation of toxic
chemicals in Canada up to international standards. In Canada, the
Canadian Environmental Protection Act regulates toxic chemicals used
and produced by industry. "It's time to give this ineffective piece of
legislation a makeover during its mandatory five-year review beginning
in the fall of 2005."

Under CEPA, safety testing is not required for most chemicals.

"Industry is not held accountable for its chemicals. Pollution
prevention and the phase-out of toxics is only granted lip service.
The result is that as each year passes, increasing volumes of
chemicals are entering the environment and making their way into our
bodies."

Meanwhile, Health Canada told reporters that the sample size of 11
people was small but that it would look into the findings.

Copyright The Ottawa Citizen

Return to Table of Contents


From: Center on Budget and Policy Priorities, Jan. 26, 2006
[Printer-friendly version]


Early Signs Suggest Inequality Now Growing Again After Brief Decline

In most states, the gap between the highest-income families and poor
and middle-income families grew significantly between the early 1980s
and the early 2000s, according to a new study by the Center on Budget
and Policy Priorities and the Economic Policy Institute. The study is
one of the few to examine income inequality at the state as well as
national level.

The incomes of the country's richest families have climbed
substantially over the past two decades, while middle- and lower-
income families have seen only modest increases. This trend is in
marked contrast to the broadly shared increases in prosperity between
World War II and the 1970s.

In addition, while income inequality declined following the bursting
of the stock and high-tech bubbles in 2000 -- both of which were quite
costly to the highest-income families -- early national-level data
suggest that inequality began growing again in 2003. Incomes at the
top have rebounded strongly from the stock market correction, while
the negative effects of the recent recession on low- and moderate-income
families have lasted longer than usual. Thus, it appears that the two-
decade-long trend of worsening income inequality has resumed.

The study is based on Census income data that have been adjusted to
account for inflation, the impact of federal taxes, and the cash value
of food stamps, subsidized school lunches, and housing vouchers.
Income from capital gains is also included. The study compares
combined data from 2001-2003 with data from the early 1980s and early
1990s, time periods chosen because they stand as comparable low points
of their respective business cycles. Its findings include:

** In 38 states, the incomes of the bottom fifth of families grew more
slowly than the incomes of the top fifth of families between the early
1980s and the early 2000s. In these 38 states, the incomes of the
richest grew by an average of $45,800 (62 percent), while the incomes
of the poorest grew by only $3,000 (21 percent). In other words, the
poorest families -- who saw an increase in purchasing power of only
$143 per year -- have not fared nearly as well as the richest families
during this period. In only one state -- Alaska -- did the incomes of
low-income families grow faster than the incomes of the top fifth.
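The "$143 per year" figure in the bullet above is simple division of
the bottom fifth's total gain over the comparison window; a minimal
sketch, assuming a roughly 21-year span between the early 1980s and
the early 2000s:

```python
# Reproduce the study's "$143 per year" purchasing-power figure.
# Assumption: the comparison window spans roughly 21 years
# (early 1980s to early 2000s).
total_gain_bottom_fifth = 3000   # dollars, average gain for the poorest fifth
years = 21                       # assumed length of the comparison window
per_year = round(total_gain_bottom_fifth / years)
print(per_year)  # 143
```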

** In 39 states, the incomes of the middle fifth of families grew more
slowly than the incomes of the top fifth of families between the early
1980s and the early 2000s. In no state did the income gap (degree of
income inequality) between middle- and high-income families narrow
during this period.

** Within the top fifth of families, the wealthiest families enjoyed
the highest income growth over the past two decades. In the 11 states
that are large enough to permit this calculation, the incomes of the
top 5 percent of families rose between 66 percent and 132 percent
during this period. This is faster than the income growth among the
top fifth of families as a whole in these states -- and much faster
than the income growth among the bottom fifth of families in these
states, which ranged from 11 percent to 24 percent.

** The five states with the largest income gap between the top and
bottom fifths of families are New York, Texas, Tennessee, Arizona, and
Florida. Generally, income gaps are larger in the Southeast and
Southwest and smaller in the Midwest, Great Plains, and Mountain
states. Income gaps tend to be larger in states where incomes in the
bottom fifth are below the national average, and smaller in states
where incomes in the bottom fifth are above the national average.

** The five states with the largest income gaps between the top and
middle fifths of families are Texas, Kentucky, Florida, Arizona, and
Tennessee.

Income inequality increased rapidly during the 1980s. In the latter
part of the 1990s, exceptionally low unemployment produced relatively
broad-based wage growth. That broad-based growth ended with the 2001
economic downturn: growth in real wages for low- and moderate-income
families began to slow, and by 2003 wages began to decline. Thus far,
the recovery from the downturn has not been strong enough to generate
the kind of income gains among low- and middle-income families seen in
the late 1990s.

Growing Inequality Has Costly Consequences

"Growing income inequality harms this nation in a number of ways,"
stated Jared Bernstein, Senior Economist, Economic Policy Institute
and co-author of the report. "When income growth is concentrated at
the top of the income scale, the people at the bottom have a much
harder time lifting themselves out of poverty and giving their
children a decent start in life."

"A fundamental principle of our economic system is that the benefits
of economic growth will flow to those responsible for their creation.
When how fast your income grows depends on your position in the income
scale, this principle is violated. In that sense, today's
unprecedented gap between the growth of the typical family's income
and productivity is our most pressing economic problem."

States Can Partially Offset Trend Toward Larger Income Gaps

The biggest cause of rising income inequality over the past two
decades has been the erosion of wages for the 70 percent of workers
with less than a college education. That erosion, in turn, reflects
long periods of higher-than-average unemployment, globalization, the
shift from manufacturing jobs to low-wage service jobs, immigration,
the weakening of unions, and the decline in the minimum wage.

More recently, even college-educated workers have experienced real
declines in wages, in part because of offshore competition. While many
of these economic factors are largely outside the control of state
policymakers, "there's a lot that states can do to mitigate the
effects of increasing inequality," noted Elizabeth McNichol, a Senior
Fellow at the Center on Budget and Policy Priorities and co-author of
the report. Possible steps include raising the state minimum wage,
strengthening supports for low-income working families, and reforming
the unemployment insurance system. In addition, states can pursue tax
policies that partially offset the growing inequality of pre-tax
incomes.


Greatest Income Inequality Between the Top and the Bottom, Early 2000s

1. New York
2. Texas
3. Tennessee
4. Arizona
5. Florida
6. California
7. Louisiana
8. Kentucky
9. New Jersey
10. North Carolina

Greatest Income Inequality Between the Top and the Middle, Early 2000s

1. Texas
2. Kentucky
3. Florida
4. Arizona
5. Tennessee
6. New York
7. Pennsylvania
8. North Carolina
9. New Mexico
10. California

Greatest Increases in Income Inequality Between the Top and the
Bottom, Early 1980s to Early 2000s

1. Arizona
2. New York
3. Massachusetts
4. Tennessee
5. New Jersey
6. West Virginia
7. Connecticut
8. Hawaii
9. Kentucky
10. South Carolina

Greatest Increases in Income Inequality Between the Top and the
Middle, Early 1980s to Early 2000s

1. Kentucky
2. Pennsylvania
3. West Virginia
4. Indiana
5. Hawaii
6. Texas
7. Tennessee
8. North Carolina
9. Arizona
10. New York

Greatest Increases in Income Inequality Between the Top and the
Bottom, Early 1990s to Early 2000s

1. Tennessee
2. Connecticut
3. Washington
4. North Carolina
5. Utah
6. Texas
7. West Virginia
8. Pennsylvania
9. Florida
10. Maine

Greatest Increases in Income Inequality Between the Top and the
Middle, Early 1990s to Early 2000s

1. Kentucky
2. Pennsylvania
3. North Carolina
4. Indiana
5. Tennessee
6. Texas
7. West Virginia
8. Vermont
9. New Jersey
10. Connecticut

The Center on Budget and Policy Priorities is a nonprofit, nonpartisan
research organization and policy institute that conducts research and
analysis on a range of government policies and programs. The Economic
Policy Institute is a nonprofit, nonpartisan think tank that seeks to
broaden the public debate about strategies to achieve a prosperous and
fair economy.

Center on Budget and Policy Priorities
820 First Street NE, Suite 510
Washington, DC 20002
Tel: 202-408-1080 Fax: 202-408-1056
center@cbpp.org http://www.cbpp.org

Economic Policy Institute
1333 H Street NW, Suite 300
Washington, DC 20005
Tel: 202-775-8810 Fax: 202-775-0819
epi@epi.org http://www.epi.org

Return to Table of Contents


  Rachel's Precaution Reporter offers news, views and practical
  examples of the Precautionary Principle, or Foresight Principle, in
  action. The Precautionary Principle is a modern way of making
  decisions, to minimize harm. Rachel's Precaution Reporter tries to
  answer such questions as, Why do we need the precautionary
  principle? Who is using precaution? Who is opposing precaution?

  We often include attacks on the precautionary principle because we  
  believe it is essential for advocates of precaution to know what
  their adversaries are saying, just as abolitionists in 1830 needed
  to know the arguments used by slaveholders.

  Rachel's Precaution Reporter is published as often as necessary to
  provide readers with up-to-date coverage of the subject.

  As you come across stories that illustrate the precautionary 
  principle -- or the need for the precautionary principle -- 
  please Email them to us at rpr@rachel.org.

  Peter Montague - peter@rachel.org
  Tim Montague   -   tim@rachel.org

  To start your own free Email subscription to Rachel's Precaution
  Reporter send a blank Email to one of these addresses:

  Full HTML edition: join-rpr-html@gselist.org
  Table of Contents edition: join-rpr-toc@gselist.org

  In response, you will receive an Email asking you to confirm that
  you want to subscribe.

Environmental Research Foundation
P.O. Box 160, New Brunswick, N.J. 08903