Wednesday, March 28, 2012

The Neighborhood Toxicologist is Evolving


When I started writing this blog, my goal was to explain why certain chemicals in consumer products were toxic, as well as to discuss some of the uncertainties in toxicology. Over the years, all this writing about one chemical after another - many of them industrial-age chemicals - got me thinking about all the defenses we have that protect us, to some degree, against toxics. Would these systems hold up to the onslaught of chemicals in the world today? Why do we handle some chemicals better than others? How can we better predict and prevent toxicity?

One thing led to another, which eventually led to a book! So I am happy to announce the publication of my first toxicology book, Evolution in a Toxic World, and another blog by the same name. Hope to see you there.

Monday, August 16, 2010

Peanut allergies in a nutshell

This summer I met a family from Australia who mentioned their daughter was highly allergic to peanuts. Wondering if all the concern about peanut allergies was yet another case of Americans overreacting to anything health-related, I asked if they’d ever heard of schools in Australia banning peanuts.

“Our daughter’s school has been peanut-free for years,” they replied, as if it were an odd question. They added, “Lots of schools are.”

Like many people, I’ve also wondered if the seeming rise in prevalence of peanut allergies was real. After all, how many times have I heard someone say, “Well, we all grew up with peanut butter, and I didn’t know anyone who was allergic. What’s all the fuss about now?”

Turns out -- according to several studies published in medical and allergy journals over the past decade -- that peanut and tree nut related allergies, or hypersensitivity of the immune system to specific proteins in these nut families, truly is on the rise in Australia, the US and other Westernized countries. It is now estimated that over 1% of the US population has peanut or tree nut allergies, and one study reported a doubling of peanut allergies in children over a five year period.

So what’s going on? Has something changed in the way we are exposed to peanuts, tree nuts and other increasingly allergenic foods (sesame and soy, for example)? Or is it simply that our immune systems are going haywire?

The immune response is complex. While we’re all familiar with the role of antibodies, which confer immunity to anything from the common cold to polio, they are only one of five different types of immune proteins, or immunoglobulins. Other immune proteins protect vulnerable regions of the digestive and respiratory tracts from pathogens, prompt our bodies to produce antimicrobials, and help us get a “jump” on our response once pathogens have breached other protections and entered our bloodstream.

Then there is immunoglobulin E (IgE). Although recent studies suggest that IgE may protect against certain parasitic worms (less of a problem these days in western countries compared with other regions of the globe), IgEs are most notorious for their role in causing allergic reactions, or an inappropriate immune response to a relatively harmless substance. Basically, once a body is sensitized by a potential allergen, a bit of basement mold perhaps, or a whiff of pollen from the old oak tree, IgEs are then distributed throughout the body in association with immune cells like mast cells and basophils, which lie in wait for the next exposure.

When subsequent exposure occurs, these sensitized immune cells release a slew of potent chemicals including histamine, cytokines, and prostaglandins. These are all useful chemicals when released at the appropriate time and place, as during a normal immune response when the body is combating a pathogen or healing a wound (and even then they may cause some damage to healthy cells and tissues). But as far as anyone knows, there is no appropriate time or place for an allergic response. Yet whatever the trigger, when these chemicals are released the body responds.

The allergic responses many of us experience are caused by the increases in vascular permeability, constriction of smooth muscles (including those around the smallest passages of our lungs), and increased mucus production caused by histamine and other chemicals. The impacts on a body can range from mild to severe.

So, while I might suffer through a month or two of asthma, sneezing and itchy eyes (along with the more than 20% of the U.S. population affected by allergies), thankfully my IgEs seem to respond relatively mildly. But for some, an IgE response can cause anaphylaxis, a far more severe and systemic condition which may include vomiting, constricted breathing, and plunging blood pressure. The onset of these life-threatening responses can lead to anaphylactic shock and can occur within minutes of exposure.

A 2008 study published in the journal Current Opinion in Allergy and Clinical Immunology estimated that allergic anaphylaxis may occur in up to 2% of the U.S. population at some point in their life, with varying degrees of severity. And the risk of occurrence, particularly in children, is on the rise.

Which brings us to some of the top triggers for anaphylaxis - a list that includes many common substances like latex, insect venom (e.g. bee stings), medications (e.g. penicillin) and certain foods including shellfish, milk, tree nuts, and peanuts. Of these, food allergies are among the most common triggers of anaphylaxis requiring emergency room treatment. By some estimates, in the US food allergies account for roughly 30,000 visits to the emergency room and at least 100 fatalities a year, and several reviews of the medical literature including a 2009 review published in Clinical Pediatrics conclude that peanuts and tree nuts cause the majority of reported allergy-induced fatalities.

When a food is allergenic, the allergic reaction is usually caused by a specific type of protein contained in the food. In peanuts, eight different allergens have been identified. What differentiates allergenic proteins from other food proteins is that they resist acid, heat, and enzymatic breakdown in the gut. So they tend to be identified by the body’s immune system as an intruder rather than a nutrient, with potentially devastating consequences.

Efforts to understand why the US and other Westernized populations have a higher prevalence of peanut allergies than, say, China, where peanut consumption is also high, have identified the U.S. food industry’s practice of dry roasting peanuts rather than boiling or frying them as one potentially relevant factor. The higher temperatures reached during dry roasting increase the allergenicity of peanut proteins. Other factors contributing to higher prevalence likely include differences in diet and in the routes (oral or dermal) and timing of nut exposures. Additionally, scientists have hypothesized that improved hygiene and reduced disease incidence in young children may also contribute to the increased prevalence of allergies in general. Scientists and allergists have also speculated that increased use of peanuts in common consumer products, from soaps to shampoos and skin creams, may contribute to creating a more sensitized population.

Whatever the underlying cause, some people, once they are sensitized, need only ingest a very small amount of peanut product (50 milligrams, approximately one-hundredth of a teaspoon, down to as low as 2 mg) to trigger what could become a life-threatening reaction.

It is a mind-boggling response. Consider the tiniest oral exposure setting off a systemic response within minutes. How does this happen?

“What you think of as low dose might contain plenty of stable antigen [or allergenic protein],” explains Southeastern Louisiana University Immunologist Dr. Penny Shockett. “Also,” Shockett added, “once the system is sensitized it doesn't necessarily take a high dose for tripping the mast cell response. If you are highly sensitized (i.e. allergic) you have more sensitized mast cells in tissues (or basophils in the blood) sitting and waiting for the allergen, which can potentially detect it quickly and strongly.”

Studies indicate that not only has the prevalence of peanut allergies risen over the past few decades, but so has the risk of anaphylaxis in general, at least in the United States and other Western countries. As we alter our diets based on the ever-changing suggestions of health and nutrition experts, as cultures adopt one another’s diets, and as diseases are reduced through changes in hygiene and vaccines, scientists are in a quandary as to the causes of increased peanut and tree-nut sensitivity. Hopefully both the underlying causes and solutions for those who are allergic will be identified sooner rather than later.

For those currently affected by severe allergies, the focus is on management. In addition to education of individuals with allergies, particularly children, this means a range of options for schools. First and foremost are appropriate medical and treatment plans in schools, followed by education of the school community and strategies to help allergic individuals avoid exposure. In the case of peanut allergies, avoidance in schools ranges from peanut-free buildings to peanut-free classrooms or separate lunch tables. As to the most effective management practice, the jury is still out.

Emily Monosson, Ph.D. writes and blogs as the Neighborhood Toxicologist, is a member of the GMRSD school committee, and is a member of the district’s Wellness Committee. The information presented here is the product of her own research into the issue and does not represent the opinion or work of the GMRSD school district, or the Wellness Committee.

Wednesday, May 05, 2010

McElligott's Plastic

“Ask for a cone, save the environment!” proclaimed the sign at the local Creamee. The girls asked for cups anyway, to catch the drippings of the oversized soft-serve half-and-half cones they'd ordered. “Guess we’re not saving the environment today,” said one, dipping her plastic spoon into the Styrofoam cup.

Styrofoam is one incarnation of polystyrene plastic – more affectionately known as “#6” or, the plastic we can’t recycle. Polystyrene is also the black casing of my computer, my bicycle helmet, the foamed clamshell we were offered to carry home the remainders from a local restaurant and the countless little white Styrofoam pellets, degraded from sheets of weathered insulation, that I spent the weekend picking from the weeds at the local junkyard-turned-conservation-land along with a handful of diligent volunteers.

While collecting the little white bits from the earth, I imagine how each year some portion of those beads along with larger rafts of insulation are blown or washed into the bordering Sawmill River, some journeying only as far as the local swimming hole, while others carried by the Sawmill make their way to the Connecticut and beyond. I imagine their journey a perverse version of Dr. Seuss’s McElligot’s Pool, where you never know what exotic species might make their way from the deep ocean to a backyard pond, only these make their way to the deep ocean. This isn’t fanciful fiction. Just this year scientists confirmed the presence of a plastic “patch” of our own in the North Atlantic, the evil twin of the infamous North Pacific trash gyre – a region known for its accumulation of plastic from soccer balls to microscopic bits of Styrofoam and other assorted plastics. Looking around at all the Styrofoam I’ve missed, the scientist in me wants to radio-tag those naughty bits and send them on their way. Maybe in a few years we’d know for sure if pieces of Montague were swirling about the wide Sargasso Sea.

Captain Charles Moore, an adventurer, environmentalist and researcher credited with discovering the North Pacific patch, once commented on the return of plastic to the oceans and its consumption by marine life in an article for Natural History Magazine. “Ironically,” wrote Moore, “the debris is re-entering the oceans whence it came; the ancient plankton that once floated on Earth's primordial sea gave rise to the petroleum now being transformed into plastic polymers. That exhumed life, our ‘civilized plankton,’ is, in effect, competing with its natural counterparts, as well as with those life-forms that directly or indirectly feed on them.” Research by Moore and others now shows that plastics in the ocean can accumulate long-banned toxicants like PCBs and DDT, and there is some concern that, once ingested, contaminated plastics might release these chemicals, along with others used in plastics production (colorants, fire retardants and plasticizers), into their host. Someday there may be no need to shrink-wrap seafood.

Like other plastics, polystyrene – the base material for Styrofoam or foamed polystyrene clamshell food containers, microwavable cups (think cup-o-noodles), plastic plates and coffee cups – is a polymer, a chemical chain of repeating units, like beads on a string. In this case the beads, or monomers, are styrene. Produced naturally by plants and animals, styrene – like many chemicals – is relatively non-toxic in these small amounts. And, like many chemicals, natural production is dwarfed by human production (at least in localized concentrations), which in the case of styrene tops 13 billion pounds a year in the US alone. The majority is used to produce polystyrene. While polystyrene might not appear on the top ten list for toxic chemicals, it is made from benzene. Over 50% of all benzene produced from oil is eventually turned into styrene. And sweet-smelling benzene is nasty stuff. Just a whiff brings me back to organic chemistry lab in college. We used it without a care until the day it was officially deemed a carcinogen – and then we didn’t. At the risk of showing my age, that was in 1979. And in a strange case of collective heads-in-sand, benzene had been known to cause cancer since the 1920s. (We can thank industry, along with federal regulators, for that small lapse.) Benzene is now one of the few industrial chemicals officially listed as a known human carcinogen – causing leukemia in this case – and it is industry workers who are most at risk.

So what happens to all that polystyrene? The EPA estimated that in 2007, nearly 3 billion pounds of it was used in the production of disposable goods, including foamed polystyrene plastic plates, cups, egg cartons, and packaging peanuts. Aside from the packaging peanuts we might bring to a UPS store for reuse, with a recycling rate for all polystyrene estimated as a mere 0.8%, most will end up in a landfill. At worst it’ll end up in our local streams, rivers and oceans.

And when it does, according to new research by Katsuhiko Saido and colleagues from Nihon University in Chiba, Japan, it will not only degrade more rapidly than it would on land (under certain marine conditions) but will also release toxicants, including a small amount of bisphenol A, notoriously linked with polycarbonate plastics, and styrene, which brings us back to – d’oh!

The good news is that like most other plastics, technically, polystyrene foam is recyclable. In fact, it can be recycled back into many of the products from which it came – plates, clamshells, egg cartons and insulation, or into less desirable “dead end” products like light-weight concrete. The bad news is that the process isn’t cost effective, at least in the US – and so isn’t all that popular.

Then there are the more creative uses for this problem plastic. Some, like Cass Phillips, writer and co-owner of Kamuela Greenhouse/Specialty Orchids in Waimea, Hawaii, have considered turning the environmental blight into beauty. With USDA grant funding, Phillips is currently testing the utility of various locally collected and processed recycled plastics as a growth medium additive with an eye to providing a durable low cost product for the Hawaii orchid industry. When asked about foamed polystyrene, she responded:

“I found that a certain type of orchid, miltoniopsis (aka the pansy orchid), grew fastest and largest in straight granulated polystyrene foam, in a trial that included three controls (cinder, coconut fiber and orchid bark)… What truly stunned me is that the pansy orchids went into their bloom cycle 2-3 months before any other sample." There could be several reasons for the accelerated growth. One might suppose improved water retention could be a factor, but the ground polystyrene foam dried out almost instantly. That leaves us pondering other possibilities, including one that could be considered insidious: the release of growth-inducing chemicals. Sorting out the differences will require further analysis, but in the meantime Phillips has found herself wondering about the wisdom of schools using Styrofoam plates in their lunch programs, and the consequences of slurping down cups-o-soup from Styrofoam tubs.

Of course the best way to keep this ubiquitous plastic from polluting the oceans and clogging the landfills is to reduce use (according to the American Chemistry Council, the PS industry has been in decline for the past four years, though they give no reason), and close the recycling loop. More immediately, I’m sure there’ll be many more opportunities to pick Styrofoam from newly acquired conservation land, and for those rare occasions when I can’t clean my plate while dining at one of the local eateries, I’ve begun asking for foil or cardboard for the leftovers.

Monday, January 25, 2010

Yankee Swap: tritium contaminated water anyone?

First published in the Montague Reporter

First we hear about tens of thousands of picocuries* in the groundwater beneath Vermont Yankee Nuclear Power plant, next it’s over one hundred gallons of water contaminated with over 2 million picocuries in some sort of concrete trench. Oops. Besides sloppy practices, lax monitoring, shoddy construction, and obfuscation (what underground pipes?) what do these numbers mean? Should we worry about all that tritium? And what the heck is a picocurie anyway?

Tritium is a radioactive isotope of the element hydrogen. What sets apart the radioactive elements from the non-radioactive is their lack of stability. They can disintegrate spontaneously, sometimes changing into other elements over time. Uranium, for example, decays into lead (although it may take billions of years), while tritium decays into helium over a matter of decades (half of any given amount is gone in roughly a dozen years).

The difference between a radioactive element and a plain old element depends upon what’s in the nucleus. The nucleus of any atom consists of protons (positively charged particles) and neutrons (uncharged particles), surrounded by a cloud of electrons (negatively charged particles). While the chemical properties of an element mostly depend on the number of protons in the nucleus, the radioactive properties are determined by the number of neutrons and the balance between protons and neutrons. Hydrogen and its radioactive twin, tritium, have the same single proton (and so, the same chemical properties), but while the most common form of hydrogen has no neutrons, tritium has two. Tritium occurs naturally in small amounts, in addition to being produced by man, either purposefully for research and consumer products (ever wonder about that glowing watch dial or that luminous EXIT sign?) or as a by-product of the nuclear industry.

Because tritium is chemically similar to hydrogen, it can and does take the place of hydrogen; when this happens in water, tritiated (radioactive) water is formed.

The radiation released by tritium is referred to as a beta particle. Beta particles, or electrons, are a form of ionizing radiation capable of stripping electrons from other atoms, causing a sort of chain reaction of destabilization, and breaking chemical bonds. Although the beta particles released by tritium are low energy, incapable of penetrating barriers such as skin (unlike some other forms of radiation), should tritium enter the body through inhalation or umm…water, those emitted particles would then have full access to vulnerable tissues and molecules.

Tritiated water is particularly insidious. The tritiated water lurking below Vermont Yankee, for example, could be absorbed by the root systems of nearby plants, or imbibed by unsuspecting animals. Once consumed, it distributes rapidly throughout the body of a plant or animal. Additionally, ingestion of tritiated water can lead to incorporation of tritium into organic materials like DNA, proteins and amino acids. Only, unlike hydrogen, tritium will eventually decay, leaving behind an atom of helium and releasing a beta particle with enough energy to break nearby chemical bonds.

In the body, the making and breaking of the chemical bonds between atoms is a highly coordinated process, normal and essential to life. The “unscheduled” breaking of chemical bonds can cause permanent cell damage, damage to the cell’s DNA or cell death.

The human genome is contained within the DNA of our 46 chromosomes located in a cell’s nucleus. Replication of these chromosomes during cell division is a critical process, requiring a number of complex biochemical interactions including copying and construction of identical chromosomal pairs that are then split off into the newly divided cell. Because integrity of the genetic material is essential to life, not only are there biochemical systems involved in maintaining chromosomes during division, but there are also a number of mechanisms by which errors may be repaired.

Say a few molecules of tritium enter the cell and cozy up to nuclear DNA. At some point in their unstable lifetime they will disintegrate, releasing their energized electrons. Should the cell’s chromosomes be in their pathway, the transfer of energy from electron to chromosome may be enough to break off a bit of chromosome. Sometimes, depending on conditions within the cell and the location of the break, the broken pieces may rejoin the chromosome, leaving little or no evidence of damage; other times a broken piece remains separate, becoming a chromosomal deletion; or both the deleted piece and the damaged chromosome will be copied as if nothing had happened, passing the alteration along. Or, instead of direct interference with DNA, emitted electrons may interact with other molecules such as oxygen, causing “indirect” damage by creating highly reactive oxygen radicals.

Since DNA tends to be a target of ionizing radiation, tissues made up of cells that are rapidly dividing – such as blood forming organs constantly churning out cells – tend to be far more sensitive to radiation damage than say, brain cells. Similarly, embryos and fetal tissues are more susceptible to radiation damage than adult tissues.

There is some good news amidst all this havoc and destruction. That is, most if not all cells have some capacity for DNA repair. These include an array of enzymes and proteins that find and correct damaged DNA in addition to a number of antioxidants capable of disarming those reactive oxygen radicals. The presence of such repair mechanisms has led some to speculate that exposure to very low amounts of radiation may be a good thing, “priming” these repair systems and leading to greater protection at low levels of exposure – a phenomenon referred to as hormesis. However, a National Academy of Sciences report on The Health Effects of Low Level Ionizing Radiation, published in 2007, found no available evidence of radiation-induced hormesis in mammals, and concluded that any single track of ionizing radiation (for example, a single ejected electron in the case of tritium) has the potential to cause cellular damage.

And, despite the capacity for repair, sometimes the system is overwhelmed, or sometimes the repair itself introduces a new error (think sloppy auto mechanic.) At this point the genetic damage has the potential to become permanent, or “fixed.” Permanent damage to DNA can result in the eventual development of cancerous cells, a defect in an exposed fetus, or a mutation passed on to the next generation. While the evidence for carcinogenicity in human populations is strong for some radioactive isotopes like strontium-90, plutonium and radium, the health effects of tritium, a weak beta emitter, are less clear.

Which brings us to concentration. How much is too much? What does it mean that the groundwater has over 200,000 picocuries of tritium per liter of water, or that there are “troughs” with over 2 million picocuries per liter? A curie (named in honor of radiation pioneers Pierre and Marie Curie) is the quantity of a radionuclide in which 37 billion atoms disintegrate each second. That’s a lot of disintegration, and in the case of tritium it would mean a lot of beta particles whizzing about. But the amounts drawn from the groundwater were measured in picocuries per liter – a picocurie being one millionth of a millionth of a curie. So, in a liter of Vermont Yankee groundwater, roughly 7,400 beta particles would be released every second, a rate that will slowly decline until all the tritium has disintegrated to helium (the half-life of tritium is about 12.3 years).
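For readers who like to check the arithmetic, here is a quick sketch in Python. It assumes only the standard definitions (1 curie = 37 billion disintegrations per second, 1 picocurie = a trillionth of a curie, tritium half-life about 12.3 years); the 200,000 pCi/L figure is the groundwater measurement from the article.

```python
# Back-of-the-envelope check of the tritium numbers above.
CI_TO_DPS = 3.7e10   # disintegrations per second in one curie
PCI_TO_CI = 1e-12    # curies in one picocurie

def decays_per_second(picocuries_per_liter):
    """Beta particles emitted each second in one liter of water."""
    return picocuries_per_liter * PCI_TO_CI * CI_TO_DPS

def fraction_remaining(years, half_life=12.3):
    """Fraction of the original tritium still present after `years`."""
    return 0.5 ** (years / half_life)

print(round(decays_per_second(200_000)))   # -> 7400, matching the text
print(fraction_remaining(12.3))            # -> 0.5 (one half-life)
print(fraction_remaining(100))             # roughly 0.004: mostly gone in a century
```

The same function applied to the 2-million-picocurie trench water gives about 74,000 decays per second per liter, ten times the groundwater figure.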

As a result of the current hypothesis that exposure to any amount of ionizing radiation carries with it some risk of cancer, the U.S. EPA’s Maximum Contaminant Level Goal for all radionuclides in drinking water, a goal which aims for “zero-risk” to public health, is zero picocuries per liter. Unfortunately, achieving “zero risk” is not only wishful thinking but currently unenforceable and, because there is some naturally occurring tritium, impracticable. Instead, EPA has developed Maximum Contaminant Levels (MCL) for drinking water. While the MCLs are enforceable, they are calculated considering best available technology and economic feasibility. For tritium, the derived** MCL is 20,000 picocuries per liter, while the derived MCL for strontium-90, a more powerful beta emitter associated with bone cancer and leukemia, is 8 picocuries.

Here’s the thing. Right now we’re talking two wells and a trench (where, incidentally, a small amount of radioactive cobalt has turned up as well.) While current concentrations in the ground water (the trench is another story) may not present an immediate health risk, who knows what a more comprehensive analysis - currently underway - might reveal?

*As of Feb 10, 2010 over 2 million pCi was measured in test wells around the plant.

For more see: http://www.rutlandherald.com/article/20100205/NEWS04/2050349/1003/NEWS02

**The MCL for beta emitters is based on a dose of 4 mrem/year to the total body and assumes ingestion of 2 L a day – the picocurie concentrations are derived for each specific beta-emitting isotope depending on their strength. Over the years, there has been discussion of using different calculations for tritium that would dramatically reduce the MCL.

Thursday, January 07, 2010

Evolution of the Toxic Response: In the beginning there were chemicals....

The following is what I intend to be the first in a series of essays on the Evolution of the Toxic Response – a topic which piqued my interest after what could either be called a disastrous flirtation with the publishing world, or an invaluable lesson in pursuing your passion. The disaster was allowing myself to be duped into thinking the content and style of this blog would actually make an engaging book (wrong); the passion was in realizing that writing primarily about toxicants of interest to the consumer (and in the style that would be most appealing to mass market publishers) had caused me to lose my way as a toxicologist and a scientist.

There is no doubt that some toxicants are, well, toxic. But there is always the question of exposure, dose, and potency – topics often lost in breezy articles meant to engage a reader rather than inform about the complexities not only of toxicology but of science in general. Unfortunately the publishing world seems to have no confidence in its mass readership. Readers are attracted by alarmism, so hype it up. They’ll doze if there is too much science, so keep it simple. They just want to be told what’s best for them, so just tell them. But after whipping off one light and fluffy page after another about dangerous toxicants hidden away in our homes and gardens (along with a few good toxins in our ‘fridges), all in preparation for my failed Book Proposal, a request by the local newspaper to write about bisphenol A, or BPA, resulted in a nearly visceral reaction at the thought of writing yet one more article for consumer consumption about chemicals consumed by consumers.

But after the storm, and the lull where I could barely bring myself to write another word about chemicals, came the passion. I was attracted to toxicology because I was fascinated by chemicals that screwed up the normal processes of life. But that was back in a time long long ago when toxicology meant PCBs, lead, mercury, dioxin, and assorted pesticides. These were obvious chemicals in concentrations that couldn’t hide within the peaks and valleys of the chemists’ printout. But science has come a long way since then. Now, we know far more about the minute amounts of a myriad of chemicals contaminating our water, air and food than we do about the way they might interact with our lung cells, or livers, or brains. We know that our bodies sequester the smallest amounts of these chemicals in our bones, brains, and fat cells.

Many of these chemicals will stick around on earth at least for our lifetimes, and those of our children. What will be the consequences of these chemical exposures – if any? What do we really mean when we say that these chemicals are toxic? At what point does a contaminant become a toxicant? Given all the synthetic and naturally occurring chemicals entering and exiting our bodies with virtually every breath – some of which by now are unavoidable, others we might choose to inhale and ingest, and still others that have been with us for eons – how can I, as a toxicologist, better understand the collective impact?

This was when I remembered I’ve inherited more than my big ears, hazel eyes and dry skin from my ancestors. I’ve inherited a whole system of toxic defense mechanisms, because really, well before the first animal ventured onto land, well before the first single-celled organism respired oxygen, life on earth relied upon chemical defense mechanisms of one sort or another.

And to some extent, we owe our lives -- as do all life forms, from bacteria to plants to animals -- to these detoxification processes.

Yet are they enough to protect life from the steady rain of natural and synthetic chemicals experienced by life on earth today?

That is the question I intend to explore in this upcoming series of essays, so stay tuned if you dare.

Also, if you are a toxicologist, chemist, geologist, etc., and would like to discuss the topic further, please don't hesitate to contact me at emonosson@verizon.net. I'd love to begin a virtual journal group on this topic.

Wednesday, November 25, 2009

Is there bias in bi(a)sphenol A?

Over the past two years the debate about bisphenol A (BPA) has become a quagmire where highly regarded scientists who once worked side by side, now sit across the fence virtually flinging insults at one another. You wouldn’t know this reading the Sunday paper or countless mainstream press articles, blogs and even academic journals which have successfully vilified this ubiquitous chemical. Like many Americans, you’re probably tossing away your polycarbonate bottles and looking askance at the stash of cans in your pantry.

Yet two summary panel reports on BPA prepared by the National Toxicology Program (NTP) and by the Food and Drug Administration (FDA)* downplayed the risks of BPA, while at the same time NTP highlighted the need for more research – and as of January 2010 the FDA indicated it too had concluded there is some cause for concern, particularly for infants and children. As a writer I find this disconnect fascinating. As a mother who replaced the polycarbonate bottles shortly after the first round of BPA press, I wonder if the chemical is deserving of its reputation as the evil twin of estrogen. As a toxicologist, I am dismayed by the apparent bias found on both sides of the fence.

Years ago, while interviewing for a job, I was asked if science was objective. I quickly answered in the affirmative. My future employer’s brow wrinkled, but she remained silent – giving me time to think. While science is objective, it is carried out by mere humans. And we all have our biases. I wouldn’t have been interviewing with a group whose mission was to support communities affected by industrial contaminants and who could only offer a pittance in salary if I didn’t lean towards the affected. Yet, I pondered, when reviewing the literature in support of their mission would I be biased? Here’s the truth – when reading studies funded by either the military or industry my sci-dar is on full alert. Likewise, I’m just as wary when reading studies conducted by environmental activist organizations, yet I am more trusting of studies produced by academics, particularly those funded by sources that tend not to have a stake in the outcome. Really, my sci-dar should be on full alert at all times, and in the end, I am careful not to cherry-pick studies from any one source, just to support a position.

Are there concerns about bias in the bisphenol A analysis? As a recent memoirist who shall not be named likes to say, “You Betcha.” Just Google “BPA bias” and you’ll find over one million pages.

One need only read Environmental Health Perspectives, published by the National Institute of Environmental Health Sciences (NIEHS), where the most recent BPA battle is playing out. But the stakes are higher than simply resolving BPA’s toxicity. Bisphenol A has brought to the fore the very nature of toxicity testing and regulation, questioning the role of (or lack of a role for) basic research in chemical testing and regulation.

That toxicity testing, particularly of endocrine disrupting chemicals like BPA, is in dire need of an overhaul is not in question. Says Dr. L. Earl Gray**, Research Biologist and Team Leader of the Reproductive Toxicology Division at the US EPA, about updating routine chemical testing:

“There is a lot more awareness of the issues with endocrine disrupting chemicals and thoughts about screening….they are also trying to shorten the multigenerational protocol [one of the standard toxicity tests required of industry]…hopefully and likely the new assays will be able to replace the old ones fairly quickly.”

Problem is, it took a decade to develop and validate those new assays. A snail’s pace, and a significant chunk of time for those at greatest risk, the very young. Even so, when it comes to BPA, there are those who suggest reviewers and regulators stick with studies based on regulatory testing protocols, because those methods have been rigorously validated, even if they don’t incorporate the latest science.

Of the hundreds of scientific articles on BPA many could be classified as basic science, while only a fraction use regulatory testing protocols. Studies in rodents report that BPA causes diabetes, weight gain, mammary gland cancer, early onset puberty, infertility and behavioral changes. Some of these findings cannot be repeated (reproducibility is a central tenet of science). Meanwhile studies in human populations report associations (which are not cause and effect linkages) between BPA and heart disease, diabetes, infertility in industry workers, and behavioral changes in toddlers born to mothers whose urine concentrations during pregnancy mirror those in the general population.

Despite the uncertainties, aren’t all of these studies enough to require that industry remove the chemical from our food and drink? While I am skeptical of studies produced by industries whose bottom line depends upon a particular chemical, and of sticking with decades-old testing procedures, I also know that a chemical posing an imminent danger is good for academic business, generating more grant money, more publications, and more consulting. It’s not an ideal system, but given time the scientific method prevails – and in the interim we have guidance from the expert panels. In the case of BPA, both panels had the freedom to consider any and all relevant and valid studies.

While the NTP panel concluded there was cause for “some concern,” noting the need for more research, the FDA concluded that current exposures to BPA do not present a health risk. So began the fireworks. Critics charged that the panels were biased, omitting too many basic research studies from their final analysis. Gray, who was a member of the NTP panel, disagrees. Testifying before Congress about BPA, he noted that “the criteria [for inclusion in the review] provided minimum standards for experimental design and statistical analysis. Many studies failed to meet these minimal criteria – these studies came from industry, government and academic laboratories.”

“The controversy,” says Gray, “resides over the fact that standard and enhanced multigenerational studies are negative for low dose effects and many academic studies were positive…. several of the multigenerational studies have added low dose groups, estrogen sensitive endpoints and tried to replicate the low dose effects to no avail... These differences are due in part to differences in how a chemical is administered in a study.” These differences also include the use of live animals versus test tube studies (which preclude metabolism and excretion of a chemical), the timing of exposure and the range of doses tested.

Yet based on accounts by the popular press, enviro-blogs and magazines – if you still drink from a polycarbonate bottle or serve your kids canned foods, then you must be an irresponsible parent. When I replaced our reusable bottles with BPA-free ones – but didn’t toss the canned goods – my inner toxicologist reasoned that we are exposed to a myriad of natural estrogenic chemicals in foods like soy, other plants, and milk – was BPA any worse? Meanwhile the environmentalist in my brain reminded me that we don’t choose to consume industrial chemicals like BPA. Shouldn’t we have that choice? But since BPA does not accumulate in humans (a quality that might otherwise trigger a chemical ban), “that choice” depends primarily upon the amount and frequency of BPA exposure, how it’s metabolized and its potency.
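That dependence on amount and frequency is exactly what toxicologists capture with the standard estimated-daily-intake arithmetic: total chemical ingested per day divided by body weight. Here is a minimal sketch of that calculation; every concentration and intake rate below is a hypothetical placeholder for illustration, not a measured BPA value.

```python
# Back-of-the-envelope estimated daily intake (EDI) of a chemical from food.
# All numbers below are hypothetical placeholders, not measured BPA data.

def estimated_daily_intake(sources, body_weight_kg):
    """EDI in micrograms per kg body weight per day.

    sources: list of (concentration_ug_per_kg_food, food_intake_kg_per_day)
    """
    total_ug = sum(conc * intake for conc, intake in sources)
    return total_ug / body_weight_kg

# Hypothetical example: canned soup plus a beverage from a reusable bottle
sources = [
    (10.0, 0.25),  # 10 ug/kg in canned food, 250 g eaten per day
    (1.0, 1.0),    # 1 ug/kg (~ug/L) in a beverage, 1 L drunk per day
]
edi = estimated_daily_intake(sources, body_weight_kg=70)
print(f"{edi:.3f} ug/kg bw/day")  # → 0.050 ug/kg bw/day
```

The point of the sketch is only that exposure, not mere presence, drives the risk question: doubling either the concentration or the serving size doubles the estimated dose.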

So is it or isn’t it? Maybe the nearly 30 million dollars recently committed by NIEHS for BPA research will settle the question, and maybe BPA will be one more example for the scientific flip-flopper pile, along with fiber, mammograms, and therapeutic estrogens. For now, there’s FDA’s final report due at the end of the month, and Consumer Reports’ recent investigation of BPA in canned goods – both of which will surely add a few feet to the fence separating some very good scientists.

*This report is the 2008 draft; a final report has since been published. You can find FDA's current position on BPA here.

Here is a recent article on the relationship between BPA in urine and heart disease

CHECK OUT the US DEPT Health and Human Services site for the latest on BPA (added Jan 16 2010)

AND a Jan 28 2010 interview with Dr. Linda Birnbaum of NIEHS

**In the spirit of disclosure: I worked for Earl back in the early nineties. He was not only a great guy to work for, but someone whose science and opinions I continue to respect.

Wednesday, October 07, 2009

Recombinant DNA, Synthetic Biology, and Nanotechnology, oh my!

There is an interesting article on Synthetic Biology in last week's New Yorker. Though I only gave it a skim and didn't read the ending, the topic is intriguing: the article describes a field of science devoted to developing the capacity to build and manipulate biological systems as if they were Legos. According to SyntheticBiology.Org, their goals begin with identification of the parts that “have well-defined performance characteristics and can be used (and re-used) to build biological systems” and end with the ability to “reverse engineer and re-design a ‘simple’ natural bacterium.”

Wow. Should they succeed, they’d bear a hefty biological, ethical, environmental responsibility. Were these people nut jobs? Nascent Frankensteins? Or were they just being realistic about the future of their science? As I thought about what this all meant it dawned on me that Synthetic Biology, being an extension of Genetic Engineering, in some ways wasn’t so different or separate from nanotechnology.

I don’t mean that they’re similar in how the products of these technologies interact with living systems, all threats of “grey goo” (a worst-case scenario hypothesized by Eric Drexler, popularizer of nanotechnology, whereby nanobots run amok, literally mucking up the world) aside - one science proposes to build biological systems while the other builds chemicals. Although, I suspect, as time goes on these two technologies will mingle if not marry (if they haven't run off to Las Vegas and done so already). Biological systems, after all, are nothing more than chemical building blocks - so once those building blocks are better understood, and once we have the capability not only to engineer one cell at a time but also to build chemicals one atom at a time, why not?

As a toxicologist observing the emergence of nanotechnology, it has been easy to ask what nanotechnology can learn from past practices of chemical production, regulation, use and disposal. But beyond toxicology, biotechnology has also laid some groundwork for how to proceed - or not - with the development of a new technology that will impact all of our lives, for better or worse, in ways we cannot fully understand.

Genetic engineering, the cornerstone of biotechnology, has been around since 1972, when scientists including Paul Berg of Stanford University first recombined pieces of DNA – the molecule which holds the secrets of all life on earth. Two years later, Berg and others raised serious concerns about unfettered recombinant DNA research, eventually calling for a temporary moratorium on certain types of research. Berg’s committee proposed that “…until the potential hazards of such recombinant DNA molecules have been better evaluated or until adequate methods are developed for preventing their spread, scientists throughout the world join with the members of this committee in voluntarily deferring the following types of experiments....” The authors then listed specific research that they considered most risky, acknowledging that “our concern is based on judgments of potential rather than demonstrated risk since there are few available experimental data…and that adherence to our major recommendations will entail postponement or possibly abandonment of certain types of scientifically worthwhile experiments.” A year later, the first conference on “Recombinant DNA molecules” – widely referred to as Asilomar, for the idyllic conference center by the sea – took place, and it is still reflected upon as a model of “self-regulation” by the scientific community (the meeting included scientists from around the world, as well as lawyers, government officials and journalists).

Of course the concept of self-regulation may be an oversimplification, since the conference purposefully focused on health and environmental safety only. The ethics and legalities of recombinant DNA were not on the agenda. “This choice of agenda,” wrote Berg years later, “was deliberate, partly because of lack of time at Asilomar and partly because it was premature to consider applications that were so speculative and certainly not imminent.” Perhaps. I imagine, like my district’s school committee meetings, which I’ve sometimes referred to as “adults behaving badly” – if we stuck with the nuts and bolts rather than the deeper questions, we too might be more successful.

Berg revealed one other key to success at a symposium celebrating the 25th anniversary of Asilomar: molecular biologists weren’t yet heavily invested in the science and the public knew very little, so there was still room for fluidity in the conversation. Positions on recombinant DNA were not yet “hardened,” and scientists were primarily academic. This was a time when government funding was flush, when there was separation of academia and industry, and when the biotechnology industry, with all its promises of the next million-dollar drug, was more “Jetsons” than reality.

Which brings me back to nanotechnology - a field developing under incredible public, government, and scientific scrutiny. Even industry, as I’ve read and heard, wishing to avoid the genetically modified foods fiasco (which is either ironic or inevitable considering Asilomar), seems willing to tread carefully when it comes to the development of nanomaterials. A recent report by the DEEPEN (Deepening Ethical Engagement and Participation in Emerging Nanotechnologies) project emphasizes a role for increased public participation in governance decisions related to nanotechnology development, in part because nanotechnology is poised to affect everyday life - so why not include all participants, those who deliberately participate and those who are incidental nano-tourists, in the conversation?

There's one caveat to the suggestions by DEEPEN and others. There have been so many meetings and project reports on how best to move forward conscientiously with nanotechnology that there is some concern there’s too much talk and too little action. Meanwhile, nanomaterials find their way into more and more consumer products (1,000 and counting), and the body of research papers continues to grow like a bacterial culture in log growth phase. But that's no reason not to broaden the conversation.

Perhaps comparisons between nanotechnology and Asilomar are unfair for nanotech.

As Berg noted, in 1975 neither Joe Public nor Joe the Plumber was an invited member of the 'Recombinant DNA steering committee,' the meeting's focus was narrow, and recombinant DNA was, and still is, fairly easily defined. Isolating and rejoining segments of DNA – that was recombinant DNA. Today we have the world wide web of information, where the public, if they wish, can be informed; NGOs following and reporting on nanotechnology; a technology that is already in use; and scientists who can’t even agree on what constitutes a nanoparticle. Are they particles with one dimension measuring 100 nm or less? Or should the definition be restricted to much smaller particles, those in the 30 nm or smaller range most likely to exhibit new and different physical chemistry?

Then there are nanodots, nano-metal oxides, nanotubes and other nanos – all very different chemically, though they may share some basic properties in terms of size, or increased reactivity as a result of decreased size. But how much do we know of their differences in terms of how nanoparticles will move and react inside a living being, or outside in the big wide world?
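That link between shrinking size and rising reactivity is easy to quantify: for a spherical particle, the surface-area-to-volume ratio works out to 3/r, so halving the diameter doubles the proportion of the particle sitting at its reactive surface. A minimal sketch (illustrative geometry only, not a model of any particular nanomaterial):

```python
import math

def surface_to_volume_ratio(diameter_nm):
    """Surface-area-to-volume ratio (per nm) of a spherical particle."""
    r = diameter_nm / 2
    # (4*pi*r^2) / ((4/3)*pi*r^3) simplifies to 3/r
    return (4 * math.pi * r**2) / ((4 / 3) * math.pi * r**3)

# Compare the two proposed nanoparticle size cutoffs, plus a smaller particle
for d in (100, 30, 10):
    print(f"{d:>3} nm particle: SA/V = {surface_to_volume_ratio(d):.2f} per nm")
```

By this measure a 10 nm particle has ten times the relative surface area of a 100 nm particle, one reason the sub-30 nm range is singled out as the zone of genuinely new behavior.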

Our best hope right now is that nanotechnology as a field is still young and flexible. Hopefully the talk will turn to action before nanotech’s arteries begin to harden and, as Berg observed twenty-five years after Asilomar, the issues become “chronic.”

(For the results of a recent poll on public understanding of nanotechnology and synthetic biology click here. )