1,4-dioxane: The case of the disappearing tumors

Right now, EPA is in the process of conducting “risk evaluations” for existing chemicals in commerce, as mandated by the recently passed Lautenberg Chemical Safety for the 21st Century Act (which amended the original, ineffective Toxic Substances Control Act).

[For a refresher on how this effort fits into the bigger picture of chemical assessments, you can review my infographic.]

So far, the agency has released draft risk evaluations for four chemicals: PV-29, HBCD, 1,4-dioxane, and 1-bromopropane. I’ve been working with my former colleagues at EDF Health to carefully review the drafts for the latter two chemicals.  Unfortunately, as expected, these drafts put out by the Trump EPA have a number of problems, which we’ve detailed in public comments.

For a window into one particularly concerning issue, you can check out a post that I wrote with Dr. Richard Denison on the EDF Health blog, 1,4-dioxane: The case of the disappearing tumors.

The Big Chart of Federal Chemical Evaluations & Assessments

Today, I’m excited to share an infographic that I made, depicting all of the different chemical evaluations and assessments that various federal agencies (in the U.S.) conduct.

If you want to hear about the backstory & process for creating this, read on below.

Otherwise, here’s a link to a PDF version of the graphic. There are hyperlinks throughout, if you want to explore any of the information further. Yes, I know this is very detailed; it is meant to be digested by zooming around to different sections of the graphic.

I’ve tried to be as accurate as possible. But if you catch something that doesn’t look right, please let me know.

I hope this helps the environmental health community (and others who might be interested) better understand the review processes that are intended to keep us safe (unless/until politics get in the way…).

The backstory

In April, the Agency for Toxic Substances and Disease Registry (ATSDR) released a draft ToxProfile for glyphosate. If you’ve been following this blog, you know that I’ve been paying a lot of attention to glyphosate lately (see some of my recent posts here and here). Given my interest in this topic, I decided to review the document and take the opportunity to prepare public comments.

[What are public comments, you might ask? Public comments are a way for the public to provide feedback during the federal rulemaking process. Under the Administrative Procedure Act (1946), whenever a federal agency develops a new regulation, they are required to solicit input from the public. (For more on the public comment process and how you can get involved, check out the Public Comment Project!)]

As I was reviewing ATSDR’s ToxProfile, I realized that I did not fully understand how this effort was distinct from EPA’s assessment of glyphosate. ATSDR and EPA are two separate federal agencies with different missions, so clearly these assessments served different purposes.

I soon realized that elucidating this distinction was just one part of a larger story. So, I decided to create a master chart to better understand all of the different types of reviews, evaluations, and assessments that different federal agencies conduct, the main purposes of these evaluations, and what other processes or regulations they might relate to.

The process

Some of the agency assessments were quite familiar to me or fairly well-explained online; for example, those that EPA is supposed to conduct under the recently reformed Toxic Substances Control Act. It was surprisingly hard to get clear information on other assessments and related agency activities, however (even for me, someone who is relatively well-versed in this field). Specifically, I found the online information for the Occupational Safety and Health Administration (OSHA), the National Institute for Occupational Safety and Health (NIOSH), and the Consumer Product Safety Commission (CPSC) to be a bit confusing. I actually ended up calling several people at these agencies (using phone numbers listed online) to get clarifying information. (Thank you to those federal employees who picked up my cold calls and answered my questions!)

I started collecting this information in an Excel chart, but this format is not very conducive to easy online reading or sharing. So, I decided to challenge myself to make an infographic, which I had never done before. I experimented with various online tools before settling on draw.io, which I also used to make the timeline in the glyphosate meta-analysis. I’ll spare you the details, but let’s just say, this took me a LONG time (sorry, dissertation, I’ll get back to you soon).

I imagine that I’ll continue to refine this over the next few months/years. If you see anything that looks wrong or have suggestions for improvement, let me know.

Ignorance is not bliss for “inert” pesticide ingredients

One of the complicated parts of assessing the hazards and risks of glyphosate is that the product that everyone uses (for example, Round-Up) is not just glyphosate. The active ingredient is glyphosate, but the final formulation sold in stores is a combination of glyphosate and other “inert” ingredients.

[Note: I’m going to stubbornly use quotation marks around the words “inert” throughout this article, to emphasize my point that this is not an accurate characterization. “Inert” implies inactive, which is not true. Read on for more.]

These “inert” ingredients are subject to essentially no testing, disclosure, or regulatory requirements, even though they almost always make up a larger percentage of the final product than active ingredients. And, evidence indicates that combinations of the “inert” and active ingredients can actually be more toxic than the pure active compound (for example, see here, here, and here).

A new publication by Mesnage et al. in Food and Chemical Toxicology reviews the problems with the status quo and the implications for health effects research. Given the relevance of this topic to my previous blog posts on glyphosate (see here and here) and pesticides in general, I’ll summarize some of the authors’ key points below.

But first, some terminology: what is the difference between active and “inert” pesticide ingredients?

Under the U.S. Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA), an active ingredient is one that is intended to be toxic to the target species. For example, glyphosate, the active ingredient used in glyphosate-based herbicides (GBHs), blocks an essential enzyme pathway in plants. All other ingredients in a pesticide product, often added to improve effectiveness, are classified as “inert.”

Abounding uncertainty about “inerts”

Contrary to what their name suggests, however, “inert” ingredients may not actually be inert. In fact, the U.S. Environmental Protection Agency (EPA) explicitly states that this designation “does not mean non-toxic.”

But, it’s challenging to get extensive and accurate information about these chemicals because:

  • Neither the “inert” ingredients nor the final formulations (the combination of active + “inert” ingredients) are subject to most standard regulatory toxicity tests, such as evaluation of cancer and reproductive effects. As a result, pesticide approvals are based on the unrealistic scenario of exposure to the active ingredient alone.
  • Companies can routinely claim the identity and concentration of these “inert” ingredients as confidential business information (CBI). That is why you’ll often see labels like the extremely vague one below. As a result, it’s difficult – actually, essentially impossible – for scientists to independently evaluate possible toxicity. We are kept blind to these final products.

[Image: a pesticide product label]
Image: Beyond Pesticides

  • Because we don’t know the identity of these “inert” ingredients, there are essentially no monitoring data on environmental or human exposure.

So, in summary, we don’t know how toxic the “inert” ingredients or final formulations are; the identity of these “inert” ingredients is kept secret from the public; and we aren’t monitoring any of these chemicals for their presence in our bodies or the environment.

All of this makes it challenging for the EPA to conduct accurate pesticide risk assessments, which require information on both hazard (i.e.: toxicity) and exposure.

Constant change 

An added barrier is that companies often change their formulations over time and across regions. Apparently, there are usually between 100 and 150 different formulations of GBHs on the market at any given time, and over 2,000 different varieties have been registered in Europe since 2012.

How are we supposed to evaluate the health effects of such a moving target? Robust epidemiological studies require precise definitions of exposure (referred to as the “consistency” principle) to support causal inference. In essence, the exposure under investigation should be defined specifically enough that different versions of the exposure could not have different effects, which would muddy the overall conclusion of the study.

(As a concrete example, think about investigating the impact of “exercise” on health. Exercise is very broad, so it wouldn’t be helpful or informative to evaluate the effect of general “exercise,” which could span everything from a 30-minute walk once per month to a 2-hour run each day. The effects of these different types of exercise could have very different impacts on health. So, a better study question would be focused on a more specific type of exercise.)

For pesticide epidemiology, all of these changing formulations make it very challenging to draw conclusions on health effects across time and space. It’s quite likely that one study based in multiple locations could be evaluating the effects of different products at the same time. A study looking at one region over a period of several years also faces the same problem. As the authors of the recent publication stated, “formulated GBHs with the same product name from different countries may not be the same mixture of chemicals, nor the same as the brand-name product bought previously, or in the future.”

This is one possible reason for differing conclusions about hazard, and it makes reproducibility nearly impossible.

Overcoming INERTia 

The authors put forth a few suggestions to improve this murky situation. Some can be acted on by researchers now, such as including detailed product information (ex: trade name, dates of manufacture, product ID number) in the methods sections of their papers, to facilitate reproducibility and comparison across studies.

Other proposals will need to wait until there is political will for policy change. Most important is the need for full public disclosure of pesticide product composition. (By the way, back in 1997, the American Medical Association urged Congress to “support all efforts to list both active and inert ingredients on pesticide container labels.”) The authors also suggest monitoring of food and feed for concentrations of the “inert” ingredients (that is, if we can get access to information about their identities!), so we can understand patterns of exposure.

Additionally, it is essential to revise the pesticide approval processes to include full testing of “inert” ingredients as well as the final formulated products. We urgently need a regulatory system that accounts for these real-world exposures.

It’s high time for transparency on these formulations and their effects on human health and the environment.

What’s the Risk?

Last week, we published a meta-analysis that found that high exposure to glyphosate-based herbicides was associated with an increased risk of non-Hodgkin Lymphoma (NHL). There was a lot of discussion about this paper in the news, on social media, and across internet forums (as expected, given the ongoing controversy and high stakes of this conclusion). Most articles focused on the specific risk estimate that we reported, with headlines such as:

“Glyphosate exposure increases cancer risk up to 41%, study finds”

“Weedkiller raises risk of non-Hodgkin lymphoma by 41%”

“Common weed killer increases cancer risk by almost half”

A common critique of these headlines (and our article) was that they (and we) were being misleading, because we reported the 41% increased relative risk of NHL – which sounds very scary! – rather than the 0.8% increased absolute risk of NHL – which sounds less scary.

At the risk of more undue attention on the 41% number (which as I said in my previous post, is less important than the finding of a significant association itself), let me explain a few things about (1) how we report results in epidemiological research, (2) why small increases in risk matter, and (3) how agencies like the Environmental Protection Agency (EPA) regulate on risk.

Relative risks vs. absolute risks

In epidemiology, we are trying to understand whether an exposure is associated with a disease. To do this, we compare the disease rate in the exposed group with the disease rate in the unexposed group.  This ratio gives us the relative risk of disease between the two groups.

[Side note: this is why it is crucial for researchers to select an appropriate comparison group! The relative risk depends entirely on this decision! If your comparison group has an unusually high rate of cancer, you will get a very skewed (and wrong) answer about the effects of the exposure.]

This relative risk, however, does not give us any information on the absolute risk of the disease at the individual level. It only tells us whether the exposed group has a higher or lower chance of developing the disease than the comparison group. In our paper, we report that individuals with high exposure to glyphosate-based herbicides (for example, people who spray it daily for many years) have a 41% increased risk of developing NHL over their lifetimes, compared to those who were not highly exposed (infrequent or no history of use).

The absolute risk, by contrast, tells us the actual risk of the disease for a given level of exposure. This is much more intuitive. For example, on average in the US, approximately 2 out of every 100 people develop NHL during their lifetime. So, the absolute risk of NHL over a lifetime is 2%. Therefore, when our study reports a 41% increased risk for those who are highly exposed, that is equivalent to saying that these individuals now have an absolute lifetime risk of NHL of about 2.8%.
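The conversion between the two metrics is simple arithmetic. Here is a minimal sketch in Python, using the numbers from the example above:

```python
# Convert a relative risk into an absolute risk:
# absolute risk (exposed) = baseline absolute risk x relative risk.
def absolute_risk(baseline: float, relative_risk: float) -> float:
    return baseline * relative_risk

baseline = 0.02   # ~2 in 100 people develop NHL over a lifetime
rr = 1.41         # 41% increased relative risk in the highly exposed group

exposed = absolute_risk(baseline, rr)
print(f"Absolute risk with high exposure: {exposed:.1%}")  # -> 2.8%
print(f"Excess absolute risk: {exposed - baseline:.1%}")   # -> 0.8%
```

Same underlying information, two different framings: “41% higher than baseline” versus “2.8% instead of 2%.”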

These statistics are communicating the same basic information, but they sound very different. In our epidemiology courses, we learn that absolute risk is better for communicating to the public because it is easier to understand. But, because of the way that epidemiological studies are designed (comparing disease rates in one group vs. the other), our default is to report relative risks. And because we are used to thinking about these ratios, we don’t always realize that this information can be misinterpreted, misunderstood, and confusing. Maybe we should report both metrics in our abstracts.

Nevertheless, both ways of talking about risk give us the same answer to the central question of carcinogenicity: evidence suggests that glyphosate exposure is associated with an increased risk of cancer.

Why seemingly low risks are still important

Some environmental exposures have very high relative risks. Individuals exposed to high levels of asbestos in their homes, for example, have an 800% increased risk of developing mesothelioma, a very rare cancer of the tissue that lines the lungs and other organs.

Most common environmental exposures, however, are associated with relatively small increased relative risks. Let’s take a look at air pollution, a very common exposure. And more specifically, fine particulate matter (PM2.5), very tiny particles emitted from vehicles, industrial facilities, and fires. While exact estimates vary based on the population studied, an increased concentration (of 10 ug/m3, to be exact) in 24-hour average PM2.5 has been associated with a 0.4%-1.0% increased risk of death (mostly from cardiovascular disease). An increase (again, of 10 ug/m3) in long term average PM2.5 has been associated with an overall 10% increased risk of death.

Those seem like small changes in risk. So, can we stop worrying about air pollution?

No, definitely not.

Low relative risks applied to large populations can be extremely consequential. We are all exposed to air pollution. Every day. And all of those exposures add up. In fact, PM2.5 was ranked as the 5th most important cause of death around the world in 2015, accounting for approximately 4.2 million deaths.
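To see why a small relative risk matters at scale, here is a back-of-the-envelope sketch. The population size, baseline risk, and relative risk below are illustrative round numbers I chose for the example, not figures from the air pollution studies:

```python
# Excess cases attributable to an exposure, roughly:
# population x baseline risk x (relative risk - 1).
# All inputs here are hypothetical, illustrative numbers.
def excess_cases(population: int, baseline_risk: float, rr: float) -> float:
    return population * baseline_risk * (rr - 1)

# Hypothetical: 100 million exposed people, 1% baseline risk of the
# outcome, and a seemingly modest 10% relative increase (rr = 1.10).
print(round(excess_cases(100_000_000, 0.01, 1.10)))  # -> 100000
```

A “mere” 10% bump in relative risk, spread across a large enough population, translates into six-figure counts of additional cases.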

Glyphosate-based herbicides are the most heavily used herbicides in the world, with an estimated 1.8 billion pounds applied in 2014. Most of this usage is on commercial agricultural farms by workers with potentially high cumulative exposures over their lifetimes. Given the large number of people possibly exposed, any significant increase in risk – especially the 41% estimate that we report – is meaningful to consider at the population level.

Regulating risk

Finally, I want to bring up a point about cancer risk in relation to regulations. The US EPA and Food and Drug Administration (FDA), among other agencies, have to manage and regulate risks for the population. For most scenarios, they have decided that an “acceptable risk” for a carcinogen in the general population is between 1 in a million and 1 in 10,000 (over a lifetime).  In other words, EPA and FDA are supposed to take action to prevent exposure to carcinogens that would result in risks higher than those rates (the specific threshold depends on the scenario and, sometimes, technologic feasibility).

Our findings suggest that the absolute risk of NHL over a lifetime might shift from approximately 2% to 2.8% with high exposure to glyphosate-based herbicides. This difference represents an increase of 8/1000 – certainly above EPA’s threshold of concern for the general population.

Note, however, that some of the studies in our meta-analysis were focused on people using glyphosate in commercial agricultural settings. EPA usually allows a higher risk of cancer in occupational scenarios, approximately 1 in 1000. Even with that standard, however, our results would suggest a need for action.
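To make the comparison above concrete, one can line the excess absolute risk up against those benchmark levels. This is just a sanity-check sketch using the numbers quoted in this post:

```python
# Compare an excess lifetime cancer risk against the regulatory
# benchmarks discussed above: general population "acceptable risk"
# of 1-in-a-million to 1-in-10,000, and ~1-in-1,000 occupationally.
excess_risk = 0.028 - 0.020      # 0.8%, i.e. 8 per 1,000

general_pop_upper = 1 / 10_000   # upper (most permissive) end of the range
occupational = 1 / 1_000

print(excess_risk > general_pop_upper)  # -> True (~80x the 1-in-10,000 level)
print(excess_risk > occupational)       # -> True (~8x the 1-in-1,000 level)
```

Of course, a real regulatory determination would rest on a full risk assessment, not this arithmetic; the point is only that 8-in-1,000 is not small relative to the benchmarks agencies typically use.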

I’m just using these comparisons to put our results in context, because many people seemed to discount this work because of the small absolute risk estimates. Before any actual regulatory action, EPA would need to consider extensive evidence on hazard and exposure in a formal risk assessment.

Summary

In closing, I hope that I’ve clarified a few points about risk that were raised in the aftermath of the glyphosate publication. But once again, let me emphasize that you should not focus too much on the specific numerical estimates above but rather use them to better understand that:

  • Relative risks are different than absolute risks. Epidemiologists usually use relative risks, so that is what you will see in published papers (and, likely, the headlines as well).
  • Exposures with low relative risks can still have huge impacts at the population level.
  • Regulatory agencies set certain benchmarks for acceptable lifetime cancer risk in the population. You might not agree with the thresholds, but those are the standards. Keep that in mind when you are reading about risks from environmental exposures.

Concerning Glyphosate

Apologies for the long blog absence. I’ve been busy PhD-ing (including preparing for and passing my oral general exam!) and working on various side projects.

One of those side projects has been focused on glyphosate. Glyphosate, the active ingredient in Monsanto’s (now owned by Bayer) Roundup, is the most widely used herbicide in the world. First marketed in 1974, its usage skyrocketed after the introduction of “Roundup-ready” (i.e.: Roundup resistant) crops in 1996 and the practice of “green-burndown” (i.e.: using the chemical as a desiccant shortly before harvest) in the mid-2000s. In 2014, global usage was estimated to be 1.8 billion pounds.  

But these staggering statistics are not the only claim to fame for glyphosate. It has also been the subject of intense international regulatory and scientific scrutiny in recent years, for its possible link to cancer. The stakes are high (billions of dollars for Monsanto, related to sales of both the herbicide itself and its line of herbicide-resistant crops), and the conclusions are controversial.

Carcinogenic or not, that is the question.

In 2015, the International Agency for Research on Cancer (IARC) declared that glyphosate was a “probable human carcinogen” (relevant links: explanation of IARC classifications; official summary for glyphosate; IARC webpage with follow-up links). However, that same year, the European Food Safety Authority (EFSA) concluded that “glyphosate is unlikely to pose a carcinogenic hazard to humans, and the evidence does not support classification with regard to its carcinogenic potential.” In 2016, the US Environmental Protection Agency (EPA) determined that glyphosate was “not likely to be carcinogenic to humans at doses relevant for human health risk assessment.”

Ok, so that’s confusing. How did these agencies, all of which are supposed to conduct unbiased reviews of all of the evidence, come to such different conclusions? There have been several recent publications that explain these inconsistencies (for example, see here and here). In essence, it boils down to: 1) differences in how the agencies weighed peer-reviewed, publicly available studies (most show adverse health effects) versus unpublished regulatory studies submitted by manufacturers (most do not show adverse health effects); 2) whether the agencies focused on studies of pure glyphosate or the final formulated glyphosate-based product that is used in agricultural applications (which is known to be more toxic); and 3) whether the agencies considered dietary exposures to the general population only or also took into account elevated exposures in occupational scenarios (i.e. individuals who apply glyphosate-based herbicides in agricultural settings).

Meanwhile, as the debate continues… 27 countries (as of November 2018) have decided to move forward with implementing their own bans or restrictions. And, Monsanto/Bayer faces more than 9,000 lawsuits in the US from individuals who link their cancer to the herbicide. (The court ruled the first case in favor of the plaintiff, though Monsanto is appealing the decision.)

My connection

This highly contentious area is outside the topic of my dissertation research, but I got involved because my advisor was a member of the EPA scientific advisory panel that reviewed the agency’s draft assessment of glyphosate in 2016. The panel’s final report raised a number of concerns with EPA’s process and conclusions, including that the agency did not follow its own cancer guidelines and made some inappropriate statistical decisions in the analysis.

Because of their dissatisfaction with EPA’s report, my advisor and two other panel members decided to pursue related research to dig further into the issues. I enthusiastically accepted the invitation to join.   

Our collaborative group recently published two review papers on glyphosate. I’ll provide brief highlights of both below.

Reviewing our reviews, part 1: exposure to glyphosate  

In January 2019, we published a review of the evidence of worldwide exposure to glyphosate. Even though glyphosate-based products are the most heavily used herbicides in the world, we were surprised (and dismayed) to find fewer than twenty published studies, documenting exposure in only 3,721 individuals.

So, our paper mostly serves to highlight the limitations of the existing data:

  • These studies sampled small numbers of individuals from certain geographic regions, mostly in the US and Europe, and therefore are not representative of the full scope of global exposures
  • Most studies relied on a single urine spot sample, which does not represent exposure over the long term and/or in different agricultural seasons
  • The occupational studies only covered 403 workers in total, a serious deficiency given its widespread agricultural use. Few assessed exposure before and after spraying; and no studies evaluated patterns related to seasonality, crop use, etc.
  • Only two small studies evaluated how population exposure has changed over time. So, we definitely don’t know enough about whether the dramatic increases in global usage have resulted in similarly dramatic increased concentrations in our bodies. (Presumably, yes).  

In addition to highlighting the need to address the points above, we specifically recommended  incorporating glyphosate into the National Health and Nutrition Examination Survey (NHANES), a national survey that monitors exposure to many chemicals – including other common pesticides. This is an obvious and fairly straightforward suggestion; in reality, it’s quite bizarre that it has not already been incorporated into NHANES. Testing for glyphosate would allow us to better understand exposure across the US – which is not reflective of global levels, of course, but an important start.

Reviewing our reviews, part 2: glyphosate & non-Hodgkin Lymphoma (NHL)  

Our second paper, published earlier this week, was a meta-analysis of the link between glyphosate exposure and non-Hodgkin Lymphoma (NHL). Yes, diving right in to the controversy.

There had already been several prior meta-analyses that showed an association between glyphosate and NHL, but ours incorporates new research and applies a method that would be more sensitive to detecting an association.

A meta-analysis combines results from separate studies to better understand the overall association. While they technically do not generate any “new” data, meta-analyses are essential in the field of public health. A single study may have certain weaknesses, focus only on selected populations, or reflect a chance finding. In drawing conclusions about hazards (especially in this scenario, affecting millions of people and billions of dollars), we want to look across the collection of data from many studies so we can be confident in our assessment.
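For intuition about how such a combination works, here is a minimal fixed-effect, inverse-variance pooling sketch, a standard way to combine relative risks across studies. This is illustrative only: the study numbers are made up, and it is not the specific analytical approach used in our paper.

```python
import math

# Fixed-effect inverse-variance pooling of relative risks.
# Each study contributes (log RR, standard error of log RR);
# more precise studies (smaller SE) get proportionally more weight.
# The three studies below are made-up illustrations, not real data.
studies = [
    (math.log(1.3), 0.20),
    (math.log(1.5), 0.30),
    (math.log(1.2), 0.25),
]

weights = [1 / se**2 for _, se in studies]
pooled_log_rr = sum(w * lrr for (lrr, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

rr = math.exp(pooled_log_rr)
ci_low = math.exp(pooled_log_rr - 1.96 * pooled_se)
ci_high = math.exp(pooled_log_rr + 1.96 * pooled_se)
print(f"Pooled RR = {rr:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```

Note how the pooled estimate lands between the individual study estimates, with a narrower confidence interval than any single study could provide; that precision gain is the core appeal of meta-analysis.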

We were able to include a newly published follow-up study of over 54,000 licensed pesticide applicators (part of the Agricultural Health Study (AHS)). Compared to an earlier paper of the same cohort, this updated AHS study reports on data for an additional 11-12 years. This extension is important to consider, given that cancer develops over a long period of time, and shorter studies may not have followed individuals long enough for the disease to arise.

We conducted this meta-analysis with a specific and somewhat unusual approach. We decided to focus on the highly exposed groups in order to most directly address the question of carcinogenicity. In other words, we would expect the dangers (or, proof of safety: is it safe enough to drink?) to be most obvious in those who are highly exposed. Combining people who have low exposure with those who have high exposure would dilute the association. IMPORTANT NOTE: this approach of picking out the high exposure groups is only appropriate because we are simply looking for the presence or absence of a link. If you were interested in the specific dose-response relationship (i.e.: how a certain level of exposure relates to a certain level of hazard), this would not be ok.

Our results indicate that individuals who are highly exposed to glyphosate have an increased risk of NHL, compared to the control/comparison groups. This finding itself is not entirely earth-shattering: the results from prior meta-analyses were similar. But, it adds more support to the carcinogenic classification.

More specifically, we report a 41% increased risk. For comparison, the average lifetime risk of NHL is about 2%. However, I want to emphasize that because our analytical method prioritized the high exposure groups, the precise numerical estimate is less important than the significant positive association. Basically, the purpose of this and other related assessments (like IARC’s) is to understand whether glyphosate is carcinogenic or not: this is a yes/no question. It is up to regulatory agencies to judge the scale of this effect and decide how to act on this information.

As with any scientific project, there are several limitations. In particular, we combined estimates from studies that differed in important ways, including their design (cohort vs. case-control), how they controlled for confounding by exposure to other pesticides, and which reference group they chose for the comparison (unexposed vs. lowest exposed). When studies are very different, we need to be cautious about combining them. This is another reason to focus more on the direction of the effect rather than the exact numerical estimate.  

Beyond the headlines

The news coverage of this work has focused on the overarching results (especially the 41% statistic), as expected. But I want to highlight a few other aspects that have been overlooked.

To better understand the timing of these studies in relation to glyphosate usage, we put together a timeline of market milestones and epidemiological study events.

 

[Timeline infographic: Glyphosate_V9]
This took me SO MANY HOURS.

Of note is that all of the studies conducted to date evaluated cancers that developed prior to 2012-2013, at the latest. Most were much earlier (80s, 90s, early 00s). As illustrated in the timeline, we’ve seen a huge increase in glyphosate usage since green burndown started in the mid-2000s. Yet none of these studies would have captured the effects of these exposures, which means the correlation should be easier to see in newer studies if/when they are conducted.

Also, as I mentioned above, we included the newly published AHS cohort study in our meta-analysis. One might expect the old and new AHS studies to be directly comparable, given that they were conducted by the same research group. However, our deep dive into both papers elucidated important differences; consequently, they are not directly comparable (see Table 8 of our paper). An in-depth discussion of these issues (and some of their potential implications) is a topic for a separate post, but there’s a clear lesson here about how important it is to carefully understand study design and exposure assessment methods when interpreting results.

Finally, two brief points on the animal toxicology studies, which we also reviewed in our paper because they provide complementary evidence for assessing hazard in humans. We discuss these data but did not conduct a formal pooled analysis (to combine results from separate but similarly designed animal studies), which would allow us to better understand overarching results from the animal studies. Anyone ready for a project?  

Additionally, in future animal toxicology studies, researchers should use the formulated glyphosate product that is actually used around the world rather than the pure glyphosate chemical that has been the focus of prior testing. There is growing evidence to suggest that the final formulated product is more toxic, perhaps due to the added adjuvants and surfactants. And this would allow for better comparisons to the human epidemiological studies, which assess effects of exposure to the formulated product.

Reflecting on the process

I had followed the evolving story on glyphosate with great interest for several years, so it was exciting to be part of these projects. Contributing to research with a real-world public health impact has always been a priority for me, and this high-profile research (affecting millions of people, billions of dollars) certainly fits the bill.

That being said, it was not an easy process. These two papers represent years of work by our group, which we did on top of our regular commitments. Collaborating with three researchers whom I had never met also proved challenging, since we did not have established rapport or an understanding of each other’s work and communication styles. So, in addition to gaining skills in conducting literature reviews and meta-analyses, I learned valuable lessons in group dynamics. 🙂

Given the high-stakes and high-profile nature of this work, we were extra meticulous about the details of this project. We knew that it would be scrutinized carefully, and any error could damage our credibility (especially worrisome for me, since I’m just establishing myself in my career). It took many, many rounds of review and editing to get everything right. A good lesson in patience.

Speaking of patience, I know that scientific research and related policy decisions take time. But I hope that these two projects can contribute to moving forward in a direction that protects public health.

 

High Impact Report on Low-Dose Toxicity

A condensed version of this post was originally published on The Conversation with the title “Can low doses of chemicals affect your health? A new report weighs the evidence.”

Toxicology’s founding father, Paracelsus (1493-1541), is famous for his paraphrased proclamation: “the dose makes the poison.” This phrase represents a pillar of traditional toxicology: Essentially, chemicals are harmful only at high enough doses.

But increasing evidence suggests that even low levels of endocrine-disrupting chemicals (EDCs) can interfere with hormonal signals in the body in potentially harmful ways.

Standard toxicity tests don’t always detect the effects that chemicals can have at lower levels. There are many reasons for this shortcoming, including a focus on high-dose animal testing and a failure to include endpoints relevant to low-dose disruption. And even when the data do suggest such effects, scientists and policymakers may not act on this information in a timely manner.

Recognizing these challenges, the U.S. Environmental Protection Agency (EPA) asked the National Academy of Sciences to convene a committee to study the issue in detail. How can we better identify whether chemicals have effects at low doses? And how can we act on this information to protect public health?

After several years of work, the committee’s report was released in July. This landmark report provides the EPA with a strategy to identify and systematically analyze data about low-dose health effects, as well as two case study examples. It is an evidence-based call to action, and scientists and policymakers should take notice.

Delving into definitions

Before discussing the report, let’s review some definitions…

We know that animal experiments usually use high doses, but in comparison, what is a “low dose”?

This issue was a matter of considerable debate, but ultimately, the committee settled on a fairly general definition of low dose: “external or internal exposure that falls within the range estimated to occur in humans.” Therefore, any dose that we would encounter in our daily lives could be included, as well as doses experienced in the workplace.

The committee also clarified the meaning of “adverse effects.” When a chemical produces a visible malformation, it is easy to conclude that it is adverse. But, when a chemical causes a small change in hormone levels, it is more difficult to conclusively state that the change is adverse. Are all hormone changes adverse? If not, what is the threshold of change that should be considered adverse?

In this context, an adverse effect was defined as “a biological change in an organism that results in an impairment of functional capacity, a decrease in the capacity to compensate for stress, or an increase in susceptibility to other influences.”

A strategy to identify low dose toxicity

With these semantics settled, the committee developed a three-part strategy to support timely identification, analysis, and action on low-dose toxicity information:

(1) Surveillance: Active monitoring of varied data sources and solicitation of stakeholder input can provide information on low dose effects of specific chemicals, especially since EPA’s standard regulatory testing framework may not always identify such effects. Human exposure and biomonitoring data should also be collected to help define relevant exposure levels of concern across the population.

(2) Investigation & Analysis: Systematic review and related evidence integration methods can be used to conduct targeted analysis of the human, animal, and in vitro studies identified in the surveillance step. Each of these approaches has different strengths and weaknesses, so examining the evidence together offers insight that a single approach could not provide.

(3) Actions: New evidence can be incorporated into risk assessments or utilized to improve toxicity testing. For example, protocols could be updated to include newly identified outcomes relevant to endocrine disruption.

Leading by example: systematic review case studies

To put their strategy into practice, the committee conducted two systematic reviews of low dose EDC effects.

The first case study looked at phthalates, chemicals that increase the flexibility of plastic products such as shower curtains and food wrapping.

The committee found that diethylhexyl phthalate and other selected phthalates are associated with changes in male reproductive and hormonal health. Overall, the data were strong enough to classify diethylhexyl phthalate as a “presumed reproductive hazard” in humans.

The second case study focused on polybrominated diphenyl ethers (PBDEs), flame retardants used for over 30 years. Though they are now being phased out, these chemicals remain a concern for humans. They are still present in older products and can persist in the environment for many years.

Based on data showing the impact of these chemicals on learning and IQ, the panel concluded that developmental exposure is “presumed to pose a hazard to intelligence in humans.”

Questions and challenges for the future

During its review, the committee encountered a variety of barriers that could impede similar investigations into specific chemicals.

First, when reviewing evidence, it’s important to assess any systematic errors – also known as biases – that might have led to incorrect results. These errors can arise from study design flaws, such as failure to properly blind the researchers during analysis.

Some journals have strict guidelines for reporting details related to bias, but many do not. Better adherence to reporting guidelines would improve scientists’ ability to assess the quality of evidence.

Second, the committee noted a discrepancy between the concept of doses used in human and animal studies, which made it difficult to compare data from different sources.

For example, most toxicologists simply report the dose that they delivered to animals. But some of that administered dose might not actually be absorbed. The actual internal dose of chemical circulating in the body and causing harm may differ from the amount that was administered. By contrast, epidemiologists usually think about dose as the level of chemical they detect in the body, but they may not know how much of the chemical an individual was actually exposed to.

Biological modeling techniques can help scientists draw the connection between administered and internal doses and more closely compare results from animal and human studies.
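As a toy illustration of that connection, a one-compartment model relates a repeated administered dose to an average internal concentration at steady state. All parameter values below are invented for illustration; real physiologically based pharmacokinetic (PBPK) models are far more detailed.

```python
def steady_state_concentration(dose_mg_per_kg_day, absorbed_fraction,
                               clearance_l_per_kg_day):
    """Average internal concentration (mg/L) at steady state for repeated
    daily dosing, assuming first-order elimination:

        C_ss = (F * dose rate) / clearance

    where F is the fraction of the administered dose actually absorbed.
    """
    return absorbed_fraction * dose_mg_per_kg_day / clearance_l_per_kg_day

# Illustrative numbers only: a 10 mg/kg/day administered dose with 40%
# oral absorption and a clearance of 2 L/kg/day.
print(steady_state_concentration(10, 0.4, 2), "mg/L internal")
```

The point of even this crude sketch is that the internal dose (what an epidemiologist measures) can differ substantially from the administered dose (what a toxicologist reports), and a model is needed to translate between the two.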

Finally, many toxicology studies focus on only a single chemical. This is a valuable way to identify how one chemical affects the body. However, given that we are all exposed to chemical mixtures, these procedures may be of limited use in the real world.

The committee suggested that toxicologists incorporate real-world mixtures into their studies, to provide more relevant information about the risk to human health.

Leveraging toxicity testing for answers about low dose effects

This report demonstrates one of the challenges facing the field of toxicology and environmental health: How well can existing and emerging laboratory techniques predict adverse outcomes in humans? (If you’ve read some of my previous posts, you know that this issue is of particular interest to me.)

Traditional animal experiments usually use high doses, which don’t necessarily reflect the real world. These studies can be an important first step in identifying health hazards, but they cannot accurately predict how or at what levels the chemicals affect humans. The committee noted that more relevant doses and better modeling could help mitigate this problem.

Emerging high-throughput testing techniques use cell-based methods to detect how a chemical changes specific molecular or cellular activities. These newer methods are increasingly used in toxicology testing. They have the potential to quickly identify harmful chemicals, but have yet to be fully accepted or validated by the scientific community.

For these two case studies, the committee noted that high-throughput tests were not particularly helpful in drawing conclusions about health effects. Many of these studies are narrowly focused – looking at, for example, just a single signaling pathway, without indicating a chemical’s overall influence on an organism. Nevertheless, these methods could be used to prioritize chemicals for further in-depth testing, since activity in one pathway may predict a chemical’s capacity to cause harm.
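As a hypothetical sketch of that prioritization idea: if each chemical has a potency estimate from a pathway assay (for example, an AC50, the concentration producing half-maximal activity), chemicals can simply be ranked by potency to choose candidates for in-depth testing. The chemical names and values below are invented.

```python
# Illustrative prioritization of chemicals by in vitro potency.
# AC50 = concentration (in micromolar) producing half-maximal assay
# activity; a lower AC50 means a more potent chemical, and therefore
# a higher priority for follow-up testing. All values are made up.
ac50_um = {"chem_X": 0.5, "chem_Y": 30.0, "chem_Z": 4.2}

# Sort chemical names by ascending AC50 (most potent first).
priority = sorted(ac50_um, key=ac50_um.get)
print(priority)  # ['chem_X', 'chem_Z', 'chem_Y']
```

Real prioritization schemes combine many assays and weigh exposure alongside potency, but the basic logic is the same: activity in a pathway flags a chemical for closer scrutiny, not for a final hazard conclusion.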

Putting the report into action

Despite the imperfections of our testing methods, there’s already ample evidence of low-dose effects from many chemicals (including the two case studies from the committee). The EPA should implement this new strategy to efficiently identify and act on problematic endocrine-disrupting chemicals. Only through such strong, science-based efforts can we prevent adverse effects from chemical exposures – and allow everyone to live the healthy lives that they deserve.

A Decade into the “Vision,” Environmental Health gets a Progress Report

This year represents an important 10-year milestone for science and society.

No, I’m not referring to the 10th anniversary of the Apple iPhone, though that has undoubtedly changed all of our lives. Rather, 2017 marks ten years since the National Academy of Sciences (NAS) released its seminal report, Toxicity Testing in the 21st Century: A Vision and a Strategy.

In that report, the NAS laid out a vision for a new approach to toxicology that incorporates emerging cell-based testing techniques, rather than costly and time-intensive whole animal models, and utilizes early biological pathway perturbations as indications of adverse events, rather than relying on evaluations of end disease states. Tox21 and ToxCast, two federal programs focused on using alternative assays to predict adverse effects in humans, were initiated as first steps in this strategy. In the years since its release, the report has profoundly shaped the direction of environmental health sciences, particularly toxicology. (An analogous exposure sciences report, Exposure Science in the 21st Century: A Vision and a Strategy, was published in 2012.)

Now, one decade later, the NAS has reviewed progress on these efforts in its recently released report, Using 21st Century Science to Improve Risk-Related Evaluations.

How are we doing, and what are next steps?

Overall, the committee supports efforts to use data from new tools, such as biological pathway evaluations, in risk assessment and decision-making. (Of course, limitations should be clearly communicated, and tools should be validated for their specific purposes.) Several case studies are described as examples of situations where emerging tools can be useful, such as quickly prioritizing chemicals of concern or evaluating risks from chemical mixtures at a contaminated site.

This report also documents advancements and challenges for each of the three interconnected fields of environmental health sciences: toxicology, exposure science, and epidemiology. I’ve summarized some of these key points in the chart below, and additional (digestible) information is available in the NAS report summary.

  

 

Key challenges by field (summarized from the chart of recent advancements and key challenges):

Toxicology

  • Incorporate metabolic capacity in in vitro assays
  • Understand applicability & limitations of in vitro assays
  • Improve biological coverage
  • Address human variability & diversity in response

Exposure Science

  • Coordination of exposure science data (e.g., databases)
  • Integration of exposure data for multiple chemicals obtained through varied methods

Epidemiology

  • Improved data management & data sharing
  • Improved methods for estimation of exposures

I won’t go into detail on all of these points, but I do want to highlight some of the key challenges that the field of toxicology will need to continue to address in the coming years, such as:

  • Improving metabolic capacity of in vitro assays: Cell-based assays hold promise for predicting biological responses of whole animals, but it is critical to remember that these new tools rarely reflect human metabolic capacity. For example, if a chemical is activated or detoxified by an enzyme in our bodies, reductionist assays would not capture these changes – and thus their predictions would not be fully relevant to human health. We need continued work to incorporate metabolic capacity into such assays.
  • Improving biological coverage: An analogy that I’ve often heard in relation to the limitations of these new tools is that they are only “looking under the biological lamp post.” Essentially, we can only detect effects that the assays are designed to evaluate. So, we need further development of assays that capture the wide array of possible adverse outcomes. And we cannot assume that there is no hazard for endpoints that have not been evaluated.

New models of disease causation

Not only is the environmental health science ‘toolkit’ changing, but so is our understanding of disease causation. As discussed in the report, 21st century risk assessment must acknowledge that disease is “multifactorial” (multiple different exposures can contribute to a single disease) and “nonspecific” (a single exposure can lead to multiple different adverse outcomes). This advanced understanding of causality will pose challenges for interpreting data and making decisions about risk, and we will need new practices and methods to address these complexities.

For example, we can no longer investigate only whether a certain exposure, triggering a certain pathway, causes disease in isolation; we must also ask whether it increases the risk of disease when combined with other potential exposures. It gets even more complicated when we consider that individuals may respond to the same exposures in different ways, based on their genetics or pre-existing medical conditions.

The Academy suggests borrowing a tool from epidemiology to aid in these efforts. The sufficient-component-cause model provides a framework for thinking about a collection of events or exposures that, together, could lead to an outcome.


Sufficient-component-cause model. Three disease mechanisms (I, II, III), each with different component causes. Image from the NAS report, Using 21st Century Science to Improve Risk-Related Evaluations.

 

Briefly, each disease has multiple component causes that fit together to complete the causal pie. These components may be necessary (present in every disease pie) or sufficient (able to cause disease alone), and different combinations of component causes can produce the same disease. Using this model may promote a transition away from a focus on finding a single pathway of disease to a broadened evaluation of causation that better incorporates the complexities of reality. (I’ve blogged previously about the pitfalls of a tunnel-vision, single pathway approach in relation to cancer causation.)
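The causal-pie logic is simple enough to sketch in a few lines of code. The component causes below are placeholders for illustration, not real mechanisms:

```python
# Sketch of the sufficient-component-cause ("causal pie") model.
# Each sufficient cause is a set of component causes that, together,
# are enough to produce disease. Names are invented placeholders.
sufficient_causes = [
    {"chemical_A", "genetic_variant"},          # mechanism I
    {"chemical_A", "chemical_B", "infection"},  # mechanism II
    {"smoking", "chemical_B"},                  # mechanism III
]

def disease_occurs(exposures):
    """Disease occurs if any causal pie is fully completed."""
    return any(pie <= exposures for pie in sufficient_causes)

def necessary_components(pies):
    """Components present in every sufficient cause (necessary causes)."""
    return set.intersection(*pies)

print(disease_occurs({"chemical_A", "genetic_variant", "diet"}))  # True
print(necessary_components(sufficient_causes))                    # set()
```

In this toy example no component appears in every pie, so none is a necessary cause, and several different combinations produce the same disease, mirroring the “multifactorial” and “nonspecific” nature of disease discussed in the report.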

Integration of information, and the importance of interdisciplinary training

As the fields of toxicology, exposure science, and epidemiology continue to contribute data towards this updated causal framework, a related challenge will be the integration of these diverse data streams for risk assessment and decision-making. How should we weigh different types of data in drawing conclusions about causation and risk? For example, what if the in vitro toxicology studies provide results that differ from the epidemiology studies?

The committee notes that we will need to rely on “expert judgment” in this process, at least in the short term until standardized methods are developed. And they discuss the need for more interaction between individuals from different disciplines, so that knowledge can be shared and applied towards making these difficult decisions.

One issue that was not discussed, however, is the importance of training the next generation of scientists to address these complex challenges. Given the inevitable need to integrate multiple sources of data, I believe it is critical that the students in these fields (like me!) receive crosscutting training as well as early practice with examples of these multi-faceted assessments. Some programs offer more opportunities in this area than others, but this should be a priority for all departments in the coming years. Otherwise, how can we be prepared to step up to the challenges of 21st century environmental health sciences?

Looking forward

Speaking of challenges, we certainly have our next decade of work cut out for us. It is amazing to think about how much progress we have made over the last ten years in developing new technologies, particularly in toxicology and exposure science. Now we must: refine and enhance these methods so they provide more accurate information about hazard and exposure; address the complexities of multifactorial disease causation and inter-individual susceptibility; and work across disciplines to make decisions that better protect public health and the environment.

Cancer Risk Assessment: Are We Missing the Forest for the Trees?

In recent years, national and international environmental public health organizations (including the US Environmental Protection Agency and the World Health Organization) have begun to use the adverse outcome pathway (AOP) and/or mode of action (MOA) as unifying frameworks for chemical testing and risk assessment. While the details of these frameworks vary, their underlying ideas are similar: researchers link specific molecular changes caused by environmental chemicals with adverse outcomes at the organism level (i.e., disease), and then risk assessment is conducted on the premise that preventing the early molecular disruption will prevent the development of the end-stage adverse event.

While there are practical advantages and real logic to this mechanism-based approach, a new review article published in Carcinogenesis suggests that this strategy may be overly simplistic and could potentially hinder our ability to adequately identify chemicals that contribute to the development of cancer.

This international team of cancer biologists and environmental health scientists organized their discussion around the “Hallmarks of Cancer,” a list of acquired characteristics that commonly occur in cancer (for example: continued growth, resistance to cell death, and tissue invasion). For each key characteristic, they identified typical target sites for disruption as well as environmental chemicals that have been shown to act on those targets. The researchers focused their discussion solely on chemicals that were not already categorized as human carcinogens by the International Agency for Research on Cancer (IARC), and they took careful note of effects observed at low doses. In addition, they specifically mapped connections between different pathways to highlight cases in which alterations leading to a given cancer hallmark could also lead to another.

Their lengthy review provides an important overview of the procarcinogenic effects of numerous common chemicals, but perhaps the most significant conclusion of this work is to emphasize the pitfalls in the status quo for risk assessment. By focusing on categorizing single chemicals as ‘carcinogens,’ we neglect to acknowledge that combinations of chemicals that individually do not meet criteria to be categorized as ‘carcinogenic’ may act in synergistic ways to promote the development of cancer. Even recent efforts to evaluate the effects of chemical mixtures may be inadequate, as they mostly focus on chemicals with common cellular pathways or targets. What about the numerous compounds, as identified in this review, that act on disparate pathways and organs to contribute to a similar disease process in the body?

To address these problems, the authors propose several key principles for an improved framework for cumulative risk assessment, including consideration of the synergistic activity of:

  • chemicals that act via different pathways
  • chemicals that act on different target tissues
  • non-carcinogens that act at low doses to contribute to pro-carcinogenic processes
  • chemicals that are not structurally similar

Carcinogenesis, like many disease processes, is complicated, and identifying the numerous pathways and organs involved is – and will continue to be – an enormous scientific challenge. Nevertheless, steady progress can be made with a shift towards testing real-world combinations of chemicals and by using the ‘Hallmarks of Cancer’ to guide relevant and appropriate research. New technologies, such as high-throughput screening, computational modeling, and systems biology-based analysis, can aid in this process. However, the authors stress that traditional in vivo testing still holds an important place in cancer-related research – at least until these emerging tools are appropriately validated.

This publication highlights that our current chemical testing and risk assessment system is overly narrow and neglects the complexity with which chemicals can interact in the body. We must broaden our approach to acknowledge that distinct chemicals can act in distinct ways at distinct sites – even at low doses – to contribute synergistically to a specific disease process. Reframing our perspective is daunting, and it will expose how limited our knowledge is about the mixtures of chemicals that we are exposed to every day. But, if we can look up to see the forest, we may begin to make our way towards safer territory.