The Big Chart of Federal Chemical Evaluations & Assessments

Today, I’m excited to share an infographic that I made, depicting all of the different chemical evaluations and assessments that various federal agencies (in the U.S.) conduct.

If you want to hear about the backstory & process for creating this, read on below.

Otherwise, here’s a link to a PDF version of the graphic. There are hyperlinks throughout, if you want to explore any of the information further. Yes, I know this is very detailed; it is meant to be digested by zooming around to different sections of the graphic.

I’ve tried to be as accurate as possible. But if you catch something that doesn’t look right, please let me know.

I hope this helps the environmental health community (and others who might be interested) better understand the review processes that are intended to keep us safe (unless/until politics get in the way…).

The backstory

In April, the Agency for Toxic Substances and Disease Registry (ATSDR) released a draft ToxProfile for glyphosate. If you’ve been following this blog, you know that I’ve been paying a lot of attention to glyphosate lately (see some of my recent posts here and here). Given my interest in this topic, I decided to review the document and take the opportunity to prepare public comments.

[What are public comments, you might ask? Public comments are a way for the public to provide feedback during the federal rulemaking process. Under the Administrative Procedure Act (1946), whenever a federal agency develops a new regulation, they are required to solicit input from the public. (For more on the public comment process and how you can get involved, check out the Public Comment Project!)]

As I was reviewing ATSDR’s ToxProfile, I realized that I did not fully understand how this effort was distinct from EPA’s assessment of glyphosate. ATSDR and EPA are two separate federal agencies with different missions, so clearly these assessments served different purposes.

I soon realized that elucidating this distinction was just one part of a larger story. So, I decided to create a master chart to better understand all of the different types of reviews, evaluations, and assessments that different federal agencies conduct, the main purposes of these evaluations, and what other processes or regulations they might relate to.

The process

Some of the agency assessments were quite familiar to me or fairly well-explained online; for example, those that EPA is supposed to conduct under the recently reformed Toxic Substances Control Act. It was surprisingly hard to get clear information on other assessments and related agency activities, however (even for me, someone who is relatively well-versed in this field). Specifically, I found the online information for the Occupational Safety and Health Administration (OSHA), the National Institute for Occupational Safety and Health (NIOSH), and the Consumer Product Safety Commission (CPSC) to be a bit confusing. I actually ended up calling several people at these agencies (using phone numbers listed online) to get clarifying information. (Thank you to those federal employees who picked up my cold calls and answered my questions!)

I started collecting this information in an Excel chart, but this format is not very conducive to easy online reading or sharing. So, I decided to challenge myself to make an infographic, which I had never done before. I experimented with various online tools before settling on draw.io, which I also used to make the timeline in the glyphosate meta-analysis. I’ll spare you the details, but let’s just say, this took me a LONG time (sorry, dissertation, I’ll get back to you soon).

I imagine that I’ll continue to refine this over the next few months/years. If you see anything that looks wrong or have suggestions for improvement, let me know.

Ignorance is not bliss for “inert” pesticide ingredients

One of the complicated parts of assessing the hazards and risks of glyphosate is that the product that everyone uses (for example, Round-Up) is not just glyphosate. The active ingredient is glyphosate, but the final formulation sold in stores is a combination of glyphosate and other “inert” ingredients.

[Note: I’m going to stubbornly use quotation marks around the word “inert” throughout this article, to emphasize my point that this is not an accurate characterization. “Inert” implies inactive, which is not true. Read on for more.]

These “inert” ingredients are subject to essentially no testing, disclosure, or regulatory requirements, even though they almost always make up a larger percentage of the final product than active ingredients. And, evidence indicates that combinations of the “inert” and active ingredients can actually be more toxic than the pure active compound (for example, see here, here, and here).

A new publication by Mesnage et al. in Food and Chemical Toxicology reviews the problems with the status quo and the implications for health effects research. Given the relevance of this topic to my previous blog posts on glyphosate (see here and here) and pesticides in general, I’ll summarize some of the authors’ key points below.

But first, some terminology: what is the difference between active and “inert” pesticide ingredients?

Under the U.S. Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA), an active ingredient is one that is intended to be toxic to the target species. For example, glyphosate, the active ingredient used in glyphosate-based herbicides (GBHs), blocks an essential enzyme pathway in plants. All other ingredients in a pesticide product, often added to improve effectiveness, are classified as “inert.”

Abounding uncertainty about “inerts”

Contrary to what their name suggests, however, “inert” ingredients may not actually be inert. In fact, the U.S. Environmental Protection Agency (EPA) explicitly states that this designation “does not mean non-toxic.”

But, it’s challenging to get extensive and accurate information about these chemicals because:

  • Neither the “inert” ingredients nor the final formulations (the combination of active + “inert” ingredients) are subject to most standard regulatory toxicity tests, such as evaluation of cancer and reproductive effects. As a result, pesticide approvals are based on the unrealistic scenario of exposure to the active ingredient alone.
  • Companies can routinely claim the identity and concentration of these “inert” ingredients as confidential business information (CBI). That is why you’ll often see labels like the extremely vague one below. As a result, it’s difficult – actually, essentially impossible – for scientists to independently evaluate possible toxicity. We are kept blind to these final products.

[Image: example of a vague pesticide label. Credit: Beyond Pesticides]

  • Because we don’t know the identity of these “inert” ingredients, there are essentially no monitoring data on environmental or human exposure.

So, in summary, we don’t know how toxic the “inert” ingredients or final formulations are; the identity of these “inert” ingredients is kept secret from the public; and we aren’t monitoring any of these chemicals for their presence in our bodies or the environment.

All of this makes it challenging for the EPA to conduct accurate pesticide risk assessments, which require information on both hazard (i.e.: toxicity) and exposure.

Constant change 

An added barrier is that companies often change their formulations over time and across regions. Apparently, there are usually between 100 and 150 different formulations of GBHs on the market at any given time, and over 2,000 different varieties have been registered in Europe since 2012.

How are we supposed to evaluate the health effects of such a moving target? Robust epidemiological studies require precise definitions of exposure (related to the “consistency” principle) to support causal inference. In essence, the exposure under investigation should be defined specifically enough that different variations of the exposure cannot have different effects, which would muddy the overall conclusion of the study.

(As a concrete example, think about investigating the impact of “exercise” on health. Exercise is very broad, so it wouldn’t be helpful or informative to evaluate the effect of general “exercise,” which could span everything from a 30-minute walk once per month to a 2-hour run each day. The effects of these different types of exercise could have very different impacts on health. So, a better study question would be focused on a more specific type of exercise.)

For pesticide epidemiology, all of these changing formulations make it very challenging to draw conclusions on health effects across time and space. It’s quite likely that one study based in multiple locations could be evaluating the effects of different products at the same time. A study looking at one region over a period of several years also faces the same problem. As the authors of the recent publication stated, “formulated GBHs with the same product name from different countries may not be the same mixture of chemicals, nor the same as the brand-name product bought previously, or in the future.”

This is one possible reason for differing conclusions about hazard, and it makes reproducibility nearly impossible.

Overcoming INERTia 

The authors put forth a few suggestions to improve this murky situation. Some can be acted on by researchers now, such as including detailed product information (e.g.: trade name, dates of manufacture, product ID number) in the methods sections of their papers, to facilitate reproducibility and comparison across studies.

Other proposals will need to wait until there is political will for policy change. Most important is the need for full public disclosure of pesticide product composition. (By the way, back in 1997, the American Medical Association urged Congress to “support all efforts to list both active and inert ingredients on pesticide container labels.”) The authors also suggest monitoring of food and feed for concentrations of the “inert” ingredients (that is, if we can get access to information about their identities!), so we can understand patterns of exposure.

Additionally, it is essential to revise the pesticide approval processes to include full testing of “inert” ingredients as well as the final formulated products. We urgently need a regulatory system that accounts for these real-world exposures.

It’s high time for transparency on these formulations and their effects on human health and the environment.

What’s the Risk?

Last week, we published a meta-analysis that found that high exposure to glyphosate-based herbicides was associated with an increased risk of non-Hodgkin Lymphoma (NHL). There was a lot of discussion about this paper in the news, on social media, and across internet forums (as expected, given the ongoing controversy and high stakes of this conclusion). Most articles focused on the specific risk estimate that we reported, with headlines such as:

Glyphosate exposure increases cancer risk up to 41%, study finds

Weedkiller raises risk of non-Hodgkin lymphoma by 41%

Common weed killer increases cancer risk by almost half

A common critique of these headlines (and our article) was that they (and we) were being misleading, because we reported the 41% increase in relative risk of NHL – which sounds very scary! – rather than the 0.8% increase in absolute risk – which sounds much less scary.

At the risk of more undue attention on the 41% number (which as I said in my previous post, is less important than the finding of a significant association itself), let me explain a few things about (1) how we report results in epidemiological research, (2) why small increases in risk matter, and (3) how agencies like the Environmental Protection Agency (EPA) regulate on risk.

Relative risks vs. absolute risks

In epidemiology, we are trying to understand whether an exposure is associated with a disease. To do this, we compare the disease rate in the exposed group with the disease rate in the unexposed group.  This ratio gives us the relative risk of disease between the two groups.

[Side note: this is why it is crucial for researchers to select an appropriate comparison group! The relative risk depends entirely on this decision! If your comparison group has an unusually high rate of cancer, you will get a very skewed (and wrong) answer about the effects of the exposure.]

This relative risk, however, does not give us any information on the absolute risk of the disease at the individual level. It only tells us whether the exposed group has a higher or lower chance of developing the disease than the comparison group. In our paper, we report that individuals with high exposure to glyphosate-based herbicides (for example, people who spray it daily for many years) have a 41% increased risk of developing NHL over their lifetimes, compared to those who were not highly exposed (infrequent or no history of use).

The absolute risk, by contrast, tells us the actual risk of the disease for a given level of exposure. This is much more intuitive. For example, on average in the US, approximately 2 out of every 100 people develop NHL during their lifetime. So, the absolute risk of NHL over a lifetime is 2%. Therefore, when our study reports a 41% increased risk for those who are highly exposed, that is equivalent to saying that these individuals now have an absolute lifetime risk of about 2.8%.
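To make the two metrics concrete, here is a tiny arithmetic sketch using the approximate figures above (a ~2% baseline lifetime risk and a 41% relative increase):

```python
# Converting between relative and absolute risk, using the approximate
# figures from the text: ~2% baseline lifetime risk of NHL, and a 41%
# increased relative risk in the highly exposed group.

baseline_risk = 0.02    # absolute lifetime risk of NHL in the comparison group
relative_risk = 1.41    # 41% increased risk (relative)

# Absolute risk in the exposed group = baseline risk x relative risk
exposed_risk = baseline_risk * relative_risk        # ~0.028

# The risk difference is the absolute increase that headlines rarely report
risk_difference = exposed_risk - baseline_risk      # ~0.008

print(f"Exposed absolute risk: {exposed_risk:.1%}")        # 2.8%
print(f"Absolute risk increase: {risk_difference:.1%}")    # 0.8%
```

Same data, two very different-sounding numbers.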

These statistics are communicating the same basic information, but they sound very different. In our epidemiology courses, we learn that absolute risk is better for communicating to the public because it is easier to understand. But, because of the way that epidemiological studies are designed (comparing disease rates in one group vs. the other), our default is to report relative risks. And because we are used to thinking about these ratios, we don’t always realize that this information can be misinterpreted, misunderstood, and confusing. Maybe we should report both metrics in our abstracts.

Nevertheless, both ways of talking about risk give us the same answer to the central question of carcinogenicity: evidence suggests that glyphosate exposure is associated with an increased risk of cancer.

Why seemingly low risks are still important

Some environmental exposures have very high relative risks. Individuals exposed to high levels of asbestos in their homes, for example, have an 800% increased risk of developing mesothelioma, a very rare cancer of the lining of the lungs.

Most common environmental exposures, however, are associated with relatively small increases in relative risk. Let’s take a look at air pollution, a very common exposure. And more specifically, fine particulate matter (PM2.5), very tiny particles emitted from vehicles, industrial facilities, and fires. While exact estimates vary based on the population studied, an increase of 10 µg/m³ in 24-hour average PM2.5 has been associated with a 0.4%-1.0% increased risk of death (mostly from cardiovascular disease). An increase of 10 µg/m³ in long-term average PM2.5 has been associated with an overall 10% increased risk of death.

Those seem like small changes in risk. So, can we stop worrying about air pollution?

No, definitely not.

Low relative risks applied to large populations can be extremely consequential. We are all exposed to air pollution. Every day. And all of those exposures add up. In fact, PM2.5 was ranked as the 5th leading risk factor for death around the world in 2015, accounting for approximately 4.2 million deaths.

Glyphosate-based herbicides are the most heavily used herbicides in the world, with an estimated 1.8 billion pounds applied in 2014. Most of this usage is on commercial agricultural farms by workers with potentially high cumulative exposures over their lifetimes. Given the large number of people possibly exposed, any significant increase in risk – especially the 41% estimate that we report – is meaningful to consider at the population level.
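As a back-of-the-envelope illustration of that population-level point: suppose (hypothetically – the one-million figure below is mine for illustration, not an estimate from the paper) that a million people are highly exposed over their lifetimes:

```python
# Hypothetical sketch: a "small" absolute risk increase applied to a large
# population. The one-million exposed-population figure is an illustrative
# assumption; only the ~2% baseline risk and 41% relative risk come from
# the text.

baseline_lifetime_risk = 0.02
relative_risk = 1.41
exposed_population = 1_000_000   # assumption, for illustration only

baseline_cases = exposed_population * baseline_lifetime_risk
exposed_cases = exposed_population * baseline_lifetime_risk * relative_risk
excess_cases = exposed_cases - baseline_cases

print(f"Excess lifetime NHL cases in this group: {excess_cases:,.0f}")  # 8,200
```

A 0.8 percentage-point bump that sounds trivial for one person adds up to thousands of extra cases across a large exposed group.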

Regulating risk

Finally, I want to bring up a point about cancer risk in relation to regulations. The US EPA and Food and Drug Administration (FDA), among other agencies, have to manage and regulate risks for the population. For most scenarios, they have decided that an “acceptable risk” for a carcinogen in the general population is between 1 in a million and 1 in 10,000 (over a lifetime). In other words, EPA and FDA are supposed to take action to prevent exposure to carcinogens that would result in risks higher than those rates (the specific threshold depends on the scenario and, sometimes, technological feasibility).

Our findings suggest that the absolute risk of NHL over a lifetime might shift from approximately 2% to 2.8% with high exposure to glyphosate-based herbicides. This difference represents an increase of 8/1000 – certainly above EPA’s threshold of concern for the general population.

Note, however, that some of the studies in our meta-analysis were focused on people using glyphosate in commercial agricultural settings. EPA usually allows a higher risk of cancer in occupational scenarios, approximately 1 in 1000. Even with that standard, however, our results would suggest a need for action.
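Written out as arithmetic, using the thresholds and the approximate 2% to 2.8% shift quoted above, the comparison looks like this:

```python
# Comparing the excess lifetime risk implied above to the "acceptable risk"
# levels described in the text (general population ceiling: 1 in 10,000;
# occupational: roughly 1 in 1,000).

excess_lifetime_risk = 0.028 - 0.02      # ~8 in 1,000

general_population_ceiling = 1e-4        # 1 in 10,000
occupational_threshold = 1e-3            # 1 in 1,000

print(excess_lifetime_risk > general_population_ceiling)  # True
print(excess_lifetime_risk > occupational_threshold)      # True
```

The implied excess risk sits well above both benchmarks.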

I’m just using these comparisons to put our results in context, because many people seemed to discount this work because of the small absolute risk estimates. Before any actual regulatory action, EPA would need to consider extensive evidence on hazard and exposure in a formal risk assessment.

Summary

In closing, I hope that I’ve clarified a few points about risk that were raised in the aftermath of the glyphosate publication. But once again, let me emphasize that you should not focus too much on the specific numerical estimates above but rather use them to better understand that:

  • Relative risks are different from absolute risks. Epidemiologists usually use relative risks, so that is what you will see in published papers (and, likely, the headlines as well).
  • Exposures with low relative risks can still have huge impacts at the population level.
  • Regulatory agencies set certain benchmarks for acceptable lifetime cancer risk in the population. You might not agree with the thresholds, but those are the standards. Keep that in mind when you are reading about risks from environmental exposures.

Concerning Glyphosate

Apologies for the long blog absence. I’ve been busy PhD-ing (including preparing for and passing my oral general exam!) and working on various side projects.

One of those side projects has been focused on glyphosate. Glyphosate, the active ingredient in Monsanto’s (now owned by Bayer) Roundup, is the most widely used herbicide in the world. It was first marketed in 1974, and its usage skyrocketed after the introduction of “Roundup-ready” (i.e.: Roundup-resistant) crops in 1996 and the practice of “green burndown” (i.e.: using the chemical as a desiccant shortly before harvest) in the mid-2000s. In 2014, global usage was estimated to be 1.8 billion pounds.

But these staggering statistics are not the only claim to fame for glyphosate. It has also been the subject of intense international regulatory and scientific scrutiny in recent years, for its possible link to cancer. The stakes are high (billions of dollars for Monsanto, related to sales of both the herbicide itself and its line of herbicide-resistant crops), and the conclusions are controversial.

Carcinogenic or not, that is the question.

In 2015, the International Agency for Research on Cancer (IARC) declared that glyphosate was a “probable human carcinogen” (relevant links: explanation of IARC classifications; official summary for glyphosate; IARC webpage with follow-up links). However, that same year, the European Food Safety Authority (EFSA) concluded that “glyphosate is unlikely to pose a carcinogenic hazard to humans, and the evidence does not support classification with regard to its carcinogenic potential.” In 2016, the US Environmental Protection Agency (EPA) determined that glyphosate was “not likely to be carcinogenic to humans at doses relevant for human health risk assessment.”

Ok, so that’s confusing. How did these agencies, all of which are supposed to conduct unbiased reviews of all of the evidence, come to such different conclusions? There have been several recent publications that explain these inconsistencies (for example, see here and here). In essence, it boils down to: 1) differences in how the agencies weighed peer-reviewed, publicly available studies (most show adverse health effects) versus unpublished regulatory studies submitted by manufacturers (most do not show adverse health effects); 2) whether the agencies focused on studies of pure glyphosate or the final formulated glyphosate-based product that is used in agricultural applications (which is known to be more toxic); and 3) whether the agencies considered dietary exposures to the general population only or also took into account elevated exposures in occupational scenarios (i.e.: individuals who apply glyphosate-based herbicides in agricultural settings).

Meanwhile, as the debate continues… 27 countries (as of November 2018) have decided to move forward with implementing their own bans or restrictions. And, Monsanto/Bayer faces more than 9,000 lawsuits in the US from individuals who link their cancer to the herbicide. (The court ruled the first case in favor of the plaintiff, though Monsanto is appealing the decision.)

My connection

This highly contentious area is outside the topic of my dissertation research, but I got involved because my advisor was a member of the EPA scientific advisory panel that reviewed the agency’s draft assessment of glyphosate in 2016. The panel’s final report raised a number of concerns with EPA’s process and conclusions, including that the agency did not follow its own cancer guidelines and made some inappropriate statistical decisions in the analysis.

Because of their dissatisfaction with EPA’s report, my advisor and two other panel members decided to pursue related research to dig further into the issues. I enthusiastically accepted the invitation to join.   

Our collaborative group recently published two review papers on glyphosate. I’ll provide brief highlights of both below.

Reviewing our reviews, part 1: exposure to glyphosate  

In January 2019, we published a review of the evidence of worldwide exposure to glyphosate. Even though glyphosate-based products are the most heavily used herbicides in the world, we were surprised (and dismayed) to find fewer than twenty published studies, documenting exposure in only 3,721 individuals.

So, our paper mostly serves to highlight the limitations of the existing data:

  • These studies sampled small numbers of individuals from certain geographic regions, mostly in the US and Europe, and therefore are not representative of the full scope of global exposures
  • Most studies relied on a single urine spot sample, which does not represent exposure over the long term and/or in different agricultural seasons
  • The occupational studies only covered 403 workers in total, a serious deficiency given its widespread agricultural use. Few assessed exposure before and after spraying; and no studies evaluated patterns related to seasonality, crop use, etc.
  • Only two small studies evaluated how population exposure has changed over time. So, we definitely don’t know enough about whether the dramatic increases in global usage have resulted in similarly dramatic increased concentrations in our bodies. (Presumably, yes).  

In addition to highlighting the need to address the points above, we specifically recommended incorporating glyphosate into the National Health and Nutrition Examination Survey (NHANES), a national survey that monitors exposure to many chemicals – including other common pesticides. This is an obvious and fairly straightforward suggestion; in fact, it’s quite bizarre that glyphosate has not already been incorporated into NHANES. Testing for glyphosate would allow us to better understand exposure across the US – which is not reflective of global levels, of course, but an important start.

Reviewing our reviews, part 2: glyphosate & non-Hodgkin Lymphoma (NHL)  

Our second paper, published earlier this week, was a meta-analysis of the link between glyphosate exposure and non-Hodgkin Lymphoma (NHL). Yes, diving right in to the controversy.

There had already been several prior meta-analyses showing an association between glyphosate and NHL, but ours incorporates new research and applies a method that is more sensitive for detecting an association.

A meta-analysis combines results from separate studies to better understand the overall association. While they technically do not generate any “new” data, meta-analyses are essential in the field of public health. A single study may have certain weaknesses, focus only on selected populations, or reflect a chance finding. In drawing conclusions about hazards (especially in this scenario, affecting millions of people and billions of dollars), we want to look across the collection of data from many studies so we can be confident in our assessment.
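For intuition, the core of most meta-analyses is inverse-variance pooling: each study’s (log) estimate is weighted by its precision. Here is a minimal sketch with made-up study estimates – these are NOT the studies from our paper, and our actual analytical methods differ in important ways:

```python
import math

# Minimal inverse-variance (fixed-effect) pooling sketch. Each entry is an
# invented (relative risk, 95% CI lower, 95% CI upper) tuple, purely for
# illustration of the mechanics.

studies = [(1.3, 0.9, 1.9), (1.6, 1.1, 2.3), (1.2, 0.8, 1.8)]

log_rrs, weights = [], []
for rr, lo, hi in studies:
    log_rr = math.log(rr)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from CI width
    log_rrs.append(log_rr)
    weights.append(1 / se**2)                        # inverse-variance weight

# Weighted average on the log scale, then transform back
pooled_log_rr = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

pooled_rr = math.exp(pooled_log_rr)
ci = (math.exp(pooled_log_rr - 1.96 * pooled_se),
      math.exp(pooled_log_rr + 1.96 * pooled_se))
print(f"Pooled RR: {pooled_rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```

Pooling shrinks the uncertainty: precise studies pull the combined estimate harder, and the pooled confidence interval is narrower than any single study’s.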

We were able to include a newly published follow-up study of over 54,000 licensed pesticide applicators (part of the Agricultural Health Study (AHS)). Compared to an earlier paper of the same cohort, this updated AHS study reports on data for an additional 11-12 years. This extension is important to consider, given that cancer develops over a long period of time, and shorter studies may not have followed individuals long enough for the disease to arise.

We conducted this meta-analysis with a specific and somewhat unusual approach. We decided to focus on the highly exposed groups in order to most directly address the question of carcinogenicity. In other words, we would expect the dangers (or, proof of safety: is it safe enough to drink?) to be most obvious in those who are highly exposed. Combining people who have low exposure with those who have high exposure would dilute the association. IMPORTANT NOTE: this approach of picking out the high exposure groups is only appropriate because we are simply looking for the presence or absence of a link. If you were interested in the specific dose-response relationship (i.e.: how a certain level of exposure relates to a certain level of hazard), this would not be ok.

Our results indicate that individuals who are highly exposed to glyphosate have an increased risk of NHL, compared to the control/comparison groups. This finding itself is not entirely earth-shattering: the results from prior meta-analyses were similar. But, it adds more support to the carcinogenic classification.

More specifically, we report a 41% increased risk. For comparison, the average lifetime risk of NHL is about 2%. However, I want to emphasize that because our analytical method prioritized the high-exposure groups, the precise numerical estimate is less important than the significant positive association itself. Basically, the purpose of this and other related assessments (like IARC’s) is to understand whether glyphosate is carcinogenic or not: this is a yes/no question. It is up to regulatory agencies to judge the scale of this effect and decide how to act on this information.

As with any scientific project, there are several limitations. In particular, we combined estimates from studies that differed in important ways, including their design (cohort vs. case-control), how they controlled for confounding by exposure to other pesticides, and which reference group they chose for the comparison (unexposed vs. lowest exposed). When studies are very different, we need to be cautious about combining them. This is another reason to focus more on the direction of the effect rather than the exact numerical estimate.  

Beyond the headlines

The news coverage of this work has focused on the overarching results (especially the 41% statistic), as expected. But I want to highlight a few other aspects that have been overlooked.

To better understand the timing of these studies in relation to glyphosate usage, we put together a timeline of market milestones and epidemiological study events.

 

[Image: Glyphosate_V9 – timeline of glyphosate market milestones and epidemiological study events]
This took me SO MANY HOURS.

Of note is that all of the studies conducted to date evaluated cancers that developed prior to 2012-2013, at the latest. Most were much earlier (80s, 90s, early 00s). As illustrated in the timeline, we’ve seen a huge increase in glyphosate usage since green burndown started in the mid-2000s. Yet none of these studies would have captured the effects of these exposures, which means the correlation should be easier to see in newer studies if/when they are conducted.

Also, as I mentioned above, we included the newly published AHS cohort study in our meta-analysis. One might expect the old and new AHS studies to be directly comparable, given that they were conducted by the same research group. However, our deep dive into both papers elucidated important differences; consequently, they are not directly comparable (see Table 8 of our paper). An in-depth discussion of these issues (and some of their potential implications) is a topic for a separate post, but there’s a clear lesson here about how important it is to carefully understand study design and exposure assessment methods when interpreting results.

Finally, two brief points on the animal toxicology studies, which we also reviewed in our paper because they provide complementary evidence for assessing hazard in humans. We discuss these data but did not conduct a formal pooled analysis (to combine results from separate but similarly designed animal studies), which would allow us to better understand overarching results from the animal studies. Anyone ready for a project?  

Additionally, in future animal toxicology studies, researchers should use the formulated glyphosate product that is actually used around the world rather than the pure glyphosate chemical that has been the focus of prior testing. There is growing evidence to suggest that the final formulated product is more toxic, perhaps due to the added adjuvants and surfactants. And this would allow for better comparisons to the human epidemiological studies, which assess effects of exposure to the formulated product.

Reflecting on the process

I had followed the evolving story on glyphosate with great interest for several years, so it was exciting to be part of these projects. Contributing to research with a real-world public health impact has always been a priority for me, and this high-profile research (affecting millions of people, billions of dollars) certainly fits the bill.

That being said, it was not an easy process. These two papers represent years of work by our group, which we did on top of our regular commitments. Collaborating with three researchers whom I had never met also proved challenging, since we did not have established rapport or an understanding of each other’s work and communication styles. So, in addition to gaining skills in conducting literature reviews and meta-analyses, I learned valuable lessons in group dynamics. 🙂

Given the high-stakes and high-profile nature of this work, we were extra meticulous about the details of this project. We knew that it would be scrutinized carefully, and any error could damage our credibility (especially worrisome for me, since I’m just establishing myself in my career). It took many, many rounds of review and editing to get everything right. A good lesson in patience.

Speaking of patience, I know that scientific research and related policy decisions take time. But I hope that these two projects can contribute to moving forward in a direction that protects public health.