A Decade into the “Vision,” Environmental Health Gets a Progress Report

This year represents an important 10-year milestone for science and society.

No, I’m not referring to the 10th anniversary of the Apple iPhone, though that has undoubtedly changed all of our lives. Rather, 2017 marks ten years since the National Academy of Sciences (NAS) released its seminal report, Toxicity Testing in the 21st Century: A Vision and a Strategy.

In that report, the NAS laid out a vision for a new approach to toxicology that incorporates emerging cell-based testing techniques, rather than costly and time-intensive whole animal models, and utilizes early biological pathway perturbations as indications of adverse events, rather than relying on evaluations of end disease states. Tox21 and ToxCast, two federal programs focused on using alternative assays to predict adverse effects in humans, were initiated as first steps in this strategy. In the years since its release, the report has profoundly shaped the direction of environmental health sciences, particularly toxicology. (An analogous exposure sciences report, Exposure Science in the 21st Century: A Vision and a Strategy, was published in 2012.)

Now, one decade later, the NAS has reviewed progress on these efforts in its recently released report, Using 21st Century Science to Improve Risk-Related Evaluations.

How are we doing, and what are next steps?

Overall, the committee supports efforts to use data from new tools, such as biological pathway evaluations, in risk assessment and decision-making. (Of course, limitations should be clearly communicated, and tools should be validated for their specific purposes.) Several case studies are described as examples of situations where emerging tools can be useful, such as quickly prioritizing chemicals of concern or evaluating risks from chemical mixtures at a contaminated site.

This report also documents advancements and challenges for each of the three interconnected fields of environmental health sciences: toxicology, exposure science, and epidemiology. I’ve summarized some of these key points in the chart below, and additional (digestible) information is available in the NAS report summary.



Toxicology

  Key Challenges:
  • Incorporate metabolic capacity in in vitro assays
  • Understand applicability & limitations of in vitro assays
  • Improve biological coverage
  • Address human variability & diversity in response

Exposure Science

  Recent Advancements:
  • Coordination of exposure science data (ex: databases)
  • Integration of exposure data of multiple chemicals obtained through varied methods

  Key Challenges:
  • Improved data management & data sharing
  • Improved methods for estimation of exposures

I won’t go into detail on all of these points, but I do want to highlight some of the key challenges that the field of toxicology will need to continue to address in the coming years, such as:

  • Improving metabolic capacity of in vitro assays: Cell-based assays hold promise for predicting biological responses of whole animals, but it is critical to remember that these new tools rarely reflect human metabolic capacity. For example, if a chemical is activated or detoxified by an enzyme in our bodies, reductionist assays would not capture these changes, and thus their predictions would not be fully relevant to human health. We need continued work to incorporate metabolic capacity into such assays.
  • Improving biological coverage: An analogy that I’ve often heard in relation to the limitations of these new tools is that they are only “looking under the biological lamp post.” Essentially, we can only detect effects that the assays are designed to evaluate. So, we need further development of assays that capture the wide array of possible adverse outcomes. And we cannot assume that there is no hazard for endpoints that have not been evaluated.

New models of disease causation

Not only is the environmental health science ‘toolkit’ changing, but so is our understanding of disease causation. As discussed in the report, 21st century risk assessment must acknowledge that disease is “multifactorial” (multiple different exposures can contribute to a single disease) and “nonspecific” (a single exposure can lead to multiple different adverse outcomes). This advanced understanding of causality will pose challenges for interpreting data and making decisions about risk, and we will need to incorporate new practices and methods to address these complexities.

For example, we can no longer investigate in isolation whether a certain exposure triggering a certain pathway causes disease; we must also ask whether that exposure may increase the risk of disease when combined with other potential exposures. It gets even more complicated when we consider that individuals may respond to the same exposures in different ways, based on their genetics or pre-existing medical conditions.

The Academy suggests borrowing a tool from epidemiology to aid in these efforts. The sufficient-component-cause model provides a framework for thinking about a collection of events or exposures that, together, could lead to an outcome.


Sufficient-component-cause model. Three disease mechanisms (I, II, III), each with different component causes. Image from NAS Report, Using 21st Century Science to Improve Risk-Related Evaluations


Briefly, each disease has multiple component causes that fit together to complete a causal pie. A component cause is “necessary” if it appears in every causal pie for the disease, and “sufficient” if it can produce the disease on its own; different combinations of component causes can produce the same disease. Using this model may promote a transition away from a focus on finding a single pathway of disease to a broadened evaluation of causation that better incorporates the complexities of reality. (I’ve blogged previously about the pitfalls of a tunnel-vision, single pathway approach in relation to cancer causation.)
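To make the causal-pie logic concrete, here is a minimal Python sketch. The component causes (A through G) and the three mechanisms are invented for illustration, loosely mirroring the figure’s Mechanisms I, II, and III; they are not taken from the NAS report.

```python
# Illustrative sketch of the sufficient-component-cause ("causal pie") model.
# Component causes A..G and the mechanisms below are hypothetical examples.

# Each sufficient cause (mechanism) is a set of component causes that,
# together, are enough to produce the disease.
MECHANISMS = [
    {"A", "B", "C"},   # Mechanism I
    {"A", "D", "E"},   # Mechanism II
    {"A", "F", "G"},   # Mechanism III
]

def causes_disease(present: set) -> bool:
    """Disease occurs if ANY mechanism's components are all present."""
    return any(mech <= present for mech in MECHANISMS)

def necessary_components() -> set:
    """Components appearing in EVERY mechanism are 'necessary' causes."""
    return set.intersection(*MECHANISMS)

# "Multifactorial": different exposure combinations produce the same disease.
assert causes_disease({"A", "B", "C"})   # completes Mechanism I
assert causes_disease({"A", "F", "G"})   # completes Mechanism III
# An incomplete pie does not cause disease.
assert not causes_disease({"B", "C", "D"})
# Here "A" sits in every pie, so it is a necessary component cause.
assert necessary_components() == {"A"}
```

In this toy setup, no single component is sufficient on its own, and “A” is necessary because it appears in every pie; removing exposure “A” would block all three mechanisms, which is the kind of intervention insight the model is meant to surface.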

Integration of information, and the importance of interdisciplinary training

As the fields of toxicology, exposure science, and epidemiology continue to contribute data towards this updated causal framework, a related challenge will be the integration of these diverse data streams for risk assessment and decision-making. How should we weigh different types of data in drawing conclusions about causation and risk? For example, what if the in vitro toxicology studies provide results that differ from the epidemiology studies?

The committee notes that we will need to rely on “expert judgment” in this process, at least in the short term, until standardized methods are developed. It also discusses the need for more interaction between individuals from different disciplines, so that knowledge can be shared and applied towards making these difficult decisions.

One issue that was not discussed, however, is the importance of training the next generation of scientists to address these complex challenges. Given the inevitable need to integrate multiple sources of data, I believe it is critical that the students in these fields (like me!) receive crosscutting training as well as early practice with examples of these multi-faceted assessments. Some programs offer more opportunities in this area than others, but this should be a priority for all departments in the coming years. Otherwise, how can we be prepared to step up to the challenges of 21st century environmental health sciences?

Looking forward

Speaking of challenges, we certainly have our next decade of work cut out for us. It is amazing to think about how much progress we have made over the last ten years in developing new technologies, particularly in toxicology and exposure science. Now we must: refine and enhance these methods so they provide more accurate information about hazard and exposure; address the complexities of multifactorial disease causation and inter-individual susceptibility; and work across disciplines to make decisions that are more protective of public health and the environment.