ToxCast and Public Health Toxicology


Contributed by Marie Bourgeois, PhD, MPH, Research Assistant Professor, Center for Environmental/Occupational Risk Analysis and Management, Department of Environmental and Occupational Health, University of South Florida College of Public Health

For decades, the development of new chemicals has outpaced our ability to adequately assess the risks potentially posed by exposure. Permissible Exposure Limits (PELs), set by OSHA to protect the health of America’s workforce, are a telling example of the existing backlog: fewer than 600 PELs have been established for common industrial chemicals, yet the Chemical and Product Categories database (CPCat) has catalogued over 40,000 chemicals currently used in consumer products. While every effort is made to ensure the safety of these chemicals, toxicity information is limited or absent for the majority of these compounds. Traditional testing methods are time-consuming, expensive, heavily reliant on laboratory animals, and frequently limited in scope. Stakeholders agreed it was time for a new approach.

In 2007, the National Academies published ‘Toxicity Testing in the 21st Century’. The report highlighted the shortcomings of traditional testing methods and advocated a shift from whole-animal testing to cell- and tissue-based ‘toxicity pathway’ testing. It also stressed the utility of dose-response extrapolation from cellular responses to toxicants to potential adverse effects in human systems. The authors recommended continued use of real-world data, including biomonitoring results from projects such as NHANES. Perhaps most importantly, the report described an adaptive approach to risk assessment that stressed placing risk in context: some situations require rapid testing results for a single environmental agent, while others may need a multitude of chemicals screened. Restricting testing to the most hazardous compounds is an economical way to minimize adverse human health effects. The effort to modernize testing required the development of new tools.

ToxCast and similar tools provide methods of predictive toxicology suitable for the 21st century. ToxCast, or Toxicity Forecasting, is part of a research initiative at the US Environmental Protection Agency’s National Center for Computational Toxicology (EPA/NCCT) to generate bioactivity profiles for the chemicals posing the greatest risk to human health. ToxCast uses multiple in vitro assays to predict in vivo effects. Phase I profiled 320 chemicals as a proof of concept in 2009; these chemicals were chosen because decades of existing data in the literature permitted validation via one-to-one comparison. Well-correlated experimental results from the approximately 500 in vitro assays and 75 in vivo endpoints demonstrated the reliability of the ToxCast methodology. Phase II examined 2,000 chemicals from categories including food additives, ‘green alternatives’ to existing industrial chemicals, nanomaterials, and consumer products that were never released. The testing utilized 700 high-throughput screening (HTS) assays running the gamut of cellular responses and signaling pathways.
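To make the one-to-one comparison concrete, the sketch below (Python, with invented chemicals and calls) shows the general shape of such a validation: binary in vitro hit calls are tallied against in vivo outcomes to yield sensitivity and specificity. This is an illustration of the approach, not the EPA’s actual analysis.

     # Hypothetical hit-call comparison: in vitro actives vs. in vivo outcomes.
     # Chemical names and calls are invented, not real ToxCast results.
     records = [
         # (chemical, in_vitro_active, in_vivo_toxic)
         ("chem_A", True, True),
         ("chem_B", True, False),
         ("chem_C", False, False),
         ("chem_D", True, True),
         ("chem_E", False, True),
         ("chem_F", False, False),
     ]

     tp = sum(1 for _, vitro, vivo in records if vitro and vivo)
     fp = sum(1 for _, vitro, vivo in records if vitro and not vivo)
     tn = sum(1 for _, vitro, vivo in records if not vitro and not vivo)
     fn = sum(1 for _, vitro, vivo in records if not vitro and vivo)

     sensitivity = tp / (tp + fn)  # in vivo toxicants flagged in vitro
     specificity = tn / (tn + fp)  # non-toxicants correctly cleared
     print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")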

ToxCast chemical data is publicly available via the Interactive Chemical Safety for Sustainability (iCSS) Dashboards. Other CompTox tools and resources include ACToR (Aggregated Computational Toxicology Resource), ToxRefDB (Toxicity Reference Database), DSSTox (Distributed Structure-Searchable Toxicity Database), ExpoCast (Exposure Forecaster), and Virtual Tissues. These tools provide searchable toxicity testing results, access to thirty years of animal studies, structural information, exposure predictions, and virtual tissue models that map existing research to computer simulations of expected effects using an adverse outcome pathway approach. ToxCast data also feeds into Tox21.
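In practice, an analysis often starts from a bulk download rather than the web interface. Below is a minimal Python sketch assuming a hypothetical local export named toxcast_export.csv with columns chemical, assay, and ac50_uM; the file name and column names are illustrative, not the dashboard’s actual schema.

     import pandas as pd

     # Load a hypothetical bulk export of ToxCast results.
     df = pd.read_csv("toxcast_export.csv")  # columns: chemical, assay, ac50_uM

     # Keep assays where the chemical was active (a finite AC50 was fit),
     # then report the most potent hit per chemical.
     actives = df.dropna(subset=["ac50_uM"])
     most_potent = actives.sort_values("ac50_uM").groupby("chemical").first()
     print(most_potent.head())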

Tox21, as the collaboration between the EPA, the National Institute of Environmental Health Sciences (NIEHS)/National Toxicology Program (NTP), the National Center for Advancing Translational Sciences (NCATS)/NIH Chemical Genomics Center (NCGC), and the U.S. Food and Drug Administration (FDA) became known, utilizes robotic technology to conduct HTS of chemicals of interest. HTS shortens the testing time required for data generation by covering a large range of test concentrations on 1536-well microtiter plates. Tox21 currently screens approximately 10,000 chemicals using biochemical and cellular HTS assays.
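Each concentration series from such a screen is typically summarized by fitting a concentration-response model. The sketch below fits a Hill curve to invented data with SciPy to estimate an AC50, the concentration producing half-maximal activity; the numbers are illustrative, not Tox21 results.

     import numpy as np
     from scipy.optimize import curve_fit

     def hill(conc, top, ac50, n):
         """Hill concentration-response curve rising from 0 to `top`."""
         return top * conc**n / (ac50**n + conc**n)

     # Invented concentration series (uM) and responses, mimicking one
     # chemical's series from a 1536-well concentration-response screen.
     conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
     resp = np.array([1.0, 2.0, 5.0, 14.0, 38.0, 71.0, 92.0, 98.0])

     params, _ = curve_fit(hill, conc, resp, p0=[100.0, 1.0, 1.0])
     top, ac50, n = params
     print(f"AC50 = {ac50:.2f} uM (top={top:.1f}, Hill slope={n:.2f})")

A lower AC50 indicates activity at lower concentrations, so the fitted AC50 serves as the potency summary carried forward into prioritization.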

The overall goals of ToxCast/Tox21 include creating and screening a large chemical library, developing HTS assays for toxicity pathways, linking HTS results to adverse human health effects, and identifying toxicity pathways. ToxCast/Tox21 runs multiple assays per target and multiple targets per pathway to develop a Toxicological Priority Index (ToxPi) for each chemical:

     ToxPi = f(HTS assays + Chemical properties + Pathways)

In the future, additional information on exposure, chemical properties or QSAR models may also be incorporated.

     ToxPi = f(Exposure + Chemical properties + In vitro assays + Pathways)
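As a minimal sketch of how such an index might be computed, the snippet below combines normalized component scores into a single weighted value and ranks hypothetical chemicals; the slices, weights, and scores are all invented for illustration, and real ToxPi analyses make these choices configurable.

     # Hypothetical 0-1 component scores per chemical (higher = more concern).
     scores = {
         "chem_A": {"hts": 0.80, "properties": 0.40, "pathways": 0.70},
         "chem_B": {"hts": 0.20, "properties": 0.60, "pathways": 0.10},
         "chem_C": {"hts": 0.55, "properties": 0.30, "pathways": 0.90},
     }

     # Relative weight of each "slice"; tunable, not canonical values.
     weights = {"hts": 0.5, "properties": 0.2, "pathways": 0.3}

     def toxpi(components):
         """Weighted sum of normalized component scores (a ToxPi-style index)."""
         return sum(weights[k] * v for k, v in components.items())

     for chem in sorted(scores, key=lambda c: toxpi(scores[c]), reverse=True):
         print(f"{chem}: ToxPi = {toxpi(scores[chem]):.2f}")

In published ToxPi graphics, each slice is also drawn as a wedge whose width reflects its weight and whose radius reflects its score, so the same numbers drive both the ranking and the visualization.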

The EPA has formed partnerships with academic, industrial, and governmental organizations to revolutionize toxicity testing and risk assessment. ToxCast data is currently being used by screening programs under the Endocrine Disruptor Screening Program, the Toxic Substances Control Act, and the Safe Drinking Water Act to prioritize chemical testing and set contaminant candidate lists.

So what does this mean for the average public health toxicologist? A risk assessment typically proceeds in four steps: hazard identification, generation of dose-response data, exposure assessment, and risk characterization. Simply put, ToxCast/Tox21 moves the starting line up. A toxicologist looking for information on a priority chemical can begin with any one of these databases: they might check whether a priority index has been established for their chemical(s) of interest; they could check Tox21, ToxRefDB, ExpoCast, and iCSS for previously run in vivo studies and exposure assessments; they could even use the virtual tissue models to predict adverse outcomes. This information may not obviate the use of research animals, but it should lead to more informed experimental designs. Armed with these data, toxicologists everywhere will be better prepared to design studies that meet their objectives.
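For instance, a screening-level comparison of a bioactivity threshold with a predicted exposure can flag which chemicals deserve attention first. The sketch below uses invented numbers to illustrate this bioactivity-to-exposure ratio idea; it is a simplified illustration, not an official EPA workflow.

     # Hypothetical screening inputs per chemical (mg/kg/day):
     #   bioactive_dose - lowest dose with in vitro bioactivity, e.g. after
     #                    in vitro-to-in vivo extrapolation
     #   predicted_expo - ExpoCast-style predicted exposure
     chemicals = {
         "chem_A": {"bioactive_dose": 0.5, "predicted_expo": 0.05},
         "chem_B": {"bioactive_dose": 10.0, "predicted_expo": 0.0001},
         "chem_C": {"bioactive_dose": 2.0, "predicted_expo": 1.0},
     }

     for name, d in chemicals.items():
         # Small ratios mean predicted exposure approaches bioactive doses,
         # so those chemicals are prioritized for follow-up testing.
         ratio = d["bioactive_dose"] / d["predicted_expo"]
         priority = "HIGH" if ratio < 100 else "low"
         print(f"{name}: bioactivity/exposure ratio = {ratio:,.0f} -> {priority}")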

For more information:

ToxCast: http://epa.gov/ncct/toxcast/

ToxCastDB: http://actor.epa.gov/actor/faces/ToxCastDB/Home.jsp

ACToR: http://actor.epa.gov/actor

ToxRefDB: http://actor.epa.gov/toxrefdb

CSS Dashboards: http://actor.epa.gov/actor/faces/CSSDashboardLaunch.jsp

CompTox Tools: www.epa.gov/comptox

Chemical Safety for Sustainability Research Program: www.epa.gov/research/chemicalscience

ExpoCast: http://www.epa.gov/ncct/expocast/

CPCat: http://actor.epa.gov/cpcat/faces/home.xhtml

Virtual Liver: www.epa.gov/ncct/virtual_liver

Virtual Embryo: www.epa.gov/ncct/v-Embryo

ToxCast and Tox21: High Throughput Screening for Hazard & Risk of Environmental Chemicals: http://www.toxicology.org/isot/rc/nlsot/docs/Dix.pdf

