Can an algorithm replace animal testing for chemicals?

A low-cost, high-speed algorithm may make animal toxicity testing a thing of the past.

Toxicity testing—determining the amount of exposure to a chemical that is unsafe for humans—is vital to the safety of millions of workers in various industries. But researchers have not comprehensively tested a majority of the 85,000 compounds in consumer products for safety.

Animal testing, in addition to its ethical concerns, can be too costly and time-consuming to meet this need, according to a new study in Environmental Health Perspectives.

“There is an urgent, worldwide need for an accurate, cost-effective and rapid way to test the toxicity of chemicals, in order to ensure the safety of the people who work with them and of the environments in which they are used,” says lead researcher Daniel Russo, a doctoral candidate at the Rutgers University-Camden Center for Computational and Integrative Biology. “Animal testing alone cannot meet this need.”

Previous efforts to solve this problem used computers to compare untested chemicals with structurally similar compounds whose toxicity is already known. But those methods could not assess structurally unique chemicals, and they were confounded by the fact that some structurally similar chemicals have very different levels of toxicity.

The researchers overcame these challenges by developing a first-of-its-kind algorithm that automatically extracts data from PubChem, a National Institutes of Health database of information on millions of chemicals. The algorithm compares chemical fragments from tested compounds with those of untested compounds, and uses multiple mathematical methods to evaluate their similarities and differences in order to predict an untested chemical’s toxicity.
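As a rough sketch of this kind of fragment-based comparison (this is not the authors' code, and the fragment sets, compound names, and toxicity labels below are entirely hypothetical), the Python snippet represents each compound as a set of structural fragments, scores similarity with the Tanimoto (Jaccard) index, and assigns an untested compound the toxicity class of its most similar tested neighbors:

from collections import Counter

def tanimoto(frags_a, frags_b):
    # Tanimoto (Jaccard) similarity between two fragment sets.
    if not frags_a and not frags_b:
        return 0.0
    return len(frags_a & frags_b) / len(frags_a | frags_b)

def predict_toxicity(query_frags, reference, k=5):
    # Predict a toxicity class by majority vote over the k most similar
    # reference compounds (a simple read-across scheme).
    ranked = sorted(
        reference.items(),
        key=lambda item: tanimoto(query_frags, item[1]["fragments"]),
        reverse=True,
    )
    top_classes = [entry["toxicity"] for _, entry in ranked[:k]]
    return Counter(top_classes).most_common(1)[0][0]

# Hypothetical reference data: fragment sets and oral-toxicity classes
# for compounds whose toxicity is already known.
reference_compounds = {
    "compound_A": {"fragments": {"phenol", "nitro", "chloro"}, "toxicity": "high"},
    "compound_B": {"fragments": {"phenol", "hydroxyl"}, "toxicity": "low"},
    "compound_C": {"fragments": {"nitro", "chloro", "amine"}, "toxicity": "high"},
}

# An untested compound is classified from its fragment overlap with the references.
print(predict_toxicity({"phenol", "nitro"}, reference_compounds, k=2))

In practice, the fragment data for each compound would be drawn from PubChem rather than typed by hand, and the study combined multiple mathematical measures of similarity rather than a single index.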

“The algorithm developed by Daniel and the Zhu laboratory mines massive amounts of data, and discerns relationships between fragments of compounds from different chemical classes, exponentially faster than a human could,” says coauthor Lauren Aleksunes, an associate professor at Rutgers’ Ernest Mario School of Pharmacy and the Environmental and Occupational Health Sciences Institute.

“This model is efficient and provides companies and regulators with a tool to prioritize chemicals that may need more comprehensive testing in animals before use in commerce.”

To fine-tune the algorithm, the researchers began with 7,385 compounds for which toxicity data were already known and compared those data with records on the same chemicals in PubChem. They then tested the algorithm on 600 new compounds.

For several groups of chemicals, the algorithm predicted the level of oral toxicity with a success rate of 62 to 100 percent. And by comparing relationships between sets of chemicals, the researchers shed light on new factors that can determine a chemical's toxicity.
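A toy illustration of how such a per-group success rate could be tallied (this is not the study's evaluation code; the compounds, groups, and labels below are invented):

from collections import defaultdict

def success_rate_by_group(predictions, truth, groups):
    # Fraction of correct toxicity-class predictions within each chemical group.
    correct = defaultdict(int)
    total = defaultdict(int)
    for compound, predicted in predictions.items():
        group = groups[compound]
        total[group] += 1
        if predicted == truth[compound]:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

predictions = {"c1": "high", "c2": "low", "c3": "high", "c4": "low"}
truth = {"c1": "high", "c2": "high", "c3": "high", "c4": "low"}
groups = {"c1": "phenols", "c2": "phenols", "c3": "amines", "c4": "amines"}

print(success_rate_by_group(predictions, truth, groups))
# e.g. {'phenols': 0.5, 'amines': 1.0}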

Although the researchers directed the algorithm to assess only oral toxicity, they conclude that their strategy can be extended to predict other types of toxicity.

“While the complete replacement of animal testing is still not feasible, this model takes an important step toward meeting the needs of industry, in which new chemicals are constantly under development, and for environmental and ecological safety,” says coauthor Hao Zhu, an associate professor of chemistry.

Additional researchers from Rutgers, Integrated Laboratory Systems, the Johns Hopkins Bloomberg School of Public Health, and the University of Konstanz contributed to the work.

Source: Rutgers University