Thousands of compounds have been screened in a bid to develop in vitro toxicity tests

US researchers have taken a step forward in a large-scale collaborative effort to develop ways of assessing compounds’ toxicity without relying on laborious, expensive and ethically contentious animal experiments.

New results from the Tox21 initiative, a multi-centre project in the US, suggest that it should be possible to predict the toxicity of a compound by measuring how it interferes with a range of different processes in cultured cells. Potentially toxic molecules identified by such tests could then be singled out for more detailed investigation.

The project has tested thousands of compounds, including drugs, pesticides and household products, against various cell-based assays using automated, high-throughput techniques.

In the latest work, researchers analysed the results of testing 10,000 compounds against 30 assays, which fall into two classes: nuclear receptor disruption and cellular stress-response pathways. ‘Fifteen different concentrations of each compound were used against each assay, which were carried out in triplicate,’ explains team member Ruili Huang of the National Center for Advancing Translational Sciences in Rockville, Maryland. ‘This generated around 50 million data points.’

The multiple assays produce a specific fingerprint for each compound, and similar fingerprints can be clustered. In this way, Huang’s team showed that there were between 500 and 600 distinct clusters among the 10,000 molecules tested. ‘It is hypothesised that compounds lying within a particular cluster may exert their toxicity through a similar mechanism or attack similar targets in the cell,’ says Huang’s colleague Menghang Xia.
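
As a rough illustration of the clustering idea only – not the Tox21 team’s actual pipeline – the Python sketch below groups randomly generated assay fingerprints by profile similarity. The compound count, distance metric and cut-off threshold are assumptions chosen for demonstration.

```python
# Illustrative sketch only: clustering hypothetical assay "fingerprints".
# The data, metric and threshold are invented for demonstration and do
# not reproduce the Tox21 analysis.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)

# Each row is one compound's activity profile across 30 assays,
# e.g. a normalised potency/efficacy score per assay.
n_compounds, n_assays = 200, 30
fingerprints = rng.random((n_compounds, n_assays))

# Compounds with similar profiles sit close together under a
# correlation-style distance; average-linkage clustering then groups them.
distances = pdist(fingerprints, metric="correlation")
tree = linkage(distances, method="average")

# Cut the tree at an arbitrary distance threshold to obtain clusters;
# compounds sharing a cluster are candidates for a shared mechanism.
labels = fcluster(tree, t=0.7, criterion="distance")
print(f"{labels.max()} clusters among {n_compounds} profiles")
```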

When the researchers combined the assay data with structural information relating to the compounds, they were able to find ‘some correlation’ with the compounds’ known toxicity from animal or human studies.
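
The sketch below gives a sense of how assay fingerprints and structural descriptors might be combined in a simple predictive model. The data are random placeholders and the model choice is an assumption for illustration, not the study’s method.

```python
# Illustrative sketch only: combining assay fingerprints with structural
# descriptors to predict an in vivo toxicity label. All inputs are random
# placeholders; the real study's features and models differ.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_compounds = 500

assay_fingerprints = rng.random((n_compounds, 30))       # activity in 30 assays
structural_descriptors = rng.random((n_compounds, 50))   # e.g. computed chemical descriptors
features = np.hstack([assay_fingerprints, structural_descriptors])

# Binary label: toxic / non-toxic in animal or human data (placeholder here).
labels = rng.integers(0, 2, size=n_compounds)

model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, features, labels, cv=5, scoring="roc_auc")
print(f"mean cross-validated AUC: {scores.mean():.2f}")
```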

‘This shows us that we are on the right track,’ says Huang. ‘What we want to do next is to extend the range of assays to cover more pathways and targets in the cell, and to use other types of cell that are more directly relevant to human toxicity.’ These could include specific organ types derived from stem cells.

‘Ultimately we hope to be able to narrow down the assays to the ones that deliver the most useful information,’ Huang says. ‘So we will be able to get increasingly refined fingerprints that give us information about the magnitude of toxicity, the possible mode of action and better predictions of in vivo toxicity.’

The aim is to arrive at a manageable suite of assays with which all compounds of toxicological interest – currently around 80,000 – can be screened. ‘If we tried to do this through animal testing that would take probably forever,’ says Huang.

The hope is that in the future researchers will be able to take any compound – a drug candidate, for example – and run it through the system to see if it yields a fingerprint suggesting it may have toxicological properties that merit further investigation.

Commenting on the study, Andy Smith, senior scientist at the MRC Toxicology Unit at the University of Leicester in the UK, says, ‘The results are a demonstration of how careful, systematic, multiple in vitro toxicity endpoints and large, wide-ranging examples of candidate chemicals at more than one exposure level can be used to screen and predict for potential human toxicity. As the authors describe, this could greatly aid in prioritisation of chemicals, especially in the development of drugs, for further in-depth testing and understanding mechanisms of toxicity.’

‘Probably the greatest gains will be in decreased costs and time in streamlining and prioritising processes for drug development.’