The National Institutes of Health (NIH) and the Environmental Protection Agency (EPA) in the US have signed a cross-agency agreement that should reduce reliance on laboratory animals in testing for environmental toxins and generate data more relevant to humans.
Under a five-year Memorandum of Understanding (MOU) between the NIH and the EPA, the NIH Chemical Genomics Center's (NCGC) high-speed, automated screening robots will be used to test suspected toxic compounds that may pose risks to the health of humans and animals as well as the environment. The tests will rely on cells and isolated molecular targets rather than the traditional animal models.
In addition to the high-throughput screening technology at NCGC, which is managed by the NIH's National Human Genome Research Institute (NHGRI), the MOU will build on experimental toxicology expertise at the National Toxicology Program (NTP), based at the NIH's National Institute of Environmental Health Sciences, and the computational toxicology capabilities of the EPA's recently formed National Center for Computational Toxicology (NCCT).
Traditionally, data collection to determine chemical toxicity has been heavily dependent on whole-animal tests, which are expensive, raise ethical issues and do not always accurately mimic human responses to compounds. The search for alternative toxicology testing methods has been spurred by the growing number of new chemicals in circulation, high testing costs and public unease about the use of laboratory animals, the NIH noted.
The collaboration with the EPA is expected to produce data more applicable to humans, expand the number of chemicals tested, and reduce the time and money spent on, as well as the number of animals used in, toxicity testing. However, an immediate and dramatic shift away from animal testing is not on the cards.
"Full implementation of the hoped-for paradigm shift in toxicity testing will require validation of the new approaches, a substantial effort that could consume many years," the NIH warned.
The MOU provides for sample and information sharing, with the aim of identifying potentially harmful chemicals more rapidly and effectively. It addresses opportunities for co-ordination between the two agencies in four basic areas: identification of toxicity pathways; selection of chemicals for testing; analysis and interpretation of data; and outreach to the scientific and regulatory communities.
High-throughput screening assays will be central to the programme. As the NIH pointed out, the ability of quantitative high-throughput screening (qHTS) developed at NCGC to increase the rate at which chemicals are tested and to profile compounds over a wider range of concentrations makes the technology "ideal for toxicology testing, with the potential for advancing the goal of more accurate and timely public health decisions".
According to Dr John Bucher, associate director of the NTP, the "experimental and computational expertise required to transform toxicology is an enormous undertaking and too great for any of our existing organisations to accomplish alone".
Together with Dr Francis Collins, director of the NHGRI, and Dr George Gray, assistant administrator in the EPA's Office of Research and Development (which houses the NCCT), Dr Bucher outlines the collaborative research programme and the thinking behind it in an article for the 15 February issue of Science.
The co-authors address the potential for moving from reliance on animals in toxicology testing to biochemical- and cell-based assays, as well as tests using lower organisms such as zebrafish and roundworms.
These plans and the inter-agency MOU provide a framework for implementing the long-term recommendations of last year's US National Research Council report, Toxicity Testing in the 21st Century: A Vision and a Strategy, which called for a collaborative effort across the toxicology community to reduce reliance on animal studies and put more emphasis on in vitro tests using human cells and cellular components.
Importantly, the NIH noted, this strategy included efforts to improve dose-response research, "which will help predict toxicity at exposures that humans will encounter".
The EPA's role in the collaborative initiative is also part of its ToxCast programme, launched in 2007 to revolutionise the agency's procedures for chemical toxicity evaluation. ToxCast draws on advances in computer technology, genomics and cellular biology to speed up toxicity testing and enhance the EPA's capacity to screen new compounds.