Journal
Toxicology in Vitro
Volume 23, Issue 8, Pages 1576-1579
Publisher
Pergamon-Elsevier Science Ltd
DOI: 10.1016/j.tiv.2009.06.012
Keywords
Bioinformatics; Computational toxicology; High throughput; In vitro toxicology; Risk assessment; Systems biology; Toxicity pathway
Abstract
Conventional toxicological testing methods are often decades old, costly and low-throughput, with questionable relevance to the human condition. These factors have contributed to a backlog of chemicals that have been inadequately assessed for toxicity. Some authorities have responded to this challenge by implementing large-scale testing programmes; others have concluded that a paradigm shift in toxicology is warranted. One such call came in 2007 from the United States National Research Council (NRC), which articulated a vision of 21st-century toxicology based predominantly on non-animal techniques. Potential advantages of such an approach include the capacity to examine a far greater number of chemicals and biological outcomes at more relevant exposure levels; a substantial reduction in testing costs, time and animal use; and the grounding of regulatory decisions in human rather than rodent biology. For the NRC's and similar proposals to make a significant impact on regulatory toxicology in the foreseeable future, they must be translated into sustained multidisciplinary research programmes that are well co-ordinated and funded at a multinational level. The Humane Society is calling for a "big biology" project to meet this challenge, and we are in the process of forging an international, multi-stakeholder consortium dedicated to implementing the NRC vision. © 2009 Elsevier Ltd. All rights reserved.