Toxicogenomics

Source: Wikipedia, the free encyclopedia.

Toxicogenomics is a subdiscipline that combines toxicology with genomics and other high-throughput molecular profiling technologies such as transcriptomics, proteomics and metabolomics.[1][2] Toxicogenomics endeavors to elucidate the molecular mechanisms involved in the expression of toxicity, and to derive molecular expression patterns (i.e., molecular biomarkers) that predict toxicity or the genetic susceptibility to it.

Pharmaceutical research

In pharmaceutical research, toxicogenomics is defined as the study of the structure and function of the genome as it responds to adverse xenobiotic exposure. It is the toxicological subdiscipline of pharmacogenomics, which is broadly defined as the study of inter-individual variations in whole-genome or candidate gene single-nucleotide polymorphism maps, haplotype markers, and alterations in gene expression that might correlate with drug responses.[3][4] Though the term toxicogenomics first appeared in the literature in 1999,[5] it was by that time already in common use within the pharmaceutical industry, its adoption having been driven by the marketing strategies of vendor companies. The term is still not universally accepted, and others have offered alternative terms such as chemogenomics to describe essentially the same field of study.[6]

Bioinformatics

The nature and complexity of toxicogenomics data (in both volume and variability) demand highly developed, automated processes for data handling and storage. Analysis usually involves a wide array of bioinformatics and statistical methods,[7] often including statistical classification approaches.[8]
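As a minimal sketch of what such a classification approach can look like, the example below assigns a compound's gene-expression profile to a "toxic" or "non-toxic" class with a nearest-centroid rule. All gene values, profiles, and labels are invented for illustration; real toxicogenomics pipelines operate on thousands of genes and use far more sophisticated classifiers.

```python
# Hypothetical sketch: nearest-centroid classification of expression profiles.
# Every number and label here is invented purely for illustration.

def centroid(profiles):
    """Mean expression value per gene across a list of profiles."""
    n = len(profiles)
    return [sum(p[i] for p in profiles) / n for i in range(len(profiles[0]))]

def distance(a, b):
    """Euclidean distance between two expression profiles."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def classify(profile, centroids):
    """Assign the class whose centroid lies closest to the profile."""
    return min(centroids, key=lambda label: distance(profile, centroids[label]))

# Training profiles: rows are compound exposures, columns are marker genes.
toxic = [[2.1, 0.3, 1.8], [1.9, 0.4, 2.0]]
nontoxic = [[0.2, 1.1, 0.3], [0.3, 0.9, 0.2]]
centroids = {"toxic": centroid(toxic), "non-toxic": centroid(nontoxic)}

print(classify([2.0, 0.5, 1.7], centroids))  # closest to the toxic centroid
```

The centroid rule is only one of many classification approaches; in practice, methods such as regularized regression, random forests, or support vector machines are common choices for expression data.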

Drug discovery

In pharmaceutical drug discovery and development, toxicogenomics is used to study the adverse (i.e., toxic) effects of candidate drugs in defined model systems. Regulatory agencies such as the United States Food and Drug Administration (FDA) and the European Medicines Agency (EMA) currently preclude basing regulatory decision-making on genomics data alone. However, they do encourage the voluntary submission of well-documented, quality genomics data. Both agencies are considering the use of submitted data on a case-by-case basis for assessment purposes (e.g., to help elucidate mechanism of action or contribute to a weight-of-evidence approach) or for populating relevant comparative databases by encouraging parallel submissions of genomics data and traditional toxicological test results.[9]

Public projects

Chemical Effects in Biological Systems is a project hosted by the National Institute of Environmental Health Sciences building a knowledge base of toxicology studies including study design, clinical pathology, and histopathology and toxicogenomics data.[10][11]

InnoMed PredTox assesses the value of combining results from various omics technologies together with the results from more conventional toxicology methods in more informed decision-making in preclinical safety evaluation.[12]

Open TG-GATEs (Toxicogenomics Project-Genomics Assisted Toxicity Evaluation System) is a Japanese public-private effort which has published gene expression and pathology information for more than 170 compounds (mostly drugs).[13]

The Predictive Safety Testing Consortium aims to identify and clinically qualify safety biomarkers for regulatory use as part of the FDA's "Critical Path Initiative".[12]

ToxCast is a program for Predicting Hazard, Characterizing Toxicity Pathways, and Prioritizing the Toxicity Testing of Environmental Chemicals at the United States Environmental Protection Agency.[14]

Tox21 is a federal collaboration among the National Institutes of Health (NIH), the Environmental Protection Agency (EPA), and the Food and Drug Administration (FDA) that aims to develop better toxicity assessment methods.[15] Within this project, the toxic effects of chemical compounds on cell lines derived from 1000 Genomes Project individuals were assessed, and associations with genetic markers were determined.[16] Parts of these data were used in the NIEHS-NCATS-UNC DREAM Toxicogenetics Challenge to identify methods for predicting cytotoxicity in individuals.[17][18]
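The core of such a toxicogenetics association analysis can be sketched as a correlation test between a genetic marker and a cytotoxicity readout across individuals. The sketch below uses Pearson correlation on invented data; the genotype coding (allele dosage 0/1/2), the `ec50` values, and the sample size are all assumptions for illustration, and real analyses involve genome-wide testing with multiple-comparison correction.

```python
# Hypothetical sketch: association between a genetic marker and cytotoxicity.
# Genotypes are coded as allele dosage (0, 1, 2); all numbers are invented.

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

dosage = [0, 0, 1, 1, 2, 2]             # marker allele count per individual
ec50 = [5.1, 4.8, 3.9, 4.1, 2.9, 3.2]   # cytotoxicity readout per individual

r = pearson(dosage, ec50)
print(f"r = {r:.2f}")  # a strongly negative r suggests the marker tracks sensitivity
```

In practice, a significance test (e.g., a p-value for the correlation, or a linear-model fit adjusting for covariates such as population structure) would accompany the coefficient before any association is reported.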
