92 research outputs found

    Innovative approaches to reduce animal testing: replace whenever possible, reduce through refinement and mechanistic understanding

    Many of the in vitro toxicological studies have not been sufficiently validated to determine their applicability domain, and even fewer have gained regulatory acceptance. The major advantage of in vitro testing today is the early identification of significant hazards in compound development, together with reduced and targeted animal testing. Replacing complex animal tests may be achieved by a battery of in vitro tests addressing the adverse outcome pathway in question. Kinetic models are needed to translate in vitro results into in vivo values.

    Mechanistic and quantitative aspects of liver tumour promotion in mice

    A variety of xenobiotic compounds is known to induce characteristic changes in the livers of laboratory animals. These changes include enlargement of the liver, usually as a result of cell enlargement (hypertrophy) or increased cell replication (hyperplasia), induction of drug-metabolizing enzymes and proliferation of the smooth endoplasmic reticulum (SER). Such changes are usually not accompanied by evidence of liver damage and thus are reversible upon withdrawal and elimination of the compound. Consequently, most authors regard this phenomenon as an adaptive response of the organ to increased functional demands. However, chronic exposure of various strains of mice to dieldrin, phenobarbitone, DDT and the α-, β- and γ-stereoisomers of hexachlorocyclohexane (HCH, also known as benzene hexachloride, BHC) may lead to the development of liver tumours. The tumorigenic effects of microsomal enzyme inducers in mice may result from (a) a weak carcinogenic action of the xenobiotics themselves or (b) an enhancing (promoting) action of the xenobiotics on a pre-existing oncogenic factor in mouse liver. The first objective of this study was to discriminate between these two possibilities. Druckrey and his associates have established both theoretically and experimentally the dose-response characteristics of chemical carcinogens: D · T^n = constant (1), where D = daily dose, T = the median tumour induction period and n = an exponent, always > 1. Since the mechanism by which enhancers or promotors of carcinogenesis operate is quite different from that of carcinogens, it is conceivable that promotors also exhibit different dose-response characteristics. The dose-response characteristics of dieldrin-mediated enhancement of liver tumour formation in CF-1 mice were analysed using existing tumour data from chronic feeding studies at six exposure levels of dieldrin (a model compound for microsomal enzyme induction).
    It was found that the dose-response relationship can be expressed as: (d₀ + δx) · t = constant (2), where d₀ stands for the background dose equivalent required for the induction of spontaneous liver tumours, δx represents the actual dieldrin dose (ppm in the diet) and t the median tumour induction period in the respective treatment groups. It was also established that the dose-response characteristics of limited dieldrin exposures and those of delayed exposure were consistent with equation (2), which is a Druckrey relation with n = 1. From these findings it is concluded that dieldrin interacts reversibly with its receptors, resulting in an acceleration of tumour formation (which is essentially irreversible); dieldrin may thus be regarded as a tumour promotor. The validity of equation (2) for both chronic and limited dieldrin exposure indicates that (a) the velocity of liver tumour development is proportional to the daily dose level (δx), (b) the total tumorigenic dose is constant across all doses, (c) the effects of dieldrin on the neoplastic process in mouse liver are essentially irreversible and cumulative, and (d) there is no evidence for a threshold level. Tumour formation is a dose- and time-dependent process. The induction of liver enlargement, microsomal enzyme systems and proliferation of the smooth endoplasmic reticulum by dieldrin is only dose-dependent. In contrast, polyploidization is dose- and time-dependent. To establish a possible link between microsomal enzyme induction, nuclear polyploidization and liver tumour formation, nuclear polyploidization in livers of CF-1 mice was studied at five different dieldrin dose levels from 1.85 months up to tumour development. Nuclear polyploidization, expressed as the proportion of octaploid (8c) nuclei, was found to increase linearly with age in untreated control CF-1 mice.
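The dose-response relation in equation (2) can be sketched numerically. The following is a minimal illustration only: the background dose equivalent d₀ and the constant k are invented values, not fitted from the CF-1 data.

```python
# Hedged sketch of equation (2): (d0 + dx) * t = constant, i.e. a Druckrey
# relation with n = 1.  d0 (background dose equivalent, ppm) and k are
# made-up illustrative values.

def median_induction_period(dx, d0=1.0, k=60.0):
    """Median tumour induction period t (months) for dietary dose dx (ppm)."""
    return k / (d0 + dx)

# The product (d0 + dx) * t stays constant across all dose levels,
# i.e. the total tumorigenic dose is the same at every dose:
for dx in [0.0, 1.25, 2.5, 5.0, 10.0]:
    t = median_induction_period(dx)
    assert abs((1.0 + dx) * t - 60.0) < 1e-9
```

Under this relation a higher daily dose shortens the induction period in inverse proportion, which is what distinguishes a promotor (n = 1) from a complete carcinogen (n > 1).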
    Dieldrin treatment induced a dose-dependent increase in the proportion of 8c nuclei in the initial phases of treatment. In "steady-state" situations nuclear polyploidization (as expressed by the percentage of 8c nuclei) was maintained at a dose-dependent, higher level, and the percentage was observed to increase with age at the same velocity as in untreated controls. Tumour formation was found to be associated with a constant degree of nuclear polyploidization in all treatment groups, including controls. The observed quantitative link between nuclear polyploidization and tumour formation raises the question of whether a causal relationship between the two exists. Assuming that polyploidization reflects the ageing process, the data suggest that liver tumour formation is imminent at a constant biological age and that dieldrin could operate by advancing the biological age of CF-1 mouse liver. Further support for this hypothesis was obtained from the determination of cytoplasmic alanine aminotransferase (AAT) isoenzymes. The expression of the isoenzyme decreases with age in untreated control CF-1 mice; dieldrin treatment was found to enhance (accelerate) this process in a dose-dependent manner. Although the nature of the development of "spontaneous" liver tumours in CF-1 mice remains unknown, the decrease in the tetraploid (4c) : diploid (2c) ratio of liver nuclei, observed in the study of polyploidization, may be related to tumour formation. The decrease was observed in all treatment groups, including controls, and its onset was dose-dependently advanced by dieldrin treatment, occurring approximately 4 months before the median liver tumour induction period in all cases. Two mechanisms are proposed that may explain the tumorigenic features of a decrease in the 4c : 2c ratio. 1. Tetraploid cells could be more sensitive to accumulative toxic stress, so their turnover may be increased.
    To replace one tetraploid cell, a diploid cell has to divide twice; the loss of tetraploid cells would therefore result in a proliferative response of the diploid population (resulting in tumour formation). 2. A reduction in the 4c : 2c ratio could be induced by the occurrence of amitotic nuclear divisions in the tetraploid cells. Evidence for this possibility was obtained from experiments with ³H-thymidine-labelled nuclei. Amitotic nuclear divisions could give rise to chromosomal rearrangements, resulting in the expression of the intrinsic neoplastic potential of CF-1 mouse liver. Both hypotheses imply that the diploid population is the source of liver tumours. The determination of nuclear polyploidization in liver tumours confirmed that these tumours originate from the diploid liver cell population.
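The "advanced biological age" interpretation described in this abstract can be made concrete with a toy model, assuming (purely for illustration) a linear rise of the 8c fraction with age, a dose-dependent upward shift, and tumour onset at a fixed critical 8c fraction. None of the coefficients below come from the study.

```python
# Toy model of the biological-age hypothesis; all numbers are invented.

def pct_8c(age_months, dose_ppm, baseline=2.0, slope=0.5, dose_shift=0.4):
    """Percentage of octaploid (8c) nuclei: linear in age, shifted by dose."""
    return baseline + slope * age_months + dose_shift * dose_ppm

def age_at_tumour(dose_ppm, critical_pct=14.0,
                  baseline=2.0, slope=0.5, dose_shift=0.4):
    """Chronological age at which the critical 8c fraction is reached."""
    return (critical_pct - baseline - dose_shift * dose_ppm) / slope

# Dieldrin-treated livers reach the critical polyploidization level
# (and hence, per the hypothesis, tumour formation) earlier:
assert age_at_tumour(10.0) < age_at_tumour(0.0)
```

The slope (the rate of "ageing") is the same at every dose; only the intercept moves, matching the observation that treated mice gain 8c nuclei with age at the same velocity as controls.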

    Development of an effective outsourcing strategy for toxicological studies in the chemical industry

    The chemical industry has been put under considerable time pressure by the European Community Regulation REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals). The work outlined here was developed at BASF SE’s Experimental Toxicology and Ecology Unit with the objective of enabling a faster reaction to the testing demand generated by the new legislation. A considerable increase in forecasted demand for tests has created the necessity to expand the Toxicology Unit’s outsourcing activities. The first goal was to optimize the selection and management process for Contract Research Organizations (CROs), so that toxicological studies can be performed with minimal risk while maximizing quality and cost advantage. A second objective was to develop a performance measurement system in the form of a balanced scorecard to evaluate contracting efficiency by monitoring the major drivers in the outsourcing process, ensuring alignment between strategic objectives and actual performance.

    Use cases, best practice and reporting standards for metabolomics in regulatory toxicology

    Metabolomics is a widely used technology in academic research, yet its application to regulatory science has been limited. The most commonly cited barrier to its translation is the lack of performance and reporting standards. The MEtabolomics standaRds Initiative in Toxicology (MERIT) project brings together international experts from multiple sectors to address this need. Here, we identify the most relevant applications for metabolomics in regulatory toxicology and develop best-practice guidelines, performance and reporting standards for acquiring and analysing untargeted metabolomics and targeted metabolite data. We recommend that these guidelines be evaluated and implemented for several regulatory use cases.

    Analysis of health concerns not addressed by REACH for low tonnage chemicals and opportunities for new approach methodology

    In the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) regulation, the criterion for deciding which studies must be performed is the annual tonnage of the chemical manufactured in or imported into the EU. The annual tonnage may be considered a surrogate for levels of human exposure, but this does not take into account the physico-chemical properties and use patterns that determine exposure. Chemicals are classified using data from REACH under areas of health concern covering effects on the skin and eye; sensitisation; acute, repeated and prolonged systemic exposure; effects on genetic material; carcinogenicity; and reproduction and development. We analysed the mandated study lists under REACH for each annual tonnage band in terms of the information they provide on each of the areas of health concern. Using the European Chemicals Agency (ECHA) REACH registration database of over 20,000 registered substances, we found that only 19% of registered substances have datasets on all areas of health concern. Information limited to acute exposure, sensitisation and genotoxicity was found for 62%. The analysis highlighted the shortfall of information mandated for substances in the lower tonnage bands. Deploying New Approach Methodologies (NAMs) in these lower tonnage bands to assess health concerns that are currently not covered by REACH, such as repeated and prolonged exposure and carcinogenicity, would provide additional information and would be a way for registrants and regulators to gain experience in the use of NAMs. There are currently projects in Europe aiming to develop NAM-based assessment frameworks, and these could find their first use in assessing low-tonnage chemicals once confidence has been gained through their evaluation with data-rich chemicals.
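A rough check of the quoted proportions; the registry size is stated only as "over 20,000", so exactly 20,000 is assumed below as a placeholder:

```python
# Back-of-the-envelope arithmetic for the REACH dataset proportions.
# n_registered is a placeholder; the source says only "over 20,000".
n_registered = 20000
n_full    = round(0.19 * n_registered)  # datasets on all areas of health concern
n_limited = round(0.62 * n_registered)  # acute, sensitisation, genotoxicity only
n_other   = n_registered - n_full - n_limited

# On these assumed numbers, roughly four in five substances lack
# complete health-concern coverage:
assert (n_full, n_limited, n_other) == (3800, 12400, 3800)
```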

    Toward Good Read-Across Practice (GRAP) guidance

    Grouping of substances and utilizing read-across of data within those groups represents an important data-gap-filling technique for chemical safety assessments. Categories/analogue groups are typically developed based on structural similarity and, increasingly often, also on mechanistic (biological) similarity. While read-across can play a key role in complying with legislation such as the European REACH regulation, the lack of consensus regarding the extent and type of evidence necessary to support it often hampers its successful application and acceptance by regulatory authorities. Despite a potentially broad user community, expertise is still concentrated in a handful of organizations and individuals. In order to facilitate the effective use of read-across, this document summarizes the state of the art, distils the insights gained from reviewing ECHA's published decisions on the relative successes and pitfalls of read-across under REACH, and compiles the relevant activities and guidance documents. Special emphasis is given to the existing tools and approaches, an analysis of ECHA's published final decisions associated with all levels of compliance checks and testing proposals, the consideration and expression of uncertainty, the use of biological support data, and the impact of the ECHA Read-Across Assessment Framework (RAAF) published in 2015.

    Framework for the quality assurance of 'omics technologies considering GLP requirements

    ‘Omics technologies are gaining importance in supporting regulatory toxicity studies. Prerequisites for performing ‘omics studies in accordance with GLP principles were discussed at the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) workshop “Applying ‘omics technologies in Chemical Risk Assessment”. A GLP environment comprises a standard operating procedure system, proper pre-planning and documentation, and inspections by independent quality assurance staff. To prevent uncontrolled data changes, the raw data obtained in the respective ‘omics data recording systems have to be specifically defined. Further requirements include transparent and reproducible data processing steps, and safe data storage and archiving procedures. The software for data recording and processing should be validated, and data changes should be traceable or disabled. GLP-compliant quality assurance of ‘omics technologies appears feasible for many GLP requirements. However, challenges include (i) defining, storing, and archiving the raw data; (ii) transparent descriptions of data processing steps; (iii) software validation; and (iv) ensuring complete reproducibility of final results with respect to raw data. Nevertheless, ‘omics studies can be supported by quality measures (e.g., GLP principles) that ensure quality control, reproducibility and traceability of experiments. This enables regulators to use ‘omics data in a fit-for-purpose context, which enhances their applicability for risk assessment.
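One of the controls listed above, making raw-data changes detectable, can be sketched with a standard cryptographic checksum recorded at acquisition time. This is a generic illustration, not the workshop's prescribed procedure; the file content is invented.

```python
# Minimal sketch of raw-data traceability: fix the raw 'omics file with a
# SHA-256 digest so any later, uncontrolled change is detectable.
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest to be recorded alongside the archived raw file."""
    return hashlib.sha256(data).hexdigest()

raw = b"mz,intensity\n101.2,88342\n203.7,1204\n"   # hypothetical raw export
recorded = fingerprint(raw)                         # stored in the study record

# Verification during an audit: recompute and compare.
assert fingerprint(raw) == recorded          # unchanged data -> digest matches
assert fingerprint(raw + b"x") != recorded   # any edit changes the digest
```

In practice such digests would be generated by the data recording system itself and archived under the GLP documentation rules described above.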

    Demonstrating the reliability of in vivo metabolomics based chemical grouping: towards best practice

    While grouping/read-across is widely used to fill data gaps, chemical registration dossiers are often rejected due to weak category justifications based on structural similarity only. Metabolomics provides a route to robust chemical categories via evidence of shared molecular effects across source and target substances. To gain international acceptance, this approach must demonstrate high reliability, and best-practice guidance is required. The MetAbolomics ring Trial for CHemical groupING (MATCHING), comprising six industrial, government and academic ring-trial partners, evaluated inter-laboratory reproducibility and worked towards best practice. An independent team selected eight substances (WY-14643, 4-chloro-3-nitroaniline, 17α-methyl-testosterone, trenbolone, aniline, dichlorprop-p, 2-chloroaniline, fenofibrate); ring-trial partners were blinded to their identities and modes of action. Plasma samples were derived from 28-day rat tests (two doses per substance), aliquoted, and distributed to the partners. Each partner applied their preferred liquid chromatography–mass spectrometry (LC–MS) metabolomics workflow to acquire, process, quality-assess, statistically analyze and report their grouping results to the European Chemicals Agency, to ensure the blinding conditions of the ring trial. Five of the six partners, whose metabolomics datasets passed quality control, correctly identified the grouping of the eight test substances into three categories, for both male and female rats. Strikingly, this was achieved even though a range of metabolomics approaches was used. Through assessing intra-study quality-control samples, the sixth partner observed high technical variation and was unable to group the substances. By comparing workflows, we conclude that some heterogeneity in metabolomics methods is not detrimental to consistent grouping, and that assessing data quality prior to grouping is essential.
    We recommend the development of international guidance for quality-control acceptance criteria. This study demonstrates the reliability of metabolomics for chemical grouping and works towards best practice.

    Metabolite Profiling of Alzheimer's Disease Cerebrospinal Fluid

    Alzheimer's disease (AD) is a neurodegenerative disorder characterized by progressive loss of cognitive functions. Today the diagnosis of AD relies on clinical evaluations and is made only late in the course of the disease. Biomarkers for early detection of the underlying neuropathological changes are still lacking, and the biochemical pathways leading to the disease are still not completely understood. The aim of this study was to identify the metabolic changes resulting from the disease phenotype by a thorough and systematic metabolite profiling approach. For this purpose, CSF samples from 79 AD patients and 51 healthy controls were analyzed by gas chromatography–mass spectrometry and liquid chromatography–tandem mass spectrometry (GC-MS and LC-MS/MS) in conjunction with univariate and multivariate statistical analyses. In total, 343 different analytes were identified, and significant changes in the metabolite profile of AD patients compared to healthy controls were found. Increased cortisol levels appeared to be related to the progression of AD and were detected in more severe forms of the disease. Increased cysteine associated with decreased uridine was the best paired combination for identifying mild AD (MMSE > 22), with specificity and sensitivity above 75%. In this group of patients, sensitivity and specificity above 80% were obtained for several combinations of three to five metabolites, including cortisol and various amino acids, in addition to cysteine and uridine.
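The paired-marker rule described above, high cysteine together with low uridine, can be sketched as a simple classifier scored by sensitivity and specificity. All thresholds and patient values below are invented for illustration; they are not the study's data or cutoffs.

```python
# Schematic two-metabolite rule: flag mild AD when CSF cysteine is elevated
# AND uridine is reduced.  Cutoffs and cohort are hypothetical.

def flag_mild_ad(cysteine, uridine, cys_cut=1.0, uri_cut=1.0):
    """True when cysteine exceeds its cutoff and uridine falls below its own."""
    return cysteine > cys_cut and uridine < uri_cut

# (cysteine, uridine, has_mild_AD) for six made-up patients
cohort = [(1.4, 0.6, True), (1.3, 0.8, True), (1.1, 1.2, True),
          (0.7, 1.1, False), (0.9, 1.3, False), (1.2, 0.7, False)]

tp = sum(flag_mild_ad(c, u) and d for c, u, d in cohort)
fn = sum((not flag_mild_ad(c, u)) and d for c, u, d in cohort)
tn = sum((not flag_mild_ad(c, u)) and not d for c, u, d in cohort)
fp = sum(flag_mild_ad(c, u) and not d for c, u, d in cohort)

sensitivity = tp / (tp + fn)   # fraction of true mild-AD patients flagged
specificity = tn / (tn + fp)   # fraction of controls correctly not flagged
```

Extending the rule to three to five metabolites, as in the abstract, amounts to adding further AND-conditions and re-scoring these two quantities.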