
    Toxicology databases and the concept of thresholds of toxicological concern as used by the JECFA for the safety evaluation of flavouring agents

    Since 1996 the FAO/WHO Joint Expert Committee on Food Additives (JECFA) has evaluated the safety of 1259 flavouring substances, based on a decision tree that incorporates a series of thresholds of toxicological concern. Safety conclusions are based on the predicted consequences of metabolism and on whether the estimated intake is above or below a threshold of toxicological concern that is relevant to that compound. Compounds are allocated to one of three structural classes, and the intake is compared with a threshold of toxicological concern derived from data from chronic and sub-chronic toxicity studies on compounds in the same structural class. If the substance is predicted to be metabolised to innocuous products, there is no safety concern if the intake is below the threshold, but suitable toxicity data on the compound or structural analogues are required if the intake exceeds the threshold. If the substance is not predicted to be metabolised to innocuous products, and the intake is below the appropriate threshold, the safety evaluation is based on data on the compound or structural analogues. An additional threshold of 1.5 µg per day, derived from doses of investigated chemicals giving a calculated cancer risk of one in a million, is applied when appropriate toxicity data are not available.
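
The branching logic of the JECFA decision tree described above can be sketched in a few lines. The per-class intake thresholds used here (structural classes I/II/III: 1800, 540 and 90 µg/person per day) are the commonly cited values and are an assumption of this sketch, not figures quoted in the abstract.

```python
# Sketch of the JECFA flavouring-agent decision tree. Threshold values are
# assumed (typical Cramer class I/II/III TTCs), not taken from the abstract.
THRESHOLDS_UG_PER_DAY = {"I": 1800, "II": 540, "III": 90}

def evaluate_flavouring(structural_class, intake_ug_per_day, metabolised_to_innocuous):
    """Return the evaluation outcome for a flavouring substance."""
    threshold = THRESHOLDS_UG_PER_DAY[structural_class]
    if metabolised_to_innocuous:
        if intake_ug_per_day <= threshold:
            return "no safety concern"
        return "toxicity data on compound or structural analogues required"
    # Not predicted to be metabolised to innocuous products:
    if intake_ug_per_day <= threshold:
        return "evaluate using data on compound or structural analogues"
    return "full chemical-specific evaluation required"
```

The final branch (intake above threshold, non-innocuous metabolites) is not spelled out in the abstract; the outcome string used for it here is an illustrative placeholder.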

    Structure-based thresholds of toxicological concern-guidance for application to substances present at low levels in the diet

    Health-based guidance values, such as the ADI, use chemical-specific data to determine the highest intake that would be without significant adverse health effects. A threshold of toxicological concern (TTC) is a level of intake predicted to be without adverse effects based on the toxicity of structurally related compounds. The main advantage of the use of TTCs is that the risk of low exposures can be evaluated without the need for chemical-specific animal toxicity data. TTCs have been used for many years for screening the safety of packaging migrants by the FDA in the USA, and of flavouring substances by the JECFA. A recent reassessment of the use of TTCs, organized by ILSI Europe, has developed a decision tree which allows a systematic approach to the evaluation of low levels of diverse chemicals in food. The decision tree incorporates a series of increasing TTC values into a step-wise approach. Potentially genotoxic carcinogens are considered first, based on the presence of known structural alerts. Aflatoxin-like, azoxy- and nitroso-compounds are removed from consideration because they are the most potent, and a practical TTC could not be established for them. Other compounds with structural alerts for genotoxicity are allocated a TTC of 0.15 µg/person per day. Compounds without structural alerts for genotoxicity are evaluated based on chemical structure and intake, using a series of TTC values derived by the application of a 100-fold uncertainty factor to the 5th percentile of the distribution of NOAELs from chronic studies on compounds sharing similar structural characteristics.
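
The tiered screen described above can be summarised as a short function. The 0.15 µg/day value for genotoxic structural alerts is from the abstract; the structural-class TTC values (1800/540/90 µg/day) are assumed typical Cramer-class values, not quoted in the text.

```python
# Sketch of the tiered ILSI Europe TTC decision tree summarised above.
# Class TTC values are assumptions of this sketch; 0.15 ug/day is from the text.
def ttc_screen(intake_ug_day, genotoxic_alert, aflatoxin_azoxy_nitroso, cramer_class=None):
    """Return the screening outcome for a low-level dietary substance."""
    if aflatoxin_azoxy_nitroso:
        # Most potent carcinogen classes: excluded, no practical TTC exists.
        return "excluded: no practical TTC"
    if genotoxic_alert:
        return "below TTC" if intake_ug_day <= 0.15 else "risk assessment needed"
    # No genotoxicity alert: compare against the structural-class TTC.
    ttc = {"I": 1800, "II": 540, "III": 90}[cramer_class]
    return "below TTC" if intake_ug_day <= ttc else "risk assessment needed"
```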

    Risk characterisation of chemicals in food

    Information on hazard identification, hazard characterisation (including dose–response assessment) and exposure assessment is brought together under risk characterisation in order to provide advice to risk managers. Hazard identification and characterisation define the adverse effects produced by the chemical and are used to determine a health-based guidance value, such as an acceptable daily intake for threshold effects, or an estimate of the intake associated with a predefined level of risk for non-threshold effects. Exposure assessment aims to define average and high intakes, and has to take into account different dietary patterns. Hazard characterisation and exposure data need to be closely inter-related; for example, the data should relate to the same life-stages and durations of exposure. Risk characterisation has to be an iterative process in which information on hazard and exposure is matched and any discrepancy is taken into account, where necessary by the generation of additional data. There is an established method of quantitative risk characterisation for non-threshold effects. A population distribution model is proposed that is applicable to estimating the risk associated with exposure to compounds producing threshold effects.

    The refinement of uncertainty/safety factors in risk assessment by the incorporation of data on toxicokinetic variability in humans

    The derivation of safe levels of exposure in humans for compounds that are assumed to cause threshold toxicity has relied on the application of a 100-fold uncertainty factor to a measure for the threshold, such as the no observed adverse effect level (NOAEL) or the benchmark dose (BMD). This 100-fold safety factor consists of the product of two 10-fold factors allowing for human variability and interspecies differences. The International Programme on Chemical Safety has suggested the subdivision of these 10-fold factors to allow for variability in toxicokinetics and toxicodynamics. This subdivision allows the replacement of the default uncertainty factors with a chemical-specific adjustment factor (CSAF) when suitable data are available. This short review describes potential options to refine safety factors used in risk assessment, with particular emphasis on pathway-related uncertainty factors associated with variability in kinetics. These pathway-related factors were derived from a database that quantified interspecies differences and human variability in phase I metabolism, phase II metabolism, and renal excretion. This approach allows metabolism and pharmacokinetic data in healthy adults and subgroups of the population to be incorporated in the risk-assessment process and constitutes an intermediate approach between simple default factors and chemical-specific adjustment factors.
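
The IPCS subdivision described above can be made concrete with a few lines of arithmetic: each 10-fold factor splits into a kinetic and a dynamic component, and the four sub-factors multiply back to 100. The interspecies split (4.0 kinetics × 2.5 dynamics) and the even human split (3.16 × 3.16) are the standard IPCS values; the 4.0 interspecies kinetic default also appears in the CYP1A2 paper later in this list.

```python
# Worked arithmetic for the IPCS subdivision of the 100-fold uncertainty factor.
interspecies_kinetics = 10 ** 0.6   # ~4.0: animal-to-human toxicokinetics
interspecies_dynamics = 10 ** 0.4   # ~2.5: animal-to-human toxicodynamics
human_kinetics = 10 ** 0.5          # ~3.16: interindividual toxicokinetics
human_dynamics = 10 ** 0.5          # ~3.16: interindividual toxicodynamics

composite = (interspecies_kinetics * interspecies_dynamics
             * human_kinetics * human_dynamics)
# The exponents sum to 2, so the composite factor is exactly 10**2 = 100.
# Replacing any one sub-factor with a chemical-specific adjustment factor
# (CSAF) leaves the remaining components at their defaults.
```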

    Human variability in the renal elimination of foreign compounds and renal excretion-related uncertainty factors for risk assessment

    Renal excretion is an important route of elimination for xenobiotics, and three processes determine the renal clearance of a compound: glomerular filtration (about 120 ml/min), active renal tubular secretion (>120 ml/min) and passive reabsorption (<120 ml/min). Human variability in kinetics has been quantified using a database of 15 compounds excreted extensively by the kidney (>60% of a dose) to develop renal-excretion-related uncertainty factors for the risk assessment of environmental contaminants handled via this route. Data were analysed from published pharmacokinetic studies (after oral and intravenous dosing) in healthy adults and other subgroups, using parameters relating primarily to chronic exposure [renal and total clearances, area under the plasma concentration–time curve (AUC)] and acute exposure (Cmax). Interindividual variability in kinetics was low for both routes of exposure, with coefficients of variation of 21% (oral) and 24% (intravenous) that were largely independent of the renal processes involved. Renal-excretion-related uncertainty factors were below the default kinetic uncertainty factor of 3.16 for most subgroups analysed, with the exception of the elderly (oral data) and neonates (intravenous data), for whom renal-excretion-related factors of 4.2 and 3.2 would be required to cover up to 99% of these subgroups, respectively.
    Abbreviations: CLr, renal clearance; CL, total clearance; AUC, area under the plasma concentration–time curve; Cmax, maximum plasma concentration; Ns, number of studies; Np, number of publications; n, number of subjects; Xw, arithmetic weighted mean (normal distribution); SDw, weighted standard deviation (normal distribution); CVN, coefficient of variation (normal distribution); GMw, geometric weighted mean (lognormal distribution); GSDw, weighted geometric standard deviation (lognormal distribution); CVLN, coefficient of variation (lognormal distribution); GF, glomerular filtration; TS, tubular secretion.
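
A subgroup-specific uncertainty factor of the kind derived above can be sketched from lognormal kinetic variability: the factor needed to cover the Pth percentile of a subgroup is the ratio of that percentile of internal dose to the healthy-adult geometric mean. The function below follows that logic in spirit; the input numbers are illustrative, not the study's data.

```python
import math

def coverage_factor(gm_ratio, cv, z):
    """Uncertainty factor covering a given population percentile.
    gm_ratio: subgroup geometric-mean internal dose / healthy-adult GM
    cv: coefficient of variation of the lognormal kinetic distribution
    z: standard-normal quantile (2.326 for the 99th percentile)"""
    # Convert a lognormal CV into a geometric standard deviation (GSD).
    gsd = math.exp(math.sqrt(math.log(1 + cv ** 2)))
    return gm_ratio * gsd ** z

# Illustrative example (hypothetical values): a subgroup with a 1.8-fold
# higher mean internal dose and 24% interindividual CV.
uf_99 = coverage_factor(gm_ratio=1.8, cv=0.24, z=2.326)
# uf_99 is ~3.1, i.e. close to the 3.16 default; larger mean shifts or
# higher variability push the factor above the default, as seen for the
# elderly (4.2) and neonates (3.2) in the abstract.
```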

    The role of stratum corneum and dermal microvascular perfusion in penetration and tissue levels of water-soluble drugs investigated by microdialysis

    BACKGROUND: Hydrophilic drugs are poorly absorbed when applied topically, due to low partitioning through the lipid matrix of the stratum corneum. Cutaneous blood flow rapidly clears the absorbed drug, which may result in low tissue levels. This is of importance for topically applied drugs whose site of action is within the epidermis or dermis. Dermal drug levels can be measured using cutaneous microdialysis, which is a means of continuously sampling substances from the dermal extracellular fluid. OBJECTIVES: To measure the contribution of the stratum corneum barrier and microvascular perfusion in determining dermal tissue levels of hydrophilic drugs (aciclovir and penciclovir) in vivo. METHODS: Studies were performed using microdialysis of the volar surface of the forearm of healthy volunteers (n = 55) over a 5-h collection period. Stratum corneum was removed by tape stripping, and barrier disruption was quantified by measurement of transepidermal water loss (TEWL); dermal microvascular perfusion was modulated by inclusion of noradrenaline in the microdialysis perfusate. RESULTS: With intact skin and normal cutaneous blood flow, the concentration of penciclovir recovered was below the assay threshold (0.05 ng mL(-1)). With noradrenaline-induced local vasoconstriction, the area under the curve of drug absorbed through normal skin (+/- SEM) was 13.3 +/- 2.9 ng mL(-1) h(0-5) for penciclovir and 27.6 +/- 10.6 ng mL(-1) h(0-5) for aciclovir. Removal of the stratum corneum (to glistening) by tape stripping increased penciclovir absorption by 1300-fold and aciclovir absorption by 440-fold, confirming the stratum corneum as the major barrier to hydrophilic drug absorption. Sequential barrier disruption by tape stripping gave a close correlation between penciclovir concentration absorbed per hour and barrier disruption measured by TEWL (r2 = 0.9283). There was a 15.6-fold difference in the recovery of penciclovir through barrier-deficient skin with and without cutaneous blood flow. There was no relationship between fibre depth and amount of drug dialysed, which suggests free movement of antiviral drug on reaching the aqueous environment of the dermis. CONCLUSIONS: This study defines for the first time the relationship between the degree of mechanical barrier impairment and drug absorption at the same anatomical site in humans, and the role of blood flow in drug clearance in vivo.
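
The AUC(0–5 h) values reported above are typically computed from serial dialysate concentrations by the trapezoidal rule; a minimal sketch, with illustrative times and concentrations (not study data):

```python
# Trapezoidal-rule AUC over a sampling interval, as used for values reported
# in ng mL^-1 h(0-5). Input numbers below are hypothetical, not study data.
def trapezoid_auc(times, concs):
    """AUC of concentration vs. time by the trapezoidal rule."""
    return sum((t2 - t1) * (c1 + c2) / 2
               for (t1, c1), (t2, c2) in zip(zip(times, concs),
                                             zip(times[1:], concs[1:])))

times_h = [0, 1, 2, 3, 4, 5]            # hourly dialysate fractions
concs_ng_ml = [0.0, 2.1, 3.4, 3.0, 2.6, 2.2]  # illustrative concentrations
auc_0_5 = trapezoid_auc(times_h, concs_ng_ml)  # ng mL^-1 h(0-5)
```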

    Human variability in CYP3A4 metabolism and CYP3A4-related uncertainty factors for risk assessment

    CYP3A4 constitutes the major liver cytochrome P450 isoenzyme and is responsible for the oxidation of more than 50% of all known drugs. Human variability in kinetics for this pathway has been quantified using a database of 15 compounds metabolised extensively (>60%) by this CYP isoform in order to develop CYP3A4-related uncertainty factors for the risk assessment of environmental contaminants handled via this route. Data were analysed from published pharmacokinetic studies (after oral and intravenous dosing) in healthy adults and other subgroups using parameters relating primarily to chronic exposure [metabolic and total clearances, area under the plasma concentration–time curve (AUC)] and acute exposure (Cmax). Interindividual variability in kinetics was greater for the oral route (46%, 12 compounds) than for the intravenous route (32%, 14 compounds). The physiological and molecular basis for the difference between these two routes of exposure is discussed. In relation to the uncertainty factors used for risk assessment, the default kinetic factor of 3.16 would be adequate for adults, whereas a CYP3A4-related factor of 12 would be required to cover up to 99% of neonates, which have lower CYP3A4 activity.

    Human variability in xenobiotic metabolism and pathway-related uncertainty factors for chemical risk assessment: a review

    This review provides an account of recent developments arising from a database that defined human variability in phase I metabolism (CYP1A2, CYP2A6, CYP2C9, CYP2C19, CYP2D6, CYP2E1, CYP3A4, hydrolysis, alcohol dehydrogenase), phase II metabolism (N-acetyltransferases, glucuronidation, glycine conjugation, sulphation) and renal excretion. This database was used to derive pathway-related uncertainty factors for chemical risk assessment that allow for human variability in toxicokinetics. Probe substrates for each pathway of elimination were selected on the basis that oral absorption was >95% and that the metabolic route was the primary route of elimination of the compound (60–100% of a dose). Intravenous data were used for compounds for which absorption was variable. Human variability in kinetics was quantified for each compound from published pharmacokinetic studies (after oral and intravenous dosing) in healthy adults and other subgroups of the population, using parameters relating to chronic exposure [metabolic and total clearances, area under the plasma concentration–time curve (AUC)] and acute exposure (Cmax) (data not presented here). The pathway-related uncertainty factors were calculated to cover 95%, 97.5% and 99% of the population of healthy adults and of each subgroup. Pathway-related uncertainty factors allow metabolism data to be incorporated into the derivation of health-based guidance values. They constitute an intermediate approach between the general kinetic default factor (3.16) and a chemical-specific adjustment factor. Applications of pathway-related uncertainty factors for chemical risk assessment and future refinements of the approach are discussed. A knowledge-based framework to predict human variability in kinetics for xenobiotics showing a threshold dose below which toxic effects are not observed is proposed, to move away from default assumptions.
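
The "intermediate approach" described above amounts to a tiered choice of kinetic uncertainty factor: a chemical-specific adjustment factor (CSAF) when data exist, else a pathway-related factor for the compound's main elimination route, else the general default of 3.16. A minimal sketch, where the two pathway entries are the 99%-coverage values quoted elsewhere in this list (CYP3A4 neonates: 12; renal excretion, elderly: 4.2) and the table is otherwise illustrative:

```python
# Tiered selection of a kinetic uncertainty factor: CSAF > pathway-related
# factor > general default. The table entries are examples from this listing,
# not a reproduction of the review's full dataset.
DEFAULT_KINETIC_UF = 3.16
PATHWAY_UF_99 = {
    "CYP3A4": 12.0,   # covers 99% of neonates (lower CYP3A4 activity)
    "renal": 4.2,     # covers 99% of the elderly (oral data)
}

def kinetic_uf(pathway=None, csaf=None):
    """Pick the most data-informed kinetic uncertainty factor available."""
    if csaf is not None:
        return csaf                    # chemical-specific data: best option
    if pathway in PATHWAY_UF_99:
        return PATHWAY_UF_99[pathway]  # pathway-related: intermediate option
    return DEFAULT_KINETIC_UF          # no data: general default
```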

    Uncertainty factors for chemical risk assessment: interspecies differences in the in vivo pharmacokinetics and metabolism of human CYP1A2 substrates

    The 100-fold default uncertainty factor is used to convert a no-observed-adverse-effect level (NOAEL) from an animal toxicity study into a "safe" value for human intake. The composite uncertainty factor (100) has to allow for interspecies (10-fold) and interindividual (10-fold) differences in toxicokinetics and toxicodynamics. The aim of the current study was to assess the validity of the interspecies default for toxicokinetics (4.0) for each of the test species (dog, rabbit, rat and mouse), using published data for compounds eliminated by CYP1A2 in humans (caffeine, theobromine, theophylline and paraxanthine). An analysis of the published literature showed that the absorption, bioavailability and route of excretion were generally similar between humans and the test species for each probe substrate. However, interspecies differences in the route of metabolism, and in the enzymes involved in this process, were identified. The magnitude of the difference in internal dose between species showed that the values for the mouse (10.6) and rat (5.4) exceed the 4.0-fold default, whereas those for the rabbit (2.6) and dog (1.6) were below this value. This work supports the need to replace generic default factors with compound-related values derived from specific, relevant, quantitative data; this would result in more relevant and reliable non-cancer risk assessments.
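
The species comparison above reduces to checking each species' internal-dose ratio (human/animal difference in internal dose per unit external dose) against the 4.0-fold interspecies kinetic default. The per-species magnitudes below are those quoted in the abstract:

```python
# Interspecies internal-dose ratios for CYP1A2 probe substrates (values from
# the abstract), compared against the 4.0-fold interspecies kinetic default.
SPECIES_RATIO = {"mouse": 10.6, "rat": 5.4, "rabbit": 2.6, "dog": 1.6}
DEFAULT_INTERSPECIES_TK = 4.0

exceeds_default = {species: ratio > DEFAULT_INTERSPECIES_TK
                   for species, ratio in SPECIES_RATIO.items()}
# Mouse and rat exceed the default, so a NOAEL from those species scaled by
# the generic factor may under-protect; rabbit and dog fall below it.
```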