30 research outputs found

    An integrative risk assessment approach for persistent chemicals: A case study on dioxins, furans and dioxin-like PCBs in France

    Abstract: For persistent chemicals that are slowly eliminated from the body, the accumulated concentration (body burden), rather than the daily exposure, is considered the proper starting point for risk assessment. This work introduces an integrative approach to persistent-chemical risk assessment based on a dynamic body burden. To this end, a Kinetic Dietary Exposure Model (KDEM) was extended with the long-term time trend in exposure (historic exposure) and with a comparison of the accumulated body burden against body burden references for toxicity. The usefulness of the model is illustrated with the dietary exposure to polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs) and polychlorinated biphenyls (PCBs) in France. First, the dietary exposure to these compounds was determined in 2009 and combined with its long-term time trend. To account for the kinetic differences between PCDD/Fs and dioxin-like PCBs (dl-PCBs), three groups of congeners were considered: PCDD/Fs, PCB 126 and the remaining dl-PCBs. The body burden was compared with reference body burdens corresponding to reproductive, hepatic and thyroid toxicity. For thyroid toxicity, this comparison indicated that in 2009 the probability of the body burden exceeding its reference ranged from 2.8% (95% CI: 1.5-4.9%) in 18-29 year olds to 3.9% (95% CI: 2.7-7.1%) in 60-79 year olds. Notwithstanding the decreasing long-term trend of dietary dioxin exposure in France, this probability is still expected to be 1.5% (95% CI: 0.3-2.5%) in 60-79 year olds in 2030. For reproductive toxicity, the probability of the 2009 body burden exceeding its reference ranged from 3.1% (95% CI: 1.4-5.0%) in 18-29 year olds to 3.5% (95% CI: 2.2-5.2%) in 30-44 year olds. In 2030 this probability is negligible in 18-29 year olds, yet small though significant in 30-44 year olds (0.7%, 95% CI: 0-1.6%). For hepatic toxicity, the probability was already negligible in 2009, even in 60-79 year olds. In conclusion, this approach indicates that dioxin levels in food in France form a declining, though still present, future health risk with respect to thyroid and reproductive toxicity.
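The dynamic body burden idea underlying this abstract can be sketched with a minimal one-compartment model: dietary intake feeds the burden, while first-order elimination removes it. This is only an illustration of the principle, not the KDEM itself; the half-life, intake level and exposure trend below are hypothetical placeholders, not values from the study.

```python
# Minimal sketch of a dynamic body burden, assuming first-order elimination
# and an exponentially declining dietary intake. All parameter values are
# hypothetical placeholders for illustration only.
import math

def body_burden(years, intake0_ng_per_day, trend_per_year, half_life_years,
                b0=0.0, steps_per_year=365):
    """Simulate body burden B under dB/dt = I(t) - k*B (Euler integration)."""
    k = math.log(2) / half_life_years          # first-order elimination rate (1/yr)
    dt = 1.0 / steps_per_year
    b = b0
    for step in range(int(years * steps_per_year)):
        t = step * dt
        # Yearly intake declining exponentially with the long-term time trend
        intake = intake0_ng_per_day * 365 * math.exp(-trend_per_year * t)
        b += (intake - k * b) * dt             # Euler step
    return b

# Hypothetical example: 7-year half-life, exposure declining 5% per year
print(round(body_burden(20, 1.0, 0.05, 7.0), 1))
```

With a constant intake, the simulated burden converges to the familiar steady state I/k, which is one way to sanity-check such a model.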

    Bayesian meta-analysis of inter-phenotypic differences in human serum paraoxonase-1 activity for chemical risk assessment

    Human variability in paraoxonase-1 (PON1) activities is driven by genetic polymorphisms that affect the internal dose of active oxons of organophosphorus (OP) insecticides. Here, an extensive literature search was performed to collect human genotypic frequencies (i.e. L55M, Q192R, and C-108T) in subgroups of a range of geographical ancestries, together with PON1 activities for three probe substrates (paraoxon, diazoxon and phenyl acetate). Bayesian meta-analyses were performed to estimate variability distributions for PON1 activities and PON1-related uncertainty factors (UFs), while integrating quantifiable sources of inter-study, inter-phenotypic and inter-individual differences. Inter-phenotypic differences were quantified using the population with high PON1 activity as the reference group. The meta-analyses provided PON1 variability distributions that can be implemented in generic physiologically based kinetic models to develop quantitative in vitro to in vivo extrapolation models. PON1-related UFs in the Caucasian population were above the default toxicokinetic UF of 3.16 for specific genotypes, namely −108CC using diazoxon as probe substrate and −108CT, −108TT, 55MM and 192QQ using paraoxon as probe substrate. However, integration of PON1 genotypic frequencies and activity distributions showed that all UFs were within the default toxicokinetic UF. Quantitative inter-individual differences in PON1 activity are important for chemical risk assessment, particularly with regard to potential sensitivity to organophosphate toxicity. Keywords: Human variability, PON1 activity, Polymorphism, Uncertainty factor
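The link between a variability distribution and a pathway-related UF can be sketched as follows. Assuming, purely for illustration, that inter-individual PON1 activity is lognormal, a kinetic UF can be taken as the ratio of the median activity to that of a low-percentile (sensitive) individual. The geometric mean (GM) and geometric standard deviation (GSD) below are hypothetical placeholders, not the paper's Bayesian estimates.

```python
# Sketch of deriving an uncertainty factor (UF) from a lognormal
# inter-individual variability distribution. GM/GSD values are hypothetical.
from statistics import NormalDist
import math

def uf_from_lognormal(gm, gsd, percentile=0.975):
    """UF covering the given percentile of a lognormal activity distribution.

    For a detoxifying enzyme such as PON1, low activity means high
    sensitivity, so the UF is the ratio of the median (GM) to the activity
    of the (1 - percentile) quantile individual; this equals gsd ** z.
    """
    z = NormalDist().inv_cdf(percentile)       # e.g. about 1.96 for 97.5%
    low_activity = gm * gsd ** (-z)            # lognormal low quantile
    return gm / low_activity                   # = gsd ** z

default_tk_uf = 10 ** 0.5                      # default toxicokinetic UF, ~3.16

uf = uf_from_lognormal(gm=100.0, gsd=1.6, percentile=0.975)
print(round(uf, 2), uf > default_tk_uf)
```

Note that under this definition the UF depends only on the spread (GSD), not on the GM, which is why comparing subgroup spreads against the default factor of 3.16 is informative.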

    Les Revendications ouvrières en France, par A. Béchaux,... (Workers' demands in France, by A. Béchaux)

    Contains a table of contents; with text mode

    Méthodes haut-débit en toxicologie et en évaluation des risques sanitaires (High-throughput methods in toxicology and health risk assessment)

    Toxicology is changing its experimental approaches, moving from animal testing to less expensive, more ethical and more relevant methods. Since the beginning of this century, various regulations and research programs on both sides of the Atlantic have pushed for and contributed to this change. Modern toxicology relies on two main components: in vitro testing and in silico analyses. Toxicology has also entered a world of "big data" production, switching from low-throughput to high-throughput screening. Complementary to the assessment of toxicological impact, a large effort has also been made to evaluate human exposure to chemicals: new human and field surveys, analytical measurements, computational capacities, and the use of mathematical modeling have opened new possibilities for exposure assessment. Accounting for several sources and routes of exposure, estimating combined exposure to mixtures, integrating exposure variability, and simulating long-term exposure are new challenges on their way to being solved. In addition, biomonitoring data, internal exposure biomarkers, and toxicokinetics are all adding to the list of tools and techniques helping to link the pieces of the still-incomplete puzzle of high-throughput risk assessment. Yet high-throughput applications in toxicology have been criticized for their inadequate representation of biological interactions at the organism level, for the experimental noise they suffer from, for the complexity of in vitro to in vivo extrapolation, and for their as-yet-undefined validation protocols. We propose here a brief panorama of those developments.

    La escuela económica francesa (The French school of economics)
