
    Integrating population dynamics into mapping human exposure to seismic hazard

    Disaster risk is not fully characterized without taking vulnerability and population exposure into account. Assessment of earthquake risk in urban areas would benefit from considering the variation of population distribution at finer spatial and temporal scales, and from a more explicit integration of this improved demographic data with existing seismic hazard maps. In the present work, "intelligent" dasymetric mapping is used to model population dynamics at high spatial resolution in order to support the analysis of spatio-temporal exposure to earthquake hazard in a metropolitan area. These night- and daytime-specific population densities are then classified and combined with seismic intensity levels to derive new spatially explicit, four-class composite maps of human exposure. The presented approach enables a more thorough assessment of population exposure to earthquake hazard. Results show that significantly more people are potentially at risk during the daytime, demonstrating the shifting nature of population exposure over the daily cycle and the need to move beyond conventional residence-based demographic data sources to improve risk analyses. The proposed fine-scale maps of human exposure to seismic intensity are aimed mainly at supporting visualization and communication of earthquake risk, but can be valuable in all phases of the disaster management process where knowledge of population densities is relevant for decision-making.
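    As an illustrative sketch only (this is not the authors' implementation; the class thresholds and the combination rule below are assumptions), the final combination step could look like the following Python fragment, where the population-density and seismic-intensity grids are taken to be co-registered NumPy arrays:

        import numpy as np

        def classify(grid, bins):
            # Bin a raster into ordinal classes 1..len(bins)+1.
            return np.digitize(grid, bins) + 1

        def exposure_map(pop_density, seismic_intensity):
            # Hypothetical thresholds (persons/km^2 and intensity levels).
            pop_class = classify(pop_density, bins=[500, 5000, 20000])  # classes 1-4
            haz_class = classify(seismic_intensity, bins=[6, 7, 8])     # classes 1-4
            # Assumed combination rule: take the higher of the two classes,
            # yielding a four-class composite exposure map per time period.
            return np.maximum(pop_class, haz_class)

        # Day- and night-time exposure from time-specific density grids.
        intensity = np.array([[6.5, 7.2], [8.1, 5.9]])
        day_exposure = exposure_map(np.array([[300.0, 8000.0], [25000.0, 1200.0]]), intensity)
        night_exposure = exposure_map(np.array([[900.0, 2000.0], [12000.0, 4000.0]]), intensity)

    Comparing day_exposure with night_exposure cell by cell then surfaces the daytime shift in exposure that the abstract reports.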

    Moving Forward in Human Cancer Risk Assessment

    The goal of human risk assessment is to decide whether a given exposure level to a particular chemical or substance is acceptable to human health, and to provide risk management measures based on an evaluation and prediction of the effects of that exposure on human health. Within this framework, the current safety paradigm for assessing possible carcinogenic properties of drugs, cosmetics, industrial chemicals and environmental exposures relies mainly on in vitro genotoxicity testing followed by 2-year bioassays in mice and rats. This testing paradigm was developed 40 to 50 years ago with the initial premise that "mutagens are also carcinogens" and that the carcinogenic risk to humans can be extrapolated from the tumor incidence after lifetime exposure to maximally tolerated doses of chemicals in rodents. Genotoxicity testing is used as a surrogate for carcinogenicity testing and is required for initiation of clinical trials (Jacobs and Jacobson-Kram 2004) and for most industrial chemical safety assessments. Although the carcinogenicity-testing paradigm has effectively protected patients and consumers from the introduction of harmful carcinogens as drugs and other products, the testing paradigm is clearly not sustainable in the future. The causal link between genetic damage and carcinogenicity is well documented; however, the limitations of genotoxicity/carcinogenicity testing assays, the presence of additional non-genotoxic mechanisms, issues of species-specific effects, and the lack of mechanistic insights pose an enormous scientific challenge. The 2-year rodent carcinogenicity bioassays are associated with technical complexity, high costs and a high animal burden, as well as the uncertainty associated with extrapolating from rodents to humans. Additional frustration stems from the limited predictive power of the 2-year bioassay, in particular the problem of false positives. For instance, in the Carcinogenic Potency Project DataBase (CPDB), which includes results from chronic, long-term animal cancer tests with mice, rats and hamsters amounting to a total of 6540 individual experiments on 1547 chemicals, 751 of those chemicals, or 51%, have positive findings in rodent studies. Similarly, when one considers all chronically used human pharmaceuticals, some 50% induce tumors in rodents. Yet only 20 human pharmaceutical compounds have been identified as carcinogens in epidemiological studies, despite the fact that quite a large number of epidemiological studies have been carried out on these compounds, e.g. NSAIDs, benzodiazepines and phenobarbital. This high incidence of tumors in bioassays has led to questions concerning the human relevance of tumors induced in rodents (Knight et al. 2006; Ward 2008). In summary, dependency on the rodent model as a gold standard of cancer risk assessment neglects the high number of false positives and clearly has serious limitations. Consequently, there is a growing appeal for a paradigm change after "50 years of rats and mice". For instance, the current demands for the volume of carcinogenicity testing, together with the limitations on animal usage initially stipulated by REACH (Combes et al. 2006), will require a revolutionary change in the testing paradigm.
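    As a back-of-the-envelope illustration of the false-positive argument (illustrative arithmetic only, echoing the counts quoted above), roughly half of the tested chemicals come out positive in rodents, while only a handful of pharmaceuticals are confirmed human carcinogens:

        # Counts quoted in the text above (CPDB); illustrative only.
        tested = 1547          # chemicals with chronic rodent bioassay data
        rodent_positive = 751  # chemicals with positive rodent findings

        print(f"rodent-positive rate: {rodent_positive / tested:.0%}")  # about half

        # Chronically used pharmaceuticals: ~50% are rodent-positive, yet only
        # ~20 are identified as human carcinogens in epidemiological studies,
        # so even if all 20 were among the rodent positives, the large majority
        # of rodent positives would remain unconfirmed in humans.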
For the purpose of developing a road map for this needed paradigm change in carcinogenicity testing, a workshop entitled "Genomics in Cancer Risk Assessment" was held in August 2009 in Venice, Italy. This workshop brought together toxicologists from academia and industry with governmental regulators and risk assessors from the US and the EU to discuss the state of the art in developing alternative testing strategies for genotoxicity and carcinogenicity, focusing on the contribution of the 'omics technologies. What follows is a highlight of the major conclusions and suggestions from this workshop as a path forward.

    Quantum Theory contents insertion in High School curricula

    Twentieth-century scientific and technological innovation has been overwhelming. Our teaching experience, however, leads us to believe that learning in secondary schools is far from keeping pace with the evolution of today's society. Nevertheless, some Modern Physics content is already included in the official curricula of several countries, which appears to reflect a concern with updating the preparation of young people for the world around them. In this paper we report a study of the Physics curricula of several countries, with emphasis on Quantum Theory, the great invention of the twentieth century, seeking to identify the selected contents and the way they fit into the general curricular orientations.

    Precise measurement of hadronic tau-decays with an eta meson

    We have studied hadronic tau decay modes involving an eta meson using 490 fb^{-1} of data collected with the Belle detector at the KEKB asymmetric-energy e+e- collider. The following branching fractions have been measured: B(tau- -> K- eta nu)=(1.58 +- 0.05 +- 0.09)x 10^{-4}, B(tau- -> K- pi0 eta nu)=(4.6 +- 1.1 +- 0.4)x 10^{-5}, B(tau- -> pi- pi0 eta nu)=(1.35 +- 0.03 +- 0.07)x 10^{-3}, B(tau- -> pi- KS eta nu)=(4.4 +- 0.7 +- 0.2)x 10^{-5}, and B(tau- -> K^{*-} eta nu)=(1.34 +- 0.12 +- 0.09)x 10^{-4}. These results are substantially more precise than previous measurements. The new measurements are compared with theoretical calculations based on the CVC hypothesis or chiral perturbation theory. We also set upper limits on the branching fractions for tau decays into K- KS eta nu, pi- KS pi0 eta nu, K- eta eta nu, pi- eta eta nu and non-resonant K- pi0 eta nu final states.
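    In the quoted branching fractions, the first uncertainty is statistical and the second systematic; the usual convention (assumed here, beyond what the abstract itself states) is to combine them in quadrature to obtain the total uncertainty, e.g. for the K- eta nu mode:

        from math import hypot

        # B(tau- -> K- eta nu) = (1.58 +- 0.05 +- 0.09) x 10^-4
        value, stat, syst = 1.58e-4, 0.05e-4, 0.09e-4

        total = hypot(stat, syst)  # quadrature sum: sqrt(stat**2 + syst**2)
        print(f"B = ({value * 1e4:.2f} +- {total * 1e4:.2f}) x 10^-4")
        print(f"relative precision: {total / value:.1%}")  # ~6.5%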

    Transcription Inhibition by DRB Potentiates Recombinational Repair of UV Lesions in Mammalian Cells

    Homologous recombination (HR) is intricately associated with replication, transcription and DNA repair in all organisms studied. However, the interplay between all these processes occurring simultaneously on the same DNA molecule is still poorly understood. Here, we study the interplay between transcription and HR following ultraviolet light (UV)-induced DNA damage in mammalian cells. Our results show that inhibition of transcription with 5,6-dichloro-1-beta-D-ribofuranosylbenzimidazole (DRB) increases the number of UV-induced DNA lesions (γH2AX, 53BP1 foci formation), which correlates with a decrease in the survival of wild-type or nucleotide excision repair-defective cells. Furthermore, we observe an increase in RAD51 foci formation, suggesting that HR is triggered in response to an increase in UV-induced DSBs when transcription is inhibited. Unexpectedly, we observe that DRB fails to sensitise HR-defective cells to UV treatment. Thus, increased RAD51 foci formation correlates with increased cell death, suggesting the existence of a futile HR repair of UV-induced DSBs that is linked to transcription inhibition.