
    Helium in natal HII regions: the origin of the X-ray absorption in gamma-ray burst afterglows

    Soft X-ray absorption in excess of the Galactic value is observed in the afterglows of most gamma-ray bursts (GRBs), but its origin has remained unresolved after more than a decade of work, preventing its use as a powerful diagnostic tool. We resolve this long-standing problem and find that He in the GRB's host HII region is responsible for most of the absorption. We show that the X-ray absorbing column density (N_Hx) is correlated both with the neutral gas column density and with the optical afterglow extinction (Av). This correlation explains the connection between dark bursts and bursts with high N_Hx values. From these correlations we exclude an origin of the X-ray absorption unrelated to the host galaxy, i.e. the intergalactic medium and intervening absorbers are not responsible. We find that the correlation with the dust column shows a strong redshift evolution, whereas the correlation with the neutral gas does not. From this we conclude that the column density of the X-ray absorption is correlated with the total gas column density in the host galaxy rather than with the metal column density, even though X-ray absorption is typically dominated by metals. The strong redshift evolution of N_Hx/Av is thus a reflection of the cosmic metallicity evolution of star-forming galaxies. We conclude that the absorption of X-rays in GRB afterglows is caused by He in the HII region hosting the GRB. While the GRB destroys dust and strips metals of all their electrons out to great distances, the abundance of He saturates the He-ionising UV continuum much closer to the GRB, allowing the He to remain in the neutral or singly-ionised state. Helium X-ray absorption explains the correlation with total gas, the lack of strong evolution with redshift, and the absence of dust, metal or hydrogen absorption features in the optical-UV spectra.
    Comment: 10 pages, 4 figures, submitted to Ap

    Searching for hexagonal analogues of the half-metallic half-Heusler XYZ compounds

    The XYZ half-Heusler crystal structure can conveniently be described as a tetrahedral zinc blende YZ structure stuffed by a slightly ionic X species. This description is well suited to understanding the electronic structure of semiconducting 8-electron compounds such as LiAlSi (formulated Li^+[AlSi]^-) or semiconducting 18-electron compounds such as TiCoSb (formulated Ti^{4+}[CoSb]^{4-}). The basis for this is that [AlSi]^- (with the same electron count as Si_2) and [CoSb]^{4-} (with the same electron count as GaSb) are, both structurally and electronically, zinc-blende semiconductors. The electronic structure of half-metallic ferromagnets in this structure type can then be described as that of semiconductors stuffed with magnetic ions carrying a local moment: for example, 22-electron MnNiSb can be written Mn^{3+}[NiSb]^{3-}. The tendency of the 18-electron compound towards a semiconducting gap -- believed to arise from strong covalency -- carries over in MnNiSb to a tendency for a gap in one spin direction. Here we similarly propose the systematic examination of 18-electron hexagonal compounds for semiconducting gaps; these would be the "stuffed wurtzite" analogues of the "stuffed zinc blende" half-Heusler compounds. These semiconductors could then serve as the basis for possibly new families of half-metallic compounds, attained through appropriate replacement of non-magnetic ions by magnetic ones. These semiconductors and semimetals with tunable charge carrier concentrations could also be interesting in the context of magnetoresistive and thermoelectric materials.
    Comment: 11 pages, 6 figures, of which 4 are colou
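The 8/18/22-electron counting that underpins this formulation can be sketched in a few lines. This is an illustrative helper, not code from the paper; the small valence-electron table below is an assumption covering only the elements named in the abstract.

```python
# Hedged sketch: valence electron counting for XYZ half-Heusler compositions.
# The VALENCE table is a hand-rolled assumption (standard valence counts for
# the elements mentioned in the abstract), not data from the paper.
VALENCE = {"Li": 1, "Al": 3, "Si": 4, "Ti": 4, "Co": 9, "Sb": 5, "Mn": 7, "Ni": 10}

def electron_count(elements):
    """Total valence electron count for an XYZ composition."""
    return sum(VALENCE[e] for e in elements)

for xyz in (("Li", "Al", "Si"), ("Ti", "Co", "Sb"), ("Mn", "Ni", "Sb")):
    # LiAlSi -> 8, TiCoSb -> 18, MnNiSb -> 22, matching the abstract's counts
    print("".join(xyz), electron_count(xyz))
```

Running this reproduces the counts quoted in the abstract: LiAlSi is an 8-electron semiconductor, TiCoSb an 18-electron semiconductor, and MnNiSb a 22-electron half-metal candidate.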

    Enhancing public sector enterprise risk management through interactive information processing

    Introduction: Federal agencies are increasingly expected to adopt enterprise risk management (ERM). However, public sector adoption of ERM has typically focused on the economic efficiency of tax-financed activities based on control-based practices. This reflects an emphasis on quantifiable concerns that invariably directs attention to risk, which (by definition) relates to identifiable and measurable events, thereby downplaying uncertain and unknown aspects of public exposures. This is a potentially serious shortcoming, as government entities often act as society's risk managers of last resort. When extreme events happen, what were previously considered private matters can quickly turn into public obligations. Hence, there is a need for proactive assessments of the evolving public risk landscape to discern unpredictable, even unknowable, developments.
    Methods: The article reviews recent empirical studies on public risk management practices, effects of digitalization in public sector institutions, current strategic management research, and insights from a recent study of risk management practices in federal agencies. On this basis, the article explains how the ability to generate value from ERM can be enhanced when local responsive initiatives intertwine with central strategic risk analyses. These can form a dynamic adaptive risk management process in which insights from dispersed actors inform updated risk analyses based on local autonomy and open exchange of information. This approach builds on specific structural features embedded in culture-driven aspirations to generate collaborative solutions. Its functional mode is an interactive control system with open discussions across levels and functions, in contrast to conventional diagnostic controls that monitor predetermined key performance indicators (KPIs) and key risk indicators (KRIs).
    Findings: Backed by theoretical rationales and empirical research evidence, it is found that applications of ERM frameworks can produce positive results but are unable to deal with a public risk landscape characterized by uncertain, unpredictable conditions with potentially extreme outcome effects. It is shown how interactive exchange of fast local insights and slow integrated strategic risk analyses, supported by digitized data processing, can form a dynamic adaptive system that enables public sector institutions to deal with emergent large-scale exposures. It is explained how the requirement for conducive organizational structures and supportive values calls for a new strategic risk leadership approach, which is contrasted with observed practices in federal agencies that are constrained by prevailing public governance requirements.
    Discussion: The need to deal with uncertainty and unknown conditions demands a cognitive shift from a primary focus on risk towards also appraising complexity and preparing for the unexpected, where data-driven methods can uncover emergent exposures through dynamic information processing. This requires strategic risk leaders who recognize the significance of complex public exposures with many unknowns and are willing to facilitate digitalized information processing rooted in a collaborative organizational climate. If handled properly, the adoption of ERM in public risk management can address emergent dimensions of complex public exposures, applying interactive information processing as a dynamic adaptive risk management approach that incorporates digitized methods to solicit collective intelligence for strategic risk updating.

    Categorization of species as native or nonnative using DNA sequence signatures without a complete reference library.

    New genetic diagnostic approaches have greatly aided efforts to document global biodiversity and improve biosecurity. This is especially true for organismal groups in which species diversity has been underestimated historically due to difficulties associated with sampling, the lack of clear morphological characteristics, and/or limited availability of taxonomic expertise. Among these methods, DNA sequence barcoding (also known as "DNA barcoding") and, by extension, meta-barcoding for biological communities has emerged as one of the most frequently utilized methods for DNA-based species identifications. Unfortunately, the use of DNA barcoding is limited by the availability of complete reference libraries (i.e., a collection of DNA sequences from morphologically identified species), and by the fact that the vast majority of species do not have sequences present in reference databases. These limitations are especially critical in tropical locations that are simultaneously biodiversity-rich and under-explored, with little DNA characterization by trained taxonomic specialists. To facilitate efforts to document biodiversity in regions lacking complete reference libraries, we developed a novel statistical approach that categorizes unidentified species as being either likely native or likely nonnative based solely on measures of nucleotide diversity. We demonstrate the utility of this approach by categorizing a large sample of specimens of terrestrial insects and spiders (collected as part of the Moorea BioCode project) using a generalized linear mixed model (GLMM). Using a training data set of known endemic (n = 45) and known introduced species (n = 102), we then estimated the likely native/nonnative status for 4,663 specimens representing an estimated 1,288 species (412 identified species), including specimens that were either unidentified or whose endemic/introduced status was uncertain.
Using this approach, we were able to increase the number of categorized specimens by a factor of 4.4 (from 794 to 3,497) and the number of categorized species by a factor of 4.8 (from 147 to 707), at a rate much greater than chance (77.6% accuracy). The study identifies phylogenetic signatures of both native and nonnative species and suggests several practical applications for this approach, including monitoring biodiversity and facilitating biosecurity.
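The core idea, predicting native/nonnative status from a nucleotide-diversity measure, can be sketched with a simplified model. The study used a GLMM with random effects; the sketch below is a plain fixed-effects logistic regression fitted by gradient descent on synthetic data, and the assumption that nonnative species show lower within-species diversity is illustrative, not taken from the paper.

```python
# Simplified sketch of the paper's idea: a logistic model predicting
# nonnative status (y = 1) from a single nucleotide-diversity score.
# The real study used a generalized linear mixed model (GLMM); this
# fixed-effects version on synthetic data is for illustration only.
import math
import random

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit y ~ sigmoid(b0 + b1*x) by batch gradient descent."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (p - y) / n
            g1 += (p - y) * x / n
        b0 -= lr * g0
        b1 -= lr * g1
    return b0, b1

random.seed(0)
# Synthetic training set mirroring the study's sample sizes (45 endemic,
# 102 introduced); the diversity distributions are invented for the sketch.
native = [(random.gauss(2.0, 0.5), 0) for _ in range(45)]
nonnat = [(random.gauss(0.8, 0.5), 1) for _ in range(102)]
data = native + nonnat
xs, ys = [d[0] for d in data], [d[1] for d in data]
b0, b1 = fit_logistic(xs, ys)

def p_nonnative(x):
    """Predicted probability that a specimen with diversity x is nonnative."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

acc = sum((p_nonnative(x) > 0.5) == bool(y) for x, y in data) / len(data)
print(f"training accuracy: {acc:.2f}")
```

In the real workflow the fitted model would then score the thousands of uncategorized specimens, with a random-effect structure accounting for taxonomic grouping.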

    Severity scoring of manganese health effects for categorical regression

    Characterizing the U-shaped exposure-response relationship for manganese (Mn) is necessary for estimating the risk of adverse health effects from Mn toxicity due to excess or deficiency. Categorical regression has emerged as a powerful tool for exposure-response analysis because of its ability to synthesize relevant information across multiple studies and species into a single integrated analysis of all relevant data. This paper documents the development of a database on Mn toxicity designed to support the application of categorical regression techniques. Specifically, we describe (i) the conduct of a systematic search of the literature on Mn toxicity to gather data appropriate for dose-response assessment; (ii) the establishment of inclusion/exclusion criteria for data to be included in the categorical regression modeling database; (iii) the development of a categorical severity scoring matrix for Mn health effects to permit the inclusion of diverse health outcomes in a single categorical regression analysis using the severity score as the outcome variable; and (iv) the convening of an international expert panel to both review the severity scoring matrix and assign severity scores to health outcomes observed in studies (including case reports, epidemiological investigations, and in vivo experimental studies) selected for inclusion in the categorical regression database. Exposure information including route, concentration, duration, health endpoint(s), and characteristics of the exposed population was abstracted from included studies and stored in a computerized manganese database (MnDB), providing a comprehensive repository of exposure-response information with the ability to support categorical regression modeling of oral exposure data.
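A severity scoring matrix of this kind is, in data terms, an ordinal lookup applied to abstracted study records. The sketch below illustrates the structure only: the endpoint names and scores are hypothetical placeholders, since the actual matrix and assignments were produced by the expert panel.

```python
# Illustrative sketch of a severity scoring matrix as an ordinal lookup table.
# The endpoints and scores here are hypothetical placeholders, NOT the panel's
# actual assignments in the MnDB.
SEVERITY = {
    "no observed effect": 0,
    "subclinical neurochemical change": 1,
    "mild neurobehavioral deficit": 2,
    "overt clinical sign": 3,
}

def score_records(records):
    """Attach the ordinal severity score to each (endpoint, dose) record,
    producing the outcome variable for a categorical regression."""
    return [(endpoint, dose, SEVERITY[endpoint]) for endpoint, dose in records]

scored = score_records([
    ("mild neurobehavioral deficit", 0.5),   # dose in hypothetical units
    ("no observed effect", 0.05),
])
print(scored)
```

The resulting (exposure, severity) pairs are what a categorical regression model would then fit across studies.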

    Gluon Quasiparticles and the Polyakov Loop

    A synthesis of Polyakov loop models of the deconfinement transition and quasiparticle models of gluon plasma thermodynamics leads to a class of models in which gluon quasiparticles move in a non-trivial Polyakov loop background. These models are successful candidates for explaining both critical behavior and the equation of state for the SU(3) gauge theory at temperatures above the deconfinement temperature T_c. Polyakov loop effects are most important at intermediate temperatures from T_c up to roughly 2.5 T_c, while quasiparticle mass effects provide the dominant correction to blackbody behavior at higher temperatures.
    Comment: 6 pages, 7 eps figures, revtex
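The quasiparticle-mass correction to blackbody behavior can be made concrete by evaluating the ideal Bose-gas pressure for massive gluon quasiparticles, p = d/(2π²) ∫ dk k² (-T) ln(1 - e^(-E/T)) with E = √(k² + m²) and d = 16 gluon degrees of freedom, against the massless limit p = d π² T⁴/90. This is a generic textbook formula, not the paper's specific model, and the mass value used below is an illustrative assumption.

```python
# Hedged numerical sketch: pressure of an ideal gas of massive gluon
# quasiparticles (d = 16 for SU(3)) versus the massless blackbody limit
# p = d * pi^2 * T^4 / 90. A nonzero quasiparticle mass suppresses the
# pressure below blackbody. The mass m = T used below is an assumption.
import math

def quasiparticle_pressure(T, m, d=16, kmax_over_T=40.0, n=100000):
    """p = d/(2 pi^2) * Integral dk k^2 * (-T) ln(1 - exp(-E/T)), E = sqrt(k^2+m^2),
    evaluated by the midpoint rule (units with k_B = hbar = c = 1)."""
    kmax = kmax_over_T * T
    dk = kmax / n
    total = 0.0
    for i in range(n):
        k = (i + 0.5) * dk              # midpoint rule avoids the k = 0 endpoint
        E = math.sqrt(k * k + m * m)
        total += k * k * (-T) * math.log(1.0 - math.exp(-E / T)) * dk
    return d / (2.0 * math.pi ** 2) * total

T = 1.0
blackbody = 16 * math.pi ** 2 * T ** 4 / 90.0
p0 = quasiparticle_pressure(T, 0.0)     # massless: should recover blackbody
pm = quasiparticle_pressure(T, 1.0)     # m = T: pressure suppressed
print(p0 / blackbody, pm / blackbody)
```

The m = 0 result reproduces the Stefan-Boltzmann value, while the massive case falls below it, the direction of the correction the abstract attributes to quasiparticle masses at high temperature.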