
    New Constraints (and Motivations) for Abelian Gauge Bosons in the MeV-TeV Mass Range

    We survey the phenomenological constraints on abelian gauge bosons having masses in the MeV to multi-GeV range (using precision electroweak measurements, neutrino-electron and neutrino-nucleon scattering, electron and muon anomalous magnetic moments, upsilon decay, beam dump experiments, atomic parity violation, low-energy neutron scattering and primordial nucleosynthesis). We compute their implications for the three parameters that in general describe the low-energy properties of such bosons: their mass and their two possible types of dimensionless couplings (direct couplings to ordinary fermions and kinetic mixing with Standard Model hypercharge). We argue that gauge bosons with very small couplings to ordinary fermions in this mass range are natural in string compactifications and are likely to be generic in theories for which the gravity scale is systematically smaller than the Planck mass, such as in extra-dimensional models, because of the necessity to suppress proton decay. Furthermore, because these couplings are weak, in the low-energy theory relevant to experiments at and below TeV scales the charge gauged by the new boson can appear to be broken, both by classical effects and by anomalies. In particular, if the new gauge charge appears to be anomalous, anomaly cancellation does not also require the introduction of new light fermions in the low-energy theory. Moreover, the charge can appear to be conserved in the low-energy theory, despite the corresponding gauge boson having a mass. Our results reduce to those of other authors in the special cases where there is no kinetic mixing or there is no direct coupling to ordinary fermions, such as for recently proposed dark-matter scenarios.
    Comment: 49 pages + appendix, 21 figures. This is the final version, which appears in JHEP.
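    The three-parameter setup described in the abstract can be written schematically as a Lagrangian; the symbol choices below (chi for the kinetic mixing, g_X and charges q_f for the direct coupling) are illustrative conventions, not necessarily the paper's notation:

    ```latex
    \mathcal{L} \supset
      -\tfrac{1}{4}\, X_{\mu\nu} X^{\mu\nu}
      - \tfrac{\chi}{2}\, X_{\mu\nu} B^{\mu\nu}
      + \tfrac{1}{2}\, m_X^2\, X_\mu X^\mu
      + g_X\, X_\mu \sum_f q_f\, \bar{f} \gamma^\mu f
    ```

    Here $B^{\mu\nu}$ is the Standard Model hypercharge field strength; the low-energy phenomenology is then governed by the mass $m_X$ together with the two dimensionless couplings $\chi$ and $g_X$.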

    Establishing a proactive safety and health risk management system in the fire service

    BACKGROUND: Formalized risk management (RM) is an internationally accepted process for reducing hazards in the workplace, with defined steps including hazard scoping, risk assessment, and implementation of controls, all within an iterative process. While required for all industries in the European Union and widely used elsewhere, the United States maintains a compliance-based regulatory structure rather than one based on systematic, risk-based methodologies. Firefighting is a hazardous profession, with high injury, illness, and fatality rates compared with other occupations, and implementation of RM programs has the potential to greatly improve firefighter safety and health; however, no descriptions of RM implementation in the North American fire service exist in the peer-reviewed literature. METHODS: In this paper we describe the steps used to design and implement the RM process in a moderately sized fire department, with particular focus on prioritizing and managing injury hazards during patient transport, fireground, and physical exercise procedures. Hazard scoping and formalized risk assessments are described, in addition to the identification of participatory-led injury control strategies. Process evaluation methods were conducted primarily to assess the feasibility of voluntarily instituting the RM approach within the fire service setting. RESULTS: The RM process was well accepted by the fire department and led to the development of 45 hazard-specific interventions. Qualitative data documenting the implementation of the RM process revealed that participants emphasized the value of the RM process (especially the participatory, bottom-up approach), its usefulness for breaking down tasks to identify potential risks, and its potential for reducing firefighter injury. CONCLUSIONS: As implemented, this risk-based approach to identifying and managing occupational hazards and risks was successful and is deemed feasible for U.S. (and other) fire services.
    While several barriers and challenges exist in the implementation of any intervention such as this, recommendations for adopting the process are provided. Additional work will be performed to determine the effectiveness of selected control strategies that were implemented; however, participants throughout the organizational structure perceived the RM process to be of high utility, and researchers found that the process improved awareness of and engagement in actively enhancing worker safety and health.
    This item is part of the UA Faculty Publications collection. For more information about this item or other items in the UA Campus Repository, contact the University of Arizona Libraries at [email protected].
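    The risk-assessment step of an RM process like the one described above is commonly operationalized as a likelihood-severity scoring matrix. The sketch below is a generic illustration of that idea only; the rating scales, band names, and thresholds are hypothetical and are not the department's actual tool:

    ```python
    # Generic risk-scoring sketch: risk = likelihood x severity,
    # each rated 1-5, then banded for prioritization of controls.
    # All scales and cutoffs here are illustrative assumptions.

    def risk_score(likelihood: int, severity: int) -> int:
        """Multiply 1-5 likelihood and severity ratings into a 1-25 score."""
        assert 1 <= likelihood <= 5 and 1 <= severity <= 5
        return likelihood * severity

    def risk_band(score: int) -> str:
        """Map a score to a priority band for the iterative RM cycle."""
        if score >= 15:
            return "high"    # act immediately; engineer controls
        if score >= 8:
            return "medium"  # schedule controls; monitor
        return "low"         # manage by routine procedure

    # e.g. a frequent, moderately severe lifting hazard during patient transport
    print(risk_band(risk_score(4, 3)))  # prints "medium"
    ```

    In practice the participatory, bottom-up element of the RM process lies in who assigns these ratings, not in the arithmetic itself.
    
    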

    Functional analyses of glycyl-tRNA synthetase mutations suggest a key role for tRNA-charging enzymes in peripheral axons

    Charcot-Marie-Tooth disease type 2D (CMT2D) and distal spinal muscular atrophy type V (dSMA-V) are axonal neuropathies characterized by a phenotype that is more severe in the upper extremities. We previously implicated mutations in the gene encoding glycyl-tRNA synthetase (GARS) as the cause of CMT2D and dSMA-V. GARS is a member of the family of aminoacyl-tRNA synthetases responsible for charging tRNA with cognate amino acids; GARS ligates glycine to tRNA(Gly). Here, we present functional analyses of disease-associated GARS mutations and show that there are no significant mutation-associated changes in GARS expression levels; that the majority of identified GARS mutations modeled in yeast severely impair viability; and that, in most cases, mutant GARS protein mislocalizes in neuronal cells. Indeed, four of the five mutations studied show loss-of-function features in at least one assay, suggesting that tRNA-charging deficits play a role in disease pathogenesis. Finally, we detected endogenous GARS-associated granules in the neurite projections of cultured neurons and in the peripheral nerve axons of normal human tissue. These data are particularly important in light of the recent identification of CMT-associated mutations in another tRNA synthetase gene (YARS, the tyrosyl-tRNA synthetase gene). Together, these findings suggest that tRNA-charging enzymes play a key role in maintaining peripheral axons.

    From Rational Bubbles to Crashes

    We study and generalize in various ways the model of rational expectation (RE) bubbles introduced by Blanchard and Watson in the economic literature. First, bubbles are argued to be the equivalent of Goldstone modes of the fundamental rational pricing equation, associated with the symmetry breaking introduced by non-vanishing dividends. Generalizing bubbles in terms of multiplicative stochastic maps, we summarize the result of Lux and Sornette that the no-arbitrage condition imposes that the tail of the return distribution is hyperbolic, with an exponent mu<1. We then extend the RE bubble model to arbitrary dimensions d and, using renewal theory for products of random matrices applied to stochastic recurrence equations, we extend the theorem of Lux and Sornette to demonstrate that the tails of the unconditional distributions follow power laws, with the same asymptotic tail exponent mu<1 for all assets. Two extensions of the RE bubble model (the crash hazard rate model and the non-stationary growth rate model) provide ways of reconciling it with the stylized facts of financial data. The latter model allows for an understanding of the breakdown of the fundamental valuation formula as deeply associated with a spontaneous breaking of the price symmetry. Its implementation for multi-dimensional bubbles explains why the tail index mu seems to be the same for any group of assets, as observed empirically. This work calls for the introduction of a generalized field theory that would be able to capture the spontaneous breaking of symmetry, recover the fundamental valuation formula in the normal economic case, and extend it to the still unexplored regime where the economic growth rate is larger than the discount growth rate.
    Comment: LaTeX, 27 pages with 3 eps figures.
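    The power-law tail quoted above is, in the one-dimensional case, the classic Kesten-type result for stochastic recurrence equations; schematically (notation ours, not necessarily the paper's):

    ```latex
    X_{t+1} = a_t X_t + b_t , \qquad
    \mathbb{P}(X > x) \sim C\, x^{-\mu} \quad (x \to \infty),
    \qquad \text{where } \mathbb{E}\!\left[\,|a_t|^{\mu}\,\right] = 1 .
    ```

    The matrix-valued generalization of this condition is what yields the common tail exponent across all assets in the multi-dimensional extension, and the no-arbitrage condition of the bubble model is what forces $\mu < 1$.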

    Dynamical Casimir Effect with Semi-Transparent Mirrors, and Cosmology

    After reviewing some essential features of the Casimir effect and, specifically, of its regularization by zeta-function and Hadamard methods, we consider the dynamical Casimir effect (or Fulling-Davies theory), where related regularization problems appear, with a view to an experimental verification of this theory. We finish with a discussion of the possible contribution of vacuum fluctuations to dark energy, in a Casimir-like fashion, that might involve the dynamical version.
    Comment: 11 pages. Talk given at the workshop "Quantum Field Theory under the Influence of External Conditions (QFEXT07)", Leipzig, Germany, September 17-21, 2007.
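    The zeta-function regularization mentioned above assigns a finite value to the formally divergent vacuum mode sum; as a standard textbook illustration (not specific to this talk), for two perfectly conducting parallel plates at separation $a$:

    ```latex
    E = \frac{\hbar}{2} \sum_n \omega_n
      \;\longrightarrow\;
    E(s) = \frac{\hbar}{2} \sum_n \omega_n^{-s} \Big|_{s \to -1} ,
    \qquad
    \frac{E_{\mathrm{Cas}}}{A} = -\frac{\pi^2 \hbar c}{720\, a^3} .
    ```

    The semi-transparent mirrors discussed in the paper modify this ideal-conductor value, and the dynamical case (moving boundaries) introduces the additional regularization problems the talk reviews.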

    Non-treatment of children with community health worker-diagnosed fast-breathing pneumonia in rural Malawi: exploratory subanalysis of a prospective cohort study

    BACKGROUND: Despite recent progress, pneumonia remains the largest infectious killer of children globally. This paper describes the effect of non-treatment of community-diagnosed fast-breathing pneumonia on patient recovery. METHODS: We conducted an exploratory subanalysis of an observational prospective cohort study in Malawi. We recruited children (2-59 months) diagnosed by community health workers with fast-breathing pneumonia using WHO integrated community case management (iCCM) guidelines. Children were followed up at days 5 and 14 with a clinical assessment of recovery. We conducted bivariate and multivariable logistic regression for the association between treatment of fast-breathing pneumonia and recovery, adjusting for potential confounders. RESULTS: We followed up 847 children, of whom 78 (9%) had not been given antibiotics (non-treatment). Non-treatment cases had higher baseline rates of diarrhoea, non-severe hypoxaemia and fever. Non-recovery (persistence or worsening of symptoms) at day 5 was 13% in children who received co-trimoxazole and 23% in those who did not. When non-recovery was defined as worsening of symptoms only, it was 7% at day 5 in treatment cases and 10% in non-treatment cases. For both definitions, combined co-trimoxazole and lumefantrine-artemether (LA) treatment trended towards protection (adjusted OR (aOR) 0.28, 95% CI 0.12 to 0.68, and aOR 0.29, 95% CI 0.08 to 1.01, respectively). CONCLUSION: We found that children who did not receive co-trimoxazole treatment had worse clinical outcomes; malaria co-diagnosis and treatment also play a significant role in non-recovery. Further research into non-treatment of fast-breathing pneumonia, using a pragmatic approach with consideration for malaria co-diagnosis and HIV status, is needed to guide refinement of community treatment algorithms in this region.
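    As a numeric illustration of the kind of association reported above, an unadjusted odds ratio with a Wald confidence interval can be computed directly from a 2x2 table. The counts below are hypothetical, chosen only to be roughly consistent with the quoted 13%/23% day-5 non-recovery rates; the study's adjusted ORs come from multivariable logistic regression, not from this simple calculation:

    ```python
    from math import log, sqrt, exp

    def odds_ratio(a, b, c, d):
        """Unadjusted odds ratio for a 2x2 table:
        a = non-recovered & untreated, b = recovered & untreated,
        c = non-recovered & treated,   d = recovered & treated."""
        return (a * d) / (b * c)

    def or_ci_95(a, b, c, d):
        """Wald 95% CI, computed on the log-odds scale."""
        se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        point = log(odds_ratio(a, b, c, d))
        return exp(point - 1.96 * se), exp(point + 1.96 * se)

    # Hypothetical counts (NOT the study's data): 18 of 78 untreated
    # and 100 of 769 treated children not recovered at day 5.
    print(round(odds_ratio(18, 60, 100, 669), 2))  # prints 2.01
    ```

    A confidence interval that crosses 1 (as in the aOR 0.29, 95% CI 0.08 to 1.01 result) is what the abstract's cautious "trended towards protection" wording reflects.
    
    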

    Toxicity of dietary methylmercury to fish: Derivation of ecologically meaningful threshold concentrations

    Threshold concentrations associated with adverse effects of dietary exposure to methylmercury (MeHg) were derived from published results of laboratory studies on a variety of fish species. Adverse effects related to mortality were uncommon, whereas adverse effects on growth occurred only at dietary MeHg concentrations exceeding 2.5 ”g/g wet weight. Adverse effects on fish behavior had a wide range of effective dietary concentrations, but generally occurred above 0.5 ”g/g wet weight. In contrast, effects on reproduction and other subclinical endpoints occurred at much lower dietary concentrations (<0.2 ”g/g wet weight). Field studies generally lack information on dietary MeHg exposure, yet available data indicate that comparable adverse effects have been observed in wild fish in environments corresponding to high and low MeHg contamination of food webs, in agreement with the threshold concentrations derived here from laboratory studies. These thresholds indicate that, while differences in species sensitivity to MeHg exposure appear considerable, chronic dietary exposure to low concentrations of MeHg may have significant adverse effects on wild fish populations; such effects remain little studied in fish compared with mammals or birds. Environ. Toxicol. Chem. 2012; 31: 1536–1547. © 2012 SETAC.
    • 

    corecore