
    Relic density calculations beyond tree-level, exact calculations versus effective couplings: the ZZ final state

    The inferred value of the relic density from cosmological observations has reached a precision akin to that of the LEP precision measurements. This level of precision calls for an evaluation of the dark matter annihilation cross sections that goes beyond the tree-level calculations currently implemented in all codes for the computation of the relic density. In supersymmetry, radiative corrections are known to be large and thus must be implemented. Full one-loop radiative corrections for many annihilation processes have been performed. It is important to investigate whether the bulk of these corrections can be parameterised through an improved Born approximation that can be implemented as a selection of form factors in a tree-level code. This paper is the second in a series that addresses this issue. After having provided these form factors for the annihilation of neutralinos into fermions, which covers the case of a bino-like LSP (Lightest Supersymmetric Particle), we turn our attention here to a higgsino-like dark matter candidate through its annihilation into ZZ. We also investigate the case of a mixed LSP. In all cases we compare the performance of the form-factor approach with the result of a full one-loop correction. We also study the issue of renormalisation-scheme dependence. An illustration of the phenomenon of non-decoupling of heavy sfermions that takes place in the annihilation of the lightest neutralino into ZZ is also presented.
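    The idea of an improved Born approximation can be sketched schematically: instead of recomputing the annihilation process at one loop, the tree-level cross section is rescaled by a form factor that absorbs the bulk of the correction into an effective coupling. The sketch below is a toy illustration, not the papers' implementation; the names `sigma_tree` and `delta_ff` are hypothetical placeholders.

```python
# Toy sketch of an "improved Born approximation": absorb the bulk of the
# one-loop correction into an effective coupling g_eff = g_tree * (1 + delta),
# so the cross section picks up a multiplicative form factor.
# delta_ff is a hypothetical placeholder, not a value from the papers.

def improved_born(sigma_tree: float, delta_ff: float) -> float:
    """Tree-level annihilation cross section rescaled by a form factor."""
    # sigma_eff = |g_eff / g_tree|**2 * sigma_tree = (1 + delta)**2 * sigma_tree
    return (1.0 + delta_ff) ** 2 * sigma_tree


def relative_shift_percent(sigma_tree: float, sigma_loop: float) -> float:
    """Per-cent shift of a corrected cross section relative to tree level,
    useful for comparing the form-factor result against the full one-loop one."""
    return 100.0 * (sigma_loop - sigma_tree) / sigma_tree
```

    Comparing `relative_shift_percent` for the form-factor result and for the full one-loop result is the kind of check the papers perform when judging whether the approximation captures the bulk of the correction.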

    One-loop corrections, uncertainties and approximations in neutralino annihilations: Examples

    The extracted value of the relic density has reached the few per-cent level of precision. One can therefore no longer be content with calculations of this observable in which the annihilation processes are computed at tree level, especially in supersymmetry, where radiative corrections are usually large. Implementing full one-loop corrections to all the annihilation processes that would be needed in a scan over parameters is a daunting task. On the other hand, one may ask whether the bulk of the corrections can be taken into account through effective couplings of the neutralino that improve the tree-level calculation and would be easy to implement. We address this issue by concentrating, in this first study, on the neutralino coupling to i) fermions and sfermions and ii) the Z. After constructing the effective couplings, we compare their performance with that of the full one-loop calculation and comment on the failures and successes of the approach. As a bonus, we point out that large non-decoupling effects of heavy sfermions could in principle be measured in the annihilation process, a point of interest in view of the latest limits on squark masses from the LHC. We also comment on the scheme dependence of the one-loop corrected results.

    Positive Feedback Trading: Google Trends and Feeder Cattle Futures

    What do investors’ searches for public information reveal about their subsequent trading strategies? Does their search for information support the hypothesis of market efficiency, or does it lend support to the idea that investors have behavioral biases? Using Google Trends, we find that the volume of Google searches about feeder cattle is associated with reinforcement of momentum trading in a manner consistent with a positive feedback mechanism. Further, we find evidence that search volume for “cattle” is associated with higher volatility and thus amplifies the positive feedback trading mechanism, while search volume for “corn”, a major input to cattle production, is associated with a reduction in volatility.
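    The kind of association the study tests, between search volume and realized volatility, can be sketched with standard-library tools: a rolling standard deviation of returns as a volatility proxy, and a Pearson correlation against a search-volume series. This is a minimal illustration of the approach, not the paper's methodology; any data fed to it here would be toy data, not the Google Trends or futures series used in the study.

```python
from statistics import pstdev

def realized_vol(prices, window=3):
    """Rolling population std-dev of simple returns, a common volatility proxy."""
    returns = [b / a - 1.0 for a, b in zip(prices, prices[1:])]
    return [pstdev(returns[i:i + window]) for i in range(len(returns) - window + 1)]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```

    Correlating `realized_vol(prices)` with an aligned search-volume series gives a first-pass measure of whether heavier searching coincides with a more volatile market.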

    Measuring Up: Assessment In Microeconomics

    Appropriate assessment is of major importance for universities today. Many faculty perceive assessment as already occurring through grade assignment. This paper investigates grades versus knowledge of learning objectives as forms of assessment. By analyzing the relationship between examination questions and post-test comprehension of learning objectives in Principles of Microeconomics, this study tests for differences in the proportions of correct responses from the two evaluation methods. For some learning objectives there are statistically significant differences between the two proportions, but insignificant differences for the others. These mixed results demonstrate that exam questions and learning-objective post-test questions are not necessarily equal measures of student learning.
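    A difference-of-proportions test of the kind the abstract describes can be sketched with a standard two-proportion z-test. This is an illustration of the generic technique, not necessarily the exact test the study ran, and the counts passed to it here are invented.

```python
from math import erf, sqrt

def two_proportion_z_test(x1, n1, x2, n2):
    """z statistic and two-sided p-value for H0: p1 == p2 (pooled estimate).

    x1/n1: correct responses on exam questions; x2/n2: correct responses on
    the learning-objective post-test (all counts here are hypothetical).
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1.0 - pooled) * (1.0 / n1 + 1.0 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF, via math.erf.
    p_value = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))
    return z, p_value
```

    For identical proportions the test returns z = 0 and p = 1; a 60% vs 40% split on 100 attempts each is significant at the 1% level, matching the "statistically different for some objectives, not for others" pattern the study reports.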

    SUSY Higgs searches: beyond the MSSM

    The recent results from the ATLAS and CMS collaborations show that the allowed range for a Standard Model Higgs boson is now restricted to a very thin region. Although these limits are presented exclusively within the framework of the SM, the searches themselves remain sensitive to other Higgs models. We recast the limits within a generic supersymmetric framework that goes beyond the usual minimal extension. Such a generic model can be parameterised through a supersymmetric effective Lagrangian with higher-order operators appearing in the Kähler potential and the superpotential, an approach whose first motivation is to alleviate the fine-tuning problem in supersymmetry, with the most dramatic consequence being a substantial increase in the mass of the lightest Higgs boson compared to the minimal supersymmetric model. We investigate in this paper the constraints set by the LHC on such models. We also investigate how the present picture will change as more luminosity is gathered. We discuss how data from the LHC searches dedicated to the Standard Model Higgs can be combined and exploited for such supersymmetry-inspired scenarios. We also discuss the impact of invisible decays of the Higgs in such scenarios.

    Online, Instructional Television And Traditional Delivery: Student Characteristics And Success Factors In Business Statistics

    Distance education has surged in recent years, while research on student characteristics and factors leading to successful outcomes has not kept pace. This study examined characteristics of regional university students in undergraduate Business Statistics and factors linked to their success across three modes of delivery - Online, Instructional Television (ITV), and Traditional classroom. The three groups were found to have similar GPAs prior to taking their statistics courses. Online students were more likely to be repeating the course, to have earned more credit hours prior to enrolling, and to be significantly older. Ordinary Least Squares regression identified GPA and % absences (an effort proxy) as highly significant predictors of course performance. Academic advisors are encouraged to suggest a traditional format to students who are repeating the course and to caution students that previous online coursework may produce expectations that are not appropriate for online courses in statistics.
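    The OLS fit the study describes, course performance regressed on GPA and % absences, can be sketched from first principles via the normal equations. The students and coefficients below are invented for illustration; only the model form (intercept + GPA + absences) comes from the abstract.

```python
def ols_fit(X, y):
    """Least-squares coefficients for y ~ X via the normal equations
    (X^T X) beta = X^T y, solved by Gauss-Jordan elimination.
    Each row of X should start with a 1 for the intercept."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)]
         + [sum(r[i] * yi for r, yi in zip(X, y))]
         for i in range(k)]
    for col in range(k):
        pivot = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        for r in range(k):
            if r != col:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return [A[i][k] / A[i][i] for i in range(k)]

# Hypothetical students: rows are [1, GPA, % absences] -> course score.
X = [[1, 2.0, 0], [1, 3.0, 10], [1, 4.0, 5], [1, 3.5, 20], [1, 2.5, 15]]
y = [70.0, 75.0, 87.5, 75.0, 67.5]  # generated as 50 + 10*GPA - 0.5*absences
beta = ols_fit(X, y)  # recovers [50, 10, -0.5]
```

    The negative coefficient on absences mirrors the study's finding that absences (as an effort proxy) predict weaker course performance.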

    Optimum design and control of amine scrubbing in response to electricity and CO2 prices

    This paper presents steady-state and dynamic modelling of post-combustion CO2 capture using 30 wt% MEA, integrated with models of CO2 compression and the steam power cycle. It uses multivariable optimization tools to maximize the hourly profit of a 100 MWe coal-fired power plant. Steady-state optimization for design provided the optimum lean loading and CO2 removal as a function of the price ratio (CO2 price/electricity price). The results indicated that for a price ratio between 2.1 and 7, the plant should be designed for removal between 70% and 98% and lean loading in the range 0.22–0.25. Dynamic optimization determined the operation of the capture system in response to two partial-load scenarios (reboiler steam load reduction and power plant boiler load reduction) and provided optimum set points for the steam rate, solvent circulation rate, and stripper pressure control loops. Maximum profit is maintained by allowing the stripper pressure to drop and implementing a ratio control between the solvent and steam rates (and the flue gas rate for partial boiler load operation).
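    The two quantities at the heart of this formulation, the design price ratio and the ratio control between solvent and steam, are simple enough to sketch. The functions and numbers below are illustrative placeholders; the paper's actual profit objective also includes terms (e.g. capture energy penalties) not modelled here.

```python
def price_ratio(co2_price: float, elec_price: float) -> float:
    """Design parameter from the abstract: CO2 price divided by electricity price."""
    return co2_price / elec_price

def solvent_setpoint(steam_rate: float, solvent_steam_ratio: float) -> float:
    """Ratio control: the solvent circulation setpoint tracks the steam rate,
    so a partial-load cut in steam is met by a proportional solvent cut."""
    return solvent_steam_ratio * steam_rate

def hourly_profit(power_mwh: float, elec_price: float,
                  co2_emitted_t: float, co2_price: float) -> float:
    """Simplified hourly profit: electricity revenue minus CO2 emission cost."""
    return power_mwh * elec_price - co2_emitted_t * co2_price
```

    Under ratio control, halving the steam rate halves the solvent setpoint automatically, which is how the optimum is tracked during the partial-load scenarios without re-optimizing each loop.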

    Monte Carlo direct simulation technique user's manual

    User manual for the Monte Carlo direct simulation technique.

    Quantification of sediment-water interactions in a polluted tropical river through biogeochemical modeling

    Diagenetic modeling presents an interesting and robust way to understand sediment-water column processes. Here we present the application of such a model to the Day River in Northern Vietnam, a system that is subject to high levels of domestic wastewater inputs from the Hanoi metropolitan area. Experimental data from three areas of differing water and sediment quality, combined with some additional data from the river, are used to set up and calibrate a diagenetic model. The model was used to determine the role of the sediments as a sink for carbon and nutrients and shows that in the dry season, 27% of nitrogen, 25% of carbon, and 38% of phosphorus inputs into the river system are stored in the sediments. The corresponding numbers during the rainy season are 15%, 10%, and 20%, respectively. The diagenetic model was then used to test the impact of an improvement in the treatment of Hanoi's municipal wastewater. We show that improved wastewater treatment could reduce the load of organic matter to the sediment by about 17.5%. These results are the first to highlight the importance of sediments as a potential removal mechanism for organic matter and nutrients from the water column in this type of highly impacted tropical urban river, further demonstrating that rivers need to be considered as reaction sites and not just as inert conduits.
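    The mass-balance bookkeeping behind the reported percentages can be written down directly: retention is the stored mass divided by the input load, and the treatment scenario scales the load by a reduction fraction. The loads used below are hypothetical; only the percentages come from the abstract.

```python
def retention_percent(input_load: float, stored_in_sediment: float) -> float:
    """Per-cent of a riverine input load retained in the sediment
    (the dry-season figures above are 27% N, 25% C, 38% P)."""
    return 100.0 * stored_in_sediment / input_load

def reduced_sediment_load(current_load: float, reduction_fraction: float) -> float:
    """Organic-matter load to the sediment after improved wastewater
    treatment, e.g. reduction_fraction = 0.175 for the ~17.5% scenario."""
    return current_load * (1.0 - reduction_fraction)
```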