    Large Scale Baryon Isocurvature Inhomogeneities

    Big bang nucleosynthesis constraints on baryon isocurvature perturbations are determined. A simple model ignoring the effects of the scale of the perturbations is first reviewed. This model is then extended to test the claim that large amplitude perturbations will collapse, forming compact objects and preventing their baryons from contributing to the observed baryon density. It is found that baryon isocurvature perturbations are constrained to provide only a slight increase in the density of baryons in the universe over the standard homogeneous model. In particular, models which rely on power laws and the random phase approximation for the power spectrum are incompatible with big bang nucleosynthesis unless an {\em ad hoc} small-scale cutoff is included. Comment: 11 pages + 8 figures, LaTeX (2.09), postscript figures available via anonymous ftp from oddjob.uchicago.edu:/ftp/ibbn/fig?.ps where ?=1-8 or via email from [email protected], Fermilab-Pub-94/???-A and UMN-TH-1307/9
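    A schematic rendering of the class of models the last conclusion refers to (the notation here is illustrative, not taken from the paper): a power-law spectrum with random phases and an {\em ad hoc} small-scale cutoff k_c can be written as

        P_b(k) = A k^n \Theta(k_c - k),
        \delta_b(\vec{x}) = \sum_{\vec{k}} \sqrt{P_b(k)} \, e^{i(\vec{k} \cdot \vec{x} + \phi_{\vec{k}})},

    with the phases \phi_{\vec{k}} independent and uniform on [0, 2\pi). The abstract's point is that without the cutoff factor \Theta(k_c - k), such spectra conflict with big bang nucleosynthesis.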

    Accelerator Constraints on Neutralino Dark Matter

    The constraints on neutralino dark matter \chi obtained from accelerator searches at LEP, the Fermilab Tevatron and elsewhere are reviewed, with particular emphasis on results from LEP 1.5. Within the context of the minimal supersymmetric extension of the Standard Model, these imply that m_\chi \ge 21.4 GeV if universality is assumed, and yield for large tan\beta a significantly stronger bound than is obtained indirectly from Tevatron limits on the gluino mass. We update this analysis with preliminary results from the first LEP 2W run, and also preview the prospects for future sparticle searches at the LHC. Comment: Presented by J. Ellis at the Workshop on the Identification of Dark Matter, Sheffield, September, 1996. 14 pages; LaTeX; 12 figures

    Exploration of the MSSM with Non-Universal Higgs Masses

    We explore the parameter space of the minimal supersymmetric extension of the Standard Model (MSSM), allowing the soft supersymmetry-breaking masses of the Higgs multiplets, m_{1,2}, to be non-universal (NUHM). Compared with the constrained MSSM (CMSSM), in which m_{1,2} are required to equal the common soft supersymmetry-breaking mass m_0 of the squarks and sleptons, the Higgs mixing parameter mu and the pseudoscalar Higgs mass m_A, which are calculated in the CMSSM, are free in the NUHM model. We incorporate accelerator and dark matter constraints in determining allowed regions of the (mu, m_A), (mu, M_2) and (m_{1/2}, m_0) planes for selected choices of the other NUHM parameters. In the examples studied, we find that the LSP mass cannot be reduced far below its limit in the CMSSM, whereas m_A may be as small as allowed by LEP for large tan \beta. We present in Appendices details of the calculations of neutralino-slepton, chargino-slepton and neutralino-sneutrino coannihilation needed in our exploration of the NUHM. Comment: 92 pages LaTeX, 32 eps figures, final version, some changes to figures pertaining to the b to s gamma constraint
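    A minimal sketch of the scan-and-filter structure of such a plane exploration, assuming two crude stand-in constraints (the LEP chargino bound via the rough approximation that the lighter chargino mass is of order min(|mu|, M_2), and a LEP-style lower bound on m_A); none of the function names or numerical shortcuts below come from the paper, which uses the full spectrum, relic-density and b to s gamma calculations:

        import numpy as np

        M2 = 300.0  # GeV; fixed gaugino mass for this illustrative slice

        def lep_chargino_ok(mu, m_A):
            # Crude proxy: the lighter chargino mass is roughly
            # min(|mu|, M2), and LEP requires it above ~103 GeV.
            return min(abs(mu), M2) > 103.0

        def lep_mA_ok(mu, m_A):
            # Rough stand-in for the LEP pseudoscalar-Higgs bound
            # at large tan(beta).
            return m_A > 90.0

        constraints = [lep_chargino_ok, lep_mA_ok]

        mu_grid = np.linspace(-1000.0, 1000.0, 201)  # GeV
        mA_grid = np.linspace(50.0, 1000.0, 96)      # GeV
        allowed = [(mu, mA) for mu in mu_grid for mA in mA_grid
                   if all(c(mu, mA) for c in constraints)]
        print(f"{len(allowed)} of {mu_grid.size * mA_grid.size} points allowed")

    The real analysis replaces each predicate with a full calculation, but the grid scan over the (mu, m_A) plane with a conjunction of constraints is the same shape.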

    What if the Higgs Boson Weighs 115 GeV?

    If the Higgs boson indeed weighs about 114 to 115 GeV, there must be new physics beyond the Standard Model at some scale \la 10^6 GeV. The most plausible new physics is supersymmetry, which predicts a Higgs boson weighing \la 130 GeV. In the CMSSM with R and CP conservation, the existence, production and detection of a 114 or 115 GeV Higgs boson are possible if \tan\beta \ga 3. However, for the radiatively-corrected Higgs mass to be this large, sparticles should be relatively heavy: m_{1/2} \ga 250 GeV, probably not detectable at the Tevatron collider and perhaps not at a low-energy e^+ e^- linear collider. In much of the remaining CMSSM parameter space, neutralino-stau coannihilation is important for calculating the relic neutralino density, and we explore implications for the elastic neutralino-nucleon scattering cross section. Comment: 17 pages, 5 eps figures
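    The link between a 115 GeV Higgs and heavy sparticles can be made explicit with the standard one-loop leading-logarithm approximation for the lightest MSSM Higgs mass (a rough sketch only; the paper uses full radiative corrections):

        m_h^2 \simeq m_Z^2 \cos^2 2\beta + \frac{3 m_t^4}{4 \pi^2 v^2} \ln\frac{M_S^2}{m_t^2}, \qquad v \simeq 174 GeV,

    where M_S is the stop mass scale. Since m_Z^2 \cos^2 2\beta \le (91 GeV)^2, reaching m_h \simeq 115 GeV requires both \tan\beta sufficiently above 1 (hence the \tan\beta \ga 3 condition) and a sizeable logarithm, i.e. stop masses of several hundred GeV, which is the origin of the m_{1/2} \ga 250 GeV statement.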

    SCRIPTKELL: a tool for measuring cognitive effort and time processing in writing and other complex cognitive activities

    We present SCRIPTKELL, a computer-assisted experimental tool that makes it possible to measure the time and cognitive effort allocated to the subprocesses of writing and other cognitive activities. SCRIPTKELL was designed to make it easy to use and modulate Kellogg's (1986) triple-task procedure, which consists of a combination of three tasks: a writing task (or another task), a reaction time task (auditory signal detection), and a directed retrospection task (after each signal detection during writing). We demonstrate how this tool can be used to address several novel empirical and theoretical issues. In sum, SCRIPTKELL should facilitate the flexible realization of experimental designs and the investigation of critical issues concerning the functional characteristics of complex cognitive activities.
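    A minimal console sketch of the probe-scheduling logic behind Kellogg's triple-task procedure (an illustration only, assuming a terminal session and an invented category list; SCRIPTKELL itself is a full experimental tool whose actual interface is not shown here):

        import random
        import time

        # Hypothetical retrospection categories; SCRIPTKELL lets the
        # experimenter configure such labels.
        CATEGORIES = ["planning", "translating", "reviewing", "other"]

        def run_session(duration_s=120, min_gap_s=15, max_gap_s=45):
            """While the participant writes, sound a probe at random
            intervals, record the reaction time to it, then collect a
            directed retrospection about the ongoing subprocess."""
            records = []
            t_end = time.monotonic() + duration_s
            while True:
                time.sleep(random.uniform(min_gap_s, max_gap_s))
                if time.monotonic() >= t_end:
                    break
                t0 = time.monotonic()
                input("\a*** SIGNAL: press Enter as fast as you can ***")
                rt = time.monotonic() - t0
                label = input(f"What were you just doing? {CATEGORIES}: ")
                records.append({"rt_s": rt, "process": label.strip()})
            return records

        if __name__ == "__main__":
            for record in run_session():
                print(record)

    Reaction times to the probe index momentary cognitive effort, and the retrospection labels attribute that effort to a writing subprocess; that pairing is the core of the triple-task logic.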

    On the Feasibility of a Stop NLSP in Gravitino Dark Matter Scenarios

    We analyze the possibility that the lighter stop {\tilde t_1} could be the next-to-lightest supersymmetric particle (NLSP) in models where the gravitino is the lightest supersymmetric particle (LSP). We do not find any possibility for a stop NLSP in the constrained MSSM with universal input soft supersymmetry-breaking masses at the GUT scale (CMSSM), but do find small allowed regions in models with non-universal Higgs masses (NUHM). We discuss the cosmological evolution of stop hadrons. Most {\tilde t_1}qq `sbaryons' and the corresponding `antisbaryons' annihilate with conventional antibaryons and baryons into {\tilde t_1}{\bar q} `mesinos' and the corresponding `antimesinos', respectively, shortly after the quark-hadron transition in the early Universe, and most mesinos and antimesinos subsequently annihilate. As a result, insufficient metastable charged stop hadrons survive to alter Big Bang nucleosynthesis. Comment: 31 pages, 14 figures
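    Schematically (an illustrative rendering of the chain just described, with \bar N a conventional antibaryon and {\bar{\tilde t}_1} the antistop; the precise final states are discussed in the paper):

        ({\tilde t_1} q q) + \bar N \to ({\tilde t_1} {\bar q}) + mesons, \qquad ({\tilde t_1} {\bar q}) + ({\bar{\tilde t}_1} q) \to hadrons,

    so that both steps progressively deplete the population of metastable charged stop hadrons in the early Universe.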

    Orthogonal Decomposition of Some Affine Lie Algebras in Terms of their Heisenberg Subalgebras

    In the present note we suggest an affinization of a theorem by Kostrikin et al. about the decomposition of some complex simple Lie algebras {\cal G} into the algebraic sum of pairwise orthogonal Cartan subalgebras. We point out that the untwisted affine Kac-Moody algebras of types A_{p^m-1} (p prime, m \geq 1), B_r, C_{2^m}, D_r, G_2, E_7, E_8 can be decomposed into the algebraic sum of pairwise orthogonal Heisenberg subalgebras. The A_{p^m-1} and G_2 cases are discussed in great detail. Some possible applications of such decompositions are also discussed. Comment: 16 pages, LaTeX, no figures
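    For orientation, the finite-dimensional prototype of such a decomposition (the p = 2, m = 1 instance of type A_{p^m-1}, recalled here as a standard example rather than a computation from the note) is

        sl_2(\mathbb{C}) = \mathbb{C} H \oplus \mathbb{C}(E+F) \oplus \mathbb{C}(E-F),

    where H, E, F is the standard basis: each summand is a Cartan subalgebra, and the three are pairwise orthogonal with respect to the Killing form K(x,y) = 4 tr(xy), since tr(H(E \pm F)) = 0 and tr((E+F)(E-F)) = tr(FE) - tr(EF) = 0. The affinization replaces such Cartan subalgebras by Heisenberg subalgebras of the corresponding Kac-Moody algebra.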

    A Bayesian Estimate of the Primordial Helium Abundance

    We introduce a new statistical method to estimate the primordial helium abundance, Y_p, from observed abundances in a sample of galaxies which have experienced stellar helium enrichment. Rather than using linear regression on metal abundance, we construct a likelihood function using a Bayesian prior, where the key assumption is that the true helium abundance must always exceed the primordial value. Using a sample of measurements compiled from the literature, we find estimates of Y_p between 0.221 and 0.236, depending on the specific subsample and prior adopted, consistent with previous estimates either from a linear extrapolation of the helium abundance with respect to metallicity, or from the helium abundance of the lowest-metallicity HII region, I Zw 18. We also find an upper limit which is insensitive to the specific subsample or prior, and estimate a model-independent bound Y_p < 0.243 at 95% confidence, favoring a low cosmic baryon density and a high primordial deuterium abundance. The main uncertainty is not the model of stellar enrichment but possible common systematic biases in the estimate of Y in each individual HII region. Comment: 14 pages, LaTeX, 3 ps figures
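    A minimal numerical sketch of this likelihood construction, assuming each galaxy's true abundance is uniformly distributed on [Y_p, Y_max] (so that enrichment can only raise Y above the primordial value) and observed with Gaussian errors; the data values, the choice of Y_max, and the function names are hypothetical, not the paper's:

        import numpy as np
        from scipy.special import erf

        # Hypothetical observed helium abundances and 1-sigma errors.
        Y_obs = np.array([0.234, 0.240, 0.228, 0.245])
        sigma = np.array([0.005, 0.004, 0.006, 0.005])

        def log_likelihood(Yp, Y_obs, sigma, Y_max=0.30):
            """Marginalize each galaxy's true abundance over a uniform
            prior on [Yp, Y_max]; the Gaussian integral reduces to a
            difference of error functions."""
            if Yp >= Y_max:
                return -np.inf
            a = (Y_obs - Yp) / (np.sqrt(2.0) * sigma)
            b = (Y_obs - Y_max) / (np.sqrt(2.0) * sigma)
            per_galaxy = (erf(a) - erf(b)) / (2.0 * (Y_max - Yp))
            return np.sum(np.log(per_galaxy))

        # Profile the likelihood over candidate primordial values.
        grid = np.linspace(0.20, 0.25, 501)
        logL = np.array([log_likelihood(yp, Y_obs, sigma) for yp in grid])
        print("Maximum-likelihood Y_p:", grid[np.argmax(logL)])

    The truncation at Y = Y_p encodes the key prior assumption, and it makes the lowest-abundance objects dominate the estimate, as the abstract's comparison with the lowest-metallicity HII region suggests.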