    Observationally Determining the Properties of Dark Matter

    Determining the properties of the dark components of the universe remains one of the outstanding challenges in cosmology. We explore how upcoming CMB anisotropy measurements, galaxy power spectrum data, and supernova (SN) distance measurements can observationally constrain their gravitational properties with minimal theoretical assumptions. SN observations currently suggest the existence of dark matter with an exotic equation of state p/rho < -1/3 that accelerates the expansion of the universe. When combined with CMB anisotropy measurements, SN or galaxy survey data can in principle determine the equation of state and density of this component separately, regardless of their values, as long as the universe is spatially flat. Combining these pairs creates a sharp consistency check. If p/rho > -1/2, the clustering behavior (sound speed) of the dark component can be determined, testing the scalar-field "quintessence" hypothesis. If the exotic matter turns out instead to be simply a cosmological constant (p/rho = -1), the combination of CMB and galaxy survey data should provide a significant detection of the remaining dark matter, the neutrino background radiation (NBR). The gross effect of its density or temperature on the expansion rate is ill-constrained, as it can be mimicked by a change in the matter density. However, anisotropies of the NBR break this degeneracy and should be detectable by upcoming experiments. Comment: 16 pages, 10 figures, RevTeX, submitted to PR

    Weighing Neutrinos with Galaxy Surveys

    We show that galaxy redshift surveys sensitively probe the neutrino mass, with eV-mass neutrinos suppressing power by a factor of two. The Sloan Digital Sky Survey can potentially detect N nearly degenerate massive neutrino species with mass m_nu > 0.65 (Omega_m h^2/0.1 N)^{0.8} eV at better than 2 sigma once microwave background experiments measure two other cosmological parameters. Significant overlap exists between this region and that implied by the LSND experiment, and even m_nu ~ 0.01-0.1 eV, as implied by the atmospheric anomaly, can affect cosmological measurements. Comment: 4 pages, RevTeX, 3 ps figures included, version accepted by PRL; caveats added about scale-dependent bias; additional color delta m^2 plot available at http://www.sns.ias.edu/~wh
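
    For orientation, the quoted detection threshold can be evaluated numerically. The sketch below simply plugs illustrative parameter values (assumptions chosen for this example, not taken from the paper) into the bound m_nu > 0.65 (Omega_m h^2/0.1 N)^{0.8} eV:

        # Evaluate the quoted 2-sigma detection threshold at illustrative values.
        # The parameter choices below are assumptions for orientation only.
        for omega_m_h2 in (0.10, 0.15):
            for n in (1, 2, 3):  # number of nearly degenerate massive species
                m_min = 0.65 * (omega_m_h2 / (0.1 * n)) ** 0.8
                print(f"Omega_m h^2 = {omega_m_h2:.2f}, N = {n}: m_nu > {m_min:.2f} eV")

    For Omega_m h^2 = 0.1 and a single massive species this gives m_nu > 0.65 eV, the headline figure quoted above; splitting the mass among more nearly degenerate species lowers the per-species threshold.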

    ENIGMA and global neuroscience: A decade of large-scale studies of the brain in health and disease across more than 40 countries

    This review summarizes the last decade of work by the ENIGMA (Enhancing NeuroImaging Genetics through Meta Analysis) Consortium, a global alliance of over 1400 scientists across 43 countries studying the human brain in health and disease. Building on large-scale genetic studies that discovered the first robustly replicated genetic loci associated with brain metrics, ENIGMA has diversified into over 50 working groups (WGs), pooling worldwide data and expertise to answer fundamental questions in neuroscience, psychiatry, neurology, and genetics. Most ENIGMA WGs focus on specific psychiatric and neurological conditions; other WGs study normal variation due to sex and gender differences, or development and aging; still other WGs develop methodological pipelines and tools to facilitate harmonized analyses of "big data" (i.e., genetic and epigenetic data, multimodal MRI, and electroencephalography data). These international efforts have yielded the largest neuroimaging studies to date in schizophrenia, bipolar disorder, major depressive disorder, post-traumatic stress disorder, substance use disorders, obsessive-compulsive disorder, attention-deficit/hyperactivity disorder, autism spectrum disorders, epilepsy, and 22q11.2 deletion syndrome. More recent ENIGMA WGs have formed to study anxiety disorders, suicidal thoughts and behavior, sleep and insomnia, eating disorders, irritability, brain injury, antisocial personality and conduct disorder, and dissociative identity disorder. Here, we summarize the first decade of ENIGMA's activities and ongoing projects, and describe the successes and challenges encountered along the way. We highlight the advantages of collaborative large-scale coordinated data analyses for testing the reproducibility and robustness of findings, offering the opportunity to identify brain systems involved in clinical syndromes across diverse samples, together with associated genetic, environmental, demographic, cognitive, and psychosocial factors.

    Economic-demographic interactions in long-run growth

    Cliometrics confirms that Malthus' model of the pre-industrial economy, in which increases in productivity raise population but higher population drives down wages, is a good description for much of demographic and economic history. A contributor to the Malthusian equilibrium was the Western European Marriage Pattern, the late age of female first marriage, which helped to retard the fall of living standards by restricting fertility. The demographic transition and the transition from Malthusian economies to modern economic growth have attracted many Cliometric models, surveyed here. A popular model component is that lower mortality over many centuries increased the returns to, or preference for, human capital investment, so that technical progress eventually accelerated. This initially boosted birth rates, and population growth accelerated. Fertility decline was earliest and most striking in late eighteenth-century France. By the 1830s the fall in French marital fertility is consistent with a response to the rising opportunity cost of children. The rest of Europe did not begin to follow until the end of the nineteenth century. Interactions between the economy and migration have been modelled with Cliometric structures closely related to those linking natural increase and the economy. Wages were driven up by emigration from Europe and driven down in the economies receiving immigrants.

    Commissioning of the CMS High Level Trigger

    The CMS experiment will collect data from the proton-proton collisions delivered by the Large Hadron Collider (LHC) at a centre-of-mass energy of up to 14 TeV. The CMS trigger system is designed to cope with unprecedented luminosities and LHC bunch-crossing rates of up to 40 MHz. The unique CMS trigger architecture employs only two trigger levels. The Level-1 trigger is implemented using custom electronics, while the High Level Trigger (HLT) is based on software algorithms running on a large cluster of commercial processors, the Event Filter Farm. We present the major functionalities of the CMS High Level Trigger system as of the start of LHC beam operations in September 2008. The validation of the HLT system in the online environment with Monte Carlo simulated data and its commissioning during cosmic-ray data-taking campaigns are discussed in detail. We conclude with a description of HLT operations with the first circulating LHC beams, before the incident that occurred on 19 September 2008.

    Performance and Operation of the CMS Electromagnetic Calorimeter

    The operation and general performance of the CMS electromagnetic calorimeter using cosmic-ray muons are described. These muons were recorded after the closure of the CMS detector in late 2008. The calorimeter is made of lead tungstate crystals, and the overall status of the 75848 channels corresponding to the barrel and endcap detectors is reported. The stability of crucial operational parameters, such as high voltage, temperature, and electronic noise, is summarised, and the performance of the light monitoring system is presented.

    Dimethyl sulfide production: what is the contribution of the coccolithophores?


    Performance of the CMS Cathode Strip Chambers with Cosmic Rays

    The Cathode Strip Chambers (CSCs) constitute the primary muon tracking device in the CMS endcaps. Their performance has been evaluated using data taken during a cosmic-ray run in fall 2008. Measured noise levels are low, with the fraction of noisy channels well below 1%. The coordinate resolution was measured for all types of chambers and falls in the range 47 to 243 microns. The efficiencies for local charged-track triggering and for hit and segment reconstruction were measured and are above 99%. The timing resolution per layer is approximately 5 ns.

    Refinement type contracts for verification of scientific investigative software

    Our scientific knowledge is increasingly built on software output. User code that defines data analysis pipelines and computational models is essential for research in the natural and social sciences, but little is known about how to ensure its correctness. The structure of this code and the development process used to build it limit the utility of traditional testing methodology. Formal methods for software verification have seen great success in ensuring code correctness but generally require more specialized training, development time, and funding than is available in the natural and social sciences. Here, we present a Python library which uses lightweight formal methods to provide correctness guarantees without the need for specialized knowledge or substantial time investment. Our package provides runtime verification of function entry- and exit-condition contracts using refinement types. It allows checking hyperproperties within contracts and offers automated test-case generation to supplement online checking. We co-developed our tool with a medium-sized (≈3000 LOC) software package which simulates decision-making in cognitive neuroscience. In addition to helping us locate trivial bugs earlier in the development cycle, our tool was able to locate four bugs which may have been difficult to find using traditional testing methods. It was also able to find bugs in user code which did not contain contracts or refinement type annotations. This demonstrates how formal methods can be used to verify the correctness of scientific software that is difficult to test with mainstream approaches.
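
    To make the contract mechanism concrete, here is a minimal, hypothetical sketch of runtime-checked entry and exit contracts built from refinement-style predicates in plain Python. All names (refine, contract, Probability, NonEmpty) are invented for illustration and do not reflect the paper's actual library API:

        import functools

        class ContractError(Exception):
            """Raised when a refinement predicate fails at entry or exit."""

        def refine(predicate, description):
            """A refinement type: a base value narrowed by a boolean predicate."""
            def check(value):
                if not predicate(value):
                    raise ContractError(f"{value!r} violates refinement: {description}")
                return value
            return check

        # Refinements a data-analysis pipeline might plausibly use.
        Probability = refine(lambda x: 0.0 <= x <= 1.0, "0 <= x <= 1")
        NonEmpty = refine(lambda xs: len(xs) > 0, "non-empty sequence")

        def contract(pre=(), post=None):
            """Check entry conditions on positional arguments and an exit condition on the result."""
            def decorator(fn):
                @functools.wraps(fn)
                def wrapper(*args):
                    for check, arg in zip(pre, args):  # entry contracts
                        check(arg)
                    result = fn(*args)
                    if post is not None:               # exit contract
                        post(result)
                    return result
                return wrapper
            return decorator

        @contract(pre=[NonEmpty], post=Probability)
        def fraction_positive(samples):
            """Fraction of samples above zero; contractually a probability."""
            return sum(1 for s in samples if s > 0) / len(samples)

        print(fraction_positive([-1.0, 2.0, 3.0]))  # 0.666..., passes both contracts
        # fraction_positive([]) would raise ContractError at entry,
        # instead of a ZeroDivisionError deep inside the function body.

    A production tool would also need keyword-argument handling, informative source locations, and the hyperproperty checking and automated test generation described above; this sketch shows only the entry/exit checking core.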