5,875 research outputs found

    Observational Tests and Predictive Stellar Evolution II: Non-standard Models

    We examine the contributions of second-order physical processes to the results of stellar evolution calculations amenable to direct observational testing. In the first paper in the series (Young et al. 2001) we established baseline results using only physics common to modern stellar evolution codes. In the current paper we establish how much of the discrepancy between observations and baseline models is due to particular elements of new physics. We then consider the impact of the observational uncertainties on the maximum predictive accuracy achievable by a stellar evolution code. The sun is an optimal case because of the precise and abundant observations and the relative simplicity of the underlying stellar physics. The Standard Model is capable of matching the structure of the sun as determined by helioseismology and gross surface observables to better than a percent. Given an initial mass and surface composition within the observational errors, and no additional constraints for which the models can be optimized, it is not possible to predict the sun's current state to better than ~7%. Convectively induced mixing in radiative regions, seen in multidimensional hydrodynamic simulations, dramatically improves the predictions for radii, luminosities, and apsidal motions of eclipsing binaries while simultaneously maintaining consistency with observed light-element depletion and turnoff ages in young clusters (Young et al. 2003). Systematic errors in core size for models of massive binaries disappear with more complete mixing physics, and acceptable fits are achieved for all of the binaries without calibration of free parameters. The lack of accurate abundance determinations for binaries is now the main obstacle to improving stellar models using this type of test. Comment: 33 pages, 8 figures, accepted for publication in the Astrophysical Journal

    Dynamical system analysis of unstable flow phenomena in centrifugal blower

    Methods of dynamical system analysis were employed to analyze unsteady phenomena in a centrifugal blower. Pressure signals gathered at different control points were decomposed into their Principal Components (PCs) by means of Singular Spectrum Analysis (SSA). A certain number of PCs was retained in the analysis based on their statistical correlation. Projecting the original signal onto its PCs made it possible to draw a phase trajectory that clearly separated unstable blower working conditions from regular operation.
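
    A minimal Python sketch of this kind of SSA decomposition is given below, assuming a synthetic pressure trace; the window length and number of retained components are illustrative choices, not values from the paper.

        # Minimal Singular Spectrum Analysis (SSA) sketch: embed a 1-D signal in a
        # trajectory matrix, take its SVD, and reconstruct the leading principal
        # components by diagonal averaging. Parameters are illustrative only.
        import numpy as np

        def ssa_components(signal, window, n_components):
            n = len(signal)
            k = n - window + 1
            # Trajectory (Hankel) matrix: each column is a lagged copy of the signal.
            X = np.column_stack([signal[i:i + window] for i in range(k)])
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            comps = []
            for j in range(n_components):
                Xj = s[j] * np.outer(U[:, j], Vt[j])          # rank-1 piece of X
                # Diagonal averaging maps the rank-1 matrix back to a time series.
                comps.append(np.array([Xj[::-1, :].diagonal(i - window + 1).mean()
                                       for i in range(n)]))
            return np.array(comps)

        # Synthetic stand-in for a pressure signal at one control point.
        t = np.linspace(0.0, 10.0, 2000)
        p = np.sin(2 * np.pi * 3 * t) + 0.3 * np.random.randn(t.size)
        pcs = ssa_components(p, window=100, n_components=4)
        # A phase trajectory can then be drawn by plotting one reconstructed
        # component against another (e.g. pcs[0] versus pcs[1]).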

    Functions and Requirements of the CMS Centre at CERN

    This report of the CMS Centre Requirements and Technical Assessment Group describes the functions of the CMS Centre on the CERN Meyrin site in terms of data quality monitoring, calibrations, rapid analysis, and operation of the offline computing systems. It then defines the corresponding requirements for building space, computing consoles and other equipment, technical services and refurbishments, and communications systems.

    GPQA: A Graduate-Level Google-Proof Q&A Benchmark

    We present GPQA, a challenging dataset of 448 multiple-choice questions written by domain experts in biology, physics, and chemistry. We ensure that the questions are high-quality and extremely difficult: experts who have or are pursuing PhDs in the corresponding domains reach 65% accuracy (74% when discounting clear mistakes the experts identified in retrospect), while highly skilled non-expert validators only reach 34% accuracy, despite spending on average over 30 minutes with unrestricted access to the web (i.e., the questions are "Google-proof"). The questions are also difficult for state-of-the-art AI systems, with our strongest GPT-4-based baseline achieving 39% accuracy. If we are to use future AI systems to help us answer very hard questions, for example, when developing new scientific knowledge, we need to develop scalable oversight methods that enable humans to supervise their outputs, which may be difficult even if the supervisors are themselves skilled and knowledgeable. The difficulty of GPQA both for skilled non-experts and frontier AI systems should enable realistic scalable oversight experiments, which we hope can help devise ways for human experts to reliably get truthful information from AI systems that surpass human capabilities. Comment: 28 pages, 5 figures, 7 tables
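
    Purely as an illustration of the two accuracy figures quoted above (raw versus discounting flagged mistakes), here is a hedged Python sketch; the record fields are hypothetical names, not the dataset's actual schema.

        # Hypothetical scoring sketch for GPQA-style expert validation records.
        # The field names below are illustrative assumptions, not the released schema.
        def accuracy(records, discount_flagged=False):
            kept = [r for r in records
                    if not (discount_flagged and r["flagged_as_mistake"])]
            correct = sum(r["validator_answer"] == r["answer_key"] for r in kept)
            return correct / len(kept)

        records = [
            {"validator_answer": "B", "answer_key": "B", "flagged_as_mistake": False},
            {"validator_answer": "C", "answer_key": "A", "flagged_as_mistake": True},
            {"validator_answer": "D", "answer_key": "D", "flagged_as_mistake": False},
        ]
        print(accuracy(records))                          # raw validator accuracy
        print(accuracy(records, discount_flagged=True))   # discounting flagged mistakes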

    Fast Beam Condition Monitor for CMS: performance and upgrade

    The CMS beam and radiation monitoring subsystem BCM1F (Fast Beam Condition Monitor) consists of 8 individual diamond sensors situated around the beam pipe within the pixel detector volume, for the purpose of fast bunch-by-bunch monitoring of beam background and collision products. In addition, effort is ongoing to use BCM1F as an online luminosity monitor. BCM1F will be running whenever there is beam in the LHC, and its data acquisition is independent of that of the CMS detector, hence it delivers luminosity even when CMS is not taking data. A report is given on the performance of BCM1F during LHC Run I, including results of the van der Meer scan and online luminosity monitoring done in 2012. In order to match the requirements imposed by higher luminosity and 25 ns bunch spacing, several changes to the system must be implemented during the upcoming shutdown, including upgraded electronics and precise gain monitoring. First results from Run II preparation are shown. Comment: 10 pages, 8 figures. To be published in NIM A as proceedings for the 9th Hiroshima Symposium on Semiconductor Tracking Detectors (2013)
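
    For readers unfamiliar with how a van der Meer scan calibrates a luminometer, the generic sketch below fits the rate-versus-separation curves with Gaussians and converts the fitted widths into a visible cross section; it is not BCM1F code, and the fit model and constants are assumptions.

        # Generic van der Meer calibration sketch (not BCM1F-specific):
        # sigma_vis = 2*pi * Sigma_x * Sigma_y * mu_max / (N1 * N2),
        # where Sigma_x, Sigma_y are fitted scan widths and mu_max the peak
        # number of visible interactions per bunch crossing.
        import numpy as np
        from scipy.optimize import curve_fit

        LHC_REV_FREQ = 11245.0  # revolutions per second

        def gauss(x, amp, mu, sigma):
            return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

        def visible_cross_section(sep_x, rate_x, sep_y, rate_y, n1, n2):
            (amp_x, _, sig_x), _ = curve_fit(gauss, sep_x, rate_x,
                                             p0=[rate_x.max(), 0.0, 0.02])
            (amp_y, _, sig_y), _ = curve_fit(gauss, sep_y, rate_y,
                                             p0=[rate_y.max(), 0.0, 0.02])
            mu_max = 0.5 * (amp_x + amp_y) / LHC_REV_FREQ  # rate in Hz -> per crossing
            return 2.0 * np.pi * abs(sig_x) * abs(sig_y) * mu_max / (n1 * n2)

        # With sigma_vis in hand, the instantaneous luminosity follows from the
        # measured visible rate as L = rate / sigma_vis.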

    GK Per (Nova Persei 1901): HST Imagery and Spectroscopy of the Ejecta, and First Spectrum of the Jet-Like Feature

    We have imaged the ejecta of GK Persei (Nova Persei 1901 A.D.) with the Hubble Space Telescope (HST), revealing hundreds of cometary-like structures. One or both ends of the structures often show a brightness enhancement relative to the structures' middle sections, but there is no simple regularity to their morphologies (in contrast with the Helix nebula). Some of the structures' morphologies suggest the presence of slow-moving or stationary material with which the ejecta is colliding, while others suggest shaping from a wind emanating from GK Per itself. A detailed expansion map of the nova's ejecta was created by comparing HST images taken in successive years. WFPC2 narrowband images and STIS spectra demonstrate that the physical conditions in the ejecta vary strongly on spatial scales much smaller than those of the ejecta. Directly measuring accurate densities and compositions, and hence masses of this and other nova shells, will demand data at least as spatially resolved as those presented here. The filling factor of the ejecta is < 1%, and the nova ejecta mass must be less than 10^{-4} M_sun. A few of the nebulosities vary in brightness by up to a factor of two on timescales of one year. Finally, we present the deepest images yet obtained of a jet-like feature outside the main body of GK Per nebulosity, and the first spectrum of that feature. Dominated by strong, narrow emission lines of [NII], [OII], [OIII], and [SII], this feature is probably a shock due to ejected material running into stationary ISM, slowly moving ejecta from a previous nova episode, or circum-binary matter present before 1901. An upper limit to the mass of the jet is of order a few times 10^{-6} M_sun. The jet might be an important, or even dominant, mass sink from the binary system. The jet's faintness suggests that similar features could easily have been missed in other cataclysmic binaries. Comment: 43 pages, 17 figures
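
    As a rough illustration of how an expansion map can be built by comparing images from successive epochs (the authors' actual pipeline is not described in the abstract), here is a minimal sketch using scikit-image; the knot position and cutout size are placeholders.

        # Generic knot proper-motion sketch for two registered images taken about
        # one year apart. This is illustrative, not the authors' pipeline.
        from skimage.registration import phase_cross_correlation

        def knot_shift(epoch1, epoch2, y, x, box=16):
            """Sub-pixel (dy, dx) shift of a small knot between the two epochs."""
            cut1 = epoch1[y - box:y + box, x - box:x + box]
            cut2 = epoch2[y - box:y + box, x - box:x + box]
            shift, _, _ = phase_cross_correlation(cut1, cut2, upsample_factor=10)
            return shift

        # Dividing each shift by the time baseline gives a proper motion; combined
        # with a distance, it yields the projected expansion velocity of the knot.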

    HILT : High-Level Thesaurus Project M2M Feasibility Study : [Final Report]

    The project was asked to investigate the feasibility of developing SOAP-based interfaces between JISC IE services and both Wordmap-API and non-Wordmap versions of the HILT pilot demonstrator created under HILT Phase II, and to determine the scope and cost of providing an actual demonstrator based on each of these approaches. In doing so it was to take into account the possibility of a future Zthes-based solution using Z39.50 or OAI-PMH, and the syntax and data-exchange protocol implications of eScience and semantic-web developments. It was agreed that the primary concerns of the study should be an assessment of the feasibility, scope, and cost of a follow-up M2M pilot that considered the best options in respect of: (i) query protocols (SOAP, Z39.50, SRW, OAI) and associated data profiles (e.g. Zthes for Z39.50 and for SRW); and (ii) standards for structuring thesauri and thesaurus-type information (e.g. the Zthes XML DTD, the SRW version of it, and SKOS-Core). The study was carried out within the allotted timescale, with this Final Report submitted to JISC on 31st March 2005 as scheduled. The detailed proposal for a follow-up project is currently under discussion and will be finalised, as agreed with JISC, by mid-April. It was concluded that an M2M pilot was feasible. A proposal for a follow-up M2M pilot project has been scoped and is currently being costed.
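
    For readers unfamiliar with the protocols under consideration, the sketch below shows what a minimal OAI-PMH harvesting call looks like in Python; the base URL is a placeholder rather than an endpoint from the study, and SOAP, Z39.50, or SRW front-ends would wrap equivalent queries in their own envelopes.

        # Minimal OAI-PMH request sketch. "ListRecords" and "metadataPrefix" are
        # standard OAI-PMH parameters; the base URL is a placeholder, not a HILT
        # or JISC endpoint.
        import urllib.parse
        import urllib.request
        import xml.etree.ElementTree as ET

        BASE_URL = "https://example.org/oai"  # placeholder endpoint

        def list_records(metadata_prefix="oai_dc"):
            query = urllib.parse.urlencode({"verb": "ListRecords",
                                            "metadataPrefix": metadata_prefix})
            with urllib.request.urlopen(f"{BASE_URL}?{query}") as resp:
                return ET.fromstring(resp.read())

        # The returned XML could carry thesaurus records structured as Zthes or
        # SKOS, which is the kind of choice the study weighs up.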

    Comparison of Optimised MDI versus Pumps with or without Sensors in Severe Hypoglycaemia (the Hypo COMPaSS trial).

    BACKGROUND: Severe hypoglycaemia (SH) is one of the most feared complications of type 1 diabetes (T1DM), with a reported prevalence of nearly 40%. In randomized trials of Multiple Daily Injections (MDI) and Continuous Subcutaneous Insulin Infusion (CSII) therapy there is a possible benefit of CSII in reducing SH. However, few trials have used basal insulin analogues as the basal insulin in the MDI group, and individuals with established SH have often been excluded from prospective studies. In published studies investigating the effect of Real-Time Continuous Glucose Monitoring (RT-CGM), a benefit in terms of reduced SH has not yet been demonstrated. The primary objective of this study is to elucidate whether, in people with T1DM complicated by impaired awareness of hypoglycaemia (IAH), rigorous prevention of biochemical hypoglycaemia using optimized existing self-management technology and educational support will restore awareness and reduce the risk of recurrent SH.
    METHODS/DESIGN: This is a multicentre prospective RCT comparing hypoglycaemia avoidance with optimized MDI and CSII, with or without RT-CGM, in a 2×2 factorial design in people with type 1 diabetes who have IAH. The primary outcome measure is the difference in IAH (Gold score) at 24 weeks. Secondary outcomes include biomedical measures such as HbA1c, SH incidence, blinded CGM analysis, self-monitored blood glucose (SMBG), and the response to hypoglycaemia in gold-standard clamp studies. Psychosocial measures including well-being and quality of life will also be assessed using several validated and novel measures. Analysis will be on an intention-to-treat basis.
    DISCUSSION: Most existing RCTs using this study's interventions have been powered for change in HbA1c rather than IAH or SH. This trial will demonstrate whether IAH can be reversed and SH prevented in people with T1DM, even in those at highest risk, by using optimized conventional management and existing technology.
    TRIAL REGISTRATION: ISRCTN52164803; EudraCT No: 2009-015396-27.