
    Preliminary galaxy extraction from DENIS images

    The extragalactic applications of NIR surveys are summarized, with a focus on the ability to map the interstellar extinction of our Galaxy. A very preliminary extraction of galaxies from a set of 180 consecutive images is presented, and the results illustrate some of the pitfalls in attempting a homogeneous extraction of galaxies from these wide-angle, shallow surveys. Comment: Invited talk at "The Impact of Large-Scale Near-IR Sky Surveys", meeting held in Tenerife, Spain, April 1996. 10 pages LaTeX with style file and 4 PS files included.

    Coherent states for compact Lie groups and their large-N limits

    The first two parts of this article survey results related to the heat-kernel coherent states for a compact Lie group K. I begin by reviewing the definition of the coherent states, their resolution of the identity, and the associated Segal-Bargmann transform. I then describe related results, including connections to geometric quantization and (1+1)-dimensional Yang--Mills theory, the associated coherent states on spheres, and applications to quantum gravity. The third part of this article summarizes recent work of mine with Driver and Kemp on the large-N limit of the Segal--Bargmann transform for the unitary group U(N). A key result is the identification of the leading-order large-N behavior of the Laplacian on "trace polynomials." Comment: Submitted to the proceedings of the CIRM conference, "Coherent states and their applications: A contemporary panorama."
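The transform mentioned above can be stated compactly. This is the standard form of Hall's heat-kernel construction, not notation taken from this abstract: for a compact Lie group $K$ with Laplacian $\Delta_K$ and heat kernel $\rho_t$,

```latex
% Heat-kernel Segal--Bargmann transform (schematic statement of Hall's
% construction); K_C denotes the complexification of K.
(B_t f)(g) \;=\; \bigl(e^{t\Delta_K/2} f\bigr)(g)
          \;=\; \int_K \rho_t(g x^{-1})\, f(x)\, dx ,
          \qquad f \in L^2(K),
```

and $B_t f$ admits a holomorphic extension to $K_{\mathbb{C}}$; $B_t$ then maps $L^2(K)$ unitarily onto the space of holomorphic functions on $K_{\mathbb{C}}$ that are square-integrable with respect to an associated heat-kernel measure.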

    Synthesis of Recursive ADT Transformations from Reusable Templates

    Recent work has proposed a promising approach to improving the scalability of program synthesis by allowing the user to supply a syntactic template that constrains the space of potential programs. Unfortunately, creating templates often requires nontrivial effort from the user, which impedes the usability of the synthesizer. We present a solution to this problem in the context of recursive transformations on algebraic data types. Our approach relies on polymorphic synthesis constructs: a small but powerful extension to the language of syntactic templates, which makes it possible to define a program space in a concise and highly reusable manner, while at the same time retaining the scalability benefits of conventional templates. This approach enables end-users to reuse predefined templates from a library for a wide variety of problems with little effort. The paper also describes a novel optimization that further improves the performance and scalability of the system. We evaluated the approach on a set of benchmarks that most notably includes desugaring functions for lambda calculus, which force the synthesizer to discover Church encodings for pairs and boolean operations.
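For readers unfamiliar with the benchmark mentioned at the end, the Church encodings the synthesizer must discover can be written down directly. A minimal sketch (these definitions are standard lambda-calculus encodings, not the paper's synthesized output):

```python
# Church encodings for pairs and booleans, written as Python lambdas.
# All names here are illustrative, not taken from the paper.

pair = lambda a, b: lambda f: f(a, b)    # a pair is a function awaiting a selector
fst  = lambda p: p(lambda a, b: a)       # select the first component
snd  = lambda p: p(lambda a, b: b)       # select the second component

true  = lambda t: lambda f: t            # a boolean selects one of two branches
false = lambda t: lambda f: f
band  = lambda x: lambda y: x(y)(false)  # Church AND: if x then y else false
bnot  = lambda x: lambda t: lambda f: x(f)(t)

# Decode a Church boolean back to a Python bool for inspection.
decode = lambda b: b(True)(False)
```

For example, `fst(pair(1, 2))` evaluates to `1`, and `decode(band(true)(false))` evaluates to `False`.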

    The impact of the ATLAS zero-lepton, jets and missing momentum search on a CMSSM fit

    Recent ATLAS data significantly extend the exclusion limits for supersymmetric particles. We examine the impact of such data on global fits of the constrained minimal supersymmetric standard model (CMSSM) to indirect and cosmological data. We calculate the likelihood map of the ATLAS search, taking into account systematic errors on the signal and on the background. We validate our calculation against the ATLAS determination of 95% confidence level exclusion contours. A previous CMSSM global fit is then re-weighted by the likelihood map, which takes a bite out of the high probability density region of the global fit, pushing scalar and gaugino masses up. Comment: 16 pages, 7 figures. v2 has bigger figures and fixed typos. v3 has clarified explanation of our handling of signal systematics.
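Re-weighting an existing set of global-fit samples by a new likelihood is a standard importance-sampling step. A minimal sketch, using toy Gaussian stand-ins for the fit samples and a hypothetical exclusion-style likelihood (nothing here reproduces the paper's actual maps or parameters):

```python
import math
import random

random.seed(0)

# Toy stand-in for samples of one parameter (e.g. a gaugino mass, in GeV)
# drawn from a previous global fit.
samples = [random.gauss(300.0, 80.0) for _ in range(10_000)]

def new_loglike(m):
    # Hypothetical exclusion-style log-likelihood: low masses are disfavoured.
    # The smooth-step shape and the numbers 350 and 30 are purely illustrative.
    return -math.log1p(math.exp(-(m - 350.0) / 30.0))

# Importance weights: each old sample is re-weighted by the new likelihood.
weights = [math.exp(new_loglike(m)) for m in samples]
wsum = sum(weights)

# Posterior mean before and after re-weighting; the exclusion of low masses
# pushes the weighted mean upward, mirroring the effect described above.
mean_before = sum(samples) / len(samples)
mean_after = sum(w * m for w, m in zip(weights, samples)) / wsum
```

Because the toy likelihood is an increasing function of the mass, the re-weighted mean is necessarily larger than the original one, which is the qualitative effect the abstract describes.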

    Suppression of electron spin decoherence in a quantum dot

    The dominant source of decoherence for an electron spin in a quantum dot is the hyperfine interaction with the surrounding bath of nuclear spins. The decoherence process may be slowed down by subjecting the electron spin to suitable sequences of external control pulses. We investigate the performance of a variety of dynamical decoupling protocols using exact numerical simulation. Emphasis is given to realistic pulse delays and the long-time limit, beyond the domain where available analytical approaches are guaranteed to work. Our results show that both deterministic and randomized protocols are capable of significantly prolonging the electron coherence time, even when using control pulse separations substantially larger than expected from the upper cutoff frequency of the coupling spectrum between the electron and the nuclear spins. In a realistic parameter range, the total width of such a coupling spectrum appears to be the physically relevant frequency scale affecting the overall quality of the decoupling. Comment: 8 pages, 3 figures. Invited talk at the XXXVII Winter Colloquium on the Physics of Quantum Electronics, Snowbird, Jan 2007. Submitted to J. Mod. Opt.
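The basic mechanism by which control pulses prolong coherence can be illustrated with a toy quasi-static classical dephasing model, far simpler than the exact quantum simulations the paper performs (all parameters below are illustrative):

```python
import cmath
import random

random.seed(1)

# Quasi-static bath: each spin in the ensemble sees a random but fixed detuning.
detunings = [random.gauss(0.0, 1.0) for _ in range(5000)]
T = 2.0  # total evolution time, in units where the detuning spread is 1

# Free evolution: phases d*T spread out, so the ensemble coherence
# |<exp(i*d*T)>| decays (roughly as exp(-T**2/2) for a unit-width Gaussian).
free = abs(sum(cmath.exp(1j * d * T) for d in detunings)) / len(detunings)

# Hahn echo: a pi pulse at T/2 flips the sign of the subsequently accumulated
# phase, so a static detuning contributes d*T/2 - d*T/2 = 0 and the ensemble
# coherence is fully refocused in this idealized model.
echo = abs(sum(cmath.exp(1j * (d * T / 2 - d * T / 2))
               for d in detunings)) / len(detunings)
```

In this idealized static limit the echo recovers coherence exactly; the interesting regime studied in the paper is precisely where the bath is *not* static and the pulse spacing competes with the spectral width of the electron-nuclear coupling.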

    Update on the transfusion in gastrointestinal bleeding (TRIGGER) trial: statistical analysis plan for a cluster-randomised feasibility trial

    BACKGROUND: Previous research has suggested an association between more liberal red blood cell (RBC) transfusion and greater risk of further bleeding and mortality following acute upper gastrointestinal bleeding (AUGIB). METHODS AND DESIGN: The Transfusion in Gastrointestinal Bleeding (TRIGGER) trial is a pragmatic cluster-randomised feasibility trial that aims to evaluate the feasibility of implementing a restrictive vs. liberal RBC transfusion policy for adult patients admitted to hospital with AUGIB in the UK. This trial will help to inform the design and methodology of a phase III trial. The protocol for TRIGGER has been published in Transfusion Medicine Reviews. Recruitment began in September 2012 and was completed in March 2013. This update presents the statistical analysis plan, detailing how the analysis of the TRIGGER trial will be performed. It is hoped that prospective publication of the full statistical analysis plan will increase transparency and give readers a clear overview of how TRIGGER will be analysed. TRIAL REGISTRATION: ISRCTN85757829.

    Pre-specification of statistical analysis approaches in published clinical trial protocols was inadequate

    OBJECTIVES: Results from randomized trials can depend on the statistical analysis approach used. It is important to prespecify the analysis approach in the trial protocol to avoid selective reporting of analyses based on those which provide the most favourable results. We undertook a review of published trial protocols to assess how often the statistical analysis of the primary outcome was adequately prespecified. METHODS: We searched protocols of randomized trials indexed in PubMed in November 2016. We identified whether the following aspects of the statistical analysis approach for the primary outcome were adequately prespecified: (1) analysis population; (2) analysis model; (3) use of covariates; and (4) method of handling missing data. RESULTS: We identified 99 eligible protocols. Very few protocols adequately prespecified the analysis population (8/99, 8%), analysis model (27/99, 27%), covariates (40/99, 40%), or approach to handling missing data (10/99, 10%). Most protocols did not adequately predefine any of these four aspects of their statistical analysis approach (39%) or predefined only one aspect (36%). No protocols adequately predefined all four aspects of the analysis. CONCLUSION: The statistical analysis approach is rarely prespecified in published trial protocols. This may allow selective reporting of results based on different analyses.

    Cytotoxic activity of Treponema denticola


    Adjusting for multiple prognostic factors in the analysis of randomised trials

    Get PDF
    Background: When multiple prognostic factors are adjusted for in the analysis of a randomised trial, it is unclear (1) whether it is necessary to account for each of the strata formed by all combinations of the prognostic factors (stratified analysis) when randomisation has been balanced within each stratum (stratified randomisation), or whether adjusting for the main effects alone will suffice, and (2) which method of adjustment performs best in terms of type I error rate and power, irrespective of the randomisation method. Methods: We used simulation to (1) determine whether a stratified analysis is necessary after stratified randomisation, and (2) compare different methods of adjustment in terms of power and type I error rate. We considered the following methods of analysis: adjusting for covariates in a regression model; adjusting for each stratum using either fixed or random effects; and Mantel-Haenszel or a stratified Cox model, depending on outcome. Results: A stratified analysis is required after stratified randomisation to maintain correct type I error rates when (a) there are strong interactions between prognostic factors, and (b) there are approximately equal numbers of patients in each stratum. However, simulations based on real trial data found that type I error rates were unaffected by the method of analysis (stratified vs. unstratified), indicating these conditions were not met in real datasets. Comparison of different analysis methods found that with small sample sizes and a binary or time-to-event outcome, most analysis methods lead to either inflated type I error rates or a reduction in power; the lone exception was a stratified analysis using random effects for strata, which gave nominal type I error rates and adequate power. Conclusions: It is unlikely that a stratified analysis is necessary after stratified randomisation except in extreme scenarios. Therefore, the method of analysis (accounting for the strata, or adjusting only for the covariates) will not generally need to depend on the method of randomisation used. Most methods of analysis work well with large sample sizes; however, treating strata as random effects should be the analysis method of choice with binary or time-to-event outcomes and a small sample size.
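The premise that stratified randomisation balances the arms within each stratum can be illustrated with permuted-block randomisation, a common way of implementing it. This is an illustrative sketch, not the procedure of any particular trial; the factor names and sizes are made up:

```python
import random

random.seed(42)

def stratified_block_randomise(strata_sizes, block_size=4):
    """Assign patients to arms 'A'/'B' using random permuted blocks per stratum."""
    assignments = {}
    for stratum, n in strata_sizes.items():
        arms = []
        while len(arms) < n:
            # Each block contains an equal number of A and B, in random order.
            block = ["A"] * (block_size // 2) + ["B"] * (block_size // 2)
            random.shuffle(block)
            arms.extend(block)
        assignments[stratum] = arms[:n]
    return assignments

# Two prognostic factors (e.g. age group x site) give four strata.
sizes = {("young", "site1"): 8, ("young", "site2"): 12,
         ("old", "site1"): 8, ("old", "site2"): 12}
alloc = stratified_block_randomise(sizes)

# Because every stratum size here is a multiple of the block size, each
# stratum ends up exactly balanced between the two arms.
balance = {s: a.count("A") - a.count("B") for s, a in alloc.items()}
```

Whether the subsequent *analysis* must then account for those same strata is exactly the question the simulations above address.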