
    Cosmic Shear Statistics and Cosmology

    We report a measurement of cosmic shear correlations using an effective area of 6.5 sq. deg. of the VIRMOS deep imaging survey in progress at the Canada-France-Hawaii Telescope. We measured various shear correlation functions, the aperture mass statistic and the top-hat smoothed variance of the shear, each with a detection significance exceeding 12 sigma. We present results on angular scales from 3 arc-seconds to half a degree. The consistency of different statistical measures is demonstrated and confirms the lensing origin of the signal through tests that rely on the scalar nature of the gravitational potential. For Cold Dark Matter models we find $\sigma_8 \Omega_0^{0.6}=0.43^{+0.04}_{-0.05}$ at the 95% confidence level. The measurement over almost three decades of scale allows us to discuss the effect of the shape of the power spectrum on the cosmological parameter estimation. The degeneracy between $\sigma_8$ and $\Omega_0$ can be broken if priors on the shape of the linear power spectrum (which can be parameterized by $\Gamma$) are assumed. For instance, with $\Gamma=0.21$ and at the 95% confidence level, we obtain 0.6--0.65 for $\sigma_8$ and $\Omega_0<0.4$ for flat ($\Lambda$-CDM) models. From the tangential/radial mode decomposition we can set an upper limit on the intrinsic shape alignment, which was recently suggested as a possible contribution to the lensing signal. Within the error bars, there is no detection of intrinsic shape alignment on scales larger than 1'. Comment: 13 pages, submitted to A&A
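    The shear two-point correlation functions referred to above can be estimated from a galaxy ellipticity catalogue by averaging products of tangential and cross ellipticity components over galaxy pairs in angular bins. The following is a minimal, brute-force sketch of that estimator (uniform weights, flat-sky small-field approximation, hypothetical inputs); it is not the survey pipeline used in the paper:

```python
import numpy as np

def xi_plus_minus(ra, dec, e1, e2, theta_bins_arcmin):
    """Brute-force estimator of the shear correlation functions xi_+ and xi_-.

    ra, dec           : galaxy positions in degrees (small-field, flat-sky approx.)
    e1, e2            : measured ellipticity (shear estimate) components
    theta_bins_arcmin : angular bin edges in arcminutes
    """
    e1, e2 = np.asarray(e1, float), np.asarray(e2, float)
    # Flat-sky coordinates in arcminutes.
    x = np.asarray(ra, float) * 60.0 * np.cos(np.deg2rad(np.mean(dec)))
    y = np.asarray(dec, float) * 60.0
    n, nbins = len(x), len(theta_bins_arcmin) - 1

    num_p = np.zeros(nbins)   # accumulates e_t*e_t + e_x*e_x
    num_m = np.zeros(nbins)   # accumulates e_t*e_t - e_x*e_x
    npair = np.zeros(nbins)

    for i in range(n - 1):
        dx, dy = x[i + 1:] - x[i], y[i + 1:] - y[i]
        theta = np.hypot(dx, dy)
        phi = np.arctan2(dy, dx)              # position angle of each pair
        c2, s2 = np.cos(2 * phi), np.sin(2 * phi)

        # Tangential / cross components of each galaxy w.r.t. the pair direction.
        et_i = -(e1[i] * c2 + e2[i] * s2)
        ex_i = e1[i] * s2 - e2[i] * c2
        et_j = -(e1[i + 1:] * c2 + e2[i + 1:] * s2)
        ex_j = e1[i + 1:] * s2 - e2[i + 1:] * c2

        k = np.digitize(theta, theta_bins_arcmin) - 1
        ok = (k >= 0) & (k < nbins)
        np.add.at(num_p, k[ok], (et_i * et_j + ex_i * ex_j)[ok])
        np.add.at(num_m, k[ok], (et_i * et_j - ex_i * ex_j)[ok])
        np.add.at(npair, k[ok], 1)

    npair = np.maximum(npair, 1)              # avoid division by zero in empty bins
    return num_p / npair, num_m / npair       # xi_+(theta), xi_-(theta)
```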

    Statistical methods for the time-to-event analysis of individual participant data from multiple epidemiological studies

    Background: Meta-analysis of individual participant time-to-event data from multiple prospective epidemiological studies enables detailed investigation of exposure–risk relationships, but involves a number of analytical challenges. Methods: This article describes statistical approaches adopted in the Emerging Risk Factors Collaboration, in which primary data from more than 1 million participants in more than 100 prospective studies have been collated to enable detailed analyses of various risk markers in relation to incident cardiovascular disease outcomes. Results: Analyses have been principally based on Cox proportional hazards regression models stratified by sex, undertaken in each study separately. Estimates of exposure–risk relationships, initially unadjusted and then adjusted for several confounders, have been combined across studies using meta-analysis. Methods for assessing the shape of exposure–risk associations and the proportional hazards assumption have been developed. Estimates of interactions have also been combined using meta-analysis, keeping within- and between-study information separate. Regression dilution bias caused by measurement error and within-person variation in exposures and confounders has been addressed through the analysis of repeat measurements to estimate corrected regression coefficients. These methods are exemplified by an analysis of plasma fibrinogen and risk of coronary heart disease, and Stata code is made available. Conclusion: Increasing numbers of meta-analyses of individual participant data from observational studies are being conducted to enhance the statistical power and detail of epidemiological studies. The statistical methods developed here can be used to address the needs of such analyses.
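    A minimal sketch of the two-stage approach described above: a sex-stratified Cox model fitted in each study separately, with the study-specific log hazard ratios then pooled by inverse-variance (fixed-effect) and DerSimonian-Laird (random-effects) meta-analysis. It uses the Python lifelines package rather than the Stata code the article provides, and the column names (study, sex, time, event, fibrinogen) are hypothetical:

```python
import numpy as np
from lifelines import CoxPHFitter

def two_stage_ipd_meta(df, exposure="fibrinogen"):
    """Two-stage IPD meta-analysis sketch: a Cox model per study (stratified by
    sex), then pooling of the study-specific log hazard ratios."""
    betas, variances = [], []
    for _, d in df.groupby("study"):
        cph = CoxPHFitter()
        cph.fit(d[[exposure, "sex", "time", "event"]],
                duration_col="time", event_col="event", strata=["sex"])
        betas.append(cph.params_[exposure])
        variances.append(cph.standard_errors_[exposure] ** 2)
    betas, variances = np.array(betas), np.array(variances)

    # Fixed-effect (inverse-variance) pooled estimate.
    w = 1.0 / variances
    beta_fe = np.sum(w * betas) / np.sum(w)
    se_fe = np.sqrt(1.0 / np.sum(w))

    # DerSimonian-Laird between-study variance for a random-effects estimate.
    q = np.sum(w * (betas - beta_fe) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(betas) - 1)) / c)
    w_re = 1.0 / (variances + tau2)
    beta_re = np.sum(w_re * betas) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))

    return {"fixed": (beta_fe, se_fe), "random": (beta_re, se_re), "tau2": tau2}
```

    The pooled hazard ratio per unit of exposure is exp(beta), with an approximate 95% confidence interval exp(beta ± 1.96·se).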

    Automatic generation of statistical pose and shape models for articulated joints

    Statistical analysis of motion patterns of body joints is potentially useful for detecting and quantifying pathologies. However, building a statistical motion model across different subjects remains a challenging task, especially for a complex joint like the wrist. We present a novel framework for simultaneous registration and segmentation of multiple 3-D (CT or MR) volumes of different subjects at various articulated positions. The framework starts with a pose model generated from 3-D volumes captured at different articulated positions of a single subject (template). This initial pose model is used to register the template volume to image volumes from new subjects. During this process, the Grow-Cut algorithm is used in an iterative refinement of the bone segmentation along with the pose parameters. As each new subject is registered and segmented, the pose model is updated, improving the accuracy of successive registrations. We applied the algorithm to CT images of the wrist from 25 subjects, each at five different wrist positions, and demonstrated that it performed robustly and accurately. More importantly, the resulting segmentations allowed a statistical pose model of the carpal bones to be generated automatically, without user interaction. The evaluation results show that our proposed framework achieved accurate registration with an average mean target registration error of mm. The automatic segmentation results also show high consistency with the ground truth obtained semi-automatically. Furthermore, we demonstrated the capability of the resulting statistical pose and shape models by using them to generate a measurement tool for scaphoid-lunate dissociation diagnosis, which achieved 90% sensitivity and specificity.
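    As an illustration of the kind of statistical pose and shape model such a framework produces, the sketch below builds a generic linear point-distribution model (mean plus principal modes of variation) from pre-aligned training vectors, e.g. corresponding bone surface points or concatenated pose parameters per subject. It is a minimal PCA-based sketch, not the paper's registration/Grow-Cut pipeline:

```python
import numpy as np

def build_statistical_model(samples, var_fraction=0.95):
    """Build a linear statistical shape/pose model from pre-aligned training data.

    samples : (n_subjects, n_params) array; each row is a flattened set of
              corresponding landmark coordinates or pose parameters.
    Returns the mean vector, the retained eigenmodes and their eigenvalues.
    """
    X = np.asarray(samples, dtype=float)
    mean = X.mean(axis=0)
    # PCA via SVD of the centred data matrix.
    _, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    eigenvalues = s ** 2 / (len(X) - 1)
    # Keep the smallest number of modes explaining var_fraction of the variance.
    cum = np.cumsum(eigenvalues) / eigenvalues.sum()
    k = int(np.searchsorted(cum, var_fraction) + 1)
    return mean, Vt[:k], eigenvalues[:k]

def synthesise(mean, modes, eigenvalues, b):
    """New instance x = mean + sum_i b_i * sqrt(lambda_i) * mode_i,
    with b_i typically restricted to about +/-3 standard deviations."""
    return mean + (np.asarray(b, float) * np.sqrt(eigenvalues)) @ modes
```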

    Spurious Shear in Weak Lensing with LSST

    The complete 10-year survey from the Large Synoptic Survey Telescope (LSST) will image $\sim$20,000 square degrees of sky in six filter bands every few nights, bringing the final survey depth to $r\sim27.5$, with over 4 billion well-measured galaxies. To take full advantage of this unprecedented statistical power, the systematic errors associated with weak lensing measurements need to be controlled to a level similar to the statistical errors. This work is the first attempt to quantitatively estimate the absolute level and statistical properties of the systematic errors on weak lensing shear measurements due to the most important physical effects in the LSST system via high-fidelity ray-tracing simulations. We identify and isolate the different sources of algorithm-independent, \textit{additive} systematic errors on shear measurements for LSST and predict their impact on the final cosmic shear measurements using conventional weak lensing analysis techniques. We find that the main source of the errors comes from an inability to adequately characterise the atmospheric point spread function (PSF) due to its high-frequency spatial variation on angular scales smaller than $\sim10'$ in the single short exposures, which propagates into a spurious shear correlation function at the $10^{-4}$--$10^{-3}$ level on these scales. With the large multi-epoch dataset that will be acquired by LSST, the stochastic errors average out, bringing the final spurious shear correlation function to a level very close to the statistical errors. Our results imply that the cosmological constraints from LSST will not be severely limited by these algorithm-independent, additive systematic effects. Comment: 22 pages, 12 figures, accepted by MNRAS
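    A toy illustration of the averaging argument in the abstract, assuming the spurious shear imprinted in each short exposure is statistically independent between exposures; the numbers below are illustrative, not LSST values. Under that assumption the variance (and hence the spurious correlation amplitude) of the stacked measurement falls roughly as 1/N_exposures:

```python
import numpy as np

rng = np.random.default_rng(0)

n_gal = 20_000          # galaxies in a toy patch (illustrative only)
n_exposures = 200       # assumed number of independent short exposures
sigma_spurious = 0.003  # assumed per-exposure spurious shear rms from PSF residuals

# One independent spurious-shear realisation per exposure at each galaxy position.
g_spurious = rng.normal(0.0, sigma_spurious, size=(n_exposures, n_gal))

# Stacking (averaging) the per-exposure measurements suppresses the stochastic error ...
g_stacked = g_spurious.mean(axis=0)

# ... so its contribution to the correlation amplitude drops roughly as 1/N.
print("per-exposure variance:", g_spurious.var())
print("stacked variance     :", g_stacked.var())
print("expected ratio 1/N   :", 1.0 / n_exposures)
```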

    Generating Survival Times to Simulate Cox Proportional Hazards Models

    This paper discusses techniques to generate survival times for simulation studies regarding Cox proportional hazards models. In linear regression models, the response variable is directly connected with the considered covariates, the regression coefficients and the simulated random errors. Thus, the response variable can be generated from the regression function once the regression coefficients and the error distribution are specified. However, in the Cox model, which is formulated via the hazard function, the effect of the covariates has to be translated from the hazards to the survival times, because the usual software packages for estimation of Cox models require individual survival time data. A general formula describing the relation between the hazard and the corresponding survival time of the Cox model is derived. It is shown how the exponential, the Weibull and the Gompertz distribution can be used to generate appropriate survival times for simulation studies. Additionally, the general relation between hazard and survival time can be used to derive custom distributions for special situations and to handle flexibly parameterized proportional hazards models. The use of distributions other than the exponential is indispensable for investigating the characteristics of the Cox proportional hazards model, especially in non-standard situations where the partial likelihood depends on the baseline hazard.
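    A minimal sketch of the inversion approach described above: if U is uniform on (0, 1), then T = H0^{-1}(-log(U) * exp(-x'beta)) follows the specified Cox model, and the cumulative baseline hazard H0 can be inverted in closed form for the exponential, Weibull and Gompertz baselines. The function name and parameter values are illustrative, with the baselines parameterized as h0(t) = lam, h0(t) = lam*nu*t^(nu-1) and h0(t) = lam*exp(alpha*t):

```python
import numpy as np

def simulate_cox_times(x, beta, baseline="exponential",
                       lam=0.1, nu=1.5, alpha=0.05, rng=None):
    """Draw survival times from a Cox PH model via T = H0^{-1}(-log(U) * exp(-x'beta)).

    x        : (n, p) covariate matrix
    beta     : (p,) true regression coefficients
    baseline : 'exponential', 'weibull' or 'gompertz'
    lam, nu, alpha : baseline hazard parameters (scale, Weibull shape, Gompertz shape)
    """
    rng = rng or np.random.default_rng()
    u = rng.uniform(size=len(x))
    eta = np.asarray(x, float) @ np.asarray(beta, float)   # linear predictor x'beta
    z = -np.log(u) * np.exp(-eta)                          # -log(U) / exp(x'beta)

    if baseline == "exponential":   # h0(t) = lam, so H0(t) = lam * t
        return z / lam
    if baseline == "weibull":       # h0(t) = lam * nu * t**(nu-1), so H0(t) = lam * t**nu
        return (z / lam) ** (1.0 / nu)
    if baseline == "gompertz":      # h0(t) = lam * exp(alpha*t), H0(t) = lam*(exp(alpha*t)-1)/alpha
        return np.log(1.0 + alpha * z / lam) / alpha
    raise ValueError("unknown baseline hazard")
```

    The closed-form inversions follow from S(t) = exp(-H0(t) * exp(x'beta)) with U = S(T) uniform on (0, 1); in a simulation study these event times would typically be paired with independently drawn censoring times before checking how well a Cox fit recovers beta.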