Cosmic Shear Statistics and Cosmology
We report a measurement of cosmic shear correlations using an effective area
of 6.5 sq. deg. of the VIRMOS deep imaging survey in progress at the
Canada-France-Hawaii Telescope. We measured various shear correlation
functions, the aperture mass statistic and the top-hat smoothed variance of the
shear with a detection significance exceeding 12 sigma for each of them. We
present results on angular scales from 3 arc-seconds to half a degree. The
consistency of different statistical measures is demonstrated and confirms the
lensing origin of the signal through tests that rely on the scalar nature of
the gravitational potential. For Cold Dark Matter models we obtain joint
constraints on sigma_8 and Omega_0 at the 95% confidence level. The
measurement over almost three decades of scale allows us to discuss the effect of
the shape of the power spectrum on the cosmological parameter estimation. The
degeneracy on sigma_8-Omega_0 can be broken if priors on the shape of the
linear power spectrum (that can be parameterized by Gamma) are assumed. For
instance, with Gamma=0.21 and at the 95% confidence level, we obtain
sigma_8 > 0.65 and
Omega_0<0.4 for flat (Lambda-CDM) models. From the tangential/radial modes
decomposition we can set an upper limit on the intrinsic shape alignment, which
was recently suggested as a possible contribution to the lensing signal. Within
the error bars, there is no detection of intrinsic shape alignment for scales
larger than 1'. Comment: 13 pages, submitted to A&A.
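As a concrete illustration of the statistics discussed in this abstract, the sketch below estimates the shear two-point correlation functions xi_+ and xi_- from a galaxy ellipticity catalogue with a brute-force pair count in Python/NumPy. It is a minimal sketch under stated assumptions (flat-sky conversion, arcminute binning, illustrative column conventions), not the survey's actual measurement pipeline.

```python
# Minimal sketch: brute-force estimator of the shear two-point correlation
# functions xi_+(theta) and xi_-(theta) from an ellipticity catalogue.
# Assumptions (not from the paper): flat-sky geometry, arcminute bins, O(N^2) loop.
import numpy as np

def shear_correlations(ra_deg, dec_deg, e1, e2, theta_bins_arcmin):
    ra_deg, dec_deg = np.asarray(ra_deg, float), np.asarray(dec_deg, float)
    e1, e2 = np.asarray(e1, float), np.asarray(e2, float)
    x = ra_deg * np.cos(np.radians(dec_deg.mean())) * 60.0   # arcmin (flat sky)
    y = dec_deg * 60.0                                        # arcmin
    nbin = len(theta_bins_arcmin) - 1
    xi_p, xi_m, npair = np.zeros(nbin), np.zeros(nbin), np.zeros(nbin)
    for i in range(len(x) - 1):
        dx, dy = x[i + 1:] - x[i], y[i + 1:] - y[i]
        theta = np.hypot(dx, dy)
        phi = np.arctan2(dy, dx)                 # position angle of each pair
        cos2, sin2 = np.cos(2 * phi), np.sin(2 * phi)
        # Tangential/cross ellipticity components relative to the pair axis.
        et_i = -(e1[i] * cos2 + e2[i] * sin2)
        ex_i = e1[i] * sin2 - e2[i] * cos2
        et_j = -(e1[i + 1:] * cos2 + e2[i + 1:] * sin2)
        ex_j = e1[i + 1:] * sin2 - e2[i + 1:] * cos2
        idx = np.digitize(theta, theta_bins_arcmin) - 1
        ok = (idx >= 0) & (idx < nbin)
        np.add.at(xi_p, idx[ok], (et_i * et_j + ex_i * ex_j)[ok])
        np.add.at(xi_m, idx[ok], (et_i * et_j - ex_i * ex_j)[ok])
        np.add.at(npair, idx[ok], 1.0)
    good = npair > 0
    xi_p[good] /= npair[good]
    xi_m[good] /= npair[good]
    return xi_p, xi_m, npair
```

Consistency between xi_+, xi_-, the aperture mass statistic and the top-hat variance is what allows the lensing-origin tests mentioned above, since all are different filterings of the same underlying convergence power spectrum.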
Modelling non-linear exposure-disease relationships in a large individual participant meta-analysis allowing for the effects of exposure measurement error
This thesis was motivated by data from the Emerging Risk Factors Collaboration (ERFC), a
large individual participant data (IPD) meta-analysis of risk factors for coronary heart disease (CHD). Cardiovascular disease is the largest cause of death in almost all countries in the world, so it is important to be able to characterise the shape of risk factor–CHD relationships.
Many of the risk factors for CHD considered by the ERFC are subject to substantial measurement error, and their relationships with CHD are non-linear. We first consider issues associated with modelling the risk factor–disease relationship in a single study, before using meta-analysis
to combine relationships across studies.
It is well known that classical measurement error generally attenuates linear exposure–disease relationships; however, its precise effect on non-linear relationships is less well understood. We
investigate the effect of classical measurement error on the shape of exposure–disease relationships
that are commonly encountered in epidemiological studies, and then consider methods for correcting for classical measurement error. We propose the application of a widely used correction method, regression calibration, to fractional polynomial models. We also consider
the effects of non-classical error on the observed exposure–disease relationship, and the impact on our correction methods when we erroneously assume classical measurement error.
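A minimal sketch of the regression-calibration idea described above, assuming classical error with two replicate measurements per person and a binary CHD outcome; the fractional-polynomial powers, the statsmodels-based fitting, and the naive plug-in of E[X|W] into a non-linear model are illustrative assumptions rather than the thesis's exact implementation (the adequacy of that plug-in for non-linear models is precisely what the thesis examines).

```python
# Hedged sketch of regression calibration with replicate measurements.
# Assumes classical error: W_j = X + U_j for replicates j = 1, 2.
import numpy as np
import statsmodels.api as sm

def regression_calibration(w1, w2, y):
    wbar = (w1 + w2) / 2.0
    var_u = np.var(w1 - w2, ddof=1) / 2.0        # within-person (error) variance
    var_w = np.var(wbar, ddof=1)
    var_x = max(var_w - var_u / 2.0, 1e-12)      # variance of the true exposure
    lam = var_x / (var_x + var_u / 2.0)          # attenuation for the mean of 2 replicates
    x_hat = wbar.mean() + lam * (wbar - wbar.mean())   # E[X | W] under normality
    # Illustrative fractional-polynomial design with powers (1, 2); naively
    # plugging in E[X|W] is only approximate for non-linear outcome models.
    design = sm.add_constant(np.column_stack([x_hat, x_hat ** 2]))
    model = sm.Logit(y, design).fit(disp=0)      # binary CHD outcome assumed
    return lam, model.params
```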
Analyses performed using categorised continuous exposures are common in epidemiology. We
show that MacMahon’s method for correcting for measurement error in analyses that use categorised continuous exposures, although simple, does not provide the correct shape for nonlinear exposure–disease relationships. We perform a simulation study to compare alternative methods for categorised continuous exposures.
Meta-analysis is the statistical synthesis of results from a number of studies addressing similar research hypotheses. The use of IPD is the gold standard approach because it allows for consistent analysis of the exposure–disease relationship across studies. Methods have recently been proposed for combining non-linear relationships across studies. We discuss these methods,
extend them to P-spline models, and consider alternative methods of combining relationships across studies.
We apply the methods developed to the relationships of fasting blood glucose and lipoprotein(a) with CHD, using data from the ERFC. This work was supported by the Medical Research Council.
Statistical methods for the time-to-event analysis of individual participant data from multiple epidemiological studies
Background: Meta-analysis of individual participant time-to-event data from multiple prospective epidemiological studies enables detailed investigation of exposure–risk relationships, but involves a number of analytical challenges.
Methods: This article describes statistical approaches adopted in the Emerging Risk Factors Collaboration, in which primary data from more than 1 million participants in more than 100 prospective studies have been collated to enable detailed analyses of various risk markers in relation to incident cardiovascular disease outcomes.
Results: Analyses have been principally based on Cox proportional hazards regression models stratified by sex, undertaken in each study separately. Estimates of exposure–risk relationships, initially unadjusted and then adjusted for several confounders, have been combined over studies using meta-analysis. Methods for assessing the shape of exposure–risk associations and the proportional hazards assumption have been developed. Estimates of interactions have also been combined using meta-analysis, keeping separate within- and between-study information. Regression dilution bias caused by measurement error and within-person variation in exposures and confounders has been addressed through the analysis of repeat measurements to estimate corrected regression coefficients. These methods are exemplified by an analysis of plasma fibrinogen and risk of coronary heart disease, and Stata code is made available.
Conclusion: Increasing numbers of meta-analyses of individual participant data from observational studies are being conducted to enhance the statistical power and detail of epidemiological studies. The statistical methods developed here can be used to address the needs of such analyses.
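The second stage of the two-stage approach described above can be sketched as follows: study-specific log hazard ratios (e.g. from sex-stratified Cox models fitted within each study) are pooled by inverse-variance weighting, with a DerSimonian-Laird estimate of between-study heterogeneity. This is a generic illustration in Python, not the collaboration's Stata code, and the input values in the example are invented.

```python
# Hedged sketch: fixed-effect and random-effects pooling of per-study log HRs.
import numpy as np

def combine_log_hazard_ratios(beta, se):
    beta, se = np.asarray(beta, float), np.asarray(se, float)
    w = 1.0 / se ** 2                                   # inverse-variance weights
    beta_fe = np.sum(w * beta) / np.sum(w)              # fixed-effect estimate
    q = np.sum(w * (beta - beta_fe) ** 2)               # Cochran's Q
    k = len(beta)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_re = 1.0 / (se ** 2 + tau2)                       # random-effects weights
    beta_re = np.sum(w_re * beta) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return beta_re, se_re, tau2

# Example with three hypothetical studies: pooled HR and 95% CI.
b, s, t2 = combine_log_hazard_ratios([0.25, 0.31, 0.18], [0.08, 0.12, 0.10])
print(np.exp(b), np.exp(b - 1.96 * s), np.exp(b + 1.96 * s))
```

Keeping within- and between-study information separate, as the abstract notes for interactions, corresponds to pooling study-specific interaction estimates in exactly this way rather than estimating them across studies.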
Automatic generation of statistical pose and shape models for articulated joints
Statistical analysis of motion patterns of body joints is potentially useful for detecting and quantifying pathologies. However, building a statistical motion model across different subjects remains a challenging task, especially for a complex joint like the wrist. We present a novel framework for simultaneous registration and segmentation of multiple 3-D (CT or MR) volumes of different subjects at various articulated positions. The framework starts with a pose model generated from 3-D volumes captured at different articulated positions of a single subject (template). This initial pose model is used to register the template volume to image volumes from new subjects. During this process, the Grow-Cut algorithm is used in an iterative refinement of the segmentation of the bone along with the pose parameters. As each new subject is registered and segmented, the pose model is updated, improving the accuracy of successive registrations. We applied the algorithm to CT images of the wrist from 25 subjects, each at five different wrist positions, and demonstrated that it performed robustly and accurately. More importantly, the resulting segmentations allowed a statistical pose model of the carpal bones to be generated automatically, without user interaction. The evaluation results show that our proposed framework achieved accurate registration with a low average mean target registration error. The automatic segmentation results also show high consistency with the ground truth obtained semi-automatically. Furthermore, we demonstrated the capability of the resulting statistical pose and shape models by using them to generate a measurement tool for scaphoid-lunate dissociation diagnosis, which achieved 90% sensitivity and specificity.
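For readers unfamiliar with statistical shape models, the sketch below shows the classical point-distribution-model construction (PCA on corresponding points) that such models are typically built on. It is a generic, hedged illustration and not the paper's registration/Grow-Cut pipeline; it assumes the bone surfaces have already been brought into correspondence (same number and ordering of points across subjects).

```python
# Hedged sketch: PCA-based statistical shape model from corresponding 3-D points.
import numpy as np

def build_shape_model(shapes, var_kept=0.95):
    """shapes: array (n_subjects, n_points, 3) of corresponding 3-D points."""
    n, p, _ = shapes.shape
    X = shapes.reshape(n, p * 3)
    mean_shape = X.mean(axis=0)
    Xc = X - mean_shape
    # SVD of the centred data gives the principal modes of shape variation.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = s ** 2 / (n - 1)
    k = np.searchsorted(np.cumsum(var) / var.sum(), var_kept) + 1
    return mean_shape, Vt[:k], var[:k]

def synthesise(mean_shape, modes, variances, b):
    """Generate a new shape from mode weights b (in standard-deviation units)."""
    coeff = np.asarray(b) * np.sqrt(variances[: len(b)])
    return (mean_shape + coeff @ modes[: len(b)]).reshape(-1, 3)
```

A pose model extends the same idea to the rigid transformations of each bone across articulated positions; the measurement tool mentioned in the abstract then flags configurations that lie far from the learned distribution.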
Spurious Shear in Weak Lensing with LSST
The complete 10-year survey from the Large Synoptic Survey Telescope (LSST)
will image 20,000 square degrees of sky in six filter bands every few
nights, eventually reaching the final survey depth, with over 4 billion
well measured galaxies. To take full advantage of this unprecedented
statistical power, the systematic errors associated with weak lensing
measurements need to be controlled to a level similar to the statistical
errors.
This work is the first attempt to quantitatively estimate the absolute level
and statistical properties of the systematic errors on weak lensing shear
measurements due to the most important physical effects in the LSST system via
high fidelity ray-tracing simulations. We identify and isolate the different
sources of algorithm-independent, \textit{additive} systematic errors on shear
measurements for LSST and predict their impact on the final cosmic shear
measurements using conventional weak lensing analysis techniques. We find that
the main source of the errors comes from an inability to adequately
characterise the atmospheric point spread function (PSF), owing to its
high-frequency spatial variation on small angular scales within the
single short exposures, which propagates into a spurious shear correlation
function on these scales. With the large
multi-epoch dataset that will be acquired by LSST, the stochastic errors
average out, bringing the final spurious shear correlation function to a level
very close to the statistical errors. Our results imply that the cosmological
constraints from LSST will not be severely limited by these
algorithm-independent, additive systematic effects. Comment: 22 pages, 12 figures, accepted by MNRAS.
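The averaging-down of stochastic errors described above can be demonstrated with a toy calculation: for an additive spurious shear that is statistically independent between exposures, averaging N_exp single-exposure maps suppresses the spurious correlation by roughly 1/N_exp. The sketch below is a hedged illustration with arbitrary grid size, correlation length and exposure count, not the paper's ray-tracing simulation.

```python
# Toy demonstration: stacking independent spurious-shear maps reduces their
# variance (zero-lag correlation) roughly as 1/N_exp.
import numpy as np
from numpy.fft import fft2, ifft2

rng = np.random.default_rng(0)
npix, n_exp, corr_pix = 128, 200, 4        # illustrative choices

def spurious_shear_map():
    """One exposure's additive systematic: spatially correlated Gaussian noise."""
    white = rng.normal(size=(npix, npix))
    kx = np.fft.fftfreq(npix)[:, None]
    ky = np.fft.fftfreq(npix)[None, :]
    kernel = np.exp(-0.5 * (kx ** 2 + ky ** 2) * (2 * np.pi * corr_pix) ** 2)
    return np.real(ifft2(fft2(white) * kernel))

single_var = np.mean(spurious_shear_map() ** 2)
stacked = np.mean([spurious_shear_map() for _ in range(n_exp)], axis=0)
print("single exposure variance  :", single_var)
print("%d-exposure stack variance:" % n_exp, np.mean(stacked ** 2))
# Expect the stacked variance to be roughly single_var / n_exp.
```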
Generating Survival Times to Simulate Cox Proportional Hazards Models
This paper discusses techniques to generate survival times for simulation studies regarding Cox proportional hazards models. In linear regression models, the response variable is directly connected with the considered covariates, the regression coefficients and the simulated random errors. Thus, the response variable can be generated from the regression function once the regression coefficients and the error distribution are specified. However, in the Cox model, which is formulated via the hazard function, the effects of the covariates have to be translated from the hazards to the survival times, because the usual software packages for estimation of Cox models require individual survival time data. A general formula describing the relation between the hazard and the corresponding survival time of the Cox model is derived. It is shown how the exponential, the Weibull and the Gompertz distribution can be used to generate appropriate survival times for simulation studies. Additionally, the general relation between hazard and survival time can be used to derive distributions for special situations and to handle flexibly parameterized proportional hazards models. Using distributions other than the exponential is indispensable for investigating the characteristics of the Cox proportional hazards model, especially in non-standard situations where the partial likelihood depends on the baseline hazard.
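A minimal sketch of the inversion method the abstract describes: if the Cox model has hazard h(t|x) = h0(t) exp(x'beta) with cumulative baseline hazard H0, then T = H0^{-1}(-log(U) exp(-x'beta)) with U ~ Uniform(0, 1) has the required distribution. The baseline parameters and covariate effects below are illustrative choices, not values from the paper.

```python
# Hedged sketch: simulate survival times from a Cox PH model by inverting the
# cumulative baseline hazard (exponential, Weibull or Gompertz baseline).
import numpy as np

rng = np.random.default_rng(42)

def cox_survival_times(x, beta, baseline="weibull", lam=0.1, nu=1.5, alpha=0.05):
    lin_pred = np.asarray(x) @ np.asarray(beta)
    u = rng.uniform(size=len(lin_pred))
    z = -np.log(u) * np.exp(-lin_pred)          # -log(U) / exp(x'beta)
    if baseline == "exponential":               # h0(t) = lam
        return z / lam
    if baseline == "weibull":                   # h0(t) = lam * nu * t**(nu - 1)
        return (z / lam) ** (1.0 / nu)
    if baseline == "gompertz":                  # h0(t) = lam * exp(alpha * t)
        return np.log(1.0 + alpha * z / lam) / alpha
    raise ValueError("unknown baseline hazard")

# Example: two covariates with log hazard ratios 0.5 and -0.3.
x = rng.normal(size=(1000, 2))
t = cox_survival_times(x, beta=[0.5, -0.3], baseline="gompertz")
```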