Comparing compact binary parameter distributions I: Methods
Being able to measure each merger's sky location, distance, component masses,
and conceivably spins, ground-based gravitational-wave detectors will provide an
extensive and detailed sample of coalescing compact binaries (CCBs) in the
local and, with third-generation detectors, distant universe. These
measurements will distinguish between competing progenitor formation models. In
this paper we develop practical tools to characterize the amount of
experimentally accessible information available to distinguish between two a
priori progenitor models. Using a simple time-independent model, we demonstrate
that the information content scales strongly with the number of observations. The
exact scaling depends on how significantly mass distributions change between
similar models. We develop phenomenological diagnostics to estimate how many
models can be distinguished, using first-generation and future instruments.
Finally, we emphasize that multi-observable distributions can be fully
exploited only with very precisely calibrated detectors, search pipelines,
parameter estimation, and Bayesian model inference.
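The scaling argument can be sketched numerically: under standard model-selection asymptotics, the expected log Bayes factor between two population models grows roughly linearly with the number of observed events, at a rate set by their Kullback-Leibler divergence. A minimal sketch with purely illustrative Gaussian chirp-mass distributions (the numbers are not from the paper):

```python
import numpy as np
from scipy import stats

# Two hypothetical a priori chirp-mass distributions (illustrative values).
model_a = stats.norm(loc=8.0, scale=2.0)   # model A, chirp mass in M_sun
model_b = stats.norm(loc=9.0, scale=2.5)   # model B, chirp mass in M_sun

# Monte Carlo estimate of the KL divergence D(A || B).
rng = np.random.default_rng(0)
x = model_a.rvs(size=200_000, random_state=rng)
kl = float(np.mean(model_a.logpdf(x) - model_b.logpdf(x)))

# Expected log Bayes factor after N events drawn from model A is ~ N * D(A||B):
# the experimentally accessible information grows with the number of observations.
expected_log_bf = {n: n * kl for n in (10, 100, 1000)}
```

Two models become distinguishable roughly once N * D(A||B) exceeds a few, which is why closely related models (small KL divergence) require many more detections.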
Twenty-year study of in-hospital and post-discharge mortality following emergency general surgical admission
We are grateful to Lizzie Nicholson and the team at the Information Services Division, Scotland, for their support in providing us with these data, and to the Data Safehaven Department of the University of Aberdeen for its storage. The authors would also like to thank Dr Neil Scott and Dr Rute Vieira of the Department of Medical Statistics, University of Aberdeen, for their advice in conducting this research.
Efficient Bayesian hierarchical functional data analysis with basis function approximations using Gaussian-Wishart processes
Functional data are defined as realizations of random functions (mostly
smooth functions) varying over a continuum, which are usually collected with
measurement errors on discretized grids. In order to accurately smooth noisy
functional observations and deal with the issue of high-dimensional observation
grids, we propose a novel Bayesian method based on the Bayesian hierarchical
model with a Gaussian-Wishart process prior and basis function representations.
We first derive an induced model for the basis-function coefficients of the
functional data, and then use this model to conduct posterior inference through
Markov chain Monte Carlo. Compared to standard Bayesian inference, which
suffers from a serious computational burden and instability when analyzing
high-dimensional functional data, our method greatly improves the computational
scalability and stability, while inheriting the advantage of simultaneously
smoothing raw observations and estimating the mean-covariance functions in a
nonparametric way. In addition, our method can naturally handle functional data
observed on random or uncommon grids. Simulation studies and real-data analyses
demonstrate that our method produces results similar to those of standard Bayesian inference
with low-dimensional common grids, while efficiently smoothing and estimating
functional data with random and high-dimensional observation grids where the
standard Bayesian inference fails. In conclusion, our method can efficiently
smooth and estimate high-dimensional functional data, providing one way to
resolve the curse of dimensionality for Bayesian functional data analysis with
Gaussian-Wishart processes. Comment: Under review
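The basis-function reduction step can be illustrated with a small sketch: project noisy observations on a dense (possibly irregular) grid onto a low-dimensional basis by least squares, then carry out inference on the coefficients rather than the grid values. The Fourier basis and all constants below are illustrative choices, not the paper's; the actual model places a Gaussian-Wishart process prior at the coefficient level and samples by MCMC:

```python
import numpy as np

rng = np.random.default_rng(1)

# One noisy functional observation on a dense, irregular grid.
t = np.sort(rng.uniform(0.0, 1.0, size=500))
y = np.sin(2 * np.pi * t) + rng.normal(scale=0.3, size=t.size)

# Fourier basis with K columns: [1, sin(2*pi*t), cos(2*pi*t), sin(4*pi*t), ...].
K = 7
def basis(t, K):
    cols = [np.ones_like(t)]
    for k in range(1, (K - 1) // 2 + 1):
        cols += [np.sin(2 * np.pi * k * t), np.cos(2 * np.pi * k * t)]
    return np.column_stack(cols[:K])

B = basis(t, K)
# Least-squares projection: the 500 grid values are replaced by K coefficients,
# on which posterior inference is then conducted.
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
smooth = B @ coef   # de-noised reconstruction of the curve
```

This is what makes the approach scale: the dimension of the inference problem is the number of basis functions, not the (possibly huge or curve-specific) number of grid points.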
Functional Data Analysis of Amplitude and Phase Variation
The abundance of functional observations in scientific endeavors has led to a
significant development in tools for functional data analysis (FDA). This kind
of data comes with several challenges: infinite-dimensionality of function
spaces, observation noise, and so on. However, there is another interesting
phenomenon that creates problems in FDA. Functional data often come with
lateral displacements/deformations in curves, a phenomenon which is different
from height or amplitude variability and is termed phase variation. The
presence of phase variability often artificially inflates data variance, blurs
underlying data structures, and distorts principal components. While the
separation and/or removal of phase from amplitude data is desirable, this is a
difficult problem. In particular, a commonly used alignment procedure, based on
minimizing the L2 norm between functions, does not provide
satisfactory results. In this paper we motivate the importance of dealing with
the phase variability and summarize several current ideas for separating phase
and amplitude components. These approaches differ in the following: (1) the
definition and mathematical representation of phase variability, (2) the
objective functions that are used in functional data alignment, and (3) the
algorithmic tools for solving estimation/optimization problems. We use simple
examples to illustrate various approaches and to provide useful contrast
between them. Comment: Published at http://dx.doi.org/10.1214/15-STS524 in Statistical
Science (http://www.imstat.org/sts/) by the Institute of Mathematical
Statistics (http://www.imstat.org)
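A toy version of the alignment problem, assuming a deliberately simplified warp class of pure shifts gamma(t) = t + c (the approaches surveyed use much richer warping groups): two Gaussian bumps have identical amplitude and differ only in phase, and minimizing the squared L2 distance over the shift recovers the lateral displacement.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 201)
f = np.exp(-0.5 * ((t - 0.40) / 0.05) ** 2)   # template curve
g = np.exp(-0.5 * ((t - 0.55) / 0.05) ** 2)   # same bump, phase-shifted by 0.15

# Grid search over shift warps gamma(t) = t + c, minimizing ||f - g o gamma||^2.
shifts = np.linspace(-0.3, 0.3, 601)
costs = [np.sum((f - np.interp(t + c, t, g)) ** 2) for c in shifts]
best = float(shifts[int(np.argmin(costs))])
# best recovers the true phase offset of 0.15; the residual amplitude
# difference after alignment is essentially zero.
```

Once curves differ in amplitude as well as phase, this naive L2 criterion runs into the "pinching" problems the paper describes, which motivates the alternative objective functions it surveys.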
Causes of death after emergency general surgical admission: population cohort study of mortality
Funding: This study was funded by the NHS Highland Endowments fund.
Cosmic Calibration: Constraints from the Matter Power Spectrum and the Cosmic Microwave Background
Several cosmological measurements have attained significant levels of
maturity and accuracy over the last decade. Continuing this trend, future
observations promise measurements of the statistics of the cosmic mass
distribution at an accuracy level of one percent out to spatial scales with
k~10 h/Mpc and even smaller, entering highly nonlinear regimes of gravitational
instability. In order to interpret these observations and extract useful
cosmological information from them, such as the equation of state of dark
energy, very costly high precision, multi-physics simulations must be
performed. We have recently implemented a new statistical framework with the
aim of obtaining accurate parameter constraints from combining observations
with a limited number of simulations. The key idea is the replacement of the
full simulator by a fast emulator with controlled error bounds. In this paper,
we provide a detailed description of the methodology and extend the framework
to include joint analysis of cosmic microwave background and large scale
structure measurements. Our framework is especially well-suited for upcoming
large scale structure probes of dark energy such as baryon acoustic
oscillations and, especially, weak lensing, where percent level accuracy on
nonlinear scales is needed. Comment: 15 pages, 14 figures
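The emulation idea can be sketched with a plain radial-basis-function Gaussian-process interpolator built from a handful of "simulator" runs (a hypothetical one-dimensional stand-in; the actual framework emulates the full matter power spectrum with controlled error bounds):

```python
import numpy as np

def simulator(theta):
    # Stand-in for a costly multi-physics simulation run.
    return np.sin(3.0 * theta) + 0.5 * theta

# A limited number of training runs at chosen design points.
X = np.linspace(0.0, 2.0, 8)
y = simulator(X)

def rbf(a, b, length=0.5):
    # Squared-exponential kernel (length scale is an illustrative choice).
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

K = rbf(X, X) + 1e-8 * np.eye(X.size)   # jitter for numerical stability
K_inv = np.linalg.inv(K)
alpha = K_inv @ y

def emulate(theta):
    """Fast GP prediction with a pointwise error estimate (posterior sd)."""
    k = rbf(np.atleast_1d(theta), X)
    mean = k @ alpha
    var = 1.0 - np.einsum('ij,jk,ik->i', k, K_inv, k)
    return mean, np.sqrt(np.maximum(var, 0.0))

mean, sd = emulate(0.7)   # milliseconds instead of a full simulation
```

The emulator reproduces the training runs exactly and interpolates between them with a quantified uncertainty, which is what allows parameter constraints to be derived from a limited simulation budget.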
An Optimal Linear Time Algorithm for Quasi-Monotonic Segmentation
Monotonicity is a simple yet significant qualitative characteristic. We
consider the problem of segmenting a sequence in up to K segments. We want
segments to be as monotonic as possible and to alternate signs. We propose a
quality metric for this problem using the l_inf norm, and we present an optimal
linear time algorithm based on a novel formalism. Moreover, given a
precomputation in time O(n log n) consisting of a labeling of all extrema, we
compute any optimal segmentation in constant time. We compare experimentally
its performance to two piecewise linear segmentation heuristics (top-down and
bottom-up). We show that our algorithm is faster and more accurate.
Applications include pattern recognition and qualitative modeling. Comment: This is the extended version of our ICDM'05 paper (arXiv:cs/0702142)
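The l_inf quality metric has a convenient closed form for a single nondecreasing segment: the best achievable error equals half the largest "drop" of the data below its running maximum. A small illustrative helper for that single-segment fact (the paper's contribution is the linear-time algorithm over up to K alternating segments, not this formula):

```python
def linf_monotone_error(x):
    """Minimal l_inf error when approximating x by a nondecreasing sequence:
    half the largest gap between the running maximum and the current value."""
    run_max = float('-inf')
    worst = 0.0
    for v in x:
        run_max = max(run_max, v)
        worst = max(worst, run_max - v)
    return worst / 2.0

print(linf_monotone_error([1, 3, 2, 5, 4]))  # worst drop is 1 (3->2 or 5->4), so 0.5
```

The fit achieving this error is m_i = (running max up to i) - worst/2, which is nondecreasing and stays within worst/2 of every point; any pair i < j with x_i - x_j = worst shows no fit can do better.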
Why do some intermediate polars show soft X-ray emission? A survey of XMM-Newton spectra
We make a systematic analysis of the XMM-Newton X-ray spectra of intermediate
polars (IPs) and find that, contrary to the traditional picture, most show a
soft blackbody component. We compare the results with those from AM Her stars
and deduce that the blackbody emission arises from reprocessing of hard X-rays,
rather than from the blobby accretion sometimes seen in AM Hers. Whether an IP
shows a blackbody component appears to depend primarily on geometric factors: a
blackbody is not seen in those that have accretion footprints that are always
obscured by accretion curtains or are only visible when foreshortened on the
white-dwarf limb. Thus we argue against previous suggestions that the blackbody
emission characterises a separate sub-group of IPs which are more akin to AM
Hers, and develop a unified picture of the blackbody emission in these stars. Comment: 9 pages, 6 figures. Accepted for publication in Ap
The Three Dimensional Structure of EUV Accretion Regions in AM Herculis Stars: Modeling of EUV Photometric and Spectroscopic Observations
We have developed a model of the high-energy accretion region for magnetic
cataclysmic variables and applied it to {\it Extreme Ultraviolet Explorer}
observations of 10 AM Herculis type systems. The major features of the EUV
light curves are well described by the model. The light curves exhibit a large
variety of features such as eclipses of the accretion region by the secondary
star and the accretion stream, and dips caused by material very close to the
accretion region. While all the observed features of the light curves are
highly dependent on viewing geometry, none of the light curves are consistent
with a flat, circular accretion spot whose light curve would vary solely from
projection effects. The accretion region immediately above the WD surface is a
source of EUV radiation caused by either a vertical extent to the accretion
spot, or Compton scattering off electrons in the accretion column, or, very
likely, both. Our model yields spot sizes averaging 0.06 R_WD and average spot
heights of 0.023 R_WD. Spectra extracted during broad dip phases are softer than spectra
during the out-of-dip phases. This spectral ratio measurement leads to the
conclusion that Compton scattering, some absorption by a warm absorber,
geometric effects, an asymmetric temperature structure in the accretion region
and an asymmetric density structure of the accretion column are all important
components needed to fully explain the data. Spectra extracted at phases where
the accretion spot is hidden behind the limb of the WD, but with the accretion
column immediately above the spot still visible, show no evidence of emission
features characteristic of a hot plasma.Comment: 30 Pages, 11 Figure
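The benchmark the observed light curves are judged against, a flat circular spot varying solely from projection effects, can be sketched as foreshortened flux F proportional to max(0, cos psi), where psi is the angle between the spot normal and the line of sight. The inclination and spot colatitude below are arbitrary illustrative values, not fitted parameters:

```python
import numpy as np

# Assumed (illustrative) geometry: orbital inclination i, spot colatitude beta.
i, beta = np.radians(70.0), np.radians(30.0)

phase = np.linspace(0.0, 1.0, 200)
# cos(psi) for a spot rotating with the white dwarf.
cos_psi = (np.cos(i) * np.cos(beta)
           + np.sin(i) * np.sin(beta) * np.cos(2 * np.pi * phase))
# Flat-spot light curve: pure foreshortening, zero when the spot is behind the limb.
flux = np.maximum(cos_psi, 0.0)
```

Any structure in the observed EUV light curves beyond this smooth, purely geometric shape (vertical extent of the region, Compton scattering in the column, absorption dips) is what the modeling in the paper is designed to capture.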
Credit bureaus between risk-management, creditworthiness assessment and prudential supervision
This paper discusses the role and operations of consumer Credit Bureaus in the European Union in the context of the economic theories, policies and law within which they work. Across Europe there is no common practice of sharing the credit data of consumers, which can be used for several purposes. Mostly, they are used by the lending industry as a practice of creditworthiness assessment or as a risk-management tool to underwrite borrowing decisions or price risk. However, the type, breadth, and depth of information differ greatly from country to country. In some Member States, consumer data are part of a broader information centralisation system for the prudential supervision of banks and the financial system as a whole. Despite EU rules on credit to consumers for the creation of the internal market, the underlying consumer data infrastructure remains fragmented at national level, failing to achieve univocal, common, or defined policy objectives under a harmonised legal framework. Likewise, the establishment of the Banking Union and the prudential supervision of the Euro area demand standardisation and convergence of the data used to measure debt levels, arrears, and delinquencies. The many functions and usages of credit data suggest that the policy goals to be achieved should inform the legal and institutional framework of Credit Bureaus, as well as the design and use of the databases. This is also because fundamental rights and consumer protection concerns arise from the sharing of credit data and their expanding use.