Sensitivity Analysis for Multiple Comparisons in Matched Observational Studies through Quadratically Constrained Linear Programming
A sensitivity analysis in an observational study assesses the robustness of
significant findings to unmeasured confounding. While sensitivity analyses in
matched observational studies have been well addressed when there is a single
outcome variable, accounting for multiple comparisons through existing methods yields overly conservative results when there are multiple outcome variables of interest. The conservativeness arises because unmeasured confounding cannot, in reality, affect the probability of assignment to treatment differently depending on the outcome being analyzed, yet existing methods implicitly allow it to do so by combining the results of individual sensitivity analyses to assess whether at least one hypothesis is significant. This yields an overly pessimistic assessment of a study's sensitivity to unobserved biases. By
solving a quadratically constrained linear program, we are able to perform a
sensitivity analysis while enforcing that unmeasured confounding must have the
same impact on the treatment assignment probabilities across outcomes for each
individual in the study. We show that this allows for uniform improvements in
the power of a sensitivity analysis not only for testing the overall null of no
effect, but also for null hypotheses on specific outcome variables while strongly controlling the familywise error rate. We illustrate our method through an observational study on the effect of smoking on naphthalene exposure.
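To make the optimization component concrete, the sketch below sets up a generic quadratically constrained linear program in Python with SciPy. The decision variables play the role of per-pair treatment-assignment probabilities that are box-constrained by a sensitivity parameter Gamma and shared across outcomes; the objective vector, the matrix Q, and the bound b are hypothetical placeholders, not the formulation used in the paper.

```python
# Minimal, generic QCLP sketch (not the authors' actual formulation):
# linear objective, box constraints on per-pair treatment-assignment
# probabilities shared across outcomes, one quadratic inequality constraint.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_pairs = 20                                  # toy number of matched pairs
gamma = 2.0                                   # sensitivity parameter Gamma
lo, hi = 1.0 / (1.0 + gamma), gamma / (1.0 + gamma)

c = rng.normal(size=n_pairs)                  # placeholder linear objective
Q = np.eye(n_pairs)                           # placeholder PSD matrix
b = 0.6 * n_pairs * hi ** 2                   # placeholder quadratic bound

res = minimize(
    fun=lambda p: c @ p,                      # linear objective c'p
    x0=np.full(n_pairs, 0.5),
    method="SLSQP",
    bounds=[(lo, hi)] * n_pairs,              # 1/(1+Gamma) <= p_i <= Gamma/(1+Gamma)
    constraints=[{"type": "ineq",             # quadratic constraint p'Qp <= b
                  "fun": lambda p: b - p @ Q @ p}],
)
print(res.x.round(3), round(res.fun, 3))
```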
Discrete Optimization for Interpretable Study Populations and Randomization Inference in an Observational Study of Severe Sepsis Mortality
Motivated by an observational study of the effect of hospital ward versus
intensive care unit admission on severe sepsis mortality, we develop methods to
address two common problems in observational studies: (1) when there is a lack
of covariate overlap between the treated and control groups, how to define an
interpretable study population wherein inference can be conducted without
extrapolating with respect to important variables; and (2) how to use
randomization inference to form confidence intervals for the average treatment
effect with binary outcomes. Our solution to problem (1) incorporates existing suggestions in the literature while yielding a study population that is easily understood in terms of the covariates themselves, and the resulting optimization problem can be solved with an efficient branch-and-bound algorithm. We address problem (2) by solving a linear integer program that utilizes the worst-case variance of the average treatment effect over the values of unobserved potential outcomes that are compatible with the null hypothesis. Among less severely ill patients and among patients with cryptic septic shock, our analysis finds no evidence of a difference between the sixty-day mortality rates that would result if all patients were admitted to the ICU and if all were admitted to the hospital ward. We implement our methodology in R, providing scripts in the supplementary material.
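As a rough illustration of the discrete-optimization flavour of both steps, the following Python sketch solves a small 0/1 linear program with SciPy's branch-and-bound based milp solver. The binary variables stand in for unobserved potential outcomes, and the objective and the single constraint are hypothetical placeholders rather than the worst-case-variance formulation or the null-compatibility constraints described above.

```python
# Minimal, generic 0/1 linear program via scipy.optimize.milp (SciPy >= 1.9).
# Objective coefficients and the constraint are illustrative placeholders.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

rng = np.random.default_rng(1)
n = 30                                        # toy number of study subjects
c = rng.normal(size=n)                        # placeholder objective coefficients

# Placeholder "compatibility with the null" constraint: sum of z_i fixed to k.
k = 10
constraint = LinearConstraint(np.ones((1, n)), lb=k, ub=k)

res = milp(
    c=-c,                                     # milp minimizes, so negate to maximize c'z
    constraints=[constraint],
    integrality=np.ones(n),                   # all variables integer...
    bounds=Bounds(lb=0, ub=1),                # ...and binary via 0/1 bounds
)
print(res.x.astype(int), -res.fun)
```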
The generalized identification of truly interfacial molecules (ITIM) algorithm for nonplanar interfaces
We present a generalized version of the ITIM algorithm for the identification of interfacial molecules, which is able to treat arbitrarily shaped interfaces. The algorithm exploits the similarities between the concept of the probe sphere used in ITIM and the circumsphere criterion used in the α-shapes approach, and can be regarded either as a reference-frame-independent version of the former or as an extended version of the latter that includes the atomic excluded volume. The new algorithm is applied to compute the intrinsic orientational order parameters of water around a dodecylphosphocholine and a cholic acid micelle in an aqueous environment, and to the identification of solvent-reachable sites in four model structures for soot. The additional algorithm introduced for the calculation of intrinsic density profiles in arbitrary geometries also proved extremely useful for planar interfaces, as it makes it possible to resolve the paradox of smeared intrinsic profiles far from the interface.
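For orientation, the sketch below implements the original planar-interface probe-sphere idea behind ITIM in Python, not the generalized nonplanar algorithm described in the abstract: probe spheres descend along a grid of test lines, and the first atom each probe touches is flagged as interfacial. Positions, radii, the probe radius, and the grid spacing are toy values.

```python
# Minimal sketch of the planar-interface probe-sphere idea behind ITIM
# (toy data, not the generalized nonplanar algorithm of the paper).
import numpy as np

rng = np.random.default_rng(2)
pos = rng.uniform(0.0, 30.0, size=(500, 3))   # toy atomic positions (Angstrom)
radii = np.full(len(pos), 1.5)                # toy atomic radii
probe_r = 2.0                                 # probe-sphere radius
grid = np.linspace(0.0, 30.0, 16)             # grid of test lines along z

interfacial = np.zeros(len(pos), dtype=bool)
for x in grid:
    for y in grid:
        # lateral distance from each atom to the test line at (x, y)
        d = np.hypot(pos[:, 0] - x, pos[:, 1] - y)
        reach = radii + probe_r
        hit = d < reach                       # atoms the descending probe can touch
        if not hit.any():
            continue
        # z of the probe centre at the moment of contact with each reachable atom
        z_contact = pos[hit, 2] + np.sqrt(reach[hit] ** 2 - d[hit] ** 2)
        # the atom contacted first (highest contact z) is an interfacial atom
        interfacial[np.flatnonzero(hit)[np.argmax(z_contact)]] = True

print("interfacial atoms found:", interfacial.sum())
```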
Surrogate-assisted network analysis of nonlinear time series
The performance of recurrence networks and symbolic networks in detecting weak nonlinearities in time series is compared with that of the nonlinear prediction error.
For the synthetic data of the Lorenz system, the network measures show a
comparable performance. In the case of relatively short and noisy real-world
data from active galactic nuclei, the nonlinear prediction error yields more
robust results than the network measures. The tests are based on surrogate data
sets. The correlations in the Fourier phases of data sets from some surrogate-generating algorithms are also examined. These phase correlations are shown to
have an impact on the performance of the tests for nonlinearity.
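The general shape of such a surrogate test can be sketched as follows in Python: Fourier-transform surrogates preserve the power spectrum but randomize the phases, and a discriminating statistic computed on the data is compared with its distribution over the surrogates. The time-reversal-asymmetry statistic and the noisy logistic-map series used here are illustrative stand-ins for the network measures, the nonlinear prediction error, and the Lorenz and active-galactic-nuclei data discussed in the abstract.

```python
# Minimal sketch of a Fourier-transform (phase-randomized) surrogate test
# for nonlinearity, with an illustrative statistic and toy data.
import numpy as np

rng = np.random.default_rng(3)

def ft_surrogate(x, rng):
    """Surrogate with the same power spectrum but randomized Fourier phases."""
    spec = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=spec.size)
    phases[0] = 0.0                            # keep the zero-frequency term real
    if x.size % 2 == 0:
        phases[-1] = 0.0                       # keep the Nyquist term real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=x.size)

def asymmetry(x):
    """Time-reversal asymmetry; near zero for linear Gaussian processes."""
    d = np.diff(x)
    return np.mean(d ** 3) / np.mean(d ** 2) ** 1.5

# Toy nonlinear, time-irreversible series: a noisy chaotic logistic map.
n = 2048
x = np.empty(n)
x[0] = 0.4
for t in range(1, n):
    x[t] = 3.8 * x[t - 1] * (1.0 - x[t - 1])
x += 0.01 * rng.normal(size=n)

stat = asymmetry(x)
null = np.array([asymmetry(ft_surrogate(x, rng)) for _ in range(99)])
# Rank-based p-value: how many surrogates are at least as asymmetric as the data?
p = (1 + np.sum(np.abs(null) >= np.abs(stat))) / (1 + null.size)
print(f"statistic={stat:.3f}, surrogate p-value={p:.3f}")
```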
