
    Defining and Estimating Intervention Effects for Groups that will Develop an Auxiliary Outcome

    It has recently become popular to define treatment effects for subsets of the target population characterized by variables not observable at the time a treatment decision is made. Characterizing and estimating such treatment effects is tricky; the most popular but naive approach inappropriately adjusts for variables affected by treatment and so is biased. We consider several appropriate ways to formalize the effects: principal stratification, stratification on a single potential auxiliary variable, stratification on an observed auxiliary variable, and stratification on expected levels of auxiliary variables. We then outline identifying assumptions for each type of estimand. We evaluate the utility of these estimands and estimation procedures for decision making and for understanding causal processes, contrasting them with the concepts of direct and indirect effects. We motivate our development with examples from nephrology and cancer screening, and use simulated data and real data on cancer screening to illustrate the estimation methods. Comment: Published at http://dx.doi.org/10.1214/088342306000000655 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org)
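    The bias from naively adjusting for a variable affected by treatment can be seen in a toy simulation. Everything below (variable names, the data-generating process, effect sizes) is invented for illustration and is not the paper's example: a hidden confounder drives both the post-treatment auxiliary outcome and the final outcome, so restricting the comparison to units with the auxiliary outcome distorts the contrast.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Invented data-generating process: treatment z, hidden variable u,
# post-treatment auxiliary outcome s affected by both, outcome y.
z = rng.integers(0, 2, n)
u = rng.normal(size=n)
s = (u + 0.8 * z + rng.normal(size=n) > 0).astype(int)  # auxiliary outcome
y = 1.0 * z + 2.0 * u + rng.normal(size=n)              # true effect of z is 1.0

# Unadjusted contrast recovers the true effect; conditioning on s
# (a variable affected by treatment) opens a path through u and biases it.
overall = y[z == 1].mean() - y[z == 0].mean()
naive = y[(z == 1) & (s == 1)].mean() - y[(z == 0) & (s == 1)].mean()
print(round(overall, 2))  # close to the true effect 1.0
print(round(naive, 2))    # substantially smaller: biased
```

The naive contrast is biased because, among units with s = 1, the untreated group must have larger u on average to cross the threshold, and u also raises y.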

    Detecting periodicity in experimental data using linear modeling techniques

    Fourier spectral estimates and, to a lesser extent, the autocorrelation function are the primary tools used to detect periodicities in experimental data in the physical and biological sciences. We propose a new method which is more reliable than traditional techniques and is able to make clear identification of periodic behavior when traditional techniques do not. The technique is based on an information-theoretic reduction of linear (autoregressive) models so that only the essential features of an autoregressive model are retained; we call these reduced autoregressive models (RARM). The essential features of reduced autoregressive models include any periodicity present in the data. We provide theoretical and numerical evidence, from both experimental and artificial data, to demonstrate that this technique will reliably detect periodicities if and only if they are present in the data. There are strong information-theoretic arguments that RARM detects periodicities if they are present; surrogate data techniques are used to ensure the converse. Furthermore, our calculations demonstrate that RARM is more robust, more accurate, and more sensitive than traditional spectral techniques. Comment: 10 pages (revtex) and 6 figures. To appear in Phys. Rev. E. Modified style.
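    A minimal sketch of the reduction idea, using a simplified MDL-style cost (data misfit plus a penalty per retained lag) and greedy backward elimination. Both are stand-ins for the paper's actual information-theoretic procedure; the toy series and all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy series: a period-7 oscillation plus observational noise.
n = 500
t = np.arange(n)
x = np.sin(2 * np.pi * t / 7) + 0.3 * rng.normal(size=n)

P = 10  # largest candidate lag; all fits use the same sample for comparability

def ar_cost(x, lags):
    """Least-squares AR fit restricted to `lags`; returns an MDL-like cost:
    log residual variance term plus a penalty for each retained coefficient."""
    X = np.column_stack([x[P - k : len(x) - k] for k in lags])
    y = x[P:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    m = len(y)
    return 0.5 * m * np.log(resid @ resid / m) + 0.5 * len(lags) * np.log(m)

# Greedy backward reduction: drop any lag whose removal lowers the cost.
lags = list(range(1, P + 1))
improved = True
while improved and len(lags) > 1:
    improved = False
    for k in list(lags):
        trial = [j for j in lags if j != k]
        if ar_cost(x, trial) < ar_cost(x, lags):
            lags, improved = trial, True
            break
print(lags)  # the surviving lags encode the oscillatory structure
```

Because a noisy sinusoid is well described by a low-order autoregression, most of the ten candidate lags are redundant and the penalty prunes them away.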

    Surrogate-assisted network analysis of nonlinear time series

    The performance of recurrence networks and symbolic networks in detecting weak nonlinearities in time series is compared to that of the nonlinear prediction error. For synthetic data from the Lorenz system, the network measures show comparable performance. In the case of relatively short and noisy real-world data from active galactic nuclei, the nonlinear prediction error yields more robust results than the network measures. The tests are based on surrogate data sets. The correlations in the Fourier phases of data sets from some surrogate-generating algorithms are also examined, and these phase correlations are shown to have an impact on the performance of the tests for nonlinearity. Comment: 9 pages, 5 figures, Chaos (http://scitation.aip.org/content/aip/journal/chaos), corrected typo.
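    The surrogate data sets behind such tests can be generated by randomizing Fourier phases while keeping the power spectrum. The sketch below is a basic FT surrogate, assumed for illustration; the algorithms examined in the paper (e.g. AAFT/IAAFT variants) additionally adjust amplitudes, which is where the phase correlations discussed above can arise.

```python
import numpy as np

rng = np.random.default_rng(2)

def phase_surrogate(x, rng):
    """FT surrogate: same power spectrum as x, randomized Fourier phases.

    Preserves all linear correlations while destroying any nonlinear
    structure in the time ordering of the data."""
    n = len(x)
    X = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(X))
    S = np.abs(X) * np.exp(1j * phases)
    S[0] = X[0]                # keep the mean (DC component) unchanged
    if n % 2 == 0:
        S[-1] = X[-1]          # keep the Nyquist component unchanged
    return np.fft.irfft(S, n)

x = np.cumsum(rng.normal(size=1024))   # example series: a random walk
s = phase_surrogate(x, rng)

# Power spectra agree to rounding error; the time courses do not.
print(np.allclose(np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(s))))
```

A test statistic (e.g. the nonlinear prediction error) computed on the original series is then compared against its distribution over an ensemble of such surrogates.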

    Dynamical modeling of collective behavior from pigeon flight data: flock cohesion and dispersion

    Several models of flocking have been promoted based on simulations with qualitatively naturalistic behavior. In this paper we provide the first direct application of computational modeling methods to infer flocking behavior from experimental field data. We show that this approach is able to infer general rules for interaction, or lack of interaction, among members of a flock or, more generally, any community. Using experimental field measurements of homing pigeons in flight we demonstrate the existence of a basic distance-dependent attraction/repulsion relationship and show that this rule is sufficient to explain collective behavior observed in nature. Positional data of individuals over time are used as input to a computational algorithm capable of building complex nonlinear functions that can represent the system behavior. Topological nearest-neighbor interactions are considered to characterize the components within this model. The efficacy of this method is demonstrated with simulated noisy data generated from the classical (two-dimensional) Vicsek model. When applied to experimental data from homing pigeon flights we show that the more complex three-dimensional models are capable of predicting and simulating trajectories, as well as exhibiting realistic collective dynamics. The simulations of the reconstructed models are used to extract properties of the collective behavior in pigeons and to examine how that behavior is affected by changing the initial conditions of the system. Our results demonstrate that this approach may be applied to construct models capable of simulating trajectories and collective dynamics using experimental field measurements of herd movement. From these models, the behavior of the individual agents (animals) may be inferred.
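    The classical two-dimensional Vicsek model used above as a validation benchmark can be simulated in a few lines. All parameter values here are illustrative choices, not taken from the paper: each agent moves at constant speed and aligns its heading with the circular mean heading of neighbors within a fixed radius, plus angular noise.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative parameters: N agents in an L x L periodic box, interaction
# radius r, speed v, noise amplitude eta, simulated for `steps` updates.
N, L, r, v, eta, steps = 100, 10.0, 1.0, 0.3, 0.2, 200

pos = rng.uniform(0, L, (N, 2))
theta = rng.uniform(-np.pi, np.pi, N)

for _ in range(steps):
    # Pairwise displacement with periodic (minimum-image) boundaries.
    d = pos[:, None, :] - pos[None, :, :]
    d -= L * np.round(d / L)
    near = ((d ** 2).sum(-1) < r ** 2).astype(float)   # includes self
    # Circular mean heading of neighbors, plus uniform angular noise.
    theta = np.arctan2(near @ np.sin(theta), near @ np.cos(theta))
    theta += eta * rng.uniform(-np.pi, np.pi, N)
    pos = (pos + v * np.column_stack([np.cos(theta), np.sin(theta)])) % L

# Polar order parameter: 1 = fully aligned flock, 0 = disordered motion.
order = np.hypot(np.sin(theta).mean(), np.cos(theta).mean())
print(round(order, 2))
```

At this low noise level the headings align and the order parameter grows well above its disordered value; trajectories from such runs, with added noise, are the kind of synthetic input used to test model reconstruction.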

    Causal inference for continuous-time processes when covariates are observed only at discrete times

    Most of the work on the structural nested model and g-estimation for causal inference in longitudinal data assumes a discrete-time underlying data-generating process. However, in some observational studies, it is more reasonable to assume that the data are generated from a continuous-time process and are only observable at discrete time points. When these circumstances arise, the sequential randomization assumption in the observed discrete-time data, which is essential in justifying discrete-time g-estimation, may not be reasonable. Under a deterministic model, we discuss other useful assumptions that guarantee the consistency of discrete-time g-estimation. In more general cases, when those assumptions are violated, we propose a controlling-the-future method that performs at least as well as g-estimation in most scenarios and which provides consistent estimation in some cases where g-estimation is severely inconsistent. We apply the methods discussed in this paper to simulated data, as well as to a data set collected following a massive flood in Bangladesh, estimating the effect of diarrhea on children's height. Results from different methods are compared in both simulation and the real application. Comment: Published at http://dx.doi.org/10.1214/10-AOS830 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
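    The core mechanic of g-estimation can be illustrated in a single-time-point toy setting. This is a drastic simplification of the paper's longitudinal, continuous-time problem, and every quantity below is invented: we search for the effect size psi that makes the "blipped-down" outcome Y - psi*A independent of treatment given the covariate, here checked via the covariance with the treatment residual.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Invented data: covariate Lc, treatment A depending on Lc only
# (sequential randomization at a single time point), outcome Y.
Lc = rng.normal(size=n)
p = 1.0 / (1.0 + np.exp(-Lc))                 # true propensity score
A = rng.binomial(1, p)
Y = 2.0 * A + 1.5 * Lc + rng.normal(size=n)   # true treatment effect psi = 2.0

# G-estimation mechanic: at the true psi, H(psi) = Y - psi*A is mean-
# independent of A given Lc, so its covariance with A - E[A|Lc] is zero.
resid = A - p                                  # true propensity used for simplicity
psis = np.linspace(0, 4, 401)
score = [np.mean((Y - psi * A) * resid) for psi in psis]
psi_hat = psis[np.argmin(np.abs(score))]
print(round(psi_hat, 2))  # close to the true effect 2.0
```

In practice the propensity is estimated rather than known, and the longitudinal version solves an analogous estimating equation at every treatment time; the continuous-time issues in the paper arise because that conditional-independence condition can fail in coarsely sampled data.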

    Modulation of Thermoelectric Power of Individual Carbon Nanotubes

    Thermoelectric power (TEP) of individual single-walled carbon nanotubes (SWNTs) has been measured at mesoscopic scales using a microfabricated heater and thermometers. Gate-electric-field-dependent TEP modulation has been observed. The measured TEP of SWNTs is well correlated with the electrical conductance across the SWNT according to the Mott formula. At low temperatures, strong modulations of TEP were observed in the single-electron conduction limit. In addition, semiconducting SWNTs exhibit large values of TEP due to the Schottky barriers at SWNT-metal junctions. Comment: to be published in Phys. Rev. Lett.
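    The Mott formula relates the thermopower to the energy derivative of the conductance at the Fermi level, S = -(pi^2/3)(k_B^2 T / e) d ln G/dE. A numerical sketch, using an invented exponential conductance profile rather than measured SWNT data:

```python
import numpy as np

kB = 8.617e-5                      # Boltzmann constant, eV/K
T = 300.0                          # temperature, K

# Invented model conductance vs. energy (relative to E_F): an exponential
# profile, so d ln G / dE is a constant 1/0.1 = 10 per eV.
E = np.linspace(-0.05, 0.05, 1001)           # energy, eV
G = 1e-6 * np.exp(E / 0.1)                   # conductance, arbitrary units

dlnG_dE = np.gradient(np.log(G), E)          # numerical derivative, 1/eV
# Mott formula; with energies in eV, dividing by e just converts eV -> V.
S = -(np.pi ** 2 / 3) * kB ** 2 * T * dlnG_dE   # thermopower, V/K

print(round(S[len(S) // 2] * 1e6, 1))        # TEP at E_F, microvolts/K (~ -73)
```

The sign and magnitude track d ln G/dE, which is why the gate-modulated TEP follows the modulation of the conductance.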

    rPICARD: A CASA-based Calibration Pipeline for VLBI Data

    Currently, HOPS and AIPS are the primary choices for the time-consuming process of (millimeter) Very Long Baseline Interferometry (VLBI) data calibration. However, for a full end-to-end pipeline, they either lack the ability to perform easily scriptable incremental calibration or do not provide full control over the workflow with the ability to manipulate and edit calibration solutions directly. The Common Astronomy Software Application (CASA) offers all these abilities, together with a secure development future and an intuitive Python interface, which is very attractive for young radio astronomers. Inspired by the recent addition of a global fringe-fitter, the capability to convert FITS-IDI files to measurement sets, and amplitude calibration routines based on ANTAB metadata, we have developed the CASA-based Radboud PIpeline for the Calibration of high Angular Resolution Data (rPICARD). The pipeline will be able to handle data from multiple arrays: EHT, GMVA, VLBA and the EVN in the first release. Polarization and phase-referencing calibration are supported, and a spectral line mode will be added in the future. The large bandwidths of future radio observatories call for scalable reduction software. Within CASA, a message passing interface (MPI) implementation is used for parallelization, reducing the total time needed for processing. The most significant gain is obtained for the time-consuming fringe-fitting task, where each scan can be processed in parallel. Comment: 6 pages, 1 figure, EVN 2018 symposium proceedings