1,623 research outputs found

    Defining and Estimating Intervention Effects for Groups That Will Develop an Auxiliary Outcome

    It has recently become popular to define treatment effects for subsets of the target population characterized by variables not observable at the time a treatment decision is made. Characterizing and estimating such treatment effects is tricky; the most popular but naive approach inappropriately adjusts for variables affected by treatment and so is biased. We consider several appropriate ways to formalize the effects: principal stratification, stratification on a single potential auxiliary variable, stratification on an observed auxiliary variable, and stratification on expected levels of auxiliary variables. We then outline identifying assumptions for each type of estimand. We evaluate the utility of these estimands and estimation procedures for decision making and for understanding causal processes, contrasting them with the concepts of direct and indirect effects. We motivate our development with examples from nephrology and cancer screening, and use simulated data and real data on cancer screening to illustrate the estimation methods.
    Comment: Published at http://dx.doi.org/10.1214/088342306000000655 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
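
    A minimal simulation sketch of why the naive approach fails (hypothetical code, not from the paper): even when treatment z is randomized, stratifying on an auxiliary outcome s that is affected by z selects different levels of an unobserved frailty u in the two arms, so the within-stratum contrast is biased.

        # Hypothetical sketch: conditioning on a post-treatment auxiliary
        # variable s biases the estimated effect, even under randomization.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 200_000
        u = rng.normal(size=n)                    # unobserved frailty
        z = rng.integers(0, 2, size=n)            # randomized treatment
        s = (u + 0.5 * z + rng.normal(size=n) > 0).astype(int)  # auxiliary outcome, affected by z
        y = 1.0 * z + u + rng.normal(size=n)      # true effect of z on y is 1.0

        overall = y[z == 1].mean() - y[z == 0].mean()
        naive = y[(z == 1) & (s == 1)].mean() - y[(z == 0) & (s == 1)].mean()
        print(f"overall: {overall:.2f}")          # close to 1.0
        print(f"naive within s=1: {naive:.2f}")   # noticeably below 1.0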

    Study of Edgeless Timepix Pixel Devices

    Silicon micropattern devices are crucial components of detector systems designed to study decays of exotic subatomic particles containing beauty and charm quarks. Among the technologies under consideration for use in future particle physics experiments are edgeless silicon pixel detectors. In these devices a state-of-the-art fabrication process is used to create sensors with a nearly full active area, in contrast to conventional sensors, which have a “guard ring,” a dead region at the sensor periphery. Prototypes used for the study described in this paper were designed and fabricated by VTT Technical Research Centre of Finland. In a test beam study, we find that these devices perform in accordance with expectations and fulfill the technical needs of their intended implementation. This active edge technology is indeed effective in maximizing the useful area of the sensor. More broadly, these devices meet the needs of a detector for particle physics, and may also find a role in medical imaging or X-ray spectroscopy.
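
    As a rough illustration of what the active edge buys (illustrative arithmetic; the sensor side matches a 256 x 256 Timepix matrix at 55 um pitch, but the guard-ring width is an assumed placeholder, not VTT's specification):

        # Active-area fraction lost to a conventional dead guard ring.
        sensor_side_mm = 14.08   # 256 pixels * 55 um Timepix matrix
        guard_ring_mm = 0.5      # assumed dead border of a conventional sensor

        active = (sensor_side_mm - 2 * guard_ring_mm) ** 2
        total = sensor_side_mm ** 2
        print(f"conventional active fraction: {active / total:.1%}")  # ~86%
        # An active-edge sensor recovers essentially the full area.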

    Lever Drive Wheelchair with Gear-Changing Concept

    The team has taken note of the limitations wheelchair users experience in everyday life due to the shortcomings of their wheelchairs. The purpose of this project is to create a wheelchair that is affordable and gives users more freedom in where they can travel. Our wheelchair would allow users to be more fully integrated into society: beyond greater freedom of movement, it could let them participate in the economy by providing services with their improved mode of transport. The wheelchair is geared toward all-pavement use and features a lever-drive propulsion system coupled with disc brakes and a mechanical-advantage gearing system. The concept is inspired by lever-driven wheelchairs currently on the market, through which users gain continued mobility and freedom in exploring the outdoors. Lastly, the wheelchair will be competitively priced compared to other wheelchairs on the market because of its more sustainable construction: incorporating reused bike frames both reduces environmental waste and makes the product more affordable by lowering the demand for newly machined parts.
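
    A back-of-envelope sketch of the mechanical-advantage idea (all numbers below are illustrative assumptions, not the team's design values): the lever multiplies hand force by its arm length, and a low gear multiplies the resulting torque again before it reaches the wheel.

        # Illustrative lever-drive torque calculation (assumed values).
        hand_force_n = 60.0     # sustained push force on the lever
        lever_arm_m = 0.35      # distance from pivot to hand grip
        gear_ratio = 2.0        # low gear: torque multiplication of the gear stage
        wheel_radius_m = 0.30   # roughly a 24-inch drive wheel

        wheel_torque = hand_force_n * lever_arm_m * gear_ratio   # N*m at the wheel
        drive_force = wheel_torque / wheel_radius_m              # N at the ground
        print(f"wheel torque: {wheel_torque:.0f} N*m, drive force: {drive_force:.0f} N")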

    Perils and Prospects of Using Aggregate Area Level Socioeconomic Information as a Proxy for Individual Level Socioeconomic Confounders in Instrumental Variables Regression

    A frequent concern in making statistical inference for the causal effects of a policy or treatment based on observational studies is that there are unmeasured confounding variables. The instrumental variable method is an approach to estimating a causal relationship in the presence of unmeasured confounding variables. A valid instrumental variable needs to be independent of the unmeasured confounding variables; a measured confounding variable must be controlled for if it is correlated with the instrument. In health services research, socioeconomic status variables are often considered confounding variables. In recent studies, distance to a specialty care center has been used as an instrument for the effect of specialty care vs. general care. Because the instrument may be correlated with socioeconomic status variables, it is important that socioeconomic status variables be controlled for in the instrumental variables regression. However, health data sets often lack individual socioeconomic information but contain area-average socioeconomic information from the US Census, e.g., average income or education level in a county. We study the bias of two-stage least squares estimates in instrumental variables regression when an area-level variable is used to control for a confounding variable that may be correlated with the instrument. We propose an aggregated instrumental variables regression based on Wald’s method of grouping, under the assumption that the grouping is independent of the errors. We present simulation results and an application to a study of perinatal care for premature infants.
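
    A minimal sketch of the two estimators in play (hypothetical code; the data layout is assumed, and the paper's condition that the grouping be independent of the errors is not checked here). The first function is ordinary two-stage least squares with a control variable; the second averages within areas in the spirit of Wald's method of grouping, so the area-level control enters at the level at which it is actually measured.

        # Hypothetical 2SLS and grouped-IV sketches with numpy.
        import numpy as np

        def two_stage_least_squares(y, d, z, x):
            """One endogenous regressor d, instrument z, control x (all 1-D)."""
            ones = np.ones_like(y)
            w1 = np.column_stack([ones, z, x])                 # first stage
            d_hat = w1 @ np.linalg.lstsq(w1, d, rcond=None)[0]
            w2 = np.column_stack([ones, d_hat, x])             # second stage
            return np.linalg.lstsq(w2, y, rcond=None)[0][1]    # coefficient on d

        def grouped_iv(y, d, z, groups):
            """Aggregate to area means, then run IV on the grouped data."""
            gs = np.unique(groups)
            ybar = np.array([y[groups == g].mean() for g in gs])
            dbar = np.array([d[groups == g].mean() for g in gs])
            zbar = np.array([z[groups == g].mean() for g in gs])
            ones = np.ones_like(ybar)
            w1 = np.column_stack([ones, zbar])
            dhat = w1 @ np.linalg.lstsq(w1, dbar, rcond=None)[0]
            w2 = np.column_stack([ones, dhat])
            return np.linalg.lstsq(w2, ybar, rcond=None)[0][1]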

    Strong Control of the Familywise Error Rate in Observational Studies that Discover Effect Modification by Exploratory Methods

    An effect modifier is a pretreatment covariate that affects the magnitude or stability of the treatment effect. When there is effect modification, an overall test that ignores an effect modifier may be more sensitive to unmeasured bias than a test that combines results from subgroups defined by the effect modifier. If there is effect modification, one would like to identify specific subgroups for which there is evidence of an effect that is insensitive to small or moderate biases. In this paper, we propose an exploratory method for discovering effect modification and combine it with a confirmatory method of simultaneous inference that strongly controls the familywise error rate in a sensitivity analysis, despite the fact that the groups being compared are defined empirically. A new form of matching, strength-k matching, permits a search through more than k covariates for effect modifiers in such a way that no pairs are lost, provided that at most k covariates are selected to group the pairs. In a strength-k match, each set of k covariates is exactly balanced, although a set of more than k covariates may exhibit imbalance. We apply the proposed method to study the effects of the earthquake that struck Chile in 2010.
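
    The defining balance property can be pictured with a small check (hypothetical helper, not the authors' matching algorithm): in a strength-k match, for every subset of k discrete covariates, the joint pattern counts must agree exactly between treated units and matched controls.

        # Hypothetical strength-k balance check over discrete covariates.
        from itertools import combinations
        import numpy as np

        def is_strength_k_balanced(x_treated, x_control, k):
            """x_treated, x_control: (n, p) integer-coded covariate arrays."""
            p = x_treated.shape[1]
            for cols in combinations(range(p), k):
                t_pat, t_cnt = np.unique(x_treated[:, list(cols)], axis=0, return_counts=True)
                c_pat, c_cnt = np.unique(x_control[:, list(cols)], axis=0, return_counts=True)
                if t_pat.shape != c_pat.shape or not (
                    np.array_equal(t_pat, c_pat) and np.array_equal(t_cnt, c_cnt)
                ):
                    return False   # this set of k covariates is imbalanced
            return True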

    FPGA-accelerated machine learning inference as a service for particle physics computing

    New heterogeneous computing paradigms on dedicated hardware with increased parallelization, such as Field Programmable Gate Arrays (FPGAs), offer exciting solutions with large potential gains. The growing applications of machine learning algorithms in particle physics for simulation, reconstruction, and analysis are naturally deployed on such platforms. We demonstrate that accelerating machine learning inference as a web service represents a heterogeneous computing solution for particle physics experiments that potentially requires minimal modification to the current computing model. As examples, we retrain the ResNet-50 convolutional neural network to demonstrate state-of-the-art performance for top quark jet tagging at the LHC and apply a ResNet-50 model with transfer learning for neutrino event classification. Using Project Brainwave by Microsoft to accelerate the ResNet-50 image classification model, we achieve average inference times of 60 (10) milliseconds with our experimental physics software framework using Brainwave as a cloud (edge or on-premises) service, representing an improvement by a factor of approximately 30 (175) in model inference latency over traditional CPU inference on current experimental hardware. A single FPGA service accessed by many CPUs achieves a throughput of 600–700 inferences per second using an image batch of one, comparable to large-batch-size GPU throughput and significantly better than small-batch-size GPU throughput. Deployed as an edge or cloud service for the particle physics computing model, coprocessor accelerators can have a higher duty cycle and are potentially much more cost-effective.
    Comment: 16 pages, 14 figures, 2 tables.
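
    The service model can be pictured with a short client sketch (the endpoint URL and JSON layout below are invented for illustration; the actual deployment uses Microsoft Brainwave behind the experiments' own software frameworks): a CPU worker ships one preprocessed image per request to the FPGA-backed service and measures the round-trip latency.

        # Hypothetical client for an FPGA-backed inference service.
        import time
        import numpy as np
        import requests

        SERVICE_URL = "http://fpga-service.example.org/v1/models/resnet50:predict"  # placeholder

        def classify(image):
            payload = {"instances": [image.astype("float32").tolist()]}  # batch of one
            t0 = time.perf_counter()
            r = requests.post(SERVICE_URL, json=payload, timeout=5.0)
            r.raise_for_status()
            print(f"round trip: {1e3 * (time.perf_counter() - t0):.1f} ms")
            return r.json()["predictions"][0]     # class scores for the image

        scores = classify(np.random.rand(224, 224, 3))  # stand-in for a preprocessed jet image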