
    Monodromy Substitutions and Rational Blowdowns

    We introduce several new families of relations in the mapping class groups of planar surfaces, each equating two products of right-handed Dehn twists. The interest of these relations lies in their geometric interpretation in terms of rational blowdowns of 4-manifolds, specifically via monodromy substitution in Lefschetz fibrations. The simplest example is the lantern relation, already shown by the first author and Gurtas to correspond to rational blowdown along a $(-4)$-sphere; here we give relations that extend that result to realize the "generalized" rational blowdowns of Fintushel-Stern and Park by monodromy substitution, as well as several of the families of rational blowdowns discovered by Stipsicz-Szabó-Wahl.
    Comment: 28 pages, many figures. v2: minor edits; this version accepted for publication in the Journal of Topology
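
    For reference, the lantern relation itself can be stated as follows (a standard formulation, not quoted from the paper): in the mapping class group of a sphere with four boundary components $\delta_1,\dots,\delta_4$, there are interior simple closed curves $x,y,z$, each enclosing a pair of boundary components, such that
    $$t_{\delta_1}\,t_{\delta_2}\,t_{\delta_3}\,t_{\delta_4} = t_x\,t_y\,t_z,$$
    where $t_c$ denotes the right-handed Dehn twist about $c$. Replacing the boundary-twist side by $t_x t_y t_z$ in a Lefschetz fibration's monodromy factorization is the monodromy substitution that, per the result cited in the abstract, corresponds to rational blowdown along a $(-4)$-sphere.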

    The Ghost of Extinction: Preservation Values and Minimum Viable Population in Wildlife Models

    The inclusion of a minimum viable population in bioeconomic modeling creates at least two complications that are not resolved by using a modified logistic growth function. The first complication can be dealt with by choosing a different depensational growth function. The second complication relates to the inclusion of the in situ benefits of wildlife into the analysis. Knowledge about the magnitude of the in situ benefits provides no guide for policy about conservation management. Simply knowing that people are willing to pay a large amount each year to protect a species says nothing about whether one should manage habitat to protect or enhance the species numbers, unless the species is in imminent danger of extinction. If willingness to pay is to be a guide, it needs to be better tied to population numbers, especially the minimum viable population.
    Keywords: marginal willingness to pay; endangered species and extinction; minimum viable population; Resource/Energy Economics and Policy; JEL codes: Q20, Q24, C61
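
    One standard depensational growth function of the kind alluded to here (an illustrative textbook form, not necessarily the one used in the paper) augments the logistic with a critical threshold $N_{\min}$, the minimum viable population:
    $$\frac{dN}{dt} = rN\left(\frac{N}{N_{\min}} - 1\right)\left(1 - \frac{N}{K}\right),$$
    so that growth is negative below $N_{\min}$, positive between $N_{\min}$ and the carrying capacity $K$, and zero at both thresholds, capturing the extinction risk faced by small populations.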

    Functional Decomposition using Principal Subfields

    Let $f\in K(t)$ be a univariate rational function. It is well known that any non-trivial decomposition $g \circ h$, with $g,h\in K(t)$, corresponds to a non-trivial subfield $K(f(t))\subsetneq L \subsetneq K(t)$, and vice versa. In this paper we use the idea of principal subfields and fast subfield-intersection techniques to compute the subfield lattice of $K(t)/K(f(t))$. This yields a Las Vegas type algorithm with improved complexity and better run times for finding all non-equivalent complete decompositions of $f$.
    Comment: 8 pages, accepted for ISSAC'17
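
    A concrete instance of the decomposition-subfield correspondence (our illustration, not drawn from the paper): take $h(t) = t^2$ and $g(t) = t^2$, so $f = (g \circ h)(t) = t^4$; the intermediate field $L = K(t^2)$ then sits properly in the chain $K(t^4) \subsetneq K(t^2) \subsetneq K(t)$. Conversely, by Lüroth's theorem every intermediate field of $K(t)/K$ has the form $K(h(t))$ for some rational $h$, and writing $f$ as a rational function of $h$ recovers $g$.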

    A generalization of moderated statistics to data adaptive semiparametric estimation in high-dimensional biology

    The widespread availability of high-dimensional biological data has made the simultaneous screening of numerous biological characteristics a central statistical problem in computational biology. While the dimensionality of such datasets continues to increase, the problem of teasing out the effects of biomarkers in studies measuring baseline confounders, while avoiding model misspecification, remains only partially addressed. Efficient estimators constructed from data adaptive estimates of the data-generating distribution provide an avenue for avoiding model misspecification; however, in the context of high-dimensional problems requiring simultaneous estimation of numerous parameters, standard variance estimators have proven unstable, resulting in unreliable Type-I error control under standard multiple testing corrections. We formulate a general approach for applying empirical Bayes shrinkage to asymptotically linear estimators of parameters defined in the nonparametric model. The proposal applies existing shrinkage estimators to the estimated variance of the influence function, allowing for increased inferential stability in high-dimensional settings. We introduce a methodology for nonparametric variable importance analysis for use with high-dimensional biological datasets with modest sample sizes, and demonstrate that the proposed technique is robust in small samples even when relying on data adaptive estimators that eschew parametric forms. Use of the proposed variance moderation strategy in constructing stabilized variable importance measures of biomarkers is demonstrated by application to an observational study of occupational exposure. The result is a data adaptive approach for robustly uncovering stable associations in high-dimensional data with limited sample sizes.
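
    As a rough illustration of the variance-moderation idea, the limma-style moderated variance is a degrees-of-freedom-weighted average of each raw variance and a common prior variance. The sketch below uses hand-picked prior parameters for simplicity; in practice the prior would be estimated from all markers' variances (e.g., via limma's fitFDist), and the paper applies the shrinkage to influence-function variance estimates rather than to generic variances.

    ```python
    # Minimal sketch of empirical Bayes (limma-style) variance moderation.
    # The prior parameters d0 and s2_0 are fixed by hand here for illustration.
    import numpy as np

    def moderated_variance(s2, df, d0, s2_0):
        """Posterior variance: degrees-of-freedom-weighted average of each raw
        variance s2 (df degrees of freedom) and the prior variance s2_0
        (prior degrees of freedom d0)."""
        return (d0 * s2_0 + df * s2) / (d0 + df)

    # Hypothetical raw per-biomarker variance estimates of an asymptotically
    # linear estimator; shrinking them toward a common value stabilizes the
    # resulting test statistics.
    rng = np.random.default_rng(1)
    raw = rng.chisquare(df=5, size=10_000) / 5
    shrunk = moderated_variance(raw, df=5, d0=4.0, s2_0=raw.mean())
    print(raw.std(), shrunk.std())  # moderated variances are less dispersed
    ```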

    Robust and Flexible Estimation of Stochastic Mediation Effects: A Proposed Method and Example in a Randomized Trial Setting

    Causal mediation analysis can improve understanding of the mechanisms underlying epidemiologic associations. However, the utility of natural direct and indirect effect estimation has been limited by the assumption of no confounder of the mediator-outcome relationship that is affected by prior exposure, an assumption frequently violated in practice. We build on recent work that identified alternative estimands that do not require this assumption and propose a flexible and double robust semiparametric targeted minimum loss-based estimator for data-dependent stochastic direct and indirect effects. The proposed method treats the intermediate confounder affected by prior exposure as a time-varying confounder and intervenes stochastically on the mediator using a distribution that conditions on baseline covariates and marginalizes over the intermediate confounder. In addition, we assume the stochastic intervention is given, conditional on observed data, which results in a simpler estimator and weaker identification assumptions. We demonstrate the estimator's finite sample and robustness properties in a simple simulation study. We apply the method to an example from the Moving to Opportunity experiment. In this application, randomization to receive a housing voucher is the treatment/instrument that influenced moving to a low-poverty neighborhood, which is the intermediate confounder. We estimate the data-dependent stochastic direct effect of randomization to the voucher group on adolescent marijuana use not mediated by change in school district, and the stochastic indirect effect mediated by change in school district. We find no evidence of mediation. Our estimator is easy to implement in standard statistical software, and we provide annotated R code to further lower implementation barriers.
    Comment: 24 pages, 2 tables, 2 figures
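
    In the notation commonly used for these estimands (our summary of the standard setup, not quoted from the paper), let $Y_{a, g_{a^*}}$ denote the counterfactual outcome under exposure level $a$ with the mediator drawn from the stochastic intervention distribution $g_{a^*}$. The total effect then decomposes as
    $$E[Y_{1,g_1}] - E[Y_{0,g_0}] = \underbrace{\left(E[Y_{1,g_1}] - E[Y_{1,g_0}]\right)}_{\text{stochastic indirect effect}} + \underbrace{\left(E[Y_{1,g_0}] - E[Y_{0,g_0}]\right)}_{\text{stochastic direct effect}},$$
    so the direct effect varies the exposure while holding the mediator distribution fixed at its control-arm value, and the indirect effect varies only the mediator distribution.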

    Effect of breastfeeding on gastrointestinal infection in infants: A targeted maximum likelihood approach for clustered longitudinal data

    The PROmotion of Breastfeeding Intervention Trial (PROBIT) cluster-randomized a program encouraging breastfeeding to new mothers in hospital centers. The original studies indicated that this intervention successfully increased duration of breastfeeding and lowered rates of gastrointestinal tract infections in newborns. Additional scientific and popular interest lies in determining the causal effect of longer breastfeeding on gastrointestinal infection. In this study, we estimate the expected infection count under various lengths of breastfeeding in order to estimate the effect of breastfeeding duration on infection. Due to the presence of baseline and time-dependent confounding, specialized "causal" estimation methods are required. We demonstrate the double-robust method of Targeted Maximum Likelihood Estimation (TMLE) in the context of this application and review some related methods and the adjustments required to account for clustering. We compare TMLE (implemented both parametrically and using a data-adaptive algorithm) to other causal methods for this example. In addition, we conduct a simulation study to determine (1) the effectiveness of controlling for clustering indicators when cluster-specific confounders are unmeasured and (2) the importance of using data-adaptive TMLE.
    Comment: Published at http://dx.doi.org/10.1214/14-AOAS727 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
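
    To make the targeting idea concrete, here is a minimal single-time-point TMLE sketch for an average treatment effect on simulated data (our illustration with hypothetical variable names; it ignores the clustering, repeated measures, and time-dependent confounding handled in the paper).

    ```python
    # Minimal single-time-point TMLE sketch for an average treatment effect.
    import numpy as np
    import statsmodels.api as sm
    from sklearn.linear_model import LogisticRegression

    expit = lambda x: 1.0 / (1.0 + np.exp(-x))
    rng = np.random.default_rng(0)
    n = 2000
    W = rng.normal(size=(n, 2))                          # baseline covariates
    A = rng.binomial(1, expit(0.4 * W[:, 0]))            # binary exposure
    Y = rng.binomial(1, expit(-1 + A + 0.5 * W[:, 1]))   # binary outcome

    # Step 1: initial outcome regression Q(A, W); any ML fit could be used.
    Qfit = LogisticRegression().fit(np.column_stack([A, W]), Y)
    Q1 = Qfit.predict_proba(np.column_stack([np.ones(n), W]))[:, 1]
    Q0 = Qfit.predict_proba(np.column_stack([np.zeros(n), W]))[:, 1]
    QA = np.where(A == 1, Q1, Q0)

    # Step 2: exposure mechanism g(W) = P(A = 1 | W).
    g1 = LogisticRegression().fit(W, A).predict_proba(W)[:, 1]

    # Step 3: targeting step, fluctuating Q along the "clever covariate" H.
    H = A / g1 - (1 - A) / (1 - g1)
    eps = sm.GLM(Y, H, offset=np.log(QA / (1 - QA)),
                 family=sm.families.Binomial()).fit().params[0]

    # Step 4: updated counterfactual means and the targeted ATE estimate.
    Q1s = expit(np.log(Q1 / (1 - Q1)) + eps / g1)
    Q0s = expit(np.log(Q0 / (1 - Q0)) - eps / (1 - g1))
    print("TMLE ATE estimate:", (Q1s - Q0s).mean())
    ```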

    The Temporal Doppler Effect: When The Future Feels Closer Than The Past

    People routinely remember events that have passed and imagine those that are yet to come. The past and the future are sometimes psychologically close ("just around the corner") and other times psychologically distant ("ages away"). Four studies demonstrate a systematic asymmetry whereby future events are psychologically closer than past events of equivalent objective distance. When considering specific times (e.g., 1 year) or events (e.g., Valentine's Day), people consistently reported that the future was closer than the past. We suggest that this asymmetry arises because the subjective experience of movement through time (whereby future events approach and past events recede) is analogous to the physical experience of movement through space. Consistent with this hypothesis, experimentally reversing the metaphorical arrow of time (by having participants move backward through virtual space) completely eliminated the past-future asymmetry. We discuss how reducing psychological distance to the future may function to prepare people for upcoming action.