    From treebank resources to LFG F-structures

    We present two methods for automatically annotating treebank resources with functional structures. Both methods define systematic patterns of correspondence between partial PS configurations and functional structures. These patterns are applied either to PS rules extracted from treebanks or directly to constraint-set encodings of treebank PS trees.
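    The kind of pattern-based annotation the abstract describes can be sketched as a lookup from (mother, daughter) category pairs to LFG functional equations. This is a hypothetical illustration (the table and function names are assumptions, not the authors' implementation):

    ```python
    # Pattern table: (mother, daughter) category pair -> functional annotation.
    # "^" is the mother's f-structure, "!" the daughter's, as in standard LFG notation.
    PATTERNS = {
        ("S", "NP"): "(^ SUBJ) = !",   # NP daughter of S supplies the SUBJ
        ("S", "VP"): "^ = !",          # VP is the functional head of S
        ("VP", "V"): "^ = !",          # V heads VP
        ("VP", "NP"): "(^ OBJ) = !",   # NP daughter of VP supplies the OBJ
    }

    def annotate(rule):
        """Attach a functional equation to each daughter of a PS rule."""
        mother, daughters = rule
        return [(d, PATTERNS.get((mother, d), "unannotated")) for d in daughters]

    annotated = annotate(("S", ["NP", "VP"]))
    # [('NP', '(^ SUBJ) = !'), ('VP', '^ = !')]
    ```

    Applying such a table to every rule extracted from a treebank yields an annotated grammar from which f-structures can be built.
    
    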

    Maladaptation and the paradox of robustness in evolution

    Background. Organisms use a variety of mechanisms to protect themselves against perturbations. For example, repair mechanisms fix damage, feedback loops keep homeostatic systems at their setpoints, and biochemical filters distinguish signal from noise. Such buffering mechanisms are often discussed in terms of robustness, which may be measured by reduced sensitivity of performance to perturbations. Methodology/Principal Findings. I use a mathematical model to analyze the evolutionary dynamics of robustness in order to understand aspects of organismal design by natural selection. I focus on two characters: one character performs an adaptive task; the other character buffers the performance of the first character against perturbations. Increased perturbations favor enhanced buffering and robustness, which in turn decrease sensitivity and reduce the intensity of natural selection on the adaptive character. Reduced selective pressure on the adaptive character often leads to a less costly, lower-performance trait. Conclusions/Significance. The paradox of robustness arises from evolutionary dynamics: enhanced robustness causes an evolutionary reduction in the adaptive performance of the target character, leading to a degree of maladaptation compared to what could be achieved by natural selection in the absence of robustness mechanisms. Over evolutionary time, buffering traits may become layered on top of each other, while the underlying adaptive traits become replaced by cheaper, lower-performance components. The paradox of robustness has widespread implications for understanding organismal design.
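    The core feedback the abstract describes can be illustrated with a minimal sketch (an assumed toy model, not Frank's actual one): fitness gains from the adaptive trait a are scaled by a sensitivity s = 1/(1 + b), where b is the buffering trait, while the trait's cost is unbuffered. Stronger buffering then weakens selection on a and lowers its evolved value.

    ```python
    def equilibrium_adaptive_trait(b, cost=0.5):
        """Maximize w(a) = a/(1+b) - cost*a**2/2 over a (closed form).

        dw/da = s - cost*a = 0  =>  a* = s/cost, with s = 1/(1+b).
        """
        sensitivity = 1.0 / (1.0 + b)
        return sensitivity / cost

    a_no_buffer = equilibrium_adaptive_trait(b=0.0)   # 2.0
    a_buffered = equilibrium_adaptive_trait(b=3.0)    # 0.5
    # Enhanced robustness (larger b) evolves a cheaper, lower-performance
    # adaptive trait: the paradox of robustness in miniature.
    ```

    The qualitative point survives any choice of cost function that is increasing and convex in a.
    
    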

    Simulating strongly correlated quantum systems with tree tensor networks

    We present a tree-tensor-network-based method to study strongly correlated systems with nonlocal interactions in higher dimensions. Although the momentum-space and quantum-chemistry versions of the density-matrix renormalization group (DMRG) method have long been applied to such systems, the spatial topology of DMRG-based methods allows efficient optimizations to be carried out with respect to one spatial dimension only. Extending the matrix-product-state picture, we formulate a more general approach by allowing the local sites to be coupled to more than two neighboring auxiliary subspaces. Following [Y. Shi, L. Duan, and G. Vidal, Phys. Rev. A 74, 022320 (2006)], we treat a treelike network ansatz with arbitrary coordination number z, where the z=2 case corresponds to the one-dimensional (1D) scheme. For this ansatz, the long-range correlation deviates from the mean-field value polynomially with distance, in contrast to the matrix-product ansatz, which deviates exponentially. The computational cost of the tree-tensor-network method is significantly smaller than that of previous DMRG-based attempts, which renormalize several blocks into a single block. In addition, we investigate the effect of unitary transformations on the local basis states and present a method for optimizing such transformations. For the 1D interacting spinless fermion model, the optimized transformation interpolates smoothly between real space and momentum space. Calculations carried out on small quantum chemical systems support our approach.
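    The ansatz class (not the paper's optimized algorithm) can be sketched in a few lines of numpy: a tiny tree with coordination number z = 3, where three physical sites each carry a leaf tensor and are joined through one internal tensor, is contracted into the full state vector. Tensor shapes and random entries are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    phys, bond = 2, 3   # physical dimension (e.g. spin-1/2) and bond dimension

    # Leaf tensors A[physical, bond] and a z=3 internal tensor C[bond, bond, bond].
    leaves = [rng.standard_normal((phys, bond)) for _ in range(3)]
    center = rng.standard_normal((bond, bond, bond))

    # Contract the tree network into the full 2x2x2 state tensor and normalize.
    psi = np.einsum("ia,jb,kc,abc->ijk", leaves[0], leaves[1], leaves[2], center)
    psi = psi / np.linalg.norm(psi)
    ```

    A matrix-product state is the special case where the internal tensors form a chain (z = 2); increasing z lets correlations spread through fewer intermediate tensors, which is what changes the decay of long-range correlations.
    
    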

    Quantification of mesoscale variability and geometrical reconstruction of a textile

    Automated image analysis of textile surfaces allowed determination and quantification of intrinsic yarn path variabilities in a 2/2 twill weave during the lay-up process. The yarn paths were described in terms of waves, and it was found that the frequencies are similar in warp and weft directions and hardly affected by introduced yarn path deformations. The most significant source of fabric variability was introduced during handling before cutting. These resulting systematic deformations will need to be considered when designing or analysing a composite component. An automated method for three-dimensional reconstruction of the analysed lay-up was implemented in TexGen, which will allow virtual testing of components in the future.
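    Describing a yarn path "in terms of waves" amounts to spectral analysis of the yarn centreline. A hedged sketch (all dimensions and amplitudes are made-up illustration, not the paper's data): sample the lateral displacement along the yarn and recover the dominant spatial frequency with an FFT.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    length_mm, n = 100.0, 1000
    x = np.linspace(0.0, length_mm, n, endpoint=False)

    # Synthetic yarn centreline: a 0.2 cycles/mm wave plus measurement noise.
    true_freq = 0.2
    path = 0.3 * np.sin(2 * np.pi * true_freq * x) + 0.02 * rng.standard_normal(n)

    # Spatial spectrum of the (mean-removed) path; peak gives the wave frequency.
    spectrum = np.abs(np.fft.rfft(path - path.mean()))
    freqs = np.fft.rfftfreq(n, d=length_mm / n)
    dominant = freqs[spectrum.argmax()]   # ≈ 0.2 cycles/mm
    ```

    Comparing such spectra between warp and weft directions, and before and after deliberate deformation, is one way to reach the kind of conclusion the abstract reports.
    
    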

    Air Pollution Exposure Assessment for Epidemiologic Studies of Pregnant Women and Children: Lessons Learned from the Centers for Children’s Environmental Health and Disease Prevention Research

    The National Children’s Study is considering a wide spectrum of airborne pollutants that are hypothesized to potentially influence pregnancy outcomes, neurodevelopment, asthma, atopy, immune development, obesity, and pubertal development. In this article we summarize six applicable exposure assessment lessons learned from the Centers for Children’s Environmental Health and Disease Prevention Research that may enhance the National Children’s Study: a) Selecting individual study subjects with a wide range of pollution exposure profiles maximizes spatial-scale exposure contrasts for key pollutants of study interest. b) In studies with large sample sizes, long duration, and diverse outcomes and exposures, exposure assessment efforts should rely on modeling to provide estimates for the entire cohort, supported by subject-derived questionnaire data. c) Assessment of some exposures of interest requires individual measurements of exposures using snapshots of personal and microenvironmental exposures over short periods and/or in selected microenvironments. d) Understanding issues of spatial–temporal correlations of air pollutants, the surrogacy of specific pollutants for components of the complex mixture, and the exposure misclassification inherent in exposure estimates is critical in analysis and interpretation. e) “Usual” temporal, spatial, and physical patterns of activity can be used as modifiers of the exposure/outcome relationships. f) Biomarkers of exposure are useful for evaluating specific exposures that occur via multiple routes. If these lessons are applied, the National Children’s Study offers a unique opportunity to assess the adverse effects of air pollution on interrelated health outcomes during the critical early life period.

    Computational Complexity of interacting electrons and fundamental limitations of Density Functional Theory

    One of the central problems in quantum mechanics is to determine the ground-state properties of a system of electrons interacting via the Coulomb potential. Since its introduction by Hohenberg, Kohn, and Sham, Density Functional Theory (DFT) has become the most widely used and successful method for simulating systems of interacting electrons, making their original work one of the most cited in physics. In this letter, we show that the field of computational complexity imposes fundamental limitations on DFT: an efficient description of the associated universal functional would allow one to solve any problem in the class QMA (the quantum analogue of NP), and thus in particular any problem in NP, in polynomial time. This follows from the fact that finding the ground-state energy of the Hubbard model in an external magnetic field is a hard problem even for a quantum computer, while given the universal functional it can be computed efficiently using DFT. This provides a clear illustration of how the field of quantum computing is useful even if quantum computers are never built.

    A framework for Distributional Formal Semantics

    Formal semantics and distributional semantics offer complementary strengths in capturing the meaning of natural language. As such, a considerable amount of research has sought to unify them, either by augmenting formal semantic systems with a distributional component, or by defining a formal system on top of distributed representations. Arriving at such a unified framework has, however, proven extremely challenging. One reason for this is that formal and distributional semantics operate on a fundamentally different 'representational currency': formal semantics defines meaning in terms of models of the world, whereas distributional semantics defines meaning in terms of linguistic co-occurrence. Here, we pursue an alternative approach by deriving a vector space model that defines meaning in a distributed manner relative to formal models of the world. We will show that the resulting Distributional Formal Semantics offers probabilistic distributed representations that are also inherently compositional, and that naturally capture quantification and entailment. We moreover show that, when used as part of a neural network model, these representations allow for capturing incremental meaning construction and probabilistic inferencing. This framework thus lays the groundwork for an integrated distributional and formal approach to meaning.
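    The idea of vectors defined relative to formal models can be illustrated with a toy encoding (an assumption for exposition, not the authors' code): a proposition's meaning vector records its truth value in each of a small set of models ("worlds"). Composition, entailment, and probability then fall out of elementwise operations.

    ```python
    import numpy as np

    # Meaning vectors over 8 hypothetical worlds: 1 = true in that world.
    rain = np.array([1, 1, 1, 1, 0, 0, 0, 0])  # "it rains", true in worlds 0-3
    wet = np.array([1, 1, 1, 1, 1, 1, 0, 0])   # "it is wet", true in worlds 0-5

    def entails(p, q):
        """p entails q iff q is true in every world where p is true."""
        return bool(np.all(q[p == 1] == 1))

    def prob(p):
        """Probability of p under a uniform distribution over worlds."""
        return float(p.mean())

    both = rain * wet                              # conjunction = elementwise product
    p_wet_given_rain = prob(both) / prob(rain)     # = 1.0, since rain entails wet
    ```

    Quantification works the same way: "all/some" reduce to checks over the world dimension, which is what makes the representations inherently compositional.
    
    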

    Inferring R0 in emerging epidemics: the effect of common population structure is small

    When controlling an emerging outbreak of an infectious disease, it is essential to know the key epidemiological parameters, such as the basic reproduction number R0 and the control effort required to prevent a large outbreak. These parameters are estimated from the observed incidence of new cases and information about the infectious contact structures of the population in which the disease spreads. However, the relevant infectious contact structures for new, emerging infections are often unknown or hard to obtain. Here, we show that, for many common true underlying heterogeneous contact structures, the simplification to neglect such structures and instead assume that all contacts are made homogeneously in the whole population results in conservative estimates for R0 and the required control effort. This means that robust control policies can be planned during the early stages of an outbreak, using such conservative estimates of the required control effort.
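    Why a conservative (over-)estimate of R0 is safe to plan with can be shown with the standard homogeneous-mixing threshold: the critical control effort v_c = 1 − 1/R0 is increasing in R0, so overestimating R0 overestimates the effort. The numbers below are illustrative assumptions, not values from the paper.

    ```python
    def critical_control_effort(r0):
        """Fraction of transmission to prevent so a large outbreak dies out."""
        return 1.0 - 1.0 / r0

    true_r0 = 1.8        # hypothetical value under the real contact structure
    estimated_r0 = 2.0   # homogeneous-mixing estimate (conservative, per the paper)

    effort = critical_control_effort(estimated_r0)   # 0.5
    # Preventing 50% of transmission also suffices for the true structure,
    # since critical_control_effort(1.8) ≈ 0.444 < 0.5.
    ```

    Planning to the homogeneous estimate therefore errs on the side of controlling slightly more transmission than strictly necessary.
    
    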

    Sampling constrained probability distributions using Spherical Augmentation

    Statistical models with constrained probability distributions are abundant in machine learning. Examples include regression models with norm constraints (e.g., the Lasso), probit models, many copula models, and latent Dirichlet allocation (LDA). Bayesian inference involving probability distributions confined to constrained domains can be quite challenging for commonly used sampling algorithms. In this paper, we propose a novel augmentation technique that handles a wide range of constraints by mapping the constrained domain to a sphere in an augmented space. By moving freely on the surface of this sphere, sampling algorithms handle constraints implicitly and generate proposals that remain within boundaries when mapped back to the original space. Our proposed method, called Spherical Augmentation, provides a mathematically natural and computationally efficient framework for sampling from constrained probability distributions. We show the advantages of our method over state-of-the-art sampling algorithms, such as exact Hamiltonian Monte Carlo, using several examples including truncated Gaussian distributions, Bayesian Lasso, Bayesian bridge regression, reconstruction of a quantized stationary Gaussian process, and LDA for topic modeling.
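    The geometric idea can be sketched for the simplest case, a unit-ball constraint (a toy random walk for illustration, not the paper's HMC-based samplers): augment a point theta in the ball with an extra coordinate sqrt(1 − ||theta||²) so it lies on a sphere one dimension up; moves on the sphere never leave it, so the mapped-back points satisfy the constraint automatically.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    d = 2  # dimension of the constrained space

    def to_sphere(theta):
        """Lift theta (inside the unit ball) onto the unit sphere in d+1 dims."""
        extra = np.sqrt(max(0.0, 1.0 - float(theta @ theta)))
        return np.append(theta, extra)

    def to_ball(point):
        """Project a sphere point back to the constrained domain."""
        return point[:d]

    # Toy random walk on the sphere: perturb, then renormalize onto the sphere.
    point = to_sphere(np.zeros(d))
    samples = []
    for _ in range(500):
        point = point + 0.3 * rng.standard_normal(d + 1)
        point = point / np.linalg.norm(point)   # stay exactly on the sphere
        samples.append(to_ball(point))

    norms = np.linalg.norm(np.array(samples), axis=1)
    all_inside = bool(np.all(norms <= 1.0 + 1e-12))  # constraint holds by construction
    ```

    A real sampler must also account for the Jacobian of the sphere-to-ball map when evaluating the target density, which is where the paper's mathematical development comes in.
    
    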