
    How much of commonsense and legal reasoning is formalizable? A review of conceptual obstacles

    Fifty years of effort in artificial intelligence (AI) and the formalization of legal reasoning have produced both successes and failures. Considerable success in organizing and displaying evidence and its interrelationships has been accompanied by failure to achieve the original ambition of AI as applied to law: fully automated legal decision-making. The obstacles to formalizing legal reasoning have proved to be the same ones that make the formalization of commonsense reasoning so difficult, and are most evident where legal reasoning has to meld with the vast web of ordinary human knowledge of the world. Underlying many of the problems is the mismatch between the discreteness of symbol manipulation and the continuous nature of imprecise natural language, of degrees of similarity and analogy, and of probabilities.

    Adaptive evolution of transcription factor binding sites

    The regulation of a gene depends on the binding of transcription factors to specific sites located in the regulatory region of the gene. The generation of these binding sites and of cooperativity between them are essential building blocks in the evolution of complex regulatory networks. We study a theoretical model for the sequence evolution of binding sites by point mutations. The approach is based on biophysical models for the binding of transcription factors to DNA. Hence we derive empirically grounded fitness landscapes, which enter a population genetics model including mutations, genetic drift, and selection. We show that the selection for factor binding generically leads to specific correlations between nucleotide frequencies at different positions of a binding site. We demonstrate the possibility of rapid adaptive evolution generating a new binding site for a given transcription factor by point mutations. The evolutionary time required is estimated in terms of the neutral (background) mutation rate, the selection coefficient, and the effective population size. The efficiency of binding site formation is seen to depend on two joint conditions: the binding site motif must be short enough and the promoter region must be long enough. These constraints on promoter architecture are indeed seen in eukaryotic systems. Furthermore, we analyse the adaptive evolution of genetic switches and of signal integration through binding cooperativity between different sites. Experimental tests of this picture involving the statistics of polymorphisms and phylogenies of sites are discussed. Comment: published version.
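
    The general mechanism sketched in the abstract can be illustrated with a toy origin-fixation simulation: a site evolves by point mutations, occupancy is approximated with a two-state mismatch-energy model, and substitutions fix with Kimura's probability. The motif, energy scale, selection coefficient, and population size below are arbitrary illustrative choices, not parameters from the paper.

        import math
        import random

        # Toy model (illustrative parameters throughout): a binding site evolves by
        # point mutations; occupancy falls off with the number of mismatches to the
        # target motif, and fitness rises with occupancy.

        ALPHABET = "ACGT"
        MOTIF = "TTGACA"        # hypothetical target motif
        EPSILON = 2.0           # energy cost per mismatch, in units of kT
        S0 = 0.01               # selection coefficient of a perfectly bound site
        N_E = 1000              # effective population size
        MU = 1e-4               # per-position mutation rate per generation

        def binding_probability(seq):
            """Two-state approximation: occupancy as a function of mismatch energy."""
            mismatches = sum(a != b for a, b in zip(seq, MOTIF))
            return 1.0 / (1.0 + math.exp(EPSILON * mismatches - 4.0))

        def fitness(seq):
            return 1.0 + S0 * binding_probability(seq)

        def fixation_probability(s, n_e):
            """Kimura's fixation probability for a new mutant with selection coefficient s."""
            if abs(s) < 1e-12:
                return 1.0 / (2 * n_e)
            return (1.0 - math.exp(-2.0 * s)) / (1.0 - math.exp(-4.0 * n_e * s))

        def evolve(seq, generations):
            """Origin-fixation dynamics: propose point mutations, fix them stochastically."""
            for _ in range(generations):
                for pos in range(len(seq)):
                    if random.random() < 2 * N_E * MU:      # a new mutant arises here
                        mutant = seq[:pos] + random.choice(ALPHABET) + seq[pos + 1:]
                        s = fitness(mutant) / fitness(seq) - 1.0
                        if random.random() < fixation_probability(s, N_E):
                            seq = mutant
            return seq

        random.seed(0)
        start = "".join(random.choice(ALPHABET) for _ in MOTIF)
        end = evolve(start, generations=50000)
        print("start:", start, "p_bind =", round(binding_probability(start), 3))
        print("end:  ", end, "p_bind =", round(binding_probability(end), 3))

    The number of generations needed to reach a well-bound site in such a toy run scales with the background mutation rate, the selection coefficient, and the population size, which is the dependence the abstract estimates analytically.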

    Absorptive capacity and the growth and investment effects of regional transfers: a regression discontinuity design with heterogeneous treatment effects

    Researchers often estimate average treatment effects of programs without investigating heterogeneity across units. Yet individuals, firms, regions, or countries vary in their ability to utilize, for example, transfers. We analyze Objective 1 Structural Funds transfers of the European Commission to regions of EU member states below a certain income level by way of a regression discontinuity design with systematically heterogeneous treatment effects. Only about 30% and 21% of the regions - those with sufficient human capital and good-enough institutions - are able to turn transfers into faster per-capita income growth and faster per-capita investment, respectively. In general, the variance of the treatment effect is much larger than its mean.
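
    A minimal sketch of the estimation idea on simulated data: a sharp discontinuity at an eligibility cutoff, with the treatment effect allowed to vary with a human-capital proxy. The cutoff, bandwidth, and variable names are illustrative stand-ins, not the paper's actual Objective 1 specification or data.

        import numpy as np
        import statsmodels.api as sm

        # Simulated regression discontinuity with a heterogeneous treatment effect:
        # regions below an income cutoff receive transfers, and the effect of the
        # transfers on growth varies with human capital (all numbers made up).

        rng = np.random.default_rng(42)
        n = 2000

        income = rng.uniform(50, 110, n)          # forcing variable: GDP p.c. as % of EU mean
        human_capital = rng.normal(0.0, 1.0, n)   # standardized absorptive-capacity proxy
        treated = (income < 75).astype(float)     # eligibility rule: below 75% of the mean

        # True model: the transfer effect is strong only where human capital is high.
        growth = (
            1.0
            + 0.02 * (income - 75)                # smooth dependence on the forcing variable
            + treated * (0.1 + 0.5 * human_capital)
            + rng.normal(0, 0.5, n)
        )

        # Local linear RDD within a bandwidth around the cutoff, with an interaction
        # term capturing heterogeneity of the treatment effect in human capital.
        bandwidth = 15.0
        window = np.abs(income - 75) < bandwidth
        x = income[window] - 75

        X = np.column_stack([
            np.ones(window.sum()),
            x,                                        # slope of the forcing variable
            treated[window] * x,                      # slope change across the cutoff
            treated[window],                          # average treatment effect at the cutoff
            treated[window] * human_capital[window],  # heterogeneous component
            human_capital[window],
        ])
        fit = sm.OLS(growth[window], X).fit()

        print("ATE at cutoff:        ", round(fit.params[3], 3))
        print("Heterogeneity (x HC): ", round(fit.params[4], 3))

    The interaction coefficient is what distinguishes this design from a plain RDD: a small average effect can coexist with large, systematically varying effects across regions.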

    Concepts and Their Dynamics: A Quantum-Theoretic Modeling of Human Thought

    We analyze different aspects of our quantum modeling approach to human concepts, focusing on the quantum effects of contextuality, interference, entanglement, and emergence, and illustrating how each of them makes its appearance in specific situations of the dynamics of human concepts and their combinations. We point out the relation of our approach, which is based on an ontology of a concept as an entity in a state that changes under the influence of a context, to the main traditional concept theories, i.e. prototype theory, exemplar theory, and theory theory. We ponder the question of why quantum theory performs so well in modeling human concepts, and shed light on it by analyzing the role of complex amplitudes, showing how they make it possible to describe interference in the statistics of measurement outcomes, whereas in the traditional theories the statistics of outcomes originate in classical probability weights, without the possibility of interference. The relevance of complex numbers, the appearance of entanglement, and the role of Fock space in explaining contextual emergence, all unique features of the quantum modeling, are explicitly revealed in this paper by analyzing human concepts and their dynamics. Comment: 31 pages, 5 figures.
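
    The role of the complex amplitudes can be made concrete with the standard two-component superposition below; this is textbook quantum formalism used for illustration, not a reconstruction of the authors' full concept model.

        % State of a combined concept as a superposition of its component states:
        \[
          \lvert \psi \rangle = a\,\lvert A \rangle + b\,\lvert B \rangle,
          \qquad |a|^{2} + |b|^{2} = 1, \quad a, b \in \mathbb{C}.
        \]
        % The outcome statistics then carry an interference term that no classical
        % mixture of the two component distributions can reproduce:
        \[
          p(x) = |a|^{2}\, p_{A}(x) + |b|^{2}\, p_{B}(x)
               + 2\,|a|\,|b|\,\sqrt{p_{A}(x)\, p_{B}(x)}\,\cos\phi(x),
        \]
        % where \phi(x) is the relative phase of the complex amplitudes; setting the
        % interference term to zero recovers the classical weighted average that a
        % purely probabilistic (weights-based) theory would predict.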

    Inconstant Planck's constant

    Motivated by the Dirac idea that fundamental constants are dynamical variables, and by conjectures on the quantum structure of spacetime at small distances, we consider the possibility that the Planck constant ℏ is a time-dependent quantity, undergoing random Gaussian fluctuations around its measured constant mean value, with variance σ² and a typical correlation timescale Δt. We consider the cases of a propagating free particle and of a one-dimensional harmonic-oscillator coherent state, and show that the time evolution in both cases differs from the standard behaviour. Finally, we discuss how interferometric experiments, or experiments exploiting coherent electromagnetic fields in a cavity, may put effective bounds on the value of τ = σ²Δt. Comment: To appear in the International Journal of Modern Physics.
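
    A toy numerical illustration of the dephasing mechanism, assuming small Gaussian fluctuations of ℏ that are redrawn every correlation time Δt and act on the phase of a single energy eigenstate; the units and parameter values are arbitrary, and this is not the paper's calculation.

        import numpy as np

        # Toy dephasing estimate (dimensionless units, E/hbar0 = 1): hbar fluctuates as
        # hbar0 * (1 + xi), with xi ~ N(0, sigma2) redrawn every correlation time dt.
        # For small xi, 1/hbar ~ (1 - xi)/hbar0, so each step adds a random phase kick
        # of about -omega * xi * dt on top of the deterministic phase.

        rng = np.random.default_rng(1)

        omega = 1.0            # E / hbar0 in the chosen units
        sigma2 = 0.01          # variance of the relative fluctuation of hbar
        dt = 0.5               # correlation time of the fluctuations
        n_steps = 2000
        n_realizations = 2000

        xi = rng.normal(0.0, np.sqrt(sigma2), size=(n_realizations, n_steps))
        random_phase = -omega * dt * np.cumsum(xi, axis=1)

        # Ensemble-averaged coherence |<exp(i*phi_random)>| decays with time, at a rate
        # controlled by the combination tau = sigma2 * dt (the quantity bounded above).
        coherence = np.abs(np.exp(1j * random_phase).mean(axis=0))

        t_index = n_steps // 2 - 1                 # corresponds to t = 500 in these units
        analytic = np.exp(-0.5 * omega**2 * sigma2 * dt * 500.0)
        print("simulated coherence at t = 500: ", round(float(coherence[t_index]), 3))
        print("analytic exp(-omega^2 tau t/2): ", round(float(analytic), 3))

    In this toy picture the loss of coherence depends on σ² and Δt only through their product, which is why an experiment can bound τ = σ²Δt rather than the two quantities separately.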

    Typicality, graded membership, and vagueness

    This paper addresses theoretical problems arising from the vagueness of language terms and intuitions about the vagueness of the concepts to which they refer. It is argued that the central intuitions of prototype theory are sufficient to account for both typicality phenomena and psychological intuitions about degrees of membership in vaguely defined classes. The first section explains the importance of the relation between degrees of membership and typicality (or goodness of example) in conceptual categorization. The second and third sections address arguments advanced by Osherson and Smith (1997) and by Kamp and Partee (1995) that the two notions of degree of membership and typicality must relate to fundamentally different aspects of conceptual representations. A version of prototype theory, the Threshold Model, is proposed to counter these arguments, and three possible solutions to the problems of logical self-contradiction and tautology for vague categorizations are outlined. In the final section, graded membership is related to the social construction of conceptual boundaries maintained through language use.
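
    One way to make the typicality / graded-membership distinction concrete is sketched below, on an illustrative threshold-style reading: typicality tracks similarity to a prototype directly, while graded membership is the probability that this similarity clears a vague categorization threshold. This reading and all numbers are assumptions for illustration, not the paper's formal definition of the Threshold Model.

        import math

        # Illustrative threshold-style reading (assumed, not the paper's definition):
        # typicality is similarity to a prototype; graded membership is the chance
        # that this similarity exceeds a categorization threshold whose placement
        # varies across judges and occasions (modelled here as a logistic spread).

        def typicality(similarity_to_prototype):
            """Typicality tracks similarity directly (0..1)."""
            return similarity_to_prototype

        def graded_membership(similarity_to_prototype, threshold=0.5, vagueness=0.1):
            """Probability that similarity clears a noisy threshold."""
            return 1.0 / (1.0 + math.exp(-(similarity_to_prototype - threshold) / vagueness))

        for item, sim in [("robin", 0.9), ("penguin", 0.55), ("bat", 0.3)]:
            print(f"{item:8s} typicality={typicality(sim):.2f} "
                  f"membership={graded_membership(sim):.2f}")

    On this reading both quantities derive from the same similarity scale, but through different functions, which is the kind of reconciliation the abstract says prototype theory can provide.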
