
    Modelling contextuality by probabilistic programs with hypergraph semantics

    Models of a phenomenon are often developed by examining it under different experimental conditions, or measurement contexts. The resultant probabilistic models assume that the underlying random variables, which define a measurable set of outcomes, can be defined independently of the measurement context. The phenomenon is deemed contextual when this assumption fails. Contextuality is an important issue in quantum physics. However, there has been growing speculation that it manifests outside the quantum realm, with human cognition being a particularly prominent area of investigation. This article contributes the foundations of a probabilistic programming language that allows convenient exploration of contextuality in a wide range of applications relevant to cognitive science and artificial intelligence. Specific syntax is proposed to allow the specification of "measurement contexts". Each such context delivers a partial model of the phenomenon based on the associated experimental condition described by the measurement context. The probabilistic program is translated into a hypergraph in a modular way. Recent theoretical results from the field of quantum physics show that contextuality can be equated with the possibility of constructing a probabilistic model on the resulting hypergraph. The use of hypergraphs opens the door to a theoretically succinct and efficient computational semantics sensitive to modelling both contextual and non-contextual phenomena. Finally, this article aims to raise awareness of contextuality beyond quantum physics and to contribute formal methods for detecting its presence by means of hypergraph semantics. (Comment: accepted for Theoretical Computer Science.)
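
    To make the modelling idea concrete, the sketch below (not the paper's actual hypergraph semantics) treats measurement contexts as hyperedges over observables and uses a feasibility linear program to test whether a single global joint distribution reproduces every context's empirical distribution; the absence of such a global model is the hallmark of contextuality. The PR-box style correlations are invented illustrative data, not taken from the article.

```python
# A sketch only: contexts as hyperedges, contextuality as the absence of a
# global joint distribution reproducing all per-context marginals.
from itertools import product
from scipy.optimize import linprog

observables = ["A0", "A1", "B0", "B1"]
contexts = [("A0", "B0"), ("A0", "B1"), ("A1", "B0"), ("A1", "B1")]

def pr_box(ctx):
    """Illustrative PR-box statistics: perfectly correlated in every
    context except (A1, B1), which is perfectly anti-correlated."""
    flip = ctx == ("A1", "B1")
    return {(a, b): 0.5 if (a == b) != flip else 0.0
            for a, b in product([0, 1], repeat=2)}

empirical = {ctx: pr_box(ctx) for ctx in contexts}

def has_global_model(observables, contexts, empirical):
    """Feasibility LP: is there a joint distribution over all observables
    whose marginals match every context's empirical distribution?"""
    atoms = list(product([0, 1], repeat=len(observables)))  # global assignments
    idx = {name: i for i, name in enumerate(observables)}
    A_eq, b_eq = [], []
    for ctx in contexts:
        for outcome, p in empirical[ctx].items():
            row = [1.0 if all(atom[idx[q]] == v for q, v in zip(ctx, outcome))
                   else 0.0
                   for atom in atoms]
            A_eq.append(row)
            b_eq.append(p)
    A_eq.append([1.0] * len(atoms))   # total probability is one
    b_eq.append(1.0)
    res = linprog([0.0] * len(atoms), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0.0, 1.0)] * len(atoms), method="highs")
    return res.success

print("contextual" if not has_global_model(observables, contexts, empirical)
      else "non-contextual")   # prints "contextual" for the PR-box data
```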

    Analysing Ambiguous Nouns and Verbs with Quantum Contextuality Tools

    Psycholinguistic research uses eye-tracking to show that polysemous words are disambiguated differently from homonymous words, and that ambiguous verbs are disambiguated differently from ambiguous nouns. Research in Compositional Distributional Semantics uses cosine distances to show that verbs are disambiguated more efficiently in the context of their subjects and objects than when on their own. Both frameworks focus on one ambiguous word at a time, and neither considers ambiguous phrases with two (or more) ambiguous words. We borrow methods and measures from Quantum Information Theory, the framework of Contextuality-by-Default, and degrees of contextual influences, and work with ambiguous subject-verb and verb-object phrases of English, where both the subject/object and the verb are ambiguous. We show that differences in the processing of ambiguous verbs versus ambiguous nouns, as well as between different levels of ambiguity in homonymous versus polysemous nouns and verbs, can be modelled using the averages of the degrees of their contextual influences.
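
    As a rough indication of the quantities involved, the sketch below computes the Contextuality-by-Default bookkeeping for a cyclic system of rank n with ±1 outcomes (the Kujala-Dzhafarov s_odd criterion), together with the summed and averaged degrees of direct contextual influences of the kind the study works with. The expectation values are invented for illustration and are not the paper's linguistic data.

```python
# A sketch of Contextuality-by-Default quantities for a cyclic system of
# rank n with +/-1 outcomes; the numbers below are invented, not the study's data.
from itertools import product

def s_odd(xs):
    """Maximum of sum(+/-x_i) over sign patterns with an odd number of minuses."""
    return max(sum(s * x for s, x in zip(signs, xs))
               for signs in product([1, -1], repeat=len(xs))
               if signs.count(-1) % 2 == 1)

# Product expectations <R R'>, one per context, around the cycle (n = 4 here).
products = [0.7, 0.6, 0.8, -0.9]
# |<R_q in one context> - <R_q in the other>| per content: the "degrees of
# direct contextual influences" whose averages the study uses.
direct_influences = [0.10, 0.05, 0.00, 0.20]

n = len(products)
delta = sum(direct_influences)
# Kujala-Dzhafarov criterion: the system is contextual when s_odd(products)
# exceeds (n - 2) + delta; the surplus serves as a degree of contextuality.
degree = s_odd(products) - (n - 2) - delta
print(f"average direct influence = {delta / n:.3f}")
print(f"degree = {degree:.2f}",
      "(contextual)" if degree > 0 else "(noncontextual)")
```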

    Complexity: against systems

    This article assumes a specific intuitive notion of complexity as the difficulty of generating and/or assessing the plausibility of models. Based on this intuitive understanding of complexity, it identifies two main causes of complexity, namely radical openness and contextuality. The former is the idea that there are no natural systems: the modeler always needs to draw artificial boundaries around phenomena to generate feasible models. Contextuality is intimately connected to the requirement to simplify models and to leave out most aspects. Complexity occurs when contextuality and radical openness cannot be contained, that is, when it is not clear where the boundaries of the system are and which abstractions are the correct ones. This concept of complexity is illustrated using a number of examples from evolution.

    Second Generation General System Theory: Perspectives in Philosophy and Approaches in Complex Systems

    Following the classical work of Norbert Wiener, Ross Ashby, Ludwig von Bertalanffy and many others, the concept of System has been elaborated in different disciplinary fields, allowing interdisciplinary approaches in areas such as Physics, Biology, Chemistry, Cognitive Science, Economics, Engineering, Social Sciences, Mathematics, Medicine, Artificial Intelligence, and Philosophy. The new challenge of Complexity and Emergence has made the concept of System even more relevant to the study of problems with high contextuality. This Special Issue focuses on the nature of the new problems arising from the study and modelling of complexity, on their possible common aspects, properties and approaches (already partially considered by different disciplines), and on new, possibly unitary, theoretical frameworks. This Special Issue aims to give fresh impetus to systems research in cases where the detection and correction of mistakes require the development of new knowledge. This book contains contributions presenting new approaches and results, problems and proposals. The context is an interdisciplinary framework dealing, in order, with electronic engineering problems; the problem of the observer; transdisciplinarity; problems of organised complexity; theoretical incompleteness; design of digital systems in a user-centred way; reaction networks as a framework for systems modelling; emergence of a stable system in reaction networks; emergence at the fundamental systems level; and behavioural realization of memoryless functions.

    Natural Language Syntax Complies with the Free-Energy Principle

    Natural language syntax yields an unbounded array of hierarchically structured expressions. We claim that these are used in the service of active inference, in accord with the free-energy principle (FEP). While conceptual advances alongside modelling and simulation work have attempted to connect speech segmentation and linguistic communication with the FEP, we extend this program to the underlying computations responsible for generating syntactic objects. We argue that recently proposed principles of economy in language design, such as "minimal search" criteria from theoretical syntax, adhere to the FEP. This affords the FEP a greater degree of explanatory power with respect to higher language functions, and offers linguistics a grounding in first principles with respect to computability. We show how both tree-geometric depth and a Kolmogorov complexity estimate (recruiting a Lempel-Ziv compression algorithm) can be used to accurately predict legal operations on syntactic workspaces, directly in line with formulations of variational free energy minimization. This is used to motivate a general principle of language design that we term Turing-Chomsky Compression (TCC). We use TCC to align the concerns of linguists with the normative account of self-organization furnished by the FEP, marshalling evidence from theoretical linguistics and psycholinguistics to ground core principles of efficient syntactic computation within active inference.
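
    The sketch below is not the authors' implementation; it only illustrates the two kinds of scores the abstract mentions: an LZ78-style phrase count as a rough Lempel-Ziv complexity estimate, and the tree-geometric depth of a bracketed syntactic object. The bracketed strings are invented examples of alternative workspace continuations.

```python
# A sketch: an LZ78-style phrase count as a rough Lempel-Ziv complexity
# estimate, plus tree-geometric depth, applied to invented bracketed strings.

def lz_phrase_count(s: str) -> int:
    """Number of phrases in an incremental LZ78-style parse of s."""
    seen, phrase, count = set(), "", 0
    for ch in s:
        phrase += ch
        if phrase not in seen:
            seen.add(phrase)
            count += 1
            phrase = ""
    return count + (1 if phrase else 0)   # count any unfinished final phrase

def tree_depth(bracketed: str) -> int:
    """Maximum nesting depth of a bracketed expression."""
    depth = best = 0
    for ch in bracketed:
        if ch == "[":
            depth += 1
            best = max(best, depth)
        elif ch == "]":
            depth -= 1
    return best

# Two hypothetical continuations of the same workspace; on the FEP story the
# preferred ("legal") one should score lower on both measures.
candidates = {
    "economical":   "[TP [DP the dog] [VP chased [DP the cat]]]",
    "uneconomical": "[TP [DP the dog] [VP [VP chased] [DP [DP the] [NP cat]]]]",
}
for name, tree in candidates.items():
    print(name, "depth =", tree_depth(tree),
          "LZ phrases =", lz_phrase_count(tree))
```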