
    Generalized information theory meets human cognition: Introducing a unified framework to model uncertainty and information search

    Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people’s goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism.
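
    As a minimal illustrative sketch (not drawn from the paper itself), the Sharma-Mittal family can be written with an order parameter r and a degree parameter t, and the entropies named above arise as special cases or limits: Shannon for r, t -> 1; Rényi for t -> 1; Tsallis for t = r; Hartley for r -> 0, t -> 1; and Quadratic for r = t = 2. The Python snippet below (with hypothetical function names) computes the general form and these named limits.

```python
import numpy as np

def sharma_mittal(p, r, t, eps=1e-12):
    """Sharma-Mittal entropy of a probability vector p, in natural-log units.

    General form: ((sum_i p_i**r) ** ((1 - t) / (1 - r)) - 1) / (1 - t).
    The limits r -> 1, t -> 1, and t -> r are handled via their closed forms.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                                       # ignore zero-probability outcomes
    if abs(r - 1.0) < eps and abs(t - 1.0) < eps:
        return -np.sum(p * np.log(p))                  # Shannon entropy
    if abs(t - 1.0) < eps:
        return np.log(np.sum(p ** r)) / (1.0 - r)      # Renyi entropy of order r
    if abs(r - 1.0) < eps:
        h = -np.sum(p * np.log(p))                     # r -> 1 limit (exponential family)
        return (np.exp((1.0 - t) * h) - 1.0) / (1.0 - t)
    if abs(r - t) < eps:
        return (1.0 - np.sum(p ** r)) / (r - 1.0)      # Tsallis entropy of order r
    power = (1.0 - t) / (1.0 - r)
    return (np.sum(p ** r) ** power - 1.0) / (1.0 - t)

p = [0.5, 0.25, 0.125, 0.125]
print(sharma_mittal(p, 1, 1))   # Shannon (nats)
print(sharma_mittal(p, 0, 1))   # Hartley: log of the number of possible outcomes
print(sharma_mittal(p, 2, 1))   # Renyi of order 2
print(sharma_mittal(p, 2, 2))   # Quadratic (Gini-Simpson): 1 - sum_i p_i**2
```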

    Structural and Dynamical Anomalies of a Gaussian Core Fluid: a Mode Coupling Theory Study

    We present a theoretical study of the transport properties of a liquid composed of particles interacting via the Gaussian Core pair potential. The shear viscosity and the self-diffusion coefficient are computed on the basis of mode-coupling theory, with the required structural input obtained from integral equation theory. Both the self-diffusion coefficient and the viscosity display anomalous density dependence, with diffusivity increasing and viscosity decreasing with density within a particular density range along several isotherms below a certain temperature. Our theoretical results for both transport coefficients are in good agreement with the simulation data.
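
    For context (an illustrative sketch, not part of the abstract): the Gaussian Core model refers to the bounded pair potential u(r) = ε exp(-(r/σ)²). Because the repulsion stays finite even at full overlap (u(0) = ε), particles can interpenetrate at high density, which is commonly linked to the anomalous, non-monotonic density dependence of diffusivity and viscosity described above.

```python
import numpy as np

def gaussian_core(r, eps=1.0, sigma=1.0):
    """Gaussian Core pair potential u(r) = eps * exp(-(r / sigma)**2)."""
    return eps * np.exp(-(np.asarray(r, dtype=float) / sigma) ** 2)

r = np.linspace(0.0, 3.0, 7)
print(gaussian_core(r))  # finite at r = 0 and decaying smoothly with distance
```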

    The Statistical Foundations of Entropy

    In the last two decades, the understanding of complex dynamical systems underwent important conceptual shifts. The catalyst was the infusion of new ideas from the theory of critical phenomena (scaling laws, renormalization group, etc.), (multi)fractals and trees, random matrix theory, network theory, and non-Shannonian information theory. The usual Boltzmann–Gibbs statistics were shown to be grossly inadequate in this context. While successful in describing stationary systems characterized by ergodicity or metric transitivity, Boltzmann–Gibbs statistics fail to reproduce the complex statistical behavior of many real-world systems in biology, astrophysics, geology, and the economic and social sciences. The aim of this Special Issue was to extend the state of the art with original contributions to the ongoing discussion on the statistical foundations of entropy, with a particular emphasis on non-conventional entropies that go significantly beyond the Boltzmann, Gibbs, and Shannon paradigms. The accepted contributions addressed information-theoretic, thermodynamic, and quantum aspects of complex systems and presented several important applications of generalized entropies to various systems.
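
    As a minimal, self-contained illustration of what going beyond the Boltzmann–Gibbs–Shannon paradigm can mean (an example added here, not taken from the Special Issue): the Tsallis entropy S_q(p) = (1 - Σ_i p_i^q) / (q - 1) recovers the Shannon form as q -> 1, but for independent subsystems it composes non-additively, S_q(A,B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B), which is one reason such entropies are invoked for strongly correlated or non-ergodic systems.

```python
import numpy as np

def tsallis(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i**q) / (q - 1); q -> 1 gives Shannon."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

q = 1.5
a = np.array([0.6, 0.4])
b = np.array([0.7, 0.2, 0.1])
joint = np.outer(a, b).ravel()                 # independent joint distribution

lhs = tsallis(joint, q)
rhs = tsallis(a, q) + tsallis(b, q) + (1 - q) * tsallis(a, q) * tsallis(b, q)
print(np.isclose(lhs, rhs))                    # True: non-additive composition rule
```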

    Cosmological Inflation, Dark Matter and Dark Energy

    Various cosmological observations support not only cosmological inflation in the early universe, a period of exponential cosmic expansion, but also that the expansion of the late-time universe is accelerating. To explain this phenomenon, the existence of dark energy has been proposed. In addition, the rotation curves of galaxies suggest the existence of dark matter, which does not emit light. If primordial gravitational waves are detected in the future, the mechanism for realizing inflation can be revealed. Moreover, there are two main candidates for dark matter. The first is a new particle whose existence is predicted by particle physics. The second is an astrophysical object that is not observed via electromagnetic waves. Furthermore, there are two representative approaches to account for the accelerated expansion of the current universe. One is to assume an unknown dark energy within general relativity. The other is to extend the theory of gravity to large scales. Investigating the origins of inflation, dark matter, and dark energy is one of the most fundamental problems in modern physics and cosmology. The purpose of this book is to explore the physics and cosmology of inflation, dark matter, and dark energy.

    DiSCoMaT: Distantly Supervised Composition Extraction from Tables in Materials Science Articles

    A crucial component in the curation of knowledge bases (KBs) for a scientific domain (e.g., materials science, foods & nutrition, fuels) is information extraction from tables in the domain's published research articles. To facilitate research in this direction, we define a novel NLP task of extracting compositions of materials (e.g., glasses) from tables in materials science papers. The task involves solving several challenges in concert: tables that mention compositions have highly varying structures; text from captions and the full paper needs to be incorporated along with the data in tables; and regular languages for numbers, chemical compounds, and composition expressions must be integrated into the model. We release a training dataset comprising 4,408 distantly supervised tables, along with 1,475 manually annotated dev and test tables. We also present a strong baseline, DiSCoMaT, which combines multiple graph neural networks with several task-specific regular expressions, features, and constraints. We show that DiSCoMaT outperforms recent table processing architectures by significant margins. Comment: Accepted as a long paper at ACL 2023 (https://2023.aclweb.org/program/accepted_main_conference/).
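
    To make the regular-expression component above concrete, here is a hypothetical sketch (names and patterns are illustrative, not the actual DiSCoMaT code): glass compositions are often written as strings such as "70SiO2-20Na2O-10CaO", and a pattern over number-compound pairs can pull out candidate (compound, amount) tuples for a model to reconcile with the table structure.

```python
import re

# Hypothetical pattern for composition expressions such as "70SiO2-20Na2O-10CaO".
# The real DiSCoMaT regexes, features, and constraints are task-specific and richer;
# this only illustrates the idea of parsing number-compound pairs.
COMPONENT = re.compile(r"(?P<amount>\d+(?:\.\d+)?)\s*(?P<compound>(?:[A-Z][a-z]?\d*)+)")

def parse_composition(text):
    pairs = [(m.group("compound"), float(m.group("amount")))
             for m in COMPONENT.finditer(text)]
    total = sum(amount for _, amount in pairs)
    return pairs, abs(total - 100.0) < 1e-6    # flag whether amounts look like percentages

print(parse_composition("70SiO2-20Na2O-10CaO"))
# ([('SiO2', 70.0), ('Na2O', 20.0), ('CaO', 10.0)], True)
```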

    New Directions for Contact Integrators

    Contact integrators are a family of geometric numerical schemes that guarantee the conservation of the contact structure. In this work we review the construction of both the variational and Hamiltonian versions of these methods. We illustrate some of the advantages of geometric integration in the dissipative setting by focusing on models inspired by recent studies in celestial mechanics and cosmology. Comment: To appear as Chapter 24 in GSI 2021, Springer LNCS 1282
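
    A minimal sketch of the idea (under the standard contact-Hamiltonian conventions, not the specific schemes reviewed in the work): for H(q, p, s) = p²/2 + V(q) + αs, the contact equations of motion are dq/dt = ∂H/∂p, dp/dt = -∂H/∂q - p ∂H/∂s, ds/dt = p ∂H/∂p - H, and each of the three terms of H generates an exactly solvable flow. Composing those exact flows yields a simple first-order contact splitting integrator; with V(q) = q²/2 the dynamics reduce to the damped oscillator q'' + αq' + q = 0.

```python
import math

def contact_splitting_step(q, p, s, dt, alpha, V, dV):
    """One first-order splitting step for H = p**2/2 + V(q) + alpha*s.

    Each sub-step is the exact flow of one term of H (hence a contact map):
    the kinetic term, the potential term, and the linear-in-s dissipative term.
    """
    # Flow of p**2/2: q' = p, s' = p**2/2
    q += p * dt
    s += 0.5 * p * p * dt
    # Flow of V(q): p' = -V'(q), s' = -V(q)
    p -= dV(q) * dt
    s -= V(q) * dt
    # Flow of alpha*s: p' = -alpha*p, s' = -alpha*s (exact exponential decay)
    decay = math.exp(-alpha * dt)
    return q, p * decay, s * decay

# Damped harmonic oscillator: V(q) = q**2/2, i.e. q'' + alpha*q' + q = 0.
alpha, dt = 0.1, 0.01
q, p, s = 1.0, 0.0, 0.0
for _ in range(10_000):
    q, p, s = contact_splitting_step(q, p, s, dt, alpha,
                                     V=lambda x: 0.5 * x * x, dV=lambda x: x)
print(q, p)  # the amplitude has decayed, as expected for the dissipative flow
```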