
    The Doxastic Interpretation of Team Semantics

    We advance a doxastic interpretation for many of the logical connectives considered in Dependence Logic and in its extensions, and we argue that Team Semantics is a natural framework for reasoning about beliefs and belief updates.
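
    For context, Team Semantics evaluates formulas on sets of assignments ("teams") rather than on single assignments, which is what invites the doxastic reading: a team can be viewed as the set of situations an agent considers possible. A minimal sketch of two standard clauses (background from Väänänen-style Dependence Logic, not specific claims of this paper), in LaTeX:

        % Dependence atom: within the team T, the value of y is functionally determined by x
        T \models {=}(x, y) \iff \forall s, s' \in T \,\bigl( s(x) = s'(x) \rightarrow s(y) = s'(y) \bigr)

        % Splitting (tensor) disjunction: the team may be divided between the disjuncts
        T \models \varphi \lor \psi \iff \exists T_1, T_2 \,\bigl( T = T_1 \cup T_2 \wedge T_1 \models \varphi \wedge T_2 \models \psi \bigr)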

    Reason Maintenance - State of the Art

    This paper describes the state of the art in reason maintenance, with a focus on its future use in the KiWi project. To give a bigger picture of the field, it also mentions closely related issues such as non-monotonic logic and paraconsistency. The paper is organized as follows: first, two motivating scenarios involving semantic wikis are presented, which are then used to introduce the different reason maintenance techniques.
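
    As a concrete illustration of one family of techniques the paper surveys, below is a minimal sketch of a justification-based reason maintenance (JTMS-style) network in Python. The class names, the naive fixed-point relabelling, and the bird/penguin example are illustrative assumptions of this sketch, not the KiWi design; real systems relabel incrementally and handle odd dependency loops.

        # Minimal JTMS-style sketch: a node is labelled IN (believed) when some
        # justification has all of its in-list nodes IN and all of its out-list nodes OUT.

        class Node:
            def __init__(self, name):
                self.name = name
                self.justifications = []     # list of (in_list, out_list) of Node objects
                self.label = "OUT"           # "IN" means currently believed

        class JTMS:
            def __init__(self):
                self.nodes = {}

            def node(self, name):
                return self.nodes.setdefault(name, Node(name))

            def justify(self, consequent, in_list=(), out_list=()):
                self.node(consequent).justifications.append(
                    ([self.node(n) for n in in_list], [self.node(n) for n in out_list]))
                self.propagate()

            def propagate(self):
                # naive fixed-point relabelling, sufficient for this small example
                changed = True
                while changed:
                    changed = False
                    for nd in self.nodes.values():
                        supported = any(all(i.label == "IN" for i in ins) and
                                        all(o.label == "OUT" for o in outs)
                                        for ins, outs in nd.justifications)
                        new = "IN" if supported else "OUT"
                        if new != nd.label:
                            nd.label, changed = new, True

        tms = JTMS()
        tms.justify("premise")                                        # unconditionally supported
        tms.justify("bird_flies", in_list=["premise"], out_list=["penguin"])
        print(tms.node("bird_flies").label)                           # IN: the default holds
        tms.justify("penguin")                                        # new information arrives
        print(tms.node("bird_flies").label)                           # OUT: non-monotonic retraction

    The out-list is what produces the non-monotonic behaviour mentioned in the abstract: adding a justification can retract a previously held belief without deleting any of the recorded reasons.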

    Data Processing In The Texas A & M University Library

    The Texas A & M University Library embraced automation as a way of life when it became the first library in the Southwest to employ a Data Processing Supervisor as a full-time Library staff member in September 1964. The creation of such a position as part of the Library staff was only one of several favorable circumstances which combined to provide the necessary foundation for the achievements outlined in this paper. In addition to an enthusiastic University administration, which provided requested supplemental funds for a special conversion project, the Library has access to the University's centralized data processing facility, which is one of the largest such University installations in the Southwest. The Data Processing Center houses an IBM 7094-1401 computer system with 14 magnetic tape drives, two separate off-line 1401 tape systems (one with a 1404 printer), and a battery of high-speed sorters, collators, and card punches. This tremendous hardware capability has proved to be a great asset to our automation program.

    Imprecise Bayesianism and Global Belief Inertia

    Traditional Bayesianism requires that an agent’s degrees of belief be represented by a real-valued, probabilistic credence function. However, in many cases it seems that our evidence is not rich enough to warrant such precision. In light of this, some have proposed that we instead represent an agent’s degrees of belief as a set of credence functions. This way, we can respect the evidence by requiring that the set, often called the agent’s credal state, includes all credence functions that are in some sense compatible with the evidence. One known problem for this evidentially motivated imprecise view is that in certain cases, our imprecise credence in a particular proposition will remain the same no matter how much evidence we receive. In this article I argue that the problem is much more general than has been appreciated so far, and that it’s difficult to avoid it without compromising the initial evidentialist motivation. 1 Introduction; 2 Precision and Its Problems; 3 Imprecise Bayesianism and Respecting Ambiguous Evidence; 4 Local Belief Inertia; 5 From Local to Global Belief Inertia; 6 Responding to Global Belief Inertia; 7 Conclusion.
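
    To see the inertia phenomenon numerically, here is a toy sketch (not from the article) of an imprecise agent whose credal state is a set of Beta priors over a coin's bias; the hyperparameter grids are illustrative assumptions. Under a Beta(a, b) prior, the posterior probability of heads on the next toss, after seeing k heads in n tosses, is (a + k) / (a + b + n).

        # Toy illustration of belief inertia for an imprecise Bayesian agent.

        def posterior_next_heads(a, b, k, n):
            # posterior predictive of heads under a Beta(a, b) prior and k heads in n tosses
            return (a + k) / (a + b + n)

        def credal_interval(prior_params, k, n):
            posts = [posterior_next_heads(a, b, k, n) for a, b in prior_params]
            return min(posts), max(posts)

        k, n = 70, 100                                 # observed evidence: 70 heads in 100 tosses

        # A modest credal set: the interval narrows as evidence accumulates.
        modest = [(a, b) for a in (0.5, 1, 2, 5) for b in (0.5, 1, 2, 5)]
        print(credal_interval(modest, k, n))           # roughly (0.67, 0.71)

        # A maximally permissive credal set: the interval stays close to (0, 1)
        # no matter how much evidence comes in.
        permissive = [(a, b) for a in (0.01, 1, 1000) for b in (0.01, 1, 1000)]
        print(credal_interval(permissive, k, n))       # roughly (0.06, 0.97)

    The second credal set is the kind the evidentially motivated imprecise Bayesian is pushed towards, and its refusal to narrow under accumulating evidence is the belief inertia the article generalizes.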

    Concurrent Data Structures Linked in Time

    Arguments about the correctness of a concurrent data structure are typically carried out using the notion of linearizability and by specifying the linearization points of the data structure's procedures. Such arguments are often cumbersome because the position of a linearization point in time can be dynamic (depending on interference, run-time values, and events from the past or even the future), non-local (appearing in procedures other than the one considered), and it may be determinable only after the considered procedure has already terminated. In this paper we propose a new method, based on a separation-style logic, for reasoning about concurrent objects with such linearization points. We embrace the dynamic nature of linearization points and encode it as part of the data structure's auxiliary state, so that it can be dynamically modified in place by auxiliary code, as needed when some appropriate run-time event occurs. We name the idea linking-in-time, because it reduces temporal reasoning to spatial reasoning. For example, modifying the temporal position of a linearization point can be modeled similarly to a pointer update in separation logic. Furthermore, the auxiliary state provides a convenient way to concisely express the properties essential for reasoning about clients of such concurrent objects. We illustrate the method by verifying (mechanically, in Coq) an intricate optimal snapshot algorithm due to Jayanti, as well as some clients.
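
    As a loose, sequentialized analogy (not the paper's Coq and separation-logic development), the sketch below shows the flavour of the linking-in-time idea: each operation carries a mutable auxiliary record holding its tentative linearization timestamp, and auxiliary code may later reassign that timestamp when a run-time event resolves the real order. All names here are hypothetical.

        class OpRecord:
            def __init__(self, name):
                self.name = name
                self.lin_time = None          # tentative temporal position, mutable auxiliary state

        aux_history = []                      # auxiliary state: the claimed linear order

        def tentatively_linearize(op, t):
            op.lin_time = t
            aux_history.append(op)

        scan = OpRecord("scan")
        tentatively_linearize(scan, 2)        # provisional choice made during execution

        write = OpRecord("write x := 5")
        tentatively_linearize(write, 3)

        # A later run-time event reveals that the scan must appear to take effect after
        # the write; "linking in time" re-points the timestamp in place, much like a
        # heap pointer update in separation logic.
        scan.lin_time = 4

        print(sorted((op.lin_time, op.name) for op in aux_history))
        # [(3, 'write x := 5'), (4, 'scan')]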

    Performance of VIDEBAS in an operational environment

    VIDEBAS is a relational database management system in which a database consists of two parts, namely a “read-only” part and an “update” part. The first part remains unmodified until the next reorganization and exploits redundancy to achieve fast access to data. A prototype of VIDEBAS has been built. In this paper a performance comparison is made between this relational system and a DBTG system (UDS). The external memory used and the number of page accesses needed to retrieve and update tuples are estimated. Although it is commonly assumed that relational systems are slower than network systems in an operational environment, the opposite appears to be the case. On the other hand, UDS needs less external memory.

    On the cavity method for decimated random constraint satisfaction problems and the analysis of belief propagation guided decimation algorithms

    We introduce a version of the cavity method for diluted mean-field spin models that allows the computation of thermodynamic quantities similar to the Franz-Parisi quenched potential in sparse random graph models. The method is developed for the particular case of partially decimated random constraint satisfaction problems. This allows us to develop a theoretical understanding of a class of algorithms for solving constraint satisfaction problems, in which elementary degrees of freedom are sequentially assigned according to the results of a message passing procedure (belief propagation). We compare this theoretical analysis with the results of extensive numerical simulations.
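
    To make the algorithmic object of study concrete, below is a self-contained sketch of belief-propagation guided decimation on random 3-SAT. It is illustrative only: the message-update schedule, the clause density, the tie-breaking rule, and the lack of damping or backtracking are assumptions of this sketch, not the authors' implementation, and the cavity-method analysis itself is not reproduced here.

        import math
        import random
        from collections import defaultdict

        def bp_biases(clauses, iters=100, eps=1e-12):
            """Sum-product BP on the SAT factor graph; returns {var: P(x_var = True)}.
            A clause is a list of signed ints: literal v means x_v, -v means NOT x_v."""
            occ = defaultdict(list)                       # var -> [(clause index, positive?)]
            for a, cl in enumerate(clauses):
                for lit in cl:
                    occ[abs(lit)].append((a, lit > 0))
            # q[(a, v)]: prob, under the message from v to clause a, that v takes the
            # value which does NOT satisfy clause a; start unbiased.
            q = {(a, abs(lit)): 0.5 for a, cl in enumerate(clauses) for lit in cl}

            def clause_to_var():
                # u[(a, v)]: prob that every OTHER variable of clause a fails to satisfy it
                return {(a, abs(lit)): math.prod(q[(a, abs(m))] for m in cl if abs(m) != abs(lit))
                        for a, cl in enumerate(clauses) for lit in cl}

            for _ in range(iters):
                u = clause_to_var()
                for a, cl in enumerate(clauses):
                    for lit in cl:
                        v, s = abs(lit), lit > 0
                        p_unsat = p_sat = 1.0
                        for b, sb in occ[v]:
                            if b == a:
                                continue
                            msg = 1.0 - u[(b, v)]         # m_{b->v} at b's own unsatisfying value
                            if sb == s:                   # same sign: b shares a's unsatisfying value
                                p_unsat *= msg
                            else:
                                p_sat *= msg
                        q[(a, v)] = p_unsat / (p_unsat + p_sat + eps)

            u = clause_to_var()
            bias = {}
            for v, lits in occ.items():
                p_true = p_false = 1.0
                for b, sb in lits:
                    if sb:                                # positive literal: x_v = False is unsatisfying
                        p_false *= 1.0 - u[(b, v)]
                    else:
                        p_true *= 1.0 - u[(b, v)]
                bias[v] = p_true / (p_true + p_false + eps)
            return bias

        def simplify(clauses, var, value):
            """Fix x_var = value: drop satisfied clauses, strip falsified literals."""
            sat_lit = var if value else -var
            return [[l for l in cl if abs(l) != var] for cl in clauses if sat_lit not in cl]

        def bp_guided_decimation(n_vars, clauses):
            assignment, clauses = {}, [list(cl) for cl in clauses]
            while len(assignment) < n_vars:
                if any(len(cl) == 0 for cl in clauses):
                    return None                           # contradiction: give up (no backtracking)
                biases = bp_biases(clauses) if clauses else {}
                free = [v for v in range(1, n_vars + 1) if v not in assignment]
                v = max(free, key=lambda x: abs(biases.get(x, 0.5) - 0.5))
                val = biases.get(v, 0.5) >= 0.5           # assign the most polarized free variable
                assignment[v] = val
                clauses = simplify(clauses, v, val)
            return assignment

        random.seed(0)
        n, alpha = 60, 3.0                                # clause density below the 3-SAT threshold
        clauses = [[random.choice((-1, 1)) * v for v in random.sample(range(1, n + 1), 3)]
                   for _ in range(int(alpha * n))]
        sol = bp_guided_decimation(n, clauses)
        if sol is None:
            print("decimation reached a contradiction")
        else:
            print("assignment satisfies the formula:",
                  all(any((lit > 0) == sol[abs(lit)] for lit in cl) for cl in clauses))

    The paper's cavity-method analysis concerns the typical behaviour of exactly this kind of sequential assignment process on partially decimated random instances.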