
    An Epistemic Perspective on Consistency of Concurrent Computations

    Consistency properties of concurrent computations, e.g., sequential consistency, linearizability, or eventual consistency, are essential for devising correct concurrent algorithms. In this paper, we present a logical formalization of such consistency properties based on a standard logic of knowledge. Our formalization provides a declarative perspective on what consistency requirements impose and yields some interesting unifying insight into seemingly different properties.

    A Note on Parameterised Knowledge Operations in Temporal Logic

    We consider modeling the conception of knowledge in terms of temporal logic. The logical study of knowledge operations originated around 1962 with the representation of knowledge and belief using modalities, and it is by now a very well-established area. We, however, look at it from a somewhat different point of view: our paper models knowledge in terms of linear temporal logic with {\em past}. We consider various versions of logical knowledge operations that may be defined in this framework. Technically, we construct the semantics, the language, and temporal knowledge logics based on our approach; deciding algorithms are suggested, and unification in terms of this approach is discussed. This paper does not offer strong new technical results; instead, we suggest a new approach to the conception of knowledge (in terms of time). Comment: 10 pages
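As a sketch of the general idea (the operators below are the standard LTL-with-past modalities, not necessarily the exact knowledge operations the paper defines), one knowledge-like operation can be read as truth throughout the observed history:

```latex
% Truth at moment n of a linear history \sigma (sketch):
(\sigma, n) \models \mathbf{P}\varphi \iff \exists\, m \le n \;.\; (\sigma, m) \models \varphi
(\sigma, n) \models \mathbf{H}\varphi \iff \forall\, m \le n \;.\; (\sigma, m) \models \varphi
% One candidate "knowledge" operation: \varphi has held at every moment so far.
K\varphi \;:=\; \mathbf{H}\varphi
```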

    A Local-Dominance Theory of Voting Equilibria

    It is well known that no reasonable voting rule is strategyproof. Moreover, the common Plurality rule is particularly prone to strategic behavior of the voters and empirical studies show that people often vote strategically in practice. Multiple game-theoretic models have been proposed to better understand and predict such behavior and the outcomes it induces. However, these models often make unrealistic assumptions regarding voters' behavior and the information on which they base their vote. We suggest a new model for strategic voting that takes into account voters' bounded rationality, as well as their limited access to reliable information. We introduce a simple behavioral heuristic based on \emph{local dominance}, where each voter considers a set of possible world states without assigning probabilities to them. This set is constructed based on prospective candidates' scores (e.g., available from an inaccurate poll). In a \emph{voting equilibrium}, all voters vote for candidates not dominated within the set of possible states. We prove that these voting equilibria exist under the Plurality rule for a broad class of local dominance relations (that is, different ways to decide which states are possible). Furthermore, we show that in an iterative setting where voters may repeatedly change their vote, local dominance-based dynamics quickly converge to an equilibrium if voters start from the truthful state. Weaker convergence guarantees in more general settings are also provided. Using extensive simulations of strategic voting on generated and real preference profiles, we show that convergence is fast and robust, that emerging equilibria are consistent across various starting conditions, and that they replicate widely known patterns of human voting behavior such as Duverger's law. Further, strategic voting generally improves the quality of the winner compared to truthful voting.
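The local-dominance heuristic can be illustrated with a minimal sketch. The function names, the single uncertainty radius `r` around poll scores, and the simplification "vote for your most-preferred potential winner" are illustrative assumptions here, not the paper's exact dominance relations:

```python
# Toy sketch of local-dominance strategic voting under Plurality.
# A voter treats as "possible" every state in which each candidate's true
# score lies within r of its (inaccurate) poll score.

def potential_winners(scores, r):
    """Candidates that could win in some state within distance r of the poll."""
    best = max(scores.values())
    # c can win if c's score could rise by r while the leader's drops by r
    return {c for c, s in scores.items() if s + 2 * r >= best}

def local_dominance_vote(pref_order, scores, r):
    """Vote for the most-preferred candidate among the potential winners."""
    possible = potential_winners(scores, r)
    for c in pref_order:            # pref_order: best-first ranking
        if c in possible:
            return c
    return pref_order[0]            # degenerate case: no potential winner ranked

poll = {"a": 40, "b": 38, "c": 12}
print(local_dominance_vote(["c", "b", "a"], poll, r=2))  # → "b"
```

A voter who prefers the hopeless candidate "c" deserts it for "b", the best candidate that could still win, which is the kind of strategic desertion (Duverger's law) the abstract mentions.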

    On the Behaviour of General-Purpose Applications on Cloud Storages

    Managing data over cloud infrastructures raises novel challenges with respect to existing, well-studied approaches such as ACID and long-running transactions. One of the main requirements is to provide availability and partition tolerance in a scenario with replicas and distributed control. This comes at the price of a weaker consistency, usually called eventual consistency. These weak memory models have proved to be suitable in a number of scenarios, such as the analysis of large data with Map-Reduce. However, due to the widespread availability of cloud infrastructures, weak storages are used not only by specialised applications but also by general-purpose applications. We provide a formal approach, based on process calculi, to reason about the behaviour of programs that rely on cloud stores. For instance, one can check that the composition of a process with a cloud store ensures 'strong' properties through a wise usage of asynchronous message-passing.
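The weak-consistency behaviour the abstract reasons about can be made concrete with a toy model (the paper works in process calculi, not Python; the class and method names below are invented for illustration). Writes are applied locally and propagated asynchronously, so reads against other replicas may be stale until propagation happens:

```python
# Toy model of an eventually consistent key-value store.

class EventualStore:
    def __init__(self, replicas=2):
        self.replicas = [{} for _ in range(replicas)]
        self.pending = []           # writes not yet propagated to all replicas

    def write(self, key, value):
        self.replicas[0][key] = value       # apply at the local replica
        self.pending.append((key, value))   # propagate asynchronously later

    def read(self, key, replica):
        return self.replicas[replica].get(key)  # may return a stale value

    def sync(self):
        """Deliver all pending writes; afterwards the replicas agree."""
        for key, value in self.pending:
            for r in self.replicas:
                r[key] = value
        self.pending.clear()

s = EventualStore()
s.write("x", 1)
print(s.read("x", 1))   # None: replica 1 has not yet seen the write
s.sync()
print(s.read("x", 1))   # 1: after propagation the replicas converge
```

A formal analysis in the paper's spirit would ask which observations a client process can make of such a store, and under what message-passing discipline the composition still satisfies a 'strong' property.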

    Lewis meets Brouwer: constructive strict implication

    C. I. Lewis invented modern modal logic as a theory of "strict implication". Over the classical propositional calculus one can equally well work with the unary box connective. Intuitionistically, however, the strict implication has greater expressive power than the box and allows one to make distinctions invisible in the ordinary syntax. In particular, the logic determined by the most popular semantics of intuitionistic K becomes a proper extension of the minimal normal logic of the binary connective. Even an extension of this minimal logic with the "strength" axiom, classically near-trivial, preserves the distinction between the binary and the unary setting. In fact, this distinction and the strong constructive strict implication itself have also been discovered by the functional programming community in their study of "arrows" as contrasted with "idioms". Our particular focus is on arithmetical interpretations of the intuitionistic strict implication in terms of preservativity in extensions of Heyting's Arithmetic. Comment: Our invited contribution to the collection "L.E.J. Brouwer, 50 years later"
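For orientation (standard background rather than the paper's own results), the classical interdefinability that fails intuitionistically can be stated as follows, writing the Lewis "fishhook" for strict implication:

```latex
% Classical interdefinability of strict implication and the box:
(\varphi \strictif \psi) \;\leftrightarrow\; \Box(\varphi \to \psi)
\qquad
\Box\varphi \;\leftrightarrow\; (\top \strictif \varphi)
% Intuitionistically, \Box remains definable as \top \strictif \varphi,
% but, as the abstract notes, \strictif is strictly more expressive
% than \Box, so the left equivalence is no longer a reduction.
```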

    Fifty years of Hoare's Logic

    We present a history of Hoare's logic. Comment: 79 pages. To appear in Formal Aspects of Computing

    Assimilating Non-Probabilistic Assessments of the Estimation of Uncertainty Bias in Expert Judgment Elicitation Using an Evidence Based Approach in High Consequence Conceptual Designs

    One of the major challenges in conceptual design of complex systems is the identification of uncertainty embedded in the information due to a lack of historic data. This is of particular concern in high-risk industries. This document reports a methodology developed to elucidate the cognitive bias of uncertainty estimation and thereby improve the quality of elicited data. It comprises a comprehensive literature review that begins by defining a 'High Consequence Conceptual Engineering Environment' and identifies the high-risk industries in which these environments are found. It proceeds with a discussion that differentiates risk and uncertainty in decision-making in these environments. An argument is built around the identified epistemic category of uncertainty, its impact on hard data for decision-making, and the experts from whom we obtain this data. The review then turns to defining and selecting the experts; the elicitation process in terms of its components, phases, and steps; and an examination of a probabilistic and a fuzzy example. This sets the stage for the methodology, which uses evidence theory for the mathematical analysis after the data is elicited using a tailored elicitation process. Yager's combination rule is used to combine evidence and fully recognize the ignorance without ignoring available information. Engineering and management teams from NASA Langley Research Center were the population from which the experts for this study were identified. NASA officials were interested in obtaining uncertainty estimates, and a comparison of these estimates, associated with their Crew Launch Vehicle (CLV) designs: the existing Exploration Systems Architecture Study Crew Launch Vehicle (ESAS CLV) and the Parallel-Staged Crew Launch Vehicle (P-S CLV), which is currently being worked.
This evidence-based approach identified that the uncertainty of cost-parameter estimates is not specifically over- or underestimated in High Consequence Conceptual Engineering Environments; rather, there is more uncertainty present than is being anticipated. From the perspective of maturing designs, it was concluded that the ranges of cost parameters' uncertainty at different error-state values were alternately larger or smaller when compared to each other even as the design matures.
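Yager's combination rule, which the abstract relies on, differs from Dempster's rule in that conflicting mass is assigned to the whole frame of discernment (ignorance) rather than normalized away. A minimal sketch (the frame and mass assignments below are invented for illustration, not taken from the study):

```python
# Yager's rule for combining two bodies of evidence over a frame of
# discernment. Mass functions map frozenset focal elements to masses.

def yager_combine(m1, m2, frame):
    """Combine mass functions m1, m2; conflict mass goes to the frame."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb     # disjoint focal elements conflict
    theta = frozenset(frame)
    # Unlike Dempster's rule, do not renormalize: ignorance absorbs conflict.
    combined[theta] = combined.get(theta, 0.0) + conflict
    return combined

frame = {"low", "high"}
m1 = {frozenset({"low"}): 0.6, frozenset(frame): 0.4}   # expert 1
m2 = {frozenset({"high"}): 0.5, frozenset(frame): 0.5}  # expert 2
out = yager_combine(m1, m2, frame)
# Mass on {'low'}: 0.3, on {'high'}: 0.2, on the frame: 0.5
# (the 0.3 of conflicting mass was added to the frame, i.e. to ignorance).
print(out)
```

This is exactly the "recognize the ignorance without ignoring available information" behaviour the abstract describes: strong disagreement between experts shows up as mass on the full frame instead of being normalized into spurious confidence.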