300 research outputs found

    Event-driven Adaptation in COP

    Context-Oriented Programming languages provide us with primitive constructs to adapt program behaviour depending on the evolution of their operational environment, namely the context. In previous work we proposed ML_CoDa, a context-oriented language with two components: a declarative constituent for programming the context and a functional one for computing. This paper describes an extension of ML_CoDa to deal with adaptation to unpredictable context changes notified by asynchronous events. (In Proceedings PLACES 2016, arXiv:1606.0540)
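
    To make the adaptation mechanism concrete, here is a minimal sketch in plain OCaml (illustrative only, not ML_CoDa syntax): the context is a mutable set of properties, asynchronous events update it, and each call dispatches a behavioural variation against whatever context is active at that moment.

        (* Illustrative sketch only, not ML_CoDa syntax: the context is a *)
        (* mutable set of properties, updated by asynchronous events.     *)
        module StringSet = Set.Make (String)

        let ctx = ref (StringSet.of_list [ "online" ])

        (* Event notification: adapt the context when the environment changes. *)
        let on_event = function
          | `ConnectionLost -> ctx := StringSet.remove "online" !ctx
          | `ConnectionBack -> ctx := StringSet.add "online" !ctx

        (* A behavioural variation is a list of (guard, behaviour) pairs;   *)
        (* dispatching picks the first alternative whose guard holds in the *)
        (* currently active context.                                        *)
        let dispatch variations =
          match List.find_opt (fun (guard, _) -> guard !ctx) variations with
          | Some (_, behaviour) -> behaviour ()
          | None -> failwith "no applicable behavioural variation"

        let fetch () =
          dispatch
            [ ((fun c -> StringSet.mem "online" c), fun () -> "fetch from network");
              ((fun _ -> true), fun () -> "use cached copy") ]

        let () =
          print_endline (fetch ());   (* fetch from network *)
          on_event `ConnectionLost;
          print_endline (fetch ())    (* use cached copy *)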

    A Context-Oriented Extension of F#

    Context-Oriented Programming languages provide us with primitive constructs to adapt program behaviour depending on the evolution of their operational environment, namely the context. In previous work we proposed ML_CoDa, a context-oriented language with two components: a declarative constituent for programming the context and a functional one for computing. This paper describes the implementation of ML_CoDa as an extension of F#. (In Proceedings FOCLASA 2015, arXiv:1512.0694)
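
    A hedged sketch, again in plain OCaml rather than the actual ML_CoDa/F# API, of the two-component design this abstract describes: a declarative fact base plays the role of the context, and ordinary functional code adapts by querying it.

        (* Assumed interface for illustration only, not the real ML_CoDa API. *)
        type fact = { name : string; args : string list }

        (* The declarative component: a store of facts with tell/retract/ask. *)
        module Context = struct
          let facts : fact list ref = ref []
          let tell f = facts := f :: !facts
          let retract f = facts := List.filter (fun g -> g <> f) !facts
          let ask name = List.filter (fun f -> f.name = name) !facts
        end

        (* The functional component adapts its behaviour by asking the context. *)
        let greeting () =
          match Context.ask "language" with
          | { args = [ "it" ]; _ } :: _ -> "Ciao"
          | _ -> "Hello"

        let () =
          print_endline (greeting ());                         (* Hello *)
          Context.tell { name = "language"; args = [ "it" ] };
          print_endline (greeting ())                          (* Ciao *)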

    Differential Privacy versus Quantitative Information Flow

    Differential privacy is a notion of privacy that has become very popular in the database community. Roughly, the idea is that a randomized query mechanism provides sufficient privacy protection if the ratio between the probabilities that two databases differing in a single entry produce a given answer is bounded by e^\epsilon. In the fields of anonymity and information flow there is a similar concern for controlling information leakage, i.e. limiting the possibility of inferring the secret information from the observables. In recent years, researchers have proposed to quantify the leakage in terms of the information-theoretic notion of mutual information. There are two main approaches that fall in this category: one based on Shannon entropy, and one based on Rényi's min-entropy. The latter is connected with the so-called Bayes risk, which expresses the probability of guessing the secret. In this paper, we show how to model the query system in terms of an information-theoretic channel, and we compare the notion of differential privacy with that of mutual information. We show that the notion of differential privacy is strictly stronger, in the sense that it implies a bound on the mutual information, but not vice versa.
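
    For reference, the two notions being compared, written out in their standard form (a sketch of the usual formulations from the differential privacy and quantitative information flow literature, not quoted from the paper), in LaTeX:

        A randomized mechanism $K$ is $\epsilon$-differentially private if, for all
        databases $D_1, D_2$ differing in a single entry and every answer $o$,
        \[
          \Pr[K(D_1) = o] \;\le\; e^{\epsilon} \, \Pr[K(D_2) = o].
        \]
        The min-entropy leakage of the induced channel from secrets $X$ to
        observables $Y$ (the notion tied to the Bayes risk) is
        \[
          L(X;Y) \;=\; H_\infty(X) - H_\infty(X \mid Y),
          \qquad
          H_\infty(X) = -\log \max_{x} \Pr[X = x].
        \]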

    Typing Context-Dependent Behavioural Variation

    Context-Oriented Programming (COP) concerns the ability of programs to adapt to changes in their running environment. A number of programming languages endowed with COP constructs and features have been developed. However, some foundational issues remain unclear. This paper proposes adopting static analysis techniques to reason about and predict how programs adapt their behaviour. We introduce a core functional language, ContextML, equipped with COP primitives for manipulating contexts and for programming behavioural variations. In particular, we specify the dispatching mechanism, used to select the program fragments to be executed in the current active context. Besides the dynamic semantics we present an annotated type system. It guarantees that well-typed programs adapt to any context, i.e. the dispatching mechanism always succeeds at run-time. (In Proceedings PLACES 2012, arXiv:1302.579)
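
    The dispatching mechanism and the property the type system enforces can be illustrated with a small sketch in plain OCaml (not ContextML syntax): a variation is a partial behaviour defined only on some contexts, so chaining partial variations can fail at run-time, whereas closing the chain with a total default cannot fail, which is the kind of guarantee the annotated type system gives for well-typed programs.

        type context = string list                  (* the active layers *)
        type 'a variation = context -> 'a option    (* a partial behaviour *)

        let when_ layer body : 'a variation =
          fun ctx -> if List.mem layer ctx then Some (body ()) else None

        (* Unsafe dispatch: raises if no variation applies to the context. *)
        let dispatch (vs : 'a variation list) (ctx : context) : 'a =
          match List.find_map (fun v -> v ctx) vs with
          | Some x -> x
          | None -> failwith "dispatch failed: no variation for this context"

        (* Safe dispatch: a total default makes failure impossible by construction. *)
        let dispatch_total (vs : 'a variation list) (default : unit -> 'a) ctx =
          match List.find_map (fun v -> v ctx) vs with
          | Some x -> x
          | None -> default ()

        let render ctx =
          dispatch_total
            [ when_ "gui" (fun () -> "draw a window");
              when_ "cli" (fun () -> "print plain text") ]
            (fun () -> "log silently")
            ctx

        let () = print_endline (render [ "cli" ])   (* print plain text *)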

    Differential Privacy: on the trade-off between Utility and Information Leakage

    Differential privacy is a notion of privacy that has become very popular in the database community. Roughly, the idea is that a randomized query mechanism provides sufficient privacy protection if the ratio between the probabilities that two adjacent datasets give the same answer is bounded by e^\epsilon. In the field of information flow there is a similar concern for controlling information leakage, i.e. limiting the possibility of inferring the secret information from the observables. In recent years, researchers have proposed to quantify the leakage in terms of Rényi min mutual information, a notion strictly related to the Bayes risk. In this paper, we show how to model the query system in terms of an information-theoretic channel, and we compare the notion of differential privacy with that of mutual information. We show that differential privacy implies a bound on the mutual information (but not vice versa). Furthermore, we show that our bound is tight. Then, we consider the utility of the randomization mechanism, which represents how close, on average, the randomized answers are to the real ones. We show that the notion of differential privacy implies a bound on utility, also tight, and we propose a method that under certain conditions builds an optimal randomization mechanism, i.e. a mechanism which provides the best utility while guaranteeing differential privacy. (30 pages; HAL repository)
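
    One standard way to formalise the utility mentioned here (a hedged sketch; the paper's exact gain function and adjacency relation may differ), in LaTeX:

        With a prior $\pi$ on the true answers $x$, channel probabilities
        $p(y \mid x)$ for the reported answers $y$, and a gain function $g$
        (for instance $g(x, y) = 1$ if a post-processing of $y$ guesses $x$,
        and $0$ otherwise), the utility of a mechanism $K$ is its expected gain
        \[
          U(K) \;=\; \sum_{x} \pi(x) \sum_{y} p(y \mid x) \, g(x, y),
        \]
        and the design problem is to maximise $U(K)$ subject to the
        $\epsilon$-differential-privacy constraint
        $p(y \mid x) \le e^{\epsilon} \, p(y \mid x')$ for all adjacent $x, x'$ and all $y$.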

    Preface


    Formal executable descriptions of biological systems

    The similarities between systems of living entities and systems of concurrent processes may support biological experiments in silico. Process calculi offer a formal framework to describe biological systems, as well as to analyse their behaviour, both from a qualitative and a quantitative point of view. Two small examples show how this can be done. We mainly focus on the qualitative and quantitative aspects of the considered biological systems, and briefly illustrate which kinds of analysis are possible. We use a known stochastic calculus for the first example. We then present some statistics collected by repeatedly running the specification, which turn out to agree with those obtained by experiments in vivo. Our second example motivates a richer calculus. Its stochastic extension requires non-trivial machinery to faithfully reflect the real dynamic behaviour of biological systems.
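
    The statistics mentioned here come from repeatedly running the stochastic specification; the standard engine behind such stochastic calculi is Gillespie's simulation algorithm. Below is a minimal sketch in OCaml with a made-up pair of reactions (the rates and species are purely illustrative, not the paper's example).

        let () = Random.self_init ()

        type state = { a : int; b : int }

        (* Each reaction: a propensity (rate * current population) and its effect. *)
        let reactions =
          [ ((fun s -> 0.10 *. float_of_int s.a), fun s -> { a = s.a - 1; b = s.b + 1 });
            ((fun s -> 0.05 *. float_of_int s.b), fun s -> { a = s.a + 1; b = s.b - 1 }) ]

        (* One Gillespie step: exponential waiting time with rate equal to the total
           propensity, then a reaction chosen with probability proportional to its
           propensity. *)
        let step (t, s) =
          let props = List.map (fun (p, _) -> p s) reactions in
          let total = List.fold_left ( +. ) 0.0 props in
          if total <= 0.0 then None
          else begin
            let dt = -. log (1.0 -. Random.float 1.0) /. total in
            let r = Random.float total in
            let rec pick acc = function
              | [ (_, eff) ] -> eff
              | (p, eff) :: rest -> if r < acc +. p s then eff else pick (acc +. p s) rest
              | [] -> assert false
            in
            Some (t +. dt, (pick 0.0 reactions) s)
          end

        (* A single trajectory; repeated runs yield the statistics mentioned above. *)
        let rec simulate n st =
          if n = 0 then st
          else match step st with None -> st | Some st' -> simulate (n - 1) st'

        let () =
          let t, s = simulate 1000 (0.0, { a = 100; b = 0 }) in
          Printf.printf "t = %.3f  a = %d  b = %d\n" t s.a s.b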