
    Opacity with Orwellian Observers and Intransitive Non-interference

    Opacity is a general behavioural security scheme flexible enough to account for several specific properties. A secret set of behaviours of a system is opaque if a passive attacker can never tell whether an observed behaviour is secret or not. Instead of considering static observability, where the set of observable events is fixed offline, or dynamic observability, where the set of observable events changes over time depending on the history of the trace, we consider Orwellian partial observability, where unobservable events are not revealed unless a downgrading event occurs later in the trace. We show how to verify that a regular secret is opaque for a regular language L w.r.t. an Orwellian projection, whereas the problem has been proved undecidable for a regular language L w.r.t. a general Orwellian observation function. Finally, we illustrate the relevance of our results by proving the equivalence between opacity of regular secrets w.r.t. Orwellian projections and the intransitive non-interference property.
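
    As a point of reference, one standard formulation of opacity reads as follows (the notation here is chosen for illustration and is not taken from the paper); the Orwellian setting affects only the observation function \pi:

    ```latex
    % Opacity of a secret S w.r.t. a system language L and an observation
    % function \pi (one standard formulation; notation is illustrative).
    % Every secret run must be observationally indistinguishable from some
    % non-secret run, so the attacker can never be sure a secret occurred.
    \[
      S \text{ is opaque w.r.t. } (L, \pi)
      \iff
      \forall w \in S \cap L,\ \exists w' \in L \setminus S :\ \pi(w) = \pi(w')
      \iff
      \pi(S \cap L) \subseteq \pi(L \setminus S).
    \]
    ```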

    Verification of Information Flow Properties under Rational Observation

    Information flow properties express the capability for an agent to infer information about secret behaviours of a partially observable system. In a language-theoretic setting, where the system behaviour is described by a language, we define the class of rational information flow properties (RIFP), where observers are modeled by finite transducers acting on languages in a given family \mathcal{L}. This leads to a general decidability criterion for the verification problem of RIFPs on \mathcal{L}, implying PSPACE-completeness for this problem on regular languages. We show that most trace-based information flow properties studied up to now are RIFPs, including those related to selective declassification and conditional anonymity. As a consequence, we retrieve several existing decidability results that were obtained by ad hoc proofs. Comment: 19 pages, 7 figures, version extended from AVOCS'201
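
    To make the observer model concrete, the following is a minimal sketch of a finite transducer acting on traces; the states, event alphabet and transition table are illustrative assumptions, not taken from the paper:

    ```python
    # Minimal sketch of an observer modelled as a finite-state transducer.
    # States, alphabet and transitions are illustrative assumptions.
    from typing import Dict, List, Tuple

    # transitions: (state, input_event) -> (next_state, output_events)
    Transducer = Dict[Tuple[str, str], Tuple[str, List[str]]]

    OBSERVER: Transducer = {
        # In state "q0" internal events are erased, public ones are copied.
        ("q0", "internal"): ("q0", []),
        ("q0", "public"):   ("q0", ["public"]),
        # A declassification event switches the observer's mode...
        ("q0", "declass"):  ("q1", ["declass"]),
        # ...after which previously hidden events become visible.
        ("q1", "internal"): ("q1", ["internal"]),
        ("q1", "public"):   ("q1", ["public"]),
        ("q1", "declass"):  ("q1", ["declass"]),
    }

    def observe(trace: List[str], transducer: Transducer, start: str = "q0") -> List[str]:
        """Run the transducer over a trace and return what the observer sees."""
        state, output = start, []
        for event in trace:
            state, emitted = transducer[(state, event)]
            output.extend(emitted)
        return output

    if __name__ == "__main__":
        print(observe(["internal", "public", "declass", "internal"], OBSERVER))
        # -> ['public', 'declass', 'internal']
    ```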

    Possibilistic Information Flow Control for Workflow Management Systems

    In workflows and business processes, there are often security requirements on both the data, i.e. confidentiality and integrity, and the process, e.g. separation of duty. Graphical notations exist for specifying both workflows and associated security requirements. We present an approach for formally verifying that a workflow satisfies such security requirements. For this purpose, we define the semantics of a workflow as a state-event system and formalise security properties in a trace-based way, i.e. on an abstract level without depending on details of enforcement mechanisms such as Role-Based Access Control (RBAC). This formal model then allows us to build upon well-known verification techniques for information flow control. We describe how a compositional verification methodology for possibilistic information flow can be adapted to verify that a specification of a distributed workflow management system satisfies security requirements on both data and processes. Comment: In Proceedings GraMSec 2014, arXiv:1404.163
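
    As an illustration of the kind of trace-based requirement involved, a separation-of-duty constraint can be checked over a workflow trace roughly as follows; the event format and task names are assumptions made for this sketch, not the paper's formal state-event model:

    ```python
    # Illustrative separation-of-duty check over a workflow trace.
    # A trace is a list of (task, agent) events; the constraint forbids the
    # same agent from performing two conflicting tasks in one case.
    from typing import Dict, List, Set, Tuple

    def violates_sod(trace: List[Tuple[str, str]],
                     conflicting_tasks: Set[str]) -> bool:
        """Return True if one agent performed more than one conflicting task."""
        agents_seen: Dict[str, Set[str]] = {}
        for task, agent in trace:
            if task in conflicting_tasks:
                agents_seen.setdefault(agent, set()).add(task)
                if len(agents_seen[agent]) > 1:
                    return True
        return False

    if __name__ == "__main__":
        trace = [("prepare_payment", "alice"), ("approve_payment", "bob")]
        print(violates_sod(trace, {"prepare_payment", "approve_payment"}))  # False
    ```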

    The Anatomy and Facets of Dynamic Policies

    Information flow policies are often dynamic; the security concerns of a program will typically change during execution to reflect security-relevant events. A key challenge is how to best specify, and give proper meaning to, such dynamic policies. A large number of approaches exist that tackle that challenge, each yielding some important, but unconnected, insight. In this work we synthesise existing knowledge on dynamic policies, with an aim to establish a common terminology, best practices, and frameworks for reasoning about them. We introduce the concept of facets to illuminate subtleties in the semantics of policies, and closely examine the anatomy of policies and the expressiveness of policy specification mechanisms. We further explore the relation between dynamic policies and the concept of declassification. Comment: Technical Report of publication under the same name in Computer Security Foundations (CSF) 201
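
    To fix intuitions about what makes a policy dynamic, the following minimal sketch represents a policy as a set of allowed flows that is updated when security-relevant events occur; the level names, events and encoding are purely illustrative and not taken from the paper:

    ```python
    # Illustrative dynamic information-flow policy: the set of allowed flows
    # between security levels changes when security-relevant events occur.
    # Level names and events are assumptions made for this sketch.
    from typing import Set, Tuple

    Flow = Tuple[str, str]  # (source_level, sink_level)

    class DynamicPolicy:
        def __init__(self) -> None:
            # Initially, high data must never reach low sinks.
            self.allowed: Set[Flow] = {("low", "low"), ("low", "high"), ("high", "high")}

        def on_event(self, event: str) -> None:
            """Update the policy when a security-relevant event happens."""
            if event == "declassify_report":
                self.allowed.add(("high", "low"))      # temporarily permit release
            elif event == "end_of_release_window":
                self.allowed.discard(("high", "low"))  # revoke the permission again

        def permits(self, flow: Flow) -> bool:
            return flow in self.allowed

    if __name__ == "__main__":
        policy = DynamicPolicy()
        print(policy.permits(("high", "low")))   # False
        policy.on_event("declassify_report")
        print(policy.permits(("high", "low")))   # True
    ```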

    First-mover disadvantage: The sovereign ratings mousetrap. CEPS Working Document No 2020/02, February 2020

    Using 102 sovereigns rated by the three largest credit rating agencies (CRAs), S&P, Moody’s and Fitch, between January 2000 and January 2019, we are the first to document that the first-mover CRA in downgrades (S&P) falls into a commercial trap. Namely, each first-mover downgrade by one notch by S&P results in a 2.4% increase in the probability of the rating contract being cancelled by the sovereign client, and a 1.2% decrease in the ratio of S&P’s sovereign rating coverage relative to Moody’s. The more first-mover downgrades S&P makes, the more its sovereign rating coverage declines relative to Moody’s. This paper interrelates three themes of the literature: herding behaviour amongst CRAs, issues of conflict of interest, and ratings quality.

    Synthesizing diverse evidence: the use of primary qualitative data analysis methods and logic models in public health reviews

    Objectives: The nature of public health evidence presents challenges for conventional systematic review processes, with increasing recognition of the need to include a broader range of work, including observational studies and qualitative research, yet with methods to combine diverse sources remaining underdeveloped. The objective of this paper is to report the application of a new approach for the review of evidence in the public health sphere. The method enables a diverse range of evidence types to be synthesized in order to examine potential relationships between a public health environment and outcomes.
    Study design: The study drew on previous work by the National Institute for Health and Clinical Excellence on conceptual frameworks. It applied and further extended this work to the synthesis of evidence relating to one particular public health area: the enhancement of employee mental well-being in the workplace.
    Methods: The approach utilized thematic analysis techniques from primary research, together with conceptual modelling, to explore potential relationships between factors and outcomes.
    Results: The method enabled a logic framework to be built from a diverse document set that illustrates how elements, and associations between elements, may impact on the well-being of employees.
    Conclusions: Whilst recognizing potential criticisms of the approach, it is suggested that logic models can be a useful way of examining the complexity of relationships between factors and outcomes in public health, and of highlighting potential areas for interventions and further research. The use of techniques from primary qualitative research may also be helpful in synthesizing diverse document types.

    Delimited Persistent Stochastic Non-Interference

    Non-Interference is an information flow security property which aims to protect confidential data by ensuring the complete absence of any information flow from high-level entities to low-level ones. However, this requirement is too demanding when dealing with real applications: indeed, no real policy ever guarantees a total absence of information flow. In practice, it is often necessary to allow mechanisms for downgrading or declassifying information, such as information filters and channel control. In this paper we generalize the notion of Persistent Stochastic Non-Interference (PSNI) in order to allow information to flow from a higher to a lower security level through a downgrader. We introduce the notion of Delimited Persistent Stochastic Non-Interference (D_PSNI) and provide two characterizations of it, one expressed in terms of bisimulation-like equivalence checks and another formulated through unwinding conditions. We then prove some compositionality properties. Finally, we present a decision algorithm and discuss its complexity.
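
    For context, the classical possibilistic formulation of non-interference from the process-algebraic (SPA) literature can be stated as below; the paper's stochastic, persistent and delimited variants refine this scheme with probabilistic bisimulation and explicit downgrading actions:

    ```latex
    % Classical possibilistic non-interference (BNDC-style, SPA notation),
    % given here only as background. \mathcal{E}_H is the set of high-level
    % processes, H the set of high actions, and \approx a weak behavioural
    % equivalence.
    \[
      E \in \mathrm{BNDC}
      \iff
      \forall \Pi \in \mathcal{E}_H :\;
      E \setminus H \;\approx\; (E \mid \Pi) \setminus H
    \]
    % i.e. the low-level view of E alone is equivalent to the low-level view
    % of E composed with any high-level process \Pi.
    ```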

    Quality of Service optimisation framework for Next Generation Networks

    Within recent years, the concept of Next Generation Networks (NGN) has become widely accepted within the telecommunication area, in parallel with the migration of telecommunication networks from traditional circuit-switched technologies such as ISDN (Integrated Services Digital Network) towards packet-switched NGN. In this context, SIP (Session Initiation Protocol), originally developed for Internet use only, has emerged as the major signalling protocol for multimedia sessions in IP (Internet Protocol) based NGN. One of the traditional limitations of IP when faced with the challenges of real-time communications is the lack of quality support at the network layer. In line with NGN specification work, international standardisation bodies have defined a sophisticated QoS (Quality of Service) architecture for NGN, controlling IP transport resources and conventional IP QoS mechanisms through centralised higher-layer network elements via cross-layer signalling. Being able to centrally control QoS conditions for any media session in NGN without the imperative of a cross-layer approach would result in a feasible and less complex NGN architecture. In particular, the demand for additional network elements would decrease, reducing system and operational costs in both service and transport infrastructure.
    This thesis proposes a novel framework for QoS optimisation for media sessions in SIP-based NGN without the need for cross-layer signalling. One key contribution of the framework is the approach of identifying and logically grouping media sessions that encounter similar QoS conditions, which is performed by applying pattern recognition and clustering techniques. Based on this methodology, the framework provides functions and mechanisms for comprehensive resource-saving QoS estimation, adaptation of QoS conditions, and support of Call Admission Control. The framework can be integrated with any SIP-IP-based real-time communication infrastructure, since it does not require access to any particular QoS control or monitoring functionality provided within the IP transport network.
    The proposed framework concept has been deployed and validated in a prototypical simulation environment. Simulation results show MOS (Mean Opinion Score) improvement rates between 53 and 66 percent without any active control of transport network resources. Overall, the proposed framework is an effective concept for centrally controlled QoS optimisation in NGN without the need for cross-layer signalling. As such, whether run stand-alone or combined with conventional QoS control mechanisms, the framework provides a comprehensive basis for both reducing the complexity and mitigating the issues associated with QoS provision in NGN.
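
    As a minimal sketch of the grouping step described above, sessions can be clustered on per-session QoS measurements; the feature choice, cluster count and use of scikit-learn are assumptions for this illustration, not the framework's actual algorithm:

    ```python
    # Minimal sketch: group media sessions that experience similar QoS
    # conditions by clustering per-session measurements. Feature choice,
    # cluster count and the use of scikit-learn are illustrative assumptions.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # One row per session: [mean delay (ms), jitter (ms), packet loss (%)]
    sessions = np.array([
        [ 40.0,  3.0, 0.1],
        [ 45.0,  4.0, 0.2],
        [180.0, 25.0, 2.5],
        [175.0, 30.0, 3.0],
        [ 90.0, 10.0, 0.8],
    ])

    # Normalise features so no single metric dominates the distance measure.
    features = StandardScaler().fit_transform(sessions)

    # Group sessions into a small number of QoS classes.
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
    print(labels)  # sessions with similar QoS conditions share a label
    ```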