A Semantic Hierarchy for Erasure Policies
We consider the problem of logical data erasure, contrasting with physical
erasure in the same way that end-to-end information flow control contrasts with
access control. We present a semantic hierarchy for erasure policies, using a
possibilistic knowledge-based semantics to define policy satisfaction such that
there is an intuitively clear upper bound on what information an erasure policy
permits to be retained. Our hierarchy allows a rich class of erasure policies
to be expressed, taking account of the power of the attacker, how much
information may be retained, and under what conditions it may be retained.
While our main aim is to specify erasure policies, the semantic framework
allows quite general information-flow policies to be formulated for a variety
of semantic notions of secrecy. Comment: 18 pages, ICISS 201
Matchmaking for covariant hierarchies
We describe a model of matchmaking suitable for the implementation of services, rather than for their discovery and composition. In the model, processing requirements are modelled by client requests and computational resources are software processors that compete for request processing as the covariant implementations of an open service interface. Matchmaking then relies on type analysis to rank processors against requests in support of a wide range of dispatch strategies. We relate the model to the autonomicity of service provision and briefly report on its deployment within a production-level infrastructure for scientific computing.
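The ranking step described above can be sketched in miniature. The snippet below is an illustration, not the paper's model: it assumes "type analysis" means keeping the processors whose accepted request type covers an incoming request and preferring the most specific (deepest) covariant refinement. All class names are invented for the example.

```python
# Hypothetical sketch of covariant matchmaking: processors declare the
# request type they accept; ranking keeps eligible processors and orders
# them most-specific-first, supporting a "best match" dispatch strategy.

class Request: pass
class TextRequest(Request): pass
class XmlRequest(TextRequest): pass

class Processor:
    accepts = Request          # most general: handles any request
class TextProcessor(Processor):
    accepts = TextRequest      # covariant refinement
class XmlProcessor(TextProcessor):
    accepts = XmlRequest       # most specific refinement

def rank(processors, request):
    """Keep processors whose accepted type covers the request, ordered by
    specificity (depth of the accepted type in the subtype hierarchy)."""
    eligible = [p for p in processors if isinstance(request, p.accepts)]
    return sorted(eligible, key=lambda p: len(p.accepts.__mro__), reverse=True)

procs = [Processor(), TextProcessor(), XmlProcessor()]
best = rank(procs, XmlRequest())[0]   # the XmlProcessor outranks the others
```

Other dispatch strategies (round-robin among eligible processors, broadcast to all matches) would reuse the same eligibility computation and vary only the selection step.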
Provably correct Java implementations of Spi Calculus security protocols specifications
Spi Calculus is an untyped high level modeling language for security protocols, used for formal protocol specification and verification. In this paper, a type system for the Spi Calculus and a translation function are formally defined, in order to formalize the refinement of a Spi Calculus specification into a Java implementation. The Java implementation generated by the translation function uses a custom Java library. Formal conditions on such library are stated, so that, if the library implementation code satisfies such conditions, then the generated Java implementation correctly simulates the Spi Calculus specification. A verified implementation of part of the custom library is further presented.
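The shape of such a translation function can be illustrated with a toy sketch. This is not the paper's actual translation: it assumes spi terms are represented as nested tuples and maps just three process forms onto calls into a hypothetical runtime library named "SpiLib"; both the term encoding and the target API are invented for illustration.

```python
# Toy compositional translation of a few Spi Calculus process forms into
# Java-like code against a hypothetical "SpiLib" runtime (all names assumed).

def translate(term):
    """Map a spi term (a nested tuple) to a string of Java-like statements."""
    kind = term[0]
    if kind == "output":          # c<M>.P : send message M on channel c, then P
        _, chan, msg, cont = term
        return f"SpiLib.send({chan}, {msg});\n" + translate(cont)
    if kind == "input":           # c(x).P : bind received message to x, then P
        _, chan, var, cont = term
        return f"Message {var} = SpiLib.receive({chan});\n" + translate(cont)
    if kind == "nil":             # 0 : the inactive process
        return "return;"
    raise ValueError(f"unhandled spi form: {kind}")

java = translate(("output", "cAB", "nonce", ("input", "cAB", "x", ("nil",))))
```

The paper's point is precisely that the correctness burden then shifts to the library: if the runtime satisfies the stated formal conditions, the generated code simulates the spi specification by construction.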
Understanding and Enforcing Opacity
This paper puts a spotlight on the specification and enforcement of opacity, a security policy for protecting sensitive properties of system behavior. We illustrate the fine granularity of the opacity policy by location privacy and privacy-preserving aggregation scenarios. We present a framework for opacity and explore its key differences and formal connections with such well-known information-flow models as noninterference, knowledge-based security, and declassification. Our results are machine-checked and parameterized in the observational power of the attacker, including progress-insensitive, progress-sensitive, and timing-sensitive attackers. We present two approaches to enforcing opacity: a whitebox monitor and a blackbox sampling-based enforcement. We report on experiments with prototypes that utilize state-of-the-art Satisfiability Modulo Theories (SMT) solvers and the random testing tool QuickCheck to establish opacity for the location and aggregation-based scenarios.
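The core possibilistic idea behind opacity can be sketched in a few lines. This is a simplification, not the paper's framework: it assumes a finite set of runs, a secret predicate over runs, and an observation function modeling the attacker's power; a secret is opaque when every secret run is observationally indistinguishable from some non-secret run. The location-privacy data below is invented for illustration.

```python
# Minimal possibilistic opacity check (a sketch, not the paper's framework):
# the secret is opaque iff no observation uniquely implies the secret holds.

def is_opaque(runs, secret, observe):
    """True iff every secret run shares its observation with a non-secret run."""
    cover = {observe(r) for r in runs if not secret(r)}
    return all(observe(r) in cover for r in runs if secret(r))

# Toy location-privacy scenario: a run is (true_location, reported_zone);
# the attacker observes only the coarse zone, and "user is at home" is secret.
runs = [("home", "zoneA"), ("office", "zoneA")]
secret = lambda r: r[0] == "home"
observe = lambda r: r[1]
```

Here the single observation "zoneA" is compatible with both a secret and a non-secret run, so opacity holds; adding a run like `("home", "zoneB")` would break it, since observing "zoneB" would reveal the secret. A sampling-based enforcement in the spirit of the paper's blackbox approach would test this property over randomly generated runs rather than an explicit set.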
Creating a Relational Distributed Object Store
In and of itself, data storage has apparent business utility. But when we can
convert data to information, the utility of stored data increases dramatically.
It is the layering of relation atop the data mass that is the engine for such
conversion. Frank relation amongst discrete objects sporadically ingested is
rare, making the process of synthesizing such relation all the more
challenging, but the challenge must be met if we are ever to see an equivalent
business value for unstructured data as we already have with structured data.
This paper describes a novel construct, referred to as a relational distributed
object store (RDOS), that seeks to solve the twin problems of how to
persistently and reliably store petabytes of unstructured data while
simultaneously creating and persisting relations amongst billions of objects. Comment: 12 pages, 5 figures
Goal driven theorem proving using conceptual graphs and Peirce logic
The thesis describes a rational reconstruction of Sowa's theory of Conceptual
Graphs. The reconstruction produces a theory with a firmer logical foundation than was
previously the case and which is suitable for computation whilst retaining the
expressiveness of the original theory. Also, several areas of incompleteness are
addressed. These mainly concern the scope of operations on conceptual graphs of
different types but include extensions for logics of higher orders than first order. An
important innovation is the placing of negation onto a sound representational basis.
A comparison of theorem proving techniques is made from which the principles of
theorem proving in Peirce logic are identified. As a result, a set of derived inference rules,
suitable for a goal driven approach to theorem proving, is developed from Peirce's beta
rules. These derived rules, the first of their kind for Peirce logic and conceptual graphs,
allow the development of a novel theorem proving approach which has some similarities
to a combined semantic tableau and resolution methodology. With this methodology it is
shown that a logically complete yet tractable system is possible. An important result is the
identification of domain independent heuristics which follow directly from the
methodology. In addition to the theorem prover, an efficient system for the detection of
selectional constraint violations is developed.
The proof techniques are used to build a working knowledge base system in Prolog
which can accept arbitrary statements represented by conceptual graphs and test their
semantic and logical consistency against a dynamic knowledge base. The same proof
techniques are used to find solutions to arbitrary queries. Since the system is logically
complete it can maintain the integrity of its knowledge base and answer queries in a fully
automated manner. Thus the system is completely declarative and requires no
programming whatsoever by the user, with the result that all interaction with the user is
conversational. Finally, the system is compared with other theorem proving systems
which are based upon Conceptual Graphs and conclusions about the effectiveness of the
methodology are drawn.
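The goal-driven strategy at the heart of the thesis can be illustrated in a far simpler setting than Peirce's beta rules. The sketch below is a plain backward-chaining search over Horn-style rules, invented for illustration: it starts from the goal and works back through rule bodies to known facts, which is the essence of a goal-driven approach to theorem proving.

```python
# Goal-driven (backward-chaining) proof search over Horn-style rules --
# a toy analogue of goal-driven theorem proving; facts and rules invented.

facts = {"graph(g1)", "concept(c1)"}
rules = [
    ("wellformed(g1)", ["graph(g1)", "concept(c1)"]),  # (head, body)
    ("provable(g1)", ["wellformed(g1)"]),
]

def prove(goal, depth=10):
    """Try to establish `goal`: either it is a known fact, or some rule
    concludes it and every subgoal in that rule's body is provable."""
    if depth == 0:
        return False              # crude bound to keep the search tractable
    if goal in facts:
        return True
    return any(head == goal and all(prove(g, depth - 1) for g in body)
               for head, body in rules)
```

The thesis's derived inference rules play the role of the rule table here, but over conceptual graphs rather than atoms, and with heuristics guiding which subgoals to attempt first.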
Machine Understandable Policies and GDPR Compliance Checking
The European General Data Protection Regulation (GDPR) calls for technical
and organizational measures to support its implementation. Towards this end,
the SPECIAL H2020 project aims to provide a set of tools that can be used by
data controllers and processors to automatically check if personal data
processing and sharing complies with the obligations set forth in the GDPR. The
primary contributions of the project include: (i) a policy language that can be
used to express consent, business policies, and regulatory obligations; and
(ii) two different approaches to automated compliance checking that can be used
to demonstrate that data processing performed by data controllers / processors
complies with the consent provided by data subjects, and that business processes
comply with the regulatory obligations set forth in the GDPR.
Big Data and Analytics in the Age of the GDPR
The new European General Data Protection Regulation places stringent restrictions on the processing of personally identifiable data. The GDPR does not only affect European companies, as the regulation applies to all the organizations that track or provide services to European citizens. Free exploratory data analysis is permitted only on anonymous data, at the cost of some legal risks. We argue that for the other kinds of personal data processing, the most flexible and safe legal basis is explicit consent. We illustrate the approach to consent management and compliance with the GDPR being developed by the European H2020 project SPECIAL, and highlight some related big data aspects.