Possibilistic Information Flow Control for Workflow Management Systems
In workflows and business processes, there are often security requirements on
both the data, i.e. confidentiality and integrity, and the process, e.g.
separation of duty. Graphical notations exist for specifying both workflows and
associated security requirements. We present an approach for formally verifying
that a workflow satisfies such security requirements. For this purpose, we
define the semantics of a workflow as a state-event system and formalise
security properties in a trace-based way, i.e. on an abstract level without
depending on details of enforcement mechanisms such as Role-Based Access
Control (RBAC). This formal model then allows us to build upon well-known
verification techniques for information flow control. We describe how a
compositional verification methodology for possibilistic information flow can
be adapted to verify that a specification of a distributed workflow management
system satisfies security requirements on both data and processes. Comment: In Proceedings GraMSec 2014, arXiv:1404.163
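The trace-based style of security property described above can be illustrated with a minimal sketch. The check below implements one classic possibilistic property (noninference): deleting all confidential events from any possible trace must yield another possible trace, so a low-level observer learns nothing about confidential activity. All names and the toy workflow are illustrative assumptions, not taken from the paper.

```python
# Possibilistic noninference over explicit trace sets (illustrative sketch).
HIGH, LOW = "high", "low"

def low_projection(trace, level_of):
    """Keep only the events a Low observer can see."""
    return tuple(e for e in trace if level_of(e) == LOW)

def satisfies_noninference(traces, level_of):
    """Every purely-Low view of a possible trace must itself be possible."""
    trace_set = set(traces)
    return all(low_projection(t, level_of) in trace_set for t in trace_set)

# Toy workflow: 'approve' is confidential; 'submit' and 'notify' are public.
level = lambda e: HIGH if e == "approve" else LOW

# Secure system: the Low view of every trace is also a possible trace.
secure = {("submit", "notify"), ("submit", "approve", "notify")}
# Leaky system: observing ("submit", "notify") reveals that no approval occurred,
# because that Low view is not a possible trace on its own.
leaky = {("submit", "approve", "notify")}

print(satisfies_noninference(secure, level))  # True
print(satisfies_noninference(leaky, level))   # False
```

Enumerating traces explicitly only works for tiny systems; the point of the compositional methodology in the paper is precisely to avoid such whole-system enumeration by verifying components separately.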
Formal Verification of Security Protocol Implementations: A Survey
Automated formal verification of security protocols has mostly focused on analyzing high-level abstract models which, however, differ significantly from real protocol implementations written in programming languages. Recently, some researchers have started investigating techniques that bring automated formal proofs closer to real implementations. This paper surveys these attempts, focusing on approaches that target the application code that implements protocol logic, rather than the libraries that implement cryptography. In these approaches, the libraries are assumed to correctly implement some model; the aim is to derive formal proofs that, under this assumption, give assurance about the application code that implements the protocol logic. The two main approaches, model extraction and code generation, are presented, along with the main techniques adopted for each approach.
A Verified Information-Flow Architecture
SAFE is a clean-slate design for a highly secure computer system, with
pervasive mechanisms for tracking and limiting information flows. At the lowest
level, the SAFE hardware supports fine-grained programmable tags, with
efficient and flexible propagation and combination of tags as instructions are
executed. The operating system virtualizes these generic facilities to present
an information-flow abstract machine that allows user programs to label
sensitive data with rich confidentiality policies. We present a formal,
machine-checked model of the key hardware and software mechanisms used to
dynamically control information flow in SAFE and an end-to-end proof of
noninterference for this model.
We use a refinement proof methodology to propagate the noninterference
property of the abstract machine down to the concrete machine level. We use an
intermediate layer in the refinement chain that factors out the details of the
information-flow control policy and devise a code generator for compiling such
information-flow policies into low-level monitor code. Finally, we verify the
correctness of this generator using a dedicated Hoare logic that abstracts from
low-level machine instructions into a reusable set of verified structured code
generators.
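The tag propagation and monitoring described above can be sketched in miniature. In the toy monitor below, every value carries a label, each operation joins the labels of its inputs, and output is blocked when a value's label exceeds the channel's clearance. This is a simplified illustration of dynamic information-flow tracking in the spirit of a tagged architecture, not SAFE's actual tag rules; all names are assumptions.

```python
# Toy dynamic IFC monitor: labeled values, join on operations, checked output.
from dataclasses import dataclass

LOW, HIGH = 0, 1  # two-point lattice: LOW flows to HIGH

@dataclass(frozen=True)
class Tagged:
    value: int
    label: int  # LOW or HIGH

def add(a: Tagged, b: Tagged) -> Tagged:
    # The result's label is the join (here, max) of the operand labels.
    return Tagged(a.value + b.value, max(a.label, b.label))

def output(chan_label: int, v: Tagged) -> int:
    # The monitor permits output only if the value's label flows to the channel.
    if v.label > chan_label:
        raise PermissionError("information-flow violation")
    return v.value

secret = Tagged(42, HIGH)
public = Tagged(1, LOW)

print(output(LOW, public))                # 1: public data on a public channel
print(output(HIGH, add(secret, public)))  # 43: tainted result, HIGH channel
# output(LOW, add(secret, public))        # would raise PermissionError
```

In SAFE these checks are performed by hardware and low-level monitor code rather than in the language runtime; the paper's contribution is proving, end to end, that such mechanisms enforce noninterference.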
Integrating Information Flow Analysis in Unifying Theories of Programming
This research is supported by the China National R&D Key Research Program (2019YFB1705703) and the Interdisciplinary Program of SJTU, Shanghai, China (No. YG2019ZDA07).
Composition and Declassification in Possibilistic Information Flow Security
Formal methods for security can rule out whole classes of security vulnerabilities, but applying them in practice remains challenging. This thesis develops formal verification techniques for information flow security that combine the expressivity and scalability strengths of existing frameworks. It builds upon Bounded Deducibility (BD) Security, which allows specifying and verifying fine-grained policies about what information may flow when to whom. Our main technical result is a compositionality theorem for BD Security, providing scalability by allowing us to verify security properties of a large system by verifying smaller components. Its practical utility is illustrated by a case study of verifying confidentiality properties of a distributed social media platform. Moreover, we discuss its use for the modular development of secure workflow systems, and for the security-preserving enforcement of safety and security properties other than information flow control.
ICT–supported reforms of service delivery in Flemish cities: testing the concept of 'information ecology'
This paper explores organizational reforms in Flemish cities aimed at making the cities' individual service delivery more efficient, customer-oriented, customer-friendly and integrated. It is the first paper of a recently started research project and PhD research on the complexity of managing ICT-supported change of 'individual' service delivery. The overall objective of this paper is to set the stage for the research project's research design in terms of its theoretical framework. To that end, we report our first explorative, inductive and descriptive findings on this type of change within one city. We first report inductively on the objectives and the objects of change. Second, we develop a provisional theoretical framework, taking the notion of an information ecology as a conceptual starting point and combining elements of neo-institutional theory, systems theory and a political perspective on organizational development. To explore the potential of this approach, we test the framework's value for understanding the changes within the city. The framework enabled us to describe and analyze this type of reform without neglecting the complexity of these changes, and it links some important public administration theories to the study of the e-government phenomenon, which remains an important challenge. The most important lesson is that further refinement of the conceptual framework is needed: although the analysis shows that the framework offers a conceptual basis for analyzing front- and back-office reforms within public organizations, it still lacks a full and straightforward operationalization of its components, constructs and relations.
Qualitative and Qualitative Longitudinal Resources in Europe
In April 2009 the UK Timescapes Initiative, in collaboration
with the University of Bremen, organised a residential
workshop to explore the nature of qualitative (Q) and
qualitative longitudinal (QL) research and resources across
Europe. The workshop was hosted by the Archive for Life
Course Research (Archiv für Lebenslaufforschung, ALLF)
at Bremen and funded by Timescapes with support from
CESSDA (The Council of European Social Science Data
Archives, Preparatory Phase Project). It was attended by
archivists and researchers from 14 countries, including
‘transitional’ states such as Belarus and Lithuania. The broad aim of the workshop was to map existing infrastructures for qualitative and QL data archiving among the participating countries, including the extent of archiving and the ethos of data sharing and re-use in different national contexts. The group also explored strategies to develop infrastructure and to support qualitative and QL research and resources, including
collaborative research across Europe and beyond.