64 research outputs found

    Solving the Workflow Satisfiability Problem using General Purpose Solvers

    The workflow satisfiability problem (WSP) is a well-studied problem in access control seeking allocation of authorised users to every step of the workflow, subject to workflow specification constraints. It was noticed that the number k of steps is typically small compared to the number of users in real-world instances of WSP; therefore k is considered as the parameter in WSP parametrised complexity research. While WSP in general was shown to be W[1]-hard, WSP restricted to a special case of user-independent (UI) constraints is fixed-parameter tractable (FPT). However, restriction to the UI constraints might be impractical. To efficiently handle non-UI constraints, we introduce the notion of branching factor of a constraint. As long as the branching factors of the constraints are relatively small and the number of non-UI constraints is reasonable, WSP can be solved in FPT time. Extending the results from Karapetyan et al. (2019), we demonstrate that general-purpose solvers are capable of achieving FPT-like performance on WSP with arbitrary constraints when used with appropriate formulations. This enables one to tackle most practical WSP instances. While important on its own, we hope that this result will also motivate researchers to look for FPT-aware formulations of other FPT problems. Comment: Associated data: http://doi.org/10.17639/nott.711
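
    The abstract describes casting WSP as a problem for general-purpose solvers. As a minimal, hedged illustration (not the paper's formulation or its FPT-aware encoding), the Python sketch below builds a tiny hypothetical WSP instance, with authorisation lists and one separation-of-duty constraint, and searches for a satisfying step-to-user allocation by brute force; all names and the instance data are assumptions.

```python
from itertools import product

# Hypothetical tiny WSP instance: steps, users, authorizations and one
# separation-of-duty (SoD) constraint. This is an illustrative brute-force
# solver, not the FPT algorithm or the solver formulation from the paper.
STEPS = ["s1", "s2", "s3"]
USERS = ["alice", "bob", "carol"]
AUTH = {  # users authorized for each step
    "s1": {"alice", "bob"},
    "s2": {"bob", "carol"},
    "s3": {"alice", "carol"},
}
SOD = [("s1", "s2")]  # pairs of steps that must be performed by different users


def satisfies(plan: dict) -> bool:
    """Check authorizations and SoD constraints for a complete plan."""
    if any(plan[s] not in AUTH[s] for s in STEPS):
        return False
    return all(plan[a] != plan[b] for a, b in SOD)


def solve_wsp():
    """Enumerate all step-to-user assignments and return a satisfying one, if any."""
    for assignment in product(USERS, repeat=len(STEPS)):
        plan = dict(zip(STEPS, assignment))
        if satisfies(plan):
            return plan
    return None


if __name__ == "__main__":
    print(solve_wsp())  # prints a satisfying plan, or None if the instance is unsatisfiable
```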

    Constraint Branching in Workflow Satisfiability Problem


    Tools and techniques for analysing the impact of information security

    PhD Thesis. The discipline of information security is employed by organisations to protect the confidentiality, integrity and availability of information, often communicated in the form of information security policies. A policy expresses rules, constraints and procedures to guard against adversarial threats and reduce risk by instigating desired and secure behaviour of those people interacting with information legitimately. To keep aligned with a dynamic threat landscape, evolving business requirements, regulation updates, and new technologies, a policy must undergo periodic review and change. Chief Information Security Officers (CISOs) are the main decision makers on information security policies within an organisation. Making informed policy modifications involves analysing and therefore predicting the impact of those changes on the success rate of business processes often expressed as workflows. Security brings an added burden to completing a workflow. Adding a new security constraint may reduce success rate or even eliminate it if a workflow is always forced to terminate early. This can increase the chances of employees bypassing or violating a security policy. Removing an existing security constraint may increase success rate but may also increase the risk to security. A lack of suitably aimed impact analysis tools and methodologies for CISOs means impact analysis is currently a somewhat manual and ambiguous procedure. Analysis can be overwhelming, time consuming, error prone, and yield unclear results, especially when workflows are complex, have a large workforce, and diverse security requirements. This thesis considers the provision of tools and more formal techniques specific to CISOs to help them analyse the impact modifying a security policy has on the success rate of a workflow. More precisely, these tools and techniques have been designed to efficiently compare the impact between two versions of a security policy applied to the same workflow, one before, the other after a policy modification. This work focuses on two specific types of security impact analysis. The first is quantitative in nature, providing a measure of success rate for a security constrained workflow which must be executed by employees who may be absent at runtime. This work considers quantifying workflow resiliency which indicates a workflow's expected success rate assuming the availability of employees to be probabilistic. New aspects of quantitative resiliency are introduced in the form of workflow metrics, and risk management techniques to manage workflows that must work with a resiliency below acceptable levels. Defining these risk management techniques has led to exploring the reduction of resiliency computation time and analysing resiliency in workflows with choice. The second area of focus is more qualitative, in terms of facilitating analysis of how people are likely to behave in response to security and how that behaviour can impact the success rate of a workflow at a task level. Large amounts of information from disparate sources exist on human behavioural factors in a security setting, which can be aligned with security standards and structured within a single ontology to form a knowledge base. Consultations with two CISOs have been conducted, whose responses have driven the implementation of two new tools, one graphical, the other Web-oriented, allowing CISOs and human factors experts to record and incorporate their knowledge directly within an ontology. The ontology can be used by CISOs to assess the potential impact of changes made to a security policy and help devise behavioural controls to manage that impact. The two consulted CISOs have also carried out an evaluation of the Web-oriented tool.
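
    The thesis measures workflow resiliency as an expected success rate under probabilistic employee availability. The sketch below is only an illustrative brute-force computation under assumed data (the users, probabilities, constraints and the enumeration itself are hypothetical, not the thesis's metrics): it enumerates availability scenarios, checks whether a satisfying assignment still exists, and sums the probabilities of the scenarios that remain completable.

```python
from itertools import product

# Hypothetical instance: per-user availability probabilities and authorizations.
USERS = {"alice": 0.9, "bob": 0.8, "carol": 0.7}   # P(user is present at runtime)
STEPS = ["s1", "s2"]
AUTH = {"s1": {"alice", "bob"}, "s2": {"bob", "carol"}}
SOD = [("s1", "s2")]  # steps that need distinct users


def completable(present: set) -> bool:
    """Can every step be assigned a present, authorized user satisfying SoD?"""
    for assignment in product(present, repeat=len(STEPS)):
        plan = dict(zip(STEPS, assignment))
        if all(plan[s] in AUTH[s] for s in STEPS) and \
           all(plan[a] != plan[b] for a, b in SOD):
            return True
    return False


def resiliency() -> float:
    """Expected success rate over all user-availability scenarios."""
    users = list(USERS)
    total = 0.0
    for outcome in product([True, False], repeat=len(users)):
        prob = 1.0
        present = set()
        for user, avail in zip(users, outcome):
            prob *= USERS[user] if avail else (1.0 - USERS[user])
            if avail:
                present.add(user)
        if completable(present):
            total += prob
    return total


if __name__ == "__main__":
    print(f"static resiliency = {resiliency():.3f}")
```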

    Obstructions in Security-Aware Business Processes

    This Open Access book explores the dilemma-like stalemate between security and regulatory compliance in business processes on the one hand and business continuity and governance on the other. The growing number of regulations, e.g., on information security, data protection, or privacy, implemented in increasingly digitized businesses can have an obstructive effect on the automated execution of business processes. Such security-related obstructions can particularly occur when an access control-based implementation of regulations blocks the execution of business processes. By handling obstructions, security in business processes is supposed to be improved. For this, the book presents a framework that allows the comprehensive analysis, detection, and handling of obstructions in a security-sensitive way. Thereby, methods based on common organizational security policies, process models, and logs are proposed. The Petri net-based modeling and related semantic and language-based research, as well as the analysis of event data and machine learning methods finally lead to the development of algorithms and experiments that can detect and resolve obstructions and are reproducible with the provided software
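
    The book frames obstructions as situations where access control blocks the execution of an otherwise executable process step. The sketch below is a loose illustration under invented data, not the book's Petri net-based framework: it marks a transition as obstructed when the current marking enables it but the (hypothetical) policy authorises nobody to fire it.

```python
# Illustrative only: a tiny place/transition net whose transitions are guarded
# by an access-control policy. A transition is "obstructed" if the marking
# enables it but no user is authorized to fire it.
NET = {
    # transition: (input places, output places)
    "review":  ({"p_start"}, {"p_reviewed"}),
    "approve": ({"p_reviewed"}, {"p_done"}),
}
POLICY = {"review": {"alice"}, "approve": set()}  # nobody may approve
MARKING = {"p_start": 1, "p_reviewed": 0, "p_done": 0}


def enabled(t: str, marking: dict) -> bool:
    inputs, _ = NET[t]
    return all(marking.get(p, 0) > 0 for p in inputs)


def obstructed(t: str, marking: dict) -> bool:
    """Enabled by the control flow but blocked by the security policy."""
    return enabled(t, marking) and not POLICY.get(t)


def fire(t: str, marking: dict) -> dict:
    inputs, outputs = NET[t]
    new = dict(marking)
    for p in inputs:
        new[p] -= 1
    for p in outputs:
        new[p] = new.get(p, 0) + 1
    return new


if __name__ == "__main__":
    m = fire("review", MARKING) if not obstructed("review", MARKING) else MARKING
    for t in NET:
        if obstructed(t, m):
            print(f"obstruction: '{t}' is enabled but no user is authorized to perform it")
```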

    Temporal and Resource Controllability of Workflows Under Uncertainty

    Workflow technology has long been employed for the modeling, validation and execution of business processes. A workflow is a formal description of a business process in which single atomic work units (tasks), organized in a partial order, are assigned to processing entities (agents) in order to achieve some business goal(s). Workflows can also employ workflow paths (projections with respect to a total truth value assignment to the Boolean variables associated with the conditional split connectors) in order (not) to execute a subset of tasks. A workflow management system coordinates the execution of tasks that are part of workflow instances such that all relevant constraints are eventually satisfied. Temporal workflows specify business processes subject to temporal constraints such as controllable or uncontrollable durations, delays and deadlines. The choice of a workflow path may be controllable or not, considered either in isolation or in combination with uncontrollable durations. Access controlled workflows specify workflows in which users are authorized for task executions and authorization constraints say which users remain authorized to execute which tasks depending on who did what. Access controlled workflows may also consider workflow paths, in addition to the uncertain availability of resources (users, throughout this thesis). When either a task duration, the choice of the workflow path to take, or the availability of a user is out of control, we need to verify that the workflow can be executed by verifying all constraints for any possible combination of behaviors arising from the uncontrollable parts. Indeed, users might be absent before starting the execution (static resiliency), they can also become so during execution (decremental resiliency) or they can come and go throughout the execution (dynamic resiliency). Temporal access controlled workflows merge the two previous formalisms by considering several kinds of uncontrollable parts simultaneously. Authorization constraints may be extended to support conditional and temporal features. A few years ago some proposals addressed the temporal controllability of workflows by encoding them into temporal networks to exploit "off-the-shelf" controllability checking algorithms available for them. However, those proposals fail to address temporal controllability where the controllable and uncontrollable choices of workflow paths may mutually influence one another. Furthermore, to the best of my knowledge, controllability of access controlled workflows subject to uncontrollable workflow paths and algorithms to validate and execute dynamically resilient workflows remain unexplored. To overcome these limitations, this thesis develops exact algorithms addressing the temporal and resource controllability of workflows under uncertainty. I provide several new classes of (temporal) constraint networks and corresponding algorithms to check their controllability. After that, I encode workflows into these new formalisms. I also provide an encoding into instantaneous timed games to model static, decremental and dynamic resiliency and synthesize memoryless execution strategies. I developed a few tools with which I carried out some initial experimental evaluations
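
    The thesis encodes workflows into (temporal) constraint networks and checks their controllability. As a far simpler, hedged illustration of the underlying machinery (this checks only the consistency of a Simple Temporal Network with no uncontrollable parts, not controllability, and the network below is invented), the sketch runs Floyd-Warshall over the distance graph and reports a negative cycle as inconsistency.

```python
import math

# Illustrative only: consistency of a Simple Temporal Network (STN), the
# uncontrollable-free special case of the networks used for controllability
# checking. Each edge encodes a constraint t_j - t_i <= w as i --w--> j.
NODES = ["start", "taskA_end", "taskB_end"]
EDGES = [
    ("start", "taskA_end", 10),   # task A must finish within 10 time units
    ("taskA_end", "start", -2),   # ... and takes at least 2
    ("taskA_end", "taskB_end", 5),
    ("taskB_end", "start", -20),  # deadline that cannot be met: inconsistent
]


def stn_consistent(nodes, edges) -> bool:
    """Floyd-Warshall on the distance graph; a negative self-distance
    signals a negative cycle, i.e. the constraints are unsatisfiable."""
    dist = {u: {v: (0 if u == v else math.inf) for v in nodes} for u in nodes}
    for u, v, w in edges:
        dist[u][v] = min(dist[u][v], w)
    for k in nodes:
        for i in nodes:
            for j in nodes:
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return all(dist[v][v] >= 0 for v in nodes)


if __name__ == "__main__":
    print("consistent:", stn_consistent(NODES, EDGES))  # expected: False
```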

    On the Workflow Satisfiability Problem with Class-Independent Constraints for Hierarchical Organizations

    A workflow specification defines a set of steps, a set of users, and an access control policy. The policy determines which steps a user is authorized to perform and imposes constraints on which sets of users can perform which sets of steps. The workflow satisfiability problem (WSP) is the problem of determining whether there exists an assignment of users to workflow steps that satisfies the policy. Given the computational hardness of WSP and its importance in the context of workflow management systems, it is important to develop algorithms that are as efficient as possible to solve WSP. In this article, we study the fixed-parameter tractability of WSP in the presence of class-independent constraints, which enable us to (1) model security requirements based on the groups to which users belong and (2) generalize the notion of a user-independent constraint. Class-independent constraints are defined in terms of equivalence relations over the set of users. We consider sets of nested equivalence relations because this enables us to model security requirements in hierarchical organizations. We prove that WSP is fixed-parameter tractable (FPT) for class-independent constraints defined over nested equivalence relations and develop an FPT algorithm to solve WSP instances incorporating such constraints. We perform experiments to evaluate the performance of our algorithm and compare it with that of SAT4J, an off-the-shelf pseudo-Boolean SAT solver. The results of these experiments demonstrate that our algorithm significantly outperforms SAT4J for many instances of WSP
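
    Class-independent constraints are expressed over equivalence relations on the user set, such as membership of departments and teams in a hierarchy. The sketch below is only an illustration under assumed data (the groups, users and constraints are hypothetical, and it is a plan checker, not the article's FPT algorithm): it evaluates two constraints whose satisfaction depends only on the users' equivalence classes.

```python
# Illustrative check of class-independent constraints: whether steps are
# performed by users in the same (or different) equivalence class. Nested
# equivalence relations model hierarchy: team refines department.
DEPARTMENT = {"alice": "finance", "bob": "finance", "carol": "hr"}
TEAM = {"alice": "fin-a", "bob": "fin-b", "carol": "hr-a"}


def same_class(relation: dict, u: str, v: str) -> bool:
    """u ~ v under the equivalence relation given as a user -> class map."""
    return relation[u] == relation[v]


def check_plan(plan: dict) -> bool:
    """Constraints: s1 and s2 in the same department; s1 and s3 in different teams."""
    return (same_class(DEPARTMENT, plan["s1"], plan["s2"])
            and not same_class(TEAM, plan["s1"], plan["s3"]))


if __name__ == "__main__":
    print(check_plan({"s1": "alice", "s2": "bob", "s3": "carol"}))   # True
    print(check_plan({"s1": "alice", "s2": "carol", "s3": "bob"}))   # False
```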

    Pattern-Based Approach to the Workflow Satisfiability Problem with User-Independent Constraints

    The fixed parameter tractable (FPT) approach is a powerful tool in tackling computationally hard problems.  In this paper, we link FPT results to classic artificial intelligence (AI) techniques to show how they complement each other.  Specifically, we consider the workflow satisfiability problem (WSP) which asks whether there exists an assignment of authorised users to the steps in a workflow specification, subject to certain constraints on the assignment.  It was shown by Cohen et al. (JAIR 2014) that WSP restricted to the class of user-independent constraints (UI), covering many practical cases, admits FPT algorithms, i.e. can be solved in time exponential only in the number of steps k and polynomial in the number of users n.  Since usually k << n in WSP, such FPT algorithms are of great practical interest. We present a new interpretation of the FPT nature of the WSP with UI constraints giving a decomposition of the problem into two levels.  Exploiting this two-level split, we develop a new FPT algorithm that is by many orders of magnitude faster than the previous state-of-the-art WSP algorithm and also has only polynomial-space complexity.  We also introduce new pseudo-Boolean (PB) and Constraint Satisfaction (CSP) formulations of the WSP with UI constraints which efficiently exploit this new decomposition of the problem and raise the novel issue of how to use general-purpose solvers to tackle FPT problems in a fashion that meets FPT efficiency expectations.  In our computational study, we investigate, for the first time, the phase transition (PT) properties of the WSP, under a model for generation of random instances.  We show how PT studies can be extended, in a novel fashion, to support empirical evaluation of scaling of FPT algorithms
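
    For user-independent constraints, whether a plan is valid depends only on which steps share a user, i.e. on the partition ("pattern") of the steps, which is what makes the two-level decomposition possible. The sketch below is a heavily simplified, hedged illustration of that idea rather than the paper's algorithm (instance data and names are made up): it enumerates step partitions, filters them against a separation-of-duty constraint at the pattern level, and then tries to realise each surviving pattern with distinct authorised users by backtracking.

```python
# Illustrative two-level search for WSP with user-independent constraints:
# level 1 enumerates patterns (partitions of steps), level 2 tries to realise
# a valid pattern with authorised users. Instance data is hypothetical.
STEPS = ["s1", "s2", "s3"]
USERS = ["alice", "bob"]
AUTH = {"s1": {"alice", "bob"}, "s2": {"bob"}, "s3": {"alice"}}
SOD = [("s1", "s2")]  # user-independent: these steps need different users


def partitions(items):
    """Yield all set partitions of `items` as lists of blocks."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for smaller in partitions(rest):
        for i, block in enumerate(smaller):
            yield smaller[:i] + [block + [first]] + smaller[i + 1:]
        yield smaller + [[first]]


def pattern_ok(blocks) -> bool:
    """UI constraints are decided by the pattern alone: SoD steps lie in distinct blocks."""
    return all(not any(a in b and c in b for b in blocks) for a, c in SOD)


def realize(blocks, used=frozenset()):
    """Assign a distinct authorised user to each block (backtracking); None if impossible."""
    if not blocks:
        return {}
    block, rest = blocks[0], blocks[1:]
    for u in USERS:
        if u not in used and all(u in AUTH[s] for s in block):
            tail = realize(rest, used | {u})
            if tail is not None:
                return {s: u for s in block} | tail
    return None


if __name__ == "__main__":
    for blocks in partitions(STEPS):
        if pattern_ok(blocks):
            plan = realize(blocks)
            if plan is not None:
                print("pattern", blocks, "->", plan)
                break
```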

    Dependability Assessment of Wireless Sensor Networks with Formal Methods

    Wireless Sensor Networks (WSNs) are increasingly being adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events, such as node crash and packet loss, may undermine the dependability of the WSN. Hence their effects need to be properly assessed from the early stages of the development process onwards to minimize the chances of unexpected problems during use. It is also necessary to monitor the system during operation in order to avoid unexpected results or dangerous effects. In this thesis we propose a framework to investigate the correctness of the design of a WSN from the point of view of its dependability, i.e., resilience to undesired events. The framework is based on the Event Calculus formalism and it is backed up by a support tool aimed at simplifying its adoption by system designers. The tool allows the target WSN to be specified in a user-friendly way and automatically generates the Event Calculus specifications used to check correctness properties and evaluate dependability metrics, such as connection resiliency, coverage and lifetime. It works both at design time and at runtime. In particular, at runtime the tool acts as a server waiting for new events coming from the WSN and, having performed the reasoning using the same specifications, is able to predict future criticalities of the WSN. The effectiveness of the approach is shown in the context of five case studies, aiming to illustrate how the framework is helpful to drive design choices by means of what-if scenario analysis and robustness checking, and to check the correctness properties of the WSN at runtime
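
    Connection resiliency is one of the dependability metrics the framework evaluates. Independently of the Event Calculus formalism, and purely as an assumed example (topology and failure set are invented), the sketch below checks by breadth-first search which sensor nodes can still reach the sink after a given set of node crashes.

```python
from collections import deque

# Hypothetical WSN topology as an undirected adjacency list; "sink" collects data.
TOPOLOGY = {
    "sink": ["n1", "n2"],
    "n1": ["sink", "n3"],
    "n2": ["sink", "n3"],
    "n3": ["n1", "n2", "n4"],
    "n4": ["n3"],
}


def reachable_from_sink(crashed: set) -> set:
    """BFS over surviving nodes; returns the sensors that can still reach the sink."""
    if "sink" in crashed:
        return set()
    seen, queue = {"sink"}, deque(["sink"])
    while queue:
        node = queue.popleft()
        for nbr in TOPOLOGY[node]:
            if nbr not in crashed and nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen - {"sink"}


if __name__ == "__main__":
    alive = reachable_from_sink(crashed={"n3"})
    print("connected sensors after n3 crashes:", sorted(alive))  # ['n1', 'n2']
```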

    Optimizing performance of workflow executions under authorization control

    Business processes or workflows are often used to model enterprise or scientific applications. Automating workflow executions on computing resources has received considerable attention. However, many workflow scenarios still involve human activities and consist of a mixture of human tasks and computing tasks. Human involvement introduces security and authorization concerns, requiring restrictions on who is allowed to perform which tasks at what time. Role-Based Access Control (RBAC) is a popular authorization mechanism. In RBAC, authorization concepts such as roles and permissions are defined, and various authorization constraints are supported, including separation of duty, temporal constraints, etc. Under RBAC, users are assigned to certain roles, while the roles are associated with prescribed permissions. When we assess resource capacities, or evaluate the performance of workflow executions on supporting platforms, it is often assumed that when a task is allocated to a resource, the resource will accept the task and start the execution once a processor becomes available. However, when the authorization policies are taken into account, this assumption may not be true and the situation becomes more complex. For example, when a task arrives, a valid and activated role has to be assigned to a task before the task can start execution. The deployed authorization constraints may delay the workflow execution due to the roles' availability, or other restrictions on the role assignments, which will consequently have a negative impact on application performance. When authorization constraints are present to restrict workflow executions, new research issues arise that have not yet been studied in conventional workflow management. This thesis aims to investigate these new research issues. First, it is important to know whether a feasible authorization solution can be found to enable the executions of all tasks in a workflow, i.e., to check the feasibility of the deployed authorization constraints. This thesis studies the issue of feasibility checking and models the feasibility checking problem as a constraint satisfaction problem. Second, it is useful to know when the performance of workflow executions will not be affected by the given authorization constraints. This thesis proposes methods to determine the time durations during which the given authorization constraints have no impact. Third, when the authorization constraints do have a performance impact, how can we quantitatively analyse and determine the impact? When there are multiple choices to assign the roles to the tasks, will different choices lead to different performance impact? If so, can we find an optimal way to conduct the task-role assignments so that the performance impact is minimized? This thesis proposes a method to analyse the delay caused by the authorization constraints if the workflow arrives beyond the non-impact time duration calculated above. Through the analysis of the delay, we realize that the authorization method, i.e., the method to select the roles to assign to the tasks, affects the length of the delay caused by the authorization constraints. Based on this finding, we propose an optimal authorization method, called the Global Authorization Aware (GAA) method. Fourth, a key reason why authorization constraints may have impact on performance is because the authorization control directs the tasks to some particular roles. Then how can we determine the level of workload directed to each role given a set of authorization constraints? This thesis conducts a theoretical analysis of how the authorization constraints direct the workload to the roles, and proposes methods to calculate the arrival rate of the requests directed to each role under the role, temporal and cardinality constraints. Finally, the amount of resources allocated to support each individual role may have an impact on the execution performance of the workflows. Therefore, it is desirable to develop strategies to determine the adequate amount of resources when the authorization control is present in the system. This thesis presents methods to allocate the appropriate quantity of resources, including both human resources and computing resources. Different features of human resources and computing resources are taken into account. For human resources, the objective is to maximize the performance subject to the budgets to hire the human resources, while for computing resources, the strategy aims to allocate an adequate amount of computing resources to meet the QoS requirements
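
    One part of the thesis derives the arrival rate of requests directed to each role and sizes resources to meet QoS requirements. The sketch below is a rough illustration under assumed parameters (the routing probabilities, rates and the utilisation-based sizing rule are inventions, not the thesis's analysis): it splits the workflow arrival rate across roles according to an authorization method and picks the smallest per-role server count that keeps utilisation below a target.

```python
import math

# Hypothetical parameters: overall task arrival rate, the probability that the
# authorization method directs a task to each role, and each role's per-server
# service rate. QoS target: keep each role's utilisation below MAX_UTILIZATION.
TASK_ARRIVAL_RATE = 12.0          # tasks per hour entering the workflow
ROLE_ASSIGN_PROB = {"clerk": 0.5, "manager": 0.3, "auditor": 0.2}
SERVICE_RATE = {"clerk": 4.0, "manager": 3.0, "auditor": 2.0}  # tasks/hour per server
MAX_UTILIZATION = 0.8


def per_role_arrival_rates() -> dict:
    """Split the workflow arrival rate across roles by assignment probability."""
    return {r: TASK_ARRIVAL_RATE * p for r, p in ROLE_ASSIGN_PROB.items()}


def servers_needed(arrival: float, service: float) -> int:
    """Smallest server count c with utilisation arrival / (c * service) <= target."""
    return max(1, math.ceil(arrival / (MAX_UTILIZATION * service)))


if __name__ == "__main__":
    for role, lam in per_role_arrival_rates().items():
        c = servers_needed(lam, SERVICE_RATE[role])
        print(f"{role}: arrival {lam:.1f}/h -> {c} server(s)")
```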
