171 research outputs found

    Understanding computer security

    Few things in society and everyday life have changed over the last 10 years as much as the concept of security. From bank robberies to wars, what used to imply a great deal of violence now happens silently on the Internet. Perhaps more strikingly, the very idea of privacy – a concept closely related to that of individual freedom – is undergoing such a profound revolution that people are suddenly unable to make rational and informed decisions about it: we protested against the introduction of RFID tags (Kelly and Erickson, 2005; Lee and Kim, 2006), and yet we now give away most of our private information en masse by subscribing to services (social media, free apps, cloud services) whose very reason for existence is the commerce of intimate personal data. The ICT revolution has changed the game, and the security paradigms that were suitable for people and systems as little as 10 years ago are now obsolete; it appears we do not know what to replace them with. As of today, we keep patching systems, but we do not understand how to make them reasonably secure (Rice, 2007); perhaps more importantly, we do not understand what reasonable privacy guarantees are for human beings, let alone how to enforce them. Nor do we understand how to combine accountability and freedom in this new world, in which firewalls and digital perimeters can no longer guarantee security and privacy. We believe that the root of the challenge we face is understanding security, and how information technology can enable and support such an understanding. And just as security is a broad, multidisciplinary topic covering technical as well as non-technical issues, the challenge of understanding security is a multifaceted one, spanning a myriad of noteworthy topics. Here, we mention just three that we consider particularly important.

    On the use of hierarchical subtrace mining for efficient local process model mining

    Mining local patterns of process behavior is a vital tool for the analysis of event data that originates from flexible processes, for which it is generally not possible to describe the behavior of the process in a single process model without overgeneralizing the behavior allowed by the process. Several techniques for mining such local patterns have been developed over the years, including Local Process Model (LPM) mining and the hierarchical mining of frequent subtraces (i.e., subprocesses). These two techniques can be considered orthogonal, i.e., they provide different types of insight into the behavior observed in an event log. As a consequence, it is often useful to apply both techniques to the data. However, both techniques can be computationally intensive, hindering data analysis. In this work, we explore how the output of a subtrace mining approach can be used to mine LPMs more efficiently. We show on a collection of real-life event logs that exploiting the ordering constraints extracted from subtraces lowers the computation time needed for LPM mining compared to state-of-the-art techniques, while at the same time mining higher quality LPMs. Additionally, by mining LPMs from subtraces, we obtain a more structured and meaningful representation of subprocesses, allowing for classic process-flow constructs such as parallel ordering, choices, and loops in addition to the precedence relations shown by subtraces.
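    As a minimal illustration of the subtrace side, the following sketch counts contiguous subtraces that occur in enough traces of a toy event log. The function name, support threshold, and length cap are illustrative choices, not the hierarchical algorithm the abstract refers to.

    ```python
    from collections import Counter

    def frequent_subtraces(log, min_support, max_len=4):
        """Count contiguous subtraces (activity subsequences) in an event log.

        log: list of traces, each a list of activity labels.
        Returns the subtraces (length 2..max_len) that occur in at least
        `min_support` distinct traces.
        """
        counts = Counter()
        for trace in log:
            seen = set()  # count each subtrace at most once per trace
            for i in range(len(trace)):
                for j in range(i + 2, min(i + max_len, len(trace)) + 1):
                    seen.add(tuple(trace[i:j]))
            counts.update(seen)
        return {st: c for st, c in counts.items() if c >= min_support}

    log = [["a", "b", "c", "d"], ["a", "b", "d"], ["a", "b", "c"]]
    print(frequent_subtraces(log, min_support=2))
    ```

    The frequent subtraces found this way (here, "a" followed by "b", possibly extended by "c") are exactly the kind of ordering constraints the paper exploits to prune the LPM search space.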

    GEM: a distributed goal evaluation algorithm for trust management

    Trust Management (TM) is an approach to distributed access control where access decisions are based on policy statements issued by multiple principals and stored in a distributed manner. Most of the existing goal evaluation algorithms for TM either rely on a centralized evaluation strategy, which consists of collecting all the relevant policy statements in a single location (and therefore they do not guarantee the confidentiality of intensional policies), or do not detect the termination of the computation (i.e., the point at which all the answers to a goal have been computed). In this paper, we present GEM, a distributed goal evaluation algorithm for TM systems. GEM detects termination in a completely distributed way without the need to disclose intensional policies, thereby preserving their confidentiality. We demonstrate that the algorithm terminates and is sound and complete w.r.t. the standard semantics for logic programs.
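    The logic-programming flavour of such policies can be sketched with a centralised, propositional stand-in (all goal names and rules below are illustrative; GEM itself evaluates the rules distributed across principals, which is where termination detection becomes hard):

    ```python
    def derivable(rules, facts):
        """Bottom-up least-fixpoint evaluation of a propositional policy.

        `rules` maps each goal to its alternative bodies (lists of subgoals);
        `facts` are goals known to hold. Termination is trivial here because
        the set of known goals only grows; detecting that same fixpoint when
        the rules are spread over mutually distrusting principals is the
        problem GEM addresses.
        """
        known = set(facts)
        changed = True
        while changed:
            changed = False
            for head, bodies in rules.items():
                if head not in known and any(
                        all(g in known for g in body) for body in bodies):
                    known.add(head)
                    changed = True
        return known

    rules = {
        "access(alice)": [["employee(alice)"], ["delegated(alice)"]],
        "delegated(alice)": [["access(alice)"]],  # cyclic delegation
    }
    print("access(alice)" in derivable(rules, {"employee(alice)"}))  # True
    ```

    Note that the cycle between access and delegation does not prevent termination, since the fixpoint loop stops as soon as no new goal is derived.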

    Requirements engineering within a large-scale security-oriented research project: lessons learned

    Requirements engineering has been recognized as a fundamental phase of the software engineering process. Nevertheless, the elicitation and analysis of requirements are often left aside in favor of architecture-driven software development. This tendency, however, can lead to issues that may affect the success of a project. This paper presents the experience we gained in the elicitation and analysis of requirements in a large-scale security-oriented European research project, which was originally conceived as an architecture-driven project. In particular, we illustrate the challenges that can be faced in large-scale research projects and consider the applicability of existing best practices and off-the-shelf methodologies with respect to the needs of such projects. We then discuss how those practices and methods can be integrated into the requirements engineering process and possibly improved to address the identified challenges. Finally, we summarize the lessons learned from our experience and the benefits that a proper requirements analysis can bring to a project.

    Evidence-based discounting rule in subjective logic

    We identify an inconsistency in Subjective Logic caused by the discounting operator ‘...’. We propose a new operator, ‘...’, which resolves all the consistency problems. The new algebra makes it possible to compute Subjective Logic trust values (reputations) in arbitrarily connected trust networks. The material presented here is an excerpt of [3].
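    For readers unfamiliar with the algebra, here is a minimal sketch of two standard Subjective Logic ingredients the abstract builds on: the mapping from evidence counts to an opinion, and the classical belief-based discounting operator whose inconsistencies the paper addresses. The paper's evidence-based replacement operator is deliberately not reproduced here.

    ```python
    def opinion_from_evidence(r, s, W=2.0):
        """Map positive/negative evidence counts (r, s) to a Subjective Logic
        opinion (belief, disbelief, uncertainty). W is the non-informative
        prior weight, 2 in binary Subjective Logic."""
        t = r + s + W
        return (r / t, s / t, W / t)

    def discount_classic(a, b):
        """Classical belief-based trust discounting of B's opinion `b` about
        a target through A's opinion `a` about B. The three components always
        sum to 1."""
        ba, da, ua = a
        bb, db, ub = b
        return (ba * bb, ba * db, da + ua + ba * ub)

    print(opinion_from_evidence(8, 0))  # (0.8, 0.0, 0.2)
    ```

    The evidence mapping shows why uncertainty shrinks as observations accumulate: with 8 positive and 0 negative observations, only 0.2 of the opinion mass remains uncertain.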

    A formal privacy analysis of identity management systems

    With the growing amount of personal information exchanged over the Internet, privacy is becoming more and more of a concern for users. In particular, personal information is increasingly being exchanged in Identity Management (IdM) systems to satisfy the growing need for reliable on-line identification and authentication. One of the key principles in protecting privacy is data minimization. This principle states that only the minimum amount of information necessary to accomplish a certain goal should be collected. Several "privacy-enhancing" IdM systems have been proposed to guarantee data minimization. However, there is currently no satisfactory way to assess and compare the privacy they offer in a precise way: existing analyses are either too informal and high-level, or specific to one particular system. In this work, we propose a general formal method to analyse privacy in systems in which personal information is communicated, and apply it to analyse existing IdM systems. We first elicit privacy requirements for IdM systems through a study of existing systems and taxonomies, and show how these requirements can be verified by expressing knowledge of personal information in a three-layer model. Then, we apply the formal method to study four IdM systems, representative of different research streams, analyse the results in a broad context, and suggest improvements. Finally, we discuss the completeness and (re)usability of the proposed method.

    Flow-based reputation with uncertainty: evidence-based subjective logic

    The concept of reputation is widely used as a measure of trustworthiness based on ratings from members in a community. The adoption of reputation systems, however, relies on their ability to capture the actual trustworthiness of a target. Several reputation models for aggregating trust information have been proposed in the literature. The choice of model has an impact on the reliability of the aggregated trust information as well as on the procedure used to compute reputations. Two prominent models are flow-based reputation (e.g., EigenTrust, PageRank) and subjective logic-based reputation. Flow-based models provide an automated method to aggregate trust information, but they are not able to express the level of uncertainty in the information. In contrast, subjective logic extends probabilistic models with an explicit notion of uncertainty, but the calculation of reputation depends on the structure of the trust network and often requires information to be discarded. These are severe drawbacks. In this work, we observe that the ‘opinion discounting’ operation in subjective logic has a number of basic problems. We resolve these problems by providing a new discounting operator that describes the flow of evidence from one party to another. The adoption of our discounting rule results in a consistent subjective logic algebra that is entirely based on the handling of evidence. We show that the new algebra enables the construction of an automated reputation assessment procedure for arbitrary trust networks, where the calculation no longer depends on the structure of the network, and does not need to throw away any information. Thus, we obtain the best of both worlds: flow-based reputation and consistent handling of uncertainties.
    Keywords: Reputation systems; Evidence theory; Subjective logic; Flow-based reputation model
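    A flow-based model of the kind mentioned (EigenTrust, PageRank) can be sketched as a power iteration over a trust matrix. This is a generic illustration under standard assumptions (row-stochastic trust matrix, teleport factor), not the paper's construction:

    ```python
    def flow_reputation(trust, alpha=0.85, iters=100):
        """PageRank/EigenTrust-style flow-based reputation.

        trust[i][j] is the share of participant i's trust placed in j
        (each row sums to 1). The teleport term (1 - alpha) keeps the
        iteration well defined on any network structure. Note that the
        result is a single score per participant: no uncertainty is
        expressed, which is the limitation the paper contrasts with
        subjective logic.
        """
        n = len(trust)
        r = [1.0 / n] * n
        for _ in range(iters):
            r = [(1 - alpha) / n
                 + alpha * sum(trust[i][j] * r[i] for i in range(n))
                 for j in range(n)]
        return r

    # Toy network: participants 0 and 2 trust only 1; 1 splits its trust.
    T = [[0.0, 1.0, 0.0],
         [0.5, 0.0, 0.5],
         [0.0, 1.0, 0.0]]
    print(flow_reputation(T))  # participant 1 ends up with the highest score
    ```

    The scores always sum to 1, so reputation behaves like a conserved flow redistributed along trust edges, which is what makes the aggregation automatic for arbitrary network structures.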

    History-based construction of log-process alignments for conformance checking : discovering what really went wrong

    Alignments provide a robust approach to conformance checking which has been widely applied in various contexts such as auditing and performance analysis. Alignment-based conformance checking techniques pinpoint the deviations causing nonconformity based on a cost function. However, such a cost function is often manually defined on the basis of human judgment and is thus error-prone, leading to alignments that do not provide the most probable explanations of nonconformity. This paper proposes an approach to automatically define the cost function based on information extracted from past process executions. The cost function relies only on objective factors and thus enables the construction of the most probable alignments, i.e. alignments that provide the most probable explanations of nonconformity. Our approach has been implemented in ProM and assessed using both synthetic and real-life data.
    Keywords: Conformance checking, alignments, cost function
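    The alignment idea can be sketched as an edit-distance computation between an observed trace and one run of the model. The fixed cost parameters below play the role of the manually defined cost function that the paper proposes to learn from history instead; names and defaults are illustrative.

    ```python
    def align(trace, model_run, cost_log=1.0, cost_model=1.0):
        """Cost of an optimal alignment of an observed trace against one
        model run, via edit-distance dynamic programming: synchronous
        moves (same activity on both sides) are free, log-only moves cost
        `cost_log`, model-only moves cost `cost_model`."""
        n, m = len(trace), len(model_run)
        D = [[0.0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            D[i][0] = i * cost_log          # only log moves remain
        for j in range(1, m + 1):
            D[0][j] = j * cost_model        # only model moves remain
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                sync = (D[i - 1][j - 1]
                        if trace[i - 1] == model_run[j - 1] else float("inf"))
                D[i][j] = min(sync,
                              D[i - 1][j] + cost_log,
                              D[i][j - 1] + cost_model)
        return D[n][m]

    print(align(["a", "x", "c"], ["a", "b", "c"]))  # one log + one model move
    ```

    Lowering the cost of a deviation that history shows to be common makes alignments prefer that explanation, which is precisely the lever a history-based cost function turns.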