
    Workflow Behavior Auditing for Mission Centric Collaboration

    Successful mission-centric collaboration depends on situational awareness in an increasingly complex mission environment. To support timely and reliable high-level mission decisions, auditing tools need real-time data for effective assessment and optimization of mission behaviors. In the context of a battle rhythm, mission health can be measured from workflow-generated activities. Although battle rhythm collaboration is dynamic and global, process mining is a potential enabling technology for workflow behavior auditing. However, process mining alone cannot provide mission situational awareness in the battle rhythm environment, since event logs may contain dynamic mission states, noise, and timestamp inaccuracy. We therefore address several key near-term issues. In sequences of activities parsed from network traffic streams, we identify mission state changes with a workflow shift detection algorithm. In segments of unstructured event logs that contain both noise and relevant workflow data, we extract and rank workflow instances for the process analyst. To cope with timestamp inaccuracy in event logs from semi-automated, distributed workflows, we develop the flower chain network and a discovery algorithm that improve behavioral conformance. For long-term adoption of process mining in mission-centric collaboration, we develop and demonstrate an experimental framework for logging uncertainty testing. We show that it is highly feasible to employ process mining techniques in environments with dynamic mission states and logging uncertainty. Future workflow behavior auditing technology will benefit from continued algorithmic development, new data sources, and system prototypes to propel next-generation mission situational awareness, giving commanders new tools to assess and optimize workflows, computer systems, and missions in the battle space environment.
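
    The thesis defines the workflow shift detection algorithm itself; purely as a minimal sketch of the underlying idea, the code below flags a candidate mission state change when the activity distribution in a recent window diverges from the preceding window (the Jensen-Shannon measure, window size, and threshold are illustrative assumptions, not the thesis's algorithm):

    ```python
    from collections import Counter
    import math

    def _kl(p, q):
        """KL divergence between two probability dicts (q > 0 wherever p > 0)."""
        return sum(pv * math.log2(pv / q[k]) for k, pv in p.items() if pv > 0)

    def js_divergence(window_a, window_b):
        """Jensen-Shannon divergence between the activity mixes of two windows."""
        ca, cb = Counter(window_a), Counter(window_b)
        keys = set(ca) | set(cb)
        pa = {k: ca[k] / len(window_a) for k in keys}
        pb = {k: cb[k] / len(window_b) for k in keys}
        m = {k: 0.5 * (pa[k] + pb[k]) for k in keys}
        return 0.5 * _kl(pa, m) + 0.5 * _kl(pb, m)

    def detect_shifts(activities, window=50, threshold=0.3):
        """Yield indices where the recent activity mix diverges from the prior window."""
        for i in range(window, len(activities) - window, window):
            baseline = activities[i - window:i]
            recent = activities[i:i + window]
            if js_divergence(baseline, recent) > threshold:
                yield i  # candidate mission state change
    ```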

    Discovering and Utilising Expert Knowledge from Security Event Logs

    Security assessment and configuration is a methodology for protecting computer systems from malicious entities. It is a continuous process and heavily dependent on human experts, who are widely acknowledged to be in short supply. This can result in a system being left insecure because of the lack of easily accessible experience and specialist resources. While performing security tasks, human experts often revert to a system's event logs to determine the security status, such as failures, configuration modifications, and system operations. However, finding and exploiting knowledge from event logs is a challenging and time-consuming task for non-experts. Hence, there is a strong need for mechanisms that make the process easier for security experts, as well as tools for those with significantly less security expertise. Doing so automatically allows for persistent and methodical testing without an excessive amount of manual time and effort, and makes computer security more accessible to non-experts. In this thesis, we present a novel technique to process the security event logs of a system that has been evaluated and configured by a security expert, extract key domain knowledge indicative of human decision making, and automatically apply the acquired knowledge to previously unseen systems so that non-experts can be offered recommended security improvements. The proposed solution utilises association and causal rule mining techniques to automatically discover relationships in the event log entries. The relationships take the form of cause-and-effect rules that define security-related patterns. These rules and other relevant information are encoded into a PDDL-based domain action model. The domain model, together with a problem instance generated from any vulnerable system, can then be used to produce a plan of action by employing a state-of-the-art automated planning algorithm. The plan can be exploited by non-professionals to identify the security issues and make improvements. Empirical analysis is subsequently performed on 21 live, real-world event log datasets, where the acquired domain model and identified plans are closely examined. The solution's accuracy lies between 73% and 92%, and it gains a significant performance boost compared to the manual approach of identifying event relationships. The research presented in this thesis is an automation of extracting knowledge from event data streams. Previous research and current industry practices suggest that this knowledge elicitation is performed by human experts. As evident from the empirical analysis, we present a promising line of work that has the capacity to be utilised in commercial settings. This would reduce (or even eliminate) the dire and immediate need for scarce human resources and contribute towards financial savings.
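
    The thesis specifies its own association and causal rule mining pipeline; as a minimal sketch of the cause-and-effect idea under stated assumptions, the code below derives "A then B" rules from ordered event-log sessions using support and confidence thresholds (the event names, thresholds, and the simple follows-within-a-session heuristic are hypothetical):

    ```python
    from collections import Counter

    def mine_cause_effect_rules(sessions, min_support=0.2, min_confidence=0.8):
        """Mine 'cause -> effect' rules from ordered event-log sessions.

        A rule (a, b) is emitted when event b follows event a in enough
        sessions (support) and in a high fraction of the sessions that
        contain a (confidence).
        """
        n = len(sessions)
        cause_counts = Counter()   # sessions containing each potential cause
        pair_counts = Counter()    # sessions where b follows a at least once
        for events in sessions:
            seen, pairs = set(), set()
            for i, a in enumerate(events):
                seen.add(a)
                for b in events[i + 1:]:
                    if b != a:
                        pairs.add((a, b))
            cause_counts.update(seen)
            pair_counts.update(pairs)
        rules = []
        for (a, b), c in pair_counts.items():
            support, confidence = c / n, c / cause_counts[a]
            if support >= min_support and confidence >= min_confidence:
                rules.append((a, b, support, confidence))
        return sorted(rules, key=lambda r: -r[3])

    # Hypothetical sessions: 'audit_policy_changed' tends to follow 'gpo_update'.
    sessions = [
        ["gpo_update", "audit_policy_changed", "service_restart"],
        ["logon_failure", "gpo_update", "audit_policy_changed"],
        ["logon_failure", "account_lockout"],
    ]
    for a, b, s, c in mine_cause_effect_rules(sessions):
        print(f"{a} -> {b}  support={s:.2f} confidence={c:.2f}")
    ```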

    Self-adaptive Authorisation Infrastructures

    Traditional approaches to access control rely on immutable criteria by which to decide and award access. These approaches are limited, notably when handling changes in an organisation’s protected resources, resulting in an inability to accommodate the dynamic aspects of risk at runtime. An example of such risk is a user abusing their privileged access to perform insider attacks. This thesis proposes self-adaptive authorisation, an approach that enables dynamic access control. A framework for developing self-adaptive authorisation is defined, where autonomic controllers are deployed within legacy-based authorisation infrastructures to enable the runtime management of access control. Essential to the approach is the use of models and model-driven engineering (MDE). Models enable a controller to abstract from the authorisation infrastructure it seeks to control, reason about state, and provide assurances over changes to access. For example, a modelled state of access may represent an active access control policy. Given the diverse nature of authorisation infrastructure implementations, MDE enables the creation and transformation of such models, whereby assets (e.g., policies) can be automatically generated and deployed at runtime. A prototype of the framework was developed, in which the management of access control focuses on mitigating the abuse of access rights. The prototype implements a feedback loop that monitors an authorisation infrastructure (modelling the state of access control and user behaviour), analyses potential solutions for handling malicious behaviour, and acts upon the infrastructure to control future access control decisions. The framework was evaluated against the mitigation of simulated insider attacks involving the abuse of access rights governed by access control methodologies. In addition, to investigate the framework’s approach in a diverse and unpredictable environment, a live experiment was conducted. This evaluated the mitigation of abuse performed by real users and demonstrated the consequences of self-adaptation through observation of user responses.
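
    The abstract describes a feedback loop that monitors access state, analyses behaviour, and acts on the infrastructure; the sketch below shows only that loop shape (the access model, the denial-count anomaly test, and the revocation action are hypothetical placeholders, not the thesis's MDE machinery):

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class AccessModel:
        """Toy runtime model of access state: per-user denial counts."""
        denials: dict = field(default_factory=dict)
        revoked: set = field(default_factory=set)

    def monitor(model: AccessModel, event: dict) -> None:
        """Update the model from an observed authorisation event."""
        if event["decision"] == "deny":
            model.denials[event["user"]] = model.denials.get(event["user"], 0) + 1

    def analyse(model: AccessModel, threshold: int = 5):
        """Flag users whose behaviour suggests abuse of access rights."""
        return [u for u, n in model.denials.items()
                if n >= threshold and u not in model.revoked]

    def act(model: AccessModel, suspicious) -> None:
        """Adapt the (hypothetical) infrastructure, e.g. revoke a role assignment."""
        for user in suspicious:
            model.revoked.add(user)
            print(f"revoking privileged role for {user}")  # stand-in for a policy push

    def feedback_loop(model: AccessModel, events) -> None:
        for event in events:
            monitor(model, event)
            act(model, analyse(model))
    ```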

    The Impact of ISO 9000 Diffusion on Trade and FDI: A New Institutional Analysis

    The effects of ISO 9000 diffusion on trade and FDI have been understudied. We employ panel data reported by OECD nations over the 1995-2002 period to estimate the impact of ISO adoptions on country-pair economic relations. We find ISO diffusion to have no effect in developed nations, but to positively pull FDI (i.e., enhance inward FDI) and positively push trade (i.e., enhance exports) in developing nations.
    Keywords: FDI, Trade, Transaction Costs, Institutions
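
    The abstract does not give the estimation strategy in detail; as a rough, hypothetical illustration of a country-pair panel estimation of this kind, a gravity-style regression with pair and year fixed effects might look as follows (the data file and all variable names are assumptions, not the paper's specification):

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical country-pair panel: one row per (origin, destination, year),
    # with log bilateral exports, log GDPs, and ISO 9000 certificate counts.
    df = pd.read_csv("country_pair_panel.csv")  # hypothetical file
    df["pair"] = df["origin"] + "_" + df["destination"]

    # Gravity-style specification: ISO diffusion in the exporting country,
    # controlling for market size, with pair and year fixed effects and
    # standard errors clustered by country pair.
    model = smf.ols(
        "log_exports ~ log_iso_certs_origin + log_gdp_origin + log_gdp_dest"
        " + C(pair) + C(year)",
        data=df,
    ).fit(cov_type="cluster", cov_kwds={"groups": df["pair"]})

    print(model.summary())
    ```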

    ESMC: Entire Space Multi-Task Model for Post-Click Conversion Rate via Parameter Constraint

    Large-scale online recommender systems spread all over the Internet and are in charge of two basic tasks: Click-Through Rate (CTR) and Post-Click Conversion Rate (CVR) estimation. However, traditional CVR estimators suffer from the well-known Sample Selection Bias and Data Sparsity issues. Entire-space models were proposed to address these two issues by tracing the decision-making path "exposure → click → purchase". Further, some researchers observed that there are purchase-related behaviors between click and purchase, which can better capture the user's decision-making intention and improve recommendation performance. Thus, the decision-making path has been extended to "exposure → click → in-shop action → purchase" and can be modeled with a conditional probability approach. Nevertheless, we observe that the chain rule of conditional probability does not always hold. We report the Probability Space Confusion (PSC) issue and mathematically derive the difference between the ground truth and the estimate. We propose a novel Entire Space Multi-Task Model for Post-Click Conversion Rate via Parameter Constraint (ESMC) and two alternatives, the Entire Space Multi-Task Model with Siamese Network (ESMS) and the Entire Space Multi-Task Model in Global Domain (ESMG), to address the PSC issue. Specifically, we handle "exposure → click → in-shop action" and "in-shop action → purchase" separately, in light of the characteristics of in-shop actions. The first path is still treated with conditional probability, while the second is treated with a parameter constraint strategy. Experiments in both offline and online environments of a large-scale recommendation system illustrate the superiority of our proposed methods over state-of-the-art models. The real-world datasets will be released.
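
    ESMC's architecture and parameter constraint are defined in the paper; the sketch below shows only the ESMM-style entire-space decomposition it builds on, where pCTCVR = pCTR × pCVR is trained over all exposures (the tower sizes and feature dimensions are hypothetical):

    ```python
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class EntireSpaceModel(nn.Module):
        """ESMM-style entire-space multi-task model (hypothetical sizes)."""
        def __init__(self, num_features=1000, embed_dim=16):
            super().__init__()
            self.embedding = nn.EmbeddingBag(num_features, embed_dim)  # shared embedding
            def tower():
                return nn.Sequential(nn.Linear(embed_dim, 64), nn.ReLU(),
                                     nn.Linear(64, 1))
            self.ctr_tower = tower()   # estimates P(click | exposure)
            self.cvr_tower = tower()   # estimates P(purchase | click)

        def forward(self, feature_ids):
            x = self.embedding(feature_ids)
            p_ctr = torch.sigmoid(self.ctr_tower(x)).squeeze(-1)
            p_cvr = torch.sigmoid(self.cvr_tower(x)).squeeze(-1)
            # Chain rule over the entire exposure space:
            # P(click & purchase | exposure) = P(click | exposure) * P(purchase | click)
            return p_ctr, p_ctr * p_cvr

    def entire_space_loss(p_ctr, p_ctcvr, click, purchase):
        """Both losses are computed over all exposures, avoiding selection bias."""
        return (F.binary_cross_entropy(p_ctr, click)
                + F.binary_cross_entropy(p_ctcvr, click * purchase))
    ```

    As the abstract notes, the chain rule in this decomposition can fail once the path is extended with in-shop actions (the PSC issue); ESMC therefore keeps the conditional treatment only for "exposure → click → in-shop action" and handles "in-shop action → purchase" through a parameter constraint instead.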

    Investigating system intrusions with data provenance analytics

    To aid threat detection and investigation, enterprises are increasingly relying on commercially available security solutions, such as Intrusion Detection Systems (IDS) and Endpoint Detection and Response (EDR) tools. These security solutions first collect and analyze audit logs throughout the enterprise and then generate threat alerts when suspicious activities occur. Security analysts then investigate those threat alerts to separate false alarms from true attacks by extracting contextual history from the audit logs, i.e., the trail of events that caused the threat alert. Unfortunately, investigating threats in enterprises is a notoriously difficult task, even for expert analysts, due to two main challenges. First, existing enterprise security solutions are optimized to miss as few threats as possible; as a result, they generate an overwhelming volume of false alerts, creating a backlog of investigation tasks. Second, modern computing systems are so operationally complex that they produce an enormous volume of audit logs per day, making it difficult to correlate events for threats that span multiple processes, applications, and hosts. In this dissertation, I propose leveraging data provenance analytics to address the challenges mentioned above. I present five provenance-based techniques that enable system defenders to investigate malicious behaviors in enterprise settings effectively and efficiently. First, I present NoDoze, an alert triage system that automatically prioritizes generated alerts based on their anomalous contextual history. Following that, RapSheet brings the benefits of data provenance to commercial EDR tools and provides system defenders with a compact visualization of multi-stage attacks. Swift then realizes a provenance graph database that generates the contextual history around alerts in real time, even when analyzing audit logs containing tens of millions of events. Finally, OmegaLog and Zeek Agent introduce the vision of universal provenance analysis, which unifies all forensically relevant provenance information on the system regardless of its layer of origin, improving investigation capabilities.
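
    As a minimal illustration of the provenance analysis these systems share, the sketch below builds a provenance graph from audit events and extracts an alert's backward trace (its contextual history); the event schema and process names are hypothetical:

    ```python
    import networkx as nx

    # Hypothetical audit events: (subject, operation, object)
    events = [
        ("explorer.exe", "exec", "winword.exe"),
        ("winword.exe", "exec", "powershell.exe"),
        ("powershell.exe", "write", "payload.dll"),
        ("payload.dll", "load", "svchost.exe"),
    ]

    # Build a provenance graph: an edge u -> v means u causally influenced v.
    G = nx.DiGraph()
    for subject, op, obj in events:
        G.add_edge(subject, obj, operation=op)

    # An EDR alert fires on svchost.exe; its contextual history is every
    # ancestor in the provenance graph, i.e., the trail that led to the alert.
    alert_node = "svchost.exe"
    history = nx.ancestors(G, alert_node)
    print(f"contextual history of {alert_node}: {sorted(history)}")
    # A triage system like NoDoze would score this history for anomalousness
    # rather than presenting the raw alert to an analyst.
    ```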