338 research outputs found

    Automated Design of Network Security Metrics

    Many abstract security measurements are based on characteristics of a graph that represents the network. These are typically simple and quick to compute but are often of little practical use in making real-world predictions. Practical network security is instead often measured using simulation or real-world exercises, which better represent realistic outcomes but can be costly and time-consuming. This work combines the strengths of the two approaches by developing efficient heuristics that accurately predict attack success. Hyper-heuristic machine learning techniques, trained on data from network attack simulations, are used to produce novel graph-based security metrics. These low-cost metrics serve as an approximation for simulation when measuring network security in real time. The approach is tested and verified using a simulation based on activity from an actual large enterprise network. The results demonstrate the potential of hyper-heuristic techniques to rapidly evolve and react to emerging cybersecurity threats.
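As a sketch of the idea, the snippet below scores a network with a weighted combination of cheap graph features and tunes the weights with a trivial (1+1) evolution strategy against simulated attack-success labels. The feature set, weight representation, and search loop are illustrative assumptions, not the paper's hyper-heuristic method.

```python
import random

# Hypothetical sketch: evolve a cheap graph-based security metric against
# attack-simulation labels. A graph is an adjacency dict; a "heuristic" is a
# weight vector over simple per-graph features, standing in for the evolved
# metric expressions described in the abstract.

def graph_features(adj):
    n = len(adj)
    degs = [len(v) for v in adj.values()]
    edges = sum(degs) / 2
    max_deg = max(degs) if degs else 0
    density = 2 * edges / (n * (n - 1)) if n > 1 else 0.0
    return [density, max_deg / max(n - 1, 1), edges / max(n, 1)]

def heuristic_score(weights, adj):
    return sum(w * f for w, f in zip(weights, graph_features(adj)))

def fitness(weights, labelled_graphs):
    # Mean squared error against simulated attack-success rates.
    err = sum((heuristic_score(w_adj[0] if False else weights, adj) - y) ** 2
              for adj, y in labelled_graphs for w_adj in [(adj,)])
    return err / len(labelled_graphs)

def evolve(labelled_graphs, generations=200, seed=0):
    # Minimal (1+1) evolution strategy over the weight vector.
    rng = random.Random(seed)
    best = [rng.uniform(-1, 1) for _ in range(3)]
    best_fit = fitness(best, labelled_graphs)
    for _ in range(generations):
        cand = [w + rng.gauss(0, 0.1) for w in best]
        cand_fit = fitness(cand, labelled_graphs)
        if cand_fit < best_fit:
            best, best_fit = cand, cand_fit
    return best, best_fit
```

The evolved weights play the role of the low-cost metric: once trained, scoring a new network costs only a handful of degree counts rather than a fresh simulation run.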

    Predictive Cyber-security Analytics Framework: A Non-homogeneous Markov Model for Security Quantification

    Numerous security metrics have been proposed in the past for protecting computer networks. However, we still lack effective techniques to accurately measure the predictive security risk of an enterprise, taking into account the dynamic attributes associated with vulnerabilities that can change over time. In this paper we present a stochastic security framework for obtaining quantitative measures of security using attack graphs. Our model is novel in that existing research in attack graph analysis does not consider the temporal aspects associated with vulnerabilities, such as the availability of exploits and patches, which can affect overall network security depending on how the vulnerabilities are interconnected and leveraged to compromise the system. A better understanding of the relationship between vulnerabilities and their lifecycle events can give security practitioners a clearer picture of their state of security. To represent more realistically how the security state of the network varies over time, a non-homogeneous model is developed which incorporates a time-dependent covariate, namely the vulnerability age. The daily transition-probability matrices are estimated using Frei's Vulnerability Lifecycle model. We also leverage the trusted CVSS metric domain to analyze how the total exploitability and impact measures evolve over a time period for a given network. Comment: 16 pages, 6 figures, in International Conference of Security, Privacy and Trust Management 201
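The core mechanism can be illustrated as follows: a state-probability vector is propagated through daily transition matrices whose entries depend on vulnerability age. The age-to-exploit-probability curve and the two-state chain below are toy assumptions standing in for Frei's fitted lifecycle model and the full attack-graph state space.

```python
# Hypothetical sketch of a non-homogeneous Markov chain for security state,
# with a time-dependent covariate (vulnerability age) driving the daily
# transition matrix. The exploit_prob curve is a toy stand-in, not Frei's model.

def exploit_prob(age_days):
    # Toy assumption: exploit availability saturates as the vulnerability ages.
    return min(0.9, 0.05 + 0.01 * age_days)

def daily_matrix(age_days):
    # States: 0 = secure, 1 = compromised (absorbing here for simplicity).
    p = exploit_prob(age_days)
    return [[1 - p, p],
            [0.0, 1.0]]

def propagate(initial, days, disclosure_age=0):
    state = list(initial)
    for d in range(days):
        m = daily_matrix(disclosure_age + d)
        state = [sum(state[i] * m[i][j] for i in range(2)) for j in range(2)]
    return state

# Probability of compromise after 30 days for a freshly disclosed vulnerability:
p30 = propagate([1.0, 0.0], 30)[1]
```

Because each day uses a different matrix, the chain is non-homogeneous: the same network is strictly riskier per day as its vulnerabilities age, which a time-homogeneous model cannot express.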

    Automating Cyber Analytics

    Model-based security metrics are a growing area of cyber security research concerned with measuring the risk exposure of an information system. These metrics are typically studied in isolation, with the formulation of the test itself being the primary finding in publications. As a result, there is a flood of metric specifications available in the literature but a corresponding dearth of analyses verifying results for a given metric calculation under different conditions or comparing the efficacy of one measurement technique over another. The motivation of this thesis is to create a systematic methodology for model-based security metric development, analysis, integration, and validation. In doing so we hope to fill a critical gap in the way we view and improve a system's security. In order to understand the security posture of a system before it is rolled out and as it evolves, we present in this dissertation an end-to-end solution for the automated measurement of the security metrics needed to identify risk early and accurately. To our knowledge this is a novel capability in design-time security analysis which provides the foundation for ongoing research into predictive cyber security analytics. Modern development environments contain a wealth of information in infrastructure-as-code repositories, continuous build systems, and container descriptions that could inform security models, but risk evaluation based on these sources is ad hoc at best, and often simply left until deployment. Our goal in this work is to lay the groundwork for security measurement to be a practical part of the system design, development, and integration lifecycle. In this thesis we provide a framework for the systematic validation of the existing security metrics body of knowledge. In doing so we endeavour not only to survey the current state of the art, but also to create a common platform for future research in the area.
We then demonstrate the utility of our framework through the evaluation of leading security metrics against a reference set of system models we have created. We investigate how to calibrate security metrics for different use cases and establish a new methodology for security metric benchmarking. We further explore the research avenues unlocked by automation through our concept of an API-driven S-MaaS (Security Metrics-as-a-Service) offering. We review our design considerations in packaging security metrics for programmatic access, and discuss how various client access patterns are anticipated in our implementation strategy. Using existing metric processing pipelines as a reference, we show how the simple, modular interfaces in S-MaaS support dynamic composition and orchestration. Next we review aspects of our framework which can benefit from optimization and further automation through machine learning. First we create a dataset of network models labeled with their corresponding security metrics. By training classifiers to predict security values based only on network inputs, we can avoid the computationally expensive attack graph generation steps. We use our findings from this simple experiment to motivate our current lines of research into supervised and unsupervised techniques such as network embeddings, interaction rule synthesis, and reinforcement learning environments. Finally, we examine the results of our case studies. We summarize our security analysis of a large-scale network migration, and list the friction points along the way which are remediated by this work. We relate how our research for a large-scale performance benchmarking project has influenced our vision for the future of security metrics collection and analysis through DevOps automation. We then describe how we applied our framework to measure the incremental security impact of running a distributed stream processing system inside a hardware trusted execution environment.
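The shortcut experiment described above can be sketched minimally: predict a security label straight from cheap network features, skipping attack-graph generation entirely. The feature vector and the 1-nearest-neighbour rule here are illustrative stand-ins for the dataset and classifiers actually used in the thesis.

```python
import math

# Hypothetical sketch: classify a network's risk directly from cheap inputs
# instead of generating an attack graph. A 1-nearest-neighbour rule stands in
# for the trained classifiers; the feature vector is a toy assumption.

def features(hosts, edges, exposed_services):
    # Scale-free toy features; a real dataset would carry far richer inputs.
    return (hosts, edges / max(hosts, 1), exposed_services / max(hosts, 1))

def predict(train, x):
    # train: list of (feature_vector, label) pairs from labelled models.
    nearest = min(train, key=lambda pair: math.dist(pair[0], x))
    return nearest[1]
```

The point of the experiment survives even in this toy form: once labels exist, inference costs a distance computation, not an exponential attack-graph expansion.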

    A Unified Framework for Measuring a Network's Mean Time-to-Compromise

    Measuring the mean time-to-compromise provides important insights for understanding a network's weaknesses and for guiding corresponding defense approaches. Most existing network security metrics only deal with the threats of known vulnerabilities and cannot handle zero-day attacks with consistent semantics. In this thesis, we propose a unified framework for measuring a network's mean time-to-compromise that considers both known and zero-day attacks. Specifically, we first devise models of the mean time for discovering and exploiting individual vulnerabilities. Unlike existing approaches, we replace the generic state transition model with a more vulnerability-specific graphical model. We then employ Bayesian networks to derive the overall mean time-to-compromise by aggregating the results for individual vulnerabilities. Finally, we demonstrate the framework's practical application to network hardening through case studies.
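A drastically simplified version of the aggregation can be sketched as follows: each vulnerability contributes a discovery-plus-exploit mean time, steps along an attack path add, and the attacker is assumed to take the fastest path. The thesis's Bayesian-network aggregation is richer than this; the snippet conveys only the unifying intuition that known and zero-day vulnerabilities differ in their time parameters, not in the model.

```python
# Hypothetical sketch: unify known and zero-day vulnerabilities under one
# mean time-to-compromise (MTTC) calculation. Times are in days; the min-over-
# paths rule is a simplification of the Bayesian-network aggregation.

def vuln_mttc(t_discover, t_exploit):
    # Known vulnerabilities have t_discover = 0; zero-days pay both costs.
    return t_discover + t_exploit

def path_mttc(path, vulns):
    # Steps along a single attack path are traversed in sequence, so they add.
    return sum(vuln_mttc(*vulns[v]) for v in path)

def network_mttc(paths, vulns):
    # The attacker is assumed to take the fastest available path.
    return min(path_mttc(p, vulns) for p in paths)
```

A consequence worth noticing: a single short all-known-vulnerability path can dominate the network MTTC even when other paths contain expensive zero-days, which is exactly the kind of insight used to prioritise hardening.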

    Threat Assessment for Multistage Cyber Attacks in Smart Grid Communication Networks

    In smart grids, managing and controlling power operations are supported by information and communication technology (ICT) and supervisory control and data acquisition (SCADA) systems. The increasing adoption of new ICT assets is making smart grids vulnerable to cyber threats and raising numerous concerns about the adequacy of current security approaches. As a single act of penetration is often not sufficient for an attacker to achieve his or her goal, multistage cyber attacks may occur. Due to the interdependence between the power grid and the communication network, a multistage cyber attack not only affects the cyber system but also impacts the physical system. This thesis investigates an application-oriented stochastic game-theoretic cyber threat assessment framework, which is strongly related to the information security risk management process as standardized in ISO/IEC 27005. The proposed framework seeks to address the specific challenges (e.g., dynamically changing attack scenarios and understanding cascading effects) of performing threat assessments for multistage cyber attacks in smart grid communication networks. The thesis examines the stochastic and dynamic nature of multistage cyber attacks in smart grid use cases and develops a stochastic game-theoretic model to capture the interactions of the attacker and the defender in multistage attack scenarios. To provide a flexible and practical payoff formulation for this model, the thesis presents a mathematical analysis of cascading failure propagation (covering both interdependency-driven and node-overloading cascading failure propagation) in smart grids. In addition, the thesis quantifies the disruptive effects of cyber attacks on physical power grids.
Furthermore, this thesis discusses in detail the ingredients of the developed stochastic game-theoretic model and presents the implementation steps of the investigated cyber threat assessment framework. The framework is then applied to evaluate a demonstrated multistage cyber attack scenario in smart grids. It can be integrated into an existing risk management process, such as ISO 27000, or applied as a standalone threat assessment process in smart grid use cases.
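In its simplest degenerate case, the attacker-defender interaction reduces to a one-shot zero-sum matrix game; the sketch below computes the defender's maximin (worst-case-minimising) pure strategy. The payoff matrix is a hypothetical stand-in for the cascading-failure-based payoffs of the full multistage stochastic game.

```python
# Hypothetical sketch: one-shot zero-sum attacker/defender game, a degenerate
# case of the stochastic game in the framework. payoff[d][a] is the attacker's
# gain (e.g. expected load shed from cascading failures) when the defender
# plays pure strategy d and the attacker plays pure strategy a.

def defender_best_response(payoff):
    # For each defender strategy, assume the attacker best-responds ...
    worst = [max(row) for row in payoff]
    # ... then pick the defender strategy minimising that worst case.
    d = min(range(len(payoff)), key=lambda i: worst[i])
    return d, worst[d]   # (maximin strategy index, game value upper bound)
```

The full framework generalises this in two directions the sketch omits: mixed strategies, and state transitions linking the stage games across the phases of a multistage attack.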

    Determinants of farmers' adaptive capacity and a method for assessing vulnerability to climate change: the case of maize farms in Occitanie

    Climate change is a defining challenge of our century, if not the defining one. It strongly affects French agriculture, and maize cultivation in particular, which is increasingly vulnerable because of its water needs during the summer. Maize growers must adapt in order to move towards resilient systems able to maintain production, the economic activity of producing regions, and the incomes of farmers in those regions. The scientific community has already proposed a multitude of adaptation strategies, yet these are adopted unevenly in similar situations (same region and production level). Moreover, among adopters, some strategies sometimes turn out to be maladaptations. It is therefore essential to propose strategies that farmers can actually adopt and that are suited to each individual case. To contribute to this, the objective of my thesis was to build methods for assessing farms' vulnerability to climate change. The approach can be described as systemic, participatory, and interdisciplinary. First, my thesis aims to better characterize the dimensions of vulnerability as defined by the IPCC. In particular, it identifies the determinants of adaptive capacity, a dimension still little studied yet essential for understanding and supporting farmers' choice of adaptation strategies. Borrowing approaches and tools from behavioural economics, this first stage of my thesis highlighted the role of the farmer's cognitive and psychological profile in his or her adaptive capacity in a context of climate change. Second, I develop a conceptual framework of vulnerability to climate change that integrates the previously identified determinants of adaptive capacity. 
This framework then guides the elicitation of assessment indicators through semi-structured interviews with farmers. Building on these indicators, the third part of my thesis proposes a methodology for constructing a farm-level method of assessing vulnerability to climate change. The practical outcome of this research, an operational assessment tool that farmers and their advisers can adopt, is under development. The tool would enable (i) diagnosing vulnerability to climate change, (ii) dialogue around the levers for improving the farmer's adaptive capacity, and (iii) testing strategies to put in place to reduce the farm's vulnerability to climate change.

    Modeling of mobile end-user context

    Emerging mobile services have spawned new revenue sources, such as messaging, Internet browsing and multimedia. One of the business opportunities that the mobile industry has not yet fully exploited is the contextual status of end-users. Context-aware systems are gaining importance in telecommunications since the applications are numerous and relevant from both the industrial (e.g. user segmentation) and academic (e.g. analysis of mobile service adoption dynamics) points of view. This thesis first presents a theoretical discussion: the evolution of telecommunications services, prior studies in context-aware systems, and related concepts such as data mining techniques, network theory, and network visualization. The second part focuses on the development of a context detection algorithm. This algorithm extracts contextual information from data logs containing cell-id transitions. It follows two steps: first, a clustering process in which physically close cells are grouped into clusters; second, context detection for each of those clusters using time-based assumptions. The thesis uses a handset-based tool to collect data logs. The strength and accuracy of the algorithm are tested through analysis of the output files. Finally, a study of real data from the Finnish market is carried out. Through this analysis, the thesis focuses on the service usage perspective. The driving question is how and where end-users spend their time with their handsets. Context is not only about location but also about the physical status and social settings of end-users. Context detection thus provides a new dimension, for example in service usage analysis or in modeling service adoption. The results show, for example, that most WLAN usage takes place at "home", while applications such as "Navigation and Maps" or "Browsing" are used in the "on the move" context. Intensity graphs show that 
"Home" is not the most active context despite being the most frequent one, and that usage abroad is most intense in "Multimedia" and "Messaging" applications. On the other hand, there is a significant business opportunity in applications that automatically identify the context of mobile users throughout the day. Targeted marketing and handset-based contextual adaptation are examples of possible applications. Finally, the thesis confirms that network visualization tools are useful in the process of context modeling, not only for testing the results but also for supporting detection through their functionality (e.g. clustering). Some examples using one specific tool are provided at the end.