Cyber Security Risk Evaluation Research Based on Entropy Weight Method
© 2016 IEEE. Risk assessment of network and security systems carries a high level of uncertainty because probability and statistics are conventionally used to evaluate the security of cyber systems. In this paper we use Shannon entropy to represent the uncertainty of the information used to calculate system risk, and the entropy weight method, since the weight of an index points to its significant components. We evaluate the risk of security systems in terms of different security layers and protections: the information system is analysed through the protections at the perimeter, network, host, application and data layers. The capability of these protections is measured by introducing the concept of protection effectiveness. A security evaluation algorithm normalises the protection matrix and calculates the entropy and entropy weights; the weights and attack paths are then used to compute the total risk in the system, giving system administrators clear guidance on the vulnerable security entities. By introducing the notion of security entities, we aim to develop a novel approach to cyber security evaluation suitable for the majority of cyber systems.
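The entropy weight computation the abstract describes is a standard technique and can be sketched in a few lines; the layer names follow the abstract, but the scores, systems, and column ordering below are entirely hypothetical.

```python
import math

def entropy_weights(matrix):
    """Entropy weight method: rows are systems, columns are protection indices.
    Returns one weight per column; columns with lower entropy (more
    discriminating across systems) receive higher weight."""
    m, n = len(matrix), len(matrix[0])
    entropies = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [x / total for x in col]          # normalise the column
        entropies.append(-sum(pi * math.log(pi) for pi in p if pi > 0)
                         / math.log(m))       # Shannon entropy, scaled to [0, 1]
    diverg = [1 - e for e in entropies]       # divergence degree of each index
    s = sum(diverg)
    return [d / s for d in diverg]

# Hypothetical protection-effectiveness scores for three systems across the
# perimeter, network, host, application and data layers.
scores = [
    [0.9, 0.7, 0.6, 0.8, 0.5],
    [0.4, 0.8, 0.7, 0.6, 0.9],
    [0.7, 0.5, 0.9, 0.7, 0.6],
]
w = entropy_weights(scores)
```

The resulting weights can then multiply per-layer risk scores to aggregate a total system risk, as the abstract outlines.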
A Threat Computation Model using a Markov Chain and Common Vulnerability Scoring System and its Application to Cloud Security
Copyright © 2019. Securing cyber infrastructures has become critical because they are increasingly exposed to attackers while accommodating huge numbers of IoT devices and supporting numerous sophisticated emerging applications. Security metrics are essential for assessing security risks and making effective decisions about system security. Many security metrics rely on mathematical models but are mainly based on empirical data, qualitative methods, or compliance checking, which renders the outcome far from satisfactory. Computing the probability of an attack, or more precisely of a threat that materialises into an attack, forms an essential basis for a quantitative security metric. This paper proposes a novel approach to computing the probability distribution of cloud security threats based on a Markov chain and the Common Vulnerability Scoring System, and introduces a method to estimate the probability of security attacks. The use of the new security threat model and its computation is demonstrated by applying them to estimate the probabilities of cloud threats and types of attacks.
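The abstract does not give the chain's construction, but the core computation in any such model is the long-run distribution of a Markov chain. A minimal sketch (the states, transition probabilities, and their link to CVSS scores below are hypothetical):

```python
def stationary_distribution(P, iters=1000):
    """Power-iterate a row-stochastic transition matrix to approximate its
    stationary distribution pi, with pi = pi . P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Hypothetical 3-state threat chain (e.g. probe -> exploit -> compromise),
# with transition probabilities imagined as normalised CVSS exploitability
# scores; the real paper's derivation may differ.
P = [
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.1, 0.2, 0.7],
]
pi = stationary_distribution(P)
```

The stationary vector gives the long-run fraction of time spent in each threat state, which is one natural reading of "probability distribution of cloud security threats".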
Attack graph approach to dynamic network vulnerability analysis and countermeasures
A thesis submitted to the University of Bedfordshire, in partial fulfilment of the requirements for the degree of Doctor of Philosophy.
It is widely accepted that modern computer networks (often presented as a heterogeneous collection of functioning organisations, applications, software, and hardware) contain vulnerabilities. This research proposes a new methodology to compute a dynamic severity cost for each state. Here a state refers to the behaviour of a system during an attack; an example of a state is one where an attacker could influence the information on an application to alter credentials. This is performed by utilising a modified variant of the Common Vulnerability Scoring System (CVSS), referred to as the Dynamic Vulnerability Scoring System (DVSS), which calculates scores for intrinsic, time-based, and ecological metrics by combining related sub-scores and modelling the problem's parameters in a mathematical framework to develop a unique severity cost.
Because the static nature of CVSS limits the accuracy of its scores, the author has developed a novel model to produce a DVSS metric that is more precise and efficient.
In this approach, the final scores are computed from a number of parameters including the network architecture, device settings, and the impact of vulnerability interactions.
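The abstract does not state the DVSS combination rule. Purely as an illustration of blending the three metric groups it names, one could imagine a weighted combination like the following; the weights, the function, and the clamping to the 0-10 CVSS range are all assumptions, not the thesis's formula.

```python
def dynamic_score(base, temporal, environmental, weights=(0.5, 0.3, 0.2)):
    """Hypothetical DVSS-style blend of intrinsic (base), time-based (temporal)
    and ecological (environmental) sub-scores, each assumed to lie on 0-10."""
    wb, wt, we = weights
    score = wb * base + wt * temporal + we * environmental
    return round(min(10.0, score), 1)      # clamp to the usual CVSS range

# Hypothetical sub-scores for a single vulnerability.
s = dynamic_score(base=7.5, temporal=6.0, environmental=8.0)
```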
An attack graph (AG) is a security model representing the chains of vulnerability exploits in a network. A number of researchers have acknowledged the visual complexity of attack graphs and the lack of in-depth understanding they afford. Current attack graph tools are constrained to limited attributes or even rely on hand-generated input. The automatic assembly of vulnerability information has been troublesome, and vulnerability descriptions are frequently created by hand or based on limited data. The network architectures and configurations, along with the interactions between the individual vulnerabilities, are considered in the method of computing cost using the DVSS within a dynamic cost-centric framework.
A new methodology was developed to present an attack graph with a dynamic cost metric based on the DVSS, together with a novel methodology to estimate and represent the cost-centric approach for each host's states.
The framework is demonstrated on a test network, using the Nessus scanner to detect known vulnerabilities, implement these results, and build and represent the dynamic cost-centric attack graph using ranking algorithms (in a manner comparable to Mehta et al. 2006 and Kijsanayothin 2010). However, instead of using the vulnerabilities for each host, a CostRank Markov Model was developed using the novel cost-centric approach, thereby reducing the complexity of the attack graph and alleviating the visibility problem.
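The abstract does not specify the CostRank formulation. One plausible reading, given the comparison to Mehta et al.'s PageRank-based ranking, is a PageRank-style iteration whose teleportation is biased by DVSS-derived severity costs. A minimal sketch under that assumption (the graph, costs, and damping factor are hypothetical):

```python
def cost_rank(adj, costs, d=0.85, iters=100):
    """PageRank-style ranking over attack-graph states, with the teleport
    vector biased by per-state severity cost. adj[i] lists the successor
    states of state i."""
    n = len(adj)
    total = sum(costs)
    base = [c / total for c in costs]          # cost-biased teleport vector
    r = base[:]
    for _ in range(iters):
        new = [(1 - d) * base[i] for i in range(n)]
        for i, succs in enumerate(adj):
            if succs:
                share = d * r[i] / len(succs)
                for j in succs:
                    new[j] += share
            else:                              # dangling state: spread uniformly
                for j in range(n):
                    new[j] += d * r[i] / n
        r = new
    return r

adj = [[1, 2], [2], [0]]     # hypothetical attack-graph transitions
costs = [2.0, 5.0, 9.0]      # hypothetical DVSS-derived severity costs
ranks = cost_rank(adj, costs)
```

States that are both highly reachable and highly costly float to the top of the ranking, which is the property an administrator would use to prioritise defence.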
A parallel algorithm is also developed to implement CostRank. The motivation for a parallel CostRank algorithm is to expedite the state-ranking calculations as the number of hosts and/or vulnerabilities grows. Likewise, the author intends to secure large-scale networks that require fast and reliable computing to rank enormous graphs with thousands of vertices (states) and millions of arcs (each representing an action to move from one state to another). The proposed approach focuses on a parallel CostRank computational architecture to appraise the improvement in CostRank calculations and the scalability of the algorithm. In particular, partitioning the input data, graph files and ranking vectors, combined with a load-balancing technique, can enhance the performance and scalability of parallel CostRank computations.
A practical model of the parallel CostRank calculation is implemented, resulting in a substantial decrease in communication overhead and in iteration time. The results are presented analytically in terms of scalability, efficiency, memory usage, speed-up and input/output rates.
Finally, a countermeasures model is developed to protect against network attacks using a Dynamic Countermeasures Attack Tree (DCAT). The DCAT is built as follows: (i) use the scalable parallel CostRank algorithm to determine the critical assets that system administrators need to protect; (ii) run the Nessus scanner to determine the vulnerabilities associated with each asset, using the dynamic cost-centric framework and DVSS; (iii) check all published mitigations for those vulnerabilities; (iv) assess how well each security solution mitigates the risks; (v) evaluate the DCAT algorithm in terms of effective security cost, probability and cost/benefit analysis to reduce the total impact of a specific vulnerability.
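The cost/benefit step of such a scheme could, for instance, score each candidate mitigation by net benefit (expected risk reduction minus mitigation cost). The scoring rule, mitigation names, and all figures below are illustrative assumptions, not the thesis's actual DCAT computation.

```python
def best_mitigation(impact, mitigations):
    """Pick the mitigation with the highest net benefit.
    impact: expected loss if the vulnerability is exploited.
    mitigations: list of (name, cost, effectiveness in [0, 1])."""
    scored = [(impact * eff - cost, name) for name, cost, eff in mitigations]
    return max(scored)          # (net_benefit, name) of the best option

# Hypothetical asset with an expected exploitation impact of 100 units.
mitigations = [
    ("patch",        10.0, 0.90),
    ("waf-rule",      5.0, 0.50),
    ("segmentation", 30.0, 0.95),
]
benefit, choice = best_mitigation(100.0, mitigations)
```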
Mixed structural models for decision making under uncertainty using stochastic system simulation and experimental economic methods: application to information security control choice
This research is concerned with whether and to what extent information security managers may be biased
in their evaluation of and decision making over the quantifiable risks posed by information management
systems where the circumstances may be characterized by uncertainty in both the risk inputs (e.g. system
threat and vulnerability factors) and outcomes (actual efficacy of the selected security controls and the
resulting system performance and associated business impacts). Although "quantified security" and any
associated risk management remain problematic from both a theoretical and empirical perspective (Anderson 2001; Verendel 2009; Appari 2010), professional practitioners in the field of information security continue to advocate the consideration of quantitative models for risk analysis and management wherever possible, because those models permit a reliable economic determination of optimal operational control decisions (Littlewood, Brocklehurst et al. 1993; Nicol, Sanders et al. 2004; Anderson and Moore 2006; Beautement, Coles et al. 2009; Anderson 2010; Beresnevichiene, Pym et al. 2010; Wolter and Reinecke 2010; Li, Parker et al. 2011). The main contribution of this thesis is to bring current quantitative economic methods and experimental choice models to the field of information security risk management, to examine the potential for biased decision making by security practitioners under conditions where
information may be relatively objective or subjective, and to demonstrate the potential for informing decision makers about these biases when making control decisions in a security context. No single quantitative security approach appears to have formally incorporated three key features of the security risk management problem addressed in this research: 1) the inherently stochastic nature of the information system inputs and outputs which contribute directly to decisional uncertainty (Conrad 2005; Wang, Chaudhury et al. 2008; Winkelvos, Rudolph et al. 2011); 2) the endogenous estimation of a decision maker's risk attitude using models which otherwise typically assume risk neutrality or an inherent degree of risk aversion (Danielsson 2002; Harrison, Johnson et al. 2003); and 3) the application of structural modelling which allows for the possible combination and weighting between multiple latent models of choice (Harrison and Rutström 2009). The identification, decomposition and tractability of these decisional factors is of crucial importance to understanding the economic trade-offs inherent in security control choice under conditions of both risk and uncertainty, particularly where established psychological decisional biases such as ambiguity aversion (Ellsberg 1961) or loss aversion (Kahneman and Tversky 1984) may be assumed to be endemic to, if not magnified by, the institutional setting in which these
decisions take place. At a minimum, risk-averse managers may simply be overspending on controls, overcompensating
for anticipated losses that do not actually occur with the frequency or impact they imagine. On the other hand, risk-seeking managers, where they may exist (practitioners call them "cowboys"; they are a familiar player in equally risky financial markets), may simply be gambling against ultimately losing odds, putting the entire firm at risk of potentially catastrophic security losses. Identifying and correcting for these scenarios would seem to be increasingly important for now universally networked business computing infrastructures.
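The overspending intuition can be made concrete with a standard CRRA expected-utility model of the kind used in this experimental literature; the model choice and every parameter below are illustrative, not taken from the thesis.

```python
import math

def certainty_equivalent(outcomes, probs, r):
    """Certainty equivalent of a lottery under CRRA utility
    u(x) = x**(1-r)/(1-r) for r != 1 (log utility at r = 1).
    r > 0 is risk averse, r = 0 risk neutral, r < 0 risk seeking."""
    if r == 1:
        eu = sum(p * math.log(x) for x, p in zip(outcomes, probs))
        return math.exp(eu)
    eu = sum(p * x ** (1 - r) / (1 - r) for x, p in zip(outcomes, probs))
    return ((1 - r) * eu) ** (1 / (1 - r))

# Hypothetical lottery: residual asset value is 100 with probability 0.9,
# or 20 after a breach with probability 0.1.
outcomes, probs = [100.0, 20.0], [0.9, 0.1]
ce_averse  = certainty_equivalent(outcomes, probs, r=0.5)
ce_neutral = certainty_equivalent(outcomes, probs, r=0.0)
```

A risk-averse manager's certainty equivalent sits below the expected value, so such a manager will rationally pay more than the actuarially fair price for a control; that gap is precisely the "overspending" the passage describes.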
From a research design perspective, the field of behavioural economics has made significant and recent
contributions to the empirical evaluation of psychological theories of decision making under uncertainty (Andersen, Harrison et al. 2007) and provides salient examples of lab experiments which can be used to
elicit and isolate a range of latent decision-making behaviours for choice under risk and uncertainty within
relatively controlled conditions versus those which might be obtainable in the field (Harrison and Rutström 2008). My research builds on recent work in the domain of information security control choice by 1) undertaking a series of lab experiments incorporating a stochastic model of a simulated information management system at risk, which supports the generation of observational data derived from a range of security control choice decisions under both risk and uncertainty (Baldwin, Beres et al. 2011); and 2) modelling the resulting decisional biases using structural models of choice under risk and uncertainty (ElGamal and Grether 1995; Harrison and Rutström 2009; Keane 2010). The research contribution consists of the novel integration of a model of stochastic system risk and domain-relevant structural utility modelling, using a mixed model specification for estimation of the latent decision-making behaviour. It is anticipated that the research results can be applied to the real-world problem of "tuning" quantitative information security risk management models to the decisional biases and characteristics of the decision maker (Abdellaoui and Munier 1998).
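A stochastic system-at-risk simulation of the kind described might, in its most minimal form, look like the following Monte Carlo sketch; the threat rate, control effectiveness, losses, and structure are all hypothetical stand-ins for the experimental model.

```python
import random

def simulate_total_cost(p_threat, control_eff, loss, control_cost,
                        trials=10000, seed=7):
    """Monte Carlo estimate of average total cost (control spend plus
    residual losses) for a simulated system protected by one control."""
    rng = random.Random(seed)   # fixed seed for reproducible experiments
    total = 0.0
    for _ in range(trials):
        cost = control_cost
        # A threat arrives, and the control fails to stop it.
        if rng.random() < p_threat and rng.random() > control_eff:
            cost += loss
        total += cost
    return total / trials

# Hypothetical parameters: 30% per-period threat rate, a control that
# blocks 80% of attacks, a loss of 100 units, and a control cost of 10.
avg_cost = simulate_total_cost(p_threat=0.3, control_eff=0.8,
                               loss=100.0, control_cost=10.0)
```

Repeating such runs across candidate controls yields the observational choice data that the structural utility models are then fitted to.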