Usability and Trust in Information Systems
The need for people to protect themselves and their assets is as old as humankind. People's physical safety and their possessions have always been at risk from deliberate attack or accidental damage. The advance of information technology means that many individuals, as well as corporations, have an additional range of physical (equipment) and electronic (data) assets that are at risk. Furthermore, the increased number and types of interactions in cyberspace have enabled new forms of attack on people and their possessions. Consider the grooming of minors in chat-rooms, or Nigerian email cons: minors were targeted by paedophiles before the creation of chat-rooms, and Nigerian criminals sent the same letters by physical mail or fax before there was email. But the technology has decreased the cost of many types of attack, or the degree of risk for the attackers. At the same time, cyberspace is still new to many people, which means they do not understand risks, or recognise the signs of an attack, as readily as they might in the physical world. The IT industry has developed a plethora of security mechanisms that could be used to mitigate risks or make attacks significantly more difficult. Currently, many people are either not aware of these mechanisms, or are unable or unwilling to use them. Security experts have taken to portraying people as "the weakest link" in their efforts to deploy effective security [e.g. Schneier, 2000]. However, recent research has revealed that at least some of the problem may be that security mechanisms are hard to use, or are ineffective. This review summarises current research on the usability of security mechanisms, and discusses options for increasing their usability and effectiveness.
Quantitative analysis of distributed systems
PhD Thesis
Computing Science addresses the security of real-life systems by using various security-oriented technologies (e.g., access control solutions and resource allocation strategies). These security technologies significantly increase the operational costs of the organisations in which systems are deployed, due to the highly dynamic, mobile and resource-constrained environments. As a result, the problem of designing user-friendly, secure and highly efficient information systems in such complex environments has become a major challenge for developers.
In this thesis, firstly, new formal models are proposed to analyse secure information flow in cloud computing systems. Then, the opacity of workflows in cloud computing systems is investigated, a threat model is built for cloud computing systems, and the information leakage in such systems is analysed. This study can help cloud service providers and cloud subscribers to analyse the risks to the security of their assets and to make security-related decisions.
Secondly, a procedure is established to quantitatively evaluate the costs and benefits of implementing information security technologies. In this study, a formal system model for data resources in a dynamic environment is proposed, which focuses on the location of different classes of data resources as well as of the users. Using such a model, the concurrent and probabilistic behaviour of the system can be analysed. Furthermore, efficient solutions are provided for the implementation of information security systems based on queueing theory and stochastic Petri nets. This part of the research can help information security officers to make well-judged information security investment decisions.
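The thesis's own queueing models are not reproduced in the abstract; as a minimal illustration of how queueing theory can inform a security investment decision, a standard M/M/1 response-time formula can be used to price the delay a security control adds against the breach cost it avoids. All rates and cost figures below are invented assumptions, not results from the thesis.

```python
# Hypothetical cost-benefit sketch: every rate and cost below is an
# illustrative assumption, not a figure from the thesis.

def mm1_response_time(arrival_rate, service_rate):
    """Mean time in an M/M/1 queue, W = 1 / (mu - lambda); requires mu > lambda."""
    if service_rate <= arrival_rate:
        raise ValueError("queue is unstable: service rate must exceed arrival rate")
    return 1.0 / (service_rate - arrival_rate)

def net_benefit(arrival_rate, base_rate, secured_rate,
                delay_cost, breach_cost, risk_reduction):
    """Expected breach-cost savings minus the cost of the extra queueing delay."""
    extra_delay = (mm1_response_time(arrival_rate, secured_rate)
                   - mm1_response_time(arrival_rate, base_rate))
    return breach_cost * risk_reduction - delay_cost * extra_delay * arrival_rate

# A security check slows service from 12 to 10 requests/s at 8 requests/s load.
benefit = net_benefit(arrival_rate=8.0, base_rate=12.0, secured_rate=10.0,
                      delay_cost=1.0, breach_cost=500.0, risk_reduction=0.01)
print(benefit > 0)  # invest only if the expected savings outweigh the delay cost
```

The same pattern extends to the richer stochastic Petri net models the thesis describes: the control's performance penalty and its risk reduction are expressed in a common cost unit and compared.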
Attack graph approach to dynamic network vulnerability analysis and countermeasures
A thesis submitted to the University of Bedfordshire, in partial fulfilment of the requirements for the degree of Doctor of Philosophy.
It is widely accepted that modern computer networks (often presented as a heterogeneous collection of functioning organisations, applications, software, and hardware) contain vulnerabilities. This research proposes a new methodology to compute a dynamic severity cost for each state. Here a state refers to the behaviour of a system during an attack; an example of a state is where an attacker could influence the information on an application to alter the credentials. This is performed by utilising a modified variant of the Common Vulnerability Scoring System (CVSS), referred to as a Dynamic Vulnerability Scoring System (DVSS). This calculates scores of intrinsic, time-based, and ecological metrics by combining related sub-scores and modelling the problem's parameters into a mathematical framework to develop a unique severity cost.
The inherently static nature of CVSS affects the scoring value, so the author has developed a novel model to produce a DVSS metric that is more precise and efficient.
In this approach, the final scores are computed from a number of parameters, including the network architecture, device settings, and the impact of vulnerability interactions.
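The DVSS itself is defined in the thesis; for context, the static CVSS v2 base score that it extends combines impact and exploitability sub-scores according to the published v2 equations, which can be sketched as follows (the metric weights are those of the CVSS v2 specification):

```python
# CVSS v2 base-score computation (per the published v2 specification).
# DVSS, described in the thesis, extends such static scores with
# time-based and environment-dependent parameters.

def cvss2_base(conf, integ, avail, access_vector, access_complexity, authentication):
    """Each argument is the numeric weight CVSS v2 assigns to that metric value."""
    impact = 10.41 * (1 - (1 - conf) * (1 - integ) * (1 - avail))
    exploitability = 20 * access_vector * access_complexity * authentication
    f_impact = 0.0 if impact == 0 else 1.176
    return round((0.6 * impact + 0.4 * exploitability - 1.5) * f_impact, 1)

# AV:N/AC:L/Au:N/C:C/I:C/A:C -- the classic worst-case vector scores 10.0
print(cvss2_base(0.660, 0.660, 0.660, 1.0, 0.71, 0.704))  # -> 10.0
```

A dynamic variant such as DVSS replaces fixed weights like these with values that depend on network architecture, device settings and vulnerability interactions.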
An attack graph (AG) is a security model representing the chains of vulnerability exploits in a network. A number of researchers have acknowledged the visual complexity of attack graphs and the lack of in-depth understanding they support. Current attack graph tools are constrained to limited attributes or even rely on hand-generated input. The automatic assembly of vulnerability information has been troublesome, and vulnerability descriptions are frequently created by hand or based on limited data. The network architectures and configurations, along with the interactions between the individual vulnerabilities, are considered in the method of computing the cost using the DVSS and a dynamic cost-centric framework.
A new methodology was developed to present an attack graph with a dynamic cost metric based on DVSS, and a novel methodology to estimate and represent the cost-centric approach for each host's states was carried out.
The framework is evaluated on a test network, using the Nessus scanner to detect known vulnerabilities, and these results are used to build and represent the dynamic cost-centric attack graph using ranking algorithms (in a fashion comparable to Mehta et al., 2006 and Kijsanayothin, 2010). However, instead of ranking the vulnerabilities of each host, a CostRank Markov model has been developed using a novel cost-centric approach, thereby reducing the complexity of the attack graph and easing the problem of visibility.
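The CostRank computation itself is specified in the thesis; as a hedged illustration of the underlying idea, ranking the states of a cost-weighted Markov chain can be sketched with standard power iteration in the style of PageRank. The state names, arc weights and damping factor below are invented for illustration only.

```python
# Power-iteration sketch of ranking states in a cost-weighted Markov chain.
# The graph, weights and damping factor are illustrative assumptions,
# not the CostRank parameters from the thesis.

def rank_states(transitions, damping=0.85, iterations=100):
    """Rank graph states by stationary visit probability (PageRank-style)."""
    states = sorted(transitions)
    n = len(states)
    rank = {s: 1.0 / n for s in states}
    for _ in range(iterations):
        new = {s: (1 - damping) / n for s in states}
        for s in states:
            out = transitions[s]
            total = sum(out.values())
            for target, weight in out.items():
                # Heavier (costlier) arcs channel more rank to their target state.
                new[target] += damping * rank[s] * weight / total
        rank = new
    return rank

# Tiny attack-state graph: arc weights model the cost/severity of each action.
graph = {
    "foothold": {"escalate": 3.0, "scan": 1.0},
    "scan": {"escalate": 1.0},
    "escalate": {"exfiltrate": 1.0},
    "exfiltrate": {"foothold": 1.0},  # restart arc keeps the chain irreducible
}
ranks = rank_states(graph)
print(max(ranks, key=ranks.get))  # the state attracting the most weighted rank
```

A parallel implementation, as the thesis pursues, partitions the graph and the ranking vector across workers; each iteration is a sparse matrix-vector product, which is what makes the computation scalable to millions of arcs.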
An analogous parallel algorithm is developed to implement CostRank. The reason for developing a parallel CostRank algorithm is to expedite the state-ranking calculations for increasing numbers of hosts and/or vulnerabilities. In the same way, the author intends to secure large-scale networks that require fast and reliable computing to calculate the ranking of enormous graphs with thousands of vertices (states) and millions of arcs (each representing an action to move from one state to another). The proposed approach focuses on a parallel CostRank computational architecture to appraise the enhancement in CostRank calculations and the scalability of the algorithm. In particular, partitioning the input data, graph files and ranking vectors with a load-balancing technique can enhance the performance and scalability of parallel CostRank computations.
A practical model of the analogous parallel CostRank calculation is undertaken, resulting in a substantial decrease in communication overhead and in iteration time. The results are presented analytically in terms of scalability, efficiency, memory usage, speed-up and input/output rates.
Finally, a countermeasures model is developed to protect against network attacks by using a Dynamic Countermeasures Attack Tree (DCAT). The following scheme is used to build the DCAT: (i) use the scalable parallel CostRank algorithm to determine the critical assets that system administrators need to protect; (ii) run the Nessus scanner to determine the vulnerabilities associated with each asset, using the dynamic cost-centric framework and DVSS; (iii) identify all published mitigations for those vulnerabilities; (iv) assess how well each security solution mitigates the risks; (v) assess the DCAT algorithm in terms of effective security cost, probability and cost/benefit analysis to reduce the total impact of a specific vulnerability.
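The DCAT construction above is the thesis's own; a minimal sketch of the cost/benefit triage in step (v), with entirely hypothetical mitigation names, costs and risk figures, could look like this:

```python
# Hypothetical cost/benefit triage for countermeasures: every name and
# number below is an invented illustration, not data from the DCAT study.

def select_countermeasures(candidates):
    """Keep mitigations whose expected risk reduction exceeds their cost,
    ordered by net benefit (reduction - cost), highest first."""
    worthwhile = [
        (c["name"], c["breach_cost"] * c["risk_reduction"] - c["cost"])
        for c in candidates
    ]
    return sorted(
        [(name, benefit) for name, benefit in worthwhile if benefit > 0],
        key=lambda pair: pair[1],
        reverse=True,
    )

candidates = [
    {"name": "patch-web-server", "cost": 5.0,
     "breach_cost": 1000.0, "risk_reduction": 0.02},
    {"name": "rotate-credentials", "cost": 1.0,
     "breach_cost": 1000.0, "risk_reduction": 0.005},
    {"name": "replace-legacy-app", "cost": 50.0,
     "breach_cost": 1000.0, "risk_reduction": 0.01},
]
for name, benefit in select_countermeasures(candidates):
    print(name, benefit)  # mitigations worth deploying, best first
```

In the full framework these inputs would come from steps (i)-(iv): asset criticality from CostRank, vulnerability severity from DVSS, and mitigation options from the published advisories.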
Theory of entropic security decay: The gradual degradation in effectiveness of commissioned security systems
As a quantitative auditing tool for Physical Protection Systems (PPS), the Estimated Adversary Sequence Interruption (EASI) model has been available for many years. Nevertheless, once a system's macro-state measure (Pi) has been commissioned against its defined threat using EASI, there must be a means of articulating its continued efficacy (steady state) or its degradation over time. The purpose of this multi-phase study was to develop the concept and define the term entropic security decay. Phase one presented documentary benchmarks for security decay. This phase was broken into three stages: stage one presented General Systems Theory (GST) as a systems benchmark for the study; stage two applied the writings from stage one to physical security; and stage three presented a benchmark for considering physical system decay. Phase two incorporated the pilot study towards validating the feasibility of undertaking the main study and refining the interview instrumentation. Phase three executed the main study, extracting and presenting security experts' (N=6) thoughts, feelings and experiences with the phenomenon of security decay. Phase four provided the interpretative analysis, responding to the study's research question.
Decentralized nation, solving the web identity crisis
The web of today, whether you prefer to call it web 2.0, web 3.0, web 5.0 or even the metaverse, is at a critical stage of evolution and challenge, largely centered around its crisis of identity. Like teenagers who cannot properly assess their reason for being and do not seem ready to take responsibility for their actions, we are constantly blaming the very system we are trying to get away from. To truly realize the benefits of innovation and technology, this crisis has to be resolved, not just through tactical solutions but through developments that enhance the sustainability of the web and its benefits. Significant strides are being made in the evolution of digital services enabled by technology, regulation, and the sheer pace of societal change. The journey to the decentralized web mirrors the convergence of the physical and digital worlds across all economies and is increasingly embracing the digital-native world. Technology has provided the foundational platform for individuals and entities to create and manage wealth, potentially without the need for big institutions. Ironically, despite all of these advancements, we are still facing an unprecedented and increasing wealth gap. Clearly, the system is broken, not just around the edges but at the very core of the democratic underpinning of our society. In this whitepaper, we propose how artificial intelligence on blockchain can be used to generate a new class of identity through direct human-computer interaction. We demonstrate how this, combined with new perspectives for sustaining community and governance embedded within the use of blockchain technology, will underpin a sustainable solution to protect identity, authorship and privacy, while contributing to restoring trust amongst members of a future decentralized nation and hence to solving the web's most significant identity crisis.
Comment: 11 pages, 1 figure