Towards a Network-based Approach for Smartphone Security
Smartphones have become an important utility that affects many aspects of our daily life. Due to their wide dissemination and the tasks that are performed with them, they have also become a valuable target for criminals. Their specific capabilities and the way they are used introduce new threats in terms of information security. The research field of smartphone security has gained a lot of momentum in the past eight years. Approaches presented so far focus on investigating design flaws of smartphone operating systems as well as their potential misuse by an adversary. Countermeasures are often realized as extensions to the operating system itself, following a host-based design approach. However, there is a lack of network-based mechanisms that allow secure integration of smartphones into existing IT infrastructures. This topic is especially relevant for companies whose employees use smartphones for business tasks. This thesis presents a novel, network-based approach for smartphone security called CADS: Context-related Signature and Anomaly Detection for Smartphones. It determines the security status of smartphones by analyzing three aspects: (1) their current configuration in terms of installed software and available hardware, (2) their behavior and (3) the context they are currently used in. Depending on the determined security status, enforcement actions can be defined to allow or deny access to services provided by the respective IT infrastructure. The approach is based upon the distributed collection and central analysis of data about smartphones. In contrast to other approaches, it explicitly supports leveraging existing security services for both analysis and enforcement purposes. A proof of concept is implemented based upon the IF-MAP protocol for network security and the Google Android platform.
An evaluation verifies (1) that the CADS approach is able to detect so-called sensor sniffing attacks and (2) that reactions can be triggered based on detection results to counter ongoing attacks. Furthermore, it is demonstrated that the functionality of an existing, host-based approach that relies on modifications of the Android smartphone platform can be mimicked by the CADS approach. The advantage of CADS is that it does not need any modifications of the Android platform itself.
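The decision logic described above — combining configuration, behavior and context into a security status that drives enforcement — can be sketched as follows. All names here (SmartphoneReport, the context and sensor sets) are illustrative assumptions, not part of the published CADS implementation:

```python
# Hypothetical sketch of a CADS-style security status evaluation.
# The data model and policy values are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class SmartphoneReport:
    installed_apps: set   # configuration: installed software
    sensors_active: set   # behavior: sensors currently in use
    context: str          # e.g. "office", "meeting_room", "public"

# Deny infrastructure access when a microphone or camera is active in a
# sensitive context -- the kind of sensor-sniffing scenario the thesis evaluates.
SENSITIVE_CONTEXTS = {"meeting_room", "office"}
RISKY_SENSORS = {"microphone", "camera"}

def security_status(report: SmartphoneReport) -> str:
    if report.context in SENSITIVE_CONTEXTS and report.sensors_active & RISKY_SENSORS:
        return "deny"     # enforcement: block access to infrastructure services
    return "allow"

report = SmartphoneReport({"mail", "maps"}, {"microphone"}, "meeting_room")
print(security_status(report))  # -> deny
```

In the actual approach the reports would be collected in a distributed fashion (e.g. via IF-MAP) and analyzed centrally; the sketch only shows the status decision itself.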
Assuring Safety and Security
Large technological systems produce new capabilities that allow innovative solutions to social, engineering and environmental problems. This trend is especially important in the safety-critical systems (SCS) domain where we simultaneously aim to do more with the systems whilst reducing the harm they might cause. Even with the increased uncertainty created by these opportunities, SCS still need to be assured against safety and security risk and, in many cases, certified before use.
A large number of approaches and standards have emerged; however, challenges remain related to technical risk, such as identifying inter-domain risk interactions, developing safety-security causal models, and understanding the impact of new risk information. In addition, there are socio-technical challenges that undermine technical risk activities and act as a barrier to co-assurance; these include insufficient processes for risk acceptance, unclear responsibilities, and a lack of legal, regulatory and organisational structure to support safety-security alignment. A new approach is required.
The Safety-Security Assurance Framework (SSAF) is proposed here as a candidate solution. SSAF is based on the new paradigm of independent co-assurance, that is, keeping the disciplines separate but having synchronisation points where required information is exchanged. SSAF comprises three parts: the Conceptual Model defines the underlying philosophy, while the Technical Risk Model (TRM) and Socio-Technical Model (STM) consist of processes and models for the technical risk and socio-technical aspects of co-assurance, respectively. Findings from a partial evaluation of SSAF using case studies reveal that the approach has some utility in creating inter-domain relationship models and identifying socio-technical gaps for co-assurance.
The original contribution to knowledge presented in this thesis is the novel approach to co-assurance that uses synchronisation points, an explicit technical risk argument that argues over interaction risks, and a confidence argument that explicitly considers the socio-technical factors of co-assurance.
Spatio-Temporal Analysis of Crime Incidents for Forensic Investigation
Crime analysis and mapping have been routinely employed to gather intelligence which informs security efforts and forensic investigations. Traditionally, geographic information systems in the form of third-party mapping applications are used for analysis of crime data but are often expensive and lack flexibility, transparency, or efficiency in uncovering associations and relationships in crime. Each crime incident, and each article of evidence within that incident, has an associated spatial and temporal component which may yield significant and relevant information to the case. Wide variations exist in the techniques that departments use, and commonly the spatial and temporal components of crime are evaluated independently, if at all. Thus, there is a critical need to develop and implement spatio-temporal investigative strategies so police agencies can gain a foundational understanding of crime occurrence within their jurisdiction, develop strategic action for disruption and resolution of crime, conduct more informed investigations, better utilize resources, and provide an overall more effective service.
The purpose of this project was to provide foundational knowledge to the investigative and security communities and demonstrate the utility of empirical spatio-temporal methods for the assessment and interpretation of crime incidents. Two software packages were developed as an open source (R) solution to expand current techniques and provide an implementable spatio-temporal methodology for crime analysis. Additionally, an actionable method for near repeat analysis was developed. Firstly, the premise of the near repeat phenomenon was evaluated across crime types and cities to discern optimal parameters for spatial and temporal bandwidths. Using these parameters, a method for identifying near repeat series was developed which draws inter-incident linkages given the spatio-temporal clustering of the incidents. Resultant crime networks and maps provide insight regarding near repeat crime incidents within the landscape of their jurisdiction for targeted investigation. Finally, a new approach to the geographic profiling problem was developed which assesses and integrates the travel environment of road networks, beliefs and assumptions formed through the course of the investigation process about the perpetrator, and information derived from the analysis of evidence. Each piece of information is evaluated in conjunction with spatio-temporal routing functions and then used to update prior beliefs about the anchor point of the perpetrator. Adopting spatio-temporal methodologies for the investigation of crime offers a new framework for forensic operations. Systematic consideration of the value and implications of the relationship between space, time, and crime was shown to provide actionable insight. In a forward-looking sense, this work shows that the interpretation of crime within a spatio-temporal context can provide insight into crime occurrence, linkage of crime incidents, and investigations of those incidents.
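The core of near repeat linkage — connecting incidents that fall within both a spatial and a temporal bandwidth — can be sketched in a few lines. The bandwidth values and incident data below are hypothetical placeholders, not the parameters derived in the thesis (which were estimated empirically per crime type and city):

```python
# Illustrative near repeat linkage: two incidents are linked when they are
# close in both space and time. Bandwidths and data are assumed values.
from math import hypot

SPATIAL_BANDWIDTH = 400.0   # metres (assumed)
TEMPORAL_BANDWIDTH = 7      # days (assumed)

incidents = [  # (id, x, y, day)
    ("A", 0.0, 0.0, 0),
    ("B", 150.0, 200.0, 3),
    ("C", 5000.0, 100.0, 4),
]

def near_repeat_pairs(incidents):
    links = []
    for i, (id1, x1, y1, t1) in enumerate(incidents):
        for id2, x2, y2, t2 in incidents[i + 1:]:
            close_in_space = hypot(x2 - x1, y2 - y1) <= SPATIAL_BANDWIDTH
            close_in_time = abs(t2 - t1) <= TEMPORAL_BANDWIDTH
            if close_in_space and close_in_time:
                links.append((id1, id2))
    return links

print(near_repeat_pairs(incidents))  # -> [('A', 'B')]
```

The resulting pairs form the edges of the crime networks the abstract describes; chaining linked pairs yields near repeat series for targeted investigation.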
Trusted Computing and Secure Virtualization in Cloud Computing
Large-scale deployment and use of cloud computing in industry is accompanied, and at the same time hampered, by concerns regarding the protection of data handled by cloud computing providers. One consequence of moving data processing and storage off company premises is that organizations have less control over their infrastructure. As a result, cloud service (CS) clients must trust that the CS provider is able to protect their data and infrastructure from both external and internal attacks. Currently, however, such trust can only rely on organizational processes declared by the CS provider and cannot be remotely verified and validated by an external party. Enabling the CS client to verify the integrity of the host where the virtual machine instance will run, as well as to ensure that the virtual machine image has not been tampered with, are steps towards building trust in the CS provider. Having the tools to perform such verifications prior to the launch of the VM instance allows CS clients to decide at runtime whether certain data should be stored, or calculations performed, on the VM instance offered by the CS provider.
This thesis combines three components -- trusted computing, virtualization technology and cloud computing platforms -- to address issues of trust and security in public cloud computing environments. Of the three components, virtualization technology has had the longest evolution and is a cornerstone for the realization of cloud computing. Trusted computing is a recent industry initiative that aims to implement the root of trust in a hardware component, the trusted platform module. The initiative has been formalized in a set of specifications and is currently at version 1.2. Cloud computing platforms pool virtualized computing, storage and network resources to serve a large number of customers in a multi-tenant model, offering on-demand self-service over broad network access. Open source cloud computing platforms are, similar to trusted computing, a fairly recent technology in active development.
The issue of trust in public cloud environments is addressed by examining the state of the art within cloud computing security and subsequently addressing the issue of establishing trust in the launch of a generic virtual machine in a public cloud environment. As a result, the thesis proposes a trusted launch protocol that allows CS clients to verify and ensure the integrity of the VM instance at launch time, as well as the integrity of the host where the VM instance is launched. The protocol relies on the Trusted Platform Module (TPM) for key generation and data protection. The TPM also plays an essential part in the integrity attestation of the VM instance host. Along with a theoretical, platform-agnostic protocol, the thesis also describes a detailed implementation design of the protocol using the OpenStack cloud computing platform.
To verify the implementability of the proposed protocol, a prototype implementation has been built using a distributed deployment of OpenStack. While the protocol covers only the trusted launch procedure for generic virtual machine images, it represents a step towards the creation of a secure and trusted public cloud computing environment.
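The two integrity checks at the heart of a trusted launch — an untampered VM image and an attested host — can be sketched as a simplified decision function. Real TPM attestation involves signed PCR quotes, nonces and a verification of the signing key's provenance; this toy model, with made-up "known good" values, only illustrates the decision logic:

```python
# Hedged sketch of the core checks in a trusted VM launch. The digests and
# the flat string "quote" are simplifying assumptions; an actual protocol
# verifies a signed TPM quote over platform configuration registers (PCRs).
import hashlib

KNOWN_GOOD_IMAGE = hashlib.sha256(b"approved-vm-image").hexdigest()
KNOWN_GOOD_HOST_PCRS = hashlib.sha256(b"trusted-host-config").hexdigest()

def verify_launch(image_bytes: bytes, host_quote: str) -> bool:
    image_ok = hashlib.sha256(image_bytes).hexdigest() == KNOWN_GOOD_IMAGE
    host_ok = host_quote == KNOWN_GOOD_HOST_PCRS
    return image_ok and host_ok  # launch only on an attested host with an untampered image

print(verify_launch(b"approved-vm-image", KNOWN_GOOD_HOST_PCRS))  # -> True
print(verify_launch(b"tampered-image", KNOWN_GOOD_HOST_PCRS))     # -> False
```

In the thesis's setting these checks run before instance launch, so the CS client can refuse to place data or computation on a host that fails attestation.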
A Comparison of Clustering Techniques for Malware Analysis
In this research, we apply clustering techniques to the malware detection problem. Our goal is to classify malware as part of a fully automated detection strategy. We compute clusters using the well-known K-means and EM clustering algorithms, with scores obtained from Hidden Markov Models (HMM). Previous work in this area combined HMM scores with K-means clustering for the same purpose. The current effort extends that approach to use EM clustering for detection, and compares this technique with K-means clustering.
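The pipeline — score samples with an HMM, then cluster the scores — can be illustrated with a minimal 1-D K-means over hypothetical log-likelihood scores. In practice one would use library implementations (e.g. scikit-learn's KMeans, and GaussianMixture for the EM variant) on real malware scores; everything below is a made-up toy:

```python
# Toy sketch of the clustering step: a minimal 1-D two-means over
# hypothetical HMM log-likelihood scores. Scores and labels are assumed.

def kmeans_1d(scores, iters=20):
    c0, c1 = min(scores), max(scores)  # simple extremes-based initialisation
    for _ in range(iters):
        near0 = [s for s in scores if abs(s - c0) <= abs(s - c1)]
        near1 = [s for s in scores if abs(s - c0) > abs(s - c1)]
        c0, c1 = sum(near0) / len(near0), sum(near1) / len(near1)
    return [0 if abs(s - c0) <= abs(s - c1) else 1 for s in scores]

# Hypothetical scores: benign files score low against a malware-trained HMM,
# malware samples score high, so the clusters separate the two classes.
scores = [-120.5, -118.2, -119.9, -40.3, -42.1, -39.8]
print(kmeans_1d(scores))  # -> [0, 0, 0, 1, 1, 1]
```

The EM variant replaces the hard cluster assignment with Gaussian responsibilities, which is the comparison the abstract describes.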
Technologies for climate change adaptation: agricultural sector
This Guidebook presents a selection of technologies for climate change adaptation in the agricultural sector. A set of twenty-two adaptation technologies are showcased that are primarily based on the principles of agroecology, but also include scientific technologies of climate and biological sciences, complemented with the important sociological and institutional capacity-building processes that are required to make adaptation function. The technologies cover monitoring and forecasting the climate, sustainable water use and management, soil management, sustainable crop management, seed conservation, sustainable forest management and sustainable livestock management.
Technologies that tend to homogenize the natural environment and agricultural production have low possibilities of success in conditions of environmental stress that are likely to result from climate change. On the other hand, technologies that allow for, and indeed promote, diversity are more likely to provide a strategy which strengthens agricultural production in the face of uncertain future climate change scenarios. In this sense, the twenty-two technologies showcased in this Guidebook have been selected because they facilitate the conservation and restoration of diversity while at the same time providing opportunities for increasing agricultural productivity. Many of these technologies are not new to agricultural production practices, but they are implemented based on assessment of current and possible future impacts of climate change in a particular location. Agroecology is an approach that encompasses concepts of sustainable production and biodiversity promotion and therefore provides a useful framework for identifying and selecting appropriate adaptation technologies for the agricultural sector.
The Guidebook provides a systematic analysis of the most relevant information available on climate change adaptation technologies in the agriculture sector. It has been compiled based on a literature review of key publications, journal articles, and e-platforms, and by drawing on documented experiences sourced from a range of organizations working on projects and programmes concerned with climate change adaptation technologies in the agricultural sector. Its geographic scope is focused on developing countries where high levels of poverty, agricultural production, climate variability and biological diversity currently intersect.
Key concepts around climate change adaptation are not universally agreed. It is therefore important to understand local contexts, especially social and cultural norms, when working with national and sub-national stakeholders to make informed decisions about appropriate technology options. Thus, decision-making processes should be participative, facilitated, and consensus-building oriented, and should be based on the following key guiding principles: increasing awareness and knowledge, strengthening institutions, protecting natural resources, providing financial assistance and developing context-specific strategies.
For decision-making, the Community-Based Adaptation framework is proposed for creating inclusive governance that engages a range of stakeholders directly with local or district government and national coordinating bodies, and facilitates participatory planning, monitoring and implementation of adaptation activities. Seven criteria are suggested for the prioritization of adaptation technologies: (i) the extent to which the technology maintains or strengthens biological diversity and is environmentally sustainable; (ii) the extent to which the technology facilitates access to information systems and awareness of climate change information; (iii) whether the technology supports water, carbon and nutrient cycles and enables stable and/or increased productivity; (iv) income-generating potential, cost-benefit analysis and contribution to improved equity; (v) respect for cultural diversity and facilitation of inter-cultural exchange; (vi) potential for integration into regional and national policies, and for being scaled up; (vii) the extent to which the technology builds formal and informal institutions and social networks.
Finally, recommendations are set out for practitioners and policy makers:
• There is an urgent need for improved climate modelling and forecasting which can provide a basis for informed decision-making and the implementation of adaptation strategies. This should include traditional knowledge.
• Information is also required to better understand the behaviour of plants, animals, pests and diseases as they react to climate change.
• Potential changes in economic and social systems in the future under different climate scenarios should also be investigated so that the implications of adaptation strategy and planning choices are better understood.
• It is important to secure effective flows of information through appropriate dissemination channels. This is vital for building adaptive capacity and decision-making processes.
• Improved analysis of adaptation technologies is required to show how they can contribute to building adaptive capacity and resilience in the agricultural sector. This information needs to be compiled and disseminated for a range of stakeholders from local to national level.
• Relationships between policy makers, researchers and communities should be built so that technologies and planning processes are developed in partnership, responding to producers’ needs and integrating their knowledge.
Classical Cryptographic Protocols in a Quantum World
Cryptographic protocols, such as protocols for secure function evaluation
(SFE), have played a crucial role in the development of modern cryptography.
The extensive theory of these protocols, however, deals almost exclusively with
classical attackers. If we accept that quantum information processing is the
most realistic model of physically feasible computation, then we must ask: what
classical protocols remain secure against quantum attackers?
Our main contribution is showing the existence of classical two-party
protocols for the secure evaluation of any polynomial-time function under
reasonable computational assumptions (for example, it suffices that the
learning with errors problem be hard for quantum polynomial time). Our result
shows that the basic two-party feasibility picture from classical cryptography
remains unchanged in a quantum world.
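The computational assumption underpinning the result — hardness of learning with errors (LWE) for quantum polynomial time — can be made concrete with a toy sample generator. Distinguishing pairs (a, ⟨a,s⟩ + e mod q) from uniform pairs is believed hard even for quantum algorithms. The parameters below are tiny and purely illustrative, far below any secure size:

```python
# Toy LWE sample generator, illustrating the assumption only. The modulus,
# dimension and noise distribution are made-up toy values, not secure parameters.
import random

q, n = 97, 8                                        # toy modulus and dimension (assumed)
random.seed(0)
s = [random.randrange(q) for _ in range(n)]         # secret vector

def lwe_sample():
    a = [random.randrange(q) for _ in range(n)]     # public uniformly random vector
    e = random.choice([-1, 0, 1])                   # small noise term
    b = (sum(ai * si for ai, si in zip(a, s)) + e) % q
    return a, b

a, b = lwe_sample()
# Only a holder of s can strip the inner product and recover the small noise:
residual = (b - sum(ai * si for ai, si in zip(a, s))) % q
print(residual in {0, 1, q - 1})  # -> True
```

Without s, the pair (a, b) looks essentially uniform, which is what lets such protocols resist quantum attackers under this assumption.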
Critical review of the e-loyalty literature: a purchase-centred framework
Over the last few years, the concept of online loyalty has been examined extensively in the literature, and it remains a topic of constant inquiry for both academics and marketing managers. The tremendous development of the Internet for both marketing and e-commerce settings, in conjunction with the growing desire of consumers to purchase online, has promoted two main outcomes: (a) increasing numbers of Business-to-Customer companies running businesses online and (b) the development of a variety of different e-loyalty research models. However, current research lacks a systematic review of the literature that provides a general conceptual framework on e-loyalty, which would help managers to understand their customers better, to take advantage of industry-related factors, and to improve their service quality. The present study is an attempt to critically synthesize results from multiple empirical studies on e-loyalty. Our findings illustrate that 62 instruments for measuring e-loyalty are currently in use, influenced predominantly by Zeithaml et al. (J Marketing. 1996;60(2):31-46) and Oliver (1997; Satisfaction: a behavioral perspective on the consumer. New York: McGraw Hill). Additionally, we propose a new general conceptual framework, which divides the antecedents of e-loyalty, on the basis of the act of purchase, into pre-purchase, during-purchase and after-purchase factors. To conclude, a number of managerial implications are suggested in order to help marketing managers increase their customers’ e-loyalty by making crucial changes in each purchase stage.