7,884 research outputs found
Link Before You Share: Managing Privacy Policies through Blockchain
With the advent of numerous online content providers, utilities and
applications, each with their own specific version of privacy policies and its
associated overhead, it is becoming increasingly difficult for concerned users
to manage and track the confidential information that they share with the
providers. Users consent to providers to gather and share their Personally
Identifiable Information (PII). We have developed a novel framework to
automatically track details about how a user's PII data is stored, used and
shared by the provider. We have integrated our Data Privacy ontology with the
properties of blockchain, to develop an automated access control and audit
mechanism that enforces users' data privacy policies when sharing their data
across third parties. We have also validated this framework by implementing a
working system, LinkShare. In this paper, we describe our framework in detail
along with the LinkShare system. Our approach can be adopted by Big Data users
to automatically apply their privacy policy on data operations and track the
flow of that data across various stakeholders.
Comment: 10 pages, 6 figures. Published in: 4th International Workshop on
Privacy and Security of Big Data (PSBD 2017) in conjunction with 2017 IEEE
International Conference on Big Data (IEEE BigData 2017) December 14, 2017,
Boston, MA, US
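The abstract describes an automated access control and audit mechanism built on blockchain properties, but gives no implementation details. The tamper-evidence idea it borrows can be sketched as a minimal hash-linked audit log of operations on a user's PII; the `AuditChain` class, provider names and fields below are all illustrative assumptions, not taken from the LinkShare system:

```python
import hashlib
import json

class AuditChain:
    """Minimal hash-linked log of data operations on a user's PII.

    Each entry embeds the hash of the previous entry, so any retroactive
    edit breaks the chain -- the tamper-evidence property borrowed from
    blockchain. A real system would distribute and sign these entries.
    """

    def __init__(self):
        self.entries = []

    def _hash(self, body):
        return hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()

    def record(self, provider, action, data_field):
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"provider": provider, "action": action,
                "field": data_field, "prev": prev}
        self.entries.append({**body, "hash": self._hash(body)})

    def verify(self):
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if e["prev"] != prev or self._hash(body) != e["hash"]:
                return False
            prev = e["hash"]
        return True

log = AuditChain()
log.record("provider_a", "store", "email")
log.record("provider_b", "share", "email")
assert log.verify()
log.entries[0]["action"] = "delete"   # retroactive tampering...
assert not log.verify()               # ...is detected
```

Because every entry commits to its predecessor's hash, an auditor can detect any rewrite of the sharing history without trusting the provider that stores the log.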
Automation of Information Security Requirements Harmonization, Analysis and Evaluation
The growing use of Information Technology (IT) in the daily operations of enterprises requires an ever-increasing level of protection of an organization's assets and information from unauthorised access, data leakage or any other type of information security breach. It therefore becomes vital to ensure the necessary level of protection. One of the best ways to achieve this goal is to implement the controls defined in Information security documents. A problem faced by many organizations is that they are often required to be aligned with multiple Information security documents and their requirements. Currently, the protection of an organization's assets and information rests on an Information security specialist's knowledge, skills and experience. The lack of automated tools for harmonizing, analysing and visualizing multiple Information security documents and their requirements leads to situations in which Information security is implemented in ineffective ways, causing duplication of controls or increased cost of security implementation. An automated approach to the analysis, mapping and visualization of Information security documents would contribute to solving this issue. The dissertation consists of an introduction, three main chapters and general conclusions. The first chapter introduces existing Information security regulatory documents, current harmonization techniques, methods for evaluating the cost of information security implementation, and ways to analyse Information security requirements by applying graph-theoretic optimisation algorithms (vertex cover and graph isomorphism). The second chapter proposes ways to evaluate information security implementation and its costs through a controls-based approach. The effectiveness of this method could be improved by automating the gathering of initial data from business process diagrams.
In the third chapter, adaptive mapping based on a Security ontology is introduced for the harmonization of different security documents; this approach also allows visualization techniques to be applied in presenting harmonization results. Graph optimization algorithms (the vertex cover algorithm and the graph isomorphism algorithm) were proposed for Minimum Security Baseline identification and for verification of the achieved results against controls implemented in small and medium-sized enterprises. It was concluded that the proposed methods provide sufficient data for the adjustment and verification of security controls required by multiple Information security documents.
Dissertation
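The dissertation names vertex cover for Minimum Security Baseline identification without detailing which variant is used. A standard greedy 2-approximation illustrates the idea: treat security controls as vertices, let each edge represent a requirement that two controls jointly address, and pick a small control set touching every requirement. The control identifiers below are invented for illustration, not taken from the work:

```python
def vertex_cover_2approx(edges):
    """Greedy 2-approximation of minimum vertex cover.

    Repeatedly take both endpoints of any uncovered edge; the result is
    at most twice the optimum cover size. Here a vertex is a security
    control and an edge is a requirement the two controls both satisfy.
    """
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))   # cover this requirement via both controls
    return cover

# Hypothetical requirement graph over controls from two security documents.
requirements = [("ISO-A.9.1", "NIST-AC-3"),
                ("ISO-A.12.4", "NIST-AU-2"),
                ("NIST-AC-3", "ISO-A.12.4")]
baseline = vertex_cover_2approx(requirements)
assert all(u in baseline or v in baseline for u, v in requirements)
```

A minimal baseline computed this way covers every cross-document requirement while keeping the number of distinct controls small, which is the duplication-reduction goal the dissertation describes.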
Digital Peacekeepers, Drone Surveillance and Information Fusion: A Philosophical Analysis of New Peacekeeping
In June 2014 an Expert Panel on Technology and Innovation in UN Peacekeeping was commissioned to examine how technology and innovation could strengthen peacekeeping missions. The panel's report argues for wider deployment of advanced technologies, including greater use of ground and airborne sensors and other technical sources of data, advanced data analytics and information fusion to assist in data integration. This article explores the emerging intelligence-led, informationist conception of UN peacekeeping against the backdrop of increasingly complex peacekeeping mandates and precarious security conditions. New peacekeeping with its heightened commitment to information as a political resource and the endorsement of offensive military action within robust mandates reflects the multiple and conflicting trajectories generated by asymmetric conflicts, the responsibility to protect and a technology-driven information revolution. We argue that the idea of peacekeeping is being revised (and has been revised) by realities beyond peacekeeping itself that require rethinking the morality of peacekeeping in light of the emergence of 'digital peacekeeping' and the knowledge revolution engendered by new technologies.
Equipment-as-Experience: A Heidegger-Based Position of Information Security
Information security (InfoSec) has ontologically been characterised as an order machine. The order machine connects with other machines through interrupting mechanisms. This way of portraying InfoSec focuses on the correct placement of machine entities to protect information assets. However, what is missing in this view is that in the InfoSec we experience in everyday practice, we are not just observers of the InfoSec phenomenon but also active agents in it. To contribute to this quest, we draw on Heidegger's (1962) notion of equipment and propose the concept of equipment-as-experience to understand the ontological position of InfoSec in everyday practice. In this paper we show how equipment-as-experience provides a richer picture of InfoSec as a fundamental sociotechnical phenomenon. Using an example case, we further contend that InfoSec equipment should not be understood merely by its properties (present-at-hand mode), but rather in ready-to-hand mode when put into practice.
Big data architecture for pervasive healthcare: a literature review
Pervasive healthcare aims to deliver deinstitutionalised healthcare services to patients anytime and anywhere. Pervasive healthcare involves remote data collection through mobile devices and sensor networks, where the data is usually in large volume, varied formats and high frequency. The nature of big data, characterised by volume, variety, velocity and veracity, together with its analytical capabilities, complements the delivery of pervasive healthcare. However, there is limited research intertwining these two domains. Most research focuses mainly on the technical context of big data application in the healthcare sector. Little attention has been paid to the strategic role of big data, which impacts the quality of healthcare service provision at the organisational level. Therefore, this paper delivers a conceptual view of big data architecture for pervasive healthcare via an intensive literature review to address the aforementioned research problems. This paper provides three major contributions: 1) it identifies the research themes of big data and pervasive healthcare, 2) it establishes the relationships between research themes, which together compose the big data architecture for pervasive healthcare, and 3) it sheds light on future research, such as semiosis and sense-making, and enables practitioners to implement big data in pervasive healthcare through the proposed architecture.
An Approach for Managing Access to Personal Information Using Ontology-Based Chains
The importance of electronic healthcare has caused numerous
changes in both substantive and procedural aspects of healthcare
processes. These changes have produced new challenges to patient
privacy and information secrecy. Traditional privacy policies cannot
respond to rapidly increased privacy needs of patients in electronic
healthcare. Technically enforceable privacy policies are needed in
order to protect patient privacy in modern healthcare with its cross
organisational information sharing and decision making.
This thesis proposes a personal information flow model that specifies
a limited number of acts on this type of information.
Ontology-classified chains of these acts can be used instead of the
"intended/business purposes" used in privacy access control to
seamlessly imbue current healthcare applications and their
supporting infrastructure with security and privacy functionality. In
this thesis, we first introduce an integrated basic architecture, design
principles, and implementation techniques for privacy-preserving
data mining systems. We then discuss the key methods of
privacy-preserving data mining systems, which include four main methods:
Role-based access control (RBAC), the Hippocratic database, the Chain
method and the eXtensible Access Control Markup Language (XACML).
We found that the traditional methods suffer from two main
problems: complexity of privacy policy design and the lack of context
flexibility that is needed while working in critical situations such as the
one we find in hospitals. We present and compare strategies for
realising these methods. Theoretical analysis and experimental
evaluation show that our new method can generate accurate data
mining models and safe data access management while protecting
the privacy of the data being mined. The experiments were
comparative in design, first showing the ease of policy design and
then following real scenarios to show the context flexibility of the
investigated method in protecting personal information privacy.
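The thesis contrasts traditional methods such as RBAC and XACML with the context flexibility needed in critical situations like hospitals. A toy sketch of that contrast is an ordinary role-permission check extended with an emergency ("break-glass") context rule; the roles, permissions and emergency rule below are hypothetical illustrations, not the thesis's actual Chain method:

```python
# Minimal RBAC decision with a context override. All role names,
# permissions and the emergency rule are invented for illustration.

ROLE_PERMISSIONS = {
    "physician": {"read_record", "write_record"},
    "receptionist": {"read_schedule"},
}

def allowed(role, permission, context=None):
    # Ordinary RBAC decision: the role must hold the permission.
    if permission in ROLE_PERMISSIONS.get(role, set()):
        return True
    # Context override: in a declared emergency, staff may read records
    # they could not normally access (such access would still be audited).
    if (context == "emergency" and role == "receptionist"
            and permission == "read_record"):
        return True
    return False

assert allowed("physician", "write_record")
assert not allowed("receptionist", "read_record")
assert allowed("receptionist", "read_record", context="emergency")
```

Pure RBAC denies the third request outright; encoding the context in the decision function is what gives the flexibility the thesis argues traditional policy designs lack.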
IT Laws in the Era of Cloud-Computing
This book documents the findings and recommendations of research into the question of how IT laws should develop on the understanding that today's information and communication technology is shaped by cloud computing, which lies at the foundations of contemporary and future IT as its most widespread enabler. In particular, this study develops on both a comparative and an interdisciplinary axis, i.e. comparatively by examining EU and US law, and on an interdisciplinary level by dealing with law and IT. Focusing on the study of data protection and privacy in cloud environments, the book examines three main challenges on the road towards more efficient cloud computing regulation:
-understanding the reasons behind the development of diverging legal structures and schools of thought on IT law
-ensuring privacy and security in digital clouds
-converging regulatory approaches to digital clouds in the hope of more harmonised IT laws in the future
LiCo: A Lightweight Access Control Model for Inter-Networking Linkages
© 2013 IEEE. Processes in operating systems are assigned different privileges to access different resources. A process may invoke other processes whose privileges are different; thus, its privileges are expanded (or escalated) due to such improper 'inheritance.' Inter-networking can also occur between processes, either transitively or iteratively. This complicates the monitoring of inappropriate privilege assignment/escalation, which can result in information leakage. Such information leakage occurs due to privilege transitivity and inheritance and can be defined as a general access control problem for inter-networking linkages. This is also a topic that is generally less studied in existing access control models. Specifically, in this paper, we propose a lightweight directed graph-based model, LiCo, which is designed to facilitate the authorization of privileges among inter-networking processes. To the best of our knowledge, this is the first general access control model for inter-invoking processes and general inter-networking linkages.
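The escalation concern behind LiCo, privileges accumulating along invocation chains, can be illustrated with a plain reachability computation over a directed invocation graph. This is not the paper's algorithm; the process names and privilege labels are invented for illustration:

```python
from collections import deque

def reachable_privileges(invokes, privileges, start):
    """Effective privileges a process can obtain through invocation chains.

    `invokes` maps each process to the processes it may invoke; a process
    transitively "inherits" the privileges of everything reachable from
    it, which is the improper-inheritance pattern the paper describes.
    """
    seen = {start}
    queue = deque([start])
    acquired = set(privileges.get(start, ()))
    while queue:
        p = queue.popleft()
        for q in invokes.get(p, ()):       # follow each invocation edge
            if q not in seen:
                seen.add(q)
                acquired |= set(privileges.get(q, ()))
                queue.append(q)
    return acquired

# Hypothetical invocation graph: editor -> spooler -> driver.
invokes = {"editor": ["spooler"], "spooler": ["driver"]}
privileges = {"editor": {"read_file"}, "spooler": {"write_queue"},
              "driver": {"raw_device"}}
effective = reachable_privileges(invokes, privileges, "editor")
assert "raw_device" in effective   # escalation via transitive invocation
```

A monitor built on this view would flag any process whose transitively acquired privilege set exceeds what its own role should hold, which is the kind of inappropriate escalation LiCo aims to authorize explicitly rather than inherit implicitly.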
- …