CamFlow: Managed Data-sharing for Cloud Services
A model of cloud services is emerging whereby a few trusted providers manage
the underlying hardware and communications whereas many companies build on this
infrastructure to offer higher level, cloud-hosted PaaS services and/or SaaS
applications. From the start, strong isolation between cloud tenants was seen
to be of paramount importance, provided first by virtual machines (VMs) and
later by containers, which share the operating system (OS) kernel. Increasingly
it is the case that applications also require facilities to effect isolation
and protection of data managed by those applications. They also require
flexible data sharing with other applications, often across the traditional
cloud-isolation boundaries; for example, when government provides many related
services for its citizens on a common platform. Similar considerations apply to
the end-users of applications. But in particular, the incorporation of cloud
services within 'Internet of Things' architectures is driving the requirements
for both protection and cross-application data sharing.
These concerns relate to the management of data. Traditional access control
is application and principal/role specific, applied at policy enforcement
points, after which there is no subsequent control over where data flows; a
crucial issue once data has left its owner's direct control, being processed by
cloud-hosted applications and within cloud services. Information Flow Control (IFC), in
addition, offers system-wide, end-to-end, flow control based on the properties
of the data. We discuss the potential of cloud-deployed IFC for enforcing
owners' dataflow policy with regard to protection and sharing, as well as
safeguarding against malicious or buggy software. In addition, the audit log
associated with IFC provides transparency, giving configurable system-wide
visibility over data flows. [...] Comment: 14 pages, 8 figures
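The end-to-end flow control the abstract refers to can be illustrated with the classic secrecy/integrity label check used by IFC systems. The sketch below is a hedged toy model, not CamFlow's actual API; the entities and tag names are invented for illustration. A flow from a to b is safe only if b carries all of a's secrecy tags and a carries all of b's integrity tags.

```python
# Toy information-flow check: labels are sets of tags.
# A flow a -> b is safe iff S(a) ⊆ S(b) and I(b) ⊆ I(a).

from dataclasses import dataclass

@dataclass
class Entity:
    name: str
    secrecy: frozenset = frozenset()    # tags the data must not leak past
    integrity: frozenset = frozenset()  # tags vouching for the data's quality

def flow_allowed(src: Entity, dst: Entity) -> bool:
    """Safe iff the destination is at least as secret and at most as trusted."""
    return src.secrecy <= dst.secrecy and dst.integrity <= src.integrity

patient_record = Entity("record", secrecy=frozenset({"medical", "alice"}))
hospital_app   = Entity("hospital", secrecy=frozenset({"medical", "alice", "billing"}))
public_log     = Entity("log")  # empty labels: public and untrusted

print(flow_allowed(patient_record, hospital_app))  # True
print(flow_allowed(patient_record, public_log))    # False: secrecy would leak
```

Because the check depends only on the labels, not on which application initiates the flow, it applies system-wide rather than at per-application enforcement points.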
On Using Encryption Techniques to Enhance Sticky Policies Enforcement
How to enforce privacy policies to protect sensitive personal data has become an urgent research topic for security researchers, as very little has been done in this field apart from some ad hoc research efforts. The sticky policy paradigm, proposed by Karjoth, Schunter, and Waidner, provides very useful inspiration on how we can protect sensitive personal data, but the enforcement is very weak. In this paper we provide an overview of the state of the art in enforcing sticky policies, especially the concept of sticky policy enforcement using encryption techniques including Public-Key Encryption (PKE), Identity-Based Encryption (IBE), Attribute-Based Encryption (ABE), and Proxy Re-Encryption (PRE). We provide detailed comparison results on the (dis)advantages of these enforcement mechanisms. As a result of the analysis, we provide a general framework for enhancing sticky policy enforcement using Type-based PRE (TPRE), which is an extension of general PRE
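The sticky-policy idea the abstract surveys can be sketched as follows. This is a hedged toy model: real schemes bind the policy to the ciphertext cryptographically (via IBE, ABE, or PRE), whereas here a trusted authority simply withholds the decryption key until the requester's attributes satisfy the attached policy, and the XOR "cipher" is a placeholder, not real cryptography.

```python
# Toy sticky-policy enforcement: the policy travels with the encrypted data,
# and a trusted authority releases the key only to requesters whose
# attributes satisfy it. All names and the attribute vocabulary are invented.

import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """Placeholder 'encryption' for illustration only."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

class TrustedAuthority:
    def __init__(self):
        self._keys = {}  # object id -> key, held by the authority only

    def protect(self, data: bytes, policy: set) -> dict:
        key = secrets.token_bytes(16)
        obj = {"id": secrets.token_hex(8),
               "ciphertext": xor(data, key),
               "policy": policy}          # the policy "sticks" to the data
        self._keys[obj["id"]] = key
        return obj

    def request_key(self, obj: dict, requester_attrs: set) -> bytes:
        if not obj["policy"] <= requester_attrs:   # policy must be satisfied
            raise PermissionError("attributes do not satisfy sticky policy")
        return self._keys[obj["id"]]

ta = TrustedAuthority()
obj = ta.protect(b"alice's medical record", {"role:doctor", "org:hospital-A"})

key = ta.request_key(obj, {"role:doctor", "org:hospital-A", "id:bob"})
print(xor(obj["ciphertext"], key))  # original plaintext recovered
```

The encryption-based schemes compared in the paper remove the need to trust every downstream recipient: only a party that satisfies the policy can ever obtain a usable key.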
Privacy in an Ambient World
Privacy is a prime concern in today's information society. To protect
the privacy of individuals, enterprises must follow certain privacy practices while
collecting or processing personal data. In this chapter we look at the setting where an
enterprise collects private data on its website, processes it inside the enterprise and
shares it with partner enterprises. In particular, we analyse three different privacy
systems that can be used in the different stages of this lifecycle. One of them is the
recently introduced Audit Logic, which can be used to keep data private when it
travels across enterprise boundaries. We conclude with an analysis of the features
and shortcomings of these systems
De-perimeterisation as a cycle: tearing down and rebuilding security perimeters
If an organisation wants to secure its IT assets, where should the security mechanisms be placed? The traditional view is the hard-shell model, where an organisation secures all its assets using a fixed security border: what is inside the security perimeter is more or less trusted, what is outside is not. Due to changes in technologies, business processes and their legal environments, this approach is no longer adequate.
This paper examines this process, which was coined de-perimeterisation by the Jericho Forum.
In this paper we analyse and define the concepts of perimeter and de-perimeterisation, and show that there is a long term trend in which de-perimeterisation is iteratively accelerated and decelerated. In times of accelerated de-perimeterisation, technical and organisational changes take place by which connectivity between organisations and their environment scales up significantly. In times of deceleration, technical and organisational security measures are taken to decrease the security risks that come with de-perimeterisation, a movement that we call re-perimeterisation. We identify the technical and organisational mechanisms that facilitate de-perimeterisation and re-perimeterisation, and discuss the forces that cause organisations to alternate between these two movements
Oblivion: Mitigating Privacy Leaks by Controlling the Discoverability of Online Information
Search engines are the prevalently used tools to collect information about
individuals on the Internet. Search results typically comprise a variety of
sources that contain personal information -- either intentionally released by
the person herself, or unintentionally leaked or published by third parties,
often with detrimental effects on the individual's privacy. To grant
individuals the ability to regain control over their disseminated personal
information, the European Court of Justice recently ruled that EU citizens have
a right to be forgotten in the sense that indexing systems must offer them
technical means to request removal of links from search results that point to
sources violating their data protection rights. As of now, these technical
means consist of a web form that requires a user to manually identify all
relevant links upfront and to insert them into the web form, followed by a
manual evaluation by employees of the indexing system to assess if the request
is eligible and lawful.
We propose a universal framework Oblivion to support the automation of the
right to be forgotten in a scalable, provable and privacy-preserving manner.
First, Oblivion enables a user to automatically find and tag her disseminated
personal information using natural language processing and image recognition
techniques and file a request in a privacy-preserving manner. Second, Oblivion
provides indexing systems with an automated and provable eligibility mechanism,
asserting that the author of a request is indeed affected by an online
resource. The automated eligibility proof ensures censorship-resistance so that
only legitimately affected individuals can request the removal of corresponding
links from search results. We have conducted comprehensive evaluations, showing
that Oblivion is capable of handling 278 removal requests per second, and is
hence suitable for large-scale deployment
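The eligibility mechanism described above can be caricatured in a few lines. This is a deliberately simplified sketch, not Oblivion's actual protocol: the identity attestation is a toy MAC standing in for a real signature scheme, and a substring match stands in for the NLP and image-recognition matching the paper uses.

```python
# Toy eligibility check in the spirit of Oblivion (details invented for
# illustration): a requester may ask for removal of a link only if an
# identity provider has attested their name AND that name actually appears
# in the target page, so unaffected parties cannot file removal requests.

import hmac, hashlib

IDP_SECRET = b"identity-provider-signing-key"  # held by the identity provider

def attest(name: str) -> bytes:
    """Identity provider vouches for a verified name (toy MAC, not a real signature)."""
    return hmac.new(IDP_SECRET, name.encode(), hashlib.sha256).digest()

def eligible(name: str, attestation: bytes, page_text: str) -> bool:
    ok_identity = hmac.compare_digest(attest(name), attestation)
    ok_affected = name.lower() in page_text.lower()  # stand-in for NLP/image matching
    return ok_identity and ok_affected

page = "Court records show that Alice Example was fined in 2009 ..."
token = attest("Alice Example")
print(eligible("Alice Example", token, page))        # True
print(eligible("Mallory", attest("Mallory"), page))  # False: not mentioned
```

Both checks are cheap and automatable, which is what makes the reported throughput of hundreds of requests per second plausible at the indexing system's side.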
Interoperability, Trust Based Information Sharing Protocol and Security: Digital Government Key Issues
Improved interoperability between public and private organizations is of key
significance to making digital government initiatives successful. Digital
government interoperability, information sharing protocols and security are
considered the key issues for achieving a mature stage of digital government.
Seamless interoperability is essential to share information between diverse
and widely dispersed organisations in several network environments by using
computer-based tools. Digital government must ensure security for its
information systems, including computers and networks, in order to provide
better service to citizens. Governments around the world are increasingly
turning to information sharing and integration to solve problems in programs
and policy areas. Problems of global concern such as disease detection and
control, terrorism, immigration and border control, illegal drug trafficking,
and more demand information sharing, harmonisation and cooperation among
government agencies within a country and across national borders. A number of
daunting challenges remain in the development of an efficient information
sharing protocol. A secure and trusted information-sharing protocol is required
to enable users to interact and share information easily and securely across
many diverse networks and databases globally. Comment: 20 pages
Let the Computer Say NO! The Neglected Potential of Policy Definition Languages for Data Sovereignty
During interaction with today's internet services and platform ecosystems, consumer data is often harvested and shared without consent; that is, consumers ceased to be the sovereigns of their own data with the proliferation of the internet. Due to the rapid and abundant nature of interactions in today's platform ecosystems, manual consent management is impractical. To support the development of semi-automated solutions for re-establishing data sovereignty, we investigate the use of policy definition languages as machine-readable and enforceable mechanisms for fostering data sovereignty. We conducted a realist literature review of the capabilities of policy definition languages developed for pertinent application scenarios (e.g., for access control in cloud computing). We consolidate extant literature into a framework of the opportunities and challenges of leveraging policy definition languages as central building blocks for data sovereignty in platform ecosystems
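The core idea, expressing consumer preferences in a machine-readable form that can be evaluated automatically on every access, can be sketched as follows. The policy vocabulary here is invented for illustration; real policy definition languages such as XACML or ODRL are far richer.

```python
# Toy policy-definition-language check: a consumer's data-sharing
# preferences are written once as a machine-readable policy and evaluated
# automatically on each request, replacing one-off manual consent.

consumer_policy = {
    "allowed_purposes": {"service-delivery", "billing"},
    "forbid_third_party": True,
    "max_retention_days": 90,
}

def access_permitted(policy: dict, request: dict) -> bool:
    if request["purpose"] not in policy["allowed_purposes"]:
        return False
    if policy["forbid_third_party"] and request["third_party"]:
        return False
    return request["retention_days"] <= policy["max_retention_days"]

print(access_permitted(consumer_policy,
      {"purpose": "billing", "third_party": False, "retention_days": 30}))
# True
print(access_permitted(consumer_policy,
      {"purpose": "advertising", "third_party": True, "retention_days": 30}))
# False
```

Because evaluation is mechanical, the same policy can be enforced consistently across every service in a platform ecosystem without further consumer involvement.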
Data-centric access control for cloud computing
© 2016 ACM. The usual approach to security for cloud-hosted applications is strong separation. However, it is often the case that the same data is used by different applications, particularly given the increase in data-driven ('big data' and IoT) applications. We argue that access control for the cloud should no longer be application-specific but should be data-centric, associated with the data that can flow between applications. Indeed, the data may originate outside cloud services from diverse sources such as medical monitoring, environmental sensing etc. Information Flow Control (IFC) potentially offers data-centric, system-wide data access control. It has been shown that IFC can be provided at operating system level as part of a PaaS offering, with an acceptable overhead. In this paper we consider how IFC can be integrated with application-specific access control, transparently to application developers, while building from simple IFC primitives access control policies that align with the data management obligations of cloud providers and tenants. This work was supported by the UK EPSRC grant EP/K011510 CloudSafetyNet. We acknowledge the support of Microsoft through the Microsoft Cloud Computing Research Centre
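The integration the abstract describes might look like the layering below: the application keeps its own access-control check while the platform transparently adds a label check on the data itself. This is a hedged sketch under invented names, not the paper's implementation.

```python
# Toy layering of application-specific access control over IFC primitives:
# the application's own check (here, a role list) runs first, then the
# platform enforces a data-centric label check, invisible to the developer.

def acl_allows(user: dict, resource: dict) -> bool:
    return user["role"] in resource["acl"]

def ifc_allows(user: dict, resource: dict) -> bool:
    # data may flow to the user only if they hold every secrecy tag on it
    return resource["secrecy"] <= user["clearance"]

def read(user: dict, resource: dict) -> bool:
    return acl_allows(user, resource) and ifc_allows(user, resource)

record = {"acl": {"doctor", "nurse"}, "secrecy": {"medical", "tenant-A"}}
doctor = {"role": "doctor", "clearance": {"medical", "tenant-A"}}
intern = {"role": "doctor", "clearance": {"medical"}}  # lacks the tenant tag

print(read(doctor, record))  # True
print(read(intern, record))  # False: the label check blocks cross-tenant flow
```

The point of the layering is that the IFC check travels with the data: even a principal who passes the application's own ACL cannot receive data whose labels they do not hold.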
Adding Privacy Protection to Policy Based Authorisation Systems
An authorisation system determines who is authorised to do what, i.e. it assigns privileges to users and provides a decision on whether someone is allowed to perform a requested action on a resource. A traditional authorisation decision system, which is simply called authorisation system or system in the rest of the thesis, provides the decision based on a policy which is usually written by the system administrator. Such a traditional authorisation system is not sufficient to protect the privacy of personal data, since users (the data subjects) are usually given a take-it-or-leave-it choice to accept the controlling organisation's policy. Privacy is the ability of the owners or subjects of personal data to control the flow of data about themselves, according to their own preferences. This thesis describes the design of an authorisation system that will provide privacy for personal data by including sticky authorisation policies from the issuers and data subjects, to supplement the authorisation policy of the controlling organisation. As personal data moves from controlling system to controlling system, the sticky policies travel with the data.
A number of data protection laws and regulations have been formulated to protect the privacy of individuals. The rights and prohibitions provided by the law need to be enforced by the
authorisation system. Hence, the designed authorisation system also includes the authorisation rules from the legislation. This thesis describes the conversion of rules from the EU Data Protection
Directive into machine executable rules. Due to the nature of the legislative rules, not all of them could be converted into deterministic machine executable rules, as in several cases human intervention or human judgement is required. This is catered for by allowing the machine rules to be configurable.
Since the system includes independent policies from various authorities (law, issuer, data subject and controller) conflicts may arise among the decisions provided by them. Consequently, this thesis describes a dynamic, automated conflict resolution mechanism. Different conflict resolution algorithms are chosen based on the request contexts.
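Context-dependent conflict resolution of the kind described above can be sketched briefly. The combining algorithms mirror common XACML-style rules; the emergency-based selection is an invented example of choosing an algorithm by request context, not the thesis's actual rule set.

```python
# Toy conflict resolution among decisions from independent authorities
# (law, issuer, data subject, controller), with the combining algorithm
# chosen dynamically from the request context.

def deny_overrides(decisions):
    return "Permit" if all(d == "Permit" for d in decisions) else "Deny"

def permit_overrides(decisions):
    return "Permit" if any(d == "Permit" for d in decisions) else "Deny"

def resolve(decisions, context):
    # e.g. in a medical emergency favour availability, otherwise privacy
    algorithm = permit_overrides if context.get("emergency") else deny_overrides
    return algorithm(decisions)

decisions = {"law": "Permit", "subject": "Deny", "controller": "Permit"}
print(resolve(decisions.values(), {"emergency": False}))  # Deny
print(resolve(decisions.values(), {"emergency": True}))   # Permit
```

Selecting the algorithm at request time means the same set of independent policies can yield different, but predictable, outcomes in different contexts.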
As the EU Data Protection Directive allows processing of personal data based on contracts, we designed and implemented a component, Contract Validation Service (ConVS) that can validate an XML based digital contract to allow processing of personal data based on a contract.
The authorisation system has been implemented as a web service and the performance of the system is measured by deploying it first on a single computer and then on a cloud server. Finally, the validity of the design and implementation is tested against a number of use cases based on scenarios involving accessing medical data in a health service provider's system and accessing personal data such as CVs and degree certificates in an employment service provider's system. The machine-computed authorisation decisions are compared to the theoretical decisions to ensure that the system returns the correct decisions
Local Government Policy and Planning for Unmanned Aerial Systems
This research identifies key state and local government stakeholders in California for drone policy creation and implementation, and describes their perceptions and understanding of drone policy. The investigation assessed stakeholders’ positions, interests, and influence on issues, with the goal of providing potential policy input to achieve successful drone integration in urban environments and within the national airspace of the United States. The research examined regulatory priorities through the use of a two-tiered Stakeholder Analysis Process. The first tier consisted of a detailed survey sent out to over 450 local agencies and jurisdictions in California. The second tier consisted of an in-person focus group to discuss survey results as well as to gain deeper insights into local policymakers’ current concerns. Results from the two tiers of analysis, as well as recommendations, are provided here