A Declarative Framework for Specifying and Enforcing Purpose-aware Policies
Purpose is crucial for privacy protection as it makes users confident that
their personal data are processed as intended. Available proposals for the
specification and enforcement of purpose-aware policies are unsatisfactory because of their ambiguous semantics of purposes and/or their lack of support for the run-time enforcement of policies.
In this paper, we propose a declarative framework based on a first-order
temporal logic that allows us to give a precise semantics to purpose-aware
policies and to reuse algorithms for the design of a run-time monitor enforcing
purpose-aware policies. We also show the complexity of the generation and use of the monitor which, to the best of our knowledge, is the first such result in the literature on purpose-aware policies.

Comment: Extended version of the paper accepted at the 11th International Workshop on Security and Trust Management (STM 2015).
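The abstract above describes a run-time monitor that checks data processing against purpose-aware policies. As an illustration only (a minimal sketch, not the paper's first-order temporal-logic algorithm), a monitor of this kind can be reduced to checking each data-access event in a trace against the purposes a policy permits; the rule and event shapes below are assumptions for the example:

```python
# Hypothetical sketch of a purpose-aware run-time monitor: each event in the
# trace is a (data_category, purpose) pair, and a rule lists the purposes for
# which processing of a data category is permitted. This is NOT the paper's
# actual construction, only an illustration of the monitoring idea.
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    data_category: str            # e.g. "email"
    allowed_purposes: frozenset   # purposes for which processing is permitted

def monitor(trace, rules):
    """Return the first violating event, or None if the trace complies."""
    index = {r.data_category: r.allowed_purposes for r in rules}
    for event in trace:
        category, purpose = event
        # Default deny: a category with no rule admits no purpose at all.
        if purpose not in index.get(category, frozenset()):
            return event
    return None

rules = [Rule("email", frozenset({"billing", "support"}))]
trace = [("email", "billing"), ("email", "marketing")]
print(monitor(trace, rules))  # flags ('email', 'marketing')
```

Note the default-deny choice: any access whose purpose is not explicitly permitted is reported, which matches the protective reading of purpose-aware enforcement.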
The General Data Protection Regulation:A Partial Success for Children on Social Network Sites?
Almost 20 years ago, the first social networking site ('SNS') was launched in the U.S. Whilst developers originally intended for SNSs to be used by adults (which they are), they have also become an integral communication platform in the lives of many children in EU Member States. Sharing personal information on SNSs is now a routine activity for many children and, whilst they are computer literate in a way that their parents often are not, a number of concerns have emerged. One of these concerns is that children are vulnerable since they lack the capacity to consent to the terms of SNS membership agreements regarding the processing of their personal data. A further concern is that children's naïve confidence sometimes leads them to take risks, by sharing information about themselves, that adults would not take. This is particularly concerning as children may be ignorant of the fact that their profile and behavioural data are sold to data brokers who use that information to produce targeted adverts, and these adverts may display age-inappropriate content or may not even be recognised by the children as adverts. Directive 95/46/EC regulates the processing of the personal data of EU citizens, including personal data posted on SNSs. Problematically, it was drafted in a pre-SNS era and neither makes reference to children nor considers them vulnerable data subjects whose personal data should be subject to more stringent processing rules. The absence of specific legal protection for children's data on SNSs sparked concerns that children were ignorantly disclosing personal data and being exposed to profiling and advertising without adequate privacy and data protection safeguards in place. In response to these concerns, provisions aimed at safeguarding children's privacy and data protection rights have been included in Regulation (EU) 2016/679 (hereafter 'GDPR'), which will come into force on 25 May 2018.
This chapter provides a critical evaluation of the forthcoming measures to address a knowledge gap that exists because of the novelty of these provisions and the fact that scholarship in this area is currently underdeveloped. It begins by providing an overview of SNSs and the problems posed by underage children's access to them. In this regard, it will illustrate that the biological and psychosocial developmental changes that children experience as they progress through their teenage years and develop their capacity for freedom of expression make them vulnerable to impulsive personal information disclosures and privacy invasions. After this, an exploration of the current legal protections for children's privacy on SNSs from the perspective of privacy as information control will highlight deficiencies in Directive 95/46/EC. This leads to an analysis of the measures in the GDPR to determine whether they will, when introduced, realise the twin goals of legitimising the processing of children's personal data and, at the same time, protecting their fundamental privacy and data protection rights. The compatibility of measures in the GDPR with provisions in the United Nations Convention on the Rights of the Child (1989) ('the UNCRC') and the Charter of Fundamental Rights of the European Union (2000) ('the EU Charter') is considered, as these provide a normative framework for evaluating children's legal rights. To comply with both legal frameworks, data protection measures in the GDPR governing children's activities on SNSs should recognise their evolving capacity for freedom of expression and privacy. This would allow them to express themselves with appropriate safeguards in place, ensuring that their best interests are protected and that they are not subject to economic exploitation through activities such as profiling and advertising without consent.
Specifically, the analysis presents a critical evaluation of the introduction of an age threshold, below which children are deemed to lack capacity to consent to the processing of their personal data; the conceptual coherence of relying on parental consent for children under the threshold age; the practical implications of Member States being permitted to set the threshold age within a range of ages; and the practical challenges posed by relying on verified parental consent. The chapter concludes that measures in the GDPR are compatible with provisions in the UNCRC and the EU Charter but that a number of practical challenges remain unsolved. For instance, allowing Member States to set the threshold age means that the goal of simplifying and harmonising the regulatory environment for SNSs operating on a transnational basis will not be fully realised. Equally, reliance on parental consent and the consent of children over the threshold age is conceptually coherent, but it is dependent on the introduction of low-cost age-verification mechanisms being integrated into SNSs. It is also dependent on child data subjects (or their parents) being digitally literate enough to give unambiguous, specific consent to the processing of their personal data. Relatedly, whilst the GDPR includes measures to promote and increase the digital literacy of both parents and children, it remains to be seen how effective these will be in practice. For these reasons, the GDPR is an improvement on Directive 95/46/EC, but only a partial success.
Update: COPPA is Ineffective Legislation! Next Steps for Protecting Youth Privacy Rights in the Social Networking Era
In 1998, Congress passed the Children's Online Privacy Protection Act (COPPA) in response to growing concerns over the dissemination of children's personal information over the Internet. Under COPPA's provisions, websites are prohibited from collecting personal information from children under the age of twelve without verifiable parental consent. While in theory COPPA sought to give parents control over their children's personal information on the Internet, its practical effect causes websites to attempt to ban children through age-screening mechanisms that remain largely ineffective. Twelve years after the passage of COPPA, the landscape of the Internet has changed dramatically. Social networking websites like Facebook, with over 500 million users, provide children with vast opportunities to share their personal information online. Moreover, as COPPA only seeks to protect children under the age of twelve, many of Facebook's most vulnerable demographic, teenagers ages thirteen to eighteen, fall outside its provisions. COPPA must be revised so that children, teenagers, and parents are provided adequate notice of the uses of personal information online (especially with respect to social networking websites) and a meaningful opportunity to consent to those practices.
Mapping and analysis of the current self- and co-regulatory framework of commercial communication aimed at minors
As the advertising sector has been very active in self-regulating commercial communication aimed at children, a patchwork of different rules and instruments exists, drafted by different self-regulatory organisations at international, European and national level. In order to determine the scope and contents of these rules, and hence the actual level of protection of children, a structured mapping of these rules is needed. As such, this report aims to provide an overview of different categories of Alternative Regulatory Instruments (ARIs), such as self- and co-regulation, regarding (new) advertising formats aimed at children. This report complements the first legal AdLit research report, which provided an overview of the legislative provisions in this domain.
Secure data sharing and processing in heterogeneous clouds
The extensive cloud adoption among European Public Sector Players has empowered them to own and operate a range of cloud infrastructures. These deployments vary both in size and capabilities, as well as in the range of employed technologies and processes. The public sector, however, lacks the necessary technology to enable effective, interoperable and secure integration of a multitude of its computing clouds and services. In this work we focus on the federation of private clouds and the approaches that enable secure data sharing and processing among the collaborating infrastructures and services of public entities. We investigate the aspects of access control, data and security policy languages, as well as cryptographic approaches that enable fine-grained security and data processing in semi-trusted environments. We identify the main challenges and frame the future work that serves as an enabler of interoperability among heterogeneous infrastructures and services. Our goal is to enable both security and legal conformance as well as to facilitate transparency, privacy and effectiveness of private cloud federations for the public sector's needs. © 2015 The Authors
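The fine-grained access control mentioned in the abstract above can be illustrated with a toy attribute-based access-control (ABAC) check, in which sharing decisions between federated infrastructures depend on attributes of the requester and the resource. This is a sketch under assumed attribute names ("org", "clearance"), not the paper's actual policy language:

```python
# Illustrative ABAC check for data sharing between federated private clouds.
# Attribute names and the clearance scale are hypothetical assumptions made
# for this example; real deployments would use a policy language such as XACML.

def permitted(subject, resource, action):
    """Allow read access when the requesting organisation appears in the
    resource's sharing list and its clearance meets the required level."""
    if action != "read":
        return False  # this toy policy grants read-only access
    return (subject["org"] in resource["shared_with"]
            and subject["clearance"] >= resource["min_clearance"])

requester = {"org": "ministry-a", "clearance": 2}
dataset = {"shared_with": {"ministry-a", "ministry-b"}, "min_clearance": 2}
print(permitted(requester, dataset, "read"))  # True
```

Evaluating policies over attributes rather than fixed identities is what makes this style of control workable across heterogeneous clouds, since collaborating entities need only agree on the attribute vocabulary, not on a shared user directory.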
A Generic Information and Consent Framework for the IoT
The Internet of Things (IoT) raises specific issues in terms of information
and consent, which makes the implementation of the General Data Protection
Regulation (GDPR) challenging in this context. In this report, we propose a
generic framework for information and consent in the IoT which is protective
both for data subjects and for data controllers. We present a high level
description of the framework, illustrate its generality through several
technical solutions and case studies, and sketch a prototype implementation.
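A core ingredient of any information-and-consent framework like the one described above is a record of which purposes each data subject has consented to, with revocation honoured at processing time. The following minimal registry is an assumption-laden sketch for illustration (the class and method names are invented here, not taken from the report):

```python
# Hypothetical consent registry: data controllers may process a subject's data
# only for purposes the subject has explicitly consented to, and consent can
# be revoked at any time. A simplified illustration, not the report's design.

class ConsentRegistry:
    def __init__(self):
        self._consents = {}  # (subject_id, purpose) -> bool

    def grant(self, subject_id, purpose):
        self._consents[(subject_id, purpose)] = True

    def revoke(self, subject_id, purpose):
        self._consents[(subject_id, purpose)] = False

    def is_allowed(self, subject_id, purpose):
        # Default deny: absence of a recorded consent means no processing,
        # which is protective for the data subject.
        return self._consents.get((subject_id, purpose), False)

reg = ConsentRegistry()
reg.grant("alice", "wifi-tracking")
print(reg.is_allowed("alice", "wifi-tracking"))  # True
reg.revoke("alice", "wifi-tracking")
print(reg.is_allowed("alice", "wifi-tracking"))  # False
```

The default-deny lookup is the protective choice for both parties the abstract mentions: data subjects are never processed without a recorded grant, and data controllers can point to the registry as evidence of lawful processing.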
Sharing data from clinical trials: the rationale for a controlled access approach.
The move towards increased transparency around clinical trials is welcome. Much focus has been on under-reporting of trials and access to individual patient data to allow independent verification of findings. There are many other good reasons for data sharing from clinical trials. We describe some key issues in data sharing, including the challenges of open access to data. These include issues in consent and disclosure; risks in identification, including self-identification; risks in distorting data to prevent self-identification; and risks in analysis. These risks have led us to develop a controlled access policy, which safeguards the rights of patients entered in our trials, guards the intellectual property rights of the original researchers who designed the trial and collected the data, provides a barrier against unnecessary duplication, and ensures that researchers have the necessary resources and skills to analyse the data.