    The right to privacy in a Big Data society. Merits and limits of the GDPR

    With the relentless development of technology, the generation of Big Data has risen like never before. This rise has opened up numerous ways in which consumers' personal data can be used, leaving people vulnerable. The European Union introduced the GDPR as its latest means of protecting citizens' rights. In this paper, we analyze different aspects of Big Data, such as the legal framework, consent, and anonymization, and examine in what ways the GDPR has helped protect personal data and what its limitations are.

    Mediating the Tension between Data Sharing and Privacy: The Case of DMA and GDPR

    The Digital Markets Act (DMA) constitutes a crucial part of the European legislative framework addressing the dominance of ‘Big Tech’. It intends to foster fairness and competition in Europe’s digital platform economy by imposing obligations on ‘gatekeepers’ to share end-user-related information with business users. Yet, this may involve the processing of personal data subject to the General Data Protection Regulation (GDPR). The obligation to provide access to personal data in a GDPR-compliant manner poses a regulatory and technical challenge and can serve as a justification for gatekeepers to refrain from data sharing. In this research-in-progress paper, we analyze key tensions between the DMA and the GDPR through the paradox perspective. We argue through a task-technology fit approach how privacy-enhancing technologies – particularly anonymization techniques – and portability could help mediate tensions between data sharing and privacy. Our contribution provides theoretical and practical insights to facilitate legal compliance.

    Change of Purpose - The effects of the Purpose Limitation Principle in the General Data Protection Regulation on Big Data Profiling

    Over the past few years, many companies have started to adopt Big Data technologies. Big Data is a method and technology that allows the collection and analysis of huge amounts of all kinds of data, mainly in digital form. Big Data can be used, for example, to create profiles of online shopping users to target ads. I call this Big Data Profiling. Facebook and Google, for example, are able to estimate attributes, such as gender, age and interests, from data provided by their users. This can be worrisome for many users who feel that their privacy is infringed when the Big Data Profiling companies, for example, are able to send advertisements to the users that are scarily relevant to them. Big Data Profiling relies on a vast amount of collected data. Often, at the time of collection, it is not clear how exactly this data will be used and analyzed. The new possibilities with Big Data Profiling have led companies to collect as much data as possible, and then later figure out how to extract value from this data. This model can be described as “collect before select”, since the data is first collected, and then “mined” for correlations that can be used to profile users. In this thesis I analyze whether this form of collection and usage of Personal Data is legal under the General Data Protection Regulation (GDPR), which enters into force in the European Union on 25 May 2018. While many of the provisions of the GDPR already existed in the Data Protection Directive (DPD) since 1995, they have been reinforced and extended in the GDPR. One of the main principles of the GDPR is that of Purpose Limitation. While the principle already exists under the DPD in a very similar fashion, it is likely that it will be enforced more under the GDPR, since the GDPR is directly applicable in member states instead of having to be implemented. The enforcement mechanisms, such as sanctions, have also been significantly strengthened. 
The Purpose Limitation Principle requires the data controller (such as companies processing Personal Data, like Facebook and Google) to have a specified purpose for and during the collection of Personal Data. Further, the Personal Data cannot be processed beyond this purpose after it has been collected. This seems to run contrary to Big Data Profiling, which regularly looks for purposes only after the Personal Data has been collected. However, I have identified three potential ways the “collect before select” model could still be possible under the GDPR. The first possibility is the anonymization of Personal Data. If data can be efficiently anonymized, it will fall outside of the scope of the GDPR because it will not contain Personal Data after the anonymization. The controller is then free to analyze the data for any purpose, including creating models that could be used to profile other users. However, I found that Big Data methods can often reidentify Personal Data that has been previously anonymized. In such cases even purportedly anonymized data may still fall under the scope of the GDPR. If on the other hand enough Personal Data is removed to make reidentification impossible, the value of the data for large parts of the business world is likely destroyed. The second possibility is collecting Personal Data for a specified purpose that is defined so widely that it covers all potential future use cases. If a controller can collect Personal Data for a vague purpose, such as “marketing”, the controller will have a lot of flexibility in using the data while still being covered by the initial purpose. I found that the GDPR requires data controllers (such as companies) to have a purpose for the data collection that is specific enough so that the data subject is able to determine exactly which kinds of processing the controller will undertake. Having a non-existent or too vague purpose is not sufficient under the GDPR. 
Companies that collect data with no, or an only vaguely defined, purpose and then try to find a specific purpose for the collected data later will therefore have to stop this practice. The third possibility can be used if the controller wants to re-use Personal Data for further purposes, after the controller has collected the Personal Data initially in compliance with the GDPR for a specified purpose. In this case, the GDPR offers certain possibilities of further processing this data outside of the initial purpose. The GDPR allows this, for example, if the data subject has given consent to the new purpose. However, I found that Big Data Profiling companies often come up with new purposes later by “letting the data speak”, which means analyzing the data itself to find new purposes. Before performing an analysis, the company often might not even know how the processing will be done later. In that case, it is impossible to request the data subject’s specific consent, which is required under the GDPR. Even without the data subject’s consent, there are however other possibilities of further processing data under the GDPR, such as determining whether the new processing is compatible with the initial purpose. My thesis examines some of those possibilities for a change of purpose under Big Data Profiling. My conclusion is that the GDPR likely means a drastic impact on and limitation of Big Data Profiling as we know it. Personal Data cannot be collected without a purpose or with a vague purpose. Even Personal Data that was collected for a specific purpose cannot be re-used for another purpose except in very few circumstances. Time will tell how the courts will interpret the GDPR and decide different situations, how companies will adapt to them, and whether the legislator will react to this reality.
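
The re-identification risk raised in the thesis above (that purportedly anonymized data may still fall under the GDPR) can be illustrated with a minimal, hypothetical linkage attack: direct identifiers are removed, but the remaining quasi-identifiers can be joined against an auxiliary public dataset. All names and values here are invented for illustration.

```python
# Hypothetical sketch of a linkage attack: the "anonymized" dataset has no
# names, but the quasi-identifier triple (zip, birth_year, gender) remains
# and can be matched against a public auxiliary source such as a voter roll.

anonymized = [
    {"zip": "10115", "birth_year": 1985, "gender": "F", "diagnosis": "asthma"},
    {"zip": "10117", "birth_year": 1990, "gender": "M", "diagnosis": "diabetes"},
]

auxiliary = [  # public data that still carries names
    {"name": "Alice", "zip": "10115", "birth_year": 1985, "gender": "F"},
    {"name": "Bob", "zip": "10117", "birth_year": 1990, "gender": "M"},
]

def reidentify(anon_rows, aux_rows):
    """Link records on the shared quasi-identifier triple."""
    index = {(r["zip"], r["birth_year"], r["gender"]): r["name"] for r in aux_rows}
    matches = []
    for row in anon_rows:
        key = (row["zip"], row["birth_year"], row["gender"])
        if key in index:
            matches.append((index[key], row["diagnosis"]))
    return matches

print(reidentify(anonymized, auxiliary))
# → [('Alice', 'asthma'), ('Bob', 'diabetes')]
```

Here every "anonymized" record is linked back to a named individual, which is why removing direct identifiers alone does not take data outside the GDPR's scope.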

    An Analysis of the Consequences of the General Data Protection Regulation on Social Network Research

    This article examines the principles outlined in the General Data Protection Regulation in the context of social network data. We provide both a practical guide to GDPR-compliant social network data processing, covering aspects such as data collection, consent, anonymization, and data analysis, and a broader discussion of the problems that emerge when the general principles on which the regulation is based are instantiated for this research area.

    Protected Users: A Moodle Plugin To Improve Confidentiality and Privacy Support through User Aliases

    The privacy policies, terms, and conditions of use in any Learning Management System (LMS) are one-way contracts: the institution imposes clauses that the student can only accept or decline. Once students accept the conditions, they should be able to exercise the rights granted by the General Data Protection Regulation (GDPR). However, students cannot object to data processing and public profiling, because doing so would be conceived as an impediment to teachers carrying out their work normally. Nonetheless, under the GDPR and according to the legal advisors we consulted, a student could claim identity anonymization in the LMS if adequate personal justifications are provided. Current LMSs, however, do not offer any functionality that enables identity anonymization. This gap generates undesired situations and urgently requires a definitive solution. In this work, we surveyed students and teachers to validate the feasibility and acceptance of using aliases to anonymize their identity in LMSs as a sustainable solution to the problem. Given the positive results, we developed a user-friendly plugin for Moodle that enables students' identity anonymization through aliases. This plugin, presented in this work and named Protected users, is publicly available on GitHub and published under the GNU General Public License.
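
The alias mechanism described in this abstract amounts to pseudonymization: the LMS displays a stable alias while the mapping to the real identity stays in restricted storage. A minimal sketch of that idea (not the actual Moodle plugin code; class and method names are invented):

```python
# Minimal pseudonymization sketch: each real identity maps to one stable
# alias shown in the LMS; the mapping itself would live in restricted
# storage accessible only to authorized staff.
import secrets

class AliasRegistry:
    def __init__(self):
        self._alias_by_user = {}  # restricted: real identity -> alias

    def alias_for(self, user_id):
        """Return a stable alias for a user, creating one on first use."""
        if user_id not in self._alias_by_user:
            self._alias_by_user[user_id] = f"student-{secrets.token_hex(4)}"
        return self._alias_by_user[user_id]

registry = AliasRegistry()
alias = registry.alias_for("jane.doe@example.edu")
assert alias == registry.alias_for("jane.doe@example.edu")  # stable across calls
```

Note that, under the GDPR, such aliased data remains personal data as long as the mapping table exists; the plugin's benefit is confidentiality toward other users, not exemption from the regulation.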

    Privacy, Risk, Anonymization and Data Sharing in the Internet of Health Things

    This paper explores a specific risk-mitigation strategy to reduce privacy concerns in the Internet of Health Things (IoHT): data anonymization. It contributes to the current academic debate surrounding the role of anonymization in the IoHT by evaluating how data controllers can balance privacy risks against the quality of output data and select the appropriate privacy model that achieves the aims underlying the concept of Privacy by Design. It sets forth several approaches for identifying the risk of re-identification in the IoHT and explores the potential for synthetic data generation to be used as an alternative to anonymization for data sharing.
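
The synthetic-data alternative mentioned above can be sketched, under strong simplifying assumptions, as sampling each attribute independently from the real data's empirical marginal distribution, so that no synthetic row corresponds to a real individual. This toy example ignores inter-attribute correlations, which real synthetic-data generators try to preserve; all values are invented.

```python
# Toy marginal-based synthetic data generation: sample each column
# independently from the observed values, producing new row combinations.
import random

real_records = [
    {"age_band": "30-39", "condition": "asthma"},
    {"age_band": "40-49", "condition": "diabetes"},
    {"age_band": "30-39", "condition": "hypertension"},
]

def synthesize(records, n, seed=0):
    """Draw n synthetic rows, one independent sample per column."""
    rng = random.Random(seed)
    columns = {k: [r[k] for r in records] for k in records[0]}
    return [{k: rng.choice(v) for k, v in columns.items()} for _ in range(n)]

synthetic = synthesize(real_records, 5)
# Every synthetic value occurs somewhere in the real data, but the rows
# themselves are freshly composed rather than copied individuals.
```

The privacy/utility trade-off the paper discusses shows up even here: independent sampling destroys linkage to individuals but also destroys the correlations that make health data analytically useful.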

    Big Data and Analytics in the Age of the GDPR

    The new European General Data Protection Regulation places stringent restrictions on the processing of personally identifiable data. The GDPR does not only affect European companies, as the regulation applies to all organizations that track or provide services to European citizens. Free exploratory data analysis is permitted only on anonymous data, at the cost of some legal risks. We argue that for the other kinds of personal data processing, the most flexible and safe legal basis is explicit consent. We illustrate the approach to consent management and compliance with the GDPR being developed by the European H2020 project SPECIAL, and highlight some related big data aspects.
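
The consent-management approach sketched in this abstract boils down to a purpose-bound check before any processing: an operation is permitted only if the data subject has explicitly consented to that purpose. The following is a hedged illustration of that principle only; the names are invented and do not reflect the SPECIAL project's actual API or policy language.

```python
# Illustrative purpose-bound consent check: processing proceeds only for
# purposes the data subject has explicitly consented to.
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    data_subject: str
    purposes: set = field(default_factory=set)  # explicitly consented purposes

def may_process(consent: ConsentRecord, purpose: str) -> bool:
    """Return True only if the purpose was explicitly consented to."""
    return purpose in consent.purposes

consent = ConsentRecord("user-42", {"service-provision", "analytics"})
assert may_process(consent, "analytics")
assert not may_process(consent, "marketing")  # a new purpose needs fresh consent
```

The last line mirrors the purpose-limitation principle discussed throughout this listing: a purpose not covered by the original consent requires either fresh consent or another lawful basis.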