
    Investigating the tension between cloud-related actors and individual privacy rights

    Historically, little more than lip service has been paid to the rights of individuals to act to preserve their own privacy. Personal information is frequently exploited for commercial gain, often without the person’s knowledge or permission. New legislation, such as the EU General Data Protection Regulation (GDPR), has acknowledged the need for legislative protection. This regulation places the onus on service providers to preserve the confidentiality of their users’ and customers’ personal information, on pain of punitive fines for lapses. It accords special privileges to users, such as the right to be forgotten, and has global reach, covering the rights of any EU resident worldwide. Assuring this legislated privacy protection presents a serious challenge, which is exacerbated in the cloud environment. A considerable number of actors are stakeholders in cloud ecosystems. Each has their own agenda, and these are not necessarily well aligned. Cloud service providers, especially those offering social media services, are interested in growing their businesses and maximising revenue. There is a strong incentive for them to capitalise on their users’ personal information and usage information. Privacy is often the first victim. Here, we examine the tensions between the various cloud actors and propose a framework that could be used to ensure that privacy is preserved and respected in cloud systems.

    “Hey Alexa, Do Consumers Really Want More Data Privacy?”: An Analysis of the Negative Effects of the General Data Protection Regulation

    Recent news articles discuss the flooding of email inboxes with lengthy terms-and-conditions updates, viral videos of Mark Zuckerberg’s public Cambridge Analytica hearing before Congress, and the phenomenon of internet advertisements appearing for items that consumers merely searched for on Google a day prior. Effective as of May 25, 2018, the European Union’s General Data Protection Regulation (GDPR) established a framework that sets legal standards targeted at businesses and other data collectors to dramatically increase data privacy protections for citizens of the EU. Consumers, however, do not seem to appreciate these increased protections, as they rarely read the updated privacy terms and conditions provided via notifications and email notices. Such behavior suggests that consumers would prefer to keep receiving curated and personalized services, like “New Music Friday” Spotify playlists, rather than gaining transparency over how those services are so thoughtfully created. This note argues that the benefits of the GDPR’s increased data privacy are overshadowed by the burdens the GDPR imposes on businesses. As the U.S. moves towards adopting similar data privacy legislation, Congress should create laws with simpler restrictions, clearer guidelines, and more lenient standards, so the consumer benefits of data privacy reform can be achieved without imposing such large hindrances on companies.

    The impact of the general data protection regulation on the financial services’ industry of small European states

    This paper is based on the unpublished thesis by Magri, A. (2018), An Evaluation of the Impact of GDPR on the Local Financial Services Industry, Department of Banking and Finance, Faculty of Economics, Management and Accountancy, University of Malta, supervised by Dr. Simon Grima.
    Purpose: With this paper we evaluate the impact and implications of the European Union (EU) General Data Protection Regulation (GDPR) on the financial services industry in small European states; specifically Malta, Slovenia, Luxembourg, Lithuania, Latvia, Estonia and Cyprus, that is, EU countries with populations below 3 million.
    Design/methodology/approach: We collected our primary data by carrying out scheduled semi-structured interviews (using WhatsApp®, Messenger® and Skype®) with 63 participants working directly or indirectly with GDPR in financial services, between November 2018 and April 2019. The interviews were structured around two impact themes, ‘Trust, Standardisation and Reputation’ and ‘Training and Resources’, with 18 statements under each theme, to which participants responded on a 5-point Likert scale ranging from “Strongly Disagree” to “Strongly Agree”. To answer the research questions, the empirical data collected was subjected to statistical analysis using SPSS (Version 21), namely descriptive statistics and box plots and later MANOVA, while the qualitative data was analysed using the thematic approach.
    Findings: We found that overall, participants feel that although GDPR has increased workload and costs, it has helped to improve the trust, standardisation and reputation of the institutions they represent. However, this comes with some repercussions from data subjects who are not conversant with the regulation and are apprehensive about the consents required.
    Originality/value: Although all states might be represented in the decision process, the larger states usually take over and sometimes dictate the final decision. The concept of proportionality in regulations is not clear and is not effectively managed, to the disadvantage of the smaller states. This paper is therefore important because it voices the concerns of smaller states and allows for an understanding of the impact and implications of new regulations on smaller jurisdictions, in this case within the EU. (peer-reviewed)

    Personal Data Protection Rules! Guidelines for Privacy-Friendly Smart Energy Services

    Privacy-friendly processing of personal data is proving to be increasingly challenging in today’s energy systems as the amount of data grows. Smart energy services provide value creation and co-creation by processing sensitive user data collected from smart meters, smart home devices, storage systems, and renewable energy plants. To address this challenge, we analyze key topics and develop design requirements and design principles for privacy-friendly personal data processing in smart energy services. We identify these key topics through expert interviews, text-mining, and topic modelling techniques based on 149 publications. Following this, we derive our design requirements and principles and evaluate them with experts and through an applicability check with three real-world smart energy services. Based on our results and findings, we establish a further research agenda consisting of five specific research directions.

    C is for Cookie: Is the EU's New Cookie Law Good Enough to Protect My Data?

    [...]data breaches have consistently increased in recent years, with almost 1,300 breaches in 2017 and over 600 as of July 24, 2018.11 This is obviously a problem that affects millions of people across the globe each year and is expected to increase continually as the global economy becomes ever more digital, forcing some to call for action. [...]the comment will speculate as to any potential legal developments resulting from the GDPR's implementation in the European Union, and the ways it may evolve over time to affect not only the European Union but also other nations that do business in the European Union, as well as those nations around the globe that have implemented, or are considering, data protection regulations similar to the GDPR, including the United States.
    21 Part of the Federal Constitutional Court's reasoning is the fact that it is now easier than ever to acquire information and exert influence over another person using that information, and the psychological stresses that can accompany that kind of public awareness.22 The court goes on to worry about a societal structure in which a citizen is unaware of who knows what about him or her, when they knew it, and why they knew it.23 The court then states that the individual must be protected from the unlimited collection, storage, use, and transmission of personal data as a condition for free personality development under modern conditions of data processing, and that the individual, not the state or some other power, has the right to determine whether another party may use or divulge his personal data.24 The Federal Constitutional Court does recognize a limit to this informational self-determination, noting that the individual is a personality within a society and that the public interest may prevail over the rights of the individual in some cases (for example, public safety, or when an individual is missing).25 According to Bradford, this case and informational self-determination became the bedrock upon which the EU's views on privacy and data have been built.26 These laws and this decision by the German Federal Constitutional Court set the stage for remarkably forward-thinking data privacy laws as the world becomes more and more digital.
    31 In this case, the court ruled that Google must abide by the wishes of users to take down or delete any data it had acquired that appeared irrelevant, inadequate, or no longer adequate to its business interests.32 Since that decision, as of May 2018, Google has received more than 655,000 requests to remove roughly 2.5 million links, and it has complied with 43.3 percent of those requests.33 As technology has advanced, the need for a better regulatory framework to keep up with technological innovation became apparent.34 Part of the solution for the EU was to pass the GDPR in 2016.35 Many experts state that the GDPR is simply a modern upgrade to the Data Protection Directive based on a better understanding of how data has been misused.36 Additionally, unlike the Data Protection Directive, the GDPR is a regulation that is required to be enacted by all member nations of the EU.37 To get a sense of the GDPR's depth and specificity, it comprises 173 recitals covering forty-five specific regulations on how companies should process data, forty-three conditions of applicability, thirty-five bureaucratic obligations for the EU member states, seventeen enumerated rights, eleven administrative clarifications, nine policy assertions, five enumerated penalties, and two technological allowances.38 The goal of the GDPR is clear: data should serve mankind.39 As the fourth recital of the GDPR states, "The processing of personal data should be designed to serve mankind."

    A Privacy Impact Assessment Method for Organizations Implementing IoT for Occupational Health and Safety

    Internet of Things (IoT) technologies are increasingly being integrated into occupational health and safety (OHS) practices; however, their adoption raises significant privacy concerns. The General Data Protection Regulation (GDPR) has established the requirement for organizations to conduct Privacy Impact Assessments (PIAs) prior to processing personal data, emphasizing the need for privacy safeguards in the workplace. Despite this, the GDPR provisions related to the IoT, particularly in the area of OHS, lack clarity and specificity. This research aims to bridge this gap by proposing a tailored method for conducting PIAs in the OHS context, with a particular focus on addressing the ‘how-to’ aspect of the assessment process. The proposed method integrates insights from domain experts, relevant literature sources, and GDPR requirements, ultimately leading to the development of an online PIA tool.

    Personal information management systems: a user-centric privacy utopia?


    Evaluating the Contextual Integrity of Privacy Regulation: Parents' IoT Toy Privacy Norms Versus COPPA

    Increased concern about data privacy has prompted new and updated data protection regulations worldwide. However, there has been no rigorous way to test whether the practices mandated by these regulations actually align with the privacy norms of affected populations. Here, we demonstrate that surveys based on the theory of contextual integrity provide a quantifiable and scalable method for measuring the conformity of specific regulatory provisions to privacy norms. We apply this method to the U.S. Children's Online Privacy Protection Act (COPPA), surveying 195 parents and providing the first data showing that COPPA's mandates generally align with parents' privacy expectations for Internet-connected "smart" children's toys. Nevertheless, variations in the acceptability of data collection across specific smart toys, information types, parent ages, and other conditions emphasize the importance of detailed contextual factors to privacy norms, which may not be adequately captured by COPPA. Comment: 18 pages, 1 table, 4 figures, 2 appendices
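    The contextual-integrity survey method summarized above scores discrete information flows (an information type, a recipient, and a transmission principle) on an acceptability scale and checks whether the aggregate judgment matches what the regulation permits. A minimal sketch of that comparison step, with entirely made-up flows, scores, and regulatory mappings (none of these values come from the paper):

    ```python
    from statistics import mean

    # Hypothetical survey responses: each flow is scored by parents on a Likert
    # scale from -2 (completely unacceptable) to +2 (completely acceptable).
    responses = {
        ("audio recordings", "toy manufacturer", "with parental consent"): [1, 2, 0, 1],
        ("audio recordings", "advertisers", "no notice given"): [-2, -2, -1, -2],
        ("play history", "toy manufacturer", "to improve the toy"): [0, 1, 1, -1],
    }

    # Illustrative mapping of each flow to whether the regulation permits it.
    permitted_by_regulation = {
        ("audio recordings", "toy manufacturer", "with parental consent"): True,
        ("audio recordings", "advertisers", "no notice given"): False,
        ("play history", "toy manufacturer", "to improve the toy"): True,
    }

    def norm_alignment(responses, permitted):
        """A flow aligns with the norm if its mean acceptability is positive
        exactly when the regulation permits it."""
        return {
            flow: (mean(scores) > 0) == permitted[flow]
            for flow, scores in responses.items()
        }

    result = norm_alignment(responses, permitted_by_regulation)
    print(sum(result.values()), "of", len(result), "flows align with the regulation")
    ```

    The real study aggregates over many more contextual parameters (toy, parent age, and so on), but the core idea is this per-flow comparison of surveyed acceptability against the regulatory mandate.
    
    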

    Mind the FemTech Gap: Regulation Failings and Exploitative Systems

    The security, privacy, and safety issues around female-oriented technologies (FemTech) and their data can lead to differential harms. These complex risks and harms are enabled by many factors, including inadequate regulations, non-compliant industry practices, and the lack of research and guidelines for cyber-secure, privacy-preserving, and safe products. In this paper, we review the existing regulations related to FemTech in the United Kingdom, EU, and Switzerland and identify the gaps. We run experiments on a range of FemTech devices and apps and identify several exploitative practices. We urge policymakers to explicitly acknowledge and accommodate the risks of these technologies in the relevant regulations.
