
    Third country transfers of personal data under the GDPR: A review on remote access to personal data for business purposes In the light of C-311/18 Schrems II and new practices from the European Data Protection Board

    Remote access to data has increased significantly in recent years, encompassing everything from cloud service providers to other types of file-sharing between business partners and their employees. To review the legal landscape of third-country transfers, this thesis first assesses whether remote access in fact constitutes a "transfer" as understood in Chapter V of the GDPR. The standard of essentially equivalent protection is then highlighted to illustrate the level of protection that transfer tools and additional safeguards must provide. Subsequently, the thesis examines transfer tools such as adequacy decisions, Standard Data Protection Clauses and Binding Corporate Rules to provide an in-depth understanding of the impact Schrems II has had on third-country transfers of personal data. Finally, the thesis provides an overview of possible additional safeguards for cases of remote access to data, as highlighted by the European Data Protection Board.

    CEPS Task Force on Artificial Intelligence and Cybersecurity Technology, Governance and Policy Challenges Task Force Evaluation of the HLEG Trustworthy AI Assessment List (Pilot Version). CEPS Task Force Report 22 January 2020

    The Centre for European Policy Studies launched a Task Force on Artificial Intelligence (AI) and Cybersecurity in September 2019. The goal of this Task Force is to bring attention to the market, technical, ethical and governance challenges posed by the intersection of AI and cybersecurity, focusing on both AI for cybersecurity and cybersecurity for AI. The Task Force is multi-stakeholder by design and composed of academics, industry players from various sectors, policymakers and civil society. The Task Force is currently discussing issues such as the state and evolution of the application of AI in cybersecurity and cybersecurity for AI; the debate on the role that AI could play in the dynamics between cyber attackers and defenders; the increasing need for sharing information on threats and how to deal with the vulnerabilities of AI-enabled systems; options for policy experimentation; and possible EU policy measures to ease the adoption of AI in cybersecurity in Europe. As part of these activities, this report aims to assess the High-Level Expert Group (HLEG) on AI Ethics Guidelines for Trustworthy AI, presented on April 8, 2019. In particular, this report analyses and makes suggestions on the Trustworthy AI Assessment List (Pilot version), a non-exhaustive list aimed at helping the public and the private sector operationalise Trustworthy AI. The list is composed of 131 items intended to guide AI designers and developers throughout the process of design, development, and deployment of AI, although it is not intended as guidance for ensuring compliance with the applicable laws. The list is in its piloting phase and is currently undergoing a revision that will be finalised in early 2020. This report seeks to contribute to that revision by addressing in particular the interplay between AI and cybersecurity.
This evaluation has been made according to specific criteria: whether and how the items of the Assessment List refer to existing legislation (e.g. the GDPR, the EU Charter of Fundamental Rights); whether they refer to moral principles (but not laws); whether they consider that AI attacks are fundamentally different from traditional cyberattacks; whether they are compatible with different risk levels; whether they are flexible enough in terms of clear and easy measurement and of implementation by AI developers and SMEs; and, overall, whether they are likely to create obstacles for the industry. The HLEG is a diverse group, with more than 50 members representing different stakeholders, such as think tanks, academia, EU agencies, civil society, and industry, who were given the difficult task of producing a simple checklist for a complex issue. The public engagement exercise looks successful overall, in that more than 450 stakeholders have signed up and are contributing to the process. The next sections of this report present the items listed by the HLEG, followed by the analysis and suggestions raised by the Task Force (see the list of Task Force members in Annex 1).

    THE STRICT NECESSITY TEST ON DATA PROTECTION BY THE CJEU: A PROPORTIONALITY TEST TO FACE THE CHALLENGES AT THE BEGINNING OF A NEW DIGITAL ERA IN THE MIDST OF SECURITY CONCERNS

    Through the judgments Digital Rights Ireland and Tele2 Sverige, the CJEU emphasised the power of the CFR (in particular arts 7, 8 and 52) through the fundamental right of data protection and general principles of law such as the principle of proportionality and legal certainty. Article 52 CFR represents the essence of justification. In the spirit of article 52(3) and (4) CFR, it becomes evident that the CJEU, the ECtHR and the German Constitutional Court move in the same direction. The CJEU was brave enough to deliver a scathing verdict on data retention. More strongly than the German Constitutional Court, the CJEU safeguards data protection. Hence, the decisions of the CJEU have been described as milestone decisions, and the CJEU as a court of fundamental rights. On the other hand, the CJEU focused all its power on proportionality, expressed through the element of strict necessity. It is astonishing that the Court does not use the existing methodology on proportionality to strengthen legal discipline and confidence. Although proportionality may be assessed differently across individual legal systems and cultures, the broad constitutionalisation and application of proportionality in jurisprudence proves the power of this general principle of law. The exploration of this principle is rather challenging, but most beneficial for the future application of primary law.

    EU Data Protection Reform: Challenges for Cloud Computing

    The EC adopted a strategy to unleash the potential of cloud computing, in which it marked data protection legislation as one of the main barriers to the development and expansion of cloud computing in Europe. In light of the EC's goal of ensuring a stimulating environment for the development of cloud computing in the EU, this paper aims to assess the consequences of the new roles and responsibilities of cloud service providers and the new rights for individuals under the GDPR. The analysis shows that, in line with the position of data protection in the EU as a fundamental right, the GDPR considerably raises standards of data protection in cloud computing, which places EU cloud service providers in a more demanding position than their non-EU competitors. Further analysis shows that by promoting privacy-enabling technology and through the extraterritorial application of the GDPR, together with the hefty fines for non-compliance, the GDPR provides tools that might force non-EU service providers to adjust their business models to EU standards, thus rebalancing possible market disruption in cloud computing. The paper concludes that the GDPR provides tools that might result in raised standards of data protection globally, and in cloud computing in particular.

    A model to assess organisational information privacy maturity against the Protection of Personal Information Act

    Includes bibliographical references. Reports on information security breaches have risen dramatically over the past five years, with 2014 accounting for some high-profile breaches including Goldman Sachs, Boeing, AT&T, eBay, AOL, American Express and Apple, to name a few. One report estimates that 868,045,823 records have been breached across 4,347 data breaches made public since 2005 (Privacy Rights Clearinghouse, 2013). The theft of laptops, loss of unencrypted USB drives, hackers infiltrating servers, and staff deliberately accessing clients' personal information are all regularly reported (Park, 2014; Privacy Rights Clearinghouse, 2013). With the rise of data breaches in the Information Age, the South African government enacted the long-awaited Protection of Personal Information (PoPI) Bill at the end of 2013. While South Africa has lagged behind other countries in adopting privacy legislation (the European Union issued its Data Protection Directive in 1995), South African legislators have had the opportunity to draft a privacy Act that draws on the most effective elements of other legislation around the world. Although PoPI has been enacted, a commencement date has still to be decided upon by the Presidency. From PoPI's commencement date, organisations will have an additional year to comply with its requirements, before which they should: review the eight conditions for the lawful processing of personal information set out in Chapter three of the Act; understand the types of personal information they process; review staff training on mobile technologies and limit access to personal information; ensure laptops and other mobile devices have passwords and are preferably encrypted; look at the physical security of the premises where personal data is stored or processed; and assess any service providers who process information on their behalf.
With the demands PoPI places on organisations, this research aims to develop a prescriptive model providing organisations with the ability to measure their information privacy maturity based on "generally accepted information security practices and procedures" (Protection of Personal Information Act, No. 4 of 2013, sec. 19(3)). Using a design science research methodology, the development process comprises three distinct design cycles: 1) conceptual foundation, 2) legal evaluation, and 3) organisational evaluation. The end result is a privacy maturity model that allows organisations to measure their current information privacy maturity against the PoPI Act. This research contributes to the knowledge of how PoPI impacts South African organisations and, in turn, how organisations are able to evaluate their current information privacy maturity in respect of the PoPI Act. The examination and use of global best practices and standards as the foundation for the model, and its integration with the PoPI Act, provide for the development of a unique yet standards-based privacy model aiming to deliver practical benefit to South African organisations.