
    Cyber Babel: Finding the Lingua Franca in Cybersecurity Regulation

    Cybersecurity regulations have proliferated over the past few years as the significance of the threat has drawn more attention. With breaches making headlines, the public and their representatives are imposing requirements with renewed vigor on those that hold sensitive data. As high-value targets holding large amounts of sensitive data, financial institutions are among the most heavily regulated. Regulations are necessary. However, they also come with costs that affect both large and small companies, their customers, and local, national, and international economies. As the regulations have proliferated, so have those costs. The regulations will inevitably and justifiably diverge where different governments view the needs of their citizens differently. However, that should not prevent regulators from recognizing areas of agreement. This Note examines the regulatory regimes governing the data and cybersecurity practices of financial institutions implemented by the Securities and Exchange Commission, the New York Department of Financial Services, and the General Data Protection Regulation of the European Union to identify areas where requirements overlap, with the goal of suggesting implementations that promote consistency, clarity, and cost reduction.

    Catalyzing Privacy Law

    The United States famously lacks a comprehensive federal data privacy law. In the past year, however, over half the states have proposed broad privacy bills or have established task forces to propose possible privacy legislation. Meanwhile, congressional committees are holding hearings on multiple privacy bills. What is catalyzing this legislative momentum? Some believe that Europe’s General Data Protection Regulation (GDPR), which came into force in 2018, is the driving factor. But with the California Consumer Privacy Act (CCPA), which took effect in January 2020, California has emerged as an alternate contender in the race to set the new standard for privacy. Our close comparison of the GDPR and California’s privacy law reveals that the California law is not GDPR-lite: it retains a fundamentally American approach to information privacy. Reviewing the literature on regulatory competition, we argue that California, not Brussels, is catalyzing privacy law across the United States. And what is happening is not a simple story of powerful state actors. It is more accurately characterized as the result of individual networked norm entrepreneurs, influenced and even empowered by data globalization. Our study helps explain the puzzle of why Europe’s data privacy approach failed to spur US legislation for over two decades. Finally, our study answers critical questions of practical interest to individuals (who will protect my privacy?) and to businesses (whose rules should I follow?).

    From opt-in to obligation?: Examining the regulation of globally operating tech companies through alternative regulatory instruments from a material and territorial viewpoint

    Modern society’s ever-increasing reliance on technology raises complex legal challenges. In the search for an efficient and effective regulatory response, more and more authorities – in particular the European Union – are relying on alternative regulatory instruments (ARIs) when engaging big tech companies. Materially, this is a natural fit: the tech industry is a complex and rapidly evolving sector, and – unlike the rigid classic legislative process – ARIs allow for meaningful ex ante anticipatory constructions and ex post enforcement due to their unique flexibility. However, from a territorial point of view, several complications arise. Although the use of codes of conduct to regulate transnational private actors has a rich history, the way in which such codes are set out under Articles 40 and 41 of the EU’s GDPR implies a ‘hardening’ of these soft law instruments that has repercussions for their relationship to the principles of territorial jurisdiction. This contribution serves as a first step for further research into the relationship between codes of conduct, the regulation of the tech industry, and the related territorial aspects.

    A Blockchain-based Approach for Data Accountability and Provenance Tracking

    The recent approval of the General Data Protection Regulation (GDPR) imposes new data protection requirements on data controllers and processors with respect to the processing of European Union (EU) residents' data. These requirements consist of a single set of rules that have binding legal status and should be enforced in all EU member states. In light of these requirements, we propose in this paper the use of a blockchain-based approach to support data accountability and provenance tracking. Our approach relies on publicly auditable contracts deployed in a blockchain that increase transparency with respect to the access and usage of data. We identify and discuss three models for our approach, with different granularity and scalability requirements, in which contracts can be used to encode data usage policies and provenance tracking information in a privacy-friendly way. Of these three models, we designed, implemented, and evaluated two: one in which contracts are deployed by data subjects for each data controller, and one in which subjects join contracts deployed by data controllers if they accept the data handling conditions. Our implementations show in practice the feasibility and limitations of contracts for the purposes identified in this paper.
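    The subject-deploys-per-controller model described in the abstract can be sketched in ordinary Python. This is a hedged illustration only: every class, field, and name below is hypothetical, and a real deployment would encode the same logic in on-chain smart contracts rather than an in-memory object.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AccessEvent:
    """One provenance record: who asked, for what purpose, and when."""
    controller: str
    purpose: str
    timestamp: str

@dataclass
class DataUsageContract:
    """Models a publicly auditable contract binding one data subject to
    one controller's data-handling policy (names are illustrative)."""
    subject: str
    controller: str
    allowed_purposes: set
    provenance: list = field(default_factory=list)

    def request_access(self, purpose: str) -> bool:
        # Every request is appended to the audit trail, granted or not,
        # so access to and usage of the data remain transparent.
        granted = purpose in self.allowed_purposes
        self.provenance.append(AccessEvent(
            self.controller, purpose,
            datetime.now(timezone.utc).isoformat()))
        return granted

# Model 1: the subject deploys one contract per controller.
contract = DataUsageContract("alice", "acme-bank",
                             {"billing", "fraud-detection"})
assert contract.request_access("billing") is True
assert contract.request_access("marketing") is False
assert len(contract.provenance) == 2  # both requests are auditable
```

    The second evaluated model would invert the deployment: the controller publishes a single contract and each subject joins it by accepting the stated data handling conditions.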

    After Over-Privileged Permissions: Using Technology and Design to Create Legal Compliance

    Consumers in the mobile ecosystem can putatively protect their privacy through application permissions. However, this requires mobile device owners to understand permissions and their privacy implications. Yet few consumers appreciate the nature of permissions within the mobile ecosystem, often failing to notice the privacy permissions that are altered when updating an app. Even more concerning is the lack of understanding of the wide use of third-party libraries, most of which are installed with automatic permissions, that is, permissions that must be granted for the application to function appropriately. Unsurprisingly, many of these third-party permissions violate consumers’ privacy expectations and thereby become “over-privileged” from the user’s perspective. Consequently, a gap emerges between what is practiced by the private sector and what is deemed appropriate by the public sector. Despite the growing attention given to privacy in the mobile ecosystem, legal literature has largely ignored the implications of mobile permissions. This article seeks to address this omission by analyzing the impacts of mobile permissions and the privacy harms experienced by consumers of mobile applications. The authors call for a review of industry self-regulation and the overreliance upon simple notice and consent. Instead, the authors set out a plan for greater attention to socio-technical solutions, focusing on better privacy protections and technology embedded within the automatic permission-based application ecosystem.

    Making GDPR Usable: A Model to Support Usability Evaluations of Privacy

    We introduce a new model for evaluating privacy that builds on the criteria proposed by the EuroPriSe certification scheme by adding usability criteria. Our model is visually represented as a cube, called the Usable Privacy Cube (or UP Cube), where each of its three axes of variability captures, respectively: the rights of the data subjects, privacy principles, and usable privacy criteria. We slightly reorganize the criteria of EuroPriSe to fit the UP Cube model, i.e., we show how EuroPriSe can be viewed as a combination of only rights and principles, forming the two axes at the base of our UP Cube. In this way we also bring out two perspectives on privacy: that of the data subjects and that of the controllers/processors. We define usable privacy criteria based on usability goals that we have extracted from the whole text of the General Data Protection Regulation. The criteria are designed to produce measurements of the level of usability with which the goals are reached. Specifically, we measure effectiveness, efficiency, and satisfaction, considering both objective and perceived usability outcomes, producing measures of accuracy and completeness, of resource utilization (e.g., time, effort, financial), and measures resulting from satisfaction scales. In the long run, the UP Cube is meant to be the model behind a new certification methodology capable of evaluating the usability of privacy, to the benefit of common users. For industries, considering the usability of privacy would also allow for greater business differentiation, beyond GDPR compliance.
    Comment: 41 pages, 2 figures, 1 table, and appendixes
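    The three measures named in the abstract (effectiveness, efficiency, satisfaction) can be made concrete with a small sketch. The formulas below follow the common ISO-style reading of these usability metrics, not necessarily the paper's exact instruments; the function names and example numbers are invented for illustration.

```python
def effectiveness(completed_correctly: int, attempted: int) -> float:
    """Objective effectiveness: share of usability goals reached
    accurately and completely."""
    return completed_correctly / attempted

def efficiency(effectiveness_score: float, resources_spent: float) -> float:
    """Resource utilization: effectiveness achieved per unit of
    resource (e.g. time in minutes, effort, or cost)."""
    return effectiveness_score / resources_spent

def satisfaction(scale_responses: list, scale_max: int = 5) -> float:
    """Perceived usability: mean of satisfaction-scale responses,
    normalized to [0, 1]."""
    return sum(scale_responses) / (len(scale_responses) * scale_max)

# Hypothetical measurements for one usable-privacy criterion:
eff = effectiveness(8, 10)            # 8 of 10 users reached the goal
assert eff == 0.8
assert efficiency(eff, 4.0) == 0.2    # per minute of user time
assert satisfaction([4, 5, 3, 4]) == 0.8
```

    A certification scheme in the spirit of the UP Cube would aggregate such per-criterion scores along the usable-privacy axis, alongside the rights and principles axes.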

    The impact of the General Data Protection Regulation on the financial services industry of small European states

    This paper is based on the unpublished thesis by Magri, A. (2018), An Evaluation of the Impact of GDPR on the Local Financial Services Industry, Banking and Finance, Department of Banking and Finance, Faculty of Economics, Management and Accountancy, University of Malta, supervised by Dr. Simon Grima.
    Purpose: With this paper we evaluate the impact and implications of the European Union (EU) General Data Protection Regulation (GDPR) on the financial services industry in small European states, specifically Malta, Slovenia, Luxembourg, Lithuania, Latvia, Estonia and Cyprus: that is, EU countries with populations under 3 million.
    Design/methodology/approach: We collected our primary data by carrying out scheduled semi-structured interviews (using WhatsApp®, Messenger® and Skype®) with 63 participants working directly or indirectly with GDPR in financial services, between November 2018 and April 2019. The interviews were structured around two impact themes, ‘Trust, Standardisation and Reputation’ and ‘Training and Resources’, with 18 statements under each theme, to which participants responded on a 5-point Likert scale ranging from “Strongly Disagree” to “Strongly Agree”. To answer the research questions, the empirical data were subjected to statistical analysis using SPSS (Version 21), namely descriptive statistics, box plots and later MANOVA, while the qualitative data were analysed using a thematic approach.
    Findings: We found that, overall, participants feel that although GDPR has increased workload and costs, it has helped to improve the trust, standardisation and reputation of the institutions they represent. However, this comes with some repercussions from data subjects who are not conversant with the regulation and are apprehensive about the consents required.
    Originality/value: Although all states may be represented in the decision process, the larger states usually take over and sometimes dictate the final decision. The concept of proportionality in regulation is not clear and is not effectively managed, to the disadvantage of the smaller states. This paper is therefore important because it voices the concerns of smaller states and allows for an understanding of the impact and implications of new regulations for smaller jurisdictions, in this case within the EU.
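    The quantitative step of such a design (coding 5-point Likert responses and computing descriptive statistics) can be sketched in Python as a stand-in for the SPSS workflow the authors describe. The response data below are invented purely for illustration.

```python
from statistics import mean, median

# Standard 5-point Likert coding, "Strongly Disagree" (1) to
# "Strongly Agree" (5).
LIKERT = {"Strongly Disagree": 1, "Disagree": 2, "Neutral": 3,
          "Agree": 4, "Strongly Agree": 5}

def describe(responses: list) -> dict:
    """Descriptive statistics for one Likert-scale statement."""
    scores = [LIKERT[r] for r in responses]
    return {"n": len(scores), "mean": mean(scores),
            "median": median(scores)}

# Hypothetical responses to one statement under the
# 'Trust, Standardisation and Reputation' theme:
stats = describe(["Agree", "Strongly Agree", "Neutral",
                  "Neutral", "Agree"])
assert stats["n"] == 5
assert stats["mean"] == 3.8
assert stats["median"] == 4
```

    Box plots and a MANOVA across the two themes would then operate on these per-statement score vectors.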