
    Profiling and Automated Decision Making in the Present and New EU Data Protection Frameworks

    The digital world of the 21st century is increasingly a world of automatic decision making. In such a world, an ever larger number of tasks is relegated to computers, which gather and process data and suggest or make decisions silently and with little supervision. This situation has been made possible by the transfer of a staggering portion of our daily lives from the offline world to the Internet. It is a truism that automation would be impossible without our willing participation on the Internet: we freely take part in social networks, post on blogs, and send our emails. On the other hand, it is equally true that we are increasingly monitored by the state, by profit-maximizing corporations and by our fellow citizens, and that these methods of monitoring are becoming smarter. The vast amounts of data which have become available, and to which we contribute, form what we today call “big data”.1 This data is then harvested for connections and correlations, and profiles are created that can be used for commercial and other purposes. We fear this world but are also dependent on it. The creation of these profiles and their usage is an uncharted territory for the social sciences as much as it is a strange territory for regulators.

    The Impact of Commission’s Proposed Data Protection Regulation: Appendix to Deliverable D5.1

    The European Commission recently proposed a General Data Protection Regulation,1 which is meant to replace the EU Data Protection Directive2 and to thoroughly reform and modernize the EU privacy regulatory framework. The Regulation, if adopted, would introduce a number of changes, several of which would considerably alter the current privacy setting.3 First, the current Directive would be replaced with a Regulation, achieving EU-wide harmonization. Second, the scope of the instrument would be widened and the provisions made more precise. Third, the use of consent for data processing would be limited. Fourth, data protection “by design” would be distinguished from data protection “by default”. Fifth, new fundamental rights would be introduced and the old ones clarified. Sixth, new rules on controllers’ and processors’ duties, on supervisory authorities and on sanctions would be introduced. Finally, the Commission would obtain significant new powers to adopt delegated acts. This appendix explores the impact that the proposed Regulation might have on the interoperability of user-generated services.4 Since the proposed Regulation is an instrument of high complexity, only those provisions of direct relevance for the project and Work Package 5 will be analysed here.

    EU Regulatory Models for Platforms on the Content and Carrier Layers: Convergence and Changing Policy Patterns

    Digital “intermediaries” have been the subject of lawmakers’ interest both in the EU and elsewhere for at least two decades now. In recent years, however, and coinciding with the introduction of the new Digital Single Market Strategy in 2015, “platforms” specifically - rather than “intermediaries”, “information society services (ISSs)” or “networks” - have increasingly been the target of EU regulators’ attention. This article traces the EU’s increased focus on platforms and argues that its source can be found in a desire to respond to technology convergence. The first part of the article looks at why convergence and the emerging platform regulation stand in a relationship and attempts to outline the boundaries of that relationship. The second part argues that the EU has already undergone a regulatory shift from “services” to “platforms” and traces this idea from its 2015 DSM Strategy origins to several other documents, including the 2016 Communication on Platforms. The third part looks at how the emergence of the idea and its elevation to a policy goal has already prompted changes in each of the three regulatory layers. The concluding part establishes that the EU policy shift from ISSs, networks and services as the main regulatory units to platforms is sector-specific and prompted by attempts to address convergence. I argue that this approach may not necessarily be beneficial and look into alternatives involving a rethink of platform regulation across rather than within different layers.

    What Can We Learn from the Recent Deals?

    Recent years have seen a surge in the use of blockchain technologies, not least because of the increased use of cryptocurrencies such as Bitcoin which rely on them. While some of the generated interest can be dismissed as hype, there is little doubt that blockchain is a technology with the potential to revolutionise certain areas of law. A careful reader following the trends would have noticed that Maersk, the Danish business conglomerate, was involved in no fewer than three deals revolving, in one way or another, around blockchain technology. Such news should intrigue the reader, since the adoption of experimental methods or the newest IT technologies does not normally characterise the inert maritime and transport industries. In this short piece we give an overview of the relevance of blockchain and briefly look at three different deals which Maersk concluded around the technology. We then outline potential legal problems which these and similar deals might bring. Our preliminary conclusion is that blockchain technology - in some instances at least - has the potential to disrupt the role law traditionally plays in negotiating and executing international contracts.

    ePrivacy Directive: Assessment of Transposition, Effectiveness and Compatibility with Proposed Data Protection Regulation

    The ePrivacy Directive has been implemented in Denmark through a range of legislative instruments, beginning with the Act on Electronic Communications and Services but extending to the more important Executive Order on Provision of Electronic Services and the Cookie Order. This structure can be confusing for outsiders, as it involves several acts, each of which is concerned not just with one but with several directives. The use of ministerial orders can be explained by the need to introduce flexibility into a fast-changing area while avoiding a lengthy and complicated full legislative process.

    Comments on the Commission’s Green Paper on Copyright in the Knowledge Economy

    The Commission’s Green Paper on Copyright in the Knowledge Economy highlights, above all, the need for serious research and dialogue on the future of the Information Society Directive. Although the directive was drafted with new technology in mind, the developments of the past decade already show the need for a serious discussion about it. The debate about the issues raised in the Green Paper has begun in earnest not only on this side of the Atlantic but also on the other, and indeed around the world.

    Study on the Evaluation of the European Union Agency for Network and Information Security

    The European Union Agency for Network and Information Security (ENISA) was established in 2004. The Agency provides advice and recommendations, data analysis, and supports awareness raising and cooperation by the EU bodies and Member States in the field of cybersecurity. ENISA uses its expertise to improve cooperation between Member States, and between actors from the public and private sectors, as well as to support capacity building. The present study involves the evaluation of ENISA over the 2013-2016 period, assessing the Agency’s performance, governance and organisational structure, and positioning with respect to other EU and national bodies. It assesses ENISA’s strengths, weaknesses, opportunities and threats (SWOTs) with regard to the new cybersecurity and digital privacy landscape. It also provides options to modify the mandate of the Agency to better respond to new, emerging needs, and assesses their financial implications. The findings of the evaluation study show that ENISA has made some important achievements towards increasing network and information security (NIS) in the EU. However, a fragmented approach to cybersecurity across the EU and issues internal to the Agency, including limited financial resources, hinder ENISA’s ability to respond to the ever-growing needs of stakeholders in a context of technological developments and evolving cybersecurity threats.

    Measurement of the top quark forward-backward production asymmetry and the anomalous chromoelectric and chromomagnetic moments in pp collisions at √s = 13 TeV

    The parton-level top quark (t) forward-backward asymmetry and the anomalous chromoelectric (d̂_t) and chromomagnetic (μ̂_t) moments have been measured using LHC pp collisions at a center-of-mass energy of 13 TeV, collected in the CMS detector in a data sample corresponding to an integrated luminosity of 35.9 fb⁻¹. The linearized variable A_FB^(1) is used to approximate the asymmetry. Candidate tt̄ events decaying to a muon or electron and jets in final states with low and high Lorentz boosts are selected and reconstructed using a fit of the kinematic distributions of the decay products to those expected for tt̄ final states. The values found for the parameters are A_FB^(1) = 0.048 +0.095/−0.087 (stat) +0.020/−0.029 (syst) and μ̂_t = −0.024 +0.013/−0.009 (stat) +0.016/−0.011 (syst), and a limit is placed on the magnitude |d̂_t| < 0.03 at 95% confidence level.

    Measurement of tt̄ normalised multi-differential cross sections in pp collisions at √s = 13 TeV, and simultaneous determination of the strong coupling strength, top quark pole mass, and parton distribution functions


    An embedding technique to determine ττ backgrounds in proton-proton collision data

    An embedding technique is presented to estimate standard model ττ backgrounds from data with minimal simulation input. In the data, the muons are removed from reconstructed μμ events and replaced with simulated tau leptons with the same kinematic properties. In this way, a set of hybrid events is obtained that does not rely on simulation except for the decay of the tau leptons. The challenges in describing the underlying event or the production of associated jets in the simulation are avoided. The technique described in this paper was developed for CMS. Its validation and the inherent uncertainties are also discussed. The demonstration of the performance of the technique is based on a sample of proton-proton collisions collected by CMS in 2017 at √s = 13 TeV corresponding to an integrated luminosity of 41.5 fb⁻¹.
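    The muon-replacement step at the heart of the embedding technique can be sketched in a few lines of Python. This is a minimal illustration under toy assumptions (a dictionary-based event record with illustrative field names, not the actual CMS data format or software): the reconstructed muons are swapped for tau leptons carrying the same (pT, η, φ), so only the lepton mass changes, while the jets and the rest of the event are kept from data.

```python
import math

TAU_MASS = 1.77686  # tau lepton mass in GeV (PDG value)

def four_momentum(pt, eta, phi, mass):
    """Build a Cartesian four-vector (E, px, py, pz) from pt, eta, phi, mass."""
    px = pt * math.cos(phi)
    py = pt * math.sin(phi)
    pz = pt * math.sinh(eta)
    energy = math.sqrt(px * px + py * py + pz * pz + mass * mass)
    return (energy, px, py, pz)

def embed_taus(mumu_event):
    """Replace the reconstructed muons with tau leptons that carry the same
    (pt, eta, phi); only the lepton mass changes. Jets and everything else
    in the event are kept from data. In the real technique, the tau decays
    would then be simulated and merged back into this hybrid event."""
    hybrid = {key: val for key, val in mumu_event.items() if key != "muons"}
    hybrid["leptons"] = [
        {"flavour": "tau",
         "p4": four_momentum(mu["pt"], mu["eta"], mu["phi"], TAU_MASS)}
        for mu in mumu_event["muons"]
    ]
    return hybrid

# Toy di-muon event selected from data (illustrative values).
event = {
    "muons": [{"pt": 40.0, "eta": 0.5, "phi": 1.2},
              {"pt": 35.0, "eta": -1.1, "phi": -2.0}],
    "jets": ["jet_1", "jet_2"],  # associated jets, taken unchanged from data
}
hybrid = embed_taus(event)
```

    Preserving the measured (pT, η, φ) is what lets the hybrid event inherit the underlying event and associated jet activity directly from data, which is precisely what the abstract identifies as the hard part to simulate.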