
    Judging traffic differentiation as network neutrality violation according to internet regulation

    Network Neutrality (NN) is a principle establishing that traffic generated by Internet applications should be treated equally and should not be affected by arbitrary interference, degradation, or interruption. Despite this common sense, NN has multiple definitions spread across the academic literature, which differ primarily on what constitutes the proper equality level for considering the network neutral. NN definitions may also be included in regulations that control activities on the Internet. However, regulations are set by regulators whose acts are valid only within a geographical area, named a jurisdiction. Thus, both academia and regulators provide multiple and heterogeneous NN definitions. In this thesis, regulations are used as guidelines to detect NN violations, which are, under this approach, the adoption of traffic management practices prohibited by regulators. Solutions can thereby provide helpful information for users to support claims against illegal traffic management practices. However, state-of-the-art solutions either adopt strict academic definitions (e.g., all traffic must be treated equally), which is not realistic, or adopt the regulatory definitions of a single jurisdiction, which ignores that multiple jurisdictions may be traversed in an end-to-end network path. An impact analysis showed that, under certain circumstances, 39% to 48% of the detected Traffic Differentiations (TDs) are not NN violations when the regulations are considered, showing that the regulatory aspect must not be ignored. This thesis proposes a Regulation Assessment step to be performed after TD detection. This step shall consider all NN definitions that may be found along an end-to-end network path and point out an NN violation when any of them is violated. A service is proposed to perform this step on behalf of TD detection solutions, given the infeasibility of every solution implementing the required functionalities. A Proof-of-Concept (PoC) prototype was developed based on the requirements identified during the impact analysis and was evaluated using information about TDs detected by a state-of-the-art solution. The verdicts were inconclusive (whether the TD is an NN violation or not) for a quarter of the scenarios, due to a lack of information about the traversed network paths and the occurrence zones (where in the network path the TD is suspected of being deployed). However, the literature already offers approaches for obtaining such information. These results should encourage the proponents of TD detection solutions to collect these data and submit them for the Regulation Assessment.
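    At its core, the Regulation Assessment step described in this abstract is a per-jurisdiction rule lookup: given a detected TD, check the suspected traffic management practice against the rules of every jurisdiction on the end-to-end path, and return an inconclusive verdict when path information is missing. The Python sketch below illustrates that logic only; the data structures, jurisdiction codes, and rule base are hypothetical assumptions for illustration, not the thesis's actual service interface.

    # Hypothetical sketch of a Regulation Assessment step; all names and
    # the rule base are illustrative, not the thesis's actual API.
    from dataclasses import dataclass

    @dataclass
    class TrafficDifferentiation:
        application: str          # e.g. "video-streaming"
        practice: str             # e.g. "throttling"
        jurisdictions: list[str]  # jurisdictions traversed on the end-to-end path

    # Illustrative rule base: per jurisdiction, the traffic management
    # practices its regulator prohibits (made-up entries).
    PROHIBITED = {
        "BR": {"blocking", "throttling"},
        "EU": {"blocking", "throttling", "paid-prioritization"},
        "US": {"blocking"},
    }

    def assess_regulation(td: TrafficDifferentiation) -> str:
        """Return a verdict for a detected TD against every traversed jurisdiction."""
        unknown = [j for j in td.jurisdictions if j not in PROHIBITED]
        if unknown:
            # Without rules or path information the verdict is inconclusive,
            # mirroring the quarter of scenarios reported in the abstract.
            return f"inconclusive: no rules for {unknown}"
        violated = [j for j in td.jurisdictions if td.practice in PROHIBITED[j]]
        return f"violation in {violated}" if violated else "no violation"

    print(assess_regulation(
        TrafficDifferentiation("video-streaming", "throttling", ["US", "EU"])))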

    A review of behavioural research on data security

    Protecting confidential information or data from being leaked to the public is a growing concern among organisations and individuals. This paper presents the results of a literature search on behavioural and security aspects of data protection. The topics covered by this review include a summary of the changes brought about by the EU GDPR (General Data Protection Regulation), human and behavioural aspects of data protection, security and data breach or loss (threats), IT architectures to protect data (prevention), managing data breaches (mitigation), risk assessment, and data protection audits. A distinction is made between threats and prevention originating from within an organisation and from the outside.

    The Price of Fashion: The Environmental Cost of the Textile Industry in China


    The New Gatekeepers: Private Firms as Public Enforcers

    The world’s largest businesses must routinely police other businesses. By public mandate, Facebook monitors app developers’ privacy safeguards, Citibank audits call centers for deceptive sales practices, and Exxon reviews offshore oil platforms’ environmental standards. Scholars have devoted significant attention to how policy makers deploy other private sector enforcers, such as certification bodies, accountants, lawyers, and other periphery “gatekeepers.” However, the literature has yet to explore the emerging regulatory conscription of large firms at the center of the economy. This Article examines the rise of the enforcer-firm through case studies of the industries that are home to the most valuable companies, in technology, banking, oil, and pharmaceuticals. Over the past two decades, administrative agencies have used legal rules, guidance documents, and court orders to mandate that private firms in these and other industries perform the duties of a public regulator. More specifically, firms must write rules in their contracts that reserve the right to inspect third parties. When they find violations, they must pressure or punish the wrongdoer. This form of governance has important intellectual and policy implications. It imposes more of a public duty on the firm, alters corporate governance, and may even reshape business organizations. It also gives resource-strapped regulators promising tools. If designed poorly, however, the enforcer-firm will create an expansive area of unaccountable authority. Any comprehensive account of the firm or regulation must give a prominent role to the administrative state’s newest gatekeepers.

    Rule-Makers or Rule-Takers? Exploring the Transatlantic Trade and Investment Partnership

    The Transatlantic Trade and Investment Partnership (TTIP) is an effort by the United States and the European Union to reposition themselves for a world of diffuse economic power and intensified global competition. It is a next-generation economic negotiation that breaks the mould of traditional trade agreements. At the heart of the ongoing talks is the question whether, and in which areas, the two major democratic actors in the global economy can address costly frictions generated by their deep commercial integration by aligning rules and other instruments. The aim is to reduce duplication in areas where levels of regulatory protection are equivalent, as well as to foster wide-ranging regulatory cooperation and set a benchmark for high-quality global norms. In this volume, European and American experts explain the economic context of TTIP and its geopolitical implications, and then explore the challenges and consequences of US-EU negotiations across numerous sensitive areas, ranging from food safety and public procurement to economic and regulatory assessments of technical barriers to trade, automotive, chemicals, energy, services, investor-state dispute settlement mechanisms and regulatory cooperation. Their insights cut through the confusion and tremendous public controversies now swirling around TTIP, and help decision-makers understand how the United States and the European Union can remain rule-makers rather than rule-takers in a globalising world in which their relative influence is waning.

    Online Personal Data Processing and EU Data Protection Reform. CEPS Task Force Report, April 2013

    This report sheds light on the fundamental questions and underlying tensions between current policy objectives, compliance strategies and global trends in online personal data processing, assessing the existing and future framework in terms of effective regulation and public policy. Based on the discussions among the members of the CEPS Digital Forum and independent research carried out by the rapporteurs, policy conclusions are derived with the aim of making EU data protection policy more fit for purpose in today’s online technological context. This report constructively engages with the EU data protection framework, but does not provide a textual analysis of the EU data protection reform proposal as such.

    Hitting Refresh: Regulating internet speech in the 21st century

    By the mid-1990s, the internet had taken new form outside of its original military applications and became commercially available at an unprecedented rate. Western democracies recognized that such a new frontier necessitated regulation. Their shared goal was to restrict objectionable content while simultaneously creating a pathway for this nascent industry to blossom. With this in mind, the U.S. and the European Union enacted divergent and mutually exclusive regulatory regimes to govern “online intermediaries,” or sites like Facebook and Twitter that merely host the speech of their users. The E.U. enacted aggressive content removal statutes, while the U.S. offered nearly blanket immunity to these sites in the hope that the marketplace of ideas would dilute objectionable content. Using the U.S. and Germany as case studies, this thesis argues that, twenty years later, neither pathway emerged particularly victorious in its quest to curb the dissemination of radicalizing content. I find that the failure under the German preemptive framework derives from a contradictory monitoring obligation and a lack of oversight by the European Commission on the state. Conversely, I find that the failure under the American deregulatory framework is rooted in a contradictory allocation of jurisdiction and a lack of oversight by the state upon intermediaries. By scrutinizing the incentive structures of both countries’ regulatory regimes, this thesis challenges the way Western democracies conceptualized and continue to conceptualize the internet, and points out how neither extreme has responsively moderated internet speech.

    Identification of Causal Paths and Prediction of Runway Incursion Risk using Bayesian Belief Networks

    In the U.S. and worldwide, runway incursions are widely acknowledged as a critical concern for aviation safety. However, despite widespread attempts to reduce the frequency of runway incursions, the rate at which these events occur in the U.S. has steadily risen over the past several years. Attempts to analyze runway incursion causation have been made, but these methods are often limited to investigations of discrete events and do not address the dynamic interactions that lead to breaches of runway safety. While the generally static nature of runway incursion research is understandable given that data are often sparsely available, the unmitigated rate at which runway incursions take place indicates a need for more comprehensive risk models that extend currently available research. This dissertation summarizes the existing literature, emphasizing the need for cross-domain methods of causation analysis applied to runway incursions in the U.S. and reviewing probabilistic methodologies for reasoning under uncertainty. A holistic modeling technique using Bayesian Belief Networks as a means of interpreting causation even in the presence of sparse data is outlined in three phases: causal factor identification, model development, and expert elicitation, with intended application at the systems or regulatory agency level. Further, the importance of investigating runway incursions probabilistically and incorporating information from human factors, technological, and organizational perspectives is supported. A method for structuring a Bayesian network using quantitative and qualitative event analysis in conjunction with structured expert probability estimation is outlined, and results are presented for propagation of evidence through the model as well as for causal analysis. In this research, advances in the aggregation of runway incursion data are outlined, and a means of combining quantitative and qualitative information is developed. Building upon these data, a method for developing and validating a Bayesian network while maintaining operational transferability is also presented. Further, the body of knowledge is extended with respect to structured expert judgment, as operationalization is combined with elicitation of expert data to create a technique for gathering expert assessments of probability in a computationally compact manner while preserving mathematical accuracy in rank correlation and dependence structure. The model developed in this study is shown to produce accurate results within the U.S. aviation system, and to provide a dynamic, inferential platform for future evaluation of runway incursion causation. These results in part confirm what is known about runway incursion causation, but more importantly they shed more light on multifaceted causal interactions and do so in a modeling space that allows for causal inference and evaluation of changes to the system in a dynamic setting. Suggestions for future research are also discussed, most prominent of which is that this model allows for robust and flexible assessment of mitigation strategies within a holistic model of runway safety.
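    As a concrete illustration of the modeling technique this abstract names, the following sketch encodes a toy runway incursion network with the pgmpy library and propagates evidence through it. The two-factor structure and all probabilities are invented stand-ins for the dissertation's expert-elicited factors, not its actual model.

    # Minimal sketch of a Bayesian Belief Network for runway incursion risk.
    # Structure and numbers are illustrative assumptions, not the
    # dissertation's model.
    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    # Structure: communication errors and low visibility both influence
    # the probability of a runway incursion.
    model = BayesianNetwork([
        ("CommError", "Incursion"),
        ("LowVisibility", "Incursion"),
    ])

    # CPDs with made-up numbers standing in for expert-elicited estimates.
    cpd_comm = TabularCPD("CommError", 2, [[0.9], [0.1]])
    cpd_vis = TabularCPD("LowVisibility", 2, [[0.8], [0.2]])
    cpd_inc = TabularCPD(
        "Incursion", 2,
        # P(Incursion | CommError, LowVisibility); columns follow the
        # evidence-state ordering (00, 01, 10, 11).
        [[0.99, 0.90, 0.85, 0.60],   # no incursion
         [0.01, 0.10, 0.15, 0.40]],  # incursion
        evidence=["CommError", "LowVisibility"],
        evidence_card=[2, 2],
    )
    model.add_cpds(cpd_comm, cpd_vis, cpd_inc)
    assert model.check_model()

    # Propagate evidence: incursion probability given a communication error.
    inference = VariableElimination(model)
    print(inference.query(["Incursion"], evidence={"CommError": 1}))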