
    Systematizing Decentralization and Privacy: Lessons from 15 Years of Research and Deployments

    Decentralized systems are a subset of distributed systems in which multiple authorities control different components and no single authority is fully trusted by all. This implies that any component in a decentralized system is potentially adversarial. We review fifteen years of research on decentralization and privacy, providing an overview of key systems as well as key insights for designers of future systems. We show that decentralized designs can enhance privacy, integrity, and availability, but also require careful trade-offs in terms of system complexity, properties provided, and degree of decentralization. These trade-offs need to be understood and navigated by designers. We argue that a combination of insights from cryptography, distributed systems, and mechanism design, aligned with the development of adequate incentives, is necessary to build scalable and successful privacy-preserving decentralized systems.

    Pushing BitTorrent Locality to the Limit

    Peer-to-peer (P2P) locality has recently attracted considerable interest in the community. Whereas P2P content distribution enables financial savings for content providers, it dramatically increases the traffic on inter-ISP links. To address this issue, the idea of keeping a fraction of the P2P traffic local to each ISP was introduced a few years ago, and P2P solutions exploiting locality have since appeared. However, several fundamental questions about locality remain open. In particular, how far can we push locality, and how much traffic reduction can locality achieve at the scale of the Internet? In this paper, we perform extensive experiments in a controlled environment with up to 10 000 BitTorrent clients to evaluate the impact of high locality on inter-ISP link traffic and peers' download completion times. We introduce two simple mechanisms that make high locality possible in challenging scenarios, and we show that they reduce inter-ISP traffic by up to several orders of magnitude compared to traditional locality without adversely impacting peers' download completion times. In addition, we crawled 214 443 torrents representing 6 113 224 unique peers spread among 9 605 ASes. We show that whereas the torrents we crawled generated 11.6 petabytes of inter-ISP traffic, our locality policy, had it been applied to all torrents, would have reduced the global inter-ISP traffic by 40%.
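
    The core idea behind locality policies is to bias neighbor selection toward peers in the same ISP (AS) and only fall back to remote peers when local ones are insufficient. The sketch below illustrates one such policy in Python; the function name, the per-AS cap, and the peer representation are illustrative assumptions, not the two mechanisms evaluated in the paper.

        from dataclasses import dataclass
        import random

        @dataclass
        class Peer:
            peer_id: str
            asn: int          # AS number the peer belongs to

        def select_neighbors(candidates, my_asn, max_neighbors=50, max_external=5):
            """Locality-biased neighbor selection (illustrative only).

            Prefer peers in the same AS as the local client; admit at most
            `max_external` peers from other ASes so swarms stay connected
            even when few local peers exist.
            """
            local = [p for p in candidates if p.asn == my_asn]
            external = [p for p in candidates if p.asn != my_asn]
            random.shuffle(local)
            random.shuffle(external)
            selected = local[:max_neighbors]
            remaining = max_neighbors - len(selected)
            selected += external[:min(max_external, remaining)]
            return selected

    Capping, rather than eliminating, external neighbors is one simple way to preserve swarm connectivity, consistent with the observation that high locality need not hurt download completion times.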

    Characterization of ISP Traffic: Trends, User Habits, and Access Technology Impact

    In recent years, the research community has increased its focus on network monitoring, which is seen as a key tool for understanding the Internet and its users. Several studies have presented a deep characterization of a particular application or a particular network, from the point of view of either the ISP or the Internet user. In this paper, we take a different perspective. We focus on three European countries where we have been collecting traffic for more than a year and a half through 5 vantage points with different access technologies. This vast amount of information allows us not only to provide precise, multiple, and quantitative measurements of what users do with the Internet in each country, but also to identify common and uncommon patterns and habits across countries. Considering different time scales, we first present trends in application popularity; we then focus on a one-month-long period, and further drill down into a typical daily characterization of user activity. Results depict an evolving scenario driven by the consolidation of new services such as Video Streaming and File Hosting and by the adoption of new P2P technologies. Despite the heterogeneity of the users, some common tendencies emerge that can be leveraged by ISPs to improve their services.
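
    A characterization like this is typically produced by reducing raw flow records to per-application traffic volumes at several time granularities (long-term trends, a one-month window, a typical day). The sketch below shows that aggregation step in Python; the record format, field names, and granularities are assumptions made for illustration and do not describe the authors' actual processing pipeline.

        from collections import defaultdict
        from datetime import datetime

        def aggregate_volumes(flow_records, granularity="hour"):
            """Sum bytes per (time bucket, application).

            flow_records: iterable of dicts with keys
              'ts' (ISO-8601 string), 'app' (e.g. 'http', 'bittorrent'), 'bytes'.
            granularity: 'hour' for daily profiles, 'day' for monthly trends.
            """
            fmt = "%Y-%m-%d %H:00" if granularity == "hour" else "%Y-%m-%d"
            volumes = defaultdict(int)
            for rec in flow_records:
                bucket = datetime.fromisoformat(rec["ts"]).strftime(fmt)
                volumes[(bucket, rec["app"])] += rec["bytes"]
            return dict(volumes)

        # Example: per-application volume in a given hour
        records = [
            {"ts": "2010-03-01T20:15:00", "app": "video_streaming", "bytes": 9_000_000},
            {"ts": "2010-03-01T20:40:00", "app": "bittorrent", "bytes": 4_000_000},
        ]
        print(aggregate_volumes(records, granularity="hour"))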

    Trusting (and Verifying) Online Intermediaries' Policing

    All is not well in the land of online self-regulation. However competently internet intermediaries police their sites, nagging questions will remain about their fairness and objectivity in doing so. Is Comcast blocking BitTorrent to stop infringement, to manage traffic, or to decrease access to content that competes with its own for viewers? How much digital due process does Google need to give a site it accuses of harboring malware? If Facebook censors a video of war carnage, is that a token of respect for the wounded or one more reflexive effort of a major company to ingratiate itself with the Washington establishment? Questions like these will persist, and erode the legitimacy of intermediary self-policing, as long as key operations of leading companies are shrouded in secrecy. Administrators must develop an institutional competence for continually monitoring rapidly changing business practices. A trusted advisory council charged with assisting the Federal Trade Commission (FTC) and Federal Communications Commission (FCC) could help courts and agencies adjudicate controversies concerning intermediary practices. An Internet Intermediary Regulatory Council (IIRC) would spur the development of the expertise necessary to understand whether companies' controversial decisions are socially responsible or purely self-interested. Monitoring is a prerequisite for assuring a level playing field online.

    Judging traffic differentiation as network neutrality violation according to internet regulation

    Network Neutrality (NN) is a principle establishing that traffic generated by Internet applications should be treated equally and should not be affected by arbitrary interference, degradation, or interruption. Despite this common sense, NN has multiple definitions spread across the academic literature, which differ primarily on what constitutes the proper level of equality for the network to be considered neutral. NN definitions may also be included in regulations that control activities on the Internet. However, regulations are set by regulators whose acts are valid only within a geographical area, named a jurisdiction. Thus, both academia and regulations provide multiple, heterogeneous NN definitions. In this thesis, regulations are used as guidelines for detecting NN violations, which under this approach are the adoption of traffic management practices prohibited by regulators. Solutions built on this approach can then provide helpful information for users to support claims against illegal traffic management practices. However, state-of-the-art solutions either adopt strict academic definitions (e.g., all traffic must be treated equally), which is not realistic, or adopt the regulatory definitions of a single jurisdiction, which ignores that multiple jurisdictions may be traversed along an end-to-end network path. An impact analysis showed that, under certain circumstances, from 39% to 48% of the detected Traffic Differentiations (TDs) are not NN violations when the regulations are considered, showing that the regulatory aspect must not be ignored. This thesis proposes a Regulation Assessment step to be performed after TD detection. This step considers all NN definitions that may apply along an end-to-end network path and flags an NN violation when any of them is violated. A service is proposed to perform this step on behalf of TD detection solutions, since it is infeasible for every solution to implement the required functionality. A Proof-of-Concept (PoC) prototype was developed based on the requirements identified during the impact analysis and was evaluated using information about TDs detected by a state-of-the-art solution. The verdicts were inconclusive (whether the TD is an NN violation or not) for a quarter of the scenarios, due to a lack of information about the traversed network paths and the occurrence zones (where in the network path the TD is suspected of being deployed). However, the literature already proposes approaches to obtain such information. These results should encourage proponents of TD detection solutions to collect this data and submit it for the Regulation Assessment.
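
    The Regulation Assessment step described above is essentially a decision procedure: given a detected TD and the jurisdictions where it may have been deployed, report a violation only if the practice is prohibited under every applicable regulation, and report an inconclusive verdict when path or occurrence-zone information is missing. The sketch below illustrates that logic in Python; the data model, rule entries, and verdict labels are assumptions made for illustration and do not reproduce the thesis' actual service.

        from typing import Iterable, Mapping, Optional

        # Hypothetical rule base: jurisdiction -> set of prohibited practices.
        PROHIBITED = {
            "EU": {"port_throttling", "application_blocking"},
            "BR": {"application_blocking"},
        }

        def assess_regulation(practice: str,
                              occurrence_jurisdictions: Optional[Iterable[str]],
                              rules: Mapping[str, set] = PROHIBITED) -> str:
            """Classify a detected traffic differentiation (illustrative only).

            practice: the suspected traffic management practice (e.g. 'port_throttling').
            occurrence_jurisdictions: jurisdictions where the TD may have been deployed,
                or None when the network path / occurrence zone is unknown.
            Returns 'violation', 'no_violation', or 'inconclusive'.
            """
            if not occurrence_jurisdictions:
                return "inconclusive"          # cannot tell which regulation applies
            verdicts = [practice in rules.get(j, set()) for j in occurrence_jurisdictions]
            if all(verdicts):
                return "violation"             # prohibited everywhere it may have occurred
            if not any(verdicts):
                return "no_violation"          # permitted everywhere it may have occurred
            return "inconclusive"              # depends on which jurisdiction actually applies

        print(assess_regulation("port_throttling", ["EU"]))   # -> violation
        print(assess_regulation("port_throttling", None))     # -> inconclusive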

    Study of Peer-to-Peer Network Based Cybercrime Investigation: Application on Botnet Technologies

    The scalable, low-overhead attributes of Peer-to-Peer (P2P) Internet protocols and networks lend themselves well to being exploited by criminals to execute a wide range of cybercrimes. The types of crimes aided by P2P technology include copyright infringement, sharing of illicit images of children, fraud, hacking/cracking, denial-of-service attacks, and virus/malware propagation through a variety of worms, botnets, malware, viruses, and P2P file sharing. This project focuses on the study of active P2P nodes and on the analysis of the undocumented communication methods employed in many of these large, unstructured networks. This is achieved through the design and implementation of an efficient P2P monitoring and crawling toolset. The requirement for investigating P2P-based systems is not limited to the more obvious cybercrimes listed above, as many legitimate P2P-based applications may also be pertinent to a digital forensic investigation, e.g., Voice over IP, instant messaging, etc. Investigating these networks has become increasingly difficult due to the broad range of network topologies and the ever-increasing and evolving range of P2P-based applications. In this work we introduce the Universal P2P Network Investigation Framework (UP2PNIF), a framework which enables significantly faster and less labour-intensive investigation of newly discovered P2P networks through the exploitation of the commonalities in P2P network functionality. In combination with a reference database of known network characteristics, it is envisioned that any known P2P network can be instantly investigated using the framework, which can intelligently determine the best investigation methodology and greatly expedite the evidence-gathering process. A proof-of-concept tool was developed for conducting investigations on the BitTorrent network. Comment: This is a thesis submitted in fulfilment of a PhD in Digital Forensics and Cybercrime Investigation in the School of Computer Science, University College Dublin, in October 201
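
    The framework idea described above amounts to looking up a newly encountered network in a database of known P2P network characteristics and dispatching a matching investigation routine. The following sketch illustrates that lookup-and-dispatch pattern in Python; the profile fields, network entries, and strategy functions are hypothetical placeholders, not part of UP2PNIF itself.

        from dataclasses import dataclass
        from typing import Callable, Dict, List

        @dataclass
        class NetworkProfile:
            name: str
            bootstrap_method: str       # e.g. 'tracker', 'dht', 'peer-exchange'
            default_ports: List[int]
            crawler: Callable[["NetworkProfile"], None]

        def crawl_bittorrent(profile: NetworkProfile) -> None:
            # Placeholder: contact the trackers / DHT nodes recorded for this network
            print(f"Crawling {profile.name} via {profile.bootstrap_method}...")

        # Reference database of known network characteristics (illustrative entries).
        KNOWN_NETWORKS: Dict[str, NetworkProfile] = {
            "bittorrent": NetworkProfile("BitTorrent", "dht", [6881], crawl_bittorrent),
        }

        def investigate(network_name: str) -> None:
            """Select and run the investigation routine for a known P2P network."""
            profile = KNOWN_NETWORKS.get(network_name.lower())
            if profile is None:
                raise ValueError(f"No profile for {network_name}; manual analysis needed")
            profile.crawler(profile)

        investigate("BitTorrent")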

    P2P Networks Blocking

    This bachelor's thesis deals with filtering the P2P networks used for data exchange. It describes the layers of the ISO/OSI model at which these networks can be blocked, and it describes tools for the Linux operating system that detect P2P networks based on the data (payload) portion of packets. Testing demonstrates successful blocking with the L7-filter and IPP2P filters.
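
    Both L7-filter and IPP2P work by matching application-layer signatures in the packet payload; for BitTorrent, the peer-wire handshake begins with the byte 0x13 followed by the string "BitTorrent protocol". The snippet below sketches that kind of payload matching in Python purely to illustrate the detection principle; it is not the actual L7-filter or IPP2P code, and a real deployment hooks into the kernel (e.g. via netfilter) rather than inspecting payloads in user space.

        import re

        # Signatures based on the well-known BitTorrent handshake and tracker
        # request formats; the patterns are illustrative, not L7-filter's own rules.
        BITTORRENT_HANDSHAKE = b"\x13BitTorrent protocol"
        TRACKER_REQUEST = re.compile(rb"GET /announce\?info_hash=")

        def looks_like_bittorrent(payload: bytes) -> bool:
            """Return True if the payload matches a BitTorrent signature."""
            return payload.startswith(BITTORRENT_HANDSHAKE) or bool(TRACKER_REQUEST.search(payload))

        # Example: a peer-wire handshake captured from a flow's first packet.
        sample = b"\x13BitTorrent protocol" + b"\x00" * 8 + b"X" * 40
        print(looks_like_bittorrent(sample))   # -> True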