
    Measuring Infringement of Intellectual Property Rights

    © Crown Copyright 2014. You may re-use this information (excluding logos) free of charge in any format or medium, under the terms of the Open Government Licence. To view this licence, visit http://www.nationalarchives.gov.uk/doc/open-government-licence/ Where we have identified any third party copyright information you will need to obtain permission from the copyright holders concerned.

    The review is wide-ranging in scope and overall our findings evidence a lack of appreciation, among those producing research, of the high-level principles of measurement and assessment of scale. To date, the approaches adopted by industry seem designed more for internal consumption and are usually contingent on particular technologies and/or sector perspectives. Typically, there is a lack of transparency in the methodologies and data used to form the basis of claims, making much of this an unreliable basis for policy formulation. The research approaches we found share a preference for reactive methods that establish snapshots of an important issue at the time of investigation. Most studies are ad hoc in nature, and on the whole we found a lack of sustained longitudinal approaches that would develop an appreciation of change. Typically the studies are designed to address specific hypotheses that might serve to support the position of the particular commissioning body.

    To help bring some structure to this area, we propose a framework for the assessment of the volume of infringement in each different area. The underlying aim is to draw out a common approach wherever possible in each area, rather than being drawn initially to the differences in each field. We advocate ongoing survey tracking of the attitudes, perceptions and, where practical, behaviours of both perpetrators and claimants in IP infringement. Clearly, the nature of perpetrators, claimants and enforcement differs within each IPR, but in our view the assessment for each IPR should include all of these elements. The key element of the survey structure is the adoption of a survey sampling methodology with smaller volumes of representative participation. Once selection is given the appropriate priority, a traditional offline survey will have a part to play, but as the opportunity arises, new technological methodologies, particularly for the voluntary monitoring of online behaviour, can add detail to the overall assessment of the scale of activity. This framework can be applied within each of the IP right sectors: copyright, trademarks, patents, and design rights.

    It may well be that the costs involved with this common approach could be mitigated by a syndicated approach to the survey elements. Indeed, a syndicated approach has a number of advantages in addition to cost. It could be designed to reduce any tendency either to hide inappropriate/illegal activity or, alternatively, to exaggerate its volume to fit the theme of the survey. It also has the scope to allow for monthly assessments of attitudes rather than being vulnerable to unmeasured seasonal impacts.
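
    As a rough illustration of the representative-sampling element advocated above, the sketch below draws an equal-size random sample from each stratum of a hypothetical sampling frame. The strata, sizes, and field names are all assumptions for illustration, not part of the review's methodology.

        import random

        def stratified_sample(frame, per_stratum, seed=42):
            """Draw an equal-size random sample from each stratum of a sampling frame.

            frame: iterable of (participant_id, stratum) pairs -- a hypothetical format.
            """
            by_stratum = {}
            for pid, stratum in frame:
                by_stratum.setdefault(stratum, []).append(pid)
            rng = random.Random(seed)  # fixed seed so the draw is reproducible
            sample = []
            for stratum in sorted(by_stratum):
                members = by_stratum[stratum]
                sample.extend(rng.sample(members, min(per_stratum, len(members))))
            return sample

        # Toy frame: 1,000 participants split across three illustrative strata.
        frame = [(i, ["consumer", "creator", "rights_holder"][i % 3]) for i in range(1000)]
        panel = stratified_sample(frame, per_stratum=50)
        print(len(panel))  # 150 participants, 50 per stratum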

    Characterization of ISP Traffic: Trends, User Habits, and Access Technology Impact

    In recent years, the research community has increased its focus on network monitoring, which is seen as a key tool to understand the Internet and its users. Several studies have presented a deep characterization of a particular application or a particular network, from the point of view of either the ISP or the Internet user. In this paper, we take a different perspective. We focus on three European countries where we have been collecting traffic for more than a year and a half through 5 vantage points with different access technologies. This huge amount of information allows us not only to provide precise, multiple, and quantitative measurements of what users do with the Internet in each country, but also to identify common and uncommon patterns and habits across countries. Considering different time scales, we first present the trend of application popularity; we then focus on a one-month-long period, and further drill down into a typical daily characterization of user activity. Results depict an evolving scenario due to the consolidation of new services such as Video Streaming and File Hosting, and to the adoption of new P2P technologies. Despite the heterogeneity of the users, some common tendencies emerge that can be leveraged by ISPs to improve their services.
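
    A minimal sketch of the kind of per-application popularity breakdown described above, assuming flow records have already been classified by application. The column names and the toy data are assumptions for illustration, not the paper's dataset.

        import pandas as pd

        # Hypothetical flow log: one row per flow, already labeled with an application class.
        flows = pd.DataFrame({
            "ts": pd.to_datetime(["2010-03-01 09:00", "2010-03-01 20:30",
                                  "2010-03-02 10:15", "2010-03-02 21:45"]),
            "app": ["video_streaming", "p2p", "video_streaming", "file_hosting"],
            "bytes": [500_000_000, 200_000_000, 800_000_000, 300_000_000],
        })

        # Daily traffic volume per application, then each application's share of the day.
        daily = (flows.groupby([pd.Grouper(key="ts", freq="D"), "app"])["bytes"]
                      .sum()
                      .unstack(fill_value=0))
        share = daily.div(daily.sum(axis=1), axis=0)
        print(share.round(3))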

    Implications of Selfish Neighbor Selection in Overlay Networks

    In a typical overlay network for routing or content sharing, each node must select a fixed number of immediate overlay neighbors for routing traffic or content queries. A selfish node entering such a network would select neighbors so as to minimize the weighted sum of expected access costs to all its destinations. Previous work on selfish neighbor selection has built intuition with simple models where edges are undirected, access costs are modeled by hop-counts, and nodes have potentially unbounded degrees. However, in practice, important constraints not captured by these models lead to richer games with substantively and fundamentally different outcomes. Our work models neighbor selection as a game involving directed links, constraints on the number of allowed neighbors, and costs reflecting both network latency and node preference. We express a node's "best response" wiring strategy as a k-median problem on asymmetric distance, and use this formulation to obtain pure Nash equilibria. We experimentally examine the properties of such stable wirings on synthetic topologies, as well as on real topologies and maps constructed from PlanetLab and AS-level Internet measurements. Our results indicate that selfish nodes can reap substantial performance benefits when connecting to overlay networks composed of non-selfish nodes. On the other hand, in overlays that are dominated by selfish nodes, the resulting stable wirings are optimized to such a great extent that even non-selfish newcomers can extract near-optimal performance through naive wiring strategies.

    Marie Curie Outgoing International Fellowship of the EU (MOIF-CT-2005-007230); National Science Foundation (CNS Cybertrust 0524477, CNS NeTS 0520166, CNS ITR 0205294, EIA RI 020206)
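
    To make the "best response as k-median" idea concrete, here is a small sketch of a greedy approximation: a node picks k out-neighbors to minimize the weighted cost of reaching every destination through its chosen links. All names and toy numbers are illustrative assumptions; the paper's own solver for the asymmetric k-median formulation may differ.

        NO_ROUTE = 1e9  # large finite cost standing in for "unreachable"

        def greedy_best_response(candidates, dests, w, link_cost, reach_cost, k):
            """Greedily pick k neighbors minimizing sum_t w[t] * min_v (link_cost[v] + reach_cost[v][t]).

            link_cost[v]: latency of the directed link from the wiring node to candidate v.
            reach_cost[v][t]: cost from neighbor v onward to destination t under the current wiring.
            """
            chosen = []
            best = {t: NO_ROUTE for t in dests}  # cheapest known route to t via chosen neighbors
            for _ in range(k):
                top_gain, top_v = 0.0, None
                for v in candidates:
                    if v in chosen:
                        continue
                    gain = sum(w[t] * max(0.0, best[t] - (link_cost[v] + reach_cost[v][t]))
                               for t in dests)
                    if gain > top_gain:
                        top_gain, top_v = gain, v
                if top_v is None:
                    break  # no remaining candidate improves any destination
                chosen.append(top_v)
                for t in dests:
                    best[t] = min(best[t], link_cost[top_v] + reach_cost[top_v][t])
            return chosen

        # Toy instance: pick k=2 of three candidate neighbors for two weighted destinations.
        dests = ["d1", "d2"]
        w = {"d1": 1.0, "d2": 2.0}
        link_cost = {"a": 1.0, "b": 2.0, "c": 5.0}
        reach_cost = {"a": {"d1": 10.0, "d2": 1.0},
                      "b": {"d1": 1.0, "d2": 10.0},
                      "c": {"d1": 3.0, "d2": 3.0}}
        print(greedy_best_response(["a", "b", "c"], dests, w, link_cost, reach_cost, k=2))
        # -> ['a', 'b'] under these toy numbers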

    Measuring the State of ECN Readiness in Servers, Clients, and Routers

    Proceedings of the Eleventh ACM SIGCOMM/USENIX Internet Measurement Conference (IMC 2011), Berlin, DE, November 2011.

    Better exposing congestion can improve traffic management in the wide-area, at peering points, among residential broadband connections, and in the data center. TCP's network utilization and efficiency depend on congestion information, while recent research proposes economic and policy models based on congestion. Such motivations have driven widespread support of Explicit Congestion Notification (ECN) in modern operating systems. We reappraise the Internet's ECN readiness, updating and extending previous measurements. Across large and diverse server populations, we find a three-fold increase in ECN support over prior studies. Using new methods, we characterize ECN within mobile infrastructure and at the client side, populations previously unmeasured. Via large-scale path measurements, we find the ECN feedback loop failing in the core of the network 40% of the time, typically at AS boundaries. Finally, we discover new examples of infrastructure violating ECN Internet standards, and discuss remaining impediments to running ECN while suggesting mechanisms to aid adoption.
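
    A common way to test whether a server negotiates ECN is to send an ECN-setup SYN (SYN with ECE and CWR set, per RFC 3168) and check whether the SYN-ACK carries ECE. The Scapy sketch below is an illustrative probe in that spirit, not the paper's measurement tooling; it requires raw-socket privileges, and a single probe can be confounded by middleboxes on the path.

        from scapy.all import IP, TCP, RandShort, sr1

        def probe_ecn(host, port=80, timeout=2):
            """Send an ECN-setup SYN (RFC 3168) and report whether the SYN-ACK negotiates ECN."""
            syn = IP(dst=host) / TCP(sport=RandShort(), dport=port, flags="SEC")  # SYN+ECE+CWR
            resp = sr1(syn, timeout=timeout, verbose=False)
            if resp is None or TCP not in resp:
                return "no answer"
            f = resp[TCP].flags
            # An ECN-capable server replies SYN+ACK with ECE set and CWR clear.
            if f.S and f.A and f.E and not f.C:
                return "ECN negotiated"
            return "ECN not negotiated"

        if __name__ == "__main__":
            # Needs root; the local kernel may RST the unexpected SYN-ACK afterwards.
            print(probe_ecn("example.com"))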

    The Internet-Wide Impact of P2P Traffic Localization on ISP Profitability

    We conduct a detailed simulation study to examine how localizing P2P traffic within network boundaries impacts the profitability of an ISP. A distinguishing aspect of our work is the focus on Internet-wide implications, i.e., how adoption of localization within an ISP affects both itself and other ISPs. Our simulations are based on detailed models that estimate inter-autonomous-system (AS) P2P traffic and inter-AS routing, localization models that predict the extent to which P2P traffic is reduced, and pricing models that predict the impact of changes in traffic on the profit of an ISP. We evaluate our models using a large-scale crawl of BitTorrent containing over 138 million users sharing 2.75 million files. Our results show that the benefits of localization cannot be taken for granted. Some of our key findings include: 1) residential ISPs can actually lose money when localization is employed, and some of them will not see increased profitability until other ISPs employ localization; 2) the reduction in costs due to localization will be limited for small ISPs and tends to grow only logarithmically with client population; and 3) some ISPs can better increase profitability through strategies other than localization, by taking advantage of the business relationships they have with other ISPs.
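
    As a back-of-envelope illustration of why small ISPs see limited savings, the toy model below compares the expected share of a peer's traffic that leaves the ISP under uniform random peer selection versus perfect localization. It is a deliberately crude, assumption-laden sketch (uniform traffic per neighbor, a single swarm population), not the paper's traffic, routing, or pricing models.

        def external_traffic_fraction(n_internal, n_total, k, localized):
            """Expected fraction of a peer's k neighbors (and hence, roughly, its traffic)
            that sit outside the ISP, assuming uniform traffic per neighbor."""
            others_inside = n_internal - 1  # internal peers available to one client
            if localized:
                inside = min(k, others_inside)              # fill the neighbor set locally first
            else:
                inside = k * others_inside / (n_total - 1)  # uniform random peer selection
            return 1.0 - inside / k

        # Toy numbers: a 1M-peer population, 50 neighbors per peer.
        for n in (10, 100, 1_000, 10_000):
            before = external_traffic_fraction(n, 1_000_000, k=50, localized=False)
            after = external_traffic_fraction(n, 1_000_000, k=50, localized=True)
            print(f"{n:>6} clients: external share {before:.3f} -> {after:.3f}")

    Even under perfect localization, an ISP with only a handful of clients has few internal peers to localize to, so most traffic still crosses its boundary; this is consistent with finding 2) above, though the paper's logarithmic-growth result comes from its own detailed models.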