185 research outputs found

    Quality of service monitoring: Performance metrics across proprietary content domains

    We propose a quality of service (QoS) monitoring program for broadband access to measure the impact of proprietary network spaces. Our paper surveys other QoS policy initiatives, including those in the airline industry and in the wireless and wireline telephone industries, to situate broadband in the context of other markets undergoing regulatory devolution. We illustrate how network architecture can create impediments to open communications, and how QoS monitoring can detect such effects. We present data from a field test of QoS-monitoring software now in development. We suggest QoS metrics to gauge whether information "walled gardens" represent a real threat of dividing the Internet into proprietary spaces. To demonstrate our proposal, we are placing our software on the computers of a sample of broadband subscribers. The software periodically conducts a battery of tests that assess the quality of connections from the subscriber's computer to various content sites. Any systematic differences in connection quality between affiliated and non-affiliated content sites would warrant research into the behavioral implications of those differences. QoS monitoring is timely because the potential for the Internet to break into a loose network of proprietary content domains appears stronger than ever. Recent court rulings and policy statements suggest a growing trend towards relaxed scrutiny of mergers and the easing or elimination of content ownership rules. This policy environment could lead to a market with a small number of large, vertically integrated network operators, each pushing its proprietary content on subscribers. Comment: 29th TPRC Conference, 200
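
    As a rough illustration of the kind of test battery the abstract describes, the sketch below periodically fetches pages from two site lists and compares median latencies. The site lists, the latency-only metric, and all names are assumptions for illustration; they are not the paper's actual software or metrics.

```python
# Illustrative sketch only: measure fetch latency from a subscriber's machine to
# affiliated vs. non-affiliated content sites. The site lists and the latency-only
# metric are assumptions for illustration, not the paper's monitoring software.
import time
import statistics
import urllib.request

AFFILIATED = ["https://example-affiliated.net/"]       # hypothetical
NON_AFFILIATED = ["https://example-independent.org/"]  # hypothetical

def fetch_latency(url: str, timeout: float = 10.0) -> float:
    """Seconds taken to fetch the first bytes of a URL."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read(1024)  # a small chunk is enough; no need to download the full page
    return time.monotonic() - start

def run_battery(urls):
    """Median latency over a few trials, per URL."""
    return {u: statistics.median(fetch_latency(u) for _ in range(3)) for u in urls}

if __name__ == "__main__":
    affiliated = run_battery(AFFILIATED)
    independent = run_battery(NON_AFFILIATED)
    # A persistent gap between the two groups would be the kind of systematic
    # difference the abstract says should trigger follow-up research.
    print("affiliated:", affiliated)
    print("non-affiliated:", independent)
```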

    Government mandated blocking of foreign Web content

    Blocking of foreign Web content by Internet access providers has been a hot topic for the last 18 months in Germany. Since fall 2001 the state of North-Rhine-Westphalia has been trying very actively to mandate such blocking. This paper takes a technical view of the problems posed by the blocking orders, and of blocking content at the access- or network-provider level in general. It also gives some empirical data on the effects of the blocking orders to help in the legal assessment of the orders. Comment: Preprint, revised 30.6.200
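
    To make access-provider-level blocking concrete, the sketch below shows one common way such blocking is implemented and can be noticed: manipulation of answers at the provider's DNS resolver. The test domain, the choice of public resolver, and the DNS-only focus are assumptions for illustration, not details taken from the paper.

```python
# Illustrative sketch: compare the answer from the system's default (usually the
# ISP's) resolver with the answer from an independent public resolver; divergent
# answers can indicate resolver-level blocking. Requires the third-party dnspython
# package. The domain below is a placeholder, not one named in the paper.
import socket
import dns.resolver  # pip install dnspython

DOMAIN = "blocked-example.org"  # hypothetical test name

def default_resolver_answer(name: str) -> set[str]:
    """Addresses returned by the system resolver."""
    return {info[4][0] for info in socket.getaddrinfo(name, 80, proto=socket.IPPROTO_TCP)}

def independent_resolver_answer(name: str, nameserver: str = "9.9.9.9") -> set[str]:
    """Addresses returned when querying an independent public resolver directly."""
    res = dns.resolver.Resolver(configure=False)
    res.nameservers = [nameserver]
    return {rr.address for rr in res.resolve(name, "A")}

if __name__ == "__main__":
    local = default_resolver_answer(DOMAIN)
    external = independent_resolver_answer(DOMAIN)
    if local != external:
        print("answers diverge - possible resolver-level manipulation:", local, external)
    else:
        print("answers match:", local)
```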

    Analysis of prefetching methods from a graph-theoretical perspective

    It is important to highlight the role Content Distribution Networks (CDNs) play in rapidly growing Internet topologies. They are responsible for serving the lion's share of Internet content to end users by replicating it from the origin server and placing it on a caching server closer to them. Probably the biggest issues CDNs have to deal with revolve around deciding which content gets prefetched, which surrogate/caching server it is placed on, and how to allocate storage to each server efficiently. We focus on the content selection/prefetching problem, extending the work done by Sidiropoulos et al. (World Wide Web Journal, vol. 11, 2008, pp. 39-70). Specifically, we try to determine how their clustering algorithm can work in specific environments compared with an approach used to solve the surveillance game on graphs, as discussed by Fomin et al. (Proc. 6th Int’l Conf. on FUN with Algorithms, 2012, pp. 166-176) and Giroire et al. (Journal of Theoretical Computer Science, vol. 584, 2015, pp. 131-143). Along the way, we provide another definition of cluster cohesion that accounts for edge cases. Finally, we define an original problem, which consists of partitioning a graph into a predefined number of disjoint clusters of optimal average cohesion.
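
    Since the thesis's own cohesion definition is not reproduced here, the sketch below uses internal edge density as a stand-in cohesion measure (with singleton clusters treated as fully cohesive, one possible edge-case convention) to show how the average cohesion of a disjoint partition can be computed.

```python
# Illustrative sketch: average cohesion of a graph partition, with cohesion taken
# as internal edge density. This is an assumed stand-in definition, not the one
# proposed in the thesis. Requires the third-party networkx package.
import networkx as nx

def cluster_cohesion(graph: nx.Graph, cluster: set) -> float:
    """Internal edge density of one cluster; singletons count as fully cohesive."""
    n = len(cluster)
    if n < 2:
        return 1.0  # edge-case convention assumed for illustration
    internal = graph.subgraph(cluster).number_of_edges()
    return internal / (n * (n - 1) / 2)

def average_cohesion(graph: nx.Graph, clusters: list) -> float:
    """Mean cohesion over a disjoint partition of the graph's nodes."""
    return sum(cluster_cohesion(graph, c) for c in clusters) / len(clusters)

if __name__ == "__main__":
    g = nx.Graph([(1, 2), (2, 3), (1, 3), (3, 4), (4, 5)])
    partition = [{1, 2, 3}, {4, 5}]
    print(average_cohesion(g, partition))  # 1.0: both clusters are fully connected
```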

    The growing complexity of content delivery networks: Challenges and implications for the Internet ecosystem

    Since the commercialization of the Internet, content and related applications, including video streaming, news, advertisements, and social interaction, have moved online. It is broadly recognized that the rise of all of these different types of content (static and dynamic, and increasingly multimedia) has been one of the main forces behind the phenomenal growth of the Internet and its emergence as essential infrastructure for how individuals across the globe gain access to the content sources they want. To accelerate the delivery of diverse content in the Internet and to provide commercial-grade performance for video delivery and the Web, Content Delivery Networks (CDNs) were introduced. This paper describes the current CDN ecosystem and the forces that have driven its evolution. We outline the different CDN architectures and consider their relative strengths and weaknesses. Our analysis highlights the role of location, the growing complexity of the CDN ecosystem, and their relationship to and implications for interconnection markets. EC/H2020/679158/EU/Resolving the Tussle in the Internet: Mapping, Architecture, and Policy Making/ResolutioNe
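
    As a minimal illustration of the role of location mentioned above, the sketch below shows the request-routing decision common to most CDN architectures: send each client to the surrogate expected to be closest to it. The surrogate names and the RTT-only criterion are assumptions for illustration, not details from the paper.

```python
# Illustrative sketch of CDN request routing: pick the surrogate (cache) server
# with the lowest measured round-trip time to the client. Names and RTT values
# are hypothetical.
SURROGATES = {
    "cache-eu.example.net": 18.0,   # hypothetical measured RTTs in milliseconds
    "cache-us.example.net": 95.0,
    "cache-ap.example.net": 210.0,
}

def pick_surrogate(rtts_ms: dict) -> str:
    """Return the surrogate with the lowest measured round-trip time."""
    return min(rtts_ms, key=rtts_ms.get)

print(pick_surrogate(SURROGATES))  # cache-eu.example.net
```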

    SocialSensor: sensing user generated input for improved media discovery and experience

    SocialSensor will develop a new framework for enabling real-time multimedia indexing and search in the Social Web. The project moves beyond conventional text-based indexing and retrieval models by mining and aggregating user inputs and content over multiple social networking sites. Social Indexing will incorporate information about the structure and activity of the users' social network directly into the multimedia analysis and search process. Furthermore, it will enhance the multimedia consumption experience by developing novel user-centric media visualization and browsing paradigms. For example, SocialSensor will analyse the dynamic and massive user contributions in order to extract unbiased trending topics and events and will use social connections for improved recommendations. To achieve its objectives, SocialSensor introduces the concept of Dynamic Social COntainers (DySCOs), a new layer of online multimedia content organisation with particular emphasis on the real-time, social and contextual nature of content and information consumption. Through the proposed DySCOs-centered media search, SocialSensor will integrate social content mining, search and intelligent presentation in a personalized, context- and network-aware way, based on aggregation and indexing of both UGC and multimedia Web content.
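
    The sketch below illustrates, in heavily simplified form, one ingredient of the approach described above: aggregating user posts from several social sources and extracting the most frequent topics. The data layout and the hashtag-counting heuristic are assumptions for illustration, not the project's DySCO machinery.

```python
# Illustrative sketch: aggregate posts from multiple social sources and return the
# most frequent hashtags as "trending topics". Purely a toy stand-in for the
# project's social indexing pipeline.
from collections import Counter
import re

def trending_topics(posts_by_source: dict, top_n: int = 3) -> list:
    """Count hashtags across all sources and return the most frequent ones."""
    counts = Counter()
    for posts in posts_by_source.values():
        for text in posts:
            counts.update(tag.lower() for tag in re.findall(r"#\w+", text))
    return [tag for tag, _ in counts.most_common(top_n)]

posts = {
    "siteA": ["Watching the match #finals #football", "Great goal! #finals"],
    "siteB": ["Crowds gathering downtown #finals", "New phone released #tech"],
}
print(trending_topics(posts))  # ['#finals', '#football', '#tech']
```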

    Lex Informatica: The Formulation of Information Policy Rules through Technology

    Historically, law and government regulation have established default rules for information policy, including constitutional rules on freedom of expression and statutory rights of ownership of information. This Article will show that for network environments and the Information Society, however, law and government regulation are not the only source of rule-making. Technological capabilities and system design choices impose rules on participants. The creation and implementation of information policy are embedded in network designs and standards as well as in system configurations. Even user preferences and technical choices create overarching, local default rules. This Article argues, in essence, that the set of rules for information flows imposed by technology and communication networks form a “Lex Informatica” that policymakers must understand, consciously recognize, and encourage.

    Ill Telecommunications: How Internet Infrastructure Providers Lose First Amendment Protection

    The Federal Communications Commission (FCC) recently proposed an Internet nondiscrimination rule: Subject to reasonable network management, a provider of broadband Internet access service must treat lawful content, applications, and services in a nondiscriminatory manner. Among other requests, the FCC sought comment on whether the proposed nondiscrimination rule would promote free speech, civic participation, and democratic engagement, and whether it would impose any burdens on access providers' speech that would be cognizable for purposes of the First Amendment. The purpose of this Article is to suggest that a wide range of responses to these First Amendment questions, offered by telecommunications providers and civil society groups alike, have glossed over a fundamental question: whether the activities of broadband Internet providers are sufficiently imbued with speech or expressive conduct to warrant protection under the First Amendment in the first place. Interestingly, it is not only those who argue against governmental regulation who make this threshold mistake. Those who argue for the importance of imposing nondiscrimination and common carriage rules upon telecommunications providers also, in their eagerness to open up a conversation about the values of free speech in the age of the Internet, pay little attention to this preliminary question. Yet if this question is not resolved, any subsequent analysis of those who facilitate Internet-based telecommunications will necessarily rest on an incoherent and insufficiently considered definition of the speech that is at the heart of First Amendment concerns. This Article analyzes the FCC's proposed nondiscrimination rule with an eye towards whether the rule affects the speech or expressive conduct of broadband providers in a manner that is cognizable for First Amendment purposes. Discussion of the values, free speech theories, policies, investment incentives, and economic and governmental interests underlying the resolution of this claim--values emphasized by the vast majority of parties engaged in the network neutrality debate, at significant cost to the clarity of constitutional elements--is deferred pending the evaluation of this threshold question.

    Wireless Net Neutrality Regulation and the Problem with Pricing: An Empirical, Cautionary Tale

    I present here a unique empirical analysis of the consumer welfare benefits of prior regulation in the mobile telecommunications industry. In particular, I analyze the relative consumer benefits of state rate regulation and federal entry regulation. The institution of filing requirements and FTC review and approval of various consumer pricing regimes is highly analogous to the consumer price controls imposed by various state-level public utility commissions in the past. Furthermore, the imposition of a zero-price rule is analogous to past rate regulation; in particular, it is similar to past wholesale regulation with its underlying principles of open access and interconnection rights to non-network competitors. Consumer welfare in this empirical analysis is defined in terms of consumer prices, not in express terms of innovation increases in the application and equipment markets. A motivating rationale behind the zero-price rule, and network neutrality regulation in general, is that each application provider should enjoy nondiscriminatory access to the Internet and thus an equal opportunity to compete for the attention of end users. Consumer prices offer a proxy for the size of the available network because, as prices decrease, subscribership typically increases. As the size of the network increases, the benefit of network effects (e.g., profit, reputation, and notoriety) increases and, therefore, the incentive for innovation by application and equipment innovators increases. My analysis is set forth as follows. Part I presents a brief overview of a few key elements of the network neutrality debate that have led to various proposals for direct or indirect price regulation. Part II presents an introduction to the mobile communications industry and describes the unique dataset I use. Part III sets forth the empirical model to test for the efficacy of past regulation, including consumer price regulation and wholesale open access pricing regulation, and presents the results. Specifically, price regulation, akin to proposed consumer price regulation and the zero-price rule, is shown to have had little or no benefit to consumers and may have harmed consumers in some instances. Moreover, even subjectively innocuous regulation is shown to have, at best, an ambiguous effect on consumer welfare. Comparable analysis of regulation increasing market entry suggests great consumer welfare benefits, indicating that regulation is best directed at encouraging increased competition rather than dictating specific network neutrality requirements to individual operators. Finally, the Conclusion sets forth the policy recommendations indicated by the empirical results.