2,129 research outputs found

    Flexpop: A popularity-based caching strategy for multimedia applications in information-centric networking

    Information-Centric Networking (ICN) is a leading candidate architecture for the future Internet. In ICN, content items are stored temporarily in network nodes such as routers. When a router's memory is full and there is no room for newly arriving content, stored items must be evicted to cope with the limited cache size. It is therefore crucial to develop an effective caching strategy that retains popular content for longer. This study proposes a new caching strategy, Flexible Popularity-based Caching (FlexPop), for storing popular content. FlexPop comprises two mechanisms: the Content Placement Mechanism (CPM), which is responsible for content caching, and the Content Eviction Mechanism (CEM), which handles eviction when the router cache is full and there is no space for new incoming content. Both mechanisms are validated using Fuzzy Set Theory, following the Design Research Methodology (DRM) to show that the research is rigorous and repeatable under comparable conditions. The performance of FlexPop is evaluated through simulations, and the results are compared with those of the Leave Copy Everywhere (LCE), ProbCache, and Most Popular Content (MPC) strategies. The results show that FlexPop outperforms LCE, ProbCache, and MPC with respect to cache hit rate, redundancy, content retrieval delay, memory utilization, and stretch ratio, metrics that various studies regard as central to evaluating ICN caching. These outcomes are noteworthy because they let users verify the performance of ICN caching strategies before selecting one. FlexPop thus has potential for ICN-based future Internet deployments, such as the IoT.
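
    The placement-and-eviction idea described above can be sketched in a few lines. This is a minimal illustration only: it assumes a plain request counter as the popularity signal, whereas the actual CPM and CEM are built on fuzzy set theory. The class name, the threshold parameter, and the fetch callback are all hypothetical, not taken from the paper.

```python
from collections import Counter

class PopularityCache:
    """Simplified popularity-based router cache: caches an item once it
    has been requested at least `threshold` times (placement) and evicts
    the least popular cached item when the cache is full (eviction)."""

    def __init__(self, capacity: int, threshold: int = 2):
        self.capacity = capacity
        self.threshold = threshold   # hypothetical placement threshold
        self.requests = Counter()    # request counts observed at this node
        self.store = {}              # cached content: name -> data

    def on_request(self, name: str, fetch):
        self.requests[name] += 1
        if name in self.store:       # cache hit: serve locally
            return self.store[name]
        data = fetch(name)           # cache miss: retrieve from upstream
        # Placement: only cache content that has proven popular enough.
        if self.requests[name] >= self.threshold:
            if len(self.store) >= self.capacity:
                # Eviction: drop the least popular cached item.
                victim = min(self.store, key=lambda n: self.requests[n])
                del self.store[victim]
            self.store[name] = data
        return data
```

    In use, a router would call on_request for every content request it serves, with fetch forwarding the request toward the content source; popular items then accumulate in the cache while one-off requests pass through without displacing them.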

    Emerging technologies for learning (volume 1)

    A collection of five articles on emerging technologies and trends.

    Reducing the Download Time in Stochastic P2P Content Delivery Networks by Improving Peer Selection

    Peer-to-peer (P2P) applications have become a popular method for obtaining digital content. Recent research has shown that the amount of time spent downloading from a poorly performing peer affects the total download duration. Current peer selection strategies attempt to limit the time spent downloading from a poorly performing peer, but they do not use both advance knowledge and post-connection service capacity to aid in peer selection. Advance knowledge has traditionally been obtained through methods that add overhead to the P2P network, such as polling peers for service-capacity information, using round-trip-time techniques to estimate the distance between peers, and using tracker peers. This work investigated a new download strategy that replaces the random selection of peers with a method that selects server peers based on historic service capacity and ISP, in order to further reduce the time needed to complete a download session. The results of this new history-based peer selection strategy show that there are benefits in using advance knowledge to select peers and in replacing only the worst-performing peers. The new approach yielded an average download-duration improvement of 16.6% in the single-client simulation and an average cross-ISP traffic reduction of 55.17% when ISPs were participating in cross-ISP throttling. In the multiple-client simulation, it yielded an average download-duration improvement of 53.31% and an average cross-ISP traffic reduction of 88.83% under cross-ISP throttling. The new approach also significantly improved the consistency of download duration between sessions, allowing more accurate prediction of download times.
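
    A rough sketch of such a history-based selection policy is given below. Everything here is an illustrative assumption rather than the dissertation's actual algorithm: the Peer type, the function names, the scoring rule, and the same-ISP weight of 1.5 are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Peer:
    id: str
    isp: str

def select_peers(candidates, history, client_isp, k=4):
    # Rank candidate server peers by historical service capacity
    # (e.g. mean download rate seen in past sessions, bytes/s),
    # with a bonus for peers inside the client's own ISP to reduce
    # cross-ISP traffic. Peers with no history score zero.
    def score(peer):
        capacity = history.get(peer.id, 0.0)
        return capacity * (1.5 if peer.isp == client_isp else 1.0)
    return sorted(candidates, key=score, reverse=True)[:k]

def replace_worst(active, candidates, history, client_isp):
    # Swap out only the worst-performing connected peer, keeping
    # the rest of the download session intact, rather than
    # reselecting the whole peer set.
    worst = min(active, key=lambda p: history.get(p.id, 0.0))
    pool = [c for c in candidates if c not in active]
    if pool:
        best = select_peers(pool, history, client_isp, k=1)[0]
        active[active.index(worst)] = best
    return active
```

    The key design point the abstract highlights is the second function: replacing only the worst performer avoids discarding peers that are already serving well, which is what drives the more consistent download durations reported above.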

    Compendium of effective practice in higher education retention and success

    A collection of peer-reviewed papers focusing on effective practice in higher education.

    Aiding student transition through a novel approach to mathematics support

    A compendium of effective practice released by the Higher Education Academy and disseminated to all UK Higher Education Institutions.

    Proposal for CGIAR Research Program 7: Climate Change, Agriculture and Food Security (CCAFS)


    Opening Bottlenecks: On Behalf of Mandated Network Neutrality

    This Article calls for mandated network neutrality, which would require broadband service providers to treat all nondestructive data equitably. The Author argues that neutral networks are preferable because they better foster online innovation and provide a more equitable distribution of the power to communicate. Without mandated network neutrality, providers in highly concentrated regional broadband markets will likely begin charging content providers for the right to send data to end users at the fastest available speeds. The Author demonstrates that regional broadband competition and forthcoming transmission technologies are unlikely to prevent broadband discrimination, that ad hoc regulation under current statutory authority is ineffective in dissuading even grossly anticompetitive network discrimination, and that several providers' executives have explicitly outlined their plans to begin discriminating. Additionally, the Author rebuts a congeries of arguments against network neutrality mandates, including appeals to the management of network congestion, the call for multiple special-purpose networks, the suggestion to postpone regulation, and predictions of regulatory capture.

    Constraining the digital world

    The ability of law to govern the internet continues to be doubted by many activists. Yet there are still calls to sanction actors believed to be constraining the freedom of the internet. One popular call has been for net neutrality, which aims to stop Internet Service Providers from customizing traffic for different services. But is legal action the only tool we have to regulate? Lawrence Lessig's New Chicago School model enables us to structure a problem of regulation by identifying four modalities of regulation that act upon an issue. By looking at how law, markets, norms, and architecture affect an issue, we can gain more insight into the intricacies of regulation. In this thesis the author uses the New Chicago School model to analyse and structure the problem of net neutrality regulation. The author constructs an analytical tool that classifies regulations according to agency or self-execution, objectivity or subjectivity, and direct or indirect approach, and that also captures how the modalities may counteract each other. The results show that the model is indeed helpful for structuring problems and that many constraints are at play, even though there are problems with proper operationalization of the model.