1,931 research outputs found

    Modeling Data-Plane Power Consumption of Future Internet Architectures

    With current efforts to design Future Internet Architectures (FIAs), the evaluation and comparison of different proposals is an interesting research challenge. Previously, metrics such as bandwidth or latency have commonly been used to compare FIAs to IP networks. We suggest the use of power consumption as a metric to compare FIAs. While low power consumption is an important goal in its own right (as lower energy use translates to smaller environmental impact as well as lower operating costs), power consumption can also serve as a proxy for other metrics such as bandwidth and processor load. Lacking power consumption statistics about either commodity FIA routers or widely deployed FIA testbeds, we propose models for the power consumption of FIA routers. Based on our models, we simulate scenarios for measuring the power consumption of content delivery in different FIAs. Specifically, we address two questions: 1) which of the proposed FIA candidates achieves the lowest energy footprint; and 2) which set of design choices yields a power-efficient network architecture? Although the lack of real-world data makes numerous assumptions necessary for our analysis, we explore the uncertainty of our calculations through a sensitivity analysis of the input parameters.
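    The abstract's approach of modeling router power consumption and probing uncertainty through sensitivity analysis of input parameters can be illustrated with a small sketch. The linear model form, parameter names, and baseline values below are assumptions made for illustration only; they are not taken from the paper.

```python
# Hypothetical per-router data-plane power model with one-at-a-time
# sensitivity analysis. All parameters and baseline values are illustrative
# assumptions, not figures from the paper.

BASELINE = {
    "idle_power_w": 150.0,         # static power draw of the router (W)
    "energy_per_pkt_nj": 1500.0,   # forwarding energy per packet (nJ)
    "energy_per_lookup_nj": 400.0, # architecture-specific lookup/crypto cost per packet (nJ)
    "pkt_rate_mpps": 10.0,         # offered load (million packets per second)
}

def router_power_w(p):
    """Total power = static draw + dynamic per-packet processing cost."""
    per_pkt_nj = p["energy_per_pkt_nj"] + p["energy_per_lookup_nj"]
    dynamic_w = per_pkt_nj * 1e-9 * p["pkt_rate_mpps"] * 1e6
    return p["idle_power_w"] + dynamic_w

def sensitivity(baseline, delta=0.10):
    """Perturb each input by +/- delta and report the normalised change in output power."""
    base = router_power_w(baseline)
    results = {}
    for key in baseline:
        hi = dict(baseline, **{key: baseline[key] * (1 + delta)})
        lo = dict(baseline, **{key: baseline[key] * (1 - delta)})
        results[key] = (router_power_w(hi) - router_power_w(lo)) / (2 * delta * base)
    return results

if __name__ == "__main__":
    print(f"baseline power: {router_power_w(BASELINE):.1f} W")
    for param, s in sensitivity(BASELINE).items():
        print(f"  sensitivity to {param}: {s:+.2f}")
```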

    Content Delivery Latency of Caching Strategies for Information-Centric IoT

    In-network caching is a central aspect of Information-Centric Networking (ICN). It enables the rapid distribution of content across the network, alleviating strain on content producers and reducing content delivery latencies. ICN has emerged as a promising candidate for use in the Internet of Things (IoT). However, IoT devices operate under severe constraints, most notably limited memory. This means that nodes cannot indiscriminately cache all content; instead, a caching strategy is needed to decide which content to cache. Furthermore, many applications in the IoT space are time-sensitive; therefore, finding a caching strategy that minimises the latency between content request and delivery is desirable. In this paper, we evaluate a number of ICN caching strategies with regard to latency and hop count reduction using IoT devices in a physical testbed. We find that the topology of the network, and thus the routing algorithm used to generate forwarding information, has a significant impact on the performance of a given caching strategy. To the best of our knowledge, this is the first study that focuses on latency effects in ICN-IoT caching while using real IoT hardware, and the first to explicitly discuss the link between routing algorithm, network topology, and caching effects. Comment: 10 pages, 9 figures, journal paper
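    As an illustration of the kind of cache-decision strategies the abstract compares, the sketch below contrasts two commonly discussed ICN options for a memory-constrained node: caching every passing object with LRU eviction versus caching probabilistically. The class interface and parameter values are assumptions for illustration and do not reproduce the paper's testbed code.

```python
# Minimal sketch of two cache-decision strategies for a memory-constrained
# ICN-IoT node. The interface and the probability value are illustrative
# assumptions, not the paper's implementation.

import random
from collections import OrderedDict

class LruCache:
    """Cache every passing content object; evict the least recently used entry."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def on_content(self, name, data):
        self.store[name] = data
        self.store.move_to_end(name)
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict LRU entry

    def lookup(self, name):
        if name in self.store:
            self.store.move_to_end(name)    # refresh recency on a hit
            return self.store[name]
        return None

class ProbabilisticCache(LruCache):
    """Only cache a passing object with probability p, reducing cache redundancy."""
    def __init__(self, capacity, p=0.3):
        super().__init__(capacity)
        self.p = p

    def on_content(self, name, data):
        if random.random() < self.p:
            super().on_content(name, data)
```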

    Cooperative announcement-based caching for video-on-demand streaming

    Recently, video-on-demand (VoD) streaming services like Netflix and Hulu have gained considerable popularity. This has led to a strong increase in bandwidth capacity requirements in the network. To reduce this network load, the design of appropriate caching strategies is of utmost importance. Since a video stream is typically segmented into smaller chunks that can be accessed and decoded independently, cache replacement strategies have been developed that take advantage of this temporal structure in the video. In this paper, two caching strategies are proposed that additionally take advantage of the phenomenon of binge watching, where users stream multiple consecutive episodes of the same series, which recent user behavior studies report is becoming everyday behavior. Taking this information into account allows future segment requests to be predicted, even before video playout has started. The two strategies differ in their level of coordination between the caches in the network. Using a VoD request trace based on binge-watching user characteristics, the presented algorithms have been thoroughly evaluated in multiple network topologies with different characteristics, showing their general applicability. In a realistic scenario, the proposed election-based caching strategy was shown to outperform the state of the art by 20% in terms of cache hit ratio while using 4% less network bandwidth.
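    The core idea of the abstract, predicting future segment requests from binge-watching behavior, can be sketched as a cache that prefetches the opening segments of the next episode as soon as a viewer starts an episode. The names, prefetch window, and single-cache setup below are assumptions for illustration; the paper's cooperative, election-based variant is not reproduced here.

```python
# Illustrative sketch of announcement-style prefetching for segmented VoD:
# when a client requests the first segment of episode k, the cache prefetches
# the opening segments of episode k+1 in anticipation of binge watching.
# All names and the prefetch window are hypothetical.

def segment_name(series, episode, segment):
    return f"{series}/ep{episode:02d}/seg{segment:04d}"

class AnnouncementCache:
    def __init__(self, fetch_from_origin, prefetch_segments=5):
        self.fetch = fetch_from_origin           # callback to the origin server
        self.prefetch_segments = prefetch_segments
        self.store = {}

    def on_request(self, series, episode, segment):
        name = segment_name(series, episode, segment)
        if name not in self.store:               # cache miss: fetch and store
            self.store[name] = self.fetch(name)
        if segment == 0:                          # episode start triggers an announcement
            self._announce_next_episode(series, episode)
        return self.store[name]

    def _announce_next_episode(self, series, episode):
        # Prefetch the first few segments of the next episode before any
        # client has asked for them.
        for seg in range(self.prefetch_segments):
            name = segment_name(series, episode + 1, seg)
            self.store.setdefault(name, self.fetch(name))

# Example usage with a mocked origin server.
cache = AnnouncementCache(fetch_from_origin=lambda name: f"<data:{name}>")
cache.on_request("series-a", episode=1, segment=0)
```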