171 research outputs found

    Flexible and dynamic network coding for adaptive data transmission in DTNs

    Existing network coding approaches for Delay-Tolerant Networks (DTNs) do not detect and adapt to congestion in the network. In this paper we describe CafNC (Congestion-aware forwarding with Network Coding), which combines adaptive network coding and adaptive forwarding in DTNs. In CafNC each node learns the status of its neighbours and their ego-networks in order to detect coding opportunities, and codes only as long as the recipients can decode. Our flexible design allows CafNC to efficiently support multiple unicast flows with dynamic traffic demands and dynamic senders and receivers. We evaluate CafNC with two real connectivity traces and a realistic P2P application, introducing congestion by increasing the number of unicast flows in the network. Our results show that, as the number of flows grows, CafNC improves success ratio, delay and packet loss in comparison to no coding and hub-based static coding, while at the same time achieving efficient utilisation of network resources. We also show that static coding misses a number of coding opportunities and increases packet loss rates at times of increased congestion.
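
    The core mechanism described in this abstract, coding a pair of packets only when every intended recipient can decode, can be illustrated with a short sketch. The Python below is a minimal illustration under assumed names (Packet, neighbour_has, congestion_level), not CafNC's actual implementation: it XOR-codes two packets only if each recipient already holds the other native packet, and only prefers coding when a local congestion estimate is high.

```python
# A minimal sketch of the coding-opportunity test described above, not the
# authors' implementation. Names (Packet, neighbour_has, congestion_level)
# are hypothetical; CafNC's actual state exchange and policies are richer.

from dataclasses import dataclass


@dataclass(frozen=True)
class Packet:
    pkt_id: str
    dst: str  # intended recipient node id


def can_xor_code(p: Packet, q: Packet, neighbour_has: dict) -> bool:
    """Two packets can be XOR-coded into one transmission if each intended
    recipient already stores the *other* native packet and can thus decode."""
    if p.dst == q.dst:
        return False
    return (q.pkt_id in neighbour_has.get(p.dst, set()) and
            p.pkt_id in neighbour_has.get(q.dst, set()))


def choose_transmission(buffer: list, neighbour_has: dict,
                        congestion_level: float, threshold: float = 0.7):
    """Prefer a coded pair when congestion is high and decoding is guaranteed;
    otherwise fall back to plain forwarding of the head-of-line packet."""
    if congestion_level >= threshold:
        for i, p in enumerate(buffer):
            for q in buffer[i + 1:]:
                if can_xor_code(p, q, neighbour_has):
                    return ("coded", (p, q))
    return ("native", buffer[0]) if buffer else ("idle", None)


if __name__ == "__main__":
    buf = [Packet("p1", "A"), Packet("p2", "B")]
    state = {"A": {"p2"}, "B": {"p1"}}  # each recipient holds the other's packet
    print(choose_transmission(buf, state, congestion_level=0.9))
```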

    Resource management for next generation multi-service mobile network


    MobiTrade: Trading Content in Disruption Tolerant Networks

    The rapid proliferation of advanced mobile devices has created a growing demand for data content. Existing approaches (e.g. those relying on cellular infrastructures) cannot keep up with the large volume of content generated and requested without the deployment of new, expensive infrastructure. Exchanging content of interest opportunistically, when two nodes are in range, presents a low-cost, high-bandwidth alternative for popular, bulky content. Yet efficiently collecting, storing, and sharing the content while preventing selfish users from impairing collaborative ones poses major challenges. In this paper, we present MobiTrade, a collaborative content dissemination system on top of a delay-tolerant network. It allows users to head out into the real world, express their interests locally, and get notified whenever an encountered device has content matching these interests. Even though interactions are done between neighboring wireless devices (locally), MobiTrade implements a trading scheme that motivates mobile devices to act as merchants and carry content across the network to satisfy each other's interests. Users continuously profile the type of content requested and the collaboration level of encountered devices. Based on this knowledge, an appropriate utility function is used to rank these requests and collect an optimal inventory of data that maximizes the expected value of stored content for future encounters. Using NS3 simulations based on synthetic and real mobility traces, we show that MobiTrade achieves up to 2 times higher query success rates compared to other content dissemination schemes. Furthermore, we show that MobiTrade successfully isolates selfish, non-collaborative devices. Finally, using a simple game-theoretic framework, we show that turning on the MobiTrade mechanism is an efficient Nash equilibrium.
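
    The inventory idea described here, ranking requested channels by a utility that combines expected demand with the requester's collaboration level and filling the buffer accordingly, can be sketched as follows. The Python below is a hedged illustration with assumed inputs (per-channel demand and collaboration scores), not MobiTrade's actual utility function or trading protocol.

```python
# A minimal sketch of the utility-driven inventory idea described above, not
# MobiTrade's actual mechanism. The demand/collaboration estimates and the
# per-channel utility form are assumptions for illustration.

def channel_utility(expected_demand: float, collaboration: float) -> float:
    """Value of carrying one more item of a channel: how often it is asked for,
    scaled by how cooperative the requesting peers have been."""
    return expected_demand * collaboration


def fill_inventory(channels: dict, buffer_slots: int) -> list:
    """Greedily allocate buffer slots to the channels with highest utility,
    assuming (for simplicity) diminishing value of duplicates within a channel."""
    ranked = sorted(channels.items(),
                    key=lambda kv: channel_utility(kv[1]["demand"], kv[1]["collab"]),
                    reverse=True)
    inventory = []
    while len(inventory) < buffer_slots and ranked:
        name, info = ranked[0]
        inventory.append(name)
        info = dict(info, demand=info["demand"] * 0.5)  # diminishing returns
        ranked = sorted([(name, info)] + ranked[1:],
                        key=lambda kv: channel_utility(kv[1]["demand"], kv[1]["collab"]),
                        reverse=True)
    return inventory


if __name__ == "__main__":
    peers_interest = {
        "football": {"demand": 0.8, "collab": 0.9},
        "weather":  {"demand": 0.6, "collab": 0.4},
        "music":    {"demand": 0.3, "collab": 1.0},
    }
    print(fill_inventory(peers_interest, buffer_slots=4))
```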

    An Analysis of Data Quality Defects in Podcasting Systems

    Podcasting has emerged as an asynchronous, delay-tolerant method for the distribution of multimedia files through a network. Although podcasting has become a popular Internet application, users encounter frequent information quality problems in podcasting systems. To better understand the severity of these quality problems, we have applied the Total Data Quality Management methodology to podcasting. Through the application of this methodology we have quantified the data quality problems inherent within podcasting metadata, and performed an analysis that maps specific metadata defects to failures in popular commercial podcasting platforms. Furthermore, we extracted the Really Simple Syndication (RSS) feeds from the iTunes catalog to perform the most comprehensive measurement of podcasting metadata to date. From these findings we attempted to improve the quality of podcasting data through the creation of a metadata validation tool, PodCop. PodCop extends existing RSS validation tools and encapsulates validation rules specific to the context of podcasting. We believe PodCop is the first attempt at improving the overall health of the podcasting ecosystem.
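
    As an illustration of the kind of podcasting-specific validation rules such a tool encapsulates, the sketch below checks a feed for a few common metadata defects (missing channel title, missing iTunes category, missing or non-audio enclosures). The rules and the sample feed are assumptions made for illustration; they are not PodCop's actual rule set.

```python
# A minimal sketch of podcast-specific metadata checks in the spirit of the
# validation rules described above; the rules and the sample feed below are
# invented for illustration and are far simpler than a real validator.

import xml.etree.ElementTree as ET

ITUNES = "{http://www.itunes.com/dtds/podcast-1.0.dtd}"


def validate_podcast_feed(rss_text: str) -> list:
    """Return a list of human-readable data-quality defects found in the feed."""
    defects = []
    channel = ET.fromstring(rss_text).find("channel")
    if channel is None:
        return ["missing <channel> element"]
    if not (channel.findtext("title") or "").strip():
        defects.append("channel is missing a non-empty <title>")
    if channel.find(f"{ITUNES}category") is None:
        defects.append("channel is missing an <itunes:category>")
    for i, item in enumerate(channel.findall("item"), start=1):
        enclosure = item.find("enclosure")
        if enclosure is None:
            defects.append(f"item {i} has no <enclosure> (no media file to download)")
        elif not enclosure.get("type", "").startswith("audio/"):
            defects.append(f"item {i} enclosure type is not an audio MIME type")
    return defects


if __name__ == "__main__":
    sample = """<rss xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">
      <channel><title>Demo</title>
        <item><enclosure url="ep1.mp3" type="text/html"/></item>
        <item></item>
      </channel></rss>"""
    for defect in validate_podcast_feed(sample):
        print(defect)
```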

    Heterogeneous and opportunistic wireless networks

    Recent years have witnessed the evolution of a plethora of wireless technologies with different characteristics, in response to operators' and users' needs for efficient and ubiquitous delivery of advanced multimedia services. The wireless segment of the network infrastructure has penetrated our lives, and wireless connectivity has now reached a state where it is considered an indispensable service, much like electricity or water supply. Wireless data networks grow increasingly complex as a multiplicity of wireless information terminals with sophisticated capabilities become embedded in the infrastructure. © 2012 Springer Milan. All Rights Reserved.

    On the feasibility of a user-operated mobile content distribution network

    The vast majority of mobile data transfers today follow the traditional client-server model. Although P2P approaches have been exploited in the fixed network and shown to be very efficient, in the mobile domain there have been limited attempts to leverage P2P (D2D) for large-scale content distribution (i.e., not DTN-like, point-to-point message transfers). In this paper, we explore the potential of a user-operated, smartphone-centric content distribution model for smartphone applications. In particular, we assume source nodes that are updated directly by the content provider (e.g., BBC, CNN) whenever updates are available; destination nodes are then updated directly by source nodes in a D2D manner. We leverage sophisticated information-aware and application-centric connectivity techniques to distribute content between mobile devices in densely populated urban environments. Our target is to investigate the feasibility of an opportunistic content distribution network in an attempt to achieve widespread distribution of heavy content (e.g., video files) to the majority of the destination nodes. We propose ubiCDN as a ubiquitous, user-operated and distributed CDN for mobile applications.
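
    The source/destination update model described in this abstract, provider-updated source nodes that push fresh content to subscribed destinations on D2D contact, can be sketched roughly as follows. The Python below is a minimal illustration under assumed node roles and version bookkeeping, not the ubiCDN design itself.

```python
# A minimal sketch of the source-to-destination D2D update exchange described
# above, not the ubiCDN design. Node roles, version numbers, and the encounter
# callback are assumptions made for illustration.

from dataclasses import dataclass, field


@dataclass
class Node:
    node_id: str
    is_source: bool = False
    subscriptions: set = field(default_factory=set)    # app ids of interest
    versions: dict = field(default_factory=dict)       # app id -> latest version held


def provider_push(source: Node, app: str, version: int) -> None:
    """Content provider updates a source node directly (e.g. over cellular/Wi-Fi)."""
    source.versions[app] = max(version, source.versions.get(app, -1))


def on_encounter(a: Node, b: Node) -> list:
    """When two devices meet, push any newer content the peer subscribes to.
    Returns a log of (sender, receiver, app) transfers for inspection."""
    transfers = []
    for sender, receiver in ((a, b), (b, a)):
        for app, ver in sender.versions.items():
            if app in receiver.subscriptions and receiver.versions.get(app, -1) < ver:
                receiver.versions[app] = ver
                transfers.append((sender.node_id, receiver.node_id, app))
    return transfers


if __name__ == "__main__":
    src = Node("s1", is_source=True)
    dst = Node("d1", subscriptions={"news"})
    provider_push(src, "news", version=3)
    print(on_encounter(src, dst))  # [('s1', 'd1', 'news')]
```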