Non-Adaptive Distributed Compression in Networks
In this paper, we discuss non-adaptive distributed compression of inter-node
correlated real-valued messages. To do so, we discuss the performance of
conventional packet forwarding via routing, in terms of the total network load
versus the resulting quality of service (distortion level). As a better
alternative to packet forwarding, we briefly describe our previously proposed
one-step Quantized Network Coding (QNC), and make motivating arguments for its
advantage when the appropriate marginal rates for distributed source coding are
not available at the encoder source nodes. We also derive analytic guarantees
on the resulting distortion of our one-step QNC scenario. Finally, we conclude
the paper by providing a mathematical comparison between the total network
loads of one-step QNC and conventional packet forwarding, showing a significant
reduction in the case of one-step QNC.
Comment: Submitted for 2013 IEEE International Symposium on Information Theory
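The one-step QNC idea can be illustrated with a minimal sketch. The subspace correlation model, the dimensions, and the quantizer step size below are assumptions of this sketch, not the construction analyzed in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: 8 source messages carried by only 6 packets.
n_sources, n_packets, k = 8, 6, 3

# Model inter-node correlation (an assumption of this sketch) as the
# messages lying in a known k-dimensional subspace spanned by Psi.
Psi = rng.normal(size=(n_sources, k))
messages = Psi @ rng.normal(size=k)

# One-step QNC: each packet is a quantized random linear combination of
# the messages -- mix once, quantize once, no per-node re-encoding.
A = rng.normal(size=(n_packets, n_sources))
step = 0.01                                  # quantizer step (illustrative)
packets = step * np.round(A @ messages / step)

# The sink knows A and the correlation model and decodes by least squares;
# fewer packets than sources means a lower total network load.
coeffs, *_ = np.linalg.lstsq(A @ Psi, packets, rcond=None)
estimate = Psi @ coeffs
distortion = float(np.mean((estimate - messages) ** 2))
```

Because the correlation model is low-dimensional, the least-squares decoder keeps the distortion near the quantization noise floor even though fewer packets than source messages traverse the network.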
Network Information Flow with Correlated Sources
In this paper, we consider a network communications problem in which multiple
correlated sources must be delivered to a single data collector node, over a
network of noisy independent point-to-point channels. We prove that perfect
reconstruction of all the sources at the sink is possible if and only if, for
all partitions of the network nodes into two subsets S and S^c such that the
sink is always in S^c, we have that H(U_S|U_{S^c}) < \sum_{i\in S,j\in S^c}
C_{ij}. Our main finding is that in this setup a general source/channel
separation theorem holds, and that Shannon information behaves as a classical
network flow, identical in nature to the flow of water in pipes. At first
glance, it might seem surprising that separation holds in a fairly general
network situation like the one we study. A closer look, however, reveals that
the reason for this is that our model allows only for independent
point-to-point channels between pairs of nodes, and not multiple-access and/or
broadcast channels, for which separation is well known not to hold. This
"information as flow" view provides an algorithmic interpretation for our
results, among which perhaps the most important one is the optimality of
implementing codes using a layered protocol stack.
Comment: Final version, to appear in the IEEE Transactions on Information Theory -- contains (very) minor changes based on the last round of review
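The cut-set condition above can be checked mechanically on a toy network. In this sketch the sources are assumed to be independent uniform bits, so H(U_S|U_{S^c}) = |S|; the topology and capacities are made up for illustration:

```python
from itertools import combinations

# Toy network: source nodes 0, 1, 2 and sink node 3. C[(i, j)] is the
# capacity (in bits) of the independent point-to-point channel i -> j.
C = {(0, 1): 0.7, (0, 3): 0.6, (1, 3): 1.6, (2, 1): 0.4, (2, 3): 0.9}
sources, sink = {0, 1, 2}, 3

def cut_condition_holds():
    """True iff H(U_S|U_{S^c}) < sum_{i in S, j in S^c} C_ij for every
    partition with the sink in S^c. The sources are modelled here as
    independent uniform bits, so H(U_S|U_{S^c}) = |S|."""
    nodes = sources | {sink}
    for r in range(1, len(sources) + 1):
        for subset in combinations(sources, r):
            S = set(subset)
            Sc = nodes - S                      # the sink is always in S^c
            cut_capacity = sum(c for (i, j), c in C.items()
                               if i in S and j in Sc)
            if not len(S) < cut_capacity:       # strict inequality required
                return False
    return True
```

By the theorem, this routine returning True means perfect reconstruction of all sources at the sink is possible, and by the separation result it is achievable with a layered protocol stack.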
Networked Slepian-Wolf: theory, algorithms, and scaling laws
Consider a set of correlated sources located at the nodes of a network, and a set of sinks that are the destinations for some of the sources. The minimization of cost functions that are the product of a function of the rate and a function of the path weight is considered, for both the data-gathering scenario, which is relevant in sensor networks, and general traffic matrices, relevant for general networks. The minimization is achieved by jointly optimizing a) the transmission structure, which is shown to consist in general of a superposition of trees, and b) the rate allocation across the source nodes, which is done by Slepian-Wolf coding. The overall minimization can be achieved in two concatenated steps. First, the optimal transmission structure is found, which in general amounts to finding a Steiner tree, and second, the optimal rate allocation is obtained by solving an optimization problem with cost weights determined by the given optimal transmission structure, and with linear constraints given by the Slepian-Wolf rate region. For the case of data gathering, the optimal transmission structure is fully characterized and a closed-form solution for the optimal rate allocation is provided. For the general case of an arbitrary traffic matrix, the problem of finding the optimal transmission structure is NP-complete. For large networks, in some simplified scenarios, the total costs associated with Slepian-Wolf coding and explicit communication (conditional encoding based on explicitly communicated side information) are compared. Finally, the design of decentralized algorithms for the optimal rate allocation is analyzed.
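For two sources, the Slepian-Wolf rate region that supplies the linear constraints in the rate-allocation step can be written down directly. The joint pmf below is an illustrative choice, not taken from the paper:

```python
from math import log2

# Joint pmf of two correlated binary sources (illustrative numbers only).
p = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

def H(probs):
    """Entropy in bits of a probability vector."""
    return -sum(q * log2(q) for q in probs if q > 0)

H12 = H(p.values())                                  # H(X1, X2)
H1 = H([p[0, 0] + p[0, 1], p[1, 0] + p[1, 1]])       # H(X1)
H2 = H([p[0, 0] + p[1, 0], p[0, 1] + p[1, 1]])       # H(X2)
H1g2, H2g1 = H12 - H2, H12 - H1                      # H(X1|X2), H(X2|X1)

def in_sw_region(R1, R2):
    """Slepian-Wolf rate region for two sources: lossless distributed
    coding is possible iff all three linear constraints hold."""
    return R1 >= H1g2 and R2 >= H2g1 and R1 + R2 >= H12
```

For this pmf, H(X1) = H(X2) = 1 bit but H(X1|X2) is only about 0.47 bits, so the pair (0.75, 0.75) is feasible even though each rate is below the marginal entropy; the rate-allocation step of the paper minimizes cost over exactly such constraint sets.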
Joint Source-Channel Cooperative Transmission over Relay-Broadcast Networks
Reliable transmission of a discrete memoryless source over a multiple-relay
relay-broadcast network is considered. Motivated by sensor network
applications, it is assumed that the relays and the destinations all have
access to side information correlated with the underlying source signal. Joint
source-channel cooperative transmission is studied in which the relays help the
transmission of the source signal to the destinations by using both their
overheard signals, as in the classical channel cooperation scenario, as well as
the available correlated side information. Decode-and-forward (DF) based
cooperative transmission is considered in a network of multiple relay terminals
and two different achievability schemes are proposed: i) a regular encoding and
sliding-window decoding scheme without explicit source binning at the encoder,
and ii) a semi-regular encoding and backward decoding scheme with binning based
on the side information statistics. It is shown that both of these schemes lead
to the same source-channel code rate, which is shown to be the "source-channel
capacity" in the case of i) a physically degraded relay network in which the
side information signals are also degraded in the same order as the channel;
and ii) a relay-broadcast network in which all the terminals want to
reconstruct the source reliably, while at most one of them can act as a relay.
Comment: Submitted to IEEE Transactions on Information Theory, 201
Adaptive-Compression Based Congestion Control Technique for Wireless Sensor Networks
Congestion in a wireless sensor network causes an increase in the amount of data loss and delays in data transmission. In this paper, we propose a new congestion control technique (ACT, Adaptive Compression-based congestion control Technique) based on an adaptive compression scheme for packet reduction in case of congestion. The compression techniques used in the ACT are Discrete Wavelet Transform (DWT), Adaptive Differential Pulse Code Modulation (ADPCM), and Run-Length Coding (RLC). The ACT first transforms the data from the time domain to the frequency domain, reduces the range of data by using ADPCM, and then reduces the number of packets with the help of RLC before transferring the data to the source node. It introduces the DWT for priority-based congestion control because the DWT classifies the data into four groups with different frequencies. The ACT assigns priorities to these data groups in inverse proportion to their respective frequencies and defines the quantization step size of ADPCM in inverse proportion to the priorities. RLC generates a smaller number of packets for a data group with a low priority. At the relaying node, the ACT reduces the number of packets by increasing the quantization step size of ADPCM in case of congestion. Moreover, to facilitate back pressure, the queue is controlled adaptively according to the congestion state. We experimentally demonstrate that the ACT increases the network efficiency and guarantees fairness to sensor nodes, as compared with the existing methods. Moreover, it exhibits a very high ratio of available data at the sink.
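The DWT-ADPCM-RLC chain described above can be sketched minimally in code. A one-level Haar split stands in for the full four-band DWT, and the signal and step sizes are illustrative assumptions, not parameters from the ACT itself:

```python
import numpy as np

def haar_step(x):
    """One level of the Haar DWT: split a signal into a low-frequency
    (approximation) half and a high-frequency (detail) half."""
    x = np.asarray(x, dtype=float)
    return (x[0::2] + x[1::2]) / 2, (x[0::2] - x[1::2]) / 2

def quantize(band, step):
    """ADPCM-style uniform quantization; a larger step size (lower
    priority, or congestion at a relaying node) yields more repeated
    symbols and hence shorter run-length codes."""
    return np.round(band / step).astype(int).tolist()

def rlc(symbols):
    """Run-length coding: encode a symbol list as (value, run) pairs."""
    runs, prev, count = [], None, 0
    for s in symbols:
        if s == prev:
            count += 1
        else:
            if prev is not None:
                runs.append((prev, count))
            prev, count = s, 1
    if prev is not None:
        runs.append((prev, count))
    return runs

# The low-frequency band gets high priority (small step); the
# high-frequency band gets low priority (large step), so it collapses
# into fewer runs, i.e. fewer packets.
signal = np.array([10.0, 10.2, 10.1, 9.9, 10.0, 10.3, 10.1, 10.0])
low, high = haar_step(signal)
coded_low = rlc(quantize(low, step=0.1))    # step sizes are illustrative
coded_high = rlc(quantize(high, step=1.0))
```

Coarsely quantizing the low-priority band drives all its coefficients to zero, so RLC emits a single (value, run) pair for it, which is the packet-reduction mechanism the ACT exploits under congestion.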
Multimedia delivery in the future internet
The term 'Networked Media' implies that all kinds of media including text, image, 3D graphics, audio
and video are produced, distributed, shared, managed and consumed on-line through various networks,
like the Internet, Fiber, WiFi, WiMAX, GPRS, 3G and so on, in a convergent manner [1]. This white
paper is the contribution of the Media Delivery Platform (MDP) cluster and aims to cover the
challenges of Networked Media in the transition to the Future of the Internet.
The Internet has evolved and changed the way we work and live. End users of the Internet have been
confronted with a bewildering range of media, services and applications and of technological innovations
concerning media formats, wireless networks, terminal types and capabilities. And there is little evidence
that the pace of this innovation is slowing. Today, over one billion users access the Internet on a regular
basis, more than 100 million users have downloaded at least one (multi)media file and over 47 million of
them do so regularly, searching in more than 160 Exabytes of content. In the near future these numbers
are expected to rise exponentially. It is expected that Internet content will increase by at least a factor
of 6, rising to more than 990 Exabytes before 2012, fuelled mainly by the users themselves. Moreover, it
is envisaged that in the near- to mid-term future, the Internet will provide the means to share and
distribute (new) multimedia content and services with superior quality and striking flexibility, in a
trusted and personalized way, improving citizens' quality of life, working conditions, edutainment and safety.
In this evolving environment, new transport protocols, new multimedia encoding schemes, cross-layer
in-the-network adaptation, machine-to-machine communication (including RFIDs), rich 3D content as well as
community networks and the use of peer-to-peer (P2P) overlays are expected to generate new models of
interaction and cooperation, and be able to support enhanced perceived quality-of-experience (PQoE) and
innovative applications 'on the move', like virtual collaboration environments, personalised
services/media, virtual sport groups, on-line gaming, edutainment. In this context, the interaction with content
combined with interactive/multimedia search capabilities across distributed repositories, opportunistic P2P
networks and the dynamic adaptation to the characteristics of diverse mobile terminals are expected to
contribute towards such a vision.
Based on work that has taken place in a number of EC co-funded projects, in Framework Program 6 (FP6)
and Framework Program 7 (FP7), a group of experts and technology visionaries have voluntarily
contributed to this white paper, aiming to describe the status, the state of the art, the challenges
and the way ahead in the area of Content-Aware media delivery platforms.