Can network coding bridge the digital divide in the Pacific?
Conventional TCP performance is significantly impaired under long latency
and/or constrained bandwidth. Small Pacific Island states on satellite
links experience this in the extreme: small populations and remoteness often
rule out submarine fibre connections, so their communities struggle to reap the
benefits of the Internet. Network-coded TCP (TCP/NC) can increase goodput under
high latency and packet loss, but has not been used to tunnel conventional TCP
and UDP across satellite links before. We report on a feasibility study aimed
at determining expected goodput gain across such TCP/NC tunnels into island
targets on geostationary and medium earth orbit satellite links. Comment: 5 pages, 3 figures, conference (NetCod 2015).
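The goodput case for coding across a lossy satellite link can be illustrated with a standard erasure-channel calculation (this is not taken from the study itself; the generation size and loss rate below are illustrative assumptions): a generation of g source packets is decodable once any g coded packets arrive, ignoring the small probability of linear dependence.

```python
from math import comb

def delivery_prob(g, n, p):
    """Probability that a generation of g source packets is recoverable
    when n coded packets are sent over a link with i.i.d. loss rate p
    (any g arriving coded packets suffice; linear dependence is ignored)."""
    return sum(comb(n, k) * (1 - p) ** k * p ** (n - k) for k in range(g, n + 1))

g, p = 32, 0.05  # illustrative generation size and satellite-link loss rate
for n in (32, 34, 36, 40):
    print(f"send {n} coded packets: P(decode) = {delivery_prob(g, n, p):.4f}")
```

A small amount of redundancy pushes the decode probability close to one, whereas an uncoded transfer must detect and retransmit each loss across the full satellite round-trip time.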
Decoding Algorithms for Random Linear Network Codes
Part 2: NC-Pro 2011 Workshop. We consider the problem of efficiently decoding a random linear code over a finite field. In particular, we are interested in the case where the code is random and relatively sparse, and we use the binary finite field as an example. The goal is to decode the data using fewer operations, to potentially achieve a higher coding throughput and reduce energy consumption. We use an on-the-fly version of the Gauss-Jordan algorithm as a baseline and provide several simple improvements that reduce the number of operations needed to perform decoding. Our tests show that the improvements can reduce the number of operations used during decoding by 10–20% on average, depending on the code parameters.
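The on-the-fly Gauss-Jordan baseline described here can be sketched over GF(2), where coding vectors are bitmasks and elimination is XOR. This is a minimal sketch, not the paper's implementation: the generation size, seed, and packet representation are illustrative assumptions, and none of the paper's proposed improvements are included.

```python
import random

GEN = 8  # generation size: number of source packets (illustrative)
random.seed(1)

source = [random.getrandbits(32) for _ in range(GEN)]  # payloads as 32-bit ints

def encode():
    """Emit one random linear combination over GF(2): XOR of a random subset."""
    coeff = 0
    while coeff == 0:                    # the all-zero vector carries no information
        coeff = random.getrandbits(GEN)
    payload = 0
    for i in range(GEN):
        if (coeff >> i) & 1:
            payload ^= source[i]
    return coeff, payload

class OnTheFlyDecoder:
    """Incremental Gauss-Jordan over GF(2): packets are absorbed as they arrive."""
    def __init__(self, gen):
        self.gen = gen
        self.pivots = {}                 # pivot bit index -> (coeff mask, payload)

    def receive(self, coeff, payload):
        # Forward elimination: reduce against every existing pivot row.
        for _, (c, p) in self.pivots.items():
            if coeff & c & -c:           # incoming vector has that row's pivot bit set
                coeff ^= c
                payload ^= p
        if coeff == 0:
            return False                 # linearly dependent: discard
        bit = coeff.bit_length() - 1     # leading coefficient becomes the new pivot
        # Jordan step: clear the new pivot bit from all existing rows.
        for b, (c, p) in list(self.pivots.items()):
            if (c >> bit) & 1:
                self.pivots[b] = (c ^ coeff, p ^ payload)
        self.pivots[bit] = (coeff, payload)
        return True

    def done(self):
        return len(self.pivots) == self.gen

    def recover(self):
        # Fully reduced: each row has exactly one coefficient bit set.
        return [self.pivots[i][1] for i in range(self.gen)]

dec = OnTheFlyDecoder(GEN)
sent = 0
while not dec.done():
    sent += 1
    dec.receive(*encode())

assert dec.recover() == source
print(f"decoded {GEN} packets from {sent} coded transmissions")
```

Counting the XORs performed in `receive` for sparse versus dense coding vectors is exactly the kind of operation count the abstract's 10–20% improvements target.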
On Code Parameters and Coding Vector Representation for Practical RLNC
Random Linear Network Coding (RLNC) provides a theoretically efficient method for coding. Its drawbacks are the complexity of decoding and the overhead of the encoding vector. Increasing the field size and generation length presents a fundamental trade-off between packet-based throughput and operational overhead. On the one hand, decreasing the probability that redundant packets are transmitted benefits throughput and, consequently, reduces transmission energy. On the other hand, decoding complexity and header overhead increase with field size and generation length, leading to higher energy consumption. The main findings of this work are bounds on the transmission overhead due to linearly dependent packets. The optimal trade-off is system- and topology-dependent, as it depends on the energy cost of performing coding operations versus transmitting data. We show that moderate field sizes are the correct choice when these trade-offs are considered. The results show that sparse binary codes perform best unless the generation size is very low. Cooperation and Network Coding Project (CONE) (Grant 09-066549/FTP).
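The field-size trade-off can be made concrete with the classic expected-overhead formula for dense random coding (a sketch, not the paper's bounds; the generation size is an illustrative assumption): with i linearly independent packets already held, a fresh uniform coding vector over GF(q)^g falls inside the spanned subspace with probability q^(i-g), so each useful packet is a geometric draw.

```python
import math

def expected_transmissions(q, g):
    """Expected number of uniformly random coded packets over GF(q)
    needed to collect g linearly independent ones."""
    # With i independent packets held, a new vector is dependent with
    # probability q**(i - g); each useful reception is a geometric draw.
    return sum(1.0 / (1.0 - q ** (i - g)) for i in range(g))

g = 16  # illustrative generation size
for q in (2, 4, 16, 256):
    extra = expected_transmissions(q, g) - g       # dependent-packet overhead
    header_bits = g * round(math.log2(q))          # coding-vector size per packet
    print(f"GF({q:3d}): ~{extra:.3f} extra packets, {header_bits}-bit coding vector")
```

The binary field costs roughly 1.6 extra packets per generation (almost independently of g) but the smallest coding-vector header, while GF(256) makes dependent packets negligible at eight times the header size, which is the throughput-versus-overhead tension the abstract describes.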