Improving HD-FEC decoding via bit marking
We review the recently introduced soft-aided bit-marking (SABM) algorithm and
its suitability for product codes. Some aspects of the implementation of the
SABM algorithm are discussed. The influence of suboptimal channel soft
information is also analyzed. Comment: OECC 201
Improved Decoding of Staircase Codes: The Soft-aided Bit-marking (SABM) Algorithm
Staircase codes (SCCs) are typically decoded using iterative bounded-distance
decoding (BDD) and hard decisions. In this paper, a novel decoding algorithm is
proposed, which partially uses soft information from the channel. The proposed
algorithm is based on marking a certain number of highly reliable and highly
unreliable bits. These marked bits are used to improve the
miscorrection-detection capability of the SCC decoder and the error-correcting
capability of BDD. For SCCs with -error-correcting
Bose-Chaudhuri-Hocquenghem component codes, our algorithm improves upon
standard SCC decoding by up to dB at a bit-error rate (BER) of
. The proposed algorithm is shown to achieve almost half of the gain
achievable by an idealized decoder with this structure. A complexity analysis
based on the number of additional calls to the component BDD decoder shows that
the relative complexity increase is only around at a BER of .
This additional complexity is shown to decrease as the channel quality
improves. Our algorithm is also extended (with minor modifications) to product
codes. The simulation results show that in this case, the algorithm offers
gains of up to dB at a BER of . Comment: 10 pages, 12 figures
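The marking idea described above can be sketched as follows; the LLR thresholds, the bit labels and the helper names are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

def mark_bits(llrs, t_hrb=4.0, t_hub=0.5):
    """Mark each bit by channel reliability |LLR|: HRB (highly reliable),
    HUB (highly unreliable) or uncertain. Thresholds are illustrative."""
    mag = np.abs(llrs)
    marks = np.full(llrs.shape, "UNC", dtype=object)
    marks[mag >= t_hrb] = "HRB"
    marks[mag <= t_hub] = "HUB"
    return marks

def accept_bdd_correction(flipped_positions, marks):
    """Treat a BDD correction as a likely miscorrection (and reject it)
    if it would flip any bit that was marked highly reliable."""
    return not any(marks[p] == "HRB" for p in flipped_positions)

llrs = np.array([5.1, -0.2, -6.3, 0.4, 2.0])
marks = mark_bits(llrs)
print(list(marks))                           # ['HRB', 'HUB', 'HRB', 'HUB', 'UNC']
print(accept_bdd_correction([1, 3], marks))  # True: only HUBs flipped
print(accept_bdd_correction([0], marks))     # False: would flip an HRB
```

In the paper the marked bits feed into the SCC component decoders; here the two functions only illustrate the marking and the miscorrection check in isolation.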
A Soft-Aided Staircase Decoder Using Three-Level Channel Reliabilities
The soft-aided bit-marking (SABM) algorithm is based on the idea of marking
bits as highly reliable bits (HRBs), highly unreliable bits (HUBs), and
uncertain bits to improve the performance of hard-decision (HD) decoders. The
HRBs and HUBs are used to assist the HD decoders to prevent miscorrections and
to decode those originally uncorrectable cases via bit flipping (BF),
respectively. In this paper, an improved SABM algorithm (called iSABM) is
proposed for staircase codes (SCCs). Similar to the SABM, iSABM marks bits with
the help of channel reliabilities, i.e., using the absolute values of the
log-likelihood ratios. The improvements offered by iSABM include: (i) HUBs
being classified using a reliability threshold, (ii) BF randomly selecting
HUBs, and (iii) soft-aided decoding over multiple SCC blocks. The decoding
complexity of iSABM is comparable to that of SABM: on the one hand, no sorting
is required (lower complexity) because a threshold is used for HUBs, while on
the other hand, multiple SCC blocks use soft information (higher complexity).
Additional gains of up to 0.53 dB with respect
to SABM and 0.91 dB with respect to standard SCC decoding at a bit error rate
of are reported. Furthermore, it is shown that 1-bit reliability marking, i.e.,
having only HRBs and HUBs, incurs a gain penalty of only up to 0.25 dB with a
significantly reduced memory requirement.
Post-FEC BER Benchmarking for Bit-Interleaved Coded Modulation with Probabilistic Shaping
Accurate performance benchmarking after forward error correction (FEC)
decoding is essential for system design in optical fiber communications.
Generalized mutual information (GMI) has been shown to be successful at
benchmarking the bit-error rate (BER) after FEC decoding (post-FEC BER) for
systems with soft-decision (SD) FEC without probabilistic shaping (PS).
However, GMI is not suitable for benchmarking post-FEC BER in systems with
SD-FEC and PS. For such systems, normalized GMI (NGMI), asymmetric information
(ASI), and achievable FEC rate have been proposed instead. They are good at
benchmarking post-FEC BER or giving an FEC limit in bit-interleaved coded
modulation (BICM) with PS, but their relation has not been clearly explained so
far. In this paper, we define generalized L-values under mismatched decoding,
which are connected to the GMI and ASI. We then show that NGMI, ASI, and
achievable FEC rate are theoretically equal under matched decoding but not
under mismatched decoding. We also examine BER before FEC decoding (pre-FEC
BER) and ASI over Gaussian and nonlinear fiber-optic channels with
approximately matched decoding. ASI always shows better correlation with
post-FEC BER than pre-FEC BER for BICM with PS. On the other hand, post-FEC BER
can differ at a given ASI when we change the bit mapping, which describes how
each bit in a codeword is assigned to a bit tributary. Comment: 14 pages, 8 figures
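One common Monte Carlo estimator of the ASI from transmitted bits and received LLRs is sketched below; this is a rough illustration under an assumed sign convention, and the exact estimator in the paper may differ:

```python
import numpy as np

def asi_estimate(bits, llrs):
    """Monte Carlo estimate of the asymmetric information (ASI):
    ASI ~= 1 - mean(log2(1 + exp(-(1 - 2b) * l))), where (1 - 2b) * l
    are the "asymmetric" L-values. Sign convention: l > 0 favours b = 0."""
    a = (1 - 2 * np.asarray(bits)) * np.asarray(llrs)
    return 1.0 - float(np.mean(np.log2(1.0 + np.exp(-a))))

# Illustrative use: BPSK over an AWGN channel with matched LLRs.
rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 100_000)
sigma2 = 0.5
y = (1 - 2 * bits) + rng.normal(0.0, np.sqrt(sigma2), bits.size)
llrs = 2.0 * y / sigma2  # matched LLRs for this channel
print(round(asi_estimate(bits, llrs), 2))
```

A perfectly reliable correct decision contributes 1 bit per bit (ASI of 1), while an LLR of zero contributes nothing, which matches the estimator's endpoints.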
Robust P2P Live Streaming
Project carried out in collaboration with the i2CAT Foundation. The provisioning of robust real-time communication services (voice, video, etc.) or media contents through the Internet in a distributed manner is an important challenge,
which will strongly influence current and future Internet evolution. Aware of this, we
are developing a project named Trilogy, led by the i2CAT Foundation, which has as
main pillar the study, development and evaluation of Peer-to-Peer (P2P) Live
streaming architectures for the distribution of high-quality media contents. In this
context, this work concretely covers media coding aspects and proposes the use of
Multiple Description Coding (MDC) as a flexible solution for providing robust and
scalable live streaming over P2P networks. This work describes the current state of the art
in media coding techniques and P2P streaming architectures, and presents the
implemented prototype as well as its simulation and validation results.
On Transmission System Design for Wireless Broadcasting
This thesis considers aspects related to the design and standardisation of transmission systems for wireless broadcasting, comprising terrestrial and mobile reception. The purpose is to identify which factors influence the technical decisions and what issues could be better considered in the design process in order to assess different use cases, service scenarios and end-user quality. Further, the necessity of cross-layer optimisation for efficient data transmission is emphasised and means to take this into consideration are suggested. The work is mainly related to terrestrial and mobile digital video broadcasting systems, but many of the findings can also be generalised to other transmission systems and design processes.
The work has led to three main conclusions. First, it is discovered that there are no sufficiently accurate error criteria for measuring the subjective perceived audiovisual quality that could be utilised in transmission system design. Means for designing new error criteria for mobile TV (television) services are suggested and similar work related to other services is recommended.
Second, it is suggested that in addition to commercial requirements there should be technical requirements setting the framework for the design process of a new transmission system. The technical requirements should include the assessed reception conditions, technical quality of service and service functionalities. Reception conditions comprise radio channel models, receiver types and antenna types. Technical quality of service consists of bandwidth, timeliness and reliability. Of these, the thesis focuses on radio channel models and error criteria (reliability) as two of the most important design challenges and provides means to optimise transmission parameters based on these.
Third, the thesis argues that the most favourable development for wireless broadcasting would be a single system suitable for all scenarios of wireless broadcasting. It is claimed that there are no major technical obstacles to achieve this and that the recently published second generation digital terrestrial television broadcasting system provides a good basis. The challenges and opportunities of a universal wireless broadcasting system are discussed mainly from technical but briefly also from commercial and regulatory aspects.
Scalable Video Streaming with Prioritised Network Coding on End-System Overlays
Distribution over the Internet is destined to become a standard approach for live broadcasting
of TV or events of nation-wide interest. The demand for high-quality live video
with personal requirements is expected to grow exponentially over the next few years.
End-system multicast is a desirable option for relieving the content server from bandwidth bottlenecks
and computational load by allowing decentralised allocation of resources to the users
and distributed service management. Network coding provides innovative solutions for a
multitude of issues related to multi-user content distribution, such as the coupon-collection
problem, and allocation and scheduling procedures. This thesis tackles the problem of streaming
scalable video on end-system multicast overlays with prioritised push-based streaming.
We analyse the characteristics arising from a random coding process as a linear channel
operator, and present a novel error detection and correction system for error-resilient decoding,
providing one of the first practical frameworks for Joint Source-Channel-Network
coding. Our system outperforms both network error correction and traditional FEC coding
when performed separately. We then present a content distribution system based on
end-system multicast. Our data exchange protocol makes use of network coding as a way to
collaboratively deliver data to several peers. Prioritised streaming is performed by means
of hierarchical network coding and a dynamic chunk selection for optimised rate allocation
based on goodput statistics at application layer. We prove, by simulated experiments, the
efficient allocation of resources for adaptive video delivery. Finally, we describe the
implementation of our coding system. We highlight the use of rateless coding properties,
discuss the application in collaborative and distributed coding systems, and provide an
optimised implementation of the decoding algorithm with advanced CPU instructions. We
analyse computational load and packet loss protection via lab tests and simulations,
complementing the overall analysis of the video streaming system in all its components.
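The random-coding view described above can be illustrated with a toy random linear network code over GF(2); the packet contents, generation size and decoding routine are illustrative sketches, not the thesis implementation:

```python
import numpy as np

rng = np.random.default_rng(2)

def rlnc_encode(packets, n_coded):
    """Encode: each coded packet is a random XOR (GF(2)) combination
    of the k source packets; the coefficient vector travels with it."""
    k = len(packets)
    coeffs = rng.integers(0, 2, size=(n_coded, k))
    return coeffs, coeffs @ np.asarray(packets) % 2

def rlnc_decode(coeffs, coded):
    """Decode by Gaussian elimination over GF(2); returns None when the
    received combinations are rank deficient (the peer must wait for
    more coded packets before it can decode)."""
    k = coeffs.shape[1]
    a = np.concatenate([coeffs % 2, coded % 2], axis=1)
    row = 0
    for col in range(k):
        piv = next((r for r in range(row, a.shape[0]) if a[r, col]), None)
        if piv is None:
            return None  # rank deficient: cannot solve yet
        a[[row, piv]] = a[[piv, row]]
        for r in range(a.shape[0]):
            if r != row and a[r, col]:
                a[r] ^= a[row]
        row += 1
    return a[:k, k:]

src = np.array([[1, 0, 1, 1], [0, 1, 1, 0], [1, 1, 0, 0]])  # 3 source packets
coeffs, coded = rlnc_encode(src, n_coded=6)
decoded = rlnc_decode(coeffs, coded)  # equals src whenever rank(coeffs) = 3
```

Collecting a few extra combinations (here 6 for 3 packets) makes full rank very likely, which is how random coding sidesteps the coupon-collection problem mentioned above.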
Fifty Years of Noise Modeling and Mitigation in Power-Line Communications
Building on the ubiquity of electric power infrastructure, power line communications (PLC) has been successfully used in diverse application scenarios, including the smart grid and in-home broadband communications systems as well as industrial and home automation. However, the power line channel exhibits deleterious properties, one of which is its hostile noise environment. This article aims to provide a review of noise modeling and mitigation techniques in PLC. Specifically, a comprehensive review of representative noise models developed over the past fifty years is presented, including both the empirical models based on measurement campaigns and simplified mathematical models. Following this, we provide an extensive survey of the suite of noise mitigation schemes, categorizing them into mitigation at the transmitter as well as parametric and non-parametric techniques employed at the receiver. Furthermore, since the accuracy of channel estimation in PLC is affected by noise, we review the literature on joint noise mitigation and channel estimation solutions. Finally, a number of directions are outlined for future research on both noise modeling and mitigation in PLC.
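As a toy instance of the simplified mathematical noise models surveyed above, a Bernoulli-Gaussian two-component impulsive-noise generator might look like this; all parameter values are illustrative, not drawn from any specific measurement campaign:

```python
import numpy as np

def bernoulli_gaussian_noise(n, p=0.01, sigma_bg=1.0, gamma=100.0, seed=0):
    """Two-component impulsive noise: background Gaussian noise plus,
    with probability p, an impulse whose variance is gamma times the
    background variance. All parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    impulsive = rng.random(n) < p
    sigma = np.where(impulsive, np.sqrt(gamma) * sigma_bg, sigma_bg)
    return rng.normal(0.0, 1.0, n) * sigma

noise = bernoulli_gaussian_noise(100_000)
# Empirical power should be close to (1 - p) + p * gamma = 1.99.
print(round(float(np.mean(noise ** 2)), 2))
```

The rare high-variance component is what makes such noise hostile to receivers designed for purely Gaussian noise, motivating the mitigation schemes discussed in the article.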
LDPC-coded modulation for transmission over AWGN and flat rayleigh fading channels
Coded modulation is a bandwidth-efficient transmission technique that integrates channel coding and modulation into a single entity in order to improve performance while maintaining the same spectral efficiency as uncoded modulation. Low-density parity-check (LDPC) codes are among the most powerful error-correcting codes and approach the Shannon limit, while having relatively low decoding complexity. The idea of combining LDPC codes with bandwidth-efficient modulation has therefore been considered by many researchers. In this thesis, we study a coded-modulation scheme that is both powerful and bandwidth-efficient, offering excellent bit-error-rate performance and low implementation complexity. This is achieved by using a fast encoder, a low-complexity decoder and no interleaver. The performance of the proposed system for transmission over an additive white Gaussian noise channel and a flat Rayleigh fading channel is evaluated by means of simulations. The numerical results show that the coded-modulation scheme using M-ary quadrature amplitude modulation (M-QAM) can achieve excellent performance over a whole range of spectral efficiencies. Another contribution of this thesis is a simple method for realizing adaptive coded modulation with LDPC codes for transmission over slow, flat Rayleigh fading channels. In this method, six encoder-modulator pairs are employed for frame-by-frame adaptation. The average spectral efficiency varies between 0.5 and 5 bits/s/Hz during transmission.
The simulation results show that adaptive coded modulation with LDPC codes offers better spectral efficiency while maintaining acceptable error performance.
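The frame-by-frame adaptation described above could be sketched as a simple SNR-threshold lookup; the six encoder-modulator pairs, thresholds and spectral efficiencies below are hypothetical, chosen only to span the stated 0.5 to 5 bits/s/Hz range:

```python
# Hypothetical SNR switching thresholds (dB) for six encoder/modulator
# pairs; the actual thesis pairs and thresholds are not given here.
PAIRS = [
    (2.0,  "rate-1/2 LDPC + BPSK",   0.5),
    (5.0,  "rate-1/2 LDPC + QPSK",   1.0),
    (9.0,  "rate-1/2 LDPC + 16-QAM", 2.0),
    (12.0, "rate-3/4 LDPC + 16-QAM", 3.0),
    (16.0, "rate-3/4 LDPC + 64-QAM", 4.5),
    (20.0, "rate-5/6 LDPC + 64-QAM", 5.0),
]

def select_pair(snr_db):
    """Frame-by-frame adaptation: pick the highest-throughput pair whose
    SNR threshold the current channel estimate still satisfies."""
    chosen = None
    for threshold, name, bits_per_s_hz in PAIRS:
        if snr_db >= threshold:
            chosen = (name, bits_per_s_hz)
    return chosen  # None means: defer transmission this frame

print(select_pair(10.0))  # ('rate-1/2 LDPC + 16-QAM', 2.0)
```

Averaging the selected spectral efficiency over the fading distribution is what yields the average of 0.5 to 5 bits/s/Hz reported above.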
Multimedia delivery in the future internet
The term "Networked Media" implies that all kinds of media including text, image, 3D graphics, audio
and video are produced, distributed, shared, managed and consumed on-line through various networks,
like the Internet, Fiber, WiFi, WiMAX, GPRS, 3G and so on, in a convergent manner [1]. This white
paper is the contribution of the Media Delivery Platform (MDP) cluster and aims to cover the
challenges of Networked Media in the transition to the Future Internet.
The Internet has evolved and changed the way we work and live. End users of the Internet have been confronted
with a bewildering range of media, services and applications and of technological innovations concerning
media formats, wireless networks, terminal types and capabilities. And there is little evidence that the pace
of this innovation is slowing. Today, over one billion users access the Internet on a regular basis, more
than 100 million users have downloaded at least one (multi)media file and over 47 million of them do so
regularly, searching in more than 160 Exabytes of content. In the near future these numbers are expected
to rise exponentially. It is expected that the Internet content will be increased by at least a factor of 6, rising
to more than 990 Exabytes before 2012, fuelled mainly by the users themselves. Moreover, it is envisaged
that in a near- to mid-term future, the Internet will provide the means to share and distribute (new)
multimedia content and services with superior quality and striking flexibility, in a trusted and personalized
way, improving citizens' quality of life, working conditions, edutainment and safety.
In this evolving environment, new transport protocols, new multimedia encoding schemes, cross-layer
in-the-network adaptation, machine-to-machine communication (including RFIDs), rich 3D content as well as
community networks and the use of peer-to-peer (P2P) overlays are expected to generate new models of
interaction and cooperation, and be able to support enhanced perceived quality-of-experience (PQoE) and
innovative applications "on the move", like virtual collaboration environments, personalised
services/media, virtual sport groups, on-line gaming, edutainment. In this context, the interaction with content
combined with interactive/multimedia search capabilities across distributed repositories, opportunistic P2P
networks and the dynamic adaptation to the characteristics of diverse mobile terminals are expected to
contribute towards such a vision.
Based on work that has taken place in a number of EC co-funded projects, in Framework Program 6 (FP6)
and Framework Program 7 (FP7), a group of experts and technology visionaries have voluntarily
contributed to this white paper, aiming to describe the status, the state of the art, the challenges and the way
ahead in the area of Content-Aware media delivery platforms.