Improved construction of irregular progressive edge-growth Tanner graphs
The progressive edge-growth algorithm is a well-known procedure to construct
regular and irregular low-density parity-check codes. In this paper, we propose
a modification of the original algorithm that improves the performance of these
codes in the waterfall region when constructing codes complying with both
check and symbol node degree distributions. The proposed algorithm is thus
interesting if a family of irregular codes with a complex check node degree
distribution is used.
Comment: 3 pages, 3 figures
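The classic progressive edge-growth idea that this abstract builds on can be sketched as follows: each edge of a symbol node is placed greedily, preferring check nodes that are unreachable from the symbol (or as far away as possible), which maximizes the local girth. This is a minimal sketch of the original PEG procedure, not the improved variant proposed in the paper; all names are illustrative.

```python
def peg_place_edges(n_checks, symbol_degrees):
    """Progressive edge-growth sketch: greedily place Tanner-graph edges
    so each new edge maximizes the local girth around its symbol node.
    symbol_degrees[j] is the target degree of symbol node j.
    Returns a list of (check, symbol) edges."""
    check_adj = [set() for _ in range(n_checks)]            # check -> symbols
    sym_adj = [set() for _ in range(len(symbol_degrees))]   # symbol -> checks
    edges = []
    for s, deg in enumerate(symbol_degrees):
        for k in range(deg):
            if k == 0:
                # First edge: attach to a currently lowest-degree check.
                c = min(range(n_checks), key=lambda c: len(check_adj[c]))
            else:
                # BFS from symbol s over the current graph; prefer checks
                # not yet reachable, otherwise those at maximum distance.
                dist = {c: None for c in range(n_checks)}
                frontier = set(sym_adj[s])
                for c in frontier:
                    dist[c] = 0
                seen_syms = {s}
                d = 0
                while frontier:
                    nxt = set()
                    for c in frontier:
                        for s2 in check_adj[c]:
                            if s2 not in seen_syms:
                                seen_syms.add(s2)
                                nxt.update(sym_adj[s2])
                    d += 1
                    frontier = {c for c in nxt if dist[c] is None}
                    for c in frontier:
                        dist[c] = d
                candidates = [c for c in range(n_checks) if dist[c] is None]
                if not candidates:
                    far = max(x for x in dist.values() if x is not None)
                    candidates = [c for c, x in dist.items() if x == far]
                c = min(candidates, key=lambda c: len(check_adj[c]))
            check_adj[c].add(s)
            sym_adj[s].add(c)
            edges.append((c, s))
    return edges
```

The tie-breaking rule (lowest current check degree) is what keeps the check degree distribution flat in the original algorithm; the paper's contribution concerns honoring a prescribed, possibly complex, check node degree distribution instead.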
Blind Reconciliation
Information reconciliation is a crucial procedure in the classical
post-processing of quantum key distribution (QKD). Poor reconciliation
efficiency, revealing more information than strictly needed, may compromise the
maximum attainable distance, while poor performance of the algorithm limits the
practical throughput in a QKD device. Historically, reconciliation has been
mainly done using procedures with close to minimal information disclosure but
heavy interactivity, like Cascade, or using less efficient but also less
interactive procedures (just one message is exchanged), like the ones based on
low-density parity-check (LDPC) codes. The price to pay in the LDPC case is
that good efficiency is only attained for very long codes and in a very narrow
range centered around the quantum bit error rate (QBER) that the code was
designed to reconcile, thus forcing the use of several codes if a broad range
of QBER needs to be catered for. Real world implementations of these methods are
thus very demanding, either on computational or communication resources or
both, to the extent that the last generation of GHz clocked QKD systems are
finding a bottleneck in the classical part. In order to produce compact, high
performance and reliable QKD systems it would be highly desirable to remove
these problems. Here we analyse the use of short-length LDPC codes in the
information reconciliation context using a low interactivity, blind, protocol
that avoids an a priori error rate estimation. We demonstrate that 2x10^3 bits
length LDPC codes are suitable for blind reconciliation. Such codes are of high
interest in practice, since they can be used for hardware implementations with
very high throughput.Comment: 22 pages, 8 figure
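The efficiency figure the abstract refers to has a standard definition: for a code of rate R used to reconcile a QBER of e, the disclosed fraction 1 - R is compared against the Shannon limit h(e), giving f = (1 - R) / h(e), with f = 1 being perfect. A small sketch (the function names are illustrative, not from the paper) shows why a fixed-rate code is only efficient near its design QBER:

```python
import math

def binary_entropy(p):
    """Binary Shannon entropy h(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def reconciliation_efficiency(code_rate, qber):
    """f = (1 - R) / h(e): disclosed information relative to the Shannon
    minimum. f = 1 is optimal; f > 1 means extra information is leaked."""
    return (1 - code_rate) / binary_entropy(qber)
```

For a rate-0.8 code, the efficiency degrades quickly as the actual QBER drops below the design point, which is exactly why a single fixed code cannot cover a broad QBER range and why a blind, rate-adaptive protocol is attractive.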
Untainted Puncturing for Irregular Low-Density Parity-Check Codes
Puncturing is a well-known coding technique widely used for constructing
rate-compatible codes. In this paper, we consider the problem of puncturing
low-density parity-check codes and propose a new algorithm for intentional
puncturing. The algorithm is based on the puncturing of untainted symbols, i.e.
nodes with no punctured symbols within their neighboring set. It is shown that
the algorithm proposed here performs better than previous proposals for a range
of coding rates and small proportions of punctured symbols.
Comment: 4 pages, 3 figures
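The untainted criterion described above can be sketched directly on the Tanner graph: a symbol is untainted if none of the symbols sharing a check node with it has been punctured. This is an illustrative greedy selection under that definition, not the paper's exact ordering; the data-structure names are assumptions.

```python
def untainted_puncturing(sym_to_checks, check_to_syms, n_puncture):
    """Greedily puncture up to n_puncture 'untainted' symbols: symbols
    whose neighborhood (all symbols sharing a check with them) contains
    no already-punctured symbol. Illustrative sketch of the criterion."""
    punctured = set()
    for s in sym_to_checks:
        if len(punctured) == n_puncture:
            break
        # Collect every symbol that shares at least one check with s.
        neighbors = set()
        for c in sym_to_checks[s]:
            neighbors.update(check_to_syms[c])
        if punctured.isdisjoint(neighbors):
            punctured.add(s)
    return punctured
```

Keeping punctured symbols out of each other's neighborhoods means every punctured symbol has at least one check whose other symbols are all known, which is what lets the iterative decoder recover it quickly.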
Rate Compatible Protocol for Information Reconciliation: An application to QKD
Information Reconciliation is a mechanism for weeding out the discrepancies
between two correlated variables. It is an essential component in
every key agreement protocol where the key has to be transmitted through a
noisy channel. The typical case is in the satellite scenario described by
Maurer in the early 90's. Recently the need has arisen in relation with Quantum
Key Distribution (QKD) protocols, where it is very important not to reveal
unnecessary information in order to maximize the shared key length. In this
paper we present an information reconciliation protocol based on a rate
compatible construction of Low Density Parity Check codes. Our protocol
improves the efficiency of the reconciliation for the whole range of error
rates in the discrete variable QKD context. Its adaptability together with its
low interactivity makes it especially well suited for QKD reconciliation.
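A common way to make a single LDPC code rate-compatible, and plausibly the kind of construction meant here, is to combine puncturing with shortening: given an (n, k) mother code and a fixed budget of d = p + s modulated symbols, sweeping the split between punctured (p) and shortened (s) symbols moves the effective rate up or down. A minimal sketch under that assumption:

```python
def punctured_shortened_rate(n, k, p, s):
    """Effective rate of an (n, k) mother code after puncturing p symbols
    (transmitted as unknown) and shortening s symbols (fixed and removed).
    Sketch of a standard rate-compatible construction, not necessarily
    the paper's exact parameterization."""
    assert p + s < n and s < k, "budget must leave a nontrivial code"
    return (k - s) / (n - p - s)
```

With d fixed, s = d gives the lowest rate (k - d) / (n - d) and p = d the highest rate k / (n - d), so one code plus one interactive exchange can cover a continuous range of error rates instead of maintaining a code per QBER value.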
Secure Optical Networks Based on Quantum Key Distribution and Weakly Trusted Repeaters
In this paper we explore how recent technologies can improve the security of
optical networks. In particular, we study how to use quantum key distribution
(QKD) in common optical network infrastructures and propose a method to
overcome its distance limitations. QKD is the first technology offering
information theoretic secret-key distribution that relies only on the
fundamental principles of quantum physics. Point-to-point QKD devices have
reached a mature industrial state; however, these devices are severely limited
in distance, since signals at the quantum level (e.g. single photons) are
highly affected by the losses in the communication channel and intermediate
devices. To overcome this limitation, intermediate nodes (i.e. repeaters) are
used. Both quantum-regime and trusted classical repeaters have been proposed
in the QKD literature, but only the latter can be implemented in practice. As a
novelty, we propose here a new QKD network model based on the use of not fully
trusted intermediate nodes, referred to as weakly trusted repeaters. This approach
forces the attacker to simultaneously break several paths to get access to the
exchanged key, thus improving significantly the security of the network. We
formalize the model using network codes and provide real scenarios that allow
users to exchange secure keys over metropolitan optical networks using only
passive components. Moreover, the theoretical framework makes it possible to
extend these scenarios not only to accommodate more complex trust constraints,
but also to consider robustness and resiliency constraints on the network.
Comment: 11 pages, 13 figures
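The simplest instance of forcing an attacker to break several paths at once is XOR secret sharing: the key is split into one share per disjoint path, and any subset of fewer than all shares is uniformly random. This toy sketch illustrates that multipath idea only; the paper formalizes the general case with network codes.

```python
import secrets

def split_key(key, n_paths):
    """Split a key into n_paths XOR shares, one per disjoint network path.
    An eavesdropper must compromise every path to learn anything about
    the key. Toy illustration of the multipath principle."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n_paths - 1)]
    last = key
    for sh in shares:
        last = bytes(a ^ b for a, b in zip(last, sh))
    shares.append(last)  # XOR of all shares reconstructs the key
    return shares

def combine_key(shares):
    """Recombine the shares collected over all paths."""
    key = shares[0]
    for sh in shares[1:]:
        key = bytes(a ^ b for a, b in zip(key, sh))
    return key
```

In the network-coded setting the shares become linear combinations over a finite field, which also buys robustness (loss of some paths can be tolerated) at the cost of weaker secrecy guarantees per path.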
Flooding through the lens of mobile phone activity
Natural disasters affect hundreds of millions of people worldwide every year.
Emergency response efforts depend upon the availability of timely information,
such as information concerning the movements of affected populations. The
analysis of aggregated and anonymized Call Detail Records (CDR) captured from
the mobile phone infrastructure provides new possibilities to characterize
human behavior during critical events. In this work, we investigate the
viability of using CDR data combined with other sources of information to
characterize the floods that occurred in Tabasco, Mexico in 2009. An impact map
has been reconstructed using Landsat-7 images to identify the floods. Within
this frame, the underlying communication activity signals in the CDR data have
been analyzed and compared against rainfall levels extracted from data of the
NASA-TRMM project. The variations in the number of active phones connected to
each cell tower reveal abnormal activity patterns in the most affected
locations during and after the floods that could be used as signatures of the
floods - both in terms of infrastructure impact assessment and population
information awareness. The representativeness of the analysis has been assessed
using census data and civil protection records. While a more extensive
validation is required, these early results suggest high potential in using
cell tower activity information to improve early warning and emergency
management mechanisms.
Comment: Submitted to IEEE Global Humanitarian Technologies Conference (GHTC) 201
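The "abnormal activity patterns" mentioned above are typically detected by comparing each tower's active-phone counts against its own baseline. A minimal sketch of one such signature test (a simple z-score rule; the paper's actual analysis is richer and the threshold is an assumption):

```python
import statistics

def activity_anomalies(counts, threshold=3.0):
    """Return the indices of time bins whose active-phone count deviates
    more than `threshold` standard deviations from the tower's baseline.
    Illustrative sketch of per-tower signature detection."""
    mu = statistics.mean(counts)
    sigma = statistics.pstdev(counts)
    if sigma == 0:
        return []  # flat series: nothing to flag
    return [i for i, c in enumerate(counts)
            if abs(c - mu) / sigma > threshold]
```

Applied per cell tower, spikes during the event point to population-information awareness (people calling), while sustained drops after it point to infrastructure damage or displacement.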