
    Classical capacity of bosonic broadcast communication and a new minimum output entropy conjecture

    Previous work on the classical information capacities of bosonic channels has established the capacity of the single-user pure-loss channel, bounded the capacity of the single-user thermal-noise channel, and bounded the capacity region of the multiple-access channel. The latter is a multi-user scenario in which several transmitters seek to simultaneously and independently communicate to a single receiver. We study the capacity region of the bosonic broadcast channel, in which a single transmitter seeks to simultaneously and independently communicate to two different receivers. It is known that the tightest available lower bound on the capacity of the single-user thermal-noise channel is that channel's capacity if, as conjectured, the minimum von Neumann entropy at the output of a bosonic channel with additive thermal noise occurs for coherent-state inputs. Evidence in support of this minimum output entropy conjecture has been accumulated, but a rigorous proof has not been obtained. In this paper, we propose a new minimum output entropy conjecture that, if proved to be correct, will establish that the capacity region of the bosonic broadcast channel equals the inner bound achieved using a coherent-state encoding and optimum detection. We provide some evidence that supports this new conjecture, but again a full proof is not available. Comment: 13 pages, 7 figures
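
    For reference (a sketch of the standard quantities, with the broadcast inner bound quoted from related work rather than from this abstract): the von Neumann entropy of a thermal state with mean photon number N is

        g(N) = (N + 1) log_2(N + 1) - N log_2 N   [bits/use],

    and the established single-user pure-loss capacity with transmissivity η and mean photon-number constraint N is C = g(ηN). If the broadcast channel is modeled as a beamsplitter sending a fraction η of the input to receiver A and 1-η to receiver B, the coherent-state inner bound referred to above is usually written with a power-splitting parameter 0 ≤ β ≤ 1 as

        R_A ≤ g(βηN),    R_B ≤ g((1-η)N) - g(β(1-η)N),

    and the new minimum output entropy conjecture, if proved, would establish this region as the full capacity region.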

    Optimal ratio between phase basis and bit basis in QKD

    In the original BB84 protocol, the bit basis and the phase basis are used with equal probability. Lo et al. (J. of Cryptology, 18, 133-165 (2005)) proposed modifying the ratio between the two bases in order to increase the final key generation rate. However, the optimum ratio has not been derived. In this letter, in order to examine this problem, the ratio between the two bases is optimized under exponential constraints on Eve's information (distinguishability) and the final error probability.
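
    As background on the tradeoff being optimized (standard asymptotic expressions, not the finite-length exponents derived in the letter): if both parties choose the bit basis with probability p and the phase basis with probability 1-p, the fraction of rounds in which their bases match (the sifted fraction) is

        p_sift = p^2 + (1-p)^2,

    which equals 1/2 for the original balanced BB84 and approaches 1 as p -> 1, while the asymptotic secret-key rate per sifted bit is r = 1 - h(e_bit) - h(e_ph), with h the binary entropy function. Biasing too strongly toward the bit basis leaves too few phase-basis samples for a reliable estimate of e_ph, and it is the optimum of this tradeoff, under the stated exponential constraints, that the letter derives.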

    Multimode theory of measurement-induced non-Gaussian operation on wideband squeezed light

    We present a multimode theory of the non-Gaussian operation induced by an imperfect on/off-type photon detector acting on a beam split off from a wideband squeezed light. The events are defined for a finite time duration T in the time domain. The non-Gaussian output state is measured by a homodyne detector with finite bandwidth B. Under this time- and band-limitation of the quantum states, we develop a formalism to evaluate the frequency-mode matching between the on/off trigger channel and the conditional signal beam in the homodyne channel. Our formalism is applied to the CW and pulsed schemes. We explicitly calculate the Wigner function of the conditional non-Gaussian output state in a realistic situation. Good mode matching is achieved for BT ≲ 1, where the discreteness of modes becomes prominent and only a few modes are dominant in both the on/off and the homodyne channels. If the trigger beam is projected nearly onto the single-photon state of the most dominant mode in this regime, the most striking non-classical effect will be observed in the homodyne statistics. Increasing BT and the dark counts degrade the non-classical effect. Comment: 20 pages, 14 figures, submitted to Phys. Rev.
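
    The ideal, perfectly mode-matched limit of this kind of measurement-induced operation is easy to reproduce numerically. The sketch below (a single-mode toy under assumed parameters, not the paper's multimode time- and band-limited formalism with an imperfect on/off detector) uses QuTiP to subtract a photon from a squeezed vacuum and evaluate the resulting Wigner function, whose negativity is the non-classical signature discussed above.

        # Single-mode idealization: photon subtraction from squeezed vacuum with
        # perfect mode matching and an ideal (dark-count-free) trigger.  This is a
        # toy limit of the measurement-induced non-Gaussian operation, not the
        # multimode formalism developed in the paper.
        import numpy as np
        from qutip import basis, destroy, squeeze, wigner

        N = 30                  # Fock-space truncation
        r = 0.5                 # squeezing parameter (assumed value)
        xvec = np.linspace(-4, 4, 161)

        sq_vac = squeeze(N, r) * basis(N, 0)        # squeezed vacuum
        subtracted = (destroy(N) * sq_vac).unit()   # ideal single-photon subtraction

        W = wigner(subtracted, xvec, xvec)
        print("minimum of the Wigner function:", W.min())  # negative => non-classical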

    Interior Point Decoding for Linear Vector Channels

    In this paper, a novel decoding algorithm for low-density parity-check (LDPC) codes based on convex optimization is presented. The decoding algorithm, called interior point decoding, is designed for linear vector channels. The linear vector channels include many practically important channels such as intersymbol interference channels and partial response channels. It is shown that the maximum likelihood decoding (MLD) rule for a linear vector channel can be relaxed to a convex optimization problem, which is called a relaxed MLD problem. The proposed decoding algorithm is based on a numerical optimization technique, the interior point method with a barrier function. Approximate variants of the gradient descent and Newton methods are used to solve the convex optimization problem. In the decoding process of the proposed algorithm, the search point always lies in the fundamental polytope defined by the low-density parity-check matrix. Compared with a conventional joint message-passing decoder, the proposed decoding algorithm achieves better BER performance with lower complexity for partial response channels in many cases. Comment: 18 pages, 17 figures. The paper has been submitted to the IEEE Transactions on Information Theory.
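
    The relaxation itself is easy to illustrate with a generic solver. The sketch below is a stand-in under stated assumptions: a tiny example parity-check matrix, a memoryless binary-input AWGN channel instead of the paper's linear vector channel, and scipy's off-the-shelf LP solver in place of the paper's tailored interior point iteration. It only shows the fundamental-polytope constraints over which the relaxed MLD problem is solved.

        # LP relaxation of ML decoding over the fundamental polytope, solved with a
        # generic solver.  Toy stand-in for the paper's interior point decoder: the
        # channel here is memoryless BI-AWGN, not an ISI / partial response channel.
        import itertools
        import numpy as np
        from scipy.optimize import linprog

        H = np.array([[1, 1, 0, 1, 1, 0, 0],   # small example parity-check matrix
                      [1, 0, 1, 1, 0, 1, 0],
                      [0, 1, 1, 1, 0, 0, 1]])
        m, n = H.shape

        rng = np.random.default_rng(0)
        sigma = 0.6
        y = 1.0 + sigma * rng.standard_normal(n)   # all-zero codeword, BPSK 0 -> +1
        gamma = 2.0 * y / sigma**2                 # channel LLRs: minimize gamma @ x

        # Fundamental-polytope inequalities: for every check j and every odd-size
        # subset S of its neighborhood N(j),
        #     sum_{i in S} x_i - sum_{i in N(j)\S} x_i <= |S| - 1.
        A_ub, b_ub = [], []
        for j in range(m):
            nbrs = np.flatnonzero(H[j])
            for size in range(1, len(nbrs) + 1, 2):
                for S in itertools.combinations(nbrs, size):
                    row = np.zeros(n)
                    row[nbrs] = -1.0
                    row[list(S)] = 1.0
                    A_ub.append(row)
                    b_ub.append(size - 1)

        res = linprog(gamma, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(0.0, 1.0)] * n)
        print("relaxed solution :", np.round(res.x, 3))
        print("hard decision    :", (res.x > 0.5).astype(int))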

    Degree Complexity of a Family of Birational Maps

    We compute the degree complexity of a family of birational mappings of the plane with high-order singularities.

    Finite-Connectivity Spin-Glass Phase Diagrams and Low Density Parity Check Codes

    We obtain phase diagrams of regular and irregular finite-connectivity spin glasses. Contact is first established between properties of the phase diagram and the performance of low-density parity-check (LDPC) codes within the replica-symmetric (RS) ansatz. We then study the location of the dynamical and critical transitions of these systems within the one-step replica symmetry breaking (RSB) theory, extending similar calculations that have been performed in the past for the Bethe spin-glass problem. We observe that, away from the Nishimori line, in the low-temperature region, the location of the dynamical transition line does change within the RSB theory, in comparison with the RS case. For LDPC decoding over the binary erasure channel we find, at zero temperature and rate R=1/4, an RS critical transition point located at p_c = 0.67, while the critical RSB transition point is located at p_c = 0.7450, to be compared with the corresponding Shannon bound 1-R. For the binary symmetric channel (BSC) we show that the low-temperature reentrant behavior of the dynamical transition line, observed within the RS ansatz, changes within the RSB theory, with the dynamical transition point occurring at higher values of the channel noise. Possible practical implications for improving the performance of state-of-the-art error-correcting codes are discussed. Comment: 21 pages, 15 figures
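
    For orientation on the numbers quoted above: the binary erasure channel with erasure probability p has capacity C = 1 - p, so reliable decoding at rate R requires p < 1 - R; for R = 1/4 this Shannon bound is p = 0.75, against which the RS estimate p_c = 0.67 and the RSB estimate p_c = 0.7450 are to be compared.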

    Parallel vs. Sequential Belief Propagation Decoding of LDPC Codes over GF(q) and Markov Sources

    A sequential updating scheme (SUS) for belief propagation (BP) decoding of LDPC codes over Galois fields, GF(q), and correlated Markov sources is proposed and compared with the standard parallel updating scheme (PUS). A thorough experimental study of various transmission settings indicates that the convergence rate, in iterations, of the BP algorithm (and consequently its complexity) for the SUS is about one half of that for the PUS, independent of the finite field size q. Moreover, this 1/2 factor appears regardless of the correlations of the source and the channel's noise model, while the error correction performance remains unchanged. These results may point to the 'universality' of the one-half convergence speed-up of SUS decoding.
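
    A minimal sketch of the scheduling difference is given below, under simplifying assumptions: a binary (GF(2)) code on a memoryless BI-AWGN channel rather than the GF(q) / correlated Markov-source setting studied above, and a check-serial schedule as one concrete instance of an SUS.

        # Parallel (flooding) vs. sequential (check-serial) schedules for
        # sum-product LDPC decoding -- a binary toy version of the comparison.
        import numpy as np

        def check_msgs(v2c, nbrs):
            # Messages from one check node to its neighboring variables (tanh rule).
            t = np.tanh(np.clip(v2c[nbrs], -25, 25) / 2.0)
            out = np.zeros(len(nbrs))
            for k in range(len(nbrs)):
                prod = np.prod(np.delete(t, k))
                out[k] = 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
            return out

        def decode(H, llr, max_iter=100, schedule="parallel"):
            m, n = H.shape
            C2V = np.zeros((m, n))                 # check-to-variable messages
            for it in range(1, max_iter + 1):
                if schedule == "parallel":
                    # flooding: every check sees only the previous iteration's messages
                    totals = llr + C2V.sum(axis=0)
                    new_C2V = np.zeros_like(C2V)
                    for c in range(m):
                        nbrs = np.flatnonzero(H[c])
                        new_C2V[c, nbrs] = check_msgs(totals - C2V[c], nbrs)
                    C2V = new_C2V
                else:
                    # sequential: each updated check is immediately visible to the next
                    for c in range(m):
                        nbrs = np.flatnonzero(H[c])
                        v2c = llr + C2V.sum(axis=0) - C2V[c]
                        row = np.zeros(n)
                        row[nbrs] = check_msgs(v2c, nbrs)
                        C2V[c] = row
                x_hat = ((llr + C2V.sum(axis=0)) < 0).astype(int)
                if not np.any(H @ x_hat % 2):      # all parity checks satisfied
                    return x_hat, it
            return x_hat, max_iter

        H = np.array([[1, 1, 0, 1, 1, 0, 0],       # small example code
                      [1, 0, 1, 1, 0, 1, 0],
                      [0, 1, 1, 1, 0, 0, 1]])
        rng = np.random.default_rng(1)
        sigma = 0.8
        y = 1.0 + sigma * rng.standard_normal(H.shape[1])   # all-zero codeword, BPSK 0 -> +1
        llr = 2.0 * y / sigma**2
        for sched in ("parallel", "sequential"):
            x_hat, iters = decode(H, llr, schedule=sched)
            print(sched, "-> decoded", x_hat, "in", iters, "iterations")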

    Species-specific abundance of bivalve larvae in relation to biological and physical conditions in a Cape Cod estuary : Waquoit Bay, Massachusetts (USA)

    Author Posting. © The Author(s), 2012. This is the author's version of the work. It is posted here by permission of Inter-Research for personal use, not for redistribution. The definitive version was published in Marine Ecology Progress Series 469 (2012): 53-69, doi:10.3354/meps09998.

    Physical and biological conditions impact recruitment and adult population structure of marine invertebrates by affecting early life history processes from spawning to post-settlement. We investigated how temperature, salinity and phytoplankton influenced larval abundance and larval size structure for three species of bivalves over two non-consecutive years in Waquoit Bay, MA. Abundance and size of Mercenaria mercenaria (quahog), Anomia simplex (jingle clam), and Geukensia demissa (ribbed mussel) larvae were compared between locations in the bay and with environmental conditions. Shell birefringence patterns observed with polarized light microscopy were used to distinguish species. Larval abundances for all three species were higher in 2009 than in 2007 and were positively correlated with temperature in both years. Differences in larval abundance and size structure between bay sites were attributed to salinity tolerances and potential source locations. Higher survival in 2009 than in 2007, as determined by the number of pediveligers, was likely due to higher temperatures and greater food availability during the peak abundance months of July and August in 2009. Yearly differences in larval growth and survival can have a large impact on recruitment. Knowing the optimal periods and locations for larval abundance and survival can be useful for isolating species-specific patterns in larval dispersal and to aid resource managers in enhancing or restoring depleted populations.

    This research was conducted in the National Estuarine Research Reserve System under an award to S. Gallager and C. Mingione Thompson from the Estuarine Reserves Division, Office of Ocean and Coastal Resource Management, National Ocean Service, National Oceanic and Atmospheric Administration

    Processing and Transmission of Information

    Contains reports on four research projects. Lincoln Laboratory, Purchase Order DDL-B222. U.S. Air Force under Air Force Contract AF19(604)-520

    Simplifying Random Satisfiability Problem by Removing Frustrating Interactions

    How can we remove some interactions in a constraint satisfaction problem (CSP) such that it still remains satisfiable? In this paper we study a modified survey propagation algorithm that enables us to address this question for a prototypical CSP, i.e., the random K-satisfiability problem. The average number of removed interactions is controlled by a tuning parameter in the algorithm. If the original problem is satisfiable, then we are able to construct satisfiable subproblems ranging from the original one to a minimal one with the minimum possible number of interactions. The minimal satisfiable subproblems directly provide the solutions of the original problem. Comment: 21 pages, 16 figures
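
    A brute-force toy of the same idea (removing interactions while still being able to read off solutions of the original problem) is sketched below. It is not the modified survey propagation algorithm of the paper, which is what makes the construction feasible on large random instances, and it uses the simpler criterion that pruning must not enlarge the solution set; it only makes the notion concrete on a tiny random 3-SAT formula.

        # Greedy, brute-force clause pruning on a tiny random 3-SAT instance.
        # NOT the paper's modified survey propagation; purely illustrative.
        import itertools, random

        def random_ksat(n_vars, n_clauses, k, rng):
            clauses = []
            for _ in range(n_clauses):
                vs = rng.sample(range(n_vars), k)
                clauses.append(tuple((v, rng.random() < 0.5) for v in vs))  # (variable, required value)
            return clauses

        def satisfies(assign, clause):
            return any(assign[v] == val for v, val in clause)

        def solutions(n_vars, clauses):
            return {a for a in itertools.product((False, True), repeat=n_vars)
                    if all(satisfies(a, c) for c in clauses)}

        rng = random.Random(0)
        n = 12
        clauses = random_ksat(n, 40, 3, rng)
        full = solutions(n, clauses)
        if not full:
            print("instance is UNSAT; try another seed")
        else:
            kept = list(clauses)
            for c in clauses:                        # greedy pruning
                trial = [x for x in kept if x is not c]
                if solutions(n, trial) == full:      # removal leaves the solution set unchanged
                    kept = trial
            print(f"kept {len(kept)} of {len(clauses)} clauses; "
                  f"{len(full)} solutions, unchanged")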