
    A Coding Theoretic Approach for Evaluating Accumulate Distribution on Minimum Cut Capacity of Weighted Random Graphs

    The multicast capacity of a directed network is closely related to the s-t maximum flow, which equals the s-t minimum cut capacity by the max-flow min-cut theorem. If the topology of a network (or its link capacities) changes dynamically or is stochastic, predicting statistical properties of the maximum flow is not trivial. In this paper, we present a coding-theoretic approach for evaluating the cumulative distribution of the minimum cut capacity of weighted random graphs. The main feature of our approach is to exploit the correspondence between the cut space of a graph and a binary LDGM (low-density generator-matrix) code with column weight 2. The graph ensemble treated in the paper is a weighted version of the Erdős–Rényi random graph ensemble. The main contribution of our work is a combinatorial lower bound on the cumulative distribution of the minimum cut capacity. Computer experiments indicate that the derived lower bound reflects the actual statistical behavior of the minimum cut capacity. Comment: 5 pages, 2 figures, submitted to IEEE ISIT 201
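
    As a quick illustration of the quantity this paper targets (not of its coding-theoretic bound), the sketch below estimates the empirical cumulative distribution of the s-t minimum cut capacity of weighted Erdős–Rényi graphs by Monte Carlo sampling. It assumes the networkx library; the graph size, edge probability, uniform edge weights, and thresholds are arbitrary illustrative choices.

```python
# Monte Carlo sketch (illustrative only, not the paper's coding-theoretic
# bound): empirical cumulative distribution of the s-t minimum cut capacity
# of weighted Erdos-Renyi graphs.  Assumes the networkx library; graph size,
# edge probability, weight distribution, and thresholds are arbitrary.
import random
import networkx as nx

def sample_min_cut(n=20, p=0.3, rng=random):
    G = nx.erdos_renyi_graph(n, p, seed=rng.randrange(10**9))
    for u, v in G.edges():
        G[u][v]["capacity"] = rng.uniform(0.0, 1.0)   # random edge weight
    s, t = 0, n - 1
    if not nx.has_path(G, s, t):
        return 0.0                                    # disconnected: cut capacity 0
    D = G.to_directed()   # each undirected edge becomes two arcs of equal capacity
    # max-flow min-cut theorem: the s-t max flow equals the s-t min cut capacity
    return nx.maximum_flow_value(D, s, t, capacity="capacity")

samples = sorted(sample_min_cut() for _ in range(500))
for c in (0.5, 1.0, 2.0, 4.0):                        # empirical Pr[min-cut <= c]
    frac = sum(x <= c for x in samples) / len(samples)
    print(f"Pr[min-cut <= {c}] ~= {frac:.3f}")
```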

    Second-Order Weight Distributions

    A fundamental property of codes, the second-order weight distribution, is proposed to solve problems such as computing the second moments of the weight distributions of linear code ensembles. A series of results, parallel to those for weight distributions, is established for second-order weight distributions. In particular, an analogue of the MacWilliams identities is proved. The second-order weight distributions of regular LDPC code ensembles are then computed. As easy consequences, the second moments of the weight distributions of regular LDPC code ensembles are obtained. Furthermore, the application of second-order weight distributions in the random coding approach is discussed. The second-order weight distributions of the ensembles generated by a so-called 2-good random generator or parity-check matrix are computed, where a 2-good random matrix is a generalization of the uniformly distributed random matrix over a finite field and is useful for solving problems that involve pairwise or triple-wise properties of sequences. It is shown that the 2-good property is reflected in the second-order weight distribution, which thus plays a fundamental role in some well-known problems in coding theory and combinatorics. Finally, an example of linear intersecting codes is provided to illustrate this fact. Comment: 10 pages, accepted for publication in IEEE Transactions on Information Theory, May 201
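
    As a toy illustration of one quantity these results address (not the paper's method), the sketch below estimates the first and second moments of the weight distribution A_w over a small random binary code ensemble by brute-force enumeration. The uniformly random parity-check ensemble and the tiny block length are assumptions made for the demo, not the regular LDPC ensembles analyzed in the paper.

```python
# Brute-force sketch (illustrative only): first and second moments of the
# weight distribution A_w over a small random binary code ensemble.  The
# uniformly random parity-check ensemble and the tiny block length are
# assumptions for the demo, not the regular LDPC ensembles of the paper.
import itertools
import numpy as np

def weight_distribution(H):
    """A_w = number of codewords of Hamming weight w in the null space of H."""
    _, n = H.shape
    A = np.zeros(n + 1, dtype=int)
    for bits in itertools.product((0, 1), repeat=n):
        x = np.array(bits, dtype=int)
        if not ((H @ x) % 2).any():        # H x = 0 over GF(2)
            A[x.sum()] += 1
    return A

n, m, trials = 10, 5, 200
rng = np.random.default_rng(0)
first = np.zeros(n + 1)
second = np.zeros(n + 1)
for _ in range(trials):
    H = rng.integers(0, 2, size=(m, n))    # one matrix drawn from the ensemble
    A = weight_distribution(H)
    first += A
    second += A.astype(float) ** 2
print("E[A_w]  :", np.round(first / trials, 2))
print("E[A_w^2]:", np.round(second / trials, 2))
```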

    Average Stopping Set Weight Distribution of Redundant Random Matrix Ensembles

    In this paper, redundant random matrix ensembles (abbreviated as redundant random ensembles) are defined and their stopping set (SS) weight distributions are analyzed. A redundant random ensemble consists of a set of binary matrices with linearly dependent rows. These linearly dependent (redundant) rows significantly reduce the number of stopping sets of small size. Upper and lower bounds on the average SS weight distribution of redundant random ensembles are derived. From these bounds, the trade-off between the number of redundant rows (corresponding to the decoding complexity of belief propagation (BP) on the binary erasure channel (BEC)) and the critical exponent of the asymptotic growth rate of the SS weight distribution (corresponding to decoding performance) can be derived. It is shown that, in some cases, a dense matrix with linearly dependent rows yields asymptotically (i.e., in the regime of small erasure probability) better performance than regular LDPC matrices with comparable parameters. Comment: 14 pages, 7 figures, conference version to appear at the 2007 IEEE International Symposium on Information Theory, Nice, France, June 2007
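
    To make the role of redundant rows concrete, the following brute-force sketch counts stopping sets of a small binary parity-check matrix before and after one linearly dependent row is appended. The matrix and the chosen redundant row are arbitrary examples, not drawn from the paper's ensembles.

```python
# Brute-force sketch (illustrative only): stopping-set (SS) size counts of a
# small binary parity-check matrix, before and after one redundant (linearly
# dependent) row is appended.  The matrix and the chosen redundant row are
# arbitrary examples, not the paper's ensembles.
import itertools
import numpy as np

def stopping_set_counts(H):
    """counts[s] = number of stopping sets of size s: column sets S such that
    no row of H restricted to S has weight exactly 1 (every check node
    touching S touches it at least twice)."""
    _, n = H.shape
    counts = np.zeros(n + 1, dtype=int)
    for size in range(1, n + 1):
        for S in itertools.combinations(range(n), size):
            sub = H[:, list(S)]
            if not np.any(sub.sum(axis=1) == 1):
                counts[size] += 1
    return counts

H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])
print("original SS counts by size:", stopping_set_counts(H))

# redundant row = GF(2) sum of rows 0 and 1; the rank is unchanged, but the
# extra row can only remove stopping sets, never create new ones
H_red = np.vstack([H, (H[0] + H[1]) % 2])
print("with one redundant row    :", stopping_set_counts(H_red))
```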

    On Universal Properties of Capacity-Approaching LDPC Ensembles

    This paper focuses on deriving universal properties of capacity-approaching low-density parity-check (LDPC) code ensembles whose transmission takes place over memoryless binary-input output-symmetric (MBIOS) channels. Properties of the degree distributions, the graphical complexity, and the number of fundamental cycles in the bipartite graphs are studied via information-theoretic bounds. These bounds are expressed in terms of the target block/bit error probability and the gap (in rate) to capacity. Most of the bounds hold under any decoding algorithm, while some others are proved under belief-propagation (BP) decoding. Proving these bounds under a particular decoding algorithm automatically validates them under any sub-optimal decoding algorithm. A proper modification of these bounds makes them universal for the set of all MBIOS channels that exhibit a given capacity. The bounds on the degree distributions and graphical complexity apply to finite-length LDPC codes as well as to the asymptotic case of an infinite block length. The bounds are compared with capacity-approaching LDPC code ensembles under BP decoding and are shown to be informative and easy to calculate. Finally, some interesting open problems are considered. Comment: Published in the IEEE Trans. on Information Theory, vol. 55, no. 7, pp. 2956-2990, July 2009
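
    To make "gap (in rate) to capacity" concrete, the sketch below computes the design rate of an LDPC degree-distribution pair and its gap to BEC capacity. The (3,6)-regular ensemble and the erasure probability are illustrative choices, not taken from the paper.

```python
# Sketch (illustrative only): design rate of an LDPC degree-distribution pair
# and its gap (in rate) to BEC capacity.  The (3,6)-regular ensemble and the
# erasure probability are arbitrary choices, not taken from the paper.

def design_rate(lam, rho):
    """R = 1 - (int_0^1 rho(x) dx) / (int_0^1 lam(x) dx), where lam and rho
    are edge-perspective polynomials given as coefficient lists
    [c1, c2, ...] meaning c1*x + c2*x^2 + ..."""
    integral = lambda coeffs: sum(c / (i + 2) for i, c in enumerate(coeffs))
    return 1.0 - integral(rho) / integral(lam)

# (3,6)-regular ensemble: lambda(x) = x^2, rho(x) = x^5 (edge perspective)
lam = [0.0, 1.0]                    # x^2
rho = [0.0, 0.0, 0.0, 0.0, 1.0]     # x^5
R = design_rate(lam, rho)           # = 1 - (1/6)/(1/3) = 0.5

eps = 0.42                          # BEC erasure probability (illustrative)
C = 1.0 - eps                       # BEC capacity
print(f"design rate R            = {R:.3f}")
print(f"BEC capacity C           = {C:.3f}")
print(f"additive gap    C - R    = {C - R:.3f}")
print(f"multiplicative (C - R)/C = {(C - R) / C:.3f}")
```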