
    Statistical mechanics of error exponents for error-correcting codes

    Error exponents characterize the exponential decay, as the message length increases, of the error probability of many error-correcting codes. To tackle the long-standing problem of computing them exactly, we introduce a general thermodynamic formalism, which we illustrate with maximum-likelihood decoding of low-density parity-check (LDPC) codes on the binary erasure channel (BEC) and the binary symmetric channel (BSC). In this formalism, we apply the cavity method for large deviations to derive expressions for both the average and typical error exponents, which differ by the procedure used to select the codes from specified ensembles. When decreasing the noise intensity, we find that two phase transitions take place, at two different levels: a glass to ferromagnetic transition in the space of codewords, and a paramagnetic to glass transition in the space of codes. Comment: 32 pages, 13 figures
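    For context, a minimal worked statement of what an error exponent is, under standard conventions; the symbols P_e, N, and E below are our own notation for illustration, not taken from the paper.

        % Standard definition of an error exponent (notation assumed for illustration):
        % P_e(N) is the block error probability at block length N for the given channel and decoder.
        \[
            E \;=\; \lim_{N\to\infty}\Bigl(-\tfrac{1}{N}\log P_e(N)\Bigr),
            \qquad\text{so that}\qquad
            P_e(N)\;\approx\; e^{-N E}.
        \]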

    Strategic Insights From Playing the Quantum Tic-Tac-Toe

    In this paper, we perform a minimalistic quantization of the classical game of tic-tac-toe, by allowing superpositions of classical moves. In order for the quantum game to reduce properly to the classical game, we require legal quantum moves to be orthogonal to all previous moves. We also admit interference effects, by squaring the sum of amplitudes over all moves by a player to compute his or her occupation level of a given site. A player wins when the sum of occupations along any of the eight straight lines we can draw in the 3 × 3 grid is greater than three. We play quantum tic-tac-toe first randomly, and then deterministically, to explore the impact that different opening moves, end games, and different combinations of offensive and defensive strategies have on the outcome of the game. In contrast to classical tic-tac-toe, the deterministic quantum game does not always end in a draw. In contrast also to most classical two-player games of no chance, it is possible for Player 2 to win. More interestingly, we find that Player 1 enjoys an overwhelming quantum advantage when he opens with a quantum move, but loses this advantage when he opens with a classical move. We also find the quantum blocking move, which consists of a weighted superposition of moves that the opponent could use to win the game, to be very effective in denying the opponent his or her victory. We then speculate what implications these results might have for quantum information transfer and portfolio optimization. Comment: 20 pages, 3 figures, and 3 tables. LaTeX 2e using the iopart class and the braket, color, graphicx, multirow, subfig, and url packages
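    As an aside, a minimal Python sketch of the scoring rule quoted above; the move representation, helper names, and the example moves are our own assumptions, not the authors' code.

        # Hedged sketch of the scoring rule described in the abstract, not the authors' code:
        # each move is a length-9 vector of real amplitudes over the 3 x 3 grid (a superposition
        # of classical moves); the interleaving with the opponent's moves is ignored for brevity.
        import numpy as np

        LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows of the grid, sites numbered 0..8
                 (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
                 (0, 4, 8), (2, 4, 6)]              # diagonals

        def occupations(moves):
            """Occupation of each site: square of the sum of one player's amplitudes at that site."""
            return np.sum(moves, axis=0) ** 2

        def wins(moves):
            """True if the occupations along some line sum to more than three."""
            occ = occupations(np.asarray(moves, dtype=float))
            return any(occ[list(line)].sum() > 3 for line in LINES)

        # Four mutually orthogonal moves (a 4 x 4 Hadamard basis over sites 0..3, padded with zeros)
        # that interfere constructively on site 0, pushing the top-row occupation above three.
        H = np.array([[1, 1, 1, 1], [1, -1, 1, -1], [1, 1, -1, -1], [1, -1, -1, 1]]) / 2.0
        moves = [np.concatenate([H[:, k], np.zeros(5)]) for k in range(4)]
        print(occupations(np.array(moves)))   # [4. 0. 0. 0. 0. 0. 0. 0. 0.]
        print(wins(moves))                    # True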

    Computing with Liquid Crystal Fingers: Models of geometric and logical computation

    When a voltage is applied across a thin layer of cholesteric liquid crystal, fingers of cholesteric alignment can form and propagate in the layer. In computer simulations based on experimental laboratory results, we demonstrate that these cholesteric fingers can solve selected problems of computational geometry, logic, and arithmetic. We show that branching fingers approximate a planar Voronoi diagram, and non-branching fingers produce a convex subdivision of concave polygons. We also provide a detailed blueprint and simulation of a one-bit half-adder functioning on the principles of collision-based computing, where the implementation is via collisions of liquid crystal fingers with obstacles and other fingers. Comment: submitted Sept 201
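    For reference, the logical behaviour such a half-adder must reproduce, as a minimal Python sketch; this is our own illustration of standard half-adder logic, not the liquid-crystal simulation itself.

        # In a collision-based realisation, the presence/absence of a finger on each input
        # trajectory encodes a bit, and collisions route the signals so that one output carries
        # the sum (XOR) and another the carry (AND). The logic itself is just:
        def half_adder(a: int, b: int) -> tuple[int, int]:
            """Return (sum, carry) for one-bit inputs a and b."""
            return a ^ b, a & b

        for a in (0, 1):
            for b in (0, 1):
                s, c = half_adder(a, b)
                print(f"a={a} b={b} -> sum={s} carry={c}")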

    Complexity of token swapping and its variants

    In the Token Swapping problem we are given a graph with a token placed on each vertex. Each token has exactly one destination vertex, and we try to move all the tokens to their destinations using the minimum number of swaps, i.e., operations of exchanging the tokens on two adjacent vertices. As the main result of this paper, we show that Token Swapping is W[1]-hard parameterized by the length k of a shortest sequence of swaps. In fact, we prove that, for any computable function f, it cannot be solved in time f(k) · n^(o(k / log k)), where n is the number of vertices of the input graph, unless the Exponential Time Hypothesis (ETH) fails. This lower bound almost matches the trivial n^(O(k))-time algorithm. We also consider two generalizations of Token Swapping, namely Colored Token Swapping (where the tokens have colors and tokens of the same color are indistinguishable) and Subset Token Swapping (where each token has a set of possible destinations). To complement the hardness result, we prove that even the most general variant, Subset Token Swapping, is FPT in nowhere-dense graph classes. Finally, we consider the complexities of all three problems in very restricted classes of graphs: graphs of bounded treewidth and diameter, stars, cliques, and paths, trying to identify the borderlines between polynomial and NP-hard cases.
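    As an illustration of the problem statement (not of the parameterized algorithms discussed in the paper), a minimal brute-force Python sketch; the function name and input encoding are our own assumptions.

        # Hedged brute-force sketch of Token Swapping: breadth-first search over token placements,
        # where one step swaps the tokens on the two endpoints of any edge.
        from collections import deque

        def min_token_swaps(edges, start):
            """edges: list of (u, v) pairs; start[v] is the token currently on vertex v.
            Tokens are named by their destination vertex, so the goal placement is (0, 1, ..., n-1).
            Returns the minimum number of swaps, or -1 if the goal is unreachable."""
            n = len(start)
            goal = tuple(range(n))
            seen = {tuple(start): 0}
            queue = deque([tuple(start)])
            while queue:
                placement = queue.popleft()
                if placement == goal:
                    return seen[placement]
                for u, v in edges:
                    nxt = list(placement)
                    nxt[u], nxt[v] = nxt[v], nxt[u]     # swap tokens on an adjacent pair
                    nxt = tuple(nxt)
                    if nxt not in seen:
                        seen[nxt] = seen[placement] + 1
                        queue.append(nxt)
            return -1

        # Example: on a path 0-1-2 with the outer tokens exchanged, three swaps are needed.
        print(min_token_swaps([(0, 1), (1, 2)], (2, 1, 0)))   # 3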

    Ensuring message embedding in wet paper steganography

    Syndrome coding was proposed by Crandall in 1998 as a method to stealthily embed a message in a cover medium through the use of bounded decoding. In 2005, Fridrich et al. introduced wet paper codes to improve the undetectability of the embedding by enabling the sender to lock some components of the cover data, according to the nature of the cover medium and the message. Unfortunately, almost all existing methods solving the bounded decoding syndrome problem, with or without locked components, have a non-zero probability of failing. In this paper, we introduce a randomized syndrome coding, which guarantees the embedding success with probability one. We analyze the parameters of this new scheme in the case of perfect codes.
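    As background, a minimal Python sketch of plain deterministic syndrome coding with the [7,4] Hamming code (a perfect code), i.e. the kind of bounded-decoding embedding the abstract starts from; this is our own illustration, not the randomized wet-paper scheme the paper introduces, and it has no locked components.

        # Hedged sketch of Crandall-style syndrome coding (matrix embedding) with the [7,4] Hamming code.
        import numpy as np

        # Parity-check matrix of the [7,4] Hamming code: columns are the binary expansions of 1..7.
        H = np.array([[(j >> i) & 1 for j in range(1, 8)] for i in range(3)])

        def embed(cover, message):
            """Flip at most one bit of the 7-bit cover so that its syndrome equals the 3-bit message."""
            cover = np.array(cover) % 2
            d = (H @ cover + np.array(message)) % 2        # syndrome mismatch to correct
            if d.any():
                pos = int(d[0] + 2 * d[1] + 4 * d[2]) - 1  # index of the column of H equal to d
                cover[pos] ^= 1
            return cover

        def extract(stego):
            """The recipient recovers the message as the syndrome of the received bits."""
            return (H @ np.array(stego)) % 2

        cover, message = [1, 0, 1, 1, 0, 0, 1], [1, 0, 1]
        stego = embed(cover, message)
        print(stego, extract(stego))    # at most one bit differs from the cover; syndrome == message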

    Enhancing Code Based Zero-knowledge Proofs using Rank Metric

    The advent of quantum computers is a threat to most currently deployed cryptographic primitives. Among these, zero-knowledge proofs play an important role, due to their numerous applications. The primitives and protocols presented in this work base their security on the difficulty of solving the Rank Syndrome Decoding (RSD) problem. This problem is believed to be hard even in the quantum model. We first present a perfectly binding commitment scheme. Using this scheme, we are able to build an interactive zero-knowledge proof to prove the knowledge of a valid opening of a committed value, and that the valid openings of three committed values satisfy a given linear relation or, more generally, any bitwise relation. With the above protocols it becomes possible to prove relations between committed values for an arbitrary circuit, with quasi-linear communication complexity and a soundness error of 2/3. To our knowledge, this is the first quantum-resistant zero-knowledge protocol for arbitrary circuits based on the RSD problem. An important contribution of this work is the selection of a set of parameters and a full implementation, both for our proposal in the rank metric and for the original LPN-based one by Jain et al. in the Hamming metric, from which we took our inspiration. Besides demonstrating the practicality of both constructions, we provide evidence of the convenience of the rank metric by reporting performance benchmarks and a detailed comparison.
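    A back-of-the-envelope note on the quoted soundness error of 2/3 (our own arithmetic, with the target security level λ as an assumed parameter, not a figure from the paper): sequential repetition drives the error down exponentially.

        % Soundness amplification by sequential repetition: with per-round soundness error 2/3,
        % t repetitions give error (2/3)^t, so reaching a target of 2^{-\lambda} requires
        \[
            \left(\tfrac{2}{3}\right)^{t} \le 2^{-\lambda}
            \quad\Longleftrightarrow\quad
            t \;\ge\; \frac{\lambda}{\log_2(3/2)} \approx 1.71\,\lambda,
        \]
        % e.g. roughly 219 rounds for \lambda = 128.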

    Land use interpretation for cellular automata models with socioeconomic heterogeneity

    Cellular automata models for simulating urban development usually lack the social heterogeneity that is typical of urban environments. To address this shortcoming, this paper proposes the use of supervised clustering analysis to provide socioeconomic intra-urban land use classifications at different levels, to be applied to cellular automata models. An empirical test in a highly diverse context, the Greater Metropolitan Area of Belo Horizonte (RMBH) in Brazil, is provided. The results show that a reliable division into different socioeconomic land-use classes at a large scale enables detailed analysis of urban dynamics. Furthermore, the results also allow the quantification of (1) the proportion of urban space occupied by different levels of income and (2) their pattern in relation to the city centre.