3,921 research outputs found

    Lattice Gaussian Sampling by Markov Chain Monte Carlo: Bounded Distance Decoding and Trapdoor Sampling

    Sampling from the lattice Gaussian distribution plays an important role in various research fields. In this paper, the Markov chain Monte Carlo (MCMC)-based sampling technique is advanced on several fronts. First, the spectral gap of the independent Metropolis-Hastings-Klein (MHK) algorithm is derived; the analysis is then extended to Peikert's algorithm and to rejection sampling, and we show that the independent MHK algorithm exhibits faster convergence. Next, the performance of bounded distance decoding using MCMC is analyzed, revealing a flexible trade-off between the decoding radius and complexity. MCMC is further applied to trapdoor sampling, again offering a trade-off between security and complexity. Finally, the independent multiple-try Metropolis-Klein (MTMK) algorithm is proposed to further enhance the convergence rate. The proposed algorithms allow parallel implementation, which is beneficial for practical applications. Comment: Submitted to IEEE Transactions on Information Theory.
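    The independent MHK algorithm pairs Klein's randomized nearest-plane sampler, used as a state-independent proposal, with a Metropolis-Hastings accept/reject correction. The sketch below is illustrative only, not the authors' implementation: the target is pi(z) proportional to exp(-||Bz - c||^2 / (2 sigma^2)) over integer coefficient vectors z, the 1-D discrete Gaussians are truncated to a finite window (an approximation), and all function names and the window size are assumptions.

```python
# Illustrative sketch only (not the paper's code): one step of an independent
# Metropolis-Hastings-Klein (MHK) sampler targeting the lattice Gaussian
#   pi(z) ~ exp(-||B z - c||^2 / (2 sigma^2)),  z in Z^n,  lattice points x = B z.
# Klein's sampler serves as a state-independent proposal; the 1-D discrete
# Gaussians are truncated to a finite window, which is an approximation.
import numpy as np

rng = np.random.default_rng(0)
WINDOW = 12  # half-width of the 1-D enumeration window (assumed)

def _window_probs(center, s):
    """Truncated 1-D discrete Gaussian over the integers around `center`."""
    ks = np.arange(int(np.round(center)) - WINDOW, int(np.round(center)) + WINDOW + 1)
    logw = -(ks - center) ** 2 / (2.0 * s * s)
    p = np.exp(logw - logw.max())
    return ks, p / p.sum()

def klein(B, c, sigma, z_eval=None):
    """Klein's randomized nearest-plane sampler.

    Returns (z, log q(z)). If z_eval is given, no sampling is done and the
    log-probability that Klein's proposal assigns to z_eval is returned."""
    Q, R = np.linalg.qr(B)          # B = Q R, R upper triangular
    cp = Q.T @ c
    n = B.shape[1]
    z = np.zeros(n, dtype=int)
    logq = 0.0
    for i in range(n - 1, -1, -1):  # coordinates are drawn backwards
        center = (cp[i] - R[i, i + 1:] @ z[i + 1:]) / R[i, i]
        s = sigma / abs(R[i, i])
        ks, p = _window_probs(center, s)
        if z_eval is None:
            j = rng.choice(len(ks), p=p)
        else:
            hit = np.where(ks == z_eval[i])[0]
            if hit.size == 0:
                return z_eval, -np.inf   # outside the truncation window
            j = hit[0]
        z[i] = int(ks[j])
        logq += np.log(p[j])
    return z, logq

def log_pi(B, c, sigma, z):
    r = B @ z - c
    return -(r @ r) / (2.0 * sigma ** 2)

def mhk_step(B, c, sigma, z_cur):
    """One independent MHK step: Klein proposal + Metropolis-Hastings correction."""
    z_prop, logq_prop = klein(B, c, sigma)
    _, logq_cur = klein(B, c, sigma, z_eval=z_cur)
    log_alpha = (log_pi(B, c, sigma, z_prop) - logq_prop) \
              - (log_pi(B, c, sigma, z_cur) - logq_cur)
    if np.log(rng.random()) < min(0.0, log_alpha):
        return z_prop                # accept the proposed lattice point
    return z_cur                     # reject and keep the current state
```

    Because the proposal is independent of the current state, many candidate points can be drawn in parallel and the accept/reject corrections applied afterwards, which is what makes the parallel implementation mentioned in the abstract natural.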

    On the Geometric Ergodicity of Metropolis-Hastings Algorithms for Lattice Gaussian Sampling

    Sampling from the lattice Gaussian distribution is emerging as an important problem in coding and cryptography. In this paper, the classic Metropolis-Hastings (MH) algorithm from Markov chain Monte Carlo (MCMC) methods is adapted to lattice Gaussian sampling. Two MH-based algorithms are proposed, which overcome the restriction suffered by the default Klein algorithm. The first, referred to as the independent Metropolis-Hastings-Klein (MHK) algorithm, establishes a Markov chain through an independent proposal distribution. We show that the Markov chain arising from the independent MHK algorithm is uniformly ergodic, namely, it converges to the stationary distribution exponentially fast regardless of the initial state. Moreover, the rate of convergence is explicitly calculated in terms of the theta series, leading to a predictable mixing time. To further exploit the convergence potential, a symmetric Metropolis-Klein (SMK) algorithm is proposed. It is proven that the Markov chain induced by the SMK algorithm is geometrically ergodic, where a reasonable selection of the initial state can enhance the convergence performance. Comment: Submitted to IEEE Transactions on Information Theory.
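    For context, the exponential convergence claimed for the independent MHK chain is the standard behaviour of a uniformly ergodic independence sampler: if the proposal density q dominates the target as q(y) >= delta * pi(y) for all y and some delta in (0,1], the classical Mengersen-Tweedie bound below applies. The paper's contribution is the explicit value of delta for Klein's proposal, expressed through the theta series; that expression is not reproduced here.

```latex
% Classical uniform-ergodicity bound for an independence Metropolis-Hastings
% sampler whose proposal satisfies q(y) >= \delta \pi(y) for all y.
% The MHK-specific \delta (the paper's theta-series expression) is not shown.
\[
  \big\| P^{t}(x,\cdot) - \pi \big\|_{\mathrm{TV}} \;\le\; (1-\delta)^{t},
  \qquad
  t_{\mathrm{mix}}(\epsilon) \;\le\;
  \left\lceil \frac{\ln \epsilon}{\ln(1-\delta)} \right\rceil .
\]
```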

    DJpsiFDC: an event generator for the process gg → J/ψJ/ψ at LHC

    DJpsiFDC is an event generator package for the process gg → J/ψJ/ψ. It generates events for the primary leading-order 2 → 2 processes. The package can produce an LHE file, which can easily be embedded into detector simulation software frameworks. The package is written in Fortran. Comment: 10 pages, 3 figures.
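    The LHE (Les Houches Event) file mentioned above is a plain-text format: an init block followed by event blocks, each holding one header line and one line per particle (PDG id, status, mothers, colour tags, four-momentum, mass, lifetime, spin). Below is a minimal, illustrative reader for such files; it is not part of DJpsiFDC, and all names are assumptions. It extracts the four-momenta of J/ψ candidates (PDG id 443) so events can be inspected before being handed to a detector simulation.

```python
# Minimal illustrative LHE reader (not part of DJpsiFDC): pulls the J/psi
# (PDG id 443) four-momenta out of each <event> block, following the
# standard Les Houches Event particle-line layout.
from dataclasses import dataclass
from typing import Iterator, List

@dataclass
class Particle:
    pdg_id: int
    status: int
    px: float
    py: float
    pz: float
    energy: float
    mass: float

def _f(s: str) -> float:
    """Parse a float, tolerating Fortran-style 'D' exponents."""
    return float(s.replace("D", "E").replace("d", "e"))

def read_lhe_events(path: str) -> Iterator[List[Particle]]:
    """Yield the particle list of each <event> block in an LHE file."""
    with open(path) as fh:
        in_event, lines = False, []
        for raw in fh:
            line = raw.strip()
            if line.startswith("<event"):
                in_event, lines = True, []
            elif line.startswith("</event>"):
                in_event = False
                nup = int(lines[0].split()[0])       # NUP: number of particles
                particles = []
                for rec in lines[1:1 + nup]:
                    f = rec.split()
                    particles.append(Particle(
                        pdg_id=int(f[0]), status=int(f[1]),
                        px=_f(f[6]), py=_f(f[7]), pz=_f(f[8]),
                        energy=_f(f[9]), mass=_f(f[10])))
                yield particles
            elif in_event and line and not line.startswith(("#", "<")):
                lines.append(line)

def jpsi_momenta(path: str):
    """Return (px, py, pz, E) of every J/psi in every event."""
    return [(p.px, p.py, p.pz, p.energy)
            for event in read_lhe_events(path)
            for p in event if p.pdg_id == 443]
```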

    Polar Coding for the Cognitive Interference Channel with Confidential Messages

    In this paper, we propose a low-complexity, secrecy-capacity-achieving polar coding scheme for the cognitive interference channel with confidential messages (CICC) under the strong secrecy criterion. Existing polar coding schemes for interference channels rely on polar codes for the multiple access channel, whose code construction can be complicated. We show that, thanks to the cognition available at the encoder, the whole secrecy capacity region of the CICC can be achieved with simple point-to-point polar codes, and that the proposed scheme requires the minimum rate of randomness at the encoder.
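    The point-to-point polar codes used as building blocks are ordinary Arıkan codes: information bits are placed on the reliable positions of the input vector u, the remaining positions are frozen, and the codeword is x = u F^{⊗n} with kernel F = [[1,0],[1,1]] over GF(2). The sketch below shows only this generic building block (bit-reversal permutation omitted, information set chosen by the caller and purely illustrative); the CICC-specific rate splitting, chaining, and randomness reuse from the paper are not reproduced.

```python
# Illustrative sketch of the generic point-to-point polar building block
# (plain Arikan transform, bit-reversal omitted); the CICC-specific code
# construction from the paper is not reproduced here.
import numpy as np

F = np.array([[1, 0], [1, 1]], dtype=int)   # Arikan kernel

def polar_generator(n: int) -> np.ndarray:
    """F^{kron n}: generator of a length-2^n polar transform (mod 2)."""
    G = np.array([[1]], dtype=int)
    for _ in range(n):
        G = np.kron(G, F)
    return G % 2

def polar_encode(u: np.ndarray) -> np.ndarray:
    """Encode a full length-2^n input vector u, i.e. x = u F^{kron n} (mod 2)."""
    n = int(np.log2(len(u)))
    return (u @ polar_generator(n)) % 2

def encode_with_frozen(info_bits, info_set, block_len):
    """Place information bits on `info_set`, freeze the rest to zero, encode."""
    u = np.zeros(block_len, dtype=int)
    u[np.sort(np.asarray(info_set))] = info_bits
    return polar_encode(u)

# Example: an (8, 4) code with an assumed, purely illustrative information set.
codeword = encode_with_frozen(np.array([1, 0, 1, 1]), info_set=[3, 5, 6, 7], block_len=8)
```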

    Markov Chain Monte Carlo Algorithms for Lattice Gaussian Sampling

    Sampling from a lattice Gaussian distribution is emerging as an important problem in various areas such as coding and cryptography. The default sampling algorithm, Klein's algorithm, yields a distribution close to the lattice Gaussian only if the standard deviation is sufficiently large. In this paper, we propose a Markov chain Monte Carlo (MCMC) method for lattice Gaussian sampling when this condition is not satisfied. In particular, we present a sampling algorithm based on Gibbs sampling, which converges to the target lattice Gaussian distribution for any value of the standard deviation. To improve the convergence rate, a more efficient algorithm referred to as Gibbs-Klein sampling is proposed, which samples block by block using Klein's algorithm. We show that Gibbs-Klein sampling yields a distribution close to the target lattice Gaussian under a less stringent condition than that of the original Klein algorithm. Comment: 5 pages, 1 figure, IEEE International Symposium on Information Theory (ISIT) 201
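    The coordinate-wise Gibbs sampler referred to above updates one integer coordinate at a time from its exact 1-D conditional, which is again a discrete Gaussian over the integers. The sketch below illustrates one systematic-scan sweep under stated assumptions (1-D conditionals truncated to a finite window, illustrative names); the blocked Gibbs-Klein variant, which resamples whole blocks with Klein's algorithm, is not shown.

```python
# Illustrative sketch (not the paper's code): one systematic-scan Gibbs sweep
# for the lattice Gaussian pi(z) ~ exp(-||B z - c||^2 / (2 sigma^2)), z in Z^n.
# Each 1-D conditional is a discrete Gaussian over the integers, truncated
# here to a finite window around its mode.
import numpy as np

rng = np.random.default_rng(1)
WINDOW = 12  # half-width of the truncated 1-D conditional (assumed)

def gibbs_sweep(B, c, sigma, z):
    """Resample every coordinate of z once from its exact 1-D conditional."""
    n = B.shape[1]
    for i in range(n):
        b_i = B[:, i]
        r = c - B @ z + b_i * z[i]              # residual with coordinate i removed
        center = (b_i @ r) / (b_i @ b_i)        # mode of the 1-D conditional
        s = sigma / np.linalg.norm(b_i)         # its effective standard deviation
        ks = np.arange(int(np.round(center)) - WINDOW,
                       int(np.round(center)) + WINDOW + 1)
        logw = -(ks - center) ** 2 / (2.0 * s * s)
        p = np.exp(logw - logw.max())
        z[i] = int(rng.choice(ks, p=p / p.sum()))
    return z
```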