    Lattice sampling algorithms for communications

In this thesis, we investigate the problem of decoding for wireless communications from the perspective of lattice sampling. In particular, computationally efficient lattice sampling algorithms are exploited to enhance system performance, offering a flexible tradeoff between performance and complexity through the sample size. Based on this idea, several novel lattice sampling algorithms are presented in this thesis.

First, in order to address the inherent issues of random sampling, a derandomized sampling algorithm is proposed. Specifically, by setting a probability threshold for sampling candidates, the whole sampling procedure becomes deterministic, leading to considerable performance improvement and complexity reduction over randomized sampling. Based on the analysis and optimization, the correct decoding radius is given for the optimized parameter setting, and an upper bound on the sample size required for near-maximum-likelihood (ML) performance is derived. The proposed derandomized sampling algorithm is then applied to the soft-output decoding of MIMO bit-interleaved coded modulation (BICM) systems to further improve decoding performance, where it is shown to achieve near-maximum a posteriori (MAP) performance.

We then extend the well-known Markov chain Monte Carlo (MCMC) methods to sampling from the lattice Gaussian distribution, which has emerged as a common theme in lattice coding and decoding, cryptography, and mathematics. We first show that classical Gibbs sampling is capable of performing lattice Gaussian sampling. A more efficient algorithm, referred to as Gibbs-Klein sampling, is then proposed, which samples multiple variables block by block using Klein's algorithm.

To improve the convergence rate, we introduce conventional Metropolis-Hastings (MH) sampling to lattice Gaussian distributions and propose three MH-based sampling algorithms. The first, named the MH multivariate sampling algorithm, is demonstrated to converge faster than Gibbs-Klein sampling. Next, the symmetric distribution generated by Klein's algorithm is taken as the proposal distribution, which offers an efficient way to perform Metropolis sampling over high-dimensional models. Finally, the independent Metropolis-Hastings-Klein (MHK) algorithm is proposed, whose Markov chain is proved to converge to the stationary distribution exponentially fast. Furthermore, its convergence rate can be explicitly calculated in terms of the theta series, making it possible to predict the exact mixing time of the underlying Markov chain.
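To make the randomized sampling decoding that the abstract starts from concrete, the following is a minimal Python sketch of a Klein-style sampler used as a list decoder: it draws K lattice points around the received vector and keeps the closest one. All names, the truncated integer support, and the parameters sigma and K are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def klein_sample(B, y, sigma, rng):
    """Draw one lattice point near y by randomized rounding over the QR decomposition of B."""
    n = B.shape[1]
    Q, R = np.linalg.qr(B)            # B = Q R with R upper triangular
    y_r = Q.T @ y                     # target expressed in the R coordinate system
    z = np.zeros(n)
    for i in range(n - 1, -1, -1):    # last coordinate first, successive-cancellation order
        c = (y_r[i] - R[i, i + 1:] @ z[i + 1:]) / R[i, i]
        s = sigma / abs(R[i, i])      # per-layer standard deviation
        support = np.arange(np.floor(c) - 4, np.ceil(c) + 5)   # truncated integer support (assumption)
        w = np.exp(-(support - c) ** 2 / (2 * s ** 2))
        z[i] = rng.choice(support, p=w / w.sum())              # discrete-Gaussian-like rounding
    return B @ z

def sampling_decoder(B, y, sigma=1.0, K=32, seed=0):
    """Randomized sampling decoding: draw K candidates and return the closest one to y."""
    rng = np.random.default_rng(seed)
    best, best_dist = None, np.inf
    for _ in range(K):
        x = klein_sample(B, y, sigma, rng)
        d = np.linalg.norm(y - x)
        if d < best_dist:
            best, best_dist = x, d
    return best
```

The derandomized algorithm described in the abstract replaces the random per-layer rounding with a deterministic rule that retains every candidate whose sampling probability exceeds a threshold; that refinement is not reproduced in this sketch.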
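Similarly, the independent MHK idea can be illustrated as a generic independent Metropolis-Hastings sampler that uses a Klein-style proposal and targets the lattice Gaussian distribution proportional to exp(-||Bz - c||^2 / (2 sigma^2)). The acceptance rule below is the standard independent-MH ratio; the closed-form convergence analysis via the theta series is not reproduced here, and all names, the truncated support, and parameters are assumptions for illustration.

```python
import numpy as np

def klein_proposal(B, c, sigma, rng):
    """Draw z from a Klein-style proposal and return (z, log q(z))."""
    n = B.shape[1]
    Q, R = np.linalg.qr(B)
    t = Q.T @ c
    z = np.zeros(n, dtype=int)
    logq = 0.0
    for i in range(n - 1, -1, -1):
        centre = (t[i] - R[i, i + 1:] @ z[i + 1:]) / R[i, i]
        s = sigma / abs(R[i, i])
        support = np.arange(int(np.floor(centre)) - 6, int(np.ceil(centre)) + 7)  # truncation (assumption)
        w = np.exp(-(support - centre) ** 2 / (2 * s ** 2))
        p = w / w.sum()
        k = rng.choice(len(support), p=p)
        z[i] = support[k]
        logq += np.log(p[k])          # accumulate the log proposal probability
    return z, logq

def mhk_sampler(B, c, sigma, n_iter=1000, seed=0):
    """Independent MH: propose with Klein, accept with ratio pi(z')q(z) / (pi(z)q(z'))."""
    rng = np.random.default_rng(seed)
    log_pi = lambda z: -np.linalg.norm(B @ z - c) ** 2 / (2 * sigma ** 2)  # unnormalized target
    z, logq = klein_proposal(B, c, sigma, rng)
    chain = []
    for _ in range(n_iter):
        z_new, logq_new = klein_proposal(B, c, sigma, rng)
        log_alpha = (log_pi(z_new) + logq) - (log_pi(z) + logq_new)
        if np.log(rng.random()) < min(0.0, log_alpha):
            z, logq = z_new, logq_new
        chain.append(z.copy())
    return chain
```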