
    What Can Machine Learning Teach Us about Communications

    Rapid improvements in machine learning over the past decade are beginning to have far-reaching effects. For communications, engineers with limited domain expertise can now use off-the-shelf learning packages to design high-performance systems based on simulations. Prior to the current revolution in machine learning, most communication engineers were well aware that system parameters (such as filter coefficients) could be learned using stochastic gradient descent; it was not at all clear, however, that more complicated parts of the system architecture could be learned as well. In this paper, we discuss the application of machine-learning techniques to two communications problems and focus on what can be learned from the resulting systems. We were pleasantly surprised that the observed gains in one example have a simple explanation that only became clear in hindsight. In essence, deep learning discovered a simple and effective strategy that had not been considered earlier.
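    The abstract's remark that filter coefficients can be learned by stochastic gradient descent is the classic LMS adaptive-filter idea. The following is a minimal sketch of that idea (not code from the paper); the filter taps, step size, and sample count are illustrative choices.

```python
import random

def lms_identify(taps, num_samples=2000, step=0.01, seed=0):
    """Estimate unknown FIR filter coefficients `taps` from input/output
    samples using the LMS rule, i.e. stochastic gradient descent on the
    squared output error. Hypothetical sketch, not from the paper."""
    rng = random.Random(seed)
    n = len(taps)
    w = [0.0] * n  # current coefficient estimates
    x = [0.0] * n  # delay line of the most recent inputs
    for _ in range(num_samples):
        x = [rng.gauss(0.0, 1.0)] + x[:-1]             # new input sample
        d = sum(h * xi for h, xi in zip(taps, x))      # desired output
        y = sum(wi * xi for wi, xi in zip(w, x))       # estimated output
        e = d - y                                      # instantaneous error
        # SGD step: move each coefficient along the negative error gradient
        w = [wi + step * e * xi for wi, xi in zip(w, x)]
    return w

est = lms_identify([0.5, -0.3, 0.2])
```

    With a small enough step size and white input, the estimates converge to the true taps; this is the simple, well-understood case the abstract contrasts with learning whole architectural blocks.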

    Construction of LDPC Codes Using Randomly Permutated Copies of Parity Check Matrix

    Low-density parity-check (LDPC) codes have been shown to have good error-correcting performance, given their ability to approach Shannon's limit, which enables efficient and reliable communication. However, LDPC code construction can vary over a wide range of parameters such as rate, girth, and length, so there is a need for methods that construct codes with good performance across a wide range of rates and lengths. This research studies the construction of LDPC codes in randomized and structured form. The contribution of this thesis is a method called "randomly permuted copies of the parity check matrix," which uses a base parity check matrix, designed by a random or structured construction method (such as Gallager or QC-LDPC codes, respectively), to obtain codes of multiple lengths at the same rate as the base matrix. This is done using a seed matrix whose row and column weights are one, distributed randomly, and which can be addressed by a number in the base matrix. The method reduces the memory needed to store large parity check matrices, and it reduces the probability that a random construction fails to produce a valid parity check matrix. Numerical results show that the proposed construction performs similarly to random codes of the same length and rate, such as Gallager's and MacKay's codes. It also increases the average girth of the Tanner graph and reduces the number of 4-cycles in the resulting matrix when they exist in the base graph.
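    One way to read the construction described above is as a lifting: each 1 in the base parity check matrix is replaced by a random permutation matrix (a block whose row and column weights are one), and each 0 by an all-zero block, yielding a longer code at the same rate. The sketch below illustrates that reading; the function name, base matrix, and block size are illustrative assumptions, not the thesis's exact algorithm.

```python
import random

def lift_parity_check(base, block_size, seed=0):
    """Expand a small base parity-check matrix into a larger one by
    replacing each 1-entry with a random block_size x block_size
    permutation matrix and each 0-entry with a zero block.
    A sketch of the 'randomly permuted copies' idea; details assumed."""
    rng = random.Random(seed)
    rows = len(base) * block_size
    cols = len(base[0]) * block_size
    H = [[0] * cols for _ in range(rows)]
    for i, row in enumerate(base):
        for j, entry in enumerate(row):
            if entry:
                # Random permutation: one 1 per row and per column of the block,
                # addressable by its seed rather than stored explicitly.
                perm = list(range(block_size))
                rng.shuffle(perm)
                for r, c in enumerate(perm):
                    H[i * block_size + r][j * block_size + c] = 1
    return H

base = [[1, 1, 0, 1],
        [0, 1, 1, 1]]
H = lift_parity_check(base, block_size=4)  # 8 x 16 lifted matrix
```

    Because each block is a permutation, every row and column of the lifted matrix keeps the weight of the corresponding base row or column, so the degree distribution (and hence the rate) of the base design is preserved while the length scales with the block size.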