
    Analysis of Power Amplifier Modeling Schemes for Crosscorrelation Predistorters

    Amplification of signals with fluctuating envelopes leads to distortion because of the non-linear behavior of the Power Amplifier (PA). Digital predistortion can counteract these non-linear effects. A crosscorrelation predistorter is a digital predistorter based on the calculation of crosscorrelation functions using coarsely quantized signals. The crosscorrelation functions are transformed to the frequency domain, and the resulting spectra are used to calculate the coefficients of the predistorter memory polynomial. This method has reduced complexity and equivalent performance in comparison with existing schemes. In this paper, four alternative schemes to implement a crosscorrelation predistorter are analyzed. The PA characteristics can be determined either directly or indirectly, using 'normal' or orthogonal polynomials, giving four alternatives. All four alternatives give a significant reduction of Adjacent Channel Interference.
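    The memory polynomial mentioned in the abstract can be made concrete with a small sketch. The function name, coefficient layout, and values below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Hypothetical sketch of a memory polynomial, the nonlinear-with-memory model
# commonly used for PA predistortion: each output sample is a polynomial in
# the magnitude of current and delayed input samples. Coefficient values here
# are illustrative, not from the paper.

def memory_polynomial(x, coeffs):
    """y[n] = sum_{k,q} c[k,q] * x[n-q] * |x[n-q]|^(k-1)."""
    K, Q = coeffs.shape              # nonlinearity order, memory depth
    y = np.zeros_like(x, dtype=complex)
    for q in range(Q):
        xq = np.roll(x, q)           # delayed input x[n-q]
        xq[:q] = 0                   # zero samples before the signal starts
        for k in range(1, K + 1):
            y += coeffs[k - 1, q] * xq * np.abs(xq) ** (k - 1)
    return y

# Example: with only c[0,0] = 1 the predistorter is the identity.
x = np.exp(1j * np.linspace(0, 2 * np.pi, 8))
c = np.zeros((3, 2), dtype=complex)
c[0, 0] = 1.0
y = memory_polynomial(x, c)
```

    In a real predistorter the coefficients would be fitted so that the cascade of this polynomial and the PA is approximately linear.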

    A Short Introduction to Model Selection, Kolmogorov Complexity and Minimum Description Length (MDL)

    The concept of overfitting in model selection is explained and demonstrated with an example. After providing some background information on information theory and Kolmogorov complexity, we provide a short explanation of Minimum Description Length and error minimization. We conclude with a discussion of the typical features of overfitting in model selection.
    Comment: 20 pages, Chapter 1 of The Paradox of Overfitting, Master's thesis, Rijksuniversiteit Groningen, 200
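    The trade-off between fit and model complexity that MDL formalizes can be illustrated with a toy two-part code: charge bits for describing the model's parameters and bits for describing the residuals. The scoring constants below are a BIC-style simplification, assumed for illustration and not taken from the thesis:

```python
import numpy as np

# Toy two-part MDL-style score for polynomial model selection: total cost is
# (bits to describe the residuals given the model) + (bits to describe the
# model parameters). Higher degrees fit the data better but pay more for
# parameters; the minimum balances the two. Constants are simplified.

def mdl_score(x, y, degree):
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    n = len(x)
    data_bits = 0.5 * n * np.log2(np.mean(resid ** 2) + 1e-12)
    model_bits = 0.5 * (degree + 1) * np.log2(n)   # log2(n)/2 bits per parameter
    return data_bits + model_bits

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
y = 2 * x + 0.1 * rng.standard_normal(50)          # truly linear data
best = min(range(1, 10), key=lambda d: mdl_score(x, y, d))
```

    On linear data a high-degree polynomial reduces the residual term only slightly while its parameter cost grows, so the score selects a low degree; pure error minimization would always prefer the highest degree, which is exactly the overfitting the chapter demonstrates.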

    Some Applications of Coding Theory in Computational Complexity

    Error-correcting codes and related combinatorial constructs play an important role in several recent (and old) results in computational complexity theory. In this paper we survey results on locally-testable and locally-decodable error-correcting codes, and their applications to complexity theory and to cryptography. Locally decodable codes are error-correcting codes with sub-linear time error-correcting algorithms. They are related to private information retrieval (a type of cryptographic protocol), and they are used in average-case complexity and to construct "hard-core predicates" for one-way permutations. Locally testable codes are error-correcting codes with sub-linear time error-detection algorithms, and they are the combinatorial core of probabilistically checkable proofs.
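    Sub-linear time decoding can be sketched with the Hadamard code, the standard textbook example of a locally decodable code; the helper names below are ours, not the survey's:

```python
import random

# The Hadamard code encodes k bits x as the 2^k inner products <x, a> mod 2,
# one per integer a in {0, ..., 2^k - 1}. Bit x_i can be recovered with just
# two queries, since word[a] XOR word[a XOR e_i] = <x, e_i> = x_i by
# linearity; because each query position is uniformly distributed, a small
# fraction of corruptions can be tolerated by majority vote over random a.

def inner(x_bits, a):
    """Inner product mod 2 of bit vector x with the bits of integer a."""
    return sum(b for j, b in enumerate(x_bits) if (a >> j) & 1) % 2

def hadamard_encode(x_bits):
    k = len(x_bits)
    return [inner(x_bits, a) for a in range(1 << k)]

def local_decode(word, i, k, rng=random):
    """Recover bit i with two queries into the (possibly corrupted) word."""
    a = rng.randrange(1 << k)
    return word[a] ^ word[a ^ (1 << i)]

x = [1, 0, 1, 1]
word = hadamard_encode(x)
decoded = [local_decode(word, i, len(x)) for i in range(len(x))]
```

    The blow-up from k bits to 2^k codeword bits is the price of two-query decodability, and trading query count against codeword length is a central theme of the constructions the survey covers.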

    CayleyNets: Graph Convolutional Neural Networks with Complex Rational Spectral Filters

    The rise of graph-structured data such as social networks, regulatory networks, citation graphs, and functional brain networks, combined with the resounding success of deep learning in various applications, has brought interest in generalizing deep learning models to non-Euclidean domains. In this paper, we introduce a new spectral-domain convolutional architecture for deep learning on graphs. The core ingredient of our model is a new class of parametric rational complex functions (Cayley polynomials) that allow efficient computation of spectral filters on graphs specializing on frequency bands of interest. Our model generates rich spectral filters that are localized in space, scales linearly with the size of the input data for sparsely-connected graphs, and can handle different constructions of Laplacian operators. Extensive experimental results show the superior performance of our approach, in comparison to other spectral-domain convolutional architectures, on spectral image classification, community detection, vertex classification, and matrix completion tasks.
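    The Cayley filter construction can be sketched as follows. The dense linear solve and the coefficient values are illustrative assumptions made for clarity; on large sparse graphs the inverse is instead approximated iteratively (e.g. with Jacobi iterations):

```python
import numpy as np

# Sketch of a Cayley filter on a graph Laplacian L: the filter matrix is
#   g(L) = c0*I + 2*Re( sum_j c_j * C(hL)^j ),
# where C(A) = (A - iI)(A + iI)^(-1) is the Cayley transform and h is a
# spectral "zoom" parameter that stretches the frequency band of interest.

def cayley_filter(L, c0, c, h=1.0):
    n = L.shape[0]
    I = np.eye(n)
    # C = (hL - iI) @ inv(hL + iI), computed via a solve instead of an
    # explicit inverse: X @ B = A  <=>  X = solve(B.T, A.T).T
    C = np.linalg.solve((h * L + 1j * I).T, (h * L - 1j * I).T).T
    acc = np.zeros((n, n), dtype=complex)
    Cj = np.eye(n, dtype=complex)
    for cj in c:                      # accumulate c_j * C^j for j = 1..r
        Cj = Cj @ C
        acc += cj * Cj
    return c0 * I + 2 * acc.real      # real-valued filter matrix

# Example: 4-node path graph, illustrative coefficients
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
L = np.diag(A.sum(axis=1)) - A
G = cayley_filter(L, c0=0.5, c=[0.2 + 0.1j], h=1.0)
```

    Because the Cayley transform maps the real spectrum of hL onto the unit circle, varying h lets the learned rational filter concentrate on narrow frequency bands, which is what distinguishes this family from ordinary polynomial (e.g. Chebyshev) filters.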