12 research outputs found

    Sobre critérios para equalização não-supervisionada (On criteria for unsupervised equalization)

    In this work, we study the criteria used to solve the blind equalization problem. Two approaches are considered in detail: the constant modulus and the Shalvi-Weinstein criteria. In the course of our exposition, a more recent and less studied technique, the generalized constant modulus criterion (GCMA), is also discussed. Some of the most important results found in the literature are presented, together with recent contributions comparing blind criteria with one another and unsupervised techniques with the Wiener criterion. Funding: Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP); Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq).
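    The constant modulus criterion discussed above penalizes deviations of the squared modulus of the equalizer output from a fixed dispersion constant. As a minimal sketch only, and not the specific algorithms analyzed in the paper, the following shows a stochastic-gradient CMA tap update; the filter length, step size mu, and dispersion constant R2 are placeholder values.

```python
import numpy as np

def cma_equalizer(x, n_taps=11, mu=1e-3, R2=1.0):
    """Minimal constant modulus algorithm (CMA) sketch: adapts a linear
    equalizer w by a stochastic gradient on J = E[(|y|^2 - R2)^2].
    R2 = E[|s|^4] / E[|s|^2] for the transmitted constellation
    (1.0 for unit-modulus symbols such as QPSK)."""
    w = np.zeros(n_taps, dtype=complex)
    w[n_taps // 2] = 1.0                      # center-spike initialization
    y = np.zeros(len(x), dtype=complex)
    for n in range(n_taps, len(x)):
        u = x[n - n_taps:n][::-1]             # regressor, most recent sample first
        y[n] = np.vdot(w, u)                  # equalizer output y = w^H u
        e = np.abs(y[n])**2 - R2              # constant modulus error
        w -= mu * e * np.conj(y[n]) * u       # gradient step on the CM cost
    return w, y
```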

    Uma abordagem unificada para algoritmos de equalização autodidata (A unified approach to blind equalization algorithms)

    Advisors: João Marcos Travassos Romano and Maria D. Miranda. Master's dissertation (Mestrado), Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação.

    Signal Processing Design of Low Probability of Intercept Waveforms

    This thesis investigates a modification to Differential Phase Shift Keyed (DPSK) modulation to create a Low Probability of Interception/Exploitation (LPI/LPE) communications signal. A pseudorandom timing offset is applied to each symbol in the communications stream to intentionally create intersymbol interference (ISI) that hinders accurate symbol estimation and bit sequence recovery by a non-cooperative receiver. Two cooperative receiver strategies are proposed to mitigate the ISI due to symbol timing offset: a modified minimum Mean Square Error (MMSE) equalization algorithm and a multiplexed bank of equalizer filters determined by an adaptive Least Mean Square (LMS) algorithm. Both cooperative receivers require some knowledge of the pseudorandom symbol timing dither to successfully demodulate the communications waveform. Numerical Matlab® simulation is used to demonstrate the bit error rate performance of cooperative receivers and notional non-cooperative receivers for binary, 4-ary, and 8-ary DPSK waveforms transmitted through a line-of-sight, additive white Gaussian noise channel. Simulation results suggest that proper selection of the pulse shape and the probability distribution of symbol timing offsets produces a waveform that is accurately demodulated by the proposed cooperative receivers and significantly degrades non-cooperative receiver symbol estimation accuracy. In typical simulations, non-cooperative receivers required 2-8 dB more signal power than cooperative receivers to achieve a bit error rate of 1.0%. For nearly all reasonable parameter selections, non-cooperative receivers produced bit error rates in excess of 0.1%, even when signal power was unconstrained.
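    The cooperative receivers above rely on LMS-adapted equalizers. The sketch below is a generic complex LMS equalizer, not the paper's multiplexed filter bank or modified MMSE design; the tap count, step size, and reference symbols d are assumed for illustration.

```python
import numpy as np

def lms_equalizer(x, d, n_taps=15, mu=0.01):
    """Generic complex LMS equalizer sketch: adapts taps w so that the output
    y[n] = w^H x_n tracks the reference symbols d[n] (training symbols or,
    in a cooperative receiver, symbols reconstructed with knowledge of the
    shared timing dither)."""
    w = np.zeros(n_taps, dtype=complex)
    y = np.zeros(len(x), dtype=complex)
    for n in range(n_taps, len(x)):
        u = x[n - n_taps:n][::-1]     # regressor, most recent sample first
        y[n] = np.vdot(w, u)          # filter output y = w^H u
        e = d[n] - y[n]               # error against the reference symbol
        w += mu * np.conj(e) * u      # LMS tap update
    return w, y
```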

    An Examination of Some Significant Approaches to Statistical Deconvolution

    We examine statistical approaches to two significant areas of deconvolution: Blind Deconvolution (BD) and Robust Deconvolution (RD) for stochastic stationary signals. For BD, we review some major classical and new methods in a unified framework for non-Gaussian signals. The first class of algorithms we consider is that of Minimum Entropy Deconvolution (MED) algorithms. We discuss the similarities between them despite differences in origins and motivations. We give new theoretical results concerning the behaviour and generality of these algorithms and give evidence of scenarios where they may fail. In some cases, we present new modifications to the algorithms to overcome these shortfalls. Following our discussion of the MED algorithms, we examine a recently proposed BD algorithm based on the correntropy function, a function that combines aspects of the autocorrelation and entropy functions. We examine its BD performance compared with MED algorithms. We find that BD carried out via correntropy matching cannot be straightforwardly interpreted as simultaneous moment matching, owing to the breakdown of the correntropy expansion in terms of moments. Other issues, such as the maximum/minimum phase ambiguity and computational complexity, suggest that careful attention is required before establishing the correntropy algorithm as a superior alternative to existing BD techniques. For the problem of RD, we give a categorisation of the different kinds of uncertainties encountered in estimation and discuss the techniques required to solve each individual case. Primarily, we tackle the overlooked cases of robustification of deconvolution filters based on an estimated blurring response or an estimated signal spectrum. We do this by utilising existing methods derived from criteria such as minimax MSE with imposed uncertainty bands and penalised MSE. In particular, we revisit the Modified Wiener Filter (MWF), which offers simplicity and flexibility and gives improved RD relative to the standard plug-in Wiener Filter (WF).
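    For the correntropy-based deconvolution discussed above, the following is a minimal sketch of the sample autocorrentropy with a Gaussian kernel; the kernel width sigma is an assumed illustrative parameter, and the estimator shown is the standard definition rather than the thesis's specific BD algorithm.

```python
import numpy as np

def autocorrentropy(x, max_lag, sigma=1.0):
    """Sample autocorrentropy V(tau) = E[kappa_sigma(x_n, x_{n-tau})] with a
    Gaussian kernel of width sigma. Expanding the kernel shows that V(tau)
    weights all even moments of the increment x_n - x_{n-tau}, which is why
    it blends second-order (correlation) and higher-order information."""
    x = np.asarray(x, dtype=float)
    V = np.zeros(max_lag + 1)
    for tau in range(max_lag + 1):
        d = x[tau:] - x[:len(x) - tau]                      # lagged increments
        V[tau] = np.mean(np.exp(-d**2 / (2.0 * sigma**2)))  # kernel average
    return V
```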

    Sobre dinâmica caótica e convergência em algoritmos de equalização autodidata (On chaotic dynamics and convergence in blind equalization algorithms)

    Advisor: João Marcos Travassos Romano. Master's dissertation (Mestrado), Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação.

    Blind Signal Separation for Digital Communication Data

    To appear in EURASIP E-Reference in Signal Processing (invited paper). Blind source separation, often called independent component analysis, has been a major field of research in signal processing since the eighties. It consists in retrieving the components, up to certain indeterminacies, of a mixture of statistically independent signals. Solid theoretical results are known and have given rise to efficient algorithms. There are numerous applications of blind source separation. In this contribution, we focus on the separation of telecommunication sources. In this context, the sources stem from telecommunication devices transmitting at the same time in a given frequency band. The received data is a mixture of all these sources. The aim of the receiver is to isolate (separate) the different contributions prior to estimating the unknown parameters associated with a transmitter. The telecommunication context has the particularity that the sources are not stationary but cyclostationary, whereas standard blind source separation methods generally assume stationary sources. In this contribution, we therefore survey the well-known methods and show how the results extend to cyclostationary sources.
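    As a concrete point of reference for the stationary-source methods the survey starts from (not the cyclostationary extensions it develops), the sketch below whitens the observed mixtures and runs a one-unit, kurtosis-based fixed-point iteration in the style of FastICA; the nonlinearity and iteration count are illustrative choices.

```python
import numpy as np

def one_unit_ica(X, n_iter=100, seed=0):
    """Minimal stationary-source BSS sketch: whiten the sensor mixtures, then
    run a one-unit FastICA-style fixed-point iteration with the cubic
    nonlinearity (kurtosis contrast). X has shape (n_sensors, n_samples)."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))              # covariance eigendecomposition
    Z = (E / np.sqrt(d)).T @ X                    # whitened data: cov(Z) = I
    w = rng.standard_normal(X.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        y = w @ Z
        # Fixed-point update E[z g(w.z)] - E[g'(w.z)] w with g(u) = u^3
        w_new = (Z * y**3).mean(axis=1) - 3.0 * w
        w = w_new / np.linalg.norm(w_new)
    return w @ Z, w   # one recovered source (up to scale/sign) and its separating vector
```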

    Sobre separação cega de fontes: proposições e análise de estratégias para processamento multi-usuário (On blind source separation: propositions and analysis of strategies for multiuser processing)

    Advisors: João Marcos Travassos Romano and Francisco Rodrigo Porto Cavalcanti. Doctoral thesis (Doutorado), Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação. This thesis is devoted to the study of blind source separation techniques applied to multiuser processing in digital communications. Using probability density function (pdf) estimation strategies, two multiuser processing methods are proposed. They recover the transmitted signals by means of the Kullback-Leibler similarity measure between the pdf of the signals at the separator output and a parametric model that contains the characteristics of the transmitted signals. Besides this similarity measure, different methods are employed to guarantee the decorrelation of the source estimates, so that the recovered signals originate from different sources. The convergence analysis of the methods and their equivalences with classical techniques are presented, resulting in important relationships between blind and supervised criteria, such as that between the proposed criterion and the maximum a posteriori criterion. These new methods combine good information-recovery capability with low computational complexity. The proposal of pdf-estimation-based methods also allowed an investigation of the impact of higher-order statistics on adaptive algorithms for blind source separation. Expanding the pdf in an orthonormal series makes it possible to evaluate, through cumulants, the dynamics of a source separation process. To deal with digital communication signals, a new orthonormal series expansion is proposed, developed around a Gaussian-mixture pdf. This expansion is used to show how real-time performance changes as more higher-order statistics are retained. Computational simulations illustrate the performance of the proposals against well-known techniques from the literature in several situations that call for a signal recovery strategy. Doctorate in Electrical Engineering, area of Telecommunications and Telematics.
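    The pdf-matching idea described above can be illustrated with a short, hypothetical sketch: it measures a Kullback-Leibler-style divergence between a kernel estimate of the separator output pdf and a Gaussian-mixture model centered on the constellation symbols. The kernel width, grid, and one-dimensional real-valued setting are simplifying assumptions, not the thesis's exact criterion.

```python
import numpy as np

def kl_to_constellation_model(y, constellation, sigma=0.1, n_grid=400):
    """Illustrative Kullback-Leibler fitting term: compares a Parzen (kernel)
    density estimate of the real-valued separator output y with a
    Gaussian-mixture target centered on the known constellation symbols."""
    y = np.asarray(y, dtype=float)
    grid = np.linspace(y.min() - 1.0, y.max() + 1.0, n_grid)

    def gauss(u, m, s):
        return np.exp(-(u - m) ** 2 / (2 * s ** 2)) / (np.sqrt(2 * np.pi) * s)

    p_hat = np.mean([gauss(grid, yi, sigma) for yi in y], axis=0)        # output pdf estimate
    q = np.mean([gauss(grid, s, sigma) for s in constellation], axis=0)  # parametric GM model
    eps = 1e-12
    return np.sum(p_hat * np.log((p_hat + eps) / (q + eps))) * (grid[1] - grid[0])
```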

    Mixed Norm Equalization with Applications in Television Multipath Cancellation

    Electrical Engineering.