80 research outputs found

    Recent Progress in Image Deblurring

    Full text link
    This paper comprehensively reviews recent developments in image deblurring, including non-blind/blind and spatially invariant/variant techniques. These techniques share the objective of inferring a latent sharp image from one or more blurry observations, while blind deblurring must additionally estimate an accurate blur kernel. Given the critical role of image restoration in modern imaging systems, which must deliver high-quality images under complex conditions such as motion, undesirable lighting, and imperfect system components, image deblurring has attracted growing attention in recent years. From the viewpoint of how they handle the ill-posedness that is the crucial issue in deblurring, existing methods can be grouped into five categories: Bayesian inference frameworks, variational methods, sparse representation-based methods, homography-based modeling, and region-based methods. Despite considerable progress, image deblurring, especially the blind case, remains limited by complex application conditions that make the blur kernel hard to estimate and spatially variant. This review provides a holistic understanding of and deep insight into image deblurring, together with an analysis of the empirical evidence for representative methods, practical issues, and a discussion of promising future directions. (Comment: 53 pages, 17 figures)
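    To make the non-blind setting concrete, here is a minimal, hypothetical sketch (not from the reviewed paper) of 1-D Wiener deconvolution: given a blurry observation and a known blur kernel, it inverts the blur in the frequency domain, with a regularization term standing in for the ill-posedness the review discusses.

```python
import numpy as np

def wiener_deblur(blurred, kernel, reg=1e-3):
    """Non-blind deblurring sketch: recover a latent sharp signal from a
    blurry observation, assuming the blur kernel is known.
    `reg` regularizes the inverse where the kernel spectrum is small."""
    n = len(blurred)
    K = np.fft.fft(kernel, n)
    H = np.conj(K) / (np.abs(K) ** 2 + reg)   # Wiener inverse filter
    return np.real(np.fft.ifft(H * np.fft.fft(blurred)))

# Toy example: a sparse sharp signal blurred by a 5-tap box kernel.
sharp = np.zeros(64)
sharp[[10, 30, 45]] = 1.0
kernel = np.ones(5) / 5.0
blurred = np.real(np.fft.ifft(np.fft.fft(sharp) * np.fft.fft(kernel, 64)))
restored = wiener_deblur(blurred, kernel)
```

In the blind case the kernel above would itself have to be estimated, which is exactly where the surveyed methods diverge.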

    Hybrid solutions to instantaneous MIMO blind separation and decoding: narrowband, QAM and square cases

    Get PDF
    Future wireless communication systems must support high data rates and high-quality transmission for growing multimedia applications. The drive to increase channel throughput has led to multiple-input multiple-output (MIMO) and blind equalization techniques in recent years, and blind MIMO equalization has therefore attracted great interest. Both system performance and computational complexity play important roles in real-time communications; reducing the computational load while maintaining accurate performance is a main challenge in present systems. This thesis first proposes a hybrid method that provides affordable complexity with good performance for blind equalization in large-constellation MIMO systems. Computational cost is saved both in the signal separation part and in the signal detection part. First, based on quadrature amplitude modulation (QAM) signal characteristics, an efficient and simple nonlinear function for Independent Component Analysis (ICA) is introduced. Second, using the idea of sphere decoding, we restrict the soft channel information to a sphere, overcoming the so-called curse of dimensionality of the Expectation Maximization (EM) algorithm while enhancing the final results. Mathematically, we demonstrate that in digital communication cases the EM algorithm exhibits Newton-like convergence. Despite the widespread use of forward-error coding (FEC), most MIMO blind channel estimation techniques ignore its presence and instead make the simplifying assumption that the transmitted symbols are uncoded. However, FEC induces code structure in the transmitted sequence that can be exploited to improve blind MIMO channel estimates. In the final part of this work, we exploit iterative channel estimation and decoding for blind MIMO equalization.
Experiments show the improvements achievable by exploiting the existing coding structure, and that the method can approach the performance of a BCJR equalizer with perfect channel information in a reasonable SNR range. All results are confirmed experimentally for the example of blind equalization in block-fading MIMO systems.
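    The ICA separation stage the thesis builds on can be sketched as follows. This is a hedged illustration, not the thesis's algorithm: it separates two sub-Gaussian 4-PAM streams (the real part of a square QAM constellation) with a kurtosis-based FastICA iteration, omitting the proposed nonlinearity, sphere decoding, and EM refinements.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4000
levels = np.array([-3.0, -1.0, 1.0, 3.0]) / np.sqrt(5.0)  # unit-power 4-PAM
S = rng.choice(levels, size=(2, n))                        # two symbol streams
A = rng.standard_normal((2, 2))                            # unknown mixing channel
X = A @ S                                                  # received mixture

# Whitening: decorrelate and normalize the received mixture.
d, E = np.linalg.eigh(X @ X.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# Deflationary FastICA with the cubic (kurtosis) nonlinearity g(u) = u^3.
W = np.zeros((2, 2))
for i in range(2):
    w = rng.standard_normal(2)
    w /= np.linalg.norm(w)
    for _ in range(100):
        u = w @ Z
        w_new = (Z * u ** 3).mean(axis=1) - 3.0 * w  # fixed-point update
        for j in range(i):                            # deflate found components
            w_new -= (w_new @ W[j]) * W[j]
        w = w_new / np.linalg.norm(w_new)
    W[i] = w
S_hat = W @ Z   # recovered streams, up to permutation and sign
```

The permutation and sign ambiguities visible here are inherent to blind separation; in the thesis they are resolved downstream, e.g. by the coding structure.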

    Novel Complex Adaptive Signal Processing Techniques Employing Optimally Derived Time-varying Convergence Factors With Applicatio

    Get PDF
    In digital signal processing in general, and wireless communications in particular, the increased use of complex signal representations and spectrally efficient complex modulation schemes such as QPSK and QAM has created the need for efficient, fast-converging complex digital signal processing techniques. In this research, novel complex adaptive digital signal processing techniques are presented, which derive optimal convergence factors or step sizes for adjusting the adaptive system coefficients at each iteration. In addition, the real and imaginary components of the complex signal and complex adaptive filter coefficients are treated as separate entities and are independently updated. As a result, the developed methods efficiently utilize the degrees of freedom of the adaptive system, thereby exhibiting improved convergence characteristics even in dynamic environments. In wireless communications, acceptable co-channel, adjacent-channel, and image interference rejection is often one of the most critical requirements for a receiver. In this regard, the fixed-point complex Independent Component Analysis (ICA) algorithm, called Complex FastICA, has previously been applied to realize digital blind interference suppression in stationary or slow-fading environments. However, under the dynamic flat-fading channel conditions frequently encountered in practice, the performance of the Complex FastICA is significantly degraded. In this dissertation, novel complex block-adaptive ICA algorithms employing optimal convergence factors are presented, which exhibit superior convergence speed and accuracy in time-varying flat-fading channels compared to the Complex FastICA algorithm. The proposed algorithms are called Complex IA-ICA, Complex OBA-ICA, and Complex CBC-ICA. For adaptive filtering applications, the Complex Least Mean Square algorithm (Complex LMS) has been widely used in both block and sequential form, due to its computational simplicity.
However, the main drawback of the Complex LMS algorithm is its slow convergence and dependence on the choice of the convergence factor. In this research, novel block- and sequential-based algorithms for complex adaptive digital filtering are presented, which overcome the inherent limitations of the existing Complex LMS. The block-adaptive algorithms are called Complex OBA-LMS and Complex OBAI-LMS, and their sequential versions are named Complex HA-LMS and Complex IA-LMS, respectively. The performance of the developed techniques is tested in various adaptive filtering applications, such as channel estimation and adaptive beamforming. The combination of Orthogonal Frequency Division Multiplexing (OFDM) and the Multiple-Input-Multiple-Output (MIMO) technique is being increasingly employed for broadband wireless systems operating in frequency-selective channels. However, MIMO-OFDM systems are extremely sensitive to Intercarrier Interference (ICI), caused by Carrier Frequency Offset (CFO) between local oscillators in the transmitter and the receiver. This results in crosstalk between the OFDM subcarriers and severe performance deterioration. In order to mitigate this problem, the previously proposed Complex OBA-ICA algorithm is employed to recover user signals in the presence of ICI and channel-induced mixing. The effectiveness of the Complex OBA-ICA method in performing ICI mitigation and signal separation is tested for various values of CFO, rate of channel variation, and Signal-to-Noise Ratio (SNR).
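    For reference, the Complex LMS baseline that the dissertation improves upon can be written in a few lines. This sketch identifies an unknown 3-tap complex channel from a QPSK training sequence with a fixed step size; it is the point of comparison, not the proposed optimal-step-size variants (Complex OBA-LMS, etc.).

```python
import numpy as np

rng = np.random.default_rng(2)
n, taps, mu = 2000, 3, 0.1
h = np.array([0.8 + 0.3j, -0.4 + 0.5j, 0.2 - 0.1j])  # unknown channel (assumed)

# Unit-power QPSK training symbols.
qpsk = (rng.integers(0, 2, n) * 2 - 1
        + 1j * (rng.integers(0, 2, n) * 2 - 1)) / np.sqrt(2)

w = np.zeros(taps, dtype=complex)        # adaptive filter coefficients
for k in range(taps, n):
    x = qpsk[k - taps:k][::-1]           # tapped delay line (newest first)
    d = h @ x                            # desired output of the true channel
    e = d - w @ x                        # a-priori estimation error
    w = w + mu * e * np.conj(x)          # complex LMS coefficient update
```

The fixed `mu` here is exactly the drawback the abstract names: too small and convergence is slow, too large and the recursion diverges, which motivates deriving an optimal time-varying convergence factor instead.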

    Source Separation for Hearing Aid Applications

    Get PDF

    Blind source separation for interference cancellation in CDMA systems

    Get PDF
    Communication is the science of "reliable" transfer of information between two parties, in the sense that the information reaches the intended party with as few errors as possible. Modern wireless systems contain many interfering sources that hinder reliable communication, and receiver performance deteriorates severely in the presence of unknown or unaccounted-for interference. The goal of a receiver is then to combat these sources of interference robustly while optimizing the trade-off between gain and computational complexity. Conventional methods mitigate interference by taking into account all available information, at times seeking additional information (e.g., channel characteristics or direction of arrival), which usually costs bandwidth. This thesis examines the development of mitigation algorithms that use as little as possible or no prior information about the nature of the interference; such methods are semi-blind in the former case and blind in the latter. Blind source separation (BSS) involves solving a source separation problem with very little prior information, and a popular framework for solving the BSS problem is independent component analysis (ICA). This thesis combines ICA techniques with conventional signal detection to cancel out unaccounted-for sources of interference; adding an ICA element to standard techniques enables a robust and computationally efficient structure. The thesis proposes switching techniques based on BSS/ICA to combat interference effectively. Additionally, a structure based on a generalized framework termed denoising source separation (DSS) is presented. In cases where more is known about the nature of the interference, it is natural to incorporate this knowledge in the separation process, so finally this thesis looks at the use of such prior knowledge in these techniques.
In the simple case, the advantage of using priors should at least lead to faster algorithms.
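    The DSS idea of folding prior knowledge into the separation loop can be sketched very simply. This is a hedged illustration under stated assumptions, not the thesis's receiver: the CDMA despreading stage is omitted, two antipodal streams stand in for the user and the interferer, and the "denoiser" encodes only the prior that the desired signal is BPSK.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4000
S = rng.choice([-1.0, 1.0], size=(2, n))   # BPSK user + BPSK interferer
A = rng.standard_normal((2, 2))            # unknown mixing (channel)
X = A @ S                                  # observed mixture

# Whiten the observed mixture.
d, E = np.linalg.eigh(X @ X.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# DSS iteration: project, denoise with the BPSK prior, re-estimate.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(50):
    s = w @ Z                  # current source estimate
    s_den = np.sign(s)         # denoising step: hard BPSK decision (the prior)
    w = Z @ s_den / n          # re-estimate the unmixing vector
    w /= np.linalg.norm(w)
s_hat = w @ Z
```

Swapping the `np.sign` denoiser for a milder one recovers more generic ICA-like behavior, which is the sense in which DSS generalizes the blind methods discussed above.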

    Convolutive Blind Source Separation Methods

    Get PDF
    In this chapter, we provide an overview of existing algorithms for blind source separation of convolutive audio mixtures. We provide a taxonomy, wherein many of the existing algorithms can be organized, and we present published results from those algorithms that have been applied to real-world audio separation tasks.

    On Detection and Ranking Methods for a Distributed Radio-Frequency Sensor Network: Theory and Algorithmic Implementation

    Get PDF
    A theoretical foundation for pre-detection fusion of sensors is needed if the United States Air Force is ever to field a system of distributed and layered sensors that can detect and perform parameter estimation of complex, extended targets in difficult interference environments, without human intervention, in near real-time. This research is relevant to the United States Air Force within its layered-sensing and cognitive radar/sensor initiatives. The asymmetric threat of the twenty-first century introduces stressing sensing conditions that may exceed the ability of traditional monostatic sensing systems to perform their required intelligence, surveillance and reconnaissance (ISR) missions. In particular, there is growing interest within the United States Air Force in moving beyond single-sensor systems and instead fielding and leveraging distributed sensing systems to overcome the inherent challenges imposed by the modern threat space. This thesis analyzes the impact of integrating target echoes in the angular domain, to determine whether better detection and ranking performance is achieved through the use of a distributed sensor network. Bespoke algorithms are introduced for detection and ranking ISR missions leveraging a distributed network of radio-frequency sensors: the first set of bespoke algorithms is based upon a depth-based nonparametric detection algorithm, which is shown to enhance the recovery of targets at lower signal-to-noise ratios than an equivalent monostatic radar system; the second set is based upon random matrix theory and concentration-of-measure mathematics, and is demonstrated to outperform the depth-based nonparametric approach. The latter approach is shown to be effective across a broad range of signal-to-noise ratios, both positive and negative.
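    The basic gain from integrating echoes across a distributed network can be illustrated with a toy Monte Carlo experiment. This is a generic square-law fusion sketch, not the thesis's depth-based or random-matrix algorithms: each of M sensors sees the same echo amplitude in independent noise, and their energies are summed before thresholding at a fixed false-alarm rate.

```python
import numpy as np

rng = np.random.default_rng(4)
M, trials, pfa = 10, 20000, 0.01
amp = 1.0   # assumed per-sensor echo amplitude (0 dB per-sensor SNR)

def energy(signal_present, m):
    """Square-law detection statistic summed over m sensors."""
    noise = (rng.standard_normal((trials, m))
             + 1j * rng.standard_normal((trials, m))) / np.sqrt(2)
    x = noise + (amp if signal_present else 0.0)
    return (np.abs(x) ** 2).sum(axis=1)

# Thresholds from the empirical noise-only distribution at the target Pfa.
thr_single = np.quantile(energy(False, 1), 1 - pfa)
thr_fused = np.quantile(energy(False, M), 1 - pfa)

pd_single = (energy(True, 1) > thr_single).mean()
pd_fused = (energy(True, M) > thr_fused).mean()
```

At the same false-alarm rate, the fused network detects the weak echo far more often than the single monostatic sensor, which is the effect the thesis's more sophisticated nonparametric and random-matrix detectors aim to exploit and improve upon.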

    Novel Deep Learning Techniques For Computer Vision and Structure Health Monitoring

    Get PDF
    This thesis proposes novel techniques for building a generic framework for both regression and classification tasks in vastly different application domains, such as computer vision and civil engineering. Multiple frameworks are proposed and combined into a complex deep network design to provide a complete solution to a wide variety of problems. The experimental results demonstrate significant improvements in accuracy and efficiency for all the proposed techniques.