Proceedings of the 35th WIC Symposium on Information Theory in the Benelux and the 4th joint WIC/IEEE Symposium on Information Theory and Signal Processing in the Benelux, Eindhoven, the Netherlands, May 12-13, 2014
Compressive sensing (CS) as an approach to data acquisition has recently received much attention. In CS, recovering the signal from the observed data requires recovering a sparse vector from an underdetermined system of equations. The underlying sparse signal recovery problem is quite general, with many applications, and is the focus of this talk. The main emphasis will be on Bayesian approaches for sparse signal recovery. We will examine sparse priors such as the super-Gaussian and Student-t priors and appropriate MAP estimation methods. In particular, the re-weighted l2 and re-weighted l1 methods developed to solve the resulting optimization problem will be discussed. The talk will also examine a hierarchical Bayesian framework and then study in detail an empirical Bayesian method, the Sparse Bayesian Learning (SBL) method. If time permits, we will also discuss Bayesian methods for sparse recovery problems with structure: intra-vector correlation in the context of the block-sparse model, and inter-vector correlation in the context of the multiple measurement vector problem.
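As a concrete illustration of the re-weighted l1 idea mentioned in this abstract, here is a minimal sketch in the style of the Candes-Wakin-Boyd scheme. The problem sizes, the weighted-ISTA inner solver, and all parameter values are our own toy choices, not the talk's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: recover a 5-sparse vector from 40 random measurements
# of a length-100 signal (underdetermined system y = A x).
n, m, k = 100, 40, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

def weighted_ista(A, y, w, lam=0.005, iters=1000):
    """Solve min_x 0.5*||Ax - y||^2 + lam*sum(w*|x|) by proximal gradient."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - A.T @ (A @ x - y) / L          # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam * w / L, 0.0)  # shrinkage
    return x

# Re-weighted l1: coefficients that stay small are penalized more heavily
# on each round, pushing the weighted l1 solution toward the l0 solution.
x, eps = np.zeros(n), 0.1
for _ in range(5):
    w = 1.0 / (np.abs(x) + eps)
    x = weighted_ista(A, y, w)

print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))  # small relative error
```

The reweighting step is what distinguishes this from a single LASSO solve: large coefficients end up nearly unpenalized, reducing the shrinkage bias of plain l1 minimization.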
Signal Processing in Wireless Communications: Device Fingerprinting and Wide-Band Interference Rejection
The rapid progress of wireless communication technologies in recent years has significantly improved the quality of everyday life. However, with this expansion of wireless communication systems come significant security threats and significant technological challenges, both stemming from the fact that the communication medium is shared. The ubiquity of open wireless Internet access networks creates a new avenue for cyber-criminals to impersonate legitimate users and act in unauthorized ways. The increasing number of deployed wide-band wireless communication systems entails technological challenges for effective utilization of the shared medium, which implies the need for advanced interference rejection methods. Wireless security and interference rejection in wide-band wireless communications are therefore often considered the two main challenges in wireless network design and research. Important aspects of these challenges are illuminated and addressed in this dissertation.
This dissertation considers signal processing approaches for exploiting or mitigating the effects of non-ideal components in wireless communication systems. In the first part of the dissertation, we introduce and study a novel, model-based approach to wireless device identification that exploits imperfections in the transmitter caused by manufacturing-process nonidealities. Previous approaches to device identification based on hardware imperfections, ranging from transient analysis to machine learning, have not provided verifiable accuracy. Here, we detail a model-based approach that uses statistical models of RF transmitter components (the digital-to-analog converter, the power amplifier, and the RF oscillator) that are amenable to analysis. Our proposed approach examines the key device characteristics that cause anonymity loss, the countermeasures that nodes can apply to regain anonymity, and ways of thwarting such countermeasures. We develop identification algorithms based on statistical signal processing methods and address the challenging scenario in which the units to be distinguished are of the same model and from the same manufacturer. Using simulations and measurements of components commonly used in commercial communication systems, we show that our anonymity-breaking techniques are effective.
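One of the transmitter imperfections this abstract names, the RF oscillator, lends itself to a small illustration: units of the same model have slightly different carrier-frequency offsets (CFOs), which a receiver can estimate and use as a fingerprint. The offsets, sample rate, and estimator below are our own hypothetical choices, not the dissertation's models:

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 1e6                                     # assumed receiver sample rate

def rx_burst(cfo_hz, n=1024, snr_db=20):
    """Received unmodulated burst from a transmitter whose oscillator is off by cfo_hz."""
    t = np.arange(n) / fs
    s = np.exp(2j * np.pi * cfo_hz * t)
    w = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
    return s + w * 10 ** (-snr_db / 20)

def estimate_cfo(x, pad=1 << 16):
    """Locate the carrier peak on a zero-padded FFT grid (about 15 Hz resolution here)."""
    spec = np.abs(np.fft.fft(x, pad))
    return np.fft.fftfreq(pad, 1 / fs)[int(np.argmax(spec))]

# Two same-model units with slightly different (hypothetical) oscillator offsets:
est_a = estimate_cfo(rx_burst(1200.0))
est_b = estimate_cfo(rx_burst(1350.0))
print(est_a, est_b)    # two separable estimates, near 1200 Hz and 1350 Hz
```

In practice a single feature like this is not enough to separate same-model units reliably, which is why the dissertation combines statistical models of several components.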
In the second part of the dissertation, we consider innovative approaches to the acquisition of frequency-sparse signals with wide-band receivers when a weak signal of interest is received in the presence of very strong interference and the effects of nonlinearities in the receiver's low-noise amplifier must be mitigated. All samples with amplitude above a given threshold, dictated by the linear input range of the receiver, are discarded to avoid the distortion caused by saturation of the low-noise amplifier. Such a sampling scheme, while avoiding nonlinear distortion that cannot be corrected in the digital domain, poses challenges for signal reconstruction techniques, as the samples are taken not only non-uniformly but also non-randomly. The considered approaches fall within the field of compressive sensing (CS); what differentiates them from conventional CS, however, is that a structure is imposed on the measurement scheme, violating the core CS assumption that the measurements are random. We consider two types of structured acquisition: signal-independent and signal-dependent. For the first case, we derive bounds on the number of samples needed for successful CS recovery when samples are drawn at random in predefined groups. For the second case, we consider enhancements of CS recovery methods when only small-amplitude samples of the signal to be recovered are available. Finally, we address the problem of spectral leakage due to the limited block size of block-processing wide-band receivers and propose an adaptive block-size adjustment method, which leads to significant dynamic-range improvements.
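The signal-dependent structured acquisition described above can be sketched in a toy form: keep only the time-domain samples inside the receiver's linear range and recover the sparse spectrum from them. The cosine dictionary, the 80% threshold, and the use of orthogonal matching pursuit as the recovery method are our own simplifications, not the dissertation's algorithms:

```python
import numpy as np

rng = np.random.default_rng(1)

# A frequency-sparse signal whose large-amplitude samples are discarded
# (signal-dependent, non-uniform, non-random acquisition).
n, k = 256, 3
t = np.arange(n)
true_freqs = np.sort(rng.choice(np.arange(5, n // 2), k, replace=False))
x = sum(np.cos(2 * np.pi * f * t / n) for f in true_freqs)

thresh = 0.8 * np.max(np.abs(x))              # linear input range of the LNA
keep = np.abs(x) <= thresh                    # only small-amplitude samples survive

J = np.arange(1, n // 2)                      # candidate tone frequencies (skip DC)
D = np.cos(2 * np.pi * np.outer(t, J) / n)    # cosine dictionary, x = D @ s
A, y = D[keep], x[keep]

# Orthogonal matching pursuit: greedily select k dictionary columns.
support, residual = [], y.copy()
for _ in range(k):
    j = int(np.argmax(np.abs(A.T @ residual)))
    support.append(j)
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    residual = y - A[:, support] @ coef

recovered = np.sort(J[support])
print(true_freqs, recovered)
```

The point of the toy example is that the retained samples are chosen by the signal itself, exactly the structured regime where standard CS randomness guarantees no longer apply directly.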
Audio Signal Processing Using Time-Frequency Approaches: Coding, Classification, Fingerprinting, and Watermarking
Audio signals are information-rich nonstationary signals that play an important role in our day-to-day communication, perception of the environment, and entertainment. Due to their nonstationary nature, time-only or frequency-only approaches are inadequate for analyzing these signals; a joint time-frequency (TF) approach is a better choice for processing them efficiently. In this digital era, compression, intelligent indexing for content-based retrieval, classification, and protection of digital audio content are a few of the areas that encapsulate the majority of audio signal processing applications. In this paper, we present a comprehensive array of TF methodologies that successfully address applications in all of the above areas. A TF-based audio coding scheme with a novel psychoacoustic model, music classification, audio classification of environmental sounds, audio fingerprinting, and audio watermarking are presented to demonstrate the advantages of using time-frequency approaches in analyzing and extracting information from audio signals.
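The abstract's central claim, that frequency-only analysis misses nonstationary structure, can be shown with a minimal short-time Fourier transform. The window and hop sizes below are common defaults we chose for illustration, not the paper's parameters:

```python
import numpy as np

def stft(x, win=256, hop=128):
    """Magnitude STFT: rows are time frames, columns are frequency bins."""
    w = np.hanning(win)
    frames = [x[i:i + win] * w for i in range(0, len(x) - win + 1, hop)]
    return np.abs(np.fft.rfft(np.asarray(frames), axis=1))

# Toy signal: a tone that jumps from 440 Hz to 880 Hz halfway through.
# A global spectrum shows both tones; only a TF view shows *when* each occurs.
fs = 8000
t = np.arange(fs // 2) / fs
x = np.concatenate([np.sin(2 * np.pi * 440 * t),
                    np.sin(2 * np.pi * 880 * t)])

S = stft(x)
peak_bins = S.argmax(axis=1)     # dominant frequency bin in each frame
freqs = peak_bins * fs / 256     # bin index -> Hz
print(freqs[0], freqs[-1])       # near 440 Hz early, near 880 Hz late
```

The per-frame peak trajectory is the simplest TF feature; the coding, classification, fingerprinting, and watermarking schemes in the paper build far richer representations on the same principle.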
Data Hiding and Its Applications
Data hiding techniques have been widely used to provide copyright protection, data integrity, covert communication, non-repudiation, and authentication, among other applications. In the context of the increased dissemination and distribution of multimedia content over the internet, data hiding methods such as digital watermarking and steganography are becoming increasingly relevant to multimedia security. The goal of this book is to focus on the improvement of data hiding algorithms and their different applications (both traditional and emerging), bringing together researchers and practitioners from different research fields, including data hiding, signal processing, cryptography, and information theory, among others.
Extrinsic Channel-Like Fingerprint Embedding for Transmitter Authentication in Wireless Systems
We present a physical-layer fingerprint-embedding scheme for wireless signals, focusing on multiple-input multiple-output (MIMO) and orthogonal frequency-division multiplexing (OFDM) transmissions, where the fingerprint signal conveys a low-capacity communication suitable for authenticating the transmission and further facilitating secure communications. Our system embeds the fingerprint message into the noise subspace of the channel estimates obtained by the receiver, using a number of signal-spreading techniques. When channel state information is known to the transmitter, it can be leveraged to improve the performance of the fingerprint embedding; when it is not known, blind spreading techniques are applied. The fingerprint message is visible only to aware receivers that explicitly perform detection of the signal, and invisible to receivers employing typical channel equalization. A taxonomy of overlay designs is discussed, and these designs are explored experimentally using time-varying channel state information (CSI) recorded from IEEE 802.16e Mobile WiMax base stations. The performance of the fingerprint signal as received by a WiMax subscriber is demonstrated using CSI measurements derived from the downlink signal. Detection performance for the digital fingerprint message in time-varying channel conditions is also presented via simulation.
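The core idea of this abstract, embedding a message in the noise subspace of a channel estimate so that equalization is unaffected, can be sketched with a small MIMO example. The dimensions, amplitudes, and detection rule are our own simplifications, not the authors' overlay designs:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical rank-2 MIMO channel seen by a 4-antenna receiver.
nr, nt = 4, 2
H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)

U = np.linalg.svd(H, full_matrices=True)[0]
signal_basis, noise_basis = U[:, :nt], U[:, nt:]   # column space / left null space

# Low-power +-1 fingerprint symbols embedded in the noise subspace
# of the channel estimate, plus estimation noise.
amp = 0.05
M = amp * (2 * rng.integers(0, 2, (nr - nt, nt)) - 1.0)
est_noise = 0.01 * (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt)))
H_est = H + noise_basis @ M + est_noise            # what the receiver estimates

# Aware receiver: project the estimate onto the noise subspace and slice.
M_hat = np.sign((noise_basis.conj().T @ H_est).real)

# Unaware receiver: the dominant (signal) subspace is essentially untouched,
# so ordinary channel equalization ignores the fingerprint.
distortion = np.linalg.norm(signal_basis.conj().T @ (H_est - H))
print(M_hat, distortion)
```

Because the embedding lies (by construction) in the left null space of H, the projection used for equalization sees only the small estimation noise, while the aware receiver reads the message directly from the orthogonal complement.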
A Review of Hashing-Based Image Copy Detection Techniques
Images are natural carriers of information, and a large number of images are created, exchanged, and made available online. Beyond the creation of new images, the proliferation of duplicate copies is a critical problem. Hashing-based image copy detection techniques are a promising way to address it: a hash is constructed from a set of distinctive features extracted from the image and used for identification. This article provides a comprehensive review of state-of-the-art image hashing techniques. The reviewed techniques are categorized by the mechanism used and compared across a set of functional and performance parameters. The article concludes by highlighting the current issues faced by such systems and possible future directions to motivate further research.
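One of the simplest perceptual hashes reviewed in this literature is the average hash: downsample, threshold each block mean against the global mean, and compare hashes by Hamming distance. The sketch below is a minimal illustration on synthetic data, not any specific technique from the article:

```python
import numpy as np

def average_hash(img, size=8):
    """Perceptual average hash: block-mean downsample to size x size,
    threshold at the mean; near-duplicate images get nearby bit strings."""
    h, w = img.shape
    small = img[:h - h % size, :w - w % size] \
        .reshape(size, h // size, size, w // size).mean(axis=(1, 3))
    return (small > small.mean()).flatten()

def hamming(a, b):
    """Number of differing hash bits."""
    return int(np.count_nonzero(a != b))

rng = np.random.default_rng(3)
img = rng.integers(0, 256, (64, 64)).astype(float)
copy = np.clip(img + rng.normal(0, 4, img.shape), 0, 255)   # noisy duplicate
other = rng.integers(0, 256, (64, 64)).astype(float)         # unrelated image

print(hamming(average_hash(img), average_hash(copy)))    # small distance
print(hamming(average_hash(img), average_hash(other)))   # large distance
```

Copy detection then reduces to a nearest-neighbor search in Hamming space, with a distance threshold separating duplicates from distinct images; the surveyed techniques differ mainly in which features replace the raw block means.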
Recent Application in Biometrics
In recent years, a number of recognition and authentication systems based on biometric measurements have been proposed. Algorithms and sensors have been developed to acquire and process many different biometric traits. Moreover, biometric technology is being used in novel ways, with potential commercial and practical implications for our daily activities. The key objective of the book is to provide a collection of comprehensive references on recent theoretical developments as well as novel applications in biometrics. The topics covered in this book reflect both aspects of this development. They include biometric sample quality, privacy-preserving and cancelable biometrics, contactless biometrics, novel and unconventional biometrics, and the technical challenges of implementing the technology on portable devices. The book consists of 15 chapters, divided into four sections: biometric applications on mobile platforms, cancelable biometrics, biometric encryption, and other applications. The book was reviewed by editors Dr. Jucheng Yang and Dr. Norman Poh. We deeply appreciate the efforts of our guest editors, Dr. Girija Chetty, Dr. Loris Nanni, Dr. Jianjiang Feng, Dr. Dongsun Park, and Dr. Sook Yoon, as well as a number of anonymous reviewers.