28 research outputs found

    Deep Neural Oracles for Short-Window Optimized Compressed Sensing of Biosignals

    The recovery of sparse signals from their linear mapping onto lower-dimensional spaces can be partitioned into a support estimation phase and a coefficient estimation phase. We propose to estimate the support with an oracle based on a deep neural network trained jointly with the linear mapping at the encoder. The oracle's prediction is then used to estimate the coefficients by pseudo-inversion. This architecture allows the definition of an encoding-decoding scheme with state-of-the-art recovery capabilities when applied to biological signals such as ECG and EEG, while requiring only extremely low-complexity encoders. As an additional feature, oracle-based recovery is able to self-assess, indicating with remarkable accuracy the chunks of signal that may have been reconstructed with unsatisfactory quality. This self-assessment capability is unique in the CS literature and paves the way for further improvements depending on the requirements of the specific application. As an example, our scheme compresses an ECG or EEG signal by a factor of 2.67 with satisfactory quality, at a complexity equivalent to only 24 signed sums per processed sample.
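    A minimal numpy sketch of the two-stage decoding described above: in the paper the support is predicted by a trained deep neural network, whereas here the true support is passed in directly, purely to illustrate the pseudo-inversion stage; names, dimensions, and parameters are illustrative assumptions.

```python
import numpy as np

def recover_with_oracle(y, A, support):
    """Two-stage recovery: given measurements y = A @ x and an estimated
    support, estimate the nonzero coefficients by pseudo-inverting the
    corresponding columns of A (least-squares fit on the support)."""
    x_hat = np.zeros(A.shape[1])
    A_s = A[:, support]                       # columns selected by the oracle
    x_hat[support] = np.linalg.pinv(A_s) @ y  # coefficient estimation phase
    return x_hat

# Toy run: the oracle is a trained DNN in the paper; here the true support
# is used directly, just to exercise the decoding step.
rng = np.random.default_rng(0)
n, m, k = 128, 48, 6                              # length, measurements, sparsity
x = np.zeros(n)
support_true = np.sort(rng.choice(n, k, replace=False))
x[support_true] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)      # encoder's linear mapping
y = A @ x
x_hat = recover_with_oracle(y, A, support_true)
print(np.linalg.norm(x - x_hat))                  # ~0 when the support is exact
```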

    Adapted Compressed Sensing: A Game Worth Playing

    Despite the universal nature of the compressed sensing mechanism, additional information on the class of sparse signals to acquire allows adjustments that yield substantial improvements. In fact, proper exploitation of these priors makes it possible to significantly increase compression for a given reconstruction quality. Since one of the most promising areas of application of compressed sensing is that of IoT devices subject to extremely tight resource constraints, adaptation is especially interesting when it can cope with hardware-related constraints and allow low-complexity implementations. We review and compare several algorithmic adaptation policies that focus either on the encoding part or on the recovery part of compressed sensing, as sketched below for the encoder side. We also review other, more hardware-oriented adaptation techniques that are actually able to make the difference in real-world implementations. In all cases, adaptation proves to be a tool that should be mastered in practical applications to unleash the full potential of compressed sensing.
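    One common family of encoder-side adaptations tunes the second-order statistics of the sensing matrix to those of the signal class. The sketch below illustrates the idea with a simple correlation-blending rule; it is a generic, assumed example and not the specific policy of any of the works surveyed here.

```python
import numpy as np

def adapted_sensing_matrix(Cx, m, alpha=0.5, seed=None):
    """Draw sensing rows from a zero-mean Gaussian whose correlation blends
    the signal-class correlation Cx with the identity.  alpha = 0 recovers
    the classical universal (white) ensemble; alpha > 0 biases the rows
    toward the directions where the signal class carries more energy."""
    rng = np.random.default_rng(seed)
    n = Cx.shape[0]
    Cx = Cx * n / np.trace(Cx)                      # normalize average energy
    C_rows = (1 - alpha) * np.eye(n) + alpha * Cx
    L = np.linalg.cholesky(C_rows + 1e-9 * np.eye(n))
    return rng.standard_normal((m, n)) @ L.T        # rows ~ N(0, C_rows)

# Usage on a toy low-pass signal class described by its correlation matrix.
n, m = 128, 32
t = np.arange(n)
Cx = np.exp(-np.abs(t[:, None] - t[None, :]) / 10.0)
A = adapted_sensing_matrix(Cx, m, alpha=0.5, seed=1)
print(A.shape)                                      # (32, 128)
```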

    Algorithms and Systems for IoT and Edge Computing

    The idea of distributing the signal processing along the path that starts with acquisition and ends with the final application has given rise to the Internet of Things and Edge Computing, which have demonstrated several advantages in terms of scalability, cost, and reliability. In this dissertation, we focus on designing and implementing algorithms and systems that allow complex tasks to be performed on devices with limited resources. Firstly, we assess the trade-off between compression and anomaly detection from both a theoretical and a practical point of view. Information theory provides the rate-distortion analysis, which is extended to consider how information content is processed for detection purposes. Considering an actual Structural Health Monitoring application, two corner cases are analysed: detection under high distortion based on a feature-extraction method, and detection under low distortion based on Principal Component Analysis. Secondly, we focus on streaming methods for subspace analysis. In this context, we review and study state-of-the-art methods to target devices with limited computational resources. We also consider a real deployment of an algorithm for streaming Principal Component Analysis for signal compression in a Structural Health Monitoring application, discussing the trade-offs between the possible implementation strategies. Finally, we focus on an alternative compression framework suited to low-end devices, namely Compressed Sensing. We propose a decoding approach that splits the recovery problem into two stages and effectively combines a deep neural network with basic linear algebra to reconstruct biomedical signals. This novel approach outperforms the state of the art in terms of reconstruction quality and requires lower computational resources.
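    As a concrete illustration of streaming subspace analysis on a resource-limited node, the sketch below estimates the leading principal component with Oja's rule, a standard streaming estimator; it is offered only as an assumed example, not as the specific algorithm deployed in the dissertation.

```python
import numpy as np

def oja_streaming_pc(stream, lr=0.01):
    """Estimate the leading principal component from a stream of samples
    with Oja's rule: w += lr * x * (x @ w), then renormalize.  Only one
    vector of state is kept, which suits memory-limited edge nodes."""
    it = iter(stream)
    w = np.array(next(it), dtype=float)
    w /= np.linalg.norm(w)
    for x in it:
        x = np.asarray(x, dtype=float)
        w += lr * x * (x @ w)
        w /= np.linalg.norm(w)
    return w

# Toy stream with one dominant direction d plus small isotropic noise.
rng = np.random.default_rng(2)
d = rng.standard_normal(8)
d /= np.linalg.norm(d)
stream = (3.0 * rng.standard_normal() * d + 0.1 * rng.standard_normal(8)
          for _ in range(5000))
w = oja_streaming_pc(stream)
print(abs(w @ d))   # close to 1: the estimate aligns with d
```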

    Matrix Methods for the Efficient Acquisition and Encryption of Signals in Compressed Form

    The idea of matching the resources spent in the acquisition and encoding of natural signals to their intrinsic information content has driven nearly a decade of research under the name of compressed sensing. In this doctoral dissertation we develop some extensions and improvements upon this technique's foundations, by modifying the random sensing matrices on which the signals of interest are projected in order to achieve different objectives. Firstly, we propose two methods for the adaptation of sensing-matrix ensembles to the second-order moments of natural signals. These techniques leverage the maximisation of different proxies for the quantity of information acquired by compressed sensing, and are efficiently applied to the encoding of electrocardiographic traces with minimum-complexity digital hardware. Secondly, we focus on the possibility of using compressed sensing as a method to provide a partial, yet cryptanalysis-resistant, form of encryption; in this context, we show how a random matrix generation strategy with a controlled amount of perturbation can be used to distinguish between multiple user classes with different quality of access to the encrypted information content. Finally, we explore the application of compressed sensing in the design of a multispectral imager, by implementing an optical scheme that combines a coded aperture array with Fabry-Pérot spectral filters. The signal recoveries obtained by processing real-world measurements show promising results, which leave room for improvement in the sensing-matrix calibration of the devised imager.
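    A toy sketch of the multi-class idea mentioned above, under stated assumptions: both user classes derive the same ±1 base sensing matrix from a shared seed, while only first-class users also know a private seed selecting a small set of sign flips. The construction, the fraction of perturbed entries, and the seeds are illustrative assumptions, not the dissertation's actual scheme.

```python
import numpy as np

def two_class_matrices(n, m, flip_fraction, seed_shared, seed_private):
    """Both user classes derive the same +/-1 base matrix from a shared
    seed; only first-class users also know the private seed selecting a
    small set of sign flips.  Second-class decoders, using the unflipped
    matrix, therefore recover the signal with degraded quality."""
    rng_shared = np.random.default_rng(seed_shared)
    A_low = rng_shared.choice([-1.0, 1.0], size=(m, n))   # second-class view
    rng_private = np.random.default_rng(seed_private)
    mask = rng_private.random((m, n)) < flip_fraction
    A_high = np.where(mask, -A_low, A_low)                # actual encoding matrix
    return A_high, A_low

A_high, A_low = two_class_matrices(n=128, m=32, flip_fraction=0.03,
                                   seed_shared=7, seed_private=99)
print(np.mean(A_high != A_low))   # roughly 0.03 of the entries differ
```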

    Artificial Intelligence for Multimedia Signal Processing

    Artificial intelligence technologies are being actively applied to broadcasting and multimedia processing. A great deal of research has been conducted in a wide variety of fields, such as content creation, transmission, and security, and over the past two to three years these efforts have aimed at improving the compression efficiency of images, video, speech, and other data in areas related to MPEG media processing technology. In addition, technologies for media creation, processing, editing, and scenario generation are very important areas of research in multimedia processing and engineering. This book collects topics spanning advanced computational intelligence algorithms and technologies for emerging multimedia signal processing, including computer vision, speech/sound/text processing, and content analysis/information mining.

    Time-Frequency Distributions: Approaches for Incomplete Non-Stationary Signals

    There are many sources of waveforms or signals around us. They can be natural phenomena such as sound and light, or invisible ones like electromagnetic fields, voltage, etc. Gaining insight into these waveforms helps explain the mysteries surrounding our world, and signal spectral analysis (i.e. the Fourier transform) is one of the most significant approaches to analysing a signal. Nevertheless, Fourier analysis cannot provide a time-dependent spectrum description for spectrum-varying, i.e. non-stationary, signals. In these cases, time-frequency distributions are employed instead of the traditional Fourier transform. A variety of methods have been proposed to obtain time-frequency representations (TFRs), such as the spectrogram or the Wigner-Ville distribution. Time-frequency distributions (TFDs) indeed offer a better signal interpretation in a two-dimensional time-frequency plane, which the Fourier transform fails to give. Nevertheless, in the case of incomplete data, the time-frequency displays are obscured by artifacts and become highly noisy. Signal time-frequency features are therefore hard to extract and cannot be used for further data processing. In this thesis, we propose two methods to deal with compressed observations. The first applies compressive sensing with a novel chirp dictionary. This method assumes that any windowed signal can be approximated by a sum of chirps, and then performs sparse reconstruction from windowed data in the time domain; a few improvements in computational complexity are also included. In the second method, fixed as well as adaptive optimal kernels are used. This work is also based on the assumption that any windowed signal can be approximately represented by a sum of chirps. Since any chirp's auto-terms occupy only a certain area in the ambiguity domain, the kernel can be designed to remove the other regions, where auto-terms do not reside. In this manner, not only the cross-terms but also the artifacts caused by missing samples are mitigated significantly. The two proposed approaches yield better performance in estimating the time-frequency signatures of the signals, as demonstrated with both synthetic and real signals. Note that in this thesis we only consider non-stationary signals whose frequency changes slowly with time, because signals with rapidly varying frequency are not sparse in the time-frequency domain, so compressive sensing techniques or sparse reconstruction cannot be applied. Also, the data with random missing samples are obtained by randomly choosing the sample positions and replacing those samples with zeros.
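    A minimal sketch of the first approach, under stated assumptions: a windowed signal is modelled as a sum of chirp atoms and reconstructed from its observed (non-missing) samples with Orthogonal Matching Pursuit over a chirp dictionary. The dictionary grids, window length, and missing-sample pattern below are illustrative and not those used in the thesis.

```python
import numpy as np

def chirp_dictionary(n, freqs, rates):
    """Columns are unit-norm chirp atoms exp(j*2*pi*(f*t + 0.5*c*t^2)),
    with f in cycles/sample and c in cycles/sample^2."""
    t = np.arange(n)
    atoms = [np.exp(2j * np.pi * (f * t + 0.5 * c * t * t))
             for f in freqs for c in rates]
    D = np.stack(atoms, axis=1)
    return D / np.linalg.norm(D, axis=0)

def omp(y, D, k):
    """Orthogonal Matching Pursuit: greedily pick k atoms, refitting all
    selected coefficients by least squares at every step."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.conj().T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    return support, coef

# One windowed chirp with half of its samples missing (positions known).
rng = np.random.default_rng(3)
n = 64
t = np.arange(n)
x = np.exp(2j * np.pi * (0.05 * t + 0.5 * 0.002 * t * t))
observed = np.sort(rng.choice(n, size=n // 2, replace=False))

D = chirp_dictionary(n, freqs=np.arange(0.0, 0.5, 0.01),
                     rates=np.arange(0.0, 0.004, 0.001))
support, coef = omp(x[observed], D[observed, :], k=1)
x_hat = D[:, support] @ coef                            # full-length reconstruction
print(np.linalg.norm(x - x_hat) / np.linalg.norm(x))    # small relative error
```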

    Efficient and secured wireless monitoring systems for detection of cardiovascular diseases

    Cardiovascular Disease (CVD) is the number one killer of the modern era. The majority of deaths associated with CVD can be entirely prevented if the person struck by CVD is treated with urgency. This thesis is our effort to minimise the delay associated with existing tele-cardiology applications. We harness the computational power of modern mobile phones to detect abnormalities in the Electrocardiogram (ECG). If an abnormality is detected, our ECG compression algorithm running on the patient's mobile phone compresses and encrypts the ECG signal and then transmits it efficiently to the doctors or hospital services. To the best of our knowledge, we achieve the highest compression ratio reported in the literature, 20.06 (95% compression), on ECG signals without any loss of information. Our 3-layer permutation-cipher-based ECG encoding mechanism can raise the security strength substantially above that of conventional AES or DES algorithms: even if, in the near future, a grid of supercomputers could compare a trillion trillion trillion (10^36) combinations of one ECG segment (comprising 500 ECG samples) per second for ECG morphology matching, it would take approximately 9.333 × 10^970 years to enumerate all the combinations. After receiving the compressed ECG packets, the doctor's mobile phone or the hospital server authenticates the patient using our proposed set of ECG-biometric-based authentication mechanisms. Once authenticated, the patients are diagnosed with our faster ECG diagnosis algorithms. In a nutshell, this thesis contains a set of algorithms that can save a CVD-affected patient's life by harnessing the power of mobile computation and wireless communication.
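    A toy illustration of the permutation-cipher principle mentioned above: a key-derived permutation scrambles the samples of a 500-sample ECG segment, so an attacker without the key faces the space of admissible orderings. Only a single layer is shown and every name and parameter is an assumption; the thesis stacks three permutation layers in its actual encoding.

```python
import numpy as np

def permute_segment(segment, key_seed):
    """Scramble an ECG segment with a key-derived permutation.  For a
    500-sample segment a single layer already admits 500! orderings."""
    rng = np.random.default_rng(key_seed)
    perm = rng.permutation(len(segment))
    return segment[perm], perm

def unpermute_segment(scrambled, perm):
    """Invert the permutation on the receiving side (key holder only)."""
    restored = np.empty_like(scrambled)
    restored[perm] = scrambled
    return restored

segment = np.sin(np.linspace(0, 4 * np.pi, 500))    # stand-in for 500 ECG samples
scrambled, perm = permute_segment(segment, key_seed=1234)
print(np.allclose(unpermute_segment(scrambled, perm), segment))   # True
```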