On the Use of Independent Component Analysis to Denoise Side-Channel Measurements
Independent Component Analysis (ICA) is a powerful technique for blind source separation. It has been successfully applied to signal processing problems, such as feature extraction and noise reduction, in many different areas including medical signal processing and telecommunications. In this work, we propose a framework for applying ICA to denoise side-channel measurements and hence to reduce the complexity of key-recovery attacks. Based on several case studies, we then demonstrate the overwhelming advantages of ICA over commonly used preprocessing techniques such as singular spectrum analysis. We mainly target a masked software implementation of AES and an unprotected hardware one. Our results show a significant Signal-to-Noise Ratio (SNR) gain, which translates into a reduction in the number of traces needed for a successful side-channel attack. This establishes ICA as an important new tool for the security assessment of cryptographic implementations.
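The denoising idea above can be sketched in a few lines. This is a minimal illustration, not the authors' pipeline: it uses scikit-learn's `FastICA` on synthetic traces, keeps only the component most correlated with the mean trace (assumed to carry the leakage), and re-projects; all sizes, signals, and the component-selection rule are assumptions for illustration only.

```python
# Hedged sketch of ICA-based trace denoising (illustrative, not the paper's
# method). Assumes several aligned measurements of the same leakage signal,
# each corrupted by independent noise.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
leakage = np.sin(2 * np.pi * 7 * t)           # toy "leakage" signal
noise = rng.normal(scale=0.5, size=(4, 500))  # independent noise per trace
traces = leakage + noise                      # 4 noisy measurements

ica = FastICA(n_components=4, random_state=0)
sources = ica.fit_transform(traces.T)         # shape (500, 4)

# Keep the component most correlated with the mean trace (heuristic: the
# averaged trace is dominated by the common leakage component).
mean_trace = traces.mean(axis=0)
corr = [abs(np.corrcoef(sources[:, k], mean_trace)[0, 1]) for k in range(4)]
keep = int(np.argmax(corr))
filtered = np.zeros_like(sources)
filtered[:, keep] = sources[:, keep]
denoised = ica.inverse_transform(filtered).T  # back to trace space

snr_before = leakage.var() / (traces[0] - leakage).var()
snr_after = leakage.var() / (denoised[0] - leakage).var()
```

On this toy model the retained component absorbs the common sinusoid, so the per-trace SNR improves after reconstruction.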
Deep learning denoising by dimension reduction: Application to the ORION-B line cubes
Context. The availability of large-bandwidth receivers for millimeter radio telescopes allows the acquisition of position-position-frequency data cubes over a wide field of view and a broad frequency coverage. These cubes contain much information on the physical, chemical, and kinematical properties of the emitting gas. However, their large size coupled with an inhomogeneous signal-to-noise ratio (SNR) poses major challenges for consistent analysis and interpretation. Aims. We search for a denoising method for the low-SNR regions of the studied data cubes that would allow us to recover the low-SNR emission without distorting the signals with high SNR. Methods. We perform an in-depth analysis of the 13CO and C17O (1–0) data cubes obtained as part of the ORION-B large program carried out at the IRAM 30m telescope. We analyse the statistical properties of the noise and the evolution of the correlation of the signal in a given frequency channel with that of the adjacent channels. This allows us to propose significant improvements to the autoassociative neural networks typically used to denoise hyperspectral Earth remote sensing data. Applying this method to the 13CO (1–0) cube, we compare the denoised data with those derived with the multiple Gaussian fitting algorithm ROHSA, considered the state-of-the-art procedure for line data cubes. Results. The nature of astronomical spectral data cubes is distinct from that of the hyperspectral data usually studied in the Earth remote sensing literature, because the observed intensities become statistically independent beyond a short channel separation. This lack of redundancy in the data led us to adapt the method, notably by taking into account the sparsity of the signal along the spectral axis. The application of the proposed algorithm leads to an increase of the SNR in voxels with weak signal, while preserving the spectral shape of the data in high-SNR voxels. Conclusions. The proposed algorithm, which combines a detailed analysis of the noise statistics with an innovative autoencoder architecture, is a promising path to denoise radio-astronomy line data cubes. In the future, exploring whether a better use of the spatial correlations of the noise may further improve the denoising performance seems a promising avenue.
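The dimension-reduction principle behind the paper's autoassociative network can be illustrated with its linear special case, a truncated SVD (equivalent to PCA, i.e. a linear autoencoder). This is a hedged sketch, not the ORION-B pipeline: the synthetic spectra, line width, latent dimension, and noise level are all assumptions chosen only to show how projecting onto a low-dimensional subspace suppresses noise while preserving sparse spectral lines.

```python
# Hedged sketch: linear "autoencoder" denoising of synthetic spectra via
# truncated SVD (PCA). The paper's actual denoiser is an adapted
# autoassociative neural network; this only illustrates the idea.
import numpy as np

rng = np.random.default_rng(1)
n_spectra, n_chan = 2000, 64
v = np.arange(n_chan)

# Sparse Gaussian lines at random positions (the signal is sparse along
# the spectral axis, as in the paper's adaptation).
centers = rng.uniform(10.0, 54.0, size=(n_spectra, 1))
clean = np.exp(-0.5 * ((v - centers) / 2.0) ** 2)
noisy = clean + rng.normal(scale=0.3, size=clean.shape)

# "Encode" to a low-dimensional latent space and "decode" back.
mean = noisy.mean(axis=0)
u, s, vt = np.linalg.svd(noisy - mean, full_matrices=False)
k = 16                                        # latent dimension (assumed)
denoised = (u[:, :k] * s[:k]) @ vt[:k] + mean

rms_before = np.sqrt(np.mean((noisy - clean) ** 2))
rms_after = np.sqrt(np.mean((denoised - clean) ** 2))
```

Keeping only k of 64 components retains roughly a k/64 fraction of the noise variance, so the reconstruction error drops well below the input noise level as long as the signal subspace is captured.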
Filtering Random Graph Processes Over Random Time-Varying Graphs
Graph filters play a key role in processing the graph spectra of signals supported on the vertices of a graph. However, despite their widespread use, graph filters have been analyzed only in the deterministic setting, ignoring the impact of stochasticity in both the graph topology and the signal itself. To bridge this gap, we examine the statistical behavior of the two key filter types, finite impulse response (FIR) and autoregressive moving average (ARMA) graph filters, when operating on random time-varying graph signals (or random graph processes) over random time-varying graphs. Our analysis shows that (i) in expectation, the filters behave as the same deterministic filters operating on the expected graph with the expected signal as input, and (ii) there are meaningful upper bounds on the variance of the filter output. We conclude the paper by proposing two novel ways of exploiting randomness to improve (joint graph-time) noise cancellation, as well as to reduce the computational complexity of graph filtering. As demonstrated by numerical results, these methods outperform the disjoint average-and-denoise algorithm and yield an up to fourfold reduction in complexity, with very little difference from the optimal solution.
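Claim (i) can be checked numerically for an FIR graph filter: when each shift uses an independent realization of a random edge-sampled graph, the expected product of shift operators factorizes, so the Monte Carlo mean of the random filter output approaches the deterministic filter applied with the expected shift operator. The graph model, activation probability, and filter taps below are illustrative assumptions, not the paper's setup.

```python
# Hedged sketch: a 3-tap FIR graph filter over independently resampled
# random graphs; its Monte Carlo mean matches the same filter on the
# expected graph (claim (i) in expectation).
import numpy as np

rng = np.random.default_rng(2)
n, p = 20, 0.6                        # nodes; edge activation probability
A = (rng.random((n, n)) < 0.5).astype(float)
A = np.triu(A, 1)
A = A + A.T                           # fixed underlying adjacency
h = np.array([1.0, 0.5, 0.25])        # FIR taps h_0, h_1, h_2 (assumed)
x = rng.normal(size=n)                # deterministic input graph signal

def sample_S():
    """Each underlying edge is independently active with probability p."""
    mask = np.triu(rng.random((n, n)) < p, 1).astype(float)
    mask = mask + mask.T
    return A * mask

def fir_random(x):
    """Time-varying FIR filter: a fresh graph realization per shift."""
    S1, S2 = sample_S(), sample_S()
    return h[0] * x + h[1] * (S1 @ x) + h[2] * (S1 @ (S2 @ x))

S_bar = p * A                         # expected shift operator
y_det = h[0] * x + h[1] * (S_bar @ x) + h[2] * (S_bar @ (S_bar @ x))

# Independence across shifts gives E[S1 S2] = S_bar @ S_bar, so the
# Monte Carlo mean converges to the deterministic output.
y_mc = np.mean([fir_random(x) for _ in range(10000)], axis=0)
```

Note that the factorization relies on the shift operators at different time steps being independent; for a single fixed random graph, E[S @ S] differs from S_bar @ S_bar.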
Investigation Of Pressure Fluctuations In The Hyporheic Zone In Response To Flow Around A Hydraulic Structure
Erosion around cylinders is a well-studied field. Particles erode when lift and drag forces overcome a critical threshold. These forces are typically studied from above the water-riverbed interface. This study maps hyporheic pressure fluctuations in relation to surface-water velocity. The pressure map is used to evaluate lift enhancement and destabilization forces on the riverbed. High-pressure events in the subsurface help generate a destabilizing force from within the riverbed. This work develops a probability distribution function relating turbulent velocity fluctuations to subsurface pressure fluctuations.
A cylinder was fitted with differential pressure transducers such that the pressure ports were flush with the cylinder surface and below the water-sand interface. Three-component velocities were recorded synchronously with differential pressure fluctuations measured over an 18 mm depth. As expected, the results show a decay in pressure fluctuations as a function of depth. The standard deviation of the pressure fluctuations in the upper hyporheic zone scales well with shear stress.
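The final step, relating the two fluctuation records through a probability distribution, amounts to estimating an empirical joint PDF from the synchronous time series. The sketch below is purely illustrative: the synthetic series, correlation level, and event thresholds are assumptions, not measured values from this study.

```python
# Hedged sketch: empirical joint PDF of velocity fluctuations u' and
# subsurface pressure fluctuations p' from synchronous records
# (synthetic, correlated data standing in for the measurements).
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
u_f = rng.normal(size=n)                    # velocity fluctuations u'
p_f = 0.6 * u_f + 0.8 * rng.normal(size=n)  # correlated pressure p'

# Joint PDF on a 50x50 grid, normalised as a density.
pdf, u_edges, p_edges = np.histogram2d(u_f, p_f, bins=50, density=True)
du, dp = np.diff(u_edges)[0], np.diff(p_edges)[0]
mass = pdf.sum() * du * dp                  # total probability, ~1

# Example use: conditional probability of a high-pressure event
# (p' > 2 s.d.) given a strong velocity fluctuation (u' > 1 s.d.).
sigma_p = p_f.std()
events = (p_f > 2.0 * sigma_p) & (u_f > 1.0)
prob = events.sum() / (u_f > 1.0).sum()
```

With measured series in place of the synthetic ones, the same histogram gives the velocity-conditioned likelihood of destabilizing high-pressure events in the bed.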