Statistical analysis of low level atmospheric turbulence
The statistical properties of low-level wind-turbulence data were obtained with the model 1080 total vector anemometer and the model 1296 dual split-film anemometer, both manufactured by Thermo Systems Incorporated. The data obtained from these fast-response probes were compared with the results obtained from a pair of Gill propeller anemometers. The digitized time series representing the three velocity components and the temperature were each divided into a number of blocks, the length of which depended on the lowest frequency of interest and also on the storage capacity of the available computer. A moving-average and differencing high-pass filter was used to remove the trend and the low-frequency components in the time series. The calculated results for each of the anemometers used are presented in graphical or tabulated form.
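The moving-average detrending step described above can be sketched in a few lines of numpy; the window length and the synthetic series are illustrative assumptions, not values from the study:

```python
import numpy as np

def moving_average_highpass(x, window):
    """High-pass filter: subtract a centered moving average,
    removing the trend and low-frequency components."""
    kernel = np.ones(window) / window
    trend = np.convolve(x, kernel, mode="same")
    return x - trend

# synthetic series: a slow linear drift plus a fast fluctuation
t = np.linspace(0.0, 10.0, 2000)
drift = 2.0 * t
fluctuation = 0.3 * np.sin(2 * np.pi * 25.0 * t)
x = drift + fluctuation

filtered = moving_average_highpass(x, window=200)
# away from the edges, essentially only the fast fluctuation remains
```

Subtracting the moving average rather than differencing keeps the passband flat at high frequencies, at the cost of edge effects within half a window of each block boundary.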
LMMSE Estimation and Interpolation of Continuous-Time Signals from Discrete-Time Samples Using Factor Graphs
The factor graph approach to discrete-time linear Gaussian state space models
is well developed. The paper extends this approach to continuous-time linear
systems/filters that are driven by white Gaussian noise. By Gaussian message
passing, we then obtain MAP/MMSE/LMMSE estimates of the input signal, or of the
state, or of the output signal from noisy observations of the output signal.
These estimates may be obtained with arbitrary temporal resolution. The
proposed input signal estimation does not seem to have appeared in the prior
Kalman filtering literature.
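The covariance-form LMMSE computation for a static linear Gaussian model can be sketched as follows; the model matrices here are illustrative assumptions, and the paper's factor-graph message passing generalizes this to continuous-time state-space models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear Gaussian model: y = H x + n, with prior x ~ N(m, P)
# and noise n ~ N(0, R).  In this setting the MAP, MMSE and
# LMMSE estimates of x all coincide with the posterior mean.
H = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
m = np.zeros(2)
P = np.eye(2)
R = 0.1 * np.eye(3)

x_true = rng.normal(size=2)
y = H @ x_true + rng.multivariate_normal(np.zeros(3), R)

# LMMSE estimate in covariance form
S = H @ P @ H.T + R                              # innovation covariance
x_hat = m + P @ H.T @ np.linalg.solve(S, y - H @ m)
```

Gaussian message passing on a factor graph evaluates exactly this kind of update locally, node by node, which is what allows estimates at arbitrary temporal resolution.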
Space Time MUSIC: Consistent Signal Subspace Estimation for Wide-band Sensor Arrays
Wide-band Direction of Arrival (DOA) estimation with sensor arrays is an
essential task in sonar, radar, acoustics, biomedical and multimedia
applications. Many state-of-the-art wide-band DOA estimators coherently process
frequency binned array outputs by approximate Maximum Likelihood, Weighted
Subspace Fitting or focusing techniques. This paper shows that bin signals
obtained by filter-bank approaches do not obey the finite rank narrow-band
array model, because spectral leakage and the change of the array response with
frequency within the bin create \emph{ghost sources} dependent on the
particular realization of the source process. Therefore, existing DOA
estimators based on binning cannot claim consistency even with the perfect
knowledge of the array response. In this work, a more realistic array model
with a finite length of the sensor impulse responses is assumed, which still
has finite rank under a space-time formulation. It is shown that signal
subspaces at arbitrary frequencies can be consistently recovered under mild
conditions by applying MUSIC-type (ST-MUSIC) estimators to the dominant
eigenvectors of the wide-band space-time sensor cross-correlation matrix. A
novel Maximum Likelihood based ST-MUSIC subspace estimate is developed in order
to recover consistency. The number of sources active at each frequency is
estimated by Information Theoretic Criteria. The sample ST-MUSIC subspaces can
be fed to any subspace fitting DOA estimator at single or multiple frequencies.
Simulations confirm that the new technique clearly outperforms binning
approaches at sufficiently high signal to noise ratio, when model mismatches
exceed the noise floor.
Comment: 15 pages, 10 figures. Accepted in a revised form by the IEEE Transactions on Signal Processing on 12 February 2018.
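For context, here is a minimal narrow-band MUSIC sketch on a uniform linear array, i.e., the classical estimator the paper builds on, not ST-MUSIC itself; the geometry, source angles and noise level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

M, d = 8, 0.5                          # sensors, spacing in wavelengths
thetas = np.deg2rad([-20.0, 30.0])     # true DOAs
N = 2000                               # snapshots

def steering(theta):
    return np.exp(2j * np.pi * d * np.arange(M) * np.sin(theta))

A = np.column_stack([steering(th) for th in thetas])
S = (rng.normal(size=(2, N)) + 1j * rng.normal(size=(2, N))) / np.sqrt(2)
noise = 0.1 * (rng.normal(size=(M, N)) + 1j * rng.normal(size=(M, N))) / np.sqrt(2)
X = A @ S + noise

Rxx = X @ X.conj().T / N
_, vecs = np.linalg.eigh(Rxx)          # eigenvalues in ascending order
En = vecs[:, : M - 2]                  # noise subspace (2 sources assumed known)

grid = np.deg2rad(np.linspace(-90.0, 90.0, 3601))
pseudo = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(g)) ** 2
                   for g in grid])

# the two largest pseudo-spectrum peaks sit near the true DOAs
peaks = [i for i in range(1, len(grid) - 1)
         if pseudo[i] > pseudo[i - 1] and pseudo[i] > pseudo[i + 1]]
top2 = sorted(sorted(peaks, key=lambda i: pseudo[i])[-2:])
doa_est = np.rad2deg(grid[top2])
```

The paper's point is that when each frequency bin of a wide-band signal is forced into this finite-rank narrow-band model, leakage creates "ghost sources" that bias exactly this kind of subspace estimate.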
A Signal-Processing View on Packet Sampling and Anomaly Detection
Anomaly detection methods typically operate on preprocessed traffic traces. Firstly, most traffic capturing devices today employ random packet sampling, where each packet is selected with a certain probability, to cope with increasing link speeds. Secondly, temporal aggregation, where all packets in a measurement interval are represented by their temporal mean, is applied to transform the traffic trace to the observation timescale of interest for anomaly detection. These preprocessing steps affect the temporal correlation structure of traffic that is used by anomaly detection methods such as Kalman filtering or PCA, and thus have an impact on anomaly detection performance. Prior work has analyzed how packet sampling degrades the accuracy of anomaly detection methods; however, neither theoretical explanations nor solutions to the sampling problem have been provided. This paper makes the following key contributions: (i) It provides a thorough analysis and quantification of how random packet sampling and temporal aggregation modify the signal properties by introducing noise, distortion and aliasing. (ii) We show that aliasing introduced by the aggregation step has the largest impact on the correlation structure. (iii) We further propose to replace the aggregation step with a specifically designed low-pass filter that reduces the aliasing effect. (iv) Finally, we show that with our solution applied, the performance of anomaly detection systems can be considerably improved in the presence of packet sampling.
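The aliasing effect of contribution (ii) and the low-pass remedy of (iii) can be illustrated on a synthetic signal; the rates, tone frequencies and windowed-sinc filter design below are assumptions for illustration, not the paper's filter:

```python
import numpy as np

fs, D = 1000.0, 10               # original rate, downsampling factor
t = np.arange(10000) / fs
# traffic-like signal: a slow component plus a component *above*
# the post-aggregation Nyquist frequency fs / (2 D) = 50 Hz
x = np.sin(2 * np.pi * 5.0 * t) + np.sin(2 * np.pi * 130.0 * t)

# (a) temporal aggregation: mean over each interval, i.e. a boxcar
#     filter followed by decimation -- a poor anti-alias filter
agg = x.reshape(-1, D).mean(axis=1)

# (b) replacement: a proper low-pass (windowed-sinc FIR), then decimate
taps = 101
n = np.arange(taps) - taps // 2
fc = 0.5 / D                                  # cutoff, cycles per sample
h = 2 * fc * np.sinc(2 * fc * n) * np.hamming(taps)
lp = np.convolve(x, h, mode="same")[::D]

def tone_power(sig, f, rate):
    """Amplitude of the complex exponential at frequency f in sig."""
    e = np.exp(-2j * np.pi * f * np.arange(len(sig)) / rate)
    return abs(sig @ e) / len(sig)

# at the reduced rate fs / D = 100 Hz, the 130 Hz tone aliases to 30 Hz
alias_agg = tone_power(agg, 30.0, fs / D)
alias_lp = tone_power(lp, 30.0, fs / D)
```

The boxcar's sinc-shaped response only mildly attenuates the 130 Hz component, so a spurious 30 Hz tone appears in the aggregated trace, while the designed low-pass suppresses it before decimation.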
Spectral analysis of suspension system of a commercial city bus
There is an ever-increasing demand for intelligent and efficient urban vehicle systems that fulfill several requirements, e.g., low maintenance cost and high passenger comfort. To meet these goals, reliable methods are needed to model and to evaluate the required performance. In this paper a spectral analysis of the suspension system of a commercial city bus is presented. Based on experimental data taken on a city bus, the vibrations emerging at the wheels and the body are analyzed in the frequency domain. The goal of the analysis is to characterize the main eigenfrequencies of the suspension system and its damping in amplitude, and also to evaluate both the road and the suspension system in terms of passenger comfort according to ISO standards. © 2016 IEEE
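The kind of eigenfrequency identification described can be sketched with an averaged periodogram on a simulated resonance; the eigenfrequency, damping ratio and sampling rate below are illustrative assumptions, not the bus measurements:

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 200.0
t = np.arange(0, 60.0, 1.0 / fs)
f_body = 1.5                  # assumed body eigenfrequency, Hz
zeta = 0.1                    # assumed damping ratio
w0 = 2 * np.pi * f_body
dt = 1.0 / fs

# body motion modeled as a lightly damped resonance driven by road noise
x = np.zeros(len(t))
v = 0.0
for i in range(1, len(t)):
    a = -2 * zeta * w0 * v - w0 ** 2 * x[i - 1] + rng.normal(scale=50.0)
    v += a * dt
    x[i] = x[i - 1] + v * dt

# averaged periodogram (Welch-style, non-overlapping Hann segments)
seg = 2048
nseg = len(x) // seg
win = np.hanning(seg)
psd = np.zeros(seg // 2)
for k in range(nseg):
    X = np.fft.rfft(win * x[k * seg:(k + 1) * seg])[: seg // 2]
    psd += np.abs(X) ** 2
psd /= nseg
freqs = np.fft.rfftfreq(seg, 1.0 / fs)[: seg // 2]

# search above 0.5 Hz to skip the DC / drift region
f_peak = freqs[np.argmax(np.where(freqs > 0.5, psd, 0.0))]
```

The spectral peak recovers the resonance frequency, and its width is related to the damping, which is the quantity of interest for ride-comfort evaluation.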
Multiscale analysis of financial volatility
This thesis is concerned with the modeling of financial time series data.
It introduces to the economics literature a set of techniques for this purpose
that are rooted in engineering and physics, but almost unheard of in
economics. The key feature of these techniques is that they combine the
available information in the time and frequency domains simultaneously,
making it possible to enjoy the advantages of both forms of analysis. The
thesis is divided into three sections. First, after briefly outlining the Fourier
methods, a more flexible technique that allows for the study of
time-scale-dependent phenomena (motivated by a discussion of Heisenberg's
uncertainty principle), namely the wavelet method, is defined. A complete account of
discrete and continuous wavelet transformations, and wavelet variation is
provided and the advantages of wavelet-multiresolution analysis over Fourier
methods are demonstrated. In the second section, the statistical properties
of financial returns at 1-day, 5-day and 10-day sampling intervals are studied
using S&P500 index for over a decade, and the links between dependence
properties of financial returns at lower sampling frequencies are explored.
The concepts of temporal aggregation and skip sampling are discussed and
the effects of temporal aggregation on long range dependent time series are
theoretically outlined and then tested through simulations and empirically
via S&P500. In the third section, the variation of two years of five-minute
GBP/USD exchange rate is analysed and the notion of realised variation is
explored. The characteristics of the intraday data at different sampling
frequencies (5-minute, 30-minute, 60-minute, 10-hour, 1-day, and 5-day)
are compared with each other and filtered out from seasonalities using the
wavelet multiscaling technique. We find that temporal aggregation does not
change the decay rate of the autocorrelation functions of long-memory data at
certain frequencies; however, the level at which the autocorrelation functions
start moves upward for daily data. This thesis adds to the literature
by outlining and comparing the effects of aggregation between daily and
intra-daily frequencies for the realised variances, which to our knowledge is
a first. The effect of temporal aggregation on daily data differs from its
effect on intra-daily data, and we provide three reasons why this might be.
First, at higher frequencies strong periodicities distort the autocorrelation
functions, which could bring down the decay rate and mask the long-memory feature
of the data. Second, the choice of realised variance is crucial in this matter
and different functions can result in contradictory outcomes. Third, as the
order of aggregation increases, the decay rate does not depend on the order
of the aggregation.
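The wavelet-variance-by-scale machinery the thesis relies on can be sketched with a plain Haar DWT; the wavelet choice and the synthetic white-noise and random-walk series are illustrative assumptions:

```python
import numpy as np

def haar_step(x):
    """One level of the Haar DWT: approximation and detail coefficients."""
    x = x[: len(x) // 2 * 2]
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def wavelet_variances(x, levels):
    """Wavelet variance at each scale: mean energy of detail coefficients."""
    out, a = [], np.asarray(x, dtype=float)
    for _ in range(levels):
        a, d = haar_step(a)
        out.append(np.mean(d ** 2))
    return out

rng = np.random.default_rng(4)
# white noise: wavelet variance is flat across scales
wn = rng.normal(size=2 ** 14)
# random walk (strong low-frequency dependence): variance grows with scale
rw = np.cumsum(rng.normal(size=2 ** 14))

v_wn = wavelet_variances(wn, 5)
v_rw = wavelet_variances(rw, 5)
```

The growth rate of the wavelet variance across scales is exactly what separates short-memory from long-memory series, which is why the thesis can test how aggregation affects long-range dependence scale by scale.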
Community-Aware Graph Signal Processing
The emerging field of graph signal processing (GSP) makes it possible to
transpose classical signal processing operations (e.g., filtering) to signals
on graphs. The GSP framework is generally built upon the graph Laplacian,
which plays a crucial role in studying graph properties and measuring graph
signal smoothness.
Here instead, we propose the graph modularity matrix as the centerpiece of GSP,
in order to incorporate knowledge about graph community structure when
processing signals on the graph, but without the need for community detection.
We study this approach in several generic settings such as filtering, optimal
sampling and reconstruction, surrogate data generation, and denoising.
Feasibility is illustrated by a small-scale example and a transportation
network dataset, as well as one application in human neuroimaging where
community-aware GSP reveals relationships between behavior and brain features
that are not shown by Laplacian-based GSP. This work demonstrates how concepts
from network science can lead to new meaningful operations on graph signals.
Comment: 21 pages, 4 figures. Accepted to Signal Processing Magazine, Special
Issue on Graph Signal Processing: Foundations and Emerging Directions.
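A minimal sketch of the modularity matrix taking the Laplacian's place as the GSP centerpiece; the toy graph of two linked triangles is an illustrative assumption:

```python
import numpy as np

# toy graph: two triangles joined by a single bridge edge (two communities)
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

deg = A.sum(axis=1)
m = deg.sum() / 2.0
B = A - np.outer(deg, deg) / (2.0 * m)   # modularity matrix

# eigenvectors of B play the role the Laplacian eigenbasis plays in
# standard GSP; the leading one aligns with the community structure
vals, vecs = np.linalg.eigh(B)
leading = vecs[:, -1]
labels = np.sign(leading)                # signs split the two triangles
```

Filtering or sampling in this eigenbasis therefore incorporates community structure directly, without running a separate community-detection step.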