A new view of nonlinear water waves: the Hilbert spectrum
We survey the newly developed Hilbert spectral analysis method and its applications to Stokes waves, nonlinear wave evolution processes, the spectral form of the random wave field, and turbulence. Our emphasis is on the inadequacy of presently available methods in nonlinear and nonstationary data analysis. Hilbert spectral analysis is here proposed as an alternative. This new method provides not only a more precise definition of particular events in time-frequency space than wavelet analysis, but also more physically meaningful interpretations of the underlying dynamic processes.
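The core of Hilbert spectral analysis is the analytic signal, whose phase derivative gives an instantaneous frequency at every sample. A minimal sketch, assuming only `scipy.signal.hilbert` and a synthetic linear chirp (the signal and parameters are illustrative, not from the paper):

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic test signal: a linear chirp sweeping 50 Hz -> 150 Hz over 1 s.
fs = 1000.0                                   # sampling rate (Hz), assumed
t = np.arange(0, 1.0, 1.0 / fs)
signal = np.cos(2 * np.pi * (50 * t + 50 * t**2))

# Analytic signal via the Hilbert transform.
analytic = hilbert(signal)
amplitude = np.abs(analytic)                  # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))         # instantaneous phase
inst_freq = np.diff(phase) * fs / (2 * np.pi) # instantaneous frequency (Hz)
```

Unlike a wavelet scalogram, `inst_freq` assigns a single frequency to each instant, which is what gives the Hilbert spectrum its sharper time-frequency localization for nonstationary data.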
A mission synthesis algorithm for fatigue damage analysis
This paper presents a signal-processing-based algorithm, the Mildly Nonstationary Mission Synthesis (MNMS), which produces a short mission signal from long records of experimental data. The algorithm uses the Discrete Fourier Transform, the Orthogonal Wavelet Transform, and bump reinsertion procedures. To assess the algorithm's effectiveness, a fatigue damage case study was performed for a vehicle lower suspension arm using signals containing tensile and compressive preloading. The mission synthesis results were compared to the original road data in terms of both the global signal statistics and the fatigue damage variation as a function of compression ratio. Three bump reinsertion methods were used and evaluated. The methods differed in the manner in which bumps (shock events) from different wavelet groups (frequency bands) were synchronised during the reinsertion process. One method, based on time-synchronised section reinsertion, produced the best results in terms of mission signal kurtosis, crest factor, root-mean-square level, and power spectral density. For improved algorithm performance, bump selection was identified as the main control parameter requiring optimisation.
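The bump selection step above hinges on locating shock events in a long load record. A crude sketch of that idea, using a sliding-window RMS detector; the window length, threshold factor, and synthetic road signal are assumptions for illustration, not values from the MNMS paper:

```python
import numpy as np

# Stand-in for a long road-load record: Gaussian background with one
# injected shock event ("bump") at samples 4000-4050.
rng = np.random.default_rng(0)
road = rng.normal(0.0, 1.0, 10_000)
road[4_000:4_050] += 8.0

# Sliding-window RMS highlights energy concentrations.
win = 100                                    # window length (samples), assumed
rms = np.sqrt(np.convolve(road**2, np.ones(win) / win, mode="same"))

# Flag windows well above the typical level as bump candidates.
threshold = 3.0 * np.median(rms)             # threshold factor, assumed
bump_mask = rms > threshold
bump_samples = np.flatnonzero(bump_mask)
```

In the actual algorithm, detection and reinsertion are done per wavelet group (frequency band), and the paper's finding is that time-synchronising the reinserted sections across bands best preserves kurtosis, crest factor, RMS, and PSD.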
Compression and Conditional Emulation of Climate Model Output
Numerical climate model simulations run at high spatial and temporal resolutions generate massive quantities of data. As our computing capabilities continue to increase, storing all of the data is not sustainable, and thus it is important to develop methods for representing the full datasets by smaller compressed versions. We propose a statistical compression and decompression algorithm based on storing a set of summary statistics as well as a statistical model describing the conditional distribution of the full dataset given the summary statistics. The statistical model can be used to generate realizations representing the full dataset, along with characterizations of the uncertainties in the generated data. Thus, the methods are capable of both compression and conditional emulation of the climate models. Considerable attention is paid to accurately modeling the original dataset (one year of daily mean temperature data), particularly with regard to the inherent spatial nonstationarity in global fields, and to determining the statistics to be stored, so that the variation in the original data can be closely captured, while allowing for fast decompression and conditional emulation on modest computers.
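The compress-then-emulate idea can be sketched under a strong simplifying assumption: treat each grid cell's daily temperatures as independent Gaussians, store only per-cell summary statistics, and draw new realizations from them. The paper's actual model is far richer (spatially nonstationary, conditional on the stored statistics); everything below is illustrative:

```python
import numpy as np

# Stand-in for one year of daily mean temperatures on a 1000-cell grid.
rng = np.random.default_rng(42)
full_data = rng.normal(15.0, 4.0, size=(365, 1000))   # days x grid cells

# "Compression": store only per-cell summary statistics.
cell_mean = full_data.mean(axis=0)
cell_std = full_data.std(axis=0)

# "Conditional emulation": generate a realization from the stored statistics,
# carrying per-cell uncertainty rather than reproducing the data exactly.
emulated = rng.normal(cell_mean, cell_std, size=full_data.shape)

# Storage shrinks from 365 values per cell to 2.
compression_ratio = full_data.size / (cell_mean.size + cell_std.size)
```

The emulated field matches the stored statistics in distribution, which is the sense in which decompression yields realizations "representing" the original data rather than a bit-exact copy.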