
    Recent Advances in Theory and Methods for Nonstationary Signal Analysis

    All physical processes are nonstationary. When analyzing time series, it should be remembered that nature can be amazingly complex and that many of the theoretical constructs used in stochastic process theory (for example, linearity, ergodicity, normality, and particularly stationarity) are mathematical fairy tales. There are no stationary time series in the strict mathematical sense; at the very least, everything has a beginning and an end. Thus, while it is necessary to know the theory of stationary processes, one should not adhere to it dogmatically when analyzing data from physical sources, particularly when the observations span an extended period. Nonstationary signals are appropriate models for signals arising in many fields of application, including communications, speech and audio, mechanics, geophysics, climatology, solar and space physics, optics, and biomedical engineering. Nonstationary models account for possible time variations of the statistical functions and/or spectral characteristics of signals. Thus, they provide analysis tools more general than the classical Fourier transform for finite-energy signals or the power spectrum for finite-power stationary signals. Nonstationarity, being a “nonproperty,” has been analyzed from several different points of view. Several approaches that generalize the traditional concepts of Fourier analysis have been considered, including time-frequency, time-scale, and wavelet analysis, as well as fractional Fourier and linear canonical transforms.
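    The time-frequency idea mentioned above can be made concrete with a short-time Fourier transform. A minimal NumPy sketch (the chirp signal, window length, and hop size are illustrative choices, not taken from the article):

```python
import numpy as np

def stft(x, win_len=256, hop=128):
    """Magnitude STFT: window the signal into overlapping frames and FFT each."""
    window = np.hanning(win_len)
    n_frames = 1 + (len(x) - win_len) // hop
    frames = np.stack([x[i * hop : i * hop + win_len] * window
                       for i in range(n_frames)])
    return np.abs(np.fft.rfft(frames, axis=1))   # shape (n_frames, win_len//2 + 1)

fs = 8000
t = np.arange(fs) / fs                                # 1 s of samples
chirp = np.sin(2 * np.pi * (200 * t + 400 * t**2))    # inst. freq rises 200 -> 1000 Hz

S = stft(chirp)
peak_hz = S.argmax(axis=1) * fs / 256                 # dominant frequency per frame
# A single global Fourier transform would smear this sweep into one broad band;
# the STFT instead shows the peak frequency rising frame by frame.
```

A wavelet or nonstationary-Gabor analysis refines this further by letting the window itself vary with time or frequency.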

    Untenable nonstationarity: An assessment of the fitness for purpose of trend tests in hydrology

    The detection and attribution of long-term patterns in hydrological time series have been important research topics for decades. A significant portion of the literature regards such patterns as ‘deterministic components’ or ‘trends’, even though the complexity of hydrological systems does not allow easy deterministic explanations and attributions. Consequently, trend estimation techniques have been developed to make and justify statements about tendencies in the historical data, which are often used to predict future events. Testing trend hypotheses on observed time series is widespread in the hydro-meteorological literature, mainly due to the interest in detecting consequences of human activities on the hydrological cycle. This analysis usually relies on the application of null hypothesis significance tests (NHSTs) for slowly varying and/or abrupt changes, such as Mann-Kendall, Pettitt, or similar tests, to summary statistics of hydrological time series (e.g., annual averages, maxima, or minima). However, the reliability of this application has seldom been explored in detail. This paper discusses misuse, misinterpretation, and logical flaws of NHSTs for trends in the analysis of hydrological data from three different points of view: historic-logical, semantic-epistemological, and practical. Based on a review of the NHST rationale and basic statistical definitions of stationarity, nonstationarity, and ergodicity, we show that even if the empirical estimation of trends in hydrological time series is always feasible from a numerical point of view, it is uninformative and does not allow the inference of nonstationarity without assuming a priori additional information on the underlying stochastic process, according to deductive reasoning. This prevents the use of trend NHST outcomes to support nonstationary frequency analysis and modeling.
    We also show that the correlation structures characterizing hydrological time series might easily be underestimated, further compromising the attempt to draw conclusions about trends spanning the period of record. Moreover, even though adjustment procedures accounting for correlation have been developed, some of them are insufficient or are applied only to some tests, while others are theoretically flawed but still widely applied. In particular, using 250 unimpacted stream flow time series across the conterminous United States (CONUS), we show that the test results can dramatically change if the sequences of annual values are reproduced starting from daily stream flow records, whose larger sizes enable a more reliable assessment of the correlation structures.
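    For concreteness, the Mann-Kendall test discussed above can be sketched in a few lines of plain Python. This is the minimal, uncorrected form (no tie or serial-correlation adjustment, which is precisely the adjustment issue the paper criticizes):

```python
import math

def mann_kendall_z(x):
    """Standard-normal Z statistic of the Mann-Kendall trend test.

    Minimal form: assumes no ties and no serial correlation. Autocorrelated
    series inflate |Z| and produce spurious "trend" detections.
    """
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18     # variance of S under H0, no ties
    if s > 0:
        return (s - 1) / math.sqrt(var_s)      # continuity correction
    if s < 0:
        return (s + 1) / math.sqrt(var_s)
    return 0.0
```

|Z| > 1.96 rejects “no trend” at the 5% level, but, as argued above, rejection alone does not license an inference of nonstationarity in the generating process.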

    A Phase Vocoder based on Nonstationary Gabor Frames

    We propose a new algorithm for time stretching music signals based on the theory of nonstationary Gabor frames (NSGFs). The algorithm extends the techniques of the classical phase vocoder (PV) by incorporating adaptive time-frequency (TF) representations and adaptive phase locking. The adaptive TF representations imply good time resolution for the onsets of attack transients and good frequency resolution for the sinusoidal components. We estimate the phase values only at peak channels; the remaining phases are then locked to the values of the peaks in an adaptive manner. During attack transients we keep the stretch factor equal to one, and we propose a new strategy for determining which channels are relevant for reinitializing the corresponding phase values. In contrast to previously published algorithms, we use a non-uniform NSGF to obtain a low redundancy of the corresponding TF representation. We show that with just three times as many TF coefficients as signal samples, artifacts such as phasiness and transient smearing can be greatly reduced compared to the classical PV. The proposed algorithm is tested on both synthetic and real-world signals and compared with state-of-the-art algorithms in a reproducible manner.
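    As context for the proposed extension, the classical PV baseline that the paper improves on can be sketched in NumPy. This is a minimal, assumption-laden version (uniform Hann window, no phase locking, no transient handling; the window and hop sizes are illustrative):

```python
import numpy as np

def pv_stretch(x, rate, win=1024, ha=256):
    """Classical phase-vocoder time stretch: rate > 1 slows the signal down.

    The NSGF algorithm above replaces the fixed window with an adaptive one
    and locks non-peak phases to the peaks; none of that is done here.
    """
    hs = int(round(ha * rate))                        # synthesis hop
    w = np.hanning(win)
    starts = range(0, len(x) - win, ha)
    X = np.fft.rfft([x[p:p + win] * w for p in starts], axis=1)
    omega = 2 * np.pi * np.arange(X.shape[1]) / win   # bin centers (rad/sample)
    phase = np.angle(X[0])
    out = np.zeros(len(X) * hs + win)
    for k in range(len(X)):
        if k > 0:
            # Phase deviation from the bin center gives the instantaneous frequency.
            dphi = np.angle(X[k]) - np.angle(X[k - 1]) - omega * ha
            dphi = (dphi + np.pi) % (2 * np.pi) - np.pi      # wrap to [-pi, pi)
            phase = phase + (omega + dphi / ha) * hs         # advance at new hop
        frame = np.fft.irfft(np.abs(X[k]) * np.exp(1j * phase))
        out[k * hs : k * hs + win] += frame * w              # windowed overlap-add
    return out

fs = 8000
x = np.sin(2 * np.pi * 440 * np.arange(fs) / fs)   # 1 s, 440 Hz tone
y = pv_stretch(x, 2.0)                             # roughly twice as long, same pitch
```

Because every bin drifts independently, this baseline exhibits exactly the phasiness and transient smearing the paper sets out to reduce.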

    How Predictable are Temperature-series Undergoing Noise-controlled Dynamics in the Mediterranean

    The Mediterranean region is thought to be sensitive to global climate change, but its future interdecadal variability remains uncertain across many climate models. A study was made of the variability of the winter temperature over the Mediterranean Sub-regional Area (MSA), employing a reconstructed temperature series covering the period 1698 to 2010. This paper describes how the winter temperature data were transformed via Empirical Mode Decomposition (EMD) for the purposes of noise reduction and statistical modeling, and discusses this emerging approach as a way to account for the internal dependence structure of natural climate variability.
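    Empirical Mode Decomposition works by iteratively “sifting” out the fastest oscillation: interpolate envelopes through the local maxima and minima and subtract their mean. A toy sketch (linear-interpolation envelopes rather than the usual cubic splines, a fixed number of sifting passes, and an invented test signal; real EMD adds stopping criteria and boundary handling):

```python
import numpy as np

def sift(h):
    """One sifting pass: subtract the mean of the extrema envelopes."""
    t = np.arange(len(h))
    mx = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1  # local maxima
    mn = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1  # local minima
    if len(mx) < 2 or len(mn) < 2:
        return None                       # too few extrema to build envelopes
    upper = np.interp(t, mx, h[mx])       # piecewise-linear upper envelope
    lower = np.interp(t, mn, h[mn])       # piecewise-linear lower envelope
    return h - (upper + lower) / 2

def first_imf(x, n_sift=5):
    """Extract the fastest oscillatory mode by repeated sifting."""
    h = x
    for _ in range(n_sift):
        nxt = sift(h)
        if nxt is None:
            break
        h = nxt
    return h

t = np.linspace(0, 1, 1000, endpoint=False)
x = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * t)  # fast mode + slow "trend"
imf1 = first_imf(x)          # captures the 10 Hz mode
residual = x - imf1          # the smoother, slowly varying part
```

Discarding or modeling the fast, noise-like IMFs while keeping the slow residual is the noise-reduction use described in the abstract.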

    Dynamic Metric Learning from Pairwise Comparisons

    Recent work in distance metric learning has focused on learning transformations of data that best align with specified pairwise similarity and dissimilarity constraints, often supplied by a human observer. The learned transformations lead to improved retrieval, classification, and clustering algorithms due to the better adapted distance or similarity measures. Here, we address the problem of learning these transformations when the underlying constraint generation process is nonstationary. This nonstationarity can be due to changes in either the ground-truth clustering used to generate constraints or changes in the feature subspaces in which the class structure is apparent. We propose Online Convex Ensemble StrongLy Adaptive Dynamic Learning (OCELAD), a general adaptive, online approach for learning and tracking optimal metrics as they change over time, one that is highly robust to a variety of nonstationary behaviors in the changing metric. We apply the OCELAD framework to an ensemble of online learners. Specifically, we create a retro-initialized composite objective mirror descent (COMID) ensemble (RICE) consisting of a set of parallel COMID learners with different learning rates. We demonstrate RICE-OCELAD on both real and synthetic data sets and show significant performance improvements relative to previously proposed batch and online distance metric learning algorithms. (To appear at Allerton 2016.)
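    The ensemble-of-learning-rates idea behind RICE can be illustrated on a toy nonstationary stream. The sketch below uses plain online gradient descent on a drifting scalar mean, not the actual COMID metric-learning update; the stream, rates, and jump point are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Nonstationary stream: the mean jumps abruptly from 0 to 5 halfway through.
stream = np.concatenate([rng.normal(0.0, 0.1, 1000),
                         rng.normal(5.0, 0.1, 1000)])

# Parallel learners with geometrically spaced rates, echoing the RICE idea of
# running several copies so that some rate suits whatever drift occurs.
rates = [0.01, 0.05, 0.25]
estimate = {r: 0.0 for r in rates}
cum_loss = {r: 0.0 for r in rates}
for y in stream:
    for r in rates:
        cum_loss[r] += (estimate[r] - y) ** 2     # prediction loss before updating
        estimate[r] += r * (y - estimate[r])      # gradient step on squared loss
```

After the jump, the fast learner re-converges within a few steps while the slow one lags for hundreds, so on this abruptly changing stream the fast rate accumulates less loss; an adaptive ensemble can track whichever learner is currently best rather than committing to one rate in advance.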