878 research outputs found

    Positive time-frequency distributions based on joint marginal constraints

    Get PDF
    This correspondence studies the formulation of members of the Cohen-Posch class of positive time-frequency energy distributions. Minimization of cross-entropy measures with respect to different priors, as well as the case of no prior (maximum entropy), is considered. It is concluded that, in general, the information provided by the classical marginal constraints is very limited, so the final distribution depends heavily on the prior distribution. To overcome this limitation, joint time and frequency marginals, directly related to the fractional Fourier transform, are derived from a "direction invariance" criterion on the time-frequency plane. Peer Reviewed
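The weakness of the classical marginal constraints mentioned in the abstract is easy to see numerically: with only the two marginals as constraints and no prior, the entropy-maximizing positive distribution is simply the outer product of the marginals, which is everywhere nonnegative and satisfies both constraints exactly while carrying no joint time-frequency structure at all. A minimal NumPy check, using an arbitrary windowed test tone:

```python
import numpy as np

# A windowed complex tone as an illustrative test signal
n = 128
x = np.exp(2j * np.pi * 0.1 * np.arange(n)) * np.hanning(n)

p_t = np.abs(x) ** 2                  # time marginal |x(t)|^2
p_f = np.abs(np.fft.fft(x)) ** 2 / n  # frequency marginal |X(f)|^2
E = p_t.sum()                         # signal energy; Parseval gives p_f.sum() == E

# Maximum-entropy positive distribution under the marginal constraints:
# the product of the marginals
P = np.outer(p_t, p_f) / E

assert np.allclose(P.sum(axis=1), p_t)  # time marginal satisfied
assert np.allclose(P.sum(axis=0), p_f)  # frequency marginal satisfied
assert P.min() >= 0.0                   # positive, i.e. Cohen-Posch class
```

Any signal with the same pair of marginals yields the same product distribution, which is exactly why additional (joint, fractional-Fourier-related) constraints are needed.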

    Bilinear time-frequency representations of heart rate variability and respiration during stress

    Get PDF
    Recently, joint time-frequency signal representation has received considerable attention as a powerful tool for analyzing a variety of signals and systems. In particular, if the frequency content is time varying, as in signals of biological origin, which often do not comply with stationarity assumptions, then this approach is quite attractive. In this dissertation, we explore the possibility of better representation of two particular biological signals, namely heart rate variability (HRV) and respiration. We propose the use of time-frequency analysis as a new and innovative approach to examine the physical and mental exertion attributed to exercise. Two studies are used for the main investigation, the preliminary and anticipation protocols. In the first phase of this work, the application of five different bilinear representations to modeled HRV test signals and to experimental HRV and respiration signals of the preliminary protocol is evaluated. Each distribution (the short-time Fourier transform (STFT), the pseudo Wigner-Ville (WVD), the smoothed pseudo Wigner-Ville (SPWVD), the Choi-Williams (CWD), and the Born-Jordan-Cohen (RID)) has unique characteristics, which are shown to affect the amount of smoothing and the generation of cross-terms differently. The CWD and the SPWVD are chosen for further application because they overcome the drawbacks of the other distributions, providing higher resolution in time and frequency while suppressing interference between the signal components. In the second phase of this research, the SPWVD and CWD are used to investigate the presence of an anticipatory component due to the stressful exercise condition, as reflected in the HRV signal by a change in the behavior of the autonomic nervous system.
By expanding the concept of spectral analysis of heart rate variability (HRV) into time-frequency analysis, we are able to quantitatively assess the parasympathetic (HF) and sympatho-vagal balance (LF:HF) changes as a function of time. This enables assessment of the autonomic nervous system during rapid changes. A new methodology is also proposed that adaptively uncovers the region of parasympathetic activity. It is well known that parasympathetic activity is highly correlated with the respiration frequency. This technique traces the respiration frequency and extracts the corresponding parasympathetic activity from the heart rate variability signal by adaptive filtering.
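The time-varying LF:HF tracking described above can be sketched in a few lines. The sketch below substitutes a plain short-time Fourier transform for the SPWVD/CWD actually used in the dissertation, and the test signal, sampling rate, and window sizes are illustrative assumptions; only the band edges (LF 0.04-0.15 Hz, HF 0.15-0.4 Hz) follow standard HRV conventions:

```python
import numpy as np

def lf_hf_ratio(rr_signal, fs, win_len, hop):
    """Track the LF:HF power ratio of an HRV signal over time.

    Uses a windowed short-time spectrum as a stand-in for the
    bilinear distributions (SPWVD/CWD) employed in the study.
    """
    window = np.hanning(win_len)
    freqs = np.fft.rfftfreq(win_len, d=1.0 / fs)
    lf = (freqs >= 0.04) & (freqs < 0.15)   # sympathetic + vagal band
    hf = (freqs >= 0.15) & (freqs < 0.40)   # parasympathetic band
    ratios = []
    for start in range(0, len(rr_signal) - win_len + 1, hop):
        seg = rr_signal[start:start + win_len] * window
        power = np.abs(np.fft.rfft(seg)) ** 2
        ratios.append(power[lf].sum() / power[hf].sum())
    return np.array(ratios)

# Example: a dominant 0.3 Hz "respiratory" (HF) tone plus a weak 0.1 Hz
# (LF) tone, so the LF:HF ratio stays below 1 throughout
fs = 4.0  # HRV is commonly resampled to ~4 Hz
t = np.arange(0, 300, 1.0 / fs)
rr = 0.1 * np.sin(2 * np.pi * 0.3 * t) + 0.02 * np.sin(2 * np.pi * 0.1 * t)
print(lf_hf_ratio(rr, fs, win_len=256, hop=128))
```

Tracing the respiration frequency, as the proposed methodology does, would amount to replacing the fixed HF band above with a band that follows the respiratory peak over time.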

    Time-varying nonstationary multivariate risk analysis using a dynamic Bayesian copula

    Get PDF
    A time-varying risk analysis is proposed for an adaptive design framework in nonstationary conditions arising from climate change. A Bayesian, dynamic conditional copula is developed for modeling the time-varying dependence structure between mixed continuous and discrete multiattributes of multidimensional hydrometeorological phenomena. Joint Bayesian inference is carried out to fit the marginals and copula in an illustrative example using an adaptive Gibbs Markov Chain Monte Carlo (MCMC) sampler. Posterior mean estimates and credible intervals are provided for the model parameters, and the Deviance Information Criterion (DIC) is used to select the model that best captures different forms of nonstationarity over time. This study also introduces a fully Bayesian, time-varying joint return period for multivariate time-dependent risk analysis in nonstationary environments. We thank the associate editor and three anonymous reviewers whose suggestions helped improve the paper. We acknowledge the CMIP5 coupled climate modelling groups for producing and making their model outputs available, and the U.S. Department of Energy's Program for Climate Model Diagnosis and Intercomparison (PCMDI), which provides coordinating support and led development of software infrastructure in partnership with the Global Organization for Earth System Science Portals. The CMIP5 model outputs used in the present study are available from http://cmip-pcmdi.llnl.gov/cmip5/data_portal.html. We also thank the Iran Meteorological Organization (IRIMO) for providing rainfall data recorded at the Tehran synoptic station. Funding support was provided by the Natural Sciences and Engineering Research Council (NSERC) of Canada.
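The core idea of a time-varying dependence structure can be sketched compactly. The snippet below samples from a Gaussian copula whose correlation parameter drifts over time; the Gaussian family and the fixed per-step correlations are simplifying assumptions here, whereas the paper fits a Bayesian conditional copula by MCMC:

```python
import math
import numpy as np

def sample_dynamic_gaussian_copula(rho_t, rng):
    """Draw one pair of uniforms per time step from a Gaussian copula
    whose correlation parameter rho_t changes over time."""
    z1 = rng.standard_normal(len(rho_t))
    z2 = rho_t * z1 + np.sqrt(1.0 - rho_t ** 2) * rng.standard_normal(len(rho_t))
    # map to uniform marginals through the standard normal CDF
    ncdf = np.vectorize(lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))
    return ncdf(z1), ncdf(z2)

# Dependence strengthening over the record mimics nonstationarity:
# the marginals stay uniform while the joint behavior changes
rho_t = np.linspace(0.1, 0.8, 1000)
u, v = sample_dynamic_gaussian_copula(rho_t, np.random.default_rng(0))
early = np.corrcoef(u[:300], v[:300])[0, 1]
late = np.corrcoef(u[-300:], v[-300:])[0, 1]
print(f"early corr {early:.2f} < late corr {late:.2f}")
```

In the paper's setting, rho_t would itself be a stochastic process with a prior, inferred jointly with the marginal parameters rather than fixed in advance.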

    Climate Change Projection and Time-varying Multi-dimensional Risk Analysis

    Get PDF
    In recent decades, population growth and global warming, consequent to greenhouse gas emissions from human activities, have changed the atmospheric composition, intensifying extreme climate phenomena and increasing extreme events overall. These extreme events have caused human suffering and devastating effects in recent record-breaking warming years. To mitigate adverse consequences arising from global warming, the best strategy is to project the future probabilistic behavior of extreme climate phenomena under a changing environment. The first contribution of this research is to improve the predictive power of regression-based statistical downscaling processes so that they accurately project the future behavior of extreme climate phenomena. First, a supervised dimensionality reduction algorithm is proposed for statistical downscaling, deriving a low-dimensional manifold that represents the climate change signals encoded in high-dimensional atmospheric variables. Such an algorithm is novel in climate change studies, as past literature has focused on deriving low-dimensional principal components from large-scale atmospheric projectors without taking the target hydro-climate variables into account. The new algorithm, called Supervised Principal Component Analysis (Supervised PCA), outperforms the existing state-of-the-art dimensionality reduction algorithms. The model improves the performance of statistical downscaling by deriving subspaces that have maximum dependency with the target hydro-climate variables. A kernel version of Supervised PCA is also introduced to perform nonlinear dimensionality reduction and capture the nonlinear, complex relationships between the hydro-climate response variable and the atmospheric projectors.
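The Supervised PCA idea (find directions in the inputs that have maximum dependence with the target, rather than maximum variance) can be sketched with the standard linear-target-kernel formulation. The synthetic data below are an illustrative assumption, meant only to show that the learned direction picks out the target-relevant projector:

```python
import numpy as np

def supervised_pca(X, y, k):
    """Supervised PCA: a k-dim subspace of X with maximum (HSIC-style)
    dependence on the target y, using a linear kernel on the target.

    X is (n_samples, n_features), y is (n_samples,). A minimal sketch;
    the thesis also develops a kernel variant for nonlinear dependence.
    """
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    L = np.outer(y, y)                   # linear kernel on targets
    Q = X.T @ H @ L @ H @ X              # feature-space dependence matrix
    w, V = np.linalg.eigh(Q)             # eigenvalues in ascending order
    U = V[:, ::-1][:, :k]                # top-k directions
    return X @ U, U

# y depends only on feature 2; Supervised PCA recovers that direction
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = 3.0 * X[:, 2] + 0.1 * rng.standard_normal(200)
Z, U = supervised_pca(X, y, k=1)
print(np.argmax(np.abs(U[:, 0])))  # → 2, the informative feature
```

Plain PCA on the same X would return an essentially arbitrary direction, since all five features have equal variance; the supervision is what singles out feature 2.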
To address the biases arising from differences between observed and simulated large-scale atmospheric projectors, and to represent anomalies of low-frequency variability of teleconnections in General Circulation Models (GCMs), a Multivariate Recursive Nesting Bias Correction (MRNBC) is proposed for the regression-based statistical downscaling. The proposed method is able to use multiple variables at multiple locations to simultaneously correct temporal and spatial biases in cross-dependent multi-projectors. To reduce another source of uncertainty, arising from the complexity and nonlinearity of the quantitative empirical relationships in statistical downscaling, the results demonstrate the superiority of a Bayesian machine-learning algorithm. The predictive power of the statistical downscaling is therefore improved by addressing the aforementioned sources of uncertainty. This improves the projection of global warming impacts on the probabilistic behavior of hydro-climate variables using future multi-model ensembles of GCMs under forcing climate change scenarios. The results of two Design-of-Experiments studies also reveal that the proposed comprehensive statistical downscaling is credible and adjustable to changes under non-stationary conditions arising from climate change. Under the impact of climate change arising from anthropogenic global warming, it is demonstrated that the nature and the risk of extreme climate phenomena change over time. It is also well known that extreme climate processes are multi-dimensional by their very nature, characterized by multiple highly dependent attributes. Accordingly, to strengthen the reliability of infrastructure designs and the management of water systems in the changing climate, it is of crucial importance to update the risk concept to a new adaptive, multi-dimensional, time-varying one that integrates anomalies of dynamic, anthropogenically forced environments.
The main contribution of this research is to develop a new generation of the multivariate time-varying risk concept for an adaptive design framework in non-stationary conditions arising from climate change. This research develops a Bayesian, dynamic conditional copula model describing the time-varying dependence structure between mixed continuous and discrete marginals of extreme multi-dimensional climate phenomena. The framework is able to integrate any anomalies in extreme multi-dimensional events in non-stationary conditions arising from climate change. It generates iterative samples using a Markov Chain Monte Carlo (MCMC) method from the full conditional marginals and joint distribution in a fully likelihood-based Bayesian inference. The framework also introduces a fully Bayesian, time-varying Joint Return Period (JRP) concept to quantify the extent of changes in the nature and the risk of extreme multi-dimensional events over time under the impact of climate change. The proposed generalized time-dependent risk framework can be applied to any stochastic multi-dimensional climate system under the influence of a changing environment.
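The time-varying JRP concept can be illustrated numerically. Under an "AND" hazard definition, T(t) = mu / Pr[X > x, Y > y] shrinks as dependence strengthens even when the marginals never change. A Gaussian copula, fixed per-step correlations, and unit inter-arrival time mu = 1 are simplifying assumptions below (the thesis's copula is Bayesian and conditional):

```python
import numpy as np

def joint_return_period(rho_t, p_exc=0.01, n_mc=200_000, seed=1):
    """Monte Carlo estimate of the 'AND' joint return period
    1 / Pr[both variables exceed their (1 - p_exc) quantile]
    under a Gaussian copula with per-step correlation rho_t."""
    rng = np.random.default_rng(seed)
    periods = []
    for rho in rho_t:
        z1 = rng.standard_normal(n_mc)
        z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal(n_mc)
        q1 = np.quantile(z1, 1.0 - p_exc)   # marginal design thresholds
        q2 = np.quantile(z2, 1.0 - p_exc)
        p_and = np.mean((z1 > q1) & (z2 > q2))
        periods.append(1.0 / p_and)
    return np.array(periods)

# As dependence grows over time, the joint extreme becomes more frequent,
# so the joint return period falls while each marginal stays fixed
T = joint_return_period(np.array([0.1, 0.45, 0.8]))
print(T)
```

This is the essence of why a stationary design based only on marginal return periods can understate risk in a nonstationary, increasingly dependent system.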

    Spectral Analysis for Signal Detection and Classification: Reducing Variance and Extracting Features

    Get PDF
    Spectral analysis encompasses several powerful signal processing methods. The papers in this thesis present methods for finding good spectral representations; both stationary and non-stationary signals are considered. Stationary methods can be used for real-time evaluation, analysing shorter segments of an incoming signal, while non-stationary methods can be used to analyse the instantaneous frequencies of fully recorded signals. All the presented methods aim to produce spectral representations that have high resolution and are easy to interpret. Such representations allow for detection of individual signal components in multi-component signals, as well as separation of close signal components. This makes feature extraction in the spectral representation possible; relevant features include the frequency or instantaneous frequency of components, the number of components in the signal, and the time duration of the components. Two methods that extract some of these features automatically for two types of signals are presented in this thesis. One is adapted to signals with two longer-duration frequency-modulated components and detects the instantaneous frequencies and cross-terms in the Wigner-Ville distribution; the other, for signals with an unknown number of short-duration oscillations, detects the instantaneous frequencies in a reassigned spectrogram. This thesis also presents two multitaper methods that reduce the influence of noise on the spectral representations. One is designed for stationary signals and the other for non-stationary signals with multiple short-duration oscillations. Applications for the methods presented in this thesis include several within medicine, e.g. diagnosis from analysis of heart rate variability, improved ultrasound resolution, and interpretation of brain activity from the electroencephalogram.
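The variance-reduction role of multitapering is easy to demonstrate. The sketch below averages periodograms over orthogonal sine tapers, a simple classical taper family used here as a stand-in for the taper sets designed in the thesis:

```python
import numpy as np

def sine_multitaper_spectrum(x, n_tapers):
    """Multitaper spectrum estimate: average the periodograms obtained
    with K orthogonal sine tapers, trading a little resolution for a
    roughly K-fold reduction in variance."""
    n = len(x)
    t = np.arange(1, n + 1)
    spectra = []
    for k in range(1, n_tapers + 1):
        taper = np.sqrt(2.0 / (n + 1)) * np.sin(np.pi * k * t / (n + 1))
        spectra.append(np.abs(np.fft.rfft(x * taper)) ** 2)
    return np.mean(spectra, axis=0)

# On white noise, the 8-taper estimate fluctuates far less across
# frequency bins than the single-taper periodogram
rng = np.random.default_rng(0)
x = rng.standard_normal(1024)
s1 = sine_multitaper_spectrum(x, 1)
s8 = sine_multitaper_spectrum(x, 8)
print(s8.std() / s1.std())  # well below 1: a smoother estimate
```

For a chi-squared-distributed periodogram, averaging K nearly independent estimates shrinks the standard deviation by about 1/sqrt(K), which is the effect the ratio above makes visible.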

    Fast DD-classification of functional data

    Full text link
    A fast nonparametric procedure for classifying functional data is introduced. It consists of a two-step transformation of the original data plus a classifier operating on a low-dimensional hypercube. The functional data are first mapped into a finite-dimensional location-slope space and then transformed by a multivariate depth function into the DD-plot, which is a subset of the unit hypercube. This transformation yields a new notion of depth for functional data. Three alternative depth functions are employed for this, as well as two rules for the final classification on [0,1]^q. The resulting classifier has to be cross-validated over a small range of parameters only, which is restricted by a Vapnik-Cervonenkis bound. The entire methodology does not involve smoothing techniques, is completely nonparametric, and achieves Bayes optimality under standard distributional settings. It is robust, efficiently computable, and has been implemented in an R environment. Applicability of the new approach is demonstrated by simulations as well as a benchmark study.
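The DD-plot step can be sketched in a finite-dimensional space: compute each observation's depth with respect to both training classes and assign it to the class in which it is deeper, i.e. split the DD-plot along its diagonal. The Mahalanobis depth and the synthetic Gaussian clouds below are illustrative assumptions; the paper works in a location-slope space derived from functional data and compares more refined separating rules than the diagonal:

```python
import numpy as np

def mahalanobis_depth(points, cloud):
    """Mahalanobis depth of each point with respect to a data cloud."""
    mu = cloud.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(cloud.T))
    d = points - mu
    md2 = np.einsum('ij,jk,ik->i', d, cov_inv, d)  # squared Mahalanobis distance
    return 1.0 / (1.0 + md2)

def dd_classify(x, class0, class1):
    """Simplest DD-plot rule: assign to the class of greater depth."""
    d0 = mahalanobis_depth(x, class0)
    d1 = mahalanobis_depth(x, class1)
    return (d1 > d0).astype(int)  # above the DD-plot diagonal -> class 1

rng = np.random.default_rng(0)
class0 = rng.standard_normal((200, 2))
class1 = rng.standard_normal((200, 2)) + 3.0
test = np.vstack([rng.standard_normal((50, 2)),
                  rng.standard_normal((50, 2)) + 3.0])
labels = np.r_[np.zeros(50, int), np.ones(50, int)]
acc = (dd_classify(test, class0, class1) == labels).mean()
print(acc)  # high accuracy for well-separated classes
```

Because the rule lives in the q-dimensional depth space rather than in the original (possibly infinite-dimensional) function space, only the classifier on [0,1]^q needs cross-validation, which is what keeps the procedure fast.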

    Forecasting of commercial sales with large scale Gaussian Processes

    Full text link
    This paper argues that applications of Gaussian Processes in the fast-moving consumer goods industry have not received enough discussion. Yet this technique can be important, as it can, e.g., provide automatic feature relevance determination, and the posterior mean can unlock insights on the data. Significant challenges are the large size and high dimensionality of commercial point-of-sale data. The study reviews approaches to Gaussian Process modeling for large data sets, evaluates their performance on commercial sales, and shows the value of this type of model as a decision-making tool for management.
    Comment: 10 pages, 5 figures
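The underlying model is standard GP regression, whose posterior mean is what "unlocks insights on the data". A minimal 1-D sketch with an RBF kernel follows; the toy sales curve, lengthscale, and noise level are illustrative assumptions, and for large point-of-sale data the paper reviews scalable approximations (e.g. inducing points) rather than the O(n^3) exact solve used here:

```python
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel between two 1-D input arrays."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=0.1):
    """Exact GP regression posterior mean: k_* (K + sigma^2 I)^{-1} y."""
    K = rbf(x_train, x_train) + noise ** 2 * np.eye(len(x_train))
    K_star = rbf(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

# Noisy observations of a smooth "sales" trend; the posterior mean
# recovers the trend at unseen inputs
x = np.linspace(0.0, 10.0, 50)
y = np.sin(x) + 0.05 * np.random.default_rng(0).standard_normal(50)
x_new = np.array([2.5, 7.5])
print(gp_posterior_mean(x, y, x_new))  # close to sin(2.5), sin(7.5)
```

Feature relevance determination, as mentioned in the abstract, comes from giving each input dimension its own lengthscale and letting marginal-likelihood optimization drive irrelevant dimensions to large lengthscales.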