
    Statistical Learning for the Spectral Analysis of Time Series Data

    Spectral analysis of biological processes poses a wide variety of complications. Statistical learning techniques in both the frequentist and Bayesian frameworks are required to overcome the unique and varied challenges that arise in analyzing these data in a meaningful way. This dissertation presents new methodologies to address problems in multivariate stationary and univariate nonstationary time series analysis. The first method is motivated by the analysis of heart rate variability (HRV) time series. Because these series are nonstationary, they pose a unique challenge: localized, accurate, and interpretable descriptions of both frequency and time are required. By reframing this question in a reduced-rank regression setting, we propose a novel approach that produces a low-dimensional, empirical basis that is localized in bands of time and frequency. To estimate this frequency-time basis, we apply penalized reduced-rank regression with singular value decomposition to the localized discrete Fourier transform. An adaptive sparse fused lasso penalty is applied to the left and right singular vectors, resulting in low-dimensional measures that are interpretable as localized bands in time and frequency. We then apply this method to interpret the power spectrum of HRV measured on a single person over the course of a night. The second method considers the analysis of high-dimensional resting-state electroencephalography recorded on a group of first-episode psychosis subjects compared to a group of healthy controls. This analysis poses two challenges: first, estimating the spectral density matrix in a high-dimensional setting, and second, incorporating covariates into the estimate of the spectral density. To address these, we use a Bayesian factor model which decomposes the Fourier transform of the time series into a matrix of factors and a vector of factor loadings. The factor model is then embedded into a mixture model with covariate-dependent mixture weights. The method is then applied to examine differences in the power spectrum between first-episode psychosis subjects and healthy controls. Public health significance: As collection methods for time series data become ubiquitous in biomedical research, there is an increasing need for statistical methodology that is robust enough to handle the complicated and potentially high-dimensional nature of the data while retaining the flexibility needed to answer real-world questions of interest.
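
    As an illustrative sketch only (not the dissertation's implementation), the core idea of the first method can be mimicked by computing a localized discrete Fourier transform and extracting a low-rank time-frequency basis via the singular value decomposition; the adaptive sparse fused lasso penalty on the singular vectors is omitted here, and the signal, sampling rate, and window length are arbitrary assumptions.

```python
# Minimal sketch: localized DFT (short-time Fourier transform) followed by a
# reduced-rank (SVD) decomposition of the resulting power matrix.
import numpy as np
from scipy.signal import stft

rng = np.random.default_rng(0)
fs = 4.0                                    # assumed sampling rate (Hz)
t = np.arange(0, 3600, 1 / fs)              # one hour of hypothetical data
x = np.sin(2 * np.pi * 0.1 * t) + 0.5 * rng.standard_normal(t.size)

# Localized DFT over sliding windows
f, tau, Z = stft(x, fs=fs, nperseg=256, noverlap=128)
P = np.abs(Z) ** 2                          # frequency-by-time power matrix

# Reduced-rank approximation: keep the leading singular pairs
U, s, Vt = np.linalg.svd(P, full_matrices=False)
rank = 2
freq_basis = U[:, :rank]                    # columns ~ frequency-band patterns
time_basis = Vt[:rank, :]                   # rows ~ time-localized activations
print("variance explained:", (s[:rank] ** 2).sum() / (s ** 2).sum())
```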

    Idl Signal Processing Library 1.0

    We make available a library of documented IDL .pro files as well as a shareable object library that allows IDL to call routines from LAPACK. The routines are for use in the spectral analysis of time series data. The primary focus of these routines is David Thomson's multitaper methods, but a wider range of functions will be made available in future revisions of the submission. At present, routines are provided to carry out the following operations: calculate prolate spheroidal sequences and eigenvalues, project time series into frequency bands, calculate spectral estimates with or without moving windows, and calculate the cross-coherence between two time series as a function of frequency as well as the coherence between frequencies for a single time series.
    Comment: 13 IDL .pro files, 1 .html file, 1 .ps file, 1 license file. Download the source for the IDL files (save as .tar.gz). Read idl_lib.ps for instructions on use. Originally submitted to the neuro-sys archive, which was never publicly announced (was 9801001).
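
    The library itself is written in IDL; as a rough Python analogue of the multitaper estimate it implements, the sketch below builds discrete prolate spheroidal (Slepian) tapers with SciPy and averages the resulting eigenspectra. Window length, time-bandwidth product, and taper count are arbitrary choices, not values taken from the library.

```python
# Basic Thomson multitaper spectral estimate using DPSS tapers (unweighted average).
import numpy as np
from scipy.signal.windows import dpss

rng = np.random.default_rng(1)
N, fs = 1024, 100.0
t = np.arange(N) / fs
x = np.sin(2 * np.pi * 10 * t) + rng.standard_normal(N)

NW, K = 4, 7                                # time-bandwidth product, number of tapers
tapers, eigvals = dpss(N, NW, Kmax=K, return_ratios=True)

# Eigenspectra: periodogram of the data multiplied by each taper, then averaged
spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2 / fs
psd = spectra.mean(axis=0)
freqs = np.fft.rfftfreq(N, d=1 / fs)
print("spectral peak near:", freqs[np.argmax(psd)], "Hz")
```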

    Modeling user navigation

    This paper proposes the use of neural networks as a tool for studying navigation within virtual worlds. Results indicate that the network learned to predict the next step of a given trajectory. Analysis of the hidden layer shows that the network was able to differentiate between two groups of users identified on the basis of their performance on a spatial task. Time series analysis of hidden node activation values and input vectors suggested that certain hidden units become specialised for place and heading, respectively. The benefits of this approach and the possibility of extending the methodology to the study of navigation in Human Computer Interaction applications are discussed.
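
    As a hedged sketch (not the paper's architecture, data, or analysis), the next-step prediction task described above can be illustrated by training a small neural network to map the current position and heading of a synthetic trajectory to the next one.

```python
# Next-step trajectory prediction with a small multilayer perceptron (synthetic data).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
steps = 2000
heading = np.cumsum(0.05 * rng.standard_normal(steps))          # slowly drifting heading
pos = np.cumsum(np.column_stack([np.cos(heading), np.sin(heading)]), axis=0)
states = np.column_stack([pos, heading])                        # (steps, 3): x, y, heading

X, y = states[:-1], states[1:]                                  # current state -> next state
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:1500], y[:1500])
print("held-out R^2:", model.score(X[1500:], y[1500:]))
```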

    Spectral Analysis Of Business Cycles In The Visegrad Group Countries

    This paper examines the business cycle properties of the Visegrad Group countries. The main objective is to identify business cycles in these countries and to study the relationships between them. The author applies a modification of Fourier analysis to estimate cycle amplitudes and frequencies, which allows for a more precise estimation of cycle characteristics than the traditional approach. Cross-spectral analysis of the GDP cyclical components of the Czech Republic, Hungary, Poland, and Slovakia makes it possible to assess the degree of business cycle synchronization between the countries.
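
    The cross-spectral synchronization measure mentioned above can be illustrated with a standard coherence estimate; the sketch below (not the author's modified Fourier procedure) uses synthetic quarterly cyclical components rather than the actual GDP series.

```python
# Cross-spectral coherence between two hypothetical GDP cyclical components.
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(3)
n = 240                                      # e.g. 60 years of quarterly observations
common = np.sin(2 * np.pi * np.arange(n) / 32)                  # shared ~8-year cycle
cycle_a = common + 0.5 * rng.standard_normal(n)
cycle_b = 0.8 * common + 0.5 * rng.standard_normal(n)

freqs, coh = coherence(cycle_a, cycle_b, fs=4.0, nperseg=64)    # fs = 4 quarters per year
idx = np.argmax(coh[1:]) + 1                                    # skip the zero frequency
print("peak coherence", coh[idx], "at", freqs[idx], "cycles per year")
```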

    A non-linear analysis of Gibson's paradox in the Netherlands, 1800-2012

    This paper adopts a multivariate, non-linear framework to analyse Gibson’s paradox in the Netherlands over the period 1800-2012. Specifically, SSA (singular spectrum analysis) and MSSA (multichannel singular spectrum analysis) techniques are used. It is shown that changes in monetary policy regimes or volatility in the price of gold cannot by themselves account for the behaviour of government bond yields and prices in the Netherlands over the last 200 years. However, including changes in the real rate of return on capital, M1, the primary credit rate, expected inflation, and the purchasing power of money enables a non-linear model to account for a sizeable percentage of the total variance of Dutch bond yields.
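
    As a minimal single-channel SSA sketch (the paper applies SSA and its multichannel extension MSSA to the actual Dutch series; this only illustrates the basic decomposition on synthetic data), the steps are: embed the series into a trajectory matrix, take its SVD, and reconstruct selected components by anti-diagonal averaging.

```python
# Basic singular spectrum analysis: embedding, SVD, and diagonal averaging.
import numpy as np

def ssa_reconstruct(x, window, components):
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix: lagged copies of the series as columns
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Rank-limited reconstruction from the selected eigentriples
    X_hat = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in components)
    # Anti-diagonal averaging back to a one-dimensional series
    recon = np.zeros(n)
    counts = np.zeros(n)
    for col in range(k):
        recon[col:col + window] += X_hat[:, col]
        counts[col:col + window] += 1
    return recon / counts

rng = np.random.default_rng(4)
t = np.arange(213)                            # e.g. annual observations, 1800-2012
series = 0.02 * t + np.sin(2 * np.pi * t / 20) + 0.3 * rng.standard_normal(t.size)
trend_plus_cycle = ssa_reconstruct(series, window=40, components=[0, 1, 2])
```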

    Periodicities of FX Markets in Intrinsic Time

    This paper utilises advanced methods from Fourier analysis to describe financial ultra-high-frequency transaction data. The Lomb-Scargle Fourier transform is used to account for the irregular spacing in the time domain. It provides a natural framework in which the power spectra of different inhomogeneous time series processes can be estimated easily and quickly, without significant computational effort, in contrast to the common econometric approaches in the finance literature. An event-based approach (intrinsic time), which by its own nature is inhomogeneous in time, is employed using different event thresholds to filter the foreign exchange tick data, leading to a power-law relationship. The calculated spectral density demonstrates that the price process in intrinsic time contains different periodic components, especially in the medium to long term, implying the existence of new stylised facts of ultra-high-frequency data in the frequency domain.
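
    As a hedged illustration (synthetic data, not the paper's tick-data pipeline or intrinsic-time filtering), the sketch below computes a Lomb-Scargle periodogram on irregularly spaced observations, which is the tool the paper uses to estimate power spectra without resampling to a regular grid.

```python
# Lomb-Scargle periodogram of an irregularly sampled signal.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(5)
# Irregular observation times, as with transaction (tick) data
t = np.sort(rng.uniform(0, 100, size=500))
y = np.sin(2 * np.pi * 0.2 * t) + 0.5 * rng.standard_normal(t.size)
y -= y.mean()                                 # work with a zero-mean signal

freqs_cyc = np.linspace(0.01, 2.0, 400)       # frequencies in cycles per unit time
power = lombscargle(t, y, 2 * np.pi * freqs_cyc)   # lombscargle expects angular frequencies
print("dominant frequency:", freqs_cyc[np.argmax(power)], "cycles per unit time")
```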

    A Study of Dynamic Relationship between Housing Values and Interest Rate in the Korean Housing Market

    The goal of this study is to identify the long-term relationship between housing values and the interest rate in the Korean housing market, using cointegration tests and spectral analysis. The results show a negative (-) long-term equilibrium relationship between housing values and the interest rate. Moreover, a Granger causality test of the short-term dynamic relationship between these variables indicates one-way causality from the interest rate to the rate of change of housing values, and a transfer function model confirms the causal structure of this relationship in detail. The results suggest that interest rate adjustment policy can work very effectively in the Korean housing market and can contribute to forecasting future changes in housing values.
    Keywords: Dynamic relationship; Housing value; Interest rate; Cointegration and spectral analysis; Long-term equilibrium
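
    The two tests named above can be sketched with statsmodels; the example below uses synthetic series constructed to be negatively related, not the Korean housing data, and the lag order is an arbitrary assumption.

```python
# Engle-Granger cointegration test and Granger causality test on synthetic data.
import numpy as np
from statsmodels.tsa.stattools import coint, grangercausalitytests

rng = np.random.default_rng(6)
n = 300
interest_rate = np.cumsum(0.1 * rng.standard_normal(n))         # random-walk interest rate
housing_value = -2.0 * interest_rate + rng.standard_normal(n)   # negatively cointegrated by construction

# Engle-Granger test: a small p-value suggests a long-run equilibrium relationship
t_stat, p_value, _ = coint(housing_value, interest_rate)
print("cointegration p-value:", p_value)

# Granger causality: does the interest rate (second column) help predict housing values (first column)?
data = np.column_stack([housing_value, interest_rate])
results = grangercausalitytests(data, maxlag=4)
```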

    Measuring Business Cycles: The Real Business Cycle Approach and Related Controversies

    This paper presents and discusses the measurement of business cycles carried out by the real business cycle (RBC) approach. It shows how the controversy extends both within and beyond the RBC approach. It also shows how the methodological debate that took place in the 1940s about the relationship between theory and measurement is revived in modern discussions of business cycle measurement. The paper concludes that this relationship still raises fierce contention.
    Keywords: Business cycles; Real business cycles; Measurement

    Optimal HP filtering for South Africa

    Among the various methods used to identify the business cycle from aggregate data, the Hodrick-Prescott filter has become an industry standard: it ‘identifies’ the business cycle by removing low-frequency information, thereby smoothing the data. Since the filter’s inception in 1980, the value of the smoothing constant for quarterly data has been set at a ‘default’ of 1600, following the suggestion of Hodrick and Prescott (1980). This paper argues that this default value is inappropriate because of its ad hoc nature and problematic underlying assumptions. Instead, this paper uses the method of optimal filtering, developed by Pedersen (1998, 2001, 2002), to determine the optimal value of the smoothing constant for South Africa. The optimal smoothing constant is the value that least distorts the frequency information of the time series. The result depends on both the censoring rule for the duration of business cycles and the structure of the economy. The paper raises a number of important issues concerning the practical use of the HP filter and provides an easily replicable method in the form of MATLAB code.
    Keywords: Hodrick-Prescott filter, Spectral analysis, Ideal filtering, Optimal filtering, Distortionary filtering, Business cycles, MATLAB
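
    The paper's replicable method is provided as MATLAB code; as a hedged Python sketch of the filtering step under discussion, the example below applies the Hodrick-Prescott filter with the conventional smoothing constant of 1600 to a synthetic quarterly series and compares it with an alternative value, mirroring the idea that the default can be replaced by an optimally chosen constant.

```python
# Hodrick-Prescott filtering of a synthetic quarterly series with two smoothing constants.
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

rng = np.random.default_rng(7)
n = 200                                       # hypothetical quarterly GDP-like series (in logs)
trend_true = np.cumsum(0.3 + 0.05 * rng.standard_normal(n))
cycle_true = np.sin(2 * np.pi * np.arange(n) / 24)
log_gdp = trend_true + cycle_true + 0.1 * rng.standard_normal(n)

cycle, trend = hpfilter(log_gdp, lamb=1600)           # the conventional quarterly default
cycle_alt, trend_alt = hpfilter(log_gdp, lamb=400)    # an alternative constant, for comparison
print("cycle std (lamb=1600 vs 400):", cycle.std(), cycle_alt.std())
```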