Essays on noncausal and noninvertible time series
Over the last two decades, there has been growing interest among economists in nonfundamental univariate processes, generally represented by noncausal and noninvertible time series. These processes have become increasingly popular due to their ability to capture nonlinear dynamics such as volatility clustering, asymmetric cycles, and local explosiveness, all of which are commonly observed in macroeconomics and finance. In particular, the incorporation of both past and future components makes noncausal and noninvertible processes attractive for modeling forward-looking behavior in economic activity. However, the classical techniques for analyzing time series models are largely limited to their causal and invertible counterparts. This dissertation contributes to the field by providing testing and estimation tools that are robust to noncausality and noninvertibility. In the first chapter, "Quantile Autoregression-Based Non-causality Testing", we investigate the statistical properties of empirical conditional quantiles of noncausal processes. Specifically, we show that, in contrast to their causal counterparts, the quantile autoregression (QAR) estimates for noncausal processes do not remain constant across quantiles. Furthermore, we demonstrate that noncausal autoregressive processes admit nonlinear representations for conditional quantiles given past observations. Exploiting these properties, we propose three novel non-causality tests for non-Gaussian processes within the QAR framework. The tests are constructed either by verifying the constancy of the slope coefficients or by applying a misspecification test of the linear QAR model over different quantiles of the process. Numerical experiments examine the finite-sample performance of the testing strategies, comparing different specification tests for dynamic quantiles with the Kolmogorov-Smirnov constancy test.
The new methodology is applied to time series from financial markets to investigate the presence of speculative bubbles. The extension of the specification-test approach to AR processes driven by heteroskedastic innovations is studied through simulations. The performance of QAR estimates of noncausal processes at extreme quantiles is also explored. In the second chapter, "Estimation of Time Series Models Using the Empirical Distribution of Residuals", we introduce a novel estimation technique for general linear time series models, potentially noninvertible and noncausal, utilizing the empirical cumulative distribution function of the residuals. The proposed method relies on the generalized spectral cumulative function to characterize the pairwise dependence of residuals at all lags. Model identification is achieved by exploiting the information in the joint distribution of residuals under the iid assumption. This method yields consistent estimates of the model parameters without imposing stringent conditions on higher-order moments or any distributional assumptions on the innovations beyond non-Gaussianity. We investigate the asymptotic distribution of the estimates by employing a smoothed cumulative distribution function to approximate the indicator function, given the non-differentiability of the original loss function. Efficiency improvements can be achieved by properly choosing the scaling parameter for the residuals. Finite-sample properties are explored through Monte Carlo simulations. An empirical application illustrates the methodology by fitting the daily trading volume of Microsoft stock with autoregressive models admitting a noncausal representation. The flexibility of the cumulative distribution function permits the proposed method to be extended to more general dependence structures where the innovations are only conditional-mean or conditional-quantile independent.
In the third chapter, "Directional Predictability Tests", joint with Carlos Velasco, we propose new tests of predictability for non-Gaussian sequences that may display general nonlinear dependence in higher-order properties. We test the null of a martingale difference against parametric alternatives which can introduce linear or nonlinear dependence, as generated by ARMA and all-pass restricted ARMA models, respectively. We also develop tests to check for linear predictability under the white noise null hypothesis parameterized by an all-pass model driven by martingale difference innovations, and tests of nonlinear predictability on ARMA residuals. Our Lagrange Multiplier tests are developed from a loss function based on pairwise dependence measures that identify the predictability of levels. We provide asymptotic and finite-sample analysis of the properties of the new tests and investigate the predictability of different series of financial returns. This thesis has been possible thanks to the financial support of grant BES-2017-082695 from the Ministerio de Economía, Industria y Competitividad. Programa de Doctorado en Economía por la Universidad Carlos III de Madrid. Presidente: Miguel Ángel Delgado González.- Secretario: Manuel Domínguez Toribio.- Vocal: Majid M. Al Sadoo
Notion of Neutrosophic Risk and Financial Markets Prediction
This paper presents an application of neutrosophic logic to the prediction of financial markets.
Estimation and prediction of travel time from loop detector data for intelligent transportation systems applications
With the advent of Advanced Traveler Information Systems (ATIS), short-term travel time prediction is becoming increasingly important. Travel time can be obtained directly from instrumented test vehicles, license plate matching, or probe vehicles, or from indirect methods such as loop detectors. Because of their widespread deployment, travel time estimation from loop detector data is one of the most widely used methods. However, the major criticism of loop detector data is the high probability of error due to the prevalence of equipment malfunctions. This dissertation presents methodologies for estimating and predicting travel time from loop detector data after correcting for errors. The methodology is a multi-stage process comprising the correction of data, estimation of travel time, and prediction of travel time, and each stage involves the judicious use of suitable techniques. The techniques selected for each of these stages are detailed below. The test sites are freeways in San Antonio, Texas, which are equipped with dual inductance loop detectors and AVI.
- A constrained non-linear optimization approach by the Generalized Reduced Gradient (GRG) method for data reduction and quality control, including a check of the accuracy of data from a series of detectors for conservation of vehicles, in addition to the commonly adopted checks.
- A theoretical model based on traffic flow theory for travel time estimation under both off-peak and peak traffic conditions, using flow, occupancy and speed values obtained from the detectors.
- Application of a recently developed technique, Support Vector Machines (SVM), for travel time prediction. An Artificial Neural Network (ANN) method is also developed for comparison.
Thus, a complete system for the estimation and prediction of travel time from loop detector data is detailed in this dissertation. Simulated data from the CORSIM simulation software are used to validate the results.
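The SVM-versus-ANN comparison in the prediction stage can be sketched with scikit-learn. This is an illustration only, not the dissertation's code: the synthetic "detector" features and the functional form of travel time below are assumptions made purely for the demo.

```python
# Illustrative sketch: predict travel time from loop-detector-style
# features (flow, occupancy, speed) with an SVM regressor, using a
# small neural network as the comparison model, as in the third stage
# described above. Data are synthetic stand-ins for field data.
import numpy as np
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)

n = 1000
flow = rng.uniform(200, 2000, n)      # veh/h
occ = rng.uniform(2, 40, n)           # percent occupancy
speed = rng.uniform(20, 70, n)        # mph
# Assumed noisy nonlinear relation between detector readings and
# travel time (minutes) over a fixed segment.
tt = 5.0 + 60.0 / speed + 0.05 * occ + rng.normal(0, 0.3, n)

X = np.column_stack([flow, occ, speed])
split = 800
X_tr, X_te, y_tr, y_te = X[:split], X[split:], tt[:split], tt[split:]

svm = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1))
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(32,),
                                 max_iter=2000, random_state=0))
svm.fit(X_tr, y_tr)
ann.fit(X_tr, y_tr)

svm_mae = mean_absolute_error(y_te, svm.predict(X_te))
ann_mae = mean_absolute_error(y_te, ann.predict(X_te))
print("SVR MAE:", svm_mae)
print("MLP MAE:", ann_mae)
```

Feature scaling matters for both models here, which is why each is wrapped in a pipeline with a `StandardScaler`.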
Extreme Value distribution for singular measures
In this paper we perform an analytical and numerical study of Extreme Value distributions in discrete dynamical systems that have a singular measure. Using the block maxima approach described in Faranda et al. [2011], we show numerically that the Extreme Value distribution for these maps can be associated with the Generalised Extreme Value family, with parameters that scale with the information dimension. The numerical analyses are performed on a few low-dimensional maps. For the middle-third Cantor set and the Sierpinski triangle obtained using Iterated Function Systems, the experimental parameters show very good agreement with the theoretical values. For strange attractors like the Lozi and Hénon maps, a slower convergence to the Generalised Extreme Value distribution is observed. Even in the presence of large statistics, the observed convergence is slower than for maps with an absolutely continuous invariant measure. Nevertheless, within the computed uncertainty range, the results are in good agreement with the theoretical estimates.
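A minimal block-maxima experiment in the spirit of the abstract can be sketched with scipy. This is an illustration, not the paper's code: the observable, block size, and reference point below are common choices in this literature but are assumptions here.

```python
# Sketch of the block maxima approach: iterate the Henon map, record a
# distance-based observable g(x) = -log|x - x0| for a reference point
# x0 on the attractor, take maxima over fixed-size blocks, and fit a
# Generalised Extreme Value (GEV) distribution to the block maxima.
import numpy as np
from scipy.stats import genextreme

# Iterate the Henon map (a = 1.4, b = 0.3) after a transient.
a, b = 1.4, 0.3
x, y = 0.1, 0.1
n, burn = 200_000, 1000
xs = np.empty(n)
for i in range(burn + n):
    x, y = 1 - a * x * x + y, b * x
    if i >= burn:
        xs[i - burn] = x

# Observable peaks when the orbit returns close to the reference point.
x0 = xs[0]
obs = -np.log(np.abs(xs[1:] - x0) + 1e-300)

# Block maxima and a GEV fit (genextreme returns shape, loc, scale).
block = 1000
m = len(obs) // block
maxima = obs[: m * block].reshape(m, block).max(axis=1)
shape, loc, scale = genextreme.fit(maxima)
print("GEV shape, loc, scale:", shape, loc, scale)
```

For maps with a singular invariant measure, the paper's point is that the fitted parameters scale with the information dimension rather than with the naive phase-space dimension.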
Generalized spectral tests for the martingale difference hypothesis
This article proposes a test for the martingale difference hypothesis (MDH) using dependence measures related to the characteristic function. The MDH typically has been tested using the sample autocorrelations or in the spectral domain using the periodogram. Tests based on these statistics are inconsistent against uncorrelated non-martingale processes. Here, we generalize the spectral test of Durlauf (1991) for testing the MDH taking into account linear and nonlinear dependence. Our test considers dependence at all lags and is consistent against general pairwise nonparametric Pitman's local alternatives converging at the parametric rate n^{-1/2}, with n the sample size. Furthermore, with our methodology there is no need to choose a lag order, to smooth the data or to formulate a parametric alternative. Our approach could be extended to specification testing of the conditional mean of possibly nonlinear models. The asymptotic null distribution of our test depends on the data generating process, so a bootstrap procedure is proposed and theoretically justified. Our bootstrap test is robust to higher-order dependence, in particular to conditional heteroskedasticity. A Monte Carlo study examines the finite sample performance of our test and shows that it is more powerful than some competing tests. Finally, an application to the S&P 500 stock index and exchange rates highlights the merits of our approach.
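The overall recipe (a statistic aggregating dependence over all lags, with a bootstrap null distribution robust to conditional heteroskedasticity) can be sketched with a toy stand-in. The Cramér-von Mises-type statistic below is built from plain autocovariances, a deliberate simplification of the article's characteristic-function-based measure; the function names and tuning choices are assumptions.

```python
# Toy illustration of the testing recipe: a Durlauf-style statistic
# n * sum_j (gamma_j / (j*pi))^2 over all lags j >= 1, with a wild
# bootstrap (Rademacher signs) to approximate its null distribution
# while remaining robust to conditional heteroskedasticity.
import numpy as np

def cvm_stat(y):
    y = y - y.mean()
    n = len(y)
    s = 0.0
    for j in range(1, n):
        gamma_j = np.dot(y[j:], y[:-j]) / n
        s += (gamma_j / (j * np.pi)) ** 2
    return n * s

def wild_bootstrap_pvalue(y, n_boot=200, seed=0):
    rng = np.random.default_rng(seed)
    t_obs = cvm_stat(y)
    count = 0
    for _ in range(n_boot):
        w = rng.choice([-1.0, 1.0], size=len(y))   # Rademacher weights
        if cvm_stat(y * w) >= t_obs:
            count += 1
    return (count + 1) / (n_boot + 1)

rng = np.random.default_rng(42)
mds = rng.normal(size=500)          # iid noise: MDH holds
ar = np.zeros(500)                  # AR(1): predictable in mean
for t in range(1, 500):
    ar[t] = 0.5 * ar[t - 1] + rng.normal()

p_iid = wild_bootstrap_pvalue(mds)
p_ar = wild_bootstrap_pvalue(ar)
print("p-value, iid noise:", p_iid)
print("p-value, AR(1):   ", p_ar)
```

The 1/j weighting downweights long lags so the infinite-lag sum converges, mirroring the "dependence at all lags, no lag-order choice" property claimed in the abstract.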
Long term memories of developed and emerging markets: using the scaling analysis to characterize their stage of development
The scaling properties encompass, in a simple analysis, many of the volatility characteristics of financial markets. That is why we use them to probe the different degrees of market development. We empirically study the scaling properties of daily Foreign Exchange rates, Stock Market indices and fixed income instruments using the generalized Hurst approach. We show that the scaling exponents are associated with characteristics of the specific markets and can be used to differentiate markets by their stage of development. The robustness of the results is tested by both Monte Carlo studies and a computation of the scaling in the frequency domain. Comment: 46 pages, 7 figures, accepted for publication in Journal of Banking & Finance.
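The generalized Hurst approach mentioned above can be sketched compactly: estimate H(q) from the scaling of the q-th order structure function, S_q(tau) ~ tau^{q H(q)}. This is an illustration under assumed choices (q = 2, a short range of lags, a simulated random walk), not the paper's implementation.

```python
# Generalized Hurst exponent via structure-function scaling:
# S_q(tau) = mean(|x(t + tau) - x(t)|^q) ~ tau^(q * H(q)),
# so H(q) is the log-log regression slope divided by q.
import numpy as np

def generalized_hurst(x, q=2.0, taus=range(1, 20)):
    log_s, log_t = [], []
    for tau in taus:
        inc = np.abs(x[tau:] - x[:-tau]) ** q
        log_s.append(np.log(inc.mean()))
        log_t.append(np.log(tau))
    slope = np.polyfit(log_t, log_s, 1)[0]
    return slope / q

rng = np.random.default_rng(7)
walk = np.cumsum(rng.normal(size=20_000))  # Brownian-like: H near 0.5
h = generalized_hurst(walk)
print("H(2) estimate:", h)
```

An uncorrelated random walk gives H near 0.5; the paper's point is that deviations of the estimated exponents from this benchmark carry information about a market's stage of development.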
Econometrics: A bird's eye view
As a unified discipline, econometrics is still relatively young and has been transforming and expanding very rapidly over the past few decades. Major advances have taken place in the analysis of cross-sectional data by means of semi-parametric and non-parametric techniques. Heterogeneity of economic relations across individuals, firms and industries is increasingly acknowledged, and attempts have been made to take it into account either by integrating out its effects or by modeling the sources of heterogeneity when suitable panel data exist. The counterfactual considerations that underlie policy analysis and treatment evaluation have been given a more satisfactory foundation. New time series econometric techniques have been developed and employed extensively in the areas of macroeconometrics and finance. Non-linear econometric techniques are used increasingly in the analysis of cross-section and time series observations. Applications of Bayesian techniques to econometric problems have been given new impetus, largely thanks to advances in computer power and computational techniques. The use of Bayesian techniques has in turn provided investigators with a unifying framework in which the tasks of forecasting, decision making, model evaluation and learning can be considered as parts of the same interactive and iterative process, thus paving the way for establishing the foundations of "real-time econometrics". This paper attempts to provide an overview of some of these developments.
Approximate cost-efficient sequential designs for binary response models with application to switching measurements
The efficiency of an experimental design is ultimately measured in terms of the time and resources needed for the experiment. Optimal sequential (multi-stage) design is studied in the situation where each stage involves a fixed cost. The problem is motivated by switching measurements on superconducting Josephson junctions. In this quantum mechanical experiment, sequences of current pulses are applied to the Josephson junction sample and a binary response indicating the presence or absence of a voltage response is measured. The binary response can be modeled by a generalized linear model with the complementary log-log link function. The other models considered are the logit model and the probit model. For these three models, the approximately optimal sample size for the next stage is determined as a function of the current Fisher information and the stage cost. The cost-efficiency of the proposed design is demonstrated in simulations based on real data from switching measurements. The results can be directly applied to switching measurements and may lead to substantial savings in the time needed for the experiment. Comment: revised version, accepted for publication.
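The Fisher information entering the stage-size rule can be sketched for the three link functions named above. This is an illustration of the per-trial information only; the paper's cost-based rule for the next-stage sample size is not reproduced, and the function name is an assumption.

```python
# Per-observation Fisher information for a binary response with linear
# predictor eta under the three links considered: cloglog, logit,
# probit. The generic GLM formula is I(eta) = (dp/deta)^2 / (p(1 - p)).
import numpy as np
from scipy.stats import norm

def fisher_info(eta, link="cloglog"):
    if link == "cloglog":
        # p = 1 - exp(-exp(eta)); dp/deta = exp(eta - exp(eta))
        p = 1.0 - np.exp(-np.exp(eta))
        dp = np.exp(eta - np.exp(eta))
    elif link == "logit":
        p = 1.0 / (1.0 + np.exp(-eta))
        dp = p * (1.0 - p)
    elif link == "probit":
        p = norm.cdf(eta)
        dp = norm.pdf(eta)
    else:
        raise ValueError(link)
    return dp ** 2 / (p * (1.0 - p))

# Information peaks near the steep part of the response curve, which
# is why sequential designs concentrate current pulses there.
etas = np.linspace(-3, 2, 501)
for link in ("cloglog", "logit", "probit"):
    best = etas[np.argmax(fisher_info(etas, link))]
    print(link, "most informative eta ~", round(best, 2))
```

For the symmetric logit and probit links the most informative stimulus sits at eta = 0 (p = 1/2), while the asymmetric cloglog link pushes it to a positive eta, i.e. a response probability around 0.8.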