
    Semiparametric Estimation of Long-Memory Models

    This chapter reviews semiparametric methods of inference on different aspects of long memory time series. The main focus is on estimation of the memory parameter of linear models, analyzing bandwidth choice, bias reduction techniques and robustness properties of different estimates, with some emphasis on nonstationarity and trending behaviors. These techniques extend naturally to multivariate series, where the important issues are the estimation of the long-run relationship and testing for fractional cointegration. Specific techniques for the estimation of the degree of persistence of volatility for nonlinear time series are also considered.
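
    As a concrete illustration of the semiparametric approach reviewed here, the sketch below implements one classical estimator of the memory parameter, the GPH log-periodogram regression; the bandwidth rule and the white-noise example are illustrative choices, not prescriptions from the chapter.

```python
# A minimal sketch of the GPH log-periodogram estimator of the memory
# parameter d: regress the log periodogram on log(4 sin^2(lambda/2)) over
# the first m Fourier frequencies. The bandwidth m is a tuning choice.
import numpy as np

def gph_estimate(x, m=None):
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        m = int(n ** 0.5)                       # a common rule-of-thumb bandwidth
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n  # Fourier frequencies
    dft = np.fft.fft(x - x.mean())
    periodogram = (np.abs(dft[1:m + 1]) ** 2) / (2.0 * np.pi * n)
    regressor = np.log(4.0 * np.sin(lam / 2.0) ** 2)
    # OLS slope; the GPH estimate of d is minus the slope
    slope = np.polyfit(regressor, np.log(periodogram), 1)[0]
    return -slope

# Example: white noise should give an estimate of d close to 0
rng = np.random.default_rng(0)
print(gph_estimate(rng.standard_normal(2048)))
```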

    Forecasting under Long Memory and Nonstationarity

    Long memory in the sense of slowly decaying autocorrelations is a stylized fact in many time series from economics and finance. The fractionally integrated process is the workhorse model for the analysis of these time series. Nevertheless, there is mixed evidence in the literature concerning its usefulness for forecasting and how forecasting based on it should be implemented. Employing pseudo-out-of-sample forecasting on inflation and realized volatility time series and simulations, we show that methods based on fractional integration are clearly superior to alternative methods not accounting for long memory, including autoregressions and exponential smoothing. Our proposal of choosing a fixed fractional integration parameter of d = 0.5 a priori yields the best results overall, capturing long memory behavior but overcoming the deficiencies of methods using an estimated parameter. Regarding the implementation of forecasting methods based on fractional integration, we use simulations to compare local and global semiparametric and parametric estimators of the long memory parameter from the Whittle family and provide asymptotic theory backed up by simulations to compare different mean estimators. Both of these analyses lead to new results, which are also of interest outside the realm of forecasting.
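
    The sketch below illustrates the kind of forecast the fixed choice d = 0.5 leads to, using the AR(∞) representation of a pure fractionally integrated process; it assumes ARFIMA(0, d, 0) dynamics and a simple sample-mean estimator, and is not the authors' complete procedure.

```python
# A minimal sketch of one-step forecasting with a fixed memory parameter
# d = 0.5, assuming ARFIMA(0, d, 0) dynamics around a constant mean.
import numpy as np

def fracdiff_weights(d, n):
    """Coefficients pi_k of (1 - B)^d via the standard recursion."""
    pi = np.empty(n)
    pi[0] = 1.0
    for k in range(1, n):
        pi[k] = pi[k - 1] * (k - 1 - d) / k
    return pi

def one_step_forecast(x, d=0.5):
    """Predictor from the AR(infinity) form of (1 - B)^d (x_t - mu) = e_t."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()                        # simple mean estimator (illustrative)
    pi = fracdiff_weights(d, len(x) + 1)
    # x_{T+1} - mu = -sum_{k>=1} pi_k (x_{T+1-k} - mu) + e_{T+1}
    return mu - np.dot(pi[1:], (x - mu)[::-1])

rng = np.random.default_rng(1)
print(one_step_forecast(rng.standard_normal(500)))
```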

    Heteroskedastic Factor Vector Autoregressive Estimation of Persistent and Non Persistent Processes Subject to Structural Breaks

    In this paper the fractionally integrated heteroskedastic factor vector autoregressive (FI-HF-VAR) model is introduced. The proposed approach is characterized by minimal pretesting requirements and simplicity of implementation, also in very large systems, and performs well independently of integration properties and sources of persistence, whether deterministic or stochastic. It accounts for common features of different kinds, i.e. common integrated (of the fractional or integer type) or non-integrated stochastic factors, also featuring conditional heteroskedasticity, and common deterministic break processes. The proposed approach allows for accurate investigation of economic time series, from persistence and copersistence analysis to impulse responses and forecast error variance decomposition. Monte Carlo results strongly support the proposed methodology. Key words: long and short memory, structural breaks, fractionally integrated heteroskedastic factor vector autoregressive model.
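
    The sketch below shows only the generic factor-VAR building block (principal-components factors followed by a least-squares VAR on the factors); it is not the FI-HF-VAR estimator, which additionally handles fractional integration, conditional heteroskedasticity and break processes.

```python
# A generic sketch of the factor vector autoregression building block:
# extract common factors by principal components, then fit a VAR(1) to them.
import numpy as np

def pca_factors(X, r):
    """Principal-component factor estimates from a (T x N) panel."""
    Xs = (X - X.mean(0)) / X.std(0)            # standardize each series
    _, _, Vt = np.linalg.svd(Xs, full_matrices=False)
    return Xs @ Vt[:r].T                        # T x r factor estimates

def fit_var1(F):
    """Least-squares VAR(1) coefficient matrix A with F_t = A F_{t-1} + error."""
    Y, Z = F[1:], F[:-1]
    A_hat, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    return A_hat.T

rng = np.random.default_rng(2)
panel = rng.standard_normal((200, 30))          # illustrative data only
factors = pca_factors(panel, r=3)
print(fit_var1(factors).shape)                  # (3, 3)
```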

    Multistep forecasting of long memory series using fractional exponential models

    We develop forecasting methodology for the fractional exponential (FEXP) model. First, we devise algorithms for fast exact computation of the coefficients in the infinite-order autoregressive and moving average representations of a FEXP process. We also describe an algorithm to accurately approximate the autocovariances and to simulate realizations of the process. Next, we present a fast frequency-domain cross-validation method for selecting the order of the model. This model selection method is designed to yield the model which provides the best multistep forecast for the given lead time, without assuming that the process actually obeys a FEXP model. Finally, we use the infinite-order autoregressive coefficients of a fitted FEXP model to construct multistep forecasts of inflation in the United Kingdom. These forecasts are substantially different from those of a fitted ARFIMA model.
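
    The sketch below illustrates, for the short-memory exponential part only, how cepstral coefficients can be turned into AR(∞) and MA(∞) coefficients by exponentiating a power series; the paper's exact algorithms also handle the fractional, long-memory part of a FEXP model.

```python
# A simplified sketch: coefficients of exp(sum_k c_k z^k) via the standard
# power-series exponentiation recursion. The MA(infinity) weights use the
# cepstral coefficients c_k, the AR(infinity) weights use -c_k.
import numpy as np

def exp_series(c, n):
    """Coefficients of exp(sum_{k>=1} c_k z^k) up to order n (c[0] holds c_1)."""
    a = np.zeros(n + 1)
    a[1:len(c) + 1] = c
    b = np.zeros(n + 1)
    b[0] = 1.0
    for m in range(1, n + 1):
        b[m] = sum(k * a[k] * b[m - k] for k in range(1, m + 1)) / m
    return b

cepstrum = np.array([0.6, 0.2])           # illustrative cepstral coefficients
psi = exp_series(cepstrum, 20)            # MA(infinity) weights
pi_ = exp_series(-cepstrum, 20)           # AR(infinity) weights
print(np.convolve(psi, pi_)[:5])          # ~ (1, 0, 0, 0, 0): mutual inverses
```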

    The Exponential Model for the Spectrum of a Time Series: Extensions and Applications

    The exponential model for the spectrum of a time series and its fractional extensions are based on the Fourier series expansion of the logarithm of the spectral density. The coefficients of the expansion form the cepstrum of the time series. After deriving the cepstrum of important classes of time series processes, including those featuring long memory, we discuss likelihood inferences based on the periodogram, for which the estimation of the cepstrum yields a generalized linear model for exponential data with logarithmic link, focusing on the issue of separating the contribution of the long memory component to the log-spectrum. We then propose two extensions. The first replaces the logarithmic link with a more general Box-Cox link, which also encompasses the identity and the inverse links: this enables nesting alternative spectral estimation methods (autoregressive, exponential, etc.) within the same likelihood-based framework. Secondly, we propose a gradient boosting algorithm for the estimation of the log-spectrum and illustrate its potential for distilling the long memory component of the log-spectrum.
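
    As a rough illustration of cepstrum estimation, the sketch below uses a simple least-squares regression of the log periodogram on the cosine basis (with the Euler-Mascheroni bias correction); the paper itself works with the full periodogram-based likelihood, i.e. a generalized linear model for exponential data with logarithmic link.

```python
# A minimal sketch of estimating cepstral coefficients of the exponential
# (EXP) spectral model by regressing the log periodogram on a cosine basis.
import numpy as np

def exp_model_cepstrum(x, p):
    x = np.asarray(x, dtype=float)
    n = len(x)
    m = (n - 1) // 2                                   # Fourier frequencies
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n
    I = (np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2) / (2.0 * np.pi * n)
    y = np.log(I) + np.euler_gamma                     # approx. unbiased for log f
    X = np.column_stack([np.ones(m)] + [2.0 * np.cos(k * lam) for k in range(1, p + 1)])
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return theta                                       # theta_0, ..., theta_p

rng = np.random.default_rng(3)
print(exp_model_cepstrum(rng.standard_normal(1024), p=3))
```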

    Reactive traffic control mechanisms for communication networks with self-similar bandwidth demands

    Communication network architectures are in the process of being redesigned so that many different services are integrated within the same network. Due to this integration, traffic management algorithms need to balance the requirements of the traffic which the algorithms are directly controlling with the Quality of Service (QoS) requirements of other classes of traffic in the network. Of particular interest is one class of traffic, termed elastic traffic, that responds to dynamic feedback from the network regarding the amount of available resources. Examples of this type of traffic include the Available Bit Rate (ABR) service in Asynchronous Transfer Mode (ATM) networks and connections using the Transmission Control Protocol (TCP) in the Internet; both aim to utilise available bandwidth within a network. Reactive traffic management, such as that in the ABR service and TCP, depends explicitly on the dynamic bandwidth requirements of the other traffic currently using the network. In particular, there is significant evidence that a wide range of network traffic, including Ethernet, World Wide Web, Variable Bit Rate video and signalling traffic, is self-similar. The term self-similar refers to the characteristic of network traffic remaining bursty over a wide range of time scales. A closely associated characteristic of self-similar traffic is its long-range dependence (LRD), which refers to the significant correlations within the traffic. By utilising these correlations, greater predictability of network traffic can be achieved, and hence the performance of reactive traffic management algorithms can be enhanced.
    A predictive rate control algorithm, called PERC (Predictive Explicit Rate Control), is proposed in this thesis and targeted at the ABR service in ATM networks. By incorporating the LRD stochastic structure of background traffic, measurements of the bandwidth requirements of background traffic, and the delay associated with a particular ABR connection, a predictive algorithm is defined which provides explicit rate information that is conveyed to ABR sources. An enhancement to PERC, called PERC+, uses previous control information to correct prediction errors that occur for connections with larger round-trip delay. These algorithms have been extensively analysed with regard to their network performance, and simulation results show that queue lengths and cell loss rates are significantly reduced when they are deployed. An adaptive version of PERC has also been developed using real-time parameter estimates of self-similar traffic; it has excellent performance compared with standard ABR rate control algorithms such as ERICA. Since PERC and its enhancement PERC+ explicitly utilise the index of self-similarity, known as the Hurst parameter, the sensitivity of these algorithms to this parameter can be determined analytically. Research described in this thesis shows that the algorithms have an asymmetric sensitivity to the Hurst parameter, with significant sensitivity in the region where the parameter is underestimated as being close to 0.5. Simulation results reveal the same bias in the performance of the algorithms with respect to the Hurst parameter. In contrast, PERC is insensitive to estimates of the mean (using the sample mean estimator) and to estimates of the traffic variance, because the algorithm primarily utilises the correlation structure of the traffic to predict future bandwidth requirements.
    Sensitivity analysis falls into the area of investigative research, but it leads naturally to the area of robust control, where algorithms are designed so that uncertainty in traffic parameter estimation or modelling can be accommodated. An alternative robust design approach to the standard maximum entropy approach is proposed in this thesis, which uses the maximum likelihood function to develop the predictive rate controller. The likelihood function defines the proximity of a specific traffic model to the traffic data, and hence gives a measure of the performance of a chosen model. Maximising the likelihood function leads to optimising robust performance, and it is shown through simulations that the system performance is close to the optimal performance when compared with maximising the spectral entropy. There is still debate regarding the influence of LRD on network performance. This thesis also considers the question of the influence of LRD on traffic predictability, and demonstrates that predictive rate control algorithms that use only short-term correlations perform nearly as well as algorithms that utilise long-term correlations. Predictors based on LRD still outperform those using short-term correlations, but there is potential simplification in the design of predictors, since traffic predictability can be achieved using short-term correlations. This thesis forms a substantial contribution to the understanding of control in the case where self-similar processes form part of the overall system. Rather than doggedly pursuing self-similar control, a broader view has been taken in which the performance of the algorithms has been considered from a number of perspectives. A number of different research avenues lead on from this work, and these are outlined.
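
    The sketch below shows only the linear prediction step that LRD-aware controllers such as PERC build on: a minimum mean-square-error predictor of the next traffic sample under an assumed fractional Gaussian noise model with Hurst parameter H. It is not the PERC or PERC+ rate-control algorithm itself.

```python
# A sketch of LRD-based linear prediction: solve the normal (Yule-Walker
# type) equations for the p-lag predictor of fractional Gaussian noise.
import numpy as np

def fgn_autocov(k, H, sigma2=1.0):
    """Autocovariance of fractional Gaussian noise at lag k."""
    k = np.abs(np.asarray(k, dtype=float))
    return 0.5 * sigma2 * ((k + 1) ** (2 * H) - 2 * k ** (2 * H) + np.abs(k - 1) ** (2 * H))

def lrd_predictor_weights(H, p):
    """Weights on the last p samples (newest first) for the one-step predictor."""
    gamma = fgn_autocov(np.arange(p + 1), H)
    R = np.array([[gamma[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, gamma[1:p + 1])

# With H = 0.8 the weights on samples far back in the past stay sizeable,
# which is what gives LRD traffic its predictability.
w = lrd_predictor_weights(H=0.8, p=10)
recent = np.ones(10)                    # last 10 bandwidth measurements (illustrative)
print(float(w @ recent))
```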

    Modelos de memoria larga para series económicas y financieras

    This paper provides a review of time series models with long memory in the mean and in the conditional variance, with special attention to fractionally integrated ARMA (ARFIMA) models and to fractionally integrated GARCH and SV models. Their most important properties are reviewed and their application to modelling economic and financial time series is discussed. The main estimation methods proposed for these models are also described, and some tests for detecting the presence of long memory are reviewed. Finally, the paper reviews the main results on prediction of future values of long memory time series. We also acknowledge financial support from projects SEC97-1379 (CICYT) and PB98-0026 (DGCICYT).
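
    A minimal illustration of the defining long-memory property reviewed in the paper: the autocorrelations of an ARFIMA(0, d, 0) process decay hyperbolically, roughly like k^(2d-1), rather than exponentially. The recursion below is standard; the value d = 0.3 is purely illustrative.

```python
# Autocorrelation function of ARFIMA(0, d, 0) and its slow hyperbolic decay.
import numpy as np

def arfima0d0_acf(d, nlags):
    """rho(k) of ARFIMA(0, d, 0) via rho(k) = rho(k-1) * (k-1+d)/(k-d)."""
    rho = np.empty(nlags + 1)
    rho[0] = 1.0
    for k in range(1, nlags + 1):
        rho[k] = rho[k - 1] * (k - 1 + d) / (k - d)
    return rho

d = 0.3
acf = arfima0d0_acf(d, 200)
# rho(100) is still far from zero, and rho(200)/rho(100) is close to the
# asymptotic rate (200/100)^(2d - 1).
print(acf[100], acf[200] / acf[100], (200 / 100) ** (2 * d - 1))
```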