
    Spatial interpolation of high-frequency monitoring data

    Climate modelers generally require meteorological information on regular grids, but monitoring stations are, in practice, sited irregularly. Thus, there is a need to produce public data records that interpolate available data to a high density grid, which can then be used to generate meteorological maps at a broad range of spatial and temporal scales. In addition to point predictions, quantifications of uncertainty are also needed. One way to accomplish this is to provide multiple simulations of the relevant meteorological quantities conditional on the observed data, taking into account the various uncertainties in predicting a space-time process at locations with no monitoring data. Using a high-quality dataset of minute-by-minute measurements of atmospheric pressure in north-central Oklahoma, this work describes a statistical approach to carrying out these conditional simulations. Based on observations at 11 stations, conditional simulations were produced at two other sites with monitoring stations. The resulting point predictions are very accurate, and the multiple simulations produce well-calibrated prediction uncertainties for temporal changes in atmospheric pressure but are substantially overconservative for the uncertainties in the predictions of (undifferenced) pressure. Comment: Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) at http://dx.doi.org/10.1214/08-AOAS208 by the Institute of Mathematical Statistics (http://www.imstat.org/).
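    The core computation behind such conditional simulations can be sketched for a toy one-dimensional Gaussian process. This is a minimal illustration, not the paper's space-time model: the exponential covariance, the function names, and all parameters here are assumptions for the sketch.

```python
import numpy as np

def exp_cov(x1, x2, sigma2=1.0, length=1.0):
    """Exponential covariance between 1-D coordinate arrays (an assumed model)."""
    d = np.abs(x1[:, None] - x2[None, :])
    return sigma2 * np.exp(-d / length)

def conditional_simulate(x_obs, y_obs, x_new, n_sims=100, seed=0):
    """Draw simulations at x_new conditional on observations y_obs at x_obs,
    under a zero-mean Gaussian process with exponential covariance."""
    rng = np.random.default_rng(seed)
    K_oo = exp_cov(x_obs, x_obs) + 1e-8 * np.eye(len(x_obs))
    K_no = exp_cov(x_new, x_obs)
    K_nn = exp_cov(x_new, x_new)
    A = K_no @ np.linalg.inv(K_oo)
    mean = A @ y_obs                    # kriging point prediction
    cov = K_nn - A @ K_no.T             # conditional covariance
    L = np.linalg.cholesky(cov + 1e-8 * np.eye(len(x_new)))
    z = rng.standard_normal((n_sims, len(x_new)))
    return mean + z @ L.T               # one conditional simulation per row
```

    Each row of the output is one plausible realization at the prediction sites; the spread across rows quantifies prediction uncertainty, which is the quantity the paper calibrates.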

    Local Parametric Estimation in High Frequency Data

    In this paper, we give a general time-varying parameter model, where the multidimensional parameter possibly includes jumps. The quantity of interest is defined as the integrated value over time of the parameter process, $\Theta = T^{-1} \int_0^T \theta_t^* \, dt$. We provide a local parametric estimator (LPE) of $\Theta$ and conditions under which we can show a central limit theorem. Roughly speaking, those conditions correspond to some uniform limit theory in the parametric version of the problem. The framework is restricted to the specific convergence rate $n^{1/2}$. Several examples of LPE are studied: estimation of volatility, powers of volatility, volatility when incorporating trading information, and time-varying MA(1). Comment: 67 pages, 4 figures.
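    The integrated quantity $\Theta = T^{-1} \int_0^T \theta_t^* \, dt$ can be illustrated in its simplest instance, where $\theta_t$ is a spot variance: treat the parameter as constant on each local block, estimate it there, then average the block estimates. A hedged sketch only; the function name and blockwise estimator below are assumptions, not the paper's general LPE.

```python
import numpy as np

def lpe_integrated_parameter(returns, block_size, dt):
    """Blockwise (locally parametric) estimate of the time-averaged spot
    variance: realized variance per unit time on each block, then the
    average over blocks approximates the integrated, time-normalized value."""
    n_blocks = len(returns) // block_size
    blocks = returns[: n_blocks * block_size].reshape(n_blocks, block_size)
    theta_hat = (blocks ** 2).sum(axis=1) / (block_size * dt)  # local estimates
    return theta_hat.mean()  # time-average of the local estimates
```

    With high-frequency data (small dt, many blocks), this kind of estimator converges at the $n^{1/2}$ rate the abstract refers to, under regularity conditions.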

    Nowcasting inflation using high frequency data

    This paper proposes a methodology to nowcast and forecast inflation using data with sampling frequency higher than monthly. The nowcasting literature has focused on GDP, typically using monthly indicators in order to produce an accurate estimate for the current and next quarter. This paper exploits data with weekly and daily frequency in order to produce more accurate estimates of inflation for the current and following months. In particular, this paper uses the Weekly Oil Bulletin Price Statistics for the euro area, the Weekly Retail Gasoline and Diesel Prices for the US, and daily World Market Prices of Raw Materials. The data are modeled as a trading-day-frequency factor model with missing observations in a state space representation. For the estimation we adopt the methodology proposed in Banbura and Modugno (2010). In contrast to other existing approaches, the methodology used in this paper has the advantage of modeling all data within a unified single framework that, nevertheless, allows one to produce forecasts of all variables involved. This offers the advantage of disentangling a model-based measure of "news" from each data release and subsequently assessing its impact on the forecast revision. The paper provides an illustrative example of this procedure. Overall, the results show that these data improve forecast accuracy over models that exploit data available only at monthly frequency for both countries. JEL Classification: C53, E31, E37. Keywords: factor models, forecasting, inflation, mixed frequencies.
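    The key mechanism, a factor model in state space form filtered through data with missing observations, can be sketched with a single autoregressive factor and a Kalman filter that updates only on the series observed at each step (valid when measurement errors are mutually uncorrelated). This is an illustrative stand-in for the Banbura and Modugno (2010) methodology; the function name, dimensions, and parameters are assumptions.

```python
import numpy as np

def filter_factor(y, Lambda, phi, q, R):
    """One-factor Kalman filter tolerating missing data (NaN).
    State: f_t = phi * f_{t-1} + e_t,  e_t ~ N(0, q)
    Obs:   y_{t,i} = Lambda[i] * f_t + u_{t,i},  u_{t,i} ~ N(0, R[i])."""
    T = y.shape[0]
    f, P = 0.0, q / (1.0 - phi ** 2)        # start from the stationary state
    filtered = np.empty(T)
    for t in range(T):
        f, P = phi * f, phi ** 2 * P + q    # predict
        for i in np.flatnonzero(~np.isnan(y[t])):   # update on observed series only
            S = Lambda[i] ** 2 * P + R[i]   # innovation variance
            K = P * Lambda[i] / S           # Kalman gain
            f += K * (y[t, i] - Lambda[i] * f)
            P *= 1.0 - K * Lambda[i]
        filtered[t] = f
    return filtered
```

    The update term `K * (y[t, i] - Lambda[i] * f)` is exactly the model-based "news" from each release that the paper uses to attribute forecast revisions to individual data arrivals.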

    Analysis of Binarized High Frequency Financial Data

    A non-trivial probability structure is evident in the binary data extracted from the up/down price movements of very high frequency data, such as tick-by-tick data for USD/JPY. In this paper, we analyze the Sony bank USD/JPY rates, ignoring small deviations from the market price. We then show there is a similar non-trivial probability structure in the Sony bank rate, despite the Sony bank rate having less frequent and larger deviations than tick-by-tick data. However, this probability structure is not found in data sampled from tick-by-tick data at the same rate as the Sony bank rate. Therefore, the method of generating the Sony bank rate from the market rate has potential for practical use, since it retains the probability structure as the sampling frequency decreases. Comment: 8 pages, 4 figures, contribution to the 3rd International Conference NEXT-SigmaPh
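    The binarization step and the simplest test for non-trivial structure can be sketched as follows: map each price change to +1 or -1 and estimate first-order conditional probabilities. The function name and the first-order (Markov) summary are illustrative assumptions; the paper's analysis of the probability structure is more detailed.

```python
import numpy as np

def updown_transition_probs(prices):
    """Binarize price changes into +1/-1 moves (zero changes dropped) and
    estimate the conditional probability of an up move given the last move."""
    moves = np.sign(np.diff(np.asarray(prices, dtype=float)))
    moves = moves[moves != 0]               # keep only up/down moves
    prev, nxt = moves[:-1], moves[1:]
    p_up_after_up = np.mean(nxt[prev == 1] == 1)
    p_up_after_down = np.mean(nxt[prev == -1] == 1)
    return p_up_after_up, p_up_after_down
```

    For an i.i.d. price-change series both probabilities are near 0.5; a significant gap between them is the kind of non-trivial structure the abstract describes.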

    Margin setting with high-frequency data

    Both in practice and in the academic literature, models for setting margin requirements in futures markets classically use daily closing price changes. However, as well documented by research on high-frequency data, financial markets have recently shown high intraday volatility, which could bring more risk than expected. This paper tries to answer two questions relevant for margin committees in practice: is it right to compute margin levels based on closing prices and ignoring intraday dynamics? Is it justified to implement intraday margin calls? The paper focuses on the impact of intraday dynamics of market prices on daily margin levels. Daily margin levels are obtained in two ways: first, by using daily price changes defined with different time-intervals (say from 3 pm to 3 pm on the following trading day instead of traditional closing times); second, by using 5-minute and 1-hour price changes and scaling the results to one day. Our empirical analysis uses the FTSE 100 futures contract traded on LIFFE.
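    The two daily-margin constructions described above, a quantile of daily price changes versus intraday changes scaled up to one day, can be sketched as follows. The quantile-based margin rule and the square-root-of-time scaling (which assumes independent increments within the day) are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def margin_level(price_changes, coverage=0.99):
    """Margin as the empirical quantile of absolute price changes: the
    level exceeded with probability about 1 - coverage."""
    return np.quantile(np.abs(price_changes), coverage)

def scaled_intraday_margin(intraday_changes, periods_per_day, coverage=0.99):
    """Scale an intraday margin to a daily horizon with the
    square-root-of-time rule."""
    return margin_level(intraday_changes, coverage) * np.sqrt(periods_per_day)
```

    Comparing the two outputs on the same contract is one way to quantify how much risk the closing-price convention hides: if intraday dynamics matter, the scaled intraday margin and the daily margin diverge.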

    Interpolation of nonstationary high frequency spatial-temporal temperature data

    The Atmospheric Radiation Measurement program is a U.S. Department of Energy project that collects meteorological observations at several locations around the world in order to study how weather processes affect global climate change. As one of its initiatives, it operates a set of fixed but irregularly-spaced monitoring facilities in the Southern Great Plains region of the U.S. We describe methods for interpolating temperature records from these fixed facilities to locations at which no observations were made, which can be useful when values are required on a spatial grid. We interpolate by conditionally simulating from a fitted nonstationary Gaussian process model that accounts for the time-varying statistical characteristics of the temperatures, as well as the dependence on solar radiation. The model is fit by maximizing an approximate likelihood, and the conditional simulations result in well-calibrated confidence intervals for the predicted temperatures. We also describe methods for handling spatial-temporal jumps in the data to interpolate a slow-moving cold front. Comment: Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) at http://dx.doi.org/10.1214/13-AOAS633 by the Institute of Mathematical Statistics (http://www.imstat.org/).
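    The likelihood-maximization step can be sketched in miniature: fit the length-scale of a stationary exponential covariance by maximizing the exact Gaussian log-likelihood over a grid. The paper's approximate likelihood and nonstationary model are far richer; the covariance choice, grid search, and function name here are assumptions for the sketch.

```python
import numpy as np

def fit_length_scale(x, y, grid=np.geomspace(0.1, 10.0, 40)):
    """Maximum-likelihood length-scale for a zero-mean Gaussian process with
    exponential covariance, via grid search over candidate length-scales."""
    d = np.abs(x[:, None] - x[None, :])
    best_ell, best_ll = None, -np.inf
    for ell in grid:
        K = np.exp(-d / ell) + 1e-6 * np.eye(len(x))   # jitter for stability
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L, y)
        # log-likelihood up to a constant: -0.5 y'K^{-1}y - 0.5 log|K|
        ll = -0.5 * alpha @ alpha - np.log(np.diag(L)).sum()
        if ll > best_ll:
            best_ell, best_ll = ell, ll
    return best_ell
```

    Exact likelihoods like this cost O(n^3) per evaluation, which is why large monitoring datasets push toward the approximate likelihoods the paper uses.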