
    Daily minimum and maximum temperature simulation over complex terrain

    Spatiotemporal simulation of minimum and maximum temperature is a fundamental requirement for climate impact studies and hydrological or agricultural models. Particularly over regions with variable orography, these simulations are difficult to produce due to terrain-driven nonstationarity. We develop a bivariate stochastic model for the spatiotemporal field of minimum and maximum temperature. The proposed framework splits the bivariate field into two components of "local climate" and "weather." The local climate component is a linear model with spatially varying process coefficients capturing the annual cycle and yielding local climate estimates at all locations, not only those within the observation network. The weather component spatially correlates the bivariate simulations, whose matrix-valued covariance function we estimate using a nonparametric kernel smoother that retains nonnegative definiteness and allows for substantial nonstationarity across the simulation domain. The statistical model is augmented with a spatially varying nugget effect to allow for locally varying small scale variability. Our model is applied to a daily temperature data set covering the complex terrain of Colorado, USA, and successfully accommodates substantial temporally varying nonstationarity in both the direct-covariance and cross-covariance functions. Comment: Published at http://dx.doi.org/10.1214/12-AOAS602 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
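    As a rough illustration of the "local climate"/"weather" decomposition described above, one can regress a single station's daily maxima on annual-cycle harmonics and treat the residuals as the weather component. This is a minimal sketch on synthetic data; the function name, harmonic count, and all parameter values are illustrative assumptions, not taken from the paper:

    ```python
    import numpy as np

    def annual_harmonics(day_of_year, n_harmonics=2):
        """Design matrix: intercept plus sin/cos pairs of the annual cycle."""
        t = 2.0 * np.pi * np.asarray(day_of_year) / 365.25
        cols = [np.ones_like(t)]
        for j in range(1, n_harmonics + 1):
            cols += [np.sin(j * t), np.cos(j * t)]
        return np.column_stack(cols)

    # Synthetic daily maxima at one station: seasonal cycle plus noise.
    rng = np.random.default_rng(0)
    days = np.arange(1, 366)
    X = annual_harmonics(days)
    true_beta = np.array([10.0, -8.0, 2.0, 1.0, -0.5])
    tmax = X @ true_beta + rng.normal(0.0, 1.5, size=days.size)

    # "Local climate": least-squares fit of the annual cycle; "weather": the
    # residuals, which the paper's framework then correlates spatially.
    beta_hat, *_ = np.linalg.lstsq(X, tmax, rcond=None)
    weather = tmax - X @ beta_hat
    ```

    In the paper the coefficients vary spatially, so the fit is available at unobserved locations too; the sketch above shows only the single-station analogue.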

    Compression and Conditional Emulation of Climate Model Output

    Numerical climate model simulations run at high spatial and temporal resolutions generate massive quantities of data. As our computing capabilities continue to increase, storing all of the data is not sustainable, and thus it is important to develop methods for representing the full datasets by smaller compressed versions. We propose a statistical compression and decompression algorithm based on storing a set of summary statistics as well as a statistical model describing the conditional distribution of the full dataset given the summary statistics. The statistical model can be used to generate realizations representing the full dataset, along with characterizations of the uncertainties in the generated data. Thus, the methods are capable of both compression and conditional emulation of the climate model output. Considerable attention is paid to accurately modeling the original dataset (one year of daily mean temperature data), particularly with regard to the inherent spatial nonstationarity in global fields, and to determining the statistics to be stored, so that the variation in the original data can be closely captured, while allowing for fast decompression and conditional emulation on modest computers.
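    The store-summaries-then-emulate idea can be caricatured with principal components playing the role of the stored summary statistics. This is only a schematic analogue of the paper's method, using made-up data and an assumed homogeneous residual variance:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Toy stand-in for model output: 365 days x 200 grid cells, spatially correlated.
    field = rng.normal(size=(365, 200)) @ rng.normal(size=(200, 200)) * 0.1

    # Compress: store the time mean, the leading k principal components and
    # their scores, and a single residual variance as summary statistics.
    k = 10
    mean = field.mean(axis=0)
    U, s, Vt = np.linalg.svd(field - mean, full_matrices=False)
    scores, comps = U[:, :k] * s[:k], Vt[:k]
    resid_var = ((field - mean - scores @ comps) ** 2).mean()

    # Decompress / conditionally emulate: reconstruct from the summaries and
    # add noise draws with the stored variance, so each call yields a new
    # realization together with a quantified spread around the reconstruction.
    emulated = mean + scores @ comps + rng.normal(0.0, np.sqrt(resid_var), size=field.shape)

    stored = mean.size + scores.size + comps.size + 1  # floats actually kept
    ```

    The stored floats here number far fewer than the original array, which is the point of the compression; the paper's statistical model for the conditional distribution is far richer than the i.i.d. noise used in this sketch.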

    A Bayesian Nonparametric Markovian Model for Nonstationary Time Series

    Stationary time series models built from parametric distributions are, in general, limited in scope due to the assumptions imposed on the residual distribution and autoregression relationship. We present a modeling approach for univariate time series data, which makes no assumptions of stationarity, and can accommodate complex dynamics and capture nonstandard distributions. The model for the transition density arises from the conditional distribution implied by a Bayesian nonparametric mixture of bivariate normals. This implies a flexible autoregressive form for the conditional transition density, defining a time-homogeneous, nonstationary, Markovian model for real-valued data indexed in discrete time. To obtain a more computationally tractable algorithm for posterior inference, we utilize a square-root-free Cholesky decomposition of the mixture kernel covariance matrix. Results from simulated data suggest the model is able to recover challenging transition and predictive densities. We also illustrate the model on time intervals between eruptions of the Old Faithful geyser. Extensions to accommodate higher order structure and to develop a state-space model are also discussed.
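    The conditional density implied by a mixture of bivariate normals has a closed form: a mixture of univariate normal conditionals whose weights are proportional to each component's marginal density at the conditioning value. A minimal sketch with a hand-picked (hypothetical) two-component mixture, not the paper's Bayesian nonparametric fit:

    ```python
    import numpy as np

    def npdf(x, mean, var):
        """Univariate normal density."""
        return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

    # Hypothetical two-component mixture of bivariate normals for (x_{t-1}, x_t).
    weights = np.array([0.5, 0.5])
    means = np.array([[0.0, 0.0], [3.0, 3.0]])
    covs = np.array([[[1.0, 0.8], [0.8, 1.0]],
                     [[1.0, -0.5], [-0.5, 1.0]]])

    def transition_density(x_next, x_prev):
        """p(x_t | x_{t-1}): a normal mixture with x_prev-dependent weights."""
        # Each component's marginal density at x_prev sets its conditional weight.
        marg = weights * npdf(x_prev, means[:, 0], covs[:, 0, 0])
        marg = marg / marg.sum()
        dens = 0.0
        for w, m, c in zip(marg, means, covs):
            cond_mean = m[1] + c[0, 1] / c[0, 0] * (x_prev - m[0])
            cond_var = c[1, 1] - c[0, 1] ** 2 / c[0, 0]
            dens += w * npdf(x_next, cond_mean, cond_var)
        return dens
    ```

    Because the weights and conditional means both depend on the previous value, the implied autoregression can be nonlinear and the transition density multimodal, which is the flexibility the abstract describes.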

    Inference of time-varying regression models

    We consider parameter estimation, hypothesis testing and variable selection for partially time-varying coefficient models. Our asymptotic theory has the useful feature that it can allow dependent, nonstationary error and covariate processes. With a two-stage method, the parametric component can be estimated with an n^{1/2}-convergence rate. A simulation-assisted hypothesis testing procedure is proposed for testing significance and parameter constancy. We further propose an information criterion that can consistently select the true set of significant predictors. Our method is applied to autoregressive models with time-varying coefficients. Simulation results and a real data application are provided. Comment: Published at http://dx.doi.org/10.1214/12-AOS1010 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
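    A generic kernel-weighted least-squares estimator gives the flavor of fitting a coefficient that varies over time. This is a textbook local-constant sketch on synthetic data, not the paper's two-stage procedure, and the bandwidth and model are assumptions:

    ```python
    import numpy as np

    def tv_coef(t0, t, X, y, bandwidth):
        """Kernel-weighted least squares estimate of beta(t0) in y_t = x_t' beta(t) + e_t."""
        w = np.exp(-0.5 * ((t - t0) / bandwidth) ** 2)  # Gaussian kernel weights
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)
        return beta

    # Synthetic data with slowly varying intercept and slope.
    rng = np.random.default_rng(2)
    n = 500
    t = np.linspace(0.0, 1.0, n)
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    beta_t = np.column_stack([np.sin(2 * np.pi * t), 1.0 + t])  # true coefficients
    y = (X * beta_t).sum(axis=1) + 0.2 * rng.normal(size=n)

    est = tv_coef(0.5, t, X, y, bandwidth=0.05)  # estimate beta at t = 0.5
    ```

    At t = 0.5 the true coefficients are (0, 1.5), and the local fit recovers them approximately; a partially time-varying model as in the paper would keep some coefficients constant and estimate those at the faster n^{1/2} rate.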

    Dynamic modeling of mean-reverting spreads for statistical arbitrage

    Statistical arbitrage strategies, such as pairs trading and its generalizations, rely on the construction of mean-reverting spreads enjoying a certain degree of predictability. Gaussian linear state-space processes have recently been proposed as a model for such spreads under the assumption that the observed process is a noisy realization of some hidden states. Real-time estimation of the unobserved spread process can reveal temporary market inefficiencies which can then be exploited to generate excess returns. Building on previous work, we embrace the state-space framework for modeling spread processes and extend this methodology along three different directions. First, we introduce time-dependency in the model parameters, which allows for quick adaptation to changes in the data generating process. Second, we provide an on-line estimation algorithm that can be constantly run in real-time. Being computationally fast, the algorithm is particularly suitable for building aggressive trading strategies based on high-frequency data and may be used as a monitoring device for mean-reversion. Finally, our framework naturally provides informative uncertainty measures of all the estimated parameters. Experimental results based on Monte Carlo simulations and historical equity data are discussed, including a co-integration relationship involving two exchange-traded funds. Comment: 34 pages, 6 figures. Submitted.
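    The hidden spread in a Gaussian linear state-space model can be tracked online with a scalar Kalman filter. The sketch below assumes a simple AR(1) state with known, made-up parameters; the paper goes further by letting the model parameters themselves vary over time:

    ```python
    import numpy as np

    # State-space model for a noisy mean-reverting spread:
    #   s_t = phi * s_{t-1} + eta_t,  eta_t ~ N(0, q)   (hidden spread)
    #   y_t = s_t + eps_t,            eps_t ~ N(0, r)   (observed spread)
    phi, q, r = 0.95, 0.1, 0.5  # assumed, fixed parameters

    def kalman_step(m, p, y):
        """One online update of the filtered mean m and variance p."""
        m_pred = phi * m
        p_pred = phi ** 2 * p + q
        k = p_pred / (p_pred + r)  # Kalman gain
        return m_pred + k * (y - m_pred), (1.0 - k) * p_pred

    # Simulate a spread, then filter the noisy observations in one real-time pass.
    rng = np.random.default_rng(3)
    states, obs = [], []
    s = 0.0
    for _ in range(1000):
        s = phi * s + rng.normal(0.0, np.sqrt(q))
        states.append(s)
        obs.append(s + rng.normal(0.0, np.sqrt(r)))

    m, p, est = 0.0, 1.0, []
    for y in obs:
        m, p = kalman_step(m, p, y)
        est.append(m)
    ```

    The filtered estimates track the hidden spread with smaller error than the raw observations, and the filtered variance p provides the kind of uncertainty measure the abstract refers to.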