
    Improved Subset Autoregression: With R Package

    The FitAR package for R (R Development Core Team 2008), available on the Comprehensive R Archive Network, is described. This package provides a comprehensive approach to fitting autoregressive and subset autoregressive time series. For long time series with complicated autocorrelation behavior, such as the monthly sunspot numbers, subset autoregression may prove more feasible and/or parsimonious than AR or ARMA models. The two principal functions in this package are SelectModel and FitAR, for automatic model selection and model fitting respectively. In addition to the regular autoregressive model and the usual subset autoregressive models (Tong, 1977), these functions implement a new family of models. This new family of subset autoregressive models is obtained by using the partial autocorrelations as parameters and then selecting a subset of these parameters. Further properties and results for these models are discussed in McLeod and Zhang (2006). The advantages of this approach are that not only is an efficient algorithm for exact maximum likelihood implemented, but efficient methods are also derived for selecting the high-order subset models that may occur in massive datasets containing long time series. A new improved extended BIC criterion, UBIC, developed by Chen and Chen (2008), is implemented for subset model selection. A complete suite of model building functions for each of the three types of autoregressive models described above is included in the package. The package includes functions for time series plots, diagnostic testing and plotting, bootstrapping, simulation, forecasting, Box-Cox analysis, spectral density estimation and other useful time series procedures. As well as methods for standard generic functions, including print, plot and predict, some new generic functions and methods are supplied that make it easier to work with the output from FitAR for bootstrapping, simulation, spectral density estimation and Box-Cox analysis.
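
    Since the abstract names the two principal functions, a minimal R sketch of the intended workflow may be helpful. This is our own illustration, not code from the paper; argument names follow our reading of the package documentation and may need adjusting.

    # Minimal sketch of the FitAR workflow (our example; argument names may
    # differ slightly from the installed package version).
    library(FitAR)
    z <- sunspot.month                  # long series with complex autocorrelation
    # Automatic selection of a subset model in the partial-autocorrelation
    # ("ARz") parametrization, using the UBIC criterion of Chen and Chen (2008).
    lags <- SelectModel(z, lag.max = 30, ARModel = "ARz",
                        Criterion = "UBIC", Best = 1)
    # Exact maximum likelihood fit of the selected subset autoregression.
    fit <- FitAR(z, lags, ARModel = "ARz")
    summary(fit)
    plot(fit)                    # diagnostic plots
    predict(fit, n.ahead = 12)   # forecasts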

    SLOPE - Adaptive variable selection via convex optimization

    We introduce a new estimator for the vector of coefficients $\beta$ in the linear model $y = X\beta + z$, where $X$ has dimensions $n \times p$, with $p$ possibly larger than $n$. SLOPE, short for Sorted L-One Penalized Estimation, is the solution to
    $$\min_{b \in \mathbb{R}^p} \frac{1}{2} \Vert y - Xb \Vert_{\ell_2}^2 + \lambda_1 \vert b \vert_{(1)} + \lambda_2 \vert b \vert_{(2)} + \cdots + \lambda_p \vert b \vert_{(p)},$$
    where $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p \ge 0$ and $\vert b \vert_{(1)} \ge \vert b \vert_{(2)} \ge \cdots \ge \vert b \vert_{(p)}$ are the decreasing absolute values of the entries of $b$. This is a convex program, and we demonstrate a solution algorithm whose computational complexity is roughly comparable to that of classical $\ell_1$ procedures such as the Lasso. Here, the regularizer is a sorted $\ell_1$ norm, which penalizes the regression coefficients according to their rank: the higher the rank (that is, the stronger the signal), the larger the penalty. This is similar to the Benjamini and Hochberg [J. Roy. Statist. Soc. Ser. B 57 (1995) 289-300] procedure (BH), which compares more significant $p$-values with more stringent thresholds. One notable choice of the sequence $\{\lambda_i\}$ is given by the BH critical values $\lambda_{\mathrm{BH}}(i) = z(1 - i \cdot q / 2p)$, where $q \in (0, 1)$ and $z(\alpha)$ is the $\alpha$ quantile of a standard normal distribution. SLOPE aims to provide finite sample guarantees on the selected model; of special interest is the false discovery rate (FDR), defined as the expected proportion of irrelevant regressors among all selected predictors. Under orthogonal designs, SLOPE with $\lambda_{\mathrm{BH}}$ provably controls FDR at level $q$. Moreover, it also appears to have appreciable inferential properties under more general designs $X$ while having substantial power, as demonstrated in a series of experiments running on both simulated and real data. Published at http://dx.doi.org/10.1214/15-AOAS842 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
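
    As a concrete illustration, here is a small R sketch (our own, not the authors' reference implementation) of the orthogonal-design case, where the SLOPE solution reduces to the prox of the sorted $\ell_1$ norm; the prox can be computed by a nonincreasing isotonic regression followed by clipping at zero.

    # BH-style lambda sequence from the abstract: lambda_i = z(1 - i*q/(2p)).
    lambdaBH <- function(p, q = 0.1) qnorm(1 - (1:p) * q / (2 * p))

    # Prox of the sorted-L1 norm: sort |y| in decreasing order, subtract the
    # lambda sequence, project onto the nonincreasing cone (isotonic regression
    # on the reversed vector), clip at zero, then restore signs and order.
    prox_sorted_l1 <- function(y, lambda) {
      ord <- order(abs(y), decreasing = TRUE)
      w <- abs(y)[ord] - lambda
      w <- pmax(rev(isoreg(rev(w))$yf), 0)
      x <- numeric(length(y))
      x[ord] <- w * sign(y)[ord]
      x
    }

    # Toy orthogonal design (X = identity): 5 true signals among p = 100.
    set.seed(1)
    p <- 100
    beta <- c(rep(3, 5), rep(0, p - 5))
    y <- beta + rnorm(p)
    bhat <- prox_sorted_l1(y, lambdaBH(p, q = 0.1))
    which(bhat != 0)   # indices selected by SLOPE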

    Compression and Conditional Emulation of Climate Model Output

    Numerical climate model simulations run at high spatial and temporal resolutions generate massive quantities of data. As our computing capabilities continue to increase, storing all of the data is not sustainable, and thus it is important to develop methods for representing the full datasets by smaller compressed versions. We propose a statistical compression and decompression algorithm based on storing a set of summary statistics as well as a statistical model describing the conditional distribution of the full dataset given the summary statistics. The statistical model can be used to generate realizations representing the full dataset, along with characterizations of the uncertainties in the generated data. Thus, the methods are capable of both compression and conditional emulation of the climate models. Considerable attention is paid to accurately modeling the original dataset (one year of daily mean temperature data), particularly with regard to the inherent spatial nonstationarity in global fields, and to determining the statistics to be stored, so that the variation in the original data can be closely captured, while allowing for fast decompression and conditional emulation on modest computers.
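
    The conditional-emulation idea can be illustrated with a toy Gaussian sketch in R. This is our own construction under strong simplifying assumptions (a stationary covariance and subsampled values as the stored summaries), not the authors' algorithm.

    # Toy compression/emulation: store linear summaries s = A z of a Gaussian
    # field z, then "decompress" by sampling from the conditional law of z | s.
    set.seed(2)
    n <- 200
    Sigma <- exp(-abs(outer(1:n, 1:n, "-")) / 10)   # assumed covariance model
    z <- drop(t(chol(Sigma)) %*% rnorm(n))          # the "full dataset"

    A <- diag(n)[seq(1, n, by = 10), ]              # summaries: every 10th value
    s <- drop(A %*% z)                              # what actually gets stored

    # Conditional mean and covariance of z given s (zero prior mean).
    K <- Sigma %*% t(A) %*% solve(A %*% Sigma %*% t(A))
    mu_c  <- drop(K %*% s)
    Sig_c <- Sigma - K %*% A %*% Sigma
    # One conditional emulation of the full field (jitter for numerical chol).
    emu <- mu_c + drop(t(chol(Sig_c + 1e-8 * diag(n))) %*% rnorm(n))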

    In-loop Feature Tracking for Structure and Motion with Out-of-core Optimization

    In this paper, a novel approach for obtaining 3D models from video sequences captured with hand-held cameras is presented. We define a pipeline that robustly deals with different types of sequences and acquisition devices. Our system follows a divide-and-conquer approach: after a frame decimation that pre-conditions the input sequence, the video is split into short-length clips. This allows the reconstruction step to be parallelized, which translates into a reduction in the computational resources required. The short length of the clips permits an intensive search for the best solution at each step of the reconstruction, which makes the system more robust. Unlike other approaches, the process of feature tracking is embedded within the reconstruction loop for each clip. A final registration step merges all the processed clips into the same coordinate frame.
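
    The divide-and-conquer structure can be sketched schematically in R. The per-clip reconstruction and registration functions below are hypothetical stubs standing in for the actual computer vision steps, which the paper does not reduce to a few lines.

    # Schematic pipeline only: decimate frames, split into short clips,
    # reconstruct clips in parallel, register everything into one frame.
    library(parallel)
    reconstruct_clip <- function(clip) list(frames = clip)   # stub for per-clip SfM
    register_clips  <- function(parts) parts                 # stub for final merge

    reconstruct_video <- function(frames, clip_len = 30, step = 2) {
      keep  <- frames[seq(1, length(frames), by = step)]     # frame decimation
      clips <- split(keep, ceiling(seq_along(keep) / clip_len))
      # Short clips are independent, so reconstruction parallelizes cleanly
      # (mclapply forks on Unix; substitute parLapply on Windows).
      parts <- mclapply(clips, reconstruct_clip, mc.cores = 2)
      register_clips(parts)          # merge into a common coordinate frame
    }
    str(reconstruct_video(1:200), max.level = 1)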

    A fast and robust hand-driven 3D mouse

    The development of new interaction paradigms requires natural interaction. This means that people should be able to interact with technology using the same models they use to interact in everyday real life, that is, through gestures, expressions and voice. Following this idea, in this paper we propose a non-intrusive, vision-based tracking system able to capture hand motion and simple hand gestures. The proposed device allows the hand to be used as a "natural" 3D mouse, where the forefinger tip or the palm centre identifies a 3D marker and hand gestures can be used to simulate the mouse buttons. The approach is based on a monoscopic tracking algorithm which is computationally fast and robust against noise and cluttered backgrounds. Two image streams are processed in parallel, exploiting multi-core architectures, and their results are combined to obtain a constrained stereoscopic problem. The system has been implemented and thoroughly tested in an experimental environment where the 3D hand mouse has been used to interact with objects in a virtual reality application. We also provide results on the performance of the tracker, which demonstrate the precision and robustness of the proposed system.
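
    The stereo step that combines the two image streams can be illustrated with a standard linear (DLT) triangulation in R. This is our own sketch of a generic stereo computation, not the authors' constrained formulation; P1 and P2 are assumed 3x4 camera projection matrices.

    # Triangulate a 3D point from its pixel positions x1, x2 in two views.
    triangulate <- function(P1, P2, x1, x2) {
      A <- rbind(x1[1] * P1[3, ] - P1[1, ],
                 x1[2] * P1[3, ] - P1[2, ],
                 x2[1] * P2[3, ] - P2[1, ],
                 x2[2] * P2[3, ] - P2[2, ])
      v <- svd(A)$v[, 4]       # null vector of A = homogeneous 3D point
      v[1:3] / v[4]
    }

    # Toy check: two cameras offset along x, fingertip at (0.1, 0.2, 2).
    proj <- function(P, X) { h <- drop(P %*% c(X, 1)); h[1:2] / h[3] }
    P1 <- cbind(diag(3), c(0, 0, 0))
    P2 <- cbind(diag(3), c(-0.5, 0, 0))
    X  <- c(0.1, 0.2, 2)
    triangulate(P1, P2, proj(P1, X), proj(P2, X))   # recovers X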