
    Smoothing sparse and unevenly sampled curves using semiparametric mixed models: An application to online auctions

    Functional data analysis can be challenging when the functional objects are sampled only very sparsely and unevenly. Most approaches rely on smoothing to recover the underlying functional object from the data, which can be difficult if the data are irregularly distributed. In this paper we present a new approach that can overcome this challenge. The approach is based on the ideas of mixed models. Specifically, we propose a semiparametric mixed model with boosting to recover the functional object. While the model can handle sparse and unevenly distributed data, it also results in conceptually more meaningful functional objects. In particular, we motivate our method within the framework of eBay's online auctions. Online auctions produce monotonically increasing price curves that are often correlated across auctions. The semiparametric mixed model accounts for this correlation in a parsimonious way. It also estimates the underlying increasing trend from the data without imposing model constraints. Our application shows that the resulting functional objects are conceptually more appealing. Moreover, when used to forecast the outcome of an online auction, our approach results in more accurate price predictions than standard approaches. We illustrate our model on a set of 183 closed auctions for Palm M515 personal digital assistants.
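    The core idea of borrowing strength across sparse curves can be sketched without the paper's boosting machinery: pool the bids from all auctions to estimate a shared increasing trend, then treat each auction's average deviation from that trend as a curve-specific random intercept. The simulated data below is a hypothetical stand-in for the eBay auctions, and the kernel smoother replaces the semiparametric fixed effect purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate sparse, unevenly sampled increasing "price curves":
# each auction contributes only a handful of bids at random times.
n_curves = 30
t_grid = np.linspace(0, 1, 50)
true_trend = lambda t: 10 * t**2          # common increasing trend
curves = []
for i in range(n_curves):
    t = np.sort(rng.uniform(0, 1, rng.integers(3, 8)))  # 3-7 bids
    b = rng.normal(0, 1)                                # curve-specific offset
    y = true_trend(t) + b + rng.normal(0, 0.3, t.size)
    curves.append((t, y))

# Step 1: pool all observations across auctions and smooth once to
# get the shared trend (a Gaussian kernel average stands in for the
# semiparametric fixed effect).
all_t = np.concatenate([t for t, _ in curves])
all_y = np.concatenate([y for _, y in curves])
h = 0.1                                   # kernel bandwidth
weights = np.exp(-0.5 * ((t_grid[:, None] - all_t[None, :]) / h) ** 2)
trend_hat = (weights @ all_y) / weights.sum(axis=1)

# Step 2: a per-curve random intercept is that curve's mean residual
# around the shared trend -- even a 3-bid auction gets an estimate.
offsets = [np.mean(y - np.interp(t, t_grid, trend_hat)) for t, y in curves]
```

    Pooling is what makes the sparse case tractable here: no single auction has enough bids to smooth on its own, but together they do.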

    Aggregated functional data model for Near-Infrared Spectroscopy calibration and prediction

    Calibration and prediction for NIR spectroscopy data are performed based on a functional interpretation of the Beer-Lambert formula. Considering that, for each chemical sample, the resulting spectrum is a continuous curve obtained as the summation of overlapping absorption spectra from each analyte plus a Gaussian error, we assume that each individual spectrum can be expanded as a linear combination of a B-spline basis. Calibration is then performed using two procedures for estimating the individual analyte curves: basis smoothing and smoothing splines. Prediction is done by minimizing the squared error of prediction. To assess the variance of the predicted values, we use a leave-one-out jackknife technique. Departures from the standard error models are discussed through a simulation study; in particular, we examine how correlated errors impact the calibration step and, consequently, the prediction of analyte concentrations. Finally, the performance of our methodology is demonstrated through the analysis of two publicly available datasets.
    Comment: 27 pages, 7 figures, 7 tables
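    A minimal version of the basis-smoothing step can be sketched as follows: build a cubic B-spline design matrix on the wavelength grid and fit the expansion coefficients of a spectrum by ordinary least squares. The two-peak synthetic spectrum and the knot count are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.interpolate import BSpline

# Synthetic two-peak "spectrum" standing in for an NIR measurement.
wavelengths = np.linspace(0.0, 1.0, 200)
spectrum = (np.exp(-((wavelengths - 0.3) ** 2) / 0.01)
            + 0.5 * np.exp(-((wavelengths - 0.7) ** 2) / 0.02))
rng = np.random.default_rng(1)
noisy = spectrum + rng.normal(0, 0.02, wavelengths.size)

# Cubic B-spline basis with equally spaced interior knots; boundary
# knots are repeated degree+1 times, as usual for clamped splines.
degree, n_interior = 3, 12
knots = np.concatenate((
    np.zeros(degree + 1),
    np.linspace(0, 1, n_interior + 2)[1:-1],
    np.ones(degree + 1),
))
n_basis = len(knots) - degree - 1
# Evaluating BSpline with an identity coefficient matrix yields the
# full design matrix, one basis function per column.
B = BSpline(knots, np.eye(n_basis), degree)(wavelengths)

# Basis smoothing: least-squares fit of the expansion coefficients.
coef, *_ = np.linalg.lstsq(B, noisy, rcond=None)
fitted = B @ coef
```

    The smoothing-splines variant the abstract mentions would instead keep a rich basis and add a roughness penalty to the least-squares criterion.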

    Functional Regression

    Functional data analysis (FDA) involves the analysis of data whose ideal units of observation are functions defined on some continuous domain, and the observed data consist of a sample of functions taken from some population, sampled on a discrete grid. Ramsay and Silverman's 1997 textbook sparked the development of this field, which has accelerated in the past 10 years to become one of the fastest-growing areas of statistics, fueled by the growing number of applications yielding this type of data. One unique characteristic of FDA is the need to combine information both across and within functions, which Ramsay and Silverman called replication and regularization, respectively. This article focuses on functional regression, the area of FDA that has received the most attention in applications and methodological development. First comes an introduction to basis functions, key building blocks for regularization in functional regression methods, followed by an overview of functional regression methods, split into three types: [1] functional predictor regression (scalar-on-function), [2] functional response regression (function-on-scalar), and [3] function-on-function regression. For each, the role of replication and regularization is discussed and the methodological development described in a roughly chronological manner, at times deviating from the historical timeline to group together similar methods. The primary focus is on modeling and methodology, highlighting the modeling structures that have been developed and the various regularization approaches employed. The article closes with a brief discussion of potential areas of future development in this field.
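    As a concrete illustration of the first type, scalar-on-function regression with basis regularization: expanding the coefficient function beta(t) in a small basis reduces the functional model y_i = integral of X_i(t) beta(t) dt + noise to ordinary least squares on basis scores. The simulated predictors, the Fourier-type basis, and the truncation level below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 100)
dt = t[1] - t[0]
n = 200

# Rough Brownian-path predictors X_i(t), observed on a fine grid.
X = rng.normal(size=(n, t.size)).cumsum(axis=1) * np.sqrt(dt)
beta_true = np.sin(2 * np.pi * t)
y = X @ beta_true * dt + rng.normal(0, 0.05, n)

# Fourier-type basis for beta(t); truncating to a few terms is the
# regularization (the "regularization" role the article describes).
basis = np.column_stack(
    [np.ones_like(t)]
    + [np.sin(2 * np.pi * k * t) for k in (1, 2, 3)]
    + [np.cos(2 * np.pi * k * t) for k in (1, 2, 3)]
)

# Each functional predictor collapses to a vector of basis scores,
# turning the functional model into multiple linear regression.
Z = X @ basis * dt
coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
beta_hat = basis @ coef
```

    Replication enters through the n curves, which jointly identify the handful of basis coefficients.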

    Optimal estimation of the mean function based on discretely sampled functional data: Phase transition

    The problem of estimating the mean of random functions based on discretely sampled data arises naturally in functional data analysis. In this paper, we study optimal estimation of the mean function under both common and independent designs. Minimax rates of convergence are established and easily implementable rate-optimal estimators are introduced. The analysis reveals interesting and different phase transition phenomena in the two cases. Under the common design, the sampling frequency solely determines the optimal rate of convergence when it is relatively small, and has no effect on the optimal rate when it is large. Under the independent design, by contrast, the optimal rate of convergence is determined jointly by the sampling frequency and the number of curves when the sampling frequency is relatively small; when it is large, the sampling frequency again has no effect on the optimal rate. Another interesting contrast between the two settings is that smoothing is necessary under the independent design, while, somewhat surprisingly, it is not essential under the common design.
    Comment: Published in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org), DOI: http://dx.doi.org/10.1214/11-AOS898
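    The contrast between the two designs can be sketched directly: under a common design all curves share one grid, so the pointwise average across curves already estimates the mean function without smoothing, while under an independent design each curve has its own few random points and the pooled observations must be smoothed. All simulation settings below (grid size, points per curve, noise levels, bandwidth) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
grid = np.linspace(0, 1, 20)
mean_fn = np.sin(np.pi * grid)
n_curves = 100

# Common design: every curve observed on the same grid, so a plain
# pointwise average across curves estimates the mean -- no smoothing.
Y = ((1 + 0.2 * rng.normal(size=(n_curves, 1))) * mean_fn
     + rng.normal(0, 0.1, (n_curves, grid.size)))
mean_hat_common = Y.mean(axis=0)

# Independent design: each curve has its own 5 random points, and the
# pooled observations are smoothed (a simple Gaussian kernel average).
pts = rng.uniform(0, 1, (n_curves, 5))
obs = np.sin(np.pi * pts) + rng.normal(0, 0.1, pts.shape)
h = 0.08                                         # kernel bandwidth
w = np.exp(-0.5 * ((grid[:, None] - pts.ravel()[None, :]) / h) ** 2)
mean_hat_indep = (w @ obs.ravel()) / w.sum(axis=1)
```

    The bandwidth h is what mediates the trade-off the paper analyzes in the independent-design case; no analogous tuning parameter appears in the common-design estimator.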

    Trajectory Reconstruction Techniques for Evaluation of ATC Systems

    This paper focuses on trajectory reconstruction techniques for evaluating air traffic control (ATC) systems, using real recorded opportunity-traffic data. We analyze different alternatives for this problem, from traditional interpolation approaches based on curve fitting to our proposed schemes based on modeling regular motion patterns with optimal smoothers. Extracting trajectory features such as the motion type (or mode of flight), maneuver profile, and geometric parameters allows a more accurate computation of the curve and a detailed evaluation of the data processors used in the ATC centre. The alternatives are compared using performance results obtained on simulated and real data sets.
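    A standard optimal smoother for this kind of reconstruction is a constant-velocity Kalman filter followed by a Rauch-Tung-Striebel backward pass. The 1-D sketch below is an illustrative stand-in, not the paper's scheme; the motion model and noise parameters are assumptions.

```python
import numpy as np

# Constant-velocity model, state = [position, velocity].
dt, q, r = 1.0, 0.01, 25.0                 # assumed step, process/meas. noise
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
R = np.array([[r]])

def rts_smooth(z):
    """Kalman-filter a 1-D measurement sequence, then RTS-smooth it."""
    n = len(z)
    xs, Ps, xps, Pps = [], [], [], []
    x, P = np.array([z[0], 0.0]), np.eye(2) * 100.0
    for k in range(n):                      # forward filter pass
        xp, Pp = F @ x, F @ P @ F.T + Q
        S = H @ Pp @ H.T + R
        K = Pp @ H.T @ np.linalg.inv(S)
        x = xp + (K @ (z[k] - H @ xp)).ravel()
        P = Pp - K @ H @ Pp
        xs.append(x); Ps.append(P); xps.append(xp); Pps.append(Pp)
    xsm = [None] * n                        # backward RTS pass
    xsm[-1] = xs[-1]
    for k in range(n - 2, -1, -1):
        G = Ps[k] @ F.T @ np.linalg.inv(Pps[k + 1])
        xsm[k] = xs[k] + G @ (xsm[k + 1] - xps[k + 1])
    return np.array(xsm)

# Usage on a synthetic constant-velocity track with noisy plots:
rng = np.random.default_rng(4)
t = np.arange(60.0)
truth = 5.0 * t
z = truth + rng.normal(0, 5.0, t.size)
smoothed = rts_smooth(z)[:, 0]              # reconstructed positions
```

    Because the backward pass uses future as well as past plots, the smoothed track is what an offline evaluation of the ATC data processors would compare against.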

    Knot selection by boosting techniques

    A novel concept for estimating smooth functions by selection techniques based on boosting is developed. We suggest placing radial basis functions with different spreads at each knot and performing selection and estimation simultaneously with a componentwise boosting algorithm. The methodology of various other smoothing and knot selection procedures (e.g. stepwise selection) is summarized. They are compared to the proposed approach in extensive simulations for various one-dimensional settings, including varying spatial variation and heteroskedasticity, as well as on a real-world data example. Finally, an extension of the proposed method to surface fitting is evaluated numerically on both simulated and real data. The proposed knot selection technique is shown to be a strong competitor to existing methods for knot selection.
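    The componentwise idea can be sketched as greedy L2-boosting over a dictionary of Gaussian radial basis functions, with candidates of several spreads at every knot; at each step the single component that most reduces the residual sum of squares is selected and shrunk in. The target function, knot grid, and tuning values below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(5)
x = np.linspace(0, 1, 150)
y = np.sin(6 * x) + rng.normal(0, 0.1, x.size)

# Dictionary: a Gaussian bump of every spread at every candidate knot.
knots = np.linspace(0, 1, 20)
spreads = (0.05, 0.1, 0.2)
dictionary = np.column_stack(
    [np.exp(-0.5 * ((x - k) / s) ** 2) for k in knots for s in spreads]
)

nu, n_steps = 0.1, 300                     # shrinkage and boosting steps
coef = np.zeros(dictionary.shape[1])
fit = np.zeros_like(y)
for _ in range(n_steps):
    resid = y - fit
    # Least-squares slope of each column against the current residual,
    # and the sum-of-squares reduction each column would achieve.
    norms = (dictionary ** 2).sum(axis=0)
    b = dictionary.T @ resid / norms
    j = np.argmax(norms * b ** 2)          # componentwise selection
    coef[j] += nu * b[j]
    fit += nu * b[j] * dictionary[:, j]

selected = np.flatnonzero(coef)            # knot/spread pairs actually used
```

    Selection and estimation happen in the same loop: knots whose columns never win a step keep a zero coefficient and are effectively pruned.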