
    Low-Pass Filter Design using Locally Weighted Polynomial Regression and Discrete Prolate Spheroidal Sequences

    The paper concerns the design of nonparametric low-pass filters that have the property of reproducing a polynomial of a given degree. Two approaches are considered. The first is locally weighted polynomial regression (LWPR), which leads to linear filters depending on three parameters: the bandwidth, the order of the fitting polynomial, and the kernel. We find a remarkable linear (hyperbolic) relationship between the cutoff period (frequency) and the bandwidth, conditional on the choices of the order and the kernel, upon which we build the design of a low-pass filter. The second hinges on a generalization of the maximum concentration approach, leading to filters related to discrete prolate spheroidal sequences (DPSS). In particular, we propose a new class of low-pass filters that maximize the concentration over a specified frequency range, subject to polynomial reproducing constraints. The design of generalized DPSS filters depends on three parameters: the bandwidth, the polynomial order, and the concentration frequency. We discuss the properties of the corresponding filters in relation to the LWPR filters, and illustrate their use for the design of low-pass filters by investigating how the three parameters are related to the cutoff frequency.
    Keywords: Trend filters; Kernels; Concentration; Filter Design.
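The LWPR filters described above are linear, so their weights can be written down in closed form. The sketch below is my own illustration, not the paper's construction; the truncated-quadratic kernel and symmetric (2h+1)-point window are assumed for concreteness. It computes the weights applied at the window centre and checks the polynomial-reproduction property:

```python
import numpy as np

def lwpr_weights(h, p=1, kernel=lambda u: np.maximum(1 - u**2, 0)):
    """Filter weights of LWPR at the window centre: fit a degree-p
    polynomial over t = -h..h by kernel-weighted least squares and
    return the resulting linear weights on the observations."""
    t = np.arange(-h, h + 1, dtype=float)
    K = np.diag(kernel(t / (h + 1)))               # kernel weights (all positive)
    X = np.vander(t, p + 1, increasing=True)       # columns [1, t, t^2, ...]
    # fitted value at t = 0 is e0' (X'KX)^{-1} X'K y, so take the first row
    H = np.linalg.solve(X.T @ K @ X, X.T @ K)
    return H[0]

w = lwpr_weights(h=5, p=1)
assert abs(w.sum() - 1) < 1e-12                        # reproduces constants
assert abs((w * np.arange(-5, 6)).sum()) < 1e-12       # reproduces linear trends
```

Convolving a series with these weights gives the smoothed (low-pass) output; the bandwidth h, the order p, and the kernel are the three design parameters discussed in the abstract.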

    Compositional loess modeling

    Cleveland (1979) is usually credited with the introduction of locally weighted regression, Loess. The concept was further developed by Cleveland and Devlin (1988). The general idea is that for an arbitrary number of explanatory data points xi the value of a dependent variable is estimated as ŷi, the fitted value from a dth degree polynomial in xi (in practice often d = 1). The ŷi is fitted using weighted least squares, WLS, where the points xk (k = 1, ..., n) closest to xi are given the largest weights. We define a weighted least squares estimation for compositional data, C-WLS. In WLS the sum of the weighted squared Euclidean distances between the observed and the estimated values is minimized. In C-WLS we minimize the weighted sum of the squared simplicial distances (Aitchison, 1986, p. 193) between the observed compositions and their estimates. We then define a compositional locally weighted regression, C-Loess, in which a composition is assumed to be explained by a real-valued (multivariate) variable. For an arbitrary number of data points xi we fit, for each xi, a dth degree polynomial in xi, yielding an estimate ŷi of the composition yi. We use C-WLS to fit the polynomial, giving the largest weights to the points xk (k = 1, ..., n) closest to xi. Finally, C-Loess is applied to Swedish opinion poll data to create a poll-of-polls time series. The results are compared to previous results that do not acknowledge the compositional structure of the data.
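For reference, ordinary (Euclidean, non-compositional) Loess can be sketched in a few lines; C-WLS/C-Loess would replace the squared Euclidean distance in the fitting criterion below by the simplicial distance. The tricube kernel and span parameterization are the conventional choices from Cleveland (1979), used here purely for illustration:

```python
import numpy as np

def loess(x, y, x0, frac=0.5, d=1):
    """Ordinary Loess: at each evaluation point, fit a degree-d polynomial
    by weighted least squares, with tricube weights on the nearest
    ceil(frac * n) observations."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    k = int(np.ceil(frac * len(x)))
    fitted = np.empty(len(np.atleast_1d(x0)))
    for j, xj in enumerate(np.atleast_1d(np.asarray(x0, float))):
        dist = np.abs(x - xj)
        h = np.sort(dist)[k - 1]                       # span = k-th nearest distance
        w = np.clip(1 - (dist / h) ** 3, 0, 1) ** 3    # tricube kernel weights
        sw = np.sqrt(w)                                # WLS via scaled least squares
        X = np.vander(x - xj, d + 1, increasing=True)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
        fitted[j] = beta[0]                            # fitted value at xj
    return fitted
```

Because a local linear fit reproduces linear functions exactly, running this on data generated by a straight line returns the line itself, which is a convenient sanity check.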


    Locally Adaptive and Differentiable Regression

    Over-parameterized models like deep nets and random forests have become very popular in machine learning. However, the natural goals of continuity and differentiability, common in regression models, are now often ignored in modern over-parameterized, locally adaptive models. We propose a general framework to construct a globally continuous and differentiable model based on a weighted average of locally learned models in corresponding local regions. This model is competitive in dealing with data with different densities or scales of function values in different local regions. We demonstrate that when we mix kernel ridge and polynomial regression terms in the local models, and stitch them together continuously, we achieve faster statistical convergence in theory and improved performance in various practical settings.
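A minimal sketch of the stitching idea, with hypothetical ingredients: Gaussian bump weights (normalized into a smooth partition of unity) and arbitrary callables standing in for the locally learned models. The paper's actual construction mixes kernel ridge and polynomial terms; here any per-region model can be plugged in:

```python
import numpy as np

def blended_predict(x, centers, local_models, tau=0.5):
    """Global prediction as a smooth weighted average of local models.
    Softmax-normalized Gaussian weights over the region centers give a
    differentiable partition of unity, so the blend is smooth wherever
    the local models are."""
    x = np.atleast_1d(np.asarray(x, float))
    logits = -((x[:, None] - centers[None, :]) ** 2) / (2 * tau**2)
    w = np.exp(logits - logits.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                 # rows sum to 1
    preds = np.stack([m(x) for m in local_models], axis=1)
    return (w * preds).sum(axis=1)
```

Because the weights sum to one at every x, the blend interpolates between the local models and agrees with them exactly wherever they all agree.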

    Locally Weighted Polynomial Regression: Parameter Choice and Application to Forecasts of the Great Salt Lake

    Relationships between hydrologic variables are often nonlinear. Usually the functional form of such a relationship is not known a priori. A multivariate, nonparametric regression methodology is provided here for approximating the underlying regression function using locally weighted polynomials. Locally weighted polynomials approximate the target function through a Taylor series expansion of the function in the neighborhood of the point of estimate. Cross-validatory procedures for the selection of the size of the neighborhood over which this approximation should take place, and for the order of the local polynomial to use, are provided and illustrated for some simple situations. The utility of this nonparametric regression approach is demonstrated through an application to nonparametric short-term forecasts of the biweekly Great Salt Lake volume. Blind forecasts up to four years into the future using the 1847-1993 time series of the Great Salt Lake are presented.
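The cross-validatory parameter choice can be sketched as a leave-one-out score for a candidate bandwidth. A Gaussian kernel is assumed here for concreteness (the abstract does not fix one), and minimizing the score over a grid of bandwidths and polynomial orders d gives the selected parameters:

```python
import numpy as np

def loocv_score(x, y, h, d=1):
    """Leave-one-out cross-validation score for a local polynomial
    estimate with a Gaussian kernel of bandwidth h: refit at each x[i]
    without observation i and accumulate the squared prediction error."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    err = 0.0
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        xi, yi = x[mask], y[mask]
        w = np.exp(-0.5 * ((xi - x[i]) / h) ** 2)      # Gaussian kernel weights
        sw = np.sqrt(w)                                # WLS via scaled least squares
        X = np.vander(xi - x[i], d + 1, increasing=True)
        beta = np.linalg.lstsq(X * sw[:, None], yi * sw, rcond=None)[0]
        err += (y[i] - beta[0]) ** 2                   # prediction at the held-out point
    return err / len(x)

# choose the bandwidth by minimizing the score over a grid, e.g.:
# hs = np.logspace(-1, 0, 10)
# h_best = hs[np.argmin([loocv_score(x, y, h) for h in hs])]
```

On exactly linear data the local linear fit reproduces the held-out values, so the score is numerically zero, which makes a convenient sanity check.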

    Bayesian and maximin optimal designs for heteroscedastic regression models

    The problem of constructing standardized maximin D-optimal designs for weighted polynomial regression models is addressed. In particular it is shown that, by following the broad approach to the construction of maximin designs introduced recently by Dette, Haines and Imhof (2003), such designs can be obtained as weak limits of the corresponding Bayesian Φq-optimal designs. The approach is illustrated for two specific weighted polynomial models and also for a particular growth model.
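To make the objects concrete: for a weighted polynomial model with efficiency function λ, an approximate design ξ places weights w_i on points x_i, and the D-criterion is the log-determinant of the information matrix M(ξ) = Σ_i w_i λ(x_i) f(x_i) f(x_i)ᵀ with f(x) = (1, x, ..., x^d)ᵀ. The sketch below evaluates this criterion; the efficiency function λ(x) = 1/(1 + x²) is an illustrative choice, not one taken from the paper:

```python
import numpy as np

def log_det_info(points, weights, d=2, eff=lambda x: 1.0 / (1.0 + x**2)):
    """log det of the information matrix M(xi) = sum_i w_i lambda(x_i)
    f(x_i) f(x_i)' for degree-d weighted polynomial regression; a
    D-optimal design maximizes this quantity."""
    F = np.vander(np.asarray(points, float), d + 1, increasing=True)
    lam = np.array([eff(x) for x in points])
    M = (F * (np.asarray(weights) * lam)[:, None]).T @ F
    return np.linalg.slogdet(M)[1]
```

Spreading the support points out increases the criterion, while clustering them drives the information matrix toward singularity, which is the intuition behind the optimal-design problem the abstract studies.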

    Optimal designs for comparing curves

    We consider the optimal design problem for a comparison of two regression curves, which is used to establish the similarity between the dose response relationships of two groups. An optimal pair of designs minimizes the width of the confidence band for the difference between the two regression functions. Optimal design theory (equivalence theorems, efficiency bounds) is developed for this nonstandard design problem, and for some commonly used dose response models optimal designs are found explicitly. The results are illustrated in several examples modeling dose response relationships. It is demonstrated that the optimal pair of designs for the comparison of the regression curves is not the pair of the optimal designs for the individual models. In particular it is shown that the use of the optimal designs proposed in this paper instead of commonly used "non-optimal" designs yields a reduction of the width of the confidence band by more than 50%.
    Comment: 27 pages, 3 figures.
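The width criterion can be made concrete for polynomial models: with information matrices M₁, M₂ of the two approximate designs, the variance of the estimated difference of the curves at x is proportional to f(x)ᵀM₁⁻¹f(x) + f(x)ᵀM₂⁻¹f(x), and an optimal pair of designs minimizes its maximum over the design space. The sketch below assumes polynomial regression with equal group variances and sample sizes; the paper's dose response models are more general:

```python
import numpy as np

def max_band_variance(xs, design1, design2, d=1):
    """Maximum over xs of the variance proxy f(x)' M1^{-1} f(x) +
    f(x)' M2^{-1} f(x) for the estimated difference of two degree-d
    polynomial regression curves; each design is a (points, weights) pair."""
    def info(pts, wts):
        F = np.vander(np.asarray(pts, float), d + 1, increasing=True)
        return (F * np.asarray(wts)[:, None]).T @ F
    M1inv = np.linalg.inv(info(*design1))
    M2inv = np.linalg.inv(info(*design2))
    F = np.vander(np.asarray(xs, float), d + 1, increasing=True)
    v = np.einsum('ij,jk,ik->i', F, M1inv, F) + np.einsum('ij,jk,ik->i', F, M2inv, F)
    return v.max()
```

Searching over pairs of designs to minimize this maximum is exactly the sense in which the optimal pair differs from the two individually optimal designs.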