163 research outputs found

    Monotonicity preserving approximation of multivariate scattered data

    Full text link
    This paper describes a new method of monotone interpolation and smoothing of multivariate scattered data. It is based on the assumption that the function to be approximated is Lipschitz continuous. The method provides the optimal approximation in the worst-case scenario, together with tight error bounds. Smoothing of noisy data subject to monotonicity constraints is converted into a quadratic programming problem. Estimation of the unknown Lipschitz constant from the data by sample splitting and cross-validation is described, and an extension of the method to locally Lipschitz functions is presented.
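
    A minimal sketch of the core construction, assuming the Lipschitz constant M is known (the paper estimates it by sample splitting and cross-validation, and the monotone variant additionally constrains the bounds; all names below are illustrative):

```python
import numpy as np

def lipschitz_interpolant(X, y, M):
    """Worst-case-optimal interpolant of scattered data (X, y) for a
    function assumed Lipschitz with constant M."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)

    def g(x):
        d = np.linalg.norm(X - np.asarray(x, dtype=float), axis=1)
        upper = np.min(y + M * d)      # tightest upper bound at x
        lower = np.max(y - M * d)      # tightest lower bound at x
        return 0.5 * (upper + lower)   # midpoint minimizes worst-case error

    return g

# Toy usage on 2-D scattered data from a Lipschitz function.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(50, 2))
y = X[:, 0] + np.sin(3.0 * X[:, 1])    # gradient norm <= sqrt(10), so M = 4 works
g = lipschitz_interpolant(X, y, M=4.0)
print(g([0.5, 0.5]))
```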

    Shape preserving approximation using least squares splines

    Full text link
    Least squares polynomial splines are an effective tool for data fitting, but they may fail to preserve essential properties of the underlying function, such as monotonicity or convexity. The shape restrictions are translated into linear inequality conditions on the spline coefficients. The basis functions are selected in such a way that these conditions take a simple form, and the problem becomes a non-negative least squares problem, for which effective and robust methods of solution exist. Multidimensional monotone approximation is achieved by using tensor-product splines with the appropriate restrictions. Additional interpolation conditions can also be introduced. Conversion formulas to the traditional B-spline representation are provided.
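
    A one-dimensional sketch of this reduction, assuming SciPy (the basis below uses integrated B-splines so that non-negative coefficients give an increasing fit; the paper's exact basis construction may differ):

```python
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import nnls

def monotone_spline_fit(x, y, knots, degree=2):
    """Least squares fit of a weakly increasing spline via NNLS."""
    t = np.concatenate(([knots[0]] * degree, knots, [knots[-1]] * degree))
    n_basis = len(t) - degree - 1
    # Integrated B-splines: each basis function is non-decreasing, so any
    # non-negative combination plus an intercept is non-decreasing.
    I = np.empty((len(x), n_basis))
    for j in range(n_basis):
        c = np.zeros(n_basis)
        c[j] = 1.0
        I[:, j] = BSpline(t, c, degree).antiderivative()(x)
    # Split the unconstrained intercept into +/- parts so NNLS can sign it.
    A = np.column_stack([I, np.ones_like(x), -np.ones_like(x)])
    coef, _ = nnls(A, y)
    return A @ coef  # fitted values at x

x = np.linspace(0.0, 1.0, 60)
y = np.sqrt(x) + 0.05 * np.random.default_rng(1).normal(size=x.size)
fit = monotone_spline_fit(x, y, knots=np.linspace(0.0, 1.0, 8))
```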

    Semiparametric Regression During 2003–2007

    Get PDF
    Semiparametric regression is a fusion between parametric regression and nonparametric regression, and the title of a book that we published on the topic in early 2003. We review developments in the field during the five-year period since the book was written. We find semiparametric regression to be a vibrant field with substantial involvement and activity, continual enhancement, and widespread application.

    Monotone approximation of aggregation operators using least squares splines

    Full text link
    The need for monotone approximation of scattered data arises in many regression problems where monotonicity is semantically important. One such domain is fuzzy set theory, where membership functions and aggregation operators are order preserving. Least squares polynomial splines provide great flexibility when modeling non-linear functions, but may fail to be monotone. Linear restrictions on the spline coefficients provide necessary and sufficient conditions for spline monotonicity. The basis for the splines is selected in such a way that these restrictions take an especially simple form. The resulting non-negative least squares problem can be solved by a variety of standard, proven techniques. Additional interpolation requirements can also be imposed in the same framework. The method is applied to fuzzy systems, where membership functions and aggregation operators are constructed from empirical data.
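
    A sketch of the tensor-product case for a two-argument aggregation operator, using the sufficient condition that spline coefficients be non-decreasing along each index direction; a double cumulative-sum reparameterization turns this into non-negativity, so NNLS again applies. Data and knots here are illustrative:

```python
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import nnls

def bspline_design(x, knots, degree=2):
    """B-spline design matrix at points x for an open uniform knot vector."""
    t = np.concatenate(([knots[0]] * degree, knots, [knots[-1]] * degree))
    n = len(t) - degree - 1
    B = np.empty((len(x), n))
    for j in range(n):
        c = np.zeros(n)
        c[j] = 1.0
        B[:, j] = BSpline(t, c, degree)(x)
    return B

# Scattered observations of a monotone aggregation operator on [0,1]^2.
rng = np.random.default_rng(2)
X = rng.uniform(0.0, 1.0, size=(200, 2))
z = np.sqrt(X[:, 0] * X[:, 1])                 # geometric mean

knots = np.linspace(0.0, 1.0, 6)
Bu = bspline_design(X[:, 0], knots)
Bv = bspline_design(X[:, 1], knots)
A = np.einsum('ij,ik->ijk', Bu, Bv).reshape(len(X), -1)  # tensor-product basis

# Coefficients non-decreasing in both index directions give a surface
# non-decreasing in each argument; write them as double cumulative sums
# of non-negative increments d, i.e. vec(c) = kron(S, S) @ d with d >= 0.
n = Bu.shape[1]
S = np.tril(np.ones((n, n)))
T = np.kron(S, S)
d, _ = nnls(A @ T, z)
coef = (T @ d).reshape(n, n)                   # monotone coefficient grid
```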

    Pointwise Convergence in Probability of General Smoothing Splines

    Get PDF
    Establishing the convergence of splines can be cast as a variational problem which is amenable to a $\Gamma$-convergence approach. We consider the case in which the regularization coefficient scales with the number of observations, $n$, as $\lambda_n = n^{-p}$. Using standard theorems from the $\Gamma$-convergence literature, we prove that the general spline model is consistent, in that estimators converge in a sense slightly weaker than weak convergence in probability for $p \leq \frac{1}{2}$. Without further assumptions we show this rate is sharp. This differs from rates for strong convergence using Hilbert scales, where one can often choose $p > \frac{1}{2}$.
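
    A quick empirical illustration of the scaling, assuming a recent SciPy that provides make_smoothing_spline (its penalty normalization may differ from the paper's variational setup, so this is only a qualitative check of pointwise convergence):

```python
import numpy as np
from scipy.interpolate import make_smoothing_spline

# Refit a cubic smoothing spline on growing samples with lambda_n = n^{-1/2}
# (i.e. p = 1/2) and track the error at a fixed point.
rng = np.random.default_rng(3)
f = lambda x: np.sin(2.0 * np.pi * x)
for n in (100, 1000, 10000):
    x = np.sort(rng.uniform(0.0, 1.0, n))      # a.s. distinct abscissae
    y = f(x) + 0.3 * rng.normal(size=n)
    spl = make_smoothing_spline(x, y, lam=n ** -0.5)
    print(n, abs(spl(0.5) - f(0.5)))
```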

    Improving Point and Interval Estimates of Monotone Functions by Rearrangement

    Get PDF
    Suppose that a target function is monotonic, namely weakly increasing, and an available original estimate of this target function is not weakly increasing. Rearrangements, univariate and multivariate, transform the original estimate into a monotonic estimate that always lies closer in common metrics to the target function. Furthermore, suppose an original simultaneous confidence interval, which covers the target function with probability at least $1-\alpha$, is defined by upper and lower end-point functions that are not weakly increasing. Then the rearranged confidence interval, defined by the rearranged upper and lower end-point functions, is shorter in length in common norms than the original interval and also covers the target function with probability at least $1-\alpha$. We demonstrate the utility of the improved point and interval estimates with an age-height growth chart example.
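
    On a grid, the univariate increasing rearrangement is simply a sort of the estimated values; a minimal sketch with a synthetic target (the improvement in common metrics is the paper's claim; the numbers below just illustrate it):

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0.0, 1.0, 200)
target = x ** 2                                     # weakly increasing target
estimate = target + 0.2 * rng.normal(size=x.size)  # noisy, non-monotone
rearranged = np.sort(estimate)                      # increasing rearrangement

for p in (1, 2):
    err = lambda e: np.mean(np.abs(e - target) ** p) ** (1.0 / p)
    print(f"L{p} error: original {err(estimate):.4f}, "
          f"rearranged {err(rearranged):.4f}")
```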

    B-spline techniques for volatility modeling

    Full text link
    This paper is devoted to the application of B-splines to volatility modeling, specifically the calibration of the leverage function in stochastic local volatility models and the parameterization of an arbitrage-free implied volatility surface calibrated to sparse option data. We use an extension of classical B-splines obtained by including basis functions with infinite support. We first revisit the application of shape-constrained B-splines to the estimation of conditional expectations, not merely from a scatter plot but also from the given marginal distributions; an application is the Monte Carlo calibration of stochastic local volatility models by Markov projection. We then present a new technique for the calibration of an implied volatility surface to sparse option data, using a B-spline parameterization of the Radon-Nikodym derivative of the underlying's risk-neutral probability density with respect to a roughly calibrated base model. We show that this method provides smooth arbitrage-free implied volatility surfaces. Finally, we sketch a Galerkin method with B-spline finite elements for the solution of the partial differential equation satisfied by the Radon-Nikodym derivative.
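
    A small sketch of just the Radon-Nikodym parameterization and normalization step, under illustrative assumptions (a standard-normal base density on a bounded log-moneyness range, and random non-negative coefficients standing in for calibrated ones; the paper's infinite-support basis extension and the actual calibration to option quotes are not reproduced):

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.interpolate import BSpline
from scipy.stats import norm

# Parameterize dQ/dQ0 as a non-negative cubic B-spline combination, then
# rescale so the reweighted base density is again a probability density.
degree = 3
knots = np.linspace(-2.0, 2.0, 9)
t = np.concatenate(([knots[0]] * degree, knots, [knots[-1]] * degree))
coefs = np.abs(np.random.default_rng(5).normal(size=len(t) - degree - 1))
rn = BSpline(t, coefs, degree, extrapolate=False)   # candidate dQ/dQ0 >= 0

x = np.linspace(-2.0, 2.0, 2001)
q0 = norm.pdf(x)                       # stand-in for the base model's density
w = np.nan_to_num(rn(x))
w /= trapezoid(w * q0, x)              # enforce E_{Q0}[dQ/dQ0] = 1
q = w * q0                             # implied risk-neutral density
print(trapezoid(q, x))                 # ~1.0
```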

    Improving point and interval estimates of monotone functions by rearrangement

    Get PDF
    Suppose that a target function is monotonic, namely weakly increasing, and an original estimate of this target function is available which is not weakly increasing; many common estimation methods used in statistics produce such estimates. We show that these estimates can always be improved, with no harm, by using rearrangement techniques: the rearrangement methods, univariate and multivariate, transform the original estimate into a monotonic estimate, and the resulting estimate is closer to the true curve in common metrics than the original estimate. The improvement property of the rearrangement also extends to the construction of confidence bands for monotone functions. Let $l$ and $u$ be the lower and upper end-point functions of a simultaneous confidence interval $[l, u]$ that covers the true function with probability $1-\alpha$. Then the rearranged confidence interval, defined by the rearranged lower and upper end-point functions, is shorter in length in common norms than the original interval and covers the true function with probability greater than or equal to $1-\alpha$. We illustrate the results with a computational example and an empirical example dealing with age-height growth charts. Please note: this paper is a revised version of cemmap Working Paper CWP09/07.
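
    Sorting the two end-point functions separately gives the rearranged band; a small numerical check of the coverage and length claims on synthetic data (the band is built to cover the target, with non-constant width so the shortening is visible):

```python
import numpy as np

rng = np.random.default_rng(6)
x = np.linspace(0.0, 1.0, 200)
target = np.sqrt(x)                                  # increasing target
estimate = target + 0.15 * rng.normal(size=x.size)  # non-monotone estimate
hw = np.abs(estimate - target) + 0.1 * rng.uniform(size=x.size)
l, u = estimate - hw, estimate + hw                  # covers by construction
l_r, u_r = np.sort(l), np.sort(u)                    # rearranged band

covers = lambda lo, hi: bool(np.all((lo <= target) & (target <= hi)))
l2_len = lambda lo, hi: float(np.sqrt(np.mean((hi - lo) ** 2)))
print(covers(l, u), covers(l_r, u_r))                # True True
print(l2_len(l, u), ">=", l2_len(l_r, u_r))          # weakly shorter in L^2
```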