3,086 research outputs found

    Cauchy-Stieltjes families with polynomial variance functions and generalized orthogonality

    This paper studies variance functions of Cauchy-Stieltjes kernel families generated by compactly supported centered probability measures. We describe several operations that allow us to construct additional variance functions from known ones. We construct a class of examples that exhausts all cubic variance functions, and provide examples of polynomial variance functions of arbitrary degree. We also relate Cauchy-Stieltjes kernel families with polynomial variance functions to generalized orthogonality. Our main results are stated solely in terms of classical probability; some proofs rely on the analytic machinery of free probability. Comment: Minor typos corrected.
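    For readers unfamiliar with the setup, the following is a sketch of the standard definition of a Cauchy-Stieltjes kernel family and its variance function, in notation common in the literature (the paper's own conventions may differ):

```latex
% Cauchy-Stieltjes kernel family generated by a compactly supported
% probability measure \nu, for \theta in a neighborhood of 0:
\[
  M(\theta) = \int \frac{\nu(dx)}{1 - \theta x},
  \qquad
  P_\theta(dx) = \frac{1}{M(\theta)} \cdot \frac{\nu(dx)}{1 - \theta x}.
\]
% Mean parametrization and variance function:
\[
  m(\theta) = \int x \, P_\theta(dx),
  \qquad
  V(m) = \operatorname{Var}\!\left(P_{\theta(m)}\right),
\]
% where \theta(m) inverts the (monotone) map \theta \mapsto m(\theta),
% so V describes the variance as a function of the mean m.
```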

    ESTIMATING VARIANCE FUNCTIONS FOR WEIGHTED LINEAR REGRESSION

    For linear models with heterogeneous error structure, four variance function models are examined for predicting the error structure in two loblolly pine data sets and one white oak data set. An index of fit and a simulation study were used to determine which models were best. The size of the coefficients for linear and higher-order terms varied drastically across data sets, so it is not desirable to recommend a general model containing both linear and higher-order terms. The unspecified-exponent model σ²_{vi} = σ²(D_i² H_i)^{k1} is recommended for all data sets considered. The estimated k1 values ranged from 1.8 to 2.1. We recommend k1 = 2.0 for simplicity.
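    As a concrete illustration of the recommended variance model, here is a minimal simulation sketch (the data and parameter values are hypothetical, not the paper's, and a through-origin regression is assumed for simplicity) fitting weighted least squares with weights proportional to 1/(D_i² H_i)^{k1} at k1 = 2:

```python
import numpy as np

# Hypothetical tree data (illustrative values only, not from the paper).
rng = np.random.default_rng(0)
n = 200
D = rng.uniform(10.0, 50.0, n)          # diameter
H = rng.uniform(8.0, 30.0, n)           # height
x = D**2 * H                            # combined regressor D^2 * H

beta_true = 4e-5
# Heteroscedastic errors following var_i = sigma^2 * (D_i^2 H_i)^k1 with
# k1 = 2, i.e. the error standard deviation is proportional to x itself.
V = beta_true * x * (1.0 + 0.1 * rng.normal(size=n))

# Weighted least squares through the origin, weights w_i = 1 / (D_i^2 H_i)^k1.
k1 = 2.0
w = 1.0 / x**k1
beta_wls = np.sum(w * x * V) / np.sum(w * x**2)
print(beta_wls)                         # close to beta_true
```

    With k1 = 2 these weights make the weighted residuals homoscedastic, which is exactly why the unspecified-exponent model with k1 near 2 is attractive in practice.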

    Semi-Bayesian D-optimal designs and estimation procedures for mean and variance functions

    Semi-Bayesian D-optimal designs for fitting mean and variance functions are derived for some prior distributions on the variance function parameters. The impact of the mean of the prior and of the uncertainty about this mean is analyzed. Simulation studies are performed to investigate whether the choice of design has a substantial impact on the efficiency of the mean and the variance function parameter estimation, and whether the D-optimality criterion is appropriate irrespective of the method applied to estimate the variance function parameters.

    Simultaneous Testing of Mean and Variance Structures in Nonlinear Time Series Models

    This paper proposes a nonparametric simultaneous test for parametric specification of the conditional mean and variance functions in a time series regression model. The test is based on an empirical likelihood (EL) statistic that measures the goodness of fit between the parametric estimates and the nonparametric kernel estimates of the mean and variance functions. A unique feature of the test is its ability to distribute natural weights automatically between the mean and variance components of the goodness of fit. To reduce the dependence of the test on a single pair of smoothing bandwidths, we construct an adaptive test by maximizing a standardized version of the empirical likelihood test statistic over a set of smoothing bandwidths. The test procedure is based on a bootstrap calibration to the distribution of the empirical likelihood test statistic. We demonstrate that the empirical likelihood test is able to distinguish local alternatives which differ from the null hypothesis at an optimal rate. Keywords: bootstrap, empirical likelihood, goodness-of-fit test, kernel estimation, least squares empirical likelihood, rate-optimal test
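    The comparison that such a test builds on can be sketched numerically. The toy below is not the EL statistic itself: it only computes a squared distance between a parametric (linear) mean fit and a Nadaraya-Watson kernel estimate, calibrated by a residual bootstrap under the null; the data, bandwidth, and statistic are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
x = rng.uniform(-1.0, 1.0, n)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.5, n)   # data generated under the null (linear mean)

def nw_mean(x_eval, x, y, h):
    """Nadaraya-Watson kernel estimate of the conditional mean."""
    k = np.exp(-0.5 * ((x_eval[:, None] - x[None, :]) / h) ** 2)
    return (k * y).sum(axis=1) / k.sum(axis=1)

# Parametric (null) fit: simple linear regression.
X = np.column_stack([np.ones(n), x])
coef = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ coef

# Goodness-of-fit distance between kernel and parametric mean estimates.
h = 0.2
stat = np.mean((nw_mean(x, x, y, h) - X @ coef) ** 2)

# Bootstrap calibration: resample residuals around the fitted null mean.
boot = []
for _ in range(200):
    yb = X @ coef + rng.choice(resid, n, replace=True)
    cb = np.linalg.lstsq(X, yb, rcond=None)[0]
    boot.append(np.mean((nw_mean(x, x, yb, h) - X @ cb) ** 2))
p_value = np.mean(np.array(boot) >= stat)
print(p_value)   # under the null, p_value is typically not small
```

    The paper's test replaces this ad-hoc squared distance with an EL statistic (which weights the mean and variance components automatically) and maximizes over a set of bandwidths rather than fixing one.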

    One-sided Cauchy-Stieltjes Kernel Families

    This paper continues the study of a kernel family which uses the Cauchy-Stieltjes kernel in place of the celebrated exponential kernel of the exponential families theory. We extend the theory to cover generating measures with support that is unbounded on one side. We illustrate the need for such an extension by showing that cubic pseudo-variance functions correspond to free-infinitely divisible laws without the first moment. We also determine the domain of means, advancing the understanding of Cauchy-Stieltjes kernel families also for compactly supported generating measures.

    Non-stationary self-similar Gaussian processes as scaling limits of power law shot noise processes and generalizations of fractional Brownian motion

    We study shot noise processes with Poisson arrivals and non-stationary noises. The noises are conditionally independent given the arrival times, but the distribution of each noise does depend on its arrival time. We establish scaling limits for such shot noise processes in two situations: 1) the conditional variance functions of the noises have a power law and 2) the conditional noise distributions are piecewise. In both cases, the limit processes are self-similar Gaussian processes with non-stationary increments. Motivated by these processes, we introduce new classes of self-similar Gaussian processes with non-stationary increments, via the time-domain integral representation, which are natural generalizations of fractional Brownian motions. Comment: Published version.
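    A toy simulation of the first situation: Poisson arrivals carrying centered Gaussian noises whose conditional variance is, for illustration, a power law in the arrival time. All parameter values are assumptions, not the paper's; the check uses Campbell's formula for the variance of the total:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, T, gamma, n_rep = 5.0, 10.0, 1.5, 20000

def shot_noise_total(rng):
    # Poisson arrivals on [0, T]
    n = rng.poisson(lam * T)
    arrivals = rng.uniform(0.0, T, n)
    # Conditionally independent centered Gaussian noises; the conditional
    # variance is a power law s^gamma in the arrival time s.
    noises = rng.normal(0.0, arrivals ** (gamma / 2.0))
    return noises.sum()

totals = np.array([shot_noise_total(rng) for _ in range(n_rep)])
# Campbell's formula: Var = lam * \int_0^T s^gamma ds
#                         = lam * T^(gamma+1) / (gamma+1)
var_theory = lam * T ** (gamma + 1) / (gamma + 1)
print(totals.var(), var_theory)
```

    The T^(gamma+1) growth of this variance is the kind of power-law behavior that, after rescaling, produces self-similar Gaussian limits with non-stationary increments.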

    Comparison of three estimators in a polynomial regression with measurement errors

    In a polynomial regression with measurement errors in the covariate, which is supposed to be normally distributed, one has (at least) three ways to estimate the unknown regression parameters: one can apply ordinary least squares (OLS) to the model without regard to the measurement error, or one can correct for the measurement error, either by correcting the estimating equation (ALS) or by correcting the mean and variance functions of the dependent variable, which is done by conditioning on the observable, error-ridden counterpart of the covariate (SLS). While OLS is biased, the other two estimators are consistent. Their asymptotic covariance matrices can be compared to each other, in particular for the case of a small measurement error variance.
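    A minimal numerical illustration of the OLS bias and of a corrected estimating equation, in the simplest degree-one case through the origin with known error variance. All values are hypothetical, and the correction shown is the standard attenuation fix for this special case, not necessarily the paper's ALS/SLS formulas:

```python
import numpy as np

# Hypothetical setup (degree-one polynomial through the origin).
rng = np.random.default_rng(2)
n = 200_000
beta = 2.0
sigma_u2 = 0.5                                   # known measurement error variance

xi = rng.normal(0.0, 1.0, n)                     # unobserved true covariate, variance 1
w = xi + rng.normal(0.0, np.sqrt(sigma_u2), n)   # observed, error-ridden covariate
y = beta * xi + rng.normal(0.0, 0.3, n)

# Naive OLS on the observed covariate: attenuated towards zero,
# converging to beta / (1 + sigma_u2) = 4/3 here.
beta_ols = np.sum(w * y) / np.sum(w * w)

# Corrected estimating equation: subtract the known error variance
# from the denominator, restoring consistency.
beta_corr = np.sum(w * y) / (np.sum(w * w) - n * sigma_u2)
print(beta_ols, beta_corr)
```

    The corrected estimator pays for its consistency with a larger variance; the paper's comparison of asymptotic covariance matrices quantifies that trade-off, especially when the measurement error variance is small.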