    Beyond first-order asymptotics for Cox regression

    To go beyond standard first-order asymptotics for Cox regression, we develop parametric bootstrap and second-order methods. In general, computation of P-values beyond first order requires more model specification than is required for the likelihood function. It is problematic to specify a censoring mechanism to be taken very seriously in detail, and conditioning on censoring does not appear to be a viable alternative. We circumvent this matter by employing a reference censoring model, matching the extent and timing of observed censoring. Our primary proposal is a parametric bootstrap method utilizing this reference censoring model to simulate inferential repetitions of the experiment. It is shown that the most important part of the improvement on first-order methods, that pertaining to fitting nuisance parameters, is insensitive to the assumed censoring model. This is supported by numerical comparisons of our proposal to parametric bootstrap methods based on the usual random censoring models, which are far less attractive to implement. As an alternative to our primary proposal, we provide a second-order method requiring less computing effort while providing more insight into the nature of the improvement on first-order methods. However, the parametric bootstrap method is more transparent, and hence is our primary proposal. Indications are that first-order partial likelihood methods are usually adequate in practice, so we are not advocating routine use of the proposed methods. It is, however, useful to see how best to check on first-order approximations, or improve on them, when this is expressly desired.
    Comment: Published at http://dx.doi.org/10.3150/13-BEJ572 in Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
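    To make the idea concrete, here is a minimal numerical sketch of a parametric bootstrap with a reference censoring model, assuming (for simulation only) an exponential baseline hazard and a single covariate; the paper's construction is more general, and all names and settings here are illustrative rather than the authors' exact method.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def neg_loglik(params, x, t, d):
    """Negative log-likelihood of an exponential PH model: h(t|x) = lam * exp(beta*x)."""
    beta, log_lam = params
    rate = np.exp(log_lam + beta * x)
    return -(np.sum(d * (log_lam + beta * x)) - np.sum(rate * t))

def fit(x, t, d):
    return minimize(neg_loglik, x0=[0.0, 0.0], args=(x, t, d)).x

# Synthetic observed data standing in for a real study.
n = 200
x = rng.normal(size=n)
t_event = rng.exponential(np.exp(-0.5 * x))   # true beta = 0.5
c = rng.exponential(1.5, size=n)              # unknown true censoring mechanism
t = np.minimum(t_event, c)
d = (t_event <= c).astype(float)

beta_hat, log_lam_hat = fit(x, t, d)

# Reference censoring model (assumed simple form): each subject's potential
# censoring time is fixed at its observed follow-up time, matching the extent
# and timing of observed censoring without modelling censoring itself.
c_ref = t.copy()

B = 500
beta_b = np.empty(B)
for b in range(B):
    t_star = rng.exponential(np.exp(-(log_lam_hat + beta_hat * x)))
    d_rep = (t_star <= c_ref).astype(float)
    beta_b[b] = fit(x, np.minimum(t_star, c_ref), d_rep)[0]

# Centered bootstrap p-value for H0: beta = 0.
p_boot = np.mean(np.abs(beta_b - beta_hat) >= abs(beta_hat))
print(f"beta_hat = {beta_hat:.3f}, bootstrap p-value = {p_boot:.3f}")
```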

    On the First Order Regression Procedure of Estimation for Incomplete Regression Models

    This article discusses some properties of the first order regression method for imputation of missing values of an explanatory variable in a linear regression model, and presents an estimation strategy based on hypothesis testing.
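    For reference, here is a minimal sketch of the first order regression (FOR) imputation step the article builds on, assuming np.nan marks the gaps in the partially observed covariate; the helper name for_impute is illustrative, not from the article.

```python
import numpy as np

def for_impute(X_full, x_part):
    """First order regression imputation: regress the partially observed
    covariate on the fully observed ones over the complete cases, then
    fill each gap with its fitted value.
    X_full: (n, p) fully observed covariates; x_part: (n,) with np.nan gaps."""
    miss = np.isnan(x_part)
    Z = np.column_stack([np.ones(len(x_part)), X_full])
    coef, *_ = np.linalg.lstsq(Z[~miss], x_part[~miss], rcond=None)
    out = x_part.copy()
    out[miss] = Z[miss] @ coef
    return out

# Example: x2 has gaps, x1 is fully observed.
x1 = np.array([0.1, 0.5, 1.2, -0.3, 0.8])
x2 = np.array([0.2, np.nan, 1.0, -0.1, np.nan])
print(for_impute(x1[:, None], x2))
```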

    Modified First Order Regression, a Simulation Study

    This report investigates several imputation mechanisms for missing covariates in a linear regression model with two covariates, where one covariate is fully observed and the other only partially. The imputation mechanisms considered are Zero Order Regression (ZOR), First Order Regression (FOR), First Order Regression plus random noise (FOR+), and Modified First Order Regression (MFOR).
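    The four mechanisms can be sketched as follows, assuming np.nan marks the gaps in the partially observed covariate x2; the exact MFOR construction is defined in the report, so the resampled-residual variant shown here is only an assumed stand-in.

```python
import numpy as np

rng = np.random.default_rng(1)

def impute(x1, x2, method):
    """Fill gaps (np.nan) in x2 using the fully observed covariate x1."""
    miss, obs = np.isnan(x2), ~np.isnan(x2)
    out = x2.copy()
    if method == "ZOR":                      # zero order: mean of observed values
        out[miss] = x2[obs].mean()
        return out
    # First order: regress x2 on x1 over the complete cases.
    Z = np.column_stack([np.ones_like(x1), x1])
    coef, *_ = np.linalg.lstsq(Z[obs], x2[obs], rcond=None)
    fitted = Z @ coef
    if method == "FOR":                      # deterministic fitted values
        out[miss] = fitted[miss]
    elif method == "FOR+":                   # fitted values plus Gaussian noise
        sigma = np.std(x2[obs] - fitted[obs], ddof=2)
        out[miss] = fitted[miss] + rng.normal(0.0, sigma, miss.sum())
    elif method == "MFOR":                   # stochastic variant (assumed form):
        resid = x2[obs] - fitted[obs]        # fitted values plus resampled residuals
        out[miss] = fitted[miss] + rng.choice(resid, size=miss.sum())
    return out
```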

    Max-affine regression via first-order methods

    We consider regression under a max-affine model, which produces a piecewise linear function by combining affine components via the max function. The max-affine model arises ubiquitously in applications in signal processing and statistics, including multiclass classification, auction problems, and convex regression. It also generalizes phase retrieval and the learning of rectified linear unit (ReLU) activation functions. We present a non-asymptotic convergence analysis of gradient descent (GD) and mini-batch stochastic gradient descent (SGD) for max-affine regression when the model is observed at random locations drawn from a sub-Gaussian distribution satisfying an anti-concentration property, with additive sub-Gaussian noise. Under these assumptions, suitably initialized GD and SGD converge linearly to a neighborhood of the ground truth specified by the corresponding error bound. We provide numerical results that corroborate the theoretical findings. Importantly, in the noiseless scenario SGD not only converges faster in run time and with fewer observations than alternating minimization and GD, but it also outperforms them in low-sample scenarios with noise.
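    A minimal sketch of plain gradient descent for max-affine regression under the squared loss, on synthetic sub-Gaussian data: the (sub)gradient flows only through the affine piece attaining the max for each sample. The learning rate and random initialization are illustrative; the paper's guarantees assume a suitably constructed initialization.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, k = 2000, 5, 3

A_true = rng.normal(size=(k, d))
b_true = rng.normal(size=k)
X = rng.normal(size=(n, d))                        # sub-Gaussian covariates
y = (X @ A_true.T + b_true).max(axis=1) + 0.05 * rng.normal(size=n)

A, b = rng.normal(size=(k, d)), rng.normal(size=k)  # random init (illustrative only)
lr = 0.2
for _ in range(1000):
    scores = X @ A.T + b                           # (n, k) affine scores
    j = scores.argmax(axis=1)                      # active piece per sample
    resid = scores[np.arange(n), j] - y            # prediction error
    for m in range(k):                             # update only via active samples
        mask = j == m
        if mask.any():
            A[m] -= lr * (resid[mask, None] * X[mask]).sum(axis=0) / n
            b[m] -= lr * resid[mask].sum() / n

pred = (X @ A.T + b).max(axis=1)
print("final MSE:", np.mean((pred - y) ** 2))
```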

    Improved Coefficient and Variance Estimation in Stable First-Order Dynamic Regression Models

    In dynamic regression models the least-squares coefficient estimators are biased in finite samples, and so are the usual estimators for the disturbance variance and for the variance of the coefficient estimators. By deriving the expectation of the initial terms in an expansion of the usual expression for the asymptotic coefficient variance estimator, and by comparing these with an approximation to the true variance, we find an approximation to the bias in variance estimation from which a bias-corrected estimator for the variance readily follows. The same is achieved for a bias-corrected coefficient estimator, which makes it possible to compare analytically the second-order approximation to the mean squared error of the least-squares estimator with its counterpart for the first-order bias-corrected coefficient estimator. Two rather strong results on efficiency gains through bias correction for AR(1) models follow. Illustrative simulation results on the magnitude of bias in coefficient and variance estimation, and on the scope for effective bias correction and efficiency improvement, are presented for some relevant particular cases of the ARX(1) class of models.
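    To illustrate the basic idea of coefficient bias correction, here is a minimal sketch using the classical first-order approximation E[rho_hat] ≈ rho - (1 + 3*rho)/T for the least-squares estimator in an AR(1) model with intercept (Kendall, 1954; Marriott and Pope, 1954). The paper derives more refined corrections, including ones for the variance estimators; this only shows the plug-in correction.

```python
import numpy as np

rng = np.random.default_rng(3)

def ar1_ls(y):
    """Least-squares slope of y_t on (1, y_{t-1})."""
    Z = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    coef, *_ = np.linalg.lstsq(Z, y[1:], rcond=None)
    return coef[1]

rho, T, reps = 0.8, 50, 2000
raw, corrected = [], []
for _ in range(reps):
    y = np.empty(T)
    y[0] = rng.normal() / np.sqrt(1 - rho**2)      # stationary start
    for t in range(1, T):
        y[t] = rho * y[t - 1] + rng.normal()
    r = ar1_ls(y)
    raw.append(r)
    corrected.append(r + (1 + 3 * r) / T)          # feasible first-order correction

print(f"mean raw estimate:       {np.mean(raw):.3f}")
print(f"mean corrected estimate: {np.mean(corrected):.3f} (true rho = {rho})")
```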

    Estimation of Parameters in Multiple Regression With Missing X-Observations using Modified First Order Regression Procedure

    This paper considers the estimation of coefficients in a linear regression model with missing observations in the independent variables and introduces a modification of the standard first order regression method for imputation of missing values. The modification provides stochastic values for imputation. Asymptotic properties of the estimators for the regression coefficients arising from the proposed modification are derived when either both the number of complete observations and the number of missing values grow large, or only the number of complete observations grows large while the number of missing observations stays fixed. Using these results, the proposed procedure is compared with two popular procedures: one that utilizes only the complete observations, and another that employs the standard first order regression imputation method for missing values. An elaborate simulation experiment would be helpful for evaluating the gain in efficiency, especially in the case of discrete regressor variables, and for examining other interesting issues such as the impact of varying degrees of multicollinearity in the explanatory variables; applications to concrete data sets may also shed light on these aspects. Work along these lines is in progress and will be reported in a future article.
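    A minimal sketch comparing the three strategies discussed above, complete-case analysis, standard FOR imputation, and a stochastic modified-FOR imputation, on synthetic data; the resampled-residual form of MFOR used here is an assumption for illustration, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
x1 = rng.normal(size=n)
x2_full = 0.6 * x1 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.5 * x2_full + rng.normal(size=n)

x2 = x2_full.copy()
x2[rng.random(n) < 0.3] = np.nan                  # 30% missing at random

def fo_impute(x1, x2, stochastic=False):
    """FOR imputation; stochastic=True adds resampled residuals as a
    hypothetical stand-in for the paper's modified (MFOR) procedure."""
    miss = np.isnan(x2)
    Z = np.column_stack([np.ones_like(x1), x1])
    coef, *_ = np.linalg.lstsq(Z[~miss], x2[~miss], rcond=None)
    out, fitted = x2.copy(), Z @ coef
    out[miss] = fitted[miss]
    if stochastic:
        resid = x2[~miss] - fitted[~miss]
        out[miss] += rng.choice(resid, size=miss.sum())
    return out

def ols(y, x1, x2):
    Z = np.column_stack([np.ones_like(x1), x1, x2])
    return np.linalg.lstsq(Z, y, rcond=None)[0]

cc = ~np.isnan(x2)                                # true coefficients: 1.0, 2.0, -1.5
print("complete cases:", ols(y[cc], x1[cc], x2[cc]))
print("FOR:           ", ols(y, x1, fo_impute(x1, x2)))
print("MFOR (assumed):", ols(y, x1, fo_impute(x1, x2, stochastic=True)))
```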