33 research outputs found

    Hessian sufficiency for bordered Hessian

    Get PDF
    We show that the second-order condition for strict local extrema in both constrained and unconstrained optimization problems can be expressed solely in terms of the principal minors of the (Lagrangean) Hessian. This approach unifies the determinantal tests in the sense that the second-order condition can always be given solely in terms of the Hessian matrix.
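    As background (the standard textbook object, not the paper's unified condition), the determinantal test for a problem with objective f, a single constraint g(x) = 0, and Lagrangean L = f - λg examines the leading principal minors of the bordered Hessian:

```latex
H_B \;=\;
\begin{pmatrix}
0 & \nabla g(x)^{\top} \\[2pt]
\nabla g(x) & \nabla^{2}_{xx} L(x,\lambda)
\end{pmatrix}
```

    The abstract's claim is that the sign conditions usually stated on minors of this bordered matrix can instead be stated on principal minors of the Hessian of L alone.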

    Narrower eigenbounds for Hadamard products

    Get PDF
    The Schur theorem, established in 1911, defines global bounds for all eigenvalues of Hadamard products of positive semidefinite matrices. We establish narrower, specific bounds for each eigenvalue.
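    A minimal numeric sketch of the classical global bounds the paper narrows, assuming the standard form of Schur's 1911 result for positive semidefinite A and B: every eigenvalue of the Hadamard product A∘B lies in [min_i a_ii · λ_min(B), max_i a_ii · λ_max(B)]. The random-matrix setup below is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_psd(n):
    # A Gram matrix M M^T is positive semidefinite by construction.
    m = rng.standard_normal((n, n))
    return m @ m.T

a, b = random_psd(4), random_psd(4)
had = a * b  # Hadamard (entrywise) product

eig = np.linalg.eigvalsh(had)       # eigenvalues of A∘B, ascending
eig_b = np.linalg.eigvalsh(b)
diag_a = np.diag(a)

# Classical global bounds on every eigenvalue of A∘B:
lo = diag_a.min() * eig_b.min()
hi = diag_a.max() * eig_b.max()
print(lo <= eig.min(), eig.max() <= hi)
```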

    A Homogeneous Class of Linear Estimators and Stronger Aitken Estimator

    Get PDF
    We define a new class of linear estimators that includes all linear unbiased estimators as a subset. We then establish the Aitken estimator, the best linear unbiased estimator, as the best in this larger class of linear estimators.
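    For reference, the Aitken (generalized least squares) estimator has the standard closed form β̂ = (X′Ω⁻¹X)⁻¹X′Ω⁻¹y for known error covariance Ω; the paper's larger estimator class is not specified in the abstract. A sketch comparing it with OLS under heteroskedastic errors (all data simulated for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = np.column_stack([np.ones(n), rng.standard_normal(n)])
beta_true = np.array([1.0, 2.0])

# Known heteroskedastic error covariance (diagonal Omega).
omega = np.diag(rng.uniform(0.5, 2.0, n))
y = x @ beta_true + np.linalg.cholesky(omega) @ rng.standard_normal(n)

# Aitken (GLS) estimator: (X' Omega^{-1} X)^{-1} X' Omega^{-1} y
w = np.linalg.inv(omega)
gls = np.linalg.solve(x.T @ w @ x, x.T @ w @ y)

# Ordinary least squares for comparison.
ols = np.linalg.lstsq(x, y, rcond=None)[0]
print(gls, ols)
```

    By the Aitken theorem, GLS is best linear unbiased here because Ω is known; the abstract's contribution is extending that optimality to a broader class.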

    Discontinuous Extraction of a Nonrenewable Resource

    Get PDF
    This paper examines the sequence of optimal extraction of nonrenewable resources in the presence of multiple demands. We provide conditions under which extraction of a nonrenewable resource may be discontinuous over the course of its depletion.

    Keywords: backstop technology, dynamic optimization, energy resources, Herfindahl principle, multiple demands

    A Benchmark Table For Significance Test in the Class of Adaptive Regression Model

    Get PDF
    We derive the precise analytic limit of the variance estimator of g, the concentrated ML estimator of gamma-naught in the adaptive regression model, and show that the limit and the original estimator generate virtually identical estimates of the variance of g and the corresponding significance test statistics. Then, based on the limit variance estimator we generate a table for the significance test of g in the adaptive regression model. The table may well be used for the same purpose even in the generalized model where g is extremely robust with respect to alternative specifications of the unknown covariance matrices of the parameter vector.

    A Note On Derivation of the Least Squares Estimator

    Get PDF
    Derivation of the Least Squares (LS) estimators of the intercept and slope in the bivariate regression model has been solely calculus-based. Herein, for the first time in the chronicle of regression, we provide a derivation of the LS estimators using only basic algebra, within the grasp of the intended readers of many introductory books in statistics and econometrics. We also provide a similar derivation of the LS estimator of the parameter vector in the multiple regression model, which takes only a few steps of basic matrix operations.

    Keywords: derivation of LS estimator, bivariate regression, calculus-based, self-contained, multiple regression, matrix calculus, orthogonality
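    The closed forms the note derives are the standard ones: slope = S_xy/S_xx and intercept = ȳ − slope·x̄ in the bivariate case, and β̂ = (X′X)⁻¹X′y in the matrix case. A sketch verifying that the two routes agree on simulated data:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(30)
y = 3.0 + 1.5 * x + 0.1 * rng.standard_normal(30)

# Bivariate closed forms: slope = S_xy / S_xx, intercept = ybar - slope * xbar
s_xy = ((x - x.mean()) * (y - y.mean())).sum()
s_xx = ((x - x.mean()) ** 2).sum()
slope = s_xy / s_xx
intercept = y.mean() - slope * x.mean()

# Matrix form for the same model: beta = (X'X)^{-1} X'y
xmat = np.column_stack([np.ones_like(x), x])
beta = np.linalg.solve(xmat.T @ xmat, xmat.T @ y)

print(np.allclose(beta, [intercept, slope]))  # the two derivations coincide
```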

    Stationarity Condition for AR Index Process

    Get PDF
    The stationarity conditions for an autoregressive (AR) process in general reduce to a remarkably simple inequality when the lag coefficients are restricted to be identical. The condition is not only analytically elegant but also applicable in checking the validity of the stationarity conditions for such a restricted AR process of any order.
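    The abstract does not state the inequality itself, so the sketch below uses the general root-based criterion it simplifies: an AR(p) process with all lag coefficients equal to φ is stationary iff every root of 1 − φ(z + z² + … + z^p) lies outside the unit circle. The function name and test values are illustrative.

```python
import numpy as np

def is_stationary(phi, p):
    """Root-based stationarity check for an AR(p) process whose p lag
    coefficients all equal phi, via the characteristic polynomial
    1 - phi*(z + z^2 + ... + z^p)."""
    # np.roots expects coefficients from the highest degree down.
    coeffs = [-phi] * p + [1.0]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

print(is_stationary(0.2, 3), is_stationary(0.5, 3))
```

    For example, with p = 3 the process is stationary at φ = 0.2 but not at φ = 0.5 (the coefficients then sum past one, putting a real root inside the unit circle).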

    A comment on stochastic parameter variation

    No full text