13 research outputs found

    Truncated Stochastic Approximation with Moving Bounds: Convergence

    In this paper we propose a wide class of truncated stochastic approximation procedures with moving random bounds. While we believe that the proposed class of procedures will find its way to a wider range of applications, the main motivation is to accommodate applications to parametric statistical estimation theory. Our class of stochastic approximation procedures has three main characteristics: truncations with random moving bounds, a matrix-valued random step-size sequence, and a dynamically changing random regression function. We establish convergence and consider several examples to illustrate the results.
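    A minimal sketch of what such a procedure can look like in the scalar case, with an invented regression function, step-size sequence and (here deterministic) moving truncation bounds that are far simpler than the general setting described above:

        import numpy as np

        rng = np.random.default_rng(0)

        def truncated_sa(z0=0.0, n_steps=2000):
            """Scalar stochastic approximation with truncation onto moving bounds.

            Illustrative root-finding for R(z) = theta - z with theta = 2.0,
            observed through additive noise (toy example, not the paper's model).
            """
            theta = 2.0
            z = z0
            for t in range(1, n_steps + 1):
                gamma = 1.0 / t                       # step-size sequence
                noisy_r = (theta - z) + rng.normal()  # noisy value of the regression function
                z = z + gamma * noisy_r               # Robbins-Monro type step
                bound = 1.0 + np.log(1.0 + t)         # slowly expanding truncation bound
                z = np.clip(z, -bound, bound)         # truncate back into [-bound, bound]
            return z

        print(truncated_sa())  # converges to a value close to theta = 2.0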

    Recursive Parameter Estimation: Convergence

    We consider estimation procedures which are recursive in the sense that each successive estimator is obtained from the previous one by a simple adjustment. We propose a wide class of recursive estimation procedures for the general statistical model and study convergence.
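    In symbols, one common way to write such a one-step adjustment (the notation here is generic, not necessarily the authors') is

        \hat\theta_{n+1} = \hat\theta_n + \Gamma_{n+1}(\hat\theta_n)\, \psi_{n+1}\bigl(X_{n+1}, \hat\theta_n\bigr),

    where X_{n+1} is the new observation, \psi_{n+1} is an estimating function built from it, and \Gamma_{n+1} is a (possibly matrix-valued) normalizing step-size.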

    On Recursive Parametric Estimation Theory

    The classical non-recursive methods of estimating unknown parameters of a model, such as the maximum likelihood method and the method of least squares, eventually require maximization procedures. These methods are often difficult to implement, especially for non-i.i.d. models. If for every sample size n, when new data are acquired, the estimator has to be computed afresh, and if a numerical method is needed to do so, the computation generally becomes very laborious. It is therefore important to consider recursive estimation procedures, which are appealing from the computational point of view. Recursive procedures are those which at each step allow one to re-estimate the values of the unknown parameters based on the values already obtained at the previous step together with the new information. We propose a wide class of recursive estimation procedures for the general statistical model and study convergence, the rate of convergence, and local asymptotic linearity. We also demonstrate the use of the results on some examples.
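    The computational point is easiest to see on a toy example, estimating a mean: the batch estimator reprocesses all n observations at every step, while the recursive one only adjusts the previous value. The comparison below is illustrative only and far simpler than the general statistical model treated above.

        import numpy as np

        rng = np.random.default_rng(1)
        data = rng.normal(loc=5.0, scale=2.0, size=10_000)

        # Batch: recompute the estimator from scratch whenever new data arrive (O(n) per step).
        batch_estimates = [data[:n].mean() for n in range(1, len(data) + 1)]

        # Recursive: adjust the previous estimate using only the new observation (O(1) per step).
        theta = 0.0
        recursive_estimates = []
        for n, x in enumerate(data, start=1):
            theta = theta + (x - theta) / n
            recursive_estimates.append(theta)

        assert np.allclose(batch_estimates, recursive_estimates)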

    An extension of the Liouville theorem for Fourier multipliers to sub-exponentially growing solutions

    We study the equation m(D)f = 0 in a large class of sub-exponentially growing functions. Under appropriate restrictions on m ∈ C(R^n), we show that every such solution can be analytically continued to a sub-exponentially growing entire function on C^n if and only if m(ξ) ≠ 0 for ξ ≠ 0.
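    As a concrete instance (the choice of multiplier here is mine, not one singled out in the abstract), taking m(ξ) = |ξ|² gives m(D) = -Δ, and the non-vanishing condition clearly holds:

        m(\xi) = |\xi|^2, \qquad m(D)f = -\Delta f = 0, \qquad m(\xi) \neq 0 \ \text{ for } \xi \neq 0,

    so, assuming this m satisfies the paper's restrictions, the result applies to sub-exponentially growing harmonic functions on R^n.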

    On the recursive parameter estimation in the general discrete time statistical model

    The consistency and asymptotic linearity of the recursive maximum likelihood estimator are proved under regularity and ergodicity assumptions on the logarithmic derivative of the transition density for a general statistical model. Keywords: recursive estimation; conditional density of distribution; martingales; stochastic approximation.
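    A minimal sketch of a score-driven recursive estimator, written for a toy AR(1) transition density rather than the general model of the paper; the model, initialization and normalization below are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(2)

        # Toy Markov model: X_{n+1} = theta * X_n + e_{n+1}, with e ~ N(0, 1).
        theta_true = 0.6
        n_obs = 5000
        x = np.zeros(n_obs + 1)
        for n in range(n_obs):
            x[n + 1] = theta_true * x[n] + rng.normal()

        # Recursive estimator driven by the logarithmic derivative (score) of the
        # transition density f(x_{n+1} | x_n; theta) = N(theta * x_n, 1):
        #   d/dtheta log f = (x_{n+1} - theta * x_n) * x_n
        theta_hat = 0.0
        info = 1e-3                          # running conditional information
        for n in range(n_obs):
            score = (x[n + 1] - theta_hat * x[n]) * x[n]
            info += x[n] ** 2                # accumulate observed information
            theta_hat += score / info        # one-step adjustment, no refitting

        print(theta_hat)                     # close to theta_true = 0.6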