
    Non-stationary log-periodogram regression

    We study asymptotic properties of the log-periodogram semiparametric estimate of the memory parameter $d$ for non-stationary ($d \geq 1/2$) time series with Gaussian increments, extending the results of Robinson (1995) for stationary and invertible Gaussian processes. We generalize the definition of the memory parameter $d$ for non-stationary processes in terms of the (successively) differentiated series. We obtain that the log-periodogram estimate is asymptotically normal for $d \in [1/2, 3/4)$ and still consistent for $d \in [1/2, 1)$. We show that with adequate data tapers, a modified estimate is consistent and asymptotically normally distributed for any $d$, including both non-stationary and non-invertible processes. The estimates are invariant to the presence of certain deterministic trends, without any need of estimation.
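    A minimal sketch of the basic (untapered) log-periodogram regression this abstract builds on: regress the log-periodogram at the first m Fourier frequencies on $-2\log\lambda_j$; the OLS slope estimates $d$. The function name and the bandwidth m are illustrative choices, and the paper's tapered modification for non-stationary $d$ is not shown.

```python
import numpy as np

def log_periodogram_d(x, m):
    """OLS of log I(lambda_j) on -2*log(lambda_j) over j = 1..m; slope estimates d."""
    n = len(x)
    j = np.arange(1, m + 1)
    lam = 2 * np.pi * j / n                              # Fourier frequencies
    I = np.abs(np.fft.fft(x)[j]) ** 2 / (2 * np.pi * n)  # periodogram ordinates
    X = -2 * np.log(lam)                                 # regressor
    Xc = X - X.mean()
    y = np.log(I)
    return np.sum(Xc * (y - y.mean())) / np.sum(Xc ** 2)
```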

    Generalized Forward-Backward Splitting

    This paper introduces the generalized forward-backward splitting algorithm for minimizing convex functions of the form $F + \sum_{i=1}^n G_i$, where $F$ has a Lipschitz-continuous gradient and the $G_i$'s are simple in the sense that their Moreau proximity operators are easy to compute. While the forward-backward algorithm cannot deal with more than $n = 1$ non-smooth function, our method generalizes it to the case of arbitrary $n$. Our method makes explicit use of the regularity of $F$ in the forward step, and the proximity operators of the $G_i$'s are applied in parallel in the backward step. This allows the generalized forward-backward to efficiently address an important class of convex problems. We prove its convergence in infinite dimension, and its robustness to errors in the computation of the proximity operators and of the gradient of $F$. Examples on inverse problems in imaging demonstrate the advantage of the proposed method in comparison to other splitting algorithms.
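    A minimal sketch of one such scheme under these assumptions (hedged: the function names, unit relaxation, and equal weights are illustrative, not the paper's notation). Each iteration takes a shared gradient step on $F$, then applies the proximity operators of the $G_i$'s in parallel to per-block auxiliary variables, and averages:

```python
import numpy as np

def generalized_forward_backward(x0, grad_F, proxes, L, n_iter=200):
    """Sketch of GFB for min F(x) + sum_i G_i(x).

    grad_F : gradient of the smooth term F (L-Lipschitz).
    proxes : list of functions prox(v, t) computing prox_{t*G_i}(v).
    """
    n = len(proxes)
    gamma = 1.0 / L                      # step size in (0, 2/L)
    w = 1.0 / n                          # equal weights summing to one
    z = [x0.copy() for _ in range(n)]    # one auxiliary variable per G_i
    x = x0.copy()
    for _ in range(n_iter):
        g = grad_F(x)                    # forward (gradient) step, shared
        for i, prox in enumerate(proxes):
            # backward step: each prox applied in parallel to its own z_i
            z[i] = z[i] + prox(2 * x - z[i] - gamma * g, gamma / w) - x
        x = w * sum(z)                   # average the auxiliaries
    return x

# Illustrative use: least squares with an L1 penalty and a nonnegativity constraint.
A, b = np.random.randn(20, 10), np.random.randn(20)
grad_F = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A, 2) ** 2
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0)  # prox of t*||.||_1
proj = lambda v, t: np.maximum(v, 0)                           # projection onto x >= 0
x = generalized_forward_backward(np.zeros(10), grad_F, [soft, proj], L)
```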

    On the criterion function for ARMA estimation


    Krasnoselskii-Mann method for non-self mappings

    Let H be a Hilbert space and let C be a closed, convex and nonempty subset of H. If $T:C\to H$ is a non-self and non-expansive mapping, we can define a map $h:C\to\mathbb{R}$ by $h(x):=\inf\{\lambda\geq 0:\lambda x+(1-\lambda)Tx\in C\}$. Then, for a fixed $x_{0}\in C$ and for $\alpha_{0}:=\max\{1/2, h(x_{0})\}$, we define the Krasnoselskii-Mann algorithm $x_{n+1}=\alpha_{n}x_{n}+(1-\alpha_{n})Tx_{n}$, where $\alpha_{n+1}=\max\{\alpha_{n},h(x_{n+1})\}$. We prove both weak and strong convergence results when C is a strictly convex set and T is an inward mapping.
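    A minimal numerical sketch of this scheme (hedged: the membership oracle in_C, bisection tolerance, and iteration count are illustrative assumptions). Since $x \in C$ and C is convex, the set $\{\lambda \in [0,1] : \lambda x + (1-\lambda)Tx \in C\}$ is an interval containing 1, so $h(x)$ can be located by bisection on the membership oracle:

```python
import numpy as np

def h(x, Tx, in_C, tol=1e-10):
    """Smallest lam in [0,1] with lam*x + (1-lam)*Tx in C, via bisection."""
    if in_C(Tx):
        return 0.0
    lo, hi = 0.0, 1.0                    # the point at lam = 1 is x, which lies in C
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if in_C(mid * x + (1 - mid) * Tx):
            hi = mid
        else:
            lo = mid
    return hi

def km_nonself(x0, T, in_C, n_iter=500):
    x = np.asarray(x0, dtype=float)
    alpha = max(0.5, h(x, T(x), in_C))        # alpha_0 = max{1/2, h(x_0)}
    for _ in range(n_iter):
        x = alpha * x + (1 - alpha) * T(x)    # x_{n+1} = a_n x_n + (1 - a_n) T x_n
        alpha = max(alpha, h(x, T(x), in_C))  # a_{n+1} = max{a_n, h(x_{n+1})}
    return x

# Illustrative use: C the closed unit ball in R^2, T(p) = -p + (1, 0)
# (an isometry, hence non-expansive, with fixed point (0.5, 0) in C).
in_ball = lambda p: np.linalg.norm(p) <= 1.0
print(km_nonself(np.array([0.3, 0.4]), lambda p: -p + np.array([1.0, 0.0]), in_ball))
```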