Non-stationary log-periodogram regression
We study asymptotic properties of the log-periodogram semiparametric estimate of the memory parameter d for non-stationary (d ≥ 1/2) time series with Gaussian increments, extending the results of Robinson (1995) for stationary and invertible Gaussian processes. We generalize the definition of the memory parameter d to non-stationary processes in terms of the (successively) differentiated series. We obtain that the log-periodogram estimate is asymptotically normal for d ∈ [1/2, 3/4) and still consistent for d ∈ [1/2, 1). We show that with adequate data tapers, a modified estimate is consistent and asymptotically normally distributed for any d, covering both non-stationary and non-invertible processes. The estimates are invariant to the presence of certain deterministic trends, without any need of estimation.
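The log-periodogram idea can be illustrated with a short sketch: regress the log of the periodogram at the first m Fourier frequencies on −2 log λ_j, so the slope coefficient estimates d. The bandwidth choice m = √n below is only an illustrative default, not a recommendation from the paper.

```python
import numpy as np

def log_periodogram_d(x, m=None):
    """Minimal log-periodogram (GPH-style) estimate of the memory parameter d.

    Regresses log I(lambda_j) on -2*log(lambda_j) over the first m Fourier
    frequencies; the slope is the estimate of d. The default bandwidth
    m = sqrt(n) is an illustrative assumption, not a tuned choice.
    """
    n = len(x)
    if m is None:
        m = int(np.sqrt(n))
    # Fourier frequencies lambda_j = 2*pi*j/n for j = 1..m
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    dft = np.fft.fft(x - np.mean(x))[1:m + 1]
    I = np.abs(dft) ** 2 / (2 * np.pi * n)   # periodogram ordinates
    # OLS: log I_j = c - 2 d log(lambda_j) + error
    X = np.column_stack([np.ones(m), -2 * np.log(lam)])
    coef, *_ = np.linalg.lstsq(X, np.log(I), rcond=None)
    return coef[1]
```

For white noise the estimate should be near d = 0, while for a random walk (the non-stationary d = 1 case the abstract discusses) it should be near 1.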
Generalized Forward-Backward Splitting
This paper introduces the generalized forward-backward splitting algorithm for minimizing convex functions of the form F = f + Σ_{i=1}^n g_i, where f has a Lipschitz-continuous gradient and the g_i's are simple in the sense that their Moreau proximity operators are easy to compute. While the forward-backward algorithm cannot deal with more than one non-smooth function, our method generalizes it to the case of arbitrary n. Our method makes explicit use of the regularity of f in the forward step, and the proximity operators of the g_i's are applied in parallel in the backward step. This allows the generalized forward-backward to efficiently address an important class of convex problems. We prove its convergence in infinite dimension, and its robustness to errors in the computation of the proximity operators and of the gradient of f. Examples on inverse problems in imaging demonstrate the advantage of the proposed methods in comparison to other splitting algorithms.
Comment: 24 pages, 4 figures
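A minimal sketch of the scheme, assuming equal weights w_i = 1/n and relaxation 1: each auxiliary variable z_i takes a proximal (backward) step on its own g_i after a shared gradient (forward) step on f, and the iterate is their average. The toy problem below (a small ℓ1-penalized nonnegative least-squares instance) is my own illustration, not an example from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def gfb(grad_f, proxes, x0, gamma, n_iter=200):
    """Generalized forward-backward sketch for min f + sum_i g_i.

    With equal weights w_i = 1/n and relaxation 1:
        z_i <- z_i + prox_{n*gamma*g_i}(2x - z_i - gamma*grad_f(x)) - x
        x   <- mean_i z_i
    Each proxes[i](v, t) must compute prox_{t*g_i}(v).
    """
    n = len(proxes)
    z = [x0.copy() for _ in range(n)]
    x = x0.copy()
    for _ in range(n_iter):
        g = grad_f(x)
        for i in range(n):
            z[i] = z[i] + proxes[i](2 * x - z[i] - gamma * g, n * gamma) - x
        x = sum(z) / n
    return x

# Hypothetical example: min 0.5*||x - b||^2 + mu*||x||_1  subject to x >= 0,
# split as f = 0.5*||x - b||^2, g_1 = mu*||x||_1, g_2 = indicator of x >= 0.
b = np.array([3.0, -2.0, 0.5])
mu = 1.0
x = gfb(grad_f=lambda x: x - b,
        proxes=[lambda v, t: soft_threshold(v, mu * t),  # prox of mu*||.||_1
                lambda v, t: np.maximum(v, 0.0)],        # projection onto x >= 0
        x0=np.zeros(3), gamma=0.5)
```

The two proximity operators are applied in parallel (here sequentially in a loop), which is the point of the method: plain forward-backward could handle only one of the two non-smooth terms.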
Krasnoselskii-Mann method for non-self mappings
Abstract: Let H be a Hilbert space and let C be a closed, convex and nonempty subset of H. If T : C → H is a non-self and non-expansive mapping, we can define a map h : C → R by h(x) := inf{λ ≥ 0 : λx + (1 − λ)Tx ∈ C}. Then, for a fixed x_0 ∈ C and for α_0 := max{1/2, h(x_0)}, we define the Krasnoselskii-Mann algorithm x_{n+1} = α_n x_n + (1 − α_n)Tx_n, where α_{n+1} = max{α_n, h(x_{n+1})}. We will prove both weak and strong convergence results when C is a strictly convex set and T is an inward mapping.