
    Convergence rate and averaging of nonlinear two-time-scale stochastic approximation algorithms

    The first aim of this paper is to establish the weak convergence rate of nonlinear two-time-scale stochastic approximation algorithms. Its second aim is to introduce the averaging principle in the context of two-time-scale stochastic approximation algorithms. We first define the notion of asymptotic efficiency in this framework, then introduce the averaged two-time-scale stochastic approximation algorithm, and finally establish its weak convergence rate. We show, in particular, that both components of the averaged two-time-scale stochastic approximation algorithm simultaneously converge at the optimal rate √n.
    Comment: Published at http://dx.doi.org/10.1214/105051606000000448 in the Annals of Applied Probability (http://www.imstat.org/aap/) by the Institute of Mathematical Statistics (http://www.imstat.org)
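
As a rough illustration of the two-time-scale scheme, here is a minimal Python sketch: a fast iterate and a slow iterate are driven by noisy mean fields with step sizes decreasing at different rates, and both components are Polyak-Ruppert averaged. The mean fields f and g, the step-size exponents and the noise model are assumptions chosen for the example, not the setting analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative linear mean fields with common root (x*, y*) = (0, 0);
# f, g and all constants below are assumptions made for this sketch.
f = lambda x, y: -(x - y)   # fast component: x is pulled toward y
g = lambda x, y: -y         # slow component: y is pulled toward 0

x, y = 1.0, 1.0
x_bar, y_bar = 0.0, 0.0
for n in range(1, 100_001):
    a_n = 1.0 / n**0.6      # fast step size
    b_n = 1.0 / n**0.9      # slow step size, so b_n / a_n -> 0
    x += a_n * (f(x, y) + rng.normal())   # noisy fast update
    y += b_n * (g(x, y) + rng.normal())   # noisy slow update
    x_bar += (x - x_bar) / n              # Polyak-Ruppert averaging
    y_bar += (y - y_bar) / n

print(x_bar, y_bar)   # both averaged components approach the root (0, 0)
```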

    Estimating the geometric median in Hilbert spaces with stochastic gradient algorithms: L^p and almost sure rates of convergence

    The geometric median, also called the L^1-median, is often used in robust statistics. Moreover, it is increasingly common to deal with large samples taking values in high-dimensional spaces. In this context, a fast recursive estimator has been introduced by Cardot, Cénac and Zitt. This work aims at studying more precisely the asymptotic behavior of the estimators of the geometric median based on such nonlinear stochastic gradient algorithms. The L^p rates of convergence as well as almost sure rates of convergence of these estimators are derived in general separable Hilbert spaces. Moreover, the optimal rate of convergence in quadratic mean of the averaged algorithm is also given.
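
The recursion studied here moves the current estimate a decreasing step along the unit vector pointing from the estimate to each new observation, which is the stochastic gradient of the L^1 criterion. A minimal Python sketch of this recursion and of its averaged version follows; the heavy-tailed sample and the step-size exponent are illustrative assumptions, not the tuning analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 50                                        # dimension (illustrative)
X = rng.standard_t(df=2, size=(20_000, d))    # heavy-tailed, median at 0

# Geometric median recursion:
#   Z_{n+1} = Z_n + gamma_n (X_{n+1} - Z_n) / |X_{n+1} - Z_n|
z = X[0].copy()
z_bar = z.copy()
for n, x in enumerate(X[1:], start=1):
    gamma_n = 1.0 / n**0.66      # decreasing step size (assumed exponent)
    diff = x - z
    norm = np.linalg.norm(diff)
    if norm > 0:                 # the unit gradient is undefined at x = z
        z += gamma_n * diff / norm
    z_bar += (z - z_bar) / (n + 1)   # averaged version of the algorithm

print(np.linalg.norm(z_bar))   # close to 0, the median of this symmetric law
```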

    A fast and recursive algorithm for clustering large datasets with k-medians

    Clustering large samples of high-dimensional data with fast algorithms is an important challenge in computational statistics. Borrowing ideas from MacQueen (1967), who introduced a sequential version of the k-means algorithm, a new class of recursive stochastic gradient algorithms designed for the k-medians loss criterion is proposed. By their recursive nature, these algorithms are very fast and are well adapted to deal with large samples of data that are allowed to arrive sequentially. It is proved that the stochastic gradient algorithm converges almost surely to the set of stationary points of the underlying loss criterion. Particular attention is paid to the averaged versions, which are known to have better performance, and a data-driven procedure that allows automatic selection of the value of the descent step is proposed. The performance of the averaged sequential estimator is compared in a simulation study, both in terms of computation speed and accuracy of the estimations, with more classical partitioning techniques such as k-means, trimmed k-means and PAM (partitioning around medoids). Finally, this new online clustering technique is illustrated by determining television audience profiles with a sample of more than 5000 individual television audiences measured every minute over a period of 24 hours.
    Comment: Under revision for Computational Statistics and Data Analysis
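
A minimal sketch of a sequential k-medians update in the spirit of MacQueen's recursive k-means: each incoming point moves its nearest center one decreasing step along the unit direction toward the point, and an averaged version of each center is maintained. The simulated stream, the per-cluster step sizes and the initialization are assumptions made for the example, not the data-driven procedure proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
k, d = 3, 2
# Illustrative stream: three Gaussian clusters (an assumption for the sketch)
true_centers = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])
stream = true_centers[rng.integers(k, size=50_000)] + rng.normal(size=(50_000, d))

centers = stream[:k].copy()      # initialize with the first k points
centers_bar = centers.copy()     # averaged version of each center
counts = np.ones(k)
for x in stream[k:]:
    j = np.argmin(np.linalg.norm(centers - x, axis=1))   # nearest center
    counts[j] += 1
    step = 1.0 / counts[j]**0.66   # per-cluster decreasing step (assumed form)
    diff = x - centers[j]
    norm = np.linalg.norm(diff)
    if norm > 0:
        # stochastic gradient step on the k-medians criterion E min_j |X - c_j|
        centers[j] += step * diff / norm
    centers_bar[j] += (centers[j] - centers_bar[j]) / counts[j]

print(np.round(centers_bar, 2))   # close to the three cluster medians
```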

    Performance of a Distributed Stochastic Approximation Algorithm

    In this paper, a distributed stochastic approximation algorithm is studied. Applications of such algorithms include decentralized estimation, optimization, control or computing. The algorithm consists of two steps: a local step, where each node in a network updates a local estimate using a stochastic approximation algorithm with decreasing step size, and a gossip step, where a node computes a local weighted average between its estimate and those of its neighbors. Convergence of the estimates toward a consensus is established under weak assumptions. The approach relies on two main ingredients: the existence of a Lyapunov function for the mean field in the agreement subspace, and a contraction property of the random matrices of weights in the subspace orthogonal to the agreement subspace. A second-order analysis of the algorithm is also performed in the form of a Central Limit Theorem. The Polyak-averaged version of the algorithm is also considered.
    Comment: IEEE Transactions on Information Theory, 2013
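
The local-step/gossip-step structure can be sketched in a few lines. In this illustrative Python example, each node runs a scalar Robbins-Monro recursion toward the same root and then averages its estimate with those of its ring neighbors through a doubly stochastic weight matrix; the network topology, the mean field and the step sizes are assumptions, not the general setting of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n_nodes = 5
# Doubly stochastic gossip weights on a ring (an assumed topology)
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

theta = 2.0                     # common root of the mean field -(x - theta)
x = rng.normal(size=n_nodes)    # one scalar estimate per node
for n in range(1, 50_001):
    gamma_n = 1.0 / n
    # local step: noisy stochastic approximation update at every node
    x = x + gamma_n * (-(x - theta) + rng.normal(size=n_nodes))
    # gossip step: weighted average with the neighbors' estimates
    x = W @ x

print(x)   # the nodes reach a consensus near theta = 2
```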

    A companion for the Kiefer–Wolfowitz–Blum stochastic approximation algorithm

    A stochastic algorithm for the recursive approximation of the location θ of a maximum of a regression function was introduced by Kiefer and Wolfowitz [Ann. Math. Statist. 23 (1952) 462–466] in the univariate framework, and by Blum [Ann. Math. Statist. 25 (1954) 737–744] in the multivariate case. The aim of this paper is to provide a companion algorithm to the Kiefer–Wolfowitz–Blum algorithm, which allows one to simultaneously and recursively approximate the size μ of the maximum of the regression function. A precise study of the joint weak convergence rate of both algorithms is given; it turns out that, unlike the location of the maximum, the size of the maximum can be approximated by an algorithm which converges at the parametric rate. Moreover, averaging leads to an asymptotically efficient algorithm for the approximation of the couple (θ, μ).
    Comment: Published at http://dx.doi.org/10.1214/009053606000001451 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
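
A minimal sketch of the Kiefer-Wolfowitz-Blum recursion together with a companion recursion tracking the size of the maximum: the location estimate ascends a noisy finite-difference gradient, while the size estimate is a running average of fresh observations taken at the current location. The regression function, the noise level and the step-size choices below are assumptions for the illustration, not the exact companion algorithm constructed in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Unknown regression function with maximum mu = 1 at theta = 0
# (the function and the constants are assumptions for this sketch).
m = lambda t: 1.0 - t**2
observe = lambda t: m(t) + 0.1 * rng.normal()   # noisy evaluation of m

theta, mu = 1.0, 0.0
for n in range(1, 200_001):
    a_n = 1.0 / n           # step size of the location recursion
    c_n = 1.0 / n**0.25     # width of the finite-difference window
    # Kiefer-Wolfowitz-Blum step: ascend a noisy finite-difference gradient
    grad = (observe(theta + c_n) - observe(theta - c_n)) / (2.0 * c_n)
    theta += a_n * grad
    # companion recursion: running average of observations at the current point
    mu += (observe(theta) - mu) / n

print(theta, mu)   # approximately (0, 1), i.e. the couple (theta, mu)
```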

    Fast Estimation of the Median Covariation Matrix with Application to Online Robust Principal Components Analysis

    The geometric median covariation matrix is a robust multivariate indicator of dispersion which can be extended without difficulty to functional data. We define estimators, based on recursive algorithms, that can be simply updated at each new observation and are able to deal rapidly with large samples of high-dimensional data without having to store all the data in memory. Asymptotic convergence properties of the recursive algorithms are studied under weak conditions. The computation of the principal components can also be performed online, and this approach can be useful for online outlier detection. A simulation study clearly shows that this robust indicator is a competitive alternative to the minimum covariance determinant estimator when the dimension of the data is small, and to robust principal components analysis based on projection pursuit and spherical projections for high-dimensional data. An illustration on a large, high-dimensional dataset consisting of individual TV audiences measured at the minute scale over a period of 24 hours confirms the interest of the robust principal components analysis based on the median covariation matrix. All studied algorithms are available in the R package Gmedian on CRAN.
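
A rough Python sketch of the recursive idea (the paper's implementation is the R package Gmedian; everything below is an illustrative reconstruction, not that package's API): the geometric median m and the median covariation matrix V are updated jointly, V being moved one decreasing step along the unit direction, in Frobenius norm, toward each new rank-one matrix (x - m)(x - m)^T, and robust principal components are obtained as eigenvectors of the averaged estimate.

```python
import numpy as np

rng = np.random.default_rng(5)
d = 10
X = rng.standard_t(df=3, size=(50_000, d))   # heavy-tailed data (illustrative)

m = X[0].copy()     # running geometric median estimate
V = np.eye(d)       # running median covariation matrix estimate
m_bar, V_bar = m.copy(), V.copy()
for n, x in enumerate(X[1:], start=1):
    gamma_n = 1.0 / n**0.66       # step size (an assumed tuning)
    # update of the geometric median of X
    diff = x - m
    norm = np.linalg.norm(diff)
    if norm > 0:
        m += gamma_n * diff / norm
    # update of the geometric median of the matrices (x - m)(x - m)^T
    u = x - m
    D = np.outer(u, u) - V
    nrm = np.linalg.norm(D)       # Frobenius norm
    if nrm > 0:
        V += gamma_n * D / nrm
    m_bar += (m - m_bar) / (n + 1)     # averaged versions, updated online
    V_bar += (V - V_bar) / (n + 1)

# online robust PCA: eigenvectors of the averaged median covariation matrix
eigvals, eigvecs = np.linalg.eigh(V_bar)
print(eigvals[-3:])   # leading robust dispersion values
```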

    Online estimation of the geometric median in Hilbert spaces: non-asymptotic confidence balls

    Estimation procedures based on recursive algorithms are interesting and powerful techniques that are able to deal rapidly with (very) large samples of high-dimensional data. The collected data may be contaminated by noise, so that robust location indicators, such as the geometric median, may be preferred to the mean. In this context, an estimator of the geometric median based on a fast and efficient averaged nonlinear stochastic gradient algorithm has been developed by Cardot, Cénac and Zitt (2013). This work aims at studying more precisely the non-asymptotic behavior of this algorithm by giving non-asymptotic confidence balls. This new result is based on the derivation of improved L^2 rates of convergence as well as an exponential inequality for the martingale terms of the recursive nonlinear Robbins-Monro algorithm.
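
To make the non-asymptotic statement concrete, here is an illustrative Monte Carlo experiment (an assumption-laden proxy, not the confidence balls constructed in the paper): the averaged stochastic gradient estimator is replicated on independent samples from a symmetric heavy-tailed law whose geometric median is 0, and the 0.95 quantile of the estimation errors gives the radius of an empirical 95% ball.

```python
import numpy as np

rng = np.random.default_rng(6)
d, n, n_rep = 20, 5_000, 200     # all sizes are illustrative

def averaged_median(X):
    """Averaged nonlinear stochastic gradient estimate of the geometric median."""
    z = X[0].copy()
    z_bar = z.copy()
    for i, x in enumerate(X[1:], start=1):
        diff = x - z
        nrm = np.linalg.norm(diff)
        if nrm > 0:
            z += diff / (nrm * i**0.66)   # step gamma_i = i^(-0.66), assumed tuning
        z_bar += (z - z_bar) / (i + 1)
    return z_bar

# The true geometric median of this symmetric law is 0, so the norm of the
# estimate is the estimation error; replicate to get its empirical law.
errs = [np.linalg.norm(averaged_median(rng.standard_t(df=3, size=(n, d))))
        for _ in range(n_rep)]
print(np.quantile(errs, 0.95))   # empirical 95% confidence-ball radius
```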