Supremum-Norm Convergence for Step-Asynchronous Successive Overrelaxation on M-matrices
Step-asynchronous successive overrelaxation updates the values contained in a
single vector using the usual Gauß-Seidel-like weighted rule, but arbitrarily
mixing old and new values, the only constraint being temporal coherence: you
cannot use a value before it has been computed. We show that given a
nonnegative real matrix A, a σ ≥ ρ(A) and a vector w > 0 such that Aw ≤ σw,
every iteration of step-asynchronous successive overrelaxation for the problem
(sI − A)x = b, with s > σ, reduces geometrically the w-norm of the current
error by a factor that we can compute explicitly. Then, we show that given a
σ > ρ(A) it is in principle always possible to compute such a w. This property
makes it possible to estimate the supremum norm of the absolute error at each
iteration without any additional hypothesis on A, even when A is so large that
computing the product Ax is feasible, but estimating the supremum norm of
(sI − A)^{-1} is not.
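A minimal synchronous sketch of this kind of sweep may help fix ideas, writing A for the nonnegative matrix, s for the shift, and w for the positive weight vector; the 3x3 data below are illustrative assumptions, and the sweeps run in plain index order rather than the asynchronous interleavings the paper actually analyzes.

```python
# Synchronous sketch of the weighted overrelaxation sweep for (sI - A)x = b.
# All concrete values here are illustrative assumptions.

def sor_sweep(A, b, s, x, omega=1.0):
    """One in-place Gauss-Seidel-like sweep for (s*I - A)x = b."""
    n = len(b)
    for i in range(n):
        acc = sum(A[i][j] * x[j] for j in range(n) if j != i)
        gs = (b[i] + acc) / (s - A[i][i])
        x[i] = (1 - omega) * x[i] + omega * gs
    return x

def w_norm(v, w):
    """Weighted supremum norm: max_i |v_i| / w_i, for w > 0."""
    return max(abs(vi) / wi for vi, wi in zip(v, w))

A = [[0.0, 0.2, 0.1],
     [0.3, 0.0, 0.2],
     [0.1, 0.1, 0.0]]     # nonnegative, spectral radius below 0.5
b = [1.0, 1.0, 1.0]
s = 1.0                   # s exceeds every row sum, so sI - A is an M-matrix
w = [1.0, 1.0, 1.0]       # uniform weights satisfy A w <= 0.5 w here

x_ref = [0.0, 0.0, 0.0]   # near-exact solution via many sweeps
for _ in range(300):
    sor_sweep(A, b, s, x_ref)

x = [0.0, 0.0, 0.0]
errors = []
for _ in range(5):
    sor_sweep(A, b, s, x)
    errors.append(w_norm([xi - ri for xi, ri in zip(x, x_ref)], w))

# The w-norm of the error contracts geometrically at every sweep.
assert all(e2 < e1 for e1, e2 in zip(errors, errors[1:]))
```

With uniform w the w-norm is just the supremum norm, which is the error measure the abstract is concerned with.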
FrogWild! -- Fast PageRank Approximations on Graph Engines
We propose FrogWild, a novel algorithm for fast approximation of high
PageRank vertices, geared towards reducing network costs of running traditional
PageRank algorithms. Our algorithm can be seen as a quantized version of power
iteration that performs multiple parallel random walks over a directed graph.
One important innovation is that we introduce a modification to the GraphLab
framework that only partially synchronizes mirror vertices. This partial
synchronization vastly reduces the network traffic generated by traditional
PageRank algorithms, thus greatly reducing the per-iteration cost of PageRank.
On the other hand, this partial synchronization also creates dependencies
between the random walks used to estimate PageRank. Our main theoretical
innovation is the analysis of the correlations introduced by this partial
synchronization process and a bound establishing that our approximation is
close to the true PageRank vector.
We implement our algorithm in GraphLab and compare it against the default
PageRank implementation. We show that our algorithm is very fast, performing
each iteration in less than one second on the Twitter graph, and that it can be
up to 7x faster than the standard GraphLab PageRank implementation.
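The random-walk view of PageRank underlying this approach can be sketched serially; the tiny graph, walk count, and geometric walk-length scheme below are illustrative assumptions, with none of FrogWild's GraphLab integration or partial mirror synchronization.

```python
# Serial sketch of PageRank approximation by random walks, the idea behind
# quantized power iteration.  Graph and parameters are illustrative.
import random
from collections import Counter

def random_walk_pagerank(adj, num_walks=50000, damping=0.85, seed=0):
    """Estimate PageRank by where geometrically-long random walks stop."""
    rng = random.Random(seed)
    nodes = list(adj)
    hits = Counter()
    for _ in range(num_walks):
        v = rng.choice(nodes)               # uniform start / teleport target
        while rng.random() < damping:       # continue with probability 0.85
            out = adj[v]
            v = rng.choice(out) if out else rng.choice(nodes)
        hits[v] += 1                        # endpoint distribution ~ PageRank
    return {u: hits[u] / num_walks for u in nodes}

# Tiny directed graph in which every other node links to "c".
adj = {"a": ["c"], "b": ["c"], "c": ["a"], "d": ["c"]}
pr = random_walk_pagerank(adj)
top = max(pr, key=pr.get)   # highest-PageRank vertex, as FrogWild targets
```

Stopping each walk with probability 1 − damping makes the endpoint distribution match PageRank with uniform teleportation, so ranking vertices by endpoint counts identifies the high-PageRank vertices without forming the full matrix.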
The Modified Matrix Splitting Iteration Method for Computing PageRank Problem
In this paper, based on the iteration methods [3,10], we propose a modified multi-step power-inner-outer (MMPIO) iteration method for solving the PageRank problem. In the MMPIO iteration method, we use multi-step matrix splitting iterations instead of the power method and combine them with the inner-outer iteration [24]. The convergence of the MMPIO iteration method is analyzed in detail, and some comparison results are given. Several numerical examples are presented to illustrate the effectiveness of the proposed algorithm.
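The inner-outer building block referenced here can be sketched in a few lines: solve (I − αP)x = (1 − α)v with outer iterations damped by a smaller β, each outer step solving (I − βP)x = f approximately by simple Richardson (power-like) sweeps. The 3x3 column-stochastic P and all parameter values are illustrative assumptions, not the MMPIO scheme itself.

```python
# Minimal sketch of the inner-outer PageRank iteration; illustrative data only.

def mat_vec(P, x):
    """Dense matrix-vector product P @ x."""
    n = len(x)
    return [sum(P[i][j] * x[j] for j in range(n)) for i in range(n)]

def inner_outer_pagerank(P, alpha=0.85, beta=0.5, tol=1e-10, inner_tol=1e-2):
    n = len(P)
    v = [1.0 / n] * n              # uniform teleportation vector
    x = v[:]
    while True:
        Px = mat_vec(P, x)
        # Residual of the original PageRank system at the current iterate.
        r = [(1 - alpha) * v[i] + alpha * Px[i] - x[i] for i in range(n)]
        if max(abs(ri) for ri in r) < tol:
            return x
        # Right-hand side of the inner system (I - beta*P) x = f.
        f = [(alpha - beta) * Px[i] + (1 - alpha) * v[i] for i in range(n)]
        while True:   # inner Richardson sweeps: x <- f + beta * P x
            Pxi = mat_vec(P, x)
            x_new = [f[i] + beta * Pxi[i] for i in range(n)]
            done = max(abs(u - t) for u, t in zip(x_new, x)) < inner_tol
            x = x_new
            if done:
                break

# Small column-stochastic link matrix (columns sum to 1).
P = [[0.0, 0.5, 0.3],
     [0.5, 0.0, 0.7],
     [0.5, 0.5, 0.0]]
x = inner_outer_pagerank(P)   # entries of x sum to 1 at the solution
```

Because β < α, the inner system is better conditioned than the original one, which is the point of the inner-outer splitting; a multi-step method in the MMPIO spirit would replace the simple Richardson sweeps with matrix splitting iterations.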
Asynchronous iterative solution for dominant eigenvectors with applications in performance modelling and PageRank
Convergence of iterative aggregation/disaggregation methods based on splittings with cyclic iteration matrices
Iterative aggregation/disaggregation methods (IAD) belong to competitive tools for computation the characteristics of Markov chains as shown in some publications devoted to testing and comparing various methods designed to this purpose. According to Dayar T., Stewart W.J., ``Comparison of
partitioning techniques for two-level iterative solvers on large, sparse Markov chains,\u27\u27 SIAM J. Sci. Comput., Vol.21, No. 5, 1691-1705 (2000), the IAD methods are effective in particular when applied to large ill posed problems. One of the purposes of this
paper is to contribute to a possible explanation of this fact. The
novelty may consist of the fact that the IAD algorithms do converge independently of whether the iteration matrix of the corresponding process is primitive or not. Some numerical tests
are presented and possible applications mentioned; e.g. computing the PageRank
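A toy two-level IAD cycle for the stationary vector of a small row-stochastic Markov matrix illustrates the aggregate-then-disaggregate pattern; the chain, the two-block partition, and the single power-step smoother are illustrative assumptions, not the schemes analyzed in the paper.

```python
# Toy iterative aggregation/disaggregation (IAD) cycle; illustrative data only.

def vec_mat(x, P):
    """Row vector times row-stochastic matrix: one step of the chain."""
    n = len(x)
    return [sum(x[i] * P[i][j] for i in range(n)) for j in range(n)]

P = [[0.5, 0.4, 0.1, 0.0],
     [0.3, 0.5, 0.0, 0.2],
     [0.1, 0.0, 0.6, 0.3],
     [0.0, 0.2, 0.3, 0.5]]
blocks = [[0, 1], [2, 3]]   # partition of the four states into two aggregates

x = [0.25] * 4
for _ in range(50):
    x = vec_mat(x, P)                          # smoothing (power) step
    mass = [sum(x[i] for i in blk) for blk in blocks]
    # Aggregation: 2x2 coarse chain weighted by the current iterate.
    C = [[sum(x[i] * P[i][j] for i in blocks[a] for j in blocks[b]) / mass[a]
          for b in range(2)] for a in range(2)]
    # Stationary vector of a 2-state chain in closed form: (q, p) / (p + q).
    p, q = C[0][1], C[1][0]
    coarse = [q / (p + q), p / (p + q)]
    # Disaggregation: rescale within each block to match the coarse masses.
    for a, blk in enumerate(blocks):
        for i in blk:
            x[i] *= coarse[a] / mass[a]

residual = max(abs(xi - yi) for xi, yi in zip(vec_mat(x, P), x))
```

At the stationary vector the aggregated chain reproduces the exact block masses, so the disaggregation step leaves the iterate unchanged; that fixed-point property is what the convergence analyses in this line of work build on.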
- …