Sharp entrywise perturbation bounds for Markov chains
For many Markov chains of practical interest, the invariant distribution is
extremely sensitive to perturbations of some entries of the transition matrix,
but insensitive to others; we give an example of such a chain, motivated by a
problem in computational statistical physics. We derive perturbation
bounds on the relative error of the invariant distribution that reveal these
variations in sensitivity.
Our bounds are sharp; we impose no structural assumptions on the
transition matrix or on the perturbation, and computing the bounds has the same
complexity as computing the invariant distribution or other bounds in the
literature. Moreover, our bounds have a simple interpretation in terms of
hitting times, which can be used to draw intuitive yet rigorous conclusions
about the sensitivity of a chain to various types of perturbations.
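As a concrete illustration of this entrywise variation in sensitivity, the following sketch builds a small hypothetical 3-state chain (an illustrative construction, not the paper's example) with a sticky "bottleneck" state, and estimates by finite differences how strongly each transition entry moves the invariant distribution:

```python
import numpy as np

# Hypothetical 3-state chain: states 0 and 1 communicate quickly, while
# state 2 is metastable and reached only via low-probability transitions.
P = np.array([
    [0.89, 0.10, 0.01],
    [0.10, 0.89, 0.01],
    [0.05, 0.05, 0.90],
])

def invariant(P):
    """Invariant distribution: left eigenvector of P for eigenvalue 1."""
    w, V = np.linalg.eig(P.T)
    v = np.real(V[:, np.argmax(np.real(w))])
    return v / v.sum()

pi = invariant(P)

def sensitivity(P, i, j, k, eps=1e-7):
    """Finite-difference sensitivity of the invariant distribution to
    entry P[i, j], compensating on P[i, k] so rows stay stochastic."""
    Q = P.copy()
    Q[i, j] += eps
    Q[i, k] -= eps
    return np.abs(invariant(Q) - invariant(P)).max() / eps

# The bottleneck entry P[0, 2] moves the invariant distribution far more
# than the fast entry P[0, 1] under a perturbation of the same size.
s_bottleneck = sensitivity(P, 0, 2, 0)
s_fast = sensitivity(P, 0, 1, 0)
print(s_bottleneck, s_fast)
```

The gap between the two sensitivities reflects the hitting-time intuition mentioned above: perturbing mass into a hard-to-reach state matters much more than shuffling mass between fast-mixing states.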
Perturbation analysis in verification of discrete-time Markov chains
Perturbation analysis in probabilistic verification addresses the robustness and sensitivity problem for verification of stochastic models against qualitative and quantitative properties. We identify two types of perturbation bounds, namely non-asymptotic bounds and asymptotic bounds. Non-asymptotic bounds are exact, pointwise bounds that quantify the upper and lower bounds of the verification result subject to a given perturbation of the model, whereas asymptotic bounds are closed-form bounds that approximate non-asymptotic bounds by assuming that the given perturbation is sufficiently small. We perform perturbation analysis in the setting of Discrete-time Markov Chains. We consider three basic matrix norms to capture the perturbation distance, and focus on the computational aspect. Our main contributions include algorithms and tight complexity bounds for calculating both non-asymptotic bounds and asymptotic bounds with respect to the three perturbation distances. © 2014 Springer-Verlag
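The distinction between the two kinds of bounds can be sketched on a toy reachability property. The DTMC below and the chosen perturbation are illustrative assumptions, not the paper's algorithms: the "non-asymptotic" quantity is obtained by re-verifying an explicitly perturbed model, while the "asymptotic" quantity is a closed-form first-order bound valid only for small perturbations.

```python
import numpy as np

# Hypothetical 4-state DTMC: states 0 and 1 are transient,
# state 2 is the goal, state 3 is a failure sink.
P = np.array([
    [0.0, 0.7, 0.2, 0.1],
    [0.5, 0.0, 0.3, 0.2],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])

def reach_goal(P):
    """Probability of eventually reaching state 2 from each transient
    state: solve (I - A) x = b with A the transient block."""
    A = P[:2, :2]
    b = P[:2, 2]
    return np.linalg.solve(np.eye(2) - A, b)

x = reach_goal(P)

# Non-asymptotic flavour: re-verify one explicit perturbed model
# (perturbation rows must sum to zero so rows of P stay stochastic).
delta = 0.01
Delta = np.zeros_like(P)
Delta[0, 1], Delta[0, 3] = delta / 2, -delta / 2
x_pert = reach_goal(P + Delta)

# Asymptotic flavour: first-order bound |x' - x| <= ||N|| ||dA|| ||x||
# with N = (I - A)^{-1}, in the infinity norm, for small perturbations.
N = np.linalg.inv(np.eye(2) - P[:2, :2])
first_order = np.linalg.norm(N, np.inf) * (delta / 2) * np.max(np.abs(x))
print(abs(x_pert[0] - x[0]), first_order)
```

Here the recomputed (pointwise) change is below the closed-form first-order bound, as expected when the perturbation is small.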
Asymmetry Helps: Eigenvalue and Eigenvector Analyses of Asymmetrically Perturbed Low-Rank Matrices
This paper is concerned with the interplay between statistical asymmetry and
spectral methods. Suppose we are interested in estimating a rank-1 and
symmetric matrix $M^{\star}$, yet only a randomly perturbed version
$M = M^{\star} + H$ is observed. The noise matrix $H$ is composed of zero-mean
independent (but not necessarily homoscedastic) entries and is, therefore, not
symmetric in general. This might arise, for example, when we have two
independent samples for each entry of $M^{\star}$ and arrange them into an
{\em asymmetric} data matrix $M$. The aim is to estimate the leading eigenvalue
and eigenvector of $M^{\star}$. We demonstrate that the leading eigenvalue of
the data matrix $M$ can be $\sqrt{n}$ times more accurate --- up to some log
factor --- than its (unadjusted) leading singular value in eigenvalue
estimation. Further, the perturbation of any linear form of the leading
eigenvector of $M$ --- say, entrywise eigenvector perturbation --- is provably
well-controlled. This eigen-decomposition approach is fully adaptive to
heteroscedasticity of noise without the need of careful bias correction or any
prior knowledge about the noise variance. We also provide partial theory for
the more general rank-$r$ case. The takeaway message is this: arranging the
data samples in an asymmetric manner and performing eigen-decomposition could
sometimes be beneficial. Comment: accepted to Annals of Statistics, 2020. 37 pages.
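The phenomenon is easy to reproduce in simulation. The sketch below (dimension, noise level, and number of trials are illustrative choices, not the paper's) plants a rank-1 symmetric signal, adds fully independent (hence asymmetric) noise, and compares the leading eigenvalue of the data matrix with its leading singular value as estimators of the true eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam, trials = 300, 1.0, 20
tau = 0.3 / np.sqrt(n)   # entrywise noise level (hypothetical choice)

eig_errs, sv_errs = [], []
for _ in range(trials):
    u = rng.standard_normal(n)
    u /= np.linalg.norm(u)
    M_star = lam * np.outer(u, u)            # rank-1 symmetric ground truth
    H = tau * rng.standard_normal((n, n))    # independent entries: asymmetric
    M = M_star + H

    # Leading eigenvalue of the asymmetric matrix vs. its leading
    # singular value, both used as estimates of lam.
    eig_errs.append(abs(np.max(np.linalg.eigvals(M).real) - lam))
    sv_errs.append(abs(np.linalg.svd(M, compute_uv=False)[0] - lam))

print(np.mean(eig_errs), np.mean(sv_errs))
```

The singular value carries a systematic upward bias of order the squared noise level, while the eigenvalue of the asymmetric matrix largely cancels it, which is the advantage the abstract describes.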
Two approaches to the construction of perturbation bounds for continuous-time Markov chains
This paper is largely a review. It considers the two main methods used
to study stability and to obtain quantitative perturbation bounds for
(inhomogeneous) Markov chains with continuous time and a
finite or countable state space. An approach to constructing
perturbation bounds is described for the five main classes of such chains
associated with queuing models. Several specific models are considered for
which the limiting characteristics and perturbation bounds for admissible
"perturbed" processes are calculated.
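A minimal numerical sketch of the underlying question, using one of the simplest queueing models (an M/M/1/K queue; the model and rates are illustrative choices, and the paper's inhomogeneous, countable-state setting is much richer): perturb a transition rate of the generator and measure how far the stationary distribution moves.

```python
import numpy as np

def stationary(Q):
    """Stationary distribution of a finite CTMC generator Q:
    solve pi Q = 0 with the normalization sum(pi) = 1."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    return np.linalg.lstsq(A, b, rcond=None)[0]

def mm1k_generator(arrival, service, K):
    """Generator of an M/M/1/K queue (a finite birth-death chain)."""
    Q = np.zeros((K + 1, K + 1))
    for i in range(K):
        Q[i, i + 1] = arrival
        Q[i + 1, i] = service
    np.fill_diagonal(Q, -Q.sum(axis=1))
    return Q

pi = stationary(mm1k_generator(1.0, 2.0, 10))
pi_pert = stationary(mm1k_generator(1.05, 2.0, 10))  # 5% perturbed arrivals
print(np.abs(pi - pi_pert).sum())   # total-variation-style gap
```

Perturbation bounds of the kind surveyed in the paper control such gaps analytically, without recomputing the perturbed stationary distribution.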
Relative perturbation theory for diagonally dominant matrices
In this paper, strong relative perturbation bounds are developed for a number of linear algebra problems involving diagonally dominant matrices. The key point is to parameterize diagonally dominant matrices using their off-diagonal entries and diagonally dominant parts and to consider small relative componentwise perturbations of these parameters. This allows us to obtain new relative perturbation bounds for the inverse, the solution to linear systems, the symmetric indefinite eigenvalue problem, the singular value problem, and the nonsymmetric eigenvalue problem. These bounds are much stronger than traditional perturbation results, since they are independent of either the standard condition number or the magnitude of eigenvalues/singular values. Together with previously derived perturbation bounds for the LDU factorization and the symmetric positive definite eigenvalue problem, this paper presents a complete and detailed account of relative structured perturbation theory for diagonally dominant matrices. This research was partially supported by the Ministerio de Economía y Competitividad of Spain under grant MTM2012-32542.
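The parameterization can be sketched numerically. Below, a nearly singular diagonally dominant matrix (the entries and perturbation size are illustrative assumptions, not taken from the paper) is represented by its off-diagonal entries and dominance parts v_i = a_ii - sum_{j != i} |a_ij|; small relative perturbations of these parameters leave the inverse accurate to roughly the same relative level, far better than the classical condition-number prediction:

```python
import numpy as np

# Off-diagonal entries and diagonally dominant parts of a 3x3 matrix.
offdiag = np.array([[0.0, -1.0, -0.5],
                    [-0.3, 0.0, -1.2],
                    [-0.7, -0.4, 0.0]])
v = np.full(3, 1e-6)   # tiny dominance: the matrix is nearly singular

def assemble(offdiag, v):
    """Rebuild A from its parameters: a_ii = v_i + sum_j |a_ij|."""
    A = offdiag.copy()
    np.fill_diagonal(A, v + np.abs(offdiag).sum(axis=1))
    return A

A = assemble(offdiag, v)

# Independent relative componentwise perturbations of the parameters.
rng = np.random.default_rng(1)
eps = 1e-8
A_pert = assemble(offdiag * (1 + eps * rng.uniform(-1, 1, offdiag.shape)),
                  v * (1 + eps * rng.uniform(-1, 1, v.shape)))

inv_A = np.linalg.inv(A)
rel_err = np.max(np.abs(np.linalg.inv(A_pert) - inv_A) / np.abs(inv_A))
kappa = np.linalg.cond(A)
print(rel_err, kappa * eps)   # observed error vs. naive cond-based estimate
```

The observed entrywise relative error stays near eps even though the standard condition number is enormous, which is the structured behaviour the bounds capture.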
On Kemeny's constant for trees with fixed order and diameter
Kemeny's constant of a connected graph $G$ is a measure of the
expected transit time for the random walk associated with $G$. In the current
work, we consider the case when $G$ is a tree, and, in this setting, we provide
lower and upper bounds for Kemeny's constant in terms of the order and diameter
of $G$ by using two different techniques. The lower bound is given as
Kemeny's constant of a particular caterpillar tree and, as a consequence, it is
sharp. The upper bound is found via induction, by repeatedly removing pendent
vertices from $G$. By considering a specific family of trees - the broom-stars
- we show that the upper bound is asymptotically sharp. Comment: 20 pages, 5 figures.
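Kemeny's constant is straightforward to compute via the standard eigenvalue formula K = sum_{i>=2} 1/(1 - lambda_i) over the nonunit eigenvalues of the walk's transition matrix. The sketch below (a generic illustration, not the paper's constructions) compares two trees of the same order but different diameters, a path and a star:

```python
import numpy as np

def kemeny(adj):
    """Kemeny's constant of the random walk on a graph with adjacency
    matrix adj, via K = sum_{i>=2} 1/(1 - lambda_i) for the eigenvalues
    of P = D^{-1} A (computed through a symmetric similarity transform)."""
    deg = adj.sum(axis=1)
    S = adj / np.sqrt(np.outer(deg, deg))   # D^{-1/2} A D^{-1/2}
    lam = np.sort(np.linalg.eigvalsh(S))[:-1]   # drop the eigenvalue 1
    return np.sum(1.0 / (1.0 - lam))

def path(n):
    A = np.zeros((n, n))
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = 1
    return A

def star(n):
    A = np.zeros((n, n))
    A[0, 1:] = A[1:, 0] = 1
    return A

# Same order (6 vertices), different diameters: 5 for the path, 2 for
# the star. The longer-diameter tree has the larger Kemeny's constant.
print(kemeny(path(6)), kemeny(star(6)))
```

For the star on n vertices the walk's eigenvalues are 1, -1, and 0 (with multiplicity n - 2), so its Kemeny's constant is exactly (n - 2) + 1/2, which makes the function easy to sanity-check.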