Bayesian Inference of Log Determinants
The log-determinant of a kernel matrix appears in a variety of machine
learning problems, ranging from determinantal point processes and generalized
Markov random fields, through to the training of Gaussian processes. Exact
calculation of this term is often intractable when the size of the kernel
matrix exceeds a few thousand. In the spirit of probabilistic numerics, we
reinterpret the problem of computing the log-determinant as a Bayesian
inference problem. In particular, we combine prior knowledge in the form of
bounds from matrix theory and evidence derived from stochastic trace estimation
to obtain probabilistic estimates for the log-determinant and its associated
uncertainty within a given computational budget. Beyond its novelty and
theoretic appeal, the performance of our proposal is competitive with
state-of-the-art approaches to approximating the log-determinant, while also
quantifying the uncertainty due to budget-constrained evidence.
Comment: 12 pages, 3 figures
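The stochastic trace estimation step mentioned above can be sketched with a Hutchinson estimator of tr(log K) = log det K. In this small synthetic sketch the matrix logarithm is taken exactly by eigendecomposition, which is only viable at small scale (the paper's point is to avoid this and quantify the resulting uncertainty); the data, kernel, and sizes below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical SPD "kernel" matrix: an RBF Gram matrix with jitter.
n = 200
X = rng.standard_normal((n, 2))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq) + 1e-3 * np.eye(n)

# Matrix logarithm via eigendecomposition (small-n stand-in for the
# Chebyshev/Lanczos expansions used when K is large).
w, V = np.linalg.eigh(K)
logK = (V * np.log(w)) @ V.T

# Hutchinson estimator: tr(log K) ~ (1/m) sum_i z_i^T log(K) z_i,
# with Rademacher probe vectors z_i; m is the evidence budget.
m = 64
Z = rng.choice([-1.0, 1.0], size=(n, m))
est = np.mean(np.einsum("ij,ij->j", Z, logK @ Z))

# Ground truth for comparison.
exact = np.linalg.slogdet(K)[1]
print(est, exact)
```

Each probe gives an unbiased sample of the log-determinant; the spread across probes is the raw material for the uncertainty quantification described in the abstract.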
On landmark selection and sampling in high-dimensional data analysis
In recent years, the spectral analysis of appropriately defined kernel
matrices has emerged as a principled way to extract the low-dimensional
structure often prevalent in high-dimensional data. Here we provide an
introduction to spectral methods for linear and nonlinear dimension reduction,
emphasizing ways to overcome the computational limitations currently faced by
practitioners with massive datasets. In particular, a data subsampling or
landmark selection process is often employed to construct a kernel based on
partial information, followed by an approximate spectral analysis termed the
Nystrom extension. We provide a quantitative framework to analyse this
procedure, and use it to demonstrate algorithmic performance bounds on a range
of practical approaches designed to optimize the landmark selection process. We
compare the practical implications of these bounds by way of real-world
examples drawn from the field of computer vision, whereby low-dimensional
manifold structure is shown to emerge from high-dimensional video data streams.
Comment: 18 pages, 6 figures, submitted for publication
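The Nystrom extension described above is compact enough to sketch. Here the landmarks are chosen uniformly at random, whereas the approaches analysed in the paper optimize this selection; the data, kernel, and sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

n, d, m = 400, 2, 40            # n points, m landmarks
X = rng.standard_normal((n, d))  # hypothetical data

def rbf(A, B, gamma=0.5):
    """RBF kernel matrix between row sets A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Landmark selection: uniform subsampling (the schemes analysed in the
# paper weight or optimize this step).
idx = rng.choice(n, size=m, replace=False)
C = rbf(X, X[idx])   # n x m cross-kernel against the landmarks
W = C[idx]           # m x m kernel among the landmarks

# Nystrom extension: K ~ C W^+ C^T, built from partial information only.
K_approx = C @ np.linalg.pinv(W) @ C.T

# Compare against the full kernel (affordable only at this toy scale).
K_exact = rbf(X, X)
err = np.linalg.norm(K_exact - K_approx) / np.linalg.norm(K_exact)
print(err)
```

Only the n x m block C and the m x m block W are ever formed, which is what makes the subsequent spectral analysis feasible for massive datasets.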
Matrix geometric approach for random walks: stability condition and equilibrium distribution
In this paper, we analyse a sub-class of two-dimensional homogeneous nearest
neighbour (simple) random walks restricted to the lattice using the matrix
geometric approach. In particular, we first present an alternative approach to
calculating the stability condition, extending Neuts' drift condition [30] and
connecting it with the Lyapunov-function result of Fayolle et al. [13].
Furthermore, we consider the sub-class of
random walks with equilibrium distributions given as series of product-forms
and, for this class of random walks, we calculate the eigenvalues and the
corresponding eigenvectors of the infinite matrix appearing in the
matrix geometric approach. This result is obtained by connecting and extending
three existing approaches available for such an analysis: the matrix geometric
approach, the compensation approach and the boundary value problem method. In
this paper, we also present the spectral properties of the infinite matrix
- …
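For a standard quasi-birth-death (QBD) formulation — a stand-in here for the specific random walks analysed above — the drift-based stability check and the matrix-geometric fixed point R = A0 + R A1 + R^2 A2 can be sketched as follows, with hypothetical phase blocks A0 (level up), A1 (same level), A2 (level down):

```python
import numpy as np

# Hypothetical QBD transition blocks; A0 + A1 + A2 must be stochastic.
A0 = np.array([[0.10, 0.05], [0.05, 0.10]])  # level up
A1 = np.array([[0.25, 0.20], [0.20, 0.25]])  # same level
A2 = np.array([[0.30, 0.10], [0.10, 0.30]])  # level down
A = A0 + A1 + A2

# Neuts' drift condition: mean upward drift < mean downward drift,
# weighted by the stationary distribution pi_A of the phase process A.
w, V = np.linalg.eig(A.T)
pi = np.real(V[:, np.argmax(np.real(w))])
pi /= pi.sum()
up, down = pi @ A0.sum(1), pi @ A2.sum(1)
print("stable:", up < down)

# Minimal nonnegative solution of R = A0 + R A1 + R^2 A2 by the
# natural fixed-point iteration, starting from R = 0.
R = np.zeros_like(A0)
for _ in range(500):
    R = A0 + R @ A1 + R @ R @ A2
print("spectral radius of R:", max(abs(np.linalg.eigvals(R))))
```

Under the drift condition the spectral radius of R is below one, and the equilibrium distribution is matrix-geometric: each level's probability vector is the previous one post-multiplied by R.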