Fast and scalable non-parametric Bayesian inference for Poisson point processes
We study the problem of non-parametric Bayesian estimation of the intensity
function of a Poisson point process. The observations are independent
realisations of a Poisson point process on a fixed interval. We propose two
related approaches. In both approaches we model the intensity function as
piecewise constant on bins forming a partition of the interval. In
the first approach the coefficients of the intensity function are assigned
independent gamma priors, leading to a closed-form posterior distribution. On
the theoretical side, we prove that, as the sample size grows, the posterior
asymptotically concentrates around the "true", data-generating intensity
function at an optimal rate for Hölder-regular intensity functions. In the
second approach we employ a gamma Markov chain prior on the
coefficients of the intensity function. The posterior distribution is no longer
available in closed form, but inference can be performed using a
straightforward version of the Gibbs sampler. Both approaches scale well with
sample size, but the second is much less sensitive to the choice of the number of bins.
Practical performance of our methods is first demonstrated via synthetic data
examples. We compare our second method with other existing approaches on the UK
coal mining disasters data. Furthermore, we apply it to the US mass shootings
data and Donald Trump's Twitter data.
Comment: 45 pages, 22 figures
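The closed-form posterior in the first approach follows from gamma–Poisson conjugacy: with independent gamma priors on the per-bin coefficients, each bin's posterior is again gamma, with the shape updated by the observed event count and the rate by the total exposure. A minimal sketch of this update (illustrative only, not the authors' code; the interval length, bin count, prior parameters, and test intensity are all assumed):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: n independent realisations of a Poisson process on [0, T]
# with a piecewise-constant "true" intensity, N equal-width bins.
T, n, N = 10.0, 50, 20            # interval length, sample size, bin count
alpha, beta = 1.0, 1.0            # independent Gamma(alpha, beta) priors
edges = np.linspace(0.0, T, N + 1)
width = T / N

true_rate = 2.0 + np.sin(edges[:-1])   # illustrative per-bin intensity
events = np.concatenate([
    # simulate each realisation bin by bin: Poisson count, uniform locations
    rng.uniform(e, e + width, rng.poisson(r * width))
    for _ in range(n) for e, r in zip(edges[:-1], true_rate)
])

counts, _ = np.histogram(events, bins=edges)

# Conjugacy: Gamma(alpha, beta) prior + Poisson counts gives a Gamma
# posterior per bin with shape alpha + count and rate beta + n * width.
post_shape = alpha + counts
post_rate = beta + n * width
post_mean = post_shape / post_rate     # posterior mean intensity per bin
```

With this parametrisation the whole posterior is computed in one vectorised pass over the bins, which is what makes the first approach scale so well with sample size.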
Efficient Non-parametric Bayesian Hawkes Processes
In this paper, we develop an efficient non-parametric Bayesian method for
estimating the kernel function of Hawkes processes. The non-parametric Bayesian approach
is important because it provides flexible Hawkes kernels and quantifies their
uncertainty. Our method is based on the cluster representation of Hawkes
processes. Utilizing the stationarity of the Hawkes process, we efficiently
sample random branching structures and thus, we split the Hawkes process into
clusters of Poisson processes. We derive two algorithms -- a block Gibbs
sampler and a maximum a posteriori estimator based on expectation maximization
-- and we show that our methods have a linear time complexity, both
theoretically and empirically. On synthetic data, we show our methods to be
able to infer flexible Hawkes triggering kernels. On two large-scale Twitter
diffusion datasets, we show that our methods outperform the current
state-of-the-art in goodness-of-fit and that the time complexity is linear in
the size of the dataset. We also observe that, on diffusions related to online
videos, the learned kernels reflect the perceived longevity of different
content types, such as music or pet videos.
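The cluster representation the abstract relies on assigns each event a latent parent: either the background process (an immigrant) or an earlier event that triggered it, with weight given by the kernel evaluated at the time lag. Conditional on these parents, the process decomposes into independent Poisson clusters. A toy sketch of one such branching-structure sampling step (illustrative assumptions throughout: the background rate `mu`, the exponential kernel `phi`, and the event times are invented, and this is not the authors' block Gibbs sampler):

```python
import numpy as np

rng = np.random.default_rng(1)

mu = 0.5                                  # assumed background intensity
phi = lambda dt: 0.8 * np.exp(-dt)        # assumed exponential triggering kernel

times = np.array([0.3, 0.9, 1.1, 2.4, 2.5])   # toy event times

def sample_parents(times, mu, phi, rng):
    """Sample a latent parent for each event: -1 means immigrant,
    j >= 0 means triggered by the earlier event j."""
    parents = np.empty(len(times), dtype=int)
    for i, t in enumerate(times):
        # weights: background rate, then kernel values at lags to earlier events
        w = np.concatenate([[mu], phi(t - times[:i])])
        parents[i] = rng.choice(i + 1, p=w / w.sum()) - 1
    return parents

parents = sample_parents(times, mu, phi, rng)
```

Each event's parent distribution only involves events before it, so a sweep over the data is linear in the number of events once the kernel support is effectively bounded, consistent with the linear time complexity the abstract claims.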
Object Edge Contour Localisation Based on HexBinary Feature Matching
This paper addresses the issue of localising object
edge contours in cluttered backgrounds to support robotics
tasks such as grasping and manipulation and also to improve
the potential perceptual capabilities of robot vision systems. Our
approach is based on coarse-to-fine matching of a new recursively
constructed hierarchical, dense, edge-localised descriptor,
the HexBinary, based on the HexHoG descriptor structure first
proposed in [1]. Since Binary String image descriptors [2]–[5]
require much lower computational resources, but provide
similar or even better matching performance than, Histogram
of Oriented Gradients (HoG) descriptors, we have replaced
the HoG base descriptor fields used in HexHoG with Binary
Strings generated from first- and second-order polar derivative
approximations. The ALOI [6] dataset is used to evaluate
the HexBinary descriptors, which we demonstrate to achieve
superior performance to that of HexHoG [1] for pose
refinement. The validation of our object contour localisation
system shows promising results, correctly labelling ~86% of edgel positions and mis-labelling only ~3%.
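The computational advantage of binary-string descriptors that this abstract leverages comes from their comparison cost: matching reduces to a Hamming distance, i.e. an XOR followed by a popcount, rather than the floating-point distance computations HoG descriptors require. A minimal illustration (the descriptor values are invented and this is not the HexBinary implementation):

```python
# Binary descriptors are compared by Hamming distance: XOR the bit
# strings, then count the set bits -- cheap integer operations only.
def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

query = 0b10110100                          # toy 8-bit descriptor
candidates = [0b10110101, 0b01001011, 0b10010100]

# Nearest neighbour under Hamming distance (ties resolved to the first).
best = min(candidates, key=lambda c: hamming(query, c))
```

Real binary descriptors span hundreds of bits, but the comparison remains a handful of word-level XOR/popcount instructions, which is why they suit coarse-to-fine matching over many candidate contour positions.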