Sequential Metric Dimension
Seager introduced the following game in 2013. An invisible and immobile target is hidden at some vertex of a graph G. At every step, one vertex v of G can be probed, which results in the knowledge of the distance between v and the target. The objective of the game is to minimize the number of steps needed to locate the target, wherever it is. We address the generalization of this game where k ≥ 1 vertices can be probed at every step. Our game also generalizes the notion of the metric dimension of a graph. Precisely, given a graph G and two integers k, ℓ ≥ 1, the Localization Problem asks whether there exists a strategy to locate a target hidden in G in at most ℓ steps by probing at most k vertices per step. We show this problem is NP-complete when k (resp., ℓ) is a fixed parameter. Our main results are for the class of trees, where we prove this problem is NP-complete when k and ℓ are part of the input but, despite this, we design a polynomial-time (+1)-approximation algorithm in trees which gives a solution using at most one more step than the optimal one. It follows that the Localization Problem is polynomial-time solvable in trees if k is fixed.
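To make the game mechanics concrete, here is a minimal sketch of one probing round: probing a vertex reveals its distance to the hidden target, which lets us discard every candidate location inconsistent with that distance. This is only an illustration of the probing/elimination idea, not the paper's tree algorithm; the graph, vertex names, and helper functions are hypothetical.

```python
from collections import deque

def bfs_distances(adj, source):
    """All shortest-path distances from source in an unweighted graph."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def probe_round(adj, candidates, probes, target):
    """One step of the game: probe up to k vertices, then keep only the
    candidate locations consistent with every observed distance."""
    for p in probes:
        d = bfs_distances(adj, p)
        observed = d[target]  # the distance revealed by probing p
        candidates = {c for c in candidates if d.get(c) == observed}
    return candidates

# Path graph 0-1-2-3-4 with an immobile target hidden at vertex 3;
# probe k = 1 vertex per step.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
cands = probe_round(adj, set(adj), probes=[0], target=3)
print(sorted(cands))  # → [3]: on a path, one probe at an endpoint suffices
```

On a path, every vertex is at a distinct distance from an endpoint, so a single probe locates the target; on general graphs, several rounds (and larger probe sets) are needed, which is exactly the trade-off between k and ℓ that the Localization Problem captures.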
Solving k-center Clustering (with Outliers) in MapReduce and Streaming, almost as Accurately as Sequentially.
Center-based clustering is a fundamental primitive for data analysis and becomes very challenging for large datasets. In this paper, we focus on the popular k-center variant which, given a set S of points from some metric space and a parameter k < |S|, requires identifying a subset of k centers minimizing the maximum distance of any point of S from its closest center. We present MapReduce and Streaming algorithms for this problem, also in the formulation with outliers. For any fixed ε > 0, the algorithms yield solutions whose approximation ratios are a mere additive term ε away from those achievable by the best known polynomial-time sequential algorithms, a result that substantially improves upon the state of the art. Our algorithms are rather simple and adapt to the intrinsic complexity of the dataset, captured by the doubling dimension D of the metric space. Specifically, our analysis shows that the algorithms become very space-efficient for the important case of small (constant) D. These theoretical results are complemented by a set of experiments on real-world and synthetic datasets of over a billion points, which show that our algorithms yield better-quality solutions than the state of the art while featuring excellent scalability, and that they also lend themselves to sequential implementations much faster than existing ones.
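The sequential baseline the abstract compares against can be illustrated with the classical farthest-first traversal (Gonzalez's greedy heuristic), a standard polynomial-time 2-approximation for k-center. This is a sketch of that baseline only, not the paper's coreset-based MapReduce/Streaming algorithms; the point set is a made-up example.

```python
import math

def gonzalez_k_center(points, k):
    """Farthest-first traversal: a sequential 2-approximation for k-center.
    Repeatedly adds the point farthest from the centers chosen so far."""
    centers = [points[0]]  # arbitrary first center
    # distance of every point to its closest chosen center so far
    d = [math.dist(p, centers[0]) for p in points]
    while len(centers) < k:
        i = max(range(len(points)), key=d.__getitem__)  # farthest point
        centers.append(points[i])
        d = [min(d[j], math.dist(points[j], points[i]))
             for j in range(len(points))]
    return centers

pts = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0), (10.0, 0.0)]
print(gonzalez_k_center(pts, 3))  # → [(0.0, 0.0), (10.0, 0.0), (5.0, 5.0)]
```

The greedy pass needs the whole dataset in memory and k sequential scans, which is precisely what breaks at billion-point scale and motivates the coreset-based distributed and streaming variants studied in the paper.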
Langevin and Hamiltonian based Sequential MCMC for Efficient Bayesian Filtering in High-dimensional Spaces
Nonlinear non-Gaussian state-space models arise in numerous applications in statistics and signal processing. In this context, one of the most successful and popular approximation techniques is the Sequential Monte Carlo (SMC) algorithm, also known as particle filtering. Nevertheless, this method tends to be inefficient when applied to high-dimensional problems. In this paper, we focus on another class of sequential inference methods, namely the Sequential Markov Chain Monte Carlo (SMCMC) techniques, which represent a promising alternative to SMC methods. After providing a unifying framework for the class of SMCMC approaches, we propose novel efficient strategies based on the principle of Langevin diffusion and Hamiltonian dynamics in order to cope with the increasing number of high-dimensional applications. Simulation results show that the proposed algorithms achieve significantly better performance compared to existing algorithms.
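The Langevin-diffusion idea mentioned above can be sketched with a single Metropolis-adjusted Langevin (MALA) update: the proposal drifts along the gradient of the log target before adding Gaussian noise, so moves are steered toward high-probability regions rather than taken blindly. This is a generic 1-D MALA kernel for illustration, not the paper's SMCMC filter; the target density and step size are hypothetical.

```python
import math, random

def mala_step(x, log_pi, grad_log_pi, eps):
    """One Metropolis-adjusted Langevin update: propose along the gradient
    of log pi plus Gaussian noise, then accept or reject."""
    y = x + 0.5 * eps**2 * grad_log_pi(x) + eps * random.gauss(0.0, 1.0)

    def log_q(b, a):
        # log density of proposing b from a (up to a common constant)
        mean = a + 0.5 * eps**2 * grad_log_pi(a)
        return -((b - mean) ** 2) / (2 * eps**2)

    log_alpha = log_pi(y) + log_q(x, y) - log_pi(x) - log_q(y, x)
    return y if math.log(random.random()) < log_alpha else x

# Hypothetical target: standard normal, log pi(x) = -x^2/2, gradient -x.
random.seed(0)
x = 3.0  # deliberately bad starting point
for _ in range(2000):
    x = mala_step(x, lambda t: -t * t / 2, lambda t: -t, eps=0.9)
print(abs(x) < 5.0)  # the chain has drifted into the bulk of the target
```

Within an SMCMC filter this kernel would be applied at each time step to samples of the current state, with log_pi replaced by the (unnormalized) filtering posterior; the Hamiltonian variant replaces the one-step Langevin drift with several leapfrog steps.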