Adaptive distance measures for sequential data
Mokbel B, Paaßen B, Hammer B. Adaptive distance measures for sequential data. In: Verleysen M, ed. ESANN, 22nd European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning. Bruges, Belgium: i6doc.com; 2014: 265-270.

Recent extensions of learning vector quantization (LVQ) to general (dis-)similarity data have paved the way towards LVQ classifiers for possibly discrete, structured objects such as sequences addressed by classical alignment. In this contribution, we propose a metric learning scheme based on this framework which allows for autonomous learning of the underlying scoring matrix according to a given discriminative task. Besides facilitating the often crucial and problematic choice of the scoring matrix in applications, this extension offers increased interpretability of the results by pointing out structural invariances for the given task.
Unbiased and Consistent Nested Sampling via Sequential Monte Carlo
We introduce a new class of sequential Monte Carlo methods called Nested
Sampling via Sequential Monte Carlo (NS-SMC), which reframes the Nested
Sampling method of Skilling (2006) in terms of sequential Monte Carlo
techniques. This new framework allows convergence results to be obtained in the
setting when Markov chain Monte Carlo (MCMC) is used to produce new samples. An
additional benefit is that marginal likelihood estimates are unbiased. In
contrast to NS, the analysis of NS-SMC does not require the (unrealistic)
assumption that the simulated samples be independent. As the original NS
algorithm is a special case of NS-SMC, this provides insights as to why NS
seems to produce accurate estimates despite a typical violation of its
assumptions. For applications of NS-SMC, we give advice on tuning MCMC kernels
in an automated manner via a preliminary pilot run, and present a new method
for appropriately choosing the number of MCMC repeats at each iteration.
Finally, a numerical study is conducted where the performance of NS-SMC and
temperature-annealed SMC is compared on several challenging and realistic
problems. MATLAB code for our experiments is made available at
https://github.com/LeahPrice/SMC-NS.

Comment: 45 pages, some minor typographical errors fixed since last version.
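For orientation, a minimal sketch of the classical NS recursion that NS-SMC reframes, on a toy problem where the likelihood-constrained prior can be sampled exactly (so no MCMC kernel is needed); the live-point count and iteration budget are illustrative, and the known evidence lets the estimate be checked:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model with known evidence: prior Uniform(0,1), likelihood L(x) = x,
# so Z = integral of x dx on [0,1] = 1/2.  Sampling the prior subject to
# L > L* is exact here (Uniform(L*, 1)), standing in for the MCMC step.
N = 500                              # live points
live = rng.uniform(0, 1, N)
Z, X_prev = 0.0, 1.0
for k in range(5000):
    i = np.argmin(live)              # worst live point
    L_star = live[i]
    X = np.exp(-(k + 1) / N)         # deterministic prior-volume estimate
    Z += L_star * (X_prev - X)       # evidence increment
    X_prev = X
    live[i] = rng.uniform(L_star, 1) # replace under the likelihood constraint
Z += live.mean() * X_prev            # remaining live-point mass
print(Z)                             # close to the true evidence 0.5
```

When direct constrained sampling is impossible, the replacement draw is produced by an MCMC kernel — exactly the setting where the NS-SMC reframing supplies convergence guarantees and unbiased marginal likelihood estimates.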
Nonasymptotic analysis of adaptive and annealed Feynman-Kac particle models
Sequential and quantum Monte Carlo methods, as well as genetic type search
algorithms can be interpreted as a mean field and interacting particle
approximations of Feynman-Kac models in distribution spaces. The performance of
these population Monte Carlo algorithms is strongly related to the stability
properties of nonlinear Feynman-Kac semigroups. In this paper, we analyze these
models in terms of Dobrushin ergodic coefficients of the reference Markov
transitions and the oscillations of the potential functions. Sufficient
conditions for uniform concentration inequalities w.r.t. time are expressed
explicitly in terms of these two quantities. We provide an original
perturbation analysis that applies to annealed and adaptive Feynman-Kac models,
yielding what appear to be the first results of this kind for these types of
models. Special attention is devoted to the particular case of Boltzmann-Gibbs
measures' sampling. In this context, we design an explicit way of tuning the
number of Markov chain Monte Carlo iterations with temperature schedule. We
also design an alternative interacting particle method based on an adaptive
strategy to define the temperature increments. The theoretical analysis of the
performance of this adaptive model is much more involved as both the potential
functions and the reference Markov transitions now depend on the random
evolution on the particle model. The nonasymptotic analysis of these complex
adaptive models is an open research problem. We initiate this study with the
concentration analysis of a simplified adaptive model based on reference
Markov transitions that coincide with the limiting quantities, as the number of
particles tends to infinity.

Comment: Published at http://dx.doi.org/10.3150/14-BEJ680 in Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
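The adaptive temperature-increment strategy for Boltzmann-Gibbs sampling can be illustrated with a generic effective-sample-size (ESS) bisection rule, as commonly used in adaptive SMC samplers (the ESS threshold, potential, and Metropolis kernel below are illustrative choices, not the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(2)

# Tempering from pi0 = N(0,1) toward pi_1(x) ∝ pi0(x) exp(-V(x)).
# Each temperature increment delta is chosen by bisection so that the
# ESS of the incremental weights exp(-delta V(x)) stays at N/2.
N = 2000
V = lambda x: 0.5 * (x - 3.0) ** 2
x = rng.normal(0, 1, N)
beta = 0.0

def ess(delta):
    lw = -delta * V(x)
    w = np.exp(lw - lw.max())
    return w.sum() ** 2 / (w ** 2).sum()

schedule = [0.0]
while beta < 1.0:
    lo, hi = 0.0, 1.0 - beta
    if ess(hi) >= N / 2:                 # can jump straight to beta = 1
        delta = hi
    else:                                # bisect for ESS(delta) = N/2
        for _ in range(50):
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if ess(mid) >= N / 2 else (lo, mid)
        delta = lo
    # Reweight, resample, then rejuvenate with random-walk Metropolis steps
    # targeting pi_beta at the *new* temperature.
    w = np.exp(-delta * V(x))
    w /= w.sum()
    beta += delta
    schedule.append(beta)
    x = x[rng.choice(N, N, p=w)]
    for _ in range(5):
        prop = x + 0.5 * rng.normal(0, 1, N)
        logacc = (-beta * V(prop) - 0.5 * prop ** 2) \
               - (-beta * V(x) - 0.5 * x ** 2)
        accept = np.log(rng.uniform(0, 1, N)) < logacc
        x = np.where(accept, prop, x)

print(len(schedule) - 1, "adaptive temperature steps; final mean:", x.mean())
```

Because both the increments and (in general) the rejuvenation kernels now depend on the random particle evolution, the nonasymptotic analysis of such adaptive schemes is considerably harder — which is the gap the abstract's simplified adaptive model begins to address.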