Bayesian interpolation
Although Bayesian analysis has been in use since Laplace, the Bayesian method of model comparison has only recently been developed in depth. In this paper, the Bayesian approach to regularization and model comparison is demonstrated by studying the inference problem of interpolating noisy data. The concepts and methods described are quite general and can be applied to many other data modeling problems. Regularizing constants are set by examining their posterior probability distribution. Alternative regularizers (priors) and alternative basis sets are objectively compared by evaluating the evidence for them. "Occam's razor" is automatically embodied by this process. The way in which Bayes infers the values of regularizing constants and noise levels has an elegant interpretation in terms of the effective number of parameters determined by the data set. This framework is due to Gull and Skilling.
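The evidence-based choice of a regularizing constant described in this abstract can be sketched for a linear-in-parameters model under Gaussian assumptions (a minimal illustration in the spirit of MacKay's framework; the polynomial basis, synthetic data, and precision values below are our own assumptions, not taken from the paper):

```python
import numpy as np

def log_evidence(Phi, y, alpha, beta):
    """Log evidence for y = Phi @ w + noise with Gaussian prior precision
    alpha on the weights and noise precision beta (a sketch of the
    Gaussian evidence framework; names and values are illustrative)."""
    N, M = Phi.shape
    A = alpha * np.eye(M) + beta * Phi.T @ Phi       # posterior precision
    w_mp = beta * np.linalg.solve(A, Phi.T @ y)      # most probable weights
    E = beta / 2 * np.sum((y - Phi @ w_mp) ** 2) + alpha / 2 * w_mp @ w_mp
    _, logdetA = np.linalg.slogdet(A)
    return (M / 2 * np.log(alpha) + N / 2 * np.log(beta)
            - E - logdetA / 2 - N / 2 * np.log(2 * np.pi))

# Compare regularizing constants by their evidence on noisy sinusoidal data.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(30)
Phi = np.vander(x, 8)                                # degree-7 polynomial basis
alphas = np.logspace(-6, 2, 50)
best = max(alphas, key=lambda a: log_evidence(Phi, y, a, beta=100.0))
```

Maximizing the evidence over `alpha` is what sets the regularizing constant "by examining its posterior probability distribution" without a held-out validation set.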
Relative entropy and the multi-variable multi-dimensional moment problem
Entropy-like functionals on operator algebras have been studied since the
pioneering work of von Neumann, Umegaki, Lindblad, and Lieb. The most
well-known are the von Neumann entropy and a
generalization of the Kullback-Leibler distance, referred to as quantum relative entropy and used to quantify
distance between states of a quantum system. The purpose of this paper is to
explore these as regularizing functionals in seeking solutions to
multi-variable and multi-dimensional moment problems. It will be shown that
extrema can be effectively constructed via a suitable homotopy. The homotopy
approach leads naturally to a further generalization and a description of all
the solutions to such moment problems. This is accomplished by a
renormalization of a Riemannian metric induced by entropy functionals. As an
application we discuss the inverse problem of describing power spectra which
are consistent with second-order statistics, which has been the main motivation
behind the present work.
Comment: 24 pages, 3 figures
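The Umegaki quantum relative entropy S(rho || sigma) = tr(rho (log rho - log sigma)) named in this abstract can be computed directly for full-rank density matrices (a minimal numerical sketch; the example states are our own):

```python
import numpy as np

def mat_log(H):
    """Matrix logarithm of a Hermitian positive-definite matrix,
    via its eigendecomposition."""
    w, V = np.linalg.eigh(H)
    return V @ np.diag(np.log(w)) @ V.conj().T

def quantum_relative_entropy(rho, sigma):
    """S(rho || sigma) = tr(rho (log rho - log sigma));
    nonnegative, and zero iff rho == sigma."""
    return float(np.real(np.trace(rho @ (mat_log(rho) - mat_log(sigma)))))

# Two commuting (diagonal) states, for which S reduces to the classical
# Kullback-Leibler divergence of the eigenvalue distributions.
rho = np.diag([0.7, 0.3])
sigma = np.diag([0.5, 0.5])
d = quantum_relative_entropy(rho, sigma)   # about 0.0823 nats
```

In the commuting case shown, the value agrees with the classical KL divergence 0.7 log(0.7/0.5) + 0.3 log(0.3/0.5), which is the sense in which the functional generalizes the Kullback-Leibler distance.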
A rarefaction-tracking method for hyperbolic conservation laws
We present a numerical method for scalar conservation laws in one space
dimension. The solution is approximated by local similarity solutions. While
many commonly used approaches are based on shocks, the presented method uses
rarefaction and compression waves. The solution is represented by particles
that carry function values and move according to the method of characteristics.
Between two neighboring particles, an interpolation is defined by an analytical
similarity solution of the conservation law. An interaction of particles
represents a collision of characteristics. The resulting shock is resolved by
merging particles so that the total area under the function is conserved. The
method is variation diminishing; nevertheless, it has no numerical dissipation
away from shocks. Although shocks are not explicitly tracked, they can be
located accurately. We present numerical examples, and outline specific
applications and extensions of the approach.
Comment: 21 pages, 7 figures. Similarity 2008 conference proceedings
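The core particle idea, carrying function values on particles that move with their characteristic speed, can be sketched for Burgers' equation u_t + (u^2/2)_x = 0, where the characteristic speed is f'(u) = u (an illustrative sketch only; the similarity-solution interpolation and the area-conserving merging step of the paper are not reproduced here):

```python
import numpy as np

def advance(x, u, dt):
    """Move particles along characteristics of Burgers' equation:
    each particle keeps its value u and moves with speed f'(u) = u.
    No interaction (shock) handling in this sketch."""
    return x + dt * u, u

# Riemann data with a decreasing jump: characteristics collide, so the
# particle ordering is violated after a finite time (a shock forms).
x = np.linspace(-1.0, 1.0, 11)
u_shock = np.where(x < 0, 1.0, 0.0)
x_new, _ = advance(x, u_shock, dt=0.5)
crossed = np.any(np.diff(x_new) < 0)   # ordering violated -> collision

# Increasing jump: characteristics spread apart and a rarefaction opens up.
u_rare = np.where(x < 0, 0.0, 1.0)
x_rare, _ = advance(x, u_rare, dt=0.5)
```

Detecting the ordering violation is exactly the "interaction of particles represents a collision of characteristics" event at which the method would merge particles while conserving the area under the function.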
An exactly conservative particle method for one dimensional scalar conservation laws
A particle scheme for scalar conservation laws in one space dimension is
presented. Particles representing the solution are moved according to their
characteristic velocities. Particle interaction is resolved locally, satisfying
exact conservation of area. Shocks stay sharp and propagate at correct speeds,
while rarefaction waves are created where appropriate. The method is variation
diminishing, entropy decreasing, exactly conservative, and has no numerical
dissipation away from shocks. Solutions, including the location of shocks, are
approximated with second order accuracy. Source terms can be included. The
method is compared to CLAWPACK in various examples, and found to yield a
comparable or better accuracy for similar resolutions.
Comment: 29 pages, 21 figures
Time and spectral domain relative entropy: A new approach to multivariate spectral estimation
The concept of spectral relative entropy rate is introduced for jointly
stationary Gaussian processes. Using classical information-theoretic results,
we establish a remarkable connection between time and spectral domain relative
entropy rates. This naturally leads to a new spectral estimation technique
where a multivariate version of the Itakura-Saito distance is employed. It may
be viewed as an extension of the approach, called THREE, introduced by Byrnes,
Georgiou and Lindquist in 2000, which, in turn, followed in the footsteps of the
Burg-Jaynes Maximum Entropy Method. Spectral estimation is here recast in the
form of a constrained spectrum approximation problem where the distance is
equal to the processes' relative entropy rate. The corresponding solution
entails a complexity upper bound which improves on the one so far available in
the multichannel framework. Indeed, it is equal to the one featured by THREE in
the scalar case. The solution is computed via a globally convergent matricial
Newton-type algorithm. Simulations suggest the effectiveness of the new
technique in tackling multivariate spectral estimation tasks, especially in the
case of short data records.
Comment: 32 pages, submitted for publication
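For orientation, the scalar Itakura-Saito distance that the paper generalizes to the multivariate (matricial) setting can be approximated on a uniform frequency grid (an illustrative sketch with spectra of our own choosing; the paper's multivariate version is not reproduced here):

```python
import numpy as np

def itakura_saito(P, Q):
    """Discrete approximation of the scalar Itakura-Saito distance
    (1/2pi) * integral of [P/Q - log(P/Q) - 1] dtheta between power
    spectra sampled uniformly on [0, 2pi)."""
    r = P / Q
    return np.mean(r - np.log(r) - 1.0)

theta = np.linspace(0.0, 2.0 * np.pi, 512, endpoint=False)
P = 1.0 / (1.25 - np.cos(theta))        # a peaked, AR(1)-like spectrum
Q = np.full_like(theta, P.mean())       # flat spectrum of equal average power
d = itakura_saito(P, Q)                 # positive; zero iff P == Q pointwise
```

Since t - log t - 1 >= 0 with equality only at t = 1, the distance is nonnegative and vanishes exactly when the two spectra coincide, which is what makes it usable as the criterion in a constrained spectrum approximation problem.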
Joint morphological-lexical language modeling for processing morphologically rich languages with application to dialectal Arabic
Language modeling for an inflected language
such as Arabic poses new challenges for speech recognition and
machine translation due to its rich morphology. Rich morphology
results in large increases in out-of-vocabulary (OOV) rate and
poor language model parameter estimation in the absence of large
quantities of data. In this study, we present a joint
morphological-lexical language model (JMLLM) that takes
advantage of Arabic morphology. JMLLM combines
morphological segments with the underlying lexical items and
additional available information sources with regard to
morphological segments and lexical items in a single joint model.
Joint representation and modeling of morphological and lexical
items reduces the OOV rate and provides smooth probability
estimates while keeping the predictive power of whole words.
Speech recognition and machine translation experiments in
dialectal Arabic show improvements over word- and morpheme-based
trigram language models. We also show that as the
tightness of integration between different information sources
increases, the performance of both speech recognition and machine
translation improves.
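The reason morphological segmentation lowers the OOV rate can be shown with a toy example: decomposing surface forms into shared segments lets a fixed vocabulary cover unseen inflections (a hypothetical suffix-stripping segmenter on English words for illustration only; the paper's JMLLM jointly models morphological segments and lexical items for Arabic, which is not reproduced here):

```python
def segment(word):
    """Hypothetical segmenter: strip one known suffix, if present."""
    for suffix in ("ing", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return [word[: -len(suffix)], "+" + suffix]
    return [word]

train = ["walk", "walked", "talks", "talking"]
test = ["walking", "talked"]            # unseen surface forms

# Word vocabulary misses both test forms; the morpheme vocabulary,
# built from segments of the same training data, covers them.
word_vocab = set(train)
morph_vocab = {m for w in train for m in segment(w)}

oov_word = sum(w not in word_vocab for w in test)
oov_morph = sum(any(m not in morph_vocab for m in segment(w)) for w in test)
```

Here both held-out words are OOV at the word level but fully covered at the morpheme level, which is the effect that gives morpheme-aware models smoother probability estimates on morphologically rich languages.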