11,009 research outputs found
Efficient Sequential Monte-Carlo Samplers for Bayesian Inference
In many problems, complex non-Gaussian and/or nonlinear models are required to accurately describe a physical system of interest. In such cases, Monte Carlo algorithms are remarkably flexible and extremely powerful approaches to solving the resulting inference problems. However, in the presence of a high-dimensional and/or multimodal posterior distribution, it is widely documented that standard Monte Carlo techniques can perform poorly. This paper focuses on the Sequential Monte Carlo (SMC) sampler framework, a more robust and efficient Monte Carlo algorithm. Although this approach presents many advantages over traditional Monte Carlo methods, the potential of this emergent technique remains largely underexploited in signal processing. In this work, we propose novel strategies that improve the efficiency and facilitate the practical implementation of the SMC sampler, specifically for signal processing applications. Firstly, we propose an automatic and adaptive strategy that selects the sequence of distributions within the SMC sampler so as to minimize the asymptotic variance of the estimator of the posterior normalization constant. This is critical for performing model selection in Bayesian signal processing applications. Our second contribution improves the global efficiency of the SMC sampler by introducing a novel correction mechanism that allows the use of the particles generated through all the iterations of the algorithm, instead of only those from the last iteration. This is significant because it removes the need to discard a large portion of the samples obtained, as is standard practice in SMC methods, and it will improve estimation performance in practical settings where the computational budget matters.
Comment: arXiv admin note: text overlap with arXiv:1303.3123 by other author
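For context on the first contribution: the common baseline for adaptively choosing the next distribution in a tempered SMC sampler is to tune the inverse temperature so that the effective sample size (ESS) of the incremental weights hits a target fraction of the particle count. The sketch below shows that standard heuristic (the function name and ESS criterion are illustrative; the paper's own rule minimizes the asymptotic variance of the normalizing-constant estimator instead):

```python
import numpy as np

def next_temperature(log_lik, prev_beta, target_ess_frac=0.5):
    """Pick the next inverse temperature by bisection so the effective
    sample size (ESS) of the incremental weights stays near a target
    fraction of the particle count.

    Note: this is the standard adaptive-tempering heuristic, shown for
    illustration only; it is not the variance-minimizing criterion
    proposed in the paper.
    """
    n = len(log_lik)

    def ess(beta):
        w = (beta - prev_beta) * log_lik   # incremental log-weights
        w = w - w.max()                    # stabilize before exponentiating
        p = np.exp(w)
        p = p / p.sum()                    # normalized weights
        return 1.0 / np.sum(p ** 2)        # effective sample size

    lo, hi = prev_beta, 1.0
    if ess(hi) >= target_ess_frac * n:     # can jump straight to beta = 1
        return hi
    for _ in range(50):                    # bisection on the ESS criterion
        mid = 0.5 * (lo + hi)
        if ess(mid) > target_ess_frac * n:
            lo = mid
        else:
            hi = mid
    return lo
```

Larger temperature steps make the incremental weights more degenerate, so the bisection naturally takes small steps when the likelihood is informative and large steps when it is flat.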
Choice and information in the public sector: a Higher Education case study
Successive governments have encouraged the view of users of public services as consumers, choosing between different providers on the basis of information about the quality of service. As part of this approach, prospective students are expected to make their decisions about which universities to apply to with reference to the consumer evaluations provided by the National Student Survey. However, a case study of a post-1992 university showed that not all students made genuine choices, and those who did tended to be in stronger social and economic positions. Where choices were made, they were infrequently based on external evaluations of quality.
Global Carbon Budget: Ocean carbon sink.
CO2 emissions from human activities, the main contributor to global climate change, are set to rise again in 2014, reaching 40 billion tonnes of CO2. The natural carbon "sinks" on land and in the ocean absorb on average 55% of total CO2 emissions, thus slowing the rate of global climate change. Increasing CO2 in the oceans is causing ocean acidification.
Accelerating Motion Planning via Optimal Transport
Motion planning is still an open problem in many disciplines, e.g., robotics and autonomous driving, due to the high computational resources required, which hinder real-time, efficient decision-making. A class of methods striving to provide smooth solutions is gradient-based trajectory optimization. However, those methods usually suffer from poor local minima, and in many settings they may be inapplicable because easy-to-access gradients of the optimization objectives are absent. In response to these issues, we introduce Motion Planning via Optimal Transport (MPOT), a gradient-free method that optimizes a batch of smooth trajectories over highly nonlinear costs, even for high-dimensional tasks, while imposing smoothness through a Gaussian Process dynamics prior via the planning-as-inference perspective. To facilitate batch trajectory optimization, we introduce an original zero-order and highly parallelizable update rule: the Sinkhorn Step, which uses the regular polytope family for its search directions. Each regular polytope, centered on trajectory waypoints, serves as a local cost-probing neighborhood, acting as a trust region where the Sinkhorn Step "transports" local waypoints toward low-cost regions. We theoretically show that the Sinkhorn Step guides the optimized parameters toward local-minima regions of non-convex objective functions. We then show the efficiency of MPOT in a range of problems, from low-dimensional point-mass navigation to high-dimensional whole-body robot motion planning, evincing its superiority over popular motion planners and paving the way for new applications of optimal transport in motion planning.
Comment: Published as a conference paper at NeurIPS 2023. Project website:
https://sites.google.com/view/sinkhorn-step
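To make the geometry of the update concrete, here is a minimal single-point sketch of a zero-order, polytope-probing step in the spirit of the Sinkhorn Step (the function name, the cross-polytope choice, and the softmin weighting are illustrative simplifications, not MPOT's batched, entropically regularized transport update over full trajectories):

```python
import numpy as np

def polytope_step(x, cost, radius=0.1, temp=0.1):
    """One zero-order update: probe the vertices of a regular
    cross-polytope centered on the current point, then move toward
    low-cost vertices via a softmin weighting.

    A simplified single-point caricature of the Sinkhorn Step; the
    probing radius plays the role of the trust region.
    """
    d = len(x)
    dirs = np.concatenate([np.eye(d), -np.eye(d)])  # cross-polytope vertices
    probes = x + radius * dirs                      # local cost-probing neighborhood
    c = np.array([cost(p) for p in probes])
    w = np.exp(-(c - c.min()) / temp)               # softmin weights (entropic smoothing)
    w = w / w.sum()
    return x + radius * (w @ dirs)                  # weighted step inside the trust region
```

Because only cost evaluations at the polytope vertices are needed, no gradients are required, and the per-waypoint probes are trivially parallelizable.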
A Carleman-Picard approach for reconstructing zero-order coefficients in parabolic equations with limited data
We propose a globally convergent computational technique for the nonlinear
inverse problem of reconstructing the zero-order coefficient in a parabolic
equation using partial boundary data. This technique is called the "reduced
dimensional method". Initially, we use the polynomial-exponential basis to
approximate the inverse problem as a system of 1D nonlinear equations. We then
employ a Picard iteration based on the quasi-reversibility method and a
Carleman weight function. We rigorously prove that the sequence derived
from this iteration converges to the accurate solution of that 1D system
without requiring a good initial guess of the true solution. The key tool in
the proof is a Carleman estimate. We also present some numerical examples.
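The contraction principle underlying the scheme can be illustrated with a generic Picard (fixed-point) iteration. The sketch below is a toy scalar example only; the paper applies the same idea to a regularized 1D system, where the quasi-reversibility method and the Carleman weight are what make the map contractive without a good initial guess:

```python
import numpy as np

def picard_iterate(g, x0, tol=1e-10, max_iter=100):
    """Generic Picard iteration x_{k+1} = g(x_k).

    If g is a contraction, the sequence converges to the unique fixed
    point regardless of the starting guess x0. This is a toy
    illustration, not the paper's regularized scheme.
    """
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:    # successive iterates have stabilized
            return x_new
        x = x_new
    return x
```

For example, iterating `g = cos` from any starting point converges to the Dottie number, the unique solution of x = cos(x).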
Reconstruction of Short-Lived Particles using Graph-Hypergraph Representation Learning
In collider experiments, the kinematic reconstruction of heavy, short-lived particles is vital for precision tests of the Standard Model and in searches for physics beyond it. Performing kinematic reconstruction in collider events with many final-state jets, such as the all-hadronic decay of top-antitop quark pairs, is challenging. We present HyPER, a graph neural network that uses blended graph-hypergraph representation learning to reconstruct parent particles from sets of final-state objects. HyPER is tested in simulation and shown to perform favorably when compared to existing state-of-the-art reconstruction techniques, while demonstrating superior parameter efficiency. The novel hypergraph approach allows the method to be applied to particle reconstruction in a multitude of different physics processes.
Reaching peak emissions
Rapid growth in global CO2 emissions from fossil fuels and industry ceased in the past two years, despite continued economic growth. Decreased coal use in China was largely responsible, coupled with slower global growth in petroleum and faster growth in renewables.