Standard and Generalized Newtonian Gravities as "Gauge" Theories of the Extended Galilei Group - I: The Standard Theory
Newton's standard theory of gravitation is reformulated as a gauge theory of
the extended Galilei group. The action principle is obtained by combining the
gauge technique with a suitable limiting procedure applied to the ADM-DeWitt
action of general relativity coupled to a relativistic mass point.
Comment: 51 pages, compressed, uuencoded LaTeX file
Gauging tensor networks with belief propagation
Effectively compressing and optimizing tensor networks requires reliable
methods for fixing the latent degrees of freedom of the tensors, known as the
gauge. Here we introduce a new algorithm for gauging tensor networks using
belief propagation, a method that was originally formulated for performing
statistical inference on graphical models and has recently found applications
in tensor network algorithms. We show that this method is closely related to
known tensor network gauging methods. It has the practical advantage, however,
that existing belief propagation implementations can be repurposed for tensor
network gauging, and that belief propagation is a very simple algorithm based
on just tensor contractions, so it can be easier to implement, optimize, and
generalize. We present numerical evidence and scaling arguments that this
algorithm is faster than existing gauging algorithms, demonstrating its usage
on structured, unstructured, and infinite tensor networks. Additionally, we
apply this method to improve the accuracy of the widely used simple update gate
evolution algorithm.
Comment: 47 pages, 11 figures
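The starting point of this method, sum-product belief propagation on a graphical model, is indeed just a sequence of tensor contractions. A minimal sketch on a toy three-node chain (illustrative potentials only, not the paper's tensor-network gauging algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)
phi = [rng.random(2) for _ in range(3)]   # node potentials for x1, x2, x3
psi12 = rng.random((2, 2))                # pairwise potential on edge (1, 2)
psi23 = rng.random((2, 2))                # pairwise potential on edge (2, 3)

# Sum-product messages toward node 2 -- each message is one contraction
m1to2 = phi[0] @ psi12                    # sum_{x1} phi1(x1) psi12(x1, x2)
m3to2 = psi23 @ phi[2]                    # sum_{x3} psi23(x2, x3) phi3(x3)
belief2 = phi[1] * m1to2 * m3to2
belief2 /= belief2.sum()

# Brute-force marginal of x2 for comparison (exact on a tree)
p = np.einsum('i,j,k,ij,jk->ijk', phi[0], phi[1], phi[2], psi12, psi23)
exact2 = p.sum(axis=(0, 2)) / p.sum()
assert np.allclose(belief2, exact2)
```

On a loopy tensor network the same message updates are iterated to a fixed point rather than computed in one pass; this toy chain only shows why the primitive operation is a plain contraction.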
Active Learning based on Data Uncertainty and Model Sensitivity
Robots can rapidly acquire new skills from demonstrations. However, during
generalisation of skills or transitioning across fundamentally different
skills, it is unclear whether the robot has the necessary knowledge to perform
the task. Failing to detect missing information often leads to abrupt movements
or to collisions with the environment. Active learning can quantify the
uncertainty of performing the task and, in general, locate regions of missing
information. We introduce a novel algorithm for active learning and demonstrate
its utility for generating smooth trajectories. Our approach is based on deep
generative models and metric learning in latent spaces. It relies on the
Jacobian of the likelihood to detect non-smooth transitions in the latent
space, i.e., transitions that lead to abrupt changes in the movement of the
robot. When non-smooth transitions are detected, our algorithm asks for an
additional demonstration from that specific region. The newly acquired
knowledge modifies the data manifold and allows for learning a latent
representation for generating smooth movements. We demonstrate the efficacy of
our approach on generalising elementary skills, transitioning across different
skills, and implicitly avoiding collisions with the environment. For our
experiments, we use a simulated pendulum where we observe its motion from
images and a 7-DoF anthropomorphic arm.
Comment: Published at the 2018 IEEE/RSJ International Conference on Intelligent
Robots and Systems (IROS)
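The latent-space smoothness criterion can be illustrated with a toy one-dimensional stand-in: a decoder with a sharp transition plays the role of a gap in the data manifold, and a finite-difference Jacobian flags the latent region where an additional demonstration would be requested. The decoder, threshold, and scales below are hypothetical, not the paper's deep generative model:

```python
import numpy as np

def decoder(z):
    # Toy decoder: a sharp transition near z = 0 mimics a gap in the manifold
    return np.tanh(20.0 * z)

zs = np.linspace(-1.0, 1.0, 201)
eps = 1e-4
# Finite-difference estimate of the decoder's Jacobian along the latent path
jac = np.abs((decoder(zs + eps) - decoder(zs - eps)) / (2 * eps))

# Latent points whose outputs change abruptly: here an active learner
# would query a new demonstration before executing the movement
threshold = 5.0                      # hypothetical, scale-dependent cutoff
query_region = zs[jac > threshold]
```

The Jacobian magnitude peaks exactly at the transition, so thresholding it localizes the region of missing information along the latent trajectory.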
Pareto Smoothed Importance Sampling
Importance weighting is a general way to adjust Monte Carlo integration to
account for draws from the wrong distribution, but the resulting estimate can
be noisy when the importance ratios have a heavy right tail. This routinely
occurs when there are aspects of the target distribution that are not well
captured by the approximating distribution, in which case more stable estimates
can be obtained by modifying extreme importance ratios. We present a new method
for stabilizing importance weights using a generalized Pareto distribution fit
to the upper tail of the distribution of the simulated importance ratios. The
method, which empirically performs better than existing methods for stabilizing
importance sampling estimates, includes stabilized effective sample size
estimates, Monte Carlo error estimates, and convergence diagnostics.
Comment: Major revision: 1) proofs of consistency, finite variance, and
asymptotic normality; 2) justification of k<0.7 with theoretical
computational complexity analysis; 3) major rewrite
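A simplified numerical sketch of the tail-smoothing step, using scipy's maximum-likelihood generalized Pareto fit in place of the paper's fitting procedure, with an illustrative choice of tail size M:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
# Proposal q = N(0, 1), target p = N(0, 1.5): ratios have a heavy right tail
x = rng.standard_normal(20_000)
log_r = (-x**2 / (2 * 1.5**2) - np.log(1.5)) - (-x**2 / 2)
r = np.exp(log_r)

# Fit a generalized Pareto distribution to the M largest importance ratios...
M = int(3 * np.sqrt(len(r)))
order = np.argsort(r)
u = r[order[-M - 1]]                       # tail threshold
k, _, sigma = genpareto.fit(r[order[-M:]] - u, floc=0.0)

# ...and replace them with GPD quantiles at expected order-statistic positions
probs = (np.arange(1, M + 1) - 0.5) / M
smoothed = r.copy()
smoothed[order[-M:]] = u + genpareto.ppf(probs, k, loc=0.0, scale=sigma)
smoothed = np.minimum(smoothed, r.max())   # truncate at the raw maximum

# Self-normalized estimate of E_p[x^2] (= 2.25 here) with smoothed weights
est_psis = np.sum(smoothed * x**2) / np.sum(smoothed)
```

The fitted shape parameter k doubles as the diagnostic: values well below 0.7 indicate the smoothed estimate can be trusted, while larger values signal that the proposal is too far from the target.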
- …