6,879 research outputs found
Formation of Quantum Phase Slip Pairs in Superconducting Nanowires
Macroscopic quantum tunneling (MQT) is a fundamental phenomenon of quantum
mechanics related to the actively debated topic of quantum-to-classical
transition. The ability to realize MQT affects implementation of qubit-based
quantum computing schemes and their protection against decoherence. Decoherence
in qubits can be reduced by means of topological protection, e.g. by exploiting
various parity effects. In particular, paired phase slips can provide such
protection for superconducting qubits. Here, we report on the direct
observation of quantum paired phase slips in thin-wire superconducting loops.
We show that in addition to conventional single phase slips that change
superconducting order parameter phase by $2\pi$, there are quantum transitions
changing the phase by $4\pi$. Quantum paired phase slips represent a
synchronized occurrence of two macroscopic quantum tunneling events, i.e.
cotunneling. We demonstrate the existence of a remarkable regime in which
paired phase slips are exponentially more probable than single ones.
Inverse Density as an Inverse Problem: The Fredholm Equation Approach
In this paper we address the problem of estimating the ratio $\frac{q}{p}$,
where $p$ is a density function and $q$ is another density, or, more generally,
an arbitrary function. Knowing or approximating this ratio is needed in various
problems of inference and integration, in particular, when one needs to average
a function with respect to one probability distribution, given a sample from
another. This is often referred to as {\it importance sampling} in statistical
inference and is also closely related to the problem of {\it covariate shift}
in transfer learning as well as to various MCMC methods. It may also be useful
for separating the underlying geometry of a space, say a manifold, from the
density function defined on it.
Our approach is based on reformulating the problem of estimating $\frac{q}{p}$
as an inverse problem in terms of an integral operator
corresponding to a kernel, and thus reducing it to an integral equation, known
as the Fredholm problem of the first kind. This formulation, combined with the
techniques of regularization and kernel methods, leads to a principled
kernel-based framework for constructing algorithms and for analyzing them
theoretically.
The resulting family of algorithms (FIRE, for Fredholm Inverse Regularized
Estimator) is flexible, simple and easy to implement.
We provide detailed theoretical analysis including concentration bounds and
convergence rates for the Gaussian kernel in the case of densities defined on
$\mathbb{R}^d$, compact domains in $\mathbb{R}^d$, and smooth $d$-dimensional
sub-manifolds of the Euclidean space.
We also show experimental results including applications to classification
and semi-supervised learning within the covariate shift framework and
demonstrate some encouraging experimental comparisons. We also show how the
parameters of our algorithms can be chosen in a completely unsupervised manner.
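The core idea — discretizing the Fredholm equation $(K_p f)(x) = (K_q 1)(x)$ at the sample points and inverting with Tikhonov regularization — can be sketched in a few lines. This is a minimal illustration of the Fredholm/regularization idea, not the authors' exact FIRE algorithm; the kernel bandwidth `sigma`, the regularization weight `lam`, and the choice to solve directly at the $p$-sample points are assumptions:

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    """Gram matrix k(x, y) = exp(-|x - y|^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def fredholm_ratio(Xp, Xq, sigma=0.5, lam=1e-2):
    """Estimate f = q/p at the p-sample points Xp.

    Discretize the Fredholm equation of the first kind
        (1/n) sum_i k(x, x_i) f(x_i)  =  (1/m) sum_j k(x, z_j),
    where x_i ~ p and z_j ~ q, and solve the resulting linear
    system with Tikhonov regularization."""
    n = len(Xp)
    Kpp = gaussian_kernel(Xp, Xp, sigma) / n         # empirical operator K_p
    b = gaussian_kernel(Xp, Xq, sigma).mean(axis=1)  # right-hand side K_q 1
    # Regularized inverse: (K_p + lam I)^{-1} b
    return np.linalg.solve(Kpp + lam * np.eye(n), b)
```

For instance, with samples from $p = N(0,1)$ and $q = N(0.5,1)$ the estimate should be larger near $q$'s mode than in the left tail, qualitatively tracking the true ratio $q(x)/p(x) = e^{x/2 - 1/8}$.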
Geodesics in Heat
We introduce the heat method for computing the shortest geodesic distance to
a specified subset (e.g., point or curve) of a given domain. The heat method is
robust, efficient, and simple to implement since it is based on solving a pair
of standard linear elliptic problems. The method represents a significant
breakthrough in the practical computation of distance on a wide variety of
geometric domains, since the resulting linear systems can be prefactored once
and subsequently solved in near-linear time. In practice, distance can be
updated via the heat method an order of magnitude faster than with
state-of-the-art methods while maintaining a comparable level of accuracy. We
provide numerical evidence that the method converges to the exact geodesic
distance in the limit of refinement; we also explore smoothed approximations of
distance suitable for applications where more regularity is required.
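The three-step pipeline — a short heat flow from the source, normalization of the gradient field, and a Poisson solve — can be sketched on a regular grid. This is a minimal NumPy illustration of the heat-method structure, not the authors' mesh implementation; the 5-point grid Laplacian, the backward-Euler time step `t`, and the least-squares handling of the singular Poisson system are assumptions:

```python
import numpy as np

def heat_method_grid(n, source, t=1.0):
    """Approximate geodesic distance from `source` on an n x n unit grid.

    Step 1: one backward-Euler heat step from a delta at the source.
    Step 2: normalize the negative gradient, X = -grad u / |grad u|.
    Step 3: solve the Poisson equation  L phi = div X  for the distance."""
    N = n * n
    idx = lambda i, j: i * n + j
    # 5-point graph Laplacian with Neumann-style boundary
    L = np.zeros((N, N))
    for i in range(n):
        for j in range(n):
            k = idx(i, j)
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ii, jj = i + di, j + dj
                if 0 <= ii < n and 0 <= jj < n:
                    L[k, idx(ii, jj)] += 1.0
                    L[k, k] -= 1.0
    # Step 1: (I - t L) u = delta_source
    u0 = np.zeros(N)
    u0[idx(*source)] = 1.0
    U = np.linalg.solve(np.eye(N) - t * L, u0).reshape(n, n)
    # Step 2: unit vector field pointing away from the source
    gy, gx = np.gradient(U)                   # gy along rows, gx along cols
    norm = np.sqrt(gx**2 + gy**2) + 1e-12
    Xx, Xy = -gx / norm, -gy / norm
    # Step 3: divergence of X, then solve the (singular) Poisson system
    div = np.gradient(Xx, axis=1) + np.gradient(Xy, axis=0)
    phi, *_ = np.linalg.lstsq(L, div.ravel(), rcond=None)
    phi = phi.reshape(n, n)
    phi -= phi[source]                        # distance is zero at the source
    return phi
```

On a triangle mesh the same three solves use the cotangent Laplacian, and because the two system matrices are fixed they can be prefactored once (e.g. by Cholesky decomposition), which is the source of the near-linear update time claimed above.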
Major John Bradford Homestead archaeological collections report
This report describes a collections management project undertaken on archaeological finds excavated at the Major John Bradford Homestead in 1972 and 1973. One of the chief goals of the project was to clean all artifacts that had not been processed, after sorting the materials that had been processed and labeled, and to reunite them with their provenience groups. The next goal was to catalogue all of the finds, to re-bag and re-box all of the materials in archivally appropriate bags and acid-free boxes, and to provide a box inventory keyed to the catalogue so that future researchers or exhibit designers could readily locate objects of interest. A further goal was to provide a narrative about the excavations, to make suggestions about how to interpret the archaeological evidence, and to suggest potential future research. All of these goals were met and are detailed in this report.