Generalized (c,d)-entropy and aging random walks
Complex systems are often inherently non-ergodic and non-Markovian, in which
case Shannon entropy loses its applicability. In particular, accelerating,
path-dependent, and aging random walks offer an intuitive picture of these
non-ergodic and non-Markovian systems. It was shown that the entropy of
non-ergodic systems can still be derived from three of the Shannon-Khinchin
axioms, and by violating the fourth -- the so-called composition axiom. The
corresponding entropy is of the form $S_{c,d} \propto \sum_i \Gamma(1+d,\, 1 - c\ln p_i)$ and depends on two system-specific scaling exponents, $c$ and $d$. This
entropy contains many recently proposed entropy functionals as special cases,
including Shannon and Tsallis entropy. It was shown that this entropy is
relevant for a special class of non-Markovian random walks. In this work we
generalize these walks to a much wider class of stochastic systems that can be
characterized as `aging' systems. These are systems whose transition rates
between states are path- and time-dependent. We show that for particular aging
walks $S_{c,d}$ is again the correct extensive entropy. Before the central part
of the paper we review the concept of $(c,d)$-entropy in a self-contained way.
Comment: 8 pages, 5 eps figures. arXiv admin note: substantial text overlap
with arXiv:1104.207
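As a concrete numerical illustration of the generalized entropy $S_{c,d} \propto \sum_i \Gamma(1+d, 1 - c\ln p_i)$, the following sketch evaluates the unnormalized sum for a discrete distribution. It assumes NumPy and SciPy are available; the function name `s_cd` and the zero-probability convention are choices made here, not taken from the paper.

```python
import numpy as np
from scipy.special import gamma, gammaincc


def s_cd(p, c, d):
    """Unnormalized (c,d)-entropy: sum_i Gamma(1+d, 1 - c*ln p_i).

    Gamma(a, x) is the upper incomplete gamma function, obtained from
    SciPy's regularized version via Gamma(a, x) = gammaincc(a, x) * gamma(a).
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: zero-probability states contribute nothing
    x = 1.0 - c * np.log(p)
    return np.sum(gammaincc(1.0 + d, x) * gamma(1.0 + d))


# A broad distribution should carry more (c,d)-entropy than a peaked one.
uniform = np.full(8, 1 / 8)
peaked = np.array([0.99] + [0.01 / 7] * 7)
print(s_cd(uniform, 1.0, 1.0), s_cd(peaked, 1.0, 1.0))
```

For $(c,d)=(1,1)$ and the uniform distribution over $N$ states the sum reduces analytically to $(2+\ln N)/e$, which gives a quick correctness check for the sketch.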
Time and M-theory
We review our recent proposal for a background independent formulation of a
holographic theory of quantum gravity. The present review incorporates the
necessary background material on geometry of canonical quantum theory,
holography and spacetime thermodynamics, Matrix theory, as well as our specific
proposal for a dynamical theory of geometric quantum mechanics, as applied to
Matrix theory. At the heart of this review is a new analysis of the conceptual
problem of time and the closely related and phenomenologically relevant problem
of vacuum energy in quantum gravity. We also present a discussion of some
observational implications of this new viewpoint on the problem of vacuum
energy.
Comment: 86 pages, 5 figures, LaTeX, typos fixed, references added, and Sec.
6.2 revised; invited review for Int. J. Mod. Phys.
Robustness and Regularization of Support Vector Machines
We consider regularized support vector machines (SVMs) and show that they are
precisely equivalent to a new robust optimization formulation. We show that
this equivalence of robust optimization and regularization has implications for
both algorithms and analysis. In terms of algorithms, the equivalence suggests
more general SVM-like algorithms for classification that explicitly build in
protection against noise while at the same time controlling overfitting. On the
analysis front, the equivalence of robustness and regularization provides a
robust optimization interpretation for the success of regularized SVMs. We use
this new robustness interpretation of SVMs to give a new proof of consistency
of (kernelized) SVMs, thus establishing robustness as the reason regularized
SVMs generalize well.
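The robustness–regularization link can be sanity-checked numerically for the hinge loss: under norm-bounded perturbations $\|\delta\| \le \rho$ of an input $x$, the worst-case hinge loss has the closed form $\max(0,\, 1 - y\,w^\top x + \rho\|w\|)$, i.e. the adversary's effect is exactly a $\rho\|w\|$ penalty on the margin, which is a regularization term. The sketch below (variable names and the brute-force check are ours, not the paper's) compares the closed form against sampled perturbations on the sphere of radius $\rho$.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=5)   # fixed linear classifier
x = rng.normal(size=5)   # one input point
y = 1.0                  # its label
rho = 0.3                # perturbation budget ||delta|| <= rho


def hinge(margin):
    return max(0.0, 1.0 - margin)


# Closed-form worst case: the adversary shifts x by -rho * y * w / ||w||,
# lowering the margin by exactly rho * ||w||.
closed = hinge(y * (w @ x) - rho * np.linalg.norm(w))

# Brute-force check: sample random perturbations of norm rho and keep the worst.
worst = 0.0
for _ in range(20000):
    d = rng.normal(size=5)
    d *= rho / np.linalg.norm(d)
    worst = max(worst, hinge(y * (w @ (x + d))))

print(closed, worst)
```

Since the closed form is the exact supremum, the sampled worst case can never exceed it and should approach it from below.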
Asymptotic equivalence of probability measures and stochastic processes
Let $P_n$ and $Q_n$ be two probability measures representing two different
probabilistic models of some system (e.g., an $n$-particle equilibrium system,
a set of random graphs with $n$ vertices, or a stochastic process evolving over
a time $n$) and let $A_n$ be a random variable representing a 'macrostate' or
'global observable' of that system. We provide sufficient conditions, based on
the Radon-Nikodym derivative of $P_n$ and $Q_n$, for the set of typical values
of $A_n$ obtained relative to $P_n$ to be the same as the set of typical values
obtained relative to $Q_n$ in the limit $n\to\infty$. This extends to
general probability measures and stochastic processes the well-known
thermodynamic-limit equivalence of the microcanonical and canonical ensembles,
related mathematically to the asymptotic equivalence of conditional and
exponentially-tilted measures. In this more general sense, two probability
measures that are asymptotically equivalent predict the same typical or
macroscopic properties of the system they are meant to model.
Comment: v1: 16 pages. v2: 17 pages, precisions, examples and references
added. v3: Minor typos corrected. Close to published version.
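A toy instance of the conditional/exponentially-tilted equivalence mentioned above: fair coin flips conditioned on a large empirical mean (a crude stand-in for a microcanonical constraint) and the exponentially tilted, unconditioned model (here simply a biased coin) should predict the same typical value of the mean. The threshold, sample sizes, and rejection-sampling approach below are illustrative choices, not the paper's general conditions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 30, 40000

# "P_n": fair coin flips conditioned on the empirical mean being >= 0.7,
# sampled by naive rejection (keep only runs satisfying the constraint).
cond_means = []
for _ in range(trials):
    s = rng.integers(0, 2, size=n)
    if s.mean() >= 0.7:
        cond_means.append(s.mean())

# "Q_n": the exponentially tilted (unconditioned) model -- a Bernoulli(0.7) coin.
tilt_means = rng.binomial(n, 0.7, size=trials) / n

# Both models should concentrate near the same typical mean of 0.7.
print(np.mean(cond_means), np.mean(tilt_means))
```

The agreement is only asymptotic: at finite $n$ the conditioned mean sits slightly above the threshold, and the gap shrinks as $n$ grows.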
Variational Principle of Bogoliubov and Generalized Mean Fields in Many-Particle Interacting Systems
The approach to the theory of many-particle interacting systems from a
unified standpoint, based on the variational principle for the free energy, is
reviewed. A systematic discussion is given of the approximate free energies of
complex statistical systems. The analysis is centered around the variational
principle of N. N. Bogoliubov for free energy in the context of its
applications to various problems of statistical mechanics and condensed matter
physics. The review presents a terse discussion of selected works carried out
over the past few decades on the theory of many-particle interacting systems in
terms of the variational inequalities. It is the purpose of this paper to
discuss some of the general principles which form the mathematical background
to this approach, and to establish a connection of the variational technique
with other methods, such as the method of the mean (or self-consistent) field
in the many-body problem, in which the effect of all the other particles on any
given particle is approximated by a single averaged effect, thus reducing a
many-body problem to a single-body problem. The method is illustrated by
applying it to various systems of many-particle interacting systems, such as
Ising and Heisenberg models, superconducting and superfluid systems, strongly
correlated systems, etc. It seems likely that these technical advances in the
many-body problem will be useful in suggesting new methods for treating and
understanding many-particle interacting systems. This work proposes a new,
general and pedagogical presentation, intended both for those who are
interested in basic aspects, and for those who are interested in concrete
applications.
Comment: 60 pages, 25 references.
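The self-consistent (mean) field idea described above can be sketched on the simplest example, the mean-field Ising model, where the averaged effect of the neighbors reduces the many-body problem to the single-spin self-consistency equation $m = \tanh(\beta J z\, m)$. The function name and parameter values below are illustrative choices, not taken from the review.

```python
import math


def mean_field_magnetization(beta_jz, iters=200, m0=0.5):
    """Fixed-point iteration for the self-consistency equation m = tanh(beta*J*z*m),
    where beta_jz bundles the inverse temperature, coupling, and coordination number."""
    m = m0
    for _ in range(iters):
        m = math.tanh(beta_jz * m)
    return m


# Below the mean-field critical point (beta*J*z > 1) a spontaneous
# magnetization appears; above it (beta*J*z < 1) only m = 0 survives.
print(mean_field_magnetization(2.0), mean_field_magnetization(0.5))
```

In the variational language of the review, this fixed point is what minimizing the Bogoliubov trial free energy over a single-site trial Hamiltonian produces; the iteration is just a convenient way to solve the resulting stationarity condition.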
A note on conditional versus joint unconditional weak convergence in bootstrap consistency results
The consistency of a bootstrap or resampling scheme is classically validated
by weak convergence of conditional laws. However, when working with stochastic
processes in the space of bounded functions and their weak convergence in the
Hoffmann-J{\o}rgensen sense, an obstacle occurs: due to possible
non-measurability, neither laws nor conditional laws are well-defined. Starting
from an equivalent formulation of weak convergence based on the bounded
Lipschitz metric, a classical workaround is to formulate bootstrap consistency
in terms of the latter distance between what might be called a
\emph{conditional law} of the (non-measurable) bootstrap process and the law of
the limiting process. The main contribution of this note is to provide an
equivalent formulation of bootstrap consistency in the space of bounded
functions which is more intuitive and easier to work with. Essentially, the
equivalent formulation consists of (unconditional) weak convergence of the
original process jointly with two bootstrap replicates. As a by-product, we
provide two equivalent formulations of bootstrap consistency for statistics
taking values in separable metric spaces: the first in terms of (unconditional)
weak convergence of the statistic jointly with its bootstrap replicates, the
second in terms of convergence in probability of the empirical distribution
function of the bootstrap replicates. Finally, the asymptotic validity of
bootstrap-based confidence intervals and tests is briefly revisited, with
particular emphasis on the Monte Carlo approximation of conditional quantiles,
which is unavoidable in practice.
Comment: 21 pages, 1 Figure
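The Monte Carlo approximation of conditional quantiles mentioned above is, in its most common form, the percentile construction: draw $B$ bootstrap replicates of the statistic and read off empirical quantiles of their distribution. A minimal sketch for the sample mean (seed, sample size, and number of replicates are arbitrary choices made here):

```python
import numpy as np

rng = np.random.default_rng(42)
n, B = 200, 2000

sample = rng.normal(loc=1.0, scale=1.0, size=n)  # one observed data set
stat = sample.mean()                             # the original statistic

# B bootstrap replicates: resample the data with replacement, recompute the mean.
boot = np.array([rng.choice(sample, size=n, replace=True).mean() for _ in range(B)])

# Monte Carlo approximation of the conditional quantiles -> percentile interval.
lo, hi = np.quantile(boot, [0.025, 0.975])
print(stat, (lo, hi))
```

The note's point is that validity statements should account for this extra layer of randomness: the interval endpoints are themselves Monte Carlo estimates, converging to the conditional quantiles only as $B \to \infty$.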