Cusps and shocks in the renormalized potential of glassy random manifolds: How Functional Renormalization Group and Replica Symmetry Breaking fit together
We compute the Functional Renormalization Group (FRG) disorder correlator
function R(v) for d-dimensional elastic manifolds pinned by a random potential
in the limit of infinite embedding space dimension N. It measures the
equilibrium response of the manifold in a quadratic potential well as the
center of the well is varied from 0 to v. We find two distinct scaling regimes:
(i) a "single shock" regime, v^2 ~ 1/L^d where L^d is the system volume and
(ii) a "thermodynamic" regime, v^2 ~ N. In regime (i) all the equivalent
replica symmetry breaking (RSB) saddle points within the Gaussian variational
approximation contribute, while in regime (ii) the effect of RSB enters only
through a single anomaly. When the RSB is continuous (e.g., for short-range
disorder, in dimension 2 <= d <= 4), we prove that regime (ii) yields the
large-N FRG function obtained previously. In that case, the disorder correlator
exhibits a cusp in both regimes, though with different amplitudes and of
different physical origin. When the RSB solution is 1-step and non-marginal
(e.g., d < 2 for SR disorder), the correlator R(v) in regime (ii) is
considerably reduced, and exhibits no cusp. Solutions of the FRG flow
corresponding to non-equilibrium states are discussed as well. In all cases the
regime (i) exhibits a cusp non-analyticity at T=0, whose form and thermal
rounding at finite T is obtained exactly and interpreted in terms of shocks.
The results are compared with previous work, and consequences for manifolds at
finite N, as well as extensions to spin glasses and related models are
discussed. Comment: v2: note added in proof.
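Restated compactly in LaTeX (a restatement of the abstract's own scaling claims, using the same symbols: v the well-centre displacement, L^d the volume, N the embedding-space dimension):

```latex
% Compact restatement of the two scaling regimes quoted in the abstract above.
\begin{align*}
  \text{(i) single-shock regime:}\quad  & v^{2} \sim \frac{1}{L^{d}}
      & &\text{all equivalent RSB saddle points contribute,}\\
  \text{(ii) thermodynamic regime:}\quad & v^{2} \sim N
      & &\text{RSB enters only through a single anomaly.}
\end{align*}
```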
The cavity method for large deviations
A method is introduced for studying large deviations in the context of
statistical physics of disordered systems. The approach, based on an extension
of the cavity method to atypical realizations of the quenched disorder, allows
us to compute exponentially small probabilities (rate functions) over different
classes of random graphs. It is illustrated with two combinatorial optimization
problems, the vertex-cover and coloring problems, for which the presence of
replica symmetry breaking phases is taken into account. Applications include
the analysis of models on adaptive graph structures. Comment: 18 pages, 7 figures.
Statistical Mechanics of maximal independent sets
The graph theoretic concept of maximal independent set arises in several
practical problems in computer science as well as in game theory. A maximal
independent set is defined by the set of occupied nodes that satisfy some
packing and covering constraints. It is known that finding minimum- and
maximum-density maximal independent sets are hard optimization problems. In
this paper, we use the cavity method of statistical physics and Monte Carlo
simulations to study the corresponding constraint satisfaction problem on
random graphs. We obtain the entropy of maximal independent sets within the
replica symmetric and one-step replica symmetry breaking frameworks, shedding
light on the metric structure of the landscape of solutions and suggesting a
class of possible algorithms. This is of particular relevance for the
application to the study of strategic interactions in social and economic
networks, where maximal independent sets correspond to pure Nash equilibria of
a graphical game of public goods allocation.
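As a concrete illustration of the packing and covering constraints just mentioned, here is a short Python sketch that checks whether a given set of occupied nodes is a maximal independent set; the adjacency-dictionary representation and the function name are choices made for this example, not part of the paper.

```python
def is_maximal_independent_set(adj, occupied):
    """Check the two hard constraints defining a maximal independent set.

    adj      : dict mapping each node to the set of its neighbours
    occupied : iterable of occupied nodes
    """
    occupied = set(occupied)
    # Packing constraint: no two occupied nodes are adjacent.
    for v in occupied:
        if adj[v] & occupied:
            return False
    # Covering constraint: every empty node has at least one occupied
    # neighbour, so no further node could be occupied (maximality).
    for v in adj:
        if v not in occupied and not (adj[v] & occupied):
            return False
    return True

# Example: a 4-cycle 0-1-2-3.
adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
print(is_maximal_independent_set(adj, {0, 2}))  # True
print(is_maximal_independent_set(adj, {0}))     # False: node 2 is uncovered
```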
An algorithm for counting circuits: application to real-world and random graphs
We introduce an algorithm which estimates the number of circuits in a graph
as a function of their length. This approach provides analytical results for
the typical entropy of circuits in sparse random graphs. When applied to
real-world networks, it allows one to estimate exponentially large numbers of
circuits in polynomial time. We illustrate the method by studying a graph of
the Internet structure. Comment: 7 pages, 3 figures, minor corrections, accepted version.
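To make the counted quantity concrete, the sketch below enumerates circuits (simple cycles) by length in a tiny graph by brute force; it is exponential in the graph size and is only meant to illustrate what the paper's polynomial-time estimator approximates, not the estimator itself.

```python
from collections import defaultdict

def count_circuits_by_length(adj):
    """Brute-force count of simple circuits by length in an undirected graph
    given as {node: set_of_neighbours}. Exponential cost, so only usable on
    tiny graphs; large graphs need an estimator of the kind described above."""
    counts = defaultdict(int)

    def extend(start, v, visited):
        for w in adj[v]:
            if w == start and len(visited) >= 3:
                counts[len(visited)] += 1        # circuit closed (seen once per direction)
            elif w > start and w not in visited:
                extend(start, w, visited | {w})  # only visit nodes larger than the anchor

    for start in adj:                            # anchor each circuit at its smallest node
        extend(start, start, {start})
    return {k: c // 2 for k, c in sorted(counts.items())}  # halve: two directions per circuit

# Complete graph K4: four triangles and three circuits of length 4.
adj = {0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {0, 1, 2}}
print(count_circuits_by_length(adj))  # {3: 4, 4: 3}
```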
A Theory of Cheap Control in Embodied Systems
We present a framework for designing cheap control architectures for embodied
agents. Our derivation is guided by the classical problem of universal
approximation, whereby we explore the possibility of exploiting the agent's
embodiment for a new and more efficient universal approximation of behaviors
generated by sensorimotor control. This embodied universal approximation is
compared with the classical non-embodied universal approximation. To exemplify
our approach, we present a detailed quantitative case study for policy models
defined in terms of conditional restricted Boltzmann machines. In contrast to
non-embodied universal approximation, which requires an exponential number of
parameters, in the embodied setting we are able to generate all possible
behaviors with a drastically smaller model, thus obtaining cheap universal
approximation. We test and corroborate the theory experimentally with a
six-legged walking machine. The experiments show that the sufficient controller
complexity predicted by our theory is tight, which means that the theory has
direct practical implications. Keywords: cheap design, embodiment, sensorimotor
loop, universal approximation, conditional restricted Boltzmann machine. Comment: 27 pages, 10 figures.
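To make the policy-model class concrete, the following numpy sketch implements a generic conditional restricted Boltzmann machine that samples actuator values given a sensor reading by alternating Gibbs steps; the layer sizes, parameter names and random initialisation are placeholders for illustration, not the architectures or parameter counts analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Placeholder sizes: n_s sensor units, n_h hidden units, n_a actuator units.
n_s, n_h, n_a = 6, 8, 4
W = rng.normal(scale=0.1, size=(n_h, n_a))   # hidden-actuator couplings
U = rng.normal(scale=0.1, size=(n_h, n_s))   # hidden units conditioned on sensors
V = rng.normal(scale=0.1, size=(n_a, n_s))   # actuators conditioned on sensors
b_h = np.zeros(n_h)                          # hidden biases
b_a = np.zeros(n_a)                          # actuator biases

def sample_action(s, n_gibbs=10):
    """Sample a binary actuator vector a ~ p(a | s) from the CRBM by
    alternating Gibbs sampling of hidden and actuator units."""
    a = rng.integers(0, 2, size=n_a)          # random initial actuator state
    for _ in range(n_gibbs):
        p_h = sigmoid(W @ a + U @ s + b_h)    # p(h = 1 | a, s)
        h = (rng.random(n_h) < p_h).astype(int)
        p_a = sigmoid(W.T @ h + V @ s + b_a)  # p(a = 1 | h, s)
        a = (rng.random(n_a) < p_a).astype(int)
    return a

s = rng.integers(0, 2, size=n_s)              # one binary sensor reading
print(sample_action(s))
```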
Deterministic generation of an on-demand Fock state
We theoretically study the deterministic generation of photon Fock states
on-demand using a protocol based on a Jaynes-Cummings quantum random walk which
includes damping. We then show how each of the steps of this protocol can be
implemented in a low temperature solid-state quantum system with a
Nitrogen-Vacancy centre in a nano-diamond coupled to a nearby high-Q optical
cavity. By controlling the coupling duration between the NV and the cavity via
the application of a time-dependent Stark shift, and by increasing the decay
rate of the NV via stimulated emission depletion (STED), a Fock state with high
photon number can be generated on-demand. Our setup can be integrated on a chip
and can be accurately controlled. Comment: 13 pages, 9 figures.
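The basic ingredient of the protocol, a Jaynes-Cummings interaction with damping, can be explored with standard tools. The sketch below uses QuTiP (an assumed, generic choice, not necessarily the authors' code) with placeholder parameter values; it does not model the stepwise Stark-shift control or the STED-enhanced decay of the actual protocol.

```python
import numpy as np
from qutip import basis, destroy, qeye, sigmam, tensor, mesolve

# Hilbert space: cavity truncated at N_max photons, two-level NV centre.
N_max = 10
a = tensor(destroy(N_max), qeye(2))    # cavity annihilation operator
sm = tensor(qeye(N_max), sigmam())     # NV lowering operator

# Placeholder rates (arbitrary units), not values from the paper.
g = 0.1        # NV-cavity coupling
kappa = 0.001  # cavity photon loss rate
gamma = 0.02   # NV decay rate

# Resonant Jaynes-Cummings interaction in the rotating frame.
H = g * (a.dag() * sm + a * sm.dag())

# Initial state: cavity vacuum, NV excited (basis(2, 0) is the upper level here).
psi0 = tensor(basis(N_max, 0), basis(2, 0))

# Damping enters through collapse operators in the master equation.
c_ops = [np.sqrt(kappa) * a, np.sqrt(gamma) * sm]

times = np.linspace(0.0, 100.0, 500)
result = mesolve(H, psi0, times, c_ops, e_ops=[a.dag() * a])
print("peak cavity photon number:", max(result.expect[0]))
```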
A List Referring Monte Carlo Method for Lattice Glass Models
We present an efficient Monte-Carlo method for lattice glass models which are
characterized by hard constraint conditions. The basic idea of the method is
similar to that of the n-fold way method. By using a list of sites into which
we can insert a particle, we avoid trying a useless transition which is
forbidden by the constraint conditions. We applied the present method to a
lattice glass model proposed by Biroli and M{\'e}zard. We first evaluated the
efficiency of the method through measurements of the autocorrelation function
of particle configurations. As a result, we found that the efficiency is much
higher than that of the standard Monte-Carlo method. We also compared the
efficiency of the present method with that of the n-fold way method in
detail. We next examined how the efficiency of extended ensemble methods such
as the replica exchange method and the Wang-Landau method is influenced by the
choice of the local update method. The results show that the efficiency is
considerably improved by the use of efficient local update methods. For
example, when the number of sites is 1024, the ergodic time
of the replica exchange method in the grand-canonical ensemble,
which is the average round-trip time of a replica in chemical-potential space,
with the present local update method is substantially shorter than
that with the standard local update method. This result shows that the
efficient local update method is quite important to make extended ensemble
methods more effective. Comment: 16 pages, 21 figures; 1 subsection, 1 appendix, and 5 figures are added, the abstract is changed, and 1 figure is removed.
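A minimal sketch of the list-referring idea in Python: insertion moves are proposed only from a list of sites at which the hard constraint would remain satisfied, so no attempt is wasted on forbidden transitions. The constraint used here is the Biroli-Mezard rule that every occupied site may have at most ell occupied neighbours; the small periodic lattice, the parameter names, the full rebuild of the list at every step and the simplified acceptance rule are illustrative simplifications, not the paper's implementation.

```python
import math
import random

def allowed_insertions(adj, occ, ell):
    """Empty sites where inserting a particle keeps the hard constraint:
    the new particle, and each of its occupied neighbours, must end up with
    at most `ell` occupied neighbours (Biroli-Mezard rule)."""
    allowed = []
    for i in adj:
        if i in occ:
            continue
        nb_occ = [j for j in adj[i] if j in occ]
        if len(nb_occ) > ell:
            continue                              # the new particle itself would violate the rule
        if all(sum(k in occ for k in adj[j]) + 1 <= ell for j in nb_occ):
            allowed.append(i)                     # no occupied neighbour pushed above ell
    return allowed

def sweep(adj, occ, ell, mu, beta, n_steps, rng=random):
    """Toy grand-canonical sweep: insertions are drawn only from the allowed
    list (the list-referring step). NOTE: a correct implementation must also
    correct the acceptance ratio for the list-based proposal asymmetry, as in
    the n-fold way; that correction is omitted here for brevity."""
    for _ in range(n_steps):
        if rng.random() < 0.5:                    # attempt an insertion
            candidates = allowed_insertions(adj, occ, ell)
            if candidates and rng.random() < min(1.0, math.exp(beta * mu)):
                occ.add(rng.choice(candidates))
        elif occ:                                 # attempt a removal (never violates the rule)
            if rng.random() < min(1.0, math.exp(-beta * mu)):
                occ.remove(rng.choice(sorted(occ)))
    return occ

# Tiny example: 4x4 periodic square lattice, at most ell = 1 occupied neighbour.
L = 4
adj = {(x, y): {((x + 1) % L, y), ((x - 1) % L, y), (x, (y + 1) % L), (x, (y - 1) % L)}
       for x in range(L) for y in range(L)}
print(len(sweep(adj, set(), ell=1, mu=2.0, beta=1.0, n_steps=200)), "particles placed")
```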
HAMAP: a database of completely sequenced microbial proteome sets and manually curated microbial protein families in UniProtKB/Swiss-Prot
The growth in the number of completely sequenced microbial genomes (bacterial and archaeal) has generated a need for a procedure that provides UniProtKB/Swiss-Prot-quality annotation to as many protein sequences as possible. We have devised a semi-automated system, HAMAP (High-quality Automated and Manual Annotation of microbial Proteomes), that uses manually built annotation templates for protein families to propagate annotation to all members of manually defined protein families, using very strict criteria. The HAMAP system is composed of two databases, the proteome database and the family database, and of an automatic annotation pipeline. The proteome database comprises biological and sequence information for each completely sequenced microbial proteome, and it offers several tools for CDS searches, BLAST options and retrieval of specific sets of proteins. The family database currently comprises more than 1500 manually curated protein families and their annotation templates that are used to annotate proteins that belong to one of the HAMAP families. On the HAMAP website, individual sequences as well as whole genomes can be scanned against all HAMAP families. The system provides warnings for the absence of conserved amino acid residues, unusual sequence length, etc. Thanks to the implementation of HAMAP, more than 200 000 microbial proteins have been fully annotated in UniProtKB/Swiss-Prot (HAMAP website: http://www.expasy.org/sprot/hamap)
Age of Stratospheric Air: Progress on Processes, Observations, and Long‐Term Trends
Age of stratospheric air is a well established metric for the stratospheric transport circulation. Rooted in a robust theoretical framework, this approach offers the benefit of being deducible from observations of trace gases. Given potential climate-induced changes, observational constraints on stratospheric circulation are crucial. In the past two decades, scientific progress has been made in three main areas: (a) Enhanced process understanding and the development of process diagnostics led to better quantification of individual transport processes from observations and to a better understanding of model deficits. (b) The global age of air climatology is now well constrained by observations thanks to improved quality and quantity of data, including global satellite data, and through improved and consistent age calculation methods. (c) It is well established and understood that global models predict a decrease in age, that is, an accelerating stratospheric circulation, in response to forcing by greenhouse gases and ozone depleting substances. Observational records now confirm long-term forced trends in mean age in the lower stratosphere. However, in the mid-stratosphere, uncertainties in observational records are too large to confirm or disprove the model predictions. Continuous monitoring of stratospheric trace gases and further improved methods to derive age from those tracers will be crucial to better constrain variability and long-term trends from observations. Future work on mean age as a metric for stratospheric transport will be important due to its potential to enhance the understanding of stratospheric composition changes, address climate model biases, and assess the impacts of proposed climate geoengineering methods