The cavity method for large deviations
A method is introduced for studying large deviations in the context of
statistical physics of disordered systems. The approach, based on an extension
of the cavity method to atypical realizations of the quenched disorder, allows
us to compute exponentially small probabilities (rate functions) over different
classes of random graphs. It is illustrated with two combinatorial optimization
problems, the vertex-cover and coloring problems, for which the presence of
replica symmetry breaking phases is taken into account. Applications include
the analysis of models on adaptive graph structures.
Comment: 18 pages, 7 figures
Assessment of subseasonal-to-seasonal (S2S) ensemble extreme precipitation forecast skill over Europe
Heavy precipitation can lead to floods and landslides, resulting in widespread damage and significant casualties. Some of its impacts can be mitigated if reliable forecasts and warnings are available. Of particular interest is the subseasonal-to-seasonal (S2S) prediction timescale, which has received increasing attention in the research community because of its importance for many sectors. However, very few forecast skill assessments of precipitation extremes in S2S forecast data have been conducted. The goal of this article is to assess the forecast skill of rare events, here extreme precipitation, in S2S forecasts, using a metric specifically designed for extremes. We verify extreme precipitation events over Europe in the S2S forecast model from the European Centre for Medium-Range Weather Forecasts. The verification is conducted against ERA5 reanalysis precipitation. Extreme precipitation is defined as daily precipitation accumulations exceeding the seasonal 95th percentile. In addition to the classical Brier score, we use a binary loss index tailored to assess the skill of rare events. We analyze daily events that are locally and spatially aggregated, as well as 7 d extreme-event counts. Results consistently show higher skill in winter compared to summer. The regions showing the highest skill are Norway, Portugal and the south of the Alps. Skill increases when aggregating the extremes spatially or temporally. The verification methodology can be adapted and applied to other variables, e.g., temperature extremes or river discharge.
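The Brier score used in the verification above compares a forecast exceedance probability with a binary observed outcome. A minimal sketch with synthetic data (the ensemble size, gamma-distributed precipitation, and all numbers are illustrative assumptions, not the study's data):

```python
import numpy as np

def brier_score(forecast_prob, observed_event):
    """Mean squared difference between forecast probability and binary outcome."""
    return np.mean((forecast_prob - observed_event) ** 2)

rng = np.random.default_rng(0)
# Toy ensemble: 11 members x 100 days of daily precipitation (mm)
ensemble = rng.gamma(shape=2.0, scale=3.0, size=(11, 100))
obs = rng.gamma(shape=2.0, scale=3.0, size=100)

threshold = np.quantile(obs, 0.95)          # seasonal 95th percentile
prob = (ensemble > threshold).mean(axis=0)  # fraction of members exceeding
event = (obs > threshold).astype(float)     # observed binary exceedance

bs = brier_score(prob, event)
print(f"Brier score: {bs:.4f}")
```

A perfect deterministic forecast scores 0 and the worst possible forecast scores 1, so lower is better; for rare events the score is dominated by the many non-event days, which is why a dedicated extreme-event metric is also used in the paper.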
Message passing for vertex covers
Constructing a minimal vertex cover of a graph can be seen as a prototype for
a combinatorial optimization problem under hard constraints. In this paper, we
develop and analyze message passing techniques, namely warning and survey
propagation, which serve as efficient heuristic algorithms for solving these
computationally hard problems. We also show how previously obtained results on
the typical-case behavior of vertex covers of random graphs can be recovered
starting from the message passing equations, and how they can be extended.
Comment: 25 pages, 9 figures - version accepted for publication in PR
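The warning-propagation update can be sketched in a few lines of Python. The message convention below is one standard choice, and the star graph is a toy example rather than the random-graph ensembles analyzed in the paper:

```python
import random

def warning_propagation(edges, n, iters=50, seed=0):
    """Warning propagation for minimum vertex cover (toy sketch).

    Message u[(i, j)] = 1 means vertex i warns neighbor j: "I can stay
    uncovered, so you must cover our edge". Vertex i sends that warning
    exactly when it receives no warning from its other neighbors."""
    rng = random.Random(seed)
    neigh = {v: [] for v in range(n)}
    for i, j in edges:
        neigh[i].append(j)
        neigh[j].append(i)
    u = {(i, j): rng.randint(0, 1) for i in neigh for j in neigh[i]}
    for _ in range(iters):
        order = list(u)
        rng.shuffle(order)
        for i, j in order:
            u[(i, j)] = 1 if all(u[(k, i)] == 0 for k in neigh[i] if k != j) else 0
    # A vertex that receives at least one warning joins the cover.
    return {i for i in neigh if any(u[(k, i)] == 1 for k in neigh[i])}

# Star graph: center 0, leaves 1..3; the minimum vertex cover is just {0}.
print(warning_propagation([(0, 1), (0, 2), (0, 3)], 4))  # → {0}
```

On graphs with many degenerate minimum covers, or in replica-symmetry-broken phases, plain warning propagation can fail to give a minimal cover; that is the regime where the survey propagation of the paper becomes relevant.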
Entropy landscape and non-Gibbs solutions in constraint satisfaction problems
We study the entropy landscape of solutions for the bicoloring problem in
random graphs, a representative difficult constraint satisfaction problem. Our
goal is to classify which type of clusters of solutions are addressed by
different algorithms. In the first part of the study we use the cavity method
to obtain the number of clusters with a given internal entropy and determine
the phase diagram of the problem, e.g. dynamical, rigidity and SAT-UNSAT
transitions. In the second part of the paper we analyze different algorithms
and locate their behavior in the entropy landscape of the problem. For instance
we show that a smoothed version of a decimation strategy based on Belief
Propagation is able to find solutions belonging to sub-dominant clusters even
beyond the so called rigidity transition where the thermodynamically relevant
clusters become frozen. These non-equilibrium solutions belong to the most
probable unfrozen clusters.Comment: 38 pages, 10 figure
Minimal cold knife conization height for high-grade cervical squamous intraepithelial lesion treatment
Abstract
Objectives: To assess the relationship between cold-knife conization specimen height, cervical intraepithelial neoplasia (CIN II/III) size and endocervical margin involvement by CIN II/III.
Study design: A cross-sectional study was performed. Cold-knife cone specimens with a diagnosis of CIN II/III were selected. Epidemiological data and pathology reports were obtained through a chart review. All samples from each cone specimen showing CIN II/III and the squamocolumnar junction were selected. Cone height (mean ± standard deviation), intraepithelial lesion size, and size of endocervical surgical margins were measured.
Results: Four hundred and forty-seven samples were analyzed from 97 cone specimens. Section size ranged from 3.4 to 29.7 mm, tumor size from 0.3 to 17.5 mm, and tumor distance from the endocervical margin from 0.0 to 22.0 mm. Age and parity were similar in the positive vs. negative margin groups (37.6 ± 10.0 years vs. 37.7 ± 11.9 years, p = 0.952, and 2.2 ± 1.7 births vs. 2.6 ± 1.9 births, p = 0.804), whereas cone height (22.4 ± 6.9 mm vs. 17.1 ± 5.6 mm, p = 0.013) and tumor size (6.12 ± 3.25 mm vs. 10.6 ± 4.45 mm, p < 0.001) differed significantly between the negative and positive margin groups, respectively.
Conclusions: Using cone height to identify the likelihood of negative margins enables better estimation of the risk-benefit ratio: greater risks of bleeding, stenosis, and obstetric complications (cervical incompetence) versus greater risks of residual and recurrent disease.
The theoretical capacity of the Parity Source Coder
The Parity Source Coder is a protocol for data compression which is based on
a set of parity checks organized in a sparse random network. We consider here
the case of memoryless unbiased binary sources. We show that the theoretical
capacity saturates the Shannon limit at large K. We also find that the first
corrections to the leading behavior are exponentially small, so that the
behavior at finite K is very close to the optimal one.
Comment: Added references, minor changes
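The "set of parity checks organized in a sparse random network" can be made concrete with a small construction. The sketch below only builds such a check system, with K bits per check, and applies it to a source word; the sizes are illustrative, and the protocol's actual lossy encoding and message-passing decoding are not reproduced here:

```python
import numpy as np

def sparse_parity_checks(m, n, K, seed=0):
    """Random m x n binary matrix with exactly K ones per row:
    each row is one parity check acting on K randomly chosen bits."""
    rng = np.random.default_rng(seed)
    H = np.zeros((m, n), dtype=np.uint8)
    for row in H:
        row[rng.choice(n, size=K, replace=False)] = 1
    return H

# Toy sizes: n source bits, m parity checks (nominal rate m/n).
n, m, K = 24, 12, 3
H = sparse_parity_checks(m, n, K)
x = np.random.default_rng(1).integers(0, 2, size=n, dtype=np.uint8)
s = (H @ x) % 2  # the parity bits of the source word
print(f"{n} source bits -> {m} parity bits")
```

Each check touches only K bits, so the factor graph of the system stays sparse; this sparsity is what makes message-passing decoding tractable, while large K is what pushes the capacity toward the Shannon limit.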
Reducing multi-photon rates in pulsed down-conversion by temporal multiplexing
We present a simple technique to reduce the emission rate of higher-order
photon events from pulsed spontaneous parametric down-conversion. The technique
uses extra-cavity control over a mode locked ultrafast laser to simultaneously
increase repetition rate and reduce the energy of each pulse from the pump
beam. We apply our scheme to a photonic quantum gate, showing improvements in
the non-classical interference visibility for 2-photon and 4-photon
experiments, and in the quantum-gate fidelity and entangled state production in
the 2-photon case.
Comment: 8 pages, 6 figures
Random multi-index matching problems
The multi-index matching problem (MIMP) generalizes the well known matching
problem by going from pairs to d-uplets. We use the cavity method from
statistical physics to analyze its properties when the costs of the d-uplets
are random. At low temperatures we find for d>2 a frozen glassy phase with
vanishing entropy. We also investigate some properties of small samples by
enumerating the lowest cost matchings to compare with our theoretical
predictions.
Comment: 22 pages, 16 figures
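The small-sample enumeration mentioned above can be done by brute force for d = 3: fix the first index and try all permutations for the other two. The cost tensor below is a made-up toy instance, and the (n!)^(d-1) complexity makes this feasible only for tiny n:

```python
from itertools import permutations

def min_cost_3index_matching(cost):
    """Exhaustive minimum-cost 3-index matching: choose n triplets
    (i, p[i], q[i]) so that every index of each of the three sets is
    used exactly once, minimizing the total cost."""
    n = len(cost)
    best = float("inf")
    for p in permutations(range(n)):
        for q in permutations(range(n)):
            best = min(best, sum(cost[i][p[i]][q[i]] for i in range(n)))
    return best

# Toy instance where the diagonal triplets (i, i, i) are free:
n = 4
cost = [[[0.0 if i == j == k else 1.0 for k in range(n)]
         for j in range(n)] for i in range(n)]
print(min_cost_3index_matching(cost))  # → 0.0
```

For random i.i.d. costs this enumeration gives the exact ground-state energy of a sample, which is the kind of quantity one would compare against the cavity predictions.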
An algorithm for counting circuits: application to real-world and random graphs
We introduce an algorithm which estimates the number of circuits in a graph
as a function of their length. This approach provides analytical results for
the typical entropy of circuits in sparse random graphs. When applied to
real-world networks, it allows one to estimate exponentially large numbers of
circuits in polynomial time. We illustrate the method by studying a graph of
the Internet structure.
Comment: 7 pages, 3 figures, minor corrections, accepted version
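The paper's estimator is analytical, but on a graph small enough the circuit counts it targets can be computed exactly by exhaustive search. A brute-force baseline (exponential in graph size, purely illustrative, not the paper's algorithm):

```python
def count_cycles_by_length(adj, n):
    """Count simple cycles of each length in an undirected graph by
    exhaustive DFS. Each cycle is anchored at its smallest vertex and
    found once per traversal direction, so totals are halved."""
    counts = {}

    def dfs(start, v, visited, length):
        for w in adj[v]:
            if w == start and length >= 3:
                counts[length] = counts.get(length, 0) + 1
            elif w > start and w not in visited:
                visited.add(w)
                dfs(start, w, visited, length + 1)
                visited.remove(w)

    for s in range(n):
        dfs(s, s, {s}, 1)
    return {k: c // 2 for k, c in counts.items()}

# Complete graph on 4 vertices: 4 triangles and 3 squares.
k4 = {v: [w for w in range(4) if w != v] for v in range(4)}
print(count_cycles_by_length(k4, 4))  # → {3: 4, 4: 3}
```

Such exact counts on small instances are the natural check for an estimator of the circuit entropy as a function of length.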