Spatial mixing and approximation algorithms for graphs with bounded connective constant
The hard core model in statistical physics is a probability distribution on
independent sets in a graph in which the weight of any independent set I is
proportional to lambda^(|I|), where lambda > 0 is the vertex activity. We show
that there is an intimate connection between the connective constant of a graph
and the phenomenon of strong spatial mixing (decay of correlations) for the
hard core model; specifically, we prove that the hard core model with vertex
activity lambda < lambda_c(Delta + 1) exhibits strong spatial mixing on any
graph of connective constant Delta, irrespective of its maximum degree, and
hence derive an FPTAS for the partition function of the hard core model on such
graphs. Here lambda_c(d) := d^d/(d-1)^(d+1) is the critical activity for the
uniqueness of the Gibbs measure of the hard core model on the infinite d-ary
tree. As an application, we show that the partition function can be efficiently
approximated with high probability on graphs drawn from the random graph model
G(n,d/n) for all lambda < e/d, even though the maximum degree of such graphs is
unbounded with high probability.
We also improve upon Weitz's bounds for strong spatial mixing on bounded
degree graphs (Weitz, 2006) by providing a computationally simple method which
uses known estimates of the connective constant of a lattice to obtain bounds
on the vertex activities lambda for which the hard core model on the lattice
exhibits strong spatial mixing. Using this framework, we improve upon these
bounds for several lattices including the Cartesian lattice in dimensions 3 and
higher.
Our techniques also allow us to relate the threshold for the uniqueness of
the Gibbs measure on a general tree to its branching factor (Lyons, 1989).

Comment: 26 pages. In October 2014, this paper was superseded by
arXiv:1410.2595. Before that, an extended abstract of this paper appeared in
Proc. IEEE Symposium on the Foundations of Computer Science (FOCS), 2013, pp.
300-30
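The critical activity in the abstract has a simple closed form; the following minimal sketch (our illustration in Python, not part of the paper) evaluates lambda_c(d) and compares it with the e/d threshold quoted for G(n, d/n):

```python
import math

def lambda_c(d: int) -> float:
    """Critical activity for uniqueness of the hard core Gibbs measure
    on the infinite d-ary tree: lambda_c(d) = d^d / (d-1)^(d+1)."""
    return d**d / (d - 1) ** (d + 1)

# lambda_c(2) = 4/1 = 4.0; lambda_c(3) = 27/16 = 1.6875.
# For large d, lambda_c(d) is asymptotic to e/d, which is why the
# e/d threshold for G(n, d/n) is the natural analogue.
print(lambda_c(3), math.e / 3)
```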
Three lectures on random proper colorings of Z^d
A proper q-coloring of a graph is an assignment of one of q colors to
each vertex of the graph so that adjacent vertices are colored differently.
Sample uniformly among all proper q-colorings of a large discrete cube in the
integer lattice Z^d. Does the random coloring obtained exhibit any
large-scale structure? Does it have fast decay of correlations? We discuss
these questions and the way their answers depend on the dimension d and the
number of colors q. The questions are motivated by statistical physics
(anti-ferromagnetic materials, square ice), combinatorics (proper colorings,
independent sets) and the study of random Lipschitz functions on a lattice. The
discussion introduces a diverse set of tools, useful for this purpose and for
other problems, including spatial mixing, entropy and coupling methods, Gibbs
measures and their classification, and refined contour analysis.

Comment: 53 pages, 10 figures; Based on lectures given at the workshop on
Random Walks, Random Graphs and Random Media, September 2019, Munich, and at
the school Lectures on Probability and Stochastic Processes XIV, December
2019, Delhi.
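As a concrete illustration of the definition above (ours, not from the lectures), a proper q-coloring can be verified by checking every edge:

```python
def is_proper_coloring(edges, coloring):
    # A coloring is proper iff the two endpoints of every edge
    # receive different colors.
    return all(coloring[u] != coloring[v] for u, v in edges)

# A 4-cycle admits a proper 2-coloring (alternate the two colors);
# giving adjacent vertices the same color breaks properness.
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(is_proper_coloring(cycle, {0: 0, 1: 1, 2: 0, 3: 1}))  # True
```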
Trees of self-avoiding walks
We consider the biased random walk on a tree constructed from the set of
finite self-avoiding walks on a lattice, and use it to construct probability
measures on infinite self-avoiding walks. The limit measure (if it exists)
obtained when the bias converges to its critical value is conjectured to
coincide with the weak limit of the uniform SAW. Along the way, we obtain a
criterion for the continuity of the escape probability of a biased random walk
on a tree as a function of the bias, and show that the collection of escape
probability functions for spherically symmetric trees of bounded degree is
stable under uniform convergence.
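To make the underlying objects concrete, here is a small sketch (our illustration, not the paper's construction) that enumerates the finite self-avoiding walks on the square lattice Z^2 from which such a tree is built; the growth rate of these counts defines the connective constant discussed in the first abstract above:

```python
def count_saws(n: int) -> int:
    """Count self-avoiding walks of length n on Z^2 starting at the origin,
    by depth-first enumeration with a visited-site set."""
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]

    def extend(current, visited, remaining):
        if remaining == 0:
            return 1
        x, y = current
        total = 0
        for dx, dy in steps:
            nxt = (x + dx, y + dy)
            if nxt not in visited:          # self-avoidance constraint
                visited.add(nxt)
                total += extend(nxt, visited, remaining - 1)
                visited.remove(nxt)
        return total

    return extend((0, 0), {(0, 0)}, n)

# Known counts on Z^2: c_1 = 4, c_2 = 12, c_3 = 36, c_4 = 100.
print([count_saws(n) for n in range(1, 5)])  # [4, 12, 36, 100]
```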
Strongly Correlated Random Interacting Processes
The focus of the workshop was to discuss recent developments and future research directions in the area of large-scale random interacting processes, with the main emphasis on models where local microscopic interactions either produce strong correlations at macroscopic levels or generate non-equilibrium dynamics. This report contains extended abstracts of the presentations, which featured research in several directions including self-interacting random walks, spatially growing processes, strongly dependent percolation, spin systems with long-range order, and random permutations.
Annual Report for NERI Proposal No.2000-0109 on Forewarning of Failure in Critical Equipment at Next-Generation Nuclear Power Plants
This annual report describes the first year's accomplishments under the NERI 2000-0109 project. We present a model-independent approach to quantify changes in the nonlinear dynamics underlying time-serial data. From time-windowed data sets, we construct discrete distribution functions on the phase space. Condition change between base-case and test-case distribution functions is assessed by dissimilarity measures via the L1-distance and the chi-squared statistic. The discriminating power of these measures is first tested on noiseless model data, and then applied to detecting dynamical change in power data from a motor-pump system. We compare the phase-space dissimilarities with traditional linear and nonlinear measures used in the analysis of chaotic systems. We also assess the potential usefulness of the new measures for robust, accurate, and timely forewarning of equipment failure.
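A minimal sketch of the two dissimilarity measures named above (our illustration; the normalization and the bin-wise chi-squared form are assumptions, not taken from the report):

```python
import numpy as np

def phase_space_dissimilarity(p, q):
    """L1-distance and chi-squared statistic between two discrete
    distribution functions over the same phase-space bins."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()            # normalize to probabilities
    l1 = np.abs(p - q).sum()                   # L1-distance
    denom = p + q
    mask = denom > 0                           # skip bins empty in both
    chi2 = ((p - q) ** 2 / denom)[mask].sum()  # chi-squared statistic
    return l1, chi2

# Identical distributions give zero dissimilarity by both measures;
# disjoint ones give the maximal values (2 and 2 with this scaling).
print(phase_space_dissimilarity([1, 2, 3], [1, 2, 3]))  # (0.0, 0.0)
```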
Annual Report Nuclear Energy Research and Development Program Nuclear Energy Research Initiative
NERI Project No. 2000-0109 began in August 2000 and has three tasks. The first project year addressed Task 1, namely development of nonlinear prognostication for critical equipment in nuclear power facilities. That work is described in the first year's annual report (ORNL/TM-2001/195). The current (second) project year (FY02) addresses Task 2, while the third project year will address Tasks 2-3. This report describes the work for the second project year, spanning August 2001 through August 2002, including status of the tasks, issues and concerns, cost performance, and a status summary of tasks.

The objective of the second project year's work is a compelling demonstration of the nonlinear prognostication algorithm using much more data. The guidance from Dr. Madeline Feltus (DOE/NE-20) is that it would be preferable to show forewarning of failure for different kinds of nuclear-grade equipment, as opposed to many different failure modes from one piece of equipment. Long-term monitoring of operational utility equipment is possible in principle, but is not practically feasible for the following reason: time and funding constraints for this project do not allow us to monitor the many machines (thousands) that would be necessary to obtain even a few failure sequences, due to low failure rates (<10^-3/year) in the operational environment. Moreover, the only way to guarantee a controlled failure sequence is to seed progressively larger faults in the equipment or to overload the equipment for accelerated tests. Both of these approaches are infeasible for operational utility machinery, but are straightforward in a test environment. Our subcontractor has provided such test sequences. Thus, we have revised Tasks 2.1-2.4 to analyze archival data from such tests. The second phase of our work involves validation of the nonlinear prognostication over the second and third years of the proposed work. Recognizing the inherent limitations outlined above, Dr. Feltus urged Oak Ridge National Laboratory (ORNL) to contact other researchers for additional data from other test equipment. Consequently, we have revised the work plan for Tasks 2.1-2.2, with corresponding changes as shown in the Status Summary of NERI Tasks.

The revised tasks are as follows. Task 2.1: ORNL will obtain test data from a subcontractor and other researchers for various test equipment. This task includes development of a test plan or a description of the historical testing, as appropriate: test facility, equipment to be tested, choice of failure mode(s), testing protocol, data acquisition equipment, and resulting data from the test sequence. ORNL will analyze these data for quality, and subsequently via the nonlinear paradigm for prognostication. Task 2.2: ORNL will evaluate the prognostication capability of the nonlinear paradigm. The comparison metrics for reliability of the predictions will include the true positives, true negatives, and the forewarning times. Task 2.3: ORNL will improve the nonlinear paradigm as appropriate, in accord with the results of Tasks 2.1-2.2, to maximize the rate of true positive and true negative indications of failure. Maximal forewarning time is also highly desirable. Task 2.4: ORNL will develop advanced algorithms for phase-space distribution function (PS-DF) pattern-change recognition, based on the results of Task 2.3. This implementation will provide a capability for automated prognostication, as part of maintenance decision-making.

Appendix A provides a detailed description of the analysis methods, which include conventional statistics, traditional nonlinear measures, and ORNL's patented nonlinear phase-space dissimilarity measures. The body of this report focuses on the results of this analysis.
Quenched invariance principle for simple random walk on percolation clusters
We consider the simple random walk on the (unique) infinite cluster of
super-critical bond percolation in Z^d with d >= 2. We prove that, for
almost every percolation configuration, the path distribution of the walk
converges weakly to that of non-degenerate, isotropic Brownian motion. Our
analysis is based on the consideration of a harmonic deformation of the
infinite cluster on which the random walk becomes a square-integrable
martingale. The size of the deformation, expressed by the so called corrector,
is estimated by means of ergodicity arguments.

Comment: 38 pages (PTRF format), 4 figures. Version to appear in PTRF.