Combinatorics
Combinatorics is a fundamental mathematical discipline that focuses on the study of discrete objects and their properties. The present workshop featured research in such diverse areas as Extremal, Probabilistic and Algebraic Combinatorics, Graph Theory, Discrete Geometry, Combinatorial Optimization, Theory of Computation and Statistical Mechanics. It provided current accounts of exciting developments and challenges in these fields and a stimulating venue for a variety of fruitful interactions. This is a report on the meeting, containing abstracts of the presentations and a summary of the problem session.
Rainbow Coloring Hardness via Low Sensitivity Polymorphisms
A k-uniform hypergraph is said to be r-rainbow colorable if there is an r-coloring of its vertices such that every hyperedge intersects all r color classes. Given such a hypergraph as input, finding an r-rainbow coloring of it is NP-hard for all k >= 3 and r >= 2. Therefore, one settles for finding a rainbow coloring with fewer colors (which is an easier task). When r = k (the maximum possible value), i.e., the hypergraph is k-partite, one can efficiently 2-rainbow color the hypergraph, i.e., 2-color its vertices so that there are no monochromatic edges. In this work we consider the next smaller value, r = k-1, and prove that in this case it is NP-hard to rainbow color the hypergraph with q := ceil[(k-2)/2] colors. In particular, for k <= 6, it is NP-hard to 2-color (k-1)-rainbow colorable k-uniform hypergraphs.
Our proof follows the algebraic approach to promise constraint satisfaction problems. It proceeds by characterizing the polymorphisms associated with the approximate rainbow coloring problem, which are rainbow colorings of certain product hypergraphs on vertex set [r]^n. We prove that any such polymorphism f: [r]^n -> [q] must be C-fixing, i.e., there is a small subset S of C coordinates and a setting a in [q]^S such that fixing x_{|S} = a determines the value of f(x). The key step in our proof is bounding the sensitivity of certain rainbow colorings, thereby arguing that they must be juntas. Armed with the C-fixing characterization, our NP-hardness result is obtained via a reduction from smooth Label Cover.
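The rainbow-colorability condition in the abstract is easy to state operationally: every hyperedge must meet all r color classes. The following sketch (illustrative only; the hypergraph and vertex count are invented for the example, and brute-force search is exponential, which is consistent with the NP-hardness discussed above) checks and searches for such colorings on tiny instances.

```python
from itertools import product

def is_rainbow_coloring(edges, coloring, r):
    """Check that every hyperedge intersects all r color classes."""
    return all(len({coloring[v] for v in e}) == r for e in edges)

def find_rainbow_coloring(n, edges, r):
    """Brute-force search for an r-rainbow coloring of an n-vertex hypergraph.

    Exponential in n; only illustrative, since the decision problem
    is NP-hard for k >= 3 and r >= 2 as the abstract explains.
    """
    for assignment in product(range(r), repeat=n):
        if is_rainbow_coloring(edges, assignment, r):
            return assignment
    return None

# A small 3-uniform, 3-partite hypergraph (hence 3-rainbow colorable):
edges = [(0, 1, 2), (0, 3, 4), (2, 3, 5)]
coloring = find_rainbow_coloring(6, edges, 3)
assert coloring is not None and is_rainbow_coloring(edges, coloring, 3)
```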
Combinatorics
Combinatorics is a fundamental mathematical discipline which focuses on the study of discrete objects and their properties. The current workshop brought together researchers from diverse fields such as Extremal and Probabilistic Combinatorics, Discrete Geometry, Graph Theory, Combinatorial Optimization and Algebraic Combinatorics for fruitful interactions. New results, methods, developments and future challenges were discussed. This is a report on the meeting containing abstracts of the presentations and a summary of the problem session.
Complexity Theory
Computational Complexity Theory is the mathematical study of the intrinsic power and limitations of computational resources like time, space, or randomness. The current workshop focused on recent developments in various sub-areas including arithmetic complexity, Boolean complexity, communication complexity, cryptography, probabilistic proof systems, pseudorandomness, and quantum computation. Many of the developments are related to diverse mathematical fields such as algebraic geometry, combinatorial number theory, probability theory, representation theory, and the theory of error-correcting codes.
Statistical physics of constraint satisfaction problems
The replica trick is a powerful analytic technique originating in statistical physics as a means of computing the expectation of the logarithm of the normalization constant of a high-dimensional probability distribution known as the Gibbs measure. In physics jargon this quantity is known as the free energy, and all kinds of useful quantities, such as the entropy, can be obtained from it by taking simple derivatives. Computing this normalization constant is, however, an NP-hard problem that a large part of computational statistics attempts to deal with, and which shows up everywhere from coding theory to high-dimensional statistics, compressed sensing, protein-folding analysis, and constraint satisfaction problems. In each of these cases, the replica trick, and its extension by (Parisi et al., 1987), have proven incredibly successful at shedding light on key aspects of the correlation structure of the Gibbs measure and the highly non-convex nature of its negative logarithm. Algorithmically speaking, there are two main methodologies for addressing the intractability of the normalization constant:
a) Statics: in this approach, one casts the system as a graphical model whose vertices represent the individual variables and whose edges reflect the dependencies between them. When the underlying graph is locally tree-like, local message-passing procedures are guaranteed to yield near-exact marginal probabilities or, equivalently, to approximate the normalization constant. The physics prediction of vanishing long-range correlations in the Gibbs measure then translates into the associated graph being locally tree-like, hence permitting the use of local message-passing procedures. This will be the focus of chapter 4.
b) Dynamics: in an orthogonal direction, we can bypass the computation of the normalization constant altogether by defining a Markov chain whose samples converge in distribution to the Gibbs measure, such that after a number of iterations known as the relaxation time, samples are guaranteed to be approximately distributed according to it. To understand the conditions under which each of the two approaches is likely to fail (strong long-range correlations, high energy barriers, etc.), it is very helpful to be familiar with the so-called replica symmetry breaking picture of Parisi. The computations involved are, however, quite intricate, and come with a number of prescriptions and prerequisite notions (such as large deviation principles and saddle-point approximations) that are typically foreign to those without a statistical physics background. The purpose of this thesis is then twofold: i) to provide a self-contained introduction to replica theory, its predictions, and its algorithmic implications for constraint satisfaction problems, and ii) to give an account of state-of-the-art methods for addressing the predicted phase transitions in the case of k-SAT, from both the statics and dynamics points of view,
and to propose a new algorithm that takes these phase transitions into consideration.
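The statics methodology can be illustrated on a toy instance. The sketch below (not from the thesis; the tree, inverse temperature, and external field are invented for illustration) runs sum-product message passing on a tree-structured Ising model, where belief propagation is exact, and checks the marginals against brute-force enumeration.

```python
import math
from itertools import product

# Tree-structured Ising model: pairwise potential psi(a,b) = exp(BETA*a*b),
# unary potential phi(v,s) = exp(H[v]*s), spins in {-1, +1}.
EDGES = [(0, 1), (1, 2), (1, 3)]
N, BETA = 4, 0.6
H = [0.8, 0.0, 0.0, 0.0]  # external field on vertex 0 only
SPINS = (-1, 1)

def psi(a, b):
    return math.exp(BETA * a * b)

def phi(v, s):
    return math.exp(H[v] * s)

neighbors = {v: [] for v in range(N)}
for i, j in EDGES:
    neighbors[i].append(j)
    neighbors[j].append(i)

# Messages msgs[(i, j)][s]: message from i to j, initialized uniform.
msgs = {(i, j): {s: 1.0 for s in SPINS}
        for e in EDGES for (i, j) in (e, e[::-1])}
for _ in range(N):  # >= diameter synchronous passes: exact on a tree
    new = {}
    for (i, j) in msgs:
        out = {b: sum(phi(i, a) * psi(a, b)
                      * math.prod(msgs[(k, i)][a]
                                  for k in neighbors[i] if k != j)
                      for a in SPINS)
               for b in SPINS}
        z = sum(out.values())
        new[(i, j)] = {b: out[b] / z for b in SPINS}
    msgs = new

def bp_marginal(v):
    """Belief at v: unary potential times the product of incoming messages."""
    bel = {s: phi(v, s) * math.prod(msgs[(k, v)][s] for k in neighbors[v])
           for s in SPINS}
    z = sum(bel.values())
    return {s: bel[s] / z for s in SPINS}

def exact_marginal(v):
    """Brute-force marginal; feasible only for tiny N."""
    tot = {s: 0.0 for s in SPINS}
    for cfg in product(SPINS, repeat=N):
        w = (math.prod(psi(cfg[i], cfg[j]) for i, j in EDGES)
             * math.prod(phi(u, cfg[u]) for u in range(N)))
        tot[cfg[v]] += w
    z = sum(tot.values())
    return {s: tot[s] / z for s in SPINS}
```

On loopy graphs the same updates give only approximations; exactness here relies on the graph being a tree, which is the locally tree-like regime the statics discussion invokes.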
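The dynamics methodology can likewise be sketched on a toy system (again illustrative, not from the thesis; the chain length and temperature are arbitrary choices): a Metropolis single-spin-flip Markov chain targeting the Gibbs measure of a small open Ising chain, compared against exact enumeration of the normalization constant.

```python
import math
import random
from itertools import product

# Open Ising chain: E(s) = -sum_i s_i * s_{i+1}, spins in {-1, +1}.
N, BETA = 5, 0.7
SPINS = (-1, 1)

def energy(s):
    return -sum(s[i] * s[i + 1] for i in range(N - 1))

# Normalization constant Z by brute-force enumeration (exponential in N,
# which is exactly the intractability at issue); free energy = -log(Z)/BETA.
Z = sum(math.exp(-BETA * energy(s)) for s in product(SPINS, repeat=N))
free_energy = -math.log(Z) / BETA

# Exact Gibbs expectation of the nearest-neighbour correlation s_0 * s_1.
exact = sum(s[0] * s[1] * math.exp(-BETA * energy(s))
            for s in product(SPINS, repeat=N)) / Z

def chain_estimate(steps, rng):
    """Metropolis chain targeting exp(-BETA*E)/Z; averages s_0*s_1
    after a burn-in, i.e., after the chain has (hopefully) relaxed."""
    s = [rng.choice(SPINS) for _ in range(N)]
    total, count = 0.0, 0
    for t in range(steps):
        i = rng.randrange(N)
        nb = (s[i - 1] if i > 0 else 0) + (s[i + 1] if i < N - 1 else 0)
        dE = 2 * s[i] * nb  # local energy change from flipping spin i
        if dE <= 0 or rng.random() < math.exp(-BETA * dE):
            s[i] = -s[i]
        if t >= steps // 2:  # discard the first half as burn-in
            total += s[0] * s[1]
            count += 1
    return total / count

estimate = chain_estimate(200_000, random.Random(0))
# For an open chain, exact == tanh(BETA); the MCMC estimate should be close.
```

Note that the sampler never needs Z: the Metropolis acceptance ratio depends only on the energy difference, which is the whole point of bypassing the normalization constant.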
- âŠ