
    Phase transition for cutting-plane approach to vertex-cover problem

    We study the vertex-cover problem, an NP-hard optimization problem and a prototypical model exhibiting phase transitions on random graphs, e.g., Erdős–Rényi (ER) random graphs. These phase transitions coincide with changes of the solution-space structure, e.g., for the ER ensemble at connectivity c = e ≈ 2.7183, from replica symmetric to replica-symmetry broken. For the vertex-cover problem, the typical complexity of exact branch-and-bound algorithms, which proceed by exploring the landscape of feasible configurations, also changes from "easy" to "hard" close to this phase transition. In this work, we consider an algorithm with a completely different strategy: the problem is mapped onto a linear programming problem augmented by a cutting-plane approach, hence the algorithm operates in a space OUTSIDE the space of feasible configurations until the final step, where a solution is found. Here we show that this type of algorithm also exhibits an "easy-hard" transition around c = e, which strongly indicates that the typical hardness of a problem is fundamental to the problem and not due to a specific representation of the problem. Comment: 4 pages, 3 figures
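
    A minimal sketch of the cutting-plane idea on a toy instance (illustrative only, not the authors' implementation; it assumes scipy is available and uses a 5-cycle, whose bare LP relaxation is fractional): the integer vertex-cover program is relaxed to a linear program, and a single odd-cycle inequality is then added as a cut.

# Sketch: LP relaxation of vertex cover plus one odd-cycle cutting plane.
import numpy as np
from scipy.optimize import linprog

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]   # a 5-cycle
n = 5

def solve_lp(cuts):
    """min sum(x) s.t. x_u + x_v >= 1 per edge, plus cuts, 0 <= x <= 1."""
    A_ub, b_ub = [], []
    for u, v in edges:                   # x_u + x_v >= 1  ->  -x_u - x_v <= -1
        row = np.zeros(n)
        row[u] = row[v] = -1.0
        A_ub.append(row)
        b_ub.append(-1.0)
    for cycle, rhs in cuts:              # cut: sum_{v in C} x_v >= rhs
        row = np.zeros(n)
        row[list(cycle)] = -1.0
        A_ub.append(row)
        b_ub.append(-rhs)
    res = linprog(np.ones(n), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0.0, 1.0)] * n, method="highs")
    return res.x

x = solve_lp(cuts=[])                    # every x_v = 0.5: fractional, value 2.5
print("bare LP:", np.round(x, 2), "value", round(float(x.sum()), 2))

# Any odd cycle C needs at least (|C| + 1) / 2 covered vertices.
x = solve_lp(cuts=[((0, 1, 2, 3, 4), 3.0)])
print("with cut:", np.round(x, 2), "value", round(float(x.sum()), 2))

    On the 5-cycle a single cut already lifts the LP value to the integer optimum of 3; the full algorithm generates such cuts iteratively, staying outside the feasible space until an integral solution emerges.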

    Certainty Closure: Reliable Constraint Reasoning with Incomplete or Erroneous Data

    Constraint Programming (CP) has proved an effective paradigm for modelling and solving difficult combinatorial satisfaction and optimisation problems from disparate domains. Many such problems arising from the commercial world are permeated by data uncertainty. Existing CP approaches that accommodate uncertainty are less suited to uncertainty arising from incomplete and erroneous data, because they do not build reliable models and solutions guaranteed to address the user's genuine problem as she perceives it. Other fields such as reliable computation offer combinations of models and associated methods to handle these types of uncertain data, but lack an expressive framework that characterises the resolution methodology independently of the model. We present a unifying framework that extends the CP formalism in both model and solutions to tackle ill-defined combinatorial problems with incomplete or erroneous data. The certainty closure framework brings together modelling and solving methodologies from different fields into the CP paradigm to provide reliable and efficient approaches for uncertain constraint problems. We demonstrate the applicability of the framework on a case study in network diagnosis. We define resolution forms, generic templates with associated operational semantics, from which practical methods for deriving reliable solutions are obtained. Comment: Revised version
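
    As a toy rendering of the closure idea (schematic only; the paper's formalism is far more general), one can enumerate the possible realizations of an uncertain datum and keep every candidate supported by at least one realization, so that no solution of the user's true problem is lost.

# Sketch: a "closure" of solutions under data uncertainty (hypothetical example).
from itertools import product

domain = range(10)
d_realizations = {1, 2, 3}      # the datum d is only known to lie in {1, 2, 3}

# Uncertain constraint: x + d == y. Keep (x, y) supported by some realization.
closure = {(x, y)
           for x, y in product(domain, domain)
           if any(x + d == y for d in d_realizations)}

print((2, 4) in closure)        # True: supported when d == 2
print((2, 9) in closure)        # False: no realization of d supports it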

    Optimization by Quantum Annealing: Lessons from hard 3-SAT cases

    The Path Integral Monte Carlo simulated Quantum Annealing algorithm is applied to the optimization of a large, hard instance of the Random 3-SAT problem (N=10000). The dynamical behavior of the quantum and the classical annealing are compared, showing important qualitative differences in the way they explore the complex energy landscape of the combinatorial optimization problem. At variance with the results obtained for the Ising spin glass and for the Traveling Salesman Problem, in the present case the performance of linear-schedule Quantum Annealing is definitely worse than that of Classical Annealing. Nevertheless, we introduce a quantum cooling protocol, based on field-cycling, that is able to outperform standard classical simulated annealing over short time scales. Comment: 10 pages, 6 figures, submitted to PR
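
    The sketch below is a schematic path-integral Monte Carlo quantum annealer for a tiny random 3-SAT instance, not the paper's code: the parameters are arbitrary, and the inter-slice coupling uses one common convention for the Trotter break-up, J = -(PT/2) ln tanh(Gamma/(PT)), whose prefactors vary in the literature.

# Sketch: PIMC quantum annealing of random 3-SAT (toy scale, schematic).
import numpy as np

rng = np.random.default_rng(0)
N, M, P, T = 30, 126, 10, 0.1    # variables, clauses, Trotter slices, temperature
# Each clause: three (variable, required sign) literals.
clauses = [list(zip(rng.choice(N, 3, replace=False), rng.choice([-1, 1], 3)))
           for _ in range(M)]

def energy(s):
    """Number of violated clauses for a configuration s in {-1,+1}^N."""
    return sum(all(s[v] != sign for v, sign in cl) for cl in clauses)

spins = rng.choice([-1, 1], size=(P, N))     # one replica per Trotter slice
for gamma in np.linspace(2.5, 1e-3, 100):    # linear transverse-field schedule
    # Ferromagnetic coupling between adjacent slices (assumed convention).
    J = -0.5 * P * T * np.log(np.tanh(gamma / (P * T)))
    for _ in range(N * P):                   # one Metropolis sweep
        k, i = rng.integers(P), rng.integers(N)
        trial = spins[k].copy()
        trial[i] = -trial[i]
        d_cl = energy(trial) - energy(spins[k])
        d_q = 2.0 * J * spins[k, i] * (spins[(k - 1) % P, i] + spins[(k + 1) % P, i])
        if rng.random() < np.exp(-(d_cl + d_q) / (P * T)):
            spins[k] = trial
print("best replica energy:", min(energy(s) for s in spins))

    A classical annealer corresponds to the P = 1, J = 0 special case with a temperature schedule in place of the field schedule, which is what makes the head-to-head comparison in the abstract possible.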

    Highlights of the Zeno Results from the USMP-2 Mission

    The Zeno instrument, a high-precision light-scattering spectrometer, was built to measure the decay rates of density fluctuations in xenon near its liquid-vapor critical point in the low-gravity environment of the U.S. Space Shuttle. By eliminating the severe density gradients created in a critical fluid by Earth's gravity, we were able to make measurements to within 100 microkelvin of the critical point. The instrument flew for fourteen days in March 1994 on the Space Shuttle Columbia, flight STS-62, as part of the very successful USMP-2 payload. We describe the instrument and document its performance on orbit, showing that it comfortably reached the desired 3 microkelvin temperature control of the sample. Locating the critical temperature of the sample on orbit was a scientific challenge; we discuss the advantages and shortcomings of the two techniques we used. Finally, we discuss problems encountered in measuring the turbidity of the sample, and close with the results of the measurement of the decay rates of the critical-point fluctuations.

    Entropy-based analysis of the number partitioning problem

    In this paper we apply the multicanonical method of statistical physics to the number-partitioning problem (NPP). This problem is a basic NP-hard problem from computer science, and can be formulated as a spin-glass problem. We compute the spectral degeneracy, which gives us information about the number of solutions for a given cost E and cardinality m. We also study an extension of this problem to Q partitions. We show that a fundamental difference exists in the spectral degeneracy of the generalized (Q>2) NPP, which could explain why it is so difficult to find good solutions for this case. The information obtained with the multicanonical method can be very useful in the construction of new algorithms. Comment: 6 pages, 4 figures
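
    For toy sizes the spectral degeneracy can be obtained by exact enumeration, which makes the quantity concrete; multicanonical sampling is what extends such counts to sizes where 2^N enumeration is impossible. A minimal sketch on a hypothetical instance:

# Sketch: exact degeneracy g(E, m) of a toy NPP instance by enumeration.
from itertools import product
from collections import Counter
import random

random.seed(1)
N = 16
a = [random.randint(1, 100) for _ in range(N)]      # random instance

g = Counter()
for s in product((-1, 1), repeat=N):                # all 2^N configurations
    E = abs(sum(si * ai for si, ai in zip(s, a)))   # cost: partition imbalance
    m = s.count(1)                                  # cardinality of one part
    g[(E, m)] += 1

print("g(0, N/2) =", g[(0, N // 2)])                # perfect balanced partitions
print("lowest costs:", sorted({E for E, _ in g})[:5])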

    Galactose inhibition of the constitutive transport of hexoses in Saccharomyces cerevisiae

    The relationship between the pathways of glucose and galactose utilization in Saccharomyces cerevisiae has been studied. Galactose (which is transported and phosphorylated by inducible systems) is a strong inhibitor of the utilization of glucose, fructose and mannose (which share the same constitutive transport and phosphorylation systems). Conversely, all three of these hexoses inhibit the utilization of galactose, though with poor efficiency. These cross-inhibitions occur only in yeast adapted to galactose or in galactose-constitutive mutants. Galactose is an even more efficient inhibitor than any of the other three hexoses are of one another's utilization. Phosphorylation is not involved in the inhibition; transport of sugars is the affected step. The cross-inhibitions between galactose and either glucose, fructose or mannose do not involve utilization of one hexose at the expense of the other, as occurs in the mutual interactions between the latter three sugars. It seems that, by growing the yeast in galactose, a protein component is synthesized, or alternatively modified, that once bound to either galactose or any one of the other three hexoses (glucose, fructose or mannose), cross-interacts respectively with the constitutive or the inducible transport systems, impairing their function. This work was supported by a grant (PB87-0206) from the DGICYT, Promoción General del Conocimiento. Peer Reviewed

    Implementation of the LANS-alpha turbulence model in a primitive equation ocean model

    This paper presents the first numerical implementation and tests of the Lagrangian-averaged Navier-Stokes-alpha (LANS-alpha) turbulence model in a primitive equation ocean model. The ocean model in which we work is the Los Alamos Parallel Ocean Program (POP); we refer to POP with our implementation of LANS-alpha as POP-alpha. Two versions of POP-alpha are presented: the full POP-alpha algorithm is derived from the LANS-alpha primitive equations, but requires a nested iteration that makes it too slow for practical simulations; a reduced POP-alpha algorithm is proposed, which lacks the nested iteration and is two to three times faster than the full algorithm. The reduced algorithm does not follow from a formal derivation of the LANS-alpha model equations. Despite this, simulations with the reduced algorithm are nearly identical to those with the full algorithm, as judged by globally averaged temperature and kinetic energy, and by snapshots of temperature and velocity fields. Both POP-alpha algorithms can run stably with longer timesteps than standard POP. Comparisons of the full and reduced POP-alpha algorithms are made within an idealized test problem that captures some aspects of the Antarctic Circumpolar Current, a problem in which baroclinic instability is prominent. Both POP-alpha algorithms produce statistics that resemble higher-resolution simulations of standard POP. A linear stability analysis shows that both the full and reduced POP-alpha algorithms benefit from the way the LANS-alpha equations take into account the effects of the small scales on the large. Both algorithms (1) are stable; (2) make the Rossby radius effectively larger; and (3) slow down Rossby and gravity waves. Comment: Submitted to J. Computational Physics March 21, 200
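
    A central step in any LANS-alpha implementation is the Helmholtz inversion u = (1 - alpha^2 nabla^2)^{-1} v relating the smoothed velocity u to the rough velocity v. A minimal spectral sketch on a periodic 2-D grid is given below (illustrative only; POP-alpha performs this inversion on the POP grid rather than with FFTs).

# Sketch: Helmholtz smoothing, the core filter of a LANS-alpha model.
import numpy as np

n, L, alpha = 128, 2 * np.pi, 0.1
k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)          # angular wavenumbers
kx, ky = np.meshgrid(k, k, indexing="ij")
k2 = kx**2 + ky**2

def helmholtz_smooth(v):
    """Return u solving (1 - alpha^2 nabla^2) u = v; damps small scales."""
    return np.real(np.fft.ifft2(np.fft.fft2(v) / (1.0 + alpha**2 * k2)))

# Demo: large scales survive, grid-scale noise is attenuated.
x = np.linspace(0.0, L, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
v = np.sin(X) * np.cos(Y) + 0.5 * np.random.default_rng(0).standard_normal((n, n))
u = helmholtz_smooth(v)
print("variance before/after:", float(v.var()), float(u.var()))

    Each mode k is damped by 1/(1 + alpha^2 k^2), so scales larger than alpha pass nearly untouched while the smallest scales are suppressed, which is how the model accounts for the effect of the small scales on the large.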

    An initial intercomparison of atmospheric and oceanic climatology for the ICE-5G and ICE-4G models of LGM paleotopography

    This paper investigates the impact of the new ICE-5G paleotopography dataset for Last Glacial Maximum (LGM) conditions on a coupled model simulation of the thermal and dynamical state of the glacial atmosphere and on both land surface and sea surface conditions. The study is based upon coupled climate simulations performed with ECBilt-CLIO, an ocean–atmosphere–sea ice model of intermediate complexity. Four simulations focusing on the Last Glacial Maximum [21 000 calendar years before present (BP)] have been analyzed, the central pair being: LGM-4G, which employed the original ICE-4G ice sheet topography and albedo, and LGM-5G, which employed the newly constructed ice sheet topography, denoted ICE-5G, and its respective albedo. Intercomparison of the results obtained in these experiments demonstrates that the LGM-5G simulation delivers significantly enhanced cooling over Canada compared to the LGM-4G simulation, whereas positive temperature anomalies are simulated over southern North America and the northern Atlantic. Moreover, introduction of the ICE-5G topography is shown to lead to a deceleration of the subtropical westerlies and to the development of an intensified ridge over North America, which has a profound effect upon the hydrological cycle. Additionally, two flat ice sheet experiments were carried out to investigate the impact of the ice sheet albedo on global climate. By comparing these experiments with the full LGM simulations, it becomes evident that the climate anomalies between LGM-5G and LGM-4G are mainly driven by changes of the Earth's topography.