1,132 research outputs found
Phase Transition in the Number Partitioning Problem
Number partitioning is an NP-complete problem of combinatorial optimization.
A statistical mechanics analysis reveals the existence of a phase transition
that separates the easy-to-solve from the hard-to-solve instances and that
reflects the pseudo-polynomiality of number partitioning. The phase diagram
and the value of the typical ground state energy are calculated.
Comment: minor changes (references, typos and discussion of results)
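The pseudo-polynomiality mentioned in the abstract can be seen in a minimal sketch (our own illustration, not the paper's method): a subset-sum dynamic program finds the minimum partition discrepancy in time polynomial in the *magnitude* of the numbers, hence exponential in their bit length — which is why the control parameter of the phase transition is bits per item.

```python
def min_partition_discrepancy(a):
    """Pseudo-polynomial exact solver for number partitioning.

    Runs in O(n * sum(a)) time: polynomial in the magnitude of the
    inputs, exponential in their bit length.
    """
    total = sum(a)
    reachable = {0}  # all subset sums seen so far
    for x in a:
        reachable |= {s + x for s in reachable}
    # A subset with sum s gives discrepancy |s - (total - s)| = |2s - total|.
    return min(abs(2 * s - total) for s in reachable)

print(min_partition_discrepancy([4, 5, 6, 7, 8]))  # -> 0, e.g. {4,5,6} vs {7,8}
```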
Random Costs in Combinatorial Optimization
The random cost problem is the problem of finding the minimum in an
exponentially long list of random numbers. By definition, this problem cannot
be solved faster than by exhaustive search. It is shown that a classical
NP-hard optimization problem, number partitioning, is essentially equivalent to
the random cost problem. This explains the bad performance of heuristic
approaches to the number partitioning problem and allows us to calculate the
probability distributions of the optimum and sub-optimum costs.
Comment: 4 pages, RevTeX, 2 figures (eps), submitted to PR
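The random-cost picture can be made concrete with a small sketch (our illustration, not the paper's derivation): each of the 2^(n-1) distinct partitions of a list of random numbers has cost |Σ s_i a_i|, and these costs behave like an exponentially long list of essentially independent random values, so nothing beats scanning the list.

```python
import itertools

def partition_costs(a):
    """All partition costs |sum(s_i * a_i)| over sign sequences s_i = +/-1.

    The sign of a[0] is fixed to +1, since flipping every sign gives the
    same partition, leaving 2^(n-1) distinct costs.
    """
    n = len(a)
    return sorted(abs(a[0] + sum(s * x for s, x in zip(signs, a[1:])))
                  for signs in itertools.product((+1, -1), repeat=n - 1))

print(partition_costs([1, 2, 3]))  # -> [0, 2, 4, 6]
```

For random inputs of a few hundred bits, the sorted list of costs is statistically indistinguishable from sorted i.i.d. random numbers, which is the equivalence the abstract refers to.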
Phase transition for cutting-plane approach to vertex-cover problem
We study the vertex-cover problem which is an NP-hard optimization problem
and a prototypical model exhibiting phase transitions on random graphs, e.g.,
Erdős-Rényi (ER) random graphs. These phase transitions coincide with changes
of the solution space structure, e.g., for the ER ensemble at connectivity
c = e ≈ 2.7183 from replica symmetric to replica-symmetry broken. For the
vertex-cover problem, the typical complexity of exact branch-and-bound
algorithms, which proceed by exploring the landscape of feasible
configurations, also changes close to this phase transition from "easy" to "hard". In
this work, we consider an algorithm which has a completely different strategy:
The problem is mapped onto a linear programming problem augmented by a
cutting-plane approach, hence the algorithm operates in a space OUTSIDE the
space of feasible configurations until the final step, where a solution is
found. Here we show that this type of algorithm also exhibits an "easy-hard"
transition around c=e, which strongly indicates that the typical hardness of a
problem is fundamental to the problem and not due to a specific representation
of the problem.
Comment: 4 pages, 3 figures
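For contrast with the cutting-plane approach, the branch-and-bound strategy the abstract mentions can be sketched in a few lines (a minimal illustration of the standard technique, not the paper's code): pick any uncovered edge (u, v); every vertex cover must contain u or v, so branch on both choices.

```python
def min_vertex_cover(edges, cover=frozenset()):
    """Exact minimum vertex cover by branch-and-bound on uncovered edges."""
    uncovered = [(u, v) for u, v in edges if u not in cover and v not in cover]
    if not uncovered:
        return cover  # every edge touched: feasible cover found
    u, v = uncovered[0]
    # Any cover must contain u or v; explore both branches.
    a = min_vertex_cover(edges, cover | {u})
    b = min_vertex_cover(edges, cover | {v})
    return a if len(a) <= len(b) else b

# A 4-cycle 0-1-2-3 needs two vertices, e.g. {1, 3}.
print(len(min_vertex_cover([(0, 1), (1, 2), (2, 3), (3, 0)])))  # -> 2
```

Such an algorithm only ever visits vertex sets on its way to feasibility, which is exactly the "inside" search the cutting-plane method avoids.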
New scaling for the alpha effect in slowly rotating turbulence
Using simulations of slowly rotating stratified turbulence, we show that the
alpha effect responsible for the generation of astrophysical magnetic fields is
proportional to the logarithmic gradient of kinetic energy density rather than
that of momentum, as was previously thought. This result is in agreement with a
new analytic theory developed in this paper for large Reynolds numbers. Thus,
the contribution of density stratification is less important than that of
turbulent velocity. The alpha effect and other turbulent transport coefficients
are determined by means of the test-field method. In addition to forced
turbulence, we also investigate supernova-driven turbulence and stellar
convection. In some cases (intermediate rotation rate for forced turbulence,
convection with intermediate temperature stratification, and supernova-driven
turbulence) we find that the contribution of density stratification might be
even less important than suggested by the analytic theory.
Comment: 10 pages, 9 figures, revised version, Astrophys. J., in press
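The scaling result stated in the abstract can be written compactly. Only the logarithmic gradients appear in the abstract; the rotation vector and the omitted prefactors are our notational assumption:

```latex
% New scaling: kinetic energy density sets the alpha effect
\alpha \propto \boldsymbol{\Omega} \cdot \nabla \ln\!\left(\rho\, u_{\mathrm{rms}}^{2}\right)
% Previously assumed scaling: momentum density
\alpha \propto \boldsymbol{\Omega} \cdot \nabla \ln\!\left(\rho\, u_{\mathrm{rms}}\right)
```

Since ∇ln(ρu²) = ∇ln ρ + 2∇ln u, the new form doubles the weight of the velocity gradient relative to the density gradient, which is why density stratification comes out less important.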
Optimization by Quantum Annealing: Lessons from hard 3-SAT cases
The Path Integral Monte Carlo simulated Quantum Annealing algorithm is
applied to the optimization of a large hard instance of the Random 3-SAT
Problem (N=10000). The dynamical behavior of the quantum and the classical
annealing are compared, showing important qualitative differences in the way of
exploring the complex energy landscape of the combinatorial optimization
problem. At variance with the results obtained for the Ising spin glass and for
the Traveling Salesman Problem, in the present case the linear-schedule Quantum
Annealing performance is definitely worse than Classical Annealing.
Nevertheless, a quantum cooling protocol based on field-cycling and able to
outperform standard classical simulated annealing over short time scales is
introduced.
Comment: 10 pages, 6 figures, submitted to PR
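The classical baseline the paper compares against can be sketched as follows (our minimal illustration, not the paper's Path Integral Monte Carlo code): the energy is the number of unsatisfied clauses, moves are single spin flips, and the temperature follows a linear schedule.

```python
import math
import random

def sat_energy(clauses, assign):
    """Number of unsatisfied clauses; clauses use DIMACS-style signed ints."""
    return sum(not any((lit > 0) == assign[abs(lit)] for lit in c)
               for c in clauses)

def anneal(clauses, n_vars, steps=20000, t0=2.0, seed=0):
    """Classical simulated annealing with a linear cooling schedule."""
    rng = random.Random(seed)
    assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
    e = sat_energy(clauses, assign)
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9           # linear schedule
        v = rng.randint(1, n_vars)
        assign[v] = not assign[v]                 # propose a spin flip
        e_new = sat_energy(clauses, assign)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new                             # accept (Metropolis rule)
        else:
            assign[v] = not assign[v]             # reject, flip back
    return e, assign

clauses = [(1, 2, 3), (-1, 2, -3), (1, -2, 3), (-1, -2, -3)]
e, _ = anneal(clauses, 3)
print(e)  # -> 0: this tiny instance is satisfiable
```

The quantum version replaces the thermal flips with Trotterized replicas coupled by a transverse field, but the energy function and move set are the same.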
Influence of severity and level of injury on the occurrence of complications during the subacute and chronic stage of traumatic spinal cord injury: a systematic review
Objective: Secondary health conditions (SHCs) are long-term complications that frequently occur due to traumatic spinal cord injury (tSCI) and can negatively affect quality of life in this patient population. This study provides an overview of the associations between the severity and level of injury and the occurrence of SHCs in tSCI.
Methods: A systematic search was conducted in PubMed and Embase that retrieved 44 studies on the influence of severity and/or level of injury on the occurrence of SHCs in the subacute and chronic phase of tSCI (from 3 months after trauma). The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed.
Results: In the majority of studies, patients with motor-complete tSCI (American Spinal Injury Association [ASIA] Impairment Scale [AIS] grade A or B) had a significantly increased occurrence of SHCs in comparison to patients with motor-incomplete tSCI (AIS grade C or D), such as respiratory and urogenital complications, musculoskeletal disorders, pressure ulcers, and autonomic dysreflexia. In contrast, an increased prevalence of pain was seen in patients with motor-incomplete injuries. In addition, higher rates of pulmonary infections, spasticity, and autonomic dysreflexia were observed in patients with tetraplegia. Patients with paraplegia more commonly suffered from hypertension, venous thromboembolism, and pain.
Conclusions: This review suggests that patients with a motor-complete tSCI have an increased risk of developing SHCs during the subacute and chronic stage of tSCI in comparison with patients with motor-incomplete tSCI. Future studies should examine whether systematic monitoring during rehabilitation and the subacute and chronic phase in patients with motor-complete tSCI could lead to early detection and potential prevention of SHCs in this population.
Phase transition and landscape statistics of the number partitioning problem
The phase transition in the number partitioning problem (NPP), i.e., the
transition from a region in the space of control parameters in which almost all
instances have many solutions to a region in which almost all instances have no
solution, is investigated by examining the energy landscape of this classic
optimization problem. This is achieved by coding the information about the
minimum energy paths connecting pairs of minima into a tree structure, termed a
barrier tree, the leaves and internal nodes of which represent, respectively,
the minima and the lowest energy saddles connecting those minima. Here we apply
several measures of shape (balance and symmetry) as well as of branch lengths
(barrier heights) to the barrier trees that result from the landscape of the
NPP, aiming at identifying traces of the easy/hard transition. We find that it
is not possible to tell the easy regime from the hard one by visual inspection
of the trees or by measuring the barrier heights. Only the "difficulty"
measure, given by the maximum value of the ratio between the barrier height and
the energy surplus of local minima, succeeded in detecting traces of the phase
transition in the tree. In addition, we show that the barrier trees associated
with the NPP are very similar to random trees, contrasting dramatically with
trees associated with the spin-glass and random energy models. We also
examine critically a recent conjecture on the equivalence between the NPP and a
truncated random energy model.
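The "difficulty" measure described in the abstract can be sketched on a toy 1D landscape (our drastic simplification; the paper builds barrier trees over the NPP configuration hypercube): for each non-global local minimum, the barrier is the lowest pass on any path to a strictly lower state, the surplus is its height above the ground state, and the difficulty is the maximum barrier-to-surplus ratio.

```python
def difficulty(E):
    """Max ratio (barrier height) / (energy surplus) over local minima
    of a 1D landscape E[0..n-1]; the global minimum is excluded."""
    n, ground = len(E), min(E)
    ratios = []
    for i, e in enumerate(E):
        if e == ground:
            continue
        if (i == 0 or E[i - 1] > e) and (i == n - 1 or E[i + 1] > e):
            barriers = []
            for step in (-1, 1):          # walk left and right
                peak, j = e, i + step
                while 0 <= j < n:
                    peak = max(peak, E[j])
                    if E[j] < e:          # reached a strictly lower state
                        barriers.append(peak - e)
                        break
                    j += step
            if barriers:
                ratios.append(min(barriers) / (e - ground))
    return max(ratios)

print(difficulty([0, 3, 1, 5, 2, 4, 0.5]))  # -> 9.0 (the shallow minimum at 0.5 dominates)
```

A large value flags a low-lying local minimum trapped behind a comparatively high barrier, which is the signature of hardness the authors find at the phase transition.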
Evidence for proton acceleration up to TeV energies based on VERITAS and Fermi-LAT observations of the Cas A SNR
We present a study of gamma-ray emission from the core-collapse supernova
remnant Cas A in the energy range from 0.1 GeV to 10 TeV. We used 65 hours of
VERITAS data to cover 200 GeV - 10 TeV, and 10.8 years of Fermi-LAT
data to cover 0.1-500 GeV. The spectral analysis of Fermi-LAT data
shows a significant spectral curvature around GeV that is
consistent with the expected spectrum from pion decay. Above this energy, the
joint spectrum from Fermi-LAT and VERITAS deviates significantly from
a simple power-law, and is best described by a power-law with spectral index of
 with a cut-off energy of TeV. These
results, along with radio, X-ray and gamma-ray data, are interpreted in the
context of leptonic and hadronic models. Assuming a one-zone model, we exclude
a purely leptonic scenario and conclude that proton acceleration up to at least
6 TeV is required to explain the observed gamma-ray spectrum. From modeling
of the entire multi-wavelength spectrum, a minimum magnetic field inside the
remnant of is deduced.
Comment: 33 pages, 9 figures, 6 tables
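The fitted spectral shape referred to above is the standard power law with an exponential cut-off (the index and cut-off values are missing from this extraction; the symbols below are the conventional notation, our assumption):

```latex
\frac{dN}{dE} \propto \left(\frac{E}{E_0}\right)^{-\Gamma}
               \exp\!\left(-\frac{E}{E_{\mathrm{cut}}}\right)
```

Here Γ is the spectral index, E₀ a reference energy, and E_cut the cut-off energy quoted in TeV in the abstract.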