235 research outputs found
QRAT+: Generalizing QRAT by a More Powerful QBF Redundancy Property
The QRAT (quantified resolution asymmetric tautology) proof system simulates
virtually all inference rules applied in state-of-the-art quantified Boolean
formula (QBF) reasoning tools. It consists of rules to rewrite a QBF by adding
and deleting clauses and universal literals that have a certain redundancy
property. To check for this redundancy property in QRAT, propositional unit
propagation (UP) is applied to the quantifier-free, i.e., propositional, part of the QBF. We generalize the redundancy property in the QRAT system by QBF-specific UP (QUP). QUP extends UP by the universal reduction operation to
eliminate universal literals from clauses. We apply QUP to an abstraction of
the QBF where certain universal quantifiers are converted into existential
ones. This way, we obtain a generalization of QRAT we call QRAT+. The
redundancy property in QRAT+ based on QUP is more powerful than the one in QRAT
based on UP. We report on proof theoretical improvements and experimental
results to illustrate the benefits of QRAT+ for QBF preprocessing.
Comment: preprint of a paper to be published at IJCAR 2018, LNCS, Springer, including appendix.
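To make the QUP operation concrete, the following is a minimal Python sketch of propositional unit propagation interleaved with universal reduction, the combination the abstract calls QBF-specific unit propagation; the integer-literal representation and the level / is_existential maps are illustrative assumptions, not the notation of the QRAT+ paper or of any particular tool.

def universal_reduction(clause, level, is_existential):
    """Drop universal literals quantified after every existential literal in the clause."""
    ex_levels = [level[abs(l)] for l in clause if is_existential[abs(l)]]
    max_ex = max(ex_levels, default=-1)
    # Keep a universal literal only if some existential literal in the clause is inner to it.
    return [l for l in clause if is_existential[abs(l)] or level[abs(l)] < max_ex]

def qup(clauses, level, is_existential):
    """Return 'conflict' if QUP derives the empty clause, otherwise the forced assignment."""
    assignment = {}                      # variable -> Boolean value
    changed = True
    while changed:
        changed = False
        remaining = []
        for clause in clauses:
            # Skip clauses already satisfied by the current assignment.
            if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                continue
            # Remove falsified literals, then apply universal reduction.
            clause = [l for l in clause if abs(l) not in assignment]
            clause = universal_reduction(clause, level, is_existential)
            if not clause:
                return "conflict"        # empty clause derived
            if len(clause) == 1:         # unit clause forces its (existential) literal
                assignment[abs(clause[0])] = clause[0] > 0
                changed = True
            remaining.append(clause)
        clauses = remaining
    return assignment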
Controlling qubit arrays with anisotropic XXZ Heisenberg interaction by acting on a single qubit
We investigate anisotropic XXZ Heisenberg spin-1/2 chains with control fields acting on one of the end spins, with the aim of exploring local quantum control in arrays of interacting qubits. In this work, which uses a recent Lie-algebraic result on the local controllability of spin chains with "always-on" interactions, we determine piecewise-constant control pulses corresponding to optimal fidelities for quantum gates such as spin-flip (NOT), controlled-NOT (CNOT), and square-root-of-SWAP ($\sqrt{\mathrm{SWAP}}$). We find the minimal times for realizing different gates depending on the anisotropy parameter Δ of the model, showing that the shortest among these gate times are achieved for particular values of Δ larger than unity. To study the influence of possible imperfections in anticipated experimental realizations of qubit arrays, we analyze the robustness of the obtained gate fidelities to random variations in the control-field amplitudes and to finite rise times of the pulses. Finally, we discuss the implications of our study for superconducting charge-qubit arrays.
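For reference, the model sketched above is conventionally written as an anisotropic XXZ Heisenberg chain with a local control field acting on the first spin; the following standard form is an illustrative sketch, and the exact coupling operators and control terms used in the paper may differ:

\[
H(t) = J \sum_{i=1}^{N-1}\left(\sigma^x_i\sigma^x_{i+1} + \sigma^y_i\sigma^y_{i+1} + \Delta\,\sigma^z_i\sigma^z_{i+1}\right) + f(t)\,\sigma^x_1,
\]

where Δ is the anisotropy parameter and f(t) is the piecewise-constant control amplitude acting on the end qubit.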
A Linear Weight Transfer Rule for Local Search
The Divide and Distribute Fixed Weights algorithm (ddfw) is a dynamic local
search SAT-solving algorithm that transfers weight from satisfied to falsified
clauses in local minima. ddfw is remarkably effective on several hard
combinatorial instances. Yet, despite its success, it has received little study
since its debut in 2005. In this paper, we propose three modifications to the
base algorithm: a linear weight transfer method that moves a dynamic amount of
weight between clauses in local minima, an adjustment to how satisfied clauses
are chosen in local minima to give weight, and a weighted-random method of
selecting variables to flip. We implemented our modifications to ddfw on top of
the solver yalsat. Our experiments show that our modifications boost the
performance compared to the original ddfw algorithm on multiple benchmarks,
including those from the past three years of SAT competitions. Moreover, our
improved solver exclusively solves hard combinatorial instances that refute a
conjecture on the lower bounds of two van der Waerden numbers set forth by Ahmed et al. (2014), and it performs well on a hard graph-coloring instance that has been open for over three decades.
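To illustrate the kind of rule described above, the following Python sketch performs one weight-transfer step at a local minimum using a linear amount a·w + c taken from a satisfied clause; the parameter values, the neighbour-based donor selection, and the fallback to a random satisfied clause are assumptions for illustration and not the exact rule or tuning used by ddfw or by this paper.

import random

def transfer_weights(falsified, satisfied_neighbours, weight,
                     a=0.1, c=1.0, init_weight=8.0):
    """falsified: indices of falsified clauses; satisfied_neighbours: clause index ->
    satisfied clauses sharing a literal; weight: per-clause weights, updated in place."""
    for ci in falsified:
        donors = satisfied_neighbours.get(ci, [])
        # Prefer a satisfied neighbour that has accumulated more than its initial weight.
        heavy = [d for d in donors if weight[d] > init_weight]
        pool = heavy or donors
        if not pool:
            continue
        donor = random.choice(pool)
        amount = min(a * weight[donor] + c, weight[donor])  # linear transfer, capped
        weight[donor] -= amount
        weight[ci] += amount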
Entanglement dynamics of two qubits under the influence of external kicks and Gaussian pulses
We have investigated, both analytically and numerically, the dynamics of entanglement between two spin-1/2 qubits that are subject to independent kick- and Gaussian-pulse-type external magnetic fields. The Dyson time-ordering effect on the dynamics is found to be important for sequences of kicks. We show that "almost-steady" high entanglement can be created between two initially unentangled qubits by using carefully designed kick or pulse sequences.
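For orientation, the two drive types and the time-ordering mentioned above can be summarised by a generic parametrisation (an illustrative sketch; the paper's exact pulse shapes and conventions may differ):

\[
B_{\mathrm{kick}}(t) = \sum_k A_k\,\delta(t - t_k), \qquad
B_{\mathrm{Gauss}}(t) = \sum_k A_k\,e^{-(t - t_k)^2/(2\tau^2)},
\]

with the Dyson time-ordering entering through the time-ordered propagator

\[
U(t) = \mathcal{T}\exp\!\left(-\frac{i}{\hbar}\int_0^{t} H(t')\,dt'\right).
\]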
Nonexistence Certificates for Ovals in a Projective Plane of Order Ten
In 1983, a computer search was performed for ovals in a projective plane of
order ten. The search was exhaustive and negative, implying that such ovals do
not exist. However, no nonexistence certificates were produced by this search,
and to the best of our knowledge the search has never been independently
verified. In this paper, we rerun the search for ovals in a projective plane of
order ten and produce a collection of nonexistence certificates that, when
taken together, imply that such ovals do not exist. Our search program uses the
cube-and-conquer paradigm from the field of satisfiability (SAT) checking,
coupled with a programmatic SAT solver and the nauty symbolic computation
library for removing symmetries from the search.
Comment: Appears in the Proceedings of the 31st International Workshop on Combinatorial Algorithms (IWOCA 2020).
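To illustrate the core constraint behind such a search, the following Python sketch generates the "no three chosen points are collinear" clauses from an incidence structure, with one Boolean variable per point; this generic encoding is for illustration only and does not reproduce the paper's actual encoding, the nauty-based symmetry breaking, or the cube-and-conquer splitting.

from itertools import combinations

def no_three_collinear_clauses(lines):
    """lines: iterable of lines, each given as a list of 1-based point variables."""
    clauses = []
    for line in lines:
        for p, q, r in combinations(line, 3):
            # At least one of any three collinear points must be left out of the oval.
            clauses.append([-p, -q, -r])
    return clauses

# Tiny toy incidence structure (not a projective plane of order ten).
toy_lines = [[1, 2, 3], [3, 4, 5], [1, 4, 6]]
print(no_three_collinear_clauses(toy_lines))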
Volumetric measurements of weak current-induced magnetic fields in the human brain at high resolution
PURPOSE
Clinical use of transcranial electrical stimulation (TES) requires accurate knowledge of the injected current distribution in the brain. MR current density imaging (MRCDI) uses measurements of the TES-induced magnetic fields to provide this information. However, sufficient sensitivity and image quality in humans in vivo have only been documented for single-slice imaging.
METHODS
A recently developed, optimally spoiled, acquisition-weighted, gradient echo-based 2D-MRCDI method has now been advanced for volume coverage with densely or sparsely distributed slices: The 3D rectilinear sampling (3D-DENSE) and simultaneous multislice acquisition (SMS-SPARSE) were optimized and verified by cable-loop experiments and tested with 1-mA TES experiments for two common electrode montages.
RESULTS
Comparisons of the volumetric methods against the 2D-MRCDI showed that the relatively long acquisition times of 3D-DENSE using a single slab with six slices hindered the expected sensitivity improvement in the current-induced field measurements, but improved sensitivity by 61% in the Laplacian of the field, on which some MRCDI reconstruction methods rely. Also, SMS-SPARSE acquisition of three slices, with a factor-2 CAIPIRINHA (controlled aliasing in parallel imaging results in higher acceleration) acceleration, performed best against the 2D-MRCDI, with sensitivity improvements for the current-induced field and Laplacian noise floors of 56% and 78% (baseline without current flow) as well as 43% and 55% (current injection into the head). SMS-SPARSE reached a sensitivity of 67 pT for three distant slices at 2 × 2 × 3 mm resolution in 10 min of total scan time, and consistently improved image quality.
CONCLUSION
Volumetric MRCDI measurements with high sensitivity and image quality are well suited to characterize the TES field distribution in the human brain.
On QBF Proofs and Preprocessing
QBFs (quantified Boolean formulas), which are a superset of propositional
formulas, provide a canonical representation for PSPACE problems. To overcome
the inherent complexity of QBF, significant effort has been invested in
developing QBF solvers as well as the underlying proof systems. At the same
time, formula preprocessing is crucial for the application of QBF solvers. This
paper focuses on a missing link in currently-available technology: How to
obtain a certificate (e.g. proof) for a formula that had been preprocessed
before it was given to a solver? The paper targets a suite of commonly-used
preprocessing techniques and shows how to reconstruct certificates for them. On
the negative side, the paper discusses certain limitations of the
currently-used proof systems in the light of preprocessing. The presented
techniques were implemented and evaluated in the state-of-the-art QBF
preprocessor bloqqer.
Comment: LPAR 201
Automating Deductive Verification for Weak-Memory Programs
Writing correct programs for weak memory models such as the C11 memory model
is challenging because of the weak consistency guarantees these models provide.
The first program logics for the verification of such programs have recently
been proposed, but their usage has been limited thus far to manual proofs.
Automating proofs in these logics via first-order solvers is non-trivial, due
to reasoning features such as higher-order assertions, modalities and rich
permission resources. In this paper, we provide the first implementation of a
weak memory program logic using existing deductive verification tools. We
tackle three recent program logics: Relaxed Separation Logic and two forms of
Fenced Separation Logic, and show how these can be encoded using the Viper
verification infrastructure. In doing so, we illustrate several novel encoding
techniques which could be employed for other logics. Our work is implemented,
and has been evaluated on examples from existing papers as well as the Facebook
open-source Folly library.
Comment: Extended version of TACAS 2018 publication.
Evaluating QBF Solvers: Quantifier Alternations Matter
We present an experimental study of the effects of quantifier alternations on
the evaluation of quantified Boolean formula (QBF) solvers. The number of
quantifier alternations in a QBF in prenex conjunctive normal form (PCNF) is
directly related to the theoretical hardness of the respective QBF
satisfiability problem in the polynomial hierarchy. We show empirically that
the performance of solvers based on different solving paradigms substantially
varies depending on the numbers of alternations in PCNFs. In related
theoretical work, quantifier alternations have become the focus of
understanding the strengths and weaknesses of various QBF proof systems
implemented in solvers. Our results motivate the development of methods to
evaluate orthogonal solving paradigms by taking quantifier alternations into
account. This is necessary to showcase the broad range of existing QBF solving
paradigms for practical QBF applications. Moreover, we highlight the potential
of combining different approaches and QBF proof systems in solvers.
Comment: preprint of a paper to be published at CP 2018, LNCS, Springer, including appendix.
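For context, the hardness connection mentioned above is the classical result that deciding a prenex QBF with k quantifier blocks is $\Sigma^p_k$-complete when the outermost block is existential and $\Pi^p_k$-complete when it is universal, with the problem becoming PSPACE-complete for an unbounded number of alternations; each additional alternation in a PCNF thus corresponds to one more level of the polynomial hierarchy.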
MaxPre : An Extended MaxSAT Preprocessor
We describe MaxPre, an open-source preprocessor for (weighted partial) maximum satisfiability (MaxSAT). MaxPre implements both SAT-based and MaxSAT-specific preprocessing techniques, and offers solution reconstruction, cardinality constraint encoding, and an API for tight integration into SAT-based MaxSAT solvers.
Peer reviewed