Dependency Stochastic Boolean Satisfiability: A Logical Formalism for NEXPTIME Decision Problems with Uncertainty
Stochastic Boolean Satisfiability (SSAT) is a logical formalism for modeling decision problems with uncertainty, such as the Partially Observable Markov Decision Process (POMDP) used in the verification of probabilistic systems. SSAT, however, is limited in descriptive power to the PSPACE complexity class. More complex problems, such as the NEXPTIME-complete Decentralized POMDP (Dec-POMDP), cannot be succinctly encoded with SSAT. To provide a logical
formalism of such problems, we extend the Dependency Quantified Boolean Formula
(DQBF), a representative problem in the NEXPTIME-complete class, to its
stochastic variant, named Dependency SSAT (DSSAT), and show that DSSAT is also
NEXPTIME-complete. We demonstrate the potential applications of DSSAT to
circuit synthesis of probabilistic and approximate design. Furthermore, to
study the descriptive power of DSSAT, we establish a polynomial-time reduction
from Dec-POMDP to DSSAT. With the theoretical foundations laid in this work, we hope to encourage the development of DSSAT solvers for potentially broad applications.
Comment: 10 pages, 5 figures. A condensed version of this work is published in the AAAI Conference on Artificial Intelligence (AAAI) 2021
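The SSAT semantics the abstract builds on, maximizing over existentially quantified variables and averaging over randomized ones, can be illustrated with a naive recursive evaluator. This is only a sketch under our own naming (the `prefix` encoding is ours, not the authors'); it covers plain SSAT with a linear prefix, whereas DSSAT additionally lets existential variables depend on declared subsets of the randomized variables.

```python
def ssat_value(prefix, formula, assignment=None):
    """Recursively evaluate an SSAT instance.

    prefix: list of (quantifier, var, prob) triples, in quantification order;
            quantifier 'E' is existential, 'R' is randomized with the given
            probability of the variable being True (prob is ignored for 'E').
    formula: function from a dict var -> bool to bool (the matrix).
    Returns the maximum satisfaction probability of the formula.
    """
    assignment = assignment or {}
    if not prefix:
        return 1.0 if formula(assignment) else 0.0
    (q, var, p), rest = prefix[0], prefix[1:]
    v_true = ssat_value(rest, formula, {**assignment, var: True})
    v_false = ssat_value(rest, formula, {**assignment, var: False})
    if q == 'E':
        # Existential variable: choose the branch with higher probability.
        return max(v_true, v_false)
    # Randomized variable: take the expectation over its two values.
    return p * v_true + (1 - p) * v_false
```

For example, for the prefix ∃x R^0.5 y and matrix x ∧ y, the best existential choice x = True yields satisfaction probability 0.5.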
Justicia: A Stochastic SAT Approach to Formally Verify Fairness
As a technology, machine learning (ML) is oblivious to societal good or bad, and thus the field of fair machine learning has stepped up to propose multiple mathematical definitions, algorithms, and systems to ensure different notions of fairness in ML applications. Given the multitude of proposals, it has become imperative to formally verify the fairness metrics satisfied by different algorithms on different datasets. In this paper, we propose a stochastic satisfiability (SSAT) framework, Justicia, that formally verifies different
fairness measures of supervised learning algorithms with respect to the
underlying data distribution. We instantiate Justicia on multiple
classification and bias mitigation algorithms, and datasets to verify different
fairness metrics, such as disparate impact, statistical parity, and equalized
odds. Justicia is scalable, accurate, and operates on non-Boolean and compound
sensitive attributes unlike existing distribution-based verifiers, such as
FairSquare and VeriFair. Being distribution-based by design, Justicia is more robust than verifiers, such as AIF360, that operate on specific test samples. We also theoretically bound the finite-sample error of the verified fairness measure.
Comment: 24 pages, 7 figures, 5 theorems
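Justicia verifies such metrics with respect to the data distribution via SSAT; as a much simpler point of reference, the disparate-impact measure itself can be computed directly on concrete samples. This is a sketch under our own encoding (groups mapped to lists of 0/1 predictions), not Justicia's distribution-based method.

```python
def disparate_impact(outcomes):
    """Disparate impact on samples: ratio of the lowest to the highest
    positive-prediction rate across sensitive groups (1.0 = parity).

    outcomes: dict mapping each group to a list of 0/1 predictions.
    """
    rates = {g: sum(v) / len(v) for g, v in outcomes.items()}
    hi = max(rates.values())
    # If no group ever receives a positive prediction, treat as parity.
    return 1.0 if hi == 0 else min(rates.values()) / hi
```

A common rule of thumb flags a classifier when this ratio falls below 0.8; sample-based verifiers depend on the particular test set, which is the fragility the abstract contrasts with distribution-based verification.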
Phase Transitions of the Typical Algorithmic Complexity of the Random Satisfiability Problem Studied with Linear Programming
Here we study the NP-complete K-SAT problem. Although the worst-case
complexity of NP-complete problems is conjectured to be exponential, there
exist parametrized random ensembles of problems where solutions can typically
be found in polynomial time for suitable ranges of the parameter. In fact,
random K-SAT, with the ratio α of clauses to variables as control parameter, can be solved quickly
for small enough values of α. It shows a phase transition between a
satisfiable phase and an unsatisfiable phase. For branch and bound algorithms,
which operate in the space of feasible Boolean configurations, the empirically
hardest problems are located close to this phase transition. Here we study
K-SAT and the related optimization problem MAX-SAT by a linear
programming approach, which is widely used for practical problems and allows
for polynomial run time. In contrast to branch and bound it operates outside
the space of feasible configurations. On the other hand, finding a solution
within polynomial time is not guaranteed. We investigated several variants,
such as including artificial objective functions, so-called cutting-plane approaches,
and a mapping to the NP-complete vertex-cover problem. We observed several
easy-hard transitions, from regions where the problems are typically solvable (in
polynomial time) using the given algorithms to regions where they are
not solvable in polynomial time. For the related vertex-cover problem on random
graphs these easy-hard transitions can be identified with structural properties
of the graphs, like percolation transitions. For the present random K-SAT problem we have investigated numerous structural properties that also exhibit clear transitions, but they appear not to be correlated with the easy-hard transitions observed here. This renders the behaviour of random K-SAT more complex than that of, e.g., the vertex-cover problem.
Comment: 11 pages, 5 figures
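The random-ensemble setup the abstract describes can be made concrete with a small stdlib-only sketch: draw a random K-SAT formula at clause density α and decide satisfiability by brute force (viable only for tiny n; the names and the signed-literal encoding are ours, and this is not the paper's linear-programming method).

```python
import random
from itertools import product

def random_ksat(n, alpha, k=3, rng=random):
    """Random K-SAT instance: m = int(alpha * n) clauses, each over k
    distinct variables, each literal negated with probability 1/2.
    Literals are signed 1-based integers (e.g. -3 means "not x3")."""
    m = int(alpha * n)
    return [[v if rng.random() < 0.5 else -v
             for v in rng.sample(range(1, n + 1), k)]
            for _ in range(m)]

def satisfiable(clauses, n):
    """Brute-force SAT check over all 2^n assignments (small n only)."""
    for bits in product([False, True], repeat=n):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses):
            return True
    return False
```

Sweeping α and plotting the fraction of satisfiable instances reproduces the satisfiable/unsatisfiable transition mentioned above; the hardness transitions studied in the paper concern the algorithms, not this decision boundary.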
FourierSAT: A Fourier Expansion-Based Algebraic Framework for Solving Hybrid Boolean Constraints
The Boolean SATisfiability problem (SAT) is of central importance in computer
science. Although SAT is known to be NP-complete, progress on the engineering
side, especially that of Conflict-Driven Clause Learning (CDCL) and Local
Search SAT solvers, has been remarkable. Yet, while SAT solvers aimed at
solving industrial-scale benchmarks in Conjunctive Normal Form (CNF) have
become quite mature, SAT solvers that are effective on other types of
constraints, e.g., cardinality constraints and XORs, are less well studied; a
general approach to handling non-CNF constraints is still lacking. In addition,
previous work indicated that for specific classes of benchmarks, the running
time of extant SAT solvers depends heavily on properties of the formula and
details of encoding, instead of the scale of the benchmarks, which adds
uncertainty to expectations of running time.
To address the issues above, we design FourierSAT, an incomplete SAT solver
based on Fourier analysis of Boolean functions, a technique to represent
Boolean functions by multilinear polynomials. By such a reduction to continuous
optimization, we propose an algebraic framework for solving systems consisting
of different types of constraints. The idea is to leverage gradient information
to guide the search process in the direction of local improvements. Empirical
results demonstrate that FourierSAT is more robust than other solvers on
certain classes of benchmarks.
Comment: The paper was accepted by the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI 2020). V2 (Feb 24): Typos corrected
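The core idea, representing constraints as multilinear polynomials over [-1, 1]^n and following the gradient, can be sketched minimally. This is our own illustration of the general technique, not FourierSAT's implementation: we map True to +1 (a convention choice), assume clauses with distinct literals, and encode an OR-clause's "unsatisfied" indicator as a product, so the total penalty is 0 exactly at satisfying ±1 points.

```python
def clause_penalty(clause, x):
    """Multilinear penalty of an OR-clause at x in [-1,1]^n (True = +1).
    At a +/-1 point: 1 if the clause is falsified, 0 if satisfied.
    clause: signed 1-based literals, assumed distinct."""
    p = 1.0
    for lit in clause:
        s = 1.0 if lit > 0 else -1.0
        p *= (1.0 - s * x[abs(lit) - 1]) / 2.0
    return p

def grad_step(clauses, x, lr=0.1):
    """One gradient-descent step on the summed clause penalties, using the
    product rule clause by clause, then clipping back into [-1, 1]^n."""
    g = [0.0] * len(x)
    for c in clauses:
        for lit in c:
            i = abs(lit) - 1
            s = 1.0 if lit > 0 else -1.0
            partial = -s / 2.0  # derivative of this literal's factor
            for other in c:
                if other != lit:
                    t = 1.0 if other > 0 else -1.0
                    partial *= (1.0 - t * x[abs(other) - 1]) / 2.0
            g[i] += partial
    return [max(-1.0, min(1.0, xi - lr * gi)) for xi, gi in zip(x, g)]
```

Iterating `grad_step` drives the continuous point toward a penalty minimum, whose sign pattern is then read off as a Boolean assignment; handling cardinality constraints and XORs amounts to swapping in their own multilinear polynomials.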
Message Passing Algorithm for Solving QBF Using More Reasoning
We present a novel solver for the quantified Boolean formula (QBF) problem. To improve performance, we introduce several reasoning rules into the message-passing algorithm for solving QBF. When preprocessing the formulae, the solver incorporates equality reduction and hyper-binary resolution. Further, the solver employs the message-passing method to obtain more information when selecting branches. The solver handles branches using unit propagation, conflict-driven learning, and satisfiability-directed implication and learning. The experimental results show that the solver solves the QBF problem efficiently.
ADDMC: Weighted Model Counting with Algebraic Decision Diagrams
We present an algorithm to compute exact literal-weighted model counts of
Boolean formulas in Conjunctive Normal Form. Our algorithm employs dynamic
programming and uses Algebraic Decision Diagrams as the primary data structure.
We implement this technique in ADDMC, a new model counter. We empirically
evaluate various heuristics that can be used with ADDMC. We then compare ADDMC
to state-of-the-art exact weighted model counters (Cachet, c2d, d4, and
miniC2D) on 1914 standard model counting benchmarks and show that ADDMC
significantly improves the virtual best solver.
Comment: Presented at AAAI 2020
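The quantity being computed, the literal-weighted model count, is easy to state with a brute-force stand-in; ADDMC's contribution is replacing this enumeration with dynamic programming over Algebraic Decision Diagrams. The encoding below (signed 1-based literals, a weight per literal polarity) is our own sketch.

```python
from itertools import product

def weighted_model_count(clauses, weights):
    """Exact literal-weighted model count of a CNF by enumeration:
    sum, over satisfying assignments, of the product of the weights of
    the literals made true.

    clauses: list of lists of signed 1-based literals.
    weights: dict literal -> weight, for both polarities of each variable.
    """
    n = max(abs(l) for c in clauses for l in c)
    total = 0.0
    for bits in product([False, True], repeat=n):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses):
            w = 1.0
            for v in range(1, n + 1):
                w *= weights[v] if bits[v - 1] else weights[-v]
            total += w
    return total
```

With all weights equal to 1 this reduces to unweighted model counting; with weights summing to 1 per variable it computes the probability that a random assignment satisfies the formula.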
Preprocessing and Stochastic Local Search in Maximum Satisfiability
Problems which ask to compute an optimal solution to their instances are called optimization problems. The maximum satisfiability (MaxSAT) problem is a well-studied combinatorial optimization problem with many applications in domains such as cancer therapy design, electronic markets, hardware debugging and routing. Many problems, including the aforementioned ones, can be encoded in MaxSAT. MaxSAT thus serves as a general optimization paradigm, and advances in MaxSAT algorithms translate to advances in solving other problems.
In this thesis, we analyze the effects of MaxSAT preprocessing, the process of reformulating the input instance prior to solving, on the perceived costs of solutions during search. We show that after preprocessing most MaxSAT solvers may misinterpret the costs of non-optimal solutions. Many MaxSAT algorithms use the found non-optimal solutions in guiding the search for solutions and so the misinterpretation of costs may misguide the search.
Towards remedying this issue, we introduce and study the concept of locally minimal solutions. We show that for some of the central preprocessing techniques for MaxSAT, the perceived cost of a locally minimal solution to a preprocessed instance equals the cost of the corresponding reconstructed solution to the original instance.
We develop a stochastic local search algorithm for MaxSAT, called LMS-SLS, that is prepended with a preprocessor and that searches over locally minimal solutions. We implement LMS-SLS and analyze the performance of its different components, particularly the effects of preprocessing and computing locally minimal solutions, and also compare LMS-SLS with the state-of-the-art SLS solver SATLike for MaxSAT.
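The SLS skeleton that algorithms like LMS-SLS build on can be sketched in WalkSAT style: repeatedly pick an unsatisfied clause and flip one of its variables, at random (noise) or greedily. This is a generic sketch under our own naming, not LMS-SLS itself; it omits preprocessing and locally minimal solutions, which are the thesis's contributions.

```python
import random

def sls_maxsat(clauses, weights, n, iters=1000, p_noise=0.3, seed=0):
    """WalkSAT-style stochastic local search for weighted MaxSAT.
    Returns (best assignment found, its cost = total weight of
    falsified clauses). clauses: lists of signed 1-based literals."""
    rng = random.Random(seed)
    assign = [rng.random() < 0.5 for _ in range(n)]

    def cost(a):
        return sum(w for c, w in zip(clauses, weights)
                   if not any(a[abs(l) - 1] == (l > 0) for l in c))

    best, best_cost = assign[:], cost(assign)
    for _ in range(iters):
        unsat = [c for c in clauses
                 if not any(assign[abs(l) - 1] == (l > 0) for l in c)]
        if not unsat:
            break  # all clauses satisfied: cost 0 is optimal
        c = rng.choice(unsat)
        if rng.random() < p_noise:
            v = abs(rng.choice(c)) - 1  # noise: random variable of the clause
        else:
            # Greedy: flip the variable of the clause that yields lowest cost.
            def after_flip(lit):
                a = assign[:]
                a[abs(lit) - 1] = not a[abs(lit) - 1]
                return cost(a)
            v = abs(min(c, key=after_flip)) - 1
        assign[v] = not assign[v]
        if cost(assign) < best_cost:
            best, best_cost = assign[:], cost(assign)
    return best, best_cost
```

In the thesis's setting, the search would additionally be restricted to locally minimal solutions so that costs perceived on the preprocessed instance match costs on the original one.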
Contingent planning under uncertainty via stochastic satisfiability
We describe a new planning technique that efficiently solves probabilistic propositional contingent planning problems by converting them into instances of stochastic satisfiability (SSAT) and solving these problems instead. We make fundamental contributions in two areas: the solution of SSAT problems and the solution of stochastic planning problems. This is the first work extending the planning-as-satisfiability paradigm to stochastic domains. Our planner, ZANDER, can solve arbitrary, goal-oriented, finite-horizon partially observable Markov decision processes (POMDPs). An empirical study comparing ZANDER to seven other leading planners shows that its performance is competitive on a range of problems. © 2003 Elsevier Science B.V. All rights reserved.
Towards a quantitative Alloy
Integrated master's dissertation in Engenharia Informática (Informatics Engineering).
When one comes across a new problem that needs to be solved, by abstracting away its associated details
in a simple and concise way through the use of formal methods, one is able to better understand the matter
at hand. Alloy (Jackson, 2012), a declarative specification language based on relational logic, is an example
of an effective modelling tool, allowing high-level specification of potentially very complex systems. However,
along with the irrelevant information, measurable data of the system is often lost in the abstraction as well,
making it less suitable for certain situations.
The Alloy Analyzer represents the relations under analysis by Boolean matrices. By extending this type of
structure to:
• numeric matrices, over ℕ₀, one is able to work with multirelations, i.e. relations whose arcs are
weighted; each tuple is thus associated with a natural number, which allows reasoning in a similar
fashion as in optimization problems and integer programming techniques;
• left-stochastic matrices, one is able to model faulty behaviour and other forms of quantitative
information about software systems in a probabilistic way; in particular, this introduces the notion of
a probabilistic contract in software design.
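The matrix view in the bullets above can be sketched generically: relational composition is matrix multiplication over a semiring, and swapping the semiring moves from Boolean relations to multirelations to stochastic matrices. A minimal sketch under our own naming, not the Alloy Analyzer's implementation:

```python
def mat_mul(A, B, add, mul, zero):
    """Matrix product over a semiring (add, mul, zero); instantiating the
    semiring gives the different readings of relational composition."""
    n, k, m = len(A), len(B), len(B[0])
    C = [[zero] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            acc = zero
            for t in range(k):
                acc = add(acc, mul(A[i][t], B[t][j]))
            C[i][j] = acc
    return C

# Boolean matrices: ordinary relational composition (exists a middle point).
bool_compose = lambda A, B: mat_mul(A, B, lambda a, b: a or b,
                                    lambda a, b: a and b, False)

# Matrices over the natural numbers: multirelations; entries count
# (weighted) paths through the middle type.
nat_compose = lambda A, B: mat_mul(A, B, lambda a, b: a + b,
                                   lambda a, b: a * b, 0)

# Left-stochastic matrices use the same (+, *) product; composing two
# column-stochastic matrices again yields columns summing to 1, which is
# what makes the probabilistic reading compositional.
```

The point of the extension is precisely that only the semiring changes: the relational-logic layer on top stays the same while the matrices underneath carry counts or probabilities instead of truth values.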
Such an increase in Alloy’s capabilities strengthens its position in the area of formal methods for software
design, in particular towards becoming a quantitative formal method.
This dissertation explores the motivation and importance behind quantitative analysis by studying and
establishing theoretical foundations through categorial approaches to accomplish such reasoning in Alloy.
This starts by reviewing the required tools to support such groundwork and proceeds to the design and
implementation of such a quantitative Alloy extension.
This project aims to promote the evolution of quantitative formal methods by successfully achieving
quantitative abstractions in Alloy, extending its support to these concepts and implementing them in the
Alloy Analyzer.