On synthesizing Skolem functions for first order logic formulae
Skolem functions play a central role in logic, from eliminating quantifiers
in first order logic formulas to providing functional implementations of
relational specifications. While classical results in logic are only interested
in their existence, the question of how to effectively compute them is also
interesting, important and useful for several applications. In the restricted
case of Boolean propositional logic formulas, this problem of synthesizing
Boolean Skolem functions has been addressed in depth, with recent work
focusing on both theoretical and practical aspects of the problem. However,
there are few existing results for the general case, and the focus has been on
heuristic algorithms.
In this article, we undertake an investigation into the computational
hardness of the problem of synthesizing Skolem functions for first order logic
formulas. We show that even under reasonable assumptions on the signature of the
formula, it is impossible to compute or synthesize Skolem functions. Then we
determine conditions on theories of first order logic which would render the
problem computable. Finally, we show that several natural theories satisfy
these conditions and hence do admit effective synthesis of Skolem functions.
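As a concrete illustration of the object being synthesized (this example is not from the article): for the formula ∀x ∃y. y > x over the integers, f(x) = x + 1 is a Skolem function for y, since it supplies a witness for the existential at every input. A minimal Python sketch, with invented helper names, checks a candidate Skolem function against a relational specification on a finite sample of inputs:

```python
# Illustrative sketch (not from the article): a Skolem function for y in
# "forall x, exists y. R(x, y)" is a function f such that R(x, f(x)) holds
# for every x. Here we can only test this on a finite sample of the domain.

def is_skolem_function(relation, candidate, sample):
    """Check that `candidate` witnesses the existential on every x in `sample`."""
    return all(relation(x, candidate(x)) for x in sample)

# Example: R(x, y) := y > x over the integers.
R = lambda x, y: y > x
f = lambda x: x + 1   # a Skolem function: x + 1 > x always holds
g = lambda x: x       # not a Skolem function: x > x never holds

print(is_skolem_function(R, f, range(-100, 100)))  # True
print(is_skolem_function(R, g, range(-100, 100)))  # False
```

The article's point is about when such an f can be computed from the formula itself; this sketch only shows what it means for a given f to be correct.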
LTLf Synthesis with Fairness and Stability Assumptions
In synthesis, assumptions are constraints on the environment that rule out
certain environment behaviors. A key observation here is that even if we
consider systems with LTLf goals on finite traces, environment assumptions need
to be expressed over infinite traces, since accomplishing the agent goals may
require an unbounded number of environment actions. To solve synthesis with
respect to finite-trace LTLf goals under infinite-trace assumptions, we could
reduce the problem to LTL synthesis. Unfortunately, while synthesis in LTLf and
in LTL have the same worst-case complexity (both 2EXPTIME-complete), the
algorithms available for LTL synthesis are much harder to apply effectively in
practice than
those for LTLf synthesis. In this work we show that in interesting cases we can
avoid such a detour to LTL synthesis and keep the simplicity of LTLf synthesis.
Specifically, we develop a BDD-based fixpoint technique for handling
basic forms of fairness and stability assumptions. We show, empirically,
that this technique performs much better than standard LTL synthesis.
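The core of such fixpoint-based synthesis is an "attractor" computation: iterating to a least fixpoint the set of states from which the agent can force reaching its goal. The sketch below, with invented names, runs this iteration over an explicit game graph; the technique described above would represent these state sets symbolically as BDDs rather than as Python sets:

```python
# Hypothetical sketch of the least-fixpoint attractor computation underlying
# LTLf synthesis, over an explicit two-player game graph. All names are
# illustrative; a BDD-based implementation would encode the sets symbolically.

def attractor(states, goal, agent_moves, env_moves):
    """States from which the agent can force reaching `goal`.

    agent_moves[s] / env_moves[s]: successor sets of agent-controlled and
    environment-controlled states, respectively.
    """
    win = set(goal)
    changed = True
    while changed:                 # iterate until the set stops growing
        changed = False
        for s in states:
            if s in win:
                continue
            if s in agent_moves:   # agent chooses: one winning successor suffices
                ok = any(t in win for t in agent_moves[s])
            else:                  # environment chooses: every successor must win
                ok = all(t in win for t in env_moves[s])
            if ok:
                win.add(s)
                changed = True
    return win

# Tiny example: agent at 0 may go to 1 or 2; the environment at 1 must go to
# the goal state 3; state 2 is an agent-controlled self-loop (a dead end).
states = {0, 1, 2, 3}
agent_moves = {0: {1, 2}, 2: {2}}
env_moves = {1: {3}, 3: set()}
print(attractor(states, {3}, agent_moves, env_moves))  # {0, 1, 3}
```

Fairness and stability assumptions change which fixpoint is computed (nesting greatest and least fixpoints), but the basic iteration has this shape.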
Sequential Relational Decomposition
The concept of decomposition in computer science and engineering is
considered a fundamental component of computational thinking and is prevalent
in design of algorithms, software construction, hardware design, and more. We
propose a simple and natural formalization of sequential decomposition, in
which a task is decomposed into two sequential sub-tasks, with the first
sub-task to be executed before the second sub-task is executed. These tasks are
specified by means of input/output relations. We define and study the
decomposition problem: deciding whether a given specification can be
sequentially decomposed. Our main result is that decomposition itself is a
difficult
computational problem. More specifically, we study decomposition problems in
three settings: where the input task is specified explicitly, by means of
Boolean circuits, and by means of automatic relations. We show that in the
first setting decomposition is NP-complete, in the second setting it is
NEXPTIME-complete, and in the third setting there is evidence to suggest that
it is undecidable. Our results indicate that the intuitive idea of
decomposition as a system-design approach requires further investigation. In
particular, we show that adding a human to the loop by asking for a
decomposition hint lowers the complexity of decomposition problems
considerably.
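To make the objects concrete (a simplification of the paper's formal definition, not taken from it): with tasks given as explicit sets of input/output pairs, as in the first setting above, one can at least check whether a proposed pair of sub-tasks realizes a task via relational composition. The hard problem studied in the paper is deciding whether any such pair exists; verifying a given pair is easy:

```python
# Illustrative sketch (a simplification of the paper's definition): check
# whether two sub-task relations compose to a given input/output relation.
# Relations are explicit sets of pairs, matching the paper's first setting.

def compose(r1, r2):
    """Sequential composition: run r1, feed its output into r2."""
    return {(x, z) for (x, y) in r1 for (y2, z) in r2 if y == y2}

def is_sequential_decomposition(r, r1, r2):
    """Does executing r1 and then r2 realize exactly the task relation r?"""
    return compose(r1, r2) == r

# Task: {0 -> 0, 1 -> 2}, decomposed through an intermediate alphabet {a, b}.
r  = {(0, 0), (1, 2)}
r1 = {(0, 'a'), (1, 'b')}    # first sub-task: encode the input
r2 = {('a', 0), ('b', 2)}    # second sub-task: decode to the output
print(is_sequential_decomposition(r, r1, r2))  # True
```

The NP-completeness result above concerns searching for r1 and r2 (and the intermediate alphabet) given only r; a decomposition hint, as in the human-in-the-loop variant, reduces the search to a check like this one.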