Exploiting short supports for improved encoding of arbitrary constraints into SAT
Encoding to SAT and applying a highly efficient modern SAT solver is an increasingly popular method of solving finite-domain constraint problems. In this paper we study encodings of arbitrary constraints where unit propagation on the encoding provides strong reasoning. Specifically, unit propagation on the encoding simulates generalised arc consistency on the original constraint. To create compact and efficient encodings we use the concept of short support. Short support has been successfully applied to create efficient propagation algorithms for arbitrary constraints. A short support of a constraint is similar to a satisfying tuple; however, a short support is not required to assign every variable in scope: some variables are left free to take any value. In some cases a short support representation is smaller than the table of satisfying tuples by an exponential factor. We present two encodings based on short supports and evaluate them on a set of benchmark problems, demonstrating a substantial improvement over the state of the art.
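The exponential gap between a full table of satisfying tuples and a short-support representation can be illustrated with a toy example (my own illustration, not the paper's encoding): for the constraint "at least one of n Boolean variables is 1", the full table holds 2^n − 1 tuples, while n short supports suffice, each fixing a single variable and leaving the rest free.

```python
from itertools import product

def full_supports(n):
    """All satisfying tuples of 'at least one variable is 1' over
    n Boolean variables -- the table has 2**n - 1 entries."""
    return [t for t in product((0, 1), repeat=n) if any(t)]

def short_supports(n):
    """Short supports for the same constraint: each assigns one variable
    to 1 and leaves the others free (None marks an unassigned variable)."""
    return [tuple(1 if i == j else None for i in range(n)) for j in range(n)]

def covers(short, tup):
    """A short support covers a full tuple if they agree wherever the
    short support assigns a value."""
    return all(s is None or s == v for s, v in zip(short, tup))

n = 10
print(len(full_supports(n)))   # 1023 tuples in the full table
print(len(short_supports(n)))  # only 10 short supports
```

Every satisfying tuple is covered by at least one short support, so the compact representation loses no information for this constraint.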
SLIDE: A Useful Special Case of the CARDPATH Constraint
We study the CardPath constraint. This ensures that a given constraint holds a number of times down a sequence of variables. We show that SLIDE, a special case of CardPath where the slid constraint must hold always, can be used to encode a wide range of sliding sequence constraints, including CardPath itself. We consider how to propagate SLIDE and provide a complete propagator for CardPath. Since propagation is NP-hard in general, we identify special cases where propagation takes polynomial time. Our experiments demonstrate that using SLIDE to encode global constraints can be as efficient and effective as specialised propagators.
Comment: 18th European Conference on Artificial Intelligence
Computer Aided Verification of Lamport's Fast Mutual Exclusion Algorithm - Using Coloured Petri Nets and Occurrence Graphs with Symmetries
In this paper, we present a new computer tool for verification of distributed systems. As an example, we establish the correctness of Lamport's Fast Mutual Exclusion Algorithm. The tool implements the method of occurrence graphs with symmetries (OS-graphs) for Coloured Petri Nets (CP-nets). The basic idea in the approach is to exploit the symmetries inherent in many distributed systems to construct a condensed state space. We demonstrate a significant increase in the number of states which can be analysed. The paper is to a large extent self-contained and does not assume any prior knowledge of CP-nets (or any other kind of Petri Nets) or OS-graphs. CP-nets and OS-graphs are not our invention; our contribution is the development of the tool and the verification of the example.
Index Terms: Modelling and Analysis of Distributed Systems, Formal Verification, Coloured Petri Nets, High-Level Petri Nets, Occurrence Graphs, State Spaces, Symmetries, Mutual Exclusion
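The condensed-state-space idea behind OS-graphs can be illustrated independently of CP-nets (a toy sketch of symmetry reduction, not the tool itself): when processes are symmetric, a global state and any permutation of it are equivalent, so one canonical representative per equivalence class suffices.

```python
from itertools import product

# Toy mutual-exclusion model: two symmetric processes, each in local state
# 0 (idle), 1 (trying) or 2 (critical). Under process-permutation symmetry,
# a sorted tuple is a canonical representative of each equivalence class.
full_states = set(product(range(3), repeat=2))
condensed = {tuple(sorted(s)) for s in full_states}
print(len(full_states), len(condensed))  # 9 6

# A safety check such as mutual exclusion only needs the condensed space:
violations = [s for s in condensed if s == (2, 2)]  # both processes critical
```

With n symmetric processes the saving grows roughly with n!, which is why symmetry exploitation makes substantially larger systems analysable.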
Predicting the approximate functional behaviour of physical systems
This dissertation addresses the problem of the computer prediction of the approximate behaviour of physical systems describable by ordinary differential equations.
Previous approaches to behavioural prediction have focused either on an exact mathematical description or on a qualitative account. We advocate a middle ground: a representation coarser than an exact mathematical solution yet more specific than a qualitative one. What is required is a mathematical expression, simpler than the exact solution, whose qualitative features mirror those of the actual solution and whose functional form captures the principal parameter relationships underlying the behaviour of the real system. We term such a representation an approximate functional solution.
Approximate functional solutions are superior to qualitative descriptions because they reveal specific functional relationships, restore a quantitative time scale to a process and support more sophisticated comparative analysis queries. Moreover, they can be superior to exact mathematical solutions by emphasizing comprehensibility, adequacy and practical utility over precision.
Two strategies for constructing approximate functional solutions are proposed. The first abstracts the original equation, predicts behaviour in the abstraction space and maps this back to the approximate functional level. Specifically, analytic abduction exploits qualitative simulation to predict the qualitative properties of the solution and uses this knowledge to guide the selection of a parameterized trial function, which is then tuned with respect to the differential equation. In order to limit the complexity of a proposed approximate functional solution, and hence maintain its comprehensibility, back-of-the-envelope reasoning is used to simplify overly complex expressions in a magnitude extreme. If no function is recognised which matches the predicted behaviour, segment calculus is called upon to find a composite function built from known primitives and a set of operators. At the very least, segment calculus identifies a plausible structure for the form of the solution (e.g. that it is a composition of two unknown functions). Equation parsing capitalizes on this partial information to look for a set of termwise interactions which, when interpreted, expose a particular solution of the equation.
The second, and more direct, strategy for constructing an approximate functional solution is embodied in the closed form approximation technique. This extends approximation methods to equations which lack a closed form solution: the differential equation is solved exactly, as an infinite series, and an approximate functional solution is obtained by constructing a closed form function whose Taylor series is close to that of the exact solution.
The above techniques dovetail to achieve a style of reasoning closer to that of an engineer or physicist than to that of a mathematician: the goal of finding the correct solution of the differential equation is sacrificed in favour of finding an approximation which is adequate for the purpose to which the knowledge will be put. Applications to Intelligent Tutoring and Design Support Systems are suggested.
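The closed form approximation idea can be illustrated with a minimal sketch (hypothetical helper names, and a linear ODE chosen so the match happens to be exact): generate the Taylor coefficients of the series solution of y' = B·y, then fit a trial closed form A·exp(B·t) by matching the leading coefficients and checking how closely the remaining coefficients agree.

```python
from math import factorial

def series_solution(y0, rate, n_terms):
    """Taylor coefficients of the exact series solution of y' = rate*y,
    y(0) = y0, built term by term from the recurrence c[k+1] = rate*c[k]/(k+1)."""
    c = [y0]
    for k in range(n_terms - 1):
        c.append(rate * c[k] / (k + 1))
    return c

def fit_exponential(c):
    """Trial closed form A*exp(B*t): match the first two Taylor
    coefficients, then report the coefficients the fit implies."""
    A, B = c[0], c[1] / c[0]
    implied = [A * B**k / factorial(k) for k in range(len(c))]
    return A, B, implied

c = series_solution(3.0, -2.0, 6)
A, B, implied = fit_exponential(c)
print(A, B)  # 3.0 -2.0: the fitted closed form is 3*exp(-2t)
```

For equations lacking a closed form solution the implied coefficients would only approximate the series, and the discrepancy measures the adequacy of the approximate functional solution.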
A specialised constraint approach for stable matching problems
Constraint programming is a generalised framework designed to solve combinatorial problems. This framework is made up of a set of predefined independent components and generalised algorithms. This is a very versatile structure which allows for a variety of rich combinatorial problems to be represented and solved relatively easily.
Stable matching problems consist of a set of participants wishing to be matched into pairs or groups in a stable manner. A matching is said to be stable if there is no pair or group of participants that would rather make a private arrangement to improve their situation and thus undermine the matching. There are many important "real life" applications of stable matching problems across the world. These include the Hospitals/Residents problem, in which a set of graduating medical students, also known as residents, needs to be assigned to hospital posts. Some authorities assign children to schools as a stable matching problem, and many other such problems are tackled in the same way. A number of classical stable matching problems have efficient specialised algorithmic solutions.
Constraint programming solutions to stable matching problems have been investigated in the past. These solutions have been able to match the theoretically optimal time complexities of the algorithmic solutions. However, empirical evidence has shown that in reality these constraint solutions run significantly slower than the specialised algorithmic solutions. Furthermore, their memory requirements prohibit them from solving problems which the specialised algorithmic solutions can solve in a fraction of a second.
My contribution investigates the possibility of modelling stable matching problems as specialised constraints. The motivation behind this approach was to find solutions to these problems which maintain the versatility of the constraint solutions, whilst significantly reducing the performance gap between constraint and specialised algorithmic solutions.
To this end, specialised constraint solutions have been developed for the stable marriage problem and the Hospitals/Residents problem. Empirical evidence shows that these solutions can solve significantly larger problems than previously published constraint solutions. For these larger problem instances the specialised constraint solutions came within a factor of four of the time required by the algorithmic solutions. It has also been shown that, through further specialisation, these constraint solutions can be made to run significantly faster, although these improvements come at the cost of versatility. As a demonstration of the versatility of these solutions, it is shown that, by adding simple side constraints, richer problems can be easily modelled. These richer problems add additional criteria and/or an optimisation requirement to the original stable matching problems. Many of these problems have been proven to be NP-hard and some have no known algorithmic solutions. Included with these models are results from empirical studies which show that these are indeed feasible solutions to the richer problems. The results also provide some insight into the structure of these problems, some of which have had little or no previous study.
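For reference, the classical specialised algorithm against which constraint models of stable marriage are benchmarked is Gale-Shapley deferred acceptance; a minimal sketch (assuming complete preference lists on both sides) is:

```python
def gale_shapley(men_prefs, women_prefs):
    """Deferred acceptance: men propose in preference order; each woman
    holds the best proposal seen so far. Returns a stable matching
    as a dict {man: woman}. Assumes complete preference lists."""
    rank = {w: {m: i for i, m in enumerate(ps)} for w, ps in women_prefs.items()}
    free = list(men_prefs)          # men with no partner yet
    next_choice = {m: 0 for m in men_prefs}
    engaged = {}                    # woman -> man currently held
    while free:
        m = free.pop()
        w = men_prefs[m][next_choice[m]]  # m's best woman not yet tried
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:
            free.append(engaged[w])       # w trades up; her old partner is freed
            engaged[w] = m
        else:
            free.append(m)                # w rejects m; he tries his next choice
    return {m: w for w, m in engaged.items()}

men = {"a": ["x", "y"], "b": ["y", "x"]}
women = {"x": ["b", "a"], "y": ["a", "b"]}
print(gale_shapley(men, women))  # matches a with x and b with y
```

This runs in O(n^2) time, which is the benchmark that makes the factor-of-four gap of the specialised constraint solutions notable.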
An Integrated Methodology for Creating Composed Web/Grid Services
This thesis presents an approach to design, specify, validate, verify, implement, and evaluate composed web/grid services. Web and grid services can be composed to create new services with complex behaviours. The BPEL (Business Process Execution Language) standard was created to enable the orchestration of web services, but there has also been investigation of its use for grid services. BPEL specifies the implementation of service composition but has no formal semantics; implementations are in practice checked by testing. Formal methods are used in general to define an abstract model of system behaviour that allows simulation and reasoning about properties. The approach can detect and reduce potentially costly errors at design time.
CRESS (Communication Representation Employing Systematic Specification) is a domain-independent, graphical, abstract notation and integrated toolset for developing composite web services. The original version of CRESS had automated support for formal specification in LOTOS (Language Of Temporal Ordering Specification), formal validation with MUSTARD (Multiple-Use Scenario Testing and Refusal Description), and implementation in BPEL4WS, an early version of the BPEL standard. This thesis work has extended CRESS and its integrated tools to design, specify, validate, verify, implement, and evaluate composed web/grid services. The work has extended the CRESS notation to support a wider range of service compositions, and has applied it to grid services as a new domain. The thesis presents two new tools, CLOVE (CRESS Language-Oriented Verification Environment) and MINT (MUSTARD Interpreter), which respectively support formal verification and implementation testing. New work has also extended CRESS to automate the implementation of composed services using the more recent BPEL standard, WS-BPEL 2.0.
A review of literature on parallel constraint solving
As multicore computing is now standard, it seems irresponsible for constraints researchers to ignore its implications. Researchers need to address a number of issues to exploit parallelism, such as: investigating which constraint algorithms are amenable to parallelisation; whether to use shared memory or distributed computation; whether to use static or dynamic decomposition; and how best to exploit portfolios and cooperating search. We review the literature, and see that we can sometimes do quite well, some of the time, on some instances, but we are far from a general solution. Yet there seems to be little overall guidance that can be given on how best to exploit multicore computers to speed up constraint solving. We hope at least that this survey will provide useful pointers to future researchers wishing to correct this situation.
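One of the simplest parallelisation schemes the survey mentions, portfolio solving, can be sketched as racing the same problem under several search strategies and taking whichever finishes first (the solver below is a toy stand-in, not a real constraint search):

```python
from concurrent.futures import FIRST_COMPLETED, ThreadPoolExecutor, wait

def solve(order):
    """Stand-in for a constraint search that tries candidate values in a
    given order; here a 'solution' is any value v with v % 7 == 3."""
    for v in order:
        if v % 7 == 3:
            return v
    return None

# Portfolio parallelism: different value orderings attack the same problem
# concurrently, and the first strategy to succeed decides the answer.
strategies = [list(range(100)), list(range(99, -1, -1))]
with ThreadPoolExecutor() as pool:
    futures = [pool.submit(solve, s) for s in strategies]
    done, _ = wait(futures, return_when=FIRST_COMPLETED)
    winner = next(iter(done)).result()
print(winner)  # 3 or 94, depending on which ordering wins the race
```

Even this toy shows the survey's core tension: the portfolio is trivially parallel and often wins big on individual instances, yet which strategy wins is instance-dependent, so no general speed-up guarantee follows.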