A Pragmatic, Scalable Approach to Correct-by-Construction Process Composition Using Classical Linear Logic Inference
The need for rigorous process composition is encountered in many situations
pertaining to the development and analysis of complex systems. We discuss the
use of Classical Linear Logic (CLL) for correct-by-construction resource-based
process composition, with guaranteed deadlock freedom, systematic resource
accounting, and concurrent execution. We introduce algorithms to automate the
necessary inference steps for binary compositions of processes in parallel,
conditionally, and in sequence. We combine decision procedures and heuristics
to achieve intuitive and practically useful compositions in an applied setting.
Comment: Post-proceedings paper presented at the 28th International Symposium on Logic-Based Program Synthesis and Transformation (LOPSTR 2018), Frankfurt am Main, Germany, 4-6 September 2018 (arXiv:1808.03326). arXiv admin note: substantial text overlap with arXiv:1803.0261
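The abstract's central idea, resource-based sequential composition with systematic accounting, can be sketched concretely. The following is a hedged illustration, not the paper's algorithm: processes are modelled as (name, inputs, outputs) triples over multisets of resource types, and sequential composition matches each resource produced by the first process against a consumer in the second, mirroring the linear-logic discipline that every resource is used exactly once. All process and resource names are invented.

```python
from collections import Counter

def seq_compose(p, q):
    """Sequentially compose two processes, each a (name, inputs, outputs)
    triple over multisets of resource types. Resources produced by p and
    consumed by q are matched exactly; unmatched inputs of q bubble up to
    the composite's interface, so nothing is silently created or lost."""
    p_name, p_in, p_out = p
    q_name, q_in, q_out = q
    produced, consumed = Counter(p_out), Counter(q_in)
    matched = produced & consumed          # resources handed from p to q
    leftover = produced - matched          # p outputs not used by q
    missing = consumed - matched           # q inputs p does not supply
    inputs = Counter(p_in) + missing       # composite must be given these
    outputs = leftover + Counter(q_out)
    name = f"{p_name};{q_name}"
    return (name, sorted(inputs.elements()), sorted(outputs.elements()))

# Invented example processes:
weld = ("weld", ["part_a", "part_b"], ["frame"])
paint = ("paint", ["frame", "paint_can"], ["painted_frame"])
print(seq_compose(weld, paint))
# ('weld;paint', ['paint_can', 'part_a', 'part_b'], ['painted_frame'])
```

The `frame` resource is consumed internally by the composition, while `paint_can`, which `weld` does not supply, is lifted to the composite's inputs; this is the bookkeeping that a CLL proof performs for free.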
Object-Level Reasoning with Logics Encoded in HOL Light
We present a generic framework that facilitates object-level reasoning with logics that are encoded within the Higher Order Logic theorem proving environment of HOL Light. This involves proving statements in any logic using intuitive forward and backward chaining in a sequent calculus style. It is made possible by automated machinery that takes care of the necessary structural reasoning and term matching. Our framework can also handle type-theoretic correspondences of proofs, effectively allowing the type checking and construction of computational processes via proof. We demonstrate our implementation using a simple propositional logic and its Curry-Howard correspondence to the lambda calculus, and argue its use with linear logic and its various correspondences to session types.
Comment: In Proceedings LFMTP 2020, arXiv:2101.0283
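The Curry-Howard reading mentioned in the abstract can be glimpsed in a few lines. This is a minimal, hypothetical sketch (not the HOL Light framework itself): implications are function types, and a natural-deduction proof of a propositional formula is exactly a well-typed lambda term, so inferring a term's type recovers the proposition it proves.

```python
def check(term, ctx):
    """Infer the proposition proved by a lambda term.
    Terms: ('var', name) | ('lam', name, prop, body) | ('app', f, a).
    Propositions: strings for atoms, ('->', a, b) for implication."""
    kind = term[0]
    if kind == 'var':
        return ctx[term[1]]
    if kind == 'lam':
        _, name, prop, body = term
        return ('->', prop, check(body, {**ctx, name: prop}))
    if kind == 'app':
        _, f, a = term
        fty, aty = check(f, ctx), check(a, ctx)
        assert fty[0] == '->' and fty[1] == aty, "ill-typed application"
        return fty[2]

# \x:A. \y:B. x  is a proof of  A -> (B -> A)
k = ('lam', 'x', 'A', ('lam', 'y', 'B', ('var', 'x')))
print(check(k, {}))   # ('->', 'A', ('->', 'B', 'A'))
```

Running the checker backward, from a goal proposition to a term, is what "construction of computational processes via proof" amounts to in this reading.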
Applying Formal Methods to Networking: Theory, Techniques and Applications
Despite its great importance, modern network infrastructure is remarkable for the lack of rigor in its engineering. The Internet, which began as a research experiment, was never designed to handle the users and applications it hosts today. The lack of formalization of the Internet architecture meant limited abstractions and modularity, especially for the control and management planes, thus requiring a new protocol to be built from scratch for every new need. This led to an unwieldy, ossified Internet architecture resistant to any attempts at formal verification, and an Internet culture where expediency and pragmatism are favored over formal correctness. Fortunately, recent work in the space of clean-slate Internet design---especially the software-defined networking (SDN) paradigm---offers the Internet community another chance to develop the right kind of architecture and abstractions. This has also led to a great resurgence of interest in applying formal methods to the specification, verification, and synthesis of networking protocols and applications. In this paper, we present a self-contained tutorial of the formidable amount of work that has been done in formal methods, and present a survey of its applications to networking.
Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorial
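One staple of the network verification the abstract surveys is checking forwarding state for loops and black holes. The sketch below is a hedged, concrete illustration (node names and the table layout are invented; real tools such as model checkers do this symbolically over all packets at once): it exhaustively follows per-destination next hops and flags any forwarding anomaly.

```python
def trace(fwd, src, dst):
    """Follow per-node next hops toward dst, starting at src.
    fwd maps (node, destination) -> next hop.
    Returns the path, or raises on a forwarding loop or black hole."""
    path, node = [src], src
    while node != dst:
        nxt = fwd.get((node, dst))
        if nxt is None:
            raise ValueError(f"black hole at {node} for {dst}")
        if nxt in path:
            raise ValueError(f"forwarding loop: {path + [nxt]}")
        path.append(nxt)
        node = nxt
    return path

# A small invented forwarding table for destination "d":
fwd = {("a", "d"): "b", ("b", "d"): "c", ("c", "d"): "d"}
print(trace(fwd, "a", "d"))   # ['a', 'b', 'c', 'd']
```

In the SDN setting, a controller has the whole table available, which is precisely what makes such global checks tractable compared with the distributed legacy control plane.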
A Real-world Case Study of Process and Data Driven Predictive Analytics for Manufacturing Workflows
We present a novel application of business process modelling and simulation of manufacturing workflows. Using formal methods, we produce correct-by-construction executable models that can be simulated in an interleaved way. The simulation draws advanced analytics from live IoT monitoring as well as an ERP system to provide predictive business intelligence. We describe our process and resource modelling efforts in the context of a collaborative project with two manufacturing partners. We evaluate our results based on the improvement of the scheduling accuracy for real production flows.
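The kind of predictive scheduling the abstract describes rests on simulating jobs that contend for shared machines. The following is a hedged sketch with invented job and machine names, not the project's model: each job is a sequence of (machine, duration) steps, and a step starts only when both the job's previous step and the machine are free, yielding predicted completion times.

```python
def simulate(jobs, machines):
    """Simulate jobs through shared machines.
    jobs: list of (job_id, [(machine, duration), ...]) in step order.
    Returns predicted finish time for each job."""
    free_at = {m: 0.0 for m in machines}   # when each machine is next free
    finish = {}
    for job_id, steps in jobs:
        t = 0.0                            # when the job's last step ended
        for machine, dur in steps:
            start = max(t, free_at[machine])
            t = start + dur
            free_at[machine] = t           # machine busy until step ends
        finish[job_id] = t
    return finish

# Invented production orders, each cut then welded:
jobs = [("order1", [("cut", 2.0), ("weld", 3.0)]),
        ("order2", [("cut", 1.0), ("weld", 2.0)])]
print(simulate(jobs, ["cut", "weld"]))
# {'order1': 5.0, 'order2': 7.0}
```

Feeding live IoT readings in as updated durations, rather than static estimates, is what turns such a simulation into the predictive analytics described above.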
The Measurement Calculus
Measurement-based quantum computation has emerged from the physics community
as a new approach to quantum computation where the notion of measurement is the
main driving force of computation. This is in contrast with the more
traditional circuit model which is based on unitary operations. Among
measurement-based quantum computation methods, the recently introduced one-way
quantum computer stands out as fundamental.
We develop a rigorous mathematical model underlying the one-way quantum
computer and present a concrete syntax and operational semantics for programs,
which we call patterns, and an algebra of these patterns derived from a
denotational semantics. More importantly, we present a calculus for reasoning
locally and compositionally about these patterns.
We present a rewrite theory and prove a general standardization theorem which
allows all patterns to be put in a semantically equivalent standard form.
Standardization has far-reaching consequences: a new physical architecture
based on performing all the entanglement in the beginning, parallelization by
exposing the dependency structure of measurements and expressiveness theorems.
Furthermore we formalize several other measurement-based models:
Teleportation, Phase and Pauli models and present compositional embeddings of
them into and from the one-way model. This allows us to transfer all the theory
we develop for the one-way model to these models. This shows that the framework
we have developed has a general impact on measurement-based computation and is
not just particular to the one-way quantum computer.
Comment: 46 pages, 2 figures. Replacement of quant-ph/0412135v1; the new version also includes formalization of several other measurement-based models: Teleportation, Phase and Pauli models, and presents compositional embeddings of them into and from the one-way model. To appear in Journal of AC
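The standard form produced by the standardization theorem has a simple shape that can be illustrated directly. The sketch below is a hedged illustration only: it represents a pattern as commands in execution order and checks whether all entanglement commands E precede all measurements M, which precede all Pauli corrections X/Z. The paper's actual contribution is the rewrite theory proving every pattern can be brought into this form; this code merely recognizes it, and the example pattern is an invented teleportation-style chain.

```python
def is_standard(pattern):
    """Check the EMC standard form: entanglement first, then
    measurements, then corrections, reading in execution order.
    Commands: ('E', q1, q2) | ('M', qubit, angle) |
              ('X', qubit, signal) | ('Z', qubit, signal)."""
    phase = {'E': 0, 'M': 1, 'X': 2, 'Z': 2}
    phases = [phase[cmd[0]] for cmd in pattern]
    return phases == sorted(phases)

teleport = [('E', 1, 2), ('E', 2, 3),          # entangle chain 1-2-3
            ('M', 1, 0.0), ('M', 2, 0.0),      # measure qubits 1 and 2
            ('X', 3, 's2'), ('Z', 3, 's1')]    # corrections on qubit 3
print(is_standard(teleport))        # True
print(is_standard(teleport[::-1]))  # False
```

Performing all entanglement up front, as the standard form demands, is exactly what enables the new physical architecture and parallelization results the abstract mentions.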
Proceedings ICPW'07: 2nd International Conference on the Pragmatic Web, 22-23 Oct. 2007, Tilburg: NL
CBR and MBR techniques: review for an application in the emergencies domain
The purpose of this document is to provide an in-depth analysis of current reasoning engine practice and of the integration strategies of Case-Based Reasoning and Model-Based Reasoning that will be used in the design and development of the RIMSAT system.
RIMSAT (Remote Intelligent Management Support and Training) is a European Commission funded project designed to:
a. Provide an innovative, 'intelligent', knowledge-based solution aimed at improving the quality of critical decisions
b. Enhance the competencies and responsiveness of individuals and organisations involved in highly complex, safety-critical incidents - irrespective of their location.
In other words, RIMSAT aims to design and implement a decision support system that applies Case-Based Reasoning as well as Model-Based Reasoning technology to the management of emergency situations.
This document is part of a deliverable for the RIMSAT project, and although it has been written in close contact with the requirements of the project, it provides an overview wide enough to give a state of the art in integration strategies between CBR and MBR technologies.
Postprint (published version)
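The retrieve step at the heart of Case-Based Reasoning can be sketched in a few lines. This is a hypothetical minimal illustration, not the RIMSAT design: a new incident is matched against stored cases by weighted feature distance, and the nearest case's recorded response is reused. Feature names and cases are invented.

```python
def retrieve(cases, query, weights):
    """Return the stored case minimizing weighted distance to the query."""
    def dist(case):
        return sum(w * abs(case[f] - query[f]) for f, w in weights.items())
    return min(cases, key=dist)

# Invented incident case base:
cases = [
    {"severity": 2, "casualties": 0, "response": "local team"},
    {"severity": 8, "casualties": 5, "response": "regional evacuation"},
]
query = {"severity": 7, "casualties": 4}
weights = {"severity": 1.0, "casualties": 2.0}
print(retrieve(cases, query, weights)["response"])  # regional evacuation
```

A Model-Based Reasoning component would complement this by deriving a response from first-principles domain models when no sufficiently similar case exists, which is precisely the integration question the document surveys.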
The Road to General Intelligence
Humans have always dreamed of automating laborious physical and intellectual tasks, but the latter has proved more elusive than naively suspected. Seven decades of systematic study of Artificial Intelligence have witnessed cycles of hubris and despair. The successful realization of General Intelligence (evidenced by the kind of cross-domain flexibility enjoyed by humans) will spawn an industry worth billions and transform the range of viable automation tasks. The recent notable successes of Machine Learning have led to the conjecture that it might be the appropriate technology for delivering General Intelligence. In this book, we argue that the framework of machine learning is fundamentally at odds with any reasonable notion of intelligence and that essential insights from previous decades of AI research are being forgotten. We claim that a fundamental change in perspective is required, mirroring that which took place in the philosophy of science in the mid 20th century. We propose a framework for General Intelligence, together with a reference architecture that emphasizes the need for anytime bounded rationality and a situated denotational semantics. We give necessary emphasis to compositional reasoning, with the required compositionality being provided via principled symbolic-numeric inference mechanisms based on universal constructions from category theory.
• Details the pragmatic requirements for real-world General Intelligence.
• Describes how machine learning fails to meet these requirements.
• Provides a philosophical basis for the proposed approach.
• Provides mathematical detail for a reference architecture.
• Describes a research program intended to address issues of concern in contemporary AI.
The book includes an extensive bibliography, with ~400 entries covering the history of AI and many related areas of computer science and mathematics. The target audience is the entire gamut of Artificial Intelligence/Machine Learning researchers and industrial practitioners. There are a mixture of descriptive and rigorous sections, according to the nature of the topic. Undergraduate mathematics is in general sufficient. Familiarity with category theory is advantageous for a complete understanding of the more advanced sections, but these may be skipped by the reader who desires an overall picture of the essential concepts. This is an open access book.
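The universal constructions the book appeals to have small, concrete instances. As a hedged glimpse (plain Python standing in for category theory, with all names invented): the categorical product says that given maps f: C → A and g: C → B, there is a unique mediating map ⟨f, g⟩: C → A × B that the projections recover, which is the sense in which pairing composes two computations without losing either.

```python
def pair(f, g):
    """The mediating map <f, g> of the categorical product:
    runs both computations and pairs their results."""
    return lambda c: (f(c), g(c))

fst = lambda p: p[0]   # projection A x B -> A
snd = lambda p: p[1]   # projection A x B -> B

# Invented example maps out of the integers:
f = lambda n: n + 1
g = lambda n: n * 2
h = pair(f, g)

# The universal property: fst . <f, g> == f  and  snd . <f, g> == g
print(fst(h(3)), f(3))   # 4 4
print(snd(h(3)), g(3))   # 6 6
```

Uniqueness of the mediating map is what licenses equational reasoning about such compositions, the "principled" half of the symbolic-numeric inference the book proposes.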