Applying Formal Methods to Networking: Theory, Techniques and Applications
Despite its great importance, modern network infrastructure is remarkable for the lack of rigor in its engineering. The Internet, which began as a research experiment, was never designed to handle the users and applications it hosts today. The lack of formalization of the Internet architecture meant limited abstractions and modularity, especially for the control and management planes, so that every new need required a new protocol built from scratch. This led to an unwieldy, ossified Internet architecture resistant to any attempts at formal verification, and an Internet culture where expediency and pragmatism are favored over formal correctness. Fortunately, recent work in the space of clean-slate Internet design---especially the software-defined networking (SDN) paradigm---offers the Internet community another chance to develop the right kind of architecture and abstractions. This has also led to a great resurgence of interest in applying formal methods to the specification, verification, and synthesis of networking protocols and applications. In this paper, we present a self-contained tutorial of the formidable amount of work that has been done in formal methods, and present a survey of its applications to networking.
Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorials
IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation
This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods, held in Paphos (Cyprus) in October-November 2004. ISoLA 2004 met the need for a forum where developers, users, and researchers could discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, testing, and maintenance of systems from the point of view of their different application domains.
Supporting adaptiveness of cyber-physical processes through action-based formalisms
Cyber-Physical Processes (CPPs) refer to a new generation of business processes enacted in many application environments (e.g., emergency management, smart manufacturing), in which the presence of Internet-of-Things devices and embedded ICT systems (e.g., smartphones, sensors, actuators) strongly influences the coordination of the real-world entities (e.g., humans, robots) inhabiting those environments. A Process Management System (PMS) employed for executing CPPs is required to automatically adapt its running processes to anomalous situations and exogenous events while minimising human intervention. In this paper, we tackle this issue by introducing an approach and an adaptive cognitive PMS, called SmartPM, which combines process execution monitoring, unanticipated exception detection, and automated resolution strategies, leveraging three well-established action-based formalisms developed for reasoning about actions in Artificial Intelligence (AI): the situation calculus, IndiGolog, and automated planning. Interestingly, the use of SmartPM does not require any expertise in the internal workings of the AI tools involved in the system.
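The monitor/adapt loop the abstract describes can be sketched schematically (this is my own abstraction, not SmartPM's implementation, and all names below are hypothetical): after each action, compare the state predicted by the action theory with the state actually sensed, and interleave synthesised recovery actions whenever they diverge.

```python
# Schematic monitor/repair loop (illustrative sketch, assuming recovery
# actions themselves execute as modelled).

def run_process(plan, transition, sense, recover, state):
    """Execute plan steps; when sensing disagrees with the predicted
    state, run recovery actions first. Returns (executed trace, state)."""
    executed = []
    for step in plan:
        predicted = transition(state, step)   # model-predicted effect
        executed.append(step)
        state = sense(state, step)            # what actually happened
        if state != predicted:                # unanticipated deviation
            for fix in recover(state, predicted):
                executed.append(fix)
                state = transition(state, fix)
    return executed, state

# Hypothetical robot moving along a line; an exogenous slip loses one
# unit of progress on the second step, and recovery re-issues moves.
def transition(state, step):
    return state + 1 if step == "move" else state

def sense(state, step):
    nxt = transition(state, step)
    return nxt - 1 if state == 1 else nxt     # slip happens once

def recover(actual, predicted):
    return ["move"] * (predicted - actual)

executed, final = run_process(["move"] * 3, transition, sense, recover, 0)
print(executed)   # four moves executed: one extra to repair the slip
print(final)      # goal position 3 reached despite the deviation
```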
The Emergence of Gravitational Wave Science: 100 Years of Development of Mathematical Theory, Detectors, Numerical Algorithms, and Data Analysis Tools
On September 14, 2015, the newly upgraded Laser Interferometer
Gravitational-wave Observatory (LIGO) recorded a loud gravitational-wave (GW)
signal, emitted a billion light-years away by a coalescing binary of two
stellar-mass black holes. The detection was announced in February 2016, in time
for the hundredth anniversary of Einstein's prediction of GWs within the theory
of general relativity (GR). The signal represents the first direct detection of
GWs, the first observation of a black-hole binary, and the first test of GR in
its strong-field, high-velocity, nonlinear regime. In the remainder of its
first observing run, LIGO observed two more signals from black-hole binaries,
one moderately loud, another at the boundary of statistical significance. The
detections mark the end of a decades-long quest, and the beginning of GW
astronomy: finally, we are able to probe the unseen, electromagnetically dark
Universe by listening to it. In this article, we present a short historical
overview of GW science: this young discipline combines GR, arguably the
crowning achievement of classical physics, with record-setting, ultra-low-noise
laser interferometry, and with some of the most powerful developments in the
theory of differential geometry, partial differential equations,
high-performance computation, numerical analysis, signal processing,
statistical inference, and data science. Our emphasis is on the synergy between
these disciplines, and how mathematics, broadly understood, has historically
played, and continues to play, a crucial role in the development of GW science.
We focus on black holes, which are very pure mathematical solutions of
Einstein's gravitational-field equations that are nevertheless realized in
Nature, and that provided the first observed signals.
Comment: 41 pages, 5 figures. To appear in Bulletin of the American Mathematical Society
A Panorama on Multiscale Geometric Representations, Intertwining Spatial, Directional and Frequency Selectivity
The richness of natural images makes the quest for optimal representations in image processing and computer vision challenging. This observation has not prevented the design of image representations that trade off between efficiency and complexity, while achieving accurate rendering of smooth regions as well as reproducing faithful contours and textures. The most recent ones, proposed in the past decade, share a hybrid heritage highlighting the multiscale and oriented nature of edges and patterns in images. This paper presents a panorama of the aforementioned literature on decompositions in multiscale, multi-orientation bases or dictionaries. These decompositions typically exhibit redundancy to improve sparsity in the transformed domain, and sometimes invariance with respect to simple geometric deformations (translation, rotation). Oriented multiscale dictionaries extend traditional wavelet processing and may offer rotation invariance. Highly redundant dictionaries require specific algorithms to simplify the search for an efficient (sparse) representation. We also discuss the extension of multiscale geometric decompositions to non-Euclidean domains such as the sphere or arbitrary meshed surfaces. The etymology of panorama suggests an overview, based on a choice of partially overlapping "pictures". We hope that this paper will contribute to the appreciation and apprehension of a stream of current research directions in image understanding.
Comment: 65 pages, 33 figures, 303 references
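One canonical algorithm of the kind the abstract alludes to for searching a redundant dictionary is matching pursuit: greedily pick the atom most correlated with the residual and subtract its projection. A minimal sketch (my own illustration, not from the survey), using a hypothetical dictionary of four unit vectors in R^2 (twice as many atoms as dimensions, hence redundant):

```python
# Matching pursuit over a small redundant dictionary (illustrative only).
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(signal, atoms, n_iter=10):
    """Greedy sparse approximation: repeatedly select the atom most
    correlated with the residual. Returns (coefficients, residual)."""
    residual = list(signal)
    coeffs = [0.0] * len(atoms)
    for _ in range(n_iter):
        corr = [dot(residual, a) for a in atoms]
        best = max(range(len(atoms)), key=lambda i: abs(corr[i]))
        c = corr[best]
        coeffs[best] += c
        residual = [r - c * a for r, a in zip(residual, atoms[best])]
    return coeffs, residual

s2 = 1 / math.sqrt(2)
# Unit atoms at 0, 45, 90, and 135 degrees: redundant in R^2.
atoms = [(1.0, 0.0), (s2, s2), (0.0, 1.0), (-s2, s2)]
signal = (3.0, 3.0)          # lies exactly along the 45-degree atom

coeffs, residual = matching_pursuit(signal, atoms)
```

Because the dictionary contains an atom aligned with the signal, a single coefficient suffices here, whereas the non-redundant axis-aligned basis would need two: redundancy buys sparsity, at the cost of a nontrivial search.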
A Survey of Top-Level Ontologies - to inform the ontological choices for a Foundation Data Model
The Centre for Digital Built Britain has been tasked through the Digital Framework Task Group to develop an Information Management Framework (IMF) to support the development of a National Digital Twin (NDT), as set out in "The Pathway to an Information Management Framework" (Hetherington, 2020). A key component of the IMF is a Foundation Data Model (FDM), built upon a top-level ontology (TLO), as a basis for ensuring consistent data across the NDT. This document captures the results collected from a broad survey of top-level ontologies, conducted by the IMF technical team. It focuses on the core ontological choices made in their foundations and the pragmatic engineering consequences these have on how the ontologies can be applied and further scaled. This document will provide the basis for discussions on a suitable TLO for the FDM. It is also expected that these top-level ontologies will provide a resource whose components can be harvested and adapted for inclusion in the FDM.
Improving Model Finding for Integrated Quantitative-qualitative Spatial Reasoning With First-order Logic Ontologies
Many spatial standards have been developed to harmonize the semantics and specifications of GIS data and to support sophisticated reasoning. All of these standards include some types of simple and complex geometric features, and some of them incorporate simple mereotopological relations. But the relations as used in these standards only allow the extraction of qualitative information from geometric data and lack formal semantics that link geometric representations with mereotopological or other qualitative relations. This impedes integrated reasoning over qualitative data obtained from geometric sources and "native" topological information – for example, as provided by textual sources where precise locations or spatial extents are unknown or unknowable. To address this issue, the first contribution of this dissertation is a first-order logical ontology that treats geometric features (e.g., polylines, polygons) and relations between them as specializations of more general types of features (e.g., any kind of 2D or 1D features) and mereotopological relations between them. Key to this endeavor is the use of a multidimensional theory of space wherein, unlike in traditional logical theories of mereotopology (such as RCC), spatial entities of different dimensions can co-exist and be related. However, terminating or tractable reasoning with such an expressive ontology and potentially large amounts of data is a challenging AI problem. Model finding tools used to verify FOL ontologies with data usually employ a SAT solver to determine the satisfiability of the propositional instantiations (SAT problems) of the ontology. These solvers often experience scalability issues with an increasing number of objects and with the size and complexity of the ontology, limiting their use to ontologies with small signatures and to building small models with fewer than 20 objects.
To investigate how an ontology influences the size of its SAT translation and consequently the model finder's performance, we develop a formalization of FOL ontologies with data. We theoretically identify parameters of an ontology that significantly contribute to the dramatic growth in the size of the SAT problem. The search space of the SAT problem is exponential in the signature of the ontology (the number of predicates in the axiomatization plus any additional predicates from skolemization) and in the number of distinct objects in the model. Axiomatizations that contain many definitions lead to a large number of SAT propositional clauses; this stems from the conversion of biconditionals to clausal form. We therefore postulate that optional definitions are ideal sentences to eliminate from an ontology to boost the model finder's performance. We then formalize optional definition elimination (ODE) as an FOL ontology preprocessing step and test the simplification on a set of spatial benchmark problems, generating smaller SAT problems (with fewer clauses and variables) without changing the satisfiability or semantic meaning of the problem. We experimentally demonstrate that the reduction in SAT problem size also leads to improved model finding with state-of-the-art model finders, with speedups of 10-99%. Altogether, this dissertation improves spatial reasoning capabilities using FOL ontologies, in terms of both a formal framework for integrated qualitative-geometric reasoning and specific ontology preprocessing steps that can be built into automated reasoners to achieve better speedups in model-finding times and scalability with moderately sized datasets.
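The clause growth from definitions can be made concrete with a back-of-the-envelope count (my own illustration, not the dissertation's tool chain): a definitional biconditional P(x̄) <-> L1(x̄) & ... & Lk(x̄), converted to clausal form, contributes k short clauses (¬P ∨ Li) plus one long clause (P ∨ ¬L1 ∨ ... ∨ ¬Lk), and grounding multiplies this by the number of object tuples. The arities and counts below are hypothetical.

```python
# Counting ground clauses contributed by one biconditional definition
# (illustrative arithmetic, not an actual SAT encoder).

def clauses_per_instance(k: int) -> int:
    """Clauses from one ground instance of P <-> L1 & ... & Lk:
    k clauses (~P | Li) plus one clause (P | ~L1 | ... | ~Lk)."""
    return k + 1

def ground_clauses(num_objects: int, arity: int, k: int) -> int:
    """Total clauses after grounding: the biconditional is instantiated
    once per tuple of objects, i.e. num_objects ** arity times."""
    return (num_objects ** arity) * clauses_per_instance(k)

# A hypothetical binary relation defined by a 3-literal conjunction,
# over a 20-object domain (the paper's model-size regime):
total = ground_clauses(num_objects=20, arity=2, k=3)
print(total)   # 400 ground instances * 4 clauses each = 1600 clauses
```

Eliminating such a definition as optional (inlining the defined predicate away) removes all of these clauses, and shrinks the signature by one predicate, which is what makes ODE attractive as a preprocessing step.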