A foundation for synthesising programming language semantics
Programming or scripting languages used in real-world systems are seldom designed
with a formal semantics in mind from the outset. Therefore, the first step for developing well-founded analysis tools for these systems is to reverse-engineer a formal
semantics. This can take months or years of effort.
Could we automate this process, at least partially? Though desirable, automatically reverse-engineering semantics rules from an implementation is very challenging,
as found by Krishnamurthi, Lerner and Elberty. They propose automatically learning
desugaring translation rules, mapping the language whose semantics we seek to a simplified, core version, whose semantics are much easier to write. The present thesis
contains an analysis of their challenge, as well as the first steps towards a solution.
Scaling methods with the size of the language is very difficult due to state space
explosion, so this thesis proposes an incremental approach to learning the translation
rules. I present a formalisation that both clarifies the informal description of the challenge by Krishnamurthi et al., and reformulates the problem, shifting the focus to the
conditions for incremental learning. The central definition of the new formalisation is
the desugaring extension problem, i.e. extending a set of established translation rules
by synthesising new ones.
In a synthesis algorithm, the choice of search space is important and non-trivial,
as it needs to strike a good balance between expressiveness and efficiency. The rest
of the thesis focuses on defining search spaces for translation rules via typing rules.
Comparing search spaces requires two prerequisites. The first is a set of
benchmarks: pairs of source and target languages equipped with intended translation
rules between them. The second is an enumerative synthesis algorithm for efficiently
enumerating typed programs. I show how algebraic enumeration techniques can be applied to enumerating well-typed translation rules, and discuss the properties expected
from a type system for ensuring that typed programs are efficiently enumerable.
The thesis presents and empirically evaluates two search spaces. A baseline search
space yields the first practical solution to the challenge. The second search space is
based on a natural heuristic for translation rules, requiring each variable to be
used exactly once. I present a linear type system designed to efficiently enumerate
translation rules in which this heuristic is enforced. Through informal analysis
and empirical comparison to the baseline, I then show that using linear types can speed
up the synthesis of translation rules by an order of magnitude.
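As a minimal illustration of what a desugaring translation rule does (a generic textbook example, not one of the thesis's synthesised rules), the classic rule mapping a surface-level `let` into a core calculus of only lambda and application can be sketched as follows:

```python
# Illustrative sketch: a desugaring translation eliminating surface "let".
# Terms are nested tuples:
#   ("var", x) | ("lam", x, body) | ("app", f, a) | ("let", x, e1, e2)

def desugar(term):
    """Recursively rewrite: let x = e1 in e2  ~>  (lambda x. e2) e1"""
    tag = term[0]
    if tag == "var":
        return term
    if tag == "lam":
        _, x, body = term
        return ("lam", x, desugar(body))
    if tag == "app":
        _, f, a = term
        return ("app", desugar(f), desugar(a))
    if tag == "let":
        _, x, e1, e2 = term
        return ("app", ("lam", x, desugar(e2)), desugar(e1))
    raise ValueError(f"unknown term: {term!r}")

surface = ("let", "x", ("var", "y"), ("app", ("var", "x"), ("var", "x")))
print(desugar(surface))
# ("app", ("lam", "x", ("app", ("var", "x"), ("var", "x"))), ("var", "y"))
```

A synthesiser in this setting would have to discover the right-hand side of the `let` case from example pairs of surface and core programs, which is where the search space trade-offs discussed above come in.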
Investigating the learning potential of the Second Quantum Revolution: development of an approach for secondary school students
In recent years we have witnessed important changes: the Second Quantum Revolution is in the spotlight in many countries, and it is creating a new generation of technologies.
To unlock the potential of the Second Quantum Revolution, several countries have launched strategic plans and research programmes that finance and set the pace of the research and development of these new technologies (such as the Quantum Flagship and the National Quantum Initiative Act).
The increasing pace of technological change is also challenging science education and institutional systems, requiring them to help prepare new generations of experts.
This work is placed within physics education research and contributes to this challenge by developing an approach and a course about the Second Quantum Revolution. The aims are to promote quantum literacy and, in particular, to bring out the cultural and educational value of the Second Quantum Revolution.
The dissertation is articulated in two parts. In the first, we unpack the Second Quantum Revolution from a cultural perspective and identify its main revolutionary aspects, which are elevated to design principles for a course aimed at secondary school students and at prospective and in-service teachers. We present the design process and the educational reconstruction of the activities, as well as the results of a pilot study conducted to investigate the impact of the approach on students' understanding and to gather feedback for refining and improving the instructional materials.
The second part explores the Second Quantum Revolution as a context for introducing basic concepts of quantum physics. We present the results of an implementation with secondary school students that investigates whether, and to what extent, external representations can promote students' understanding and acceptance of quantum physics as a personally reliable description of the world.
Special Delivery: Programming with Mailbox Types (Extended Version)
The asynchronous and unidirectional communication model supported by
mailboxes is a key reason for the success of actor languages like Erlang and
Elixir for implementing reliable and scalable distributed systems. While many
actors may send messages to a given actor, only that actor may (selectively)
receive from its mailbox. Although actors eliminate many of the issues stemming
from shared memory concurrency, they remain vulnerable to communication errors
such as protocol violations and deadlocks.
Mailbox types, a novel form of behavioural type first introduced for a
process calculus by de'Liguoro and Padovani in 2018, capture the contents
of a mailbox as a commutative regular expression. Due to
aliasing and nested evaluation contexts, moving from a process calculus to a
programming language is challenging.
This paper presents Pat, the first programming language design incorporating
mailbox types, and describes an algorithmic type system. We make essential use
of quasi-linear typing to tame some of the complexity introduced by aliasing.
Our algorithmic type system is necessarily co-contextual, achieved through a
novel use of backwards bidirectional typing, and we prove it sound and complete
with respect to our declarative type system. We implement a prototype type
checker, and use it to demonstrate the expressiveness of Pat on a factory
automation case study and a series of examples from the Savina actor benchmark
suite.

Comment: Extended version of paper accepted to ICFP'2
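To convey the central idea in miniature (a toy sketch, not Pat's actual type system), the commutativity of a mailbox type means it constrains the multiset of messages in a mailbox rather than their arrival order. A hypothetical checker for types of the shape "Reply* . Ack" (any number of Reply messages and exactly one Ack) might look like this:

```python
from collections import Counter

# Toy illustration: a mailbox type is modelled as {tag: (min, max)},
# with max=None meaning unbounded. Because the "regular expression" is
# commutative, only message counts matter, never message order.

def conforms(mailbox, mbox_type):
    counts = Counter(mailbox)
    if any(tag not in mbox_type for tag in counts):
        return False  # a message the type does not mention
    for tag, (lo, hi) in mbox_type.items():
        n = counts[tag]
        if n < lo or (hi is not None and n > hi):
            return False
    return True

# "Reply* . Ack": any number of Reply, exactly one Ack.
reply_star_ack = {"Reply": (0, None), "Ack": (1, 1)}

print(conforms(["Reply", "Ack", "Reply"], reply_star_ack))  # True, order irrelevant
print(conforms(["Reply", "Reply"], reply_star_ack))         # False, Ack missing
```

The real system additionally tracks how sends and receives transform these types as a program executes, which is where quasi-linearity and co-contextual checking become necessary.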
Exploiting Process Algebras and BPM Techniques for Guaranteeing Success of Distributed Activities
The communications and collaborations among activities, processes, or systems in general form the basis of the complex systems known as distributed systems. Given the increasing complexity of their structure, interactions, and functionalities, many research areas are interested in providing modelling techniques and verification capabilities to guarantee their correctness and the satisfaction of properties. In particular, the formal methods community provides robust verification techniques to prove system properties. However, most approaches rely on manually designed formal models, making the analysis process challenging because it requires an expert in the field. On the other hand, the BPM community provides a widely used graphical notation (i.e., BPMN) to design the internal behaviour and interactions of complex distributed systems, which can be enhanced with additional features (e.g., privacy technologies). Furthermore, BPM uses process mining techniques to automatically discover these models from event observations. However, verifying properties and expected behaviour, especially in collaborations, still needs a solid methodology.
This thesis aims at exploiting the features of the formal methods and BPM communities to provide approaches that enable formal verification over distributed systems. In this context, we propose two approaches. The modelling-based approach starts from BPMN models and produces process algebra specifications to enable formal verification of system properties, including privacy-related ones. The process mining-based approach starts from log observations to automatically generate process algebra specifications to enable verification capabilities.
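As a hypothetical sketch of the modelling-based direction (the thesis's actual translation is far richer), a heavily simplified BPMN fragment, covering only sequence flows and exclusive (XOR) gateways, can be compiled into a CCS-like process-algebra term:

```python
# Hypothetical sketch: compile a toy BPMN fragment into a CCS-like term
# that a model checker could then analyse. Sequence flows become action
# prefixing ("."); exclusive gateways become nondeterministic choice ("+").

def to_process_algebra(node):
    kind = node["kind"]
    if kind == "task":
        return node["name"]
    if kind == "seq":   # sequence flow: prefixing
        return ".".join(to_process_algebra(n) for n in node["steps"])
    if kind == "xor":   # exclusive gateway: choice
        return "(" + " + ".join(to_process_algebra(n) for n in node["branches"]) + ")"
    raise ValueError(f"unsupported BPMN element: {kind}")

order_process = {"kind": "seq", "steps": [
    {"kind": "task", "name": "receiveOrder"},
    {"kind": "xor", "branches": [
        {"kind": "task", "name": "approve"},
        {"kind": "task", "name": "reject"},
    ]},
    {"kind": "task", "name": "notify"},
]}

print(to_process_algebra(order_process))  # receiveOrder.(approve + reject).notify
```

Real BPMN additionally has parallel gateways, events, and message flows between pools, which is what makes a faithful, verification-ready translation a substantial contribution.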
Markets and markups: a new empirical framework and evidence on exporters from China
Firms that dominate global trade export to multiple countries and frequently change their foreign destinations. We develop a new empirical framework for analysing markup elasticities to the exchange rate in this environment. The framework embodies a new estimator of these elasticities that controls for endogenous market participation, and a new classification of products based on Chinese linguistics to proxy for firms’ power in local markets. Applying this framework to Chinese customs data, we document significant pricing-to-market for highly differentiated goods. Measured in the importer’s currency, the prices of highly differentiated goods are far more stable than those of less differentiated products.

We thank Cambridge INET, the Centre For Macroeconomics, and the Economic and Social Research Council (United Kingdom) Brexit Priority Grant R001553/1 for supporting our research. We thank the staff of the HMRC datalab, especially Yee-Wan Yau, for supporting us in our analysis of UK administrative data.
Advances and Applications of DSmT for Information Fusion. Collected Works, Volume 5
This fifth volume on Advances and Applications of DSmT for Information Fusion collects theoretical and applied contributions of researchers working in different fields of application and in mathematics, and is available in open access. The collected contributions have either been published or presented in international conferences, seminars, workshops and journals after the dissemination of the fourth volume in 2015, or they are new. The contributions in each part of this volume are ordered chronologically.
The first part of this book presents some theoretical advances on DSmT, dealing mainly with modified Proportional Conflict Redistribution (PCR) rules of combination with degree of intersection, coarsening techniques, interval calculus for PCR thanks to set inversion via interval analysis (SIVIA), rough set classifiers, canonical decomposition of dichotomous belief functions, fast PCR fusion, fast inter-criteria analysis with PCR, and improved PCR5 and PCR6 rules preserving the (quasi-)neutrality of (quasi-)vacuous belief assignments in the fusion of sources of evidence, together with their Matlab codes.
Because more applications of DSmT have emerged in the years since the appearance of the fourth volume in 2015, the second part of this volume covers selected applications of DSmT, mainly in building change detection, object recognition, quality of data association in tracking, perception in robotics, risk assessment for torrent protection and multi-criteria decision-making, multi-modal image fusion, coarsening techniques, recommender systems, levee characterization and assessment, human heading perception, trust assessment, robotics, biometrics, failure detection, GPS systems, inter-criteria analysis, group decision-making, human activity recognition, storm prediction, data association for autonomous vehicles, identification of maritime vessels, fusion of support vector machines (SVM), the Silx-Furtif RUST code library for information fusion including PCR rules, and networks for ship classification.
Finally, the third part presents interesting contributions related to belief functions in general, published or presented over the years since 2015. These contributions concern decision-making under uncertainty, belief approximations, probability transformations, new distances between belief functions, non-classical multi-criteria decision-making problems with belief functions, generalization of Bayes' theorem, image processing, data association, entropy and cross-entropy measures, fuzzy evidence numbers, negators of belief mass, human activity recognition, information fusion for breast cancer therapy, imbalanced data classification, and hybrid techniques mixing deep learning with belief functions.
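For readers unfamiliar with PCR, the following is a minimal sketch of the PCR5 idea restricted to a frame of mutually exclusive singletons (hypothetical toy code, not from the book, which treats far more general cases): the conflicting mass m1(X)m2(Y) for X different from Y is redistributed back to X and Y in proportion to the masses that generated it.

```python
# Minimal PCR5 sketch over singletons only. Each basic belief assignment
# (BBA) is a dict mapping a singleton hypothesis to its mass.

def pcr5(m1, m2):
    frame = set(m1) | set(m2)
    # Conjunctive (agreeing) part first.
    out = {x: m1.get(x, 0.0) * m2.get(x, 0.0) for x in frame}
    # Redistribute each partial conflict m1(x)*m2(y), x != y, back to
    # x and y proportionally to m1(x) and m2(y).
    for x in frame:
        for y in frame:
            if x == y:
                continue
            conflict = m1.get(x, 0.0) * m2.get(y, 0.0)
            if conflict == 0.0:
                continue
            denom = m1[x] + m2[y]
            out[x] += m1[x] * conflict / denom
            out[y] += m2[y] * conflict / denom
    return out

m1 = {"A": 0.6, "B": 0.4}
m2 = {"A": 0.7, "B": 0.3}
fused = pcr5(m1, m2)
print(fused)  # masses still sum to 1, and A dominates
```

Unlike Dempster's rule, no normalisation step is needed: the conflicting mass is preserved and reassigned locally, which is what makes PCR5 well behaved under high conflict.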
Numerical methods for computing the discrete and continuous Laplace transforms
We propose a numerical method to spline-interpolate discrete signals and then
apply the integral transforms to the corresponding analytical spline functions.
This represents a robust and computationally efficient technique for estimating
the Laplace transform of noisy data. We revisit a Meijer G-function symbolic approach
to computing the Laplace transform, together with alternative approaches to extending
canonically observed time series. A discrete quantization scheme provides the foundation
for rapid and reliable estimation of the inverse Laplace transform. We derive
theoretical estimates for the inverse Laplace transform of analytic functions and
demonstrate empirical results validating the algorithmic performance using
observed and simulated data. We also introduce a generalization of the Laplace
transform in higher-dimensional space-time. We test the discrete Laplace
transform (LT) algorithm on data sampled from analytic functions with known
exact Laplace transforms. The validation of the discrete inverse Laplace
transform (ILT) uses complex functions with known analytic ILTs.
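A minimal sketch of the spline-then-integrate idea, assuming NumPy and SciPy and not reproducing the paper's exact algorithm: interpolate the samples with a cubic spline, then numerically integrate exp(-st) times the spline over the sampled window.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.integrate import quad

# Sketch: estimate L{f}(s) from discrete samples (t_k, f_k) by splining
# the data and integrating exp(-s t) * spline(t) over [t_0, t_N].
# The truncation of the infinite upper limit is a deliberate simplification.

def discrete_laplace(t, f, s):
    spline = CubicSpline(t, f)
    val, _err = quad(lambda x: np.exp(-s * x) * spline(x), t[0], t[-1], limit=200)
    return val

t = np.linspace(0.0, 20.0, 400)
f = np.exp(-t)                          # known case: L{e^{-t}}(s) = 1/(s + 1)
approx = discrete_laplace(t, f, s=2.0)
print(approx)                           # close to 1/3; truncation error ~ e^{-60}
```

Splining before integrating is what gives robustness to noise: the quadrature sees a smooth interpolant rather than raw, jittery samples.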
Jointly learning consistent causal abstractions over multiple interventional distributions
An abstraction can be used to relate two structural causal models (SCMs) representing the same system at different levels of resolution. Learning abstractions which guarantee consistency with respect to interventional distributions would allow one to jointly reason about evidence across multiple levels of granularity while respecting the underlying cause-effect relationships. In this paper, we introduce a first framework for causal abstraction learning between SCMs, based on the formalization of abstraction recently proposed by Rischel (2020). Building on it, we propose a differentiable programming solution that jointly solves a number of combinatorial sub-problems, and we study its performance and benefits against independent and sequential approaches in synthetic settings and on a challenging real-world problem related to electric vehicle battery manufacturing.
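A toy illustration of the consistency being learned (hypothetical deterministic models, not the paper's framework): an abstraction map tau is interventionally consistent when intervening on the low-level SCM and then abstracting the outcome agrees with running the high-level SCM under the corresponding intervention.

```python
# Toy deterministic SCMs at two resolutions, and an abstraction tau
# between their outcomes. Consistency: tau(low_scm(do)) == high_scm(do).

def low_scm(do_a):
    # Low level: A := do_a, B := 2A, C := B + 1
    a = do_a
    b = 2 * a
    c = b + 1
    return (a, b, c)

def high_scm(do_x):
    # High level: X := do_x, Y := 2X + 1 (the mediator B is marginalised away)
    x = do_x
    y = 2 * x + 1
    return (x, y)

def tau(low_outcome):
    # Abstraction: keep A and C, drop the intermediate B.
    a, _b, c = low_outcome
    return (a, c)

consistent = all(tau(low_scm(v)) == high_scm(v) for v in range(-5, 6))
print(consistent)  # True: tau commutes with these interventions
```

The learning problem in the paper is the hard direction: recovering a tau (and variable alignment) that makes such diagrams commute across many interventional distributions at once, which is why it decomposes into coupled combinatorial sub-problems.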
Motion-Bias-Free Feature-Based SLAM
For SLAM to be safely deployed in unstructured real world environments, it
must possess several key properties that are not encompassed by conventional
benchmarks. In this paper we show that SLAM commutativity, that is, consistency
in trajectory estimates on forward and reverse traverses of the same route, is
a significant issue for the state of the art. Current pipelines show a
significant bias between forward and reverse directions of travel, and are
moreover inconsistent as to which direction of travel exhibits better
performance. We propose several contributions to feature-based
SLAM pipelines that remedy the motion bias problem. In a comprehensive
evaluation across four datasets, we show that our contributions implemented in
ORB-SLAM2 substantially reduce the bias between forward and backward motion and
additionally improve the aggregated trajectory error. Removing the SLAM motion
bias has significant relevance for the wide range of robotics and computer
vision applications where performance consistency is important.

Comment: BMVC 202
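As a hypothetical sketch of how such a commutativity measure could be computed (not the paper's exact evaluation protocol), one can time-reverse the trajectory estimated on the reverse traverse and report its RMSE against the forward estimate; frame alignment between the two runs is deliberately omitted here.

```python
import numpy as np

# Sketch of a forward/reverse commutativity metric: the reverse-run
# estimate, replayed backwards in time, should retrace the forward run.

def motion_bias_rmse(forward_xyz, reverse_xyz):
    aligned_reverse = reverse_xyz[::-1]          # reverse run, replayed forward
    diff = forward_xyz - aligned_reverse
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

# Synthetic straight-line route with small, direction-dependent drifts.
gt = np.stack([np.linspace(0, 10, 101), np.zeros(101), np.zeros(101)], axis=1)
fwd = gt + 0.01
rev = gt[::-1] - 0.02

print(motion_bias_rmse(fwd, rev))  # about 0.052, i.e. 0.03 * sqrt(3)
```

A commutative pipeline would drive this number towards the noise floor regardless of which direction the route is traversed in.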