Ten virtues of structured graphs
This paper extends the invited talk by the first author about the virtues
of structured graphs. The motivation behind the talk and this paper stems from our
experience in the development of ADR, a formal approach for the design of style-conformant,
reconfigurable software systems. ADR is based on hierarchical graphs
with interfaces and was conceived in an attempt to reconcile software architectures
and process calculi by means of graphical methods. We have tried to
write an ADR-agnostic paper in which we point out some drawbacks of flat, unstructured
graphs for the design and analysis of software systems, and we argue that hierarchical,
structured graphs can alleviate such drawbacks.
Local area π-calculus
All computers on the Internet are connected, but not all connections are
equal. Hosts are grouped into islands of local communication. It is the agreed
conventions and shared knowledge that connect these islands, just as much as the
switches and wires that run between them.
The power and limitations of these conventions and shared knowledge, and
hence their effectiveness, can be investigated by an appropriate calculus. In this
thesis I describe a development of the π-calculus that is particularly well suited to
expressing such systems. The process calculus, which I call the local area π-calculus
or laπ, extends the π-calculus so that a channel name can have within its scope
several disjoint local areas. Such a channel name may be used for communication
within an area or it may be sent between areas, but it cannot itself be used to
transmit information from one area to another. Areas are arranged in a hierarchy
of levels which distinguish, for example, between a single application, a machine,
or a whole network. I present a semantics for this calculus that relies on several
side-conditions which are essentially runtime level checks. I show that a suitable
type system can provide enough static information to make most of these checks
unnecessary.
I examine the descriptive power of the laπ-calculus by comparing it to the
π-calculus. I find that, perhaps surprisingly, local area communication can be
encoded into the π-calculus with conditional matching. The encoding works by
replacing communication inside an area with communication on a new channel
created just for that area. This is analogous to replacing direct communication
between two points with a system that broadcasts packets over a background
ether. I show a form of operational correspondence between the behaviour of a
process in laπ and its π-calculus translation.
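The encoding described above can be illustrated with a small sketch (not from the thesis; all names are hypothetical): communication "inside an area" on a shared public name is replaced by communication on a fresh channel created just for that area, so the same name denotes disjoint channels in disjoint areas.

```python
# Hypothetical illustration of the laπ-to-π encoding idea: one fresh
# channel per (area, public name) pair. A name sent between areas
# carries no ability to communicate across the area boundary, because
# each area resolves it to its own private channel.
import itertools

_fresh = itertools.count()

class Area:
    """A local area: each public name maps to a private per-area channel."""
    def __init__(self, name):
        self.name = name
        self._channels = {}  # public name -> fresh per-area channel id

    def channel_for(self, public_name):
        # The encoding: create a fresh channel the first time a public
        # name is used inside this area, then reuse it.
        if public_name not in self._channels:
            self._channels[public_name] = f"{public_name}@{next(_fresh)}"
        return self._channels[public_name]

lan_a = Area("lan-a")
lan_b = Area("lan-b")

# The same public name "http" denotes distinct channels in distinct
# areas, so a message sent on "http" in lan_a cannot reach lan_b.
assert lan_a.channel_for("http") != lan_b.channel_for("http")
assert lan_a.channel_for("http") == lan_a.channel_for("http")
```

This mirrors the thesis's analogy of replacing direct point-to-point communication with per-area broadcast channels, though the real encoding is stated in the π-calculus with matching, not in a programming language.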
One of my aims in developing this calculus is to provide a convenient and
expressive framework with which to examine convention-laden, distributed systems.
I offer evidence that the calculus has achieved this by way of an extended case
study. I present a model of Internet communication based on Sockets and TCP
over IP and then extend this system with Network Address Translation. I then
give a model of the File Transfer Protocol that uses TCP/IP to communicate
between networks.
Traces of the model show that FTP, run in its normal mode, will fail when
the client is using Network Address Translation, whereas an alternative mode of
FTP will succeed. Moreover, a normal run of the model over NAT fails in the
same way as the real-life system would, demonstrating that the model can pick
up this failure and correctly highlight the reasons behind it.
Session Types in Concurrent Calculi: Higher-Order Processes and Objects
This dissertation investigates different formalisms, in the form of programming language calculi,
that are aimed at providing a theoretical foundation for structured concurrent programming based
on session types. The structure of a session type is essentially a process-algebraic style description
of the behaviour of a single program identifier serving as a communication medium (and usually
referred to as a channel): the types incorporate typed inputs, outputs, and choices which can be
composed to form larger protocol descriptions. The effectiveness of session typing can be attributed
to the linear treatment of channels and session types, and to the use of tractable methods
such as syntactic duality to decide if the types of two connected channels are compatible. Linearity
is ensured by accumulating the uses of a channel into a composite type that also describes
the order of those actions. Duality provides a tractable and intuitive method for deciding when
two connected channels can interact and exchange values in a statically determined type-safe way.
We present our contributions to the theory of sessions, distilled into two families of programming
calculi, the first based on higher-order processes and the second based on objects. Our work unifies,
improves and extends, in manifold ways, the session primitives and typing systems for the
Lambda-calculus, the Pi-calculus, the Object-calculus, and their combinations in multi-paradigm
languages. Of particular interest are: the treatment of infinite interactions expressed with recursive
sessions; the capacity to encapsulate channels in higher-order structures which can be exchanged
and kept suspended, i.e., the use of code as data; the integration of protocol structure directly
into the description of objects, providing a powerful and uniformly extensible set of implementation
abstractions; finally, the introduction of asynchronous subtyping, which enables controlled
reordering of actions on either side of a session. Our work on higher-order processes and on object
calculi for session-based concurrent programming provides a theoretical foundation for programming
language design integrating functional, process, and object-oriented features.
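The syntactic duality the abstract credits for the tractability of session typing can be sketched concretely (a hypothetical illustration, not the dissertation's formalism): a session type is a sequence of input/output actions, its dual swaps inputs and outputs, and two connected channels are compatible exactly when their types are dual.

```python
# Hypothetical sketch of syntactic duality for session types.
# A session type is modelled as a tuple of Send/Recv actions; real
# session types also include choices and recursion, omitted here.
from dataclasses import dataclass

@dataclass(frozen=True)
class Send:
    payload: str  # type of the value sent, e.g. "int"

@dataclass(frozen=True)
class Recv:
    payload: str  # type of the value received

def dual(session):
    """Swap every Send for a Recv (and vice versa), keeping payloads."""
    return tuple(
        Recv(a.payload) if isinstance(a, Send) else Send(a.payload)
        for a in session
    )

def compatible(s, t):
    """Two endpoints may be connected iff their types are syntactic duals."""
    return dual(s) == t

client = (Send("int"), Recv("bool"))   # send an int, then receive a bool
server = (Recv("int"), Send("bool"))   # the matching dual behaviour
assert compatible(client, server)
assert not compatible(client, client)
```

The asynchronous subtyping mentioned in the abstract relaxes this strict equality by permitting controlled reordering of actions, which this sketch does not attempt to capture.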
Integrated Framework for Data Quality and Security Evaluation on Mobile Devices
Data quality (DQ) is an important concept that is used in the design and employment of information, data management, decision making, and engineering systems, with multiple applications already available for solving specific problems. Unfortunately, conventional approaches to DQ evaluation commonly do not pay enough attention to, or even ignore, the security and privacy of the evaluated data. In this research, we develop a framework for the DQ evaluation of sensor-originated data acquired from smartphones that incorporates security and privacy aspects into the DQ evaluation pipeline. The framework provides support for selecting the DQ metrics and implementing their calculus by integrating diverse sensor data quality and security metrics. The framework employs a knowledge graph to facilitate its adaptation in new application development and enables knowledge accumulation. Privacy-aspect evaluation is demonstrated by the detection of novel and sophisticated attacks on data privacy, using the example of colluded-application attack recognition. We develop multiple calculi for DQ and security evaluation, such as a hierarchical fuzzy rules expert system, neural networks, and an algebraic function. Case studies that demonstrate the framework's performance in solving real-life tasks are presented, and the achieved results are analyzed. These case studies confirm the framework's capability of performing comprehensive DQ evaluations. The framework development resulted in multiple products and tools, such as datasets and Android OS applications. The datasets include a knowledge base of sensors embedded in modern mobile devices and their quality analysis, technological signal recordings of smartphones during normal usage, and attacks on users' privacy. These datasets are made available for public use and can be used for future research in the field of data quality and security.
We also released, under an open-source license, a set of Android OS tools that can be used for data quality and security evaluation.
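Of the calculi listed in the abstract, the simplest to illustrate is an algebraic aggregation function. The following sketch is hypothetical (the function, metric names, and weighting scheme are assumptions, not the framework's actual calculus): it combines per-metric quality and security scores into one DQ score.

```python
# Hypothetical algebraic calculus for DQ evaluation: a weighted
# geometric mean over metric scores in [0, 1]. A zero in any metric
# zeroes the whole score, so a failed security check dominates.
def dq_score(metrics, weights):
    """metrics, weights: dicts keyed by metric name; scores in [0, 1]."""
    assert metrics.keys() == weights.keys()
    total = sum(weights.values())
    score = 1.0
    for name, value in metrics.items():
        assert 0.0 <= value <= 1.0, f"metric {name} out of range"
        score *= value ** (weights[name] / total)
    return score

# Example: privacy weighted more heavily than raw sensor accuracy.
m = {"accuracy": 0.9, "timeliness": 0.8, "privacy": 1.0}
w = {"accuracy": 2.0, "timeliness": 1.0, "privacy": 3.0}
assert 0.0 <= dq_score(m, w) <= 1.0
```

A geometric rather than arithmetic mean is one plausible choice here precisely because it lets a single compromised dimension (e.g. privacy after a colluded-application attack) collapse the overall score.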
On Modelling and Analysis of Dynamic Reconfiguration of Dependable Real-Time Systems
This paper motivates the need for a formalism for the modelling and analysis
of dynamic reconfiguration of dependable real-time systems. We present
requirements that the formalism must meet, and use these to evaluate well
established formalisms and two process algebras that we have been developing,
namely, Webpi and CCSdp. A simple case study is developed to illustrate the
modelling power of these two formalisms. The paper shows how Webpi and CCSdp
represent a significant step forward in modelling adaptive and dependable
real-time systems.
Comment: Presented and published at DEPEND 201
Extending and Relating Semantic Models of Compensating CSP
Business transactions involve multiple partners coordinating and interacting with each other. These transactions have hierarchies of activities which need to be orchestrated. Usual database approaches (e.g., checkpoint, rollback) are not applicable to handling faults in a long-running transaction due to interaction with multiple partners. The compensation mechanism handles faults that can arise in a long-running transaction. Based on the framework of Hoare's CSP process algebra, Butler et al. introduced Compensating CSP (cCSP), a language to model long-running transactions. The language introduces a method to declare a transaction as a process and has constructs for the orchestration of compensation. Butler et al. also define a trace semantics for cCSP. In this thesis, the semantic models of Compensating CSP are extended by defining an operational semantics, describing how the state of a program changes during its execution. The semantics is encoded into Prolog to animate the specification. The semantic models are further extended to define the synchronisation of processes. The notion of partial behaviour is defined to model the behaviour of deadlock that arises during process synchronisation. A correspondence relationship is then defined between the semantic models and proved by using structural induction. Proving the correspondence means that any of the presentations can be accepted as a primary definition of the meaning of the language, and each definition can be used correctly at different times and for different purposes. The semantic models and their relationships are mechanised by using the theorem prover PVS. The semantic models are embedded in PVS by using shallow embedding. The relationships between the semantic models are proved by mutual structural induction. The mechanisation overcomes the problems in hand proofs and improves the scalability of the approach.
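The compensation mechanism at the heart of cCSP can be sketched operationally (a hypothetical illustration, not cCSP's actual syntax or semantics): each forward action is paired with a compensation, and when a fault interrupts the transaction, the compensations of the steps completed so far run in reverse order.

```python
# Hypothetical sketch of a cCSP-style compensation pair. Forward
# actions accumulate their compensations; on a fault, the accumulated
# compensations execute in reverse, undoing the completed work.
class Fault(Exception):
    pass

def run_transaction(pairs, log):
    """pairs: list of (forward, compensation) callables."""
    done = []
    try:
        for forward, compensation in pairs:
            forward()
            done.append(compensation)
    except Fault:
        # Compensate completed steps in reverse order.
        for compensation in reversed(done):
            compensation()
        log.append("compensated")

def fail():
    raise Fault()

log = []
steps = [
    (lambda: log.append("book-flight"), lambda: log.append("cancel-flight")),
    (lambda: log.append("book-hotel"),  lambda: log.append("cancel-hotel")),
    (fail, lambda: None),  # third step faults, triggering compensation
]
run_transaction(steps, log)
# log: book-flight, book-hotel, cancel-hotel, cancel-flight, compensated
```

This captures only the sequential compensation order; the thesis's contributions on synchronisation, partial behaviour, and the PVS mechanisation go well beyond this sketch.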
Agoric computation: trust and cyber-physical systems
In the past two decades advances in miniaturisation and economies of scale have led to the emergence of billions of connected components that have provided both a spur and a blueprint for the development of smart products acting in specialised environments which are uniquely identifiable, localisable, and capable of autonomy. Adopting the computational perspective of multi-agent systems (MAS) as a technological abstraction married with the engineering perspective of cyber-physical systems (CPS) has provided fertile ground for designing, developing and deploying software applications in smart automated contexts such as manufacturing, power grids, avionics, healthcare and logistics, capable of being decentralised, intelligent, reconfigurable, modular, flexible, robust, adaptive and responsive. Current agent technologies are, however, ill suited for information-based environments, making it difficult to formalise and implement multi-agent systems based on inherently dynamical functional concepts such as trust and reliability, which present special challenges when scaling from small to large systems of agents. To overcome such challenges, it is useful to adopt a unified approach which we term agoric computation, integrating logical, mathematical and programming concepts towards the development of agent-based solutions based on recursive, compositional principles, where smaller systems feed via directed information flows into larger hierarchical systems that define their global environment. Considering information as an integral part of the environment naturally defines a web of operations where components of a system are wired in some way and each set of inputs and outputs is allowed to carry some value.
These operations are stateless abstractions and procedures that act on some stateful cells that accumulate partial information, and it is possible to compose such abstractions into higher-level ones, using a publish-and-subscribe interaction model that keeps track of update messages between abstractions and values in the data. In this thesis we review the logical and mathematical basis of such abstractions and take steps towards the software implementation of agoric modelling as a framework for simulation and verification of the reliability of increasingly complex systems, and report on experimental results related to a few select applications, such as stigmergic interaction in mobile robotics, integrating raw data into agent perceptions, trust and trustworthiness in orchestrated open systems, computing the epistemic cost of trust when reasoning in networks of agents seeded with contradictory information, and trust models for distributed ledgers in the Internet of Things (IoT); and provide a roadmap for future developments of our research.
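The cell-and-operation model described above can be sketched minimally (a hypothetical illustration; the class and method names are assumptions, not the thesis's framework): stateful cells accumulate partial information, and stateless operations subscribed to a cell are notified of each update, allowing cells to be wired into larger compositions.

```python
# Hypothetical sketch of stateful cells with publish-and-subscribe
# propagation. Each cell accumulates facts monotonically; subscribed
# operations receive only the genuinely new facts of each update.
class Cell:
    """A stateful cell accumulating partial information (here: a set)."""
    def __init__(self):
        self.content = set()
        self._subscribers = []  # stateless operations wired to this cell

    def subscribe(self, operation):
        self._subscribers.append(operation)

    def publish(self, facts):
        new = set(facts) - self.content
        if new:
            self.content |= new
            for operation in self._subscribers:
                operation(new)  # push only the update downstream

# Wire two cells: the second accumulates a derived view of the first.
raw, derived = Cell(), Cell()
raw.subscribe(lambda new: derived.publish(f.upper() for f in new))

raw.publish({"sensor-ok"})
raw.publish({"sensor-ok", "link-up"})  # only the new fact propagates
assert derived.content == {"SENSOR-OK", "LINK-UP"}
```

Because publication only forwards strictly new facts, compositions of such cells converge rather than loop, which is the property that makes the recursive, hierarchical wiring described in the abstract tractable.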
Semantic interoperability in ad-hoc computing environments
This thesis introduces a novel approach in which multiple heterogeneous devices collaborate to provide useful applications in an ad-hoc network. This thesis proposes a smart home as a particular ubiquitous computing scenario, considering all the requirements given in the literature for success in this kind of system. To that end, we envision a horizontally integrated smart home built up from independent components that provide services. These components are described with enough syntactic, semantic and pragmatic knowledge to accomplish spontaneous collaboration. The objective of this collaboration is domestic use, that is, the provision of valuable services for home residents capable of supporting users in their daily activities. Moreover, for the system to be attractive for potential customers, it should offer high levels of trust and reliability, and not at an excessive price. To achieve this goal, this thesis proposes to study the synergies available when an ontological description of home device functionality is paired with a formal method. We propose an ad-hoc home network in which components are home devices modelled as processes and represented as semantic services by means of the Web Service Ontology (OWL-S). In addition, such services are specified, verified and implemented by means of Communicating Sequential Processes (CSP), a process algebra for describing concurrent systems. The utilisation of an ontology brings the desired levels of knowledge for a system to compose services in an ad-hoc environment. Services are composed by a goal-based system in order to satisfy user needs. Such a system is capable of understanding both service representations and user context information.
Furthermore, the inclusion of a formal method contributes additional semantics to check that such compositions will be correctly implemented and executed, achieving the levels of reliability and cost reduction (costs derived from the design, development and implementation of the system) needed for a smart home to succeed.