A Linear Logic approach to RESTful web service modelling and composition
A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of Doctor of Philosophy. RESTful Web Services are gaining increasing attention from both the service and the Web communities. The rising number of services being implemented and made available on the Web is creating a demand for modelling techniques that can abstract REST design from the implementation in order to better specify, analyse and implement large-scale RESTful Web systems. Such techniques can also help by providing suitable RESTful Web Service composition methods, which can reduce costs by efficiently re-using the large number of services already available and by exploiting existing services for complex business purposes.
This research considers RESTful Web Services as state transition systems and proposes a novel Linear Logic based approach, the first of its kind, for both the modelling and the composition of RESTful Web Services. The thesis demonstrates the capabilities of resource-sensitive Linear Logic for modelling five key REST constraints and proposes a two-stage approach to service composition involving Linear Logic theorem proving and proof-as-process based on the π-calculus.
Whereas previous approaches have focused on each aspect of the composition of RESTful Web Services individually (e.g. execution or high-level modelling), this work bridges the gap between abstract formal modelling and application-level execution in an efficient and effective way. The approach not only ensures the completeness and correctness of the resulting composed services but also produces their process models naturally, providing the possibility to translate them into executable business languages.
Furthermore, the research encodes the proposed modelling and composition method into the Coq proof assistant, which enables both the Linear Logic theorem proving and the π-calculus extraction to be conducted semi-automatically.
The feasibility and versatility studies performed in two disparate user scenarios (shopping and biomedical service composition) show that the proposed method provides a good level of scalability when the numbers of services and resources grow.
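The resource-sensitive view underlying this approach can be illustrated with a small sketch: each service consumes its input resources (linearly, so they are used up) and produces its outputs, and composition is a search for a chain of services transforming the available resources into the goal. This is only an illustration of the idea, not the thesis's Coq encoding; the service names are hypothetical.

```python
from dataclasses import dataclass
from collections import Counter

@dataclass(frozen=True)
class Service:
    """A RESTful service viewed as a state transition: it consumes its
    input resources (linear: used up) and produces its output resources."""
    name: str
    consumes: tuple
    produces: tuple

def compose(services, available, goal, max_depth=6):
    """Depth-first search for a service chain that turns the available
    resources into the goal resources. Returns service names, or None."""
    have = Counter(available)
    need = Counter(goal)
    if not (need - have):          # goal already satisfied
        return []
    if max_depth == 0:
        return None
    for s in services:
        cost = Counter(s.consumes)
        if not (cost - have):      # s is applicable: its inputs are available
            after = have - cost + Counter(s.produces)
            rest = compose(services, tuple(after.elements()), goal, max_depth - 1)
            if rest is not None:
                return [s.name] + rest
    return None

# Hypothetical shopping scenario (names are illustrative only).
services = [
    Service("search", ("query",),          ("product",)),
    Service("order",  ("product", "card"), ("receipt",)),
]
plan = compose(services, ("query", "card"), ("receipt",))
print(plan)  # ['search', 'order']
```

The linearity shows up in the `Counter` arithmetic: once `order` consumes the `card` resource it is gone, which is exactly the distinction Linear Logic makes that classical logic does not.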
Knowledge society arguments revisited in the semantic technologies era
In the light of high-profile governmental and international efforts to realise the knowledge society, I review the arguments made for and against it from a technology standpoint. I focus on advanced knowledge technologies with applications on a large scale and in open-ended environments like the World Wide Web and its ambitious extension, the Semantic Web. I argue for a greater role of social networks in a knowledge society, explore recent developments in mechanised trust and knowledge certification, and speculate on their blending with traditional societal institutions. These form the basis of a sketched roadmap of enabling technologies for a knowledge society.
Formalising non-functional requirements embedded in user requirements notation (URN) models
The growing need for computer software in different sectors of activity (health, agriculture, industry, education, aeronautics, science and telecommunications), together with society's increasing reliance on information technology, is placing a heavy and fast-growing demand on complex, high-quality software systems. In this regard, expectations have centred on non-functional requirements (NFRs) engineering and formal methods. Despite their common objective, these techniques have in most cases evolved separately. NFRs engineering proceeds firstly by deriving measures to evaluate the quality of the constructed software (product-oriented approach), and secondly by improving the engineering process (process-oriented approach). With the ability to combine the analysis of both functional and non-functional requirements, Goal-Oriented Requirements Engineering (GORE) approaches have become the de facto leading requirements engineering methods. Through refinement/operationalisation, they propose means to satisfy NFRs encoded in softgoals at an early phase of software development. Formal methods, on the other hand, have so far kept their promise to eliminate errors in software artefacts and produce high-quality software products, and are therefore particularly sought after for safety- and mission-critical systems in which a single error may cause great loss, including human life.
This thesis introduces the concept of Complementary Non-functional action (CNF-action) to extend the analysis and development of NFRs beyond the traditional goal/softgoal analysis based on refinement/operationalisation, and to propagate the influence of NFRs to other software construction phases. Mechanisms are also developed to integrate the formal technique Z/Object-Z into the standardised User Requirements Notation (URN), to formalise GRL models describing functional and non-functional requirements, to propagate CNF-actions of the formalised NFRs to UCM maps, and to facilitate the URN construction process and improve the quality of URN models. School of Computing, D.Phil (Computer Science).
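The refinement/operationalisation idea from GORE mentioned above can be sketched very simply: a goal is satisfied when all of its AND-refinements are, or when any of its OR-refinements is, bottoming out in operationalised tasks. This is a toy illustration; the goal names are hypothetical and not taken from the thesis's GRL models.

```python
def satisfied(goal, graph, leaves):
    """graph maps a goal to ('AND'|'OR', [subgoals]); leaves is the set
    of operationalised (directly satisfied) tasks or softgoals."""
    if goal not in graph:
        return goal in leaves
    mode, subs = graph[goal]
    results = [satisfied(g, graph, leaves) for g in subs]
    return all(results) if mode == "AND" else any(results)

# Hypothetical softgoal refinement graph.
graph = {
    "Security":        ("AND", ["Confidentiality", "Integrity"]),
    "Confidentiality": ("OR",  ["Encrypt traffic", "Isolate network"]),
}
print(satisfied("Security", graph, {"Encrypt traffic", "Integrity"}))  # True
```

Formalising such a graph (e.g. in Z/Object-Z, as the thesis proposes for URN/GRL models) makes satisfaction arguments like this one checkable rather than informal.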
Ontology-based information standards development
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. Standards may be argued to be important enablers for achieving interoperability, as they aim to provide unambiguous specifications for error-free exchange of documents and information. By implication, therefore, it is important to model and represent the concept of a standard in a clear, precise and unambiguous way. Although standards development organisations usually provide guidelines for the process of developing and approving standards, these are usually more concerned with the administrative aspects of the process. As a consequence, the state of the art lacks practical support for developing the structure and content of a standard specification. In short, there is no systematic development method currently available: (a) for developing the conceptual model underpinning a standard; and/or (b) to guide a group of stakeholders in developing a standard specification.
Semantic interoperability is considered to be an essential factor for effective interoperation, and the ability to achieve semantic interoperability effectively and efficiently is strongly equated with quality by some. Semantic interoperability requires that the meaning of terms, their relationships, and the restrictions and rules in the standards be clearly defined in the early stages of standard development and act as a basis for the later stages. This research proposes that ontology can help standards developers and stakeholders to address the issues of improving conceptual models and providing a robust and shared understanding of the domain. This thesis presents OntoStanD, a comprehensive ontology-based standards development methodology, which utilises the best practices of the existing ontology creation methods.
The potential value of OntoStanD is in providing a comprehensive, clear and unambiguous method for developing robust information standards, which are more test-friendly and of higher quality. OntoStanD also facilitates standards conformance testing and change management, positively impacts interoperability, and assists in improved communication among the standards development team. Lastly, OntoStanD provides an approach that is repeatable, teachable and potentially general enough for creating any kind of information standard. Fujitsu Laboratories of Europe Ltd, Google Anita Borg Memorial Scholarship.
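The core claim here is that defining a standard's terms, relationships and restrictions as an ontology makes conformance checkable. A toy illustration of that idea (this is not OntoStanD itself; the term names are invented) is an is-a hierarchy over a standard's vocabulary, against which document fields can be validated:

```python
# Hypothetical vocabulary of an information standard, as an is-a hierarchy.
subclass_of = {
    "Invoice": "BusinessDocument",
    "CreditNote": "BusinessDocument",
    "BusinessDocument": "Thing",
}

def is_a(term, ancestor):
    """Transitive is-a check over the subclass hierarchy."""
    while term is not None:
        if term == ancestor:
            return True
        term = subclass_of.get(term)
    return False

print(is_a("Invoice", "Thing"))       # True
print(is_a("Invoice", "CreditNote"))  # False
```

Conformance testing then reduces to checking that each term a document uses is defined in, and correctly placed within, the shared ontology.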
Combining behavioural types with security analysis
Today's software systems are highly distributed and interconnected, and they increasingly rely on communication to achieve their goals; given their societal importance, security and trustworthiness are crucial aspects of the correctness of these systems. Behavioural types, which extend data types by also describing the structured behaviour of programs, are a widely studied approach to the enforcement of correctness properties in communicating systems. This paper offers a unified overview of proposals based on behavioural types that are aimed at the analysis of security properties.
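The notion of a behavioural type can be sketched concretely: where a data type describes a value, a behavioural (session) type describes a sequence of communication actions. Real behavioural type systems check programs statically; the runtime check below only illustrates the idea, and the protocol is hypothetical.

```python
# A session type as a sequence of expected actions:
# '!' = send, '?' = receive, each with a payload type.
def conforms(trace, session_type):
    """Check that a trace of (op, payload_type, value) actions follows
    the session type exactly, step by step."""
    if len(trace) != len(session_type):
        return False
    return all((op, ty) == expected
               for (op, ty, _val), expected in zip(trace, session_type))

# Client side of a hypothetical login protocol: send credentials, receive bool.
login_type = [("!", "str"), ("?", "bool")]
trace = [("!", "str", "alice:secret"), ("?", "bool", True)]
print(conforms(trace, login_type))  # True
```

Security analyses built on behavioural types extend this structure with, for example, secrecy levels on payloads, so that a well-typed program cannot leak confidential data over a public channel.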
Innovative configurable and collaborative approach to automation systems engineering for automotive powertrain assembly
Presently the automotive industry is facing enormous pressure due to global competition and ever-changing legislative, economic and customer demands. Both agility and reconfigurability are widely recognised as important attributes for manufacturing systems to satisfy the needs of competitive global markets. To facilitate and accommodate unforeseen business changes within the automotive industry, a new proactive methodology is urgently required for the design, build, assembly and reconfiguration of automation systems. There is also a need for the promotion of new technologies and engineering methods to enable true engineering concurrency between product and process development. Virtual construction and testing of new automation systems prior to build is now identified as a crucial requirement, enabling system verification and allowing the investigation of design alternatives before physical systems are built and tested. The main focus of this research was to design and develop reconfigurable assembly systems within the powertrain sector of the automotive industry by capturing and modelling relevant business and engineering processes.
This research has proposed and developed a more process-efficient and robust automation system design, build and implementation approach via new engineering services and a standard library of reusable mechanisms. Existing research at Loughborough had created the basic technology for a component-based approach to automation. However, no research had previously been undertaken on the application of this approach in a user engineering and business context. The objective of this research was therefore to utilise this prototype method and associated engineering tools and to devise novel business and engineering processes to enable the component-based approach to be applied in industry. This new approach has been named Configurable and Collaborative Automation Systems (COAS). In particular, this new research has studied the implications of migration to a COAS approach in terms of 1) necessary changes to the end-user's business processes, 2) the potential to improve the robustness of the resultant system, and 3) the potential for improved efficiency and greater collaboration across the supply chain... cont'
Extending and Relating Semantic Models of Compensating CSP
Business transactions involve multiple partners coordinating and interacting with each other. These transactions have hierarchies of activities which need to be orchestrated. Usual database approaches (e.g., checkpoint, rollback) are not applicable to handling faults in a long-running transaction, due to the interaction with multiple partners. The compensation mechanism handles faults that can arise in a long-running transaction. Based on the framework of Hoare's CSP process algebra, Butler et al. introduced Compensating CSP (cCSP), a language to model long-running transactions. The language introduces a method to declare a transaction as a process, and it has constructs for the orchestration of compensation. Butler et al. also define a trace semantics for cCSP. In this thesis, the semantic models of Compensating CSP are extended by defining an operational semantics describing how the state of a program changes during its execution. The semantics is encoded into Prolog to animate the specification. The semantic models are further extended to define the synchronisation of processes. The notion of partial behaviour is defined to model the deadlock behaviour that arises during process synchronisation. A correspondence relationship is then defined between the semantic models and proved by using structural induction. Proving the correspondence means that any of the presentations can be accepted as a primary definition of the meaning of the language, and each definition can be used correctly at different times and for different purposes. The semantic models and their relationships are mechanised by using the theorem prover PVS. The semantic models are embedded in PVS using shallow embedding. The relationships between the semantic models are proved by mutual structural induction. The mechanisation overcomes the problems of hand proofs and improves the scalability of the approach.
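The compensation idea behind cCSP can be sketched operationally: each successful step of a long-running transaction installs a compensation, and on failure the installed compensations run in reverse order. This is a minimal illustration of the mechanism, not Butler et al.'s semantics; the booking steps are hypothetical.

```python
def run_transaction(steps):
    """steps: list of (action, compensation) pairs. Each action returns
    True on success. On a failed action, run the compensations installed
    so far in reverse order, then report failure."""
    installed = []
    for action, compensation in steps:
        if action():
            installed.append(compensation)
        else:
            for comp in reversed(installed):
                comp()
            return False
    return True

log = []
steps = [
    (lambda: log.append("book flight") or True,  lambda: log.append("cancel flight")),
    (lambda: log.append("book hotel") or False,  lambda: log.append("cancel hotel")),
]
print(run_transaction(steps))  # False
print(log)                     # ['book flight', 'book hotel', 'cancel flight']
```

An operational semantics, as developed in the thesis, makes this step-by-step state evolution precise, which is what allows the Prolog animation and the PVS correspondence proofs.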
Model morphisms (MoMo) to enable language independent information models and interoperable business networks
MSc dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the Master's degree in Electrical and Computer Engineering. With the advent of globalisation, the opportunities for collaboration became more evident, with the effect of enlarging business networks. In such conditions, a key to enterprise success is reliable communication with all partners. Therefore, organisations have been searching for flexible integrated environments to better manage their services and product life cycles, where their software applications could be easily integrated independently of the platform in use. However, with so many different information models and implementation standards being used, interoperability problems arise. Moreover, organisations are themselves at different technological maturity levels, and a solution that might be good for one can be too advanced for another, or vice versa. This dissertation responds to the above needs, proposing a high-level meta-model to be used across the entire business network, making it possible to abstract individual models from their specificities and increasing language independence and interoperability, while keeping all the enterprise legacy software's integrity intact. The strategy presented allows an incremental mapping construction, to achieve a gradual integration. To accomplish this, the author proposes Model Driven Architecture (MDA) based technologies for the development of traceable transformations and the execution of automatic Model Morphisms.
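At its simplest, a model morphism is a mapping from one information model's fields (and units) to another's, applied mechanically so that each partner keeps its own model. The sketch below is a toy illustration of that idea, not the dissertation's MDA tooling; the field names and the unit conversion are invented.

```python
# Hypothetical morphism between two partners' information models:
# each source field maps to a target field plus a value conversion.
mapping = {
    "partNo":   ("item_id",  lambda v: v),
    "weightKg": ("weight_g", lambda v: v * 1000),
}

def morph(record, mapping):
    """Translate a record from the source model into the target model."""
    return {tgt: f(record[src])
            for src, (tgt, f) in mapping.items() if src in record}

print(morph({"partNo": "A-17", "weightKg": 2}, mapping))
# {'item_id': 'A-17', 'weight_g': 2000}
```

The incremental-mapping strategy described above corresponds to growing such a mapping field by field, so integration can proceed gradually rather than requiring all partners to switch models at once.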