    Analyzing Consistency of Behavioral REST Web Service Interfaces

    REST web services can offer complex operations that do more than simply create, retrieve, update and delete information in a database. We have proposed an approach to designing the interfaces of behavioral REST web services by defining a resource model and a behavioral model using UML. In this paper we discuss the consistency between the resource and behavioral models, which represent service states using state invariants. The state invariants are defined as predicates over resources and describe which state configurations of the behavioral model are valid. If a state invariant is unsatisfiable, then there is no valid state configuration containing that state, and no service can implement the service interface. We also show how reasoning tools can be used to determine the consistency of these design models. Comment: In Proceedings WWV 2012, arXiv:1210.578
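
    A minimal sketch, not taken from the paper, of the satisfiability check it describes: a state invariant expressed as a predicate over resource attributes is handed to an SMT solver (here Z3, in place of the paper's reasoning tools); the resource names and the "Shipped" state are hypothetical.

        # Check whether a state invariant -- a predicate over resources -- is satisfiable.
        from z3 import Int, Bool, And, Solver, sat

        items   = Int('items')      # hypothetical resource: number of order items
        paid    = Bool('paid')      # hypothetical resource: a payment exists
        shipped = Bool('shipped')   # hypothetical resource: a shipment exists

        # Invariant of a hypothetical "Shipped" state of the behavioral model.
        invariant = And(shipped, paid, items >= 1)

        solver = Solver()
        solver.add(invariant)
        if solver.check() == sat:
            print("Satisfiable; example configuration:", solver.model())
        else:
            print("Unsatisfiable: no valid configuration contains this state")
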

    Transitioning Applications to Semantic Web Services: An Automated Formal Approach

    Semantic Web Services have been recognized as a promising technology that exhibits huge commercial potential and attracts significant attention from both industry and the research community. Despite high expectations, the industrial take-up of Semantic Web Service technologies has been slower than expected. One of the main reasons is that many systems have been developed without considering the potential of the web for integrating services and sharing resources. Without a systematic methodology and proper tool support, the migration from legacy systems to Semantic Web Service-based systems can be a very tedious and expensive process, which carries a definite risk of failure. There is an urgent need for strategies that allow the migration of legacy systems to Semantic Web Services platforms, and for tools that support such strategies. In this paper we propose a methodology for transitioning these applications to Semantic Web Services by taking advantage of rigorous mathematical methods. Our methodology allows users to migrate their applications to a Semantic Web Services platform automatically or semi-automatically.

    Discovery and Selection of Certified Web Services Through Registry-Based Testing and Verification

    Reliability and trust are fundamental prerequisites for the establishment of functional relationships among peers in a Collaborative Networked Organisation (CNO), especially in the context of Virtual Enterprises where economic benefits can be directly at stake. This paper presents a novel approach towards effective service discovery and selection that is no longer based on informal, ambiguous and potentially unreliable service descriptions, but on formal specifications that can be used to verify and certify the actual Web service implementations. We propose the use of Stream X-machines (SXMs) as a powerful modelling formalism for constructing the behavioural specification of a Web service, for performing verification through the generation of exhaustive test cases, and for performing validation through animation or model checking during service selection.
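
    A minimal sketch, under assumptions not in the paper, of what a Stream X-machine specification looks like: a state machine whose transitions are processing functions that consume an input, read and update a memory, and emit an output. The shopping-cart states, memory and functions below are hypothetical.

        # Hypothetical SXM specification of a tiny shopping-cart Web service.
        EMPTY, ACTIVE = "Empty", "Active"

        def add_item(mem, inp):
            """Processing function: add an item to memory, output an acknowledgement."""
            if inp[0] == "add":
                return mem + [inp[1]], "added"
            return None  # function not applicable to this input

        def clear(mem, inp):
            """Processing function: empty a non-empty cart."""
            if inp[0] == "clear" and mem:
                return [], "cleared"
            return None

        # Transition relation: (state, processing function) -> next state
        transitions = {
            (EMPTY, add_item): ACTIVE,
            (ACTIVE, add_item): ACTIVE,
            (ACTIVE, clear): EMPTY,
        }

        def run(inputs, state=EMPTY, mem=None):
            """Animate the SXM on an input stream, yielding the output stream."""
            mem = [] if mem is None else mem
            for inp in inputs:
                for (src, fn), dst in transitions.items():
                    if src == state and (res := fn(mem, inp)) is not None:
                        mem, out = res
                        state = dst
                        yield out
                        break

        print(list(run([("add", "book"), ("add", "pen"), ("clear", None)])))
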

    Provenance-based validation of E-science experiments

    E-Science experiments typically involve many distributed services maintained by different organisations. After an experiment has been executed, it is useful for a scientist to verify that the execution was performed correctly or is compatible with some existing experimental criteria or standards. Scientists may also want to review and verify experiments performed by their colleagues. There are no existing frameworks for validating such experiments in today's e-Science systems. Users therefore have to rely on error checking performed by the services, or adopt other ad hoc methods. This paper introduces a platform-independent framework for validating workflow executions. The validation relies on reasoning over the documented provenance of experiment results and semantic descriptions of services advertised in a registry. This validation process ensures that experiments are performed correctly, and thus that the results generated are meaningful. The framework is tested in a bioinformatics application that performs protein compressibility analysis.
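
    A minimal sketch, not from the paper, of the validation idea: each step recorded in the provenance of a result is checked against the semantic description of the corresponding service advertised in a registry. The service names and types below are hypothetical.

        # Hypothetical registry of semantic service descriptions (input/output types).
        registry = {
            "compress": {"consumes": "ProteinSequence",    "produces": "CompressedSequence"},
            "ratio":    {"consumes": "CompressedSequence", "produces": "Compressibility"},
        }

        # Hypothetical provenance of one result, recorded during workflow execution.
        provenance = [
            {"service": "compress", "input_type": "ProteinSequence",    "output_type": "CompressedSequence"},
            {"service": "ratio",    "input_type": "CompressedSequence", "output_type": "Compressibility"},
        ]

        def validate(provenance, registry):
            """Check that every recorded step conforms to its advertised description."""
            for step in provenance:
                desc = registry.get(step["service"])
                if desc is None:
                    return False, f"unknown service {step['service']!r}"
                if (step["input_type"], step["output_type"]) != (desc["consumes"], desc["produces"]):
                    return False, f"step {step['service']!r} violates its description"
            return True, "execution conforms to advertised service descriptions"

        print(validate(provenance, registry))
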

    The Distributed Ontology Language (DOL): Use Cases, Syntax, and Extensibility

    The Distributed Ontology Language (DOL) is currently being standardized within the OntoIOp (Ontology Integration and Interoperability) activity of ISO/TC 37/SC 3. It aims at providing a unified framework for (1) ontologies formalized in heterogeneous logics, (2) modular ontologies, (3) links between ontologies, and (4) annotation of ontologies. This paper presents the current state of DOL's standardization. It focuses on use cases where distributed ontologies enable interoperability and reusability. We demonstrate relevant features of the DOL syntax and semantics and explain how these integrate into existing knowledge engineering environments. Comment: Terminology and Knowledge Engineering Conference (TKE), 2012-06-20 to 2012-06-21, Madrid, Spain

    An active, ontology-driven network service for Internet collaboration

    Web portals have emerged as an important means of collaboration on the WWW, and the integration of ontologies promises to make them more accurate in how they serve users' collaboration and information location requirements. However, web portals are essentially a centralised architecture, resulting in difficulties in supporting seamless roaming between portals and collaboration between groups supported on different portals. This paper proposes an alternative, decentralised approach to collaboration over the web that uses ontologies and exploits content-based networking. We argue that this approach promises a user-centric, timely, secure and location-independent mechanism, which is potentially more scalable and universal than existing centralised portals.
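
    A minimal sketch, not from the paper, of the content-based networking idea it builds on: subscribers register predicates over message content rather than addresses, so collaboration messages annotated with ontology concepts reach whoever they match. The subscribers, concepts and fields below are hypothetical.

        # Hypothetical content-based publish/subscribe core.
        subscriptions = []   # list of (subscriber, predicate) pairs

        def subscribe(subscriber, predicate):
            subscriptions.append((subscriber, predicate))

        def publish(message):
            """Deliver a message to every subscriber whose predicate matches its content."""
            for subscriber, predicate in subscriptions:
                if predicate(message):
                    print(f"deliver to {subscriber}: {message}")

        # Subscriptions expressed over ontology concepts attached to the message.
        subscribe("alice", lambda m: "GridComputing" in m["concepts"])
        subscribe("bob",   lambda m: m.get("location") == "CERN")

        publish({"concepts": ["GridComputing", "Scheduling"], "location": "CERN",
                 "body": "Call for collaboration on job scheduling"})
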

    Link Before You Share: Managing Privacy Policies through Blockchain

    With the advent of numerous online content providers, utilities and applications, each with its own specific version of privacy policies and associated overhead, it is becoming increasingly difficult for concerned users to manage and track the confidential information that they share with the providers. Users consent to providers gathering and sharing their Personally Identifiable Information (PII). We have developed a novel framework to automatically track details about how a user's PII data is stored, used and shared by the provider. We have integrated our Data Privacy ontology with the properties of blockchain to develop an automated access control and audit mechanism that enforces users' data privacy policies when sharing their data across third parties. We have also validated this framework by implementing a working system, LinkShare. In this paper, we describe our framework in detail along with the LinkShare system. Our approach can be adopted by Big Data users to automatically apply their privacy policy on data operations and track the flow of that data across various stakeholders. Comment: 10 pages, 6 figures. Published in: 4th International Workshop on Privacy and Security of Big Data (PSBD 2017) in conjunction with the 2017 IEEE International Conference on Big Data (IEEE BigData 2017), December 14, 2017, Boston, MA, USA
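
    A minimal sketch, not the LinkShare implementation, of the underlying idea: every operation on a user's PII is checked against the user's policy and appended to a hash-chained, append-only audit log. The policy fields, provider name and ledger layout are hypothetical.

        import hashlib, json, time

        # Hypothetical user policy: which provider may perform which operation on which PII field.
        policy = {("fitness-app", "store", "heart_rate"): True,
                  ("fitness-app", "share", "heart_rate"): False}

        ledger = []   # append-only, hash-chained audit log

        def record(provider, operation, field):
            """Check the policy, then append the (allowed or denied) operation to the audit chain."""
            allowed = policy.get((provider, operation, field), False)
            prev = ledger[-1]["hash"] if ledger else "0" * 64
            entry = {"provider": provider, "operation": operation, "field": field,
                     "allowed": allowed, "time": time.time(), "prev": prev}
            entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
            ledger.append(entry)
            return allowed

        record("fitness-app", "store", "heart_rate")   # permitted by the policy
        record("fitness-app", "share", "heart_rate")   # denied, but still audited
        print(json.dumps(ledger, indent=2))
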

    Leveraging Semantic Web Service Descriptions for Validation by Automated Functional Testing

    Recent years have seen the utilisation of Semantic Web Service descriptions for automating a wide range of service-related activities, with a primary focus on service discovery, composition, execution and mediation. An important area which has so far received less attention is service validation, whereby advertised services are proven to conform to required behavioural specifications. This paper proposes a method for validating service-oriented systems through automated functional testing. The method leverages ontology-based and rule-based descriptions of service inputs, outputs, preconditions and effects (IOPE) to construct a stateful Extended Finite State Machine (EFSM) specification. The specification is subsequently utilised for functional testing and validation using the proven Stream X-machine (SXM) testing methodology. Complete functional test sets are generated automatically at an abstract level and are then applied to concrete Web services, using test drivers created from the Web service descriptions. The testing method comes with completeness guarantees and provides a strong means of validating the behaviour of Web services.
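
    A minimal sketch, not the paper's method, of how IOPE descriptions can map onto an EFSM-style model: each operation becomes a transition whose precondition is a guard over context variables and whose effect updates them, against which abstract test sequences can be replayed. The booking operations, guards and effects below are hypothetical.

        # Hypothetical IOPE-derived transitions of a tiny booking service.
        # Each entry: operation -> (source state, target state, precondition, effect).
        transitions = {
            "createBooking": ("Idle",   "Booked", lambda c: True,              lambda c: {**c, "booked": True}),
            "payBooking":    ("Booked", "Paid",   lambda c: c.get("booked"),   lambda c: {**c, "paid": True}),
            "cancelBooking": ("Booked", "Idle",   lambda c: not c.get("paid"), lambda c: {**c, "booked": False}),
        }

        def apply(sequence, state="Idle", context=None):
            """Replay an abstract test sequence; fail if a source state or guard is violated."""
            context = context or {}
            for op in sequence:
                src, dst, pre, eff = transitions[op]
                assert state == src and pre(context), f"{op} not enabled in state {state}"
                state, context = dst, eff(context)
            return state, context

        # Abstract test sequences (in the SXM method these would be generated
        # automatically and then mapped onto concrete Web service calls).
        print(apply(["createBooking", "payBooking"]))
        print(apply(["createBooking", "cancelBooking"]))
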