
    Interoperability, Trust Based Information Sharing Protocol and Security: Digital Government Key Issues

    Improved interoperability between public and private organizations is of key significance to the success of digital government. Interoperability, information-sharing protocols and security are considered the key issues for achieving an advanced stage of digital government. Seamless interoperability is essential to share information between diverse and widely dispersed organisations across several network environments using computer-based tools. Digital government must ensure the security of its information systems, including computers and networks, in order to provide better service to citizens. Governments around the world are increasingly turning to information sharing and integration to solve problems in programs and policy areas. Problems of global concern such as disease detection and control, terrorism, immigration and border control, illegal drug trafficking, and more demand information sharing, harmonisation and cooperation among government agencies within a country and across national borders. A number of daunting challenges stand in the way of developing an efficient information-sharing protocol. A secure and trusted information-sharing protocol is required to enable users to interact and share information easily and securely across many diverse networks and databases globally. Comment: 20 pages

    Secure data sharing and processing in heterogeneous clouds

    The extensive cloud adoption among European public-sector players has empowered them to own and operate a range of cloud infrastructures. These deployments vary both in size and capabilities, as well as in the range of employed technologies and processes. The public sector, however, lacks the necessary technology to enable effective, interoperable and secure integration of the multitude of its computing clouds and services. In this work we focus on the federation of private clouds and the approaches that enable secure data sharing and processing among the collaborating infrastructures and services of public entities. We investigate the aspects of access control, data and security policy languages, as well as cryptographic approaches that enable fine-grained security and data processing in semi-trusted environments. We identify the main challenges and frame the future work that will serve as an enabler of interoperability among heterogeneous infrastructures and services. Our goal is to enable both security and legal conformance, as well as to facilitate the transparency, privacy and effectiveness of private cloud federations for public-sector needs. © 2015 The Authors
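The fine-grained access control discussed in this abstract can be illustrated with a minimal attribute-based check. The policy layout and attribute names below are our own assumptions for illustration, not the paper's actual model:

```python
# Hypothetical sketch of attribute-based access control for a
# private-cloud federation; the policy structure and attribute
# names (role, org, action) are illustrative assumptions.

def is_access_allowed(policy: dict, request: dict) -> bool:
    """Grant access only when every attribute required by the
    policy matches the corresponding attribute of the request."""
    return all(request.get(attr) == value
               for attr, value in policy["require"].items())

# A policy for a shared dataset: only analysts from a federated
# member organisation may read it.
policy = {"require": {"role": "analyst", "org": "member", "action": "read"}}

allowed = is_access_allowed(
    policy, {"role": "analyst", "org": "member", "action": "read"})
denied = is_access_allowed(
    policy, {"role": "analyst", "org": "external", "action": "read"})
```

In a real federation such checks would be expressed in a policy language (e.g. of the kind the authors survey) and enforced at each member cloud's boundary.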

    Using ontologies to support customisation and maintain interoperability in distributed information systems with application to the Domain Name System

    ©2006 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes, or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works, must be obtained from the IEEE. Global distributed systems must be standards-based to allow interoperability between all of their components. While this guarantees interoperability, it often causes local inflexibility and an inability to adapt to specialised local requirements. We show how local flexibility and global consistency can coexist by changing the way that we represent these systems. The proven technologies already in use in the Semantic Web, to support and interpret metadata annotation, provide a well-tested starting point. We can use OWL ontologies and RDF to describe distributed systems using a knowledge-based approach. This allows us to maintain separate local and global operational spaces which, in turn, gives us local flexibility and global consistency. The annotated and well-defined data is better structured, more easily maintained and less prone to errors, since its purpose can be clearly determined prior to use. To illustrate the application of our approach in distributed systems, we present our implementation of an ontologically-based Domain Name System (DNS) server and client. We also present performance figures to demonstrate that the use of this approach does not add significant overhead to system performance. Nickolas J. G. Falkner, Paul D. Coddington, Andrew L. Wendelbor
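The knowledge-based representation of DNS data described above can be sketched with RDF-style (subject, predicate, object) triples. The vocabulary (`dns:hasAddress`, `local:managedBy`) is invented for illustration and is not the authors' ontology:

```python
# Minimal sketch of a knowledge-based DNS record store using
# RDF-style (subject, predicate, object) triples; the predicate
# vocabulary here is a hypothetical stand-in for an OWL ontology.

triples = set()

def add(subject, predicate, obj):
    triples.add((subject, predicate, obj))

def query(subject=None, predicate=None, obj=None):
    """Match triples against an optional pattern (None = wildcard)."""
    return [(s, p, o) for (s, p, o) in triples
            if (subject is None or s == subject)
            and (predicate is None or p == predicate)
            and (obj is None or o == obj)]

# Describe a standard A record plus a locally customised annotation
# that a conventional zone file could not carry -- the local and
# global operational spaces coexist in one store.
add("example.org", "dns:hasAddress", "192.0.2.10")
add("example.org", "dns:recordType", "A")
add("example.org", "local:managedBy", "ops-team")

addresses = [o for (_, _, o) in query("example.org", "dns:hasAddress")]
```

Standard resolvers would consume only the `dns:` facts, while local tooling could exploit the `local:` annotations, which mirrors the separation of global and local spaces the abstract describes.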

    Data mediation to message level conflict in heterogeneous web services

    Enterprise Information Systems (EISs) are built in isolated and independent environments, leading to unpredictable and incompatible structures in their data stores. An EIS can expose its functionalities as Web services to share resources over the existing global Internet infrastructure. The goal of this research paper is to facilitate interoperation between Web services. Successful and reliable information (message) exchange between Web services is necessary to meet the current challenge of Enterprise Information Integration (EII). A real-world business process, which consists of Web services WS1 and WS2, can be used as a practical scenario for data or message exchange between Web services. In this scenario, a message is exchanged by using the output of WS1 as the input of WS2. If the data formats of WS1 and WS2 are heterogeneous or incompatible, interoperation between them is impossible unless data mediation is used to resolve message-level conflict and incompatibility in the context of syntax and semantics. Data mediation requires mapping a message from one format to another. We propose a mediation technique that enables previously non-interoperable heterogeneous Web services to interoperate. To improve interoperational performance between Web services, our data mediation approach extends and utilizes the existing Web service supporting tools WSDL and SAWSDL. (Authors' abstract)
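The core mediation step — mapping WS1's output message into WS2's expected input format — can be sketched as a simple field-renaming transform. The field names and mapping table below are hypothetical, not drawn from any actual WSDL or SAWSDL documents:

```python
# Illustrative data-mediation step between two Web services whose
# message schemas disagree; the field names and the mapping table
# are invented examples, not taken from the paper.

FIELD_MAP = {            # WS1 output field -> WS2 input field
    "custName": "customer_name",
    "custId": "customer_id",
}

def mediate(ws1_output: dict) -> dict:
    """Rename WS1's output fields so the message becomes a
    syntactically valid input message for WS2; fields WS2 does
    not understand are dropped."""
    return {FIELD_MAP[k]: v
            for k, v in ws1_output.items() if k in FIELD_MAP}

ws2_input = mediate({"custName": "Ada", "custId": 7, "internalFlag": True})
```

In the paper's setting the mapping would be derived from semantic annotations (SAWSDL) rather than hard-coded, so that the syntactic transform is grounded in a shared semantic model.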

    An Analysis of Composability and Composition Anomalies

    The separation of concerns principle aims at decomposing a given design problem into concerns that are mapped to multiple independent software modules. The application of this principle eases the composition of the concerns and as such supports composability. Unfortunately, a clean separation (and composition) of concerns at the design level does not always imply the composability of the concerns at the implementation level. The composability might be reduced due to limitations of the implementation abstractions and composition mechanisms. The paper introduces the notion of composition anomaly to describe a general set of unexpected composition problems that arise when mapping design concerns to implementation concerns. To distinguish composition anomalies from other composition problems, the requirements for composability at the design level are provided. The ideas are illustrated for a distributed newsgroup system.

    The Dynamic Practice and Static Theory of Gradual Typing

    We can tease apart the research on gradual types into two 'lineages': a pragmatic, implementation-oriented dynamic-first lineage and a formal, type-theoretic, static-first lineage. The dynamic-first lineage's focus is on taming particular idioms - 'pre-existing conditions' in untyped programming languages. The static-first lineage's focus is on interoperation and individual type system features, rather than the collection of features found in any particular language. Both appear in programming languages research under the name "gradual typing", and they are in active conversation with each other. What are these two lineages? What challenges and opportunities await the static-first lineage? What progress has been made so far?
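The "dynamic-first" flavour of gradual typing can be illustrated in Python, where annotations may be added to untyped code one function at a time while unannotated values still flow freely at run time. This is a generic illustration of the idiom, not an example from the paper:

```python
# A small illustration of dynamic-first gradual typing: typed and
# untyped code coexist, with `Any` mediating between the two worlds.

from typing import Any

def untyped_double(x):                 # legacy, unannotated code
    return x * 2

def typed_sum(a: int, b: int) -> int:  # newly annotated code
    return a + b

mixed: Any = untyped_double(3)  # a static checker accepts this flow
result = typed_sum(mixed, 4)    # ...and the call runs fine dynamically
```

A static-first treatment would instead start from a full type system and ask how typed and untyped components may soundly interoperate, e.g. by inserting run-time casts at the boundary.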

    The integration of LwM2M and OPC UA : an interoperability approach for industrial IoT

    Over the past years, the Internet of Things (IoT) has been emerging, with connected and smart things that can communicate with each other and exchange information. Similarly, with the emergence of Industry 4.0, the industrial world is also undergoing a strong evolution by connecting devices, sensors and machines to the Internet. In this paper, we investigate the integration of these two domains and examine the interconnection of two of the most promising interoperability standards in these domains, namely the OPC Unified Architecture (OPC UA) and the Lightweight Machine-to-Machine (LwM2M) protocol. For this purpose, we introduce an efficient and scalable approach, based on Docker containers, for cross-domain integration and interoperation. In addition, we demonstrate and validate our interoperability approach by means of real-world implementations as well as theoretical and practical analysis.
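A bridge between the two standards must, at minimum, translate LwM2M resource paths into OPC UA node identifiers. The sketch below shows that translation table; the object/resource numbers follow the public LwM2M registry (3303 is the temperature-sensor object, 5700 its sensor-value resource), but the OPC UA namespace and node ids are invented for illustration and are not the paper's implementation:

```python
# Hypothetical path translation a containerised LwM2M/OPC UA bridge
# might use; the OPC UA node ids (ns=2;s=...) are made-up examples.

LWM2M_TO_OPCUA = {
    "/3303/0/5700": "ns=2;s=TemperatureSensor0.Value",
    "/3303/0/5701": "ns=2;s=TemperatureSensor0.Units",
}

def to_opcua_node(lwm2m_path: str) -> str:
    """Map an LwM2M /object/instance/resource path to the OPC UA
    node id exposing the same datum; fail loudly on unknown paths."""
    try:
        return LWM2M_TO_OPCUA[lwm2m_path]
    except KeyError:
        raise ValueError(f"no OPC UA mapping for {lwm2m_path}")

node = to_opcua_node("/3303/0/5700")
```

Packaging each protocol endpoint and the translation layer as separate Docker containers, as the abstract suggests, keeps the mapping replaceable without touching either protocol stack.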

    Manufacturing systems interoperability in dynamic change environments

    The benefits of rapid (i.e. nearly real-time) data- and information-enabled decision making at all levels of a manufacturing enterprise are clearly documented: the ability to plan accurately, react quickly and even pre-empt situations can save industries billions of dollars in waste. As the pace of industry increases with automation and technology, so the need for accurate data, information and knowledge increases. As the required pace of information collection, processing and exchange changes, so too do the challenges of achieving and maintaining interoperability as systems develop: this thesis focuses on the particular challenge of interoperability between systems defined in different time frames, which may use very different terminology. This thesis is directed at improving the ability to assess the requirement for systems to interoperate, and their suitability to do so, as new systems emerge to support this need for change. In this thesis a novel solution concept is proposed that assesses the requirement and suitability of systems for interoperability. The solution concept provides a mechanism for describing systems consistently and unambiguously, even if they are developed in different timeframes. Having resolved the issue of semantic consistency through time, the analysis of the systems against logical rules for system interoperability becomes possible. The solution concept uses a Core Concept ontology as the foundation for a multi-level heavyweight ontology. The multiple levels allow increasing specificity (to ensure accuracy), while the heavyweight (i.e. computer-interpretable) nature provides the semantic and logical rigour required. A detailed investigation has been conducted to test the solution concept using a suitably dynamic environment: manufacturing systems, and in particular the emerging field of Manufacturing Intelligence Systems. A definitive definition of the Manufacturing Intelligence domain, constraining interoperability logic, and a multi-level domain ontology have been defined and used to successfully prove the solution concept. Using systems from different timeframes, the solution concept testing successfully identified systems which needed to interoperate, determined whether they were suitable for interoperation, and provided feedback on the reasons for unsuitability, which was validated as correct against real-world observations.
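The central idea — judging interoperability by resolving each system's local terms to a shared core-concept ontology — can be sketched as follows. The system names, local terms and core concepts below are invented examples, not the thesis's actual ontology:

```python
# Sketch of interoperability assessment via a shared core-concept
# ontology: two systems from different timeframes may interoperate
# on a data item only when their local terms resolve to the same
# core concept. All identifiers here are hypothetical.

CORE_MAPPING = {
    # (system, local term) -> core concept
    ("MES_2005", "WorkOrder"): "core:ProductionOrder",
    ("MI_2015", "ProdJob"): "core:ProductionOrder",
    ("MI_2015", "KPI"): "core:PerformanceIndicator",
}

def can_interoperate(sys_a, term_a, sys_b, term_b) -> bool:
    """True when both local terms map to the same core concept,
    giving semantic consistency across timeframes."""
    a = CORE_MAPPING.get((sys_a, term_a))
    b = CORE_MAPPING.get((sys_b, term_b))
    return a is not None and a == b

ok = can_interoperate("MES_2005", "WorkOrder", "MI_2015", "ProdJob")
```

The thesis's heavyweight, multi-level ontology goes much further than this lookup — it supports logical rules and reasoned feedback on why a pairing is unsuitable — but the shared-core-concept resolution is the pivot that makes such reasoning possible.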