    Implications of Integration and Interoperability for Enterprise Cloud-based Applications

    Enterprises' adoption of cloud-based solutions is often hindered by problems associated with integrating the cloud environment with on-premise systems. Currently, each cloud provider creates its own proprietary application programming interfaces (APIs), which complicates integration efforts for companies as they struggle to understand and manage these unique interfaces in an interoperable way. This paper aims to address this challenge by providing recommendations to enterprises. The presented work is based on a quantitative study of 114 companies and discusses current issues and future trends in integration and interoperability requirements for enterprise cloud application adoption and migration. The outcome of the discussion is a guideline to support decision makers, software architects and developers when designing and developing interoperable applications that avoid lock-in and integrate seamlessly with other cloud and on-premise systems.
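    The lock-in risk described above is commonly mitigated by placing a thin, provider-neutral abstraction between application code and each vendor's proprietary API. The following Python sketch illustrates that idea under assumed names (ObjectStore, InMemoryStore and S3Store are hypothetical and not taken from the study); it is an illustration of the recommendation, not an artefact of the paper.

        # Provider-neutral facade: application code depends only on ObjectStore,
        # never on a vendor SDK. Class and method names are illustrative.
        from abc import ABC, abstractmethod

        class ObjectStore(ABC):
            @abstractmethod
            def put(self, key: str, data: bytes) -> None: ...

            @abstractmethod
            def get(self, key: str) -> bytes: ...

        class InMemoryStore(ObjectStore):
            # On-premise/test stand-in exposing the same interface.
            def __init__(self) -> None:
                self._items: dict[str, bytes] = {}

            def put(self, key: str, data: bytes) -> None:
                self._items[key] = data

            def get(self, key: str) -> bytes:
                return self._items[key]

        class S3Store(ObjectStore):
            # Adapter for one concrete provider; only this class touches the vendor SDK.
            def __init__(self, bucket: str) -> None:
                import boto3
                self._client = boto3.client("s3")
                self._bucket = bucket

            def put(self, key: str, data: bytes) -> None:
                self._client.put_object(Bucket=self._bucket, Key=key, Body=data)

            def get(self, key: str) -> bytes:
                return self._client.get_object(Bucket=self._bucket, Key=key)["Body"].read()

    Swapping providers, or falling back to an on-premise system, then means adding another adapter rather than rewriting application code, which is the essence of avoiding lock-in.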

    Realising the open virtual commissioning of modular automation systems

    To address the challenges in the automotive industry posed by the need to rapidly manufacture more product variants, and the resultant need for more adaptable production systems, radical changes are now required in the way in which such systems are developed and implemented. In this context, two enabling approaches for achieving more agile manufacturing, namely modular automation systems and virtual commissioning, are briefly reviewed in this contribution. Ongoing research conducted at Loughborough University, which aims to provide a modular approach to automation systems design coupled with a virtual engineering toolset for the (re)configuration of such manufacturing automation systems, is reported. The problems faced in the virtual commissioning of modular automation systems are outlined. AutomationML, an emerging neutral data format with the potential to address these integration problems, is discussed. The paper proposes and illustrates a collaborative framework in which AutomationML is adopted for the data exchange and data representation of related models to enable efficient open virtual prototype construction and virtual commissioning of modular automation systems. A case study is provided to show how to create an AutomationML-based data model describing a modular automation system.
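    As an illustration of the data-exchange idea, the sketch below serialises a hypothetical modular cell into an AutomationML/CAEX-style instance hierarchy using Python's standard library; the module names and attributes are invented, and a real exporter would validate against the CAEX schema rather than emit elements ad hoc.

        # Minimal sketch: an AutomationML/CAEX-style instance hierarchy for a
        # modular automation cell. Module names and attributes are illustrative only.
        import xml.etree.ElementTree as ET

        def build_caex(cell_name: str, modules: list[dict]) -> ET.Element:
            root = ET.Element("CAEXFile", FileName=f"{cell_name}.aml")
            hierarchy = ET.SubElement(root, "InstanceHierarchy", Name=cell_name)
            for m in modules:
                element = ET.SubElement(hierarchy, "InternalElement", Name=m["name"])
                for attr, value in m.get("attributes", {}).items():
                    a = ET.SubElement(element, "Attribute", Name=attr)
                    ET.SubElement(a, "Value").text = str(value)
            return root

        cell = build_caex("AssemblyCell", [
            {"name": "ClampModule", "attributes": {"CycleTime_s": 2.5}},
            {"name": "LiftModule", "attributes": {"Stroke_mm": 300}},
        ])
        print(ET.tostring(cell, encoding="unicode"))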

    Smart technologies for effective reconfiguration: the FASTER approach

    Current and future computing systems increasingly require that their functionality remain flexible after the system is operational, in order to cope with changing user requirements and improvements in system features, for example changing protocols and data-coding standards, evolving demands for support of different user applications, and newly emerging applications in communication, computing and consumer electronics. Extending the functionality and the lifetime of products therefore requires adding new functionality to track and satisfy customers' needs and market and technology trends. Many contemporary products incorporate hardware accelerators alongside their software for reasons of performance and power efficiency. While adaptivity of software is straightforward, adaptation of the hardware to changing requirements constitutes a challenging problem requiring delicate solutions. The FASTER (Facilitating Analysis and Synthesis Technologies for Effective Reconfiguration) project aims to introduce a complete methodology that allows designers to easily implement a system specification on a platform combining a general-purpose processor with multiple accelerators running on an FPGA, taking a high-level description as input and fully exploiting, both at design time and at run time, the capabilities of partial dynamic reconfiguration. The goal is that, for selected application domains, the FASTER toolchain will reduce the design and verification time of complex reconfigurable systems, providing additional novel verification features that are not available in existing tool flows.
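    To make the run-time side of partial dynamic reconfiguration concrete, the sketch below shows a toy reconfiguration manager that reuses a resident accelerator, loads a partial bitstream into a free region, or falls back to software. It is a simplified illustration, not the FASTER toolchain; the bitstream-loading step is a placeholder for the platform's reconfiguration port or driver.

        # Toy run-time manager for partial dynamic reconfiguration (illustrative only).
        from typing import Callable, Optional

        class ReconfigManager:
            def __init__(self, num_regions: int, bitstreams: dict[str, str]) -> None:
                self.regions: dict[int, Optional[str]] = {i: None for i in range(num_regions)}
                self.bitstreams = bitstreams      # kernel name -> partial bitstream path

            def _load(self, region: int, kernel: str) -> None:
                # Placeholder: a real flow streams self.bitstreams[kernel]
                # through the device's partial-reconfiguration interface.
                self.regions[region] = kernel

            def run(self, kernel: str, sw_fallback: Callable[[], str]) -> str:
                if kernel not in self.bitstreams:
                    return sw_fallback()          # no hardware implementation available
                for region, resident in self.regions.items():
                    if resident == kernel:
                        return f"{kernel}: reused accelerator in region {region}"
                region = next((r for r, k in self.regions.items() if k is None), 0)
                self._load(region, kernel)        # reconfigure a free (or victim) region
                return f"{kernel}: loaded into region {region} and executed"

        mgr = ReconfigManager(num_regions=2, bitstreams={"fir_filter": "fir.pbs"})
        print(mgr.run("fir_filter", sw_fallback=lambda: "fir_filter: software"))
        print(mgr.run("fft", sw_fallback=lambda: "fft: software"))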

    Designing Traceability into Big Data Systems

    Providing an appropriate level of accessibility and traceability to data or process elements (so-called Items) in large volumes of data, often Cloud-resident, is an essential requirement in the Big Data era. Enterprise-wide data systems need to be designed from the outset to support usage of such Items across the spectrum of business use rather than from any specific application view. The design philosophy advocated in this paper is to drive the design process using a so-called description-driven approach which enriches models with meta-data and description and focuses the design process on Item re-use, thereby promoting traceability. Details are given of the description-driven design of big data systems at CERN, in health informatics and in business process management. Evidence is presented that the approach leads to design simplicity and consequent ease of management thanks to loose typing and the adoption of a unified approach to Item management and usage.
    Comment: 10 pages; 6 figures; in Proceedings of the 5th Annual International Conference on ICT: Big Data, Cloud and Security (ICT-BDCS 2015), Singapore, July 2015. arXiv admin note: text overlap with arXiv:1402.5764, arXiv:1402.575
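    A minimal sketch of the description-driven idea follows: every Item carries a description (meta-data) and an append-only history, so changes stay traceable and typing stays loose. The class and field names are illustrative assumptions and do not reproduce the CERN, health-informatics or business-process systems cited above.

        # Illustrative only: an Item enriched with a description and an audit trail.
        from dataclasses import dataclass, field
        from datetime import datetime, timezone
        from typing import Any

        @dataclass
        class ItemDescription:
            name: str
            schema: dict[str, str]                      # loose typing: property -> declared type

        @dataclass
        class Item:
            description: ItemDescription
            properties: dict[str, Any] = field(default_factory=dict)
            history: list[tuple[str, str, Any]] = field(default_factory=list)

            def set(self, prop: str, value: Any) -> None:
                stamp = datetime.now(timezone.utc).isoformat()
                self.history.append((stamp, prop, value))   # nothing is overwritten silently
                self.properties[prop] = value

        desc = ItemDescription("DetectorPart", {"serial": "str", "calibrated": "bool"})
        part = Item(desc)
        part.set("serial", "DP-0042")
        part.set("calibrated", True)
        print(part.history)                              # timestamped trace of every change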