
    An Integrated Approach to Defining Enterprise Computing Architectures

    This paper describes research into the development of enterprise computing architectures that employ a mix of mainframe, local area network, and cooperative computing paradigms. It outlines a robust approach that permits the incorporation of several different distribution criteria while accommodating designer preferences.
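    The abstract does not spell out how distribution criteria and designer preferences are combined; as a rough illustration of the idea, the sketch below scores hypothetical candidate architectures (mainframe, LAN, cooperative) with a weighted sum of criteria, where the weights stand in for designer preferences. All criterion names, scores, and weights are invented for illustration.

```python
# Hypothetical sketch: scoring candidate enterprise architectures against
# weighted distribution criteria. Criteria, scores, and weights are
# illustrative only; they are not taken from the paper.

candidates = {
    "mainframe":   {"data_locality": 0.9, "scalability": 0.5, "autonomy": 0.2},
    "lan":         {"data_locality": 0.4, "scalability": 0.6, "autonomy": 0.8},
    "cooperative": {"data_locality": 0.6, "scalability": 0.8, "autonomy": 0.7},
}

# Designer preferences expressed as relative weights per criterion.
preferences = {"data_locality": 0.5, "scalability": 0.3, "autonomy": 0.2}

def score(architecture: dict, weights: dict) -> float:
    """Weighted sum of criterion scores for one candidate architecture."""
    return sum(architecture[c] * w for c, w in weights.items())

ranked = sorted(candidates, key=lambda name: score(candidates[name], preferences),
                reverse=True)
print(ranked)  # e.g. ['cooperative', 'mainframe', 'lan']
```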

    ClouNS - A Cloud-native Application Reference Model for Enterprise Architects

    The capability to operate cloud-native applications can generate enormous business growth and value. But enterprise architects should be aware that cloud-native applications are vulnerable to vendor lock-in. We investigated cloud-native application design principles, public cloud service providers, and industrial cloud standards. All results indicate that most cloud service categories seem to foster vendor lock-in situations, which might be especially problematic for enterprise architectures. This might sound disillusioning at first. However, we present a reference model for cloud-native applications that relies only on a small subset of well-standardized IaaS services. The reference model can be used for codifying cloud technologies. It can guide technology identification, classification, adoption, research, and development processes for cloud-native applications and for vendor-lock-in-aware enterprise architecture engineering methodologies.
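    As a rough illustration of the lock-in-aware idea (not the ClouNS reference model itself), the sketch below flags services in a hypothetical catalog whose category lies above a small set of well-standardized IaaS services. Category and service names are assumptions made for the example.

```python
# Illustrative sketch (not the ClouNS model): flag cloud services whose
# category sits above the well-standardized IaaS layer as potential
# lock-in risks.

STANDARDIZED_IAAS = {"virtual_machine", "block_storage", "object_storage", "virtual_network"}

catalog = [
    ("vm.compute",        "virtual_machine"),
    ("blob.store",        "object_storage"),
    ("managed.nosql",     "database_as_a_service"),
    ("serverless.runner", "function_as_a_service"),
]

def lockin_risk(category: str) -> bool:
    """Treat a service as lock-in prone if its category is not standardized IaaS."""
    return category not in STANDARDIZED_IAAS

for name, category in catalog:
    label = "lock-in risk" if lockin_risk(category) else "portable (IaaS)"
    print(f"{name:18s} {category:24s} -> {label}")
```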

    Ontology-based patterns for the integration of business processes and enterprise application architectures

    Increasingly, enterprises are using Service-Oriented Architecture (SOA) as an approach to Enterprise Application Integration (EAI). SOA has the potential to bridge the gap between business and technology and to improve the reuse of existing applications and interoperability with new ones. In addition to service architecture descriptions, architecture abstractions like patterns and styles capture design knowledge and allow the reuse of successfully applied designs, thus improving the quality of software. Knowledge gained from integration projects can be captured to build a repository of semantically enriched, experience-based solutions. Business patterns identify the interaction and structure between users, business processes, and data. Specific integration and composition patterns at a more technical level address enterprise application integration and capture reliable architecture solutions. We use an ontology-based approach to capture architecture and process patterns. Ontology techniques for pattern definition, extension, and composition are developed, and their applicability in business process-driven application integration is demonstrated.
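    A minimal sketch of what pattern definition, extension, and composition relations could look like, using plain Python classes in place of a real ontology language such as OWL; the pattern names and relations are hypothetical, not taken from the paper.

```python
# Minimal sketch of pattern definition, extension, and composition relations.
# A real implementation would use an ontology language; plain Python classes
# stand in here, and the pattern names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Pattern:
    name: str
    extends: "Pattern | None" = None                              # specialization of another pattern
    composed_of: list["Pattern"] = field(default_factory=list)    # aggregation of sub-patterns

    def lineage(self) -> list[str]:
        """Walk the extension chain back to the most general pattern."""
        chain, current = [], self
        while current is not None:
            chain.append(current.name)
            current = current.extends
        return chain

broker       = Pattern("MessageBroker")
router       = Pattern("ContentBasedRouter", extends=broker)
order_to_erp = Pattern("OrderToERPIntegration", composed_of=[router, broker])

print(router.lineage())                          # ['ContentBasedRouter', 'MessageBroker']
print([p.name for p in order_to_erp.composed_of])
```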

    Architectural implications for context adaptive smart spaces

    Buildings and spaces are complex entities containing complex social structures and interactions. A smart space is a composite of the users that inhabit it, the IT infrastructure that supports it, and the sensors and appliances that service it. Rather than treating the IT, the buildings, and the appliances that inhabit them as separate systems, pervasive computing combines them and allows them to interact. We outline a reactive context architecture that supports this vision of integrated smart spaces and explore some implications for building large-scale pervasive systems.
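    One plausible, simplified reading of a reactive context architecture is a publish/subscribe bus on which sensors publish context events and appliances react; the sketch below illustrates that shape. Event types, device behaviour, and class names are assumptions, not the authors' design.

```python
# Illustrative sketch of a reactive context architecture: sensors publish
# context events, appliances subscribe and react. Event and device names
# are hypothetical, not taken from the paper.

from collections import defaultdict
from typing import Callable

class ContextBus:
    """A tiny publish/subscribe bus for context events."""
    def __init__(self) -> None:
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[event_type].append(handler)

    def publish(self, event_type: str, payload: dict) -> None:
        for handler in self._subscribers[event_type]:
            handler(payload)

bus = ContextBus()

# An appliance reacting to occupancy context published by a sensor.
bus.subscribe("occupancy", lambda e: print(
    "lights on" if e["occupied"] else "lights off", "in", e["room"]))

bus.publish("occupancy", {"room": "meeting-room-2", "occupied": True})
```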

    Information standards to support application and enterprise interoperability for the smart grid

    Current changes in the European electricity industry are driven by regulatory directives to reduce greenhouse gas emissions, at the same time as replacing aged infrastructure and maintaining energy security. There is wide acceptance of the requirement for smarter grids to support such changes and accommodate variable injections from renewable energy sources. However, the design templates are still emerging to manage the level of information required to meet challenges such as balancing, planning, and market dynamics under this new paradigm. While secure and scalable cloud computing architectures may contribute to supporting the informatics challenges of the smart grid, this paper focuses on the essential need for business alignment with standardised information models, such as the IEC Common Information Model (CIM), to leverage data value and control-system interoperability. In this paper we present details of use cases being considered by National Grid, the GB transmission system operator, for information interoperability in pan-network system management and planning. This study is financially supported by National Grid, UK.
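    To illustrate how a shared information model supports interoperability, the sketch below models a tiny CIM-style fragment. Class names such as IdentifiedObject, Substation, and PowerTransformer do appear in IEC CIM, but the attributes and structure here are a heavy simplification for illustration, not the normative model.

```python
# Simplified, illustrative fragment loosely modeled on IEC CIM naming.
# The attributes and relationships below are a sketch, not the standard.

from dataclasses import dataclass, field
from typing import Optional
import uuid

@dataclass
class IdentifiedObject:
    name: str
    mrid: str = field(default_factory=lambda: str(uuid.uuid4()))  # master resource identifier

@dataclass
class Substation(IdentifiedObject):
    pass

@dataclass
class PowerTransformer(IdentifiedObject):
    substation: Optional[Substation] = None

sub = Substation(name="SubstationA")
tx = PowerTransformer(name="SGT1", substation=sub)

# Two systems that agree on the model and exchange mRIDs can match records
# for the same physical asset across planning and network-management tools.
print(tx.name, "->", tx.substation.name, tx.mrid)
```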

    A Tale of Two Data-Intensive Paradigms: Applications, Abstractions, and Architectures

    Scientific problems that depend on processing large amounts of data require overcoming challenges in multiple areas: managing large-scale data distribution, co-placement and scheduling of data with compute resources, and storing and transferring large volumes of data. We analyze the ecosystems of the two prominent paradigms for data-intensive applications, hereafter referred to as the high-performance computing and the Apache-Hadoop paradigm. We propose a basis, common terminology, and functional factors upon which to analyze the two paradigms. We discuss the concept of "Big Data Ogres" and their facets as a means of understanding and characterizing the most common application workloads found across the two paradigms. We then discuss the salient features of the two paradigms, and compare and contrast the two approaches. Specifically, we examine common implementations of these paradigms, shed light upon the reasons for their current "architecture", and discuss some typical workloads that utilize them. In spite of the significant software distinctions, we believe there is architectural similarity. We discuss the potential integration of different implementations across the different levels and components. Our comparison progresses from a fully qualitative examination of the two paradigms to a semi-quantitative methodology. We use a simple and broadly used Ogre (K-means clustering) and characterize its performance on a range of representative platforms, covering several implementations from both paradigms. Our experiments provide insight into the relative strengths of the two paradigms. We propose that the set of Ogres will serve as a benchmark to evaluate the two paradigms along different dimensions.
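    The abstract names K-means clustering as the Ogre used for the semi-quantitative comparison. The sketch below is a generic Lloyd's-algorithm K-means in NumPy, showing the kind of kernel being benchmarked; it is not the authors' benchmark code, and the data and parameters are placeholders.

```python
# Minimal Lloyd's-algorithm K-means in NumPy, as an example of the kind of
# kernel ("Ogre") benchmarked across HPC and Hadoop-style platforms.

import numpy as np

def kmeans(points: np.ndarray, k: int, iters: int = 20, seed: int = 0):
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each center as the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return centers, labels

data = np.random.default_rng(1).normal(size=(1000, 2))  # placeholder data
centers, labels = kmeans(data, k=3)
print(centers.shape, np.bincount(labels))
```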

    Data center virtualization and its economic implications for the companies

    In the current situation of the economic crisis, when companies target budget cuts in a context of explosive data growth, the IT community must evaluate potential technology developments not only on their technical advantages, but on their economic effects as well.
    Keywords: data centre; virtualization; tiered storage; provisioning software; unified computing.
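    To show the shape of such an economic evaluation, here is a back-of-the-envelope server-consolidation estimate; every figure is a hypothetical placeholder, not data from the paper.

```python
# Back-of-the-envelope consolidation estimate. All figures are hypothetical
# placeholders that only show the shape of such an economic evaluation.

physical_servers = 200            # servers before virtualization (hypothetical)
consolidation_ratio = 8           # VMs hosted per physical host (hypothetical)
cost_per_server_per_year = 4_000  # power, cooling, maintenance per server (hypothetical, EUR)

hosts_after = -(-physical_servers // consolidation_ratio)  # ceiling division
annual_saving = (physical_servers - hosts_after) * cost_per_server_per_year

print(f"hosts after consolidation: {hosts_after}")
print(f"estimated annual saving:   {annual_saving} EUR")
```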