
    Report from GI-Dagstuhl Seminar 16394: Software Performance Engineering in the DevOps World

    This report documents the program and the outcomes of GI-Dagstuhl Seminar 16394 "Software Performance Engineering in the DevOps World". The seminar addressed the problem of performance-aware DevOps. Both DevOps and performance engineering have been growing trends over the past one to two years, in no small part due to the rise in importance of identifying performance anomalies in the operations (Ops) of cloud and big data systems and feeding these back to development (Dev). However, so far, the research community has treated software engineering, performance engineering, and cloud computing mostly as individual research areas. We aimed to identify opportunities for cross-community collaboration and to set the path for long-lasting collaborations towards performance-aware DevOps. The main goal of the seminar was to bring together young researchers (PhD students in a later stage of their PhD, as well as postdocs or junior professors) in the areas of (i) software engineering, (ii) performance engineering, and (iii) cloud computing and big data to present their current research projects, to exchange experience and expertise, to discuss research challenges, and to develop ideas for future collaborations.

    An agile business process and practice meta-model

    Business Process Management (BPM) encompasses the discovery, modelling, monitoring, analysis and improvement of business processes. Limitations of traditional BPM approaches in addressing changes in business requirements have resulted in a number of agile BPM approaches that seek to accelerate the redesign of business process models. Meta-models are a key BPM feature that reduce the ambiguity of business process models. This paper describes a meta-model supporting the agile version of the Business Process and Practice Alignment Methodology (BPPAM) for business process improvement, which captures process information from actual work practices. The ability of the meta-model to achieve business process agility is discussed and compared with other agile meta-models, based on definitions of business process flexibility and agility found in the literature.

    Microservice Transition and its Granularity Problem: A Systematic Mapping Study

    Microservices have gained wide recognition and acceptance in software industries as an emerging architectural style for autonomic, scalable, and more reliable computing. The transition to microservices has been highly motivated by the need for better alignment of technical design decisions with improving the value potential of architectures. Despite microservices' popularity, research still lacks a disciplined understanding of the transition and consensus on the principles and activities underlying "micro-ing" architectures. In this paper, we report on a systematic mapping study that consolidates the various views, approaches and activities that commonly assist in the transition to microservices. The study aims to provide a better understanding of the transition; it also contributes a working definition of the transition and the technical activities underlying it. We term the transition and the technical activities leading to microservice architectures microservitization. We then shed light on a fundamental problem of microservitization: microservice granularity and reasoning about its adaptation as a first-class entity. The study reviews the state of the art and practice related to reasoning about microservice granularity, covering the modelling approaches, aspects considered, guidelines and processes used in that reasoning, and identifies opportunities for future research and development in this area.
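
    A minimal sketch may help make "reasoning about microservice granularity as a first-class entity" concrete. The Python fragment below computes two simple indicators (operations per service and outgoing service calls) over a hypothetical decomposition; the metrics, service names, and data model are illustrative assumptions, not taken from the mapping study.

    ```python
    # Hypothetical illustration: treat granularity as a measurable property of a
    # decomposition. The metrics and example data are assumptions, not study results.
    from typing import Dict, Set, Tuple

    # Assumed decomposition: service -> operations it exposes.
    services: Dict[str, Set[str]] = {
        "orders":   {"create_order", "cancel_order", "get_order"},
        "payments": {"charge", "refund"},
        "catalog":  {"list_items", "get_item", "update_item", "price_item"},
    }

    # Assumed runtime dependencies between services: (caller, callee).
    calls: Set[Tuple[str, str]] = {("orders", "payments"), ("orders", "catalog")}

    def granularity_report(svcs: Dict[str, Set[str]], deps: Set[Tuple[str, str]]) -> None:
        """Print size (operations) and fan-out (outgoing calls) per service."""
        for name, ops in sorted(svcs.items()):
            fan_out = sum(1 for caller, _ in deps if caller == name)
            print(f"{name}: {len(ops)} operations, calls {fan_out} other service(s)")

    granularity_report(services, calls)
    ```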

    Conditioned emergence: a dissipative structures approach to transformation

    This paper presents a novel framework for the management of organisational transformation, defined here as a relatively rapid transition from one archetype to another. The concept of dissipative structures, from the field of complexity theory, is used to develop and explain a specific sequence of activities which underpin effective transformation. This sequence integrates selected concepts from the literatures on strategic change, organisational learning and business processes; in so doing, it introduces a degree of prescriptiveness which differentiates it from other managerial interpretations of complexity theory. Specifically, it proposes a three-stage process: first, the organisation conditions the outcome of the transformation process by articulating and reconfiguring the rules which underpin its deep structure; second, it takes steps to move from its current equilibrium; and, finally, it moves into a period where positive and negative feedback loops become the focus of managerial attention. The paper argues that by managing at the level of deep structure in social systems, organisations can gain some influence over self-organising processes which are typically regarded as unpredictable in the natural sciences. However, the paper further argues that this influence is limited to archetypal features and that detailed forms and behaviours are emergent properties of the system. Two illustrative case vignettes are presented to give an insight into the practical application of the model before conclusions are reached which speculate on the implications of this approach for strategy research.

    The new EFQM model: What is really new and could be considered as a suitable tool with respect to Quality 4.0 concept?

    Purpose: The paper offers a set of original information based on a critical analysis of the descriptions of the two most recent versions of the excellence models presented by the European Foundation for Quality Management (EFQM). The principal goal is to present the main advantages and weaknesses of the latest version of the EFQM Model, especially from a practical point of view with respect to the Quality 4.0 era. Methodology/Approach: A comparative analysis of two relevant documents (EFQM, 2012; EFQM, 2019a) was used as the key method. Discussions with 18 quality professionals from Czech production organisations served as a complementary approach. Findings: The basic structure of the new model has been completely changed, but the descriptions of certain recommendations, given as guidance points, are superficial and confusing. The new model lays stress on the necessity of transforming organisations for the future as well as on comprehensive feedback from key stakeholders. Research Limitation/Implication: The latest version of the EFQM Model was published in November 2019, and general knowledge related to this version is naturally limited. Published studies and publicly available experience are completely absent. That is why a more in-depth literature review focused on the latest version of the EFQM Model could not be included in this text. Originality/Value of Paper: The paper brings an original set of information that has not been published before. The value of this set should be examined not only from a theoretical but primarily from a practical viewpoint.

    Autonomic care platform for optimizing query performance

    Background: As the amount of information in electronic health care systems increases, data operations become more complicated and time-consuming. Intensive care platforms require timely processing of data retrievals to guarantee the continuous display of recent patient data, on which physicians and nurses rely for their decision making. Manual optimization of query executions has become difficult to handle due to the increased number of queries across multiple sources. Hence, a more automated management is necessary to increase the performance of database queries. The autonomic computing paradigm promises an approach in which the system adapts itself and acts as a self-managing entity, taking actions itself and thereby limiting human intervention. Despite the usage of autonomic control loops in network and software systems, this approach has so far not been applied to health information systems. Methods: We extend the COSARA architecture, an infection surveillance and antibiotic management service platform for the Intensive Care Unit (ICU), with self-managed components to increase the performance of data retrievals. We used real-life ICU COSARA queries to analyse slow performance and to measure the impact of optimizations. More than 2 million COSARA queries are executed each day. Three control loops, which monitor the executions and take action, have been proposed: reactive, deliberative and reflective. We focus on improving the execution time of microbiology queries directly related to the visual display of patients' data on the bedside screens. Results: The results show that autonomic control loops are beneficial for optimizing data executions in the ICU. Applying the reactive control loop reduces the average execution time of microbiology queries by 8.61%; combining the reactive and deliberative control loops yields an average reduction of 10.92%, and combining the reactive, deliberative and reflective control loops yields a reduction of 13.04%. Conclusions: We found that a controlled reduction of query executions improves performance for the end user. The implementation of autonomic control loops in an existing health platform, COSARA, has a positive effect on timely data visualization for the physician and nurse.
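
    The abstract reports results but no implementation detail; as a hedged illustration of what a reactive control loop over query executions could look like, the sketch below monitors a moving average of execution times and triggers an optimization action when a threshold is exceeded. The class name, threshold, and optimize() action are assumptions for illustration and are not part of the COSARA platform.

    ```python
    # Illustrative reactive control loop for query performance (hypothetical names;
    # not the actual COSARA implementation).
    from collections import defaultdict, deque

    SLOW_THRESHOLD_MS = 500   # assumed threshold for a "slow" query
    WINDOW = 20               # number of recent executions in the moving average

    class ReactiveQueryController:
        def __init__(self) -> None:
            self.history = defaultdict(lambda: deque(maxlen=WINDOW))

        def record(self, query_id: str, duration_ms: float) -> None:
            """Monitor step: store the latest execution time, then react."""
            self.history[query_id].append(duration_ms)
            self._react(query_id)

        def _react(self, query_id: str) -> None:
            """Analyse/act step: trigger an optimization when the average is slow."""
            samples = self.history[query_id]
            average = sum(samples) / len(samples)
            if average > SLOW_THRESHOLD_MS:
                self.optimize(query_id, average)

        def optimize(self, query_id: str, average_ms: float) -> None:
            # Placeholder action, e.g. lower the refresh rate or cache the result.
            print(f"{query_id}: avg {average_ms:.0f} ms exceeds threshold, optimizing")

    controller = ReactiveQueryController()
    controller.record("microbiology_results", 620.0)   # triggers optimize()
    ```

    A deliberative or reflective loop would, in the same spirit, reason over longer histories of executions, or over the behaviour of the reactive loop itself, before acting.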

    On Modeling and Analyzing Cost Factors in Information Systems Engineering

    Introducing enterprise information systems (EIS) is usually associated with high costs. It is therefore crucial to understand the factors that determine or influence these costs. Though software cost estimation has received considerable attention during the last decades, it is difficult to apply existing approaches to EIS. This difficulty particularly stems from the inability of these methods to deal with the dynamic interactions of the many technological, organizational and project-driven cost factors which specifically arise in the context of EIS. Picking up this problem, we introduce the EcoPOST framework to investigate the complex cost structures of EIS engineering projects through qualitative cost evaluation models. This paper extends previously described concepts and introduces design rules and guidelines for cost evaluation models in order to enhance the development of meaningful and useful EcoPOST cost evaluation models. A case study illustrates the benefits of our approach. Most importantly, the EcoPOST framework supports EIS engineers in gaining a better understanding of the critical factors that determine the costs of EIS engineering projects.
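
    To make the idea of qualitative cost evaluation models with dynamically interacting cost factors more tangible, the sketch below simulates two hypothetical, mutually influencing factors and their effect on a relative cost index. The factor names, equations, and coefficients are invented for illustration and are not the EcoPOST notation.

    ```python
    # Hedged, system-dynamics-style sketch of two interacting (hypothetical) cost
    # factors; purely qualitative, not the actual EcoPOST evaluation models.
    STEPS = 12  # e.g. project months

    def simulate(resistance: float = 0.6, knowledge: float = 0.3) -> list:
        """Return a relative cost index per step under assumed factor interactions."""
        cost_index = []
        for _ in range(STEPS):
            # Assumed interactions: growing process knowledge lowers end-user
            # resistance, while resistance drives the relative adaptation cost up.
            knowledge = min(1.0, knowledge + 0.05 * (1.0 - resistance))
            resistance = max(0.0, resistance - 0.1 * knowledge)
            cost_index.append(round(1.0 + resistance, 2))
        return cost_index

    print(simulate())  # a qualitative trend, not a monetary estimate
    ```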

    System Dynamics in Food Quality Certifications: Development of an Audit Integrity System

    Due to the complex structure of certification schemes, the risk of flaws and scandals is generally high, and it has been further increased by several developments in recent years. With regard to their potential effects, it is questionable whether the certification approaches are actually able to detect deficiencies within the system and thus prevent crises which may lead to its breakdown. Hence, the ability of a standard to meet its objectives of food quality and safety needs to be enforced. In this contribution we launch the implementation of a controlling tool which automatically monitors audit quality based on information from the respective databases. By analysing possible negative influences, opportunistic behaviour can thus be detected.
    Keywords: certification, quality assurance systems, risk-oriented auditing approach, Food Consumption/Nutrition/Food Safety, Food Security and Poverty
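
    The contribution only outlines the controlling tool, so the following sketch is one possible, heavily simplified reading of automatic audit-quality monitoring: it flags auditors whose pass rates deviate markedly from the overall mean. The data model, field names, and deviation rule are assumptions for illustration only.

    ```python
    # Hedged sketch (assumed data model): flag auditors whose pass rates deviate
    # from the population mean by more than k standard deviations.
    from statistics import mean, stdev

    # Hypothetical audit records: (auditor_id, audit_passed)
    audits = [
        ("A1", True), ("A1", True), ("A1", True), ("A1", True),
        ("A2", True), ("A2", False), ("A2", True), ("A2", False),
        ("A3", True), ("A3", True), ("A3", True), ("A3", True),
    ]

    def pass_rates(records):
        """Share of passed audits per auditor."""
        totals, passed = {}, {}
        for auditor, ok in records:
            totals[auditor] = totals.get(auditor, 0) + 1
            passed[auditor] = passed.get(auditor, 0) + int(ok)
        return {a: passed[a] / totals[a] for a in totals}

    def flag_outliers(rates, k=1.0):
        """Auditors whose pass rate deviates by more than k standard deviations."""
        values = list(rates.values())
        mu, sigma = mean(values), stdev(values)
        return [a for a, r in rates.items() if sigma and abs(r - mu) > k * sigma]

    print(flag_outliers(pass_rates(audits)))  # ['A2'] in this toy example
    ```

    In practice such a flag would only prompt a closer review of the auditor's work, not a verdict on its quality.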

    The nature of risk in complex projects

    Risk analysis is important for complex projects; however, systemicity makes evaluating risk in real projects difficult. Looking at the causal structure of risks is a start, but causal chains need to include management actions, the motivations of project actors, and sociopolitical project complexities, as well as intra-connectedness and feedback. Common practice based upon decomposition-type methods is often shown to point to the wrong risks. A complexity structure is used to identify systemicity and to draw lessons about key risks. We describe how to analyze the systemic nature of risk and how the contractor and client can understand the ramifications of their actions.
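
    As a purely illustrative sketch of this kind of systemicity analysis, the code below encodes a causal map of risks and management actions as a directed graph and searches it for feedback loops; the node names and edges are invented examples, not taken from the paper.

    ```python
    # Hedged sketch: represent a causal risk structure as a directed graph and find
    # feedback loops (cycles), the signature of systemicity among project risks.
    from typing import Dict, List

    # Hypothetical causal map: each factor points to the factors it influences.
    causal_map: Dict[str, List[str]] = {
        "schedule pressure": ["overtime"],
        "overtime": ["fatigue"],
        "fatigue": ["rework"],
        "rework": ["schedule pressure"],      # closes a reinforcing feedback loop
        "client scope change": ["rework"],
    }

    def find_feedback_loops(graph: Dict[str, List[str]]) -> List[List[str]]:
        """Return cycles that return to their starting node, deduplicated by node set."""
        seen, loops = set(), []

        def dfs(start: str, node: str, path: List[str]) -> None:
            for nxt in graph.get(node, []):
                if nxt == start:
                    key = frozenset(path)
                    if key not in seen:
                        seen.add(key)
                        loops.append(path + [start])
                elif nxt not in path:
                    dfs(start, nxt, path + [nxt])

        for start in graph:
            dfs(start, start, [start])
        return loops

    for loop in find_feedback_loops(causal_map):
        print(" -> ".join(loop))  # schedule pressure -> overtime -> fatigue -> rework -> schedule pressure
    ```

    In an analysis of this kind it is the loops, rather than individual risks, that would be discussed with contractor and client.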