
    Report from GI-Dagstuhl Seminar 16394: Software Performance Engineering in the DevOps World

    This report documents the program and the outcomes of GI-Dagstuhl Seminar 16394 "Software Performance Engineering in the DevOps World". The seminar addressed the problem of performance-aware DevOps. Both DevOps and performance engineering have been growing trends over the past one to two years, in no small part due to the rise in importance of identifying performance anomalies in the operations (Ops) of cloud and big data systems and feeding these back to the development (Dev). However, so far the research community has treated software engineering, performance engineering, and cloud computing mostly as individual research areas. We aimed to identify opportunities for cross-community collaboration and to set the path for long-lasting collaborations towards performance-aware DevOps. The main goal of the seminar was to bring together young researchers (PhD students in a later stage of their PhD, as well as PostDocs or Junior Professors) in the areas of (i) software engineering, (ii) performance engineering, and (iii) cloud computing and big data to present their current research projects, to exchange experience and expertise, to discuss research challenges, and to develop ideas for future collaborations.

    Winning customer loyalty in an automotive company through Six Sigma: a case study

    Six Sigma is a disciplined approach to improving product, process and service quality. Since its inception at Motorola in the mid-1980s, Six Sigma has evolved significantly and continues to expand to improve process performance, enhance business profitability and increase customer satisfaction. This paper presents an extensive literature review based on the experiences of both academics and practitioners on Six Sigma, followed by the application of the Define, Measure, Analyse, Improve, Control (DMAIC) problem-solving methodology to identify the parameters causing casting defects and to control these parameters. The results of the study are based on the application of tools and techniques within the DMAIC methodology, i.e. Pareto Analysis, Measurement System Analysis, Regression Analysis and Design of Experiments. The study shows that applying the Six Sigma methodology reduced casting defects and increased the process capability from 0.49 to 1.28. The application of DMAIC resulted in a significant financial impact (over US$110,000 per annum) on the company's bottom line.
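
    The reported jump in process capability (0.49 to 1.28) refers to a standard capability index. As a rough, illustrative sketch only (the specification limits and measurements below are hypothetical placeholders, not data from the study), a Cp/Cpk calculation looks like this:

    # Minimal sketch of a process-capability (Cp / Cpk) calculation.
    # Specification limits and measurements are made-up placeholders.
    import statistics

    def process_capability(samples, lsl, usl):
        """Return (Cp, Cpk) for sample measurements and lower/upper spec limits."""
        mu = statistics.mean(samples)
        sigma = statistics.stdev(samples)
        cp = (usl - lsl) / (6 * sigma)                # potential capability
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)   # capability accounting for centering
        return cp, cpk

    measurements = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3]
    cp, cpk = process_capability(measurements, lsl=9.0, usl=11.0)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")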

    Optimizing Service Differentiation Scheme with Sized-based Queue Management in DiffServ Networks

    In this paper we introduce Modified Size-based Queue Management (M-SQM), a dropping scheme that aims to prioritize and allocate more service to VoIP traffic over bulk data such as FTP, since VoIP packets are typically small and contribute less to network congestion. At the same time, we want to guarantee that this prioritization remains fair to both traffic types. We also study the total delay over the congested link, attempting to alleviate the congestion as much as possible through early congestion notification. The M-SQM scheme has been evaluated with NS2 experiments that measure the packets received from both traffic types and the total link delay. The performance evaluation results of M-SQM have been validated and graphically compared with the performance of three legacy AQM schemes (RED, RIO and PI). The results show that M-SQM outperforms these AQMs in providing QoS-level service differentiation. Comment: 10 pages, 9 figures, 1 table, submitted to Journal of Telecommunication
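
    The abstract does not spell out the M-SQM algorithm itself, so the following is only a simplified sketch of the general idea of size-based preferential dropping with an early-congestion threshold; the constants, function name and drop-probability shape are assumptions for illustration, not the paper's scheme:

    # Simplified illustration of size-based preferential dropping.
    # NOT the paper's M-SQM algorithm; thresholds and probabilities are assumed.
    import random

    QUEUE_LIMIT = 100          # queue capacity in packets
    EARLY_DROP_THRESHOLD = 60  # start early congestion notification/dropping here
    SMALL_PACKET_BYTES = 200   # packets at or below this size are treated as VoIP-like

    def should_drop(queue_len, packet_size):
        """Decide whether to drop an arriving packet."""
        if queue_len >= QUEUE_LIMIT:
            return True        # hard drop: queue full
        if queue_len < EARLY_DROP_THRESHOLD:
            return False       # no congestion yet
        # Between threshold and limit, drop probability grows with occupancy,
        # but small (VoIP-like) packets get a reduced probability.
        congestion = (queue_len - EARLY_DROP_THRESHOLD) / (QUEUE_LIMIT - EARLY_DROP_THRESHOLD)
        drop_prob = congestion if packet_size > SMALL_PACKET_BYTES else 0.2 * congestion
        return random.random() < drop_prob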

    Benchmarking network propagation methods for disease gene identification

    In-silico identification of potential target genes for disease is an essential aspect of drug target discovery. Recent studies suggest that successful targets can be found by leveraging genetic, genomic and protein interaction information. Here, we systematically tested the ability of 12 varied algorithms, based on network propagation, to identify genes that have been targeted by any drug, using gene-disease data from 22 common non-cancerous diseases in OpenTargets. We considered two biological networks and six performance metrics, and compared two types of input gene-disease association scores. The impact of the design factors on performance was quantified through additive explanatory models. Standard cross-validation led to over-optimistic performance estimates due to the presence of protein complexes. In order to obtain realistic estimates, we introduced two novel protein complex-aware cross-validation schemes. When seeding biological networks with known drug targets, machine learning and diffusion-based methods found around 2-4 true targets within the top 20 suggestions. Seeding the networks with genes genetically associated with the disease decreased performance to below 1 true hit on average. The use of a larger network, although noisier, improved overall performance. We conclude that diffusion-based prioritisers and machine learning applied to diffusion-based features are suited for drug discovery in practice and improve over simpler neighbour-voting methods. We also demonstrate the large impact of choosing an adequate validation strategy and of the definition of seed disease genes.
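
    As background for the diffusion-based methods mentioned above, here is a minimal sketch of network propagation by random walk with restart on a toy gene network; the 4-gene adjacency matrix, seed scores and restart probability are invented for illustration and are not the networks or scores used in the benchmark:

    # Minimal sketch of network propagation via random walk with restart (RWR).
    # The toy network and seed scores below are illustrative assumptions.
    import numpy as np

    def random_walk_with_restart(adj, seed_scores, restart=0.3, tol=1e-8, max_iter=1000):
        """Propagate seed scores over a network given its adjacency matrix."""
        col_sums = adj.sum(axis=0)
        col_sums[col_sums == 0] = 1.0
        w = adj / col_sums                 # column-normalized transition matrix
        p0 = seed_scores / seed_scores.sum()
        p = p0.copy()
        for _ in range(max_iter):
            p_next = (1 - restart) * w @ p + restart * p0
            if np.abs(p_next - p).sum() < tol:
                break
            p = p_next
        return p

    # Toy 4-gene network: genes 0-1-2 form a path, gene 3 attaches to gene 2.
    adj = np.array([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)
    seeds = np.array([1.0, 0.0, 0.0, 0.0])   # gene 0 is the known disease gene
    print(random_walk_with_restart(adj, seeds))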

    Search based software engineering: Trends, techniques and applications

    In the past five years there has been a dramatic increase in work on Search-Based Software Engineering (SBSE), an approach to Software Engineering (SE) in which Search-Based Optimization (SBO) algorithms are used to address problems in SE. SBSE has been applied to problems throughout the SE lifecycle, from requirements and project planning to maintenance and reengineering. The approach is attractive because it offers a suite of adaptive automated and semi-automated solutions in situations typified by large, complex problem spaces with multiple competing and conflicting objectives. This article provides a review and classification of the literature on SBSE. The work identifies research trends and relationships between the techniques applied and the applications to which they have been applied, and highlights gaps in the literature and avenues for further research.
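
    To make the SBSE idea concrete, the sketch below applies a very simple search (random hill climbing) to a toy test-suite minimization problem; the coverage data, fitness function and search parameters are illustrative assumptions, not taken from the surveyed literature:

    # Illustrative sketch: search-based test-suite minimization with a hill climber.
    # Coverage data and fitness function are made up for illustration.
    import random

    # Hypothetical mapping: test case -> requirements it covers.
    COVERAGE = {"t1": {1, 2}, "t2": {2, 3}, "t3": {3, 4, 5}, "t4": {1, 5}, "t5": {4}}
    ALL_REQS = set().union(*COVERAGE.values())

    def covered(suite):
        return set().union(*(COVERAGE[t] for t in suite)) if suite else set()

    def fitness(suite):
        # Reward covered requirements, penalize suite size.
        return 10 * len(covered(suite)) - len(suite)

    def hill_climb(steps=500, seed=0):
        rng = random.Random(seed)
        current = set(COVERAGE)                                  # start from the full suite
        for _ in range(steps):
            candidate = current ^ {rng.choice(list(COVERAGE))}   # flip one test in/out
            if covered(candidate) == ALL_REQS and fitness(candidate) >= fitness(current):
                current = candidate
        return current

    print(hill_climb())   # a smaller suite that still covers all requirements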

    Estimating, planning and managing Agile Web development projects under a value-based perspective

    Context: The processes of estimating, planning and managing are crucial for software development projects, since the results must be related to several business strategies. The broad expansion of the Internet and the global, interconnected economy mean that Web development projects are often characterized by expressions like delivering as soon as possible, reducing time to market and adapting to undefined requirements. In this kind of environment, traditional methodologies based on predictive techniques sometimes do not offer very satisfactory results. The rise of Agile methodologies and practices has provided some useful tools that, combined with Web Engineering techniques, can help to establish a framework to estimate, manage and plan Web development projects. Objective: This paper presents a proposal for estimating, planning and managing Web projects by combining some existing Agile techniques with Web Engineering principles, presenting them as a unified framework that uses business value to guide the delivery of features. Method: The proposal is analyzed by means of a case study, including a real-life project, in order to obtain relevant conclusions. Results: The results achieved after using the framework in a development project are presented, including interesting results on project planning and estimation, as well as on team productivity throughout the project. Conclusion: It is concluded that the framework can be useful for better managing Web-based projects through a continuous value-based estimation and management process.
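
    The value-driven planning idea can be illustrated with a very small prioritization sketch: rank candidate features by estimated business value per unit of estimated effort. The feature names and numbers below are hypothetical, and this ranking rule is only one simple way to operationalize "value guides delivery", not the paper's framework:

    # Hypothetical value-per-effort ranking of features for release planning.
    features = [
        {"name": "checkout", "value": 80, "effort": 8},   # value and effort are assumed units
        {"name": "search",   "value": 55, "effort": 5},
        {"name": "reviews",  "value": 30, "effort": 6},
    ]

    # Deliver the highest value-per-effort features first.
    plan = sorted(features, key=lambda f: f["value"] / f["effort"], reverse=True)
    for f in plan:
        print(f["name"], round(f["value"] / f["effort"], 2))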

    What Directions for Public Health Under the Affordable Care Act?

    Outlines opportunities for public health efforts under the 2010 healthcare reform law, such as building prevention into insurance expansion and boosting innovation in population health, as well as challenges, such as budget constraints.

    Easy over Hard: A Case Study on Deep Learning

    While deep learning is an exciting new technique, the benefits of this method need to be assessed with respect to its computational cost. This is particularly important for deep learning since these learners need hours (to weeks) to train the model. Such long training times limit the ability of (a) a researcher to test the stability of their conclusion via repeated runs with different random seeds; and (b) other researchers to repeat, improve, or even refute that original work. For example, recently, deep learning was used to find which questions in the Stack Overflow programmer discussion forum can be linked together. That deep learning system took 14 hours to execute. We show here that applying a very simple optimizer called differential evolution (DE) to fine-tune an SVM can achieve similar (and sometimes better) results. The DE approach terminated in 10 minutes, i.e. 84 times faster than the deep learning method. We offer these results as a cautionary tale to the software analytics community and suggest that not every new innovation should be applied without critical analysis. If researchers deploy some new and expensive process, that work should be baselined against some simpler and faster alternatives. Comment: 12 pages, 6 figures, accepted at FSE 2017
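
    The abstract's core claim is that a simple optimizer tuning a conventional learner can rival deep learning at a fraction of the cost. A minimal sketch of that general recipe, using SciPy's differential evolution to tune an SVM's C and gamma on a synthetic dataset (the dataset, search bounds and budget are assumptions, not the paper's Stack Overflow setup), is:

    # Sketch: tune SVM hyperparameters with differential evolution (DE).
    # Dataset, bounds and budget are illustrative, not the paper's setup.
    from scipy.optimize import differential_evolution
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=500, n_features=20, random_state=0)

    def negative_cv_accuracy(params):
        """DE objective: minimize negative cross-validated accuracy."""
        log_c, log_gamma = params
        model = SVC(C=10 ** log_c, gamma=10 ** log_gamma)
        return -cross_val_score(model, X, y, cv=5).mean()

    # Search log10(C) in [-2, 3] and log10(gamma) in [-4, 1].
    result = differential_evolution(negative_cv_accuracy,
                                    bounds=[(-2, 3), (-4, 1)],
                                    maxiter=20, seed=1)
    print("best log10(C), log10(gamma):", result.x, "accuracy:", -result.fun)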