11 research outputs found
Evaluating the Impact of Critical Factors in Agile Continuous Delivery Process: A System Dynamics Approach
Continuous Delivery aims at the frequent delivery of good-quality software in a speedy, reliable, and efficient fashion, with a strong emphasis on automation and team collaboration. However, even with this new paradigm, repeatability of project outcomes is still not guaranteed: project performance varies due to the many interacting and inter-related factors in the Continuous Delivery 'system'. This paper presents results from an investigation of the effects of various factors, in particular agile practices, on the quality of software developed in the Continuous Delivery process. Results show that customer involvement and the cognitive ability of the QA personnel have the most significant individual effects on software quality in Continuous Delivery.
Toward a Model for Customer-Driven Release Management
Undetected software bugs frequently result in service disruptions, productivity losses, and, in some instances, significant threats to human life. One way to prevent such bugs is to engage customers in acceptance testing prior to the production software release, yet there is a considerable lack of empirical examination of the release process from the customer's perspective. To address this research-practice gap, this study proposes a model for customer-driven release management that has been shown to minimize the number of software bugs discovered in production systems. The model was evaluated during a 27-month study at a municipality using the action research method. Following the model, 361 software bugs were detected and eliminated prior to final production releases, confirming the value of customer-driven release management for the elimination of production software bugs.
Source tree composition
Dividing software systems into components improves software reusability as well as software maintainability. Components exist at several levels; we concentrate on the implementation level, where components are formed by source files organized in directory structures. Such source code components are usually strongly coupled within the directory structure of a software system, and their compilation is usually controlled by a single global build process. This entangling of source trees and build processes often makes reuse of source code components in different software systems difficult. It also makes software systems inflexible, because integrating additional source code components into existing source trees and build processes is difficult. This paper aims to increase software reuse by decreasing the coupling of source code components. This is achieved by automated assembly of software systems from reusable source code components and involves the integration of source trees, build processes, and configuration processes. Application domains include generative programming, product-line architectures, and commercial off-the-shelf (COTS) software engineering.
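The idea of automated assembly from reusable source code components can be illustrated with a minimal sketch. The component model and composer below are invented for illustration and do not reproduce the paper's actual tooling: each component carries its own source tree and build steps, and a composer merges the trees under per-component directories and chains the build steps into one global build process.

```python
# Illustrative sketch of source tree composition: each reusable component
# carries its own source tree and build steps, and a composer merges them
# into a single software system with one integrated build process.
# The component model here is hypothetical, for illustration only.
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    sources: dict = field(default_factory=dict)   # relative path -> file contents
    build_steps: list = field(default_factory=list)

def compose(components):
    """Merge component source trees under per-component directories and
    chain their build steps into one global build process."""
    tree, build = {}, []
    for comp in components:
        for path, contents in comp.sources.items():
            tree[f"{comp.name}/{path}"] = contents   # avoid path collisions
        build.extend(comp.build_steps)
    return tree, build

# Two example components assembled into one system.
parser = Component("parser", {"src/parse.c": "/* ... */"},
                   ["cc -c parser/src/parse.c"])
pretty = Component("pretty", {"src/pp.c": "/* ... */"},
                   ["cc -c pretty/src/pp.c"])
tree, build = compose([parser, pretty])
```

Prefixing each component's files with its name keeps the merged tree collision-free, so a component can be reused in a different system without touching its internal layout.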
Information Technology Staff Perceptions of Optimizing End User Interaction during Systems Development
This study explores the perceptions that information technology (IT) professionals have of optimizing end user involvement during application development. The Discovery IT staff at GlaxoSmithKline is studied. Using seven distinct stages of systems development, the study is able to draw precise conclusions about the selection, timing, and degree of end user involvement. It was found that involving intelligent end users early in the development process is beneficial and creates only limited drawbacks.
Quality-driven dynamic software product lines: the case of body sensor networks
Master's dissertation, Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, 2012. Nowadays, individuals take a more active stance in the investigation of diseases, in the sense that they want to monitor their health status continuously. Because it is not sustainable to have a dedicated health professional for each individual, more technological support has been applied to assist this monitoring process. In this context, automated solutions are being proposed, in particular Body Sensor Networks (BSN), in which an individual monitors his vital signs and daily activities and the system aids him in the prevention and detection of emergency situations. A BSN must manage and balance conflicting requirements, such as availability and reliability: if the patient is in a normal or low-risk health situation, the system can turn off some sensors or disable features to save power and processing; on the other hand, when the individual faints or his heartbeat changes dangerously, the opposite should happen with the sensors and features, in order to provide the best service in this high-risk situation. We explore how a Dynamic Software Product Line (DSPL) achieves this goal.
A DSPL reconfigures itself based on context changes, e.g., the person's medical situation, to meet a new quality goal for that new situation, as specified by a reliability contract provided by a domain expert (a medical doctor). This contract is modeled as a state machine whose transitions are medical events (e.g., a fall, a faint, a change in blood pressure) and whose states define target reliability goals; detecting non-conformance with the current goal prompts a reconfiguration of the system. The reliability of any given configuration is measured by a single formula, parametrized over the presence or absence of the features of the DSPL and their associated quality information. Besides reliability, we also explore other quality parameters such as estimated system lifetime, sensor sample rate, and the quality and amount of information of the configurations. Strategies for calculating quality, namely the Simple MultiAttribute Rating Technique (SMART) and the goal-oriented (GOAL) strategy, are compared in the BSN domain. We evaluated the proposed approach via simulations with real monitoring data and obtained results favorable to the use of the proposed methodology in the BSN context.
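The contract-as-state-machine idea can be sketched in a few lines. The states, events, reliability values, and serial-composition reliability formula below are all illustrative assumptions, not the dissertation's actual model: transitions are triggered by medical events, each state carries a target reliability goal, and a configuration's reliability is computed from its active features.

```python
# Hypothetical sketch of a reliability contract for a BSN product line:
# a state machine whose transitions are medical events and whose states
# carry target reliability goals. All states, events, and numbers are
# invented for illustration.
CONTRACT = {
    # state: (target_reliability, {event: next_state})
    "normal":    (0.90, {"fall": "emergency", "pressure_change": "alert"}),
    "alert":     (0.95, {"stabilized": "normal", "fall": "emergency"}),
    "emergency": (0.99, {"stabilized": "alert"}),
}

def next_state(state, event):
    """Follow a contract transition; unknown events leave the state unchanged."""
    return CONTRACT[state][1].get(event, state)

def configuration_reliability(feature_reliabilities, active):
    """Reliability of a configuration as the product of the reliabilities
    of its active features (a simple serial-composition assumption)."""
    r = 1.0
    for feature in active:
        r *= feature_reliabilities[feature]
    return r

features = {"ecg_sensor": 0.999, "fall_detector": 0.98, "gps": 0.95}

state = next_state("normal", "fall")                 # a fall raises the goal
goal = CONTRACT[state][0]
current = configuration_reliability(features, {"ecg_sensor", "fall_detector"})
needs_reconfiguration = current < goal               # triggers the DSPL to adapt
```

When the current configuration falls short of the new state's goal, the product line would re-derive a configuration (e.g., enabling more features) that satisfies it, which is the reconfiguration step the abstract describes.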
Software Service Innovation: An Action Research into Release Cycle Management
Fierce competition in the market is driving software vendors to rely on Software-as-a-Service (SaaS) strategies and to continuously match new software versions with customers' needs and competitors' moves. Although release management, as a recurrent activity related to SaaS, arguably shapes how a vendor services its customers, the literature is surprisingly limited on how software releases are managed to support SaaS strategies. Against this backdrop, we present a collaborative action research study with Software Inc., a large multinational software provider, focused on improving the release cycle management process for a complex security software service. The study is part of a comprehensive intervention into Software Inc. that combines a perspective rooted in software process improvement and engineering practices with one rooted in service delivery and customer interactions. The part reported in this dissertation draws on the service-dominant logic framework to analyze how the release cycle management process was organized to improve Software Inc.'s ongoing value co-creation with its customers. As a result, the study contributed to improving release cycle management at Software Inc., and it expands industry knowledge about the challenges and opportunities for software vendors to manage releases and improve the value delivered to and co-created with their customers. This added knowledge is of interest to both practitioners and researchers as SaaS strategies increasingly shape the industry, with important implications for how software is released.
Improving Recurrent Software Development: A Contextualist Inquiry Into Release Cycle Management
Software development is increasingly conducted in a recurrent fashion, where the same product or service is continuously being developed for the marketplace. Still, we lack detailed studies about this particular context of software development. Against this backdrop, this dissertation presents an action research study into Software Inc., a large multinational software provider. The research addressed the challenges the company faced in managing releases and organizing software process improvement (SPI) to help recurrently develop and deliver a specific product, Secure-on-Request, to its customers and the wider marketplace. The initial problem situation was characterized by the recent acquisition of additional software, the complexity of service delivery, new engineering and product management teams, and low software development process maturity. Asking how release management can be organized and improved in the context of recurrent development of software, we draw on Pettigrew's contextualist inquiry to focus on the ongoing interaction between content, context, and process to organize and improve release cycle practices and outcomes. As a result, the dissertation offers two contributions. Practically, it contributes to the resolution of the problem situation at Software Inc. Theoretically, it introduces a new software engineering discipline, release cycle management (RCM), focused on recurrent delivery of software, including SPI as an integral part, and grounded in the specific experiences at Software Inc.
Analysis and Development of Instrument Software Paradigms: Conception and Implementation of a New Instrument Control and Data Acquisition System, Proven by Material Scientific Applications
During the last 50 years, the quality of analysis methods in many scientific disciplines has been enhanced by electronic applications, automation, and data processing. While the features, performance, and usability of these processes have been continually enhanced, it is conspicuous that the majority of institutes operate their own proprietary software. This situation arises for both historical and financial reasons, plus a wish to retain autonomy, fuelled by the requirement for a system that remains compatible with both new and legacy hardware.
This thesis reviews commonly used scientific software systems and their stakeholders and tries to identify generic problems. The demands on instrument systems are summarized in a requirement specification. Based on these requirements, a basic concept is developed that reflects the current state of the art in software design and may provide a blueprint for instrument system architectures. The results are used to create a proof-of-concept implementation. Core to this approach is an application server that comes with a container, which makes use of the Inversion-of-Control pattern to loosely couple and execute components. These do not need to implement fixed interfaces and are thus decoupled from a specific use case. Components can, for example, be proxies that control and acquire data from legacy hardware, perform calculations, provide a human-machine interface, or act as storage. They are dynamically wired to experiments using XML-based Assembly files. Both Assemblies and Components can be published using a central store on a collaboration platform and shared by the community. This increases reusability and allows the use of existing Assemblies with new hardware by simply replacing the hardware proxy modules.
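The container-based wiring described above can be sketched as follows. This is a minimal, hypothetical illustration of the Inversion-of-Control idea, not the thesis's actual implementation: the container instantiates loosely coupled components from a declarative assembly (standing in for the XML Assembly files) and injects dependencies by name, so components need no fixed interfaces.

```python
# Minimal Inversion-of-Control sketch: a container builds components from
# a declarative assembly and injects their dependencies. Component names
# and classes are invented for illustration.
class Container:
    def __init__(self):
        self._factories = {}
        self._instances = {}

    def register(self, name, factory):
        self._factories[name] = factory

    def assemble(self, assembly):
        # assembly: list of (instance_name, component_name, dependency_names),
        # analogous to entries in an XML-based Assembly file
        for instance, component, deps in assembly:
            args = [self._instances[d] for d in deps]
            self._instances[instance] = self._factories[component](*args)
        return self._instances

# Example components: a hardware proxy and a storage sink.
class MotorProxy:
    def read(self):
        return {"position": 42}

class NexusStore:
    def __init__(self, source):
        self.source = source          # injected by the container
    def snapshot(self):
        return self.source.read()

c = Container()
c.register("motor_proxy", MotorProxy)
c.register("nexus_store", NexusStore)
wiring = c.assemble([
    ("motor", "motor_proxy", []),
    ("store", "nexus_store", ["motor"]),
])
result = wiring["store"].snapshot()   # delegates to the injected proxy
```

Because the store only depends on whatever object the container injects, swapping the legacy hardware proxy for a new one changes the assembly, not the components, which is the reuse property the thesis aims for.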
Example components have been provided for access to legacy and new instrument hardware, the storage of results in the NeXus format, data reduction, simulation with McStas, the execution of customizable scans, and the visualization of data.