
    Technical Debt Prioritization: State of the Art. A Systematic Literature Review

    Background. Software companies need to manage and refactor Technical Debt issues. Therefore, it is necessary to understand if and when refactoring Technical Debt should be prioritized with respect to developing features or fixing bugs. Objective. The goal of this study is to investigate the existing body of knowledge in software engineering to understand what Technical Debt prioritization approaches have been proposed in research and industry. Method. We conducted a Systematic Literature Review of 384 unique papers published until 2018, following a consolidated methodology applied in Software Engineering. We included 38 primary studies. Results. Different approaches have been proposed for Technical Debt prioritization, all having different goals and optimizing on different criteria. The proposed measures capture only a small part of the plethora of factors used to prioritize Technical Debt qualitatively in practice. We report an impact map of such factors. However, there is a lack of empirically validated tools. Conclusion. We observed that Technical Debt prioritization research is preliminary and there is no consensus on what the important factors are and how to measure them. Consequently, we cannot consider current research conclusive; in this paper, we outline different directions for necessary future investigations.
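    As a rough illustration of the kind of quantitative prioritization approach this review surveys (not one proposed by the review itself), a cost/benefit scoring over a few factors might look as follows; the item names, factors, and weights are hypothetical.

    # Illustrative (hypothetical) Technical Debt prioritization by weighted scoring.
    # Factor names, weights, and items are assumptions, not taken from the review.

    debt_items = [
        {"name": "god-class refactoring", "interest": 8, "principal": 13, "severity": 0.9},
        {"name": "missing unit tests",    "interest": 5, "principal": 5,  "severity": 0.6},
        {"name": "outdated dependency",   "interest": 3, "principal": 2,  "severity": 0.4},
    ]

    def score(item, w_interest=0.5, w_severity=0.3, w_principal=0.2):
        """Higher score = refactor sooner: high interest and severity, low principal."""
        return (w_interest * item["interest"]
                + w_severity * item["severity"] * 10
                - w_principal * item["principal"])

    for item in sorted(debt_items, key=score, reverse=True):
        print(f'{item["name"]}: priority score {score(item):.1f}')

    Real approaches differ mainly in which factors they measure and how they weight them, which is exactly where the review finds no consensus.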

    Generative Diffusion of Innovations: An Organizational Genetics Approach

    Innovation in open ecosystems such as open source software is characterized by generative diffusion, the property of such ecosystems to evolve and change over time through the actions of uncoordinated participants. In this research, we contend that existing models of diffusion are not adequate to capture the multi-faceted nature of generative diffusion. To address this challenge, we use concepts from the biological sciences to propose a multi-dimensional perspective for studying generative diffusion, and construct three metrics: proliferation, evolvability, and temporality. Further, we use techniques inspired by genetics to measure these constructs in the context of open source software. In this research-in-progress manuscript, we demonstrate the applicability of our work with one example of an open source software project. This study contributes not only to the study of open innovation, but also makes a methodological contribution by introducing the use of evolutionary genetics to study digital artifacts.

    VCF2Networks: applying genotype networks to single-nucleotide variants data

    Summary: A wealth of large-scale genome sequencing projects opens the doors to new approaches to studying the relationship between genotype and phenotype. One such opportunity is the possibility of applying genotype network analysis to population genetics data. Genotype networks are a representation of the set of genotypes associated with a single phenotype, and they allow one to estimate properties such as the robustness of the phenotype to mutations and the ability of its associated genotypes to evolve new adaptations. So far, though, genotype network analysis has rarely been applied to population genetics data. To help fill this gap, here we present VCF2Networks, a tool to determine and study genotype network structure from single-nucleotide variant data. Availability and implementation: VCF2Networks is available at https://bitbucket.org/dalloliogm/vcf2networks. Contact: [email protected] Supplementary information: Supplementary data are available at Bioinformatics online.
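    A minimal sketch of the underlying idea, not the VCF2Networks implementation: a genotype network connects the genotypes observed for one phenotype whenever they differ at a single site, and network statistics then summarize robustness and evolvability. The toy genotypes below are invented.

    # Illustrative genotype network built with networkx; toy data only.
    import networkx as nx

    genotypes = ["0010", "0011", "0111", "1111", "1011", "0000"]  # toy binary genotypes

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    g = nx.Graph()
    g.add_nodes_from(genotypes)
    g.add_edges_from((a, b) for i, a in enumerate(genotypes)
                     for b in genotypes[i + 1:] if hamming(a, b) == 1)

    # Statistics such as the size of the largest connected component or the graph
    # density are the kinds of properties used to reason about the phenotype's
    # robustness to mutation and its capacity to reach new genotypes.
    largest = max(nx.connected_components(g), key=len)
    print(len(largest), nx.density(g))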

    Evolvable Smartphone-Based Platforms for Point-Of-Care In-Vitro Diagnostics Applications

    The association of smart mobile devices and lab-on-chip technologies offers unprecedented opportunities for the emergence of direct-to-consumer in vitro medical diagnostics applications. Despite their clear transformative potential, obstacles remain to the large-scale disruption and long-lasting success of these systems in the consumer market. For instance, the increasing level of complexity of instrumented lab-on-chip devices, coupled to the sporadic nature of point-of-care testing, threatens the viability of a business model mainly relying on disposable/consumable lab-on-chips. We argued recently that system evolvability, defined as the design characteristic that facilitates more manageable transitions between system generations via the modification of an inherited design, can help remedy these limitations. In this paper, we discuss how platform-based design can constitute a formal entry point to the design and implementation of evolvable smart device/lab-on-chip systems. We present both a hardware/software design framework and the implementation details of a platform prototype enabling at this stage the interfacing of several lab-on-chip variants relying on current- or impedance-based biosensors. Our findings suggest that several change-enabling mechanisms implemented in the higher abstraction software layers of the system can promote evolvability, together with the design of change-absorbing hardware/software interfaces. Our platform architecture is based on a mobile software application programming interface coupled to a modular hardware accessory. It allows the specification of lab-on-chip operation and post-analytic functions at the mobile software layer. We demonstrate its potential by operating a simple lab-on-chip to carry out the detection of dopamine using various electroanalytical methods.
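    A hedged sketch of what such a change-absorbing software interface could look like: a common driver abstraction at the mobile software layer behind which current- and impedance-based chip variants can be swapped. The class and method names are hypothetical, not the paper's actual API.

    # Hypothetical change-absorbing interface for lab-on-chip variants (illustrative only).
    from abc import ABC, abstractmethod

    class LabOnChipDriver(ABC):
        """Stable interface the mobile application layer programs against."""

        @abstractmethod
        def run_assay(self, protocol: dict) -> list[float]:
            """Execute an assay and return raw sensor readings."""

    class AmperometricChip(LabOnChipDriver):
        def run_assay(self, protocol):
            # Current-based biosensor: would stream current samples from the accessory.
            return [0.12, 0.15, 0.14]

    class ImpedanceChip(LabOnChipDriver):
        def run_assay(self, protocol):
            # Impedance-based biosensor: would return impedance magnitudes per frequency.
            return [1020.0, 980.0, 1003.0]

    def analyze(driver: LabOnChipDriver, protocol: dict) -> float:
        """Post-analytic step defined once at the software layer; a new chip
        generation only needs a new driver, not a new application."""
        readings = driver.run_assay(protocol)
        return sum(readings) / len(readings)

    print(analyze(AmperometricChip(), {"method": "cyclic_voltammetry"}))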

    On Integrating Student Empirical Software Engineering Studies with Research and Teaching Goals

    Background: Many empirical software engineering studies use students as subjects and are conducted as part of university courses. Aim: We aim to report our experiences with using guidelines for integrating empirical studies with our research and teaching goals. Method: We document our experience from conducting three studies with graduate students in two software architecture courses. Results: Our results show some problems that we faced when following the guidelines and deviations we made from the original guidelines. Conclusions: Based on our results, we propose recommendations for empirical software engineering studies that are integrated into university courses.

    Technology Assessment of High Capacity Data Storage Systems: Can We Avoid a Data Survivability Crisis?

    This technology assessment of long-term high-capacity data storage systems identifies an emerging crisis of severe proportions related to preserving important historical data in science, healthcare, manufacturing, finance and other fields. For the last 50 years, the information revolution, which has engulfed all major institutions of modern society, has centered itself on data: their collection, storage, retrieval, transmission, analysis and presentation. The transformation of long-term historical data records into information concepts is, according to Drucker, the next stage in this revolution towards building new information-based scientific and business foundations. For this to occur, the data survivability, reliability and evolvability of long-term storage media and systems pose formidable technological challenges. Unlike the Y2K problem, where the clock is ticking and a crisis is set to go off at a specific time, large-capacity data storage repositories face a crisis similar to that of the social security system, in that the seriousness of the problem emerges after a decade or two. The essence of the storage crisis is as follows: since it could take a decade to migrate a petabyte of data to new media for preservation, and the life expectancy of the storage media itself is only a decade, it may not be possible to complete the transfer before an irrecoverable data loss occurs. Over the last two decades, a number of anecdotal crises have occurred where vital scientific and business data were lost, or would have been lost if not for major expenditures of resources and funds to save the data, much like what is happening today to solve the Y2K problem. A prime example was the joint NASA/NSF/NOAA effort to rescue eight years' worth of TOVS/AVHRR data from an obsolete system, without which the valuable 20-year satellite record of global warming would not exist. Current storage system solutions to long-term data survivability rest on scalable architectures having parallel paths for data migration.
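    To make the migration arithmetic concrete under assumed figures (the throughput and duty cycle below are illustrative, not taken from the assessment), a single migration path over a petabyte can indeed approach the lifetime of the media itself.

    # Back-of-the-envelope check of the migration-vs-media-lifetime argument.
    # Throughput and duty cycle are assumed figures, not from the original assessment.
    petabyte = 1e15            # bytes to migrate
    throughput = 5e6           # ~5 MB/s sustained per drive (assumed)
    duty_cycle = 0.5           # fraction of time actually spent copying (assumed)

    seconds = petabyte / (throughput * duty_cycle)
    years = seconds / (365 * 24 * 3600)
    print(f"Single-stream migration time: {years:.1f} years")   # roughly 13 years

    # If the media lasts only about a decade, one migration path cannot finish in
    # time; parallel paths, as the abstract notes, are needed to close the gap.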

    NASA space station automation: AI-based technology review. Executive summary

    Research and Development projects in automation technology for the Space Station are described. Artificial Intelligence (AI) based technologies are planned to enhance crew safety through a reduced need for EVA, increase crew productivity through the reduction of routine operations, increase space station autonomy, and augment space station capability through the use of teleoperation and robotics.

    Robustness - a challenge also for the 21st century: A review of robustness phenomena in technical, biological and social systems as well as robust approaches in engineering, computer science, operations research and decision aiding

    Notions of robustness exist in many facets. They come from different disciplines and reflect different worldviews. Consequently, they often contradict each other, which makes the term less applicable in a general context. Robustness approaches are often limited to the specific problems for which they have been developed. This means that notions and definitions might turn out to be wrong if put into another domain of validity, i.e. context. A definition might be correct in a specific context but need not hold in another. Therefore, in order to be able to speak of robustness, we need to specify the domain of validity, i.e. the system, property and uncertainty of interest. As proved by Ho et al. in an optimization context with finite and discrete domains, without prior knowledge about the problem there exists no solution whatsoever that is more robust than any other. Similar to the results of the No Free Lunch Theorems of Optimization (NFLTs), we have to exploit the problem structure in order to make a solution more robust. This optimization problem is directly linked to a robustness/fragility tradeoff which has been observed in many contexts, e.g. the 'robust, yet fragile' property of HOT (Highly Optimized Tolerance) systems. Another issue is that robustness is tightly bound to other phenomena, such as complexity, for which there is likewise no clear definition or theoretical framework. Consequently, this review tries to find common aspects within many different approaches and phenomena rather than to build a general theorem for robustness, which might not exist anyway, because complex phenomena often need to be described from a pluralistic view to address as many aspects of a phenomenon as possible. First, many different robustness problems are reviewed from many different disciplines. Second, different common aspects are discussed, in particular the relationship of functional and structural properties. This paper argues that robustness phenomena are also a challenge for the 21st century. Robustness is a useful quality of a model or system in terms of the 'maintenance of some desired system characteristics despite fluctuations in the behaviour of its component parts or its environment' (see [Carlson and Doyle, 2002], p. 2). We define robustness phenomena as solutions with balanced tradeoffs, and robust design principles and robustness measures as means to balance tradeoffs.
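    As a toy illustration of the 'robust, yet fragile' tradeoff discussed above (the performance model, margins, and disturbance distribution are invented for the example, not drawn from the review), a design tuned purely for nominal performance can be best when nothing fluctuates yet collapse once disturbances are taken into account.

    # Toy robustness/fragility tradeoff; all numbers are illustrative assumptions.
    import random

    def performance(design_margin, disturbance):
        """Nominal performance falls with added margin, but a design fails
        outright once the disturbance exceeds its margin."""
        nominal = 1.0 - 0.3 * design_margin
        return nominal if disturbance <= design_margin else 0.0

    random.seed(0)
    disturbances = [abs(random.gauss(0.0, 0.2)) for _ in range(10_000)]

    for margin in (0.0, 0.2, 0.5):
        avg = sum(performance(margin, d) for d in disturbances) / len(disturbances)
        print(f"margin={margin:.1f}  expected performance={avg:.3f}")
    # The highly optimized design (margin=0.0) wins only in the disturbance-free
    # case; under fluctuations, the designs carrying margin dominate on average.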