
    Empirical studies of open source evolution

    Copyright © 2008 Springer-Verlag. This chapter presents a sample of empirical studies of Open Source Software (OSS) evolution. According to these studies, the classical results from studies of proprietary software evolution, such as Lehman’s laws of software evolution, may need to be revised, at least in part if not fully, to account for the OSS observations. The chapter summarises what appears to be the empirical status of each of Lehman’s laws with respect to OSS and highlights the threats to validity that frequently emerge in these empirical studies. It also discusses related topics for further research.
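
    As a rough illustration of the kind of check such studies perform (not code from the chapter itself), the sketch below fits a linear trend to hypothetical module counts across releases, the sort of test used to confront Lehman's law of continuing growth with OSS data; the release figures and variable names are invented.

    # Sketch: testing "continuing growth" against hypothetical release data.
    # A real study would extract module counts from a project's repository.
    import numpy as np

    releases = np.arange(1, 11)                    # release sequence numbers
    modules = np.array([120, 135, 150, 170, 185,   # hypothetical system size
                        210, 230, 260, 275, 300])  # (number of modules)

    # A clearly positive slope is consistent with continuing growth.
    slope, intercept = np.polyfit(releases, modules, 1)
    print(f"growth per release: {slope:.1f} modules")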

    Effort estimation of FLOSS projects: A study of the Linux kernel

    Copyright © 2011 Springer. Empirical research on Free/Libre/Open Source Software (FLOSS) has shown that developers tend to cluster around two main roles: “core” contributors differ from “peripheral” developers in terms of a larger number of responsibilities and a higher productivity pattern. A further, cross-cutting characterization of developers could be achieved by associating developers with “time slots”, and different patterns of activity and effort could be associated with such slots. Such analysis, if replicated, could be used not only to compare different FLOSS communities and to evaluate their stability and maturity, but also to determine, within projects, how the effort is distributed in a given period, and to estimate future needs with respect to key points in the software life-cycle (e.g., major releases). This study analyses the activity patterns within the Linux kernel project, at first focusing on the overall distribution of effort and activity within weeks and days, then dividing each day into three 8-hour time slots and focusing on effort and activity around major releases. These analyses have the objective of evaluating effort, productivity and types of activity both globally and around major releases. They enable a comparison of these releases and patterns of effort and activity with traditional software products and processes and, in turn, the identification of company-driven projects (i.e., working mainly during office hours) among FLOSS endeavors. The results of this research show that, overall, the effort within the Linux kernel community is constant (albeit at different levels) throughout the week, signalling the need for updated estimation models, different from those used in traditional 9am–5pm, Monday-to-Friday commercial companies. It also becomes evident that the activity before a release is vastly different from that after a release, and that the changes show an increase in code complexity in specific time slots (notably in the late night hours), which will later require additional maintenance effort.
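
    A minimal sketch of the slot-based analysis described above (assuming commit timestamps have already been extracted, e.g. from git log; the function name, slot labels and sample data are illustrative, not the authors' tooling):

    # Sketch: count commits per weekday and per 8-hour time slot.
    from collections import Counter
    from datetime import datetime

    SLOTS = {0: "night (00-08)", 1: "office (08-16)", 2: "evening (16-24)"}

    def activity_profile(timestamps):
        """Count commits per (weekday, 8-hour slot) pair."""
        profile = Counter()
        for ts in timestamps:
            dt = datetime.fromisoformat(ts)
            profile[(dt.strftime("%A"), SLOTS[dt.hour // 8])] += 1
        return profile

    # Hypothetical commit times; a real analysis would read thousands from the log.
    sample = ["2011-03-14T02:10:00", "2011-03-14T11:45:00", "2011-03-18T22:05:00"]
    for (day, slot), n in sorted(activity_profile(sample).items()):
        print(f"{day:<9} {slot:<16} {n} commit(s)")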

    A framework for the simulation of structural software evolution

    Copyright © 2008 ACM. As functionality is added to an aging piece of software, its original design and structure will tend to erode. This can lead to high coupling, low cohesion and other undesirable effects associated with spaghetti architectures. The underlying forces that cause such degradation have been the subject of much research. However, progress in this field is slow, as its complexity makes it difficult to isolate the causal flows leading to these effects. This is further complicated by the difficulty of generating empirical data in sufficient quantity and attributing such data to specific points in the causal chain. This article describes a framework for simulating the structural evolution of software. A complete simulation model is built by incrementally adding modules to the framework, each of which contributes an individual evolutionary effect. These effects are then combined to form a multifaceted simulation that evolves a fictitious code base in a manner approximating real-world behavior. We describe the underlying principles and structures of our framework from a theoretical and user perspective; a validation of a simple set of evolutionary parameters is then provided, and three empirical software studies generated from open-source software (OSS) are used to support claims and generated results. The research illustrates how simulation can be used to investigate a complex and under-researched area of the development cycle. It also shows the value of incorporating certain human traits into a simulation: factors that, in real-world system development, can significantly influence evolutionary structures.
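
    The following toy model (a sketch under assumptions of our own, not the authors' framework) illustrates the general idea of simulating structural evolution: modules are added over time, new code attaches preferentially to already well-connected modules, and average coupling is tracked as the fictitious code base grows.

    # Toy structural-evolution simulation; parameters and growth rule are illustrative.
    import random

    def simulate(steps=100, links_per_module=2, seed=42):
        rng = random.Random(seed)
        deps = {0: set()}                  # module id -> outgoing dependencies
        for new in range(1, steps):
            # New functionality tends to attach to well-connected modules,
            # one plausible driver of eroding structure.
            weights = [1 + len(d) for d in deps.values()]
            targets = rng.choices(list(deps), weights=weights,
                                  k=min(links_per_module, len(deps)))
            deps[new] = set(targets)
        return sum(len(d) for d in deps.values()) / len(deps)

    print(f"average outgoing dependencies: {simulate():.2f}")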

    Adapting the “Staged Model for Software Evolution” to FLOSS

    Research into traditional software evolution has been tackled from two broad perspectives: one focused on the how, which looks at the processes, methods and techniques used to implement and evolve software; and one focused on the what/why, aiming at an understanding of the drivers and general characteristics of the software evolution phenomenon. The two perspectives are related in various ways: the study of the what/why is, for instance, essential to achieve an appropriate management of software engineering activities and to guide innovation in processes, methods and tools, that is, the how. The output of the what/why studies is exemplified by empirical hypotheses such as the staged model of software evolution. This paper focuses on the commonalities and differences between the evolution and lifecycle patterns of traditional commercial systems and free/libre/open source software (FLOSS) systems. The existing staged model of software evolution is then revised to reflect its applicability to FLOSS systems.

    Parametrizations of Inclusive Cross Sections for Pion Production in Proton-Proton Collisions

    Accurate knowledge of cross sections for pion production in proton-proton collisions finds wide application in particle physics, astrophysics, cosmic ray physics and space radiation problems, especially in situations where an incident proton is transported through some medium and one requires knowledge of the output particle spectrum given the input spectrum. In such cases, accurate parametrizations of the cross sections are desired. In this paper we review much of the experimental data and compare them to a wide variety of different cross section parametrizations. In so doing, we provide parametrizations of neutral and charged pion cross sections which give a very accurate description of the experimental data. Lorentz-invariant differential cross sections, spectral distributions and total cross section parametrizations are presented. Comment: 32 pages with 15 figures. Published in Physical Review D62, 094030.
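
    For reference (a standard definition, not a result quoted from the paper), the quantity usually parametrized in such work is the Lorentz-invariant differential cross section for inclusive pion production, whose integral over momentum gives the multiplicity-weighted inelastic cross section:

    \[
      E\,\frac{d^{3}\sigma}{dp^{3}}\,(p + p \to \pi + X),
      \qquad
      \int E\,\frac{d^{3}\sigma}{dp^{3}}\,\frac{d^{3}p}{E}
      \;=\; \langle n_{\pi}\rangle\,\sigma_{\mathrm{inel}},
    \]

    where \(\langle n_{\pi}\rangle\) is the mean pion multiplicity per inelastic collision.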

    Are Developers Fixing Their Own Bugs? Tracing Bug-fixing and Bug-seeding Committers

    The process of fixing software bugs plays a key role in the maintenance activities of a software project. Ideally, code ownership and responsibility should be enforced among developers working on the same artifacts, so that those introducing buggy code could also contribute to its fix. However, especially in FLOSS projects, this mechanism is not clearly understood: in particular, it is not known whether the contributors fixing a bug are the same ones who introduced and seeded it in the first place. This paper aims to study this issue by analysing the comm-central FLOSS project, which hosts part of the Thunderbird, SeaMonkey, Lightning extensions and Sunbird projects from the Mozilla community. The analysis is performed at the level of lines of code and uses the information stored in the source code management system. The results of this study show, at first, that in 80% of the cases the bug-fixing activity involves source code modified by at most two developers. It also emerges that the developers fixing the bug are responsible for only 3.5% of the previous modifications to the lines affected; this implies that the other developers making changes to those lines could have made that fix. We conclude that, in most cases, the bug-fixing process in comm-central is not carried out by the same developers as those who seeded the buggy code.
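
    A minimal sketch of the line-level tracing technique described above (assuming a git mirror of the repository and using git blame; the repository path, commit hash and function name are hypothetical, and comm-central itself is hosted in Mercurial):

    # Sketch: find who last modified the lines that a fix commit changes.
    import subprocess

    def previous_authors(repo, fix_commit, path, line_ranges):
        """Authors who last touched the given line ranges before the fix."""
        authors = set()
        for start, end in line_ranges:
            out = subprocess.run(
                ["git", "-C", repo, "blame", "--line-porcelain",
                 f"-L{start},{end}", f"{fix_commit}^", "--", path],
                capture_output=True, text=True, check=True).stdout
            for line in out.splitlines():
                if line.startswith("author "):
                    authors.add(line[len("author "):])
        return authors

    # Comparing this set with the fix commit's author shows whether the
    # fixer is also one of the likely bug-seeding committers.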

    HEP Applications Evaluation of the EDG Testbed and Middleware

    Workpackage 8 of the European Datagrid project was formed in January 2001 with representatives from the four LHC experiments, and with experiment-independent people from five of the six main EDG partners. In September 2002 WP8 was strengthened by the addition of effort from BaBar and D0. The original mandate of WP8 was, following the definition of short- and long-term requirements, to port experiment software to the EDG middleware and testbed environment. A major additional activity has been testing the basic functionality and performance of this environment. This paper reviews experiences and evaluations in the areas of job submission, data management, mass storage handling, information systems and monitoring. It also comments on the problems of remote debugging, the portability of code, and scaling problems with increasing numbers of jobs, sites and nodes. Reference is made to the pioneering work of Atlas and CMS in integrating the use of the EDG Testbed into their data challenges. A forward look is made to essential software developments within EDG and to the necessary cooperation between EDG and LCG for the LCG prototype due in mid-2003. Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics Conference (CHEP03), La Jolla, CA, USA, March 2003, 7 pages. PSN THCT00.

    Some Findings Concerning Requirements in Agile Methodologies

    Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in the literature, in most cases within the frame of conventional methods. However, conclusions drawn for conventional methodologies are not necessarily valid for agile ones; on some issues, conventional and agile processes are radically different. As recent surveys report, inadequate project requirements are one of the most problematic issues in agile approaches, and a better understanding of this area is needed. This paper describes some findings concerning requirements activities in a project developed under an agile methodology. The project intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and the management of requirements dependencies.

    Fluctuations in Hadronic and Nuclear Collisions

    We investigate several fluctuation effects in high-energy hadronic and nuclear collisions through the analysis of different observables. To introduce fluctuations in the initial stage of collisions, we use the Interacting Gluon Model (IGM) modified by the inclusion of the impact parameter. The inelasticity and leading-particle distributions follow directly from this model. The fluctuation effects on rapidity distributions are then studied by using Landau's Hydrodynamic Model in one dimension. To investigate further the effects of the multiplicity fluctuation, we use the Longitudinal Phase-Space Model, with the multiplicity distribution calculated within the hydrodynamic model and the initial conditions given by the IGM. The forward-backward correlation is obtained in this way. Comment: 22 pages, RevTex, 8 figures (included); invited paper for the special issue of Foundations of Physics dedicated to Mikio Namiki's 70th birthday.
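
    As background (a textbook form, not a formula quoted from the paper), Landau's one-dimensional hydrodynamic model is commonly used in its Gaussian approximation for the rapidity distribution,

    \[
      \frac{dN}{dy} \;\approx\; \frac{N}{\sqrt{2\pi L}}\,
      \exp\!\left(-\frac{y^{2}}{2L}\right),
      \qquad L = \ln\!\left(\frac{\sqrt{s}}{2m_{p}}\right),
    \]

    where \(\sqrt{s}\) is the centre-of-mass energy and \(m_{p}\) the proton mass.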