
    Week of December 14, 2015

    •Summer Fellowship Pays Off in the Form of Published Research
    •Department of Surgery Hosts Twelfth Annual Louis R.M. Del Guercio, M.D., Distinguished Visiting Professorship and Research Day
    •D.P.T. Students Share Their Community Service Projects
    •Department of Pediatrics Hosts Fifth Annual Assistant Professor Pediatric Research Symposium
    •NYMC Serves Up International Food and Handicrafts
    •P2P Committee Recognizes Role Models at NYMC

    Investigating the Impact of Continuous Integration Practices on the Productivity and Quality of Open-Source Projects

    Background: Much research has been conducted to investigate the impact of Continuous Integration (CI) on the productivity and quality of open-source projects. Most studies have analyzed the impact of adopting a CI server service (e.g., Travis CI) but did not analyze CI sub-practices. Aims: We aim to evaluate the impact of five CI sub-practices with respect to the productivity and quality of GitHub open-source projects. Method: We collect CI sub-practices of 90 relevant open-source projects for a period of 2 years. We use regression models to analyze whether projects upholding the CI sub-practices are more productive and/or generate fewer bugs. We also perform a qualitative document analysis to understand whether CI best practices are related to a higher quality of projects. Results: Our findings reveal a correlation between the Build Activity and Commit Activity sub-practices and the number of merged pull requests. We also observe a correlation between the Build Activity, Build Health, and Time to Fix Broken Builds sub-practices and the number of bug-related issues. The qualitative analysis reveals that projects with the best values for CI sub-practices face fewer CI-related problems than projects that exhibit the worst values for CI sub-practices. Conclusions: We recommend that projects strive to uphold the CI sub-practices, as they can affect the productivity and quality of projects. Comment: Paper accepted for publication by the ACM/IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM).
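
    As a rough illustration of the regression step described above, the sketch below fits an ordinary least squares model relating per-project CI measurements to merged pull requests. This is a minimal sketch: the column names and figures are invented placeholders, and OLS merely stands in for whatever regression family the paper actually applied.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical per-project measurements; the variable names are
    # illustrative, not the paper's exact operationalizations.
    projects = pd.DataFrame({
        "merged_prs":        [310, 120, 450, 95, 220, 180],   # productivity proxy
        "build_activity":    [0.9, 0.4, 0.95, 0.3, 0.7, 0.6], # share of commits built
        "commit_activity":   [40, 12, 55, 8, 25, 20],         # commits per week
        "time_to_fix_build": [6, 48, 4, 72, 20, 30],          # hours to repair a broken build
    })

    # Fit the model; coefficient signs and p-values indicate whether a
    # sub-practice is associated with more merged pull requests.
    model = smf.ols("merged_prs ~ build_activity + commit_activity + time_to_fix_build",
                    data=projects).fit()
    print(model.summary())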

    Innovation and research in organic farming: A multi‐level approach to facilitate cooperation among stakeholders

    A wider range of stakeholders is expected to be involved in organic research. A decision‐support tool is therefore needed to define priorities and to allocate tasks among institutions. Drawing on their research and management experience in organic research, the authors have developed a framework for experimental and research projects. The framework is based on a multi‐level approach: each level is defined according to how directly the innovation impacts organic systems. The projects carried out at each level were assessed over a ten-year period. Two applications are presented: an analysis of crop protection strategies in horticulture and of plant breeding programmes. When combined with four development models of organic farming, this multi‐level analysis appears promising for defining research agendas.

    PICES Press, Vol. 18, No. 2, Summer 2010

    •The 2010 Inter-sessional Science Board Meeting: A Note from the Science Board Chairman (pp. 1-3)
    •2010 Symposium on “Effects of Climate Change on Fish and Fisheries” (pp. 4-11)
    •2009 Mechanism of North Pacific Low Frequency Variability Workshop (pp. 12-14)
    •The Fourth China-Japan-Korea GLOBEC/IMBER Symposium (pp. 15-17, 23)
    •2010 Sendai Ocean Acidification Workshop (pp. 18-19, 31)
    •2010 Sendai Coupled Climate-to-Fish-to-Fishers Models Workshop (pp. 20-21)
    •2010 Sendai Salmon Workshop on Climate Change (pp. 22-23)
    •2010 Sendai Zooplankton Workshop (pp. 24-25, 28)
    •2010 Sendai Workshop on “Networking across Global Marine Hotspots” (pp. 26-28)
    •The Ocean, Salmon, Ecology and Forecasting in 2010 (pp. 29, 44)
    •The State of the Northeast Pacific during the Winter of 2009/2010 (pp. 30-31)
    •The State of the Western North Pacific in the Second Half of 2009 (pp. 32-33)
    •The Bering Sea: Current Status and Recent Events (pp. 34-35, 39)
    •PICES Seafood Safety Project: Guatemala Training Program (pp. 36-39)
    •The Pacific Ocean Boundary Ecosystem and Climate Study (POBEX) (pp. 40-43)
    •PICES Calendar (p. 44)

    Too Trivial To Test? An Inverse View on Defect Prediction to Identify Methods with Low Fault Risk

    Background. Test resources are usually limited, and therefore it is often not possible to completely test an application before a release. To cope with the problem of scarce resources, development teams can apply defect prediction to identify fault-prone code regions. However, defect prediction tends to have low precision in cross-project prediction scenarios. Aims. We take an inverse view on defect prediction and aim to identify methods that can be deferred when testing because they contain hardly any faults due to their code being "trivial". We expect that characteristics of such methods might be project-independent, so that our approach could improve cross-project predictions. Method. We compute code metrics and apply association rule mining to create rules for identifying methods with low fault risk. We conduct an empirical study to assess our approach with six Java open-source projects containing precise fault data at the method level. Results. Our results show that inverse defect prediction can identify approx. 32-44% of the methods of a project as having a low fault risk; on average, they are about six times less likely to contain a fault than other methods. In cross-project predictions with larger, more diversified training sets, identified methods are even eleven times less likely to contain a fault. Conclusions. Inverse defect prediction supports the efficient allocation of test resources by identifying methods that can be treated with lower priority in testing activities, and it is well applicable in cross-project prediction scenarios. Comment: Submitted to PeerJ Computer Science.
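
    A minimal sketch of the association-rule step, assuming the code metrics have already been discretized into boolean indicators per method and using the mlxtend library (an assumption; the paper does not prescribe a toolkit). The metric names, values, and thresholds are invented for illustration.

    import pandas as pd
    from mlxtend.frequent_patterns import apriori, association_rules

    # One row per method; True means the metric indicator holds
    # (e.g. few lines of code, no branching). Values are made up.
    methods = pd.DataFrame({
        "short":        [True, True, False, True, False],
        "no_branching": [True, False, False, True, True],
        "low_fault":    [True, True, False, True, False],
    })

    # Mine frequent metric combinations, then derive high-confidence rules.
    itemsets = apriori(methods, min_support=0.4, use_colnames=True)
    rules = association_rules(itemsets, metric="confidence", min_threshold=0.9)

    # Keep only rules concluding "low_fault": methods matching their
    # antecedents can be deprioritized when allocating test effort.
    low_risk = rules[rules["consequents"] == frozenset({"low_fault"})]
    print(low_risk[["antecedents", "support", "confidence"]])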

    Integrate the GM(1,1) and Verhulst models to predict software stage effort

    This is the author's accepted manuscript. The final published article is available from the link below. Copyright @ 2009 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.

    Software effort prediction clearly plays a crucial role in software project management. In keeping with more dynamic approaches to software development, it is not sufficient to only predict the whole-project effort at an early stage. Rather, the project manager must also dynamically predict the effort of different stages or activities during the software development process. This can assist the project manager to re-estimate effort and adjust the project plan, thus avoiding effort or schedule overruns. This paper presents a method for software physical-time stage-effort prediction based on the grey models GM(1,1) and Verhulst. This method establishes models dynamically according to the particular type of stage-effort sequence, and can adapt to particular development methodologies automatically by using a novel grey feedback mechanism. We evaluate the proposed method with a large-scale real-world software engineering dataset, and compare it with the linear regression method and the Kalman filter method, revealing that accuracy is improved by at least 28% and 50%, respectively. The results indicate that the method can be effective and has considerable potential. We believe that stage predictions could be a useful complement to whole-project effort prediction methods. This work was supported by the National Natural Science Foundation of China and the Hi-Tech Research and Development Program of China.
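
    For reference, a minimal GM(1,1) sketch: accumulate the series (1-AGO), estimate the grey coefficients by least squares, and forecast through the time-response function before differencing back to the original scale. The stage-effort numbers in the usage line are invented, and the paper's Verhulst variant and grey feedback mechanism are not shown here.

    import numpy as np

    def gm11_forecast(x0, steps=1):
        """Fit a GM(1,1) grey model to the series x0 and forecast `steps` ahead."""
        x0 = np.asarray(x0, dtype=float)
        n = len(x0)
        x1 = np.cumsum(x0)                    # 1-AGO: accumulated generating sequence
        z1 = 0.5 * (x1[1:] + x1[:-1])         # mean sequence of consecutive x1 values
        B = np.column_stack((-z1, np.ones(n - 1)))
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # developing coefficient, grey input
        # Time-response function of the whitened differential equation.
        x1_hat = lambda k: (x0[0] - b / a) * np.exp(-a * k) + b / a
        ks = np.arange(n, n + steps)
        return np.array([x1_hat(k) - x1_hat(k - 1) for k in ks])  # IAGO back to x0 scale

    # Example: observed effort of the first four stages, forecast the fifth.
    print(gm11_forecast([120, 135, 150, 170], steps=1))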