
    OSSMETER D3.2 – Report on Source Code Activity Metrics

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and initial prototypes of the tools that are needed for source code activity analysis. It builds upon Deliverable 3.1, in which infrastructure and a domain analysis for source code quality analysis were investigated and initial language-dependent and language-independent metrics were prototyped. Task 3.2 builds partly on the results of Task 3.1 and partly introduces new infrastructure. This includes:
    • the extraction and analysis of the metadata of version control systems (VCS);
    • the development of selected source code metrics from Task 3.1 over time.
    The following initial measurements of VCS metadata were planned and have been executed, in the context of the SVN and Git version control systems:
    • Number of committed changes;
    • Size of committed changes (churn);
    • Number of committers;
    • Activity distribution per committer (over files).
    On top of this we explored analysis of activity in terms of certain language-specific source code metrics from the previous Task 3.1:
    • Number of changed, added and deleted methods and classes, to experiment with language-specific activities.
    • Measurement of the evenness of distributions (Gini coefficient) of the metrics developed in Task 3.1, for the purpose of detecting trends and spikes.
    The goal of these additional metrics is to start bridging the gap from code and VCS metadata metrics to the analysis requirements of the project partners. What makes WP3 in OSSMETER special is its integrated infrastructure, which provides a homogeneous view on languages, analyses and metrics. We generate metrics using high-level (descriptive) code in the Rascal language. In this deliverable we present:
    • A brief summary of motivation and challenges for Task 3.2;
    • A streamlined interface between the platform and the Rascal programming language;
    • A mapping from an object-oriented VCS delta model to a functional VCS delta model;
    • Platform support for managing full working copies and source code diffs;
    • A description of the rationale, design and implementation of the above metrics.
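
    The abstract above names the activity metrics but not how they are computed. As a minimal, hypothetical sketch (the actual OSSMETER metrics are written in Rascal against the platform's VCS delta model), the Python below computes the four VCS metadata metrics and the Gini coefficient used as the evenness measure; the Commit shape and the metric names are assumptions made for this illustration only.

        from collections import Counter
        from dataclasses import dataclass

        @dataclass
        class Commit:
            author: str
            churn_per_file: dict       # file path -> lines changed in this commit

        def activity_metrics(commits):
            """Compute the four VCS metadata metrics listed in the abstract."""
            churn = sum(sum(c.churn_per_file.values()) for c in commits)
            committers = {c.author for c in commits}
            files_touched = Counter()  # activity distribution per committer, over files
            for c in commits:
                files_touched[c.author] += len(c.churn_per_file)
            return {
                "numberOfCommits": len(commits),
                "churn": churn,
                "numberOfCommitters": len(committers),
                "activityPerCommitter": dict(files_touched),
            }

        def gini(values):
            """Gini coefficient: 0 for a perfectly even distribution,
            values near 1 when activity is concentrated in a few items."""
            xs = sorted(values)
            n, total = len(xs), sum(xs)
            if n == 0 or total == 0:
                return 0.0
            weighted = sum((i + 1) * x for i, x in enumerate(xs))
            return (2 * weighted) / (n * total) - (n + 1) / n

    For example, a Gini coefficient close to 1 over the per-committer activity would flag a project in which almost all changes come from a single committer.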

    OSSMETER Deliverable 3.1 - Report on Domain Analysis of OSS Quality Attributes

    Meaningful and effective measurement of quality attributes of Open Source Software (OSS) requires:
    • Analysis of and insight into the domain of software quality measurement.
    • Identification of relevant metrics to measure software quality attributes.
    • A meta-model to store the results of measurement (i.e., facts directly extracted from the source) as well as any metrics that are derived from these measurements.
    • Calculation of metrics based on the extracted facts.
    In this deliverable we present:
    • A brief summary of motivation and challenges for Task 3.1.
    • An introduction to software quality as described in ISO/IEC 9126-1:2001.
    • An initial set of requirements for quality attributes in OSSMETER.
    • A survey of the domain of software quality.
    • An initial selection of metrics that are relevant for measuring software quality attributes.
    • A survey of existing tools for metric calculation.
    • A quick overview of the Rascal meta-programming language that will be used for metric calculation.
    • A proposal for and illustration of the Metrics Meta-Model (M3), a general framework for representing basic facts as well as derived (computed) metrics. We show how M3 can be extended to represent Java-specific facts and how these facts form the basis for metric computation and visualization. The presented metrics are already useful but are primarily intended as a proof of concept of the approach.
    • An estimate of how well the general OSSMETER requirements for WP3 can be satisfied.
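
    To make the separation between extracted facts and derived metrics concrete, the following is a small hypothetical sketch: facts are stored as a relation (a set of tuples) and a metric is a function over that relation. The real M3 model is defined in Rascal over source locations; the relation name, location scheme and metric below are simplified assumptions made for illustration only.

        from collections import defaultdict

        # Facts directly extracted from the source: (container, contained) pairs,
        # e.g. a class containing its methods. The "java+class"/"java+method"
        # naming is loosely modelled after M3 logical locations but simplified.
        containment = {
            ("java+class:///demo/Point", "java+method:///demo/Point/getX()"),
            ("java+class:///demo/Point", "java+method:///demo/Point/getY()"),
            ("java+class:///demo/Line",  "java+method:///demo/Line/length()"),
        }

        def methods_per_class(containment):
            """Derived metric: number of methods declared per class."""
            counts = defaultdict(int)
            for parent, child in containment:
                if parent.startswith("java+class:") and child.startswith("java+method:"):
                    counts[parent] += 1
            return dict(counts)

        print(methods_per_class(containment))
        # {'java+class:///demo/Point': 2, 'java+class:///demo/Line': 1}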

    M3: An Open Model For Measuring Code Artifacts

    In the context of the EU FP7 project OSSMETER we are developing an infrastructure for measuring source code. The goal of OSSMETER is to gain insight into the quality of open-source projects from all possible perspectives, including product, process and community. This is a "white paper" on M3, a set of code models that should be easy to construct, easy to extend with language specifics, and easy to consume to produce metrics and other analyses. We solicit feedback on its usability.
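
    The "easy to extend" property could, for instance, look like the hypothetical Python sketch below: a core model holding language-agnostic relations, subclassed with language-specific facts. The field names are assumptions for illustration; the actual M3 schema is defined in Rascal and described in the paper.

        from dataclasses import dataclass, field

        @dataclass
        class CoreModel:
            # Language-agnostic relations over source entities, stored as sets of pairs.
            declarations: set = field(default_factory=set)  # logical name -> physical location
            containment: set = field(default_factory=set)   # parent entity -> child entity
            uses: set = field(default_factory=set)          # use site -> declaration

        @dataclass
        class JavaModel(CoreModel):
            # Language-specific facts layered on top of the shared core relations.
            extends: set = field(default_factory=set)       # class -> superclass
            implements: set = field(default_factory=set)    # class -> interface

    Analyses written against CoreModel then work for any language, while Java-specific metrics can consume the extended model.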

    OSSMETER D3.3 – Language Agnostic Source Code Quality Analysis

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of:
    • Deliverable 3.1, in which infrastructure and a domain analysis for source code quality analysis were investigated and initial language-dependent and language-agnostic metrics were prototyped.
    • Deliverable 3.2, in which source code activity metrics were investigated.
    • Collaboration with WP2 and WP5, in which an integrated quality model is developed that brings metrics together into concise and comparable descriptions ("factoids") of project quality aspects.
    In this deliverable we report solely on source code and source code activity metrics that work for any (programming) language. For language-specific quality analysis we refer to Deliverable 3.4. The work on Tasks 3.3 and 3.4 has been done in parallel. On the one hand, to prevent unnecessary duplication, the final report on the satisfaction of the requirements identified in Deliverable 1.1 is presented not here but in Deliverable 3.4. On the other hand, for the sake of cohesion and readability, some general design considerations concerning the metrics and their aggregation are copied between the two deliverable documents in their introduction sections (Section 1). The current deliverable does include an update to the source code activity metrics reported on earlier in Deliverable 3.2, because these metrics are also language-independent; the update is necessary to link the results of Deliverable 3.2 into the overall quality model of OSSMETER (see Deliverable 2.4). In this deliverable we present:
    • A brief summary of motivation and challenges for Task 3.3;
    • A list of all language-independent source code quality metrics, including their motivation in GQM terminology;
    • An updated list of all source code activity metrics, including their motivation in GQM terminology;
    • An overview of the use of the above metrics in the OSSMETER quality model through factoids.
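
    As a hypothetical illustration of how a language-independent metric can feed a factoid, the sketch below computes a crude comment ratio and maps it onto a four-star summary. The metric, thresholds and wording are assumptions for this example only; the actual metrics, their GQM motivations and the factoids are specified in the deliverable and in the WP2/WP5 quality model.

        def comment_ratio(lines):
            """Language-agnostic metric: fraction of non-blank lines that look like comments."""
            non_blank = [l.strip() for l in lines if l.strip()]
            comments = [l for l in non_blank if l.startswith(("//", "#", "/*", "*"))]
            return len(comments) / len(non_blank) if non_blank else 0.0

        def comment_ratio_factoid(ratio):
            """Aggregate the metric value into a star rating plus a one-line summary."""
            if ratio >= 0.20:
                return 4, "The code base is well commented."
            if ratio >= 0.10:
                return 3, "The code base has a reasonable amount of comments."
            if ratio >= 0.05:
                return 2, "The code base is sparsely commented."
            return 1, "The code base has hardly any comments."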

    OSSMETER D3.4 – Language-Specific Source Code Quality Analysis

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of:
    • Deliverable 3.1, in which infrastructure and a domain analysis for source code quality analysis were investigated and initial language-dependent and language-agnostic metrics were prototyped.
    • Deliverable 3.2, in which an integrated architecture for source code analysis and source code activity analysis was presented. The current document adds multi-language support to this architecture.
    • Collaboration with WP2 and WP5, in which an integrated quality model is developed that brings metrics together into concise and comparable descriptions ("factoids") of project quality aspects.
    In this deliverable we report solely on source code metrics that work for specific programming languages, namely Java and PHP. For language-agnostic quality analysis we refer to Deliverable 3.3. The work on Tasks 3.3 and 3.4 has been done in parallel. On the one hand, to prevent unnecessary duplication, the final report on the satisfaction of the requirements identified in Deliverable 1.1 is presented here, but not in Deliverable 3.3. On the other hand, for the sake of cohesion and readability, some general design considerations concerning the metrics and their aggregation are copied between the two deliverable documents in their introduction sections (Section 1). In this deliverable we present:
    • A brief summary of motivation and challenges for Task 3.4;
    • The infrastructure to support metrics for PHP;
    • A build-information (e.g. class paths) recovery tool for Java, called BMW;
    • Metrics, including their motivation in GQM terminology [3]:
      • Metric providers shared between Java and PHP;
      • Metric providers specific to Java;
      • Metric providers specific to PHP;
    • An overview of the use of the above metrics in the OSSMETER quality model through factoids;
    • A status update of the full requirements table relevant for WP3.
    This document is also useful reading for users of OSSMETER.
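
    The split between shared and language-specific metric providers can be pictured with the hypothetical registry below; the provider names and the registration mechanism are assumptions made for illustration, while the actual providers are implemented in Java and Rascal within the OSSMETER platform.

        PROVIDERS = {}  # metric name -> (set of languages, metric function)

        def provider(name, languages):
            """Register a metric provider for one or more languages."""
            def register(fn):
                PROVIDERS[name] = (set(languages), fn)
                return fn
            return register

        @provider("methodsPerClass", {"java", "php"})  # provider shared by Java and PHP
        def methods_per_class(model):
            ...  # placeholder body

        @provider("missingDocBlocks", {"php"})         # PHP-specific provider
        def missing_doc_blocks(model):
            ...  # placeholder body

        def providers_for(language):
            """Names of all metric providers applicable to the given language."""
            return [name for name, (langs, _) in PROVIDERS.items() if language in langs]

        print(providers_for("php"))  # ['methodsPerClass', 'missingDocBlocks']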