
    Panel: A call for action in tackling environmental sustainability through green information technologies and systems

    In a previous paper, we found empirical evidence supporting a positive relationship between network centrality and success. However, we also found that more successful projects have lower technical quality. A first, straightforward explanation of these findings is that the most central contributors are highly skilled developers, well known for their ability to manage code complexity, who pay less attention to software structure. The consolidated software quality metrics used in our previous research measure code structure. This paper provides empirical evidence supporting the idea that the negative impact of success on quality is caused by the careless behaviour of skilled developers, who are also hubs within the social network. Research hypotheses are tested on a sample of 56 OS applications from the SourceForge.net repository, with a total of 378 developers. The sample includes some of the most successful and largest OS projects, as well as a cross-section of less famous active projects evenly distributed among SourceForge.net’s project categories.
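
    The abstract's notion of hubs can be made concrete with a centrality computation. A minimal sketch, assuming a hypothetical developer-project membership list and the networkx library (not part of the study's tooling): developers are linked when they share a project, and degree centrality flags candidate hubs.

```python
# Minimal sketch with assumed data: degree centrality of developers in a
# developer-project collaboration network, in the spirit of the study.
import networkx as nx

# Hypothetical membership pairs: (developer, project).
memberships = [
    ("alice", "proj_a"), ("alice", "proj_b"),
    ("bob", "proj_a"), ("carol", "proj_b"), ("carol", "proj_c"),
]

# Project memberships onto a one-mode network: two developers are linked
# when they contribute to at least one common project.
G = nx.Graph()
projects = {}
for dev, proj in memberships:
    projects.setdefault(proj, []).append(dev)
for devs in projects.values():
    for i, d1 in enumerate(devs):
        for d2 in devs[i + 1:]:
            G.add_edge(d1, d2)

centrality = nx.degree_centrality(G)
hubs = sorted(centrality, key=centrality.get, reverse=True)
print(hubs[:3])  # candidate "hub" developers, most central first
```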

    A Multi-Model Algorithm for the Cost-Oriented Design of the Information Technology Infrastructure

    Multiple combinations of hardware and network components can be selected to design an information technology (IT) infrastructure that satisfies performance requirements. The professional criterion for dealing with these degrees of freedom is cost minimization. However, a scientific approach has rarely been applied to cost minimization, and rigorous methodological support for the cost issues of infrastructural design is still lacking. The methodological contribution of this paper is the representation of complex infrastructural design issues as a set of four intertwined cost-minimization sub-problems: two set-covering problems, a set-packing problem, and a min k-cut problem with a nonlinear objective function. Optimization is accomplished by sequentially solving all sub-problems with a heuristic approach and finally tuning the solution with a local-search approach. The methodology is empirically verified with a software tool that includes a database of costs, also built as part of this research. The work shows how an overall cost-minimization approach can provide significant savings, and indicates how the corresponding infrastructural design rules can substantially differ from the local optima previously identified in the professional literature.
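
    The abstract does not detail the paper's heuristics, so the following is only an illustrative sketch of the classic greedy heuristic for one of the four sub-problems, weighted set covering: repeatedly pick the component with the lowest cost per newly covered requirement. All names and figures are hypothetical.

```python
# Minimal sketch, not the authors' algorithm: greedy weighted set covering.
def greedy_set_cover(requirements, components):
    """components: dict mapping name -> (cost, set of requirements covered)."""
    uncovered = set(requirements)
    chosen, total_cost = [], 0.0
    while uncovered:
        # Pick the component with the lowest cost per newly covered requirement,
        # skipping components that cover nothing new.
        name, (cost, covers) = min(
            ((n, c) for n, c in components.items() if c[1] & uncovered),
            key=lambda item: item[1][0] / len(item[1][1] & uncovered),
        )
        chosen.append(name)
        total_cost += cost
        uncovered -= covers
    return chosen, total_cost

# Hypothetical example: servers covering workload requirements.
reqs = {"web", "db", "batch", "backup"}
comps = {
    "srv_small": (10, {"web"}),
    "srv_mid": (25, {"web", "db"}),
    "srv_big": (40, {"db", "batch", "backup"}),
}
print(greedy_set_cover(reqs, comps))  # (['srv_small', 'srv_big'], 50.0)
```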

    The economics of community open source software projects: an empirical analysis of maintenance effort

    Previous contributions in the empirical software engineering literature have consistently observed a quality degradation effect of proprietary code as a consequence of maintenance. This degradation effect, referred to as the entropy effect, has been recognized as responsible for significant increases in maintenance effort. In the Open Source context, the quality of code is a fundamental design principle. As a consequence, the maintenance effort of Open Source applications may not show a similar increasing trend over time. The goal of this paper is to empirically verify the entropy effect for a sample of 4,289 community Open Source application versions. Analyses are based on a comparison with an estimate of effort obtained with a traditional effort estimation model. Findings indicate that community Open Source applications show a slower growth of maintenance effort over time and, therefore, are less subject to the entropy effect.
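
    The abstract does not name the traditional effort estimation model used as a baseline; purely as an illustrative stand-in, the sketch below applies basic COCOMO (organic mode) to hypothetical per-version size deltas. An entropy effect would show actual effort growing faster than this baseline across versions.

```python
# Illustrative only: basic COCOMO (organic mode) as a stand-in "traditional"
# effort baseline; the paper's actual model is not named in the abstract.
def cocomo_effort(kloc, a=2.4, b=1.05):
    """Basic COCOMO effort in person-months: E = a * KLOC^b."""
    return a * kloc ** b

# Hypothetical version history: cumulative KLOC per maintenance release.
versions_kloc = [12.0, 13.5, 15.2, 18.0]
for i in range(1, len(versions_kloc)):
    delta = versions_kloc[i] - versions_kloc[i - 1]  # size added/changed
    print(f"v{i} -> v{i + 1}: ~{cocomo_effort(delta):.1f} person-months")
```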

    The Impact of MIS Software on IT Energy Consumption

    The energy consumption of IT has a great impact on operational costs, in addition to being important for social responsibility and system scalability. Research on IT energy efficiency has always focused on hardware, whereas within the software domain it has mainly focused on embedded systems. In this paper we present the preliminary results of experiments that we conducted to evaluate MIS applications from an energy efficiency point of view. We analyze selected case studies in detail, including two ERPs, two CRMs, and four DBMSs. Our evidence suggests i) that not only the infrastructural layers but also the MIS application layer impacts energy consumption; ii) that different MIS applications satisfying the same functional requirements consume significantly different amounts of energy; and iii) that in some scenarios energy efficiency cannot be increased simply by improving time performance.
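
    As one way to reproduce this kind of measurement, the sketch below reads the Linux RAPL powercap energy counter around a workload; the sysfs path, the required permissions, and the toy workloads are assumptions, not the paper's experimental setup.

```python
# Minimal sketch (Linux/Intel only; path and permissions vary by platform):
# compare the package energy drawn by two workloads via the RAPL counter,
# in the spirit of the paper's per-application measurements.
RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # assumed location

def energy_joules(workload):
    with open(RAPL) as f:
        before = int(f.read())
    workload()
    with open(RAPL) as f:
        after = int(f.read())
    return (after - before) / 1e6  # microjoules -> joules (ignores wraparound)

def workload_a():  # stand-ins for e.g. the same query on two DBMSs
    sum(i * i for i in range(10**7))

def workload_b():
    sorted(range(10**7), reverse=True)

print(f"A: {energy_joules(workload_a):.1f} J, B: {energy_joules(workload_b):.1f} J")
```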

    Efficiency implications of open source commonality and reuse

    This paper analyzes the reuse choices made by open source developers and relates them to cost efficiency. We make a distinction between the commonality among applications and the actual reuse of code. The former represents the similarity between the requirements of different applications and, consequently, the functionalities that they provide; the latter represents the code actually shared between applications. No application can be maintained forever. A fundamental reason for the periodical replacement of code is the exponential growth of costs with the number of maintenance interventions. Intuitively, this is due to the increasing complexity of software that grows in both size and coupling among modules. The paper measures commonality, reuse, and development costs for 26 open-source projects, for a total of 171 application versions. Results show that reuse choices in open-source contexts are not cost efficient. Developers tend to reuse code from the most recent version of applications, even if their requirements are closer to previous versions. Furthermore, the latest version of an application is always the one that has undergone the highest number of maintenance interventions. Accordingly, the development cost per new line of code is found to grow with reuse.
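
    A minimal sketch of the two metrics the finding rests on, with invented data (the paper's actual measurement procedure is not reproduced here): reuse as the share of lines carried over between versions, and cost per new line of code, which the paper reports grows with reuse.

```python
# Hypothetical metrics: line-level reuse between two application versions
# and development cost per *new* line of code.
def reuse_ratio(old_lines, new_lines):
    carried = len(set(old_lines) & set(new_lines))
    return carried / len(new_lines) if new_lines else 0.0

def cost_per_new_loc(total_cost, old_lines, new_lines):
    new_loc = len(set(new_lines) - set(old_lines))
    return total_cost / new_loc if new_loc else float("inf")

v1 = ["def f():", "    return 1", "def g():", "    return 2"]
v2 = ["def f():", "    return 1", "def h():", "    return 3",
      "def g():", "    return 2"]
print(reuse_ratio(v1, v2))              # 4 carried lines / 6 -> ~0.67
print(cost_per_new_loc(100.0, v1, v2))  # 100 / 2 new lines -> 50.0
```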

    Informing Observers: Quality-driven Filtering and Composition of Web 2.0 Sources

    Current Web technologies enable an active role for users, who can create and share content very easily. This mass of information includes opinions about a variety of topics of key interest and represents a new and invaluable source of marketing information. Public and private organizations that aim to understand and analyze this unsolicited feedback need adequate platforms that can support the detection and monitoring of key topics. Hence, there is an emerging trend towards automated market intelligence and the crafting of tools that allow monitoring in a mechanized fashion. We therefore present an approach that uses the quality of Web 2.0 sources as the key factor for information filtering and also allows users to flexibly and easily compose their analysis environments through the adoption of a mashup platform.
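
    A minimal sketch of the quality-based filtering idea, with invented quality criteria and weights (the paper's actual quality model is not reproduced here): sources are scored and only those above a threshold feed the analysis environment.

```python
# Hypothetical quality-driven source filter for Web 2.0 content.
from dataclasses import dataclass

@dataclass
class Source:
    name: str
    authority: float   # assumed criterion: e.g. link-based reputation in [0, 1]
    timeliness: float  # assumed criterion: freshness of posts in [0, 1]

def quality(src, w_auth=0.6, w_time=0.4):
    return w_auth * src.authority + w_time * src.timeliness

sources = [
    Source("blog_a", 0.9, 0.5),
    Source("forum_b", 0.3, 0.9),
    Source("feed_c", 0.7, 0.8),
]
selected = [s.name for s in sources if quality(s) >= 0.7]
print(selected)  # ['blog_a', 'feed_c']
```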

    A capacity and value based model for data architectures adopting integration technologies

    The paper discusses two concepts that have been associated with various approaches to data and information, namely capacity and value, focusing on database architectures and on two types of technologies widely used in integration projects: data integration, in the area of Enterprise Information Integration, and publish & subscribe, in the area of Enterprise Application Integration. Furthermore, the paper proposes and discusses a unifying model for information capacity and value that also considers quality constraints and the run-time costs of the database architecture.
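
    The unifying model itself is not given in the abstract; as a hypothetical illustration of how capacity, value, quality, and run-time cost might trade off when comparing a data-integration architecture with a publish & subscribe one, consider:

```python
# Illustrative sketch only; the paper's actual model is not reproduced here.
# Toy "net value" of a data architecture: information capacity weighted by
# a quality factor, minus its run-time cost.
def net_value(capacity, quality, runtime_cost, value_per_unit=1.0):
    """capacity: information units served; quality in [0, 1];
    runtime_cost in the same monetary unit as value_per_unit."""
    return value_per_unit * capacity * quality - runtime_cost

# Hypothetical comparison: data integration (EII) vs publish & subscribe (EAI).
print(net_value(capacity=1000, quality=0.9, runtime_cost=300))  # 600.0
print(net_value(capacity=1200, quality=0.7, runtime_cost=250))  # 590.0
```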