
    Towards guidelines for building a business case and gathering evidence of software reference architectures in industry

    Background: Software reference architectures are becoming widely adopted by organizations that need to support the design and maintenance of software applications in a shared domain. For organizations that plan to adopt this architecture-centric approach, it is essential to know the return on investment and to understand how software reference architectures are designed, maintained, and used. Unfortunately, there is little evidence-based support to help organizations with these challenges. Methods: We conducted action research in an industry-academia collaboration between the GESSI research group and everis, a multinational IT consulting firm based in Spain. Results: The results of this collaboration are being packaged in order to create guidelines that can be used in contexts similar to that of everis. The main result of this paper is the construction of empirically grounded guidelines that support organizations in deciding on the adoption of software reference architectures and in gathering evidence to improve reference-architecture (RA) related practices. Conclusions: The guidelines could be used by other organizations outside of our industry-academia collaboration. With this goal in mind, we describe the guidelines in detail for their use.

    Measuring Software Process: A Systematic Mapping Study

    Context: Measurement is essential to reach predictable performance and high-capability processes. It provides support for better understanding, evaluation, management, and control of the development process and project, as well as the resulting product. It also enables organizations to improve and predict their processes' performance, which puts them in a better position to make appropriate decisions. Objective: This study aims to understand the measurement of the software development process: to identify studies, create a classification scheme based on the identified studies, and then map those studies into the scheme to answer the research questions. Method: Systematic mapping is the selected research methodology for this study. Results: A total of 462 studies were included and classified into four topics with respect to their focus and into three groups based on publishing date. Five abstractions and 64 attributes were identified; 25 methods/models and 17 contexts were distinguished. Conclusion: Capability and performance were the most measured process attributes, while effort and performance were the most measured project attributes. Goal Question Metric and Capability Maturity Model Integration were the main methods and models used in the studies, whereas agile/lean development and small/medium-sized enterprises were the most frequently identified research contexts.
    Funding: Ministerio de Economía y Competitividad grants TIN2013-46928-C3-3-R, TIN2016-76956-C3-2-R, and TIN2015-71938-RED.
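
    As a concrete illustration of the Goal Question Metric decomposition that the study found to be the dominant measurement method, the sketch below encodes a goal, its questions, and their metrics as small Python data classes. The class names and the example goal are our own hypothetical illustration, not taken from the study.

```python
# Minimal sketch of a Goal-Question-Metric (GQM) decomposition.
# All names and example entries below are hypothetical illustrations.
from dataclasses import dataclass, field


@dataclass
class Metric:
    name: str
    unit: str


@dataclass
class Question:
    text: str
    metrics: list[Metric] = field(default_factory=list)


@dataclass
class Goal:
    purpose: str    # e.g. "evaluate"
    issue: str      # e.g. "performance"
    obj: str        # e.g. "the development process"
    viewpoint: str  # e.g. "process engineer"
    questions: list[Question] = field(default_factory=list)


# Hypothetical example: measuring process performance.
goal = Goal(
    purpose="evaluate",
    issue="performance",
    obj="the development process",
    viewpoint="process engineer",
    questions=[
        Question(
            text="How much effort does each phase consume?",
            metrics=[Metric("phase effort", "person-hours")],
        ),
        Question(
            text="How many defects does the process let through?",
            metrics=[Metric("defect density", "defects/KLOC")],
        ),
    ],
)

for q in goal.questions:
    print(q.text, "->", [(m.name, m.unit) for m in q.metrics])
```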

    A Survey of Prediction and Classification Techniques in Multicore Processor Systems

    In multicore processor systems, being able to accurately predict the future provides new optimization opportunities that could not otherwise be exploited. For example, an oracle able to predict a certain application's behavior running on a smartphone could direct the power manager to switch to appropriate dynamic voltage and frequency scaling (DVFS) modes that would guarantee minimum levels of desired performance while reducing energy consumption and thereby prolonging battery life. Using predictions enables systems to become proactive rather than continuing to operate in a reactive manner. This prediction-based proactive approach has become increasingly popular in the design and optimization of integrated circuits and of multicore processor systems. Prediction has evolved from simple forecasting to sophisticated machine-learning-based prediction and classification that learns from existing data, employs data mining, and predicts future behavior. This can be exploited by novel optimization techniques that span all layers of the computing stack. In this survey paper, we discuss the most popular prediction and classification techniques in the general context of computing systems, with emphasis on multicore processors. The survey is far from comprehensive, but it will help readers interested in employing prediction in the optimization of multicore processor systems.
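
    To make the DVFS scenario above concrete, here is a minimal Python sketch of a history-based utilization predictor driving a mode choice. The predictor, thresholds, and mode table are hypothetical; they illustrate the general proactive pattern the survey describes, not any specific technique from it.

```python
# Hedged sketch: an exponentially weighted moving average (EWMA) predictor
# of CPU utilization feeding a DVFS mode decision. Thresholds and the
# mode table are hypothetical, for illustration only.
class EwmaPredictor:
    """Predict the next utilization sample from an EWMA of past samples."""

    def __init__(self, alpha: float = 0.5):
        self.alpha = alpha
        self.estimate = None

    def update(self, observed: float) -> float:
        if self.estimate is None:
            self.estimate = observed
        else:
            self.estimate = self.alpha * observed + (1 - self.alpha) * self.estimate
        return self.estimate


# Hypothetical DVFS modes: (name, frequency in GHz).
MODES = [("low", 0.8), ("mid", 1.6), ("high", 2.4)]


def pick_mode(predicted_util: float):
    """Map a predicted utilization to a voltage/frequency mode."""
    if predicted_util < 0.3:
        return MODES[0]
    if predicted_util < 0.7:
        return MODES[1]
    return MODES[2]


predictor = EwmaPredictor(alpha=0.6)
for util in [0.2, 0.25, 0.6, 0.9, 0.85]:  # a sampled utilization trace
    mode = pick_mode(predictor.update(util))
    print(f"util={util:.2f} -> mode={mode[0]} ({mode[1]} GHz)")
```

    A real power manager would replace the EWMA with the learned classifiers the survey covers, but the proactive loop (observe, predict, act before the change happens) stays the same.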

    CloudHealth: A Model-Driven Approach to Watch the Health of Cloud Services

    Cloud systems are complex and large systems in which services provided by different operators must coexist and eventually cooperate. In such a complex environment, monitoring the health of both the whole environment and the individual services is extremely important for reacting promptly and effectively to misbehaviours, unexpected events, and failures. Although there are solutions for monitoring cloud systems at different granularity levels, how to relate the many KPIs that can be collected about the health of the system, and how health information can be properly reported to operators, are open questions. This paper reports the early results we achieved on the challenge of monitoring the health of cloud systems. In particular, we present CloudHealth, a model-based health monitoring approach that operators can use to watch specific quality attributes. The CloudHealth Monitoring Model describes how to operationalize high-level monitoring goals by dividing them into subgoals, deriving metrics for the subgoals, and using probes to collect the metrics. We use the CloudHealth Monitoring Model to control the probes that must be deployed on the target system, the KPIs that are dynamically collected, and the visualization of the data in dashboards.
    Comment: 8 pages, 2 figures, 1 table
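
    The goal decomposition described for the CloudHealth Monitoring Model can be sketched as a small data structure: goals split into subgoals, subgoals carry metrics, and probes collect the metric values. The Python below is our own hedged illustration of that shape; the class names, probes, and example goals are not the paper's API.

```python
# Hedged sketch of a goal -> subgoal -> metric -> probe hierarchy.
# Everything named here is a hypothetical illustration.
from dataclasses import dataclass, field
from typing import Callable


@dataclass
class Probe:
    name: str
    collect: Callable[[], float]  # returns the current metric value


@dataclass
class Metric:
    name: str
    probe: Probe


@dataclass
class Goal:
    name: str
    subgoals: list["Goal"] = field(default_factory=list)
    metrics: list[Metric] = field(default_factory=list)

    def snapshot(self) -> dict:
        """Recursively collect every metric under this goal."""
        data = {m.name: m.probe.collect() for m in self.metrics}
        for sg in self.subgoals:
            data.update(sg.snapshot())
        return data


# Hypothetical decomposition: "availability" split into two subgoals,
# each operationalized by one metric with a stubbed probe.
availability = Goal(
    "availability",
    subgoals=[
        Goal("responsiveness",
             metrics=[Metric("avg_latency_ms", Probe("latency", lambda: 42.0))]),
        Goal("error rate",
             metrics=[Metric("http_5xx_ratio", Probe("errors", lambda: 0.01))]),
    ],
)

print(availability.snapshot())  # the kind of data a dashboard would render
```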

    Towards building information modelling for existing structures

    The transformation of cities from the industrial age (unsustainable) to the knowledge age (sustainable) is essentially a ‘whole life cycle’ process consisting of planning, development, operation, reuse, and renewal. During this transformation, a multi-disciplinary knowledge base created from studies and research on aspects of the built environment (historical, architectural, archaeological, environmental, social, economic, etc.) is critical. Although there is a growing number of 3D VR modelling applications, some built environment applications such as disaster management, environmental simulations, and computer-aided architectural design and planning require models that go beyond 3D graphical visualization: models that are multifunctional, interoperable, intelligent, and multi-representational. Advanced digital mapping technologies such as 3D laser scanning can be enablers of effective e-planning, consultation, and communication of users’ views during the planning, design, construction, and lifecycle process of the built environment. For example, the 3D laser scanner enables digital documentation of buildings, sites, and physical objects for reconstruction and restoration. It also facilitates the creation of educational resources within the built environment, as well as the reconstruction of the built environment. These technologies can drive productivity gains by promoting a free flow of information between departments, divisions, offices, and sites, and between organizations, their contractors, and partners, when the data captured via those technologies are processed and modelled into BIM (Building Information Modelling). These technologies are key enablers of new approaches to the ‘whole life cycle’ process within the built and human environment for the 21st century. The paper describes research towards Building Information Modelling for existing structures via point cloud data captured by 3D laser scanning. A case study building is elaborated to demonstrate how to produce 3D CAD models and BIM models of existing structures based on the designated techniques.
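
    One elementary step in moving from laser-scan point clouds to BIM surfaces is fitting planar elements (walls, floors, ceilings) to subsets of the scanned points. The sketch below shows a generic least-squares plane fit via SVD on synthetic data; it is an assumption-laden illustration of the kind of processing involved, not the paper's designated technique.

```python
# Hedged sketch: fit a plane to a noisy "wall" scan with a least-squares
# SVD fit. The sample points are synthetic, not real scanner output.
import numpy as np


def fit_plane(points: np.ndarray):
    """Fit a plane to an (N, 3) point cloud; return (centroid, unit normal)."""
    centroid = points.mean(axis=0)
    # The plane normal is the direction of least variance of the centered
    # cloud, i.e. the last right-singular vector.
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]


# Synthetic wall: points near the plane x = 2.0 with scanner-like noise.
rng = np.random.default_rng(0)
pts = np.column_stack([
    2.0 + rng.normal(0, 0.005, 500),  # x: wall position + measurement noise
    rng.uniform(0, 5, 500),           # y: along the wall
    rng.uniform(0, 3, 500),           # z: up the wall
])

centroid, normal = fit_plane(pts)
print("centroid:", centroid.round(3), "normal:", normal.round(3))
```

    A scan-to-BIM pipeline would run a fit like this (or a RANSAC variant) per segmented surface, then assemble the fitted surfaces into parametric BIM elements.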

    Software dependability modeling using an industry-standard architecture description language

    Performing dependability evaluation along with other analyses at the architectural level makes it possible both to make architectural tradeoffs and to predict the effects of architectural decisions on the dependability of an application. This paper gives guidelines for building architectural dependability models for software systems using the AADL (Architecture Analysis and Design Language). It presents reusable modeling patterns for fault-tolerant applications and shows how these patterns can be used in the context of a subsystem of a real-life application.
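
    The paper's models are written in AADL, but the flavour of result such a dependability model yields can be illustrated with a tiny continuous-time Markov chain for a primary/backup pair, sketched below in Python. This is a generic textbook construction under hypothetical failure and repair rates, not the paper's AADL patterns.

```python
# Hedged sketch: steady-state availability of a primary/backup pair via a
# 3-state continuous-time Markov chain. Rates are hypothetical (per hour).
import numpy as np

LAMBDA = 1e-3  # failure rate of one unit
MU = 1e-1      # repair rate (one repairer)

# States: 0 = both units up, 1 = one unit up, 2 = both down (system failed).
# Q is the generator matrix; each row sums to zero.
Q = np.array([
    [-2 * LAMBDA, 2 * LAMBDA, 0.0],
    [MU, -(MU + LAMBDA), LAMBDA],
    [0.0, MU, -MU],
])

# Steady-state distribution pi: solve pi Q = 0 with sum(pi) = 1,
# stacked as one least-squares system.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(f"steady-state availability: {1 - pi[2]:.6f}")
```

    An AADL error-model annex expresses the same idea declaratively (error states, transitions, and occurrence rates attached to components), and analysis tools generate and solve chains like this one from it.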