9,959 research outputs found

    Exploiting a Goal-Decomposition Technique to Prioritize Non-functional Requirements

    Business stakeholders need clear and realistic goals if they want to meet commitments in application development. As a consequence, they prioritize requirements at early stages. However, requirements do change, and the effect of change forces stakeholders to balance alternatives and reprioritize requirements accordingly. In this paper we discuss the problem of assigning priorities to non-functional requirements that are subject to change, and we then propose an approach to help smooth the impact of such changes. Our approach favors the translation of non-operational specifications into operational definitions that can be evaluated once the system is developed. It uses the goal-question-metric method as the main support for decomposing non-operational specifications into operational ones. We claim that the effort invested in operationalizing NFRs helps in dealing with changing requirements during system development. Based on this transformation and on our experience, we provide guidelines for prioritizing volatile non-functional requirements.
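The goal-question-metric idea the abstract describes can be sketched as a small data structure: a goal (the NFR and its purpose) is refined into questions, each answered by concrete metrics that are evaluable once the system exists. This is a minimal illustrative sketch; the goal, questions, and metrics below are hypothetical examples, not taken from the paper.

```python
# Illustrative GQM-style decomposition of a non-functional requirement
# into operational, measurable definitions. All names below are
# hypothetical examples, not the paper's own.
from dataclasses import dataclass, field


@dataclass
class Metric:
    name: str
    unit: str


@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)


@dataclass
class Goal:
    nfr: str
    purpose: str
    questions: list = field(default_factory=list)


goal = Goal(
    nfr="Performance",
    purpose="Keep interactive response acceptable under normal load",
    questions=[
        Question("How fast are user-facing requests served?",
                 [Metric("p95 response time", "ms")]),
        Question("How does the system behave at peak load?",
                 [Metric("throughput at peak", "requests/s")]),
    ],
)


def operational_definitions(goal):
    """Flatten a goal into the metrics that make it testable."""
    return [(q.text, m.name, m.unit)
            for q in goal.questions for m in q.metrics]


for question, metric, unit in operational_definitions(goal):
    print(f"{goal.nfr}: {metric} [{unit}]  <- {question}")
```

When an NFR changes, only the affected questions and metrics need revisiting, which is what makes reprioritization cheaper after operationalization.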

    ADAPTS: An Intelligent Sustainable Conceptual Framework for Engineering Projects

    This paper presents a conceptual framework for the optimization of environmental sustainability in engineering projects, both for products and for industrial facilities or processes. The main objective of this work is to propose a conceptual framework that helps researchers approach the optimization of engineering projects under sustainability criteria, making use of current machine learning techniques. For the development of this conceptual framework, a bibliographic search was carried out on the Web of Science; the selected documents were analyzed through a hermeneutic procedure, from which the conceptual framework was built. A pyramid-shaped graphic representation is shown to clearly define the variables of the proposed conceptual framework and their relationships. The conceptual framework consists of five dimensions; its acronym is ADAPTS. At the base are: (1) the Application for which it is intended, (2) the available DAta, (3) the APproach under which it is operated, and (4) the machine learning Tool used. At the top of the pyramid is (5) the necessary Sensing. A case study is proposed to show its applicability. This work is part of a broader line of research on optimization under sustainability criteria. Telefónica Chair “Intelligence in Networks” of the University of Seville (Spain).

    System Qualities Ontology, Tradespace and Affordability (SQOTA) Project Phase 5

    Motivation and Context: One of the key elements of the SERC's research strategy is transforming the practice of systems engineering and associated management practices: "SE and Management Transformation (SEMT)." The Grand Challenge goal for SEMT is to transform the DoD community's current systems engineering and management methods, processes, and tools (MPTs) and practices away from sequential, single-stovepipe-system, hardware-first, document-driven, point-solution, acquisition-oriented approaches, and toward concurrent, portfolio- and enterprise-oriented, hardware-software-human engineered, model-driven, set-based, full-life-cycle approaches. This material is based upon work supported, in whole or in part, by the U.S. Department of Defense through the Office of the Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) under Contracts H98230-08-D-0171 and HQ0034-13-D-0004 (TO 0060).

    New Effort and Schedule Estimation Models for Agile Processes in the U.S. DoD

    Excerpt from the Proceedings of the Nineteenth Annual Acquisition Research Symposium. The DoD’s new software acquisition pathway prioritizes speed of delivery, advocating agile software processes. Estimating the cost and schedule of agile software projects is critical at an early phase to establish baseline budgets and to select competitive bidders. The challenge is that common agile sizing measures such as story points and user stories are not practical for early estimation, as these are often reported only after contract award in the DoD. This study provides a set of parametric effort and schedule estimation models for agile projects using a sizing measure that is available before proposal evaluation, based on data from 36 DoD agile projects. The results suggest that initial software requirements, defined as the sum of functions and external interfaces, are an effective sizing measure for early estimation of effort and schedule of agile projects. The models’ accuracy improves when application domain groups and peak staff are added as inputs. Approved for public release; distribution is unlimited.
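Parametric effort models of the kind described are typically power laws, effort = a · size^b, fitted as a straight line in log-log space. The sketch below shows that fitting procedure only; the data points and resulting coefficients are made up for illustration and are not the study's actual dataset or calibration.

```python
# Sketch of fitting a parametric effort model of the common form
# effort = a * size^b via least squares on log-transformed data.
# The (size, effort) pairs are hypothetical, not from the DoD study.
import math

# (initial software requirements count, effort in person-months)
projects = [(20, 35.0), (45, 90.0), (80, 150.0), (120, 260.0), (200, 430.0)]


def fit_power_law(points):
    """Least-squares fit of log(effort) = log(a) + b * log(size)."""
    xs = [math.log(s) for s, _ in points]
    ys = [math.log(e) for _, e in points]
    n = len(points)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b


a, b = fit_power_law(projects)
estimate = a * 100 ** b  # predicted effort for 100 initial requirements
```

Extra inputs such as application domain group or peak staff would enter as additional regression terms, which is how the study reports accuracy improving.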

    Inefficiencies in Digital Advertising Markets

    Digital advertising markets are growing and attracting increased scrutiny. This article explores four market inefficiencies that remain poorly understood: ad-effect measurement, frictions between and within advertising channel members, ad blocking, and ad fraud. Although these topics are not unique to digital advertising, each manifests in unique ways in markets for digital ads. The authors identify relevant findings in the academic literature, recent developments in practice, and promising topics for future research.

    Harmonizing Systems and Software Cost Estimation

    The objective of this paper is to examine the gaps and overlaps between software and systems engineering cost models, with the intent to harmonize their estimates. In particular, we evaluate the central assumptions of the COSYSMO and COCOMO II models and propose an approach to identify gaps and overlaps between them. We provide guidelines on how to reconcile and resolve the identified gaps and overlaps. The ultimate purpose of this work is to develop effective techniques for accurately estimating the combined systems and software engineering effort for software-intensive systems.
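For context on the COCOMO II side of the comparison, the published post-architecture effort equation is PM = A · Size^E · ∏EM_i with E = B + 0.01 · ΣSF_j, where A = 2.94 and B = 0.91 are the COCOMO II.2000 calibration constants. The sketch below evaluates that equation; the scale-factor and effort-multiplier inputs are hypothetical example values, and COSYSMO's analogous size drivers are not modeled here.

```python
# Minimal sketch of the COCOMO II post-architecture effort equation:
#   PM = A * Size^E * prod(EM_i),  E = B + 0.01 * sum(SF_j)
# A and B are the published COCOMO II.2000 calibration constants;
# the inputs in the example call are hypothetical.
from math import prod

A, B = 2.94, 0.91


def cocomo2_effort(ksloc, scale_factors, effort_multipliers):
    """Effort in person-months for a project of `ksloc` thousand SLOC."""
    e = B + 0.01 * sum(scale_factors)
    return A * ksloc ** e * prod(effort_multipliers)


# Nominal project: all 17 effort multipliers at 1.0, moderate scale factors.
pm = cocomo2_effort(ksloc=50,
                    scale_factors=[3.7, 3.0, 4.2, 3.3, 4.7],
                    effort_multipliers=[1.0] * 17)
```

Harmonization questions of the kind the paper raises show up directly in this form: for instance, whether systems engineering effort counted by COSYSMO overlaps with effort already captured by COCOMO II's multipliers.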

    Towards making functional size measurement easily usable in practice

    Functional Size Measurement methods – like the IFPUG Function Point Analysis and COSMIC methods – are widely used to quantify the size of applications. However, the measurement process is often too long or too expensive, or it requires more knowledge than is available when development effort estimates are due. To overcome these problems, simplified measurement methods have been proposed. This research explores easily usable functional size measurement methods, aiming to improve efficiency, reduce difficulty and cost, and make functional size measurement widely adopted in practice. The first stage of the research involved the study of functional size measurement methods (in particular Function Point Analysis and COSMIC), simplified methods, and measurement based on measurement-oriented models. Then, we modeled a set of applications in a measurement-oriented way, and obtained UML models suitable for functional size measurement. From these UML models we derived both functional size measures and object-oriented measures. Using these measures it was possible to: 1) evaluate existing simplified functional size measurement methods and derive our own simplified model; 2) explore whether simplified methods can be used in various stages of modeling and evaluate their accuracy; 3) analyze the relationship between functional size measures and object-oriented measures. In addition, the conversion between FPA and COSMIC was studied as an alternative simplified functional size measurement process. Our research revealed that: 1) In general it is possible to size software via simplified measurement processes with acceptable accuracy. In particular, the simplification of the measurement process allows the measurer to skip the function weighting phases, which are usually expensive, since they require a thorough analysis of the details of both data and operations. The models obtained from our dataset yielded results that are similar to those reported in the literature. All simplified measurement methods that use predefined weights for all the transaction and data types identified in Function Point Analysis provided similar results, characterized by acceptable accuracy. On the contrary, methods that rely on just one of the elements that contribute to functional size tend to be quite inaccurate. In general, different methods showed different accuracy for real-time and non-real-time applications. 2) It is possible to write progressively more detailed and complete UML models of user requirements that provide the data required by the simplified COSMIC methods. These models yield progressively more accurate measures of the modeled software. Initial measures are based on simple models and are obtained quickly and with little effort. As UML models grow in completeness and detail, the measures increase in accuracy. Developers that use UML for requirements modeling can obtain early estimates of the applications’ sizes at the beginning of the development process, when only very simple UML models have been built, and can obtain increasingly accurate size estimates as knowledge of the products increases and UML models are refined accordingly. 3) Both Function Point Analysis and COSMIC functional size measures appear correlated with object-oriented measures. In particular, associations with basic object-oriented measures were found: Function Points appear associated with the number of classes, the number of attributes, and the number of methods; CFP appear associated with the number of attributes. This result suggests that even a very basic UML model, like a class diagram, can support size measures that appear equivalent to functional size measures (which are much harder to obtain). Moreover, object-oriented measures can be obtained automatically from models, thus dramatically decreasing the measurement effort in comparison with functional size measurement. In addition, we proposed a conversion method between Function Points and COSMIC based on analytical criteria. Our research has expanded the knowledge on how to simplify the methods for measuring the functional size of software, i.e., the measure of functional user requirements. Besides providing information immediately usable by developers, the research also presents examples of analyses that can be replicated by other researchers, to increase the reliability and generality of the results.
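The "predefined weights" simplification mentioned above can be made concrete: instead of classifying each transaction and data function as low/average/high complexity, every element gets the IFPUG average-complexity weight (EI=4, EO=5, EQ=4, ILF=10, EIF=7), so the expensive weighting phase is skipped. The weights are the standard IFPUG averages; the element counts in the example are made up.

```python
# Sketch of a simplified Function Point count that skips the function
# weighting phase by assigning every element the IFPUG average
# complexity weight. The element counts below are a made-up example.
AVERAGE_WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}


def simplified_ufp(counts):
    """Unadjusted Function Points using fixed average weights only."""
    return sum(AVERAGE_WEIGHTS[kind] * n for kind, n in counts.items())


ufp = simplified_ufp({"EI": 10, "EO": 6, "EQ": 4, "ILF": 5, "EIF": 2})
# 10*4 + 6*5 + 4*4 + 5*10 + 2*7 = 150 unadjusted FP
```

Only the element counts are needed, which is exactly what makes the simplified process usable early, before the detailed data/operation analysis that full weighting requires.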

    Scientific knowledge and scientific uncertainty in bushfire and flood risk mitigation: literature review

    EXECUTIVE SUMMARY The Scientific Diversity, Scientific Uncertainty and Risk Mitigation Policy and Planning (RMPP) project aims to investigate the diversity and uncertainty of bushfire and flood science, and its contribution to risk mitigation policy and planning. The project investigates how policy makers, practitioners, courts, inquiries and the community differentiate, understand and use scientific knowledge in relation to bushfire and flood risk. It uses qualitative social science methods and case studies to analyse how diverse types of knowledge are ordered and judged as salient, credible and authoritative, and the pragmatic meaning this holds for emergency management across the PPRR spectrum. This research report is the second literature review of the RMPP project and was written before any of the case studies had been completed. It synthesises approximately 250 academic sources on bushfire and flood risk science, including research on hazard modelling, prescribed burning, hydrological engineering, development planning, meteorology, climatology and evacuation planning. The report also incorporates theoretical insights from the fields of risk studies and science and technology studies (STS), as well as indicative research regarding the public understandings of science, risk communication and deliberative planning. This report outlines the key scientific practices (methods and knowledge) and scientific uncertainties in bushfire and flood risk mitigation in Australia. Scientific uncertainties are those ‘known unknowns’ and ‘unknown unknowns’ that emerge from the development and utilisation of scientific knowledge. Risk mitigation involves those processes through which agencies attempt to limit the vulnerability of assets and values to a given hazard. 
The focus of this report is the uncertainties encountered and managed by risk mitigation professionals with regard to these two hazards, though literature regarding the natural sciences and the scientific method more generally is also included where appropriate. It is important to note that while this report excludes professional experience and local knowledge from its consideration of uncertainties and knowledge, these are also very important aspects of risk mitigation, which will be addressed in the RMPP project’s case studies. Key findings of this report include: risk and scientific knowledge are both constructed categories, indicating that any individual instance of risk or scientific knowledge should be understood in light of the social, political, economic, and ecological context in which it emerges; and uncertainty is a necessary element of scientific methods, so risk mitigation practitioners and researchers alike should seek to ‘embrace uncertainty’ (Moore et al., 2005) as part of navigating bushfire and flood risk mitigation.
