Early Quantitative Assessment of Non-Functional Requirements
Non-functional requirements (NFRs) of software systems are a well-known source of uncertainty in effort estimation. Yet, quantitatively approaching NFRs early in a project is hard. This paper makes a step towards reducing the impact of uncertainty due to NFRs. It offers a solution that incorporates NFRs into the functional size quantification process. The merits of our solution are twofold: first, it lets us quantitatively assess the NFR modeling process early in the project, and second, it lets us generate test cases for NFR verification purposes. We chose the NFR framework as a vehicle to integrate NFRs into the requirements modeling process and to apply quantitative assessment procedures. Our solution proposal also rests on the functional size measurement method COSMIC-FFP, adopted in 2003 as the ISO/IEC 19761 standard. We extend its use for NFR testing purposes, which is an essential step for improving NFR development and testing effort estimates, and consequently for managing the scope of NFRs. We discuss the advantages of our approach as well as the open questions related to its design.
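To make the COSMIC-FFP basis of the proposal concrete, here is a minimal sketch of how functional size is obtained under ISO/IEC 19761: each functional process is sized by counting its data movements (Entry, Exit, Read, Write) at one COSMIC Function Point (CFP) each. The process names and movement lists below are hypothetical examples, not taken from the paper.

```python
# Illustrative COSMIC-FFP sizing (ISO/IEC 19761): a functional process is
# measured by counting its data movements, 1 CFP per movement.
COSMIC_MOVEMENTS = {"Entry", "Exit", "Read", "Write"}

def functional_size(processes):
    """Return total size in CFP: one point per data movement."""
    total = 0
    for name, movements in processes.items():
        invalid = [m for m in movements if m not in COSMIC_MOVEMENTS]
        if invalid:
            raise ValueError(f"{name}: unknown movement(s) {invalid}")
        total += len(movements)
    return total

# Hypothetical "login" process: credentials enter, stored data is read,
# a result exits, and an audit record is written -> 4 CFP.
processes = {
    "login": ["Entry", "Read", "Exit", "Write"],
    "view_report": ["Entry", "Read", "Exit"],
}
print(functional_size(processes))  # 7
```

The paper's contribution is to feed NFR-derived movements into this same counting scheme, so the sketch shows only the baseline measurement step.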
Valuing Biodiversity in Life Cycle Impact Assessment
Erratum published on 13 March 2020, see Sustainability 2020, 12(6), 2270. https://doi.org/10.3390/su11205628
In this article, the authors propose an impact assessment method for life cycle assessment (LCA) that adheres to established LCA principles for land use-related impact assessment, bridges current research gaps and addresses the requirements of different stakeholders for a methodological framework. The conservation of biodiversity is a priority for humanity, as expressed in the framework of the Sustainable Development Goals (SDGs). Addressing biodiversity across value chains is a key challenge for enabling sustainable production pathways. Life cycle assessment is a standardised approach to assess and compare environmental impacts of products along their value chains. The impact assessment method presented in this article allows the quantification of the impact of land-using production processes on biodiversity for several broad land use classes. It provides a calculation framework with degrees of customisation (e.g., to take into account regional conservation priorities), but also offers a default valuation of biodiversity based on naturalness. The applicability of the method is demonstrated through an example of a consumer product. The main strength of the approach is that it yields highly aggregated information on the biodiversity impacts of products, enabling biodiversity-conscious decisions about raw materials, production routes and end user products.
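A common shape for such land-use impact methods in LCA is a sum, over the processes in the inventory, of occupied area times occupation duration times a characterization factor (CF) for the land use class. The sketch below illustrates that general calculation only; the CF values are hypothetical, not the paper's naturalness-based defaults.

```python
# Generic land-use biodiversity impact in LCA:
# impact = sum over processes of area * duration * CF(land use class),
# where CF expresses biodiversity value lost per m2*year of occupation.
# CF values below are made up for illustration.

def biodiversity_impact(inventory, cf):
    """inventory: list of (land_use_class, area_m2, years).
    cf: characterization factor per land use class (impact per m2*year)."""
    return sum(area * years * cf[land_use] for land_use, area, years in inventory)

cf = {"arable": 0.6, "forestry": 0.2, "urban": 0.9}              # hypothetical
inventory = [("arable", 1000.0, 1.0), ("forestry", 500.0, 2.0)]  # hypothetical
print(biodiversity_impact(inventory, cf))  # 800.0
```

The customisation the article describes (e.g., regional conservation priorities) would enter through the CF table, while the aggregation step stays the same.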
Requirements Prioritization Based on Benefit and Cost Prediction: An Agenda for Future Research
In early phases of the software life cycle, requirements prioritization necessarily relies on the specified requirements and on predictions of the benefit and cost of individual requirements. This paper presents the results of a systematic literature review that investigates how existing methods approach the problem of requirements prioritization based on benefit and cost. From this review, it derives a set of under-researched issues which warrant future efforts and sketches an agenda for future research in this area.
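A simple baseline that benefit/cost prioritization methods build on is ranking requirements by their predicted benefit-to-cost ratio. The sketch below shows only that baseline; the requirement names and estimates are hypothetical, and real methods additionally handle dependencies and estimation uncertainty.

```python
# Baseline benefit/cost prioritization: rank requirements by the ratio of
# predicted benefit to predicted cost, highest first.

def prioritize(requirements):
    """requirements: dict name -> (predicted_benefit, predicted_cost).
    Returns names sorted by benefit/cost ratio, highest first."""
    return sorted(requirements,
                  key=lambda r: requirements[r][0] / requirements[r][1],
                  reverse=True)

# Hypothetical estimates: (benefit, cost) in arbitrary units.
reqs = {"single_sign_on": (80, 20), "dark_mode": (30, 5), "audit_log": (50, 25)}
print(prioritize(reqs))  # ['dark_mode', 'single_sign_on', 'audit_log']
```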
Network Community Detection on Metric Space
Community detection in complex networks is a problem of much interest in recent years. In general, a community detection algorithm chooses an objective function and captures the communities of the network by optimizing that objective function; various heuristics are then used to solve the optimization problem and extract the communities of interest to the user. In this article, we demonstrate a procedure to transform a graph into points of a metric space and develop methods of community detection based on a metric defined for pairs of points. We also study and analyze the community structure of the network in this setting. The results obtained with our approach are very competitive with those of most well-known algorithms in the literature, as demonstrated over a large collection of datasets. Moreover, the time taken by our algorithm is considerably less than that of other methods, which corroborates the theoretical findings.
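The general idea of embedding a graph into a metric space can be sketched as follows (this is an illustration of the approach, not the paper's exact algorithm): map each node to its vector of shortest-path distances to all nodes, so the graph becomes points in a metric space, then group points that lie close together under the Euclidean metric. The toy graph and distance threshold are made up for the example.

```python
# Sketch: embed nodes as shortest-path distance vectors, then cluster
# points of the resulting metric space by a Euclidean distance threshold.
from collections import deque
import math

def bfs_distances(adj, src):
    """Hop distances from src to every node of a connected graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def communities(adj, threshold):
    nodes = sorted(adj)
    # Metric-space embedding: one coordinate per node of the graph.
    emb = {u: [bfs_distances(adj, u)[v] for v in nodes] for u in nodes}
    def d(u, v):
        return math.dist(emb[u], emb[v])
    groups = []
    for u in nodes:
        for g in groups:
            if all(d(u, v) <= threshold for v in g):
                g.append(u)
                break
        else:
            groups.append([u])
    return groups

# Two triangles joined by a single bridge edge (2-3).
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
print(communities(adj, 2.5))  # [[0, 1, 2], [3, 4, 5]]
```

The greedy threshold clustering stands in for whatever optimization heuristic the paper uses; the embedding step is the part that turns community detection into a metric-space problem.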
A Periodicity Metric for Assessing Maintenance Strategies
Organised by: Cranfield University
The maintenance policy in manufacturing systems is devised to reset the machines' functionality in an economical fashion in order to keep product quality within acceptable levels. Therefore, there is a need for a metric to evaluate and quantify function resetting due to the adopted maintenance policy. A novel metric for measuring functional periodicity has been developed using complexity theory. It is based on the rate and extent of function resetting, and it can be used as an important criterion for comparing different maintenance policy alternatives. An industrial example is used to illustrate the application of the new metric.
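The abstract describes the metric only as being based on the rate and extent of function resetting. As a purely hypothetical illustration of that idea (not the published definition), one could combine the reset rate over an observation horizon with the mean extent of each reset:

```python
# Hypothetical periodicity-style score, illustrating "rate and extent of
# function resetting": score = reset rate * mean restored fraction.
# This is NOT the paper's formula, only a sketch of the two ingredients.

def periodicity_score(resets, horizon):
    """resets: list of restored-functionality fractions (0..1), one per
    maintenance action within the observation horizon (time units)."""
    if not resets:
        return 0.0
    rate = len(resets) / horizon            # resets per unit time
    extent = sum(resets) / len(resets)      # mean extent of each reset
    return rate * extent

# Preventive policy (frequent full resets) vs. corrective (rare, partial).
print(periodicity_score([1.0, 1.0, 1.0, 1.0], 100))  # 0.04
print(periodicity_score([0.5, 0.6], 100))            # 0.011
```

A score of this shape would let two maintenance policies be compared on a single axis, which is the use the abstract proposes for the real metric.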
Discovering, quantifying, and displaying attacks
In the design of software and cyber-physical systems, security is often perceived as a qualitative need, but can only be attained quantitatively. Especially when distributed components are involved, it is hard to predict and confront all possible attacks. A main challenge in the development of complex systems is therefore to discover attacks, quantify them to comprehend their likelihood, and communicate them to non-experts to facilitate the decision process. To address this three-sided challenge, we propose a protection analysis over the Quality Calculus that (i) computes all the sets of data required by an attacker to reach a given location in a system, (ii) determines the cheapest set of such attacks for a given notion of cost, and (iii) derives an attack tree that displays the attacks graphically. The protection analysis is first developed in a qualitative setting, and then extended to quantitative settings following an approach applicable to a great many contexts. The quantitative formulation is implemented as an optimisation problem encoded into Satisfiability Modulo Theories, allowing us to deal with complex cost structures. The usefulness of the framework is demonstrated on a national-scale authentication system, studied through a Java implementation of the framework.
Comment: LMCS Special Issue FORTE 201
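The paper encodes the cheapest-attack question as an optimisation problem over Satisfiability Modulo Theories; as a minimal stand-in for that encoding, the sketch below brute-forces it directly: given the sets of data each sufficient to reach a target location (step (i) of the analysis), pick the set with the lowest total cost (step (ii)). The data items and costs are hypothetical.

```python
# Brute-force stand-in for the SMT-encoded optimisation: among the sets of
# data each sufficient to reach the target, return the cheapest one.

def cheapest_attack(sufficient_sets, cost):
    """sufficient_sets: iterable of frozensets of data items, each enough
    to reach the target location. cost: price of obtaining each item."""
    return min(sufficient_sets, key=lambda s: sum(cost[x] for x in s))

# Hypothetical attacker's options against an authentication service.
cost = {"password": 5, "otp_seed": 8, "session_token": 2, "admin_key": 20}
sufficient = [frozenset({"password", "otp_seed"}),
              frozenset({"session_token"}),
              frozenset({"admin_key"})]
best = cheapest_attack(sufficient, cost)
print(sorted(best), sum(cost[x] for x in best))  # ['session_token'] 2
```

The SMT encoding earns its keep when cost structures are complex (shared items, conditional costs) and enumeration becomes infeasible; the brute force above only conveys the objective being optimised.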
Non-functional requirements: size measurement and testing with COSMIC-FFP
The non-functional requirements (NFRs) of software systems are well known to add a degree of uncertainty to the process of estimating the cost of any project. This paper contributes to more precise project size measurement by incorporating NFRs into the functional size quantification process. We report on an initial solution proposed to deal with the problem of quantitatively assessing the NFR modeling process early in the project, and of generating test cases for NFR verification purposes. The NFR framework has been chosen for the integration of NFRs into the requirements modeling process and for their quantitative assessment. Our proposal is based on the functional size measurement method COSMIC-FFP, adopted in 2003 as the ISO/IEC 19761 standard. In this paper, we also extend the use of COSMIC-FFP for NFR testing purposes. This is an essential step for improving NFR development and testing effort estimates, and consequently for managing the scope of NFRs. We discuss the merits of the proposed approach and the open questions related to its design.