Dependency-Aware Software Requirements Selection using Fuzzy Graphs and Integer Programming
Software requirements selection aims to find an optimal subset of the
requirements with the highest value while respecting the project constraints.
But the value of a requirement may depend on the presence or absence of other
requirements in the optimal subset. Such Value Dependencies, however, are
imprecise and hard to capture. In this paper, we propose a method based on
integer programming and fuzzy graphs to account for value dependencies and
their imprecision in software requirements selection. The proposed method,
referred to as Dependency-Aware Software Requirements Selection (DARS), is
comprised of three components: (i) an automated technique for the
identification of value dependencies from user preferences, (ii) a modeling
technique based on fuzzy graphs that allows for capturing the imprecision of
value dependencies, and (iii) an Integer Linear Programming (ILP) model that
takes into account user preferences and value dependencies identified from
those preferences to reduce the risk of value loss in software projects. Our
work is verified by studying a real-world software project. The results show
that our proposed method reduces the value loss in software projects and is
scalable to large requirement sets.
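The DARS model itself is not reproduced in the abstract. As an illustration only, the following toy sketch (hypothetical requirement names, values, costs, and synergies) shows the core idea of dependency-aware selection: the value of a subset is not just the sum of individual values, because pairwise value dependencies add or remove value when both requirements are selected.

```python
from itertools import combinations

# Hypothetical toy instance: 4 requirements with base values, costs,
# and pairwise value dependencies (delta applied if both are selected).
values = {"r1": 10, "r2": 6, "r3": 8, "r4": 4}
costs = {"r1": 5, "r2": 3, "r3": 4, "r4": 2}
synergy = {("r1", "r2"): 3, ("r3", "r4"): -2}  # negative = value loss
budget = 10

def subset_value(subset):
    """Base values plus dependency adjustments for co-selected pairs."""
    total = sum(values[r] for r in subset)
    for (a, b), delta in synergy.items():
        if a in subset and b in subset:
            total += delta
    return total

# Exhaustively pick the feasible subset of highest dependency-aware value.
best = max(
    (s for k in range(len(values) + 1)
       for s in combinations(values, k)
       if sum(costs[r] for r in s) <= budget),
    key=subset_value,
)
```

A real instance would solve this as an ILP with linearized dependency terms, as the paper proposes, rather than by exhaustive enumeration, which is only viable for tiny requirement sets.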
Replan: Release planning for agile development
Release planning methodologies have made it possible for project managers
and users in general to plan a project's releases. These methods try
to automate human-based planning processes. There are currently a
few web-based and stand-alone release planning tools, but not all of
them offer the same functionality, such as updating an already planned
release or presenting a detailed plan as a timeline. Moreover, these systems
are oriented to stakeholder criteria without taking sufficient account
of the available resources. This is a limitation, because on many occasions
it is vital to have a temporal plan of a release; it also affects key
aspects such as planning efficiency and the speed at which planning is executed.
In this project a web-based release planning tool has been developed.
With this tool, users can create a release with its different entities in an
easy and simple way. The tool is based on a mathematical model that generates
a schedule as tight as the available time and resources allow. The tool
also guarantees that feature priorities are fulfilled, by respecting any
temporal criteria the user establishes.
The system is also modular, as it can be integrated with other
visualizations. Its deployment on a cloud server also provides
public access and scalability.
The tests performed on the system show that the presented mathematical
model yields a scheduled and efficient plan of a project's release.
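The abstract does not include the underlying mathematical model. As a rough, hypothetical illustration of priority-driven planning under a resource constraint (all feature names and numbers are invented), a release can be packed greedily by priority until capacity runs out:

```python
# Hypothetical toy sketch of a priority-driven release planner:
# features with an effort estimate (person-days) and a priority rank
# are packed into a release of fixed capacity, highest priority first.
features = [
    {"name": "login",  "effort": 5, "priority": 1},
    {"name": "search", "effort": 8, "priority": 2},
    {"name": "export", "effort": 4, "priority": 3},
    {"name": "themes", "effort": 6, "priority": 4},
]
capacity = 15  # person-days available in the release

def plan_release(features, capacity):
    """Return the features that fit, respecting priority order."""
    plan, used = [], 0
    for f in sorted(features, key=lambda f: f["priority"]):
        if used + f["effort"] <= capacity:
            plan.append(f["name"])
            used += f["effort"]
    return plan, used

plan, used = plan_release(features, capacity)  # → (["login", "search"], 13)
```

Replan's actual model is richer (timelines, updates to already planned releases, multiple resources); this sketch only shows why respecting priorities under limited resources is, at heart, a packing problem.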
Recommended from our members
Combinatorial optimization and metaheuristics
Today, combinatorial optimization is one of the youngest and most active areas of discrete mathematics. It is a branch of optimization in applied mathematics and computer science, related to operational research, algorithm theory and computational complexity theory. It sits at the intersection of several fields, including artificial intelligence, mathematics and software engineering. Interest in it keeps growing because a large number of scientific and industrial problems can be formulated as abstract combinatorial optimization problems, through graphs and/or (integer) linear programs. Some of these problems admit polynomial-time (“efficient”) algorithms, while most of them are NP-hard, i.e. no polynomial-time algorithm is known for them (and none exists unless P = NP). In practice, this means that an exact solution often cannot be computed in reasonable time, and one has to settle for an approximate solution, ideally with known performance guarantees. Indeed, the goal of approximate methods is to find “quickly” (in reasonable run-times), with “high” probability, provably “good” solutions (with low error relative to the true optimum). In the last 20 years, a new class of algorithms, commonly called metaheuristics, has emerged, which basically combine heuristics in high-level frameworks aimed at exploring the search space efficiently and effectively. This report briefly outlines the components, concepts, advantages and disadvantages of different metaheuristic approaches from a conceptual point of view, in order to analyze their similarities and differences. The two very significant forces of intensification and diversification, which mainly determine the behavior of a metaheuristic, will be pointed out. The report concludes by exploring the importance of hybridization and integration methods.
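To make the intensification/diversification trade-off concrete, here is a minimal simulated-annealing sketch, one classic metaheuristic; the toy objective and all parameters are invented for illustration:

```python
import math
import random

def simulated_annealing(cost, start, neighbor,
                        t0=10.0, cooling=0.95, steps=500, seed=0):
    """High temperature favors diversification (worse moves are often
    accepted); cooling gradually shifts the search toward intensification
    around good solutions."""
    rng = random.Random(seed)
    current = best = start
    t = t0
    for _ in range(steps):
        cand = neighbor(current, rng)
        delta = cost(cand) - cost(current)
        # Always accept improvements; accept worse moves with a
        # probability that shrinks as the temperature drops.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < cost(best):
                best = current
        t *= cooling  # diversification -> intensification over time
    return best

# Toy objective: minimise (x - 3)^2 over the integers, starting far away.
best = simulated_annealing(
    cost=lambda x: (x - 3) ** 2,
    start=20,
    neighbor=lambda x, rng: x + rng.choice([-1, 1]),
)
```

The same accept/cool skeleton underlies many metaheuristics; what varies across them is how neighborhoods are defined and how the balance between exploring widely and exploiting good regions is controlled.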
How to Place Your Apps in the Fog -- State of the Art and Open Challenges
Fog computing aims at extending the Cloud towards the IoT so as to achieve
improved QoS and to empower latency-sensitive and bandwidth-hungry
applications. The Fog calls for novel models and algorithms to distribute
multi-service applications in such a way that data processing occurs wherever
it is best-placed, based on both functional and non-functional requirements.
This survey reviews the existing methodologies to solve the application
placement problem in the Fog, while pursuing three main objectives. First, it
offers a comprehensive overview on the currently employed algorithms, on the
availability of open-source prototypes, and on the size of test use cases.
Second, it classifies the literature based on the application and Fog
infrastructure characteristics that are captured by available models, with a
focus on the considered constraints and the optimised metrics. Finally, it
identifies some open challenges in application placement in the Fog.
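As a concrete, entirely hypothetical illustration of the placement problem the survey reviews, a tiny instance with two services and two nodes can be solved by checking every assignment against capacity and latency constraints:

```python
from itertools import product

# Hypothetical toy instance of the application placement problem:
# each service needs CPU capacity on its node, and latency-sensitive
# services must land on low-latency (edge) nodes.
services = {"sensor-ingest": {"cpu": 1, "max_latency": 10},
            "analytics":     {"cpu": 4, "max_latency": 100}}
nodes = {"edge-1":  {"cpu": 2, "latency": 5},
         "cloud-1": {"cpu": 8, "latency": 50}}

def feasible(placement):
    """Check latency requirements and per-node CPU capacity."""
    used = {n: 0 for n in nodes}
    for svc, node in placement.items():
        if nodes[node]["latency"] > services[svc]["max_latency"]:
            return False
        used[node] += services[svc]["cpu"]
    return all(used[n] <= nodes[n]["cpu"] for n in nodes)

# Enumerate every service-to-node assignment and keep the feasible ones.
candidates = [dict(zip(services, combo))
              for combo in product(nodes, repeat=len(services))]
solutions = [p for p in candidates if feasible(p)]
```

Real Fog placement models add bandwidth, multi-service interactions, and optimised metrics on top of such constraints, and exhaustive search is only viable at this toy scale, which is precisely why the surveyed literature resorts to dedicated algorithms and heuristics.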
Optimizing the Prioritization of Natural Disaster Recovery Projects
Prioritizing reconstruction projects to recover a base from a natural disaster is a complicated and arduous process that involves all levels of leadership. The project prioritization phase of base recovery has a direct effect on the allocation of funding, the utilization of human resources, the obligation of projects, and the overall speed and efficiency of the recovery process. The focus of this research is the development of an objective and repeatable process for optimizing the project prioritization phase of the recovery effort. This work focuses on promoting objectivity in the project prioritization process, improving the communication of the overall base recovery requirement, increasing efficiency in the use of human and monetary resources, and creating a usable and repeatable decision-making tool based on Value-Focused Thinking and integer programming methods.
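Neither the value model nor the integer program is given in the abstract. As an illustration under invented weights, scores, and costs, a Value-Focused-Thinking style additive score can feed a 0/1 project selection under a funding budget (solved here by brute force instead of an ILP solver to stay dependency-free):

```python
from itertools import combinations

# Hypothetical VFT-style value hierarchy: attribute weights sum to 1,
# each project is scored per attribute, and selection is a 0/1 choice
# under a funding budget (an ILP in spirit, enumerated here).
weights = {"mission": 0.5, "safety": 0.3, "cost_avoidance": 0.2}
projects = {
    "runway":  {"mission": 9, "safety": 8, "cost_avoidance": 3, "cost": 6},
    "housing": {"mission": 4, "safety": 9, "cost_avoidance": 5, "cost": 5},
    "gym":     {"mission": 2, "safety": 3, "cost_avoidance": 7, "cost": 3},
}
budget = 9

def score(p):
    """Weighted additive value of a single project."""
    return sum(weights[a] * projects[p][a] for a in weights)

# Pick the affordable project set with the highest total value.
best = max(
    (s for k in range(len(projects) + 1)
       for s in combinations(projects, k)
       if sum(projects[p]["cost"] for p in s) <= budget),
    key=lambda s: sum(score(p) for p in s),
)
```

Note that the highest-scoring project alone may not be the best use of the budget; the selection step trades individual scores against costs, which is exactly what the integer-programming formulation captures at scale.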