4 research outputs found
Automated Analysis of Diverse Variability Models with Tool Support
Over the past twenty years, there have been many contributions
in the area of automated analysis of variability models. However,
most of this research has focused on feature models. We propose
that the knowledge obtained during recent years on the analysis
of feature models can be applied to automatically analyse different variability
models. In this paper we present FaMa OVM and FaMa DEB,
which are prototypical implementations for the automated analysis of
two distinct variability models, namely Orthogonal Variability Models
and Debian Variability Models, respectively. In order to minimise effort
and benefit from feature model know-how, we use the FaMa Framework,
which allows the development of analysis tools for diverse variability
modelling languages. This framework provides a well-tested system that
guides the tool development. Due to the structure provided by the framework,
FaMa OVM and FaMa DEB tools are easy to extend and integrate
with other tools. We report on the main points of both tools, such as the
analysis operations provided and the logical solvers used for the analysis.
Comisión Interministerial de Ciencia y Tecnología (CICYT) TIN2012-32273
Junta de Andalucía TIC-5906
Junta de Andalucía P12-TIC-186
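The abstract does not show FaMa's actual API, so as a rough illustration of what an automated analysis operation over a variability model involves, the sketch below brute-forces a hypothetical five-feature model (all feature names and rules are invented for the example) and computes the number of valid products:

```python
from itertools import combinations

# Hypothetical toy feature model -- the names and rules below are
# illustrative only, not FaMa's real API. Root feature "car" has a
# mandatory child "engine", an optional "gps", and an alternative
# group {"petrol", "electric"} (exactly one must be selected).
FEATURES = ["car", "engine", "gps", "petrol", "electric"]

def is_valid(product):
    p = set(product)
    if "car" not in p:                       # root is always selected
        return False
    if "engine" not in p:                    # mandatory child of root
        return False
    if len(p & {"petrol", "electric"}) != 1: # alternative group: pick exactly one
        return False
    return True                              # "gps" is optional: no constraint

def all_products():
    """Enumerate every valid product -- the 'all products' analysis operation."""
    products = []
    for r in range(len(FEATURES) + 1):
        for combo in combinations(FEATURES, r):
            if is_valid(combo):
                products.append(set(combo))
    return products

products = all_products()
print(len(products))       # the 'number of products' operation -> 4
print(len(products) > 0)   # the 'void model' check -> True (model is satisfiable)
```

Brute-force enumeration is exponential in the number of features; practical analysers, as the abstract notes, instead translate the model into a logical formula and delegate these operations to a solver.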
ETHOM: An Evolutionary Algorithm for Optimized Feature Models Generation - TECHNICAL REPORT ISA-2012-TR-01 (v. 1.1)
A feature model defines the valid combinations of features in a domain.
The automated extraction of information from feature models is a thriving topic involving numerous analysis operations, techniques and tools.
The progress of this discipline is leading to an increasing concern to test
and compare the performance of analysis solutions using tough input models
that show the behaviour of the tools in extreme situations (e.g. those
producing the longest execution times or highest memory consumption). Currently,
these feature models are generated randomly, ignoring the internal
aspects of the tools under test. As a result, they only provide a rough idea
of the behaviour of the tools on average problems and are not sufficient
to reveal their real strengths and weaknesses.
In this technical report, we model the problem of finding computationally
hard feature models as an optimization problem and solve it using a
novel evolutionary algorithm. Given a tool and an analysis operation, our
algorithm generates input models of a predefined size maximizing aspects
such as the execution time or the memory consumption of the tool when
performing the operation over the model. This allows users and developers to
know the behaviour of tools in pessimistic cases, providing a better idea of
their real power. Experiments using our evolutionary algorithm on a number
of analysis operations and tools have successfully identified input models
causing much longer execution times and higher memory consumption
than random models of identical or even larger size. Our solution is generic
and applicable to a variety of optimization problems on feature models, not
only those involving analysis operations. In view of the positive results, we
expect this work to be the seed for a new wave of research contributions
exploiting the benefit of evolutionary programming in the field of feature
modelling.
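ETHOM's actual chromosome encoding and genetic operators are not detailed in the abstract; the following is only a generic sketch of the evolutionary loop it describes, with a bit-string individual and a synthetic fitness standing in for the measured execution time or memory consumption of a real analysis tool (all identifiers are hypothetical):

```python
import random

random.seed(42)

GENES = 20  # stand-in for the predefined model size

def fitness(ind):
    # In the real setting this would decode `ind` into a feature model,
    # run the analysis tool on it, and return the measured time or memory.
    # Here a synthetic "count of ones" objective stands in for that measure.
    return sum(ind)

def mutate(ind, rate=0.05):
    # Flip each gene with a small probability.
    return [1 - g if random.random() < rate else g for g in ind]

def crossover(a, b):
    cut = random.randrange(1, GENES)  # one-point crossover
    return a[:cut] + b[cut:]

def evolve(pop_size=30, generations=50):
    pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]  # truncation selection; elite survives unchanged
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))  # should approach GENES, the maximum of this toy objective
```

Because the elite half of the population survives each generation unchanged, the best fitness found never decreases, which matters when each fitness evaluation is an expensive tool run.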
ETHOM: An Evolutionary Algorithm for Optimized Feature Models Generation (v. 1.2): Technical Report ISA-2012-TR-05
A feature model defines the valid combinations of features in a domain.
The automated extraction of information from feature models is a thriving
topic involving numerous analysis operations, techniques and tools.
The progress of this discipline is leading to an increasing concern to test
and compare the performance of analysis solutions using tough input models
that show the behaviour of the tools in extreme situations (e.g. those
producing the longest execution times or highest memory consumption). Currently,
these feature models are generated randomly, ignoring the internal
aspects of the tools under test. As a result, they only provide a rough idea
of the behaviour of the tools on average problems and are not sufficient
to reveal their real strengths and weaknesses.
In this technical report, we model the problem of finding computationally
hard feature models as an optimization problem and solve it using a
novel evolutionary algorithm. Given a tool and an analysis operation, our
algorithm generates input models of a predefined size maximizing aspects
such as the execution time or the memory consumption of the tool when
performing the operation over the model. This allows users and developers to
know the behaviour of tools in pessimistic cases, providing a better idea of
their real power. Experiments using our evolutionary algorithm on a number
of analysis operations and tools have successfully identified input models
causing much longer execution times and higher memory consumption
than random models of identical or even larger size. Our solution is generic
and applicable to a variety of optimization problems on feature models, not
only those involving analysis operations. In view of the positive results, we
expect this work to be the seed for a new wave of research contributions
exploiting the benefit of evolutionary programming in the field of feature
modelling.