Automated Analysis of Orthogonal Variability Models Using Constraint Programming.
Software Product Line (SPL) engineering is about producing a family of products that share commonalities and variabilities. Variability models are used for variability management in SPLs, and their automated analysis has become an active research area. In this paper we focus on the automated analysis of the Orthogonal Variability Model (OVM), a modelling language for representing variability. The automated analysis of OVMs deals with the computer-aided extraction of information from OVMs; it has hardly been explored and currently has no tool support. Building on our know-how in analysing feature models, which are the most popular variability models in SPLs, we propose to automate the analysis of OVMs by means of constraint programming. In addition, we propose to extend OVMs with attributes, allowing extra-functional information to be added to OVMs. With this proposal we contribute a step forward towards tool support for analysing OVMs.
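The mapping from an OVM to a constraint program can be sketched with a toy model. Everything below (the variation point, the variant names, the cardinality, and the excludes rule) is a hypothetical example, and the constraints are checked by brute-force enumeration rather than a real constraint solver:

```python
from itertools import product

# Hypothetical toy OVM: boolean variables for one variation point (VP)
# and its variants (V); constraints encode assumed OVM semantics.
variables = ["VP_payment", "V_card", "V_cash", "V_voucher"]

constraints = [
    # The variation point is part of every product (mandatory VP).
    lambda a: a["VP_payment"],
    # Alternative choice [1..2]: select one or two variants of VP_payment.
    lambda a: 1 <= a["V_card"] + a["V_cash"] + a["V_voucher"] <= 2,
    # Excludes constraint between two variants.
    lambda a: not (a["V_cash"] and a["V_voucher"]),
]

def products():
    """Enumerate all assignments that satisfy every constraint."""
    for bits in product([False, True], repeat=len(variables)):
        a = dict(zip(variables, bits))
        if all(c(a) for c in constraints):
            yield a

sols = list(products())
print("void model:", not sols)   # a void model has no products at all
# A dead element is one that appears in no product.
dead = [v for v in variables if not any(s[v] for s in sols)]
print("dead elements:", dead)
```

A CP solver would replace the exhaustive loop with propagation and search, but the encoding idea (one boolean per element, one constraint per OVM relationship) is the same.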
Automated Analysis of Diverse Variability Models with Tool Support
Over the past twenty years, there have been many contributions in the area of automated analysis of variability models. However, the majority of this research has focused on feature models. We propose that the knowledge obtained during recent years on the analysis of feature models can be applied to automatically analyse different variability models. In this paper we present FaMa OVM and FaMa DEB, which are prototypical implementations for the automated analysis of two distinct variability models, namely Orthogonal Variability Models and Debian Variability Models, respectively. In order to minimise effort and benefit from the feature model know-how, we use the FaMa Framework, which allows the development of analysis tools for diverse variability modelling languages. This framework provides a well-tested system that guides the tool development. Due to the structure provided by the framework, the FaMa OVM and FaMa DEB tools are easy to extend and integrate with other tools. We report on the main points of both tools, such as the analysis operations provided and the logical solvers used for the analysis.
Comisión Interministerial de Ciencia y Tecnología (CICYT) TIN2012-32273; Junta de Andalucía TIC-5906; Junta de Andalucía P12-TIC-186
A Petri Net approach for representing Orthogonal Variability Models
The software product line (SPL) paradigm is used for developing software system products from a set of reusable artifacts, known as the platform. Orthogonal Variability Modeling (OVM) is a technique for representing and managing the variability and composition of those artifacts for deriving products in the SPL. Nevertheless, OVM does not support the formal analysis of the models. For example, the detection of dead artifacts (i.e., artifacts that cannot be included in any product) is an exhaustive activity which implies verifying the relationships between artifacts, their parents, and so on. In this work, we introduce a Petri net approach for representing and analyzing OVM models. The proposed net is built from elemental topologies that represent OVM concepts and relationships. Finally, we simulate the net and study its properties in order to avoid product feasibility problems.
Authors: Cristian Martinez, Horacio Pascual Leone, Silvio Miguel Gonnet (Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Científico Tecnológico Santa Fe, Instituto de Desarrollo y Diseño; Argentina)
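The dead-artifact check described above can be illustrated with a tiny place/transition net. The net below is a hypothetical example, not the elemental topologies from the paper; a breadth-first search over reachable markings finds places that are never marked, i.e., artifacts that can never be included:

```python
from collections import deque

# Minimal place/transition net (illustrative): a transition consumes one
# token from each input place and produces one on each output place.
pre  = {"pick_card": ["vp"], "pick_cash": ["vp"]}
post = {"pick_card": ["card"], "pick_cash": ["cash"]}
places = ["vp", "card", "cash", "voucher"]  # "voucher" has no producing transition

def reachable(m0):
    """BFS over markings (tuples of token counts) from the initial marking m0."""
    seen, queue = {m0}, deque([m0])
    while queue:
        marking = dict(zip(places, queue.popleft()))
        for t in pre:
            if all(marking[p] >= 1 for p in pre[t]):   # transition enabled?
                nxt = dict(marking)
                for p in pre[t]:
                    nxt[p] -= 1                        # consume tokens
                for p in post[t]:
                    nxt[p] += 1                        # produce tokens
                tm = tuple(nxt[p] for p in places)
                if tm not in seen:
                    seen.add(tm)
                    queue.append(tm)
    return seen

marks = reachable((1, 0, 0, 0))   # one token on the variation point
# A dead artifact is a place never marked in any reachable marking.
dead = [p for p in places if all(m[places.index(p)] == 0 for m in marks)]
print(dead)
```

Here no transition ever produces a token on "voucher", so the reachability analysis reports it as dead, which is the kind of feasibility problem the paper's simulation is meant to expose.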
A Graphical Web Tool with DL-based Reasoning Support over Orthogonal Variability Models
Variability management is one of the most challenging tasks in Software Product Line (SPL) development. This is reflected in the way software is developed, maintained and extended. Therefore, automatic variability analysis has emerged in order to validate models in early development stages, avoiding impacts on the quality of derived products. In this work, we present crowd-variability, a novel graphical tool designed for modelling and validating Orthogonal Variability Models (OVM) using Description Logics (DL)-based reasoning services. We describe the tool and demonstrate the usage of the first prototype along with examples of use. Currently, we are working to release the first beta version of crowd-variability.
X Workshop Innovación en Sistemas de Software (WISS); Red de Universidades con Carreras en Informática (RedUNCI)
Configuration Analysis for Large Scale Feature Models: Towards Speculative-Based Solutions
High-variability systems are software systems in which variability management is a central activity. Current examples of high-variability systems are the Drupal web content management system, the Linux kernel, and the Debian Linux distributions.
Configuration in high-variability systems is the selection of configuration options according to their configuration constraints and the user requirements. Feature models are a de facto standard for modelling the common and variable functionalities of high-variability systems. However, the large number of components and configurations that a feature model may contain makes the manual analysis of these models a costly and error-prone task. This gave rise to the automated analysis of feature models, with computer-aided mechanisms and tools to extract information from these models. Traditional solutions for the automated analysis of feature models follow a sequential-computing approach that uses a single central processing unit and memory. These solutions are adequate for small-scale systems. However, they incur high computational costs when working with large-scale, high-variability systems. Although computing resources exist to improve the performance of such solutions, any solution with a sequential-computing approach must be adapted to use these resources efficiently and to optimise its computational performance. Examples of these resources are multi-core technology for parallel computing and network technology for distributed computing.
This thesis explores the adaptation and scalability of solutions for the automated analysis of large-scale feature models. First, we present the use of speculative programming to parallelise solutions. In addition, we look at a configuration problem from another perspective, solving it by adapting and applying a non-traditional solution. Finally, we validate the scalability and computational performance improvements of these solutions for the automated analysis of large-scale feature models.
Specifically, the main contributions of this thesis are:
• Speculative programming for preferred minimal conflict detection. Minimal conflict detection algorithms determine the minimal set of conflicting constraints that are responsible for defective behaviour in the model under analysis. We propose a solution that uses speculative programming to execute in parallel, and thus reduce the running time of, the computationally expensive operations that determine the flow of action in preferred minimal conflict detection in large-scale feature models.
• Speculative programming for a preferred minimal diagnosis. Minimal diagnosis algorithms determine a minimal set of constraints that, by a suitable adaptation of their state, yield a consistent, conflict-free model. This work presents a solution for preferred minimal diagnosis in large-scale feature models through the speculative, parallel execution of the computationally expensive operations that determine the flow of action, thereby reducing the running time of the solution.
• Minimal and preferred completion of a model configuration by diagnosis. Solutions for completing a partial configuration determine a set of options, not necessarily minimal or preferred, to obtain a complete configuration. This thesis addresses the minimal and preferred completion of a model configuration by means of techniques previously used in the context of feature model diagnosis.
This thesis verifies that all our solutions preserve the expected output values, and also shows their performance improvements in the automated analysis of large-scale feature models for the operations described.
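The minimal conflict detection that the first contribution parallelises can be sketched sequentially. The constraints below and the deletion-based shrinking loop are illustrative assumptions; the thesis' speculative, parallel variant is not reproduced here:

```python
from itertools import product

def consistent(constraints, variables):
    """True if some boolean assignment satisfies every constraint."""
    for bits in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, bits))
        if all(c(assignment) for c in constraints):
            return True
    return False

def minimal_conflict(constraints, variables):
    """Shrink an inconsistent constraint set to a minimal conflict.
    Preference order is respected by trying to drop the least preferred
    (last) constraints first, so preferred constraints survive longest."""
    conflict = list(constraints)
    for c in reversed(list(constraints)):       # least preferred first
        rest = [x for x in conflict if x is not c]
        if not consistent(rest, variables):     # still inconsistent without c?
            conflict = rest                     # then c is not part of the conflict
    return conflict

variables = ["a", "b"]
c1 = lambda s: s["a"]        # a must hold
c2 = lambda s: not s["a"]    # a must not hold: clashes with c1
c3 = lambda s: s["b"]        # independent, not part of any conflict
cs = minimal_conflict([c1, c2, c3], variables)
print(len(cs))               # the minimal conflict is {c1, c2}
```

Each iteration performs a full consistency check, which is exactly the kind of expensive, flow-determining operation the thesis proposes to run speculatively in parallel.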
Automated analysis of feature models: Quo vadis?
Feature models have been used since the 1990s to describe software product lines as a way of reusing common parts in a family of software systems. In 2010, a systematic literature review was published summarizing the advances and settling the basis of the area of Automated Analysis of Feature Models (AAFM). Since then, different studies have applied AAFM in different domains. In this paper, we provide an overview of the evolution of this field since 2010 by performing a systematic mapping study considering 423 primary sources. We found six variability facets where AAFM is being applied that define the tendencies: product configuration and derivation; testing and evolution; reverse engineering; multi-model variability analysis; variability modelling; and variability-intensive systems. We also confirmed that there is a lack of industrial evidence in most cases. Finally, we present where and when the papers have been published and which authors and institutions are contributing to the field. The maturity of the area is shown by the increase in the number of journal publications over the years, as well as by the diversity of conferences and workshops where papers are published. We also suggest synergies with other areas, such as cloud or mobile computing, that can motivate further research in the future.
Ministerio de Economía y Competitividad TIN2015-70560-R; Junta de Andalucía TIC-186
Quality-Aware Analysis in Product Line Engineering with the Orthogonal Variability Model
Software product line engineering (SPLE) is about producing a set of similar
products in a certain domain. A variability model documents the variability amongst products
in a product line. The specification of variability can be extended with quality information,
such as measurable quality attributes (e.g., CPU and memory consumption) and constraints
on these attributes (e.g., memory consumption should be in a range of values). However,
the wrong use of constraints may cause anomalies in the specification which must be
detected (e.g., the model could represent no products). Furthermore, based on such quality
information it is possible to carry out quality-aware analyses, i.e., the product line engineer
may want to verify whether it is possible to build a product that satisfies a desired quality.
The challenge for quality-aware specification and analysis is three-fold. First, there should
be a way to specify quality information in variability models. Second, it should be possible
to detect anomalies in the variability specification associated with quality information.
Third, there should be mechanisms to verify the variability model to extract useful information,
such as the possibility to build a product that fulfils certain quality conditions (e.g., is
there any product that requires less than 512MB of memory?). In this article, we present an
approach for quality-aware analysis in software product lines using the orthogonal variability
model (OVM) to represent variability. We propose to map variability represented in the
OVM associated with quality information to a constraint satisfaction problem and to use an
off-the-shelf constraint programming solver to automatically perform the verification task.
To illustrate our approach, we use a product line in the automotive domain which is an example
that was created in a national project by a leading car company. We have developed
a prototype tool named FaMa-OVM, which works as a proof of concept. We were able to
identify void models, dead and false optional elements, and check whether the product line
example satisfies quality conditions.
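The kind of quality-aware query described (e.g., "is there any product that requires less than 512 MB of memory?") can be sketched by enumeration over a toy model. The variant names, footprints, and the validity rule below are assumptions for illustration, not the automotive example, and a real solver would replace the exhaustive search:

```python
from itertools import product

# Each optional variant carries an assumed memory footprint in MB.
footprint = {"navigation": 300, "voice": 260, "radio": 120}
variants = list(footprint)

def valid(a):
    # Assumed OVM rules: at least one variant is selected,
    # and "voice" requires "navigation".
    return any(a.values()) and (not a["voice"] or a["navigation"])

def any_product_under(budget):
    """Is there a valid product whose total footprint fits the budget?"""
    for bits in product([False, True], repeat=len(variants)):
        a = dict(zip(variants, bits))
        if valid(a) and sum(footprint[v] for v in variants if a[v]) <= budget:
            return True
    return False

print(any_product_under(512))   # a radio-only product needs 120 MB
print(any_product_under(100))   # every valid product exceeds 100 MB
```

Anomalies fall out of the same encoding: a void model is one where no assignment is valid at all, and a dead element is one selected in no valid product.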
Using Constraint Programming to Verify DOPLER Variability Models
Software product lines are typically developed using model-based approaches. Models are used to guide and automate key activities such as the derivation of products. The verification of product line models is thus essential to ensure the consistency of the derived products. While many authors have proposed approaches for verifying feature models, there is so far no such approach for decision models. We discuss challenges of analyzing and verifying decision-oriented DOPLER variability models. The manual verification of these models is an error-prone, tedious, and sometimes infeasible task. We present a preliminary approach that converts DOPLER variability models into constraint programs to support their verification. We assess the feasibility of our approach by identifying defects in two existing variability models.