
    Towards an I/O Conformance Testing Theory for Software Product Lines based on Modal Interface Automata

    We present an adaptation of input/output conformance (ioco) testing principles to families of similar implementation variants as they appear in product line engineering. Our proposed product line testing theory relies on Modal Interface Automata (MIA) as the behavioral specification formalism. MIA enrich I/O-labeled transition systems with may/must modalities to distinguish mandatory from optional behavior, thus providing a semantic notion of intrinsic behavioral variability. In particular, MIA constitute a restricted, yet fully expressive subclass of I/O-labeled modal transition systems, guaranteeing desirable refinement and compositionality properties. The resulting modal-ioco relation defined on MIA is preserved under MIA refinement, which serves as the variant derivation mechanism in our product line testing theory. As a result, modal-ioco is proven correct in the sense that it coincides with traditional ioco holding for every derivable implementation variant. Based on this result, a family-based product line conformance testing framework can be established. Comment: In Proceedings FMSPLE 2015, arXiv:1504.0301
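The may/must distinction and the refinement relation the abstract appeals to can be made concrete with a small sketch. The Python snippet below models a plain modal (may/must) transition system and a naive modal refinement check; it deliberately ignores the MIA-specific input/output asymmetry and compatibility conditions, and the states, actions, and coffee/tea example are invented for illustration.

```python
# Minimal sketch of a modal (may/must) transition system and a naive modal
# refinement check. Illustrates the may/must distinction only; it is not the
# paper's MIA formalism (no input/output asymmetry, no compatibility checks).

from itertools import product

class MTS:
    def __init__(self, states, may, must, init):
        # may, must: sets of (state, action, state); every must is also a may.
        self.states, self.may, self.must, self.init = states, may, must, init
        assert must <= may, "every mandatory transition must also be allowed"

    def may_succ(self, s, a):
        return {t for (p, b, t) in self.may if p == s and b == a}

    def must_succ(self, s, a):
        return {t for (p, b, t) in self.must if p == s and b == a}

    def actions(self):
        return {a for (_, a, _) in self.may}

def refines(impl, spec):
    """Greatest-fixpoint check of plain modal refinement."""
    rel = set(product(impl.states, spec.states))
    changed = True
    while changed:
        changed = False
        for (s, t) in set(rel):
            ok = True
            for a in impl.actions() | spec.actions():
                # Every allowed (may) step of the implementation must be
                # permitted by some may step of the specification.
                if not all(any((s2, t2) in rel for t2 in spec.may_succ(t, a))
                           for s2 in impl.may_succ(s, a)):
                    ok = False
                # Every mandatory (must) step of the specification must be
                # realised by some must step of the implementation.
                if not all(any((s2, t2) in rel for s2 in impl.must_succ(s, a))
                           for t2 in spec.must_succ(t, a)):
                    ok = False
            if not ok:
                rel.discard((s, t))
                changed = True
    return (impl.init, spec.init) in rel

# Spec: the machine may offer tea (optional feature) and must offer coffee.
spec = MTS({"s0", "s1"},
           may={("s0", "coffee", "s1"), ("s0", "tea", "s1")},
           must={("s0", "coffee", "s1")},
           init="s0")
# Variant: implements only the mandatory coffee behaviour.
impl = MTS({"i0", "i1"},
           may={("i0", "coffee", "i1")},
           must={("i0", "coffee", "i1")},
           init="i0")
print(refines(impl, spec))  # True: dropping may-only behaviour is a refinement
```

Dropping optional (may-only) behaviour, as the implementation does with the tea action, is exactly the kind of variant derivation step that modal refinement licenses; adding behaviour the specification does not allow would break the check.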

    Ecological Life Cycle Assessment of Modified Novolaks Waste Used in Industrial Wastewater Treatment

    Ecological Life Cycle Assessment (LCA) is a technique for evaluating the impact of products on the environment; here it is applied to polymeric flocculants used in industrial wastewater treatment. Analyzing the full life cycle, and thus the manufacturing process, allows for reliable and accurate identification of the sources of environmental hazards and of the impact of new products on the environment. The newly synthesized waste-based polymers are water soluble and possess the properties of flocculants, while reducing pollutant parameters in industrial wastewater. The paper presents the results of an LCA-based analysis of the environmental impact of a modified waste phenol-formaldehyde resin (Novolak). The LCA technique was used to assess the impact of the new flocculant applied in the treatment of metallurgical wastewater, taking into account the environmental impact of the flocculant manufacturing process.

    Analysis of Software Binaries for Reengineering-Driven Product Line Architecture - An Industrial Case Study

    This paper describes a method for recovering software architectures from a set of similar (but unrelated) software products in binary form. One intention is to drive refactoring into software product lines and to combine architecture recovery with runtime binary analysis and existing clustering methods. Using our runtime binary analysis, we create graphs that capture the dependencies between different software parts. These are clustered into smaller component graphs that group software parts with high interaction into larger entities. The component graphs serve as a basis for further software product line work. In this paper, we concentrate on the analysis part of the method and on the graph clustering. We apply the graph clustering method to a real application in the context of automation/robot configuration software tools. Comment: In Proceedings FMSPLE 2015, arXiv:1504.0301
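As a rough illustration of the clustering step, the sketch below groups software parts whose observed runtime interaction exceeds a threshold into one component, using a simple union-find merge over a weighted dependency graph. The module names, call counts, and threshold rule are hypothetical stand-ins; they are not the paper's actual analysis or clustering method.

```python
# Hypothetical sketch: software parts observed at run time become graph nodes,
# observed cross-call counts become edge weights, and heavily interacting
# parts are merged into one candidate component.

from collections import defaultdict

def find(parent, x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path compression
        x = parent[x]
    return x

def cluster(call_counts, threshold):
    """call_counts: {(caller, callee): number of observed calls at run time}."""
    nodes = {n for edge in call_counts for n in edge}
    parent = {n: n for n in nodes}
    # Merge endpoints of heavily used dependencies first.
    for (a, b), weight in sorted(call_counts.items(), key=lambda kv: -kv[1]):
        if weight >= threshold:
            parent[find(parent, a)] = find(parent, b)
    groups = defaultdict(set)
    for n in nodes:
        groups[find(parent, n)].add(n)
    return list(groups.values())

# Toy dependency data from a (hypothetical) runtime trace.
trace = {("ui", "config_core"): 120, ("robot_driver", "config_core"): 95,
         ("ui", "report_gen"): 3, ("report_gen", "pdf_export"): 80}
print(cluster(trace, threshold=50))
# -> e.g. [{'ui', 'config_core', 'robot_driver'}, {'report_gen', 'pdf_export'}]
```

A real pipeline would likely apply a proper graph clustering algorithm (for instance, modularity-based community detection) rather than a single threshold, but the input/output shape is the same: a dependency graph in, candidate components out.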

    A Systematic Review of Tracing Solutions in Software Product Lines

    Software Product Lines are large-scale, multi-unit systems that enable massive, customized production. They consist of a base of reusable artifacts and points of variation that provide the system with flexibility, allowing the generation of customized products. However, maintaining a system with such complexity and flexibility can be error-prone and time-consuming. Indeed, any modification (addition, deletion or update) at the level of a product or an artifact may impact other elements. It would therefore be interesting to adopt an efficient and organized traceability solution to maintain the Software Product Line. Still, traceability is not systematically implemented. It is usually set up for specific constraints (e.g. certification requirements), but abandoned in other situations. In order to draw a picture of the actual state of traceability solutions in the Software Product Line context, we conducted a systematic literature review. This review, as well as its findings, is detailed in the present article. Comment: 22 pages, 9 figures, 7 tables

    Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLan with action rates, which specify the likelihood of exhibiting particular behaviour or of installing features at a specific moment or in a specific order. The enriched language (called PFLan) allows us to specify models of software product lines with probabilistic configurations and behaviour, e.g. by considering a PFLan semantics based on discrete-time Markov chains. The Maude implementation of PFLan is combined with the distributed statistical model checker MultiVeStA to perform quantitative analyses of a simple product line case study. The presented analyses include the likelihood of certain behaviour of interest (e.g. product malfunctioning) and the expected average cost of products. Comment: In Proceedings FMSPLE 2015, arXiv:1504.0301
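The statistical (simulation-based) side of such an analysis can be sketched in a few lines: instead of solving the underlying discrete-time Markov chain exactly, one samples many random runs and estimates the probability of the behaviour of interest. The chain, state names, and probabilities below are invented for illustration and have no connection to the actual PFLan/MultiVeStA case study.

```python
# Hedged sketch of statistical model checking on a toy DTMC: estimate the
# probability of reaching a hypothetical "malfunction" state within a bounded
# number of steps by Monte Carlo simulation.

import random

# Transition probabilities of a toy DTMC over invented product states.
dtmc = {
    "deployed":      [("running", 0.9), ("misconfigured", 0.1)],
    "running":       [("running", 0.98), ("malfunction", 0.02)],
    "misconfigured": [("running", 0.5), ("malfunction", 0.5)],
    "malfunction":   [("malfunction", 1.0)],   # absorbing
}

def simulate(start, horizon):
    state = start
    for _ in range(horizon):
        if state == "malfunction":
            return True
        r, acc = random.random(), 0.0
        for nxt, p in dtmc[state]:
            acc += p
            if r < acc:
                state = nxt
                break
    return state == "malfunction"

def estimate(samples=20_000, horizon=50):
    hits = sum(simulate("deployed", horizon) for _ in range(samples))
    return hits / samples

print(f"P(malfunction within 50 steps) ~ {estimate():.3f}")
```

A statistical model checker would typically derive the number of samples from a desired confidence interval rather than fixing it up front, but the estimation principle is the same.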

    Long-Term Average Cost in Featured Transition Systems

    A software product line is a family of software products that share a common set of mandatory features and whose individual products are differentiated by their variable (optional or alternative) features. Family-based analysis of software product lines takes as input a single model of a complete product line and analyzes all its products at the same time. As the number of products in a software product line may be large, this is generally preferable to analyzing each product on its own. Family-based analysis, however, requires that standard algorithms be adapted to accommodate variability. In this paper we adapt the standard algorithm for computing the limit average cost of a weighted transition system to software product lines. Limit average is a useful and popular measure for the long-term average behavior of a quality attribute such as performance or energy consumption, but has hitherto not been available for family-based analysis of software product lines. Our algorithm operates on weighted featured transition systems, at a symbolic level, and computes the limit average cost for all products in a software product line at the same time. We have implemented the algorithm and evaluated it on several examples.
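For a single weighted transition system, the quantity in question is the standard limit-average (mean-payoff) value of a run; a common textbook formulation is given below, where w is the weight function and the run is an infinite sequence of states. The family-based algorithm of the paper lifts this per-product value to all products of the featured transition system at once.

```latex
% Limit average (mean payoff) of an infinite run \pi = s_0 s_1 s_2 \ldots
% in a weighted transition system with weight function w:
\mathrm{LimAvg}(\pi) \;=\; \liminf_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} w(s_{i-1}, s_i)
```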

    Feature-Aware Verification

    A software product line is a set of software products that are distinguished in terms of features (i.e., end-user-visible units of behavior). Feature interactions (situations in which the combination of features leads to emergent and possibly critical behavior) are a major source of failures in software product lines. We explore how feature-aware verification can improve the automatic detection of feature interactions in software product lines. Feature-aware verification uses product-line verification techniques and supports the specification of feature properties along with the features in separate and composable units. It integrates the technique of variability encoding to verify a product line without generating and checking a possibly exponential number of feature combinations. We developed the tool suite SPLverifier for feature-aware verification, which is based on standard model-checking technology. We applied it to an e-mail system that incorporates domain knowledge of AT&T. We found that feature interactions can be detected automatically based on specifications that have only feature-local knowledge, and that variability encoding significantly improves verification performance when proving the absence of interactions. Comment: 12 pages, 9 figures, 1 table
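A rough sketch of the variability-encoding idea follows: compile-time feature choices become ordinary boolean variables that guard feature code, so one encoded program covers every product, and feature-local properties travel with their features. The e-mail features, the property, and the brute-force enumeration loop below are illustrative stand-ins; SPLverifier hands the encoded program to a model checker, which explores the feature choices symbolically rather than one assignment at a time.

```python
# Illustrative sketch of variability encoding: feature choices are run-time
# boolean variables guarding feature code, and properties shipped with a
# feature refer only to that feature's local view. All names and the property
# are invented; this is not the AT&T case study or SPLverifier.

from itertools import product

FEATURES = ["encrypt", "forward"]

def send_mail(cfg, msg):
    """One program with all features compiled in, guarded by feature variables."""
    if cfg["encrypt"]:
        msg = {**msg, "body": "<ciphertext>", "encrypted": True}
    if cfg["forward"]:
        msg = {**msg, "to": "forwarding-address", "forwarded": True}
    return msg

def encrypt_local_property(cfg, out):
    # Feature-local property of 'encrypt': a mail sent with encryption enabled
    # must still be encrypted when it leaves the system, whatever other
    # features did to it on the way out.
    return (not cfg["encrypt"]) or out.get("encrypted", False)

# Stand-in for the model checker: explore every feature assignment explicitly.
# Variability encoding exists precisely so a checker can treat the feature
# variables as nondeterministic inputs instead of enumerating them like this.
for values in product([False, True], repeat=len(FEATURES)):
    cfg = dict(zip(FEATURES, values))
    out = send_mail(cfg, {"to": "alice@example.org", "body": "hello"})
    assert encrypt_local_property(cfg, out), f"feature interaction under {cfg}"
print("no violating feature combination found")
```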

    From Professional Business Partner to Strategic Talent Leader: What’s Next for Human Resource Management

    The HR profession is at a critical inflection point. It can evolve into a true decision science of talent and aspire to the level of influence of disciplines such as Finance and Marketing, or it can continue its traditional focus on support services and program delivery to organizational clients. In this paper, we suggest that the transition to a decision science is essential and not only feasible but historically predictable. However, we show that making the transition is not a matter of achieving best-practice levels of professional practice. Rather, it requires developing a logical, deep and coherent framework linking organizational talent to strategic success. We show how the evolution of the decision sciences of Finance and Marketing out of the professional practices of Accounting and Sales provides the principles to guide the evolution from the current professional practice of HR to the emerging decision science of talentship.