
    Generating Logical Specifications from Requirements Models for Deduction-based Formal Verification

    The work concerns automatic generation of logical specifications from requirements models. Logical specifications obtained in this way can be subjected to formal verification using deductive reasoning. Formal verification concerns the correctness of a model's behaviour. Reliability of requirements engineering is essential for all phases of software development processes. Deductive reasoning is an important alternative among formal methods. However, logical specifications, considered as sets of temporal logic formulas, are difficult for inexperienced users to specify manually, and this can be regarded as a significant obstacle to the practical use of deduction-based verification tools. A method of building requirements models using selected UML diagrams, including their logical specifications, is presented step by step. Organizing activity diagrams into predefined workflow patterns enables automated extraction of logical specifications. The crucial aspect of the presented approach is integrating the requirements engineering phase with the automatic generation of logical specifications. A system for deduction-based verification is proposed. The reasoning process could be based on the semantic tableaux method. A simple yet illustrative example of requirements elicitation and verification is provided.
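    The pattern-to-formula extraction described above can be sketched as a small generator. The pattern name, predicate shapes, and formulas below are illustrative assumptions, not the paper's actual rule set:

    ```python
    # Hypothetical sketch: deriving temporal-logic formulas from a predefined
    # workflow pattern. Predicate names (start/done) and the formula shapes
    # are assumptions for illustration only.

    def sequence_pattern(a: str, b: str) -> list[str]:
        """For a Sequence(a, b) workflow pattern, emit LTL-style formulas:
        whenever activity a completes, activity b eventually starts,
        and b never starts before a is done."""
        return [
            f"G(done({a}) -> F(start({b})))",  # b eventually follows a
            f"!start({b}) U done({a})",        # b waits until a is done
        ]

    formulas = sequence_pattern("ReceiveOrder", "CheckStock")
    ```

    Composing such per-pattern rules over a whole activity diagram would yield the logical specification that a semantic-tableaux prover can then check.
    
    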

    From user requirements to UML class diagram

    The transition from user requirements to UML diagrams is a difficult task for the designer, especially when handling large texts expressing these needs. Class diagram modeling must be performed frequently, even during the development of a simple application. This paper proposes an approach to facilitate class diagram extraction from textual requirements using NLP techniques and a domain ontology. Comment: International Conference on Computer Related Knowledge

    Performance-oriented DevOps: A Research Agenda

    DevOps is a trend towards a tighter integration between development (Dev) and operations (Ops) teams. The need for such an integration is driven by the requirement to continuously adapt enterprise applications (EAs) to changes in the business environment. As of today, DevOps concepts have been primarily introduced to ensure a constant flow of features and bug fixes into new releases from a functional perspective. In order to integrate a non-functional perspective into these DevOps concepts, this report focuses on tools, activities, and processes to ensure one of the most important quality attributes of a software system, namely performance. Performance describes system properties concerning its timeliness and use of resources. Common metrics are response time, throughput, and resource utilization. Performance goals for EAs are typically defined by setting upper and/or lower bounds for these metrics and specific business transactions. In order to ensure that such performance goals can be met, several activities are required during development and operation of these systems as well as during the transition from Dev to Ops. Activities during development are typically summarized by the term Software Performance Engineering (SPE), whereas activities during operations are called Application Performance Management (APM). SPE and APM were historically tackled independently from each other, but the newly emerging DevOps concepts require and enable a tighter integration between both activity streams. This report presents existing solutions to support this integration as well as open research challenges in this area.
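    Defining performance goals as upper and/or lower bounds per business transaction, as described above, can be sketched as a simple conformance check. The transaction names and numbers are illustrative assumptions:

    ```python
    # Minimal sketch of checking measured metrics against performance goals
    # (per-transaction lower/upper bounds). All names and values here are
    # illustrative, not from any specific APM tool.

    def violations(goals: dict, measurements: dict) -> list[str]:
        """Return the transactions whose measured value falls outside its
        (lower, upper) bound; None means 'no bound on this side'."""
        out = []
        for txn, (lower, upper) in goals.items():
            value = measurements.get(txn)
            if value is None:
                continue  # no measurement available for this transaction
            if (lower is not None and value < lower) or \
               (upper is not None and value > upper):
                out.append(txn)
        return out

    goals = {"checkout_response_ms": (None, 500), "orders_per_sec": (100, None)}
    measured = {"checkout_response_ms": 620, "orders_per_sec": 140}
    failed = violations(goals, measured)
    ```

    In a DevOps pipeline, such a check could gate a release (SPE side) or raise an operations alert (APM side) from the same goal definition.
    
    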

    Transition from Analysis to Software Design: A Review and New Perspective

    Analysis and design phases are the most crucial part of the software development life-cycle. Reusing the artifacts of these early phases is very beneficial for improving productivity and software quality. In this paper, we analyze the literature on the automatic transformation of artifacts from the problem space (i.e., requirement analysis models) into artifacts in the solution space (i.e., architecture, design, and implementation code). The goal is to assess the current state of the art with regard to the ability to automatically reuse previously developed software designs in synthesizing a new design for a given requirement. We surveyed various related areas such as model-driven development and model transformation techniques. Our analysis revealed that this topic has not been satisfactorily covered yet. Accordingly, we propose a framework consisting of three stages to address the uncovered limitations in current approaches.

    Review on Requirements Modeling and Analysis for Self-Adaptive Systems: A Ten-Year Perspective

    Context: Over the last decade, software researchers and engineers have developed a vast body of methodologies and technologies in requirements engineering for self-adaptive systems. Although existing studies have explored various aspects of this field, no systematic study has been performed on summarizing modeling methods and corresponding requirements activities. Objective: This study summarizes the state-of-the-art research trends, details the modeling methods and corresponding requirements activities, identifies relevant quality attributes and application domains, and assesses the quality of each study. Method: We perform a systematic literature review underpinned by a rigorously established and reviewed protocol. To ensure the quality of the study, we choose 21 highly regarded publication venues and 8 popular digital libraries. In addition, we apply text mining to derive search strings and use the Kappa coefficient to mitigate disagreements between researchers. Results: We selected 109 papers from the period 2003-2013 and presented the research distributions over various kinds of factors. We extracted 29 modeling methods, classified into 8 categories, and identified 14 requirements activities, classified into 4 requirements timelines. We captured 8 software quality attributes of concern based on the ISO 9126 standard and 12 application domains. Conclusion: The frequency of application of modeling methods varies greatly. Enterprise models were more widely used, while behavior models were more rigorously evaluated. Requirements-driven runtime adaptation was the most frequently studied requirements activity. Activities at runtime were conveyed with more details. Finally, we draw other conclusions by discussing how well modeling dimensions were considered in these modeling methods and how well assurance dimensions were conveyed in requirements activities. Comment: An extension of our REFSQ'14 paper.
    Keywords: Requirements engineering, Requirements modeling, Requirements analysis, Self-adaptive systems, Autonomic computing, Systematic literature review
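    The Kappa coefficient mentioned in the Method section (Cohen's kappa for two raters) corrects raw agreement for agreement expected by chance. A minimal computation, with illustrative screening decisions standing in for the study's actual data:

    ```python
    # Cohen's kappa for two raters, as used in systematic reviews to
    # quantify agreement between researchers screening papers.
    # The include/exclude ratings below are illustrative only.

    def cohens_kappa(rater_a: list, rater_b: list) -> float:
        n = len(rater_a)
        labels = set(rater_a) | set(rater_b)
        # observed proportion of agreement
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # chance agreement from each rater's marginal label frequencies
        expected = sum(
            (rater_a.count(lab) / n) * (rater_b.count(lab) / n) for lab in labels
        )
        return (observed - expected) / (1 - expected)

    a = ["include", "include", "exclude", "exclude", "include"]
    b = ["include", "exclude", "exclude", "exclude", "include"]
    kappa = cohens_kappa(a, b)
    ```

    A kappa near 1 indicates near-perfect agreement; values below roughly 0.6 are commonly treated as a signal that the screening protocol needs refinement.
    
    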

    Towards an Automated Requirements-driven Development of Smart Cyber-Physical Systems

    The Invariant Refinement Method for Self Adaptation (IRM-SA) is a design method targeting the development of smart Cyber-Physical Systems (sCPS). It allows for a systematic translation of the system requirements into the system architecture expressed as an ensemble-based component system (EBCS). However, since the requirements are captured in natural language, there exists the danger of their misinterpretation due to the ambiguity of natural language requirements, which could eventually lead to design errors. Thus, automation and validation of the design process is desirable. In this paper, we (i) analyze the translation process of natural language requirements into the IRM-SA model, (ii) identify individual steps that can be automated and/or validated using natural language processing techniques, and (iii) propose suitable methods. Comment: In Proceedings FESCA 2016, arXiv:1603.0837
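    One validation step of the kind the paper proposes, detecting ambiguous wording in a requirement, can be sketched with a keyword heuristic. The term list is a common requirements-review heuristic, not IRM-SA's actual technique:

    ```python
    # Illustrative sketch: flagging ambiguity in a natural-language
    # requirement with a weak-word list. This is a generic heuristic,
    # not the IRM-SA validation method itself.

    AMBIGUOUS = {"appropriate", "adequate", "fast", "user-friendly",
                 "etc", "may", "should", "usually"}

    def flag_ambiguity(requirement: str) -> list[str]:
        """Return the ambiguous terms found in a requirement sentence."""
        words = requirement.lower().replace(",", "").replace(".", "").split()
        return [w for w in words if w in AMBIGUOUS]

    hits = flag_ambiguity("The vehicle should react fast to obstacles.")
    ```

    Flagged requirements would be returned to the stakeholders for reformulation before translation into the IRM-SA model.
    
    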

    Automatic generation of analysis class diagrams from use case specifications

    In object-oriented software development, analysis modeling is concerned with identifying problem-level objects, along with the relationships between them, from software requirements. The software requirements are usually written in some natural language, and analysis modeling is normally performed by experienced human analysts. The huge gap between the software requirements, which are unstructured texts, and analysis models, which are usually structured UML diagrams, along with human slip-ups, inevitably makes the transformation process error prone. Automating this process can help reduce the errors in the transformation. In this paper, we propose a tool-supported approach for automated transformation of use case specifications documented in English into analysis class diagrams. The approach works in four steps. It first takes the textual specification of a use case as input, and then, using a natural language parser, generates type dependencies and parts-of-speech tags for each sentence in the specification. Then, it identifies the sentence structure of each sentence using a set of comprehensive sentence structure rules. Next, it applies a set of transformation rules to the type dependencies and parts-of-speech tags of the sentences to discover the problem-level objects and the relationships between them. Finally, it generates and visualizes the analysis class diagram. We conducted a controlled experiment to compare the correctness, completeness, and redundancy of the analysis class diagrams generated by our approach with those generated by existing automated approaches. The results showed that the analysis class diagrams generated by our approach were more correct, more complete, and less redundant than those generated by the other approaches. Comment: 38 pages, 5 figures, 20 tables
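    The transformation-rule step above (parts-of-speech tags to problem-level objects) can be sketched with one simplified rule; the tagging itself is stubbed here, where a real pipeline would invoke an NLP parser:

    ```python
    # Hypothetical sketch of one transformation rule from the four-step
    # pipeline described above: nouns become candidate analysis classes.
    # The pre-tagged sentence stands in for real parser output.

    def candidate_classes(tagged_sentence: list) -> set:
        """Nouns (Penn Treebank NN/NNS tags) become candidate
        problem-level classes; a simplification of typical
        specification-to-model transformation rules."""
        return {word.capitalize() for word, tag in tagged_sentence
                if tag in ("NN", "NNS")}

    tagged = [("the", "DT"), ("customer", "NN"), ("places", "VBZ"),
              ("an", "DT"), ("order", "NN")]
    classes = candidate_classes(tagged)
    ```

    Further rules over the type dependencies (e.g. a verb linking two nouns becoming an association) would then recover the relationships between these classes.
    
    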

    Concurrent Development of Model and Implementation

    This paper considers how a formal, mathematically-based model can be used in support of evolutionary software development, and in particular how such a model can be kept consistent with the implementation as it changes to meet new requirements. A number of techniques are listed that can make use of such a model to enhance the development process, as well as ways to keep model and implementation consistent. The effectiveness of these techniques is investigated through two case studies concerning the development of small e-business applications: a travel agent and a mortgage broker. Some successes are reported, notably in the use of rapid throwaway modelling to investigate design alternatives, and also in the use of close team working and model-based trace-checking to maintain synchronisation between model and implementation throughout the development. The main areas of weakness were seen to derive from deficiencies in tool support. Recommendations are therefore made for future improvements to tools supporting formal models which would, in principle, make this co-evolutionary approach attractive to industrial software developers. It is claimed that tools in fact already exist that provide the desired facilities, but these are not necessarily production-quality, do not all support the same notations, and hence cannot be used together.
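    The model-based trace-checking technique mentioned above amounts to replaying an implementation's event trace against the transitions the model allows. The tiny transition model below is an illustrative assumption, not the paper's case-study model:

    ```python
    # Minimal sketch of model-based trace-checking: an implementation log
    # is replayed against the formal model's allowed state transitions.
    # States and events are illustrative (loosely themed on a booking flow).

    MODEL = {  # state -> {event: next_state}
        "quote":  {"accept": "booked", "reject": "closed"},
        "booked": {"pay": "confirmed"},
    }

    def trace_conforms(trace: list, start: str = "quote") -> bool:
        state = start
        for event in trace:
            next_state = MODEL.get(state, {}).get(event)
            if next_state is None:
                return False  # implementation diverged from the model
            state = next_state
        return True
    ```

    Any divergence flags exactly the point at which model and implementation fell out of synchronisation, which is what makes the technique useful during co-evolution.
    
    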

    A use case driven approach for system level testing

    Use case scenarios are created during the analysis phase to specify software system requirements and can also be used for creating system-level test cases. Using use cases to derive system tests has several benefits, including test design at early stages of the software development life cycle, which reduces the overall development cost of the system. Current approaches to system testing using use cases involve functional details and do not include guards as passing criteria; i.e., they rely on a class diagram, which is difficult to produce at a very initial stage, and this motivates specification-based testing that does not involve functional details. In this paper, we propose a technique for system testing derived directly from the specification without involving functional details. We apply initial and post conditions as guards at each level of the use cases, which enables the generation of formalized test cases and makes it possible to generate test cases for each flow of the system. We use use case scenarios to generate system-level test cases, whereas a system sequence diagram is used to bridge the gap between the test objective and the test cases derived from the specification of the system. Since a state chart derived from the combination of sequence diagrams can model the entire behavior of the system, the generated test cases can be executed against the state chart in order to capture the behavior of the system as its state changes. All these steps enable us to systematically refine the specification to achieve the goals of system testing at early development stages.
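    Turning each use case flow, with its initial and post conditions as guards, into a formalized test case can be sketched as a direct mapping. The flow names and conditions are illustrative assumptions, not the paper's case study:

    ```python
    # Illustrative sketch: each use case flow plus its pre/post conditions
    # (the guards) becomes one formalized system-level test case.
    # Flow names and conditions are invented for illustration.

    def generate_tests(flows: dict) -> list:
        """Map {flow_name: (precondition, steps, postcondition)} to a
        list of test case records; the guards act as pass criteria."""
        return [
            {"name": name, "precondition": pre,
             "steps": steps, "postcondition": post}
            for name, (pre, steps, post) in flows.items()
        ]

    flows = {
        "withdraw_basic": ("balance >= amount",
                           ["insert card", "enter PIN", "request cash"],
                           "balance decreased by amount"),
        "withdraw_insufficient": ("balance < amount",
                                  ["insert card", "enter PIN", "request cash"],
                                  "error shown, balance unchanged"),
    }
    tests = generate_tests(flows)
    ```

    Because every alternative flow carries its own guard pair, each path through the derived state chart yields exactly one executable test case.
    
    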

    Bio-inspired Requirements Variability Modeling with Use Case

    Background. The Feature Model (FM) is the most important technique used to manage variability across products in Software Product Lines (SPLs). Often, SPL requirements variability is modeled using a variable use case model, which poses a real challenge for current approaches: a large gap between their concepts and those of the real world, leading to poor quality, weak FM support, and variability that does not cover all requirements modeling levels. Aims. This paper proposes a bio-inspired use case variability modeling methodology that addresses the above shortcomings. Method. The methodology is carried out through variable business-domain use case metamodeling, variable application-family use case metamodeling, and variable specific-application use case generation. Results. The methodology has led to integrated solutions to the above challenges: it decreases the gap between computing concepts and real-world ones; it supports use case variability modeling by introducing version and revision features and related relations; and the variability is supported at three meta levels covering business domain, application family, and specific application requirements. Conclusion. A comparative evaluation with the closest recent works, upon meaningful criteria in the domain, shows the great conceptual and practical value of the proposed methodology and leads to promising research perspectives.
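    A feature model extended with version/revision variability, as the methodology above introduces, can be sketched as a small tree structure. The class shape and feature names are illustrative assumptions, not the paper's metamodel:

    ```python
    # Small sketch of a feature model node carrying version variability,
    # echoing the version/revision features described above. Structure
    # and names are illustrative assumptions.

    class Feature:
        def __init__(self, name, mandatory=True, versions=None):
            self.name = name
            self.mandatory = mandatory      # mandatory vs optional feature
            self.versions = versions or []  # version variants of the feature
            self.children = []

        def add(self, child):
            """Attach a sub-feature and return it for chaining."""
            self.children.append(child)
            return child

    root = Feature("Payment")
    card = root.add(Feature("CardPayment", versions=["v1.0", "v2.0"]))
    wallet = root.add(Feature("MobileWallet", mandatory=False))
    ```

    Deriving a specific application then means selecting, for each variation point, one optional feature inclusion and one version per included feature.
    
    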