
    The Systematic Discovery of Services in Early Stages of Agile Developments: A Systematic Literature Review

    In recent years, agile methodologies have been consolidated and extended in organizations that develop software in Web environments. For this reason, the development methodology of these organizations relates not only to Services but also to the Web Engineering paradigm. These organizations are moving towards software development methodologies whose paradigm allows the early stages of Web application development to be integrated, naturally, with the services of the organization that are described and published in its Services Portfolio. The aim of this study is to analyze the current state of the art of the process of discovering services in the early stages of agile software development, focusing on those identified requirements that could be covered by the services included in the Services Portfolio. We identified 20 relevant papers by conducting a double systematic literature review (SLR). We conclude that no study has been found that solves the entire process of discovering candidate services within an organization that cover the requirements of a new application developed with agile methodologies. At the same time, guidelines have been found to formalize a solution to this problem and fill that gap in knowledge by proposing, in a single process, the formalization of a requirement based on agile techniques, which can be matched against a Services Portfolio. (Ministerio de Economía y Competitividad, TIN2016-76956-C3-2-R, POLOLAS)

    Assessing and improving quality of QVTo model transformations

    We investigate quality improvement in QVT operational mappings (QVTo) model transformations, one of the languages defined in the OMG standard on model-to-model transformations. Two research questions are addressed. First, how can we assess the quality of QVTo model transformations? Second, how can we develop higher-quality QVTo transformations? To address the first question, we use a bottom–up approach, starting with a broad exploratory study including QVTo expert interviews, a review of existing material, and introspection. We then formalize QVTo transformation quality into a QVTo quality model. The quality model is validated through a survey of a broader group of QVTo developers. We find that although many quality properties recognized as important for QVTo have counterparts in general-purpose languages, a number of them are specific to QVTo or to model transformation languages. To address the second research question, we leverage the quality model to identify developer support tooling for QVTo. We then implement and evaluate one of the tools, namely a code test coverage tool. In designing the tool, code coverage criteria for QVTo model transformations are also identified. The primary contributions of this paper are a QVTo quality model relevant to QVTo practitioners and an open-source code coverage tool already usable by QVTo transformation developers. Secondary contributions are a bottom–up approach to building a quality model, a validation approach leveraging developer perceptions to evaluate quality properties, code test coverage criteria for QVTo, and numerous directions for future research and tooling related to QVTo quality.
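The abstract mentions identifying code test coverage criteria for QVTo but does not enumerate them. One plausible criterion, sketched below purely as an illustration (the function and criterion names are assumptions, not the paper's actual definitions), is mapping coverage: the fraction of a transformation's mappings exercised by a test run.

```python
# Illustrative sketch of a "mapping coverage" criterion for a model
# transformation: which mappings were exercised by a test run, and
# which were not. This is an assumed example, not the paper's tool.

def mapping_coverage(all_mappings, executed):
    """Return (covered fraction, sorted list of uncovered mappings)."""
    all_set = set(all_mappings)
    if not all_set:
        return 1.0, []
    uncovered = sorted(all_set - set(executed))
    covered = len(all_set & set(executed)) / len(all_set)
    return covered, uncovered
```

A coverage tool built on this idea would instrument the transformation engine to log each mapping invocation, then report the uncovered mappings as candidates for new test models.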

    A heuristic-based approach to code-smell detection

    Encapsulation and data hiding are central tenets of the object-oriented paradigm. Deciding what data and behaviour to form into a class, and where to draw the line between its public and private details, can make the difference between a class that is an understandable, flexible and reusable abstraction and one that is not. This decision is a difficult one and may easily result in poor encapsulation, which can then have serious implications for a number of system qualities. It is often hard to identify such encapsulation problems within large software systems until they cause a maintenance problem (which is usually too late), and attempting to perform such analysis manually can be tedious and error-prone. Two common encapsulation problems that can arise as a consequence of this decomposition process are data classes and god classes. Typically, these two problems occur together: data classes lack functionality that has typically been sucked into an over-complicated and domineering god class. This paper describes the architecture of a tool, developed as a plug-in for the Eclipse IDE, which automatically detects data and god classes. The technique has been evaluated in a controlled study on two large open-source systems which compares the tool's results to similar work by Marinescu, who employs a metrics-based approach to detecting such features. The study provides some valuable insights into the strengths and weaknesses of the two approaches.
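The abstract contrasts a heuristic approach with Marinescu-style metrics-based detection. As a rough illustration of the metrics-based side, a detector combines a few structural metrics with thresholds; the metric names and threshold values below are illustrative assumptions, not the rules from either paper.

```python
# Hypothetical sketch of a metrics-based data-class / god-class
# detector. Metric names follow common conventions (WMC, ATFD);
# the thresholds are assumed for illustration only.

def classify(metrics):
    """Classify a class from simple structural metrics.

    metrics: dict with
      wmc       - weighted methods per class (total complexity)
      atfd      - accesses to foreign (other classes') data
      accessors - number of getter/setter methods
      methods   - total number of methods
    """
    # A god class concentrates complexity and reaches into the
    # data of many other classes.
    if metrics["wmc"] > 47 and metrics["atfd"] > 5:
        return "god class"
    # A data class is mostly passive state: almost all of its
    # interface is accessors and it carries little logic.
    if (metrics["methods"] > 0
            and metrics["accessors"] / metrics["methods"] > 0.8
            and metrics["wmc"] < 10):
        return "data class"
    return "ok"
```

A heuristic-based tool, by contrast, would typically inspect how fields are accessed across class boundaries rather than relying on fixed numeric thresholds.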

    Multi-paradigm modelling for cyber–physical systems: a descriptive framework

    The complexity of cyber–physical systems (CPSs) is commonly addressed through complex workflows, involving models in a plethora of different formalisms, each with their own methods, techniques, and tools. Some workflow patterns, combined with particular types of formalisms and operations on models in these formalisms, are used successfully in engineering practice. To identify and reuse them, we refer to these combinations of workflow and formalism patterns as modelling paradigms. This paper proposes a unifying (Descriptive) Framework to describe these paradigms, as well as their combinations. This work is set in the context of Multi-Paradigm Modelling (MPM), which is based on the principle of modelling every part and aspect of a system explicitly, at the most appropriate level(s) of abstraction, using the most appropriate modelling formalism(s) and workflows. The purpose of the Descriptive Framework presented in this paper is to serve as a basis to reason about these formalisms, workflows, and their combinations. One crucial part of the framework is the ability to capture the structural essence of a paradigm through the concept of a paradigmatic structure. This is illustrated informally by means of two example paradigms commonly used in CPS: Discrete Event Dynamic Systems and Synchronous Data Flow. The presented framework also identifies the need to establish whether a paradigm candidate follows, or qualifies as, a (given) paradigm. To illustrate the ability of the framework to support combining paradigms, the paper shows examples of both workflow and formalism combinations. The presented framework is intended as a basis for characterisation and classification of paradigms, as a starting point for a rigorous formalisation of the framework (allowing formal analyses), and as a foundation for MPM tool development.
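Synchronous Data Flow, one of the two example paradigms, has a well-known structural essence: a consistent SDF graph admits a repetition vector q solving the balance equation prod(a)·q(a) = cons(b)·q(b) for every channel from actor a to actor b. The sketch below solves these balance equations for a small graph; it illustrates the standard SDF result, not code from the paper.

```python
from fractions import Fraction
from math import lcm

def repetition_vector(edges, actors):
    """Solve the SDF balance equations prod*q[a] == cons*q[b] for
    every channel (a, b, prod, cons). Returns the smallest positive
    integer firing counts, or None if the rates are inconsistent."""
    q = {actors[0]: Fraction(1)}  # pin one actor, propagate the rest
    changed = True
    while changed:
        changed = False
        for a, b, prod, cons in edges:
            if a in q and b not in q:
                q[b] = q[a] * prod / cons
                changed = True
            elif b in q and a not in q:
                q[a] = q[b] * cons / prod
                changed = True
            elif a in q and b in q and q[a] * prod != q[b] * cons:
                return None  # inconsistent rates: no valid schedule
    # Scale the rational solution to the smallest integer vector.
    scale = lcm(*(f.denominator for f in q.values()))
    return {a: int(f * scale) for a, f in q.items()}
```

For a producer A emitting 2 tokens per firing into a consumer B taking 3 per firing, the smallest consistent schedule fires A three times and B twice (requires Python 3.9+ for `math.lcm`).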

    DataMock: An Agile Approach for Building Data Models from User Interface Mockups

    In modern software development, much time is devoted and much attention is paid to the activity of data modeling and the translation of data models into databases. This has motivated the proposal of different approaches and tools to support this activity, such as semiautomatic approaches that generate data models from requirements artifacts using text analysis and sets of heuristics, among other techniques. However, these approaches still suffer from important limitations, including the lack of support for requirements traceability, the poor support for detecting and solving conflicts in domain-specific requirements, and the considerable effort required to manually check the generated models. This paper introduces DataMock, an Agile approach that enables the iterative building of data models from requirements specifications, while supporting traceability and allowing the detection of inconsistencies in data requirements and specifications. The paper also describes how the approach effectively improves traceability and reduces errors and the effort needed to build data models in comparison with traditional, state-of-the-art data modeling approaches.
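The abstract does not detail DataMock's algorithm, but the general idea of deriving a data model from a UI mockup while keeping traceability can be illustrated as follows. The widget-to-type heuristics, names, and structure below are assumptions made for this sketch only.

```python
# Illustrative sketch (NOT the actual DataMock algorithm): derive a
# simple entity from a form mockup's widgets using naive type
# heuristics, keeping a trace link from each attribute back to the
# widget it came from.

WIDGET_TYPES = {            # assumed widget-kind -> SQL-type heuristics
    "textbox": "VARCHAR",
    "datepicker": "DATE",
    "checkbox": "BOOLEAN",
    "spinner": "INTEGER",
}

def derive_entity(form_name, widgets):
    """widgets: list of (widget_id, widget_kind, label) tuples."""
    attributes = []
    for widget_id, kind, label in widgets:
        attributes.append({
            "name": label.lower().replace(" ", "_"),
            "type": WIDGET_TYPES.get(kind, "VARCHAR"),
            "source_widget": widget_id,   # traceability link
        })
    return {"entity": form_name, "attributes": attributes}
```

Keeping the `source_widget` link is what enables the traceability the abstract emphasizes: when a widget changes, the affected attributes can be found and re-checked for inconsistencies.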

    A framework for modeling and improving agile requirements engineering.

    Context. Companies adopt hybrid development models that integrate agile methodologies and Human-Centered Design (HCD) with the aim of increasing value delivery and reducing time to market. This has an impact on how Requirements Engineering (RE) is carried out in an agile environment. To this end, people apply different kinds of agile techniques, such as artifacts, meetings, methods, and roles. In this context, companies often struggle to improve their value chain in a systematic manner, since guidelines for choosing an appropriate set of agile techniques are missing. Objective. The vision of this PhD thesis is to build a framework for modeling agile RE. Organizations benefit from implementing this framework by increasing their value delivery (organization-external) and improving collaboration (organization-internal). Method. We followed an inductive research approach, using the learnings from several studies to create the framework. First, we carried out a Systematic Literature Review (SLR) to analyze the state of the art of agile RE, with a focus on user and stakeholder involvement. Subsequently, we created the agile RE metamodel, which evolved iteratively over the consecutive studies. Based on the metamodel, we defined a profile that can be used to create domain-specific models according to the organizational environment. Moreover, we conducted a Delphi study to identify the most important problems industry faces today in terms of agile RE. The results were used as input for a systematic pattern-mining process, which was used to create agile RE patterns. Results. The framework for modeling agile RE consists of three main components: i) the agile RE metamodel, which can be used to analyze the organizational environment in terms of value delivery; ii) a catalogue of agile RE problems, which allows recurring agile RE problems to be detected; and iii) a catalogue of agile RE patterns, which allows the detected problems to be solved. The agile RE metamodel comes with a profile that can be used to derive domain-specific models. In addition, we created tool support for the framework by means of a web application (agileRE.org), which allows knowledge and best practices for agile RE to be shared. Furthermore, we showed how the framework can be applied in industry by means of case studies in Germany and Spain. Conclusion. The framework for modeling agile RE empowers companies to improve their organizational environments in terms of value delivery and collaboration. Companies can use the framework to improve their value chain in a systematic manner. In particular, it gives guidance for choosing appropriate agile techniques that fit the changing needs of the organizational environment. In addition, we can state that the framework is applicable at an international level.

    A Catalog of Reusable Design Decisions for Developing UML/MOF-based Domain-specific Modeling Languages

    In model-driven development (MDD), domain-specific modeling languages (DSMLs) act as a communication vehicle for aligning the requirements of domain experts with the needs of software engineers. With the rise of the UML as a de facto standard, UML/MOF-based DSMLs are now widely used for MDD. This paper documents design decisions collected from 90 UML/MOF-based DSML projects. These recurring design decisions were gathered, on the one hand, by performing a systematic literature review (SLR) on the development of UML/MOF-based DSMLs; via the SLR, we retrieved 80 related DSML projects for review. On the other hand, we collected decisions from developing ten DSML projects ourselves. The design decisions are presented in the form of reusable decision records, with each decision record corresponding to a decision point in DSML development processes. Furthermore, we also report on frequently observed (combinations of) decision options as well as on associations between options, which may occur within a single decision point or between two decision points. This collection of decision-record documents targets decision makers in DSML development (e.g., DSML engineers, software architects, domain experts). Series: Technical Reports / Institute for Information Systems and New Media.
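The abstract describes decision records tied to decision points, with options and associations between options. One way such a record might be structured is sketched below; the field names and example decision point are assumptions for illustration, not the catalog's actual template.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a reusable decision record as a data
# structure, based only on the abstract's description (decision
# points, options, associations between options).

@dataclass
class DecisionRecord:
    decision_point: str   # e.g. "concrete-syntax definition"
    options: list         # candidate design options considered
    chosen: str           # the option actually taken
    rationale: str        # why it was chosen
    # options at other decision points that this choice implies:
    associated: list = field(default_factory=list)

def well_formed(record):
    """A record is well-formed if the chosen option is one of the
    documented candidate options."""
    return record.chosen in record.options
```

Aggregating many such records across projects is what makes it possible to report frequently observed option combinations, as the abstract describes.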