
    Design of the workflow execution manager component according to the software product line approach

    Get PDF
    Advisor: Itana Maria de Souza Gimenes. Dissertation (master's), Universidade Federal do Paraná.
Abstract: The software engineering area has constantly been looking for processes, techniques and tools that enable the development of high-quality products at economically feasible cost. Reuse is among these techniques. It is considered that reusing well-specified, developed and tested parts increases the reliability of software products as well as allowing rapid development. There has been an increasing number of techniques that encourage software reuse, such as domain engineering, frameworks, patterns, software architecture and component-based development. However, a systematic and predictable means of applying software reuse effectively is still missing. The software product line approach can be viewed as a way of filling this gap: its objective is to enable software reuse based on well-defined processes, artefacts and rules, and it encompasses most of the reuse techniques mentioned above. The approach is applicable to systems that share a manageable set of characteristics fulfilling the specific needs of a sector or mission (domain), with products developed from a core set of artefacts following a well-defined production plan.
Taking this into account, the Workflow Management Systems (WfMS) domain is a strong candidate for the application of this approach. The use of these systems has increased significantly in recent years, so efficient software engineering techniques that facilitate their development are required. This dissertation presents the design of the Workflow Execution Manager component (WorkflowExecutionMgr) according to the software product line approach. The WorkflowExecutionMgr component manages the task execution of a previously instantiated workflow. It was designed to allow different scheduling algorithms, so that products with different characteristics can be instantiated. The component design followed both a software product line architecture and a development process previously defined for WfMS. A prototype was developed in order to validate the design. The contributions of this work include the component design, which extends the core artefact set of the product line architecture for WfMS, and the revision of the previously defined software architecture.
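The variation point the abstract describes, an execution manager parameterised by interchangeable scheduling algorithms, can be sketched with a strategy pattern. All names below are hypothetical illustrations, not the dissertation's actual design:

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Task:
    name: str
    priority: int = 0


# Each product-line variant binds a different scheduling strategy.
def fifo(tasks: List[Task]) -> List[Task]:
    """First-in, first-out: run tasks in insertion order."""
    return list(tasks)


def priority_first(tasks: List[Task]) -> List[Task]:
    """Highest-priority task first."""
    return sorted(tasks, key=lambda t: -t.priority)


@dataclass
class WorkflowExecutionMgr:
    """Executes an instantiated workflow; the scheduler is the variation point."""
    schedule: Callable[[List[Task]], List[Task]]
    tasks: List[Task] = field(default_factory=list)

    def run(self) -> List[str]:
        # Return the execution order chosen by the bound strategy.
        return [t.name for t in self.schedule(self.tasks)]


tasks = [Task("a", 1), Task("b", 3), Task("c", 2)]
print(WorkflowExecutionMgr(fifo, tasks).run())            # ['a', 'b', 'c']
print(WorkflowExecutionMgr(priority_first, tasks).run())  # ['b', 'c', 'a']
```

Instantiating a product with different characteristics then amounts to binding a different `schedule` callable, which mirrors how a product line resolves a variation point at product-derivation time.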

    From Design to Production Control Through the Integration of Engineering Data Management and Workflow Management Systems

    Full text link
    At a time when many companies are under pressure to reduce "times-to-market", the management of product information from the early stages of design through assembly to manufacture and production has become increasingly important. Similarly, in the construction of high-energy physics devices, the collection of (often evolving) engineering data is central to the subsequent physics analysis. Traditionally, design engineers in industry have employed Engineering Data Management Systems (also called Product Data Management Systems) to coordinate and control access to documented versions of product designs. However, these systems provide control only at the collaborative design level and are seldom used beyond design. Workflow management systems, on the other hand, are employed in industry to coordinate and support the more complex and repeatable work processes of the production environment. Commercial workflow products cannot support the highly dynamic activities found both in the design stages of product development and in rapidly evolving workflow definitions. The integration of Product Data Management with Workflow Management can provide support for product development from initial CAD/CAM collaborative design through to the support and optimisation of production workflow activities. This paper investigates this integration, proposes a philosophy for the support of product data throughout the full development and production lifecycle, and demonstrates its usefulness in the construction of CMS detectors. Comment: 18 pages, 13 figures

    Incremental Consistency Checking in Delta-oriented UML-Models for Automation Systems

    Full text link
    Automation systems exist in many variants and may evolve over time in order to deal with different environment contexts or to fulfill changing customer requirements. This induces increased complexity at design time as well as tedious maintenance efforts. We previously proposed a multi-perspective modeling approach to improve the development of such systems. It operates on different levels of abstraction using well-known UML models: activity, composite structure and state chart models. Each perspective was enriched with delta modeling to manage variability and evolution. As an extension, we now focus on the development of an efficient consistency checking method at several levels to ensure valid variants of the automation system. Consistency checking must be provided for each perspective in isolation, between the perspectives, and after the application of a delta. Comment: In Proceedings FMSPLE 2016, arXiv:1603.0857
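The interplay of delta application and the post-delta consistency check the abstract calls for can be illustrated on a deliberately toy model. The model, delta format, and consistency rule below are invented for illustration and are not the paper's UML formalism:

```python
# A toy delta-modeling sketch: a "model" is a set of named elements plus
# references between them; a delta adds and removes elements.
core = {
    "elements": {"Sensor", "Controller"},
    "refs": {("Controller", "Sensor")},
}

delta = {
    "add": {"Actuator"},
    "remove": {"Sensor"},
    "add_refs": {("Controller", "Actuator")},
}


def apply_delta(model, delta):
    """Derive a variant by applying removals, then additions."""
    elements = (model["elements"] - delta["remove"]) | delta["add"]
    refs = model["refs"] | delta.get("add_refs", set())
    return {"elements": elements, "refs": refs}


def consistent(model):
    """A reference is valid only if both of its ends still exist."""
    return all(src in model["elements"] and dst in model["elements"]
               for src, dst in model["refs"])


variant = apply_delta(core, delta)
print(consistent(variant))  # False: ("Controller", "Sensor") now dangles
```

The dangling reference shows why checking only the core model is insufficient: the inconsistency appears only after the delta is applied, which is exactly the third checking level the abstract names.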

    PaPaS: A Portable, Lightweight, and Generic Framework for Parallel Parameter Studies

    Full text link
    The current landscape of scientific research is widely based on modeling and simulation, typically with complexity in the simulation's flow of execution and parameterization properties. Execution flows are not necessarily straightforward, since they may need multiple processing tasks and iterations. Furthermore, parameter and performance studies are common approaches used to characterize a simulation, often requiring traversal of a large parameter space. High-performance computers offer practical resources at the expense of users handling the setup, submission, and management of jobs. This work presents the design of PaPaS, a portable, lightweight, and generic workflow framework for conducting parallel parameter and performance studies. Workflows are defined using parameter files based on a keyword-value pair syntax, thus sparing the user the overhead of creating complex scripts to manage the workflow. A parameter set consists of any combination of environment variables, files, partial file contents, and command line arguments. PaPaS is being developed in Python 3 with support for distributed parallelization using SSH, batch systems, and C++ MPI. The PaPaS framework runs as user processes and can be used in single- and multi-node, multi-tenant computing systems. An example simulation using the BehaviorSpace tool from NetLogo and a matrix multiplication using OpenMP are presented as parameter and performance studies, respectively. The results demonstrate that the PaPaS framework offers a simple method for defining and managing parameter studies, while increasing resource utilization. Comment: 8 pages, 6 figures, PEARC '18: Practice and Experience in Advanced Research Computing, July 22-26, 2018, Pittsburgh, PA, US
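The core mechanism the abstract describes, expanding keyword-value parameter declarations into individual runs, is essentially a Cartesian product over the declared value lists. A minimal sketch, with a hypothetical parameter set that stands in for a PaPaS parameter file:

```python
from itertools import product

# Hypothetical keyword-value parameter declarations: each key maps to the
# list of values it may take; a study runs the Cartesian product of all.
params = {
    "threads": [1, 2, 4],
    "size": [256, 512],
}


def expand(params):
    """Yield one {key: value} run configuration per parameter combination."""
    keys = list(params)
    for combo in product(*(params[k] for k in keys)):
        yield dict(zip(keys, combo))


runs = list(expand(params))
print(len(runs))  # 6 runs: 3 thread counts x 2 matrix sizes
print(runs[0])    # {'threads': 1, 'size': 256}
```

Each generated dictionary would then be mapped onto environment variables, file substitutions, or command-line arguments and dispatched to SSH workers or a batch system, which is the job-management overhead the framework takes off the user.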

    Designing Traceability into Big Data Systems

    Full text link
    Providing an appropriate level of accessibility and traceability to data or process elements (so-called Items) in large volumes of data, often Cloud-resident, is an essential requirement in the Big Data era. Enterprise-wide data systems need to be designed from the outset to support usage of such Items across the spectrum of business use, rather than from any specific application view. The design philosophy advocated in this paper is to drive the design process using a so-called description-driven approach, which enriches models with meta-data and description and focuses the design process on Item re-use, thereby promoting traceability. Details are given of the description-driven design of big data systems at CERN, in health informatics and in business process management. Evidence is presented that the approach leads to design simplicity and consequent ease of management, thanks to loose typing and the adoption of a unified approach to Item management and usage. Comment: 10 pages, 6 figures. In Proceedings of the 5th Annual International Conference on ICT: Big Data, Cloud and Security (ICT-BDCS 2015), Singapore, July 2015. arXiv admin note: text overlap with arXiv:1402.5764, arXiv:1402.575

    Support for collaborative component-based software engineering

    Get PDF
    Collaborative system composition during design has been poorly supported by traditional CASE tools, which have usually concentrated on supporting individual projects and have focused almost exclusively on static composition. Little support has been developed for maintaining large distributed collections of heterogeneous software components across a number of projects. The CoDEEDS project addresses the collaborative determination, elaboration, and evolution of design spaces that describe both static and dynamic compositions of software components from sources such as component libraries, software service directories, and reuse repositories. The GENESIS project has focussed, in the development of OSCAR, on the creation and maintenance of large software artefact repositories. The most recent extensions explicitly address the provision of cross-project global views of large software collections and historical views of individual artefacts within a collection. The long-term benefits of such support can only be realised if OSCAR and CoDEEDS are widely adopted, and steps to facilitate this are described.
This book continues the forum started by a recent book, Software Evolution with UML and XML, where expert insights on the subject are presented. In that book, initial efforts were made to link together three current phenomena: software evolution, UML, and XML. In this book, the focus is on the practical side of linking them, that is, how UML and XML and their related methods and tools can assist software evolution in practice. Considering that nowadays software starts evolving before it is delivered, an apparent feature of software evolution is that it happens over all stages and over all aspects. Therefore, all possible techniques should be explored. This book explores techniques based on UML/XML and combinations of them with other techniques, i.e., all techniques from theory to tools. Software evolution happens at all stages: chapters in this book describe software evolution issues present at the stages of software architecting, modeling/specifying, assessing, coding, validating, design recovery, program understanding, and reusing. Software evolution happens in all aspects: chapters in this book illustrate that software evolution issues are involved in Web applications, embedded systems, software repositories, component-based development, object models, development environments, software metrics, UML use case diagrams, system models, legacy systems, safety-critical systems, user interfaces, software reuse, evolution management, and variability modeling. Software evolution needs to be facilitated with all possible techniques: chapters in this book demonstrate techniques such as formal methods, program transformation, empirical study, tool development, standardisation, and visualisation to control system changes to meet organisational and business objectives in a cost-effective way. On the journey toward the grand challenge posed by software evolution, a journey we all have to make, the contributing authors of this book have already made further advances.

    Data Workflow - A Workflow Model for Continuous Data Processing

    Get PDF
    Online or streaming data is becoming increasingly important for enterprise information systems, e.g. through the integration of sensor data and workflows. The continuous flow of data provided, e.g., by sensors requires new workflow models addressing the data perspective of these applications, since continuous data is potentially infinite while business process instances are always finite. In this paper a formal workflow model is proposed with data-driven coordination and explicit properties of the continuous data processing. These properties can be used to optimize data workflows, i.e. to reduce the computational power needed to process the workflows in an engine by reusing intermediate processing results across several workflows.
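The tension the abstract identifies, infinite streams consumed by finite process instances, and the optimisation it targets, sharable intermediate results, can be sketched with Python generators. The sensor model and the running-average operator below are invented stand-ins, not the paper's formal model:

```python
from itertools import islice


def sensor():
    """A continuous, potentially infinite stream of readings."""
    t = 0
    while True:
        yield t * 0.5  # hypothetical sensor reading
        t += 1


def running_avg(stream):
    """An intermediate operator whose output several downstream
    workflows could share instead of recomputing it per workflow."""
    total, n = 0.0, 0
    for x in stream:
        total, n = total + x, n + 1
        yield total / n


# A finite workflow instance consumes only a window of the infinite stream.
window = list(islice(running_avg(sensor()), 4))
print(window)  # [0.0, 0.25, 0.5, 0.75]
```

Because generators are lazy, the infinite source does no work until a finite consumer pulls from it, which is one way to reconcile unbounded data with bounded process instances.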

    Analysis reuse exploiting taxonomical information and belief assignment in industrial problem solving

    Get PDF
    Taking into account experience feedback on solving complex problems in business is deemed a way to improve the quality of products and processes. Only a few academic works, however, are concerned with the representation and instrumentation of experience feedback systems. In this paper we propose a model of experiences and mechanisms for using them. More specifically, we wish to encourage the reuse of already-performed expert analyses to propose a priori analyses when solving a new problem. The proposal is based on a representation of the context of the experience using a conceptual marker, and on an explicit representation of the analysis incorporating expert opinions and the fusion of these opinions. The experience feedback models and inference mechanisms are integrated in a commercial support tool for problem-solving methodologies. The results obtained so far have already led to the definition of the role of "Rex Manager", with principles of sustainable management for continuous improvement of industrial processes in companies.
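The abstract does not spell out its fusion operator; one classical choice for combining expert belief assignments of the kind the title names is Dempster's rule of combination, sketched below. The frame of discernment (defect causes) and the mass values are invented for illustration:

```python
from itertools import product


def dempster(m1, m2):
    """Combine two basic belief assignments, given as dicts mapping
    frozenset hypotheses to masses, via Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass falling on the empty set
    # Normalise by the non-conflicting mass.
    return {s: v / (1.0 - conflict) for s, v in combined.items()}


# Two experts assessing the cause of a defect (hypothetical frame).
A = frozenset({"wear"})
B = frozenset({"misuse"})
AB = A | B  # "either cause" (ignorance)

m1 = {A: 0.6, AB: 0.4}   # expert 1 leans toward wear
m2 = {B: 0.3, AB: 0.7}   # expert 2 leans slightly toward misuse

fused = dempster(m1, m2)
print(round(fused[A], 3))  # 0.512
```

Conflict between the experts (here 0.6 x 0.3 = 0.18) is discarded and the remaining masses renormalised, so the fused assignment still sums to one; this is the step that lets a priori analyses weight agreeing expert opinions more strongly.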