5 research outputs found

    What we know and what we do not know about DMN

    The recent Decision Model and Notation (DMN) establishes business decisions as first-class citizens of executable business processes. This research note has two objectives: first, to describe DMN's technical and theoretical foundations; second, to identify research directions for investigating DMN's potential benefits at the technological, individual and organizational levels. To this end, we integrate perspectives from management science, cognitive theory and information systems research.
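The core DMN construct for treating decisions as first-class citizens is the decision table: rules pairing input conditions with an output, evaluated under a hit policy. A minimal sketch (plain Python, not a DMN-standard engine; the loan-routing rules are purely illustrative):

```python
# Sketch of a DMN-style decision table evaluated under a "first" hit policy:
# the output of the first rule whose conditions all match is returned.

def evaluate_decision_table(rules, **inputs):
    """Return the output of the first rule whose conditions all match."""
    for conditions, output in rules:
        if all(cond(inputs[name]) for name, cond in conditions.items()):
            return output
    return None  # no rule matched

# Hypothetical loan-routing decision, for illustration only.
loan_routing = [
    ({"credit_score": lambda s: s >= 700, "amount": lambda a: a <= 50_000}, "auto-approve"),
    ({"credit_score": lambda s: s >= 600}, "manual-review"),
    ({"credit_score": lambda s: s < 600}, "decline"),
]

print(evaluate_decision_table(loan_routing, credit_score=720, amount=30_000))  # auto-approve
print(evaluate_decision_table(loan_routing, credit_score=650, amount=80_000))  # manual-review
```

Because the table is data rather than control flow, the decision can be versioned and audited independently of the process that invokes it, which is the sense in which DMN makes decisions "first-class".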

    Integrated application of the BPMN and DMN modeling techniques in an administrative process of a federal institution of higher education

    The Business Process Management (BPM) principles and practices that guide organizations in managing their processes can also be applied to administrative processes. A fundamental part of managing processes is making them explicit, which provides knowledge of their essential elements; this can be done with modeling tools. The Business Process Model and Notation (BPMN) and Decision Model and Notation (DMN) modeling techniques, applied in an integrated manner, allow decision-aware business processes to be designed, separating process logic from decision logic. The objective of this work was the construction of process and decision models through the integrated application of the BPMN and DMN modeling techniques in an administrative process of a Federal Institution of Higher Education (IFES). A literature review was presented in order to portray the state of the art on the subject. The method used in this research was modeling, applying the problem-solving diagram through the construction of models, with the adoption of the shortcut called Loop I - II - III - I. The research cycle was carried out three times in sequence, the first cycle for process modeling and the others for decision modeling at the requirements and logic levels. The integrated use of the techniques resulted in independent and complementary process and decision models, which allow the elaboration of flexible documents that are easy to maintain and provide knowledge management aimed at the interested party and their area of expertise. The models portray the process and decisions as they are (as is), enabling the analysis of improvements, standardization of procedures, employee training, knowledge management and auditing in the development of activities.
This research contributes to the literature, since only four studies were identified concerning the integrated use of these modeling techniques in administrative processes, and none of them dealt specifically with processes belonging to an IFES. The research contributed in a practical way by building models that allowed the analysis and projection of improvements to the process and its decisions, including the construction of a model of the future state (to be). It also contributed by proposing a roadmap for the integrated application of the BPMN and DMN modeling techniques, so that it can be replicated in other administrative processes at an IFES. Finally, it specified its limitations and proposed recommendations for future work.
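The separation the abstract describes, process logic in BPMN, decision logic in DMN, can be sketched as follows. The process function only sequences tasks and delegates to a separately maintained decision function (in BPMN terms, a business rule task invoking a DMN decision). All names and the approval criteria are illustrative assumptions, not taken from the thesis:

```python
# Decision logic: in practice this would be a DMN decision table,
# maintained independently of the process model.
def decide_request_outcome(applicant):
    if applicant["documents_complete"] and applicant["enrolled"]:
        return "approve"
    return "reject"

# Process logic: an ordered flow (BPMN) that delegates the decision
# to the rule above instead of embedding it in a gateway condition.
def handle_request(applicant):
    steps = ["receive request", "check documents"]
    outcome = decide_request_outcome(applicant)  # business-rule task
    steps.append(f"notify applicant: {outcome}")
    return steps

print(handle_request({"documents_complete": True, "enrolled": True}))
```

Keeping the two apart is what yields the "independent and complementary" models the study reports: the decision criteria can change without redrawing the process flow, and vice versa.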

    On the enhancement of Big Data Pipelines through Data Preparation, Data Quality, and the distribution of Optimisation Problems

    Nowadays, data are fundamental for companies, providing operational support by facilitating daily transactions. Data have also become the cornerstone of strategic decision-making processes in businesses. For this purpose, there are numerous techniques that allow knowledge and value to be extracted from data. For example, optimisation algorithms excel at supporting decision-making processes to improve the use of resources, time and costs in the organisation. In the current industrial context, organisations usually rely on business processes to orchestrate their daily activities while collecting large amounts of information from heterogeneous sources. Therefore, the support of Big Data technologies (which are based on distributed environments) is required given the volume, variety and speed of the data. Then, in order to extract value from the data, a set of techniques or activities is applied in an orderly way and at different stages. This set of techniques or activities, which facilitates the acquisition, preparation and analysis of data, is known in the literature as Big Data pipelines. In this thesis, the improvement of three stages of Big Data pipelines is tackled: Data Preparation, Data Quality assessment and Data Analysis. These improvements can be addressed from an individual perspective, by focusing on each stage, or from a more complex and global perspective, implying the coordination of these stages to create data workflows. The first stage to improve is Data Preparation, by supporting the preparation of data with complex structures (i.e., data with various levels of nested structures, such as arrays). Shortcomings have been found in the literature and in current technologies for transforming complex data in a simple way. Therefore, this thesis aims to improve the Data Preparation stage through Domain-Specific Languages (DSLs). Specifically, two DSLs are proposed for different use cases.
While one of them is a general-purpose data transformation language, the other is a DSL aimed at extracting event logs in a standard format for process mining algorithms. The second area for improvement relates to the assessment of Data Quality. Depending on the type of Data Analysis algorithm, poor-quality data can seriously skew the results. A clear example is optimisation algorithms: if the data are not sufficiently accurate and complete, the search space can be severely affected. Therefore, this thesis formulates a methodology for modelling Data Quality rules adjusted to the context of use, as well as a tool that facilitates the automation of their assessment. This makes it possible to discard data that do not meet the quality criteria defined by the organisation. In addition, the proposal includes a framework that helps to select actions to improve the usability of the data. The third and last proposal involves the Data Analysis stage. In this case, this thesis faces the challenge of supporting the use of optimisation problems in Big Data pipelines. There is a lack of methodological solutions that allow computing exhaustive optimisation problems in distributed environments (i.e., those optimisation problems that guarantee finding an optimal solution by exploring the whole search space). The resolution of this type of problem in the Big Data context is computationally complex and can be NP-complete. This is caused by two different factors. On the one hand, the search space can increase significantly as the amount of data to be processed by the optimisation algorithms increases. This challenge is addressed through a technique to generate and group problems with distributed data. On the other hand, processing optimisation problems with complex models and large search spaces in distributed environments is not trivial. Therefore, a proposal is presented for a particular case in this type of scenario.
As a result, this thesis develops methodologies that have been published in scientific journals and conferences. The methodologies have been implemented in software tools that are integrated with the Apache Spark data processing engine. The solutions have been validated through tests and use cases with real datasets.
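The Data Quality step the thesis describes, declaring context-specific rules and discarding records that fail them before analysis, can be sketched in plain Python. The rule names, dimensions and thresholds here are illustrative assumptions, not the thesis's actual rule language or tooling:

```python
# Context-specific Data Quality rules: each rule names a quality dimension
# and a predicate over a record. Records failing any rule are set aside
# (with the names of the failed rules) before the analysis stage.

quality_rules = {
    "completeness: age present": lambda rec: rec.get("age") is not None,
    "accuracy: age in plausible range": lambda rec: rec.get("age") is None or 0 <= rec["age"] <= 120,
}

def assess(records, rules):
    """Split records into those usable for analysis and those discarded."""
    usable, discarded = [], []
    for rec in records:
        failed = [name for name, check in rules.items() if not check(rec)]
        (discarded if failed else usable).append((rec, failed))
    return usable, discarded

records = [{"age": 34}, {"age": None}, {"age": 999}]
usable, discarded = assess(records, quality_rules)
print(len(usable), len(discarded))  # 1 2
```

In a distributed pipeline the same predicate-per-rule structure would be applied as a filter on each partition (e.g. via Spark), so the quality gate scales with the data rather than requiring a centralised pass.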

    A framework for the integration of information requirements within infrastructure digital construction

    It can be anticipated that the adoption of digital construction/BIM processes on projects will enhance the efficiency of managing an asset over its lifecycle. Several initiatives have been taken to foster the implementation of Standard Methods and Procedures (SMP) related to BIM, such as the UK government’s mandate for them to be adopted on all centrally procured public sector projects. However, this research identifies that there are still many barriers hindering the adoption of BIM. To help break down these barriers, the initial stage of this research involved the implementation and analysis of BIM SMP on a highway infrastructure project in the UK. This entailed adopting the relevant procedures during construction of the project in order to better understand the challenges faced when adopting BIM, the barriers to adoption, and the type of information generated over the course of an infrastructure project. The analysis highlighted that there was still a need to align SMP with existing construction processes, as this was considered to be one of the greatest barriers to adoption. Further, it was observed that over 90% of the information handed over on completion was in flat file formats, thereby losing the benefits of data that can be readily queried and updated. Based on the findings of the initial stage, the research explores the process and digital construction domains in order to analyse how project-specific requirements can be identified. The research then explores which of these processes can be automated in order to enhance the reliability of the information that is collected. The thesis finally presents a framework developed to help engineers identify the project-specific information requirements and processes that are required to assure the successful implementation of a digital construction approach.
The framework was then trialled on an airport infrastructure project and identified processes that would have enhanced the implementation and delivery of the digital construction model.