338 research outputs found

    The Future of Modeling in Material Handling Systems

    Today, when we talk about “modeling” in the context of material handling systems, we are invariably referring to a mathematical or computational model for analyzing some aspect of the system, such as its throughput rate, response time, cost of ownership, or required storage capacity. Creating these kinds of models requires considerable knowledge in at least two domains, the material handling system domain and the analysis methodology domain, as well as considerable skill in the “art of modeling” needed to express the former in the terms of the latter. The results can be somewhat ad hoc; for example, two different modelers are likely to create two somewhat different simulation models of exactly the same material handling system. In the past, the situation in software development was very similar, with individual programming experts idiosyncratically driving software development. Over the past twenty years, however, computer scientists and software engineers have created a radically different approach to software “modeling”, called Model Driven Architecture (MDA), that is used to create software for standard applications. The thesis of this paper is that MDA can be adapted to the kind of modeling done to support design and operational decision making in material handling systems. The paper describes MDA technologies in the context of material handling system modeling, and explains how adapting this approach to our context will transform the way we do research and the way material handling systems are analyzed and designed in practice.
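    To make the MDA idea concrete, the sketch below shows, in miniature, what separating a platform-independent model from a derived analysis model could look like; the conveyor attributes, names, and formula are invented for illustration, not taken from the paper.

```python
# Illustrative sketch only: a platform-independent model (PIM) of a simple
# conveyor line, and a "transformation" that derives an analysis model
# (here, a throughput estimate) from it. All names are hypothetical.

from dataclasses import dataclass

@dataclass
class Conveyor:
    name: str
    speed_m_per_s: float   # belt speed
    spacing_m: float       # minimum gap between loads

def throughput_per_hour(c: Conveyor) -> float:
    """Derive one analysis model from the PIM: loads moved per hour."""
    return 3600.0 * c.speed_m_per_s / c.spacing_m

line = Conveyor(name="main_line", speed_m_per_s=1.5, spacing_m=0.6)
print(f"{line.name}: {throughput_per_hour(line):.0f} loads/hour")
```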

    A Process Warehouse for Process Variants Analysis

    Process model variants are collections of similar process models that have evolved over time because of adjustments made to a particular process in a given domain, e.g., an order-to-cash or procure-to-pay process in the reseller or procurement domain. These adjustments produce variations between process models that should be largely identical but may differ slightly. The variations stem from new procedures, differing legal regulations across countries, different decision histories and organizational responsibilities, and different requirements across branches of an enterprise. Existing data warehouse approaches fail to adequately abstract and consolidate all variants into one generic process model that would make it possible to distinguish and compare different parts of different variants. This shortcoming affects the decision making of business analysts in a specific process context. This paper addresses it by proposing a framework to analyse process variants. The framework consists of two original contributions: (i) a novel meta-model of processes that serves as a generic data model to capture and consolidate process variants into a reference process model; and (ii) a process warehouse model that performs typical online analytical processing operations on the variation parts, thus supporting decision-making through KPIs. The framework concepts were defined and validated using a real-life case study. Moreover, a prototype was implemented to support the validation of the framework, and performance dashboards are provided with detailed statistics at different levels of abstraction.
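    The following sketch illustrates the flavor of contributions (i) and (ii) under invented names: a reference process model whose variation points hold variant-specific fragments, and a trivial KPI rollup of the kind a process warehouse could expose via OLAP.

```python
# Minimal sketch with a hypothetical schema, not the paper's meta-model.

from dataclasses import dataclass, field
from collections import defaultdict

@dataclass
class VariationPoint:
    name: str
    fragments: dict = field(default_factory=dict)  # variant_id -> activities

@dataclass
class ReferenceProcess:
    name: str
    common_activities: list
    variation_points: list

o2c = ReferenceProcess(
    name="order-to-cash",
    common_activities=["receive order", "ship goods", "collect payment"],
    variation_points=[
        VariationPoint("invoicing", {
            "DE": ["create invoice", "apply VAT"],
            "US": ["create invoice", "apply sales tax"],
        })
    ],
)

# Toy KPI: how many variant-specific activities does each variant add?
extra = defaultdict(int)
for vp in o2c.variation_points:
    for variant, acts in vp.fragments.items():
        extra[variant] += len(acts)
print(dict(extra))  # {'DE': 2, 'US': 2}
```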

    A unified view of data-intensive flows in business intelligence systems: a survey

    Data-intensive flows are central processes in today’s business intelligence (BI) systems, deploying different technologies to deliver data, from a multitude of data sources, in user-preferred and analysis-ready formats. To meet the complex requirements of next generation BI systems, we often need an effective combination of the traditionally batched extract-transform-load (ETL) processes that populate a data warehouse (DW) from integrated data sources, and more real-time and operational data flows that integrate source data at runtime. Both academia and industry thus need a clear understanding of the foundations of data-intensive flows and of the challenges of moving towards next generation BI environments. In this paper we present a survey of today’s research on data-intensive flows and the related fundamental fields of database theory. The study is based on a proposed set of dimensions describing the important challenges of data-intensive flows in the next generation BI setting. As a result of this survey, we envision an architecture of a system for managing the lifecycle of data-intensive flows. The results further provide a comprehensive understanding of data-intensive flows, recognizing the challenges that are still to be addressed and showing how current solutions can be applied to address them.
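    As a rough illustration of the two flow styles the survey contrasts, the toy sketch below pairs a batched ETL step with an operational flow that integrates records at runtime; all names and data are made up.

```python
# Toy sketch: batch ETL into a stand-in "warehouse table" versus an
# operational flow that transforms events as they arrive.

dw_table = []  # stands in for a data warehouse table

def batch_etl(source_rows):
    """Extract-transform-load: clean in bulk, then load once."""
    cleaned = [{"id": r["id"], "amount": round(r["amount"], 2)}
               for r in source_rows if r.get("amount") is not None]
    dw_table.extend(cleaned)

def operational_flow(event_stream):
    """Integrate source data at runtime, record by record."""
    for event in event_stream:
        yield {**event, "amount": round(event["amount"], 2)}

batch_etl([{"id": 1, "amount": 10.017}, {"id": 2, "amount": None}])
live = operational_flow(iter([{"id": 3, "amount": 5.499}]))
print(dw_table, list(live))
```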

    Web service composition: A survey of techniques and tools

    Web services are a consolidated reality of the modern Web with a tremendous and increasing impact on everyday computing tasks. They turned the Web into the largest, most accepted, and most vivid distributed computing platform ever. Yet the use and integration of Web services into composite services or applications, a highly delicate and conceptually non-trivial task, has yet to unleash its full power. A consolidated analysis framework that advances the fundamental understanding of Web service composition building blocks, in terms of concepts, models, languages, productivity support techniques, and tools, is required. Such a framework is necessary to enable the effective exploration, understanding, assessment, comparison, and selection of service composition models, languages, techniques, platforms, and tools. This article establishes such a framework and reviews the state of the art in service composition from an unprecedented, holistic perspective.
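    The sketch below illustrates the basic orchestration idea behind service composition, a composite service calling two component services and combining their results; the endpoints and payloads are hypothetical.

```python
# Hedged sketch of orchestration-style composition. The URLs and JSON
# fields are invented; a real composition would also handle faults,
# retries, and transactions.

import json
from urllib import request

def call_service(url: str, payload: dict) -> dict:
    req = request.Request(url, data=json.dumps(payload).encode(),
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)

def composite_quote(order: dict) -> dict:
    """Orchestration: output of one service feeds the next."""
    price = call_service("https://example.com/pricing", order)
    shipping = call_service("https://example.com/shipping",
                            {"weight": order["weight"]})
    return {"total": price["amount"] + shipping["cost"]}

# composite_quote({"item": "x", "weight": 2.0})  # would POST to both services
```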

    A BPMN-Based Design and Maintenance Framework for ETL Processes

    Business Intelligence (BI) applications require the design, implementation, and maintenance of processes that extract, transform, and load suitable data for analysis. The development of these processes (known as ETL) is an inherently complex problem that is typically costly and time consuming. In a previous work, we proposed a vendor-independent language to reduce the design complexity caused by disparate ETL languages tailored to specific design tools with steep learning curves. Nevertheless, the designer still faces two major issues during the development of ETL processes: (i) how to implement the designed processes in an executable language, and (ii) how to maintain the implementation when the organization's data infrastructure evolves. In this paper, we propose a model-driven framework that provides automatic code generation and improves the maintenance support of our ETL language. We present a set of model-to-text transformations able to produce code for different commercial ETL tools, as well as model-to-model transformations that automatically update the ETL models so as to keep the generated code consistent with evolving data sources. A demonstration using an example is conducted as an initial validation, showing that the framework, which covers modeling, code generation, and maintenance, could be used in practice.
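    A minimal sketch of the model-to-text idea, assuming a made-up model schema: a vendor-independent ETL model rendered into SQL for one hypothetical target platform. A model-to-model transformation would, analogously, rewrite the model itself, for instance when a source column is renamed.

```python
# Hypothetical vendor-independent ETL model; not the paper's language.
etl_model = {
    "source": "staging.orders",
    "target": "dw.fact_orders",
    "filter": "status = 'SHIPPED'",
    "mappings": {"order_id": "id", "total": "amount"},  # target -> source
}

def to_sql(model: dict) -> str:
    """Model-to-text transformation targeting plain SQL."""
    cols = ", ".join(f"{src} AS {dst}"
                     for dst, src in model["mappings"].items())
    return (f"INSERT INTO {model['target']} ({', '.join(model['mappings'])})\n"
            f"SELECT {cols} FROM {model['source']}\n"
            f"WHERE {model['filter']};")

print(to_sql(etl_model))
```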

    Supporting adaptiveness of cyber-physical processes through action-based formalisms

    Cyber Physical Processes (CPPs) refer to a new generation of business processes enacted in many application environments (e.g., emergency management, smart manufacturing, etc.), in which the presence of Internet-of-Things devices and embedded ICT systems (e.g., smartphones, sensors, actuators) strongly influences the coordination of the real-world entities (e.g., humans, robots, etc.) inhabiting such environments. A Process Management System (PMS) employed for executing CPPs is required to automatically adapt its running processes to anomalous situations and exogenous events while minimising human intervention. In this paper, we tackle this issue by introducing an approach and an adaptive Cognitive PMS, called SmartPM, which combines process execution monitoring, unanticipated exception detection, and automated resolution strategies, leveraging three well-established action-based formalisms developed for reasoning about actions in Artificial Intelligence (AI): the situation calculus, IndiGolog, and automated planning. Interestingly, the use of SmartPM does not require any expertise in the internal workings of the AI tools involved in the system.
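    The adapt-on-exception loop can be pictured with the toy sketch below: expected and sensed states are compared after each task, and a planner stub produces a recovery plan on mismatch. The real system reasons with the situation calculus and IndiGolog; everything here is a placeholder.

```python
# Very simplified monitor-and-repair loop; the planner is a stub.

def plan_recovery(expected: dict, actual: dict) -> list:
    """Stub standing in for an automated planner."""
    return [f"restore {k}" for k in expected if actual.get(k) != expected[k]]

def execute(process: list, sense):
    for task, expected in process:
        print(f"executing {task}")
        actual = sense()
        if actual != expected:              # unanticipated exception
            for step in plan_recovery(expected, actual):
                print(f"  recovery: {step}")

execute([("move robot", {"at": "area3"})],
        sense=lambda: {"at": "area2"})      # simulated deviation
```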

    A standards-based ICT framework to enable a service-oriented approach to clinical decision support

    This research provides evidence that standards-based Clinical Decision Support (CDS) at the point of care is an essential ingredient of electronic healthcare service delivery. A Service Oriented Architecture (SOA) based solution is explored that serves as a task management system coordinating complex, distributed, and disparate IT systems, processes, and resources (human and computer) to provide standards-based CDS. The research offers a solution to the challenges of implementing computerised CDS, such as integration with heterogeneous legacy systems and the reuse of components and services to reduce costs and save time. The benefits of a sharable CDS service that can be reused by different healthcare practitioners to provide collaborative patient care are demonstrated. The solution orchestrates different services by extracting data from sources such as patient databases, clinical knowledge bases, and evidence-based clinical guidelines (CGs) in order to serve multiple CDS requests coming from different healthcare settings. The architecture aims to help users at different levels of Healthcare Delivery Organizations (HCOs) maintain a CDS repository and monitor and manage services, thus enabling transparency. The research employs the Design Science Research Methodology (DSRM) combined with The Open Group Architecture Framework (TOGAF), an Enterprise Architecture Framework (EAF). DSRM’s iterative capability addresses the rapidly evolving nature of workflows in healthcare. The SOA-based solution uses standards-based open source technologies and platforms together with the latest healthcare standards from HL7 and OMG, namely the Decision Support Service (DSS) and the Retrieve, Locate, and Update Service (RLUS) standards. Combining business process management (BPM) technologies and business rules with SOA ensures the HCO’s capability to manage its processes. The architectural solution is evaluated by successfully implementing evidence-based CGs at the point of care in areas such as: (a) diagnostics (chronic obstructive pulmonary disease), (b) urgent referral (lung cancer), and (c) genome testing and integration with CDS in screening (Lynch syndrome). In addition to medical care, the CDS solution can benefit organizational processes for collaborative care delivery by connecting patients, physicians, and other associated members. The framework facilitates the integration of the different types of CDS suited to different healthcare processes, enabling sharable CDS capabilities within and across organizations.
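    A minimal sketch of what a sharable CDS evaluation could look like behind such a service interface; the rule and thresholds are illustrative placeholders, not clinical guidance and not the thesis's actual guideline content.

```python
# Toy CDS service function: evaluate a (made-up) guideline rule against
# patient data and return recommendations. Not for clinical use.

def cds_evaluate(patient: dict) -> dict:
    recommendations = []
    # placeholder standing in for an urgent-referral guideline rule
    if (patient.get("age", 0) > 40
            and "persistent cough" in patient.get("symptoms", [])):
        recommendations.append("consider urgent chest X-ray referral")
    return {"patient_id": patient["id"], "recommendations": recommendations}

print(cds_evaluate({"id": "p1", "age": 57,
                    "symptoms": ["persistent cough"]}))
```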

    BPMN4sML: A BPMN Extension for Serverless Machine Learning. Technology Independent and Interoperable Modeling of Machine Learning Workflows and their Serverless Deployment Orchestration

    Machine learning (ML) continues to permeate all layers of academia, industry, and society. Despite its successes, mental frameworks to capture and represent machine learning workflows in a consistent and coherent manner are lacking. For instance, the de facto process modeling standard, Business Process Model and Notation (BPMN), managed by the Object Management Group (OMG), is widely accepted and applied but lacks specific support for representing machine learning workflows. Further, the number of heterogeneous tools for the deployment of machine learning solutions can easily overwhelm practitioners. Research is needed to align the process from modeling to deploying ML workflows. We analyze the requirements for standards-based conceptual modeling of machine learning workflows and their serverless deployment. Confronting the shortcomings with respect to consistent and coherent modeling of ML workflows in a technology-independent and interoperable manner, we extend BPMN's Meta-Object Facility (MOF) metamodel and the corresponding notation, introducing BPMN4sML (BPMN for serverless machine learning). Our extension BPMN4sML follows the same outline referenced by the OMG for BPMN. We further address the heterogeneity in deployment by proposing a conceptual mapping that converts BPMN4sML models to corresponding deployment models using TOSCA. BPMN4sML allows technology-independent and interoperable modeling of machine learning workflows of various granularity and complexity across the entire machine learning lifecycle. It aids in arriving at a shared and standardized language for communicating ML solutions. Moreover, it takes the first steps toward enabling the conversion of ML workflow model diagrams to corresponding deployment models for serverless deployment via TOSCA. (105 pages, 3 tables, 33 figures)
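    The conceptual mapping can be pictured with the sketch below, which converts a made-up dict standing in for a BPMN4sML task into a TOSCA-like node template; the type names are hypothetical and not drawn from the TOSCA specification or the paper.

```python
# Placeholder for a BPMN4sML workflow element; fields are invented.
ml_task = {"name": "train_model", "kind": "ServerlessFunction",
           "artifact": "train.zip", "memory_mb": 512}

def to_tosca_node(task: dict) -> str:
    """Render the task as a TOSCA-like node template (YAML-ish text)."""
    return "\n".join([
        f"{task['name']}:",
        f"  type: hypothetical.nodes.{task['kind']}",
        "  properties:",
        f"    memory: {task['memory_mb']}",
        "  artifacts:",
        f"    code: {task['artifact']}",
    ])

print(to_tosca_node(ml_task))
```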