
    Collaborative Knowledge Framework for Mediation Information System Engineering

    Against the worldwide background of inter-enterprise collaboration and interoperability, the automatic deduction of collaborative business processes is a crucial and pressing research subject. A methodology for deducing collaborative processes is designed by collecting collaborative knowledge. Owing to the complexity of the deduction methodology, a collaborative knowledge framework is defined to organize abstract and concrete collaborative information. The collaborative knowledge framework contains three dimensions: elements, levels, and life cycle. To define the framework more precisely, the relations in each dimension are explained in detail: (i) relations among elements, which organize the gathering order and methods of the different collaborative elements; (ii) relations within the life cycle, which present the modeling processes and agility management; and (iii) relations among levels, which define the relationships among the different levels of collaborative processes: strategy, operation, and support. This paper explains the collaborative knowledge framework and the relations within it.
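The three dimensions described above can be pictured as coordinates for indexing collaborative knowledge. The sketch below is our own illustration, not the paper's method: the level names come from the abstract, while the element kinds and life-cycle phases are assumptions chosen for the example.

```python
from dataclasses import dataclass

# Only the level names appear in the abstract; the element kinds and
# life-cycle phases below are assumed placeholders for illustration.
ELEMENTS = ("partner", "function", "objective")
LEVELS = ("strategy", "operation", "support")
LIFE_CYCLE = ("design", "run", "renewal")

@dataclass(frozen=True)
class KnowledgeItem:
    """A piece of collaborative knowledge located in the framework."""
    element: str
    level: str
    phase: str

    def __post_init__(self):
        # Reject items that fall outside the framework's three dimensions.
        if (self.element not in ELEMENTS
                or self.level not in LEVELS
                or self.phase not in LIFE_CYCLE):
            raise ValueError("item outside the framework's dimensions")

item = KnowledgeItem("partner", "strategy", "design")
```

A knowledge base could then be a simple collection of such items, queried by any of the three dimensions.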

    Structuring and Modeling Knowledge in the Context of Enterprise Resource Planning

    In recent years, the Information Technology (IT) industry has been overwhelmed by a new class of packaged application software collectively known as Enterprise Systems (ES). Enterprise Systems are comprehensive business operating systems that weave together all the data within an organisation's business processes and associated functional areas. In particular, ES provide organisations with the ability to manage data and information in a real-time environment and to integrate operations between various departments; capacities that had previously been unrealized in traditional information systems. ES have since been established as an integral development in the Information Systems (IS) field and extensively studied by academics. The implementation and operation of ES are known to be complex and costly undertakings that require knowledge and expertise from various areas and sources. The knowledge necessary for managing ES is diverse and varied; it extends from the application of knowledge in different phases of the ES life cycle to the exchange of knowledge between ES vendors, clients and consultants. The communication of knowledge between the various agents adds another dimension to the complex nature of ES. Thus, ES clients have been motivated to reduce costs and retain ES knowledge within the organisation. Research has been conducted on the critical success factors and issues involved in implementing ES. These studies often address the lack of appropriate in-house ES knowledge and the need to actively manage ES-related knowledge. With motivation from another area of research known as Knowledge Management, academia and industry have striven to provide solutions and strategies for managing ES-related knowledge. However, it is often not clear what this 'knowledge' is, what type(s) of knowledge are relevant, who possesses those type(s) of knowledge, and how knowledge can be instituted to facilitate the execution of processes.
This research aims to identify the relevant knowledge in the context of Enterprise Systems. The types of knowledge required for ES are derived by studying the knowledge (techne)[1] of different ES roles, managers and implementation consultants. This provides a perspective for understanding how ES knowledge can be structured. By applying a process modeling approach, the resulting understanding of how ES knowledge relates to roles and business processes demonstrates how knowledge can be modeled. The understanding of ES knowledge and how it can be managed is first formalized through the development of a conceptual framework based on the existing literature. An exploratory study found that the identification of ES knowledge was necessary before the other activities in the knowledge management dimension could be effected. As an appropriate concept of knowledge could not be derived from the IS literature, the concept of techne emerged from a more comprehensive literature review. Techne ('art', 'applied science' or 'skill') is defined as the trained ability of rationally producing, i.e. the ability to produce something reliably, under a variety of conditions, on the basis of reasoning. This involves having knowledge, or what seems to be knowledge (awareness), of whatever principles and patterns one relies on. With this foundation, the main focus of the research is a content analysis of the most popular implementation tool for Enterprise Systems management, ValueSAP. This tool is studied with respect to the types of knowledge (techne), roles and activities in ES implementation. The analysis of ValueSAP thus contributes to the understanding of the structure and distribution of knowledge in ES projects. Consequently, case studies were conducted to understand how the derived ES knowledge can be instituted in business processes using process modeling techniques. This part of the study demonstrates the modeling perspective of the research.
[1] The terms 'knowledge' and 'skills' are used interchangeably in this thesis; where the term 'knowledge' is mentioned, the author refers to the skills required in the ES context. This is elaborated further in Chapter 2 on techne and skills.

    Aligning business processes and work practices

    Current business process modeling methodologies offer little guidance on how to keep business process models aligned with their actual execution. This paper describes how to achieve this goal by uncovering and supervising business process models in connection with work practices using BAM, a methodology for business process modeling, supervision and improvement that works along two dimensions: the dimension of processes and the dimension of work practices. The business modeling component of BAM is illustrated with a case study in an organizational setting.

    Mapping Big Data into Knowledge Space with Cognitive Cyber-Infrastructure

    Big data research has attracted great attention in science, technology, industry and society. It is developing alongside the evolving scientific paradigm, the fourth industrial revolution, and the transformational innovation of technologies. However, its nature and fundamental challenges have not been fully recognized, and its own methodology has yet to be formed. This paper explores and answers the following questions: What is big data? What are the basic methods for representing, managing and analyzing big data? What is the relationship between big data and knowledge? Can we find a mapping from big data into knowledge space? What kind of infrastructure is required to support not only big data management and analysis but also knowledge discovery, sharing and management? What is the relationship between big data and the scientific paradigm? What is the nature and fundamental challenge of big data computing? A multi-dimensional perspective is presented toward a methodology of big data computing.

    Navigating in Complex Process Model Collections

    The increasing adoption of process-aware information systems (PAIS) has led to the emergence of large process model collections. In the automotive and healthcare domains, for example, such collections may comprise hundreds or thousands of process models, each consisting of numerous process elements (e.g., process tasks or data objects). In existing modeling environments, process models are presented to users in a rather static manner; i.e., as image maps not allowing for any context-specific user interactions. As process participants have different needs and thus require specific presentations of available process information, such static approaches are usually not sufficient to assist them in their daily work. For example, a business manager only requires an abstract overview of a process model collection, whereas a knowledge worker (e.g., a requirements engineer) needs detailed information on specific process tasks. In general, a more flexible navigation and visualization approach is needed, which allows process participants to flexibly interact with process model collections in order to navigate from a standard (i.e., default) visualization of a process model collection to a context-specific one. With the Process Navigation and Visualization (ProNaVis) framework, this thesis provides such a flexible navigation approach for large and complex process model collections. Specifically, ProNaVis enables the flexible navigation within process model collections along three navigation dimensions. First, the geographic dimension allows zooming in and out of the process models. Second, the semantic dimension may be utilized to increase or decrease the level of detail. Third, the view dimension allows switching between different visualizations. All three navigation dimensions have been addressed in an isolated fashion in existing navigation approaches so far, but only ProNaVis provides an integrated support for all three dimensions. 
    The concepts developed in this thesis were validated using various methods. First, they were implemented in the process navigation tool Compass, which has been used by several departments of an automotive OEM (Original Equipment Manufacturer). Second, the ProNaVis concepts were evaluated in two experiments investigating both navigation and visualization aspects. Third, the developed concepts were successfully applied to process-oriented information logistics (POIL). Experimental and empirical results provide evidence that ProNaVis enables much more flexible navigation in process model repositories than existing approaches.
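The three navigation dimensions described in this abstract can be pictured as an immutable navigation state that user interactions transform. The sketch below is an assumed API for illustration, not ProNaVis itself; only the three dimension names come from the text.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class NavState:
    """One point in the three-dimensional navigation space (illustrative)."""
    zoom: float = 1.0       # geographic dimension: zoom in/out of models
    detail: int = 0         # semantic dimension: 0 = most abstract overview
    view: str = "default"   # view dimension: the active visualization

    def zoom_in(self, factor: float = 2.0) -> "NavState":
        return replace(self, zoom=self.zoom * factor)

    def more_detail(self) -> "NavState":
        return replace(self, detail=self.detail + 1)

    def switch_view(self, view: str) -> "NavState":
        return replace(self, view=view)

# A manager stays at the default overview; a requirements engineer
# navigates all three dimensions toward a task-level, detailed view.
s = NavState().zoom_in().more_detail().switch_view("swimlane")
```

Keeping the state immutable makes it easy to record a navigation history and return to the default visualization.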

    Measuring Process Modelling Success

    Process-modelling has seen widespread acceptance, particularly on large IT-enabled Business Process Reengineering projects. It is applied, as a process design and management technique, across all life-cycle phases of a system. While there has been much research on aspects of process-modelling, little attention has focused on post-hoc evaluation of process-modelling success. This paper addresses this gap and presents a process-modelling success measurement (PMS) framework comprising four dimensions: process-model quality, model use, user satisfaction, and process-modelling impact. Measurement items for each dimension are also suggested.
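The four PMS dimensions lend themselves to a simple scored record. The sketch below is a hypothetical illustration: the dimension names come from the abstract, while the 1-5 rating scale and equal weighting are our assumptions, not the paper's measurement items.

```python
# Dimension names from the abstract; scale and weighting are assumed.
PMS_DIMENSIONS = ("model_quality", "model_use", "user_satisfaction", "impact")

def pms_score(ratings: dict) -> float:
    """Average 1-5 ratings over the four PMS dimensions (equal weights)."""
    missing = [d for d in PMS_DIMENSIONS if d not in ratings]
    if missing:
        raise ValueError(f"unrated dimensions: {missing}")
    return sum(ratings[d] for d in PMS_DIMENSIONS) / len(PMS_DIMENSIONS)

score = pms_score({"model_quality": 4, "model_use": 3,
                   "user_satisfaction": 5, "impact": 4})  # -> 4.0
```

Requiring every dimension to be rated reflects the framework's point that success is multi-dimensional: a high-quality model that nobody uses should not score well overall.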

    Using Ontologies for the Design of Data Warehouses

    Obtaining an implementation of a data warehouse is a complex task that forces designers to acquire wide knowledge of the domain, thus requiring a high level of expertise and making it a failure-prone task. Based on our experience, we have identified a set of situations encountered in real-world projects in which we believe that the use of ontologies would improve several aspects of the design of data warehouses. The aim of this article is to describe several shortcomings of current data warehouse design approaches and to discuss the benefits of using ontologies to overcome them. This work is a starting point for discussing the convenience of using ontologies in data warehouse design.
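One way an ontology can support data warehouse design is as a vocabulary against which a proposed schema is checked. The toy sketch below is our own illustration of that idea, not the article's method; the concepts and schema are invented for the example.

```python
# A tiny domain ontology: concept name -> properties (invented example).
ONTOLOGY = {
    "Customer": {"is_a": "Party"},
    "Product": {"is_a": "Item"},
    "Store": {"is_a": "Location"},
}

def unknown_dimensions(star_schema: dict) -> list:
    """Return proposed dimensions that have no backing ontology concept."""
    return [d for d in star_schema["dimensions"] if d not in ONTOLOGY]

# "Promotion" is flagged: the designer must either extend the ontology
# or reconsider the dimension, surfacing domain gaps early in design.
schema = {"fact": "Sales", "dimensions": ["Customer", "Product", "Promotion"]}
flagged = unknown_dimensions(schema)  # -> ["Promotion"]
```

Even this naive check illustrates the benefit the article argues for: the ontology encodes domain knowledge once, so designers need less ad-hoc expertise to validate a schema.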

    The Structured Process Modeling Method (SPMM) : what is the best way for me to construct a process model?

    More and more organizations turn to the construction of process models to support strategic and operational tasks. At the same time, reports indicate quality issues, caused by modeling errors, for a considerable share of these models. The research described in this paper therefore investigates the development of a practical method to determine and train an optimal process modeling strategy aimed at decreasing the number of cognitive errors made during modeling. Such cognitive errors originate in inadequate cognitive processing caused by the inherent complexity of constructing process models. The method helps modelers derive their personal cognitive profile and the related optimal cognitive strategy that minimizes these cognitive failures. The contribution of the research consists of the conceptual method and an automated modeling strategy selection and training instrument. Both artefacts were positively evaluated in a laboratory experiment covering multiple modeling sessions and involving a total of 149 master's students at Ghent University.
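The core idea of matching a cognitive profile to a modeling strategy can be sketched as a simple decision rule. Everything below is hypothetical: the profile fields and strategy names are invented for illustration; only the profile-to-strategy mapping idea comes from the abstract.

```python
def select_strategy(profile: dict) -> str:
    """Map a (hypothetical) cognitive profile to a modeling strategy."""
    if profile.get("working_memory") == "low":
        # Decompose the model first to limit simultaneous cognitive load.
        return "breadth-first decomposition"
    if profile.get("style") == "sequential":
        # Follow the control flow end to end before adding detail.
        return "flow-oriented modeling"
    return "aspect-oriented modeling"

strategy = select_strategy({"working_memory": "low"})
# -> "breadth-first decomposition"
```

A training instrument like the one the paper describes could then coach the modeler to apply the selected strategy consistently across sessions.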