
    Improving root cause analysis through the integration of PLM systems with cross supply chain maintenance data

    The purpose of this paper is to demonstrate a system architecture for integrating Product Lifecycle Management (PLM) systems with cross supply chain maintenance information to support root-cause analysis. By integrating product data from PLM systems with warranty claims, vehicle diagnostics and technical publications, engineers were able to improve root-cause analysis and close information gaps. Data collection was achieved via in-depth semi-structured interviews and workshops with experts from the automotive sector. Unified Modelling Language (UML) diagrams were used to design the proposed system architecture. A user scenario is also presented to demonstrate the functionality of the system.
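
    A minimal sketch in Python of the kind of integration described above: warranty claims and diagnostic trouble codes are grouped under the PLM part they concern, so all cross-supply-chain maintenance evidence for a component is visible in one place. All record structures and field names (part_no, claim_text, dtc_code) are hypothetical, not the paper's data model.

```python
# Hypothetical sketch: join PLM product data with warranty claims and
# vehicle diagnostics by part number to support root-cause analysis.
from collections import defaultdict

plm_parts = [
    {"part_no": "A100", "description": "fuel pump", "revision": "C"},
]
warranty_claims = [
    {"part_no": "A100", "claim_text": "engine stalls at idle"},
]
diagnostics = [
    {"part_no": "A100", "dtc_code": "P0087"},  # low fuel rail pressure
]

def collect_evidence(parts, claims, dtcs):
    """Group cross-supply-chain maintenance data under each PLM part."""
    evidence = defaultdict(lambda: {"claims": [], "dtcs": []})
    for c in claims:
        evidence[c["part_no"]]["claims"].append(c["claim_text"])
    for d in dtcs:
        evidence[d["part_no"]]["dtcs"].append(d["dtc_code"])
    return [{**p, **evidence[p["part_no"]]} for p in parts]

for row in collect_evidence(plm_parts, warranty_claims, diagnostics):
    print(row)
```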

    CrossFlow: Integrating Workflow Management and Electronic Commerce

    The CrossFlow architecture provides support for cross-organisational workflow management in dynamically established virtual enterprises. The creation of a business relationship, in which a service provider organisation performs a service on behalf of a consumer organisation, can be made dynamic when augmented by virtual market technology, dynamic configuration of the contract enactment infrastructures, and fine-grained service monitoring and control. Standard ways of describing services and contracts can be combined with matchmaking technology to create a virtual market for such service provision and consumption. A provider can then advertise its services in the market and consumers can search for a compatible business partner. This provides choice in selecting a partner and allows the decision to be deferred to a point in time where it can be made on the most up-to-date requirements of the consumer and service offers in the market. The penalty for deferred decision making is the time needed to set up the infrastructure in each organisation for the dynamically established contract. A further aspect of CrossFlow was therefore to exploit the contract in the dynamic and automatic configuration of the contract enactment and supervision infrastructures of the respective organisations, and in linking them in a dynamic fashion. The electronic contract, which results from the agreement between the newly established business partners, completely specifies the intended collaboration between them. Given the importance of the business process enacted by the provider, this includes fine-grained monitoring and control to allow tight co-operation between the organisations.
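
    The matchmaking step can be illustrated with a small sketch: providers advertise offers, and a consumer filters the market against its current requirements only when a decision is actually needed, which is the deferred decision making described above. The ServiceOffer fields and the matching rule are assumptions for illustration, not the CrossFlow service description standard.

```python
# Illustrative matchmaking in a virtual service market (assumed model).
from dataclasses import dataclass

@dataclass
class ServiceOffer:
    provider: str
    service_type: str
    max_duration_days: int
    price: float

def match(offers, service_type, deadline_days, budget):
    """Return offers compatible with the consumer's current requirements."""
    return [o for o in offers
            if o.service_type == service_type
            and o.max_duration_days <= deadline_days
            and o.price <= budget]

market = [
    ServiceOffer("LogiCo", "parcel-transport", 3, 120.0),
    ServiceOffer("FastFreight", "parcel-transport", 1, 200.0),
]
# Deferred decision: the consumer queries the market only once its
# requirements are final.
print(match(market, "parcel-transport", deadline_days=2, budget=250.0))
```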

    Engineering Workflow: The Process in Product Data Technology

    The prevailing paradigm for enterprises in the new decade is undoubtedly speed. This enterprise view is driven by the availability of e-business technology that enables new forms of collaboration between companies. The rapid developments in e-business also have an impact on the future of engineering organizations. This paper focuses on the early phases of a product's life cycle, i.e. between initial concept and release to manufacturing. New engineering workflow capabilities are presented that have been tailored to speed up the engineering of new products.

    CrossFlow: Cross-Organizational Workflow Management for Service Outsourcing in Dynamic Virtual Enterprises

    In this report, we present the approach to cross-organizational workflow management of the CrossFlow project. CrossFlow is a European research project aiming at the support of cross-organizational workflows in dynamic virtual enterprises. The cooperation in these virtual enterprises is based on dynamic service outsourcing specified in electronic contracts. Service enactment is performed by dynamically linking the workflow management infrastructures of the involved organizations. Extended service enactment support is provided in the form of cross-organizational transaction management and process control, advanced quality-of-service monitoring, and support for high-level flexibility in service enactment. CrossFlow technology is realized on top of a commercial workflow management platform and applied in two real-world scenarios, in the contexts of a logistics company and an insurance company.
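
    A toy sketch of how an electronic contract might drive the automatic configuration of enactment support on both sides, in the spirit of the contract-based linking described above. The contract keys (monitoring_interval_s, controllable_steps) are invented for illustration and do not reflect the actual CrossFlow contract model.

```python
# Invented contract structure; not the CrossFlow specification.
contract = {
    "provider": "LogiCo",
    "consumer": "InsureCorp",
    "service": "claim-assessment",
    "monitoring_interval_s": 60,   # fine-grained QoS monitoring
    "controllable_steps": ["assess", "approve"],  # consumer may intervene
}

def configure_enactment(contract):
    """Derive each organisation's runtime configuration from the contract."""
    provider_cfg = {
        "expose_status_every_s": contract["monitoring_interval_s"],
        "accept_control_on": contract["controllable_steps"],
    }
    consumer_cfg = {
        "poll_status_every_s": contract["monitoring_interval_s"],
        "may_control": contract["controllable_steps"],
    }
    return provider_cfg, consumer_cfg

provider_cfg, consumer_cfg = configure_enactment(contract)
print(provider_cfg)
print(consumer_cfg)
```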

    Addressing the tacit knowledge of a digital library system

    Recent surveys about Linked Data initiatives in library organizations report the experimental nature of related projects and the difficulty of re-using data to improve library services. This paper presents an approach for managing data together with its "tacit" organizational knowledge, i.e. the originating data context, to improve the interpretation of data meaning. By analyzing a digital library system, we prototyped a method for turning data management into "semantic data management", where local system knowledge is managed as data and natively treated as Linked Data. Semantic data management aims to curate consumers' correct understanding of Linked Datasets, guiding them towards proper re-use.
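
    The core idea, making the originating context explicit as data published alongside the data itself, can be pictured with a few triples. The ex: predicates and the date-encoding convention below are hypothetical examples, not the paper's vocabulary.

```python
# Hypothetical sketch: the system's tacit convention ("00" means the
# month/day are unknown) is published as explicit triples.
record = {"id": "doc42", "date": "1999-00-00"}  # local date convention

triples = [
    ("ex:doc42", "dc:date", record["date"]),
    # Tacit knowledge made explicit: how the local system encodes dates.
    ("ex:doc42", "ex:dateEncoding", "year-known-month-day-unknown"),
    ("ex:doc42", "ex:originatingSystem", "legacy-digital-library"),
]
for s, p, o in triples:
    print(f'{s} {p} "{o}" .')
```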

    Expressing the tacit knowledge of a digital library system as linked data

    Library organizations have enthusiastically undertaken semantic web initiatives, in particular the publishing of data as linked data. Nevertheless, different surveys report the experimental nature of these initiatives and the difficulty consumers have in re-using data. These barriers hinder the use of linked datasets as an infrastructure that enhances the library and related information services. This paper presents an approach for encoding, as a Linked Vocabulary, the "tacit" knowledge of the information system that manages the data source. The objective is to improve the process of interpreting the meaning of published linked datasets. We analyzed a digital library system as a case study for prototyping the "semantic data management" method, where data and its knowledge are natively managed together, taking into account the linked data pillars. The ultimate objective of semantic data management is to curate consumers' correct interpretation of data and to facilitate proper re-use. The prototype defines the ontological entities representing the knowledge of the digital library system that is stored neither in the data source nor in the existing ontologies related to the system's semantics. We therefore present the local ontology and its matching with existing ontologies, Preservation Metadata Implementation Strategies (PREMIS) and the Metadata Object Description Schema (MODS), and we discuss linked data triples prototyped from the legacy relational database using the local ontology. We show how semantic data management can deal with the inconsistency of system data, and we conclude that a specific change in the system developer's mindset is necessary for extracting and "codifying" the tacit knowledge needed to improve the data interpretation process.
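
    A hedged sketch, using the rdflib library, of what such a mapping might look like: rows from a legacy relational database become triples aligned with MODS and PREMIS, with a small local ontology covering the system knowledge neither standard stores. The namespace URIs and every property choice here are assumptions for illustration, not the paper's actual local ontology.

```python
# Assumed namespaces and properties; illustrative only.
from rdflib import Graph, Literal, Namespace, URIRef

MODS = Namespace("http://www.loc.gov/mods/rdf/v1#")      # assumed URI
PREMIS = Namespace("http://www.loc.gov/premis/rdf/v1#")  # assumed URI
LOCAL = Namespace("http://example.org/dl-ontology#")     # local ontology

rows = [  # rows as they might come from the legacy relational database
    {"id": 42, "title": "Annual report 1999", "ingest_agent": "batch-loader-v2"},
]

g = Graph()
g.bind("mods", MODS)
g.bind("premis", PREMIS)
g.bind("local", LOCAL)
for row in rows:
    item = URIRef(f"http://example.org/item/{row['id']}")
    g.add((item, MODS.title, Literal(row["title"])))  # descriptive metadata
    # Knowledge held neither in the data source nor in MODS/PREMIS:
    # which local software agent produced the record.
    g.add((item, LOCAL.ingestAgent, Literal(row["ingest_agent"])))

print(g.serialize(format="turtle"))
```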

    Integration of BPM systems

    New technologies have emerged to support the global economy where, for instance, suppliers, manufacturers and retailers work together to minimise cost and maximise efficiency. One technology that has become a buzzword for many businesses is business process management, or BPM. A business process comprises activities and tasks, the resources required to perform each task, and the business rules linking these activities and tasks. The tasks may be performed by human and/or machine actors. Workflow provides a way of describing the order of execution and the dependency relationships between the constituent activities of short- or long-running processes. Workflow allows businesses to capture not only the information but also the processes that transform the information - the process asset (Koulopoulos, T. M., 1995). Applications which involve automated, human-centric and collaborative processes across organisations are inherently different from one organisation to another. Even within the same organisation, applications are adapted over time, as ongoing change to business processes is the norm in today's dynamic business environment. The major difference lies in the specifics of business processes, which change rapidly to match the way in which businesses operate. In this chapter we introduce and discuss Business Process Management (BPM) with a focus on the integration of heterogeneous BPM systems across multiple organisations. We identify the problems and the main challenges, not only with regard to technologies but also in the social and cultural context, and we discuss the issues that have arisen in our bid to find solutions.
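
    The workflow notions used here (activities, human or machine actors, and dependency rules fixing the order of execution) can be made concrete with a generic sketch; it illustrates no particular BPM system's model, and the task names are invented.

```python
# Generic workflow sketch: tasks, actors, and dependency-driven ordering.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    actor: str                       # "human" or "machine"
    depends_on: list = field(default_factory=list)

process = [
    Task("receive_order", "machine"),
    Task("check_credit", "machine", depends_on=["receive_order"]),
    Task("approve_exception", "human", depends_on=["check_credit"]),
    Task("ship_goods", "machine", depends_on=["approve_exception"]),
]

def execution_order(tasks):
    """Topologically order tasks so each runs after its dependencies."""
    done, order = set(), []
    while len(order) < len(tasks):
        for t in tasks:
            if t.name not in done and all(d in done for d in t.depends_on):
                done.add(t.name)
                order.append(t.name)
    return order

print(execution_order(process))
```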

    On systematic approaches for interpreted information transfer of inspection data from bridge models to structural analysis

    In conjunction with improved methods of monitoring damage and degradation processes, interest in the reliability assessment of reinforced concrete bridges has increased in recent years. Automated image-based inspections of the structural surface provide valuable data from which quantitative information about deterioration, such as crack patterns, can be extracted. However, the knowledge gain results from processing this information in a structural context, i.e. relating the damage artifacts to building components; this enables the transfer to structural analysis. This approach sets two further requirements: the availability of structural bridge information and standardized storage for interoperability with subsequent analysis tools. Since the large datasets involved can only be processed efficiently in an automated manner, this work targets the implementation of the complete workflow from damage and building data to structural analysis. First, domain concepts are derived from the back-end tasks: structural analysis, damage modeling, and life-cycle assessment. The common interoperability format, the Industry Foundation Classes (IFC), and the processes in these domains are further assessed. The need for user-controlled interpretation steps is identified, and the developed prototype therefore allows interaction at subsequent model stages. This has the advantage that interpretation steps can be separated into either a structural analysis model, a damage information model, or a combination of both. This approach to damage information processing from the perspective of structural analysis is then validated in different case studies.
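
    A simplified sketch of the interpreted information transfer: crack artifacts detected on the surface are assigned to bridge components by location, producing input for a damage information model. The bounding-box geometry and the point-in-box rule are stand-ins for the IFC-based geometry handling of the actual prototype.

```python
# Simplified damage-to-component assignment; geometry model is assumed.
from dataclasses import dataclass

@dataclass
class Component:
    name: str                  # e.g. an IFC building element
    bbox: tuple                # (xmin, xmax, ymin, ymax, zmin, zmax)

@dataclass
class Crack:
    location: tuple            # (x, y, z) of the detected artifact
    width_mm: float

def assign(cracks, components):
    """Relate each damage artifact to the component whose extent contains it."""
    result = {}
    for crack in cracks:
        x, y, z = crack.location
        for c in components:
            xmin, xmax, ymin, ymax, zmin, zmax = c.bbox
            if xmin <= x <= xmax and ymin <= y <= ymax and zmin <= z <= zmax:
                result.setdefault(c.name, []).append(crack)
    return result

deck = Component("bridge_deck", (0, 30, 0, 10, 4, 5))
cracks = [Crack((12.5, 3.0, 4.5), width_mm=0.3)]
print(assign(cracks, [deck]))
```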

    Extracting, Transforming and Archiving Scientific Data

    It is becoming common to archive research datasets that are not only large but also numerous. In addition, their corresponding metadata and the software required to analyse or display them need to be archived. Yet the manual curation of research data can be difficult and expensive, particularly in very large digital repositories; hence the importance of models and tools for automating digital curation tasks. The automation of these tasks faces three major challenges: (1) research data and data sources are highly heterogeneous, (2) future research needs are difficult to anticipate, and (3) data is hard to index. To address these problems, we propose the Extract, Transform and Archive (ETA) model for managing and mechanizing the curation of research data. Specifically, we propose a scalable strategy for addressing the research-data problem, ranging from the extraction of legacy data to its long-term storage. We review some existing solutions and propose novel avenues of research.
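
    The three ETA stages can be pictured as a simple pipeline. The stage interfaces and the checksum-based indexing below are a simplification for illustration, not the paper's formal model.

```python
# Schematic Extract-Transform-Archive pipeline (assumed interfaces).
import hashlib
import json

def extract(source_records):
    """Pull heterogeneous legacy records into a common dict form."""
    return [dict(r) for r in source_records]

def transform(records):
    """Normalise and enrich records with metadata needed for curation."""
    for r in records:
        r["checksum"] = hashlib.sha256(
            json.dumps(r, sort_keys=True).encode()).hexdigest()
    return records

def archive(records, store):
    """Write records to long-term storage, indexed by checksum."""
    for r in records:
        store[r["checksum"]] = r
    return store

store = archive(transform(extract([{"dataset": "sensor-run-7"}])), {})
print(list(store))
```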

    Data-driven Design of Engineering Processes with COREPROModeler

    Enterprises increasingly demand IT support for the coordination of their engineering processes, which often consist of hundreds or even thousands of sub-processes. From a technical viewpoint, these sub-processes have to be executed concurrently and synchronized with respect to numerous interdependencies. So far, this coordination has mainly been accomplished manually, which has resulted in errors and inconsistencies. In order to deal with this problem, we have to better understand the interdependencies between the sub-processes to be coordinated. In particular, we can benefit from the fact that sub-processes are often correlated with the assembly of a product (represented by a product data structure). This information can be utilized for the modeling and execution of so-called data-driven process structures. In this paper, we present the COREPRO demonstrator, which supports the data-driven modeling of these process structures. The approach explicitly establishes a close linkage between product data structures and engineering processes.
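
    The data-driven idea can be illustrated with a sketch in which the product data structure (an assembly tree) induces a structure of synchronized sub-processes, one per component. The car/engine example and the "test after all sub-components" rule are invented for illustration, not COREPRO's actual modeling constructs.

```python
# Invented example: derive a process structure from a product data structure.
assembly = {            # product data structure: component -> sub-components
    "car": ["engine", "chassis"],
    "engine": ["piston"],
    "chassis": [],
    "piston": [],
}

def derive_process_structure(assembly, root):
    """Give each component a 'test' sub-process that may start only after
    the sub-processes of all its sub-components have finished."""
    deps = {}
    def visit(node):
        deps[f"test_{node}"] = [f"test_{child}" for child in assembly[node]]
        for child in assembly[node]:
            visit(child)
    visit(root)
    return deps

for proc, prereqs in derive_process_structure(assembly, "car").items():
    print(proc, "<-", prereqs)
```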