
    Specifications and Development of Interoperability Solution dedicated to Multiple Expertise Collaboration in a Design Framework

    This paper describes the specifications of an interoperability platform based on the PPO (Product Process Organization) model developed by the French IPPOP community in the context of collaborative and innovative design. Using the PPO model as a reference, this work aims to connect heterogeneous tools used by experts, easing data and information exchange. After underlining the growing needs of the collaborative design process, the paper focuses on the concept of interoperability by describing current solutions and their limits. A solution based on the flexibility of the PPO model, adapted to the philosophy of interoperability, is then proposed. To illustrate these concepts, several examples are described in detail (robustness analysis, connections to CAD and Product Lifecycle Management systems).

    Ship product modelling

    This paper is a fundamental review of ship product modelling techniques, with a focus on determining the state of the art, identifying any shortcomings and proposing future directions. The review addresses ship product data representations, product modelling techniques and integration issues, and life phase issues. The most significant development has been the construction of the ship Standard for the Exchange of Product Data (STEP) application protocols. However, difficulty has been observed with the general uptake of the standards, in particular in their application to legacy systems, often resulting in embellishments to the standards that limit the ability to further exchange the product data. The EXPRESS modelling language is increasingly being superseded by the Extensible Markup Language (XML) as a method to map the STEP data, due to its wider support throughout the information technology industry and its more obvious structure and hierarchy. The associated XML files are, however, larger than those produced using the EXPRESS language and make further demands on the already considerable storage required for the ship product model. Seamless integration between legacy applications appears to be difficult to achieve using current technologies, which often rely on manual interaction for the translation of files. The paper concludes with a discussion of future directions that aim to either solve or alleviate these issues.
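
    To make the EXPRESS-versus-XML trade-off concrete, the short Python sketch below serializes a single, hypothetical ship-part record to XML using only the standard library; the element and attribute names are illustrative assumptions, not taken from any STEP application protocol. The repeated tag structure hints at why XML mappings of STEP data tend to be larger than their EXPRESS counterparts.

        # Minimal sketch: serializing a hypothetical ship-part record to XML.
        # Entity and attribute names are illustrative, not drawn from a STEP AP.
        import xml.etree.ElementTree as ET

        def part_to_xml(part_id, name, material, mass_kg):
            # Build an XML element tree for one part; every value becomes a child
            # element, which makes XML mappings more verbose than EXPRESS instances.
            root = ET.Element("ShipPart", attrib={"id": part_id})
            ET.SubElement(root, "Name").text = name
            ET.SubElement(root, "Material").text = material
            ET.SubElement(root, "MassKg").text = str(mass_kg)
            return ET.tostring(root, encoding="unicode")

        if __name__ == "__main__":
            print(part_to_xml("P-001", "Transverse bulkhead plate", "AH36 steel", 412.5))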

    BlogForever D3.2: Interoperability Prospects

    This report evaluates the interoperability prospects of the BlogForever platform. To this end, existing interoperability models are reviewed; a Delphi study is conducted to identify crucial aspects of interoperability for web archives and digital libraries; technical interoperability standards and protocols are reviewed for their relevance to BlogForever; a simple approach to considering interoperability in specific usage scenarios is proposed; and a tangible approach to developing a succession plan, allowing reliable transfer of content from the current digital archive to other digital repositories, is presented.

    Linking design and manufacturing domains via web-based and enterprise integration technologies

    The manufacturing industry faces many challenges, such as reducing time-to-market and cutting costs. In order to meet these increasing demands, effective methods are needed to support the early product development stages by bridging the gap between communicating early design ideas and evaluating manufacturing performance. This paper introduces methods of linking the design and manufacturing domains using disparate technologies. The combined technologies include knowledge management support for product lifecycle management (PLM) systems, enterprise resource planning (ERP) systems, aggregate process planning systems, workflow management and data exchange formats. A case study has been used to demonstrate the use of these technologies, illustrated by adding manufacturing knowledge to generate alternative early process plans, which are in turn used by an ERP system to obtain and optimise a rough-cut capacity plan.
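
    To make the rough-cut capacity planning step tangible, the Python sketch below applies a simple bill-of-capacity check: the planned quantity is multiplied by per-unit hours at each work centre and compared with available hours. The work centres, hours-per-unit figures and planned quantity are illustrative assumptions, not data from the paper's case study.

        # Minimal sketch: a rough-cut capacity check using a bill-of-capacity approach.
        # All figures below are illustrative assumptions.

        # Hours required per unit of the product at each work centre.
        HOURS_PER_UNIT = {"milling": 0.8, "turning": 0.5, "assembly": 1.2}

        # Hours available at each work centre in the planning period.
        AVAILABLE_HOURS = {"milling": 400, "turning": 300, "assembly": 650}

        def rough_cut_check(planned_quantity):
            # Compare the load implied by the plan with available capacity per centre.
            report = {}
            for centre, hours in HOURS_PER_UNIT.items():
                load = planned_quantity * hours
                report[centre] = (load, AVAILABLE_HOURS[centre], load <= AVAILABLE_HOURS[centre])
            return report

        if __name__ == "__main__":
            for centre, (load, avail, ok) in rough_cut_check(500).items():
                print(f"{centre}: load {load:.0f} h vs {avail} h available -> {'OK' if ok else 'overloaded'}")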

    Integrating Distributed Sources of Information for Construction Cost Estimating using Semantic Web and Semantic Web Service technologies

    A construction project requires collaboration among several organizations, such as owner, designer, contractor, and material supplier organizations. These organizations need to exchange information to enhance their teamwork. Understanding the information received from other organizations requires specialized human resources. Construction cost estimating is one of the processes that requires information from several sources, including a building information model (BIM) created by designers, estimating assembly and work item information maintained by contractors, and construction material cost data provided by material suppliers. Currently, it is not easy to integrate the information necessary for cost estimating over the Internet. This paper discusses a new approach to construction cost estimating that uses Semantic Web technology. Semantic Web technology provides an infrastructure and a data modeling format that enable accessing, combining, and sharing information over the Internet in a machine-processable format. The estimating approach presented in this paper relies on BIM, estimating knowledge, and construction material cost data expressed in a web ontology language. The approach makes the various sources of estimating data accessible as Simple Protocol and Resource Description Framework Query Language (SPARQL) endpoints or Semantic Web Services. We present an estimating application that integrates distributed information provided by project designers, contractors, and material suppliers to prepare cost estimates. The purpose of this paper is not to fully automate the estimating process but to streamline it by reducing human involvement in repetitive cost estimating activities.
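
    As a rough illustration of the kind of integration described above, the Python sketch below queries a material-supplier SPARQL endpoint for unit costs over HTTP. The endpoint URL and vocabulary (ex:material, ex:unitCost) are hypothetical assumptions; only the SPARQL protocol conventions (a query parameter and the JSON results format) are standard.

        # Minimal sketch: fetching material unit costs from a hypothetical supplier SPARQL endpoint.
        # Endpoint URL and vocabulary (ex:material, ex:unitCost) are illustrative assumptions.
        import json
        import urllib.parse
        import urllib.request

        ENDPOINT = "http://supplier.example.org/sparql"  # hypothetical endpoint

        QUERY = """
        PREFIX ex: <http://supplier.example.org/ns#>
        SELECT ?material ?unitCost WHERE {
          ?item ex:material ?material ;
                ex:unitCost ?unitCost .
        }
        LIMIT 10
        """

        def fetch_material_costs():
            # Standard SPARQL protocol: pass the query as a URL parameter and request JSON results.
            url = ENDPOINT + "?" + urllib.parse.urlencode({"query": QUERY})
            req = urllib.request.Request(url, headers={"Accept": "application/sparql-results+json"})
            with urllib.request.urlopen(req) as resp:
                results = json.load(resp)
            # Flatten the SPARQL JSON bindings into simple (material, cost) pairs.
            return [(b["material"]["value"], float(b["unitCost"]["value"]))
                    for b in results["results"]["bindings"]]

        if __name__ == "__main__":
            for material, cost in fetch_material_costs():
                print(material, cost)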

    Extending the DSE: LOD support and TEI/IIIF integration in EVT

    Current digital scholarly editions (DSEs) have the opportunity to evolve into dynamic objects that interact with other Internet-based resources, thanks to open frameworks such as IIIF and LOD. This paper showcases and discusses two new functionalities of EVT (Edition Visualization Technology), version 2: one improves the management of named entities (e.g. personal names) through the use of LOD resources such as FOAF and DBpedia; the other integrates the published text with digital images of the textual primary sources, accessed from online repositories (e.g. e-codices, or digital libraries such as the Vaticana or the Ambrosiana) via the IIIF protocol.
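
    For readers unfamiliar with the IIIF Image API on which this image integration relies, the sketch below assembles an image request URL from its standard path components (identifier, region, size, rotation, quality, format). The base URL and identifier are hypothetical; only the path structure follows the published IIIF Image API pattern, and this is not EVT's own code.

        # Minimal sketch: building a IIIF Image API request URL.
        # Base URL and identifier are hypothetical; the {region}/{size}/{rotation}/{quality}.{format}
        # path structure is the standard IIIF Image API pattern.
        from urllib.parse import quote

        def iiif_image_url(base, identifier, region="full", size="max",
                           rotation="0", quality="default", fmt="jpg"):
            # Identifiers often contain slashes and must be percent-encoded in the URL path.
            return "{}/{}/{}/{}/{}/{}.{}".format(
                base.rstrip("/"), quote(identifier, safe=""),
                region, size, rotation, quality, fmt)

        if __name__ == "__main__":
            # Request the full page image at 25% size from a hypothetical IIIF server.
            print(iiif_image_url("https://images.example.org/iiif",
                                 "ms-0123/f001r", size="pct:25"))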

    Design and implementation of an integrated surface texture information system for design, manufacture and measurement

    The optimised design and reliable measurement of surface texture are essential to guarantee the functional performance of a geometric product. Current support tools, however, are often limited in functionality, integrity and efficiency. In this paper, an integrated surface texture information system for design, manufacture and measurement, called "CatSurf", has been designed and developed, which aims to support rapid and flexible manufacturing requirements. A category-theory-based knowledge acquisition and representation mechanism has been devised to retrieve and organize knowledge from the various Geometrical Product Specifications (GPS) documents on surface texture. Two modules (for profile and areal surface texture), each with five components, are developed in CatSurf. The work also focuses on integrating the surface texture information into a computer-aided technologies (CAx) framework. Two test cases demonstrate the design process for profile and areal surface texture specifications in the AutoCAD and SolidWorks environments, respectively.
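
    As a rough sketch of how a surface texture requirement might be carried as structured data between design, manufacture and measurement tools, the snippet below defines a small Python dataclass for a profile surface texture specification. The field names and the Ra example value are illustrative assumptions and do not reproduce the CatSurf data model.

        # Minimal sketch: a structured record for a profile surface texture requirement.
        # Field names and values are illustrative; they do not reproduce the CatSurf model.
        from dataclasses import dataclass

        @dataclass
        class ProfileSurfaceTextureSpec:
            parameter: str              # e.g. "Ra" or "Rz" (profile roughness parameters)
            upper_limit_um: float       # upper tolerance limit in micrometres
            evaluation_length_mm: float
            manufacturing_process: str  # e.g. "ground", "milled"

            def callout(self):
                # Render a compact, human-readable form of the requirement for a drawing note.
                return "{} <= {} um over {} mm ({})".format(
                    self.parameter, self.upper_limit_um,
                    self.evaluation_length_mm, self.manufacturing_process)

        if __name__ == "__main__":
            spec = ProfileSurfaceTextureSpec("Ra", 1.6, 4.0, "ground")
            print(spec.callout())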

    Theory and Practice of Data Citation

    Citations are the cornerstone of knowledge propagation and the primary means of assessing the quality of research, as well as directing investments in science. Science is increasingly becoming "data-intensive": large volumes of data are collected and analyzed to discover complex patterns through simulations and experiments, and most scientific reference works have been replaced by online curated datasets. Yet, given a dataset, there is no quantitative, consistent and established way of knowing how it has been used over time, who contributed to its curation, what results it has yielded or what value it has. The development of a theory and practice of data citation is fundamental for considering data as first-class research objects with the same relevance and centrality as traditional scientific products. Many works in recent years have discussed data citation from different viewpoints: illustrating why data citation is needed, defining the principles and outlining recommendations for data citation systems, and providing computational methods for addressing specific issues of data citation. The current panorama is many-faceted, and an overall view that brings together the diverse aspects of this topic is still missing. Therefore, this paper aims to describe the lay of the land for data citation, both from the theoretical (the why and what) and the practical (the how) angle.
    Comment: 24 pages, 2 tables, pre-print accepted in Journal of the Association for Information Science and Technology (JASIST), 201