86 research outputs found

    Interoperability and knowledge sharing

    Get PDF
    National audience
    Interoperability aims to increase the ability of heterogeneous systems and organisations to coordinate their activities effectively. Through six articles selected for their quality as well as their complementarity, this special issue establishes multiple links between the problems of interoperability and knowledge sharing. It thus constitutes an invitation to interaction between different scientific communities, with a view to mutual enrichment

    Towards ontology based BPMN Implementation.

    Get PDF
    International audience
    Natural language is understandable by humans, not machines. Non-technical persons can only use natural language to specify their business requirements. However, current Business Process Model and Notation (BPMN) tools do not allow business analysts to implement their business processes without technical skills. A BPMN tool allows users to design and implement business processes by connecting different business tasks and rules together. These tools do not provide automatic implementation of business tasks from users' specifications in natural language (NL). Therefore, this research aims to propose a framework to automatically implement business processes that are expressed as NL requirements. An ontology is used as a mechanism to solve this problem, by comparing users' requirements with web service descriptions. A web service is a software module that performs a specific task, and an ontology defines the relationships between different terms
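The abstract does not give the matching algorithm; a minimal illustrative sketch of ontology-assisted matching between an NL requirement and web service descriptions might expand both term sets through a synonym ontology and score their overlap. The ontology, service names, and scoring below are assumptions for the example, not the paper's actual method.

```python
# Toy ontology: each concept maps to a set of equivalent terms.
# (Illustrative only; the paper's ontology and matching differ.)
ONTOLOGY = {
    "invoice": {"invoice", "bill", "billing"},
    "customer": {"customer", "client", "buyer"},
    "send": {"send", "dispatch", "email"},
}

def expand(terms):
    """Expand a set of terms with their ontology synonyms."""
    expanded = set(terms)
    for synonyms in ONTOLOGY.values():
        if expanded & synonyms:
            expanded |= synonyms
    return expanded

def match_score(requirement, service_description):
    """Jaccard similarity between ontology-expanded term sets."""
    req = expand(requirement.lower().split())
    svc = expand(service_description.lower().split())
    return len(req & svc) / len(req | svc)

def best_service(requirement, services):
    """Pick the service whose description best matches the requirement."""
    return max(services, key=lambda name: match_score(requirement, services[name]))

services = {
    "BillingService": "dispatch an invoice to a client",
    "ReportService": "generate a monthly sales report",
}
print(best_service("send a bill to the customer", services))  # BillingService
```

Thanks to the synonym expansion, "bill"/"invoice" and "customer"/"client" match even though the literal words differ, which is the core idea of using an ontology rather than plain keyword search.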

    Data Compliance in Pharmaceutical Industry, Interoperability to align Business and Information Systems

    Get PDF
    International audience
    The ultimate goal in the pharmaceutical sector is product quality. However, this quality can be altered by the use of a number of heterogeneous information systems with different business structures and concepts along the lifecycle of the product. Interoperability is then needed to guarantee correspondence and compliance between different product data. In this paper we focus on a particular compliance problem: between production technical data, represented in an ERP, and the corresponding regulatory directives and specifications, represented by the Marketing Authorizations (MA). The MA detail the process for manufacturing the medicine according to the requirements imposed by health organisations such as the Food and Drug Administration (FDA) and the Committee for Medicinal Products for Human Use (CHMP). The proposed approach uses an interoperability framework based on a multi-layer separation between the organisational aspects, business trades, and information technologies for each entity involved in the communication between the systems used
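To make the compliance problem concrete, a check between ERP production data and MA specifications can be sketched as a field-by-field comparison, where each MA entry is either an exact required value or an allowed numeric range. The field names, record shapes, and tolerance logic below are assumptions for the sketch, not the paper's actual data model.

```python
def check_compliance(erp_record, ma_spec):
    """Return the fields whose ERP value violates the MA specification.

    ma_spec maps a field name either to an exact required value or to a
    (min, max) tuple giving an allowed range for a numeric parameter.
    """
    violations = []
    for field, required in ma_spec.items():
        value = erp_record.get(field)
        if isinstance(required, tuple):            # numeric range check
            low, high = required
            if value is None or not (low <= value <= high):
                violations.append(field)
        elif value != required:                    # exact-match check
            violations.append(field)
    return violations

# Hypothetical MA spec and ERP record for one product.
ma_spec = {"active_ingredient": "paracetamol", "dosage_mg": (480, 520)}
erp_record = {"active_ingredient": "paracetamol", "dosage_mg": 530}
print(check_compliance(erp_record, ma_spec))  # ['dosage_mg']
```

A real alignment would first have to reconcile the heterogeneous vocabularies of the ERP and the MA documents, which is exactly where the paper's interoperability framework comes in.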

    Emergent technologies for inter-enterprises collaboration and business evaluation

    Get PDF
    International audience
    Conventional manufacturing systems are designed for intra-enterprise process management, and they hardly handle processes whose tasks use data crossing enterprise boundaries. Besides, inter-enterprise collaboration and new IT enablers for Industry 4.0 are becoming a highly topical issue to study, due to (a) the emergence of new technologies, mainly the Internet of Things, big data processing and cyber-physical systems, and (b) the new customer needs that SMEs face. Many constraints and issues have to be taken into account before establishing inter-enterprise collaboration, namely product information, business processes and heterogeneous data. Moreover, the exponential growth of data coming from all the enterprises raises several challenges regarding its exploitation. In this context, this study is interested in Big Data capabilities to help Small and Medium Enterprises uncover hidden opportunities. We focus on the combination of emergent IT technologies, mainly Big Data, and inter-enterprise collaboration in order to provide added value. The result of this study is a new approach, adoptable by SMEs, for evaluating new projects within a network of enterprises

    Deep Semantic and Structural Features Learning based on Graph for Just-in-Time Defect Prediction

    No full text
    International audience
    Change-level defect prediction, also known as just-in-time defect prediction, not only improves software quality and reduces costs, but also gives developers more accurate and earlier feedback than traditional file-level defect prediction. To build just-in-time defect prediction models, most existing approaches rely on manually designed traditional features (code-change metrics) and exploit different machine learning techniques. However, those approaches fail to capture the semantic differences between code changes and the dependency information within programs, and consequently do not cover all types of bugs. Such information plays an important role in improving the accuracy of the defect prediction model. In this paper, to bridge this research gap, we propose an end-to-end deep learning framework that extracts features from the code change automatically. For this purpose, we represent the code change by code property sub-graphs (CP-SG) extracted from code property graphs (CPG), which merge existing concepts of classic program analysis, namely the abstract syntax tree (AST), control flow graph (CFG) and program dependence graph (PDG). Then, we apply a deep graph convolutional neural network (DGCNN) that takes the selected features as input. The experimental results show that our approach significantly improves over the baseline DBN-based features, by an average of 20.86 percentage points for within-project and 32.85 percentage points for cross-project prediction
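The abstract does not detail the DGCNN architecture; as a minimal sketch of what one graph-convolution step over a code property sub-graph looks like, the layer below uses a simple mean-over-neighbours propagation rule (in the style of a standard GCN layer). The toy graph, features, and weights are assumptions for illustration; the paper's DGCNN differs in depth, pooling, and training.

```python
import numpy as np

def gcn_layer(adj, features, weights):
    """One graph-convolution step: average each node's features with its
    neighbours' (via self-loops and degree normalisation), then apply a
    linear map followed by ReLU."""
    a_hat = adj + np.eye(adj.shape[0])            # add self-loops
    deg = a_hat.sum(axis=1, keepdims=True)        # per-node degree
    propagated = (a_hat @ features) / deg         # neighbourhood mean
    return np.maximum(propagated @ weights, 0.0)  # linear map + ReLU

# Toy sub-graph: 3 program-graph nodes in a chain (0-1, 1-2),
# each with a 2-dimensional feature vector.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
features = np.array([[1., 0.],
                     [0., 1.],
                     [1., 1.]])
weights = np.eye(2)                               # identity, for clarity
out = gcn_layer(adj, features, weights)
print(out.shape)  # (3, 2)
```

Stacking several such layers lets information flow along AST, CFG and PDG edges, which is how structural and dependency context reaches the learned change representation.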

    IoT, Risk and Resilience based Framework for Quality Control: Application for Production in Plastic Machining

    No full text
    International audience

    Model-Driven Engineering for vaccine product data compliance

    No full text
    International audience
    Data quality is widely considered a very serious problem for the majority of companies, due to the specificities of each business context and the lack of adapted solutions. In this paper, we present the benefits of Model-Driven Engineering (MDE) concepts in ensuring the interconnection of different business context specifications by providing a linked structure of models. This makes it possible to generate bridges that connect implementations on different platforms; in this way, systems interoperability can be satisfied throughout the product lifecycle. The MDA approach is widely considered a methodology for software generation from models, with a focus on enterprise and business models. Deploying an MDA approach in the supply chain context of the vaccine industry allows us to deal with product data quality. In fact, it helps to translate some business models into a computation-independent model through the MDA framework, in order to generate a new data model as well as business rules and recommendations that help the models communicate. To ensure the quality of product data in the Enterprise Resource Planning (ERP) system, the newly generated data model is compared with that of the ERP, and the proposed mapping rules are structured through a data reference model. Finally, three levels of reference frames are proposed to ensure the sharing and traceability of the generated metadata, in order to support the evolution of the defined models and preserve product data quality throughout its lifecycle. A deployment of the proposed approach is presented through some application case studies
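The bridge between the business-level model and the ERP data model can be pictured as a set of mapping rules, each carrying a source field, a target field, and a transformation. The field names (including the SAP-style ERP codes) and rules below are assumptions for the example, not the paper's actual reference model.

```python
# Each rule: (business-model field, ERP field, transformation function).
# Names are hypothetical, chosen only to illustrate the rule structure.
MAPPING_RULES = [
    ("product_name", "MATNR_DESC", str.upper),
    ("dose", "DOSAGE_MG", lambda v: v),           # direct copy
    ("batch", "CHARG", lambda v: v.strip()),      # normalise whitespace
]

def to_erp_model(business_record):
    """Derive the ERP-side record by applying each mapping rule whose
    source field is present in the business record."""
    erp = {}
    for src, dst, transform in MAPPING_RULES:
        if src in business_record:
            erp[dst] = transform(business_record[src])
    return erp

business = {"product_name": "VaccineX", "dose": 0.5, "batch": " B123 "}
print(to_erp_model(business))
# {'MATNR_DESC': 'VACCINEX', 'DOSAGE_MG': 0.5, 'CHARG': 'B123'}
```

Keeping the rules as data rather than code is what lets them be stored in a data reference model and evolved alongside the models themselves, as the abstract describes.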

    Dynamic Preservation Approach Design in Digital Preservation Platform

    No full text
    International audience