
    Information integration platform for CIMS

    A new information integration platform for computer integrated manufacturing systems (CIMS) is presented, based on agents and CORBA. CORBA enhances system integration because it is an industry standard for interoperable, distributed objects across heterogeneous hardware and software platforms. Agent technology is used to improve the intelligence of the integrated system. To implement the information integration platform, we use a network integration server to integrate the network, design a generic database agent to integrate databases, adopt a multi-agent architecture to integrate applications, and wrap legacy code as CORBA objects.
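
    As a rough illustration of the generic database agent idea (a minimal sketch; the class and method names are hypothetical and not taken from the paper), a single agent can present one uniform query interface over heterogeneous back ends:

    import sqlite3

    class DatabaseAgent:
        """Uniform query facade over heterogeneous data sources (sketch)."""
        def __init__(self):
            self._backends = {}  # name -> zero-argument connect function

        def register(self, name, connect):
            self._backends[name] = connect

        def query(self, name, sql, params=()):
            # open a connection to the named back end, run the query, clean up
            conn = self._backends[name]()
            try:
                return conn.execute(sql, params).fetchall()
            finally:
                conn.close()

    agent = DatabaseAgent()
    agent.register("orders", lambda: sqlite3.connect(":memory:"))
    print(agent.query("orders", "SELECT 1"))

    In the platform described above, such an agent would itself be exposed through CORBA so that distributed applications reach it through the ORB rather than calling it in-process.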

    An extensible manufacturing resource model for process integration

    Driven by industrial needs and enabled by process technology and information technology, enterprise integration is rapidly shifting from information integration to process integration to improve the overall performance of enterprises. Traditional resource models are built around the needs of individual applications, so they cannot effectively serve process integration, which requires resources to be represented in a unified, comprehensive, and flexible way to meet the needs of various applications across different business processes. This paper examines this issue and presents a configurable and extensible resource model that can be rapidly reconfigured and extended to serve different applications. To achieve generality, the resource model is established at both the macro and micro levels, and a semantic representation method is developed to improve the model's flexibility and extensibility.
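
    To make the macro/micro distinction concrete, here is a hedged sketch (the names are illustrative, not the paper's): a resource's macro-level category stays fixed while micro-level, semantically tagged attributes can be added per application without changing the core model.

    from dataclasses import dataclass, field

    @dataclass
    class Attribute:
        name: str
        value: object
        semantics: str  # semantic tag, e.g. "capacity" or "cost"

    @dataclass
    class Resource:
        resource_id: str
        category: str                                   # macro level: machine, tool, operator...
        attributes: dict = field(default_factory=dict)  # micro level, per application

        def extend(self, attr):
            # applications add attributes without touching the core model
            self.attributes[attr.name] = attr

    lathe = Resource("M-01", "machine")
    lathe.extend(Attribute("spindle_speed_rpm", 3000, "capacity"))
    print(lathe.attributes["spindle_speed_rpm"].value)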

    Event-Cloud Platform to Support Decision-Making in Emergency Management

    This paper examines how an Event-Cloud Platform can efficiently support an emergency situation, focusing on a nuclear crisis use case. The proposed approach consists of modeling the business processes of crisis response on the one hand, and supporting the orchestration and execution of these processes with an Event-Cloud Platform on the other. The paper shows how Event-Cloud techniques can support crisis-management stakeholders by automating non-value-added tasks and directing decision-makers to what genuinely requires their judgment. Although Event-Cloud technology is an interesting and topical subject, very few research works have considered it for improving emergency management; this paper tries to fill that gap by applying these technologies to a nuclear crisis use case.
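
    As a hedged sketch of the event-routing principle (the topic names and handlers are invented for illustration, not drawn from the platform itself), a minimal publish/subscribe bus can handle routine events automatically while escalating only decision-relevant ones:

    from collections import defaultdict

    class EventBus:
        """Minimal publish/subscribe event router."""
        def __init__(self):
            self._handlers = defaultdict(list)

        def subscribe(self, topic, handler):
            self._handlers[topic].append(handler)

        def publish(self, topic, event):
            for handler in self._handlers[topic]:
                handler(event)

    bus = EventBus()
    # routine readings are logged automatically (a non-value-added task)
    bus.subscribe("radiation.reading", lambda e: print("logged:", e))
    # threshold breaches are escalated to human decision-makers
    bus.subscribe("radiation.threshold_exceeded", lambda e: print("ALERT:", e))

    bus.publish("radiation.reading", {"station": "A", "uSv_per_h": 0.12})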

    Proceedings of the Workshop on Models and Model-driven Methods for Enterprise Computing (3M4EC 2008)


    The Road to Business-IT Alignment: A Case Study of Two Chinese Companies

    Business-IT alignment (BITA) has been found to improve firm performance, yet little is known about the process through which firms achieve BITA, and even less about this process in China, the fastest-growing economy in the world. We conducted case studies of two Chinese firms. We first used the Strategic Alignment Maturity assessment model to evaluate the two firms' degree of BITA, then applied punctuated equilibrium theory to study the firms' dynamic BITA change process. The results reveal BITA revolutions and identify their external and internal antecedents. Consistent with prior research, we find that the competitive environment, macro environment, performance deterioration, leadership change, and perception transformation contribute to revolutionary changes in BITA. More importantly, we find drivers of revolutionary BITA change that are unique to China, including government support, organizational inertia, and social culture preferences. Theoretical and managerial implications of these findings are discussed.

    Ubiquitous computing: a learning system solution in the era of industry 4.0

    Ubiquitous computing, initially advocated by Mark Weiser, has become one of the keywords expressing a vision of the near future of computing systems. The "ubiquitous world" is a ubiquitous computing environment of integrated networks, computer integrated manufacturing systems (CIMS), and invisible computers equipped with sensor microchips and radio-frequency identification systems. Anyone can access such systems anytime and anywhere, without special awareness or skills. Ubiquitous computing is becoming a crucial element in organizing the activities of groups of people through groupware under workforce mobility, and computer-supported cooperative work is transforming from telework into ubiquitous work, with new information and communication technologies supporting people working cooperatively. Ubiquitous learning answers the knowledge workforce's demand for more multi-skilled professionals: it is a new and emerging education and training system that integrates the e-learning of cyberspace and the mobile learning of physical space with a global repository that can potentially be accessed by anyone, anyplace, anytime in a ubiquitous integrated computing environment. In this paper, we study emerging trends in the implementation of work and learning influenced by the prospects of ubiquitous computing technology. We also discuss how ubiquitous work and learning systems can gain quality, and hence credibility, through emerging information and communication technologies in education and training. The experimental results showed that CIMS helped students learn more efficiently and achieve better learning performance.

    Methodology for implementing information management models within enterprise resource planning systems: application to small and medium-sized enterprises

    The Next Generation of Manufacturing Systems (NGMS) seeks to meet the requirements of new enterprise models in contexts of intelligence, agility, and adaptability within a global and virtual environment. Enterprise Resource Planning (ERP), supported by product data management (PDM) and product lifecycle management (PLM), provides business management solutions based on a coherent use of information technologies for implementation in CIM (Computer-Integrated Manufacturing) systems, with a high degree of adaptability to the desired organizational structure. Such implementations have long been under way in large companies, while their uptake among SMEs remains minor (almost nil). This doctoral thesis defines and develops a new implementation methodology for the automatic generation of information in the business processes of companies whose requirements match the needs of the NGMS, within enterprise resource planning (ERP) systems, taking into account the influence of the human factor. The validity of the methodology's theoretical model was verified by implementing it in an SME in the engineering sector. To establish the state of the art, a specific methodology based on the Shewhart/Deming continuous improvement cycle was designed and applied, using the bibliographic search and analysis tools available online with access to the relevant databases.

    2017 DWH Long-Term Data Management Coordination Workshop Report

    On June 7 and 8, 2017, the Coastal Response Research Center (CRRC)[1], NOAA Office of Response and Restoration (ORR), and NOAA National Marine Fisheries Service (NMFS) Restoration Center (RC) co-sponsored the Deepwater Horizon Oil Spill (DWH) Long Term Data Management (LTDM) workshop at the ORR Gulf of Mexico (GOM) Disaster Response Center (DRC) in Mobile, AL. In the wake of the DWH Natural Resource Damage Assessment (NRDA) settlement, the focus has been on restoration planning, implementation, and monitoring of ongoing DWH-related research. This means that data management, accessibility, and distribution must be coordinated among various federal, state, local, non-governmental organization (NGO), academic, and private-sector partners.

    The scope of DWH far exceeded any other spill in the U.S.: an immense amount of data (e.g., 100,000 environmental samples, 15 million publicly available records) was gathered during the response and damage assessment phases of the incident, and data continues to be produced by research and restoration efforts. The challenge with this influx of data is checking its quality, documenting its collection, storing it, integrating it into useful products, managing it, and archiving it for long-term use. In addition, the data must be available to the public in an easily queried and accessible format. Answering questions about the success of the restoration efforts will depend on data generated for years to come. The data sets must be readily comparable, representative, and complete; be collected using cross-cutting field protocols; be as interoperable as possible; meet standards for quality assurance/quality control (QA/QC); and be unhindered by conflicting or ambiguous terminology.

    During the data management process for the NOAA NRDA for the DWH disaster, NOAA developed a data management warehouse and visualization system that will serve as a long-term repository for accessing and archiving NRDA injury assessment data. This system is the foundation for restoration project planning and monitoring data for the next 15 or more years. The main impetus for this workshop was to facilitate public access to the DWH data collected and managed by all entities by developing linkages to, or data exchanges among, applicable GOM data management systems.

    The 66 workshop participants (Appendix A), representing a variety of organizations, met at NOAA's GOM Disaster Response Center (DRC) to determine the characteristics of a successful common operating picture for DWH data, to understand the systems currently in place to manage DWH data, and to make the DWH data interoperable among data generators, users, and managers. External partners for these efforts include, but are not limited to, the RESTORE Council, Gulf of Mexico Research Initiative (GoMRI), Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC), the National Academy of Sciences (NAS) Gulf Research Program, Gulf of Mexico Alliance (GOMA), and National Fish and Wildlife Foundation (NFWF).

    The workshop objectives were to: foster collaboration among the GOM partners with respect to data management and integration for restoration planning, implementation, and monitoring; identify standards, protocols, and guidance for LTDM being used by these partners for DWH NRDA, restoration, and public health efforts; obtain feedback and identify next steps for the work completed by the Environmental Disasters Data Management (EDDM) working groups; and work towards best practices for the public distribution of, and access to, these data.

    The workshop consisted of plenary presentations and breakout sessions, with an agenda (Appendix B) developed by the organizing committee. Presentation topics included: results of a pre-workshop survey, an overview of data generation, the uses of DWH long-term data, an overview of LTDM, an overview of existing LTDM systems, an overview of data management standards and protocols, results from the EDDM working groups, flow diagrams of existing data management systems, and a vision for managing big data. The breakout sessions discussed: issues and concerns for data stakeholders (e.g., data users, generators, managers), interoperability, ease of discovery/searchability, data access, data synthesis, data usability, and metadata/data documentation.

    [1] A list of acronyms is provided on Page 1 of this report.
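
    As a hedged illustration of one LTDM concern raised at the workshop (the field names and range check are invented for illustration, not taken from any NOAA standard), a minimal QA/QC gate might reject records that lack required metadata before they enter a shared repository:

    # Required metadata fields are assumptions for this sketch.
    REQUIRED_FIELDS = {"sample_id", "collection_date", "latitude", "longitude", "protocol"}

    def qaqc_check(record: dict) -> list:
        """Return a list of QA/QC problems; an empty list means the record passes."""
        problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
        if "latitude" in record and not (-90 <= record["latitude"] <= 90):
            problems.append("latitude out of range")
        return problems

    # An incomplete record is flagged rather than silently archived.
    print(qaqc_check({"sample_id": "DWH-001", "latitude": 28.7}))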