
    Conceptual information processing: A robust approach to KBS-DBMS integration

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing it are discussed. The shortcomings of current approaches and the need to fuse the capabilities of both knowledge base and data base management systems motivate the investigation of information processing paradigms. One such paradigm is concept-based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge base and data base system integration is discussed, with progress reported on the development of an experimental model for conceptual information processing.
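
    As a minimal sketch of what "processing based on concepts and conceptual relations" can mean in practice, the fragment below stores typed relations between concepts and answers simple lookup queries. The class and relation names are illustrative assumptions, not taken from the paper's experimental model.

```python
from collections import defaultdict

# Toy concept store: concepts are strings linked by typed conceptual
# relations such as 'part-of' or 'is-a'. All names are illustrative.
class ConceptStore:
    def __init__(self):
        # (source concept, relation) -> set of target concepts
        self.relations = defaultdict(set)

    def relate(self, source, relation, target):
        """Assert a conceptual relation, e.g. ('engine', 'part-of', 'car')."""
        self.relations[(source, relation)].add(target)

    def targets(self, source, relation):
        """Return every concept related to `source` via `relation`."""
        return self.relations[(source, relation)]

store = ConceptStore()
store.relate("piston", "part-of", "engine")
store.relate("engine", "part-of", "car")
print(store.targets("engine", "part-of"))  # {'car'}
```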

    Integrating CLIPS applications into heterogeneous distributed systems

    SOCIAL is an advanced, object-oriented development tool for integrating intelligent and conventional applications across heterogeneous hardware and software platforms. SOCIAL defines a family of 'wrapper' objects called agents, which incorporate predefined capabilities for distributed communication and control. Developers embed applications within agents and establish interactions between distributed agents via non-intrusive message-based interfaces. This paper describes a predefined SOCIAL agent that is specialized for integrating C Language Integrated Production System (CLIPS)-based applications. The agent's high-level Application Programming Interface supports bidirectional flow of data, knowledge, and commands to other agents, enabling CLIPS applications to initiate interactions autonomously and to respond to requests and results from heterogeneous remote systems. The design and operation of CLIPS agents are illustrated with two distributed applications that integrate CLIPS-based expert systems with other intelligent systems for isolating and mapping problems in the Space Shuttle Launch Processing System at the NASA Kennedy Space Center.
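
    The wrapper-agent pattern described above is easy to sketch: an agent registers handlers for message types and dispatches incoming messages to the embedded application. The Agent class, handler names, and the stand-in for the embedded CLIPS engine below are all hypothetical; SOCIAL's actual API is not reproduced here.

```python
# Illustrative sketch of a wrapper agent with a message-based interface.
class Agent:
    def __init__(self, name):
        self.name = name
        self.handlers = {}

    def on(self, message_type, handler):
        """Register a handler for an incoming message type."""
        self.handlers[message_type] = handler

    def receive(self, message_type, payload):
        """Dispatch an incoming message to the embedded application."""
        return self.handlers[message_type](payload)

def assert_fact(payload):
    # A real CLIPS agent would forward this to the embedded CLIPS
    # engine (e.g. via its C API); here we simply echo the request.
    return f"asserted: {payload}"

clips_agent = Agent("clips-diagnoser")
clips_agent.on("assert-fact", assert_fact)
print(clips_agent.receive("assert-fact", "(valve-stuck v1)"))
```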

    Spitzer Science Center within an Enterprise Architecture

    The Spitzer Science Center's (SSC) evolutionary development approach, coupled with a flexible, scalable hardware and software architecture, has been key to Spitzer's ability to handle an explosion of data products, evolving data definitions, and changing data quality requirements. Spitzer is generating (depending on the campaign and instrument) about 10 TB of pre-archive data every 14 to 20 days. This generally reduces to between 3 TB and 6 TB of standard products, again depending on the campaign and instrument. This paper will discuss (1) the Spitzer Science Center's responses to evolving data, quality, and processing requirements and (2) how well the original architecture has held up in allowing Spitzer to accommodate ongoing change.
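
    A quick back-of-the-envelope check, using only the figures quoted in the abstract, gives the sustained ingest and reduction rates this architecture has to absorb:

```python
# Assumed figures from the abstract: ~10 TB of pre-archive data per
# 14-20 day campaign, reducing to 3-6 TB of standard products.
raw_tb = 10.0
for days in (14, 20):
    print(f"{days}-day campaign -> {raw_tb / days:.2f} TB/day raw ingest")
for products_tb in (3.0, 6.0):
    print(f"reduction keeps {products_tb / raw_tb:.0%} of raw volume")
```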

    On incremental global update support in cooperative database systems

    OzGateway is a cooperative database system designed for integrating heterogeneous existing information systems into an interoperable environment. It also aims to provide a gateway for legacy information system migration. This paper summarises the problems and results of multidatabase transaction management research. In supporting global updates in OzGateway in an evolutionary way, we introduce a classification of multidatabase transactions and discuss the problems in each category. The architecture of OzGateway and the design of the global transaction manager and servers are presented.
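
    The global-update problem reduces to coordinating subtransactions at autonomous local systems. The sketch below shows the two-phase-commit-style control flow a global transaction manager typically follows; the class and server names are illustrative assumptions and do not reproduce OzGateway's actual design.

```python
# Schematic global transaction manager coordinating local subtransactions.
class GlobalTransaction:
    def __init__(self, participants):
        self.participants = participants  # local database servers

    def commit(self):
        # Phase 1: ask every participant to prepare its subtransaction.
        if all(p.prepare() for p in self.participants):
            # Phase 2: everyone prepared, so commit globally.
            for p in self.participants:
                p.commit()
            return "committed"
        # Any refusal aborts the global transaction everywhere.
        for p in self.participants:
            p.abort()
        return "aborted"

class LocalServer:
    def __init__(self, name, healthy=True):
        self.name, self.healthy = name, healthy
    def prepare(self): return self.healthy
    def commit(self): print(f"{self.name}: commit")
    def abort(self): print(f"{self.name}: abort")

txn = GlobalTransaction([LocalServer("relational-1"), LocalServer("legacy-ims")])
print(txn.commit())
```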

    An extensible view system for supporting the integration and interoperation of heterogeneous, autonomous, and distributed database management systems

    In this thesis the problem of integrating heterogeneous, autonomous, and distributed database management systems (DBMSs) is addressed. To provide a solution, we have developed an approach, a design method, and a view system. Our approach is based on the invention of abstract view constructs that have uniform and stable representations for supporting semantic relativism and distributed abstraction modeling. Our design method applies object-oriented techniques and software engineering concepts to manage system complexity. Our view system has been constructed upon established experience with the development of large-scale distributed systems, in a distributed object infrastructure provided by the Common Object Request Broker Architecture (CORBA). The scope of our research identifies the goals of Project Zeus, in which we have created the Zeus View Mechanism (ZVM) as the theoretical foundation of our approach. The notion of frameworks has been introduced as part of our design methodology to promote code/design reuse and enhance the portability/extensibility of the architectural design. A multidatabase system, the Zeus Multidatabase System (ZMS), has provided a test bed for our concept. Project Zeus has exciting prospects. The foundation established in this research has created new directions in multidatabase research and will have a significant impact on future integration and interoperation technologies.
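
    The core of the view-based approach is a uniform query interface that delegates to heterogeneous sources behind wrappers. The sketch below illustrates that idea with in-memory stand-ins for the wrapped DBMSs; the names are hypothetical and none of ZVM's actual constructs are shown.

```python
# Minimal sketch of an abstract view over heterogeneous sources.
class View:
    def __init__(self, sources):
        self.sources = sources  # wrapped, autonomous data sources

    def query(self, predicate):
        """Evaluate a predicate uniformly across all underlying sources."""
        results = []
        for source in self.sources:
            # A real wrapper would translate the uniform query into the
            # source's native dialect; here each source is a list of dicts.
            results.extend(row for row in source if predicate(row))
        return results

relational_rows = [{"id": 1, "name": "pump"}]
object_store_rows = [{"id": 2, "name": "valve"}]
v = View([relational_rows, object_store_rows])
print(v.query(lambda r: r["id"] > 0))
```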

    DCMS: A data analytics and management system for molecular simulation

    Molecular Simulation (MS) is a powerful tool for studying physical/chemical features of large systems and has seen applications in many scientific and engineering domains. During the simulation process, the experiments generate very large numbers of atoms, and researchers aim to observe their spatial and temporal relationships for scientific analysis. The sheer data volumes and their intensive interactions impose significant challenges for data access, management, and analysis. To date, existing MS software systems fall short on storage and handling of MS data, mainly for lack of a platform that supports applications involving intensive data access and analytical processing. In this paper, we present the database-centric molecular simulation (DCMS) system our team has developed over the past few years. The main idea behind DCMS is to store MS data in a relational database management system (DBMS) to take advantage of the declarative query interface (i.e., SQL), data access methods, query processing, and optimization mechanisms of modern DBMSs. A unique challenge is to handle analytical queries that are often compute-intensive. For that, we developed novel indexing and query processing strategies (including algorithms running on modern co-processors) as integrated components of the DBMS. As a result, researchers can upload and analyze their data using efficient functions implemented inside the DBMS. Index structures are generated to store analysis results that may be of interest to other users, so that the results are readily available without duplicating the analysis. We have developed a prototype of DCMS based on the PostgreSQL system, and experiments using real MS data and workloads show that DCMS significantly outperforms existing MS software systems. We also used it as a platform to study other data management issues such as security and compression.
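
    To make the database-centric idea concrete, the sketch below issues one analytical query over an atom table through PostgreSQL, which DCMS builds on. The database name, table schema (atoms with id, frame, x, y, z columns), and the pair-distance cutoff query are assumptions for illustration, not DCMS's actual interface.

```python
import psycopg2  # standard PostgreSQL driver for Python

# Assumes a local PostgreSQL database named 'dcms' with a hypothetical
# table atoms(id, frame, x, y, z) holding per-frame atom positions.
conn = psycopg2.connect("dbname=dcms")
with conn, conn.cursor() as cur:
    # Count atom pairs within a 5.0 Angstrom cutoff in one frame; in
    # DCMS, compute-intensive queries like this are what the custom
    # indexing and co-processor strategies accelerate.
    cur.execute("""
        SELECT count(*)
        FROM atoms a JOIN atoms b
          ON a.frame = b.frame AND a.id < b.id
        WHERE a.frame = %s
          AND (a.x - b.x)^2 + (a.y - b.y)^2 + (a.z - b.z)^2 < %s
    """, (0, 5.0 ** 2))
    print(cur.fetchone()[0])
```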